RWKV Open Source Development Blog
🦅 Eagle & 🐦 Finch - architecture paper is here

Available at your local arXiv

RWKV
Apr 10, 2024


We present the Eagle and Finch architecture paper on arXiv: https://arxiv.org/abs/2404.05892

It covers and documents the architecture changes from RWKV-v4 onwards. The paper is a collaborative effort with the folks at Eleuther AI, who helped us throughout the paper-writing process.

Special shout-out to

  • BlinkDL: the creator of the RWKV project

  • Eleuther AI: who helped us throughout the paper-writing process

  • Linux Foundation AI & Data: for hosting our project

  • Stability AI: who sponsored the bulk of the compute for the models covered


Does this cover our latest model?

No - this covers our previously released Eagle and Finch line of models, trained on up to 1.1T tokens.

A reminder that, as a fully open source project, we release in the following sequence: code, weights, then the paper - not the other way around.
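Since the weights always land before the paper, here is a minimal sketch of how one might load and sample from a released Eagle checkpoint using the Hugging Face transformers library. The repo id below is an assumption for illustration, not an official pointer - check the RWKV organization on Hugging Face for the exact name of the release you want.

```python
# Minimal sketch: loading a released Eagle (RWKV-v5) checkpoint via
# Hugging Face transformers. The repo id is an assumption -- check the
# RWKV organization on Hugging Face for the actual release name.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "RWKV/v5-Eagle-7B-HF"  # assumed repo id, for illustration only

tokenizer = AutoTokenizer.from_pretrained(repo_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(repo_id, trust_remote_code=True)

prompt = "The RWKV architecture differs from a standard transformer in that"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0]))
```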


Stay tuned for more details on our upcoming models this week:

  • Eagle: 7B parameters, trained on 2.25T tokens

  • Finch: 1.6B parameters, trained on 2.5T tokens

(Some of you probably already know where to find them, if you search through our repos / Discord.)

