RWKV Open Source Development Blog
RWKV-6 Finch 7B World 3 now with 3.1T tokens trained!
Moar training, moar capable!
Dec 11 • RWKV
September 2024
🚀 RWKV.cpp - shipping to 1.5 billion systems worldwide
We went from ~50k installations to 1.5 billion, on every Windows 10 and 11 computer near you (even the ones in the IT store)
Sep 3 • RWKV
🐦 RWKV v6 Finch 14B is here!
From 14B to 7B, 3B, and 1.6B, here are the various RWKV v6 models
Sep 3 • RWKV
April 2024
🦅 EagleX v2 : Soaring past LLaMA2 7B in both English and Multi-lang evals (RWKV-v5)
You have seen the teaser with EagleX 1.7T; now it's here: the definitive version of the linear transformer, trained past LLaMA 2 7B.
Apr 18 • RWKV
🦅 Eagle & 🐦 Finch - architecture paper is here
Available at your local arXiv
Apr 10 • RWKV
January 2024
🦅 Eagle 7B : Soaring past Transformers with 1 Trillion Tokens Across 100+ Languages (RWKV-v5)
A brand new era for the RWKV-v5 architecture and linear transformers has arrived, with the strongest multi-lingual model in open source today
Jan 29 • Eugene Cheah
🌳 The World's Greenest AI Model: RWKV's Pioneering Sustainability
10-100x lower inference cost = lower carbon footprint
Jan 28 • RWKV
🐣 RWKV v5 1.5B - Achieves SOTA multi-lingual performance
The best AI model in the smol <2B param weight class has arrived
Jan 23
🏘️ RWKV joins the Linux Foundation - As the first AI model under the Generative AI Commons
Putting the "Open Source" into "Open AI"
Jan 23