RWKV Open Source Development Blog
RWKV-6 Finch 7B World 3, now trained on 3.1T tokens!
Moar training, moar capable!
Dec 11, 2024 • RWKV
September 2024
🚀 RWKV.cpp - shipping to 1.5 billion systems worldwide
We went from ~50k installations to 1.5 billion, on every Windows 10 and 11 computer near you (even the ones in the IT store).
Sep 3, 2024 • RWKV
🐦 RWKV v6 Finch 14B is here!
In 14B, 7B, 3B, and 1.6B sizes, here are the various RWKV v6 models.
Sep 3, 2024 • RWKV
April 2024
🦅 EagleX v2: Soaring past LLaMA2 7B in both English and Multi-lang evals (RWKV-v5)
You have seen the teaser with EagleX 1.7T; now it's here: the definitive version of a linear transformer, trained past LLaMA 2 7B.
Apr 18, 2024 • RWKV
🦅 Eagle & 🐦 Finch - architecture paper is here
Available at your local arXiv
Apr 10, 2024 • RWKV
January 2024
🦅 Eagle 7B : Soaring past Transformers with 1 Trillion Tokens Across 100+ Languages (RWKV-v5)
A brand new era for the RWKV-v5 architecture and linear transformers has arrived, with the strongest multi-lingual model in open source today.
Jan 29, 2024 • Eugene Cheah
🌳 The World's Greenest AI Model: RWKV's Pioneering Sustainability
10-100x lower inference cost = lower carbon footprint
Jan 28, 2024 • RWKV
🐣 RWKV v5 1.5B - Achieves SOTA multi-lingual performance
The best AI model in the smol <2B param weight class has arrived
Jan 23, 2024
🏘️ RWKV joins the Linux Foundation - as the first AI model under the Generative AI Commons
Putting the "Open Source" into "Open AI"
Jan 23, 2024