RWKV Open Source Development Blog
Development blog for the RWKV open source architecture and its derivative OSS models
RWKV-6 Finch 7B World 3 now with 3.1T tokens trained!
Moar training, moar capable!
Dec 11, 2024 • RWKV
🚀 RWKV.cpp - shipping to 1.5 billion systems worldwide
We went from ~50k installations to 1.5 billion, on every Windows 10 and 11 computer near you (even the ones in the IT store)
Sep 3, 2024 • RWKV
🐦 RWKV v6 Finch 14B is here!
From 14B to 7B, 3B, and 1.6B, here are the various RWKV v6 models
Sep 3, 2024 • RWKV
🦅 EagleX v2: Soaring past LLaMA2 7B in both English and Multi-lang evals (RWKV-v5)
You have seen the teaser with EagleX 1.7T; now it's here: the definitive version of the linear transformer, trained past LLaMA 2 7B.
Apr 18, 2024 • RWKV
🦅 Eagle & 🐦 Finch - architecture paper is here
Available at your local arXiv
Apr 10, 2024 • RWKV
🦅 Eagle 7B: Soaring past Transformers with 1 Trillion Tokens Across 100+ Languages (RWKV-v5)
A brand new era for the RWKV-v5 architecture and linear transformers has arrived, with the strongest multi-lingual model in open source today
Jan 29, 2024 • Eugene Cheah
🌳 The World's Greenest AI Model: RWKV's Pioneering Sustainability
10-100x lower inference cost = lower carbon footprint
Jan 28, 2024 • RWKV
🐣 RWKV v5 1.5B - Achieves SOTA multi-lingual performance
The best AI model in the smol <2B param weight class has arrived
Jan 23, 2024