RWKV Open Source Development Blog

🐣 RWKV v5 1.5B - Achieves SOTA multi-lingual performance

The best AI model in the smol <2B param weight class has arrived

Jan 23, 2024

RWKV v5 1.5B achieves SOTA status with:

  • Industry-leading multi-lingual performance (across the xLBD, xSC, xWG, and xCOPA benchmarks) by significant margins over all existing models

  • Comparable performance to falcon-rw-1b on English benchmarks (see the evaluation sketch after this list)

    • We win out on LAMBADA, StoryCloze16, arc_challenge, arc_easy, headQA_en, openbookQA, sciq, and COPA

    • but lose out very slightly on PIQA, Hellaswag, WinoGrande, ReCoRD, and COPA
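
For readers who want to sanity-check the English comparison themselves, here is a minimal sketch using the lm-evaluation-harness Python API. The Hugging Face repo id and the exact task names below are assumptions on our part; point them at whichever RWKV v5 checkpoint and harness version you are actually running.

```python
# Hedged sketch: re-run a subset of the English evals with lm-evaluation-harness (lm-eval >= 0.4).
# The repo id "RWKV/rwkv-5-world-1b5" is an assumption; swap in your own checkpoint if it differs.
import lm_eval

results = lm_eval.simple_evaluate(
    model="hf",
    model_args="pretrained=RWKV/rwkv-5-world-1b5,trust_remote_code=True",
    tasks=["lambada_openai", "piqa", "hellaswag", "winogrande",
           "arc_easy", "arc_challenge", "openbookqa", "sciq", "copa"],
    batch_size=8,
)

# Print per-task metrics for a quick side-by-side with falcon-rw-1b numbers
for task, metrics in results["results"].items():
    print(task, metrics)
```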

For nearly all use cases in the sub-2B param model class, RWKV v5 now represents either the best model for multi-lingual use, or a tied first place with falcon-rw-1b for English use, making it a strong default choice within its weight class.

This is a pattern we intend to repeat in the 3B, 7B, and 14B weight classes. We expect the 3B model to be out by the first week of December.


You can access the model today via the following options:

  • Public Demo: https://huggingface.co/spaces/BlinkDL/ChatRWKV-gradio

  • Model Download: https://huggingface.co/BlinkDL/rwkv-5-world/tree/main
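
If you would rather run the model locally than use the web demo, a minimal generation sketch with the `rwkv` pip package looks roughly like this. The checkpoint path is a placeholder: point it at the .pth file you downloaded from the repo above, and pick a strategy string ("cuda fp16", "cpu fp32", ...) that fits your hardware.

```python
# Hedged sketch: local inference with the `rwkv` pip package (pip install rwkv).
# The checkpoint path below is a placeholder for a file downloaded from
# https://huggingface.co/BlinkDL/rwkv-5-world/tree/main (usually given without the .pth suffix).
from rwkv.model import RWKV
from rwkv.utils import PIPELINE, PIPELINE_ARGS

model = RWKV(model="path/to/rwkv-5-world-1b5-checkpoint", strategy="cpu fp32")
pipeline = PIPELINE(model, "rwkv_vocab_v20230424")  # World models use the RWKV world vocab

args = PIPELINE_ARGS(temperature=1.0, top_p=0.7)
print(pipeline.generate("The capital of France is", token_count=64, args=args))
```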

This is a repost of a past event, prior to the setup of this blog.

