RWKV Open Source Development Blog


🚀 RWKV.cpp - shipping to 1.5 billion systems worldwide

We went from ~50k installations to 1.5 billion: on every Windows 10 and 11 computer near you (even the ones in the IT store)

RWKV
Sep 03, 2024

Silently, overnight, it’s everywhere: in every Windows 10 and 11 PC.

Or more specifically, Windows 11 version 23H2 and Windows 10 version 22H2 …


C:\Program Files\Microsoft Office\root\vfs\ProgramFilesCommonX64\Microsoft Shared\OFFICE16

Today, you can literally walk into your local IT store, find a laptop with Windows 11 Copilot, search for rwkv (with system files enabled), and find the files there.

RWKV at the local IT store near you!
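If you would rather script the check than use the Windows search box, a small helper like the one below does the same thing: it walks a directory tree and collects file names containing a substring. The function name and arguments are ours for illustration, not part of any Microsoft or RWKV tooling.

```python
import os

def find_named_files(root: str, needle: str) -> list[str]:
    """Walk `root` and return paths whose file name contains `needle` (case-insensitive)."""
    matches = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if needle.lower() in name.lower():
                matches.append(os.path.join(dirpath, name))
    return matches

# On a Windows machine, point it at the Office install tree, e.g.:
# find_named_files(r"C:\Program Files\Microsoft Office\root\vfs", "rwkv")
```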

With an estimated half a billion Windows 11 and 1 billion Windows 10 installations, this marks the largest rollout of RWKV by installation count 🤯

Is it real?

To validate the binaries, we have since decompiled them and verified that they are based on the RWKV.cpp project, supporting up to version 5 of our models (we are currently on version 6).

So yes, it is real.
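You do not need a full decompiler to do a coarse version of this check yourself: scanning a binary for runs of printable ASCII, the way the Unix `strings` tool does, will usually surface symbol and identifier names that survive compilation. A minimal sketch (the function name, the minimum-length threshold, and the example file name are our choices, not from the actual analysis):

```python
def printable_strings(data: bytes, min_len: int = 4) -> list[str]:
    """Extract runs of printable ASCII of at least `min_len` bytes,
    similar to the Unix `strings` tool."""
    out, run = [], bytearray()
    for b in data:
        if 32 <= b < 127:          # printable ASCII range
            run.append(b)
        else:
            if len(run) >= min_len:
                out.append(run.decode("ascii"))
            run = bytearray()
    if len(run) >= min_len:        # flush a trailing run
        out.append(run.decode("ascii"))
    return out

# blob = open("some_copilot_binary.dll", "rb").read()  # hypothetical file name
# hits = [s for s in printable_strings(blob) if "rwkv" in s.lower()]
```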

Our project is Apache 2 licensed, so Microsoft is allowed to do this (assuming proper Apache 2 license attribution).

What is Microsoft using it for?

While it’s unclear what Microsoft is specifically using our models for, we believe this is in preparation for a local Copilot running on-device models.

RWKV's biggest advantage is its ability to process information like a transformer model at a fraction of the GPU time and energy cost, making it one of the world’s greenest models.

An AI model’s energy usage is critical for a laptop’s battery life.

RWKV is probably used in combination with the Microsoft Phi line of models (which handles image processing) to provide:

  • best-in-class multi-lingual support

  • low computation, batch processing in the background (MS recall)

  • general-purpose chat (though this is probably the phi model)

Its main advantages are its low energy cost and language support.


Fingers crossed on the rollout

For now, until the roll-out of an offline Copilot into the Microsoft operating system and/or Office 365, we will be keeping tabs on how our models are deployed into Windows.

We are excited to see what is next, as we scale out the deployment for the RWKV open source foundation model.

Change note: the article originally cited 0.5 billion, which is the estimated size of the Windows 11 deployment. It has been updated to include Windows 10, as we have received confirmation for it as well.

