As a major locally-hosted AI proponent, aka a kind of AI fan, absolutely. I’d wager it’s even worse than crypto, and I hate crypto.
What I’m kinda hoping happens is that bitnet takes off in the next few months/years, and that running a very smart model on a phone or desktop takes milliwatts… Who’s gonna buy into Sam Altman’s $7 trillion cloud scheme to burn the Earth when anyone can run models offline on their phones, instead of hitting APIs running on multi-kilowatt servers?
And ironically it may be a Chinese company like Alibaba that pops the bubble, lol.
If bitnet takes off, that’s very good news for everyone.
The problem isn’t AI, it’s AI that’s so intensive to host that only corporations with big datacenters can do it.
The fuck is bitnet
https://www.microsoft.com/en-us/research/publication/bitnet-scaling-1-bit-transformers-for-large-language-models/ use 1-bit weights instead of 8- or 16-bit ones, yay performance gainz
So will the return of the flag conclude the adventures of resource usage in computers?
What star said, but what it also does is turn hard matrix multiplication into simple addition.
Basically, AI will be hilariously easy to run compared to now once ASICs start coming out, though it will run on CPUs/GPUs just fine too.
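To make the “multiplication becomes addition” point concrete, here’s a toy sketch (not the actual BitNet kernel, and the function name is made up): if weights are quantized to {-1, 0, +1}, as in the BitNet b1.58 variant, a dot product needs no multiplies at all — each weight either adds the activation, subtracts it, or skips it.

```python
def ternary_dot(weights, activations):
    """Dot product with ternary weights: additions/subtractions only."""
    total = 0.0
    for w, x in zip(weights, activations):
        if w == 1:
            total += x   # +1: add the activation
        elif w == -1:
            total -= x   # -1: subtract it
        # 0: skip entirely (free sparsity)
    return total

# Same result as a regular dot product, but multiply-free:
w = [1, 0, -1, 1]
x = [2.0, 5.0, 3.0, 4.0]
print(ternary_dot(w, x))  # 2.0 - 3.0 + 4.0 = 3.0
```

That’s why cheap ASICs get exciting here: adders are far smaller and less power-hungry in silicon than the multiply-accumulate units that normal 8/16-bit matmuls need.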