arstechnica, English
@arstechnica@mastodon.social

Researchers upend AI status quo by eliminating matrix multiplication in LLMs

Running AI models without matrix math means far less power consumption—and fewer GPUs?

https://arstechnica.com/information-technology/2024/06/researchers-upend-ai-status-quo-by-eliminating-matrix-multiplication-in-llms/?utm_brand=arstechnica&utm_social-type=owned&utm_source=mastodon&utm_medium=social

PhoenixGee,
@PhoenixGee@soc.k512.studio

@arstechnica What's more likely to happen is that they'll change their training software to accommodate the new algorithms, and if it really is that much more efficient, they'll STILL use the same power and GPUs for EVEN MOOOOOORE AI.

It's almost like when you get a new hard drive and go "I'm never gonna fill that up," and a few months later you're at capacity again...
