August 24, 2021
“We have 1,000 times larger models requiring more than 1,000 times more compute, and that has happened in 2 years”
For comparison: GPT-3 has 175 billion parameters.
@CerebrasSystems CS-2 brain-scale chip can power #AI models with 120 trillion parameters https://t.co/DYKRuMBJv5 https://t.co/quKZhxPvm3
Original tweet: https://twitter.com/giano/status/1430271454931787781
This is one of the many thoughts I post on Twitter on a daily basis. They span many disciplines, including art, artificial intelligence, automation, behavioral economics, cloud computing, cognitive psychology, enterprise management, finance, leadership, marketing, neuroscience, startups, and venture capital.
I archive all my tweets here.