6) An alternative to the transformer architecture will see meaningful adoption.
57 · Ṁ3956 · Dec 31 · 69% chance

All of these predictions are taken from Forbes/Rob Toews' "10 AI Predictions For 2024".
The 2023 predictions can be found here, and their resolution here.
You can find all the markets under the tag [2024 Forbes AI predictions].

  • I will resolve this market to whatever Forbes/Rob Toews says in his resolution article for the 2024 predictions.

  • I might bet in this market, as I have no power over the resolution.


Description of this prediction from the article:
Introduced in a seminal 2017 paper out of Google, the transformer architecture is the dominant paradigm in AI technology today. Every major generative AI model and product in existence—ChatGPT, Midjourney, GitHub Copilot and so on—is built using transformers.

But no technology remains dominant forever.

On the edges of the AI research community, a few groups have been hard at work developing novel, next-generation AI architectures that are superior to transformers in different ways.

One key hub of these efforts is Chris Ré’s lab at Stanford. The central theme of Ré and his students’ work has been to build a new model architecture that scales sub-quadratically with sequence length (rather than quadratically, as transformers do). Sub-quadratic scaling would enable AI models that are (1) less computationally intensive and (2) better able to process long sequences compared to transformers. Notable sub-quadratic model architectures out of Ré’s lab in recent years have included S4, Monarch Mixer and Hyena.
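To make the scaling comparison concrete, here is a minimal, illustrative NumPy sketch (not the actual S4, Monarch Mixer, Hyena, or Mamba code) contrasting the two regimes: standard self-attention materializes an L × L score matrix, so cost grows quadratically with sequence length L, while a toy linear recurrence in the spirit of state-space models performs one fixed-size state update per token, so cost grows linearly. The function names and dimensions below are hypothetical.

```python
import numpy as np

def naive_attention(x, Wq, Wk, Wv):
    """Standard self-attention: the (L, L) score matrix is the quadratic term."""
    q, k, v = x @ Wq, x @ Wk, x @ Wv                 # each (L, d)
    scores = q @ k.T / np.sqrt(k.shape[-1])          # (L, L)  <- quadratic in L
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ v                               # (L, d)

def linear_scan(x, A, B, C):
    """Toy state-space-style recurrence: one O(1) state update per token,
    so total cost is linear in L."""
    L, _ = x.shape
    state = np.zeros(A.shape[0])
    out = np.empty((L, C.shape[0]))
    for t in range(L):
        state = A @ state + B @ x[t]                 # fixed-size update
        out[t] = C @ state
    return out

# Quick shape check with hypothetical dimensions
L, d, n = 8, 4, 6
rng = np.random.default_rng(0)
x = rng.normal(size=(L, d))
print(naive_attention(x, *(rng.normal(size=(d, d)) for _ in range(3))).shape)  # (8, 4)
print(linear_scan(x, 0.1 * rng.normal(size=(n, n)),
                  rng.normal(size=(n, d)), rng.normal(size=(d, n))).shape)     # (8, 4)
```

The point of the sketch is only the asymptotics: doubling L quadruples the attention score matrix but merely doubles the number of recurrent steps, which is what "sub-quadratic scaling" buys for long sequences.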

The most recent sub-quadratic architecture—and perhaps the most promising yet—is Mamba. Published just last month by two Ré protégés, Mamba has inspired tremendous buzz in the AI research community, with some commentators hailing it as “the end of transformers.”

Other efforts to build alternatives to the transformer architecture include liquid neural networks, developed at MIT, and Sakana AI, a new startup led by one of the co-inventors of the transformer.

Next year, we predict that one or more of these challenger architectures will break through and win real adoption, transitioning from a mere research novelty to a credible alternative AI approach used in production.

To be clear, we do not expect transformers to go away in 2024. They are a deeply entrenched technology on which the world’s most important AI systems are based. But we do predict that 2024 will be the year in which cutting-edge alternatives to the transformer become viable options for real-world AI use cases.


I think Mamba has crossed the threshold of "meaningful adoption" at this point

bought Ṁ200 NO

How much is significant?