Is attention all you need? (transformers SOTA in 2027)
56% chance · Ṁ23k bet · resolves 2027
This market simulates the wager between Jonathan Frankle (@jefrankle) and Sasha Rush (@srush_nlp).
Details can be found at https://www.isattentionallyouneed.com/
Proposition
On January 1, 2027, a Transformer-like model will continue to hold the state-of-the-art position in most benchmarked tasks in natural language processing.
What about hybrid models, like Jamba? They might be the best of both worlds.
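For context on why hybrids make the classification ambiguous: Jamba interleaves attention layers with state-space (Mamba) layers in a single stack. Below is a minimal PyTorch sketch of that interleaving, not Jamba's actual architecture; the `ToySSMLayer` is a stand-in gated linear recurrence rather than Mamba's selective scan, and the mixing ratio, names, and sizes are all illustrative.

```python
# Minimal sketch of a Jamba-style hybrid stack: attention layers interleaved
# with recurrent state-space layers. ToySSMLayer is an illustrative gated
# linear recurrence, NOT Mamba's selective scan; the ratio is made up.
import torch
import torch.nn as nn


class ToySSMLayer(nn.Module):
    """Stand-in for a Mamba layer: per-channel gated linear recurrence."""
    def __init__(self, d_model: int):
        super().__init__()
        self.decay = nn.Parameter(torch.zeros(d_model))  # sigmoid -> (0, 1)
        self.proj_in = nn.Linear(d_model, d_model)
        self.proj_out = nn.Linear(d_model, d_model)

    def forward(self, x):                    # x: (batch, seq, d_model)
        a = torch.sigmoid(self.decay)
        h = torch.zeros(x.size(0), x.size(2))
        states = []
        for t in range(x.size(1)):           # sequential recurrence over time
            h = a * h + self.proj_in(x[:, t])
            states.append(h)
        return x + self.proj_out(torch.stack(states, dim=1))  # residual


class AttentionLayer(nn.Module):
    """Ordinary self-attention sublayer with a residual connection."""
    def __init__(self, d_model: int, n_heads: int = 4):
        super().__init__()
        self.norm = nn.LayerNorm(d_model)
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)

    def forward(self, x):
        h = self.norm(x)
        return x + self.attn(h, h, h, need_weights=False)[0]


class HybridStack(nn.Module):
    """One attention layer per three SSM layers (ratio is illustrative)."""
    def __init__(self, d_model: int = 64, n_blocks: int = 2):
        super().__init__()
        layers = []
        for _ in range(n_blocks):
            layers += [AttentionLayer(d_model)] + [ToySSMLayer(d_model) for _ in range(3)]
        self.layers = nn.Sequential(*layers)

    def forward(self, x):
        return self.layers(x)


x = torch.randn(2, 16, 64)
print(HybridStack()(x).shape)  # torch.Size([2, 16, 64])
```

Whether a stack like this "is a Transformer" for resolution purposes depends on how much weight the attention layers carry, which is exactly what the comment is asking.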
@EchoNolan I talked to Sasha, and his response is basically that as long as the E in the MoE is a Transformer, it's a Transformer.
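To illustrate that reading: in a Mixture-of-Experts Transformer, the routed experts are copies of the standard feed-forward sublayer, so the block stays an ordinary Transformer layer with a learned switch in front of its FFN. Here is a minimal sketch with top-1 (Switch-style) routing; all class names and dimensions are illustrative assumptions, not any particular model's implementation.

```python
# Minimal sketch: a pre-norm Transformer block whose FFN is replaced by a
# top-1 routed Mixture-of-Experts. Each expert is itself the ordinary
# Transformer feed-forward sublayer, which is the point of the argument.
import torch
import torch.nn as nn
import torch.nn.functional as F


class FFNExpert(nn.Module):
    """One expert: the standard Transformer feed-forward sublayer."""
    def __init__(self, d_model: int, d_ff: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model)
        )

    def forward(self, x):
        return self.net(x)


class MoEFeedForward(nn.Module):
    """Top-1 routed mixture of FFN experts (Switch-style routing)."""
    def __init__(self, d_model: int, d_ff: int, n_experts: int):
        super().__init__()
        self.router = nn.Linear(d_model, n_experts)
        self.experts = nn.ModuleList(FFNExpert(d_model, d_ff) for _ in range(n_experts))

    def forward(self, x):                         # x: (batch, seq, d_model)
        gate = F.softmax(self.router(x), dim=-1)  # routing probabilities
        top_p, top_i = gate.max(dim=-1)           # top-1 expert per token
        out = torch.zeros_like(x)
        for i, expert in enumerate(self.experts):
            mask = top_i == i                     # tokens routed to expert i
            if mask.any():
                out[mask] = top_p[mask].unsqueeze(-1) * expert(x[mask])
        return out


class MoETransformerBlock(nn.Module):
    """Standard pre-norm Transformer block; only the FFN becomes an MoE."""
    def __init__(self, d_model=64, n_heads=4, d_ff=256, n_experts=4):
        super().__init__()
        self.norm1 = nn.LayerNorm(d_model)
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norm2 = nn.LayerNorm(d_model)
        self.moe = MoEFeedForward(d_model, d_ff, n_experts)

    def forward(self, x):
        h = self.norm1(x)
        x = x + self.attn(h, h, h, need_weights=False)[0]
        return x + self.moe(self.norm2(x))


x = torch.randn(2, 8, 64)
print(MoETransformerBlock()(x).shape)  # torch.Size([2, 8, 64])
```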
@jacksonpolack Hm, I will add a subsidy later, once the market stabilizes, to maintain that.
Related questions
Will a transformer based model be SOTA for video generation by the end of 2025? (82% chance)
Is Attention All You Need? (62% chance)
Will SOTA on MATH in Sep 2024 utilize a hard-coded search/amplification procedure? (42% chance)
Will the transformer architecture be replaced in SOTA LLMs by 2028? (61% chance)