On January 1, 2027, a Transformer-like model will continue to hold the state-of-the-art position on most benchmarks
59% chance
Tracking external bet: https://www.isattentionallyouneed.com/
Eagle 7B: https://blog.rwkv.com/p/eagle-7b-soaring-past-transformers

It even links the wager page in its bragging points: "All while being an 'Attention-Free Transformer'"
Related questions
By the end of Q1 2025 will an open source model beat OpenAI’s o1 model?
44% chance
Will there be a model that has a 75% win rate against the latest iteration of GPT-4 as of January 1st, 2025?
63% chance
Will models be able to do the work of an AI researcher/engineer before 2027?
36% chance
Will the most capable, public multimodal model at the end of 2027 in my judgement use a transformer-like architecture?
55% chance
Will transformers still be the dominant DL architecture in 2026?
58% chance
Will superposition in transformers be mostly solved by 2026?
62% chance
Will Transformer based architectures still be SOTA for language modelling by 2026?
69% chance
If we find out in 2024, was o1's Transformer base trained on 10+x as much compute as GPT-4's?
45% chance
Will any model get above human level (92%) on the Simple Bench benchmark before September 1st, 2025?
60% chance
By EOY 2025, will the model with the lowest perplexity on Common Crawl not be based on transformers?
28% chance