At least one of the most powerful neural nets at end of 2030 will be trained using 10^26 FLOPs
Resolved YES on Apr 15

Resolves YES if at least one of the most powerful neural nets publicly known to exist by end of 2030 was trained using at least 10^26 FLOPs. This is ~3 exaFLOP/s-years. It does not matter if the compute is distributed, as long as one of the largest models used it. A neural net that uses 10^26 FLOPs but is inferior to other models does not count. Low-precision floating point such as fp32, fp16, or fp8 is permitted.
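As a quick sanity check on the ~3 exaFLOP/s-years figure above (a sketch for illustration, not part of the market's terms), the conversion is total FLOPs divided by one exaFLOP/s sustained for one year:

```python
# Convert 10^26 FLOPs into exaFLOP/s-years.
# 1 exaFLOP/s = 1e18 FLOP/s sustained; a year is ~3.156e7 seconds.
SECONDS_PER_YEAR = 365.25 * 24 * 3600  # ~3.156e7 s

total_flops = 1e26
exaflops_years = total_flops / (1e18 * SECONDS_PER_YEAR)

print(round(exaflops_years, 2))  # ~3.17, i.e. roughly 3 exaFLOP/s-years
```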

Resolves NO if no such model exists by end of 2030.

If we have no good estimates of training compute usage of top models, resolves N/A.



bought Ṁ1,250 YES

@Amaryllis This should resolve YES; Grok 3 meets this criterion.

@MingCat thanks!

@traders I'm planning on resolving YES unless someone wants to object.