an LLM as capable as GPT-4 runs on one 3090 by March 2025
Ṁ1640 · Mar 2 · 30% chance

e.g. WinoGrande >= 87.5%

I imagine the weights need to fit in VRAM; does that sound right? Or can the solution involve moving weights on/off the GPU?

@NoaNabeshima weights have to fit in VRAM

@sylv RAM and not necessarily VRAM?

*VRAM, sorry
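
A rough back-of-the-envelope sketch of the VRAM constraint being discussed. The parameter counts and bits-per-weight below are illustrative assumptions, not part of the market's resolution criteria, and the calculation ignores KV cache and activation memory:

```python
# Rough check: do a model's weights alone fit in a 3090's 24 GB of VRAM?
# Parameter counts and quantization levels are illustrative assumptions.

GIB = 1024 ** 3
VRAM_3090_BYTES = 24 * GIB

def weight_bytes(n_params: float, bits_per_weight: float) -> float:
    """Memory for the weights only (ignores KV cache and activations)."""
    return n_params * bits_per_weight / 8

for n_params, bits in [(70e9, 16), (70e9, 4), (34e9, 4), (13e9, 8)]:
    needed = weight_bytes(n_params, bits)
    verdict = "fits" if needed <= VRAM_3090_BYTES else "does not fit"
    print(f"{n_params / 1e9:.0f}B params @ {bits}-bit: "
          f"{needed / GIB:.1f} GiB -> {verdict}")
```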

bought Ṁ80 NO

@sylv Is the difference between the 3090 market and the 4090 market because the 4090 supports FP8?

@NoaNabeshima I mean it has better specs in general, but FP8 is pretty cool
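
For context on the FP8 point: Ada-generation cards like the 4090 (compute capability 8.9) have FP8 tensor cores, while Ampere cards like the 3090 (8.6) do not. A minimal PyTorch sketch that just reports the device's compute capability, assuming FP8 tensor-core support roughly tracks capability >= 8.9:

```python
import torch

# Ada (sm_89, e.g. RTX 4090) has FP8 tensor cores; Ampere (sm_86,
# e.g. RTX 3090) does not. This only reports the compute capability.
if torch.cuda.is_available():
    major, minor = torch.cuda.get_device_capability(0)
    name = torch.cuda.get_device_name(0)
    has_fp8 = (major, minor) >= (8, 9)  # assumption: FP8 tracks sm >= 8.9
    print(f"{name}: sm_{major}{minor}, "
          f"FP8 tensor cores: {'yes' if has_fp8 else 'no'}")
else:
    print("No CUDA device visible.")
```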

@NoaNabeshima Oh it's different dates