Will an open sourced SOTA LLM be trained on Intel hardware by 2024?
14% chance · Ṁ359 · closes Dec 31

In this market, we define:

  • Open sourced: the weights are released to the public and available to download

If all of the above conditions are fulfilled, this market will be resolved.


I recall Facebook was experimenting with Habana chips in their servers a few years ago, but I think they have settled on Nvidia by now. OpenAI was using Graphcore (I think?) and Nvidia, so it's not going to happen there. Most others are sticking with Nvidia. There's a niche business in FPGAs for deep learning, and Intel is one of the two manufacturers of cutting-edge FPGAs, but those are usually used for inference rather than training.

So this isn't impossible, but does seem unlikely.

Intel's consumer GPUs aren't good enough (they only offer lower-end options) and won't be for a while, and the model's maker would have to disclose that they trained on Intel. Really unlikely.

predicts YES

@ShadowyZephyr Intel's HPC GPUs are very good. PVC has more VRAM than anything available at the moment; I wouldn't discount Raja Koduri. Remember, he was basically responsible for GCN, which is now CDNA.

predicts NO

@GiftedGummyBee From what I could find, it has the same amount of VRAM as AMD's offerings. Thank you for making me aware of this, though; I didn't realize they had HPC offerings at the moment.

predicts YES

@ShadowyZephyr That is very true. However, there are two things to consider. First, AMD has ROCm, which is not working very well, while oneAPI has already been proven to work again and again. Second, Intel has been in the deep learning game much longer, as seen with how they already had Habana in 2020 and earlier. I would bet more on Intel than AMD for those reasons.

predicts NO

@GiftedGummyBee Hmm, interesting. I didn't know Intel's Habana beat the A100 so early on. Although I think availability is still an issue for Intel (it's easy for a consumer to buy an A100; an Intel Habana, not so much), it does seem more likely than I first thought.