What will the lineage of GPT-4o be known to be at the end of 2024?
Closes Dec 31

GPT-4 finetune: 2%
New base model: 35%
Unknown: 63%

Resolves at the end of 2024 to whether GPT-4o is a GPT-4 finetune or a completely new base model, if this is known (e.g. through a public statement or very convincing forensics), and to Unknown otherwise.


it's obviously not a finetune lol

bought Ṁ5 GPT-4 finetune YES

None of the above ๐Ÿ˜Ÿ

I don't seem to be able to add an answer, but I'm guessing that the referenced dramatic decrease in inference costs is the result of a model being distilled.

It may not be a straightforward distillation of some base GPT-4 model; they may have done fancier things to improve how many useful, generalizable patterns end up in the distilled model. Still, it seems obvious that a distillation step is among the main techniques that would make GPT-4o importantly distinct from GPT-4.

https://youtu.be/fMtbrKhXMWc?t=6m&si=PldsUGtz4P_KjtXG

(Sam Altman talking about GPT-4o being way cheaper to run)

https://en.wikipedia.org/wiki/Knowledge_distillation

(What I mean, roughly, by "distillation": using the embeddings output by a larger model to train a smaller one, though in my mind this is more of a general category of compressing the trained shape of a larger model into a smaller one.)
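For anyone unfamiliar with the technique, here is a minimal sketch of the classic soft-label form of knowledge distillation described in the Wikipedia article above, where a small student is trained to match a frozen teacher's softened output distribution alongside the usual next-token loss. The models, sizes, temperature, and loss weighting are toy placeholders, and none of this is a claim about how GPT-4o itself was actually produced.

```python
# Minimal soft-label knowledge distillation sketch (PyTorch).
# Teacher/student models, temperature, and alpha are illustrative
# placeholders, not anything known about GPT-4 / GPT-4o.
import torch
import torch.nn as nn
import torch.nn.functional as F

VOCAB, DIM_T, DIM_S = 1000, 512, 128  # toy vocabulary and hidden sizes

# Stand-ins for a large "teacher" and a small "student" LM head.
teacher = nn.Sequential(nn.Embedding(VOCAB, DIM_T), nn.Linear(DIM_T, VOCAB))
student = nn.Sequential(nn.Embedding(VOCAB, DIM_S), nn.Linear(DIM_S, VOCAB))

def distillation_loss(student_logits, teacher_logits, targets,
                      temperature=2.0, alpha=0.5):
    """Blend a soft loss (match the teacher's softened distribution)
    with a hard loss (predict the actual next token)."""
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * (temperature ** 2)
    hard = F.cross_entropy(student_logits, targets)
    return alpha * soft + (1 - alpha) * hard

optimizer = torch.optim.AdamW(student.parameters(), lr=1e-4)
tokens = torch.randint(0, VOCAB, (32,))   # fake input tokens
targets = torch.randint(0, VOCAB, (32,))  # fake next-token targets

with torch.no_grad():                     # teacher stays frozen
    teacher_logits = teacher(tokens)
loss = distillation_loss(student(tokens), teacher_logits, targets)
loss.backward()
optimizer.step()
```

The point of the temperature and the KL term is that the teacher's full output distribution carries more signal per token than the single correct label, which is part of why a much smaller (and cheaper-to-run) student can recover a surprising amount of the teacher's behavior.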