Will Kurzgesagt release a video specifically about the AI Alignment Problem before the end of 2024?
284 · Ṁ62k · Jan 1 · 66% chance

Resolves YES if, at any time before the end of 2024, there is a video on the Kurzgesagt YouTube channel that I deem to be about the AI Alignment Problem. It does not have to mention the words "alignment problem," but it must describe the same concept.


I'm really glad that Kurzgesagt published the video that was just released, but I don't think it counts as being about alignment. I hope they publish another video that serves as an explainer for the alignment problem, and I hope they can do that soon.

opened a Ṁ1,000 YES at 75% order

I agree that the video released today does not count, but they hinted that there would be a video about alignment in the future.

bought Ṁ1,000 YES

Amazing. I’ll watch this sometime today to confirm that it’s about the alignment problem.

It seems clearly not to be, though it does hint at future videos that might be. It says AI might go very well or very poorly, but it's a stretch to call that "about the alignment problem"; the video makes it sound more like whoever builds AI has control of it and can use it for good or for bad.

bought Ṁ250 NO

Curious if anyone disagrees and thinks this video counts. Of course including you, Harlan.

I had a similar impression. The topic kind of lingers in the air, and they mention it on the side but not directly (see picture). I don't think this video is "about" the alignment problem, but a future one very well might be.

I watched it. It briefly alludes to rogue AI at the end, but it doesn't really talk about alignment, just that AGI is a transformative technology on the horizon that carries risks, with a focus on misuse risks.

I see a lot of doomerism veer too deep into cheap apocalyptic sensationalism, but there's a wide range of material Kurzgesagt could deliver that would be very high quality: realistic, believable doom-mongering that is also educational. They'd be idiots not to go there within a year.

predicts YES

They have recently released two OpenPhil-funded videos on bio and nuclear risk, so this seems increasingly likely given the relative importance placed on AI alignment.

Their latest video was explicitly inspired by Rational Animations, which puts out a lot of AI alignment content. I think there's a really good chance we'll see Kurzgesagt put out a video or two on the topic soon. https://youtu.be/Kr57ax0OWMk

I really hope they do; the epistemic quality of their videos is usually quite decent, and they're quite trusted by their viewers.