In 2028, will AI be at least as big a political issue as abortion?
2028 · 70% chance

I will resolve this based on some combination of how much it gets talked about in elections, how much money goes to interest groups on both topics, and how much of the "political conversation" seems to be about either.


By 2028 I expect the current hype will have died down a lot. AI will not have cured cancer, or become sentient, or whatever the fuck Manifold and the SV-adjacent "intelligentsia" fantasize about. It will be a normal tool, good for some stuff, bad for most other stuff. I use it every day to help me code, but farmers use machines to help them farm and I don't think combine harvesters were ever a hot political topic.

@pietrokc People were killing each other over mechanized textile production in the 1810s until the British government sent in 12,000 troops to arrest the protestors.

John Deere is still pretty politically salient today; it's the most commonly cited reason for the right-to-repair bills that have been passed in a few states recently.

Neither of those really reached 'as significant as abortion' levels, but if AI can reduce demand for knowledge workers in the US by 10%, that's 10 million people who have lost their livelihoods. I think it's quite plausible that that could push the issue to the top priority.
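The arithmetic behind that comment can be made explicit. Note that the ~100 million knowledge-worker base is an assumption implied by the comment, not a figure from the thread; real estimates of the US knowledge workforce vary widely:

```python
# Back-of-envelope check: a 10% demand reduction displacing ~10 million people
# implies a base of ~100 million US knowledge workers (an assumed figure).
us_knowledge_workers = 100_000_000  # assumed base, not sourced from the thread
displacement_rate = 0.10            # the 10% reduction posited above
displaced = int(us_knowledge_workers * displacement_rate)
print(displaced)  # → 10000000
```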

@Frogswap That's interesting (though I have not fact checked it), but as you say people protest things all the time and many small laws exist for many diverse reasons. But abortion was at one time (haven't checked lately) the single issue for O(10%) of voters.

An important requirement for something to be a big political issue is that people have to disagree about it. If 10M people lose their livelihoods, do they disagree on anything? I guess somehow the impacted people could be overwhelmingly Dems or GOP, but afaik "knowledge work" is done ~equally by supporters of both parties.

@pietrokc If 10 million voters think their only chance of holding on to their living conditions is through government action, both parties are going to be advertising their strategy as much as they can going into the 2028 election. I expect that's true even if they have the same position, but there are plenty of degrees of freedom beyond everyone being anti-AI. One might support a ban while the other supports UBI via nationalization, for example.

@pietrokc If the hype dies down a lot, there is quite a chance that would cause a huge market crash (โ€œ35% of the US stock market is held up by five or six companies buying GPUsโ€). Would that count as โ€œAI being a political issueโ€?

@PetrKadlec In my understanding the crash itself would not count, but it would likely lead to demands for regulation etc., which would count.

@PetrKadlec I read that essay and I thought it was dumb. I don't think the hype dying down would lead to a huge market crash.

How would this resolve if significantly more money is spent from/on AI but it isn't mentioned at all in debates/messaging?

bought Ṁ245 NO

@zaperrer That has to resolve NO. Way more money is spent on cars than on AI but cars are not a bigger political issue than AI.

bought Ṁ25,000 YES

It's very possible that AI becomes very politically significant but is not divisive enough to be a major electoral issue. It also suffers from being so all-encompassing that it's hard to rally against, or to trust our governing bodies to handle. It'd be akin to having capitalism as a political issue.

The only way I can see this happening is if abortion becomes a much smaller topic nationally. It's been sent back to the states, so it is no longer a national issue. Is anyone trying to push national legislation for or against it? I haven't heard much about it since the SCOTUS decision.

@pietrokc bro it has already happened

@bens lol maybe in the Manifold bubble.

Regardless, this question is about what is the case in 2028, not now

@JimHays I'm worried Scott may resolve the market based on something other than how voters prioritize the issues.

@benjaminIkuta Certainly, based on the written description, there are other criteria that should be considered. But I do think it's one useful barometer for anybody who might think that abortion is not that big of a deal politically.

LOL odds still 1%, 10% if market improperly resolved.

bought Ṁ25 YES at 68%
bought Ṁ100 NO from 71% to 70%

One way this could happen is if AI is performing the abortions

sold Ṁ1,079 NO

@Tumbles lol, I got a push notification that you'd filled my limit orders and I thought "good grief, is Tumbles diversifying and betting in a market other than the Canadian election??" but turns out you were selling not buying so I guess not. 😉

@Fion ALL SHARES MUST FEED PIERRE

Decided to add to my YES position.

By 2028 the AI stuff is going to be interwoven with most of the existing issues.

AI and labor force replacement.

AI and military competitiveness

AI and Healthcare

AI and education

etc. etc.

The total amount of money flowing into politics vis-à-vis AI, and the total amount of discussion time related to it, will outstrip abortion as an issue.

I think explicit "vote for me because of my position on abortion" messaging might still be more common, but OP is likely to (imo correctly) look past mere attempts to win single-issue voters.

https://x.com/adcock_brett/status/1886860098980733197

If general purpose robots are enabled by AI, but the conversation is focused on the robots themselves without any mention of "AI," does it count? It's unlikely people wouldn't draw the connection, but just in case.

@fakebaechallenge (I'm holding a yes)

I would think robots should count, because the debates about that are going to be directly tied in to AI debates.

Different from a case like "auto-translate is so good that tech support is being offshored," imo.

Analysis:

  • Scenarios leaning towards a No resolution:

    • Capabilities rapidly plateau; maybe some AI companies overextend and go bankrupt leading to less promotion and more use of (maybe weaker, maybe more expensive) open-source models; various obstruction efforts successfully slow down adoption enough that in 2028 (less than three years from time of writing) things still haven't exploded in everyone's faces. AI's rise as a political issue does not accelerate.

    • AI is de-powered and becomes a Resolved Issue.

      • PauseAI succeeds, an international treaty is adopted with strict limits on anything more powerful than ~GPT-4.

      • AI rights advocates win, ban AI slavery/brainwashing/execution, eliminating practical uses.

    • Global catastrophe eliminates or radically reduces technological growth and/or compute-related manufacturing (but still leaves enough tech for Manifold to continue functioning).

  • Scenarios where resolution is irrelevant:

    • ASI happens.

    • ASI is imminent but isn't the global emergency of everyone's focus. Absence of global pause efforts results in certain apocalypse.

    • Other annihilation of existing technological civilization.

  • Scenarios leaning towards a Yes resolution:

    • Imminent X-risk (or other issue pointing towards Pause) becomes obvious to the public, is Not Solved, and becomes the global emergency of everyone's focus.

    • AI is Resolved, but in a manner leading to longer-term controversy about the details of its resolution.

    • Capabilities rapidly plateau, but adoption doesn't slow. "Mundane" AI concerns (e.g., unemployment) overtake the political landscape.

Overall, I'd say the current 60-65% is about right, maybe a bit high.