OpenAI memory-using chatbot causes fatal domestic dispute by EOY2026?
20
Ṁ5518
Dec 31
50% chance

This market predicts whether a fatal domestic dispute, where the primary cause is the use of OpenAI's memory-using chatbot, will be reported by December 31, 2026.
A spousal or relationship conflict, a conflict between cohabiting siblings, or one between a legal guardian or estranged parent and their child would all qualify.

"primarily involving OpenAI's memory-using chatbot"
A dispute whose greatest single cause involves frequency of use, permissions, or the deletion of the offender's stored memory by the victim would cause this market to resolve YES.

OpenAI has enhanced ChatGPT with persistent memory capabilities, allowing it to remember user preferences and context across sessions. While this feature aims to improve user experience, it raises concerns about extended agency and parasociality.

The Sewell Setzer III incident would NOT cause this market to resolve YES, as this would not be considered a dispute.

Resolution will be based on credible news reports from reputable sources confirming such an incident.

  • Update 2025-12-14 (PST) (AI summary of creator comment): The creator has clarified that for a fatal incident to resolve YES, there must be evidence of interpersonal conflict about ChatGPT usage between the parties involved (e.g., one party trying to curb the other's use or taking action against their usage).

A private escalating delusion that results in tragedy without such interpersonal dispute would NOT qualify, even if ChatGPT was heavily involved. The dispute must be fundamentally about the usage of ChatGPT by one party, not merely a situation where ChatGPT played a role in someone's private mental state.


@spiderduckpig if there is evidence that the mother tried to curb the son's use or took action to that effect, I would resolve this YES.

If this was simply an escalating private delusion that resulted in tragedy, I do not think this is sufficiently "a dispute involving OpenAI's memory-using chatbot".

I am open to further discussion - my phrasing around "primary cause is the use of" is somewhat ambiguous, but my intent was "the dispute is fundamentally _about_ the usage of ChatGPT by one party."