Elon Musk has been very explicit in promising a robotaxi launch in Austin in June with unsupervised full self-driving (FSD). We'll give him some leeway on the timing and say this counts as a YES if it happens by the end of August.
As of April 2025, Tesla appears to be testing this with employees using supervised FSD, while doubling down on the public Austin launch.
PS: A big monkey wrench no one anticipated when we created this market is how to treat the passenger-seat safety monitors. See FAQ9 for how we're trying to handle that in a principled way. Tesla is very polarizing and I know it's "obvious" to one side that safety monitors = "supervised" and equally "obvious" to the other side that the driver's seat being empty is what matters. I can't emphasize enough how not obvious any of this is. At least so far, speaking now in August 2025.
FAQ
1. Does it have to be a public launch?
Yes, but we won't quibble about waitlists. As long as even 10 non-handpicked members of the public have used the service by the end of August, that's a YES. Also if there's a waitlist, anyone has to be able to get on it and there has to be intent to scale up. In other words, Tesla robotaxis have to be actually becoming a thing, with summer 2025 as when it started.
If it's invite-only and Tesla is hand-picking people, that's not a public launch. If it's viral-style invites with exponential growth from the start, that's likely to be within the spirit of a public launch.
A potential litmus test is whether serious journalists and Tesla haters end up able to try the service.
2. What if there's a human backup driver in the driver's seat?
This importantly does not count. That's supervised FSD.
3. But what if the backup driver never actually intervenes?
Compare to Waymo, which goes millions of miles between [injury-causing] incidents. If there's a backup driver we're going to presume that it's because interventions are still needed, even if rarely.
4. What if it's only available for certain fixed routes?
That would resolve NO. It has to be available on unrestricted public roads [restrictions like no highways are ok] and you have to be able to choose an arbitrary destination. I.e., it has to count as a taxi service.
5. What if it's only available in a certain neighborhood?
This we'll allow. It just has to be a big enough neighborhood that it makes sense to use a taxi. Basically anything that isn't a drastic restriction of the environment.
6. What if they drop the robotaxi part but roll out unsupervised FSD to Tesla owners?
This is unlikely but if this were level 4+ autonomy where you could send your car by itself to pick up a friend, we'd call that a YES per the spirit of the question.
7. What about level 3 autonomy?
Level 3 means you don't have to actively supervise the driving (like you can read a book in the driver's seat) as long as you're available to immediately take over when the car beeps at you. This would be tantalizingly close and a very big deal but is ultimately a NO. My reason to be picky about this is that a big part of the spirit of the question is whether Tesla will catch up to Waymo, technologically if not in scale at first.
8. What about tele-operation?
The short answer is that that's not level 4 autonomy so that would resolve NO for this market. This is a common misconception about Waymo's phone-a-human feature. It's not remotely (ha) like a human with a VR headset steering and braking. If that ever happened it would count as a disengagement and have to be reported. See Waymo's blog post with examples and screencaps of the cars needing remote assistance.
To get technical about the boundary between a remote human giving guidance to the car vs remotely operating it, grep "remote assistance" in Waymo's advice letter filed with the California Public Utilities Commission last month. Excerpt:
The Waymo AV [autonomous vehicle] sometimes reaches out to Waymo Remote Assistance for additional information to contextualize its environment. The Waymo Remote Assistance team supports the Waymo AV with information and suggestions [...] Assistance is designed to be provided quickly - in a matter of seconds - to help get the Waymo AV on its way with minimal delay. For a majority of requests that the Waymo AV makes during everyday driving, the Waymo AV is able to proceed driving autonomously on its own. In very limited circumstances such as to facilitate movement of the AV out of a freeway lane onto an adjacent shoulder, if possible, our Event Response agents are able to remotely move the Waymo AV under strict parameters, including at a very low speed over a very short distance.
Tentatively, Tesla needs to meet the bar for autonomy that Waymo has set. But if there are edge cases where Tesla is close enough in spirit, we can debate that in the comments.
9. What about human safety monitors in the passenger seat?
Oh geez, it's like Elon Musk is trolling us to maximize the ambiguity of these market resolutions. Tentatively (we'll keep discussing in the comments) my verdict on this question depends on whether the human safety monitor has to be eyes-on-the-road the whole time with their finger on a kill switch or emergency brake. If so, I believe that's still level 2 autonomy. Or sub-4 in any case.
See also FAQ3 for why this matters even if a kill switch is never actually used. We need there not only to be no actual disengagements but no counterfactual disengagements. Like imagine that these robotaxis would totally mow down a kid who ran into the road. That would mean a safety monitor with an emergency brake is necessary, even if no kids happen to jump in front of any robotaxis before this market closes. Waymo, per the definition of level 4 autonomy, does not have that kind of supervised self-driving.
10. Will we ultimately trust Tesla if it reports it's genuinely level 4?
I want to avoid this since I don't think Tesla has exactly earned our trust on this. I believe the truth will come out if we wait long enough, so that's what I'll be inclined to do. If the truth seems impossible for us to ascertain, we can consider resolve-to-PROB.
11. Will we trust government certification that it's level 4?
Yes, I think this is the right standard. Elon Musk said on 2025-07-09 that Tesla was waiting on regulatory approval for robotaxis in California and expected to launch in the Bay Area "in a month or two". I'm not sure what such approval implies about autonomy level but I expect it to be evidence in favor. (And if it starts to look like Musk was bullshitting, that would be evidence against.)
12. What if it's still ambiguous on August 31?
Then we'll extend the market close. The deadline for Tesla to meet the criteria for a launch is August 31 regardless. We just may need more time to determine, in retrospect, whether it counted by then. I suspect that with enough hindsight the ambiguity will resolve. Note in particular FAQ1 which says that Tesla robotaxis have to be becoming a thing (what "a thing" is is TBD but something about ubiquity and availability) with summer 2025 as when it started. Basically, we may need to look back on summer 2025 and decide whether that was a controlled demo, done before they actually had level 4 autonomy, or whether they had it and just were scaling up slowly and cautiously at first.
Ask more clarifying questions! I'll be super transparent about my thinking and will make sure the resolution is fair if I have a conflict of interest due to my position in this market.
[Ignore any auto-generated clarifications below this line. I'll add to the FAQ as needed.]
Thanks, looks like the source for that is the following:
https://www.tesla.com/careers/search/?query=vehicle%20operator&site=US
Namely, Tesla is hiring 10 test drivers across 6 US states. Excerpt from the job description:
Drive an engineering vehicle for extended hours in a designated area for data collection, 5 to 8 hours daily [...] conducting dynamic audio and camera data collection for testing and training purposes [...] requires excellent driving skills.
I guess it's a small positive update that scaling up is imminent, and a smaller positive update that this market will end up resolving YES.
But it's also consistent with the theory that they don't have level 4 autonomy and are scrambling to figure it out.
PS: I bought some YES when I saw this, bringing the market probability to 31% but now I'm really not sure and might push it back down again...
@dreev This is not something terribly new. Tesla has been doing testing/data validation with their own cars for some time now. Here is an article from 2024 https://www.teslarati.com/tesla-boosts-data-collection-team-prototype-vehicle-operators-hiring-ramp/
And they are doing testing even outside of the US, see https://timesofindia.indiatimes.com/auto/news/tesla-hiring-drivers-in-delhi-and-mumbai-to-test-autopilot-ahead-of-india-launch/articleshow/121505022.cms
The official Tesla account on Twitter has also been posting videos of their FSD tests around Europe and in Australia.
That said, the job postings are a further indication that Tesla has determined that they need extensive local training in combination with building/testing an operation centre in each city.
@dreev A thought experiment: Do we consider that Waymo has solved autonomy when two of their cars crash into each other? https://x.com/thetumbleweedX/status/1950665926707494925

@MarkosGiannopoulos Better link:
https://www.reddit.com/r/SelfDrivingCars/comments/1mdl5zn/two_waymo_cars_collided_in_phoenix_today/
Seems we don't know the post-mortem on that one yet. Now I'm curious!
To repeat a comment I added to the last AGI Friday on this topic and how Waymo just hit 100 million customer miles:
If Waymos were merely human-level, they'd be overdue for a fatality at this point. That statement probably needs a lot of caveats, like adjusting for the fact they still don't take customers on freeways, and that the fancy cars they use are safer than average.
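To show my arithmetic (a rough back-of-envelope sketch; the ~1.3 fatalities per 100 million vehicle miles figure for US human drivers is my own assumption, not something from Waymo):

```python
# Back-of-envelope check with assumed figures (not official stats).
human_fatality_rate = 1.3 / 100_000_000  # assumed US rate: ~1.3 fatalities per 100M vehicle miles
waymo_miles = 100_000_000                # customer miles, per Waymo's announcement

expected_fatalities = human_fatality_rate * waymo_miles
print(f"Expected fatalities at human-level driving: {expected_fatalities:.1f}")
# ~1.3 expected vs. zero so far -- the sense in which a merely human-level
# fleet would be "overdue for a fatality" (modulo the caveats above).
```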
In any case, it may help to keep the scale in perspective when seeing posts on social media about Waymos and Teslas making driving mistakes.
@dreev It's a crash in a half-empty parking lot. It seems like a situation that should be avoidable. Do they get confused by parking space lines? :D
@MarkosGiannopoulos That would surprise me, since Waymos seem great about disregarding painted lines on the road when that's what makes sense. In terms of possible Waymo weak points, could the cars' lidar sensors interfere with each other? Also seems unlikely.
My favorite from that Reddit thread: "We could be witnessing some emergent behavior and this is a mating display."
This one's probably least likely but the person filming the aftermath of the crash seems so gleeful I can't help but wonder if they somehow orchestrated it. My curiosity is piqued, for sure, though I guess there's no real way for it to be relevant to this market. (Stats on crashes for Waymo vs Tesla robotaxi, on the other hand, would be relevant.)
A rumour from Business Insider that Tesla will launch Robotaxi in California this weekend(!), even though they currently only have a permit for transporting employees (which they have been doing for a year now). The service will have a safety driver. No real change for this market IMHO.
https://www.businessinsider.com/tesla-robotaxi-bay-area-launch-san-francisco-memo-2025-7

@MarkosGiannopoulos Sounds exactly right. This seems about equally likely in both possible worlds.
Like if Tesla is stalling for time while they figure out level 4 autonomy, launching robotaxis with safety drivers in the driver's seat in California is a great distraction. And in the other possible world, where they're at level 4 (or will be in another month?) this makes sense as a way to get the ball rolling in California while they await regulatory approval to remove the safety drivers.
I.e., Reverend Bayes says no update to the probability based on this evidence.
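To spell out the Bayes reasoning (a toy sketch with made-up numbers, just to illustrate why roughly equal likelihoods mean no update):

```python
# Toy Bayesian update; the specific numbers are illustrative assumptions.
prior = 0.31  # roughly where the market was (see earlier comment)

# How likely is "CA launch with safety drivers" in each possible world?
p_evidence_if_level4 = 0.5      # assumption: plausible while awaiting approval to go driverless
p_evidence_if_not_level4 = 0.5  # assumption: plausible as a stalling tactic

likelihood_ratio = p_evidence_if_level4 / p_evidence_if_not_level4
posterior_odds = (prior / (1 - prior)) * likelihood_ratio
posterior = posterior_odds / (1 + posterior_odds)
print(round(posterior, 2))  # 0.31 -- likelihood ratio of 1, so posterior == prior
```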
But I'd love to hear if others have different intuitions here.
The great thing about this is that it sets up for stronger Bayesian evidence when we see whether regulatory approval in California materializes. See FAQ11 in the market description for more on this.
PS: I guess if Musk is thinking strategically, he'd anticipate the embarrassment of regulatory approval stalling in California. Which implies he doesn't expect it to stall. Which makes this news a positive update for this market. Unless there's some reason he was backed into this? Tricky stuff.
@dreev I think you and @MarkosGiannopoulos are missing the more significant part of this story. Tesla has not even applied for an AV testing permit! They are instead using the permit they already have to operate a chauffeur service that is not related to AV testing at all. In fact, they are only allowed to use this permit if they say they are NOT doing AV testing.
California has a progressive, multi-stage permit process to launch robotaxis. The first permit, for operating level 4 vehicles without passengers and with a test driver, is easy to get and required to move to stage 2. It's apparent that Tesla does not want to start this process because it would require they file annual disengagement reports, which would not be flattering to FSD.
If you disagree with me, please fill my limit orders in the 2026 California market.
@WrongoPhD "they are only allowed to use this permit if they say they are NOT doing AV testing" - Of course, they officially say on some piece of paper that they are not doing AV testing. But do you think they are not, in fact, collecting training data for the very specific reason of starting a full service soon? Their data-collecting/validation cars (with extra hardware on the roof of the car) have been spotted around California for several weeks now.
@MarkosGiannopoulos You are missing the point. By doing testing outside of the permitting process, there is no way they could possibly launch a service soon. The first permit (phase 1 of 4) is to operate with a safety driver to collect data proving you are safe enough to operate without a safety driver (phase 2 of 4). California requires you get the permits in order. Tesla can't say, "I know we said we weren't doing AV testing, but actually we were and just not reporting our data as required. Let us skip to the final permit."
@WrongoPhD I did not mean that they would ask for some backtracking of past rides. Just that in practice Tesla has already been doing their testing. And it looks like they are now proceeding to apply for the phase 1 permit.
Testing in California (notice the TCP license number at the back), but not really testing :D

"We will further improve and expand the service (more vehicles covering a larger area, eventually without a safety rider) while testing in other U.S. cities in anticipation of additional launches. Our efforts to refine the Robotaxi offering in Austin are not location-specific and will allow us to scale to other cities quickly with marginal investment"
https://www.tesla.com/sites/default/files/downloads/TSLA-Q2-2025-Update.pdf
@MarkosGiannopoulos Thanks for finding that. Seems like a crucial datapoint. With 10-20 robotaxis operating for 30 days, that'd be 12-23 miles per car per day, on average. I'd expect a lot more than that for cars in active service all day every day. For comparison, Waymo hit 100 million customer miles last week.
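Showing that arithmetic (the ~7,000-mile cumulative total is my reading of the report and should be treated as an assumption; plug in the exact figure from the PDF):

```python
# Rough utilization math for the Austin pilot; inputs are assumptions.
total_miles = 7_000       # assumed cumulative robotaxi miles over the period
days = 30                 # roughly one month of operation
fleet_sizes = [10, 20]    # reported fleet size range

for n_cars in fleet_sizes:
    per_car_per_day = total_miles / (n_cars * days)
    print(f"{n_cars} cars -> {per_car_per_day:.0f} miles per car per day")
# 10 cars -> ~23 miles/car/day; 20 cars -> ~12 miles/car/day.
# A taxi in genuinely full-time service would rack up far more than that.
```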
This has me wanting to push the probability down a bit.
Sorry if this has already been hashed out but
>9. What about human safety monitors in the passenger seat?
>Oh geez, it's like Elon Musk is trolling us to maximize the ambiguity of these market resolutions. Tentatively (we'll keep discussing in the comments) my verdict on this question depends on whether the human safety monitor has to be eyes-on-the-road the whole time with their finger on a kill switch or emergency brake. If so, I believe that's still level 2 autonomy.
Why not just say "NO OTHER HUMAN JOINING THE CUSTOMER"?
@Bandors That would've been a perfectly cromulent bright line to have drawn back when we created the market. The problem is that no one (in any of the Tesla markets on Manifold!) considered the possibility of safety monitors in the passenger seat. So now that that's happened it's a total judgment call on which side of the supervised/unsupervised line it's on.
I don't have an opinion myself yet, other than that it's genuinely ambiguous and depends on specifics that we don't know yet.
I think we just have to wait.
I'm hopeful that Tesla having started the process of getting regulatory approval in California will give us answers. California requires more transparency (things like documenting every disengagement) than Texas does. Unless Elon Musk is lying about starting that process, or Tesla gets the regulatory approval and then doesn't comply with the reporting requirements. Or submits fraudulent reports. I really have no idea what lines Musk will and won't cross at this point.
I guess if Tesla hasn't actually even begun the process of getting regulatory approval in California by the end of August, that itself will be evidence (not saying it'll be dispositive) that this market should resolve NO.
@dreev My understanding is that Tesla actually does have a permit with the DMV (but not CPUC) to allow internal testing of L3 autonomous vehicles. Despite this, they mostly flout reporting regulations, disingenuously arguing that they are merely level 2 without ambitions for L4 so they are exempt from reporting requirements. There are two exceptions where Tesla did actually report miles to California: in 2016 they reported 500 miles, and in 2019 they reported 12.2 miles, both in conjunction with promotional events for FSD. I personally do not believe Tesla will ever actually try to launch a robotaxi service in California because the reporting requirements would be too harmful to their PR.
https://electrek.co/2020/02/27/tesla-reports-fully-autonomous-miles-first-time-years/
https://www.caranddriver.com/news/a35785277/tesla-fsd-california-self-driving/
@WrongoPhD Tesla has TCP authorisation to run autonomous (supervised) rides for their own employees in California. They have already announced, some months ago in investors' calls, that they are doing this. So they have been collecting data in California for some time now (in addition to what they get from customers who are FSD users). They have not officially applied for driverless rides.
@dreev If the safety monitors are removed in Texas, that would be enough for this market, right? No need to wait for a California launch as well.
@MarkosGiannopoulos I'd also want some evidence against tele-operation, which regulatory approval in California would be, as I understand things.
@MarkosGiannopoulos Yeah, I'm thinking the market should stay open until we know. In the case that they finally scale up much later and it doesn't seem like we'll ever get a definitive answer on whether Tesla had achieved level 4 as of August 31, maybe resolve-to-PROB will be fairest. (There's also the question of public launch. I don't think we've met the bar for that yet, though we're getting closer, right?)
@dreev Re: "public launch", random (e.g. "non-influencers") people on X are saying they signed up on the website form and got an invite to ride. Isn't that what the market description is asking for?
@MarkosGiannopoulos Close, but I think it leaves uncertainty about hand-picking. The market description suggests a litmus test of serious journalists and Tesla haters being able to try it. Maybe we could call that a sufficient but not necessary condition? What feels fairest to you?
@dreev With any limited-access/waitlist system it's hard to tell whether the entrants are being filtered or not. And I would not blame Tesla for being careful not to invite journalists; there is a long record of misrepresentation of Tesla in the media. I asked ChatGPT to find articles where the journalist actually took a ride and it came up with one where the journalist took rides thanks to a “random person” who signed up on the website: https://www.businessinsider.com/tesla-robotaxi-experience-human-intervention-incidents-2025-7
So you have at least an example of a member of the public and a member of the media using the service (and paying for it).
@dreev Well, Musk does not consider Business Insider a legitimate media outlet :D
They have run a lot of negative articles on Tesla and Musk personally.
But most people would consider BI to be part of mass media.
Now, if on August 31st you still have 10 cars, a few hundred actual people who took rides, and the safety monitors have not been removed, maybe you have grounds to declare this “not good enough” (even though the car is driving itself and there were no accidents).
@MarkosGiannopoulos Yeah, I'm nervous we'll end up in a situation where Tesla has technically hit the minimum requirements but we're not convinced it counts in spirit. We do have this from FAQ1:
There has to be intent to scale up. In other words, Tesla robotaxis have to be actually becoming a thing, with summer 2025 as when it started.
But August 31 was just the deadline to launch, so I think we'll need to wait a while after August 31 before determining that this market should resolve NO based on the above clause.
I suspect that with enough hindsight it will be obvious at least.
PS: I turned this into FAQ12. Thanks again for all the excellent questions and research on this!