With v. 12.5.5.2 of Tesla’s vision-based Autopilot (Supervised), the emphasis is still on driver supervision. While Autosteer and adaptive speed control have become relatively polished and worry-free, there are edge cases where driver intervention is necessary, even mandatory.
One might think that the driver, after making a manual course correction or driving adjustment, would “teach” the car, and that future visits to that kink in the road would then go smoothly. Tesla’s system doesn’t work that way. Instead, at the end of the day, reams of video clips upload from the vehicle cameras to the mothership: Tesla’s neural network.
Sampling the traffic transiting my home network over the last 30 days, the vehicle (as a network client) uploaded 71.6 GB.
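For a sense of scale, here is a quick back-of-envelope sketch in plain Python using the 30-day figure above; the per-day average is just simple division, not anything Tesla publishes:

```python
# Back-of-envelope: average daily upload from the car, based on the
# 71.6 GB / 30-day sample measured on my home network.
total_gb = 71.6   # measured upload volume over the sampling window
days = 30         # length of the sampling window

per_day_gb = total_gb / days
print(f"Average upload: {per_day_gb:.2f} GB/day")  # ~2.39 GB/day
```

Roughly two and a half gigabytes a day of camera clips heading back for training.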
Autopilot behavioral changes and improvements come from Tesla’s end-to-end AI training system, code-named Dojo, and reach the fleet as versioned software updates. In this way my car (and every car) becomes smarter. What needs to be smarter? Safety, always, of course, but high on the list is the ability to flow naturally with traffic and mimic driver behaviors, good or bad. For instance:
The “Hollywood Stop,” a.k.a. the rolling stop, might be considered bad form, but most people do it. Autopilot is not permitted such liberty: it comes to a dead stop before proceeding, which is slightly irritating but lawful.
In the same vein is the NHTSA stop: coming to a complete halt at the white painted line at a stop sign, then edging forward to see cross traffic before proceeding. Good form, but uncommon in everyday practice.
Now consider being first in line at the intersection when the light turns green. Go. Defensive driving used to be taught, but most people, seeing the green, step right out. Autopilot waits a split second before proceeding. It’s a minor irritation, and no one behind me has impatiently sounded their horn yet (though I know in certain parts they surely would). The pause is good: drivers with the green should look for oncoming traffic, and Autopilot seems to have been coded to expect that cross traffic will frequently fudge the red light.
Not wanting to be labeled as one of those hypermiling roadie types, I still like to begin coasting down as I approach a traffic light that has turned red. Aggressive drivers usually charge right up until the last moment before braking. That avoids the dreaded cut-in, and unfortunately Autopilot mimics it. I would prefer that Autopilot rely on regen to decelerate and minimize use of the friction brakes. Conversely, without traffic ahead, Autopilot accelerates from stop signals like a jackrabbit, way beyond Chill mode.
v. 12 Autopilot dictates what it considers a safe rate of speed based on conditions and environment. Previous iterations of FSD relied solely on posted speed limits and matched them. Now the car dawdles on some stretches, sometimes doing 10 mph under. I know the drivers trailing behind are thinking “Sunday driver,” or picturing a little old lady who can barely see over the steering wheel. You can poke the accelerator pedal a bit to boost its confidence, I suppose. So much for the annoyances.
Intervention: the need to take charge, assume control, override the automation. Save the day!
More seriously (and Tesla does require users to acknowledge the need for supervision), there are edge cases where Autopilot has low confidence or encounters a situation it cannot handle without help. In safety situations Autopilot (hopefully) recognizes the problem and alerts the driver with an audio chime and a “take immediate control” text warning. A situation is imminent (1–2 seconds away), and if the driver has not been attentive there could be consequences.
The handover: automation off, manual control on. This is fine when the human operator initiates it, but when it happens unexpectedly on Autopilot’s initiative the human can be momentarily caught off guard. A surprise transition is messy.
Luckily, these are now rare. The vast majority of interventions occur when the driver’s comfort level is close to being exceeded, e.g. the vehicle cuts a corner or threatens to curb a wheel. In such circumstances the driver takes control by disengaging.
Disengagement (driver-induced): press the off button, tap the brake pedal, or, for the most immediate action, yank on the steering wheel.
A disengagement by manual steering causes Autopilot to take embarrassed offense, posting a text prompt to the driver: “What happened? What went wrong?” Optionally, the driver can respond with a haptic press on the microphone button and give a short verbal complaint, e.g. “the car lost situational awareness in that turn and assumed the incorrect lane.” This, along with the associated before/after video clip capture, goes to the engineers for review.
I wish there were a way to similarly send kudos or an attaboy, because occasionally Autopilot does something unexpectedly brilliant. Recently, exiting a parking lot with a construction barricade just before the right turn onto the highway required a hard 90-degree turn, which the self-driving accomplished at full steering lock, at low speed, all while maintaining lane discipline. I was primed to take over because I was skeptical of the outcome, but there was no need. Later, on a two-lane road, I spied debris (a small branch and a clump of leaves) in my path; seeing it coming, I put my hands on the wheel, and felt Autopilot do the nudging, offsetting ever so slightly right to avoid contact. Very subtle, and I wanted to applaud.
There are other features in v. 12 that I am experiencing, such as Autopark. More than a party trick (unlike Summon), Autopark is cool and works consistently well, backing into the white-lined box, centered, without any involvement from me. When it’s time to leave the parking spot, Autopilot knows which direction you want to go: if you backed in, you probably want to pull out forward when leaving. It automatically selects the forward ‘gear,’ saving you the effort. (As supervisor, verify and confirm the selection, of course.)
There are times of despair when you say: don’t take away the steering wheel just yet; FSD won’t be happening anytime soon; it is not ready for prime time. You feel like a beta tester and wonder how they get away with selling this Full $elf Driving option. The manufacturer should be paying ME to be the early adopter helping with the machine learning. I feel like I’m riding herd on a 15-year-old with a learner’s permit, coaching and anticipating everything.
The next FSD version, 12.5.6, and shortly thereafter (in “Elon time”) v. 13, are supposed to bring significant changes.
With FSD version 12, my job as driver is quite secure. It is more fun, and less stressful, to drive than to watch it being done. I’m always comparing technique, and so far I can still say I’m much smoother and more efficient at it. Nevertheless, the technology is a wonder, and it is exciting to be a firsthand observer of innovation and progress.