
Fiery deaths?

FutureBoy

Well-known member · First name: Reginald · Joined Oct 1, 2020 · Kirkland, WA, USA · Vehicles: Toyota Sienna · Occupation: Financial Advisor
More details have been leaked:


The log does indicate that cruise control was activated and the speed was increased, but at no point was lane keeping active, and the overrides were being used.

Same with the Consumer Reports demonstration - not only did they need to do several steps to defeat the attention sensors, they were holding down the override on the steering wheel and the brake pedal as they swapped seats.

CR holding down the override while swapping seats kinda makes me steamed at them. There *should* be overrides. That's the point of having a human driver - to be able to tell the car not to panic or disengage. "The human driver has this."

Expecting the car to stop reduces its reliability and makes it impossible to add assistance features: if the safety interlocks are just going to reduce reliability, who would buy it? You'd always have it panicking as one of a dozen sensors fails.

-Crissa
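To put Crissa's point about interlock reliability in rough numbers, here is a quick sketch. The per-sensor false-alarm rate below is completely made up for illustration; it is not from any Tesla spec.

Code:
# If a safety interlock disengages whenever ANY one sensor false-alarms,
# nuisance disengagements compound across sensors.
# The 1% per-drive false-alarm rate is an invented, illustrative number.

num_sensors = 12
false_alarm_rate = 0.01  # assumed per-sensor chance of a spurious trip per drive

p_clean_drive = (1 - false_alarm_rate) ** num_sensors
p_nuisance = 1 - p_clean_drive
print(f"Chance of at least one spurious disengagement per drive: {p_nuisance:.1%}")
# -> about 11% with these assumed numbers: the car "panics" on roughly one drive in nine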
I swear that she has the perfect voice for doing the news release on tragic situations like this. Or for downed commercial flights. Calm, thoughtful, collected, detailed, and direct.

That being said, I cannot keep my eyes open listening to her. Between her and the Limiting Factor I could catch up on all the sleep I ever miss from insomnia or late-night distractions. Even if I am fully rested up, full of caffeine, revving the engine, and about to launch, 3 minutes of her voice and I will be out cold. Come back in 1 hour and I'll still be asleep.

I will certainly need to avoid this if I'm ever driving. It would be worse than having a couple shots of vodka before taking a road trip.

Wish I could be more attentive though. Great info. And presented in a trustworthy way.
 

FutureBoy

Well-known member · First name: Reginald · Joined Oct 1, 2020 · Kirkland, WA, USA · Vehicles: Toyota Sienna · Occupation: Financial Advisor
I love Jordon Giesige's voice! It's so smooth!

My spouse can't stand Nikki's voice. Alas. I love Nikki's seriousness and enthusiasm. I feel like I'm watching those tomorrow-tech shows from the 80s.

-Crissa
I agree that Jordon’s voice is smooth. So smooth I can’t get through more than 3 minutes before passing out. I really like his info but had to give up listening. His channel for me is like listening to ASMR.

Nikki doesn’t get me quite so fast. I still fall asleep but I do get good info out first. Sometimes I actually get to more than one video in a row.
 

rr6013

Well-known member · First name: Rex · Joined Apr 22, 2020 · Coronado Bay, Panama · Website: shorttakes.substack.com · Vehicles: 1997 Tahoe 2-door 4x4 · Occupation: Retired software developer and heavy commercial design-builder
Need a picture of the Harris County Tesla crash that shows a bent steering wheel, please.

The driver's seatback does not show signs of force from a back-seat occupant during instantaneous deceleration. Rear occupants typically submarine under the driver's seat. Nothing like that shows in the burned-out pictures online.

Thanks in advance!
 

MEDICALJMP

Well-known member · First name: Jeff · Joined Apr 28, 2020 · Omaha, NE · Vehicles: Toyota Avalon, RAV4, Tri-motor Cybertruck · Occupation: Nurse
https://insideevs.com/news/506498/ntsb-report-tesla-texas-crash/
NTSB Issues Preliminary Report For Fatal Texas Tesla Crash
May 10, 2021
By: Michael Cantu

The accident happened just 550 feet from the owner's residence.
The National Transportation Safety Board (NTSB) has issued a preliminary report today for the ongoing investigation of the fatal accident involving a 2019 Tesla Model S on April 17 near Spring, Texas.


This is a preliminary report that is subject to change: "Information in the report is preliminary and subject to change as the investigation progresses and as such, no conclusions about the cause of the crash should be drawn from the report."

According to the NTSB, the Model S was equipped with Autopilot but it was not engaged when the accident happened. The NTSB tested an “exemplar car” at the crash location and was able to activate the Traffic Aware Cruise Control (adaptive cruise control) system, but “Autosteer was not available on the part of the road (Hammock Dunes Place) where the crash happened.” Tesla’s Autopilot requires both Traffic Aware Cruise Control and Autosteer to be activated.
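In other words, the engagement logic the report describes is a simple conjunction: no Autosteer available on that stretch of road means no Autopilot, only plain adaptive cruise. A toy sketch of that reading (my own simplification of the report's wording, not actual Tesla software):

Code:
# Toy model of the engagement condition described in the NTSB preliminary report.
# A simplification for illustration only, not Tesla firmware.

def autopilot_engaged(tacc_active: bool, autosteer_active: bool) -> bool:
    """Autopilot requires BOTH Traffic Aware Cruise Control and Autosteer."""
    return tacc_active and autosteer_active

# On Hammock Dunes Place the NTSB's exemplar car could enable TACC,
# but Autosteer was not available on that part of the road:
print(autopilot_engaged(tacc_active=True, autosteer_active=False))  # False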

Security footage from the Tesla owner's residence shows the crash trip beginning at the owner's house, with the owner getting into the driver's seat and the passenger entering the front passenger seat. This disproves early reports that no one was in the driver's seat. The video also shows the Model S entering the street and accelerating out of view. According to the report:

“Based on examination of the accident scene investigators have determined the car traveled about 550 feet before departing the road on a curve, driving over the curb, and hitting a drainage culvert, a raised manhole and a tree.”
The Tesla’s onboard data storage device was destroyed in the resulting fire and the restraint control module was damaged. Restraint control modules can record valuable information associated with airbag deployment, speed, seat belt status, and acceleration. The damaged module and the damaged steering wheel were shipped to the NTSB for further evaluation.


The investigation is ongoing, and more data still needs to be collected including “data to analyze the crash dynamics, postmortem toxicology test results, seat belt use, occupant egress, and the post-crash fire.” The 59-year-old Tesla owner and a 69-year-old passenger did not survive the crash.
Source: NTSB

——————————————————————————-
Nothing about this in the mainstream media. Probably because Tesla wasn't at fault. Sort of like printing a retraction on page F21 next to a dog food ad.

Edit: 5/12/2021
A New York Times article today, about a person abusing Tesla's driver-assistance features by moving his keister into the back seat while the car was going down the highway, mentioned the accident in Texas. It stated, quite wrongly, that the accident had no driver. Obviously the New York Times doesn't bother to check the recent NTSB reports. They are doing a horrible job of checking the accuracy of their stories lately, at least as far as Tesla goes.
 

MEDICALJMP

Well-known member · First name: Jeff · Joined Apr 28, 2020 · Omaha, NE · Vehicles: Toyota Avalon, RAV4, Tri-motor Cybertruck · Occupation: Nurse
The Tesla Autopilot Excuse: How EV ignorance created the perfect storm for a misinformation nightmare
By Simon Alvarez · Posted on June 20, 2021
It was only a few hours after the accident and a bold statement was already making its rounds in the mainstream media. Another Tesla has crashed, and this time, it took the lives of two individuals from Texas. Facing inquiries from journalists eager for some clarity as to what happened in the tragic incident, Harris County Pct. 4 Constable Mark Herman shared a surprisingly confident and bold statement: there was no one in the ill-fated Model S’ driver seat when it crashed.

“They are 100% certain that no one was in the driver seat driving that vehicle at the time of impact. They are positive. And again, the height from the back seat to the front seat, that would be almost impossible, but again our investigators are trained. They handle collisions. Several of our folks are reconstructionists, but they feel very confident just with the positioning of the bodies after the impact that there was no one driving that vehicle,” Herman said, also noting that the electric car’s fire was out of control for four hours.

This statement, as well as the headlines that followed it, have since been proven false. And today, they stand as a remarkable case study on how misinformation spreads, and how the truth — even if it eventually emerges from legitimate sources — becomes largely ignored. This is the story of a Model S crash, rushed statements, and how general ignorance of electric vehicles could result in a massive misinformation nightmare.

But to get a complete view of this story, one has to go back to that fateful night on April 17, 2021, when two men, a 59-year-old Tesla owner and his 69-year-old passenger, crashed after traveling just about 550 feet, before departing the road on a curve, driving over a curb, hitting a drainage culvert and a raised manhole, and smashing into a tree. The vehicle was ablaze following its crash.


THE ACCIDENT
As it is with other Tesla crashes, the Model S crash in Texas immediately caught the attention of national media. It did not take long before even foreign outlets were running with the story. It was during this initial wave of media attention that Constable Mark Herman noted that investigators were 100% sure that there was no one driving the car when it crashed. This statement was gold to numerous media outlets, with some like the New York Post posting a tweet noting that the ill-fated Tesla was on Autopilot. It’s pertinent to note that the Constable never mentioned Autopilot, though his statement declaring that there was no one in the driver’s seat seemed like a strong enough link to the driver-assist suite.

Soon, even organizations such as Consumer Reports joined the fray, graciously demonstrating that Autopilot could indeed be “fooled” into operating without a human in the driver’s seat. Consumer Reports’ walkthrough was thorough, showing audiences exactly what needs to be done to defeat Autopilot’s safety measures. This stunt caught the eye of both national and international media as well, and by this time, the narrative was set: Teslas can drive themselves without a driver, and Autopilot could kill. It’s a chilling thought, but it is one that seemed to be casually supported by Ford CEO Jim Farley, who shared Consumer Reports’ Autopilot defeat device walkthrough on his personal Twitter page.

This is not to say the narrative surrounding the fatal Model S crash in Texas was ironclad, however. Just days after the initial crash, Palmer Buck, fire chief for The Woodlands Township Fire Department, told the Houston Chronicle that contrary to some reports in the media, the ill-fated Model S was not ablaze for four hours. The fire chief also stated that firefighters did not call Tesla for help, and that he was unaware of any hotlines for tips on how to control a battery fire.

THE FIRST CRACKS — AND A PERSISTENT MISUNDERSTANDING
Interestingly enough, even Constable Herman himself seemed less sure about his information later on, noting in a statement to Reuters that his investigators were “almost 99.9% sure” that there was no one in the driver’s seat of the ill-fated car. This was despite Herman noting that they had executed a search warrant on Tesla to secure data about the tragic incident. Meanwhile, Elon Musk went on Twitter to state that data logs so far showed that the ill-fated vehicle was not on Autopilot when it crashed.

Tesla’s online community took it upon themselves to make sense of the situation, which seemed to have red flags all over the place. The Constable’s statements seemed premature at best, and reports about the vehicle’s fire had been proven false by the fire chief. Couple this with Elon Musk noting that Autopilot was not involved, and it was no surprise that the crash became a topic for analysis and conversations among Tesla supporters. These efforts, however, were largely dismissed if not mocked, with media outlets such as VICE stating that the behavior of the Tesla sleuths was akin to those who believe in conspiracy theories.

“Rather than waiting for the two different federal authorities investigating the crash to publish their findings, some Tesla owners are engaging in the classic behavior of conspiracy theorists and amateur internet sleuths in an apparent attempt to cast doubt on even the most basic facts surrounding the crash,” the publication noted.

More cracks about the initial “Autopilot crash” narrative emerged during the company’s Q1 2021 earnings call. Lars Moravy, Tesla’s vice president of vehicle engineering, stated that the company had conducted tests with investigators, and they have determined that Autosteer could not be engaged in the area. He also stated that judging by the distance of the vehicle from the owner’s home to the crash site, the Model S would have only accelerated to 30 mph before covering the entire 550-foot distance using Adaptive Cruise Control. This is undoubtedly a clarification about the incident, but like many things in this story, this was also misunderstood.
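The 30 mph figure squares with simple constant-acceleration kinematics over 550 feet. A back-of-the-envelope sketch; the 0.5 m/s² ramp rate is an assumption made here for illustration, not a number from Tesla or the NTSB:

Code:
# Back-of-the-envelope check of the "~30 mph over 550 feet" figure.
# Assumes constant acceleration from a standstill, so v = sqrt(2*a*d).
# The 0.5 m/s^2 ramp rate is an illustrative assumption, not an NTSB/Tesla number.
import math

FT_TO_M = 0.3048
MS_TO_MPH = 2.23694

distance_m = 550 * FT_TO_M      # ~168 m from the driveway to the crash site
assumed_accel = 0.5             # m/s^2, a gentle cruise-control-style ramp (assumption)

final_speed = math.sqrt(2 * assumed_accel * distance_m)
print(f"Speed after {distance_m:.0f} m at {assumed_accel} m/s^2: {final_speed * MS_TO_MPH:.0f} mph")
# -> roughly 29 mph, consistent with the ~30 mph figure quoted above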

Not long after Tesla’s Q1 2021 earnings call, CBS published a piece titled “At Least One Tesla Autopilot Feature Was Active During Texas Crash That Killed 2.” It’s definitely a catchy headline and one that was sure to draw a decent amount of eyes. There was only one problem: the whole premise of the article was false. To add salt to the wound, Texas Rep. Kevin Brady shared the CBS piece on Twitter, noting that “Despite early claims by (Tesla and Elon Musk), Autopilot WAS engaged in (the) tragic crash in The Woodlands. We need answers.”


A GRASSROOTS MOVEMENT
In a world where misinformation is prevalent from media outlets that may or may not be incentivized to publish reports that are completely accurate, citizen journalism has the potential to become the voice of reason. And in the case of the Tesla Texas crash, this was certainly the case. After conversations with sources, some of whom have opted to remain anonymous, Teslarati could surmise that it was the efforts of regular people, from electric vehicle advocates and space enthusiasts who were inspired by Elon Musk’s SpaceX, that may have ultimately helped get the right information about the incident to the right place.

Days after the incident, and a few weeks before the release of the National Transportation Safety Board (NTSB) preliminary report, @GoGundam1, a Texas-based SpaceX advocate, felt alarm bells in his head after Constable Herman declared confidently that he was 100% sure there was no one in the driver’s seat of the ill-fated Model S. Having been familiar with Elon Musk’s companies, the SpaceX enthusiast was also knowledgeable about Tesla and its products, which made the Constable’s statements seem disingenuous at best. Annoyed by the noticeably false narrative that was being formed, the space advocate sent out some feelers to test out the waters.

The story that emerged was quite remarkable. Information gathered by citizen informants suggested that by April 22, Constable Herman’s office was already in possession of video evidence that was in direct contradiction to the narrative that was initially presented to the media. It was a disturbing thought, but informants also suggested that the office of the Constable had intentions to sit on the information for as long as possible. Granted, these events may seem like they came from the plot of a semi-decent movie, but considering the relative silence from the Constable following his statements of a search warrant being submitted to Tesla, it does seem like the motivations for a follow-up report clarifying the incident were not really there.

Pertinent information about the Tesla Texas crash, no matter how valuable, would be next to useless if it did not catch the attention of the right entities. And thus, with the information gathered, the SpaceX enthusiast decided to reach out to members of the Tesla community for help. It was a challenging task, but eventually, @LordPente, a longtime Tesla advocate, decided to lend a hand. After numerous messages to other members of the Tesla community, the longtime EV advocate appeared to hit a breakthrough by (seemingly) reaching someone at Tesla. The SpaceX enthusiast, for his part, failed to get in touch with Tesla but was able to send a report to the NTSB, tipping off the agency about the additional video evidence in the Constable’s office.

During Teslarati’s conversation with the informant and the Tesla advocate, both noted that they were not really sure if their information reached the right entities. However, something happened not long after which suggested that it did.

THE LIE UNRAVELS
On May 10, 2021, the National Transportation Safety Board (NTSB) published its preliminary report about the Tesla Model S’ fatal Texas crash. As per the NTSB’s report, “footage from the owner’s home security camera shows the owner entering the car’s driver’s seat and the passenger entering the front passenger seat.” Apart from this, the NTSB also noted that tests of a similar vehicle at the crash location showed that Autopilot could not be engaged in the area, just as Tesla and the electric vehicle community suggested amidst the initial wave of “Autopilot crash” reports. The investigation is ongoing, of course, but based on what the NTSB has published so far, it appears that Autopilot has been absolved in the incident.

The findings presented in the NTSB’s report all but confirmed what Elon Musk and Tesla supporters were arguing online. It may be disappointing to media outlets like VICE, but as it turned out, the conspiracy theorist-like behavior exhibited by some Tesla sleuths online turned out to be justified. There really was misinformation being floated around, and if it wasn’t for the efforts of a few individuals, pertinent information about the incident might not have been submitted to Tesla or the NTSB on time.

Interestingly enough, Harris County Pct. 4 Constable Mark Herman has remained silent for now. Teslarati has attempted to reach out to his office through email but was unsuccessful. The Constable, at least for now, has yet to issue a correction or retraction of his initial and now-debunked statements about the incident. Individuals such as Texas Rep. Kevin Brady have not admitted to making a mistake either.

HOW MISINFORMATION BECOMES TRUTH
Tesla, being a rather unorthodox company led by an equally unorthodox man, tends to fall victim to misinformation — lots and lots of it. The story of the Texas crash is a great example, but it is one drop in a whole bucket full of inaccurate reports about the company. Tesla CEO Elon Musk has seemingly thrown in the towel on mainstream media coverage, reportedly abolishing Tesla’s PR department last year. This, of course, has pretty much opened the doors to even more misinformation — and to a point, even disinformation — which, in turn, becomes the general public’s truth.

For professional insights on how misinformation becomes accepted, Teslarati reached out to Stephen Benning, a Professor of Psychology at the University of Nevada, Las Vegas. Professor Benning explained that humans tend to have an anchoring bias, in which the first information used to make a judgment influences it. While anchoring bias is typically considered in numerical judgments (like estimates of how much something is worth), it could also play out when people hear the first reports of what happened. This is most notable if the event is memorable, like a fatal Tesla crash. The initial information would likely stick in people’s minds and create an initial framework that sets their beliefs about the event.

“Because initial reports set people’s prior beliefs, additional information has to weigh against established beliefs. People might have additional biases at play, like the confirmation bias that filters out information that isn’t consistent with a previous set of beliefs. It’s as if people put up filters to help themselves maintain the consistency of their beliefs at the expense of their potential correspondence with reality. The initial crash reports were also likely more vivid than the drier details of the subsequent investigation, so the availability heuristic might make those initial reports more vivid and accessible in people’s memories when they think about the crash – even if they’ve followed the subsequent reports,” he wrote.
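Benning's point about first reports outweighing later corrections can be illustrated with a small Bayesian toy calculation. Every number below is invented purely for illustration:

Code:
# Toy Bayesian illustration of anchoring: a strong prior set by the first
# headlines takes a lot of contrary evidence to move.
# All values here are invented for illustration.

prior_driverless = 0.95   # belief after the initial "no one was driving" headlines
likelihood_ratio = 5      # assumed strength of one later correction (5:1 toward "had a driver")

prior_odds = prior_driverless / (1 - prior_driverless)   # 19:1 in favor of "driverless"
posterior_odds = prior_odds / likelihood_ratio           # the correction pushes the other way
posterior = posterior_odds / (1 + posterior_odds)

print(f"Belief in 'driverless' after one correction: {posterior:.0%}")
# -> still about 79%: a single correction barely dents a 95% prior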

Emma Frances Bloomfield (Ph.D.), currently an Assistant Professor of Communication Studies at the University of Nevada, Las Vegas, with expertise in strategies for combating misinformation, explained to Teslarati that ultimately, misinformation and disinformation travel very quickly because they tend to be compelling and engaging, all while confirming an audience’s biases. This made the Texas crash a perfect storm of sorts, as it had a compelling event that catered to biases against Tesla and its Autopilot system. Unfortunately, Assistant Professor Bloomfield also highlighted that once misinformation sets in, it takes a ton of effort to overturn.

“To address misinformation, people can create more complete stories that replace the incorrect one, provide trustworthy authority figures to deliver the message, and not repeat the false information when making the correction. You can also emphasize the importance of accurate information to make the best decisions moving forward and highlight how those changes might benefit the audience/consumer. We also say, ‘correct early and correct often’ to try and get ahead of the temporal advantage misinformation has and to counter the repetition of the false information,” she wrote.

A BATTLE THAT TESLA DOESN’T NEED TO LOSE
If there is something highlighted by Professor Benning and Assistant Professor Bloomfield, it is that misinformation is hard to battle once it’s settled in. And for a lie to settle in, it has to be repeated. The Texas crash demonstrated this. It didn’t start with a lie, but it started with a premature, careless statement that could be easily twisted into one.

The Constable’s certainty that there was no one in the driver’s seat was premature at best, and reports about the incident being an Autopilot crash were also premature then, or a lie at worst. Reports about an uncontrollable blaze burning for four hours were false as well. Yet the narrative was so hammered down and unchallenged that even when the NTSB preliminary report came out, the needle barely moved.

Elon Musk’s reservations about maintaining a relationship with the media are understandable. Years of inaccurate reports tend to do that to a person. However, Tesla could also adopt a much more assertive anti-misinformation strategy. Tesla China has been doing this as of late, to great effect. Anyone following the Tesla China story would know that the company was embroiled in a PR storm that involved alleged reports of “brake failure” incidents surrounding the company’s vehicles. But after an assertive legal campaign from Tesla China, media outlets have issued apologies for misreporting on the company, and social media personalities have admitted to making up alleged incidents that painted the company’s vehicles in a negative light. Granted, such strategies may not be as effective in the United States, but something has to be done. What that something is remains an open question.
https://www.teslarati.com/tesla-autopilot-fatal-crash-misinfromation-exposed/
 

CyberMoose

Well-known member · First name: Jacob · Joined Aug 19, 2020 · Canada · Vehicles: Model 3
Good article. I remember when that crash was in the news; I put my faith in the statements that Elon had made on Twitter. I felt that when there's a story about something new and unfamiliar like Tesla's FSD software, it can invite confirmation bias.

If this had been a cheap Toyota that crashed with no one in the driver's seat, they would have assumed the driver wasn't wearing a seat belt and was thrown during the crash. But when they saw a Tesla with no one in the driver's seat, they claimed it was driving itself. They started building the story that the vehicle was self-driving; then it came to light that it didn't have FSD, so the story changed to Autopilot being enabled. All of a sudden you had all these groups trying to figure out how someone could make a Tesla drive with no one behind the wheel and how the car took the corner too fast and crashed.

While it was shown that it is possible to trick a Tesla into driving with no one behind the wheel, that was never proof of what happened here, yet the public started to believe it was the cause.

Elon was criticized during this time for releasing statements, something a company under investigation for a crash isn't supposed to do publicly until the investigation concludes. But imagine watching your company get blamed for a crash because of software you work so hard to make safe, when you have information that disproves it.
 

drscot

Banned · Well-known member · First name: Martin · Joined Oct 25, 2020 · Alma, AR · Vehicles: Cybertruck · Occupation: Retired physician
Good article. I remember when that crash was in the news; I put my faith in the statements that Elon had made on Twitter. I felt that when there's a story about something new and unfamiliar like Tesla's FSD software, it can invite confirmation bias.

If this had been a cheap Toyota that crashed with no one in the driver's seat, they would have assumed the driver wasn't wearing a seat belt and was thrown during the crash. But when they saw a Tesla with no one in the driver's seat, they claimed it was driving itself. They started building the story that the vehicle was self-driving; then it came to light that it didn't have FSD, so the story changed to Autopilot being enabled. All of a sudden you had all these groups trying to figure out how someone could make a Tesla drive with no one behind the wheel and how the car took the corner too fast and crashed.

While it was shown that it is possible to trick a Tesla into driving with no one behind the wheel, that was never proof of what happened here, yet the public started to believe it was the cause.

Elon was criticized during this time for releasing statements, something a company under investigation for a crash isn't supposed to do publicly until the investigation concludes. But imagine watching your company get blamed for a crash because of software you work so hard to make safe, when you have information that disproves it.
I ordered FSD on my CT as a safety backup, not knowing for sure whether it would operate adequately as such. Insurance, so to speak. I have sleep apnea that isn't completely treated by CPAP, and I can get drowsy driving on somewhat lengthy trips. I have fallen asleep at the wheel and had accidents. Available medication for the condition helps significantly, but it runs about $1,000 per prescription, so I take it as needed for long trips. My question, and my hope: if I were to fall asleep at the wheel, would FSD detect the "driverless" state and simply pull over and stop safely (which is what I would want it to do), or would it keep driving? Has such a situation been addressed by Elon? There are many sleep apnea patients out there, and many are in the same boat I am (or is that a floating CT?).
 

happy intruder

Well-known member · First name: O. K. · Joined Mar 5, 2020 · Irvine · Vehicles: Model 3 (Jun 2019), Model S (Jan 2020) · Occupation: Retired
I ordered FSD on my CT as a safety backup, not knowing for sure whether it would operate adequately as such. Insurance, so to speak. I have sleep apnea that isn't completely treated by CPAP, and I can get drowsy driving on somewhat lengthy trips. I have fallen asleep at the wheel and had accidents. Available medication for the condition helps significantly, but it runs about $1,000 per prescription, so I take it as needed for long trips. My question, and my hope: if I were to fall asleep at the wheel, would FSD detect the "driverless" state and simply pull over and stop safely (which is what I would want it to do), or would it keep driving? Has such a situation been addressed by Elon? There are many sleep apnea patients out there, and many are in the same boat I am (or is that a floating CT?).
If the car has an inside camera and it's turned on, it should do something... vibrate the steering wheel or make a sound... that may be enough to wake you... but I'm not sure.
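For what it's worth, driver-monitoring systems in general tend to escalate through stages rather than react to a single cue. Here is a generic sketch of that idea; the stages and thresholds are invented for illustration and are not Tesla's documented behavior.

Code:
# Generic driver-monitoring escalation -- an illustration of the idea only,
# NOT Tesla's actual implementation. Stages and thresholds are invented.

ESCALATION = [
    (5,  "visual alert on the screen"),
    (10, "chime and steering-wheel vibration"),
    (20, "slow down and turn on hazard lights"),
    (30, "pull over, stop, and unlock the doors"),
]

def respond(seconds_unresponsive: float) -> str:
    """Return the strongest action warranted by how long the driver has been unresponsive."""
    action = "no action"
    for threshold, step in ESCALATION:
        if seconds_unresponsive >= threshold:
            action = step
    return action

for t in (3, 12, 35):
    print(t, "->", respond(t))
# 3 -> no action, 12 -> chime and steering-wheel vibration, 35 -> pull over, stop, and unlock the doors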
 

jerhenderson

Well-known member · First name: Jeremy · Joined Feb 20, 2020 · Prince George, BC · Vehicles: Cybertruck · Occupation: Correctional Officer
Well, in all likelihood, you may not have to worry about it. It actually happened to me. A high-speed head-on collision left me trapped and severely injured. The car appeared to be smoking, as pointed out by the lookie-loos. "It's gonna blow!" I remember hearing once I regained consciousness. The trooper had no interest in getting me out while waiting on the jaws of life. He had to get information for his report, so interrogation was in order. He couldn't wait until after they got me out. After all, if it exploded, his interrogation would have been cut short anyway. Well, as it turned out, the "smoke" was the powder from the airbag, but nobody knew that. I was just pissed that his report was more important than my life. True story.
To be fair to the trooper, if he felt your car was going to explode, he also has a duty to protect his own life, ergo not climbing in to get you.
 

drscot

Banned · Well-known member · First name: Martin · Joined Oct 25, 2020 · Alma, AR · Vehicles: Cybertruck · Occupation: Retired physician
If the car has an inside camera and it's turned on, it should do something... vibrate the steering wheel or make a sound... that may be enough to wake you... but I'm not sure.
That may be too little too late if I'm veering out of the lane or at a curve. Do you mean in addition to self driving? Heck, I don't know, my hands may have fallen off the wheel. Flashing lights and a siren would work for sure!
 

Dids

Well-known member · First name: Les · Joined Dec 21, 2019 · Massachusetts · Vehicles: '04 Tacoma, '23 Cybertruck · Occupation: Self
That may be too little too late if I'm veering out of the lane or at a curve. Do you mean in addition to self driving? Heck, I don't know, my hands may have fallen off the wheel. Flashing lights and a siren would work for sure!
Wow, it sounds like you should consider hiring a driver. It seems like a huge risk to drive yourself.
Alternatively, have you looked at your diet? A UK study found a big benefit in controlling sleep apnea by eliminating animal products from the diet.
https://journals.sagepub.com/doi/abs/10.1177/1559827618765097?journalCode=ajla&
 
 