The vehicle, certified for airworthiness and flight-tested over 170 hours with more than 500 takeoffs and landings, is now headed for mass production.
Jesus fuck, please tell me that's a typo and they left some zeros off. Your average commercial pilot has more hours than that before any airline is even willing to consider hiring them; that is absolutely not sufficient testing.
Still, early adopters may face bureaucratic turbulence. Potential buyers must be both licensed drivers and certified pilots.
No shit. And that is never going to change: becoming a certified pilot is a lot fuckin' harder than getting a driver's license, and for very good reasons. If some BMW-driving cunt can't even be arsed to use his turn signal, do you really want to ease the "bureaucratic turbulence" just so he can fuck up and crash into a packed airliner? These things still have to use runways, and the people flying them still need to know how to behave themselves at airports, how to identify and avoid restricted airspace, how to communicate with ATC, and how to behave if (let's be honest, when) they get intercepted.
And all of this so you can have a car that's worse at being a car and an airplane that's worse at being an airplane, but hey, at least you won't have to book a rental car at your destination airport, which, I remind you, these things still have to use.
The problem isn't one of motivated learners being forced to drag their heels amidst their unmotivated peers.
The problem is that the core function of LLMs, the whole basis for their existence, is completely and entirely truth-agnostic. Not only do they not know what is true and what is not, they don't even know what the difference is. LLMs are very good at guessing what word looks like it should come next; they can make very convincing statements and they can be very persuasive, but those words don't MEAN anything to the machine, and they are produced without any consideration for accuracy.
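To make that concrete, here's a deliberately tiny sketch of that core loop, using a toy bigram model over a made-up corpus (nothing like a real transformer, purely for illustration): count which word tends to follow which, then emit whatever looks statistically plausible. Notice there is no step anywhere that asks whether the output is true.

```python
import random
from collections import Counter, defaultdict

# Made-up "training data" containing both true and false statements.
corpus = (
    "the earth is round . the earth is flat . "
    "birds are real . birds are not real ."
).split()

# "Training": count which word follows which.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def generate(start, length=6):
    word, out = start, [start]
    for _ in range(length):
        candidates = follows[word]
        if not candidates:
            break
        # Pick the next word in proportion to how often it appeared next.
        # "Looks plausible" is the only criterion; truth never enters into it.
        words, weights = zip(*candidates.items())
        word = random.choices(words, weights=weights)[0]
        out.append(word)
    return " ".join(out)

print(generate("the"))    # may emit "the earth is round" or "the earth is flat"
print(generate("birds"))  # may emit "birds are real" or "birds are not real"
```

Real LLMs are vastly more sophisticated at the "what looks likely next" part, but the loop has the same shape: nothing in it ever asks whether the output is true.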
They are literally making everything up on the basis of whether or not it sounds good, and every crackpot bullshit conspiracy theory, from flat-earth dumbshittery to very sincere-sounding arguments that birds aren't real, has been included in the training data. It all linguistically SOUNDS fine, so to an LLM it's fair game!
And even curating your training data to ONLY contain things like textbooks wouldn't cure the problem, because LLMs just aren't capable of knowing what those words mean. It's why they can't do even basic math, the one thing computers have been incredible at!
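The math point is mechanical, not a matter of missing textbooks: to a computer, 7 * 8 is an exact operation; to a next-word predictor, "7 * 8 = " is just a prompt, and the answer is whichever digit string looked most plausible after it. A quick sketch (the token counts below are invented purely for illustration):

```python
# A computer doing arithmetic: exact, deterministic, every single time.
print(7 * 8)  # 56

# A next-word predictor "doing arithmetic": it only ranks digit strings by how
# often they appeared after the prompt "7 * 8 = " in its training data.
# These counts are hypothetical; the point is that nothing ever multiplies anything.
seen_after_prompt = {"56": 40, "54": 7, "63": 3}
guess = max(seen_after_prompt, key=seen_after_prompt.get)
print(guess)  # "56", but only because that string was the most common continuation
```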
Using an LLM as an actual teacher is genuinely worse than no education at all, because it will just create a generation that, instead of knowing nothing, will very confidently be wrong all the time.