Ag with kids said:
chickencoupe16 said:
AI can't do a Google search or fact-check a short essay without making **** up. Maybe in the future, but it's nowhere near reliable enough now.
Hallucinations are real: these models do put out erroneous information. There is NO WAY they should be used for flight-critical information at this time.
Yeah, AirGPT hallucinating a non-existent runway, an incorrect flight ceiling, or a mountain in the wrong place could have some terrible consequences.