The Food and Drug Administration is running into trouble with the AI tool it launched to speed the pace of drug approvals. According to CNN, FDA employees say the tool hallucinates non-existent studies or misinterprets real research. “It hallucinates confidently,” said one FDA staffer.
It’s another instance of AI tools being applied to a problem despite known flaws, and the stakes here are obvious: the FDA reviews medications primarily to verify their safety and efficacy. Clinical trials proceed in phases, starting with a handful of patients in Phase I and scaling up to tens of thousands by Phase III. The FDA’s AI tool, called Elsa, is meant to speed up that review process and support informed decisions about drug performance and, again, safety.
As Engadget notes, CNN’s report appeared on the same day the Trump Administration unveiled its “AI Action Plan,” which seeks to remove “red tape and onerous regulation.”