Roundup: When Hallucinations Pop Up Where You Don’t Want Them


The Food and Drug Administration is running into trouble with the AI tool it launched to speed up drug approvals. According to CNN, FDA employees say the tool hallucinates non-existent studies and misinterprets real research. "It hallucinates confidently," said one FDA staffer.

It's another instance of an AI tool being applied to a problem despite known flaws. The stakes here are obvious: the FDA reviews medications primarily to verify their efficacy and safety. Clinical trials are conducted in phases, starting with a handful of patients in Phase I before scaling up to tens of thousands of patients in Phase III. The FDA's AI tool, called Elsa, is meant to speed up that review process and support informed decisions about drug performance and, again, safety.

As Engadget notes, CNN's report appeared on the same day the Trump Administration unveiled its "AI Action Plan," which seeks to remove "red tape and onerous regulation."

