Librarians say they are fielding more and more requests to track down reference documents that don’t exist as the use of artificial intelligence chatbots grows.
Because the large language models that power AI chatbots are designed to predict patterns, they can produce results that are properly formatted but factually incorrect, like citing a made-up article purportedly published by a real researcher in a real academic journal.
Proving that a reference doesn’t exist poses a unique challenge, according to Emilia Marcyk, the head of Reference and Discovery Services at the Michigan State University Libraries.
“You can’t, you know, prove definitively, but I can at least say, like, ‘If this were a real article, here’s where I’d expect to find it, and I’m not seeing it,’” Marcyk said.
Marcyk said references hallucinated by AI chatbots have become more difficult to spot as the chatbots’ ability to produce results that sound plausible on the surface has improved.
“Pre-ChatGPT, if someone sent me a citation, I could be relatively certain that it did exist somewhere. It might be hard to access, but it did exist,” Marcyk said.
Marcyk said the rise of AI chatbots is also driving a change in the type of requests librarians see, noting that some library patrons visit with an answer already in mind that they are trying to verify.
She said librarians remain well-equipped to help people identify sources and contextualize information.