
AI for Students Guide: Evaluating AI-Generated Output

Effective and ethical use of AI tools for research and learning

Evaluating AI Output

Like other sources, AI-generated output must be evaluated for accuracy, credibility, currency, bias, and relevance.

AI-generated content may include inaccurate information. It may cite sources that don't exist, or it may draw conclusions based on flawed training data. It's important to check AI-generated content against other trusted sources. Do not use sources cited by an AI tool without reading those sources yourself.

Some questions to ask as you evaluate AI-generated content:

  • Can claims be verified in other reliable, credible sources on the same topic?
  • Could the content be missing any important information or points of view? Is there any inherent bias?
  • Are the claims based on the most current information or research available?
  • If the AI tool provides links to sources, have the sources been accurately cited or summarized?

Fact-Checking AI with Lateral Reading

[Diagram: "AI Fact-Checking," a linear flow chart of five color-coded steps: Break It Down, Search, Analyze, Decide, and Repeat/Conclude, matching the process described below.]


Lateral reading is an evaluation strategy in which you use other sources to confirm the claims made by the source you are evaluating.

Here's how to fact-check something you got from ChatGPT or another AI tool:

  1. Break down the information. Look at the AI's response and isolate the individual, specific claims it makes.
  2. Now comes the lateral reading part! Open a new tab (or several) and look for supporting information in other sources. Google, Google Scholar, and LibSearch (our library catalog) are good places to search. Even Wikipedia (gasp!) might be able to verify some types of claims.
  3. Next, think deeper about the assumptions being made in the AI output and your AI prompt:
    • What did your prompt assume?
    • What did the AI assume?
    • Who else would know things about this topic? Would they have a different perspective than what the AI is offering? Where could you check to find out?
  4. You decide, based on the evidence you've assembled from different sources:
    • What here is true?
    • What is misleading?
    • What is factually incorrect or might need to be further investigated?
      • You may want to re-prompt the AI tool to try to fix some of these errors.
      • You may want to dive deeper into the alternate sources you found while fact-checking.
  5. Repeat this process for each of the claims the AI made – go back to your list from the first step and keep going.

Image and content from University of Maryland Fact-Checking AI guide

Verifying AI-generated Citations

If a generative AI tool provides a reference, confirm that the source exists. Try copying the title into a search tool like Google Scholar or LibSearch. Do an internet search for the lead author or for the publication. Are they real?

Here's a real-life example of a fake citation generated by an AI tool:

Baker, C. K., Niolon, P. H., & Oliphant, H. (2021). Linking gender-based violence and housing instability: Expanding solutions for survivors. American Journal of Preventive Medicine, 61(1), 121-129. 

It looks like an appropriately cited scholarly journal article, right?

A search of Google Scholar did not turn up this article, and a web search for the journal confirmed that an article with this title and page numbers doesn't exist in that issue. You may find that parts of a citation are correct, but if the whole thing isn't accurate, you've been duped.
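If you're comfortable with a little code, this kind of existence check can even be scripted. Here's a minimal sketch, assuming Crossref's public REST API (a free index of scholarly journal articles); the function names, the five-result limit, and the exact-match comparison are illustrative choices, not part of any official workflow:

```python
# Hypothetical sketch: check whether an AI-supplied article title turns up
# in a scholarly index (Crossref's public REST API is assumed here).
import json
import urllib.parse
import urllib.request

def title_found(claimed_title, result_titles):
    """Return True if the claimed title matches any result title,
    ignoring case and surrounding whitespace."""
    normalized = claimed_title.strip().lower()
    return any(normalized == t.strip().lower() for t in result_titles)

def search_crossref(title, rows=5):
    """Query Crossref for works matching a title string and return
    a flat list of the titles in the top results."""
    url = ("https://api.crossref.org/works?rows=%d&query.bibliographic=%s"
           % (rows, urllib.parse.quote(title)))
    with urllib.request.urlopen(url) as resp:
        data = json.load(resp)
    # Each Crossref item carries a list of titles; flatten to plain strings.
    return [t for item in data["message"]["items"]
            for t in item.get("title", [])]
```

Keep in mind that no single index covers everything: a miss in Crossref (or Google Scholar) is strong evidence a citation is fabricated, but you should still search the publisher's site and, as the guide says, read any source yourself before citing it.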

 

©2024 St. Catherine University Library, St. Paul, Minnesota, USA

Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License