AI Resource Guide

A guide to artificial intelligence (AI) resources.

Critical Thinking and GenAI Use for Research

While Generative AI (GenAI) research and writing tools may speed up work or make it feel easier, overreliance on GenAI can erode our critical thinking skills:

  • GenAI tools (like search engines) present information as though it were fact, which can seem to reduce the need for critical thinking skills such as questioning, comparing sources, or evaluating claims.
  • This undermines core research habits like source triangulation (verifying claims or facts against other sources) and evidence-based reasoning (forming conclusions or arguments supported by reliable, relevant evidence).
  • Algorithms often personalize search results by offering information that mirrors a user’s previous searches and accessed content.
  • This limits exposure to diverse or challenging perspectives, narrowing critical thinking by discouraging intellectual humility and open-minded inquiry.
  • Overreliance on GenAI for writing, summarizing, or data analysis can lead researchers to skip essential steps like close reading, critical interpretation, or understanding methodologies.
  • This may obscure how conclusions are reached and replace deep analysis with shallow synthesis.
  • Despite marketing claims, AI does not have access to “all” global knowledge. It operates on datasets that are partial, outdated, or biased (never mind often used without permission). These limitations can reinforce existing stereotypes and inaccuracies.
  • In fields like health, criminal justice, and education, these biases can cause researchers to draw faulty conclusions that replicate social inequities.
  • If researchers assume GenAI outputs are neutral, they may overlook embedded cultural and political biases.
  • This can lead researchers to accept flawed results and further undermines critical thinking skills.
  • Researchers should use critical thinking to achieve contextual understanding, ethical reflection, and reliance on the wisdom of lived experience: things AI cannot do.
  • Outsourcing these tasks to AI can lead researchers to make decisions disconnected from human insight and real-world complexity.

To find out more about how AI replicates systems of oppression and is built on systems with incomplete data and information, check out these resources:

To read about the cognitive effects of ChatGPT:

  • Kosmyna, Nataliya, et al. Your Brain on ChatGPT: Accumulation of Cognitive Debt When Using an AI Assistant for Essay Writing Task. arXiv:2506.08872, arXiv, 10 June 2025, https://doi.org/10.48550/arXiv.2506.08872.

 

