Tuesday, March 5, 2024

ChatGPT has a reputation for generating hallucinations, or false information. So can an Artificial Intelligence (AI) platform be trusted to assist in a literature review? Yes, if you choose the right tool for the job. ChatGPT and Copilot are not designed to provide accurate citations; instead, use them to brainstorm research questions. Stay alert for misinformation, hallucinations, and bias that can appear in generative AI responses, and be aware of historical biases in the literature, which can also influence the output you encounter. 

Be sure to keep track of which tools you use, your purpose for using them, and the output from your interactions. Be prepared to disclose the AI tools, databases, and criteria used to select and analyze sources. Remember that you are ultimately responsible for anything you create; generative AI is only your assistant.  

Try these five AI platforms to assist you in your literature reviews and academic research: 

  1. Copilot. Many people are exploring ways that AI can improve research. Even with a general-purpose generative AI platform like Copilot, you can brainstorm or discover new perspectives on research topics. An example prompt for this purpose appears in David Maslach's article, "Generative AI Can Supercharge Your Academic Research": “I am thinking about [insert topic], but this is not a very novel idea. Can you help me find innovative papers and research from the last 10 years that has discussed [insert topic]?”  
  2. Elicit. This AI research assistant helps with evidence synthesis and text extraction. Users can enter a research question, and the AI identifies top papers in the field, even without perfect keyword matching. Elicit includes only academic papers, drawing on more than 126 million papers through Semantic Scholar. It organizes results into an easy-to-use table and offers features for brainstorming research questions. 
  3. Consensus. This AI-powered search engine pulls answers from research papers. Consensus is not meant for basic factual questions such as “How many people live in Europe?” or “When is the next leap year?”, since there is unlikely to be research dedicated to investigating these subjects. It is more effective with research questions on topics that researchers have likely studied. Yes/no questions generate a “Consensus” drawn from papers on the topic. Consensus also sources its papers from Semantic Scholar. Search results can be filtered by the study's sample size, population studied, study type, and more, which makes Consensus an interesting tool for finding literature related to your search topic. 
  4. Research Rabbit. An AI research assistant designed to help researchers discover and organize academic papers efficiently. It offers features such as interactive visualizations, collaborative exploration, and personalized recommendations. Users can create collections of papers, visualize networks of papers and co-authorships, and explore research questions. Unlike the previous two platforms, Research Rabbit doesn’t start with a question but with a paper you already know: you need a starting article to go down a “rabbit hole” and see connections between papers. 
  5. Litmaps. Similar to Research Rabbit, Litmaps shows the relationships between the articles in your collection as connecting lines that trace the citations for you. A user starts with a citation, or “seed,” and then, through a simple interface, investigates connections between papers. 
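Since Elicit and Consensus both draw on the Semantic Scholar corpus, it is worth knowing that Semantic Scholar also exposes a free public Graph API you can query directly. As a minimal sketch (the endpoint and parameter names follow the public API documentation; the query text and function name are just illustrative), here is how a paper-search request could be built in Python:

```python
from urllib.parse import urlencode

# Semantic Scholar Graph API paper-search endpoint
BASE = "https://api.semanticscholar.org/graph/v1/paper/search"

def build_search_url(query, fields=("title", "year", "citationCount"), limit=10):
    """Construct a search URL for a research question.

    `fields` limits the metadata returned for each paper;
    `limit` caps the number of results per request.
    """
    params = {"query": query, "fields": ",".join(fields), "limit": limit}
    return f"{BASE}?{urlencode(params)}"

url = build_search_url("effects of spaced repetition on long-term retention")
print(url)
```

Fetching that URL (for example with `urllib.request` or `requests`) returns JSON with a ranked list of matching papers, which is essentially the raw material the tools above organize into their tables and visualizations.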

For further reading, see "How to Write AI-Powered Literature Reviews: Balancing Speed, Depth, and Breadth in Academic Research" which includes a helpful table comparing the different tools that specialize in literature searching. And check out the February 2024 webinar, "Unlock the Power of AI for Academic Research" hosted by Tracy Mendolia-Moore and Brett Christie for more information on this topic.