🧑🏼‍💻 Research - July 19, 2025

An academic evaluation of ChatGpt’s ability and accuracy in creating patient education resources for rare cardiovascular diseases.

🌟 Stay Updated!
Join AI Health Hub to receive the latest insights in health and AI.

⚡ Quick Summary

This study evaluates ChatGPT’s accuracy in generating patient education resources for rare cardiovascular diseases (rCVD). The findings indicate that while ChatGPT can provide information, its effectiveness is limited, necessitating physician clarification for patient use.

🔍 Key Details

  • 📊 Questions Asked: 40 questions related to rCVD
  • 🧩 Evaluation Method: Expert academicians assessed the responses
  • ⚙️ Technology: Generative Pre-trained Transformer (ChatGPT)
  • 🏆 Performance: Lower success rate compared to classical diseases

🔑 Key Takeaways

  • 🤖 ChatGPT can provide information on rCVD but is not fully reliable.
  • 📉 Accuracy was lower for rCVD compared to more common diseases.
  • 🔄 Redundancy in responses was noted, with some answers being repetitive.
  • ❌ Incorrect information was minimal, but some answers were not satisfactory.
  • 🩺 Physicians should clarify ChatGPT’s responses for patient understanding.
  • 🛠️ Auxiliary Tool: ChatGPT should be used as a supplementary resource.
  • 📚 Study Published: In the journal Sci Rep, 2025.

📚 Background

Rare cardiovascular diseases (rCVD) pose unique challenges due to their specialized knowledge requirements and the limited information available in web databases. As healthcare increasingly integrates technology, tools like ChatGPT offer potential for enhancing patient education, but their reliability must be critically assessed.

🗒️ Study

The study involved a systematic evaluation of ChatGPT’s responses to 40 questions about rCVD. Expert academicians in the field reviewed the answers based on current guidelines and medical literature, aiming to determine the accuracy and reliability of the AI-generated information.

📈 Results

The results revealed that ChatGPT’s performance in answering questions about rCVD was lower than its previously reported performance on common diseases. While the AI provided a wealth of information, many responses were found to be redundant or not directly addressing the questions posed. Importantly, the study noted that there was very little incorrect information, but the overall effectiveness was still limited.

🌍 Impact and Implications

The implications of this study are significant for both patients and healthcare providers. While ChatGPT can serve as a valuable auxiliary tool for gathering information, it is crucial for physicians to interpret and clarify the AI’s responses. This ensures that patients receive accurate and comprehensible information regarding their conditions, ultimately enhancing patient education and care.

🔮 Conclusion

This evaluation highlights the potential and limitations of using AI like ChatGPT in the realm of patient education for rare cardiovascular diseases. While it can assist in information retrieval, the need for professional oversight remains paramount. Future research should focus on improving AI accuracy and reliability, ensuring that such technologies can be effectively integrated into patient care.

💬 Your comments

What are your thoughts on the use of AI in patient education? Do you believe tools like ChatGPT can be effectively utilized in healthcare? 💬 Share your insights in the comments below.


Abstract

Generative Pre-trained Transformer (ChatGPT) is a web-based artificial intelligence assistant with the potential to provide information, answer questions, and make recommendations on various topics. Rare cardiovascular diseases (rCVD) are among the health problems that require specialized knowledge and attention, and web databases provide relatively limited information. In this study, we investigated the accuracy and reliability of ChatGPT’s answers to patients’ possible questions about rCVD. ChatGPT was asked forty questions about rCVD. Based on current guidelines and information, academicians who are experts in their fields evaluated ChatGPT’s answers to these questions. The success of ChatGPT, which has been repeatedly evaluated in classical diseases, was lower in rCVD. The responses to various questions exhibited significant similarity, with some answers including redundant information. In addition, ChatGPT did not give the desired answers to some questions. However, although some answers were longer than necessary, there was very little incorrect information in the answers. Although ChatGPT is competent in obtaining information about rCVD, physicians should clarify the answers given by ChatGPT to patients. Therefore, ChatGPT should be used as an auxiliary information acquisition tool rather than as a primary resource for patients with rCVD.

Authors: Sevinç S, Candemir M, Yamak BA, Kızıltunç E, Sezenöz B, Şahin OB, Topal S, Demir Y, Yalçın MR, Şahinarslan A

Journal: Sci Rep

Citation: Sevinç S, et al. An academic evaluation of ChatGpt’s ability and accuracy in creating patient education resources for rare cardiovascular diseases. Sci Rep. 2025; 15:25929. doi: 10.1038/s41598-025-11567-w

