Chatbots and Sexual Health


Everyone has the right to access accurate information about sexual health. This SexPlus Week, we're talking about how to find information you can trust in the age of artificial intelligence. 

Today, AI is an increasing part of our everyday lives—even when it comes to our sexual health! It's fast and easy to ask an AI chatbot a quick question about sexually transmitted infections (STIs), the difference between birth control options, or even ask for advice on how to have "the talk" with your kids. 

In 2025, nearly a billion (with a B!) people used an AI chatbot every week. That's a lot of people, and a lot of questions. 

But how do we know we're getting accurate answers? 

What are Generative AI chatbots?

A chatbot is a computer program that is designed to have human-like conversations when we give it prompts. Generative AI chatbots include ChatGPT, Microsoft Copilot, Google Gemini, and Meta AI, although there are many more. 

Generative AI chatbots are trained to create custom answers, drawing on huge amounts of data and identifying language patterns to predict possible answers and then create a response.  

TL;DR: It's like a much more advanced version of your smartphone predicting what word you will type next when writing a text. 
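To make that comparison concrete, here is a toy sketch of word prediction based on counting which words tend to follow each other, like a phone keyboard's suggestions. This is only an illustration for curious readers; real generative AI chatbots use far more sophisticated models trained on vastly more text.

```python
from collections import Counter, defaultdict

# A tiny sample of text to "learn" from (real models train on billions of words).
sample = "i am happy i am tired i am happy today"
words = sample.split()

# Count which word tends to follow each word in the sample.
following = defaultdict(Counter)
for current, nxt in zip(words, words[1:]):
    following[current][nxt] += 1

def predict_next(word):
    """Return the word most often seen after `word` in the sample, or None."""
    counts = following.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("am"))  # "happy" follows "am" more often than "tired"
```

The key point: the program has no understanding of what the words mean; it only repeats patterns from its training text. That is also why chatbots can sound confident while being wrong.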

What are the risks?

  • Fake or incorrect results: AI chatbots aren't perfect. Research shows that chatbots produce false or erroneous results in 46% of their answers.(1) When testing ChatGPT on sexual health questions, a 2025 study found it was only accurate 64% of the time.(2)
  • Privacy: AI companies collect and store private information you share with their chatbots. This information could be shared with other users or sold to other companies, used to train the AI model, or even be used to create a profile about you.
  • Discrimination: AI chatbots are trained with large sets of data (usually from thousands of websites), including from sources which might be racist, homophobic, and sexist. This can result in answers which are biased or discriminatory against certain groups.
  • Brain activity: Early studies(3) show that regularly using an AI chatbot like ChatGPT can decrease your brain function, including your ability to learn, be creative, and think critically.
  • Carbon footprint: AI relies on huge amounts of electricity and water to run and cool AI data centers (basically giant buildings full of computers), creating a huge carbon footprint and new threats to our environment,(4) even as Canadian forest fires get worse every year.
  • Other ethical concerns: There are many open questions about how generative AI chatbots are trained, how they store data, and how they are used. Concerns include AI companies stealing work from artists and writers, and chatbots being used by people or groups to spread false information or political propaganda. 

How can I use chatbots more safely?

  • Check your chat: ask the chatbot to provide sources for the information it shares with you. Ask yourself:
    • Is this information from a reliable or trustworthy source?
    • Who is the author and how are they qualified?
    • Is this content sponsored or trying to sell me something?
    • When was this information published or last updated?
  • Protect your privacy: don’t share any personal or private information with a chatbot, including details about your health. You never know where that information will end up!
  • Use it as a starting point, not the finish line: chatbots can help point you in the right direction. Then, you can take things into your own hands and explore trusted sources like Action Canada, your local library, or official health websites.  

How can I prepare for the age of AI when it comes to sexual health information?

  • Look for information from trusted sources like Action Canada's Sexual Health Hub, your local sexual health centre, medical professionals, or sex education resources like Sex & U and Sexfluent.
  • Let's talk about sex(ual health)! Talk to friends, parents, or partners to see if what you found seems real or needs more research! The more openly we can talk about sexual health, the easier it is for people to ask questions and find answers.
  • Boundaries are an important part of all relationships. This includes our relationships with technology like AI. Rather than becoming dependent on billionaire-run chatbots, we can choose to turn to resources created by real people, for real people.
  • Advocate for comprehensive sex-ed for all students! When young people have access to comprehensive sexuality education, they are equipped with information about their bodies, their sexuality, and how to have healthy relationships of all kinds. This sets them up for success in every area of their life—including sexual health and critical thinking about changing technologies. 

As our digital worlds get even more complicated, it's more important than ever to know where to find sexual health information we can trust. Our rights, our health, and our futures depend on it.  

Learn more at SexPlusWeek.ca. 

Download the Factsheet.

References:

  1. De Wynter, A., Wang, X., Sokolov, A., Gu, Q., & Chen, S. (2023). An evaluation on large language model outputs: Discourse and memorization. Natural Language Processing Journal, 4, 100024. https://doi.org/10.1016/j.nlp.2023.100024
  2. Latt, P. M., Aung, E. T., Htaik, K., Soe, N. N., Lee, D., King, A. J., Fortune, R., Ong, J. J., Chow, E. P. F., Bradshaw, C. S., Rahman, R., Deneen, M., Dobinson, S., Randall, C., Zhang, L., & Fairley, C. K. (2025). Evaluation of artificial intelligence (AI) chatbots for providing sexual health information: a consensus study using real-world clinical queries. BMC Public Health, 25(1), 1788. https://doi.org/10.1186/s12889-025-22933-8
  3. Chow, A. R. (2025, June 23). ChatGPT may be eroding critical thinking skills, according to a new MIT study. TIME. https://time.com/7295195/ai-chatgpt-google-learning-school/
  4. Explained: Generative AI’s environmental impact. (2025, January 17). MIT News | Massachusetts Institute of Technology. https://news.mit.edu/2025/explained-generative-ai-environmental-impact-0117
Posted on 2025-12-02