Enhancing LGBTQ+ Inclusivity in an AI-Powered Sexual Health Chatbot: User-Centered Design Approach Through a Nonprofit and Academic Partnership.
*Background*: Despite the growing use of digital platforms for sexual health education, many tools fail to meet the needs of LGBTQ+ (lesbian, gay, bisexual, transgender, and queer) teenagers, who often lack access to inclusive, affirming resources. Artificial intelligence (AI)-enabled chatbots have emerged as promising tools to address these gaps, but concerns remain around bias, usability, and trustworthiness, particularly for queer and transgender teenagers. Participatory design approaches centered on marginalized teenagers are critical to ensuring these tools are relevant, trustworthy, and equitable; yet, few studies have systematically engaged LGBTQ+ teenagers in the co-design of AI-powered sexual health interventions.
*Objective*: This paper examines LGBTQ+ teenagers' perceptions of Roo, Planned Parenthood Federation of America's (PPFA) AI-powered sexual education chatbot, to identify opportunities and challenges in delivering LGBTQ+-inclusive, affirming sexual health information.
*Methods*: Embedded within Sharing Health Education Resources, a hybrid effectiveness implementation trial of a digital HIV prevention intervention for LGBTQ+ teenagers, we collaborated with PPFA to create a customized instance of Roo for integration into this study. We engaged a Youth Advisory Council comprising 15 LGBTQ+ teenagers to independently explore and interact with Roo, then gathered feedback through a week-long asynchronous discussion on a private Discord (Discord Inc) server. The research team posed open-ended questions prompting participants to reflect on Roo's inclusivity, usability, and content priorities. We used rapid qualitative analysis organized around our research questions.
*Results*: Participants expressed both skepticism and curiosity about AI's role in delivering sexual health information, offering critical insights on the chatbot's language, trustworthiness, and relevance. Teenagers identified key limitations in Roo's inclusivity, tone, and interface, particularly around transgender-specific content, conversational depth, and stigma reduction. These findings informed targeted content updates, interface refinements, and transparency improvements, implemented by PPFA to enhance Roo for broader use. Specific changes included expanding LGBTQ+ affirming content, revising language to eliminate gendered assumptions, incorporating concrete statistics and contextualized examples to reduce stigma, and adding clearer disclosures around Roo's AI capabilities and limitations.
*Conclusions*: Academic and nonprofit collaborations can leverage participatory methods to enhance digital health tools in real-world contexts. LGBTQ+ teenagers served not only as testers but as co-designers, shaping the chatbot's evolution and surfacing broader lessons about trust, AI literacy, and health equity. This study demonstrates that marginalized teenagers possess the critical insights needed to meaningfully shape AI-enabled health interventions when provided with structured opportunities for engagement. This partnership offers a scalable model for integrating community voice into the development, evaluation, and implementation of inclusive, AI-enabled health technologies.
(© William Wibowo Liem, Elizabeth Casline, Julianna Lorenzo, Jacob D Gordon, Andrés Alvarado Avila, Attia Taylor, Nicole Levitz, Michael C O'Keefe, Kathryn Macapagal. Originally published in the Journal of Medical Internet Research (https://www.jmir.org).)