Academic Journal

OS8.1 Chatbot-Assisted Self-Assessment (CASA): Designing a Novel AI-Enabled Sexual Health Intervention for Racially Minoritised Communities.
Document Type
Article
Source
Sexually Transmitted Diseases. 2024 Supplement, Vol. 51, pS37-S38. 2p.
ISSN
0148-5717
Abstract
Background: The digitalisation of healthcare services and advancements in artificial intelligence offer opportunities to tackle racial and ethnic disparities in sexual health (SH). Contemporary chatbots using natural language processing can provide accurate SH advice and refer users to appropriate medical services. We aimed to develop design principles for an effective and culturally sensitive self-assessment intervention based on the disclosure of health-related information to chatbots.

Methods: We conducted an online survey amongst racially minoritised communities in the UK (N=1287; 57% women; mean age=30.6, SD=10.1) to identify the level and type of information people were willing to disclose to SH chatbots and to measure reactions to chatbots' risk appraisals (5-point Likert scales). Follow-up interviews (N=41) explored perceptions of chatbot-led health assessment to identify themes related to acceptability and utilisation. We analysed the datasets using ANOVA tests, regression models and Thematic Analysis.

Results: Survey participants had neutral-to-positive attitudes towards SH chatbots (M=2.5, SD=0.60). They were comfortable disclosing "demographic information" (i.e. age, ethnicity; M=1.66, SD=0.89) and "sensitive health information" (i.e. condom use, sex and number of sex partners; M=2.26, SD=0.96) but less comfortable disclosing "personally identifiable information" (name, email address; M=2.56, SD=1.29); F=66.51, p<.001. After model adjustments, information disclosure was predicted by chatbot awareness, previous chatbot experience, positive attitudes and overall acceptability of chatbots, but not by any demographic or behavioural variables. Approximately 51% and 32% indicated they would visit a clinic if the chatbot told them they were 'at higher risk of STIs' or that it was 'too early to test', respectively. Thematic analysis revealed four themes: "Chatbot as an artificial health advisor", "Disclosing information to a chatbot", "Ways to facilitate trust and disclosure", and "Acting on self-assessment". Important chatbot features included neutral, empathic and medically accurate language, with an automated translation tool and the ability to book medical appointments directly.

Conclusion: Chatbots are acceptable for self-assessment and professional SH advice amongst racially minoritised communities. If chatbots are to achieve their potential as supplementary tools for health education and screening within SH services, anonymous use needs to be enabled in their design. Future research needs to establish chatbots' impact on screening uptake and access to SH services. [ABSTRACT FROM AUTHOR]
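The reported F=66.51 compares the same participants' comfort ratings across the three information categories, which points to a repeated-measures (within-subjects) ANOVA. The abstract does not specify the authors' analysis software or pipeline, so the following is a minimal illustrative sketch only: the variable names and the simulated data are hypothetical, with group means set to roughly match those reported in the abstract.

```python
# Illustrative sketch of a repeated-measures ANOVA on disclosure comfort.
# Hypothetical reconstruction; not the authors' actual analysis code or data.
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(0)
n = 1287  # survey sample size reported in the abstract

# Simulate 5-point Likert comfort ratings for each participant and each
# information category, with category means roughly matching the abstract
# (demographic 1.66, sensitive health 2.26, personally identifiable 2.56).
means = {"demographic": 1.66, "sensitive_health": 2.26, "identifiable": 2.56}
rows = []
for pid in range(n):
    baseline = rng.normal(0, 0.5)  # per-participant tendency (within-subjects)
    for category, mu in means.items():
        rating = int(np.clip(round(mu + baseline + rng.normal(0, 0.8)), 1, 5))
        rows.append({"participant": pid, "category": category, "comfort": rating})

df = pd.DataFrame(rows)

# Each participant rates every category once, so the design is balanced,
# as AnovaRM requires.
result = AnovaRM(df, depvar="comfort", subject="participant",
                 within=["category"]).fit()
print(result.anova_table)  # F statistic and p-value for the category effect
```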