How we use your data to power Clare's AI and improve your recovery support
Last updated: January 2025
Clare's effectiveness as an AI Recovery Coach depends on understanding emotional context, recovery patterns, and personalized support needs. This Data Usage Policy explains how we responsibly use your information to provide real-time support, detect potential crises, personalize your experience, and improve Clare for everyone in recovery.
All data usage follows strict privacy protections and ethical AI principles.
Every message you send to Clare is processed in real time to understand emotional context, detect potential concerns, and generate appropriate, empathetic responses.
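As a minimal sketch of what such a real-time pipeline can look like, with hypothetical names throughout (this is not Clare's actual implementation), each message is analyzed before any reply is generated:

```python
from dataclasses import dataclass, field

# Hypothetical concern labels a message classifier might emit.
CONCERN_LABELS = {"crisis", "relapse_risk", "severe_distress"}

@dataclass
class MessageAnalysis:
    emotional_tone: str               # e.g. "anxious", "hopeful"
    concern_flags: set = field(default_factory=set)

def analyze_message(text: str) -> MessageAnalysis:
    """Stand-in for a real emotion/risk classifier (keyword match only)."""
    flags = {"relapse_risk"} if "craving" in text.lower() else set()
    return MessageAnalysis("distressed" if flags else "neutral", flags)

def handle_message(text: str) -> str:
    """Analyze first, respond second: detected concerns reroute the reply."""
    analysis = analyze_message(text)
    if analysis.concern_flags & CONCERN_LABELS:
        return "I hear how hard this is. Want to talk through it together?"
    return "Thanks for sharing. I'm listening."

print(handle_message("I've been fighting a craving all day."))
```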
Voice calls with Clare use HumeAI's emotional intelligence platform to analyze vocal patterns, tone, and emotional indicators for deeper understanding and support.
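HumeAI's platform returns far richer expression measurements than shown here; the stub below only illustrates, with assumed names and scores, how vocal indicators could steer the register of a reply:

```python
from dataclasses import dataclass

@dataclass
class ProsodyScores:
    # Hypothetical per-utterance summary; real expression measurement
    # APIs report many more dimensions than these two.
    distress: float   # 0.0 to 1.0
    calmness: float   # 0.0 to 1.0

def score_utterance(audio: bytes) -> ProsodyScores:
    """Stub standing in for the external voice-analysis call."""
    return ProsodyScores(distress=0.2, calmness=0.7)

def choose_register(scores: ProsodyScores) -> str:
    """Pick a response style from vocal indicators, not just the words."""
    return "grounding" if scores.distress > 0.6 else "conversational"

print(choose_register(score_utterance(b"fake-audio")))  # conversational
```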
Your interactions help make Clare smarter and more effective for all users in recovery. Here's how we responsibly use data for AI improvement:
By understanding which support strategies work best, we can improve Clare's ability to help people at different stages of recovery, facing a range of challenges, and from diverse backgrounds.
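One common anonymization step, sketched here with hypothetical helpers rather than Clare's actual pipeline, is to replace the account identifier with a salted hash and scrub obvious direct identifiers before any aggregate analysis:

```python
import hashlib
import re

def anonymize_for_training(user_id: str, text: str, salt: bytes) -> dict:
    """Pseudonymize the account and redact direct identifiers.
    The regexes below are illustrative, not an exhaustive PII scrubber."""
    pseudonym = hashlib.sha256(salt + user_id.encode()).hexdigest()[:16]
    scrubbed = re.sub(r"[\w.+-]+@[\w-]+\.[\w.]+", "[email]", text)
    scrubbed = re.sub(r"\+?\d[\d\s().-]{7,}\d", "[phone]", scrubbed)
    return {"user": pseudonym, "text": scrubbed}

print(anonymize_for_training("acct-42", "Email me at sam@example.com", b"s"))
```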
Your safety is our highest priority. We use advanced AI to detect potential crisis situations and provide immediate support, as sketched in the example after this list:
- AI monitoring for suicide risk, severe depression, and relapse indicators
- Immediate supportive messaging and resource provision
- Connection to crisis hotlines and emergency resources when needed
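A simplified, hypothetical version of that escalation flow might look like the following; the threshold, wording, and resource list are illustrative only:

```python
CRISIS_RESOURCES = ["988 Suicide & Crisis Lifeline (call or text 988)"]

def route_message(risk_score: float, threshold: float = 0.8) -> dict:
    """Escalate when an assumed risk classifier's score crosses a threshold."""
    if risk_score >= threshold:
        return {
            "reply": "You're not alone in this. Help is available right now.",
            "resources": CRISIS_RESOURCES,   # surfaced alongside the reply
            "escalate": True,                # hand off to crisis support
        }
    return {"reply": "I'm here with you.", "resources": [], "escalate": False}

print(route_message(0.92)["escalate"])  # True: crisis path
```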
Clare learns about your unique recovery journey to provide increasingly personalized support.
Personal learning data is encrypted and siloed to your account. It's not shared with other users or used in general AI training without anonymization.
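As a minimal sketch of per-account siloing, using the open-source cryptography package's Fernet scheme (the key store and function names are assumptions, not Clare's actual storage design):

```python
from cryptography.fernet import Fernet

# Hypothetical per-account key store: each user's learning data is
# encrypted under a key tied only to that account.
account_keys: dict = {}

def key_for(user_id: str) -> bytes:
    if user_id not in account_keys:
        account_keys[user_id] = Fernet.generate_key()
    return account_keys[user_id]

def store_learning_note(user_id: str, note: str) -> bytes:
    """Encrypt before storage; only this account's key can decrypt it."""
    return Fernet(key_for(user_id)).encrypt(note.encode())

token = store_learning_note("acct-42", "evening check-ins help most")
print(Fernet(key_for("acct-42")).decrypt(token).decode())
```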
You have control over how your data is used.
Access all privacy controls through your account settings. Changes take effect immediately, and you can modify them at any time.
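To illustrate how changes can apply immediately, a preference that is consulted at the moment of use (the toggle names below are hypothetical) never depends on a cached copy of an earlier setting:

```python
from dataclasses import dataclass

@dataclass
class PrivacySettings:
    # Hypothetical toggles; actual setting names may differ.
    allow_anonymized_training: bool = True
    allow_voice_analysis: bool = True

def may_use_for_training(settings: PrivacySettings) -> bool:
    """Checked at use time, so flipping the toggle applies at once."""
    return settings.allow_anonymized_training

prefs = PrivacySettings()
prefs.allow_anonymized_training = False  # user flips the toggle
print(may_use_for_training(prefs))       # False, effective immediately
```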
We conduct research to improve recovery support technology while maintaining the highest ethical standards.
We update this policy as our AI capabilities evolve. We'll notify you of significant changes and always maintain transparency about data usage.