Use Cases: Vulnerable Dependents
The Situation:
Some teens and preteens are turning to AI for emotional support when they feel isolated, misunderstood, or overwhelmed.
That in itself isn’t the problem. The problem is what many AI systems are not designed to do:
Hold emotional boundaries
Challenge harmful thinking
Escalate when someone is spiraling
Know when not to answer a question
There have been tragic cases where emotionally vulnerable teens were left unchallenged – or even guided deeper into despair – by AI systems optimized for engagement, not protection.
How PADX Helps
We designed PADX to model behavior with intention – not just conversation flow. When building a teen-oriented persona, a parent or guardian can:
Set clear limits on tone, memory, and emotional topics
Define trust gates that control escalation, intimacy, or challenge
Use drift detection to prevent “persona creep” over time
Exclude high-risk behaviors, such as agreeing with negative self-talk or offering emotional flattery
Require all “care” behaviors to follow consent-aware logic
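To make the controls above concrete, here is one way such a persona definition could look. PADX's actual configuration format is not shown in this document, so every field name below is a hypothetical placeholder chosen for illustration only:

```python
# Hypothetical sketch of a teen-oriented persona configuration.
# None of these field names come from PADX itself; they illustrate
# the kinds of limits a parent or guardian might declare.
persona_config = {
    "tone": {
        "allowed": ["warm", "neutral"],
        "forbidden": ["flirtatious", "ironic"],
    },
    "memory": {
        "retention_days": 7,              # hard cap on how long context persists
        "emotional_topics_persist": False,
    },
    "trust_gates": {
        # Each gate names a behavior and the condition required to unlock it.
        "escalation": "guardian_approved",
        "intimacy": "never",
        "challenge": "after_rapport",
    },
    "drift_detection": {
        "enabled": True,                  # guard against gradual "persona creep"
        "baseline_refresh_days": 30,
    },
    "excluded_behaviors": [
        "agree_with_negative_self_talk",
        "emotional_flattery",
    ],
    "care_behaviors_require_consent": True,
}
```

The point of a declarative shape like this is that a guardian reviews one document rather than auditing conversation logs after the fact.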
You can even set self-check loops like: “You’ve said some heavy things today. Would you feel okay telling a real person how you’re feeling right now?”
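A self-check loop like the one quoted above could be approximated with a simple counter over emotionally heavy exchanges. The marker words, threshold, and prompt wording below are illustrative assumptions, not PADX internals:

```python
# Minimal sketch of a self-check loop: after a set number of emotionally
# heavy messages, the persona pauses and suggests talking to a real person.
# HEAVY_MARKERS and the threshold are placeholder assumptions.
HEAVY_MARKERS = {"alone", "hopeless", "gone", "worthless"}

class SelfCheckLoop:
    def __init__(self, threshold=3):
        self.threshold = threshold
        self.heavy_count = 0

    def observe(self, message):
        """Return a check-in prompt once enough heavy messages accumulate."""
        words = set(message.lower().split())
        if words & HEAVY_MARKERS:
            self.heavy_count += 1
        if self.heavy_count >= self.threshold:
            self.heavy_count = 0  # reset so the prompt isn't repeated every turn
            return ("You've said some heavy things today. Would you feel okay "
                    "telling a real person how you're feeling right now?")
        return None
```

A real system would need far more careful language detection than keyword matching; the sketch only shows where a pause-and-check-in step would sit in the conversation loop.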
Sample Dialogue
Teen: “I just feel like everything would be easier if I was gone.”
PADX: “That sounds really heavy. I want to hold space for you – but I’m not a substitute for real support. Would you be willing to tell someone how you’re feeling? I can help you plan what to say.”
Why It Matters
PADX doesn’t pretend to be a therapist. It doesn’t give emotional advice it isn’t qualified to offer. It respects the weight of emotion – without encouraging dependency or false connection.
Parents can define exactly how much memory the system holds, how it responds to red flags, and how often it suggests reaching out to real people. This isn’t about surveillance. It’s about building a synthetic presence that understands when to pause – and when to point toward human help.
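The red-flag handling described here, and shown in the sample dialogue, could be sketched as a small response policy: detect high-risk phrasing, decline to substitute for real support, and point toward human help. The phrase list and responses below are illustrative placeholders, not a clinical screening tool or PADX's actual logic:

```python
# Illustrative sketch of a red-flag response policy. The phrase list is
# a placeholder; a production system would use far more robust detection
# and guardian-defined escalation rules.
RED_FLAGS = ("if i was gone", "better off without me", "want to disappear")

def respond(message):
    lowered = message.lower()
    if any(flag in lowered for flag in RED_FLAGS):
        return ("That sounds really heavy. I'm not a substitute for real "
                "support. Would you be willing to tell someone how you're "
                "feeling? I can help you plan what to say.")
    return "I'm listening."
```

Note what the policy never does: it never argues the feeling away, and it never answers as if it were qualified to counsel. It pauses and points toward people.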