• Pete Hahnloser@beehaw.org
    2 days ago

    A few thoughts here as someone with multiple suicide attempts under his belt:

    • I’d never use an “AI therapist” not running locally. Crisis is not the time to start uploading your most personal thoughts to an unknown server with possible indefinite retention.

    • When ideation hits, we’re not of sound enough mind to weigh those risks, so offering this is, in effect, taking advantage of people in a dark place for data gathering.

    • Having seen the gamut of mental-health services from what’s available to the indigent to what the rich have access to (my dad was the director of a private mental hospital), it’s pretty much all shit. This is a U.S. perspective, but I find it hard to believe we’re unique.

    • As such, there may be room for “AI” to provide outcomes similar to crisis lines, telehealth or in-person therapy. But again, this would need to run locally, and it likely isn’t ready for primetime; I can only see it becoming more helpful once it can take on more of an agent role, with context for what you’re going through.