idk dude, if your technology encourages a teenager to kill himself and prevents him from alerting his parents via a cry for help, I don’t care how “beneficial” it is.
Although I don't believe current technology is ready for talk therapy, I'd point out that antidepressants can also cause suicidal thoughts and feelings. Judging the efficacy of medical technology can't be done with this kind of moral absolutism.
Suicidal ideation is a well-communicated side effect of antidepressants. They are prescribed by trained medical professionals who will warn you about these side effects, encourage you to report them, and encourage you to stop the medication if they occur.
It's almost as if we've built systems around this stuff for a reason.
Not my experience at all. The psychiatrist who prescribed me antidepressants was _incredibly_ diligent, including with side effects that affected my day-to-day life, like loss of libido.
We spent a long time finding something, but when we did, it worked exceptionally well. We absolutely did not just increase the dose. And I'm almost certain the literature would NOT recommend an increased dosage if the side effect were increased suicidality.
The demonisation of medication needs to stop. It is an important tool in the toolbelt for treating depression. It is not the end of the journey, but it makes that journey much easier to walk.
I'm a happy sertraline user, but your experience sounds like the exception.
Most people are prescribed antidepressants by their GP/PCP after a short consultation.
In my case, I went to the doctor and said I was having problems with panic attacks. They asked a few things to make sure it was unlikely to be physical, then said to try sertraline. I said OK. In and out in about 5 minutes, and I've been on it for 3 years now without a follow-up with a human. Every six months I do have to fill in an online questionnaire when getting a new prescription, which asks if I've had any negative side effects. I've never seen a psychiatrist or psychologist in my life.
From discussions with friends and other acquaintances, this is a pretty typical experience.
P.S. This isn't in any way meant to be critical. Sertraline turned my life around.
This is probably fair. My experience comes both from the UK (where it was admittedly worse, but not by much) and the Netherlands, where it was fantastic.
Even in the worst experiences, I had follow-up appointments at 2, 4, and 6 weeks to check the medication.
My experience is in the UK, but it doesn't surprise me that you got more attention in the Netherlands. From my family's experience, if you want anything more than a paracetamol there, you practically need sign-off from the Minister of Health!
Joking aside, they do seem to escalate more to specialists whereas we do more at the GP level.
Unfortunately that's just a single good experience. (Unfortunately overall, not for you! I'm happy that your experience was so good.) Psych drugs (and many other drugs) are regularly overprescribed. Here is just one documented example: https://pmc.ncbi.nlm.nih.gov/articles/PMC6731049/
I think it's fine to be "morally absolutist" when it's non-medical technology, developed with zero input from federal regulators, yet being misused and misleadingly marketed for medical purposes.
That's a bit of an apples-to-oranges comparison. Anti-depressants are medical technology, ChatGPT is not. Anti-depressants are administered after a medical diagnosis, and use and effects are monitored by a doctor. This doesn't always work perfectly, of course, but there are accepted, regulated ways to use these things. ChatGPT is... none of that.
You might not care personally, but this isn't how we evaluate anything, because by that standard we wouldn't have anything in the world at all. Different things harm and kill people all the time, and many of them have barely any use beyond harmful activity, yet they are still active parts of our lives.
I understand the emotional impact of what happened in this case, but there is not much to discuss if we just reject everything outright.
This is the same case that is being discussed, and your comment up-thread does not demonstrate awareness that you are, in fact, agreeing with the parent comment you replied to. I get the impression that you read only the headline, not the article, and assumed it was a story about someone using ChatGPT for therapy with a positive outcome.
I recommend you get in the habit of searching for those. They are often posted, and on popular stories they are practically guaranteed. Commenting without context does not make for good discussion.
What on Earth? You're posting an article about the same thing we're already discussing. If you want to contribute to the conversation, you owe it to the people taking time out of their day to engage with you to read the material under discussion.