Artificial intelligence is being used as a way to help those dealing with depression, anxiety and eating disorders, but some therapists worry that some chatbots could offer harmful advice. "60 Minutes" is the most successful television show in history. Offering hard-hitting investigative reporting, interviews, segments and profiles of people in the news, the broadcast began in 1968 and is still a hit more than 50 seasons later, regularly entering the Nielsen Top 10.
Says "correct," but shakes her head no.
…and what professional license and responsibility do they have?! How are these records kept confidential? Oh yeah, there is no license, so it doesn't matter! This is irresponsible.
One of the first AI programs was ELIZA, a chatbot therapist created in the 1960s.
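For context on the ELIZA comment above: the original program worked by simple pattern matching, reflecting the user's own words back as questions. Here is a minimal illustrative sketch of that idea; the `RULES` and `respond` names and the three rules are made up for this example, not Weizenbaum's actual script.

```python
import re

# A minimal ELIZA-style responder: each rule pairs a regex with a
# response template that reflects the user's words back as a question.
# These rules are illustrative only, not the original 1960s script.
RULES = [
    (re.compile(r"i need (.*)", re.I), "Why do you need {0}?"),
    (re.compile(r"i am (.*)", re.I), "How long have you been {0}?"),
    (re.compile(r"my (.*)", re.I), "Tell me more about your {0}."),
]

def respond(text: str) -> str:
    """Return the first matching reflected response, or a neutral prompt."""
    for pattern, template in RULES:
        match = pattern.search(text)
        if match:
            # Strip trailing punctuation so it reads naturally in the reply.
            return template.format(match.group(1).rstrip(".!?"))
    return "Please, go on."
```

There is no understanding here at all, which is exactly why the technique is remembered: users in the 1960s still attributed empathy to it.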
I appreciate the advocacy for mental health apps and recognize some of their potential benefits. They can provide a valuable resource, especially considering the stigma surrounding mental health and the limited availability of therapists at certain times. However, there is irreplaceable value in what in-person or teletherapy contact offers. Speaking with a real person, whether face-to-face or via video, allows for a deeper connection and understanding that AI cannot replicate. We are also talking about EMPATHY! AI cannot offer empathy and presence the way we human therapists can. This human element is crucial for many individuals in their mental health journey.
Chatbots can't refuse to do what they are told on moral grounds.
Whoever thought of this just doesn’t understand therapy
First, I understand people's reaction that you don't want to talk to a bot when you are feeling down. However, for people with 'lighter complaints', there are many benefits in talking to an app, like 24/7 availability and low cost. In general, we have to face the fact that healthcare systems are overstretched (for whatever reason) and that we have technology available that can just work really well.
Chatbots should only be used for light mental health complaints. For severe and especially suicidal patients they are outright dangerous and shouldn't be used.
See also my podcast where I talk with a researcher who built her own chatbot and has a critical view on when and how chatbots should be used. https://youtu.be/GkVHPUMwBf8?si=jeBqfeG6Szmz0ico
Do these companies have malpractice insurance?
AI is highly leftist and fully supports CDC and WHO without question. Also supports ALL protected classes to the point of apologizing for them without any criticism, ever. So this should be perfect for the transmental folks who are highly confused anyway.
As a therapist: we need AI for the documentation side of therapy, not for the delivery of sessions to the client.
No thanks. I'd rather be alone
Care: Mental Health & Therapy has had an incredibly positive impact on my life. This AI-powered app has been amazing in helping me deal with past traumas and track my mood. It offers constant support, which is a lifesaver during tough times. The scientific approach and personalized therapy sessions have significantly improved my well-being. I highly recommend this app to anyone in need of mental health support!
lol, Woebot is inferior to ChatGPT. Her business is dead.
I’m using an AI chatbot and it really helps.
#AI-Copilot 1:53
Want to improve your mental health? Limit your technology (including social media), get outside and move that body 😊
Something is lost with telehealth therapy and psychiatry, never mind AI chatbots. Person-to-person interaction between mental health providers and their patients is so important. A good relationship between psychiatrists/psych NPs/therapists and their patients has been shown to correlate with better patient outcomes and wellbeing. I've been doing this for 17 years; I see it in practice every day.
It’s difficult to treat patients who are underserved and under-resourced. I can only imagine how much worse it will be when AI takes over everyone's jobs… the government doesn't want to provide welfare assistance now, never mind giving it to the majority of the population pushed out by AI down the road. Scary, scary stuff.
😉
Very helpful
The app woman is full of BS for people who don't understand either AI or psychology.
Emotionally sad ppl need emotionless AI. Great start😅
This is sooooo missing the point of human suffering… we need LESS technology and MORE in-person human connection. This is confusing both the issues and the solutions. Gross.
💩This is free everywhere. Don't pay for this junk! 💩
I have read a bunch of the comments and see that most people think it’s a bad idea. As someone who has dealt with 15 years of depression bad enough to not leave the house for anything but essentials, I can tell you for sure that I would rather have a crappy tool than no tool at all. As long as government isn’t involved, I’m good with it. Lol😅
I was waiting for them to say how much the app costs, and then I hear "ONLY PHYSICIANS"… What's the point of using an app then?? They're like, here, we'll protect you… if you pay the right person.
There's NOT A SHORTAGE OF THERAPISTS! Nowadays you can get therapy on Zoom from any therapist in the United States even if you don't have insurance. Look into it before you make blanket statements.
Does nothing to solve the environmental factors that negatively affect mental health. Bad government that constantly lies, gives weapons to foreign governments for genocide, out-of-control crime where even police are killed during robberies and carjackings, media that identifies itself as news and not propaganda. High taxes, possible layoffs and homelessness. Blaming a person's genes for their diseases instead of the food supply. The list can go on forever. I'm sure people who are laid off due to AI will really appreciate an AI therapist. I'm sure everything you tell the AI therapist will be stored on servers so that the government can use it against you, making the 5th Amendment worthless.
Why does 60 Minutes think it needs to include footage of interviewees actually at work? Why show us footage of a lady walking down a hallway, or at her desk responding to an email?
I think this can be useful along with 988
This bot helped me in times of need, but they have made it private and it can only be accessed in the USA now.
What an insult to human beings! AI is a petri dish for disaster.
This is dumb
Human therapists are also fallible and can give dangerous advice.
Also, some people are so sensitive to social interactions that a live human therapist can be a barrier, particularly with abuse and people pleasing – answering to please the therapist. An AI would reduce this.
She's not the first person to come up with this idea. Inflection AI, acquired by Microsoft, already had an AI mental health chatbot.
Gee, what could go wrong?
There is no magic therapy for those unwilling to participate. But for the rest of us, AI can offer deep insight and opportunity for understanding ourselves.
Google "Wysa", it's awesome for mental health! I really benefited from their mental health and mood chatbot app.
We are suffering from the lack of human interaction. – This will dig an even deeper hole to sink into.