Artificial intelligence is being used as a way to help those dealing with depression, anxiety and eating disorders, but some therapists worry that some chatbots could offer harmful advice. “60 Minutes” is the most successful television show in history. Offering hard-hitting investigative reporting, interviews, segments and profiles of people in the news, the broadcast began in 1968 and is still a hit more than 50 seasons later, regularly entering the Nielsen Top 10.

By admin

38 thoughts on “AI-powered mental health chatbots developed as a therapy support tool | 60 Minutes”
  1. …and what professional license and responsibility do they have?! How are these records kept confidential… oh yeah, there is no license, so it doesn't matter! This is irresponsible.

  2. I appreciate the advocacy for mental health apps and recognize some of their potential benefits. They can provide a valuable resource, especially considering the stigma surrounding mental health and the limited availability of therapists at certain times. However, there is irreplaceable value in what in-person or teletherapy contact offers. Speaking with a real person, whether face-to-face or via video, allows for a deeper connection and understanding that AI cannot replicate. We are also talking about EMPATHY! AI cannot offer empathy or be present the way we human therapists can. This human element is crucial for many individuals in their mental health journey.

  3. First, I understand people's reaction that you don't want to talk to a bot when you are feeling down. However, for people with 'lighter complaints', there are many benefits to talking to an app, like 24/7 availability and low cost. In general, we have to face the fact that healthcare systems are overstretched (for whatever reason) and that we have technology available that can work really well.

    The chatbots should only be used for mild mental health complaints. For severe and especially suicidal patients they are outright dangerous and shouldn't be used.

    See also my podcast where I talk with a researcher who built her own chatbot and has a critical view on when and how chatbots should be used. https://youtu.be/GkVHPUMwBf8?si=jeBqfeG6Szmz0ico

  4. AI is highly leftist and fully supports CDC and WHO without question. Also supports ALL protected classes to the point of apologizing for them without any criticism, ever. So this should be perfect for the transmental folks who are highly confused anyway.

  5. Care: Mental Health & Therapy has had an incredibly positive impact on my life. This AI-powered app has been amazing in helping me deal with past traumas and track my mood. It offers constant support, which is a lifesaver during tough times. The scientific approach and personalized therapy sessions have significantly improved my well-being. I highly recommend this app to anyone in need of mental health support!

  6. Something is lost with telehealth therapy and psychiatry, never mind AI chatbots. Person-to-person interaction between mental health providers and their patients is so important. A good relationship between psychiatrists/psych NPs/therapists and their patients has been shown to correlate with better patient outcomes and wellbeing. I've been doing this for 17 years; I see it in practice every day.
    It's difficult to treat patients who are underserved and under-resourced. I can only imagine how much worse it will be when AI takes over everyone's jobs… the government doesn't want to provide welfare assistance now, never mind giving it to the majority of the population pushed out by AI down the road. Scary, scary stuff.

  7. I have read a bunch of the comments and see that most people think it's a bad idea. As someone who has dealt with 15 years of depression bad enough to not leave the house for anything but essentials, I can tell you for sure that I would rather have a crappy tool than no tool at all. As long as the government isn't involved, I'm good with it. Lol😅

  8. I was waiting for them to say how much the app costs, and then I hear “ONLY PHYSICIANS”… What's the point of using an app then?? They're like, here, we'll protect you… if you pay the right person.

  9. There's NOT A SHORTAGE OF THERAPISTS! Nowadays you can get therapy on Zoom from any therapist in the United States, even if you don't have insurance. Look into it before you make blanket statements.

  10. Does nothing to solve the environmental factors that negatively affect mental health. Bad government that constantly lies, gives weapons to foreign governments for genocide, out-of-control crime where even police are killed during robberies and carjackings, media that identifies itself as news and not propaganda. High taxes, possible layoffs and homelessness. Blaming a person's genes for their diseases instead of the food supply. The list can go on forever. I'm sure people who are laid off due to AI will really appreciate an AI therapist. I'm sure everything you tell the AI therapist will be stored on servers so that the government can use it against you, making the 5th Amendment worthless.

  11. Why does 60 Minutes think it needs to include footage of interviewees actually at work? Why show us footage of a lady walking down a hallway, or at her desk responding to an email?

  12. Human therapists are also fallible and can give dangerous advice.

    Also, some people are so sensitive to social interactions that a live human therapist can be a barrier, particularly for those with histories of abuse or people-pleasing, answering just to please the therapist. An AI would reduce this.

  13. There is no magic therapy for those unwilling to participate. But for the rest of us, AI can offer deep insight and an opportunity for understanding ourselves.
