Sewell Setzer was a happy child - before he fell in love with a chatbot and took his own life at 14. His mother has now filed a lawsuit against the most powerful company in the world.
Yes, I’m sure you’ll be able to convince kids that the new thing is bad because you say so, especially if you compare it to the antiquated mascot of a legacy word processor.
It’s not about it being bad. It’s about expectations versus reality. It’s not human. It can’t replace human emotion and thought; it just processes data and gives analysis.
There is an emotional factor in proper human decision making that is required. Otherwise half the human population would probably be suggested for elimination in the name of some cold efficiency that only a machine or a psychopath could accept.
The same goes for things like suicide, mental health, and human relationships. I don’t trust a machine’s judgment on that.
Just teach kids that AI isn’t human and isn’t a replacement for humanity or human interaction of any kind.
It’s Clippy with a ginormous database. It’s cold-blooded.