Yep. I studied fusion, plasma, and MHD under Dr. Igor Alexeff in the 1970s. It was "just xxx years away" then.
Fighter pilots might not survive the decade. AI Just Flew an F-16 for 17 Hours. This Could Change Everything. (msn.com)
AI controlling military equipment scares me more than anything. Not in some Terminator sense, but I think unmanned war machines are unethical.
I have strong opinions that I do not share with some of the trumpier/fox newsier members of my family.
Why did they think this would turn out any other way? https://www.nytimes.com/2023/02/16/technology/bing-chatbot-microsoft-chatgpt.html?smid=tw-share
Disturbing on 3 levels for me. Practical: if an AI had the means to interact freely, what might it do when the urge strikes it, having no real ties to a sense of human wellbeing? Ethical: we don't really understand our own consciousness well enough to extrapolate the potential for it in a nonhuman. Existential: what if we really aren't special at all, and our sentience or consciousness is something that can, for all practical purposes, be engineered and even improved upon? I think not being religious leaves me even less cover on these aspects.
Suddenly, the things people have been talking about for decades are about to slap us square in the face. Our ethics and morality are going to be tested.
Sounds like maybe these weird experiences are more the chatbot acting as a mirror to the user's mind and intentions: Microsoft: It’s Your Fault Our AI Is Going Insane (msn.com)
If you hang out with nothing but assholes, you start becoming an asshole. And maybe the same for a chatbot, but not just concerning assholishness.
When you care about making a statement, but decide to have a bot write it. Vandy things. Vanderbilt apologizes for using ChatGPT to write message on MSU shooting - The Washington Post
Great conversation about AI. Humanity is about to be tested like it hasn't been tested since the bottleneck 50,000 years ago. I think this is the next bottleneck. I hope we are wise enough to avoid it.