CEO of OpenAI says misuse of artificial intelligence could be ‘lights out for all’
In a recent interview with StrictlyVC (via Yahoo Finance), OpenAI CEO and co-founder Sam Altman spoke about the future of AI, good and bad, in the vaguest way possible, which scares me a bit.
Altman answered questions about OpenAI, the maker of the wildly popular AI chatbot ChatGPT and the AI art tool DALL-E, as well as the overall AI landscape. While much of the interview is a word salad of Silicon Valley terms, Altman did give his thoughts on the best- and worst-case scenarios for artificial intelligence.
Altman said that he thinks “the best case is so unbelievably good that it’s hard for me to even imagine” and that it could help “resolve deadlocks and improve all aspects of reality and let us all live our best lives.”
“I think that the good case is so unbelievably good; you sound like a really crazy person when you talk about it.”
As for the worst-case scenario, Altman gave a less-than-sunny response to the potential dangers if and when AI goes bad. “The bad case—and I think this is important to say—is, like, lights out for all of us,” Altman said. “I’m more worried about an accidental misuse case in the short term.”
Altman stressed that it’s not an issue of “AI wakes up and decides to be evil,” but rather that it’s “impossible to overstate the importance of AI safety and alignment work” in order to prevent abuse of the systems, intentional or otherwise.
We’ve already seen issues with AI tools, including ChatGPT, being used to commit plagiarism, write phishing emails, and spread misinformation, as well as copyright disputes over AI art generation. Frustratingly, Altman didn’t get into specifics. Considering Microsoft’s multibillion-dollar investment in OpenAI, it would have been nice to hear more about the pitfalls that could emerge when AI tools are used carelessly or maliciously.