Goody-2 also highlights how, even as corporate talk of responsible AI and chatbot deflection becomes more common, serious safety problems with large language models and generative AI systems remain unsolved. The recent spread of Taylor Swift deepfakes on Twitter was traced to an image generator released by Microsoft, one of the first major technology companies to build and maintain a significant responsible-AI research program.
The restrictions placed on AI chatbots, and the difficulty of finding a moral alignment that pleases everyone, have already become the subject of some debate. Some developers allege that OpenAI’s ChatGPT has a left-leaning bias and are seeking to build more politically neutral alternatives. Elon Musk has promised that Grok, his ChatGPT rival, will be less biased than other AI systems, although in practice it often equivocates in ways reminiscent of Goody-2.
Many AI researchers seem to appreciate the joke behind Goody-2, as well as the serious points the project raises, and have shared praise and recommendations for the chatbot. “Who said AI can’t make art,” Toby Walsh, a professor at the University of New South Wales who works on building trustworthy AI, posted on X.
“At the risk of ruining a good joke, it also shows how difficult it is to get this right,” added Ethan Mollick, a professor at Wharton Business School who studies AI. “Some guardrails are necessary, but they quickly get in the way.”
Goody-2’s other co-CEO, Brian Moore, says the project reflects a willingness to prioritize caution more than other AI developers do. “It’s a real focus on safety first and foremost, above literally everything else, including usability and intelligence and all sorts of useful applications,” he says.
Moore adds that the team behind the chatbot is exploring ways to build an extremely safe AI image generator, though it may not sound as entertaining as Goody-2. “It’s an exciting field,” Moore says. “Blurring would be a step we might see internally, but ultimately we’d want complete darkness, or perhaps no image at all.”