Learning Curve
New research has highlighted apparent flaws in AI models’ ability to take in new information. As it turns out, they may not be able to learn anything new at all.
According to a study conducted by a team of scientists from the University of Alberta in Canada and published in the journal Nature this week, AI algorithms trained with deep learning (the technique behind AI models such as large language models, which are built by finding patterns in large amounts of data) fail in a “continuous learning setting,” that is, when new concepts are introduced on top of the model’s existing training.
In other words, if you want to teach an existing deep learning model something new, you will probably have to retrain it from scratch. Otherwise, the research shows, many of the artificial neurons in the model collapse to outputs of zero, causing a loss of “plasticity,” the model’s ability to learn.
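To make the “neurons stuck at zero” idea concrete, here is a toy sketch (not the paper’s code) of how one might detect “dead” ReLU units: units whose output is zero for every input in a batch. The layer weights, function names, and example inputs are all illustrative assumptions.

```python
# Toy sketch: detecting "dead" ReLU units in a single layer, i.e.
# units whose activation is zero for every input in a batch.

def relu(x):
    return max(0.0, x)

def layer_outputs(weights, biases, inputs):
    """ReLU-layer outputs for each input vector in the batch."""
    outs = []
    for x in inputs:
        outs.append([relu(sum(w * xi for w, xi in zip(wrow, x)) + b)
                     for wrow, b in zip(weights, biases)])
    return outs

def dead_units(weights, biases, inputs):
    """Indices of units whose activation is zero on every input."""
    outs = layer_outputs(weights, biases, inputs)
    return [j for j in range(len(biases))
            if all(out[j] == 0.0 for out in outs)]

# Hypothetical 3-unit layer: unit 2 has drifted into a regime
# (a large negative bias) where it never fires, so it is "dead".
weights = [[1.0, 0.5], [-0.5, 1.0], [0.2, 0.1]]
biases = [0.0, 0.0, -10.0]
batch = [[1.0, 2.0], [0.5, -1.0], [2.0, 0.0]]

print(dead_units(weights, biases, batch))  # → [2]
```

A dead unit contributes nothing to the network’s output and, because the gradient through a zeroed ReLU is also zero, it can no longer be updated by training, which is why accumulating dead units degrades the ability to learn.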
“If you think of it like a brain, it’s like 90 percent of your neurons are dead,” Shibhansh Dohare, a computer scientist at the University of Alberta and lead author of the study, told New Scientist. “There’s not enough for you to learn.”
And as the researchers point out, training advanced AI models is a laborious and extremely expensive process, posing a major economic obstacle for cash-hungry AI companies.
“If the network is a large language model and the data represents a significant portion of the Internet, each retraining can cost millions of dollars in computational costs,” the study said.
Obstacle Course
This phenomenon of loss of plasticity is also a major disconnect between current AI models and imagined “artificial general intelligence,” a theoretical AI that would be as intelligent as humans. After all, in human terms, this would be like having to completely reboot your brain from scratch every time you take a new college course, lest you destroy a large portion of your neurons.
If there’s a silver lining for AI companies, it’s this: the study authors were able to create an algorithm capable of randomly reviving damaged or “dead” AI neurons, achieving some success in restoring plasticity.
However, as things stand, no practical solution has yet been found.
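The revival idea described above can be sketched in miniature: give a dormant unit fresh, small random incoming weights and reset its bias so it has a chance to fire (and thus be trained) again. Everything here, including the function name, the `scale` parameter, and the toy layer, is an illustrative assumption, not the paper’s implementation.

```python
import random

def revive_dead_units(weights, biases, dead, scale=0.1, rng=None):
    """Reinitialize the incoming weights and bias of each dead unit
    with small random values so it can activate again (toy sketch)."""
    rng = rng or random.Random(0)  # fixed seed for reproducibility
    for j in dead:
        weights[j] = [rng.uniform(-scale, scale) for _ in weights[j]]
        biases[j] = 0.0  # a zero bias no longer clamps the unit off
    return weights, biases

# Unit 1 is "dead": its large negative bias keeps its ReLU output
# at zero for any plausible input.
weights = [[1.0, 0.5], [0.2, 0.1]]
biases = [0.0, -10.0]

revive_dead_units(weights, biases, dead=[1])
print(biases[1], weights[1])  # bias reset, weights reinitialized
```

The random reinitialization is the key design choice: rather than trying to repair a stuck unit’s existing weights, it is simply recycled as a fresh unit, trading whatever the unit once encoded for renewed plasticity.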
“The solution to continuous learning is literally a billion-dollar problem,” Dohare told New Scientist. “A truly comprehensive solution that can continuously update models would significantly reduce the cost of training these models.”
More about AI training: When trained on AI-generated data, the AI spits out gibberish