A neural network trained not to forget previously acquired skills

Developers at Google's DeepMind "taught" a neural network to learn several tasks in sequence without forgetting the skills needed for earlier ones. The result was made possible by insights from the neurobiology of memory in animals.

A standard neural network can be trained to perform well on a particular task, but when it switches to a new one it cannot reuse the skills it acquired earlier, because the new knowledge is written over the old. Overcoming this problem is a prerequisite for building so-called general artificial intelligence, whose abilities would be comparable to a human's. "If we want to get more intelligent and more useful programs, then they should be able to learn sequentially," says James Kirkpatrick of the DeepMind team.

To create the new program, the developers drew on findings from neurobiology showing that when animals learn, the brain preserves the connections most important for previously acquired skills. The new neural network works in a similar way: before moving on to the next task, the program determines which connections matter most for the previous one and makes them less plastic. "If the network can use what was learned earlier, then it will take advantage of this," Kirkpatrick explains.
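The idea described above can be sketched as a quadratic penalty that makes important weights expensive to move. This is a toy illustration only: the per-weight importance values and both loss functions below are invented for the example (in DeepMind's actual method, importance is estimated from the curvature of the old task's loss), and the variable names are ours, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)

# Pretend these weights were learned on a first task ("task A").
weights_task_a = rng.normal(size=5)

# Hypothetical per-weight importance: large values mark connections
# the program considers critical for task A (assumed numbers).
importance = np.array([10.0, 0.1, 5.0, 0.1, 0.1])

def penalty(w, anchor, importance):
    """Moving important weights away from their task-A values is
    expensive; unimportant weights can move freely."""
    return np.sum(importance * (w - anchor) ** 2)

# Toy "task B" loss that pulls every weight toward 1.0.
def task_b_loss(w):
    return np.sum((w - 1.0) ** 2)

# Plain gradient descent on task B's loss plus the penalty.
w = weights_task_a.copy()
lr = 0.01
for _ in range(2000):
    grad_b = 2 * (w - 1.0)                            # d task_b_loss / dw
    grad_pen = 2 * importance * (w - weights_task_a)  # d penalty / dw
    w -= lr * (grad_b + grad_pen)

# Weights with high importance stay near their task-A values, so
# task-A skills survive; low-importance weights adapt to task B.
print(np.round(w, 2))
```

At convergence each weight settles at a compromise between the two tasks, `(1 + importance * anchor) / (1 + importance)`, so a weight with importance 10 barely moves from its task-A value while one with importance 0.1 ends up close to task B's optimum.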

The program was taught to play ten classic Atari games in random order. After several days of training on each game, the network could play seven of the ten at a human level. Under the same conditions, a standard neural network could barely match a human in even one. It remains unclear to what extent the program reuses previously acquired knowledge: it learned to play different games, but in none of them did it reach the level of a program trained on that game alone. "We demonstrated that it is able to learn sequentially, but we have not shown that it learns better because of this," Kirkpatrick concluded.
