Scientists no longer understand exactly how AI makes decisions

Speaking at the Neural Information Processing Systems conference, artificial-intelligence specialists said they no longer understand the principles by which AI makes its decisions, Quartz reports. According to the experts, taking AI's actions on faith without understanding its logic is reckless: before people can adopt machine-learning models, they need to know exactly what an AI is guided by when it decides how to act in a particular situation.

Decisions made by AI are often biased, and its associative “thinking” is frequently imperfect, leading it to draw incorrect analogies. Such mistakes can be costly when AI controls complex projects, such as a flight to Mars; there, a wrong action by an artificial intelligence could not only destroy expensive equipment but also cost human lives. “Therefore, before allowing AI to make important decisions on its own, we first need to learn the principles it is guided by,” explains Maithra Raghu, an AI researcher at Google.

At the conference, she presented a paper describing a process for tracing the behavior of individual parts of a neural network. By inspecting these parts one by one, it becomes possible to understand the logic of the AI and, where necessary, correct it. Analyzing millions of operations, she was able to identify individual artificial “neurons” that had latched onto misconceptions, and then disable them, making the AI more tractable and accurate.
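The idea of disabling individual “neurons” to see what they contribute can be illustrated with a toy ablation experiment. The sketch below is a simplified illustration of the general technique, not Raghu’s actual method: it builds a tiny randomly initialized two-layer network, zeroes out one hidden unit at a time, and measures how much the output shifts.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-layer network: 4 inputs -> 5 hidden units -> 3 outputs
# (weights are random; this is purely illustrative).
W1 = rng.normal(size=(4, 5))
W2 = rng.normal(size=(5, 3))

def forward(x, ablate=None):
    """Run the network; optionally zero out ('disable') one hidden neuron."""
    h = np.maximum(0, x @ W1)   # ReLU hidden activations
    if ablate is not None:
        h = h.copy()
        h[ablate] = 0.0         # ablate the chosen neuron
    return h @ W2

x = rng.normal(size=4)
baseline = forward(x)

# How much does each hidden neuron contribute to the output?
deltas = [np.linalg.norm(forward(x, ablate=i) - baseline) for i in range(5)]
for i, d in enumerate(deltas):
    print(f"neuron {i}: output shift = {d:.3f}")
```

A neuron whose ablation barely moves the output is doing little work on this input; a neuron whose ablation shifts the output sharply is one an analyst would inspect more closely, and could permanently disable if it turned out to encode a spurious association.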
“It’s something like a teacher presenting some material and then asking the student to retell, in his own words, exactly what he understood from the lecture,” explains Kiri Wagstaff, an AI expert at NASA.

Judging by the results of this study, correctly understanding the actions of artificial intelligence is not so difficult. The main thing is to solve the problem before it is too late.
