Article Title: How do you teach a car that a snowman won’t walk across the road?
In the article at issue, the author identifies the learning problems of modern-day AI, the incidents these lead to, and suggests ways to fix them. Beginning with the example of self-driving cars, the author establishes that a major problem of the cars is not knowing when an obstacle is harmless and stopping is unnecessary; their abrupt braking causes rear-ending accidents. If the cars had more “common sense”, they would know, for instance, that a snowman will not walk across the road. “Common sense”, argues the author, is almost innate in human babies, and is deepened by years of social and emotional growth in society, separate from physical and intellectual growth. Children learn common sense by watching others and by having the feelings of others mirrored in their own hearts. AI, on the other hand (such as that in self-driving cars), learns mostly through statistics, and is cut off from this efficient route to common sense. Researchers today are struggling to give AI common sense, or even the reasoning ability of a child, and failing. The public may be misled into believing they are succeeding, but they are not, argue scientists like Ernest Davis of NYU. The solution, believes the author, lies in teaching AI the way human babies pick up common sense. Only then can it be made fit to assume greater responsibilities in the broader world.
Words to learn from this article:
Bootstrap: get something started or accomplished using only one’s own existing resources, without outside help.
Tumbleweed: a light plant which breaks off at the root and is blown along the ground by the wind, found in arid regions.
Rear-ending: an accident involving one vehicle crashing into the back of another ahead of it.
Causality: the relationship by which one thing or event is the direct cause of another.
Nascent: very newly formed.
Crowdsourcing: the practice of accomplishing a goal by enlisting the services of a large number of people, often through surveys or online contributions.
Violation: breaking of the norms or sanctity of something.
Cognitive: relating to understanding or perception.