IT Best Practices

AI Is Starting to Look at Objects the Way Humans Do

To most AI, relationships are hard, and no, we're not talking about The Notebook kind. Figuring out that a dog is chasing a cat in a picture is difficult for a computer because it simply analyzes the individual elements rather than how they relate. But as Will Knight reveals in an article for MIT Technology Review, that could all change very soon.

AI Gets New Eyes

At DeepMind, the company behind the Go-playing AI AlphaGo, it's all about enabling AI to process what humans can. While the cat-and-dog example above is still a bit advanced for it, DeepMind has set its sights on the physical relationships between objects. By training on sets of simple objects, the researchers have found that AI can pick up on these relationships, a capability far beyond anything else currently in the field.
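To get a feel for what "noticing relationships" means, here is a minimal, purely illustrative sketch in plain Python. It represents a scene not as a bag of objects but as a set of facts about every ordered pair of objects, which is the kind of structure relational-reasoning systems learn to extract. The object names, positions, and relation labels are all hypothetical; this is not DeepMind's actual model, just a toy picture of the idea.

```python
# Toy sketch: describe a scene as relations between object pairs rather than
# as isolated objects. (Hypothetical scene and relation names, for illustration
# only; DeepMind's systems learn such relations with neural networks.)

from itertools import permutations

# Each object: a name plus an (x, y) position in the scene.
scene = {"ball": (1, 3), "cube": (4, 3), "cone": (4, 1)}

def relations(objects):
    """Enumerate simple spatial relations over every ordered pair of objects."""
    facts = set()
    for (a, (ax, ay)), (b, (bx, by)) in permutations(objects.items(), 2):
        if ax < bx:
            facts.add((a, "left_of", b))
        if ay > by:
            facts.add((a, "above", b))
    return facts

facts = relations(scene)
print(("ball", "left_of", "cube") in facts)  # the ball is left of the cube
print(("ball", "above", "cone") in facts)    # the ball is above the cone
```

The point of the toy is the shift in representation: once the scene is encoded as pairwise facts, questions like "is the ball left of the cube?" become simple lookups, which mirrors why relational representations make some inferences dramatically easier.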

Knight cites Sam Gershman, a professor of psychology at Harvard, about how important this is to the future of AI:

“Our brains represent the world in terms of relations between objects, agents, and events,” he told MIT Technology Review via e-mail. “Representing the world in this way massively constrains the kinds of inferences we draw from data, making it harder to learn some things and easier to learn other things. So in that sense this work is a step in the right direction: building in human-like constraints that enable machines to more easily learn tasks that are natural for humans.”

You can view the original article here:
