
AI Is Starting to Look at Objects the Way Humans Do

For most AI, relationships are hard, and no, we're not referring to The Notebook kind. Figuring out that a dog is chasing a cat in a picture is difficult for a computer because it analyzes each element in isolation. But as Will Knight reveals in an article for MIT Technology Review, that could soon change.

AI Gets New Eyes

At DeepMind (the company behind the Go-playing program AlphaGo), the goal is for AI to process the world the way humans can. While the cat-and-dog example above is still a bit advanced, DeepMind has set its sights on the physical relationships between objects. By training on sets of simple objects, the researchers have found that their AI can recognize these relationships, which goes well beyond anything else currently in the field.

Knight cites Sam Gershman, a professor of psychology at Harvard, about how important this is to the future of AI:

“Our brains represent the world in terms of relations between objects, agents, and events,” he told MIT Technology Review via e-mail. “Representing the world in this way massively constrains the kinds of inferences we draw from data, making it harder to learn some things and easier to learn other things. So in that sense this work is a step in the right direction: building in human-like constraints that enable machines to more easily learn tasks that are natural for humans.”

You can view the original article here:

About Austin J. Gruver

Austin is a Staff Writer for AITS. He has a background in professional writing from York College.
