AI Is Starting to Look at Objects the Way Humans Do

To most AI, relationships are hard, and no, we're not referring to The Notebook kind. Figuring out that a dog is chasing a cat in a picture is hard for a computer because it simply analyzes the individual elements, not how they relate to one another. But as Will Knight reveals in an article for MIT Technology Review, that could all change very soon.

AI Gets New Eyes

At DeepMind (the company behind the Go-playing program AlphaGo), the goal is AI that can process the world the way humans can. While the cat-and-dog example above is still a bit advanced for it, DeepMind has set its sights on the physical relationships between objects. By training on scenes built from sets of simple objects, the researchers have found that their AI can recognize these relationships, putting it well ahead of anything else currently in the field.
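To make that idea a little more concrete, here is a minimal sketch (in Python with NumPy) of pairwise relational reasoning in the spirit of the work described in the article: every pair of objects in a scene is compared by one small network, and the answer is read off the aggregated comparisons. The tiny random "networks," the object features, and the dimensions are all invented for illustration; this is not DeepMind's actual code.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(weights, x):
    """Apply a tiny two-layer network with a ReLU in the middle."""
    w1, w2 = weights
    return np.maximum(x @ w1, 0) @ w2

n_objects, obj_dim, hidden, out_dim = 4, 8, 16, 4

# Hypothetical per-object feature vectors (e.g. position, color, size).
objects = rng.normal(size=(n_objects, obj_dim))

# g scores each *pair* of objects; f reads the aggregated relations.
g_weights = (rng.normal(size=(2 * obj_dim, hidden)), rng.normal(size=(hidden, hidden)))
f_weights = (rng.normal(size=(hidden, hidden)), rng.normal(size=(hidden, out_dim)))

# Compare every object with every other object, then sum the pairwise scores.
pair_scores = [
    mlp(g_weights, np.concatenate([objects[i], objects[j]]))
    for i in range(n_objects)
    for j in range(n_objects)
    if i != j
]
relation_summary = np.sum(pair_scores, axis=0)

# A final readout (e.g. answering "which object is farthest from the ball?")
# works only from the summed relational features.
answer_logits = mlp(f_weights, relation_summary)
print(answer_logits)
```

The key design choice is that the same small comparison network sees every pair of objects, which pushes the model to reason in terms of relations between things rather than memorizing whole scenes.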

Knight cites Sam Gershman, a professor of psychology at Harvard, on how important this is to the future of AI:

“Our brains represent the world in terms of relations between objects, agents, and events,” he told MIT Technology Review via e-mail. “Representing the world in this way massively constrains the kinds of inferences we draw from data, making it harder to learn some things and easier to learn other things. So in that sense this work is a step in the right direction: building in human-like constraints that enable machines to more easily learn tasks that are natural for humans.”

You can view the original article here: https://www.technologyreview.com/s/608108/forget-alphago-deepminds-has-a-more-interesting-step-towards-general-ai/

About Austin J. Gruver

Austin is a Staff Writer for AITS. He has a background in professional writing from York College.
