
Google’s Similar Images: Teaching computers to see


This article was originally on a blog post platform and may be missing photos, graphics or links. See About archive blog posts.

This week, Google unveiled an odd but interesting new feature of its image search capabilities. Similar Images lets the user look for images that are visually close to a target image without being exactly the same. Playing around with the tool lets you see just how far the science of ‘computer vision’ has come. Fundamentally, digital images are nothing more than patterns of lines and colors -- but Google has somehow taught its search engine to look at those patterns and decide which images a human would consider similar.
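To get a feel for what "deciding which images a human would consider similar" involves, here is a toy sketch of one classic computer-vision trick: reduce each image to a coarse color histogram and compare the histograms. This is purely an illustration of the general idea, not Google's actual algorithm; the images here are just small grids of RGB tuples so the example is self-contained.

```python
# Toy illustration of visual similarity via color histograms
# (an assumption for demonstration -- NOT Google's actual method).
# An "image" is a 2-D list of (r, g, b) tuples.

def color_histogram(pixels, bins=4):
    """Count pixels into bins**3 coarse color buckets, normalized to sum to 1."""
    hist = [0] * (bins ** 3)
    step = 256 // bins
    for row in pixels:
        for r, g, b in row:
            idx = (r // step) * bins * bins + (g // step) * bins + (b // step)
            hist[idx] += 1
    total = sum(hist)
    return [h / total for h in hist]

def similarity(pixels_a, pixels_b):
    """Histogram intersection: 1.0 means identical color distributions."""
    ha = color_histogram(pixels_a)
    hb = color_histogram(pixels_b)
    return sum(min(a, b) for a, b in zip(ha, hb))

# Two mostly-red "images" land in the same color bucket;
# a blue one lands elsewhere.
red1 = [[(250, 10, 10)] * 4] * 4
red2 = [[(240, 20, 5)] * 4] * 4
blue = [[(10, 10, 250)] * 4] * 4

print(similarity(red1, red2))  # 1.0 -- same coarse color bucket
print(similarity(red1, blue))  # 0.0 -- no buckets in common
```

Real systems go far beyond global color statistics, of course, using shape, texture and local features, but the core move is the same: turn each image into numbers that can be compared.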

Try typing in ‘ferrari.’ The engine will return a page of listings, many of which have a ‘similar images’ link below them. If you find one you like, you can click it, and be returned a page of images that are startlingly similar without being identical.


This is neat for Ferrari 360 fans who like to surf through pages and pages of car photos. But, in general, there aren’t many reasons why you’d want to have a few hundred pictures of the same thing.

That’s why it’s better to think of the similar image search as a way to find similar things, rather than similar pictures of the same thing. If you’re shopping for diamond rings, for example, you wouldn’t want a list of photos of the exact same ring.

The search would seem to come in handy for shopping, especially if you don’t mind ordering from online jewelry retailers you’ve probably never heard of.

Update: Apparently there are already ventures out there that elegantly employ computer vision to help buyers find what they’re looking for. Modista.com helps you do it with shoes.

You can also use Similar Images for tracking memes, like all the variants on the Shepard Fairey image of Barack Obama, or keeping tabs on all the UFO images popping up online.

Similar Images itself may not be the most obviously useful tool around, but as Google developer Radhika Malpani points out, it’s just one piece of a larger puzzle of image search.

‘In general, search is such a hard problem,’ she said. ‘We’ve been working for a while to move beyond just textual signals.’


Since the beginning of Google’s image search, the engine has classified images according to the words associated with them on a Web page. But that’s not enough, Malpani said. Instead of looking only at the context, ‘we’ve been focusing on analyzing the content of an image, and saying, let’s try and understand what’s in an image and see how we can use this to help our users find what they’re looking for.’

In order to do that, Google’s software has to be able to process hundreds of millions of images, and in each case, try to decide what it’s looking at. Which might not sound all that amazing until you remember that computers don’t have eyes.

-- David Sarno
