Google launched its Knowledge Graph in 2012, and since then it has collected over 500 billion facts about more than 5 billion entities. Google has now announced that it is integrating the Knowledge Graph into Google Image Search on mobile for users in the US.
To enable this, Google applies its deep learning techniques to an image and combines the result with an analysis of the text on the image’s web page. With this information, Google can determine the most likely person, place or thing related to the image and connect it to the relevant topic in its Knowledge Graph. Users will then be able to better understand the image they searched for, with related information and web links about the person, place or thing shown in the image.
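Google has not published the details of this pipeline, but the general idea of combining image-level predictions with entity mentions found in the surrounding page text can be illustrated with a minimal sketch. Everything below — the label scores, the mention counts, the candidate entity IDs and the weighting — is hypothetical and purely illustrative, not Google's actual data or method.

```python
from collections import defaultdict

# Hypothetical image-model output: visual labels with confidence scores.
image_labels = {"river": 0.92, "park": 0.81, "bridge": 0.40}

# Hypothetical entity mentions extracted from the image's web page,
# already mapped to made-up candidate Knowledge Graph entity IDs.
page_mentions = {
    "/m/0_river_x": {"surface": "River Thames", "labels": {"river"}, "count": 4},
    "/m/0_park_y":  {"surface": "Hyde Park",    "labels": {"park"},  "count": 2},
}

def rank_candidates(image_labels, page_mentions):
    """Score each candidate entity by how strongly the image labels
    support it and how often the page text mentions it."""
    scores = defaultdict(float)
    for entity_id, mention in page_mentions.items():
        visual_support = max(
            (image_labels.get(label, 0.0) for label in mention["labels"]),
            default=0.0,
        )
        # Simple linear combination of visual and textual evidence.
        scores[entity_id] = 0.7 * visual_support + 0.3 * min(mention["count"] / 5, 1.0)
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

if __name__ == "__main__":
    for entity_id, score in rank_candidates(image_labels, page_mentions):
        print(f"{page_mentions[entity_id]['surface']}: {score:.2f}")
```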
For example, when searching for a place such as a park, the user might come across an image of a river in that park. Beneath the image result, Google will now show the name of the river, which city the park is in and more. Expanding the topic will also reveal a short description of the river and links to pages where more information is available.
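The kind of entity data surfaced here — a name, a short description and a link to learn more — mirrors what the Knowledge Graph already exposes through its public Knowledge Graph Search API. As a rough illustration only (this is a separate developer API, not the Image Search feature itself), a lookup might look like the following; the API key is a placeholder.

```python
import json
import urllib.parse
import urllib.request

# Public Knowledge Graph Search API endpoint; API_KEY is a placeholder
# for a real key from the Google Cloud console.
API_KEY = "YOUR_API_KEY"
ENDPOINT = "https://kgsearch.googleapis.com/v1/entities:search"

def lookup_entity(query, limit=1):
    """Fetch the top Knowledge Graph matches for a free-text query."""
    params = urllib.parse.urlencode({"query": query, "key": API_KEY, "limit": limit})
    with urllib.request.urlopen(f"{ENDPOINT}?{params}") as resp:
        data = json.load(resp)
    results = []
    for item in data.get("itemListElement", []):
        result = item.get("result", {})
        detail = result.get("detailedDescription", {})
        results.append({
            "name": result.get("name"),
            "description": result.get("description"),
            "summary": detail.get("articleBody"),
            "more_info": detail.get("url"),
        })
    return results

if __name__ == "__main__":
    for entity in lookup_entity("River Thames"):
        print(entity["name"], "-", entity["description"])
```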
In another example, when researching a particular style of architecture, tapping on an image of a building in that style will surface information about the architect, when the building was constructed and other relevant details.
Google is gradually rolling out this feature to mobile users in the US, starting with some images of people, places and things in Google Images. Over time, it will expand this to more images, languages and surfaces.