Google has introduced a new feature to Google Photos that automatically compiles a user's photos into suggested albums.
Google Photos is available on the web, Android, and iOS, and the new feature picks out a user's best photos and suggests albums on its own. It works partly from location data: photos snapped around the same time and place are treated as part of the same event and compiled together into a digital album. The feature also applies computer vision to pick out the best photos from the original collection.
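The article does not describe Google's actual implementation, but the idea of grouping photos by time and place can be illustrated with a minimal sketch. The photo dictionaries, the distance threshold, and the time-gap threshold below are all hypothetical choices for illustration:

```python
from datetime import datetime, timedelta
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
    return 6371 * 2 * asin(sqrt(a))

def group_into_events(photos, max_gap_hours=3, max_km=5):
    """Greedy pass over photos in time order: a photo taken soon after,
    and near, the previous one joins the same event; otherwise it
    starts a new event (and hence a new candidate album)."""
    events = []
    for photo in sorted(photos, key=lambda p: p["time"]):
        if events:
            last = events[-1][-1]
            near_in_time = photo["time"] - last["time"] <= timedelta(hours=max_gap_hours)
            near_in_space = haversine_km(last["lat"], last["lon"],
                                         photo["lat"], photo["lon"]) <= max_km
            if near_in_time and near_in_space:
                events[-1].append(photo)
                continue
        events.append([photo])
    return events

photos = [
    {"time": datetime(2016, 3, 22, 10, 0), "lat": 48.8584, "lon": 2.2945},
    {"time": datetime(2016, 3, 22, 10, 30), "lat": 48.8606, "lon": 2.3376},
    {"time": datetime(2016, 3, 25, 9, 0), "lat": 51.5007, "lon": -0.1246},
]
print(len(group_into_events(photos)))  # → 2 (a Paris morning, then a separate London day)
```

A production system would also weigh photo quality when selecting the "best" shots from each event, which this sketch omits.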
Google has put considerable effort into the machine learning behind the tool, which helps it pinpoint an accurate location. Google Photos reads the camera's location information and checks it against a database of landmarks to match a photo to the exact place it was taken.
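A geotag-to-landmark lookup of this kind can be sketched as a nearest-neighbor search over a table of known landmarks. The three-entry table and the degree-based distance threshold below are hypothetical stand-ins; Google's real database covers roughly 255,000 landmarks and also uses computer vision:

```python
from math import hypot

# Hypothetical landmark table: (name, latitude, longitude).
LANDMARKS = [
    ("Eiffel Tower", 48.8584, 2.2945),
    ("Statue of Liberty", 40.6892, -74.0445),
    ("Sydney Opera House", -33.8568, 151.2153),
]

def nearest_landmark(lat, lon, max_deg=0.01):
    """Return the closest known landmark if it lies within max_deg
    degrees (~1 km at mid-latitudes, a crude planar approximation)."""
    best = min(LANDMARKS, key=lambda lm: hypot(lat - lm[1], lon - lm[2]))
    return best[0] if hypot(lat - best[1], lon - best[2]) <= max_deg else None

print(nearest_landmark(48.8585, 2.2947))  # → Eiffel Tower
print(nearest_landmark(0.0, 0.0))         # → None (no landmark nearby)
```

At real scale, a spatial index such as a k-d tree or geohash buckets would replace the linear scan.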
A Google engineer said, "We have 255,000 landmarks that we automatically recognize, it's a combination of both computer vision and geotags. Even without the geotags, we'd be able to recognize a landmark."
Each album includes a map showing where the photos were taken. The feature can also caption the snaps, which it uses to title the album.
As for face detection, the feature learns to recognize a person based on how often they appear in a user's photos, but it remains private: it does not search social media profiles to identify the people in a photo.
Back in December, Google announced shared albums, which let anyone add images and videos to a common album; the new feature works with these shared albums as well.
Although Google Photos resembles other existing apps, its focus on location and maps, and the effort Google has put into the concept, make the feature worth trying.