Google Photos is rolling out ‘Ask Photos’ to selected users, allowing them to search and interact with their photo libraries using natural language queries. Following a sign-up waitlist that opened last month, the feature is being introduced gradually through a server-side update for both Android and iOS users.
The “Ask Photos” feature replaces the existing “Search” tab in the Google Photos app. Users who have access to it will see a new tab on the bottom navigation bar of the app. The feature supports complex searches across a personal photo collection through questions such as “Show Halloween costumes over the years” or “When did the child learn to swim?”
This AI-powered search system uses location data, faces, and names, making it easier to recall specific moments. However, early reports indicate that while the feature excels at identifying people and places, it may struggle with more event- or object-specific searches.
Review and Set Permissions
Upon first use, the app prompts users to review the feature’s functionality, set data access permissions, and verify the accuracy of names linked to individuals in the photos. Users are also required to set relationships for people and pets, which are organized based on how frequently they appear in the library. This helps the system better understand and identify photos.
Initial testing suggests that “Ask Photos” delivers strong results for searches involving people and places, such as locating recent pictures of specific people. Queries about events or objects, however, may be less accurate: a question about when a person moved into their house, for example, might pull up random photos rather than images from the actual moving day.
Google’s AI is designed to leverage location data to refine search results, especially for queries like “Show the meal at a certain restaurant” or “What activities were done at a specific location.” This could improve the user experience, but accuracy may still vary based on the nature of the query.
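Google has not published how “Ask Photos” is implemented, but conceptually this kind of search combines structured photo metadata (recognized faces, place labels, detected objects) with the words in a natural-language query. The toy Python sketch below illustrates that idea with a simple keyword-overlap score; the `Photo` dataclass, the sample library, and the scoring rule are illustrative assumptions, not Google’s actual system.

```python
from dataclasses import dataclass, field

# Illustrative stand-in for the metadata a photo service might index per image.
# Field names are assumptions made for this sketch, not Google's schema.
@dataclass
class Photo:
    filename: str
    people: set[str] = field(default_factory=set)   # recognized faces, by name
    place: str = ""                                  # location label from geodata
    tags: set[str] = field(default_factory=set)      # detected objects/events


def score(photo: Photo, query_words: set[str]) -> int:
    """Count how many query words appear in the photo's indexed metadata."""
    haystack = (
        {name.lower() for name in photo.people}
        | {photo.place.lower()}
        | {tag.lower() for tag in photo.tags}
    )
    return len(query_words & haystack)


def search(library: list[Photo], query: str, top_k: int = 3) -> list[Photo]:
    """Rank photos by keyword overlap with the query and keep the best matches."""
    words = {w.strip("?.,!").lower() for w in query.split()}
    ranked = sorted(library, key=lambda p: score(p, words), reverse=True)
    return [p for p in ranked[:top_k] if score(p, words) > 0]


if __name__ == "__main__":
    library = [
        Photo("IMG_001.jpg", people={"Alice"}, place="rome", tags={"dinner", "pasta"}),
        Photo("IMG_002.jpg", people={"Alice", "Bob"}, place="beach", tags={"halloween", "costume"}),
        Photo("IMG_003.jpg", people={"Bob"}, place="pool", tags={"swimming"}),
    ]
    # Matches on the recognized person, the place label, and an object tag.
    for photo in search(library, "Photos of Alice at dinner in Rome"):
        print(photo.filename)
```

As the article notes, real-world results depend heavily on how rich the underlying metadata is: queries that map onto recognized people or places tend to resolve cleanly, while event- or object-specific questions have less to match against.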
Exclusive to U.S. Users
Currently, the “Ask Photos” feature is limited to users in the United States. Those who signed up for the waitlist in September are likely to see the feature soon, and users who haven’t joined can still sign up through Google’s official channels. Google plans to expand availability to more users over the coming weeks.
Google says “Ask Photos” prioritizes privacy. Personal data in Google Photos will not be used for ads, and only in rare cases will personal data be reviewed to address potential misuse. Google also states that data from “Ask Photos” will not be used to train AI models outside Google Photos, and that user data remains protected by its standard security measures.
Using Ask Photos
With “Ask Photos” available, users can ask questions such as “Show me my travel photos from last year” and get results directly in the app. Once set up, using the feature is straightforward:
- Open the Google Photos app on an Android or iOS device.
- Tap the “Ask” button located at the bottom of the screen.
- In the text box, type a query, such as “Photos of me over time” or “Cities I visited last year.”
- The AI will process the request and return relevant photos and chat messages, which may include follow-up questions like, “Want me to narrow it down for you?”
During the search, the system will display updates in the chat window, including messages like “thinking,” “searching,” and “reviewing.” Users can view more identified photos by tapping the “View more” button.