Google Lens now identifies skin conditions, from moles to rashes

Google Lens will soon receive an important update that allows the app to recognize potential skin conditions from a simple photo. The new feature is a notable step for telehealth: it lets users quickly get a preliminary idea of what a problem might be before seeing a doctor for a proper diagnosis. How does the feature work? It's actually very simple.

Google Lens now recognizes skin conditions

Leveraging artificial intelligence, Google Lens analyzes a photo captured by the user, looks for visible skin conditions, and surfaces "visual matches" for informational purposes only. Google Lens can recognize moles and skin rashes, and can also help describe other abnormalities such as psoriasis, bumps on the lips, or hair loss. Just point the Google Lens camera, snap the photo, and wait for the search results.


As noted above, this feature is only meant to help users find and identify potential health problems ahead of a medical examination, which remains necessary for a proper diagnosis.

It's not yet clear when the skin condition identification feature will officially become available in Lens for all users, but given that the official announcement has just arrived, we can expect a rollout in the coming weeks.

In other recent Google news, an important update arrived for Google Meet a few days ago, introducing an "On-The-Go" mode for video calls during car journeys.
