
AI app reads nail selfies to detect a blood condition affecting nearly two billion people

Person taking a picture of their nails.
Nature Communications

Nearly two billion people worldwide live with a blood condition called anemia. People with anemia have a lower-than-normal count of red blood cells, or of the hemoglobin (Hgb) protein they carry, and as a result a reduced oxygen-carrying capacity. 

Chronic anemia can lead to serious health issues such as heart attack and organ damage, and pregnant women are at particularly high risk. Until now, identifying anemia has required a visit to the clinic for a complete blood count (CBC), hemoglobin and hematocrit analysis, or a peripheral blood smear assessment. 


What if you could take a selfie of your fingernails and have an AI-powered app tell you whether you have anemia? That's exactly what experts at Chapman University have developed. The mobile app offers a non-invasive, convenient way to check for signs of anemia with a high degree of accuracy. 
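At a high level, this kind of tool estimates hemoglobin from the color of the nail beds, since paler nail beds correlate with lower Hgb. The sketch below is purely illustrative: the function name, the linear form, and every coefficient are hypothetical placeholders, not the app's actual calibrated model, which was trained and validated against lab results.

```python
import numpy as np

def estimate_hgb(nailbed_pixels: np.ndarray) -> float:
    """Toy hemoglobin estimate (g/dL) from nail-bed RGB pixels.

    All coefficients are made up for illustration; a real model is
    calibrated against CBC laboratory measurements.
    """
    # Average color of the nail-bed crop (channel values in 0-255)
    r, g, b = nailbed_pixels.reshape(-1, 3).mean(axis=0)
    # A toy linear model: less red (paler) nail beds -> lower estimate
    return 2.0 + 0.08 * r - 0.02 * g - 0.01 * b

# Example: a uniformly colored 10x10 nail-bed crop
pixels = np.full((10, 10, 3), [180, 150, 140], dtype=float)
print(round(estimate_hgb(pixels), 1))  # -> 12.0
```

In practice the hard parts are segmenting the nail beds from the photo and correcting for lighting and camera differences, which is where the AI component comes in.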

Does it really work?

The app has been used by over 200,000 people across the United States and has run over a million tests as part of a medical study. According to the experts behind it, the app can be deployed as a highly scalable and accessible anemia surveillance tool.

The team behind the research found that their app delivered “accuracy and performance that match gold standard laboratory testing and a sensitivity and specificity of 89% and 93%, respectively.” Moreover, the app also offers an AI-driven personalization system for people who have already been diagnosed with anemia.
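For readers unfamiliar with those two metrics: sensitivity is the fraction of truly anemic users the app flags, and specificity is the fraction of non-anemic users it correctly clears. The counts below are illustrative numbers chosen to reproduce the reported figures, not data from the study.

```python
def sensitivity_specificity(tp: int, fn: int, tn: int, fp: int):
    """Compute sensitivity = TP/(TP+FN) and specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts that yield the reported 89% / 93%:
# 100 anemic users (89 correctly flagged), 100 healthy (93 correctly cleared)
sens, spec = sensitivity_specificity(tp=89, fn=11, tn=93, fp=7)
print(f"sensitivity={sens:.0%}, specificity={spec:.0%}")
# -> sensitivity=89%, specificity=93%
```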

Once the app was personalized, the error rate decreased even further. An easily accessible digital tool like this will allow hundreds of millions of patients to regularly monitor their Hgb levels instantly, without having to visit clinics and get expensive blood tests done. 

Back in 2020, a company called Sanguina developed an app along similar lines, AnemoCheck, for people with chronic anemia. At the time, the company said it was not pursuing regulatory approval and positioned the app as more of a lifestyle product. A comparable app was tested for public health screening in India two years ago and was deemed good enough for that purpose.

What’s the core benefit? 

Experts at Chapman University made it abundantly clear that this app is not a replacement for proper medical tests, nor is it intended for self-diagnosis. Instead, it serves as a warning system that lets users know when they should consult a doctor, especially if they see a pre-existing condition worsening. 

“The app is particularly valuable for those with chronic anemia, such as people with kidney disease or cancer, who often require frequent monitoring,” says the team. In fact, enabling the app’s personalization feature improved accuracy by as much as 50% in the target user pool.

The overarching goal is to allow self-monitoring and open the doors for early interventions by experts, without having to wait for lab results to come in. Interestingly, the app’s built-in geolocation feature enabled what the team calls “the first county-level anemia prevalence map in the U.S.” 

Experts behind the project hope the app can improve public health efforts by enabling population-wide anemia screening in tandem with regional mapping. More details about the project can be found in the journal Proceedings of the National Academy of Sciences (PNAS).
