Key takeaways
- At the 2025 IFSCC Conference, EveLab Insight and the Shanghai Institute of Nutrition and Health revealed the results of a new study that analysed over 3,000 faces to compare human and AI age perception.
- The study introduced ΔAge – a new benchmark showing how much older or younger someone looks than their actual age.
- Key ageing features identified include nasolabial folds, under-eye bags, and pigmentation.
- AI models trained with demographic data outperform traditional models in predicting perceived age.
- Findings support more personalised skincare solutions by accounting for human bias in age judgement.
Singapore-based beauty tech company EveLab Insight, which works with brands such as Estée Lauder, Dr. Barbara Sturm, and Kate Somerville, has revealed collaborative research with the Shanghai Institute of Nutrition and Health, affiliated with the Chinese Academy of Sciences, on how humans and AI differ in judging age.
The research was unveiled at the annual IFSCC Conference in Cannes in September 2025.
Understanding ΔAge: a new metric for perceived age
The study itself analysed over 3,000 faces to uncover how age perception shifts by gender and generation and introduced ΔAge (delta age) – a new benchmark showing how much younger or older someone appears compared to their actual age.
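In concrete terms, ΔAge is simply the signed gap between how old a face is judged to be and its chronological age. The minimal sketch below illustrates the idea; the function name and values are illustrative assumptions, not EveLab Insight's implementation.

```python
def delta_age(perceived_age: float, actual_age: float) -> float:
    """ΔAge: positive values mean a face reads older than it is,
    negative values mean it reads younger."""
    return perceived_age - actual_age

# Illustrative example (assumed numbers): a 40-year-old judged to look 45
print(delta_age(perceived_age=45, actual_age=40))  # 5.0 -> looks five years older
```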
The research pinpointed the facial cues most strongly associated with looking older: nasolabial folds, under-eye bags, and pigmentation. It also demonstrated how AI can strip out human bias to deliver more accurate, personalised skincare insights, paving the way for more precise applications of AI in age perception and prediction.
The study examined not only which facial features men and women today regard as signals of ageing, but also how their judgements compare with those of large language models (LLMs), which were not initially subject to the same human predispositions.
According to Professor Sijia Wang, Director at the Shanghai Institute of Nutrition and Health, the study marks a turning point in how we think about age prediction in beauty science.
“Until now, AI models have largely worked in isolation, without reflecting the diversity of human perception,” said Prof Wang. “By considering the impact of human bias, we are bridging the gap between computational accuracy and lived experience. For the industry, this means moving closer to technologies that not only measure ageing more precisely but also respect the differences in how ageing is perceived across demographics.”
Meanwhile, Yolanda Ching from EveLab Insight highlighted the relevance of the study for the future of personalised beauty technology. “By understanding how different consumer groups perceive ageing features, we can help brands design products and strategies that truly resonate, moving away from one-size-fits-all solutions towards more tailored, effective skincare,” said Ching.

How AI and humans differ in age judgement
For the study, men and women ranging in age from their 20s to their 50s were asked to guess the ages of over 3,000 faces. This enabled the researchers to track differences in age determination between humans and AI engines, and to identify unbiased results once evident preconceptions had been removed.
In today’s age of longevity and evolving concepts of ageing beautifully, perceived age – how old someone looks rather than their actual age – can vary significantly. The research sought to understand how an individual’s age and gender might influence these perceptions, and to establish which specific facial features are commonly believed to signal older age.
The analysis concluded that older assessors tended to judge faces as older than younger assessors did. In terms of gender differences, women tended to judge faces as slightly younger than men did. However, overall, age judgements were relatively consistent.
When AI was tasked with assessing the same 3,000 faces, three models were trained to predict perceived age. One visual model, STDC2-FLD-HR, outperformed two popular LLMs in matching human age judgements, achieving a correlation of 0.95. The other two models, Qwen and LLaMA, were less effective at interpreting visual age cues.
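The 0.95 figure expresses how closely the visual model's predictions tracked the human ratings. The hedged sketch below shows how that kind of agreement score is typically computed, using a Pearson correlation over made-up numbers; the arrays are assumptions for illustration, not the study's data.

```python
import numpy as np

# Hypothetical data: human-perceived ages and model-predicted ages for the same faces
human_perceived = np.array([34.2, 51.0, 27.8, 45.5, 62.1])
model_predicted = np.array([33.0, 52.4, 29.1, 44.0, 60.7])

# Pearson correlation: values near 1.0 mean the model's scores move in close
# step with human assessors' judgements
r = np.corrcoef(human_perceived, model_predicted)[0, 1]
print(f"correlation with human judgements: {r:.2f}")
```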
Following these initial assessments, the models were retrained to account for the biases evident among human assessors, with the aim of removing the influence of assessor age and gender on perceived-age results.
These findings led to the creation of ΔAge (delta age), a new measure showing how much older or younger someone appears compared to their real age.
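The study's exact retraining procedure is not described here, but one common way to strip out assessor effects of this kind is to regress perceived age on assessor demographics and keep the face-driven residual before computing ΔAge. The sketch below illustrates that general approach; the table, column names, and regression choice are assumptions, not the researchers' method.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical ratings table: one row per (face, assessor) judgement
ratings = pd.DataFrame({
    "perceived_age": [38.0, 42.5, 35.0, 47.0],
    "actual_age":    [40.0, 40.0, 33.0, 45.0],
    "assessor_age":  [25,   55,   30,   50],
    "assessor_sex":  ["F",  "M",  "F",  "M"],
})

# Regress out assessor age and sex, keeping the component driven by the face itself
model = smf.ols("perceived_age ~ assessor_age + C(assessor_sex)", data=ratings).fit()
ratings["debiased_perceived"] = model.resid + ratings["perceived_age"].mean()

# ΔAge after debiasing: how much older or younger each face reads than its real age
ratings["delta_age"] = ratings["debiased_perceived"] - ratings["actual_age"]
print(ratings[["delta_age"]])
```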
Implications for personalised beauty and skincare innovation
Overall, the study could help the beauty industry understand not only how well AI systems can identify the key facial features contributing to older perceived age, but also what can be achieved when human biases across generations and genders are removed.
EveLab Insight noted that age perception isn’t just about the face being judged – it’s also influenced by who’s doing the judging. This opens the door to more personalised and effective skincare and anti-ageing strategies for diverse consumer groups.