Predicting human chronological age via AI analysis of dorsal hand versus facial images: A study in a cohort of Indian females

Predicting a person's chronological age (CA) from visible skin features using artificial intelligence (AI) is now commonplace. Often, convolutional neural network (CNN) models are built using images of the face as biometric data. However, hands hold telltale signs of a person's age. To determine the utility of using only hand images in predicting CA, we developed two deep CNNs based on 1) dorsal hand images (H) and 2) frontal face images (F). Subjects (n = 1454) were Indian women, 20–80 years, across three geographic cohorts (Mumbai, New Delhi and Bangalore) and with a broad variation in skin tones. Images were randomised: 70% of F and 70% of H were used to train the CNNs. The remaining 30% of F and H were retained for validation. CNN validation showed mean absolute errors for predicting CA using F and H of 4.1 and 4.7 years, respectively. In both cases, correlations of predicted and actual age were statistically significant (r(F) = 0.93, r(H) = 0.90). The CNNs for F and H were validated for dark and light skin tones. Finally, by blurring or accentuating visible features on specific regions of the hand and face, we identified the features that contributed to the CNN models. For the face, the inner eye corner and the area around the mouth were most important for age prediction. For the hands, knuckle texture was a key driver of age prediction. Collectively, for AI estimates of CA, CNNs based solely on hand images are a viable alternative and comparable to CNNs based on facial images.

As for hands, cosmetic concerns related to ageing include fingernail defects, excess skin and prominent veins, as well as hyperpigmentation of the dorsal sides. 11,12 That is why dorsal hand skin is considered an important target for aesthetic treatment. 13 The state-of-the-art in current image analysis for age prediction includes artificial intelligence (AI)-based algorithms, including CNNs. 1,22-24 Nusinovici et al. 25 connected biological age with conditions of the eye retina using a CNN. Jana et al. 22 used wrinkle features to predict the age of Indian subjects using AI. Algorithms have been proposed for age prediction from individual markers of the face: forehead wrinkles, glabellar wrinkles, periorbital wrinkles, nasolabial folds, marionette lines, density of pigmentary spots, ptosis of the lower part of the face, vascular disorders and cheek skin pores. 23,24 At the same time, the use of hand images is less elucidated. 26,27 High accuracy has been shown for the identification of a person from face and hand images by instrumental dermatological methods. 28 This suggests that hand analysis could be a prospective alternative to facial non-invasive biometrics.
Recently published works proposed an algorithm for the segmentation of veins from dorsal hand images for biometrics based on vein generative adversarial networks. 29 AI-based approaches have also been proposed for age 30 and gender 26,27 prediction.
Large sources of data are substantially important for new deep learning (DL) algorithms. Many commonly known datasets of facial images exist, including FGNET, 21 MORPH, 31 Adience 31,32 and IMDB-Wiki. 15 These sources of information should be balanced in terms of ethnicity, age, etc. 33 Earlier CNNs for age prediction were mainly trained on datasets with a predominance of light-skinned Caucasians. 19 Most ethnicity-specific algorithms have become available only in recent years. For example, a CNN detecting facial features related to age in African, Asian and Caucasian females has been created. 23 The algorithm by Voegeli et al. 34 for determining the age of females was developed on subjects from Guangzhou (China), Lyon (France), New Delhi (India), Tokyo (Japan) and Cape Town (South Africa). A CNN for age prediction in Asians and the Asian-specific AFAD dataset have been reported. 17 Punyani et al. 19 reviewed CNNs for age prediction trained on a private database of Asian and Caucasian video frames and on the Asian face age dataset.
The deficiency of ethnicity-specific data is also typical for hand images. Most datasets, such as the NYU Hand Pose Dataset, 35 do not include ethnicity/race and age annotations simultaneously. To bridge the gap between the shortage of ethnicity-specific datasets and the need for a wide age dispersion, we collected hand and face images of 1454 Indian women using standardised photographic systems. To create an effective automatic pipeline for age prediction from hands, CNNs trained separately on face and hand images were compared. The new age-prediction algorithm was successfully validated. Moreover, we identified specific ageing markers of the hands and face that were most sensitive for the prediction of age by the developed CNNs.

| Study Design and Subjects
This was a multi-site, observational study of 1454 self-reported healthy Indian females. Three cities from different parts of India were selected: Mumbai (n = 488) (Figure 1A), New Delhi (n = 491) (Figure 1B) and Bangalore (n = 475) (Figure 1C), stratified across five decades of chronological age (20s, 30s, 40s, 50s and 60+ years). The Indian population is characterised by a high variation of skin tones, ranging from fairer in North India to darker in South India. 36 Thus, the selection of these study sites was important to cover a wide range of skin tones. Data on the geographic locations and climatic parameters of the study sites are summarised in Appendix S1. All locations are characterised by high levels of air pollution. 39,40 Subjects were free from excessive facial hair, acne, cuts, abrasions, fissures, wounds, lacerations, or any other active skin conditions or lesions, including vascular lesions, on the face and hands.
Pregnant or lactating subjects were also excluded from enrollment.
Subjects arrived at the study site with a clean face and hands and without having applied any product on their face or forearms.

| Face and hand imaging
Standardised images of each subject's dorsal right (Figure 1D) and left (Figure 1E) hands as well as the full-frontal face (Figure 1F) were captured. For the New Delhi and Bangalore cohorts, all images were captured with the VISIA-CR Imaging System (Canfield Scientific, Parsippany, NJ, USA) according to the manufacturer's instructions. Subjects' hands were stabilised using a hand rest at a fixed distance from the camera. For the Mumbai cohort, face and hand images were captured similarly with the VISIO-Face Imaging System (Spincontrol, Tours, France). To reduce specular reflectance, photographs were captured with cross-polarised lighting. 41 This provided the opportunity to focus on the back-scattered signal carrying data about pigmentation, erythema, vascularisation, etc. from deeper skin layers, which is more informative. 42,43,46

| Skin colour classification
Images were analysed in the CIE L*a*b* colour space. The parameters L* (lightness) and b* (yellowness) were determined using the VAESTRO Image Analysis Toolkit (Canfield Scientific, Parsippany, NJ, USA). The individual typology angle (ITA), 47 defined as ITA = [arctan((L* − 50)/b*)] × 180/π, was calculated as a viable measure for skin tone classification. 48,49 A common ITA grading was used: VI-VII: ITA < −30° (very dark, dark); V: −30° < ITA < 10° (brown); IV: 10° < ITA < 28° (tan); III: 28° < ITA < 41° (intermediate); II: 41° < ITA < 55° (light); and I: ITA > 55° (very light). 49 Subjects' ITA was in the range from −50° to +41° (Figure 1G,H), which corresponds to skin tone classes III-VII. The majority of subjects had tan or brown skin colour (Figure 1G,H). ITA values calculated from the dorsal hand and from the face correlated with each other (Figure 1I). For most subjects, the skin tone classification of the hands was the same as that of the face. When hand skin tone grouping diverged from that of the face, hand skin was more often darker (27% of subjects) than lighter (4% of subjects) than the face (Figure 1J).
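As a concrete illustration, the ITA formula and the grade boundaries above can be sketched in a few lines (a minimal sketch with function names of our own; the study itself used the VAESTRO toolkit for the colour measurements):

```python
import math

def ita_degrees(l_star: float, b_star: float) -> float:
    """Individual typology angle: ITA = arctan((L* - 50) / b*) * 180 / pi."""
    return math.degrees(math.atan2(l_star - 50.0, b_star))

def ita_grade(ita: float) -> str:
    """Map an ITA value (in degrees) to the six-grade scale used in the study."""
    if ita > 55:
        return "I (very light)"
    if ita > 41:
        return "II (light)"
    if ita > 28:
        return "III (intermediate)"
    if ita > 10:
        return "IV (tan)"
    if ita > -30:
        return "V (brown)"
    return "VI-VII (dark, very dark)"

# Example: L* = 60, b* = 18 gives ITA ≈ 29.1°, i.e. grade III (intermediate).
```

Lower (more negative) ITA corresponds to darker skin, which is why the VI-VII grade sits at the bottom of the cascade.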

| Algorithms for age prediction
For automated age prediction from face and hand images, a CNN-based pipeline was developed. Hand and face images were randomly split into training (70%) and test (30%) datasets, stratified by age. The hand and face training datasets share the same subjects, while the test datasets contain the remaining subjects; i.e., the hand and face images of any one person appear in the same partition (either training or test).
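The subject-level stratified split described above can be sketched as follows (a simplified illustration; the variable names and the decade-based stratification key are our assumptions, not the authors' code):

```python
import random

def stratified_subject_split(ages_by_subject, train_frac=0.7, seed=42):
    """Split subject IDs 70/30, stratified by age decade, so that all of a
    subject's images (face and hand) fall into exactly one partition."""
    rng = random.Random(seed)
    by_decade = {}
    for subject, age in ages_by_subject.items():
        by_decade.setdefault(age // 10, []).append(subject)
    train, test = [], []
    for group in by_decade.values():
        rng.shuffle(group)                      # randomise within each decade
        k = round(len(group) * train_frac)      # 70% of this decade to training
        train.extend(group[:k])
        test.extend(group[k:])
    return train, test
```

Splitting on subject IDs rather than on individual images guarantees that a person photographed for both modalities never leaks between the training and test sets.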
Facial images were cropped using the open-source library DLIB. 50 For hand images, we applied the Otsu thresholding technique from the OpenCV computer vision library 51 to crop the images.
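A pure-NumPy sketch of the Otsu step is given below (the study used OpenCV's implementation; this reimplements the same between-class-variance criterion for illustration, with helper names of our own):

```python
import numpy as np

def otsu_threshold(gray: np.ndarray) -> int:
    """Return the grey level that maximises between-class variance."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    prob = hist / hist.sum()
    cum_p = np.cumsum(prob)                       # class-0 weight up to t
    cum_mean = np.cumsum(prob * np.arange(256))   # cumulative mean
    mean_total = cum_mean[-1]
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w0, w1 = cum_p[t - 1], 1.0 - cum_p[t - 1]
        if w0 == 0 or w1 == 0:
            continue
        mu0 = cum_mean[t - 1] / w0
        mu1 = (mean_total - cum_mean[t - 1]) / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

def crop_to_mask(img: np.ndarray, gray: np.ndarray) -> np.ndarray:
    """Crop the image to the bounding box of the foreground (hand) mask."""
    mask = gray >= otsu_threshold(gray)
    ys, xs = np.where(mask)
    return img[ys.min():ys.max() + 1, xs.min():xs.max() + 1]
```

Because the dorsal hand is photographed against a uniform backdrop, a single global threshold is enough to isolate it before cropping.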

| Model interpretation technique
To identify the face and hand features contributing to age prediction, the sensitivity heatmap evaluation method 52 was used. Four individuals of 25, 34, 58 and 69 years of age were randomly selected. Cropped images of their faces and hands (see above) were used in the analysis.
The pipeline of the analysis is presented in Appendix S2. Each image was segmented into a 50 × 50 grid. In each grid cell, we applied two photo filters with different kernels: first, a Gaussian blur that smooths folds and colour heterogeneity; and second, a kernel modulated by a sinusoid (SM kernel), which creates a skin-wrinkling effect. For every face and hand grid area, the difference between the model prediction and actual chronological age was calculated both without and with the filters. To visualise the effects, a heatmap was overlaid on the photo of each subject. The magnitude of the differences with and without the filters was coded by brightness level; zones with the greatest effect (both positive and negative) appear as brighter areas on the heatmaps. Based on the heatmap analysis, the original images were overlaid with green or purple clouds at points responsible for positive (increase of the predicted age) and negative (decrease of the predicted age) contributions to the predicted age, respectively.
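The per-cell perturbation logic above can be sketched as follows (a toy version: the grid is much coarser than the paper's 50 × 50, `predict` stands in for the trained CNN, and a simple cell-flattening perturbation stands in for the Gaussian and SM kernels):

```python
import numpy as np

def sensitivity_map(img: np.ndarray, predict, grid: int = 4) -> np.ndarray:
    """For each grid cell, flatten the cell's texture and record how much the
    prediction changes: positive entries mean the cell's detail raises the
    predicted age; negative entries mean it lowers it."""
    h, w = img.shape[:2]
    base = predict(img)
    gh, gw = h // grid, w // grid
    heat = np.zeros((grid, grid))
    for i in range(grid):
        for j in range(grid):
            perturbed = img.copy()
            cell = perturbed[i * gh:(i + 1) * gh, j * gw:(j + 1) * gw]
            # Replace the cell with its mean value (crude stand-in for a blur).
            perturbed[i * gh:(i + 1) * gh, j * gw:(j + 1) * gw] = cell.mean()
            heat[i, j] = base - predict(perturbed)
    return heat
```

Bright cells of the resulting map correspond to the green/purple overlay zones described in the text: cells whose texture the model actually relies on.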

| Model evaluation – statistical treatment
The Shapiro-Wilk test was used to evaluate the normality of the data distribution in the samples. Accuracy of prediction was estimated by the Pearson correlation coefficient (r) and the mean absolute error (MAE).

| Prediction of age from face and hand images
The total dataset of images captured by both the VISIA-CR and VISIO-Face systems was analysed. The two systems were found to have no significant difference in their ability to predict age (Appendix S3).
Therefore, these datasets were combined. Age was predicted from both face and hand images, and the results were found to correlate well with actual age. The Pearson coefficient (r) for age prediction from face images (Figure 2A) was 0.93 (p < 0.0001) in the test dataset, while from hand images (Figure 2B) it was 0.90 (p < 0.0001) (Table 1). The mean absolute error (MAE) in predicting age from face images was 4.1 years, slightly lower than the MAE in predicting age from hand images (4.7 years) (Table 1). The results of age prediction from face and hand images were consistent with each other (Figure 2C); r for hand versus face predictions was 0.95 (p < 0.0001) and 0.96 (p < 0.0001) in the test and training samples, respectively (Table 1).
Based on the r values, the correlations between actual age and age predicted from either hand or face images were highly statistically significant. Furthermore, the r values for predicting actual age from hands and from faces were similarly high (Table 1). As such, hand images represent a viable alternative to facial images for age prediction.

| Accuracy of age prediction in different age groups
Similar MAE values were observed in all age groups of the training face dataset (Table 2). In the test dataset of faces, the lowest MAE was observed in the 30-40 years and 40-50 years age groups (Table 2). However, the differences in MAE between age groups in the test face dataset were less than 1 year (ranging from 3.6 to 4.4 years) (Table 2).
In the case of hand images, the lowest MAE was observed in the <30 years age group, with 3.4 and 3.0 years in the test and training datasets, respectively (Table 2). The highest MAE was observed in the 40-50 years and >50 years age groups (Table 2). In general, MAE in the hand dataset increased with age.

| Accuracy of age prediction in different ITA skin tone groups
Age prediction accuracy across the skin tone groups of the training face dataset was similar, with MAE ranging from 1.5 to 1.6 years (Table 2). The same was observed in the test face dataset, where MAE ranged from 3.8 to 4.4 years across all skin tone groups. The group with the darkest skin tones, characterised by ITA lower than −30°, had the lowest MAE value.
The accuracy of age prediction in the hand image dataset varied depending on the ITA skin tone. The highest MAE was observed in the group with the lightest skin tones (ITA > 28°), with 8.0 and 6.3 years in the test and training datasets, respectively (Table 2). In both the test and training datasets, MAE decreased with decreasing ITA, except for the 10° < ITA < 28° group, where MAE was 3.8 years, while MAE was 4.8 years in the −30° < ITA < 10° and ITA < −30° groups (Table 2). This suggests that age prediction accuracy was higher when hand skin was darker. It is worth noting that in the test hand dataset, MAE decreased with increasing group size, except for the 10° < ITA < 28° group, where MAE was 3.8 years at n = 43, while MAE was 4.8 years in the −30° < ITA < 10° and ITA < −30° groups at n = 173 and n = 93, respectively (Table 2).

| Face and hand features that increase or decrease age prediction
Using the Gaussian distortion, we identified facial features that either increased or decreased the predicted age. Figure 3A,B shows results for the 58-year-old subject. These patterns are located at the cheekbones, labial commissure, marionette lines, glabella, infrapalpebral sulcus, lips, tip of the nose, philtrum, rhinion, superciliary arch, supraorbital notch, eyelids and temple area (Appendix S4). On the hand images, the following patterns contributed positively (Figure 3C) or negatively (Figure 3D) to age prediction in the analysis applying the Gaussian distortion: the crease over the interphalangeal joint, the crease over the proximal interphalangeal joint, knuckles, medial phalanges, interdigital spaces, thumbnail, zone of the metacarpal bones, as well as the part of the palmar side of the thumb that was visible in the images (Appendix S4).
In the analysis using the SM distortion, the following facial zones were found to have a positive (Figure 3E) and/or negative (Figure 3F) effect on age prediction: cheekbones, labial commissure, forehead, glabella, infrapalpebral sulcus, lips, marionette lines, tip of the nose, philtrum, rhinion, superciliary arch, supraorbital notch and eyelids (Appendix S4). Similarly, the analysis of hand images using the SM distortion revealed the following zones that determined age prediction either positively (Figure 3G) or negatively (Figure 3H): the crease over the interphalangeal joint, the crease over the proximal interphalangeal joint, proximal and medial phalanges, interdigital spaces, knuckles, thumbnail, zone of the metacarpal bones and part of the palmar side of the thumb (Appendix S4).
In the analysis with the Gaussian distortion, the infrapalpebral sulcus and cheekbones were essential for increasing the predicted age in all studied subjects (Figure 4A, Appendix S4). Except for the 34-year-old individual, the philtrum and the tip of the nose were also important. The palmar surface of the thumb and the interdigital spaces contributed positively to the predicted age in both the Gaussian-kernel and SM-kernel-based analyses in all studied individuals (Figure 4E,F). The I-II, II-III and III-IV interdigital zones were especially valuable (Appendix S4).
In addition, in the SM-kernel-based analysis, the zone of the metacarpal bones and the zones of the proximal phalanges were also significant for increasing the predicted age in all analysed cases (Figure 4F). The crease over the interphalangeal joint was significant for decreasing the predicted age in both the Gaussian and SM analyses for all studied individuals (Figure 4G,H, Appendix S4). The proximal phalanges were also significant for all individuals in the Gaussian analysis (Figure 4G).
Crease over the proximal interphalangeal joint and knuckles were significant for all individuals in the SM analysis (Figure 4H).
Collectively, there were common face and hand features that were significant drivers of age prediction for all four subject examples. However, some features were unique to certain subjects.

| Comparison to previous studies on biomarkers for age prediction by AI
Age prediction using artificial intelligence (AI)-based algorithms has become increasingly popular in recent years. Various types of data have been used for age prediction, including facial images, 1,2,15-19,21 profiles of DNA methylation 53 and brain magnetic resonance images. 54 Different analysis pipelines vary in terms of the type and quality of required data, applicability to different demographics, accuracy of prediction and algorithm complexity.
Furthermore, there are differences in dataset sizes. In the present study, we used MAE and r as measures of accuracy. Recently published articles report an MAE for age prediction from facial images in the range of 2.3-5.8 years 15,20,55,56 and an r in the range of 0.83-0.95. 20,23,58 In our study, we achieved an MAE of 4.1 years and an r of 0.93, which is comparable to existing algorithms.
Although our CNN-based age prediction algorithm has a slightly higher MAE compared with some previous models, it offers significant advantages. Our final model is compact, weighing only 26 megabytes, which is notably smaller than complex models like VGG-16, AlexNet, GoogleNet, Dan-CNN and ResNet. 16,19 While these larger networks may yield slightly better accuracy, their use in dermatology clinics or by aesthetic practitioners for skin assessment or anti-ageing treatment evaluation is resource-intensive and limited.

| Hand biomarkers for age prediction by AI
Prior research mainly focused on age prediction using facial images.
Our study expands this scope to explore the potential of hand images as a reliable age predictor. We achieved an MAE of 4.7 years, with a significant correlation with actual chronological age (r = 0.90, p < 0.0001). Notably, we found no significant difference in age prediction between face and hand images for the same individual, with the MAE difference being less than 1 year. Furthermore, there was a strong correlation between age predictions based on hands and faces (r = 0.95, p < 0.0001).
The correlation of predicted versus actual age is highly statistically significant using either hand or facial image datasets. Thus, dorsal hand images are a viable alternative to facial images for age prediction and provide a potential solution in situations where facial images are unavailable or unsuitable, such as forensic investigations, medical assessments, or criminal suspect identification and surveillance.
The study of hand biomarkers has an important ethical dimension. Facial images are often considered more sensitive than hand images because they are more directly linked to personal identity and can reveal more personal information about an individual, such as age, sex, gender, race, ethnicity and emotions. 59 As a result, the use of facial images in research may raise greater concerns about privacy and confidentiality, as well as the potential for stigmatisation or discrimination. 60,61 Hand images seem to be less sensitive than facial images and may be viewed as more impersonal or less revealing of personal identity. Thus, in some cases, they could be more appropriate for research. At the same time, the proven possibility of age prediction from hands shows that these images can be a source of ethically sensitive biomarkers. Moreover, it is possible to identify a person from dorsal hand images, as in the case of facial photographs. 28 In addition, hand images may reveal personal information, such as scars or tattoos, that could be sensitive or stigmatising for the individual. These concerns raise the question of stricter requirements for the storage, processing and publishing of dorsal hand images.
There is ongoing debate regarding the biological basis of using dorsal hand images as a source of ageing biomarkers. 12,13 Photodamage of the skin on the dorsal side of the hands has also been found to correlate with chronological age. 12 However, using hand images for age prediction has some limitations. For example, Flament et al. 6 concluded that the visual features of the hands of Japanese individuals can hardly be taken as reliable markers of their photoageing. Furthermore, Fink et al. 5 found, in a study conducted in the United Kingdom, that the perceived age of hands is lower than that of the face. Therefore, further research is needed to justify the use of hand images for age prediction.

| Racial specificity of age biomarkers
The relationship between hand features and age prediction is strongly influenced by race. Wang and Deng 62 note that racial bias can lead to lower recognition accuracy and higher error rates for non-Caucasians. Messaraa et al. 63 found that ageing of the dorsal hand skin was more pronounced in Caucasian than in Chinese cohorts. Skin tone is an important race-specific attribute, but it cannot fully explain the differences in skin ageing across races.
Passeron et al. 3 report that skin properties such as transepidermal water loss and DNA repair mechanisms also vary across races. Therefore, it is essential to study race-specific datasets to achieve accurate age prediction. In this regard, our study focused on the Indian population, which is underrepresented in the literature. 34,64 Most published works have been based on datasets in which images of African and European people predominate. However, some works have focused on Asians. 17,56,58 CNNs trained on datasets with balanced races, including Asians, Indians, Caucasians and Africans, have also been proposed. 62 In the current study, which focuses on Indians, the MAE is 4.1 years and r = 0.93, which is comparable with existing algorithms. To the best of our knowledge, this is the first CNN focused on age prediction from a dataset of Indian faces only.
The current study is particularly important because of the high skin tone diversity of the Indian population. 36,65 This diversity is what distinguishes Indians from other ethnicities; indeed, Indians comprise more than 2000 ethnic groups. 65 This gives grounds to consider using the CNN trained on the Indian dataset for age prediction in non-Indian populations. However, this should be verified in further studies and validation.
Notably, there were no significant differences in MAE for age prediction from face images when comparing skin tone groups. However, a pronounced difference in both the training and test groups was observed for hand images. This could be due to the imbalance of the dataset in terms of skin tone, as light skin tones were represented by a lower number of individuals. It is commonly accepted that a skin tone imbalance in a training dataset leads to a decrease in age prediction accuracy in the less represented groups. 33,55,66 Training on an imbalanced dataset could lead to bias in the predictions of the newly created CNN. 61

| Possible effects of lifestyle on age prediction from hand and face images
Our research did not include the effects of environmental factors, such as social status and lifestyle, on the correlation between visual skin features and age, or on the correlation of face and hand skin tone. Diet, skincare habits, occupation, history of disease, and exposure to pollutants and solar radiation can all contribute to age-related changes in the skin. For example, it was shown for Chinese females that indoor air pollution, specifically from heating and cooking with fossil fuels in rural areas of China, is significantly associated with an increased risk of severe facial wrinkles and fine wrinkles on the back of the hands, independent of age and other influences on skin ageing. 3 On the other hand, the CNN developed here, being agnostic to the myriad environmental factors that affect skin ageing, is more robust as a scientific methodology.
The skin tone of the face tends to be more uniform and consistent than that of the hands. 67 The skin on the hands is often exposed to more sunlight and environmental pollutants than the face, which can result in a darker or more uneven skin tone. 3,67,68 At the same time, the degree of exposure depends on the individual's behaviour.
The face is typically uncovered, while the dorsal hand may be covered by clothing or gloves, reducing its exposure. 6,63 Thus, a higher variation of skin parameters is expected on the hands. Dorsal hand skin also tends to be thinner and less hydrated than the skin of the face, which can contribute to differences in skin tone. 68 In addition, people may use different skincare products and makeup on their face than on their hands, which can also contribute to the difference.
The above circumstances could explain the increase in MAE with age in the hand datasets: the cumulative effect of lifestyle and climate on hand skin condition. The lowest MAE of prediction by hand was observed in the 30-40 years group, in accordance with previous studies. In a study on face images, the lowest MAE was also observed in the younger adult groups (20-30 and 35-45 years). 55 For prediction from eye corners, the highest 1-off accuracy was also observed in the 33-40 group. 20,69-71 The importance of the infrapalpebral sulcus, cheeks/cheekbones and forehead has been shown for age prediction from facial images by AI algorithms. 72 In a study on Chinese males, a significant correlation was observed between actual age and age predicted by CNNs from the nasolabial sulcus and wrinkles in the zones of the forehead, infrapalpebral sulcus and glabella. 23 The size of the philtrum and the geometry of the lips, particularly the lip commissure, depend on age in Korean females. 71 In a cohort of Japanese females, the depth of periorbital wrinkles, crow's feet wrinkles, glabellar wrinkles and nasolabial folds, the texture of the lips and the density of pigment spots on the cheeks were correlated with age. 58 The study by Flament et al. 64 showed that wrinkles/texture and ptosis/sagging were the most significant factors in almost all ethnicities, although to varying degrees. Pigmentation disorders were also crucial secondary factors in Japanese, South African and Indian females. 58 In vivo multiphoton multiparametric 3D quantification showed age dependence of the temple area. 73 However, it should be noted that morphometric parameters are sometimes ethnicity- and sex-specific. 70 Therefore, the same models are not always applicable for prediction on different datasets.

| Revealing visual age biomarkers by the sensitivity-like interpretation technique
In both analyses, features such as the thumbs, interdigital spaces, joint creases, knuckles and metacarpal bone zones were significant for age prediction in most individuals. These findings may be attributed to age-related changes, such as the development of wrinkles on the palmar side of the hand due to dehydration. 74,75 Wrinkles in the interdigital spaces become more pronounced with age, and hand creases and knuckles are naturally prone to wrinkles as well. As people age, the skin loses elasticity and becomes thinner, 63 which can cause wrinkles and creases to become more prominent. It can also cause the veins and tendons on the dorsal side of the hand to become more prominent, and the bones of the hand to become more visible, including the zones of the knuckles, phalanges and metacarpal bones. The thickness and density of the cortical bone in the proximal phalanges and metacarpal bones can decrease with age, leading to changes in the appearance of the hand bones. 76 Additionally, joint spaces may widen or narrow with age due to wear and tear or degenerative changes, which can also affect the appearance of the hand bones. 76 The dorsal skin of the metacarpal zone shows a pattern of hyperpigmentation due to photoageing. 6,63 In Japanese females, ageing is accompanied by an increase in the number and size of pigment spots. 6 In cohorts of Caucasian and Chinese females, ageing is accompanied by decreasing lightness and hydration, greater heterogeneity of pigmentation, increased skin roughness and more pronounced veins and bones.
Some hand datasets also suffer from age imbalance for the screening of ageing biomarkers. For example, most images in the 11k Hands Dataset 27 were taken of people in the age range of 20-25 years. This complicates the already challenging work with scarce data on hands.
Subjects cleansed their faces and hands at the study site with a commercial facial cleanser and acclimated for 30 min in an air-conditioned room before image capture. The protocol, informed consent statement and photo release forms were approved by an appropriately constituted IEC as per Indian Regulations (Schedule Y of the Drug and Cosmetic Act).
FIGURE 1 Characteristics of the generated dataset. Age distribution of the subjects in the Mumbai (A), New Delhi (B) and Bangalore (C) datasets. Representative images (VISIA CR) used for the analysis of the right (D) and left (E) hands as well as the face (F) of subjects, obtained using cross-polarised lighting. Skin tone classification for the dataset: (G) calculation of the individual typology angle (ITA) in the CIE L*a*b* colour subspace and skin tone clusterisation based on facial images; (H) calculation of the ITA in the CIE L*a*b* colour subspace and skin tone clusterisation based on dorsal hand images. (I) Correlation of ITA calculated from facial and dorsal hand images. (J) Similarity matrix of skin tones predicted by face (columns) and hand (rows) images; colour reflects the number of individuals in each group. a.u., arbitrary units.
For automated age prediction from face and hand images, CNN-based pipelines were developed. The images were padded to square and resized to 512 × 512 pixels. During training, the following augmentations were utilised: image rotations and horizontal and vertical mirroring. We used the Adam optimisation algorithm and mean squared error as the loss function. The same neural network architecture was used for training on face and hand images to predict age. We achieved the best accuracy for both models after training for 40 epochs. The hand and face datasets were randomly split into training and test sets in the proportion 7:3, with the same age distributions in both. Moreover, the hand and face training datasets shared the same subjects while the test datasets contained the remaining subjects; i.e., each subject appeared in either the training or the test partition only, and if a subject's face images were in the training dataset, their hand images were in the hand training dataset as well.
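The pad-to-square step can be illustrated as follows (a minimal sketch; the subsequent resize to 512 × 512 would be done with an image library and is omitted here, and the function name is ours):

```python
import numpy as np

def pad_to_square(img: np.ndarray, fill: int = 0) -> np.ndarray:
    """Centre the image on a square canvas of side max(h, w), so that the
    later resize to 512 x 512 does not change the aspect ratio."""
    h, w = img.shape[:2]
    side = max(h, w)
    out = np.full((side, side) + img.shape[2:], fill, dtype=img.dtype)
    top, left = (side - h) // 2, (side - w) // 2
    out[top:top + h, left:left + w] = img
    return out
```

Padding before resizing avoids distorting anatomical proportions (finger length, face shape), which would otherwise act as a confound for the age model.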
The Pearson correlation coefficient (r) and its significance level (p) were reported at the given size of the dataset (n). The MAE between parameters A and B was calculated as MAE = (1/n) Σ_i |x_i^A − x_i^B|, where x_i^A and x_i^B are the values of A and B for each individual subject i, and n is the number of subjects in the dataset.
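The two evaluation metrics can be computed with plain Python (a minimal sketch; the p-value, which requires a t-distribution, is left to a statistics package):

```python
import math

def mae(a, b):
    """Mean absolute error: (1/n) * sum_i |a_i - b_i|."""
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

def pearson_r(a, b):
    """Pearson correlation coefficient between paired series."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    var_a = sum((x - ma) ** 2 for x in a)
    var_b = sum((y - mb) ** 2 for y in b)
    return cov / math.sqrt(var_a * var_b)

# Example: predictions off by 4, 1 and 4 years give MAE = 3.0 years.
```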

Meanwhile, in the analysis using the SM kernel, the rhinion, philtrum and the zone of the supraorbital notch and eyelids were significant in positively increasing the predicted age for all individuals (Figure 4B, Appendix S4). The labial commissure and cheekbones were also significant for all individuals except the 58-year-old and 69-year-old, respectively (Figure 4B, Appendix S4). In the Gaussian distortion analysis, marionette lines and temple areas were important for decreasing the predicted age for all individuals. The labial commissure was significant for all studied individuals except the 25-year-old (Figure 4C, Appendix S4). No features were found to contribute to the decrease of the predicted age for all subjects in the analysis of face images applying the SM distortion (Figure 4D, Appendix S4). Applying the SM kernel to the zone of the marionette lines contributed to a decrease in predicted age in 3 of the 4 subjects, the exception being the 25-year-old individual (Appendix S4). Similarly, applying the SM kernel to the glabella contributed to a decrease in predicted age in all but one subject, the 69-year-old individual (Figure 4D, Appendix S4).
FIGURE 3 Representative results (58-year-old subject) of the interpretation of the model for age prediction. The regions important for age prediction are overlaid with green (positive effect) or purple (negative effect) colour. (A) Face images, Gaussian kernel, positive effect on predicted age. (B) Face images, Gaussian kernel, negative effect on predicted age. (C) Hand images, Gaussian kernel, positive effect on predicted age. (D) Hand images, Gaussian kernel, negative effect on predicted age. (E) Face images, sinusoidal modulated kernel, positive effect on predicted age. (F) Face images, sinusoidal modulated kernel, negative effect on predicted age. (G) Hand images, sinusoidal modulated kernel, positive effect on predicted age. (H) Hand images, sinusoidal modulated kernel, negative effect on predicted age.

| Limitations of the study
This study had the following limitations: (i) the dataset was not balanced in terms of skin tones, which could lead to very low accuracy in the less-represented groups (light skin colour); (ii) only photographs obtained in the cross-polarised mode were taken into account.

TABLE 1 Accuracy of age prediction from face and hand images of Indian females. Note: Metrics were calculated separately for the test and training datasets. Predicted age by face images versus actual age, predicted age by hand images versus actual age, and predicted age by face images versus predicted age by hand images are compared. Abbreviations: MAE, mean absolute error; n, size of the dataset; p, significance level; r, Pearson correlation coefficient.
TABLE 2 Note: Metrics were calculated separately for the test and training datasets. Abbreviations: MAE, mean absolute error; n, size of the dataset.
FIGURE 4 Venn diagrams demonstrating the effect of different face and hand zones on chronological age prediction after applying a distortion in the corresponding zones of the image. (A) Face images, Gaussian kernel, positive effect on predicted age. (B) Face images, sinusoidal modulated kernel, positive effect on predicted age. (C) Face images, Gaussian kernel, negative effect on predicted age. (D) Face images, sinusoidal modulated (SM) kernel, negative effect on predicted age. (E) Hand images, Gaussian kernel, positive effect on predicted age. (F) Hand images, sinusoidal modulated kernel, positive effect on predicted age. (G) Hand images, Gaussian kernel, negative effect on predicted age. (H) Hand images, sinusoidal modulated kernel, negative effect on predicted age. Abbreviations: CIP, crease over the interphalangeal joint; CPIP, crease over the proximal interphalangeal joint; IP, infrapalpebral; MB, metacarpal bone; ML, marionette lines; MP, medial phalanx; PP, proximal phalanx; SA, supraorbital. Symbols '+' and '−' refer to positive and negative effects on the predicted age, respectively; 'G' and 'S' refer to the Gaussian and sinusoidal modulated kernels, respectively.