S7504: Improving Consumer Compliance Through Better Product Recommendation: New Skin Advisor Tool Powered by AI
Jun Xu PhD, Faiz Sherman PhD, Matthew Barker PhD*, Frauke Neuser PhD, Shannon Weitz
The Procter & Gamble Company
THE SCIENCE BEHIND OLAY SKIN ADVISOR skinadvisor.olay.com
Procter & Gamble (P&G), NYSE: PG, Global Research & Development
P&G scientists around the world work together to develop products that improve more lives in more meaningful ways, now and for generations to come.
• 179 years of innovation history: brand, business, and social innovation
• 40,000 active patents worldwide
• 7,500 R&D employees, including more than 1,000 PhDs in 120 scientific disciplines
• $2 billion+ annual R&D investment
• 5 of the top 10 non-food product launches in the 2016 IRI New Product Pacesetters
P&G: 25+ Years of Industry-Leading Skin Imaging
Background
• Consumers struggle to find the right cosmetic skin care products suited to their personal needs and preferences. The ability to make an informed product selection decision can drive product compliance and delight.
• A new skin advisor tool has been developed that delivers a personalized beauty consultation tailored to each consumer's unique skin needs, right at her fingertips.
• This tool combines deep learning with consumer preferences related to visible skin concerns, cosmetic product use, and skin feel to make the optimal product recommendation.
Development Overview
• Deep Learning Algorithm: visible skin age prediction with visible aging area identification.
• Visible Skin Age Validation: predictions compared to expert (dermatologist) assessment.
• Aging Area Insights: a Facial Mapping Study informs how the appearance of aging areas changes with chronological age.
• Compliance Verification: proving that the skin advisor, with its deep learning algorithm, visible aging area insights, and consumer preferences, drives compliance.
Facial Features & Aging
[Facial diagram labeling the glabella, marionette lines, and nasolabial folds]
Deep Neural Network Application
• The skin advisor uses convolutional neural networks trained on NVIDIA graphics processors that perform trillions of calculations per second. The model was trained on 50,000 images tagged with chronological age data.
• When an image of a user is received, the model determines the visible skin age from the raw image pixels. In addition, a two-dimensional heat map is generated that identifies the regions of the image that contribute to the visible skin age.
[Diagram: raw image pixels → convolutional neural network → predicted age]
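As a rough illustration of the inference step described above, the sketch below (written in PyTorch purely for readability; the model file name, input size, and helper names are assumptions, not the production Olay pipeline) loads a trained regression CNN and returns a visible skin age for one preprocessed selfie.

```python
# Minimal inference sketch (assumptions: a trained regression CNN exported as
# "skin_age_cnn.pt" and a 224x224 RGB input; illustrative only).
import torch
from torchvision import transforms
from PIL import Image

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),          # standardize spatial size
    transforms.ToTensor(),                  # HWC uint8 -> CHW float in [0, 1]
])

model = torch.jit.load("skin_age_cnn.pt")   # hypothetical exported model
model.eval()

def predict_visible_skin_age(image_path: str) -> float:
    """Return the predicted visible skin age for one aligned face image."""
    img = Image.open(image_path).convert("RGB")
    x = preprocess(img).unsqueeze(0)        # add batch dimension: 1x3x224x224
    with torch.no_grad():
        age = model(x).item()               # single scalar regression output
    return age

print(predict_visible_skin_age("selfie_aligned.jpg"))
```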
Data Setup
• Face detection & alignment performed using dlib: each face is rotated, scaled & cropped to a standard size.
• Spatial augmentation applied: random horizontal flipping, rotation, scaling, and zoom cropping (which also causes slight translation).
• HSV color augmentation: random changes to saturation & exposure.
• Oval mask, global contrast normalization (GCN), then reapply the oval mask. A sketch of this kind of pipeline is shown below.
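The sketch below approximates such a preprocessing pipeline. It is illustrative only: the dlib landmark model file, the 256-pixel chip size, the augmentation ranges, and the oval-mask geometry are assumed values, not the ones used for the skin advisor.

```python
# Illustrative preprocessing/augmentation sketch (all parameter values assumed).
import random
import numpy as np
import dlib
import cv2

detector = dlib.get_frontal_face_detector()
# Hypothetical landmark model path; dlib ships 5- and 68-point predictors.
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

def detect_and_align(rgb: np.ndarray, size: int = 256) -> np.ndarray:
    """Detect the face, then rotate/scale/crop it to a standard square chip."""
    face = detector(rgb, 1)[0]                      # assume one face per selfie
    landmarks = predictor(rgb, face)
    return dlib.get_face_chip(rgb, landmarks, size=size)

def spatial_augment(img: np.ndarray) -> np.ndarray:
    """Random flip, small rotation/scaling, and a zoom crop that also translates slightly."""
    if random.random() < 0.5:
        img = np.ascontiguousarray(img[:, ::-1])    # horizontal flip
    h, w = img.shape[:2]
    M = cv2.getRotationMatrix2D((w / 2, h / 2),
                                random.uniform(-10, 10),   # rotation (degrees)
                                random.uniform(0.9, 1.1))  # scaling
    img = cv2.warpAffine(img, M, (w, h))
    crop = random.randint(0, 16)                    # zoom crop => slight translation
    return cv2.resize(img[crop:h - crop, crop:w - crop], (w, h))

def hsv_augment(img: np.ndarray) -> np.ndarray:
    """Randomly perturb saturation and exposure (value) in HSV space."""
    hsv = cv2.cvtColor(img, cv2.COLOR_RGB2HSV).astype(np.float32)
    hsv[..., 1] *= random.uniform(0.8, 1.2)         # saturation
    hsv[..., 2] *= random.uniform(0.8, 1.2)         # exposure / value
    hsv = np.clip(hsv, 0, 255).astype(np.uint8)
    return cv2.cvtColor(hsv, cv2.COLOR_HSV2RGB)

def mask_and_normalize(img: np.ndarray) -> np.ndarray:
    """Apply an oval face mask, global contrast normalization, then re-mask."""
    h, w = img.shape[:2]
    mask = np.zeros((h, w), np.uint8)
    cv2.ellipse(mask, (w // 2, h // 2), (w // 2 - 8, h // 2 - 4), 0, 0, 360, 255, -1)
    x = img.astype(np.float32)
    x[mask == 0] = 0                                # oval mask
    x = (x - x.mean()) / (x.std() + 1e-6)           # global contrast normalization
    x[mask == 0] = 0                                # reapply oval mask
    return x
```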
CNN Using Torch
• A learning rate regime that changes as the number of epochs increases.
• A small batch size was utilized: it uses less memory and explores more of the parameter space to find parameter estimates corresponding to a good minimum.
• 20+ layers of convolution, max pooling, and leaky ReLU, decreasing the spatial dimensions while increasing the depth dimension. Dropout was also used in the later layers, consistent with the AlexNet architecture with adaptations.
• Multi-threading was used to speed up JPEG decompression and feed data to the GPU; otherwise the GPU is starved.
• RMSProp was used to optimize gradient descent.
• Model training took roughly 8 hours on an NVIDIA Titan X GPU. An illustrative re-expression of this setup is sketched below.
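The bullets above describe the original Lua Torch setup; the sketch below re-expresses the same ideas in PyTorch for illustration only. The layer counts, channel widths, learning-rate schedule, batch size, and class name are placeholders consistent with the description (AlexNet-style convolution/max-pool/leaky-ReLU stack, dropout in later layers, small batches, RMSProp, parallel JPEG loading), not the production configuration.

```python
# Illustrative PyTorch re-expression of the described training setup
# (the original system used Lua Torch; all hyperparameters here are assumptions).
import torch
import torch.nn as nn
from torch.utils.data import DataLoader

class AgeRegressionCNN(nn.Module):
    """AlexNet-style stack: shrinking spatial size, growing depth, dropout late."""
    def __init__(self):
        super().__init__()
        chans = [3, 32, 64, 128, 256, 512]
        blocks = []
        for c_in, c_out in zip(chans[:-1], chans[1:]):
            blocks += [
                nn.Conv2d(c_in, c_out, kernel_size=3, padding=1),
                nn.LeakyReLU(0.1),
                nn.Conv2d(c_out, c_out, kernel_size=3, padding=1),
                nn.LeakyReLU(0.1),
                nn.MaxPool2d(2),            # halve spatial dims while depth grows
            ]
        self.features = nn.Sequential(*blocks)
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Dropout(0.5),                # dropout only in the later layers
            nn.Linear(512 * 7 * 7, 1024),   # assumes 224x224 inputs
            nn.LeakyReLU(0.1),
            nn.Dropout(0.5),
            nn.Linear(1024, 1),             # single visible-skin-age output
        )

    def forward(self, x):
        return self.head(self.features(x)).squeeze(1)

def train(dataset, epochs: int = 30, device: str = "cuda"):
    model = AgeRegressionCNN().to(device)
    # Small batches: less GPU memory, noisier gradients that explore more of the loss surface.
    # Worker processes decode JPEGs in parallel so the GPU is not starved for data.
    loader = DataLoader(dataset, batch_size=32, shuffle=True,
                        num_workers=8, pin_memory=True)
    optimizer = torch.optim.RMSprop(model.parameters(), lr=1e-3)
    # Learning-rate regime that changes as epochs increase (decay assumed here).
    scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10, gamma=0.1)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        for images, ages in loader:
            images, ages = images.to(device), ages.float().to(device)
            optimizer.zero_grad()
            loss = loss_fn(model(images), ages)
            loss.backward()
            optimizer.step()
        scheduler.step()
    return model
```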
Gradient Heat Map for Visualization
• After training, with the model parameters fixed, a gradient heat map was created to localize the pixels of a subject's image that differ relative to a younger version of their predicted age.
• An input image was forward propagated through the model to obtain a predicted age. A target of the predicted age minus 10 years was then set, and the gradients were propagated back through the network to the input image. A heat map was created by summing the absolute values of the RGB gradients for each pixel and rescaling from 0 to 1 for display purposes.
• The gradient heat map was then blended with the original image to visualize the areas that differ from the younger target age. A sketch of this step follows.
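A compact sketch of this saliency step is given below, again in PyTorch. The blending weight, the simple red/blue colormap, and the function name are illustrative assumptions; only the 10-year offset and the sum-of-absolute-RGB-gradients recipe come from the description above.

```python
# Gradient heat-map sketch: backpropagate toward "predicted age minus 10 years"
# and visualize which input pixels drive the visible-skin-age estimate.
import torch

def gradient_heat_map(model, image: torch.Tensor, alpha: float = 0.6):
    """image: 3xHxW float tensor in [0, 1]; returns (predicted_age, blended HxWx3 array)."""
    model.eval()
    x = image.clone().unsqueeze(0).requires_grad_(True)   # 1x3xHxW, track gradients
    pred = model(x)                                        # forward pass -> predicted age
    target = pred.detach() - 10.0                          # target: 10 years younger
    loss = (pred - target).pow(2).mean()
    loss.backward()                                        # gradients flow back to the pixels
    grad = x.grad[0].abs().sum(dim=0)                      # sum |d loss / d RGB| per pixel -> HxW
    heat = (grad - grad.min()) / (grad.max() - grad.min() + 1e-8)  # rescale to [0, 1]
    # Blend the heat map with the original image for display (simple red/blue colormap).
    heat_rgb = torch.stack([heat, torch.zeros_like(heat), 1 - heat])
    blended = alpha * image + (1 - alpha) * heat_rgb
    return pred.item(), blended.permute(1, 2, 0).detach().numpy()
```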
Visible Skin Age Validation
Evaluate the robustness of the visible skin age algorithm by comparing its output to a gold-standard dermatologist assessment.
1. A validation set of 630 selfie images representing the general US female population was obtained.
2. These images were presented to 615 dermatologists, who represent the gold standard in visible skin evaluation, in randomized order in sets of 8 images. Each dermatologist evaluated a set of images.
3. The dermatologists were asked to input the perceived age of each image.
Validation Results
The mean difference between the predicted visible skin age and the chronological age from the skin advisor deep learning algorithm was comparable to the mean difference between the perceived age and the chronological age given by dermatologists.
[Chart: mean age difference, deep learning algorithm vs. dermatologists]
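For clarity, the comparison metric can be expressed as a small sketch. The signed-difference interpretation of "mean difference" is an assumption here, and no study values are reproduced.

```python
# Sketch of the comparison metric: mean (signed) difference from chronological age.
import numpy as np

def mean_age_difference(estimated_ages, chronological_ages) -> float:
    """Average of (estimated - chronological) across the validation images."""
    return float(np.mean(np.asarray(estimated_ages) - np.asarray(chronological_ages)))

# Hypothetical usage with the study's per-image values (not reproduced here):
# algo_diff = mean_age_difference(algorithm_predictions, chronological_ages)
# derm_diff = mean_age_difference(dermatologist_ratings, chronological_ages)
```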
Facial Area Insights – Mapping Study
• To build a fundamental understanding of the underlying mechanisms of facial aging across different facial sites, a clinical Facial Mapping Study enrolling over 150 subjects was conducted, spanning physical, lifestyle, molecular, visual, and optical measures.
• The study assessed facial skin genomics, image analysis parameters, lifestyle factors, and skin measurements in two groups of female subjects: a younger group (ages 20-29 years) and an older group (ages 55-75 years). The study did not assess the application of cosmetics.
• Facial locations analyzed included the forehead, crow's feet area, under eye, nasolabial fold, cheek, glabella, marionette lines, above mouth, and nose regions.
Facial Mapping Study - Results
• The Skin Advisor Tool shares the best aging area and the area that needs improvement, based on the deep learning algorithm. Key educational information about how those areas age is also given.
• Insights from the facial mapping study were used to inform how visible aging areas change with chronological age.
• Quantitative assessment of wrinkles revealed distinct visible topography feature presentation across facial zones and with aging.
[Figure: wrinkle assessment by facial zone, ages 20-29 vs. ages 55-75; CK = cheek, GB = glabella, NL = nasolabial fold, FH = forehead, MN = marionette, UE = under eye, LP = above lip, CF = crow's feet, NS = nose]
Compliance Verification
• 100 US women, ages 25-65, all facial moisturizer users, were enrolled in a 4-week online consumer test.
• Group 1 (n=50) received a product regimen based on the skin advisor deep learning algorithm and their preferences, and Group 2 (n=50) self-selected a product regimen.
• Self-assessment questions were completed pre-use and after 4 weeks of product use.
Compliance Results
Figure 4: Pre-product use, responses indicate satisfaction with the skin advisor product recommendation.
Figure 5: After 4 weeks of product use, responses indicate satisfaction with the skin advisor product recommendation and improved consumer compliance with longer product use.
Demo
Olay Skin Advisor – Website Results
• Over 1.4 million visits to the site
• High engagement rates
• Half the bounce rate of a typical beautybrand.com website
• Twice the time spent vs. a typical beautybrand.com website
• Huge opportunity for real-time consumer learnings
• A 2.0 upgrade launched 6 weeks ago
Conclusion
Creating a tool that leverages a deep learning algorithm to predict visible skin age and aging areas creates motivation to comply with a cosmetic skin care regimen. The visible skin age and aging area analysis is further backed by dermatologist validation and clinical data to support a robust product recommendation. The new skin advisor tool combines this technical information with consumer preferences to recommend cosmetic products that provide the delight required for skin care regimen compliance.
Thank you! Questions? Barker.ML@pg.com