Personalised beauty has transformed in recent years as consumers seek products tailored to their individual needs – much like in nutrition and fitness. The trend has only been accelerated by the pandemic, with “try before you buy” no longer an option. Online retailers are now using tools such as digital face scanning, artificial intelligence (AI) and augmented reality to offer customers the perfect product, and even to diagnose skin conditions and skin types in the absence of an in-store expert. Retailers will need to think carefully about how they use the data collected from consumers and ensure transparency.
Mishcon de Reya, a law firm and member of the CEW Board, has set out some guidance below for brands to consider when navigating developments in beauty tech.
Trends
In the past few years, more and more retailers have been exploring tools to enhance their customer offering. For example, Wella Professionals’ augmented reality-enabled Smart Mirror uses the CareOS operating system to allow customers to try different hair dye shades in real time. By using facial recognition, it can retrieve past hair styles, catering for a fully personalised experience. Other retailers are using ‘bots’. Sephora’s Virtual Artist bot, created by ModiFace, allows customers to virtually ‘try on’ products and even scan the face of a celebrity to view a list of matching lipsticks. This year, we expect to see the launch of L’Oréal’s Perso device, which will enable users to create custom lipsticks by blending presets in the app or colour matching to any image, trend or outfit.
Legal Framework
Data Protection
Beauty companies are now collecting personal data from individuals more frequently, much of which will constitute ‘special category data’. Examples of ‘special category data’ include health data (e.g. data relating to a user’s skin conditions such as rosacea or acne) and biometric data (e.g. the user’s facial image). Following Brexit, beauty companies which offer their products or services to consumers in both the UK and the EEA will need to comply with two data protection regimes: the ‘UK GDPR’ and the EU GDPR. Whilst the two regimes share a common starting point, this adds a further layer of complexity to regulatory compliance.
In order to lawfully process special category data, companies must identify both a lawful basis under Article 6, GDPR/UK GDPR (e.g. necessary for the performance of a contract) and a separate condition for processing under Article 9, GDPR/UK GDPR (e.g. the data subject has given explicit consent). These do not have to be linked. It is also important to ensure that the lawful basis is documented before the start of any such processing. Retailers will need to ensure that users are informed about the type of data being collected, the legal basis for processing, the recipients of such data and so forth through a clear and concise privacy notice. Where ‘explicit consent’ is required, an appropriate mechanism should be implemented to ensure that the customer has given their clear, unambiguous agreement for their data to be used in a specific way.
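By way of illustration only, the record of explicit consent kept by a retailer might capture the points above. The sketch below is a minimal, assumed structure (the field and function names are not a prescribed format under the GDPR/UK GDPR):

```typescript
// Illustrative sketch only: field names and structure are assumptions,
// not a format mandated by the GDPR/UK GDPR.
interface ConsentRecord {
  userId: string;
  purpose: string;              // e.g. "AI skin analysis and product recommendation"
  lawfulBasisArticle6: string;  // the Article 6 lawful basis relied on
  article9Condition: string;    // the Article 9 condition for special category data
  consentText: string;          // the exact wording shown to the customer
  givenAt: string;              // ISO 8601 timestamp of the affirmative action
  withdrawn: boolean;           // withdrawal must be as easy as giving consent
}

// Record explicit consent only after a clear affirmative action (no pre-ticked boxes).
function recordExplicitConsent(userId: string, consentText: string): ConsentRecord {
  return {
    userId,
    purpose: "AI skin analysis and product recommendation",
    lawfulBasisArticle6: "explicit consent",
    article9Condition: "explicit consent (Article 9(2)(a))",
    consentText,
    givenAt: new Date().toISOString(),
    withdrawn: false,
  };
}
```

Keeping a dated record of the exact consent wording, alongside the documented lawful basis, helps demonstrate accountability if the processing is later questioned.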
Where beauty companies use an AI-powered system to recommend certain products to consumers, this is a form of “automated decision making”. Article 22, GDPR/UK GDPR provides that individuals have the right not to be subject to a decision based solely on automated processing “which produces legal effects concerning him or her or similarly significantly affects him or her” (unless one of the few exceptions applies). Before implementing such a system, organisations should: (a) carry out a data protection impact assessment (DPIA) to assess the risks to individuals from a processing operation and identify ways to address those risks; and (b) provide specific information about automated decision making in a clear and concise layered privacy notice or just-in-time notification.
Any beauty company should be aware that consumers have a right to be informed about the underlying algorithm and logic involved in any product recommendation made to them by the AI-powered system. As the process can only make an assumption about someone’s behaviour or characteristics, there will always be a margin of error, and a balancing exercise is needed to weigh up the risks of using the results. When using AI-powered systems, beauty companies should ensure they understand the underlying rules that apply to automated decision making and that they can provide an audit trail showing the key decision points that formed the basis for the decision. If the system considered any alternative decisions, the beauty company needs to understand why these were not preferred. A process should also be put in place for individuals to challenge or appeal a decision made by the system.
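As an illustration of what such an audit trail might record, the sketch below sets out one possible structure. The names and fields are assumptions made for the purpose of the example, not a format required by the GDPR/UK GDPR or any regulator:

```typescript
// Illustrative sketch only: an assumed audit-trail entry for one automated recommendation.
interface RecommendationAuditEntry {
  decisionId: string;
  userId: string;
  inputsSummary: string[];          // key data points the system relied on (e.g. detected skin type signals)
  recommendedProduct: string;       // the outcome presented to the customer
  alternativesConsidered: string[]; // options the system evaluated but did not prefer, with the reason
  modelVersion: string;             // which version of the algorithm produced the decision
  decidedAt: string;                // ISO 8601 timestamp of the decision
  humanReviewAvailable: boolean;    // whether the customer can request human intervention or appeal
}
```

An entry along these lines, kept for each automated recommendation, would support both the explanation owed to the consumer and any subsequent challenge or appeal.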
To reduce risk, it may be prudent in some instances to ensure that a specialist dermatologist internally approves any skin care products recommended automatically by an AI system. Further, mechanisms should be implemented to identify any quality issues or biases in the AI-powered system; these should then be documented, along with the steps taken to resolve them. Where an individual objects to such profiling, the processing must be stopped and confirmation given to the individual within a month of receipt of their objection. Where the request is complex, or where an individual makes a number of requests, the response time may be extended by a further two months (provided the individual is told why the extension is necessary without undue delay and within one month of the organisation receiving the request).
Medical Device Regulations
In some instances, beauty companies may also need to ensure compliance with medical device regulations. In the UK, beauty companies should be aware of the Medical Devices Regulations 2002 (SI 2002 No 618, as amended) (UK MDR 2002). Under UK MDR 2002, a medical device “means an instrument, apparatus, appliance, material or other article, whether used alone or in combination, together with any software necessary for its proper application, which—(a) is intended by the manufacturer to be used for human beings for the purpose of- (i) diagnosis, prevention, monitoring, treatment or alleviation of disease,…“. Software (including AI) and apps (either incorporated into an existing device or supplied separately) that contribute to diagnostic processes (e.g. diagnosing certain skin diseases) could be regarded as a “medical device” unless an exemption applies. As noted in the Government Guidance ‘Medical device stand-alone software including apps’ (the Guidance), there are some exemptions to such software being classified as a “medical device”. These include software used for “monitoring general health and general wellbeing” and for “booking a virtual consultation or providing reference information to a Healthcare Professional to use their knowledge to make a clinical decision”.
Where UK MDR 2002 applies, the manufacturer will need to (among other things): eliminate or reduce risks as far as possible (inherently safe design and construction); where appropriate, take adequate protection measures in relation to risks that cannot be eliminated; and inform users of the residual risks due to any shortcomings of the protection measures adopted. Any app will need to be clinically evaluated and designed with safety in mind. For devices which incorporate software, or which are medical software in themselves, the software must be validated according to the state of the art, taking into account the principles of the development lifecycle, risk management, validation and verification. Appropriate labelling requirements must also be followed and, for software, it is important to consider a system of registration/activation, as this may help the manufacturer to trace devices that have been distributed by third-party distributors or app stores. As noted in the Guidance, this is important when undertaking any field safety corrective action.
Beauty companies and manufacturers using any of these tools will need to ensure that the processes they put in place comply with data protection legislation and any other regulations that may apply in any jurisdiction where consumers are being targeted or monitored.
For more information or to get in touch with Mishcon de Reya please visit their website: www.mishcon.com.