Foley In Fashion

2026 Privacy Landscape: What to Watch for and How to Build a Resilient Data Privacy Compliance Program

Fashion, beauty, and wearable technology companies are entering 2026 amid a rapidly evolving data privacy landscape that will shape how brands design products, personalize customer experiences, and process personal information. 2026 brings new U.S. state omnibus consumer privacy laws, increasingly prescriptive regulations governing the use of AI, and stricter rules governing health data, biometric data, and children’s privacy. Privacy is no longer just a legal box to check; it is becoming a competitive advantage that builds consumer trust and mitigates enforcement and litigation risk. To lean into this competitive advantage, brands should consider promoting transparent data practices to stand out in a crowded market.

U.S. State Comprehensive Consumer Privacy Laws

Comprehensive consumer privacy laws and regulations that went into effect in January

Three new comprehensive consumer privacy laws took effect on January 1, 2026, in Indiana, Kentucky, and Rhode Island, bringing the total number of states with such laws to 20. We expect additional states to enact comprehensive privacy legislation in 2026, as well as further amendments to existing laws, following recent updates in Colorado, Connecticut, Montana, and Utah. Significant updates to the California Consumer Privacy Act (“CCPA”) implementing regulations also went into effect on January 1, 2026. These amendments cover automated decision-making technology, risk assessments, and cybersecurity audits, as well as revisions to the definition of sensitive personal information, privacy notice requirements, and consumer rights.

To track these fast-moving developments, check out Foley’s U.S. State Comprehensive Consumer Privacy Laws Comparison chart, which is updated quarterly.

Biometric Data

Biometric and body-related data, such as the facial maps, body scans, and skin analyses generated by virtual dressing rooms, wearables, and beauty devices, remains a top-risk category in 2026. This data generally meets the statutory definition of sensitive personal information under state comprehensive consumer privacy laws.

Most state comprehensive consumer privacy laws limit the definition of biometric data to data that is used to uniquely identify an individual. By contrast, the Connecticut Data Privacy Act (“CTDPA”) treats biometric data as sensitive personal information regardless of whether it is processed for the purpose of uniquely identifying an individual. Processing biometric data under these privacy laws therefore triggers heightened requirements, such as opt-in consent requirements, additional consumer rights, and disclosure requirements.

Under the CCPA, consumers have the right to limit the use of their sensitive personal information, which requires additional disclosures and certain links on a company’s website, such as a “Your Privacy Choices” link, depending on the categories of personal information a company collects. Accordingly, a clear understanding of the categories of personal information collected and the purposes for which it is collected, used, and disclosed is essential to assessing a company’s compliance obligations.

Consumer Health Data Privacy Laws

Consumer health and wellness data will also remain a key risk area in 2026. Wearable devices and health-adjacent apps that infer stress, sleep, menstrual cycle patterns, or skin conditions are increasingly regulated by state privacy laws that govern such consumer health data outside of HIPAA. As discussed in a prior blog, the CTDPA was recently amended to cover consumer health data, and Washington and Nevada have implemented laws that specifically govern consumer health data privacy. Companies that develop wearable devices that collect consumer health- and wellness-related data must understand their obligations under these laws, as regulators have made clear that protecting consumer health-related data is an enforcement priority.

A Rapidly Evolving Children’s Privacy Landscape

Brands that intend to collect children’s personal information (e.g., youth-focused fashion apps, gamified beauty tutorials, and wearables) will be expected to comply with stronger protections in 2026. Various state laws governing children’s privacy and the protection of children and teens in online spaces will go into effect in 2026. These laws generally govern social media platforms, children’s and teens’ use of AI, and app stores and app developers, often imposing age verification and parental consent requirements. This remains a rapidly developing space, as many of these laws are frequently challenged in court and states continue to enact new legislation designed to withstand such legal challenges.

At the federal level, children’s privacy is also a high priority. In early 2025, the FTC finalized rules under the Children’s Online Privacy Protection Act (“COPPA”), focused on opt-in consent for targeted advertising, limits on data retention, and COPPA’s self-regulatory Safe Harbor Program. The FTC has indicated that it plans to begin enforcing these rules in 2026. In addition, the FTC has focused on age verification and age estimation technologies, hosting a workshop on January 28 to gather input from key stakeholders as age verification becomes a core compliance requirement under various laws. Brands should closely monitor these developments at the state and federal levels.

Artificial Intelligence

The use and regulation of artificial intelligence is unlikely to slow down in 2026. Companies deploying AI for personalization, AI-generated models and virtual fitting rooms, dynamic pricing, and similar use cases will be required to comply with heightened transparency standards. For example, the EU Artificial Intelligence Act (“EU AI Act”) classifies certain biometric systems that infer personal attributes as prohibited AI systems, which could have an impact on the retail industry as the use of AI continues to develop. Enforcement of the EU AI Act begins later this year.

In the United States, states continue to enact AI-related legislation. In 2025, California passed multiple AI-focused laws and the CCPA’s implementing regulations now address the use of automated decision-making technologies. Given the rapid adoption of AI and the broad commercial benefits, states are likely to continue regulating its use. These developments underscore the growing importance of robust AI governance to help companies monitor and meet evolving compliance obligations.

Targeted Advertising, Cookies and Tracking Technologies, and Class-Action Litigation

Fashion, beauty, and wearable companies will continue to leverage data, cookies, and other tracking technologies for targeted advertising and analytics. In recent years, class-action litigation under state wiretap laws, such as the California Invasion of Privacy Act (“CIPA”), has surged, challenging the use of cookies and tracking technologies without proper consent. Plaintiffs allege that the use of cookies and tracking technologies, such as pixels, constitutes illegal wiretapping under these laws. There has also been a surge in litigation under the Video Privacy Protection Act (“VPPA”), a law enacted in 1988 to protect consumers’ video rental histories. More recently, plaintiffs have alleged that disclosures of video-viewing data through the use of tracking technologies violate the VPPA. Earlier this year, the Supreme Court granted certiorari in a case involving the definition of “consumer” under the VPPA to resolve a circuit split. The Court’s decision will inevitably affect the future of privacy litigation and is one to watch in the coming months.

To mitigate the risk of being named in a class action suit, companies should review their cookie consent configurations to ensure that cookies and other tracking technologies do not load until a website visitor affirmatively consents, and that they do not deploy after a visitor rejects them. Companies should also review their privacy policies and disclosures to ensure they accurately reflect the ways in which the company uses such cookies and tracking technologies.

While reviewing consent procedures, companies should also ensure that their interfaces are not designed in a way that includes dark patterns. Under state comprehensive consumer privacy laws, dark patterns are generally defined as design practices that impair consumer autonomy and manipulate a consumer’s decision-making. One common example is making it easier for a consumer to consent than to decline, such as by including only an “Accept All Cookies” button on a cookie consent banner without a clear option to reject or opt out of tracking. To mitigate these risks, companies should implement “symmetry in choice,” as required by the CCPA. A best practice is to include both an “Accept All Cookies” and a “Reject All Cookies” option on a cookie consent banner, to help ensure that the banner complies with applicable law and industry standard practices.
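To make the symmetry concrete, a banner along these lines presents both choices with equal prominence, so neither path is privileged. The markup, class names, and copy below are purely illustrative assumptions, not a prescribed design.

```typescript
// Hypothetical cookie banner: accept and reject rendered identically,
// so declining is no harder than consenting ("symmetry in choice").
function renderConsentBanner(): string {
  return [
    '<div class="cookie-banner" role="dialog" aria-label="Cookie consent">',
    "  <p>We use cookies and similar technologies for analytics and advertising.</p>",
    '  <button class="consent-btn" data-choice="accept">Accept All Cookies</button>',
    '  <button class="consent-btn" data-choice="reject">Reject All Cookies</button>',
    "</div>",
  ].join("\n");
}
```

The key design point is that both buttons share the same styling hook and sit side by side, rather than hiding the reject path behind a “manage preferences” sub-menu.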

Developing a Comprehensive Data Privacy Compliance Program to Mitigate Risk

Given the growing number of laws and regulatory requirements, companies should continuously monitor legislative developments and regularly assess their data privacy program and policies to ensure they reflect the latest legal obligations and their current data collection practices. In 2026, companies should review their privacy program, including, but not limited to, the following:

■ Integrate privacy by design from development to implementation. For brands that combine personal expression with advanced technology that collects sensitive personal information about consumers, implementing privacy by design at the outset reduces redesign costs and helps mitigate the risk of breaches, misuse, and reputational damage.

■ Conduct data mapping exercises with key stakeholders to understand the categories of personal information collected, the purposes for which it is collected, and to whom it is disclosed. This enables companies to identify their obligations under state comprehensive consumer privacy laws and consumer health data privacy laws.

■ Establish a formal AI and automated decision-making governance framework that inventories AI and automated decision-making systems, classifies risk, and aligns with applicable legal requirements.

■ Implement consent procedures, particularly with respect to the collection of sensitive personal information, and the use of cookies and tracking technologies.

■ Conduct due diligence on vendors and standardize vendor agreements to ensure that each vendor is bound by a data processing agreement that satisfies applicable legal requirements.

■ Conduct regular audits, risk assessments, and employee training.

■ Build specific children’s privacy controls, including age assurance and verification, parental consent, limits on profiling and targeted advertising, and content moderation on relevant products to align with state children’s and teens’ privacy laws.

With new state laws, expanding AI regulations, and evolving children’s privacy obligations, 2026 will demand disciplined data governance and proactive risk management. Companies that invest in comprehensive data privacy compliance programs will not only reduce litigation and regulatory risk but also enhance their brand reputation and build consumer trust.
