
The 2026 Privacy Landscape: A Practical Guide for In-House Counsel

This article is adapted from the Decrypted podcast series, a Barnes & Thornburg production on data security and privacy. To subscribe to Decrypted, click here.

More than 20 states have now enacted comprehensive privacy laws. Add expanding enforcement, broader sensitive data definitions, and new protections for teens, and the privacy compliance landscape in 2026 looks very different than it did just a few years ago — and in-house attorneys need to keep pace.

For in-house counsel, the challenge is no longer just keeping up with new laws. It is also ensuring that existing programs are still fit for purpose as regulatory interpretations evolve, sensitive data categories expand, and regulators focus on emerging issues around artificial intelligence, biometric data, and the digital lives of teenagers. This article provides a practical overview of what has changed, where the pressure points are, and how to build a program that holds up over time.

A Landscape That Has Fundamentally Shifted

Many organizations completed meaningful privacy compliance work around 2018–2020—updating policies for GDPR, then overhauling practices for CCPA—and have not revisited those programs in a structured way since. That gap matters, because the regulatory environment they were designed for no longer exists.

By the end of 2025, more than 20 U.S. states will have enacted comprehensive privacy statutes. Laws in Indiana, Kentucky, and Rhode Island are among the most recent to take effect, with additional states expected to follow. In the absence of a federal privacy law, state legislatures have filled the void independently, producing a patchwork of requirements that vary by jurisdiction but share a common architecture: consumer rights to access, correct, delete, and port their data; opt-out mechanisms for targeted advertising and data sales; and meaningful transparency obligations around collection and use.

Critically, the more recent wave of legislation has consistently pushed in a more demanding direction. Newer statutes tend to impose stricter requirements around opt-out signals—including obligations to honor Global Privacy Control (GPC) and similar browser-level signals—and broader definitions of what constitutes “sensitive” personal information. Organizations whose programs were calibrated to 2020 standards are likely operating with material gaps.
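Mechanically, the GPC signal is straightforward to detect: under the Global Privacy Control specification, a participating browser sends the HTTP request header `Sec-GPC: 1` (and exposes `navigator.globalPrivacyControl` to page scripts). The sketch below shows a minimal server-side check in Python; the `gpc_opt_out` helper is illustrative, and production code would rely on the web framework's case-insensitive header accessors and then route the user into the organization's actual opt-out workflow.

```python
def gpc_opt_out(headers: dict) -> bool:
    """Return True if the request carries a Global Privacy Control signal.

    Per the GPC specification, a user agent expressing the signal sends
    the request header `Sec-GPC: 1`. Any other value, or its absence,
    means no signal was expressed. (Real frameworks normalize header
    case; this plain-dict sketch assumes the canonical spelling.)
    """
    return headers.get("Sec-GPC", "").strip() == "1"

# Several state laws require treating the signal as a valid opt-out of
# targeted advertising and data sales for that user.
if gpc_opt_out({"Sec-GPC": "1"}):
    # e.g., suppress ad-tech tags and record the opt-out preference
    pass
```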

Key Takeaways

  • 20+ states now have comprehensive privacy laws. If your program was built for GDPR or CCPA and hasn’t been updated since, it almost certainly does not reflect current requirements.
  • Newer laws follow the Virginia/Colorado model, not CCPA. CCPA is increasingly an outlier. Programs built around California’s framework may not translate cleanly to the broader state law landscape.
  • Enforcement is accelerating. Regulators are no longer just establishing rules—they are actively testing and enforcing them, including under statutes that have been on the books since 2020.

Why Enforcement Changes the Calculation

Passing new laws is one thing. Enforcing them is another—and the enforcement environment has shifted materially. Regulatory agencies are bringing more cases, under more statutes, with greater focus on specific data practices that have attracted public concern. Each enforcement action adds specificity to the compliance landscape: it signals what regulators are prioritizing, how they are interpreting existing requirements, and what standards they expect organizations to meet.

Enforcement is also not limited to recently enacted laws. Regulators continue to bring actions under older statutes, and the growth of artificial intelligence has created new pressure. Agencies are applying existing privacy frameworks to AI-driven data practices—scrutinizing how organizations collect and use biometric data, whether AI systems are drawing on personal information in ways users did not anticipate, and how consumer data is being used to train or inform algorithmic systems. These are not hypothetical risks. They are active enforcement priorities.

For organizations that have assumed a low enforcement profile based on past experience, that assumption deserves reconsideration. The cost of non-compliance—fines, litigation exposure, operational disruption, and reputational harm—is becoming more concrete, and the regulatory appetite for enforcement continues to grow.

Key Takeaways

  • Enforcement expectations have risen even where the law hasn’t changed. A program that was adequate in 2020 may now be deficient not because new laws passed, but because regulatory interpretation has evolved.
  • AI is drawing regulatory attention to existing privacy requirements. Biometric data, consumer profiling, and the use of personal information in AI systems are active areas of scrutiny under current law.
  • Non-compliance costs are real and increasing. Fines, litigation, and business disruption from enforcement actions are no longer theoretical. They are happening.

The Core Principles That Run Across Every Jurisdiction

Despite the complexity of navigating more than 20 state statutes, the underlying requirements are more consistent than they appear. Every comprehensive privacy law—regardless of the state—rests on a common set of principles. Organizations that build their programs around these fundamentals will be far better positioned to absorb new requirements as they emerge, rather than rebuilding from scratch each time a new law takes effect.

  • Notice and transparency. Every statute requires organizations to inform individuals about what personal data is collected, why it is collected, and who it is shared with. These disclosures must accurately reflect actual practices. A privacy policy that does not correspond to what is happening operationally offers little legal protection and may itself be a violation.
  • Data inventory and internal controls. Effective compliance begins with knowing what data the organization actually holds. This requires genuine engagement across business functions—marketing, human resources, sales, product—each of which collects personal information for different purposes. That understanding needs to be reduced to written policies and procedures. In the event of a breach, enforcement action, or litigation, documented governance is what demonstrates due diligence.
  • Vendor management. Every third party that receives personal information from the organization must be subject to a written Data Processing Agreement (DPA) governing the terms of that transfer. DPAs should be reviewed by qualified counsel—not simply accepted as presented by the vendor. The terms matter, and they vary significantly.
  • Ongoing governance. A data inventory completed in 2023 reflects 2023. As organizational data practices evolve, new vendors are engaged, and new laws take effect, the compliance program must keep pace. Privacy governance needs to be embedded in regular business operations, not treated as a periodic project.

Key Takeaways

  • The principles are consistent across jurisdictions. Organizations anchored to notice, transparency, data minimization, consent, and vendor oversight will absorb new laws far more efficiently than those responding to each statute in isolation.
  • Documentation is not bureaucracy—it is your program. When something goes wrong, written policies and procedures are the evidence of a good-faith compliance effort.
  • DPAs require real attention. Rubber-stamping vendor agreements is a common and significant gap. Someone who understands what they are reviewing needs to look at these.

Sensitive Data: An Expanding and Consequential Category

One of the most significant substantive changes in recent privacy legislation is the expansion of the “sensitive” personal data category. Historically, sensitive data meant financial account numbers, government identifiers, health information, biometric data, and precise geolocation. Those categories remain sensitive—but recent state laws have extended the designation to types of information that were previously treated as ordinary personal data.

Categories now classified as sensitive under one or more state statutes include certain online activity, information relating to union membership, and data concerning aspects of an individual’s personal or sexual life. These may not be the first things that come to mind when privacy teams conduct their data inventories—but they carry significant legal consequences. In most jurisdictions, collecting sensitive information requires affirmative opt-in consent from the individual, rather than the opt-out mechanisms that apply to general personal data. Organizations that have been collecting this information without the requisite consent may no longer be able to lawfully process it.

This issue is particularly acute in the human resources and employment context. Legacy HR intake forms, job applicant questionnaires, and employee databases frequently contain categories of information that have since been reclassified as sensitive, and that were gathered without the consents or disclosures now required. If that data is still in the organization’s systems, decisions need to be made about whether it can continue to be processed—and if not, how it should be handled.

Key Takeaways

  • Sensitive data definitions have expanded significantly. Information once treated as ordinary personal data—certain online activity, union membership, aspects of personal life—now requires opt-in consent in many states.
  • HR and employment data is a common problem area. Older intake forms and applicant databases often contain newly sensitive categories collected without adequate consent or notice.
  • Audit your data inventory against current definitions. Identify what you have, confirm the legal basis on which it was collected, and address any data that can no longer be lawfully processed.
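An inventory audit of this kind can be partly automated once records are structured. The sketch below flags inventory entries whose category is newly classified as sensitive but that lack documented opt-in consent. The category names and the record schema are hypothetical; a real audit would map each category to the definitions in every statute that applies to the organization.

```python
# Hypothetical set of categories reclassified as "sensitive" by recent
# state statutes. A real mapping would be built per jurisdiction.
NEWLY_SENSITIVE = {"union_membership", "online_activity", "personal_life"}

def flag_consent_gaps(inventory: list[dict]) -> list[dict]:
    """Return records needing remediation: a newly sensitive category
    collected without documented opt-in consent."""
    return [
        rec for rec in inventory
        if rec["category"] in NEWLY_SENSITIVE
        and not rec.get("opt_in_consent", False)
    ]

# Illustrative inventory rows (system names are invented):
inventory = [
    {"system": "HR intake", "category": "union_membership", "opt_in_consent": False},
    {"system": "Web analytics", "category": "online_activity", "opt_in_consent": True},
]
gaps = flag_consent_gaps(inventory)
# Flags the "HR intake" record: newly sensitive, no opt-in on file.
```

The output is a remediation list, not a legal conclusion: each flagged record still needs counsel to decide whether fresh consent can be obtained or the data must be deleted.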

Teen and Minor Data: A Rapidly Developing Area

The regulation of data relating to minors is undergoing significant change. For many years, the framework was straightforward: COPPA established protective requirements for children under 13, and individuals 18 and older were treated as adults capable of consenting on their own behalf. The large and largely unregulated space in between—early and mid-adolescence—has become a major area of legislative focus.

Public concern about the effects of digital platforms on teenagers has generated broad political support for extending heightened privacy protections to individuals up to age 18. A growing number of states have enacted or are actively considering legislation in this area, imposing obligations on organizations that collect or process data about adolescents regardless of whether those individuals affirmatively sought out a service marketed to minors.

At the same time, Apple and Google have implemented new app store age-rating and age-verification requirements that are reshaping compliance obligations for developers in practical, immediate terms. Under these frameworks, the app stores verify user ages and pass that information to developers. Developers are then required to apply age-appropriate consent flows, parental authorization mechanisms, and data processing restrictions that correspond to each user’s age rating. For organizations distributing applications through these channels, this is not an abstract compliance question—it is a condition of continued access to the platform.

The previous practice of drawing the line at 13 in a terms of service document and calling it compliance is no longer sufficient. App store operators now provide developers with direct, reliable notice of user ages, and the expectation is that developers will act on that information.

Key Takeaways

  • The 13-and-under framework is no longer the standard. State legislation and app store policy are extending meaningful protections to teenagers up to 18.
  • App store requirements are creating immediate compliance pressure. Non-compliance risks loss of platform access, which is a more immediate consequence than most regulatory enforcement timelines.
  • Review your products and services for minor data exposure. Assess whether you collect data relating to individuals under 18, map applicable requirements by age group and jurisdiction, and update consent flows accordingly.

Building a Program That Holds Up Over Time

The breadth and pace of change in the privacy landscape argue against a compliance model that responds to individual statutes in isolation. Organizations that take that approach will find themselves in a perpetual state of catch-up. A more effective strategy is to build a program grounded in the foundational principles common to virtually every privacy regime—and to layer jurisdiction-specific requirements on top of that foundation as they arise.

This means treating privacy not as a project that gets completed and filed away, but as an ongoing organizational function. Data practices change. New vendors are engaged. New products are built. New laws pass. The compliance program needs to evolve with the organization, which requires building privacy into regular business conversations rather than scheduling a review every few years.

Practically, this means involving business teams—not just legal—in the ongoing work of understanding what data is collected and why. It means asking, at the point of product development or process design, whether specific data is actually needed, how it will be protected, and what the appropriate consent framework looks like. And it means maintaining a data inventory that reflects current reality, not a snapshot from a prior compliance project.

Finally, the case for privacy investment is strongest when it is made in terms leadership cares about. Regulatory and enforcement risk is increasingly concrete and worth quantifying. But equally important is the trust dimension: consumers and employees are paying more attention to data practices than at any prior point, and organizations that treat privacy as a genuine commitment—rather than a compliance minimum—are better positioned in the markets they serve.

Key Takeaways

  • Treat privacy as a function, not a project. A program last updated in 2020 or 2023 is out of date. Compliance requires ongoing engagement, not periodic overhauls.
  • Build on durable principles. Programs grounded in notice, transparency, consent, data minimization, and vendor oversight absorb new legal requirements without requiring a full rebuild.
  • Involve business teams early. Privacy questions are best addressed at the point of product or process design, not after the fact.
  • Make the business case. Enforcement risk, reputational exposure, and consumer trust are all legitimate and increasingly quantifiable reasons to invest in a strong privacy program.

Learn More and Get Our Practical Checklist

Listen to the full episode for a practical checklist on auditing your data map against 2026 standards, plus a preview of upcoming episodes on AI and CIPA litigation.

