Each year, Data Privacy Day provides a necessary reminder of the importance of embedding privacy into all business activities. Falling on January 28, it is perfectly positioned to deliver this reminder just as annual initiatives begin to ramp up. It is prudent to use the opportunity to step back and consider how your organization is doing in its effort to ensure privacy protections for consumers.
It is also helpful to look to the horizon and try to identify what major privacy trends can be expected in the year ahead so you can be proactive with your privacy strategy. The five big privacy trends that we expect are:
- A continuing global patchwork of privacy regulations
- Privacy challenges resulting from AI
- Compliance headaches caused by the shifting ad-tech landscape
- Enforcement focus on sensitive data
- Increased operational compliance overhead due to opacity in data collection
Let’s dive into each of these trends to explore what you should know and how your organization can begin preparing for the privacy needs of 2024.
A Continuing Global Patchwork of Privacy Regulations
When people say “patchwork privacy,” they are usually referring to the current state of privacy regulations in the United States. And this is very true! In the absence of a comprehensive federal privacy regulation, individual state laws dictate obligations in the United States. We enter 2024 with comprehensive privacy laws in effect in five states (California, Virginia, Colorado, Connecticut, and Utah). In 2024, privacy laws will go into effect in four more states (Tennessee, Oregon, Texas, and Montana). And as of January 28, 2024, an additional state, New Jersey, has passed a privacy law; it will join Iowa and Delaware, whose laws go into effect in 2025.
But it’s not just the United States!
Europe already has GDPR and ePrivacy, which we all know and love, but 2024 will see the Digital Markets Act, AI Act, and the EU Data Act all coming into force and carrying obligations related to privacy.
In the Asia-Pacific region, Australia is expected to release proposed amendments to the Australian Privacy Act; India’s Digital Personal Data Protection Act is expected to become fully operational; and other countries, such as Indonesia with its Personal Data Protection Law, will have laws coming into force.
Not to be left out, South American countries are also working towards new and updated privacy legislation, with Argentina, Chile, and Colombia all actively discussing bills to amend data protection laws in their respective countries.
With the complexity of global regulations, it is critical for your business to focus on the fundamentals of understanding (and documenting!) all data being collected, the platforms where data is being sent, and exactly what is being done with the data you have from consumers. This starts with marketers and the business. Maintaining proper documentation for all marketing and advertising activities will make the lives of compliance professionals immeasurably simpler and ultimately will lead to faster time to deployment of new strategies and initiatives as well as reduced compliance risk.
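The documentation fundamentals above can be sketched as a simple data inventory. A toy Python example, where every record name, field, and category is an illustrative assumption rather than any specific regulation's schema:

```python
from dataclasses import dataclass

# Illustrative sketch only: structure and field names are assumptions,
# not drawn from any particular law or compliance tool.
@dataclass
class DataFlowRecord:
    """One documented flow of consumer data to a downstream platform."""
    data_elements: list   # e.g., ["email", "IP address"]
    source: str           # where the data is collected
    destination: str      # platform the data is sent to
    purposes: list        # what is actually done with the data
    legal_basis: str      # e.g., "consent", "legitimate interest"

def undocumented_destinations(observed: set, records: list) -> set:
    """Flag platforms observed receiving data with no documentation on file."""
    documented = {r.destination for r in records}
    return observed - documented

records = [
    DataFlowRecord(["email"], "newsletter signup", "email_service",
                   ["marketing"], "consent"),
]
# A platform receiving data without a record is a gap to investigate.
print(undocumented_destinations({"email_service", "analytics_vendor"}, records))
```

Even a lightweight structure like this gives compliance teams something concrete to review when a new marketing initiative is proposed.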
Privacy Implications of AI
It’s not a stretch to say that every business, no matter how large or small, has some plans related to Artificial Intelligence (AI) for 2024, whether creating new products with AI or adopting new technologies that leverage AI to work more efficiently and effectively. It is important to consider the privacy implications and requirements of all of these plans.
While there is a growing number of AI-specific regulations either coming into force or being crafted, existing privacy requirements are also applicable for AI. Two principal aspects of existing privacy regulations applicable to AI are requirements related to data inputs and requirements related to automated decision making.
First, consider the data inputs going into AI systems and machine learning algorithms. The data must be collected compliantly, and its use must align with current privacy requirements. For compliant collection, this means consumers must be made aware of what data is being collected from them and the purposes for which it is used, including purposes involving AI. Often, consent for collection and use by the AI system must be obtained and recorded before data inputs can be lawfully used. For any activities involving personal data, consumers’ privacy rights still apply. A recent example is a GDPR complaint raised by security and privacy researcher Lukasz Olejnik with Poland’s data protection authority regarding the processing of his personal data by ChatGPT. Olejnik submitted a Data Subject Access Request (DSAR) to learn what personal data OpenAI holds about him, a request that remained unfulfilled after three months. While AI systems operate differently than traditional data modeling, making these types of requests much more difficult to fulfill, the rights of consumers still apply. Consider the data being used and the obligations attached to that data before investing time and resources in potentially non-compliant initiatives.
The second privacy principle to consider for AI concerns requirements related to automated decision making. Much of the promise of AI systems is their ability to ingest information and then use it to automate actions and outputs. Any use of systems resulting in automated decision making that impacts a consumer carries heightened requirements in nearly every piece of privacy legislation globally. Factors that must be considered include potential bias, the nature of the impacts on the consumer resulting from the automated decision-making process, and adequate disclosure to the consumer regarding the use of these systems to make decisions (disclosure that must happen at the point of data collection).
It’s fun to play with new toys, and often those toys can greatly enhance the quality of work being done, but making sure the privacy rights of consumers are respected is of critical importance. Expect heightened enforcement of privacy requirements in these new systems, and make awareness of and adherence to these principles a focus for any business using the technology.
Compliance Headaches from the Shifting Ad Tech Landscape
With Google beginning the process of sunsetting support for third-party cookies in Chrome on January 4, 2024, there is no shortage of discussions and think pieces about the “cookieless future”. All of these discussions revolve around advertisers’ questions about how to accomplish their targeting and measurement strategies once their favorite tool is no longer available. Common answers often involve new browser APIs, new AI capabilities in ad-tech platforms, alternative identifiers, and new platforms such as Data Clean Rooms and Customer Data Platforms. New approaches are great! They’re fun! But they also require brand-new technical knowledge and compliance reviews.
In January, we’ve already seen a clear example of this with Google Consent Mode. For Google to comply with its new obligations under the EU’s Digital Markets Act, advertisers are required to implement a new technology, Consent Mode, in order to use data collected from EEA users via Google marketing platforms on their websites and applications for advertising. This has led to a mad dash for compliance teams to understand what Consent Mode is, its implications for data collection, what data is sent to Google, and how the changes impact current policies and processes related to privacy strategy for data collection. [Side note: if you have questions on this, check out our resource page to learn more.] This same process will repeat itself throughout the year for the plethora of new technologies entering the market.
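Conceptually, Consent Mode keys collection behavior off per-purpose consent signals (the real implementation uses `gtag('consent', 'default', ...)` and `gtag('consent', 'update', ...)` calls with signals such as `ad_storage` and `analytics_storage`). A toy Python sketch of that gating idea, purely as an illustration of the concept rather than Google's actual API:

```python
# Toy model of consent gating (concept only; not the real gtag API).
# The purpose keys mirror real Consent Mode signal names; the rest of
# this sketch is an illustrative assumption.

# Default everything to "denied" until the user makes a choice, as a
# consent banner integration typically would.
consent_state = {"ad_storage": "denied", "analytics_storage": "denied"}

def update_consent(updates: dict) -> None:
    """Record the user's choices, e.g. after a banner interaction."""
    consent_state.update(updates)

def send_event(event: dict, purpose: str):
    """Only forward data for purposes the user has granted."""
    if consent_state.get(purpose) == "granted":
        return event   # forwarded to the advertising platform
    return None        # dropped (or, in practice, sent in a cookieless form)

assert send_event({"event": "purchase"}, "ad_storage") is None
update_consent({"ad_storage": "granted"})
assert send_event({"event": "purchase"}, "ad_storage") is not None
```

The compliance questions listed above map directly onto this sketch: which signals default to denied, what triggers an update, and what (if anything) is still transmitted while consent is denied.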
Enforcement Focus on Sensitive Data
Enforcement bodies have signaled, via both enforcement actions and published enforcement strategies, that a heightened focus will be placed on various types of sensitive data. Chief among these are protections for data collected from children and for data related to health. In the United States specifically, a process is underway to review the Children’s Online Privacy Protection Act and recommend amendments that would further raise the duty of care and the penalties for violations involving children’s data. For health data, 2023 already saw an uptick in FTC enforcement actions involving health information, along with new state laws such as Washington’s My Health My Data Act, which extends many of the protections typically associated with HIPAA to entities HIPAA does not cover. Expect this focus to continue.
For organizations, this means it is critical to identify if any sensitive data is being collected and used throughout the organization. This includes things like demographic information and granular location data in addition to health and biometric information. It’s impossible to protect that which you don’t know you have. Take an inventory of data collection across digital assets to identify any instances of sensitive data collection and make sure applicable laws and requirements are being followed.
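The inventory step above can be sketched as a simple check of collected fields against sensitive categories. A hypothetical Python illustration, where the category list is an assumption for demonstration and not an exhaustive legal definition under any statute:

```python
# Illustrative only: real sensitive-data categories are defined by each
# applicable law, and field names vary by organization.
SENSITIVE_CATEGORIES = {
    "health": {"diagnosis", "prescription", "heart_rate"},
    "biometric": {"fingerprint", "face_scan"},
    "precise_location": {"gps_lat", "gps_lon"},
    "children": {"child_age", "parental_consent_flag"},
}

def flag_sensitive_fields(collected_fields: set) -> dict:
    """Map each sensitive category to the collected fields it covers."""
    hits = {}
    for category, fields in SENSITIVE_CATEGORIES.items():
        overlap = collected_fields & fields
        if overlap:
            hits[category] = sorted(overlap)
    return hits

# Run this against the fields actually observed across digital assets.
print(flag_sensitive_fields({"email", "heart_rate", "gps_lat"}))
```

Any category that comes back flagged is a prompt to confirm the applicable laws and requirements are being followed for that data.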
Increased Operational Compliance Overhead Due to Opacity in Data Collection
A trend that picked up in the back half of 2023 and is only growing is organizations exploring a migration to server-side tag management. Put simply, server-side tag management shifts data collection and distribution from happening directly in a user’s browser or device to collection and distribution happening from an organization’s owned server environment. There are many benefits associated with this type of architecture, including an opportunity to improve compliance strategy by asserting full control over what data is sent to what third-party platforms. But there are also risks related to less visibility into the flow of consumer data. Consumers lose the ability to see all of this data collection directly, instead having to fully rely on disclosure and assurances from the business they are interacting with.
This by no means disqualifies the approach! On the contrary, for many businesses the benefits afforded by a server-side architecture will greatly outweigh the risks. But it does mean that compliance policies and processes must be shored up. New methods of monitoring data flows, documenting data practices, and instituting controls to ensure compliance are necessary. This means more work for the privacy team! As this architecture becomes more prevalent and pilots graduate to scale, compliance professionals need to be educated on the nuances of how it all works and be involved in the design process. Organizations must ensure that operational and technical safeguards are put in place and that actual practices can be tested, establishing a defensible position against legal challenges.
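The control point a server-side architecture provides can be sketched as an allowlist filter applied before any data leaves your environment. A minimal Python illustration, where all platform names and field lists are assumptions:

```python
# Minimal sketch of server-side tag management's control point: the
# server decides exactly which fields each third party receives.
# Destination names and allowlists below are illustrative assumptions.
ALLOWLISTS = {
    "analytics_vendor": {"page_url", "event_name", "timestamp"},
    "ad_platform": {"event_name", "campaign_id"},
}

def forward_payload(destination: str, event: dict) -> dict:
    """Strip any field not explicitly allowed for this destination."""
    allowed = ALLOWLISTS.get(destination, set())
    return {k: v for k, v in event.items() if k in allowed}

event = {
    "page_url": "/checkout",
    "event_name": "purchase",
    "email": "user@example.com",   # never leaves the server
    "timestamp": 1706400000,
}
print(forward_payload("ad_platform", event))
```

Because consumers can no longer observe these flows in their browsers, logging and auditing what this filter actually forwards becomes part of the documentation burden described above.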
2024 is shaping up to be a year of education. The advertising technology industry has not seen such a radical shift in the underlying architecture of the web in 30 years. Everyone is learning together. In such a dynamic environment, it’s important to focus on the things that never change: the fundamentals. Marketers and advertisers need to be explicit about exactly what data is being collected, how that data is being used, and which platforms (and features within each platform) are involved in realizing the desired outcomes. It’s easy to make mistakes when so much is new. Focusing on these fundamentals and working closely with compliance teams when designing and testing new technology is critical to staying off the wrong side of compliance enforcement and off the front page of the news.