
Closing the Data Broker Loophole

Congress must pass legislation that prohibits government agencies from buying their way around the Fourth Amendment and other legal privacy protections.

Published: February 13, 2024

Americans leave a trail of personal data with almost every action we take — every website visit, credit card payment, browser search, and online message generates data. Third parties like cell phone companies, internet service providers, social media platforms, and app developers collect and hold this information, often without our knowledge, and use it to offer tailored products and more personalized recommendations. This data also finds its way into exhaustive dossiers, compiled by data brokers, that reveal the most intimate details of our lives: our movements, habits, associations, health conditions, and ideologies.

Fourth Amendment case law and privacy-related legislation have failed to keep up with this proliferation of data. The lack of comprehensive data protection in the United States and a reliance on “notice and consent” privacy regimes have fostered an age of surveillance capitalism, enabling a shadow digital economy of platforms and third-party data brokers that collect and commodify users’ data. This underregulated data broker ecosystem, built to target consumers with ads, also engenders ever-increasing risks of data breaches; discrimination in housing, employment, and credit decisions; and dragnet surveillance by government agencies that can acquire vast amounts of personal information without legal process.

While all these risks demand action, the government’s ability to obtain troves of information about Americans from private-sector data brokers — including information that the government would otherwise need a warrant, court order, or subpoena to obtain — raises unique concerns. Unlike private entities, government agencies may purchase personal data to further their exercise of coercive powers, including the ability to deport, arrest, incarcerate, or even use lethal force.

Unfettered government access to personal data without judicial or legislative oversight can exacerbate existing biases in law enforcement and intelligence practices, permitting speculative investigations on the basis of constitutionally protected categories and the targeting of marginalized communities. Evidence of this phenomenon abounds, from the Defense Department purchasing location data collected from prayer apps to monitor Muslim communities to police departments purchasing information to track racial justice protesters. As state governments continue to pass strict anti-abortion legislation — laws that disproportionately harm women of color — such misuses will only expand. And as President Biden’s recent executive order on artificial intelligence acknowledges, the integration of new AI tools will make it easier to “extract, re-identify, link, infer, and act on sensitive information about people’s identities, locations, habits, and desires,” amplifying the risks to Americans’ privacy and freedoms of speech and association.

Over the past few years, lawmakers have sought to address the threat that third-party data poses to Americans’ privacy, proposing bills that would limit the collection of location and health information and rein in the government’s purchases of data from third parties. This report describes the major legal loopholes that necessitate such reforms and highlights two legislative proposals that would constrain the government’s ability to acquire large swaths of personal information without legal process. It discusses the proposals’ strengths and potential shortfalls and emphasizes important considerations for legislators when moving forward with these proposals or crafting future ones.

The Data Broker Problem

Among the main purveyors in this surveillance capitalism ecosystem are data brokers — companies that collect, assemble, and analyze personal information to create detailed profiles of individuals, which they then sell. Although most of these companies are not household names, they serve an ever-growing demand; the industry pulled in more than $250 billion in 2022.

Data brokers collect information from various sources. They pay app developers to install code that siphons users’ data, including location information. They use cookies or other web trackers to capture online activity. They scrape information from public-facing sites, including social media platforms, often in violation of those platforms’ terms of service. They also collect information from public records and purchase data from a wide range of companies that collect and maintain personal information, including app developers, internet service providers, car manufacturers, advertisers, utility companies, supermarkets, and other data brokers.

Data brokers and their clients claim that some or all of the data is “anonymized,” but it can often be reidentified when combined with other information. The data can be highly sensitive, and brokers sometimes use algorithmic tools to make additional inferences and predictions about individuals, lumping them into categories on the basis of where they live, their health, their ethnicity or religion, their political affiliation, or their expected levels of spending. Through such means, these companies gather “thousands of attributes each for billions of people,” analyze and repackage that data, and then sell it to buyers.
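The mechanics of re-identification are simple enough to sketch. The toy Python example below shows how “anonymized” location pings, keyed only by an opaque device ID, can be linked to a named individual by cross-referencing where the device spends its nights against public address records. All names, coordinates, and data here are invented for illustration; real re-identification attacks work the same way at far larger scale.

```python
# Hypothetical sketch: re-identifying "anonymized" location data by
# combining it with other information. All data below is invented.

from collections import Counter

# Broker data: pings keyed only by an opaque device identifier.
pings = [
    ("dev_a91f", (40.7128, -74.0060), "02:00"),  # nighttime ping
    ("dev_a91f", (40.7128, -74.0060), "03:00"),  # nighttime ping
    ("dev_a91f", (40.7580, -73.9855), "13:00"),  # daytime ping
]

# Public records linking residential coordinates to names.
residents = {(40.7128, -74.0060): "Jane Doe"}

def reidentify(pings, residents):
    """Guess a device's owner from where the device spends its nights."""
    nighttime = [loc for _dev, loc, t in pings if t < "06:00"]
    if not nighttime:
        return None
    # The most frequent nighttime location is a strong proxy for "home."
    home, _count = Counter(nighttime).most_common(1)[0]
    return residents.get(home)

print(reidentify(pings, residents))  # the opaque device ID resolves to a person
```

The point of the sketch is that no single data set needs to contain a name: the join between two individually “anonymous” sources is what does the identifying.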

Myriad entities buy this information. For example, financial institutions and insurance firms use data for identity verification and risk assessments. Advertising companies use data to offer more relevant and targeted advertisements. More troublingly, data brokers have sold personal information to predatory loan companies, stalkers, and scammers, as well as to political consultants (like Cambridge Analytica in 2016) that can use data to send voters disinformation and attempt to skew electoral outcomes. Data brokers also sell data to foreign actors whose uses of the information are not constrained by U.S. law. And — as highlighted by news reports, civil society organizations, and a recently declassified report commissioned by the Office of the Director of National Intelligence — law enforcement and other government agencies (including state and local law enforcement, the FBI, the IRS, the Drug Enforcement Administration, the Department of Defense, and the Department of Homeland Security) have secretly been paying data brokers to access vast databases of personal information, including geolocation data, without any warrant, court order, or even subpoena.

The Legal Loopholes

Typically, the Fourth Amendment and various federal statutes require government agencies to comply with legal process to obtain personal data on Americans. Yet over the past few decades, government agencies have taken advantage of statutory loopholes and a stalled Fourth Amendment doctrine to access personal information without legal process.

Constitutional Protections: The Fourth Amendment’s Warrant Requirement Post-Carpenter

The Fourth Amendment requires the government to obtain a warrant to access information in which individuals have a “reasonable expectation of privacy.” For decades, however, Fourth Amendment protections were constrained by the third-party doctrine, which holds that people lose any expectation of privacy in information that they voluntarily disclose to others.

In today’s world, where much of our information is stored online and accessible to the third parties that facilitate our digital transactions, the third-party doctrine has proven untenable. Nearly every American carries a cell phone that tracks their every move and stores their most private personal data. Internet service providers store our search and browsing histories, and our documents — once locked away in safes or desk drawers — are hosted by cloud service providers. Often, we disclose information unwittingly: simply using our devices generates metadata, like the location and IP address of a device used to search for or post content, which can then be collected and stored by third parties.

A 2018 Supreme Court decision began to chip away at this unworkable doctrine. In Carpenter v. United States, the Court held that police must have a warrant to obtain seven days’ worth of historical cell-site location records. Specifically, the Court found that individuals have a reasonable expectation of privacy in their cell phone location data because that data reflects the “whole of their physical movements,” which in turn can reveal the most intimate details of their private lives. The Court also recognized that cell phone location information “is not truly ‘shared’ as one normally understands the term,” because cell phones are “indispensable to participation in modern society” and they collect location information automatically. Although the Court declined to explain how its holding might be applied to other types of information, it made clear that the Fourth Amendment protects highly sensitive information conveyed through the use of essential technologies, and that the government must have a warrant to obtain such data.

Nonetheless, several government agencies have argued that the Carpenter decision applies only to the specific type of location data at issue in that case (i.e., historical cell-site location information), and that Fourth Amendment protections apply only when the government compels companies to disclose information, not when private companies sell or voluntarily disclose information to law enforcement. Based on these purported distinctions, agencies are continuing to purchase cell phone location data — without a warrant or any other legal process — in large volumes. The Supreme Court presumably will clarify Carpenter’s applicability in due time, but for now, government agencies are relying heavily on data purchases to sidestep the Fourth Amendment’s central safeguard against abusive policing: the requirement that police obtain a warrant from a judge before invading a reasonable expectation of privacy.

Statutory Protections: Loopholes in the Electronic Communications Privacy Act

In the 1980s and 1990s, Congress sought to extend some protections to information held by third parties, but the resulting patchwork of statutes does not sufficiently safeguard privacy today. Critically, the United States lacks a comprehensive data privacy law. Instead, a piecemeal statutory structure — consisting of an outdated communications privacy law and sector-specific data protection laws — protects certain types of personal information from certain privacy intrusions while leaving other types of data and intrusions unregulated.

In 1986, Congress passed the Electronic Communications Privacy Act (ECPA) to protect the privacy of Americans’ communications in an era of new and emerging communications technologies. As part of ECPA, the Stored Communications Act (SCA) restricts certain private companies from voluntarily revealing digital communications or information about those communications to the government.

Much of ECPA was ahead of its time, but the statute today fails to address many issues created by technologies that were unimaginable to Congress in the 1980s. Specifically, the law applies only to communications-related information held by two categories of service providers:

  • providers of an electronic communication service (ECS), defined as “any service which provides to users thereof the ability to send or receive wire or electronic communications” (i.e., phone and messaging services, social media platforms, and other forms of internet-based messaging); and
  • providers of a remote computing service (RCS), defined as the “provision to the public of computer storage or processing services by means of an electronic communications system” (i.e., data storage and processing services).

Applied today, ECPA covers phone companies, internet service providers, providers of email and text messaging services, and social media platforms (apart from messages that are “readily accessible to the general public”). However, it does not cover third-party data brokers or many app developers that collect and maintain personal information.

Subject to certain exceptions, ECS and RCS providers may not voluntarily disclose the contents of communications to anyone, including the government. The term contents includes “any information concerning the substance, purport, or meaning” of a communication. With respect to non-contents information, ECS and RCS providers may not voluntarily disclose “record[s] or other information pertaining to a subscriber to or customer of such service . . . to any government entity.” This non-contents category is divided into two subgroups: subscriber information (e.g., name, address, and phone number) and other non-contents information (e.g., traffic or transactional information or other communications-related metadata, often referred to simply as communications metadata).

Section 2703 of the SCA conveys the specific legal process that the government must follow to obtain customer information held by an ECS or RCS. The process differs depending on the type of information sought. For example, when the government seeks to obtain the actual contents of electronic communications, generally it must obtain a probable cause warrant. But for some types of non-contents information, the government may obtain a court order based on “specific and articulable facts showing that there are reasonable grounds to believe” that the information is “relevant and material to an ongoing criminal investigation” — a less stringent standard than the probable cause requirement for a search warrant. And for other types of non-contents information (including subscriber information), the government may issue a subpoena, which requires no court approval or order.

As more and more data has been collected and processed online, the line has blurred between what is contents information (i.e., that requiring a search warrant) and what is non-contents information (i.e., that requiring just a court order or subpoena). For instance, whether location data is considered contents or non-contents information can vary based on whether the purpose of a service is to record or communicate geolocation. Google’s position is that its location history data qualifies as contents under ECPA, but some apps may consider such data non-contents information. Moreover, the lesser protection for non-contents information does not always make sense in the modern era, in which some types of non-contents information can be just as revealing as contents — as the American public learned a decade ago, when Edward Snowden revealed that the National Security Agency was collecting Americans’ phone metadata in bulk and experts explained how exquisitely sensitive such data can be.

The uncertainties and incongruities in ECPA’s application are sufficient cause for Congress to amend the SCA to ensure adequate protections for information produced through new technologies. But perhaps more troublingly, two major gaps in ECPA together have enabled government agencies to sidestep the law’s privacy protections altogether. First, ECPA’s definitions of ECS and RCS providers do not cover many app developers or third-party data brokers that collect and maintain data. Thus, the statute does not prohibit those entities from selling or otherwise disclosing information to the government. Second, ECPA permits ECS and RCS providers to voluntarily disclose non-contents information to nongovernmental third parties, like data brokers, who can then sell or share that data with government agencies. As a result, government agencies can purchase or acquire individuals’ information from data brokers or other entities not covered by ECPA even though those agencies would need a warrant, a court order, or a subpoena to access that very same data from ECS or RCS providers.

Recent Proposals That Affect Government Purchases of Data

As more reports of government agencies using data brokers to purchase Americans’ personal information have surfaced, Congress has considered legislative proposals that would limit the government’s practice of circumventing the Fourth Amendment and other privacy protections by buying and accessing personal data without legal process. This section highlights two of those proposals and examines their strengths and potential shortcomings.

The first, the Fourth Amendment Is Not For Sale Act (FAINFSA), seeks to address the ECPA loophole by barring law enforcement and intelligence agencies from purchasing communications-related and geolocation data from any company that collects that information in certain ways. The second, the American Data Privacy and Protection Act (ADPPA), is a comprehensive federal consumer privacy bill that promises to reduce the amount of personal information flowing into and out of the hands of data brokers by restricting the collection of such information to only that necessary to provide a service or achieve a specific, enumerated purpose, and by placing additional limits on data transfers.

The Fourth Amendment Is Not For Sale Act

First introduced in April 2021 by Sens. Ron Wyden (D-OR) and Rand Paul (R-KY) and cosponsored by 18 other senators, FAINFSA was reintroduced in 2023 with a companion bill in the House, cosponsored by Reps. Jerry Nadler (D-NY), Warren Davidson (R-OH), and six other lawmakers. The House Judiciary Committee voted almost unanimously (with the exception of one member who voted “present”) to report the bill out of committee in July 2023, and the bill’s language was included in the Protect Liberty and End Warrantless Surveillance Act, which was reported out of the House Judiciary Committee by a vote of 35–2 in December 2023.

FAINFSA aims to address Carpenter’s uncertain applicability to data purchases and amend ECPA to bar government agencies from purchasing certain types of information. Specifically, FAINFSA adds a provision to the SCA that prohibits government agencies from buying covered records from third parties who collect the information from specified sources: ECS or RCS providers; intermediary service providers (i.e., internet backbone companies like AT&T or Verizon that deliver, store, or process communications for or on behalf of ECS or RCS providers); online accounts with ECS or RCS providers; or electronic devices. Covered records include communications non-contents information relating to a subscriber or customer of an ECS or RCS provider, communications content, and geolocation data.

FAINFSA also prohibits the government from purchasing “illegitimately obtained information” — namely, information that third parties obtain through deceit, through unauthorized access to a device or online account, or (when obtained from ECS or RCS providers) in violation of the provider’s terms of service or privacy policies. The bill thus prevents the government from buying data from companies like Clearview AI and Voyager Labs, which scrape photos and data from social media platforms in violation of their terms of service and sell that data — or access to it through data-matching services — to government agencies. The bill applies to downstream data transfers, meaning that FAINFSA prohibits data purchases regardless of whether a third party initially obtained the information or received it from another third party. Furthermore, covered data includes so-called anonymized information that, if combined with other information, could be used to identify a person.

FAINFSA’s prohibitions could be stronger. For one, the bill only covers a limited universe of data: communications content, geolocation information, and non-contents information pertaining to a consumer or subscriber of an ECS or RCS provider. It does not purport to cover health, financial, or biometric information or other types of sensitive data. Even within the realm of communications-related or geolocation information, the bill has some holes. For instance, it does not cover communications metadata (other than geolocation information) collected by apps that do not qualify as ECS or RCS providers (e.g., health and fitness apps). And it applies only to data acquired by third parties from ECS or RCS providers, from intermediary service providers, from a person’s “online account” with an ECS or RCS provider, or from or about an electronic device. Whether these categories would cover data purchased by a third party from an entity other than an ECS or RCS provider or intermediary service provider is unclear.

In addition to these coverage limitations, FAINFSA only bans the government from obtaining data “in exchange for anything of value.” That definition includes information received “in connection with services being provided for consideration,” which would bar government entities from accessing information through data-matching services such as those that Clearview AI offers. However, if read narrowly to include only fee-based consideration or other financial compensation, FAINFSA could still leave room for third parties to voluntarily disclose or grant access to personal data to government agencies without payment. For example, Amazon’s efforts to get law enforcement to promote its Ring cameras in exchange for user data might not be covered. Companies also might disclose data without a fee in the hope of currying favor to avoid regulation or to obtain government contracts for other services.

Finally, FAINFSA would go a long way toward prohibiting the government’s purchase and use of certain highly sensitive information from data brokers. But it would not address the overcollection of data or the trafficking of personal information to other, nongovernmental third parties — practices that will likely intensify with the proliferation of AI models reliant on vast data sets. Without comprehensive limitations on companies’ collection and transfer of personal information, private actors could continue to purchase data to skew elections, harass abortion-seekers, or stalk domestic violence survivors. Foreign governments could also purchase exhaustive dossiers on American citizens for purposes of espionage recruitment or other malicious reasons.

The American Data Privacy and Protection Act

The ADPPA is, in some ways, the converse of FAINFSA. It proposes more sweeping privacy protections that would reduce the amount of personal information collected by companies at the outset and place stronger restrictions on the transfer of personal data to third parties generally. But it includes exceptions and potential loopholes that still enable government access to sensitive data without legal process.

Introduced by Rep. Frank Pallone (D-NJ) in June 2022 and cosponsored by Reps. Cathy McMorris Rodgers (R-WA), Jan Schakowsky (D-IL), and Gus Bilirakis (R-FL), the ADPPA is a comprehensive federal consumer privacy bill that establishes requirements for how companies, including telecommunications “common carriers” and nonprofits, collect, process, and transfer personal data. The ADPPA advanced out of the House Energy and Commerce Committee on a bipartisan 53–2 vote in July 2022, but it has not yet been reintroduced this Congress.

Unlike past notice-and-consent regimes, the ADPPA takes the onus of reading and accepting complicated privacy policies off the individual and instead imposes a baseline duty on companies to refrain from collecting an individual’s personal data unless they need it to provide a product or service to that individual. In practice, this model would reduce the amount of data available to data brokers and thereby limit the amount of personal information that data brokers could share with third parties, including government agencies. Not surprisingly, the data broker industry has lobbied fervently against the ADPPA.

The bill also includes prohibitions on the transfer of sensitive personal information (defined below) to third parties without the customer’s consent, and it allows customers to opt out of the transfer of nonsensitive data. However, as discussed below, the ADPPA includes certain exceptions that threaten to swallow the rule when it comes to government access to personal data.

Limitations on the Collection, Processing, and Transfer of Personal Information

The ADPPA defines personal information (or “covered data”) broadly to include inferences (or “derived data”) and any information that “identifies or is linked or reasonably linkable, alone or in combination with other information, to an individual or [an individual’s] device.” To the extent that companies seek to collect or transfer “de-identified data” (i.e., information that does not identify a distinct individual or device), the ADPPA requires companies to take “reasonable technical measures to ensure that the information cannot, at any point, be used to re-identify any individual or device,” and to “contractually obligate[]” any person or entity that receives the de-identified data to comply with that requirement.

Similarly, the bill defines “sensitive covered data” fairly broadly, and it permits the Federal Trade Commission (FTC) — the ADPPA’s primary enforcer — to add categories through rulemaking. Sensitive data includes precise geolocation, biometric and genetic information, health information, sexual behavior, private communications and any related metadata, race, religion, ethnicity, and union membership. Sensitive data also includes “information identifying an individual’s online activities over time and across third-party websites or online services,” which would cover cookie data or other tools that track users’ browsing or search activities across the web (and off a company’s specific website). In practice, this last prohibition would curb the practice of tracking users’ browsing activity to build a dossier that could be sold to advertisers, data brokers, and other third parties, including governments. But whether individual search queries that reveal highly sensitive information would be covered is not clear. Congress should clarify that the definition of sensitive covered data includes individual search queries that reflect or pertain to a category of information that would itself meet the bill’s definition of sensitive (e.g., searches pertaining to sexual behavior, race, religion, ethnicity, or union activity).

The ADPPA establishes baseline “data minimization” limitations on how companies collect, process, and transfer personal information. For personal information in general, covered entities may only collect, process, or transfer what is “reasonably necessary and proportionate” to deliver a product or service requested by the individual or to effectuate one of 17 specified purposes (for a complete list of these permissible purposes, see Appendix). Notably, companies may not collect personal information simply for advertising purposes; to provide advertising, companies may process or transfer only data previously collected for other purposes. In addition, individuals have the right to opt out of targeted advertising and the transfer of nonsensitive information to third parties, including through a centralized, user-friendly opt-out mechanism.

As to the collection and processing of sensitive personal information, the ADPPA is more restrictive: companies may only collect and process what is “strictly necessary” to provide a product or service requested by the individual or to effectuate one of 14 out of the 17 purposes that apply to nonsensitive information. Notably, one of the excluded purposes is targeted advertising, which has been a driving force behind the overcollection and transfer of sensitive information to data brokers.

The bill is also more restrictive on transfers of sensitive personal information to third parties. Specifically, it prohibits transfers unless an individual opts in to the transfer via “affirmative express consent” or unless one of six exceptions applies. Companies must request consent from individuals in a “clear and conspicuous standalone disclosure” with a clear description of the data subject to the request. The option to refuse consent for the collection, processing, or transfer of covered data must be “at least as prominent as the option to accept, and the option to refuse consent shall take the same number of steps or fewer as the option to accept.”

Moreover, the ADPPA mandates that third parties may not use or transfer sensitive information for any purpose other than that to which the user consented (or for one of three exceptions). In practice, those constraints would prohibit data brokers from, for instance, taking data collected to authenticate a user and selling it for advertising purposes. Taken together, these heightened restrictions would limit some of the most harmful business practices that lead to the overcollection and out-of-context secondary uses of personal data, including the sale to and use by data brokers trafficking in consumer profiles.

In addition, the ADPPA gives individuals more control over their data and imposes transparency requirements on data brokers (defined as entities whose principal source of revenue derives from processing or transferring personal information not collected directly from individuals). Specifically, the bill grants individuals the right to access, delete, correct, and move their personal information. Also, as noted, individuals have the right to opt out of (for nonsensitive information) or in to (for sensitive information) the transfer of information to third parties. And the ADPPA requires data brokers to register with the FTC and be included in a searchable, publicly available central registry, whereby individuals may “easily submit a request” to delete all data collected by those data brokers and ensure that they no longer collect that individual’s data.

In short, the ADPPA’s principles of data minimization, user data rights, and heightened restrictions regarding sensitive information would limit overcollection and secondary uses of personal information, including the sale to and use by data brokers, advertising firms, private actors, and (to some degree) government agencies. State and federal law enforcement agencies recognize this fact, noting in an opposition letter to Congress that the ADPPA would “likely complicate the private sector’s ability to continue its ongoing efforts to cooperate and voluntarily share certain information with law enforcement” to “generate leads.”

Problematic Law Enforcement Exceptions

Nonetheless, various exceptions and gaps in the ADPPA could leave open troubling avenues for the government to obtain personal information without legal process. As a general matter, the ADPPA’s data minimization principles do not apply to government agencies, which are excluded from the law’s definition of “covered entities.” The bill also exempts service providers that collect, process, or transfer information provided by or on behalf of government entities from restrictions on nonsensitive data collection, processing, and transfer. The bill’s definition of service providers is broad enough to allow expansive collection, processing, and transfer of nonsensitive data by data brokers acting under contract with a government agency — restricted only by the terms of that contract. And while it generally prohibits service providers from combining the data they collect on behalf of an entity with personal data collected for other purposes, the bill does not prohibit service providers from combining data sets if necessary to effectuate 15 permissible purposes, including the law enforcement exceptions discussed below.

Additionally, the ADPPA includes several exceptions that would permit companies to collect or process personal data, and in some cases to transfer data to the government without legal process, for law enforcement-related purposes. To start, the bill allows service providers “acting at the direction of a government entity” or covered entities providing “a service” to a government entity to process and transfer personal data (including sensitive data) — and possibly to collect such data, per certain ambiguous language in Sections 101 and 102 — “to prevent, detect, protect against or respond to a public safety incident, including trespass, natural disaster, or national security incident” (albeit only “insofar as authorized by statute”). The ADPPA also permits those same entities to process and transfer personal data previously collected for other purposes in order to prevent, detect, protect against, or respond to a public safety incident.

These provisions would undo many of the bill’s promised protections, allowing law enforcement to access vast amounts of data to “generate leads” without specific, articulable, and credible facts demonstrating the existence of a public safety concern. Although the bill tries to mitigate this possibility by explicitly prohibiting “the transfer of covered data for payment or other valuable consideration to a government entity,” this language might not prohibit the voluntary transfer of personal information for nonfinancial benefits, as discussed above.

Future iterations of the ADPPA should close this loophole. First, the bill should make clear that these entities may not collect data for this public safety purpose; rather, they may only process and transfer data previously collected for other purposes. Such a limitation would bar entities from exploiting the public safety exception to justify broad collection of personal information. Companies working with the government would still be able to collect relevant data from publicly available sources, as publicly available information does not meet the ADPPA’s definition of covered data.

Furthermore, rather than presenting a nonexhaustive list of examples of public safety incidents, the ADPPA should define that term to mean criminal activity affecting public safety (which would encompass the “national security” and “trespass” examples in the current bill), natural disasters, or threats to public health. These terms would capture the public safety incidents that are of legitimate concern — such as bomb threats or other violent, criminal acts, as well as crises stemming from pandemics, earthquakes, hurricanes, floods, or wildfires — while ensuring that covered entities cannot stretch the term “public safety incident” to include nonviolent protests or other lawful activity. Lastly, the ADPPA should prohibit personal information transfers unless (1) the covered data directly pertains to, and could reasonably be expected to assist the government in addressing, a specific and significant threat to public safety; or (2) the government obtains the warrant, court order, or subpoena that would be required to compel production of the information.

In addition to the public safety incident exception, the ADPPA includes broad exceptions that allow the collection and processing of personal data, including sensitive data, if necessary “to prevent, detect, protect against, or respond to” security incidents (with security defined to encompass network security, physical security, and life safety), fraud, harassment, or illegal activity (defined as a felony or misdemeanor “that can directly harm”). These exceptions also permit the transfer of nonsensitive personal data. Read broadly, these allowances would let companies collect masses of data under the general guise of preventing fraud or averting “security incidents.” Indeed, data brokers like RELX have historically exploited fraud prevention to justify bulk collection of personal information, which they then sell to advertisers, law enforcement agencies, and other third parties. Combined with the above exception regarding the transfer of previously collected data to law enforcement for “public safety” purposes, these permissions would grant law enforcement access to huge amounts of data without legal process.

To disincentivize such overcollection and secondary use of data, the ADPPA should prohibit the transfer of this data for payment or other valuable consideration to a third party, including a government entity (borrowing from the similar prohibition in the public safety incident exception). And the bill should make clear that personal data acquired for these purposes may not be used or transferred to third parties for other purposes, or to law enforcement, unless (1) the covered data directly pertains to, and could reasonably be expected to assist the government in addressing, a security incident, fraud, harassment, or illegal activity; or (2) the government obtains the warrant, court order, or subpoena that would be required to compel production of the information.

Finally, the ADPPA lets companies refuse an individual’s request to access, delete, correct, or move personal data if doing so would “interfere with law enforcement.” This wording, which understandably seeks to prevent criminals under investigation from deleting or changing evidence of their illegal activity, is vague and overbroad. An entity that collects information (and that has a financial incentive to maintain the information) could always posit that the information might prove useful to a future law enforcement investigation. This language also largely duplicates another, similarly overbroad ADPPA provision that permits companies to refuse an individual’s request to delete or correct information if it interferes with investigations or “reasonable efforts to guard against, detect, prevent, or investigate fraudulent, malicious, or unlawful activity.” Future proposals should replace these provisions with one allowing companies to refuse an individual’s request to access, delete, correct, or move personal data if the information that is the subject of the request reasonably appears to reflect or relate to fraud, harassment, or unlawful activity, or if there are specific, articulable facts indicating that compliance with the request would interfere with an ongoing law enforcement investigation.

The ADPPA is a promising template, but it should be strengthened. Congress should amend it to make clear that covered entities, service providers, and third parties may not voluntarily transfer personal information to law enforcement when there is no clear indication of a specific threat to public safety, a security incident, fraud, harassment, or illegal activity. Moreover, the bill currently preempts state law, which would impede states from providing stronger privacy protections against government access to data or against harms arising from advancements in technologies like AI or other algorithmic models. States have long been laboratories of democracy, generating innovative policy ideas that can later be adopted at the federal level. Future proposals should preserve that flexibility, perhaps by granting states a waiver to set higher standards or by lifting preemption after a fixed term (e.g., five years) to allow them to address new privacy challenges.

Conclusion

Congress must act to bring its patchwork of privacy-protecting statutes in line with the modern world and prohibit government agencies from sidestepping the Fourth Amendment.

A good starting point would be closing the ECPA loophole by barring government agencies from obtaining communications-related information and geolocation data from app developers and third parties, including data brokers, without legal process. These efforts are already underway and enjoy broad bipartisan support. Even so, Congress should expand existing legislation to cover communications metadata collected by all entities, as well as data obtained from entities other than ECS or RCS providers. And statutes should focus not just on the transfer of data for monetary value but also on nontraditional partnerships between private entities and government agencies that supply data for nonfinancial compensation. Such legislation could serve as a model for other sector-specific reforms addressing health data, financial data, biometrics, and other types of highly sensitive information.

Additionally, Congress should pass a comprehensive privacy law that limits the unfettered collection, processing, and transfer of personal data that has driven the surveillance capitalism ecosystem for decades. That law should be based on principles of data minimization and user data rights. It should not, however, impede states from affording stronger protections to address technological advances and future abusive data practices. Moreover, that law must reach disclosures to government agencies and explicitly prohibit transfers of data to law enforcement absent clear indications of a threat to public safety, a security incident, fraud, harassment, or illegal activity, or unless the government has followed the legal process required for compelled disclosure.

Appendix

ADPPA’s Permissible Purposes for Collection, Processing, or Transfer of Data

Permissible Purposes for Collection and Processing of Sensitive and Nonsensitive Data (underlined purposes are permitted for nonsensitive data only) and Transfer of Nonsensitive Data:

  1. To initiate, manage, or complete a transaction or fulfill an order for specific products or services requested by an individual, including any associated routine administrative, operational, and account-servicing activity such as billing, shipping, delivery, storage, and accounting.
     
  2. With respect to covered data previously collected in accordance with this Act, notwithstanding this exception—
    1. to process such data as necessary to perform system maintenance or diagnostics;
    2. to develop, maintain, repair, or enhance a product or service for which such data was collected;
    3. to conduct internal research or analytics to improve a product or service for which such data was collected;
    4. to perform inventory management or reasonable network management;
    5. to protect against spam; or
    6. to debug or repair errors that impair the functionality of a service or product for which such data was collected.
  3. To authenticate users of a product or service.
     
  4. To fulfill a product or service warranty.
     
  5. To prevent, detect, protect against, or respond to a security incident. For purposes of this paragraph, security is defined as network security and physical security and life safety, including an intrusion or trespass, medical alerts, fire alarms, and access control security.
     
  6. To prevent, detect, protect against, or respond to fraud, harassment, or illegal activity. For purposes of this paragraph, the term “illegal activity” means a violation of a Federal, State, or local law punishable as a felony or misdemeanor that can directly harm.
     
  7. To comply with a legal obligation imposed by Federal, Tribal, local, or State law, or to investigate, establish, prepare for, exercise, or defend legal claims involving the covered entity or service provider.
     
  8. To prevent an individual, or group of individuals, from suffering harm where the covered entity or service provider believes in good faith that the individual, or group of individuals, is at risk of death, serious physical injury, or other serious health risk.
     
  9. To effectuate a product recall pursuant to Federal or State law.
     
  10. (A) To conduct a public or peer-reviewed scientific, historical, or statistical research project that—
    1. is in the public interest; and
    2. adheres to all relevant laws and regulations governing such research, including regulations for the protection of human subjects, or is excluded from criteria of the institutional review board.
  11. To deliver a communication that is not an advertisement to an individual, if the communication is reasonably anticipated by the individual within the context of the individual’s interactions with the covered entity.
     
  12. To deliver a communication at the direction of an individual between such individual and one or more individuals or entities.
     
  13. To transfer assets to a third party in the context of a merger, acquisition, bankruptcy, or similar transaction when the third party assumes control, in whole or in part, of the covered entity’s assets, only if the covered entity, in a reasonable time prior to such transfer, provides each affected individual with—
    1. a notice describing such transfer, including the name of the entity or entities receiving the individual’s covered data and their privacy policies as described in section 202 of the ADPPA; and
    2. a reasonable opportunity to withdraw any previously given consents in accordance with the requirements of affirmative express consent under this Act related to the individual’s covered data and a reasonable opportunity to request the deletion of the individual’s covered data, as described in section 203 of the ADPPA.
  14. To ensure the data security and integrity of covered data, as described in section 208 of the ADPPA.
     
  15. With respect to covered data previously collected in accordance with this Act, a service provider acting at the direction of a government entity, or a service provided to a government entity by a covered entity, and only insofar as authorized by statute, to prevent, detect, protect against or respond to a public safety incident, including trespass, natural disaster, or national security incident. This paragraph does not permit, however, the transfer of covered data for payment or other valuable consideration to a government entity.
     
  16. With respect to covered data collected in accordance with this Act, notwithstanding this exception, to process such data as necessary to provide first party advertising or marketing of products or services provided by the covered entity for individuals who are not covered minors.
     
  17. With respect to covered data previously collected in accordance with this Act, notwithstanding this exception and provided such collection, processing, and transferring otherwise complies with the requirements of this Act, including section 204(c), to provide targeted advertising.

Permissible Purposes for Transfer of Sensitive Data:

  1. The transfer is made pursuant to the affirmative express consent of the individual.
     
  2. The transfer is necessary to comply with a legal obligation imposed by Federal, State, Tribal, or local law, or to establish, exercise, or defend legal claims.
     
  3. The transfer is necessary to prevent an individual from imminent injury where the covered entity believes in good faith that the individual is at risk of death, serious physical injury, or serious health risk.
     
  4. With respect to covered data collected in accordance with this Act, notwithstanding this exception, a service provider acting at the direction of a government entity, or a service provided to a government entity by a covered entity, and only insofar as authorized by statute, the transfer is necessary to prevent, detect, protect against or respond to a public safety incident including trespass, natural disaster, or national security incident. This paragraph does not permit, however, the transfer of covered data for payment or other valuable consideration to a government entity.
     
  5. In the case of the transfer of a password, the transfer is necessary to use a designated password manager or is to a covered entity for the exclusive purpose of identifying passwords that are being re-used across sites or accounts.
     
  6. In the case of the transfer of genetic information, the transfer is necessary to perform a medical diagnosis or medical treatment specifically requested by an individual, or to conduct medical research in accordance with conditions of section 101(b)(10) of the ADPPA.
     
  7. To transfer assets to a third party in the context of a merger, acquisition, bankruptcy, or similar transaction when the third party assumes control, in whole or in part, of the covered entity’s assets, only if the covered entity, in a reasonable time prior to such transfer, provides each affected individual with a notice describing such transfer and a reasonable opportunity to withdraw previously given consents.
     
  8. In the case of the transfer of a Social Security number, the transfer is necessary to facilitate an extension of credit, authentication, fraud and identity fraud detection and prevention, the payment or collection of taxes, the enforcement of a contract between parties, or the prevention, investigation, or prosecution of fraud or illegal activity, or as otherwise required by Federal, State, or local law.