A landmark data privacy bill still lacks adequate protections for minors and marginalized communities, say advocates and a key lawmaker.
Top House and Senate committee leaders have been widely praised for breaking years of deadlock with a bipartisan privacy proposal, as tech companies and consumers push for a uniform national data protection regime.
The measure from House Energy and Commerce Chair Frank Pallone (D-N.J.), ranking member Cathy McMorris Rodgers (R-Wash.), and Senate Commerce, Science, and Transportation Committee ranking member Roger Wicker (R-Miss.) includes a ban on targeted advertising if companies have “actual knowledge” that an individual is under 17.
The term, which was taken from the Children’s Online Privacy Protection Act (Public Law 105-277), “has been a huge loophole for big tech companies to claim ignorance,” Rep. Kathy Castor (D-Fla.) said during an Energy and Commerce subcommittee hearing on Tuesday. “This is clearly an area where there is room for improvement in the draft if we are serious about protecting children online.”
Alphabet Inc.’s YouTube is the starkest example of a company touting its popularity with child consumers in conversations with potential advertisers — while simultaneously claiming to be a COPPA-free “consumer site,” said Jolina Cuaresma, senior privacy and technology policy attorney at Common Sense Media.
YouTube did not immediately respond to a request for comment.
Rep. Gus Bilirakis (R-Fla.), the top Republican on the consumer protection subcommittee, said improving the law is important, but within limits.
“To be clear, this is historic, and we must continue to move forward together and leverage constructive feedback while rejecting tactics that may seek to derail our bipartisan work,” he said during the subcommittee hearing on Tuesday.
Previously: Bipartisan bill would strengthen children’s data privacy
Legal obligations, Lawsuits
Even when a company is subject to COPPA, the law does not require it to make a good-faith attempt to determine a user’s age, Cuaresma added. Common Sense Media, which focuses on children and technology, does not advocate strict liability. Instead, it wants the bill to require companies to use age information not only in their marketing divisions, but also in their legal divisions.
The bipartisan privacy bill would also prohibit companies from transferring the data of individuals they know are between the ages of 13 and 17 without express, affirmative consent. Common Sense advocates that the wording be changed to cover anyone under the age of 18, in part to simplify compliance.
“It’s really hard to know if someone is 14 or 17,” Cuaresma said. “My daughter has a pretty mature voice and could often pass for older, so I would like to make sure that we don’t create burdens for small businesses by putting all these different ages in one bill.”
The bipartisan bill also allows individuals to directly sue companies for certain privacy violations, known as a private right of action. In cases involving minors, the bill prohibits companies from requiring consumers to settle disputes in private arbitration.
David Brody, general counsel for the Digital Justice Initiative at the Lawyers’ Committee for Civil Rights Under Law, said it was very important to improve the bipartisan bill’s “narrow private right of action,” which “restricts the ability of individuals to obtain redress.”
Earlier: Tech companies couldn’t force arbitration under bill
Senate Commerce Committee Chair Maria Cantwell (D-Wash.) introduced her own privacy bill with even broader language on the private right of action. Cantwell says she parted ways with her counterparts because the trio’s bill’s enforcement mechanisms aren’t strong enough.
On the other hand, John Miller, senior vice president of policy and general counsel at the Information Technology Industry Council, said companies remain concerned that the bipartisan proposal’s private-right-of-action language is too broad and would do little to limit a likely wave of litigation.
ITI is a global trade association whose members include such giants as Alphabet Inc.’s Google, Meta Platforms Inc., and Apple Inc.
Discrimination in algorithms
Advocacy groups, such as the Lawyers’ Committee, the Future of Privacy Forum, and the Electronic Privacy Information Center, have also shed light on how communities of color and marginalized populations face discriminatory algorithms and artificial intelligence models that reinforce structural racism and prejudice.
Caitriona Fitzgerald, deputy director of the Electronic Privacy Information Center, recommended adding stronger requirements to algorithm impact assessments “so they don’t just become box-checking exercises,” including requiring companies to explain how each algorithm was developed, training data, and intended goals and capabilities.
Brody of the Lawyers’ Committee praised the “very strong” civil rights provisions in the bipartisan bill, including algorithmic bias assessment requirements and anti-discrimination language, but maintained that enforcement needs to be stronger.
“The current proposal inserts several procedural hurdles that will not reduce legal costs but will prevent aggrieved persons from having their day in court, such as traps designed to trip up people who do not use magic words to argue their rights,” he said.
To contact the reporter on this story: Mary Curi in Washington at [email protected]