A Privacy Blueprint for Biden

Privacy And Digital Rights For All

The weakening of the Family Educational Rights and Privacy Act (FERPA) and the Covid-19 rush to replace in-person learning with virtual learning and edtech have created a perfect storm for student data collection and tracking. Students are increasingly subjected to edtech data collection, profiling, and surveillance as a condition of attending a public school. We call on the next administration to protect children and begin implementing these important recommendations within its first 100 days in office.

Leading privacy and civil rights advocates recently called on the next U.S. administration to make protecting digital privacy a top priority. The press release, signed by the Campaign for a Commercial-Free Childhood, Center for Digital Democracy, Color of Change, Consumer Action, Consumer Federation of America, Electronic Privacy Information Center, Privacy Rights Clearinghouse, Parent Coalition for Student Privacy, Public Citizen, and U.S. PIRG, states:

“The Biden administration and the next Congress should make protecting digital privacy a top priority, and 10 leading privacy, civil rights and consumer organizations today released a memo of recommendations for executive actions on Day One, actions during the first 100 days and legislation.

“The United States is facing an unprecedented privacy and data justice crisis,” the blueprint memo reads. “We live in a world of constant data collection where companies track our every movement, monitor our most intimate and personal relationships, and create detailed, granular profiles on us. Those profiles are shared widely and used to predict and influence our future behaviors, including what we buy and how we vote. We urgently need a new approach to privacy and data protection. The time is now.”

“The U.S. urgently needs a comprehensive baseline federal privacy law. The Biden administration and Congress should not delay in setting out strong rights for internet users, meaningful obligations on businesses, and establishing a U.S. Data Protection Agency with strong enforcement powers,” said Caitriona Fitzgerald, policy director, Electronic Privacy Information Center.

“Privacy is a basic human right, and children’s personal information should not be profiled, licensed, sold, commercialized or shared with third parties as a condition of attending a public school. We hope policymakers will move to prohibit the use of student data for marketing purposes and require all public schools and education agencies to adopt strict security and privacy standards,” said Leonie Haimson, co-chair, Parent Coalition for Student Privacy.

“For far too long, companies have deceptively tracked kids and used their sensitive data to exploit their vulnerabilities and target them with marketing. Families are counting on the Biden administration and the next Congress to recognize that children and teens are vulnerable, and to put protections in place which will allow young people to use the internet more safely,” said David Monahan, campaign manager, Campaign for a Commercial-Free Childhood.

The recommendation memo, Privacy and Digital Rights for All, specifically calls for protecting children's, teens', and student data, including requiring parental consent before student data is shared:

Action item within the first year: Protect children and teens.

Action 8: Protect Children and Teens from Corporate Surveillance and Exploitative Marketing Practices

Recommendations for the First 100 Days
• Urge the FTC to begin 6(b) studies on ad tech and ed tech companies’ data practices and their impacts on children and teens before undertaking any rulemaking under the Children’s Online Privacy Protection Act (COPPA).
• Protect students through an executive order that requires the Department of Education (DoE) to:
o Prohibit the selling or licensing of student data;
o Issue recommendations on transparency and governance of algorithms used in education; and
o Minimize data collection on students, ensure parental consent is affirmatively obtained before disclosing student data, and issue rules enabling parents to access and also govern data on their child.

Recommendations for Legislative Action
• Ensure children’s and teens’ privacy is legislatively protected as part of a comprehensive baseline federal privacy bill that:
o Establishes the special status of children and teens as vulnerable online users; provides strong limits on collection, use, and disclosure of data; and narrowly defines permissible uses;
o Requires privacy policies specific to children’s data on all sites and platforms used by children; and
o Prohibits targeted marketing to children and teens under the age of 18 and profiling them for commercial purposes.
• Strengthen COPPA by raising the covered age to 17 years and under, banning behavioral and targeted ads, banning the use of student data for advertising, and requiring manufacturers and operators of connected devices and software to prominently display a privacy dashboard detailing how information on children and teens is collected, transmitted, retained, used, and protected.
See more recommended principles for protection of children and teens here.

It’s time for the U.S. to take data privacy seriously. Citizens should have consent over, and control of, the collection and use of their data; “pay-for-privacy” provisions and “take-it-or-leave-it” terms of service should be prohibited. Finally, our most vulnerable, our children, should be protected, not exploited and surveilled as a condition of attending public school.

Coalition tells the FTC: Time is up for TikTok


The Parent Coalition for Student Privacy is one of twenty advocacy, consumer, and privacy groups that filed a May 14, 2020 complaint with the Federal Trade Commission (FTC), asking the agency to investigate and sanction TikTok, formerly Musical.ly, for continuing to violate COPPA, the Children’s Online Privacy Protection Act. The complaint argues that TikTok continues to collect and store children’s personal information without notice to and consent of parents, in violation of the FTC’s 2019 consent order.

If you are not familiar with TikTok, it is a very popular social media app, with 800 million users worldwide, many of them children. TikTok allows users to record and upload videos of themselves dancing and singing, and the app has more downloads than Facebook. As this Manchester Evening News piece points out, the app is recommended for ages 12 and up, but “online safety experts say it has been designed with the young user in mind and has a very addictive appeal.”

Why this complaint is important

Because TikTok is a popular platform for children, parents worry that it is not safe and that it puts kids at risk of sexual predation. For example, this father warned other parents after his 7-year-old daughter was asked to send nude pictures of herself on TikTok. In another instance, a 35-year-old Los Angeles man allegedly targeted girls by posing as a 13-year-old boy on TikTok and engaging in “sexual and vulgar conversations with at least 21 girls, some as young as 9.” This February 2020 piece in Parents says, “TikTok allows users to contact anyone in the world, and this comes with its own host of hazards.” The Parents piece goes on to point out that “kids can be targeted by predators, it’s easy to encounter inappropriate content,” and “Even if you set your own account to private, you may still be exposed to sexual or violent content posted to the public feed.”

There are many more concerning examples of underage TikTok use cited in the complaint. And as the complaint notes, it is easy for a child to fake their date of birth and sign up for an adult TikTok account.

Data is money. Children’s data is valuable and predictive, and can be used to profile the user. As the complaint states,

“TikTok collects vast amounts of personal information including videos, usage history, the content of messages sent on the platform, and geolocation.  It shares this information with third parties and uses it for targeted advertising.”

Parents want to know how TikTok is using their children’s data. TikTok, owned by Bytedance, uses Artificial Intelligence (AI) and facial recognition. Per this 2018 Verge article,

“A Bytedance representative tells The Verge that TikTok makes use of the company’s AI technologies in various ways, from facial recognition for the filters through to the recommendation engine in the For You feed. ‘Artificial intelligence powers all of Bytedance’s content platforms,’ the spokesperson says. ‘We build intelligent machines that are capable of understanding and analyzing text, images and videos using natural language processing and computer vision technology. This enables us to serve users with the content that they find most interesting, and empower creators to share moments that matter in everyday life to a global audience.’”
TikTok also uses persistent identifiers to track kids, and TikTok’s algorithms create profiles of children. Per the complaint:

“TikTok uses the device ID and app activity data to run its video-selection algorithm. When a child scrolls away from the video they are watching, TikTok’s algorithm uses artificial intelligence to make sophisticated inferences from the data TikTok collects to present the next video. The algorithm “entirely interprets and decides what the user will watch instead of presenting a list of recommendations to the users like Netflix and YouTube.”


Using personal information in this manner exceeds the limited exceptions for personalization of content. The COPPA Rule is quite clear that information collected to support internal operations may, under no circumstances, be used “to amass a profile on a specific individual.”


Yet TikTok does, indeed, amass a profile of each user—including child users—and draws upon that profile to suggest videos of interest to the user. That profile may be based in part on users’ overt behavior, such as liking videos. However, TikTok also appears to amass user profiles based on passive tracking. As reported in The New Yorker, “Although TikTok’s algorithm likely relies in part, as other systems do, on user history and video-engagement patterns, the app seems remarkably attuned to a person’s unarticulated interests.” Another article observed that the algorithm “goes right to the source using AI to map out interests and desires we may not even be able to articulate to ourselves.” The profiles that TikTok amasses on its users are designed to be used not only to curate which user-generated videos appear in each user’s stream, but also to assist with advertising.” [Emphasis added]


It’s time the FTC used its power to protect children and enforce COPPA. The FTC should investigate TikTok and ensure it is in compliance with COPPA and its consent decree. If TikTok is found in violation, the FTC should take action and sanction TikTok again, with a fine proportionate to the degree of TikTok’s violations.


We are grateful to the Campaign for a Commercial-Free Childhood (CCFC), the Center for Digital Democracy (CDD), the Institute for Public Representation at Georgetown University Law Center, and many others for their work on this complaint.


Here is the Campaign for a Commercial-Free Childhood’s (CCFC) full press release. Additional coverage of the TikTok complaint can be found in the New York Times, Financial Times, Politico, Morning Tech, and Reuters.