Parents: Google Classroom is not your friend

The following is by Carrie McLaren, a Brooklyn parent.  If others have similar experiences with Chromebooks, please let us know at

A couple of years ago, my then-4th grade son started watching YouTube videos about Magic, a trading card game. These were snoozy, lo-tech commentaries that struck me as quasi-educational. But I soon noticed that YouTube's algorithm started recommending more and more "engaging" videos: a video of a white gamer known for dropping the N-word, for instance.

A close friend noticed the same thing happening with her teen. The boy watched videos about American history and was slowly being fed conspiratorial, alt-right nonsense. The racism was not intended on Google's part. It's simply the formula we've seen all over media platforms: big emotions + edgy content = more engagement. YouTube is at the center of the attention economy, after all, and YouTube's goal is to keep users watching YouTube.

This economic imperative doesn't end with Google Classroom. Classroom is just another piece of Google's data-mining machine. Why school districts are so eager to jump on board would be hard to fathom were the platform not so cheap and convenient. But as anyone with a passing familiarity with Big Tech knows, you get what you pay for. When the tech is free, you are the product.

Prior to distance learning, my son had a Chromebook that he could log into via his Gmail account, which we could monitor via Google's parental controls, Family Link. Once we started distance learning, he needed to log in via his school's Gmail. But these Classroom accounts are not subject to Google's parental controls. So, thanks to Google Classroom, my son could log into his Chromebook using his school account and potentially access porn sites, spend the day watching YouTube and ads hawking age-inappropriate games, or do pretty much anything else on the internet, unguarded.

Odd, yes? Chromebooks are often sold as the ideal student laptop. When I contacted Google about this (6/17/20), the customer service rep said it’s the school’s responsibility to limit adult sites and other distractions, not Google’s. But schools can only limit devices linked to their individual network; they cannot do this when students are working from home.

When I expressed concern about limitless YouTube during the home/school day, the Google customer service rep told me not to worry: “Students can’t use YouTube via their school account.”

I laughed at this because my son's YouTube use amped up dramatically when he started relying on his school Gmail account. Google's subterfuge here runs deep. It's true that a student cannot "like" or comment on YouTube videos via a student account. Nor can they view their watch history. But they can watch as many YouTube videos as they like. And just because they can't view their own watch history doesn't mean Google isn't tracking that watch history!  Whenever my kid would open YouTube in the browser, the home page would be highly tailored to his interests, luring him down a rabbit hole expertly tuned to keep him hooked.

If I want to limit my son's internet access during distance learning, I need to get rid of the Chromebook and use a different laptop (Apple and Microsoft have parental controls that can function with Classroom). Or invest in expensive network-based parental controls, such as Circle. Or, I suppose, I can stop using Google Classroom and give up on school.

Is anyone at the NYC Department of Education thinking about this?  Anyone at all?

– – – Parents, one trick I've fallen back on is to go into settings, delete my son's Watch History and Search History, and turn off targeted advertising. I then turned off Watch and Search History by putting them on Pause. These changes make the site a little less addictive and more diverse.

—Carrie McLaren


Budget cuts at NYC Department of Education may threaten student privacy

The following was written by a concerned stakeholder who prefers to stay anonymous. One wonders whether the budget savings from the DOE's decision to cut the only part-time staff assistant vetting research proposals are worth the risk to student privacy.

NYC public school students are diverse demographically, culturally, linguistically, and academically, and there are a wide variety of programs established to meet their needs. The NYC Department of Education Institutional Review Board (IRB) reviews over 500 research proposals every year, many of which aim to evaluate these programs and test new curricula. A large portion of these proposals target the most vulnerable NYC DOE students and families.

An IRB is an administrative body that is formally designated to review human subject research proposals, to protect the rights of those individuals who are recruited to participate in research activities.  For most people, the mention of an IRB conjures images of drug trials or medical treatment research.  However, IRBs don’t solely exist for biomedical research. Social science research that collects personal information about participants is also subject to IRB review, and education research is no exception.

Historically, the NYC DOE IRB has been supported by only one full-time Director and one part-time consultant, who are tasked with initial review of all submitted proposals, communication with the research community, and oversight and compliance monitoring. In addition, there are two Boards made up of 30 volunteers who vet the proposals after the initial review by staff. Comparable institutions reviewing the same volume and type of research normally have between three and five full-time administrative support staff to perform initial reviews and support Board members.

Faced with a projected deficit in the billions of dollars, the NYC DOE has opted to eliminate the one part-time IRB assistant position, which will reduce the DOE’s ability to thoroughly review the research studies being proposed and could open the doors to a whole host of privacy and confidentiality breaches.

Proposed studies submitted to the NYC DOE IRB may ask questions regarding family immigration status, financial hardship, experiences with abuse or neglect, sexual practices of children, drug and alcohol use and abuse, physical or learning disabilities or challenges, and more. Researchers also frequently request extensive FERPA-protected student records, including disciplinary and suspension data. The NYC DOE IRB is the sole DOE body that reviews these requests and ensures that inappropriate questions, including those about immigration status, are removed before the study is approved and introduced to students and families.

In reviewing these proposals, the IRB ensures, among other things, that:

  • The risks to students and families are minimized by using procedures that do not unnecessarily expose the research participants to risk.
  • The selection of students and families for research participation is equitable.
  • Research participants are adequately informed of the risks that will be involved in the research.
  • The research plan, when appropriate, makes adequate provisions for monitoring the data collected to ensure the safety of the subjects.
  • There are adequate provisions to protect the privacy of the research subjects and to maintain the confidentiality of the data.
  • Appropriate additional safeguards have been included in the study to protect the rights and welfare of research subjects who are likely to be vulnerable to coercion or undue influence (e.g., children, non-English speakers, undocumented, economically or educationally disadvantaged persons).

Absent NYC DOE IRB review and oversight, many of these research studies could move forward with limited safeguards for NYC DOE students and families.

The NYC DOE IRB's historic commitment to student privacy and ethical research must be preserved. Due to COVID-19 and the shift to online learning, access to students is now being sought via telecommunication platforms such as Zoom and Google Meet, and online classroom platforms such as Google Classroom. Much of the sensitive data detailed above is now being collected through these platforms.

Faced with an avalanche of research proposals focused on the impact of the pandemic and the shift to remote learning, the NYC DOE IRB is needed now more than ever to guard against big-data research and the exploitation of public school students for profit. It is with these concerns in mind that this institutional cornerstone requires a revamp involving an influx of resources and support.

Those who care about student privacy should be outraged by the NYC DOE's shortsighted and nonchalant decision to cut staff from an institutional entity whose mission is to protect the privacy of more than 1.1 million students.

It is with this dire call to action that we hope the NYC DOE will reconsider the elimination of the IRB assistant and do everything in its power to promote the mission of the IRB, make strides to advance its current means and abilities, and safeguard it from future crises. Appeals can be sent to the Office of the Chancellor and the office of the Chief Academic Officer, Linda P. Chen.

Montgomery County, MD Parents Concerned About the Privacy and Security of Children’s Data Shared with Zoom and Google

The below post expresses concerns that are widely shared by parents throughout the country whose children are using programs like Zoom and Google Classroom that have not been thoroughly vetted for privacy and security protections.

by Joel Schwarz, Esq., CIPP

To say that 2020 has proven to be a challenging time for everyone would be an understatement. Nowhere is this more true than in the education space where, with little time to plan, school systems around the country were required to convert in-person programs into remote educational programs, all the while wrestling with ensuring that children who rely on in-school meals still receive them, that children's special-needs requirements are still met, and so on.

Overall, school administrators, parents, and students alike have risen to the occasion in admirable fashion and deserve our gratitude and appreciation. That said, as the parents of students in the Montgomery County Public Schools (MCPS) (Montgomery County, Maryland), we've grown increasingly concerned about some of the technologies deployed to assist in remote learning. Two companies in particular stand out: Zoom and Google.

Our concern with Zoom stems from the fact that Zoom was never designed for the student/school setting, where there are special sensitivities relating to student privacy and data sharing, as well as FERPA and COPPA requirements.  While Zoom bombing (hijacking Zoom’s virtual meetings) has certainly been the most prominent issue in the press, other significant security and privacy concerns with Zoom include:

  • Zoom misrepresenting the encryption it uses, claiming to use "end-to-end" encryption, which Zoom later conceded was untrue (in an April 4 interview in the Wall Street Journal, Zoom's CEO conceded that he'd "messed up on security," but would begin working on true end-to-end encryption). Notably, in May 2020 Zoom announced its purchase of Keybase, a company that specializes in encryption solutions. This doesn't solve Zoom's lack of end-to-end encryption, however, as it will take time to integrate Keybase's technology, during which time Zoom will still lack end-to-end encryption;
  • Zoom's custom encryption is predictable, weak, and vulnerable to cracking by hackers;
  • Zoom's encryption keys may be retrieved from servers in China, giving rise to a risk that the Chinese government can force (and may already have forced) Zoom to share all Zoom communications;
  • Zoom’s collection of information from students in excess of what is needed for purely educational purposes, potentially in violation of FERPA.

Interestingly, upon discovering problems with Zoom, a number of school systems walked back plans to utilize Zoom, including New York City public schools, Clark County Public Schools in Nevada, and schools in Utah, Washington state and beyond.  These actions were later followed by investigations into Zoom by Attorneys General offices of New York, Florida and Connecticut, to name a few.

Naturally, as parents of MCPS students, we raised similar concerns with MCPS. Despite our requests, however, MCPS did not take action, nor were we provided with a look at the contract between Zoom and MCPS, or Google and MCPS (although we were given the option of opting our children out of Zoom calls).

We later learned that school districts in upstate New York had obtained more favorable terms and conditions from Zoom for their students, which any school district in New York can choose to opt into, including an agreement by Zoom to “delete any student, teacher and principal data it had collected or stored when the contract expires later this year.”

It seemed reasonable to us that Maryland students deserved the same protections.

Google also presents significant concerns for us as MCPS parents, because Google has been completely unresponsive to privacy requests made by MCPS regarding our children's data. Specifically, last year the Montgomery County Council of PTAs' Safe Technology Subcommittee and MCPS initiated a "Data Deletion Week," which required, among other things, that ed tech providers certify the deletion/purge of certain student data at the completion of the school year. Several other ed tech providers promptly complied, but Google failed to do so, and its failure has now continued for almost nine months.

But Maryland parents are not alone in their concerns about Google's handling of students' personal information. The New Mexico Attorney General's Office filed a lawsuit against Google in February 2020 for deceptive trade practices, alleging that once Google collects student data, it shares that data across all of its business segments "for its own commercial purposes" despite having promised to use it only for educational purposes. Likewise, privacy-focused Internet browser Brave filed a complaint with the Irish Data Protection Authority on March 16, 2020, alleging that Google fails to fence off data collected by its different services, sharing data widely across all business lines in what Brave refers to as "Google's internal data free-for-all." This is eerily reminiscent of the concerns raised by the New Mexico Attorney General.

Our concerns escalated further when, due to COVID-19, student use of, and reliance on, Google Chromebooks and Google Classroom increased exponentially, turning the small spigot of information that previously flowed to Google into a virtual fire hose, compromising the privacy of hundreds of thousands of Maryland students.

As a result of our concerns with Zoom and Google, we wrote to Maryland State Attorney General Brian Frosh, seeking his help and intervention.  Specifically, we requested that Attorney General Frosh’s Office take immediate action to ensure robust protections for student data acquired by Zoom and Google, including:

  • Publicly posting the Zoom and Google contracts with MCPS so that we have greater transparency into the privacy and security protections (or lack thereof) for our children;
  • Securing binding public assurances that Zoom and Google will secure and protect our children’s data, by:
    • segregating personal information and usage information from all of their other lines of business;
    • ensuring that all student data, communications and encryption keys remain inside the U.S.;
    • committing to not sharing or otherwise using student data for any purpose other than purely educational purposes; and
    • purging all student data and related information at the end of the current school year, or the end of the pandemic, whichever comes first, and then certifying this in writing, under oath.

To date, we have yet to receive a response from Attorney General Frosh’s office (our letter was sent on April 17 and was received on April 20). We nonetheless remain hopeful that progress is being made behind the scenes, as we’ve heard from individuals inside MCPS that the Maryland Attorney General’s office has engaged with them.

So, as the old saying goes, hope springs eternal. In this case, we're hopeful that Attorney General Frosh will eventually come back to us with positive news regarding our requests, because it's only through his intervention that we will ensure greater protection of our children's data and greater transparency for us, as parents, allowing us to make informed choices about our children's education and personal information.

If you’re interested in staying abreast of our progress on this and other related issues and you live in Montgomery County, Maryland, please join the Montgomery County PTA’s Safe Tech Listserv by emailing

And if you're interested in hosting an online meeting, webinar, or virtual coffee on this or a related ed tech topic, contact your PTA President and then contact us at, as we'd be happy to arrange a guest speaker from the Safe Tech Committee to discuss these topics.

Tell Congress to protect your family’s privacy

HR 6172, the USA FREEDOM Reauthorization Act, would reauthorize portions of the Foreign Intelligence Surveillance Act governing the intelligence agencies' search and surveillance activities. A critical privacy amendment introduced by Senators Wyden and Daines, which would have prohibited the government from spying on private citizens' internet searches, as well as their phone and computer histories, without a warrant, failed by only one vote in the Senate last week.

Please send a letter to your Representatives in Congress today, asking them to support an amendment to FISA with similar language, to protect your privacy and that of your children under the Fourth Amendment against the government surveilling your family’s internet searches and phone and computer histories without a warrant.

Since the Wyden-Daines amendment failed, a bipartisan coalition of more than 60 groups wrote a letter to Congress saying that the FBI should not be allowed to spy on Americans' internet activity without a warrant. More on this in Roll Call.

Especially in these times of students being required to use the internet for remote learning, let your House members know that protecting the privacy and civil rights of your family and all Americans is important to you.


Cheri Kiesecker and Leonie Haimson

Co-chairs, Parent Coalition for Student Privacy

Coalition tells the FTC: Time is up for TikTok


The Parent Coalition for Student Privacy is one of twenty advocacy, consumer, and privacy groups that filed a May 14, 2020 complaint with the Federal Trade Commission (FTC), asking it to investigate and sanction TikTok, formerly known as, for continuing to violate COPPA, the Children's Online Privacy Protection Act. The complaint argues that TikTok continues to collect and store children's personal information without notice to and consent of parents, in violation of the FTC's 2019 consent order.

If you are not familiar with TikTok, it is a very popular social media app, with 800 million users worldwide, many of them children. TikTok allows users to record and upload videos of themselves dancing and singing, and the app has more downloads than Facebook. As this Manchester Evening News piece points out, the recommended age is 12 plus, but "online safety experts say it has been designed with the young user in mind and has a very addictive appeal."

Why this complaint is important

Because TikTok is a popular platform for children, parents worry that TikTok is not safe and that it puts kids at risk of sexual predation. For example, this father warned other parents after his 7-year-old daughter was asked to send nude pictures of herself on TikTok. In another instance, a 35-year-old Los Angeles man was allegedly targeting girls by posing as a 13-year-old boy on TikTok and engaging in "sexual and vulgar conversations with at least 21 girls, some as young as 9." This February 2020 piece in Parents says, "TikTok allows users to contact anyone in the world, and this comes with its own host of hazards." The Parents piece goes on to point out that "kids can be targeted by predators, it's easy to encounter inappropriate content," and "Even if you set your own account to private, you may still be exposed to sexual or violent content posted to the public feed."

There are many more concerning examples of underage TikTok use cited in the complaint. And as the complaint notes, it is easy for a child to fake their date of birth and sign up for an adult TikTok account.

Data is money. Children's data is valuable and predictive, and can be used to profile the user. As the complaint states,

“TikTok collects vast amounts of personal information including videos, usage history, the content of messages sent on the platform, and geolocation.  It shares this information with third parties and uses it for targeted advertising.”

Parents want to know how TikTok is using their children's data. TikTok, owned by ByteDance, uses artificial intelligence (AI) and facial recognition. Per this 2018 Verge article,

"A Bytedance representative tells The Verge that TikTok makes use of the company's AI technologies in various ways, from facial recognition for the filters through to the recommendation engine in the For You feed. 'Artificial intelligence powers all of Bytedance's content platforms,' the spokesperson says. 'We build intelligent machines that are capable of understanding and analyzing text, images and videos using natural language processing and computer vision technology. This enables us to serve users with the content that they find most interesting, and empower creators to share moments that matter in everyday life to a global audience.'"

TikTok also uses persistent identifiers to track kids, and TikTok's algorithms create profiles of children. Per the complaint:


“TikTok uses the device ID and app activity data to run its video-selection algorithm. When a child scrolls away from the video they are watching, TikTok’s algorithm uses artificial intelligence to make sophisticated inferences from the data TikTok collects to present the next video. The algorithm “entirely interprets and decides what the user will watch instead of presenting a list of recommendations to the users like Netflix and YouTube.”


Using personal information in this manner exceeds the limited exceptions for personalization of content. The COPPA Rule is quite clear that information collected to support internal operations may, under no circumstances, be used “to amass a profile on a specific individual.”


Yet TikTok does, indeed, amass a profile of each user—including child users—and draws upon that profile to suggest videos of interest to the user. That profile may be based in part on users’ overt behavior, such as liking videos. However, TikTok also appears to amass user profiles based on passive tracking.  As reported in The New Yorker, “Although TikTok’s algorithm likely relies in part, as other systems do, on user history and video-engagement patterns, the app seems remarkably attuned to a person’s unarticulated interests.” Another article observed that the algorithm “goes right to the source using AI to map out interests and desires we may not even be able to articulate to ourselves.” The profiles that TikTok amasses on its users are designed to be used not only to curate which user-generated videos appear in each user’s stream, but also to assist with advertising.” [Emphasis added]


It's time for the FTC to use its power to protect children and enforce COPPA. The FTC should investigate TikTok and ensure that it is in compliance with COPPA and its consent decree. If TikTok is found in violation, the FTC should take action and sanction TikTok again, with a fine that is proportionate to the degree of TikTok's violations.


We are grateful to the Campaign for a Commercial-Free Childhood (CCFC), the Center for Digital Democracy (CDD), the Institute for Public Representation at Georgetown University Law Center, and many others for their work on this complaint.


Here is the Campaign for a Commercial-Free Childhood's full press release. Additional coverage of the TikTok complaint can be found in the New York Times, Financial Times, Politico, Morning Tech, and Reuters.