Our letter to Reps. Polis and Messer

February 11, 2015

Dear Representatives Polis and Messer:

We write on behalf of the Parent Coalition for Student Privacy, a nationwide network of parents, citizens, and privacy advocates, concerned with the widespread, rampant, and poorly regulated data collection, data-sharing, data-tracking, data-warehousing, data-mining, and commercial exploitation of personally identifiable student information. We thank you for your interest in this important topic and for your ongoing efforts to strengthen student privacy protections.

As you are well aware, parents across the country are increasingly alarmed about the everyday uses and abuses of their children’s personal data. Many parents are only recently learning how much of their children’s most sensitive information is being collected and shared via their schools with commercial vendors, private organizations, state agencies, and other third parties. Though the evidence of the benefits of this widespread collection and disclosure of children’s personal information is weak, the risks are all too evident. Families are mobilizing to counter this virtually unfettered third-party access to their children’s private data, and have demonstrated the effectiveness of their advocacy at the state level.

While we welcome federal legislation to strengthen student privacy protections, we are concerned that this effort may be incomplete, inadequate, or co-opted by special interests. As the tide of opposition to the non-consensual capture, disclosure, and re-disclosure of student educational data has grown, various groups have sought to placate parents with assurances. These assurances, however, are weak, as they fail to deal with student privacy within the framework of fair information practices. The recent voluntary corporate Student Privacy Pledge, for example, was a first step in addressing these issues; but the Pledge has deficiencies and gaps that render it ineffective in addressing our legitimate concerns.

One of our crucial concerns is the current lack of a clear affirmative obligation on the part of schools and districts to notify parents about what student data is being collected, what data is being shared with which third parties, and under what conditions. Another crucial concern is the lack of a clear legal obligation on the part of schools and districts to notify parents about which vendors the schools have authorized to collect information directly from children in class, as schools – not vendors – are the sole contact point for most parents.

Accordingly, we are writing to urge you to draft legislation that deals with educational and student privacy in a more comprehensive and effective manner. Here is a framework that we respectfully ask you to consider:

  • All personally identifiable data collected directly from students, by vendors or other third parties, whether collected in school or through assignments given in class or for home, should require that the school provide full notification to, and obtain the informed consent of, parents, or of the students themselves if they are over age 18. At a minimum, parents should be informed of what data is being collected, the purpose of the collection, how long the data will be retained and by whom and where, and the security provisions and safeguarding practices used by the third party. Consistent with COPPA, parents must be afforded the right to opt out of any collection of their child’s data, at any time, if they so choose;
  • All disclosures of students’ personally identifiable information by schools, districts, and states to third parties must require parental notification. There must be written agreements specifying the use of the data, and these agreements must be made publicly available. The agreements should also specify that only employees of the company or organization with a legitimate educational interest be allowed to access the data, that adequate breach prevention and notification technologies and policies are in place, including levels and standards of encryption for data in motion and at rest, that independent audits be required, and that the third party will assume financial liability for any damages caused by any breach;
  • Parents must be afforded the opportunity and ability to inspect any personal student data that is collected, shared, or warehoused, correct it if it is wrong, request that it be deleted, and opt out of further collection;
  • Parental consent must be required before any school, district, or state can share with any third party student data that includes sensitive information which could harm a child’s future if breached or abused, including but not limited to grades, test scores, disabilities, health conditions, biometric information, and disciplinary or behavior records;
  • There should be an absolute ban on selling any student data, including in case of a bankruptcy, merger, or sale of a company, as well as a ban on using personal student data for advertising or marketing purposes, or for developing or refining commercial products;
  • There must be protections against schools or vendors creating “learner profiles” of students, whether through “predictive and adaptive analytics” or other measures. These profiles could lead to a student being stereotyped or their chances of future success undermined;
  • Absolutely no re-disclosures or repurposing of personally identifiable student information by third parties without informed parental consent should be allowed;
  • Tough monitoring and enforcement provisions should be required, including substantial fines to be levied on any school, state agency, nonprofit organization, or third party vendor that violates the law’s provisions;
  • A clear private right of action should be created, with parents afforded the right to sue if schools, districts, state agencies, nonprofit organizations, or third party vendors have violated the law and their children’s privacy;
  • Each state must publicly report all the data elements being collected for their state longitudinal student databases, as well as publicly report with which governmental and non-governmental third parties they plan to disclose and/or share such data;
  • State advisory boards made up of stakeholder groups, including parents, security experts, and privacy advocates, should be created to ensure that these state longitudinal databases collect the minimum amount of personal data necessary, and develop rigorous restrictions on access to such data;
  • Any new federal law should recognize the right of states to legislate more robust requirements and provide for more vigorous privacy and security protections. Federal law should therefore not preempt state laws if such state laws are stronger.

Only if these principles and provisions are adopted in a new federal student privacy law will parents be assured that the unregulated and irresponsible trafficking of personal student data will have been adequately addressed. We thank you for your leadership on this important issue and stand ready to work with you and your colleagues to ensure that a strong, workable federal student privacy law is enacted as soon as possible.

Yours sincerely,

Leonie Haimson and Rachael Stickland

Co-chairs, Parent Coalition for Student Privacy


[email protected]



Press Release 1.29.15

For immediate release: January 29, 2015

Contact: Leonie Haimson, [email protected], 917-435-9329; Rachael Stickland, [email protected], 303-204-1272

Obama privacy bill fails to put children’s safety first

Education Week has obtained a draft student privacy bill from the White House that, from its description, is far too weak to satisfy most parents concerned about the use and sharing of their children’s personal data. The EdWeek article describing the bill is here: http://go.shr.lc/1vahJrs

Said Leonie Haimson, Executive Director of Class Size Matters and co-chair of the Parent Coalition for Student Privacy, “We were startled by the slide released by the White House after the President gave his speech at the FTC, indicating that students’ personal data should be able to be sold as long as it was for ‘educational’ purposes. Student personal data should never be sold without the knowledge and consent of their parents. I am very concerned that the Obama administration and the Department of Education have been captured by the interests of ed tech entrepreneurs, and are members of the cult that believes that outsourcing education and ‘big data’ into the hands of corporations is the answer to all educational ills. This is, after all, the administration that revealed a blind spot as to the need to protect children’s privacy by creating huge loopholes in FERPA in the first place, encouraging the amassing of highly sensitive and confidential student information and allowing it to be disclosed to a wide variety of commercial ventures.”

Rachael Stickland, co-chair of the Parent Coalition for Student Privacy, said, “Parents will now fight even harder for a bill that takes their children’s interests into account, that minimizes data sharing without parent notification and consent, and that provides real protections for student privacy and security. We will continue to speak out until a new law is passed which puts our children’s safety first. As described in the EdWeek article, the Obama bill clearly does not do the job.”

Weaknesses of the Obama proposal that mirror California’s law, according to the EdWeek description:

1. Operators may use personal student information for internal commercial purposes including “for maintaining, developing, supporting, improving, or diagnosing the operator’s site, service or application.”

2. The proposal would allow the use of student information for “adaptive or personalized student learning purposes.” The Parent Coalition for Student Privacy cited this weakness in our press release critiquing the California law here: http://go.shr.lc/1IlSVil

3. Allows the sale of data in mergers and acquisitions “so long as the information remains subject to the same legal protections in place when it was originally collected.” (Quoted section is from EdWeek.)

4. Requires companies to “maintain reasonable security procedures and protocols” for student information, and allows the information to be deleted at the request of a school or district. However, the law needs specific security and encryption provisions, as well as parental rights to be notified, to consent, and to have data deleted.

Areas where the proposal appears to be even weaker than California law:

1. There appears to be no prohibition on vendors amassing profiles of students for non-education purposes. Profiling – whether for targeted advertising or sorting students based on abilities or disabilities – is one of our greatest concerns.

2. Does not prohibit student information collected from an online education site from being used on other commercial websites or services for targeted advertising or marketing purposes. Presumably, this means that if a child uses Google Apps for Education (GAFE), Google would be unable to target ads to the child within GAFE but could target ads to the child on other commercial services linked to Google. This is entirely unacceptable.


Press Release 1.12.15

For immediate release: January 12, 2015

Contact: Leonie Haimson, [email protected], 917-435-9329; Rachael Stickland, [email protected], 303-204-1272

Parent Coalition for Student Privacy on President’s Announcement of Need for New Federal Student Privacy Protections

The Parent Coalition for Student Privacy thanks the President for recognizing the need for new federal student privacy protections, but points out that the California law the President lauded as a model cannot serve as one without strengthening its provisions on parental notification, consent, security protections, and enforcement.

“Any effort to ban the sale of student information for targeted advertising is a good first step, but the White House’s proposal appears to allow companies to sell and monetize student data for unspecified ‘educational purposes,’ including to develop products that would amass enormous personal profiles on our children. Profiling children based on their learning styles, interests and academic performance, and then being able to sell this information, could undermine a student’s future. Parents want to ban the sale of student data for any use and demand full notification and opt-out rights before their children’s personal information can be disclosed to or collected by data-mining vendors,” said Rachael Stickland, co-chair of the Parent Coalition for Student Privacy.

Leonie Haimson, Parent Coalition co-chair and Executive Director of Class Size Matters, said, “We also need strong enforcement and security mechanisms to protect against breaches. Schools and vendors are routinely collecting and sharing highly sensitive personal information that could literally ruin children’s lives if breached or used inappropriately. This has been a year of continuous scandalous breaches; we owe it to our children to require security provisions at least as strict as those for personal health information.”

Here is a summary of the gaps and weaknesses in the California student privacy bill, which the President said should serve as a model for a federal law:

  • Bans vendors from using personally identifiable student information (PII) to target advertising, and bans the sale of data, but not in the case of a merger or acquisition, or presumably in the case of bankruptcy, as in the recent ConnectEDU case. The President’s proposal would be even weaker, as it would apparently allow the sale of student data for unspecified “educational purposes”;
  • Only regulates online vendors but not the data-sharing activities of schools, districts or states;
  • Provides no notification requirements for parents, nor provides them with the ability to correct, delete, or opt out of their child’s participation in programs operated by data-mining vendors;
  • Unlike HIPAA, sets no specific security or encryption standards for the storage or transmission of children’s personal information, requiring only that standards be “reasonable”;
  • Allows tech companies to use children’s PII to create student profiles for “educational” purposes or even to improve products;
  • Allows tech companies to share PII with additional and unlimited “service” providers, without either parent or district/school knowledge or consent – as long as they abide by similarly vague “reasonable” security provisions;
  • Allows tech companies to redisclose PII for undefined “research” purposes to unlimited third parties, without parental knowledge or consent – without requiring ANY sort of security provisions for these third parties or even that they have recognized status as actual researchers;
  • Contains no enforcement or oversight mechanisms;
  • Would not have stopped inBloom or other similar massive “big data” schemes designed to hand off PII to data-mining vendors – and like inBloom, would also be able to charge vendors or “service providers” fees to access the data, as long as states/districts consented.


In Wayzata, Minnesota, a school spies on its students

Nathan Ringo is an 11th grader at Wayzata High School in Minnesota. The post is reprinted with his permission from Boing Boing; the Minneapolis Post has also reported on his work to protect student privacy, and its impact.

I’m a student. As a student, my school is one of my favorite places to be: I enjoy learning and find almost all my teachers to be agreeable. I’m also a programmer and an advocate of free speech. In that role, my school holds a more dubious distinction: it’s the first place where my interests in computers and my rights were questioned.

Like many other school districts, District #284 of Wayzata, Minnesota puts censorware between students and the Internet. This filter lets the school claim federal funding in exchange for blocking pornography. However, Wayzata chose to implement an unsavory policy of blocking not just porn, but anything and everything they feel is inappropriate in a school setting. Worse, I could not find out who makes the judgments about what should be considered inappropriate. It’s not stated in the school board policy that mandates the filter: that policy says the filter should “only block porn, hate speech, and harassment.” Our censorware, however, blocks material ranging from Twitter to comic books. Meanwhile, students are told to use Twitter as part of our Spanish classes, and our school offers a course on comic books. Beyond blocking sites that are used in classes, there are also many false positives.

I started trying to get around the content filtering system in 7th grade, halfway through middle school. I used the old trick of accessing blocked sites by looking up their IPs, then using those in place of their domain names. Back then, the censoring layer was something like a regex matcher strapped onto an HTTP proxy–in other words, all the data was routed through software that simply looked for certain domain names or terms in the URL, then blocked those requests. When the school upgraded their filter to a different product, I was stuck on the censored net again for a few months. By eighth grade, I had taught myself to code in C++, an “actual programming language” more powerful than the basic web scripting languages I’d known up until that point. Although I still wasn’t able to get past the new censorship with my relatively rudimentary knowledge, I did get introduced to the software tools that could – Linux, SSL, and SOCKS5. With these, I was unaffected by all the bad Internet policy decisions made in the next two and a half years: the blocking of YouTube and Vimeo, rate-limiting on downloads, and an exponentially expanding list of addresses that are deemed to be too horrifying for students to view, such as XKCD, Wikipedia, news websites and anywhere else that, somewhere, contains a naughty word.
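The early filter described above – essentially a regex matcher strapped onto an HTTP proxy – can be sketched in a few lines. This toy model is purely illustrative (the blocklist patterns and the IP address are invented, not taken from the district's actual product); it shows why the raw-IP trick worked: a filter that matches only domain names in the URL never sees a request addressed by IP.

```python
import re

# Toy blocklist of domain-name patterns, as a regex-over-URL filter might
# hold them. (Hypothetical entries for illustration only.)
BLOCKED_PATTERNS = [re.compile(p) for p in (r"twitter\.com", r"xkcd\.com")]

def is_blocked(url: str) -> bool:
    """Return True if any blocklist pattern appears in the request URL."""
    return any(p.search(url) for p in BLOCKED_PATTERNS)

# A URL naming the blocked domain is caught...
assert is_blocked("http://xkcd.com/149/")

# ...but the same site addressed by a raw IP (made up here) sails through,
# because the filter only knows domain names, not the addresses they
# resolve to.
assert not is_blocked("http://93.184.216.34/149/")
```

A filter that resolved each blocked domain to its IPs, or that inspected the HTTP Host header instead of just the URL string, would close this particular hole – which is roughly what the district's later upgrade did.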

Prior to starting the 10th grade last year, I’d only ever had one major run-in with my school, when the librarian and I had a misunderstanding about my using the computer lab to teach myself DOS Batch after finishing my classwork. So I was surprised to get a summons to the Associate Principal’s office. When I arrived, I was told that someone had alleged that I was “hacking the firewall.” I have a habit of talking rather quickly when excited, which may have messed up my attempt to explain the difference between cracking and hacking, and to point out that I’d never touched the school’s firewall. This first meeting ended inconclusively, with my insistence that I hadn’t broken any laws or school rules (true), and that she was using the wrong terminology. A couple of days later, I got called into her office again. This time, the school’s webmaster was present. Assuming that he’d know what I was talking about, I then gave a more technical explanation of everything I was doing. The response was that I was still in trouble, despite his understanding that I hadn’t done anything wrong “yet”. I felt like they were implying that by avoiding censorship, I was obviously heading for a life of computer crime. Weeks passed, and I assumed that the whole thing had blown over.

Nope. I was brought to a conference room, and I started to get worried. While I knew that I hadn’t done anything wrong, I also knew that there are very few good things that happen when a student is told to report to a conference room. There, the “Director of Technology” responded to my previous complaint that I was being persecuted for a non-existent rule violation with more implications of future illegal activity, with a librarian chipping in one of my most hated lines, “But if you’re not doing anything wrong, why are you so concerned about privacy?”

The associate principal “helped” by referring to me as a cracker. I don’t think too many people at the meeting had knowledge of the cracker-hacker dichotomy, so there was a bit of silence after that line. The Director of Technology then pulled out a copy of a board policy with a highlighted bit essentially claiming that he personally is entitled to enact and enforce any punishment that he deems fit, regarding any sort of conduct relating to the school’s technology. After that, my Internet access within the school was revoked for the rest of the term. To get it back, I was assigned to write an apologetic letter talking about how I’d be more “responsible” in the future, as if I had shown some outburst of immaturity by wanting uncensored access to the Internet.

This year, the problems started again. Before the year even started I got in trouble for opposing the upcoming technology plan. The school board decided to purchase an iPad for every student, fill it up with spyware and more censorware, and hand them out with little explanation of this software and what it did.

I thought this was horrible, so naturally I fought it. I stood by the line for iPads and read aloud the “contract” that all students were forced to agree to, and loudly pointed out the clause that explicitly allows the district to monitor us at any time, for any reason.

I was directed to the same Associate Principal. I was once again subjected to the “if you’re innocent, why are you hiding stuff” line before being directed out of the building, without an iPad. Since then, the ban on Internet access from the school has been reinstated, until I meet with a different Associate Principal and the Director of Technology.

Students don’t get to call those meetings, so I’m just waiting until my day comes. In the meantime, I’ve had to use my own tablet – a Surface Pro 1 running Arch Linux – and my own Internet connection – over Bluetooth from my phone – while in school.

As a student, I find my school a great place to be. As an advocate of free speech, however, the school’s policies are terrible.

Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.

Restoring Privacy in the Era of Big Data by Kris Alman

Edward Snowden became a household name after he leaked top secret documents that demonstrated the vast scope of our government’s domestic surveillance programs. Much of this work is outsourced to private companies–such as Booz Allen Hamilton, owned by the Carlyle Group, a US-based investment fund with $176 billion in assets.

But the “war on terrorism” and the long arm of the Patriot Act (passed by Congress in October, 2001) go beyond telephone and Internet communications. Government and law enforcement now have unparalleled access to student records and medical records.

It’s been the perfect storm for business to swoop into public coffers to mine personal data. “Authorized representatives” and “business associates” access personally identifiable information in both education records and “Protected” Health Information (PHI) in our medical records. In the meantime, state agencies collect this data in big databases–without attention to fair information practices and principles, the central contribution of an HEW (Health, Education, Welfare) Advisory Committee on Automated Data Systems in 1972.

Unfortunately there are no Edward Snowdens among education and healthcare technocrats. They seem to be both smitten with data utopia and tempted by “free” services of the digital economy. The train with our education and medical data collected without our consent has already left the station.

Case in point: The Los Angeles Unified School District has spent more than $130 million on a student information system, which has become a technological disaster. Ron Chandler, the district’s chief information officer, said that once the problems are ironed out, the system will free the district from the consent decree and provide a valuable tool for tracking and boosting student success.

A parallel explosion of big data since 2001 is no coincidence. Big data utopians proclaim that better integration of the fragmented health and education sectors, together with data analysis, will improve outcomes and value. The question never seems to be asked: “For whom?”

The P in HIPAA does not stand for privacy.

HIPAA is the Health Insurance Portability and Accountability Act. For one brief year in 2001, newly implemented HIPAA privacy rules meant “…a covered healthcare provider must obtain the individual’s consent, in accordance with this section, prior to using or disclosing protected health information to carry out treatment, payment or health care operations.” That all changed in 2002 when Health and Human Services eliminated the right of consent and replaced it with a “new provision…that provides regulatory permission for covered entities to use and disclose protected health information for treatment, payment, or health care operations.”

Traditional (doctors, pharmacists, hospitals, health plans, Medicare/Medicaid etc.) and not-so-traditional (just what is a clearinghouse?) covered entities must comply with HIPAA privacy and security rules, enacted in 2002. Businesses that contract with covered entities gain access to our PHI, without our consent, by signing a business associate agreement to comply with these rules.

It’s impossible to create a detailed map of where sensitive personal health information flows from prescription records, to DNA, to diagnoses. And without a “chain of custody,” it’s also impossible to know who uses our data or why. Dr. Deborah Peel from Patient Privacy Rights points out in a recent TEDx talk, if the 2002 HIPAA were supposed to improve care and cut costs, why has the opposite occurred?

Big data simplifies access to data—a win-win for business and government. While corporations learn our secrets, trade secrets simultaneously protect how they profit from data mining our private lives. And it’s far more efficient for the government to obtain confidential information by data mining big businesses, thus bypassing teachers and doctors, whose professional ethics would be compromised by breaking confidentiality.

Many states have created, or are in the process of creating all payer health care claims databases. The goal is “a regional all payer dataset… (which is) seamless across state lines in terms of being a longitudinal record based on the patient.”

In Oregon, payers (insurance carriers, other third-party payers, or health plan sponsors, such as employers or unions) directly send “patient demographic information such as date of birth, gender, geography, and race/ethnicity” along with “medical and pharmacy insurance claims (that) capture plan payments, member financial responsibility (co-pay, co-insurance, deductible), diagnoses, procedures performed, and numerous other data fields” to Milliman Inc.

Milliman is one of the top purchasers of medical records. Oregon pays this global actuarial firm to collect an incredible amount of confidential data—all done without patient consent. Could Milliman’s computers glean data that could be used to deny life insurance to customers of companies that use Milliman’s services? After all, the firm boasts that “No firm has a more complete understanding of insurance than Milliman, from the nuances of various regulatory regimes to the patterns in policyholder behavior.”

While Oregon’s goal is to provide information to consumers and purchasers of health care, most states, including Oregon, score an F when it comes to price transparency. Trade secret protections are used to prohibit databases from “revealing proprietary fee schedule amounts for any payer/provider.”

Patients across the nation are feeling the financial sting when it comes to the not-so-Affordable Care Act. Wanna have a baby? A financial counselor may spring a “global fee” on you, which doesn’t include hospital charges or anything else on a long list of exclusions. And while they may point to an “average” cost in the “summary of benefits,” disclaimers allow for actual costs that may be higher. So much for market-based transformations!

Pushback from parents for student data privacy

Privacy protections in FERPA, the Family Educational Rights and Privacy Act, were gutted with rule changes in 2008 (including those relating to section 507 of the USA Patriot Act) and 2011. US Secretary of Education Arne Duncan, Obama’s basketball buddy, implemented these rule changes and sweetened the pot with stimulus money. States were tasked to create statewide longitudinal data systems that collect and warehouse student data.

Earlier this year, parent activists successfully pushed back and shut down inBloom. Founded in 2011 with $100 million from the Bill & Melinda Gates Foundation and the Carnegie Corporation, this nonprofit was designed to collect confidential and personally identifiable student and teacher data.

This data included student names, addresses, grades, test scores, economic status, race, special education status, disciplinary status, and more from school districts and states throughout the country… on a data cloud run by Amazon.com, with an operating system by Wireless Generation/Amplify, a subsidiary of Rupert Murdoch’s News Corporation. What’s more, inBloom planned to share this highly sensitive information with software companies and other for-profit vendors.

Constitutional rights to data privacy?

Is data speech, protected by First Amendment rights? Or is it property, protected by the Fourth Amendment?

Authors of a recent Stanford Law Review article argued for the former. “When the collection or distribution of data troubles lawmakers, it does so because data has the potential to inform and to inspire new opinions. Data privacy laws regulate minds, not technology.” The authors state that whenever state regulations interfere with the creation of knowledge, that regulation should draw First Amendment scrutiny.

If you think your data is property, protected against unlawful search and seizure under the Fourth Amendment, think again. As reviewed in the Emory Law Journal, “if a person ‘volunteers’ information to a third party, she loses all constitutional protection for the information, regardless of whether it reflects an underlying autonomy interest that is otherwise protected by the Constitution.” This is the third-party doctrine.

Media conglomerates and bloggers compete for readers to monetize digital content through “behaviorally targeted advertising.” The third-party doctrine allows private companies to track individuals and create single, comprehensive profiles for each user. Campaigns strategically mine our hobbies, passions and vulnerabilities to micro-target a tailored message that effectively sells politics and products. The Federal Trade Commission has taken a hands-off approach, merely pressuring businesses to self-regulate when it comes to behavioral targeting. So states are responding.

California passed a new student privacy law that “prohibit(s) an operator of an Internet Web site, online service, online application, or mobile application from knowingly engaging in targeted advertising to students or their parents or legal guardians, using covered information to amass a profile about a K–12 student, selling a student’s information, or disclosing covered information.”

The application of FERPA to data derived from online personalized learning programs is not entirely clear. Are “personalized learning programs,” a hybrid model that combines online and traditional instruction, another type of behavioral targeting? Can the third party doctrine be invoked when districts and universities sign privacy agreements with businesses for these outsourced services?

Should we put faith in industry signatories to a “student privacy pledge”? The Future of Privacy Forum and the Software and Information Industry Association conceived this pledge. Interestingly, Google has not signed the pledge, though it is one of the many data miners supporting the Future of Privacy Forum.

Google Apps for surreptitious user profiles

Google Apps for Education is one of the freebies school districts and universities clamor for. Bram Bout, the head of Google Apps for Education, told the Guardian, “More than 30 million students, teachers and administrators rely on Google Apps for Education every day to communicate and collaborate more efficiently.” But Google presents “take-it-or-leave-it contracts” and a “gag clause” in its negotiations with schools for this service. As a result, Berkeley IT professionals couldn’t learn “how other campuses protected the privacy of their students and faculty.”

In a lawsuit against Google, students (both as individuals and in a class action complaint) claimed Google violated federal and state wiretap laws by intercepting electronic Gmail messages and data-mining them for advertising-related purposes, including the building of “surreptitious user profiles.” Google sought dismissal, arguing that “automated (non-human) scanning is not illegal ‘interception’” and that “the processes at issue are a standard and fully-disclosed part of the Gmail service.”

Judge Lucy Koh, whose jurisdiction is in the heart of Silicon Valley, denied a motion from Google to dismiss the case entirely. She rejected the company’s argument that Gmail users agreed to let their messages be scanned when they accepted subscription service terms and privacy policies.

But she later denied the plaintiffs’ motion to turn the suit into a class action on the grounds that it would be impossible to determine which email users consented to Google’s privacy policies. This means email users must sue individually or in small groups, lowering recoveries and boosting costs.

Joel R. Reidenberg, a law professor at Fordham University, told Education Week, “The complexity of these arrangements exceeds what FERPA is really capable of addressing.” The 40-year-old FERPA does not adequately define what constitutes an education record at a time when previously unthinkable amounts of digital data about students proliferate.

With this lawsuit in mind, should patients feel reassured by Google’s Business Associate Agreement that offers “HIPAA compliant online services for covered entities”?

Data breaches, big data and identity theft

Then there are the inevitable breaches. The Office for Civil Rights must investigate and post health record breaches affecting more than 500 individuals. The many flavors include hacking/IT incidents, improper disposal, loss, theft, unauthorized access/disclosure, and unknown or other causes.

One of the most recent breaches reported (and not yet in the database) affected 4.5 million patients served by the for-profit hospital chain Community Health Systems Inc. Investigators believe the attack was the work of Chinese hackers who exploited the Heartbleed bug. Affected patients must now worry about identity theft.

As USA Today reports, medical identity theft is epidemic, and we should all be on alert for that possibility. Having experienced tax-related identity theft this past March, I assume my husband and I will never learn how our identities were stolen. This is especially disturbing when one considers that child identity theft rates are fifty-one times higher than those for adults. While these digital natives are savvier with technology, they are more vulnerable as well.

Furthermore, there is “no private right of action” when unlawful access, use, or disclosure of protected health information or a student’s protected information occurs. In other words, you can’t sue under HIPAA or FERPA when your personal data has been compromised.

Health, Education and Welfare?

The Department of Health, Education, and Welfare was a Cabinet-level department from 1953 to 1979, when the Department of Education was created. But these domains still intersect: joint guidance on the application of FERPA and HIPAA to student health records was published in 2008.

Schools use assessments for special education eligibility and 504 accommodations (such as for ADD/ADHD). London-based Pearson, the largest education company and book publisher in the world, holds a near monopoly on these tests. Since last year, Pearson Clinical has been using Q-global to score and store tests. The decreased administrative burden is attractive for districts, which are increasingly choosing the Q-global option instead of scoring manually or with software.

Students who receive special education or 504 accommodations are afforded confidentiality protections under IDEA, the Individuals with Disabilities Education Act. How can parents feel reassured that safeguards and policies to destroy information will be enforced?

Should Pearson Q-global have the right to glean and use “non-personally identifiable statistically aggregated data raw test data and other information collected in the testing process for our research, quality control, operations management, security and internal marketing purposes and to enhance, develop or improve tests and testing processes”? Or transfer the data “in connection with a sale, joint venture or other transfer of some or all of the assets of NCS Pearson, Inc.” or “to our contractors or agents who are committed or obliged to protect the privacy of Personal Information in a manner consistent with this Privacy Policy”?

Conclusion: Without strong privacy and security protections for individuals, the costs of 21st-century digital disruption appear to outweigh the benefits. Our identity is fundamentally our intellectual and spiritual property. Corporations protect their intellectual property with trade secret laws, yet the law affords no comparable privacy rights to people.

We must demand the right to privacy. To that end, we should support the Student Privacy Bill of Rights, conceived by the Electronic Privacy Information Center, as an enforceable student privacy and data security framework. The Patient’s Bill of Rights, implemented in 2010, doesn’t address privacy, reflecting the need to modernize HIPAA.

In June 2014 Joel R. Reidenberg testified before two Congressional subcommittees on “How Data Mining Threatens Student Privacy.” His four recommendations apply equally to patient privacy. (Suggested modifications are in parentheses.)

  1. Modernize FERPA (and HIPAA) to protect and limit the use of all student (and patient) information whether held by schools (and covered entities) or vendors (and business associates)—including a prohibition on non-educational (and non-medical) uses of student (and patient) information and graduated enforcement remedies such as private rights of action.
  2. Require that the processing of student (and patient) data under any federally financed educational (and health care) program be prohibited unless there is a written agreement spelling out the purposes for the processing, restricting the processing to the minimum amount of data necessary for those purposes, restricting the processing to permissible educational (and health care) uses, mandating (enhanced) data security, requiring data deletion at the end of the contract, and providing for schools’ (and covered entities’) audit and inspection rights with respect to vendors (and business associates).
  3. Require that states adopt an oversight mechanism for the collection and use of student (and patient) data by local and state (educational) agencies. A Chief Privacy Officer (in state departments of education) is essential to provide transparency to the public, assistance for local school districts (and covered entities) to meet their privacy responsibilities, and oversight for compliance with privacy requirements.
  4. Provide support to the Departments of Education (and Health and Human Services) and to the research community to address privacy in the context of rapidly evolving educational technologies, including support for a clearing center to assist schools (and covered entities) and vendors (and business associates) find appropriate best practices for their needs.

The Chief Privacy Officer (CPO) should be independent of the state agency involved. One state serves as a potential model: Ohio.

Further, an advisory group that includes agency representatives and citizens from stakeholder groups should help the CPO develop privacy policies. We need to restore full consent and notification for confidential data sharing, and to oversee data collection, including the longitudinal data systems created in direct response to various federal programs. Meetings should be open to the public to foster participation. These steps are essential to restoring trust in our government.

To remain a free, democratic, and globally responsive society, power should rest in the hands of the people, not the 1%. We need digital innovations that put people in control of their data. We should repeal the Patriot Act and demand net neutrality. With that power, we can battle the huge problems facing us, including climate change, Ebola, and poverty.

These are not simple solutions. We need to learn more and to get involved. For more information, go to Patient Privacy Rights and the Parent Coalition for Student Privacy.