
Internet and Data Privacy Guide

Tips for building a privacy practice with a critical technology lens

Introduction to Data Privacy

Websites and other first-party data collection entities are not the only internet actors collecting your data. First-party data collectors can sell data to third-party data collectors, which store it for future use. The image below illustrates the differences between first-party, second-party, and third-party data collectors.

These third-party data collectors become another potentially vulnerable site for data. The notorious 2017 Equifax breach compromised a third-party data aggregator that uses collected data points to calculate credit scores. By neglecting to apply a known software patch, Equifax left an extensive trove of personal data vulnerable to hackers.

This section describes some third-party data collectors and ways that schools and libraries work to recognize and protect data collected.


Laws Around Data Privacy

Third-Party Data Sharing in Schools

From Learning Management Systems to campus security apparatuses, the technologies that ease student learning often collect student data. These sources identify third-party educational platforms that siphon student data and each platform's specific security concerns:

Educational Technologies

  • Shea Swauger's article about privacy concerns with algorithmic test proctoring.
  • Article in the Chronicle of Higher Education about privacy concerns in educational technology, and the expansion of educational tech use in remote learning environments.
  • Long-form article from the Chronicle of Higher Education about California State University's student success tracking program.

Learning Management Systems

  • Learning Management Systems seem to do the bulk of data mining in schools according to this blog post.
    • Pratt started using Canvas in Fall 2020. A June 2020 article in EdSurge discusses the potential for data from Canvas to get absorbed into other algorithms, including those that determine credit scores.
  • This March 2020 article from the Association for Computing Machinery suggests that if LMS data is managed safely, students can benefit from some data-driven tools, including electronic tutoring platforms and data-driven benchmarks for professor intervention.

Security Concerns with Remote Learning

  • The Electronic Frontier Foundation's guide to digital rights during COVID-19 poses larger questions about surveillance and remote-work software. Which surveillance technologies are necessary for public health, and will these tracing technologies be repurposed after transmission has slowed?
  • A March 2020 article from the Electronic Frontier Foundation about Google's COVID-19 screening website, Project Baseline, which began as a public-private partnership with the state of California. The site requires a Google account to access the list of testing sites or COVID-19 health information, and Google has not been entirely clear about whether Project Baseline will link confidential health records to Google account information or allow third-party data collectors access.

Data Violence: Consequences of Data Collection Beyond Privacy

  • University of Washington Professor Anna Lauren Hoffman's April 2018 article about data violence, defined as the ways data systems reinforce systemic inequalities, and how this holistically affects marginalized peoples as algorithms factor into increasingly consequential decision-making.
  • The Feminist Data Manifest-No, crafted by a number of data scientists and academics, is a series of declarations about extending a feminist ethics of care to data studies in order to fight harmful data practices. Includes an extensive bibliography called the Manifest-No Playlist.
  • Data for Black Lives is a group of activists, organizers, and data scientists that aims to create concrete and measurable change in the lives of Black people by identifying and rooting out oppressive and racist algorithms.
  • Data and Society is a New York-based group that hosts events examining the social implications of data-centric technologies and automation. Data and Society's Algorithmic Accountability: A Primer helps coders identify points in their algorithms that cause harm, creating a framework for algorithmic accountability and data harm reduction.
