Warren, Markey Investigation Finds That EdTech Student Surveillance Platforms Need Urgent Federal Action to Protect Students
Software May Be Misused for Disciplinary Purposes; Parents Not Adequately Informed
Washington, DC – United States Senators Elizabeth Warren (D-Mass.) and Edward J. Markey (D-Mass.) today released the findings of the October 2021 investigation they opened into four educational technology companies — Gaggle.net, Bark Technologies, GoGuardian, and Securly Inc. — regarding their use of artificial intelligence (AI) and algorithmic systems to monitor students’ online activity. The senators sought to determine whether these products were surveilling students inappropriately, compounding racial disparities in school discipline, and draining resources from more effective student supports. Their investigation confirms the need for federal action to protect students’ civil rights, safety, and privacy.
Warren and Markey’s 14-page report is the first Congressional investigation of the impacts of these student monitoring tools, which have become increasingly prevalent in the wake of the COVID-19 pandemic. It finds that:
- Student activity monitoring software may be misused for disciplinary purposes and result in increased contact with law enforcement. In a survey of teachers, 43% reported that their schools use these tools to identify violations of discipline policies, suggesting that these products may be exacerbating the school-to-prison pipeline by increasing law enforcement interactions with students.
- Software companies have not taken any steps to determine whether student activity monitoring software disproportionately targets students from marginalized groups, leaving schools in the dark. None of the companies that the senators contacted have analyzed their products for potential discriminatory bias – even though there is data indicating that students from marginalized groups, particularly students of color, face disparities in discipline, and more recent studies indicate that algorithms are more likely to flag language used by people of color and LGBTQ+ students as problematic.
- Schools, parents, and communities are not being appropriately informed of the use – and potential misuse – of student data. Three of the four monitoring software companies indicated that they do not directly alert students and guardians of their surveillance.
- Regulatory and legal gaps exacerbate the risks of student activity monitoring software. There is an urgent need for increased coordination between federal agencies to clarify and evaluate existing guidelines to protect student safety and privacy, to improve data collection to determine whether these products pose risks to students’ civil rights, and to address these problems when they are confirmed.
The senators concluded with the following recommendations: “Absent federal action, these surveillance products may continue to put students’ civil rights, safety, and privacy at risk. Given these risks, the federal government should seek methods to track the potential impacts of student surveillance technology on students in protected classes, clarify the definition of ‘monitoring the online activities’ as mentioned in the Children’s Internet Protection Act (CIPA), and work to ensure that products used by schools maintain student safety and privacy.”
This report builds on Senator Warren’s concerns about algorithmic bias disproportionately affecting communities of color in the financial sector and health care systems. Senators Warren and Markey previously sent a letter to Zoom asking about its student safety and privacy protections during the pandemic and signed onto a letter about student privacy and racial bias in exam-proctoring software.