Strong privacy protection for students using education technology seems like common sense. But, like most things in technology, it’s actually nuanced and subject to unintended consequences. Why? Privacy, like teaching, should be about building a relationship of trust with students. And that’s more complicated than simply complying with regulations.
Common Sense's 2019 EdTech Privacy Report reviewed the privacy policies of 150 of the most popular EdTech apps and services. The report was underwritten with support from the foundations of tech titans: namely, the Michael and Susan Dell Foundation, the Chan Zuckerberg Initiative, and the Bill & Melinda Gates Foundation. Common Sense found a 7-point year-over-year improvement in median scores, from 45% to 52%, across categories ranging from data collection and sharing to data rights and parental consent. That improvement's good news.
Tymochenko and Kutarna have this to say about consent:
The degree of consent required is context-dependent, which makes it nuanced. (Professor Ari Ezra Waldman discusses the role of context in information privacy on Georgian Capital's podcast, Information Privacy for an Information Age.) The Office of the Privacy Commissioner of Canada provides guidelines on when explicit consent (as opposed to umbrella consent under, for example, a broad-brush Responsible/Appropriate Use of Technology policy) is required:
Genuine consent’s important, as is having a process in place if a parent or student withdraws consent. And to me, the most important element of the framework in the Educator’s Guide is the right to an alternative method of instruction (e.g., reading a book or reviewing paper materials). Without that, there can be no true consent: if you have to use X software at school to complete your schoolwork, you’ll have to consent in order to pass the class, even if you’re opposed to X software’s surveillance. That’s not genuine consent. I think the percentage of students who will actually opt out of using tech in the classroom and seek alternative methods of instruction is in the low single digits, but providing the choice to do so is vital. Anything else is inauthentic and erodes trust.
Let’s move to the final two elements: teacher training and auditing/reporting. Tymochenko and Kutarna suggest that school boards consider running pilots of new software before rollout and training teachers on new tools and their privacy policies. Pilot programs and training would reduce the risk of unintended consequences and of teachers using the software for purposes other than those the school’s policy intended. On auditing and reporting, Tymochenko and Kutarna suggest surveying teachers and students on an ongoing basis about the impact of technology in the classroom, and stress the importance of appointing a “system leader” responsible for identifying opportunities to improve:
Innovation and rapid change are indeed constants in high tech, which can create the uncertainty and unintended consequences I referred to earlier. The framework in the Educator’s Guide is, however, a very good place to start for school boards looking to protect student privacy, and themselves, as they increasingly roll out new technology and, soon enough, AI.
What’s next in the K-12 EdTech conversation? Something that has nothing to do with compliance or regulation. Technology in the classroom changes the relationship between teacher and student: it inserts a screen where a quiet conversation and guidance used to be. That shift matters far more than any policy or regulation ever will. Leading schools will spend as much time researching this relational impact as they do developing policies, so that technology truly creates better learning environments and outcomes for students.