As technology advances at an unprecedented rate, artificial intelligence (AI) has become more prevalent across industries. Education is one of them: schools are increasingly turning to AI-powered monitoring systems to track students’ performance, behavior, and even emotions. While this may seem like a positive development, there are growing concerns about the potential negative impacts of these systems on students’ privacy and well-being.
Recently, EdSurge had the opportunity to speak with a group of passionate teens who are lobbying against the dark sides of AI in education. They raised an important question: What do we know about the companies that schools are using to monitor students?
To understand the issue, it helps to look at how AI-powered monitoring systems in schools actually work. These systems use algorithms to collect and analyze data about students, including their academic performance, attendance, and behavior. That data is then used to identify patterns and predict future outcomes, such as a student’s likelihood of dropping out or struggling in a particular subject.
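To make that mechanism concrete, here is a minimal, purely illustrative sketch of the kind of prediction such a system might perform. The feature names, the data, and the choice of a logistic regression model are assumptions made for illustration; the systems vendors actually ship are proprietary and typically far more complex.

```python
# Illustrative sketch only: a toy "dropout risk" predictor.
# Feature names, data, and model choice are hypothetical; real vendor
# systems are proprietary and undisclosed.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical per-student features: [attendance_rate, gpa, behavior_incidents]
X = np.array([
    [0.95, 3.6, 0],
    [0.80, 2.9, 2],
    [0.60, 2.1, 5],
    [0.98, 3.9, 0],
    [0.70, 2.4, 3],
    [0.55, 1.8, 6],
])
# Hypothetical labels: 1 = student later dropped out, 0 = did not
y = np.array([0, 0, 1, 0, 1, 1])

model = LogisticRegression().fit(X, y)

# Estimate risk for a new student: 75% attendance, 2.5 GPA, 2 incidents
new_student = np.array([[0.75, 2.5, 2]])
risk = model.predict_proba(new_student)[0, 1]
print(f"Estimated dropout risk: {risk:.2f}")
```

Even this toy version shows why the details matter: the prediction depends entirely on which features are collected, how past outcomes were labeled, and what data the model was trained on.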
On the surface, this may seem like a helpful tool for educators to personalize learning and provide targeted support to students. However, as these teens pointed out, there are significant concerns about the companies behind these systems and how they are using students’ data.
First and foremost, there is a lack of transparency about the companies that schools are using for AI monitoring. Many of these companies have not disclosed their methods or algorithms, making it difficult to assess the accuracy and fairness of their systems. This lack of transparency also raises questions about how these companies are using students’ data and whether it is being shared with third parties without students’ knowledge or consent.
Furthermore, there are concerns about the potential bias in these systems. AI algorithms are only as good as the data they are trained on, and if the data is biased, the results will be too. This could have serious consequences for students, particularly those from marginalized communities who may already face systemic biases in education.
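One simple way to see how such bias can surface, again purely as an illustration with made-up numbers, is to compare how often a system flags students from different groups who were not actually struggling. The groups, flags, and outcomes below are fabricated for demonstration and do not describe any real product.

```python
# Illustrative sketch with made-up numbers: comparing false-positive rates
# of an "at-risk" flag across two hypothetical student groups.
# All data here is fabricated; no real system is represented.

# (flagged_by_system, actually_struggled) per student, split by group
group_a = [(1, 0), (0, 0), (1, 1), (0, 0), (0, 1), (0, 0)]
group_b = [(1, 0), (1, 0), (1, 1), (0, 0), (1, 0), (0, 1)]

def false_positive_rate(records):
    """Share of students flagged despite not actually struggling."""
    negatives = [flagged for flagged, struggled in records if struggled == 0]
    return sum(negatives) / len(negatives) if negatives else 0.0

print(f"Group A false-positive rate: {false_positive_rate(group_a):.2f}")
print(f"Group B false-positive rate: {false_positive_rate(group_b):.2f}")
# A large gap between these rates would suggest the system flags one group
# more often even when its students are not actually at risk.
```

Without access to a vendor’s training data and algorithms, schools have no straightforward way to run even this kind of basic check.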
Another issue the teens raised was the potential for these systems to harm students’ mental health. Under continuous monitoring and data collection, students may feel they are always under scrutiny, adding pressure and stress. It could also erode trust between students and their teachers, as students may feel they are being reduced to data points rather than treated as individuals with unique needs and abilities.
So what can be done to address these concerns? The first step is for schools to be more transparent about the companies they are using for AI monitoring and the data being collected. This will allow for a better understanding of how students’ data is being used and ensure that their privacy is protected.
Secondly, there needs to be more regulation and oversight of these companies. Currently, there is a lack of laws and regulations governing the use of AI in education, leaving students vulnerable to potential exploitation. It is vital for policymakers to step in and establish guidelines to protect students’ rights and ensure that these systems are used ethically and responsibly.
Lastly, it is crucial for schools to involve students in the conversation. As seen with the teens lobbying against the dark sides of AI, students have a unique perspective and valuable insights to contribute. By including them in the decision-making process, schools can ensure that the use of AI in education is in the best interest of students.
In conclusion, while AI-powered monitoring systems may have the potential to improve education, it is crucial to address the concerns raised by teens and take steps to ensure that students’ privacy and well-being are protected. By promoting transparency, regulation, and student involvement, we can harness the power of AI for good and create a better, more equitable education system for all.


