Editor's Note: David Riedman is the founder of the K-12 School Shooting Database, an open-source research project documenting school shootings dating back to 1966. He researches gun violence in schools and has written multiple peer-reviewed articles on homeland security policy, critical infrastructure protection and emergency management. Previously, he was a firefighter and emergency medical technician for 18 years in Maryland, where he attained the rank of captain. The opinions expressed in this commentary are his own. Read more opinion at CNN.
Since the start of the 2023-24 school year in August, a gun has been fired on a K-12 campus at least 300 times. In the past decade, the number of school shootings has increased tenfold from 34 in 2013 to 348 in 2023.
This escalating pattern of gun violence on campus has parents, teachers and school officials desperate for a solution.
Many schools are buying new artificial intelligence products and technology that are marketed to districts looking for help in detecting a potential gunman on campus. This intense pressure on school officials to do something to protect students has transformed school security from a niche field into a multibillion-dollar industry.
Public schools often lack funds, equipment and personnel, and AI promises the ability to detect threats automatically, faster than any person could. There is not enough time, money and manpower to watch every security camera and look inside every pocket of every student's backpack. Where humans cannot do this job, AI technology is a powerful proposition.
I have collected data on more than 2,700 school shootings since 1966, as well as security issues such as swatting, online threats, averted plots, near misses, stabbings and students armed with guns.
Based on my research, there is no simple solution to this range of threats because school security is extremely complex. Unlike airport terminals and government buildings, schools are large public campuses that are hubs for community activities beyond traditional school hours.
A high school on a weeknight might host varsity basketball, drama club, English classes for adults and a church group renting the cafeteria – with potential security gaps amid this flurry of activity.
Two common applications of AI right now are computer vision and pattern analysis with large language models. These provide the opportunity to monitor a campus in ways that humans cannot.
Schools are using AI to interpret the signals from metal detectors, classify objects seen on CCTV, identify the sound of gunshots, monitor doors and gates, search social media for threats, look for red flags in student records and recognize faces to identify intruders.
This AI software works best when tackling clearly understood and clearly defined problems such as identifying a weapon or an intruder. If these systems work correctly, when a security camera sees a stranger in possession of a gun, the AI software flags the face of an unauthorized adult and object classification recognizes the gun as a weapon. These two autonomous processes trigger another set of AI systems to lock the doors, call 911 and send text message alerts.
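As a rough sketch of that trigger logic, consider the following Python snippet. Every function name and threshold here is hypothetical, invented for illustration; no real vendor's system works exactly this way.

```python
# A minimal, hypothetical sketch of the automated response logic
# described above. All function names and the threshold are invented
# for illustration; real vendor systems are proprietary and tie into
# actual building hardware and dispatch services.

def lock_doors() -> None:
    print("Doors locked")        # stub for a building-control integration

def call_911() -> None:
    print("911 notified")        # stub for a dispatch integration

def send_text_alerts() -> None:
    print("Text alerts sent")    # stub for a mass-notification system

def respond(face_is_unrecognized: bool, weapon_score: float,
            threshold: float = 0.7) -> bool:
    """Escalate only when both detectors agree: an unrecognized adult
    AND an object the classifier scores as a probable weapon."""
    if face_is_unrecognized and weapon_score >= threshold:
        lock_doors()
        call_911()
        send_text_alerts()
        return True
    return False

# Example: an unknown adult carrying an object the classifier
# scored 0.82 as a gun triggers the full response.
respond(face_is_unrecognized=True, weapon_score=0.82)
```

Note that the logic requires both systems to agree before escalating, which is exactly why the quality of each individual classifier matters so much.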
What AI can and cannot do
With school security, we want certainty. Does the person on CCTV have a gun? We expect a “yes” or “no” answer. The problem is that AI models give “maybe” answers. This is because AI models are based on probability.
For AI that classifies images as weapons, an algorithm compares each new image to the patterns of weapons in its training data. AI doesn't know what a gun is, any more than a computer program knows what anything is. When an AI model is shown millions of pictures of guns, it will try to find those shapes and patterns in future images. It is up to the software vendor to decide the probability threshold between "gun" and "not gun."
This is a messy process. An umbrella might score 90% while a handgun partially concealed by clothing scores only 60%. Do you want to avoid a false alarm for every umbrella, or get an alert for every handgun? With a single threshold, you cannot have both.
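To make that tradeoff concrete, here is a toy Python illustration using the two hypothetical scores above. The numbers are invented classifier outputs, not measurements from any real product.

```python
# Toy illustration of the vendor's threshold dilemma. Scores are the
# hypothetical classifier outputs from the example above, not data
# from any real system.
detections = [
    ("umbrella (harmless)", 0.90),
    ("handgun, partly concealed", 0.60),
]

for threshold in (0.95, 0.75, 0.50):
    print(f"\nThreshold = {threshold:.2f}")
    for label, score in detections:
        status = "ALERT" if score >= threshold else "no alert"
        print(f"  {label}: score {score:.2f} -> {status}")

# 0.95: nothing flagged -- the concealed gun is missed.
# 0.75: only the umbrella is flagged -- a false alarm AND a missed gun.
# 0.50: both flagged -- the gun is caught, but so is every umbrella.
```

No choice of threshold catches the concealed gun without also flagging the umbrella, which is the dilemma every vendor quietly resolves on schools' behalf.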
AI software interpreted this CCTV image as a person with a gun at Brazoswood High School in Clute, Texas, sending the school into lockdown and police racing to the campus. The dark spot is a shadow on a drainage ditch with a person walking beside it.
Cameras generate poor-quality images in low light, bright light, rain, snow and fog. Should a school be using AI to make life-or-death decisions based on a dark, grainy image that an algorithm can't accurately process? A major transportation system in Pennsylvania canceled its contract with the same vendor used by Brazoswood because it said the software could not reliably detect guns.
Schools need to understand the limits of what an AI system can and cannot do.
AI is not magic; it cannot overcome the limits of the cameras or hardware it runs on. Adding AI software to a magnetometer does not change the physics of a gun and a metal water bottle producing the same signal. That's why the FTC and SEC are investigating an AI screening vendor for allegedly inaccurate marketing claims made to schools across the country.
An expensive endeavor
The biggest cost of school security is the physical equipment (cameras, doors, scanners) and the staff who operate it. AI software on an old security camera generates revenue for the security vendor without the vendor or the school spending money on new equipment. It's great to save money until a shadow triggers a police response to what the AI thinks is an active shooter.
Instead of schools testing and selecting the best solutions on merit, vendors lobby to structure local, state and federal government funding to create a short list of specific products that schools must purchase. In an era of rapid AI innovation, schools should be able to choose the best product available rather than being forced into a contract with one company.
Schools are unique environments and require security solutions — both hardware and software — designed for schools from the ground up. This requires companies to analyze and understand the specifics of gun violence on campus before developing an AI product. For example, a scanner created for sports venues, where fans can only carry a limited number of items, will not work well in a school where children arrive every day with backpacks full of binders, pens, tablets, cellphones and metal water bottles.
For AI technology to be useful and successful in schools, companies must address campuses' biggest security challenges. In my studies of thousands of shootings, the most common scenario I see is a teenager who routinely carries a gun in a backpack and fires a shot during a fight. Manually searching every student and bag is not a viable solution, as students would spend hours in security lines instead of classrooms. Searching bags is no easy task, and shootings still happen inside schools with metal detectors.
Image classification from CCTV or retrofitted metal detectors does not address the systemic problem of teenagers carrying guns to school every day. Solving this challenge requires better sensors and more capable AI than any product available today.
Schools cannot be fortresses
Unfortunately, school security is currently drawing from the past instead of envisioning a better future. Medieval fortresses were a failed experiment that concentrated risk rather than reducing it. We are reinforcing school buildings without understanding why European empires stopped building castles centuries ago.
The next wave of AI security technology has the potential to make schools safer with open campuses that have invisible layers of frictionless security. When something goes wrong, open spaces offer the most opportunities to seek cover. Children should never again be trapped in a classroom like they were by the gunman who killed 19 children and two teachers in Uvalde, Texas, in 2022.
Schools are on the edge between the past and a safer future. AI can either hinder or enable how we get there. The choice is ours.