
Last week, a 16-year-old high school student endured a harrowing experience after an artificial intelligence gun detection system misidentified his bag of chips as a firearm. The incident at Kenwood High School in Essex, Maryland, left both students and officials rattled.
The student, Taki Allen, was waiting for a ride after school when he casually tucked an empty bag of chips into his pocket. Moments later, chaos ensued as police officers surrounded him, ordering him to the ground and handcuffing him, according to reports from local media.
Body camera footage released by the Baltimore Police Department shows responding officers quickly realizing that the AI system, which monitors real-time video feeds, had erroneously flagged the bag as a weapon, prompting their urgent response.
“Police showed up with eight cop cars, all guns pointed, and they were shouting for me to get on the ground. I raised my hands, confused, asking what was going on,” Allen recounted in an interview. The alarming situation escalated until officers could ascertain the nature of the flagged item.
After reviewing the image flagged by the AI system, the officers located the supposed weapon near a trash can and confirmed it was merely a bag of chips. One officer remarked to the students, “I guess just the way you guys were eating chips… It picked it up as a gun. AI’s not the best.” The incident on October 20 has raised questions among students, city officials, and school administrators regarding responsibility for this distressing experience.
During a conference call that followed the incident, Superintendent Dr. Myriam Rogers stated that the alert regarding the potential threat had initially been canceled. However, the school principal, unaware of this cancellation, proceeded to coordinate a police response.
“The alert was canceled by the BCPS Safety Team. Unfortunately, the principal did not see the cancellation and contacted our School Resource Officer,” Rogers explained, citing a statement from Baltimore County Public Schools. The situation escalated before appropriate verification could occur, highlighting a potential flaw in the school’s security protocols.
Superintendent Rogers emphasized to local news outlets that the detection system functioned as intended. According to her, the program relies heavily on human verification to discern genuine threats from non-threatening situations.
“The system is designed to signal an alert, prompting a review by humans to determine if there is any immediate concern,” Rogers clarified.
Omnilert, the company behind the AI technology, responded to the incident, asserting that their security system combines AI capabilities with human verification. This dual approach aims to minimize errors and properly assess risks before escalating situations to law enforcement.
In their statement, Omnilert affirmed, “Our system operated as designed — it identified a potential threat, elevated it for human review, and relied on authorized safety personnel for the final determination.” The alert was eventually deemed resolved, confirming that the bag was not a firearm.
However, some critics remain skeptical about the efficacy of AI in high-stress environments like schools, asking whether such technology could provoke unnecessary panic in similar situations. The reliance on AI for critical safety decisions demands thorough examination and discussion.
For Taki Allen, the aftermath of this incident has been unsettling. He said he no longer feels secure, particularly while waiting outside after football practice. The emotional toll has left him hesitant about going outside, fearing that mundane activities could lead to another frightening encounter.
“I don’t think a chip bag should ever be mistaken for a gun,” Allen stated, emphasizing the absurdity of the situation. His sense of safety has been compromised, as he explained, “I just stay inside until my ride comes. I don’t feel safe eating or drinking anything outside anymore.”
This incident has sparked broader discussions about accountability in school safety measures and the appropriate use of advanced technology. As educational institutions continue implementing AI systems to enhance security, they must also ensure that these technologies do not inadvertently escalate minor situations into traumatic experiences.
The reliance on artificial intelligence raises critical questions about its integration into environments meant to foster learning and growth. Local officials must evaluate the risks and benefits of technology in school settings, ensuring that student safety remains paramount without compromising their mental well-being.
As educational institutions increasingly adopt AI-driven security solutions, incidents like this serve as cautionary tales. They underline a pressing need for ongoing training, cross-communication among school officials, and rigorous assessment of the technologies employed.
By fostering an environment where students feel safe and secure, schools can encourage a supportive learning atmosphere. The integration of technology should enhance safety measures without creating unnecessary fear or panic.
Ultimately, as educators and technologists work together, the goal must be to strengthen safety protocols while prioritizing the well-being of students. Ensuring clear communication and proper training can help prevent future misunderstandings and ensure that human judgment remains at the forefront when assessing threats.