Technology is rapidly becoming a tool for abuse, enabling stalkers and traffickers to control victims remotely. But a growing field within computer science is fighting back: researchers are now proactively studying how technology is weaponized against vulnerable people, and developing solutions to restore their safety. Leading this charge is Nicola Dell, a computer scientist at Cornell Tech whose work is helping survivors of domestic violence and human trafficking reclaim control of their digital lives.
The New Frontier of Abuse
Traditional forms of stalking and harassment have moved online, becoming more insidious and harder to detect. Instead of physical surveillance, abusers now exploit location tracking on smartphones, hijack accounts, and manipulate security features to maintain control. This shift is not merely technological; it exposes a systemic problem that computer scientists largely overlooked until recently.
Dell’s research focuses on predicting and mitigating these attacks, recognizing that abusers often know their targets intimately and can bypass standard security measures. The key is understanding the intent behind how technology is used, not just its technical capabilities.
Pioneering Tech Abuse Clinics
In 2018, Dell co-founded the Clinic to End Tech Abuse (CETA) at Cornell Tech, the first center of its kind dedicated to helping survivors of intimate partner violence. CETA provides free consultations to identify compromised devices and accounts, offering practical steps to improve digital safety. This work earned Dell a 2024 MacArthur Fellowship—an $800,000 grant recognizing her creativity and impact.
CETA’s approach is unique: it brings technical expertise directly to survivors, bridging the gap between abstract research and real-world harm. The center’s success underscores the urgent need for more resources dedicated to this understudied field.
From Outsider to Advocate
Dell’s path to this work wasn’t traditional. Growing up in Zimbabwe, where access to computers was limited, she didn’t start coding until her teens. She later navigated a male-dominated field in the UK, where she felt intimidated but persevered.
This experience shaped her commitment to inclusivity. After earning her Ph.D., she discovered that technology could be designed to address the needs of underserved communities, not just those with access to resources.
The Hidden Side of Technology
Dell’s research has uncovered disturbing patterns. She and her team have developed algorithms to identify malicious apps used for harassment, fraud, and stalking, leading to the removal of hundreds of harmful applications from the Google Play Store. They have also exposed vulnerabilities in “passkey” systems, where abusers can exploit biometric authentication to access victims’ accounts without permission.
These findings are not merely academic: they expose fundamental flaws in current security models, which prioritize convenience over safety. The challenge is balancing usability with the reality that technology can be weaponized against those who need it most.
Beyond Research: Bridging the Gap
Dell’s work extends beyond technical solutions. CETA is pioneering a model for pro bono tech work, encouraging professionals to volunteer their skills to help survivors. The center also trains social workers to recognize and mitigate tech-facilitated abuse, fostering cross-disciplinary collaboration.
This is critical because technology isn’t neutral: it reflects the biases and vulnerabilities of its creators. Dell’s approach emphasizes ethical design, in which security takes priority over convenience and users receive clear warnings when tracking is enabled.
The Future of Tech Safety
Dell’s work demonstrates that computer science can be a powerful force for good, but only if researchers actively confront the dark side of technology. By prioritizing the safety of vulnerable populations, she is helping to reshape the field, making it more responsive to the realities of abuse and more accountable for the harm it can inflict. This isn’t just about fixing technical flaws; it’s about recognizing that technology is never truly neutral, and that its design must reflect a commitment to justice and safety.
