
Abusability of Automation Apps in Intimate Partner Violence


If you have a question about this talk, please contact Alexandre Pauwels.

Recording link: https://www.cl.cam.ac.uk/research/security/seminars/archive/video/2026-02-10-t243715.html

Automation apps such as iOS Shortcuts and Android Tasker enable users to “program” new functionalities, known as recipes, on their smartphones. For example, a recipe can set the phone to silent mode when the user arrives at the office, or save a note when an email arrives from a particular sender. These apps provide convenience and can improve productivity, but they also open new avenues for abuse, particularly in the context of intimate partner violence (IPV). This paper systematically explores the potential of automation apps to be used for surveillance and harassment in IPV scenarios. We analyze four popular automation apps—iOS Shortcuts, Samsung Modes & Routines, Tasker, and IFTTT—evaluating their capabilities to facilitate surveillance and harassment. Our study reveals that abusers can exploit these tools today to monitor, impersonate, overload, and control their victims, and that the notification and logging mechanisms currently implemented in these apps are insufficient to warn victims of the abuse or to help them identify its root cause and stop it. We therefore built a detection mechanism to identify potentially malicious Shortcuts recipes and tested it on 12,962 publicly available Shortcuts recipes, finding 1,014 that can be used to surveil and harass others. Finally, we discuss how users and platforms can mitigate the abuse potential of automation apps.
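The detection step described in the abstract can be pictured as a rule-based classifier over a recipe's action list. The sketch below is purely illustrative: the recipe schema, action names, and rules are assumptions for the example, not the paper's actual taxonomy or mechanism.

```python
# Hypothetical sketch of a rule-based recipe classifier. The action names and
# the recipe dict format are illustrative assumptions, not the paper's design.

# Actions that read sensitive state, send data off-device, or contact a target.
SURVEILLANCE_ACTIONS = {"get_location", "read_messages", "take_photo"}
EXFILTRATION_ACTIONS = {"send_message", "send_email", "upload_file"}
HARASSMENT_ACTIONS = {"send_message", "make_call"}

def classify_recipe(recipe):
    """Return the abuse categories a recipe may fall into.

    `recipe` is assumed to be a dict such as:
    {"name": "...", "actions": ["get_location", "send_message"], "repeats": False}
    """
    actions = set(recipe.get("actions", []))
    flags = []
    # Surveillance pattern: sensitive data is collected AND sent somewhere.
    if actions & SURVEILLANCE_ACTIONS and actions & EXFILTRATION_ACTIONS:
        flags.append("surveillance")
    # Harassment pattern: outgoing contact that fires repeatedly.
    if actions & HARASSMENT_ACTIONS and recipe.get("repeats", False):
        flags.append("harassment")
    return flags

spy = {"name": "Where are you", "actions": ["get_location", "send_message"]}
print(classify_recipe(spy))  # ['surveillance']
```

A real detector would need a far richer model of triggers, action parameters, and data flow between actions, but the same collect-then-exfiltrate pattern is the core signal for surveillance recipes.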

This talk is part of the Computer Laboratory Security Seminar series.


 

© 2006-2025 Talks.cam, University of Cambridge.