Thoughts on personal tracker privacy
Posted by ekr on 08 May 2021
The privacy implications of Apple's new AirTag tracking system are getting some negative attention right now. Briefly, AirTags are little battery-powered Bluetooth (among other wireless protocols) transponders which you attach to/put in items you own (e.g., your keys). You pair them with your phone and can then use your phone to find the tags and whatever you attached them to. Obviously, these protocols are short range, and you might have lost your item somewhere else, so AirTags include a feature where other iPhones will report the location of your AirTag via Apple, allowing you to locate it even when it's out of range.
There's nothing fundamentally new here: a number of companies such as TrackR and Tile already make this kind of device. The primary difference here -- aside from the usual slick Apple engineering -- is the large size of the potential network of devices that can report an AirTag's position. (It's a little unclear exactly which Apple devices are involved here, but Apple says "the Find My network — hundreds of millions of iPhone, iPad, and Mac devices around the world", which probably means it's every device with the Find My feature turned on.) A bigger network is better at tracking, and there are a lot of Apple devices, so it seems likely that AirTags will work pretty well.
Track more than your own stuff #
So, what's the problem? Well, any system like this can be used not only to track your stuff but also to track other people's stuff, and transitively, other people. All I have to do is buy a tracker, pair it to my phone and then stuff it in your bag and I can use it to track you. This is obviously not ideal, and as WaPo observes, can be used by stalkers:
Clip a button-sized AirTag onto your keys, and it’ll help you find where you accidentally dropped them in the park. But if someone else slips an AirTag into your bag or car without your knowledge, it could also be used to covertly track everywhere you go. Along with helping you find lost items, AirTags are a new means of inexpensive, effective stalking.
Apple has built in some countermeasures for this form of attack. Specifically:
If AirTags are away from their owners for "an extended period of time" they make a sound when moved.
If your iOS device detects an AirTag that doesn't belong to you moving with you, it will notify you on the device, and then you can try to find it and figure out what's going on.
As WaPo points out, this is an imperfect defense: you might not notice the AirTag playing a sound and it's possible for someone who controls your phone temporarily to disable the feature which detects the AirTag moving with you. And of course, that feature won't protect you at all if you have an Android phone. Note: WaPo also says: "Apple has done more to combat stalking than small tracking-device competitors like Tile, which so far has done nothing."
So the good news is that if you have an iOS device -- and a lot of people do -- and nobody has tampered with it, then you'll have some measure of protection by default. On the other hand, if you don't have an iOS device -- and of course many people don't -- the situation is more complicated. You won't have any protection by default and you may not be able to do much of anything to protect yourself. Presumably Apple could build an Android app that would do whatever it is that iOS devices do now, but they don't seem to have done so. It might or might not be possible for someone else to do so, depending on exactly how this function works. There are a number of Android apps that appear to let you look at Bluetooth or NFC devices in your local area, but it's not clear how easy they are for ordinary people to use for this purpose; the same identifier-changing techniques which make it hard to track tags trivially may also make it hard to use this kind of program to detect tracking.
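To see why identifier rotation gets in the way of this kind of scanning app, consider the obvious detection heuristic: flag any Bluetooth address that keeps showing up in scan after scan while you move around. The sketch below is a toy version of that heuristic (the addresses and the `flag_persistent_devices` function are made up for illustration, not taken from any real app), and it shows how a tag that rotates its address each scan window slips through:

```python
from collections import Counter

def flag_persistent_devices(scan_windows, min_windows=4):
    """Flag addresses seen in at least `min_windows` scan windows.

    `scan_windows` is a list of sets of Bluetooth addresses, one set per
    periodic scan taken while the user moves around; background devices
    come and go, but a tracker planted on you is in every window.
    """
    counts = Counter(addr for window in scan_windows for addr in window)
    return {addr for addr, n in counts.items() if n >= min_windows}

# A tag broadcasting a stable address is easy to spot...
stable_scans = [{"aa:01", f"bg-{i}"} for i in range(6)]  # "bg-i": unrelated background devices
assert flag_persistent_devices(stable_scans) == {"aa:01"}

# ...but one that rotates its address every window defeats the heuristic,
# even though the same physical tag is present in every scan.
rotating_scans = [{f"aa:{i:02d}", f"bg-{i}"} for i in range(6)]
assert flag_persistent_devices(rotating_scans) == set()
```

A smarter detector would need some other linkable signal (signal strength patterns, protocol quirks, or cooperation from the tag vendor), which is presumably part of why Apple's own "moving with you" detection works where generic scanners struggle.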
Is it possible to do better? #
Clearly, the privacy properties of this kind of tracker aren't ideal. This raises the question of whether it's possible to do better. Specifically: can we significantly improve the privacy properties of this kind of system without also significantly reducing its usefulness for legitimate applications? If we can do so, then that's good. If not, then there are some hard tradeoffs. In addition to a few ergonomic-type tweaks suggested in the WaPo article (scan your local network for trackers, tune the "moves with you" algorithm to work if there is a tracker in your car, etc.) it seems like there are a few small things that one could do:
Make the notification when the device is away from the owner more apparent (louder, etc.). In general, it's just not obvious how useful this whole feature is, though. In a domestic abuse situation, the tracker is likely to be in the presence of the abuser pretty regularly, so it's not clear whether this would really work (this is a point WaPo makes).
Have a standardized mechanism for detecting that a device is "moving with you" that is implemented by every major tracker type and every device manufacturer. Ideally, devices would just do this by default, so that users didn't have to take any positive action.
Note: These aren't original to me; they're implied or outright suggested in the WaPo article.
These would improve the situation somewhat though I could imagine the "I've been separated from my owner" feature getting pretty annoying: consider what happens if your spouse goes out of town for a few days and then suddenly you have to listen to all their tagged devices angrily beeping like a smoke alarm that has run out of battery until you can find them and shut them off.
Another potential improvement would be to separate the functionality of being in the Apple "Find My" network for the purposes of having your devices found by you from the functionality of reporting back about trackers it sees. This would prevent your own device from reporting the position of a tracker that is tracking you, but it wouldn't stop other people's devices from doing so. However, given the large number of devices that will be doing that reporting, it seems likely that the tracker will still be trackable; after all, that is the whole premise of the ordinary use of the system. For this reason, I don't think this change would significantly improve things.
We could substantially improve the privacy of these systems by removing the ability to track trackers in real-time. Right now, you can usually just show the current position of a tracker on a map, which obviously makes tracking someone easier. If instead you could only interrogate the status of a particular tracker at a given time and then that tracker somehow indicated it was being tracked (e.g., it made a loud sound or alerted every device in the area) that would make surreptitious tracking much more difficult, but would obviously make the system rather less useful.
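The query-on-demand design above can be sketched in a few lines. The class and method names here are hypothetical, just a toy model of the tradeoff: every explicit location query makes the tag announce that it was located, so repeated surreptitious queries become conspicuous:

```python
class AuditedTag:
    """Toy model of a tag whose location can only be obtained by explicit
    query, and which alerts (beeps, notifies nearby devices) on each query.

    This is a sketch of the design discussed above, not any real tracker's
    API.
    """

    def __init__(self, location):
        self.location = location
        self.alerts = 0  # times the tag has announced "I was just located"

    def query_location(self):
        self.alerts += 1  # surreptitious tracking becomes audible/visible
        return self.location

tag = AuditedTag(location=(37.77, -122.42))
tag.query_location()
tag.query_location()
assert tag.alerts == 2  # two queries, two alerts -- hard to track quietly
```

The cost is exactly the one noted above: a legitimate owner searching for lost keys triggers the same alerts, so the system becomes noisier and less convenient for its intended use.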
At the end of the day, this kind of tracker is a dual-use technology: It can be used both for legitimate ends (finding your stuff) and for illegitimate ends (tracking other people). While there are some things one could do to deter illegitimate use -- and Apple has done some of these -- it's not clear how much one can really do technically to make the system hard to use for illegitimate purposes without also making it less useful for legitimate users.
I'm actually really curious how this works. Apple says "AirTag was designed with privacy at its core. AirTag has unique Bluetooth identifiers that change frequently. This helps prevent you from being tracked from place to place. When the Find My network is used to locate an offline device or AirTag, everyone’s information is protected with end-to-end encryption. No one, including Apple, knows the location or identity of any of the participating users or devices who help locate a missing AirTag." Matt Green has some ideas. ↩︎
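One way to get a feel for how rotating-but-owner-recognizable identifiers could work is a toy scheme like the following. To be clear, this is not Apple's actual protocol (which, per Matt Green's analysis, involves rotating keys and end-to-end encryption); the seed, epoch length, and `epoch_identifier` function are all invented for illustration. The idea is just that the tag and its owner share a secret at pairing time, so the owner can recompute whatever the tag is currently broadcasting, while a passive observer cannot link broadcasts across epochs:

```python
import hmac
import hashlib

def epoch_identifier(seed: bytes, epoch: int) -> str:
    """Derive the identifier a tag broadcasts during a given time epoch.

    Toy scheme for illustration only: the tag and its owner share `seed`;
    anyone without it sees unlinkable pseudorandom identifiers.
    """
    return hmac.new(seed, epoch.to_bytes(8, "big"), hashlib.sha256).hexdigest()[:16]

seed = b"shared-at-pairing-time"  # hypothetical pairing secret

# Identifiers change every epoch, so a passive observer can't follow the tag...
assert epoch_identifier(seed, 1) != epoch_identifier(seed, 2)

# ...but the owner, knowing the seed, derives the same value the tag is
# broadcasting and can ask the network "has anyone seen identifier X?"
assert epoch_identifier(seed, 1) == epoch_identifier(seed, 1)
```

This also hints at why detection from the outside is hard: the anti-tracking properties for the owner (unlinkability to third parties) are the same properties that frustrate a scanner app trying to notice the same tag following you around.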
Standardization might also come with some drawbacks. Consider the case where a countermeasure involves the tracker doing something, like alerting the user or sending out some other signal; with a standardized protocol, you could make and sell trackers which followed the standard enough to be tracked but didn't do the alerting piece. ↩︎