Knowing exactly where your friends are all the time is creepy; only someone who works in surveillance has that data. Knowing roughly where they are, though, feels a lot more acceptable. That’s the thinking behind Facebook’s Nearby Friends (being rolled out gradually), which lets your Facebook friends share roughly where they are – grouped by “ambient proximity”, accurate to within half a mile or a mile, but no more precise. The fascinating thing about Nearby Friends is that Facebook could tell you exactly where people are. Apple’s Find My Friends app does just that (if a friend gives you permission to track their location); Google Latitude did the same between 2009 and 2013. But Facebook chooses not to be accurate. Even as our devices can tell the world more about us, with greater precision, we’re choosing to be less precise.
Foursquare, another location-sharing app (which encourages people to “check in” at locations to become “mayor”), is also releasing Swarm, which will group friends by rough location. Rather than the radar-like accuracy we imagined would be so desirable a few years ago, both Facebook and Foursquare are aiming for “good enough”. It’s not that they don’t know where you are; it’s that they’re intentionally blurring the signal.
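How might that blurring work? Neither company publishes its method, but a minimal sketch of the idea is to snap a precise GPS fix onto a coarse grid, so that anyone nearby lands in the same roughly half-mile cell. Everything here – the function name, the grid size, the grid-snapping approach itself – is illustrative, not the actual implementation of either service:

```python
import math

# Illustrative "fuzzy location": snap a precise fix to the centre of a
# coarse grid cell, so friends see an approximate area, not an exact point.
MILES_PER_DEG_LAT = 69.0  # one degree of latitude is roughly 69 miles

def fuzz_location(lat, lng, cell_miles=0.5):
    """Return coordinates rounded to the centre of a ~cell_miles grid cell."""
    lat_step = cell_miles / MILES_PER_DEG_LAT
    fuzzy_lat = (math.floor(lat / lat_step) + 0.5) * lat_step
    # Degrees of longitude shrink with latitude; use the snapped latitude
    # so every point in a cell gets the same longitude step.
    lng_step = cell_miles / (
        MILES_PER_DEG_LAT * max(math.cos(math.radians(fuzzy_lat)), 0.01)
    )
    fuzzy_lng = (math.floor(lng / lng_step) + 0.5) * lng_step
    return fuzzy_lat, fuzzy_lng

# Two fixes a few hundred metres apart in central London
# collapse into the same cell:
a = fuzz_location(51.5074, -0.1278)
b = fuzz_location(51.5080, -0.1265)
```

The point of the sketch is that the precise coordinates never need to leave the device at full resolution: only the cell does.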
It’s a fascinating conundrum, which also finds echoes in apps such as Snapchat – whose photos and messages are intentionally short-lived – and Secret, where people you might know (friends, and friends of friends) anonymously dispense truths, or lies, or something in between.
It seems that we’ve crashed up against the absolute precision of 64-bit floating-point calculations that can be stored on the internet for ever, and we’ve decided that in some cases we’d prefer to be vague, and transient. Human meets machine – and retreats from what the machine can do.
I suspect that “fuzzy location” will be a lot more popular than the precise version when it comes to sharing. You can turn it on without worrying that friends you didn’t actually want to meet at that moment will track you down. In that sense, the question of why precise location-fixing isn’t so popular pretty much answers itself: we don’t want to feel that the smartphone in our pocket is blabbing all about us to the world. We want to be in control. If a friend who’s in the area sends an SMS and asks to meet, you can say no. If they just turn up while you’re having a heart-to-heart in the coffee shop because your phone was blabbing on the internet, you won’t feel well disposed towards the friend, the phone or the app that created the problem. Which is why Facebook is being smart in making Nearby Friends opt-in: you specifically have to enable it.

The shifts from precision to vagueness (Nearby Friends), from storage to disposal (Snapchat), and from identification to anonymity (Secret) are all examples of us making the systems we use more human – more analogue, less digital. It’s ironic that it seems to take very large amounts of processing power to be less accurate.
Given that our smartphones, and the systems that feed into them and take data from them, are the most personal items we have – laden with so much of what we know and plan and have done – there’s a pleasing symmetry here: as they become more powerful, we can make them more human. Less precise when we don’t want them to be accurate; more forgiving when we make a mistake (satnav systems are already endlessly patient when we miss a turn, but still lack the human ability to understand why and react appropriately). What’s the next boundary for getting these little computers to be more human? Automatically understanding whose phone calls we do and don’t want to take? Telling us we have “lots” of email rather than a precise figure? There are still sharp edges in computing. Sanding them down to human shape could take some time – but the power to do it is now available.
[Image via AFP]