Uber’s flawed facial recognition, and police drones
One evening last February, a 23-year-old Uber driver named Niradi Srikanth was preparing to start another shift, carrying passengers around the southern Indian city of Hyderabad. He pointed his phone at his face to take a selfie and verify his identity. The process usually works smoothly, but this time he couldn’t log in.
Srikanth suspected it was because he had recently shaved his head. After further login attempts were rejected, Uber informed him that his account had been blocked. He is not alone. In a survey of 150 Uber drivers in the country conducted by MIT Technology Review, nearly half said their accounts had been temporarily or permanently locked because of problems with their selfies.
Hundreds of thousands of workers in India’s gig economy rely on facial recognition technology, with little legal, policy, or regulatory protection. For workers like Srikanth, being blocked or kicked off the platform can have dire consequences. Read the full story.
—Varsha Bansal
I met a police drone in VR—and hated it
Police departments around the world are using drones, deploying them for everything from surveillance and intelligence gathering to chasing criminals. Yet none of them seem to be trying to figure out how drone encounters make people feel, or whether the technology will help or hinder the work of policing.
A team from University College London and the London School of Economics is filling that gap, studying how people react to police drones in virtual reality and whether the encounters leave them feeling more or less trusting of the police.
MIT Technology Review’s Melissa Heikkilä came away from her VR police drone encounter with trepidation. If others feel the same way, the big question is whether these drones are effective tools for policing in the first place. Read the full story.
Melissa’s story is from The Algorithm, her weekly newsletter on AI and its effects on society. Sign up to receive it in your inbox every Monday.