Police are rolling out new technologies without knowing their effects on people
It matters because police departments are racing ahead and have started using drones for everything from surveillance and intelligence gathering to chasing criminals.
Last week, San Francisco approved the use of robots, including drones, that can use deadly force in certain emergencies, such as when dealing with a mass shooter. In the UK, most police drones have thermal cameras that can be used to detect how many people are inside a home, Pósch said. This has been used for everything from catching human traffickers and rogue landlords to targeting people suspected of holding parties during covid-19 lockdowns.
Virtual reality will allow researchers to test the technology in a controlled, safe way on lots of test subjects, Pósch says.
Even though I knew I was in a VR environment, I found the drone encounter terrifying. My opinion of these drones has not improved, even though the one I encountered was a supposedly polite, human-operated one (there are other experimental modes I did not experience).
Christian Enemark, a professor at the University of Southampton who specializes in the ethics of war and drones, says that in the end, whether the drone is “polite” or “rude” may not make much difference. That’s because the use of drones is itself a “reminder that the police aren’t here, whether they don’t want to be here or they’re too scared to be here,” he says.
“So there could be something fundamentally disrespectful about any encounter.”
Deeper learning
GPT-4 is coming soon, but OpenAI is still fixing GPT-3
The internet is buzzing with excitement about the AI lab OpenAI’s latest version of its famous large language model, GPT-3. The latest demo, ChatGPT, answers people’s questions through back-and-forth dialogue. Since its launch last Wednesday, the demo has attracted more than 1 million users. Read Will Douglas Heaven’s story here.
GPT-3 is a confident jerk and can easily be prompted to say malicious things. OpenAI says it has fixed many of these issues with ChatGPT, which is designed to answer follow-up questions, admit its mistakes, challenge incorrect premises, and reject inappropriate requests. It even refuses to answer some questions, such as how to be evil or how to break into someone’s house.