<p>As early as last <a href="https://www.technologyreview.com/s/608598/when-a-face-is-worth-a-billion-dollars/" target="_blank">August</a>, police in Hangzhou, a Chinese city with a population comparable to New York's, began using surveillance cameras fitted with this technology to identify suspects. More recently, Chinese police officers began wearing <a href="https://www.telegraph.co.uk/news/2018/02/07/chinese-police-using-facial-recognition-glasses-identify-suspects/" target="_blank">electronic sunglasses</a> fitted with facial recognition software. These glasses let officers access a database and pull up information on anyone who comes into their line of sight. While this technology seems like it belongs in a Ridley Scott movie, it's here now, and it's important for us to recognize its political and social implications.</p><p>You don't have to be a Luddite to spot the dangerous precedent set by this new technology. When police officers can access your personal data on the fly, it's certainly reasonable to worry about your civil rights. Still, this technology doesn't seem to be the privacy-erasing apocalypse that haunts the dreams of libertarians everywhere. It has helped police officers in China identify people involved in kidnappings and hit-and-runs, as well as scammers using fake IDs. With regard to privacy, the pros of using this technology seem to outweigh the cons. Where this tech becomes an issue is in its inability to deal with nuance. For example, <a href="https://www.independent.co.uk/news/world/asia/china-police-facial-recognition-technology-ai-jaywalkers-fines-text-wechat-weibo-cctv-a8279531.html" target="_blank">authorities in Shenzhen City</a> are using facial recognition to automatically issue fines (via text message) to jaywalkers. The system also keeps records on repeat offenders and has the potential to affect their credit scores.</p><p class="shortcode-media shortcode-media-rebelmouse-image">
<img type="lazy-image" data-runner-src="https://assets.rebelmouse.io/eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpbWFnZSI6Imh0dHBzOi8vYXNzZXRzLnJibC5tcy8xNzQ5OTk2My9vcmlnaW4uanBnIiwiZXhwaXJlc19hdCI6MTYyOTI5ODI5N30.OQHcUCCYBEKE4UPOH9Wp-vNZOpEVswOalRqAwYBc0zY/img.jpg?width=980" id="21b74" class="rm-shortcode" data-rm-shortcode-id="795bd6c8fa708bf9adcb73858fa3d388" data-rm-shortcode-name="rebelmouse-image">
<small class="image-media media-caption" placeholder="add caption...">Officers in China review footage using facial recognition software</small></p><p>The issue this technology presents is similar to that of traffic cameras. Before they were <a href="https://www.courierpostonline.com/story/news/local/new-jersey/2014/12/16/motorists-await-end-red-light-camera-experiment/20493157/" target="_blank">banned</a> in New Jersey, these cameras would issue tickets for running red lights and making illegal turns. The issue was that the cameras were programmed to operate within the strictest possible parameters; they followed the law to a tee. Since the program was completely automated, there was no way for the cameras to consider each case individually. Tickets were shot out at a rapid clip, arriving by mail to anyone who so much as made a right turn a second after the light turned red. From a government funding standpoint, it was a slam dunk, and the towns that put up these traffic cams made a ton of money issuing tickets. But the public outcry against the cameras was huge. While China has a much more authoritarian social structure than we do in the States, it's doubtful that the people of Shenzhen City will embrace this new system of doling out fines.</p><p class="shortcode-media shortcode-media-rebelmouse-image">
<img type="lazy-image" data-runner-src="https://assets.rebelmouse.io/eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpbWFnZSI6Imh0dHBzOi8vYXNzZXRzLnJibC5tcy8xNzQ5OTk2Ni9vcmlnaW4uanBnIiwiZXhwaXJlc19hdCI6MTYzNDE3NzM4NX0.zC2KEJfUhEk6v1REvAJv5K32P2YcHWF-mpzuEZrbTHs/img.jpg?width=980" id="9d933" class="rm-shortcode" data-rm-shortcode-id="a66e0a56ce4a23cb9164403a9ebb5823" data-rm-shortcode-name="rebelmouse-image">
<small class="image-media media-caption" placeholder="add caption...">A facial recognition program scans the face of a passerby</small></p><p>As usual, the fundamental issue with <em>this</em> new tech isn't something deliberately insidious in its design, nor is it the way it's used by law enforcement. The real problem, as with all automated technologies, is its inability to replicate human decision-making. No amount of programming will allow this technology to distinguish subtle differences between offenders. There's a reason we shouldn't <a href="http://time.com/4966125/police-departments-algorithms-chicago/" target="_blank">let algorithms</a> run our police departments; it's impossible to account for the nearly infinite number of variables that go into human behavior. While there are certainly patterns, if we rely too heavily on these machines, we set the precedent that their programming is superior to our officers' powers of deduction. Machines are fundamentally tools that <em>help</em> us complete jobs; they can't do the jobs for us.</p><p>Still, this new facial recognition software is too useful, and frankly too cool, to be left inside the box. Companies, as well as law enforcement agencies, are going to use this technology one way or another. There's already a <a href="https://mashable.com/2017/12/19/caliburger-facial-recognition-kiosk/#r8nvchWnAaqs" target="_blank">burger chain</a> in the US using a "Smile to Pay" method, and it's only a matter of time before our law enforcement agencies catch on, if they haven't already. In implementing this new technology, it's important that our police don't mistake the tool for the worker.</p>