
Tuesday, 12 February 2019

Under The All Seeing State.

      How much more policing by surveillance machines can we accept? We are watched in every aspect of our lives: as you walk down the street, enter a pub, cafe, library or whatever, you are being watched. Shopping, your place of work, sitting on the bus, they are there, the CCTV cameras, 24/7. Having got us all to accept the presence of the camera, they keep adding another surveillance/control facility to its armoury: facial recognition, profiling, and now predictive algorithms. The computer will tell the police where a crime is likely to be committed and who is likely to commit it. Wonderful, no need for detective work, just ask the computer where the crime was committed and who "dunnit", problem solved. No denying it, the computer has worked it out, so you must be guilty; computers don't make mistakes!!!!
    What a load of crap to depend on for your freedom: humans design the algorithms, humans come with bias and make mistakes, yet the computer will somehow be accurate? We as a society are sleepwalking into a panoptic prison.
Policing by Machine – Predictive Policing and the Threat to Our Rights collates the results of 90 Freedom of Information requests sent to every force in the UK, laying bare the full extent of biased ‘predictive policing’ for the first time – and how it threatens everyone’s rights and freedoms.
It reveals that 14 forces are using, have previously used or are planning to use shady algorithms which ‘map’ future crime or predict who will commit or be a victim of crime, using biased police data.
The report exposes:
  • police algorithms entrenching pre-existing discrimination, directing officers to patrol areas which are already disproportionately over-policed – a feedback loop sketched in code after this list
  • predictive policing programs which assess a person’s chances of victimisation, vulnerability, being reported missing or being the victim of domestic violence or a sexual offence, based on offensive profiling
  • a severe lack of transparency with the public given very little information as to how predictive algorithms reach their decisions – and even the police do not understand how the machines come to their conclusions
  • the significant risk of ‘automation bias’ – a human decision-maker simply deferring to the machine and accepting its indecipherable recommendation as correct.
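To see how that first point plays out, here is a minimal, purely illustrative Python sketch – not any real force's system, and every name and number in it is made up. Two areas have identical true crime rates; the only difference is that one starts with more recorded crime because it was historically over-patrolled. A naive "predictor" that sends patrols wherever past records point keeps that disparity alive indefinitely:

  # Hypothetical simulation of the feedback loop described above: two areas
  # with IDENTICAL true crime rates, but Area A starts with more recorded
  # crime because it was historically over-patrolled. A naive "predictive"
  # model that allocates patrols in proportion to past recorded crime
  # entrenches the initial disparity.

  TRUE_CRIME_RATE = 100        # actual offences per period, same in both areas
  DETECTION_PER_PATROL = 0.05  # fraction of offences recorded per patrol unit
  TOTAL_PATROLS = 10           # patrol units to divide between the two areas

  # Historical records: Area A was over-policed, so more of its crime was seen.
  recorded = {"Area A": 40, "Area B": 10}

  for period in range(1, 6):
      total_recorded = sum(recorded.values())
      for area in recorded:
          # "Prediction": send patrols where past records say the crime is.
          patrols = TOTAL_PATROLS * recorded[area] / total_recorded
          # More patrols -> more offences detected -> more records -> more patrols.
          recorded[area] += TRUE_CRIME_RATE * DETECTION_PER_PATROL * patrols
      print(f"period {period}: "
            + ", ".join(f"{a}: {recorded[a]:.0f} recorded" for a in recorded))

Run it and the recorded figures stay four times higher in Area A every single period, even though the true crime rate never changes and is identical in both areas. The records confirm the patrols, and the patrols confirm the records: exactly the self-reinforcing loop the report warns about.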
 Visit ann arky's home at radicalglasgow.me.uk