When Kafka Met Orwell: Arrest by Algorithm
Date: 05/07/2017

In April 2017, an extraordinary claim by the Israeli security services was published in Ha’aretz (Hebrew): over 400 Palestinians had been detained on suspicion that they might be involved in future terrorist attacks. They were detained not on the basis of evidence, but on a decision made by an algorithm.

The practice grew out of a security problem for the Israeli authorities. The so-called “Intifada of the Individuals,” beginning in late 2015, presented the ISA (the Israel Security Agency, also known as the Shin Bet) with a quandary. The ISA had spent decades dismantling Palestinian society via a network of informers and intimidation, but those tools, while very effective against any sort of cell-based organization, proved helpless in the face of individuals who decided to carry out an attack on a whim.

It took the service a few months to recalibrate, and then – most likely with the assistance of Israel’s version of the National Security Agency, the vaunted Unit 8200 – it began analyzing the social media profiles of Palestinians, and deriving from them a series of indicators which, when aggregated, produced a profile of a possible attacker.
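To make the notion of “aggregated indicators” concrete, here is a deliberately naive sketch of how such profiling might work in the abstract. Nothing about the actual system is public; every indicator name, weight, and threshold below is invented for illustration (the first two indicators echo the examples Brown and Rotem report, discussed further down).

```python
# Purely hypothetical sketch of "indicator aggregation" profiling.
# Nothing about the real system is public: every indicator name,
# weight, and threshold below is invented for illustration only.
INDICATOR_WEIGHTS = {
    "praised_attackers": 3.0,
    "changed_profile_picture": 1.0,
    "contact_with_flagged_user": 2.0,
    "posted_farewell_message": 4.0,
}

DETENTION_THRESHOLD = 5.0  # an invented cutoff


def risk_score(profile: dict) -> float:
    """Sum the weights of every indicator present in the profile."""
    return sum(
        weight
        for indicator, weight in INDICATOR_WEIGHTS.items()
        if profile.get(indicator, False)
    )


def flagged(profile: dict) -> bool:
    """Flag anyone whose aggregated score crosses the threshold."""
    return risk_score(profile) >= DETENTION_THRESHOLD


# None of these inputs is an act, or even a plan; each is ordinary
# online behavior. Yet three of them together cross the threshold.
person = {"praised_attackers": True, "changed_profile_picture": True}
print(risk_score(person), flagged(person))   # 4.0 False
person["contact_with_flagged_user"] = True
print(risk_score(person), flagged(person))   # 6.0 True
```

The sketch captures the core objection developed below: each input on its own is ordinary expression, none of it is evidence of anything, yet the sum is treated as grounds for detention – and since the weights are a state secret, none of them can ever be challenged.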

The past few years have seen algorithms used to predict the likelihood of a convict returning to crime, and those predictions used to determine whether the person should be granted parole. Such systems, when audited, often show evidence of bias – for instance, against African Americans in the United States. At least the systems used by the American justice system can be challenged: US courts are now dealing with several appeals by prisoners whose sentencing or parole denial was determined by an algorithm.

But when it comes to the military justice system as applied to Palestinians, the situation becomes much more twisted than in the U.S. Who precisely is going to oversee a system developed and used by the ISA?

As John Brown and Noam Rotem noted (Hebrew), the fact that someone fits such a profile – for instance, that he praised attackers and changed his profile picture – does not in any way constitute evidence a court will accept. Basically, we are asked to believe that a system of which we know nothing can accurately predict the future actions of a specific person, and that, on the verdict of said system, we may then detain that person – not for something he did, or even planned to do, but for something he may yet do.

The security services have two ways around this inconvenience. The first is to bypass the courts entirely: when the algorithm predicts that a Palestinian will act, the security services simply place him under administrative detention. This basically means no legal process: the military commander of the West Bank rubber-stamps an order actually issued by the ISA, and the person is thus sentenced to six months’ incarceration with no possibility of appeal. After six months, the general may rubber-stamp the detention order again, ad infinitum; people have spent long years in detention without ever seeing a court.

Administrative detention draws attention, however, and so, as Brown and Rotem noted, a second route has emerged: Israeli lawyers representing Palestinians have noticed that someone arrested by algorithm is instead charged with “incitement.” The level of proof required for this charge is quite low; even supporting a Palestinian armed group online may suffice. The smoking gun is what happens in the rare cases when an algorithm detainee is acquitted: he is almost immediately placed under administrative detention – that is, the track without any judicial oversight.

So, basically: a computer program whose biases may never be discovered, as it is a state secret, decides that a person may commit a crime; the person is then detained, but is not informed of the real evidence against him or her, since no valid evidence exists; he or she is then charged with the faux crime of “incitement”; and, should the judge refuse to be a cipher and acquit, the person is thrown into the maw of the technically lawless system of administrative detention.

All the while, the prisoner is repeatedly urged to confess to the phantom crime of “incitement”: they are informed that should they confess, they will be given a relatively light sentence – but should they maintain their innocence, they will be held in detention until the process is over. That may take several years, longer than the sentence on offer. So, our software says you’re guilty. Do you want to take it to court and go home after five years, or confess and go home after three?

While this particular legal contrivance may only be usable against Palestinians, the algorithm itself is not so limited. Former Unit 8200 soldiers are highly sought-after programmers, and what they know is surveillance. The systems field-tested on Palestinians are often later sold to other countries. Your local police department may soon acquire one.

Do you have a documented history of protest? Do you dislike the president? The government in general? Israeli-made software may soon be secretly cataloging it.

A major part of the problem, of course, is that we have become used to sharing information on social media platforms that collect numerous data points about us. We have done so willingly, for ephemeral benefits: we have forged our own prison bars.

And yet, for all that, it is someone else who is welding those bars into a cell, and we had better stop them soon – because doing so from within will be infinitely more difficult. We should not accept the normalization of tools used by a military dictatorship against an occupied people.

This article first appeared on Mondoweiss.