In 1949 the world was given one of its most important literary masterpieces: 1984 by George Orwell. The dystopian scenario the author describes was undoubtedly influenced by the Second World War, which had devastated the globe not only materially – with bombs and armed conflict – but also ideologically and psychologically, since it saw the rise of several tyrannies and non-democratic governments.
The most surprising aspect of Orwell's novel is the modernity it stages: telescreens, cameras and other devices are used by the government to guarantee full control over society in the new London. These systems were so sophisticated for their time that readers took the book as something impossible to realize, both because of the technological innovations it imagined and because of the rebirth of a tyranny capable of rewriting global rules and the world they knew.
But the most astonishing thing is that George Orwell – who died in 1950 – was right about the technological progress the majority of the global population would face over the years; he was also right that these new devices would be used by several societies to control their people, in a more or less hidden way. Moreover, tyrannies have not disappeared throughout history, and there are still countries governed by regimes that do not really allow their citizens to exercise the most basic human freedoms.
So, today, we still have this masterpiece in our libraries, and it is hard to believe it was written so many years ago, since it seems to describe our contemporary society, where you are monitored even if you apparently live in a democratic country.
From our phones to grocery store kiosks to traffic lights, cameras are watching us, and they are here to stay. Governments, law-enforcement agencies and tech companies increasingly use facial recognition to track who we are in public spaces. In an age when technology knows us better than we know ourselves, you might be asking: how can we hide? 1
It is to answer this question that figures in various fields have begun to get involved. In fashion in particular, many designers have put their work at the service of customers' security and, above all, their privacy in public life. From this desire was born what we now call anti-surveillance fashion: a way of doing fashion – spanning make-up, hairdressing, clothing and other devices – whose main objective is to hide people's identity from public cameras, so that they cannot be constantly monitored and recognized.
Here are some examples of designers - among many others - who have dedicated their production to anti-surveillance fashion:
CV Dazzle 2 is an open-source anti-facial-recognition toolkit that explores how fashion, specifically hair and makeup, can be used to camouflage oneself from new technologies. CV Dazzle was created by Adam Harvey 3 as part of his NYU Master's thesis in 2010 and is ongoing. Using low-cost methods, CV Dazzle works by manipulating the expected dark and light areas of a face in relation to what computer vision algorithms look for when identifying objects. The designs can be easily achieved using hair styling, makeup and fashion accessories. The idea behind CV Dazzle is simple: facial recognition algorithms look for certain patterns when they analyze images, so obstructing those patterns obstructs the algorithm's ability to recognize you. Techniques for doing this include creating asymmetry, using hair to conceal the face (particularly the nose bridge), applying make-up that contrasts with your skin tone in unusual tones and directions, and sticking gems on the face.
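The principle can be illustrated with a toy sketch. Classic face detectors of the kind CV Dazzle targets (e.g. Viola-Jones) score simple "Haar-like" contrast features, such as "the eye band is darker than the cheek band below it". The example below is not CV Dazzle's actual code: the pixel values, regions and threshold are invented for illustration, showing how bright makeup painted over an expected dark region can push such a feature below its detection threshold.

```python
# Toy sketch (hypothetical values, not a real detector): a Haar-like
# feature compares the brightness of two bands of a grayscale patch.
# A face normally scores high because the eye band is darker than the
# cheeks; CV Dazzle-style makeup disrupts that expected contrast.

def haar_feature(image, top_rows, bottom_rows):
    """Mean(bottom band) - mean(top band): large and positive when the
    top band (eyes) is much darker than the band below it (cheeks)."""
    top = [px for r in top_rows for px in image[r]]
    bottom = [px for r in bottom_rows for px in image[r]]
    return sum(bottom) / len(bottom) - sum(top) / len(top)

THRESHOLD = 100  # hypothetical score needed to call the region a face

# 4x4 grayscale patch: rows 0-1 = dark eye band, rows 2-3 = bright cheeks
plain_face = [
    [40, 35, 35, 40],
    [45, 30, 30, 45],
    [180, 190, 190, 180],
    [185, 195, 195, 185],
]

# Same patch with a bright, asymmetric highlight painted over one eye
# area: the expected dark/light pattern is partially erased.
dazzled_face = [
    [40, 220, 225, 40],
    [45, 230, 235, 45],
    [180, 190, 190, 180],
    [185, 195, 195, 185],
]

for name, face in [("plain", plain_face), ("dazzled", dazzled_face)]:
    score = haar_feature(face, top_rows=[0, 1], bottom_rows=[2, 3])
    print(f"{name}: score {score:.1f} -> face detected: {score > THRESHOLD}")
```

The plain patch scores 150.0 (detected) while the dazzled one drops to 52.5 (not detected). Real detectors cascade thousands of such features, which is why CV Dazzle's looks target several regions (nose bridge, eyes, cheeks) at once.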
REALFACE Glamoflauge offers a series of garments designed specifically to confuse Facebook's facial recognition software, which automatically tags your face in photos. Created by Simone C. Niquille 4, the project grew out of her Master's thesis in 2013. The t-shirt's pattern is composed of images taken from the public domain, typically celebrity faces, remixed into a bizarre and bold design that confuses facial recognition technology. Instead of identifying your face, these t-shirts keep Facebook's algorithms guessing about who exactly they are looking at.
The CHBL Jammer Coat 5 was designed by architecture firm Coop Himmelb(l)au in 2014 for the exhibition "Abiti da lavoro." It exists to distract, deter and prevent surveillance technology from recognizing your presence in any space. The CHBL Jammer Coat is made of metallized fabrics that block radio waves and shield the wearer against tracking devices. While the garment keeps you safe from radio frequency identification (RFID) readers that steal credit card information, it also makes you unreachable on your mobile device, so it is for you to decide whether the protection is worth dropping out of touch. The fabric pattern is composed of dots of various sizes arranged in waves, giving the coat a "vibrating" look. Its waviness is so visually busy that cameras fail to detect it.
The Wearable Projector 6 is a piece of headgear, from a 2017 collection titled Anonymous, that projects different faces on top of your own as you walk around in public. As if your face were a screen, the projector gives you a new appearance by constantly shifting through different looks. As with other anti-facial-recognition wearables, it alters the appearance of your face, making your true identity undiscoverable to cameras. The headgear also works while you are moving; in fact, movement adds to the "shakiness" of the flashing images, making your face even less detectable. On top of the constant shifting through images, the brightness of the projection confuses computer vision algorithms.
URME Surveillance (pronounced "you're me") is a collective dedicated to protecting the public from surveillance by providing various products that conceal a person's true identity. It began as a crowd-funded project created by artist Leo Selvaggio 7 in 2014. His goal was to make anti-surveillance available to all, especially to those participating in protests around the world. Anti-surveillance attempts in the early 2010s were typically limited to altering the appearance of your face or hiding it completely with full-face ski masks (balaclavas). Selvaggio offers another way of hiding from cameras: supplying everyone with his own face. By wearing a URME mask, you lead computer vision algorithms to identify Leo Selvaggio instead of your true identity.
Computers have become adept at identifying people in recent years, unlocking myriad applications for facial recognition, from tracking criminals to counting truants. As the use of facial recognition has grown more pervasive, raising fears over privacy, altering people's appearance to cheat cameras has become increasingly popular with artists and designers, according to fashion experts.
However, anti-facial recognition fashion has also drawn some criticism, with one academic saying it risked "normalising" surveillance. "These artworks are accepting pervasive surveillance as being inevitable," says Torin Monahan, a professor of communication at the University of North Carolina at Chapel Hill 8.
According to these critics, anti-surveillance fashion would not be used only by peaceful protesters or by people living under non-democratic governments; it would also be used by criminals to escape surveillance, the police and prosecution.
So, which use will prevail? Will these items serve worthy causes, or help criminals evade justice?