Image credit: Austin Kleon
On September 18, far-right British nationalist party Britain First posted an image to their Facebook page. Like most of the images shared by Britain First, and many of the images shared to Facebook in general, the photograph did not appear to be original; rather, it seemed to have been pulled from a Google Image Search, with text superimposed to modify its meaning. In this case, the image depicted a woman in a pastel blue burqa grasping a handgun, accompanied by text reading, ‘Terror attack level: Severe – an attack is highly likely; For security reasons it’s now time to ban the burqa’.
We now know what the administrators of the Britain First page presumably didn’t at the time: that the woman in the burqa was Malalai Kakar, the first female graduate of the Kandahar Police Academy in southern Afghanistan and a women’s rights advocate, who was murdered by the Taliban in 2008. The image was subsequently shared by Palmer United Party Senator Jacqui Lambie, after which the photographer who took the picture contacted the ABC, arguing that Kakar’s image had been ‘desecrated’ on Lambie’s page. In response, Lambie refused to delete the shared post, arguing that Kakar would have agreed with the repurposing of the image.
If it’s true that, as Lawrence Lessig has suggested, we’re living in a ‘Remix Culture’, did anybody really do anything wrong? Is there any real difference, after all, between the actions of Britain First (and, indirectly, Lambie) and those of Reddit users appropriating photographs of total strangers to create Advice Animal macros, of Girl Talk’s Greg Gillis releasing albums on which every sound is sampled and radically decontextualized, of those on the left disseminating digitally manipulated photos of Tony Abbott, or of anyone engaged in redubbing, culture jamming or political supercutting?
There may be a difference, but let’s not pretend this is anything new. The combination of Google Image Search, Photoshop, and Facebook is a powerful one, providing web users with the ability to seek out swaths of copyrighted visual material, rip and manipulate these pictures so the original source is obscured, then share the freshly ‘remixed’ images to a broad audience with no real fear of legal action. Every moment of every day, the Facebook News Feed offers up a nearly endless stream of illegally disseminated visual content, of which the original creators have no knowledge and for which they receive no compensation. Facebook rarely warns users clearly against posting content they don’t own, and it is nearly impossible to report copyrighted or misappropriated content on the site: attempting to flag an image as copyrighted leads the user down a labyrinthine path in which they are repeatedly dissuaded from filing a claim.
In the case of the misappropriated Malalai Kakar image, it’s easy to see how this particular chain of events transpired. An administrator of the Britain First page searched Google Images for ‘Burqa gun’ and, without much thought, dragged the first image on the page into their preferred desktop image editor. Haven’t thousands of others done the same? Just as Facebook subtly encourages users to fill their feeds with visual content they don’t own, Google Images is structured to enable images to be easily downloaded devoid of all context. Instead of attributing creators or signalling how the picture was originally used, every image on Google Images is simply accompanied by low-contrast text reading ‘Images may be subject to copyright’, with no further information about fair use or how to contact a creator to obtain consent. Google Image Search becomes a smorgasbord of ready-to-remix visuals that somebody, somewhere must have created, but with no clear indication as to who that somebody might be.
On some level, we all have trouble understanding why Google Images exists or how, exactly, it should be used. I’ve met more than one magazine editor who believes that once a photograph is accessible via Google Image Search, it automatically becomes part of our shared visual language, part of a de facto ‘commons’ open to reuse and remixing by all. The ability to search for ‘Burqa gun’ on Google Images and immediately gain access to an endless grid of burqa-clad women holding handguns makes it easy to mistake Google Images for a stock photo website, with every real woman in every real photo mistaken for just another model striking a pose. In the age of Google Image Search, it seems the only way to prevent the accidental misuse of an image would be to bake everything the viewer might need to know about a photograph into the image itself, but this is impractical. How could photographer Lana Slezic have constructed her portrait of Malalai Kakar to make it clear that Kakar was a police officer, that she had a complex relationship to the wearing of the garment, or – most obviously – that she was later assassinated? Few images can effectively convey this volume of information, and fewer still as compressed jpeg thumbnails. Google Images strips away the backstory and simply presents Kakar, alongside many others, as part of a broad catalogue of unknown Muslim women wielding weapons.
In Steal Like an Artist, Austin Kleon distinguishes between ‘good theft’ and ‘bad theft’. ‘Good theft’, according to Kleon, involves honor, study, credit, and theft from many sources (ideally drawing on such a vast pool of work that it becomes impossible to identify individual expropriations), while ‘bad theft’ involves degradation, skimming, plagiarism, and blatant theft from a single source. Putting one’s politics aside for a moment, this is what makes Britain First’s use of the Kakar portrait a bad remix. Had Britain First created a similar picture by reworking imagery from a vast number of sources, the same political point could have been made without misusing the work of others.
Ultimately, though, our entire online system is now set up to make bad remixes possible. Britain First gained attention because it is effortless to remix and redistribute an image of an assassinated Afghan policewoman in a way that presents that woman as a terrorist. Moreover, our entrenched misunderstanding of online copyright and our obsession with Remix Culture have led us to a situation in which the original photographer is unable to assert her ownership of the image she created, and unable to have her misappropriated work rapidly removed from a social network that continues to enable others to share it.
At the same time, our misunderstanding of the differences between creators, remixers, and curators has led many to mistakenly direct the brunt of their anger at Lambie, who really did nothing but visit another Facebook page and click ‘share’. If we want to hold web users accountable for liking or retweeting misleading, bogus, or doctored imagery, then we all need to commit to rigorously fact-checking everything we decide to engage with online. This is probably a good idea, but virtually none of us have the time to obsessively question the origins of a photograph before we ‘like’ it – we simply trust that the original uploader did that work for us.
In Remix, Lawrence Lessig writes about a woman who found a piece of her music reworked by a remixer who had ‘totally destroyed the meaning’ of her track. To Lessig’s relief, the woman ‘described how the experience had totally changed how she thought about creating music… the sound had taken on new meaning.’ Our online systems are designed to facilitate this kind of reworking of meaning, and the results can be fascinating and weird and wonderful. At the same time, though, the idea that remixing simply involves the addition, modification, or relocation of meaning can lead us to overlook the fact that, in the online space, remixing just as often results in the erasure of necessary context – context that must be painfully reconstructed, if anybody bothers to reconstruct it at all.