Deus Ex: Human Revolution echoes a very real conflict for our rights as users and consumers

There is a moment in Deus Ex: Human Revolution where you, as the protagonist Adam Jensen (cyborg security officer, international vigilante, and owner of an exquisite goatee), are asked to upgrade your cybernetic limbs by replacing a particular chip. It’s a routine process, akin to asking someone at the Apple Store to upgrade your iPhone’s software, and one that can be taken care of at any branch of the ubiquitous LIMB clinics dotted across the game’s world.

As with almost everything else in DXHR, it’s your choice whether or not to go through with this. Regardless of that choice, a few hours of play time later (um, SPOILER warning for Deus Ex: Human Revolution, I guess? You really should have played it by now, though), you encounter one of the game’s villains, a delightfully decadent corporate adversary, who makes it clear that you have to fight her chief henchman in order to proceed (as you do). Then she presses a button on a remote.

(One more heavy spoiler warning for the video below; make sure you’ve either completed the game or don’t mind the spoilers before viewing.)

Now, if you opted for the upgrade, you realise with some apprehension that none of your special powers conferred by your cybernetic implants (dubbed “augmentations” in the game) function anymore. The corporate villain has taken control of your augmentations and turned them off, which leaves you to face her henchman with your bare hands (and, granted, some considerable firepower; this is a videogame, after all).

It’s the sort of science fiction scenario that, more and more these days, seems to have worrying parallels to real-life situations. While it’s unlikely any one of us will be forced to go toe to toe with a radical extremist without being able to squeeze out little explosive candy from our forearms anytime soon, we increasingly rely on devices and services that are controlled remotely. Too often, these are devices and services whose functions we don’t completely understand or over which we have limited control.

A problem such as a manufacturer shutting down your prosthetic cyber-arm because you tried to use open-source software on it may well turn out to be a concern for our descendants. But right now, we are faced with precisely this scenario for our smartphones, our cars, our computers, and several other devices, on which we rely for business, entertainment, or basic human needs. With every mouse click and keystroke we agree to contracts that forbid us to “unlock” our smartphones, to make a copy of a legally bought movie or novel for our own private use, to access our car’s computer. Governments pass laws that enforce those same agreements and restrict users’ freedom to choose what they want to do with their own legally obtained property, often reinventing what “property” is in the process.

Instead of “property” we have “license to use”, or “access” as the EFF puts it. Yet we own our smartphone, our e-book reader, our car. We just can’t do whatever we want with them because, well, “licence”. Do you want to modify your smartphone’s software to better suit your needs? Sorry, you can’t. Not legally, anyway (although that has been changing, thankfully). Do you want to service your car at the repair shop of your choosing? Nope, you’re breaking the law. And remember when you bought 1984 for your Kindle? Funny story.

For most people the loss of their consumer rights, their privacy, and their freedom of choice is not such a big deal when it’s quid pro quo: give up your rights for ease of access to your tablet or your MacBook. They might feel differently, however, when it comes to things like their vital organs. What if your artificial heart keeps data you need for your health locked inside it, as happened to Sherwin Siy’s mother? What if your self-driving car is commandeered by the dealership, or worse, the police, because you have an unpaid parking ticket? And what if, for your own, perfectly legitimate reasons, you decided you wanted custom access to said artificial heart or self-driving car (thereby potentially breaking laws such as the United States’ DMCA)?

Enjoying the connected world and the considerable benefits these technological advancements bring to our lives is all well and good, but we never asked for this.

At the end of DXHR, you are given a choice between four possible endings. I won’t go into detail, both to avoid too many spoilers and to sidestep some of the more… colourful aspects of the story (OK, OK, the Illuminati show up. DXHR can be a very silly and a very poignant work of fiction at the same time). One of the choices is to urge the government to push for regulation of the use of augmentations on humans, rather than letting the augmentation manufacturers run amok. I mention it here both because it is the ending I went for, and because I feel it’s the best answer to such issues in the world, at least as the world is currently structured.

Our rights and liberties, protected by laws and regulations, are our own augmentations, our special powers against those who would seek to benefit from our inability to react. Perhaps this is why those real-life villains keep trying to strip them away from us, promising upgrades that actually undermine those powers – and this is why we must keep fighting against them.

Sure, in our dystopian version of the future, corporations have more power than governments, and too often government representatives can be found in the pockets of those same corporate villains. But that’s no reason to give up on the basic democratic principles that augment us in a very real and important way.
