Only We Can Make The Terminator

Feb 20, 2016

Unearthly lightning crackles with no source. A man crouches, gets up, and walks into a biker bar. He stands eye to eye with the biggest guy in the place and says, “I need your clothes, your boots, and your motorcycle.” The biker laughs at first, then presses his cigar against the unclothed man’s shoulder and gets no reaction. The man grabs the biker’s wrist. Every knuckle seems to shatter. The next thing you see, he walks out with clothes, boots, and motorcycle keys. The mysterious man is none other than… The Terminator.

The Terminator is the perfect killing machine. It’s indistinguishable from a human, it time travels, and it self-repairs. The Terminator is the biggest, baddest, scariest robot. The Terminator also represents the public’s general conception of what artificial intelligence is. The problem with the Terminator, though, is that it completely misrepresents artificial intelligence as violent and unpredictable.

I think people associate artificial intelligence with “big scary robots” because they don’t know what AI actually does. Artificial intelligence is just a tool for automating tasks that would normally take human-level intelligence, with the emphasis on the word “tool.” A tool is something that has many different use cases and acts predictably.

Artificial intelligence is actually in most niches of your technological life, quietly not plotting to kill all humans. Google Search, Facebook tags, and Gmail all use artificial intelligence for tasks that require processing large amounts of data. Google uses artificial intelligence to give you the most relevant search results. When Facebook automatically asks to tag your friends in a picture, that’s artificial intelligence recognizing their faces. Gmail’s awesome spam filters also use artificial intelligence. Most artificial intelligences aren’t even in robots like the Terminator.

A good example of a tool is a hammer. Hammers mainly help carpenters hit nails with precision and efficiency. Hammers can also help people hit other people with precision and efficiency. Both the carpenter and the malicious person are using the same hammer; the difference lies in the person swinging it. Likewise, artificial intelligence can be used to help you avoid email spam, and it could in theory be used to build human-killing machines. Artificial intelligences work like hammers in that they don’t behave unexpectedly. Google’s spam filters aren’t going to start stealing personal information, just as hammers aren’t going to start hitting people by themselves.
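To make the “predictable tool” point concrete, here is a minimal sketch of the kind of text classifier a spam filter might be built on. This is not Gmail’s actual system; it’s a toy Naive Bayes model trained on a few made-up messages, and it only ever does the one boring job it was built for: sorting text into “spam” or “ham.”

```python
# A toy spam classifier: a predictable tool, not a scheming robot.
# Assumes scikit-learn is installed; the training messages are invented.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

messages = [
    "win a free prize now", "cheap pills limited offer",      # spam
    "lunch at noon tomorrow?", "here are the meeting notes",  # not spam
]
labels = ["spam", "spam", "ham", "ham"]

# Count word frequencies, then fit a Naive Bayes model on those counts.
model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(messages, labels)

print(model.predict(["free prize offer"]))        # expected: ['spam']
print(model.predict(["notes from the meeting"]))  # expected: ['ham']
```

However the model is trained, it will keep classifying messages the same way every time you run it, which is exactly the predictability the hammer analogy is getting at.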

Computer scientists are just as afraid of violent AIs as carpenters are of people misusing hammers. In “Autonomous Weapons: An Open Letter from AI & Robotics Researchers,” the Future of Life Institute states that autonomous weapons are “feasible within years, not decades.” The letter continues: “If any major military power pushes ahead with AI weapon development, a global arms race is virtually inevitable.” I would argue that artificial intelligence isn’t the danger; people are.
