The Potential Havoc From Drones
For the past decade or two, drones have been a fixture of national security debates. The United States military routinely uses them to strike foreign targets, and it even operates a stealth model, the Lockheed Martin RQ-170 Sentinel, that most of the American public had never heard of until one crashed in Iran.
People forget that this technology is also sold online to anyone who wants it. Of course it isn't stealth-capable, but it is capable of landing on the White House lawn. That was just one drone touching down at one of the most heavily guarded buildings on the planet, flown by some random guy. The crazy thing is that it was the SECOND drone to land on the White House lawn; the first was flown by a drunk government employee.
On two separate occasions, the most the Secret Service could do was look on and react as a remote-controlled drone landed a few hundred feet from where the President could be sitting. Two occasions where it wasn't shot down by the men on the roof of the White House. THE WHITE HOUSE. A building with security so intimidating that I would be afraid to even come near the fence to take a cheesy tourist photo.
My intention with the paragraphs above was to show you just how freely drones can fly when controlled by ordinary people or drunk government employees (which is an entirely different issue). Sure, we at Social Underground love drone technology, because we are the type of people who would use it for amazing reasons, but the fringe element of society will always take something amazing and put it to darker uses.
A gif was posted on Reddit a few weeks ago of a drone equipped with a pistol that fired four shots. What went through my head was “Wow, people really like using guns for stupid reasons.”
The idea also popped into my head that there could be an overreaction to the pistol drone, one that would cause alarm and make for an endlessly repeated news story. Days passed, I forgot about it, and then I turned on my television to CNN:
The gun drone in Connecticut appears to have been fired on private property and — so far, authorities said — it did not appear any laws were broken. There were no complaints from neighbors until after the “Flying Gun” video went viral with almost 2 million views as of Tuesday, authorities said.
“It appears to be a case of technology surpassing current legislation,” police in Clinton, Connecticut, said.
Nevertheless, authorities said they are investigating whether any laws or regulations could have been broken when the handgun drone fired four shots on the wooded grounds of the 18-year-old student’s residence in Clinton, authorities said.
“We are attempting to determine if any laws have been violated at this point. It would seem to the average person, there should be something prohibiting a person from attaching a weapon to a drone. At this point, we can’t find anything that’s been violated,” Clinton Police Chief Todd Lawrie said.
“The legislature in Connecticut (recently) addressed a number of questions with drones, mostly around how law enforcement was going to use drones. It is a gray area, and it’s caught the legislature flatfooted,” the chief said.
“As luck of the draw goes, Clinton, Connecticut, got to be the test site,” he said.
Ask a police officer a few decades ago what they would have done if they saw a drone hover over their head and fly off. It sounds like the plot of an X-Files episode where Mulder goes off alone to investigate UFOs and no one believes him. The Federal Aviation Administration is currently looking into whether any laws were broken, but as Chief Lawrie said, "It is a gray area."
Most recently, leading scientists and innovators like Stephen Hawking and Steve Wozniak have come out against drones controlled by artificial intelligence. They have signed their names to an open letter opposing autonomous weapons that operate without human control.
Autonomous weapons select and engage targets without human intervention. They might include, for example, armed quadcopters that can search for and eliminate people meeting certain pre-defined criteria, but do not include cruise missiles or remotely piloted drones for which humans make all targeting decisions. Artificial Intelligence (AI) technology has reached a point where the deployment of such systems is — practically if not legally — feasible within years, not decades, and the stakes are high: autonomous weapons have been described as the third revolution in warfare, after gunpowder and nuclear arms.
Many arguments have been made for and against autonomous weapons, for example that replacing human soldiers by machines is good by reducing casualties for the owner but bad by thereby lowering the threshold for going to battle. The key question for humanity today is whether to start a global AI arms race or to prevent it from starting. If any major military power pushes ahead with AI weapon development, a global arms race is virtually inevitable, and the endpoint of this technological trajectory is obvious: autonomous weapons will become the Kalashnikovs of tomorrow. Unlike nuclear weapons, they require no costly or hard-to-obtain raw materials, so they will become ubiquitous and cheap for all significant military powers to mass-produce. It will only be a matter of time until they appear on the black market and in the hands of terrorists, dictators wishing to better control their populace, warlords wishing to perpetrate ethnic cleansing, etc. Autonomous weapons are ideal for tasks such as assassinations, destabilizing nations, subduing populations and selectively killing a particular ethnic group. We therefore believe that a military AI arms race would not be beneficial for humanity. There are many ways in which AI can make battlefields safer for humans, especially civilians, without creating new tools for killing people.
Just as most chemists and biologists have no interest in building chemical or biological weapons, most AI researchers have no interest in building AI weapons — and do not want others to tarnish their field by doing so, potentially creating a major public backlash against AI that curtails its future societal benefits. Indeed, chemists and biologists have broadly supported international agreements that have successfully prohibited chemical and biological weapons, just as most physicists supported the treaties banning space-based nuclear weapons and blinding laser weapons.
In summary, we believe that AI has great potential to benefit humanity in many ways, and that the goal of the field should be to do so. Starting a military AI arms race is a bad idea, and should be prevented by a ban on offensive autonomous weapons beyond meaningful human control. (Via)
Having a drone fleet that is controlled by AI sounds like something out of a dystopian war film where the humans get annihilated. If you think of a fleet of drones acting as one unit, it gets even scarier. One drone can make it to a target quite easily, but think of $150 drones bought in bulk, programmed with AI to hit a target, and acting like a swarm. Birds and fish do it easily, so why can’t drones do it?
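The swarm behavior described above is not exotic. Craig Reynolds' classic "boids" rules, separation, alignment, and cohesion, reproduce bird-flock motion with a few lines of vector math. Here is a minimal 2-D sketch of those three rules; every name and parameter is illustrative, and this is a toy simulation, not anything resembling a real drone controller:

```python
# Minimal 2-D "boids" flocking sketch: three steering rules
# (cohesion, alignment, separation) are enough to make independent
# agents move as one swarm. Toy illustration only.
import math
import random

class Boid:
    def __init__(self, x, y):
        self.x, self.y = x, y
        self.vx = random.uniform(-1, 1)
        self.vy = random.uniform(-1, 1)

def step(boids, sep=1.0, align=0.05, coh=0.01, min_dist=2.0):
    n = len(boids)
    cx = sum(b.x for b in boids) / n    # swarm center of mass
    cy = sum(b.y for b in boids) / n
    avx = sum(b.vx for b in boids) / n  # average heading
    avy = sum(b.vy for b in boids) / n
    for b in boids:
        # cohesion: steer toward the center of the swarm
        b.vx += (cx - b.x) * coh
        b.vy += (cy - b.y) * coh
        # alignment: steer toward the average heading
        b.vx += (avx - b.vx) * align
        b.vy += (avy - b.vy) * align
        # separation: steer away from neighbors that are too close
        for other in boids:
            if other is b:
                continue
            d = math.hypot(other.x - b.x, other.y - b.y)
            if 0 < d < min_dist:
                b.vx -= (other.x - b.x) / d * sep
                b.vy -= (other.y - b.y) / d * sep
    for b in boids:
        b.x += b.vx
        b.y += b.vy

random.seed(0)
swarm = [Boid(random.uniform(0, 50), random.uniform(0, 50)) for _ in range(30)]
for _ in range(100):
    step(swarm)
# Headings pull toward the average while spacing is maintained:
# the group drifts as one unit rather than scattering.
spread_vx = max(b.vx for b in swarm) - min(b.vx for b in swarm)
print(round(spread_vx, 3))
```

Each agent only reacts to its flockmates; no one is "in charge," which is exactly why the swarm idea is hard to defend against.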
Imagine a rogue nation with the capability to purchase 50,000 drones, arm them with bombs or other harmful payloads like chemical weapons, and send them at a target as large as a city. For under $8 million, it could have a fleet of drones attack a wide area or a single point. What could that look like?
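The price tag checks out with back-of-envelope arithmetic, using the article's own hypothetical figures of $150 per consumer drone and a 50,000-drone fleet:

```python
# Back-of-envelope cost of the hypothetical swarm described above.
unit_price = 150     # dollars per consumer drone (article's figure)
fleet_size = 50_000  # number of drones (article's figure)

total = unit_price * fleet_size
print(f"${total:,}")  # $7,500,000 -- under the $8 million cited
```

Payloads and logistics would add to that, but the airframes themselves cost less than a single modern fighter jet.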
Or you can just look at what drone swarms look like now. Yeah, we've already started building them.
Are drones the future of warfare? For America, they've been the present. Could drones with artificial intelligence eventually become a weapon as terrifying as the nuclear bomb? That depends on how far we are willing to keep pushing the envelope in our appetite for destroying ourselves.
FOLLOW JEFF SORENSEN ON TWITTER
Jeff Sorensen is an author, writer and occasional comedian living in Detroit, Michigan. You can look for more of his work on The Huffington Post, UPROXX, BGR and by just looking up his name.
Contact: jeff@socialunderground.com