
Humans vs. Machines: A Loophole in the Law


A team of robotics experts at Missouri University of Science & Technology designed this flying drone to help farmers survey their crops. (KRCG 13)

Self-driving cars, drones and robots have launched the 21st century into a new, high-tech world.

In that new world, we face new challenges. People can use technology to deliver food and provide transportation, and someday it could even be used to commit crimes. The problem is, the law has yet to catch up.

Self-driving cars have taken center stage at auto shows in recent years.

Electric carmaker Tesla has already deployed its Model S, which can keep to a lane, adjust its speed and change lanes, all without a person at the controls.

In October, San Francisco-based company Otto sent a driverless 18-wheeler on a 120-mile trip, using radar, laser-based sensors, cameras and software to navigate.

Dr. Keng Siau, head of the business and information technology department at Missouri University of Science & Technology, has spent years researching the economic and business impacts of artificial intelligence, or "AI."

He said self-driving technology puts the jobs of 3.5 million American truck drivers at risk.

"What's going to happen with these professional truck drivers?" Siau asked. "What kind of jobs are they going to do? Are we able to train them to do something else or will they be replaced?"

On a more global scale, autonomous weapons systems, also known as killer robots, face criticism.

In 2012, the Obama administration created a Department of Defense directive setting policy for the new technology, making the United States the first country ever to do so.

The policy has a five-year expiration date, and a decision on how to manage unmanned aircraft is now up to President Donald Trump.

S&T robotics professor Dr. Zhaozheng Yin said if Trump renews the policy, he'll need to better distinguish between semi-autonomous and fully autonomous weapons, such as war drones.

"It's a challenge to say they will not make any mistakes. They will possibly say the accuracy is 99.99 percent - but even with 99.99 percent you still have a very slight chance to kill some person."

The difference is human control, Yin said. It can be too dangerous for a robot to be making the decision about whether to kill. He said he'd rather see his life's work be used to help people and not to hurt them.

Yin and his team continue to work on drones that help farmers survey their land. Flying overhead with multiple cameras, for example, the drones can be trained to spot crops infected with disease.

Another drone has helped engineers monitor bridges in spots that are hard to access.

He fears people could use the technology to commit crimes.

That's where Dr. Siau said the law needs to stay ahead of the new wave.

"There is no driver in there. So who are we supposed to issue a ticket to? So how are we going to manage all this?

"So the legal aspect, the policy aspect, the regulation needs to catch up," Siau said.

In Washington D.C., a small six-wheeled robot has already been used to deliver food. It uses nine cameras, radar and sensors like those you'd find in a self-driving car. It's fully autonomous, but works with a human guide in the beginning to make sure nothing goes wrong.

Professors said the United States needs to act fast, fearing that if the law can't stay ahead, criminals could take advantage.

In 2015, a Connecticut teenager was arrested after posting online videos of a drone firing a handgun and another of the device deploying a flamethrower. The FAA launched investigations into both incidents.


"When a robot can behave like human beings, when it can think like human beings, is it going to be safe? Or is it going to be very dangerous?" Siau asked. "If you don't manage it properly, it's going to be very dangerous."



