Sunday, January 28, 2018

“Our dependence on connected technology is growing faster than our ability to secure it — in areas affecting public safety and human life.” — @iamthecavalry

Through our overdependence on undependable information technology (IT), we have created conditions in which the actions of any single outlier can have a profound and asymmetric impact on human life, economic security and national security.

We need to find political will to lead on cybersecurity affecting public safety. We need to find it now.

As society increasingly depends upon technology, the importance of effective cybersecurity must evolve in kind. In the case of connected cars, connected medicine, Industrial Internet of Things (IoT), oil and gas, smart cities and the like, the consequences of failure will bleed into public safety and human life. We must be at our best.

There is a promise and a peril to connected technologies. Medical innovations are increasing access, reducing costs, improving care and enabling breakthroughs. But if we’re cavalier about the perils, a single exotic death could trigger a crisis of confidence, causing the public or medical professionals to lose trust in these otherwise superior technologies. We must be conscientious and proactive in managing these perils.

I had the privilege to serve on the Congressionally mandated Health Care Industry Cybersecurity Task Force. While we all knew the situation was quite dire, the headline of our summary graphic correctly and candidly stated: “Healthcare Cybersecurity is in Critical Condition.” Within weeks of the June 2017 final publication of our findings, the WannaCry ransom worm took out 81 United Kingdom hospitals in a single day — over 40 percent of their national capacity. The U.S. got very, very lucky.

Worse, time is the enemy. Movement is notoriously slow in the relay race of public policy, regulation, research and development, buying cycles and deployment lifespans for safety-critical technologies. We cannot wait for such a crisis to initiate necessary hygiene. Moreover, reactions made under duress are often hurried and more prone to introducing unintended consequences.

We need to be more mature in our posture toward technology and accountability. Much of the debate over regulating technology sounds a good deal like “fire bad!” Clinging to clichés and talking points is burning valuable time for preparedness and corrective action.

Over the last 30 years, we have been reluctant to regulate software and IT. A number of concerns have fueled this reluctance — some valid, some now less so and some that never were. The chief concern has been a fear that such actions might “stifle innovation and hurt the economy.” Malware attacks like Mirai, launched from the long tail of low-cost, low-hygiene IoT devices, showed us that a failure to regulate IT can also “stifle innovation and hurt the economy.”

Uncomfortable truths command uncomfortable responses. If we want to see something different, we need to incentivize something different.

We have technical solutions for many of our exposures. What we have lacked is motivation and will. In October, I testified to the House Oversight and Government Reform Subcommittee on Information Technology about the IoT cybersecurity bill from Sen. Mark Warner, Virginia Democrat, which seeks more hygienic IoT for federal use. The House Energy and Commerce Committee asked the Health and Human Services Department to enact one of our Health Care Task Force recommendations: create a software “bill of materials” (or ingredients list) for medical technologies. Two members of Congress, Rep. Will Hurd, Texas Republican, and Rep. James Langevin, Rhode Island Democrat, joined me in August at DEF CON, the world’s largest hacker conference. Earlier that summer, the Cyber Med Summit in Phoenix hosted the first hospital hacking simulations with medical stakeholders. I am hopeful these discussions take root.

From a policy perspective, Mirai disrupted the prevailing hopes for a lighter-touch approach to regulation and policy. The belief was that adding transparency, security “nutrition labels” and a software bill of materials would enable consumers and purchasers to better discern more secure products from less secure ones. The bulk of the discussion was about enabling free-market choice. Mirai revealed the externality challenges and “tragedy of the commons” aspects of our interdependence. Yes, transparency can enable informed and conscientious individuals to buy a safer product, but choices made by others can still hurt us — severely.

At current hygiene levels, the stunning growth rate of IoT and connected technologies represents a public health issue. Hackable — but unpatchable — technologies cannot remain the norm. If you add software to something, you make it hackable. If you connect something, you make it exposed. This was bad enough when $100 internet cameras took out the internet for an afternoon; we will surely regret it when a similar attack is launched from life-and-limb medical equipment and patient care and actual lives are affected.

Mirai, WannaCry, NotPetya and attacks on the grid and critical infrastructure are increasing. If we are overdependent on undependable things, we have choices: Muster the will to ensure these things are more dependable or depend upon them less. We are prone. We are prey. Predators have taken notice. Our relative obscurity is over. What will we do about it?

Joshua Corman, a nationally recognized security expert, is Chief Security Officer and Senior Vice President at PTC. He is Founder of I Am The Cavalry (iamthecavalry.org), a global, grassroots organization that focuses on issues — such as medical devices, automobiles, home electronics and public infrastructure — where computer security intersects public safety and human life. @joshcorman.
