NIST and university researchers have proposed a new computational model for assessing cybersecurity costs in network protection.
The larger the network, the more opportunities there may be for threat actors to infiltrate it, cause damage, or steal data.
Today’s corporate networks often present a vast attack surface, including Internet of Things (IoT) devices, mobile products, remote work tools, on-prem and off-prem services, and cloud systems.
It can be a challenge for businesses to work out where cybersecurity investment matters most, but a new computational model could take some of the guesswork out of the decision.
A new paper published in IEEE/ACM Transactions on Networking, titled “Optimal Cybersecurity Investments in Large Networks Using SIS Model: Algorithm Design” and authored by US National Institute of Standards and Technology (NIST) researchers Van Sy Mai, Richard La, and Abdella Battou, proposes “a way to determine optimum investments needed to minimize the costs of securing these networks, providing recovery from infections and repairing their damage.”
The algorithm was designed with pandemic and disease tracking as inspiration: just as a virus can spread through a population with no immunity via social contact, digital malware can spread through a network via points of system-to-system contact if no protection is in place.
“A virus/malware infection in one system can spread internally, attacking other systems, potentially impacting the overall system,” NIST says. “The problem is similar to that of the spread of diseases in social networks.”
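To make that analogy concrete, the sketch below simulates discrete-time SIS dynamics on a toy network in Python. It illustrates the general modeling idea only, not the paper’s algorithms; the topology, node names, and the BETA and DELTA rates are invented for this example.

```python
import random

# Toy illustration of "susceptible-infected-susceptible" (SIS) spread on a
# network. The topology, node names, and rates below are hypothetical and
# are not taken from the NIST paper.
network = {
    "web-server": ["app-server", "iot-gateway"],
    "app-server": ["web-server", "database"],
    "database": ["app-server"],
    "iot-gateway": ["web-server"],
}

BETA = 0.3   # per-step chance an infected neighbor compromises a susceptible node
DELTA = 0.2  # per-step chance an infected node is cleaned (back to susceptible)

def step(infected):
    """Advance the SIS process by one time step (simultaneous update)."""
    nxt = set(infected)
    for node, neighbors in network.items():
        if node in infected:
            if random.random() < DELTA:
                nxt.discard(node)  # recovery: infected -> susceptible again
        elif any(n in infected and random.random() < BETA for n in neighbors):
            nxt.add(node)          # infection spreads along a network edge
    return nxt

random.seed(1)
state = {"iot-gateway"}  # initial compromise at the IoT gateway
for t in range(15):
    state = step(state)
    print(f"t={t:2d} infected={sorted(state)}")
```

Because a cleaned system returns to the susceptible pool rather than gaining immunity, infections in an SIS process can persist indefinitely, which is why the researchers focus on long-term, time-averaged costs rather than a one-off outbreak.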
The model uses datasets based on a network’s long-term behavior to generate key performance metrics for analyzing large network systems and their risk areas.
Whereas vaccination rates can be used to measure the impact of protection on a pandemic’s risk level and spread, this study imposed a time-averaged security cost on protecting different elements of a network, with the overall aim of developing cybersecurity investment strategies.
The researchers’ “susceptible-infected-susceptible” (SIS) model considered investments, economic loss, and recovery requirements caused by malware infections.
Four algorithms assess a network’s probability of being breached, the likely rate of spread, how long (and at what cost) it would take to repair the damage, and the expense associated with full recovery.
These assessments were then weighed against the model’s investment strategies, including network monitoring and diagnostics, to generate recommendations for the ‘optimal’ areas in which money should be spent to protect a network.
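As a rough illustration of the trade-off such a model optimizes, the sketch below balances protection spending against expected infection losses using the classic well-mixed SIS steady state. Every number in it is hypothetical, and the paper’s networked objective and four algorithms are far more sophisticated than this single cost function.

```python
# Hypothetical sketch of the trade-off an SIS-based investment model weighs:
# spending more on protection lowers the infection rate but raises fixed cost.
# All constants are invented for illustration and do not come from the paper.

N = 100      # systems in the network
BETA0 = 0.5  # baseline infection rate with zero investment
DELTA = 0.2  # recovery/repair rate
LOSS = 1.0   # economic loss per infected system per unit time

def time_averaged_cost(spend: float) -> float:
    """Protection spend per unit time plus expected steady-state infection loss.

    Assumes investment scales the infection rate down as BETA0 / (1 + spend)
    and uses the well-mixed SIS endemic fraction, 1 - DELTA / beta.
    """
    beta = BETA0 / (1.0 + spend)
    infected_fraction = max(0.0, 1.0 - DELTA / beta)
    return spend + N * infected_fraction * LOSS

# Crude sweep to find the spending level that minimizes total cost.
cost, spend = min((time_averaged_cost(s), s) for s in [x * 0.5 for x in range(41)])
print(f"minimum time-averaged cost {cost:.1f} at spend {spend:.1f}")
```

Even in this crude version, the optimum is neither zero spending (infections dominate the cost) nor maximum spending (protection outlay dominates), which is the kind of balance the researchers’ algorithms are designed to locate across a large, heterogeneous network.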
This study may highlight how machine learning could be harnessed to provide a foundation for future cybersecurity investment decisions. The model could also become a valuable tool for enterprise users, who today face an average cost of at least $4 million per data breach.
In related news this month, NIST has been working on better product labeling for IoT devices and software to improve cybersecurity education and help consumers make more informed choices.