The Darknet is a series of overlay networks that can only be accessed using protocols that guarantee privacy and anonymity, such as The Onion Router (better known as Tor). It is used for a wide variety of applications, including illicit purposes like drug sales and the sharing of paywalled articles.
Due to its reputation as the dark underbelly of the Internet, the Darknet is frequently targeted by cyber attacks. Thanks to its unique topology and its ability to quickly return to a stable state after a disturbance, however, most attempts to break it fail.
According to Manlio De Domenico and Alex Arenas from the Department of Computer Engineering and Mathematics at Rovira i Virgili University in Tarragona, Spain, the key to this resilience is the decentralised network of “nodes” that makes it up.
As the research duo explains in a new paper, published in the science journal Physical Review E, the Darknet is “characterized by a non-homogeneous distribution of connections, typical of scale-free networks; very short path lengths and high clustering, typical of small-world networks; and lack of a core of highly connected nodes”.
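To make those quoted properties concrete, here is a minimal pure-Python sketch that computes two of the quantities the authors mention, clustering and shortest path lengths, on a small hypothetical graph (this toy graph is an illustration only, not real Darknet data):

```python
from collections import deque

# Toy undirected graph as an adjacency dict (hypothetical, for illustration).
graph = {
    0: {1, 2, 3},
    1: {0, 2},
    2: {0, 1, 3},
    3: {0, 2, 4},
    4: {3},
}

def clustering(g, v):
    """Fraction of a node's neighbour pairs that are themselves linked."""
    nbrs = g[v]
    k = len(nbrs)
    if k < 2:
        return 0.0
    links = sum(1 for u in nbrs for w in nbrs if u < w and w in g[u])
    return 2 * links / (k * (k - 1))

def shortest_path_lengths(g, src):
    """Breadth-first search distances from src to every reachable node."""
    dist = {src: 0}
    queue = deque([src])
    while queue:
        u = queue.popleft()
        for w in g[u]:
            if w not in dist:
                dist[w] = dist[u] + 1
                queue.append(w)
    return dist

# High average clustering and short paths are the "small-world" signature.
avg_clustering = sum(clustering(graph, v) for v in graph) / len(graph)
```

A scale-free network would additionally show a heavy-tailed degree distribution, i.e. most nodes with few links and a handful of hubs, which the authors note the Darknet largely lacks.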
First, the researchers used data from the Internet Research Lab at the University of California, Los Angeles to quantify the resilience of the Darknet by way of network analysis, which allowed them to develop a model of how information is transferred via the Darknet with “onion routing” (a technique for anonymous communication over a network).
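The layering idea behind onion routing can be sketched in a few lines. The following toy uses XOR in place of real per-hop encryption, so it is deliberately insecure and is not how Tor itself works; the three-relay circuit and the key names are assumptions for illustration:

```python
# Toy illustration of onion routing's layered encryption.
# XOR stands in for real per-hop encryption -- NOT secure, illustration only.

def xor_layer(data: bytes, key: bytes) -> bytes:
    """XOR data with a repeating key; applying it twice undoes it."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def build_onion(message: bytes, hop_keys: list) -> bytes:
    """Sender wraps the message in one layer per relay, innermost layer first."""
    onion = message
    for key in reversed(hop_keys):  # the last hop's layer goes on first
        onion = xor_layer(onion, key)
    return onion

def route(onion: bytes, hop_keys: list) -> bytes:
    """Each relay peels exactly one layer with its own key."""
    for key in hop_keys:
        onion = xor_layer(onion, key)
    return onion

keys = [b"guard", b"middle", b"exit"]  # hypothetical per-hop keys
onion = build_onion(b"hello darknet", keys)
delivered = route(onion, keys)
```

The point of the layering is that each relay learns only its predecessor and successor; no single relay sees both the sender and the plaintext destination.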
Then, they implemented the same model to simulate the Darknet’s response to three types of disturbances: attacks that target specific network nodes, random failures of some nodes, and cascades of failures that propagate through the network.
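The first two disturbance types can be sketched generically: remove nodes (hubs first for a targeted attack, uniformly at random for failures) and measure how much of the network stays connected. This is a simplified illustration on a hypothetical hub-and-spoke graph, not the authors' model or data, and it omits cascades:

```python
import random
from collections import deque

def largest_component_fraction(g):
    """Size of the biggest connected component as a fraction of all nodes."""
    seen, best = set(), 0
    for start in g:
        if start in seen:
            continue
        comp, queue = {start}, deque([start])
        while queue:
            u = queue.popleft()
            for w in g[u]:
                if w not in comp:
                    comp.add(w)
                    queue.append(w)
        seen |= comp
        best = max(best, len(comp))
    return best / len(g) if g else 0.0

def remove_nodes(g, victims):
    """Return a copy of the graph with the victim nodes and their edges gone."""
    victims = set(victims)
    return {v: nbrs - victims for v, nbrs in g.items() if v not in victims}

def targeted_attack(g, n):
    """Remove the n highest-degree nodes (hubs first)."""
    hubs = sorted(g, key=lambda v: len(g[v]), reverse=True)[:n]
    return remove_nodes(g, hubs)

def random_failure(g, n, seed=0):
    """Remove n nodes chosen uniformly at random."""
    rng = random.Random(seed)
    return remove_nodes(g, rng.sample(sorted(g), n))

# Hypothetical star graph: one hub (node 0) linked to nine spokes.
star = {0: set(range(1, 10))}
star.update({i: {0} for i in range(1, 10)})

# Targeting the single hub shatters the star into isolated spokes.
after_attack = largest_component_fraction(targeted_attack(star, 1))
```

Intuitively, an Internet-like network with prominent hubs collapses quickly under a targeted attack, while the Darknet's lack of a highly connected core, as quantified in the paper, leaves fewer such single points of failure.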
Results showed that, for an attack on the Darknet to succeed, it must target four times as many nodes as an equivalent attack on the Internet. In the words of the authors themselves:
“Unexpectedly, we reveal that its peculiar structure makes the Darknet much more resilient than the Internet (used as a benchmark for comparison at a descriptive level) to random failures, targeted attacks, and cascade failures, as a result of adaptive changes in response to the attempts of dismantling the network across time.”
Sources: physics.aps.org, journals.aps.org, agenciasinc.es.