Attack tolerance
In the study of complex networks, attack tolerance refers to a network's robustness, or its ability to preserve overall connectivity and network diameter when nodes are removed due to failures or attacks. This concept is key to understanding how networks—such as the internet, social networks, or power grids—resist disruption. Several graph metrics have been developed to measure this resilience, with algebraic connectivity considered the most effective indicator of network robustness among them.
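Algebraic connectivity, mentioned above, is the second-smallest eigenvalue of the graph Laplacian L = D − A; it is zero exactly when the graph is disconnected, and larger values indicate a better-connected graph. A minimal sketch of computing it with numpy follows; the function name and the three-node examples are illustrative, not from the article:

```python
import numpy as np

def algebraic_connectivity(edges, n):
    """Second-smallest eigenvalue of the graph Laplacian L = D - A.

    Larger values indicate a better-connected, more robust graph;
    the value is zero exactly when the graph is disconnected.
    """
    A = np.zeros((n, n))
    for u, v in edges:
        A[u, v] = A[v, u] = 1.0
    L = np.diag(A.sum(axis=1)) - A        # Laplacian matrix
    eigenvalues = np.linalg.eigvalsh(L)   # returned in ascending order
    return eigenvalues[1]

# A path on three nodes (0-1-2) versus a triangle: the triangle,
# having a redundant edge, is the more robust of the two.
path = [(0, 1), (1, 2)]
triangle = [(0, 1), (1, 2), (0, 2)]
print(algebraic_connectivity(path, 3))      # 1.0 (path graph P3)
print(algebraic_connectivity(triangle, 3))  # 3.0 (complete graph K3)
```

Removing a node or edge can only lower this value, which is one reason it serves as a robustness indicator.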
Attack types
If an attack were to be mounted on a network, it would target not random nodes but those most significant to the network. Different ranking methods are used to determine a node's priority in the network.
Average node degree
This form of attack prioritizes the most highly connected nodes as the most important ones. It takes into account the network changing over time by analyzing the network as a series of snapshots; the snapshot at time t is denoted G_t. The average degree of a node v over a time interval T = {t_1, …, t_n} of snapshots is given by:

D(v) = (1 / |T|) Σ_{t ∈ T} d_t(v),

where d_t(v) is the degree of v in the snapshot G_t.
Node persistence
This form of attack prioritizes nodes that occur most frequently over a period of time. The persistence of a node v counts the snapshots of the interval T in which it appears:

p(v) = Σ_{t ∈ T} χ_t(v),

where χ_t(v) = 1 if v is present in the snapshot G_t, and χ_t(v) = 0 otherwise.
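The two ranking schemes above (average node degree and node persistence) can be sketched in a few lines of Python; the snapshot representation and the toy temporal network below are illustrative assumptions:

```python
def average_degree(node, snapshots):
    """Average degree of `node` across a sequence of snapshots.

    Each snapshot is a dict mapping a node to its set of neighbors;
    a node absent from a snapshot contributes degree 0.
    """
    return sum(len(g.get(node, ())) for g in snapshots) / len(snapshots)

def persistence(node, snapshots):
    """Number of snapshots in which `node` is present (the indicator sum)."""
    return sum(1 for g in snapshots if node in g)

# Three illustrative snapshots of a small temporal network.
snapshots = [
    {"a": {"b", "c"}, "b": {"a"}, "c": {"a"}},
    {"a": {"b"}, "b": {"a"}},
    {"b": {"c"}, "c": {"b"}},
]
print(average_degree("a", snapshots))  # (2 + 1 + 0) / 3 = 1.0
print(persistence("a", snapshots))     # present in 2 of the 3 snapshots -> 2
```

An attacker would sort the nodes by either score, descending, to obtain a removal order.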
Temporal closeness
This form of attack prioritizes nodes by the summation of temporal distances from one node to all other nodes over a period of time. Averaging these temporal distances over the interval [t_1, t_n] gives the temporal closeness of a node v:

C(v) = (1 / (N − 1)) Σ_{u ≠ v} τ(v, u),

where τ(v, u) is the temporal distance from v to u over the interval and N is the number of nodes.
Network model tolerances
Not all networks are the same, so it is no surprise that an attack on different networks would have different results. The change in a network is commonly measured through two quantities: S, the size of the largest connected cluster, and ⟨s⟩, the average size of all the isolated clusters.
Erdős–Rényi model
In the ER model, the generated network is homogeneous, meaning each node has approximately the same number of links. This is considered an exponential network. Comparing the connectivity of the ER model under random failures versus directed attacks shows that the exponential network reacts the same way to a random failure as it does to a directed attack. This is due to the homogeneity of the network: it does not matter whether a random node is selected or one is specifically targeted, since all nodes are on average equal in degree, so attacking one should cause no more damage than attacking another. As the number of attacks goes up and more nodes are removed, we observe that S decreases non-linearly, as if a threshold existed at a critical fraction f_c of removed nodes; at this point, S goes to zero. The average size ⟨s⟩ of the isolated clusters behaves in the opposite way, growing as the removed fraction approaches f_c and peaking at the threshold. This model was tested for a large range of network sizes and shown to maintain the same pattern.
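The equal impact of random and targeted removal on an ER graph can be checked with a small pure-Python experiment; the graph size, wiring probability, and helper names below are illustrative assumptions for the sketch:

```python
import random

def er_graph(n, p, rng):
    """Adjacency sets for an Erdős–Rényi random graph G(n, p)."""
    adj = {v: set() for v in range(n)}
    for u in range(n):
        for v in range(u + 1, n):
            if rng.random() < p:
                adj[u].add(v)
                adj[v].add(u)
    return adj

def largest_cluster(adj):
    """Size of the largest connected component, found by traversal."""
    seen, best = set(), 0
    for start in adj:
        if start in seen:
            continue
        stack, comp = [start], 0
        seen.add(start)
        while stack:
            node = stack.pop()
            comp += 1
            for nb in adj[node]:
                if nb not in seen:
                    seen.add(nb)
                    stack.append(nb)
        best = max(best, comp)
    return best

def remove_nodes(adj, order, steps):
    """S after cumulatively removing the first `steps` nodes in `order`."""
    adj = {v: set(nbs) for v, nbs in adj.items()}   # work on a copy
    for v in order[:steps]:
        for nb in adj.pop(v):
            adj[nb].discard(v)
    return largest_cluster(adj) if adj else 0

rng = random.Random(0)
g = er_graph(200, 0.05, rng)                        # average degree ~10
random_order = rng.sample(sorted(g), len(g))        # random failures
degree_order = sorted(g, key=lambda v: len(g[v]), reverse=True)  # attack
for frac in (0.1, 0.3, 0.5):
    k = int(len(g) * frac)
    # The two columns shrink at broadly similar rates on an ER graph.
    print(frac, remove_nodes(g, random_order, k), remove_nodes(g, degree_order, k))
```

Because removing a node can never enlarge a component, S is non-increasing along either removal order.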
Scale-free model
In the scale-free model, the network is defined by a degree distribution following a power law, which means that nodes have no characteristic number of links, unlike in the exponential network. This makes the scale-free model more vulnerable to deliberate attack: some nodes (the hubs) are far more important than others, and if these nodes are deliberately attacked, the network breaks down. However, this inhomogeneous network has its strengths when it comes to random failures. Due to the power law, the vast majority of nodes in the system have very few links, so a randomly failing node is most likely one of these. Severing these smaller nodes does not affect the network as a whole and therefore allows the structure of the network to stay approximately the same. When the scale-free model undergoes random failures, S slowly decreases with no threshold-like behavior, and ⟨s⟩ remains approximately constant near 1.
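This contrast can be sketched with a simple preferential-attachment (Barabási–Albert-style) generator and the same kind of removal experiment; the generator, parameters, and helper names here are illustrative assumptions, not the article's own model:

```python
import random

def ba_graph(n, m, rng):
    """Preferential-attachment graph: each new node links to m existing
    nodes chosen with probability proportional to their degree."""
    adj = {v: set() for v in range(n)}
    targets = list(range(m))        # seed nodes for the first arrival
    repeated = []                   # node appears here once per link (degree)
    for new in range(m, n):
        for t in targets:
            adj[new].add(t)
            adj[t].add(new)
            repeated.extend([new, t])
        targets = set()
        while len(targets) < m:     # sample m distinct, degree-weighted targets
            targets.add(rng.choice(repeated))
    return adj

def largest_cluster(adj):
    """Size of the largest connected component, found by traversal."""
    seen, best = set(), 0
    for start in adj:
        if start in seen:
            continue
        stack, comp = [start], 0
        seen.add(start)
        while stack:
            node = stack.pop()
            comp += 1
            for nb in adj[node]:
                if nb not in seen:
                    seen.add(nb)
                    stack.append(nb)
        best = max(best, comp)
    return best

def surviving_s(adj, order, steps):
    """Largest-cluster size after removing the first `steps` nodes in `order`."""
    adj = {v: set(nbs) for v, nbs in adj.items()}   # work on a copy
    for v in order[:steps]:
        for nb in adj.pop(v):
            adj[nb].discard(v)
    return largest_cluster(adj) if adj else 0

rng = random.Random(0)
g = ba_graph(300, 2, rng)
k = len(g) // 10                                    # remove 10% of nodes
hubs_first = sorted(g, key=lambda v: len(g[v]), reverse=True)
random_order = rng.sample(sorted(g), len(g))
# Targeted hub removal typically fragments the graph far more than
# removing the same number of randomly chosen nodes.
print("targeted:", surviving_s(g, hubs_first, k))
print("random:  ", surviving_s(g, random_order, k))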