Hiding behind the pseudonym MafiaBoy, this fifteen-year-old successfully halted the operations of billion-dollar companies with access to the best computer security experts in the world. (Location 129)
how could this teenager’s actions take out the largest corporations of the new economy? If a mere youth can wreak havoc on the Internet, what could a small group of trained and skilled professionals achieve? How vulnerable are we to such attacks? (Location 138)
nearly 2,000 years after Paul we are making the first inroads toward understanding what made Paul and MafiaBoy successful. We now know that the answer lies as much in the structure and topology of the networks on which they operated as in their ability to navigate them. (Location 171)
A string of recent breathtaking discoveries has forced us to acknowledge that amazingly simple and far-reaching natural laws govern the structure and evolution of all the complex networks that surround us. (Location 185)
Have you ever seen a child take apart a favorite toy? Did you then see the little one cry after realizing he could not put all the pieces back together again? Well, here is a secret that never makes the headlines: We have taken apart the universe and have no idea how to put it back together. (Location 187)
Reductionism was the driving force behind much of the twentieth century’s scientific research. To comprehend nature, it tells us, we first must decipher its components. The assumption is that once we understand the parts, it will be easy to grasp the whole. Divide and conquer; the devil is in the details. Therefore, for decades we have been forced to see the world through its constituents. We have been trained to study atoms and superstrings to understand the universe; molecules to comprehend life; individual genes to understand complex human behavior; prophets to see the origins of fads and religions. Now we are close to knowing just about everything there is to know about the pieces. But we are as far as we have ever been from understanding nature as a whole. Indeed, the reassembly turned out to be much harder than scientists anticipated. The reason is simple: Riding reductionism, we run into the hard wall of complexity. We have learned that nature is not a well-designed puzzle with only one way to put it back together. In complex systems the components can fit in so many different ways that it would take billions of years for us to try them all. Yet nature assembles the pieces with a grace and precision honed over millions of years. It does so by exploiting the all-encompassing laws of self-organization, whose roots are still largely a mystery to us. Today we increasingly recognize that nothing happens in isolation. Most events and phenomena are connected, caused by, and interacting with a huge number of other pieces of a complex universal puzzle. We have come to see that we live in a small world, where everything is linked to everything else. We are witnessing a revolution in the making as scientists from all different disciplines discover that complexity has a strict architecture. We have come to grasp the importance of networks. (Location 191)
This book has a simple aim: to get you to think networks. It is about how networks emerge, what they look like, and how they evolve. It shows you a Web-based view of nature, society, and business, a new framework for understanding issues ranging from democracy on the Web to the vulnerability of the Internet and the spread of deadly viruses. (Location 210)
You will come to appreciate how the Internet, often viewed as entirely human in its creation, has become more akin to an organism or an ecosystem, demonstrating the power of the basic laws that govern all networks. (Location 217)
Opera Omnia, the still incomplete record of Euler’s collected works, currently runs to over seventy-three volumes, six hundred pages each. (Location 234)
Almost 150 years before the new bridge, in 1736, Euler offered a rigorous mathematical proof stating that with the seven bridges such a path does not exist. He not only solved the Königsberg problem but in his brief paper inadvertently started an immense branch of mathematics known as graph theory. Today graph theory is the basis for our thinking about networks. (Location 249)
Euler’s proof is simple and elegant, easily understood even by those not trained in mathematics. Nevertheless, it is not the proof that made history but rather the intermediate step that he took to solve the problem. Euler’s great insight lay in viewing Königsberg’s bridges as a graph, a collection of nodes connected by links. For this he used nodes to represent each of the four land areas separated by the river, distinguishing them with letters A, B, C, and D. Next he called the bridges the links and connected with lines those pieces of land that had a bridge between them. He thus obtained a graph whose nodes were pieces of land and links were bridges. (Location 260)
For our purpose the most important aspect of Euler’s proof is that the existence of the path does not depend on our ingenuity to find it. Rather, it is a property of the graph. Given the layout of the Königsberg bridges, no matter how smart we are, we will never succeed at finding the desired path. (Location 268)
In retrospect, Euler’s unintended message is very simple: Graphs or networks have properties, hidden in their construction, that limit or enhance our ability to do things with them. (Location 273)
In many ways Euler’s result symbolizes an important message of this book: The construction and structure of graphs or networks is the key to understanding the complex world around us. Small changes in the topology, affecting only a few of the nodes or links, can open up hidden doors, allowing new possibilities to emerge. (Location 277)
the ultimate goal of all scientists is to find the simplest possible explanation for very complex phenomena. Erdős and Rényi took on this challenge by proposing an elegant mathematical answer to describe all complex graphs within a single framework. Since different systems follow such disparate rules in building their own networks, Erdős and Rényi deliberately disregarded this diversity and came up with the simplest solution nature could follow: connect the nodes randomly. They decided that the simplest way to create a network was to play dice: Choose two nodes and, if you roll a six, place a link between them. For any other roll of the dice, do not connect these two nodes but choose a different pair and start over. Therefore, Erdős and Rényi viewed graphs and the world they represented as fundamentally random. (Location 341)
But when you add enough links such that each node has an average of one link, a miracle happens: A unique giant cluster emerges. (Location 358)
Mathematicians call this phenomenon the emergence of a giant component, one that includes a large fraction of all nodes. Physicists call it percolation and will tell you that we just witnessed a phase transition, similar to the moment in which water freezes. Sociologists would tell you that your subjects had just formed a community. (Location 361)
Each of us is part of a large cluster, the worldwide social net, from which no one is left out. We do not know everybody on this globe, but it is guaranteed that there is a path between any two of us in this web of people. Likewise, there is a path between any two neurons in our brain, between any two companies in the world, between any two chemicals in our body. Nothing is excluded from this highly interconnected web of life. Paul Erdős and Alfréd Rényi told us why: It requires only one link per node to stay connected. (Location 367)
Random network theory tells us that as the average number of links per node increases beyond the critical one, the number of nodes left out of the giant cluster decreases exponentially. That is, the more links we add, the harder it is to find a node that remains isolated. Nature does not take risks by staying close to the threshold. It well surpasses it. (Location 378)
Erdős and Rényi acknowledged for the first time that real graphs, from social networks to phone lines, are not nice and regular. They are hopelessly complicated. Humbled by their complexity, the two assumed that these networks are random. (Location 389)
The random world of Erdős and Rényi can be simultaneously unfair and generous: It can make some poor and others rich. Yet a far-reaching prediction of Erdős and Rényi’s theory tells us that this only appears to be so. If the network is large, despite the links’ completely random placement, almost all nodes will have approximately the same number of links. (Location 425)
we live in a small world. Our world is small because society is a very dense web. We have far more friends than the critical one needed to keep us connected. Yet is six degrees something uniquely human, tied somehow to our desire to form social links? Or do other kinds of networks look the same? Answers to these questions surfaced only a few years ago. We now know that social networks are not the only small worlds. (Location 554)
The natural question is: Why? How do networks achieve such a uniformly short path despite consisting of billions of nodes? The answer lies in the highly interconnected nature of these networks. (Location 628)
These arguments can be easily turned into a mathematical formula that predicts the separation in a random network as a function of the number of nodes.² The origin of the small separation is a logarithmic term present in the formula. Indeed, the logarithm of even a very large number is rather small. The ten-based logarithm of a billion is only nine. For example, if we have two networks, both with an average of ten links per node, but one 100 times larger than the other, the separation of the larger net will be only two degrees higher than the separation of the smaller one. The logarithm shrinks the huge networks, creating the small worlds around us. (Location 636)
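A quick back-of-the-envelope check of this claim, using the standard random-network estimate d ≈ log N / log k (the formula itself is not reproduced in the excerpt; the numbers below are only illustrative):

```python
import math

def separation(n_nodes, avg_links):
    """Approximate average separation in a random network: d ~ log N / log k."""
    return math.log(n_nodes) / math.log(avg_links)

# Two networks with an average of ten links per node, one 100 times larger.
small = separation(1_000_000, 10)     # ~6 degrees
large = separation(100_000_000, 10)   # ~8 degrees
print(small, large, large - small)    # the larger net is only two degrees "wider"
```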
“Small worlds” are a generic property of networks in general. Short separation is not a mystery of our society or something peculiar about the Web: Most networks around us obey it. It is rooted in their structure—it simply doesn’t take many links for me to reach a huge number of Webpages or friends. The resulting small worlds are rather different from the Euclidean world to which we are accustomed and in which distances are measured in miles. Our ability to reach people has less and less to do with the physical distance between us. Discovering common acquaintances with perfect strangers on worldwide trips repeatedly reminds us that some people on the other side of the planet are often closer along the social network than people living next door. Navigating this non-Euclidean world repeatedly tricks our intuition and reminds us that there is a new geometry out there that we need to master in order to make sense of the complex world around us. (Location 710)
Today Granovetter’s paper, The Strength of Weak Ties, is recognized as one of the most influential sociology papers ever written. It is one of the most cited as well, featured as a Citation Classic by Current Contents in 1986. (Location 733)
“The Strength of Weak Ties”
sociology networks evidence note
Weak ties play a crucial role in our ability to communicate with the outside world. (Location 751)
Though Granovetter’s argument about the importance of weak ties at first glance may seem counterintuitive and even paradoxical, it formulates a simple truth about our social organization. Granovetter’s society, a fragmented web of fully connected clusters communicating through weak ties, is truer to our daily experience than the completely random picture offered by Erdős and Rényi. To fully understand the structure of society, somehow the theory of random networks had to be reconciled with the clustered reality depicted by Granovetter. It took almost three decades to accomplish this. Interestingly, the clue for a possible solution did not come from sociology or graph theory. (Location 759)
To gather evidence about the clustered nature of society in terms that are acceptable to a mathematician or physicist we need to be able to measure clustering. To achieve this, Watts and Strogatz introduced a quantity called the clustering coefficient. (Location 800)
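A minimal sketch of the quantity being described, assuming the usual Watts-Strogatz definition (the fraction of a node's neighbor pairs that are themselves linked); the toy friendship data is hypothetical:

```python
from itertools import combinations

def clustering_coefficient(graph, node):
    """Fraction of possible links among a node's neighbors that actually exist."""
    neighbors = graph[node]
    k = len(neighbors)
    if k < 2:
        return 0.0
    realized = sum(1 for a, b in combinations(neighbors, 2) if b in graph[a])
    return realized / (k * (k - 1) / 2)

# Hypothetical circle of friends stored as an adjacency dictionary.
friends = {
    "ann": {"bob", "cat", "dan"},
    "bob": {"ann", "cat"},
    "cat": {"ann", "bob"},
    "dan": {"ann"},
}
print(clustering_coefficient(friends, "ann"))  # 1 of 3 possible neighbor pairs -> 0.33
```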
reassembling the highly interwoven network of 70,975 mathematicians connected by over 200,000 coauthorship links. If the mathematicians had chosen their coauthors randomly, the resulting random network would be predicted by the Erdős-Rényi theory to have a very small clustering coefficient, approximately 10⁻⁵. However, our measurements indicated that the clustering coefficient for the real collaboration network is about 10,000 times larger than that, proving that mathematicians do not pick their collaborators randomly. Rather, they form a highly clustered network, similar to the one spotted by Granovetter in society at large. (Location 841)
Newman’s paper proved that the day-to-day business of science is conducted in densely linked clusters of scientists connected by occasional weak ties. His work, combined with our own, offered quantitative evidence for something we had felt to be true all along but that was notoriously difficult to measure before computers: Clustering is indeed present in social systems. (Location 850)
science social systems clustering
The discovery that clustering is ubiquitous has rapidly elevated it from a unique feature of society to a generic property of complex networks and posed the first serious challenge to the view that real networks are fundamentally random. (Location 873)
The surprising finding of Watts and Strogatz is that even a few extra links are sufficient to drastically decrease the average separation between the nodes. These few links will not significantly change the clustering coefficient. Yet thanks to the long bridges they form, often connecting nodes on the opposite side of the circle, the separation between all nodes spectacularly collapses. The model’s ability to severely decrease the separation while keeping the clustering coefficient practically unchanged indicates that we can afford to be very provincial in choosing our friends, as long as a small fraction of the population has some long-range links. (Location 902)
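A small numerical illustration of this effect, using the networkx library (an assumption of this sketch, not something the book relies on): rewire a tiny fraction of a ring lattice's links and watch the separation collapse while clustering barely moves.

```python
import networkx as nx

for p in (0.0, 0.01, 0.1):  # fraction of links rewired into random long-range bridges
    g = nx.connected_watts_strogatz_graph(n=1000, k=10, p=p, seed=42)
    print(f"p={p:<5} separation={nx.average_shortest_path_length(g):6.2f} "
          f"clustering={nx.average_clustering(g):.3f}")
# Even p=0.01 (one percent of links rewired) cuts the separation dramatically,
# while the clustering coefficient stays close to the regular-lattice value.
```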
Today we understand that the Watts-Strogatz model is not incompatible with the Erdős-Rényi worldview. To be sure, by assuming that we start with a regular lattice, it does allow for clusters. But in many ways, its fundamental philosophy continues to follow closely the Erdős-Rényi vision. Indeed, apart from the initial arranging of the nodes along a circle, we connect the nodes completely randomly to each other. Therefore both models depict a deeply egalitarian society, whose links are ruled by the throw of the dice. (Location 918)
Connectors—nodes with an anomalously large number of links—are present in very diverse complex systems, ranging from the economy to the cell. They are a fundamental property of most networks, a fact that intrigues scientists from disciplines as disparate as biology, computer science, and ecology. Their discovery has turned everything we thought we knew about networks on its head. Clustering exposed the first crack in the Erdős-Rényi random worldview. The simple model of Watts and Strogatz, discussed in the previous chapter, saved the day, reconciling the circle of friends with six degrees of separation. The connectors are the final blow to both models. Accounting for these highly connected nodes requires abandoning once and for all the random worldview. (Location 947)
If the Web were a random network, they would be right. But it is not. The most intriguing result of our Web-mapping project was the complete absence of democracy, fairness, and egalitarian values on the Web. We learned that the topology of the Web prevents us from seeing anything but a mere handful of the billion documents out there. (Location 956)
The hubs are the strongest argument against the utopian vision of an egalitarian cyberspace. Yes, we all have the right to put anything we wish on the Web. But will anybody notice? If the Web were a random network, we would all have the same chance to be seen and heard. In a collective manner, we somehow create hubs, Websites to which everyone links. They are very easy to find, no matter where you are on the Web. Compared to these hubs, the rest of the Web is invisible. For all practical purposes, pages linked by only one or two other documents do not exist. It is almost impossible to find them. Even the search engines are biased against them, ignoring them as they crawl the Web looking for the hottest new sites. (Location 983)
The truly central position in networks is reserved for those nodes that are simultaneously part of many large clusters. They are the actors who have played in very different genres during their careers. (Location 1036)
Hubs appear in most large complex networks that scientists have been able to study so far. They are ubiquitous, a generic building block of our complex, interconnected world. (Location 1068)
The attention to hubs is well deserved. Hubs are special. They dominate the structure of all networks in which they are present, making them look like small worlds. Indeed, with links to an unusually large number of nodes, hubs create short paths between any two nodes in the system. Consequently, while the average separation between two randomly selected people on Earth is six, the distance between anybody and a connector is often only one or two. (Location 1076)
we have found that hubs are not rare accidents of our interlinked universe. Instead, they follow strict mathematical laws whose ubiquity and reach challenge us to think very differently about networks. (Location 1086)
In the past few decades scientists have recognized that on occasion nature generates quantities that follow a power law distribution instead of a bell curve. Power laws are very different from the bell curves describing our heights. First, a power law distribution does not have a peak. Rather, a histogram following a power law is a continuously decreasing curve, implying that many small events coexist with a few large events. If the heights of an imaginary planet’s inhabitants followed a power law distribution, most creatures would be really short. But nobody would be surprised to see occasionally a hundred-feet-tall monster walking down the street. In fact, among six billion inhabitants there would be at least one over 8,000 feet tall. So the distinguishing feature of a power law is not only that there are many small events but that the numerous tiny events coexist with a few very large ones. These extraordinarily large events are simply forbidden in a bell curve. (Location 1126)
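The imaginary planet can be simulated in a few lines; the parameters below (mean height, power-law exponent) are arbitrary illustrations, not figures from the book:

```python
import random

random.seed(1)
N = 1_000_000
bell = [random.gauss(5.7, 0.3) for _ in range(N)]            # bell-curve "heights" in feet
power = [5.7 * random.paretovariate(2.1) for _ in range(N)]  # power-law "heights" in feet

print(max(bell))   # stays within a few standard deviations of the mean, around 7 feet
print(max(power))  # the tallest inhabitant can be hundreds or thousands of "feet" tall
```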
Subsequently, in numerous large networks that we and many other scientists have had a chance to investigate, an amazingly simple and consistent pattern has emerged: The number of nodes with exactly k links follows a power law, each with a unique degree exponent that for most systems varies between two and three. (Location 1149)
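Written out (a standard formulation, not quoted from the book), the contrast with a random network's degree distribution looks like this:

```latex
% Scale-free network: the probability that a node has exactly k links decays as
% a power law, with a degree exponent gamma typically between 2 and 3.
P(k) \sim k^{-\gamma}, \qquad 2 < \gamma < 3
% Random (Erdos-Renyi) network: a Poisson distribution peaked at the average
% degree <k>, which is why large hubs are essentially forbidden there.
P(k) = e^{-\langle k \rangle}\,\frac{\langle k \rangle^{k}}{k!}
```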
implies that the vast majority of nodes have the same number of links and that nodes deviating from the average are extremely rare. Therefore, a random network has a characteristic scale in its node connectivity, embodied by the average node and fixed by the peak of the degree distribution. In contrast, the absence of a peak in a power-law degree distribution implies that in a real network there is no such thing as a characteristic node. We see a continuous hierarchy of nodes, spanning from rare hubs to the numerous tiny nodes. The largest hub is closely followed by two or three somewhat smaller hubs, followed by dozens that are even smaller, and so on, eventually arriving at the numerous small nodes. (Location 1166)
The power law distribution thus forces us to abandon the idea of a scale, or a characteristic node. In a continuous hierarchy there is no single node which we could pick out and claim to be characteristic of all the nodes. There is no intrinsic scale in these networks. This is the reason my research group started to describe networks with power-law degree distribution as scale-free. With the realization that most complex networks in nature have a power-law degree distribution, the term scale-free networks rapidly infiltrated most disciplines faced with complex webs. (Location 1171)
The slowly decaying power law distribution accommodates such highly linked anomalies in a natural way. It predicts that each scale-free network will have several large hubs that will fundamentally define the network’s topology. (Location 1186)
His most celebrated discovery was that income distribution follows a power law, implying that most money is earned by a few very wealthy individuals, while the majority of the population earn rather small amounts. (Location 1194)
Pareto’s finding implies that roughly 80 percent of money is earned by only 20 percent of the population, an inequality that is still with us a hundred years after Pareto’s discovery. (Location 1195)
Power laws formulate in mathematical terms the notion that a few large events carry most of the action. Power laws rarely emerge in systems completely dominated by a roll of the dice. Physicists have learned that most often they signal a transition from disorder to order. (Location 1199)
Power law distribution
networks distribution emergence chaos order quote note power laws
Thus the power laws we spotted on the Web indicated, for the first time in precise mathematical terms, that real networks are far from random. Complex networks finally started to speak to us in a language that scientists trained in self-organization and complexity could finally understand. They spoke of order and emerging behavior. We just needed to listen carefully. (Location 1201)
Exchanging their water dance for the cold crystalline order of a solid is one of the best-known examples of a phase transition, a phenomenon that physicists had sought to understand for decades prior to the 1960s. (Location 1222)
The freezing of a liquid and the emergence of a magnet are both transitions from disorder to order. (Location 1227)
Such sudden transitions hold the key to a deep question about how nature works, of equal interest to scientists and philosophers alike: How does order emerge from disorder? (Location 1231)
The ordered and the disordered states of a magnet correspond to thermodynamically distinct phases of matter. Right at the transition point the system is poised to choose between these two phases, just like a climber on a crest choosing which side to go down the mountain. Undecided which way to go, the system frequently goes back and forth, and its vacillations increase near the critical point. These vacillations have experimentally measurable consequences. Near the critical point, elements of order and disorder mix within the same material, signaling that the system explores both sides of the crest. In metals close to the transition temperature, clusters of atoms develop whose spins point in the same direction. The closer the metal gets to the critical point, the larger these ordered magnetic clusters become. (Location 1233)
emergence complexity phase transition chaos quote
The increasing amount of experimental evidence collected by physicists during the 1960s indicated that in the vicinity of the critical point several key quantities follow power laws. (Location 1239)
The disorder-order transition started to display an amazing degree of mathematical consistency. The problem was that nobody knew why. Why do liquids, magnets, and superconductors lose their identity at some critical point and decide to follow identical power laws? What is behind the high degree of similarity between such disparate systems? And what do power laws have to do with it? (Location 1247)
By giving a rigorous mathematical foundation to scale invariance, his theory spat out power laws each time he approached the critical point, the place where disorder makes room for order. Wilson’s renormalization group not only called for power laws but for the first time could predict the values of the two missing critical exponents as well. (Location 1272)
Nature normally hates power laws. In ordinary systems all quantities follow bell curves, and correlations decay rapidly, obeying exponential laws. But all that changes if the system is forced to undergo a phase transition. Then power laws emerge—nature’s unmistakable sign that chaos is departing in favor of order. The theory of phase transitions told us loud and clear that the road from disorder to order is maintained by the powerful forces of self-organization and is paved by power laws. It told us that power laws are not just another way of characterizing a system’s behavior. They are the patent signatures of self-organization in complex systems. (Location 1276)
If power laws are the signature of systems in transition from chaos to order, what kind of transition is taking place in complex networks? If power laws appear in the vicinity of a critical point, what tunes real networks to their own critical point, allowing them to display a scale-free behavior? We had come to understand critical phenomena after physicists uncovered the mechanisms governing phase transitions; rigorous theories now allow us to calculate with high precision all quantities characterizing systems giving birth to order. But so far, in networks we had only observed the hubs. We now knew that they were the consequence of power laws—a hint of self-organization and order. To be sure, this was an important breakthrough, allowing us to remove networks from the realm of the random. But the most important questions, pertaining to the mechanisms that are responsible for the hubs and the power laws, were still unanswered. Are real networks in a continuous state of transition from disorder to order? Why do hubs appear in networks of all kinds, ranging from actors to the Web? Why are they described by power laws? Are there fundamental laws forcing different networks to take up the same universal form and shape? How does nature spin its webs? (Location 1298)
Despite their diversity most real networks share an essential feature: growth. Pick any network you can think of and the following will likely be true: Starting with a few nodes, it grew incrementally through the addition of new nodes, gradually reaching its current size. (Location 1368)
our examples suggested that for real networks the static hypothesis is not appropriate. Instead, we should incorporate growth into our network models. This was the initial insight we gained while trying to explain the hubs. In so doing, we ended up dethroning the first fundamental assumption of the random universe—its static character. (Location 1372)
In real networks linking is never random. Instead, popularity is attractive. (Location 1419)
real networks are governed by two laws: growth and preferential attachment. Each network starts from a small nucleus and expands with the addition of new nodes. Then these new nodes, when deciding where to link, prefer the nodes that have more links. These laws represent a significant departure from earlier models, which assumed a fixed number of nodes that are randomly connected to each other. (Location 1423)
most cases when growth and preferential attachment are simultaneously present, hubs and power laws emerge as well. In complex networks a scale-free structure is not the exception but the norm, which explains its ubiquity in most real systems. (Location 1490)
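A minimal sketch of these two mechanisms (not the authors' code; the nucleus size and the single link per new node are arbitrary choices): grow the network one node at a time and let each newcomer pick its target with probability proportional to the target's current number of links.

```python
import random
from collections import Counter

def grow_network(n_nodes, seed=0):
    rng = random.Random(seed)
    edges = [(0, 1), (1, 2), (0, 2)]            # small fully connected nucleus
    # Every endpoint of every link goes into "targets", so drawing uniformly
    # from it is preferential attachment: well-linked nodes are drawn more often.
    targets = [n for e in edges for n in e]
    for new in range(3, n_nodes):
        old = rng.choice(targets)
        edges.append((new, old))
        targets.extend((new, old))
    return edges

degree = Counter(n for e in grow_network(100_000) for n in e)
print(degree.most_common(3))    # a few hubs with hundreds of links each
print(sum(1 for d in degree.values() if d <= 2), "nodes have only one or two links")
```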
Networks are not en route from a random to an ordered state. Neither are they at the edge of randomness and chaos. Rather, the scale-free topology is evidence of organizing principles acting at each stage of the network formation process. (Location 1507)
networks scale-free network topology
and preferential attachment, the two basic mechanisms governing network evolution. It changes, however, what is considered attractive in a competitive environment. In the scale-free model, we assumed that a node’s attractiveness was determined solely by its number of links. In a competitive environment, fitness also plays a role: Nodes with higher fitness are linked to more frequently. (Location 1572)
Bianconi’s calculations first confirmed our suspicion that in the presence of fitness the early bird is not necessarily the winner. Rather, fitness is in the driver’s seat, making or breaking the hubs. (Location 1589)
the speed at which nodes acquire links is no longer a matter of seniority. Independent of when a node joins the network, a fit node will soon leave behind all nodes with smaller fitness. (Location 1594)
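A rough sketch of the fitness idea under stated assumptions (random fitness values, one link per new node; this is not the authors' code): the probability of receiving a new link is proportional to degree multiplied by fitness, so a late but fit node can overtake the early birds.

```python
import random

def grow_with_fitness(n_nodes, seed=0):
    rng = random.Random(seed)
    fitness = [rng.random() for _ in range(n_nodes)]   # each node's innate fitness
    degree = [1, 1]                                    # two seed nodes joined by a link
    for new in range(2, n_nodes):
        weights = [degree[i] * fitness[i] for i in range(new)]
        old = rng.choices(range(new), weights=weights)[0]
        degree.append(1)
        degree[old] += 1
    return fitness, degree

fitness, degree = grow_with_fitness(5_000)
biggest = max(range(len(degree)), key=degree.__getitem__)
print(f"biggest hub: node {biggest}, fitness {fitness[biggest]:.2f}, {degree[biggest]} links")
# The biggest hub is typically a high-fitness node, not necessarily node 0 or 1.
```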
Using a simple mathematical transformation,⁸ Bianconi substituted fitness for energy, assigning an individual energy level to each node in the fitness model. Suddenly the calculations took on an unsuspected meaning: They started to resemble those that Einstein ran across eighty years earlier when he discovered the condensate. This could have been coincidental but of no consequence. But there was indeed a precise mathematical mapping between the fitness model and a Bose gas. According to this mapping, each node in the network corresponds to an energy level in the Bose gas. The larger the node’s fitness, the smaller its corresponding energy level. The links of the network turned into particles in the gas, each assigned to a given energy level. Adding a new node to the network is like adding a new energy level to the Bose gas; adding a new link to the network is the same as injecting a new Bose particle into the gas. In this mapping, complex networks are like a huge quantum gas, their links behaving like subatomic particles. (Location 1656)
The most important prediction resulting from this mapping is that some networks can undergo Bose-Einstein condensation. The consequences of this prediction can be understood without knowing anything about quantum mechanics: It is, simply, that in some networks the winner can take all. Just as in a Bose-Einstein condensate all particles crowd into the lowest energy level, leaving the rest of the energy levels unpopulated, in some networks the fittest node could theoretically grab all the links, leaving none for the rest of the nodes. The winner takes all. (Location 1672)
the mathematical tools developed decades earlier to describe quantum gases enabled us to see that, independent of the nature of links and nodes, a network’s behavior and topology are determined by the shape of its fitness distribution. (Location 1680)
But even though each system, from the Web to Hollywood, has a unique fitness distribution, Bianconi’s calculation indicated that in terms of topology all networks fall into one of only two possible categories. In most networks the competition does not have an easily noticeable impact on the network’s topology. In some networks, however, the winner takes all the links, a clear signature of Bose-Einstein condensation. (Location 1682)
As long as we thought of networks as random, we modeled them as static graphs. The scale-free model reflects our awakening to the reality that networks are dynamic systems that change constantly through the addition of new nodes and links. The fitness model allows us to describe networks as competitive systems in which nodes fight fiercely for links. Now Bose-Einstein condensation explains how some winners get the chance to take it all. (Location 1742)
In general, natural systems have a unique ability to survive in a wide range of conditions. Although internal failures can affect their behavior, they often sustain their basic functions under very high error rates. This is in stark contrast to most products of human design, in which the breakdown of a single component often handicaps the whole device. Lately, scientists from all disciplines have recognized the resilience of nature’s designs, raising the hope that we can exploit that convenience in human-made structures. (Location 1786)
Most systems displaying a high degree of tolerance against failures share a common feature: Their functionality is guaranteed by a highly interconnected complex network. (Location 1797)
Decades of research on random networks, however, had indicated that network breakdown is not a gradual process. Removing only a few nodes will have little impact on the network’s integrity. Yet, if the number of removed nodes reaches a critical point, the system abruptly breaks into tiny unconnected islands. Failures in random networks offer an example of an inverse phase transition: There is a critical error threshold below which the system is relatively unharmed. Above this threshold, however, the network simply falls apart. (Location 1815)
Computer simulations we performed on networks generated by the scale-free model indicated that a significant fraction of nodes can be randomly removed from any scale-free network without its breaking apart. This unsuspected robustness against failures is a property of scale-free networks that is not shared by random networks. (Location 1827)
Topological robustness is thus rooted in the structural unevenness of scale-free networks: Failures disproportionately affect small nodes. (Location 1842)
for scale-free networks the critical threshold disappears in cases where the degree exponent is smaller than or equal to three. Amazingly, most networks of interest, ranging from the Internet to the cell, are scale-free and have a degree exponent smaller than three. Therefore, these networks break apart only after all nodes have been removed—or, for all practical purposes, never. (Location 1853)
we no longer selected the nodes randomly but attacked the network by targeting the hubs. First, we removed the largest hub, followed by the next largest, and so on. The consequences of our attack were evident. The removal of the first hub did not break the system, because the rest of the hubs were still able to hold the network together. After the removal of several hubs, however, the effect of the disruptions was clear. Large chunks of nodes were falling off the network, becoming disconnected from the main cluster. As we pushed further, removing even more hubs, we witnessed the network’s spectacular collapse. The critical point, conspicuously absent under failures, suddenly reemerged when the network was attacked. The removal of a few hubs broke the Internet into tiny, hopelessly isolated pieces. (Location 1872)
If crackers launched a successful attack against the largest Internet hubs, the potential damage could be tremendous. This is not a consequence of bad design or flaws in Internet protocols. Such vulnerability to attack is an inherent property of all scale-free networks. (Location 1881)
The response of scale-free networks to attacks is similar to the behavior of random networks under failures. There is a crucial difference, however. We do not need to remove a large number of nodes to reach the critical point. Disable a few of the hubs and a scale-free network will fall to pieces in no time. (Location 1888)
“Scale-free network” “Random network” “Network”
Taken together, the findings indicate that scale-free networks are not vulnerable to failures. The price of this unprecedented resilience comes in their fragility under attack. The removal of the most connected nodes rapidly disintegrates these networks, breaking them into tiny noncommunicating islands. Therefore, hidden within their structure, scale-free networks harbor an unsuspected Achilles’ heel, coupling a robustness against failures with vulnerability to attack. (Location 1896)
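A small simulation in the spirit of the experiment described above, using networkx (an assumption of this sketch; the book reports analogous results for maps of the real Internet rather than for this synthetic network):

```python
import random
import networkx as nx

def largest_fraction_left(g, removal_order, n_removed):
    """Fraction of nodes still in the largest connected cluster after removals."""
    h = g.copy()
    h.remove_nodes_from(removal_order[:n_removed])
    if h.number_of_nodes() == 0:
        return 0.0
    return max(len(c) for c in nx.connected_components(h)) / g.number_of_nodes()

g = nx.barabasi_albert_graph(n=10_000, m=2, seed=7)        # synthetic scale-free network
failures = list(g.nodes())
random.Random(0).shuffle(failures)                          # random node failures
attack = sorted(g.nodes(), key=g.degree, reverse=True)      # remove the biggest hubs first

for n in (500, 1000, 2000):
    print(f"{n} nodes removed -> "
          f"failure: {largest_fraction_left(g, failures, n):.2f}, "
          f"attack: {largest_fraction_left(g, attack, n):.2f}")
# Random failures leave the giant cluster largely intact; targeting hubs shatters it.
```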
The 1996 blackout is a typical example of what scientists often call a cascading failure. When a network acts as a transportation system, a local failure shifts loads or responsibilities to other nodes. If the extra load is negligible, it can be seamlessly absorbed by the rest of the system, and the failure remains effectively unnoticed. If the extra load is too much for the neighboring nodes to carry, they will either tip or again redistribute the load to their neighbors. Either way, we are faced with a cascading event, the magnitude and reach of which depend on the centrality and capacity of the nodes that have been removed in the first round. (Location 1926)
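A toy version of this load-shifting mechanism, under simple stated assumptions (uniform loads and capacities, a failed node's load split evenly among its surviving neighbors); the five-node "grid" is hypothetical:

```python
def cascade(neighbors, load, capacity, first_failure):
    """Propagate a failure: a failed node's load is split among surviving neighbors."""
    failed = {first_failure}
    frontier = [first_failure]
    while frontier:
        node = frontier.pop()
        alive = [n for n in neighbors[node] if n not in failed]
        if not alive:
            continue
        share = load[node] / len(alive)
        for n in alive:
            load[n] += share
            if load[n] > capacity[n]:     # the neighbor tips over and fails in turn
                failed.add(n)
                frontier.append(n)
    return failed

neighbors = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 4], 3: [1], 4: [2]}
load = {n: 1.0 for n in neighbors}
capacity = {n: 1.4 for n in neighbors}
print(sorted(cascade(neighbors, load, capacity, first_failure=0)))
# One local failure at node 0 is enough to take down the entire toy grid.
```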
favorite networks chain reaction nonlinear systems
the removal of several large nodes could easily create the same catastrophic disruption on the Internet as the dropping power line in Oregon did to the power system. (Location 1936)
Cascading failures are frequent phenomena in the economy. Indeed, many attribute the East Asian economic crisis of 1997, to be discussed in more detail in Chapter 14, to the pressure the International Monetary Fund (IMF) put on the central banks of several Pacific nations, limiting their ability to provide emergency credit to troubled banks. These banks, in turn, called in their loans from companies, turning the decision of the IMF, arguably the biggest hub of the financial network, into a cascade of bank and corporation failures. (Location 1937)
Despite these advances, our understanding of cascading failures is rather limited. Topological robustness is a structural feature of networks. Cascading failures, however, are a dynamic property of complex systems, a relatively uncharted territory. I would not be surprised to learn that there are still undiscovered laws that govern cascading failures. The discovery of those laws could have profound implications for many fields, ranging from the Internet to marketing. (Location 1949)
The price of this topological robustness, however, is extreme exposure to attacks. Taking out a hierarchy of highly connected hubs will break any system. This is bad news for the Internet, since it allows crackers to design strategies that can harm the whole infrastructure. It is bad news for our economic establishment as well, for it indicates that, by focusing on the networks behind the economy, one can design strategies to cripple it. The results of the research described in this chapter thus forced us to acknowledge that topology, robustness, and vulnerability cannot be fully separated from one another. All complex systems have their Achilles’ heel. We have learned that topology matters, prompting us to better appreciate the hubs. This is the first step towards defending them. (Location 1957)
The Pfizer study demonstrated that innovations spread from innovators to hubs. The hubs in turn send the information out along their numerous links, reaching most people within a given social or professional network. (Location 2071)
Hubs, the integral components of scale-free networks, are the statistically rare, highly connected individuals who keep social networks together. (Location 2072)
Recognizing that passing a critical threshold is the prerequisite for the spread of fads and viruses was probably the most important conceptual advance in understanding spreading and diffusion. Currently the critical threshold is part of every diffusion theory. Epidemiologists work with it when they model the probability that a new infection will turn into an epidemic, as the AIDS virus did. Marketing textbooks talk about it when estimating the likelihood that a product will make it in the marketplace or to understand why some never do. Sociologists use it to explain the spread of birth control practices among women. Political science exploits it to explain the life cycle of parties and movements or to model the likelihood that peaceful demonstrations will turn into riots. (Location 2108)
critical threshold phase transition emergence networks nonlinearity
in contrast to the solid predictions of threshold models, in real networks high virulence does not guarantee a virus’s spread. (Location 2135)
In scale-free networks the epidemic threshold miraculously vanished! That is, even if a virus is not very contagious, it spreads and persists. Defying all wisdom accumulated during five decades of diffusion studies, viruses traveling in scale-free networks do not appear to notice any threshold. They are practically unstoppable. (Location 2162)
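A rough numerical check of why the threshold vanishes, using the standard SIS-model estimate that the threshold scales as <k>/<k²> (a textbook relation, not quoted in these excerpts) and networkx-generated test networks:

```python
import networkx as nx

def epidemic_threshold(g):
    """SIS estimate: threshold ~ <k> / <k^2> over the degree distribution."""
    degrees = [d for _, d in g.degree()]
    k1 = sum(degrees) / len(degrees)
    k2 = sum(d * d for d in degrees) / len(degrees)
    return k1 / k2

for n in (1_000, 10_000, 100_000):
    random_net = nx.gnm_random_graph(n, 3 * n, seed=1)     # Erdos-Renyi, average degree ~6
    scale_free = nx.barabasi_albert_graph(n, 3, seed=1)    # scale-free, average degree ~6
    print(n, "random:", round(epidemic_threshold(random_net), 3),
          "scale-free:", round(epidemic_threshold(scale_free), 3))
# The random network's threshold stays put; the scale-free one keeps shrinking
# as the network grows, so even a barely contagious virus can persist.
```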
If you are a fifteen-year-old in Botswana today, your risk of contracting and dying of AIDS during your lifetime is almost 90 percent. (Location 2220)
Zoltán Dezső undertook this comparison and we were surprised by the results. To be sure, each policy that continued to distribute the treatments randomly continued to have zero threshold and failed to stop the virus. But any policy that displayed bias toward the more connected nodes, even a small bias, restored the finite epidemic threshold. That is, even if we are not successful in finding all hubs, by trying to do so we can lower the rate at which the disease spreads. (Location 2245)
What neither computer scientists nor biologists know is how the large-scale (Location 2390)
A few well-trained crackers could destroy the net in thirty minutes from anywhere in the world. There are many ways to accomplish this, from breaking into the computers running several key routers to launching denial-of-service attacks against the busiest nodes. The Code (Location 2463)
Researchers studying these huge samples have made some fascinating discoveries. They have found that the Web is fragmented into continents and communities, limiting and determining our behavior in the online universe. Paradoxically, they have also told us that there is terra incognita out there, whole continents of the Web never visited or seen by robots. Most important, we learned that the structure of the World Wide Web (Location 2550)
These four continents significantly limit the Web’s navigability. How far we can get surfing depends on where we start. Taking off from a node belonging to the central core, we can reach all pages belonging to this major continent. No matter how many times we are willing to click, however, about half of the Web will still be invisible to us, since the IN land and the isolated islands cannot be reached from the core. If we step out of this core, into the OUT land, we will soon hit a dead end. If we start our journey from a tendril or an isolated island, the Web will appear very tiny because only the other documents on the same island will be reachable. If your Webpage is on an island, the search engines will never discover it, unless you submit your URL address to them. Therefore, our ability to map out the full World Wide Web is not only a question of resources or economic incentives. The directedness of the links creates a very fragmented Web dominated by four major continents. Search engines have an easy time mapping out about half of it, the connected component and the OUT land, since the nodes belonging to them can be located starting from any node of the frequently visited central core. However, the other half of the Web, made up of the islands and IN land, is hopelessly isolated. No matter how hard the robots try, they will not be able to find the documents on them. This is why most search engines allow you to submit the address of your Website. If you do that, they can start crawling from it and potentially discover links to regions of the Web where they have never been. If you refuse to volunteer this information, many nodes could be residing in terra incognita for years to come. (Location 2640)
The bottom line is that all directed networks break into the same four continents. Their (Location 2662)
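A sketch (not the researchers' actual algorithm) of how the four continents of a directed network can be separated with networkx: the central core is the largest strongly connected component, IN is whatever can reach it, OUT is whatever it can reach, and the rest are tendrils and islands. The seven-page web below is hypothetical.

```python
import networkx as nx

def bow_tie(g):
    core = max(nx.strongly_connected_components(g), key=len)
    seed = next(iter(core))
    reaches_core = nx.ancestors(g, seed) | {seed}     # nodes with a path into the core
    reached_from = nx.descendants(g, seed) | {seed}   # nodes reachable from the core
    in_land = reaches_core - reached_from
    out_land = reached_from - reaches_core
    rest = set(g) - core - in_land - out_land         # tendrils and isolated islands
    return core, in_land, out_land, rest

# Hypothetical web: pages 0 and 1 link to each other (the core), 2 points into
# the core, 3 is only pointed to, and 4-5-6 form an island crawlers never reach.
g = nx.DiGraph([(0, 1), (1, 0), (2, 0), (1, 3), (4, 5), (5, 6)])
for name, part in zip(("core", "IN", "OUT", "tendrils/islands"), bow_tie(g)):
    print(name, sorted(part))
```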
Recently Gary Flake, Steve Lawrence, and Lee Giles, from NEC, suggested that documents belong to the same community if they have more links to each other than to documents outside of the community. This definition is precise enough to develop algorithms to identify different groupings given the topology of the World Wide Web. (Location 2694)
Like architects’ buildings, the Web’s architecture is the product of two equally important layers: code and collective human actions taking advantage of the code. The first can be regulated by courts, government, and companies alike. The second, however, cannot be shaped by any single user or institution, because the Web has no central design—it is self-organized. It evolves from the individual actions of millions of users. As a result, its architecture is much richer than the sum of its parts. Most of the Web’s truly important features and emerging properties derive from its large-scale self-organized topology. (Location 2741)
For most cells this map is almost as elusive now as it was fifteen years ago at the beginning of the Human Genome Project. The absence of a cellular search engine is only part of the problem. The biggest difficulty is that within each cell there are many layers of organization that can each be viewed as a complex network. To understand the web of life, we need to acquaint ourselves with some of these. (Location 2858)
Think of the cellular metabolism as the engine in your car. Having an engine in and of itself will not get you very far. You need wheels, suspension, brakes, lights, and many other components, each ensuring that the car will run safely on the road. In a similar vein, the cell has an intricate regulatory network that controls everything from metabolism to cell death. The nodes of this network are the genes and the proteins encoded by the gigantic DNA molecule. The (Location 2866)
sequence. We now have the complete sequence for several key organisms, ranging from Escherichia coli bacteria to humans. We are only at the beginning, however, of the second, equally revolutionary scientific endeavor: uncovering the gene’s functional role. To achieve this we need a second genome project, this time mapping the web within the cell. We have the “book of life.” Now we need the map of life. (Location 2883)
The robustness of the results was shocking. No matter which organism we examined, a clear scale-free topology greeted us. Each cell looked like a tiny web, extremely uneven, with a few molecules involved in the majority of reactions—the hubs of the metabolism—while most molecules participated in only one or two. (Location 2914)
most pairs of molecules can be linked by a path of three reactions. Perturbations, therefore, are never localized: Any change in the concentration of a molecule will shortly reach most other molecules. This (Location 2925)
Web’s diameter increases with the number of documents. Surprisingly, the measurements indicated that whether we are navigating the tiny network of a small parasite bacterium or the highly developed highway system of a multicellular organism, such as a flower, the separation is the same. (Location 2930)
dynamically relevant networks, all cells feel like a small town. Digging deeper, we learned that most cells share the same hubs as well. That is, for the vast majority of organisms the ten most-connected molecules are the same. Adenosine triphosphate (ATP) is almost always the biggest hub, followed closely by adenosine diphosphate (ADP) and water. (Location 2933)
Some of these molecules are believed to be the remnants of the so-called RNA world, the evolutionary step before the emergence of DNA, while others are known to be the components of the most ancient metabolic pathways. Therefore, the first mover advantage seems to pervade the emergence of life as well. (Location 2942)
Comparing the metabolic network of all forty-three organisms, we found that only 4 percent of the molecules appear in all of them. Though the hubs are identical, when it comes to the less connected molecules, all organisms have their own distinct varieties. (Location 2946)
the protein interaction network has a scale-free topology. That is, most proteins in the cell play a very specific role, interacting with only one or two other proteins. A few proteins, however, are able to physically attach to a huge number of other proteins. These hubs are crucial for the cell’s proper functioning and survival. Indeed, we were able to show that removing a gene responsible for a hub protein kills the cell 60 to 70 percent of the time. Mutations affecting a weakly connected protein, in contrast, have a less than 20 percent likelihood of proving lethal. (Location 2971)
Taken together, the similar large-scale topology of the metabolic and the protein interaction networks indicate the existence of a high degree of harmony in the cell’s architecture: Whichever organizational level we examine, a scale-free topology greets us. These journeys within the cell indicate that Hollywood and the Web have only rediscovered the topology that life had already developed 3 billion years earlier. Cells are really small worlds that share the topology of many other nonbiological networks, as if the architect of life could design only these. (Location 2982)
Despite its important role in human cancer, fixing the p53 gene alone will not lead to a cure for this deadly disease. The reason was recently articulated by the very people responsible for placing p53 at the center of cancer research. Vogelstein, Lane, and Levine in November 2000 coauthored a Nature paper that made networks the crux of their argument. The reason why we do not fully understand cancer, the three suggested, is that the cell is like the Internet. (Location 3027)
Regardless of industry and scope, the network behind all twentieth century corporations has the same structure: It is a tree, where the CEO occupies the root and the bifurcating branches represent the increasingly specialized and nonoverlapping tasks of lower-level managers and workers. (Location 3144)
Despite its pervasiveness, there are many problems with the corporate tree. First, information must be carefully filtered as it rises in the hierarchy. If filtering is less than ideal, the overload at the top level, where all branches meet, could be huge. As a company expands and the tree grows, information at the top level inevitably explodes. Second, integration leads to unexpected organizational rigidity. (Location 3147)
The tree model is best suited for mass production, which was the way of economic success until recently. These days, however, the value is in ideas and information. We have gotten to the point that we can produce anything that we can dream of. The expensive question now is, what should that be? As companies face an information explosion (Location 3154)
The most visible element of this remaking is a shift from a tree to a web or a network organization, flat and with lots of cross-links between the nodes. As valuable resources shift from physical assets to bits and information, operations move from vertical to virtual integration, the reach of businesses increasingly expands from domestic to global, the lifetime of inventories decreases from months to hours, business strategy changes from top-down to bottom-up, and workers transform from employees into free agents. (Location 3159)
Therefore, companies aiming to compete in a fast-moving marketplace are shifting from a static and optimized tree into a dynamic and evolving web, offering a more malleable, flexible command structure. Those that resist this change could easily be forced to the periphery. (Location 3166)
in the economy, economic theory pays surprisingly little attention to networks. Until recently economists viewed the economy as a set of autonomous and anonymous individuals interacting through the price system only, a model often called the standard formal model of economics. The individual actions of companies and consumers were assumed to have little consequence on the state of the market. Instead, the state of the economy was best captured by such aggregate quantities as employment, output, or inflation, ignoring the interrelated microbehavior responsible for these aggregate measures. Companies and corporations were seen as interacting not with each other but rather with “the market,” a mythical entity that mediates all economic interactions. In reality, the market is nothing but a directed network. Companies, firms, corporations, financial institutions, governments, and all potential economic players are the nodes. Links quantify various interactions between these institutions, involving purchases and sales, joint research and marketing projects, and so forth. The weight of the links captures the value of the transaction, and the direction points from the provider to the receiver. The structure and evolution of this weighted and directed network determine the outcome of all macroeconomic processes. (Location 3266)
Hierarchy: Network Forms of Organization, “in markets the standard strategy is to drive the hardest possible bargain on the immediate exchange. In networks, the preferred option is often creating indebtedness and reliance over the long haul.” Therefore, in a network economy, buyers and suppliers are not competitors but partners. The relationship between them is often very long lasting and stable. The (Location 3276)
property development company shake the world’s largest stock market and keep the president of the “world’s strongest nation” explaining even two years after? If we view the economy as a highly interconnected network of companies and financial institutions, we can begin to make sense of these events. In such networks the failure of a node has little effect on the system’s integrity. Occasionally, however, the breakdown of some well-selected nodes sets off a cascade of failures that can shake the whole system. The Asian crisis was a large-scale example of a cascading financial failure similar to those we discussed in Chapter 9, a natural consequence of connectedness and interdependency. It was not the first, however: South America and Mexico had experienced similar cascading failures two years earlier. It is surely not the last either, despite all the measures banks and governments seem to have taken to avoid it. (Location 3307)
In the absence of a spider, there is no meticulous design behind these networks either. Real networks are self-organized. They offer a vivid example of how the independent actions of millions of nodes and links lead to spectacular emergent behavior. Their spiderless scale-free topology is an unavoidable consequence of their evolution. Each time nature is ready to spin a new web, unable to escape its own laws, it creates a network whose fundamental structural features are those of dozens of other webs spun before. The robustness of the laws governing the emergence of complex networks is the explanation for the ubiquity of the scale-free topology, describing such diverse systems as the network behind language, the links between the proteins in the cell, sexual relationships between people, the wiring diagram of a computer chip, the metabolism of the cell, the Internet, Hollywood, the World Wide Web, the web of scientists linked by coauthorships, and the intricate collaborative web behind the economy, to name only a few. (Location 3442)
Networks are by their very nature the fabric of most complex systems, and nodes and links deeply infuse all strategies aimed at approaching our interlocked universe. (Location 3455)
Today the world’s most dangerous aggressors, ranging from al Qaeda to the Colombian drug cartels, are not military organizations with divisions but self-organized networks of terror. In the absence of familiar signs of organization and order, we often call them “irregular armies.” Yet by doing so we again equate complexity with randomness. In reality, terrorist networks obey rigid laws that determine their topology, structure, and therefore their ability to function. They exploit all the natural advantages of self-organized networks, including flexibility and tolerance to internal failures. Unfamiliarity with this new order and a lack of language for formalizing our experience are perhaps our most deadly enemies. (Location 3475)
look at the networks behind such complex systems as the cell or the society, we concealed all the details. By seeing only nodes and links, we were privileged to observe the architecture of complexity. By distancing ourselves from the particulars, we glimpsed the universal organizing principles behind these complex systems. Concealment revealed the fundamental laws that govern the evolution of the weblike world around us and helped us understand how this tangled architecture affects everything from democracy to curing cancer. (Location 3503)