Feature Paper: Allesina & Pascual (2009) Googling food webs: Can an eigenvector measure species' importance for coextinctions? PLoS Computational Biology, 5(9):e1000494 (6pp). doi:10.1371/journal.pcbi.1000494
Author Abstract: A major challenge in ecology is forecasting the effects of species’ extinctions, a pressing problem given current human impacts on the planet. Consequences of species losses such as secondary extinctions are difficult to forecast because species are not isolated, but interact instead in a complex network of ecological relationships. Because of their mutual dependence, the loss of a single species can cascade in multiple coextinctions. Here we show that an algorithm adapted from the one Google uses to rank web-pages can order species according to their importance for coextinctions, providing the sequence of losses that results in the fastest collapse of the network. Moreover, we use the algorithm to bridge the gap between qualitative (who eats whom) and quantitative (at what rate) descriptions of food webs. We show that our simple algorithm finds the best possible solution for the problem of assigning importance from the perspective of secondary extinctions in all analyzed networks. Our approach relies on network structure, but applies regardless of the specific dynamical model of species’ interactions, because it identifies the subset of coextinctions common to all possible models, those that will happen with certainty given the complete loss of prey of a given predator. Results show that previous measures of importance based on the concept of “hubs” or number of connections, as well as centrality measures, do not identify the most effective extinction sequence. The proposed algorithm provides a basis for further developments in the analysis of extinction risk in ecosystems.
Note to Readers: Follow links above for author email, full article text, or the publishing scientific journal. Author notes in my review are in quotes.
Review: What is particularly interesting about this paper is that it uses a technique from computer science (specifically, the algorithm Google uses to establish page ranks) to help predict species extinctions. Taking the place of a computer network, the authors substitute the network of a food web.
Crucial to the authors' treatment of species extinctions is the concept of a keystone species, one on which other species rely for their survival. When such a keystone species goes extinct, there is the potential for ecosystem collapse. The authors note that "a species is important if important species rely on it for their survival," with the most important keystone species receiving the highest "rank," much as Google ranks a web page highly when many other highly ranked pages link to it.
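To make that recursive definition concrete, here is a minimal sketch of how such a ranking could be computed by power iteration on a small, made-up food web. The toy web, the species names, and the damping factor are my own illustrative assumptions; the authors adapt Google's PageRank specifically for food webs, and this sketch does not reproduce their food-web-specific modifications.

# Minimal PageRank-style sketch on a toy food web (illustrative only;
# not the authors' exact algorithm).
import numpy as np

# Toy web: eats[predator] = list of prey (all names are hypothetical).
eats = {
    "fox":    ["rabbit", "mouse"],
    "owl":    ["mouse"],
    "rabbit": ["grass"],
    "mouse":  ["grass", "seeds"],
    "grass":  [],
    "seeds":  [],
}
species = list(eats)
n = len(species)
idx = {s: i for i, s in enumerate(species)}

# Column-stochastic matrix: each predator passes its "importance" evenly
# to the prey it depends on; basal species (no prey) pass it uniformly
# to everyone, the usual PageRank fix for dangling nodes.
M = np.zeros((n, n))
for pred, prey in eats.items():
    if prey:
        for p in prey:
            M[idx[p], idx[pred]] = 1.0 / len(prey)
    else:
        M[:, idx[pred]] = 1.0 / n

d = 0.85                      # damping factor, as in classic PageRank
rank = np.full(n, 1.0 / n)
for _ in range(100):          # power iteration toward the leading eigenvector
    rank = (1 - d) / n + d * M @ rank

for s, r in sorted(zip(species, rank), key=lambda x: -x[1]):
    print(f"{s:7s} {r:.3f}")

In this toy example the basal resources that support well-connected consumers come out on top, which conveys the flavor of importance flowing from predators to the prey they depend on.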
The authors use a food web approach because "species are not isolated, but connected to each other in tangled networks of relationships known as food webs." They turn to a Google-style approach because the more species an ecosystem contains, the harder it becomes to rank them. Therefore, taking a cue from computer science, the authors "reverse engineer" the problem, asking how to "make biodiversity collapse in the most efficient way in order to investigate which species cause the most damage if removed." The authors found that the Google PageRank algorithm (adapted for food webs) "always solves this seemingly intractable problem, finding the most efficient route to collapse. The algorithm works in this sense better than all the others previously proposed and lays the foundation for a complete analysis of extinction risk in ecosystems."
Using knowledge of food web connections, the authors removed one species at a time and recorded the number of secondary extinctions that followed. The more secondary extinctions a removal triggers, the more important that species is. In the network, each species is a node, and the nodes whose removal causes the most coextinctions correspond to keystone species.
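For readers curious what that bookkeeping looks like in practice, here is a minimal sketch under the simple topological rule described in the abstract: a consumer goes extinct once it has lost all of its prey, and the cascade is replayed until no further species disappear. The toy web and species names are hypothetical, and this is my own illustration rather than the authors' code.

# Sketch: count secondary extinctions after removing one species.
# Rule: a consumer with no surviving prey goes extinct; basal species
# (which eat nothing in this toy web) are assumed to persist.
def secondary_extinctions(eats, removed):
    alive = set(eats) - {removed}
    changed = True
    while changed:                       # repeat until the cascade stops
        changed = False
        for sp in list(alive):
            prey = eats[sp]
            if prey and not any(p in alive for p in prey):
                alive.discard(sp)        # all prey gone -> coextinction
                changed = True
    return len(eats) - 1 - len(alive)    # losses beyond the primary removal

eats = {
    "fox":    ["rabbit", "mouse"],
    "owl":    ["mouse"],
    "rabbit": ["grass"],
    "mouse":  ["grass", "seeds"],
    "grass":  [],
    "seeds":  [],
}
for sp in eats:
    print(sp, secondary_extinctions(eats, sp))

Repeating this removal-and-count procedure for every possible ordering is what makes ranking species by hand intractable in large webs, and it is the kind of comparison the authors use to show that their ranking outperforms earlier importance measures.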
By combining the topology (structure) of food web networks with an approach from computer science, the authors make real headway on a question that has been around in ecology for some fifty years: how can scientists predict extinctions? Besides helping us better understand and protect modern ecosystems, the results of this paper could also inform the extinction coefficients used to reconstruct mutation rates and speciation events in historical ecology.