The effects of influenza on a population are attributable to the clinical severity of illness and the number of persons infected, which can vary greatly between seasons or pandemics. To create a systematic framework for assessing the public health effects of an emerging pandemic, we reviewed data from past influenza seasons and pandemics to characterize severity and transmissibility (based on ranges of these measures in the United States) and outlined a formal assessment of the potential effects of a novel virus. The assessment was divided into 2 periods. Because measurement of severity and transmissibility is uncertain early in a pandemic, we used a broad dichotomous scale in the initial assessment to divide the range of historical values. In the refined assessment, as more data became available, we categorized those values more precisely. By organizing and prioritizing data collection, this approach may inform an evidence-based assessment of pandemic effects and guide decision making.
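The two-stage logic of the framework can be sketched in code. This is a hypothetical illustration only: the cut points, scale ranges, and category labels below are placeholders, not the framework's published thresholds.

```python
# Illustrative sketch of a two-stage pandemic assessment: a broad dichotomous
# call while data are sparse, then finer categories once estimates firm up.
# All cut points and scale ranges here are assumed placeholders.

def initial_assessment(transmissibility, severity):
    """Early in a pandemic: a broad dichotomous call on each axis."""
    t_label = "high" if transmissibility >= 3.0 else "low-moderate"
    s_label = "high" if severity >= 4.0 else "low-moderate"
    return t_label, s_label

def refined_assessment(transmissibility, severity):
    """Later, with more data: finer categories on each axis
    (here an assumed 1-5 scale for transmissibility, 1-7 for severity)."""
    t_cat = min(5, max(1, round(transmissibility)))
    s_cat = min(7, max(1, round(severity)))
    return t_cat, s_cat
```

The point of the dichotomy is robustness: early estimates are too noisy to place on a fine scale, but usually good enough to say which half of the historical range they fall in.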
In the new millennium, the centuries-old strategy of quarantine is becoming a powerful component of the public health response to emerging and reemerging infectious diseases. During the 2003 pandemic of severe acute respiratory syndrome, the use of quarantine, border controls, contact tracing, and surveillance proved effective in containing the global threat in just over 3 months. For centuries, these practices have been the cornerstone of organized responses to infectious disease outbreaks. However, the use of quarantine and other measures for controlling epidemic diseases has always been controversial because such strategies raise political, ethical, and socioeconomic issues and require a careful balance between public interest and individual rights. In a globalized world that is becoming ever more vulnerable to communicable diseases, a historical perspective can help clarify the use and implications of a still-valid public health strategy.
Glyphosate, hard water and nephrotoxic metals: are they the culprits behind the epidemic of chronic kidney disease of unknown etiology in Sri Lanka?
- International journal of environmental research and public health
The current chronic kidney disease epidemic, the major health issue in the rice-paddy farming areas of Sri Lanka, has been the subject of many scientific and political debates over the last decade. Although there is no agreement among scientists about the etiology of the disease, a majority have concluded that it is a toxic nephropathy. None of the hypotheses put forward so far coherently explains the totality of the clinical, biochemical, and histopathological findings, the unique geographical distribution of the disease, or its appearance in the mid-1990s. A strong association between the consumption of hard water and the occurrence of this special kidney disease has been observed, but the relationship has not been explained consistently. Here, we hypothesize a role for glyphosate, the most widely used herbicide in the disease-endemic area, through its unique metal-chelating properties. The possible role played by glyphosate-metal complexes in this epidemic has not been given serious consideration by investigators over the last two decades. Furthermore, this hypothesis may explain similar kidney disease epidemics observed in Andhra Pradesh (India) and Central America. Although glyphosate alone does not cause an epidemic of chronic kidney disease, it appears to acquire the ability to destroy the renal tissue of thousands of farmers when it forms complexes with a localized geoenvironmental factor (water hardness) and nephrotoxic metals.
We report the results of a study we conducted using a simple multiplayer online game that simulates the spread of an infectious disease through a population composed of the players. We use our virtual epidemics game to examine how people respond to epidemics. The analysis shows that people’s behavior is responsive to the cost of self-protection, the reported prevalence of disease, and their experiences earlier in the epidemic. Specifically, decreasing the cost of self-protection increases the rate of safe behavior. Higher reported prevalence also raises the likelihood that individuals engage in self-protection, with the magnitude of this effect depending on how much time has elapsed in the epidemic. Individuals’ experience of how often they acquired infection when not self-protecting also determines whether they invest in preventive measures later on: all else being equal, individuals who were infected at a higher rate are more likely to engage in self-protective behavior than those with a lower rate of infection. Lastly, holding everything else fixed, people’s willingness to engage in safe behavior waxes or wanes over time, depending on the severity of the epidemic: when prevalence is high, people are more likely to adopt self-protective measures as time goes by; when prevalence is low, a ‘self-protection fatigue’ effect sets in whereby individuals become less willing to engage in safe behavior over time.
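The reported behavioral effects can be summarized in a toy logistic model. The functional form and all coefficients below are illustrative assumptions, not estimates from the game data.

```python
# Toy logistic model of the behavioural effects reported above. All
# coefficients are assumed for illustration; only the signs reflect the
# abstract's findings.
import math

def p_self_protect(cost, prevalence, past_infection_rate, time_elapsed,
                   b0=-1.0, b_cost=-2.0, b_prev=3.0, b_exp=2.0, b_fatigue=-0.5):
    """Probability of adopting self-protection. The time term interacts with
    prevalence so behaviour waxes over time when prevalence is high and wanes
    ('self-protection fatigue') when prevalence is low."""
    x = (b0
         + b_cost * cost                      # cheaper protection -> more uptake
         + b_prev * prevalence                # higher reported prevalence -> more uptake
         + b_exp * past_infection_rate        # worse past experience -> more uptake
         + b_fatigue * time_elapsed * (1.0 - 2.0 * prevalence))
    return 1.0 / (1.0 + math.exp(-x))
```

With these signs, the interaction term flips direction at a prevalence of 0.5: below it, willingness decays over time; above it, willingness grows.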
Mark Siedner and colleagues reflect on the early response to the Ebola epidemic and lessons that can be learned for future epidemics.
Increasing the durability of crop resistance to plant pathogens is one of the key goals of virulence management. Despite recognition of the importance of demographic and environmental stochasticity in the dynamics of an epidemic, their effects on the evolution of the pathogen and the durability of resistance have not received attention. We formulated a stochastic epidemiological model, based on the Kramer-Moyal expansion of the master equation, to investigate how random fluctuations affect the dynamics of an epidemic and how these effects feed through to the evolution of the pathogen and the durability of resistance. We focused on two hypotheses. First, a previous deterministic model suggested that the effect of the cropping ratio (the proportion of land area occupied by the resistant crop) on the durability of crop resistance is negligible: increasing the cropping ratio increases the area of uninfected host, but the resistance is more rapidly broken, and these two effects counteract each other. We tested the hypothesis that similar counteracting effects would occur when demographic stochasticity is taken into account, but found that durability does depend on the cropping ratio. Second, we tested whether a superimposed external source of stochasticity (for example, due to environmental variation or intermittent fungicide application) interacts with the intrinsic demographic fluctuations, and how such interaction affects the durability of resistance. We show that in the pathosystem considered here, large stochastic fluctuations in epidemics generally enhance extinction of the pathogen. This is more likely to occur at large cropping ratios and at particular frequencies of the periodic external perturbation (stochastic resonance). The results suggest possible disease control practices that exploit natural sources of stochasticity.
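A minimal sketch of the demographic stochasticity such a model captures, here simulated exactly with the Gillespie algorithm (the underlying jump process whose Kramer-Moyal expansion yields a diffusion approximation) rather than via that expansion itself. The SIR events and parameter values are illustrative, not the paper's pathosystem.

```python
# Demographic stochasticity in a simple SIR epidemic, simulated exactly with
# the Gillespie algorithm. Events: infection (rate beta*S*I/N) and recovery
# (rate gamma*I). Parameters are illustrative placeholders.
import random

def gillespie_sir(s0, i0, beta, gamma, seed=None, t_max=200.0):
    """One exact stochastic SIR realisation; returns final susceptibles
    and the time at which the pathogen went extinct (or t_max)."""
    rng = random.Random(seed)
    n = s0 + i0
    s, i, t = s0, i0, 0.0
    while i > 0 and t < t_max:
        rate_inf = beta * s * i / n
        rate_rec = gamma * i
        total = rate_inf + rate_rec
        t += rng.expovariate(total)          # waiting time to the next event
        if rng.random() < rate_inf / total:  # infection event
            s, i = s - 1, i + 1
        else:                                # recovery event
            i -= 1
    return s, t

def extinction_fraction(s0, i0, beta, gamma, runs=500, seed=0):
    """Fraction of runs in which the pathogen fades out before a major
    outbreak (here: before infecting half the initial susceptibles)."""
    out = 0
    for k in range(runs):
        s_final, _ = gillespie_sir(s0, i0, beta, gamma, seed=seed + k)
        if s_final > s0 // 2:
            out += 1
    return out / runs
```

Unlike the deterministic limit, a sizeable fraction of realisations started from a single infective die out by chance (roughly 1/R0 by branching-process theory), which is the fluctuation-driven pathogen extinction the abstract refers to.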
- Philosophical transactions of the Royal Society of London. Series B, Biological sciences
Ebola virus causes a severe haemorrhagic fever in humans with high case fatality and significant epidemic potential. The 2013-2016 outbreak in West Africa was unprecedented in scale, being larger than all previous outbreaks combined, with 28,646 reported cases and 11,323 reported deaths. It was also unique in its geographical distribution and multicountry spread. It is vital that the lessons learned from the world’s largest Ebola outbreak are not lost. This article aims to provide a detailed description of the evolution of the outbreak. We contextualize this outbreak in relation to previous Ebola outbreaks and outline the theories regarding its origins and emergence. The outbreak is described by country, in chronological order, including epidemiological parameters and the implementation of outbreak containment strategies. We then summarize the factors that led to rapid and extensive propagation, and highlight the key successes, failures, and lessons learned from this outbreak and the response. This article is part of the themed issue ‘The 2013-2016 West African Ebola epidemic: data, decision-making and disease control’.
How social structures, space, and behaviors shape the spread of infectious diseases using chikungunya as a case study
- Proceedings of the National Academy of Sciences of the United States of America
Whether an individual becomes infected in an infectious disease outbreak depends on many interconnected risk factors, which may relate to characteristics of the individual (e.g., age, sex), his or her close relatives (e.g., household members), or the wider community. Thanks to advances in statistical modeling, studies monitoring individuals in households or schools have helped elucidate the determinants of transmission in small social structures, but such approaches have so far largely failed to consider individuals in the wider context in which they live. Here, we used an outbreak of chikungunya in a rural community in Bangladesh as a case study to obtain a more comprehensive characterization of risk factors in disease spread. We developed Bayesian data augmentation approaches to account for uncertainty in the source of infection, recall uncertainty, and unobserved infection dates. We found that the probability of chikungunya transmission was 12% [95% credible interval (CI): 8-17%] between household members but dropped to 0.3% (95% CI: 0.2-0.5%) for those living 50 m away. Overall, the mean transmission distance was 95 m (95% CI: 77-113 m). Females were 1.5 times more likely to become infected than males (95% CI: 1.2-1.8), virtually identical to the relative risk of being at home estimated from an independent human movement study in the country. Reported daily use of antimosquito coils had no detectable impact on transmission. This study shows how the complex interplay between the characteristics of an individual and his or her close and wider environment shapes infectious disease epidemics.
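The drop from 12% within a household to 0.3% at 50 m implies a steep distance-decay kernel. As a sketch, assuming an exponential form (the paper's actual kernel may differ), the two reported point estimates pin down the decay length:

```python
# Illustrative exponential distance-decay transmission kernel, calibrated to
# the two point estimates quoted above. The exponential form is an assumption
# made for illustration.
import math

P_HOUSEHOLD = 0.12  # transmission probability between household members
P_50M = 0.003       # transmission probability at 50 m

# Solve p(d) = p0 * exp(-d / L) for the decay length L from the two anchors.
DECAY_LENGTH = 50.0 / math.log(P_HOUSEHOLD / P_50M)  # about 13.6 m

def transmission_probability(distance_m):
    """Per-pair transmission probability as a function of distance (metres)."""
    return P_HOUSEHOLD * math.exp(-distance_m / DECAY_LENGTH)
```

A decay length on the order of tens of metres is consistent with the reported mean transmission distance of 95 m being short relative to village scales, reflecting the limited flight range of the mosquito vector.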
Effective response to infectious disease epidemics requires focused control measures in areas predicted to be at high risk of new outbreaks. We aimed to test whether mobile operator data could predict the early spatial evolution of the 2010 Haiti cholera epidemic. Daily case data were analysed for 78 study areas from October 16 to December 16, 2010. Movements of 2.9 million anonymous mobile phone SIM cards were used to create a national mobility network. Two gravity models of population mobility were implemented for comparison; both were optimized on the complete retrospective epidemic data, available only after the end of the epidemic spread. The risk of an area experiencing an outbreak within seven days showed a strong dose-response relationship with the mobile phone-based infectious pressure estimates. The mobile phone-based model performed better (AUC 0.79) than the retrospectively optimized gravity models (AUC 0.66 and 0.74, respectively). Infectious pressure at outbreak onset was significantly correlated with reported cholera cases during the first ten days of the epidemic (p < 0.05). Mobile operator data are a highly promising data source for improving preparedness and response efforts during cholera outbreaks. These findings may be particularly important for containment of emerging infectious diseases, including high-mortality influenza strains.
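The gravity-model baseline used for comparison can be sketched as follows. The function names, exponent values, and the simple infectious-pressure aggregation are illustrative assumptions, not the study's fitted specification:

```python
# Sketch of a gravity model of population mobility and the derived
# "infectious pressure" on an area. Exponents (alpha, beta, gamma) would in
# practice be fitted to data; the values here are placeholders.

def gravity_flow(pop_a, pop_b, distance_km, alpha=1.0, beta=1.0, gamma=2.0):
    """Expected mobility flow between two areas under a simple gravity model:
    proportional to the product of populations over a power of distance."""
    return (pop_a ** alpha) * (pop_b ** beta) / (distance_km ** gamma)

def infectious_pressure(target, areas, cases):
    """Infectious pressure on `target`: case counts in all other areas,
    weighted by modelled mobility flow into the target.
    `areas` maps area id -> (population, distance to target in km)."""
    pop_t, _ = areas[target]
    pressure = 0.0
    for area, (pop, dist) in areas.items():
        if area == target or dist == 0:
            continue
        pressure += cases.get(area, 0) * gravity_flow(pop_t, pop, dist)
    return pressure
```

The mobile-phone approach replaces the modelled `gravity_flow` term with flows observed directly from SIM-card movements, which is why it needs no retrospective calibration.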
Sustained and coordinated vaccination efforts have brought polio eradication within reach. Anticipating the eradication of wild poliovirus (WPV) and the subsequent challenges in preventing its re-emergence, we look to the past to identify why polio rose to epidemic levels in the mid-20th century and how WPV persisted over large geographic scales. We analyzed an extensive epidemiological dataset, spanning the 1930s to the 1950s and spatially replicated across each state in the United States, to glean insight into the drivers of polio’s historical expansion and the ecological mode of its persistence prior to vaccine introduction. We document a latitudinal gradient in polio’s seasonality. Additionally, we fitted and validated mechanistic transmission models to data from each US state independently. The fitted models revealed that: (1) polio persistence was the product of a dynamic mosaic of source and sink populations; (2) geographic heterogeneity in seasonal transmission conditions accounts for the latitudinal structure of polio epidemics; (3) contrary to the prevailing “disease of development” hypothesis, polio’s historical expansion is straightforwardly explained by demographic trends rather than by improvements in sanitation and hygiene; and (4) the absence of clinical disease is not a reliable indicator of the absence of polio transmission, because widespread transmission likely continued through multiyear gaps in clinical disease. As the world edges closer to global polio eradication and continues the strategic withdrawal of the Oral Polio Vaccine (OPV), the regular identification of, and rapid response to, these silent chains of transmission is of the utmost importance.
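A seasonally forced transmission term of the kind such mechanistic models fit state by state can be sketched as follows. The baseline rate, forcing amplitude, peak day, and the toy daily-step SIR are illustrative assumptions; in the fitted models these quantities (notably the peak timing underlying the latitudinal gradient) vary by state.

```python
# Sketch of seasonally forced SIR transmission. All parameter values are
# illustrative placeholders, not fitted estimates.
import math

def seasonal_beta(day_of_year, beta0=0.5, amplitude=0.2, peak_day=210):
    """Sinusoidal seasonal forcing of the transmission rate; transmission
    peaks at `peak_day` (late July here, an assumed summer peak)."""
    phase = 2.0 * math.pi * (day_of_year - peak_day) / 365.0
    return beta0 * (1.0 + amplitude * math.cos(phase))

def simulate_sir(days, s0, i0, r0_pop, births_per_day=20.0, gamma=1.0 / 14.0):
    """Daily-step SIR with births, driven by the seasonal transmission rate.
    Returns the daily incidence series (new infections per day)."""
    s, i, r = float(s0), float(i0), float(r0_pop)
    incidence = []
    for day in range(days):
        n = s + i + r
        new_inf = seasonal_beta(day % 365) * s * i / n  # seasonal transmission
        new_rec = gamma * i                             # recovery
        s += births_per_day - new_inf                   # births replenish susceptibles
        i += new_inf - new_rec
        r += new_rec
        incidence.append(new_inf)
    return incidence
```

Replenishment of susceptibles by births is what lets such models sustain recurrent epidemics, and spatial variation in the seasonal term is how they reproduce the latitudinal structure documented in the data.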