The main finding of the study, however, was not that more people had allergies; that was already an accepted observation. It was who had them and who didn’t. The author, immunologist David Strachan, reported that people in their then 20s, who had taken part in a large and lengthy study of British children born in 1958, seemed less likely to have hay fever if they had grown up with older siblings. The implication was that older siblings – who presumably left home, went to school, and ran outside with friends while toddlers stayed home – exposed younger children to something they brought back with them. It was an exposure that would not be available to a firstborn or an only child – people who, in this original research, had higher rates of hay fever than younger siblings.
The possibility that early exposure to something prevented later problems was intuitively appealing, and it led to a cascade of research linking allergies, eczema, and asthma to modern, hygienic living. Many observational studies reported that allergies and asthma were less likely in people who had spent their infancy outside cities, who had been placed in nursery care as infants, or who had grown up with pets or on farms – leading, overall, to the conclusion that a dirty, messy premodern life is healthier for a growing child.
This led to a backlash – a feeling that parents desperate to avoid allergies were neglecting basic cleanliness – and to a rewriting of the hygiene hypothesis. Version 2.0, formulated by microbiologist Graham Rook in 2003, suggests that the source of allergies is not a lack of infections but rather deprivation of contact with environmental organisms that have been our evolutionary companions for millennia. Rook called this the “old friends” hypothesis, suggesting that exposure to these organisms allowed our immune systems to learn the difference between pathogens and harmless fellow travelers.
While this overhaul was happening, lab science was putting in place the tools to characterize the microbiome, the films of bacteria and fungi that occupy the outer and inner surfaces of everything in the world, including us. This helped to recast the exposures children received in these observational studies – to animals, other children, feces, dander, and dust – not as infectious threats, but as opportunities to stock their own microbiomes with a wide range of organisms.
And this recognition in turn led to version 3.0, the hygiene hypothesis as it currently exists. Renamed and reformulated about 10 years ago by microbiologist Stanley Falkow (who died in 2018) and physician-researcher Martin J. Blaser as the “endangered microbiota” hypothesis, this iteration proposes that our microbiomes shape our immune systems. It also warns that our microbial diversity is becoming depleted, and therefore less protective, under the impact of antibiotics, antiseptics, and poor nutrition, among other threats.
Here’s a quick summary of the claim: a lack of exposure – to childhood infections, environmental bacteria, and other opportunities to replenish microbial diversity – leaves immune systems out of balance with their environment. It’s an idea that’s widely accepted in pediatrics and immunology today, although surviving supporters of the various versions may disagree on the details. But what does this mean for our immune systems as we emerge from the fight against Covid-19? The hypothesis cannot say exactly what will happen, because so far researchers only have data on the prevalence of viral infections, not on other types of exposure. But those data are provocative.
In the southern hemisphere, where the flu season overlaps with the northern hemisphere’s summer, there was “virtually no flu circulation” in 2020, according to a CDC report in September. The agency has yet to release its final report on the US flu experience this winter, but the World Health Organization reported last month that flu activity remained “below baseline” across the northern hemisphere.