Food Allergies and Modern Life

Twenty years ago, I knew hardly anyone with a food allergy. Shellfish and strawberries were the only foods I’d ever heard of someone being allergic to. Then, suddenly, airlines were replacing peanuts with pretzels because of food allergies, and food started being labeled “Processed in a facility that also processes tree nuts.” A few years later, I met someone who was allergic to wheat. Pretty soon, it seemed like everyone I knew was allergic to something – gluten, lactose, chocolate, and a gazillion other things.

How can we explain this epidemic of food allergies? The radical shift from hunting and gathering finally catching up with us? Radical advances in medical technology that allow us to identify conditions that went unnoticed a generation ago? A build-up of environmental toxins in common foods? Interaction of foods with strange new food-like products like high fructose corn syrup and artificial flavors?

Or maybe we’re imagining the whole thing.

That’s the conclusion suggested by a recent study in the UK that found that only 2% of people who claimed to suffer from food allergies were actually allergic. The rest suffer from something else – namely, the belief that they suffer from food allergies.

Now, I don’t know much about medicine and physiology, but I do know a thing or two about belief, and when millions of people believe something that isn’t empirically verifiable (1 in 5 Britons, according to the article above), we’ve got some ‘splaining to do.

My first reaction is one I think many food allergy sufferers will share: that the study is flawed, not in its procedure, but in its very medical-ness. That is, there’s a strain of anti-modernism in the recent explosion of food allergy awareness that simply doesn’t trust the mainstream medical industry to recognize and treat food allergies. So when you get a bunch of mainstream medical researchers to study the issue, it’s no surprise that they don’t find anything.

I doubt that’s true, but here’s the thing: the belief that it’s true is part and parcel of the food allergy… can I call it a “movement”? In their rejection of modern medical knowledge and modern food processing technologies, as well as their yearning for a more “natural” diet and a greater connection to their bodily functions, food allergy advocates (if not food allergy sufferers) certainly have at least some of the hallmarks of a social movement. And they’ve certainly created social change, as well – modern supermarket shelves are packed with (ironically) high-tech allergen-free foods: gluten-free beer, bread made of spelt, soy milk and ice cream, and so on.

But leave aside the political aspects of today’s food allergies; what intrigues me is the almost religious asceticism that many of them impose. A vast number of foods contain wheat, for instance, so the wheat allergy sufferer is constrained to a diet that eliminates a great many common foods – much like a Jew during Passover, when most wheat-containing foods must be avoided as “leavened”.

The author of the Telegraph piece above notes the similarities between food allergies and food taboos, drawing on Mary Douglas’ understanding of the way boundaries create meaning and order:

[W]hat we eat not only defines us as people but also helps us to feel control and mastery over an otherwise chaotic and random world. She argued that by ordering foods into those we can consume and those that we can’t, we create meaning, and the boundaries provide order in our lives.

As a set of dietary restrictions, rather than a medical phenomenon, it seems reasonable to see food allergies – along with vegetarianism/veganism, the Slow Food movement, the “buy local” movement, and the $30 billion-plus diet market (in the US) – as an attempt to wrest back control over an aspect of our lives from which we are increasingly and maybe irretrievably disconnected. Few of us have any connection with the food cycle except as consumers at the end of a very long and complicated food production cycle. Food allergies allow us to assert control – on pain of death – over what we ingest, and demand an attentiveness – again, on pain of death – to what’s in the foods that we buy.

But this fussiness is part of a larger yearning for control, which is where the anti-modernism comes in. Food has long been not only a means of forging and asserting cultural identity but also a means of resisting the onslaught of a homogenizing, enervating modernity that threatens to dissolve not just cultural identities but individual identities. From the health spa/retreats of the Kellogg brothers and their peers (which gave us corn flakes and granola) to the popularity of Sweet-n-Low in the ‘50s and ‘60s to the communes of the hippie era to the herbal remedies of today, food has been seen as a way to “get back” to a more “natural” way of life – as opposed to the high-stress, low-community, detached and distracted way of life that is modernity.

None of this is to suggest that there are not very real food allergies – it’s hard to argue with anaphylactic shock. Nor, more importantly, is it to say that the 98% of food allergy sufferers in the study with no medically detectable food allergies do not, in a very real way, suffer. The bodily manifestations of the most obviously social disorders can still drastically limit a person’s quality of life.

What it does suggest is that treatment of food allergies needs to go much further than antihistamines and food avoidance to encompass the cultural and psychological. If control is a central issue – as it is already recognized to be in anorexia nervosa and other eating disorders, which strike bright, ambitious young women with overbearing parents hardest precisely because they are the least in control of their lives and the most aware of it – then a) developing non-food strategies for regaining control, and b) developing a realistic relationship with the demands and pressures of daily life are also important to individual adjustment.

On a social level, food allergies and other dietary restrictions join a range of other control-seeking phenomena – pop psychology, personal productivity, conspiracy theorizing, and religious fundamentalism – all of which attempt to throw a lasso around the neck of our stampeding lives. As a critique of modernity, there’s nothing original here; Georg Simmel’s The Metropolis and Mental Life addressed similar concerns about the loss of autonomy in 1903, and Emile Durkheim had done the same a decade earlier, noting the anomie inherent in industrial/commercial society in The Division of Labor in Society.

But over a century of social critique has done little to alleviate the real suffering of real people. The question is, do we have the resources and will to take on these challenges at a social level today? Or are food allergies, in fact, an adequate collective response to dehumanizing social conditions? Do food allergies, like, say, spirit possession on Chinese factory floors, provide the relief people need to cope with the impacts of modernity, even as they suffer?

19 thoughts on “Food Allergies and Modern Life”

  1. I think another element that’s being ignored is that the term “allergies” has become a shorthand for “intolerance,” which is not the same thing, medically speaking. I often explain my intolerance for bell peppers as an allergy, even though it’s not, because the fact that bell peppers trigger migraines for some unknown reason is much harder for people to grasp than a simple “allergy.”

  2. I’m not sure I trust the results of the study that you reference.

    I personally have allergies (yes, the actual definition of an allergy, not just a namby-pamby intolerance) to several foods / plants, insect stings, and chemical products, and throughout my life I’ve acquired more — getting a new, entirely unexpected allergy about every 5 years to something that had never bothered me before. Two of my allergies are life-threatening – with the smallest exposure I could go into anaphylactic shock. I don’t look for them, I don’t dwell on them, I don’t want them, and I don’t imagine that I have them so that I can fit in with a ‘cool crowd’. This year, it was garlic – sadly, because I love garlic. You can’t argue with a swollen mouth and suddenly-bleeding, crusting, peeling lips upon contact. I may have psychosomatic talents, but that isn’t one of them.

    Just yesterday I read a different UK research report that was described in the UK newspaper The Guardian (in their Sunday edition, which is called The Observer).

    This report is based on the UK’s National Health Service (NHS) actual experience with treating allergies, and it quotes several allergy experts. You might accuse them of trying to promote their professional subject area by promulgating the idea that allergies are suddenly becoming more common in adults, and that life-threatening allergic reactions are suddenly becoming more common in people of all ages, but I would tend to trust their figures and viewpoints.


    Sharp rise in number of people with fatal allergies; Worrying trend sees increasing numbers of older adults developing allergies for the first time

    Amelia Hill, social affairs correspondent
    The Observer, Sunday 7 February 2010

    The number of people at risk from severe and fatal allergic reactions has increased sharply every year for the past 15 years, according to new NHS figures. The number of adults developing potentially lethal new allergies for the first time has also accelerated dramatically.

    The figures reveal an unprecedented year-on-year increase in the number of prescriptions issued to those at risk of the most serious allergic reaction, known as anaphylactic shock. The most common triggers are allergies to eggs, nuts, fish, dairy products, fruit and vegetables, and latex. Potentially fatal reactions to insect stings are also increasingly common, as are dramatically adverse reactions to drugs and medication.

    New research obtained by the Observer from the NHS Information Centre reveals the number of emergency adrenaline injectors issued by doctors to combat severe allergies rose by 112% in 2008. The tables show that a record 211,040 injectors were issued, compared with 101,032 in 2003 and just 25,320 in 1995 – a rise of more than 700% in 13 years.

    But although the number of prescriptions has accelerated to a record high, there has also been an increase of more than a quarter in the number of emergency hospital admissions of people suffering anaphylactic shock.

    Experts say that a large proportion of these admissions involve “new onset” patients, who are experiencing a severe reaction to a food, medication or drug with which they have never previously had a problem, or with which they have never come into contact before.

    Pam Ewan, a consultant allergist at Addenbrooke’s hospital in Cambridge, and a member of the National Allergy Strategy Group, said: “The rise in numbers is to do with a raised general awareness of allergies, but we are, as a population, becoming more allergic overall.

    “What I have very certainly seen over the past three to five years is an increase in the number of older adults developing allergies for the first time,” she added. “Allergies usually start in childhood and young adulthood, so this is a very surprising new trend and very hard to explain. It is so new, however, and there are so few allergists in the UK, that we have not yet even started collecting data, much less analysing it.

    “It could be to do with changes in our environment, a change in allergen exposure, pollution or diet. The only thing we know is that it is clearly related to modern, western ways of living.”

    Ewan said adults who develop allergies for the first time are more likely to suffer extreme reactions if their sensitivity is to eggs, milk or nuts, particularly hazelnuts. Bee and wasp stings are also likely to catalyse a severe reaction. “Adults also seem more likely than children to develop allergies to fruit and vegetables,” she added.

    Tina Dixon, a consultant allergist at the Royal Liverpool and Broadgreen University hospital, said: “Older adults coming to my clinic suffering late sensitisation to fruit and vegetables were a rarity in the 1980s.”

    Dixon believes adults could develop severe allergies if they have unusual exposure to something. “The evidence suggests that if you absorb something in an unnatural way, you could develop an allergy after years of exposure,” she said. “So, for example, I treat a chef for late onset egg allergies because, I think, he has spent years absorbing egg through his skin and breathing it in through his nose, as he cooks with it.”

    In the past year alone, there were more than 30,000 admissions to hospital of those suffering anaphylaxis. Medications for allergies cost almost £1bn annually, 11% of the total NHS drug budget.

    In the past seven years, there has been a fourfold increase in all allergies, according to the British Society for Allergy and Clinical Immunology (BSACI), the national allergy body.

    Moira Austin, helpline manager at the Anaphylaxis Campaign, said she has noted an increase in the number of women seeking help for allergies. “It tends to be women who become allergic around the time of the menopause or after a stay in hospital. It comes on suddenly and involves foods they have eaten happily for their entire life,” she said.

  3. PRT: That’s true in a medical sense but irrelevant here. I don’t have access to the original study, but I’m sure the medical researchers didn’t conflate the two. In writing about it, though, it’s a lot easier to say “allergies” than “allergies and intolerances and other sorts of negative reactions.” What’s important is that millions of people have created dietary restrictions that are understood in medical terms despite being unsupported by medical science – and that allergies, intolerances, and what-have-you join a range of other scientistically framed dietary restrictions, all of which promise a corrective against the ills of modern living.

  4. Point of information – turns out this wasn’t actually original research, but a non-peer reviewed report commissioned by the Flour Advisory Bureau, which, obviously, has a vested interest in people not being allergic or, indeed, intolerant to wheat. So it’s not really very “medical” at all – if it were, I might be more inclined to take it seriously. You can read it for yourself here:,107159,en.html

    I do think you make interesting points about purity/asceticism and control in relation to food, though.

  6. “[W]hen millions of people believe something that isn’t empirically verifiable (1 in 5 Britons, according to the article above), we’ve got some ‘splaining to do.”

    At least part of the problem is that the vast majority of individuals do not understand what ‘failure to reject’ means. I mean, we live in a world in which parents take Jenny McCarthy seriously when she asserts that vaccines cause autism and that chelation therapy can cure it.

  7. Metal-eating arachnid: OK, that does put the results under some suspicion, as does the article C quotes (although I can’t help but notice the dig at “modern Western ways of life” there, too). I’m not sure that changes the main point, though, that we’ve latched onto food as a way of restoring order and control in our lives, and bodily illness as not so much a personal affliction but a social one.

    MTBradley: I’ll agree with that. While I do know some seriously allergic people who receive ongoing medical treatment for their allergies, I know far more allergic/intolerant/whatever people who are self-diagnosed, and I’d say that the constant stream of diet/nutrition literature and “experts” on afternoon talk shows or midnight infomercials plays a big role in how people decide they’re allergic. In one sense, we’re just that suggestible; anyone who’s taken Psych 101 knows how easy it is to realize that you suffer from paranoid schizophrenia, no, clinical depression, no wait, an oral fixation, no, actually, it’s stress, no… as you move from chapter to chapter. On the other hand, though, fears about food, vaccination, environmental stress, and whatever else still express an underlying anxiety about modernity whoever they’re expressed by, even Jenny McCarthy. The specifics propounded by one spokesperson or another are just a handle for people already vaguely uneasy about the world they live in.

  8. I think it’s funny because the study even admits that wheat intolerance is difficult to diagnose. Celiac disease is too. My cousin was negative for the blood test, but she continued having problems and finally was diagnosed with the biopsy. There are some intolerances that can only be diagnosed with difficult elimination diets that very few patients can even follow. Until testing is better, I don’t think we can say that most of these things are just made up. It’s a total blame the victim mentality, because not eating wheat sucks and it’s worse if people just think you are faking it.

  9. Apart from cultivating a sense of control, it occurred to me that there’s another possible reason why a self-diagnosed wheat allergy would improve a person’s sense of well-being. Cutting white flour out of one’s diet would cut a lot of not-so-good things out of one’s overall diet. It’s something a lot of readers of this blog might consider self-evident, but in my experience it is difficult to underestimate the man in the street’s knowledge of basic human health.

  10. et: Nothing in this post or the article it draws from suggests that people are “faking” food allergies. There’s a huge difference between perceiving one’s diet as a source of ill-health and suffering and intentionally deceiving others about your health. I have no doubt that the 98% of people the study asserts have no medical allergies really feel lousy, nor that said lousiness is triggered by various kinds of food consumption. The question here isn’t “what kind of scam are these people pulling?” but “what kind of social pressures are these people responding to?”

  11. This seems to be a near-perfect example of a quite different kind of Mary Douglas’s ‘boundary maintenance’ – that between expert and non-expert. The claim that there are too many self-diagnosed food intolerances derives credibility from coming from university-based academics, but is then questioned because of connections with the flour industry. Are these experts or not? Is this science or not? Mary Douglas took Durkheim’s ideas of sacred and profane and re-interpreted them in terms of purity and pollution. Is the science here flowing from the ‘pure’ source of academia, or is it ‘polluted’ by association with the Flour Advisory Bureau? Note that the Telegraph reported this as coming from Portsmouth University only, whereas the Portsmouth University media release noted who actually commissioned the research.
    The report itself, not peer-reviewed (therefore not scientific???), is clearly labeled as a Flour Advisory Bureau report. It mixes medical information from the Lancet journal (pure?) with consumer survey data (polluted?).
    Is it scientific enough? Not for publication in a science journal, perhaps. But certainly scientific enough to be ‘distributed to health professionals’ and reported in a leading English newspaper just in time for national allergy week. Mission accomplished?

    For more on boundary work among scientists see Brendan Swedlow 2007.

  12. Fourcultures offers a valuable addition to the (very interesting to me) discussion by highlighting the question of *lay and expert knowledge in a complex society.* This last phrase is the focus for a working group of folklorists that I am collaborating with to develop innovations in undergraduate teaching in folklore studies. The issues raised by fourcultures ramify in all directions and provide a significant link between old and new work, old and new problems. A neighboring contemporary case to Dustin’s (one of many) would be differential understandings of vaccine risk. (Chris’ work on nanotechnology is another.) The Pro-Am dynamic joins the know-your-source media/scientific literacy one that fourcultures mentions. This post is good to think with. Thanks!

  13. Jason: I think there’s another interesting opposition to look at here between scientific thinking and scientistic thinking. These concerns over foods are often framed in scientific language, even as they express an underlying distrust of scientific knowledge. My mother, for instance, won’t eat canola oil, artificial sweeteners, or high-fructose corn syrup but will eat colloidal silver (it’s good enough for the Na’vi!). Not all those are bad choices, but what interests me is that there’s a huge scientific-sounding literature about these things that routinely demonizes scientists for a) being stooges of the food chemistry industry (the concerns about the flour board-sponsored research above being a “Lite” form of this) and b) advancing a worldview of disconnection from nature and our bodies. I guess we could say it’s laypersons borrowing the language and discursive strategies of the expert, while profoundly distrusting them.

    I don’t know where to go with that, but it seems intriguing to think about.

  14. Thanks very much for the article, Dustin.

    I think it would be interesting to see some data on the class background of people self-declaring (I want to stress that, I’m not bagging on medical diagnoses) food allergies. While you mentioned supermarket sections, I’ve noticed in my area that gluten-free labels are increasingly emerging in café culture, I’ve seen a gluten-free artisan bakery open up, and at the farmer’s market gluten-free cookie sellers stand shoulder to shoulder with the organic growers and the indie cupcakers. Wandering into complete speculation, I’ve been wondering if gluten-free has now got a foothold in the premium food market that organic used to largely monopolise. The Telegraph article suggests a certain amount of attention-seeking: I’m tempted to suggest that we’re rather seeing a form of conspicuous consumption.

  15. bq. I guess we could say it’s laypersons borrowing the language and discursive strategies of the expert, while profoundly distrusting them.

    David Schneider claimed that a group’s notion of kinship was always related to their folk model of biology. My impressionistic take is that the American folk model of biology has a relation to scientific biology and that embedded within the folk model is the notion that it is not a folk model at all but in fact a scientific model. I think that gives American kinship a special quirk. If that makes any sense to you at all maybe it has some relation to what you are getting at.

  16. Great article and a great beginning for a complex topic/discussion. Having had an eating disorder earlier in my life, I have developed a keen interest in the psychological and cultural role of food and eating. Another angle on food allergies and intolerance is the “hygiene hypothesis”: the theory that the human immune system requires “tuning” to its environment through exposure to bacteria and parasites early in life, an allergy being the result of one’s immune system going into overdrive. Our super-hygienic lifestyle, especially the high level of vigilance around babies and young children, has limited this exposure and spawned an epidemic of malfunctioning immune systems. I first heard of this in a NY Times article last year:

    There are lots of links online, but here’s a good overview:

    Thanks and will look forward to the ongoing discussion, Julia

  17. I develop mild hives from McDonald’s and salad bars. Big deal? Nah.

    But the “food hysteria” is a fad. In the past, the same sort of personality who nowadays follow the “green” and “natural” rules were the ones who followed religious rules to the max.

    Flannery O’Connor quipped that unlike the Fundamentalists, Catholics isolated their religious nuts in convents and monasteries so the rest of us could live in peace…alas, now the secular fundamentalists are driving us nuts with rules for this or that with food or saving the planet.

  18. This was a thought-provoking post, Dustin, thanks for the good read.

    My son has been diagnosed with food protein-induced enterocolitis. In short, it’s a sort of auto-immune response that his digestive system has to certain foods, in his case oats and barley. We were very lucky that we caught it quickly and found a great allergist who helped us uncover other allergens, and to recognize the symptoms so that other ‘contaminated’ or mislabelled foods could be avoided. We have a treatment plan (two years of avoidance, then gradual re-introduction). Happy to share more stories (and a link to our family blog with ALL the gory details) if you’re interested, just contact me via email. Given that he was young (less than six months) when we started seeing symptoms, I’m hard-pressed to accept some of the social hysteria theories around his allergies.

    However, that doesn’t mean that it’s not true in other cases. In my comments here I just wanted to note two things:

    (1) My son is two and a half now, but six months ago, two thirds of the students in his class had allergies. Two thirds. The cubbies at the daycare were all labelled with the allergens, it was incredible – everything from eggs and milk to blueberries to butternut squash. Really?! Many of the kids had epi-pens on hand for emergencies, and so on. It made me wonder what other factors are at work. In Austria and many other countries in Europe, bio-engineering of food is not permitted. You buy bread, and if you don’t eat it the next day, you need to go buy bread again. It doesn’t have a 2+ week shelf life; you just can’t buy products like that. And I would be willing to bet that they don’t have the instances of and the hysteria around food allergies that we have in the US. So it led me to wonder if there is some relationship between the allergies and the US propensity to ‘improve’ our food through bio-engineering. Coming back to celiac disease, for example, our wheat has been altered to increase the gluten content. So are we then surprised that more and more people ‘contract’ celiac disease? Interestingly enough, some of the grains that are completely safe for my son include quinoa, which is a grain that comes from Latin America and hasn’t been subject to modification up until now. And I wonder whether peanut allergies are somehow related to our obsession with using peanuts or peanut oil in everything. You know Nutella, that fabulous hazelnut spread from Europe? In the US it’s made largely with peanuts.

    (2) Have you read the book Connected? I wrote a brief overview of it on my blog here – My intention is to summarize a few of the most compelling case studies in an upcoming post. The authors do talk about the growing hysteria around allergies, and address them as a social phenomenon. I found the book very accessible, you might enjoy it, and it could certainly add a powerful quantitative element to your storyline.

