
Why didn't humans evolve to safely consume rotten food?


I'm not a biology person at all, so please forgive me if my question is silly. I was just wondering: why didn't evolution enable us to digest decomposed food without issues?


Food is considered "rotten" when it has a large population of microorganisms, such as bacteria and fungi, eating it. Some of these microorganisms have evolved to produce toxins that harm the animals that would otherwise eat the food. Animals could evolve to tolerate these toxins, and some have. However, bacteria have a large advantage over animals in an evolutionary arms race because of their short generation times and large population sizes. Competing with bacteria over carrion is therefore a specialized niche, and avoiding it has been selected for in most animal lineages.


Should Humans Eat Meat? [Excerpt]

There is no doubt that human evolution has been linked to meat in many fundamental ways. Our digestive tract is not that of obligate herbivores; our enzymes evolved to digest meat, whose consumption aided higher encephalization and better physical growth. Cooperative hunting promoted the development of language and socialization; the evolution of Old World societies was, to a significant extent, based on the domestication of animals; in traditional societies, meat eating, more than the consumption of any other category of foodstuffs, has led to fascinating preferences, bans and diverse foodways; and modern Western agricultures are obviously heavily meat-oriented. In nutritional terms, the links range from the satiety afforded by eating fatty megaherbivores to meat as a prestige food throughout the millennia of preindustrial history to the high-quality protein supplied by mass-scale production of red meat and poultry in affluent economies.

But is it possible to come up with a comprehensive appraisal that contrasts the positive effects of meat consumption with the negative consequences of meat production and to answer a simple question: are the benefits (health and otherwise) of eating meat greater than the undesirable costs, the multitude of environmental burdens in particular, of producing it?

Killing animals and eating meat have been significant components of human evolution that had a synergistic relationship with other key attributes that have made us human: larger brains, smaller guts, bipedalism and language. Larger brains benefited from the consumption of high-quality proteins in meat-containing diets, and, in turn, the hunting and killing of large animals, the butchering of carcasses and the sharing of meat have inevitably contributed to the evolution of human intelligence in general and to the development of language and of capacities for planning, cooperation and socializing in particular. Even if the trade-off between smaller guts and larger brains has not been as strong as is claimed by the expensive-tissue hypothesis, there is no doubt that the human digestive tract has clearly evolved for omnivory, not for purely plant-based diets. And the role of scavenging, and later hunting, in the evolution of bipedalism and the mastery of endurance running should not be underestimated, and neither should the impact of planned, coordinated hunting on non-verbal communication and the evolution of language.

Homo sapiens is thus a perfect example of an omnivorous species with a high degree of natural preference for meat consumption, and only later did environmental constraints (the need to support relatively high population densities by progressively more intensive versions of sedentary cropping), accompanied by cultural adaptations (meat-eating restrictions and taboos, usually embedded in religious commandments), turn meat into a relatively rare foodstuff for the majorities of populations (but not for their rulers) in traditional agricultural societies. A return to more frequent meat eating has been a key component of a worldwide dietary transition that began in Europe and North America with accelerating industrialization and urbanization during the latter half of the 19th century. In affluent economies, this transition was accomplished during the post-WW II decades, at a time when it began to unfold, often very rapidly, in the modernizing countries of Asia and Latin America.

As a result, global meat production rose from less than 50 Mt in 1950 to about 110 Mt in 1975; it doubled during the next 25 years, and by 2010 it was about 275 Mt, prorating to some 40 kg/capita, with the highest levels (in the US, Spain and Brazil) in excess of 100 kg/capita. This increased demand was met by a combination of expanded traditional meat production in mixed farming operations (above all in the EU and China), extensive conversion of tropical forests to new pastures (Brazil being the leader) and the rise of concentrated animal feeding facilities (for beef mostly in North America, for pork and chicken in all densely populated countries).

This, in turn, led to the rise of a modern mass-scale feed industry that relies primarily on grains (mainly corn) and legumes (with soybeans dominant, fed as a meal after expressing the edible oil) combined with tubers, food-processing residues and many additives to produce a variety of balanced feedstuffs containing optimal shares of carbohydrates, proteins, lipids and micronutrients (and added antibiotics). But it has also led to the widespread adoption of practices that create unnatural and stressful conditions for animals and that have greatly impaired their welfare even as they raised their productivity to unprecedented levels (with broilers ready for slaughter in just six to seven weeks and pigs killed less than six months after weaning).

Meat is undoubtedly an environmentally expensive food. Large animals have an inherently low efficiency of converting feed to muscle, and only modern broilers can be produced with less than two units of feed per unit of meat. This translates into relatively large demands for cropland (to grow concentrates and forages), water, fertilizers and other agrochemicals, and other major environmental impacts are created by gaseous emissions from livestock and its wastes; water pollution (above all nitrates) from fertilizers and manure is also a major factor in the intensifying human interference in the global nitrogen cycle.

Opportunities for higher efficiency can be found all along the meat production–consumption chain. Agronomic improvements – above all reduced tillage and varieties of precision cropping (including optimized irrigation) – can reduce both the overall demand for natural resources and the energy inputs required for feed production while, at the same time, improving yields, reducing soil erosion, increasing biodiversity and minimizing nitrogen leakage (Merrington et al. 2002). Many improvements can lower the energy used in livestock operations (Nguyen et al. 2010), reduce the specific consumption of feed (Reynolds et al. 2011) and minimize the environmental impacts of large landless livestock facilities (IST 2002). Considerable energy savings can also be realized by using better slaughter and meat processing methods (Fritzson and Berntsson 2006).

Rational meat eating is definitely a viable option.

Toward Rational Meat Eating
We could produce globally several hundred million tons of meat without ever-larger confined animal feeding operations (CAFOs), without turning any herbivores into cannibalistic carnivores, without devoting large shares of arable land to monocropping that produces animal feed and without subjecting many grasslands to damaging overgrazing – and a single hamburger patty does not have to contain meat from several countries, not just from several cows. And there is definitely nothing desirable in aiming for ever higher meat intakes: we could secure an adequate meat supply for all of today's humanity with production methods whose energy and feed costs and whose environmental impacts would be only a fraction of today's consequences.

Meat consumption is a part of our evolutionary heritage; meat production has been a major component of modern food systems; and carnivory should remain, within limits, an important component of a civilization that finally must learn how to maintain the integrity of its only biosphere.

The most obvious path toward more rational meat production is to improve the efficiencies of many of its constituent processes and hence reduce waste and minimize many undesirable environmental impacts. As with any large-scale human endeavor, meat production is accompanied by a great deal of waste and inefficiency, and while we have come close to optimizing some aspects of the modern meat industry, we have a long way to go before making the entire enterprise more acceptable. And, unlike in other forms of food production, there is an added imperative: because meat production involves the breeding, confinement, feeding, transportation and killing of highly evolved living organisms able to experience pain and fear, it is also accompanied by a great deal of unnecessary suffering that should be eliminated as much as possible.

Opportunities to do better on all of these counts abound, and some are neither costly nor complicated: excellent examples range from preventing the stocking densities of pastured animals from surpassing a grassland's long-term carrying capacity to better designs for moving cattle around slaughterhouses without fear and panic. There is no shortage of prescriptions for increasing global agricultural production while maintaining a well-functioning biosphere or, as many of my colleagues would say, for developing sustainable food production while freezing agriculture's environmental footprint (Clay 2011) – or even shrinking it dramatically (Foley et al. 2011).

The two key components in the category of improvements are the effort to close yield gaps that are due to poor management rather than to environmental limitations and to maximize the efficiency with which key resources are used in agricultural production. Claims regarding the closing of yield gaps must be handled very carefully, as there are simply too many technical, managerial, social and political obstacles in the way of replicating Iowa corn yields throughout Asia, to say nothing of most of sub-Saharan Africa, during the coming generations. Africa's average corn yield rose by 40% between 1985 and 2010 to 2.1 t/ha, far behind the European mean of 6.1 t/ha and the US average of 9.6 t/ha, but even if it were to double during the next 25 years to 4.2 t/ha, the continent's continuing rapid population growth would reduce it to no more than about a 35% gain in per capita terms. Asian prospects for boosting yields are better, but in many densely populated parts of that continent, such gains might be greatly reduced, even negated, by the loss of arable land to continuing rapid urbanization and industrialization.
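
The per capita arithmetic behind that African example can be made explicit. Here is a minimal sketch in Python, assuming for illustration (the population figure is mine, not the text's) that the continent's population grows by roughly half over those 25 years:

```python
# Doubling Africa's corn yield vs. population growth: the per capita effect.
# Assumption (illustrative, not from the text): population grows ~1.5x
# over the same 25 years.

yield_2010 = 2.1   # t/ha, African average corn yield (from the text)
yield_2035 = 4.2   # t/ha, the hypothetical doubling
pop_growth = 1.5   # assumed relative population growth over 25 years

per_capita_gain = (yield_2035 / yield_2010) / pop_growth - 1
print(f"per capita gain: {per_capita_gain:.0%}")  # ~33%, the text's "about 35%"
```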

At the same time, there does not appear to be anything in the foreseeable future that could fundamentally change today's practices of raising livestock for meat. Indeed, many arguments can be made that after half a century of focused breeding, accelerated maturation of animals and improvements in feed conversion, these advances have gone too far and are now detrimental to the well-being of animals and to the quality of the food chain, and that they have raised the environmental burdens of meat production to an unprecedented level that should not be tolerated in the future. And neither expanded aquaculture nor plant-based meat imitations will claim large shares of the global market anytime soon, and cultured meat will remain (for a variety of reasons) an oddity for a long time to come.

Consequently, it is very unlikely that the undoubted, continuing (and possibly even slightly accelerating) positive impact of the combination of higher productivities, reduced waste, better management and alternative protein supplies would make up for the additional negative impacts engendered by rising meat production and that there would be a discernible net worldwide improvement: the circle of reduced environmental impacts cannot be squared solely by more efficient production. At the same time, the notion that an ideal form of food production operating with a minimal environmental impact should exclude meat – nothing less than enacting a "vegetarian imperative" (Saxena 2011) on a global scale – does not make sense.

This is because both grasslands and croplands produce plenty of phytomass that is not digestible by humans and that would be, if not regularly harvested, simply wasted and left to decay. In addition, the processing of crops to produce milled grains, plant oils and other widely consumed foodstuffs generates a large volume of by-products that make (as described in Chapter 4) perfect animal feeds. Rice milling typically strips 30% of the grain's outermost layers; wheat milling takes away about 15%. What would we do with about 300 Mt of these grain-milling residues, with roughly the same mass of protein-rich oil cakes left after the extraction of oil (which in most species accounts for only 20–25% of oilseed phytomass), and also with the by-products of the ethanol (distillers grain) and dairy (whey) industries, waste from fruit and vegetable canning (leaves, peels), and citrus rinds and pulp?

They would have to be incinerated, composted or simply left to rot if they were not converted to meat (or milk, eggs and aquacultured seafood). Not tapping these resources is also costly, particularly in the case of porcine omnivory, which has been used for millennia as an efficient and rewarding way of disposing of organic garbage. Unfortunately, in 2001 EU regulations banned the use of swill for feeding pigs, and Stuart (2009) estimated that this resulted in an economic loss of €15 billion a year, even without counting the costs of alternative disposal of food waste from processors, restaurants and institutions. Moreover, the ban has increased CO2 emissions, as the swill must be replaced by cultivated feed.

At the same time, given the widespread environmental degradation caused by overgrazing, pasture-based production should be curtailed in order to avoid further degradation of soils and plant cover. Similarly, not all crop residues that could be digested by animals can be removed from fields, some of those that can be removed have other competing uses or are not excellent feed choices, and not all food processing residues can be converted to meat. This means that a realistic quantification of the meat production potential based on phytomass that does not require any cultivation of feed crops on arable land cannot be done without assumptions regarding final uses, and it also requires choices of average feed conversion ratios. As a result, all such calculations can be only rough approximations of likely global totals, and all of my assumptions (clearly spelled out) err on the conservative side.

Because most of the world's grasslands are already degraded, I will assume that pasture-based meat production in low-income countries of Asia, Africa and Latin America should be reduced by as much as 25%, that there will be absolutely no further conversion of forests to grasslands throughout Latin America or in parts of Africa, and that (in order to minimize pasture degradation in arid regions and nitrogen losses from improved pastures in humid areas) grazing in affluent countries should be reduced by at least 10%. These measures would lower pasture-based global beef output to about 30 Mt/year and mutton and goat meat production to about 5 Mt.

Another way to calculate a minimum production derived from grasslands is to assume that as much as 25% of the total area (the most overgrazed pastures) should be taken out of production and that the remaining 2.5 Gha would support only an equivalent of about half a livestock unit (roughly 250 kg of cattle live weight) per hectare (for comparison, since 1998 the EU has limited grazing densities to 2 LU/ha, Brazil's grasslands typically support 1 LU/ha and 0.5 LU is common in sub-Saharan Africa). Assuming an average annual off-take rate of 10% and a 0.6 conversion rate from live to carcass weight, global meat production from grazing would be close to 40 Mt/year, an excellent confirmation of the previous total derived by different means.
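
The stated rates reproduce this total almost exactly. A short sketch, assuming the remaining pasture area is roughly 2.5 Gha, as the text implies:

```python
# Grazing-based meat output from the stated stocking and off-take rates.

pasture_ha = 2.5e9       # grassland remaining after retiring the worst 25%
live_wt_per_ha = 250.0   # kg live weight/ha (half a ~500 kg livestock unit)
offtake = 0.10           # share of standing live weight taken each year
carcass_ratio = 0.6      # carcass weight as a share of live weight

meat_mt = pasture_ha * live_wt_per_ha * offtake * carcass_ratio / 1e9
print(f"grazing-based output: {meat_mt:.1f} Mt/year")  # 37.5, "close to 40 Mt"
```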

At the same time, all efforts should be made to feed available crop residues to the greatest extent possible. Where yields are low and where the cultivated land is prone to erosion, crop residues should be recycled in order to limit soil losses, retain soil moisture and enrich soil organic matter. But even with the much reduced harvest ratios of modern cultivars (typically a unit of straw per unit of grain), high yields result in an annual production of 4–8 t of straw or corn stover per hectare, and a very large part of that phytomass could be safely removed from fields and used as ruminant feed. The annual production of crop residues (dominated by cereal straws) now amounts to roughly 3 Gt of dry phytomass.

Depending on crops, soils and climate, recycling should return 30–60% of all residues to soil, and not all of the remaining phytomass is available for feeding: crop residues are also used for animal bedding; for many poor rural families in low-income countries, they are the only inexpensive household fuel; and in many regions (in both rich and poor countries) farmers still prefer to burn cereal straw in the fields – this recycles mineral nutrients, but it also generates air pollution. Moreover, while oat and barley straws and the stalks and leaves of leguminous crops are fairly, or highly, palatable, ruminants should not be fed solely on wheat or rice straw; rice straw in particular is very high in silica (often in excess of 10%), and its overall mineral content may be as high as 17%, more than twice that of alfalfa. As a result, the best use of cereal straws in feeding is to replace a large share (30–60%) of high-quality forages.

These forages should be cultivated preferably as leguminous cover crops (alfalfa, clovers, vetch) in order to enhance the soil's reserves of organic matter and nitrogen. If only 10% of the world's arable land (or about 130 Mha) were planted annually with these forage crops (rotated with cereals and tubers), then even with a low yield of no more than 3 t/ha of dry phytomass, there would be some 420 Mt of phytomass available for feeding, either as fresh cuttings or as silage or hay. Matching this phytomass with crop residues would be quite realistic, as 420 Mt would be only about 15% of the global residual phytomass produced in 2010. Feeding 840 Mt of combined forage and residue phytomass would, even with a very conservative ratio of 20 kg of dry matter/kg of meat (carcass weight), produce at least 40 Mt of ruminant meat.
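
The chain from hectares to ruminant meat can be checked in a few lines. A sketch using the paragraph's own figures (note that 130 Mha at 3 t/ha gives about 390 Mt, so the text's 420 Mt appears to assume a slightly larger area or yield):

```python
# From forage area to ruminant meat via the stated conversion ratio.

forage_area_ha = 130e6   # ~10% of the world's arable land
forage_yield = 3.0       # t/ha of dry phytomass, a deliberately low yield

forage_mt = forage_area_ha * forage_yield / 1e6   # ~390 Mt ("some 420 Mt" in the text)
feed_mt = 2 * forage_mt                           # matched 1:1 with crop residues
meat_mt = feed_mt / 20.0                          # 20 kg of dry matter per kg of carcass
print(f"ruminant meat: {meat_mt:.0f} Mt/year")    # ~39-42 Mt, "at least 40 Mt"
```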

Unlike in the case of crop residues, most food processing residues are already used for feeding, and the following approximations quantify meat production based on their conversion. Grain milling residues (dominated by rice and wheat) added up to at least 270 Mt in 2010, and the extraction of oil yielded about 310 Mt of oil cakes. However, most of the latter total was soybean cake, whose output was so large because the crop is now grown in such quantity (about 260 Mt in 2010) primarily not to produce food (be it as whole grains, fermented products including soy sauce and bean curd, and cooking oil) but as a protein-rich feed.

When assuming that soybean output would match the production of the most popular oilseed grown for food (rapeseed, at about 60 Mt/year), the worldwide output of oil cakes would be about 160 Mt/year. After adding less important processing by-products (from the sugar and tuber industries and from vegetable and fruit canning and freezing), the total dry mass of highly nutritious residues would be about 450 Mt/year, of which some 400 Mt would be available as animal feed. When splitting this mass between broilers and pigs, and when assuming feed : live weight conversion ratios of, respectively, 2 : 1 and 3 : 1 and carcass weights of 70% and 60% of live weight, feeding all crop processing residues would yield about 70 Mt of chicken meat and 40 Mt of pork.
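
The final conversion is mechanical once the ratios are fixed. A sketch, assuming (the text does not specify the split) that the 400 Mt of feed is divided evenly between broilers and pigs:

```python
# Converting 400 Mt of crop processing residues to chicken and pork.
# Assumption (not stated in the text): a 50:50 split of the feed.

feed_mt = 400.0
chicken_feed = pig_feed = feed_mt / 2

chicken_mt = chicken_feed / 2.0 * 0.70   # 2:1 feed:live weight, 70% carcass yield
pork_mt = pig_feed / 3.0 * 0.60          # 3:1 feed:live weight, 60% carcass yield
print(f"chicken: {chicken_mt:.0f} Mt, pork: {pork_mt:.0f} Mt")  # ~70 Mt and ~40 Mt
```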

The grand total of meat production that would come from grazing practiced with greatly reduced pasture degradation (roughly 40 Mt of beef and small ruminant meat), from feeding forages and crop residues (40 Mt of ruminant meat) and from converting highly nutritious crop processing residues (70 Mt of chicken meat and 40 Mt of pork) would thus amount to about 190 Mt/year. This output would require no further conversions of forests to pastures, no arable land for growing feed crops and no additional applications of fertilizers and pesticides with all the ensuing environmental problems. And it would be equal to almost exactly two-thirds of the some 290 Mt of meat produced in 2010 – but that production causes extensive overgrazing and pasture degradation, and it requires the feeding of about 750 Mt of grain and almost 200 Mt of other feed crops cultivated on arable land predicated on large inputs of agrochemicals and energy.

And the gap between what I call rational production and the actual 2010 meat output could be narrowed. As I have used very conservative assumptions, every component of my broad estimate could be easily increased by 5% or even 10%. Specifically, this could be achieved by a combination of slightly higher plantings of leguminous forages rotated with cereals, by treating straws with ammonia to increase their nutritional value and palatability, by a slightly more efficient use of food processing by-products and also by the elimination of some of the existing post-production meat waste. Consequently, the total of 200 Mt/year can be taken as an unassailably realistic total of global meat output that could be achieved without any further conversion of natural ecosystems to grazing land, with conservative pasture management, and without any direct feeding of grains (corn, sorghum, barley), tubers or vegetables, that is, without any direct competition with food produced on arable land.

This amounts to almost 70% of the actual meat output of about 290 Mt in the year 2010: it would not be difficult to adjust the existing system in the described ways, eliminate all cultivation of feed crops on arable land (save for the beneficial rotation with leguminous forages) and still eat, on average, only a third less meat than we do today.

A key question to ask then is how the annual total of some 200 Mt of meat would compare with what I would term a rational consumption of meat, rather than with the existing level. Making assumptions about rational levels of average per capita meat consumption is best done by considering actual meat intakes and their consequences. A slight majority of people in France, the country considered to be a paragon of classic meat-based cuisine, now eat no more than about 16 kg of meat a year per capita, and the average in Japan, the nation with the longest life expectancy, is now about 28 kg of meat (both rates are for edible weight). Consequently, I will round these two rates and take per capita values of 15–30 kg/year as the range of rational meat consumption. For seven billion people in 2012, this would translate to between 105 and 210 Mt/year – or, assuming 20/30/50 beef/pork/chicken shares, between 140 and 280 Mt in carcass weight. The latter total is almost equal to the actual global meat output in 2010, with the obvious difference being that the consumption of today's output is very unevenly distributed.
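
Both the proration and the implied edible-to-carcass conversion can be reproduced directly. A short sketch (the ~75% edible share of carcass weight is inferred from the text's own totals, not stated in it):

```python
# From per capita edible-weight intakes to global carcass-weight totals.

population = 7e9
intakes = (15.0, 30.0)   # kg/year per capita, edible weight

edible_mt = [population * kg / 1e9 for kg in intakes]   # [105, 210] Mt/year
# The 140-280 Mt carcass-weight range implies edible weight is ~75% of
# carcass weight for the assumed 20/30/50 beef/pork/chicken mix.
carcass_mt = [e / 0.75 for e in edible_mt]              # [140, 280] Mt/year
print(edible_mt, carcass_mt)
```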

If we could produce 200 Mt/year without any competition with food crops, then the next step is to inquire how much concentrate feed we would need to grow if we were to equal the current output of roughly 300 Mt with the lowest possible environmental impact. Assuming that the additional 100 Mt of meat a year would come from a combination of 10 Mt of beef fed from expanded cultivation of leguminous forages, 10 Mt of herbivorous fish (conversion ratio 1 : 1) and 80 Mt of chicken meat (conversion ratio 2 : 1), its output would require about 170 Mt of concentrate feed, that is, less than a fifth of all feed now produced on arable land. Moreover, a significant share of this feed could come from extensive (low-yield and hence low-impact) cultivation of corn and soybeans on currently idle farmland.
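
The 170 Mt follows directly from the stated mix; the conversion ratios appear to be applied to carcass weight. A sketch:

```python
# Concentrate feed needed for the additional 100 Mt of meat per year.

fish_feed = 10.0 * 1.0      # 10 Mt of herbivorous fish at a 1:1 ratio
chicken_feed = 80.0 * 2.0   # 80 Mt of chicken at a 2:1 ratio
# The 10 Mt of beef comes from expanded leguminous forages, not concentrates.

print(f"concentrate feed: {fish_feed + chicken_feed:.0f} Mt/year")  # 170 Mt
```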

Roques et al. (2011) estimated that in 2007 there were 19–48 Mha of idle land (an equivalent of 1.3–3.3% of the world's arable area), that is, land cultivated previously that can be planted again, most of it in North America and Asia. Using 20 Mha of this land would produce at least an additional 60 Mt of feed. And when factoring in increasing crop yields, regular rotations with leguminous forages (producing excellent ruminant feed while reducing inputs of nitrogen fertilizers) and, eventually, slightly higher feed conversion efficiencies, it is realistic to expect that the share of the existing farmland used to grow feed crops could be reduced from the current level of about 33% to less than 10% of the total. Consequently, there is no doubt that we could match the recent global meat output of about 300 Mt a year without overgrazing, with realistically estimated feeding of residues and by-products, and with only a small claim on arable land, a combination that would greatly limit livestock's environmental impact.
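
The idle-land contribution implies a modest yield. A sketch (the 3 t/ha is my assumption, chosen to reproduce the stated total and consistent with the "extensive, low-yield" cultivation described):

```python
# Feed from recultivating a portion of the estimated idle farmland.

idle_ha = 20e6     # 20 Mha of the estimated 19-48 Mha of idle land
feed_yield = 3.0   # t/ha; an assumed low yield that reproduces the stated total

print(f"additional feed: {idle_ha * feed_yield / 1e6:.0f} Mt/year")  # 60 Mt
```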

Prospects for Change
Many years ago, I decided not to speculate about the course and intensity of any truly long-term developments: all that is needed to show the near-complete futility of such efforts is to look back and see to what extent any forecast made in 1985 would have captured the realities of 2010 – and that would be looking just a single generation ahead, while forecasts looking half a century into the future are now quite common. Forecasting the demand for meat – a commodity whose production depends on so many environmental, technical and economic variables and whose future level of consumption will be, as in the past, determined by a complex interaction of population and economic growth, disposable income, cultural preferences, social norms and health concerns – thus amounts to a guessing game with a fairly wide range of outcomes.

But FAO's latest long-range forecast gives single global values (accurate to 1 Mt) not just for 2030 (374 Mt) but also for 2050 (455 Mt) and 2080 (524 Mt). Compared to 2010, the demand in 2030 would be nearly 30% higher, and in 2050 about 55% higher. When subdivided between developing and developed countries, the forecast has the latter group producing in 2080 only a third as much as the former. These estimates imply a slow but continuing growth of average per capita meat consumption in affluent countries (more than 20% higher in 2080 than in 2007) and a 70% higher per capita meat supply in the rest of the world.
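
Those relative increases check out against the roughly 290 Mt produced in 2010:

```python
# FAO forecast totals relative to the ~290 Mt produced in 2010.

base = 290.0
for year, mt in [(2030, 374.0), (2050, 455.0), (2080, 524.0)]:
    print(year, f"{mt / base - 1:.0%} above 2010")  # ~29%, ~57%, ~81%
```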

Standard assumptions driving these kinds of forecasts are obvious: either slow growth or stagnation and decline of affluent populations accompanied by a slow increase of average incomes; continuing, albeit slowing, population growth in modernizing countries, where progressing urbanization will create not only many new large cities but also megacities (conurbations with more than 20 or 30 million people) and boost the average disposable incomes of billions of people; advancing technical improvements that will keep in check the relative cost of essential agricultural inputs (fertilizers, other agrochemicals, field machinery) and that will keep reducing environmental impacts; and all of this powered by a continuing supply of readily available fuels and electricity whose cost per unit of final demand will not depart dramatically from the long-term trend.

Standard assumptions also imply the continuation and intensification of existing practices, ranging from large-scale cultivation of feed crops on arable land (with all the associated environmental burdens) to the further worldwide diffusion of massive centralized animal feeding operations for pork and poultry. Undoubtedly, more measures will be taken to improve the lot of mammals and birds in CAFOs. Many of them will be given a bit more space, their feed will not contain some questionable ingredients, an increasing share of them will be dosed less with unnecessary antibiotics and their wastes will be better treated. Some of these changes will be driven by animal welfare considerations, others by public health concerns, new environmental regulations and basic economic realities; all of them will be incremental and uneven. And while they might be cumulatively important, it is unlikely that their aggregate positive impact will be greater than the additional negative impact created by substantial increases in the expected demand for meat: by 2030 or 2050, our carnivory could thus well exact an even higher environmental price than today.

I would strongly argue that there is absolutely no need for a higher meat supply in any affluent economy, and I do not think that improved nutrition, better health and increased longevity in the rest of the world are predicated on nearly doubling the meat supply in today's developing countries. A global output of as little as 140 Mt/year (carcass weight) would guarantee minimum intakes compatible with good health, and production on the order of 200 Mt of meat a year could be achieved without claiming any additional grazing or arable land and with water and nutrient inputs no higher than those currently used for growing food crops alone.

And it could also be done in a manner that would actually improve soil quality and diversify farming income. Moreover, an additional 100 Mt/year could be produced by using less than a fifth of the existing harvest of concentrate feeds, and it could come from less than a tenth of the farmland that is now under cultivation and that could be used to grow food crops. Even for a global population of eight billion, an output of 300 Mt/year would prorate to nearly 40 kg of meat a year per capita, or well above 50 kg a year for adults. This means that the average for the most frequent meat eaters, adolescent and adult men, could be 55 kg/year, and the mean for women, children and people over 60 would be between 25 and 30 kg/year, rates that are far above the minima needed for adequate nutrition and even above the optima correlated with desirable health indicators (low obesity rates, low CVD mortality) and with record nationwide longevities.
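
The proration for eight billion people is a one-line check:

```python
# Prorating 300 Mt/year of meat over a global population of eight billion.

meat_mt = 300.0
population = 8e9
print(f"{meat_mt * 1e9 / population:.1f} kg/capita/year")  # 37.5, "nearly 40 kg"
```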

Global inequalities of all kinds are not going to be eliminated in a generation or two, and hence a realistic goal is not a rapid convergence toward an egalitarian consumption mean: that mean would require significant consumption cuts in some of the richest countries (halving today's average per capita supply) and some substantial increases in the poorest ones (doubling today's per capita availability). What is desirable, and what should be pursued by all possible means, is a gradual convergence toward that egalitarian mean combined with continuing efficiency improvements and with the practical displacement of some meat consumption by environmentally less demanding animal foodstuffs.

Such a process would benefit everybody by improving the health and life expectancies of both affluent and low-income populations and by reducing the environmental burdens of meat production. Although the two opposite consumption trends of this great transition have been evident during the past generation, a much less uneven distribution of meat supply could come about only as a result of complex adjustments that will take decades to unfold. In the absence of dietary taboos, average meat intakes can rise quickly as disposable incomes go up; in contrast, food preferences are among the most inertial of all behavioral traits, and (except as a result of sudden economic hardship) consumption cuts of a similar rapidity are much less likely.

At the same time, the modern dietary transition has modified the eating habits of most of humanity in what have been, in historical terms, relatively short spans of time, in some cases as brief as a single generation. These dietary changes have been just a part of the general post-WW II shift toward greater affluence, and the two generations of these (only mildly interrupted) gains have created powerful expectations of further gains. That may not be the case during the coming two generations, because several concatenated trends are creating a world that will be appreciably different from the one whose apogee was reached during the last decade of the 20th century.

The aging of Western populations and, in many cases, their absolute decline appear to be irreversible processes: fertilities have fallen too far to recover above the replacement level, marriage rates are falling, and first births are being postponed while the cost of raising a family in modern cities has risen considerably. By 2050, roughly two out of five Japanese, Spaniards and Germans will be above 60 years of age; even in China, that share will be one-third (compared to just 12% in 2010!), and, together with many smaller countries, Germany, Japan and Russia will have millions (even tens of millions) fewer people than they have today.

We have yet to understand the complex impacts of these fundamental realities, but (judging by the German, Japanese and even Chinese experiences) a continuing rise in meat demand will not be one of them. And while the American population will continue to grow, the country's extraordinarily high rate of overweight and obesity, accompanied by a no less extraordinary waste of food, offers a perfect justification for greatly reduced meat consumption. Beef consumption is already in long-term decline, and the easiest way to achieve a gradual lowering of America's overall per capita meat intake would not be by appealing to environmental consciousness (or by pointing out exaggerated threats to health) but by paying a price that more accurately reflects meat's claim on energy, soils, water and the atmosphere.

Meat, of course, is not unique, as we do not pay directly for the real cost of any foodstuff we consume, any form of energy that powers modern civilization or any raw material that makes up its complex infrastructures. Meat has become more affordable not only because of the rising productivity of the livestock sector but also because much less has been spent on other foodstuffs. This post-WW II spending shift has been pronounced even in the US, where food was already abundant and relatively inexpensive: food expenditures took more than 40% of an average household's disposable income in 1900; by 1950, the share was about 21%; it fell below 15% in 1966 and below 10% (9.9%) in 2000; in 2010, it was 9.4%, with just 5.5% spent on food consumed at home and 3.9% on food eaten away from home (USDA 2012b). The total expenditure was slightly less than spending on recreation and much less than spending on health care. At the same time, the share of overall food and drink spending received by farmers shrank from 14% in 1967 to 5% in 2007, while the share going to restaurants rose from 8% to 14%.

These trends cannot continue, and their arrest and partial reversal should be a part of the affluent world's broader return to rational spending after decades of living beyond its means. Unfortunately, such adjustments may not be gradual: while the FAO food price index stayed fairly steady between 1990 and 2005, the post-2008 spike lifted it to more than double the 2002–2004 mean, and it led to renewed concerns about future food supply and about the chances of recurring, and even higher, price spikes. Increased food prices in affluent countries would undoubtedly reduce overall meat consumption, but their effect on food security in low-income nations is much less clear. For decades, low international food prices were seen as a major reason for the continuing insecurity of their food supply (making it impossible for small-scale farmers to compete), but that conclusion was swiftly reversed with the post-2007 rapid rise of commodity prices, which came to be seen as a major factor pushing people into hunger and poverty (Swinnen and Squicciarini 2012).

In any case, it is most unlikely that food prices in the populous nations of Asia and Africa will decline to the levels now prevailing in the West: China's share of food spending is still 25% of disposable income, and given the country's chronic water shortages, declining availability of high-quality farmland and rising feed imports, it is certain that it will not be halved yet again by the 2030s as it was during the past generation. And the food production and supply situation in India, Indonesia, Pakistan, Nigeria or Ethiopia is far behind China's achievements, and it will put even greater limits on the eventual rise in meat demand. In a rational world, consumers in rich countries should be willing to pay more for food in order to lower the environmental impacts of its production, especially when that higher cost and the resulting lower consumption would also improve agriculture's long-term prospects and benefit the health of the affected populations.

So far, modern societies have shown little inclination to follow such a course – but I think that during the coming decades, a combination of economic and environmental realities will hasten such rational changes. The short-term outlook for complex systems is usually more of the same, but (as in the past) unpredictable events (or events whose eventual occurrence is widely anticipated but whose timing is beyond our ken) will eventually lead to some relatively rapid changes. These realities make it impossible to predict the durability of specific trends, but I think that during the next two to four decades, the odds are better than even that many of the rational adjustments needed to moderate livestock's environmental impact (changes ranging from higher meat prices and reduced meat intakes to steps leading to lower environmental impacts of livestock production) will take place – if not by design, then by the force of changing circumstances.

Most nations in the West, as well as Japan, have already seen per capita meat consumption saturate: inexorably, growth curves have entered the last, plateauing stage and in some cases have gone beyond it, resulting in actual consumption declines. Most low-income countries are still at various points along the rapidly ascending phase of their consumption growth curves, but some are already approaching the upper bend. There is a high probability that by the middle of the 21st century, global meat production will cease to pose a steadily growing threat to the biosphere's integrity.

ABOUT THE AUTHOR(S)

Vaclav Smil is a Czech-Canadian scientist and policy analyst. He is distinguished professor emeritus in the Faculty of Environment at the University of Manitoba in Winnipeg, Manitoba, Canada.


Breaking a Sweat

To understand how water has influenced the course of human evolution, we need to page back to a pivotal chapter of our prehistory. Between around three million and two million years ago, the climate in Africa, where hominins (members of the human family) first evolved, became drier. During this interval, the early hominin genus Australopithecus gave way to our own genus, Homo. In the course of this transition, body proportions changed: whereas australopithecines were short and stocky, Homo had a taller, slimmer build with more surface area. These changes reduced our ancestors' exposure to solar radiation while allowing for greater exposure to wind, which increased their ability to dissipate heat, making them more water-efficient.

Other key adaptations accompanied this shift in body plan. As climate change replaced forests with grasslands, and early hominins became more proficient at traveling on two legs in open environments, they lost their body hair and developed more sweat glands. These adaptations increased our ancestors' ability to unload excess heat and thus maintain a safe body temperature while moving, as work by Nina Jablonski of Pennsylvania State University and Peter Wheeler of Liverpool John Moores University in England has shown.

Sweat glands are a crucial part of our story. Mammals have three types of sweat glands: apocrine, sebaceous and eccrine. The eccrine glands mobilize the water and electrolytes inside cells to produce sweat. Humans have more eccrine sweat glands than any other primate. A recent study by Daniel Aldea of the University of Pennsylvania and his colleagues found that repeated mutations of a gene called Engrailed 1 may have led to this abundance of eccrine sweat glands. In relatively dry environments akin to the ones early hominins evolved in, the evaporation of sweat cools the skin and blood vessels, which, in turn, cools the body's core.

Armed with this powerful cooling system, early humans could afford to be more active than other primates. In fact, some researchers think that persistence hunting (running an animal down until it overheats) may have been an important foraging strategy for our ancestors, one they could not have pursued if they did not have a means to avoid overheating.

This enhanced sweating ability has a downside, however: it elevates our risk of dehydration. Martin Hora of Charles University in Prague and his collaborators recently demonstrated that Homo erectus would have been able to persistence hunt for approximately five hours in the hot savanna before losing 10 percent of its body mass. In humans, 10 percent body mass loss from dehydration is generally the cutoff before serious risk of physiological and cognitive problems or even death occurs. Beyond that point, drinking becomes difficult, and intravenous fluids are needed for rehydration.
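
Those figures imply a substantial rate of water loss. A rough back-of-envelope sketch, assuming a 60 kg body mass for Homo erectus (an illustrative assumption of mine, not a figure from the study):

```python
# Implied average water loss for a five-hour persistence hunt that ends
# at 10% body mass loss. Body mass is an illustrative assumption.

body_mass_kg = 60.0
water_loss_l = 0.10 * body_mass_kg   # 10% of body mass, ~6 L of sweat
hunt_hours = 5.0

print(f"average loss: {water_loss_l / hunt_hours:.1f} L/hour")  # ~1.2 L/hour
```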

Our vulnerability to dehydration means that we are more reliant on external sources of water than our primate cousins and far more than desert-adapted animals such as sheep, camels and goats, which can lose 20 to 40 percent of their body water without risking death. These animals have an extra compartment in the gut called the forestomach that can store water as an internal buffer against dehydration.

In fact, desert-dwelling mammals have a range of adaptations to water scarcity. Some of these traits have to do with the functioning of the kidneys, which maintain the body's water and salt balance. Mammals vary in the size and shape of their kidneys and thus the extent to which they can concentrate urine and thereby conserve body water. The desert pocket mouse, for example, can live without water for months, in part because of the extreme extent to which its kidneys can concentrate urine. Humans can do this to a degree. When we lose copious amounts of water from sweating, a complex network of hormones and neural circuitry directs our kidneys to conserve water by concentrating urine. But our limited ability to do so means we cannot go without freshwater for nearly so long as the pocket mouse.

Neither can we preload our bodies with water. The desert camel can drink and store enough water to draw on for weeks. But if humans drink too much fluid, our urine output quickly increases. Our gut size and the rate at which our stomach empties limit how fast we can rehydrate. Worse, if we drink too much water too fast, we can throw off our electrolyte balance and develop hyponatremia (abnormally low levels of sodium in the blood), which is just as deadly as dehydration, if not more so.

Even under favorable conditions, with food and water readily available, people generally do not recover all of their water losses from heavy exercise for at least 24 hours. And so we must be careful to strike a balance in how we lose and replenish the water in our bodies.

Desert mammals such as camels have a range of adaptations to water scarcity. Credit: Mlenny Getty Images


Why can’t bugs be grub?

Billions of people around the world regularly eat bugs. Why do so many Westerners find the idea disgusting?


November 19, 2018 at 6:45 am

One Friday morning in May, 11-year-old Sarah Nihan went to school and did something she had never done before. She pulled a dry-roasted cricket out of a bowl and carefully lifted it to her mouth. “At first I was a little iffy,” Sarah admits. “I made the mistake of looking it in the eyes.”

At the time, Sarah was a fifth grader at Ellis School in Fremont, N.H. Before her language-arts class held its bug buffet, the students had learned all about the benefits of eating insects. Packed with protein and vitamins, insects are quite nutritious. And raising them takes far less land and water than raising traditional livestock, such as cattle. So as a food source, insects are better for the planet.

Fifth and sixth graders in New Hampshire held a classroom Bug Buffet last spring. Anyone who couldn’t stomach these dry-roasted insects (or insect-containing snacks such as cricket chips or cricket pancakes) could opt to instead eat gummy worms. Robin Lee

The kids wrote essays on the environmental and health benefits of eating bugs, or entomophagy (En-tuh-MAH-fuh-jee). They read a book about a student who ate a stink bug as defense against a bully. They watched videos of Asian people relishing tarantula burgers. Yet Sarah still had to brace herself and count to three before popping that bacon-and-cheese-flavored cricket into her mouth. “I told myself that I’m not going to lose to a bug,” she says. But after chewing a few seconds, she cringed.

She’s not alone. To most North Americans and Europeans, the thought of eating insects triggers the same reaction: Ewwww.

This isn’t how people react to all foods they dislike. For example, people who dislike asparagus usually don’t say it’s disgusting. “They just say it tastes bad,” points out Paul Rozin. “But they’d say goat intestine is disgusting.” We seem to save our revulsion for certain animal products.

Rozin is a psychologist at the University of Pennsylvania in Philadelphia. He’s spent decades studying how some foods have become taboo. He and other researchers are trying to learn where this disgust comes from — and whether it can be unlearned.


Eating your first bug isn’t always easy, as participants in the Bug Buffet learned.
Curriculum with a Cause/Facebook

Geography matters

Insects aren’t gross to everyone. Indeed, some two billion people around the world savor them on a regular basis.

Most Westerners — people who live in North America and Western Europe — don’t eat insects. But the Western diet includes a number of foods that can seem just as gross when you stop to think about them. Cheeses are made with mold and bacteria. Escargot, a dish eaten in France and other countries, consists of cooked snails. Shrimp and lobsters look kind of like giant bugs. (In fact, they’re arthropods, the same group of animals that includes insects and spiders.)

So why do Westerners shun ants, grasshoppers and other creepy-crawlies? That question piqued the interest of Julie Lesnik. She’s an anthropologist at Wayne State University in Detroit, Mich. There, she studies how the human diet has evolved.

Even though many North Americans find the idea of eating bugs gross, much of the world enjoys snacking on insects. But bugs are far from the only animals many diners shun. Here, fried scorpions are sold as a street snack in Beijing, China. weiXx/istockphoto

Lesnik has always been a picky eater. She never intended to study edible insects, let alone eat them. But while doing research in South Africa, she found evidence that primate ancestors of early humans used bone tools to dig into termite mounds. That suggested ancient humans ate insects. So when and why did Westerners quit eating bugs?

Some researchers think hunting for insects became less popular as ancient people found easier food sources in farming. If the land could support crops and cattle, why go after tiny scattered bugs with fewer calories?

Others explain the puzzle by looking at climate. Tropical countries get plenty of sun. That produces thicker vegetation, bigger insects and more kinds of them. People have better odds of finding an insect they like when they have lots to choose from year-round, Lesnik says. But farther north, where the seasons change, insects aren’t available during winter months.

Both ideas make sense. The earliest Europeans lived 18,000 to 22,000 years ago. That was during a period called the Last Glacial Maximum. Ice covered much of North America and northern Europe. To survive, people had to hunt deer and other large game. There wouldn’t have been many big, juicy bugs around. Could it be that insect-eating habits depend on where people live?

To test her idea, Lesnik gathered data on various factors that might affect whether cultures eat insects.

One such factor was agriculture. It’s likely that ancient hunter-gatherers chowed on insects. But people who raise animals and grow crops probably came to view insects as pests. That could make bugs less appealing.

Yet when Lesnik looked at a current map of insect-eating countries, she saw that agriculture was common in many of them. She also gathered data on the share of land in each country that’s good for farming. If agriculture were a key factor in insect eating, she’d expect people in farmable regions to eat fewer bugs. But that wasn’t the case.

She considered other explanations. For example, maybe the people who eat insects live in countries that are poor. Or maybe they don’t have enough farmed food to go around. If those theories were right, Lesnik would expect more insect eaters to be found in countries with crowded conditions or in low-income nations. Experts describe that last group as having a low gross domestic product, or GDP. (GDP is a way to measure the health and wealth of a nation’s economy.) However, Lesnik found no link between insect-eating and either GDP or population density. So insects aren’t just a fallback food for desperate people. Lesnik published her analysis last year in the American Journal of Human Biology.

Researchers have found that latitude — how far north or south of the equator you are — is the biggest predictor of who eats insects. People in warmer regions eat more of them. Yde Jongema

As it turns out, “Where you are in the world is the number one predictor of who’s going to be eating insects,” Lesnik says. Latitude is how far north or south you are from the equator. And in eight out of every 10 people, latitude alone predicts the likelihood that they’ll eat insects. Warmer parts of the world, Lesnik says, just have more bug-eating.
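
Lesnik's kind of country-level analysis can be illustrated in a few lines. A toy sketch (the data below are made up for demonstration and are not her dataset) of testing absolute latitude as a predictor of insect eating with logistic regression:

```python
# Toy illustration of latitude as a predictor of entomophagy.
# The labels below are hypothetical, not Lesnik's data.
import numpy as np
from sklearn.linear_model import LogisticRegression

abs_latitude = np.array([[5], [10], [15], [20], [25], [35], [45], [50], [55], [60]])
eats_insects = np.array([1, 1, 1, 1, 1, 0, 0, 0, 0, 0])   # 1 = insects eaten

model = LogisticRegression().fit(abs_latitude, eats_insects)
print(model.score(abs_latitude, eats_insects))   # share of cases predicted correctly
```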

On the practical side, geography explains why early Westerners didn’t eat insects. But it doesn’t explain the emotional part — the disgust. And that disgust has not only persisted in Western culture but also crossed borders.

Yuck factor

Lesnik thinks Westerners’ “yuck” reaction to bugs came with travel.

As early Europeans began traveling farther, they met other cultures. In 1493, a member of Christopher Columbus’ expedition to the Caribbean wrote about what he saw: “They eat all the snakes, the lizards, and spiders, and worms, that they find upon the ground so that, to my fancy, their bestiality is greater than that of any beast upon the face of the earth.” In other words, he was comparing the people in the New World to animals.

Writings like this show that Europeans “considered the people they encountered beastlike because they ate insects,” Lesnik says. As Westerners colonized other cultures, they needed to make themselves feel superior to those cultures, she says. She suspects that this need strengthened Western disgust toward eating insects.

Disgust also can be learned by various messages shared within a culture, says Lesnik. We aren’t necessarily born thinking that insects are gross. “If a kid tries to put a bug in his mouth, many parents discourage that behavior and tell the kid it’s icky,” she observes.

Today, Europeans and North Americans aren’t the only ones who see eating insects as gross. The disgust is spreading to people in low-income nations who had been used to eating insects.

Arnold van Huis is a tropical entomologist at Wageningen University in the Netherlands. He noticed this shift in attitude while conducting a grasshopper study in the West African country of Niger. After adopting a Western lifestyle, “The people say, ‘We have a certain standard of living now, and we don’t eat insects anymore,’” van Huis reports. “They go for the hamburger instead of the nice grasshoppers.”

Changing minds

Can disgust be unlearned? For some people, education does the trick.

Six years ago, Robert Nathan Allen, a recent college graduate, was working as a bartender in Austin, Texas. At some point, his mom shared a video about edible insects. “She sent it as a joke — said it seemed like something wacky my dad and I would try,” Allen recalls. The video explained how bugs are good for us and good for the Earth, just as Sarah and her classmates had learned in school. “I thought this was just incredible,” Allen says.

People in the United States can buy buggy snacks including cricket-flour chips, cricket-protein bars and sour-cream-and-onion “Crick-ettes.” Robin Lee

He looked around for insect foods. He found the occasional bag of candied ants or chocolate-covered grasshoppers for sale. But there wasn’t much else available in the United States.

He started calling insect researchers on the phone. “I’ve got a bar in Austin and I want to serve bugs,” he would say. “What should I do?” Some people hung up. Others laughed him off the phone. But finally a professor confided that he cooks up a batch of bugs and brings them to school each year on the last day of class. “Everyone eats them. We all have a blast,” he said. “But please don’t tell anybody,” he implored of Allen, “because I don’t want the administration to make me stop.”

That phone call didn’t result in any new insect foods or recipes for Allen’s bar. But it did something bigger: It spurred Allen to action. “This professor was worried he’d be barred from serving a food that’s eaten by billions simply because it was stigmatized in our Western food culture,” Allen says. That made him realize there was “the need to educate the public and address the cultural taboo.”

Allen got in touch with researchers and business people who shared his goals. He discovered other campuses that host public insect-tasting events. Each year, some 30,000 people attend Purdue University’s Bug Bowl. And in February, Montana State University held its 30th annual Bug Buffet. This weeklong event features cook-off competitions, lectures and plenty of insect treats to sample.

In 2013, Allen founded an Austin nonprofit called Little Herds. The organization teaches the public about the benefits of edible insects, sometimes known as “mini livestock.”

Early on, the group set up tasting booths at local farmers’ markets. They gave talks at schools. They advertised at museums. Right away they realized their prime audience: children. Most parents wouldn’t dare reach for a roasted cricket before first sampling a nicer-looking food, such as a cookie made with cricket flour. But “little kids would just walk up and start chowing down on the crickets,” Allen says.

Fear of missing out — on bugs

Kids may be a somewhat easy sell when it comes to insects. That's why some researchers have focused on adults. They've tried to figure out what traits make people likely to try insects. For a 2015 study, Rozin at the University of Pennsylvania and his colleagues gave 399 people from the United States or India an online survey about food. Participants saw pictures of cookies or breads baked with mealworm flour, and of tacos or crepes containing whole grasshoppers. Then the researchers asked the participants how willing they would be to sample those foods.

These researchers also asked the participants about their religion and politics. And they asked if participants agreed with statements such as: “eating bugs is disgusting,” “bugs are nutritious,” or “eating bugs puts you at risk for disease.” People also reported how willing they were to try new foods, how sensitive they were to disgust and how much they like risk and spontaneity.

Disgust was the most common reason people refused to eat bugs, this study found. The people most likely to try eating insects were those who weren't easily grossed out, who didn't mind unfamiliar foods and who liked new experiences (and telling others about them). Rozin's team reported its findings last year in the Journal of Insects as Food and Feed.

Another team of researchers surveyed 368 meat eaters in Flanders, Belgium. According to their analysis, Westerners are most willing to replace meat with bugs if they’re young, male, open to new foods, environmentally conscious and already trying to eat less meat. Those 2014 findings were published in Food Quality and Preference.

Potential insect eaters may share another key trait: fear of missing out, often known as FOMO. In 2015, behavioral economists carried out a study to learn what types of messages might nudge people to try insects. At a shopping mall in England, the team lured shoppers to a table of dry-roasted crickets by posting three different signs. One described the health benefits of eating insects. A second tried to make eating insects seem normal. It showed a photo of family members at a restaurant, enjoying crickets. The third sign tapped into FOMO by showing a near-empty plate of roasted bugs with the plea, “Don’t miss your chance to try.”

The sign about health benefits did OK at attracting shoppers to the bug-foods table. The sign with the family photo did better. But the FOMO poster worked best.
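In analysis terms, the experiment boils down to a three-way comparison of conversion rates: what fraction of passers-by each sign persuaded to stop at the table. Here is a minimal sketch of that comparison in Python; the counts are invented for illustration, since the article reports only the ranking of the signs, not the raw numbers.

```python
# Hypothetical sign-by-sign comparison of the mall experiment.
# All counts below are made up for illustration only.
signs = {
    "health benefits": 31,
    "social norm (family photo)": 42,
    "FOMO (near-empty plate)": 58,
}
passers_by = 400  # assumed number of shoppers who saw each sign

for name, stopped in signs.items():
    # Conversion rate: shoppers who stopped out of all who passed.
    print(f"{name:28s} {stopped / passers_by:.1%} of shoppers stopped")
```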

It’s a marketer’s version of peer pressure. That tactic also seemed effective in the New Hampshire classroom. At the bug buffet, Sarah’s classmate Ruby Drake initially steered clear of insect foods. “I was just going to have the gummy worms,” she says. But after a friend begged her to try “one of the real bugs,” Drake picked up a roasted cricket.

Classroom questions

The taste test ended quickly. “It crumbled the minute I touched it, and that grossed me out,” Drake says. “I spit it out.”

But that crunchy critter wasn’t a deal breaker. Drake also tried cricket-flour chips. “Those were pretty good,” she says. “I would put them in my lunch box.”

As for the dry-roasted crickets, Nihan says she would eat them again. However, she adds, “I’d probably brush my teeth afterward because the legs can get stuck in your teeth.”

Power Words

agriculture The growth of plants, animals or fungi for human needs, including food, fuel, chemicals and medicine.

annual Adjective for something that happens every year. (in botany) A plant that lives only one year, so it usually has a showy flower and produces many seeds.

arthropod Any of numerous invertebrate animals of the phylum Arthropoda, including the insects, crustaceans, arachnids and myriapods, that are characterized by an exoskeleton made of a hard material called chitin and a segmented body to which jointed appendages are attached in pairs.

bacteria (singular: bacterium) Single-celled organisms. These dwell nearly everywhere on Earth, from the bottom of the sea to inside other living organisms (such as plants and animals).

behavior The way something, often a person or other organism, acts towards others, or conducts itself.

bug The slang term for an insect.

calorie The amount of energy needed to raise the temperature of 1 gram of water by 1 degree Celsius. It is typically used as a measurement of the energy contained in some defined amount of food.

cattle Also known as bovines (because they’re members of the subfamily known as Bovinae), these are breeds of livestock raised as a source of milk and meat. Although the adult females are known as cows and the males as bulls, many people refer to them all, generally, as cows.

climate The weather conditions that typically exist in one area, in general, or over a long period.

colleague Someone who works with another; a co-worker or team member.

crop (in agriculture) A type of plant intentionally grown and nurtured by farmers, such as corn, coffee or tomatoes. Or the term could apply to the part of the plant harvested and sold by farmers.

culture (n. in social science) The sum total of typical behaviors and social practices of a related group of people (such as a tribe or nation). Their culture includes their beliefs, values and the symbols that they accept and/or use. Culture is passed on from generation to generation through learning. Scientists once thought culture to be exclusive to humans. Now they recognize some other animals show signs of culture as well, including dolphins and primates.

density The measure of how condensed some object is, found by dividing its mass by its volume.

diet The foods and liquids ingested by an animal to provide the nutrition it needs to grow and maintain health. (verb) To adopt a specific food-intake plan for the purpose of controlling body weight.

economy Term for the combined wealth and resources (people, jobs, land, forests and minerals, for instance) of a nation or region. It is often measured in terms of jobs and income or in terms of the production and use of goods (such as products) and services (for instance, nursing or internet access). Experts who study these issues are known as economists.

edible Something that can be eaten safely.

entomology The scientific study of insects. One who does this is an entomologist.

entomophagy A term for the human practice of eating insects.

equator An imaginary line around Earth that divides Earth into the Northern and Southern Hemispheres.

evolution (v. to evolve) A process by which species undergo changes over time, usually through genetic variation and natural selection. These changes usually result in a new type of organism better suited for its environment than the earlier type. The newer type is not necessarily more “advanced,” just better adapted to the particular conditions in which it developed.

expedition A journey (usually relatively long or over a great distance) that a group of people take for some defined purpose, such as to map a region’s plant life or to study the local microclimate.

factor Something that plays a role in a particular condition or event; a contributor.

FOMO Urban slang for fear of missing out.

geography The study of Earth’s features and how the living and nonliving parts of the planet affect one another. Scientists who work in this field are known as geographers.

gross domestic product Abbreviated GDP, this term refers to the monetary value (for instance, the dollar value) of all of the goods that are made and services that are performed in one year by everyone living within a nation.

host (in biology and medicine) The organism (or environment) in which some other thing resides. Humans may be a temporary host for food-poisoning germs or other infective agents.

hunter-gatherer A cultural group that feeds itself through hunting, fishing and gathering wild produce (such as nuts, seeds, fruits, leaves, roots and other edible plant parts). They can be somewhat nomadic and do not rely on agriculture for their foods.

insect A type of arthropod that as an adult will have six segmented legs and three body parts: a head, thorax and abdomen. There are hundreds of thousands of insects, which include bees, beetles, flies and moths.

journal (in science) A publication in which scientists share their research findings with experts (and sometimes even the public). Some journals publish papers from all fields of science, technology, engineering and math, while others are specific to a single subject. The best journals are peer-reviewed: They send all submitted articles to outside experts to be read and critiqued. The goal, here, is to prevent the publication of mistakes, fraud or sloppy work.

latitude The distance from the equator measured in degrees (up to 90).

livestock Animals raised for meat or dairy products, including cattle, sheep, goats, pigs, chickens and geese.

lizard A type of reptile that typically walks on four legs, has a scaly body and a long tapering tail. Unlike most reptiles, lizards also typically have movable eyelids. Examples of lizards include chameleons, the Komodo dragon and the Gila monster.

mealworm A wormlike larval form of darkling beetles. These insects are found throughout the world. The ever-hungry wormlike stage of this insect helps break down — decompose, or recycle — nutrients back into an ecosystem. These larvae also are commonly used as a food for pets and some lab animals, including chickens and fish.

Niger A French-speaking nation in West Africa, it established its independence from France in 1960. It is one of the world’s poorest countries and largely rural. It also had the highest fertility rate of any country in 2018, which helps explain why half of its population was under the age of 15.

online (n.) On the internet. (adj.) A term for what can be found or accessed on the internet.

peer (noun) Someone who is an equal, based on age, education, status, training or some other features. (verb) To look into something, searching for details.

politics (adj. political) The activities of people charged with governing towns, states, nations or other groups of people. It can involve deliberations over whether to create or change laws, the setting of policies for governed communities, and attempts to resolve conflicts between people or groups that want to change rules or taxes or the interpretation of laws. The people who take on these tasks as a job (profession) are known as politicians.

population (in biology) A group of individuals from the same species that lives in the same area.

primate The order of mammals that includes humans, apes, monkeys and related animals (such as tarsiers and lemurs, including the aye-aye, Daubentonia).

protein A compound made from one or more long chains of amino acids. Proteins are an essential part of all living organisms. They form the basis of living cells, muscle and tissues; they also do the work inside of cells. Among the better-known, stand-alone proteins are the hemoglobin (in blood) and the antibodies (also in blood) that attempt to fight infections. Medicines frequently work by latching onto proteins.

psychologist A scientist or mental-health professional who studies the human mind, especially in relation to actions and behaviors.

spider A type of arthropod with four pairs of legs that usually spin threads of silk that they can use to create webs or other structures.

survey (v.) To ask questions that glean data on the opinions, practices (such as dining or sleeping habits), knowledge or skills of a broad range of people. Researchers select the number and types of people questioned in hopes that the answers these individuals give will be representative of others who are their age, belong to the same ethnic group or live in the same region. (n.) The list of questions that will be offered to glean those data.

taboo A term for some activity that is considered wholly inappropriate and/or forbidden within a particular religious or social group. Many times the idea of this practice is so off limits that people won’t even discuss it in public.

tactic An action or plan of action to accomplish a particular feat.

tarantula A hairy spider, some of which grow large enough to catch small lizards, frogs and birds.

taste One of the basic properties the body uses to sense its environment, especially foods, using receptors (taste buds) on the tongue (and some other organs).

termite An ant-like insect that lives in colonies, building nests underground, in trees or in human structures (like houses and apartment buildings). Most feed on wood.

trait A characteristic feature of something. (in genetics) A quality or characteristic that can be inherited.

vegetation Leafy, green plants. The term refers to the collective community of plants in some area. Typically these do not include tall trees, but instead plants that are shrub height or shorter.

vitamin Any of a group of chemicals that are essential for normal growth and nutrition and are required in small quantities in the diet because either they cannot be made by the body or the body cannot easily make them in sufficient amounts to support health.

Western (n. the West) An adjective describing nations in Western Europe and North America (from Mexico northward). These nations tend to be fairly industrialized and to share generally similar lifestyles, levels of economic development (incomes) and attitudes toward work, education, social issues and government.

Citations

Journal: A. van Huis. Did early humans consume insects? Journal of Insects as Food and Feed. Posted online September 25, 2017. doi: 10.3920/JIFF2017.x006.

Journal: M.B. Ruby et al. Determinants of willingness to eat insects in the USA and India. Journal of Insects as Food and Feed. Posted online August 17, 2015. doi: 10.3920/JIFF2015.0029.

Journal: J. Lesnik. Not just a fallback food: global patterns of insect consumption related to geography, not agriculture. American Journal of Human Biology. Posted online February 1, 2017. doi: 10.1002/ajhb.22976.

Journal: W. Verbeke. Profiling consumers who are ready to adopt insects as a meat substitute in a Western society. Food Quality and Preference, Vol. 39, January 2015, p. 147-155. doi: 10.1016/j.foodqual.2014.07.008.

About Esther Landhuis

Esther Landhuis is a freelance journalist in the San Francisco Bay Area. She worked on her high school newspaper and spent a decade studying biology before discovering a career that combines writing and science.

Dissection

I’ve been building a theoretical background for this case, which should apply to similar claims that humans are herbivores or “made for plants” or whatever, but let’s spend some time on this image:

First of all, the point of this image is obviously to try to show that humans are “frugivores” and thus more like the pictured primate (whose species I really cannot identify; I’m not a primatologist). Please note that, simply due to shared evolutionary history, we will be more similar to a primate in many respects regardless of diet.

Secondly, frugivores are basically omnivores: the term is usually used for omnivores that feed mainly on fruit, and most frugivores do not eat fruit exclusively. Orangutans, for example, are usually referred to as frugivorous. Take a look at an orangutan skeleton and look at those canines, just as a contrast to that cherry-picked image chosen to represent all frugivores.

So let’s go down the table and just stop and think at every row.

Physiological food: What the hell is that? A platonic diet?

Hands/legs: Does this reflect adaptation toward specific diets? I think not.

Walking: Well, this is obviously cherry-picked to fit the idea. These walking styles are in no way representative of diet. Some primates walk upright, and many primates are omnivores.

Mouth opening: Again, is this evidence for specialization? The image tries to imply that only meat eaters have large mouths; what about hippos?

Teeth: Human teeth look like neither an herbivore’s nor a carnivore’s. Again, cherry-picking away: what would happen if you used a panda as the representative of herbivore teeth?

Chewing: This behavior clearly depends on what type of food you are actually eating; it is not a fixed behavior that marks a dietary specialization. Some foods simply need to be chewed more before they can be swallowed.

Saliva: As discussed above, humans have adapted to eating starch from agriculture. Omnivores are expected to handle both vegetable matter and animal tissue, so this is nothing strange.

Urine: Urine is the body’s way of excreting waste products and regulating water balance and body pH. The pH depends on what one eats: a high-protein diet causes acidic urine, but an animal does not have a carnivorous diet merely because its urine is acidic (Rose, Parker, Jefferson, & Cartmell, 2015).

Urate oxidase: Humans and the other great apes have this gene, but it’s not functional. It is otherwise present in virtually all organisms; apes are the outliers in that sense.

Gastric acid: This is simply wrong. Human gastric acid has a pH of 1.5 to 3.5, which is highly acidic (Lehrer, 2014).

Fibers and cholesterol: This might be true, but it’s mainly carnivores that really need these traits. I don’t know if the values are representative of the given groups, but as you may have noticed by now, you shouldn’t trust the image.

Sweat: Humans are like omnivores in this sense even according to the image.

Intestines: As one might expect from an omnivore, the intestines have a relative length intermediate between those of carnivores and herbivores.

Short alkaline colon: My guess here is that because a bear was chosen to represent omnivores, and bears are closely related to the heavily carnivorous polar bear, the omnivore column is biased toward carnivory.

Cellulose: Humans are like omnivores in this sense even according to the image.

Digestion: As one might expect from an omnivore, the digestion time according to the image is intermediate between carnivores and herbivores.

With that being said, there are additional problems one would face if one claimed that humans aren’t omnivores. Humans are not able to synthesize sufficient B12 in the gut, nor can humans acquire B12 from any source other than foods of animal origin or supplements produced by fermentation https://en.wikipedia.org/wiki/Vitamin_B12 (accessed: 04/01/2016). Additionally, humans absorb iron most efficiently from heme sources, that is, food that contains blood (West & Oates, 2008).

I’ve said it earlier and I’ll say it again: this is not an argument against being vegan. Humans are omnivores, but we can live on a completely vegan diet with B12 supplementation from fermentation. I think that claiming humans are something other than omnivores is just counterproductive, since it’s easily debunked and we lose credibility. There are plenty of reasons to be vegan while still sticking to what is true. This post is mainly focused on debunking the claim that humans are herbivores and should therefore eat only plants, but it should equally debunk anyone claiming that humans are biological meat eaters and therefore should eat meat.

And hey! This is the longest debunking of a meme I’ve ever done, and probably ever will do. Memes are stupid.


Mashed potatoes are Uncle Mike’s favorite food; at family dinners he would spoon mound after mound of them onto his plate long after everyone else was done eating. Many people seem to feel the same way about these creamy mountains of starch, but is it possible that something sinister lurks within, threatening some people with weight gain? To answer that question we first need to understand starch and its role in the human diet, as well as what happens to starch as we digest it, and how this process might differ in different people.

Starch, a principal component of potatoes, corn, pasta, bread, and rice, as seen in Figure 1, is composed of long, branching chains of glucose, which is a sugar and the primary source of energy for living cells. In fact, starch is so rich in glucose that it serves as energy storage for plants, helping them survive when the climate is cold or dry. These starchy stores are also exploited by animals, like humans, for energy to grow, stay warm, fight illness, and reproduce.

(a) Starch molecules are long, branching chains of the sugar, glucose, linked together. Pink shading denotes individual glucose molecules. Image credit, modified from Wikipedia, Amylopektin Sessel. (b) Many plant foods, such as potatoes, pasta, bread, rice, and corn have high starch content. Purified starch, such as this corn starch, is a fine, white, tasteless powder. Photo credit, clockwise from top: Wikipedia, Potatoes, Starchy Foods, Corn Starch Mixed with Water.

But some human populations have historically eaten more starch than others. For example, humans living in tropical or arctic environments eat less starch than those living in drier or more temperate climates. This is because the amount of starch plants produce, and thus the amount available for human consumption, depends on climate [1]. In tropical rainforests with plentiful sun and rain, plants have little need to store energy, focusing instead on capturing it by growing big leaves, and on growing rich fruits to attract animals that disseminate their seeds. Humans living in these environments therefore eat lower-starch diets, consuming more meat, fruit, and honey. Arctic human populations in Siberia and northern Canada have little access to plant food at all, relying principally on animal foods. In somewhat dry or more temperate seasonal climates, however, plants grow large starch supplies, so humans living there have long dug for starchy tubers like potatoes as staples of their diets. Around 10,000 years ago, many of these populations began cultivating plants like wheat, maize, and rice to maximize starch stores. This innovation actually changed how starch is digested in different people’s bodies—a divergence with serious ramifications for people’s health today.

When starch is consumed, it is broken down into glucose molecules with the help of molecular machines known as enzymes. Specifically, enzymes called amylases aid in breaking starch into glucose with the help of water. The first amylases to act are those found in the saliva, encoded in the genome (the full set of heritable material of an organism) by one gene, called AMY1. However, mistakes that happen when the genome replicates sometimes cause AMY1 to duplicate, so some people end up with many copies of AMY1—up to twenty! It turns out that people who have more copies of AMY1 actually produce more amylase enzymes in their saliva and more efficiently digest starch in their mouths [1]. This seems to have provided a nutritional benefit in populations who domesticated plants and increased starch consumption; over time, these agriculturalist populations accumulated more and more copies of AMY1, while the number of copies in non-agriculturalists has remained relatively low [1].

This difference in efficiency of starch digestion in saliva also has surprising downstream effects on the body. People with lower efficiency of starch digestion and a lower number of AMY1 copies actually have more dramatic spikes in their blood glucose levels after eating starch than people with a higher number—even though people with a higher number of copies break starch into sugar faster [1]. When food is digested into sugars, the sugars are absorbed into the bloodstream in order to feed tissues like muscle or fat around the body. In order for these tissues to take up glucose from the blood, a hormone called insulin is required. Insulin is supposed to make sure tissues take up the energy-rich glucose from the blood, preventing blood glucose levels from getting too high and becoming toxic, as in diabetes. As it turns out, people with a lower number of AMY1 copies also have lower insulin levels after eating starch, so their glucose stays in the blood instead of entering tissues, which may explain why those people end up with bigger spikes in blood glucose levels [1, 2]. So perhaps high-starch-consuming populations have become adapted not only to efficiently digest starch into sugars, but also to use those sugars, keeping blood glucose levels moderate.

Unfortunately, lower insulin levels and higher glucose spikes also relate to a risk of obesity. According to a 2014 study of over 5,000 people from Europe and Asia, each fewer copy of AMY1 was associated with a 20% increase in risk of obesity, as depicted in Figure 2 [3]. Furthermore, variation in AMY1 copies may account for somewhere between 2.5% and 20% of all variation in risk of obesity among people. Prior to this study, all of the hundreds of genetic variants found to be associated with obesity accounted together for only 2-4% of the genetic risk of obesity, or less than 3% of the overall risk of obesity.

Copies of AMY1 in the genome for different individuals. Fewer copies of AMY1 increases risk of obesity and also impacts blood glucose and insulin—traits closely linked to diabetes.
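To get a feel for the size of that association, here is a toy sketch. Treating the reported 20%-per-fewer-copy figure as a multiplicative effect relative to an arbitrary reference of nine copies is a simplifying assumption made here for illustration; it is not the study’s statistical model.

```python
REFERENCE_COPIES = 9        # arbitrary reference copy number (an assumption)
RISK_PER_FEWER_COPY = 1.20  # ~20% higher obesity risk per fewer copy [3]

def relative_obesity_risk(amy1_copies: int) -> float:
    """Relative risk versus a person at the reference copy number."""
    return RISK_PER_FEWER_COPY ** (REFERENCE_COPIES - amy1_copies)

for copies in (3, 6, 9, 12):
    print(f"{copies:2d} AMY1 copies -> relative risk {relative_obesity_risk(copies):.2f}")
```

Under this toy model, someone with three copies would carry roughly three times the relative risk of someone with nine, which conveys the scale of the association even though the real analysis controls for many other factors.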

We still would like to know more about how starch affects health. Researchers want to figure out why exactly people with low amylase are at greater risk for obesity and insulin resistance, and how this relates to starch. For example, one study found that people can actually perceive the changes in viscosity of their saliva as the starch digests in their mouths [1]. If starch digests too slowly in some people’s mouths, would they think it tastes bad? Or would they feel less full from eating starch? Researchers are also trying to figure out if salivary amylase has different important functions in other tissues, like fat, where it is also found at high levels [4].

In spite of these mysteries, current research tells us two things. First, though huge variation exists within any given population, humans from different places are adapted to different diets, at least as far as starch is concerned. And, second, such adaptations can lead to vastly different health outcomes for people eating modern diets rich in processed, starchy foods. Uncle Mike certainly has never had a problem with eating potatoes, but I can’t really say if that’s because he’s an extremely efficient starch digester or because he’s been a champion runner since high school. Other factors like lifestyle and development also influence food digestion and obesity risk. However, studying what people’s ancestors ate and why people’s bodies differ in what they do with the food they eat will help tailor nutritional guidelines to suit individuals.

Elizabeth Brown is a graduate student in the Department of Human Evolutionary Biology at Harvard University.

References

[1] American Society for Nutrition:

[2] Muneyuki et al. “Latent associations of low serum amylase with decreased plasma insulin levels and insulin resistance in asymptomatic middle-aged adults.” Cardiovascular Diabetology. Vol. 11, 2012: 80.



She speculates that drinking milk might have other advantages besides its nutritional value. People who keep livestock are exposed to their diseases, which can include anthrax and cryptosporidiosis. It may be that drinking cow’s milk provides antibodies against some of these infections. Indeed, milk's protective effect is thought to be one of the benefits of breastfeeding children.

Women nurse their children in Bogota, Colombia for a World Breastfeeding Week event. Milk’s protective effect is thought to be a benefit of breastfeeding (Credit: Getty)

But some of the mysterious absences of lactase-persistence could be down to sheer chance: whether anyone in a group of pastoralists happened to get the right mutation. Until fairly recently there were a lot fewer people on Earth and local populations were smaller, so some groups would miss out by plain bad luck.

“I think the most coherent part of the picture is that there’s a correlation with the way of life, with pastoralism,” says Swallow. “But you have to have the mutation first.” Only then could natural selection go to work.

In the case of Mongolian herders, Swallow points out that they typically drink fermented milk, which again has a lower lactose content. Arguably, the ease with which milk can be processed to be more edible makes the rise of lactase persistence even more puzzling. “Because we were so good at adapting culturally to processing and fermenting the milk, I’m struggling with why we ever adapted genetically,” says Swallow’s PhD student Catherine Walker.

There may have been several factors promoting lactase persistence, not just one. Swallow suspects that the key may have been milk’s nutritional benefits: it is rich in fat, protein, sugar and micronutrients like calcium and vitamin D.

It is also a source of clean water. Depending on where your community lived, you may have evolved to tolerate it for one reason over another.

It’s unclear whether lactase persistence is still being actively favoured by evolution, and thus whether it will become more widespread, says Swallow. In 2018 she co-authored a study of a group of pastoralists in the Coquimbo region of Chile, who acquired the lactase-persistence mutation when their ancestors interbred with newly-arrived Europeans 500 years ago. The trait is now spreading through the population: it is being favoured by evolution, as it was in northern Europeans 5,000 years ago.

Dairy cows munch on alfalfa in north-western France, a part of the world where people would have adapted to drinking milk around 3,000 years ago (Credit: Getty)

But this is a special case because the Coquimbo people are heavily reliant on milk. Globally, the picture is very different. “I would think it’s stabilised myself, except in countries where they have milk dependence and there is a shortage [of other food],” says Swallow. “In the West, where we have such good diets, the selective pressures are not really likely to be there.”

Dairy decline?

If anything, the news over the last few years offers the opposite impression: that people are abandoning milk. In November 2018, the Guardian published a story headlined “How we fell out of love with milk”, describing the meteoric rise of the companies selling oat and nut milks, and suggesting that traditional milk is facing a major battle.

But the statistics tell a different story. According to the 2018 report of the IFCN Dairy Research Network, global milk production has increased every year since 1998 in response to growing demand. In 2017, 864 million tonnes of milk were produced worldwide. This shows no sign of slowing down: the IFCN expects milk demand to rise 35% by 2030 to 1,168 million tonnes.
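Those two IFCN figures imply a fairly steady compound growth rate. As a quick back-of-the-envelope check (the assumption of smooth year-on-year growth is ours, not the IFCN’s):

```python
production_2017 = 864    # million tonnes produced worldwide (IFCN, 2017)
forecast_2030 = 1_168    # million tonnes expected by 2030 (~35% higher)
years = 2030 - 2017

# Compound annual growth rate implied by the two figures.
annual_growth = (forecast_2030 / production_2017) ** (1 / years) - 1
print(f"Implied compound growth: {annual_growth:.1%} per year")  # ~2.3%
```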

Still, this masks some more localised trends. A 2010 study of food consumption found that in the US milk consumption has fallen over the last few decades – although it was replaced with fizzy drinks, not almond milk. This fall was balanced by growing demand in developing countries, especially in Asia – something the IFCN has also noted. Meanwhile, a 2015 study of people’s drinking habits in 187 countries found that milk drinking was more common in older people, which does suggest that it is less popular with the young – although this says nothing about young people’s consumption of milk products like yoghurt.

While milk consumption has fallen in the US, in Asia demand is growing (Credit: Getty)


From Campfire to Haute Cuisine: How Food and Flavor Drove Human Evolution

Taste is often dismissed as the most primitive of the senses. But new research reveals it is in many ways the most subtle and complex of them all.

Reporting from kitchens, farms, restaurants, and science labs, Pulitzer Prize-winning author John McQuaid, author of Tasty: The Art and Science of What We Eat, tells the story of how taste and food drove human evolution. Talking from his home in Maryland, he explains how human evolution and taste are intimately linked, why we like a juicy burger, and why sugar is wrecking our taste buds.

You open the book with a description of something called the "tongue map." Tell us about that.

The "tongue map" was a fixture of elementary school experiments where you would be handed a bunch of cups containing a bitter-sweet, salty-sour solution, the four basic tastes that the tongue can detect. According to this map, different parts of the tongue could detect each taste. The tip of the tongue detected sweet the back of the tongue, bitter and the sides, salty and sour. You would swish these things around in your mouth and discover the anatomy of the tongue. But the tongue map proved to be completely false. [Laughs]

Later, as scientists began to develop the tools to unravel the anatomy of the tongue down to the molecular level, they were able to discover exactly how it worked, which is much more complicated than that early map. Taste is detected through tiny proteins called taste receptors, which are embedded in the cells of the taste buds. These taste cells respond to each of the basic tastes. So, every part of the tongue can detect every taste.

Recently, they have discovered an anatomy at the molecular level that can detect fat and produce a unique and pleasurable sensation in the brain. Some people think that there's also a taste for carbohydrates, such as starches, something no one had seriously considered before.

You say "flavor embodies the basic savagery of being an animal." We taste, therefore we are?

Taste evolved in order to provide gratification from food. If you are an animal, you would go out and seek out food, and eat it, and stay alive. This has been true for hundreds of millions of years. Today, we live in a society that papers over those basic urges, but the urges are still there. They're there every time you bite into a hamburger or drink a glass of wine. The anatomy of your brain and body responds, and these ancient impulses take over.

We think we're dining in a refined restaurant, supping on the finest food and drink available, but actually we're just animals devouring a kill.

You go as far as to say that "we owe our existence and our humanity to taste—and, in many ways our future depends on it, too." Is this just a blurb for the book cover?

No. [Chuckles] In the course of researching the book, I came to believe that it's absolutely the case. This is an underappreciated aspect of taste and flavor. Because taste and flavor are so mundane, something we experience every day, people don't think about it in these terms. But at each point in the evolution of humans, taste provided a kind of bootstrapping mechanism for larger brains and smarter strategies for getting food.

As food became more gratifying, these strategies improved. For example, cooking changed the taste of food. So when it was invented, somewhere around a million years ago, there was a whole revolution in flavor. This has been tied by scientists to humans developing larger brains and the human body changing shape to become more like we are now. So, the emergence of human culture and civilization both—on some level—rotate around food, and food rotates around flavor.

Taste has been kept on the margins of scientific enquiry for the past two thousand years. Why? And what's changed?

Partly what we were talking about earlier. There's a certain anxiety about the savage nature of eating and flavor. Philosophers and scientists have traditionally regarded taste as a base sense, whereas vision or hearing were construed to be a higher sense, a refined sense.

The other problem was that until recently no one had the tools to untangle how tastes really worked. You could study optics, or how light worked, back in the Renaissance. But you need very sophisticated tools to study genes and molecular biology. That's the reason why things are changing fast today. We're getting all these fascinating new insights into taste and flavor.

One of the truly shocking statistics in the book is that Americans now consume 40 teaspoons of sugar per day. What will be the consequences if this sugar binge continues?

We're already seeing terrible consequences in high obesity. Sugar has been linked to diabetes. Type 2 diabetes has gone way up, and there are all sorts of other health problems that have been linked to sugar: metabolic issues, cardiovascular issues. It's a public health problem that no one really knows how to solve.

Soft drink companies are responsible for a lot of that extra sugar. And people are starting to recognize that this is a problem and looking for other things to drink. As a result, soft drinks have suffered losses in their sales.

But sugar has addictive properties. It hits the pleasure centers of the brain, rather like cocaine. It's not exactly the same thing. But as you eat or drink a lot of sugar, the body becomes less sensitive to it, like addiction. So to get that sweet taste, you need to have more sugar. How you break that is very difficult.

Many food lovers will raise their eyebrows at your assertion that "the burger is the culmination of 5,000 years of culinary history." Isn't that how bush meat was cooked 5,000 years ago?

[Laughs] I don't know if it was ground up and grilled in quite the same way! But it's a good question. It has more to do with the availability issue. For most of the period where humans have been living in settlements, roughly the past 12,000 years or so, most of them didn't get much meat, because meat is relatively hard to come by, unless you're living in a herding society.

Only in the past hundred years or so has meat become more generally available, thanks to the development of the industrial food system. That changed the whole dynamic that had existed for thousands of years. Suddenly, everybody who's living in a fairly well-off society, and many in non-well-off societies—McDonald's are everywhere—can now get meat as the center of their meal. Previously, only rich people and kings and their courts could do that. It has been a real revolution, which has changed food and the tastes people have. People have developed a taste for much richer food than they had in the past.

The meat industry in America is famously litigious. Did this prevent you taking a more critical stand toward mass-produced beef?

I was looking at the ordinary experience of a consumer of food and how that has changed. What does it mean to be eating more meat? I'm not sure if that's something that the meat industry will get upset about. Perhaps they will.

But in terms of the taste issue, it raises a lot of issues about how human biology works and is fascinating for that reason. I don't know if I'm going to offend anyone in the meat industry by kind of exploring these issues. Because I think it's important to explore them. And I hope that people will be educated by what I've written.

I just bought some carrots at my local farmers market, and they in no way resembled the woody, bitter things you buy in a plastic bag.

I was more looking at the scientific question of taste rather than where food comes from. One thing I did look at, though, is the decline of the taste of food [raised] in industrial systems. For example, since the tomato has been a mass-produced farm product, it has lost a lot of the flavor that it once had a hundred years ago. The tomatoes we get in the supermarket look fantastic. They're red, they're bulging, but they don't really taste that great.

I talked to this one professor at the University of Florida who has launched a project to try to recover this lost flavor of the tomato. They've decoded the genome of the tomato, and gathered heirloom tomatoes, trying to engineer a tomato that tastes really good but is also capable of being produced by farmers and, ideally, mass produced.

How has writing this book changed the way you live and eat?

That's a good question. Prior to doing this I didn't think a whole lot about flavor: where it came from, how my body was responding, or how a particular food was engineered. It's made me a great deal more skeptical about sugar and about diet foods, also.

One thing that has come up recently is that even diet drinks with artificial sweeteners may be messing with the body's metabolism in a way that is similar to sugar. So, even with no calories, they could be causing a diabetic syndrome.

All of these things are very alarming when you look into them. I used to be a big Diet Coke drinker, but I've curtailed my consumption. At the same time, I have developed a much greater appreciation of what traditional food producers and chefs do in terms of all of the craft that goes into making, for example, a cheese. I visited an artisanal cheese-making facility in Vermont and saw the great care that goes into making it and the complexity of how they manage these microbial communities, which make the flavor.

Now every time I taste something that has been made in a careful way, it's a much different experience. I have a much greater appreciation. I try to savor things more assiduously than I did before.


Why do we love being gross?

In a somewhat counterintuitive twist on Darwin’s theory, kids might love being gross because it gives them an evolutionary advantage.

We already know that not all germs are bad for us. From our gut flora to the germs on our skin, microbes work with our immune systems to maintain our body’s equilibrium, protect us from pathogens, and more. Science also tells us that wallowing in a little filth, especially with activities that bring kids close to the soil or in contact with animals, helps them build stronger immune systems that can more readily fight disease.

“It is less about getting dirty than it is enabling them to interact with the world around them,” says Jack Gilbert, a paediatrics professor at the University of California, San Diego. Gilbert doesn’t run after his kids with disinfecting wipes. He lets them experience nature’s bouquet of microbes, because he knows their future immune systems depend on it.

“Children that physically interact with a dog, under the age of one years old or so, will have a 13 percent reduction in the likelihood of developing asthma,” he says. “Kids who grow up on a farm interacting with lots of farm animals have a 50 percent reduction. That exposure is actually very important for stopping chronic allergic diseases.”

Childhood is effectively a boot camp for the immune system—at least, until a certain age. One study from 2014 shows that for most kids, disgust sensitivity starts to kick in around five. That’s right around the time kids are more likely to be exposed to more dangerous forms of microbial life, such as the respiratory syncytial virus and Giardia, a microscopic parasite that causes diarrhoea.

“This is an age at which they've been weaned, and so they're starting to find food for themselves and put a lot of things in their mouths, but their immune systems aren't fully developed,” says study author Joshua Rottman, an assistant professor of psychology at Franklin & Marshall College in Lancaster, Pennsylvania. “A lot of really young children die every year due to pathogens and parasites. That might be in part because they're not disgusted.”

Some adults find gross things compelling, too. We judiciously inspect the contents of our tissues, watch gory movies, enjoy slimy food, and take strange enjoyment in squeezing our spots. What’s wrong with us?

The jury’s still out on this one. But researchers have a few ideas. Some experts, including Rottman, attribute our zeal for the ick factor to “benign masochism,” in which our brains find pleasure in negative things. Others hypothesise it’s our subconscious bent for problem solving that makes the gross so compelling.

“It’s related to the value of learning about a threat, so that you may better protect yourself in the future, or neutralising the threat now,” says Laith Al-Shawaf, an assistant professor of psychology at the University of Colorado, Colorado Springs. “So, if your kid gets an open wound and there's pus oozing out, you have to go learn more about it and attend to your kid and help them.”

Both of these could be true. There’s also a third hypothesis: Dirt could still be good for adult immune systems, says Gilbert. “I think of the immune system like a gardener,” he says. “It’s there to maintain the garden of microbes we come in contact with every day, to keep the good ones around and the bad ones out. The good ones have a massive impact on our health.”



A lot has changed since the age of dinosaurs, hundreds of millions of years ago. Humans didn't exist then, and the dinosaurs are gone. Yet crocodiles are still here and, unlike humans, have changed little by comparison. They even look similar to crocodiles from the Jurassic period, some 200 million years ago.

A new study finds that this is due to a 'stop-start' pattern of evolution, governed by environmental change. In this pattern, known as "punctuated equilibrium," evolution is generally slow but occasionally speeds up when the environment changes. The new research suggests that crocodile evolution accelerates when the climate is warmer, and that warmer conditions favor larger body sizes.

"Our analysis used a machine learning algorithm to estimate rates of evolution," said lead author Dr. Max Stockdale from the University of Bristol. "Evolutionary rate is the amount of change that has taken place over a given amount of time, which we can work out by comparing measurements from fossils and taking into account how old they are. For our study we measured body size, which is important because it interacts with how fast animals grow, how much food they need, how big their populations are and how likely they are to become extinct."

This stop-start pattern also explains why there are only 25 species of crocodile today, while animals such as lizards and birds have achieved a diversity of many thousands of species in the same amount of time or less. Crocodiles arrived at a body plan that was efficient and versatile enough that they didn't need to change it in order to survive.

Crocodiles had a much greater diversity of forms in the past. Examples include fast runners, digging and burrowing forms, herbivores, and ocean-going species. Image: University of Bristol

So why did the prehistoric crocodiles die out? The climate during the age of dinosaurs was warmer than it is today, and that may explain why there were many more varieties of crocodile than we see now. Being cold-blooded and able to draw energy from the sun means crocodiles do not need to eat as much as a warm-blooded animal like a bird or a mammal, which suggests they should not have been as badly affected by the meteor impact that set off the chain of events leading to the mass extinction. But there is no certain answer.


