Analytical chemistry tools are helping to unravel one of the most remarkable feats of recent human evolution: the development of lactose tolerance, which allows people to digest lactose, the main sugar in milk. Maria Burke reports
Despite the prominence of milk drinking in Europe and the US, only about a third of the world’s population has a genetic mutation that allows them to digest lactose. It’s estimated that about 68% of the world’s population can’t digest lactose, which can cause problems such as cramps, diarrhoea and flatulence. Lactose intolerance is most common in South America, Africa and Asia.1
‘To digest lactose, we need to produce the enzyme lactase in our gut,’ explains George Davey Smith, director of the MRC Integrative Epidemiology Unit at the University of Bristol, UK. ‘Almost all babies produce lactase, but in the majority of people globally that production declines rapidly between weaning and adolescence. However, a genetic trait called lactase persistence [the continued expression of lactase into adulthood] has evolved multiple times over the last 9000 years and spread in various milk-drinking populations in Europe, central and southern Asia, the Middle East and Africa.’ This allows people to drink milk comfortably.
Scientists have assumed that lactase persistence (LP) emerged because it allowed people to consume more milk and dairy products, which meant they were taking in useful calories as well as nutrients such as calcium and vitamin D; this is known as the calcium assimilation hypothesis. But new research, involving scientists from the University of Bristol and University College London (UCL) in the UK with collaborators from 20 other countries, suggests that there may be another more significant reason.2
They found that milk or milk product consumption was common and popular throughout Europe long before people were lactose tolerant. Under normal circumstances, this didn’t present too many problems, especially if people mostly consumed fermented milk (yoghurt or cheese), because fermentation converts much of the lactose into lactic acid, and they ate plenty of other foods as well. But during famines, as crops failed and they used up supplies of fermented foodstuffs, people ate more high-lactose products. This would have caused the usual lactose intolerance symptoms such as diarrhoea, which can be life-threatening in severely malnourished individuals or those suffering from disease. As a result, those people able to digest milk were more likely to survive.
The team mapped patterns of milk use over the last 9000 years, mined the UK Biobank, which holds genetic and medical data for more than 300,000 living individuals, and combined ancient DNA, radiocarbon, and archaeological data using new computer modelling techniques. Their findings show LP was not common until around 1000 BCE, nearly 4000 years after it was first detected (4700–4600 BCE).
‘The LP genetic variant was pushed to high frequency by some sort of turbocharged natural selection,’ says Mark Thomas, Professor of evolutionary genetics and study co-author from UCL. ‘The problem is such strong natural selection is hard to explain.’
That’s where chemistry could help. Richard Evershed’s team from the University of Bristol’s School of Chemistry assembled a database of organic animal fat residues found on fragments of pottery and then used it to find out where and when people were consuming milk.
‘When cooking in unglazed clay pottery vessels, lipids from foodstuffs get absorbed into the clay pores of pottery,’ explains team member Mélanie Roffet-Salque. ‘They can remain preserved there for thousands of years. At the University of Bristol, we are able to extract those lipids using organic solvents and analyse them using state-of-the-art chromatographic and spectrometric methods to quantify and characterise those compounds.’
In the 1990s, Evershed developed a method to characterise animal fats further and distinguish between non-ruminant and ruminant fats, and dairy and adipose fats. This method is based on compound-specific carbon isotope analyses of the main fatty acids from degraded animal fats (palmitic C16:0 and stearic C18:0 acids) using a gas chromatograph coupled with an isotope ratio mass spectrometer through a combustion interface (GC-C-IRMS). Using these analytical methods, a few years ago the team extracted animal fats from over 4000 pottery shards across early farming sites in Europe. In this recent study, they used these data together with data from published literature to list almost 7000 animal fat residues derived from over 13,000 pot shards from 554 archaeological sites.
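The classification that this isotope method enables can be illustrated with a short sketch. The Δ13C thresholds below are commonly cited approximations from the literature, not values taken from this study, and real analyses also compare residues against reference animal fats from the same region:

```python
# Toy classifier for degraded animal-fat residues based on the
# difference in carbon isotope ratios of the two main fatty acids.
# delta-13C values are in per mille relative to the VPDB standard.

def classify_residue(d13c_c16, d13c_c18):
    """Assign a likely fat source from compound-specific d13C values.

    d13c_c16: delta-13C of palmitic acid (C16:0)
    d13c_c18: delta-13C of stearic acid (C18:0)
    Thresholds are illustrative approximations of published criteria.
    """
    big_delta = d13c_c18 - d13c_c16  # the key discriminator, Δ13C
    if big_delta <= -3.1:
        return "ruminant dairy fat"
    elif big_delta <= -0.3:
        return "ruminant adipose fat"
    else:
        return "non-ruminant adipose fat"

# Stearic acid strongly depleted relative to palmitic acid points
# to milk processing in the vessel:
print(classify_residue(-29.0, -33.5))  # ruminant dairy fat
print(classify_residue(-27.0, -28.0))  # ruminant adipose fat
print(classify_residue(-26.0, -25.8))  # non-ruminant adipose fat
```

The appeal of using the *difference* between the two acids, rather than either value alone, is that it largely cancels out baseline variation in the animals’ diets.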
‘Now, for the first time, we were able to map milk use in time and space, from 7000 BCE to 1000 AD across Europe,’ says Roffet-Salque. ‘This was only possible due to the analysis of thousands of pottery shards over the years and the construction of this unprecedented database.’ Their findings showed milk was used extensively in European prehistory, dating from the earliest farming nearly 9000 years ago, but increased and decreased in different regions at different times.
The UCL team, meanwhile, assembled a database of the presence or absence of the LP genetic variant using published ancient DNA sequences from more than 1700 prehistoric European and Asian individuals. They found that the variant first appeared around 5000 years ago. By 3000 years ago, it was at appreciable frequencies and is very common today. Next, the team developed a new statistical approach to examine how well changes in milk use through time could explain the natural selection for LP. Surprisingly, they found no relationship, challenging the long-held view that the growth of milk use drove LP evolution.
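The logic behind such a test can be sketched with a toy deterministic model (not the authors’ actual method, which uses ancient DNA and far more sophisticated inference). Here, selection strength in each generation is set proportional to a hypothetical milk-use proxy, and the resulting allele trajectory is compared with one under constant selection of the same average strength:

```python
import numpy as np

# Toy illustration: does a milk-use time series explain an allele's
# rise better than constant selection? All numbers are hypothetical.

def trajectory(p0, s_per_gen):
    """Deterministic allele frequency under genic selection, where
    s_per_gen is a sequence of per-generation selection coefficients."""
    p = p0
    out = [p]
    for s in s_per_gen:
        p = p * (1 + s) / (1 + s * p)  # standard selection recursion
        out.append(p)
    return np.array(out)

gens = 200                          # ~5000 years at 25 years/generation
milk = np.linspace(0.2, 1.0, gens)  # hypothetical rising milk-use proxy

# Model A: selection proportional to milk use; model B: constant
# selection with the same average strength.
traj_milk = trajectory(0.001, 0.04 * milk)
traj_const = trajectory(0.001, np.full(gens, 0.04 * milk.mean()))

print(f"midpoint frequency, milk-driven: {traj_milk[100]:.4f}")
print(f"midpoint frequency, constant:    {traj_const[100]:.4f}")
print(f"final frequency, milk-driven:    {traj_milk[-1]:.3f}")
print(f"final frequency, constant:       {traj_const[-1]:.3f}")
```

The two models end up at almost the same frequency but diverge in between, which is why dated ancient genomes sampled through time, rather than modern frequencies alone, are what let the researchers test whether milk use tracked the strength of selection.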
In the third strand of the study, UK Biobank data showed few differences in milk drinking behaviour between genetically lactase persistent and non-persistent people. Significantly, the researchers found the majority of people who didn’t have the LP gene variant experienced no short- or long-term negative health effects when they consumed milk.
‘Our findings show milk use was widespread in Europe for at least 9000 years, and healthy humans, even those who are not lactase persistent, could happily consume milk without getting ill,’ Davey Smith concludes. ‘However, drinking milk in lactase non-persistent individuals does lead to a high concentration of lactose in the intestine, which can draw fluid into the colon, and dehydration can result when this is combined with diarrhoeal disease.’
A healthy person who is lactase non-persistent and drinks lots of milk may experience some discomfort, but doesn’t die of it, says Thomas. ‘However, if you are severely malnourished and have diarrhoea, then you’ve got life-threatening problems. When their crops failed, prehistoric people would have been more likely to consume unfermented high-lactose milk – exactly when they shouldn’t.’
To test these ideas, Thomas’ team incorporated indicators of past famine and pathogen exposure into their statistical models. Their results clearly supported both explanations – the LP gene variant was under stronger natural selection when there were indications of more famine and more pathogens.
The authors suggest that, in later prehistory, as populations and settlement sizes grew, human health would have been increasingly affected by poor sanitation and diseases causing diarrhoea. Under these conditions, consuming milk would have resulted in increasing death rates, with individuals lacking lactase persistence being especially vulnerable. This situation would have been further exacerbated under famine conditions when disease and malnutrition rates rise. This would lead to individuals who did not carry a copy of the LP gene variant being more likely to die before or during their reproductive years, which would push up the population prevalence of lactase persistence. It seems the same factors that influence human mortality today drove the evolution of the gene through prehistory, they conclude.
Bronze Age backup
The team’s theory is in line with the findings of another international group of researchers from the US, Germany and Switzerland. They tested the genetic material from the bones of people who died during a Bronze Age battle in around 1200 BCE near a river in present-day Germany. The team found that, of the 14 Bronze Age warriors, only one had the LP allele – a frequency of about 7%.3 This was despite the battle occurring more than 4000 years after the introduction of agriculture in Europe, which partly involved the consumption of dairy products from early domesticated animals. Other European genetic data from the early Medieval period, less than 2000 years later, indicate that more than 60% of individuals had the ability to drink milk as adults, close to what is observed in modern central European countries.
‘The frequency [we found] is much lower than the frequency found in northern Germany today,’ says Daniel Wegmann, a biologist at the University of Fribourg, Germany. ‘In a sample we used, representing modern Europeans, the frequency of the LP allele is about 75%, but there is some variation. Our modelling revealed that such a frequency change across about 3000 years cannot be explained without very strong selection – actually the strongest selection inferred for any locus to date.’
The fact that the LP allele and the LP trait was under selection was not in itself surprising, Wegmann continues. ‘What was surprising was that the frequency was still very low in the Bronze Age and selection was still so strong after the Bronze Age. The classic theory assumed that selection started to be strong as soon as people started to drink milk, and to get weaker as alternative foods become available to the general population, for instance during the Roman period. But, as our data suggest, it was likely during those past 3000 years when selection had its major impact. That opens up the debate about why this locus was under such strong selection during this time.’
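The scale of selection Wegmann describes can be checked with a back-of-the-envelope calculation (a deterministic simplification, not the group’s actual inference, which also models genetic drift and sampling error). Moving an allele from about 7% to 75% in roughly 3000 years, or about 120 human generations, implies a per-generation selection coefficient of around 3%:

```python
import math

# Back-of-envelope: what constant selection coefficient s takes an
# allele from p0 to p1 in n generations? Under the genic-selection
# recursion p' = p(1+s)/(1+sp), logit(p) increases by ln(1+s) each
# generation, so s = exp((logit(p1) - logit(p0)) / n) - 1.

def logit(p):
    return math.log(p / (1 - p))

p0, p1 = 0.07, 0.75   # Bronze Age vs modern LP allele frequency
n = 3000 / 25         # ~120 generations at 25 years per generation

s = math.exp((logit(p1) - logit(p0)) / n) - 1
print(f"required selection coefficient: s = {s:.3f}")
```

A selection coefficient of around 0.03 means carriers left about 3% more surviving offspring per generation than non-carriers, sustained for three millennia – in line with the article’s description of this as the strongest selection inferred for any human locus.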
The team’s theory, tallying with that of the UCL/Bristol team, is that lactase persistence was only crucial under special circumstances such as, for instance, a pandemic or a famine. ‘These were situations during which access to food may have been limited, not least because people were too sick to harvest or collect. Being able to consume milk directly from the cow with which many shared their roof may thus have made a real difference.’
Intolerance versus allergy
The research also raises questions about whether some people who believe they are lactose intolerant might be able to tolerate drinking milk. In adults, one theory why someone might appear to be lactose intolerant is their general diet. If someone deficient in lactase also eats lots of processed foods, this could lead to imbalances in the gut microbiome. This means undigested lactose remains in the colon for longer, where certain bacteria are more likely to break it down into substances that cause stomach cramps, bloating and flatulence. In contrast, transit times through the colon will be much faster if that person eats lots of fibre, allowing the lactose to pass through without causing noticeable problems, despite the lack of lactase. Our ancestors most probably had a diet high in fibre from vegetables and cereals, and this could be why they were able to tolerate milk for such a long period of time, despite not producing lactase.
Lactose intolerance can be confused with cow’s milk allergy, where the body overreacts to the protein casein. A person with lactose intolerance can eat hard cheese and butter without experiencing symptoms, whereas a person with a milk allergy cannot – butter contains hardly any lactose, and hard cheese only a little. A hydrogen breath test is a quick way to distinguish between the two. The patient consumes a standard dose of lactose, waits half an hour, and then their breath is measured. A rise in hydrogen shows that the lactose hasn’t been fully digested and has instead been fermented by gut bacteria.
In 2021, UK researchers flagged concerns that international guidelines on diagnosing cow’s milk allergy may be leading to overdiagnosis, and unintentionally medicalising normal baby symptoms. The study of over 1000 infants found three-quarters had two or more symptoms at some point in their first year of life.4 The researchers say their findings suggest that the majority of symptoms listed in guidelines – vomiting, colic, loose stools, constipation, eczema – are common, normal and not caused by cow’s milk allergy.
‘Our findings come against a background of rising prescription rates for specialist formula for children with cow’s milk allergy, which is completely out of proportion to how common we know the condition is,’ says children’s allergy doctor, Michael Perkin from St George’s, University of London. ‘Incorrectly attributing these symptoms to cow’s milk allergy is not only unhelpful, but it may also cause harm by discouraging breastfeeding.’
1 The Lancet Gastroenterology & Hepatology, 2017, 2, 738.
2 Nature, 2022, 608, 336.
3 Current Biology, 2020; doi: 10.1016/j.cub.2020.08.033
4 Clinical & Experimental Allergy, 2021; doi: 10.1111/cea.14060