Fire: The Spark That Ignited Cognitive Evolution
Before fields and flocks, humans farmed fire.
New archaeological evidence suggests that long before anyone sowed a grain field, our ancestors were already “domesticating” a very different resource: fire.
A new paper in Nature reports the earliest clear evidence that humans could make fire on demand about 400,000 years ago at a site in what is now eastern England. Agriculture means gaining reliable control over plants and animals. Fire-making means gaining reliable control over energy itself.
From borrowed flames to homemade fire
Archaeologists have long known that ancient humans used fire more than a million years ago. Burnt bones and ash layers show up at sites in Africa and Eurasia well before 400,000 years ago. But there’s a crucial difference between:
Using natural fire (for example, from lightning strikes or wildfires, then keeping embers going), and
Being able to make fire wherever and whenever you want.
Most earlier evidence could be explained by people taking advantage of wildfires and tending those flames as long as possible. Proving that someone in the deep past made fire, not just maintained it, is incredibly hard, because natural and human-caused burning can leave similar traces in the soil.
Until now, the earliest strong case for deliberate fire-making came from a site in France dated to about 50,000 years ago, associated with Neanderthals (Jiang et al., 2025).
The new Nature study pulls that date back by roughly 350,000 years.
The Barnham site: a 400,000-year-old campsite in England
The new evidence comes from Barnham, a Middle Pleistocene site in Suffolk, about 145 km northeast of London. The site lies in sediments dating to around 400,000 years ago (Marine Isotope Stage 11), a relatively warm interglacial period when what is now East Anglia was home to early humans — likely early Neanderthals or their close relatives.
At Barnham, researchers found:
A buried land surface with clear signs of heating
Fire-cracked flint handaxes, broken and reddened in ways consistent with intense heat
Soils whose mineral structure shows exposure to very high temperatures (over ~750 °C), revealed by micromorphology and Fourier transform infrared spectroscopy (FTIR)
Chemical traces of burning in the form of polycyclic aromatic hydrocarbons (PAHs) and magnetic changes indicating multiple episodes of fire, not a one-off blaze.
Taken together, these lines of evidence already make a strong case that repeated fires were lit and tended on that surface — more like a campsite hearth than a random bushfire that happened to pass by.
But that only shows controlled fire. The really crucial clue is what lay next to those hearths.
The smoking gun: imported pyrite fire-starters
Among the heated sediments and cracked tools, the team found two small fragments of iron pyrite. That might not sound dramatic, but it’s the game-changer.
Iron pyrite (often called “fool’s gold”) is famous in later prehistory as a fire-striking mineral: if you hit it against the sharp edge of flint, you produce a shower of sparks. Experimental archaeology and wear analyses of later Stone Age “strike-a-light” kits show that flint + pyrite = portable lighter in Stone Age form.
At Barnham:
Pyrite is geologically rare in the immediate area. Geological surveys and regional studies suggest it does not occur naturally right where the camp was.
The fragments have characteristics consistent with being brought in rather than just eroding out of local rock.
They were found right alongside the hearth deposits, not randomly scattered in deeper layers.
The simplest explanation is that these early humans carried pyrite to the site specifically to strike sparks against flint: that is, they had a deliberate fire-making toolkit.
As Nick Ashton of the British Museum, one of the project leaders, put it, this is “the earliest evidence of making fire, not just in Britain or Europe, but anywhere in the world.”
Not a “normal” ancient campfire
You might ask: couldn’t lightning or a natural wildfire have caused the burning, with pyrite arriving by chance?
The authors push back against that with several arguments:
Repeated burning: The chemical signatures (PAHs) and magnetic changes in the soil suggest multiple fire episodes, not one intense wildfire. That fits better with people relighting campfires over time.
Local geology: Pyrite being extremely rare in the local rock makes it unlikely to appear by accident in exactly the same spot as the hearths.
Context with tools: The fire-cracked handaxes and heated sediments form a coherent activity zone: stone tools, hearth, and fire-making mineral together.
Analogy with later strike-a-light kits: In later Neanderthal and early modern human sites, flint and pyrite are found together and interpreted as fire-making sets. Microwear and experiments show they were used exactly this way.
When you stack these lines of evidence, the most parsimonious explanation is that fire was being produced deliberately on site.
Fire-making reshaped human evolution, and this discovery suggests our ancestors were ‘farming’ hundreds of thousands of years before wheat fields existed. Behind the paywall: How controlled fire may have triggered brain growth through the cooking hypothesis, why this fits gene-culture coevolution models, and what it tells us about Neanderthal intelligence.
Fire as the first “farm”
So where does “farming” come into this?
Agriculture, in the narrow sense, begins roughly 10,000–12,000 years ago in the Near East with the domestication of cereals like wheat and barley. But in a broader sense, humans have been manipulating ecological and energetic resources for much longer:
Aboriginal Australians and other Indigenous groups practised “fire farming”, or fire-stick farming, using controlled burns to manage vegetation and favour certain species.
Hunter-gatherers across many environments used fire to clear undergrowth, drive game, and renew plant growth long before formal agriculture.
What the Barnham discovery suggests is that the technological basis for this kind of “fire farming” may go back 400,000 years.
Once you can make sparks whenever you like, you are no longer at the mercy of lightning storms or smouldering logs. You can:
Cook food on demand, not only when nature provides embers
Select where on the landscape you want fire, and how often
Use fire for warmth, protection, tool production (e.g., heat-treating stone), and perhaps even early kinds of landscape management
In that sense, controlling fire is like domesticating a super-tool. It’s not a plant or an animal, but an energy regime, and it predates wheat fields by hundreds of thousands of years.
Evolutionary implications
The study has real implications for how we think about brain and body evolution.
Cooking and the energy budget
The “cooking hypothesis” argues that fire-enabled cooking allowed humans to get more calories and nutrients out of food with less digestive effort, freeing up energy for bigger brains.
If reliable fire-making was already in place 400,000 years ago, it strengthens the idea that:
Regular cooking of roots, tubers, and meat could have become a stable part of the diet well before Homo sapiens appears.
The energy gains from cooked food may have contributed to brain growth in Middle Pleistocene humans and early Neanderthals, not just later modern humans.
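To make the energy-budget logic concrete, here is a toy back-of-envelope calculation, a minimal sketch rather than anything from the Barnham paper. The intake and digestibility figures are illustrative placeholders; the only anchored number is the commonly cited estimate that a modern human brain consumes roughly a fifth of resting metabolism.

```python
# Toy back-of-envelope: how much energy could cooking free up per day?
# All figures below are illustrative assumptions, not data from the Barnham study.

RAW_INTAKE_KCAL = 2000        # assumed gross daily food intake (kcal)
RAW_DIGESTIBILITY = 0.70      # assumed fraction of energy absorbed from raw food
COOKED_DIGESTIBILITY = 0.85   # assumed fraction absorbed once the same food is cooked

RESTING_METABOLISM_KCAL = 1500  # rough resting metabolic rate for an adult (kcal/day)
BRAIN_SHARE = 0.20              # widely cited ~20% of resting metabolism used by the brain

def cooking_surplus(intake=RAW_INTAKE_KCAL,
                    raw=RAW_DIGESTIBILITY,
                    cooked=COOKED_DIGESTIBILITY):
    """Extra kcal/day available if the same food is cooked rather than eaten raw."""
    return intake * (cooked - raw)

surplus = cooking_surplus()
brain_cost = RESTING_METABOLISM_KCAL * BRAIN_SHARE

print(f"Daily surplus from cooking: ~{surplus:.0f} kcal")
print(f"Approximate daily cost of a modern human brain: ~{brain_cost:.0f} kcal")
print(f"Surplus covers ~{surplus / brain_cost:.0%} of that brain budget")
```

With these particular placeholder numbers, the daily surplus happens to be on the same order as the brain's entire running cost, which is exactly the kind of arithmetic the cooking hypothesis leans on.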
This would also fit neatly into a broader pattern of gene–culture coevolution, where a cultural invention reshapes diet or social life and then feeds back into biological evolution. Once fire-making becomes reliable, it changes what people eat, how often they cook, and perhaps even how they structure their days. In turn, those shifts can create new selection pressures on digestion, metabolism and brain function.
The Barnham paper itself underlines how cognitively demanding this behaviour already was. The authors argue that the evidence points to complex behaviour: understanding the special properties of pyrite, choosing appropriate tinder for successful ignition, and curating pyrite as part of a fire-making kit. They set this alongside other late Middle Pleistocene signs of sophistication, such as systematic bone- and wood-working and the manufacture of adhesives for hafting. In other words, by this point we are not just looking at opportunistic scavenging of wild fire, but at a technological package that had to be learned, transmitted and improved over generations: literally a spark of fire and, quite possibly, the spark that triggered a new phase of cognitive evolution.
Within a gene–culture coevolution framework, it is natural to imagine that this process may have started with a rare biological outlier. One possibility is a single mutation that slightly boosted planning or technical reasoning in an individual who figured out how to strike pyrite against flint and manage tinder. Another is that someone happened to be born with a particularly high polygenic score for intelligence, giving them the cognitive edge needed to invent and refine fire-making. Once such an innovation appears, it changes the social and ecological game: families and groups that master the technique gain a huge nutritional and survival advantage, which then intensifies selection for the very cognitive traits that made the innovation possible in the first place.
We already know of clear examples of this feedback loop:
Pastoralism → lactase persistence: the cultural adoption of dairying created an environment where individuals who could digest lactose into adulthood had a nutritional advantage, leading to the spread of lactase-persistence alleles.
Agriculture → cognitive evolution: the rise of complex, settled societies with division of labour, record-keeping and long-term planning likely favoured traits linked to learning and abstract reasoning, with genetic evidence that some alleles associated with cognition have shifted in frequency over the Holocene.
If the Barnham interpretation is right, controlled fire and cooking may be an even earlier case of this same mechanism: a technological breakthrough that reorganises daily life and diet, and over many generations helps sculpt the human brain and body.
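To give a feel for how fast such a feedback loop can run, here is a minimal simulation of a beneficial allele spreading under constant selection, in the spirit of the lactase-persistence example above. The selection coefficient, dominance and starting frequency are arbitrary illustrative choices, not estimates for any real gene.

```python
# Minimal sketch: deterministic spread of a beneficial allele under constant selection.
# Parameters are illustrative placeholders, not estimates for lactase persistence
# or any cognition-related variant.

def next_frequency(p, s=0.05, h=0.5):
    """One generation of selection at a single diploid locus.

    p : current frequency of the favoured allele A
    s : selection coefficient (AA fitness = 1 + s)
    h : dominance coefficient (Aa fitness = 1 + h*s)
    """
    q = 1.0 - p
    w_AA, w_Aa, w_aa = 1.0 + s, 1.0 + h * s, 1.0
    w_bar = p * p * w_AA + 2 * p * q * w_Aa + q * q * w_aa
    return (p * p * w_AA + p * q * w_Aa) / w_bar

p = 0.001            # start as a rare variant (one innovation, or one lucky genotype)
generations = 0
while p < 0.5:       # run until the favoured allele is the majority variant
    p = next_frequency(p)
    generations += 1

print(f"Reached 50% frequency after ~{generations} generations")
print(f"At ~25 years per generation, that is roughly {generations * 25:,} years")
```

With these placeholder values, the allele goes from rare to common in a few hundred generations, a few thousand years: fast on evolutionary timescales, which is why a cultural innovation that changes who thrives can leave a genetic footprint.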
In that framework, a clear prediction follows, at least in principle: once we have enough ancient genomes spanning this period, polygenic scores for cognition should show an uptick or acceleration starting around the time reliable fire use becomes widespread. We have already seen marked acceleration in cognition-related polygenic scores after the adoption of agriculture about 12,000 years ago.
However, this prediction is almost certainly untestable with current methods. Recovering enough hominin DNA at ~400,000 years ago, with sufficient quality and sample size to build polygenic scores and track them over time, is beyond what ancient DNA technology can realistically deliver, especially outside permafrost. So this remains a theoretical implication of the gene–culture coevolution framework, not an empirical claim. It tells us what should have happened if the model is right, even if the signal is forever out of reach of our sequencing machines.
The only partial workaround is to fall back on the good old cranial-capacity measurements from fossil skulls as a very rough proxy: if fire and cooking really triggered an energy windfall for the brain, we might expect to see a step up or acceleration in average brain size in fossils dating to around this period, even if the underlying polygenic signal itself remains invisible.
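To show what such a test might look like in practice, here is a minimal sketch that fits a single step-change to a series of average cranial capacities. The numbers are synthetic placeholders invented purely for illustration, standing in for real, dated fossil measurements.

```python
# Minimal sketch: looking for a step change in average cranial capacity over time.
# The "data" below are invented placeholder values (in cc, ordered from older to
# younger time bins), NOT real fossil measurements.

capacities = [900, 920, 910, 930, 1100, 1150, 1180, 1200, 1250, 1300]

def sse(values):
    """Sum of squared deviations from the mean of the segment."""
    if not values:
        return 0.0
    mean = sum(values) / len(values)
    return sum((v - mean) ** 2 for v in values)

def best_breakpoint(series):
    """Find the single split that best explains the series as two flat segments."""
    best_i, best_error = 1, float("inf")
    for i in range(1, len(series)):
        error = sse(series[:i]) + sse(series[i:])
        if error < best_error:
            best_i, best_error = i, error
    return best_i, best_error

split, error = best_breakpoint(capacities)
print(f"No-break error: {sse(capacities):.0f}; best one-break error: {error:.0f}")
print(f"Best single break before index {split}: "
      f"earlier mean {sum(capacities[:split])/split:.0f} cc, "
      f"later mean {sum(capacities[split:])/(len(capacities)-split):.0f} cc")
```

A real analysis would need securely dated specimens, measurement error and a proper model comparison, but the logic is the same: does allowing a step around a candidate date explain the fossil record much better than a gradual trend?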
Neanderthals, again, look smarter
The paper is cautious about exactly which hominin species made the fires at Barnham, but the regional context points toward early Neanderthals or their close kin.
Over the last decade, the stereotype of Neanderthals as clumsy, dim-witted cavemen has been eroding. We now know they:
Used sophisticated stone tool technologies
Likely made glues and tars for hafting tools
May have produced symbolic objects and had complex hunting strategies
Adding fire-making 400,000 years ago to this list pushes their technological sophistication even further back in time.
Implications beyond archaeology
Seen in that perspective, wheat fields are a late chapter. The Barnham evidence suggests that by 400,000 years ago, humans (or very close relatives) were already doing something conceptually similar to farming. Not with crops, but with controlled combustion.
They weren’t “pure” hunter-gatherers.
They were, in a very real sense, fire farmers.
References
Jiang, S., et al. (2025). Onset of extensive human fire use 50,000 y ago. Proc. Natl. Acad. Sci. U.S.A. 122(27), e2500042122. https://doi.org/10.1073/pnas.2500042122
Davis, R., Hatch, M., Hoare, S., et al. (2025). Earliest evidence of making fire. Nature. https://doi.org/10.1038/s41586-025-09855-6


