Today we have a good understanding of nutritional needs, with things like vitamin-D-fortified milk, or the earlier knowledge that you needed to take citrus fruit on sea voyages. But in the early days, when humans were a fairly new species, people just ate whatever they could. When our nutritional needs are so complex, how could people get enough of everything to be healthy and survive?
Further, if many of these nutrients are not readily and naturally available, why did our bodies evolve to have such needs?
“Eating whatever you could” gives a person an extremely varied diet over the course of a year. Nutritional issues only began to develop when we started just eating what we grew in the back yard (or what was stable enough to be stored for months).
Early people were outside in the sun a lot, which would take care of vitamin D. And “gathering” was undoubtedly a huge part of “hunting and gathering” - this includes edible plant matter, but also likely insects and other small, easily obtained animals (like shellfish for coastal gatherers). With a varied diet of plant and small animal matter, supplemented by occasional large kills, I think it would be quite easy to obtain all the necessary nutrients.
It seems pretty clear to me that if such nutrients weren’t readily available, we wouldn’t now be in a position to need them. Part of the problem came when people moved out of the regions where they were available.
The issue with vitamin C is particularly interesting. Our bodies used to make it. Some time around 60 million years ago a genetic mutation occurred which stopped our ancestors (simians by then) from producing it internally. I suspect that, since we lived in an environment with plenty of vitamin C sources, this wasn’t a fatal mutation, and it continued to breed true, since it wasn’t selected against. It was only when people started moving outside of that area that it became an issue.
But not immediately – even Inuit people have sources of vitamin C (or else they wouldn’t survive in the Arctic). But most sources have to be fresh, so when people started taking long sea voyages, scurvy became a problem. (Since vitamin C deteriorates rapidly, even bringing along fruit juice was not, at first, the cure people had hoped for.)
Ironically, since they retained the ability to make vitamin C internally, the rats on those ships survived just fine, while the sailors succumbed to tissue disorders.
There are those who think our lack of the vitamin C gene may actually have contributed to the development of humans. See the Wikipedia pages, and others.
But the short answer is that we don’t develop the need for hard-to-find nutrients. We develop where such things are easily available, then learn or develop coping mechanisms when they become hard to find. Or else we die.
DrFidelius has it correct, but maybe needs just a bit of amplification.
The diet that our ancestors ate has been revived as a food plan called the Paleo Diet.
The Paleo Diet is still controversial and the amount of genetic adaptation to changing diets is disputed by many. Putting that aside for the moment, here’s the sequence that is usually given.
Humans evolved in fairly lush surroundings in Africa. Estimates are that they ate 200 different species of plants. They supplemented this with meat first by scavenging then hunting. The tribes living by the sea ate fish and shellfish galore. That’s easily sufficient to get all needed nutrients. However, it assumes a nomadic lifestyle so that the tribe could always be near the best sources of nutrients.
Climatic change seems to have driven humans out of Africa to populate the rest of the world. They adapted to local conditions, even when it meant as radical a reversal of diet as the Inuit. They still hunt over long distances, though.
The modern human diet appeared in the Neolithic. The very short version is that humans learned how to grow crops from seed and domesticate animals. Doing that eliminated the need to roam over long distances - good - but also tied them to one locality - bad. So the first cities appeared as populations built up. Hierarchies accompanied cities, with the result that large numbers of peasants grew a limited variety of crops and only those on top had varied diets. For centuries, peasants around the world ate mostly one staple crop, supplemented with a few vegetables and occasional meat.
Our bodies need a huge variety of nutrients because we’re animals, and our animal ancestors needed the same huge variety of nutrients. They got it by eating everything they could get their hands or paws on. We’re the ones who totally artificially cut down that varied abundance. To get vitamin D, all you have to do is spend time in the sun. It wasn’t needed in milk; we changed that. To get vitamin C, all we needed to do was eat any of a number of fruits or vegetables. A year-long sea voyage cut sailors off from those. Neither of those things, nor all the other problems of modern life, was ever going to happen to animals in the wild.
How much exactly our bodies have evolved and adapted in the relatively short span of 15,000 years is controversial. Lactose intolerance is something new, even though it’s from a simple mutation. But until milkable animals were domesticated and dairy products served to adults, the mutated gene was irrelevant. We keep doing things nature never anticipated. That’s the only reason we have to scramble to get all our nutrients.
They evolved to have these needs because it wasn’t a problem at that point. Vitamin C is a great example. Most mammals can synthesize vitamin C internally, and in fact the ancestors of all mammals could. But at some point, a primate ancestor had a mutation that damaged the enzyme, and it could no longer synthesize vitamin C. Now, for many animals, that would be fatal, but the proto-primate species had a diet that included lots of fruit, so they got enough vitamin C from their food and didn’t need to synthesize any. So that particular mutated individual survived, and, either just by random chance or because there was some advantage to it that we don’t know about, the mutation spread and ended up getting passed down to modern primates (including humans). Since early humans still got enough C in their diet, it wasn’t a problem for them.
Still generally isn’t a problem, except in rare cases where humans are eating nearly exclusively preserved food.
And, I don’t know much about the particular biochemistry of vitamin C, but it’s not completely far-fetched to think that there might be some advantage to getting rid of the gene to synthesize vitamin C. For instance, the enzyme that creates vitamin C might interfere chemically with some other important reactions in the cell, and dropping the vitamin C enzyme might allow the other reactions to be much more efficient in some way.
Especially if that includes edible items that we (that is, at least in Northern Europe and the USA) are a bit squeamish about - e.g. insects - they’re a great source of nutrients.
Nitpick: there were no “simians” 60M years ago (the so-called higher primates). They would have been prosimians at that time.
The other thing to keep in mind is that you get a lot more nutrients by eating more of the plant or animal. We modern types tend to process the food or eat only select parts, leaving out much of the nutritional elements.
In some cases, there are mutations that are pretty common when there are (I might be mixing my terminologies here) endemic deficiencies of certain nutrients.
For example, hemochromatosis is a hereditary condition where people’s bodies basically hoard dietary iron. It’s typically seen in people of N. European descent. There is some thought that this was an evolutionary advantage in some areas where dietary iron levels are very low- people weren’t anemic if they had this particular mutation.
I think it’s also worth mentioning that modern nutrition standards (such as RDA values) are different from what even early ancestors probably got. You don’t need 100% of the RDA every single day to be “healthy enough” - in many cases, a dose far below the RDA is enough to avoid the worst nutritional deficiencies. Looking at vitamin C specifically, you can go months with no vitamin C before suffering scurvy, and a mere 10 mg a day is enough to avoid it indefinitely. So we’ve just raised the bar for what we consider “healthy” amounts.
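If you want to put rough numbers on the “months with no vitamin C” claim, here’s a quick back-of-the-envelope sketch in Python. The figures are only ballpark assumptions for illustration (a saturated body pool of roughly 1,500 mg, scurvy symptoms somewhere below about 300 mg, and around 3% of the remaining pool lost per day), not clinical values:

```python
import math

# Ballpark assumptions, for illustration only (not clinical values):
POOL_MG = 1500.0    # roughly a saturated adult body pool of vitamin C
SCURVY_MG = 300.0   # rough threshold below which scurvy symptoms start
DAILY_LOSS = 0.03   # assume ~3% of the remaining pool is lost each day

# With zero intake the pool decays roughly exponentially:
#   pool(day) = POOL_MG * (1 - DAILY_LOSS) ** day
# Solve for the day the pool crosses the scurvy threshold:
days = math.log(SCURVY_MG / POOL_MG) / math.log(1 - DAILY_LOSS)
print(f"~{days:.0f} days of zero intake before the pool drops below "
      f"{SCURVY_MG:.0f} mg")  # ~53 days, i.e. a month or two
```

Under those assumptions you get a month or two of buffer, which squares with sailors historically developing scurvy a month or three into a voyage; even a trickle of intake stretches the timeline much further.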
Nitpick: you mean lactase persistence, not lactose intolerance, which isn’t really a condition at all, since it is the normal state for all adult mammals.
Also, while the paleo diet eschews dairy as if it is something really unhealthy, some studies have found that people with lactase persistence have better health than their counterparts (see link above under evolutionary advantages); this paper (results taken from multiple studies) says that people with higher dairy intake have a lower risk of cardiovascular disease, diabetes, and cancer (and, oddly enough, this article says that people who ate high-fat dairy saw weight loss, while weight gain occurred with low-fat dairy, presumably because the high-fat dairy was more satiating, since no effects on metabolism were found, or perhaps because of specific fatty acids).
Personally, I think there is nothing wrong with most of the foods eaten today as long as they are eaten in a relatively balanced manner; of course, avoid any food that you are intolerant of or allergic to (or foods with lots of added fats and sugars; note just how many of our daily calories come from these, and that this is where caloric intake has increased in recent decades, leading to the current epidemic of metabolic diseases).
As for vitamins like vitamin D, various sites claim that up to 75% of the world population is deficient, even those who get plenty of sunlight, which suggests that the real problem is setting the optimum blood levels too high (Qadgop the Mercotan recently debunked the claims about vitamin D in this post).
You’re completely accurate in what you say. I shouldn’t have said that it’s from a mutation. However, the condition that is popularly known as lactose intolerance - defined as the presence of symptoms after eating lactose - didn’t exist as a condition until high-lactose dairy products became a major component of diets. And lactase persistence is not a condition at all.
It’s similar to gluten intolerance, the popular term for celiac disease and similar ailments, which wasn’t an issue until wheat and other gluten-containing grains became staple crops.
That’s exactly why the Paleo Diet eliminates these types of newly created foods. Putting them into our diets creates problems for some because of the time needed to evolve to adapt to them.
You seem to be putting the cart before the horse here. Lactose intolerance is the ancestral condition. It’s the “normal,” if you want to use that loaded word. So to say that lactose intolerance is a result of having dairy products around is wrong. Lactase persistence is the result of a mutation which arose some time back and allowed (some of) us to keep making lactase and digesting milk into adulthood.
If I’m reading right, you’re suggesting that lactose intolerance arose as a consequence of too much dairy in everyone’s diet. That’s simply untrue. People are generally lactose intolerant because they lack the mutation that makes them lactose tolerant. In reality, of course, it’s a bit more complicated than that, but not a whole lot so.
And most of those plants your mom told you not to touch or eat when you were playing outside because “they’re probably poisonous”…probably weren’t. Many of them are excellent sources of nutrients that used to be a large part of the diet and today aren’t - nettles, lamb’s quarters, sorrel, chickweed, dandelion…there’s a veritable buffet of free vitamin C every spring in our yards. We call them weeds and spray them with poison or dig them out and throw them away, congratulate ourselves on a productive day of “gardening” and then go inside and take our daily multivitamin pill. :smack:
The medical literature makes a distinction. Humans - except for a vanishingly small handful - are born with the ability to manufacture lactase to digest the lactose in their mother’s milk. The lactase-manufacturing ability vanishes at about the time of weaning in most mammals, and presumably did so in most human societies until recently. Oddly, there is no one recognized name for this in the medical literature. Lactase nonpersistence is probably the usual term, but I’ve also seen lactose maldigestion, lactose malabsorption, lactase deficiency, hypolactasia, and low lactose digestion capacity.
Lactose intolerance is not used for the underlying genetic or physiological problem. It is a clinical term defined, as I said above, as the presence of symptoms after a lactose load. The presence of symptoms is highly correlated with an underlying lack of lactase but is not identical to it. Many studies have found that some people who produce lactase get symptoms, and many who lack lactase don’t, even in a large lactose challenge.
You can’t have a clinical finding of lactose intolerance in a world without dairy. You can have lactase nonpersistence, but that’s medically a different thing.
Notice how many terms are used in the abstract: Persistence of intestinal lactase; lactose maldigester; lactose digester; lactose maldigestion; lactase deficiency; lactose intolerance. They use lactose intolerance correctly, referring only to symptoms, but an outsider could be excused for not getting the subtle distinctions they’re making.
Yes, I made a dumb mistake. I really know better. The regulatory gene on chromosome 2 that switches off lactase production sometime after weaning is the ancestral condition we inherited from our ancestors. That allele is still in the majority in the world population. The mutation that keeps the gene from sending the stop-lactase signal, what we call lactase persistence, is dominant and spread quickly after the rise of dairying made the ability to digest milk products as an adult valuable.
That’s genetics. Lactose intolerance is cultural. It only shows in populations where adult consumption of dairy is the norm. Individuals can show symptoms to dairy in any population, of course, but the condition of lactose intolerance is something new. In fact, the medical community didn’t realize it existed in adults until the 1960s. The term is that recent.
Sorry for the confusion. The popular use of lactose intolerance is confusing, though.
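To put a rough number on “spread quickly” above, here’s a toy one-locus selection model in Python. The starting frequency and the 5% fitness advantage are made-up illustrative values, not real estimates for the lactase-persistence allele; the point is just that even a modest advantage carries a dominant allele from rare to common within a few hundred generations:

```python
def next_freq(p: float, s: float) -> float:
    """One generation of selection favoring a dominant allele.

    p: frequency of the dominant (persistence) allele
    s: fitness advantage of carriers (AA and Aa) over non-carriers (aa)
    """
    q = 1.0 - p
    # Hardy-Weinberg genotype frequencies, weighted by relative fitness:
    mean_fitness = (p * p + 2 * p * q) * (1 + s) + q * q * 1.0
    # Allele-frequency update: A alleles come from AA and half of Aa.
    return (p * p + p * q) * (1 + s) / mean_fitness

# Made-up numbers for illustration: the allele starts rare, and anyone
# carrying at least one copy (so able to digest milk as an adult) gets
# a 5% fitness advantage.
p, s = 0.01, 0.05
for gen in range(0, 301, 50):
    print(f"generation {gen:3d}: persistence allele at {p:6.2%}")
    for _ in range(50):
        p = next_freq(p, s)
```

At roughly 25 years per generation, 300 generations is on the order of 7,500 years, about the span since dairying took hold, so the timescales at least hang together.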
I eat a nearly-all-raw version of the paleo diet, mostly fish (wild-caught ocean fish and shellfish) and fruit (high-brix).
My health is in MUCH better shape than before. I’ve been doing this for about 7 years now. If I start deviating from the diet too much, I start having health problems.