Getting pregnant is not as easy as it might sometimes seem. Natural selection has played an important role in mammalian evolution by ensuring that conceptions that are not likely to survive pregnancy and the early months of life are eliminated before the mother has invested much time and energy. Today in health-rich nations many conceptions that would have been lost early in pregnancy or in the first few days of life are ‘‘rescued’’ by technological interventions that are usually welcomed by women and men who desire children, no matter what their reproductive or evolutionary ‘‘value’’ might be. In fact, many readers of this book might not be here today if they had been conceived, gestated, or born under conditions that existed a mere 100 years ago, much less earlier in human evolution. Although getting pregnant and staying pregnant are very different challenges, thankfully, once the first two to three months of pregnancy have passed, most pregnancies proceed with little stress. Before moving on to the last two trimesters, a few concerns with regard to the first trimester should be mentioned, including morning sickness.
More on the First Trimester
During the first trimester, the developing organism is referred to as an embryo. This is the period of differentiation during which most of the organ systems are being developed. It is also a period of great vulnerability because many factors can interfere with the developing embryo to render it nonviable or to affect development in a way that will have lifelong effects if the embryo survives. Compounding this vulnerability, the first trimester is often a time when a woman does not even realize she is pregnant. This is particularly problematic when her normal diet is not sufficient or well balanced, a common problem with teen pregnancies in the United States. Nutritional deficiencies at this time can result in numerous developmental failures in the embryo: insufficient riboflavin can cause problems with skeletal development; low B-6 (pyridoxine) can result in neuromotor problems; low B-12 can result in hydrocephaly; low niacin can result in cleft palate; low folic acid can cause neural tube defects; low vitamin A can cause vision problems; and low iodine can result in neurological problems and cretinism.
Iodine Deficiency, Cretinism, and the Evolution of PTC-Tasting
Cretinism, a condition in which individuals show marked physical and mental abnormalities, results from an underactive maternal thyroid during pregnancy. This can be due to a clinical condition in the fetus or the mother, or to insufficient intake of dietary iodine in pregnancy. In both cases, if the infant is treated soon after birth, the condition does not appear to affect later physical development, but if it is not treated, the child will be stunted and, perhaps, mentally retarded. Iodine deficiency during critical periods of brain development may not be reversible. Dietary iodine is used in the formation of thyroid hormone. Cretinism caused by insufficient iodine in the diet is a worldwide problem and is most common in parts of the world where crops are grown in iodine-poor soils. The World Health Organization estimates that approximately 35% of the world population is at risk for iodine deficiency, ranging from a low of about 10% in the Americas to a high of about 57% in Europe. Iodine-deficient soils are most commonly found in areas that were previously under continental glaciers, which explains the high frequency of low iodine intake in Europe before fortification of salt and other foods; this region is often referred to as the ‘‘goitre belt’’ because of the high incidence of the disorder. A deficiency of this mineral also causes goitre in adults. Goitres are signs of a thyroid gland working overtime to capture scarce iodine; the gland expands to form a mass in the throat area that is very visible in extreme cases. In most cases the goitre does not have a major impact on the health of the mother, but iodine deficiency has an obvious and dangerous effect on her developing fetus and, thus, her reproductive success. Dietary iodine deficiency is the primary reason for the iodization of salt and other foods, so there is a technological fix, but there is speculation that there may also be a genetic ‘‘fix’’ that results from natural selection.
An interesting, although still speculative, proposal is that the high frequency of a common gene that affects whether a person can taste certain bitter substances may be due to its role in the prevention of goitre and cretinism. This gene is commonly called the ‘‘taster/nontaster’’ gene because it determines whether a person can taste the chemical known as PTC (phenylthiocarbamide). There is a great deal of variation in the frequencies of the alleles, indicating that natural selection acted on the gene in the past and may be acting on it today. On the surface, it is hard to imagine how a person who can taste the substance would be advantaged over the nontaster, or vice versa.
The chemical is closely related to those that cause bitter tastes in members of the cabbage family such as broccoli and Brussels sprouts. Tasters (those with the homozygous dominant or heterozygous combination of alleles) detect the bitter substance and tend to avoid eating such foods or eat them in small quantities. Nontasters (those with two recessive alleles) are less aware of the bitter taste and tend to eat the foods in larger quantities. Interestingly, these foods are known as ‘‘goitrogens’’ because they inhibit the uptake of iodine, which is a problem in populations that consume foods grown on iodine-deficient soils. Before the iodization of salt, women who avoided eating these bitter foods (because they were tasters) would have been better able to utilize the iodine available and less likely to have children with cretinism. Nontasters may have been more likely to eat the foods, and thus their iodine uptake may have been compromised, resulting perhaps in major health problems in their children. So if Brussels sprouts are not among your favourite foods, you may be able to thank your aversion to their bitter taste for keeping your ancestors healthy. Furthermore, if cabbage-family vegetables were particularly likely to cause morning sickness in your mother (as will be discussed next), you may have been protected from developing thyroid problems or even cretinism. Cabbage-family vegetables are among those most frequently cited as causing morning sickness.
Medical Concerns (or Not) of Early Pregnancy: Morning Sickness
Two common medical concerns of pregnancy have been viewed by evolutionary medicine as nonpathological in some circumstances: early fetal loss, and eclampsia and preeclampsia. A third example is nausea of pregnancy. Although morning sickness (also called ‘‘nausea and vomiting of pregnancy,’’ or NVP, and early pregnancy sickness) is so common (its incidence in the United States may be as high as 90% of pregnancies) that it is a normal and expected part of pregnancy, there are still efforts to ‘‘treat’’ it by developing drugs or other interventions that can prevent its occurrence. A particularly tragic effort to ‘‘solve’’ the problem of morning sickness in the 1950s with the drug thalidomide resulted in thousands of children being born with severe abnormalities. Morning sickness is an example of a health problem that benefits from questioning whether it is a defence or a defect. The direct or proximate cause of nausea and food aversions is likely hormonal (perhaps HCG, progesterone, and estradiol, which rise early in pregnancy, and changes in gut hormones that delay gastric emptying), but an evolutionary or ultimate explanation is that pregnant women who found potentially harmful food components to be aversive may have protected their foetuses from developmental damage, especially during the first trimester. If this is true, one could speculate that ancestral women who had morning sickness had more healthy offspring and greater reproductive success. There is extensive evidence that pregnancies that do not include morning sickness may be at risk. Miscarriage is far less frequent in women who report feeling nauseous early in pregnancy than in those who do not. Midwives with whom I have worked are concerned when they have a client who reports not having had morning sickness.
Foods to which women report aversions include pungent meats, bitter vegetables (such as the Brussels sprouts discussed earlier), overripe foods, spicy foods, and smoked foods, all of which may have compounds that interfere with fetal development. It is important to note that in most cases these food aversions developed during pregnancy, and most women report that they previously liked the foods. In an extensive survey of the literature, the category ‘‘meat, fish, poultry, and eggs’’ was the most frequently listed food aversion, which had not been expected based on previous research and the preconceived notion that all of these foods would be ‘‘good for’’ pregnant women to eat. If we look at the conditions under which meats and poultry would have been stored in the past, and in many areas of the world today, it becomes clear that food-borne illnesses and food poisoning are much more common for this category of foods than for fruits, vegetables, and beverages. Exposure to these kinds of contaminants is especially problematic for women during the early stages of pregnancy when their immune systems are suppressed. Later, when the fetus is not so vulnerable, a mother’s aversions to meats, eggs, and poultry decrease, and the benefits of eating these protein-rich foods outweigh the risks to the developing baby.
The weeks when morning sickness is most pronounced coincide with the time when the embryo is most vulnerable. Although there is concern that chronic nausea can lead to mal- and undernutrition, most problems with morning sickness occur during early pregnancy, when lowered food intake by the mother is not as problematic as it would be later in pregnancy when rapid fetal growth is occurring. In most cases, nausea disappears after the first trimester. The trade-off here is that mothers may be miserable for a period during pregnancy, but the fetus, who is the cause of the misery, is protecting itself. Cultural taboos are common for women during pregnancy and include prohibitions against certain behaviours and consumption of certain foods. Many anthropologists have interpreted the taboos as cultural mechanisms for protecting pregnant women at a vulnerable time. Anthropologist Dan Fessler, in an extensive review of the cross-cultural literature, found 73 societies that had specific food taboos for pregnant women. Meat was by far the most common category of forbidden food. Even handling of meat is often prohibited for pregnant women and, as we know, handling can expose a person to dangerous contaminants. Given the importance of meat consumption in human evolution, Fessler concludes that ‘‘meat-borne diseases have constituted a significant source of selective pressure on pregnant women for much of human history.’’ One problem that pregnant women may encounter when meat is not a prominent part of their diet is iron depletion. Iron needs increase in pregnancy, and iron deficiency is the most common nutritional deficiency in the world today. Insufficient iron is implicated in preterm birth, low birth weight, infant and maternal mortality, and delayed cognitive development of the child.
Fessler argues that iron depletion early in pregnancy could be adaptive in that low iron levels inhibit pathogen proliferation, which may compensate for the mother’s suppressed immune system at this time. One function of geophagy (see p. 80) may be that it serves to inhibit the absorption of iron. As pregnancy proceeds and the iron needs of the fetus increase, the mother’s physiology changes so that she can better absorb what iron is available. Trade-offs would have to be examined—are babies healthier if their mothers risked low iron to suppress infection early in pregnancy or are the dangers of low iron greater than the dangers of infection? As far as we know, morning sickness does not occur in other animal species, although decreased appetite early in pregnancy has been reported for domestic dogs, captive rhesus macaques, and captive chimpanzees. Cross-cultural studies of morning sickness confirm that it is not restricted to women in health-rich countries, although it seems to be somewhat more rare where traditional foods are bland (such as regions where corn or rice is the primary staple). Morning sickness was probably more important in early human evolution when foods were less predictable and more variable; this degree of variability has come back into human diets only with mass marketing of foods.
We can argue that morning sickness, up to a point, can be a good thing, but there is a point when the defence becomes a defect. There are limits to the adaptive value of nausea and vomiting early in pregnancy: when nausea results in dehydration and extreme weight loss, the health of both mother and fetus is compromised and one or both could die. At this point, the nausea moves into the category of a ‘‘defect,’’ which is estimated to occur in 1 in 5,000 pregnancies. Ewald proposes that the extreme forms of pregnancy sickness may result from a naturally selected defence being ‘‘hijacked’’ by an infectious agent. Anthropologist Ivy Pike argues that the ‘‘embryo protection hypothesis’’ for morning sickness has been largely based on observations of well-nourished women and contends that for women who were inadequately nourished before pregnancy, there are nutritional consequences. She presents data for African Turkana women showing that the risk for fetal, perinatal, or neonatal mortality is more than twice as high if the woman reported morning sickness. Another hypothesis for nausea of pregnancy is that it is a result of maternal-fetal conflict early in pregnancy, particularly in cases where the resulting nausea is so severe that maternal health is compromised or under conditions of chronic nutritional stress. In this view, morning sickness is a by-product of maternal-fetal competition for nutrients and is not itself a product of selection. For the mother, the worst thing that she can do (from a fitness standpoint) is continue a pregnancy that will result in a nonviable or otherwise compromised infant. Thus, it may be important for her embryo to signal that it is healthy and likely to develop optimally. Given that lack of morning sickness is associated with miscarriages, it may be that nausea in early pregnancy is a signal of embryo viability and thus has selective value in itself.
A phenomenon somewhat related to food aversion is the craving for clays (geophagy) reported by pregnant women in a number of societies. In medical literature, this craving is often reported as pathological, but its existence is so widespread that scholars of evolutionary medicine search for adaptive explanations. Worldwide, clays are used to relieve diarrhoea (the original Kaopectate, after all, was mostly kaolin, a clay), detoxify compounds, and provide minerals that are insufficient in the diet. In Africa, this practice is also employed by women seeking to relieve nausea of pregnancy, and it can serve to bind toxins that would harm the fetus at this stage. When geophagy continues beyond the early stages of pregnancy, it probably adds important nutrients to the diet, especially calcium, essential for fetal skeletal development and maintenance of blood pressure in pregnancy. Anthropologists Andrea Wiley and Sol Katz propose that clay as a source of calcium helps to explain the distribution of geophagy in African populations. Their survey of 60 societies confirms that where dairying is practiced and calcium is available in the diets of pregnant women, geophagy is less common than where dairy foods are not available. This work confirms that what may be seen as pathological or abnormal by clinicians may make sense from an evolutionary perspective because in many cases geophagy serves to reduce the negative aspects of morning sickness, detoxify agents early in pregnancy, and provide calcium and other minerals. Given evidence that clay consumption occurs in chimpanzees and may have been practiced by early hominins, it seems to have a long evolutionary history. Certainly most of the sources of calcium sought by pregnant women today (dairy products) were not available to our ancestors, and clay consumption may have made the difference between a healthy and unhealthy pregnancy.
As with all practices, however, geophagy may have negative consequences, including exposure to pathogens in soil, iron deficiency anaemia, and lead poisoning.
Second and Third Trimesters
Development of the respiratory and neurological systems continues in the second trimester, toward the end of which the fetus is said to be viable, at least with modern technologies. Pregnant women will notice a lot of fetal movement early in the second trimester, beginning with somewhat frequent position changes (an average of 10 per hour) and then smoothing out to resemble what has been called ‘‘a young female Balinese temple dancer.’’ Other movements reported include sucking and swallowing, head movements, and hand-to-face contacts. The fetus even does somersaults and loop-de-loops during the second trimester, some of which account for the umbilical cord being wrapped around the neck at birth, as I will discuss later. The movements and their duration increase so that some women report great difficulty sleeping. Then the movements begin to decrease as the fetus becomes larger and fills the uterus so that movement is impeded. Many of the movements exhibited by the fetus serve to prepare it for actions and reflexes that it will use after birth.
Growth in the Third Trimester
Further development of the respiratory and neurological systems continues in the third trimester, and the circulatory and respiratory systems prepare for the changes that will take place at birth. The most important development happening at this time, however, is the increase in body weight and fat deposition. Fat is particularly important for fetal development, and human babies are notably fatter than the infants of other mammals (averaging about 16% body fat), even under conditions of nutritional stress or intrauterine growth retardation (IUGR). The primary selective value of fat accumulation in the last weeks of pregnancy is its contribution to early postnatal survival, especially in helping to maintain that expensive, rapidly growing brain. Fat babies may also serve as a signal to a mother that the baby is healthy and ‘‘worth saving’’ under conditions where infanticide or neglect are options. The potential for maternal-fetal conflict continues in the last trimester, especially in competition for nutrients. It would seem that severe under- or malnutrition during pregnancy would compromise the uterine environment for the fetus so that it miscarries. But, surprisingly, the prediction that near-starvation in the mother would terminate a pregnancy is not often borne out: women continue to give birth even under severe food restrictions such as occur with famine and war. This suggests that the fetus has ways of prolonging the pregnancy, even if there are negative consequences for maternal health. But just because a fetus makes it through pregnancy to be born does not mean that its health has not been compromised by in-utero nutritional stress. Such babies are often born small for their gestational age due to intrauterine growth retardation, and they are at much higher risk for health challenges and even death in the first few months after birth.
Some instances of IUGR result from preeclampsia in the mother, genetic inabilities to metabolize nutrients, or placental problems leading to inefficient delivery of nutrients to the fetus, even when the mother is well nourished. These causes fall in the realm of clinical concern and do not necessarily benefit from an evolutionary consideration. But IUGR caused by socioeconomically induced under- and malnutrition can benefit from analysis by evolutionary medicine.
Many of the later-life health consequences of IUGR can be traced to what a fetus experiences while gestating in a nonoptimal environment. If food is restricted, for example, the hungry developing brain is served first and other developing organs get what is left over, which may not be sufficient for normal growth. Thus, a fetus that was malnourished in pregnancy will have a smaller-than-normal liver, pancreas, and gut. Compromise in the functioning of these crucial organs further compromises cholesterol and glucose metabolism, thus explaining the link between IUGR and several health challenges in adulthood. Intrauterine growth retardation that leads to impaired thymus growth means compromised immunocompetence (McDade et al., 2001), so that even at younger ages a person has limited ability to fight off infections. It is also important to remember that early postnatal environments have an impact on later-life health, so we cannot focus on pregnancy alone. Some scholars working in this field use the hypothesis of ‘‘fetal programming’’ to explain the relationship to adult health. Consider that a computer program will work fine if it is written correctly but will malfunction if some of the code is wrong. In the same way, if the fetal ‘‘program’’ develops correctly in a healthy pregnancy, later-life health and development have a better chance of proceeding healthfully than if the program has errors because of malnutrition or other insults during pregnancy. Of course in some cases, there may be genetic factors that affect both birth weight and adult diseases, but in other cases, it may be that the program is altered by environmental circumstances. The fetal programming hypothesis proposes that a developing fetus uses cues to assess not only the environment of gestation but also the postnatal environment. The assumption is that if there is nutritional stress during pregnancy, there will also be nutritional stress after birth.
Thus, the baby’s system is programmed to be ready for the same conditions it experiences in utero. The result is a metabolism that is very efficient at getting the most out of whatever calories are available—a thrifty metabolism. When this get-the-most-out-of-every-calorie metabolism meets food abundance in childhood, excess fat deposition and other health challenges may result. Furthermore, women in health-rich populations who try to restrict their food intake during pregnancy to avoid gaining weight may be inadvertently teaching their foetuses to expect insufficient nutrients after birth, which can be problematic in a world of high-fat diets and more than enough food.
Some of the greatest challenges to adult health are seen in people who were born small after nutritionally stressful pregnancies but subsequently grow up in environments that are very different from the ones for which they were programmed prenatally, resulting in a ‘‘mismatch’’. This is particularly true today with rapid globalization and the associated high-fat and high-caloric foods that are often readily available. The baby may be able to put on weight and reach a normal level rather quickly, but the suppressed growth of most organs of the body cannot usually be made up for postnatally. An overweight child who was low birth weight but is eating lots of fats and sugars puts more and more stress on her liver, pancreas, and gut. No wonder she finds she has problems with obesity, diabetes, hypertension, and atherosclerosis when she is older. Some have argued that children born small are better off if they remain small throughout their lives and that efforts to improve health by improving nutrition may misfire. In this case, if the prenatal and postnatal environments are similar, the metabolic needs developed in utero would be better matched to the world into which the child is born and grows up and it is predicted that adult health would not be further compromised. Unfortunately, such arguments often lead to policy decisions that fail to improve conditions for women during pregnancy, especially in circumstances where the causes of poor conditions are social and economic inequalities. In this case, the argument that the status quo is best does not resonate morally and ethically, nor does it work in populations undergoing modernization. Additionally, nutritional stress during pregnancy can lead to later life health challenges even if the infant is not small at birth. 
Furthermore, infants who are unusually large at birth are predisposed to some adult-onset diseases, suggesting a much broader phenomenon with regard to intrauterine developmental effects on later life health.
Some of the most astounding and worrisome findings from studies of the relationship between prenatal conditions and later life health are that the effects are transgenerational. As noted previously, the quality of the eggs developing in a girl fetus just a few weeks after fertilization will determine the quality of her offspring, several years later. Those eggs have a ‘‘memory’’ of their time in utero that affects their quality when they are ovulated and fertilized to start a new life trajectory. Anthropologist Chris Kuzawa has proposed the ‘‘intergenerational phenotypic inertia’’ hypothesis, which states that the fetus is obtaining cues not just from the mother, but from her entire matrilineage. This suggests that adaptations that have been successful for generations because they served to buffer pregnancy from insult due to poor nutrition may not be amenable to short-term fixes. This is further support of the argument that only by improving health for all pregnant women and their offspring can we eventually break the vicious cycle of transgenerational prenatal programming. Paediatrician Peter Nathanielsz has argued that prenatal programming can explain good adult health as well and proposes that it resolves the problem of the ‘‘French paradox’’ whereby the French can eat high-fat and high-cholesterol foods all of their lives without developing the cardiovascular problems and diabetes that their diet would inflict on most Americans. He notes that for more than a century, the French have had a very sophisticated system of prenatal care for all pregnant women that ensures their foetuses develop optimally. Prenatal programming is affected by a number of factors other than nutrition with infection and inflammation being predominant. In fact, early life infections may have as much of an impact on later health as prenatal nutrition. 
In other words, it is not just undernutrition alone or even infections alone that predispose a person to poor adult health, but the synergistic effects of these two factors. This is not surprising, given the obvious and well-known synergy between malnutrition and infection in childhood health. Of course, any model linking early-life conditions to adult disease must include genes and environment and their interaction.
The Pregnant Biped
Bipedalism is the hallmark of the human species, but it does not come without costs, especially with regard to the spine. Four-legged animals have curved spinal columns and carry their internal organs (and babies when they are pregnant) slung beneath the horizontal spinal supports. When our ancestors began to habitually walk upright, however, the spinal column assumed an S-shape in order to keep the body mass above the legs and feet that supported it. Thus, the centre of gravity of a biped is just above the hip region. The curve in the lower back is known as the lumbar curve and the posture assumed is called lordosis. It is a weak point of the skeleton and accounts for the almost ubiquitous lower back pain that is associated with aging. As a woman reaches the last trimester of pregnancy, the baby she is carrying projects farther and farther in front of her, throwing off her centre of gravity. Fortunately, her spine has the ability to compensate for this shift in balance, thanks to natural selection for greater wedging of her lower (lumbar) vertebrae. Anthropologist Katherine Whitcome and her colleagues studied the spines of several women late in pregnancy and found that they were able to increase the curvature of their lower spines to keep the centre of gravity positioned above their hips, just as it is when they are not pregnant. They could do this because more of their lumbar vertebrae were wedged toward the back in comparison to men. Furthermore, the researchers found that the australopithecines also showed this adaptation, suggesting that it traces its origin to the beginnings of bipedalism itself. Unfortunately, it also means that women have more lower back pain during pregnancy and are at higher risk for slipped disks in this region.
Psychosocial and Other Stress during Pregnancy
One mechanism that has been proposed to affect fetal programming is excess production of hormones known as glucocorticoids, the best known of which is cortisol, often produced in response to stress. Excess glucocorticoids are also implicated in a number of the adult-onset diseases that have been linked to low birth weight, so it has been suggested that this excess can explain both sets of phenomena. One proposal is that excess glucocorticoids, especially late in pregnancy, can alter the development of several systems, including metabolic functions; insulin resistance; and cardiovascular, liver, and pancreatic functions. Stress is also known to compromise immune function. Glucocorticoids are also important for brain development and, in excess, can cause problems, particularly if the excess occurs during sensitive periods. Thus, the timing of a stressful incident determines the effects that glucocorticoids have on developing fetal systems. Attention to some of these effects is important because glucocorticoids are commonly used in medical treatment of pregnancies at risk for premature delivery. Not surprisingly, the effects of stress hormones on developing systems are also manifested in physical and mental problems later in life. For example, stress during pregnancy appears to put children at risk for behavioural disorders, including hyperactivity, impaired cognitive function, anxiety, and fearfulness. As long as the fetus is gestating, it is subjected to the physiological effects of the stress that the mother is experiencing. The effects are less direct once the baby is born, and an important advantage gained by giving birth while fetal brain development is still going on is that it minimizes the direct effects of maternal stress during later phases of neurological development.
Just as a fetal environment that suffers from under- and malnutrition programs the fetus to expect similar challenges after birth, one that includes a great deal of stress on the mother may program the fetus to expect a lot of stress later in life. The result tends to be an over-active and over-reactive stress response that can itself affect later-life health, both physical and mental. Cardiovascular health is worse and depression is more frequent in people who experienced a great deal of stress (and the associated glucocorticoids) in utero. This is another experience that may transcend generations. If a woman who experienced stress in utero has an over-active stress response, she may have elevated stress levels when she becomes pregnant and can pass this along to her own children. Those children who are girls can continue this process on down through the generations. Following a ‘‘prenatal prescription’’ for minimizing stress may be the only way out of this cycle.
Psychosocial stress has certainly been implicated in pregnancy complications, but there is evidence that environmental stresses such as those caused by earthquakes can also affect pregnancies, depending on when in pregnancy they occur. These are the sorts of stresses our ancestors may have faced: the stress of an unexpected shaking of the earth, the eruption of a volcano, the thundering of wildebeests, the screams of lions and leopards. Evidence in support of this view was found in a study of women who experienced the 1994 Northridge, California, earthquake during their pregnancies. Women who were in the first trimester when the earthquake occurred gave birth significantly earlier than those who were in the last trimester. In the ancestral past, a healthy and active stress response was probably a good thing for pregnant women who had to move quickly to get out of the way of a predator or other ‘‘real’’ threat. The elevated glucocorticoids did their job to get our ancestors moving but then dropped off, relaxing the stress response. Today, that same healthy stress response that may have enabled our ancestors to survive is activated several times a day by bad news on TV, sirens and other loud noises, not enough money, slights or insults from other people, anger at our spouses or parents, unruly children, and our active imaginations. These problems with the stress response are by-products of modern lives, and they have taken a formerly advantageous response and turned it into one that causes mental and physical health problems throughout life. Another type of stress actually appears to enhance maternal and fetal health: the stress associated with moderate physical activity. From an evolutionary perspective, the idea that pregnant women should sit quietly and rest through the last part of pregnancy does not make sense.
There is no evidence from contemporary or recent populations who live as our ancestors did that women altered their activities in significant ways until the very end of pregnancy. Thus, it is not surprising that babies of mothers who exercised during pregnancy are easier to calm (their stress response is not over-active), seem more aware of their environment (and thus learn more), and seem to have an easier time with birth. The mothers themselves report feeling better and having shorter labours and fewer complications at birth. Of course, women with other health complications would not necessarily be advised to exercise in pregnancy, and it is probably not a good idea for a woman to begin exercising vigorously if that has not been part of her normal lifestyle.
Finally, there are the stresses of toxic and other noxious substances that the mother and her developing fetus are exposed to in their everyday lives. These stressors include cigarette smoke (including passive smoke), pollutants in water and air, recreational and therapeutic drugs, alcohol, caffeine, and household chemicals. It has been suggested that some cases of attention-deficit hyperactivity disorder (ADHD) may be traceable to exposure to environmental hazards. Smoking during pregnancy has been linked to low birth weight in the baby, smaller placentas, higher incidence of respiratory diseases in childhood, and higher likelihood of smoking as an adult (another form of fetal programming?). Excessive alcohol consumption during pregnancy can lead to fetal alcohol syndrome (FAS), a condition in which a baby is born with physical and mental abnormalities that are irreversible. Slowed postnatal growth can also result from alcohol use during pregnancy, the probable mechanism for which is a decreased number of cells because of alcohol’s inhibition of cell division early in development. Because fermented beverages were rare to non-existent during early human evolution, this is another mismatch between pregnancies that evolved not to expect alcohol consumption and our contemporary lives. Clearly, any way that one can reduce stress and exposure to stressors during pregnancy is likely to be beneficial. Considering the dense social networks in which our ancestors experienced pregnancy, it is not surprising that social support enhances the health of both mother and fetus. Having social support at the time of birth may have made the difference between life and death for many mothers and babies in the past. This desire for the presence of others during labour and delivery and in the early months of life of our infants may be a legacy we have inherited from our ancestors. It may seem that I have dwelt on all the things that can go wrong in a pregnancy and, indeed, this seems to be just what I have done.
But pregnancies usually proceed without mishap in environments of adequate food, few infectious agents, and relatively low stress. Most of us are here today because our mothers did not lose us early in pregnancy, got sufficient nutrients including vitamins and minerals, and were able to avoid serious morning sickness, gestational diabetes, and eclampsia. Those of us whose mothers had clinical problems during pregnancy are likely here because they had access to modern medical resources that enabled them to continue the pregnancy and give birth to a healthy infant.