Homo sapiens is unique in having four prolonged and pronounced postnatal pre-adult life history stages: infancy, which lasts for 30–36 months and ends with weaning from breast feeding in traditional societies; childhood, which lasts for an additional 2–4 years and concludes in a degree of independence as regards protection and food provision; a juvenile stage of 3–4 years that terminates with readiness for sexual maturation; and adolescence, which lasts for 3–5 years and culminates in fertility. Juvenility implies two transitional periods which are only experienced by humans: a transition from childhood to juvenility and one from juvenility to adolescence. Juvenility, “the age of reason and responsibility” and of concrete operations, coincides with elementary school age and offers opportunities to prepare for the social complexity of adolescence. Here I define the transition to juvenility by three variables: adrenarche (the onset of adrenal androgen generation), growth pattern (deceleration from the linear childhood growth velocity) and the adiposity rebound of the body mass index curve. The data presented suggest that this period is endowed with programming/predictive adaptive responses of body composition to the environment.
Evolutionary life history theory deals with the strategic allocation of an organism’s energy for growth, maintenance, reproduction, raising offspring to independence, and avoiding death.1 It predicts that selection will promote efficient physiological mechanisms that mediate allocation strategies by linking life history stages to trade-offs.
The human life history strategy includes: long gestation and large, yet immature neonates for body weight; an unusually high rate of energy-costly postnatal brain growth; an extended period of offspring dependency and slow growth of helpless young; a brief duration of breast feeding, yet intense maternal, paternal and tribal care; delayed reproduction; rapid adolescent growth; and menopause with two decades (and today many more) of postmenopausal life.1 The life history approach to intermediate growth stages and the transition between them provides a theory for understanding strategic objectives that include when to be born, when to be weaned, how long to remain an infant, when to be independent for self-protection and self-provision, when to mature for reproduction, and when to cease growth.
As late as 3 000 000–4 000 000 years ago the early Homininae Australopithecus afarensis, best known to include “Lucy” from Hadar in Ethiopia, had (like the chimpanzee) merely two postnatal, pre-adult life history stages: infancy and juvenility. During the evolution of the Hominidae, childhood and adolescence have been added as new life history stages as compared with apes.2–4 Thus, Homo sapiens has four prolonged and pronounced postnatal pre-adult life history stages (fig 1A): infancy, which lasts for 30–36 months and ends with weaning from breast feeding in traditional societies; childhood, which lasts for an additional 2–4 years, culminating in a degree of independence as regards protection and food provision; a juvenile stage of 3–4 years that concludes with readiness for sexual maturation; and adolescence, which lasts for 3–5 years, culminating in fertility at an average age of 18.5 These stages are well supported by changes in sex hormone patterns (fig 1B).
Except for humans, all other mammals (including the great apes) pass directly from infancy to juvenility and from the latter to adulthood, without passing through the childhood and adolescence stages. This means that humans have four transitional periods unique to them: transitions from infancy to childhood, from childhood to juvenility, from juvenility to adolescence and from there to adulthood. Comparison with the African apes suggests that the age of transition to juvenility in chimpanzees may be similar to that in humans, although the full course of age-related changes in dehydroepiandrosterone sulfate (DHEAS) and their relationships with reproductive and brain maturation are not clear.6
This review defines juvenility as a distinct clinical life history stage, characterises it in terms of its unique endocrine and body composition changes, relates these changes to social assignments and psychological maturation, describes the plasticity in the transition from childhood to juvenility and from juvenility to adolescence, and claims that this life history stage is endowed with programming/predictive adaptive responses for a thrifty phenotype, metabolism and body composition.
The social/cognitive definition of juvenility
As previously mentioned, the childhood stage is exclusively a human innovation; it is defined by a stabilising growth rate, immature dentition and weaning while the child continues to depend on older people for provisions and protection, and by behavioural characteristics, including immature motor control.
The juvenile stage offers opportunities to prepare for the social complexity of adolescence as well as adulthood. The psychologist Sheldon White called it “the five to seven year shift” or “the age of reason and responsibility”7 (table 1). At the end of brain growth and equipped with adult molars, primates move on to juvenility to forage for food and care for themselves. Whereas chimpanzees make the move directly from infancy, humans, who have a shorter infancy, initiate juvenility after a period of childhood.
While in modern societies the transition to juvenility coincides with the age when children first go to school, in traditional human societies, juveniles find much of their own food, avoid predators and compete to some extent with adults for food and space. Cognitive and social advances accompany the physical changes induced by adrenarche. Developmental psychologists refer to this period as “middle childhood”, a period of cognitively concrete operations, when children become less dependent on their parents for support and begin to interact with other adults and peers (table 1). The juvenile builds up organised and logical thoughts, can perform multiple classification tasks (order objects in a logical sequence) and comprehend the principle of conservation. Thinking becomes less transductive and less egocentric, and the child is capable of concrete problem-solving. To become independent, the juvenile learns complex feeding skills: when and where to find food, and how to hunt with the group. In that respect, it is interesting that around age 6, a systematic process of brain grey matter reduction occurs in the primary association areas,8 which will be complete in the prefrontal cortex in the 20s. Grey matter reduction reflects synaptic pruning during this period.9
For the paleoanthropologist, the transition from childhood to juvenility is associated with the eruption of permanent molars, for that is the fossil they usually encounter. Permanent molars erupt in Homo sapiens at 6 years of age. A comparative study across 21 primate species found the age of first molar eruption to be highly correlated with brain weight (r = 0.98) and a host of other life history variables.10 The strength of the correlations seemed best explained by the robustness of tooth development in the face of environmental perturbations, especially when compared with more plastic life history variables such as age of sexual maturation or first birth. Thus, the age of tooth eruption is a good overall measure of the maturation rate of a species, and in the context of the present discussion the transition from childhood to juvenility is closely associated with molar eruption.
Interestingly, data for dental eruption in 1837 showed similar eruption ages to those known today11; unlike the secular trend in the transitional age to adolescence, at a time of a marked worldwide upward trend for height, transition to juvenility has not changed much over the last 170 years.12 The age of transition to juvenility as determined by the eruption of the first molar may be even longer standing. A recent study used novel techniques to estimate the chronological age of an ancient modern human from Jebel Irhoud in Morocco, dated to 160 000 years before the present (ybp).13 The authors showed that the age of lower incisor tooth eruption was much the same as it is today, suggesting that the age of transition to juvenility has not changed throughout the ∼200 000 years of modern humans. Interestingly, the Neanderthals’ permanent molars erupted at an equivalent age of 6.5 years,14 marking the age of transition to juvenility of this species.
If the transition to juvenility is to be defined by adrenarche (the onset of adrenal androgen production), this was generally considered to be around age 7–8 for girls and 8–9 for boys. Yet a closer look at adrenal androgen levels suggests an earlier age of 5–6 (fig 2).15–17 If adrenarche is considered to begin with the appearance of the adrenal zona reticularis, it may have its onset as early as age 3–4.18
The endocrine control of adrenarche has been the topic of several excellent review articles.19 20 Serum DHEA and DHEAS rise progressively throughout juvenility,21 with effects on a wide variety of physiological systems, including neurological22 and immune,23 and somatic growth and development.24 25 It has been suggested that the primary effects of DHEA in humans are neurological and as a mood modulator.26 27 In that respect, adrenarche is timely at the age of maturation of the cerebral cortex of 6 years.8 Campbell suggested three mechanisms by which DHEAS may promote changes in behaviour and cognition,6 all in line with the evolutionary significance of juvenility: (1) acting on the amygdala to reduce fearfulness and allow for the expression of an increased range of social interactions with unfamiliar individuals, as the juvenile cares for his or her new needs and interacts with peers; (2) acting on the hippocampus to promote memory, social and cognitive capacity, as he or she joins in some adult activities; and (3) acting as an allosteric antagonist of the GABA receptor. In addition, DHEAS may play a role in synaptogenesis and cortical maturation, and may help wire the brain in response to existing social environments. It is conceivable, although requiring experimental evidence, that these brain effects of DHEA are required to prepare the central nervous system for adolescence, both in its psychosocial sense and in setting the scene for pubertal maturation of the hypothalamic-pituitary-gonadal (HPG) axis.
The HPG axis is operational during juvenility and shows unique changes in both sexes.28 29 Evaluation of the diurnal rhythm of oestrogen indicates that girls show a rise in early morning estradiol levels at least 2 years before the onset of clinical puberty, and in the middle of juvenility.30 Indeed, oestrogens suppress 3β-hydroxysteroid dehydrogenase activity and enhance 17,20-lyase activity, resulting in higher DHEA and DHEAS levels.31
Juvenility and increasing adrenal androgen levels are associated with an increase in muscle mass and bone mineral content; the association of enhanced adrenal androgen generation in congenital adrenal hyperplasia with muscularity is well documented.32 Accordingly, an increase in fat-free, lean body mass is evident around age 5, which is greater in girls than in boys, apparently as part of the female mid-childhood spurt.33 34 As a consequence of increasing fat-free mass, bone remodelling accelerates; in a study of 205 healthy juveniles, a significant influence of muscularity on periosteum modelling was found, with positive correlation of C19 steroids with cortical density and bone mineral content.35
The adrenal reticularis and the HPG axis are not the only glands involved in the transition to juvenility. GH-IGF-1 axis activity is enhanced in parallel with the rise in adrenal androgens36 37 and in girls more than in boys, and DHEA levels during female (but not male) juvenility correlate with increasing leptin levels, suggesting an association between body fat and transition to juvenility.38
Juvenile body composition
Adiposity rebound is an important indicator for the transition to juvenile body composition (fig 3).39 The adiposity rebound corresponds to the second rise in the age-related BMI curve that occurs between ages 4 and 6. Its sexual dimorphism corresponds very well to that observed in the second derivative growth curves, with a mean rebound at age 68 months in boys, some 6 months later than in girls.
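In practice, the adiposity rebound is located as the nadir of a child’s BMI-for-age curve, after which the second rise begins. A minimal sketch of that definition, using an entirely hypothetical longitudinal BMI series for one child:

```python
# Minimal sketch: estimating the age of adiposity rebound as the nadir of a
# BMI-for-age series. All measurement data below are hypothetical.

def adiposity_rebound_age(ages_months, bmi_values):
    """Return the age (in months) at which BMI reaches its minimum,
    i.e. the point after which the second rise of the BMI curve begins."""
    pairs = sorted(zip(ages_months, bmi_values))
    return min(pairs, key=lambda p: p[1])[0]

# Hypothetical child: BMI falls through early childhood, then rebounds
ages = [24, 36, 48, 60, 72, 84, 96]          # months
bmi  = [17.2, 16.4, 15.8, 15.5, 15.9, 16.3, 16.8]  # kg/m^2

print(adiposity_rebound_age(ages, bmi))  # 60 months, within the 4-6 year window
```

With serial measurements rather than a fitted curve, the estimate is only as fine-grained as the measurement interval; an earlier nadir in this series would correspond to the early adiposity rebound discussed below.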
An early adiposity rebound is observed in overweight children and is associated with an increased risk of overweight, suggesting that the body habitus programs for the timing of transition from childhood to juvenility.40 The typical pattern associated with an early adiposity rebound is a marked increase in BMI during juvenility that will exacerbate during adolescence. This pattern is recorded in children of recent generations as compared to those of previous times, owing to the trend of a steeper increase in height as compared to weight in the first years of life.
Adiposity rebound may be the first clinical sign of juvenility, or it may be the signal that switches transition on. It is interesting to note that even lean girls with precocious adrenarche have higher than control levels of IGF-I, IGFBP-3 and leptin,41 as mechanisms that may transmit the signal of energy readiness.
Two important clinical observations may be relevant for understanding the importance of this stage with respect to body composition. We have previously shown that onset of obesity in children with acanthosis nigricans and a family history of obesity and the metabolic syndrome occurs at a mean age of 6.4 years as compared to 2.3 years in obese children without acanthosis nigricans and a family history of the metabolic syndrome.42 Children with obesity onset at juvenility had a truncal (android) distribution of fat and their fasting blood glucose and HDL/total cholesterol were low. It is around the age of 4–5 that patients with Prader-Willi syndrome (PWS) become progressively overweight, while developing their typical high body fat mass and low body muscle mass.43 Indeed, patients with PWS tend to have premature pubarche, with higher DHEAS levels, as compared to control subjects.44 Whereas the pathophysiological mechanism of obesity in PWS is poorly understood, suggestions of increased insulin sensitivity may imply a role for insulin in the link between obesity and juvenility; the transition to juvenility coincides with obesity onset in children who later develop the metabolic syndrome and type 2 diabetes.42
The close proximity to adiposity rebound suggests transition to juvenility is linked to energy supply. In support, increases in DHEAS levels correlate positively with increases in BMI.45 It has been suggested that as brain growth tapers off during juvenility (fig 4), energy that was formerly allocated to brain growth is temporarily stored as abdominal fat, in order to support the energetically costly accelerating growth during the upcoming adolescence.6 Indeed, the age of transition to juvenility is strongly linked to age at onset of puberty; patients with precocious puberty have an early adrenarche,21 and those with delayed puberty or hypogonadotrophic hypogonadism have a late transition to juvenility.46
It is interesting to note that as adiposity accelerates in mass, bone decelerates in growth and accruing minerals, while the rise in bone mineral density slows to its lowest rate.47 These two tissues, bone and adipose, mature out of common stem cells, which ultimately differentiate into osteocytes, chondrocytes or adipocytes. The plasticity to differentiate into different routes is a feature of the stem cell, which responds to the environment by means of a complex array of hormones and growth factors. In vitro and in vivo studies strongly support an inverse relationship between the commitment of mesenchymal stem cells or stromal cells to adipocyte or osteoblast lineage pathways.48
Growth of the juvenile
Another possible definition of juvenility derives from growth curves and human physical anthropology (figs 4 and 5). After a period of constant growth rate and the “mid-childhood spurt” (greater and earlier for girls than for boys), a decline in the rate of growth signifies a transition to a new life stage, giving juvenility the slowest growth rate since birth.25 This coincides with the social assignment of the juvenile as he or she joins adult society for hunting or domestic tasks. This juvenile growth pattern is mostly evident from velocity curves, which suggest a mean transition to juvenility at age 4.5 years for girls and 5.5 years for boys. The trade-off for this deceleration at a time when brain growth is almost complete may have to do with the learning required for living within the social hierarchy of the group without posing the physical threat of a large body.49
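The velocity curves referred to above are simply the first derivative of the height-for-age curve. A minimal sketch, with hypothetical annual height measurements, of how velocities are derived and how the post-spurt deceleration that marks the transition shows up:

```python
# Minimal sketch: deriving a growth-velocity curve from serial height
# measurements. The measurement series below is hypothetical.

def growth_velocities(ages_years, heights_cm):
    """Return (midpoint age, velocity in cm/year) for each interval
    between consecutive measurements."""
    out = []
    for (a0, h0), (a1, h1) in zip(zip(ages_years, heights_cm),
                                  zip(ages_years[1:], heights_cm[1:])):
        out.append(((a0 + a1) / 2, (h1 - h0) / (a1 - a0)))
    return out

ages    = [3, 4, 5, 6, 7]                      # years
heights = [95.0, 102.0, 109.5, 115.5, 121.0]   # cm, hypothetical girl

for mid_age, vel in growth_velocities(ages, heights):
    print(f"{mid_age:.1f} y: {vel:.1f} cm/year")
```

In this made-up series, velocity peaks around age 4.5 (a mid-childhood spurt) and then declines, the pattern described above as marking the transition to juvenility.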
Analysis of leg length as a function of age shows a clear acceleration in lower limb growth as juvenility commences (fig 4), in accordance with independence in juvenility as regards provisions and protection. Sitting height (fig 4) and the bi-iliac diameter (fig 5) also show an increase, whereas sitting height as a fraction of the total height (fig 5) decreases during the transition from childhood to juvenility. It has recently been shown that longer lower limbs relative to body mass reduce the energetic cost of human walking,50 an obvious advantage as the juvenile joined the hunter-gatherer adult society.
Transition from childhood to juvenility
The initial taxonomy of life history stages defined pubarche as the onset of juvenility.2 Evidence shown here gives a much earlier age for the transition to juvenility for present day humans; pubarche is a late event during juvenility and is also quite subjective. However, it is clinically evident, and studies of premature pubarche may shed light on the impact of early or late juvenility; body composition and metabolic adaptation are the most noticeable outcomes. Girls with premature pubarche are more inclined to develop ovarian hyperandrogenism, hyperinsulinaemia and dyslipidaemia later in life.20 In fact, hyperinsulinaemia and dyslipidaemia may be detectable as early as during juvenility, and worsen during pubertal development. They are commonly accompanied by anovulation from late adolescence onwards, with low serum levels of IGFBP-1 and sex hormone-binding globulin.
Although precocious pubarche, the metabolic syndrome and obesity may constitute a genetic syndrome, no unambiguous gene mutations have been found so far, and I suggest that these conditions may represent developmental programming or an adaptive response within our adaptive phenotypic plasticity. This is supported by the fact that premature pubarche is mostly a female phenomenon and is less common in boys. Should this be the case, the adaptive trade-off package includes high levels of circulating androgens in females, obesity and insulin resistance. The latter two effects have previously been suggested as components of the thrifty phenotype, known from the outcome of intrauterine growth retardation. Indeed, small-for-gestational-age children also have an earlier transition to juvenility, characterised by early adrenarche, early pubarche and early adiposity rebound.
What, then, are the environmental cues for early onset of juvenility? One cue is prenatal energy balance, which signals for a trade-off of early transition to juvenility, which in turn provides a cue for the thrifty phenotype and hyperandrogenism in the female. Whereas hyperandrogenism compromises fertility, masculinisation of girls and women may be a valuable trade-off for the individual, her direct family and the social group under certain environmental constraints.51 Early adolescence will be a secondary trade-off for this package.
Another trade-off may relate to the neurological effects of DHEA mentioned above, as it rises with the onset of juvenility. The social function of the juvenile, as he or she gains new assignments, requires the androgenic effect of DHEA. In fact, DHEAS levels positively correlate with ratings of aggression and delinquency among juvenile boys,52 and girls with premature adrenarche show higher levels of anxiety associated with increased DHEAS levels.53 Among women with adrenal insufficiency, DHEA supplementation improved self-esteem, sexuality and overall well-being, and decreased depression and anxiety,26 54 traits that are consistent with the newly assigned social role of the juvenile.
This review has been an attempt to use life history theory to understand juvenility in a broad evolutionary perspective. Life history traits respond to environmental cues in order to enhance fecundity-survival schedules and behavioural strategies that yield the highest fitness in a given environment.
The transition from childhood to juvenility is part of a strategy in the transition from a period of total dependence on the family and tribe for provision and security to self-supply; it is assigned with a predictive adaptive response of body composition and energy metabolism. The transition from juvenility to adolescence is also assigned with the age and length of fecundity. It entails plasticity in adapting to energy resources and other environmental cues, and to the social needs of adolescence, a maturation that directly determines fitness. These periods influence each other in an intricate web of connections that are related to evolutionary fitness and lifelong advantages.
The data presented suggest that juvenility is endowed with programming/predictive adaptive responses for a thrifty phenotype, metabolism and body composition.
Competing interests: None.