Thunderstorms are uncommon in late November, but on the morning of 25 November 2006 thundery showers persisted for several hours. Eventually, just after nine o'clock, this spectacular rainbow appeared. Note that the sky inside the bow is significantly lighter than it is outside. The optics behind this effect are fairly straightforward: raindrops "inside" the bow, from the viewer's perspective, reflect sunlight back towards the viewer.
© Christopher Seddon 2008
Saturday, 31 May 2008
Tuesday, 27 May 2008
Seating on the 271 Bus
The photos below, taken with a mobile phone, show the seating configuration on the buses used on Route 271 from Highgate to Moorgate, compared with that on a normal bus.
The lack of legroom is such that it is physically impossible for a person of my size to be seated without either occupying both seats, with my legs splayed apart, or sitting sideways on the outermost seat, with my legs in the aisle. Either way is singularly uncomfortable! Admittedly at 6 ft I’m quite tall, but I’m hardly in the Peter Crouch league.
Below is the rather condescending reply I got when I raised this matter with TfL back in March. The buses were converted to facilitate disabled access, but rather than accept a slightly reduced capacity they hit on the brilliant idea of reducing the legroom. The assertion that the majority of passengers can sit without difficulty is nonsense – I have a friend who is 5 ft 4 in and finds them uncomfortable. I doubt if anybody but a young child could be seated in comfort.
Dear Mr Seddon
Thank you for your recent email about leg room on the route 271 buses.
I am sorry to learn that you have found the seats uncomfortable and too close together. Please accept my apologies for any discomfort and inconvenience you have suffered as a result. We do try to operate an efficient service for the benefit of all our passengers.
Since the beginning of 2006, all London buses have been of a low floor, accessible design. In order to accomplish this we had to alter the seating layout on new buses so that overall passenger capacity was not reduced because of the accessible design.
I can assure you that all London buses operated on our behalf meet the necessary legal regulations (including those related to safety). All public service vehicles in the UK must comply with the legal requirements before they can enter service.
We do have a Bus Design Forum which provides us with passenger feedback about the diverse needs of commuters and the barriers they encounter. We ensure that the Forum is representative, by recruiting members from different user groups. This includes older people, young people, wheelchair users, passengers using buggies and people with learning difficulties or physical impairments. All buses are built to standards specifications which fulfil the standards detailed by the Disabled Passengers Transport Advisory Committee.
The comments received from the Forum are fed into our discussions with bus operators and manufacturers about bus design. Our decision to ensure that all new buses have a vertical grab rail at the end of every seat on the lower deck of buses is one such outcome from discussions with the Forum.
The seat dimensions on buses enable the majority of passengers to stand or be seated without difficulty. I appreciate that you have found the route 271 buses to be restrictive in this respect. Once again, I am sorry about this.
We will continue to work with the operating companies, bus manufacturers and the Bus Design Forum to ensure that each new bus design is better than the last.
Thank you for contacting me about this matter. Please let me know if I may be of any further assistance.
Basically, given a choice between a slight reduction in capacity and making the upper deck of the bus horribly uncomfortable for just about everybody, they opted for the latter. As is all too often said these days, you couldn't make it up!
© Christopher Seddon 2008
Monday, 26 May 2008
Australopithecus
The First of the Great South Africans
The first discovery of a bipedal ape was made by the Australian anthropologist Raymond Dart in 1924. Investigating a box of fossil-bearing rock from a limestone quarry in Taung in North West Province, South Africa, Dart discovered fossilised remains of a juvenile apelike creature. The remains comprised the face, part of the cranium, the complete lower jaw and a brain endocast formed when material within the skull hardened to rock. Dart concluded that the brain was humanlike and that the foramen magnum was placed centrally in the basicranium as with humans, rather than towards the rear as with apes. The canine teeth were small – again like humans. Dart described what became known as the Taung Child in February 1925 in the journal Nature (Dart, 1925), naming it Australopithecus africanus (Southern ape of Africa).
Dart’s claim was strongly criticised at the time, largely because the (fake) Eoanthropus dawsoni (“Dawson’s dawn man”), better known as Piltdown Man, fitted the then-prevalent view that brains had expanded before bipedal walking evolved. The find was widely dismissed as a fossilised ape.
However, in 1936 the Scottish palaeontologist Robert Broom, a long-time supporter of Dart, instigated new searches for early human fossils and soon discovered the braincase of an adult specimen in a set of limestone caves at Sterkfontein, near Krugersdorp, 25 km northwest of Johannesburg. Further finds followed at nearby Kromdraai in 1938, but then the war intervened and the sites were closed down for the duration. Broom (who was in his seventies) spent the war years preparing a monograph on the finds, which was published in 1946, after which the australopithecines were generally accepted as hominins.
Broom assigned his finds to two species, Plesianthropus transvaalensis (“near-man from the Transvaal”) and Paranthropus robustus (“alongside-man”). After the war, now assisted by John T. Robinson, Broom resumed his excavations. In April 1947 the pair discovered the nearly complete adult skull of a female Plesianthropus (STS 5), which became popularly known as Mrs Ples. Another find from Sterkfontein, a fossilised pelvis, vertebral column and fragmentary rib and femur discovered by Broom in 1947 and known as STS 14, may well be part of the same individual as STS 5.
Plesianthropus transvaalensis is now regarded as the same species as Australopithecus africanus. Paranthropus was later “lumped” in with Australopithecus, but the current trend for “splitting” has led some to resurrect it as a separate genus, incorporating the so-called robust australopithecines. The morphological differences between the robust australopithecines and the earlier australopithecines, colloquially referred to as gracile australopithecines, are far less than those between the earliest humans (Homo habilis) and modern humans (H. sapiens), which are not given separate genera. I will therefore follow Klein (1999), Conroy (1997), Lewin & Foley (1998, 2004) and others in not adopting this convention.
A. africanus is known from 2.8 to 2.3 mya; A. robustus is more recent, known from 1.8 to possibly as late as 1.0 mya. Both species were small-brained in comparison to humans. Australopithecus africanus is believed to have had a cranial capacity of around 430-520 cc. There was a considerable degree of sexual dimorphism in both species. A. africanus males typically measured 4 ft 6 in tall and weighed 40 kg, whereas females measured 3 ft 8 in tall and weighed 30 kg. Australopithecus robustus had a cranial capacity of 500-545 cc; males measured 4 ft 4 in tall and weighed 40 kg, females 3 ft 6 in tall and 32 kg.
In 2004, Mrs Ples made the Top 100 in SABC3’s television series Great South Africans, placing her in the company of Nelson Mandela, Steve Biko and Dr Christiaan Barnard.
Zinj
In the 1950s Louis and Mary Leakey began excavating at Olduvai Gorge, a steep-sided ravine in the Great Rift Valley located in the eastern Serengeti, Tanzania. For many years the Leakeys recovered only stone tools, but in 1959 they made the first of a series of discoveries that were to make them world famous and lead to Olduvai Gorge becoming known as the Cradle of Mankind. On 17 July of that year, with Louis back at camp unwell, Mary discovered an almost complete hominin cranium. The find was initially classified as Zinjanthropus boisei – Zinj was the medieval Arab name for this region of East Africa. However, it was later reclassified as a robust australopithecine and hence is now known as Australopithecus boisei (or Paranthropus boisei). The specific name boisei honoured the expedition sponsor Charles Boise.
“Zinj”, or “Dear Boy” as the specimen was affectionately known, is dated to 1.75 mya. A. boisei is now also known from East Turkana, Kenya (KNM-ER 406, a male cranium; KNM-ER 732, a female cranium) and from Ethiopia. The species is known from 2.3 to 1.2 mya. It is again quite small-brained at 500-545 cc. Males measured 4 ft 6 in and weighed 45 to 80 kg, again much larger than the females at 4 ft tall and 36 kg.
Lucy
The discovery of Australopithecus boisei proved that australopithecines had not been confined to South Africa, but Mrs Ples and her relatives remained unchallenged as the oldest known bipedal apes until 1974, when the baton passed to what is undoubtedly the most celebrated fossil hominin ever discovered.
In the 1970s diggings in the Afar Depression began to reveal evidence of an australopithecine species that considerably predated Australopithecus africanus. The first specimen, a fossilised knee joint known as AL 129-1, was discovered in November 1973 by a young American PhD student, Donald Johanson, at Hadar, along the Awash River. Its humanlike oblique femoral shaft indicated that it had belonged to a biped. A year later an expedition led by Johanson and the French anthropologist Yves Coppens recovered a 40% complete skeleton, designated AL 288-1, some 2.5 km from the site of AL 129-1. Nicknamed Lucy after the Beatles song Lucy in the Sky with Diamonds (which was played on a cassette recorder at a party held at the campsite to mark the find), AL 288-1 became world famous overnight. The new species was named Australopithecus afarensis.
Shortly after the discovery of Lucy, Johanson’s team found a 3.2-million-year-old fossil bed containing 333 separate fragments; the site was accordingly dubbed Locality 333. The remains were associated with a group of 13 A. afarensis including males, females and infants. It is believed that this group – which became known as the First Family – were drowned by a flash flood, as there is no evidence that they were attacked by predators. If they were indeed all members of a single social group, this suggests A. afarensis lived in relatively large groups of mixed sexes and ages.
In 1978 Mary Leakey discovered a set of hominin footprints preserved for 3.7 million years in volcanic ash at Laetoli in Tanzania, 45 km south of Olduvai Gorge. Generally attributed to Australopithecus afarensis, the footprints were made by three bipedal individuals, all walking in the same direction. The individuals had human-like arched feet, lacked the mobile big toes of apes and appear to have been moving at a leisurely stroll.
Australopithecus afarensis is known from 3.9 to 2.8 mya. Its cranial capacity was 380-485 cc. Males measured 5ft tall and weighed 45 kg, females measured 3ft 3in and weighed 30 kg; thus again the species was highly sexually dimorphic.
The face was prognathic – that is to say the lower jaw jutted forward. The cranium was long and low, with a nuchal crest at the back to which were attached powerful neck muscles, needed to balance the head because of the prognathic lower face. The foramen magnum was centrally-placed, confirming upright posture. Males had a sagittal crest, implying strong, ape-like jaw muscles.
The canines and incisors, though large, were reduced in comparison to those of modern apes; the molars were large and thick-enamelled.
The human-like pelvis had a short, broad, backwardly-extended iliac blade, which centres the trunk over the hip joints, reducing fatigue during upright bipedal walking. However, it had relatively ape-like limb proportions: very short thighs and powerful arms, with forearms long in proportion to the upper arms, similar to chimpanzees. The ribcage was probably cone-shaped, as opposed to the barrel shape seen in humans. This implies that it was an adept climber and not yet a wholly committed biped, though it was undoubtedly a bipedal walker on the ground.
The Australopithecine family grows
In the decade and a half that followed the discovery of Lucy, the australopithecine family continued to grow.
In the mid-1980s a new robust type, Australopithecus aethiopicus, was identified. As the name implies, the first specimen (Omo 18) came to light in Ethiopia. But this find, made in 1967, comprised only a partial mandible, and the species wasn’t recognised until 1985, when Alan Walker and Richard Leakey discovered a skull at West Turkana, Kenya. This skull, KNM WT 17000, known as the Black Skull due to manganese colouration, has a cranial capacity of 410 cc. There is insufficient material to estimate the species’ body size. It lived 2.7 to 2.5 mya and may be ancestral to A. robustus and A. boisei, but this is uncertain. It is not even universally accepted that Omo 18 and the Black Skull belong to the same species.
The next discovery was Australopithecus bahrelghazali, found in 1993 and known only from the Bahr el Ghazal valley near Koro Toro, Chad (KT-12H1). The find consists of a fragmentary upper third premolar and the anterior portion of a mandible retaining one incisor, the sockets for the remaining three, both canines and all four premolars. An age of 3.4 to 3 million years makes it contemporary with A. afarensis, and its status as a separate species is disputed. However, its location in Chad, 2,500 km from contemporary australopithecine sites in the Rift Valley, suggests north-central Africa may also be important in human origins.
Further back: Ardipithecus ramidus and Australopithecus anamensis
Up to this point Lucy had held the title of oldest known hominin and, by implication, the species closest to the last common ancestor of humans and living apes, but from the 1990s onwards the date was successively pushed ever further back.
Australopithecus anamensis (“anam” means “lake” in the Turkana language) is another find dating back to the Sixties but not recognised until later. The first specimen, a single arm bone, was recovered in 1965 at Lake Turkana, but it was not proposed as a new species until 1995, following discoveries at Lake Turkana in 1987 by Allan Morton and in 1993 by Meave Leakey and Alan Walker. Leakey proposed the new species after noting differences between the new finds and A. afarensis.
Australopithecus anamensis was unquestionably bipedal, as shown by the form of its tibia, including the near right angle between the proximal shaft and the proximal articular surface, the large size of the lateral proximal condyle, and the human-like buttressing of the proximal and distal shaft. These features suggest a human-like transfer of weight from one leg to the other when walking. Like A. afarensis, A. anamensis had powerful arms that would have aided tree climbing. Body weight is estimated to have been between 47 and 55 kg. Based on canine root size, it may have been more sexually dimorphic than A. afarensis.
Australopithecus anamensis lived from 4.2 to 3.9 mya and is the oldest known australopithecine, but around the time its discovery was announced, an earlier, considerably more ape-like hominin, now known as Ardipithecus ramidus, came to light.
In September 1994 a research team headed by Dr Timothy White discovered hominin fragments – including a skull, mandible, teeth and arm bones – in the Afar Depression in the Middle Awash river valley of Ethiopia. Eventually 45 percent of the total skeleton was recovered. Dated to 4.4 mya, the new species was originally classed as an australopithecine, A. ramidus (“ramid” means “root” in the native Afar language), but it has subsequently been assigned to a new genus, Ardipithecus (“ardi” means “ground” or “floor” in Afar). More Ardipithecus ramidus finds were made in 2005 at As Duma, northern Ethiopia. These finds comprised nine individuals who lived between 4.5 and 4.3 mya.
Ardipithecus ramidus was about the size of a chimpanzee and had chimp-like dentition, including thin enamel, and strongly muscled arms that would have been an aid to climbing. It is linked to later hominins by its incisor-like canines and by the forward position of the foramen magnum, implying bipedalism. Leg bones show it was bipedal, but less so than its less ape-like successors. It clearly represented an earlier grade of organization than Australopithecus.
Ardipithecus ramidus seems to have lived in forest, scuppering theories about a savannah origin for bipedalism. Fossils are found with typical forest fauna. This is supported by implied dietary adaptations – thin molar enamel and small molar teeth, suggesting a diet of leaves, soft fruit and soft vegetables.
The discovery of a second Ardipithecus species, Ardipithecus kadabba (originally classed as a subspecies, A. ramidus kadabba) pushed the hominin lineage back still further. These later samples, also found at Middle Awash, represented five individuals and were older than the 1994 findings. Ardipithecus kadabba lived from 5.8 to 5.2 mya, not far from the 7 to 5 mya date for the human/chimp split obtained from recent molecular studies.
A fairly straightforward evolutionary relationship seemed to be indicated at this point, with Ardipithecus being ancestral to Australopithecus. Meanwhile, the discovery in 1999 of another “late” gracile australopithecine, A. garhi (“surprise” in Afar) seemed to fill in another gap between A. africanus and Homo habilis, the first human species.
However, the more recent discoveries of Orrorin tugenensis (Tugen Hills, Kenya; Pickford & Senut, 2000) and Sahelanthropus tchadensis (TM 266, “Toumaï” [“hope of life”], discovered on 19 July 2001) have thrown the whole issue back into the melting pot.
© Christopher Seddon 2008
Labels: australopithecus, dart, human evolution, leakey, lucy, mrs ples
Upright Apes
Introduction
The idea that there was a now-extinct “missing link” between apes and present-day humans goes back to Darwin’s time, but the idea that it was a single species has long been seen as simplistic. We now know that perhaps as many as a dozen species of human lived before the rise of Homo sapiens. But before the appearance of the larger-brained beings considered to be the first humans, around 2.5 mya, there were apes with brains no larger than those of chimps that habitually (i.e. all the time) walked upright rather than on all fours – unlike any ape now living. It is these that are closest to the idea of a missing link – or of the not quite human “man-apes” described by the late Sir Arthur C. Clarke in the novel version of 2001: A Space Odyssey.
A key question in understanding the relationship between these fossil apes, humans and our closest living relatives, the chimpanzees, is the time of genetic isolation between the latter two. Current estimates vary. Yang (2002) obtained a divergence time of 5.2 mya, with a 95% confidence interval of 4.6 to 6.1 mya. Kumar et al (2005) obtained a divergence of 5.0 mya, with a 95% confidence interval of 4.4 to 5.9 mya. However, some reject these studies and claim that upright apes lived earlier.
What do you call an Upright Ape?
For many years, the term “Hominid” sufficed. The term was derived from the Linnaean Family Hominidae, which was taken to include modern and extinct humans (Homo sapiens, H. erectus, the Neanderthals, etc.) and the upright apes, which were then generally contained in a single genus, Australopithecus. This latter grouping was split into two: “gracile” and “robust” australopithecines. The famous “Lucy” is an example of the former. The great apes (the two chimp species, gorillas and orang-utans) were banished to their own family, Pongidae. The two families were lumped together into a superfamily, Hominoidea, to which the term Hominoid could be applied.
However genetic studies in the mid-1970s showed that chimps were most closely related to humans, followed by gorillas with orang-utans a distant third. This strongly suggested a common African origin for humans, chimps and gorillas but made nonsense of the existing classification. Clearly humans and great apes belonged in the same family, and by the internationally-agreed rules of taxonomy, the more recently-described grouping, Pongidae, was “sunk” or subsumed into Hominidae. This meant the term “Hominid” could now mean a great ape, or indeed anything within the great ape clade, such as Sivapithecus (believed to be ancestral to the orang-utans).
To get round this problem and get back to a grouping that includes humans and only those species more closely related to us than to chimps, taxonomists have recently begun using the tribe Hominini (a rank which, rather illogically, comes below Family in the Linnaean scheme), which gives us the term “Hominin”. Some still include the chimps in this grouping, though they are more usually given their own tribe, Panini.
The Hominins, then, include two broad groupings - habitual upright walking (biped) apes, and larger-brained humans. Note that Hominini is a subset of Hominidae: all hominins are also hominids.
Of the thirty or so hominin species that may have existed since the divergence with the chimpanzees, just one remains: Homo sapiens.
Defining features of the Hominins
Hominins are distinguished from other hominids by various adaptations for a terrestrial rather than arboreal lifestyle. They are all habitual bipeds, with the skull sited on top of the vertebral column. Unlike in other primates, the feet are not prehensile and lack an opposable big toe. However, the opposable thumb is well developed.
In addition to upright walking, the evolution of the hominins is often described in terms of three other new features: reduction of the anterior teeth and enlargement of the cheek teeth, elaboration of culture, and a significant increase in brain size. These features arose at separate intervals, developing at different rates (an example of what is known as mosaic evolution). Tools appear at 2.5 mya; brain expansion does not occur until after 2 mya; but bipedalism is evidenced at 4 mya and may have existed in the very first hominins known, which would make it the primary hominin adaptation. In other words, even the earliest hominins were bipeds, but they had few other humanlike traits.
Bipedalism in Humans
While the great majority of tetrapods are quadrupeds, habitual bipedalism isn’t uncommon – many dinosaurs were bipedal, all birds are (when not in flight or swimming), and so are the macropods (kangaroos, wallabies, etc.). Though other apes can and do walk upright from time to time, the hominins are the only group of tail-less habitual bipeds.
Human bipedalism is a striding gait. The legs alternate between a swing phase and a stance phase. During the stance phase, the knee is locked in the extended position, requiring little energy to support it. The femur slopes inward to the knee (the valgus angle) and the two feet are close to the body’s midline. The body’s centre of gravity doesn’t shift laterally very much during each phase of walking, and the strong gluteal abductor muscles prevent the body from toppling over.
By contrast, chimps cannot extend their knee joints to produce a straight leg in the stance phase and need to expend more energy to support the body (try walking with your knees bent and you will get the idea). The femur does not slope inwards to the knee as much as in humans, the feet are therefore placed well apart, and the gluteal abductors are not highly developed. Chimps accordingly waddle, and this is exacerbated by their higher centre of gravity.
Chimps are a design compromise between tree-climbing and terrestrial locomotion (mainly knuckle-walking), whereas modern humans are fully adapted for terrestrial living. Earlier hominins retained some arboreal adaptations.
Humans have needed extensive anatomical modifications including:
1) Curved lower spine.
2) Shorter, broader pelvis and angled femur, reorganized musculature.
3) Lengthened lower limbs and enlarged joint surface areas.
4) An extensible knee joint.
5) Platform foot in which enlarged big toe is in line with other toes.
6) A movement of the foramen magnum towards the centre of the basicranium.
On the face of it, one might wonder how what is in effect the re-engineering of the fundamentally horizontal tetrapod bauplan to function in a vertical mode ever happened (see Lewin and Foley, 2004).
In fact it isn’t as dramatic a shift as might be expected. Most primates can sit upright, many can stand unsupported and some can walk upright. Thus the human upright posture is best thought of as an expression of an ancient primate evolutionary trend; the dominant motif has always been an erect body (Napier, 1971). The trend moved through vertical clinging and leaping (prosimians) to quadrupedalism (monkeys and apes) to brachiation in apes. It did not involve transforming a true quadruped (e.g. a dog or horse) into a biped.
This does not explain why bipedalism arose.
Origins of Bipedalism
Darwin (in The Descent of Man, and Selection in Relation to Sex, 1871) and many others believed that the characters that mark out humans – intelligence, manual dexterity, tool-making and upright bipedalism – would give an ape an advantage over other apes, and thus they were selected for. But Darwin was misinterpreting his own theory. Features don’t evolve because of what they might be able to do in the future. For example, no matter how useful the human brain might have proved in writing Shakespeare’s plays, proposing the Theory of Relativity and piloting jumbo jets, brains didn’t evolve for these purposes. The first bigger-brained hominins couldn’t do any of these things; nevertheless bigger brains evolved, and an explanation must be sought in terms of what those earlier larger brains could do rather than what modern humans can do. Thus bipedalism probably didn’t evolve to free up hands for carrying or making things; these were merely useful spin-offs from a feature that evolved for some other reason.
In the 1960s the Man the Hunter theory was popular. Although bipeds are slower and less energy-efficient than quadrupeds at top speed, at lower speeds they possess greater stamina, which is useful for tracking and killing prey. The more recent Man the Scavenger picture has the superior endurance of bipeds being useful for following migrating herds and scavenging carcasses. One problem with these theories is that the stone tools needed for exploiting carcasses did not appear until well after bipedalism. Another difficulty is that tooth wear patterns on early hominins suggest they remained predominantly vegetarian until 2.5 mya.
The majority of more recent theories are to varying degrees tied up with the climate change that began around 10-5 mya. A change to cooler, drier conditions promoted grasses at the expense of trees and bushes in lower latitudes. Forest-dwelling creatures declined. The apes that had prospered 17-10 mya were very hard hit and many species became extinct, especially in Eurasia. By 6.5-5.0 mya the Antarctic ice cap was repeatedly draining the Mediterranean, depriving Africa and Eurasia of an important moisture source and accelerating the contraction of the forests. With trees more widely scattered, apes were forced to spend more time on the ground moving between them, and this may have encouraged bipedal locomotion and sparked the emergence of fully bipedal apes.
Theories fall into five main categories:
1) Improved predator avoidance through seeing farther than a quadruped across open plains.
2) More efficient thermoregulation.
3) Display or warning.
4) A dietary shift, such as seed eating or berry picking.
5) Carrying things.
The “Woman the Gatherer” theory dates to the 1970s. In this picture, society was based on females and their offspring, with males being peripheral. In the more open habitat resulting from the climate change, females had to travel further during foraging and often carried infants; bipedalism would have been an advantage. Another “carrying things” theory is “Man the Provisioner”, proposed in 1981 by Owen Lovejoy. This model has males gathering food and provisioning their partners and their offspring. With the male providing the food, females could breed at shorter intervals, giving them an advantage over other large hominoids. But pair bonding and monogamy are incompatible with the large degree of sexual dimorphism seen in early hominins.
The term sexual dimorphism simply refers to physical differences between the two sexes of a sexually-reproducing species. An extreme example is the angler fish, in which the tiny male attaches itself to the much larger female, and lives out the remainder of its life as a parasite, incapable of independent existence, serving only to fertilise the female.
Among the living primates, a strong correlation between mating strategy and sexual dimorphism has been found. In polygamous species, a male can get ahead in the competition for access to females by being bigger and more powerful than the other males. In such species, therefore, there is a selective pressure on males to be large, but no such pressure applies to the females. Thus, in a polygamous species, males tend to be larger than females. In monogamous species, however, males are under no such selective pressure to be large, and therefore tend to be the same size as the females.
The fossil record suggests that early hominins possessed significant sexual dimorphism, making any model assuming monogamy highly unlikely.
There are two theories focussing on posture rather than locomotion. One, due to Nina Jablonski, focuses on hominoid threat displays, where individuals stand erect in aggressive encounters.
Kevin Hunt, on the other hand, noted in field studies of chimps that 80% of bipedal behaviour was related to stationary feeding and only 4% to walking. He believes that bipedalism was initially a feeding adaptation that only later became a locomotor adaptation. Both these theories suggest standing upright preceded walking bipedally.
However, in 1980 Peter Rodman and Henry McHenry at the University of California at Davis suggested that bipedalism evolved simply in response to a change in the distribution of dietary resources. This explanation is more parsimonious, i.e. it involves fewer assumptions.
In the Late Miocene, hominoid dietary resources became more thinly dispersed, in some cases requiring a more energy-efficient way of getting about. This theory is based on a couple of simple observations. Firstly, human bipedalism is more energy-efficient than quadrupedalism at walking speeds, albeit less so at high speeds. Secondly, chimps are 50% less energy-efficient than regular quadrupeds on the ground, whether knuckle-walking or moving bipedally. Therefore, as the authors noted, “there was no energetic Rubicon separating hominoid quadrupeds from hominin bipeds”.
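To make the arithmetic behind the “no energetic Rubicon” point explicit, here is a rough sketch using only the approximate figures quoted above (the cost symbols are introduced purely for illustration): write $E_q$ for the locomotor cost of a typical terrestrial quadruped of similar body mass, $E_c$ for a chimpanzee on the ground and $E_h$ for a walking human. Then

$$E_c^{\text{knuckle-walking}} \approx E_c^{\text{bipedal}} \approx 1.5\,E_q, \qquad E_h < E_q < E_c \quad \text{(at walking speeds)}.$$

Because a chimp-like ape pays much the same cost whether it knuckle-walks or moves bipedally, taking up bipedal walking carried no immediate energetic penalty, and any subsequent anatomical refinement towards the more efficient human gait was a net saving.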
For bipedalism to evolve, selective pressure was required. More dispersed and otherwise unreachable food resources provided that pressure. Bipedalism enabled apes to boldly go where no ape had gone before.
The theory is attractive, but it contains the hidden assumption that the common ancestor of the African apes and humans was a knuckle-walker, which many doubt. Also, early hominins differ from modern humans in possessing significant degrees of arboreal adaptation; their bipedalism would have been less efficient than ours, notes Karen Steudel of the University of Wisconsin.
Lynne Isbell and Truman Young, also at the University of California at Davis, supported their colleagues’ theory and extended it, pointing out that another strategy would be to reduce the average daily travel distance by reducing group size (a smaller group needs to visit fewer dispersed food sources for everybody to satisfy their appetites), hence less travel for the group as a whole. This strategy was adopted by chimps. Thus the theory in its expanded form describes two possible evolutionary trajectories taken by hominids as the effects of climate change were felt.
Many theories have focussed on thermoregulation and the need to reduce exposure to the sun, particularly at noon. Peter Wheeler at Liverpool John Moores University sees bipedalism as a way of reducing exposure to the sun while foraging; hair loss and sweating are other such adaptations. Although the traditional view that early hominins lived in open woodland and savannah is now known to be incorrect (Ardipithecus ramidus and Australopithecus anamensis lived in wooded and possibly forested habitats), the need to keep cool while out in the open, moving between small forest patches, could have been a factor.
Recently, time and motion studies (Foley) have shown that a hominid would need to spend 60% of its time on terrestrial foraging before bipedalism would pay off energetically. This is more a case of “not quite half full” than “more than half empty”: it was not necessary to be 100% terrestrial – a hominin spending 40% of its time in the trees, and living in closed woodlands, could still find it worthwhile to make the switch.
References:
Cameron, D and Groves, C 2004: Bones, Stones and Molecules: “Out of Africa” and Human Origins, Elsevier Academic Press.
Groves, C 1991: A Theory of Human and Primate Evolution, Clarendon Press Oxford.
Klein, R 1999: The Human Career (2nd edition), University of Chicago Press.
Lewin, R and Foley, R 2004: Principles of Human Evolution (2nd edition), Blackwell Science Ltd.
Scarre, C (Ed) 2005: The Human Past, Thames & Hudson, London
© Christopher Seddon 2008
The idea that there was a now-extinct “missing link” between apes and present-day humans goes back to Darwin’s time, but the idea that it was a single species has long seen as being simplistic. We now know that maybe as many as a dozen species of human lived before the rise of Homo sapiens. But before the appearance of larger brained beings considered to be the first humans, 2.5 mya, there were apes with brains no larger than those of chimps that habitually (i.e. all the time) walked upright rather than on all fours – unlike any ape now living. It is these that are closest to the idea of a missing link – or of the not quite human “man-apes” described by the late Sir Arthur C. Clarke in the novel version of 2001: A Space Odyssey.
A key question to understanding the relationship between these fossil apes, humans and our closest living relatives the chimpanzees is to determine the time of genetic isolation between the latter two. Current estimates vary. Yang (2002) obtained a divergence time of 5.2 mya, with a 95% confidence interval from 4.6 to 6.1 mya. Kumar et al (2005) obtained a 5.0 myr divergence with a 95% confidence interval of 4.4-5.9 mya. However some reject these studies and claim that upright apes lived earlier.
What do you call an Upright Ape?
For many years, the term “Hominid” sufficed. The term was derived from the Linnaean Family Hominidae, which was taken to include modern and extinct humans (Homo sapiens, H. erectus, the Neanderthals, etc) and the upright apes, which were then generally contained in a single genus, Australopithecus. This latter grouping was split into two: “gracile” and “robust” australopithecines. The famous “Lucy” is an example of the former. The great apes (the two chimp species, gorillas and orang-utans) were banished to their own family, Pongidae. The two families were lumped together into a superfamily, Hominoidae, to which the term Hominoid could be applied.
However genetic studies in the mid-1970s showed that chimps were most closely related to humans, followed by gorillas with orang-utans a distant third. This strongly suggested a common African origin for humans, chimps and gorillas but made nonsense of the existing classification. Clearly humans and great apes belonged in the same family, and by the internationally-agreed rules of taxonomy, the more recently-described grouping, Pongidae, was “sunk” or subsumed into Hominidae. This meant the term “Hominid” could now mean a great ape, or indeed anything within the great ape clade, such as Sivapithecus (believed to be ancestral to the orang-utans).
To get round this problem and get back to a grouping that includes humans and only species more closely related to us than chimps, taxonomists have recently begun using the ranking of Tribe (which rather illogically comes below Family in the Linnaean scheme) Hominini, which gives us the term “Hominin”. Some still include the chimps in this grouping, though they are more usually given their own tribe, Panini.
The Hominins, then, include two broad groupings - habitual upright walking (biped) apes, and larger-brained humans. Note that Hominini is a subset of Hominidae: all hominins are also hominids.
Of the thirty or so hominin species that may have existed since the divergence with the chimpanzees, just one remains: Homo sapiens.
Defining features of the Hominins
A hominin is distinguished from other hominids by various adaptations for a terrestrial rather than arboreal lifestyle. They are all habitual bipeds, with the skull sited on top of the vertebral column. Unlike other primates, the feet are not prehensile and lack an opposable big toe. However the opposable thumb is well developed.
In addition to upright walking, the evolution of the hominins is often described in terms of three other new features: reduction of anterior teeth and enlargement of cheek teeth, elaboration of culture and a significant increase in brain size. These features arose at separate intervals, developing at different rates (an example of what is known as mosaic evolution). Tools appear at 2.5 mya; brain expansion does not occur until after 2 mya; but bipedalism is evidenced at 4 mya and may have existed in very first hominins known, which would make it the primary hominin adaptation. In other words even the earliest hominins were bipeds, but they had few other humanlike traits.
Bipedalism in Humans
While the great majority of tetrapods are quadrupeds, habitual bipedalism isn’t uncommon – many dinosaurs were bipedal, all birds (when not in flight or swimming) are and the macropods (kangaroos, wallabies, etc) are. Though other apes can and do walk upright from time to time, the hominins are the only tail-less habitual biped group.
Human bipedalism is a striding gait. The legs alternate between a swing phase and stance phase. During the stance phase, the knee is locked in the extended position, requiring little energy to support it. The femur slopes inward to the knee (valgus angle) and two feet are close to the body’s midline. The body’s centre of gravity doesn’t shift laterally very much during each phase of walking. The strong gluteal abductor muscles prevent the body from toppling over.
By contrast, chimps cannot extend knee joints to produce the straight leg in the stance phase and need to expend more energy to support body (try walking with your knees bent and you will get the idea). The femur does not slope inwards to the knee as much as in humans, the feet are therefore placed well apart and the gluteal abductors are not highly developed. Chimps accordingly waddle and this is exacerbated by their higher centre of gravity.
Chimps are a design compromise between tree-climbing and terrestrial (mainly knuckle walking) where as modern humans are fully adapted for terrestrial living. Earlier hominins retained some arboreal adaptations.
Humans have needed extensive anatomical modifications including:
1) Curved lower spine.
2) Shorter, broader pelvis and angled femur, reorganized musculature.
3) Lengthened lower limbs and enlarged joint surface areas.
4) An extensible knee joint.
5) Platform foot in which enlarged big toe is in line with other toes.
6) A movement of the foramen magnum towards the centre of the basicranium.
On the face of it, one might wonder how such a change, which is in effect the re-engineering of the fundamentally horizontal tetrapod bauplan to function in the vertical mode, ever happened (see Lewin and Foley, 2004).
In fact it isn’t as dramatic a shift as might be expected. Most primates can sit upright, many can stand unsupported and some can walk upright. Thus the human upright posture is best thought of as an expression of an ancient primate evolutionary trend; the dominant motif has always been an erect body (Napier, 1971). The trend moved through vertical clinging and leaping (prosimians), to quadrupedalism (monkeys and apes), to brachiation in apes. It did not involve transforming a true quadruped (e.g. a dog or horse) into a biped.
This does not explain why bipedalism arose.
Origins of Bipedalism
Darwin (in The Descent of Man, and Selection in Relation to Sex, 1871) and many others believed that the characters that mark out humans – intelligence, manual dexterity, tool-making and upright bipedalism – would give an ape an advantage over other apes, and thus they were selected for. But Darwin was misinterpreting his own theory. Features don’t evolve because of what they might be able to do in the future. For example, no matter how useful the human brain might have proved in writing Shakespeare’s plays, proposing the Theory of Relativity and piloting jumbo jets, brains didn’t evolve for these purposes. The first bigger-brained hominins couldn’t do any of these things; nevertheless bigger brains evolved, and an explanation must be sought in terms of what those earlier larger brains could do rather than what modern humans can do. Thus bipedalism probably didn’t evolve to free up hands for carrying or making things; these were merely useful spin-offs from a feature that evolved for some other reason.
In the 1960s the Man the Hunter theory was popular. Although bipeds are slower and less energy efficient than quadrupeds at top speed, at lower speeds they possess greater stamina, which is useful for tracking and killing prey. The more recent Man the Scavenger picture has the superior biped endurance being useful for following migrating herds and scavenging carcasses. One problem with these theories is that the stone tools needed for exploiting carcasses did not appear until well after bipedalism. Another difficulty is that tooth wear patterns on early hominins suggest they remained predominantly vegetarian until 2.5 mya.
The majority of more recent theories are to varying degrees tied up with the climate change that began around 10-5 mya. A change to cooler, drier conditions promoted grasses at the expense of trees and bushes in lower latitudes. Forest-dwelling creatures declined. The apes that had prospered 17-10 mya were very hard hit and many species became extinct, especially in Eurasia. By 6.5-5.0 mya the growth of the Antarctic ice cap was repeatedly draining the Mediterranean, depriving Africa and Eurasia of an important moisture source and accelerating the contraction of the forests. With trees more widely scattered, apes were forced to spend more time on the ground moving between them, and this may have encouraged bipedal locomotion and sparked the emergence of fully-bipedal apes.
Theories fall into five main categories:
1) Improved predator avoidance through seeing farther than a quadruped across open plains.
2) More efficient thermoregulation.
3) Display or warning.
4) A dietary shift, such as seed eating or berry picking.
5) Carrying things.
The “Woman the Gatherer” theory dates to the 1970s. In this picture, society was based on females and their offspring, with males being peripheral. In the more open habitat resulting from the climate change, females had to travel further while foraging and often carried infants; bipedalism would have been an advantage. Another “carrying things” theory is “Man the Provisioner”, proposed in 1981 by Owen Lovejoy. This model has males gathering food and provisioning their partners and offspring. With the male providing the food, females could breed at shorter intervals, giving them an advantage over other large hominoids. But pair bonding and monogamy are incompatible with the large degree of sexual dimorphism seen in early hominins.
The term sexual dimorphism simply refers to physical differences between the two sexes of a sexually-reproducing species. An extreme example is the angler fish, in which the tiny male attaches itself to the much larger female, and lives out the remainder of its life as a parasite, incapable of independent existence, serving only to fertilise the female.
Among the living primates, a strong correlation between mating strategy and sexual dimorphism has been found. In polygamous species, a male can get ahead of the competition in the access to females stakes by being bigger and more powerful than the other males. In such species, therefore, there is a selective pressure on males to be large. But no such condition applies to the females. Thus, in a polygamous species, males tend to be larger than females. However in monogamous species the males do not have any selective pressure to be large, and therefore tend to be the same size as the females.
The fossil record suggests that early hominins possessed significant sexual dimorphism, making any model assuming monogamy highly unlikely.
There are two theories focussing on posture rather than locomotion. One, due to Nina Jablonski, focuses on hominoid threat displays, where individuals stand erect in aggressive encounters.
Kevin Hunt, on the other hand, noted in field studies of chimps that 80% of bipedal behaviour was related to stationary feeding, and only 4% occurred while walking. He believes that bipedalism was initially a feeding adaptation that only later became a locomotor adaptation. Both these theories suggest that standing upright preceded walking bipedally.
However, in 1980 Peter Rodman and Henry McHenry at the University of California at Davis suggested that bipedalism evolved simply in response to a change in the distribution of dietary resources. This explanation is more parsimonious, i.e. it involves fewer assumptions.
In the Late Miocene, hominoid dietary resources became more thinly dispersed, in some cases requiring a more energy-efficient way of getting about. This theory is based on a couple of simple observations. Firstly, human bipedalism is more energy-efficient than quadrupedalism at walking speeds, albeit less so at high speeds. Secondly, chimps are 50% less energy-efficient than regular quadrupeds on the ground, whether knuckle-walking or moving bipedally. Therefore, as the authors noted, “there was no energetic Rubicon separating hominoid quadrupeds from hominin bipeds”.
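To make the logic concrete, here is a toy comparison in Python. The numbers are purely illustrative stand-ins for the relative relationships quoted above (chimp locomotion roughly 50% more costly than typical quadrupedalism, human walking somewhat cheaper); they are not figures from Rodman and McHenry’s paper.

```python
# Illustrative relative costs of transport at walking speed (arbitrary units,
# chosen only to mirror the relationships described above, not measured data).
COST_QUADRUPED = 1.0      # a typical mammalian quadruped of similar body size
COST_CHIMP = 1.5          # chimp on the ground: ~50% more costly, however it moves
COST_HUMAN_BIPED = 0.75   # assumed: human-style walking cheaper than quadrupedalism

# For an ancestor already paying chimp-like costs, a shift towards human-style
# bipedalism is downhill all the way - there is no energetic Rubicon to cross.
saving = (COST_CHIMP - COST_HUMAN_BIPED) / COST_CHIMP
print(f"Energy saved per unit distance by walking like a human: {saving:.0%}")
```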
For bipedalism to evolve, selective pressure was required. More dispersed and otherwise unreachable food resources provided that pressure. Bipedalism enabled apes to boldly go where no ape had gone before.
The theory is attractive, but it contains the hidden assumption that the common ancestor of African apes and humans was a knuckle-walker, which many doubt. Also, early hominins differ from modern humans in possessing significant degrees of arboreal adaptation; their bipedalism would have been less efficient than ours, notes Karen Steudel of the University of Wisconsin.
Lynne Isbell and Truman Young, also at the University of California at Davis, support their colleagues’ theory and have extended it, pointing out that another strategy would be to reduce average daily travel distance by reducing group size: a smaller group needs to visit fewer dispersed food sources for everybody to satisfy their appetites, so the group as a whole travels less. This, they argue, is the strategy adopted by chimps. Thus the theory in its expanded form describes two possible evolutionary trajectories taken by hominids as the effects of climate change were felt.
Many theories have focussed on thermoregulation and the need to reduce exposure to the sun, particularly at noon. Peter Wheeler at Liverpool John Moores University sees bipedalism as a way of reducing exposure to the sun while foraging, with hair loss and sweating as further adaptations. Although the traditional view that early hominins lived in open woodland and savannah is now known to be incorrect (Ardipithecus ramidus and Australopithecus anamensis lived in wooded and possibly forested habitats), the need to keep cool while out in the open, moving between small forest patches, could have been a factor.
Recent time and motion studies (Foley) suggest that a hominid would need to spend around 60% of its time foraging on the ground before bipedalism would pay off energetically. This is more a case of “not quite half full” than “more than half empty”: it was not necessary to be 100% terrestrial. A hominin that still spent 40% of its time in the trees, living in closed woodlands, could find it worthwhile to make the switch.
References:
Cameron, D and Groves, C 2004: Bones, Stones and Molecules: “Out of Africa” and Human Origins, Elsevier Academic Press.
Groves, C 1991: A Theory of Human and Primate Evolution, Clarendon Press, Oxford.
Klein, R 1999: The Human Career (2nd edition), University of Chicago Press.
Lewin, R and Foley, R 2004: Principles of Human Evolution (2nd edition), Blackwell Science Ltd.
Scarre, C (Ed) 2005: The Human Past, Thames & Hudson, London.
© Christopher Seddon 2008
Sunday, 25 May 2008
Highpoint
Located on top of Highgate Hill at one of the highest places in London (hence the name), Highpoint comprises two apartment blocks designed by the Russian émigré Berthold Lubetkin and constructed in two phases - Highpoint 1 in 1935 and the adjacent Highpoint 2 in 1938. It is a classic example of Modernist architecture, though local estate agents regularly display their ignorance by referring to apartments on their books as Art Deco.
View from the south, taken from adjacent carpark.
Another view from the south of the complex.
One of the two caryatids supporting the porch of Highpoint 2. Although completely out of keeping with the Modernist design of the building, they are an attractive feature.
A view of the front aspect of the complex.
View from the north, the only picture actually taken on the site.
© Christopher Seddon 2008
Faster than Light?
Probably the best known consequence of Einstein’s Special Theory of Relativity is that nothing can travel faster than light. Science fiction writers have tried to get round the problem in various ways, most notably the warp drive of Star Trek fame, but the reality was best summed up by the late Arthur C. Clarke in the notes accompanying his 1986 novel The Songs of Distant Earth, when he said that “...no Warp Six will ever get you from one star system to another in time for next week’s episode. The great Producer in the Sky did not arrange his programme planning that way”.
But why can’t we go faster than light? For instance, what would happen if we were travelling at the speed of light and then went a bit faster? The answer turns out to be that only light can travel at the speed of light; anything else – no matter how fast it tries to go – can never quite reach, much less exceed, the speed of light.
This was the conclusion that Einstein came to when he considered relative velocities – that is to say the speed and direction of travel of one object relative to another. Hence – the Theory of Relativity. Before we come to this, however, let us look at the speed of light itself.
To paraphrase the late Douglas Adams, light is fast. Mind-bogglingly fast. Light is so fast that it was for a long time believed that its speed must be infinite – switch on a sufficiently bright light on, say, Mars, and it would become instantly visible from Earth. That this was not the case was demonstrated in an ingenious fashion by the Danish astronomer Ole Christensen Roemer. Roemer’s demonstration arose out of Galileo’s attempt to solve the vexed “longitude problem” of determining the longitude of a ship at sea. All attempts had been confounded by the lack of any means of establishing the time of day with any degree of accuracy once a ship was out of sight of land. Galileo hit on the idea of timing the eclipses of Jupiter’s four principal moons. The moons regularly disappear behind the giant planet in a completely predictable fashion: thus the Jovian system could in effect be used as a clock.
Unfortunately the method proved impractical due to the difficulties of making the required observations from a ship at sea. Nevertheless in the 1660s and 1670s several astronomers began making observations with a view to compiling an ephemeris for the use of ships’ captains. These included Roemer and his French colleague Jean-Felix Picard at Hven, near Copenhagen, and Giovanni Cassini, who was observing from Paris. Cassini had noticed discrepancies in his observations - when Jupiter was closer to Earth the eclipses would occur earlier than expected: conversely, when it was further away they would be late. This he correctly attributed to light having a finite speed: the further away the Jovian system was, the longer the light from the eclipsed moon would take to reach Earth, and the later the eclipse would appear to be.
Cassini, however, did not pursue the matter until 1672, when Roemer went to Paris and began working as his assistant. Roemer made further observations and eventually determined that the discrepancy amounted to between ten and eleven minutes (the actual value is just over eight minutes). Cassini published the results in 1675 in a short paper and Roemer, using data collected by Picard and himself, published a more detailed treatment a year later. Using currently accepted values for the distances of Earth and Jupiter from the Sun, which were not known with great accuracy in Roemer’s day, Roemer’s observations yield a value of about 227,000 km per second for the speed of light, rather less than the currently-accepted value of 299,792.5 km per second. However, Roemer himself never attempted to calculate an actual value.
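A minimal sketch of the arithmetic behind that figure, assuming the modern value for the Earth-Sun distance and an eleven-minute delay (neither number is Roemer’s own):

```python
# A Roemer-style estimate of the speed of light: light appears to be "late" by
# roughly the time it takes to cross the Earth-Sun distance.
AU_KM = 149.6e6      # Earth-Sun distance in km (modern value, assumed here)
DELAY_S = 11 * 60    # the ~11 minute discrepancy mentioned above, in seconds

print(f"Estimated speed of light: {AU_KM / DELAY_S:,.0f} km/s")              # ~227,000 km/s
print(f"Using the true ~8.3 minute delay: {AU_KM / (8.3 * 60):,.0f} km/s")   # ~300,000 km/s
```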
Despite this strong evidence, it would be another 50 years before it became generally accepted that light has a finite speed. In the 1720s, the third Astronomer Royal, James Bradley, discovered and then explained a phenomenon now known as stellar aberration. Bradley and his friend and colleague Samuel Molyneux were attempting to measure the so-called parallax of the star Gamma Draconis. Parallax is the apparent motion of an object when seen from two viewpoints against a distant background. If, for example, I look at a lighthouse visible against a background of stars from my hotel window on a fine night, then decide to go down to the beach for a midnight swim, and there see the lighthouse from a different angle, it will appear to have moved against the background stars. Two things will determine the apparent movement (the lighthouse has of course in reality remained stationary): the distance between my hotel room and the beach, and the distance between myself and the lighthouse (which we assume to be large in comparison, say three kilometres versus a couple of hundred metres). We don’t know how far away the lighthouse is, but we do know the distance between the hotel room and the beach, and we can measure the apparent movement – or parallax – of the lighthouse. Armed with this information, simple trigonometry can be used to calculate the distance to the lighthouse.
Bradley and Molyneux were attempting to measure the distance from the solar system to Gamma Draconis using a similar method, based on the fact that as the Earth goes round the Sun, the star would be viewed from a slightly different angle and so should show a parallax against more distant objects. Because the distance from the Earth to the Sun (the so-called astronomical unit) is tiny in relation to the distance to even the nearer stars, the parallax shown will generally be tiny and very difficult to measure. Bradley and Molyneux were unsuccessful in their attempts to determine the distance of Gamma Draconis by this method (it is actually too far away for the parallax method to be practical), but they did pick up an unexplained “wobble” in its position.
Molyneux died in 1728, but Bradley carried on the work on his own and explained the results in terms of the speed of the Earth as it moves round the Sun, combined with the finite speed of light. The speed of light is not infinite compared with the Earth’s orbital velocity, and when the two are combined the result is a small apparent displacement of distant celestial objects from their true positions. After the passage of a year, objects return to their original positions. This is the wobble Bradley and Molyneux observed for Gamma Draconis. The maximum displacement of an object by this effect is small – about 20.5 seconds of arc – but it was within the resolving capacity of the instruments of that era. Bradley’s observations yield a value of 298,000 km per second for the speed of light, much closer to the accepted value.
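The size of the effect follows from the ratio of the two speeds. A quick sketch, assuming the modern figure of roughly 29.8 km per second for the Earth’s orbital speed (a number not quoted above):

```python
import math

C_KM_S = 299_792.5    # speed of light in km per second, as quoted in the text
V_EARTH_KM_S = 29.8   # Earth's mean orbital speed in km per second (assumed modern value)

# For v much smaller than c, the maximum aberration angle is roughly v/c radians.
angle_arcsec = math.degrees(V_EARTH_KM_S / C_KM_S) * 3600
print(f"Aberration constant: {angle_arcsec:.1f} arcseconds")   # ~20.5
```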
Although attempts go back to the early 17th Century, it was not until 1849 that Earthbound techniques were first successfully used to measure the speed of light. In that year, the French physicist A.H.L. Fizeau used an apparatus consisting of a beam of light directed at a mirror several kilometres away. The beam passed through a rotating cogwheel which was spun at a rate whereby the beam would pass through one gap on the way out and another on the way back. If the rate of rotation was slightly higher or lower, the beam would be blocked by a tooth on its return. Based on the rate of rotation of the wheel and the distance to the mirror, the speed of light could be calculated. Fizeau obtained a figure of 313,000 km per second. Leon Foucault refined the method, substituting a rotating mirror for the cogwheel, and obtained a better estimate of 298,000 km per second. Further refinements by the Americans A.A. Michelson and Simon Newcomb eventually yielded a figure of 299,860 +/- 60 km per second.
In 1887 Michelson collaborated with E.W. Morley in one of the most important experiments in the history of science at the Case Western Reserve University in Cleveland, Ohio.
In the late 19th Century it was widely believed that just as sound waves require a medium such as air through which to propagate, so light waves must require some form of medium, referred to as the luminiferous aether, in order to propagate. Because light could propagate through a vacuum, it was assumed that the vacuum of space must be permeated with this substance. Because waves propagate through a medium at a constant speed, the speed of light should be fixed in relation to the aether. Also, because the Earth is in motion around the Sun, its movement through the aether should manifest itself in variations in the speed of light as measured in different directions (think of the aether as a “wind” blowing against the Earth; the speed of light should vary depending on whether it is measured in a direction facing directly into the wind or away from it).
Michelson and Morley devised an apparatus known as an interferometer, in which a single beam of light is split by a half-silvered mirror and later recombined by mirrors after the two portions have been sent in different directions along two arms of equal length, 90 degrees apart. By using a coherent light source (one in which all the waves were in phase), any difference in travel times caused by the motion against the aether should show up as interference patterns, where waves either reinforce each other (constructive interference) or cancel out (destructive interference). It would then be possible to calculate the variations in the speed of light and from this, the speed and direction of the Earth’s movement through the aether.
But surprisingly, absolutely no difference was detected. The speed of light was the same regardless of what direction it was measured in. The experiment was repeated at different times of the year to rule out the possibility that Earth might happen to be stationary with respect to the aether at a certain time of the year: if so, it could not be six months later when it would be moving in the opposite direction. But still no differences were found. The speed of light remained resolutely immutable.
The paradox implied here can be illustrated by considering an express train passing through a station at night. Suppose a woman in the train gets up from her seat to go to the refreshment car at the head of the train. She walks past her fellow passengers at a brisk 5 km per hour. The train passes through the station at 100 km per hour. A man standing on the platform sees the woman on the train. He concludes she is passing at 100 + 5 = 105 km per hour, i.e. the sum of the train’s speed plus her own speed along the carriage.
All very straightforward, but now suppose that the man on the platform (who is in fact none other than that Einsteinian folk-hero, the stationary observer) measures the speed of light coming from the train and that from the platform lighting. Applying the same logic as before, one would expect the former to be approaching at 100 km per hour faster. This is of course an infinitesimal difference, but one that is well within the range of a present-day undergraduate physics lab.
But that isn’t what happens if the conclusions of the Michelson-Morley experiment are accepted. The speed of light, regardless of how fast the source is travelling in relation to the observer, is always the same old 299,792.5 km per second. This even applies to the light from distant galaxies, moving away from us at significant fractions of the speed of light.
In 1905 Einstein proposed what is now known as the Special Theory of Relativity in one of his four so-called Annus Mirabilis papers, published in the scientific journal Annalen der Physik. Einstein postulated that all frames of reference were equivalent: thus it was equally valid for the man on the station platform to claim that the woman on the train was moving at 105 km per hour; for other passengers to claim that the woman was moving at 5 km per hour; and for the woman herself to claim that she was stationary, but the other passengers were moving at 5 km per hour and the man on the platform was moving at 105 km per hour! Einstein claimed that there was no physics experiment one could perform that would distinguish between the three frames of reference. He also claimed that this included measuring the speed of light, which would always come out at 299,792.5 km per second.
One of the consequences of this is that velocities are not additive. The 100 + 5 = 105 km per hour calculation above is not, strictly speaking, correct. The actual speed is ever so slightly less than 105 km per hour. The difference is negligible at such low speeds, but becomes ever more significant as the speed of light is approached. Thus if the train was moving at 95% of the speed of light and the woman was moving down the carriage at 15% of the speed of light, the man on the platform would not measure the woman’s speed at 95% + 15% = 110% of light speed but a speed given by the formula:
S = (V + U) / (1 + (V/C) * (U/C)), where V = speed of the train, U = speed of the woman and C = speed of light.
With V = 0.95, U = 0.15 and C = 1, S comes out at 1.1 / (1 + 0.95 * 0.15) ≈ 0.96, i.e. still less than light speed.
The correctness of this formula implies that superluminal speed is impossible, since even if U and V are arbitrarily close to C, S will still come out at less than C. Also, if U and V are small in relation to C (as in the first example), the approximation S = U + V holds to all intents and purposes.
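A minimal sketch of this velocity-addition rule in Python (the function name and the choice of km per second as the default unit are mine, not anything from Einstein’s paper):

```python
def add_velocities(v, u, c=299_792.5):
    """Relativistic combination of two speeds v and u (same units as c)."""
    return (v + u) / (1 + (v * u) / (c * c))

C = 299_792.5  # speed of light in km per second, as quoted in the text

# The train example: 100 km/h plus 5 km/h comes out a whisker under 105 km/h.
train, walker = 100 / 3600, 5 / 3600            # convert km/h to km/s
print(add_velocities(train, walker) * 3600)     # 104.99999999999... km/h

# The relativistic example: 0.95c plus 0.15c is still below the speed of light.
print(add_velocities(0.95 * C, 0.15 * C) / C)   # ~0.96, not 1.10
```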
That velocities are not additive has further consequences. Physicists define momentum as the product of mass and velocity, and while there is an upper limit on velocity, there is no upper limit on momentum. This implies that as an object approaches the speed of light, its mass increases. The effect has been verified in the laboratory – it has been shown that so-called relativistic particles in an accelerator are more massive than their stationary counterparts. At light speed a particle would have infinite mass, thus to reach the speed of light would require an infinite amount of energy – another way of saying that faster-than-light speeds are impossible.
Finally the energy required to produce the mass increase can be calculated. There is a direct equivalence between mass and energy, which extends to an object at rest, the energy equivalent of which is the rest mass multiplied by the square of the speed of light.
The relationship is usually expressed by that most familiar of equations, E = mc².
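A short numerical illustration of both points, using the standard Lorentz factor (which the discussion above alludes to but does not write out):

```python
import math

C_KM_S = 299_792.5  # speed of light in km per second, as quoted in the text

def lorentz_factor(v_km_s):
    """Factor by which mass (and energy) grows at speed v_km_s."""
    return 1.0 / math.sqrt(1.0 - (v_km_s / C_KM_S) ** 2)

# Relativistic mass grows without limit as the speed approaches C.
for fraction in (0.5, 0.9, 0.99, 0.999):
    gamma = lorentz_factor(fraction * C_KM_S)
    print(f"{fraction:.3f} c -> mass increased by a factor of {gamma:.1f}")

# Rest-mass energy, E = mc^2, in SI units (kilograms and metres per second).
mass_kg = 1.0
c_m_s = 299_792_500.0
print(f"1 kg at rest is equivalent to {mass_kg * c_m_s ** 2:.2e} joules")
```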
© Christopher Seddon 2008
Saturday, 24 May 2008
Bligh
The Museum of Garden History in Lambeth, South London, is based in the deconsecrated church of St Mary’s at Lambeth. The church was originally due to be demolished but was spared when it was discovered that two noted 17th Century Royal Gardeners and botanists, John Tradescant and his son, also called John, were buried in the churchyard. This inspired John and Rosemary Nicholson to have the church converted into the world’s first museum dedicated to the history of gardening.
The churchyard’s most famous resident, however, is none other than Vice-Admiral William Bligh:
Bligh was a local man; the terraced house in which he lived with his wife Elizabeth is a short distance away. A blue plaque marks the house, drawing attention to the event for which he is inevitably, and perhaps unfortunately, best remembered:
The mutiny aboard the armed vessel HMS Bounty is one of the most infamous episodes in the history of the Royal Navy and has been the subject of innumerable books and films – most of which are hopelessly inaccurate, with the 1962 movie starring Trevor Howard and Marlon Brando as probably the worst offender.
William Bligh – who held the rank of Lieutenant at the time of the 1789 mutiny – was certainly not the sadistic bully of popular imagination. Like his mentor James Cook, Bligh took very seriously the welfare of his men on long and difficult sea voyages. Far from being a tyrant, he was, if anything, too lax: in an era of brutal discipline at sea, he tended to avoid floggings, only ordering them in circumstances where other captains might well have ordered hangings (such as when three men tried to desert at Tahiti).
Tasked with transporting breadfruit plants from Tahiti to the West Indies (for the admittedly-ignoble purpose of providing a cheap source of food for the slaves there), Bligh and his crew endured a long and difficult voyage to Tahiti, lengthened to ten months after a failed attempt to round Cape Horn, which forced them to sail east across the Atlantic and Indian Ocean in order to reach the Pacific. The delay meant a stopover of five months at Tahiti to wait for the breadfruit plants to ripen.
During the stay at Tahiti Bligh allowed the men to live ashore, which actually went against the common naval practice of the day whereby men returning from a lengthy voyage could be compulsorily transferred to another ship without the opportunity of even setting foot on land. Unfortunately the men spent the stay enjoying the company of the beautiful and uninhibited women of Tahiti and were less than enthusiastic when the time came to resume the voyage.
The mutiny – led by Bligh’s friend and former shipmate Fletcher Christian – broke out shortly after the Bounty sailed from Tahiti. Bligh was overpowered, tied up and bundled into the ship’s longboat. Eighteen men joined him of their own volition; a further four who remained loyal were detained on the Bounty by the mutineers because they were needed to help work the ship.
Cast adrift with very little food and water, no charts or compass, and only a sextant and a pocket watch to navigate with, Bligh and his men found themselves in a position that must have seemed all but hopeless. In fact they were to successfully undertake one of the most remarkable voyages in the annals of the sea.
Making first for Tofua, an island in the Tongan archipelago thirty nautical miles away, they were attacked by natives and one man was killed. Bligh then decided to sail for the then-Portuguese colony of Timor, over 3600 nautical miles away, a journey that would require them to navigate the treacherous Torres Strait. The voyage took 47 days and the privation Bligh and his men must have endured in the cramped 23-foot open boat can be but imagined. No further lives were lost on the voyage, but it took its toll on the men and an 18th Century colony in distant waters was probably not the best of places for recuperation. Five men died before the group could be repatriated to England.
Meanwhile Christian and his fellow mutineers returned to Tahiti, loaded up with women and after landing 16 men, including the four loyalists, sailed off in the hope of evading the long arm of the Navy. They came across the uninhabited Pitcairn Island by accident – it was wrongly marked on the Royal Navy’s charts, and for this reason they decided to settle there. Their descendants live there to this day.
Most of the men who had remained on Tahiti were later rounded up by Captain Edward Edwards of HMS Pandora, who had been ordered to retrieve the Bounty and bring the mutineers to justice. Making no attempt to distinguish between mutineers and loyalists (despite being aware that four men had remained loyal to Bligh), the atrocious Edwards confined the men in irons in a deckhouse (dubbed Pandora’s Box by its inmates), then left them to drown after running his ship aground in the Torres Strait (which Bligh had successfully negotiated without charts or compass). Fortunately they were freed by a compassionate crewman at the last minute, though they were weighed down by their chains and four of them still drowned. The four loyal men were among the survivors, and on reaching England they were cleared by Bligh’s testimony. Three of the mutineers were hanged.
The questions as to the cause of the mutiny continue to the present day. Moving on from the myths perpetuated by Hollywood, one of the milder accusations thrown at Bligh is that he was a poor man-manager. It is true that Bligh did not suffer fools gladly, and tearing his subordinates off a strip in front of the crew was certainly not good management practice. It is also said that – in modern parlance – he was prone to micromanage. But if people repeatedly fail to carry out what is asked of them to a satisfactory standard, what other choice is there? Were Bligh’s officers up to the job?
Before the voyage of the Bounty, Bligh’s application for promotion to Commander was turned down by the Admiralty. This meant he could not recruit commissioned officers to his senior staff and had to make do with NCOs, including Christian. These men probably lacked the qualifications and experience for the Bounty’s tough assignment.
The main problem was the breakdown of discipline at Tahiti. Can Bligh, albeit the expedition’s commander, be blamed for this, or were the odds stacked against him?
Not only did Bligh’s senior staff fail to set a good example ashore – Christian “married” one of the local women – but there was no contingent of Marines aboard to maintain order. Bounty was too small for them to be accommodated. In truth, she was unfit for purpose. So the blame for the mutiny can really be pinned on the Admiralty who not only refused Bligh promotion, but handed him an inadequate ship for the voyage.
Bligh and Edwards were court-martialled for the loss of their ships, a standard practice at the time. Both men were acquitted.
Bligh’s naval career was not adversely affected by the Bounty debacle, and he subsequently attained the rank of Vice Admiral. However he remained dogged by controversy and while Governor of the then-British colony of New South Wales, he was deposed by rebels and kept under house arrest for two years.
© Christopher Seddon 2008
Tuesday, 20 May 2008
Thames Sunset
Sunday, 11 May 2008
Architecture at London Zoo
Opened in 1828, London Zoo is the world's oldest scientific zoological garden. From the beginning, renowned architects have been hired to work on new buildings, and in addition to its extensive animal collection, the zoo site hosts many structures of outstanding architectural merit. Currently there are two Grade I and eight Grade II listed buildings on the Regent's Park site.
The Lubetkin-designed Penguin Pool was built in 1934 and is now a Grade I listed building, but it is no longer considered suitable for penguins and has been empty for some years. It is to be hoped that one day a way will be found to return this Modernist classic to use, without affecting its architectural integrity in any way.
Also listed is the former Elephant and Rhino Pavilion, a fine example of Brutalist architecture designed by Sir Hugh Casson. It opened in 1964. The pavilion now houses smaller animals, including pigs, camels and a number of birds.
Also of note are the Mappin Terraces, a man-made mountain landscape completed in 1914, which housed bears for many years. They are currently closed for renovation (this picture was taken in 2003). Below the terraces is the aquarium, opened in 1924 by King George V. In my childhood, this was the highlight of any visit to the zoo, but sadly it is now very dilapidated. A wonderful original 1920s exploded diagram of the aquarium can still be seen within.
© Christopher Seddon 2008
Sunday, 4 May 2008
Last and First Men (1930) & Star Maker (1937), by Olaf Stapledon
Two books written in the 1930s by the Liverpool-born author and philosopher Olaf Stapledon together comprise the greatest work of science fiction ever written. The Encyclopedia of Science Fiction claims that Stapledon's influence on the development of science fiction ideas is "probably second only to that of H.G. Wells", and I would dispute that even Wells can be ranked higher. Last and First Men and Star Maker surpass even Asimov's Foundation Trilogy and Clarke's Childhood's End and The City and the Stars. The efforts of others pale into insignificance by comparison.
The two are self-contained works (they cannot be described as "novels" in the traditional sense of the word), but Star Maker is clearly intended as a sequel to Last and First Men and twice refers, albeit briefly, to the earlier work.
Last and First Men begins at a time shortly after World War I. The first four chapters describe a series of wars, fought with chemical and biological weapons, which eventually result in a world dominated by America. The resulting society endures for four millennia, then collapses as supplies of fossil fuel run out.
The next part of the book describes the fall of the First Men - modern Homo sapiens (this is of course incorrect: on even the most economical of schemes we are the fourth human species, after Homo habilis, H. erectus and the Neanderthals). After a brief renaissance, the human race is all but annihilated when a nuclear power plant gets out of control and sets off a chain reaction that devastates the Earth. A handful of survivors on an expedition to the North Pole come through the initial catastrophe, but mankind remains in eclipse for ten million years until the Earth is again fully habitable, when a new, more highly-evolved species, the Second Men, appears.
This noble race produces several civilisations, which rise and fall over the course of a quarter of a million years before achieving a stable world community, which is unfortunately destined to be short-lived. Disaster overtakes it in the form of invaders from Mars. These Martians are very different to the clichéd bug-eyed monsters normally associated with the Red Planet; described in great detail, they are life, but certainly not as we know it.
Here, as again in Star Maker, Stapledon demonstrates an ability to envisage and describe intelligent life-forms utterly unlike humans that has never come close to being equalled, let alone surpassed.
The wars between Earth and Mars rage for tens of thousands of years. Stapledon describes not only the effect of the wars on the material culture of the Second Men, but also on their collective state of mind. The wars finally end in a Pyrrhic victory for the Second Men when a bacteriological weapon is devised which annihilates the invaders but is barely less lethal to mankind, of which only a tattered remnant remains to start again.
A hiatus of thirty million years ensues before a new species, the Third Men, appears. Much smaller and shorter-lived than their predecessors, they produce a great diversity of cultures, some enduring for as long as a quarter of a million years, until the biological sciences advance far beyond those of the Second Men and at length it is decided to produce a giant brain.
After several failed attempts, a sessile brain in a forty-foot diameter reinforced concrete turret becomes the first of the Fourth Men, who eventually enslave the Third Men, then set out to produce a new species, the Fifth Men. These long-lived beings, of greater stature and intellectual capacity than even the Second Men, come inevitably into conflict with their creators, and though the book passes over these events, the Great Brains and their slaves are annihilated.
Surpassing anything that has come before it, the civilisation of the Fifth Men endures for millions of years, but is eventually threatened with destruction as the Moon spirals in towards the Earth (in actual fact the Moon is receding from Earth, but Stapledon would not be the last author to make this mistake). In what must be one of the earliest ever accounts of terraforming a planet, the book describes how Venus is transformed into a new home for Mankind, albeit an unsatisfactory one.
Man's sojourn on Venus, however, "lasted somewhat longer than his whole career on the Earth", and gets off to an unhappy start when the indigenous Venusians have to be exterminated to make way for the newcomers. Rather questionable justifications are advanced for this planetary-scale genocide. A period of some 500 million years sees the passage of three human species: the degenerate survivors of the original migration (the Sixth Men); the winged Seventh Men; and finally the Eighth Men. Eventually Mankind is forced to take flight again, this time to Neptune, when a gas cloud collides with the Sun, greatly increasing its luminosity.
On Neptune, the new species specially designed to live in Man's new home, the Ninth Men, falls rapidly into animality. For millions of years, Neptune is populated by sub-human descendants of the Ninth Men, but eventually intelligence does return to the planet. However, not until 600 million years after the solar collision does a superior species, the Fifteenth Men, appear. Thereafter, Mankind advances steadily to "true humanity" in the Eighteenth, and last, human species, which is finally destroyed, ending the story of the human race, when the Sun goes nova after being disrupted by violent disorders taking place in a nearby star.
Incredibly, this gargantuan epic is dwarfed in scale by Star Maker: Mankind's story turns out to be a mere footnote in the history of the galaxy; he is to play no part in the Galactic Society of Worlds.
A man sitting on a suburban hill finds his disembodied mind soaring into interstellar space. After learning how to control his headlong flight through time and space, he comes to rest on the World of the Other Men, humanoids who existed on a distant planet a billion years before the time of Homo sapiens. There he enters the mind of Bvalltu, one of the Other Men, and through him learns much about the society of the Other Earth, which is in terminal decline. The two embark on another journey through space, encountering increasingly bizarre lifeforms along the way, some of whom join the growing band of travellers.
Intelligent mollusc-like creatures that have evolved into living sailing-ships; human echinoderms; symbiotic arachnoids and ichthyoids are only some of the extraordinary aliens the travellers meet, and this is only the beginning of their journey as they go on to learn that planets, stars and entire galaxies are themselves alive. Finally, in the "supreme moment of the cosmos", they come face to face with the Star Maker, who has created and destroyed one universe after another in a relentless drive for perfection.
The two works are really a series of linked essays on the culture, art, science, history and philosophy of human and non-human civilisations utterly unlike our own. Almost any could be used as the starting point for a full-length novel; indeed many have. Time and time again, one sees where the idea for such and such an SF story originated - including some very well-known works.
This is not really surprising. Sir Arthur C. Clarke said of Last and First Men that "no book before or since ever had such an impact on my imagination" and Brian Aldiss described Star Maker as "the most wonderful novel I have ever read". Fellow Liverpudlian Stephen Baxter has also claimed Stapledonian inspiration for his superb epic Evolution, published in 2003.
It is safe to say that nothing comparable with these two monumental works will ever be written again. The fictional narrator of Star Maker constantly refers to his sense of utter inadequacy when it comes to describing the wonders he has experienced, and I can only admit to feeling exactly the same way in attempting to write about Last and First Men and Star Maker.
© Christopher Seddon 2008