This week's edition of the podcast is dedicated to the Sense About Science Lecture 2013, given by the sci-fi writer and web activist Cory Doctorow.
Cory's lecture was entitled "We get to choose: How to demand an internet that sets us free" and was delivered to an invited audience at The Institution of Engineering and Technology on 13 May.
To find out more about Cory Doctorow's writings go to his website craphound.com.
We're always here when you need us. Listen back through our archive.

Alok Jha, Jason Phipps and Cory Doctorow
With snow still lying on the Cairngorm plateau in Scotland, this year has started well for ptarmigans, mountain hares and blaeberry plants. These species thrive in arctic conditions, and a decent covering of spring snow aids their survival.
Beneath the snow lies the 'subnivium', a special habitat in which plants, insects, reptiles, amphibians and mammals all take advantage of warmer temperatures, near-constant humidity and an absence of biting winds. However, Jonathan Pauli, from the University of Wisconsin-Madison, and his colleagues are concerned about signs that the subnivium is retreating.
Writing in the journal Frontiers in Ecology and the Environment, they say that since 1970 snow cover across the northern hemisphere has diminished by over five per cent during the critical spring months of March and April. Maximum snow cover now occurs in January instead of February, and the spring melt starts almost two weeks earlier than it used to.
Such changes, they warn, could spell trouble for species that thrive in the subnivium. For example, an earlier snow melt will bring hibernating reptiles and amphibians out of their slumber too early, putting them at risk of late spring storms and drops in temperature. Meanwhile, plants that lose their snowy blanket may suffer tissue damage from direct exposure to freeze-thaw cycles. Up on the Cairngorm plateau milder winters may enable red grouse to usurp ptarmigan from their homes. But for this year at least, the ptarmigans' main worry is whether the snow will melt in time for them to raise their chicks.

Kate Ravilious
But a forecast global temperature rise of 4C heralds disaster for large swaths of the planet, with the oceans absorbing most of the warming
Some of the most extreme predictions of global warming are unlikely to materialise, new scientific research has suggested, but the world is still likely to be in for a temperature rise of double that regarded as safe.
The researchers said that warming was most likely to reach about 4C above pre-industrial levels if the past decade's readings were taken into account.
That would still lead to catastrophe across large swaths of the Earth, causing droughts, storms, floods and heatwaves and with drastic effects on agricultural productivity leading to secondary effects such as mass migration.
Some climate change sceptics have suggested that because the highest global average temperature yet recorded was in 1998, climate change has stalled. The new study, which is published in the journal Nature Geoscience, shows a much longer "pause" would be needed to suggest that the world was not warming rapidly.
Alexander Otto, at the University of Oxford, lead author of the research, told the Guardian that there was much that climate scientists could still not fully factor into their models. He said that most of the recent warming had been absorbed by the oceans, but this would change as the seas heat up. The thermal expansion of the oceans is one of the main factors behind current and projected sea level rises.
The highest global average temperature ever recorded was in 1998, under the effects of a strong El Niño, a southern Pacific weather system associated with warmer and stormy weather, which oscillates with a milder system called La Niña. Since then, the trend of average global surface temperatures has shown a clear rise above the long-term averages – the 10 warmest years on record have been since 1998 – but climate sceptics have claimed that this represents a pause in warming.
Otto said that this most recent pattern could not be taken as evidence that climate change has stopped. "Given the noise in the climate and temperature system, you would need to see a much longer period of any pause in order to draw the conclusion that global warming was not occurring," he said. Such a period could be as long as 40 years of the climate record, he said.
Otto said the study found that most of the climate change models used by scientists were "pretty accurate". A comprehensive global study of climate change science is expected to be published in September by the Intergovernmental Panel on Climate Change, its first major report since 2007.
Jochem Marotzke, professor at the Max Planck Institute for Meteorology in Hamburg and a co-author of the paper, said: "It is important not to over-interpret a single decade, given what we know, and don't know, about natural climate variability. Over the past decade, the world as a whole has continued to warm, but the warming is mostly in the subsurface oceans rather than at the surface."
Other researchers also warned that there was little comfort to be taken from the new estimates – greenhouse gas emissions are rising at a far higher rate than had been predicted for this stage of the 21st century, and are set to rise even further, so estimates of how much warming is likely will also have to be revised upwards.
Richard Allan, reader in climate at the University of Reading, said: "This work has used observations to estimate Earth's current heating rate and demonstrate that simulations of climate change far in the future seem to be pretty accurate. However, the research also indicates that a minority of simulations may be responding more rapidly towards this overall warming than the observations indicate."
He said the effect of pollutants in the atmosphere, which reflect the sun's heat back into space, was particularly hard to measure.
He noted that the climate sensitivity to a doubling of carbon dioxide concentrations inferred from this new study – a rise of 1.2C to 3.9C – was consistent with the range of 2.2C to 4.7C from climate simulations. He said: "With work like this, our predictions become ever better."

Fiona Harvey
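As a rough illustration of how those two quoted ranges compare, the sketch below (not from the study; purely illustrative) computes the interval over which an observation-based range and a simulation-based range overlap – the claim of consistency amounts to this overlap being non-empty:

```python
# Illustrative only: compare the two climate-sensitivity ranges quoted
# in the article (all values in degrees C).
def overlap(a, b):
    """Return the overlapping interval of two (low, high) ranges, or None."""
    low, high = max(a[0], b[0]), min(a[1], b[1])
    return (low, high) if low <= high else None

observed = (1.2, 3.9)   # range inferred from observations in the study
simulated = (2.2, 4.7)  # range from climate simulations

print(overlap(observed, simulated))  # (2.2, 3.9) – the ranges are consistent
```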
New temperature norms under climate change will increase weather-related deaths in metropolitan areas in coming decades
New York city could experience up to 22% more deaths from extreme summertime heat in the coming decade under global warming, according to a study of the impact of climate trends.
The rise in deaths will be partially offset by a fall in deaths due to the milder winters predicted for Manhattan.
Overall, however, the net effect of the new temperature norms under climate change would be to increase weather-related deaths in New York city by up to 6.2% a year by the 2020s, according to the scientists.
The study, published in Nature Climate Change, predicted oppressive summer temperatures would exact an increasingly heavy toll on people living in metropolitan areas such as Manhattan in the coming decades.
The numbers would not be significantly offset by milder winters, the study found, and deaths due to extreme temperatures would rise more dramatically in the later decades of this century.
Without bold action to cut greenhouse gas emissions, heatwave deaths in New York city could rise by as much as 91% on 1980s levels by the 2080s, according to the study's projections. The net rise in temperature-related deaths would be as much as 31% on 1980s levels, the study said.
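The seasonal trade-off the study describes – more heat deaths, fewer cold deaths, a positive net change – can be sketched with a toy calculation. The baseline counts below are hypothetical (the study reports percentages, not absolute numbers), so this only illustrates the arithmetic, not the study's data:

```python
# A toy illustration of the seasonal trade-off: heat deaths rise,
# cold deaths fall, and the net change depends on their relative sizes.
def net_change(heat_deaths, cold_deaths, heat_rise, cold_fall):
    """Net fractional change in temperature-related deaths.

    heat_rise and cold_fall are fractions, e.g. 0.22 for 22%.
    """
    baseline = heat_deaths + cold_deaths
    new_total = heat_deaths * (1 + heat_rise) + cold_deaths * (1 - cold_fall)
    return (new_total - baseline) / baseline

# Hypothetical baseline: 600 heat deaths and 400 cold deaths a year;
# heat deaths rise by 22%, cold deaths fall by 10%.
print(round(net_change(600, 400, 0.22, 0.10), 3))  # 0.092, a 9.2% net rise
```

The point the sketch makes is the study's own: because heat deaths already outnumber cold deaths in New York, a percentage rise in the former is not cancelled by a comparable percentage fall in the latter.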
"This is the first real study of the seasonal trade-off of climate change," Patrick Kinney, a professor of environmental health sciences at Columbia University and one of the authors of the study, said.
Kinney added: "What our study suggests is that the heat effects of climate change dominate the winter warming benefits that might also come: climate change will cause more deaths through heat than it will prevent during winter."
The findings, based on computer projections of future climate and their impact on mortality, give a city-scale picture of the public health challenges posed by future climate change. The scientists used a set of 16 computer models to arrive at their findings.
The conclusions debunk the popular notion put forward by climate sceptics that warmer temperatures would benefit public health.
As the study notes, even under current conditions, there are more deaths due to extreme heat than to extreme cold in New York city every year.
Last year, which brought the hottest summer since record-keeping began in the US, saw a string of days on which the temperature hit more than 37.7C (100F) in a number of US cities.
The week-long heatwave killed 82 people, according to figures compiled by the Associated Press.
In large metropolitan areas, such as New York, the impact of those temperature extremes is compounded by densely built-up areas. Cities such as Chicago, Cincinnati, Philadelphia and St Louis have also recorded sharp rises in deaths due to heart attacks and strokes during heatwaves, according to the draft of the National Climate Assessment, which was released last year.
"Urban heat islands, combined with an ageing population and increased urbanisation, are projected to increase the vulnerability of urban populations to heat-related health impacts in the future," the assessment said.
Kinney said he hoped the findings would push city planners in New York and other large urban areas to step up preparations for hotter and deadlier summers.
New York city has already begun efforts to cool the city during the summer, encouraging tree-planting programmes and setting new building standards.
Other cities also routinely set up "cooling centres", with cots and air conditioning, to allow people relief from the heat. Kinney said city officials also needed to target poor, elderly or disabled residents who are confined in hot and airless apartments during heatwaves.
"How can we reach out to people who are stuck in their apartments trying to ride out the events? We have to try to target vulnerable people," he added.Suzanne Goldenberg
Mice and lizards to undergo tests in Moscow after a month in orbit, in study into effects of weightlessness on cell structure
A Russian capsule carrying mice and lizards has returned to Earth after spending a month in space.
Scientists say the experiment is intended to test the effects of weightlessness and other factors of space flight on cell structure.
Russian state television showed the capsule and some of its inhabitants after it landed safely in a planted field near Orenburg on Sunday. The report said not all of the animals survived the flight.
Vladimir Sychov, deputy director of the Institute of Medical and Biological Problems, said it was the first time that animals had spent so much time in space on their own. The mice and lizards are to be flown back to Moscow to undergo a series of tests at the institute.
Artist Susumu Nishinaga has used a scanning electron microscope to delve deep into the fabric of petals, leaves and pollen
Prof James Hansen rebukes oil firms and Canadian government over stance on exploiting fossil fuel, which he says would make climate problem unsolvable
Major international oil companies are buying off governments, according to the world's most prominent climate scientist, Prof James Hansen. During a visit to London, he accused the Canadian government of acting as the industry's tar sands salesman and "holding a club" over the UK and European nations to accept its "dirty" oil.
"Oil from tar sands makes sense only for a small number of people who are making a lot of money from that product," he said in an interview with the Guardian. "It doesn't make sense for the rest of the people on the planet. We are getting close to the dangerous level of carbon in the atmosphere and if we add on to that unconventional fossil fuels, which have a tremendous amount of carbon, then the climate problem becomes unsolvable."
Hansen met ministers in the UK government, which the Guardian previously revealed has secretly supported Canada's position at the highest level.
Canada's natural resources minister, Joe Oliver, has also visited London to campaign against EU proposals to penalise oil from Alberta's tar sands as highly polluting. "Canada can offer energy security and economic stability to the world," he said. Oliver also publicly threatened a trade war via the World Trade Organisation if the EU action went ahead: "Canada will not hesitate to defend its interests."
The lobbying for and against tar sands has intensified on both sides of the Atlantic as the EU moves forward on its proposals, which Canada fears could set a global precedent, and Barack Obama considers approving the Keystone XL pipeline to transport tar sands oil from Canada to the US gulf coast refineries and ports. Canada's prime minister, Stephen Harper, was met by protesters when he visited New York last week to tell audiences that KXL "absolutely needs to go ahead".
Canada's tar sands are the third biggest oil reserve in the world, but separating the oil from the rock is energy intensive and causes three to four times more carbon emissions per barrel than conventional oil. Hansen argues that it would be "game over" for the climate if tar sands were fully exploited, given that existing conventional oil and gas is certain to be burned.
"To leave our children with a manageable situation, we need to leave the unconventional fuel in the ground," he said. Canada's ministers were "acting as salesmen for those people who will gain from the profits of that industry," he said. "But I don't think they are looking after the rights and wellbeing of the population as a whole.
"The thing we are facing overall is that the fossil fuel industry has so much money that they are buying off governments," Hansen said. "Our democracies are seriously handicapped by the money that is driving decisions in Washington and other capitals."
The EU aims to penalise oil sources with higher carbon footprints, as part of a drive to reduce the carbon emissions from transport called the fuel quality directive (FQD). But Canada, supported by the UK, is fiercely opposed: "We are not saying they should not move to reduce emissions," said Oliver. "But the proposed implementation of the FQD is discriminatory to oil sands and not based on scientific facts." However, Europe's commissioner for climate action, Connie Hedegaard, said the FQD was "nothing more, nothing less" than accurate labelling and putting a fair price on pollution.
Hansen, who informed the US Congress of the danger of global warming in 1988, has caused controversy before by saying the "CEOs of fossil fuel companies should be tried for high crimes against humanity" and calling coal-fired power plants "factories of death". In April, he stepped down from his Nasa position after 46 years, in order to spend more time communicating the risks of climate change and to work on legal challenges to governments.
Hansen has started a science programme at Columbia University, the first task of which is to produce a report to support suits filed against the US federal government and several state governments. It is being pursued by the Our Children's Trust charity and is based on a trust principle recognised in US law.
"We maintain that the atmosphere and climate are held in trust by the present generations for the future generations and we do not have the right to destroy that asset," Hansen said. "Therefore the courts should require the government to give a plan as to how they are going to ensure that we still have that asset to pass on to the next generation."Damian Carrington
Polly Morland's study of bravery is executed with energy, curiosity – and courage
You might think an inquiry into the nature of courage a trumped-up excuse for a book but Polly Morland loses no time in persuading you otherwise. She approaches her subject with energy, tenacious curiosity and, however much she may protest that she is lily-livered, courage. The Society of Timid Souls was based in New York in the 1940s and run by Bernard Gabriel, a professional concert pianist, who, in his Manhattan apartment, helped performers counter stage fright. During recitals, musicians were heckled and, through surviving this ordeal, would locate in themselves the stern stuff of which performers need to be made. I'd have thought an audience's silence might be scarier still but evidently not.
Morland skips lightly where angels fear to tread. Her book has astonishing range. There is an especially wonderful encounter with Rafaelillo, a Murcian matador (involving a masterclass with a fake bull); an austere audience with David Alderson, uncannily brave bomb disposal expert; and a cheery chat with 50-something Sally Ann Sutton, mauled by a mad rottweiler in her determination to save a baby from his jaws. But this is only the tiniest sample of Morland's interviewees. In every case, she proves the liveliest company: sane, merry and undeceived. But the intriguing thing is that the more she focuses on courage, the more elusive it becomes. Not many will admit to having the quality.
One of the most memorable passages describes the Iranian earthquake in Bam, 2003. It is remembered in nightmarish detail by Ruth Millington, a former high-flying lawyer, whose heroic efforts to dig people out of the wreckage saved lives. Yet when congratulated on her bravery, she is bemused: "I never even felt like it was a choice." When non-timid souls are put on the spot, this is their most common rejoinder.
Morland scrutinises the question of choice and considers animal courage in this light. She interviews representatives of the PDSA who give animals awards for bravery. She quotes Byron's fond tribute to his stout-hearted Newfoundland dog. But she remains unseduced. Instead, she wonders: can courage be courage when an animal has no choice? On a visit to St Christopher's Hospice, there is no choice about what is ahead – but there may be a choice about how to face it. Morland movingly alludes to her father holding her hand, at the end of his life, as though to express "the extremity in which he now found himself, as one might hold on to a vital scrap of paper or a £50 note in a high wind."
This is writing of unusual, sympathetic precision. And speaking of high winds, she also writes about people who throw caution to the winds, waves and to dizzying heights. She interviews star surfer Greg Long, eccentric French spiderman Alain Robert and superhuman Dean Potter whose flying feats are never inconvenienced by his lack of feathers. And what these encounters make one realise – the book's most interesting implication – is that the death wish and life wish are so close as to be almost, yet never quite, interchangeable. GK Chesterton helps this idea along – he defined courage in 1908 as "a strong desire to live taking the form of a readiness to die".
What one wonders is: what does the courting of danger say about the value a person places on his life? At Wootton Bassett, Morland finds soldiers who, characteristically, prefer not to dwell on courage at all. However, Colonel Tim Collins, OBE and former SAS commander, hauntingly explains: "Every time you prepare to die, you die a bit and you never get that back." And what emerges elsewhere is that pretending to have courage can be the same as having it. What, after all, is courage without fear?
It is also clear throughout this bracing, moving and uncommon book that performance nerves are not about to fade away. Morland seeks out orchestral players (an alarming number) calming their nerves with beta blockers. "I don't know why I have such a sense of panic about it," says viola player Ken Mirkin of the New York Philharmonic. It sounds as though the time may have come to reconvene the Society of Timid Souls.
Compiled by an ardent bibliophile, this week's report includes The Eponym Dictionary of Amphibians; Megafauna: Giant Beasts of Pleistocene South America; and a Photoalbum of the Birds of Uzbekistan; all of which were recently published in North America and the UK.
Books to the ceiling,
Books to the sky,
My pile of books is a mile high.
How I love them! How I need them!
I'll have a long beard by the time I read them.
– Arnold Lobel (1933-1987), author of many popular children's books
Compiled by Ian "Birdbooker" Paulsen, the Birdbooker Report is a weekly report that has been published online for years, listing the wide variety of nature, natural history, ecology, animal behaviour, science and history books that have been newly released or republished in North America and in the UK. The books listed here were received by Ian during the previous week, courtesy of various publishing houses.
New and Recent Titles:
This piece was assembled by Ian "Birdbooker" Paulsen and formatted by GrrlScientist.
Ian "Birdbooker" Paulsen is an avid book collector who is especially well-known to the publishing world. Mr Paulsen collects newly-published books about nature, animals and birds, science, and history, and he also collects children's books on these topics. Mr Paulsen writes brief synopses about these books on his website, The Birdbooker Report.
GrrlScientist
UK growers adopt specialist computer forecasting system to help improve yields of crops whatever the weather
As Britain steels itself for the prospect of yet another washout summer, strawberry growers are finding themselves forced to come up with increasingly sophisticated ways of assessing the threat posed to their livelihoods by inclement weather.
For fruit growers, predicting the weather is vital. Weather causes fruit yields to vary by as much as 70%, making for an erratic growing season if poor conditions are not anticipated.
The last few years have seen a glut of strawberries arrive during rainy periods, when demand was limited. Conversely, this has meant a shortage of the fruit in some parts of the country at peak times – for example, outside London when Wimbledon fortnight started last year.
But as the year's first crop of British field-grown strawberries goes on sale this weekend, growers have a new hi-tech weapon in their armoury. The biggest growers are using a state-of-the-art forecasting system that allows them to predict the yields in individual fields.
The specialist technology compares historical yield curves, the recorded effect weather has on the crop and the planting date of the strawberries in their respective locations. The information is then fed into a computer along with long-term weather forecasts, specific growing data for some of the 600 varieties of strawberry produced in the UK, and growth charts for each field.
The new system is helping the UK's biggest growers, who are responsible for producing around 20,000 tonnes of strawberries – a third of the annual UK crop. It involves field visits up to three times a week, when light levels and plant growth are recorded. The collated information has helped growers accurately determine when to plant their crops to ensure yields mature throughout the season, "smoothing out" the supply of strawberries to the supermarkets.
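The core of the approach described above – scaling a field's historical yield curve by expected growing conditions – can be sketched in a few lines. Everything here is hypothetical (the article does not disclose the system's actual model, variables or numbers); it only illustrates the kind of calculation involved:

```python
# A minimal sketch of a per-field yield forecast: a historical weekly
# yield curve is scaled by a single weather factor. Real systems would
# combine planting dates, variety data and light levels, per the article.
def forecast_yield(historical_curve, weather_factor):
    """Scale each week of a historical yield curve (tonnes per week)
    by a weather factor, where 1.0 represents average conditions."""
    return [round(week * weather_factor, 2) for week in historical_curve]

# Hypothetical field: weekly yields from past seasons, with a cool,
# wet spring expected to cut yields by 20%.
past_season = [1.0, 2.5, 4.0, 3.0, 1.5]
print(forecast_yield(past_season, 0.8))
```

Summing such per-field forecasts across planting dates is, roughly, how a grower could "smooth out" supply: fields planted at staggered dates peak in different weeks, so the aggregated curve is flatter than any single field's.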
Although the vast majority of British strawberries are grown under polytunnels, their yields are heavily influenced by dank, cold conditions.
"For the last couple of years a glut of strawberries arrived during a rainy spell when demand wasn't so high," said Paul Jones, a strawberry buyer for Tesco.
"As a result we got together with some of the UK's biggest strawberry growers and suppliers to discuss bringing in technology that could help them plan their planting programmes more accurately. Now, with the aid of computer technology and leading weather prediction data, we will be able to process and analyse forecasted strawberry volumes down to individual field level."
The hi-tech approach is a new way of harvesting one of the most venerated, historic fruits. In medieval times strawberries were regarded as an aphrodisiac and a soup made of strawberries, borage and soured cream was served to newlyweds at their wedding breakfast.
An initial trial of the new system involving a small number of growers last year was found to be around 95% accurate, enough to convince large-scale producers of the need to use the new technology. Growers hope it will spell an end to the problems they experienced last season when a very wet spring and poor light levels were followed by the wettest summer for more than a century.
Securing a steady supply is likely to pay dividends for retailers. Demand for strawberries – which were first cultivated by the Romans in 200 BC – continues to increase every year, according to industry figures. The industry predicts an 8% rise in tonnage this summer compared with 2012 and estimates that between 60,000 and 65,000 tonnes will be produced by British growers.
But the prospect of a glorious summer in which to enjoy strawberries looks a forlorn hope. Early indications, such as last week's snow flutters in Shropshire and Devon, suggest that we may be in for a similar summer to last year.

Jamie Doward
More than 40 million people globally take an SSRI antidepressant, among them many writers and musicians. But do they hamper the creative process, extinguishing the spark that produces great art, or do they enhance artistic endeavour?
Twenty-five years after pharmaceutical giant Eli Lilly unleashed Prozac on the red-braced 80s, SSRIs are still the world's most popular antidepressants. They are swallowed by more than 40 million people, from Beijing to Beirut, knitting a web of happiness from New York to New Caledonia. Selective serotonin reuptake inhibitors, of which Prozac is the best known, are the defining drug of the modern age, the crutch of choice for the worried well. In the US, where one in 10 takes antidepressants, you can buy beef-flavoured Prozac for your dog, trademarked Reconcile. The Prozac revolution has not only changed the way we think about depression (aided by Eli Lilly's mammoth advertising campaign); it has also changed the way we think, full stop.
In his 1993 book Listening to Prozac, the psychiatrist Peter D Kramer explored the ethical issues around the rise of what he termed "cosmetic pharmacology". With a daily pill people could now banish social awkwardness or the unhappiness of relationship break-ups, forge brassily assertive personae from their once shy selves. Like the Soma of Aldous Huxley's Brave New World, Prozac was making people "better than well". Kramer wrote of the "personality transformations" that occurred in a substantial minority of those taking the drug, briefly pausing to speculate as to what impact this might have had on their creativity. While we know, thanks to Kay Redfield Jamison's Touched with Fire, that poets are up to 30 times more likely to suffer from bipolar disorder than the national average, we have no idea how or if the pills they take to treat the disease affect their creative output.
The French writer Henry de Montherlant said that happiness writes white. For me that whiteness was the colour of a 20mg Cipralex pill – a close cousin of Prozac – taken at the breakfast table. With the depthless chemical happiness of the drug, a thin layer of snow seemed to fall over my mind, blocking access to strong feeling, cutting me off from the hidden impulses that drove me to write. Sometimes I did feel "better than well", but more often I was haunted by the uncanny feeling that I was skimming over the surface of my life. Looking back, those Prozac years have a curious, occluded feel, as if viewed through a gauze.
To celebrate the drug's quarter-century, I spoke to other writers, artists and musicians who have taken SSRIs, trying to establish whether they have been a bane or a boon for our collective creativity. I've deliberately concentrated on the arts, rather than the sciences. This is partly because, while we've all seen Carrie Mathison in Homeland and John Nash in A Beautiful Mind, there is significantly more literature on artists and writers taking antidepressants than on chemists and economists. It's partly because the arts are my bailiwick: I'm not on "are you on drugs?" terms with that many scientists.
We expect our artists to be, in Baudelaire's words, touched by "a breath of wind from the wings of madness". In his book Poets on Prozac, Richard Berlin speaks of "an entire generation of writers who became famed for the dramatic excesses of their psychiatric disorders". Sylvia Plath sits at the head of a pantheon of artists who took their own lives – Virginia Woolf, Alexander McQueen, Ernest Hemingway, David Foster Wallace – and who battered their bodies into submission with drugs and booze (see also Roberto Bolaño, Amy Winehouse, F Scott Fitzgerald, Billie Holiday). It's easy to agree with Dryden when he says, "Great wits are sure to madness near allied, / And thin partitions do their bounds divide."
From Heinrich Heine to Edvard Munch, many resisted treatment for their depression, fearing a loss of creative urges. When offered psychotherapy, the poet Edward Thomas replied: "I wonder whether for a person like myself whose most intense moments were those of depression, a cure that destroys the depression may not destroy the intensity – a desperate remedy?" Sigmund Freud – who also killed himself – argued that artistic creativity is a product of neurosis. We deal with the conflicts in our subconscious by making objects out of them. If this, grossly simplified, is the theory behind the link between mental illness and creativity, then the worry for artists is that in banishing their black dogs they are also dousing the flames of inspiration, blunting the edge of their genius.
Creativity and pharmacology have a troubled past. Chloral hydrate, used as a sedative in the first half of the 20th century, left patients feeling sapped and sluggish. The playwright Antonin Artaud accused it of lowering his "mental water level", causing a "diminution of my morality and my intellect". He finally died of an overdose of the drug. In an unpublished letter discovered in 2001, Ted Hughes revealed that Sylvia Plath was taking a monoamine oxidase inhibitor (MAOI) in the days leading up to her suicide. She'd had a negative reaction to a similar drug as a teenager and in the letter, Hughes blames the MAOI and the doctor who prescribed it for her death.
Plath's antidepressant was remarkably similar to Nardil, the drug with which David Foster Wallace struggled for many years. Making little headway with the novel that would be published, incomplete, after his death as The Pale King, Wallace began to wean himself off Nardil. His biographer, DT Max, said "he thought that removing the scrim of Nardil might help him see a way out of his creative impasse". Instead, he remained blocked and, as his friend Jonathan Franzen put it, "when his hope for fiction died, after years of struggle with the new novel, there was no other way out but death".
This is not the essay in which to debate in depth the efficacy of SSRIs. Irving Kirsch claims – to my mind convincingly – in The Emperor's New Drugs that their benefits have been substantially overstated. What is clear is that their side-effects have not. Apart from stifling the libido, SSRI use has consequences that are particularly significant for artists. A 2009 study by Oxford University, published in the British Journal of Psychiatry, found that those taking SSRIs reported "a general reduction in the intensity of the emotions that they experienced". They described themselves as feeling "dulled", "numbed", "flattened", or "blocked". If poetry is (as Wordsworth claimed) "the spontaneous overflow of powerful feelings… emotion recollected in tranquillity", then could Prozac bring artists too little feeling, too much tranquillity?
I spent most of my 20s on SSRIs of one sort or another. I was a difficult teenager, expelled from school and lurching from one illegal chemical high to the next. I was prescribed Prozac in the wake of one particularly manic episode and continued to take it on and off for eight years. My GP at university persuaded me to quit for a while, but when I moved to London I found a pharmacy that would sell me my SSRI of choice over the counter, no questions asked. What should have been a temporary buttress ended up forming part of the architecture of my young life.
Writing on SSRIs was like swimming in mud. Words came slowly or not at all; emotions were perceived as if at a great distance, alien and remote. Even at a sentence-by-sentence level, I was aware of a certain lag in my writing, a syntactic sluggishness – the imprint of a brain that was failing to catch up with itself. I missed the hectic moods of my teens where I'd write great (I mean clearly terrible, but great in my mind) stories on my father's ancient Amstrad, caught up in the flow of words. Fuddled and frustrated, I quit writing altogether and didn't start again until I'd given up the pills.
In a recent Radio 4 documentary, Will Self considered the legacy of Prozac's first 25 years on the planet. What he didn't say on air, but admitted to me in a subsequent email, was that he'd had his own run-in with SSRIs. I'd mentioned "Inclusion", a surreal story in his book Grey Area that satirises the psychopharmacological brouhaha surrounding Prozac. "I was prescribed Seroxat (I believe wrongly)," he wrote in reply, "to help me with withdrawals from a bad crack habit (what's a good crack habit?). After being on it a couple of weeks, I borderline intentionally took a heroin overdose and nearly died... so, I have a negative view of the drugs." Self, however, didn't blame the SSRIs for obstructing his artistic flow: "Heroin, cocaine, marijuana and alcohol were really the drugs that ended up fucking my creativity; the Seroxat was just a way station on the escape ramp to abstinence."
Other writers identified with the creative hamstringing I'd experienced on SSRIs. The novelist Amanda Craig was an early adopter of Prozac in Britain. Suffering from profound depression, she found SSRIs unhelpful, even damaging, despite the brief lift they gave to her mood. "Prozac enabled me to function, but dulled everything," she told me, "including the shafts of joy that gradually pierce depression. It changed who I was and that included who I was as a writer." She finally stopped taking the pills and turned her experience of depression into a bestselling novel, In a Dark Wood.
Children's author Lucy Coats is another who found herself blocked by SSRIs. "I've been depressed all my life," she told me, "but it came to a head with postnatal depression after my second child. I was badly depressed and my doctor put me on Seroxat." Although the drugs offered some relief from her symptoms, it was at a heavy price – her creativity. "I took it for six months and I felt as if I was walking through this grey world, with all the joy totally stripped out of it. I could feel neither happy nor sad. It was absolutely vile. As a writer, I need to feel emotion of some kind. The creative spark was completely extinguished for me. I had a deadline and I had to ask the publisher to give me more time because I could not write. Everything I wrote was kind of lumpy, disgusting clay and I couldn't shape it into anything."
It's not just authors who have suffered creatively from the effects of SSRIs. I spoke to my brother, Sam, better known as Preston from the Ordinary Boys. Or, if we're honest, better known for going on Celebrity Big Brother and marrying Chantelle Houghton, one of his fellow housemates. He's since forged a successful songwriting career. I knew he'd been on Prozac throughout his time in the Celebrity Big Brother house and asked him how it affected him – creatively and otherwise.
"More than anything," he told me, "it made me really sweaty. And it seems a banal thing, but it was debilitating, particularly as it was a time I was in the public eye. As for creativity, Prozac just makes you a bit 'Yeah, OK, fine, whatever' about stuff. You lose the inner critic. And that goes for life as well as art. I got married to someone I'd met on a TV show and didn't really know. I think if it hadn't been for the haze of the drug, I might have made better decisions."
I can relate to this (and not just because he's my kid brother). With my creative blockage came what I later identified as a kind of moral blockage. Because actions didn't feel like they had consequences – in that nothing seemed able to shock me from the pallid world the drugs had wrapped about me – I pushed myself into more and more extreme situations, desperate for a spark of authentic feeling. I was haunted by the sense that I was living in the third person. This inability to feel implicated in my actions had its own creative repercussions – the characters in my novels seem to lack agency, are buffeted by forces beyond their control (as several reviewers have pointed out). I gave Charlie Wales in This Bleeding City a Valium addiction, but actually what I was describing was life on SSRIs: "With dead eyes and dead hands, I navigated the world. On the way to work in the mornings I pressed a pill into the furry lining of my cheek and felt it melt, bitter and comforting as I sat on the fusty orange seats of the tube and watched flares of electricity light up the darkness of tunnels. I had stopped reading. Instead, I just watched."
For other artists, Prozac has been a life belt thrown as they drowned in a sea of depression. In an exchange of letters with the historian Roy Porter, Zoë Heller speaks of how, after taking Prozac, "I stopped lying in bed in the middle of the day. I stopped crying all the time. I began to entertain visions of my future that were, if not entirely rosy, then at least not entirely gloom-laden." The original Prozac pin-up, Elizabeth Wurtzel, is another who claims to have been rescued by the drug (although a careful reading of her memoir Prozac Nation might give the credit to the rather less zeitgeisty lithium).
Wurtzel's book has not aged well – it is stuck in the 90s, po-faced and narcissistic. It lacks the note of authenticity that characterises the best books about mental illness. Wurtzel is also unsure exactly how she feels about the drug. At one point she gushes, "Prozac was the miracle that saved my life." Several pages later, though, she admits that "the secret I sometimes think that only I know is that Prozac really isn't that great". Writing about depression is difficult precisely because it is a disease that strips us of words, of narrative. One of the most impressive works on the subject is by the Welsh poet Gwyneth Lewis. Her memoir, Sunbathing in the Rain, joins Lewis Wolpert's Malignant Sadness and William Styron's Darkness Visible, three books sent back by emissaries from deep within the abyss of depression. Gwyneth Lewis is another who benefited greatly from Prozac.
When we first met a couple of years ago at a writing retreat in Norfolk, Lewis was literally wearing rose-tinted spectacles, but the world didn't always have such an optimistic hue. After a serious bout of depression, she found herself incapacitated, a ghost in her own life. Sunbathing in the Rain is her description of journeying into and, eventually, out of her despair, during which time SSRIs offered "some psychic space, a small but crucial distance between me and the horrors". I asked her about her experience of writing on the drugs.
"When I get ill, I get so ill I can't write at all," she told me. "I don't work when I'm wretched, I work when I'm happy. The antidepressants offered a pathway to effective working." But there were drawbacks. She stopped taking the pills during a sailing trip with her husband, finding that they rendered her spaced-out and unreactive (and a poor sailor to boot). "I was distanced and dissociated… I'd see a rock coming towards us and I just wouldn't move." She was also aware that the loss of sex drive so common to SSRI users had creative repercussions. "Part of what you feel as a poet is libido towards language. Being on these drugs will change your language use because they change who you are."
For Lewis it was a decision between writing on Prozac or not writing at all. For Keeril Makan, the choice was rather different. One of America's most celebrated young composers, he struggled for years with a depression that would often find vivid reflection in his work. He describes his music as "informed, almost viscerally, by my depression", and spiky, atonal pieces such as The Noise Between Thoughts attack the listener with a bleak physical force. Finally, though, he reached a point at which he had to step away from the darkness. "Although I was still composing," he told me, "it was such an excruciating process and was putting me in contact with these really difficult emotional places. I couldn't go on with my daily life. I was creating music I was happy with and people were interested in, but I had to live as well."
He started taking antidepressants and meditating and found that his music gained a new depth as he dragged himself out of his depression. "Being on the antidepressants does change the type of emotions I'm experiencing," he said, "but I think they can be just as interesting. If anything, this helps the composing. I was working on an opera recently and I don't think I could have written it before. I was too one-dimensional, emotionally. Things were just dark but now there's both – dark and light." I confessed to admiring the raw power of his early work and he chuckled. "It's true that I'm not as fully immersed in darkness as previously, but I guess I don't care, because I couldn't keep doing that. It was a question of living, or creating this music that was negative and violent. I made my choice."
It shows how little we understand of the functioning of the brain's neurochemistry and SSRIs' effect upon it that a pill that may cause blockages (as it did in my own case) has also been prescribed as a cure for writer's block. In a Late Show documentary aired in 1995, the psychiatrist and author Oliver James gave five artists Prozac to see what effect it would have on their creative output. Two of them – the New Order frontman Bernard Sumner and the poet Alan Jenkins – were blocked when filming began. Sumner, who was working on his Electronic side project with Johnny Marr at the time, was afflicted by a hyper-critical internal voice, and said that the process of writing lyrics was "like breaking a horse". As he wrote, he'd hear repeated in his head: "You can't do this, you can't do this."
I spoke to James about the effect of SSRIs on writer's block. "What the film showed," he told me, "was that once you removed the depression – and Prozac did seem to do that, whether by placebo or not – people could write. When I first met Bernard Sumner he was clearly blocked and by the end of it he'd written some lyrics." There was a hitch, though. "What I couldn't say on the documentary was that he may have done some work, but I'm not sure that it was any good." This seems to be one of the problems with the use of SSRIs to free up the creative impulse. While, as Gwyneth Lewis said, it's very difficult to write during periods of intense depression, it may be that we need to be a bit down on ourselves in order to produce good work.
James agrees. "On Prozac you become more confident, you're less aware of other people's feelings, less worried about what other people might think about you, you're more able to act as opposed to [being] self-absorbed and stuck. You may be talking crap, producing crap, but you don't care and just press on. And that's a real change of personality for some creative types – to stop caring what other people think. It's a dangerous game."
We begin to recognise the precarious high-wire act that most creative depressives undertake, trapped between the unbearable pain of their illness and the equally unbearable blockages brought about by their medication – walking Dryden's "thin partitions". We need the critical voices in our heads (mine is that of a reviewer who gave my second novel a mauling on Radio 4), but they mustn't swamp us with their carping and condemnation. In Touched with Fire, Kay Redfield Jamison looked at manic depressive artists who took lithium, a drug which "inhibits creativity so that the individual is unable to express himself". She found that, overwhelmingly, the artists either gave up the drug or reduced their dosage "in hope of achieving a kind of controlled cyclothymia [mood swings], willing to take the undulations of power and imbecility in exchange for periods of high enthusiasm and flowing thoughts".
In this essay, I've deliberately only quoted artists who would let me use their names in print. This is partly because, post-Leveson, we know that "a close friend" means the journalist made it up, but also because I think it's important that the subject be addressed in the open. One thing that has struck me while researching this piece, though, is the sheer number of artistic friends and acquaintances who have taken Prozac – some of whom agreed to be quoted, some who preferred to remain incognito. I mentioned that I was writing this article on Twitter and was contacted by a host of creative types keen to share their experiences – positive or (more usually) negative – of working on SSRIs. This is far from a clinical survey, but it does feel like our creative industries are smoothing the jagged surfaces of their lives with SSRIs in astonishing – even epidemic – numbers.
My conversation with my brother confirms this impression. "Everyone in music is on Prozac," he says. "It's like it's part of the job description." We know from toxicology reports that Michael Jackson, Michael Hutchence, Heath Ledger and Brittany Murphy were taking Prozac (although for them it was but one of a heady concoction of drugs), while stars such as Sheryl Crow, Robbie Williams and Olivia Newton-John have spoken about their reliance on SSRIs.
"It's partly to do with the stress of the business," my brother tells me. "If you're really successful you have little time to yourself, you're having to sleep when and if you can, you don't have much control of your life. And if you're playing a gig in Tokyo on Friday, you can't commit to therapy, to sitting down once a week and talking through your problems. You never know where you'll be one week to the next, so you just take a pill and get on with it."
There's another factor in the celebrity antidepressant narrative – doctors. "There's a kind of understanding you come to," my brother tells me. "Because most people in the music industry use private doctors and it was certainly the case with me that I went to this one doctor because I knew I'd get the drugs I wanted. I was paying and she knew that if she didn't write the prescription I'd just go elsewhere." Certain doctors would gain a reputation for being particularly laissez-faire with their prescriptions. "I don't think it was necessarily that they were corrupt or anything," my brother says. "It was more that the only people they saw were these neurotic actors and musicians. Now I see an NHS doctor and she's having all sorts in her surgery so when I come in moaning she's just like, 'Come on now, pull yourself together, you'll be fine.'"
One of the effects of the Prozac revolution has been an increasing acceptance that mental illness is caused by chemical imbalances in the brain, a simplified standpoint that has been reinforced by the press and celebrity commentators. In a 2011 Larry King Live interview, Jim Carrey came out with some exemplary bio-babble, both meaningless and pernicious: "Certain elements of the brain like tyrosine and hydroxytryptophan… instead of being a serotonin inhibitor, which just uses the serotonin you have and Prozac and things like that. It just uses the serotonin you have and it doesn't allow it go back into the receptor. But it metabolises your serotonin after a while and you have to keep taking more and more to feel good. This actually creates dopamine and creates serotonin."
Bolstered by heavy drug company spending, the message has been put out there: the brain is an organ like any other; treat depression as you would a stomach upset or broken ankle. This narrative misses the extraordinary complexity of the brain and the very limited understanding we have of its operations. The neurotransmitters which are influenced by SSRIs are intricate and multivalent – indeed the role of these neurotransmitters in the control of mood was only discovered by accident when examining the effect of the anti-psychotic Thorazine on the brain's chemistry. In her Prozac Diary (1998), Lauren Slater referred to Prozac as a "revolution in psychopharmacology because of its selectivity on the serotonin system; it was a drug with the precision of a Scud missile, launched miles away from its target only to land, with a proud flare, right on the enemy's roof." Such grandiose claims have faded with time as we come to understand how little we really know about how – and if – Prozac works.
In Daniel Nettle's book Strong Imagination: Madness, Creativity and Human Nature, he turns a scientific eye upon the creative process, looking in depth at the types of mental illness associated with creativity. Of particular interest is his work on serotonin – the neurotransmitter influenced by Prozac. He shows how serotonin systems function to help us to adapt to psychological challenges, reducing anxiety and providing "a carapace against a fickle and confusing world". When I questioned him about the specific impact of Prozac on creativity, he described serotonin-related drugs stimulating "energy, concentration and an expanded mental horizon", although he added that, in the decade since writing the book, he had become convinced that Prozac and related SSRIs were much less effective than once thought.
It is comforting to believe that, to quote Robert Lowell, the lack of a little salt in the brain is all that stands between us and sanity. Irving Kirsch's research for The Emperor's New Drugs suggests, however, that SSRIs are barely more effective than placebos. While the drugs have clearly delivered dramatic benefits to some like Gwyneth Lewis (and, indeed, Oliver James himself, who when he briefly took Prozac in the 90s said he felt "miraculous" on it), they seem to hamper as many creative types as they help. We need to be sane to work – being an author requires discipline, doggedness, a rhino-hide for criticism – but we must also be open to the insanity of creativity. The state of manic flow when we write, paint, compose or merely play is a kind of cogent madness and antithetical to my experience of the drab fog of SSRI "happiness".
Within three weeks of my own Prozac fog lifting, I was writing again. Yes, I still felt down, so down some days that I couldn't work and buried my head under the duvet, but the trade-off was days when my fingers couldn't move fast enough over the keyboard, my pen struck sparks from the page. In Deborah Levy's Swimming Home, the heroine, Kitty Finch, has just quit Seroxat. "It's quite a relief to feel miserable again," she says. "I don't feel anything when I take my pills." It's been five years since I took my last SSRI. The happiness I get from my writing is deeper seated and more authentic than anything that could be confected in the laboratories of Big Pharma. The drugs didn't work for me and, more importantly, I couldn't work when I was on them.
1988 The first SSRI (selective serotonin reuptake inhibitor), Prozac, is made by Eli Lilly and launched in the US.
1989 The drug reaches the UK. It hits the covers of Newsweek and New York magazine, which describes it as the "new wonder drug for depression".
1991-2001 Annual UK antidepressant prescriptions rise from 9m to 24m.
1994 Elizabeth Wurtzel's memoir Prozac Nation is published, establishing the drug's position in popular culture.
1994 The first of many lawsuits concerning side-effects of the drug goes to trial. Joseph Wesbecker went on a killing spree in 1989, killing eight before shooting himself. His violence was claimed to be a side-effect of taking Prozac.
1994 Psychiatrist Peter Breggin's Talking Back to Prozac, critical of the drug, is published.
1995 Prozac is referenced in the Blur song Country House: "He's reading Balzac and knocking back Prozac… It's the helping hand that makes you feel wonderfully bland."
1998 Prozac Diary, the candid memoir by Lauren Slater, is published.
2000 Zoloft overtakes Prozac as the most popular SSRI in the US.
2001 Prozac (fluoxetine) loses its patent. Eli Lilly loses $35bn of its market value in one day and 90% of its prescriptions in a single year.
2004 Prozac is in our drinking water. The Environment Agency says the drug is building up in British rivers and ground-water supplies, probably via the sewage system, but in quantities so dilute they could have no effect.
2008 Antidepressants are now the third most common prescription drugs in the US.
2009 The Lancet ranks the top 12 antidepressants from 117 studies. Zoloft and Lexapro come in first for their combination of effectiveness and fewest side-effects.
2010 One in 10 people in Europe has now taken an antidepressant.
Alex Preston
Tim Peake's selection seen as major boost for UK industry and an inspiration to young people
Britain's first official astronaut, Major Tim Peake, has been selected to fly on a five-month mission on the International Space Station in 2015, it is believed. The go-ahead for the flight will be seen as a major boost for the UK's space industry. Peake graduated as a European Space Agency astronaut more than two years ago and has been waiting for a space mission since then.
It was feared the former army helicopter pilot might be given a short-duration mission because the UK only makes modest contributions to Esa's manned space programme. Major contributors such as France, Germany and Italy were expected to have priority.
However, the Observer has learned that 41-year-old Peake has been assigned a lengthy stay in orbit in 2015. He will be blasted into space on a Russian Soyuz rocket from Kazakhstan in November that year and flown to the space station where he will stay for five months. He will be able to take part in spacewalks and other complex scientific activities.
UK space officials, who have refused to reveal any information about Peake's forthcoming mission, are expected to confirm details of his flight at a press conference on Monday at the Science Museum in London.
The news of Peake's mission was welcomed by Nick Spall, of the British Interplanetary Society, which has been campaigning for years for the government to change past policy and allow the UK to have official astronauts. "At last this has come about with a flight slot to the International Space Station (ISS) for Tim Peake," he said.
"The UK can now join in with important microgravity research work on the space station, win industrial contracts for future human spaceflight projects and forge new links with Nasa, Russia and hopefully China – and one day India – in space. Many young people will be inspired by Tim. It will also help boost the UK's technical employment potential for jobs and industry."
Peake, who is married with two sons, is considered to be Britain's first official astronaut because in the past those UK citizens who have flown in space have either been privately funded for their missions – such as Helen Sharman who flew on a Russian rocket in 1991 – or have taken out American citizenship, such as Michael Foale and Piers Sellers, who have both flown on the US space shuttle.
By contrast, Peake was picked to be one of six new Esa astronauts who were selected, in 2009, from several thousand candidates. During their 14-month training programme, the six travelled to Nasa's astronaut base in Houston, to the Russian astronaut training centre in Star City outside Moscow, to Tsukuba Space Centre in Japan, and spent two weeks on a survival course in Sardinia. To improve their Russian language skills, the astronauts spent a month lodging with families in St Petersburg. To see how the astronauts coped with stress, the training staff created mock emergencies, including one scenario where an astronaut fell unconscious during a spacewalk.
Peake completed his training in November 2010 and has been waiting to be assigned a spaceflight since. However, he has denied that the wait was causing problems. "No, it doesn't get frustrating at all – there's just so much going on, so much diversity, and there's brilliant training all along the way," he told the BBC a few weeks ago.
A graduate of Sandhurst, Peake received a commission with the Army Air Corps in 1992 and served as a platoon commander with the Royal Green Jackets in Northern Ireland. He gained his wings in 1994 after completing the army pilots' course. Following a posting to the US, he returned to Britain in 2002 to instruct trainees in flying Apache helicopters. He went on to graduate from the prestigious Empire Test Pilot School at Boscombe Down and conduct special forces operations.
He retired from the army in 2009 and joined AgustaWestland as a senior helicopter test pilot. He has flown more than 30 different aircraft.
Robin McKie and Ian Sample
One maths boffin + one funny man = a stand-up comedian who jokes about dominoes, algorithms and Rubik's Cubes
"I'm obsessed with spreadsheets at the moment," says Matt Parker, a maths fellow at Queen Mary, University of London, who moonlights as a stand-up comedian. "In my new show I'm going to be showing my all-time favourite. It's pretty spectacular. You may have your own favourite spreadsheet, I don't know."
I don't, as it happens, but Parker is finding there are many like-minded souls out there. He did his first gig in early 2009 and quickly gained attention for a series of offbeat videos – how to split a restaurant bill; an algorithm for whether to use a budget airline – that went modestly viral on YouTube. He now does solo performances, including a spot at the Cheltenham science festival next month, and is one-third of the Festival of the Spoken Nerd, which will perform its Full Frontal Nerdity show at the Udderbelly in London this summer and in Edinburgh throughout August.
Aged "two to the power of five" – or 32 – Parker, originally from Perth in Australia, is developing an impressively eclectic CV. In November he helped organise the breaking of the world record for mass Rubik's Cube solving. More than 2,000 people, mostly school children, packed into the O2 Arena and 1,414 were successful. (Parker's personal best for the Rubik's Cube is a minute: "not great in Rubik's Cube circles; if you're not down to 30 seconds it's a bit embarrassing.")
He also recently bought 10,000 dominoes to create the world's largest computer that runs solely on dominoes. By setting up two rows, he could input any two numbers – between zero and 15 – and, depending on where they bump into each other, it would add them up. "It's a very, very inefficient oversized integrated circuit," he explains. "It's basically what you get on a chip in a computer."
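Parker's two rows of dominoes are, in effect, a four-bit binary adder: each collision site acts as a logic gate, and the carry ripples along the row. As a rough sketch (the function names here are illustrative, not Parker's actual domino layout), the same circuit can be written in Python using nothing but the gates a domino chain can physically realise:

```python
def full_adder(a, b, carry_in):
    """Add two bits plus an incoming carry; return (sum_bit, carry_out).
    XOR, AND and OR are the operations a domino collision can encode."""
    sum_bit = a ^ b ^ carry_in
    carry_out = (a & b) | (carry_in & (a ^ b))
    return sum_bit, carry_out

def add4(x, y):
    """Add two numbers between 0 and 15 by rippling the carry through
    four full adders, least-significant bit first."""
    assert 0 <= x <= 15 and 0 <= y <= 15
    result, carry = 0, 0
    for i in range(4):
        bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= bit << i
    return result | (carry << 4)  # fifth bit catches overflow, e.g. 15 + 15
```

Which is why he calls it "a very, very inefficient oversized integrated circuit": the silicon version does exactly this, just with transistors instead of falling tiles.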
This story will find its way into his comedy routine – "it's not funny yet but I'm working it up" – and he plans to do a scaled-down experiment with 1,000 dominoes in his show. "But don't promise that," he says. "Sprung stages aren't great, and if it's carpeted I'm in trouble because it's hard to balance them."
Parker concedes that his material particularly appeals to lapsed maths nerds but he believes that more of us have a dormant interest than you might think. "Why, when we're buying fuel, do many of us round it up to a whole number of pounds or tens of pounds?" he asks. "It's because we have a sense of 'Well, that looks nice.' A lot of people think they don't like maths but they do like order and patterns and problem-solving and puzzles. It's a human thing we all have, it's just some of us take it to a ridiculous nth degree and become mathematicians."
Tim Lewis
Why do we perceive time differently according to circumstances? Radio 4 presenter Claudia Hammond has some interesting answers
The time we have at our disposal each day is elastic, Proust claimed. It sounds an odd remark. Surely we have precisely 24 hours, no more and no less. Even the occasional leap second – introduced to keep clocks in step with the Earth's rotation – hardly changes the fixed time we have every day on Earth.
But as Claudia Hammond, presenter of Radio 4's All in the Mind, argues in this lively account of our perception of time, our experiences of passing minutes differ greatly according to circumstances. "A watched pot never seems to boil, but go and check your emails and it will be boiling over before you know it."
And you know what she means: that moron moment when you realise you have locked yourself out of your hired car, with your keys inside its boot, seems to stop time in its tracks while the most pleasurable experiences race by at light speed. Much has to do with the event being experienced. However, your mood, health and attentiveness also affect the rate at which time appears to pass.
Consider the fate of innocent test subjects who were tricked into believing no one on their psychology experiment liked them. Asked to estimate the passage of a minute, they reported times that were far longer than test subjects who had been told people liked them. Suffer rejection and time starts to drag, in short. "Thus the belief that a few strangers dislike you can alter your time perception," concludes Hammond.
Such observations are important because the way we perceive time is crucial to our lives. The intelligibility of spoken language depends on millisecond precision in pronunciation, for example, while the word "time" turns out to be the most widely used noun in English.
Oddly, however, there appears to be no single part of the brain that measures the passage of hours and minutes, which is just one of the many curiosities about our species' attempts to assess time. Of these paradoxes, the most intriguing – quoted by Hammond in her final chapter – was expressed by Kierkegaard: "Life can only be understood backwards but must be lived forwards."
Robin McKie
Cognitive scientist and philosopher Daniel Dennett is one of America's foremost thinkers. In this extract from his new book, he reveals some of the lessons life has taught him
1 USE YOUR MISTAKES
We have all heard the forlorn refrain: "Well, it seemed like a good idea at the time!" This phrase has come to stand for the rueful reflection of an idiot, a sign of stupidity, but in fact we should appreciate it as a pillar of wisdom. Any being, any agent, who can truly say: "Well, it seemed like a good idea at the time!" is standing on the threshold of brilliance. We human beings pride ourselves on our intelligence, and one of its hallmarks is that we can remember our previous thinking and reflect on it – on how it seemed, on why it was tempting in the first place and then about what went wrong.
I know of no evidence to suggest that any other species on the planet can actually think this thought. If they could, they would be almost as smart as we are. So when you make a mistake, you should learn to take a deep breath, grit your teeth and then examine your own recollections of the mistake as ruthlessly and as dispassionately as you can manage. It's not easy. The natural human reaction to making a mistake is embarrassment and anger (we are never angrier than when we are angry at ourselves) and you have to work hard to overcome these emotional reactions.
Try to acquire the weird practice of savouring your mistakes, delighting in uncovering the strange quirks that led you astray. Then, once you have sucked out all the goodness to be gained from having made them, you can cheerfully set them behind you and go on to the next big opportunity. But that is not enough: you should actively seek out opportunities to make mistakes, just so you can then recover from them.
In science, you make your mistakes in public. You show them off so that everybody can learn from them. This way, you get the benefit of everybody else's experience, and not just your own idiosyncratic path through the space of mistakes. (Physicist Wolfgang Pauli famously expressed his contempt for the work of a colleague as "not even wrong". A clear falsehood shared with critics is better than vague mush.)
This, by the way, is another reason why we humans are so much smarter than every other species. It is not so much that our brains are bigger or more powerful, or even that we have the knack of reflecting on our own past errors, but that we share the benefits our individual brains have won by their individual histories of trial and error.
I am amazed at how many really smart people don't understand that you can make big mistakes in public and emerge none the worse for it. I know distinguished researchers who will go to preposterous lengths to avoid having to acknowledge that they were wrong about something. Actually, people love it when somebody admits to making a mistake. All kinds of people love pointing out mistakes.
Generous-spirited people appreciate your giving them the opportunity to help, and acknowledging it when they succeed in helping you; mean-spirited people enjoy showing you up. Let them! Either way we all win.

2 RESPECT YOUR OPPONENT
Just how charitable are you supposed to be when criticising the views of an opponent? If there are obvious contradictions in the opponent's case, then you should point them out, forcefully. If there are somewhat hidden contradictions, you should carefully expose them to view – and then dump on them. But the search for hidden contradictions often crosses the line into nitpicking, sea-lawyering and outright parody. The thrill of the chase and the conviction that your opponent has to be harbouring a confusion somewhere encourages uncharitable interpretation, which gives you an easy target to attack.
But such easy targets are typically irrelevant to the real issues at stake and simply waste everybody's time and patience, even if they give amusement to your supporters. The best antidote I know for this tendency to caricature one's opponent is a list of rules promulgated many years ago by social psychologist and game theorist Anatol Rapoport.
How to compose a successful critical commentary:
1. Attempt to re-express your target's position so clearly, vividly and fairly that your target says: "Thanks, I wish I'd thought of putting it that way."
2. List any points of agreement (especially if they are not matters of general or widespread agreement).
3. Mention anything you have learned from your target.
4. Only then are you permitted to say so much as a word of rebuttal or criticism.
One immediate effect of following these rules is that your targets will be a receptive audience for your criticism: you have already shown that you understand their positions as well as they do, and have demonstrated good judgment (you agree with them on some important matters and have even been persuaded by something they said). Following Rapoport's rules is always, for me, something of a struggle…

3 THE "SURELY" KLAXON
When you're reading or skimming argumentative essays, especially by philosophers, here is a quick trick that may save you much time and effort, especially in this age of simple searching by computer: look for "surely" in the document and check each occurrence. Not always, not even most of the time, but often the word "surely" is as good as a blinking light locating a weak point in the argument.
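The search itself is trivial to automate. As a minimal illustration (the function name and sample text are my own, purely for demonstration), a few lines of Python will flag every occurrence of the word for closer scrutiny:

```python
import re

def surely_klaxon(text: str) -> list[tuple[int, str]]:
    """Return (line_number, line) pairs for every line containing
    the word 'surely', so each occurrence can be inspected."""
    hits = []
    for num, line in enumerate(text.splitlines(), start=1):
        # \b...\b matches 'surely' as a whole word, case-insensitively
        if re.search(r"\bsurely\b", line, flags=re.IGNORECASE):
            hits.append((num, line.strip()))
    return hits

essay = (
    "The mind is the brain.\n"
    "Surely no one doubts that consciousness is physical.\n"
    "It follows that dualism is false."
)
for num, line in surely_klaxon(essay):
    print(f"line {num}: {line}")
```

Each flagged line is, of course, only a candidate weak point: the klaxon tells you where to look, not what you will find.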
Why? Because it marks the very edge of what the author is actually sure about and hopes readers will also be sure about. (If the author were really sure all the readers would agree, it wouldn't be worth mentioning.) Being at the edge, the author has had to make a judgment call about whether or not to attempt to demonstrate the point at issue, or provide evidence for it, and – because life is short – has decided in favour of bald assertion, with the presumably well-grounded anticipation of agreement. Just the sort of place to find an ill-examined "truism" that isn't true!

4 ANSWER RHETORICAL QUESTIONS
Just as you should keep a sharp eye out for "surely", you should develop a sensitivity for rhetorical questions in any argument or polemic. Why? Because, like the use of "surely", they represent an author's eagerness to take a short cut. A rhetorical question has a question mark at the end, but it is not meant to be answered. That is, the author doesn't bother waiting for you to answer since the answer is so obvious that you'd be embarrassed to say it!
Here is a good habit to develop: whenever you see a rhetorical question, try – silently, to yourself – to give it an unobvious answer. If you find a good one, surprise your interlocutor by answering the question. I remember a Peanuts cartoon from years ago that nicely illustrates the tactic. Charlie Brown had just asked, rhetorically: "Who's to say what is right and wrong here?" and Lucy responded, in the next panel: "I will."

5 EMPLOY OCCAM'S RAZOR
Attributed to William of Ockham (or Occam), a 14th-century English logician and philosopher, this thinking tool is actually a much older rule of thumb. A Latin name for it is lex parsimoniae, the law of parsimony. It is usually put into English as the maxim "Do not multiply entities beyond necessity".
The idea is straightforward: don't concoct a complicated, extravagant theory if you've got a simpler one (containing fewer ingredients, fewer entities) that handles the phenomenon just as well. If exposure to extremely cold air can account for all the symptoms of frostbite, don't postulate unobserved "snow germs" or "Arctic microbes". Kepler's laws explain the orbits of the planets; we have no need to hypothesise pilots guiding the planets from control panels hidden under the surface. This much is uncontroversial, but extensions of the principle have not always met with agreement.
One of the least impressive attempts to apply Occam's razor to a gnarly problem is the claim (and provoked counterclaims) that postulating a God as creator of the universe is simpler, more parsimonious, than the alternatives. How could postulating something supernatural and incomprehensible be parsimonious? It strikes me as the height of extravagance, but perhaps there are clever ways of rebutting that suggestion.
I don't want to argue about it; Occam's razor is, after all, just a rule of thumb, a frequently useful suggestion. The prospect of turning it into a metaphysical principle or fundamental requirement of rationality that could bear the weight of proving or disproving the existence of God in one fell swoop is simply ludicrous. It would be like trying to disprove a theorem of quantum mechanics by showing that it contradicted the axiom "Don't put all your eggs in one basket".

6 DON'T WASTE YOUR TIME ON RUBBISH
Sturgeon's law is usually expressed thus: 90% of everything is crap. So 90% of experiments in molecular biology, 90% of poetry, 90% of philosophy books, 90% of peer-reviewed articles in mathematics – and so forth – is crap. Is that true? Well, maybe it's an exaggeration, but let's agree that there is a lot of mediocre work done in every field. (Some curmudgeons say it's more like 99%, but let's not get into that game.)
A good moral to draw from this observation is that when you want to criticise a field, a genre, a discipline, an art form… don't waste your time and ours hooting at the crap! Go after the good stuff or leave it alone. This advice is often ignored by ideologues intent on destroying the reputation of analytic philosophy, sociology, cultural anthropology, macroeconomics, plastic surgery, improvisational theatre, television sitcoms, philosophical theology, massage therapy, you name it.
Let's stipulate at the outset that there is a great deal of deplorable, second-rate stuff out there, of all sorts. Now, in order not to waste your time and try our patience, make sure you concentrate on the best stuff you can find, the flagship examples extolled by the leaders of the field, the prize-winning entries, not the dregs. Notice that this is closely related to Rapoport's rules: unless you are a comedian whose main purpose is to make people laugh at ludicrous buffoonery, spare us the caricature.

7 BEWARE OF DEEPITIES
A deepity (a term coined by the daughter of my late friend, computer scientist Joseph Weizenbaum) is a proposition that seems both important and true – and profound – but that achieves this effect by being ambiguous. On one reading, it is manifestly false, but it would be earth-shaking if it were true; on the other reading, it is true but trivial. The unwary listener picks up the glimmer of truth from the second reading, and the devastating importance from the first reading, and thinks, Wow! That's a deepity.
Here is an example (better sit down: this is heavy stuff): Love is just a word.
Oh wow! Cosmic. Mind-blowing, right? Wrong. On one reading, it is manifestly false. I'm not sure what love is – maybe an emotion or emotional attachment, maybe an interpersonal relationship, maybe the highest state a human mind can achieve – but we all know it isn't a word. You can't find love in the dictionary!
We can bring out the other reading by availing ourselves of a convention philosophers care mightily about: when we talk about a word, we put it in quotation marks, thus: "love" is just a word. "Cheeseburger" is just a word. "Word" is just a word. But this isn't fair, you say. Whoever said that love is just a word meant something else, surely. No doubt, but they didn't say it.
Not all deepities are quite so easily analysed. Richard Dawkins recently alerted me to a fine deepity by Rowan Williams, the then archbishop of Canterbury, who described his faith as "a silent waiting on the truth, pure sitting and breathing in the presence of the question mark".
I leave the analysis of this as an exercise for you.
This is an edited extract from Intuition Pumps and Other Tools for Thinking by Daniel Dennett, published by Allen Lane (£20)
Daniel Dennett
The presence of the isolated bug Meenoplus roddenberryi on Gran Canaria suggests important things about the evolution of cave-dwelling species
Things are looking up for bugs underground. Among the 132 cave-dwelling invertebrate species of the Canary Islands are about 15 species of Hemiptera, or true bugs. Most of these troglobites are from younger, more recently volcanically active islands, where lava tubes are abundant. La Palma and El Hierro, for example, are less than two million and one million years old respectively, and until recently were home to most of the documented cave fauna.
Most volcanic activity on Gran Canaria ceased 1.6m years ago. As a result, this 14m-year-old island has few lava tubes, leading biospelunkers to assume that the cavernicolous fauna would be sparse. Localised activity to the north and east has produced some volcanic landscapes, but the south-western half of the island has few lava tubes or cinder cones and virtually no troglobites.
Before the year 2000, Gran Canaria's known cave fauna consisted of one spider and one cockroach. Since then, explorations of lava tubes and old artificial caves have revealed a much richer fauna than was suspected, almost equal to that of the younger islands. Discoveries have included millipedes, pseudoscorpions, spiders, silverfish and beetles, many of which are yet to be named. Most are found in shallow mesocavernous habitats, the so-called milieu souterrain superficiel. Caves are voids large enough for a human to enter; mesocaverns are smaller than caves, but larger than mere fractures in rock.
Dr Hannelore Hoch of the Museum für Naturkunde, Berlin, with Dr Manuel Naranjo of the Sociedad Entomológica Canaria Melansis and Dr Pedro Oromí of the Universidad de La Laguna, recently discovered a new species of cavernicolous true bug in a 30-metre-long water mine on Gran Canaria near Tenteniguada, at about 1,100 metres above sea level. The bug was found in the deepest part of the mine, formed in colluvial deposits of basalt, where seasonal variations are slight: the temperature remains at 13-17C and the relative humidity at 85-94%. The presumed food source for the bugs is the roots of a number of trees and shrubs penetrating the mine, including some combination of sweet chestnut (Castanea sativa), yellow broom (Teline microphylla), blue Gran Canarian tajinaste (Echium callythirsum) and escobon (Chamaecytisus proliferus).
Meenoplus roddenberryi is named after Gene Roddenberry, creator of Star Trek, the series that has spawned an industry of sequels, movies and now scientific names. The mission statement that began episodes of the original series included the words "… to explore strange new worlds, to seek out new life… to boldly go where no man has gone before". Hoch et al suggest this applies as much to biospeleology as to space exploration.
M roddenberryi is a textbook example of a relict: not a single epigean (surface-dwelling) member of the family survives in the Canaries, though one must have existed at some time. The species that remain reflect three independent cavern colonisations by at least two different, now extinct, ancestral species: M claustrophilus on La Palma, M cancavus and M charon on El Hierro, and M roddenberryi on Gran Canaria. It is equally curious that, in spite of suitable habitats and the presence of other troglobitic bugs, no meenoplids are known on Tenerife. Nor is M roddenberryi a close relative of known species from Africa or Cape Verde, leaving its ancestral origins a mystery for now.
Because larvae of the family live in or on the soil, the transition to hypogean life is easily envisioned. Still, there are degrees of morphological adaptation to cave life, and M roddenberryi is a more extreme example than its relatives on younger islands. The opposite has been noted among bugs in Hawaii, where the most extreme forms occur on the younger islands. This suggests that degree of adaptation correlates with local physical conditions rather than being the product of a gradual process over time. The scarcity of cavernicolous planthoppers on older islands had been explained by the elimination of mesocaverns through erosion and soil formation, but M roddenberryi challenges that explanation and suggests that landslides and rock avalanches create new habitats.
Quentin Wheeler
Concerns that firms' rights to hold patents on genes linked to breast cancer are pushing up the cost of testing for the disease
Angelina Jolie's decision to speak out about her preventive double mastectomy was intended to highlight the terrible risks of breast cancer. But the film star's announcement also cast a spotlight on the far less well-known arena of patent battles over genetic technology, whose outcome could have even more impact than her widely applauded move.
Before the end of next month the US supreme court will issue a landmark decision in a case brought against the biotech firm Myriad Genetics, which is based in Utah, by the Association for Molecular Pathology.
The firm owns a patent on the BRCA1 gene, a faulty variant of which – carried by Jolie – is believed to confer a high risk of breast cancer. It also owns a patent on the similar BRCA2 gene.
It means that Myriad has the exclusive right to develop diagnostic tests for those genes – a fact with serious implications for other firms, which might thus be prevented from developing innovations in the field.
It also has some serious hard-money business implications: in the wake of Jolie's announcement, Myriad's share price shot up. That has worried some commentators. In a New York Times column describing her decision, Jolie acknowledged she was lucky to be well-off enough to easily afford to take the test for the culpable genes.
Some have complained that the lengthy court battle over Myriad's patents has kept the price of the tests too high and have asked whether patents actually sacrifice patients' interests in favour of protecting corporate profits. "How many more women – and men – might have been able over the past four years to afford BRCA1 or BRCA2 testing in the absence of those protective patents?" wrote Andrew Cohen in Atlantic magazine.
The issue of patents and genetic technology is one of growing importance as a flood of companies enter the booming sector and scientific advances allow more and more advanced genetic manipulation. So far the supreme court has shown a willingness to side with big business. Earlier this month it ruled in favour of agricultural firm Monsanto in defence of a patent it holds on soy beans that dominate the US farming sector.
Paul Harris
Can the ancient tusks of the extinct species retrieved from Arctic ice tell us exactly why it died out?
The first time I met palaeontologist Dan Fisher was in a hotel in the Arctic frontier town of Salekhard, in Siberia. I was there to film an expedition to recover a new mammoth specimen with a crew from the BBC. We were keen to head north into the tundra of the Yamal peninsula, where we'd heard that new mammoth carcasses had been discovered. After sharing a large Mi-8 helicopter with a load of Siberian hunters, we landed at a reindeer herders' camp, consisting of a few tepee-like "chums" on an island surrounded by ice-choked rivers. This was to be our base while the team tried to track down the mammoth remains said to be in the vicinity.
But fate was against us. One trail dried up as we got close. We knew a worker on the Gazprom railway had reported finding a mammoth, but it seemed that he had either forgotten the location or been offered a better price. We hoped to investigate other possible finds to the north, but were thwarted by the iced-up rivers. Eventually we had to give up.
Dan was philosophical about this turn of events. He'd been on many expeditions to Siberia, sometimes finding mammoths, often not. For him, the real prize was mammoth ivory – the same stuff that the reindeer herders searched for every spring as the ice loosened its grip. Most of this ancient ivory would be traded to the east, to markets in Japan and China. A very small proportion of it would find its way to scientists, who saw a value beyond its aesthetic appeal.
Later in the year, I visited Dan in his own habitat, the University of Michigan's Museum of Palaeontology, to find out how he unlocked the secrets of mammoth ivory. He brought out a tusk he'd acquired on his recent trip to Wrangel Island in the Arctic Ocean, home to the last of the mammoths. This tusk was from an animal that had died a mere 6,000 years ago. We took it into a small room full of scientific paraphernalia – and a bandsaw – and started to cut. It was slow and painstaking work.
An hour later, the cut was complete. We laid the tusk on its side and Dan let me lift the upper half away, which I did gingerly, then laid it sawn face uppermost on the table. The tusk had opened like a book and now Dan could read it. Even without polishing the cut surfaces, we could see features: darker and lighter stripes. Under a UV lamp, the stripes stood out even more clearly. Ivory grows incrementally throughout the lifetime of an elephant, mammoth or mastodon. The lines we could see most clearly related to annual growth cycles, but under the microscope, Dan was able to pick out much finer lines. He told me about the first time he counted those lines and realised that there were about 365 across an annual band: dentine laid down on a daily basis. So here was an incredible record of the life of a proboscidean.
Just like tree rings, narrow bands corresponded with times of stress and poor growth, whereas they became wider in times of plenty. So Dan could identify cycles of pregnancy and lactation in females; he could tell when young males had first become sexually mature, going into the testosterone-fuelled rage called musth, when growth was not a priority.
All this information was useful in answering questions about individual mammoths and mastodons, but en masse, it started to provide the level of data that would help to answer the most compelling question of all: why had these magnificent animals become extinct? There's still much work to be done, but among late-surviving mastodons that he's studied, Dan is finding examples of females losing calves (where one pregnancy is immediately followed by another, rather than by two years of lactation) and of males going into musth early (just as young bull elephants do in Africa, when mature males are poached out). Dan had also found examples of mammoths dying in the autumn, a time of the year when the animals should have been in peak condition. Autumn deaths argued for an extrinsic cause of death. For Dan, all of this could be pinned on one such cause: overhunting by humans.
Many have argued that the demise of mammoths, and their close relatives mastodons, soon after the arrival of modern humans in North America, is far more than just a coincidence. Kill-sites exist that show humans were certainly, at least occasionally, hunting these formidable beasts. But it's hard to argue from those isolated cases that humans were responsible for wiping out entire species. The picture emerging from the study of tusks may be more revealing, showing populations of animals under chronic pressure. Far from rampaging across the continent, killing every large mammal in sight, it seems that ancient hunters might have had a more subtle, but no less terminal impact. Over thousands of years, the level of hunting was just enough to be unsustainable for these huge, slow-breeding behemoths of the ice age.
Cutting the tusk open like that was only the beginning of coaxing the ivory into yielding its secrets. Dan would make fine sections to look at under the microscope and take samples for chemical analysis. He was quietly excited about this new specimen from Wrangel. "It's one of the best tusks I've seen," he said.
Alice's new series, Ice Age Giants, starts tonight at 8pm on BBC2
Alice Roberts
Quantum mechanics research could hold the key to a new generation of super-fast computers
"Our imagination is stretched to the utmost," wrote Richard Feynman, the greatest physicist of his day, "not, as in fiction, to imagine things which are not really there, but just to comprehend those things that are there." Which is another way of saying that physics is weird. And particle physics – or quantum mechanics, to give it its posh title – is weird to the power of n, where n is a very large integer.
Consider some of the things that particle physicists believe. They accept without batting an eyelid, for example, that one particular subatomic particle, the neutrino, can pass right through the Earth without stopping. They believe that a subatomic particle can be in two different states at the same time. And that two particles can be "entangled" in such a way that they can co-ordinate their properties regardless of the distance in space and time that separates them (an idea that even Einstein found "spooky"). And that whenever we look at subatomic particles they are altered by the act of inspection so that, in a sense, we can never see them as they are.
For a long time, the world looked upon quantum physicists with a kind of bemused affection. Sure, they might be wacky, but boy, were they smart! And western governments stumped up large quantities of dosh to enable them to build the experimental kit they needed for their investigations. A huge underground doughnut was excavated in the suburbs of Geneva, for example, and filled with unconscionable amounts of heavy machinery in the hope that it would enable the quark-hunters to find the Higgs boson, or at any rate its shadowy tracks.
All of this was in furtherance of the purest of pure science – curiosity-driven research. The idea that this stuff might have any practical application seemed, well, preposterous to most of us. But here and there, there were people who thought otherwise (among them, as it happens, Richard Feynman). In particular, these visionaries wondered about the potential of harnessing the strange properties of subatomic particles for computational purposes. After all, if a particle can be in two different states at the same time (in contrast to a humdrum digital bit, which can only be a one or a zero), then maybe we could use that for speeded-up computing. And so on.
Thus was born the idea of the "quantum computer". At its heart is the idea of a quantum bit or qubit. The bits that conventional computers use are implemented by transistors that can either be on (1) or off (0). Qubits, in contrast, can be both on and off at the same time, which implies that they could be used to carry out two or more calculations simultaneously. In principle, therefore, quantum computers should run much faster than conventional, silicon-based ones, at least in calculations where parallel processing is helpful.
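The "both on and off at once" idea can be made concrete with a toy simulation on an ordinary computer. This is a purely illustrative sketch of textbook quantum mechanics – not a model of D-Wave's hardware – in which a qubit is just a pair of amplitudes and a Hadamard gate turns a definite 0 into an equal superposition:

```python
import numpy as np

# A qubit's state is a pair of amplitudes (a, b) with |a|^2 + |b|^2 = 1.
# Measuring it yields 0 with probability |a|^2 and 1 with probability |b|^2.
zero = np.array([1.0, 0.0])            # definitely "off", like a classical bit

# The Hadamard gate maps |0> to an equal superposition of |0> and |1>
hadamard = np.array([[1.0,  1.0],
                     [1.0, -1.0]]) / np.sqrt(2)

superposed = hadamard @ zero           # the qubit is now "both on and off"
probs = np.abs(superposed) ** 2        # measurement probabilities ≈ [0.5, 0.5]
print(probs)
```

The catch, of course, is that a classical machine simulating n qubits needs 2^n amplitudes, which is exactly why physical qubits promise a speed-up that simulations cannot deliver.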
For as long as I have been paying attention to this stuff, the academic literature has been full of arguments about quantum computing. Some people thought that while it might be possible in theory, in practice it would prove impracticable. But while these disputes raged, a Canadian company called D-Wave – whose backers include Amazon boss Jeff Bezos and the "investment arm" of the CIA (I am not making this up) – was quietly getting on with building and marketing a quantum computer. In 2011, D-Wave sold its first machine – a 128-qubit computer – to military contractor Lockheed Martin. And last week it was announced that D-Wave had sold a more powerful machine to a consortium led by Google and Nasa and a number of leading US universities.
What's interesting about this is not so much its confirmation that the technology may indeed be a practical proposition, though that's significant in itself. More important is that it signals the possibility that we might be heading for a major step change in processing power. In one experiment, for example, it was found that the D-Wave machine was 3,600 times faster than a conventional computer in certain kinds of applications. Given that the increases in processing power enabled by Moore's law (which applies only to silicon and says that computing power doubles roughly every two years) are already causing us to revise our assumptions about what computers can and cannot do, we may have some more revisions to do. All of which goes to prove the truth of the adage: pure research is just research that hasn't yet been applied.
John Naughton
People with conditions such as heart disease or Parkinson's could benefit from tissue grown with their own DNA
Lorraine Barnes suffered a heart attack in 2005 and has lived with the consequences – extreme exhaustion and breathlessness – ever since. "I was separated from my husband and so my children, Charlotte and James, had to grow up overnight because suddenly they were caring for me," she says.
Charlotte agrees: "It turns your world upside down. I worry about my mum day and night, 24/7."
Heart failure leaves Barnes, 49, "drowning and gasping for air", she says. What really preys on her mind, though, is not her present difficulty but her future. "It scares me, as obviously I want to be around to see my children grow up."
There is no cure for heart failure, the aftermath of a heart attack, and the condition is common. Every seven minutes a person has a heart attack in the UK, and some victims are left so weakened they can hardly walk a few metres.
It's a grim scenario. But the prospects for patients like Barnes last week took a dramatic turn for the better when it was revealed that human cloning has been used for the first time to create embryonic stem cells from which new tissue – genetically identical to a patient's own cells – could be grown.
Scientists have been working on such techniques (see box) for some time but their work has been hampered by the difficulties involved in cloning human cells in the laboratory. The team led by Shoukhrat Mitalipov, of the Oregon Health and Science University in Portland, got around this problem: by adding caffeine to the cell cultures, they transformed the success rate. "We were able to produce one embryonic stem cell line using just two human eggs, which would make this approach practical for widespread therapeutic use," said Mitalipov.
The development was hailed as a major boost for patients such as Barnes, who might benefit from tissue transplants – and not just heart attack patients but those suffering from diabetes, Parkinson's disease and other conditions.
But the announcement was also greeted with horror. "Scientists have finally delivered the baby that would-be human cloners have been waiting for: a method for reliably creating cloned human embryos," said David King of Human Genetics Alert. "It is imperative we create an international ban on human cloning before any more research like this takes place. It is irresponsible in the extreme to have published this."
Several tabloid newspapers also carried banner headlines warning of the human cloning "danger". Such reactions have a familiar ring. When the cloning of Dolly the Sheep was revealed in 1997 there was an outpouring of hysteria about the prospect of multiple Saddam Husseins being created in laboratories.
"At the time the chances of these horrors occurring – when scientists had not even created a single clone of a human cell – were remote," said physiologist Professor Colin Blakemore of Oxford University. "Not that this worried the alarmists. The crucial point is that we should have spent the intervening time thinking about how we should react sensibly to the concept of a human clone when it does become possible. We have not done that and, although the science is still far off, it is getting closer. We need to ask, carefully and calmly: under what circumstances would we tolerate the creation of a human clone?"
At present such a creation is banned in Britain. No human embryo created by cloning techniques is allowed to develop beyond 14 days. "The research is very tightly regulated and I think there is little chance of a rogue laboratory creating a human clone," said James Lawford Davies, a lawyer who specialises in health sciences. "However, many US states which, ironically, banned therapeutic cloning work because of their strong anti-abortion stances have laws that would permit human clones to develop into foetuses."
Experts such as Professor John Harris, director of Manchester University's Institute for Science, Ethics and Innovation, see positive benefits in reproductive cloning which could have a place in society. He said: "If you take a healthy adult's DNA and use it to create a new person – by cloning – you are essentially using a tried and tested genome, one that has worked well for several decades for the donor. By contrast, a child born naturally has an 8% chance of succumbing to a serious genetic abnormality because of the random selection of their DNA. You can avoid that with a clone."
In fact, most arguments against human cloning are foolish, said Harris, adding: "It could be used in medically helpful ways. If a couple find they are carriers of harmful, possibly fatal recessive genetic illnesses, there is a one in four chance they will produce a child who will die of that condition. That is a big risk. An alternative would be to clone one of the parents. If you did that, then you would know you were producing a child who would be unaffected by that illness in later life.
"Or consider the example of a single woman who wants a child. She prefers the idea of using all her own DNA to the idea of accepting 50% from a stranger. But because we ban human cloning she would be forced to accept DNA from a stranger and have to mother 'his child'. I think that is ethically questionable. Just after Dolly the Sheep was born, Unesco announced a ban on human cloning. I think that was a mistake."
This point was backed by Blakemore. He said: "Many people react with horror at the thought of a human clone, yet three out of every 1,000 babies born today are clones – in the form of identical twins. These twins share not just the same DNA but have grown up in the same uterus and have had the same parenting – features that only intensify their similarities. Society is quite happy about this situation, it appears, but seems to find it odd when talking about cloning."
However, a note of caution was sounded by Ian Wilmut, who led the team that created Dolly the Sheep. He said: "The new work may encourage some people to attempt human reproductive cloning but the general experience is that it still results in late foetal loss and the birth of abnormal offspring." It would be cruel to cause this in humans until techniques had been vastly improved, he added.
However, most scientists see Mitalipov's work as encouraging. If nothing else, the prospects for Lorraine Barnes – and countless other patients whose lives could be transformed by transplants – have greatly improved in the long term.

How it works
The nucleus is removed from a human egg cell and the nucleus from a skin cell is inserted.
An electric shock fuses the skin cell nucleus inside the egg and it begins to divide into new cells. An embryo starts to form.
After a few days the growth of the embryo is halted and cultures of its constituent stem cells created.
By treating stem cells with different chemicals they can be transformed into specialised cells such as those that make up heart muscle, brain, pancreas and other organs. These cells are genetically identical to the original skin cell and can be used to create tissue for transplanting into the skin cell's donor.