Death by PowerPoint: the slide that killed seven people

The space shuttle Columbia disintegrating in the atmosphere (Creative Commons)

We’ve all sat in those presentations. A speaker with a stream of slides full of text, monotonously reading them off as we read along. We’re so used to it we expect it. We accept it. We even consider it ‘learning’. As an educator I push against ‘death by PowerPoint’ and I’m fascinated by how we can improve the way we present and teach. The fact is we know that PowerPoint kills. Most often the only victims are our audience’s inspiration and interest. This, however, is the story of a PowerPoint slide that actually helped kill seven people.

January 16th 2003. NASA Mission STS-107 is underway. The Space Shuttle Columbia launches, carrying its crew of seven into low orbit. Their objective: to study the effects of microgravity on the human body and on the ants and spiders they had with them. Columbia was the first Space Shuttle, first launched in 1981, and had flown 27 missions prior to this one. Whereas other shuttle crews had focused on work on the Hubble Space Telescope or the International Space Station, this mission was one of pure scientific research.

The launch proceeded as normal. The crew settled into their mission. They would spend 16 days in orbit, completing 80 experiments. One day into their mission it was clear to those back on Earth that something had gone wrong.

As a matter of protocol NASA staff reviewed footage from an external camera mounted to the fuel tank. At eighty-two seconds into the launch a piece of spray-on foam insulation (SOFI) fell from one of the ramps that attached the shuttle to its external fuel tank. As the shuttle climbed at more than twice the speed of sound the piece of foam collided with one of the tiles on the outer edge of the shuttle’s left wing.

Frame of NASA launch footage showing the moment the foam struck the shuttle’s left wing (Creative Commons)

It was impossible to tell from Earth how much damage the foam had caused when it collided with the wing. Foam falling during launch was nothing new. It had happened on four previous missions and was one of the reasons the camera was there in the first place. But the tile the foam had struck was on the edge of the wing, part of the system designed to protect the shuttle from the heat of Earth’s atmosphere during launch and re-entry. In space the shuttle was safe, but NASA didn’t know how it would respond to re-entry. There were a number of options. The astronauts could perform a spacewalk and visually inspect the hull. NASA could launch another Space Shuttle to pick the crew up. Or they could risk re-entry.

NASA officials sat down with Boeing Corporation engineers who took them through three reports: a total of 28 slides. The salient point was that, whilst there was data showing that the tiles on the shuttle wing could tolerate being hit by foam, this was based on test conditions using a piece of foam more than 600 times smaller than the one that had struck Columbia. This is the slide the engineers chose to illustrate this point:

NASA managers listened to the engineers and their PowerPoint. The engineers felt they had communicated the potential risks. NASA felt the engineers didn’t know what would happen, but that all the data pointed to there not being enough damage to put the lives of the crew in danger. They rejected the other options and pushed ahead with Columbia re-entering Earth’s atmosphere as normal.

Columbia was scheduled to land at 0916 (EST) on February 1st 2003. Just before 0900, 61,170 metres above Dallas and travelling at 18 times the speed of sound, temperature readings on the shuttle’s left wing ran abnormally high and then were lost. Tyre pressure readings on the left side were soon lost too, as was communication with the crew. At 0912, as Columbia should have been approaching the runway, ground control heard reports from residents near Dallas that the shuttle had been seen disintegrating. Columbia was lost and with it her crew of seven. The oldest crew member was 48.

The shuttle programme was grounded for two years as the investigation began. The cause of the accident became clear: a hole in a tile on the left wing, caused by the foam, had let the wing overheat dangerously until the shuttle disintegrated.

The questions to answer included a very simple one: why, given that the foam strike had occurred at a force massively outside test conditions, had NASA proceeded with re-entry?

Edward Tufte, a professor at Yale University and an expert in communication, reviewed the slideshow the Boeing engineers had given NASA, in particular the slide above. His findings were tragically profound.

Firstly, the slide had a misleadingly reassuring title claiming that test data pointed to the tile being able to withstand the foam strike. This was not the case, but the title, centred in the largest font, made it seem the salient, summary point of the slide. Boeing’s real message was lost almost immediately.

Secondly, the slide contains four different bullet points with no explanation of how they relate. Interpretation is left to the reader. Is number 1 the main point? Do the bullet points become less important as they go, or more? It doesn’t help that the font size changes as well. In all, the bullets and indents created six levels of hierarchy, inviting NASA managers to infer a hierarchy of importance: the writing lower down and in smaller font was ignored. Yet that was exactly where the contradictory (and most important) information had been placed.

Thirdly, there is a huge amount of text: more than 100 words or figures on one screen. Two terms, ‘SOFI’ and ‘ramp’, both mean the same thing: the foam. Vague words abound. ‘Sufficient’ is used once; ‘significant’ or ‘significantly’, five times, with little or no quantifiable data. All of this left a lot open to audience interpretation. How much is significant? Did it mean statistical significance, or something else?

Finally, the single most important fact, that the foam strike had occurred at forces massively outside test conditions, is hidden at the very bottom: twelve little words which the audience would have had to wade through more than 100 others to reach, if they even managed to keep reading that far. In the middle the slide does concede that it is possible for the foam to damage the tile. This is in the smallest font, lost.

The Columbia Accident Investigation Board’s subsequent report criticised technical aspects along with human factors, and singled out an over-reliance on PowerPoint:

“The Board views the endemic use of PowerPoint briefing slides instead of technical papers as an illustration of the problematic methods of technical communication at NASA.”

Edward Tufte’s full report makes for fascinating reading. Since its release in 1987 PowerPoint has grown to the point where it is now estimated that thirty million PowerPoint presentations are made every day. Yet PowerPoint is blamed by academics for killing critical thought. Amazon’s CEO Jeff Bezos has banned it from meetings. Typing text on a screen and reading it out loud does not count as teaching. An audience reading text off the screen does not count as learning.

Imagine if the engineers had put up a slide with just: “foam strike more than 600 times bigger than test data.” Maybe NASA would have listened. Maybe they wouldn’t have attempted re-entry. Next time you’re asked to give a talk, remember Columbia. Don’t just jump to your laptop and write out slides of text. Think about your message. Don’t let that message be lost amongst text. Death by PowerPoint is a real thing. Sometimes literally.

Thanks for reading

- Jamie

Columbia’s final crew

There is nothing new under the sun: the current New York measles epidemic and lessons from the first 'anti-vaxxers'

An 1807 cartoon showing ‘The Vaccination Monster’

What has been will be again,
    what has been done will be done again;
    there is nothing new under the sun.

Ecclesiastes 1:9

The State of New York is currently in the midst of an epidemic. Measles, once eliminated from the USA, has returned with a vengeance. Thanks to a rise in unvaccinated children, fuelled by the ‘anti-vaxxer’ movement, 156 children in Rockland County have been infected with measles; 82.8% of them had never had even one MMR vaccine. With measles now rampant in the boroughs of Brooklyn and Queens the state government has taken an unusual step. In New York, in the USA, the home of liberty and personal choice, no unvaccinated under-18-year-old is now able to set foot in a public space. Parents of unvaccinated children who break this ban face fines or jail.

In a previous blog I wrote about the fight against smallpox first using variolation (which sometimes caused infection) and then the invention of the world’s first vaccine. This musing is about how vaccination was made compulsory in the United Kingdom, the subsequent fight against it through a public campaign and how that movement raised its head again in the last few decades. This is the story of the first ‘anti-vaxxer’ movement and how the arguments regarding vaccination show there isn’t really anything new under the sun.

Early opposition to vaccination

Following Edward Jenner’s work in 1796 on using cowpox to offer immunity against smallpox, the Royal Jennerian Society was established in 1803 to continue his research.

Even in these early days there was opposition to the vaccine. John Birch, the ‘surgeon extraordinary’ to the Prince of Wales pamphleteered against Jenner’s work with arguments one might expect to see circulating today on social media:

A section of John Birch’s pamphlet

He did not, of course, mention that he was making a lot of money by inoculating patients against smallpox (a practice that vaccination would replace) or by using novel treatments such as electrocution.

Wood painting caricature from 1808 showing Edward Jenner confronting opponents to his vaccine (note the dead at their feet) (Creative Commons)

Despite Birch’s efforts, by 1840 the efficacy of Jenner’s vaccine was widely accepted. Decades before germ theory was established and viruses were identified, we finally had a powerful weapon against a deadly disease. Between 1837 and 1840 a smallpox epidemic killed 6,400 people in London alone. Parliament was persuaded to legislate. The 1840 Vaccination Act made the unpredictable practice of variolation illegal and made provision for free, optional smallpox vaccination.

At the time healthcare in the UK was largely unchanged since Tudor times. Parish-based charity had been the core of support for the sick and poor until workhouses were made the centre of welfare provision in 1834. With the workhouse came a stigma that illness and poverty were avoidable and to be punished. Government was dominated by two parties, the Whigs and the Tories, both non-interventionist, and the universal healthcare of the NHS was over a century away. Against this laissez-faire backdrop of punitive welfare, the fact that free vaccination was provided is remarkable, and I think it reflects the giddy optimism at a future without ‘the speckled monster’ of smallpox.

The Anti-Vaccination Leagues

The Vaccination Act of 1853 went further. Vaccination against smallpox was now compulsory for all children born after 1st August 1853, within the first three months of life, with fines for parents who failed to comply. By the 1860s two-thirds of babies in the UK had been vaccinated.

There was immediate opposition to the 1853 Act, with violent protests across the country. This was the state’s first step into the health of private citizens. The response seems to have been motivated in much the same way as modern-day opposition in the US to vaccination and to universal healthcare in general: that health is a matter of private civil liberty and that vaccination caused undue distress and risk. In England and Wales in particular, although the penalties were rarely enforced, their presence alone seems to have been motivation for opposition. The Anti-Vaccination League was established in London in 1853 to allow dissenting voices to coalesce.

The Vaccination Act of 1867 extended the age by which a child had to be vaccinated to 14, with cumulative fines for non-compliance. That same year saw the formation of the Anti-Compulsory Vaccination League, which published the National Anti-Compulsory Vaccination Reporter newsletter listing its concerns, the first three being:

I. It is the bounden duty of parliament to protect all the rights of man.

II. By the vaccination acts, which trample upon the right of parents to protect their children from disease, parliament has reversed its function.

III. As parliament, instead of guarding the liberty of the subject, has invaded this liberty by rendering good health a crime, punishable by fine or imprisonment, inflicted on dutiful parents, parliament is deserving of public condemnation.

Further newsletters were founded over the following decades: the Anti-Vaccinator (founded 1869), the National Anti-Compulsory Vaccination Reporter (1874), and the Vaccination Inquirer (1879). All of them kept up political pressure against compulsory vaccination. Much like today, the main body of argument focused on personal choice and on the testimony of parents alleging that their child had been injured or killed by vaccination. In Leicester in 1885 an anti-vaccination demonstration attracted 100,000 people: a staggering number when the city’s total population at the time was around 190,000.

A royal commission was called to advise on further vaccination policy. After seven years of deliberation, listening to evidence across the spectrum of opinion, it published its findings in 1896. Smallpox vaccination was safe and effective, but the commission advised against continuing compulsion. Following the 1898 Vaccination Act, parents who did not want their child vaccinated could ‘conscientiously object’ and be exempt. There was no further appetite for Parliament to intervene in the rights of parents. Even the fledgling socialist Labour Party, no enemy of government intervention, made non-compulsory vaccination one of its policies.

Whilst the two World Wars saw a change in public opinion towards a greater role in society for government, culminating in the creation of the National Health Service in 1948, vaccination has remained voluntary in the United Kingdom. The 20th century saw the advent of vaccines against several more deadly diseases, such as polio, measles, diphtheria and tetanus. In 1966 the World Health Organisation launched an ambitious worldwide vaccination programme; in 1980 smallpox became the first disease to be eradicated by mankind. There were dreams of polio and measles going the same way. It was not to be.

Anti-vaccination re-emerges

Herd immunity is a key component of any effective vaccination programme. Not everyone can be vaccinated, and those who cannot rely on being surrounded by vaccinated people to prevent transmission. The level of vaccination required for herd immunity varies between diseases. The accepted standard to prevent measles transmission is 90-95%.
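As a rough sketch of where that 90-95% figure comes from (using commonly cited textbook values, not anything from this article): if each case of a disease would otherwise go on to infect $R_0$ other people, transmission stalls once the immune fraction of the population exceeds $1 - 1/R_0$. Measles is unusually contagious, with $R_0$ commonly quoted at around 12-18:

```latex
% Herd immunity threshold H in terms of the basic reproduction number R_0
H = 1 - \frac{1}{R_0}
% For measles, with the commonly quoted range R_0 \approx 12\text{--}18:
H \approx 1 - \tfrac{1}{12} \approx 92\% \quad\text{to}\quad 1 - \tfrac{1}{18} \approx 94\%
```

This is a simplification (it assumes a well-mixed population and a uniform vaccine), but it shows why measles demands such high coverage compared with less contagious diseases.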

On 28th February 1998 an article was published in the Lancet claiming that the Measles, Mumps and Rubella (MMR) vaccine was linked to the development of developmental and digestive problems in children. Its lead author was Dr Andrew Wakefield, a gastroenterologist.

The infamous Lancet paper linking the MMR vaccine to developmental and digestive disorders

The paper sparked national panic about the safety of vaccination. The Prime Minister, Tony Blair, refused to answer whether his newborn son Leo had been vaccinated.

Except that, just like John Birch nearly two centuries before him, Andrew Wakefield had held a lot back from the public and from his fellow authors. He was funded by a legal firm seeking to prosecute the companies that produced vaccines, and it was this firm that led him to the parents who formed the basis of his ‘research’. The link rested on the parents of twelve children recalling that their child had first shown developmental and digestive symptoms after the MMR vaccine. Their testimony and recall alone were enough for Wakefield to claim a link between vaccination and autism. In research terms his findings were no more useful than those of the Victorian pamphlets. But the damage was done. The paper was retracted in 2010. Andrew Wakefield was struck off, as were co-authors who had not practised due diligence. Sadly, this has only fed Wakefield’s ‘legend’ as he tours America spreading his message, tapping into the general ‘anti-truth’ populist movement. Tragically, and unsurprisingly, measles often follows in his wake.

Earlier this year the largest study to date investigating the link between MMR and autism was published: 657,461 children in Denmark followed up over several years (compare that to Wakefield’s research, which rested on the parents of 12 children). No link between the vaccine and autism was found. In fact, no large, high-quality study has ever backed up Wakefield’s claim.

There are financial and political forces at work here. Anti-vaccination is worth big money: the National Vaccine Information Center in the US had an annual income of $1.2 million in 2017. And the people such groups target are economically and politically powerful. Recent research in America shows that parents who refuse vaccinations for their children are more likely to be white, educated and of higher income. They prize purity and liberty above all, and emotional reasoning over logic. They vote. And their world view is prevalent in certain circles.

Tweet by Donald Trump 28th March 2014


In the UK in 2018 the rate of MMR vaccination was 91.8%, worryingly close to the threshold below which herd immunity fails. There have been debates in the UK about re-introducing compulsory vaccination. In France certain childhood vaccinations are now compulsory. Social media companies are under pressure to close down the groups anti-vaxxers use to spread their message and recruit. The state is once again prepared to encroach on personal liberty when it comes to vaccines.

In 1901, 52% of childhood deaths in England and Wales were due to infectious diseases; by 2000 it was 7.4%. In 1901, 40.6% of all deaths were of children; by 2000 it was 0.9%. No-one would want that progress to reverse. But history has a habit of repeating itself if we let it. The debates remain the same: the rights of parents and the individual versus those of the state and public health necessity. This is a debate we have to get right. History tells us what will happen if we don’t. After all, there is nothing new under the sun.

Thanks for reading

- Jamie

Sweating Sickness: England’s Forgotten Plague


The history of medicine is littered with diseases which altered the course of humanity. The Black Death. Smallpox. Influenza. HIV/AIDS. Each has left its mark on our collective consciousness. And yet there is an often overlooked addition to this list: sweating sickness. This disease tore its way through Tudor England, killing within hours, before disappearing as quickly and as mysteriously as it arrived. In its wake it left its mark: a nation changed. The identity of the disease remains a matter of conjecture to this day. This is the story of England’s forgotten plague.

Background to an outbreak

It’s summer 1485. An epic contest for the throne of England is reaching its bloody climax. In a few weeks, on August 22nd at the Battle of Bosworth, Henry Tudor will wrest the crown from King Richard III and conclude the Wars of the Roses. Away from the fighting, people start dying. As contemporary physicians described:

“A newe Kynde of sickness came through the whole region, which was so sore, so peynfull, and sharp, that the lyke was never harde of to any mannes rememberance before that tyme.”

These words take on added impact when you remember the writer would have seen patients with bubonic plague. What was this disease “the like was never heard of”? Sudor Anglicus, later known as the English sweating sickness, struck quickly. The French physician Thomas le Forestier described victims feeling apprehensive and generally unwell before violent sweating, shaking and headaches began. Patients usually died within 24 hours; those who lived longer than this tended to survive. Survival did not seem to confer immunity, however, and patients could be struck multiple times. 15,000 died in London alone. We don’t have an exact figure for the mortality rate, but it is commonly estimated at 30-50%.

Outbreaks continued beyond 1485 and the reign of Henry VII into that of his grandson Edward VI, with four further epidemics: 1508, 1517, 1528 and 1551, each in summer or autumn. The disease remained limited to England except in 1528/29, when it also spread to mainland Europe.

John Keys

The principal chronicler of the sweat was the English doctor John Keys (often Latinised to John Caius or Johannes Caius) in his 1552 work ‘A Boke or Counseill Against the Disease Commonly Called the Sweate, or Sweatyng Sicknesse’. It is how we know so much about how the disease presented and progressed.

Keys noted that the patients most at risk of the disease were:

“either men of wealth, ease or welfare, or of the poorer sort, such as were idle persons, good ale drinkers and tavern haunters.”

Both Cardinal Wolsey and Anne Boleyn contracted the disease but survived; Wolsey survived two attacks. Anne’s brother-in-law William Carey wasn’t so lucky and died of the sweat. The disease’s predilection for the young and wealthy led to it being dubbed the ‘Stop Gallant’ by the poorer classes.

Keys was physician to Edward VI, Mary I and Elizabeth I. As he was born in 1510, his work on the first epidemics of sweating sickness was based on earlier reports of the illness; he had, in effect, performed a kind of literature review. Unlike le Forestier he lacked first-hand experience, and the fact that he focused mostly on noble deaths has drawn criticism. However, Keys was clear that the sweat was different from plague and other conditions, which accords with le Forestier and other physicians of the time.

The impact of the sweat permeated Tudor culture. Even in 1604 William Shakespeare was concerned enough about sweating sickness to write in his play ‘Measure for Measure’:

“Thus, what with the war, what with the sweat, what with the gallows, and what with poverty…”

How the sweat changed history

Henry Tudor was an ambitious man with a fairly loose claim to the throne of England: his mother, Lady Margaret Beaufort, was a great-granddaughter of John of Gaunt, Duke of Lancaster, fourth son of Edward III, through Gaunt’s third wife Katherine Swynford. Katherine was Gaunt’s mistress for 25 years before they married, and their children, including John Beaufort, Henry’s great-grandfather, were born before the marriage. If this sounds complicated, it is. Henry was not a strong claimant, and his chances had been further weakened in 1407 when Henry IV, John of Gaunt’s first son, recognised his half-siblings but ruled them and their descendants ineligible for the throne.

Henry Tudor’s ancestry

Henry needed alliances if he was going to get anywhere. He attempted to take the crown in 1483 but the campaign was a disaster. He was running out of time and needed to kill Richard III in battle if he was going to be king. He accepted the help of King Charles VIII of France, who provided him with 2,000 mercenaries from France, Germany and Switzerland. This force crossed the English Channel on 7th August 1485, and it was in this army that the sweat first appeared. There is debate about whether this was before or after the Battle of Bosworth, but Lord Stanley, a key ally of Richard III and the contributor of 30% of the king’s army, used fear of sweating sickness as a reason not to join the royal forces in battle. It’s therefore possible that sweating sickness was present before Bosworth and helped shape the course of English history.

Arthur Tudor (1486-1502)

Sweating sickness may have had a further impact on the Tudors and their role in our history. Henry VII’s first son, Arthur, Prince of Wales, died in 1502 aged 15; sweating sickness has been suggested as the cause of his sudden death. His death made Henry VII’s second son, also called Henry, first in line to the throne, which he took in 1509 as King Henry VIII.

What was the sweat?

Unlike other plagues the identity of sweating sickness remains a mystery to this day. The periodicity of the epidemics suggests an environmental or meteorological trigger and possibly an insect or rodent vector.

A similar disease struck Northern France in 1718 in an outbreak known as ‘the Picardy sweat’. 196 local epidemics followed until the disease disappeared in 1861, its identity also a mystery. Interestingly, the region of France where the Picardy sweat arose is near where Henry Tudor’s force of French, German and Swiss soldiers amassed prior to the Battle of Bosworth.

Several diseases have been proposed as the true identity of the sweat. Typhus (not virulent enough), influenza and ergotism (neither matches the recorded symptoms) have all been suggested and dropped. In 1997 it was proposed that a hantavirus could have been responsible. Hantaviruses are spread by inhalation of rodent droppings and cause symptoms similar to sweating sickness before killing through bleeding and complications of the heart and lungs. Although rare, they have been identified in wild rodents in Britain. If we remember how the sweat tended to strike in late summer, when rodent numbers would be at their highest, and add in the poor sanitation of Tudor times, then hantavirus is a strong candidate.

We’ll likely never know the true identity of sweating sickness unless it re-emerges. Given the terror it inspired in Tudor England, we should perhaps be glad it remains a mystery.

Thanks for reading.

- Jamie

The Most Famous Case Report Ever

Case reports are nothing new. We’ve all told colleagues about interesting cases we’ve seen; I’ve presented a couple at RCEM. They tend to focus on the weird and wonderful, cases with surprising twists and turns but limited actual learning. That’s why case reports sit at the bottom of the table when it comes to levels of evidence. However, one in particular could be said to have marked a turning point in modern medical practice.

The Morbidity and Mortality Weekly Report (MMWR) has been published by the Centers for Disease Control and Prevention (CDC) since 1950. Each week it reports public health information, possible exposures, outbreaks and other health risks for health workers to be aware of. One case report stands out from the entire back catalogue. It was written by doctors from the University of California, Los Angeles and Cedars-Mt Sinai Hospital, Los Angeles, and published on June 5th 1981:

The MMWR June 5th 1981

Reported by MS Gottlieb, MD, HM Schanker, MD, PT Fan, MD, A Saxon, MD, JD Weisman, DO, Div of Clinical Immunology-Allergy, Dept of Medicine, UCLA School of Medicine; I Pozalski, MD, Cedars-Mt. Sinai Hospital, Los Angeles; Field Services Div, Epidemiology Program Office, CDC.

Pneumocystis pneumonia (PCP) is a rare form of pneumonia caused by the yeast-like fungus Pneumocystis jiroveci. The fungus can live in the lungs of healthy people without causing any problems, so to see it in five otherwise healthy young people (the oldest was 36) was odd.

Less than a month later the MMWR reported further cases of PCP, as well as Kaposi sarcoma, in 26 previously well homosexual men in Los Angeles and New York since 1978. Kaposi sarcoma is a very rare form of cancer previously seen mostly in older men of Jewish or Mediterranean descent; it was virtually unheard of in young men. It was suspected that something was affecting these patients’ immune systems, preventing them from fighting off infections and malignancy.

At the time there were many theories as to what was causing patients’ immune systems to shut down. It was felt to be linked to the ‘gay lifestyle’ in some way, leading to the stigmatising media label GRID (gay-related immunodeficiency), first used in 1982. By 1983 the disease had also been linked to injecting drug users, haemophiliacs who’d received blood products and Haitians, leading to another stigmatising phrase, ‘the 4H club’ (homosexuals, heroin addicts, haemophiliacs and Haitians).

In 1982, however, the CDC gave it a proper name: ‘Acquired Immune Deficiency Syndrome’, or ‘AIDS’.

The fact that it was being transmitted to blood product recipients suggested the cause had to be viral, as only a virus could pass through the filtration process. In 1983 two rival teams, one American and one French, each announced they had found the virus causing AIDS, with ongoing debate as to who got there first. Each team gave it a different name; in 1985 a third was chosen: ‘Human Immunodeficiency Virus’, or ‘HIV’. By that time the virus had spread beyond America to Canada, South America, Australia, China, the Middle East and Europe. Since 1981 more than 70 million people worldwide have been infected with HIV and about 35 million have died of AIDS.

The MMWR of 5th June 1981 is now recognised both as the beginning of the HIV/AIDS pandemic and as the first published report of HIV/AIDS. Although only a case report, it shows the value of these publications at the front line. Only by recording and publishing the ‘weird and wonderful’ can we share practice, appreciate patterns and spot emergent diseases.

Thanks for reading

- Jamie


Mass Hysteria & Moral Panic: The Dancing Plagues and #Momo


“The only thing we have to fear is fear itself” - President Franklin Roosevelt

Human beings are social animals with conventions and norms. Yet sometimes we act in ways that defy logic or reason. Very often the inspiration for this is fear. In this week’s blog I’ve looked at the ‘Dancing Plagues’ of the Middle Ages and the ‘Momo Challenge’ of last month and how they illustrate the power of fear over people. In a previous blog I wrote about the importance of being accurate with psychiatric diagnosis and so I’m also going to use them as examples of the difference between mass hysteria and moral panic.

In 1374, in Aachen on the River Rhine in what is now Germany, locals suddenly started dancing. They didn’t stop. They couldn’t stop. Justus Hecker, a 19th-century physician who researched the phenomenon, described it thus:

“They formed circles hand in hand, and appearing to have lost all control over their senses, continued dancing, regardless of the bystanders, for hours together, in wild delirium, until at length they fell to the ground in a state of exhaustion. They then complained of extreme oppression, and groaned as if in the agonies of death, until they were swathed in cloths bound tightly round their waists, upon which they again recovered, and remained free from complaint until the next attack.”

The dancing spread to Liege, Utrecht, Tongres and other towns in the Netherlands and Belgium, up and down the Rhine river. It was known as St. John’s Dance or St. Vitus’ Dance as these saints were blamed for causing the ‘disease’.

In July 1518, this time in Strasbourg, a woman known as Frau Troffea started dancing and carried on for four to six days. By the end of the week 34 people had joined her, and by the end of the month 400 people were dancing. It didn’t stop. It seemed contagious. The local authorities initially thought that the people just needed to get the dancing out of their systems and so hired musicians to encourage it. It didn’t work. Dozens collapsed and died of exhaustion.

The ‘Dancing Plagues’ continued to occur sporadically throughout medieval Europe. It’s still a mystery why so many people acted in such a bizarre way. Possession was blamed, but exorcisms had no effect. Doctors at the time wondered if the stars or even spider bites were responsible. Various theories have since been put forward, such as encephalitis, which can cause delirium and hallucinations, or poisoning of grain by the ergot fungus. None completely explains all the symptoms. There may have been an element of religious demonstration, but that doesn’t explain why the dancing seemed contagious.

It may be that ‘Dancing Mania’ was due to mass hysteria.

Mass hysteria is a "conversion disorder," in which a person has physiological symptoms affecting the nervous system in the absence of a physical cause of illness, and which may appear in reaction to psychological distress.

It has been suggested that mass hysteria has 5 principles:

  1. "it is an outbreak of abnormal illness behavior that cannot be explained by physical disease"

  2. "it affects people who would not normally behave in this fashion"

  3. "it excludes symptoms deliberately provoked in groups gathered for that purpose," such as when someone intentionally gathers a group of people and convinces them that they are collectively experiencing a psychological or physiological symptom

  4. "it excludes collective manifestations used to obtain a state of satisfaction unavailable singly, such as fads, crazes, and riots"

  5. "the link between the [individuals experiencing collective obsessional behavior] must not be coincidental," meaning, for instance, that they are all part of the same close-knit community

In 1374 Europe was still recovering from the Black Death of 1347-1351. The people of Strasbourg in 1518 had suffered a famine. At a time when disease and famine were seen as God’s wrath, perhaps the stress of a potential apocalypse manifested itself in a plague of dancing.

If you think that mass hysteria is confined to the dark pre-Renaissance ages you’d be very wrong. In 1999, 26 schoolchildren in Belgium fell ill after drinking cans of Coca-Cola. After this was reported in the news, more students in other schools also fell ill. Coca-Cola withdrew nearly 30 million cans from the market as around 100 people eventually complained of feeling unwell after drinking its products. Yet when they were examined nothing organically wrong could be found, and the students had drunk cans from different factories.

Professor Simon Wessely, the former president of the Royal College of Psychiatrists, has written extensively about mass hysteria and is clear: mass hysteria does not make a person insane. The patient believes they are unwell and communicates that illness. This brings us to moral panic and #Momo.


If you’ve been anywhere near social media in the past month, chances are you heard about Momo or ‘The Momo Challenge’. These were posts, shared widely, warning about a character called Momo who had started appearing in online videos aimed at children, threatening them and encouraging them to perform acts of violence against themselves and others. It led to widespread discussion and warnings from authorities:

Except it wasn’t real. There is no evidence of Momo or any character infiltrating videos of Peppa Pig and inspiring children to hurt themselves. The obsession over this imaginary character has been traced to a single post in America warning about Momo. Over the following month there was an exponential growth of interest in this imaginary threat. Charities warned that the fearmongering would actually do more harm than good.

Guardian graphic. Source: Google trends. Numbers represent search interest


Professor Wessely defines moral panic as the “phenomenon of masses of people becoming distressed about a perceived — usually unreal or exaggerated — threat portrayed in catastrophising terms by the media“. Like mass hysteria there’s a threat. Unlike mass hysteria people don’t believe themselves to be unwell. The sociologist Stanley Cohen described four stages of moral panic in response to a perceived threat:

  1. The threat is depicted in a simple and recognizable symbol/form by the media

  2. The portrayal of this symbol rouses public concern

  3. There is a response from authorities and policy makers

  4. The moral panic over the issue results in social changes within the community

Look at these four steps and then at the graph above. #Momo illustrates moral panic perfectly. It also brilliantly illustrates how misinformation, and with it panic, can be spread by social media. Moral panic stems from fears about what the ‘threat’ poses to our way of life. It is therefore a powerful tool for the far right. Watch how Donald Trump and his supporters spread inaccurate information about illegal immigrants to push their border wall agenda.

The world is a scary place, and as a species we instinctively fear the unknown, especially when our way of life or our lives themselves are believed to be at risk. The panic at Oxford Street, London, last December, when gunshots were reported, is believed to have been due to a ‘perfect storm’ of fear over terrorism and violence. People panicked when a fight broke out; soon hundreds of shoppers were running as their imaginations took over. What’s worse is when that fear is used to ostracise whole groups of people. The Salem Witch Trials, on which both mass hysteria and moral panic have been blamed, are a classic example of this. For all the drama it caused, the #Momo phenomenon also shows a potential solution to fear: knowledge and a measured response.

Thanks for reading

- Jamie

Flogging, Bellows, Barrels, Smoke Up the Bum and Abraham Lincoln - Early CPR

Last week saw the induction of our new third-year students as they start the clinical phase of their time at university. I was on CPR duty for much of it. CPR as we know it was developed in 1960. For centuries before that, many different techniques were attempted to revive a patient without a pulse. It’s fair to say these were, ahem, interesting.


In the early Middle Ages a patient would be flogged or flagellated, sometimes with nettles, to try to revive them. It was presumed that the shock would wake the patient; there was no real consideration of their circulation.


In an early example of artificial ventilation, bellows began to be used in the 1500s to blow air down the patient’s throat and into their lungs. However, simple airway manoeuvres were not used. Bellows remained popular for about 300 years until 1829, when the French physician Leroy d’Etiolles demonstrated through his work on barotrauma that over-distension of the lungs could kill an animal, and they fell out of fashion.


Fast forward to the 1700s and fumigation came into use. The physician would light a pipe of tobacco and use the above contraption to literally blow tobacco smoke up the patient’s rectum. It was believed that the irritant effects of the smoke would revive the patient.

Barrel Method

Resuscitation then turned back to the lungs and techniques to force air in and out of the chest. This led to the barrel method, in which the patient was rolled back and forth over a barrel to force inspiration and expiration. You can kind of see what they were trying to do here with a mechanical intervention.

Russian Snow Method

Meanwhile in Russia in the early 19th century the snow method came into fashion: the patient was covered in snow, the idea being to slow their metabolism in the hope that circulation would return at a later date.

The Assassination of Abraham Lincoln

Whilst at Ford’s Theatre on 14th April 1865, US President Abraham Lincoln was shot in the back of the head by John Wilkes Booth. Charles Leale, a young military surgeon, and another doctor, Charles Sabin Taft, attempted to revive Lincoln with a three-stage approach:

Method “A”: “… As the President made no move to revive, I thought of another mode of death, apnoea, and I assumed my preferred position to revive him with artificial respiration …” “… I knelt on the floor over the President, with one knee on each side of the pelvis and facing him. I leaned forward, opened his mouth and inserted two fingers of my right hand as far as possible … and then I opened the larynx and made a free passage for air to enter the lungs …”

Method “B”: “… I placed an assistant at each of his arms to manipulate them in order to expand his chest, then slowly pressed the arms down by the side of the body whilst I pressed the diaphragm upwards: these methods caused air to be drawn in and forced out of his lungs …”

Method “C”: “… Also, with the thumb and fingers of my right hand, using intermittent sliding pressure beneath the ribs, I stimulated the apex of the heart …”

Lincoln did become more responsive, but it was clear his wounds were fatal and he died the next day. The three stages above are almost recognisable as the ‘ABC’ approach we’re taught today: the doctors took steps to protect Lincoln’s airway, made some attempt at forced ventilation and tried to stimulate the heart. It’s still not CPR as we would know it.

It took the US military adopting mouth-to-mouth resuscitation and CPR, as well as public campaigns helped by the arrival of Resusci Anne (more on her here), for CPR to become a key part of both medicine and health education.

It’s very easy to laugh at these old techniques and sometimes hard to see the logic behind them. However, we don’t know which staples of Medicine we use today will be deemed irrelevant or even wrong. For example, we no longer perform gastric lavage, and even collar and blocks are being debated as sometimes doing more harm than good. Maybe in the future medics will view us as incredulously as we view someone blowing tobacco smoke up the rectum of a moribund patient. Maybe.

Thanks for reading.

- Jamie

What if I Told You About Morpheus and 'The Mandela Effect'?

We’ve probably all seen this meme of Morpheus, played by Laurence Fishburne in the movie The Matrix. In the film Morpheus has a ‘teacher role’ and reveals the true nature of reality to Keanu Reeves’s character, Neo. The meme has been in use since 2012, often as social commentary noting a commonly observed truth. I’ve also seen it used in PowerPoint presentations. A simple Google search reveals the sheer number of Morpheus memes out there:

However, the truth is that Morpheus never actually says the line “what if I told you” in the film. Despite this, people who have seen it may insist they remember the line from the film itself rather than from a meme, often in this scene:

How about Darth Vader when he reveals he’s Luke Skywalker’s father in ‘The Empire Strikes Back’? Chances are you’re thinking “Luke, I am your father”, when he actually says:

And if I asked you what the Queen in ‘Snow White’ says to her mirror? Did you think “Mirror, Mirror on the wall?” Wrong again.

This phenomenon of commonly reported false memories is known as ‘The Mandela Effect’, after the large number of South Africans who recalled hearing news of the death of Nelson Mandela in the 1980s, despite the fact he went on to become their president in 1994 and actually died in 2013. Searching for the Mandela Effect will bring up pseudoscientific websites that explain the phenomenon as multiverses mixing together. Psychologists would describe it as confabulation, and it’s incredibly common. Studies have shown that subjects will falsely recall a certain word (sleep) if they’ve been hearing related words (tired, bed, pillow etc.). In another study participants were told of four events that happened to them between the ages of four and six; three were real and one (getting lost in a mall) was false. 25% of participants stated that they could remember the false memory, and some even elaborated and embellished it, giving new details they ‘remembered’. In another experiment participants’ recall of a witnessed car crash was affected by the way they were questioned: if they were asked “what speed were the cars going when they smashed?” they tended to recall a much faster speed than if they were asked “what speed were the cars going when they contacted?”

We all rely on our memories to help make us who we are and establish the values we have. It is important to be aware of how our recall and that of our patients may be affected by context or other factors we may not be aware of.

No one is easier to fool than yourself.

Thanks for reading.

- Jamie

Smallpox: the Giants' Shoulders Edward Jenner Stood on to Overcome 'The Speckled Monster'

Convergent evolution is a key principle in Biology. Basically, it means that nature will find the same ‘solutions’ for organisms filling similar environmental niches. So in different species with no close relation but who encounter similar environments (say sharks and dolphins) you’ll see similar features (both have fins). The same is true in the history of science. Very few discoveries are actually made in solitude; more often there were several people working on the same problem but often only one gets the fame. For Charles Darwin see Alfred Russel Wallace. For Sir Isaac Newton see Robert Hooke.

When it comes to vaccination and the conquest of smallpox, one of the deadliest diseases ever to afflict mankind, one name always comes to mind: Edward Jenner. We all know the story: an English country doctor in the 18th century observed how milkmaids who contracted cowpox from their cattle were immune to the far more serious smallpox. On 14th May 1796 he deliberately infected an eight-year-old boy, James Phipps, with cowpox. The boy developed a mild fever. Later Jenner exposed Phipps to pus from a smallpox blister. The boy was unaffected by smallpox. A legend was born. Vaccination became the mainstay of the fight against smallpox, and in the 20th century a worldwide campaign resulted in smallpox becoming the first disease to be eradicated.

Of course, as we all know, things are rarely this straightforward and there are many less famous individuals who all contributed to the successful fight against smallpox.

Dr Jenner performing his first vaccination on James Phipps, a boy of age 8. May 14th, 1796. Painting by Ernest Board (early 20th century).

Smallpox, the ‘speckled monster’, has a mortality rate of around 30% and is caused by a highly contagious airborne virus. It long afflicted mankind; smallpox scars have been seen in Egyptian mummies from the 3rd century BCE. Occurring in outbreaks, it is estimated to have killed up to 500 million people in the 20th century alone. Throughout history it was no respecter of status or geography, killing monarchs and playing a role in the downfall of indigenous peoples around the world. Superstition followed smallpox, with the disease even being worshipped as a god in several cultures.

Doctors in the West had no answer to it. But in other parts of the world solutions had been found, and with colonisation and exploration the West started to hear about them. More than seventy years before Jenner’s work, two people on either side of the Atlantic were inspired by customs from Africa and Asia.

Left: Shitala, the North Indian goddess of pustules and sores

Right: Sopona, the god of smallpox in the Yoruba region of Nigeria

Lady Mary Wortley Montagu

In 1721 in London, Lady Mary Wortley Montagu, the wife of the Ambassador to Turkey, who had lost her brother to smallpox and survived the disease herself, was keen to protect her daughter. Whilst in Turkey she had learned of the local practice of ‘variolation’, or inoculation against smallpox: pus from the pustules of patients with mild smallpox was collected and scratched into the skin of an uninfected person. This was a version of a process dating back to circa 1000 AD in China, where pustule material was blown up the nose of a patient. Mary was impressed, and her son was inoculated in Turkey in 1718.

Three years later in London her daughter was similarly inoculated in the presence of doctors of the royal court, the first such procedure in Britain. Mary campaigned for the practice to be spread.

Cotton Mather

Also in 1721, an outbreak of smallpox was ravaging Boston, Massachusetts. The influential minister Cotton Mather was told by one of his African slaves, Onesimus, about a procedure similar to inoculation. According to Onesimus, as a younger man a cut had been made into his skin and smallpox pustules rubbed into the wound; he told Mather that this had made him immune to smallpox. Mather heard of the same practice in China and Turkey and was intrigued. He found a doctor, Zabdiel Boylston, and the two began performing the first inoculations in America.

Lady Wortley Montagu and Cotton Mather never met; they lived on opposite sides of the Atlantic, and yet they learnt of the same practice. Convergent medical advances in practice.

Inoculation was not without risk, however: Prince Octavius, son of King George III, died in 1783 following inoculation. People inoculated with smallpox were also potentially contagious. An alternative was sought.

In 1768 an English physician, John Fewster, identified that cowpox offered immunity against smallpox and began offering immunisation. In 1774 Benjamin Jesty, a dairy farmer in Yetminster, Dorset, also aware of the immunity offered by cowpox, had his wife and children infected with the disease. He tested the procedure by then exposing his son to smallpox. His son was fine. The process was called vaccination, after the Latin word for cow. However, neither man published his work. In France in 1780 the politician Jacques Antoine Rabaut-Pommier opened a hospital and offered vaccination against smallpox. It is said that he told an English doctor called Dr Pugh about the practice. Pugh promised to tell his friend, also a doctor, about vaccination. His friend’s name? Edward Jenner.

It was Jenner who published his work and pushed for vaccination as a preventative measure against smallpox. It was his vaccine that was used as the UK Parliament, through a succession of Acts starting in 1840, made variolation illegal and provided free vaccination against smallpox. In a precursor of the ignorance of the ‘anti-vaxxer’ movement there was some public hysteria, but vaccination proved safe and effective (as it still is).

Cartoon by British satirist James Gillray from 1802 showing vaccination hysteria

Arguably, without Jenner’s work such an effective campaign would have been held back. As the American financier Bernard Baruch put it, “Millions saw the apple fall, but Newton was the one who asked why.” Yet Newton himself felt that “if I have seen further it is because I have stood on the shoulders of giants.” History in the West has a habit of focusing on rich white men to the exclusion of others, especially women and slaves. Any appreciation of Jenner’s work must also include those giants on whose shoulders he stood, by acknowledging the contribution of people like Lady Wortley Montagu and Onesimus, and of Fewster and Jesty - the Wallaces to his Darwin.

Thanks for reading

- Jamie

How 'The Unknown Woman of the Seine' became 'Resusci Anne'


She’s been called “the most kissed face in the world.” I think it’s fair to say that most of us will encounter Resusci Anne/Resus Anne/Rescue Anne/CPR Anne at some point of our lives. The mannequin itself dates from the 1960s when it was first produced by Laerdal Medical. However, as often with medical history, the story of Anne goes back further than that.

Paris in the 1880s showing the beginnings of the Eiffel Tower


It’s Paris in the late 1880s: a busy, bustling city where, much like in London, there was what we would now consider a morbid curiosity about death, and it was not uncommon for people to vanish. The body of a young woman, presumed to be no more than 16 years old, is pulled out of the River Seine. As is customary her body is displayed on a marble slab in the mortuary window (a popular attraction at the time) in the hope a family will come forward. They don’t.

Her body shows no sign of disease or trauma. Suicide is suspected. But something else intrigues the pathologist: her face, upon which is a calm, knowing half-smile quite unlike anything you’d expect from a person who has drowned. The pathologist is so taken by her that he orders a plaster cast to be made of her face.


Copies of the plaster cast became widespread as both a decoration and artistic inspiration. Parallels were drawn with Ophelia, the character in William Shakespeare’s ‘Hamlet’ who drowns after falling out of a willow tree. The French philosopher Albert Camus compared her enigmatic smile to the Mona Lisa.

In Richard Le Gallienne’s 1899 novella ‘The Worshipper of the Image’, the protagonist Antony falls in love with the death mask. The Russian-born poet Vladimir Nabokov wrote a 1934 poem titled “L’Inconnue de la Seine” in which he imagined her final days:

Urging on this life’s denouement,
loving nothing upon this earth,
I keep staring at the white mask
of your lifeless face.

In 1926 the death mask was included in a catalogue of death masks and was titled, ‘L'Inconnue de la Seine’ (The Unknown Woman of the Seine) and her legend was complete.

Fast forward to 1955, when the Norwegian toy manufacturer Asmund Laerdal saved his son Tore from drowning in a river. Shortly afterwards he was approached to design a mannequin to help teach cardio-pulmonary resuscitation. He decided that a female mannequin would be less frightening for students and wanted her to have as natural a face as possible. Remembering the plaster cast of L’Inconnue, he decided to replicate her face. L’Inconnue was reborn as ‘Resusci Anne’. The rest is history.

As Laerdal themselves put it: "Like Leonardo da Vinci's Mona Lisa, and John Everett Millais' Ophelia, the girl from the Seine represents an ideal of beauty and innocence." An anonymous victim of drowning is now responsible for teaching cardio-pulmonary resuscitation around the world, having briefly been the focus of gothic romantic obsession. It’s unlikely she could ever have imagined this legacy in her short life.

Thanks for reading

- Jamie

The Story behind Gray's Anatomy (the real one)

Choosing the Twitter name @mcdreeamie might have been mostly because I work at NUH DREEAM, but I’d be lying if I said I didn’t know about the TV show ‘Grey’s Anatomy’ or hadn’t watched the odd episode. However, this blog is about the real Gray’s Anatomy, the book. So please only read on if you’re interested in the book and not Meredith or Derek.

My uncle bought me a copy of Gray’s Anatomy as a present once my place in medical school was guaranteed, and it has made an eye-catching addition to my bookshelf ever since. For this blog I thought I’d look a bit into Gray and how the book came about.

Henry Gray

Henry Gray was born in 1827 in Belgravia, London. That’s pretty much all we know about his early life until he started at St. George’s University, also in London, in 1842. He’s thought to have lied about his age in order to enrol.

Back then, in order to be a medical student one would also have to be a practising member of the Church of England and show proof. To become a staff surgeon at St. George's Hospital, Gray would first have to pass the Apothecaries exam, then an exam to obtain membership in the Royal College of Surgeons, and later a difficult exam to become a Fellow of the Royal College. He was described as good looking, a bit dandyish and hard working. He was interested in anatomy and dissection from the very beginning.

Gray benefited, along with his peers, from a recent change in the law: the Anatomy Act of 1832. This gave freer licence to medical students and teachers to practice human dissection. Whereas previous restrictions had led to a dark business in grave robbing and corpse selling, now everyone wanting to practice dissection had to be registered with the Home Secretary. London, the rest of England and Wales, Scotland and Ireland each had an inspector of anatomy who would keep a list of all bodies being dissected and report to the Home Secretary. Bodies could be dissected or claimed from prisons or workhouses if no next of kin came for them. This meant Gray was able to practice anatomy in the open and within the law.

When he was 21 he won prizes in surgery. He became a member of the Pathological Society of London and a member of the Royal College of Surgeons. In 1852, still in his twenties, he was made a governor of the hospital.

Self portrait of Henry Vandyke Carter

By 1853 he had met Henry Vandyke Carter, a medical student with a talent for drawing. Gray entered and won the Astley Cooper Prize (£100, about £13,000 now) for his work “The structure and use of the human spleen.” His 350-page book contained over 50 illustrations by Carter, although no credit was given and payment was incomplete.

In 1855 Gray approached the shy Carter again, this time regarding the possibility of a textbook for medical students. Carter was more careful this time and only began work when promised £10 (about £1,000 now) a month for 15 months to produce the drawings. To put that into perspective, I’ve found one freelance book illustrator online who charges £320 a day! By summer 1858 the first copies were ready for printing for the students arriving later that year.

Gray's markings on the first edition's title page, downplaying Carter's contributions and his titles

Called Anatomy: Descriptive and Surgical, the book saw Gray seek to boost his own name whilst diminishing Carter and his contribution. Carter, presumably very fed up of Gray by this point, went to India to practice and never received a penny of royalties.

Early reviews were mixed. The British Medical Journal called it “far superior to all other treatise on anatomy, … a book which must take its place as THE manual of Anatomy Descriptive and Surgical.” The Medical Times and Gazette, however, found it “not wanted… low and unscientific in tone… compiled, for the most part, in a manner inconsistent with the professions of honesty which we find in the preface… A more unphilosophical amalgam of anatomic details and crude surgery we never met with.”

The second edition was released in 1861. That same year Gray’s nephew Charles became ill with smallpox. Gray treated Charles back to health. However, despite having received an early form of smallpox vaccination, Gray himself contracted the disease. He developed confluent smallpox, a more serious form where the lesions meet to form one whole sheet. On 13th June 1861, when he was due to be interviewed for a new post at St. George’s, he died aged just 34. As a smallpox patient all of his possessions, including his writings, were burnt. Carter stayed in India for 30 years before returning to England in 1888. He died of tuberculosis in 1897.

Gray’s Anatomy has never been out of print. In 2004 a Student edition was also released. Gray’s is published through Elsevier with online materials as well.

Henry Gray was clearly a precocious talent, albeit one with a thirst for fame. St. George’s continue to honour one of their most renowned alumni with their anatomy society. Whilst there have been rumblings of plagiarism he has clearly achieved the recognition he craved uniting nearly two centuries of students and practitioners who have used Gray’s.

Thanks for reading

- Jamie

The Wound Man: From Textbook to Emblem and Hollywood


If you think you’re having a bad day, just remember the Wound Man. Stabbed, bludgeoned and shot, yet still standing tall, it’s safe to say that his image has been on a journey of more than half a millennium to become truly iconic. That journey has taken him from the pages of a medieval textbook to Hollywood, via James Bond and the Royal College of Emergency Medicine.

Gunpowder made its way to Europe during the 14th century, probably along the Silk Road trade route with Asia. This meant that as well as the traditional war wounds from blades and arrows doctors were also seeing the effects of cannon and shrapnel. Doctors needed some form of reference to help with the myriad new forms of trauma they might encounter.


Along came Johannes de Ketham/von Kirchheim, a German physician living and working in 15th-century Italy. In 1491 he published Fasciculus Medicinae (‘the little bundle of medicine’) in Venice: basically the Oxford Handbook of Medicine of its time, compiling medical knowledge as it then stood. Written in Latin, the original edition consisted of six illustrations with accompanying text. The world’s first ‘Wound Man’ was one of these illustrations. The illustrations and sections were as follows (diagrams are from the 1495 edition):

Urine and blood letting

  • Urine section: the ‘little bundle’ starts immediately with a section on how a physician could use the colour and smell of a patient’s urine to diagnose their condition

  • Bloodletting/phlebotomy section: a full male figure showing arteries and veins and where the patient could be bled

  • ‘Zodiac figure’: another full male figure annotated with when blood can be taken from certain areas of the body depending on the time of year

‘Zodiac Man’

Obstetrics and Gynaecology

  • Gynaecology and obstetrics: including a pregnant anatomical female figure, and texts related to sexuality, generation, and disorders particular to women

  • ‘Wound Man’: this section illustrated various specific injuries and how to treat them

The original ‘Wound Man’

‘Disease Man’

  • ‘Disease Man’: labelled with various diseases and illnesses

The Fasciculus Medicinae was published again in 1495, 1500, 1509, 1513 and 1522 by which time its information was outdated and it was replaced as a prominent textbook. However, the concept of the ‘Wound Man’ continued with new injuries matching the advancements of military technology.

Possibly the most famous example of a ‘Wound Man’ was included in Feldbuch der Wundarznei (Fieldbook of Surgery), written by the Austrian field surgeon Hans von Gersdorff in 1531.

‘Wound Men’ continued to be used in textbooks until the 17th century, their forms changing with the artistic fashions of the day.

‘Wound Man’ from Feldbuch der Wundarznei (Fieldbook of Surgery), written by Hans von Gersdorff in 1531

The iconography of the ‘Wound Man’ led to its inclusion in the official blazon for the Royal College of Emergency Medicine, adopted on January 24, 1997. Used on the dexter side, it was chosen to show the injured patient in contrast to the healthy man on the sinister side. He represents how emergency medics are trained to treat patients with all kinds of injuries and injury mechanisms as well as the sheer variety seen in trauma patients.


‘Wound Men’ have been potent icons in fiction as well. ‘The Wound Man’ was a potential title for Ian Fleming’s James Bond novel ‘Dr No’, published in 1958, rejected partly over fears readers would take ‘wound’ to rhyme with ‘wound up’ rather than meaning a wounded man. In the 1981 novel Red Dragon by Thomas Harris, the serial killer Hannibal Lecter murders a patient and displays him with multiple injuries similar to a ‘Wound Man’ illustration in one of his books. This scene was also included in NBC’s television series Hannibal.


It’s safe to say that Medicine is full of symbols. ‘Wound Men’ are one of the most enduring as symbols of education, traumatic injury and an example of Medicine’s roots over the centuries.

Thanks for reading

- Jamie

How Saline Earned its Salt


The other day in Resus, as I was putting up a bag of fluids for a patient, I wondered how long we’d been using what seems an incredibly simple but important intervention for our patients. “Putting up a bag of fluids” is such a core part of resuscitating an unwell patient that the phrase just rolls off the tongue.

It turns out the history of IV saline goes back to Victorian Britain and the cholera pandemics. The Industrial Revolution brought unprecedented numbers of people to the cities, especially London, where the population trebled between 1815 and 1860. In many places sewage disposal had not changed much since Tudor days and so struggled to cope. By and large the response was simply to tip the waste into cesspits or the nearest river, hence ‘The Great Stink’ of 1858, which brought the city of London and even Parliament to a standstill.

These conditions were perfect for cholera, a disease brought to Britain from the Indian subcontinent in 1832. A secretory diarrhoea spread via the faeco-oral route, it ‘enjoyed’ five pandemics between 1817 and 1896. Cholera presents with ‘rice water’ diarrhoea (up to a litre an hour) and carries a mortality rate of up to 70% in untreated patients, largely due to extreme dehydration.

Cholera challenged the medics of the time and inspired some great work, such as John Snow’s investigation of the Broad Street outbreak in 1854, in which he rejected the miasmic theory of spread and instead proposed a causative agent carried in contaminated water. In the same year that agent was identified by Filippo Pacini in Italy, and it was first grown in culture in 1884 by Robert Koch, working in Egypt and India. Fluid replacement with intravenous fluid was another such innovation.

By 1832 the work of Dr W B O’Shaughnessy had concluded that the blood of cholera patients had lost most of its water as well as its saline contents, and The Lancet recommended “the injection into the veins of tepid water, holding a solution of the normal salts of the blood.” That same year Thomas Latta, a physician working in Leith, Scotland, attempted to treat a patient with cholera by doing just this.

He first tried treating patients with fluids inserted rectally but found that not only did it not work, it actually seemed to make their vomiting and diarrhoea worse. So he tried intravenous injection instead. He described how, over half an hour, he injected 6 pints (3.4 litres) of fluid into the basilic vein of an elderly lady with cholera: “soon the sharpened features, and sunken eye, and fallen jaw, pale and cold, bearing the manifest imprint of death’s signet, began to glow with returning animation; the pulse returned to the wrist…” Sadly he left the patient in the hands of the house surgeon, who did not repeat the treatment when she deteriorated again, and she died 5 hours later.

That same year Dr Robert Christison wrote to the Dutch government advising them on this new treatment. He described 37 cases treated with intravenous fluids, of whom 12 survived. He mentioned certain risks, including air embolus, phlebitis and the dangers of introducing so much fluid to a patient, but generally recommended the treatment. Looking back there were probably other risks too, such as secondary infection from unsterile injections, that they wouldn’t have been aware of.

In these early days there was no standardisation of what the fluid should contain, with some physicians using egg whites and some adding albumin. It wasn’t until Dr Sydney Ringer’s work in the 1880s that an optimum physiological solution was found. The dangers of fluid overload were identified quite early on too, with S.K. Mujumbar of Port Blair, India, writing in 1916 of the need to identify which cholera patients actually need fluids so as not to place “an extra amount of work on the already weakened and embarrassed heart”.

0.9% saline has since become a staple of modern medicine and is included on the World Health Organisation’s list of essential medicines, last published in 2017.

Further pandemics of cholera occurred in 1899-1923 and 1961-1975, by which time advances in public health medicine meant Western Europe was unaffected. The last outbreak in the USA took place in 1910-1911. Cholera remains endemic in many African countries and continues to re-surface whenever sanitation breaks down, as in Haiti following Hurricane Matthew. Diarrhoeal disease is the second highest cause of mortality and the leading cause of malnutrition in under-fives worldwide. The WHO describes diarrhoea as “preventable and treatable”. Re-hydration, whether oral or intravenous, remains the mainstay of treatment.

So whenever we put up a bag of fluid for a patient we are continuing a tradition started by Thomas Latta injecting 6 pints of fluid into the basilic vein of his patient in Leith.

Thanks for reading

- Jamie

Don't Just Sit There: Audience Participation in Simulation


In my last blog I talked a bit about setting the audience up for a simulation. In this blog I wanted to explore this further. Students who volunteer to simulate first are often in the minority, but it’s important the audience don’t treat watching as an easy ride and just sit there.

The standard layout when I run a simulation session is like this:


One of the audience members becomes the scribe, recording the clinical history and the story as it unfolds. This provides a point of reference for the students in the simulation as well as helping discussion during debrief. As these examples show, it also lets students be a bit creative:

We’ve often given students in the audience a checklist and asked them to record proceedings as part of student-to-student feedback.

If you’re giving the students in the simulation information such as an ECG or blood gas, make sure the audience can either see it or have a copy too. Later on, reflect on their conclusions: are they different from those reached in the simulation?

In the previous blog I mentioned lifelines such as ‘Ask the Audience’. I find that reminding the audience the students in the simulation might turn round at any moment and ask their opinion inspires them to keep paying attention!

‘Tag Team Simulation’ is something I’ve thought about but not yet tried; it would be another way to keep the audience engaged. I’d be interested to hear how it works if anyone has done it.

Managing audience members is a key part of the simulation session. If nothing else their engagement is a sign of respect for the students actually doing the simulation and helps create a positive learning atmosphere for everyone.

Thanks for reading

- Jamie

Setting the Stage: The Pre-Brief


Simulation is an integral part of medical education as well as the training of other healthcare professionals. At DREEAM I’m lucky to work with the great cohort of simulated patients trained by my colleague Ali Whitfield. Working with real human beings rather than mannequins adds another layer of realism and has become a firm fixture in the training we offer.

However, being ‘on stage’ in a simulation remains very divisive for learners. In my experience half of students will feed back how much they love simulation and want more, while the other half hate every moment, usually due to previous bad experiences or pre-conceived notions.

As I’ve worked more with simulation as a facilitator I’ve put more emphasis on the beginning: the pre-brief, or ‘setting the stage’. This is partly because of those pre-conceived notions but also because not every student arrives with the same experience of simulation. Just as an actor wouldn’t be expected simply to walk onto a stage, a novice student shouldn’t be expected to perform without clear guidance.

My pre-briefs usually follow the same pattern:

  • Introduce the session and why we are using simulation - it’s not being used arbitrarily to scare them but with very clear objectives of definite relevance

  • The setting (ward, A&E, GP practice, whatever), their role (you’re a student/doctor/nurse) and the behaviours expected - act as you would as a doctor, treat the simulated patient as a real patient and so on. How do referrals work in this scenario? Also, what is expected from the audience? I regularly get one member to act as a scribe on a flip-chart for the learners in the simulation to refer to. Encourage the audience to be active observers


  • Orientate the students around the environment and the equipment. Highlight any potential sources of confusion, such as differences between the patient’s palpable pulse rate and what is on the monitor. Will blood results be available? Are we running in real time? This helps keep the scenarios running smoothly later on

  • Acknowledge that simulation can be challenging and, where possible, allay fears - if you’ve set the scenario up so that any mistakes will be corrected and death is not an outcome, tell them! In my experience the biggest apprehension students have about simulation is that we’re trying to catch them out and they’ll kill the patient. If that won’t happen, let them know. If it might happen, let them know too, and reassure them that it’s a safe environment that allows for these mistakes

  • Lifelines - the timeout is a common theme in any simulation and it’s important to reassure students about it. I go further and add lifelines similar to ‘Who Wants to be a Millionaire?’ Students can ‘Ask the Audience’ (turn to the learners watching and ask their opinion) or ‘Phone a Friend’ (ask the facilitator a closed, focused question: “I think it’s aortic dissection; what’s the best imaging for this?”). If it’s not going well we can always time out and take a break; this is fine and should be used if needed

  • Is there a specific aspect of the simulation they’d like you to look out for? Giving useful feedback can be challenging. Students may have apprehensions about one particular skill, such as cannulation or delivering a useful referral, based on previous experiences. If they can tell you this beforehand it helps you observe and provide feedback the student can use

  • Give them appropriate information beforehand. Most of the simulations I do with students are based in the Emergency Department. One involves a patient presenting with chest pain who turns out to have a pulmonary embolus. In real life there would be an ECG, so I provide one and give them a moment to read it. I ask what the students are thinking and why; they usually say Acute Coronary Syndrome. At the end, after they have diagnosed PE, I ask them during the feedback to take us through how they changed their mind from ACS to PE (usually because the patient complains of calf pain). I reinforce how important it is to always consider ACS but also what should make us think of PE (yes, calf pain is important, but the patient also had oxygen saturations of 90% with no lung history, a clear chest and an ECG showing AF). This shows the nature of a true ‘working diagnosis’ as well as helping us understand each other’s thought processes

Ultimately the pre-brief is about you and your learners finding out about each other. Standardise as much as you can. Students who go first may resent being ‘the example’, so think about making a video to show everyone what is expected at the beginning; this also makes things easier for you later on. The pre-brief is an investment in your simulation sessions: the more and better work you do at the beginning, the more rewarding your session will be.

Thanks for reading


We Need to Talk About Kevin: Is Kevin McCallister a Psychopath?


It’s Christmas time, there’s no need to be afraid. At Christmas time we let in light and we banish shade. And usually we sit down to watch a number of Christmas films, including one or both of Home Alone (1990) and Home Alone 2: Lost in New York (1992)*. Both films were written and produced by John Hughes, directed by Chris Columbus and star Macaulay Culkin as Kevin McCallister, a young boy left home at Christmas in the first film who ends up alone in New York in the second. In both he has to defend himself against a couple of bumbling burglars, ‘The Wet Bandits’: Harry (Joe Pesci) and Marv (Daniel Stern). Home Alone remains the highest grossing live action comedy in the US and only lost the worldwide title in 2011 to The Hangover Part II. Both films are firm fixtures for Christmas viewing.

Yet, on a recent viewing the other day, one question kept nagging at me. Not the sad decline of Macaulay Culkin from childhood star to cautionary tale for any child who becomes famous, nor the fact that the McCallister family would clearly have social services swarming over them. No, this question was about the character of Kevin himself. Beyond all the jokes and slapstick, is Kevin McCallister a psychopath?

I’m not a psychiatrist so I first had to look up the criteria for a diagnosis. It turns out psychopathy no longer really exists as a diagnosis and has largely been replaced by antisocial personality disorder (ASPD). So my question immediately changed: does Kevin McCallister fulfil the criteria for ASPD?

As Kevin McCallister is American it seemed right to base any diagnosis on the criteria of the American Psychiatric Association. Fortunately, they publish their diagnostic criteria in the Diagnostic and Statistical Manual of Mental Disorders (DSM), now in its 5th edition, published in 2013. The DSM defines the essential features of a personality disorder as “impairments in personality (self and interpersonal) functioning and the presence of pathological personality traits.” DSM-5 details very clear criteria for a diagnosis of ASPD.

First there need to be significant impairments in self functioning AND in interpersonal functioning.

DSM defines impairments in self functioning as either identity or self-direction:

a. Identity: Ego-centrism; self-esteem derived from personal gain, power, or pleasure.

b. Self-direction: Goal-setting based on personal gratification; absence of prosocial internal standards associated with failure to conform to lawful or culturally normative ethical behaviour.

Kevin certainly has high self-esteem. In Home Alone he genuinely believes that, through his own power, he has made his own family disappear, a belief that prompts celebration:

In terms of conforming to legal or ethical behaviour, at no point in either film does he tell the police or any authority that he’s home alone or at risk from the Wet Bandits. Indeed, he shoplifts (albeit inadvertently) in the first film and uses his father’s credit card to check into a luxury hotel in the second. He certainly uses his freedom for personal gratification, spending $967 ($1,742.98 in today’s money) on room service alone in Home Alone 2.

So far he seems to be ticking the boxes without us even mentioning the vigilante justice. More on that violence later.

On to interpersonal functioning, defined by the DSM as either in empathy or intimacy:

a. Empathy: Lack of concern for feelings, needs, or suffering of others; lack of remorse after hurting or mistreating another.

b. Intimacy: Incapacity for mutually intimate relationships, as exploitation is a primary means of relating to others, including by deceit and coercion; use of dominance or intimidation to control others.

Speaking as a doctor in Emergency Medicine, let’s get this straight: Kevin would have killed the Wet Bandits several times over, especially in the second film where, for a start, he throws four bricks onto Marv’s head from height. Marv would be dead. No question. Kevin McCallister is attempting murder:

Later on he sets up elaborate traps and, rather than running away (as most would), stays around merely to supervise Marv being electrocuted and Harry setting fire to his head:

At no stage does he show any remorse; indeed he celebrates what he’s doing. Prior to meeting Kevin the Wet Bandits were cartoon villains, non-violent and stupid. Did they deserve to die? Kevin clearly felt it was worth the risk, and he enjoyed it.

He does form friendships in both films with people he previously feared: Old Man Marley and the Pigeon Lady. He inspires the former to re-connect with his family and gives the latter a present, which does suggest he can form bonds with people. However, both were useful to him in escaping the Wet Bandits, so it could be argued he was exploiting and rewarding them for his own benefit. This is open to debate, but for the purposes of this blog let’s assume it was Machiavellian manipulation and move on.

The patient then needs to have pathological personality traits in antagonism and disinhibition.

The DSM defines antagonism as:

a. Manipulativeness: Frequent use of subterfuge to influence or control others; use of seduction, charm, glibness, or ingratiation to achieve one’s ends.

b. Deceitfulness: Dishonesty and fraudulence; misrepresentation of self; embellishment or fabrication when relating events.

c. Callousness: Lack of concern for feelings or problems of others; lack of guilt or remorse about the negative or harmful effects of one’s actions on others; aggression; sadism.

d. Hostility: Persistent or frequent angry feelings; anger or irritability in response to minor slights and insults; mean, nasty, or vengeful behaviour.

We’ve already looked at Kevin’s violence and cruelty. He’s also certainly a master of deception. Throughout both films he is adept at speaking to adults and spinning stories with great ease. Lying comes very easily to him, as does using props and music, whether to pretend to be his dad in the shower, a gangster with a gun or even a house full of celebrating people:

This is a crafty kid who is willing to lie and smile while doing it.

Disinhibition is defined by the DSM as:

a. Irresponsibility: Disregard for – and failure to honor – financial and other obligations or commitments; lack of respect for – and lack of follow through on – agreements and promises.

b. Impulsivity: Acting on the spur of the moment in response to immediate stimuli; acting on a momentary basis without a plan or consideration of outcomes; difficulty establishing and following plans.

c. Risk taking: Engagement in dangerous, risky, and potentially self-damaging activities, unnecessarily and without regard for consequences; boredom proneness and thoughtless initiation of activities to counter boredom; lack of concern for one’s limitations and denial of the reality of personal danger.

In both films Kevin shows poor regard for his own safety, whether climbing down a rope soaked in kerosene, zip-lining from the roof of his house or climbing his brother’s shelves:

Early on in both films he is shown impulsively lashing out in anger when frustrated at having his pizza eaten or embarrassed in public during his choir solo. Kevin is not inhibited:

And violence clearly comes naturally to him. So far… he seems to be meeting all the criteria.

Finally these factors must be consistent across time and place. They must not be due to intoxication or head injury.

As Kevin behaves the same way in Home Alone (1990, set in Chicago) and Home Alone 2 (1992, set in New York) we can assume his behaviour is consistent across time and place. At no point is he seen taking drugs or drinking alcohol, so we can rule those out as a cause. He does hit his head slipping on ice in Home Alone 2, but that’s very late on and isn’t shown to affect his behaviour in any way. Once again he meets the criteria.

They must not be better understood as “normative for the individual’s developmental stage or socio- cultural environment.”

Kevin is a remarkable child acting in a way we wouldn’t expect of a boy his age. He definitely has, at best, a chaotic family, and there’s no doubt that after Home Alone 2 social services would have come down on the McCallisters like a tonne of bricks:

However, the house is pristine and all the children look well nourished and well dressed. While there are plenty of questions about what kind of jobs the McCallisters must do to fund this lifestyle, there’s no indication that this is a family where violence is the norm. Box ticked again.

Finally, the individual is at least age 18 years.

Ah, here it falls down right at the last. Kevin is 8 in Home Alone, well below the age at which we can diagnose ASPD. NICE does have a Quality Standard (QS59), first published in 2014, aimed at identifying children at risk of ASPD, which includes interventions for the whole family. But in no way can a conclusive diagnosis be made in a child.

- - -

So is Kevin McCallister a psychopath? By the DSM diagnostic criteria the answer is no. Does he show some traits that might set off alarm bells? The answer is yes. However, there is a debate about whether ‘psychopathy’ is an evolved trait which has been able to survive through natural selection as it has benefited human society to have individuals without morals prepared to do whatever it takes to achieve their goals (Glenn, Kurzban and Raine, 2011). Maybe we should celebrate Kevin’s innate traits as he uses them to defend himself and his family. After all, it means the bad guys get caught and it wouldn’t really make good films if he just rang the police like a good citizen.

- - -

Time to be serious now. This blog highlights a big issue I find with mental health: perhaps more than with any other specialty, people are all too willing to play ‘keyboard psychiatrist’ and diagnose public figures such as Donald Trump with a mental illness. While this blog is meant as a bit of fun, it shows that mental health has very clear diagnostic criteria to be met before we use loaded terms such as ‘psychopath’. That can be my Christmas message: before you make a stigmatising diagnosis, make sure you know what you’re talking about. In fact, in general: research first, speak later. Let’s be nice people.

Merry Christmas, you filthy animals

*Yes, I am aware there is a ‘Home Alone 3’ and even things called Home Alone 4: Taking Back the House and Home Alone: The Holiday Heist. I just choose to ignore them, as we all should.



Glenn, A., Kurzban, R. and Raine, A. (2011). Evolutionary theory and psychopathy. Aggression and Violent Behavior, 16(5), pp.371-380.

Hopwood, C. J., Thomas, K. M., Markon, K. E., Wright, A. G. and Krueger, R. F. (2012). DSM-5 personality traits and DSM-IV personality disorders. Journal of Abnormal Psychology, 121(2), pp.424-432.

Pitch Perfect: Delivering a Short Talk

Earlier this year I submitted an abstract to the Association for Simulated Practice in Healthcare (ASPiH) conference about my work on a smartphone application for medical students in the Emergency Department.

On 3rd August I received an email informing me I’d been accepted for a poster presentation. Joy was then met with no small amount of trepidation as I read further: “You should plan to speak for 3 minutes with an additional 2 minutes for questions.”

3 minutes?

180 tiny seconds of time to talk about my project, my baby?

I’d presented for 10 minutes before but this was something else in terms of brevity. So I tried to have a process. This is my process; I’m not saying it’s perfect and there are still things I’d want to change, but it’s an example of how to approach the problem of getting your message across in a short period of time.

The idea of any talk is to engage, inspire and inform. That balance can be shifted around: the longer your talk, the more you can inform, at the risk of losing engagement and inspiration. Obviously with only 3 minutes I was going to have to cut down on the informing and focus on engaging and inspiring. I needed another way of informing the audience.

So, I wrote a blog. This step helped me get to know the subject and clarify the key points along the way. What’s killer and what’s filler? What is nice to know and what is absolutely essential to your message? By doing this I realised I had three (always a good number) key points I wanted to highlight: the app was easy to make; it flipped the classroom and blended learning; and students value opportunities to practise digital literacy in simulation.

My talk would therefore be used to get those three points across and hopefully inspire my audience to look for the blog to read more.

Once the blog was written (and that took a long time) and I’d realised what the purpose of my talk would be, I decided to treat it as an ‘elevator pitch’, the format used in business to engage potential employers or investors in a project. Essentially you imagine you’re in the lift with the person you want to impress, and you’ve got the time it takes to travel from the ground floor to the top to sell yourself or your product.

With that in mind I did a few Google searches to see what advice is out there on creating an elevator pitch. Unsurprisingly, an industry has been made out of this, so there’s a lot of information but little I found useful without having to pay. However, I found this blog by Alyssa Gregory very useful. She breaks the process down into stages:

  1. Define who you are

  2. Describe what you do

  3. Identify your ideal audience

  4. Explain what’s unique and different about you

  5. State what you want to happen next

  6. Create an attention-grabbing hook

  7. Put it all together (start with 6)

So I did this. I wrote it out.

And I read it out loud, slowly and clearly. It came in at 2 minutes 45 seconds. Great! But reading it out loud, it felt flat. And weird. I realised that while planning it like an elevator pitch was a useful tactic, the key difference was that I wasn’t selling anything. So I took the next step. I practised. With an audience.

This seems obvious but it’s key. Only by performing in front of colleagues who I knew would be constructive in their criticism did I start to get a sense of what it was like to hear about the application. They could tell me about the ebb and flow and how easy it was to follow my three key messages. There was a rewrite. And another.

Finally, I was lucky in that my poster was on the third day of ASPiH, so on the second day I was able to watch a poster session, get a feel for the room and see what works and what doesn’t. Clear, slow speech was vital. A bit of humour if possible. Don’t distract with your body language. Look at the audience, not your poster.

There was another slight rewrite and then a lot more practising.

This is what I came up with.

As I said before it’s not perfect but I was happy with it and it seemed to be well received.

I hope this is useful to you and helps if you’ve got a very short presentation to make. Any of your own tips? Anything you don’t agree with? Let me know!

Medical Education - There's An App For That


Third year medical students at the University of Nottingham on clinical attachment at the Queen’s Medical Centre spend one week with us in DREEAM and the Emergency Department. Emergency Medicine can be daunting for experienced students, let alone those right at the beginning of the clinical phase of the course. Also, as students arrive in the spring, their week is often shortened by Bank Holidays and other commitments such as core teaching. I wanted a way to give the students as much information as possible about the workings of the department and its protocols, as well as a way of increasing contact time to make the most of their attachment. A smartphone application seemed a practical solution.


The thought of creating an application might bring to mind Matrix-style screens of numbers and formulas indecipherable to anyone except hackers and programmers. The truth is it has never been easier to create your own application thanks to a number of websites that require no coding experience. A simple Google search reveals just what a crowded market app building has become, so I recommend a bit of research to find the platform that works best for you. Most work along the lines of choosing a template and then adding functionality, such as social media links, in a ‘drag and drop’ style. I chose AppMakr for its ease of use and because the app also works as a website, allowing students to access it on their computers and laptops as well.

The whole process is based on simply picking the functionality you want (say, a calendar) and dragging it across. Each function is based on an RSS feed, so you link your calendar to a Google Calendar or your podcast icon to a SoundCloud feed. For the app to carry our department’s Initial Assessment Tools (IATs) I created a Google Drive folder containing the guidelines as PDF files and used the shareable link, so that clicking the IATs button opened them up for the students to use.
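For anyone curious what ‘based on an RSS feed’ actually means, the sketch below shows a minimal RSS 2.0 feed of the kind such builders consume, read back with Python’s standard library. The feed title, item and URLs are purely hypothetical examples, and nothing here is specific to AppMakr itself:

```python
import xml.etree.ElementTree as ET

# A minimal RSS 2.0 feed: one channel containing one item.
# Titles and URLs below are made up for illustration.
feed_xml = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>DREEAM IATs</title>
    <link>https://example.com/iats</link>
    <description>Initial Assessment Tools</description>
    <item>
      <title>Chest Pain IAT</title>
      <link>https://example.com/iats/chest-pain.pdf</link>
    </item>
  </channel>
</rss>"""

# Parse the feed and list each item's title and link - in effect,
# what an app button displays when it is wired to a feed like this.
root = ET.fromstring(feed_xml)
items = [(item.findtext("title"), item.findtext("link"))
         for item in root.iter("item")]
print(items)
```

Each ‘function’ in the app is just a button pointed at a feed like this, which is why linking a Google Calendar or a SoundCloud account is enough to populate it.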

Backgrounds can be played around with too. I tried importing pictures but found the resolution and sizing difficult; I think this had more to do with the images I was trying to bring in than the website itself.

You can pay to build your app and submit it to iTunes for consideration (it will need to be very good to be approved), or AppMakr will create a mobile website for free. The mobile website for this application can be found here. All students need to do is save the website to the home screen of their mobile or tablet and there it is: an instant application. It was that simple.

Simply pressing the green cross brings up the list of functions; it acts as a kind of home button.

Using Google Drive links, the presentation buttons bring up the slides for each session; the IATs function works the same way. The calendar links to a Google Calendar account created especially for this application.

External resources were linked to the application as well: selected Free Open Access Medical Education (FOAM) resources alongside dedicated social media, podcast and YouTube resources to help the students during the attachment. Unlike other apps, this worked without an internet connection, which helps in the concrete basement of our Emergency Department!


Finally, a forum was created where students and educators could log in and communicate. This helped with sharing resources and furthering discussions, which brings me to my next point:


The flipped classroom describes students being given learning resources before a teaching session; the session itself is then used to explore the material in greater detail. The role of the teacher in the flipped classroom has been described as a facilitator and coach, or ‘guide on the side’ (Baker 2000). This notion resonates with theories in medical education regarding the exponential growth, and shortening half-life, of knowledge, such as connectivism (Siemens 2005) and navigationism (Brown 2006). The growth of Web 2.0 resources such as blogs and podcasts further facilitates this.

Using the application it was possible, via the forum and word of mouth, to point the students to a particular resource (say, a podcast on abdominal pain) and advise them to listen to it before their session on that topic. By flipping the classroom we can establish a set level of prior knowledge on which the traditional session can then build, maximising our time together. There are a number of resources aimed at helping educators set up a flipped classroom programme, such as this one from the Flipped Learning Network.

The forum allowed further discussion around (anonymised) cases or other subjects that couldn’t be covered in the timetabled sessions. These discussions were held between 1700 and 1900 in the evening and all students were encouraged to participate. This use of online resources alongside traditional sessions is called blended learning. The idea was that each session would lead into online discussion, which would then be referred back to in the next face-to-face session; this rhythm takes some time to get used to but is rewarding. The Higher Education Academy has a great page on blended learning here.


In all, 18 students ‘downloaded’ the application. There was an orientation session at the beginning as well as clear guidance on professional online behaviour, so-called ‘netiquette’.


All sessions were designed to integrate the application, whether through guideline reference in simulation or signposted online resources during seminar sessions. Every student used the application in at least one session, with 11 students (61%) using it in every session.

The forum transcript was analysed for emergent themes. As discussed above, the forum was a key part of the ‘flipped classroom’ and of improving communication between educators and students. This produced logistical messages communicating session changes or short-term information such as interesting patients:


There was signposting to resources to consolidate sessions or for prior learning before teaching:

There were also online discussions instigated around investigations or clinical questions:

The reasons students gave for using the application during simulation were also assessed. By far the most common was to look up departmental guidelines (13), followed by looking up national guidelines (4), looking up free open access resources (2) and looking back at a presentation (1).

Students were asked to rate the application on a Likert scale. All students rated the application positively (4 or 5), with 14 (78%) rating it 5/5.
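For transparency, the headline percentages above can be re-derived from the raw counts. A minimal sketch (counts are taken from the text; the variable names are purely illustrative):

```python
# Re-deriving the reported engagement and rating percentages
# from the raw counts given in the text.
students_total = 18      # students who downloaded the application
used_every_session = 11  # used the application in every session
rated_five = 14          # rated the application 5/5 on the Likert scale

pct_every_session = round(100 * used_every_session / students_total)
pct_rated_five = round(100 * rated_five / students_total)

print(pct_every_session)  # 61
print(pct_rated_five)     # 78
```

Both figures match those reported (61% and 78%).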


Finally, students were given a free text box for further feedback. As the diagram below shows this feedback focused on themes of reference and reflecting real life practice:

This suggests that medical students appreciate the role of guidelines and technology in clinical practice. The importance of digital literacy and of navigating resources for expedient care is well documented, with the Royal College of Nursing and Health Education England launching programmes to improve and formalise skills in this area (The Royal College of Nursing, 2018 and Health Education England, 2018). Studies have looked at promoting digital literacy amongst medical students (Mesko, Győrffy and Kollár, 2015). Perhaps with the rise of connectivism and/or navigationism as a learning paradigm we will see even more work in this area, with digital literacy becoming a key skill alongside clinical acumen. Anecdotally, when speaking to the students, the biggest barrier to using the application was familiarity. Once this was overcome they felt comfortable using it and contributing on the forum.

Students value innovation from educators, but this requires careful design of the curriculum and resources to ensure feasibility and compatibility. The project suggests that application design is feasible for educators with limited coding experience, and the results indicate that students engage well with a blended curriculum. Engagement with the application was related to student confidence and perceived relevance. Mobile learning tailored to simulation offers students a realistic experience. Resource design may become a crucial part of educator development, and students require guidance when new resources are presented to them.


Baker, J. W. (2000) The “Classroom Flip”: Using Web Course Management Tools to Become the Guide on the Side. In: the 11th International Conference on College Teaching and Learning: Jacksonville, Florida.

Brown, T. (2006) "Beyond constructivism: navigationism in the knowledge era", On the Horizon, Vol. 14 Issue: 3, pp.108-120,

Flipped Learning Network (FLN). (2014). The Four Pillars of F-L-I-P™.

Higher Education Academy. (2018). Blended learning. [online] Available at: [Accessed 19 Oct. 2018].

Higher Education Academy. (2018). Flipped learning. [online] Available at: [Accessed 19 Oct. 2018].

Health Education England. (2018). Digital literacy. [online] Available at: [Accessed 29 Oct. 2018].

Mesko, B., Győrffy, Z. and Kollár, J. (2015). Digital Literacy in the Medical Curriculum: A Course With Social Media Tools and Gamification. JMIR Medical Education, 1(2), p.e6.

Siemens, G. (2005). Connectivism: A learning theory for the digital age. International Journal of Instructional Technology and Distance Learning, 2(1), 3-10.

The Royal College of Nursing. (2018). Improving Digital Literacy. [online] Available at: [Accessed 29 Oct. 2018].

What is Netiquette? A Guide to Online. (2018). [online] Available at: [Accessed 29 Oct. 2018].

Heart or Head - Can We Teach Empathy?


First, I want you to read the descriptions of these four patients and imagine, really imagine, them in front of you as if this is their triage and you're just about to see them:

  • A teenager with leukaemia on chemotherapy who's presenting with a fever, shortness of breath and a productive cough

  • An elderly lady who has tripped on the pavement whilst collecting for charity and is presenting with right hip pain and shortening and external rotation of her right leg

  • An intravenous drug user who presents with left groin pain and swelling with fever two days after injecting there

  • A middle aged man in police custody after being arrested at a far right rally for smashing a window brought to you with a laceration to his right arm who's being racially abusive

You're probably already making working diagnoses based on these statements: possible neutropenic sepsis, right neck of femur fracture, groin abscess, laceration needing sutures. But how do you feel reading them? Sure, you're going to do your best and you'll be professional, but how would you feel inside as you see these hypothetical patients? Would you feel compassion? Sympathy? How about empathy?

Empathy has been an important factor in the reflections all healthcare professionals underwent following the Mid Staffs scandal:

“food and drinks were left out of reach of patients…the standards of hygiene were at times awful with families forced to remove used bandages and dressings from public areas…people…suffered horrific experiences that will haunt them and their loved ones for the rest of their lives” (Campbell, 2018).

The Francis Report identified empathy as one of the main professional attributes that enable compassionate care (Mid Staffordshire NHS Foundation Trust Public Inquiry, 2018). The Department of Health and NHS Commissioning Board recommended in 2012 that "care is given through relationships based on empathy, respect and dignity" (2018). This leads us to the obvious question:



I think it's fair to say that empathy's meaning can get lost in translation. A straw poll of colleagues on a certain shift led to a number of different suggestions: "it's when you feel sorry for someone", "it's when you show you care", "it's when you can put yourself in their shoes".

Empathy is defined by Merriam-Webster (2018) as:

  1. the action of understanding, being aware of, being sensitive to, and vicariously experiencing the feelings, thoughts, and experience of another of either the past or present without having the feelings, thoughts, and experience fully communicated in an objectively explicit manner; also the capacity for this

  2. the imaginative projection of a subjective state into an object so that the object appears to be infused with it

I like this line from Harper Lee's 'To Kill a Mockingbird':

"You never really understand another person until you consider things from his point of view - until you climb inside of his skin and walk around in it."

First things first: empathy is not about feeling sorry - that's sympathy. But let's look at the above definition from Merriam-Webster. "Understanding, being aware of, being sensitive to" - all sound very appropriate for the caring professional. But "vicariously experiencing the feelings, thoughts, and experience of another" and the capacity for "imaginative projection" - are these necessary? Is it healthy for healthcare professionals, exposed to challenging environments every day, to be put through such an emotional wringer?



We certainly know that empathy has a biological framework. Empathy has a proven genetic basis (Knafo, 2008), acting as an evolutionary advantage by promoting altruism and other prosocial behaviours (de Waal, 2008). Displaying empathy is linked to known neuronal pathways such as those involved in facial mimicry (Sonnby-Borgstrom et al, 2003) and imitation (Field et al, 1982 and Forman et al, 2004), as well as the mirror neuron (Iacoboni, 2008) and limbic systems (Carr et al, 2003, Iacoboni and Dapretto, 2006 and Preston and de Waal, 2002). In contrast, deficits in empathy are particularly evident after focal prefrontal cortex damage, with poor empathic development seen in children with early damage to their frontal lobes (Eslinger, 1998). Further evidence of the innateness of empathy has been reproduced in several studies showing how newborns as young as 18 hours old are distressed by others' crying (Martin and Clark, 1982, Sagi and Hoffmann, 1976 and Simner, 1971).

Behavioural psychologists have hypothesised that everyone falls somewhere on a spectrum between empathising and systemising tendencies (empathy-systemising theory) (Baron-Cohen, 2009).



If we accept empathy as behaviour, we can focus on these prosocial behaviours in a simulated environment with our students. Kahn's Etiquette-Based Medicine (Kahn, 2008) is an interesting development in this area. He looked at clinical checklists, such as Pronovost's checklist for inserting central lines (2018), and applied the same principle to empathetic behaviour in a potentially simulated environment, giving the following example:

1. Ask permission to enter the room; wait for an answer.

2. Introduce yourself, showing ID badge.

3. Shake hands (wear glove if needed).

4. Sit down. Smile if appropriate.

5. Briefly explain your role on the team.

6. Ask the patient how he or she is feeling about being in the hospital.

Let's imagine designing a simulation where this could be assessed. Our students could cycle through simulated patients with a variety of different diagnoses: a patient in pain, a patient with substance abuse, a patient with mental health issues. The simulation could cover the standard A-E and diagnostic approach while including this empathic assessment just as prominently. The simulated patient would be very useful in feedback:

“When I mentioned that my husband had died it was nice that you held my hand.”


“You didn’t always look me in the eye which made me feel like you were talking over me sometimes.”

Those observing the simulation can also give similar feedback based on their third-person observations in a purely objective format: we saw or heard this, therefore it appeared as this. It would be useful to include fellow students in this audience so they can learn from each other. Further studies have expanded on Kahn's initial checklist to include suggestions for a second meeting with a patient (Castelnuovo, 2013).

Kahn accepts the pedagogical limitations here; this is behaviour, not feeling. But he argues that it is a useful first step for students starting to develop empathy. This reminds me of the 'Hello my name is...' campaign, an incredibly successful movement based on a very simple premise. As Kahn himself argues, it is easier to change behaviour than attitude.

We already use simulation and observed exercises as the basis of teaching and assessing communication skills. A student could, in theory, pass a station marked for communication skills without issue because they know that is what is being assessed, then forget those skills in subsequent stations. The current move towards more holistic OSCE assessment partially recognises this possibility; a station is no longer marked on one particular topic but instead assesses focused examination, investigation interpretation and communication skills together. We have to make sure that empathy is considered in all observed real and simulated patient encounters, assessed or not.



Experiential attempts to foster empathy in students have focused on immersion and realism, such as ageing simulation (Through Their Eyes Project, 2018), and on reflection and simulation (2018). These are time- and resource-heavy. The Jefferson Scale of Physician Empathy (JSPE) offers a psychometric assessment of students and practitioners and has a growing evidence base (Tavakol, Dennick and Tavakol, 2011). The JSPE is based on self-reporting via a questionnaire. This suggests that a reflective model can be encouraged either via simulation and immersion or through qualitative questioning of subjects. Both obviously rely on resources and honesty.

So can empathy be taught? The truth is this depends on how you view empathy, as a skill or a behaviour. Even then, some of the greats of education have been unable to agree and have changed their minds.

In 1951 the humanist Carl Rogers wrote that empathy was a skill that could be taught (Rogers, 1951), but by 1975 he had changed his opinion, arguing that it wasn't a skill but a behaviour that had to be willingly displayed (Rogers, 1975). This sounds closer to behaviourism but still includes an element of free will. Fellow humanist Martin Buber disagreed, arguing that feeling empathy was a passive event that had to be allowed to happen (Buber, 1955).

However, some have argued that empathy cannot be taught at all. The philosopher Edith Stein repeated Buber's idea of empathy only being noticed 'post-event'. However, she disagreed with Rogers, arguing that what they called empathy was in fact 'self-transposal' - a person actively listening to another - and that empathy was only achieved after crossing over to the other's viewpoint before returning back. She believed that empathy was a 'happening', not behaviour, and as such was impossible to teach (Stein, 1970). Davis agreed with Stein, believing that self-transposal can be taught because it is a cognitive process, but that empathy itself, being a behaviour, cannot be taught as a skill (Davis, 1990). She did, however, argue that empathy could be facilitated through experiential learning, humanism and reflection (Davis, 1989).

So we have a dictionary definition of empathy, and we know there are checklists, questionnaires and immersive simulation we can use to help foster, if not teach, empathy in our students, even if we can't agree whether it is a skill or a behaviour we are fostering. This brings me to my next question: is empathy even that useful? Could it even be harmful?

Although some authors have concluded that empathy makes for better clinicians (Quince et al, 2016), the evidence and models used to reach these conclusions are sketchy at best. One study Quince (2016) referred to concluded that clinicians can improve their ability to recognise empathic moments in clinic, not that it made them better at their job (Levinson, 2000). I can think of medical students so caught up in feeling for patients they'd seen in a trauma call that they reported not concentrating well afterwards, to the detriment of their studies. Previous studies have shown how empathy actually declines with experience until a "minimum level of empathy" is reached, which is all that's needed "to benefit from the positive aspects of professional quality of life in medicine" (Gleichgerrcht and Decety, 2014).



Paul Bloom wrote a very interesting article in the New Yorker called The Baby in the Well (Bloom, 2018) explaining how individuals actually make bad assessments of situations based on empathy. He cites several studies where empathy skews our emotional response and decision making disproportionately. In one such study participants were given the choice between donating to save the life of one named individual, with a photograph and back story, or a group of eight nameless individuals. Participants consistently gave more money to save the one individual whose name they knew and with whom they could empathise, even though money given to the other group would have saved more lives (Kogut and Ritov, 2005).

This means we could be liable to make bad decisions for the patients we feel more empathy towards. Never mind the potential risks of counter-transference that an inexperienced practitioner or student may not be aware of when first encountering patients. So if empathy might be risky, what else should we encourage?



According to Singer and Klimecki (2014), whenever we meet a person in distress we are inspired to feel either compassion or empathic distress, and the two pathways have distinct bases in neurobiology. In their study, empathy training activated the insula (linked to emotion and self-awareness) and the anterior cingulate cortex (linked to emotion, consciousness and registering pain). Compassion training, however, stimulated activity in the medial orbitofrontal cortex (connected to learning and reward in decision making) as well as in the ventral striatum (also connected to the reward system). The two types of training led to very different emotions and attitudes toward action. The empathy-trained group actually found empathy uncomfortable and troublesome, whereas compassion training created positivity in the minds of the group members; the compassion group ended up feeling kinder and more eager to help others than those in the empathy group.

From Singer and Klimecki (2014)

This suggests the importance of encouraging healthcare workers towards these positive outcomes, fostering compassion rather than empathy. The Loving Kindness Meditation model of mindfulness has been shown to foster positive emotions (Fredrickson et al., 2008).

In debating whether empathy testing could be introduced to medical school applications Schwartzstein (2015) suggested the following four stage model for supporting compassion development at medical school:


  • Implement curricular changes that support students' idealism, kindness, and focus on patients, such as providing earlier clinical experiences or requiring participation in student-directed clinics.

  • Select clinical faculty for their ability to support the desired values.

  • Refine measures and instruments for assessing interpersonal skills, support faculty in completing these assessments, and prohibit students deficient in the necessary skills from advancing in their training.

  • Advocate for financial and logistic systems that enable doctors to spend more time with patients.

I find this model fascinating as it suggests a complete overhaul of a medical education facility based purely on compassion and communication. Would it work? It’s too early to tell and it will be interesting to see if it gets any mileage.

Back to our hypothetical patients. Would attempting to place ourselves in the shoes of a fascist be helpful? Obviously we have to be professional and set aside the fundamental attribution bias, but what about empathy? Empathy has been shown to motivate an individual to help another person based purely on altruism - the 'empathy-altruism hypothesis' - rather than any gain to the individual (Batson et al, 1991). If a healthcare professional is doing their best for their patient, does it matter whether it is empathy or professional duty motivating their actions? Is it better to teach our students the behaviours and values of treating patients with compassion and respect rather than obsessing about empathy? We can use feedback from patients, both real and simulated, to reflect on clinical behaviour and encourage the actions that make others feel respected and listened to, regardless of motivation. We can make students aware they are entering an emotionally challenging profession and that they should expect complex feelings. Encourage them to reflect: support them if they don't feel empathy, support them if they do, and support them if they feel too much.

This is obviously a massive topic and it’s impossible to cover everything. Let me know what you think, I’d love to continue this debate.

- Jamie (@mcdreeamie)



Baron-Cohen S, Autism: The Empathizing–Systemizing (E-S) Theory, Year in Cognitive Neuroscience 2009: Ann. N.Y. Acad. Sci. 1156: 68–80 (2009).

Batson, C. D., Batson, J. G., Slingsby, J. K., Harrell, K. L., Peekna, H. M., & Todd, R. M. (1991). Empathic joy and the empathy-altruism hypothesis. Journal of Personality and Social Psychology, 61(3), 413-426.

Bloom, P. (2018). The Baby in the Well. [online] The New Yorker. Available at: [Accessed 28 Aug. 2018].

Buber M (1955), Between Man and Man Boston, Mass: Beacon Press; 1955.

Campbell, D. (2018). Mid Staffs hospital scandal: the essential guide. [online] the Guardian. Available at: [Accessed 26 Sep. 2018].

Carr L et al 2003, “Neural mechanisms of empathy in humans: A relay from neural systems for imitation to limbic areas.” In J. T. Cacioppo, and G. G. Berntson (eds.). Social neuroscience: Key readings, New York: Psychology Press, pp. 143-152.

Castelnuovo, G. (2013). 5 years after the Kahn's etiquette-based medicine: a brief checklist proposal for a functional second meeting with the patient. Frontiers in Psychology, 4.

Davis CM (1989) Patient practitioner interaction: an experiential manual for developing the art of healthcare Thorofare, NJ: Slack Inc: 1989.

Davis CM (1990), What is empathy and can empathy be taught? PHYS THER, 1990; 70: 707-711.

Eslinger PJ 1998, Neurological and neurophysiological bases of empathy, Eur Neurol 1998;39:193–199

Field TM et al 1982, “Discrimination and imitation of facial expressions by neonates.” Science 218: 179-181.

Forman, D. R. et al. (2004). "Toddlers' responsive imitation predicts preschool-age conscience." Psychological Science 15: 699-704.

Fredrickson, B., Cohn, M., Coffey, K., Pek, J. and Finkel, S. (2008). Open hearts build lives: Positive emotions, induced through loving-kindness meditation, build consequential personal resources. Journal of Personality and Social Psychology, 95(5), pp.1045-1062.

Gleichgerrcht, E. and Decety, J. (2014). The relationship between different facets of empathy, pain perception and compassion fatigue among physicians. Frontiers in Behavioral Neuroscience, 8.

Iacoboni, M., and M. Dapretto. 2006. “The mirror neuron system and the consequences of its dysfunction.” Nature Reviews Neuroscience 7: 942-951.

Iacoboni, M. 2008. Mirroring people: The new science of how we connect with others. New York: Farrar, Straus and Giroux.

Kahn, M. (2008). Etiquette-Based Medicine. New England Journal of Medicine, 358(19), pp.1988-1989.

Knafo, A., C. Zahn-Waxler, C. Van Hulle, J. L. Robinson, and S. H. Rhee. 2008. “The developmental origins of a disposition toward empathy: Genetic and environmental contributions.” Emotion 8: 737-752.

Kogut, T. and Ritov, I. (2005). The “identified victim” effect: an identified group, or just a single individual?. Journal of Behavioral Decision Making, 18(3), pp.157-167.

Lee, H. (1960). To kill a mockingbird. Harlow: Longman.

Martin, G. B., and R. D. Clark. (1982). "Distress crying in neonates: Species and peer specificity." Developmental Psychology 18: 3-9.

Merriam-Webster. (2018). Definition of EMPATHY. [online] Available at: [Accessed 27 Aug. 2018].

Preston, S. D., and F B. M. de Waal. 2002. “Empathy: Its ultimate and proximate bases.” Behavioral and Brain Sciences 25: 1-72

Quince, T., Thiemann, P., Benson, J., & Hyde, S. (2016). Undergraduate medical students’ empathy: current perspectives. Advances in Medical Education and Practice7, 443–455.

Rogers CR (1951) Client Centred Therapy Boston, Mass: Houghton Mifflin Co.

Rogers CR (1975) Empathic: an unappreciated way of being Counselling Psychologist 1975; 1:1.

Sagi, A., and M. L. Hoffman. (1976). "Empathic distress in the newborn." Developmental Psychology 12: 175-176.

Schwartzstein, R. (2015). Getting the Right Medical Students — Nature versus Nurture. New England Journal of Medicine, 372(17), pp.1586-1587.

Simner, M. L. 1971. “Newborn's response to the cry of another infant.” Developmental Psychology 5: 136-150.

Singer, T. and Klimecki, O. (2014). Empathy and compassion. Current Biology, 24(18), pp.R875-R878.

Sonnby-Borgstrom, M. et al. (2003). "Emotional empathy as related to mimicry reactions at different levels of information processing." Journal of Nonverbal Behavior 27: 3-23.

Stein, E. (1970). On the Problem of Empathy. 2nd ed. The Hague, The Netherlands: Martinus Nijhoff/Dr W. Junk Publishers.

Tavakol, S., Dennick, R. and Tavakol, M. (2011). Psychometric properties and confirmatory factor analysis of the Jefferson Scale of Physician Empathy. BMC Medical Education, 11(1).

Through Their Eyes Project. (2018). Aging Simulation. [online] Available at: [Accessed 28 Aug. 2018].

de Waal, F. B. M. (2008). "Putting the altruism back into altruism: The evolution of empathy." Annual Review of Psychology 59: 279-300.

Mid Staffordshire NHS Foundation Trust Public Inquiry. (2018). Final report. [online] Available at: [Accessed 26 Sep. 2018].