History of Medicine

The Medieval Key Workers

Between 1346 and 1351 the Black Death swept across Europe, arriving first from the east along trade routes into Italy before spreading to northern Europe and Britain.

In all, as many as 200 million people may have died across Europe, Asia and North Africa, including somewhere between a third and two-thirds of Europe’s population. It was a horrible way to die; victims suffered fevers, bloody coughing, extreme pains and the characteristic ‘buboes’ or swellings full of pus.

For the people of Europe at the time it must have been like the end of the world. When COVID-19 first struck, most people, while scared, had at least some rudimentary knowledge of germs and how disease spreads. This was not the case in the 14th century, some 500 years before germ theory would emerge. Doctors might have their own individual theories but couldn’t offer much hope. As a result, people turned to the church.

This made sense. The Bible speaks of plague as the work of God, and so the Catholic church, the dominant religious power in Europe, was expected to provide answers. As people died, priests became the key workers of the Black Death, providing what comfort they could and, importantly, performing the Last Rites to ensure passage to Heaven.

Unfortunately, this put the priests at great risk. Without any form of PPE they would often fall victim to the Black Death themselves. It’s estimated that 42-45% of priests died from the Black Death. In 1348 six cardinals would die. 

The head of the Catholic church at the time, Pope Clement VI, therefore faced an unprecedented challenge: how best to provide spiritual leadership and ensure salvation for his flock. Born Pierre Roger in 1291, he had been Pope since 1342. He was the fourth Avignon Pope, one of a series of seven popes between 1309 and 1376 whose court sat in the French town of Avignon rather than Rome. The wine Châteauneuf-du-Pape (the Pope’s new castle) dates from this period and is still made in the region today, its name signifying its papal legacy.

In facing the Black Death Clement VI was pragmatic. As grounds for burial began to run out, he consecrated the River Rhône in 1349 so that the bodies of plague victims could be disposed of there. He granted the remission of sins of all those who died from the plague so that, even if they couldn’t confess their sins and receive the Last Rites, their souls would be safe. As fear-mongering led to the blame and persecution of Jews, he issued two papal bulls in 1348 denouncing anti-Semitism. On 26th September 1348 he issued Quamvis Perfidiam, in which he called on Christians to protect Jews:

“certain Christians, seduced by that liar, the Devil, are imputing the pestilence to poisoning by Jews… since this pestilence is all but universal everywhere, and by a mysterious decree of God has afflicted, and continues to afflict, both Jews and many other nations throughout the diverse regions of the earth to whom a common existence with Jews is unknown (the charge) that the Jews have provided the cause or the occasion for such a crime is without plausibility.”

Ralph of Shrewsbury, Bishop of Bath and Wells, would be similarly pragmatic as he faced a shortage of key workers to hear plague victims’ confessions. In the absence of priests, plague sufferers could now confess to each other, to a layman, or even to a woman. He commanded this in a letter to his priests written in January 1349:

“The contagious pestilence of the present day, which is spreading far and wide, has left many parish churches and other livings in our diocese without parson or priest to care for their parishioners. Since no priests can be found who are willing, whether out of zeal and devotion or in exchange for a stipend, to take on the pastoral care of these aforesaid places, nor to visit the sick and administer to them the Sacraments of the Church… We, therefore, wishing, as is our duty, to provide for the salvation of souls and to bring back from their paths of error those who have wandered, do strictly enjoin and command… that, either yourselves or through some other person you should at once publicly command and persuade all men, in particular those who are now sick or should fall sick in the future, that, if they are on the point of death and cannot secure the services of a priest, then they should make confession to each other, as is permitted in the teaching of the Apostles, whether to a layman or, if no man is present, then even to a woman.”

History doesn’t necessarily repeat herself but she does like to rhyme. It’s hard not to see parallels between the Catholic church’s response to the Black Death and how we in the West approached COVID-19: military involvement, new hospitals, volunteer groups, lockdown, furlough and so on.

Two years into the pandemic, many of us might well be questioning our governments and leaders: Boris Johnson’s behaviour during lockdown will probably define his career forever.

In the Middle Ages it was the same. After the Black Death passed there was a shortage of labour, contributing to the Peasants’ Revolt of 1381. It could be argued that the Catholic church could never again command the total power it had once wielded: it had not been able to save people. Questions could be asked and its authority challenged. Within 200 years Lutheranism would rise and Henry VIII would break from Rome. As Europe began to recover, the seeds of new discovery had been sown. The Renaissance would see a new era of art and discovery.

Shark Tales: the oldest shark attack, the shark attack survivor who became a lord mayor, and the shark that solved a crime

Shark attack. The very phrase brings a particular fear: of a silent, malevolent force dragging us to our death; of once calm, azure water frothing and turning crimson. Yet these creatures, older than the dinosaurs and largely unchanged in over 400 million years, also bring fascination. The seas have been theirs for about as long as there has been life on land. So as long as humans have ventured to the sea there have been sharks.

Inspired by The Daily Jaws, here are three stories of our relationship with sharks: the oldest shark attack, the shark attack survivor who became a lord mayor and the shark that helped solve a crime.

The first shark attack

Three thousand years ago in what would one day be called Japan, a young man died. His death was violent. He had fractured ribs, lost one hand and both feet, and one of his legs was severed clean off. He was buried with some care and with his leg. 

Around 1920 his skeleton was excavated and became known as Tsukumo No. 24. He was part of the Jōmon period of prehistoric Japan, a time of hunter-gatherers and fishermen. In total his skeleton bore around 790 separate marks. Tsukumo No. 24 represented a mystery to archaeologists. Clearly his last moments had been horrific, but what had caused them? What kind of prehistoric weaponry could have left these marks? Japan is home to bears and wolves; could they have been responsible?

A paper published in 2021 may have found the answer. Through detailed imaging of the skeleton of Tsukumo No. 24, scientists were stunned at how deep into the bone his injuries went. The wounds were V-shaped and showed lines, or striations, typical of shark bites. The researchers surmised that Tsukumo No. 24 had been killed by a great white or tiger shark which had had enough time to start feeding.

Tsukumo No. 24 now represents the world’s oldest known shark attack victim. It’s likely his death was quick, probably through rapid blood loss after his leg was bitten off. Although the shark or sharks had started feeding, his body and leg were recovered, either by the tide or by other people, and he was given a ceremonial burial. No one present would have known what this would one day represent.

The Lord Mayor of London, the Artist, and the Shark

Brook Watson was born in Plymouth, Devon on 7th February 1735. After his parents died he was sent, in 1741, to live with his aunt and her husband, a merchant, in Boston, Massachusetts. By the time he was a teenager, Watson had been signed up as a crew member on one of his uncle’s ships.

In 1749 the ship anchored in the harbour of Havana, Cuba. The 14-year-old Watson decided to go for a swim. Whilst the crew prepared a boat to escort their captain ashore Watson stripped and cooled off in the Caribbean. It was then he was attacked. His right leg was grabbed by a shark which stripped the flesh from his calf. He was released but the shark returned and took his foot clean off. By the time the shark came back for a third attack his crewmates had launched their boat and were able to fight off the assailant and haul Watson to safety.

Watson’s seafaring career was over. His right leg was amputated below the knee and, after three months of recuperation, he was discharged. After first returning to America he made his way back to London and was Member of Parliament for the City of London from 1784 to 1793. In 1796 he became Lord Mayor of London. He would later join the Bank of England before being created a baronet. Watson requested that his new coat of arms feature his missing right foot.

At some point Watson met the American artist John Singleton Copley (1738-1815), whom he befriended and commissioned. In 1778 Copley painted ‘Watson and the Shark’, capturing the moment of Watson’s rescue: his crewmates pulling him from the jaws of the shark. The shark is not particularly anatomically correct; it appears to have lips and billows of air coming from its nostrils. The painting was exhibited at London’s Royal Academy. Watson himself is believed to have penned a description to accompany the painting:

“after suffering an amputation of the limb, a little below the knee, the youth received a perfect cure in about three months”

In the end Copley painted three versions. The 1778 original is on display in the National Gallery of Art, Washington DC. A second version is in the Museum of Fine Arts, Boston, while a third, smaller, 1782 version is in the Detroit Institute of Arts.

The shark that helped solve a crime

Hugh Whylie is in trouble. It’s 28th August 1799 and Whylie is a commander in the British Navy leading the ship Sparrow in the Caribbean. It’s a time of simmering tensions with the Dutch, especially within the tropical theatre Whylie calls home. He has just intercepted the ship Nancy, captained by Thomas Briggs, on suspicion of smuggling. The Nancy has been veering around, calling at Aruba and Haiti. Briggs blames bad weather. Whylie suspects smuggling.

As Whylie’s men board the Nancy, Briggs comes up on deck and presents his papers. The Nancy is Dutch-owned and her voyage legitimate, or so his story goes. Everything seems in order. But Whylie is not sure. He arrests Briggs and seizes the Nancy. He must know how delicate the situation is. Briggs is American. The War of Independence had only finished 16 years earlier. Diplomatic relations between the United Kingdom and the United States of America were only 14 years old. It would not look good for British naval officers to be arresting Americans in the Caribbean. And if he has intercepted a legitimate Dutch ship this could be an act of war.

Whylie files a suit against Briggs in Kingston, Jamaica. Briggs immediately countersues for lost revenue. The pressure is on for Whylie. Other than his gut feeling and the fact that the Nancy had been veering off course, he has nothing. Briggs has his story about bad weather, which there had been, and his seemingly immaculate papers. This could be a costly mistake by Whylie, and after two days he must be worried about his future.

Then something incredible happens. Crewmen on the British ship Ferret catch a large shark off the coast of Haiti. Cutting into it, they make a bizarre discovery in its stomach: a bundle of papers tied with string. Anyone who has seen the film Jaws will remember the scene where Hooper cuts into a tiger shark to see if there are human remains and comments on how slow a shark’s digestion is. The papers are in such good condition that the writing on them is still legible. The captain of the Ferret, Michael Fitton, summons Whylie. Whylie expects breakfast and is bemused to be shown papers drying out on Fitton’s desk.

Remarkably, they contain letters and papers belonging to the Nancy which show that her Dutch ownership is a cover and that Briggs really is captaining a smuggling operation. Having seen the Sparrow coming, Briggs had ordered the papers to be thrown overboard, where a passing shark had fancied them for a snack.

The ‘shark papers’ are presented in court and prove the deciding factor in the case. Briggs is found guilty of both smuggling and piracy. He describes it as an “active and unnatural piece of cruelty” to be “damned and condemned by a bloody sharkfish”. The shark papers remain on display in the National Library of Jamaica.

How that *one* thing we all 'know' about Egyptian Mummies might be wrong

Ancient Egypt: land of pyramids, animal-headed gods, pharaohs’ curses, and mummies. It’s fair to say that, perhaps out of any civilisation in history, it’s Egypt that mesmerises us the most. And it’s possibly mummies and mummification which hold the greatest sway over our fascination. The Egyptian belief that preserving the dead body guaranteed eternal life for the deceased’s soul ensured an intricate process of evisceration and embalming. And what’s the one fact about mummification that everyone, from child to adult, can recite with either glee or disgust?

“They pulled the brain out with a hook”

Certain organs would be stored in canopic jars, each with a god’s head for a lid. There was Hapi, the baboon-headed god associated with the North, whose jar held the lungs; Duamutef, the jackal-headed god associated with the East, whose jar held the stomach; Qebehsenuef, the falcon-headed god of the West, whose jar held the intestines; and Imsety, the human-headed god associated with the South, whose jar held the liver.

Canopic jars. From left: Qebehsenuef (intestines), Imsety (liver), Hapi (lungs) and Duamutef (stomach).

But not the brain. No, that was hooked out and thrown away. That’s a fact. The one thing we all know about mummification. Except it might be wrong.

Sculpture of Herodotus in the porch of the Stoa of Attalos building at the Ancient Agora of Athens. Attica region, Greece.

Herodotus (c484 BC - c425 BC) was an Ancient Greek known as the ‘Father of History’. In around 450 BC he travelled to Egypt and documented what he saw there. It is from Book II of his Histories that we get the following entry regarding the Ancient Egyptian process of mummification:

“They take first a crooked piece of iron, and with it draw out the brain through the nostrils, thus getting rid of a portion, while the skull is cleared of the rest by rinsing with drugs”

Attempts to recreate the mummification process as described by Herodotus suggested that the brain could not simply be pulled out with a hook. Then, in 2008, CT imaging followed by endoscopy of a mummy found an eight-centimetre wooden tube that had been left inside the skull. This suggested that, rather than being hooked out through the nose, the brain was liquefied first (perhaps using a hook) before the body was laid on its front (prone) and the tube used to let the brain drain out under gravity.

CT scans of a 2,400-year-old female mummy revealed a tubular object embedded in its skull between the left parietal bone and the resin-filled back of the skull. (Image copyright RSNA RadioGraphics)

The object, which measures 3 inches (8 cm) in length, had become stuck in the resin and had to be cut free. It was probably left in the skull by the embalmers by accident, possibly because it broke off.


Of course, it might well be that different areas of Ancient Egypt used different methods of mummification. It might also be that status and wealth affected how your body was preserved. But this find offers a snapshot that questions what every schoolboy and schoolgirl remembers about Ancient Egypt and mummies.

Maybe they didn’t hook the brain out but used a straw.

Ice and a slice (of history): the story of the G&T


Last weekend saw World Gin Day: a chance to celebrate a drink which has exploded in popularity in the UK over the past few years. ‘Gin’ describes a spirit derived from grain distillation, primarily flavoured with juniper berries and typically at least 40% alcohol by volume. It is the Dutch word for juniper, genever, which gives us the word ‘gin’. With the warm weather, many of us may be reaching for a relaxing beverage in the form of gin with its most popular mixer: tonic. The story of this drink, with its myriad garnishes and taste profiles, starts in the most unlikely place: fighting a deadly disease.

Malaria remains one of the biggest global health issues. According to the World Health Organisation, in 2018 there were 228 million cases of malaria worldwide, with 405,000 deaths. Nearly half of the world’s population lives in malaria-affected areas. The symptoms are vague: tiredness, muscle pains and periods of fever and shivering. Multi-organ failure can result.

The disease is caused by a single-celled parasite called Plasmodium, which has a complex life cycle. It begins with a sporozoite stage which infects liver cells, where the parasites multiply to create merozoites that are released when the infected liver cells burst. The merozoites in turn infect and reproduce inside red blood cells. The red blood cells burst and release more merozoites, which infect yet more red blood cells, and the cycle continues. This stage corresponds to the periodic fever the patient feels. At some point gametocytes rather than merozoites are produced; these are the sexual forms (male and female). If a female Anopheles mosquito bites an infected patient and drinks their blood, the gametocytes are ingested as well. Inside the stomach of the mosquito they undergo sexual reproduction and eventually form sporozoites. The next time the mosquito feeds, it is these sporozoites she’ll inject into that person’s bloodstream, and so the cycle begins anew and another patient develops malaria.

Dr. Alphonse Laveran, a military doctor in France’s Service de Santé des Armées (Health Service of the Armed Forces).

Illustration drawn by Laveran of various stages of malaria parasites as seen on fresh blood.

Malaria was endemic in Europe; its name comes from the medieval Italian for ‘bad air’ and the disease was associated with swamps and other stagnant water. However, it was in the colonies that imperial powers fought their biggest battles with malaria. The cause of the disease was first identified in 1880 by a French army doctor, Alphonse Laveran, who spotted the parasite down a microscope in the blood of patients.

As is often the case in medical history, however, a cure was sought long before the cause was identified.  

It’s the early 17th century in Peru. The Spanish Countess of Chinchón, Ana de Osorio, the wife of the Viceroy of Peru, is ill with malaria. The locals present her with the bark of a local tree. The tale was that a local man, ill with malaria, drank from a pool of water contaminated with this bark and was cured. Although the story is probably apocryphal, the bark had become a trusted cure among the natives. Ana de Osorio took the bark and was cured. On her return to Europe she brought word of this tree. French scientists were later able to isolate the compound responsible for defeating malaria and named it ‘quinine’ after the Inca word for the tree’s bark. The tree itself was named ‘cinchona’ in honour of the countess.

Wellcome Images

The Scottish physician George Cleghorn (1716-1789), who had spent years studying fevers in Minorca (now Menorca), heard of the properties of quinine and recommended that a solution of quinine in water be provided to British soldiers stationed in the Indian subcontinent. He was listened to, and ‘Indian Tonic Water’ became a staple of colonial life. However, the solution was incredibly bitter. To make the drink more palatable, by the 19th century soldiers had started mixing it with sugar, lime and, crucially, gin. The gin and tonic was born.


Initially the Peruvian authorities did all they could to protect their monopoly on cinchona and prohibited the export of its seeds. The British businessman and alpaca farmer Charles Ledger, with the help of his manservant, was able to smuggle cinchona seeds out of the country. He sold them to the Dutch government, which started growing cinchona in the Dutch East Indies.

By 1900 at least two-thirds of the world’s supply of quinine was grown in Java.  Indonesia and India remain important cultivators of cinchona; it is also grown throughout Africa.  

Modern tonic water has a quinine content far below the therapeutic dose and is produced for flavour, not as a cure. Quinine itself has largely been replaced by other drug treatments for malaria, although it remains an option in cases of drug resistance. Resistance has been identified in two of the four Plasmodium species which cause malaria and represents a public health challenge. Mosquito control and sanitation remain staples of attempts to control the disease. Gin and tonic remains a popular drink with its own international day: 19th October.

Disease has been a part of existence throughout the history of mankind. It therefore shouldn’t be surprising to find aspects of modern life which appear completely unrelated to healthcare but have their roots in the history of medicine. When it comes to gin and tonic there are hints: the popular tonic water brand Fever-Tree chose its name in honour of the cinchona plant, once known as the ‘fever tree’. Every time you have a ‘G&T’ you’re sampling a taste of that history.

Cheers.

History rhymes: two Prime Ministers, two pandemics

The United Kingdom is in the throes of a pandemic. A new virus without cure or vaccine kills with frightening speed. The Prime Minister is struck down with fever. His life hangs in the balance. It is September 1918. The Prime Minister is David Lloyd George. History may not repeat but she does love to rhyme.

15th September 1918. David Lloyd George, Prime Minister and leader of the wartime coalition government, though not the leader of his party, the Liberals, visits Manchester to receive the freedom of the city. It is the last few months of the bloodiest conflict yet known to man. By the end of the month the German High Command would telegram the Kaiser that victory was impossible. Peace would soon be in sight. However, far more people worldwide would lose their lives to a different, invisible enemy.

Lloyd George receiving the Freedom of Manchester. Photo: Illustrated London News [London, England], 21 September 1918

H1N1 influenza may well have been circulating in military camps for a while before 1918; the confined conditions and poor hygiene were perfect for viruses to spread and mutate. There is much debate as to where the virus first appeared, but given wartime censorship it was neutral Spain, whose press reported the outbreak openly, that gained the impression of being the disease’s epicentre. It would go on to infect a third of the world’s population, killing at least 50 million people, more than double the deaths of the First World War.

Unusually for ‘flu, the victims were not children or the elderly but young to middle-aged adults. There are a number of theories for this: whether their stronger immune systems actually turned against them and made the disease worse, or whether those old enough to have lived through the 1889–1890 ‘Russian ‘flu’ pandemic had retained some form of immunity. Whatever the reason, those who succumbed rapidly developed pneumonia. As their failing lungs struggled to supply their face and extremities with oxygen, they would turn blue with hypoxia. This harbinger of death was given the name ‘heliotrope cyanosis’ after the flower whose colour patients were said to resemble.

A plate from Clinical Features of the Influenza Epidemic of 1918–19 by Herbert French

And so to Albert Square, Manchester. David Lloyd George receives the freedom of the city. The rain pours down and the Prime Minister is soaked during the lengthy ceremony. He is met by dignitaries and well-wishers, shaking hands and exchanging pleasantries. By the end of the day he is hit by ‘a chill’. Although publicly underplayed, this chill renders him unable to leave Manchester Town Hall. A hospital bed is installed for Lloyd George and his personal physician visits him daily. It is eleven days before the Prime Minister is well enough to leave his bed, wearing a respirator both to protect his stricken lungs and to prevent him infecting others.

Manchester itself was to be an innovator in its response to the ‘flu pandemic. At the time there was no centralised Ministry of Health, so public health was a matter for local authorities under the auspices of Medical Officers. The Medical Officer for Manchester since 1894 was James Niven, a Scottish physician and statistician. With total war still ongoing, Sir Arthur Newsholme, a senior health advisor to the British government, advised that even with ‘flu spreading, munitions factories had to remain open and troop movements could not be interrupted. It was up to Medical Officers to think autonomously. Niven looked back at the pandemic of 1889–90 and noted that, unlike seasonal ‘flu which strikes annually, pandemic ‘flu came in waves, with each wave often more virulent than the last. He argued:

“public health authorities should press for further precautions in the presence of a severe outbreak”

James Niven, Creative Commons

After the first cases of influenza were seen in Manchester in spring 1918, Niven worked to prepare the city for the next wave that he predicted would hit later that year. Manchester was a densely packed working-class city, a perfect breeding ground for disease. He closed schools and public areas such as cinemas. Areas which couldn’t be closed were disinfected. He compiled statistics to be published on posters throughout the city to give people as much information as possible. He became a regular columnist in the Manchester Guardian, advising readers on the symptoms of the disease and insisting that anyone showing symptoms must,

“on no account join assemblages of people for at least 10 days after the beginning of the attack, and in severe cases they should remain away from work for at least three weeks”

Manchester’s ‘flu outbreak would peak on 30th November 1918. Niven reflected that it might have peaked sooner had it not been for the Armistice celebrations, where he was powerless to prevent people congregating in the streets. Niven would remain in post until 1922. As well as his work fighting influenza he also led slum clearance, the installation of sanitation and improvements in air quality. Despite Manchester’s population increasing from 517,000 to 770,000 during his tenure, the death rate fell from 24.26 to 13.82 per 1,000 population. Despite his success, in retirement he was struck by depression. In 1925 he took poison and drowned himself in the sea off the Isle of Man.

Lloyd George would make a full recovery from his illness. He led the country’s Armistice celebrations and remained Prime Minister until 1922 through the support of his Conservative coalition partners. His struggle with his long-time rival Herbert Asquith for the leadership of the Liberals would dominate the party for at least the next decade and see it fall from government to third place in British politics. It would never return. Lloyd George remains the last Liberal Prime Minister of the United Kingdom. He would live to see his hard-fought peace shatter and very nearly saw it return again, dying in March 1945.

History doesn’t repeat but she does rhyme. It is human nature to look for patterns and to reason by comparing the present with what has gone before. For Lloyd George stricken with influenza, see Boris Johnson admitted to intensive care with COVID-19. For James Niven, see Chris Whitty. However, our knowledge of disease, access to sanitation and advances in healthcare are without equal in history. The H1N1 influenza virus behind the pandemic of 1918–19 would not be genetically sequenced until the late 1990s. When COVID-19 first emerged in late December 2019 its genetic sequence was identified within a month. Intensive care and ventilation weren’t even figments of the imagination for the patient in 1918, Prime Minister or not. However, until a cure or vaccine for COVID-19 is realised, our best weapon against it remains the advice of James Niven from over a century ago, from a time before social media or hashtags. Stay home.

Super Spreaders: The Story of 'Typhoid Mary'

A new virus which first appeared in a food market in China has crossed the world in a couple of months and been declared a pandemic by the World Health Organisation. As of 11th March there have been 118,619 confirmed cases of this virus, called COVID-19, worldwide, with 456 in the United Kingdom. Six people in the UK have died. Of the cases in the UK, four were linked to one other infected person, who also infected another six: five in France and one in Spain. This is the story of a modern super-spreader and his Victorian-era counterpart, ‘Typhoid’ Mary Mallon.

Steve Walsh Pic: Servomex

The case of our modern super-spreader, Steve Walsh, has been well covered in the media since he reported himself to health authorities. The 53-year-old works for the gas analysis company Servomex. From 20th to 22nd January 2020 he attended a work conference in Singapore, one of 94 delegates travelling from overseas to the 109-strong gathering. One attendee was from Wuhan, China, the centre of the epidemic. During the conference Walsh was exposed to COVID-19. Afterwards he joined his family for a holiday at Les Contamines-Montjoie near Mont Blanc in the French Alps from 24th to 28th January, staying in a ski chalet. Still feeling well, he travelled on a busy easyJet flight from Geneva to Gatwick and went to a local pub, The Grenadier in Hove, on 1st February. It was only after conference organisers alerted attendees that one of their number had tested positive for COVID-19 that Walsh contacted the authorities and was himself tested. By this time five Britons who had stayed in the same chalet had become ill in France, another Briton had returned to their house in Mallorca and fallen ill, and a group of four people had flown home to the UK from the same ski resort and become unwell. All tested positive for COVID-19. All had had contact with Walsh. Within the two-week incubation period, and without feeling unwell, Walsh had inadvertently infected 10 people. After a mild illness in quarantine at the specialist infectious diseases unit at Guy’s and St Thomas’ NHS Foundation Trust in London he was discharged on 12th February.

A super-spreader is an individual who is more likely to spread a disease than other people with the same infection. The principle often used is the ‘20-80’ rule: 20% of people are behind 80% of transmissions. There are many different reasons why one person may be more contagious than others: vaccination rates, the environment, co-infections (men infected with HIV are more contagious if they are also infected with syphilis compared to those infected with HIV alone) and their viral load. A super-spreader may also be a carrier: completely symptom-free, yet able to pass a disease on to others. Perhaps the most famous example of this kind of super-spreader was ‘Typhoid’ Mary Mallon.

Mary Mallon was born on September 23, 1869 in Cookstown, County Tyrone in what is now Northern Ireland. By 1884 she had moved to America to live with her aunt and uncle and to seek work as a cook for wealthy families. Between 1900 and 1907 she worked for seven families in the New York City area.

Mary Mallon in quarantine Creative Commons

A strange pattern emerged. Wherever Mary worked there was an outbreak of typhoid fever. This disease is caused by a type of Salmonella bacterium called Salmonella Typhi and is spread in contaminated food and drink. Infected patients develop fever, abdominal and joint pains, vomiting and diarrhoea. Some patients develop a rash.

This was very unusual. Typhoid fever was traditionally seen in slum areas and among the poverty-stricken, not in the affluent houses Mary worked at. In 1900 she moved to work in Mamaroneck, New York. Within a fortnight of her arrival residents fell ill with typhoid fever. The same thing happened in 1901 when she moved to Manhattan: the laundress at the house she worked at died of the disease. She was then employed by a lawyer and again left after seven of the eight people in the house fell ill.

In 1906 she moved to the very well-to-do area of Oyster Bay on Long Island. At the first house she worked at, ten of the eleven family members living there were hospitalised with typhoid fever. The same thing happened at another three households. Mary continued to change jobs after each outbreak.

She was eventually employed as a cook by a wealthy banker, Charles Henry Warren. In 1906, when the family summered in Oyster Bay, Mary joined them. From August 27 to September 3, six of the 11 people in the household came down with typhoid fever. George Thompson, the man whose house they had holidayed in, was concerned that the water supply might be contaminated and cause further outbreaks. He secured the services of a sanitation engineer, George Soper, who had investigated similar outbreaks.

Soper published the findings of his research in the Journal of the American Medical Association on June 15th, 1907:

“It was found that the family changed cooks on August 4. This was about three weeks before the typhoid epidemic broke out. The new cook, Mallon, remained in the family only a short time and left about three weeks after the outbreak occurred. Mallon was described as an Irish woman about 40 years of age, tall, heavy, single. She seemed to be in perfect health.”

Soper could link 22 cases and one death to this Irish cook who seemed to vanish after each outbreak. So began a chase similar to that in the movie ‘Catch Me If You Can’ as Soper tried to track down Mary Mallon. When he eventually found her and asked for samples of her faeces and urine, she violently refused:

“She seized a carving fork and advanced in my direction. I passed rapidly down the long narrow hall, through the tall iron gate, and so to the sidewalk. I felt rather lucky to escape.”

On another encounter in a hospital where Mary was being treated she locked herself in a toilet and refused to open the door until Soper left. She refused to accept she was the cause of the outbreaks and that she couldn’t work as a cook.

Soper passed the case over to the physician Sara Josephine Baker, with whom Mary still refused to engage. In the end Baker had to enlist the help of the New York police, who arrested Mary. Stool samples confirmed the presence of Salmonella Typhi. In 1908 the Journal of the American Medical Association dubbed Mallon ‘Typhoid Mary’.

Mary was held in isolation for three years of quarantine. In 1910 she was released, having signed an affidavit that she would no longer work as a cook and would take all precautions to prevent infecting others. She began to find work as a laundress, a position with less job security and lower income. Struggling to make ends meet, she changed her name to Mary Brown and began to work as a cook again. Typhoid once again followed her.

In 1915 she caused an outbreak at Sloane Hospital for Women in New York City, infecting 25 people, of whom 3 died. As before she left her position following the outbreak, but the authorities found her visiting a friend and arrested her again. This time there would be no second chance and Mary Mallon spent the rest of her life in quarantine. She worked at the hospital she was confined to, cleaning bottles in the laboratory. In 1932 she was paralysed by a stroke. She died of pneumonia on November 11, 1938, aged 69.

At post mortem, Salmonella Typhi bacteria were found in her gallbladder. She had remained a carrier until her death. We now know that around 1 in 20 patients with typhoid fever who are not treated will become carriers. They themselves feel well even though they continue to shed the bacteria in their faeces and urine, from where it can be spread by poorly washed hands. This is probably what happened to Mary Mallon.

Thanks to her aliases and her avoidance of the authorities, Mary Mallon may well have caused up to 50 deaths from typhoid fever.

Mary Mallon and Steve Walsh both show the impact one link in the chain of infection can have. However, that’s as far as the similarities go. Mary knew she had been identified as a carrier and yet continued to work, put people at risk and did all she could to avoid detection. Yes, it’s easy for me to criticise a woman who lived a century ago without job security and who feared losing her livelihood; whatever her reasons, as a result of her actions people died. Walsh didn’t know he was infected and made himself known to, and co-operated with, the authorities as soon as he thought he might be. Both cases illustrate the key importance of the public health approach: contact tracing and identifying sources to break the chain of infection. They also show the value of an individual’s attitude. If we think we might be at risk of passing on an infectious disease, we all have to make a choice. Are we going to be like Mary Mallon or Steve Walsh?

Thanks for reading

- Jamie

The Truthtellers: Li Wenliang & Ignaz Semmelweis

“Sunlight is the best disinfectant”

Louis Brandeis - US Lawyer

Disease feeds on ignorance and misinformation. And yet it is often human nature to conceal or silence out of vested self-interest. George Orwell is often quoted as saying that “in a time of universal deceit telling the truth is a revolutionary act.” This is the story of a modern medical truth-teller, Li Wenliang, and his predecessor of nearly two centuries earlier: Ignaz Semmelweis.

Li Wenliang, Creative Commons

Li Wenliang was an ophthalmologist who had worked at Wuhan Central Hospital since 2014. Like a lot of doctors (myself included) Li enjoyed using social media, especially the Chinese microblogging site Weibo. As in my own posts, food featured heavily in his, in particular Japanese food and fried chicken, as did his favourite singer and actor, Xiao Zhan. He was a husband and father. On his final birthday he posted a resolution to be a simple person, refusing to let the world’s complications bother him. He was an optimist and entered various lotteries and competitions, especially if the prize involved gadgets.

On 30th December 2019 he saw a patient’s test result confirming infection by a coronavirus, the same family of virus which had caused the Severe Acute Respiratory Syndrome (SARS) epidemic of 2003. SARS had originated in southern China, but news of the disease was suppressed by the government. The World Health Organisation was made aware of SARS through internet monitoring, while China refused to share information for several months. By the time action was taken 2,000 more people had been infected. In total 8,098 people were infected, with 774 deaths, across 17 countries. The possibility of a new disease was politically sensitive. Li knew this.

Li alerted a group of his medical schoolmates on the social media application WeChat about the result. He warned them:

“Don’t circulate the information outside of this group. Tell your family and loved ones to take precautions.”

Li’s letter of admonishment, Creative Commons

On 31st December 2019 the World Health Organisation was alerted to an outbreak of pneumonia in the city centred around the Huanan Seafood Market. Despite Li’s request for secrecy, screenshots of his warning were shared on social media. On 3rd January 2020 Li was interrogated by police for rumour-mongering. He was warned about his conduct and against making claims on the internet. Li was given an official letter of admonishment which he was forced to sign. He then had to give written answers to two questions: in future, could he stop his illegal activities, and did he understand that if he continued he would be punished under the law? “I can” and “I understand” he wrote, placing his thumbprint in red ink by both answers. It wasn’t just meant to scare him; this was the kind of punishment given to a schoolboy. His name and the accusations were broadcast by state television. The message was clear: toe the line.

The People’s Republic of China celebrated its 70th anniversary in October 2019 with a display of military might and of what can be achieved through communist rule: mega cities, high-speed travel and economic stability. President Xi Jinping became only the second ruler after founding dictator Mao Zedong to have his political philosophy (Xi Jinping Thought on Socialism with Chinese Characteristics for a New Era, or simply Xi Jinping Thought) incorporated into the Chinese constitution. In 2018 China’s legislature voted to remove the two-term limit on the presidency, leaving Xi free to indulge in a cult of personality similar to that of Maoism. The unspoken contract between the government and people was simple: in exchange for reduced freedom of speech we will give you efficient and effective government. First came a downturn in economic growth and a trade war with the US. Then the ongoing revolt in Hong Kong. This new disease was a third significant crack in the facade of the Chinese government’s strength.

On 7th January Li returned to work. The next day in clinic he saw a patient with glaucoma who worked at the Huanan market. The patient didn’t have a fever and so Li saw her without a mask. On 10th January he started to cough. He sent his family to his in-laws 200 miles away and checked into a hotel. On 12th January he was admitted to hospital, and later to intensive care. On 1st February he tested positive for the new coronavirus. “Well that’s it then, confirmed”, he wrote on Weibo from his bed. He died on 7th February 2020, aged 33. There has been an outpouring of support for Li and criticism of the communist government of China, with calls for freedom of speech. Despite a ruling from the Chinese Supreme Court on 4th February that Li should not have been punished, there has been no apology from the government at the time of writing.

In the hours after Dr Li’s death nearly two million Chinese netizens shared the hashtag “I want freedom of speech” on social media before it was taken down by the authorities. Petitions have been signed and sent calling for greater freedom of expression to be guaranteed in China. Party chiefs are now holding up Dr Li as a hero and blaming his mistreatment on mistakes made by individuals. Global Times, a nationalist tabloid, has stressed that Dr Li was a loyal Communist party member and that the pro-democracy forces whipped up by his death are the work of enemies abroad and dissidents in Hong Kong.

The disease caused by the new coronavirus has been named COVID-19 by the World Health Organisation, which has declared it a Public Health Emergency of International Concern. On 12th February 2020 the Chinese authorities announced a previously unrecorded extra 13,332 cases and 254 deaths. The official explanation is that this is due to recording patients with lung changes on CT scan suggestive of infection rather than just those with positive laboratory tests. The memory of SARS casts a long shadow, however, and suspicion remains that China is not being truthful about the extent of the epidemic. President Xi Jinping, despite being invisible for much of the early outbreak, is now the “commander of the people’s war against the epidemic” according to the state news agency, Xinhua. Stirring stuff, but as scientists point out, the language of war doesn’t leave room for debate and discussion. The international medical community is pushing against the censorship of a government. In the past, however, doctors have been behind the suppression of knowledge.

Ignaz Semmelweis, Creative Commons

Ignaz Semmelweis was born on July 1, 1818 in the Tabán, an area of Buda that is part of present-day Budapest, Hungary, then part of the Austrian Empire. He was the fifth of ten children of the grocer Josef Semmelweis and his wife Teresia Müller. In 1837 he began studying Law at the University of Vienna but a year later switched to Medicine, graduating in 1844. After failing to secure a position in Internal Medicine, Semmelweis was appointed assistant to Professor Johann Klein in the First Obstetrical Clinic of the Vienna General Hospital on July 1, 1846.

At the time free maternity care was available to women as long as they agreed to let medical and midwifery students learn from them. Semmelweis was in charge of logging all patients as well as preparing for teaching and ward rounds. There were actually two clinics, which admitted patients on alternate days: the First Clinic was run by doctors and medical students, the Second by midwives. Semmelweis caught wind of the terrible reputation the First Clinic had. Indeed, destitute women would rather give birth in the street and wait a day than be admitted to the First Clinic. The difference was mortality: 10% of women admitted to the First Clinic died of fever, compared with less than 4% of those in the Second Clinic. What was the difference? Meticulous to the extreme, Semmelweis began investigating and eliminating differences. The climate was the same, there was no difference in religious practice, and it couldn’t be overcrowding as the Second Clinic was actually the busier. He was haunted by the question.

Jakob Kolletschka, Creative Commons

Tragedy would give him his answer. In 1847 his good friend Jakob Kolletschka, a Professor of Forensic Medicine at the University of Vienna, accidentally cut himself with a scalpel during a post mortem examination. He developed fever and multi-organ failure. Semmelweis had actually left Vienna to give himself a break from the question of the First Clinic. He returned on 20th March 1847 to discover that Kolletschka had died a week earlier. Semmelweis wrote:

“Day and night I was haunted by the image of Kolletschka's disease and was forced to recognize, ever more decisively, that the disease from which Kolletschka died was identical to that from which so many maternity patients died”

Rather than looking at just the clinic Semmelweis looked at what was taking place elsewhere. The medical students who ran the First Clinic started their day in the mortuary performing dissections before coming to see patients. The midwives of the Second Clinic spent all their time on the ward. Semmelweis surmised that there was a link between dead bodies and the fever which had killed his friend and was killing his patients. This was decades before the work of Koch and Pasteur and ‘germ theory’. All Semmelweis could postulate was that some ‘cadaverous particles’ were on the scalpel which cut his friend and caused his death and these particles were being spread to his patients by the medical students.

Semmelweis instituted a policy of using a solution of chlorinated lime (calcium hypochlorite, a close relative of the sodium hypochlorite in modern household bleach) for washing hands between autopsy work and the examination of patients. He felt a strong-smelling solution would eradicate the smell of rotting flesh and so eliminate whatever infectious agent was causing disease. Mortality in the First Clinic plummeted from 18.3% in April 1847 to 1.95% by August.

Semmelweis had discovered that rates of infection could be cut dramatically by the simple act of handwashing. This went against established medical theory of the time, which held that miasmas or ‘bad air’ were behind infection and that balancing the humours through procedures such as bloodletting was the basis of treatment.

In a precedent to modern China, this was also a time of political upheaval. 1848 saw revolutions across Europe and, in Hungary, an independence movement rise up against Austria. Although ultimately quashed, the rebellion would have consequences for Semmelweis’s research. His brothers were involved in the movement and his superior, Professor Johann Klein, was a conservative Austrian who probably didn’t approve of their actions. Semmelweis had other struggles. With no scientific explanation as to why handwashing worked, he was faced with scepticism. The medical community was not prepared to accept that they were somehow unclean and responsible for the deaths of patients. There was also the problem of Semmelweis’s poor presentation of his work and his high-handed manner, which some of his colleagues found off-putting.

In 1851 Semmelweis took a new post on the obstetric ward at St. Rochus Hospital in Pest (now part of Budapest), Hungary. Again through handwashing he virtually eliminated post-partum fever amongst his patients. Between 1851 and 1855 only 8 patients out of 933 births died due to fever. However, even within the same city his ideas failed to spread. Ede Flórián Birly, the Professor of Obstetrics at the University of Pest, continued to believe that puerperal fever was due to uncleanliness of the mother’s bowel and so continued to practice extensive purging.

In 1858, Semmelweis finally published his own account of his work in an essay entitled, “The Etiology of Childbed Fever”. Two years later he published a second essay, “The Difference in Opinion between Myself and the English Physicians regarding Childbed Fever”. In 1861, Semmelweis published his main work Die Ätiologie, der Begriff und die Prophylaxis des Kindbettfiebers (The Etiology, Concept and Prophylaxis of Childbed Fever). In his 1861 book he lamented:

“Most medical lecture halls continue to resound with lectures on epidemic childbed fever and with discourses against my theories. The medical literature for the last twelve years continues to swell with reports of puerperal epidemics, and in 1854 in Vienna, the birthplace of my theory, 400 maternity patients died from childbed fever. In published medical works, my teachings are either ignored or attacked. The medical faculty at Würzburg awarded a prize to a monograph written in 1859 in which my teachings were rejected”

His health was in decline. He was obsessed with his work and with the wrongs he had suffered. He aged physically, became increasingly absent-minded in his work and distant at home, and slipped into depression. By 1865 he was drinking and visiting prostitutes. It’s been suggested that this was due to mental illness, the beginnings of dementia or even third-stage syphilis, a disease obstetricians sometimes picked up from their patients. In 1865 a board made up of University of Pest professors referred Semmelweis to a mental institution, where ‘treatment’ included being placed in a straitjacket, doused in water and beaten. During one beating by the institution’s guards Semmelweis received a cut to his right hand which became gangrenous. Two weeks after his institutionalisation Semmelweis died of septicaemia, the very condition he had spent his career fighting. His funeral was a quiet, poorly attended affair.

Two decades later Louis Pasteur confirmed the germ theory of infectious disease and Joseph Lister pioneered antiseptic surgery. Ignaz Semmelweis’s reputation was revised by a humbled profession. Now he was “the saviour of mothers”. There is a university named after him in Budapest and his home is now a museum. It is too soon to fully evaluate Li’s legacy, but in the immediate aftermath of his death he is being held up as a martyr by the nascent Chinese pro-democracy movement.

COVID-19 has been described as potentially China’s ‘Chernobyl moment’. As with the USSR, China’s monolithic one-party state is struggling to contain and respond to a human disaster. The government is attempting to cast the fight against the disease as a test of Chinese pride, a populist struggle to rise to. However, as history tells us, disease is no respecter of borders, regimes or reputations. I’ll leave the last words to Dr Li, given to Chinese media from his hospital bed shortly before he died:

“I think there should be more than one voice in a healthy society”

Thanks for reading.

- Jamie

Mouldy Mary and the Cantaloupe


It’s a well-known story and an example of medical serendipity. Alexander Fleming (1881-1955), a Scottish microbiologist, returned to his laboratory following his summer holiday in 1928 and found his growth plates of Staphylococcus bacteria had been contaminated with mould. Wherever the mould was growing the bacterial cells had been killed. Antibiotics had been discovered. Except this wasn’t the first antibiotic to be made: medication with antibacterial action dates back to before the medieval period. And when it came to penicillin, Fleming’s discovery was only the beginning. The penicillin still in use today owes much to an unsung hero called Mary and a mouldy cantaloupe.

Fleming surmised that the mould must be making some sort of chemical which was killing the Staphylococcus.  The mould in question was Penicillium notatum, and so Fleming called this chemical penicillin.  Fleming wasn’t skilled at chemistry and so was only able to extract small amounts of penicillin, which he demonstrated did kill bacteria and was safe in humans.

Sample of penicillin mould presented by Alexander Fleming, 1935. From https://blog.sciencemuseum.org.uk/oxford-and-the-forgotten-man-of-penicillin/

Fleming was a poor public speaker and, despite presenting his findings at the Medical Research Club and publishing his results in the British Journal of Experimental Pathology in 1929, there was little recognition amongst his peers.  It wouldn’t be until 1939 that Ernst Chain and Howard Florey managed to purify concentrated penicillin from the mould.  In 1940 they completed their first animal trials.  By 1941 they were ready to treat their first human patient, but due to the experimental nature of their drug they needed someone who was seriously if not terminally ill.  In 1941 Albert Alexander, a police constable in Oxford, scratched his face on a rose thorn (although this explanation for the injury has been described as apocryphal).  The scratch became infected with both Staphylococcus and Streptococcus bacteria.  Abscesses covered his face and he lost an eye.

On 12th February 1941 Alexander was given an intravenous infusion of 160mg of penicillin.  Within 24 hours his fever resolved and he regained strength and his appetite.  Sadly, it was already clear that Penicillium notatum made only tiny amounts of penicillin; it took gallons and gallons of mould culture to make enough of the drug to even cover a fingernail.  After five days of treatment the team ran out of penicillin.  Alexander’s condition worsened again and he died.

Whilst penicillin was clearly promising there needed to be a more efficient way to produce the antibiotic, especially at the height of the Second World War when demand couldn’t have been higher.  A solution would be found in America.

Mary Hunt worked at the Department of Agriculture’s Northern Regional Research Laboratory (NRRL) in Peoria, Illinois.  It was her job to search out mould strains which might produce more penicillin than Penicillium notatum.  This earned her the nickname ‘Mouldy Mary’.  One day in 1943 she found a mouldy cantaloupe in a grocery store.  Bringing it to the lab, she found it was infected with Penicillium chrysogenum, a strain which produced two hundred times as much penicillin as notatum.  The next step sounds like it came right out of science fiction: the chrysogenum was zapped with X-rays to cause mutation, and the mutated mould produced a thousand times the amount of penicillin.  By D-Day in 1944 there was enough penicillin to treat every soldier in need.  By 1945 a million people had been treated with penicillin, compared to fewer than 1,000 in 1943.

After the war Fleming, Florey and Chain received the Nobel Prize in Physiology or Medicine for the ‘discovery of penicillin and its curative effect in various infectious diseases’.  As for Mary Hunt, whilst researching this blog I couldn’t even find out when she was born or what she looked like.  She isn’t the first woman to be sidelined by history despite a massive contribution.  But all penicillin used today descends from that mouldy cantaloupe and owes its existence to ‘Mouldy’ Mary Hunt.


Casting the dye: the first antibiotics


To look at a photograph of microscopic life is to see a world of purple and blue, of swirls and dots.  Of course this isn’t their natural appearance but the result of dyes used to make this invisible world vivid under the lens.  Haematoxylin and eosin (H&E) are two such dyes in regular use.  Haematoxylin dyes the nuclei of cells (where DNA is stored) blue whilst eosin stains the cytoplasm (the goo which makes up most of the cell) pink.  Other structures take up the dyes in varying amounts to create the remarkable pictures through which we study life and disease.

Breast cancer cells viewed under a microscope using an H&E stain. Hematoxylin has stained the cell nuclei blue while eosin has dyed the cytoplasm pink. From Shutterstock.

Such a principle is impressive.  But with a bit of lateral thinking it would herald a medical revolution.  Decades before Alexander Fleming found Penicillium mould killing cultures of bacteria in his laboratory, dyes would form the foundation of the first ever antibiotics.

Paul Ehrlich (1854–1915) was a German physician and scientist who experimented with dyes in microscopy. Looking down his microscope and seeing how different cells and structures took up dyes to differing degrees, Ehrlich hypothesised that it must be possible to find chemicals which would be taken up by bacterial cells but not human cells.  If these chemicals were toxic to bacterial cells they would form an antimicrobial treatment which would be safe for humans.  Such a chemical would be the 'magic bullet' doctors were crying out for.

Ehrlich painstakingly tested compounds of arsenic.  The 606th compound, arsphenamine (later called Salvarsan), was shown to destroy Treponema pallidum, the cause of syphilis.  Patients with syphilis started treatment with arsphenamine.  Sadly, it proved toxic and killed some of Ehrlich's patients.  But a principle had been proven: selective toxicity, the preferential killing of microbial cells with chemicals, was possible.  Ehrlich had already been awarded the Nobel Prize in Physiology or Medicine in 1908 for his work on immunity.  

A stamp printed in Niger showing Nobel laureate Paul Ehrlich, circa 1977. From Shutterstock.

The German chemical company IG Farbenindustrie continued to look at dyes as potential antibacterial treatments.  In 1932 Gerhard Domagk (1895-1964) discovered that a red dye, Prontosil, was toxic to Streptococcus bacteria.  What's more, it was safe in humans.  The medical community were sceptical, however.  In 1936 President Roosevelt's son fell ill with a sore throat and high fever.  After conventional treatments had been exhausted his doctors tried Prontosil.  It worked and he made a full recovery.

Fame followed for Domagk, with the American media extolling the virtues of Prontosil, but not fortune.  Although Prontosil was patented, researchers quickly found that its therapeutic effect only came from it being metabolised in the body into sulphanilamide, which wasn't patented.  Sulphanilamides were free to be made by any drug company that wanted to.  Domagk would still be celebrated by the scientific community and, like Ehrlich, was awarded the Nobel Prize, this time in 1939.  However, the Nazi regime frowned on the prize and Domagk was arrested by the Gestapo when he attempted to travel to receive it.  He was finally able to accept it in 1947.  

Gerhard Domagk. Creative Commons.

However, by the time Domagk was awarded his Nobel Prize the potential of penicillin was starting to be realised. To Alexander Fleming would come fame. His story of serendipity, rather than the painstaking research of Ehrlich and Domagk, would become part of folklore. Yet the story of antibiotics does not stop with Fleming. There would need to be a crucial intervention from someone with even less recognition than Ehrlich and Domagk. But more of that story later.

Thanks for reading.

- Jamie












Bald’s Leechbook: Going (pre)medieval on a superbug

Bald’s eyesalve. A facsimile of the recipe, taken from the manuscript known as Bald’s Leechbook (London, British Library, Royal 12, D xvii).

In a previous musing I looked at medieval Medicine and common theories of illness and cure at the time. Medieval Medicine has a reputation for being backward in contrast to the enlightened Renaissance. This is the story of a pre-medieval medicine and a modern day ‘superbug’.

Methicillin resistant Staphylococcus aureus (MRSA) was first identified in Britain in 1961. Staphylococcus aureus is a very common species of bacteria often found on the human body which, given the opportunity, can cause infections, especially in soft tissues like the skin. Staph aureus bacteria have a cell wall which protects them from their environment. They use a number of proteins to build this wall. It's these proteins to which penicillin antibiotics bind, stopping the bacteria from making their protective cell wall (hence they are called penicillin binding proteins or PBPs). Without their wall the bacterial cells die. MRSA gets around this because it has evolved to produce a protein called PBP2a which is much harder for penicillins to bind to, so their activity is greatly impeded.

Like a lot of bacteria, MRSA is able to stick together and secrete proteins to form a slimy, protective layer. This is called a biofilm. It means MRSA is able to colonise various surfaces in the community and hospital environment. As a result MRSA is a leading cause of infections acquired both in the community and in hospitals. In fact, there are ten times as many infections due to MRSA as due to all other multidrug-resistant bacteria combined. Science has looked in some unusual places for an answer. Enter Bald's Leechbook.

Bald's Leechbook was written in England in the 10th century and offers cures for a number of conditions, including infections. In 2015 a study decided to look at one such cure, for an eye 'wen' - a lump in the eye (probably a stye). This was the passage in question:

“Ƿyrc eaȝsealf ƿiþ ƿænne: ȝenim cropleac ⁊ ȝarleac beȝea emfela, ȝecnuƿe ƿel tosomne, ȝenim ƿin ⁊ fearres ȝeallen beȝean emfela ȝemenȝ ƿiþ þy leaces, do þonne on arfæt læt standan niȝon niht on þæm arfæt aƿrinȝ þurh claþ ⁊ hlyttre ƿel, do on horn ⁊ ymb niht do mid feþre on eaȝe; se betsta læcedom.”

This is translated into modern English as follows.

“Make an eyesalve against a wen: take equal amounts of cropleac [an Allium species] and garlic, pound well together, take equal amounts of wine and oxgall, mix with the alliums, put this in a brass vessel, let [the mixture] stand for nine nights in the brass vessel, wring through a cloth and clarify well, put in a horn and at night apply to the eye with a feather; the best medicine.”

Incredibly, Bald’s salve was found to kill MRSA as well as break up the biofilms it forms.  This was shocking enough but the researchers found that the salve seemed to work best when the recipe was followed exactly.  If steps or ingredients were skipped then the resulting treatment did not work as well.  Previous research had shown that individual ingredients such as allium species did have antibacterial effects but these were intensified when used in combination with the other ingredients. 

This suggests one of two possibilities.  Either Bald randomly threw together these recipes and got lucky.  Or there was something of a scholarly approach going on, using ingredients known to work and mixing them together to create something greater than the sum of its parts.  I’m not suggesting that all Medicine at the time was correct but as I said in a previous blog, perhaps the medieval period wasn’t such a dark age after all. 

This suggests a redrawing of medical and scientific history; previously the scientific method was believed to have been invented by the Royal Society in the 17th century and the first antimicrobial medicine to have been Lister's carbolic acid in the 19th century. I'm not advocating a return to 10th century Medicine but instead an appreciation of our forebears, who probably knew a lot more than we give them credit for.

Thanks for reading

- Jamie

Opium: A trip through history


In the past few months the pharmaceutical giant Johnson & Johnson was ordered by an Oklahoma court to pay $572 million for its part in fuelling addiction to prescribed opioid drugs. This is a landmark case, placing companies producing painkillers at a similar level of responsibility as tobacco manufacturers. Worldwide it is estimated that 16 million people have at some point been addicted to opiates. These drugs include morphine, an alkaloid found naturally in the opium poppy, Papaver somniferum, and heroin, which is derived from it. Our relationship with this plant is historic and complex. Its therapeutic benefits are without question, hence an opium poppy is seen on the emblem of the Royal College of Emergency Medicine. Yet we have fought wars over control of opium, and its role in society raises greater issues involving prescribed drugs and the companies who produce them. To understand opium is to understand the history of our relationship with it.


Opium comes from the latex of the poppy, a sticky, sap-like substance which oozes out of the poppy when it is cut. We don't know when we first realised the poppy's potential but there is evidence of our Stone Age ancestors making it one of the first plants to be harvested. It's easy to imagine one of our ancestors eating a poppy or licking the latex off their fingers and discovering its hidden abilities. Because of the poppy's ability to nullify pain and bring on altered consciousness it became linked to deities throughout the ancient civilisations, such as the Egyptians. For the ancient Greeks the poppy was a gift from the goddess Demeter and was associated with Hypnos, the god of sleep, and Morpheus, the god of dreams, whose name would give us 'morphine'. At this point it is worth pointing out a difference in nomenclature: drugs derived directly from opium are called opiates, whilst semi-synthetic and synthetic drugs such as heroin are called opioids.  

Opiates and opioids both work by acting on opiate receptors. When they bind to opiate receptors on neurons they reduce neurotransmitter release and so prevent signals being sent. There are four types of opiate receptor (mu, kappa, delta and nociceptin) dotted throughout the body. As a result, whilst opiates are excellent at dulling pain (analgesia), they also come with side effects as well as an impact on the chemistry of the brain. This brings addiction and dependence. This is the double-edged sword of opium.

The eminent Persian physician al-Razi (854-925) is credited with being one of the first people to use opium as an anaesthetic. Yet the downsides of the poppy soon became clear. The fourth ruler of the Mughal Empire, Jahangir (1569-1627), was so addicted to opium that his wife ruled in his stead. It was written of the Turkish people that "there is no Turk who would not buy opium with his last penny."


One of the most popular forms of opium, as both a medicine and a drug of abuse, was laudanum. The legendary English physician Thomas Sydenham (1624-1689), to whom the expression 'primum non nocere' (first do no harm) has been credited, published his recipe for it in 1676. He wrote, "among the remedies which it has pleased Almighty God to give to man to relieve his sufferings, none is so universal and so efficacious as opium."

Laudanum was widely extolled as a treatment for ailments as wide ranging as coughs and pain. Despite its addictive tendencies it was a lot safer than many other treatments of the time, which often contained poisonous heavy metals. And, because of the constipating side effects of opium, at a time of poor hygiene when diarrhoea was common, it did offer some therapeutic benefit. Laudanum was a common base for most remedies of the period. In 1821 the essayist Thomas De Quincey (1785-1859) published Confessions of an English Opium-Eater, an autobiographical work chronicling his addiction to and misuse of opium.


One country which struggled in particular with addiction to opium was China, which sought to limit the influx of the drug. For the Imperial British, growing poppies in India, China was a convenient and hungry market, and they defended that market violently. The two Opium Wars were fought between China and Britain from 1839 to 1842 and again from 1856 to 1860, with the French joining Britain for the second conflict. The resulting victories for Britain reduced China's gross domestic product by a half, taking it from the largest economy in the world to diplomatic subservience, kept the flow of opium into China open and started the process by which the British took hold of Hong Kong. The current political turmoil in Hong Kong, returned to China in 1997, can be traced back to conflicts fought to ensure Chinese opium addicts kept having their habits fed.

Around this time in Britain a chemist, Charles Romley Alder Wright (1844-1894), was seeking to overcome the problem of opium addiction by formulating a version of morphine with all of the analgesic benefits but without the addictiveness. He boiled morphine with a number of different acids. It's fair to say he failed in his mission. He created diamorphine, otherwise known as heroin, an agent even more potent than morphine in both its analgesic and its addictive properties. After Wright's death the Bayer laboratories in Germany took over production, led by Heinrich Dreser (1860-1924). First sold as a cough suppressant, heroin was withdrawn by Bayer in 1913 when it became clear how addictive their product was. Other opiates, such as codeine (first isolated decades earlier), were also coming into widespread use around this time.


Opium dens were first established in China, and with the economic migration of Chinese workers they spread to the USA. Here users could smoke opium and receive a much faster and stronger hit than from eating it. The rise of these dens was partially behind the US Chinese Exclusion Act of 1882, which sought to limit immigration from China. In 1906 the federal government under President Theodore Roosevelt passed the Pure Food and Drug Act, which required any "dangerous" or "addictive" drugs to appear on the label of products. Three years later, the Smoking Opium Exclusion Act banned the importation of opium intended purely for recreational use. This too was wrapped up in anti-Chinese sentiment rather than being simple drug legislation.

After campaigning from the pathologist Hamilton Wright (1867-1917), who called opium "the most pernicious drug known to humanity", the Harrison Narcotics Tax Act of 1914 put taxes and restrictions on opium.  Opium was stigmatised in the media and by officials.  In the UK the supply of opium and its derivatives was controlled by pharmacists.  Following the Rolleston Committee's report in 1926, doctors were given the power to prescribe where they saw medical need.  Addiction was seen as a medical need and so doctors were able to prescribe small amounts to try and wean their patients off the drugs.  There was a clear division between the medical treatment of addicts and the criminal prosecution of producers and distributors.  That changed in the 1960s.


In 1961 the Single Convention on Narcotic Drugs, an international treaty signed by the members of the United Nations, sought to restrict the spread of opium as well as other drugs.  In the UK this led to the Drugs (Prevention of Misuse) Act 1964, which for the first time made criminals of addicts, creating penalties for possession as well as stop and search powers for the police. The Misuse of Drugs Act 1971 divided drugs into classes A, B and C, still in use today.

The debate about whether making criminals out of addicts actually works is ongoing and not something I'm going to dwell on here.  But it is true that the idea of illegal drugs with penalties for possession only dates back a generation or so, a blink of an eye in relation to the millennia we've spent with opium.  One country that took a very different approach is Portugal. At one point in the 1980s as many as 1 in 10 Portuguese were said to be addicted to heroin, and the country had among the highest rates of drug-related HIV infection in the European Union.  In 2001 Portugal decriminalised possession; users were instead directed to support and treatment, similar to practice in the UK before the 1960s.  Since then, and not solely because of the change in the law, deaths due to overdose, HIV transmission and drug-related crime have all plummeted.  This suggests an alternative approach to illegal opium. But it's not just the illegal market we now face a challenge from.

Opiates and opioids remain our best analgesics for those with severe pain or at the end of life.  But the evidence shows that they offer little or no benefit in long-term use.  The problem is that chronic pain is on the rise in the UK.  The British Pain Society estimates that as many as 28 million adults in Britain are living with pain lasting longer than three months. In giving his landmark judgement against Johnson & Johnson, Oklahoma Judge Thad Balkman held the company responsible for the:

“promotion of the concept that chronic pain was under-treated (creating a problem) and increased opioid prescribing was the solution.”

Similar claims have been made across the USA, with more landmark trials expected. Chronic pain often has psychological issues attached to it, yet in a short consultation with a GP or in a clinic there is rarely the opportunity to explore these. Opiates make a convenient solution. And so the double-edged sword is wielded. Perhaps with litigation will come an open discussion about the use of opiates and the role of opium in our society: from a divine gift to a precious medical tool, albeit one that needs to be used with caution.

Thanks for reading.

- Jamie

Two snakes or one? How we get the symbol for Medicine wrong


Healthcare is full of antiquity, not surprising for a venture as old as humanity itself. Humans have always got sick and always turned to wise men and women and the divine to help them. With that come symbols and provenance. Wound Man. The Red Cross. The Rod of Asclepius.

Ah yes, the Rod of Asclepius, the Ancient Greek God of healing. It’s a prominent symbol of Medicine. One staff, with two snakes entwined around it…

Except that symbol is not the Rod of Asclepius at all. That symbol of two snakes wrapped around a pole, known as a caduceus, actually belonged to Hermes, the Ancient Greek messenger God in charge of shepherds, travel and commerce. The Ancient Romans called him Mercury. The fastest of the gods, he had winged shoes and helmet to help him travel. On one adventure he saw two snakes fighting. To stop them he threw a stick at them and at once the serpents wrapped themselves around it and became fixed. Hermes liked the resulting staff so much he took it as his own. Hence the caduceus became a symbol of Hermes; of commerce and travel.


Asclepius (Vejovis to the Romans) on the other hand was the son of Apollo the Sun God. Just like Hermes Asclepius was also linked to snakes. One story has a snake licking his ears clean and in so doing giving him healing knowledge. Another story has a snake giving him a herb with resurrecting powers. For whatever reason, Asclepius would show his gratitude to snakes by carrying a staff with one snake on it. Not two. One.


The Ancient Greeks weren’t the first or last civilisation to link snakes to divinity. People have a habit of venerating and fearing in equal measure. Snakes, with their stillness, mysterious venom and supposed powers of self-renewal through shedding their skin are always going to inspire wonder.

So why the confusion between these two symbols? One possible reason is due to alchemy; the attempt by early scientists to turn base metals to gold which, while a folly, helped advance scientific knowledge including Medicine. The caduceus was used as a symbol by alchemists as they often used mercury or quicksilver in their preparations. Hermes/Mercury was linked to the metal that bore his name and so a connection was made. However, the caduceus was also a symbol of professionalism and craft. Therefore anyone wanting their work to be taken seriously would include the caduceus as a kind of early precursor of professional accreditation. In that vein, when John Caius, the chronicler of sweating sickness, presented both the Cambridge college which bears his name and the Royal College of Physicians with a silver caduceus it was not as a symbol of Medicine but of professionalism.

In any case, in Great Britain, as late as 1854, the distinction between the rod of Asclepius and the caduceus as symbols of two very different professions was apparently still quite clear. In his article On Tradesmen's Signs of London, A.H. Burkitt notes that among the very old symbols still used in London at that time, which were based on associations between pagan gods and professions, "we find Mercury, or his caduceus, appropriate in trade, as indicating expedition. Esculapius, his Serpent and staff, or his cock, for professors of the healing art".

It seems the mix-up didn't take place until the 20th century. In 1902 the US Army Medical Corps adopted the caduceus as its symbol. The reason isn't clear, as the American Medical Association, the Royal Army Medical Corps and the French Military Service would all happily adopt the staff of Asclepius. The decision to choose the caduceus has been credited either to a Captain Frederick P. Reynolds or a Colonel Hoff. The US Public Health Service and the US Marine Hospital Service would also take Hermes's symbol as their own.

This confusion seems to be uniquely American and driven by commercialisation. A 1990 survey in the US found that 62% of professional associations used the rod of Asclepius while 37% used the caduceus, and that 76% of commercial organisations used the caduceus. Perhaps that makes sense, as Hermes was the god of trade (or maybe that's me being cynical). The World Health Organisation chose the rod of Asclepius for its emblem, where it can still be seen today.


Medicine is full of symbolism. Symbols, like language, change their meaning. There was a time when healthcare was full of quacks and charlatans. The caduceus was a mark of professionalism long before there were accreditations to be had. Using the two snakes is a nod to those efforts to make the trade professional and accountable. But if you want to be accurate, it's the staff with one snake you're after.

Thanks for reading.

- Jamie


When mental health robbed England of its king for over a year

Both Prince William and Prince Harry have spoken openly about their own mental health and the impact of losing their mother and growing up in the public eye. Together they have formed a charity to support young people with mental health problems. They aim to remove a stigma which still remains in the 21st century.

This musing goes back to another royal with mental health problems, this time in the 15th century: problems we still cannot diagnose, which led to his downfall and changed the course of history in England.

It's 1453 and to say that King Henry VI of England has a lot on his plate would be an understatement. The Battle of Castillon on 17th July effectively ends the Hundred Years War with France and sees Henry lose the last part of an empire which had once stretched from the Channel to the Pyrenees. At home this defeat stoked the embers of rebellion. The Wars of the Roses are imminent. For Henry defeat was a personal blow too. He was the son of Henry V, the war hero of Agincourt. He succeeded to the throne in 1422, aged only nine months, after his father's sudden death, and by the time he was deemed old enough to rule in his own right in 1437 the war with France had already turned against England. Henry was unable to live up to his father's legend and reverse the slide, putting his reign under increasing pressure from the very beginning.

King Henry VI

Henry did have one thing going for him: his wife, Margaret of Anjou, whom he had married in 1445. By the summer of 1453 she was pregnant. Strong-willed and volatile, she was far more willing than Henry to stand firm and make decisions. Henry deplored violence and would rather spare traitors and cut back on his own spending than raise taxes. Royal duties were a distraction from his preferred activities: praying and reading religious texts. Admirable, but not ideal when revolution is in the air. As Henry began to earn his reputation as one of England's weakest ever kings, Margaret would come to be the de facto monarch. He would soon need her even more.

Margaret of Anjou

10th August 1453, at the royal lodge in Clarendon near Salisbury. Henry receives news of the defeat at Castillon and of the deaths of one of his most faithful and talented commanders, John Talbot, Earl of Shrewsbury, and of Talbot's son. Suddenly he falls unwell. Without warning he becomes unaware of his surroundings, unresponsive to anyone and anything around him and seemingly unable even to move. With England on the verge of civil war his entourage are understandably keen to keep this under wraps and hope it passes. It doesn't. Margaret stays in London and the royal court continues as normal. In early October, accepting how ill the king is, his court moves him gradually to Windsor. On 13th October Margaret goes into labour and is delivered of a baby boy, Edward. Henry is informed of the birth of his heir but doesn't react. In the New Year Margaret brings Prince Edward to Henry. Both she and the Duke of Buckingham beg Henry to bless the young prince. Other than moving his eyes he does nothing. By this time he has to be fed and guided around the palace by his attendants.

On 22nd March 1454 John Kemp, the Archbishop of Canterbury and Lord Chancellor of England, dies. The news is given to Henry by a delegation of bishops and noblemen in the hope he will wake and announce a successor. The group report back to Parliament that the king remains unresponsive. That same month a commission sends a group of doctors to treat Henry. They are provided with a list of possible treatments including enemas, head purging (heat applied to the head), laxatives and ointments. Whatever treatments they choose, nothing works.

As suddenly as Henry had fallen ill he recovered, after nearly 18 months, on Christmas Day 1454. On 30th December Margaret brought Edward to Henry. The king was delighted and acted as though he was meeting the prince for the first time. Margaret was overjoyed, but with an agenda. During Henry's illness Richard of York had claimed the title of Lord Protector and, on the death of John Kemp, had placed his brother-in-law Richard Neville as the new Chancellor, a move Margaret opposed. Edmund, Duke of Somerset, a rival of Richard's and an ally of Margaret's, had been sent to the Tower of London. Richard was a relative of Henry's and had a claim to the throne, a claim scuppered by the birth of Prince Edward. The life of her son was in jeopardy. With Henry now well again Margaret persuaded him to remove Richard from favour and release Somerset from the Tower. So the resentment intensified. Richard began to grow his support. The Wars of the Roses sprang from these personal rivalries. Had Henry not been unwell it's possible they could have been avoided.


So what was Henry's illness? Much has been made of a supposed family history of mental health problems. His maternal grandfather, King Charles VI of France, suffered recurrent bouts of violence and disorientation, not recognising his family or remembering that he was king. These bouts lasted months at a time. It is possible they were due to a mental illness such as bipolar disorder or schizophrenia. However, they seemed to follow a fever and seizures he suffered in 1392. Potentially Charles's 'madness' may have been due to an infection such as encephalitis rather than psychiatric illness.

The length of Henry's illness and his sudden improvement with no apparent ill effect make schizophrenia or catatonic schizophrenia unlikely. The length of the illness, along with the loss of awareness and memory, also makes a depressive illness unlikely. There's no record of him being similarly ill at any other time of his life. It is possible he suffered a severe dissociative disorder due to stress. Of course, it is entirely plausible that contemporary accounts are inaccurate or incomplete, and in any case it is impossible to diagnose a patient you have never met, never mind one who died over five centuries ago.

Henry would cling to the throne until he was deposed in 1461, replaced by Edward IV, son of Richard of York. Henry was imprisoned and Margaret fled to Scotland with their son. But she wasn’t finished. She would reach out to Richard Neville and form an alliance based on an arranged marriage between her son and his daughter. Neville would force out Edward IV and reinstate Henry in 1470. It was to be a short return however. Edward IV raised an army and in the ensuing conflict both Richard Neville and then Henry’s son died in combat in early 1471. Henry once again was imprisoned in the Tower of London. He died mysteriously, possibly murdered on the orders of Edward IV, in 1471. His mental health was blamed, with supporters of Edward IV claiming he died of a broken heart at the loss of his son. Margaret was also imprisoned until she was ransomed by King Louis XI of France in 1475. She lived out her days in France until she died in 1482.

King for as long as he could remember, losing his kingdom and facing rebellion and possible death, it's no wonder Henry's mental health suffered. But what I think is remarkable is that, at a time with no real understanding of mental health, his court was able to keep him fed and watered and otherwise healthy for 18 months. In the years since their mother died in 1997, Princes William and Harry have shown how far we have come in appreciating mental health. Their ancestor King Henry VI is a powerful example of the impact mental illness can have.

Thanks for reading

- Jamie

#FOAMPubMed 1: Lemons and Limes, the first clinical trial and how to make a research question


Before we conduct any research we first need to construct a research question. This can be a difficult step. Our question needs to be precise and easy to understand. To do this we can use the ‘PICO’ criteria:

Population

We need a population of interest. These will be subjects who share particular demographics, and the population needs to be clearly defined and documented.

Intervention

The intervention is something you're going to do to your population. This could be a treatment, an educational programme or an exposure such as asbestos. The effect of the intervention is what you're interested in.

Control/Comparison

If we're going to study an intervention we need something to compare it against. We can use people without the exposure (a control group) or compare the treatment to another treatment or to a placebo.

Outcome

The outcome is essentially what we are going to measure in our study. This could be mortality, an observation such as blood pressure or a statistic such as length of stay in hospital. Whatever it is, we need to be very clear that this is our main outcome, otherwise known as our primary outcome. The outcome determines our sample size so it has to be explicit.

PICO therefore allows us to form a research question.

To demonstrate this let’s look at the first ever clinical trial and see how we use PICO to write a research question.

It's the 18th century: an age of empires, war and exploration. Britain, an island nation in competition with its neighbours for hegemony, relies heavily on her navy as the basis of her expansion and conquest. This is the time of Rule Britannia. Yet Britain, like all seagoing nations, was riddled with one scourge amongst its sailors: scurvy.

Scurvy is a disease caused by a lack of vitamin C. Vitamin C, or ascorbic acid, is essential in the body as a cofactor in a variety of different processes, including the making of collagen, a protein which forms the building blocks of connective tissue, and wound healing. A lack of vitamin C therefore causes a breakdown of connective tissue as well as impaired healing; this is scurvy, a disease marked by skin changes, bleeding, loss of teeth and lethargy. Hardly the state you want your military to be in when you're trying to rule the waves.

James Lind was born in Edinburgh in 1716. In 1731, he registered as an apprentice at the College of Surgeons in Edinburgh and in 1739 became a surgeon's mate, seeing service in the Mediterranean, Guinea and the West Indies, as well as the English Channel. In 1747, whilst serving on HMS Salisbury he decided to study scurvy and a potential cure.

James Lind 1716-1794

Lind, in line with medical opinion at the time, believed that scurvy was caused by a lack of acid in the body which made the body rot or putrefy. He therefore sought to treat sailors suffering with scurvy with a variety of acidic substances to see which was the best treatment. He took 12 sailors with scurvy and divided them into six pairs. One pair were given cider on top of their normal rations, another sea water, another vinegar, another sulphuric acid, another a mix of spicy paste and barley water, with the final pair receiving two oranges and one lemon (citrus fruits containing citric acid).

Although they ran out of fruit after five days, by that point one of the pair receiving citrus fruits had returned to active duty whilst the other had nearly recovered. Lind published his findings in his 1753 work, A Treatise of the Scurvy. Despite this outcome neither Lind himself nor the wider medical community recommended that citrus fruits be given to sailors. This was partly due to the impossibility of keeping fresh fruit on a long voyage and the belief that other, easier-to-store acids could cure the disease. Lind recommended a condensed juice called 'rob', made by boiling fruit juice. Boiling destroys vitamin C and so subsequent research using 'rob' showed no benefit. Captain James Cook managed to circumnavigate the globe without any loss of life to scurvy, likely thanks to his regular replenishment of fresh food along the way as well as the rations of sauerkraut he provided.

It wasn't until 1794, the year that Lind died, that senior officers on board HMS Suffolk overruled the medical establishment and insisted on lemon juice being provided on their twenty-three-week voyage to India, to be mixed with the sailors' grog. The lemon juice worked. The organisation responsible for the health of the Navy, the Sick and Hurt Board, recommended that lemon juice be included on all future voyages.

Although his initial assumption (that scurvy was due to a lack of acid and that it was the acidic quality of citrus fruits which cured it) was wrong, James Lind had performed what is now recognised as the world's first clinical trial. Using PICO we can construct Lind's research question.

Population

Sailors in the Royal Navy with scurvy

Intervention

Giving sailors citrus fruits on top of their normal rations

Comparison

Seawater, vinegar, spicy paste and barley water, sulphuric acid and cider

Outcome

Patient recovering from scurvy to return to active duty

So James Lind’s research question would be:

Are citrus fruits better than seawater, vinegar, spicy paste and barley water, sulphuric acid and cider at treating sailors in the Royal Navy with scurvy so they can recover and return to active duty?
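As a purely illustrative aside for anyone who likes to keep their question-building tidy, here is a minimal sketch in Python (the class and field names are my own invention, not part of any standard PICO tooling) showing how the four PICO elements can be stored and composed into a question like the one above.

```python
from dataclasses import dataclass


@dataclass
class PICO:
    """Container for the four elements of a PICO research question."""
    population: str
    intervention: str
    comparison: str
    outcome: str

    def question(self) -> str:
        # Compose the four elements into a single readable research question.
        return (f"In {self.population}, is {self.intervention} better than "
                f"{self.comparison} at achieving {self.outcome}?")


# James Lind's scurvy trial expressed in PICO terms.
lind = PICO(
    population="sailors in the Royal Navy with scurvy",
    intervention="citrus fruits on top of normal rations",
    comparison="seawater, vinegar, spicy paste and barley water, sulphuric acid or cider",
    outcome="recovery and return to active duty",
)

print(lind.question())
```

Running the sketch simply prints the composed question; the value is in being forced to state each of the four elements explicitly before the study begins.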

After HMS Suffolk arrived in India without scurvy the Naval establishment began to give citrus fruits in the form of juice to all sailors. This arguably helped swing superiority the way of the British as health amongst sailors improved. It became common for citrus fruits to be planted across Empires by the Imperial countries in order to help their ships stop off and replenish. The British planted a particularly large stock in Hawaii. Whilst lemon juice was originally used the British soon switched to lime juice. Hence the nickname, ‘limey’.

A factor which had made the cause of scurvy hard to find was that most animals, unlike humans, can make their own vitamin C and so don't get scurvy. In 1907 a team studying beriberi, a disease caused by a lack of thiamine (vitamin B1), in sailors fed guinea pigs the sailors' diet of grains. Guinea pigs, by chance, also can't synthesise vitamin C, and so the team were surprised when, rather than developing beriberi, the animals developed scurvy. In 1912 vitamin C was identified. In 1928 it was isolated and by 1933 it was being synthesised. It was given the name ascorbic ('against scurvy') acid.

James Lind didn’t know it but he had effectively invented the clinical trial. He had a hunch. He tested it against comparisons. He had a clear outcome. As rudimentary as it was this is still the model we use today. Whenever we come up with a research question we are following the tradition of a ship’s surgeon and his citrus fruit.

Thanks for reading.

- Jamie

Hippocrates to Helsinki: Medical Ethics

Karl Brandt

On 2nd June 1948 seven men were hanged for crimes committed in World War Two. Although all held some form of military rank, none had actually fired a gun in combat. Four of them were doctors. They, along with sixteen other defendants, had been on trial from 9th December 1946 to 20th August 1947 to answer for the horrors of Nazi human experimentation carried out under the guise of Medicine. A common defence offered for their crimes was that there had been no agreed standard saying what they had done was wrong. After almost 140 days of proceedings, including the testimony of 85 witnesses and the submission of almost 1,500 documents, the judges disagreed.

The Nuremberg trials are rightfully held up as a landmark of medical ethics. Yet they are only one point on a timeline that stretches back to the very beginnings of Medicine. This musing is a brief journey through that timeline.

The Hippocratic Oath

Hippocrates

We start with Hippocrates (c. 460 BCE to c. 370 BCE), a Greek physician considered the Father of Medicine. The Hippocratic Oath is attributed to him, although it may have been written after his death. The oldest surviving copy dates to circa 275 CE. The original text provides an ethical code for the physician to base themselves on:

I swear by Apollo Physician, by Asclepius, by Hygieia, by Panacea, and by all the gods and goddesses, making them my witnesses, that I will carry out, according to my ability and judgment, this oath and this indenture. To hold my teacher in this art equal to my own parents; to make him partner in my livelihood; when he is in need of money to share mine with him; to consider his family as my own brothers, and to teach them this art, if they want to learn it, without fee or indenture; to impart precept, oral instruction, and all other instruction to my own sons, the sons of my teacher, and to indentured pupils who have taken the physician’s oath, but to nobody else. I will use treatment to help the sick according to my ability and judgment, but never with a view to injury and wrong-doing. Neither will I administer a poison to anybody when asked to do so, nor will I suggest such a course. Similarly I will not give to a woman a pessary to cause abortion. But I will keep pure and holy both my life and my art. I will not use the knife, not even, verily, on sufferers from stone, but I will give place to such as are craftsmen therein. Into whatsoever houses I enter, I will enter to help the sick, and I will abstain from all intentional wrong-doing and harm, especially from abusing the bodies of man or woman, bond or free. And whatsoever I shall see or hear in the course of my profession, as well as outside my profession in my intercourse with men, if it be what should not be published abroad, I will never divulge, holding such things to be holy secrets. Now if I carry out this oath, and break it not, may I gain for ever reputation among all men for my life and for my art; but if I break it and forswear myself, may the opposite befall me.

The oath has been used in updated iterations as a pledge for doctors to make on graduation.


The Formula Comitis Archiatrorum is the earliest known text on medical ethics from the Christian era. It was written by Magnus Aurelius Cassiodorus (c. 484-90 to c. 577-90 CE), a statesman and writer serving in the administration of Theodoric the Great, king of the Ostrogoths. It laid out a code of conduct for physicians to align their lives and medical practice with.

Ethics of the Physician was penned by Ishāq bin Ali al-Rohawi, a 9th century Arab physician, and is the first medical ethics book in Arab medicine. It contains the first documented description of the peer review process where a physician’s notes were reviewed by their peers.

Primum non nocere and the beginnings of ‘medical ethics’

The phrase 'primum non nocere' (first do no harm) is often attributed to Hippocrates and the Hippocratic Oath. The exact author is unknown, however. One study looked at medical writings back to the Middle Ages and found that the first mention was in 1860, where it was attributed to the physician Thomas Sydenham.

Thomas Percival

In 1794, Thomas Percival (1740-1804) created one of the first modern codes of medical ethics in a pamphlet which was expanded in 1803 into a book: Medical Ethics; or, a Code of Institutes and Precepts, Adapted to the Professional Conduct of Physicians and Surgeons. This was the first use of the phrase ‘medical ethics’. The book largely influenced the American Medical Association code, which was adopted in 1847. An Introduction to the Study of Experimental Medicine was written by French physiologist Claude Bernard (1813-1878) in 1865. Bernard’s aim in the Introduction was to demonstrate that medicine, in order to progress, must be founded on experimental physiology.

Nuremberg and Helsinki

After the failed defence of the defendants in the Doctors' Trial, the Nuremberg Code was drawn up in 1947. It is a ten-point code guiding research ethics:

  1. The voluntary consent of the human subject is absolutely essential.

  2. The experiment should be such as to yield fruitful results for the good of society, unprocurable by other methods or means of study, and not random and unnecessary in nature.

  3. The experiment should be so designed and based on the results of animal experimentation and a knowledge of the natural history of the disease or other problem under study that the anticipated results will justify the performance of the experiment.

  4. The experiment should be so conducted as to avoid all unnecessary physical and mental suffering and injury.

  5. No experiment should be conducted where there is an a priori reason to believe that death or disabling injury will occur; except, perhaps, in those experiments where the experimental physicians also serve as subjects.

  6. The degree of risk to be taken should never exceed that determined by the humanitarian importance of the problem to be solved by the experiment.

  7. Proper preparations should be made and adequate facilities provided to protect the experimental subject against even remote possibilities of injury, disability, or death.

  8. The experiment should be conducted only by scientifically qualified persons. The highest degree of skill and care should be required through all stages of the experiment of those who conduct or engage in the experiment.

  9. During the course of the experiment the human subject should be at liberty to bring the experiment to an end if he has reached the physical or mental state where continuation of the experiment seems to him to be impossible.

  10. During the course of the experiment the scientist in charge must be prepared to terminate the experiment at any stage, if he has probable cause to believe, in the exercise of the good faith, superior skill and careful judgment required of him that a continuation of the experiment is likely to result in injury, disability, or death to the experimental subject.

Reading this code it's not hard to see the inspiration of the trials behind it, in particular the emphasis on empowering patients.  The Nuremberg Code remains a keystone of medical research.

Another keystone is the Helsinki Declaration, a set of ethical principles for research on human subjects developed by the World Medical Association in 1964. It is not a law in itself but has been used to form the basis of laws in the signatory countries. The original draft contained five basic principles, a section on clinical research and a section on non-therapeutic research. It has been revised several times since, most recently in 2013, when it ran to thirty-seven principles. 1979 saw the publication of the first edition of Principles of Biomedical Ethics by the philosophers Tom Beauchamp and James Childress. It is this book which gave us the four ethical principles often quoted by medical students:

  • Beneficence (promoting well being)

  • Non-maleficence (not doing harm - drawing back to primum non nocere)

  • Autonomy (the patient’s right to decide for themselves)

  • Justice (the community at large)

Scandals

Despite Nuremberg and Helsinki a number of scandals occurred throughout the 20th century, reflecting inequalities and prejudices in society. In 1951 a young Black American woman named Henrietta Lacks had a piece of her cancerous tumour removed without her knowledge. The cells from Henrietta's cervical tumour, known as HeLa cells, were the first human cell line to survive in vitro and have since been immortalised in the testing of new medical treatments, most notably the polio vaccine.

Henrietta Lacks

However, the case raised serious concerns surrounding the lack of informed consent and the taking of samples from living patients. In 1997, President Bill Clinton issued a formal apology for the Tuskegee Syphilis Study, which took place between 1932 and 1972 in the state of Alabama. This infamous government experiment allowed hundreds of African-American men to go untreated for their syphilis so that doctors could study the effects of the disease. This continued even after penicillin was shown to be an effective cure. It was one of a number of unethical studies performed in the 20th century, including the Guatemalan Syphilis Study and the Skid Row Cancer Study. When we talk about ethics we often think about Nazis and other deplorable regimes, but I think these studies remind us that even in the democratic West we are more than capable of committing horrible acts against vulnerable people.

Reproductive advances & rights and Alder Hey

The latter part of the 20th century saw a growth in women's rights, including reproductive rights, and in the abilities of reproductive medicine. The Abortion Act was passed in the UK in 1967 after widespread evidence that unsafe illegal abortion often resulted in maternal death and morbidity. The Act made abortion legal in Great Britain (but not Northern Ireland), originally up to a gestational limit of 28 weeks, lowered to 24 weeks in 1990. In 1973 the US Supreme Court ruled in favour of Jane Roe in the case of Roe v. Wade, deciding 7-2 that access to safe abortion is a fundamental right. The arguments contained within this court case are still prominent topics of debate to this day.

Louise Joy Brown was born in Oldham, UK, in 1978. She was the first baby to be born as a result of in vitro fertilisation (IVF). The Report of the Committee of Inquiry into Human Fertilisation and Embryology, commonly called the Warnock Report, was published in 1984 as a result of a UK governmental inquiry into the social impacts of infertility treatment and embryological research. The Human Fertilisation and Embryology Act 2008 updated and revised the Human Fertilisation and Embryology Act 1990 and implemented key provisions concerning IVF and human reproduction, including a ban on sex selection for non-medical purposes and the principle that all human embryos are subject to regulation.

The Human Tissue Act 2004 created the Human Tissue Authority to "regulate the removal, storage, use and disposal of human bodies, organs and tissue." It was passed in response to the Alder Hey organ scandal, which involved the removal and retention of children's organs without parental knowledge or consent.

The right to die

In February 1990, Terri Schiavo collapsed at home after suffering a cardiac arrest. The oxygen supply to her brain was cut off and, as a result, she entered a "persistent vegetative state" from which she would never wake. Over the following years Terri's husband would fight various legal battles against her parents, and eventually the state of Florida, to enable her feeding tube to be removed. Her case was one of several sparking an enduring national debate over end-of-life care and the right to die. Dignitas was founded in 1998 by Ludwig Minelli, a Swiss lawyer specialising in human rights law. Since its foundation Dignitas has helped over 2,100 people with severe physical illnesses, as well as the terminally ill, to end their own lives. In the UK the Assisted Dying Bill was blocked by the House of Lords in 2007. The bill would have allowed only those of sound mind and with less than six months to live to seek an assisted death. Both euthanasia and assisted suicide remain illegal in the UK.

In 2016 Charlie Gard was born in the UK with an exceedingly rare and fatal inherited disease - infantile onset encephalomyopathic mitochondrial DNA depletion syndrome (MDDS). The ensuing legal battle between Charlie's parents and his doctors over withdrawing life support provoked passionate debate throughout the world. It illustrated the power of social media to both facilitate and hamper debate around autonomy, end-of-life care and parental rights, with ugly scenes and threats of violence against the staff looking after Charlie. It also showed how complicated, nuanced cases can be hijacked by individuals and groups, such as parts of the American right, to spread their own agendas.

How we can show we respect research ethics

For anyone working in clinical research, Good Clinical Practice (GCP) is the international ethical, scientific and practical standard. Everyone involved in research must be trained or appropriately experienced to perform the specific tasks they are being asked to undertake. Compliance with GCP provides public assurance that the rights, safety and wellbeing of research participants are protected and that research data are reliable. More information can be found on the National Institute for Health Research website, which offers both introductory and refresher courses.

Looking at the timeline of medical ethics it's tempting to think that we've never been more advanced or ethical and that the whole movement is an evolution towards enlightenment. To a certain extent that's true; of course things are better than they were. But we can't be complacent. Ethics often lag behind medical advances. As we become better at saving the lives of premature babies, as our population ages with more complex diseases and as our resuscitation and intensive care improve, there will undoubtedly continue to be more debates. As the case of Charlie Gard showed, these can be very adversarial. Social media and fake news will no doubt continue to play a huge part in any scientific discussion, be it in medicine or climate change. All the more reason to stick to our principles and always aim to do no harm.

Thanks for reading

- Jamie

Not to be sneezed at: How we found the cause of hay fever


The recent good weather in the UK has seen barbecues dusted off and people taking to the garden. Cue sneezing and runny eyes and noses. Yes, with the nice weather comes hay fever. Hay fever, or allergic rhinitis, affects somewhere between 26% and 30% of people in the UK. Symptoms include sneezing, swelling of the conjunctivae and eyelids, a runny nose (rhinorrhoea) and a blocked nose. Sometimes it can even result in hospital admission and death.

We all know that pollen is the cause of hay fever. Pollen in the air is inhaled and trapped by hairs in the lining of the nostrils. There the body responds to proteins on the surface of the pollen. These proteins are called allergens, and different types of pollen carry different allergens. A type of white blood cell called a B cell produces an antibody called immunoglobulin E (IgE) specific to a particular allergen. The IgE then binds to cells called mast cells. These are found in some of the most sensitive parts of the body, including the skin, blood vessels and respiratory system. Mast cells contain 500 to 1,500 granules holding a mix of chemicals, including histamine. When allergen binds to the IgE sitting on the mast cells, the mast cells release their histamine. It is histamine which causes the symptoms of hay fever, by binding to histamine receptors throughout the body. Antihistamines work by binding to these receptors instead of histamine and blocking them.

But two centuries ago hay fever was a mystery. It took a couple of doctors with sneezes and blocked noses of their own to research the problem and link it to pollen. This musing is their story.

The first description of what we would call hay fever came in 1819 in a study presented to the Medical and Chirurgical Society called 'Case of a Periodical Affection of the Eyes and Chest'. The case was a patient called 'JB', a man "of a spare and rather delicate habit". The patient was 46 and had suffered from catarrh (blockage of the sinuses and a general feeling of heaviness and tiredness) every June since the age of eight. Numerous treatments including bleeding, cold baths, opium and vomiting were tried to no avail. What makes this study even more interesting is that 'JB' was the author, John Bostock, a Liverpool-born doctor who was not afraid to experiment on himself.

John Bostock

Bostock tried to broaden his research by looking for more sufferers. He found 28. In 1828 he published his work and called the condition "catarrhus aestivus" or "summer catarrh". After Bostock published, an idea spread amongst the public that the smell of hay was to blame. This led to the colloquial term "hay fever". Bostock didn't agree and felt that the heat of summer was to blame. He rented a clifftop house near Ramsgate, Kent, for three consecutive summers, which helped. In 1827 The Times reported that the Duke of Devonshire was "afflicted with what is vulgarly called the Hay-fever, which annually drives him from London to some sea-port". In 1837, a few days before King William IV died, the same paper reported that the king had "been subject to an attack of hay fever from which he has generally suffered for several weeks".


Charles Harrison Blackley

In 1859 another doctor, Charles Harrison Blackley, sniffed a bouquet of bluegrass and sneezed. He was convinced that pollen was to blame and methodically set out to prove it. He experimented on himself and seven other subjects. He first applied pollen to the nose and noted how it produced the symptoms of hay fever. He then covered microscope slides with glycerine and left them in the sunshine under a little roof for 24 hours before removing them and studying them under a microscope, allowing him to count the number of pollen granules in the air. He noted the prevalence of grass pollen in June, the time when symptoms were at their worst. To prove that wind could carry pollen great distances he then flew similar slides on kites at altitudes of 500 to 1,500 feet and found that the slides there caught an even greater number of granules than those at the lower level. In 1873 he published his work, Experimental Researches on the Causes and Nature of Catarrhus Aestivus.

Fast forward to 1906. An Austrian paediatrician, Clemens von Pirquet, notices that patients react more quickly and severely to a second dose of horse serum antitoxin, or to a repeat smallpox vaccination, than to the first. He correctly deduces that the body 'remembers' certain substances and produces antibodies against them. He calls this 'allergy'. In the 1950s mast cells are found to contain histamine. In 1967 IgE is identified. The mechanism of allergic rhinitis and other allergies is finally understood, and with this understanding come new lifesaving treatments such as the EpiPen.

For a lot of us hay fever is an annual nuisance. But as we reach for the antihistamines and tissues we should thank a couple of 19th century sufferers who happened to turn their symptoms into a life’s work and, as a result, make hay fever that bit easier for us.


Thanks for reading

- Jamie

Medical school medieval style


We discuss the first Medical schools (and review wine) in the first episode of the Quacks Who Quaff podcast.

Philosophia et septem artes liberales, the seven liberal arts. From the Hortus deliciarum of Herrad of Landsberg (12th century)

It's tempting to see medieval doctors as a group of quacks and inadequates stuck between the Dark Ages and the enlightened Renaissance. Certainly, it was a dangerous time to be alive and sick. In the twelfth century the majority of people lived in rural servitude and received no education. Average life expectancy was 30-35 years, with 1 in 5 children dying at birth. Healthcare policy, such as it was, was based on Christian teachings: that it was everyone's duty to care for the sick and poor. To that end medieval hospitals more resembled modern-day hospices, providing basic care for the destitute and dying with nowhere else to go. Education and literacy were largely the preserve of the clergy and it was in monasteries where most hospitals could be found. The Saxons built the first hospital in England in 937 CE, and many more followed after the Norman Conquest in 1066, including St. Bartholomew's of London, built in 1123 CE. The sick were cared for by a mix of practitioners including physicians, surgeons, barber-surgeons and apothecaries. Of these only physicians would have received formal training. The vast majority of people providing healthcare were practising a mix of folklore and superstition.

However, it was in the early medieval period that the first medical schools were formed and the first ever medical students went to university. In this musing I’m looking at what medical education was like in the Middle Ages at the most prestigious university of the age as well as the common theories behind disease and cure.

The Schola Medica Salernitana was founded in the 9th century in the southern Italian city of Salerno. In 1050 one of its teachers, Gariopontus, wrote the Passionarius, one of the earliest written records of Western Medicine as we would recognise it. Gariopontus drew on the teachings of Galen (c. 129-199 CE) and latinised Greek terms, in doing so forming the basis of several modern medical terms such as cauterise. Another early writing mentioned a student: “ut ferrum magnes, juvenes sic attrahit Agnes” (“Agnes attracts the boys like iron to a magnet”). This shows that the first medical school in the world had female students.

The medical school published a number of treatises, such as work by a woman called Trotula on childbirth and uterine prolapse and work on the management of cranial and abdominal wounds. In head wounds it was recommended to feel for and then surgically remove pieces of damaged skull. In abdominal trauma students were advised to try to put any protruding intestine back inside the abdomen. If the intestine was cold it was first to be warmed by wrapping the intestines of a freshly killed animal around it; the wound was then left open before a drain was inserted.

Anatomy remained based on the work of Galen. Doctors were encouraged to dissect pigs as their anatomy was felt to be the closest to that of humans. However, the teachers were more innovative when it came to disseminating knowledge, teaching in verse, often with a spice of humour. These verses were first printed in 1480, 362 of them in all, a number that would grow to 3,520 in a later edition. By 1224 the Holy Roman Emperor Frederick II had made it obligatory that anyone hoping to practise Medicine in the Kingdom of Naples should seek approval from the masters of Salerno medical school.

But Salerno medical school did not teach any other subjects and so did not evolve into a studium generale or university as they began to spring up. By the fourteenth century the most prestigious medical school in Europe was the University of Bologna, founded in 1088, the oldest university in the world. In the United Kingdom medical training began at the University of Oxford in the 12th century but was haphazard and based on apprenticeship. The first formal UK medical school would not be established until 1726 in Edinburgh.

The University of Bologna was run along democratic lines, with students choosing their own professors and electing a rector who had precedence over everyone, including cardinals, at official functions.

The Medicine course lasted four years and consisted of forty-six lectures. Each lecture focused on one particular medical text as written by Hippocrates (c. 460-370 BCE), Galen, or Avicenna (c. 980-1037 CE). Students would also read texts by these authors and analyse them using the methods of the French philosopher Peter Abelard to draw conclusions. His work Sic et Non had actually been written as a guide for debating contrasting religious texts, not scientific works. This reflected how religion and philosophy dominated the training of medical students. The university was attached to a cathedral and students were required to be admitted to the clergy prior to starting their studies. As well as studying Medicine, students were required to study the seven classical liberal arts: Grammar, Rhetoric, Logic, Geometry, Arithmetic, Music and Astronomy.

At the time knowledge of physiology and disease focused on the four humors: phlegm, blood, black bile, and yellow bile. Imbalance of one was what caused disease; for example, too much phlegm caused lung disease and the body had to cough it up. This was a theory largely unchanged since antiquity, and it is why bloodletting and purging were so often the basis of medieval medicine. The state of imbalance was called dyskrasia while the perfect state of equilibrium was called eukrasia. Disease was also linked to extremes of temperature and personality. For example, patients who were prone to anger or passion were at risk of overheating and becoming unwell. Patients would also be at risk if they went to hot or cold places, and so doctors were taught to advise maintaining a moderate personality and avoiding extreme temperatures.

Diet was taught as important in preventing disease. During bloodletting doctors were taught to strengthen the patient’s heart through a diet of rose syrup, bugloss or borage juice, the bone of a stag’s heart, or sugar mixed with precious stones such as emerald. Other foods such as lettuce and wine were taught as measures to help balance the humors.

Pharmacy was similarly guided by ancient principles. The Doctrine of Signatures dated back to the days of Galen and was adopted by Christian philosophers and medics. The idea was that, in causing disease, God would also provide the cure and make that intended cure apparent through design in nature. For example, eyebright flowers were said to resemble the human eye, while skullcap seeds were said to resemble the human skull. This was interpreted as God’s design that eyebright was to be used as a cure for eye disease and skullcap seeds for headaches.

God, the planets and polluted air or miasma were all blamed as the causes of disease. When the Black Death struck Italy in 1347 a contemporary account by the scholar Giovanni Villani blamed “the conjunction of Saturn and Jupiter and Mars in the sign of Aquarius”, while the official Gabriel de Mussis noted “the illness was more dangerous during an eclipse, because then its effect was enhanced”. Gentile da Foligno, a physician and professor at the University of Bologna, blamed a tremor felt before the plague struck, which he believed had opened up underground pools of stagnant air and water. Doctors were therefore taught to purify either the body, through poultices of mallow, nettles, mercury and other herbs, or the air, by breathing through a posy of flowers, herbs and spices. De Mussis mentioned that “doctors attending the sick were advised to stand near an open window, keep their nose in something aromatic, or hold a sponge soaked in vinegar in their mouth.” During the Black Death a vengeful God was often blamed; the flagellants were a group of religious zealots who would march and whip themselves as an act of penance to try to appease God.

There’s a sense here of being close but not quite: of understanding that balance is important for the body, that environmental factors can cause disease and that there was something unseen spreading disease. Close but not yet there. The Middle Ages isn’t known as a time of enlightenment; that would come with the Renaissance. But it was not a barren wasteland. It was a time of small yet important steps.

It was in the Middle Ages that laws against human dissection were relaxed and knowledge of human anatomy began to improve. An eminent surgeon of the time, Guy de Chauliac, would lobby for surgeons to require university training and so began to create equivalence with physicians. Physicians began to use more observations to help them diagnose disease, in particular urine, as seen in the Fasciculus Medicinae, published in 1491 and the pinnacle of medical knowledge at the time (this book also contained Wound Man, as discussed in a previous musing). The scholarly approach encouraged at medical school led to methodical documentation from several physicians; it is through these writings that we know so much about the Black Death and other medieval illnesses. An English physician, Gilbertus Anglicus (1180-1250), teaching at the Montpellier school of Medicine, would be one of the first to recognise that diseases such as leprosy and smallpox were contagious.

Perhaps most importantly, it was in this period that the first medical schools and universities were established. These particular small steps would establish the doctor’s role as a scholar and begin to legislate the standards required of a physician: a vital first step without which future advances would never have been possible.

Thanks for reading

- Jamie

The world’s first forensic scientist


Our setting is a rural Chinese village. A man is found stabbed and hacked to death. Local investigators perform a series of experiments with an animal carcass, looking at the type of wounds caused by differently shaped blades, and determine that the man had been killed with a sickle. The magistrate calls all of the owners of a sickle together. The ten or so suspects all deny murder. The sickles are examined; all are clean, with nothing to give them away as a murder weapon. Most in the village believe the crime won’t be solved. The magistrate then orders all of the suspects to stand in a field and place their sickle on the ground before stepping back. They all stand and wait in the hot afternoon sun. It’s an unusual sight. At first nothing happens. Eventually a metallic green fly lands on one of the sickles. It’s joined by another. And another. And another. The sickle’s owner starts to look very nervous as more and more flies land on his sickle and ignore everyone else’s. The magistrate smiles. He knows that the murderer would have cleaned his weapon, but that tiny fragments of blood, bone and flesh, invisible to the human eye, are not beyond a fly’s sense of smell. The owner of the sickle breaks down and confesses. He’s arrested and taken away.

I think it’s safe to say that we love forensic science dramas. They’re all of a type: low-lit metallic labs, ultraviolet lights, an array of brilliant yet troubled scientists and detectives dredging the depths of human depravity. Forensic science is the cornerstone of our criminal justice system, a vital weapon in fighting crime. Yet the tale of the flies and the sickle didn’t take place in 2019. It didn’t even take place this century. It was 1235 CE.

This account, the first recorded example of what we would now call forensic entomology, appears in Collected Cases of Injustice Rectified, a Chinese book written in 1247 by Song Ci, the world’s first forensic scientist. This is his story.

Song Ci from a Chinese Stamp (From China Post)

Song Ci was born in 1186 in southeast China, during a period of China’s history called the Song dynasty (960-1279 CE). This period saw a number of political and administrative reforms, including developments in the justice system that created the post of sheriff. Sheriffs were employed to investigate crime, determine the cause of death, and interrogate and prosecute suspects. With this came a framework for investigating crime.

The son of a bureaucrat, he was educated into a life of scholarship. First training as a physician, he found his way into the world of justice and was appointed judge of prisons four times during his lifetime.

Bust of Shen Kuo (From Lapham’s Quarterly)

This was a time of polymaths. Song Ci was inspired by the work of Shen Kuo (1031-1095), a man who excelled in many different areas of philosophy, science and mathematics. Shen Kuo argued for autopsy and dissected the bodies of criminals, in the process proving centuries-old theories about human anatomy wrong. In the UK such a practice would not be supported in legislation for another seven centuries.

Song Ci built on Shen Kuo’s work, observing the practice of magistrates and compiling recommendations based on good practice. This would form his book Collected Cases of Injustice Rectified: in all, fifty-three chapters across five volumes. The first volume contained an imperial decree on the inspection of bodies and injuries. The second volume was designed as instruction in post-mortem examination. The remaining volumes helped identify cause of death and the treatment of certain injuries.

Of note, the book outlines the responsibilities of the official as well as what would now be considered routine practices, such as the importance of accurate notes and the need to be present during the post-mortem (including not being put off by bad smells). There are procedures for medical examination and specific advice on questioning suspects and interviewing family members.

Forensically, the richest part of the text is the section titled "Difficult Cases". This explains how an official could piece together evidence when the cause of death appears to be something it is not, such as strangulation masked as suicidal hanging or intentional drowning made to look accidental. A pharmacopoeia is also provided to help reveal injuries that are not readily visible. There is a detailed description of determining time of death by the rate of decomposition and of whether the corpse has been moved.

Whilst forensic science has obviously progressed since the work of Song Ci, what is striking is how the foundations of good forensic work have not changed. He wrote about identifying self-inflicted wounds and suicide from the direction of wounds or the disposition of the body. He recommended noting tiny details, such as looking underneath fingernails or in various orifices for clues of foul play. Standard procedure today.

Song Ci died in 1249 with little fanfare. In modern times, however, there has been an increased appreciation of his work. Just think how few 13th century scientific publications remain as relevant as his after nearly eight centuries.

There is an Asian maxim that “China is the ocean that salts all the rivers that flow into it”. All of us try to contribute in some way to the river of life. Any practitioner or appreciator of forensics must recognise the tremendous contribution Song Ci and his contemporaries made to progress the flow of justice.

Thanks for reading

- Jamie

Those who cannot remember the past: how we forgot the first great plague and how we're failing to remember lessons with Ebola


“Those who cannot remember the past are condemned to repeat it”

George Santayana

To look at the History of Medicine is to realise how often diseases recur and, sadly, how humans repeat the same mistakes. It is easy to look back with the benefit of hindsight and with modern medical knowledge but we must remember how we remain as fallible as our forebears.

Our first introduction to the History of Medicine is often through learning about the Black Death at school. The story is very familiar: between 1347 and 1351 plague swept the whole of Europe, killing between a third and two-thirds of the continent’s population. At the time it was felt the end of the world was coming as a disease never before seen took up to 200 million lives. However, this was actually the second time plague had hit Europe. Nearly a thousand years earlier the first plague pandemic had devastated parts of Europe and the Mediterranean, affecting half the population of Europe. This was Justinian’s plague, named for the Eastern Roman Emperor whose reign the disease helped to define. Yet despite the carnage Europe forgot and had no preparation when plague returned.

Between 2014 and 2016 nearly 30,000 people were hit by the Ebola outbreak in West Africa. Our systems were found lacking as the disease struck in a way never before seen. We said we would learn from our mistakes. Never again.

Yet the current Ebola epidemic in the Democratic Republic of the Congo (DRC) is proving that even today it is hard to remember the lessons of the past and how disease will find any hole in our memory. This is the story of Justinian’s plague, the lessons we failed to learn then and now as we struggle with Ebola.

Justinian’s Plague

Justinian I. Contemporary portrait in the Basilica of San Vitale, Ravenna . From Wikipedia.

It’s 542 CE in Constantinople (now Istanbul). A century earlier the Western provinces of the Roman Empire had collapsed; the Eastern empire continues in what will be called the Eastern Roman or Byzantine Empire. Constantinople is the capital city, then as now a melting pot between Europe and Asia. Since 527 CE this Empire has been ruled by Justinian I, an absolute monarch determined to return to the glory years of conquest.

The Empire has already expanded to cover swathes of Northern Africa. Justinian’s focus is now on reclaiming Italy. The Empire is confident and proud, a jewel in an otherwise divided Europe now in the Dark Ages.

The Eastern Roman Empire at the succession of Justinian I (purple) in 527 CE and the lands conquered by the end of his reign in 565 CE (yellow). From US Military Academy.

Procopius of Caesarea (Creative Commons)


The main contemporary chronicler of the plague of Justinian, Procopius of Caesarea (500-565 CE), identified the plague as arriving in Egypt on the Nile’s north and east shores. From there it spread to Alexandria in the north of Egypt and east to Palestine. The Nile was a major route of trade from the great lakes of Africa to the south. We now know that black rats on board trade ships brought the plague, initially from China and India, via Africa and the Nile to Justinian’s Empire.


Procopius noted that there had been a particularly long period of cold weather in Southern Italy causing famine and migration throughout the Empire. Perfect conditions to help a disease spread.

Procopius detailed the symptoms of this new plague: delusions, nightmares, fevers and swellings in the groin, armpits and behind the ears. For most came an agonising death. In his Secret History Procopius left no doubt that he saw this as God’s vengeance against Justinian, a man he claimed was supernatural and demonic.

Justinian’s war in Italy helped spread disease, but so did peace in the areas he’d conquered. The established trade routes in Northern Africa and Eastern Europe, with Constantinople at the centre, formed a network of contagion. Plague swept throughout the Mediterranean. Constantinople was in the plague’s grip for four months, during which time Procopius alleged 10,000 people died a day in the city; modern historians believe this figure to be closer to a still incredible 5,000 a day. Corpses littered the city streets. In scenes that prefigured the Black Death, mass plague pits were dug with bodies thrown in and piled on top of each other. Other victims were disposed of at sea. Justinian was struck down but survived. Others in Constantinople were not so lucky: in just four months up to 40% of its citizens died.

The plague’s legacy

The plague continued to weaken the Empire, making it harder to defend. Like the medieval kings after him, Justinian I struggled to maintain the status quo and tried to impose the same levels of taxation and expansion. He died in 565 CE. His obsession with empire building has led to his legacy as the ‘last Roman’. By the end of the sixth century much of the land in Italy that Justinian had conquered had been lost, but the Empire had pushed east into Persia. Far from Constantinople the plague continued in the countryside. It finally vanished in 750 CE, by which point up to 50 million people had died, a quarter of the Empire’s population.

Procopius’s description of the Justinian plague sounds a lot like bubonic plague. This suspicion has been confirmed by recent research.

Yersinia pestis bacteria, Creative Commons

At two separate grave sites in Bavaria, bacterial DNA was extracted from the remains of Justinian plague victims. The DNA matched that of Yersinia pestis, the bacterium which causes bubonic plague. When analysed, it was found to be most closely related to strains of Y. pestis still endemic to this day in Central Asia, supporting a route of infection via trade from Asia to Europe.

After 750 CE plague vanished from Europe. New conquerors came and went with the end of the Dark Ages and the rise of the Middle Ages. Europeans forgot about plague. In 1347 they would get a very nasty reminder.

It’s very easy now, in our halcyon era of medical advances, to feel somewhat smug. Yes, an interesting story, but it wouldn’t happen now. Medieval scholars didn’t have germ theory, or an easy way of accessing Procopius’s work. Things are different now.

We’d study Justinian’s plague with its high mortality. We’d identify the cause. We’d work backwards and spot how the trade link with Asia was the route of infection. We’d work to identify potential outbreaks in their early stages in Asia. By the time the second plague epidemic was just starting we’d notice it. There would be warnings to spot disease in travellers and protocols for dealing with mass casualties and disposal of bodies. We’d initiate rapid treatment and vaccination if possible. We’d be OK.

Ebola shows how hard this supposedly simple process remains.

Ebola: A Modern Plague

The Ebola virus (Creative Commons)

Ebola Virus Disease is a type of viral haemorrhagic fever first identified in 1976 during an outbreak in what is now South Sudan and the DRC. Caused by a spaghetti-like virus known as a filovirus, the disease causes severe dehydration through vomiting and diarrhoea before internal and external bleeding can develop. Named for the Ebola River where it was first identified, it spreads by direct human contact with a mortality rate varying from 25% to 90%. An epidemic has been ongoing in the DRC since August 2018. We are in our fourth decade of knowing about Ebola. And five years ago we were given the biggest warning yet about its danger.

Up until 2014 the largest outbreak of Ebola had affected 315 people. Other outbreaks were much smaller. Ebola seemed to burn brightly but with only a few embers. In 2014 it became a forest fire.

A healthcare worker during the Ebola outbreak of 2014-16 (From CNN.com)

The West Africa Epidemic of 2014-16 hit Guinea, Sierra Leone and Liberia. The disease showed its potential in our age of global travel as the first cases appeared in America and Europe. In all there were 28,160 reported cases. 11,308 people died. Ebola caught the world napping. What had been a rare disease of Africa was potentially a threat to us all. Suspicion of foreign healthcare workers and miscommunication about the causes of Ebola were blamed for helping to further the disease. Yet there was hope as promising experimental vaccines were put into production.

As the forest fire finally died down there was a chance to reflect. There were many publications, including from Médecins Sans Frontières, about the lessons learnt from Ebola and how not to repeat the mistakes of the past. These were all along similar themes: the importance of trained frontline staff, rapid identification of the disease, engaging and informing local communities, employing simple yet effective methods to prevent disease spread and the use of the new vaccine to protect contacts and contacts of contacts. There was much criticism of the speed of the World Health Organisation’s response, but also a feeling that with new tools and lessons learnt things would be different next time.

When Ebola surfaced again last year in the DRC there was initial hope that lessons had been learnt. Over 100,000 people have been vaccinated, a new weapon in the fight. However, the disease continues a year on, with over 1,000 cases, over 800 fatalities and fresh concern that this outbreak is far from over.

There remain delays in identifying patients with Ebola; not surprising, as the early symptoms mimic more common diseases such as malaria. As a result patients are not isolated quickly enough and may infect others before their test results are back. The talk of engaging communities is also falling flat. In a region torn apart by decades of civil unrest there is widespread mistrust of authorities, with blame falling on the Ebola units themselves for causing death. It is estimated that 30% of patients are staying at home, acting as potent vectors for disease rather than coming forward. There has also been violence against healthcare workers and hospitals as a result of this fear. Reassuringly, where local communities and healthcare workers have come together Ebola has been stopped, but this is not the norm, and behavioural scientists are being used to help connect with locals. Despite the lessons learnt Ebola is continuing to be a difficult adversary.

It is easy in the West to feel we are immune from misinformation and fear. Yet look at the current measles epidemic in New York State. Look at the anti-vaccination movement, labelled a “public health timebomb” last week by Simon Stevens, the chief executive of NHS England. We are no more immune than anyone else to irrationality. Nor should we be too proud to learn the lessons of the past; the ‘ring’ style of vaccinating contacts against Ebola is the same as that used during the successful campaign to eradicate smallpox over four decades ago.

Medical advances have come on in ways no-one in the Middle Ages could have foreseen. We have never had more ways to share our knowledge of disease or so many ways to prevent suffering. Yet people remain the same. And that’s the tricky part. Let’s not forget that bit.

Thanks for reading

- Jamie

There is nothing new under the sun: the current New York measles epidemic and lessons from the first 'anti-vaxxers'

An 1807 cartoon showing ‘The Vaccination Monster’

What has been will be again,
    what has been done will be done again;
    there is nothing new under the sun.

Ecclesiastes 1:9

The State of New York is currently in the midst of an epidemic. Measles, once eliminated from the USA, has returned with a vengeance. Thanks to a rise in unvaccinated children, fuelled by the ‘anti-vaxxer’ movement, 156 children in Rockland County have been infected by measles; 82.8% of these had never had even one MMR vaccine. With measles now rampant in the boroughs of Brooklyn and Queens, the state government has taken an unusual step. In New York, in the USA, the home of liberty and personal choice, no unvaccinated under-18 year old is now able to set foot in a public space. Parents of unvaccinated children who break this ban will face fines or jail.

In a previous blog I wrote about the fight against smallpox first using variolation (which sometimes caused infection) and then the invention of the world’s first vaccine. This musing is about how vaccination was made compulsory in the United Kingdom, the subsequent fight against it through a public campaign and how that movement raised its head again in the last few decades. This is the story of the first ‘anti-vaxxer’ movement and how the arguments regarding vaccination show there isn’t really anything new under the sun.

Early opposition to vaccination

Following Edward Jenner’s work into using cowpox to offer immunity against smallpox in 1796 the Royal Jennerian Society was established in 1803 to continue his research.

Even in these early days there was opposition to the vaccine. John Birch, the ‘surgeon extraordinary’ to the Prince of Wales pamphleteered against Jenner’s work with arguments one might expect to see circulating today on social media:

A section of John Birch’s pamphlet from https://vaxopedia.org/2017/10/07/who-was-john-birch/

He of course did not mention that he was making a lot of money through inoculating patients against smallpox (a practice that vaccination would replace) or through offering novel treatments such as medical electricity.

Wood painting caricature from 1808 showing Edward Jenner confronting opponents to his vaccine (note the dead at their feet) (Creative Commons)

Despite Birch’s efforts, by 1840 the efficacy of Jenner’s vaccine was widely accepted. Decades before germ theory was established and viruses were identified, we finally had a powerful weapon against a deadly disease. Between 1837 and 1840 a smallpox epidemic killed 6,400 people in London alone, and Parliament was persuaded to legislate. The 1840 Vaccination Act made the unpredictable practice of variolation illegal and made provision for free, optional smallpox vaccination.

At the time healthcare in the UK was largely unchanged since Tudor times. Parish-based charity had been the core of support for the sick and poor until workhouses were made the centre of welfare provision in 1834. With the workhouse came a stigma that illness and poverty were avoidable and to be punished. Government was dominated by two parties, the Whigs and the Tories, both of whom were non-interventionist, and the universal healthcare provided by the NHS was over a century away. Against this laissez-faire backdrop of punitive welfare, the fact that free vaccination was provided is remarkable and, I think, reflects the giddy optimism at a future without ‘the speckled monster’ of smallpox.

The Anti-Vaccination Leagues

The Vaccination Act of 1853 went further. Now vaccination against smallpox was compulsory for all children born after 1st August 1853 within the first three months of their life with fines for parents who failed to comply. By the 1860s two-thirds of babies in the UK had been vaccinated.

There was immediate opposition to the 1853 Act, with violent protests across the country. This was the state’s first step into the health of private citizens. The response seems to have been motivated in much the same way as modern-day opposition in the US to vaccination and universal healthcare in general: that health is a matter of private civil liberty and that vaccination caused undue distress and risk. In England and Wales in particular, although the penalties were rarely enforced, their presence alone seems to have been motivation for opposition. The Anti-Vaccination League was established in London in 1853 to allow dissenting voices to coalesce.

The Vaccination Act of 1867 extended the age by which a child had to be vaccinated to 14, with cumulative fines for non-compliance. That same year saw the formation of the Anti-Compulsory Vaccination League, which published the National Anti-Compulsory Vaccination Reporter newsletter listing their concerns, the first three being:

I. It is the bounden duty of parliament to protect all the rights of man.

II. By the vaccination acts, which trample upon the right of parents to protect their children from disease, parliament has reversed its function.

III. As parliament, instead of guarding the liberty of the subject, has invaded this liberty by rendering good health a crime, punishable by fine or imprisonment, inflicted on dutiful parents, parliament is deserving of public condemnation.

Further newsletters followed over the subsequent decades: the Anti-Vaccinator (founded 1869), the National Anti-Compulsory Vaccination Reporter (1874), and the Vaccination Inquirer (1879). All of these maintained political pressure against compulsory vaccination. Much like today, the main body of arguments focused on personal choice and the testimony of parents alleging that their child was injured or killed by vaccination. In Leicester in 1885 an anti-vaccination demonstration attracted 100,000 people, a staggering number when the city’s total population at the time was around 190,000.

A royal commission was called to advise on further vaccination policy. After seven years of deliberation, listening to evidence from across the spectrum of opinion, it published its findings in 1896: smallpox vaccination was safe and effective. However, it advised against continuing compulsory vaccination. Following the 1898 Vaccination Act, parents who did not want their child to be vaccinated could ‘conscientiously object’ and be exempt. There was no further appetite for Parliament to intervene in the rights of parents. Even the fledgling socialist Labour Party, no enemy of government intervention, made non-compulsory vaccination one of its policies.

Whilst the two World Wars saw a change in public opinion towards a greater role in society for government, culminating in the creation of the National Health Service in 1948, vaccination remains voluntary in the United Kingdom. The 20th century saw the advent of vaccines against several deadly diseases such as polio, measles, diphtheria and tetanus. In 1966 an ambitious worldwide vaccination programme was launched by the World Health Organisation, and in 1980 smallpox became the first disease to be eradicated by mankind. There were dreams of polio and measles going the same way. It was not to be.

Anti-vaccination re-emerges

Herd immunity is a key component of any effective vaccination programme. Not everyone can be vaccinated, and those who cannot rely on being surrounded by vaccinated people to prevent transmission. The level of vaccination in a population required for herd immunity varies between diseases; the accepted standard to prevent measles transmission is 90-95%.
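As a rough illustration, using the standard textbook formula rather than a figure from the sources above: the herd immunity threshold is approximately 1 − 1/R0, where R0 is the number of people a single infected person will go on to infect in a fully susceptible population. Measles is extraordinarily contagious, with an R0 commonly quoted as 12 to 18, which gives a threshold of 1 − 1/12 ≈ 92% up to 1 − 1/18 ≈ 94%; hence the 90-95% target.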

On 28th February 1998 an article was published in the Lancet which claimed that the Measles, Mumps and Rubella (MMR) vaccine was linked to the development of developmental and digestive problems in children. Its lead author was Dr Andrew Wakefield, a gastroenterologist.

The infamous Lancet paper linking the MMR vaccine to developmental and digestive disorders

The paper sparked national panic about the safety of vaccination. The Prime Minister, Tony Blair, refused to answer whether his newborn son Leo had been vaccinated.

Except, just like John Birch nearly two centuries before him, Andrew Wakefield had held a lot back from the public and his fellow authors. He was funded by a legal firm seeking to prosecute the companies who produce vaccines, and it was this firm which led him to the parents who formed the basis of his ‘research’. The link between the vaccine and developmental and digestive problems rested on the parents of twelve children recalling that their child first showed symptoms following the MMR vaccine. Their testimony and recall alone were enough for Wakefield to claim a link between vaccination and autism. In research terms his findings were no more robust than the testimonies the Victorian pamphlets relied upon. But the damage was done. The paper was retracted in 2010. Andrew Wakefield was struck off, as were some of his co-authors who had failed to exercise due diligence. Sadly, this has only helped Wakefield’s ‘legend’ as he tours America spreading his message, tapping into the general ‘anti-truth’ populist movement. Tragically, if unsurprisingly, measles often follows in his wake.

Earlier this year the largest study to date investigating the link between MMR and autism was published: 657,461 children in Denmark were followed up over several years (compare that to Wakefield’s research, where he interviewed the parents of 12 children). No link between the vaccine and autism was found. In fact, no large, high-quality study has ever backed up Wakefield’s claim.

There are financial and political forces at work here. Anti-vaccination is worth big money; the National Vaccine Information Center in the US had an annual income of $1.2 million in 2017. And the people it targets are economically and politically powerful. Recent research in America shows that parents who refuse vaccinations for their children are more likely to be white, educated and of higher income. They prize purity and liberty above all, emotional reasoning over logic. They vote. And their world view is prevalent in certain circles.

Tweet by Donald Trump, 28th March 2014

In the UK in 2018 the rate of MMR vaccination was 91.8%, worryingly close to the threshold below which herd immunity is lost. There have been debates in the UK about re-introducing compulsory vaccination. In France certain childhood vaccinations are now compulsory. Social media companies are under pressure to silence the groups anti-vaxxers use to spread their message and recruit. The state is once again prepared to step into personal liberty when it comes to vaccines.

In 1901, 52% of childhood deaths in England and Wales were due to infectious diseases; by 2000 it was 7.4%. In 1901, 40.6% of all deaths were of children; by 2000 it was 0.9%. No-one would want that progress to reverse. But history does have a habit of repeating itself if we let it. The debates continue to be the same: the rights of parents and the individual versus those of the state and public health necessity. This is a debate we have to get right. History tells us what will happen if we don’t. After all, there is nothing new under the sun.

Thanks for reading

- Jamie