Today we chat to SCI member Luca Steel about her life as a plant pathology PhD student in 2020.
Zymoseptoria tritici is a fungal pathogen of wheat that can cause yield losses of up to 50%. We're investigating an effector protein secreted by Z. tritici which acts as a 'mask', hiding the pathogen from host immune receptors and avoiding an immune response.
What does a day in the life of a plant pathology PhD Student look like?
My days are very varied – from sowing wheat seeds to swabbing pathogenic spores onto their leaves, imaging symptoms, discussing results with my supervisor and lab team, and of course lots of reading. It doesn’t always go to plan - I recently attempted to make some wheat leaf broth, which involved lots of messy blending and ended up turning into a swampy mess in the autoclave!
Wheat in the incubator!
How did your education prepare you for this experience?
The most valuable preparation was my placement year at GSK and my final year project at university. Being in the lab and having my own project to work on made me confident that I wanted to do a PhD – even if it was a totally different research area (I studied epigenetics/immunoinflammation at GSK!).
What are some of the highlights so far?
My highlight was probably attending the European Conference on Fungal Genetics in Rome earlier this year. It was great to hear about so much exciting work going on – and it was an added bonus that we got to explore Rome. I’ve also loved getting to know my colleagues and being able to do science every day.
What is one of the biggest challenges faced in a PhD?
My biggest challenge so far has probably been working from home during lockdown. Although I am very privileged to have a distraction-free space and good internet connection, it was difficult to adjust to working from my kitchen! It was sad abandoning unfinished experiments, and I missed being in the lab – so I’m glad to be back now.
What advice would you give to someone considering a PhD?
If you’re sure you want to do one, then absolutely go for it and don’t be afraid to sell yourself! If not, I’d recommend spending some time working in a lab before you apply and chatting to any prospective labs. If you don’t get a reply from the PI, existing students/post-docs in the group are often very happy to talk and give honest opinions.
How have things been different for you because of the global pandemic?
I was lucky that the pandemic came early on in my PhD, so I had a lot of flexibility to change what I was working on. I switched from lab work involving lots of bioimaging, towards a more bioinformatic approach. My poor laptop will be glad when I’m back to using my computer at work!
2019 has been declared by UNESCO as the International Year of the Periodic Table. To celebrate, we are releasing a series of blogs about our favourite elements and their importance to the chemical industry.
Discovery of this noble gas:
In 1894, argon was discovered by the chemist Sir William Ramsay and the physicist Lord Rayleigh. Ramsay believed that a heavy impurity in 'atmospheric' nitrogen could be responsible for giving nitrogen isolated from air a higher density. Both scientists worked to track down this unrecognised new element hiding in the air, and in 1904 each won a Nobel Prize (Ramsay in chemistry, Rayleigh in physics), primarily for their roles in the discovery of argon.
Argon makes up about 1% of the Earth's atmosphere and is the most plentiful of the rare gases. It can be used in both its gaseous and liquid states; as a liquid, argon can be stored and transported more easily, affording a cost-effective way to deliver supply.
Argon as a narcotic agent
One of the best-known biological effects of argon gas is its narcotic capability. Divers normally develop narcotic symptoms under high pressure when breathing normal respiratory air; these symptoms include slowed mental cognition and psychological instability. Argon exerts its narcotic effect physically rather than chemically: as an inert gas, it does not undergo chemical reactions in the body.
Argon in printing

Argon also provides several benefits during the heating and cooling of printing materials. The gas reduces oxidation of the metal, preventing unwanted reactions and keeping out impurities, and maintains a constant pressure, creating a stable printing environment.
Future of argon
Argon has received considerable attention for its potential clinical utility. Although the benefits are still at the experimental stage, argon could be an ideal neuroprotective agent: studies have shown that it could improve cell survival, brain structural integrity and neurological recovery. These protective effects remain effective even when argon is delivered up to 72 hours after brain injury.
Springtime colour is one of gardening's greatest joys. Colourful bursts dispel the long darkness of winter with its depressing wetness and cold. Social research clearly shows the physical and mental benefits of the emergence of bright garden colours in spring, linked with lengthening daylight. As with most gardening pleasures, this requires advance financial outlay and an understanding of the rhythms of plant growth.
Planting bulbs such as daffodils, tulips and hyacinths in autumn is the necessary investment. In return, plant breeders now provide a huge array of colours, shapes, sizes and seasonal sequencing with bulbous plants.
Geoff Dixon: February Gold daffodils
Bulbs are large pieces of vegetative tissue which come pre-loaded with immature leaves and flowers, safely wrapped inside a dry coating of protective scales. Essentially, bulbs are large flower buds which are stimulated into growth by planting in warm, moist soil or compost. These conditions trigger the emergence of roots from the base of each bulb. Because bulbs are nascent plants, they require careful handling and are safest once planted.
Many bulbous species originate from high-altitude mountain pastures and have naturally evolved to cope with fluctuating periods of heat, cold and drought. Once safely planted at depths equal to twice the length of each bulb, they will survive the freezing, thawing and fluctuating soil water content delivered by winter weather.
Geoff Dixon: Bulb structure showing the flower bud embedded in the bulb
Warming soils in spring encourage the growth and emergence of the leaves and flower buds contained within each bulb. Speed of emergence is governed by the interaction between the genetic complement of bulbs and their environment. Identifying and understanding the impact of this interaction formed the basis for Charles Darwin and Alfred Wallace's theory of natural selection. For springtime gardeners it is expressed in the multiplicity of bulbs on offer. Choosing a range of daffodil varieties, for example, provides colourful gardens from February through to late May.
Geoff Dixon: Technique for planting bulbs using hand trowel and some sand for drainage under the bulb
Conserving the joys of spring over the years can be achieved by naturalising bulbs, that is, planting them in grass swards. This works effectively for daffodils, provided the foliage is allowed 8 to 10 weeks of uninterrupted growth and senescence after flowering. During this period, photosynthesis produces the chemical energy needed for replacement growth, bulb multiplication and flower bud development for the following year. Tulips are much less easily naturalised in British gardens because their leaves mature and senesce much more quickly after flowering; less energy is produced, regrowth is reduced, and replacement flower buds are not formed.
For most gardeners the policy should be one of enjoying each springtime’s show and replacing bulbs with new ones every autumn for a relatively modest outlay.
Today's blog in our Year of the Periodic Table series focuses on zinc and its contribution towards a sustainable future.
Foods high in zinc: Evan Lorne
Zinc is a naturally occurring element, considered a 'life-saving commodity' by the United Nations. As well as playing a fundamental role in biological processes, it is highly recyclable: once a product reaches the end of its life cycle, the zinc can be recovered and returned as a new source of raw material. Around 45% of the zinc used in Europe and the United States is recovered and recycled in this way.
Circular and linear economy showing product life cycle: Petovarga
The circular economy is an economic model that focuses on waste reduction, ensuring that a product that has reached the end of its life cycle is not simply disposed of, but instead used as a new source of raw material. Zinc fits this model: its life cycle begins with mining, continues through refining to enable its use in society, and ends with recycling.
The production of zinc-coated steel: gyn9037
Zinc contributes to the planet in various ways:
1. Due to its recyclable nature, it lowers the demand for new raw material
2. As zinc provides a protective coating for steel, it extends the lifecycle of steel products
3. Coating steel reduces carbon dioxide emissions
As reported by the Swedish Environmental Protection Agency, zinc requires the least energy to produce on a per-unit-weight and per-unit-volume basis, with the exception of iron. Only a small amount of zinc is needed to conserve the energy embodied in steel, and during electrolytic zinc production only 7% of the energy is used for mining and mineral processing.
Green technology: Petrmalinak
According to a new report published by The World Bank, 'The Growing Role of Minerals and Metals for a Low-Carbon Future', a low-carbon future and a rise in the use of green energy technologies will lead to increased demand for a selected range of minerals and metals. These include aluminium, copper, lead, lithium, manganese, nickel, silver, steel, zinc and rare earth minerals; zinc will therefore be one of the main metals to meet this demand.
Aldrin, Armstrong and Collins, Apollo 11's brave astronauts, were the first humans with the privilege of viewing Earth from another celestial body. These men were uniquely placed to wonder: "what makes Earth special?" Certainly, within our Solar System, planet Earth is very special. Its environment has permitted the evolution of a panoply of life.
Green plants containing the pigment chlorophyll, whether in the oceans as algae or on land as a multitude of trees, shrubs and herbs, harvest energy from sunshine. Through a series of chemical reactions known as photosynthesis, light energy is captured and attached to compounds containing phosphorus.
Captured energy then drives a series of reactions in which atmospheric carbon dioxide and water are combined, forming simple sugars and releasing oxygen. These sugars are used further by plants in the manufacture of larger carbohydrates, amino acids and proteins, and oils and fats.
The release of oxygen during photosynthesis forms the basis of life's second vital process, respiration. Almost all plants and animals utilise oxygen in this energy-releasing process, during which sugars are broken down.
Released energy then drives all subsequent growth, development and reproduction. These body-building processes in plants are reliant on the transfer of the products of photosynthesis from a point of manufacture, the source, to the place of use, a sink.
Leaves and shoots are the principal sources of energy harvesting, while flowers and fruits are major sinks with high levels of respiration.
Figure 1: Photosynthesis vs respiration, drawn by James Hadley
Transfer between sources and sinks occurs through a central system of pipes, the vascular system, using water as the carrier. Land plants obtain water from the soils in which they grow; without water there would be no transfer, and no subsequent growth. Earth's environment is built around a 'water cycle' that supplies the land and oceans with rain or snow and recycles water back into the atmosphere in a sustainable manner.
Early in Earth's evolution, very primitive marine organisms initiated photosynthetic processes, capturing sunlight's energy. As a result, oxygen became a major component of our atmosphere. That encouraged the development of the vast array of land plants which utilise rainwater as the key component of their transport systems.
Subsequently, plants formed the diets of all animals either by direct consumption as herbivores or at second-hand as carnivores. As a result, evolution produced balanced ecosystems and humanity has inherited what those astronauts saw, “the Green Planet”.
Earth will only retain this status if humanity, individually and collectively, defeats our biggest challenge: climate change. Burning rainforests in South America and Africa, and fires in the Arctic tundra, will unbalance these ecosystems and quicken climate change.
The ability to control when and how vigorously plants flower is a major discovery of horticultural science. Its use has spawned vast industries worldwide supplying flowers and potted plants out of season. The control mechanism was uncovered by two American physiologists in the 1920s. Temperate plants inhabit zones where seasonal daylength varies between lengthening light periods in spring and shortening ones in autumn.
Those environmental changes result in plants which flower in long days and those which flower in short days; 'photoperiodism' was coined as the term describing these responses. Extensive subsequent research demonstrated that it is the period of darkness which is crucially important. Short-day plants flower when darkness exceeds a critical minimum, usually about 12 hours, which is typical of autumn. Long-day plants flower when the dark period is shorter than that critical minimum.
Irises are long day flowers. Image: Geoffery R Dixon
A third group of plants usually coming from tropical zones are day-neutral; flowering is unaffected by day-length. Long-day plants include clover, hollyhock, iris, lettuce, spinach and radish. Gardeners will be familiar with the way lettuce and radish “bolt” in early summer. Short-day plants include: chrysanthemum, goldenrod, poinsettia, soybean and many annual weed species. Day-neutral types include peas, runner and green beans, sweet corn (maize) and sunflower.
Immense research efforts identified a plant pigment, phytochrome, as the trigger molecule. It exists in two states, active and inactive, which are interconverted by red and far-red wavelengths of light.
Sunflowers are day neutral flowers. Image: Geoffery R Dixon
In short-day plants, for example, the active form suppresses flowering but decays into the inactive form with increasing periods of darkness. A brief flash of light, however, restores the active form and stops flowering. That knowledge underpins businesses supplying cut-flower chrysanthemums and potted plants, and poinsettias for Christmas markets. Identifying the precise demands of individual cultivars of these crops means that growers can schedule production volumes very precisely for peak markets.
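The dark-period rule described above can be sketched as a toy decision function. This is a deliberate simplification of the real phytochrome dynamics, with hypothetical names and an assumed 12-hour critical night length:

```python
def will_flower(plant_type, night_hours, night_interrupted=False, critical_hours=12):
    """Toy model of photoperiodism: flowering depends on the dark period.

    A brief flash of light (a 'night break') resets the effective dark
    period, which is how growers stop short-day plants flowering early.
    """
    effective_night = 0 if night_interrupted else night_hours
    if plant_type == "short-day":   # e.g. chrysanthemum, poinsettia
        return effective_night > critical_hours
    if plant_type == "long-day":    # e.g. iris, lettuce, radish
        return effective_night < critical_hours
    return True                     # day-neutral, e.g. sunflower, maize

print(will_flower("short-day", night_hours=13))                          # autumn night -> True
print(will_flower("short-day", night_hours=13, night_interrupted=True))  # night break -> False
```

A night break also explains why long-day plants can be brought into flower under short days: interrupting the night makes the effective dark period short.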
Providing the appropriate photoperiods requires very substantial capital investment. Consequently, there has been a century-long quest for the ‘Holy Grail of Flowering’, a molecule which when sprayed onto crops initiates the flowering process.
Chrysanthemums are short day flowers. Image: Geoffery R Dixon
In 2006, the hormone florigen was finally identified and characterised. Biochemists and molecular biologists are now working to find pathways by which it can be used effectively, providing more efficient flower production in a wider range of species.
Today's blog in our Year of the Periodic Table series focuses on silicon's positive effects on the body.
Silicon was not originally regarded as an important element for human health, as it was seen to have a larger presence in other animal and plant tissues. It was not until a 2002 paper in The American Journal of Clinical Nutrition that accumulating research was shown to support an important role for silicon in bone formation in humans.
Silicon was first thought to 'wash' through the body with no toxicological or biological effects. However, in the 1970s, animal studies provided evidence to suggest that silicon deficiency in the diet produced defects in connective and skeletal tissues. Ongoing research has added to these findings, demonstrating the link between dietary silicon and bone health.
Silicon plays an important role in protecting humans against many diseases. It is an important trace mineral, essential for strengthening joints, and is additionally thought to help heal and repair fractures.
The most important source of exposure to silicon is the diet. According to two epidemiological studies (Int J Endocrinol. 2013: 316783; J Nutr Health Aging. 2007 Mar-Apr; 11(2): 99–110), dietary silicon intake has been linked to higher bone mineral density.
Silicon is needed to repair tissue, as it is important for the synthesis of collagen, the most abundant protein in the body's connective tissue, which is needed for the strengthening of bones.
However, silicon is very common in the body and therefore it is difficult to prove how essential it is to this process when symptoms of deficiency vary among patients.
There has also been a plausible link between Alzheimer’s disease and human exposure to aluminium. Research has been underway to test whether silicon-rich mineral waters can be used to reduce the body burden of aluminium in individuals with Alzheimer’s disease.
However, longer term study is needed to prove the aluminium hypothesis of Alzheimer’s disease.
Today's blog in our Year of the Periodic Table series is about the importance of potassium in human health.
Potassium plays an essential role in health, being the third most abundant mineral in the body. The human body requires at least 1000mg of potassium a day to support key bodily processes.
Potassium regulates fluid balance in the body, controls the electrical activity of the heart and muscles, and helps activate nerve impulses throughout the nervous system.
According to an article from Medical News Today Knowledge Center, the possible health benefits to a regular diet intake of potassium include maintaining the balance of acids and bases in the body, supporting blood pressure, improving cardiovascular health, and helping with bone and muscle strength.
These powerful health benefits are linked to a potassium rich diet. Potassium is present in all fruits, vegetables, meat and fish.
Receptors on a cell membrane.
Can it go wrong?
The body maintains the potassium level in the blood. If the potassium level is too high in the body (hyperkalemia) or if it is too low (hypokalemia) then this can cause serious health consequences, including an abnormal heart rhythm or even a cardiac arrest.
Fortunately, cells in the body store a large reservoir of potassium which can be released to maintain a constant level of potassium in blood.
What is hyperkalemia? Video: Osmosis
Potassium deficiency leads to fatigue, weakness and constipation. Within muscle cells, potassium normally helps relay signals from the brain that stimulate contractions. If potassium levels dip too low, these signals cannot be relayed effectively from the brain to the muscles, resulting in more prolonged contractions, such as muscle cramping.
As potassium is an essential mineral carrying out wide-ranging roles in the body, low intakes can lead to an increase in illness. The FDA has made a health claim stating that 'diets containing foods that are a good source of potassium and that are low in sodium may reduce the risk of high blood pressure and stroke.'
This suggests that consuming more potassium might reduce the risk of high blood pressure and stroke. However, more research on dietary and supplemental potassium is required before firm conclusions can be drawn.
For British Science Week 2019, we are looking back at how Great Britain has shaped different scientific fields through its research and innovation. First, we are delving into genetics and molecular biology – from Darwin’s legacy, to the structure of DNA and now modern molecular techniques.
The theory of evolution by natural selection is one of the most famous scientific theories in biology to come from Britain. Before Charles Darwin famously published this theory, several classical philosophers had considered how some traits might arise and survive, including Aristotle, who pondered the shape of teeth.
These ideas were forgotten until the 18th century, when they were re-introduced by philosophers and scientists including Darwin’s own grandfather, Erasmus Darwin.
Darwin used birds, particularly pigeons and finches to demonstrate his theories. Image: Pixabay
In 1859, Darwin first set out his theory of evolution by natural selection to explain adaptation and speciation. He was inspired by observations made on the second voyage of HMS Beagle, along with the work of political economist Thomas Robert Malthus on population.
Darwin coined the term ‘natural selection’, thinking of it as like the artificial selection imposed by farmers and breeders. After publishing a series of papers with Alfred Russel Wallace, followed by On the Origin of Species, the concept of evolution was widely accepted.
Although many initially contested the idea of natural selection, Darwin was ahead of his time, and further evidence was yet to come in the form of genetics.
Gregor Mendel first uncovered the principles of genetics while working on inheritance in peas in the 19th century. It was the unravelling of the molecular processes involved in this inheritance, however, that allowed scientists to study inheritance and genetics in a high level of detail, ultimately advancing the field dramatically.
A major discovery in the history of genetics was the determination of the structure of deoxyribonucleic acid (DNA).
DNA was first isolated by a Swiss scientist, and its general structure (four bases, a sugar and a phosphate chain) was elucidated by researchers from the United States. It was a British team that managed to make the leap to the three-dimensional (3D) structure of DNA.
Using X-ray diffraction techniques, Rosalind Franklin, a British chemist, produced evidence that the bases of DNA were paired. This led to the first accurate model of DNA's molecular structure, built by James Watson and Francis Crick. The work was initially published in Nature in 1953 and would later win them a Nobel Prize.
The age of genetic wonder. Source: TED
Understanding the structure of DNA made further advances in the field possible, leading to a wide range of innovations, from CRISPR/Cas9 gene editing to targeted gene therapies. This British-born science has been taken up by British pharmaceutical companies: pharma giants GlaxoSmithKline (GSK) and AstraZeneca use it today in driving new innovations.
Microscopic membranous vesicles floating outside of cells were first discovered 50 years ago; 30 years later, a subset of these was named 'exosomes'. At the time, these membrane bubbles were believed to be nothing more than a cellular waste disposal mechanism. But within the past decade, extracellular vesicles, and exosomes in particular, have piqued scientists' interest, resulting in a research boom.
In 2006, there were just 115 publications referencing exosomes; by 2015, this number had mushroomed to 1010. Today, a PubMed search brings up more than 7500 publications. Consulting firm Grand View Research estimates that the global exosome market could reach $2.28bn by 2030.
Advancements in exosome research could lead to breakthroughs in prostate cancer treatment.
The interest in exosomes has been driven by the finding that they are more than just a waste disposal system: they are also a means of communication between cells, able to carry cargos such as proteins and mRNA, suggesting potential medical applications.
‘Currently, research into exosomes and other extracellular vesicles is very strong,’ says Jason Webber, Prostate Cancer UK research fellow in the Division of Cancer and Genetics at Cardiff University. ‘I think this field of research will continue to grow and I believe we’ll also see greater clinical application of exosomes and a drive towards research exploring the therapeutic potential of exosomes.’
Exosomes in Cancer Research. Video: Thermo Fisher Scientific
Exosomes are best described as extracellular vesicles – essentially membrane sacs – formed by the inward budding of the membrane of intracellular compartments known as multivesicular bodies (MVBs) or multivesicular endosomes (MVEs). They are released when MVBs fuse with the cell's plasma membrane, spilling their contents outside the cell. These vesicles, made of a phospholipid bilayer and ranging between 40nm and 150nm in diameter, are found in all biological fluids, including blood, urine, saliva, bile, semen and breast milk.
It’s well known that the oceans are becoming more acidic as they absorb increasing amounts of CO2 from the atmosphere. Now, German researchers say they have found the first evidence that this is happening in freshwaters, too, with potentially widespread effects on ecosystems.
‘Many current investigations describe tremendous effects of rising CO2 levels on marine ecosystems,’ says Linda Weiss at Ruhr-University Bochum: acidic oceans can have major impacts on marine food webs, nutrient cycles, overall productivity and biodiversity. ‘However, freshwater ecosystems have been largely overlooked,’ she adds.
Waters with high acidity have reduced biodiversity.
Weiss and colleagues looked at four freshwater reservoirs in Germany. Their analysis of data over 35 years – from 1981 to 2015 – confirmed a continuous increase in CO2, measured as the partial pressure or pCO2, and an associated decrease in pH of about 0.3, suggesting that freshwaters may acidify at a faster rate than the oceans.
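Because pH is a logarithmic scale, a drop of 0.3 units is more substantial than it sounds: it corresponds to roughly a doubling of the hydrogen-ion concentration. A quick sketch of the arithmetic:

```python
# pH = -log10([H+]), so a pH fall of 0.3 multiplies [H+] by 10**0.3.
ph_drop = 0.3
h_ion_ratio = 10 ** ph_drop
print(f"[H+] increases by a factor of {h_ion_ratio:.2f}")  # factor of ~2.00
```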
In lab studies, the team also investigated the effects of higher acidity on two species of freshwater crustaceans called Daphnia, or water fleas. Daphnia found in lakes, ponds and reservoirs are an important primary food source for many larger animals.
Daphnia are an essential part of the freshwater food chain. Image: Faculty of Natural Sciences at Norwegian University of Science and Technology/Flickr
When Daphnia sense that predators are around, they respond by producing ‘helmets’ and spikes that make them harder to eat. Weiss found that high levels of CO2 reduce Daphnia’s ability to detect predators. ‘This reduces the expression of morphological defences, rendering them more vulnerable,’ she says.
The team suggest that CO2 alters chemical communication between species, which could have knock-on effects throughout the whole ecosystem. Many fish learn to use chemical cues from injured species to detect predatory threats and move away from danger, for example.
Ocean acidification - the evil twin of climate change | Triona McGrath | TEDxFulbrightDublin. Video: TEDx Talks
Cory Suski, an ecologist at the University of Illinois at Urbana-Champaign, US, says he is not aware of many other data sets showing trends in CO2 abundance in freshwater over an extended time. Also, he notes: ‘A lot of the work to date in this area has revolved around behavioural or physiological responses to elevated CO2, so a morphological change is novel.’
But he points out that it is difficult to predict how this change could impact aquatic ecosystems, or whether this may be a global phenomenon, simply because of the complex nature of CO2 in freshwater. The amount of CO2 in freshwater is driven by a number of factors including geology, land use, water chemistry, precipitation patterns and aquatic respiration.
Vaccines are much debated these days, but before starting a discussion about them, let’s see how a vaccine is defined.
The World Health Organisation defines a vaccine as:
‘a biological preparation that improves immunity to a particular disease. A vaccine typically contains an agent that resembles a disease-causing microorganism, and is often made from weakened or killed forms of the microbe, its toxins or one of its surface proteins. The agent stimulates the body’s immune system to recognize the agent as foreign, destroy it, and “remember” it, so that the immune system can more easily recognize and destroy any of these microorganisms that it later encounters.’
We put into our bodies something that looks like, or contains a tiny part of, the 'microbe' that causes the disease, so that our body can produce the right agents to fight it in case we actually contract the real illness.
A vaccine is composed of an active ingredient and other added ingredients. As in any other drug, the active ingredient is the key component, the one that triggers the immune response. The added ingredients have different roles, such as improving the immune response or acting as a preservative, stabiliser or suspending fluid.
These added ingredients are the ones that are sometimes contested due to their toxicity. But when speaking about toxicity, there is a very important point to make. Everything is toxic.
It all comes down to the dose you eat, drink or otherwise put into your body. An important indicator of toxicity is the LD50 (lethal dose, 50%), the dose at which 50% of individuals die. Sodium chloride, also known as table salt, has an LD50 of 12,400mg/kg for humans (868g of salt for a 70kg individual). The lower the LD50, the more toxic the compound.
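The 868g figure above is simply the LD50 scaled by body mass. A minimal sketch of the calculation (the function name is illustrative, the figures are from the text):

```python
def total_lethal_dose_g(ld50_mg_per_kg, body_mass_kg):
    """Scale an LD50 (mg per kg of body mass) up to a total dose in grams."""
    return ld50_mg_per_kg * body_mass_kg / 1000  # mg -> g

# Table salt: LD50 of 12,400 mg/kg, for a 70 kg individual
print(total_lethal_dose_g(12_400, 70))  # -> 868.0
```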
Table salt can also be toxic.
Aluminium salts are used in many vaccines as adjuvants: they help by stimulating the immune response and by slowly releasing the active ingredient.
The most used salts are aluminium hydroxide, aluminium phosphate and potassium aluminium sulphate. Data about these compounds are freely accessible by searching for their material safety data sheets (MSDS) on the big chemical suppliers’ websites. The 11th section of an MSDS file is the toxicological information section, which contains the LD50 value, carcinogenicity information, and others.
Section 11 of the aluminium phosphate MSDS Sigma-Aldrich
None of the salts above is reported as carcinogenic, and the LD50 of aluminium phosphate is more than 5,000mg/kg for mice. The total quantity of aluminium in a vaccine is less than 1mg (0.001g), a very low quantity. In a normal European diet, the amount of aluminium we take in from food varies between 3 and 10mg a day.
Vaccine composition lists also include compounds and products used in the manufacturing process – even though at the end of manufacture they are present only in trace amounts, if at all.
One of the chemicals on this list that scares people is formaldehyde, which is indeed carcinogenic, with an LD50 of 42mg/kg for mice. Nevertheless, the quantity present in a vaccine dose is less than 0.1mg, while one 200g pear contains 12mg of formaldehyde. We should always remember that 'the dose makes the poison', as Compound Interest illustrates below.
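The same back-of-envelope arithmetic applies to the pear comparison (figures taken from the text above; variable names are illustrative and the ratio uses the 0.1mg upper bound):

```python
vaccine_formaldehyde_mg = 0.1  # upper bound per vaccine dose
pear_formaldehyde_mg = 12      # in one 200 g pear

ratio = pear_formaldehyde_mg / vaccine_formaldehyde_mg
print(f"One pear contains about {ratio:.0f}x the formaldehyde of a vaccine dose")  # ~120x
```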
The dose makes the poison – 'toxic' chemicals in food. Compound Interest
Vaccination is a personal decision. Nevertheless, it should be based on information from multiple verified sources. Easily accessible and clear information can be found on the Vaccine Knowledge Project website designed by the Vaccine Research Group from the University of Oxford.
Around 10 million medical devices are implanted into patients each year, and one-third of those patients suffer some complication as a result. Now, researchers in Switzerland have developed a way to protect implants by dressing them in a surgical membrane of cellulose hydrogel, making them more biocompatible with patients' own tissues and body fluids.
‘It is more than 60 years since the first medical implant was implanted in humans and no matter how hard we have tried to imitate nature, the body recognises the implant as foreign and tends to initiate a foreign body reaction, which tries to isolate and kill the implant,’ says Simone Bottan, who leads ETH Zurich spin-off company Hylomorph.
Hylomorph is a spin-off company of ETH Zurich, Switzerland. Image: ETH-Bibliothek@Wikimedia Commons
Up to one-fifth of all implanted patients require corrective intervention or implant replacement due to an immune response that wraps the implant in connective tissue (fibrosis), which is also linked with infections and can cause patients pain. Revision surgeries are costly and require lengthy recovery times.
The new membrane is made by growing bacteria in a bioreactor on micro-engineered silicone surfaces, pitted with a hexagonal arrangement of microwells. When imprinted onto the membrane, the microwells impede the formation of layers of fibroblasts and other cells involved in fibrosis.
25,000 people in the UK have a pacemaker fitted each year. Image: Science Photo Library
The researchers ‘tuned’ the bacteria, Acetobacter xylinum, to produce cellulose nanofibril membranes around 800 microns thick that surgeons can wrap snugly around implants. The cellulose membranes led to an 80% reduction in fibrotic tissue thickness in a pig model after six weeks, according to a study currently in press. Results after three and 12 months should be released in January 2018.
It is hoped the technology will receive its first product market authorisation by 2020. First-in-man trials will focus on pacemakers and defibrillators and will be followed by breast reconstruction implants. The strategy will be to coat the implant with a soft cellulose hydrogel, consisting of 98% water and 2% cellulose fibres.
The membrane will improve the biocompatibility of implants. Video: Wyss Zurich
‘Fibrosis of implantables is a major medical problem,’ notes biomolecular engineer Joshua Doloff at Massachusetts Institute of Technology, adding that many coating technologies are under development.
‘[The claim] that no revision surgery due to fibrosis will be needed is quite a strong claim to make,’ says Doloff, who would also like to see data on the coating’s robustness and longevity.
The silicone topography is designed using standard microfabrication techniques used in the electronics industry, assisted by IBM Research Labs.
Some could argue the greatest threat to life as we know it is the slow, invisible war being fought against antibiotic resistant bacteria. The accidental discovery of penicillin by Fleming in the late 1920s revolutionised modern medicine, beginning with its widespread use in the Second World War.
Over-prescription of these wonder drugs has given bacteria – which multiply exponentially – the opportunity to pick up on deadly cues in their environment at a phenomenal rate. They’re adapting their defence mechanisms so they’re less susceptible to attack. In theory, with an endless supply of different drugs, this would be no big deal.
Alexander Fleming, who discovered penicillin. Image: Wikimedia Commons
Unfortunately, the drug pipeline seems to have run dry, whilst the incidence of resistance continues to climb. For the gnarliest of infections, there’s a list of ‘drugs of last resort’, but resistance even to some of these has recently been observed. A report published by the World Health Organisation echoes these warnings – of the 51 new drugs in clinical development, almost 85% can be considered an ‘upgraded’ version of ones on the market right now. These drugs are a band-aid on a snowballing problem.
Are viruses the answer?
Bacteriophages, or phages for short, are viruses that infect only bacteria, wreaking havoc by hijacking cellular machinery for their growth and development.
A bacteriophage. Image: Vimeo
Phages follow one of two life cycles: virulent (lytic) and temperate (lysogenic). The first involves constant viral replication, killing bacteria by bursting them open (a process known as lysis). The second allows the phage in question to hitch a ride in the cell it infects, integrating its genetic material into the host’s and, in doing so, propagating without causing immediate destruction. It’s the former that is of value in phage therapy.
Long before Fleming’s discovery, phages were employed successfully to treat bacterial infections. In areas of Eastern Europe, phages have been in continuous clinical use since the early part of the 20th century.
Why did their use not take off like that of penicillin’s in the West? ‘Bad science’ that couldn’t be validated in the early days proved to be disheartening, and phages were pushed to the wayside. Renewed interest in the field has come about due to an improvement in our understanding of molecular genetics and cell biology.
Phages are highly specific and, unlike antibiotics, they don’t tamper with the colonies of bacteria that line our airways and make up a healthy gut microbiome. As they exploit an entirely different mode of action, phages can be used as a treatment against multiple drug-resistant bacteria.
Repeated dosing may not even be necessary – following initial treatment and replication of the phage within infected cells, cell lysis releases ever more phages. Once the infection is cleared, they’re excreted from the body with other waste products.
What is holding it back?
A number of key issues must be ironed out if phage therapy is to be adopted to fight infection as antibiotics have. High phage specificity means different phage concoctions might be needed to treat the same illness in two different people. Vast libraries must be created, updated and maintained. Internationally, who will be responsible for maintenance, and will there be implications for access?
Scientists are looking at new ways to tackle antibiotic resistance. Video: TEDx Talks
Although phage therapy is a promising avenue for (re)exploration, under-investment in the field has hindered progress. Bacteriophage products are hard to patent, which dampens pharmaceutical companies’ willingness to invest capital. AmpliPhi Biosciences, a San Diego-based biotech company that focuses on the ‘development and commercialization of novel bacteriophage-based antibacterial therapeutic,’ was granted a number of patents in 2016, showing it is possible. This holds some promise – viruses might not save us yet, but they could well be on their way.
The next five years will be the most promising yet in the fight against cancer, with immunotherapies – such as CAR-T, T-cell modulating approaches and innate immunity therapies – delivering far better patient outcomes.
In the last five years, the industry has rapidly advanced its understanding of the body’s immune response and genetic markers. As a result, combination therapies – in which chemotherapies will continue to play an important role – are forecast to become an increasingly standardised treatment, with pharma keen to invest.
These newer options are delivering transformative remission rates, and checkpoint inhibitors have already been seen to elicit long-term cures in patients, with success rates two to three times higher than standard chemotherapy approaches.
Over the next ten years, we will see significant breakthroughs as the industry’s understanding of the immune system improves. There are currently more than 130 biotechs – in addition to 20 big pharma companies – working on new therapies and it is believed the smaller companies are more aggressively bringing newer innovations to market. In the long run, pharma will undoubtedly absorb the most promising players in an effort to become leaders in combination therapy approaches, which many argue will deliver the best outcomes.
The current investor frenzy is comparable to that of the genomics industry at the turn of the century. Experts argue that a more complete understanding of the genome and promise of clinical data of these transformative modalities will create a golden age for cancer therapy over the next few years.
There are, however, a number of immediate challenges. For example, CAR-T, although demonstrating good efficacy in blood cancers, has yet to show enough efficacy in solid tumours. Another challenge is how close we can get to cures for all patients, particularly those with late-stage metastatic cancer.
Immunotherapies are moving cancer from treatment options that simply extend life or improve experience to more effective cures. The cost of newer therapies is also coming into focus; however, this is a positive pressure on companies to produce significant, not just incremental, outcomes for patients.
CRISPR/Cas9 is a gene editing tool that is swiftly becoming a revolutionary new technology. It allows researchers to edit the genome of a species by removing, adding or modifying parts of the DNA sequence.
To alter DNA using CRISPR, a pre-designed guide RNA (gRNA) is introduced into the cell. This RNA scaffold guides the enzyme Cas9 to the section of DNA that scientists want to alter, and Cas9 ‘snips’ the selected sequence.
At this point, the cell identifies the cut DNA as damage and tries to repair it. Researchers can exploit this repair machinery to introduce changes to the genes of the cell, leading to a change in a genetic trait, such as the colour of your eyes or the size of a plant’s leaf.
Cas9 unzips the selected DNA sequence, allowing it to bond with a new genetic code. Adapted from: McGovern Institute for Brain Research at MIT
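The targeting logic described above can be sketched in a few lines: Cas9 cuts only where the guide sequence matches the DNA and is immediately followed by a short ‘PAM’ motif (NGG for the commonly used Streptococcus pyogenes Cas9). The DNA string and guide below are invented for illustration (real guides are about 20 nucleotides, and this toy ignores the complementary strand entirely).

```python
# A toy illustration of CRISPR guide targeting - not a lab design tool.
# Cas9 binds where the guide matches the DNA and an NGG PAM follows.

def find_cut_sites(dna: str, guide: str) -> list[int]:
    """Return indices where `guide` matches `dna` followed by an NGG PAM."""
    sites = []
    for i in range(len(dna) - len(guide) - 2):
        if dna[i:i + len(guide)] == guide:
            pam = dna[i + len(guide):i + len(guide) + 3]
            if len(pam) == 3 and pam[1:] == "GG":  # 'N' matches any base
                sites.append(i)
    return sites

# Hypothetical sequence: guide match at position 2, followed by 'TGG' (a valid PAM)
dna = "CCGATTACAGATTACATGGAAA"
guide = "GATTACAGATTACA"  # shortened guide, for readability only
print(find_cut_sites(dna, guide))
```

This specificity requirement – a full guide match plus an adjacent PAM – is what lets researchers direct Cas9 to one chosen site in a genome of billions of bases.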
Public approval of genetic modification is at an all-time high, with a recent YouGov survey finding only 7% of people in the UK oppose gene editing, although there is still a way to go. Lighter regulation in recent years has allowed smaller companies and academic institutions to undertake research.
The future of farming
One of the industries that has benefited from CRISPR is agriculture. The ongoing GM debate is an example of controversial use of transgenesis, the process of inserting DNA from one species into another, spawning fears of ‘Frankenstein foods’.
Instead of creating mega-crops that out-compete all conventional plants, gene editing can provide crops with resistance to harsh environments and infections – particularly significant in the context of global food security.
Although gene-editing has been a staple of new agriculture technology for many years now, it is only recently that CRISPR has seen successful use in human disease research and resulting clinical trials.
Scientists at the Salk Institute, California, successfully corrected a mutation in the MYBPC3 gene, linked to a common form of heart disease, in a human embryo. The correction was made at the earliest stage of human development, meaning that the condition could not be passed to future generations.
CRISPR is also being used to study embryo development. Recently, scientists at the Francis Crick Institute, London, discovered that the gene OCT4 was vital in these early stages, although its purpose is still not fully understood. Researchers involved believe that more research into OCT4 could help us improve success rates of IVF and understand why some women miscarry.
A human embryo at day four, taken by a Scanning Electron Microscope. Image: Yorgos Nikas, Wellcome Images
CRISPR is still in the early stages and we are far from editing embryos that can be implanted for pregnancy. Many more safety tests are required before proceeding with any clinical trials, with the next step perhaps replicating the experiment on other mutations, such as those in BRCA1 and BRCA2 – genes responsible for an increased risk of breast cancer.
Experts are confident, however, that this technique could be applied to thousands of other diseases caused by a single mutation, such as cystic fibrosis and ovarian cancers.
The benefits of gene editing are abundant. For example, we may be able to turn the tables on antibiotic-resistant bacteria or ‘super-bugs’ by engineering bacteriophages - viruses that infect bacteria - to target antibiotic resistance genes, knocking them out and allowing conventional antibiotics to work once again. Elsewhere, CRISPR could be used to modify metabolic pathways within algae or corn to produce sustainable and cost-effective ethanol for the biofuel market.
Is CRISPR ethical?
CRISPR and gene editing will revolutionise many industries, but the fear remains in many that we will slip into a society where ‘designer babies’ become the norm, and individuality will be lost.
Marcy Darnovsky, Executive Director of the Center for Genetics and Society, said in a statement: ‘We could all too easily find ourselves in a world where some people’s children are considered biologically superior to the rest of us.’
Could CRISPR lead to a new generation of superheroes? Image: Cia Gould
Dr Lovell-Badge, from the Francis Crick Institute, disagrees. ‘I personally feel we are duty bound to explore what the technology can do in a safe, reliable manner to help people. If you have a way to help families not have a diseased child, then it would be unethical not to do it,’ he said.
Genetic engineering does not have to be an all-or-nothing proposition. There is a middle ground that, with correct regulation and oversight, will benefit everyone. With the UK’s globally renowned research base, the government has a great opportunity to encourage genetic research and further cement Britain’s place as the genetic research hub of the future.