
2016 Meetings

23 November 2016 Sleep: The Low-Hanging Fruit of Health and Wellbeing? Professor Vincent Walsh - Institute of Cognitive Neuroscience, UCL

For a full report of the meeting please download the following document: https://drive.google.com/open?id=0B5WWN5Xs4rIYOGwwWnByUkh5Tkk

26 October 2016 Changing Energy Sources - For Today and Tomorrow by Charles Miller - Consultant Drilling and Well Control Engineer

For a full report of the meeting please download the following document: https://drive.google.com/open?id=0B5WWN5Xs4rIYM3lmYTNjYU1oVGc

28 September 2016 Dr Percy Seymour MAKERS OF MODERN PHYSICS - Their Personalities and the Powers They Discovered


INTRODUCTION

This month's presentation, celebrating 10 years of Sherborne Science Cafe, was given by astronomer, author and Cafe founder member Dr Percy Seymour. Percy examined the progress of (mainly fundamental) physics, beginning with Newton's laws of motion and gravitation, and the expansive development of the subject from the early 20th century onwards following the discovery of radioactivity. The talk was illustrated with the personalities involved, many now famous (e.g. Einstein), others much less so (e.g. Chadwick).

Isaac Newton's inspirational work in determining the laws of motion and the law of gravitation (both 1687) (see references) gave scientists a pretty effective set of tools to predict the behaviour of matter, and they still work with great accuracy today: space probes, for example, continue to be launched and their destinations precisely determined using Newton's classical physics. However, with the discovery of radioactivity, it became obvious that the Newtonian view of the world was deficient in the context of these new discoveries.

FROM 'CLASSICAL' TO 'MODERN' PHYSICS

The 'new physics' began with the discovery of radioactivity by Henri Becquerel in 1896. J.J. Thomson showed (1897) that cathode rays were negatively charged particles, later called electrons. Radium and polonium were discovered by Marie and Pierre Curie in 1898, and Ernest Rutherford (previously a student of Thomson) identified the nature of alpha and beta particles and named the most penetrating emanation from radioactive elements 'gamma' radiation (terms still used today).

In 1904, Thomson developed the 'plum pudding' model of the atom, suggesting that electrons were distributed in a uniform sea of positive charge. A clutch of Nobel Prizes was awarded to these pioneers of radioactivity. Marie Curie was the first woman to receive a Nobel Prize, the first person to win it twice, and remains the only person to have won in two different sciences. As for Thomson, eight of his students, including his son, went on to become Nobel Prize winners themselves.

With the Cavendish Laboratory (at Cambridge) having pre-eminence in the field of radioactivity, Arthur Schuster, a German-born physicist at Manchester University with wealthy family connections, toured the universities of Europe, gaining ideas and insights to build a laboratory in Manchester that would rival the Cavendish.

Having completed this project, and generously recognising Rutherford to be the better scientist, Schuster offered the University authorities his resignation on condition that they appoint Rutherford (then at McGill) in his place. Rutherford took over in 1908, and the following year performed his famous 'gold foil' experiment with Hans Geiger and Ernest Marsden. The pattern of deflection of alpha particles fired at the foil led to the now-familiar model of the atom: a small, positively charged nucleus orbited by low-mass electrons (the 'Rutherford model'), in contrast to Thomson's earlier, incorrect 'plum pudding' model. In later research, Rutherford split the atom (1917) and discovered and named the proton.

Other scientists were also making inroads into the new physics. Albert Einstein, then a patent clerk in Switzerland, explained in 1905 how light can push electrons out of a material (building on Max Planck's quantum theory of 1900); this discovery of the law of the photoelectric effect eventually earned him the 1921 Nobel Prize.

Meanwhile, in 1911, the Dane Niels Bohr, on a scholarship from the Carlsberg Foundation (of lager fame), entered the Cavendish to study under J.J. Thomson. The two famously never got on after Bohr apparently found a mistake in one of Thomson's research papers. The error concerned the classical prediction that an electron in orbit would continuously emit radiation (and so spiral into the nucleus), which Bohr recognised could not be right. Driven out of the Cavendish, Bohr took up residence at Manchester under Rutherford. His key suggestion was that electrons exist only in defined orbital shells around the nucleus, and that it is the jump of an electron from an outer to an inner shell that produces a quantum of radiation. Thus was born the Bohr-Rutherford model of the atom. The reactive behaviour of different elements was explained by the numbers of electrons in their orbital shells: electrons fill lower shells before higher ones, each shell accommodates only a certain number of electrons, and it is an unfilled outer shell that gives an atom its chemical reactivity. Bohr returned to Denmark, where his institute opened in Copenhagen in 1921.
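The quantitative core of Bohr's picture is worth stating. For hydrogen, the textbook result (added here for context, not a formula quoted in the talk) gives the allowed energies and the photon released in a jump between shells:

```latex
E_n = -\frac{13.6\,\text{eV}}{n^2}, \qquad
h\nu = E_{n_i} - E_{n_f} = 13.6\,\text{eV}\left(\frac{1}{n_f^{2}} - \frac{1}{n_i^{2}}\right)
```

A jump from the third shell to the second, for example, yields a photon of about 1.9 eV, the familiar red line of hydrogen at 656 nm.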

Determining the nature of the nucleus continued with James Chadwick, who worked under Rutherford at Manchester and later moved with him to the Cavendish. In 1932, when firing alpha particles at beryllium, he detected a neutral particle with almost the same mass as the proton, which he promptly named the neutron.

The realisation that an atom contains protons (which determine the chemical identity of the element) and neutrons (which add mass only) led to the concept of isotopes, whereby an element can have different forms: atoms with identical numbers of protons but different numbers of neutrons, virtually identical chemically but differing in mass. Chadwick departed Cambridge to take a professorship at Liverpool, developing the institution into an important centre for nuclear physics.

Critical steps in radioactivity were made by Otto Hahn and Lise Meitner (with chemist Fritz Strassmann), whose experiments showed in 1938 that firing a neutron at uranium could split the atom into barium and krypton atoms (fission), with a small loss of mass converted into energy according to Einstein's famous equation E = mc². Each fissioned uranium atom also released further neutrons (two or three on average). With enough uranium present, a self-sustaining chain reaction would occur, with the potential to release immense amounts of energy.
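To give a feel for the numbers involved, the sketch below applies E = mc² to a single fission event. The mass defect of roughly 0.2 atomic mass units per fission is an approximate textbook figure, not a value quoted in the talk.

```python
# Back-of-envelope energy release from one uranium fission via E = m c^2.
# The ~0.2 u mass defect is an approximate textbook figure (assumption).

U_TO_KG = 1.66054e-27    # one atomic mass unit in kilograms
C = 2.99792458e8         # speed of light, m/s
J_PER_EV = 1.602176634e-19

mass_defect_u = 0.2                        # mass lost per fission, in u
energy_j = mass_defect_u * U_TO_KG * C**2
energy_mev = energy_j / J_PER_EV / 1e6

print(f"Energy per fission: {energy_j:.2e} J (~{energy_mev:.0f} MeV)")
# Roughly 3e-11 J, i.e. about 200 MeV, tens of millions of times the
# energy of a typical chemical bond, which is why a chain reaction
# through a few kilograms of uranium is so devastating.
```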

Whilst Hahn was reluctant to publicise this finding, the interpretation was worked out by Meitner and her nephew, Otto Frisch, and both understood that it could be the basis of a formidable weapon. Surprisingly, the potential of a chain reaction had been conceived a few years earlier by the scientist and inventor Leo Szilard, who thought of it in 1933 and patented the process the following year (though interestingly, at this point Rutherford, by then Lord Rutherford, was unable to see any virtue in nuclear energy, famously dismissing the idea as 'moonshine').

PUTTING THEORY INTO PRACTICE

The MAUD Committee was established in 1940 in response to a memorandum from Otto Frisch and fellow researcher Rudolf Peierls on the feasibility of constructing a nuclear bomb. It was acknowledged that a chain reaction would be very difficult to establish; moreover, only the rare isotope U-235 would do, and in natural uranium it is outnumbered roughly 138-to-1 by its much more common cousin U-238. The MAUD Report of 1941 went to the USA, where it attracted keen interest after the attack on Pearl Harbor.


Robert Oppenheimer was the man chosen to build the bomb. A high-achieving student, he came to Cambridge in the mid-1920s hoping to work under Rutherford, only to be told he was no good at practical laboratory work. He ended up instead with Patrick Blackett as his tutor, who likewise concluded that he was poor at experimental work. Humiliated again, Oppenheimer presented Blackett with a poisoned apple; Blackett survived. Whilst his experimental work had been poor, his theoretical work was excellent, and he was persuaded to stay at the Cavendish before continuing his scientific career back in America.

By then, General Leslie Groves had been charged with building a team at Los Alamos to develop a nuclear bomb. Whilst Oppenheimer was not everyone's cup of tea, Groves was very impressed with him, and despite possible communist associations, and against virtually everyone else's opinion, Groves put Oppenheimer in charge. It proved an excellent decision.

Three bombs were produced: a uranium bomb of around 15 kt (dropped on Hiroshima) and two plutonium bombs of around 20 kt each, one of which was dropped on Nagasaki three days later. The other plutonium bomb had been detonated on American soil as a test (Trinity), three weeks before the Hiroshima attack. Whilst the uranium bomb was relatively simple, requiring two sub-critical lumps of uranium to be pushed together to create a critical mass, the plutonium bomb was more technical, requiring a sphere of plutonium to be precisely imploded by a concentric coating of conventional explosives.

The ultimate and devastating expressions of the new physics were undoubtedly the atomic bombs dropped on Japan in 1945. These events were not without controversy. Was it truly necessary to destroy two Japanese cities, with appalling collateral damage to civilians, to bring the conflict to an immediate conclusion and thereby save Allied lives in the planned invasion of the Japanese mainland? Or were there ulterior motives for deploying these fearsome weapons? Percy raised the possibility of revenge for the Pearl Harbor attack of 1941 (the spectre of a first strike haunted US defence planners throughout the Cold War), or a need, as an audience member suggested, to demonstrate to the increasingly belligerent Russians the primacy of American military power. Whilst atomic bombs may be viewed by many with horror, the flip side of nuclear physics has been carbon-neutral nuclear power, medical diagnostics and radioactive treatments for cancer.

When asked if there were any more major discoveries in physics to be made, Percy related the story of Lord Kelvin, who, when a prospective student asked (shortly before the discovery of radioactivity) whether physics was a suitable discipline to enter, advised him not to bother, as all the major discoveries had already been made. As Donald Rumsfeld famously declared (see references), there are 'known knowns' (things we know we know), 'known unknowns' (things we know we don't know) and 'unknown unknowns' (things we do not know we don't know). It is these 'unknown unknowns' that will continue to provide new and exciting frontiers for physics.

REFERENCES

Newton's laws of motion - see https://en.m.wikipedia.org/wiki/Newton%27s_laws_of_motion

22 June 2016

Dr Jeremy Walton - Four Seasons in One Day: The Met Office, Weather & Climate

Dr Jeremy Walton, from the Met Office in Exeter, is the lead computational scientist for the UK Earth System Model (UKESM). A major application of UKESM will be in the next round of the Climate Model Intercomparison Project, which aims for a better understanding of climate change in a multi-model context. UKESM and other climate models will be used to run the same set of experiments, and comparison between their results will be used to assess their performance and quantify the spread amongst future projections.

Climate modelling was one of the main themes of Dr Walton's talk, but he first gave a brief account of the history and role of the UK Met Office. It was the wreck of the clipper Royal Charter in 1859, following a particularly violent storm, that led Admiral Robert FitzRoy (captain of the Beagle on Darwin's voyage) to ask whether the event could have been predicted. The Meteorological Department of the Board of Trade eventually became the present Met Office, and was responsible for the first real attempts to forecast the weather. The Met Office is currently based in Exeter and employs over 1900 staff (1400 in Exeter).


Dr Walton next turned to weather forecasting, starting with examples of recent significant weather events. The potential damage and loss of life from extreme weather, for example Hurricane Katrina, emphasise the need for accurate forecasting. Three stages were identified: first, observe the various physical quantities that characterise the current situation; then model the data to see what could happen; and finally bring these two stages together to make a forecast of future events. Observations are made on the ground, at sea, in the air and by remote sensing from satellites. The atmosphere is modelled in three dimensions by solving a number of differential equations. The complexity of the model and the amount of data to be handled require multi-processor supercomputers; the Met Office has one of the most powerful computers in the UK, currently a Cray XC40. Weather forecasting has improved steadily as computing power has increased: a current four-day forecast is as accurate as a one-day forecast was 40 years ago.

A major problem for the modelling is that small perturbations in the initial data can lead to very different forecasts. One approach is to model a large ensemble of possible scenarios, enabling an estimate of how probable the final forecast is (a toy demonstration follows at the end of this report).

Dr Walton then returned to climate, as distinct from weather. Whereas weather forecasting is concerned with predicting a few days ahead, the time-scale for climate modelling is years, decades and centuries into the future. The topic of climate change and the possible role of humans in global warming is controversial. However, there is a large body of evidence that the Earth is warming: of the 13 hottest years since 1980, 11 occurred between 2001 and 2011. Several illustrations of climate predictions were shown, which depended on the assumed levels of greenhouse gases, particularly CO2.

Dr Walton proved a lively speaker and, following a period of heavy rainfall, arrived on a sunny evening, emphasising the variability of the weather in our part of the world. He welcomed questions throughout his talk and gave us a warm and entertaining meeting.

Report by Bob Barber
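As a footnote to the report: the ensemble idea is easy to demonstrate with a toy model. The sketch below uses the classic Lorenz-63 equations, a standard three-variable caricature of atmospheric convection (illustrative only; the Met Office's operational models are vastly more elaborate). Twenty runs start from near-identical initial states, and the growing spread across the ensemble shows why single long-range forecasts are meaningless while probabilistic ones remain useful.

```python
import numpy as np

# Lorenz-63: the classic toy demonstration of why tiny errors in the
# initial data ruin long-range forecasts (illustrative only).

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Advance the Lorenz system one step with a simple Euler update."""
    x, y, z = state
    return state + dt * np.array([sigma * (y - x),
                                  x * (rho - z) - y,
                                  x * y - beta * z])

rng = np.random.default_rng(42)
base = np.array([1.0, 1.0, 1.0])

# Ensemble of 20 members, each perturbed by ~0.1% to mimic observation error.
members = [base + 1e-3 * rng.standard_normal(3) for _ in range(20)]

for step in range(1, 2001):
    members = [lorenz_step(m) for m in members]
    if step % 400 == 0:
        xs = np.array([m[0] for m in members])
        print(f"t = {step * 0.01:5.1f}   ensemble spread in x = {xs.std():6.2f}")
# The spread grows from near zero to the size of the attractor itself:
# beyond a certain horizon, only the statistics of the ensemble carry
# any forecasting value.
```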


25 May 2016 Star Formation - Dr Jennifer Hatchell, Department of Physics and Astronomy, Exeter University

Dr Jennifer Hatchell, Lecturer in Astrophysics at Exeter University, entertained us with a beautifully illustrated talk on current research into star and planet formation in our galaxy, the Milky Way. The Milky Way consists of mature stars of various types, plus the raw materials needed for star formation: dust and gas (essentially hydrogen). Surprisingly, it is estimated that as few as two stars per year are formed in our galaxy. One problem for astronomers is that the regions where stars form are hidden by dust when observations are made at optical wavelengths. We were shown images of the Orion nebula, one of the nearest star-forming regions, at both optical and (longer) infra-red (IR) wavelengths. Young stars, obscured by dust at optical wavelengths, are revealed in the near-IR.

An understanding of the physics of star formation requires balancing the force of gravity (causing material to coalesce) against gas pressure (which resists the collapse). Proto-stars take about one million years to form, a process that has been modelled by Prof Matthew Bate in Exeter using supercomputers. Some of the models take several months to complete the calculations, but we were shown simulations of the process condensed into a few seconds. The evolution to nuclear fusion (hydrogen 'burning'), the process that fuels our Sun, takes much longer: 100 million years.

Checking the computer models against observations takes astronomy into new areas, in particular sub-mm wavelengths. Dr Hatchell outlined those parts of the electromagnetic spectrum that are available to earth-based telescopes, and described how the atmosphere, particularly atmospheric water, absorbs strongly at some wavelengths. One approach is to place telescopes in space, as with the Hubble and Herschel Space Telescopes, or as far as possible above the atmosphere on Earth: major observatories are sited on Mauna Kea in Hawaii and in the Atacama Desert in Chile. A range of instruments now enables astronomers to look at a much wider range of wavelengths than previously, and to investigate the temperature of star-forming regions and the relative amount of radiation at different wavelengths - important for comparing theory with observation. One particularly exciting recent development has been the introduction of ALMA, the Atacama Large Millimetre Array, a large-scale interferometer operating in the sub-millimetre region. Angular momentum within dust clouds leads to the formation of disks around proto-stars, the regions where planets are known to form.
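The gravity-versus-pressure balance described above has a classic quantitative form: the Jeans mass, above which a gas clump collapses under its own weight. The sketch below evaluates it for values typical of a cold molecular cloud core; the temperature and density used here are illustrative assumptions, not figures from the talk.

```python
import math

# Jeans mass: the minimum mass at which self-gravity beats thermal pressure
# and a gas clump collapses to form a star.  Illustrative numbers only.

K_B = 1.380649e-23    # Boltzmann constant, J/K
G = 6.674e-11         # gravitational constant, m^3 kg^-1 s^-2
M_H = 1.6735e-27      # hydrogen atom mass, kg
M_SUN = 1.989e30      # solar mass, kg

T = 10.0              # gas temperature, K (typical cold cloud core)
n = 1e10              # particle number density, m^-3 (= 1e4 per cm^3)
mu = 2.33             # mean molecular weight for molecular hydrogen + helium

rho = n * mu * M_H    # mass density, kg/m^3
m_jeans = (5 * K_B * T / (G * mu * M_H)) ** 1.5 * (3 / (4 * math.pi * rho)) ** 0.5

print(f"Jeans mass ~ {m_jeans / M_SUN:.1f} solar masses")
# A few solar masses: clumps bigger than this collapse, smaller ones are
# held up by gas pressure, consistent with the slow observed rate of
# star formation in the Milky Way.
```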

Reproduced courtesy of the European Southern Observatory under a Creative Commons Attribution 4.0 International License. Perhaps the most amazing of Dr Hatchell's many remarkable images was of the proto-star HL Tauri, imaged by ALMA, showing a disk divided by dark bands, thought to be evidence of planets in formation 'clearing' the dust. The comparison with our own solar system is striking. A very successful talk was followed by a lively session of questions.

Report by Bob Barber


27th April Saving Nature - A Media Perspective Julian Hector, Acting Head of the BBC Natural History Unit (NHU), Bristol

Spot and Stripe with zookeeper Giles Clark in Tigers About the House

In a change of emphasis from the 'hard', fundamental-science presentations of recent months, Julian Hector, a colleague of David Attenborough, spoke about the instrumental role of the BBC NHU in the conservation of many threatened species: species which, as human beings, we claim to value, yet quite often fail to conserve at all.

Julian's entry into this field had been via PhD research into albatross behaviour at Bird Island in South Georgia during the early 1980s. His motivation had been, and continues to be, a strong affinity with such animals, and a desire that they should be available for future generations.

The BBC NHU (which will celebrate its 60th birthday next year), of which Julian is Acting Head, has traditionally been based in Bristol, for no other reason than that, when first established, this was where the main players in the business were based. Fast forward to the present day, and the unit is still quartered there, collaborating extensively with wildlife experts at Bristol University. Bristol continues to be the premier city globally for wildlife programming, with the established skills and infrastructure for producing such complex programmes.

A key feature of BBC output, compared to that made by other providers, is that the Corporation is unable to campaign and needs to be (and be seen to be) fair, impartial and balanced in its approach. For Julian, the theme of the presentation was what the BBC can do to influence and conserve wildlife without its impartiality being tainted by active campaigning.

There are several key influences in the BBC's approach which have led to both the success of the NHU (in terms of demand for its products and audience reach) and its broad influence in wildlife conservation:

i) IMAGE CAPTURE TECHNOLOGY. This advances so quickly that photographers prefer to hire rather than purchase their equipment, knowing that whatever technology they used previously will have been superseded by their next assignment. There is great expectation to produce the 'impossible shot', and Julian showed examples of this in his various clips. This not only produces a dramatic montage for the casual audience, but can provide new data leading to new discoveries in animal behaviour, as was the case in the landmark, soon-to-be-released series 'Oceans'.

An example of the march of technology has been the replacement of long, awkward, helicopter-mounted, stabilised lenses by drone technology, permitting unobtrusive, detailed filming close up. Another is to have animals do the filming themselves, with innocuous attached cameras recording what the animal sees and responds to.

ii) PRESENTATION. Large (and increasing) screen sizes, coupled with high (and increasing) screen definitions, are able to immerse people in the natural world. This, coupled with a good animal story, leaves an audience wanting to know more. An example ('Tigers About the House') is the story of zookeeper Giles Clark raising Spot and Stripe, two orphaned Sumatran tigers, at Australia Zoo. Continuous access to Giles and his feline charges made a compelling story which raised large donations for the zoo and its ongoing conservation efforts.

Other opportunities have come from collaboration between the NHU and SEGA, a Japanese gaming company. The state-of-the-art result of this partnership (Orbi in Yokohama) shows dramatic wildlife footage on a 40 x 8 metre screen, with surround sound and special effects (water in the face, and punchers in the seats activated at the appropriate moment) to give an all-absorbing experience. Such immersion in the world of nature leaves audiences wanting more, thereby encouraging interest in both conservation and the wider animal world.

iii) NETWORKING WITH EXPERTS. The NHU makes extensive use of, and collaborates with, appropriate scientists, experts and consultants, both in the planning phase of any project and in the final production. This gives the final product credibility by ensuring it is endorsed by the foremost experts in the relevant field.

The way forward will be to keep at least abreast of, if not ahead of, the latest technological developments; to utilise the best experts in any wildlife production; and to ensure that presentation is both cutting-edge and connects with those watching, not only for entertainment and enjoyment, but to bring them on board with the ultimate message: wildlife, if it is to survive, must be conserved.

Report by Simon Webster

23rd March Carbon Electronics - Is the Future Black? Dr Sharon Strawbridge, Dept of Physics and Astronomy, Exeter University

Graphene - honeycombed carbon lattice

Carbon, the 15th most abundant element in the Earth's crust and the 4th most abundant in the universe, is one of the most intriguing of elements. Its electron structure gives it the ability to join with other carbon atoms to form allotropes (different structural forms of the same element) and to combine with other elements to produce stable, often quite complex, compounds, of which we, as carbon-based life forms, have been major beneficiaries. This electronic explanation for the behaviour of carbon has given rise to the heretical comment (for chemists at least) that 'all chemistry is physics of the electron'.


Carbon allotropes, then, consist of the same building blocks (carbon atoms), but in each the atoms join with their neighbours in different ways, producing disparate materials with altogether dissimilar physical properties. Diamond and graphite are the most commonly quoted examples of this diversity, but other forms exist. Fullerenes, for example, discovered only in the 1980s, are carbon constructions in the shape of hollow spheres, ellipsoids and tubes. Most recently, graphene (from 'graphite' and the suffix '-ene') was discovered (or, more technically, rediscovered) by Andre Geim and Konstantin Novoselov of Manchester University in 2004, for which they received the Nobel Prize in Physics in 2010. Dr Sharon Strawbridge, a senior lecturer in Physics and Astronomy at Exeter University, came to talk about graphene and, more particularly, whether it is the wonder material it is hyped up to be. Sharon explained that her interest was very much in primary research rather than in developing the material for commercial applications.

Graphene (a structure that was 'the last to be discovered as a material in its own right') was theorised in 1947 as the simplest model for carbon structures: a single-sheet, lightweight, 2D polymer, as extensive as you like. It was first observed using an electron microscope in 1962, though its potential went unrecognised for the next 42 years.

It is cheap to make: the raw material is carbon, and any carbon source (e.g. methane, CH4) that can be readily converted to a plasma will suffice. It is easy to manufacture, air-stable and optically transparent; being only one atomic layer thick, it can only be seen by the interference pattern it produces. It is more conductive than copper and around 100x stronger than steel. It has a tendency to adsorb molecules onto its surface, perturbing its electronic qualities, so annealing is needed to remove adsorbed molecules; it can also be given paramagnetic susceptibility by introducing faults within the carbon structure. Alternatively, it can be produced by pulling single carbon layers from graphite and transferring them to a silicon wafer.

For the future to be 'black', with electronic systems based on carbon (graphene) rather than, as at present, silicon, several challenges need to be met. In technical terms, material purity, reproducibility and scale all need to improve. In economic terms, the electronics industry is predicated on vast capital investment in silicon, and in consequence there is inertia to change: whatever eventually knocks silicon off its perch will need to be much, much better, and graphene has not yet reached that stage. Investors, often in China, have to date lost a great deal of money trying to unlock the commercial potential of graphene. There is potential for carbon molecular logic-gate circuits, but true molecular electronics is still a long way off.

Sharon's talk left a lasting impression of the diversity of carbon structures, that new discoveries will continue to be made, and that at some point, the future will indeed be 'black'.

Report by Simon Webster


24th February From Lodestones to Hard Drives - Insights into Magnetism (or Magnets - How Do They Work?) Dr Chris Bell, University of Bristol

The phenomenon of ferromagnetism, from its discovery by the ancient Greeks in the 6th century BC and its use by Chinese navigators from the 4th century AD, through its theoretical and early practical development by Coulomb, Faraday and Maxwell in the 18th and 19th centuries, to its modern application in data storage devices, was brought to life by Cambridge alumnus Dr Chris Bell, currently a researcher in the Physics department at the University of Bristol. His research focuses on the creation and control of novel electronic phases in materials. The term 'magnetism' covers a range of effects (ferromagnetism, diamagnetism, paramagnetism, anti-ferromagnetism), and it is ferromagnetism specifically that Chris addressed in his presentation. Ferromagnetism is the mechanism by which certain materials form permanent magnets, or are attracted to magnets.

Despite being a prominent part of everyday technologies, ferromagnetism has defied easy explanation. Whilst it has provided ample material for several Nobel Prizes and seriously engaged the minds of Nobel laureates, it remains an elusive phenomenon. The notable physicist Richard Feynman once stated: 'I really can't do a good job, any job, of explaining magnetic force in terms of something else you are more familiar with, because I don't understand it in terms of anything else you're more familiar with.' Ferromagnetism, it seems, can only be understood on its own terms. It may be, however, that Feynman relished the esoteric. He said of his own Nobel-winning research (quantum electrodynamics): 'If I could explain it to the average person, it wouldn't have been worth the Nobel Prize.'

Although the term 'ferromagnetism' appears to limit its remit to iron, it is also a property of several other elements, namely cobalt (Co), nickel (Ni) and gadolinium (Gd), as well as the rare earth metals samarium (Sm) and neodymium (Nd), the latter increasingly found in commercially available magnets. Ferromagnets are marked by a property known as the Curie temperature, the temperature above which residual magnetism is destroyed. This varies according to the material, and is why iron-containing volcanic rocks such as basalt, having been purged of residual magnetism whilst molten, can take on the magnetic signature of their surroundings as they cool below their Curie temperature, thereby providing a palaeomagnetic record of the Earth's field. Looking forward to new developments, graphene may well exhibit desirable magnetic properties.

Classical physics has been unable to explain ferromagnetism; current thinking relates its existence to quantum effects. Besides a negative charge, the electron has a fundamental property of a magnetic dipole moment, whereby it behaves as a tiny magnet. The dipole moment, which exists in two possible states (UP and DOWN), is related to both the quantum mechanical spin of the electron and its orbital angular momentum. Much of this theory was developed by Wolfgang Pauli (1900-58; Nobel Prize 1945) and Bristol-born Paul Dirac (1902-84; Nobel Prize 1933), who combined quantum mechanics and special relativity to account for spin. At its simplest level, the quantum nature of ferromagnetism is marked by three attributes:

i) Collectivity - lots of spins need to point in the same direction;
ii) Macroscopic distances - electrons are ordered over huge distances; and
iii) Emergence - new properties arise as the number of components in the system increases.
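For context, the 'tiny magnet' each electron carries has a definite textbook size: in Dirac's theory its spin moment is almost exactly one Bohr magneton (a standard result, not a formula from the talk):

```latex
\mu_B = \frac{e\hbar}{2m_e} \approx 9.274 \times 10^{-24}\ \text{J T}^{-1},
\qquad |\mu_{\text{spin}}| \approx \frac{g_s}{2}\,\mu_B \approx \mu_B
\quad (g_s \approx 2)
```

A ferromagnet is, in essence, Avogadro-scale numbers of these moments locked into alignment, which is what the 'collectivity' attribute above refers to.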

A major application of ferromagnetism today is in the multiplicity of computer hard drives upon which our data-driven world depends. Old-school magnetic hard drives from the 1980s provided reasonable magnetic storage, with individual magnetic storage elements placed horizontally adjacent to one another; if placed too close, neighbouring elements could be made to 'flip'. It was Albert Fert (b. 1938) and Peter Grünberg (b. 1939) (joint Nobel Prize 2007) who discovered giant magnetoresistance (GMR), whereby thin, alternating films of ferromagnetic and non-magnetic conductive material interact in a way that paved the way for a massive reduction in the size of magnetic storage. Individual storage elements could be placed perpendicularly on a drive without each element contaminating its neighbours with its signal. However, making each individual magnet smaller weakens its magnetic strength and reduces its Curie temperature, so there is a need to choose a magnetic material with an initially high Curie temperature. The short time lag between the primary research (the discovery of GMR) and a marketable product (terabyte hard drives) shows how quickly the computer industry adopts new findings.

One aspect of Chris Bell's research concerns the ultimate limit of magnetic storage. Cutting-edge micrographs showed minute individual magnetic storage elements, each consisting of only a few iron atoms, positioned on a copper substrate. This fundamental research presently requires very low temperatures (around 1 kelvin) to avoid crossing the Curie threshold. Current hard drives are only a factor of 10 away from the best achieved by this fundamental research. An interesting point was that magnetic memory shrinkage (Kryder's Law) has outpaced the shrinkage of transistor size (Moore's Law) over the same period: since 1979, storage elements have shrunk by a factor of 10,000, and losses have been reduced considerably (see the quick check below). A current limiting factor is the relatively slow speed of mechanically extracting information from a hard drive for processing; an interesting stage has been reached where a non-mechanical means is required, and flash, as a technology, is not regarded as reliable.
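Taken at face value, the quoted 10,000-fold shrinkage implies a strikingly short doubling time. The quick check below assumes the figure spans the 37 years from 1979 to the time of the talk; the dates are an inference from the report, not numbers Dr Bell necessarily used.

```python
import math

# Implied halving time for magnetic storage element size (Kryder's Law),
# assuming the quoted 10,000x shrinkage spans 1979-2016 (an inference).
factor = 10_000
years = 2016 - 1979

halvings = math.log2(factor)        # ~13.3 halvings of element size
halving_time = years / halvings     # ~2.8 years per halving

print(f"{halvings:.1f} halvings, i.e. one every {halving_time:.1f} years")
# For comparison, Moore's Law is usually quoted as a doubling of
# transistor counts roughly every two years; Kryder-rate improvements
# were at their fastest in the post-GMR years around the millennium.
```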

Post-presentation discussion suggested that hard-water descaling magnets (fitted around mains water inlet pipes) do change the crystalline structure of calcium salts within the water, and therefore have the potential to affect mineral deposition. Modern hard drives are not in danger of being corrupted by stray magnetic fields, though it is uncertain for how long the integrity of data on a hard drive can be maintained. Ferromagnetism can arise in alloys of non-ferromagnetic elements (Heusler alloys), suggesting that ferromagnetism is a property not purely of a material but also of its microstructure, and magnetic fields can be shielded against by suitable materials. As regards mobile phones, it was felt that early versions may well have had the potential to cause harm, but modern phones give out much less radiation.


Report by Simon Webster

27th January Nitrogen and Humanity - The Limits of Civilisation and the Challenge to Science Professor Philip Poole, Department of Plant Science, University of Oxford

The internationally publicised annual setting of the Doomsday Clock this January, when it remained at three minutes to midnight ('Doomsday'), reflects not only the nuclear ambitions of big nations but also environmental factors, such as the IPCC's finding that growth in crop yields has slowed over the past 40 years. As if on cue, Science Cafe welcomed Oxford University professor Phil Poole to speak on the technicalities of increasing global yields by developing crops better able to provide their own nitrogen, a major limiting nutrient for many staples, especially in agriculturally challenged parts of the world. This is an issue which affects rich and poor nations differently: first-world countries often use too much nitrogen, whereas developing countries use too little, limiting crop yields.

Whilst nitrogen is hugely abundant (78% of the atmosphere by volume), its triple-bonded molecule is famously unreactive. Chemically combining it with hydrogen to form ammonia (NH3) makes it immediately and beneficially available to plants, following conversion to nitrates.

Nature, as innovative as ever, got in on the act first. Some sixty-four million years ago, in a once-only event, a relatively small family of plants formed an association with rhizobium bacteria, whereby the bacteria gained the security of a root system in exchange for providing ammonia.

Legumes are the plants which possess this remarkable innovation, and so specific is the adaptation that only one non-leguminous plant (Parasponia) has managed to acquire it. Nitrogen fixation has a dramatic effect on protein yields: it allows twice the protein per acre of any other major vegetable or grain, 5-10x the protein yield of the same land used for dairy, and 15x that of land used for meat production. The Italian philosopher Umberto Eco claimed that the bean saved western civilisation, allowing the European population to triple in size between 1000 and 1500 AD by supplying adequate protein to the growing populace.

Rhizobia (nitrogen-fixing soil bacteria) reside in nodules on the root systems, with specific rhizobia associated with specific legumes. Some 19,400 species of legume exist; for agricultural purposes we use only 12, so a whole range of possible legumes is likely out there waiting to be exploited. Not all legumes are immediately suitable for human consumption: Australia, for example, has a native legume which contains a poison to prevent it being consumed by ravenous animals in an environment otherwise devoid of tasty offerings.

Further yield gains proved possible by adding nitrogen directly to agricultural soils. In the 19th century, nitrogen-rich guano was extracted from the Chincha Islands (off Peru). When this ran out (in the 1870s), surface deposits in the ever-dry Atacama Desert were found to be nitrate-rich and were exported for global agricultural production. These sources were always finite, and their exhaustion would have been disastrous for burgeoning populations. In 1898, William Crookes, President of the British Association for the Advancement of Science, prophesied that demand for nitrogen compounds would soon exceed supply, and that it would fall to scientists to take up the reins of nitrate production.

In what Professor Poole described as the 'greatest scientific development of the last 200 years', the German chemists Fritz Haber and Carl Bosch succeeded, around 1910, in developing a chemical process for the industrial production of huge quantities of ammonia.
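The underlying chemistry is simple to write down, though making it run at scale demanded the extreme conditions Bosch engineered (a standard textbook equation, added for reference; the operating figures below are typical values, not ones quoted in the talk):

```latex
\text{N}_2 + 3\,\text{H}_2 \;\rightleftharpoons\; 2\,\text{NH}_3,
\qquad \Delta H \approx -92\ \text{kJ per mole of N}_2
```

The reaction is exothermic, but the nitrogen-nitrogen triple bond makes it immensely sluggish, hence the iron catalyst, temperatures of roughly 400-450 °C and pressures of 150-300 atmospheres used industrially.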

At the time of its introduction, commentators were in two minds about its utility. During WW1, the UK controlled access to Chilean saltpetre, and the Royal Navy dominated the world's oceans, denying Germany access to nitrates for munitions production; without any other source, Imperial Germany could not have continued the conflict beyond 1915. However, the unlimited ammonia produced by German Haber-Bosch plants allowed explosives to be manufactured without restriction. Haber (Nobel Prize 1918) and Bosch (Nobel Prize 1931) were thus heroes to those seeking to increase crop yields in starving countries, and demons to the Allied forces seeking to throw off the might of Germany. There was also personal sadness for Haber: his brilliant pacifist wife, Clara, dismayed by her equally brilliant husband's involvement in poison gas production for the German war effort, committed suicide in 1915.

So important and so influential is the Haber-Bosch process, and so much in demand are its products, that it is estimated that over half the nitrogen atoms in our bodies have passed through it. This high-temperature, high-pressure process consumes 1-2% of global energy production, yet remains largely hidden from the everyday experience of most of us. Some 450 million tonnes of nitrogen fertilizer are manufactured annually.

However, the disadvantage of soil-applied nitrogen is that a substantial amount (~50%) leaches from the soil, contaminating runoff and encouraging massive algal blooms which cause immense damage to marine and riverine ecosystems. Denitrifying bacteria also convert excess soil nitrogen to N2O, a hugely potent greenhouse gas (some 300x more potent than CO2). In comparison, nitrogen fixed within the nodules of legumes provides plants with sufficient nitrogen without soil or runoff contamination, and without a greenhouse stimulus to the atmosphere.

The question, Professor Poole asked, was whether further gains in crop production could be achieved by incorporating rhizobia into staple cereals.

The key to this is as follows. When primitive land plants (e.g. liverworts) first colonised the continents around 400 million years ago, they had little in the way of roots, and relied for sustenance on a symbiotic association with mycorrhizal fungi to supply essentials from the soil. Despite having since developed effective root systems, 85% of land plants today still retain a mutually beneficial association with mycorrhizae. Such an association requires a signalling pathway between symbiotic partners; similarly, there is a signalling pathway between rhizobia and their leguminous partners. The interesting point here is that the signalling pathway used in nodulation is very similar to that used by plants in association with mycorrhizal fungi. There is therefore an innate susceptibility in many crop plants, such as cereals, to be piggy-backed with engineered nitrogen-fixing symbionts. This forms the basis of much of Professor Poole's research.

Another innovation being pursued at Oxford by Professor Poole's colleagues is to incorporate the so-called C4 photosynthetic pathway into C3 crops. Evolutionarily older plants are C3 users and include rye, oats, soybean, peanut, sugar beet, rice and barley. C4 photosynthesis, a more modern innovation, is more efficient, especially in hot, dry climates, and uses water and nutrients far more sparingly. In the predicted greenhouse climes of the future, converting the venerable C3 crops (most particularly rice) to C4 usage should improve yields whilst using less water and fewer nutrients in the process.

Questioning and discussion led to several interesting points. Whilst legumes may have provided the protein for European population growth, the potato matched this growth by supplying the necessary carbohydrate component of the diet. The leguminous nitrogen-fixing process carries a yield penalty for the plant compared with cereals, but protein is produced in abundance without environmental detriment. Green manure is a desirable addition to soils; indeed, research shows that growing legumes profoundly improves soil quality. For Professor Poole, GM crops are neither good nor bad: each case should be looked at individually, but a precautionary principle should always be invoked before planting. Organic crops do not have any great advantage in yield; often they are more ancient, lower-yielding varieties requiring a greater land area per unit yield.

So far, mankind has managed broadly to match crop yields to population growth. The key has been nitrogen: first obtained by cultivating legumes, then by adding nitrogen sources to the soil (e.g. guano), and then via the Haber-Bosch process. A future boost will inevitably come from work such as Professor Poole's, the success of which might wind back the Doomsday Clock, keeping it from the apocalyptic midnight hour.

Report by Simon Webster

6th January Great Egg Race

Team 'Clifton Bridge'

Like previous events, this year's Egg Race required planning, scientific application and more than a dash of initiative; anyone with a civil engineer on their team was definitely at an advantage. Historically, the Egg Race was always held before Christmas, but with the run-up to the festivities often too busy for people to attend, it is now a Christmas-spirited, post-Christmas event held early in the New Year.

Instead of the normally passive stance required of attendees at Café meetings, the Egg Race demands engagement with others and a willingness to apply scientific principles in a practical context (with good humour), together with a glass of a favourite tipple in one hand and a mince pie in the other (both liberally supplied by the stewards). Teams, equipped with dried spaghetti, masking tape, glue and pipe-cleaners, were tasked with building a bridge to cover a 40 cm span over which a small four-wheeled vehicle could traverse. On completion, the competition judges ensured the bridges were navigable by the vehicle, and then progressively loaded the finished structures until failure, usually sudden and dramatic, much like that which befell Sir Thomas Bouch's Tay Bridge. Team tactics were therefore to build a bridge of the requisite span, but one stronger than the competition's.

Most teams adopted a strategy of taping and/or gluing a number of strands of spaghetti together to form rigid columns, and then attaching the columns together in the form of a girder bridge. Some of these were quite spectacular. Team Clifton Bridge's structure seemed indestructible, balancing compressive and tensile forces in a manner ideal for spaghetti. Other attempts were much shakier, more in keeping with Heath Robinson than Gustave Eiffel (who, besides erecting the eponymous tower, was a notable builder of elegant girder bridges).

For those with the skills to multitask, a scientific quiz was available, and Percy, our founder and former chairman, whilst not physically present, was there in spirit, having forwarded a splendid set of Christmas jokes for proclamation.

Congratulations to all who took part, courageously willing to accept acclamation or humiliation according to the success of their structures, and especially to team ‘Clifton Bridge’, the winners!

Report by Simon Webster
