Arkenor on February 9th, 2011

This press release from Oregon Health and Science University caught my eye. Like most people, I’m in a perpetual battle with my waistline, so new clues as to how to win the war are always welcome!

The dietary concerns about too much fructose are well documented. High-fructose corn syrup has become the sweetener most commonly added to processed foods. Many dietary experts believe this increase directly correlates to the nation’s growing obesity epidemic. Now, new research at Oregon Health & Science University demonstrates that the brain – which serves as a master control for body weight – reacts differently to fructose compared with another common sweetener, glucose. The research is published in the online edition of the journal Diabetes, Obesity and Metabolism and will appear in the March print edition.

“We know from animal models that the brain responds uniquely to different nutrients and that these responses can determine how much they eat,” said Jonathan Purnell, M.D., an associate professor of medicine (endocrinology, diabetes and clinical nutrition) in the OHSU School of Medicine.

“With newer technologies such as functional MRI, we can examine how brain activity in humans reacts when exposed to, say, carbohydrates or fats. What we’ve found in this case is that the brain’s response to fructose is very different to the response to glucose, which is less likely to promote weight gain.”

Functional MRI allows researchers to watch brain activity in real time. To conduct the research, nine normal-weight human study subjects were imaged as they received an infusion of fructose, glucose or a saline solution. When the resulting brain scans from these three groups were compared, the scientists observed distinct differences.

Brain activity in the hypothalamus, one brain area involved in regulating food intake, was not affected by either fructose or glucose. However, activity in the cortical brain control areas showed the opposite response during infusions of the sugars. Activity in these areas was inhibited when fructose was given but activated during glucose infusion.

This is an important finding because these control brain areas included sites that are thought to be important in determining how we respond to food taste, smells, and pictures, which the American public is bombarded with daily.

“This study provides evidence in humans that fructose and glucose elicit opposite responses in the brain. It supports the animal research that shows similar findings and links fructose with obesity,” added Purnell.

“For consumers, our findings support current recommendations that people be conscious of sweeteners added to their drinks and meals and not overindulge on high-fructose, processed foods.”

The effect this research describes is to do with feelings of satiation. It takes a larger amount of fructose for your body to decide it has had enough than it would if you were consuming glucose. This is important research because fructose has replaced other ingredients in a wide range of foods, particularly in the US. The reduction of satiation would lead, generally, to increased consumption, and we all know where that ends up!

It’s worth mentioning that the popular food ingredient, high fructose corn syrup, is actually a blend of glucose and fructose, rather than being pure fructose. It’s still pretty nasty stuff though, and has been linked to liver disease and gout, amongst other complaints.

While fructose is present in many natural foods, most notably fruit and honey, we should not let these findings put us off eating such otherwise beneficial foods. It is the massive amount of fructose added to processed food that is really the problem. Just knowing that fructose can “trick” us into eating more than we really need can help us to make sensible decisions on when to stop eating.

Arkenor on July 1st, 2010

I know, I’m a bad man for not updating this site more often. After all, the whole point of it was to stop my main blog being so unfocused, by posting all my science related stuff here. Unfortunately, as it turns out, I don’t post about science anywhere near as often as I do about everything else!

This site will be here next time I do have something scientific to discuss. Until then, you might like to check out my regular blog, Ark’s Ark, for a vaguely gaming-related mess of tomfoolery and nonsense.

A breathtaking new picture just released by NASA, from their Wide-field Infrared Survey Explorer. Launched on 14 December 2009, WISE finished its testing phase on 14 January 2010. In the two months since then it has already produced some spectacular data, and more than a few gorgeous images.

WISE's Cosmic Rosebud

A new infrared image from NASA’s Wide-field Infrared Survey Explorer, or WISE, shows a cosmic rosebud blossoming with new stars. The stars, called the Berkeley 59 cluster, are the blue dots to the right of the image center. They are ripening out of the dust cloud from which they formed, and at just a few million years old, are young on stellar time scales.

The rosebud-like red glow surrounding the hot, young stars is warm dust heated by the stars. Green “leafy” nebulosity enfolds the cluster, showing the edges of the dense, dusty cloud. This green material is from heated polycyclic aromatic hydrocarbons, molecules that can be found on Earth in barbecue pits, exhaust pipes and other places where combustion has occurred.

Red sources within the green nebula indicate a second generation of stars forming at the surface of the natal cloud, possibly as a consequence of heating and compression from the younger stars. A supernova remnant associated with this region, called NGC 7822, indicates that a massive star has already exploded, blowing the cloud open in a “champagne flow” and leaving behind this floral remnant. Blue dots sprinkled throughout are foreground stars in our Milky Way galaxy.

Berkeley 59 and NGC 7822 are located in the constellation Cepheus at a distance of about 3,300 light-years from Earth.

Infrared light is color coded in this picture as follows: blue shows 3.4-micron light; cyan, 4.6-micron light; green, 12-micron light; and red, 22-micron light.
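For the curious, that wavelength-to-colour mapping is just a false-colour composite: each infrared band is assigned to a display channel. Here is a minimal sketch of the idea in Python, with random arrays standing in for the real WISE band images (the 0.5 weighting that splits the 4.6-micron "cyan" band between green and blue is my own arbitrary choice, not WISE's actual processing):

```python
import numpy as np

# Toy false-colour composite following the convention described above:
# 3.4 um -> blue, 4.6 um -> cyan (split between green and blue),
# 12 um -> green, 22 um -> red. Random arrays stand in for real images.
rng = np.random.default_rng(0)
bands = {um: rng.random((64, 64)) for um in (3.4, 4.6, 12.0, 22.0)}

rgb = np.zeros((64, 64, 3))
rgb[..., 0] = bands[22.0]                      # red   channel = 22 um
rgb[..., 1] = bands[12.0] + 0.5 * bands[4.6]   # green gets 12 um + half of cyan
rgb[..., 2] = bands[3.4] + 0.5 * bands[4.6]    # blue gets 3.4 um + half of cyan
rgb /= rgb.max()                               # normalise to [0, 1] for display
print(rgb.shape)  # (64, 64, 3)
```

Real pipelines also stretch and calibrate each band, but the channel assignment is the heart of why hot dust looks red and PAH emission looks green in these images.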

Image Credit: NASA/JPL-Caltech/WISE Team

Arkenor on June 22nd, 2009

I know I post too many faeces related stories. I can’t help it. We British are culturally hardwired to find poo hilarious, and thus interesting:

The whiskers, eyes, organs, and even genitals of tigers, and other big cats, are highly sought after for many medical and religious practices. These practices, along with widespread habitat destruction, have placed tigers under ever-increasing pressure, and it is more important than ever that we get an idea of how many are left in the wild. The difficulty with that is that they are very shy of humans, and exceptionally good at not being seen, so unless they are unwell, or feeling particularly rumbustious, you’re not going to spot enough to get a decent statistical sample. Camera traps, while useful, are a bit of a shot in the dark for creatures whose territory can cover up to 200 square miles.

Luckily, these chaps have come up with a plan!

The Wildlife Conservation Society (WCS) announced today a major breakthrough in the science of saving tigers: high-tech DNA fecal sampling.

According to the study, researchers will be able to accurately count and assess tiger populations by identifying individual animals from the unique DNA signature found in their dung. In the past, DNA was collected from blood or tissue samples from tigers that were darted and sedated. The authors say this new non-invasive technique represents a powerful new tool for measuring the success of future conservation efforts.

The study appears in the June 16th edition of the journal Biological Conservation. Authors of the study include: Samrat Mondol of the National Centre for Biological Sciences; K. Ullas Karanth, N. Samba Kumar, and Arjun M. Gopalaswamy of the Wildlife Conservation Society and Centre for Wildlife Studies; and Anish Andheria and Uma Ramakrishnan, also of the National Centre for Biological Sciences.

“This study is a breakthrough in the science of counting tiger numbers, which is a key yardstick for measuring conservation success,” said noted tiger scientist Dr. Ullas Karanth of the Wildlife Conservation Society. “The technique will allow researchers to establish baseline numbers on tiger populations in places where they have never been able to accurately count them before.”

Collecting tiger poo allows scientists to estimate the population size.

The study took place in India’s Bandipur Reserve in Karnataka, a long-term WCS research site in the Western Ghats that supports a high abundance of tigers. Researchers collected 58 tiger scats following rigorous protocols, then identified individual animals through their DNA. Tiger populations were then estimated using sophisticated computer models. These results were validated against camera trap data, where individual tigers are photographed automatically and identified by their unique stripe pattern. Camera-trapping is considered the gold standard in tiger population estimation, but is impractical in several areas where tiger densities are low or field conditions too rugged.

“We see genetic sampling as a valuable additional tool for estimating tiger abundance in places like the Russian Far East, Sunderban mangrove swamps and dense rainforests of Southeast Asia where camera trapping might be impractical due to various environmental and logistical constraints,” said Karanth.

This is a promising technique for counting the remaining populations of many other species which are too elusive to get a handle on by normal means.
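For anyone wondering how you get from a pile of genotyped scats to a population estimate: the underlying idea is mark-recapture, with the DNA fingerprint serving as the "mark". The study itself used far more sophisticated models, but the simplest version, the Lincoln-Petersen estimator in Chapman's bias-corrected form, can be sketched in a few lines (the tiger IDs and counts below are made up):

```python
# Chapman's bias-corrected Lincoln-Petersen estimator: survey twice,
# identify each individual by its DNA fingerprint, and estimate the
# total population from how many individuals were seen both times.

def chapman_estimate(first, second):
    """first, second: sets of individual IDs seen on each survey occasion."""
    n1 = len(first)           # individuals identified in survey 1
    n2 = len(second)          # individuals identified in survey 2
    m = len(first & second)   # "recaptures": individuals seen both times
    # Chapman's form avoids division by zero when there are no recaptures
    return (n1 + 1) * (n2 + 1) / (m + 1) - 1

# Toy example with invented tiger IDs from scat genotyping:
survey1 = {"T01", "T02", "T03", "T04", "T05", "T06"}
survey2 = {"T04", "T05", "T06", "T07", "T08"}
print(chapman_estimate(survey1, survey2))  # (7 * 6) / 4 - 1 = 9.5
```

The fewer recaptures relative to the survey sizes, the larger the estimated population, which is why individual identification has to be reliable: a genotyping error that splits one tiger into two inflates the estimate.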

If you are interested in helping these brave and strong-stomached folks, you can donate towards their cause over at


Arkenor on April 12th, 2009

This discovery is a couple of years old, but I’ve been thinking a lot recently about how humanity can improve itself (or as some folks call it “transhumanism”).

Scientists from the University of Sheffield are developing an artificial ‘plastic blood’, which could act as a substitute for real blood in emergency situations. The ‘plastic blood’, which will be on display at the Science Museum this month, could have a huge impact on military applications.

Because the artificial blood is made from a plastic, it is light to carry and easy to store. Doctors could store the substitute as a thick paste in a blood bag and then dissolve it in water just before giving it to patients – meaning it’s easier to transport than liquid blood.

Donated blood has a relatively short shelf-life of 35 days, after which it must be thrown away. It also needs refrigeration, whereas the ‘plastic blood’ will be storable for many more days and is stable at room temperature.

The artificial blood is made of plastic molecules that hold an iron atom at their core, just like haemoglobin, that can bind oxygen and could transport it around the body. The small plastic molecules join together in a tree-like branching structure, with a size and shape very similar to that of natural haemoglobin molecules. This creates the right environment for the iron to bind oxygen in the lungs and release it in the body.

While the product is still in development, the scientists hope these properties will make it particularly useful for military applications; being plastic, it is also affordable. The scientists are now seeking further funding to develop a final prototype that would be suitable for biological testing.

Dr Lance Twyman, from the Department of Chemistry at the University of Sheffield, who has been developing the artificial blood for the last five years, said: “We are very excited about the potential for this product and about the fact that this could save lives. Many people die from superficial wounds when they are trapped in an accident or are injured on the battlefield and can’t get blood before they get to hospital. This product can be stored a lot more easily than blood, meaning large quantities could be carried easily by ambulances and the armed forces.”
– Sheffield University

This discovery is clearly of immense use in reducing the reliance on blood donation, which requires a high degree of testing and processing to ensure that no pathogens are transmitted. The just-add-water system, low cost, and long shelf life will make it easy to keep a supply almost anywhere where it might conceivably be of use. That’s fantastic, but might it also be able to bring the idea of transhumans one step closer?

If these oxygen-carrying plastic molecules were injected into the blood of a normal healthy individual, who has not suffered blood-loss, what would be the result?

It seems reasonable to think that the oxygen carrying capacity of their blood would be increased. This would have the effect of increasing endurance, but has also been linked to increased brain-function. Essentially then, it would make you tougher and smarter, which sounds like a pretty good deal to me if it’s cheap and has no horrible side-effects.
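A rough way to see why extra carriers would help: the standard textbook formula for arterial oxygen content is CaO2 = 1.34 × Hb × SaO2 + 0.003 × PaO2 (mL of oxygen per dL of blood), which is roughly linear in the amount of functional carrier. Here is a quick sketch treating a hypothetical plastic carrier as extra "haemoglobin"; the 3 g/dL boost is an invented figure purely for illustration, not anything from the Sheffield work:

```python
# Arterial oxygen content, the standard clinical formula:
#   CaO2 = 1.34 * Hb * SaO2 + 0.003 * PaO2   (mL O2 per dL blood)
# The first term is haemoglobin-bound oxygen; the second, tiny term
# is oxygen dissolved directly in plasma.

def arterial_o2_content(hb_g_dl, sao2=0.98, pao2_mmhg=95):
    """hb_g_dl: grams of carrier per dL; sao2: saturation fraction."""
    return 1.34 * hb_g_dl * sao2 + 0.003 * pao2_mmhg

normal = arterial_o2_content(15.0)         # typical adult haemoglobin
boosted = arterial_o2_content(15.0 + 3.0)  # + hypothetical plastic carrier
print(round(normal, 1), round(boosted, 1))  # 20.0 23.9
```

Even a modest addition of carrier raises capacity by a meaningful fraction, which is exactly the margin blood dopers have always been chasing.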

Athletes have been making use of the benefits of increased oxygen supply for some time, of course. Traditionally though, they have just injected regular blood, which is not terribly convenient, and the benefits are short-lived (though more than long enough to win a race):

To implement this form of doping, athletes collect and store several units of blood—their own or someone else’s—in the months prior to competition and then transfuse it back into themselves just prior to the event. One well-known instance of this practice occurred at the 1968 Olympic games in Mexico when an athlete broke the outdoor one-hour cycling record. He was accompanied to the games by two cardiologists and eight young men with blood types compatible with his own. – Illumin

Fairly ghoulish, and not available to most people. Thank goodness. Playing blood doll to a vampiric athlete must count as one of the worst jobs out there.

The other, more modern, method involves the drug erythropoietin, or EPO, which stimulates the bone marrow to produce extra red blood cells. As well as increasing athletic performance, EPO can also cause blood clots and seizures. Both methods are banned by sporting organisations.

Increasing the carrying capacity of blood might also be of use during pregnancy to help avoid certain complications that can arise in the unborn child. It might also reduce the (rare) complications that can arise when anaesthesia is used during surgery.

One potential downside to this would depend upon how quickly these artificial molecules are removed or metabolised. It is possible that they aren’t removed, and will happily bob around in your blood for years, or they might be filtered out in a matter of days. Having to get a weekly injection would be a big turn-off for most folks, though clearly not some professional athletes, who as recent history has shown are often more than happy to have mysterious substances shot into their veins on a daily basis.

If the effects are long-lived, the process is cheap enough for us commoners, and lacks unpleasant side-effects, then it would seem to be a good candidate for improving ourselves beyond our current capabilities. Even if this particular method turns out to be unsuitable for this purpose, it seems likely that a suitable technique to achieve increased blood oxygen will eventually be discovered.

Would you be up for something like that, or does it curdle your boringly unenhanced blood with horror?


Arkenor on September 8th, 2008

From a Cell press release:

Researchers have shown that they can create entirely new strains of infectious proteins known as prions in the laboratory by simply mixing infectious prions from one species with the normal prion proteins of another species. The findings are reported in the September 5th issue of the journal Cell, a Cell Press publication.

Prion diseases, also known as transmissible spongiform encephalopathies (TSEs), are infectious neurodegenerative diseases affecting the brain of several species of mammals including humans. Creutzfeldt-Jakob disease (CJD) is the most common prion disease in humans, along with scrapie in sheep, bovine spongiform encephalopathy (BSE, aka mad cow) in cattle, and chronic wasting disease (CWD) in deer and other cervids.

Unlike conventional infectious microorganisms, the infectious agent in the case of prion diseases consists exclusively of a misfolded form of the prion protein, earlier studies showed.

The researchers now find that prion strains produced by combining normal hamster proteins with infectious mouse proteins can infect hamsters and vice versa. Although they are both rodents, prions from one of the two species normally don’t readily infect the other, a common phenomenon amongst prions known as a species barrier, the researchers explained.

The novel prions they produced not only look different, but they also produce symptoms in the animals that differ from any known strain found in nature, they report.

“We are forcing the system by putting everything together, but this suggests that the variety of possible prions is really very large,” said Claudio Soto of the University of Texas Medical Branch. “We shouldn’t be surprised if new barriers are crossed and new prions arise. There is the potential for a large variety of new infectious prions—some of which may have dramatic effects.”

“The infectious agent is nothing like what we’re used to,” Soto said. “It’s just a protein with a different shape from the normal protein we all have.” Those misfolded and misshapen proteins can spread by causing normal proteins to change their shape. The aberrant forms band together, forming fibrils.

Soto’s team recently reported the generation of infectious prions by amplification of prion misfolding in the test tube. In those experiments, they used a technology called protein misfolding cyclic amplification (PMCA) that mimics some of the fundamental steps involved in the replication of infectious prions in living animals, but at an accelerated rate. The method involves placing small quantities of infectious prions with large quantities of the normal protein from the same species together, allowing the infectious form to imprint on the normal form and thereby replicate itself.
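The amplification step described above can be caricatured in a few lines of code: a small misfolded seed converts a fraction of the normal protein each cycle, so the infectious form grows exponentially until the normal protein runs out. The conversion rate and quantities below are invented, purely to illustrate the shape of the process, not measured PMCA kinetics:

```python
# Toy model of protein misfolding cyclic amplification (PMCA):
# each cycle, misfolded protein templates the conversion of normal
# protein, roughly mimicking rounds of incubation and sonication.
# The 50% per-cycle conversion rate is an arbitrary illustrative number.

def pmca_cycles(seed, normal, cycles, conversion=0.5):
    misfolded = seed
    for _ in range(cycles):
        # each misfolded unit can convert `conversion` normal units,
        # limited by how much normal protein remains
        converted = min(normal, misfolded * conversion)
        misfolded += converted
        normal -= converted
    return misfolded, normal

# A tiny seed overwhelms a thousand-fold excess of normal protein:
misfolded, normal = pmca_cycles(seed=1.0, normal=1000.0, cycles=20)
print(misfolded, normal)  # 1001.0 0.0
```

The exponential growth is the point: it is why a vanishingly small amount of infectious material can be detected (or, in the cross-species experiments, why a new strain can emerge from a trace seed).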

Now, they show that the same method can generate new strains when infectious prions from one species are mixed with normal prion proteins from another species. The finding provides conclusive evidence that the imprinting of disease-causing prions on normal forms can overcome species barriers, and doesn’t require any other infectious agent.

This new insight has profound implications for public health, according to the researchers.

“One of the scariest medical problems of the last decades has been the emergence of a new and fatal human prion disease – variant CJD – originated by cross-species transmission of BSE from cattle,” the researchers said. BSE has also spread to other animals, including exotic cats, other primates and domestic cats, after they ate feed derived from diseased cows.

The new method might provide insight into the risk that other prion diseases could spread from one species to another, Soto said. For instance, scientists don’t know whether chronic wasting disease, a condition now on the rise amongst deer in some parts of the U.S., can be transmitted to humans or not.

Test tube studies like this one might help answer that question, and, in the case that the deer prions can make the leap, such studies may inform scientists about what those prions might look like, he said. By studying any new prion strains created in mice with the human prion protein, scientists might also gain insight into the potential symptoms associated with those diseases.

“The data demonstrate that PMCA is a valuable tool for the investigation of the strength of the barrier between diverse species, its molecular determinants, and the expected features of the new infectious material produced,” the researchers concluded. “Finally, our findings suggest that the universe of possible prions is not restricted to those currently known but that likely many unique infectious foldings of the prion protein may be produced and that one of the sources for this is cross-species transmission.”

This research makes it extremely clear that there is an inherent risk in feeding one species to another. As prions are believed to be able to cross the blood-brain barrier, if you eat an animal that has infectious prions, even if those prions are not capable in and of themselves of infecting you, there is a risk that they will interact with your own normal prion proteins to create a whole new variety of infectious prion that CAN harm you. Perhaps this is how Creutzfeldt-Jakob disease originated.

The sensible thing to do is to take great care in ensuring our food animals are free of prion disease, all the way down their food chain. Cattle feed is often surprisingly high in animal protein. Feeding animal protein to herbivores seems to me to be an entirely unnecessary risk factor in disease transmission. Equally, just because a prion disease is not known to be transferable in the traditional sense, does not mean that we should be eating infected deer or rabbits.

The North Carolina Department of Agriculture has a handy guide for removing all the parts of a deer that are especially high in prions.

What parts of a deer could kill you.

Oh dear. There’s rather a lot of them.


Arkenor on May 7th, 2008

Durham University is predicting that the UK, already famous for being a rather damp place, is likely to get moister still in the coming years.

Expert predicts ‘Monsoon Britain’

Prepare for more floods – in ways we are not used to – that’s the message from experts at Durham University who have studied rainfall and river flow patterns over 250 years.

Last summer was the second wettest on record and experts say we must prepare for worse to come.

Professor Stuart Lane, from Durham University’s new Institute of Hazard and Risk, says that after about 30 to 40 less eventful years, we seem to be entering a ‘flood-rich’ period. More flooding is likely over a number of decades.

Prof. Lane, who publishes his research in the current edition of the academic journal Geography, set out to examine the wet summer of 2007 in the light of climate change. His work shows that some of the links made between the summer 2007 floods and climate change were wrong. Current predictions of climate change for summer point to weather patterns that are the exact opposite of what actually happened in 2007.

The British summer is a product of the UK’s weather conveyor belt and the progress of the Circumpolar Vortex or ‘jet stream’. This determines whether we have high or low pressure systems over the UK. Usually the jet stream weakens and moves northwards during spring and into summer. This move signals the change from our winter-spring cyclonic weather to more stable weather during the summer. High pressure systems extend from the south allowing warm air to give us our British summer.

In 2007, the jet stream stayed well south of its normal position for June and July, causing low pressure systems to track over the UK, becoming slow moving as they did so. This has happened in summer before, but not to the same degree. Prof. Lane shows that the British summer can often be very wet – about ten per cent of summers are wetter than a normal winter. What we don’t know is whether climate change will make this happen more in the future.

However, in looking at longer rainfall and river flow records, Prof. Lane shows that we have forgotten just how normal flooding in the UK is. He looked at seasonal rainfall and river flow patterns dating back to 1853 which suggest fluctuations between very wet and very dry periods, each lasting for a few years at a time, but also very long periods of a few decades that can be particularly wet or particularly dry.

In terms of river flooding, the period from the early 1960s until the late 1990s appears to be relatively flood free, especially when compared with some periods in the late 19th and early 20th centuries. As a result of analysing rainfall and river flow patterns, Prof. Lane believes that the UK is entering a flood-rich period of a kind we haven’t seen for a number of decades.

Boscastle, Cornwall, has been repeatedly flooded in recent years.

He said: “We entered a generally flood-poor period in the 1960s, earlier in some parts of the country, later in others. This does not mean there was no flooding, just that there was much less than before the 1960s and what we are seeing now. This has lowered our own awareness of flood risk in the UK. This has made it easier to go on building on floodplains. It has also helped us to believe that we can manage flooding without too much cost, simply because there was not that much flooding to manage.”

He added: “We have also not been good at recognising just how flood-prone we can be. More than three-quarters of our flood records start in the flood-poor period that begins in the 1960s. This matters because we set our flood protection in terms of return periods – the average number of years between floods of a given size. We have probably under-estimated the frequency of flooding, which is now happening, as it did before the 1960s, much more often than we are used to.

“The problem is that many of our decisions over what development to allow and what defences to build rely upon a good estimate of these return periods. The government estimates that 2.1 million properties and 5 million people are at risk of flooding. In his review of the summer floods Sir Michael Pitt was wise to say that flooding should be given the same priority as terrorism.”

Professor Lane concluded: “We are now having to learn to live with levels of flooding that are beyond most people’s living memory, something that most of us have forgotten how to do.”
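The return-period bookkeeping Prof. Lane describes is simple enough to sketch. A common empirical estimate ranks the annual maximum flows and assigns each the Weibull plotting position, T = (n + 1) / rank. His point about short records falls straight out of it: a record that begins in a flood-poor period simply never sees the big floods, so it assigns them reassuringly long return periods. The flow numbers here are invented:

```python
# Empirical return periods from a record of annual-maximum river flows,
# using the Weibull plotting position T = (n + 1) / rank, where rank 1
# is the largest flood on record.

def return_periods(annual_maxima):
    """Map each annual-maximum flow to its empirical return period (years)."""
    n = len(annual_maxima)
    ranked = sorted(annual_maxima, reverse=True)
    return {flow: (n + 1) / rank
            for rank, flow in enumerate(ranked, start=1)}

# Invented annual peak flows (m^3/s) over a 9-year record:
flows = [120, 95, 140, 80, 200, 110, 90, 130, 100]
T = return_periods(flows)
print(T[200])  # largest flood on record: (9 + 1) / 1 = 10.0 years
print(T[80])   # smallest: 10 / 9, a little over one year
```

With only nine years of data, the biggest flood can never look rarer than a 10-year event; a flood that actually recurs every 50 years is invisible until it happens, which is exactly the trap a flood-poor record sets for planners.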

Flooding is one of the issues covered by the Institute of Hazard and Risk Research at Durham University, where Prof. Lane is a resident expert. The IHRR, which launches this week, is a new and unique interdisciplinary research institute committed to delivering fundamental research on hazards and risks and to harnessing this knowledge to inform global policy. It aims to improve human responses to both age-old hazards such as volcanoes, earthquakes, landslides and floods as well as the new and uncertain risks of climate change, surveillance, terror and emerging technologies.


Arkenor on May 2nd, 2008

Cerro Paranal, home of ESO’s Very Large Telescope, is certainly one of the best astronomical sites on the planet. Stunning images, obtained by ESO staff at Paranal, of the green and blue flashes, as well as of the so-called ‘Gegenschein’, are real cases in point.

The Earth’s atmosphere is a gigantic prism that disperses sunlight. In the most ideal atmospheric conditions, such as those found regularly above Cerro Paranal, this will lead to the appearance of so-called green and blue flashes at sunset. The phenomenon is so popular on the site that it is now the tradition for the Paranal staff to gather daily on the telescope platform to observe the sunset and its possible green flash before starting their long night of observations.

The green and blue flashes are fleeting events that require an unobstructed view of the setting Sun, and a very stable atmosphere. These conditions are very often met at Paranal, a 2635m high mountain in the Chilean Atacama Desert, where the sky is cloudless more than 300 days a year. Paranal is home of ESO’s Very Large Telescope, an ensemble of four 8.2-m telescopes and four 1.8-m Auxiliary Telescopes that together form the world’s most advanced optical telescope.

ESO staff member Guillaume Blanchard was able to capture the rather rare blue flash while observing the sunset on Christmas Eve. The very intense blue seen in the image shows the reality of the phenomenon.

ESO staff Stéphane Guisard has been chasing green flashes for many years and has been able to capture them on many occasions. The picture shown above is one of many examples from his collection. “The most challenging is to capture the green flash while still seeing the rest of the Sun with all its colours,” says Guisard.
His colleague Guillaume Blanchard was even luckier. On Christmas Eve, as he was one of the few to follow the tradition of looking at the sunset, he had the chance to immortalise a blue flash using his hobby telescope.

ESO astronomer Yuri Beletsky also likes to take photographs from Paranal, but he prefers the night views. This allows him to make use of the unique conditions above the site to make stunning images. On some of these, he has captured other extremely interesting effects related to the Sun: the so-called Zodiacal light and the ‘Gegenschein’.

Photo of the morning sky above the Paranal Residencia taken by ESO astronomer Yuri Beletsky. The Milky Way is nicely seen along with its numerous dark dust lanes and amazing nebulae. The Zodiacal light - sunlight reflected by interplanetary dust - is clearly visible as the band of light that is inclined with respect to the Milky Way by about 40-50 degrees. The planet Venus is also visible in this photo, just above the Residencia.

Both the Zodiacal light and the Gegenschein (which is German for “counter shine”) are due to sunlight reflected by interplanetary dust. These are so faint that they are only visible in places free from light pollution.

Most of the interplanetary dust in the Solar System lies in the ecliptic, the plane close to which the planets are moving around the Sun, and the Zodiacal light and Gegenschein are thus seen in the region centred around the ecliptic. While the Zodiacal light is seen in the vicinity of the Sun, the Gegenschein is seen in the direction opposite to the Sun.

Each of the small particles of dust, left over from comets and asteroids, acts as a small Moon reflecting the light coming from our host star. “If you could see the individual dust particles then you would see the ones in the middle of the Gegenschein looking like very tiny full moons, while the ones hidden in the faint part of the dust band would look like tiny crescent moons,” explains ESO astronomer Colin Snodgrass. “But even the VLT cannot see such tiny individual dust particles out in space. Instead we see the combined effect, in photos like these, of millions of tiny dust particles reflecting light back to us from the Sun.”

You can see more of the discoveries being made at the ESO at


Arkenor on January 19th, 2008

GENEVA (AFP) – Forget retail therapy for some relief from that winter cold — a study by Swiss scientists revealed on Wednesday that the flu virus can nestle and survive on banknotes for more than two weeks.

Scientists from Geneva’s University Hospital were asked by a Swiss bank to carry out the study amid worries that a flu pandemic could be prolonged thanks to the millions of bank notes in circulation, Le Temps newspaper reported.

Between 20 and 100 million banknotes change hands in Switzerland alone each day, it said.

The researchers left small samples of the flu virus on used banknotes which were then left at room temperature. Although the virus only survived in most cases for a few hours, certain highly concentrated samples proved resistant for several days.

In the worst case, if the virus was mixed with human mucus on the banknote, it could survive for two and a half weeks, Le Temps said.

“This unexpected resilience of the virus suggests that this sort of inert, non-biological support should not be overlooked in pandemic planning,” chief researcher Yves Thomas told the paper.

The team will now do further research to see how much of a factor banknotes might be in flu transmission, though Thomas stressed that the main risks remain airborne transmission and direct human contact.

Turns out green really IS the colour of money!

Pretty nasty to think about. Influenza virus can survive outside of the body long enough to make any number of objects potential carriers. Door handles, keyboards, newspapers, ewww. All the more reason to wash hands regularly, especially before meals.
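If you want to play with the numbers, a simple (and simplistic) way to model surface survival is exponential decay with a half-life. The half-lives below are invented to mirror the study's qualitative finding that mucus greatly extends survival; they are not measured values from the Geneva work:

```python
# Toy exponential-decay model of virus viability on a surface:
#   N(t) = N0 * 2 ** (-t / half_life)
# The half-lives are illustrative guesses, chosen only to show how
# dramatically a protective medium (mucus) changes the picture.

def viable_fraction(hours, half_life_h):
    """Fraction of the initial virus still viable after `hours`."""
    return 2 ** (-hours / half_life_h)

bare_note = viable_fraction(24, half_life_h=2)     # a day on a bare note
with_mucus = viable_fraction(24, half_life_h=48)   # a day, protected by mucus
print(f"{bare_note:.2e} {with_mucus:.2f}")  # 2.44e-04 0.71
```

A short half-life means the virus is effectively gone within a day, while a long one leaves most of it viable, which is how "a few hours" and "two and a half weeks" can both be true of the same virus in different conditions.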


Arkenor on December 16th, 2007

Just a couple of quickies today:

Stanford, CA — Carbon emissions from human activities are not just heating up the globe, they are changing the ocean’s chemistry. This could soon be fatal to coral reefs, which are havens for marine biodiversity and underpin the economies of many coastal communities. Scientists from the Carnegie Institution’s Department of Global Ecology have calculated that if current carbon dioxide emission trends continue, by mid-century 98% of present-day reef habitats will be bathed in water too acidic for reef growth. Among the first victims will be Australia’s Great Barrier Reef, the world’s largest organic structure.

Chemical oceanographers Ken Caldeira and Long Cao are presenting their results in a multi-author paper in the December 14 issue of Science and at the annual meeting of the American Geophysical Union in San Francisco on the same date. The work is based on computer simulations of ocean chemistry under levels of atmospheric CO2 ranging from 280 parts per million (pre-industrial levels) to 5000 ppm. Present levels are 380 ppm and rapidly rising due to accelerating emissions from human activities, primarily the burning of fossil fuels.

“About a third of the carbon dioxide put into the atmosphere is absorbed by the oceans,” says Caldeira, “which helps slow greenhouse warming, but is a major pollutant of the oceans.” The absorbed CO2 produces carbonic acid, the same acid that gives soft drinks their fizz, making certain minerals called carbonate minerals dissolve more readily in seawater. This is especially true for aragonite, the mineral used by corals and many other marine organisms to grow their skeletons.

“Before the industrial revolution, over 98% of warm water coral reefs were bathed with open ocean waters 3.5 times supersaturated with aragonite, meaning that corals could easily extract it to build reefs,” says Cao. “But if atmospheric CO2 stabilizes at 550 ppm — and even that would take concerted international effort to achieve — no existing coral reef will remain in such an environment.” The chemical changes will impact some regions sooner than others. At greatest risk are the Great Barrier Reef and the Caribbean Sea.

Carbon dioxide’s chemical effects on the ocean are largely independent of its effects on climate, so measures to mitigate warming short of reducing emissions will be of little help in slowing acidification, the researchers say. In fact, impending chemical changes may require emissions cuts even more drastic than those for climate alone.

“These changes come at a time when reefs are already stressed by climate change, overfishing, and other types of pollution,” says Caldeira, “so unless we take action soon there is a very real possibility that coral reefs — and everything that depends on them —will not survive this century.”

Aragonite Saturation

Coral reefs are vital to the ocean’s biodiversity. Their loss would be catastrophic for the entire oceanic ecosystem, as many open ocean species use their relative calm and safety to breed.
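For the numerically inclined, the "supersaturated with aragonite" figure in the release refers to the aragonite saturation state, Ω = [Ca²⁺][CO₃²⁻] / Ksp: above 1 and corals can extract aragonite easily, below 1 and it tends to dissolve. Here is a back-of-the-envelope sketch with rough, illustrative values, not a real carbonate chemistry calculation:

```python
# Aragonite saturation state: Omega = [Ca2+][CO3 2-] / Ksp.
# Omega > 1: seawater is supersaturated, corals build skeletons easily.
# Omega < 1: aragonite tends to dissolve.
# All numbers below are rough, illustrative surface-seawater values.

def aragonite_omega(ca, co3, ksp):
    """ca, co3 in mol/kg; ksp in (mol/kg)^2."""
    return (ca * co3) / ksp

KSP_ARAGONITE = 6.5e-7   # rough stoichiometric solubility product
CA = 0.0103              # calcium is nearly constant in seawater

# Carbonate ion falls as absorbed CO2 forms carbonic acid, which is
# the lever by which rising atmospheric CO2 lowers Omega:
preindustrial_co3 = 2.2e-4
print(aragonite_omega(CA, preindustrial_co3, KSP_ARAGONITE))  # ~3.5
high_co2_co3 = 6.0e-5
print(aragonite_omega(CA, high_co2_co3, KSP_ARAGONITE))       # < 1
```

Since calcium barely changes, the whole effect rides on the carbonate ion concentration, which is why the chemical damage is largely independent of the warming itself.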
