News


Abstract

Obama talks climate change. How air conditioners make cities hotter. The search for habitable exoplanets. And why data analysis is at the heart of a debate over the birth of the universe.

Obama and the “climate deniers club”

US president Barack Obama has said that denying climate change is like arguing that the moon is made of cheese. Congress is “full of folks who stubbornly and automatically reject the scientific evidence”. Some, he said, claim that climate change is a hoax or fad, while others avoid the question. “They say, ‘Hey, look, I'm not a scientist.'” Obama went on: “I'll translate that for you: what that really means is, ‘I know that man-made climate change really is happening but if I admit it, I'll be run out of town by a radical fringe that thinks climate science is a liberal plot.'”

Obama was addressing graduates at the University of California two weeks after he announced a contentious plan to dramatically cut pollution from power plants. He also announced a $1bn fund for communities to rebuild and prepare for the impact of extreme weather. Climate change was a “big thing”, he said, in a political system consumed by “small things”; but Americans should not be afraid to address it despite the type of opposition he faces from Congress.

“It's pretty rare that you'll encounter somebody who says the problem you're trying to solve simply doesn't exist. When President Kennedy set us on a course to the moon, there were a number of people who made a serious case that it wouldn't be worth it. But nobody ignored the science. I don't remember anybody saying the moon wasn't there or that it was made of cheese,” Obama said.

His firm endorsement of the evidence for climate change, and his call for action on it, contrast sharply with the views and actions of the current Canadian and Australian prime ministers. “What I am supportive of is, frankly, not ratifying the Kyoto agreement and not implementing it”, said Stephen Harper, Canada's prime minister, back in 2002, describing it as “a socialist scheme to suck money out of wealth-producing nations” based on “tentative and contradictory scientific evidence”. More recently, in June this year, he met Australian prime minister Tony Abbott, who, in 2009, described the science of human-caused climate change as “crap”. Climate change is “not the only or even the most important problem that the world faces”, said Abbott at their meeting, dismissing the need to do more in response to the new US plan.

Australian critics have accused the pair of “creating a conservative climate deniers club”.

Hard to deny?

A new approach to analysing climate data has found that the odds that global warming is due to natural factors are – as a headline put it – “slim to none”. More technically, the natural-warming hypothesis may be ruled out “with confidence levels greater than 99%, and most likely greater than 99.9%”.

The analysis does not use climate models, with all their complexities and uncertainties; instead it analyses historical data. It takes air temperatures since 1500 and sets them against carbon dioxide emissions from the burning of fossil fuels in the industrial era, from around 1880 onwards. “CO2 emissions as a proxy for all man-made climate influences is a simplification justified by the tight relationship between global economic activity and greenhouse gas emission and particulate pollution”, says the study's author, Shaun Lovejoy of McGill University. “This allows the new approach to implicitly include the cooling effects of particulate pollution that are still poorly quantified in computer models.”


Lovejoy uses multiproxy climate reconstructions based on gauges found in nature, such as tree rings, ice cores, and lake sediments. Fluctuation-analysis techniques make it possible to understand the temperature variations over wide ranges of time scales. The study predicts, with 95% confidence, that a doubling of carbon dioxide levels in the atmosphere would cause the climate to warm by between 2.5 and 4.2°C. That range is more precise than – but in line with – recent predictions by the Intergovernmental Panel on Climate Change (IPCC) that temperatures would rise by 1.5–4.5°C if CO2 concentrations double. The study complements and confirms the IPCC predictions, but is independent of them.
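In spirit, treating CO2 as a proxy amounts to regressing temperature against the logarithm of CO2 concentration, with the slope giving the warming per doubling and the residuals standing in for natural variability. Here is a minimal sketch of that idea in Python, using invented illustrative data – the paper's actual method is a scaling fluctuation analysis of real reconstructions, not this simple regression:

```python
# Sketch of the CO2-as-proxy idea: regress temperature anomaly on
# log2(CO2 / CO2_preindustrial); the slope is warming per CO2 doubling.
# All numbers below are invented for illustration.
import numpy as np
from scipy import stats

years = np.arange(1880, 2014)
co2 = 280.0 + 0.006 * (years - 1880) ** 2       # toy accelerating rise, ppm
rng = np.random.default_rng(42)
temp = 2.3 * np.log2(co2 / 280.0) + rng.normal(0, 0.1, years.size)

x = np.log2(co2 / 280.0)                        # number of CO2 doublings
fit = stats.linregress(x, temp)
print(f"sensitivity: {fit.slope:.2f} +/- {1.96 * fit.stderr:.2f} C per doubling")

# The residuals estimate natural variability; the paper tests whether
# fluctuations of that size could plausibly produce the observed warming.
residuals = temp - (fit.intercept + fit.slope * x)
print(f"natural variability (residual std): {residuals.std():.2f} C")
```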

“We've had a fluctuation in average temperature that's just huge since 1880 – on the order of about 0.9 degrees Celsius”, Lovejoy says. “This study will be a blow to any remaining climate-change deniers. Their two most convincing arguments – that the warming is natural in origin, and that the computer models are wrong – are either directly contradicted by this analysis, or simply do not apply to it.”

Source: Lovejoy, S. (2014) Scaling fluctuation analysis and statistical hypothesis testing of anthropogenic warming. Climate Dynamics, 42, 2339–2351.

Big data: big disagreement

Big data contains thousands of patterns and relationships among vast numbers of variables that may be linked in all kinds of ways. Therein lies much of its value. But spotting those patterns is a major problem.

In 2011, a group of Harvard researchers published a highly influential paper in the journal Science that claimed to provide a statistical tool to do exactly that. A new paper demonstrates that the tool is critically flawed, and provides an alternative. “The [Harvard] tool does not have the mathematical properties that were claimed”, says Justin Kinney of Cold Spring Harbor Laboratory, co-author of the new paper. Instead, he and his colleagues have developed a tool which uses “mutual information” – a concept first described by Claude Shannon in 1948 as part of information theory. “What we've found in our work is that this same concept can also be used to find patterns in data”, Kinney explains.


Applied to big data, mutual information can reveal patterns in large lists of numbers. Importantly, it provides a way of identifying patterns of any type without relying on prior assumptions. “Our work shows that mutual information very naturally solves this critical problem in statistics”, says Kinney.
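To see why mutual information appeals here, consider a strong but non-linear pattern that ordinary correlation misses entirely. A minimal sketch using a simple histogram estimator – for illustration only, and not the estimator developed in Kinney's paper:

```python
# Mutual information is zero only when two variables are independent,
# so it can flag dependencies that linear correlation misses.
import numpy as np

def mutual_information(x, y, bins=20):
    """Estimate I(X;Y) in bits from a 2D histogram."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()                      # joint distribution
    px = pxy.sum(axis=1, keepdims=True)   # marginal of X
    py = pxy.sum(axis=0, keepdims=True)   # marginal of Y
    nz = pxy > 0                          # avoid log(0)
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 10_000)
y = x ** 2 + rng.normal(0, 0.05, x.size)  # strong pattern, but not linear

print(np.corrcoef(x, y)[0, 1])            # near zero: correlation misses it
print(mutual_information(x, y))           # clearly positive: MI detects it
```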

Source: Science Daily, http://www.sciencedaily.com/releases/2014/02/140218185128.htm

Hot city nights

Can't sleep? Too darned hot? Is even a sheet too much to bear on these sweltering nights? Summertime is hitting the northern hemisphere, and heat from baking city streets makes the nights long and wakeful. Perhaps you should turn up the air-con.

Or rather, don't. US researchers have found that heat from air-conditioning systems raises some nighttime urban temperatures by 1°C or more. Waste heat from air conditioners discharges from offices and apartment blocks straight into the city air and makes hot cities hotter.

Phoenix, Arizona, is their model city – a hot place anyway. The researchers took records of a 10-day period of hot weather in July 2009 and used computer models to analyse the difference air-conditioning systems made to temperatures outside.

The biggest demand for air conditioning is in the daytime, but daytime effects are small. At night, though, it is a different story.

Eighty-seven per cent of US households have air conditioning, and the US uses more electricity to keep cool than all other countries combined, though it is far from being the hottest country. Meanwhile, air conditioning in Phoenix can consume more than half the city's electricity needs.

And as the researchers point out, it is a vicious circle. The hotter the nights, the more people use air conditioning – making the nights hotter still. They suggest recycling the waste heat, perhaps to heat water, rather than discharging it. It could save Phoenix alone more than 1,200 megawatt-hours of electricity per day. And its citizens could sleep soundly – and coolly – knowing their fuel bills were smaller.

Source: Salamanca, F., Georgescu, M., Mahalov, A., Moustaoui, M. and Wang, M. (2014) Anthropogenic heating of the urban environment due to air conditioning. Journal of Geophysical Research: Atmospheres, doi: 10.1002/2013JD021225.

Life support systems

There are some 100 million places in the Milky Way that are capable of supporting life – life that is larger and more complex than microbes. New research by astronomers compares different indices of habitability of exoplanets – those that lie outside our solar system – to throw light on the matter. The authors claim it as the first quantitative assessment of the plausibility of complex life based on empirical data.

They developed the Earth Similarity Index (ESI), which rates how similar each newly discovered exoplanet is to Earth (in terms of radius, temperature and density). But they point out that life could exist in forms quite different from those on Earth – which means a planet's similarity to Earth may not be the most appropriate measure. They added a Planetary Habitability Index (PHI) as a second tier of analysis for the search for life on other worlds, which assesses planets based on the availability of energy sources, and the presence of liquids either in the atmosphere or above or below the surface.
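The published ESI combines per-property similarity terms, each scaled to lie between 0 and 1, into a geometric mean. A minimal sketch in Python, using equal weights and invented planet values purely for illustration (the published index assigns each property its own weight exponent):

```python
# Earth Similarity Index-style score: geometric mean of similarity
# terms (1 - |x - x0| / (x + x0)), one per planetary property.
# Equal weights and the candidate planet's values are illustrative only.
def esi(planet, earth, weights=None):
    n = len(planet)
    weights = weights or [1.0] * n
    score = 1.0
    for x, x0, w in zip(planet, earth, weights):
        score *= (1.0 - abs(x - x0) / (x + x0)) ** (w / n)
    return score

earth = (1.0, 288.0, 5.51)     # radius (Earth radii), temperature (K), density (g/cm3)
candidate = (2.4, 295.0, 5.2)  # hypothetical exoplanet
print(f"ESI = {esi(candidate, earth):.2f}")   # 1.0 would be an Earth twin
```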

But the evolution of complex life requires more, so they added a Biological Complexity Index (BCI) as a complement to the other two. It ranks planets on their ability to sustain complex biology, rather than their geophysical similarity to Earth. It includes whether the surface is solid or gas, its chemistry, and its age – whether it is old enough to allow time not only for the appearance of life but also for the evolution of complex and diverging life forms. Thus, while an exoplanet called HD 20794 has a surface temperature of an unbearable (to us) 155°C, a period of prolonged evolution – three times longer than life has had to evolve on Earth – could have adapted complex organisms to those extremes, or the planet could have passed through a cooler period during which complex life might have thrived.

All three metrics employ some overlapping attributes, but the predictive value of each is geared towards a different objective. The BCI looks for biospheres that allow whole ecosystems to develop, with many ecological niches and a range of life histories – a number of forms and a food chain or web.

There is a high correlation (r² = 0.43, p < 0.0001) between ESI and BCI. And one exoplanet, Gliese 581c, has a BCI that is actually higher than Earth's, at 1.95 compared to 1.88. Such “superhabitable” worlds, say the authors, would probably be larger, warmer, and older than Earth, and orbiting dwarf stars – and would therefore score low on the ESI (cf. Significance, October 2013).

The authors warn that the BCI does not represent a statistical probability of life but a tool for the relative assessment of conditions that could be compatible with life. Further, the great distances involved mean that estimates of the possibility of complex life will be more generous than a closer look might justify. Mars, for example, would score highly when viewed from several light-years away; our closer view tells us that, at present anyway, it cannot support complex life. With about 10bn stars in the Milky Way, the BCI yields 100m plausible planets. The evolution of complex life on other worlds would seem rare in frequency but large in absolute number.

Source: Irwin, L. N., Méndez, A., Fairén, A. G. and Schulze-Makuch, D. (2014) Assessing the possibility of biological complexity on other worlds, with an estimate of the occurrence of complex life in the Milky Way Galaxy. Challenges, 5, 159–174.


Dust devils in the details

When astrophysicists published the first images of gravitational waves in March, the result was hailed as a “smoking gun” for cosmic inflation theory. But a new statistical analysis suggests cosmic dust might be obscuring the true picture.

The researchers had been using the powerful BICEP2 telescope at the South Pole to study an area of space about two to ten times the width of the full moon, looking for tell-tale signs that our universe experienced a period of rapid, faster-than-light expansion a fraction of a second after the Big Bang.

Theorists predicted that this expansion would have set off waves of gravitational energy, leaving distinctive swirls in the cosmic microwave background radiation, the afterglow of the Big Bang – and it was these patterns that the BICEP2 team aimed to discover.

In particular, they were looking for a B-mode pattern, described by BICEP2 co-leader Chao-Lin Kuo as “a unique signature of gravitational waves” – and, according to a press release announcing the findings, the team were “surprised to detect a B-mode polarization signal considerably stronger than many cosmologists expected”.

But doubt has followed surprise, with several researchers since questioning whether the BICEP2 team properly accounted for cosmic dust when running their analysis.

In a recent paper¹, Raphael Flauger, a theorist from the Institute for Advanced Study, argued – with J. Colin Hill of New York University and David N. Spergel of Princeton – that the amount of polarized dust emission in that region of the sky is uncertain “and could potentially be large enough to account for the excess B-mode power seen by BICEP2”.


Indeed, Flauger et al. said that BICEP2 (and BICEP1) data were not able to discriminate between a cosmological signal and a null hypothesis model that used different estimates for the level of dust and other factors. “Thus, no strong cosmological inference can be drawn at this time,” they wrote.
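The statistical point is a familiar one: if a signal-plus-dust model and a dust-only null fit the measured band powers about equally well, the data cannot choose between them. A toy illustration with invented numbers – nothing like the real BICEP2 likelihood or data:

```python
# Toy model comparison: two hypotheses with similar goodness of fit.
import numpy as np

band_power = np.array([1.1, 1.4, 1.2, 0.9])       # invented measurements
sigma = np.array([0.3, 0.3, 0.4, 0.4])            # invented uncertainties
signal_plus_dust = np.array([1.0, 1.3, 1.1, 1.0])
dust_only = np.array([0.9, 1.2, 1.2, 1.1])        # more dust, no signal

def chi2(model):
    return float((((band_power - model) / sigma) ** 2).sum())

# A chi-squared difference near zero means the data cannot discriminate
# between the hypotheses -- the situation Flauger et al. describe.
print(chi2(signal_plus_dust), chi2(dust_only))
```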

The BICEP2 team are, however, standing by their findings – though they acknowledge in their paper² that new data from the Planck space observatory might suggest that dust contamination in their part of the sky is higher than previously thought.

That Facebook feeling

If your Facebook updates have been uncharacteristically upbeat lately, it might have something to do with your social connections. According to a recently published study¹, the emotional content of your Facebook news feed might be – ever so slightly – contagious.

Adam Kramer, a data scientist at Facebook, teamed up with researchers from the University of California and Cornell University to tweak the number of positive and negative status updates appearing in the news feeds of more than 310,000 Facebook users.

According to Kramer (http://bit.ly/AdamKramerPost), they wanted to explore whether “seeing friends post positive content leads to people feeling negative or left out” and whether “exposure to friends’ negativity might lead people to avoid visiting Facebook”.

What they found instead was that when negative news feed posts were reduced, Facebook users included fewer negative words and more positive words in their own status updates, while a decrease in positive news feed content was accompanied by a reduction in the use of positive words and an increase in negative words.

The study ran for a week in January 2012. Randomly selected users were split into two groups – one to test the effect of reduced positivity, the other to test for reduced negativity. Posts were deemed to be positive or negative if they contained at least one positive or negative word, as defined by a piece of text analysis software called Linguistic Inquiry and Word Count. Each experimental group had their own similarly sized control group.
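The classification rule itself is simple enough to sketch. A toy version in Python, with tiny stand-in word lists – LIWC's real dictionaries contain thousands of words:

```python
# Toy version of the study's rule: a post counts as positive (or
# negative) if it contains at least one word from the relevant list.
# These word lists are tiny stand-ins for LIWC's dictionaries.
POSITIVE = {"happy", "great", "love", "wonderful", "excited"}
NEGATIVE = {"sad", "awful", "hate", "terrible", "lonely"}

def classify(post):
    words = set(post.lower().replace("!", "").replace(".", "").split())
    labels = set()
    if words & POSITIVE:
        labels.add("positive")
    if words & NEGATIVE:
        labels.add("negative")
    return labels or {"neutral"}

print(classify("So excited for the weekend!"))   # {'positive'}
print(classify("Feeling lonely tonight."))       # {'negative'}
```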

The observed effects were small, with emotional content moving only a fraction of a percentage point in one direction or the other – and these movements were, in all cases, much less than one standard deviation.

But the external effect of the research has been huge – triggering a storm of debate about the ethics of big data studies such as these. The researchers did not tell study participants about the research because, they said, it was consistent with Facebook's Data Use Policy, which all users agree to when joining the social network. However, Elizabeth Berman, associate professor of sociology at the University at Albany, asks: “Does signing a user agreement when you create an account really constitute informed consent?” (http://bit.ly/BethBermanBlog).
