Friday, July 31, 2015

Life's Building-Block Chemicals Found on Comet by Lander

The Conversation

Scientists analysing the latest data from the Philae comet lander have discovered molecules that can form sugars and amino acids, the building blocks of life as we know it. While this is a long, long way from finding life itself, the data show that the organic compounds that eventually gave rise to organisms here on Earth existed in the early solar system.

The results are published as two independent papers in the journal Science, based on data from two different instruments on the Philae lander. One comes from the German-led Cometary Sampling and Composition (COSAC) team and one from the UK-led Ptolemy team.

The data finally shed light on questions that the European Space Agency posed 22 years ago. One of the declared goals of the Rosetta mission when it was approved in 1993 was to determine the composition of the material in the cometary nucleus. And now we have the answer, or at least, an answer: the compounds are a mixture of many different molecules. Water, carbon monoxide (CO) and carbon dioxide (CO2) are there – not too surprising, given that these molecules have long been detected around comets. But both COSAC and Ptolemy have found a very wide range of additional compounds, which is going to take a little effort to interpret.

At this stage, I should declare an interest: I am a co-investigator on the Ptolemy team – but not an author on the paper. The principal investigator of Ptolemy, and first author on the paper, is, however, my husband.

Having made this clear, I hope that readers will trust that I am not going to launch into a major diatribe against one set of data, or a paean of praise about the other. What I am going to do is look at the conclusions that the two teams have reached – because, although they made similar measurements at similar times, they have interpreted their data somewhat differently. This is not a criticism of the scientists, it is a reflection of the complexity of the data and the difficulties of disentangling mass spectra.

Deciphering the data

Both COSAC and Ptolemy can operate as gas chromatographs or mass spectrometers. In mass spectrometry mode, they identify chemicals in vaporised compounds by stripping the molecules of their electrons and measuring the mass and charge of the resulting ions (the mass-to-charge ratio, m/z). In gas-chromatography mode, they separate the mixture on the basis of how long it takes each component in the mixture to travel through a very long and thin column to an ionisation chamber and detector.

Either way, the result is a mass spectrum, showing how the mixture of compounds separated out into its individual components on the basis of the molecular mass relative to charge (m/z).

Unfortunately, the job doesn’t end there. If it were that simple, then organic chemists would be out of a job very quickly. Large molecules break down into smaller molecules, with characteristic fragmentation patterns depending on the bonds present in the original molecule. Ethane (C2H6), for example, has an m/z of 30, which was seen in the spectra. So the peak might be from ethane, or it might be from a bigger molecule which has broken down in the ionisation chamber to give ethane, plus other fragments.

Then again, it might be from CH2O, which is formaldehyde. Or it might be from the breakdown of polyoxymethylene. Or it might be from almost any of the many other species which have an m/z of 30. Figuring out what it is exactly is a tough job – and the main reason why I gave up organic chemistry after only a year: far too many compounds to study.

Of course, the teams didn’t identify every single peak in isolation, they considered the series of peaks which come from fragmentation. This helps a bit, in that there are now many more combinations of compounds and fractions of compounds which can be matched.
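The ambiguity described above can be sketched in a few lines of Python. This is purely a toy illustration, not either team’s actual analysis pipeline: it computes nominal masses from integer atomic masses and shows that several small molecules share m/z = 30.

```python
# Toy illustration of mass-spectrum ambiguity: several small molecules
# share the same nominal mass-to-charge ratio (m/z), so a single peak
# cannot identify a compound on its own.
# Integer ("nominal") atomic masses, as used for low-resolution spectra.
NOMINAL_MASS = {"H": 1, "C": 12, "N": 14, "O": 16}

def nominal_mz(formula):
    """Nominal mass of a neutral molecule given as {element: count}, charge z = 1."""
    return sum(NOMINAL_MASS[element] * count for element, count in formula.items())

candidates = {
    "ethane (C2H6)":       {"C": 2, "H": 6},
    "formaldehyde (CH2O)": {"C": 1, "H": 2, "O": 1},
    "nitric oxide (NO)":   {"N": 1, "O": 1},
}

for name, formula in candidates.items():
    print(f"{name}: m/z = {nominal_mz(formula)}")  # all three give m/z = 30
```

Because all three candidates land on the same peak, analysts must lean on the pattern of fragment peaks, as the two teams did, rather than on any single m/z value.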

So where does this leave us? Actually, with an embarrassment of riches. Have the teams come to the same conclusions? Sort of. They both detected compounds which are important in the pathway to producing sugars – which go on to form the more complex molecules of life. They also both note the very low number of sulphur-bearing species, which is interesting given the abundance of sulphur in the solar system, and the ease with which it can become integrated into organic compounds.

New images show Philae’s landing spots on the comet as it bounced around taking measurements.

The COSAC team suggests that nitrogen-bearing species could be relatively abundant, whilst Ptolemy found fewer of them. This is important because nitrogen is an essential element for life, and a fundamental component of amino acids. Conversely, the Ptolemy team has found lots of CO2, whilst COSAC hasn’t detected much.

These differences are probably related to sampling location: COSAC ingested material from the bottom of Philae, while Ptolemy sniffed at the top. Did Ptolemy breathe in cometary gases, whilst COSAC choked on the dust kicked up during the brief touchdown? If so, then the experiments have delivered wonderfully complementary sets of data.

Most importantly, both of those sets of data show that the ingredients for life were present in a body which formed in the earliest stages of solar system history. Comets act as messengers, delivering water and dust throughout the solar system – now we have learnt for certain that the ingredients for life have been sown far and wide through the 4.567 billion years of solar system history. The challenge now is to discover where else life might have taken root.

What else is certain is that both teams are keeping fingers crossed that the Philae-Rosetta communications link stabilises, so that they can get on with their analyses. This is just the start.

This article was originally published on The Conversation.

see also:

How the Brain Purges Bad Memories

A brain circuit has been found that allows us to forget fear and anxiety


A new study confirms that a working connection between the two brain regions is necessary to do away with fear.

The brain is extraordinarily good at alerting us to threats. Loud noises, noxious smells, approaching predators: they all send electrical impulses buzzing down our sensory neurons, pinging our brain’s fear circuitry and, in some cases, causing us to fight or flee. The brain is also adept at knowing when an initially threatening or startling stimulus turns out to be harmless or resolved. But sometimes this system fails and unpleasant associations stick around, a malfunction thought to be at the root of post-traumatic stress disorder (PTSD). New research has identified a neuronal circuit responsible for the brain’s ability to purge bad memories, findings that could have implications for treating PTSD and other anxiety disorders.

Like most emotions, fear is neurologically complicated. But previous work has consistently implicated two specific areas of the brain as contributing to and regulating fear responses. The amygdala, two small arcs of brain tissue deep beneath our temples, is involved in emotional reactions, and it flares with activity when we are scared. If a particular threat turns out to be harmless, a brain region behind the forehead called the prefrontal cortex steps in and the fright subsides. Our ability to extinguish painful memories is known to involve some sort of coordinated effort between the amygdala and the prefrontal cortex. The new study, led by Andrew Holmes at the National Institutes of Health, however, confirms that a working connection between the two brain regions is necessary to do away with fear.

Normally mice that repeatedly listen to a sound previously associated with a mild foot shock will learn that on its own the tone is harmless, and they will stop being afraid. Using optogenetics, a technology that controls specific neurons and animal behavior with light, the authors found that disrupting the amygdala–prefrontal cortex connection prevents mice from overcoming the negative association with the benign tone. In neurobiology speak, memory “extinction” fails to occur. They also found that the opposite is true: stimulating the circuit results in increased extinction of fearful memories.

Until now, investigators were unsure whether the amygdala–prefrontal cortex communication pathway could on its own control fear extinction; both structures interact with many other brain regions, so isolating the effects of this single pathway on behavior was a challenge. Optogenetics made the discovery possible, allowing the NIH group to precisely assess only the connection between the two brain regions in real time, providing a more accurate correlation between neuronal activity and behavior.

Holmes sees the amygdala and prefrontal cortex as two major hubs in a complex communications network. In the case of impaired fear extinction such as PTSD, however, it is just the one connection between the two regions that is faulty, not the hubs themselves. “To regulate fear extinction,” he explains, “I think it will be better to isolate and fix that one line of communication as opposed to trying to reengineer the hubs themselves—it’s their job to carry many lines of communication for all manner of brain functions, most of which are probably working just fine.”

Given the similarities in fear circuitry between rodents and humans, the new findings could inform research into new therapeutic approaches to anxiety disorders, including into medications that act on the fear circuit. Holmes believes that healthy fear extinction relies on “neural plasticity,” the brain’s ability to make new neuronal connections, which is in part influenced by the brain’s own cannabinoids, compounds that regulate neurotransmitters. Drugs that alter the cannabinoid system could provide a way to modify the fear circuit, thereby—possibly—alleviating anxiety.

Neurostimulation technologies, including transcranial magnetic stimulation and even optogenetics, could also potentially be used therapeutically to augment standard anxiety treatments. One such treatment is exposure therapy, in which patients are repeatedly exposed to a stimulus they find abnormally stressful until it no longer causes anxiety. Perhaps externally stimulating the fear circuit in combination with repeated recollections of a painful memory—or repeated exposures to a fear-inducing stimulus—might work together to ease the symptoms of PTSD and other anxiety disorders.

As Holmes points out, it is not unlike when your home Internet connection goes sluggish: “Rather than trying to fix the faulty wire on the telephone pole to help boost your signal—and disrupting many lines of communication—it’s better to just fix the faulty line of communication.”

see also:

What Science Says about Kids and Tech

Are touch screens rotting the brains of our youth? New research is finding some answers


In my previous column I noted that it’s typically older people who seem to disapprove of the younger ones' immersion in electronics. Of course, that’s a typical generational reaction; it used to be the radio that would rot young people’s brains…then TV…and now phones and tablets.

But what does the science say about the effect of touch-screen devices on children?

Not much; the touch-screen era is still very young. But a handful of studies investigating intensive device use have emerged, including the ones summarized here. Most seem to suggest that moderation in screen time is a good idea, although some point out benefits. Here are more details about what they found—and how:

Study: Cognitive control in media multitaskers
Subjects: 262 college students

Study: Adolescents' use of instant messaging as a means of emotional relief
Subjects: 150 adolescents

Study: Do television and electronic games predict children's psychosocial adjustment?
Subjects: 11,014 British five- and seven-year-olds

Study: Parents' perspectives: Children's use of technology in the early years
Subjects: 1,028 parents

Study: Five days at outdoor education camp without screens improves preteen skills with nonverbal emotion cues
Subjects: 105 preteens

Study: Sleep Duration, Restfulness and Screens in the Sleep Environment
Subjects: 2,048 fourth- and seventh-graders

see also:

Hacked Molecular Machine Could Pump Out Custom Chemicals


An engineered ribosome with a permanent connection between its subunits (red) can operate side-by-side with a cell's own protein production machinery.

By hijacking the cellular machinery that makes proteins, bioengineers have developed a tool that could allow them to better understand protein synthesis, explore how antibiotics work and convert cells into custom chemical factories.

All life owes its existence to the ribosome, a huge, hardworking molecular machine that reads RNA templates transcribed from DNA, and uses the information to string together amino acids into proteins. A cell requires functioning ribosomes to survive — but they are difficult to engineer. If the engineered molecules deviate too far from the standard design, the cell will die.

“An engineered ribosome learns to do better what you want, but it starts to forget how to do its normal job,” says biochemist Alexander Mankin of the University of Illinois in Chicago.

Mankin teamed up with biochemical engineer Michael Jewett of Northwestern University in Evanston, Illinois, and others to create a ribosome that engineers could tinker with. The results of their handiwork are published in Nature.

Mega-machines

The ribosome's power and versatility draw the attention of bioengineers such as Jewett. These researchers would like to create ribosomes that could carry out other chemical reactions and spit out novel polymers, or incorporate unnatural amino acids into proteins that could be used as drugs.

Each ribosome contains two clumps of snarled RNA molecules, a small subunit and a large one. The subunits come together to translate a messenger RNA sequence into protein, and then separate. They assemble again when it is time to make another protein, although not necessarily with the same partners. “In a way they are very promiscuous,” says Mankin.

Arranged marriage

The solution, Mankin and Jewett's team decided, was to marry two engineered subunits together permanently. It was unclear whether the approach would work: ribosomes were thought to exist as two distinct subunits because the separation is necessary for their function.

The researchers used a strand of RNA to tether the large and the small subunit together, toiling for months to get the length and location of the link just right so that the machine could still function. “We certainly came close, several times, to saying ‘OK, biology wins',” says Jewett.

The team screened its tethered ribosomes in cells that lacked functioning ribosomal RNA, and eventually found engineered ribosomes that worked well enough to support some growth, albeit slow. They then tested their platform to confirm that a tethered ribosome could operate side-by-side with natural ribosomes.

The result unlocks a molecular playground for bioengineers: by tethering the artificial subunits together, they can tweak the engineered machines to their liking without halting cell growth, says Joseph Puglisi, a structural biologist at Stanford University in California. Puglisi hopes to harness the system to study how the ribosome functions. James Collins, a bioengineer at the Massachusetts Institute of Technology in Cambridge, says that his lab may use the system to study antibiotics — many of which work by binding to bacterial ribosomes.

Jewett wants to see what the system can do for synthetic biology, perhaps producing new antibiotics or unnatural polymers. “We’re just at the leading edge,” he says. “We're going to try to expand the genetic code in unique and transformational ways.”

see also:

Missing: 1 Year's Worth of California Rain

The amount of rain that California has missed out on since the beginning of its record-setting drought in 2012 is about the same amount it would see, on average, in a single year, a new study has concluded.

The study’s researchers pin the reason for the lack of rains, as others have, on the absence of the intense rainstorms ushered in by so-called atmospheric rivers, the ribbons of very moist air that can funnel water vapor from the tropics to California during its winter rainy season.

Overall, the study, which has been accepted for publication, found that California experiences multi-year dry periods, like the current one, and then periods where rains can vary by 30 percent from year to year. Those wet and dry years typically cancel each other out.

The El Niño–Southern Oscillation, one phase of which has ushered in some of the state’s wettest years, only accounts for about 6 percent of overall precipitation variability, the researchers found.

Drought began creeping across the California landscape in 2012 and has continued to mushroom year after year as winter rains and snows were much diminished. The atmospheric rivers that normally funnel in moisture-laden air were thwarted by a persistent area of high pressure that blocked them from reaching California. This winter, precipitation that did manage to fall mostly did so as rain, thanks to record-high temperatures linked to extremely warm conditions, leaving the state’s snowpack at a record low.

The new study looked at satellite measurements of rainfall from NASA’s Tropical Rainfall Measuring Mission (TRMM) satellite, as well as a recreated climate record that used both observations and model data, to gauge how much California’s annual precipitation varied and how much the state was in the hole after four years of drought.

The researchers found that in an average year, the state sees about 20 inches of rain; it turns out that’s also about the amount of missing rain since 2012.

To dig out of the drought in just one winter, the state would have to see 200 percent of its normal yearly rain, to cover both that year’s rain and make up the missing amount.
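The arithmetic behind that “200 percent” figure is simple to verify. A minimal sketch, using the round numbers quoted in the text (about 20 inches of rain in a normal year, and a deficit of roughly one normal year):

```python
# Back-of-envelope check of the figures quoted in the text (rounded values,
# not the study's actual data).
normal_year_in = 20.0  # average annual rainfall for California, in inches
deficit_in = 20.0      # rain "missing" since 2012, roughly one normal year

# Erasing the deficit in a single winter requires that winter's normal rain
# plus the entire accumulated deficit.
required = normal_year_in + deficit_in
pct_of_normal = 100 * required / normal_year_in
print(pct_of_normal)  # 200.0, the "200 percent of normal" quoted above
```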

That wet a winter isn’t very likely to happen, Daniel Swain, a PhD student at Stanford University, said in an email. And if it did occur, it would mean major flooding, he added. Swain wasn’t involved with the new research.

The study also looked at another recent dry period, from 1986 to 1994, and found a 27.5-inch precipitation deficit over that period. While that deficit was overall greater than the current drought’s, the per-year rain deficit is much higher this time around, Swain pointed out.

Added to that, “[temperatures] during the current drought have been warmer than during any previous drought on record, which has greatly amplified the effect of the precipitation deficits,” and helped fuel the wildfires currently flaring up around the state, Swain said.

Many are hoping this winter’s El Niño will make a serious dent in the drought, as it looks to become a strong event, and strong events are associated with higher odds of increased winter rains over at least parts of the state.

The study found that the whole El Niño-Southern Oscillation cycle only accounts for about 6 percent of the variation in yearly California precipitation. That cycle encompasses not just strong El Niños, but weak ones, as well as neutral and La Niña conditions, and when separated out “very strong events (like the El Niño currently underway) exert a far greater influence upon California climate than weak ones,” Swain said. So this year’s El Niño could play a major role in what precipitation California sees.

What’s important this year, Swain said, is where the precipitation falls and how much of it falls as snow to build back up the snowpack that keeps water flowing into reservoirs come the warm, dry days of summer.



This article was originally published by Climate Central.


see also:

Forests Suck Up Less Carbon After Drought

Tree growth lags below normal for several years following droughts, a detail about carbon sequestration that climate models currently overlook. Christopher Intagliata reports.


Climate scientists forecast global temperatures to rise anywhere from a couple of degrees to several degrees by the end of the century. That’s a pretty big range. And there’s a good reason for that: there’s a lot of uncertainty baked into the climate models themselves.

Take, for example, the way climate models predict how trees respond to drought. "Drought in these models is treated as a light switch"—either on or off—“but in the real world we know that drought damages trees, and it can take a while for trees to repair this damage and recover." 

That’s William Anderegg, an ecologist at Princeton University. He and his colleagues examined tree ring data from more than 1,300 sites around the world. And by comparing the rings with known drought records they found that trees don’t simply kick back into gear as soon as rains return. Drought actually puts the trees’ water transport systems under a huge amount of tension, he says, causing air bubbles to leak in, which damages or blocks those pipes. “I often compare this to a sort of a heart attack for a tree. That in some cases it can be lethal and in some cases they can repair that blockage.”

That drought “hangover” causes tree growth to lag five to ten percent below normal for several years following the dry spell. “This is a problem because forests currently take up about 25 percent of human emissions of CO2, which is an incredible brake on climate change.” And the less CO2 the trees are able to take up, the warmer it gets. The findings appear in the journal Science. [W. R. L. Anderegg et al.]
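The scale of the effect can be roughed out from the two numbers in the paragraph above. The sketch below uses invented round figures, not the study’s model, just to show how a 5 to 10 percent growth lag propagates into the carbon budget:

```python
# Rough, invented-numbers illustration of why a post-drought growth lag
# matters for carbon uptake. Not the study's model.
human_emissions = 100.0        # arbitrary units of CO2 emitted per year
forest_uptake_fraction = 0.25  # forests absorb ~25% of emissions (from the text)
growth_lag = 0.075             # midpoint of the 5-10% post-drought growth lag

normal_uptake = human_emissions * forest_uptake_fraction  # 25.0 units/year
lagged_uptake = normal_uptake * (1 - growth_lag)          # 23.125 units/year
extra_co2_left = normal_uptake - lagged_uptake
print(extra_co2_left)  # 1.875 extra units of CO2 left in the atmosphere per year
```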

The thing this study makes clear is that predicting climate change is hard. “It's hard. These models have an incredibly challenging task of representing processes that occur from a leaf scale to a continent scale in space. And from several seconds to hundreds of years or at least a hundred years in time.” But a better understanding of how much carbon trees soak up, and for how long drought suppresses that uptake, will make climate forecasting just a little bit easier.

—Christopher Intagliata

Facing Poison Gas, 1915

One idea to repel gas attacks: fans to blow the gas away. A dubious invention, but perhaps a reasonable idea in the days before gas masks (the flannel pads over nose and mouth were an early—and for a while the only—defense against gas).

Reported in Scientific American, This Week in World War I: July 31, 1915

The world’s first full-scale attack by poison gas took place on April 22, 1915, near the town of Ypres in Belgium. It was a stunning success for the Germans who deployed it, and a catastrophe for the French territorial troops who were unfortunate enough to be on the receiving end of this new form of chemical warfare.

When an armed force must suddenly fight against an effective new weapon, the people tasked with winning wars scramble to find some kind of workable protection for their troops. The Allies did not know the exact nature of the gas but were pretty sure it was chlorine. The first attempts at gas defense came from industrial safety equipment used in factories that had to deal with chlorine fumes: flannel pads soaked in various liquids to cover the mouth (goggles were added later).

But more was needed. The suggestion in this article, and the illustration it contained, was an earnest and perhaps desperate search for a technological fix for poison gas. (The soldiers manning the trench are shown wearing the flannel pads then in use.)

“[One inventor] proposes the use, in the trenches near enough to the enemy to be in danger from such gases, of rotary fan-blowers worked by hand placed at about every three or four yards. The fan-blowers should be connected with pipes going through the base of the earthwork in front of the trench. If the number of blowers were equal to the number of gas cylinders used by the enemy, the blowers when vigorously worked would deliver a far greater volume of air than the volume of the poisonous gas, so that the gas would become much diluted, and with good respirators would be harmless to our men. We believe that experiments are being conducted at the front with the object of devising means to render the poisonous gases innocuous by spraying with water and in other ways.”

I do not believe the scheme depicted was ever tried. If it was, I pity the poor infantryman on the front lines desperately cranking these fans and cursing the powers-that-be. But there are sound reasons why the idea was considered in the first place. The earliest method of launching a chlorine gas attack was to lug large cylinders full of chemical to the front line, wait for just the right wind blowing gently (not too fast, not too slow) in exactly the right direction (and not when it’s raining or drizzling, or too hot or too cold), and then open the cylinders. In a light breeze, if the targeted soldiers had been able to fan the gas around, it is faintly possible that the gas might have been diluted enough to reduce casualties: “A slight counter air current ought to suffice, therefore, to deflect the direction of the gas cloud as it slowly drifts over the ground between the two lines of trenches,” the article reasoned.

In any event, the better response was to develop a good gas mask for the individual soldier (or civilian), manufacture large numbers of them, and distribute them as fast as possible. Given how effective gas could be, both sides developed increasingly sophisticated methods of delivering the chemicals at much longer ranges and in different conditions: by artillery shells, grenades or bombs. And as fast as gas masks were developed, new gases were introduced to the battlefield that quickly rendered older protection obsolete.


Our full archive of the war, called Scientific American Chronicles: World War I, has many articles from 1914–1918 on chemical warfare and defense against it. It is available for purchase.

Book Review: Genius at Play


Genius at Play: The Curious Mind of John Horton Conway, by Siobhan Roberts. Bloomsbury, 2015 ($30)

Mathematician John H. Conway's name pops up all over the mathematics world—group theory, game theory, knot theory, abstract algebra, geometry—and in the pages of this magazine, where he was frequently featured in Martin Gardner's Mathematical Games column. It was there that his most famous creation, Conway's Game of Life—a set of rules for propagating a pattern that generates incredible complexity—made its world debut. Science journalist Roberts's new biography of Conway demonstrates how the man's playfulness and originality have fed into the creativity and intelligence of his ideas. The tome resonates with Conway's voice—which gets its own special font—and his discussions with the author dictate the story's structure and provide the narrative's best glimpses into how his mind darts and weaves.

see also:

Big Polluting Vehicles Roar Back with Low Gasoline Prices

Detroit's Big Three automakers posted supercharged performances in the second quarter, with results fueled significantly by strong sport utility vehicle and truck sales in North America.

Fiat Chrysler Automobiles' net profit jumped 69 percent from the same time last year. Ford Motor Co. notched record income for North America and its Asia-Pacific market, totaling a 44 percent jump in income year over year. And General Motors Co. netted $1.1 billion in income, five times more than its $190 million profit from the March-June period in 2014.

"We just wrapped up the U.S. auto industry's best six months in a decade, driven by strong demand for pickups and crossovers," said Kurt McNeil, vice president of sales operations at GM, in a statement earlier this month.

"People feel good about their jobs and the direction the economy as a whole is taking, so the second half of the year should be strong too," he said, noting the depth of truck and crossover vehicle options Chevrolet and GMC sell.

With average nationwide gas prices staying below $3 per gallon this year, drivers seem to be forgoing more fuel-frugal cars for bigger, heavier options.

Ford's SUV sales are up 10 percent year over year, and the category marked its best sales performance since 2002, according to the company's latest sales figures. Compared with last June, Ford has sold 30 percent more Explorers, 39 percent more Lincoln Navigators and more than 50 percent more Mustangs, the highest tally for the sports car since pre-recession 2007.

"In North America, the new F-150 is a smash hit," Ford President and CEO Mark Fields told analysts Tuesday. Last year, the company stripped down the iconic truck -- the best-selling model in America for the past 33 years -- to add an aluminum frame to boost its fuel economy.

Fuel efficiency gains are flattened

Automakers can sell trucks and SUVs with higher profit margins than sedans, and customers are responding as the economy improves.

"It's certainly true that the American automakers are making a lot of money on their trucks and SUVs," Cooke said. But a significant portion of the growth is coming from demand for crossover vehicles, "or SUVs based on car frames," Cooke explained. Crossover vehicles are significantly more fuel-efficient than larger, truck-based SUVs.

"It isn't like in the '90s when it was entirely due to low oil prices," Cooke said of drivers purchasing new lower-mpg models.

Americans typically purchase steadily more fuel-efficient vehicles with each passing year.

But in April, according to data from the University of Michigan's Transportation Research Institute, consumers purchased less efficient new cars than they did in March. The same was true from January to February and from May to June.

The data, which tick up and down as quickly as a cruise ship makes a turn in a tight harbor, are sales-weighted figures based on combined city-highway gas ratings and monthly sales tallies. And they've remained essentially flat in 2015 and down from their peak of 25.8 mpg last August.

Electrics and hybrids are down

The Italian-American firm plans to expand in India and in China, the world's biggest car market ahead of the United States, and introduce three new Jeep models tailored to China's SUV- and luxury-hungry marketplace, the company said in a July filing. (Mary Barra, GM's chief executive, said last week that the company's SUV sales in China rocketed 83 percent versus the first two quarters of 2014.)

Unlike its traditional rivals, GM actually saw its fleet sales contract about 30 percent from last year, and a sales count from the company shows just how sharply large SUV sales have picked up since January.

Among Cadillacs, under the GM umbrella, sales of the Escalade, one of the most imposing vehicles in existence, were down this June compared with last year. But year-to-date in 2015, sales for the Escalade and the Escalade ESV are up about 50 and 70 percent, respectively.

Yukon XL and Acadia models, both GM products, are leaving dealerships at a quicker clip, too, up 8 percent and almost 20 percent year-to-date.

"Even when gas prices are high, there's a strong correlation between truck purchases and economic activity," Cooke said.

Asked about separating the economic impacts of low gas prices and an improving national economy on car purchases, he said, "It's hard to disentangle those effects a little bit," adding that "certainly hybrids have been very well coordinated to high gas prices."

From January through June, dealers have sold one-third fewer Chevrolet Volts compared to the same period of 2014. Sales for all Toyota Motor Corp.'s Prius hybrid models were down across the board in the first half of 2015. The company has sold about 15 percent fewer Priuses than it did in the same period last year.

www.eenews.net

see also:

How to Get to the Fourth Dimension

Things to Make and Do in the Fourth Dimension: A Mathematician's Journey through Narcissistic Numbers, Optimal Dating Algorithms, at Least Two Kinds of Infinity, and More

Mathematics popularizer Matt Parker, an Australian based in England, is a self-proclaimed “stand-up mathematician” perhaps best known for his numerous contributions to the Numberphile YouTube channel. He is also the Public Engagement in Mathematics Fellow at Queen Mary, University of London, and his new book is an ambitious and delightful addition to the current age’s plethora of high-quality volumes on recreational mathematics—even if most of the material he covers is focused on 2-D and 3-D. Like the extensive writings of legendary columnist Martin Gardner, this book seeks to make mathematics come alive for an intelligent and curious audience by engaging the reader in a lively informal style, and with irresistible invitations to roll up one’s sleeves and experiment. Parker also enlivens his chapters with numerous surprises.

The display resolution on a domino computer display is terrible (Manchester, 2012)

Parker’s philosophy is to present mathematics as “one big game”—and he later reiterates that it’s “a game where you choose the starting rules.” His major goal, he says, is to give readers “the freedom to play with math.” Like Gardner, Parker favors hands-on activities while gently leading his readers from easy to more sophisticated challenges.

Chapter Two opens with the problem of slicing an idealized, infinitely thin circular pizza into equal-size pieces—with the twist that some of the pieces must not touch the center. He encourages readers to use a compass to experiment and offers several simple solutions to the problem.

While considering some of these tricky shapes, Parker suggests some folding fun, showing how to make a pentagon from “a long strip of paper” by tying a knot in it. (Curiously there is no mention of what is surely the most common occurrence of this in real life: people idly knotting their chopstick paper sheaths in Chinese restaurants while waiting for their food to arrive.) He shows that if the Greek rules of geometry are modified a little, as they are in origami, formerly impossible things like trisecting any angle suddenly become doable. He also ventures into 3-D—remember the book’s title promises action-packed adventures in 4-D—with a sweet puzzle about cutting a cubical cake into five easy pieces with equal amounts of cake and icing (that is, identical volumes and surface areas). Note that there is no icing on the cake’s bottom. (He had me wondering if his strategies would work for numbers other than five.)

It’s impressive how often Parker offers a fresh take on topics that are somewhat familiar to a mathematically savvy reader cheek by jowl with the not so familiar; this is one of his strong suits. Another is his dry sense of humor, which is deployed just the right amount here. (The pizza of infinite thinness, he says, offers “terrible value for the money” although, compared with other pizzeria fare, it is “possibly much easier to digest.”) Readers with modest mathematical backgrounds, however, could get lost now and then, because at times the pace accelerates as he winds up a topic, with the more complex ideas occasionally presented in compact summary form.

A 4-D Rubik’s cube (as seen projected into 3-D—or is that 2-D?). http://ift.tt/1fGyZwF

Chapter Three includes the basics of the Pythagorean theorem and mentions (but does not prove) the astonishing fact that the square root of 2 isn’t a ratio of two whole numbers—yet Parker implies that this number is 1.4142. (Missing is the key word “roughly”—if it were exactly equal to that decimal, it would also equal 14,142/10,000, which in turn equals 7,071/5,000 in simplified form, a famous no-no known even to the Greeks.)
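Parker's omission is easy to check for oneself. A short Python sketch (my own, not from the book) uses exact rational arithmetic to show that 1.4142 reduces to 7,071/5,000 and that its square falls just short of 2:

```python
from fractions import Fraction

# If sqrt(2) were exactly 1.4142, it would equal the rational number
# 14,142/10,000, which Python's Fraction reduces automatically.
approx = Fraction(14142, 10000)
print(approx)            # 7071/5000
print(approx ** 2)       # 49999041/25000000, i.e. 1.99996164... -- not 2
print(approx ** 2 == 2)  # False: 1.4142 is only an approximation
```

Since no fraction squared can equal 2 exactly, every decimal such as 1.4142 (or 1.41421356, for that matter) can only approximate it—the point Parker leaves implicit.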

Parker’s treatment of shapes of constant width—there are many more than mere circles and spheres—in Chapter Four is excellent, building on simple examples to far-reaching irregularly shaped generalizations. His discussion of hexaflexagons—the topic that launched Gardner’s career in Scientific American back in the 1950s—is exemplary, and wisely preceded by the easier to grasp (and flex!) tetraflexagons.

Chapter Five starts with a problem similar to one my father baffled me with half a century ago: how to fit a large coin through a seemingly too small hole in a piece of paper. The solution requires flexing into 3-D, following which Parker detours into curvy geometries (namely, spherical and hyperbolic). He moves on to survey the oft-forgotten regular star polyhedra, the Kepler–Poinsot polyhedrons, such as the small stellated dodecahedron, known to have been depicted by a Venetian mosaic artist as early as 1430.

Parker finally gets to his lucid exposition of the fourth dimension in Chapter Ten—the book’s cover has a 2-D image of a witty 3-D realization of a 4-D cube on it—a concoction that can indeed be built in the world in which we’re living. I especially liked his highlighting of the hyper-diamond, aka the 24-cell or octacube, an oft-overlooked (ordinary) 4-D Platonic solid.

The 4-D hyper-diamond (as seen in 3-D).

Möbius strips, knotted surfaces and glimpses of dimensions higher than four, along with other mysteries, fill out the rest of the book. Parker also discusses algorithms and a clever domino-computer project he first executed successfully in 2012.

There are a few small problems with this edition. First, the figures are not numbered, which occasionally gives rise to mild confusion when an image referred to in the text is on a later page. Second, the publisher didn’t see fit to use a U.S.-friendly version of the content. American readers of this hardback edition will have to contend with minor language and spelling variations (for example, “maths” versus “math”) and not infrequent references to British coins (with their very different sizes and sometimes unusual shapes). Third, there is no index. The last two issues will be addressed for the U.S. paperback edition due out later in 2015. Some technical details (and many puzzle solutions) are wisely relegated to the back of the book, and there is a companion website with related activities and material to explore.

Overall, Parker’s book is an infectious and friendly romp through some stimulating and diverse material that should convince any alert reader that mathematics can be fun, is certainly not static and has plenty of practical applications in real life—from tying your shoelaces faster to doing basic digital image processing using spreadsheets.

see also:

MIND Reviews Whispersync for Voice

I am an avid reader. But proclaiming as much in the past few years has made me feel a little dishonest—I can so rarely find time to read for pleasure. What kind of “avid reader” finishes only two books a year? A few months ago, however, I discovered a delightful way to fit books back into my busy life: a technology from Amazon called Whispersync for Voice, which is automatically included with the free Kindle and Audible apps. This ingenious bit of cross-platform magic, originally released in 2012 and recently updated, allows a reader to switch between reading an e-book purchased through Amazon and listening to the book's audio narration seamlessly in these apps on any device.

My typical reading schedule now starts in the morning, as I listen to an audiobook on my smartphone while I hike with my dog. Whenever I find myself waiting in a line, I switch to reading the book in the Kindle app on my phone. When I'm doing chores around the house, I listen to the narration on our home stereo system, via my computer. Before bed, I read a little more on my iPad. Throughout all those transitions, Whispersync for Voice is behind the scenes, marking my place so I never have to search for where I left off. And as with most apps, the features keep getting better. This past April the Kindle app was updated so that switching between reading and listening happens with only one click.

Another small but real benefit: Whispersync for Voice relieves a minor concern I have about using electronic screens at night. Recent studies have hinted that e-reading might disrupt sleep patterns—although most research finds the effect only after several hours of screen time before bed. Still, I am on my computer most of the day for work, and when I add e-reading or a movie to the mix, I might be approaching that threshold. So now, on days when I am sick of staring at a glowing screen, I can simply switch over to listening to my book while I rest my eyes.

All these little chunks of reading and listening time add up. I have completed 14 novels since I discovered Whispersync for Voice three months ago—and I am thrilled that books are back in my life in a major way. As an editor for Scientific American Mind, I see many studies about the benefits of reading or listening to stories—fiction may hone your social skills, for instance, and a well-drawn character can evoke empathy for people unlike yourself. Even more, though, I simply missed getting lost in an imagined world, which happens for me more intensely with books than it does with movies or TV. Don't get me wrong—I love my favorite shows, and I make time to watch them. But the other night, when I tried turning on the TV while cooking dinner, I found myself wondering what the characters were up to in the book I am currently immersed in. I turned off the TV and dove back into my novel.

see also:

Initial Results from Ebola Vaccine Experiment Are "Promising": World Health Organization

The results suggest the shot could help bring an end to West Africa's epidemic, World Health Organization director general Margaret Chan said on Friday

July 31, 2015


GENEVA, July 31 (Reuters) - Initial results from an Ebola vaccine trial in Guinea are "exciting" and "promising" and suggest the shot could help bring an end to West Africa's epidemic, World Health Organization director general Margaret Chan said on Friday.

"If proven effective, this is going to be a game changer, and it will change the management of the current Ebola outbreak and future outbreaks," Chan told reporters at a news conference.

Data from a trial of an Ebola vaccine known as VSV-EBOV developed by Merck and NewLink Genetics are expected to be published later on Friday.

(Reporting by Tom Miles, writing by Kate Kelland, editing by Dominic Evans)

see also:

Eye-Tracking Goggles Look for Concussions

Experimental technology aims to bring the doctor's office to the sidelines.

Graphene Kirigami

Nature Video finds out how the Japanese art of paper-cutting can give 'supermaterial' graphene even more incredible properties.

Thursday, July 30, 2015

YouTube's Rock Stars of Science Make a Splash at VidCon

Hank Green poses for a selfie with fans at VidCon 2015.

YouTube has exploded in the decade since it first launched in 2005.

Alongside the young celebrity musicians, beauty gurus and pranksters is an impressive lineup of science rock stars—scientists and communicators who have risen to Internet fame through their educational and personality-driven original content.

Last week at VidCon, the world’s largest online video conference (where YouTube celebrities are swamped with 20,000 screaming fans), these new faces on the frontlines of science communication were among the most famous and most flocked-to of all YouTube stars at the convention in Anaheim, California.

With tens of millions of views per video and hundreds of millions of views overall, these science YouTubers have reached more young people than Carl Sagan and Neil deGrasse Tyson combined. By blending humor with their personal lives and irreverent style, these online identities are popularizing science in new ways. They are also playing an increasing role in advancing science literacy, combating the misappropriation of science, motivating young people and influencing informed public policy discussions.

With online video on the rise, the influence of these personalities will grow ever stronger. So who are these science rock stars? Here are some of the top creators featured at the sixth annual VidCon.

Destin Sandlin – Smarter Every Day

The creator of Smarter Every Day, Destin Sandlin is also a missile flight test engineer at Redstone Arsenal. Sandlin has produced over 200 videos focusing on unusual physics phenomena. His videos regularly receive over 10 million views, and he has a legion of almost 3 million subscribers.


Emily Graslie – The Brain Scoop

Emily Graslie's channel, The Brain Scoop, has become the largest science channel hosted by a woman on all of YouTube. As the Chicago Field Museum’s first-ever Chief Curiosity Correspondent, Graslie focuses her videos on the biological sciences, including topics such as how to preserve animal specimens. Her fascination with field science began after she started volunteering at the Philip L. Wright Zoological Museum as a university student. Graslie is well known as an advocate for women in STEM fields, particularly women who want to share their expertise on YouTube. Her most popular video has almost a million views.


Mitchell Moffit and Greg Brown – AsapSCIENCE

Unlike most science channels, Moffit and Brown’s AsapSCIENCE avoids talking heads, instead communicating through a unique style of fast-action whiteboard drawings. This formula has done well for them—in just three years, the pair have racked up almost a half billion views.

AsapSCIENCE explains the everyday science behind pop culture phenomena. Its most successful video gained almost 20 million views and was featured widely in online media outlets such as The Huffington Post. Moffit and Brown claim the ASAP in their channel name stands for “as simple as possible,” and many would argue that is exactly what their channel achieves.


Joe Hanson – It’s Okay To Be Smart

Joe Hanson hosts It’s Okay To Be Smart on PBS Digital Studios. Hanson has almost half a million subscribers and over 20 million views, covering wide-reaching topics.

A self-proclaimed “white male,” Hanson is also an advocate for the diversification of science and the inclusion of more women in STEM fields. One of his videos, produced in conjunction with Amy Poehler’s Smart Girls organization, featured NASA scientists and female YouTube creators alongside Hollywood celebrities like Susan Sarandon.


Hank Green – SciShow, CrashCourse, Vlog Brothers

Hank Green got his start with Vlog Brothers (a channel he co-runs with his brother, the author John Green), and his network today spans science channels such as SciShow and CrashCourse.

Across all platforms, Green boasts a fan base approaching the 10 million mark, making him the leading power player in the science-on-YouTube space. His educational videos span the mundane to the controversial and often involve music, finger puppets and celebrity guest hosts to get the message across.

Green is known as a thought leader of YouTube, calling today’s online stars “more legitimate sources of unbiased thought and information than the news,” and he may be right. Earlier this year, Green was invited to the White House to interview President Obama—a sign of the snowballing cultural influence of YouTube personalities.


What Global Warming Means for Four of Summer's Worst Pests

Summer may mean it’s time for outdoor fun in the sun, but it’s also prime time for a number of pests. All that extra time outdoors can bring everything from poison ivy rashes to exposure to Lyme disease from tick bites. And of course there’s that ubiquitous summer menace, the mosquito.

With the rising temperatures brought about by global warming, the risks posed by these pernicious pests could also be increasing. A warmer climate can mean expanded habitats for many pest species, as well as increases in their numbers. Here’s what research suggests will happen with four key summertime pests as the world warms:

Mosquitoes

Warmer temperatures raise the risk of mosquitoes spreading diseases like malaria, West Nile virus and dengue fever. Of particular concern is the invasive Asian Tiger mosquito, which first appeared in the U.S. in 1985.

Predicted change in the range of the Asian Tiger mosquito with warming from high levels of greenhouse gas emissions. Rochlin et al./PLOS ONE

As temperatures around the country rise, the areas that are conducive to such mosquitoes could expand, and the insects could start to emerge earlier in the year, meaning more opportunities for bites that can transmit disease. After an unseasonably warm late spring, summer, and early winter in 2012, the U.S. experienced a West Nile virus outbreak linked to the Asian Tiger mosquito, with some 5,600 people becoming infected.

Asian Tiger mosquitoes tend to die off when temperatures venture outside a range from 50°F to 95°F and when relative humidity dips below 42 percent. A Climate Central analysis examined how warming would affect this range for cities around the country, showing how many more “mosquito suitable” days there were now compared to 1980.

One key question in terms of the health impact of expanded mosquito territory is whether the new climates they venture into will be as welcoming to the pathogens they can carry. Arizona has a lot of Aedes aegypti, another invasive mosquito species, but no dengue, which that mosquito can often carry, Mary Hayden, a scientist with the National Center for Atmospheric Research in Boulder, Colo., said. Why this is isn’t known, but Hayden and her colleagues suspect it is because the harsh desert climate doesn’t allow the mosquitoes to live long enough for dengue to undergo its full development cycle.

But there have been small outbreaks of dengue in Texas near the Mexican border, Hayden said, as well as of a disease found in the Caribbean, called chikungunya, in Florida. Health officials are closely watching these areas for larger outbreaks, she said.

Poison Ivy

Poison ivy is a well-known scourge for those who spend time outdoors in the summer. Already more than 350,000 cases of poison ivy occur annually in the U.S., according to the National Wildlife Federation, and that number could go up as the climate changes.

The impacts of climate change on poison ivy have more to do with the cause behind rising temperatures than the warming itself. Plants need carbon dioxide — the key heat-trapping greenhouse gas — to fuel photosynthesis. Experiments that exposed poison ivy plants to different levels of CO2 have found that “poison ivy grows faster when there’s more CO2” and it produces more leaves that carry the plant’s toxic oil, Doug Inkley, a NWF scientist, said.

Those oils, which put the “poison” in poison ivy, can vary in their chemical structure, and higher CO2 levels also cause the plants to produce a more toxic form, “so climate change is not doing us any favors there,” Inkley said.


Deer Ticks

Warming could expand the range of the deer tick, and with it the reach of the diseases it transmits, such as Lyme disease.

The Centers for Disease Control and Prevention estimates that about 300,000 people in the U.S. are diagnosed with Lyme disease each year, primarily in the Midwest and Northeast. (While ticks are found throughout the South, they have a more diverse array of species to feed on there, and so are less likely to encounter the deer and mice that can harbor Lyme disease.)

As temperatures rise, there is concern that deer ticks will expand into newly suitable habitat and bring Lyme disease and other pathogens with them. They have already expanded northward into Canada, where the number of reported cases of Lyme disease doubled between 2009 and 2012 — a trend attributed to more locally acquired cases.

That northward expansion is expected to continue, as shown in projections from the National Climate Assessment, while a much smaller retraction on the southern end of their range is also anticipated. The worry is that people who aren’t used to having to think about Lyme exposure won’t know to take proper precautions to reduce their risks.

Warming could also cause explosions in tick populations, as higher winter temperatures fail to thin out overwintering populations, Inkley said. More ticks means more chances for Lyme to be transmitted. Earlier thaws and later frosts could also mean that ticks are active for a longer period, again increasing the risk of Lyme transmission.

But, just as with mosquitoes, it is unclear whether changes in the climate and conditions of new habitats will be as conducive to the Lyme bacterium and other diseases as they are to ticks.

Expected range of deer ticks with warming by 2080. NCA

Red Fire Ants

The imported red fire ant, as it is colloquially known, came to the U.S. from its native South America sometime in the 1930s or ‘40s, likely as a stowaway in ship ballast. Its range now covers more than 300 million acres, mostly in the Southeast, where it came ashore, according to the NWF.

The ants, which bite and sting as a single mass, thrive in places where winter low temperatures don’t dip too low. “The colder it is, the slower the colonies grow and the more mortality occurs,” Lloyd Morrison, a National Park Service ecologist who has studied them, said in an email. “One very cold period in the winter could kill colonies outright or prevent colonies from reproducing.”

With warming, those low temperatures don’t get as cold, meaning colonies could be less inhibited. Morrison did a study that modeled the potential range of the imported red fire ant with climate change and found that warming temperatures would expand suitable habitats by about 5 percent by mid-century and then by 21 percent toward the end of the century. This would mean imported red fire ants could be found as far north as Nebraska, Kentucky and Maryland.

And while these ants can certainly provide an unpleasant encounter for any unwitting humans who come across them — their en masse bites inject their victims with venom that produces a burning sensation and raises blisters that can become infected — they are actually more of a threat to local wildlife. Swarms of ants can easily overwhelm young birds in ground nests and small animals like mice, Inkley said.

Thinking about all of these summer fun-ruining pests may have you scratching some imaginary itch and eyeing the outdoors warily, but it doesn’t mean you can’t enjoy what nature has to offer, Inkley said.

“We strongly believe that people should get out of doors,” he said. It just means being vigilant and prepared for nature’s not-so-nice side.

Possible expanded range of the imported red fire ant with climate change. L.W. Morrison et al.

This article is reproduced with permission from Climate Central.

see also:

Drought May Stunt Forests' Ability to Grow for Years

The megadrought in the Amazon rainforest during the summer of 2005 caused widespread damage and die-offs to trees, as depicted in this photo taken in Western Amazonia in Brazil.

Forests are sometimes called the lungs of the earth—they breathe in carbon dioxide in the atmosphere and store it in tree trunks until the forest dies or burns. A new study, however, shows that forests devastated by drought may lose their ability to store carbon over a much longer period than previously thought, reducing their role as a buffer between humans’ carbon emissions and a changing climate.

The study, published Thursday in the journal Science by a team of researchers at the University of Utah and Princeton University, shows that the world’s forests take an average of between two and four years to return to their normal growth and carbon dioxide absorption rate following a severe drought—a finding that has significant climate implications.

“This means that these forests take up less carbon both during drought and after drought,” study lead author William Anderegg, an assistant professor of biology at the University of Utah and a researcher at Princeton University, said.

Forests act as a carbon sink by absorbing human-emitted carbon dioxide from the atmosphere and storing it in trees’ woody roots and stems. As climate change affects forests, they’ll store less carbon dioxide because drought stresses them and hinders their ability to grow, making man-made global warming even worse. Eventually, forests could become a source of carbon instead of a storehouse for it.

“In some scenarios in the coming century, due to things like drought, wildfire, insects, (or) disturbances, forests start to lose more carbon than they take up, and they become a carbon source,” Anderegg said. “It’s a vicious cycle. (Forests) accelerate climate change and more climate change kills more forests.”

Using tree ring data, Anderegg’s team measured the recovery of tree stem growth following droughts dating back to 1948 in more than 1,300 forests worldwide. The average length of growth recovery in forests across the globe ranged from two to four years. Tree growth on average was 9 percent slower than normal during the first year following a drought and 5 percent slower in the second year. The driest forests recovered the slowest.

Over the course of a century, the lasting effects of drought could lead to 3 percent lower carbon storage in semi-arid forests, such as those in Arizona, Colorado and New Mexico, according to the study. That’s equivalent to 1.6 gigatons of carbon when considering all semi-arid ecosystems worldwide. (Between 1990 and 2007, forests worldwide stored roughly 2.4 gigatons of carbon annually, according to a 2011 University of Alaska-Fairbanks study.)
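For a sense of scale, the quoted figures can be combined in a rough back-of-envelope calculation (my own arithmetic as an illustration, not numbers from the paper itself):

```python
# Rough consistency check of the figures quoted above.
loss_fraction = 0.03   # 3 percent lower century-scale carbon storage
loss_gigatons = 1.6    # gigatons of carbon, all semi-arid ecosystems

# Implied total century-scale carbon storage in semi-arid ecosystems:
implied_storage = loss_gigatons / loss_fraction
print(f"Implied semi-arid storage: ~{implied_storage:.0f} Gt of carbon")

# Compare with the ~2.4 Gt per year that all forests stored in 1990-2007:
years_equivalent = loss_gigatons / 2.4
print(f"The 1.6 Gt loss equals ~{years_equivalent:.1f} years of global forest uptake")
```

In other words, the drought legacy effect amounts to roughly eight months' worth of the entire world's annual forest carbon uptake, lost from semi-arid ecosystems alone.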

Scientists unaffiliated with the study said it highlights a flaw in climate models and can help update their assumptions about the ability of forests to sequester carbon.

“It’s an interesting paper,” a research ecologist at the U.S. Forest Service’s Southern Research Station in Raleigh, N.C., said. “The paper spells it out. If you have a model that doesn’t account for this ecosystem response (drought), it’s going to overpredict carbon sequestration.”

Matthew Ayres, a biology professor at Dartmouth College, called the paper “a fascinating study.”

There has long been reason to expect that forests could take many years to recover from drought, “but it was previously impossible to estimate so even our most sophisticated large-scale models have essentially ignored it,” Ayres said.

“Now we know that droughts in arid pine forests reduce carbon sequestration from the atmosphere by gigatons more than we had been estimating,” he said. “This study helps us understand the powerful feedbacks between people, forests and climate that will be prominent in shaping the properties of the planet that we hand off to our children.”

Editor’s Note: Stephen W. Pacala, a member of Climate Central 's board of directors and a Princeton University ecology professor, is one of this study’s 13 co-authors.

This article is reproduced with permission from Climate Central.

see also:

A Really Long Straw

A pressurized science project from Science Buddies


Why aren't there more super-long straws? Learn how your mouth "vacuums" up beverages when you sip through a straw--and build your own mega straw to learn about the physics behind this impressive everyday feat! How long can you go?

Key concepts
  • Atmospheric pressure
  • Gravity

Introduction

Background

Does that sound bizarre? Here is a little more explanation: The atmosphere is a massive layer of air. The weight of all that air is constantly pressing on us and on the things around us. At sea level, this invisible pressure is approximately 14.7 pounds per square inch. That is like having the weight of a bowling ball sitting on each square inch or five bowling balls pressing on the liquid filling a two-and-a-half-inch-diameter glass. Put a straw into liquid and the liquid will enter the straw until it reaches the same level as the liquid outside the straw. The liquid in the straw and around it is being pushed down by the air above it in a similar way, so they reach about the same level.
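The bowling-ball comparison checks out with a little geometry. Here is a quick sketch of my own, taking one "bowling ball" to weigh about 14.7 pounds as the text implies:

```python
import math

PSI = 14.7               # atmospheric pressure at sea level, pounds per square inch
glass_diameter_in = 2.5  # diameter of the glass, inches

# Area of the liquid surface, then the total atmospheric force pressing on it.
area_sq_in = math.pi * (glass_diameter_in / 2) ** 2
force_lb = PSI * area_sq_in

bowling_ball_lb = 14.7   # one "bowling ball" per square inch, per the text
print(f"surface area: {area_sq_in:.1f} sq in")
print(f"atmospheric force: {force_lb:.0f} lb, about "
      f"{force_lb / bowling_ball_lb:.1f} bowling balls")
```

The surface of the glass is about 4.9 square inches, so the atmosphere presses on it with roughly 72 pounds of force, in line with the "five bowling balls" estimate.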

But it gets interesting when you remove some air from the straw. Suddenly, there is less air pressure inside and liquid is pushed up the straw. The more air you remove from the straw, the higher the liquid will be pushed into it.

Do you think there is a limit to how high the liquid can rise in a straw? This activity will help you make a very large “mega-straw” and test it out!
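There is indeed a hard limit, and it follows from the same pressure argument: even with a perfect vacuum in the straw, the atmosphere can only push water up until the weight of the water column balances atmospheric pressure. A short sketch (standard physics, not part of the original activity) computes that ceiling:

```python
# Maximum height of a water column supported by atmospheric pressure:
# h = P / (rho * g), reached only with a perfect vacuum in the straw.
P_ATM = 101_325     # sea-level atmospheric pressure, pascals
RHO_WATER = 1000.0  # density of water, kg/m^3
G = 9.81            # gravitational acceleration, m/s^2

max_height_m = P_ATM / (RHO_WATER * G)
print(f"Theoretical maximum straw height: {max_height_m:.1f} m "
      f"({max_height_m * 3.281:.0f} ft)")
```

That works out to about 10.3 meters (roughly 34 feet). In practice your mega-straw will top out far lower, since your lungs can remove only part of the air and the taped joints inevitably leak a little.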

Materials

  • A package of plastic straws (at least one dozen), preferably those with a bendable part
  • Scissors
  • Ruler
  • Tape
  • Drinking glass filled with water
  • Level surface that can get wet (or if not, something to protect it)
  • Sturdy chair or table on which to stand

Preparation

  • Have an adult help to cut two half-inch slits, across from one another, lengthwise in one end of a plastic straw. These cuts will help you slip the end of one straw over another one.
  • Prepare 10 more straws in a similar way until you have enough for a superlong mega-straw! (You can also come back to these steps during the process in case you need more straws for your mega-straw.)

Procedure

  • Slip the cut end of a prepared straw over the end of an unprepared straw.
  • Wrap the area where the straws overlap with tape so you have an airtight seal. Do not hurry; a good airtight seal will help you avoid trouble later. (Hint: When you drink with a straw, you must remove air from it.)
  • To test your extralong straw, put a glass of water on level ground. (Be sure to place something down to protect your level surface, or use one that can get wet.) Now hold your straw vertically, or close to vertically, and try to drink with it.
  • If little or no liquid enters the straw, check the seal where you joined the straws. Is it airtight? If not, add tape or undo and redo this connection. If the seals at all joints seem airtight, check for holes in other areas of your mega-straw and seal them with tape.
  • Play around with your first mega-straw. Suck lightly to remove a little air from the straw, then suck hard to remove more air. Observe each time how high the water rises in your mega-straw.
  • Time to add on! Attach another prepared straw to your mega-straw in a similar way and put your lengthened mega-straw to the test. Remember to hold your straw vertically or close to vertically during your test.
  • Keep adding prepared straws and testing after each addition. You might have to carefully stand on a chair to test your growing mega-straw.
  • Once you have connected a few straws together and it becomes a little challenging to drink with the straw, test your mega-straw at different angles. In addition to holding the straw vertically, test it at an angle about halfway between horizontal and vertical (approximately 45 degrees) as well as by holding it as close to horizontal as possible. Note that you might need to add more water to your glass to test a fairly horizontal position. Rank the straw positions in descending order: 1 being the hardest to suck up water, or needing the most effort; 3 being the easiest, or needing the least effort. Note that you did not change the distance over which the water was transported; the straw stayed the same length.
  • Pause a moment and think about how the difference in height between your mouth and the glass changed depending on the angle at which you held the mega-straw. Rank the methods in descending order of difference in height between your mouth and the glass: 1 being the position with the most height; 3, the position with the least height. Do you see a connection between this ranking and your earlier ranking of effort?
  • If you have bendable sections in your straw, test what happens if you keep the height of your glass and your head the same but change the way you bend the mega-straw. Try a straight mega-straw and a mega-straw with one or several kinks.
  • Build on. Extra: Would one type of straw be better suited to making a mega-straw?

Observations and results

When you drink through a straw, you suck air out of it, lowering the air pressure inside; the greater pressure of the atmosphere outside then pushes water up into the straw. Remove more air, and a bigger difference in air pressure will cause the water level to rise even higher into the straw. As soon as the water reaches the height of your mouth, you can drink.

Your lung power determines how much air you can remove. Some will have difficulty with a three-foot straw whereas others can successfully drink standing eight feet above their drink!

There is a limit, though. If you could create a perfect vacuum in your mouth by removing all the air, atmospheric pressure could push the water up about 30 feet. It is not possible, however, to create a complete vacuum in the human mouth, so people usually reach their straw-slurping limit at a much lower level!
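
That roughly 30-foot ceiling follows from balancing atmospheric pressure against the weight of a column of water (pressure = density x gravity x height). A short sketch of the calculation, using standard sea-level values:

```python
# Maximum height water can be pushed up a straw by atmospheric
# pressure alone (that is, with a perfect vacuum above the water):
# pressure = density * gravity * height
# => height = pressure / (density * gravity)

PRESSURE_PA = 101_325    # standard atmospheric pressure, pascals
WATER_DENSITY = 1000     # kilograms per cubic meter
GRAVITY = 9.81           # meters per second squared

height_m = PRESSURE_PA / (WATER_DENSITY * GRAVITY)   # ~10.3 meters
height_ft = height_m * 3.28084                       # ~34 feet

print(f"Theoretical limit: {height_m:.1f} m ({height_ft:.0f} ft)")
```

The exact answer comes out a little above 30 feet; no amount of lung power gets you past it, because the atmosphere, not your sucking, does the pushing.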

Note that it is mainly the difference in height the water needs to overcome that counts, not the total length the water needs to travel in the straw. Holding your straw almost horizontally will allow you to suck up water over a very long distance.

More to explore
Would a Straw Work in Space?, from Science-Based Life
How Do Drinking Straws Work?, from Indiana Public Media
Under an Ocean of Air Pressure, from University of Illinois Extension
Atmospheric Pressure, from ScienceOnline

Science Buddies

see also:

Can Planting Trees Make Up for Warming River Water?

Five years ago Medford, Oregon, faced a problem common to many cities—treating sewage without hurting fish.

The city’s wastewater treatment plant was discharging warm water into the Rogue River. Fish weren’t dying, but salmon in the Rogue rely on cold water. And the Environmental Protection Agency has rules to make sure they get it.

So, instead of spending millions on expensive machinery to cool the water to federal standards, the city of Medford tried something much simpler: planting trees.

It bought credits that paid others to handle the tree planting, countering the utility's continued warm-water discharges. Shady trees cool rivers, and the end goal is 10 to 15 miles of new native vegetation along the Rogue.

Pollution-trading programs are common in other industries, such as caps on sulfur dioxide from U.S. power plants. A regulator, say the EPA, issues or sells to polluters permits allowing a set amount of emissions. Firms must own permits to match their emissions, but the total amount is capped. If a coal plant owner can't or won't cut emissions from its stacks, it buys permits, or credits, from other utilities that have chopped emissions and don't need as many.

But using credits to curb warm discharges is novel, and water-quality trading to protect stream temperatures is gaining traction in Oregon.

Supporters say it’s a win-win: wastewater plants save money, streams stay cool and the trees do other good things like provide habitat and suck up carbon.

“Traditional environmental practices, litigation, advocacy, got us a long way, but not too much further,” said Joe Whitworth, president of The Freshwater Trust, which spearheaded the Medford project. “We thought, what else is out there, what can we do different to enhance the entire watershed?”

However, some say it’s not enough to protect the state’s fish.

“It is a get-out-of-jail card,” said Nina Bell, executive director of Northwest Environmental Advocates, a Portland, Oregon-based nonprofit. “It takes care of the [wastewater] plant’s responsibility but doesn’t have the kind of real water quality benefits we need.”

Using trees to save green

With discharge likely to increase—by 2030 the plant is estimated to serve an additional 30,000 people—Medford started looking for ways to lessen its discharge footprint.

Cooling towers and chillers are a traditional solution, said Walt Meyer of West Yost Associates, the city's engineering consultant. But shady riverbanks can accomplish the same goal as expensive engineering. “It turned out trading was the most cost-effective and the most environmentally compatible,” he said.

Chillers run somewhere around $15 million to $20 million and require massive amounts of energy to run, while tree planting will cost the city about $8 million.

A software system run by The Freshwater Trust calculates the benefits of planting a given area. The Medford wastewater treatment plant can buy the credits, ostensibly offsetting the warm water they’re releasing into the environment.

The current price per credit is a little more than 1/100th of a penny. About 600 million credits will be required for Medford’s compliance.

The Freshwater Trust has to recruit landowners along the river who could use some restoration. A lease agreement grants the rights to manage the land for 20 years.

Then there’s removal of non-native species and a trip to the nursery.

Not without controversy

Many factors influence stream temperature—both natural fluctuations in temperature and precipitation, and human causes such as discharges and removing vegetation from riverbanks.

Ades estimates that of the human warming sources, wastewater treatment plants typically account for 5 to 10 percent in Oregon. The remainder is from what’s called non-point sources—such as agriculturally driven losses of streamside vegetation and river diversions.

This is where some of the controversy comes in. That 90 percent remainder is a big number.

“Some say this [Medford project] is a great idea as it will restore some riparian areas,” Bell said. “It might restore some areas, but it’s such a drop in the bucket and distracts from real question: what are we doing to achieve widespread non-point source controls?”

“Point sources" such as wastewater treatment plants are where states have the most authority to make a difference, Ades said. “The Clean Water Act doesn’t give us a lot of non-point tools."

Oregon has some local ordinances and voluntary reductions, he added. For instance, farmers can participate in water quality trading projects, accelerating their voluntary reductions.

Medford's initial round of credit-buying and tree-planting will be completed by 2020. The water from Medford is discharged in White City, and the trees are planted elsewhere in the Rogue River’s basin.

Saving money in environmental programs is key, said Larry Karp, a professor of agricultural and resource economics at the University of California. “By doing it cheaply, you can do more of it. You’re more likely to achieve environmental benefits if you go about it in efficient ways.”

And someone else pretty important agrees with Karp. In a 2012 speech, before the program had really gotten underway, President Obama praised the Medford project as an example of helping the environment without putting unnecessary financial pressure on industry.

“It worked for business, it worked for farmers, it worked for salmon,” President Obama said. “Those are the kinds of ideas we need in this country.”

The idea of water quality trading was hatched a little more than a decade ago, born out of thinking about how to lessen not only the impact of wastewater treatment plants but also that of the other 90 percent of sources that contribute to warming the river but are not fully regulated.

Trees don’t discriminate: Planting them helps the whole watershed—removing both pollutants and climate-warming gasses from the air and providing shelter and habitat for creatures.

Chillers require about 6,000 horsepower of connected load, Meyer said, which would have been an energy suck.

Bell, the critic, admits trees are good. But location matters—the trees are being planted on the main stem of the Rogue River, while shading smaller offshoot streams would have much more impact, she said.

Primozich agrees smaller tributaries would be more influenced by shade, but regulators require thermal reductions where the heat is being added, on the Rogue's main stem.

Primozich said the project is too young to have generated much shade. That’s where the aptly named Shade-A-Lator software program comes into play, estimating future shade benefits. “Within seven to 10 years, we anticipate about half of the shade at maturity,” he said.

Bell countered that “planting a few trees here and there is not addressing the state’s problem.” Her organization has sent multiple letters to the regional EPA offices with concerns about the Medford program, and officials have said “pretty much nothing in response,” she said.

“They’re still discharging something that’s warmer than it should be,” Bell said. “How can you do that without undercutting the regulatory structure of the Clean Water Act?”

Primozich said one way the state has addressed the continued discharge and lag between tree planting and shade is by using a ratio tilted in the direction of more shade. “If the utility discharges 10 units of heat, we have to plant 20 units of shade, using a 2 to 1 ratio.”
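
The 2-to-1 rule is simple enough to sketch in code. This is an illustrative calculation only; the function name and the generic "units" are assumptions for the sketch, not part of any real trading software such as Shade-A-Lator:

```python
def shade_credits_required(heat_units_discharged: float,
                           trading_ratio: float = 2.0) -> float:
    """Units of shade a utility must fund to offset its thermal
    discharge, under a trading ratio tilted toward extra shade."""
    return heat_units_discharged * trading_ratio

# Primozich's example: 10 units of discharged heat at a 2:1 ratio
print(shade_credits_required(10))  # 20 units of shade
```

The deliberately lopsided ratio is how the program compensates for the lag between planting trees and those trees actually casting shade.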

Oregon leads the way

The agency would not make any of its environmental economists available for an interview.

There have been more than 40 water quality-trading projects in the United States, according to the . However, the projects are focused on pollution—not hot water—discharge and seem to have limited success, according to a 

Medford is not the first such program in Oregon: Clean Water Services, which cleans about 60 million gallons of wastewater daily for 550,000 customers in nearby Washington County, has been using a similar trading system on the Tualatin River for about a decade, Ades said.

That program has been largely successful for the environment and has provided “widespread community benefits" such as payments to participating landowners, as well as the aesthetic and recreational value of restored riparian areas, according to a . The researchers estimate that the trading has cost Clean Water Services 95 percent less than buying new equipment would have.

James Boyd, senior fellow and director of the Center for the Management of Ecological Wealth, said as long as the natural method works as well as the technological fix, it comes down to one thing: money.

“You’re seeing that in Oregon, you might get co-benefits of riparian restoration and desirable aesthetics, but bottom line is they’re doing it cheaper,” Boyd said.

The Medford situation is analogous, Karp said.

“The societal goal is for pollution to not exceed a certain level," he noted. "You could insist every factory put on certain equipment, or in the case of Medford, make them use cold water."

But if the target is clear, Karp said, the emissions trading program has shown that letting companies decide how to achieve it is often the most efficient approach.

So why isn’t everybody using trading for warm water discharges?

“For one thing it's a lot easier to go to an engineering company and say ‘this is what we need’,” Boyd said. “You get more certain outcomes. Whereas when you start to talk about green infrastructure, things get messier.” Boyd also said the way the Clean Water Act targets wastewater plants and industries, but not farmers, creates some disincentive to work together.

Oregon has been very progressive on this, Boyd said. But making sure trading is producing the desirable result “complicates lives,” he said.

Historically environmental groups have been leery of trading programs, but that’s starting to change, Karp said.

“There’s no doubt environmental groups have come on board to a considerable extent when it comes to market-based methods,” he said. “Maybe a groundswell is an exaggeration, but there’s been at least a drift.”

Karp said one of the main objections to trading is hot spots. “By allowing trade and permits, you might lower aggregate, but might concentrate emissions, or thermal loading, in certain areas,” Karp said. But Meyer said Medford doesn’t have a hot spot.

“The stream segment where we discharge doesn’t violate temperature standards. Sixty miles downstream is the point of impact,” Meyer said.

Bell said she’s not opposed to water quality trading but maintains the Medford trading is letting the wastewater plant shed its responsibility to curb warm discharges and questions whether the promised stream restoration will actually get done.

Whitworth remains unfazed. He said such “quantified conservation” is the future. Confident that monitoring throughout the life of the project will silence critics, he sounds almost Machiavellian.

“Some in the environmental community are like ‘hey what are you doing here? You’re doing tradeoffs and the environment will get shortchanged’,” he said. “But we can quantify.

“You can do a deal with the devil himself and still be ahead environmentally.”

originally ran at 

bbienkowski@ehn.org

see also:

Google Street View Soon to Picture Local Pollution Too

The San Francisco Bay Area will soon see Google Street View vehicles that not only take snapshots of its streets but also capture snapshots of the air quality in neighborhoods they pass. Google and Aclima, a San Francisco-based air sensor technology developer, announced Tuesday that they are partnering to introduce air quality sensor-enabled Street View cars in the Bay Area, and in the future in other cities.

“We want to understand how cities live and breathe in an entirely new way,” said Davida Herzl, co-founder and CEO of Aclima. This endeavor to bring air quality monitoring closer to the people using a mobile platform has generated interest among both scientists and regulators.

“Environmental air quality is an issue that affects everyone, especially those living in big cities,” Karin Tuxen-Bettman, who manages the Google Earth Outreach program, said in a statement. In 2014, a majority of the world’s population was living in urban areas. It has long been recognized that cities are major contributors to global emissions, and many of them are under the shadow of heavy air pollution that is detrimental to public health.

In a monthlong pilot project last year in Denver, Street View cars equipped with Aclima sensors measured levels of smog ingredients, including nitrogen dioxide, nitric oxide, ozone, carbon monoxide and volatile organic compounds. The cars also tracked carbon dioxide, methane, black carbon and soot—an array of pollutants responsible for global warming.

Historically, air pollution monitoring has relied on stationary measurement, usually in towers away from city centers, which are not designed to track urban atmospheric air pollution. At the turn of the century, there was some interest in air pollution monitoring in urban cities, said Daniel Mendoza, a researcher at the University of Utah who studies the impact of air pollutants on public health.

The conventional modes of air quality monitoring are geared toward better regulation but do not necessarily relate to the lives of residents. Heavy-duty equipment like that is designed to collect comprehensive data from a site over a period of time. But as technology for air monitoring has evolved, it has made it possible for smaller instruments to achieve acceptable standards of accuracy at the street level.

EPA sees it as ‘obvious next step’

Advances in technology that have allowed these instruments to become smaller and less expensive also make it possible to integrate them with existing technology, be it a car or a phone. “The integration of these technologies with people’s lives is so fascinating,” said Kevin Gurney, an atmospheric scientist and ecologist at Arizona State University. “This way, they crowdsource the problem with the aid of good sensors.”

Quality control has traditionally been a concern about inexpensive mobile sensors, he said; however, technological shifts are happening rapidly and observers are “beginning to see the power of these technologies.” Scientists are always hungry for data, and particularly for a problem like air pollution. They often don’t have a lot of information that comes from such close proximity to the polluting sources, he said.

The idea of mobile monitoring itself is not novel—it has been recognized as a next-generation air quality-sensing technology. A handful of projects across the country already use these technologies, though the modalities differ from the Street View initiative. At the University of Utah, a team of researchers installed monitoring instruments on the Utah Transit Authority’s light-rail train that runs through the Salt Lake Valley. Using this sensor, they can track levels of fine particulate matter, ozone and greenhouse gases, and because the trains run on the same route, it is possible to trace the pollution levels at the same location at different times.

A ‘fine tool’ to go after polluters

While he agreed that mobile measurements improve coverage and are more representative than older technologies, Mendoza contended that “stationary is still the gold standard.” Google and Aclima recognized that the new mobile technology will only supplement rather than supplant traditional ways of monitoring air pollution.

“These sensors do not replace the traditional monitoring that has gone on for all these years,” Gurney said. “I am excited about what it would add to that.”

The technology would improve the resolution of air pollution mapping techniques, experts agreed. “It gives policymakers and regulators a scalpel instead of a mallet,” Gurney said. “It gives them a fine tool to go after pollution sources much more efficiently.”

He also pointed to a less tangible benefit from such monitoring. “Greenhouse gas emissions have been a very abstract problem for the public for many years; it has been associated with polar bears and images of the planet from space,” he said. “All that is perfectly legitimate imagery, but it is such an abstraction from people’s lives. Being able to show emissions as an artifact of our everyday lives, from cars, houses, factories—that is a powerful thing.”

www.eenews.net

see also:

'Imperfect' Vaccines May Aid Survival Of Ultra-Hot Viruses

Certain vaccines prevent sickness and death, but don't block transmission--meaning they may actually give some viral strains an extra shot at survival. Christopher Intagliata reports.


In the 1960s, was ravaging the . The virus caused what's called Marek's disease--and killed one to two percent of the birds. "Given that there are billions of birds in the industry, that's a lot of birds," says Andrew Read, an evolutionary biologist at Penn State University. He says the virus was easy to catch. "The dander of chickens is full of the virus. If you shake a chicken, the virus drops out."

Then, in 1970, a new vaccine put an end to most of the deaths. But the poultry vaccine, unlike most, was a so-called 'leaky' or 'imperfect' vaccine. "The vaccine is life-saving, but it allows the infection to persist and transmit from the host." Meaning you could still shake a vaccinated chicken--and make it rain viruses.

Now, Read and his colleagues have shown that these 'leaky' vaccines may actually give some viral strains an evolutionary leg up. Because the most virulent strains usually wipe out unvaccinated birds in just 10 days--not enough time for the birds to infect many others. These viruses are essentially so 'hot' they burn themselves out. But vaccinated birds survive infection with the hot virus, and shed it for weeks--allowing strains that would otherwise die out to stick around, and kill any unprotected birds. The study appears in the journal . [Andrew F. Read et al, ]

Of course, none of this is reason to doubt the efficacy of vaccines. "Vaccines have been one of the most important public health interventions. And the most cost effective we've ever had. And they're critically important in food chain security as well. So vaccines themselves are fantastic." But several imperfect vaccines for malaria are currently being tested. If they're approved, he says, we'll need to use other measures, like bed nets, to block transmission--remembering that not all vaccines are a one-shot deal.

--Christopher Intagliata

Vertigo Knocks Millions off Their Feet, and Doctors Are Looking at Ear Implants to Relieve It

Leaping through the air with ease and spinning in place like tops, ballet dancers are visions of the human body in action at its most spectacular and controlled. Their brains, too, appear to be special, able to evade the dizziness that normally would result from rapid pirouettes. When compared with ordinary people's brains, researchers found in a study published early this year, parts of dancers' brains involved in the perception of spinning seem less sensitive, which may help them resist vertigo.

For millions of other people, it is their whole world, not themselves, that suddenly starts to whirl. Even the simplest task, like walking across the room, may become impossible when vertigo strikes, and the condition can last for months or years. Thirty-five percent of adults older than 39 in the U.S.—69 million people—experience vertigo at one time or another, often because of damage to parts of the inner ear that sense the body's position or to the nerve that transmits that information to the brain. Whereas drugs and physical therapy can help many, tens of thousands of people do not benefit from existing treatments. “Our patients with severe loss of balance have been told over and over again that there's nothing we can do for you,” says Charles Della Santina, an otolaryngologist who studies inner ear disorders and directs the Johns Hopkins Vestibular NeuroEngineering Laboratory.

Steve Bach's nightmare started in November 2013. The construction manager was at home in Parsippany, N.J. “All of a sudden the room was whipping around like a 78 record,” says Bach, now age 57. He was curled up on the living room floor in a fetal position when his daughter found him and called 911. He spent the next five days in the hospital. “Sitting up in bed,” he recalls, “was like sitting on top of a six-foot ladder.” Bach's doctors told him that his left inner ear had been inflamed by a viral infection. He underwent six months of physical therapy to train his brain and his healthy right ear to compensate for the lost function in his left. It helped, and he returned to his job in May 2014. Even so, this spring he was still having unsteady moments as he made his way around a construction site. “Whatever is in your brain that tells you when your foot is going to hit the ground to keep you upright, I don't have 100 percent of that,” he says. Vertigo can also trigger severe anxiety and depression, impair short-term memory, disrupt family life and derail careers.

Such crippling difficulties are prompting physicians to test new treatments for the most severe vertigo cases, Della Santina says. He is starting a clinical trial of prosthetic implants for the inner ear. Other doctors are experimenting with gene therapy to fix inner ear damage. And the work with dancers is beginning to reveal novel aspects of brain anatomy involved with balance, parts that could be targets for future treatments.

The ears are key to keeping us upright and stable because they hold an anatomical marvel known as the peripheral vestibular system. This is a tiny arrangement, in each ear, of fluid-filled loops, bulbs and microscopic hair cells. The hairs are topped by a membrane embedded with even tinier calcium carbonate crystals. When the head moves, the crystals pull on the hairs and combine with the other bits of anatomy to relay information about motion, direction and speed to the vestibular nerve. The nerve passes it on to a region at the stem of the brain called the cerebellum, as well as other neural areas. The brain then activates various muscles and the visual system to maintain balance.

The list of things that can go wrong with this delicate system is long. Causes of inner ear vertigo include tumors, bacterial and viral infections, damage from certain antibiotics, and Meniere's disease, a chronic condition characterized by recurring bouts of vertigo, hearing loss and tinnitus that experts estimate to affect an additional five million people. The most common vestibular disorder is benign paroxysmal positional vertigo, or BPPV. It occurs when renegade crystals get loose, float into the vestibular loops and generate a false sensation of movement. Fortunately, this type of problem is usually treated effectively with physical therapy involving a repeated set of slow head movements that float the crystals out of the loops.

But physical therapy does not help everyone or, as in Bach's case, does not heal the person completely. Some patients have lost vestibular function in both ears. For them, Della Santina and his colleagues at Johns Hopkins have been developing an implant that substitutes mechanical components for damaged inner ear anatomy. Once the researchers get the green light from the U.S. Food and Drug Administration, they will begin testing their invention, called a multichannel vestibular implant, in humans. The device is modeled on the cochlear implants that have restored hearing for thousands of people since the first one was used in 1982. These implants use a microphone to pick up sound vibrations and transmit them to the brain via the auditory nerve. Instead of a microphone, a vestibular implant has two miniature motion sensors that track the movement of the head. One, a gyroscope, measures the motion of the head as a person looks up, down and around a room. The other, a linear accelerometer, measures directional movement, such as walking straight ahead or down a flight of stairs. And instead of breaking sound into different frequency components and sending them to the auditory nerve, the motion sensors send the signals connoting head position and movement to the vestibular nerve.

Results from the trial of a different vestibular implant in four patients with Meniere's disease at the University of Washington were mixed. Although it worked well initially, the effect petered out after a few months. But the Johns Hopkins device has a different design and will be used in patients with disorders other than Meniere's, so the physicians hope the outcomes will be better.

Ear Genes

Gene therapy needs to be handled carefully; it can trigger serious immune system reactions, and patients in other experiments have died. Safety factors in this trial include a gene that can be turned on only in the targeted cells, Staecker says, and a minuscule dose that does not circulate through the body. In addition, he explains, the viral jacket around the gene, which helps it penetrate cells, has been deployed “without safety problems” in about 1,500 people in previous experiments with different genes.

Even if such research succeeds, major gaps in our basic knowledge about disabling dizziness remain. For example, doctors do not know why the ear crystals get loose in the first place. These gaps are why some researchers turned to ballet dancers. The idea is to study especially robust vestibular systems to better understand the mysteries of unhealthy ones.

A team at Imperial College London used a battery of tests and brain imaging to investigate the ability of expert ballet dancers to resist vertigo while performing multiple pirouettes. The scientists studied 29 female dancers with an average of 16 years of training—the dancers started at or before age six—and compared them with female rowers. The more experienced and highly trained dancers had a lower density of neurons in parts of the cerebellum where dizziness is perceived, the group reported this year in the journal . The anatomy is smaller, the researchers think, because the dancers continually suppress the perception of dizziness. During pirouettes, dancers focus their eyes on a fixed point for as long as possible. The technique, called spotting, limits the sensory signals sent to the brain. This “active effort to resist dizziness” during years of training also left the dancers in the study with a smaller, slower network of neuron connections in a part of the right hemisphere of the brain where those signals are processed.

This kind of suppression might someday offer relief to patients with chronic vertigo, if ways can be found to develop it in nondancers using physical therapy, the scientists suggest. For thousands of patients, it would be a turn for the better.
