Sunday, August 31, 2014

Cheaper Solar Cells with a New Salt

Cadmium chloride is a nasty chemical. If it gets on the skin, it releases cadmium, which has been linked to cancer, lung disease and cardiovascular disease. And yet the expensive, dangerous compound has long been used as a coating for thin-film solar cells because it increases the efficiency of converting sunlight to energy. During manufacturing, chemists have to don protective gear and use fume hoods and other precautions to apply the coating, then carefully dispose of the dissolved cadmium waste.


Physicist Jon Major of the University of Liverpool in England and his team set out to find a replacement. They tested numerous alternative salts, including sodium chloride (table salt) and potassium chloride, and found that magnesium chloride yielded comparable efficiency. “We got cells as good as, if not better than, anything we ever got with cadmium chloride,” Major says.


Magnesium chloride is also nontoxic, abundant and costs about 300 times less than cadmium chloride. It can even be applied with a cheap spray coater purchased on the Web. The team published its research online in June in Nature. (Scientific American is part of Nature Publishing Group.)


The new material applies to solar cells made of cadmium telluride, the second most common type of solar cell on the world market. Some experts are skeptical that the swap will yield big cost savings because the largest expense varies between manufacturers. Alessio Bosio, a physicist at the University of Parma in Italy, estimates savings will be “minimal,” at about 15 percent. Still, physicist Julian Perrenoud of Switzerland's Empa, a materials science institute, who was not involved in the study, is optimistic. Using magnesium chloride, he says, “will reduce not only the health risks but also the production costs because the raw material is cheaper and much easier to dispose of.”


Friday, August 29, 2014

Ebola Drug Saves Infected Monkeys



ZMapp, the drug that has been used to treat seven patients during the current Ebola epidemic in West Africa, can completely protect monkeys against the virus, research has found.


The study, published online today in Nature, comes the day after the World Health Organization (WHO) warned that the Ebola outbreak, which has killed more than 1,500 people, is worsening and could afflict more than 20,000 people before it ends. A fifth West African nation, Senegal, reported its first case of the disease on Friday.


Public-health experts say that conventional containment measures, such as the deployment of greater numbers of health-care workers to stricken areas, should be the focus of the response. But ZMapp, made by Mapp Biopharmaceutical in San Diego, California, is one of several unapproved products that the WHO has said could ethically be used in the outbreak.


The drug — a cocktail of three purified immune proteins, or monoclonal antibodies, that target the Ebola virus — has been given to seven people: two US and three African health-care workers, a British nurse and a Spanish priest. The priest and a Liberian health-care worker who got the drug have since died. There is no way to tell whether ZMapp has been effective in the patients who survived, because they received the drug at different times during the course of their disease and received various levels of medical care.


In the study, designed and conducted in part by Mapp Biopharmaceutical scientists, 18 monkeys were given three doses of the drug starting three, four or five days after they were infected with Ebola. All animals that received the drug lived, no matter when their treatment started; three monkeys that were not treated died.


The strain of Ebola virus used in the study is not the same as the one causing the current outbreak. But researchers showed that the antibodies in ZMapp recognize the current form of the virus in cell cultures, and the parts of the virus recognized by the drug are present in the strain of Ebola that has caused the outbreak.


Advanced disease


Thomas Geisbert, a virologist at the University of Texas Medical Branch at Galveston, estimates that day 5 of infection in the monkeys studied is roughly equivalent to days 7 to 9 of a human infection. People can develop symptoms up to 21 days after they contract Ebola, although signs commonly develop between 8 and 10 days after infection.


The study authors say that ZMapp works in an “advanced” stage of the disease. The drug was able to save one monkey that had bleeding under the skin affecting more than 70% of its body, and other monkeys that had enough virus in their blood to cause severe symptoms in people, says study co-author Gary Kobinger, an infectious-disease researcher at the Public Health Agency of Canada in Winnipeg.


“In humans, the large majority are unable to walk or even sit with this level of virus in the blood, and most will die within 24 hours,” Kobinger says.


But other researchers say that the findings should be interpreted with caution, because monkeys with Ebola are not a perfect analogue for humans with the disease. “I don’t think the data support that this drug is effective, even in the animal model, in individuals with advanced Ebola disease,” says infectious-disease physician Charles Chiu at the University of California, San Francisco.


Knowing when to give the drug may help guide its use in future outbreaks. But for now, Mapp says that no more ZMapp is available, and none will be for months.


Book Review: The Marshmallow Test

By Walter Mischel. Little, Brown, 2014


One marshmallow now or two later? This simple choice has agonized preschoolers since the 1960s, when psychologist Walter Mischel began running his famous experiment to test children's ability to delay gratification. It turns out that a kid's performance on this willpower test predicts far-reaching outcomes such as SAT scores, relationship satisfaction and even body-mass index later in life. The good news is that the ability to resist instant gratification for longer-term rewards is not innate but can be learned. “It is a skill open to modification, and it can be enhanced through specific cognitive strategies that have now been identified,” Mischel writes in this account of the history of the test and the revelations it has produced. Admittedly impatient himself, he details the tactics that help our minds resist temptation and the implications of his work for child rearing, education and public policy.


How Asteroid 1950 DA Keeps It Together


The kilometer-size rubble pile appears to be held together by van der Waals forces. Karen Hopkin reports. Aug 29, 2014


From the depths of space, an asteroid hurtles toward Earth. [Well, our general vicinity.] But this is no ordinary hunk of galactic debris. Because the body of this asteroid seems to defy gravity. It’s bound by forces never observed on this scale in space.


That’s not the plot of a new summer blockbuster. It’s the result of a study in the journal Nature. [Ben Rozitis, Eric MacLennan and Joshua P. Emery: Cohesive forces prevent the rotational breakup of rubble-pile asteroid (29075) 1950 DA]


The asteroid in question is actually a kilometer-sized collection of rubble. In most cases, such space-faring pebble piles are held together by a combination of gravity and friction. But not so for our rocky interloper, dubbed “1950 DA.” This asteroid is rotating so rapidly that its pieces should have flung apart long ago.


Now, by analyzing 1950 DA’s temperature and density, researchers conclude that cohesive forces called van der Waals attractions must be keeping the pieces clustered. Van der Waals forces may sound mysterious, but they’re well known on the small scale for their weak influence within and between molecules.
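To see why rotation alone should tear such a pile apart, compare self-gravity with the centrifugal push at the equator. Here is a minimal back-of-the-envelope sketch in Python, assuming a radius of about 650 meters, a rubble-pile bulk density of about 1.7 grams per cubic centimeter and the roughly 2.1-hour rotation period reported for 1950 DA (all three values are assumptions drawn from published estimates, not from the text above):

```python
import math

# Does self-gravity hold loose rock at the equator of a spinning rubble pile?
G = 6.674e-11           # gravitational constant, m^3 kg^-1 s^-2
radius = 650.0          # m, assumed (~1.3 km diameter)
density = 1700.0        # kg/m^3, assumed rubble-pile bulk density
period = 2.1 * 3600.0   # s, assumed ~2.1-hour rotation period

# Surface gravity of a uniform sphere: g = (4/3) * pi * G * rho * r
g_surface = (4.0 / 3.0) * math.pi * G * density * radius

# Centrifugal acceleration at the equator: a = omega^2 * r
omega = 2.0 * math.pi / period
a_spin = omega ** 2 * radius

print(f"self-gravity: {g_surface:.1e} m/s^2")  # ~3e-4 m/s^2
print(f"spin push:    {a_spin:.1e} m/s^2")     # ~4e-4 m/s^2
# The outward push exceeds gravity, so without some extra cohesion
# (van der Waals forces) loose equatorial material should fly off.
```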


Blasting the asteroid, Hollywood style, could overcome these forces. But that might leave us with hundreds of smaller killer space rocks to dodge. The good news is, the asteroid won’t be in our area until 2880. So we have some time to figure it out.


—Karen Hopkin




Are Personal Drones a Public Hazard? [Video]

DIY drone enthusiasts speak out about safety


Aug 29, 2014

Attaching a GoPro camera to a personal drone gives you an aerial perspective unlike any other. That’s one reason these vehicles are becoming popular with enthusiasts. But how safe would you feel walking around with those drones buzzing overhead? What if a drone suddenly drops out of midair or crashes into a building?


The Federal Aviation Administration estimates that by 2018 thousands of small drones could be swarming through the sky. In fact, Amazon announced last year that it is developing delivery drones to help achieve speedy deliveries in the future. Today, personal drones are already in flight. “Flying above 200 feet is unnecessary with these kinds of aircraft,” says Steven Cohen, unmanned autonomous systems education coordinator at Bergen Community College. Cohen teaches STEM (science, technology, engineering and mathematics) students how to build and fly their own recreational unmanned aerial vehicles, which rely on several rotors.


Unlike their winged counterparts, recreational drones are less efficient and only have enough power for about 15 minutes of flight, according to Cohen. This limitation actually doubles as an unintended safety benefit. “It can be fatiguing to fly for longer duration,” Cohen explains, and fatigue can cause accidents.


Personal drones do have built-in safety features, including automatic failsafe responses to the loss of radio control or GPS signals, but Cohen says that when it comes to safety “it really depends on the operator and someone’s experience.”


More on drones:


Better Security Measures Are Needed Before Drones Roam the U.S. Airspace


Could Radio-Hijacked Civilian Drones Become Lethal Projectiles?


Drones Bring Fight and Flight to Battle against Poachers


Why Orchestras Haven't Been Digitized

My latest column explores the push to replace live orchestras at operas and musicals with less expensive, but less satisfying, digital ones.


This issue affects me deeply, you see, because I wasn't always a technology writer. My original aspiration was to compose Broadway musicals.


After college I spent 10 years chasing that dream. While I waited for the world to discover my compositional genius, I worked in the office of a theatrical licensing house called Music Theater International (MTI). Companies like MTI rent the rights, scripts and music for musicals to your school or theater.


My boss was fascinated by the possibilities of exploiting technology to foster more live theater. We had a lot of long conversations about what that meant. For example, should we offer canned recordings of the orchestra parts for our shows? Then many more schools and theaters would be able to produce musicals. But we also didn't want to be in the business of discouraging the use of live orchestras.


In the end, we limited our offerings to tools for use in rehearsals. For each show, we developed a set of MIDI files (which perfectly reproduce an original keyboard performance) for rehearsal use only. Because these disks could play the entire piano part perfectly, at any tempo and in any key, they permitted choir teachers, directors and choreographers to conduct appropriate rehearsals—and even multiple rehearsals simultaneously in different rooms. But we still insisted that our productions were performed with live music.


In the 20 years since then the licensing companies have tiptoed closer to permitting digital playback at live performances. For many elementary and middle schools—whose student orchestras simply don't have the ability to play a full score—productions can rent a CD of the show accompaniment; audiences are understanding.


But what about regular adult performances? Some licensing houses, including Theatrical Rights Worldwide (TRW), offer both a rehearsal tool and software that lets productions perform with a digital orchestra. "We encourage live performers and live musicians but we also look to promote and encourage the performance of our musicals," TRW president Steve Spiegel says.


So far, he says, very few professional productions rely on these digital playback offerings. "It's a small percentage, maybe 10 percent of the shows we license," he says. "Most of them haven't done musicals before. They're building a program. They don't have the budget or the musicians available. A lot of them are overseas."


Elsewhere, he says, when funding is tight, most theaters prefer to reduce the number of live players rather than use prerecorded tracks. "The good news is that the majority, by far, are only looking for something to get through the rehearsal process," Spiegel tells me. "The excitement of having a live orchestra can never be replaced. The audience expectation can never be replaced. The great majority understands the impact that the live musicians have in the theater world, and that's very comforting."



The Serious Need for Play


Free, imaginative play is crucial for normal social, emotional and cognitive development. It makes us better adjusted, smarter and less stressed



On August 1, 1966, the day psychiatrist Stuart Brown started his assistant professorship at the Baylor College of Medicine in Houston, 25-year-old Charles Whitman climbed to the top of the University of Texas Tower on the Austin campus and shot 46 people. Whitman, an engineering student and a former U.S. Marine sharpshooter, was the last person anyone expected to go on a killing spree. After Brown was assigned as the state's consulting psychiatrist to investigate the incident and later, when he interviewed 26 convicted Texas murderers for a pilot study, he discovered that most of the killers, including Whitman, shared two things in common: they were from abusive families, and they never played as kids.


Brown did not know which factor was more important. But in the 47 years since, he has interviewed some 6,000 people about their childhoods, and his data suggest that a lack of opportunities for unstructured, imaginative play can keep children from growing into happy, well-adjusted adults. “Free play,” as scientists call it, is critical for becoming socially adept, coping with stress and building cognitive skills such as problem solving. Research into animal behavior confirms play's benefits and establishes its evolutionary importance.





Thursday, August 28, 2014


Losing Ground: Southeast Louisiana is Disappearing, Quickly

In just 80 years, some 2,000 square miles of its coastal landscape have turned to open water, wiping places off maps, bringing the Gulf of Mexico to the back door of New Orleans and posing a lethal threat to an energy and shipping corridor vital to the nation’s economy.


And it’s going to get worse, even quicker.


Scientists now say one of the greatest environmental and economic disasters in the nation’s history is rushing toward a catastrophic conclusion over the next 50 years, so far unabated and largely unnoticed.


At the current rates that the sea is rising and land is sinking, National Oceanic and Atmospheric Administration scientists say by 2100 the Gulf of Mexico could rise as much as 4.3 feet across this landscape, which has an average elevation of about 3 feet. If that happens, everything outside the protective levees — most of Southeast Louisiana — would be underwater.


The effects would be felt far beyond bayou country. The region best known for its self-proclaimed motto “laissez les bons temps rouler” — let the good times roll — is one of the nation’s economic linchpins.


This land being swallowed by the Gulf is home to half of the country’s oil refineries, a matrix of pipelines that serve 90 percent of the nation’s offshore energy production and 30 percent of its total oil and gas supply, a port vital to 31 states, and 2 million people who would need to find other places to live.


The landscape on which all that is built is washing away at a rate of a football field every hour, 16 square miles per year.
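The two rates in that sentence are mutually consistent, as a quick unit conversion shows. A minimal sketch, assuming a football field of 360 by 160 feet, end zones included:

```python
# Check: is 16 square miles per year about a football field per hour?
SQ_FT_PER_SQ_MILE = 5280 ** 2   # 27,878,400 square feet
FIELD_SQ_FT = 360 * 160         # 57,600 sq ft, field with end zones
HOURS_PER_YEAR = 365 * 24

loss_per_hour = 16 * SQ_FT_PER_SQ_MILE / HOURS_PER_YEAR
print(loss_per_hour / FIELD_SQ_FT)  # ~0.9 -- roughly one field per hour
```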


For years, most residents didn’t notice because they live inside the levees and seldom travel into the wetlands. But even those who work or play in the marshes were misled for decades by the gradual changes in the landscape. A point of land eroding here, a bayou widening there, a spoil levee sinking a foot over 10 years. In an ecosystem covering thousands of square miles, those losses seemed insignificant. There always seemed to be so much left.


Now locals are trying to deal with the shock of losing places they had known all their lives — fishing camps, cypress swamps, beachfronts, even cattle pastures and backyards — with more disappearing every day.


Fishing guide Ryan Lambert is one of them. When he started fishing the wetlands out of Buras 34 years ago, he had to travel through six miles of healthy marshes, swamps and small bays to reach the Gulf of Mexico.


“Now it’s all open water,” Lambert said. “You can stand on the dock and see the Gulf.”


Two years ago, NOAA removed 31 bays and other features from the Buras charts. Some had been named by French explorers in the 1700s.


The people who knew this land when it was rich with wildlife and dotted with Spanish- and French-speaking villages are getting old. They say their grandchildren don’t understand what has been lost.


“I see what was,” said Lloyd “Wimpy” Serigne, who grew up in the fishing and trapping village of Delacroix, 20 miles southeast of New Orleans. It was once home to 700 people; now there are fewer than 15 permanent residents. “People today — like my nephew, he's pretty young — he sees what is.”


If this trend is not reversed, a wetlands ecosystem that took nature 7,000 years to build will be destroyed in a human lifetime.


The story of how that happened is a tale of levees, oil wells and canals leading to destruction on a scale almost too big to comprehend — and perhaps too late to rebuild. It includes chapters on ignorance, unintended consequences and disregard for scientific warnings. It’s a story that is still unfolding.


Speck by speck, land built over centuries

The coastal landscape Europeans found when they arrived at the mouth of the Mississippi River 500 years ago was the Amazon of North America, a wetlands ecosystem of more than 6,000 square miles built by one of the largest rivers in the world.


For thousands of years, runoff from the vast stretch of the continent between the Rockies and the Appalachians had flowed into the Mississippi valley. Meltwater from retreating glaciers, seasonal snowfall and rain carried topsoil and sand from as far away as the Canadian prairies. The river swelled as it rushed southward on the continent’s downward slope, toward the depression in the planet that would become known as the Gulf of Mexico.


Down on the flat coastal plain, the giant river slowed. It lost the power to carry those countless tons of sediment, which drifted to the bottom. Over thousands of years, this rain of fine particles gradually built land that would rise above the Gulf.


It wasn’t just the main stem of the Mississippi doing this work. When the river reached the coastal plain, side channels — smaller rivers and bayous — peeled off. They were called “distributaries,” for the job they did spreading that land-building sediment ever farther afield.


The delta had two other means of staying above the Gulf. The plants and trees growing in its marshes and swamps shed tons of dead parts each year, adding to the soil base. Meanwhile, storms and high tides carried sediment that had been deposited offshore back into the wetlands.


As long as all this could continue unobstructed, the delta continued to expand. But with any interruption, such as a prolonged drought, the new land began to sink.


That’s because the sheer weight of hundreds of feet of moist soil is always pushing downward against the bedrock below. Like a sponge pressed against a countertop, the soil compresses as the moisture is squeezed out. Without new layers of sediment, the delta eventually sinks below sea level.


The best evidence of this dependable rhythm of land building and sinking over seven millennia is underground. Geologists estimate that the deposits were at least 400 feet deep at the mouth of the Mississippi when those first Europeans arrived.


By the time New Orleans was founded in 1718, the main channel of the river was the beating heart of a system pumping sediment and nutrients through a vast circulatory network that stretched from present-day Baton Rouge south to Grand Isle, west to Texas and east to Mississippi. As late as 1900, new land was pushing out into the Gulf of Mexico.


A scant 70 years later, that huge, vibrant wetlands ecosystem would be at death’s door. The exquisite natural plumbing that made it all possible had been dismantled, piece by piece, to protect coastal communities and extract oil and gas.


Engineering the river

For communities along its banks, the Mississippi River has always been an indispensable asset and their gravest threat. The river connected their economies to the rest of the world, but its spring floods periodically breached locally built levees, quickly washing away years of profits and scores of lives. Some towns were so dependent on the river, they simply got used to rebuilding.


That all changed with the Great Flood of 1927.


Swollen by months of record rainfall across the watershed, the Mississippi broke through levees in 145 places, flooding the midsection of the country from Illinois to New Orleans. Some 27,000 square miles went under as much as 30 feet of water, destroying 130,000 homes, leaving 600,000 people homeless and killing 500.


Stunned by what was then the worst natural disaster in U.S. history, Congress passed the Flood Control Act of 1928, which ordered the U.S. Army Corps of Engineers to prevent such a flood from ever happening again. By the mid-1930s, the corps had done its job, putting the river in a straitjacket of levees.


But the project that made the river safe for the communities along its banks would eventually squeeze the life out of the delta. The mud walls along the river sealed it off from the landscape sustained by its sediment. Without that sediment, the sinking of land that once occurred only during dry cycles would start, and never stop.


If that were all we had done to the delta, scientists have said, the wetlands that existed in the 1930s could largely be intact today. The natural pace of sinking — scientists call it subsidence — would have been mere millimeters per year.


But we didn’t stop there. Just as those levees were built, a nascent oil and gas industry discovered plentiful reserves below the delta’s marshes, swamps and ridges.


At the time, wetlands were widely considered worthless — places that produced only mosquitoes, snakes and alligators. The marsh was a wilderness where few people could live, or even wanted to.


There were no laws protecting wetlands. Besides, more than 80 percent of this land was in the hands of private landowners who were happy to earn a fortune from worthless property.


Free to choose the cheapest, most direct way to reach drilling sites, oil companies dredged canals off natural waterways to transport rigs and work crews. The canals averaged 13 to 16 feet deep and 140 to 150 feet wide — far larger than natural, twisting waterways.


Effects of canals ripple across the wetlands

Eventually, some 50,000 wells were permitted in the coastal zone. The state estimates that roughly 10,000 miles of canals were dredged to service them, although that only accounts for those covered by permitting systems. The state began to require some permits in the 1950s, but rigorous accounting didn’t begin until the Clean Water Act brought federal agencies into play in 1972.


Researchers say the total number of miles dredged will never be known because many of those areas are now underwater. Gene Turner, a Louisiana State University professor who has spent years researching the impacts of the canals, said 10,000 miles “would be a conservative estimate.”


Companies drilled and dredged all over the coast, perhaps nowhere more quickly than the area near Lafitte, which became known as the Texaco Canals.


This fishing village 15 miles south of New Orleans had been named for the pirate who used these bayous to ferry contraband to the city. For years, the seafood, waterfowl and furbearers in the surrounding wetlands sustained the community. As New Orleans grew, Lafitte also became a favorite destination for weekend hunters and anglers.


Today those scenes are only a memory.


“Once the oil companies come in and started dredging all the canals, everything just started falling apart,” said Joseph Bourgeois, 84, who grew up and still lives in the area.


From 1930 to 1990, as much as 16 percent of the wetlands was turned to open water as those canals were dredged. But as the U.S. Department of the Interior and many others have reported, the indirect damages far exceeded that:




  • Saltwater crept in

    Canal systems leading to the Gulf allowed saltwater into the heart of freshwater marshes and swamps, killing plants and trees whose roots held the soils together. As a side effect, the annual supply of plant detritus — one way a delta disconnected from its river can maintain its elevation — was seriously reduced.




  • Shorelines crumbled

    Without fresh sediment and dead plants, shorelines began to collapse, increasing the size of existing water bodies. Wind gained strength over ever-larger sections of open water, adding to land loss. Fishers and other boaters used canals as shortcuts across the wetlands; their wakes also sped shoreline erosion. In some areas, canals grew twice as wide within five years.




  • Spoil levees buried and trapped wetlands

    When companies dredged canals, they dumped the soil they removed alongside, creating “spoil levees” that could rise higher than 10 feet and spread twice as wide.


    The weight of the spoil on the soft, moist delta caused the adjacent marshes to sink. In locations of intense dredging, spoil levees impounded acres of wetlands. The levees also impeded the flow of water — and sediments — over wetlands during storm tides.


    If there were 10,000 miles of canals, there were 20,000 miles of levees. Researchers estimate that canals and levees eliminated or covered 8 million acres of wetlands.




All this disrupted the delta’s natural hydrology — its circulatory system — and led to the drowning of vast areas. Researchers have shown that land has sunk and wetlands have disappeared the most in areas where canals were concentrated.


In the 1970s, up to 50 square miles of wetlands were disappearing each year in the areas with heaviest oil and gas drilling and dredging, bringing the Gulf within sight of many communities.


As the water expanded, people lived and worked on narrower and narrower slivers of land.


“There’s places where I had cattle pens, and built those pens … with a tractor that weighed 5,000 or 6,000 pounds,” said Earl Armstrong, a cattle rancher who grew up on the river nine miles south of the nearest road. “Right now we run through there with airboats.”


There are other forces at work, including a series of geologic faults in the delta and the rock layers beneath, but a U.S. Department of Interior report says oil and gas canals are ultimately responsible for 30 to 59 percent of coastal land loss. In some areas of Barataria Bay, said Turner at LSU, it’s close to 90 percent.


Even more damage was to come as the oil and gas industry shifted offshore in the late 1930s, eventually planting about 7,000 wells in the Gulf. To carry that harvest to onshore refineries, companies needed more underwater pipelines. So they dug wider, deeper waterways to accommodate the large ships that served offshore platforms.


Congress authorized the Corps of Engineers to dredge about 550 miles of navigation channels through the wetlands. The Department of Interior has estimated that those canals, averaging 12 to 15 feet deep and 150 to 500 feet wide, resulted in the loss of an additional 369,000 acres of coastal land.


Researchers eventually would show that the damage wasn’t due to surface activities alone. When all that oil and gas was removed from below some areas, the layers of earth far below compacted and sank. Studies have shown that coastal subsidence has been highest in some areas with the highest rates of extraction.


Push to hold industry accountable

The oil and gas industry, one of the state’s most powerful political forces, has acknowledged some role in the damages, but so far has defeated efforts to force companies to pay for it.


The most aggressive effort to hold the industry accountable is now underway. In July 2013, the Southeast Louisiana Flood Protection Authority-East, which maintains levees around New Orleans, filed suit against more than 90 oil, gas and pipeline companies.


The lawsuit claims that the industry, by transforming so much of the wetlands to open water, has increased the size of storm surges. It argues this is making it harder to protect the New Orleans area against flooding and will force the levee authority to build bigger levees and floodwalls.


The lawsuit also claims that the companies did not return the work areas to their original condition, as required by state permits.


"The oil and gas industry has complied with each permit required by the State of Louisiana and the Corps of Engineers since the permits became law,” said Ragan Dickens, spokesman for the Louisiana Oil and Gas Association.


State leaders immediately rose to the industry’s defense. Much of the public debate has not been about the merits of the suit; instead, opponents contested the authority’s legal right to file the suit and its contingency fee arrangement with a private law firm.


“We’re not going to allow a single levee board that has been hijacked by a group of trial lawyers to determine flood protection, coastal restoration and economic repercussions for the entire State of Louisiana,” said Gov. Bobby Jindal in a news release demanding that the levee authority withdraw its suit.


“A better approach,” he said in the statement, “to helping restore Louisiana’s coast includes holding the Army Corps of Engineers accountable, pushing for more offshore revenue sharing and holding BP accountable for the damage their spill is doing to our coast.”


The industry’s political clout reflects its outsized role in the economy of one of the nation's poorest states. The industry directly employs 63,000 people in the state, according to the federal Department of Labor.


Many of those employees live in the coastal parishes that have suffered most from oil and gas activities and face the most severe consequences from the resulting land loss.


Legislators in those areas helped Jindal pass a law that retroactively sought to remove the levee authority’s standing to file the suit. The constitutionality of that law is now before a federal judge.


Consequences now clear

Even as politicians fought the lawsuit, it was hard to deny what was happening on the ground.


By 2000, coastal roads that had flooded only during major hurricanes were going underwater when high tides coincided with strong southerly winds. Islands and beaches that had been landmarks for lifetimes were gone, lakes had turned into bays, and bays had eaten through their borders to join the Gulf.


“It happened so fast, I could actually see the difference day to day, month to month,” said Lambert, the fishing guide in Buras.


Today, in some basins around New Orleans, land is sinking an inch every 30 months. At this pace, by the end of the century this land will sink almost 3 feet in an area that’s barely above sea level today.
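That century-scale figure is straightforward arithmetic on the quoted sinking rate, assuming the rate holds steady through 2100:

```python
# An inch of subsidence every 30 months, projected from 2014 to 2100.
months = (2100 - 2014) * 12   # 1,032 months
inches = months / 30          # ~34 inches of sinking
print(inches / 12)            # ~2.9 feet -- "almost 3 feet"
```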


Meanwhile, global warming is causing seas to rise worldwide. Coastal landscapes everywhere are now facing a serious threat, but none more so than Southeast Louisiana.


The federal government projects that seas along the U.S. coastline will rise 1.5 to 4.5 feet by 2100. Southeast Louisiana would see “at least” 4 to 5 feet, said NOAA scientist Tim Osborn.


The difference: This sediment-starved delta is sinking at one of the fastest rates of any large coastal landscape on the planet at the same time the oceans are rising.


Maps used by researchers to illustrate what the state will look like in 2100 under current projections show the bottom of Louisiana’s “boot” outline largely gone, replaced by a coast running practically straight east to west, starting just south of Baton Rouge. The southeast corner of the state is represented only by two fingers of land – the areas along the Mississippi River and Bayou Lafourche that currently are protected by levees.


Finally, a plan to rebuild — but not enough money

Similar predictions had been made for years. But Hurricane Katrina finally galvanized the state Legislature, which pushed through a far-reaching coastal restoration plan in 2007.


The 50-year, $50 billion Master Plan for the Coast (in 2012 dollars) includes projects to build levees, pump sediment into sinking areas, and build massive diversions on the river to reconnect it with the dying delta.


The state’s computer projections show that by 2060 — if projects are completed on schedule — more land could be built annually than is lost to the Gulf.


But there are three large caveats.




  • The state is still searching for the full $50 billion. Congress so far has been unwilling to help.




  • If the plan is to work, sea-level rise can’t be as bad as the worst-case scenario.




  • Building controlled sediment diversions on the river, a key part of the land-building strategy, has never been done before. The predictions, then, are largely hypothetical, although advocates say the concept is being proven by an uncontrolled diversion at West Bay, near the mouth of the river.




Some of the money will come from an increased share of offshore oil and gas royalties, but many coastal advocates say the industry should pay a larger share.


In fact, leaders of the regional levee authority have said the purpose of the lawsuit was to make the industry pay for the rebuilding plan, suggesting that the state could trade immunity from future suits for bankrolling it.


That idea is gaining momentum in official circles, despite the industry’s latest win in the state Legislature.


Kyle Graham, executive director of the Louisiana Coastal Protection and Restoration Authority, said recently that the industry understands its liability for the crumbling coast and is discussing some kind of settlement. “It's very difficult to see a future in which that [such an agreement] isn't there,” he said.


Graham has said current funding sources could keep the restoration plan on schedule only through 2019. He was blunt when talking about what would happen if more money doesn’t come through: There will be a smaller coast.


“There are various sizes of a sustainable coastal Louisiana,” he said. “And that could depend on how much our people are willing to put up for that.”


A vanishing culture

Trying to keep pace with the vanishing pieces of southeast Louisiana today is like chasing the sunset; it’s a race that never ends.


Lambert said that when he’s leading fishing trips, he finds himself explaining to visitors what he means when he says, “This used to be Bay Pomme d’Or,” and reciting the growing list of other spots that now exist only on maps.


Signs of the impending death of this delta are there for any visitor to see.


Falling tides carry patches of marsh grass torn from the ever-crumbling shorelines.


Pelicans circle in confusion over nesting islands that have washed away since last spring.


Pilings that held weekend camps surrounded by thick marshes a decade ago stand in open water, hundreds of yards from the nearest land — mute testimony to a vanishing culture.


Shrimpers push their wing nets in lagoons that were land five years ago.


The bare trunks of long-dead oaks rise from the marsh, tombstones marking the drowning of high ridges that were built back when the river pumped life-giving sediment through its delta.


“If you’re a young person you think this is what it’s supposed to look like,” Lambert said. “Then when you’re old enough to know, it’s too late.”



More:

Drowning New Orleans


Protecting New Orleans


From ProPublica.org (find the original story here); reprinted with permission.




U.S. Names 20 Corals As Threatened, Down From Original List of 66

Aug 28, 2014 | By Daniel Wallis


MIAMI (Reuters) - The U.S. government this week pared back the number of reef-building coral species it was considering for listing as threatened from 66 to 20, prompting criticism from conservationists.


Environmentalists urged the National Oceanic and Atmospheric Administration on Thursday to extend the protection to all threatened marine species.


"We are concerned with NOAA's unwillingness to acknowledge the widespread threats to the coral species not receiving protections," said Bethany Cotton, wildlife program director for environmental advocacy group WildEarth Guardians.


NOAA was considering 66 coral species when it embarked on its study two years ago. On Wednesday it announced its decision, adding the 20 species to two - staghorn and elkhorn - that were listed as threatened in 2006.


Of the new species, five are found in the Caribbean, including pillar coral and rough cactus coral, and 15 in the Indo-Pacific.


A U.N.-backed study warned earlier this year that most reefs in the Caribbean could vanish in the next two decades, hit by the loss of fish and sea urchins that eat coral-smothering algae.


NOAA said it considered wide-ranging public comments as part of the rulemaking process.


"The final decision is a result of the most extensive rulemaking ever undertaken by NOAA," Eileen Sobeck, assistant administrator for NOAA Fisheries, said in a statement.


"The amount of scientific information sought, obtained and analyzed was unprecedented."


Coral is a stationary animal that slowly grows on the sea floor over tens and even hundreds of years. Coral reefs are nurseries for many types of fish; they also help protect coasts from storms and tsunamis and attract tourists.


NOAA was petitioned in 2009 by the Center for Biological Diversity to list 83 of what it said were the most vulnerable coral species found in U.S. waters as threatened or endangered under the Endangered Species Act.


The U.S. agency considered 66 of those species for the protected status.


Miyoko Sakashita, the Center's oceans director, said getting 20 species listed on Wednesday was "great news," but also a "bittersweet victory."


"This is a wake-up call that our amazing coral reefs are dying and need federal protection," Sakashita said.


"But there's hope for saving corals and many other ocean animals if we make rapid cuts in greenhouse gas pollution to stop global warming and ocean acidification."


(Reporting by Daniel Wallis; Editing by David Adams and Doina Chiacu)


Juneau Where I Am: Scientific American Alaska Cruise, Part 2

Scientific American Bright Horizons Cruise 22 arrives in Juneau, Alaska.




Warming Aids Arctic Economies But Far Short Of 'Cold Rush'

By Alister Doyle


OSLO (Reuters) - Climate change is aiding shipping, fisheries and tourism in the Arctic but the economic gains fall short of a "cold rush" for an icy region where temperatures are rising twice as fast as the world average.


A first cruise ship will travel the icy Northwest Passage north of Canada in 2016, Iceland has unilaterally set itself mackerel quotas as stocks shift north and Greenland is experimenting with crops such as tomatoes.


Yet businesses, including oil and gas companies or mining firms looking north, face risks including that permafrost will thaw and ruin ice roads, buildings and pipelines. A melt could also cause huge damage by unlocking frozen greenhouse gases.


"There are those who think that growing strawberries in Greenland and drilling for oil in the Arctic are the new economic frontiers," said Achim Steiner, head of the U.N. Environment Program.


"I would caution against the hypothetical bonanza that some people see," he told Reuters of Arctic regions in Russia, Nordic nations, Alaska and Canada. U.N. studies say global warming will be harmful overall with heatwaves, floods and rising seas.


FEWER FUR COATS


In 2002, however, Russian President Vladimir Putin mused that warming might benefit Russia - thereby easing pressure to curb greenhouse gas emissions. He joked that warmer temperatures could mean fewer fur coats in northern regions.


More than a decade later, researchers see the Arctic as a test case for the impacts of climate change. It is warming fast because a thaw of white ice and snow exposes darker ground and water below that soak up more of the sun's heat.


"So far, I believe the benefits (of Arctic warming) outweigh the potential problems," said Oleg Anisimov, a Russian scientist who co-authored a chapter about the impacts of climate change in polar regions for a U.N. report on global warming this year.


Others say it is hard to discern benefits. Factors such as improved drilling technology or relatively high oil prices around $100 a barrel may be bigger drivers for change than a thaw in a chill, remote region shrouded in winter darkness.


Off Alaska, for instance, oil company bids for leases in the Arctic Chukchi and Beaufort seas since 2005 have totaled about $2.7 billion. But a previous round in the 1980s - before global warming was an issue - attracted similar sums, according to data from the U.S. Bureau of Ocean Energy Management.


"There are subjective interpretations of development costs and benefits (tourism, fishing, oil and gas, shipping) but it will be some years before there are enough trends and data," said Fran Ulmer, President Barack Obama's chair of the U.S. Arctic Research Commission.


Indigenous peoples doubt there are benefits. Aqqaluk Lynge, a Greenlander and ex-head of the Inuit Circumpolar Council, said vital dogsleds were useless in some areas because of the thaw.


"People think the economy is Wall Street but it's the local economy that's feeling the pressure," he said.


Among new activities, 71 cargo ships used a short-cut shipping route between the Pacific and Atlantic oceans north of Russia in 2013. Roughly the same number is likely in 2014, said Sergei Balmasov of the Northern Sea Route Information Office.


In a sign of more tourism, Crystal Cruises will send its Crystal Serenity ship from Anchorage to New York in 2016 past icebergs and polar bears north of Canada - priced from $19,755 per passenger and with an escort vessel as an ice-breaker.


CRUISES, CARGO


The route was first navigated in 1903-1906 by Norwegian explorer Roald Amundsen, but has only been ice-free in some recent years. Paul Garcia, spokesman for Crystal Cruises, said there had been a high volume of bookings so far.


Tourism has benefited in some areas. The number of nights spent by visitors to the Arctic archipelago of Svalbard north of Norway rose to 107,000 in 2013 from 24,000 in 1993.


And cod, haddock, herring and blue whiting are among fish stocks expanding north. Iceland has set new, unilateral quotas for mackerel, including almost 150,000 tonnes in 2014.


"The biomass sum of all types of species is increasing, and will continue to increase in the Arctic," said Svein Sundby, of the Institute of Marine Research in Norway.


Among oil companies, Exxon Mobil began drilling in Russia's Arctic on Aug. 9, despite Western sanctions on its Russian partner Rosneft over the Ukraine crisis.


But Royal Dutch Shell dropped plans for drilling in 2014 after spending $5 billion on exploration since 2005, following protests and accidents off Alaska.


And despite any gains, a 2013 study in the journal Nature said the Arctic has a hidden economic time bomb.


A major release of methane trapped in the frozen seabed off Russia could accelerate global warming and cause $60 trillion in damage, almost the size of world GDP, it said. Costs would be from more heatwaves, floods, droughts and rising sea levels.


"The size (of drawbacks) is likely to dwarf any kind of benefits," said Chris Hope of the Judge Business School at Cambridge University, who was among the authors.


The U.N.'s panel of climate experts says that it is at least 95 percent probable that human activities are the main driver of warming since 1950. But many voters are doubtful, suspecting that natural variations are to blame.


(Editing by Tom Heneghan)


For Dessert, May I Recommend the Buglava?

The recipe for wild mushroom risotto starts with the ingredients list. The risotto includes rice, garlic, minced onion and vegetable stock. The mushroom mixture contains half a pound of wild mushrooms, garlic, butter, thyme, 12 grasshoppers with the legs and wings removed, and two thirds of a cup of buffalo worms, along with salt and freshly ground black pepper. Julia Child has left the building.


Entering the building are Arnold van Huis and Marcel Dicke, entomologists at Wageningen University in the Netherlands, along with chef Henk van Gurp, from the nearby Rijn IJssel Vakschool, which teaches hotel and tourism management. The wild mushroom risotto recipe is one of 32 in the frying Dutchmen's new volume, The Insect Cookbook: Food for a Sustainable Planet.


Americans may involuntarily utter “Gurp” as they contemplate dishes rich in grasshoppers and buffalo worms, the larvae of a rather handsome beetle. Therefore, the meat of the book is its essays discussing the value of incorporating insects into culinary cultures that have mostly eschewed them.


More than 1,900 insect species are on the menu in great swaths of the world. “People in Asia, Africa and Latin America do commonly eat insects,” the authors say, “not because of hunger, but because they are considered special treats.” Indeed, insects in these regions can be more expensive than meat. The authors note that an analogous situation exists in Europe, where meat can be less pricey than shrimp. And shrimp, being fellow arthropods, are much closer morphologically, but for some reason not yuckitudinally, to insects than to cows or chickens.


Of course, Americans already eat plenty of insects. “Apples sometimes have an insect or two in them—and these just get ground up … and become part of the applesauce and juice,” the authors point out. “The same goes for tomatoes and ketchup, grains and bread, coffee beans and coffee, and a long list of other foods.” The most healthful ingredients in your burger and side of fries may be the insect bits in the bun and ketchup.


The U.S. has legal limits, say Dicke and the two vans: “The maximum is sixty insect pieces per 3.5 ounces … of chocolate, thirty insect pieces per 3.5 ounces of peanut butter, and five fruit [fly eggs] per 1 cup … of fruit juice. Calculations indicate that each of us unknowingly consumes about 1 pound … of insects per year.”


And that does not include any foodstuffs such as red candy or strawberry yogurt that contain the dye carmine. This red additive comes from the smashed bodies of a scale insect called the cochineal. In 2012 news of the presence of carmine in six Starbucks offerings quickly got the coffee giant to switch to a tomato-derived replacement dye. Personally, I'd keep the carmine and get rid of almost everything else in the products, which included the raspberry swirl cake, the mini doughnut with pink icing and the red velvet whoopie pie.


The push to increase the attraction of insects as food for people comes from two population figures. First, the human population of the planet is expected to reach nine billion by the middle of this century. Second, the population of insects may be as high as 10 quintillion. The authors put that stat in what may seem to be more accessible terms but is still perhaps merely mind-blowing: “For every human being on Earth, there are between 200 million and 2 billion insects.” Put down the spray gun, you're surrounded.


Those nine billion people will need protein, and cultivating insects is far more efficient than producing other animal foods, especially beef, in terms of land and water use and the feed-to-food conversion ratio: about two pounds of feed will get you a pound of edible crickets, compared with 25 pounds of feed for a pound of beef. If we want a comeback of small farmers, it may be through farming these small critters.
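The quoted feed-to-food figures translate into a stark efficiency ratio. A minimal sketch using only the numbers above:

```python
# Pounds of feed needed per pound of food, as quoted above.
cricket_fcr = 2.0   # ~2 lb of feed -> 1 lb of edible crickets
beef_fcr = 25.0     # ~25 lb of feed -> 1 lb of beef

print(beef_fcr / cricket_fcr)   # 12.5: crickets are ~12.5x more feed-efficient
print(2000 / cricket_fcr)       # a ton of feed yields ~1,000 lb of crickets...
print(2000 / beef_fcr)          # ...but only ~80 lb of beef
```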


So, as The Insect Cookbook recommends, think of grasshoppers as land shrimp. Call locusts sky prawns. Frenchifying it into a bonbon could help the grasshopper hop off shelves. As my, and seemingly everybody else's, mother used to say upon the discovery of an insect in our food, “It's just a little extra protein.”


New Neurons Make Room for New Memories


How does the brain form new memories without ever filling up? Scientists turn to the youngest neurons for answers



For many years scientists believed that you were born with all the neurons you would ever get. The evidence for this dogma seemed strong: neuroanatomists in the early 20th century had identified immature neurons under the microscope but only in the brains of mammalian embryos and fetuses, never after birth.


We now know that the truth is not quite so simple. By radioactively labeling DNA, researchers gradually began to find exceptions to the rule against new neurons in the adult brain. Today scientists have identified two small regions where neurogenesis, or the birth of new neurons, continues throughout life: the olfactory bulb and the hippocampus. The former area is part of the brain's odor-discrimination system, so neurons there likely participate in this process. But the hippocampus has a much broader function. It gives us memory.





Wednesday, August 27, 2014

New Evidence Shows How Human Evolution Was Shaped by Climate


Swings between wet and dry landscapes pushed some of our ancestors toward modern traits—and killed off others



Scrambling up the steep bank of a small wadi, or gully, near the western shore of Lake Turkana in northern Kenya, I stop on a little knoll that offers a view across the vast, mostly barren desert landscape. The glittering, jade-blue lake contrasts in every way with the red-brown landscape around it. This long, narrow desert sea, nestled within Africa's Great Rift Valley, owes its existence to the Omo River, whose winding flow delivers runoff that comes from summer monsoon rains in the Ethiopian highlands, hundreds of miles north.


The heat here has to be respected. By noon it feels like a blast furnace. The sun beats down, and the hot, stony ground fires it back upward. Scanning the dusty horizon, with the lake winking in the distance, I find it hard to imagine this place as anything but a desert.





Strange Neutrinos from the Sun Detected for the First Time

An underground neutrino detector has found particles produced by the fusion of two protons in the sun’s core


Aug 27, 2014

The Borexino neutrino detector uses a sphere filled with liquid scintillator that emits light when excited. This inner vessel is surrounded by layers of shielding and by about 2,000 photomultiplier tubes to detect the light flashes.


Deep inside the sun pairs of protons fuse to form heavier atomic nuclei, releasing mysterious particles called neutrinos in the process. These reactions are thought to be the first step in the chain responsible for 99 percent of the energy the sun radiates, but scientists had never found proof until now. For the first time, physicists have captured the elusive neutrinos produced by the sun’s basic proton fusion reactions.

Earth should be teeming with such neutrinos—calculations suggest about 420 billion of them stream from the sun onto every square inch of our planet’s surface each second—yet they are incredibly hard to find. Neutrinos almost never interact with regular particles and usually fly straight through the empty spaces between the atoms in our bodies and all other normal matter. But occasionally they will collide with an atom and knock an electron loose, creating a quick flash of light visible to extremely sensitive detectors. That is how the Borexino experiment at Italy’s Gran Sasso National Laboratory found them. Its detection of so-called pp neutrinos—neutrinos created by the fusion of two protons in the sun—was a feat far from guaranteed. “Their existence was not in question, but whether some group was capable of building such an exquisitely pristine detector to see these low-energy neutrinos in real time, event by event, was,” says Wick Haxton, a physicist at the University of California, Berkeley, who was not involved in the experiment. “Borexino accomplished this through a long campaign to reduce and understand background events.”

Borexino uses a vat of liquid scintillator—a material designed to emit light when excited—contained in a large sphere surrounded by 1,000 tons of water, cocooned in layers upon layers of shielding and buried 1.4 kilometers underground. These defenses are meant to keep out everything but neutrinos, thereby excluding all other background radiation that could mimic the signal. “Unfortunately for the pp neutrinos all this is not enough,” says Andrea Pocar of the University of Massachusetts Amherst, a member of the Borexino collaboration and lead author of a paper reporting the results in the August 28 issue of Nature. (Scientific American is part of Nature Publishing Group.)

Some background contamination cannot be shielded because it originates inside the experiment. “The main background is the presence of carbon 14 in the scintillator itself,” Pocar says. Carbon 14 is a radioactive isotope common on Earth; its predictable decay schedule allows archaeologists to date ancient specimens. When it decays, however, carbon 14 releases an electron that creates a flash of light very similar to that of a pp neutrino. The physicists had to look in a narrow sliver of energies where pp neutrinos can be distinguished from errant carbon 14 decays. Even then, once in a while two carbon 14 atoms in the scintillator will decay simultaneously, and the energies of the electrons they release can “pile up” on top of one another to exactly mimic the pp neutrino flash. “We had to understand these pileup events very precisely and subtract them out,” Pocar explains. The team invented a new way to count the events and gathered data over multiple years before the researchers were convinced they had isolated a true signal. “This was a very difficult measurement to make,” says Mark Chen of Queen’s University in Ontario, who was not involved in the project. “The campaign by Borexino to purify the liquid scintillator in their detector paid off.”

Borexino’s discovery of pp solar neutrinos is a reassuring confirmation of physicists’ main theoretical models describing the sun. Previous experiments have found higher-energy solar neutrinos created by later stages of the fusion process involving the decay of boron atoms. But the lower-energy pp neutrinos were harder to find; their detection completes the picture of the sun’s fusion chain and bolsters plans for next-generation Earthbound neutrino experiments.

A strange quirk of these elementary particles is that they come in three flavors—called electron, muon and tau—and they have the bizarre ability to swap flavors, or “oscillate.” Because of the complex particularities of proton fusion reactions, all of the sun’s neutrinos happen to be born as electron neutrinos. By the time they reach Earth, however, some portion of them have morphed into muon and tau neutrinos.

Each neutrino flavor has a slightly different mass, although physicists do not yet know exactly what those masses are. Determining the masses and how they are ordered among the three flavors is one of the most important goals of current neutrino experiments. The mass differences between flavors are the main factor affecting how neutrinos oscillate.

If neutrinos are traveling through matter, their interactions with it will also alter their oscillation rates. The oscillations of higher-energy neutrinos, it turns out, are more altered by matter, leading to a larger chance they will oscillate—and therefore to fewer of them surviving as electron neutrinos by the time they reach Earth.

The Sudbury Neutrino Observatory in Ontario and Japan’s Super-Kamiokande experiment measured this phenomenon more than a decade ago when they detected the higher-energy solar neutrinos from boron decays. Now Borexino’s findings confirm the effect: more of the lower-energy neutrinos seen by Borexino persisted as electron flavor than the higher-energy neutrinos measured by those previous experiments. “This is important because matter effects have so far only been seen in the sun, yet we want to use this effect on Earth in future ‘long-baseline neutrino experiments’ to fully determine the pattern of neutrino masses,” Haxton says.

These experiments, such as the Fermi National Accelerator Laboratory’s Long-Baseline Neutrino Experiment (LBNE), planned to open in 2022, will probe how neutrinos traveling through matter oscillate. Rather than using solar neutrinos, these projects will create powerful beams of neutrinos in particle accelerators and fine-tune their pathways to make precision measurements. Fermilab’s experiment will send a stream of neutrinos from its base laboratory near Chicago to the Sanford Underground Research Facility in South Dakota. As the neutrinos fly through about 1,285 kilometers of earth on their journey (the so-called long baseline), many will oscillate. By studying how that intervening matter affects the different flavors’ oscillation rates, the researchers hope to reveal which neutrino flavors are lighter and which are heavier.

Solving the neutrino mass puzzle, in turn, could point to a deeper theory of particle physics than the current Standard Model, which does not account for neutrino masses. Borexino’s latest feat suggests that experiments are finally becoming powerful enough to pry such secrets from the evasive particles.
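The per-square-inch flux quoted above can be cross-checked against the value solar-model papers usually give in metric units. A minimal sketch, assuming a canonical pp-neutrino flux of roughly 6.5 x 10^10 per square centimeter per second at Earth (that metric figure is a standard-solar-model assumption, not a number from the article):

```python
# Convert an assumed standard solar model pp-neutrino flux to per-square-inch.
PP_FLUX_PER_CM2 = 6.5e10      # neutrinos / cm^2 / s, assumed canonical value
CM2_PER_SQ_INCH = 2.54 ** 2   # 1 in = 2.54 cm, so 1 in^2 = 6.4516 cm^2

flux_per_in2 = PP_FLUX_PER_CM2 * CM2_PER_SQ_INCH
print(f"{flux_per_in2:.2e}")  # ~4.2e11 -- the "about 420 billion" quoted above
```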


Turn On, Tune In, Get Better: Psychedelic Drugs Hold Medical Promise


Psychedelic drugs are poised to be the next major breakthrough in mental health care


Aug 14, 2014



Almost immediately after Albert Hofmann discovered the hallucinogenic properties of LSD in the 1940s, research on psychedelic drugs took off. These consciousness-altering drugs showed promise for treating anxiety, depression, post-traumatic stress disorder (PTSD), obsessive-compulsive disorder (OCD) and addiction, but increasing government conservatism caused a research blackout that lasted decades. Lately, however, there has been a resurgence of interest in psychedelics as possible therapeutic agents. This past spring Swiss researchers published results from the first drug trial involving LSD in more than 40 years.



Time for a Psychedelic Spring?


The DEA and the U.S. Food and Drug Administration maintain that there is insufficient research to justify recategorization. This stance creates a catch-22 by basing the decision on the need for more research while limiting the ability of scientists to conduct that research. The June report recommends transferring responsibility for drug scheduling from the DEA to another agency or nongovernmental organization without a history of antidrug bias, such as the U.S. National Academy of Sciences. No matter how it happens, until the drugs are reclassified, bringing psychedelics from research into clinical practice will be an uphill battle.


Why the Multiverse May Be the Most Dangerous Idea in Physics

In the past decade an extraordinary claim has captivated cosmologists: that the expanding universe we see around us is not the only one; that billions of other universes are out there, too. There is not one universe—there is a multiverse. In articles and books such as Brian Greene's The Hidden Reality, leading scientists have spoken of a super-Copernican revolution. In this view, not only is our planet one among many, but even our entire universe is insignificant on the cosmic scale of things. It is just one of countless universes, each doing its own thing.

The word "multiverse" has different meanings. Astronomers are able to see out to a distance of about 42 billion light-years, our cosmic visual horizon. We have no reason to suspect the universe stops there. Beyond it could be many—even infinitely many—domains much like the one we see. Each has a different initial distribution of matter, but the same laws of physics operate in all. Nearly all cosmologists today (including me) accept this type of multiverse, which Max Tegmark calls "level 1."

Yet some go further. They suggest completely different kinds of universes, with different physics, different histories, maybe different numbers of spatial dimensions. Most will be sterile, although some will be teeming with life. A chief proponent of this "level 2" multiverse is Alexander Vilenkin, who paints a dramatic picture of an infinite set of universes with an infinite number of galaxies, an infinite number of planets and an infinite number of people with your name who are reading this article.


Similar claims have been made since antiquity by many cultures. What is new is the assertion that the multiverse is a scientific theory, with all that implies about being mathematically rigorous and experimentally testable. I am skeptical about this claim. I do not believe the existence of those other universes has been proved—or ever could be. Proponents of the multiverse, as well as greatly enlarging our conception of physical reality, are implicitly redefining what is meant by “science.”





Tuesday, August 26, 2014

California Quake Warning System Delayed By Lack Of Money




Aug 26, 2014



By Sharon Bernstein


SACRAMENTO Calif. (Reuters) - A system to provide early warnings of earthquakes such as the one which shook California's wine country this week is planned and ready to go, but two years after the scientific work finished, the funding is still being lined up.


California passed a law last year to start the system, but did not appropriate the money needed to build it.


A system to cover California, Washington and Oregon would cost $38 million, according to the U.S. Geological Survey (USGS).


Had money been available two years ago, the system would likely have been finished by Sunday, when the largest earthquake in the region in 25 years injured more than 200 people and damaged buildings around the city of Napa, said Doug Given, earthquake early warning coordinator for the USGS.


"We don't have the funding to move it forward yet," he said.


The technology, which can provide a few seconds of warning to halt a speeding train or stop elevators before seismic waves hit, would not have helped residents near the epicenter, where shaking happened too quickly to warn anyone.


Similar systems are already in place in Japan and Mexico. When a magnitude 9.0 earthquake struck eastern Japan on March 11, 2011, people in Sendai, a city of about 1 million that was the closest big city to the epicenter, had more than 10 seconds to prepare. But those living in the coastal town of Ishinomaki, nearer to the epicenter in the seabed, had only 2 seconds’ warning before tremors began, according to the Japan Meteorological Agency.


Sensors pick up the first tremor carried by the faster-moving waves, known as "P-waves." The monitoring agency uses that data to estimate the location and intensity of shaking from the slower, so-called "S-waves," which cause most of the damaging ground motion. For a graphic, see: http://reut.rs/1qgU7fw
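The arithmetic behind those lead times is simple enough to sketch in a few lines of Python. Assuming typical crustal wave speeds (the values below are illustrative, not from the USGS report), the gap between P-wave and S-wave arrival grows with distance, which is why people at the epicenter get no usable warning while cities farther away can get several seconds:

# A warning can be issued once the P-wave is detected; the damaging
# S-wave arrives later. Real systems also lose a few seconds to
# detection, processing and alert delivery, which this sketch ignores.
VP_KM_S = 6.0   # assumed typical crustal P-wave speed
VS_KM_S = 3.5   # assumed typical crustal S-wave speed

def warning_seconds(distance_km):
    # Time gap between P-wave and S-wave arrival at a given distance.
    return distance_km / VS_KM_S - distance_km / VP_KM_S

for d in (10, 50, 100, 200):
    print(f"{d:>3} km from the epicenter: ~{warning_seconds(d):.0f} s of warning")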


A trial version of the U.S. technology, set up in the San Francisco Bay Area and Los Angeles, gave researchers, local governments and state emergency services officials 10 seconds' warning before the magnitude 6.0 Napa quake hit, enough time to take basic emergency measures had the quake been strong enough to threaten San Francisco.


A related system developed by the Berkeley Seismological Laboratory provided a warning to the area's commuter train operator, Bay Area Rapid Transit.


JAPANESE SYSTEM


Japan began developing an earthquake warning system after the devastating 1995 Kobe temblor, which caused 6,000 deaths and more than 30,000 injuries, and went online in 2004. When sensors pick up a tremor, the public is warned via television and radio stations, as well as by cellphone alerts and through loudspeaker systems.


In California, the system would piggyback on an existing network of 400 earthquake-monitoring sensors. Scientists envision doubling that to 800 sensors and sending alerts to residents, public utilities and others.


David Oppenheimer, who directs the Northern California Seismic Network for the USGS, said a few seconds would be enough to open a fire station's doors so rescue engines are not trapped inside should a power outage render them inoperable.


USGS scientists have been working on a system for western states for years in cooperation with the California Institute of Technology and the University of California, Berkeley, among others.


A report released by Given's group in May said the $38 million system for California, Washington and Oregon would cost $16 million annually to operate. A California-only system would cost $23 million to build.


But funding from federal and state sources has been slow.


California U.S. Senator Dianne Feinstein on Monday called for federal funding.


"An integrated early warning system is essential to save lives and property," Feinstein said. "What we need is the political resolve to deploy such a system."


California state senator Alex Padilla, who represents part of the Los Angeles area hit by the deadly 1994 Northridge earthquake, said scientists complained to him about difficulty obtaining federal funding nearly two years ago.


Governor Jerry Brown signed a law last year directing state emergency services officials to develop a system for California using outside funding, which Padilla said the state was working to identify.


"It wasn’t my preference but given what the state has been through for the past decade and the impact on the state’ budget, a lot of leaders were wary (of using state funds)," Padilla said.


(Additional reporting by Sophie Knight and Jiro Minier in Japan, editing by Peter Henderson)


Nearly Complete Mammoth Skeleton, Found On Farm, Goes to Texas Museum




Aug 26, 2014



By Lisa Maria Garza


DALLAS (Reuters) - A North Texas family, who discovered the skeleton of a 20,000- to 40,000-year-old mammoth while mining through sediment on their farm, is preparing to turn over the remains to a local museum.


In May, Wayne McEwen and his family were gathering material from a gravel pit on their property, south of Dallas, when his son struck a 6-foot (1.8 meter) tusk while operating an excavator.


The rest of the near-complete skeleton was unearthed by a team from a nearby community college, who determined it was a Columbian mammoth - a slightly larger, less hairy version of the more famous woolly mammoth.


The family decided to donate the remains to the Perot Museum of Nature and Science in Dallas.


Ron Tykoski, a paleontologist with the museum who is working with a team to prepare the specimen for transport, said the remains are missing a few leg bones but are mostly intact.


"We get a lot of mammoth fossils in Texas but it's usually a tooth here, a tusk there or a piece of jaw," Tykoski said on Tuesday.


"This is unusual. It looks like it just laid down and died."


There is no sign the carcass was disturbed by scavengers, likely because flood waters covered it with silt shortly after its death, he said.


The mammoth is believed to be a female because of its small size, the length of the tusks and the flare of the pelvic bones.


The animal was approximately 8-9 feet (2.4-2.7 meters) tall at the shoulder, or similar in size to a modern-day female Asian elephant.


"It needed to stay in North Texas where the local communities can enjoy it for a long time to come," McEwen said in a news release.


(Reporting by Jon Herskovitz; Editing by Sandra Maler)


Exxon Mobil Unit To Pay $1.4 Million For Louisiana Oil Spill




Aug 26, 2014



WASHINGTON (Reuters) - An Exxon Mobil Corp unit has agreed to pay $1.4 million to resolve U.S. government claims over a 2012 crude oil spill in Louisiana, the U.S. Justice Department said on Tuesday.


ExxonMobil Pipeline Company discharged 2,800 barrels of crude oil after a pipeline ruptured, in violation of the Clean Water Act, the agency said.


(Reporting by Aruna Viswanatha; Editing by Sandra Maler)


WHO Calls for Electronic Cigarette Regulation

The World Health Organization says it’s necessary to check the “booming” market and ban indoor use


Aug 26, 2014

Even in the absence of definitive research on the potential hazards surrounding the use of electronic cigarettes, regulations are needed now to head off health concerns. One such restriction should be a ban on indoor use of the devices. That's according to the World Health Organization, in a report the international body published on August 26. Electronic cigarettes, the organization states, "represent an evolving frontier, filled with promise and threat for tobacco control." The popular devices work by heating a nicotine solution into an aerosol that users inhale.

In the past nine years the e-cigarette industry has exploded to include more than 400 brands in a roughly $3-billion industry. Yet flavorings that attract children, poor quality control between brands and apparent rapid experimentation among adolescent users (with e-cig use doubling in that group from 2008 to 2012) have triggered growing fears that e-cigs are the new gateway drug. In the U.S. alone, the Centers for Disease Control and Prevention reported earlier this week that more than a quarter-million youths who had never smoked traditional tobacco cigarettes used e-cigs last year—representing a roughly threefold increase from 2011 to 2013.

The WHO report details potential regulatory options countries could consider, including blocking e-cig manufacturers from making health claims that suggest the devices are effective smoking-cessation aids (unless and until they are scientifically proved as such), banning e-cig use in public places and restricting advertising for the products. Other recommendations from the roughly 100 scientists and regulators who contributed to the report include subjecting the devices to the same surveillance and monitoring typical for tobacco products, restricting sales to minors and possibly requiring health warnings on the packaging.

E-cig aerosol "is not merely water vapor as is often claimed in the marketing for these products," the WHO wrote. Most of these gadgets "have not been tested by independent scientists, but the limited testing has revealed wide variations in the nature of the toxicity of contents and emissions." Factors including how long and deep users puff on e-cigarettes, the lack of uniformity in e-cig flavoring solutions and the possibility that users could overdose on the nicotine found in the liquid—or that the drug may contribute to other health risks—must all be considered, the agency says.

The international health organization recommends that smokers attempting to quit should first turn to tried-and-true methods such as the patch and nicotine gum before turning to e-cigarettes. The 13-page report was published in six languages.

The organization's dispatch, commissioned by the governing body of the WHO Framework Convention on Tobacco Control, was published even as the U.S. government grapples with how it will approach the devices. The U.S. Food and Drug Administration in April proposed regulations for e-cigarettes that have not yet been finalized. Those regulations, if enacted, would block youths from buying e-cigarettes and require health warnings on the products. They would not, however, ban advertising or online sales, which critics contend allow minors easy access to the devices.


Catch Me If You Ketchikan: Scientific American Alaska Cruise, Part 1

Scientific American Bright Horizons Cruise 22 arrives in Ketchikan, Alaska.




Multitasking Gene May Help Drone Operators Control Robotic Swarms

A genetic variant that keeps dopamine levels high could lead to personalized training and also benefit personnel in ERs and air traffic control towers


Aug 26, 2014

For thousands of years generals such as Caesar and Napoleon have molded citizens into soldiers en masse by using the same drills and training techniques for everyone. A recent study suggests how genetic testing could enable more personalized training for today's operators who remotely control missile-armed Predators and Reapers.


The study, funded by the U.S. Air Force Research Laboratory, looked at how different variants of the COMT, or catechol-O-methyltransferase, gene affected people's multitasking performance. The gene makes an enzyme that breaks down certain neurochemicals such as dopamine, thereby affecting behavior and mood. Humans have three variants of COMT, labeled as Met/Met, Met/Val and Val/Val. These abbreviations refer to the amino acids methionine and valine in certain paired positions in the molecular structure of the enzyme. The Met and Val variants create observable differences in human behavior that have led researchers to nickname COMT the "worrier–warrior" gene.


In the study led by Raja Parasuraman, a psychologist at George Mason University and director of the Center of Excellence in Neuroergonomics, Technology and Cognition, participants trained on a simulation that required them to each control a swarm of six military drones in a battlefield scenario. The results showed that individuals who inherited the Met/Met "worrier" variant had a significant multitasking advantage over those with the Met/Val variant or Val/Val "warrior" variant in terms of how quickly they directed their drones to intercept incoming threats and efficiently destroy enemy aircraft.


Evidently, the Met/Met variant may not degrade neurochemicals as well as the other variants, so that those with the Met/Met allele “can better express dopamine in the brain’s prefrontal cortex, which allows them to perform better in complex multitasking scenarios,” Parasuraman says. “Other individuals may simply require more time or different training techniques.”


Geneticists have already known about this particular advantage of the Met/Met variant based on past experiments that tested multitasking skill using card sorting. But the new study represents one of the first to try replicating such findings in more practical scenarios. “We’ve gone beyond a simple laboratory task to a more realistic task that has commonalities with real-world tasks,” Parasuraman explains.


This also represents one of the few studies to look at the effects of COMT variants across extended training sessions. The study's 99 student participants all underwent two practice sessions lasting seven minutes each before testing their newfound skills in two more seven-minute sessions of "low" and "high" difficulty. As expected, the Met/Met group showed greater improvement in multitasking performance than the other two groups over the course of the training.


Such work is at the forefront of neuroergonomic research aimed at designing better systems based on a scientific understanding of the brain. It represents the first genetics paper in Human Factors, a professional society publication focused on improving technological design and training.


Parasuraman hopes to eventually discover the ideal combinations of learning and training techniques for each of the variants. That ability to tailor multitasking training to each individual—which could include noninvasive brain stimulation—could theoretically benefit both military personnel and civilians such as ER physicians or air traffic controllers. "Knowing individual genotypes may also help further tailor training,” Parasuraman says.


Such individualized training options would likely come at a higher cost than the current "one size fits all" approach, Parasuraman says, so cost-benefit analyses need to be done. Certainly, multitasking efficiency makes sense for the U.S. Air Force, which envisions drone operators controlling robotic swarms in future wars. (By comparison, today’s Predator drones typically require a crew of one pilot and two sensor operators.)


But the study results also imply another possible future for the U.S. military: the idea of genetically matching individual soldiers with certain roles. The findings "could clearly be used to develop a genetic testing program," says Mildred Cho, associate director at the Stanford Center for Biomedical Ethics.


Currently, the U.S. military has no program with the known goal of using genetic screening to assign personnel to certain roles. Still, a group of independent scientific advisors for the U.S. military, known as JASON, previously released a report in 2011 recommending the armed forces prepare for the possibility of conducting genetic research on their personnel. Cho and her colleagues detailed the implications of such genetic testing in a research highlight article for Nature Reviews Genetics. (Scientific American is part of Nature Publishing Group.)


Some ethical complications remain even if U.S. commanders only consider genetic testing for tailoring individual training. For instance, the Met/Met variant may signal a better multitasker, but it is also known as the "worrier" variant because individuals who inherit it have greater vulnerability to stress and lower tolerance for pain. Such knowledge would likely give the U.S. added responsibility for shielding its most vulnerable soldiers from post-traumatic stress disorder. By comparison, holders of the Val/Val variant tend to better withstand stress and pain. "Because of this issue of genes being associated with multiple things, even weakly, what you do is create the problem of potentially finding other things that are of concern and having the obligation to do something about them," Cho says.




Why Digital Music Looks Set to Replace Live Performances

This August's production of Richard Wagner's four-opera Ring cycle in Hartford, Conn., has been postponed.


Rather than hiring pit musicians, producer Charles M. Goldstein had intended to accompany the singers with sampled instrument sounds, played by a computer. Not a CD, not a synthesizer; the computer triggers the playback of individual notes (“samples”) originally recorded from real instruments.
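For the curious, the core trick is small enough to sketch. The following hypothetical Python/NumPy fragment synthesizes a tone as a stand-in for a recorded note (a real sampler would load studio recordings) and re-pitches it the way basic samplers do, by resampling:

import numpy as np

SR = 44100  # sample rate, Hz

# Stand-in for a recorded instrument note: one second of A4 (440 Hz)
# with a simple decay envelope.
t = np.linspace(0.0, 1.0, SR, endpoint=False)
recorded_a4 = np.sin(2 * np.pi * 440.0 * t) * np.exp(-3.0 * t)

def trigger(sample, semitones):
    # Re-pitch the stored note by resampling: shifting n semitones
    # changes playback speed by 2**(n/12), the trick samplers use to
    # cover the notes that fall between recorded ones.
    ratio = 2.0 ** (semitones / 12.0)
    positions = np.arange(0.0, len(sample) - 1, ratio)
    return np.interp(positions, np.arange(len(sample)), sample)

# A note-on for C5, three semitones above the recorded A4:
c5_audio = trigger(recorded_a4, 3)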


The reaction of professional musicians—and, of course, the musicians' union—was swift and furious. New York City's Local 802 president called it operatic karaoke. Hate mail poured in. In the end, the opera's music director, as well as two of the stars, withdrew from the production.


I know exactly what Goldstein must be feeling right about now. For my first 10 years out of college, I worked on Broadway shows as a musical director and arranger. In 1993 the group now called the Broadway League (of theater owners) contacted me. They wanted me to demonstrate how well computers and samplers could serve a live performance.


I was flattered that powerful producers were seeking the advice of little 30-year-old me. I was all set to help out—until I started getting anonymous threats on my answering machine.


It turns out, the Broadway League and Local 802 were at the bargaining table, and the league wanted to use technology as leverage. The unspoken message: “If we can't reach an agreement, our shows will go on—without live music.”


I bowed out. I was a Local 802 member employed by a Broadway producer; I was in no position to choose a side. Even today, though, I'm deeply empathetic to both parties.


Musicians and music lovers argue that live orchestras are essential. Nobody buys a ticket to listen to a CD; there's something thrilling about musicians working as a unified artistic element. Of course, the musicians' unions also have a less noble interest: keeping their dwindling ranks employed.


For their part, producers often argue that there might be no show at all without a digital orchestra; live musical theater is expensive. Just look at the list of U.S. opera companies that have closed in the past few years: Opera Cleveland, Opera Pacific, San Antonio Opera and, shockingly, New York City Opera.


Do we really want to eliminate opera altogether or watch it with a piano accompaniment—a live player, yes, but a puny sound? Those outcomes serve nobody, including the public.


As technology has marched on, the musicians have lost two additional arguments: that fake music doesn't sound as good as real players and that audiences demand live players.


These days you can't tell a live but amplified orchestra from a high-end sampled one. And—tragically, to me—it doesn't seem as though, in the end, showgoers care much. During a 1993 musicians' strike, management at the John F. Kennedy Center for the Performing Arts in Washington, D.C., announced that its production would use taped accompaniment. About 90 percent of ticket holders attended anyway.


It's likely Goldstein is correct that a full live orchestra would make his Ring cycle too expensive to produce. But if we let him proceed, what's to stop producers from running with that argument, eventually replacing all live players to save money? It's a fraught situation, rife with potential for abuse on both sides.


History is not on live music's side. Canned music has largely replaced live players at dance performances, restaurants, school plays and community theaters. Nobody seems to bat an eye.


Further, the efficiencies and economies of digital technology have destroyed the old models in other creative industries: book publishing, moviemaking, pop music recording, and so on.


The battle between technology and live music will rage on for years, with passion on both sides. But as a musician and a live music fan, it's painful for me to say it: the long-term future of live pit musicians doesn't look especially upbeat.