Wednesday, September 30, 2015

World Health Org Calls for Early Treatment for Everyone with HIV

Everyone with HIV should be given antiretroviral drugs as soon as possible after diagnosis, meaning 37 million people worldwide should be on treatment, the WHO said

September 30, 2015


By Stephanie Nebehay

GENEVA (Reuters) - Everyone with HIV should be given antiretroviral drugs as soon as possible after diagnosis, meaning 37 million people worldwide should be on treatment, the World Health Organization (WHO) said on Wednesday.

Recent clinical trials have confirmed that early drug use extends the lives of those with HIV and cuts the risk of disease transmission to partners, the WHO said in a statement setting out the new goal for its 194 member states.

Under previous WHO guidelines, which limited treatment to those whose immune cell counts had fallen below a certain threshold, 28 million people were deemed eligible for antiretroviral therapy (ART).

All people at "substantial" risk of contracting HIV should also be given preventive ART, not just men who have sex with men, the WHO said.

The new guidelines are a central plank of the United Nations agency's aim to end the AIDS epidemic by 2030.

"Everybody living with HIV has the right to life-saving treatment. The new guidelines are a very important step towards ensuring that all people living with HIV have immediate access to antiretroviral treatment," said Michel Sidibe, executive director of UNAIDS.

"According to UNAIDS estimates, expanding ART to all people living with HIV and expanding prevention choices can help avert 21 million AIDS-related deaths and 28 million new infections by 2030."

The move will lead to a sharp increase in demand for ART medicines, which are typically given as a three-drug cocktail to avoid the risk of the virus developing resistance.

Major suppliers of HIV drugs include Gilead Sciences, ViiV Healthcare, which is majority-owned by GlaxoSmithKline, and multiple Indian generic manufacturers.

The medical charity Medecins Sans Frontieres (Doctors Without Borders) welcomed the WHO's "treat-all" plan, which it believes will prevent many HIV-positive people in poorer countries from falling through the treatment net.

MSF said its experience showed that a third of people who were diagnosed with HIV, but not eligible to start treatment, never returned to the clinic.

The charity also warned that making the new recommendation a reality would require dramatically increased financial support from donors and governments.

The WHO estimates that by 2020 low- and lower-middle income countries will need $18.4 billion annually for the expanded HIV fight. However, fast-tracking the response should yield economic returns of $15 per dollar invested, based on improved health and infections averted.

Since it began spreading 30 years ago, AIDS has killed around 40 million people worldwide.

see also:

All Creatures Great and Small: Elizabeth Blackburn [Video]

From jellyfish to ants, all life is beautiful in the eyes of Elizabeth Blackburn, co-winner of the 2009 Nobel Prize in Physiology or Medicine. She talks about her fascination with living things and the discovery of telomerase and telomeres.

Recorded at the 65th Lindau Nobel Laureate Meeting and produced with support from Mars, Inc.

NASA Drops Partnership with Private Asteroid Hunt

The Sentinel space telescope (artist's impression), in development by a private foundation, has lost NASA's support.

NASA has cut ties with a private foundation that intends to launch an asteroid-survey mission. The decision clouds the prospects of the only large-scale space telescope being developed to seek space objects that have the potential to wreak havoc on Earth.

NASA said Tuesday that it has ended its commitment to provide analytical and data-downlink support to Sentinel, a US$450-million satellite designed to spot 90% of near-Earth objects (NEOs) larger than 140 metres. NASA said the decision was made because the project has missed its development deadlines, and the money held in reserve for Sentinel operations is needed elsewhere. 

The Sentinel team vows to continue, but it is unclear whether the project can overcome perennial cash-flow problems and NASA’s vote of no confidence. Money for the spacecraft’s development was supposed to come from private donors, but fund-raising has lagged behind expectations.

Sentinel’s struggles are “disappointing,” says Timothy Spahr, CEO of space consultancy NEO Sciences. If the mission ever launches, it has the potential to make a big dent in the estimated half-million-plus undetected and potentially devastating asteroids that come within 45 million kilometres of Earth’s orbit. An object no more than 55 metres across ravaged trees across 2,000 square kilometres of Siberian forest in the so-called Tunguska event of 1908; a direct hit from a Tunguska-sized asteroid could lay waste to a large city. Of the roughly 363,000 near-Earth asteroids as big as the Tunguska impactor, only 565 had been discovered as of a year ago.

A steady flow of scientific reports has urged the launch of a space-based telescope dedicated to finding these objects. In 2005, the US Congress passed a law requiring NASA to track down 90% of all objects 140 metres and bigger by 2020. The space agency’s NEO budget has risen dramatically in recent years, but NASA continues to rely primarily on a patchwork of ground-based telescopes for NEO surveillance and expects to miss its deadline.

Hoping to make progress toward that goal, NASA signed a 2012 agreement with the B612 Foundation, a non-profit group in Mill Valley, California, led by former astronaut Ed Lu. The agency agreed to provide assistance worth roughly $30 million if B612 met a number of development milestones. But the group has missed its technical deadlines and raised only $1.6 million in 2013, the most recent year for which figures are available. The foundation needs roughly $30 million to $40 million annually to keep Sentinel on track.

NASA notified B612 in August that it was dissolving their agreement. The end of the deal, first reported by SpacePolicyOnline.com, “in no way changes the resolve of the B612 Foundation to move forward,” B612 Foundation CEO Lu says in a statement. But it may sway the chances of a different survey effort: NEOCam, an asteroid-spotting mission now under consideration by NASA. The NEOCam team, headed by astronomer Amy Mainzer of the Jet Propulsion Laboratory in Pasadena, California, expects to learn in the next few weeks whether it has made the first cut for funding from NASA’s Discovery programme, which supports relatively modest space missions. If NEOCam is selected, it will compete with a small number of other finalists for a 2021 launch slot.

“I have nothing but admiration for the hearts and dedication of the Sentinel team,” says planetary scientist Richard Binzel of the Massachusetts Institute of Technology in Cambridge. “But this is such a large problem to tackle that it is going to require a dedicated space agency effort.”

Some scientists think that without Sentinel as a distraction, NASA is more likely to fund a dedicated NEO telescope. “The promise of something for free was impeding NASA from making what was seen as a critical investment,” says NEOCam team member Mark Sykes of the Planetary Science Institute in Tucson, Arizona.

Sentinel isn’t the first ambitious NEO-spotting project to fall by the wayside, Spahr notes. “There’s a long history of things that didn’t work. But failure drives things forward.”

see also:

Entrepreneurs Explore Bitcoin's Future

When the digital currency Bitcoin came to life in January 2009, it was noticed by almost no one apart from the handful of programmers who followed cryptography discussion groups. Its origins were shadowy: it had been conceived the previous year by a still-mysterious person or group known only by the alias Satoshi Nakamoto. And its purpose seemed quixotic: Bitcoin was to be a 'cryptocurrency', in which strong encryption algorithms were exploited in a new way to secure transactions. Users' identities would be shielded by pseudonyms. Records would be completely decentralized. And no one would be in charge—not governments, not banks, not even Nakamoto.

Yet the idea caught on. Today, there are some 14.6 million Bitcoin units in circulation. Called bitcoins with a lowercase 'b', they have a collective market value of around US$3.4 billion. Some of this growth is attributable to criminals taking advantage of the anonymity for drug trafficking and worse. But the system is also drawing interest from financial institutions such as JP Morgan Chase, which think it could streamline their internal payment processing and cut international transaction costs. It has inspired the creation of some 700 other cryptocurrencies. And on 15 September, Bitcoin officially came of age in academia with the launch of Ledger, the first journal dedicated to cryptocurrency research.

What fascinates academics and entrepreneurs alike is the innovation at Bitcoin's core. Known as the block chain, it serves as the official online ledger of every Bitcoin transaction, dating back to the beginning. It is also the data structure that allows those records to be updated with minimal risk of hacking or tampering—even though the block chain is copied across the entire network of computers running Bitcoin software, and the owners of those computers do not necessarily know or trust one another.
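The tamper-evidence that comes from hash-linking blocks can be shown in a few lines of code. The sketch below is a toy model under simplifying assumptions, not Bitcoin's actual block format: each block merely stores the SHA-256 hash of its predecessor, so changing any old record invalidates every later link.

```python
# Toy hash-linked ledger (illustration only, not Bitcoin's real data format).
import hashlib
import json

def block_hash(block):
    """Hash a block's contents deterministically."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def add_block(chain, transactions):
    """Append a block that commits to the hash of the previous block."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "transactions": transactions})

chain = []
add_block(chain, ["genesis"])
add_block(chain, ["alice pays bob 1 BTC"])

# Tampering with the first block breaks the link stored in the second.
chain[0]["transactions"] = ["alice pays mallory 1 BTC"]
print(block_hash(chain[0]) == chain[1]["prev_hash"])  # False: tampering is detectable
```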

Many people see this block-chain architecture as the template for a host of other applications, including self-enforcing contracts and secure systems for online voting and crowdfunding. This is the goal of Ethereum, a block-chain-based system launched in July by the non-profit Ethereum Foundation, based in Baar, Switzerland. And it is the research agenda of the Initiative for CryptoCurrencies and Contracts (IC3), an academic consortium also launched in July, and led by Cornell University in Ithaca, New York.

Nicolas Courtois, a cryptographer at University College London, says that the Bitcoin block chain could be “the most important invention of the twenty-first century”—if only Bitcoin were not constantly shooting itself in the foot.

Several shortcomings have become apparent in Bitcoin's implementation of the block-chain idea. Security, for example, is far from perfect: there have been more than 40 known thefts and seizures of bitcoins, several incurring losses of more than $1 million apiece.

Cryptocurrency firms and researchers are attacking the problem with tools such as game theory and advanced cryptographic methods. “Cryptocurrencies are unlike many other systems, in that extremely subtle mathematical bugs can have catastrophic consequences,” says Ari Juels, co-director of IC3. “And I think when weaknesses surface there will be a need to appeal to the academic community where the relevant expertise resides.”

Academic interest in cryptocurrencies and their predecessors goes back at least two decades, with much of the early work spearheaded by cryptographer David Chaum. While working at the National Research Institute for Mathematics and Computer Science in Amsterdam, the Netherlands, Chaum wanted to give buyers privacy and safety. So in 1990 he founded one of the earliest digital currencies, DigiCash, which offered users anonymity through cryptographic protocols of his own devising.

DigiCash went bankrupt in 1998—partly because it had a centralized organization akin to a traditional bank, yet never managed to fit in with the financial industry and its regulations. But aspects of its philosophy re-emerged ten years later in Nakamoto's design for Bitcoin. That design also incorporated crowdsourcing and peer-to-peer networking—both of which help to avoid centralized control. Anyone is welcome to participate: it is just a matter of going online and running the open-source Bitcoin software. Users' computers form a network in which each machine is home to one constantly updated copy of the block chain.

Nakamoto's central challenge with this wide-open system was the need to make sure that no one could find a way to rewrite the ledger and spend the same bitcoins twice—in effect, stealing bitcoins. His solution was to turn the addition of new transactions to the ledger into a competition: an activity that has come to be known as mining.

Mining starts with incoming Bitcoin transactions, which are continuously broadcast to every computer on the network. These are collected by 'miners'—the groups or individuals who choose to participate—who start competing for the right to bundle transactions into a new block. The winner is the first to broadcast a 'proof of work'—a solution showing that he or she has solved an otherwise meaningless mathematical puzzle that involves encrypted data from the previous block, and lots of computerized trial and error. The winning block is broadcast through the Bitcoin network and added to the block chain, with the proof of work providing an all but unbreakable link. The block chain is currently almost 400,000 blocks long.
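The competition can be sketched in code. The toy miner below is an illustration only (Bitcoin actually applies SHA-256 twice to a binary block header against a far harder target): it simply searches for a nonce whose hash carries a required number of leading zeros, which serves as the proof of work.

```python
# Toy proof-of-work miner (illustration of the idea, not the Bitcoin protocol).
import hashlib
from itertools import count

def mine(prev_hash, transactions, difficulty=4):
    """Find a nonce whose SHA-256 hash starts with `difficulty` zero hex digits."""
    target = "0" * difficulty
    for nonce in count():
        data = f"{prev_hash}|{transactions}|{nonce}".encode()
        digest = hashlib.sha256(data).hexdigest()
        if digest.startswith(target):
            return nonce, digest  # the proof of work broadcast to the network

nonce, digest = mine("previous-block-hash", "alice pays bob 1 BTC")
print(nonce, digest)
```

Raising `difficulty` by one multiplies the expected number of guesses by 16, which is, conceptually, how the network keeps the puzzle hard enough that no single miner wins every time.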

In principle, this competition keeps the block chain secure because the puzzle is too hard for any one miner to solve every time. This means that no one will ever gain access to the encrypted links in the block chain and the ability to rewrite the ledger.

Mining is also a way to steadily increase the bitcoin supply: the miner who wins each block gets a reward, currently 25 new bitcoins. That is worth almost $6,000 at today's prices. Nakamoto's design controls the supply increase by automatically adjusting the difficulty of the puzzle so that a new block is added roughly every ten minutes. In addition, the reward for creating a block decreases by half roughly every four years. The goal is to limit the supply to a maximum of 21 million bitcoins.
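The 21-million ceiling follows directly from that halving schedule, as a quick back-of-the-envelope sum shows (the real protocol counts whole satoshis, so the exact cap is fractionally lower):

```python
# Summing the reward schedule: 50 BTC per block, halved every 210,000 blocks
# (roughly every four years at one block per ten minutes).
reward, total = 50.0, 0.0
while reward >= 1e-8:          # 1 satoshi, the smallest unit of bitcoin
    total += 210_000 * reward
    reward /= 2
print(round(total))            # ~21,000,000 bitcoins
```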

The network cannot determine the value of bitcoins relative to standard currencies, or real-world goods and services. That has been left to market forces, with people trading bitcoins on online exchanges. One result is that the market price has gyrated spectacularly—especially in 2013, when the asking price soared from $13 per bitcoin in January to around $1,200 in December. That would have made the first real-world products ever paid for with the cryptocurrency—a pair of Papa John's pizzas, purchased for 10,000 bitcoins on 22 May 2010—worth almost $12 million.

Puzzle solutions

Other issues surfaced with Bitcoin's mining procedure. As the currency has gained value, for example, mining competition has become fiercer, with increasingly specialized computers solving the puzzles ever faster. Courtois, who has found ways to streamline the puzzle-solving process, says that at one point he was successfully earning $200 a day through mining. The rivalry has driven the establishment of large Bitcoin-mining centres in Iceland, where cooling for the computers is cheap. According to one estimate from 2014, Bitcoin miners collectively consumed as much power as the whole of Ireland.

Working together

To reduce the threat from mining pools, some existing cryptocurrencies, such as Litecoin, use puzzles that call more on computer memory than on processing power—a shift that tends to make it more costly to build the kind of specialized computers that the pools favour. Another approach, developed by IC3 co-director Elaine Shi and her collaborators, enlists a helpful kind of theft. “We are cryptographically ensuring that pool members can always steal the reward for themselves without being detected,” explains Shi. Their supposition is that miners would not trust each other enough to form into pools if their fellow pool members could easily waltz off with the rewards without sharing. They have built a prototype of the algorithm, and are hoping to see it tested in Bitcoin and other cryptocurrencies.

Another problem is the profligate amount of electricity used in Bitcoin mining. To reduce wastage, researchers including Shi and Juels have proposed a currency called Permacoin. Its proof of work would require miners to create a distributed archive for valuable data, such as medical records or the output of a gene-sequencing centre. This would not save energy, but would at least put it to better use.

The security of cryptocurrencies is another huge concern. The many thefts of bitcoins do not result from the block-chain structure, says Arvind Narayanan, a computer scientist at Princeton University in New Jersey, but from Bitcoin's use of standard digital-signature technology. In digital signatures, he explains, people have two numeric keys: a public one that they give to others as an address to send money to, and a private one that they use to approve transactions. But the security of that private key is only as good as the security of the machine that stores it, he says. “If somebody hacks your computer, for example, and steals your private keys, then essentially all of your bitcoins are lost.”
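As a rough illustration of that two-key arrangement, the sketch below uses the third-party `ecdsa` Python package (pip install ecdsa) and the secp256k1 curve that Bitcoin itself uses; it is a simplified sketch of signing and verifying, not Bitcoin's actual transaction format.

```python
# Minimal digital-signature sketch using the third-party `ecdsa` package.
from ecdsa import SigningKey, SECP256k1

private_key = SigningKey.generate(curve=SECP256k1)  # kept secret on your machine
public_key = private_key.get_verifying_key()        # shared, like a receiving address

transaction = b"pay 0.5 BTC to bob"                 # hypothetical payment message
signature = private_key.sign(transaction)           # only the private key can do this

print(public_key.verify(signature, transaction))    # True: anyone can check it
```

Whoever holds `private_key` can authorize payments, which is exactly why a stolen key means stolen bitcoins.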

Security is such a concern for consumers that Narayanan thinks Bitcoin is unlikely to find widespread use. So his team is working on a better security scheme that splits private keys across several different devices, such as an individual's desktop computer and smartphone, and requires a certain proportion of the fragments to approve a payment. “Neither reveals their share of the key to each other,” says Narayanan. “If one machine gets hacked, you're still OK because the hacker would need to hack the others to steal your private key. You'll hopefully notice the hack happened before they have the chance.”
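The team's actual scheme relies on more sophisticated threshold cryptography, but the core idea, that no single device ever stores the whole key, can be illustrated with a toy two-of-two split:

```python
# Toy 2-of-2 key split (illustration only, not Narayanan's actual protocol):
# XOR the key with a random mask so each device holds a share that is useless alone.
import os

key = os.urandom(32)                      # stand-in for a 256-bit private key
share_desktop = os.urandom(32)            # random mask kept on the desktop
share_phone = bytes(a ^ b for a, b in zip(key, share_desktop))  # kept on the phone

recovered = bytes(a ^ b for a, b in zip(share_desktop, share_phone))
print(recovered == key)                   # True only when both devices cooperate
```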

Other thefts have occurred because the private key needs to be combined with a random number to create a transaction signature. Some software—such as Bitcoin apps developed for Android smartphones—has generated random numbers improperly, making them easier to guess. This has allowed hackers to steal somewhere between several thousand and several million dollars' worth of bitcoins, says Courtois, who has been investigating such vulnerabilities. “It's embarrassing,” admits David Schwartz, chief cryptographer at cryptocurrency developer Ripple Labs in San Francisco, California. “We as an industry just seem to keep screwing up.”

Into the ether

To prevent the basic cryptography-related mistakes that have plagued Bitcoin, Ethereum has recruited academic experts to audit its protocol. Shi and Juels are looking for ways that Ethereum could be abused by criminals. “The technology itself is morally neutral, but we should figure out how to shape it so that it can support policies designed to limit the amount of harm it can do,” says Juels.

Like Bitcoin, Ethereum is not under anyone's direct control, so it operates outside national laws, says Gavin Wood, a co-founder of Ethereum. However, he adds that technologies such as music taping and the Internet were also considered extralegal at first, and seemed threatening to the status quo. How Bitcoin, Ethereum and their successors sit legally is therefore “something that, as a culture and society, we're going to have to come together to deal with”, he says.

Juels suspects that Bitcoin, at least, will not last as an independent, decentralized entity. He points out how music streaming has moved from the decentralized model of peer-to-peer file-sharing service Napster to commercial operations such as Spotify and Apple Music. “One could imagine a similar trajectory for cryptocurrencies: when banks see they're successful, they'll want to create their own,” he says.

Courtois disagrees. He calls Bitcoin “the Microsoft of cryptocurrency”, and maintains that its size and dominance mean that it is here to stay. As soon as any new innovations come along, he suggests, Bitcoin can adopt them and retain its leading position.

Whatever the future holds for Bitcoin, Narayanan emphasizes that the community of developers and academics behind it is unique. “It's a remarkable body of knowledge, and we're going to be teaching this in computer science classes in 20 years, I'm certain of that.”

see also:

Climate Model Shows Limits of Global Pollution Pledges

Countries have not pledged to cut enough to restrain global warming


The Paris climate talks are a little more than two months away and most of the world’s big carbon emitters have submitted their climate pledges. That’s the good news. The bad news is that despite many countries pledging to cut carbon emissions in the coming decades, the current commitments may not be enough to limit warming to the world’s agreed-upon goal of 2°C (3.6°F).

The pledges have been rolling in all year. On Monday, Brazil said it would cut emissions 43 percent below 2005 levels over the next 15 years, stop illegal deforestation and reforest 30 million acres of land. Deforestation is a major source of Brazil’s carbon emissions.

The pledge puts Brazil in the company of 82 other countries — including the U.S., China and the large carbon polluters of the European Union — that have submitted their climate pledges to the United Nations.

To gauge the effectiveness of the proposed emissions cuts, one nonprofit group has put them into a climate model to show just how much the current goals would limit warming.

The results are mixed. The current emissions pledges will decrease warming. In a business as usual scenario, warming could go as high as 4.5°C above pre-industrial temperatures by 2100. The current goals would drop that to about 3.5°C of warming, one degree lower.

While that is progress, it’s still pretty far from the goal of limiting warming to 2°C. The 2°C limit was adopted by the European Union in 2009 and has since become the benchmark for warming for the Intergovernmental Panel on Climate Change and the UN.

If warming rises above 2°C, the impacts of climate change would only intensify. Extreme weather like droughts and large tropical cyclones would become more common, fragile ecosystems like coral reefs would be at risk of destruction and polar ice melting would swamp many coastal cities over the next century.

Several countries’ climate pledges, including those of Russia, Canada, Japan and Australia, have been rated inadequate. More ambitious goals from these relatively large emitters could bring the world closer to the 2°C goal.


see also:

Tesla's Model X Shows an SUV Can Go All-Electric

Tesla Motors Inc. frontman Elon Musk unveiled the company’s Model X sport utility vehicle in California last night.

The midsized crossover and newest option in the electric automaker’s lineup has several new selling points—a HEPA filter, blazing acceleration, agile “falcon wing” doors, a low center of gravity that reduces rollover risk, a panoramic windshield and the highest crash safety rating federal regulators can give—as well as an old marketing problem: price.

The high-end Model X, the Signature version, which Musk rolled out last night, is expected to cost between $132,000 and $144,000, well outside most buyers’ range. The company has said it will eventually produce a less-expensive version.

Tesla will reveal the Model 3, the $35,000 baseline sedan, in March 2016, at which point customers will be able to pre-order the cars, Musk announced earlier this month. Production on the Model 3 will begin in 2017, he said.

“The mission of Tesla is to accelerate the advent of sustainable transport,” Musk, the chairman and CEO of Tesla, told several thousand people at the firm’s Fremont, Calif., plant last night.

“It’s important to show that any type of car can go electric,” he said. With the Roadster and the Model S, upstart car company Tesla, the first U.S. automaker to go public since Ford Motor Co. in 1956, proved that electric sports cars and sedans can work, belying the idea that eco-friendly cars are sluggish.

“Now we’re going to show that you can do it with an SUV,” Musk said.

Another Model X feature, “which is kind of topical,” Musk said to chuckles from the audience, “is air safety.”

The Model X has the first “true” high-efficiency particulate air (HEPA) filter of any car, he said. “Now, we designed the car well before recent events,” Musk added in an allusion to the recent Volkswagen AG emissions scandal.

A few customers who had made early reservations drove home Model Xs after the debut event ended. An estimated 25,000 customers have pre-ordered the car, which is already two years behind schedule, and deliveries are expected to trickle out slowly.

High-end electrics begin to proliferate

Tesla, in its August quarterly filing, said it was “highly confident” of turning out between 1,600 and 1,800 Model S and Model X automobiles each week next year and estimated it would deliver between 50,000 and 55,000 Model S and Model X cars this year.

The company’s management said in that same document that they expect annual production to increase more than 50 percent each year “for the next several years.”

At the Frankfurt car show earlier this month, Audi AG showed off its e-tron quattro, a pure electric SUV, which the company said will be able to travel 310 miles on a full charge, farther than both the Model S and the Model X.

Luxury brands Porsche and Aston Martin are working on fully electric SUVs set to be sold to the public before the decade ends.

Yesterday, California joined 10 other national and regional governments, including the Netherlands, Norway, the United Kingdom, Connecticut, Oregon and Quebec, as a founding member of the International ZEV Alliance, a group pushing for zero-emission vehicle deployment targets.

“Limiting the impact of climate change is only possible if we transition to cleaner, more energy efficient vehicles,” Matthew Rodriquez, California’s secretary for environmental protection, said in a statement. “We’re very pleased to be joining with many of the leaders in this growing market, and we look forward to working with them to put more drivers at the wheel of zero-emission vehicles.”


see also:

Advanced Robotics on a Dime

The toy company WowWee brings expensive university bots to store shelves


The robotic butlers and sentries of sci-fi fantasies already roam our planet, but you can't have them—not yet. The fate of most would-be home robots breaks in one of two ways: Bots such as Honda's Asimo, a bipedal assistant, exist only as demonstrations from multimillion-dollar research and development laboratories. Robots that consumers could purchase, such as the $1,600 Pepper companion robot, are unaffordable for most. Toy company WowWee aims to change all that when it delivers the first sub-$600 multifunction home-service robot. The freestanding, self-navigating Switchbot—part concierge, part security guard—will roll out in 2016.

Hong Kong–based WowWee's success stems from bringing to life university research projects that might otherwise languish in the prototype stage. A licensing agreement with the Flow Control and Coordinated Robotics Labs at the University of California, San Diego, for example, provides WowWee with access to patents and the labs with a healthy cash infusion. The collaboration has already netted a series of toy robots that balance like Segways. More recently, the avionics lab at Concordia University in Montreal began working with the company to perfect flight algorithms for a four-rotor drone. Next, chief technology officer Davin Sufer says he has his eye on the Georgia Institute of Technology and its work with swarming behaviors, which would allow a group of robots to function in tandem.

In the case of Switchbot, WowWee adapted a locomotion system developed in part by former U.C. San Diego student Nick Morozovsky. The robot moves on tank-tread legs either horizontally to navigate uneven terrain or on end to stand and scoot fully upright. Morozovsky built his prototype with off-the-shelf parts, including a set of $50 motors. The motors were a compromise; each one had the size and torque he wanted but not the speed. Over the past few years he has worked with WowWee to customize a motor with the exact parameters needed and to cut the final cost of the part down to single digits.

That back and forth yields low-cost, mass-producible parts, which means university-level robotics could become available to everyday people. “One of the reasons I went into mechanical engineering was so I could create real things that have a direct impact,” Morozovsky says. “I didn't expect that to necessarily happen in the process of grad school.”

Academic research that translates directly to consumer electronics is rare, especially given how quickly WowWee can turn products around, says Fred Reinhart, president of the Association of University Technology Managers, which promotes transfer of intellectual property from universities to companies. But WowWee has to innovate quickly because toy companies need new stuff every year. Unlike a lab, “there isn't the luxury of being able to develop the technology just to see where it will take us,” Sufer says. “That pressure makes cool things happen.”

Tech Transfer

WowWee has a long track record of bringing lab-borne robots to reality

MiP The first product to come out of WowWee's collaboration with U.C.S.D., MiP's self-balancing system—including sensors, wireless radios, motors and processors—had to be reimagined to cut the cost of raw materials significantly.

OutRunner The spiky-wheeled land cruiser, based on an unfunded Kickstarter project by former Florida Institute for Human and Machine Cognition scientists, can hit speeds of up to 20 miles per hour. WowWee is working with the team to bring the prototype system's sticker price down from $500 to sub-$200, primarily by sourcing smaller, more efficient motors.

Switchbot The U.C.S.D. research project that led to Switchbot was about twice the size of the two-foot final product and cost nearly $1,800 in parts to build. To trim the price substantially, WowWee is working directly with suppliers and researchers to perfect new motors and balancing sensors.

Intellicopter WowWee has been paying attention to the shortfalls of many remote-controlled quadcopters—particularly that the learning curve for flying them is steep. That's why it is working with Concordia University researchers to create flight-control algorithms to help better train new pilots.

see also:

Graphene Finally Gets an Electronic On-Off Switch

Long-sought method could turn super-thin material into usable computer components


Graphene, a single layer of carbon atoms arranged in a honeycomb-shaped lattice, exhibits a range of superlative properties. Since it was first isolated in 2004, it has been found to have exceptional strength, thermal conductivity and electric conductivity. The last property makes the material ideal for the tiny contacts in electronic circuits, but ideally it would also make up the components—particularly transistors—themselves.

To do so, graphene would need to behave not just as a conductor but as a semiconductor, which is the key to the on–off switching operations performed by electronic components. Semiconductors are defined by their band gap: the energy required to excite an electron stuck in the valence band, where it cannot conduct electricity, to the conduction band, where it can. The band gap needs to be large enough so that there is a clear contrast between a transistor’s on and off states, and so that it can process information without generating errors.

Regular graphene has no band gap—its unusually rippled valence and conduction bands actually meet in places, making it more like a metal. Nonetheless, scientists have tried to tease them apart. By fabricating graphene in odd shapes, such as ribbons, band gaps up to 100 meV have been realised, but these are considered too small for electronics.
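A rough Boltzmann estimate shows why a gap of about 100 meV is considered too small: the population of carriers thermally excited across a gap Eg scales roughly as exp(-Eg/2kT), and at room temperature kT is only about 26 meV, so a semiconductor-like gap of several hundred meV suppresses leakage, and sharpens the on-off contrast, by orders of magnitude more than a 0.1 eV gap. The quick calculation below is an illustrative estimate, not a device simulation.

```python
# Rough estimate of thermally excited carriers across a band gap at ~300 K.
import math

kT = 0.0259  # eV, thermal energy at room temperature

def relative_leakage(eg_ev):
    """Boltzmann factor exp(-Eg / 2kT) for carriers excited across the gap."""
    return math.exp(-eg_ev / (2 * kT))

for eg in (0.1, 0.5):   # a ribbon-like gap vs. a semiconductor-like gap
    print(f"Eg = {eg:.2f} eV -> relative leakage ~ {relative_leakage(eg):.1e}")
```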

Edward Conrad at the Georgia Institute of Technology, US, and colleagues fabricated their graphene by epitaxial growth. In this method, a silicon carbide (SiC) substrate is heated to temperatures of 1360°C, at which point it begins to decompose and form graphene layers. The researchers found that the first of these layers, normally called the buffer layer, forms a band gap greater than 0.5 eV, because of the highly periodic way it bonds to the SiC substrate.

A physicist at the University of Konstanz in Germany who was not involved in the work says this is ‘almost, but not quite’ as large as the band gap in regular semiconductors. ‘It remains to be seen whether the graphene produced in this way also possesses the favourable electronic properties of previously studied graphene samples,’ he adds, ‘but the result reported is certainly very promising.’

Conrad and colleagues’ epitaxial method for generating semiconducting graphene is not new. In 2006, a group led by Alessandra Lanzara at the University of California, Berkeley, US, investigated the second layer of graphene grown epitaxially above silicon carbide, and reported a band gap of 0.26 eV. Conrad says the main difference in his group’s work is the honing of the growth technique. ‘It turns out that crystalline order is extremely important to get this band gap, and they didn’t have that,’ he explains.

When Conrad and colleagues tried growing their graphene just 20°C lower than their favoured temperature, they found that the band gap was non-existent. Conrad likens the development to the early days of silicon electronics. ‘If you go back to the early days of silicon transistors in the 1960s, it was really about [finding] incredibly highly ordered crystals,’ he says. And the high cost of silicon-carbide wafers doesn’t matter at this stage, he adds. ‘The first [silicon] transistors they sold were $1,500. The point is, you get the device first, and you worry about the cost later.’

Conrad claims that his colleagues at Georgia Tech are already using his semiconducting graphene to make transistors, with on–off current ratios on the order of one million to one—ten times more than is required by regular electronics. ‘So, it’s starting to work,’ he says.

Chemistry World. The article was

see also:

Toxic Habits: Overthinking

This week, we’ll wrap up our three-part series on toxic habits. Our third toxic habit? Call it overthinking, obsessing, brooding, or wallowing, or call it by its official term: rumination. In this episode of the Savvy Psychologist, Dr. Ellen Hendriksen offers 4 tips to stop the mental hamster wheel.



Rumination is thinking (and thinking and thinking) about something upsetting, but in a passive way, without actually taking action.

Now, I bet you never thought you’d learn about taxonomy in a psychology podcast, but I promise I’ll connect the dots: animals like cows, deer, goats, and sheep belong to the suborder Ruminantia. These multi-stomached ruminants regurgitate their partially digested food and chew it again.  

Likewise, ruminators chew on their thoughts, as it were, over and over and over again. Very different, but essentially the same concept. How’s that for a mental image?

What’s So Bad About Rumination?

Rumination has been found to impair problem-solving skills, which makes ruminators less likely to take action on a possible solution, makes them more pessimistic about the future, and pretty much guarantees a bad mood. In fact, those who ruminate develop major depression at four times the rate of those who don’t. It’s like a hamster running frantically on a wheel, exhausting itself without actually going anywhere.

 

see also:

Big Data Are Reducing Homicides in Cities across the Americas

Violence is a big problem in modern society and in cities in particular. Homicides were rampant in my hometown of Cali, Colombia, when I became mayor in 1992. Few people saw murder as a pressing health problem, but I did—probably because I had earned a Ph.D. in epidemiology at the Harvard School of Public Health. I decided to apply the statistical methods used by public health experts to identify the sources of homicide and to reveal social and policy changes that might make a difference.

At the beginning of my first term, the people of Cali and all of Colombia generally believed, mistakenly, that little could be done because we Colombians were “genetically violent.” Other skeptics maintained that violent crime would not diminish unless profound changes were made on socioeconomic issues such as unemployment and educational levels. My administration and I proved all these people wrong.

We developed an epidemiological database about the many societal factors that significantly raised the risk that a homicide would happen. These included sometimes subtle aspects of human behavior, such as the desire to carry guns in certain places or the tendency to drink alcohol on certain days. This exhaustive and fine-grained information led to new laws and policies built on data, not politics.

The method worked. Annual homicides in my city, then home to nearly 1.8 million people, dropped from 124 per 100,000 residents in 1994 to 86 just three years later, after the leading causes were found and policies were applied. An even larger decline took place over nine years in Bogotá, after our capital city adopted the same methods. And when I was elected mayor of Cali for a second time, in late 2011, after being out of office for almost 18 years, the same approach reduced homicide rates again. Let me tell you the story of how big data and scientific analysis can help solve entrenched social problems.

Pinpoint the root causes
When I began my first term, I did what epidemiologists generally do: plot cases on a map. I hung a big printout of Cali on my office wall and stuck color-coded pins in it at each location of a death, intentional injury, traffic accident, home burglary or other violent event. When a journalist saw the map, his local newspaper ran a headline that read: “Mayor Guerrero Intends to Curb Violence with Acupuncture.”

Even to smart journalists, evidently, it was strange to look at homicide in a statistical way. But to me, it made perfect sense: if epidemiological methods could find the causes of medical diseases, they could find the causes of a societal disease.

Using statistics was crucial because Colombia had a long record of violence that left many misimpressions. Beginning in the late 1940s, La Violencia, a fierce struggle for power between the two main political parties, sparked over 200,000 killings across more than 10 years. Guerrilla warfare followed for decades. The cultural tolerance for violent responses to conflict was so high when I took office that quarrels between neighbors or drivers in traffic accidents frequently ended in homicide. In 1991 Medellín, the second-largest city in Colombia, had an annual homicide rate of 380 per 100,000. Around that time, Chile's rate was 2.9.

My epidemiological approach began with a definition of violence scripted by the World Health Organization: the use of force with the intention to cause harm or death. This definition does not include accidents or psychological or political violence.

Despite the media's preoccupation with domestic warfare, only 36 percent of the deaths in Colombia in 1991 were caused by guerrillas, mostly in rural areas. I thought drug dealers would arise as the culprits in the other 64 percent. As we investigated the who, where and when of each death in Cali, however, we found that homicide victims and aggressors were predominantly young, unemployed males who had low levels of education, came from the poorer sectors of the city and were frequently involved in gang fights. We also found that close to 80 percent of homicides were carried out with firearms. When we discovered that two thirds of homicides took place on weekends, we decided to chart blood alcohol levels in victims; more than half of them had been intoxicated. These facts pointed to social disintegration more than drug-related violence.

Drug traffic still had an effect, but it was not the direct cause of most homicides. As we analyzed the numbers, we realized that drug traffic was to society as HIV is to the human body: the virus attacks defense mechanisms, making the body vulnerable to other diseases. Likewise, drug dealers attack the police and the judiciary and political systems—the defense mechanisms of society. These weakened institutions arose as risk factors for violence. For example, the police identified a suspect in only 6 percent of homicides, and the judiciary system brought even fewer to trial.

Also, children were often exposed to violence and maltreatment, and violent content was prevalent in the media. In a culture of violence, economic inequality and ineffective public security, people killed and got killed, often under the influence of alcohol, over conflicts as simple as noisy neighbors or settling debts.

Change the culture
Our goal was to reveal risk factors we could control directly. Because firearms were used in a large proportion of homicides and alcohol was often associated with the deaths, in November 1993 I began to change gun and alcohol laws.

In my country, guns are manufactured and sold by the Colombian National Army, so military authorities opposed my idea of a permanent ban on weapon-carrying permits. But they agreed to our suspending the permits in public places on select dates identified by the data as posing a high risk, which was generally associated with high alcohol consumption. These dates included New Year's Eve and (strangely) Mother's Day, as well as days when payments to employees, made on the 15th and 30th of each month in Colombia, coincided with a Friday.

I also restricted alcohol sales in public places after 2 A.M.—a measure my administration called the semidry law. Nightclub owners objected adamantly, so I proposed a deal: I would apply the law for three months, and if violent deaths and injuries did not diminish, I would drop it. After only two weeks, hospitals reported such a drastic reduction in violence-related emergencies that abandoning the measure was not an option. I enforced the semidry law until the end of my term.

An epidemiological strategy also calls for evaluating interventions. After several months, we found that when both alcohol sales and firearms permits were restricted, there was a 35 percent reduction in homicides versus days when neither restriction was in force. The reduction was 14 percent when firearms alone were restricted.
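That kind of comparison is straightforward to reproduce with a crime database. The sketch below uses made-up daily counts, not Cali's actual records, to show the calculation: group days by which restrictions were in force, then compute the percentage drop relative to unrestricted days.

```python
# Illustrative intervention comparison using invented daily homicide counts.
from statistics import mean

# Each record: (guns_restricted, alcohol_restricted, homicides_that_day)
days = [
    (False, False, 9), (False, False, 8), (False, False, 10),
    (True,  False, 8), (True,  False, 7),
    (True,  True,  6), (True,  True,  5), (True,  True,  6),
]

def average(guns, alcohol):
    """Average daily homicides for days matching the given restrictions."""
    return mean(h for g, a, h in days if g == guns and a == alcohol)

baseline = average(False, False)
for guns, alcohol, label in [(True, False, "guns only"), (True, True, "guns + alcohol")]:
    drop = 100 * (baseline - average(guns, alcohol)) / baseline
    print(f"{label}: {drop:.0f}% fewer homicides than unrestricted days")
```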

Other interventions included adding more prosecutors, as well as putting more police on the streets and improving their equipment, such as surveillance cameras, cars and radios. To support these people in their challenging careers, we launched a privately funded program to help police officers become homeowners and gave computers and training to members of the judiciary. Crime prevention rose, and more suspects were brought to trial.

We also created two Houses of Justice—premises within violent neighborhoods on the outskirts of Cali in which all law-enforcement institutions operated 24 hours a day. Previously these services were available only downtown and during business hours. This change was particularly helpful in reducing domestic violence because investigations would begin immediately after forensic medical personnel certified a victim's injuries, which lessened the chance that women would withdraw their complaints under pressure from their husbands. In an effort to offer young males in poor districts greater opportunities for education, recreation, income and social connections, I launched DESEPAZ—a program to restore public safety by improving the cohesiveness of a neighborhood. As part of the program, we opened “youth houses” in several communities where people could socialize and gather around cultural and sports activities. City workers also trained youth who were involved with gangs to run small businesses. The city even hired one such business dedicated to manufacturing cobblestones to pave streets.

Improve the data
We realized early on that the data we were working with were not always cohesive. For example, in my first security council meeting in July 1992, it became clear that the police and judiciary used different definitions of homicide, which complicated our efforts to pin down causes of deaths. To fix the issue, I established weekly security meetings that involved officials from the police, judiciary and forensic authorities, members of the Institute for Research and Development in Violence Prevention and Promotion of Social Coexistence (CISALVA) at the University of Valle, cabinet members responsible for public safety, and the municipal statistics agency. Information was reported weekly to me and to police commanders. We held a security council meeting every week of my term. Slowly the data coalesced. The meetings evolved into “observatories of crime,” sometimes called “social observatories.” CISALVA, which is dedicated to studying violence prevention, has kept the observatory's weekly data running for 22 years—to my knowledge, the longest reliable set of information on violence in any Colombian city.

Based on the improved analysis of risk factors, we began interventions at the end of 1993 and widened them before my two-and-a-half-year period as mayor ended in December 1994. My successor continued them. The homicide rate in Cali dropped from 124 per 100,000 in 1994 to 112 in 1995, 100 in 1996, and 86 in 1997. It is difficult to say how much of the decline was a direct result of the interventions because the national government was also changing how police fought drug cartels. But evaluations in Cali and Bogotá confirm that the epidemiological approach played an important role. I believe that is true in part because the mayors who followed my successor did not keep in place unpopular measures such as the restriction of alcohol consumption, and the homicide rate climbed back up.

Experience in Bogotá, the country's largest city, backs up the data-intensive method. When Antanas Mockus became mayor there in January 1995, he applied and improved our strategy. His most important tactical interventions were increasing the police budget 10-fold, improving police education about violent crime, developing temporary detention centers for minor offenders and creating a government position of subsecretary of violence prevention. The social interventions included rebuilding dilapidated public spaces and tripling investment in health and education.

Mockus also implemented a semidry law and restrictions on firearms, which quickly reduced homicide rates as much as they had in Cali. In Bogotá, strict use of the epidemiological method spanned three administrations over nine years, from 1995 through 2003. Across that time, homicide rates dropped from 59 per 100,000 to 25. As in Cali, some of that improvement may have been helped by changes at the national level.

New tactics, 20 years later
In Colombia, mayors cannot be reelected consecutively (and I had other plans anyway). After I left office, I dedicated myself to spreading the word that urban violence could be controlled and to doing further research about that goal. I went to work at the Pan American Health Organization in Washington, D.C., was instrumental in actions that created the Inter-American Coalition for the Prevention of Violence and helped to garner approval of a loan from the Inter-American Development Bank to Cali, Medellín and Bogotá for deterring violence. After three years, I returned to Cali and helped to launch VallenPaz, an organization devoted to creating economic programs in rural southwestern Colombia as an alternative to the lure of money from guerrillas and illicit drug crops.

Years later, however, I found that there is no lifelong immunity to politics. I ran again for mayor of Cali.

When I took office on January 1, 2012, I found a different city. Cali had grown from 1.8 million inhabitants in 1994 to 2.4 million. Most of the additional people were migrants, primarily from Colombia's Pacific coast and neighboring rural areas. After years of incompetent administrations and one mayor ousted from office, collective self-esteem was low, and unemployment was up from 6.9 percent in 1994 to 13 percent in 2013. Although the large Colombian drug cartels were dismantled in the 1990s, they had fragmented into smaller cartels that worked rather independently in the nation's cities, particularly in Medellín and Cali. Drug dealing was still present, and new forms of crime had emerged, such as small “vaccine” payments required by gangs to protect local businesses and war over the territorial control of drug distribution and selling within cities.

The good news was that the Colombian police had become professional and trustworthy. The national homicide rate had dropped from 79 in 1991 to 36 in 2011. Yet Cali's homicide rate was around 80, compared with 22 in Bogotá and 70 in Medellín.

I immediately reinstated the weekly security council meetings. Soon our data analyses showed that the proportion of homicides resulting from interpersonal conflict such as quarrels and alcohol-related brawls had diminished compared with the period of 1992 to 1994. But killings that we classified as organized crime—those that were premeditated and involved sophisticated weapons such as machine guns—accounted for 67 percent of violent deaths in 2012. Data suggested that organized crime was playing a bigger role. The data also showed that social inequalities had gotten worse since my earlier term.

We presented our data to the national government and suggested it create specialized groups of criminal investigators, police and prosecutors to dismantle criminal bands. My administration also began a massive social investment plan in 11 districts that were home to a total of 800,000 people, 26 percent of them living in poverty and another 6.5 percent in extreme poverty.

The plan that resulted, called Territories of Inclusion and Opportunities, is still in effect today. It applies a geographical approach to fighting poverty, focusing interventions in impoverished areas and encouraging local residents to play big roles. Local and national officials work on raising incomes, extending school schedules, promoting cultural activities and sports, and improving housing, health facilities and public education. We also teach parenting skills and peaceful conflict resolution.

Together with the effort from the national government to fight organized crime, our interventions again reduced violence. Cali's homicide rate of 83 in 2012 dropped to 62 in 2014. This pattern has continued; the number of homicides in the first quarter of 2015 was lower than in the same period in any of the past 12 years.

All these coordinated police and social actions support the crime-reduction effort. A good example of the strategy is Comuna 6, a political district of Cali that is home to 212,000 mostly middle-income residents. We energetically implemented the coordinated police and social interventions, and homicides went down 44 percent within a year's time, from 160 in 2013 to 89 in 2014.

The epidemiological approach to reducing violence is passing the test in other cities in Colombia and across the Americas. Crime observatories—the evolution of our regular security council meetings—are essential to the approach. The Inter-American Development Bank, the U.S. Agency for International Development and the World Bank, among others, now recommend that cities or states create the observatories when seeking financial support for violence-prevention programs. Today four national and numerous municipal-level observatories are meeting systematically in 26 countries and cities in the Americas.

A study published in the International Journal of Injury Control and Safety Promotion found that homicides were significantly reduced in 22 Colombian cities in the three-year period after the observatories were implemented. Studies directly comparing cities in different countries are difficult, however, because countries have diverse definitions of crimes and varying criteria for collecting information. To improve the situation, the Inter-American Development Bank is supporting a project to standardize violence indicators across the Americas.

Political will is the top priority
Using an epidemiological strategy to help solve a social issue may seem straightforward, but it is not. The first lesson I can espouse is that such a move takes strong political will because the strategy frequently requires public officers to do things they would rather not do, such as making necessary but unpopular decisions to close bars or ban firearms. Making crime data public can also be uncomfortable, but it is essential, just as economists releasing unemployment and gross domestic product numbers is essential to formulating economic strategy. Data on social issues such as violence and education are now published periodically for various Colombian cities by nonprofit groups called Bogotá How Are We Doing, Cali How Are We Doing, and so on. The information makes public officials and mayors accountable in their communities.

The second lesson is that there is no one-size-fits-all approach in applying epidemiological methods to social issues because cities and countries have different risk factors. Data-driven observation is needed in each context to guide public officials.

The process also requires perseverance and patience. Certain risk factors can be controlled rapidly—for example, by banning firearms or restricting bar hours—but other measures, such as improving the reach of police and judiciary services, take longer. Steps such as correcting social inequalities or establishing healthy child-rearing practices need not only time and patience but also considerable resources.

Urban violence is socially regressive because it mostly affects the poor, and fighting crime devours a portion of the public budget, which could instead be invested to eradicate poverty. Violence prevention must therefore be a priority for humanity.

see also:


What Could Criminals Do with 5.6 Million Fingerprint Files?


Of all the personal data that cybercriminals can steal, your biometric information is the most unsettling. Purloined passwords and credit cards can be changed to guard against identity theft and fraud. Fingerprints, however, cannot. At least, not permanently. Perhaps the only silver lining to the U.S. Office of Personnel Management’s announcement that criminals had stolen 5.6 million fingerprint files, up from the 1.1 million files originally reported missing, is that it would be extremely difficult to use such biometric data to commit fraud or theft.

Movies and television shows often concoct identity-theft plots involving fingerprints discreetly lifted from, say, a drinking glass and transferred to latex gloves. Misuse of stolen digital fingerprint files is hardly that straightforward and would involve cracking encryption codes, reverse-engineering data files and several other complicated procedures that are probably not worth the effort. The raid on OPM’s computers—which impacts 21.5 million current, former and prospective federal workers—included a treasure trove of addresses, dates of birth and other personal information that would be much easier to exploit.

The fingerprint theft was more likely meant as a psychological blow to the government and its employees, says Kayvan Alikhani, senior director of technology at security firm RSA. Given the highly personal nature of biometric data, which in other settings can include such characteristics as DNA or patterns in one’s iris, retina or palm veins, “by the time you could convince users that it’s not that bad, your reputation is already damaged.”

Commercial fingerprint-based security systems used by businesses and government agencies create digital maps of the ridges and valleys that make each person’s fingertips unique. Most systems generate these maps by scanning high-resolution images of a person’s hand and using software algorithms to encode this map data into a file that can be used to identify that person. A properly configured system will delete the images after use and encrypt the files containing these encoded fingerprint maps, Alikhani says.
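That storage step can be sketched in a few lines. The example below is illustrative only: it assumes the third-party `cryptography` Python package (pip install cryptography), and the template string is a hypothetical encoding, not a real fingerprint format.

```python
# Encrypting an encoded fingerprint template before it is stored (sketch only).
from cryptography.fernet import Fernet

key = Fernet.generate_key()        # in practice held in a hardware module or key vault
vault = Fernet(key)

template = b"minutiae:(12,40,ridge-end);(87,13,bifurcation)"  # hypothetical encoding
stored_record = vault.encrypt(template)  # this ciphertext is what a thief would exfiltrate

print(vault.decrypt(stored_record) == template)  # True, but only with the key
```

Without the key, a stolen database of such records is just ciphertext, which is why the encryption and reverse engineering described below matter so much to an attacker.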

Consumer tech versions of fingerprint readers—such as Apple’s iPhone Touch ID—are a bit different. Rather than taking digital images, they measure a fingertip’s electrical capacitance to capture a fingerprint image. Hackers have already proven they can fool such sensors and break into an iPhone. But they’ve done this by painstakingly copying physical fingerprints and applying them to the sensor. The iPhone’s digital fingerprint records are encrypted and stored exclusively on the phone itself. Apple says it does not keep copies of those files on the network. That means a thief would need to already have access to an iPhone in order to steal the fingerprint file.

The OPM fingerprint data, however, was stolen from the agency’s networks and computers as opposed to the fingerprint readers themselves. The agency last week issued a statement reassuring the public that the stolen fingerprint data is of limited use to criminals at this time, although the agency didn’t rule out future problems as “technology evolves.” Alikhani took this statement to mean the data was most likely encrypted. “In order for you to get back to the original fingerprint, you would have to break the encryption used when storing one’s fingerprint template,” he says. Assuming a criminal has the processing power and time to do that, he or she would then have to reverse engineer the algorithm used to encode the fingerprint data. In the unlikely event a criminal put this much effort into the scheme, that reverse-engineered data could then be reassembled into the original fingerprint image.

There has been research showing that it is possible to take encrypted, templated biometric information and reverse engineer it, Alikhani says. But the process is difficult and requires knowledge of the technology used to create the biometric profile. Even if someone were able to do all of this, that person would still need to create a physical copy of the fingerprint—perhaps 3-D printed and glued to a latex glove—to fool the actual fingerprint scanner guarding entry to a particular facility or computer. This might work, but only in the unlikely scenario that there are no other security measures in place.

The U.S. Customs and Border Protection agency’s Global Entry program, for example, allows international travelers to skip long lines at airport immigration by scanning their passports, face and fingerprints at a kiosk. If a traveler answers some questions and these identifying features match those already on file, the kiosk prints a receipt that the traveler can show to a Homeland Security official on the way out of the airport. If there is no match, the traveler can expect an immediate conversation with on-site police.

Science of the People, by the People and for the People

It’s good to learn something new every day. The Internet makes that easy, placing knowledge at our fingertips. Learning doesn’t require consulting experts; we can enlighten each other. Crowdsourcing knowledge means asking everyone, as Michael Feldman does on the radio with “Whad’Ya Know?”: I learn something from you; you learn something from me.

With a slight tweak, this type of crowdsourcing can also be used to learn something that no one yet knows. Crowdsourcing for scientific discovery, known as “citizen science,” involves asking everyone, “Whad’Ya observe, experience, find or ponder?” Assemble the contributions in the appropriate way and voila! New knowledge.

With a “more heads are better than one” approach, citizen science leads to discoveries that would not be possible with scientists working alone.

To bring heads together, more than 40 U.S. federal agencies have joined the Federal Community of Practice on Citizen Science and Crowdsourcing. Some already make elaborate use of citizen science, like the U.S. Geological Survey relying on people who watch birds, notice when flowers bloom and experience earthquakes, and the National Oceanic and Atmospheric Administration relying on weather updates from volunteer observers. Other agencies are getting their feet wet, like NASA, the Federal Communications Commission and the National Archives. Today, if you follow former President Kennedy’s iconic advice and ask what you can do for your country, the resounding answer is: citizen science.

It makes sense to draw on We the People. According to government statistics on occupations, there are about 6 million scientists in the U.S. Yet our human capital includes over 300 million potential citizen scientists (plus, one does not need to be an American citizen to be a citizen scientist in the U.S.). There is no need to limit our national energies for innovation and scientific revolutions to the small percentage of professionals when We the People can do citizen science.

USA National Phenology Network citizen scientist Lucille Tower records the one-millionth observation, of a vine maple, in the large nature database as part of USGS’s Nature’s Notebook project.

The federal agencies have learned that coordinating citizen science is itself a science. The challenges traditional scientists deal with, such as data management, quality and potential sources of bias, have to be addressed in citizen science too. Plus, the challenges are often amplified by the scale of citizen science projects, which can involve tens of thousands of people contributing observations or micro-tasks towards a single research pursuit. Specialists in education, communication, information science, data visualization, human-computer interaction and more are coming together to ensure rigorous citizen science. There are international membership organizations, such as the Citizen Science Association, to help foster science by the people.
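The projects themselves do not publish their screening pipelines here, but a common crowdsourcing safeguard is to accept an observation only when several independent contributors agree. The sketch below is a generic illustration of that idea under stated assumptions; the function, labels and thresholds are hypothetical, not any agency’s actual workflow.

```python
# Minimal sketch of consensus-based quality control for crowdsourced labels.
# Hypothetical example: accept a species identification for a photo only when
# enough volunteers agree; otherwise route it to an expert for review.
from collections import Counter

def consensus(labels, min_votes=3, min_agreement=0.7):
    """Return (accepted_label, needs_expert_review) for one item's crowd labels."""
    if len(labels) < min_votes:
        return None, True                      # too few reports to trust
    top_label, top_count = Counter(labels).most_common(1)[0]
    if top_count / len(labels) >= min_agreement:
        return top_label, False                # strong agreement: accept
    return None, True                          # contributors disagree: escalate

# Example: five volunteers classify the same trail-camera photo.
print(consensus(["coyote", "coyote", "coyote", "gray fox", "coyote"]))  # ('coyote', False)
print(consensus(["coyote", "gray fox", "dog"]))                          # (None, True)
```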

The complexity of implementing citizen science drove the Federal Community of Practice on Citizen Science and Crowdsourcing to create a tool kit, which is being released tomorrow, to guide federal agencies (and others) in harnessing the power of curious people throughout the nation.

To celebrate the release and spur use of the tool kit, the White House is hosting a citizen science forum from 8:00 a.m. to 12:00 p.m. on September 30 called “Open Science and Innovation: Of the People, For the People, By the People.” Citizen science projects are providing ways to overcome national challenges such as conserving pollinators, monitoring droughts, recovering from coastal flooding and improving health with low-cost sensors.

The White House event will be webcast live, with live tweeting (#WHCitSci; follow @WhiteHouseOSTP and me, @CoopSciScoop).


Although the federal tool kit is a new development, federal agencies have long recognized the intellectual capital of We the People. For example, in the mid-1800s, U.S. Naval officer Matthew Fontaine Maury wanted to chart the distribution and seasonal migrations of whales. (Maury is a real character who is even footnoted in fiction.)

Maury became the father of oceanography by charting the ocean currents and winds: a journey he completed without leaving his desk at the Depot of Charts and Instruments of the U.S. Navy. At the time, sailing the high seas was a risky undertaking. Not much had changed over the 25 centuries since Homer’s story of Odysseus, which is an epic navigational nightmare. Instead of multiple gods toying with sailors’ lives, mariners in the 1800s believed in false protections like not whistling on board for fear of challenging the winds. When sailors saw a shark, it was a bad omen; a cat on board was good luck. At the mercy of forces that seemed to act on a whim, sailors filled the voids in their understanding with superstitious explanations.

Commander Matthew Fontaine Maury, one of the pioneers of citizen science in the 19th century, asked fellow sailors to report whale sightings in order to chart the marine mammals' distribution and seasonal migration.

To fill the void another way, Navy and merchant sailors dutifully began sending regularly recorded estimates of latitude and longitude, wind direction, wind speed and weather conditions to Maury. By aggregating observations from over 1,000 ships across the seven seas, Maury created wind and current charts that instantly made sailing safer and sped commerce. In this way, he used citizen science and crowdsourcing to identify the safest and most efficient ocean routes and earned the moniker “Pathfinder of the Seas.” Maury’s project continues to this day, administered by an office within the Department of Defense, which is also a member of the Federal Community of Practice on Citizen Science and Crowdsourcing.
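Maury’s method was pencil-and-ledger aggregation, but the idea translates directly into a few lines of modern code. The sketch below uses invented ship reports and an arbitrary 5-degree grid purely to illustrate how pooling many crowdsourced observations yields a chart; it is not a reconstruction of Maury’s actual charts.

```python
# Illustrative sketch of Maury-style aggregation: bin ship reports into
# 5-degree grid cells and average the wind speed in each cell.
# The ship reports below are invented; real logs held far more detail.
from collections import defaultdict

reports = [
    # (latitude, longitude, wind_speed_knots)
    (34.2, -40.5, 18.0),
    (33.8, -41.1, 22.0),
    (35.1, -43.9, 15.0),
    (-12.4, 88.0, 9.0),
]

cells = defaultdict(list)
for lat, lon, wind in reports:
    cell = (int(lat // 5) * 5, int(lon // 5) * 5)   # snap to the SW corner of a 5-degree cell
    cells[cell].append(wind)

for cell, winds in sorted(cells.items()):
    print(f"cell {cell}: mean wind {sum(winds) / len(winds):.1f} kt from {len(winds)} reports")
```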

“Until I took up your work, I had been traversing the ocean blindfolded,” one mariner wrote to Maury. The tool kit created by the Federal Community of Practice on Citizen Science and Crowdsourcing is a how-to manual to guide federal agencies in removing more blindfolds.

We the People observe. We the People take note of the environment that surrounds us. We the People share our experiences. More is still unknown than known to us. We are navigating humanity’s future blindfolded. We the People can steer a better course with citizen science and learn something truly new every day.

Tuesday, September 29, 2015

MacArthur Genius Grant Winner Makes Waste a Resource

Environmental engineer Kartik Chandran of Columbia University won a MacArthur Fellowship for his work on extracting nutrients and energy from wastewater and sewage.


“Waste itself traditionally has been viewed as something negative, it has been viewed as something that we need to get rid of.”

Environmental engineer Kartik Chandran of Columbia University. On September 29 he was named one of this year’s MacArthur Fellows, often referred to as recipients of the “genius grants.” Where most people see sewage, Chandran sees a resource.

“To me these are not just waste streams, there are enriched streams. These are enriched in nutrients, nitrogen and phosphorus, these are enriched in carbon, organic carbon. These are also enriched in energy. And so if you now start to think about these as enriched streams, these now contain resources that we could extract and recover and use.

“Using alternate biological processes we can convert the carbon present in these waste streams to methane, and methane can be directly used for energy. We can extract the methane and we can use it for cogeneration of electricity and power. There are many utilities in the nation that actually do this. This changes the game when we are talking about developing or underdeveloped economies where people just don’t have access to sanitation. Because they don’t have access to energy to drive these energy-intensive sanitation processes or wastewater treatment processes. So what we are now doing is, considering, when we start talking about waste streams as energy sources, we are basically driving the treatment of these waste streams from the energy which is produced from within these streams.

“One example of our field work is in Ghana, where we’ve been working to design and implement novel toilets that can separate out the urine stream and the fecal sludge stream from human waste. And the end application for this project has been the re-use and recovery of nutrients from the urine stream for agriculture in villages in Ghana. Another example of our field work in Ghana is the conversion of fecal sludge to biodiesel to drive the conversion of fecal sludge to more high-value endpoints.”

For the complete list of this year’s 24 MacArthur Fellows, including about 10 in science and medicine depending on how you define their activities, see the MacArthur Foundation’s Web site.

—Steve Mirsky

Chandran audio via MacArthur Foundation

Gene-Edited "Micropigs" to Be Sold as Pets

Cutting-edge gene-editing techniques have produced an unexpected byproduct — tiny pigs that a leading Chinese genomics institute will soon sell as pets.

BGI in Shenzhen, the genomics institute famous for its prowess in DNA sequencing, originally created the micropigs as models for human disease by applying gene editing to a small breed known as the Bama pig. On September 23, at the Shenzhen International Biotech Leaders Summit in China, BGI revealed that it would start selling the pigs as pets. The animals weigh about 15 kilograms when mature, or about the same as a medium-sized dog.

At the summit, the institute quoted a price tag of 10,000 yuan (US$1,600) for the micropigs, but that was just to “help us better evaluate the market,” says Yong Li, technical director of BGI’s animal-science platform. In the future, customers will be offered pigs with different coat colours and patterns, which BGI says it can also set through gene editing.

With gene-editing techniques now widely accessible, the field's pioneers say that the application to pets was no big surprise. Some also caution against it. “It's questionable whether we should impact the life, health and well-being of other animal species on this planet light-heartedly,” says geneticist Jens Boch at the Martin Luther University of Halle-Wittenberg in Germany. Boch helped to develop the gene-editing technique used to create the pigs, which uses enzymes known as TALENs (transcription activator-like effector nucleases) to disable certain genes.

How to regulate the various applications of gene-editing is an open question that scientists are already discussing with agencies across the world. BGI agrees on the need to regulate gene editing in pets as well as in the medical research applications that make up the core of its micropig activities. Any profits from the sale of pets will be invested in this research. “We plan to take orders from customers now and see what the scale of the demand is,” says Li.

Animal models

Bama pigs, which weigh 35–50 kilograms (by contrast, many farm pigs weigh more than 100 kilograms), have previously been used in research.

To make the smaller, gene-edited micropigs, BGI made cloned pigs from cells taken from a Bama fetus. But before they started the cloning process, the researchers used TALENs to disable one of the two copies of the growth hormone receptor gene (GHR) in the fetal cells. Without the receptor, cells do not receive the ‘grow’ signal during development, resulting in stunted pigs.

Show stealers

Li says that the micropigs have already proved useful in studies of stem cells and of gut microbiota, because the animals' smaller size makes it easier to replace the bacteria in their guts. They will also aid studies of Laron syndrome, a type of dwarfism caused by a mutation in the human GHR gene.

The decision to sell the pigs as pets surprised Lars Bolund, a medical geneticist at Aarhus University in Denmark who helped BGI to develop its pig gene-editing program, but he admits that they stole the show at the Shenzhen summit. “We had a bigger crowd than anyone,” he says. “People were attached to them. Everyone wanted to hold them.”

They could meet a preexisting demand. In the United States, for instance, reports have surfaced of people who wanted a porcine lap pet but were disappointed when piglets weighing only 5 kilograms grew into 50-kilogram animals. Gene-edited micropigs stay reliably small, the BGI team says.

Pig problems

Some researchers think that dogs or cats will be next up for genetic manipulation. Scientists and ethicists agree that gene-edited pets are not very different from conventional breeding — the result is just achieved more efficiently. But that doesn’t make the practice a good idea, says Jeantine Lunshof, a bioethicist at Harvard Medical School in Boston, Massachusetts, who describes both as “stretching physiological limits for the sole purpose of satisfying idiosyncratic aesthetic preferences of humans”.

Dana Carroll, a gene-editing pioneer at the University of Utah in Salt Lake City, adds: “I can certainly imagine resistance to manipulating dogs, even though all of the current breeds are the result of selective breeding by humans.”

Daniel Voytas, a geneticist at the University of Minnesota in Saint Paul, hopes that any buzz over gene-edited pets does not hamper progress in developing gene-editing techniques and creating new crop varieties. “I just hope we establish a regulatory framework—guidelines for the safe and ethical use of this technology—that allows the potential to be realized,” he says. “I worry that pet mini pigs distract and add confusion to efforts to achieve this goal.”

see also:

VW Scandal Causes Small but Irreversible Environmental Damage

Scientists and engineers remain skeptical of high pollution claims


Volkswagen’s ruse to circumvent U.S. auto emissions standards has left many wondering about the precise environmental impact of its cars, which emitted more pollutants than regulations allow. Although the extra pollution is impossible to quantify so soon, experts agree that the amount is globally insignificant but could add to Europe’s regional health concerns.

On September 18 the U.S. Environmental Protection Agency announced that four Volkswagen vehicles from model years 2009 to 2015 had been fitted with illegal software. The cars used a sophisticated algorithm that switched on full emissions controls during laboratory emissions tests and then dialed them back on the road so the cars would get better fuel economy and performance. As a result, the unrestricted vehicles released higher-than-acceptable emissions in everyday driving. The German automaker quickly recalled 482,000 VW and Audi brand cars in the U.S. alone, and later admitted that the software might have been fitted to 11 million vehicles worldwide.
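Volkswagen’s actual code has not been published, so the snippet below is strictly a toy illustration of the general concept regulators described: software that recognizes the telltale signature of a laboratory test and applies full pollution controls only then. The signals checked, the thresholds and the function names are all hypothetical.

```python
# Toy illustration of a "defeat device" concept, NOT VW's actual algorithm
# (which has not been published). A dynamometer test has telltale signs:
# the drive wheels turn while the steering wheel never moves.
def looks_like_emissions_test(speed_kmh: float, steering_angle_deg: float,
                              seconds_without_steering: float) -> bool:
    """Hypothetical heuristic: wheels spinning, steering untouched for minutes."""
    return speed_kmh > 0 and abs(steering_angle_deg) < 1.0 and seconds_without_steering > 120

def nox_control_level(speed_kmh, steering_angle_deg, seconds_without_steering):
    """Return how aggressively to run NOx after-treatment (1.0 = full, test mode)."""
    if looks_like_emissions_test(speed_kmh, steering_angle_deg, seconds_without_steering):
        return 1.0   # behave cleanly for the lab test
    return 0.2       # everyday driving: favor fuel economy and performance, emit more NOx

print(nox_control_level(50.0, 0.0, 300.0))   # 1.0 -> clean on the dyno
print(nox_control_level(50.0, 12.5, 4.0))    # 0.2 -> on the road, controls dialed back
```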

The EPA now suspects that these cars emitted up to 40 times more nitrogen oxide—a pollutant that can harm human health—than standards allow. Many news organizations were quick to jump on this number; one analysis claimed that the scandal may have caused nearly one million extra metric tons of pollution yearly. But experts remain skeptical.

John Heywood, a mechanical engineer at the Massachusetts Institute of Technology who focuses on internal combustion engines and air pollution, is hesitant to agree with such high numbers. He has identified a key change in how the engine operates (delaying the start of combustion) that would improve the snappiness of the driving but would increase nitrogen oxide emissions by only three to five times.

Travis Bradford, director of Energy and Environment Concentration at Columbia University, agrees. He argues that a number as high as 40 likely represents a spike while the car is accelerating. It cannot be anywhere near the average. “Fuels these days are not that dirty and emissions control systems are not that clean,” Bradford says. “So the idea that it would on average be 40 times the amount of emissions is pretty incredulous.”

Still, experts agree that nitrogen oxide (NO) is a nasty pollutant. Once released into the air it quickly converts into nitrogen dioxide—a reddish-brown gas with a pungent odor—which then absorbs sunlight and transforms into the yellow-brown haze that blankets cities. It is this smog that can exacerbate dozens of health problems. Alternatively, the pollutant can be washed into the ground in the form of acid rain, which can kill plants and animals. Once the damage is done “there is no antidote,” says Yiannis Levendis, an engineering professor at Northeastern University who focuses on diesel emissions.

The news is not tragic for those living in the U.S., where the portion of diesel-powered cars is small (roughly 1 percent). But in Europe that number is much higher, clocking in at roughly 50 percent. In some European cities there is already so much nitrogen dioxide that it is “toxic in its own right,” Heywood says. But that was prior to the scandal. VW just upped the dosage.

All experts agree that on a local scale, the extra pollution can only make matters worse; on a global scale, however, it is insignificant. According to the EPA, small cars released roughly one billion metric tons of greenhouse gases in 2011 alone. The estimate cited above, which experts agree is likely too high, implies that the rigged cars account for only about 0.1 percent of that (roughly one million extra metric tons out of a billion). “Unfortunately, in the grand scheme of things, this is a drop in the bucket in terms of our aggregate pollution,” Bradford says. He says “unfortunately” mostly because he thinks it’s a shame that pollution is already so high, and partially because he is flabbergasted that a company of VW’s stature could stoop so low. “They literally stole public property,” he says. “They took air that could have been cleaner and available to all the people in the U.S. because they wanted to sell cars.”

Heywood will keep crunching the numbers. But he’s waiting for Volkswagen and EPA to release more concrete information. “We've got to let the dust settle on the numbers,” he says, before we jump to any radical conclusions.

see also:

MacArthur 'Genius Grants' Reward Science Innovation

Nine US scientists and social scientists working on nanowires, stem-cell transplants and wastewater treatment were among the 24 winners


Peidong Yang, a chemist who is building nanowires into commercial applications—such as devices that generate fuel from solar energy, or convert waste heat into electricity—is one of nine US scientists and social scientists to win a so-called ‘genius grant’ this year from the philanthropic MacArthur Foundation, based in Chicago, Illinois.

The awards, announced on September 29, give $625,000 of “no-strings-attached” funding to creative and inspiring individuals in any field, paid out over five years.

Yang, at the University of California, Berkeley, recently helped build a device that uses nanowires and bacteria to absorb solar energy and convert carbon dioxide and water into fuel. His nanowire research has also been used to make chemical sensors and optical switches.

Other science- and social science-related winners of this year’s fellowships are listed below.

Beth Stevens, a neuroscientist at Harvard Medical School in Boston, Massachusetts, discovered that immune cells called microglia prune the connections between neurons in the developing brain. Her work could help us understand how neurodegeneration occurs in diseases like Alzheimer’s.

Christopher Ré, a computer scientist at Stanford University in California, created software that has been used to crawl across vast tracts of data and pull out information—for example, to analyse human trafficking networks on the web and to identify interactions between genes and prescription drugs.

John Novembre, a computational biologist at the University of Chicago, Illinois, works on understanding human population history through analysing genetic data. He created a detailed map of genetic diversity among African Americans, and has studied the rare genetic variants that riddle human populations.

William Dichtel, a chemist at Cornell University in Ithaca, New York, makes sponge-like polymers called covalent organic frameworks, which might be useful for storing fuels and separating molecules.

Kartik Chandran, an environmental engineer at Columbia University in New York, works on new ways to treat wastewater. He uses microbes and other technologies to clean the water while also producing valuable byproducts such as biofuel. He has tested his projects on the ground in Ghana, with Engineers without Borders.

Lorenz Studer, a stem cell biologist at the Memorial Sloan-Kettering Cancer Center in New York, is leading efforts to transplant neurons into the brain, with the hope of replacing cells that die due to neurodegenerative diseases such as Parkinson’s.

Heidi Williams, an economist at the Massachusetts Institute of Technology in Cambridge, Massachusetts, looks at the causes of innovation in health-care markets. She focuses on the influence of patents and intellectual property laws, and has examined the race between public and private institutions to decode the human genome.

Matthew Desmond, a social scientist at Harvard University in Cambridge, Massachusetts, studies the impact of eviction on the urban poor. His research has led to policy changes that help victims of domestic violence.

see also:

Can the U.S. Jump-Start Offshore Wind Power?

The Department of Energy has awarded around a half-million dollars to New York, Maine, Rhode Island and Massachusetts state organizations to cooperate on scaling up the offshore wind industry in the region.

Under the leadership of the New York State Energy Research and Development Authority (NYSERDA), the group will lay out a collaborative road map by the end of the year on how to build up the new industry. The project largely aims to reduce the cost of offshore wind projects, which has been a barrier to development, and establish a regional supply chain.

Industry and state representatives learned about the federal grant at the first-ever offshore wind summit hosted by the White House yesterday.

Offshore wind has struggled to take off in the United States. Europe, meanwhile, has more than 80 offshore wind farms with more than 10,000 megawatts of capacity. The White House summit marks a renewed effort to get the industry going in the United States, said various attendees.

“There’s a real commitment and desire to move offshore wind forward,” said Warren Leon, executive director of the Clean Energy States Alliance (CESA), who was at the meeting. “There’s a recognition that it’s not easy; this is a technology that is not currently cheap or easy to move into the marketplace, but it’s very much worth focusing on because the rewards for our electricity system, for our environment and for jobs would be so good.”

CESA will help coordinate the interstate project, he said. The group will first gather input from different stakeholders to find out where extended research is needed. The states involved could eventually figure out ways to cooperate on building transmission lines, giving out permits or filling out environmental impact analyses, he said.

The nearly $600,000 was awarded as part of the Department of Energy’s State Energy Program, which funded 12 projects.

A regional strategy takes shape

Also, the Department of the Interior’s Bureau of Ocean Energy Management has begun organizing a multilateral group to collect and share expertise on offshore wind regulations and technology from industry leaders in Germany, Denmark and the United Kingdom. Yesterday afternoon, experts from abroad and from the United States shared strategies on scaling up the industry at an event held by the Environmental and Energy Study Institute.

The different federal initiatives come as the offshore wind industry is “picking up momentum after a lull,” said Kit Kennedy, director of energy and transportation for the Natural Resources Defense Council, who was at the White House meeting.

The offshore wind industry has experienced disappointing developments in recent years, including major setbacks to the proposed Cape Wind project in Massachusetts’ Nantucket Sound. That prompted a report earlier this year urging a multistate approach, in place of the current strategy, to get the industry on its feet again (Feb. 17).

Earlier this summer, Deepwater Wind started construction off Block Island, Rhode Island, on what is expected to become the first offshore wind farm in the United States. The $225 million, 30-megawatt farm will have five turbines (about $7.5 million per megawatt of capacity) and is scheduled to begin producing energy next year. Last week, DOE announced it will hold an auction in November for the rights to build wind turbines in 344,000 acres of federal waters off New Jersey, the latest of several lease auctions it has announced (Sept. 23).

Kennedy applauded the federal support for a regional road map but stressed that in the meantime, individual projects should keep pushing forward.

“The regional approach to scaling up offshore wind is an interesting and promising approach as long as it leads to firm commitments to build offshore wind in each state, rather than a prolonged study process,” she said. “A pipeline of offshore wind projects has to start somewhere.”

www.eenews.net

see also:

7 More People Sick with Legionnaires’ Disease in NYC

Officials say the new cases are not related to the city’s summertime Legionnaires outbreak, the largest in the city’s history, which sickened 120 people in the South Bronx


This electron micrograph depicts an amoeba, Hartmannella vermiformis (orange) as it entraps a Legionella pneumophila bacterium (green) with an extended pseudopod.

More people in New York City are sick with Legionnaires’ disease in what appears to be a new cluster of cases, health officials say.

So far, seven people who live or work in the Morris Park neighborhood of the Bronx have been hospitalized recently with Legionnaires’ disease, according to the New York City Department of Health and Mental Hygiene. Officials were notified of these cases last week.

The new cases are not related to the outbreak that occurred in New York City over the summer, which was the largest in the city’s history and sickened 120 people in the South Bronx. Officials traced that outbreak to a cooling tower at the Opera House Hotel, which was contaminated with Legionella pneumophila, the bacteria that cause the disease.

An investigation into the new cluster is underway, and scientists have taken samples from all of the cooling towers in Morris Park to test for the bacteria, according to the health department. Officials have also notified health care providers in the area to look out for patients with symptoms of the disease, and conduct the necessary tests on these patients.

Officials urged New Yorkers with symptoms of Legionnaires’ disease, such as fever, cough, chills and difficulty breathing, to seek prompt medical attention. The disease most commonly affects older adults and people with weakened immune systems.

Legionella bacteria live in watery environments, such as cooling towers and air-conditioning systems, and symptoms can appear from two to 10 days after a person is exposed to the bacteria. People get infected when they inhale airborne water droplets containing the bacteria, but the disease does not spread from person to person.

The outbreak this summer led to the passage of legislation requiring that building owners regularly test their cooling towers for the bacteria.

An estimated 8,000 to 18,000 people are hospitalized with Legionnaires’ disease each year in the U.S., according to the Centers for Disease Control and Prevention.

LiveScience

see also:

The Difference between Science and Pseudoscience

Newton was wrong. Einstein was wrong. Black holes do not exist. The big bang never happened. Dark energy and dark matter are unsubstantiated conjectures. Stars are electrically charged plasma masses. Venus was once a comet. The massive Valles Marineris canyon on Mars was carved out in a few minutes by a giant electric arc sweeping across the Red Planet. The “thunderbolt” icons found in ancient art and petroglyphs are not the iconography of imagined gods but realistic representations of spectacular electrical activity in space.

These are just a few of the things I learned at the Electric Universe conference (EU2015) in June in Phoenix. The Electric Universe community is a loose confederation of people who, according to the host organization's Web site (thunderbolts.info), believe that “a new way of seeing the physical universe is emerging. The new vantage point emphasizes the role of electricity in space and shows the negligible contribution of gravity in cosmic events.” This includes everything from comets, moons and planets to stars, galaxies and galactic clusters.

I was invited to speak on the difference between science and pseudoscience. The most common theme I gleaned from the conference is that one should be skeptical of all things mainstream: cosmology, physics, history, psychology and even government (I was told that World Trade Center Building 7 was brought down by controlled demolition on 9/11 and that “chemtrails”—the contrails in the sky trailing jets—are evidence of a government climate-engineering experiment).

The acid test of a scientific claim, I explained, is prediction and falsification. My friends at the NASA Jet Propulsion Laboratory, for example, tell me they use both Newtonian mechanics and Einstein's relativity theory in computing highly accurate spacecraft trajectories to the planets. If Newton and Einstein are wrong, I inquired of EU proponent Wallace Thornhill, can you generate spacecraft flight paths that are more accurate than those based on gravitational theory? No, he replied. GPS satellites in orbit around Earth are also dependent on relativity theory, so I asked the conference host David Talbott if EU theory offers anything like the practical applications that theoretical physics has given us. No. Then what does EU theory add? A deeper understanding of nature, I was told. Oh.
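The GPS point can be made concrete with a rough back-of-the-envelope calculation: satellite clocks run fast because they sit higher in Earth's gravity well and slow because they move quickly, leaving a net drift of roughly 38 microseconds per day that receivers must correct. The snippet below is only a sketch of that textbook arithmetic, using standard round-number values for the orbit.

```python
# Back-of-the-envelope check of why GPS needs relativity (rough numbers).
import math

GM = 3.986004e14        # Earth's gravitational parameter, m^3/s^2
c = 2.99792458e8        # speed of light, m/s
r_earth = 6.371e6       # mean Earth radius, m
r_gps = 2.6560e7        # GPS orbital radius (~20,200 km altitude), m
day = 86400.0           # seconds per day

# General relativity: weaker gravity at orbit makes the satellite clock run fast.
grav_shift = GM * (1 / r_earth - 1 / r_gps) / c**2

# Special relativity: orbital speed makes the satellite clock run slow.
v = math.sqrt(GM / r_gps)                  # roughly 3.9 km/s
vel_shift = -(v**2) / (2 * c**2)

net_us_per_day = (grav_shift + vel_shift) * day * 1e6
print(f"gravitational: +{grav_shift * day * 1e6:.1f} microseconds/day")
print(f"velocity:      {vel_shift * day * 1e6:.1f} microseconds/day")
print(f"net drift:     +{net_us_per_day:.1f} microseconds/day")
# ~ +38 microseconds/day; left uncorrected, position fixes would drift by kilometers per day.
```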

Conventional psychology was challenged by Gary Schwartz of the University of Arizona, who, in keeping with the electrical themes of the day, explained that the brain is like a television set and consciousness is like the signals coming into the brain. You need a brain to be conscious, but consciousness exists elsewhere. But TV studios generate and broadcast signals. Where, I inquired, is the consciousness equivalent to such production facilities? No answer.

A self-taught mathematician named Stephen Crothers riffled through dozens of PowerPoint slides chockablock full of equations related to Einstein's general theory of relativity, which he characterized as “numerology.” Einstein's errors, Crothers proclaimed, led to the mistaken belief in black holes and the big bang. I understood none of what he was saying, but I am confident he's wrong by the fact that for a century thousands of physicists have challenged Einstein, and still he stands as Time's Person of the Century. It's not impossible that they are all wrong and that this part-time amateur scientist sleuth is right, but it is about as likely as the number of digits after the decimal place in Einstein's equations accurately describing the relativistic effects on those GPS satellite orbits.

The EU folks I met were unfailingly polite, unquestionably smart and steadfastly unwavering in their belief that they have made one of the most important discoveries in the history of science. Have they? Probably not. The problem was articulated in a comment Thornhill made when I asked for their peer-reviewed papers: “In an interdisciplinary science like the Electric Universe, you could say we have no peers, so peer review is not available.” Without peer review or the requisite training in each discipline, how are we to know the difference between mainstream and alternative theories, of which there are many?

In his book The Electric Kool-Aid Acid Test, Tom Wolfe quotes Merry Prankster Ken Kesey: “You're either on the bus or off the bus.” It's not that EUers are wrong; they're not even on the bus.


see also:
