The PROTECT IP/SOPA Bills Threaten the Entire Internet: Video Explains the Flaws in This Hurried Legislation

Tell Congress not to censor the internet NOW! – fightforthefuture.org/pipa

Source: fightforthefuture.org

Science and Space…NASA’s Kepler Mission Confirms Its First Planet in Habitable Zone of Sun-like Star

This diagram compares our own solar system to Kepler-22, a star system containing the first "habitable zone" planet discovered by NASA's Kepler mission. Image credit: NASA/Ames/JPL-Caltech

NASA’s Kepler mission has confirmed its first planet in the “habitable zone,” the region where liquid water could exist on a planet’s surface. Kepler also has discovered more than 1,000 new planet candidates, nearly doubling its previously known count. Ten of these candidates are near-Earth-size and orbit in the habitable zone of their host star. Candidates require follow-up observations to verify they are actual planets.

The newly confirmed planet, Kepler-22b, is the smallest yet found to orbit in the middle of the habitable zone of a star similar to our sun. The planet is about 2.4 times the radius of Earth. Scientists don’t yet know if Kepler-22b has a predominantly rocky, gaseous or liquid composition, but its discovery is a step closer to finding Earth-like planets.

Previous research hinted at the existence of near-Earth-size planets in habitable zones, but clear confirmation proved elusive. Two other small planets orbiting stars smaller and cooler than our sun recently were confirmed on the very edges of the habitable zone, with orbits more closely resembling those of Venus and Mars.

“This is a major milestone on the road to finding Earth’s twin,” said Douglas Hudgins, Kepler program scientist at NASA Headquarters in Washington. “Kepler’s results continue to demonstrate the importance of NASA’s science missions, which aim to answer some of the biggest questions about our place in the universe.”

Kepler discovers planets and planet candidates by measuring dips in the brightness of more than 150,000 stars to search for planets that cross in front of, or “transit,” the stars. Kepler requires at least three transits to verify a signal as a planet.
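For readers curious how that works in practice, here is a minimal, illustrative Python sketch of the transit method, not the actual Kepler pipeline: it builds a synthetic light curve with small periodic dips and folds it at the orbital period so the repeated transits stand out. Every number below is invented for illustration.

```python
import numpy as np

# Illustrative sketch of the transit method, NOT the real Kepler pipeline.
# A planet crossing its star dims it slightly; the dips repeat once per orbit.
rng = np.random.default_rng(0)
days = np.arange(0.0, 700.0, 0.1)               # ~2 years, sampled every 0.1 day
flux = 1.0 + rng.normal(0.0, 1e-4, days.size)   # normalized brightness + noise

period, duration, depth = 290.0, 0.3, 4e-4      # assumed orbit, transit length, dip
in_transit = (days % period) < duration
flux[in_transit] -= depth                       # each transit dims the star

# Fold the light curve at the trial period: real transits stack at the same
# phase while noise averages out. Three stacked dips verify the signal.
phase = days % period
print("mean flux in transit: ", flux[phase < duration].mean())
print("mean flux out of transit:", flux[phase >= duration].mean())
print("transits observed:", int(days.max() // period) + 1)
```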

“Fortune smiled upon us with the detection of this planet,” said William Borucki, Kepler principal investigator at NASA Ames Research Center at Moffett Field, Calif., who led the team that discovered Kepler-22b. “The first transit was captured just three days after we declared the spacecraft operationally ready. We witnessed the defining third transit over the 2010 holiday season.”

The Kepler science team uses ground-based telescopes and the Spitzer Space Telescope to review observations on planet candidates the spacecraft finds. The star field that Kepler observes in the constellations Cygnus and Lyra can only be seen from ground-based observatories in spring through early fall. The data from these other observations help determine which candidates can be validated as planets.

Kepler-22b is located 600 light-years away. While the planet is larger than Earth, its orbit of 290 days around a sun-like star resembles that of our world. The planet’s host star belongs to the same class as our sun, called G-type, although it is slightly smaller and cooler.
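As a rough sanity check on those numbers, Kepler’s third law lets anyone estimate the planet’s orbital distance from its 290-day period. A back-of-envelope sketch, assuming the host star has about one solar mass (the article says it is only slightly smaller than the sun):

```python
# Back-of-envelope Kepler's third law: a^3 = P^2 (a in AU, P in years) for a
# star of roughly one solar mass. Assumed mass; Kepler-22 is close to solar.
P_years = 290 / 365.25
a_au = P_years ** (2 / 3)
print(f"orbital distance ≈ {a_au:.2f} AU")  # ≈ 0.86 AU, slightly inside Earth's orbit
```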

Of the 54 habitable zone planet candidates reported in February 2011, Kepler-22b is the first to be confirmed. This milestone will be published in The Astrophysical Journal.

The Kepler team is hosting its inaugural science conference at Ames Dec. 5-9, announcing 1,094 new planet candidate discoveries. Since the last catalog was released in February, the number of planet candidates identified by Kepler has increased by 89 percent and now totals 2,326. Of these, 207 are approximately Earth-size, 680 are super Earth-size, 1,181 are Neptune-size, 203 are Jupiter-size and 55 are larger than Jupiter.

The findings, based on observations conducted May 2009 to September 2010, show a dramatic increase in the numbers of smaller-size planet candidates.

Kepler observed many large planets in small orbits early in its mission, which were reflected in the February data release. Now that the mission has had time to observe three transits of planets with longer orbital periods, the new data suggest that planets one to four times the size of Earth may be abundant in the galaxy.

The number of Earth-size and super Earth-size candidates has increased by more than 200 and 140 percent since February, respectively.

There are 48 planet candidates in their star’s habitable zone. While this is a decrease from the 54 reported in February, the Kepler team has applied a stricter definition of what constitutes a habitable zone in the new catalog, to account for the warming effect of atmospheres, which would move the zone away from the star, out to longer orbital periods.
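The release does not spell out the team’s exact habitable-zone criteria, but the standard textbook estimate makes the idea concrete: received stellar flux falls off as 1/r², so the zone’s boundaries scale with the square root of the star’s luminosity, and moving the zone outward (as the stricter, atmosphere-aware definition does) means longer orbital periods. A simplified sketch with assumed boundary values:

```python
import math

# Simplified textbook habitable-zone estimate, NOT the Kepler team's criteria.
# Flux falls as 1/r^2, so zone boundaries scale with sqrt(luminosity).
# The solar boundary values (0.95 and 1.4 AU) are common illustrative choices.
def habitable_zone(luminosity_rel_sun, inner_au=0.95, outer_au=1.4):
    scale = math.sqrt(luminosity_rel_sun)
    return inner_au * scale, outer_au * scale

print(habitable_zone(1.0))    # a sun-like star
print(habitable_zone(0.8))    # a slightly smaller, cooler star (assumed value)
```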

“The tremendous growth in the number of Earth-size candidates tells us that we’re honing in on the planets Kepler was designed to detect: those that are not only Earth-size, but also are potentially habitable,” said Natalie Batalha, Kepler deputy science team lead at San Jose State University in San Jose, Calif. “The more data we collect, the keener our eye for finding the smallest planets out at longer orbital periods.”

NASA’s Ames Research Center manages Kepler’s ground system development, mission operations and science data analysis. NASA’s Jet Propulsion Laboratory in Pasadena, Calif., managed Kepler mission development.

Ball Aerospace and Technologies Corp. in Boulder, Colo., developed the Kepler flight system and supports mission operations with the Laboratory for Atmospheric and Space Physics at the University of Colorado in Boulder.

The Space Telescope Science Institute in Baltimore archives, hosts and distributes the Kepler science data. Kepler is NASA’s 10th Discovery Mission and is funded by NASA’s Science Mission Directorate at the agency’s headquarters.

Released: December 5, 2011

Source: NASA

Related Link:

http://www.nasa.gov/mission_pages/kepler/news/kepscicon-briefing.html

Medical News: Shortages of Surgical Drugs May Pose Threats to Patient Safety

Steps Urged to Maintain Supplies and Prevent Harmful Effects from Shortages of Key Drugs

Newswise — San Francisco, CA. (November 23, 2011) – The United States is facing ongoing shortages of several critical anesthesia medications—shortages with potentially serious effects on patient care and safety, according to a special article in the December issue of Anesthesia & Analgesia, official journal of the International Anesthesia Research Society (IARS).

“Anesthesiologists should be actively involved in the steps necessary to provide a fast resolution [to drug shortages] and that can minimize adverse effects to patient care,” writes Dr. Gildasio S. De Oliveira, Jr., of Northwestern University, Evanston, Ill.

Shortages of Perioperative Drugs—Causes and Safety Impact
Dr. De Oliveira and colleagues reviewed key issues related to national shortages of important drugs used in the perioperative period (before, during, and after surgery). Medication shortages have become increasingly frequent over the past decade. In 2010, the American Society of Health-System Pharmacists (ASHP) listed 140 medications in short supply. Even more alarming, shortages are also reported for alternative drugs in several categories.

Many factors contribute to medication shortages, such as product recalls and shortages of raw materials. A surge in demand can cause shortages even when manufacturing and supply are unaffected. Shortages are especially acute for sterile injectable medications because of the many complex steps involved in their manufacture. Current inventory management practices, such as short inventories and “just in time” production schedules, also play a role.

Drug shortages can have a “devastating” impact on patient care—particularly if alternative drugs are not available. Patients may face treatment delays, have procedures canceled, or receive alternative drugs that are less effective or have more side effects. The study noted that “Drug shortages can therefore increase risks to patients, and can also have a negative impact on institutions.”

Of special concern to anesthesiologists is the shortage of propofol—a drug that is widely used not only for anesthesia but also as a sedative. Naloxone, an essential drug for managing an overdose of morphine-like drugs, is also in short supply. Shortages have also been reported for medications used to paralyze patients during surgery, as well as the drugs used to reverse muscle paralysis.

Call for Anesthesiologists to Play an Active Role
Dr. De Oliveira and colleagues urge anesthesiologists to take the lead in dealing with the problem of shortages, noting that “proactive measures must be taken to identify, resolve, and possibly prevent a medication shortage before patient care and safety are jeopardized.” Anesthesiologists need to be aware of ASHP guidelines for dealing with medication shortages and play an active role in developing and implementing the response at local hospitals.

Health care professionals can also inform the FDA about potential drug shortages. Depending on the cause, the FDA may take steps to alleviate shortages—for example, by helping to obtain raw materials or allowing importation of alternative medications.

“[A]s anesthesiologists, we have an obligation to report shortages, especially the ones that cause deviations from the best practices of patient care,” Dr. De Oliveira and coauthors write. They believe that steps should be taken now to prevent shortages of anesthesia drugs from becoming a public health issue. These may include increasing inventories, implementing policies and legislation to increase drug production, and regulatory changes affecting drug manufacturing.

An accompanying editorial by Drs. Richard P. Dutton and Jerry A. Cohen urges anesthesiologists to exercise caution when using substitutes for a desired drug that is not available. They conclude: “We must not continue to expose patients to these risks, when we know that proper action on the part of industry, our policy makers, and ourselves, can reduce it.”

Released: 11/23/2011

Source:  International Anesthesia Research Society (IARS)

Related Link:

http://www.newswise.com/articles/shortages-of-surgical-drugs-may-pose-threats-to-patient-safety

Intentional Poisonings Result in 14,720 Emergency Department Visits in a Year

Females accounted for nearly two-thirds of these intentional poisoning visits

Newswise — Intentional poisoning refers to attempts to physically harm someone or render that person defenseless against crimes by deliberately getting them to ingest, inhale or in some other way take in a potentially harmful substance without their knowledge. A first-of-its-kind national report reveals that 14,720 emergency department visits were caused by drug-related intentional poisonings during 2009 (the latest year with available data). The report by the Substance Abuse and Mental Health Services Administration (SAMHSA) shows that the majority of those visits (63 percent) were by females, and that 73 percent of the visits were by people aged 21 or older.

The report showed that a wide variety of substances were involved in these intentional poisonings, including alcohol, illicit drugs, and pharmaceuticals, as well as cases in which patients could not identify the specific drug(s) involved. Alcohol was a factor in 60 percent of these intentional poisoning-related emergency department visits. Illicit drugs such as marijuana, stimulants, cocaine and Ecstasy were involved 30 percent of the time. Pharmaceuticals such as benzodiazepines, other drugs for insomnia and anxiety, and pain relievers appear to have been involved in 21 percent of these intentional poisoning cases.

Alcohol and drug combinations were involved in almost half (46 percent) of the emergency department visit cases linked to intentional poisonings.

“The danger of being tricked into ingesting an unknown substance is all too real at bars, raves, parties or concerts where alcohol and other substances are shared in a social manner,” said SAMHSA Administrator Pamela S. Hyde. “Not only is the health of the person who is poisoned compromised, they are also in jeopardy of falling prey to other crimes such as robbery and sexual assault. Clearly, some common-sense precautions like being aware can go a long way in protecting oneself from people with malicious intent.”

The report, Drug-Related Emergency Department Visits Attributed to Intentional Poisoning, is available at http://oas.samhsa.gov/2k11/DAWN040/WEB_DAWN_040.htm. The report was developed from data drawn from SAMHSA’s Drug Abuse Warning Network — a public health surveillance system that monitors drug-related emergency department visits throughout the nation.

People who suspect they may be the victims of an intentional poisoning can call the national Poison Help toll-free number, 1-800-222-1222, to reach their closest poison control center, where trained health care providers provide multilingual help 24 hours a day, seven days a week. The Poison Help number is funded by the Health Resources and Services Administration.

Released: 11/10/2011
Source: Substance Abuse and Mental Health Services Administration (SAMHSA)

Related Link:

http://www.newswise.com/articles/intentional-poisonings-result-in-14-720-emergency-department-visits-in-a-year

Hubble Uncovers Tiny Galaxies Bursting with Star Birth in Early Universe

Photo Credit: NASA, ESA, STScI, and the CANDELS team. CANDELS dwarf galaxies: the CANDELS team identified 69 dwarf galaxies that are undergoing intense bursts of star formation. The dwarf galaxies were found in two regions of the sky called the Great Observatories Origins Deep Survey-South and the UKIDSS Ultra Deep Survey (part of the UKIRT Infrared Deep Sky Survey). Each dwarf is shown centered in cutouts made from near-infrared (I, J, and H band) images acquired by Hubble's Wide Field Camera 3 and Advanced Camera for Surveys. The light from these galaxies has been traveling for about 9 billion years. Many of the stars in nearby dwarf galaxies may have formed in similar starbursts around the same time. The background shows a wider near-infrared view of the CANDELS Ultra Deep Survey field.

Newswise — Using its near-infrared vision to peer 9 billion years back in time, NASA’s Hubble Space Telescope has uncovered an extraordinary population of tiny, young galaxies that are brimming with star formation. The galaxies are typically a hundred times less massive than the Milky Way galaxy, yet they churn out stars at such a furious pace that their stellar content would double in just 10 million years. By comparison, the Milky Way would take a thousand times longer to double its population.
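The doubling-time comparison is straightforward arithmetic: a galaxy with stellar mass M forming stars at rate SFR doubles its stellar content in roughly M / SFR. The sketch below uses invented masses and rates chosen only to reproduce the article’s ratios, not measured CANDELS values:

```python
# Doubling time ~ stellar mass / star formation rate. Values below are
# illustrative assumptions matching the article's ratios, not CANDELS data.
dwarf_mass, dwarf_sfr = 5e8, 50.0          # solar masses; solar masses per year
milky_way_mass, milky_way_sfr = 5e10, 5.0

dwarf_doubling = dwarf_mass / dwarf_sfr             # ~1e7 years (10 million)
mw_doubling = milky_way_mass / milky_way_sfr        # ~1e10 years
print(f"dwarf doubles in {dwarf_doubling:.0e} years")
print(f"Milky Way doubles in {mw_doubling:.0e} years")
print(f"the Milky Way takes {mw_doubling / dwarf_doubling:.0f}x longer")
```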

These newly discovered dwarf galaxies are extreme even for the young universe, when most galaxies were forming stars at higher rates than they are today. The universe is 13.7 billion years old. Hubble spotted the galaxies because the radiation from young, hot stars has caused the oxygen in the gas surrounding them to light up like a bright neon sign. The rapid star birth likely represents an important phase in the formation of dwarf galaxies, the most common galaxy type in the cosmos.

“The galaxies have been there all along, but up until recently astronomers have been able only to survey tiny patches of sky at the sensitivities necessary to detect them,” said Arjen van der Wel of the Max Planck Institute for Astronomy in Heidelberg, Germany. Van der Wel is the lead author of a paper that will be published online Nov. 14 in The Astrophysical Journal. “We weren’t looking specifically for these galaxies, but they stood out because of their unusual colors.”

The observations were part of the Cosmic Assembly Near-infrared Deep Extragalactic Legacy Survey (CANDELS), an ambitious three-year survey to analyze the most distant galaxies in the universe. CANDELS is the first census of dwarf galaxies at such an early epoch in the universe’s history.

“In addition to the images, Hubble has captured spectra that show us the oxygen in a handful of galaxies and confirm their extreme star-forming nature,” said co-author Amber Straughn at NASA’s Goddard Space Flight Center in Greenbelt, Md. “Spectra are like fingerprints–they tell us the galaxies’ chemical composition.”

The observations are somewhat at odds with recent detailed studies of the dwarf galaxies that are orbiting as satellites of the Milky Way.

“Those studies suggest that star formation was a relatively slow process, stretching out over billions of years,” explained Harry Ferguson of the Space Telescope Science Institute (STScI) in Baltimore, Md., co-leader of the CANDELS survey. “The CANDELS finding that there were galaxies of roughly the same size forming stars at very rapid rates at early times is forcing us to re-examine what we thought we knew about dwarf galaxy evolution.”

Added team member Anton Koekemoer, also of STScI, who is producing all the Hubble imaging data for the survey: “As our observations continue, we should find many more of these young galaxies and gather more details on their star-forming histories.”

The CANDELS team uncovered the 69 young dwarf galaxies in near-infrared images taken with Hubble’s Wide Field Camera 3 and Advanced Camera for Surveys. The galaxies were found in two regions of the sky called the Great Observatories Origins Deep Survey South and the UKIDSS Ultra Deep Survey (part of the UKIRT Infrared Deep Sky Survey).

The observations suggest that the newly discovered galaxies were very common 9 billion years ago. It is a mystery, however, why the newly found dwarf galaxies were making batches of stars at such a high rate. Computer simulations show that star formation in small galaxies may be episodic. Gas cools and collapses to form stars. The stars then reheat the gas through, for example, supernova explosions, which blow the gas away. After some time, the gas cools and collapses again, producing a new burst of star formation, continuing the cycle.
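A toy model conveys the flavor of that cycle, though real simulations track far more physics. In this sketch, every rate and threshold is invented purely for illustration:

```python
# Toy episodic star formation: hot gas cools, a burst forms stars, supernova
# feedback reheats the leftover gas, and the cycle repeats. All numbers are
# invented; this only illustrates the qualitative cycle described above.
hot_gas, cold_gas, stars = 1.0, 0.0, 0.0
for step in range(60):
    cooled = 0.1 * hot_gas        # a fraction of hot gas cools each step
    hot_gas -= cooled
    cold_gas += cooled
    if cold_gas > 0.5:            # enough cold gas collapses: a starburst
        stars += 0.3 * cold_gas   # some cold gas becomes stars...
        hot_gas += 0.7 * cold_gas # ...feedback reheats and expels the rest
        cold_gas = 0.0
        print(f"step {step:2d}: burst! total stellar mass = {stars:.2f}")
```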

“While these theoretical predictions may provide hints to explain the star formation in these newly discovered galaxies, the observed ‘bursts’ are much more intense than what the simulations can reproduce,” van der Wel said.

The James Webb Space Telescope, an infrared observatory scheduled to launch later this decade, will be able to probe these faint galaxies at an even earlier era to see the glow of the first generation of stars, providing detailed information of the galaxies’ chemical composition.

“With Webb, we’ll probably see even more of these galaxies, perhaps even pristine galaxies that are experiencing their first episode of star formation,” Ferguson said. “Being able to probe down to dwarf galaxies in the early universe will help us understand the formation of the first stars and galaxies.”

For images and more information about Hubble and the CANDELS results, visit:

http://hubblesite.org/news/2011/31
http://www.nasa.gov/hubble
http://www.spacetelescope.org/news/heic1117

The Hubble Space Telescope is a project of international cooperation between NASA and the European Space Agency. NASA’s Goddard Space Flight Center manages the telescope. The Space Telescope Science Institute (STScI) conducts Hubble science operations. STScI is operated for NASA by the Association of Universities for Research in Astronomy, Inc. in Washington, D.C.

Related Link:

http://www.newswise.com/articles/hubble-uncovers-tiny-galaxies-bursting-with-star-birth-in-early-universe

Wireless Demand Soon Outstripping Capacity

Newswise — A new report from the Global Information Industry Center at the University of California, San Diego examines the projected disconnect between U.S. wireless infrastructure capacity and consumer demand. According to “Point of View: Wireless Point of Disconnect,” wireless use is growing rapidly and if present trends continue, demand will often outstrip capacity, causing congestion.

“We’re currently experiencing a mass migration from wired networks to wireless networks, which under the best of circumstances have far less capacity,” said Michael Kleeman, author of the report and senior fellow at UC San Diego. Wireless is much more convenient than wired connections for many purposes, but “we must understand and accept the trade-offs we will face for the convenience of accessing limited wireless capacity. Alternatively, as citizens we need to dramatically lower our expectations for wireless services in the future.”

Wireless data capacity differs inherently from that of fiber optic cables, and the differences affect performance. Among them, wireless is allocated a small portion of the available spectrum, and its signals are susceptible to interference from numerous sources, including weather and buildings. According to the report, even with advanced wireless technology, the capacity available to all network users in a given cell can be less than 1/1,000th the capacity of a fiber optic thread. Wireless demand is also mobile and hard to predict, and when it exceeds capacity the result is dropped connections and slow downloads.
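The 1/1,000th comparison and the congestion risk both follow from simple sharing arithmetic: a cell’s radio capacity is split among everyone actively using it, while a fiber thread serves far fewer contenders. The capacities below are illustrative assumptions, not figures from the report:

```python
# A cell's capacity is shared by its active users. Both capacity numbers are
# illustrative assumptions, not figures from the UC San Diego report.
cell_capacity_mbps = 100.0          # one cell with advanced radio technology
fiber_capacity_mbps = 100_000.0     # one fiber optic thread

for users in (1, 10, 100, 1_000):
    print(f"{users:5d} active users -> {cell_capacity_mbps / users:7.2f} Mbps each")

print(f"cell is 1/{fiber_capacity_mbps / cell_capacity_mbps:.0f} the fiber capacity")
```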

The How Much Information? 2009 American Consumer Report found that in 2008, Americans consumed 3.6 zettabytes of data, including nearly five hours of TV viewing on an average day. This is more than 3.5 petabytes per day, which exceeds the wireless data network’s entire 2010 throughput. Increasing use of mobile video will be a major source of growing demand for wireless capacity.

The 2011 “Wireless Point of Disconnect” report highlights three strategies for addressing this disconnect, all of which have drawbacks and tradeoffs. First, a key limiting factor is spectrum, and increasing and optimizing available spectrum are effective ways to increase network capacity. A combination of public and private strategies to optimize spectrum use should be employed and encouraged. However, many of the public solutions will take as much as a decade to implement. Second, carriers will increasingly need to manage traffic and develop triage and prioritization protocols, potentially including pricing-based mechanisms with real-time customer feedback to help manage network load. Third, the industry can invest in more infrastructure, including cell towers and “backhaul” cables. This will require community support.

“There is a lot of discussion about supply-demand issues for broadband Internet, but soon the same questions will be considerably more acute for wireless,” said Roger Bohn, director of the Global Information Industry Center at UC San Diego. “This report shows why future wireless systems will require adjustments, of one kind or another.”

The report is part of the Center’s “Point of View” series: occasional overviews by noted experts that address topical issues in technology, business, and public policy.

The report “Point of View: Wireless Point of Disconnect” is available online and can be downloaded in PDF format at http://giic.ucsd.edu/wireless_disconnect_2011_10_26.php

Released: 10/26/2011

Source: University of California, San Diego

Related Link:

http://www.newswise.com/articles/wireless-demand-soon-outstriping-capacity

Fighting Fire With Fire: ‘Vampire’ Bacterium Has Potential as Living Antibiotic

Photo Credit: Martin Wu/Zhang Wang/University of Virginia. The bacterium Micavibrio aeruginosavorus (yellow), attached to and leeching on a Pseudomonas aeruginosa bacterium (purple), surrounded by dead P. aeruginosa cells (gray).

Newswise — A vampire-like bacterium that leeches onto specific other bacteria – including certain human pathogens – has the potential to serve as a living antibiotic for a range of infectious diseases, a new study indicates.

The bacterium, Micavibrio aeruginosavorus, was discovered to inhabit wastewater nearly 30 years ago, but has not been extensively studied because it is difficult to culture and investigate using traditional microbiology techniques. However, biologists in the University of Virginia’s College of Arts & Sciences, Martin Wu and graduate student Zhang Wang, have decoded its genome and are learning “how it makes its living,” Wu said.

The bacterium “makes its living” by seeking out prey – certain other bacteria – and then attaching itself to its victim’s cell wall and essentially sucking out nutrients. Unlike most other bacteria, which draw nutrients from their surroundings, M. aeruginosavorus can survive and propagate only by drawing its nutrition from specific prey bacteria. This kills the prey – making it a potentially powerful agent for destroying pathogens.

One bacterium it targets is Pseudomonas aeruginosa, a chief cause of serious lung infections in cystic fibrosis patients.

“Pathologists may eventually be able to use this bacterium to fight fire with fire, so to speak, as a bacterium that will aggressively hunt for and attack certain other bacteria that are extremely harmful to humans,” Wu said.

His study, detailing the DNA sequence of M. aeruginosavorus, is published online in the journal BMC Genomics. It provides new insights into the predatory lifestyle of the bacterium and a better understanding of the evolution of bacterial predation in general.

“We used cutting-edge genomic technology in our lab to decode this bacterium’s genome,” Wu said. “We are particularly interested in the molecular mechanisms that allow it to hunt for and attack prey. This kind of investigation would have been extremely difficult and expensive to do only a few years ago.”

He noted that overuse of traditional antibiotics, which work by either inhibiting bacterial propagation or interfering with cell wall formation, is creating so-called “super bugs” that have developed resistance to treatment strategies. He suggests that new approaches are needed for attacking pathogens without building up their resistance.

Additionally, because M. aeruginosavorus is so selective a feeder, it is harmless to the thousands of beneficial bacteria that dwell in the general environment and in the human body.

“It is possible that a living antibiotic such as M. aeruginosavorus – because it so specifically targets certain pathogens – could potentially reduce our dependence on traditional antibiotics and help mitigate the drug-resistance problem we are now facing,” Wu said.

Another benefit of the bacterium is its ability to swim through viscous fluids, such as mucus. P. aeruginosa, the bacterium that colonizes the lungs of cystic fibrosis patients, creates a glue-like biofilm, enhancing its resistance to traditional antibiotics. Wu noted that the living cells of M. aeruginosavorus can swim through mucus and biofilm and attack P. aeruginosa.

M. aeruginosavorus also might have industrial uses, such as reducing bacteria that form biofilms in piping, and in medical devices such as implants that are susceptible to biofilm formation.

Wu said M. aeruginosavorus requires further study for a more thorough understanding of its gene functions. He said genetic engineering would be required to tailor the predatory attributes of the bacterium to specific uses in the treatment of disease.

“We have a map now to work with, and we will see where it leads,” he said.

Wu and Wang’s co-author is Daniel E. Kadouri, a researcher at the New Jersey Dental School. Kadouri is interested in M. aeruginosavorus as an agent for fighting oral biofilms, such as plaque.

Released: 10/31/2011

Source: University of Virginia

Via Newswise

Related Link:

http://newswise.com/articles/fighting-fire-with-fire-vampire-bacteria-has-potential-as-living-antibiotic

New Discovery…“Junk DNA” Defines Differences Between Humans and Chimps

Newswise — For years, scientists believed the vast phenotypic differences between humans and chimpanzees would be easily explained – the two species must have significantly different genetic makeups. However, when their genomes were later sequenced, researchers were surprised to learn that the DNA sequences of human and chimpanzee genes are nearly identical. What then is responsible for the many morphological and behavioral differences between the two species? Researchers at the Georgia Institute of Technology have now determined that the insertion and deletion of large pieces of DNA near genes are highly variable between humans and chimpanzees and may account for major differences between the two species.

The research team, led by Georgia Tech Professor of Biology John McDonald, has verified that while the DNA sequence of genes between humans and chimpanzees is nearly identical, there are large genomic “gaps” in areas adjacent to genes that can affect the extent to which genes are “turned on” and “turned off.” The research shows that these genomic “gaps” between the two species are predominantly due to the insertion or deletion (INDEL) of viral-like sequences called retrotransposons that are known to comprise about half of the genomes of both species. The findings are reported in the most recent issue of the online, open-access journal Mobile DNA.
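To make “genomic gaps” concrete: in a pairwise alignment of two genomes, an insertion or deletion shows up as a run of gap characters in one of the two sequences. Here is a minimal sketch with invented toy sequences; real genome comparisons use whole-genome aligners, not hand-written scans:

```python
# Locate INDEL runs in a toy pairwise alignment. '-' marks a gap: bases
# present in one species but not the other. Sequences are invented examples.
human = "ACGT----ACGTTACG"
chimp = "ACGTTTGAACGT--CG"

indels, start = [], None
for i, (h, c) in enumerate(zip(human, chimp)):
    if (h == "-" or c == "-") and start is None:
        start = i                      # a gap run begins
    elif h != "-" and c != "-" and start is not None:
        indels.append((start, i))      # the run [start, i) is one INDEL
        start = None
if start is not None:
    indels.append((start, len(human)))

print(indels)  # -> [(4, 8), (12, 14)]
```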

“These genetic gaps have primarily been caused by the activity of retroviral-like transposable element sequences,” said McDonald. “Transposable elements were once considered ‘junk DNA’ with little or no function. Now it appears that they may be one of the major reasons why we are so different from chimpanzees.”

McDonald’s research team, which included graduate students Nalini Polavarapu, Gaurav Arora and Vinay Mittal, examined the genomic gaps in both species and determined that they are significantly correlated with differences in gene expression reported previously by researchers at the Max Planck Institute for Evolutionary Anthropology in Germany.

“Our findings are generally consistent with the notion that the morphological and behavioral differences between humans and chimpanzees are predominately due to differences in the regulation of genes rather than to differences in the sequence of the genes themselves,” said McDonald.

The current analysis of the genetic differences between humans and chimpanzees was motivated by the group’s previously published findings (2009) that the higher propensity for cancer in humans vs. chimpanzees may have been a by-product of selection for increased brain size in humans.

Released: 10/25/2011

Source: Georgia Institute of Technology

Via Newswise

Related Link:

http://www.newswise.com/articles/junk-dna-defines-differences-between-humans-and-chimps

Environmental News…Future Forests May Soak Up More Carbon Dioxide than Previously Believed

An aerial view of the 38-acre experimental forest in Wisconsin where U-M researchers and their colleagues continuously exposed birch, aspen and maple trees to elevated levels of carbon dioxide and ozone gas from 1997 through 2008. Credit: David Karnosky, Michigan Technological University

Newswise — ANN ARBOR, Mich.—North American forests appear to have a greater capacity to soak up heat-trapping carbon dioxide gas than researchers had previously anticipated.

As a result, they could help slow the pace of human-caused climate warming more than most scientists had thought, a U-M ecologist and his colleagues have concluded.

The results of a 12-year study at an experimental forest in northeastern Wisconsin challenge several long-held assumptions about how future forests will respond to the rising levels of atmospheric carbon dioxide blamed for human-caused climate change, said University of Michigan microbial ecologist Donald Zak, lead author of a paper published online this week in Ecology Letters.

“Some of the initial assumptions about ecosystem response are not correct and will have to be revised,” said Zak, a professor at the U-M School of Natural Resources and Environment and the Department of Ecology and Evolutionary Biology in the College of Literature, Science, and the Arts.

To simulate atmospheric conditions expected in the latter half of this century, Zak and his colleagues continuously pumped extra carbon dioxide into the canopies of trembling aspen, paper birch and sugar maple trees at a 38-acre experimental forest in Rhinelander, Wis., from 1997 to 2008.

Some of the trees were also bathed in elevated levels of ground-level ozone, the primary constituent in smog, to simulate the increasingly polluted air of the future. Both parts of the federally funded experiment—the carbon dioxide and the ozone treatments—produced unexpected results.

In addition to trapping heat, carbon dioxide is known to have a fertilizing effect on trees and other plants, making them grow faster than they normally would. Climate researchers and ecosystem modelers assume that in coming decades, carbon dioxide’s fertilizing effect will temporarily boost the growth rate of northern temperate forests.

Previous studies have concluded that this growth spurt would be short-lived, grinding to a halt when the trees can no longer extract the essential nutrient nitrogen from the soil.

But in the Rhinelander study, the trees bathed in elevated carbon dioxide continued to grow at an accelerated rate throughout the 12-year experiment. In the final three years of the study, the CO2-soaked trees grew 26 percent more than those exposed to normal levels of carbon dioxide.

It appears that the extra carbon dioxide allowed trees to grow more small roots and “forage” more successfully for nitrogen in the soil, Zak said. At the same time, the rate at which microorganisms released nitrogen back to the soil, as fallen leaves and branches decayed, increased.

“The greater growth has been sustained by an acceleration, rather than a slowing down, of soil nitrogen cycling,” Zak said. “Under elevated carbon dioxide, the trees did a better job of getting nitrogen out of the soil, and there was more of it for plants to use.”

Zak stressed that the growth-enhancing effects of CO2 in forests will eventually “hit the wall” and come to a halt. The trees’ roots will eventually “fully exploit” the soil’s nitrogen resources. No one knows how long it will take to reach that limit, he said.

The ozone portion of the 12-year experiment also held surprises.

Ground-level ozone is known to damage plant tissues and interfere with photosynthesis. Conventional wisdom has held that in the future, increasing levels of ozone would constrain the degree to which rising levels of carbon dioxide would promote tree growth, canceling out some of a forest’s ability to buffer projected climate warming.

In the first few years of the Rhinelander experiment, that’s exactly what was observed. Trees exposed to elevated levels of ozone did not grow as fast as other trees. But by the end of the study, ozone had no effect at all on forest productivity.

“What happened is that ozone-tolerant species and genotypes in our experiment more or less took up the slack left behind by those who were negatively affected, and that’s called compensatory growth,” Zak said. The same thing happened with growth under elevated carbon dioxide, under which some genotypes and species fared better than others.

“The interesting take home point with this is that aspects of biological diversity—like genetic diversity and plant species compositions—are important components of an ecosystem’s response to climate change,” he said. “Biodiversity matters, in this regard.”

Co-authors of the Ecology Letters paper were Kurt Pregitzer of the University of Idaho, Mark Kubiske of the U.S. Forest Service and Andrew Burton of Michigan Technological University. The work was funded by grants from the U.S. Department of Energy and the U.S. Forest Service.

Via Newswise

Released: 10/13/2011
Source: University of Michigan
Related Link:

http://newswise.com/articles/future-forests-may-soak-up-more-carbon-dioxide-than-previously-believed

Countdown: America’s No. 1 Solar Car Ready to Race the World

University of Michigan solar car team race crew member Ethan Lardner works on Quantum during a control stop on a practice race in Australia. Credit: Evan Dougherty

Newswise — ANN ARBOR, Mich.—With a cutting-edge solar car, an advanced strategy and an intrepid 16-student race crew, the University of Michigan’s national champion solar car team is ready for the upcoming World Solar Challenge. The 1,800-mile international contest starts on the north shore of Australia in Darwin on Oct. 16.

During the past two years of intense preparation, the team shaved 200 pounds off its 2009 car by weighing the vehicle bolt by bolt and streamlining nearly every part. They improved its aerodynamics by an estimated 30 percent. They tested in practice races across Michigan and in Australia. And they strategized with computer scientists and sailboat racers to come up with more accurate weather forecasting models.

University of Michigan solar car Quantum driving in Australia. The World Solar Challenge begins Oct. 16. Credit: Evan Dougherty

All they can do now, for the most part, is wait. And for some, that’s harder than it sounds.

“I just want to race!” said Chris Hilger, the team’s business manager, a junior in chemical engineering.

The World Solar Challenge is a grueling four-day race across the desert. Drivers rotate in four-hour shifts in a car that’s not designed for comfort. The cockpit can exceed 100 degrees. They sleep in tents on the side of Stuart Highway. U-M’s team is one of 37 competing from across the globe this year.

Michigan has finished third in this world race four times, most recently in 2009. That year’s model, Infinium, also nabbed a third consecutive national win for the team, which has six in all.

While the students are aiming for a world title with this year’s Quantum, they know the competition will be tough. And they are proud of their accomplishments so far.

University of Michigan solar car team race crew members Santosh Kumar, Aeresh Bilmoria, Jordan Feight and AJ Trublowski check under the hood during a control stop on a practice race in Australia. Credit: Evan Dougherty

“The team has done some pretty incredible things this year. We took on some ambitious designs and processes. We’re pushing the limits of what’s possible,” said Rachel Kramer, the team’s race manager, a junior neuroscience student.

“No matter how the race turns out, we can walk away knowing we’ve revolutionized how the team designs, builds and races solar cars.”

Released: 10/11/2011

Source: University of Michigan

Via Newswise

Related Link:

http://newswise.com/articles/countdown-america-s-no-1-solar-car-ready-to-race-the-world
