2011-08-10

Battery developer Leyden Energy recharges with $20M

Battery technology developers continue to attract venture funding.

Among them is Leyden Energy Inc. (Fremont, Calif.), which recently announced Series B funding totaling $20 million. The round was led by New Enterprise Associates (Menlo Park, Calif.) along with current investors Lightspeed Ventures, Sigma Partners and Walden Capital.

Leyden said it will use its latest infusion of capital to add manufacturing capacity for its next-generation lithium-ion batteries as well as for future development of advanced battery technologies. The production boost is tied to growing demand from customers in the "smaller consumer electronics sector," the company said.

Leyden's core technology includes thermal properties that address shortfalls in Li-ion battery performance at high temperatures. Chemical reactions in batteries speed up at high temperature, degrading performance and reducing the number of charging cycles. The company said it is offering a three-year warranty on its Li-ion batteries as compared to a standard one-year warranty.

“Market demand for high-performance, long-lasting batteries in consumer electronics is growing," Leyden Energy CEO Aakar Patel said in a statement.

The company also said Ron Bernal of New Enterprise Associates will join its board of directors. “What [Leyden] has introduced is really an energy storage platform that can be applied to a number of different product markets in order to increase the value that those applications bring to end customers," Bernal said in a statement.

Along with consumer electronics, Leyden Energy is also targeting the electric vehicle, smart grid and backup storage markets.

Study builds on plausible scenario for origin of life on Earth

The study, "A Route to Enantiopure RNA Precursors from Nearly Racemic Starting Materials," shows how the precursors to RNA could have formed on Earth before any life existed. It was authored by Jason E. Hein, Eric Tse and Donna G. Blackmond, a team of researchers with the Scripps Research Institute. Hein is now a chemistry professor with UC Merced.

Biological molecules, such as RNA and proteins, can exist in two mirror-image forms, called enantiomers, only one of which -- the "natural" form -- is used by life. By studying the chemical reactions carefully, the research team found that it was possible to generate only the natural form of the necessary RNA precursors by including simple amino acids.

"These amino acids changed how the reactions work and allowed only the naturally occurring RNA precursors to be generated in a stable form," said Hein. "In the end, we showed that an amazingly simple result emerged from some very complex and interconnected chemistry."

The natural enantiomer of the RNA precursor molecules formed a crystal structure visible to the naked eye. The crystals are stable and avoid normal chemical breakdown. They can exist until the conditions are right for them to change into RNA.

The study was led by Blackmond and builds on the work of John D. Sutherland and Matthew W. Powner published in 2009 and covered by outlets such as The New York Times and Wired. Sutherland is a chemist with Cambridge's Medical Research Council Laboratory of Molecular Biology. Powner is a post-doctoral scholar with Harvard University.



EUV delay will slow NAND supply growth

SANTA CLARA, Calif. – Delays delivering next-generation lithography will slow the growth in supply of NAND flash, said the chief technologist of SanDisk in a keynote address at the Flash Memory Summit here.

In an otherwise upbeat assessment of the outlook for the flash market, Yoram Cedar waved a yellow flag about delays fielding extreme ultraviolet lithography. Without EUV tools, the historical increases in flash supply and decreases in cost will become more moderate with future process technologies, he said.

Existing immersion lithography tools will serve flash makers down to geometries of less than 10nm, two generations from today's processes, he said. In addition, vendors are working to create 3-D stacks of NAND strings using existing fab tools to further boost capacity and supply, he added.

Further in the future, chip makers including SanDisk are developing 3-D structures that use changes in resistance to create denser chips. But the so-called resistive RAM will require EUV tools, he said.

Cedar declined to give any specifics about the timeframe for EUV or the status of the current 3-D chip research. However, he did say chip makers expect to ship 64 and 128 Gbit flash devices using immersion tools.

"Many people in the semiconductor industry are very concerned about EUV not only from the standpoint of its availability but also its cost--these things will cost many millions of dollars," said one audience member in question to Cedar after the keynote.

Some pre-production EUV tools reportedly began shipping in January. Costs for the tools could soar as high as $120 million, according to some reports.

Cedar expressed optimism that EUV systems will be affordable. He also noted historical fears of an end to Moore's Law have so far been unfounded.

"When we were at 90nm, we thought 56nm was difficult and may be the end of the game," he said.

The good news is flash demand is broad and strong. Flash is expected to grow 25 percent on a compound annual basis through 2015, nearly double the rate of hard disk storage and far above DRAM at only one percent, he said.

About a third of all NAND bits will go to smartphones by 2015 when as many as 1.1 billion units ship, Cedar said. Tablets will take another 15 percent of NAND bits for 327 million systems that year, he added.

"Tablets represent a sizeable market that came from nowhere," he said. "There is so much new development here that wasn’t forecast three or four years ago, and there's no reason this will not continue," he added.

He projected solid-state drives will consume 25 percent of NAND bits, selling into 133 million units for clients and 12 million for servers. The rest of NAND supply, about 26 percent, will go into existing systems such as MP3 players, USB drives and digital cameras, he said.

Researchers are working in parallel on 3-D flash structures using both current immersion lithography and extreme ultraviolet technology.



Solar flares: What does it take to be X-class? Sun emits an X-Class flare on August 9, 2011

The biggest flares are known as "X-class flares" based on a classification system that divides solar flares according to their strength. The smallest ones are A-class (near background levels), followed by B, C, M and X. Similar to the Richter scale for earthquakes, each letter represents a 10-fold increase in energy output. So an X is ten times an M and 100 times a C. Within each letter class there is a finer scale from 1 to 9.

C-class and smaller flares are too weak to noticeably affect Earth. M-class flares can cause brief radio blackouts at the poles and minor radiation storms that might endanger astronauts.

And then come the X-class flares. Although X is the last letter, there are flares more than 10 times the power of an X1, so X-class flares can go higher than 9. The most powerful flare measured with modern methods was in 2003, during the last solar maximum, and it was so powerful that it overloaded the sensors measuring it. The sensors cut out at X28.
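
For readers who want to see how the scale works in practice, here is a minimal sketch that converts a peak GOES 1-8 Angstrom X-ray flux reading into a flare class. The NOAA flux thresholds used below are standard but are an addition here; the article describes the scale only qualitatively.

    # Classify a solar flare from its peak GOES 1-8 Angstrom X-ray flux (W/m^2).
    # Thresholds are the standard NOAA class boundaries (not from the article).
    def flare_class(flux):
        thresholds = [("X", 1e-4), ("M", 1e-5), ("C", 1e-6), ("B", 1e-7), ("A", 1e-8)]
        for letter, base in thresholds:
            if flux >= base:
                # The multiplier can exceed 9 for X-class flares, e.g. X28 in 2003.
                return f"{letter}{flux / base:.1f}"
        return "below A-class (background levels)"

    print(flare_class(6.9e-4))  # X6.9, the August 9, 2011 flare
    print(flare_class(2.8e-3))  # X28.0, the 2003 event that saturated the sensors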

The biggest X-class flares are by far the largest explosions in the solar system and are awesome to watch. Loops tens of times the size of Earth leap up off the sun's surface when the sun's magnetic fields cross over each other and reconnect. In the biggest events, this reconnection process can produce as much energy as a billion hydrogen bombs.

If they're directed at Earth, such flares and associated CMEs can create long-lasting radiation storms that can harm satellites, communications systems, and even ground-based technologies and power grids. X-class flares on December 5 and December 6, 2006, for example, triggered a CME that interfered with GPS signals being sent to ground-based receivers.

NASA and NOAA -- as well as the US Air Force Weather Agency (AFWA) and others -- keep a constant watch on the sun to monitor for X-class flares and their associated magnetic storms. With advance warning many satellites and spacecraft can be protected from the worst effects.

On August 9, 2011 at 3:48 a.m. EDT, the sun emitted an Earth-directed X6.9 flare, as measured by the NOAA GOES satellite. These gigantic bursts of radiation cannot pass through Earth's atmosphere to harm humans on the ground; they can, however, disturb the atmosphere and disrupt GPS and communications signals. In this case, it appears the flare was strong enough to potentially cause some radio communication blackouts. It also produced increased solar energetic proton radiation -- enough to affect humans in space if they do not protect themselves.

There was also a coronal mass ejection (CME) associated with this flare. CMEs are another solar phenomenon that can send solar particles into space and affect electronic systems in satellites and on Earth. However, this CME is not traveling toward Earth, so no Earth-bound effects are expected.



Japan's Tohoku tsunami created icebergs in Antarctica

Kelly Brunt, a cryosphere specialist at Goddard Space Flight Center, Greenbelt, Md., and colleagues were able to link the calving of icebergs from the Sulzberger Ice Shelf in Antarctica to the Tohoku tsunami, which originated with an earthquake off the coast of Japan in March 2011. The finding, detailed in a paper published online in the Journal of Glaciology, marks the first direct observation of such a connection between tsunamis and icebergs.

The birth of an iceberg can come about in any number of ways. Often, scientists will see the towering, frozen monoliths break into the polar seas and work backwards to figure out the cause.

So when the Tohoku Tsunami was triggered in the Pacific Ocean on March 11 this spring, Brunt and colleagues immediately looked south. All the way south. Using multiple satellite images, Brunt, Emile Okal at Northwestern University and Douglas MacAyeal at University of Chicago were able to observe new icebergs floating off to sea shortly after the sea swell of the tsunami reached Antarctica.

To put the dynamics of this event in perspective: An earthquake off the coast of Japan caused massive waves to explode out from its epicenter. Swells of water swarmed toward an ice shelf in Antarctica, 8,000 miles (13,600 km) away, and about 18 hours after the earthquake occurred, those waves broke off several chunks of ice that together equaled about two times the surface area of Manhattan. According to historical records, this particular piece of ice hadn't budged in at least 46 years before the tsunami came along.
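
As a rough cross-check that is not part of the article, the implied travel speed is consistent with the textbook shallow-water wave speed for a tsunami crossing the deep ocean:

    import math

    # A tsunami in the open ocean travels at roughly the shallow-water wave
    # speed sqrt(g * depth). The 4,000 m depth is an illustrative assumption
    # for the Pacific; the distance and time are the article's figures.
    g = 9.81                               # m/s^2
    depth = 4000.0                         # m
    wave_speed_kmh = math.sqrt(g * depth) * 3.6
    implied_kmh = 13600.0 / 18.0           # distance / travel time

    print(f"shallow-water speed ~ {wave_speed_kmh:.0f} km/h")  # ~713 km/h
    print(f"implied speed ~ {implied_kmh:.0f} km/h")           # ~756 km/h

The two figures agree to within about 10 percent, which is as close as such a one-line model can be expected to get.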

And as all that was happening, scientists were able to watch the Antarctic ice shelves in as close to real-time as satellite imagery allows, and catch a glimpse of a new iceberg floating off into the Ross Sea.

"In the past we've had calving events where we've looked for the source. It's a reverse scenario -- we see a calving and we go looking for a source," Brunt said. "We knew right away this was one of the biggest events in recent history -- we knew there would be enough swell. And this time we had a source."

Scientists first speculated in the 1970s that repeated flexing of an ice shelf -- a floating extension of a glacier or ice sheet that sits on land -- by waves could cause icebergs to break off. Scientific papers in more recent years have used models and tide gauge measurements in an attempt to quantify the impact of sea swell on ice shelf fronts.

The swell was likely only about a foot high (30 cm) when it reached the Sulzberger shelf. But the consistency of the waves created enough stress to cause the calving. This particular stretch of floating ice shelf is about 260 feet (80 meters) thick, from its exposed surface to its submerged base.

When the earthquake happened, Okal immediately homed in on the vulnerable faces of the Antarctic continent. Using knowledge of iceberg calving and what a NOAA model showed of the tsunami's projected path across the unobstructed Pacific and Southern oceans, Okal, Brunt and MacAyeal began looking at what is called the Sulzberger Ice Shelf. The Sulzberger shelf faces Sulzberger Bay and New Zealand.

Through a fortuitous break in heavy cloud cover, Brunt spotted what appeared to be a new iceberg in Moderate Resolution Imaging Spectroradiometer (MODIS) data.

"I didn't have strong expectations either way whether we'd be able to see something," Brunt said. "The fastest imagery I could get to was from MODIS Rapid Response, but it was pretty cloudy. So I was more pessimistic that it would be too cloudy and we couldn't see anything. Then, there was literally one image where the clouds cleared, and you could see a calving event."

A closer look with synthetic aperture radar data from the European Space Agency's Envisat satellite, which can penetrate clouds, revealed two moderate-sized icebergs -- with more, smaller bergs in their wake. The largest iceberg was about four by six miles in surface area, roughly the surface area of one Manhattan; all the ice together equaled about two Manhattans. After looking at historical satellite imagery, the group determined the small outcropping of ice had been there since at least 1965, when it was captured by USGS aerial photography.

The proof that seismic activity can cause Antarctic iceberg calving might shed some light on our knowledge of past events, Okal said.

"In September 1868, Chilean naval officers reported an unseasonal presence of large icebergs in the southernmost Pacific Ocean, and it was later speculated that they may have calved during the great Arica earthquake and tsunami a month earlier," Okal said. "We know now that this is a most probable scenario."

MacAyeal said the event is more proof of the interconnectedness of Earth systems.

"This is an example not only of the way in which events are connected across great ranges of oceanic distance, but also how events in one kind of Earth system, i.e., the plate tectonic system, can connect with another kind of seemingly unrelated event: the calving of icebergs from Antarctica's ice sheet," MacAyeal said.

In what could be one of the more lasting observations from this whole event, the bay in front of the Sulzberger shelf was largely lacking sea ice at the time of the tsunami. Sea ice is thought to help dampen swells that might cause this kind of calving. At the time of the Sumatra tsunami in 2004, the potentially vulnerable Antarctic fronts were buffered by a lot of sea ice, Brunt said, and scientists observed no calving events that they could tie to that tsunami.

"There are theories that sea ice can protect from calving. There was no sea ice in this case," Brunt said. "It's a big chunk of ice that calved because of an earthquake 13,000 kilometers away. I think it's pretty cool."



Polar dinosaur tracks open new trail to past

The discovery, reported in the journal Alcheringa, is the largest and best collection of polar dinosaur tracks ever found in the Southern Hemisphere.

"These tracks provide us with a direct indicator of how these dinosaurs were interacting with the polar ecosystems, during an important time in geological history," says Emory paleontologist Anthony Martin, who led the research. Martin is an expert in trace fossils, which include tracks, trails, burrows, cocoons and nests.

The three-toed tracks are preserved on two sandstone blocks from the Early Cretaceous Period. They appear to belong to three different sizes of small theropods -- a group of bipedal, mostly carnivorous dinosaurs whose descendants include modern birds.

The research team also included Thomas Rich, from the Museum Victoria; Michael Hall and Patricia Vickers-Rich, both from the School of Geosciences at Monash University in Victoria; and Gonzalo Vazquez-Prokopec, an ecologist and expert in spatial analysis from Emory's Department of Environmental Studies.

The tracks were found on the rocky shoreline of remote Milanesia Beach, in Otways National Park. This area, west of Melbourne, is known for energetic surf and rugged coastal cliffs, consisting of layers of sediment accumulated over millions of years. Riddled with fractures and pounded by waves and wind, the cliffs occasionally shed large chunks of rock, such as those containing the dinosaur tracks.

One sandstone block has about 15 tracks, including three consecutive footprints made by the smallest of the theropods, estimated to be the size of a chicken. Martin spotted the tracks, the first known dinosaur trackway in Victoria, last June 14, around noon. He was on the lookout, since he had earlier noticed ripple marks and trace fossils of what looked like insect burrows in piles of fallen rock.

"The ripples and burrows indicate a floodplain, which is the most likely area to find polar dinosaur tracks," Martin explains. The second block containing tracks was spotted about three hours later by Greg Denney, a local volunteer who accompanied Martin and Rich on that day's expedition. That block had similar characteristics to the first one, and included eight tracks. The tracks show what appear to be theropods ranging in size from a chicken to a large crane.

"We believe that the two blocks were from the same rock layer, and the same surface, that the dinosaurs were walking on," Martin says.

The small, medium and large tracks may have been made by three different species, Martin says. "They could also belong to two genders and a juvenile of one species -- a little dinosaur family -- but that's purely speculative," he adds.

The Victoria Coast marks the seam where Australia was once joined to Antarctica. During that era, about 115-105 million years ago, the dinosaurs roamed in prolonged polar darkness. Earth's average temperature was 68 degrees Fahrenheit -- just 10 degrees warmer than today -- and the spring thaws would cause torrential flooding in the river valleys.

The dinosaur tracks were probably made during the summer, Martin says. "The ground would have been frozen in the winter, and in order for the waters to subside so that animals could walk across the floodplain, it would have to be later in the season," he explains.

Lower Cretaceous strata of Victoria have yielded the best-documented assemblage of polar dinosaur bones in the world. Few dinosaur tracks, however, have been found.

In February 2006, Martin found the first known carnivorous dinosaur track in Victoria, at a coastal site known as Dinosaur Dreaming.

In May 2006, during a hike to another remote site near Milanesia Beach, he discovered the first trace fossil of a dinosaur burrow in Australia. That find came on the heels of Martin's co-discovery of the first known dinosaur burrow and burrowing dinosaur, in Montana. The two discoveries suggest that burrowing behaviors were shared by dinosaurs of different species, in different hemispheres, and spanned millions of years during the Cretaceous Period.



Hybrid solar system makes rooftop hydrogen

ScienceDaily (Aug. 9, 2011) — While roofs across the world sport photovoltaic solar panels to convert sunlight into electricity, a Duke University engineer believes a novel hybrid system can wring even more useful energy out of the sun's rays.

Instead of systems based on standard solar panels, Duke engineer Nico Hotz proposes a hybrid option in which sunlight heats a combination of water and methanol in a maze of glass tubes on a rooftop. After two catalytic reactions, the system produces hydrogen much more efficiently than current technology without significant impurities. The resulting hydrogen can be stored and used on demand in fuel cells.

For his analysis, Hotz compared the hybrid system to three different technologies in terms of their exergetic performance. Exergy is a way of describing how much of a given quantity of energy can theoretically be converted to useful work.

"The hybrid system achieved exergetic efficiencies of 28.5 percent in the summer and 18.5 percent in the winter, compared to 5 to 15 percent for the conventional systems in the summer, and 2.5 to 5 percent in the winter," said Hotz, assistant professor of mechanical engineering and materials science at Duke's Pratt School of Engineering.

The paper describing the results of Hotz's analysis was named the top paper during the ASME Energy Sustainability Fuel Cell 2011 conference in Washington, D.C. Hotz recently joined the Duke faculty after completing post-graduate work at the University of California-Berkeley, where he analyzed a model of the new system. He is currently constructing one of the systems at Duke to test whether or not the theoretical efficiencies are borne out experimentally.

Hotz's comparisons took place during the months of July and February in order to measure each system's performance during summer and winter months.

Like other solar-based systems, the hybrid system begins with the collection of sunlight. Then things get different. While the hybrid device might look like a traditional solar collector from a distance, it is actually a series of copper tubes coated with a thin layer of aluminum and aluminum oxide and partly filled with catalytic nanoparticles. A combination of water and methanol flows through the tubes, which are sealed in a vacuum.

"This set-up allows up to 95 percent of the sunlight to be absorbed with very little being lost as heat to the surroundings," Hotz said. "This is crucial because it permits us to achieve temperatures of well over 200 degrees Celsius within the tubes. By comparison, a standard solar collector can only heat water between 60 and 70 degrees Celsius."

Once the evaporated liquid achieves these higher temperatures, tiny amounts of a catalyst are added, which produces hydrogen. This combination of high temperature and added catalysts produces hydrogen very efficiently, Hotz said. The resulting hydrogen can then be immediately directed to a fuel cell to provide electricity to a building during the day, or compressed and stored in a tank to provide power later.

The three systems examined in the analysis were a standard photovoltaic cell, which converts sunlight directly into electricity that is then used to split water electrolytically into hydrogen and oxygen; a photocatalytic system that produces hydrogen much like Hotz's system, but simpler and not yet mature; and a system in which photovoltaic cells turn sunlight into electricity that is then stored in different types of batteries (with lithium-ion being the most efficient).

"We performed a cost analysis and found that the hybrid solar-methanol is the least expensive solution, considering the total installation costs of $7,900 if designed to fulfill the requirements in summer, although this is still much more expensive than a conventional fossil fuel-fed generator," Hotz said.

Costs and efficiencies of systems can vary widely depending on location -- since the roof-mounted collectors that could provide all the building's needs in summer might not be enough for winter. A rooftop system large enough to supply all of a winter's electrical needs would produce more energy than needed in summer, so the owner could decide to shut down portions of the rooftop structure or, if possible, sell excess energy back to the grid.

"The installation costs per year including the fuel costs, and the price per amount of electricity produced, however showed that the (hybrid) solar scenarios can compete with the fossil fuel-based system to some degree," Hotz said. 'In summer, the first and third scenarios, as well as the hybrid system, are cheaper than a propane- or diesel-combusting generator."

This could be an important consideration, especially if a structure is to be located in a remote area where traditional forms of energy would be too difficult or expensive to obtain.

Hotz's research was supported by the Swiss National Science Foundation. Joining him in the study were UC-Berkeley's Heng Pan and Costas Grigoropoulos, as well as Seung H. Ko of the Korea Advanced Institute of Science and Technology, Daejeon.



Renesas offers glucose meter demo kit

MANHASSET, NY -- Renesas Electronics America has made its Continua demonstration platform available at a level of power efficiency that ensures long battery life.

The platform is based on the Renesas Electronics V850 microcontroller with certified Continua Blood Glucose agent software.

V850 MCUs, at about 350uA/DMIPS with up to 1 MB of flash memory and integrated USB function support, make it possible to handle the added software load without sacrificing the long battery life that end users have come to expect, according to Renesas.

By eliminating the error-prone process of manually collecting measurement data, Continua blood glucose meters as defined by the Continua Health Alliance help improve the quality of medical data and diagnosis.

The challenge in implementing these devices is that collecting and communicating the data adds a significant software burden to existing medical products. This software burden could become a human burden if the people who use these devices every day have to replace batteries frequently.

As an example, Renesas Electronics' 32-bit V850ES/Jx3-L MCUs relieve this burden with extraordinarily high efficiency -- 1.9 DMIPS/MHz -- which enables the V850 devices to process more than other MCUs at a lower clock frequency.
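
Those figures translate into battery life in a straightforward way. The back-of-envelope sketch below uses the article's 350 uA/DMIPS number; the workload, duty cycle, sleep current and coin-cell capacity are illustrative assumptions, not Renesas specifications.

    # Rough battery-life estimate for a duty-cycled meter MCU.
    current_per_dmips_ma = 0.350   # mA per DMIPS (figure from the article)
    avg_load_dmips = 0.5           # assumed average processing load when active
    duty_cycle = 0.05              # assumed fraction of time the MCU is awake
    sleep_current_ma = 0.002       # assumed deep-sleep current

    avg_current_ma = (current_per_dmips_ma * avg_load_dmips * duty_cycle
                      + sleep_current_ma * (1.0 - duty_cycle))
    battery_mah = 220.0            # typical CR2032 coin-cell capacity

    print(f"~{battery_mah / avg_current_ma / 24:.0f} days")  # under these assumptions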

The demonstration platform implements a set of guidelines for blood glucose meters developed by the Continua Health Alliance — a non-profit industry organization dedicated to establishing connected and interoperable personal health solutions.

Alliance guidelines utilize key industry standards to ensure an end-to-end architecture from personal medical devices to hospital information systems. One such standard is ISO/IEEE 11073, which defines a medical device, the measurements that it makes, and a protocol for communicating the measurement data. For the underlying communication transport method, the Continua guidelines specify open standards such as USB, Bluetooth and ZigBee.
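
As a purely hypothetical illustration of what such a standardized measurement looks like from the application's point of view, consider the sketch below; the field names are invented for this example and are not the attribute names ISO/IEEE 11073 actually defines.

    from dataclasses import dataclass
    from datetime import datetime

    # Hypothetical stand-in for the kind of record a device profile
    # standardizes; real ISO/IEEE 11073 attributes differ.
    @dataclass
    class GlucoseMeasurement:
        device_id: str        # identifies the meter that took the reading
        timestamp: datetime   # when the reading was taken
        value: float          # measured concentration
        unit: str             # e.g. "mg/dL" or "mmol/L"

    reading = GlucoseMeasurement("meter-01", datetime(2011, 8, 10, 7, 30), 95.0, "mg/dL")
    print(reading)  # transported to a host over USB, Bluetooth or ZigBee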

While the Continua demonstration platform uses USB, the V850ES/Jx3-L MCUs can also support Bluetooth and ZigBee functionality.

The Renesas Electronics V850 Continua demonstration platform runs software from Lamprey Networks Inc., developer of the reference code for the Continua Health Alliance. The software used in the platform has been certified by the Continua Health Alliance and is portable across multiple Renesas Electronics MCUs. Because the software is targeted to a specific type of medical device, the software footprint can be optimized.

The Continua Demonstration Platform for Renesas Electronics V850 MCUs is available now.

New eruption discovered at undersea volcano, after successfully forecasting the event

What makes the event so intriguing is that the scientists had forecast the eruption starting five years ago -- the first successful forecast of an eruption at an undersea volcano.

Bill Chadwick, an Oregon State University geologist, and Scott Nooner, of Columbia University, have been monitoring Axial Seamount for more than a decade, and in 2006 published a paper in the Journal of Volcanology and Geothermal Research in which they forecast that Axial would erupt before the year 2014. Their forecast was based on a series of seafloor pressure measurements that indicated the volcano was inflating.

"Volcanoes are notoriously difficult to forecast and much less is known about undersea volcanoes than those on land, so the ability to monitor Axial Seamount, and determine that it was on a path toward an impending eruption is pretty exciting," said Chadwick, who was chief scientist on the recent expedition, which was jointly funded by the National Oceanic and Atmospheric Administration and the National Science Foundation.

Axial last erupted in 1998 and Chadwick, Nooner and colleagues have monitored it ever since. They used precise bottom pressure sensors -- the same instruments used to detect tsunamis in the deep ocean -- to measure vertical movements of the floor of the caldera much like scientists would use GPS on land to measure movements of the ground. They discovered that the volcano was gradually inflating at the rate of 15 centimeters (six inches) a year, indicating that magma was rising and accumulating under the volcano summit.
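
The physics behind those bottom pressure measurements is simple hydrostatics: when the caldera floor rises, the water column above a sensor gets shorter and the pressure drops in proportion. The sketch below illustrates that relation; it is not the researchers' processing code, and the seawater density is an assumed typical value.

    # Convert a change in bottom pressure to vertical seafloor motion:
    # dP = rho * g * dh, so dh = dP / (rho * g).
    rho = 1025.0   # kg/m^3, typical seawater density (assumption)
    g = 9.81       # m/s^2

    def uplift_m(pressure_drop_pa):
        return pressure_drop_pa / (rho * g)

    # The ~15 cm/yr inflation reported above corresponds to roughly:
    print(rho * g * 0.15)  # ~1,500 Pa (1.5 kPa) of pressure change per year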

When Axial erupted in 1998, the floor of the caldera suddenly subsided or deflated by 3.2 meters (10.5 feet) as magma was removed from underground to erupt at the surface. The scientists estimated that the volcano would be ready to erupt again when re-inflation pushed the caldera floor back up to its 1998 level.

"Forecasting the eruption of most land volcanoes is normally very difficult at best and the behavior of most is complex and variable," said Nooner, who is affiliated with the Lamont-Doherty Earth Observatory. "We now have evidence, however, that Axial Seamount behaves in a more predictable way than many other volcanoes -- likely due to its robust magma supply coupled with its thin crust, and its location on a mid-ocean ridge spreading center.

"It is now the only volcano on the seafloor whose surface deformation has been continuously monitored throughout an entire eruption cycle," Nooner added.

The discovery of the new eruption came on July 28, when Chadwick, Nooner and University of Washington colleagues Dave Butterfield and Marvin Lilley led an expedition to Axial aboard the R/V Atlantis, operated by the Woods Hole Oceanographic Institution. Using Jason, a remotely operated robotic vehicle (ROV), they discovered a new lava flow on the seafloor that was not present a year ago.

"It's funny," Chadwick said. "When we first arrived on the seafloor, we thought we were in the wrong place because it looked so completely different. We couldn't find our markers or monitoring instruments or other distinctive features on the bottom. Once we figured out that an eruption had happened, we were pretty excited.

"When eruptions like this occur, a huge amount of heat comes out of the seafloor, the chemistry of seafloor hot springs is changed, and pre-existing vent biological communities are destroyed and new ones form," Chadwick added. "Some species are only found right after eruptions, so it is a unique opportunity to study them."

The first Jason ROV dive of the expedition targeted a field of "black smoker" hot springs on the western side of the caldera, beyond the reach of the new lava flows. Butterfield has been tracking the chemistry and microbiology of hot springs around the caldera since the 1998 eruption.

"The hot springs on the west side did not appear to be significantly disturbed, but the seawater within the caldera was much murkier than usual," Butterfield said, "and that meant something unusual was happening. When we saw the 'Snowblower' vents blasting out huge volumes of white floc and cloudy water on the next ROV dive, it was clear that the after-effects of the eruption were still going strong. This increased output seems to be associated with cooling of the lava flows and may last for a few months or up to a year."

The scientists will examine the chemistry of the vent water and work with Julie Huber of the Marine Biological Laboratory to analyze DNA and RNA of the microbes in the samples.

The scientists recovered seafloor instruments, including two bottom pressure recorders and two ocean-bottom hydrophones, which showed that the eruption took place on April 6 of this year. A third hydrophone was found buried in the new lava flows.

"So far, it is hard to tell the full scope of the eruption because we discovered it near the end of the expedition," said Chadwick, who works out of OSU's Hatfield Marine Science Center in Newport. "But it looks like it might be at least three times bigger than the 1998 eruption."

The lava flow from the 2011 eruption was at least two kilometers (1.2 miles) wide, the scientists noted.

"Five years ago, these scientists forecast this eruption, which has resulted in millions of square meters of new lava flows on the seafloor," said Barbara Ransom, program director in the National Science Foundation's Division of Ocean Sciences. "The technological advances that allow this research to happen will lead to a new understanding of submarine volcanoes, and of any related hazards."

The bottom-anchored instruments documented hundreds of tiny earthquakes during the volcanic eruption, but land-based seismic monitors and the Sound Surveillance System (SOSUS) hydrophone array operated by the U.S. Navy only detected a handful of them on the day of the eruption because many components of the hydrophone system are offline.

"Because the earthquakes detected back in April at a distance from the volcano were so few and relatively small, we did not believe there was an eruption," said Bob Dziak, an OSU marine geologist who monitors the SOSUS array. "That is why discovering the eruption at sea last week was such a surprise." Both Dziak and Chadwick are affiliated with the Cooperative Institute for Marine Resource Studies -- a joint NOAA/Oregon State University institute.

This latest Axial eruption caused the caldera floor to subside by more than two meters (six feet). The scientists will be measuring the rate of magma inflation over the next few years to see if they can successfully forecast the next event.

"The acid test in science -- whether or not you understand a process in nature -- is to try to predict what will happen based on your observations," Chadwick said. "We have done this and it is extremely satisfying that we were successful. Now we can build on that knowledge and look to apply it to other undersea volcanoes -- and perhaps even volcanoes on land."



Serial ATA, PCIe converge for flash drives

SAN JOSE, Calif. – The serial ATA interconnect will take a ride on PCI Express to support data rates of 8 and 16 Gbits/second to serve the accelerating needs of solid-state and hybrid drives. The move is a sign of the rising proliferation of both flash drives and PCIe.

The Serial ATA International Organization (SATA-IO) will create a so-called SATA Express standard as part of its version 3.2 specifications expected out by the end of the year. The spec essentially ports serial ATA software to the PCI Express transport and defines new connectors needed for cards and drives that use it.

Many early flash drives adopted 3 Gbit/second SATA because it was fast enough, low cost and widely supported in PCs. But with the advent of new, faster NAND flash interfaces the SATA "host interface has become the bottleneck," said Mladen Luksic, president of SATA-IO and an interface engineer at hard drive maker Western Digital.

SATA Express will handle up to two lanes of PCI Express to deliver 8 Gbits/second when implemented with PCIe Gen 2 or 16 Gbits/s with PCIe Gen 3. The newly minted 8 gigatransfers/second (GT/s) PCIe Gen 3 interface will begin shipping in volume in PC products over the next six months, said Amber Huffman, a technical lead at SATA-IO and a principal engineer in Intel's storage group.
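
Those throughput figures follow directly from the published PCIe line rates and encodings. The sketch below reconstructs them; the rates and encoding overheads are standard PCIe facts rather than numbers from the article.

    # Usable bandwidth per PCIe lane, by generation.
    def lane_bandwidth_gbps(gen):
        if gen == 2:
            return 5.0 * 8 / 10      # 5 GT/s with 8b/10b encoding -> 4 Gbit/s
        if gen == 3:
            return 8.0 * 128 / 130   # 8 GT/s with 128b/130b -> ~7.88 Gbit/s
        raise ValueError("unsupported PCIe generation")

    lanes = 2  # SATA Express carries up to two lanes
    print(lanes * lane_bandwidth_gbps(2))  # 8.0 Gbit/s
    print(lanes * lane_bandwidth_gbps(3))  # ~15.75 Gbit/s, quoted as 16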

Current SATA interfaces are usually implemented in PC chip sets and device SoCs using an embedded controller to support the Advanced Host Controller Interface. SATA Express will allow devices to tap directly into PCI Express links coming off chip interfaces and even some modern processors.

The low latency, particularly of the CPU links, "makes it a very interesting interface for solid-state drives," said Huffman.

Flash drives are on the rise and increasingly adopting PCIe.

Objective Analysis (Los Gatos, Calif.) forecasts PCIe will become dominant in server SSDs in 2012, with unit shipments greater than the combined shipments of serial-attached SCSI (SAS) and Fibre Channel drives. By 2015, the market watcher predicts more than two million PCIe SSDs will ship, more than all of the SATA SSDs sold in 2010.

Serial ATA is used in the vast majority of notebook and desktop hard drives, but less than a third of server drives, territory owned by the more robust and higher cost SAS interface.

SAS specialists such as LSI Corp. are already showing 12 Gbit/s SAS chips, but industry efforts are also underway to port SAS to PCI Express. Specs for a so-called SCSI over PCI Express standard are not expected to be finished for about a year.

The existing 6 Gbit/s serial ATA interface adequately serves a wide range of desktop, notebook and consumer systems, said Luksic. The SATA-IO group will explore needs for those systems as they evolve in the future, he said.

