Sunday, September 12, 2010

Electricity collected from the air could become the newest alternative energy source

Imagine devices that capture electricity from the air ― much
like solar cells capture sunlight ― and imagine using them to light a house or
recharge an electric car. Imagine using similar panels on the rooftops of buildings to prevent lightning before it forms. Strange as it may
sound, scientists already are in the early stages of developing such
devices, according to a report presented today at the 240th National
Meeting of the American Chemical Society (ACS).

"Our research could pave the way for turning electricity from the
atmosphere into an alternative energy source for the future," said study
leader Fernando Galembeck, Ph.D. His research may help explain a
200-year-old scientific riddle about how electricity is produced and
discharged in the atmosphere. "Just as solar energy could free some
households from paying electric bills, this promising new energy source
could have a similar effect," he maintained.
"If we know how electricity builds up and spreads in the atmosphere,
we can also prevent death and damage caused by lightning strikes,"
Galembeck said, noting that lightning causes thousands of deaths and
injuries worldwide and millions of dollars in property damage.
The notion of harnessing the power of electricity formed naturally
has tantalized scientists for centuries. They noticed that sparks of
static electricity formed as steam escaped from boilers. Workers who
touched the steam even got painful electrical shocks. Famed inventor
Nikola Tesla, for example, was among those who dreamed of capturing and
using electricity from the air. It's the electricity formed, for
instance, when water vapor collects on microscopic particles of dust and
other material in the air. But until now, scientists lacked adequate
knowledge about the processes involved in formation and release of
electricity from water in the atmosphere, Galembeck said. He is with the
University of Campinas in Campinas, SP, Brazil.
Scientists once believed that water droplets in the atmosphere were
electrically neutral, and remained so even after coming into contact
with the electrical charges on dust particles and droplets of other
liquids. But new evidence suggested that water in the atmosphere really
does pick up an electrical charge.

Galembeck and colleagues confirmed that idea, using laboratory
experiments that simulated water's contact with dust particles in the
air. They used tiny particles of silica and aluminum phosphate, both
common airborne substances, showing that silica became more negatively
charged in the presence of high humidity and aluminum phosphate became
more positively charged. High humidity means high levels of water vapor
in the air ― the vapor that condenses and becomes visible as "fog" on
windows of air-conditioned cars and buildings on steamy summer days.
"This was clear evidence that water in the atmosphere can accumulate
electrical charges and transfer them to other materials it comes into
contact with," Galembeck explained. "We are calling this
'hygroelectricity,' meaning 'humidity electricity'."
In the future, he added, it may be possible to develop collectors,
similar to the solar cells that collect sunlight to produce
electricity, to capture hygroelectricity and route it to homes and
businesses. Just as solar cells work best in sunny areas of the world,
hygroelectrical panels would work more efficiently in areas with high
humidity, such as the northeastern and southeastern United States and
the humid tropics.

Galembeck said that a similar approach might help prevent lightning
from forming and striking. He envisioned placing hygroelectrical panels
on top of buildings in regions that experience frequent thunderstorms.
The panels would drain electricity out of the air, preventing the
buildup of electrical charge that is released as lightning. His
research group already is testing metals to identify those with the
greatest potential for use in capturing atmospheric electricity and preventing lightning strikes.
"These are fascinating ideas that new studies by ourselves and by
other scientific teams suggest are now possible," Galembeck said. "We
certainly have a long way to go. But the benefits in the long range of
harnessing hygroelectricity could be substantial."

http://www.physorg.com/news201958072.html

Saturday, September 11, 2010

Graphene may hold key to speeding up DNA sequencing

An extremely thin membrane, a mere one atom thick, lives up to its acclaim as a 'rapidly rising star'

Cambridge, Mass. - September 9, 2010 - In a paper published as the cover story of the September 9, 2010 issue of Nature,
researchers from Harvard University and MIT have demonstrated that
graphene, a surprisingly robust planar sheet of carbon just one-atom
thick, can act as an artificial membrane separating two liquid
reservoirs.
By drilling a tiny pore just a few nanometers in diameter, called a
nanopore, in the graphene membrane, they were able to measure exchange
of ions through the pore and demonstrated that a long DNA molecule can
be pulled through the graphene nanopore just as a thread is pulled
through the eye of a needle.

"By measuring the flow of ions passing through a nanopore drilled in
graphene, we have demonstrated that graphene immersed in
liquid is less than 1 nm thick, many times thinner than the very
thin membrane which separates a single animal or human cell from its
surrounding environment," says lead author Slaven Garaj, a Research
Associate in the Department of Physics at Harvard. "This makes graphene
the thinnest membrane able to separate two liquid compartments from each
other. The thickness of the membrane was determined by its interaction
with water molecules and ions."
Graphene, the strongest material known, has other advantages. Most importantly, it is electrically conductive.

"Although the membrane prevents ions and water from flowing through
it, the graphene membrane can attract different ions and other chemicals
to its two atomically close surfaces. This affects graphene's
electrical conductivity and could be used for chemical sensing," says
co-author Jene Golovchenko, Rumford Professor of Physics and Gordon
McKay Professor of Applied Physics at Harvard, whose pioneering work
started the field of artificial nanopores in solid-state membranes.
"I believe the atomic thickness of the graphene makes it a novel
electrical device that will offer new insights into the physics of
surface processes and lead to a wide range of practical applications,
including chemical sensing and detection of single molecules."

In recent years graphene has astonished the scientific community with
its many unique properties and potential applications, ranging from
electronics and solar energy research to medical applications.
Jing Kong, also a co-author on the paper, and her colleagues at MIT
first developed a method for the large-scale growth of graphene films
that was used in the work.
The graphene was stretched over a silicon-based frame, and inserted
between two separate liquid reservoirs. An electrical voltage applied
between the reservoirs pushed the ions toward the graphene membrane. When a
nanopore was drilled through the membrane, this voltage channeled the
flow of ions through the pore and registered as an electrical current
signal.

When the researchers added long DNA chains in the liquid, they were
electrically pulled one by one through the graphene nanopore. As the DNA
molecule threads the nanopore, it blocks the flow of ions, resulting in
a characteristic electrical signal that reflects the size and
conformation of the DNA molecule.
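The readout described above amounts to watching for transient dips in the ionic current while a molecule occupies the pore. A minimal sketch of that idea, using a synthetic current trace and a simple threshold (all numbers are illustrative, not from the paper, and this is not the authors' analysis code):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic open-pore ionic current (illustrative units), with noise.
baseline = 10.0
current = baseline + 0.1 * rng.standard_normal(5000)

# Inject two "translocation events": the DNA partially blocks the pore,
# reducing the ionic current for the duration of the passage.
current[1000:1200] -= 4.0
current[3000:3400] -= 4.0

# Detect events as contiguous runs where the current falls below a threshold.
threshold = baseline - 2.0
blocked = current < threshold
edges = np.flatnonzero(np.diff(blocked.astype(int)))
events = edges.reshape(-1, 2)  # (start, end) index pairs

print(len(events))        # number of detected translocations
for start, end in events:
    print(end - start)    # dwell time in samples, reflecting the molecule
```

The dwell time and depth of each dip are the "characteristic electrical signal" the article refers to: they encode the size and conformation of the translocating molecule.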
Co-author Daniel Branton, Higgins Professor of Biology, Emeritus at
Harvard, is one of the researchers who, more than a decade ago, initiated
the use of nanopores in artificial membranes to detect and characterize
single molecules of DNA.
Together with his colleague David Deamer at the University of
California, Branton suggested that nanopores might be used to quickly
read the genetic code, much as one reads the data from a ticker-tape
machine.

As a DNA chain passes through the nanopore, the nucleobases, which
are the letters of the genetic code, can be identified. But a nanopore
in graphene is the first nanopore short enough to distinguish between
two closely neighboring nucleobases.
Several challenges still remain to be overcome before a nanopore can
do such reading, including controlling the speed with which DNA threads
through the nanopore.
When achieved, nanopore sequencing could lead to very inexpensive
and rapid DNA sequencing and has potential to advance personalized
health care.

"We were the first to demonstrate DNA translocation through a truly
atomically thin membrane. The unique thickness of the graphene might
bring the dream of truly inexpensive sequencing closer to reality. The
research to come will be very exciting," concludes Branton.

http://www.eurekalert.org/pub_releases/2010-09/hu-gmh091010.php

 

Astronomers to detect alien volcanoes

SYDNEY: Astronomers may soon be able to detect volcanic activity on
planets outside our Solar System, providing further insight into
‘Earth-like’ alien worlds, according to a recent paper.

When large, explosive volcanic eruptions occur, they emit high
quantities of sulphur dioxide into the stratosphere. Without an
eruption, however, sulphur dioxide only occurs in an Earth-like
stratosphere in very small amounts.
Now scientists have developed a model for eruptions on an Earth-like
exoplanet, finding that the presence of volcanic sulphur dioxide could
be used to remotely detect a volcanic eruption, despite the fact that
technology for imaging the surface of an exoplanet remains decades away.

Catching that first glimpse
“Measuring volcanic activity can be just one new tool in our
near-term toolbox, along with atmospheric spectra, to get an early
‘first glimpse’ into a planet's behaviour, long before we can see
anything like the pattern of oceans, mountain ranges, islands, or
continents,” said co-author Lisa Kaltenegger, from the
Harvard-Smithsonian Center for Astrophysics in Cambridge, Massachusetts.
To look for volcanic sulphur dioxide, astronomers would rely on a
technique known as the secondary eclipse, which requires the exoplanet
to cross behind its star, as seen from Earth.

By collecting light from the star and planet together, then subtracting
the light of the star alone (measured while the planet is hidden),
astronomers are left with the signal from the planet. They can then
search that signal for signs of particular molecules.
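The subtraction at the heart of the secondary-eclipse technique is simple arithmetic on measured fluxes. A toy sketch with made-up numbers (the flux values are purely illustrative; sulphur dioxide does have an absorption band near 7.3 micrometers, but these are not real measurements):

```python
import numpy as np

# Hypothetical flux measurements at a few infrared wavelengths (arbitrary units).
wavelengths_um = np.array([7.0, 7.3, 7.6])              # SO2 absorbs near 7.3 um
out_of_eclipse = np.array([1.00020, 1.00012, 1.00021])  # star + planet dayside
in_eclipse     = np.array([1.00000, 1.00000, 1.00000])  # star alone (planet hidden)

# The difference isolates the planet's own emission spectrum.
planet_spectrum = out_of_eclipse - in_eclipse

# A dip at 7.3 um relative to its neighbours would hint at SO2 absorption
# in the planet's stratosphere, i.e. a possible recent eruption.
print(planet_spectrum)
```

In practice the planetary signal is a tiny fraction of the stellar flux, which is why the method demands very precise photometry rather than any imaging of the planet's surface.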

Finding planets like our own
“If we can find volcanoes on other planets, we can figure out if they
are similar to our own planet when it was young,” said Kaltenegger.
“Or, if [the exoplanet] is as old as the Earth, but still has huge
volcanoes, the question would be, why is that so? What makes that
‘Earth’ different from ours?”
Scientists think that the Earth was much more volcanic when it was
‘young’, and that this helped bring the temperature into a habitable
range.

NASA to test theory
Brad Carter from the University of Southern Queensland said the paper
presents a useful method for studying or detecting terrestrial planets
orbiting “even nearby” stars.
“Given the important role of volcanism in the development of Earth's
atmosphere and climate, this paper suggests a practical new way to
compare rocky extrasolar planets with our own world,” said Carter.
“The line of research taken in this paper suggests an extension of
the method of 'comparative planetology' that has already been successful
in understanding the worlds of our Solar System using comparisons of
different planets," he said.
Kaltenegger hopes to test the theory, and several others, when NASA launches the James Webb Space Telescope (JWST) in 2014.

New microscope breaks light microscopy resolution barrier

A new laser-equipped microscope at IU Bloomington's
Light Microscopy Imaging Center makes it possible to examine biological samples with unprecedented detail in three dimensions.


The $1.2 million DeltaVision OMX super-resolution
microscope from Applied Precision (Issaquah, Wash.) was paid for
entirely with funds from the American Recovery and Reinvestment Act of
2009, through a National Institutes of Health program that supports
high-end instrumentation at America's most deserving centers of higher
education.

"It's a fantastic and unique acquisition for our university," said cell
biologist Claire Walczak, the Imaging Center's executive director. "This
super-resolution microscope, one of only 16 in the world and one of
only 8 commercial units, is part of our vision to bring state-of-the-art
technology to IU's life science researchers, to enable them to address
questions that they did not have the ability to ask previously, due to
the lack of appropriate technologies."

Walczak is a professor of biochemistry and molecular biology in the
Medical Sciences Program Bloomington, an arm of the IU School of
Medicine. Walczak also holds an adjunct appointment in the IU
Bloomington Department of Biology and is part of the Biochemistry
Program.
The imager is exceptionally fast in collecting images of a biological
specimen, and this speed enables scientists to gather crucial data. The
device uses laser light of four different colors to illuminate samples,
while four extremely sensitive digital cameras capture images every 10
milliseconds at the imager's speediest setting. The device can produce
as many as 5,000 full-color images per minute for its major task of
producing high-resolution images. Known as a "structured illumination"
microscope, the device will help IU scientists attain a better
understanding of how proteins are distributed inside cells with
unprecedented resolution.

Most high-technology light microscopes reach the limits of resolution
at 250-300 nanometers -- the diameter of a small bacterial cell. The
new OMX microscope IU has acquired can produce clear images down to 100
nanometers in the lateral dimension. Resolution along the z-axis
(perpendicular, or coming out of the page) is somewhat lower but still
tremendously improved relative to previous technologies.
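The figures quoted above line up with the classical Abbe diffraction limit, d = λ/(2·NA), which structured illumination roughly halves. A back-of-envelope check with assumed values (the emission wavelength and numerical aperture below are illustrative assumptions, not the OMX's published specifications):

```python
def abbe_limit_nm(wavelength_nm, numerical_aperture):
    """Classical lateral resolution limit of a light microscope: d = lambda / (2 * NA)."""
    return wavelength_nm / (2.0 * numerical_aperture)

# Assumed values: green emission at 520 nm, oil-immersion objective with NA = 1.4.
conventional = abbe_limit_nm(520.0, 1.4)

# Structured illumination roughly doubles the resolvable spatial frequency,
# i.e. halves the smallest resolvable distance.
sim = conventional / 2.0

print(round(conventional))  # ~186 nm, within the conventional regime quoted above
print(round(sim))           # ~93 nm, consistent with the ~100 nm figure for the OMX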
"We'd envisioned this device would be most useful for
microbiologists, cell biologists, and neurobiologists at IU," Walczak
said. "But we expect scientists from many other fields will come up with
creative ways to take advantage of it."

Light Microscopy
Imaging Center (LMIC) Manager Jim Powers is responsible for training IU
researchers -- as well as visitors -- to use the device.
"The Imaging Center is a user-oriented resource," Powers said.
"Scientists rent time on our devices, and receive training to use them,
but after that, we expect they'll be able to work independently."

IU scientists get a reduced rate when using the LMIC's many microscopes,
due to the generous support from OVPR, the College of Arts and
Sciences, Medical Sciences, and Optometry. At present, the OMX is still
in a training mode in which Powers is working closely with Sid Shaw, an
assistant professor of biology and the technical director of the LMIC,
as well as IU research staff to calibrate the device and establish
protocols for future, similar uses. The LMIC staff expects the
instrument to be available to all IU researchers by September.

The arrival of the DeltaVision OMX microscope has spurred Walczak,
Shaw, and Powers to consider LMIC's future needs. Partly because of the
DeltaVision OMX's size, the LMIC is now out of physical space. In
addition, the device produces so much data (4,000 images take up about
1.5 gigabytes of hard drive space) that Walczak and Powers said one of the
center's next priorities is to improve the center's information
technology infrastructure through continued collaboration with IU's
Information Technology Services. Walczak and Powers want to ensure that
the large data sets produced by the OMX imager can be stored rapidly --
as well as protected from power outages and other catastrophes.
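The storage figures quoted above imply a substantial sustained data rate. A quick back-of-envelope estimate from those same numbers:

```python
# Figures quoted in the article: up to 5,000 images per minute,
# and roughly 1.5 GB of disk space per 4,000 images.
images_per_minute = 5000
gb_per_image = 1.5 / 4000

gb_per_minute = images_per_minute * gb_per_image
print(gb_per_minute)  # roughly 1.9 GB per minute at full speed
```

At that rate an hour-long imaging session approaches the capacity of the hard drives common at the time, which is why storage infrastructure became a priority alongside the instrument itself.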


"We have some new things to think about, and lots of new things to see," Walczak said.

International research team develops ultrahigh-power energy storage devices

A team of researchers from the U.S. and France report the development of a micro-supercapacitor with remarkable properties.
The paper was published in the premier scientific journal Nature Nanotechnology online on August 15.

These micro-supercapacitors have the potential to power
nomadic electronics, wireless sensor networks, biomedical implants, active
radiofrequency identification (RFID) tags and embedded microsensors,
among other devices.
Supercapacitors, also called electric double layer capacitors (EDLCs)
or ultracapacitors, bridge the gap between batteries, which offer high
energy densities but are slow, and “conventional” electrolytic
capacitors, which are fast but have low energy densities.

The newly developed devices described in Nature Nanotechnology have
powers per volume that are comparable to electrolytic capacitors,
capacitances that are four orders of magnitude higher, and energies per
volume that are an order of magnitude higher. They were also found to be
three orders of magnitude faster than conventional supercapacitors,
which are used in backup power supplies, wind power generators and other
machinery.
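For a capacitor of any kind, the stored energy is E = ½CV². The "four orders of magnitude" capacitance gap therefore translates directly into the same gap in stored energy at a given voltage. A sketch with hypothetical component values (these numbers are illustrative, not taken from the paper):

```python
def energy_joules(capacitance_farads, volts):
    """Energy stored in a capacitor: E = 1/2 * C * V^2."""
    return 0.5 * capacitance_farads * volts ** 2

voltage = 2.5  # volts, a typical supercapacitor cell voltage

# Hypothetical devices compared at the same voltage:
electrolytic = energy_joules(100e-6, voltage)  # a 100 uF electrolytic capacitor
supercap     = energy_joules(1.0, voltage)     # a 1 F supercapacitor (10^4 more C)

print(electrolytic)  # 10,000x less stored energy...
print(supercap)      # ...than this, the same ratio as the capacitances
```

The trade-off runs the other way for power: electrolytic capacitors have very low internal resistance and so deliver their (small) energy almost instantly, which is the gap these micro-supercapacitors aim to close.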

These new devices have been dubbed “micro-supercapacitors”
because they are only a few micrometers (millionths of a meter) thick.
What makes this possible? “Supercapacitors store energy in layers of
ions at high surface area electrodes,” said Dr. Yury Gogotsi, Trustee
Chair Professor of materials science and engineering at Drexel
University, and a co-author of the paper. “The higher the surface area
per volume of the electrode material, the better the performance of the
supercapacitor.”

Vadym Mochalin, research assistant professor of materials science and
engineering at Drexel and co-author, said, “We use electrodes made of
onion-like carbon, a material in which each individual particle is made
up of concentric spheres of carbon atoms, similar to the layers of an onion. Each particle is 6-7 nanometers in diameter.”
This is the first time a material with very small spherical particles
has been studied for this purpose. Previously investigated materials
include activated carbon, nanotubes, and carbide-derived carbon (CDC).


“The surface of the onion-like carbons is fully accessible to ions,
whereas with some other materials, the size or shape of the pores or of
the particles themselves would slow down the charging or discharging
process,” Mochalin said. “Furthermore, we used a process to assemble the
devices that did not require a polymer binder material to hold the
electrodes together, which further improved the electrode conductivity
and the charge/discharge rate. Therefore, our supercapacitors can
deliver power in milliseconds, much faster than any battery or supercapacitor used today.”

Researchers make magnetic fields breakthrough


Researchers at the University of Dundee
have made a breakthrough in the study of magnetic fields, which enhances
our understanding of how stars, including the Sun, work.

The team from the Magnetohydrodynamics research group in
the School of Engineering, Physics and Mathematics used state-of-the-art
computer simulations of evolving plasmas in the Sun's atmosphere.

By following how the magnetic field and the plasma interact, they
have uncovered new rules that govern what evolutions are possible.
Knowing the basic rules behind the apparently complex solar atmosphere gives the team hope of predicting how it will behave.
Magnetic fields cannot be directly seen, felt or tasted, but they are
a ubiquitous force of nature.
 

The neat pattern of magnetic "field
lines" from a bar magnet is well-known from school physics experiments.
Indeed, the magnetic field of the Earth itself has a similar pattern on a
much larger scale, which is what enables navigation by compass.
But magnetic fields are not always so ordered. Telescopic pictures of
the Sun's lower atmosphere taken in extreme-ultraviolet light, outside
the visible spectrum,
reveal the shape of the magnetic field lines because the plasma
particles emitting the light are guided by magnetic forces and move
along the magnetic field lines.


These images often reveal braiding and tangling of the field, in a
manner that would render a compass useless. The fact that the magnetic
field lines are tangled like spaghetti means that the plasma in the
Sun's atmosphere is not free to move around however it pleases and that
vast quantities of energy can be locked in the magnetic field, because
tangled fields have more energy than ordered fields.
Scientists believe that this energy is responsible for heating the Sun’s atmosphere to million-degree temperatures, but how this works in detail is a longstanding puzzle in solar physics.

The Dundee team hope their discovery will give us a better idea of just how this energy is released.
'Using these computer simulations,
we have studied braided magnetic fields and made a significant advance
in understanding how they evolve over time,' said Dr Gunnar Hornig, one
of the paper’s authors.

'You can observe magnetic fields on the Sun with satellites and see
that these structures are often braided. That is they are not just
simple loops, but these loops interlink.
'These structures are not static. They evolve because the Sun is not a
rigid body but essentially a plasma ball of gas. It kind of boils, and
the motion on the surface changes these magnetic structures. They start
to move them around and sometimes the braiding is increased. And if
certain critical conditions are met then these structures start to relax
to something simpler.

'If you take a twig of a branch and start to twist it, then at some
point it starts to break and the individual fibres break up. Something
similar happens to these magnetic fields. Where it differs is that the
evolutions we have been studying allow the broken fields to combine to
form new structures.'
Having investigated how magnetic field
braiding works in a specific instance, the team will now switch their
attention to examining how they work in more general, complex
structures.


'We began by looking at braided magnetic fields in the Sun’s
atmosphere,' explained Dr Anthony Yeates, one of the team members. 'We
know that these magnetic fields break up and reconnect and we have now
discovered new rules governing which evolutions are possible and how
this is happening.
'This is fundamental research - part of the theory of astrophysical
plasmas. It forms part of our attempts to understand how stars work,
which enhances our understanding of how our own Sun evolves, and how it
affects the climate and life on Earth.'


Their research has been published in the latest edition of Physical Review Letters, as a paper entitled 'Topological constraints on magnetic relaxation'.
The ongoing research project on quantifying magnetic fluxes started
last October, and is funded by the Science and Technology Facilities
Council.

Scientists discover first new chlorophyll in 60 years

University of Sydney scientists have
stumbled upon the first new chlorophyll to be discovered in over 60
years and have published their findings in the international journal Science.
Found by accident in stromatolites from Western Australia's Shark Bay, the new pigment named chlorophyll f can utilise lower light energy than any other known chlorophyll.

The historic study published online in Science, challenges our
understanding of the physical limits of photosynthesis - revealing that
small-scale molecular changes to the structure of chlorophyll allow
photosynthetic organisms to survive in almost any environment on Earth.


The new chlorophyll was discovered deep within stromatolites -
rock-like structures built by photosynthetic bacteria, called
cyanobacteria - by lead author Dr Min Chen from the University of
Sydney.

A team of interdisciplinary scientists, including Dr Martin Schliep
and Dr Zhengli Cai (University of Sydney); Associate Professor Robert
Willows (Macquarie University); Professor Brett Neilan (University of
New South Wales) and Professor Hugo Scheer (University of Munich),
characterised the absorption properties and chemical structure of chlorophyll f, making it the fifth known type of chlorophyll molecule on Earth.


Chlorophyll is the essential molecule in oxygenic photosynthesis -
the process that enables plants, algae and some bacteria to convert
carbon dioxide into sugar and oxygen by using free energy from sunlight.
Until recently, oxygenic photosynthesis was thought to occur only in
light that is visible to human eyes, between 400 nm and 700 nm, as
chlorophyll was strictly limited to absorbing light in this range.

This was overturned in 1996 when scientists found a cyanobacterium
that could photosynthesise using light just outside the visible
spectrum - at 710nm, in the infrared region - using a modified
chlorophyll molecule, named chlorophyll d. Since this discovery, scientists around the world have been puzzled by how chlorophyll d is able to get enough energy from infrared light for photosynthesis.
Now the rules of photosynthesis need to be rewritten again, with the
discovery of a new chlorophyll that can absorb light of even lower
photon energy - 720nm - making it the most red-shifted chlorophyll to
date.
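The wavelengths quoted above map directly onto photon energies via E = hc/λ. A short check of how much less energy a 720 nm photon carries than one at the classic 700 nm red limit:

```python
# Photon energy E = h * c / wavelength, expressed in electronvolts.
h = 6.62607015e-34    # Planck constant, J*s
c = 2.99792458e8      # speed of light, m/s
eV = 1.602176634e-19  # joules per electronvolt

def photon_energy_ev(wavelength_nm):
    return h * c / (wavelength_nm * 1e-9) / eV

print(round(photon_energy_ev(700.0), 3))  # ~1.771 eV at the classic red limit
print(round(photon_energy_ev(720.0), 3))  # ~1.722 eV for chlorophyll f's 720 nm
```

The difference is only a few percent per photon, but it sits below the threshold that was long assumed necessary to drive oxygenic photosynthesis, which is what makes the discovery surprising.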

In ecological terms, chlorophyll f allows cyanobacteria living
deep within stromatolites to photosynthesise using low-energy infrared
light, the only light able to penetrate into the structure, which
challenges further our understanding of the physical limits of
photosynthesis.
Dr Chen, from the School of Biological Sciences, explains:

"Finding  the new chlorophyll was totally unexpected - it was one of those
serendipitous moments of scientific discovery.
"I was actually looking for chlorophyll d, which we knew could
be found in cyanobacteria living in low light conditions. I thought
that stromatolites would be a good place to look, since the bacteria in
the middle of the structures don't get as much light as those on the
edge."

After obtaining a sample of stromatolite from Hamelin Pool, Dr Chen looked for chlorophyll d
by culturing the cyanobacterial sample in infrared light of 720nm. This
ensured only the survival of cyanobacteria that had chlorophylls able
to absorb and use infrared light.
High performance liquid chromatography of the cultured sample
performed six months later revealed not only trace amounts of
chlorophyll d, but also a new chlorophyll not seen before.

Testing the optical absorption spectrum of the new chlorophyll
revealed that it could absorb much longer wavelengths of light than any
other known chlorophyll - 10nm longer than chlorophyll d and more than 40nm longer than chlorophyll a.
Sequential mass spectral analysis revealed the molecular weight of
the new pigment to be 906 mass units. Then nuclear magnetic resonance
(NMR) spectroscopy was performed to determine the chemical structure of
the chlorophyll. Results indicated that chlorophylls a, b, d and f
have very similar chemical structures, differing only in the position
of a substitution. Yet these tiny differences in structure give the
chlorophylls very different spectral properties, allowing them to
function in very different light environments.

"Discovering this new chlorophyll has completely overturned the
traditional notion that photosynthesis needs high energy light," Dr Chen
said.


"It is amazing that this new molecule, with a simple change to its
chemical structure, can absorb extremely low energy light. This means
that photosynthetic organisms can utilise a much larger portion of the
solar spectrum than we previously thought and that the efficiency of
photosynthesis is much greater than we ever imagined.

"Chlorophyll f, and its ability to absorb infrared light, can have numerous applications to industries like plant biotechnology and bioenergy.


"For us, the next challenge is to work out the function of this new chlorophyll in photosynthesis.
"Is its job to capture additional red light and pass it on to another
chlorophyll, like chlorophyll a, in the reaction centre for
photosynthesis?

"Or is it the only chlorophyll responsible for photosynthesis in the
cyanobacterium? And if it is, then we will be speechless wondering how
this molecule can get enough energy from infrared light to make oxygen
from water.


"Whatever happens next, the fact that we have discovered a
cyanobacterium that exploits a tiny modification in its chlorophyll
molecule to photosynthesise in light that we cannot see, opens our mind
to the seemingly limitless ways that organisms adapt to survive in their
environment."

Rosette Nebula: The Heart of a Rose

This composite image shows the Rosette star formation region, located about 5,000 light years from Earth.

Data from the Chandra X-ray Observatory are colored red and outlined by
a white line.

The X-rays reveal hundreds of young stars in the central cluster and fainter
clusters on either side. Optical data from the Digitized Sky Survey and
the Kitt Peak National Observatory (purple, orange, green and blue) show
large areas of gas and dust, including giant pillars that remain behind after intense radiation from massive stars has eroded the more diffuse gas.

A recent Chandra study of the cluster on the right side of the image,
named NGC 2237, provides the first probe of the low-mass stars in this
satellite cluster. Previously only 36 young stars had been discovered in
NGC 2237, but the Chandra work has increased this sample to about 160
stars. The presence of several X-ray emitting stars around the pillars
and the detection of an outflow -- commonly associated with very young
stars -- originating from a dark area of the optical image indicate
that star formation is continuing in NGC 2237.

By combining these  results with earlier studies, the scientists conclude that the central  cluster formed first, followed by expansion of the nebula, which triggered the formation of the two neighboring clusters, including NGC 2237.

This work was led by Junfeng Wang of the Harvard-Smithsonian Center
for Astrophysics. The co-authors were Eric Feigelson, Leisa Townsley,
Pat Broos and Gordon Garmire from Penn State University, Carlos
Roman-Zuniga from the German-Spanish Astronomical Center in Spain, and
Elizabeth Lada from the University of Florida.

source:
http://chandra.harvard.edu/photo/2010/rosette/

Cheaper, better solar cell is full of holes


A new low-cost etching technique developed at the U.S.
Department of Energy's National Renewable Energy Laboratory can put a
trillion holes in a silicon wafer the size of a compact disc.

As the tiny holes deepen, they make the silvery-gray
silicon appear darker and darker until it becomes almost pure black and
able to absorb nearly all colors of light the sun throws at it.
At room temperature, the black silicon wafer can be made in about
three minutes. At 100 degrees F, it can be made in less than a minute.


The breakthrough by NREL scientists likely will lead to lower-cost solar cells that are nonetheless more efficient than the ones used on rooftops and in solar arrays today.
R&D Magazine recently awarded the NREL team one of its R&D
100 awards for Black Silicon Nanocatalytic Wet-Chemical Etch. Called
"the Oscars of Invention," the R&D 100 awards recognize the most
significant scientific breakthroughs of the year.
Howard Branz, the principal investigator for the project, said his
team got interested in late 2006 after he heard a talk by a scientist
from the Technical University of Munich.

The scientist described how his
team had created black silicon by laying down a thin gold layer using a
vacuum deposition technique. Quickly, NREL senior scientist Qi Wang and
senior engineer Scott Ward gave it a try.
"We always ride on the shoulders of others," Branz said. "We started by replicating the Munich experiment."
Packets of Light, Golden Holes
Think of light as coming in little packets.
Each packet is a photon,
which can potentially be converted into an electron for solar energy. If
the photon bounces off the surface of a solar cell, that's energy lost.
Some of the light normally bounces off when it hits an object, but a
'black silicon' wafer absorbs nearly all the light that hits it.
The human eye perceives the wafer as black because almost no sunlight
reflects back to the retina. And that is because the trillion holes in
the wafer's surface do a much better job of absorbing the wavelengths of
light than a solid surface does.
It's roughly the same reason that ceiling tiles with holes in them
absorb sound better than ceiling tiles without holes.

By the late
19th century, scientists had already done experiments showing that what
works for absorbing sound also works for absorbing light.
The team from Munich used evaporation techniques that require
expensive vacuum pumps to lay down a very thin layer of gold, perhaps 10
atoms thick, Branz said. When a mixture of hydrogen peroxide and
hydrofluoric acid was poured on the thin gold layer, nanoparticles of
gold bored into the smooth surface of the wafer, making billions of
holes.

The NREL team knew right away that the vacuum pumps and evaporative
equipment needed to deposit the gold were too costly to become
commercially viable.
NREL's Goal: Simplify the Process, Lower the Cost
"Our thinking was that if the goal is to make it cheaper, we want to avoid vacuum deposition completely," Branz said.
In a string of outside-the-box insights combined with some
serendipity, Branz and colleagues Scott Ward, Vern Yost and Anna Duda
greatly simplified that process.
Rather than laying the gold with vacuums and pumps, why not just spray it on? Ward suggested.


Rather than layering the gold and then adding the acidic mixture, why not mix it all together from the outset? Duda suggested.
In combination, those two suggestions yielded even better results.
The scientists put a suspended solution of gold nanoparticles, called
colloidal gold, on the silicon surface, and let the water evaporate
overnight to leave just the gold, which then etched into the wafer. The
wafer turned nearly as black as with the evaporated gold.

A Lucky Accident
And then, as is often the case with important scientific breakthroughs, serendipity entered.
NREL technician and chemist Vern Yost noticed after a time that he
wasn't getting such good results, and assumed it was because an old
batch of colloidal nanoparticles had somehow clumped together. So he
tried to separate them with aqua regia, a highly corrosive mixture of
nitric acid and hydrochloric acid.
Aqua regia is Latin for royal water,
and refers to a liquid that can dissolve noble metals such as gold
and platinum.
The aqua regia treatment got the process working better than ever,
and a little investigation found that the aqua regia had reacted with
the gold to form a solution of chloroauric acid.
Voila! Chloroauric acid is less expensive than colloidal gold and
actually is the chemical precursor that industry uses to make colloidal
gold.
Could the same black-silicon etching result be achieved by
substituting the inexpensive chloroauric acid for costly colloidal gold,
and then mixing it as before with hydrogen peroxide and hydrofluoric
acid? Yost and Branz wondered.
Yes, it worked. "Chloroauric acid is much cheaper than colloidal
gold," Branz said. "In essence, by skipping a few steps, they were able
to make gold nanoparticles from the chloroauric acid at the same time as
they were etching holes into the silicon with the gold they had made."
Once the concept was understood and the mix of materials solved, the actual making of a black silicon wafer became quite simple.

"You take a beaker, put a silicon wafer in, pour in the chloroauric
acid, pour in the hydrofluoric acid and hydrogen peroxide, and wait,"
Branz said.
As little as 20 seconds later, the silvery silicon wafer turns black.
"Our method gives a blacker silicon and would replace an expensive
vacuum deposition system with a single, cheap, wet etch step," Branz
said.
Cheaper Process Also Makes a Better Material
They tested their black silicon and found that the much-lower-cost
recipe containing chloroauric acid quickly reduced the unwanted
reflection to less than 2 percent.

The more costly approach using
conventional silicon nitride anti-reflection layers stalled out at about
3 to 7 percent reflection. As an added bonus, black silicon prevents
reflection of low-angle morning and afternoon sunlight far better than
the conventional antireflection layer.
To understand why their inexpensive approach worked so well, the team
brought in NREL optics expert and senior scientist Paul Stradins and
NREL electron microscopists Bobby To and Kim Jones.

The trio found that
the black silicon squelched reflection so well because the holes were
smaller in diameter than the solar wavelengths.
That's crucial, because if the holes were as big as these light
wavelengths, the light rays would recognize a "sharp interface," just as
they would if they encountered a stainless steel counter. Any sharp
interface causes the light from the sun to reflect from the surface
before it can enter the solar cell and be changed into electricity.
Another reason the sunlight never feels a sharp interface when it
hits the silicon is that all those trillions of holes are bored to
different depths, because of the randomness of the etch rate of each
nanoparticle. Because of the variable depths of the holes, the rays very
gradually move from air to silicon.

The light never encounters an
abrupt change from air to solid surface, so it doesn't bounce off the
wafer.
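That "sharp interface" penalty can be made concrete with the normal-incidence Fresnel formula. The refractive index of about 3.9 for crystalline silicon in the visible is a standard textbook value assumed here, not a figure from the article.

```python
def fresnel_reflectance(n1: float, n2: float) -> float:
    """Fraction of normally incident light reflected at an abrupt interface."""
    return ((n1 - n2) / (n1 + n2)) ** 2

n_air, n_si = 1.0, 3.9  # assumed refractive index of silicon in the visible
R = fresnel_reflectance(n_air, n_si)
print(f"bare silicon reflects about {R:.0%} of normally incident light")
```

Roughly a third of the light is lost at a bare, polished silicon surface, which is why some antireflection treatment is essential; the graded air-to-silicon transition of the etched holes takes that figure below 2 percent.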
But Will it Work in a Solar Cell?
Next was the formidable challenge of using the technology to make a workable solar cell.
Hao-Chi Yuan, a postdoctoral researcher, was added to the team to
figure out how best to work this new kind of silicon into a solar cell,
make the solar cells and determine the strengths and weaknesses of this
new kind of cell. Yuan, along with Yost, Branz and NREL engineer Matthew
Page, worked to determine the ideal depths and diameters of the holes if
the goal is to turn photons into electrons.
To keep a solar cell at or near the record 16.8 percent efficiency
rate they'd achieved, they realized the holes had to adhere to the
"Goldilocks" principle.

The holes must be "just right": deep enough to
block reflections, but not so deep that they spoil the solar cell.
Specifically, they found the best results occurred when the trillions
of holes were on average about 500 nanometers or half a micron deep,
and their diameters just a little bit narrower than the smallest
wavelength of light. (How small? The diameter of 40 holes, added
together, would be the thickness of a human hair.)
If the holes were much deeper, the solar cell would have trouble
pulling all of the solar-generated electrons out. Efficiencies would be
so low no one would want to put the cells on their roof.
Happily, that combination of depth and diameter can be achieved with a 3-minute wet-etch soak at room temperature.


Industry's Acutely Interested
Though they will be cheaper to manufacture, NREL's best solar cells
are still a few tenths of a percent less efficient than the conventional
type. But the low reflection means a jump in photovoltaic efficiency of
at least 1 percentage point could be achieved. The team is still
working to wrest a bit more efficiency from the black silicon cells. The
solar cell world has become a game of inches, Branz said, so "even half
a percentage point bump in efficiency at reduced cost would be huge."
Solar cell companies are interested in licensing the technology from NREL.


"We've had several companies come visit here to learn more about it,"
Chris Harris, associate director of licensing in NREL's
commercialization and technology transfer division, said. "The interest
is high."
"This is certainly a significant advantage in an industry where
everyone is competing for market share and the cost per watt is a key
selling feature," Harris added. "Black silicon provides an added benefit
on top of any other improvements in efficiency a company can get."
Al Goodrich, a senior cost analyst for NREL's PV manufacturing division, found that making the black silicon wafers requires about a third less energy than adding the conventional anti-reflection layer to the finished solar cell.


The one-step process also is a lot easier on the environment.
The technology would replace a process that uses dangerous silane
gas, as well as cleaning gases such as nitrogen trifluoride, which has
17,000 times more punch than carbon dioxide in contributing to global
warming. A switch to the black silicon wet etch technology would mean
huge reductions in greenhouse gases, and improvements in the energy
payback for resulting PV devices.

It also reduces the capital costs of
starting a factory line by about 10 percent, because it replaces several
expensive vacuum vapor tools with a simple wet bath, Goodrich said.
NREL estimates that the black silicon can reduce cell conversion
costs by 4 to 8 percent, while using widely available industrial
materials and equipment.


"That's big," Goodrich added. "The people who are interested in this
technology recognize that that difference is valuable real estate."

Single gene regulates motor neurons in spinal cord

Discovery could help scientists develop new treatments for motor neuron diseases

New York (September 8, 2010) – In a surprising discovery,
scientists at NYU Langone Medical Center have found that a single type
of gene acts as a master organizer of motor neurons in the spinal cord.
The finding, published in the September 9, 2010 issue of Neuron, could help scientists develop new treatments for diseases such as Lou Gehrig's disease or spinal cord injury.

The "master organizer" is a member of the Hox family of genes, best
known for controlling the overall pattern of body development. By
orchestrating a cascade of gene expression in the early embryo, Hox
genes allow for the creation of an animal's overall structure and body
part orientation. Scientists first discovered the genes in fruit flies
but they have since detected Hox activity in mammals. Humans harbor 39
such genes and 21 have been identified as coordinating motor neurons in
the spinal cord.

"We knew that there were 21 Hox genes that determine how connections are
made between motor neurons in the spinal cord and muscles in the
limbs," says Jeremy S. Dasen, PhD, an associate professor in the
Departments of Physiology and Neuroscience at NYU Langone Medical Center
and a Howard Hughes Medical Institute Early Career Scientist. "But
what was surprising to us in this study was that a single Hox gene acts
as a global organizer of motor neurons and their connections. The next
step will be to see how Hoxc9 in motor neurons affects motor behaviors
such as walking and breathing."

In mammals, many hundreds of motor neurons are needed to control the
variety of muscle cells used to coordinate movement. Proper function
depends on each of these neurons in the embryo finding its way from the
spinal cord to the group of muscles that it is equipped to control.
Dr. Dasen and his colleagues have been working to discover the blueprint
for this motor neuron diversity.

For this study, the scientists studied mice with a mutation in the Hoxc9 gene.
They analyzed the molecular markers that distinguish between motor
neurons in the limb and thoracic areas and discovered that mutation of
Hoxc9 transformed the thoracic motor neurons into limb motor neurons. In a
series of biochemical experiments they further showed that Hoxc9
orchestrates gene expression in motor neurons by repressing the Hox
genes dedicated to limb coordination.

"What we are trying to understand is how the nervous system is wired to
control movements such as breathing and walking and see how genetic
programs can further control these circuits in terms of exploring this
paradigm as a way of looking at the vital circuits of the body," adds
Dr. Dasen.

Co-authors of the study include Heekyung Jung, Julie Lacombe, and
Jonathan Grinstein of NYU Langone Medical Center. The research was done
in collaboration with researchers at Columbia University Medical
Center, Massachusetts Institute of Technology and Memorial Sloan
Kettering Cancer Center.


The study was supported by a grant from the National Institutes of Health in Bethesda, Maryland.

 

http://www.eurekalert.org/pub_releases/2010-09/nlmc-sgr090310.php

Scientists identify new gene for memory

A team led by a Scripps Research Institute scientist has for the
first time identified a new gene that is required for memory formation in Drosophila, the common fruit fly. The gene may have similar
functions in humans, shedding light on neurological disorders such as
Alzheimer's disease or human learning disabilities.


The study was published in the September 9, 2010 edition (Vol. 67, No. 5) of the journal Neuron.

"This is the first time we have a new memory and learning gene that
lies outside what has been considered the most fundamental signaling
pathway that underlies learning in the fruit fly," said Ron Davis, chair
of Scripps Research Department of Neuroscience who led the study.
"Since many of the learning and memory genes originally identified in
the fruit fly are clearly involved in human neurological or psychiatric
diseases, this discovery may offer significant new insights into
multiple neurological disorders. We're definitely in the right
ballpark."

The study shows that different alleles or mutant forms of the gene, known as gilgamesh (gish), are required for short-term memory formation in Drosophila olfactory associative learning - learning that links a specific odor with a negative or positive reinforcer.
Because Drosophila learning genes are known to be conserved in higher organisms including humans, they often provide new insights into human brain disorders. For example, the Drosophila gene known as dunce, which Davis helped identify several years ago, provided clues to the genetics of the devastating psychiatric condition
of schizophrenia. Recent studies have revealed that the human version
of the dunce gene is a susceptibility determinant for schizophrenia. In a
similar way, any new learning gene identified in Drosophila, including gilgamesh, may provide new clues to genes involved in human neurological or psychiatric disorders.


"We're still early in the process of making connections between Drosophila
memory and learning genes and the pathology of human disease," Davis
said, "but it's already clear that many of these genes will provide
important conceptual information and potential insights into human brain
disorders. In addition, there is every reason to believe that their
gene products will one day become the target of new drugs to enhance
cognition. Uncovering this new gene and its signaling pathway helps
bring us that much closer to this goal."

New Gene, New Pathway
To identify the new gene, Davis and his colleagues used a novel
screen for new memory mutants, looking for lines that showed abnormal
learning when only one of two copies of the gene was mutant.
"We used a dominant screen because we realized that behavior such as
learning and memory are very sensitive to gene dosage," Davis said.

"That is, the mutation of just one copy of a gene involved in behavior
is often sufficient to produce an abnormality."
The formation of new memories occurs, in part, through the activation
of molecular signaling pathways within neurons that comprise the neural
circuitry for learning, and for storing and retrieving those memories.
One of the things that makes the function of gish so interesting,
Davis noted, is the fact that it is independent of mutations of the
rutabaga gene, a Drosophila memory-learning pathway that is known
to be essential for memory formation.

The rutabaga gene encodes an enzyme that converts ATP, the energy currency of cells, into cyclic AMP or cAMP, which plays a critical role in olfactory learning in Drosophila.
"The cAMP pathway is the major signaling pathway used by Drosophila neurons to turn on other enzymes and genes that are necessary for memories to form," Davis said. "In fruit flies, memory and learning revolve around mutants of this pathway. It is fundamental to the process."
In the new study, gish provided an answer to a longstanding problem in Drosophila learning and memory research - the unexplained residual memory
performance of flies carrying rutabaga mutations, which indicated the
existence of an independent signaling pathway
for memory formation. While other memory mutants have been identified,
until the discovery of gish none had been shown to reduce the residual
learning of mutant rutabaga flies.

Interestingly, the study found that the gish gene encodes a kind of
casein kinase (enzymes that help regulate signaling pathways in cells)
called Iγ (CKIγ). This is the first time that this specific kinase has
been cited as having a role in memory formation.


The identification of all signaling pathways that are engaged in
specific neurons during memory formation and how they interact with one
another to encode memories is an issue of great importance, Davis said,
one that needs more exploration for a deeper understanding of memory
formation and memory failure in humans.

"The truth is that we have an extremely sketchy understanding of what
causes diseases like Alzheimer's," Davis said. "We need to understand a
lot more than we do now about normal brain functions like memory and learning before we have a high probability of succeeding in the development of a cure."

Tractor Beams Get Real

Once the stuff of science fiction, laboratory tractor beams are grabbing onto small objects in the lab.




WASHINGTON (ISNS) -- Tractor beams, energy rays that can move objects, are a science fiction mainstay. But now they are becoming a reality - at least for moving very tiny objects.

Researchers from the Australian National University have announced that they have built a device that can move small particles a meter and a half using only the power of light.



For years, physicists have been able to manipulate tiny particles over
minuscule distances using lasers. Optical tweezers that can move
particles a few millimeters are common.

Andrei Rode, a researcher involved with the project, said that existing
optical tweezers are able to move particles the size of a bacterium a
few millimeters in a liquid. Their new technique can move objects one
hundred times that size over a distance of a meter or more.


The device works by shining a hollow laser beam around tiny glass
particles. The air surrounding the particle heats up, while the dark
center of the beam stays cool. When the particle starts to drift out of
the middle and into the bright laser beam, the force of heated air
molecules bouncing around and hitting the particle's surface is enough
to nudge it back to the center.



A small amount of light also seeps into the darker middle part of the
beam, heating the air on one side of the particle and pushing it along
the length of the laser beam. If another such laser is lined up on the
opposite side of the beam, the speed and direction the particle moves
can be easily manipulated by changing the brightness of the beams.
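As a deliberately crude toy model of that dual-beam control, not the Australian group's actual analysis, one can treat the net photophoretic push along the beam axis as proportional to the intensity imbalance between the two opposing beams, balanced against air drag. The coupling and drag constants below are arbitrary illustrative values.

```python
def drift_velocity(i_forward, i_backward, k=1.0, drag=5.0):
    """Steady-state axial speed of a particle held between two opposing
    hollow beams: photophoretic force ~ intensity imbalance, opposed by
    a Stokes-like air drag (all quantities in arbitrary units)."""
    return k * (i_forward - i_backward) / drag

print(drift_velocity(2.0, 1.0))   # forward beam brighter: particle moves forward
print(drift_velocity(1.0, 1.0))   # balanced beams: particle is held in place
print(drift_velocity(0.5, 1.5))   # imbalance reversed: particle moves backward
```

The point of the sketch is simply that dimming one beam relative to the other sets both the speed and the direction of travel, which is the control knob the researchers describe.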

Rode said that their technique could likely work over even longer distances than they tested.



"With the particles and the laser we use, I would guess up to 10 meters
in air should not be a problem. The max distance we had was 1.5 meters,
which was limited by the size of the optical table in the lab," Rode
said.


Because this technique needs heated gas to push the particles around, it
can't work in the vacuum of outer space like the tractor beams in Star
Trek. But on Earth there are many possible applications for the
technology. The meter-long distances that the research team was able to
move the particles could open up new avenues for laser tweezers in the
transport of dangerous substances and microbes, and for sample taking
and biomedical research.

"There is the possibility that one could use the hollow spheres as a
means of chemical delivery agents, or microscopic containers of some
kind, but some more work would need to be done here just to check what
happens inside the spheres, in terms of sample heating," said David
McGloin, a physicist at the University of Dundee in the U.K., not
connected with the Australian team.

Scientists examine possibility of a phonon laser, or 'phaser'

While the optical laser celebrated its 50th anniversary earlier
this year, some scientists have been working on a new type of coherent
beam amplifier for sound rather than light. Scientists theorize that phonons, which are the smallest discrete unit of vibrational energy, can
be amplified by a phonon laser to generate a highly coherent beam of
sound (particularly, high-frequency ultrasound), similar to how an
optical laser generates a highly coherent beam of light. However, phonon
laser research is still a relatively new area. In a new study,
scientists have for the first time demonstrated the possibility that
phonons can be collectively excited in an ultra-cold atomic gas in a way
that is similar to how an optical laser excites photons, prompting the
scientists to call the proposed device a "phaser."

The first theoretical phonon laser was proposed one year ago, in 2009, by a team of scientists (Kerry Vahala, et al.) from the Max Planck Institute and Caltech. In that study, the scientists outlined for the first time how a single
magnesium ion can be cooled to a temperature of about 1 milli-Kelvin in
an electromagnetic trap, and be used to create a single-ion phonon
laser.

However, the single-ion phonon laser works somewhat differently
than an optical laser,
since the phonon frequency is determined by the single-atom oscillation
frequency rather than corresponding to a collective oscillation.
In the new study, scientists J. T. Mendonca from the Instituto
Superior Tecnico (IST) in Lisbon, Portugal, and colleagues from the IST
and Umea University in Umea, Sweden, have extended the concept of the
single-ion phonon laser to the case of a large ensemble of atoms.

In doing so, they have shown that an ultra-cold atomic gas can enable
collective phonon excitations. In contrast with the single-ion case,
here the phonon frequency is determined by the internal oscillations of
the atoms in the gas, similar to how the photon frequency in a laser is
determined by internal vibrations of the optical cavity.
“Neither coherent electromagnetic waves, nor coherent sound waves are necessarily difficult to generate,” Mendonca told PhysOrg.com.
“It depends on the system of consideration, the frequency range, etc.
The difficulty that has been addressed in our work is to copy the laser
mechanism, but now generating quanta of sound - phonons - rather than
quanta of light - photons.

We have shown that cold atom systems can be
controlled in great detail, enabling emissions of coherent phonons in a
manner copying the celebrated laser mechanism.”
In the new method, the gas is confined in a magneto-optical trap.
Three physical processes are used to create the phonon laser
instability. First, a red-detuned laser beam cools the atomic gas to
ultra-cold temperatures.

Then, a blue-detuned laser pumps the ultra-cold
atomic gas to create a population inversion, which is also a standard requirement for optical lasers. Finally, the atoms produce a coherent emission of
phonons and then decay into a lower kinetic energy state. The scientists
note that the resulting acoustic oscillations can then be coupled to
the outside world by mechanical or electromagnetic means, where they
could be used to provide a source of coherent acoustic radiation.
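The pump-inversion-emission sequence mirrors the standard two-level laser rate equations. The sketch below is a generic textbook model with arbitrary parameters, not the cold-atom calculation in the paper: above a threshold pump rate, a tiny seed of quanta (photons or phonons) grows into a macroscopic coherent output, while below threshold the seed dies away.

```python
def simulate(pump, steps=50000, dt=1e-2,
             gain=1.0, inv_decay=0.1, cav_decay=0.5):
    """Euler-integrate textbook laser rate equations (arbitrary units):
       dN/dt = pump - inv_decay*N - gain*N*n   (population inversion N)
       dn/dt = gain*N*n - cav_decay*n          (coherent quanta n)
    Returns the final number of coherent quanta."""
    N, n = 0.0, 1e-6  # start with a tiny spontaneous seed of quanta
    for _ in range(steps):
        dN = pump - inv_decay * N - gain * N * n
        dn = gain * N * n - cav_decay * n
        N += dN * dt
        n += dn * dt
    return n

# Below threshold the seed decays; above threshold coherent quanta
# build up to a steady-state output.
print(simulate(pump=0.01))  # weak pump: essentially no coherent output
print(simulate(pump=0.2))   # strong pump: macroscopic coherent output
```

With these numbers the threshold pump rate is inv_decay * cav_decay / gain = 0.05, so the second run sits well above threshold and settles to a steady output; the same threshold logic applies whether the quanta are photons or phonons.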

Regarding the name of a future phonon laser, scientists have
previously considered the term “saser” (Sound Amplification by
Stimulated Emission of Radiation). But Mendonca and coauthors suggest
that “phaser” (where “phonon” is used instead of “sound”) may be more
appropriate, since the proposed phonon laser would excite phonons in a
way that is very similar to the laser.
“Using the word 'phonon' emphasizes the quantum nature of the
process,” Mendonca said. “Saser is certainly an adequate name as well,
but we think phaser has a better ring to it.”

A phonon laser could have some useful applications for researchers,
although it's likely that more applications are yet to be found. One
possible use of a highly coherent beam of ultrasound is that it could
allow researchers to greatly improve the imaging resolution in
tomography and other imaging techniques.


“It took a while to discover the many applications of the laser - at
first it was considered as an invention without a problem to solve,”
Mendonca said. “In our work we have been concerned with the basic
science issues rather than applications, but hopefully the case of the
phaser could be similar to that of the laser.”

Computers that read minds are being developed by Intel

New technology could allow people to dictate letters and search the internet simply by thinking, according to researchers at Intel who are behind the project.
Unlike current brain-controlled computers, which require users to imagine making physical movements to control a cursor on a screen, the new technology will be capable of directly interpreting words as they are thought.

Intel's scientists are creating detailed maps of the activity in the brain for individual words which can then be matched against the brain activity of someone using the computer, allowing the machine to determine the word they are thinking. Preliminary tests of the system have shown that the computer can work out words by looking at similar brain patterns and looking for key differences that suggest what the word might be. Dean Pomerleau, a senior researcher at Intel Laboratories, said that currently, the devices required to get sufficient detail of brain activity were bulky, expensive magnetic resonance scanners, like those used in hospitals.

But he said work was under way to produce smaller pieces of equipment that can be worn as headsets and that can produce the same level of detail. He said: "The computer uses a form of 20 questions to narrow down what the word is. So a noun with a physical property such as spade, which you dig with, produces activity in the motor cortex of the brain, as this is the area that controls physical movements. A food-related word like apple, however, produces activity in those parts of the brain related to hunger.

So the computer can infer attributes of each word being thought about, and this lets the computer zero in on what the word is pretty quickly. "We are currently mapping out the activity that an average brain produces when thinking about different words. It means you'll be able to write letters, open emails or do Google searches just by thinking." Intel already has a working prototype that can detect words such as "screwdriver", "house" and "barn" by measuring around 20,000 points in the brain. But as brain scanning technology becomes more sophisticated, the computer's ability to distinguish thoughts will improve.
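Pomerleau's "20 questions" description amounts to filtering a candidate vocabulary by inferred attributes. The tiny sketch below is a made-up illustration: the word list, the attribute sets, and the idea that each attribute maps to activity in a particular brain region are all invented for the example, not Intel's algorithm or data.

```python
# Hypothetical vocabulary: each candidate word carries attributes that,
# in the real system, would correspond to activity in different brain areas.
WORDS = {
    "spade":       {"tool", "graspable"},
    "screwdriver": {"tool", "graspable"},
    "apple":       {"food", "graspable"},
    "house":       {"shelter", "large"},
    "barn":        {"shelter", "large"},
}

def narrow(observed_attributes):
    """Keep only the words consistent with every observed attribute."""
    return sorted(w for w, attrs in WORDS.items()
                  if observed_attributes <= attrs)

# Motor-cortex activity suggests something you manipulate: three candidates remain.
print(narrow({"graspable"}))
# Hunger-related activity then singles out the food word.
print(narrow({"graspable", "food"}))
```

Each observed attribute halves or better the candidate set, which is why a handful of "questions" suffices to pin down the word.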

Justin Rattner, director of Intel Laboratories and the company's chief technology officer, said: "Mind reading is the ultimate user interface. There will be concerns about privacy with this sort of thing and we will have to overcome them. What is clear though is that humans are not restricted any more to just using keyboards and mice."

http://www.telegraph.co.uk/technology/news/7957664/Computers-that-read-minds-are-being-developed-by-Intel.html 

Engineers achieve world record with high-speed graphene transistors

Graphene, a one-atom-thick layer of graphitic carbon, has great potential to make electronic devices such as radios, computers and phones faster and smaller. But its unique properties have also led to difficulties in integrating the material into such devices.

In a paper published Sept. 1 in the journal Nature, a group of UCLA researchers demonstrate how they have overcome some of these difficulties to fabricate the fastest graphene transistor to date. With the highest known carrier mobility — the speed at which electronic information is transmitted by a material — graphene is a good candidate for high-speed radio-frequency electronics.

But traditional techniques for fabricating the material often lead to deteriorations in device quality. The UCLA team, led by professor of chemistry and biochemistry Xiangfeng Duan, has developed a new fabrication process for graphene transistors using a nanowire as the self-aligned gate. Self-aligned gates are a key element in modern transistors, which are semiconductor devices used to amplify and switch electronic signals.

Gates are used to switch the transistor between various states, and self-aligned gates were developed to deal with problems of misalignment encountered because of the shrinking scale of electronics. To develop the new fabrication technique, Duan teamed with two other researchers from the California NanoSystems Institute at UCLA: Yu Huang, an assistant professor of materials science and engineering at the Henry Samueli School of Engineering and Applied Sciences, and Kang Wang, a professor of electrical engineering at the Samueli School. "This new strategy overcomes two limitations previously encountered in graphene transistors," Duan said.

"First, it doesn't produce any appreciable defects in the graphene during fabrication, so the high carrier mobility is retained. Second, by using a self-aligned approach with a nanowire as the gate, the group was able to overcome alignment difficulties previously encountered and fabricate very short-channel devices with unprecedented performance."

These advances allowed the team to demonstrate the highest-speed graphene transistors to date, with a cutoff frequency up to 300 GHz — comparable to the very best transistors made from high-electron-mobility materials such as gallium arsenide or indium phosphide. "We are very excited about our approach and the results, and we are currently taking additional efforts to scale up the approach and further boost the speed," said Lei Liao, a postdoctoral fellow at UCLA.

High-speed radio-frequency electronics may also find wide applications in microwave communication, imaging and radar technologies.

http://newsroom.ucla.edu/portal/ucla/ucla-chemists-engineers-achieve-169811.aspx

Silicon oxide circuits break barrier: Nanocrystal conductors could lead to massive, robust 3-D storage

Rice University scientists have created the first two-terminal memory chips that use only silicon, one of the most common substances on the planet, in a way that should be easily adaptable to nanoelectronic manufacturing techniques and promises to extend the limits of miniaturization subject to Moore's Law.

Last year, researchers in the lab of Rice Professor James Tour showed how electrical current could repeatedly break and reconnect 10-nanometer strips of graphite, a form of carbon, to create a robust, reliable memory "bit." At the time, they didn't fully understand why it worked so well. Now, they do. A new collaboration by the Rice labs of professors Tour, Douglas Natelson and Lin Zhong proved the circuit doesn't need the carbon at all.

Jun Yao, a graduate student in Tour's lab and primary author of the paper
to appear in the online edition of Nano Letters, confirmed his
breakthrough idea when he sandwiched a layer of silicon oxide, an
insulator, between semiconducting sheets of polycrystalline silicon that
served as the top and bottom electrodes.

 

Applying a charge to the electrodes created a conductive pathway by stripping oxygen atoms from the silicon oxide and forming a chain of nano-sized silicon crystals. Once formed, the chain can be repeatedly broken and reconnected by applying a pulse of varying voltage.
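The form/break/read cycle described above can be caricatured as a two-state machine. The voltage thresholds below, and the choice to distinguish set from reset by pulse polarity, are arbitrary modeling choices for illustration, not the Rice device's actual switching characteristics.

```python
class TwoTerminalCell:
    """Toy model of a two-terminal resistive memory bit: one pulse forms
    the conductive silicon-nanocrystal chain (SET), another breaks it
    (RESET), and a small read voltage senses the state without
    disturbing it. All thresholds are invented for illustration."""
    SET_V, RESET_V, READ_V = 3.0, -3.0, 0.5

    def __init__(self):
        self.conductive = False  # pristine cell: chain not yet formed

    def pulse(self, volts):
        if volts >= self.SET_V:
            self.conductive = True      # form/reconnect the chain
        elif volts <= self.RESET_V:
            self.conductive = False     # break the chain
        # small voltages (e.g. READ_V) leave the state untouched

    def read(self):
        return 1 if self.conductive else 0

cell = TwoTerminalCell()
cell.pulse(3.5)     # SET pulse forms the nanocrystal chain
print(cell.read())  # bit reads 1
cell.pulse(0.5)     # read-level voltage does not disturb the state
print(cell.read())  # still 1
cell.pulse(-3.5)    # RESET pulse breaks the chain
print(cell.read())  # bit reads 0
```

Because the state lives in the physical continuity of the chain rather than in stored charge, only the two electrodes are needed, which is the property the next paragraphs build on.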

The nanocrystal wires are as small as 5 nanometers wide, far smaller than circuitry in even the most advanced computers and electronic devices.

"The beauty of it is its simplicity," said Tour, Rice's T.T. and W.F. Chao Chair in Chemistry as well as a professor of mechanical engineering and materials science and of computer science. That, he said, will be key to the technology's scalability. Silicon oxide switches or memory locations require only two terminals, not three (as in flash memory), because the physical process doesn't require the device to hold a charge.

 

It also means layers of silicon-oxide memory can be stacked in tiny but capacious three-dimensional arrays. "I've been told by industry that if you're not in the 3-D memory business in four years, you're not going to be in the memory business. This is perfectly suited for that," Tour said.

Silicon-oxide memories are compatible with conventional transistor manufacturing technology, said Tour, who recently attended a workshop by the National Science Foundation and IBM on breaking the barriers to Moore's Law, which states the number of devices on a circuit doubles every 18 to 24 months.
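Moore's Law as paraphrased above is simple exponential growth, which a quick calculation makes concrete (the starting count and cadence below are illustrative, not figures from the article):

```python
# Device count doubling every `months_per_doubling` months, per Moore's Law.
def devices_after(years, start=1_000_000, months_per_doubling=24):
    doublings = years * 12 / months_per_doubling
    return start * 2 ** doublings

# Ten years at a 24-month cadence is five doublings: 1 million -> 32 million.
print(f"{devices_after(10):,.0f}")  # prints 32,000,000
```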

"Manufacturers feel they can get pathways down to 10 nanometers. Flash memory is going to hit a brick wall at about 20 nanometers. But how do we get beyond that? Well, our technique is perfectly suited for sub-10-nanometer circuits," he said.Austin tech design company PrivaTran is already bench testing a silicon-oxide chip with 1,000 memory elements built in collaboration with the Tour lab. "We're real excited about where the data is going here," said PrivaTran CEO Glenn Mortland, who is using the technology in several projects supported by the Army Research Office, National Science Foundation, Air Force Office of Scientific Research, and the Navy Space and Naval Warfare Systems Command Small Business Innovation Research (SBIR) and Small Business Technology Transfer programs.

"Our original customer funding was geared toward more high-density memories," Mortland said. "That's where most of the paying customers see this going. I think, along the way, there will be side applications in various nonvolatile configurations."Yao had a hard time convincing his colleagues that silicon oxide alone could make a circuit. "Other group members didn't believe him," said Tour, who added that nobody recognized silicon oxide's potential, even though it's "the most-studied material in human history."

"Most people, when they saw this effect, would say, 'Oh, we had silicon-oxide breakdown,' and they throw it out," he said. "It was just sitting there waiting to be exploited."

In other words, what used to be a bug turned out to be a feature.

Yao went to the mat for his idea. He first substituted a variety of materials for graphite and found none of them changed the circuit's performance. Then he dropped the carbon and metal entirely and sandwiched silicon oxide between silicon terminals. It worked.

"It was a really difficult time for me, because people didn't believe it," Yao said. Finally, as a proof of concept, he cut a carbon nanotube to localize the switching site, sliced out a very thin piece of silicon oxide by focused ion beam and identified a nanoscale silicon pathway under a transmission electron microscope.

"This is research," Yao said. "If you do something and everyone nods their heads, then it's probably not that big. But if you do something and everyone shakes their heads, then you prove it, it could be big."It doesn't matter how many people don't believe it. What matters is whether it's true or not."

Silicon-oxide circuits carry all the benefits of the previously reported graphite device. They feature high on-off ratios, excellent endurance and fast switching (below 100 nanoseconds). They will also be resistant to radiation, which should make them suitable for military and NASA applications. "It's clear there are lots of radiation-hardened uses for this technology," Mortland said.

Silicon oxide also works in reprogrammable gate arrays being built by NuPGA, a company formed last year through collaborative patents with Rice University. NuPGA's devices will assist in the design of computer circuitry based on vertical arrays of silicon oxide embedded in "vias," the
holes in integrated circuits that connect layers of circuitry. Such
rewritable gate arrays could drastically cut the cost of designing
complex electronic devices.

 

 

God did not create Universe: Hawking

God no longer has any place in theories on the creation of the Universe due to a series of developments in physics, British scientist Stephen Hawking said in extracts published Thursday from a new book.

In a hardening of the more accommodating position on religion that he took in his 1988 international best-seller "A Brief History of Time", Hawking said the Big Bang was merely the consequence of the law of gravity.

"Because there is a law such as gravity, the Universe can and will create itself from nothing. Spontaneous creation is the reason there is something rather than nothing, why the Universe exists, why we exist," he writes in "The Grand Design", which is being serialised by The Times newspaper.

"It is not necessary to invoke God to light the blue touch paper and set the Universe going," added the wheelchair-bound expert.

Hawking has achieved worldwide fame for his research, writing and television documentaries despite suffering since the age of 21 from motor neurone disease that has left him disabled and dependent on a voice synthesiser.

In "A Brief History of Time", Hawking had suggested that the idea of God or a divine being was not necessarily incompatible with a scientific understanding of the Universe.


But in his latest work, Hawking cites the 1992 discovery of a planet
orbiting a star outside our own Solar System as a turning point against
Isaac Newton's belief that the Universe could not have arisen out of
chaos.

"That makes the coincidences of our planetary conditions -- the single Sun, the lucky combination of Earth-Sun distance and solar mass -- far less remarkable, and far less compelling as evidence that the Earth was carefully designed just to please us human beings," he wrote.

Hawking argued earlier this year that mankind's only chance of long-term survival lies in colonising space, as humans drain Earth of resources and face a terrifying array of new threats. He also warned in a recent television series that mankind should avoid contact with aliens at all costs, as the consequences could be devastating.

X-Ray Jets

The supermassive black holes that lie at the centers of galaxies can spawn tremendous bipolar jets of atomic particles.

These outflows, discovered at radio wavelengths, are thought to be produced
by matter accreting onto a hot disk around the black hole. There are many types of galaxies and radio jets, but in the most dramatic cases the hot particles extend across hundreds of thousands of light-years, well beyond the visible boundaries of the galaxy, and move at speeds close to the speed of light.


The physical processes that drive these jets and cause them to radiate are among the important outstanding problems of modern astrophysics. One of the problems in deciphering how the jet mechanisms operate, and how they interact with the environment near the black hole nuclei, is the comparatively gradual way that radio emission responds to changes. As a result, differences that may occur along the jet (or at different times in the same location) appear blurred together when viewed at radio wavelengths.

X-rays are also emitted from these jets, but unlike radio emission they respond very quickly to changes. SAO astronomers and associates Diana Worrall, Mark Birkinshaw, Andreas Zezas and Pepi Fabbiano, together with three colleagues, used the Chandra X-Ray Observatory to measure the X-ray emission from jets in the galaxy 3C270. By comparing the emission from both sides of the jets (and their environments) with the corresponding radio emission, they were able to reach several important conclusions.

 

The first is that bright knots along the jet are driven by the effects of local magnetic fields. Another is that changes in the jets seem to have taken place during comparatively short timescales -- only tens or hundreds of thousands of years. The exact causes, however, are not yet clear. The results suggest that X-ray observations might be the most powerful way to probe episodic activity around giant black holes.
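The timescales quoted above line up with back-of-the-envelope light-travel arithmetic: a disturbance moving near light speed needs roughly as many years as the jet has light-years of length to propagate end to end. This is an illustrative sketch, not the paper's analysis.

```python
# Time for a disturbance to cross a jet of the given length at a given
# fraction of the speed of light (length in light-years gives time in years).
def crossing_time_years(length_ly, speed_fraction_of_c):
    return length_ly / speed_fraction_of_c

# A 100,000-light-year jet at 0.99c takes on the order of 100,000 years
# to change end to end.
print(f"{crossing_time_years(100_000, 0.99):,.0f} years")
```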

http://www.cfa.harvard.edu/

 

Variations in fine-structure constant suggest laws of physics not the same everywhere

One of the most controversial questions in
cosmology is why the fundamental constants of nature seem fine-tuned
for life. One of these fundamental constants is the fine-structure
constant, or alpha, which is the coupling constant for the
electromagnetic force and equal to about 1/137.0359. If alpha were just
4% bigger or smaller than it is, stars wouldn't be able to make carbon
and oxygen, which would have made it impossible for life as we know it
to exist. Now, results from a new study show that alpha seems to have
varied a tiny bit in different directions of the universe billions of
years ago, being slightly smaller in the northern hemisphere and
slightly larger in the southern hemisphere. One intriguing possible
implication is that the fine-structure constant is continuously varying
in space, and seems fine-tuned for life in our neighborhood of the
universe.
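The value 1/137.0359 quoted above can be checked directly, since alpha is built from better-known constants: alpha = e² / (4π·ε₀·ħ·c). A quick computation with CODATA values:

```python
import math

# CODATA / SI-exact values of the constants entering alpha.
e    = 1.602176634e-19   # elementary charge, C (exact in SI)
eps0 = 8.8541878128e-12  # vacuum permittivity, F/m
hbar = 1.054571817e-34   # reduced Planck constant, J*s
c    = 299792458         # speed of light, m/s (exact)

alpha = e**2 / (4 * math.pi * eps0 * hbar * c)
print(f"alpha = {alpha:.9f} = 1/{1 / alpha:.4f}")  # 1/alpha is about 137.0360
```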

The physicists, John Webb from the University of New South
Wales and his coauthors, used data from two telescopes to uncover the
spatial dependence of the fine-structure constant. Using the
north-facing Keck telescope in Mauna Kea, Hawaii, and the south-facing
Very Large Telescope (VLT) in Paranal, Chile, the researchers observed
more than 100 quasars, which are extremely luminous and distant galaxies
that are powered by massive black holes at their centers.
By measuring the quasar spectra, the researchers could gather data on the frequency of the electromagnetic radiation
emitted by quasars at high redshifts, corresponding to a time about 10
billion years ago. During the journey through space to the telescopes, some of the light was absorbed at specific wavelengths by very old gas clouds; those absorption lines reveal the chemical composition of the clouds.
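The redshift relation behind these spectra is simple: a line at rest wavelength λ appears at λ(1 + z). Assuming z of roughly 2 for light emitted about 10 billion years ago (an approximation, not a figure from the study), the 121.6 nm Lyman-alpha line lands in the visible:

```python
# Observed wavelength of a spectral line emitted at redshift z.
def observed_wavelength(rest_nm, z):
    return rest_nm * (1 + z)

# Lyman-alpha (121.6 nm rest wavelength) seen through a z = 2 sight line.
print(f"{observed_wavelength(121.6, 2.0):.1f} nm")  # prints 364.8 nm
```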

The cloud compositions could help the scientists determine the
fine-structure constant in those areas of the universe at that time,
since alpha
is a measure of the strength of the electromagnetic force between
electrically charged particles. As the coupling constant for the
electromagnetic force, it is similar to the constants for the other
three known fundamental forces of nature: the strong nuclear force, the
weak nuclear force, and gravitational force. Among its important
implications, alpha determines how strongly atoms hold on to their
electrons.
By combining the data from the two telescopes that look in opposite
directions, the researchers found that, 10 billion years ago, alpha
seems to have been larger by about one part in 100,000 in the southern
direction and smaller by one part in 100,000 in the northern direction.
The data for this “dipole” model of alpha has a statistical significance
of about 4.1 sigma, meaning there is only about a one in 15,000
chance that it is a random event.

 

At first, the data surprised Webb and his colleagues, since it seemed
to contradict previous results that the scientists had published in
1999. At that time, the scientists had used the north-facing Keck
telescope to find that alpha became slightly smaller the further away
(and older) the quasars were. So when the scientists first looked at
equally distant quasars from the southern hemisphere
using the VLT, they were surprised to find the slight increase in
alpha. After eliminating any possible bias, though, they realized that
they were looking at hemispherical differences of alpha.
While the data from just one telescope seemed to suggest that alpha
varies in time, data from the two telescopes show that alpha also seems
to vary in space. Such a discovery could have major implications,
starting with shattering the basic assumption that physical laws are the
same everywhere in the universe. The results also violate the Einstein
Equivalence Principle, and suggest that the universe may be much larger
than currently thought - or even infinite in size. Right now, the
scientists want to confirm the results with other experimental methods,
and see if the fine-structure constant could truly lead scientists to a
very different understanding of our universe.