Science & Technology Today
TStzmmalaysia
post Mar 26 2011, 10:39 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH


Nanotechnology points the way to greener pastures

Nourishing crops with synthetic ammonia (NH3) fertilizers has pushed agricultural yields ever higher, but such productivity comes at a price. Over-application of this chemical can build up nitrate ion (NO3–) concentrations in the soil -- a potential groundwater poison and a food source for harmful algal blooms. Furthermore, the industrial manufacture of ammonia is an energy-intensive process that contributes significantly to atmospheric greenhouse gases.

A research team led by Miho Yamauchi and Masaki Takata from the RIKEN SPring-8 Center in Harima has now discovered an almost ideal way to detoxify the effects of ammonia fertilizers. By synthesizing photoactive bimetallic nanocatalysts that generate hydrogen gas from water using solar energy, the team can catalytically convert NO3– back into NH3 through an efficient route free from carbon dioxide emissions.

Replacing the oxygen atoms of NO3– with hydrogen is a difficult chemical trick, but chemists can achieve this feat by using nanoparticles of copper–palladium (CuPd) alloys to immobilize nitrates at their surfaces and catalyze a reduction reaction with dissolved hydrogen atoms. However, the atomic distribution at the ‘nanoalloy’ surface affects the outcome of this procedure: regions with large domains of Pd atoms tend to create nitrogen gas, while well-mixed alloys preferentially produce ammonia.
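The overall chemistry amounts to an eight-electron reduction, NO3– + 9H+ + 8e– → NH3 + 3H2O (my own bookkeeping of the reaction described above, not an equation quoted from the paper). A quick Python sketch confirms the atom and charge balance:

```python
# Sketch: verify atom and charge balance for the overall nitrate-to-ammonia
# half-reaction:  NO3- + 9 H+ + 8 e-  ->  NH3 + 3 H2O
reactants = {"N": 1, "O": 3, "H": 9}            # NO3- plus 9 protons
products  = {"N": 1, "O": 3, "H": 3 + 3 * 2}    # NH3 plus 3 H2O

charge_reactants = -1 + 9 - 8   # NO3- charge, protons, electrons
charge_products  = 0            # NH3 and H2O are neutral

assert reactants == products
assert charge_reactants == charge_products
print("Balanced: 8 electrons consumed per nitrate ion reduced to ammonia")
```

Eight electrons per nitrate is why a cheap, carbon-free source of reactive hydrogen matters so much to the process.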

According to Yamauchi, the challenge in synthesizing homogenously mixed CuPd alloys is getting the timing right—the two metal ions transform into atomic states at different rates, causing phase separation. Yamauchi and her team used the powerful x-rays of the SPring-8 Center’s synchrotron to characterize the atomic structure of CuPd synthesized with harsh or mild reagents. Their experiments revealed that a relatively strong reducing reagent called sodium borohydride gave alloys with near-perfect mixing down to nanoscale dimensions.

Most ammonia syntheses use hydrogen gas produced from fossil fuels, but the use of solar energy by the researchers avoids this. They found that depositing the nanoalloy onto photosensitive titanium dioxide (TiO2) yielded a material able to convert ultraviolet radiation into energetic electrons; in turn, these electrons stimulated hydrogen gas generation from a simple water/methanol solution. When they added nitrate ions to this mixture, the CuPd/TiO2 catalyst converted nearly 80% into ammonia—a remarkable chemical selectivity that the researchers attribute to high concentrations of reactive hydrogen photocatalytically produced near the CuPd surface.

Yamauchi is confident that this approach can help reduce the ecological impact of many classical chemical hydrogenation reactions. “Considering the environmental problems we face, we have to switch from chemical synthesis using fossil-based hydrogen to other clean processes,” she says.

PhysOrg

TStzmmalaysia
post Mar 26 2011, 10:43 PM



BIOTECHNOLOGY


Where robots labour to overcome genetic disease

I'm peering through an internal window into an eerie blue-lit room packed with high-tech machinery. The temperature inside is held at 28 °C and the humidity is high. Technicians scan computer screens to ensure the robots are happy, but seem oblivious to my presence.

Replace the blue light with green and it would resemble the interior of one of Star Trek's Borg cubes. Thankfully, the technicians look nothing like cyborgs, and the goal is not the assimilation of humanity. Rather, this is the production line that Complete Genomics, a start-up in Mountain View, California, bets will revolutionise the discovery of disease genes.

It is already the world's largest human genome sequencing factory. In a room half the size of a tennis court sit 16 robots that sequenced the genomes of 800 people last year. Going full tilt, they can now churn out 400 genomes a month.

The current price, offered to researchers and pharma companies but not yet to private individuals, is $9500 per genome; place an order for 1000 or more, and it drops to $5500. When you consider that the first human genome was completed a decade ago for billions, DNA sequencing has come a long way, fast.

As I survey the scene, Jennifer Turcotte, the company's marketing chief, explains why it looks different from other sequencing labs. DNA is usually read inside hermetically sealed machines, but here the robots work with their guts exposed, for ease of maintenance. The heat and humidity suit the biochemistry of the sequencing reactions, and the dim blue light avoids frequencies that would bleach the fluorescent probes used to detect each letter of the genetic code.

The technicians wear clean-room gear, as dust would interfere with reading the sequences. Unless something goes awry, there is no need for them to intervene. The robots add the required reagents, and manoeuvre the samples so that a camera can record the light signals that reveal the DNA sequence of 70 bases at a time.

The formidable computation needed to assemble these snippets into 3-billion base-pair human genomes is done at a fully automated data centre about 20 minutes' drive away in Santa Clara - electricity is cheaper there, and data storage is charged by the kilowatt-hour, explains Clifford Reid, the company's CEO.
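As a rough illustration of what that assembly step involves (a toy greedy overlap merge, nothing like Complete Genomics' actual pipeline), short reads can be stitched together wherever the end of one read matches the start of the next:

```python
# Toy read assembly: merge overlapping snippets into one longer sequence,
# as the data centre must do with 70-base reads at genome scale.
def merge(a, b, min_overlap=3):
    """Append b to a if a suffix of a matches a prefix of b."""
    for k in range(min(len(a), len(b)), min_overlap - 1, -1):
        if a.endswith(b[:k]):
            return a + b[k:]
    return None

reads = ["GATTACAGGA", "CAGGATTCCA", "TTCCAGGCAT"]
genome = reads[0]
for read in reads[1:]:
    genome = merge(genome, read)

print(genome)  # GATTACAGGATTCCAGGCAT
```

At real scale the same idea must contend with billions of reads, sequencing errors, and repeated sequences, which is what makes the dedicated data centre necessary.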

With the production line essentially running itself, most of the 185 staff are busy improving the company's sequencing technology, or liaising with customers. "We need people to interact with people, but not to interact with the DNA," says Reid.

In a cute twist, the company has even automated its reception area. When I arrived, I was greeted by a computer terminal, which asked for my name and who I had come to see. A label printer spat out a visitor badge while an email summoned Turcotte to lead me into the inner sanctum.

The culture of automation has a serious scientific goal. Geneticists had hoped that mutations determining our susceptibility to disease would emerge from limited scans, which record common variants at some 1 million positions across the genome. But the discoveries so far explain a small part of the heritability of many conditions. Gene hunters are starting to hit a wall.

Looking for the missing mutations means sequencing entire genomes and pinpointing rare anomalies that are inherited with the disease in question. The principle was demonstrated last year by a team led by Leroy Hood of the Institute for Systems Biology in Seattle, which narrowed to a list of four the mutations responsible for the craniofacial condition Miller syndrome in one affected family (Science, DOI: 10.1126/science.1186802). Complete Genomics did the sequencing.

Several companies are pushing the envelope of cost and speed in DNA sequencing. But Complete Genomics is unusual in tailoring its technology to the task of churning out whole human genomes, and deciding not to sell machines but to offer a contract sequencing service. Hood sees little point in scientists doing the work: "We want to put our efforts into developing the tools to interpret the information."

New Scientist


TStzmmalaysia
post Mar 26 2011, 10:54 PM



RESEARCH


The First Plastic Computer Processor

Two recent developments—a plastic processor and printed memory—show that computing doesn't have to rely on inflexible silicon.

Silicon may underpin the computers that surround us, but the rigid inflexibility of the semiconductor means it cannot reach everywhere. The first computer processor and memory chips made out of plastic semiconductors suggest that, someday, nowhere will be out of bounds for computer power.

Researchers in Europe used 4,000 plastic, or organic, transistors to create the plastic microprocessor, which measures roughly two centimeters square and is built on top of flexible plastic foil. "Compared to using silicon, this has the advantage of lower price and that it can be flexible," says Jan Genoe at the IMEC nanotechnology center in Leuven, Belgium. Genoe and IMEC colleagues worked with researchers at the TNO research organization and display company Polymer Vision, both in the Netherlands.

The processor can so far run only one simple program of 16 instructions. The commands are hardcoded into a second foil etched with plastic circuits that can be connected to the processor to "load" the program. This allows the processor to calculate a running average of an incoming signal, something that a chip processing the signal from a sensor might do, says Genoe. The chip runs at a speed of six hertz, on the order of a million times slower than a modern desktop machine, and can only process information in eight-bit chunks at most, compared to 128 bits for modern computer processors.
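The kind of job the plastic chip does is simple to state in code. This sketch is mine, not IMEC's firmware; it keeps every result within the eight-bit range the article mentions:

```python
# Running average of an incoming sensor signal, with results confined to
# 8 bits (0-255), as on the plastic processor described above.
def running_average(samples, window=4):
    out = []
    for i in range(len(samples)):
        chunk = samples[max(0, i - window + 1): i + 1]
        out.append(sum(chunk) // len(chunk) & 0xFF)  # integer mean, 8-bit
    return out

signal = [10, 200, 30, 220, 50, 240]   # noisy sensor readings
print(running_average(signal))         # [10, 105, 80, 115, 125, 135]
```

Averaging smooths the spikes in the input, which is exactly the "clean up a noisy signal" role Genoe imagines for the chip.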

Organic transistors have already been used in certain LED displays and RFID tags, but have not been combined in such numbers, or used to make a processor of any kind. The microprocessor was presented at the ISSCC conference in San Jose, California, last month.



Making the processor begins with a 25-micrometer-thick sheet of flexible plastic, "like what you might wrap your lunch with," says Genoe. A layer of gold electrodes is deposited on top, followed by an insulating layer of plastic, and then the plastic semiconductors that make up the processor's 4,000 transistors. Those transistors were made by spinning the plastic foil to spread a drop of organic liquid into a thin, even layer. When the foil is heated gently, the liquid converts into solid pentacene, a commonly used organic semiconductor. The pentacene layer was then etched using photolithography to make the final pattern of transistors.

In the future, such processors could be made more cheaply by printing the organic components like ink, says Genoe. "There are research groups working on roll-to-roll or sheet-to-sheet printing," he says, "but there is still some progress needed to make organic transistors at small sizes that aren't wobbly," meaning physically irregular. The best lab-scale printing methods so far can only deliver reliable transistors in the tens of micrometers, he says.

Creating a processor made from plastic transistors was a challenge because, unlike transistors made from ordered silicon crystals, no two can be trusted to behave identically. Plastic transistors each behave slightly differently because they are made up of amorphous collections of pentacene molecules. "You won't have two that are equal," says Genoe. "We had to study and simulate that variability to work out a design with the highest chance of behaving correctly."
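That kind of study is typically a Monte Carlo exercise. The sketch below is purely illustrative (the failure model and all numbers are my assumptions, not IMEC's): it estimates how often a small logic gate still works when each transistor's drive strength is drawn at random:

```python
# Hypothetical variability study: a gate "fails" if any of its transistors
# drifts too far from nominal strength. Estimate yield by simulation.
import random

def gate_works(n_transistors=4, sigma=0.15, margin=0.5):
    # Each transistor's drive is Gaussian around a nominal 1.0
    return all(abs(random.gauss(1.0, sigma) - 1.0) < margin
               for _ in range(n_transistors))

random.seed(1)
trials = 10_000
yield_rate = sum(gate_works() for _ in range(trials)) / trials
print(f"Estimated gate yield: {yield_rate:.1%}")
```

Widening the design margin raises yield at the cost of speed and area, which is the trade-off the IMEC team had to navigate.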

The team succeeded, but that doesn't mean the stage is set for plastic processors to displace silicon ones in consumer computers. "Organic materials fundamentally limit the speed of operation," Genoe explains. He expects plastic processors to appear in places where silicon is barred by its cost or physical inflexibility. The lower cost of the organic materials used compared to conventional silicon should make the plastic approach around 10 times cheaper.

"You can imagine an organic gas sensor wrapped around a gas pipe to report on any leaks with a flexible microprocessor to clean up the noisy signal," he says. Plastic electronics could also allow disposable interactive displays to be built into packaging, for example for food, says Genoe. "You might press a button to have it add up the calories in the cookies you ate," he says.

But such applications will require more than just plastic processors, says Wei Zhang, who works on organic electronics at the University of Minnesota. At the same conference where the organic processor was unveiled, Zhang and colleagues presented the first printed organic memory of a type known as DRAM, which works alongside the processor in most computers for short-term data storage. The 24-millimeter-square memory array was made by building up several layers of organic "ink" squirted from a nozzle like an aerosol. It can store 64 bits of information.

Previous printed memory has been nonvolatile, meaning it holds data even when the power is off and isn't suitable for short-term storage involving frequent writing, reading, and rewriting, says Zhang. The Minnesota group was able to print DRAM because it devised a form of printed, organic transistor that uses an ion-rich gel for the insulating material that separates its electrodes.

The ions inside enable the gel layer to store more charge than a conventional, ion-free insulator. That addresses two problems that have limited organic memory development. The gel's charge-storing ability reduces the power needed to operate the transistor and memory built from it; it also enables the levels of charge used to represent 1 and 0 in the memory to be very distinct and to persist for as long as a minute without the need for the memory to be refreshed.

Organic, printed DRAM could be used for short-term storage of image frames in displays that are today made with printed organic LEDs, says Zhang. That would enable more devices to be made using printing methods and eliminate some silicon components, reducing costs.

Finding a way to combine organic microprocessors and memory could cut prices further, although Zhang says the two are not yet ready to connect. "These efforts are new techniques, so we cannot guarantee that they will be built and work together," says Zhang. "But in the future, it would make sense."

Technology Review

TStzmmalaysia
post Mar 26 2011, 10:59 PM



RESEARCH


Video: Magnetic Gels That Swim, Shimmy, and 'Walk'

Miklós Zrínyi of Semmelweis University in Budapest, Hungary, has created some gels that are anything but gellin’. In fact, these gels are moving, shaking, and otherwise getting around with a little help from magnetism. The gel “snakes”--made from a mix of polymer and metal particles--bend to match the shape of any magnetic field exerted upon them.

That means with a little ingenuity, these gels can be manipulated in a variety of ways using either permanent magnets or electromagnets, depending on the shape and strength of the fields. As you can see in the video below, that means you can make them do all kinds of quirky things. But as New Scientist notes, a magnetic material that’s also soft and flexible could find an array of applications, like in artificial robot muscles or to replace machine parts that are usually rigid with softer alternatives.


PopSci

VIDEO HERE http://bcove.me/mf6sym79

New Scientist


TStzmmalaysia
post Mar 26 2011, 11:01 PM



RESEARCH


A Quantum Communications Switch

The Internet is made of photons that zip through fiber-optic cables and flow through devices like switches, modulators, and amplifiers. But those standard devices would be inadequate for superfast quantum computing or communications—experimental approaches that exploit the peculiar properties of particles at the quantum scale to carry out complex calculations incredibly quickly or to prevent anyone from eavesdropping on messages.

Commercial switches have various problems that make them unsuitable for rerouting entangled photons. Those made of micro-electromechanical components keep entangled states intact, but operate too slowly. Other opto-electronic switches either add so much noise that single photons become difficult to detect, or destroy the quantum information entirely.

Prem Kumar, professor of electrical engineering and computer science at Northwestern University, has developed a quantum routing switch that can shuttle entangled photons along various paths while keeping the quantum information intact.

The device could be particularly useful for quantum computing, says James Franson, professor of physics at the University of Maryland, Baltimore County. "To build a quantum computer using photons, we need the ability to switch [entangled] photons," says Franson. A quantum switch could also someday allow entangled photons from different quantum computers to be shared over long distances—like cloud computing, but with quantum information.

Kumar says the switch will also make ultra-secure quantum networks a reality. Today's information is typically secured using what's called public key encryption, which relies on the practical impossibility of performing certain mathematical tasks, like factoring extremely large numbers. Quantum networks would offer an even more secure alternative to public key encryption. Using entangled photons to communicate ensures security because any attempt to intercept a message would disturb the particles' quantum state.

To build the new quantum switch, the researchers used commercial fiber-optic cable and other standard optical components, says Kumar. "My goal is to do things in the quantum information space that are very compatible with existing fiber infrastructures," he says.

The first step is to prepare the photons. Entangled photons have properties, such as polarization, that are fundamentally linked. If two photons are entangled, then the measured polarization of one reveals the corresponding state of the other. The researchers used a technique in which they mixed together multiple wavelengths of light within a standard fiber to create entangled photon pairs.
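The correlation that the switch must preserve can be pictured with a toy model. Note the hedge: this sketch captures only the perfect same-basis correlation, not the full quantum statistics that distinguish true entanglement from a shared hidden value:

```python
# Toy model of polarization correlation: measuring one photon of a pair in
# the shared basis always reveals its partner's state.
import random

def entangled_pair():
    # Both photons yield the same outcome when measured in a common basis
    outcome = random.choice(["H", "V"])   # horizontal or vertical
    return outcome, outcome

random.seed(0)
pairs = [entangled_pair() for _ in range(1000)]
matches = sum(a == b for a, b in pairs)
print(f"Correlated outcomes: {matches}/1000")  # 1000/1000
```

A switch that disturbs either photon would show up in such statistics as a drop below perfect correlation, which is what the single-photon detectors at the fiber ends check for.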

The next step is to send one photon down the optical fiber to the switch, which changes the photon's course. The researchers' switch is made only of optical components, including a spool of 100 meters of optical fiber arranged in a loop. One photon of an entangled pair is sent through one end of the loop, and through a multiplexer, while a powerful laser sends pulses of light into the spool. The photon is shifted in such a way that it exits the other end of the loop along a different path, while remaining entangled with its partner.

The end result is a switch that's very fast, has low background noise, and most importantly, preserves the quantum information. Single photon detectors at the end of the fibers confirm that both photons maintained their entangled state, showing that the quantum information was preserved. The work is described in a recent issue of the journal Physical Review Letters.

"It's an important development, because switching photons is really the main difference in going ahead in further progress in quantum computing using photons," says Franson.

Technology Review

TStzmmalaysia
post Mar 26 2011, 11:05 PM



RESEARCH


3-D Models Created by a Cell Phone

Capturing an object in three dimensions needn't require the budget of Avatar. A new cell phone app developed by Microsoft researchers can be sufficient. The software uses overlapping snapshots to build a photo-realistic 3-D model that can be spun around and viewed from any angle.

"We want everybody with a cell phone or regular digital camera to be able to capture 3-D objects," says Eric Stollnitz, one of the Microsoft researchers who worked on the project.

To capture a car in 3-D, for example, a person needs to take a handful of photos from different viewpoints around it. The photos can be sent instantly to a cloud server for processing. The app then downloads a photo-realistic model of the object that can be smoothly navigated by sliding a finger over the screen. A detailed 360-degree view of a car-sized object needs around 40 photos; a smaller object like a birthday cake needs 25 or fewer.

If captured with a conventional camera instead of a cell phone, the photos have to be uploaded onto a computer for processing in order to view the results. The researchers have also developed a Web browser plug-in that can be used to view the 3-D models, enabling them to be shared online. "You could be selling an item online, taking a picture of a friend for fun, or recording something for insurance purposes," says Stollnitz. "These 3-D scans take up less bandwidth than a video because they are based on only a few images, and are also interactive."

To make a model from the initial snapshots, the software first compares the photos to work out where in 3-D space they were taken from. The same technology was used in a previous Microsoft research project, PhotoSynth, that gave a sense of a 3-D scene by jumping between different views (see video). However, PhotoSynth doesn't directly capture the 3-D information inside photos.

"We also have to calculate the actual depth of objects from the stereo effect," says Stollnitz, "comparing how they appear in different photos." His software uses what it learns through that process to break each image apart and spread what it captures through virtual 3-D space (see video, below). The pieces from different photos are stitched together on the fly as a person navigates around the virtual space to generate his current viewpoint, creating the same view that would be seen if he were walking around the object in physical space.
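The depth calculation Stollnitz describes rests on a standard stereo relationship: the farther away a feature is, the less it shifts between two photos. Under a simple pinhole-camera assumption (my illustration, not Microsoft's actual algorithm):

```python
# Stereo depth from disparity: depth = focal_length * baseline / disparity,
# with focal length and disparity in pixels and baseline in metres.
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    return focal_px * baseline_m / disparity_px

# A feature shifts 40 px between photos taken 0.5 m apart (focal 800 px)
print(depth_from_disparity(800, 0.5, 40))  # 10.0 metres away
```

Repeating this for many matched features across many photo pairs is what lets the software spread the image pieces through virtual 3-D space.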

"This is an interesting piece of software," says Jason Hurst, a product manager with 3DMedia, which makes software that combines pairs of photos to capture a single 3-D view of a scene. However, using still photos does have its limitations, he points out. "Their method, like ours, is effectively time-lapse, so it can't deal with objects that are moving," he says.

3DMedia's technology is targeted at displays like 3-D TVs or Nintendo's new glasses-free 3-D handheld gaming device. But the 3-D information built up by the Microsoft software could be modified to display on such devices, too, says Hurst, because the models it builds contain enough information to create the different viewpoints for a person's eyes.

Hurst says that as more 3-D-capable hardware appears, people will need more tools that let them make 3-D content. "The push of 3-D to consumers has come from TV and computer device makers, but the content is lagging," says Hurst. "Enabling people to make their own is a good complement."

Technology Review

TStzmmalaysia
post Mar 26 2011, 11:14 PM



ENERGY


100% Renewable Energy By 2050 Is Possible - Here's How We Can Do It

We recently examined how Australia can meet 100% of its electricity needs from renewable sources by 2020. Here we will examine how that goal can be scaled up for the rest of the world.

Energy consulting firm Ecofys produced a report detailing how we can meet nearly 100% of global energy needs with renewable sources by 2050. Approximately half of the goal is met through increased energy efficiency to first reduce energy demands, and the other half is achieved by switching to renewable energy sources for electricity production (Figure 1, below).

To achieve the goal of 100% renewable energy production, Ecofys foresees that global energy demand in 2050 will be 15% lower than in 2005, despite a growing population and continued economic development in countries like India and China. In their scenario:

QUOTE
Industry uses more recycled and energy-efficient materials, buildings are constructed or upgraded to need minimal energy for heating and cooling, and there is a shift to more efficient forms of transport.
As far as possible, we use electrical energy rather than solid and liquid fuels. Wind, solar, biomass and hydropower are the main sources of electricity, with solar and geothermal sources, as well as heat pumps providing a large share of heat for buildings and industry. Because supplies of wind and solar power vary, "smart" electricity grids have been developed to store and deliver energy more efficiently.  Bioenergy (liquid biofuels and solid biomass) is used as a last resort where other renewable energy sources are not viable.


Figure 1: Ecofys projected global energy consumption between 2000 and 2050.

To achieve the necessary renewable energy production, Ecofys envisions that solar energy supplies about half of our electricity, half of our building heating, and 15% of our industrial heat and fuel by 2050. This requires an average annual solar energy growth rate much lower than we're currently achieving - an encouraging finding.

The report notes that wind could meet one-quarter of the world's electricity needs by 2050 if current growth rates continue, and sets that as its goal. Ecofys also envisions more than one-third of building heat coming from geothermal sources by 2050. If we double current geothermal electricity production growth rates, it can provide 4% of our total electricity needs by that date. Ocean power, through both waves and tides, accounts for about 1% of global electricity needs in 2050. Hydropower, which currently supplies 15% of global electricity, ultimately supplies 12% in the Ecofys scenario. As you can see in Figure 2, global renewable energy use ramps up gradually between now and 2050.

Figure 2: Energy use by source between 2000 and 2050.
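The report's growth-rate arguments are easy to sanity-check. For instance, here is the compound annual growth needed for a source to go from 1% to 25% of electricity supply by 2050, holding total demand fixed (illustrative figures of mine, not Ecofys's inputs):

```python
# Compound annual growth rate needed to move from one supply share to
# another over a given number of years (total demand held constant).
def required_annual_growth(share_now, share_target, years):
    return (share_target / share_now) ** (1 / years) - 1

rate = required_annual_growth(0.01, 0.25, 2050 - 2011)
print(f"{rate:.1%} per year")  # roughly 8.6% per year
```

That is well below the 20-30% annual growth wind and solar were posting around 2011, which is why the report treats its targets as achievable rather than heroic.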

Burning biomass (such as plant and animal waste) will supply 60% of industrial fuels and heat, 13% of building heat, and 13% of electricity needs. Much of the proposed biomass use comes from plant residues from agriculture and food processing, sawdust and residues from forestry and wood processing, manure, and municipal waste. All of these renewable energy technologies currently exist, and it's just a matter of implementing them on a sufficiently large scale.

Ecofys also envisions using currently existing technology and expertise to "create buildings that require almost no conventional energy for heating or cooling, through airtight construction, heat pumps and sunlight. The Ecofys scenario foresees all new buildings achieving these standards by 2030." 2-3% of existing buildings will also need to be retrofitted per year to improve energy efficiency. Ecofys notes that Germany is already retrofitting buildings at this rate. Transportation must become more efficient, using more fuel efficient vehicles like electric cars, and increasing use of mass public transportation.
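The retrofit figure above is simple arithmetic: upgrading a fixed share of the building stock each year bounds how long full coverage takes.

```python
# At a steady retrofit rate, the whole existing building stock is covered
# in roughly 1/rate years.
for rate in (0.02, 0.03):
    print(f"{rate:.0%}/yr -> full stock in about {1 / rate:.0f} years")
```

At 2-3% per year the existing stock is covered in 33-50 years, i.e. just in time for the 2050 horizon, which is why the scenario cannot afford a slower pace.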

Accomplishing all of this will require a major effort, but Ecofys offers a number of suggestions for how we can start:

Introduce minimum efficiency standards worldwide for all products that consume energy, including buildings.
Build energy conservation into every stage of product design.
Introduce strict energy efficiency criteria for all new buildings.
Introduce an energy tax, or perhaps a carbon emissions price.
Help developing countries pursue alternatives to inefficient biomass burning, such as improved biomass cooking stoves, solar cookers and small-scale biogas digesters.
Invest substantially in public transportation.
Make individuals, businesses, and communities more aware of their energy consumption, and encourage increased efficiency.

TreeHugger



TStzmmalaysia
post Mar 26 2011, 11:20 PM



ENERGY


Tata & MIT Work on Breakthrough Way to Generate Power From Ordinary Water

The Tata Group has signed a deal with the founder of SunCatalytix, MIT scientist Daniel Nocera, who has discovered how to generate energy from water. Although the terms of their agreement have not yet been disclosed, this breakthrough technology could bring power to as many as three billion people worldwide. What’s more, Nocera’s technology generates energy more efficiently than solar panels, according to the folks at Fast Company.

Nocera and his team recently discovered that an artificial leaf (silicon coated with cobalt and phosphate) placed into a jar of water generates power. In a process similar to photosynthesis, the leaf uses sunlight to split water into hydrogen and oxygen, and the hydrogen can then be used to generate power.

One and a half bottles of water, including wastewater, can power a small house, and a swimming pool filled with water refreshed once a day will generate enough energy to run a plant. Although still in preliminary testing, Nocera and TATA envision that this technology could improve the standard of living for billions of people. One small caveat from us: places that are short on electricity are often also short on water. At just 45 days old, the TATA/MIT partnership still has a ways to go to get this incredible technology off the ground.

Inhabitat

TStzmmalaysia
post Mar 26 2011, 11:32 PM



RESEARCH


Neutron Analysis Yields Insight Into Bacteria for Solar Energy

Structural studies of some of nature's most efficient light-harvesting systems are lighting the way for new generations of biologically inspired solar cell devices.

Researchers from Washington University in St. Louis and the Department of Energy's Oak Ridge National Laboratory used small-angle neutron scattering to analyze the structure of chlorosomes in green photosynthetic bacteria. Chlorosomes are efficient at collecting sunlight for conversion to energy, even in low-light and extreme environments.

"It's one of the most efficient light harvesting antenna complexes found in nature," said co-author and research scientist Volker Urban of ORNL's Center for Structural Molecular Biology, or CSMB.

Neutron analysis performed at the CSMB's Bio-SANS instrument at the High Flux Isotope Reactor allowed the team to examine chlorosome structure under a range of thermal and ionic conditions.

"We found that their structure changed very little under all these conditions, which shows them to be very stable," Urban said. "This is important for potential biohybrid applications -- if you wanted to use them to harvest light in synthetic materials like a hybrid solar cell, for example."

The size, shape and organization of light-harvesting complexes such as chlorosomes are critical factors in electron transfer to semiconductor electrodes in solar devices. Understanding how chlorosomes function in nature could help scientists mimic the chlorosome's efficiency to create robust biohybrid or bio-inspired solar cells.

"What's so amazing about the chlorosome is that this large and complicated assembly is able to capture light effectively across a large area and then funnel the light to the reaction center without losing it along the way," Urban said. "Why this works so well in chlorosomes is not well understood at all."

"We're trying to find out general principles that are important for capturing, harvesting and transporting light efficiently and see how nature has solved that," Urban said.

Small-angle neutron scattering enabled the team to clearly observe the complicated biological systems at a nanoscale level without damaging the samples.

"With neutrons, you have an advantage that you get a very sharp contrast between these two phases, the chlorosome and the deuterated buffer. This gives you something like a clear black and white image," Urban said.

The team, led by Robert Blankenship of Washington University, published its findings in the journal Langmuir. The research was supported through the Photosynthetic Antenna Research Center, an Energy Frontier Research Center funded by DOE's Office of Science. Both HFIR and the Bio-SANS facility at ORNL's Center for Structural Molecular Biology are also supported by DOE's Office of Science.

ScienceDaily

TStzmmalaysia
post Mar 26 2011, 11:35 PM



NANOTECHNOLOGY


New entropy battery pulls energy from difference in salinity between fresh water and seawater

A team of researchers led by Dr. Yi Cui of Stanford and Dr. Bruce Logan of Penn State University has succeeded in developing an entropy battery that pulls energy from the imbalance of salinity between fresh water and seawater. Their paper, published in Nano Letters, describes a deceptively simple process whereby an entropy battery is used to capture the energy that is naturally released when river water flows into the sea.

Up to now, this kind of process has been accomplished by passing seawater through a membrane, an approach that unfortunately is too costly to merit large-scale operations.

The new process works like this:

Step 1 - Two types of nanorod electrodes are placed in river water: one silver anionic electrode containing Cl- ions and one manganese dioxide cationic electrode containing Na+ ions. The battery charges as the river water's low salt concentration pulls the chlorine and the sodium from the respective electrodes.

Step 2 - The river water is slowly replaced with seawater, causing a potential difference between the two concentrations of ions in the combined water. This is due to the Cl- ions, or anions, traveling to the silver electrode and the Na+ sodium ions, or cations, traveling to the manganese dioxide electrode.

Step 3 - Ions in the electrodes discharge into the seawater when the electrodes receive more ions than they can accommodate.

Step 4 - The seawater is slowly replaced with river water, lowering the potential difference between the two electrodes and allowing the battery to recharge. Because more energy is released into the seawater in Step 3 than is needed to recharge the battery in Step 4, the cycle yields a net gain: the battery collects and stores the energy that builds up as the ions move in and out of the crystal lattice of the electrodes.
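As a rough sanity check on the cycle above, the voltage swing between the two electrolytes can be sketched with the Nernst relation. The salinity values below are illustrative assumptions, not figures from the paper:

```python
import math

# Back-of-envelope estimate of the voltage swing driving the entropy
# battery cycle described above. Concentrations are illustrative:
# river water ~0.01 M NaCl, seawater ~0.6 M NaCl (assumed values).
R = 8.314      # J/(mol*K), gas constant
T = 298.0      # K, room temperature
F = 96485.0    # C/mol, Faraday constant

c_fresh = 0.01  # mol/L, assumed river-water salinity
c_sea = 0.6     # mol/L, assumed seawater salinity

def nernst_shift(c_high, c_low, t=T):
    """Potential shift (volts) of one ion-selective electrode when the
    electrolyte concentration changes from c_low to c_high."""
    return (R * t / F) * math.log(c_high / c_low)

# The Na+ (manganese dioxide) and Cl- (silver) electrodes shift in
# opposite directions, so the cell-voltage difference between seawater
# and river water is roughly twice the single-electrode Nernst shift.
delta_v = 2 * nernst_shift(c_sea, c_fresh)
print(f"Approximate cell-voltage swing: {delta_v*1000:.0f} mV")
```

Under these assumed concentrations the swing comes out on the order of a couple hundred millivolts, which is the right ballpark for an electrode-based salinity-gradient cell.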

With the entropy battery, costs are much lower than other ways of accomplishing the same thing due to the absence of replaceable membranes.

Cui believes that the entropy battery might eventually contribute up to 13% of total energy needs. He also believes that by moving the two electrodes closer together, he might be able to improve the efficiency rate from 74 percent to 85 percent.

Because the entropy battery operates in both warm and cold conditions, it is a completely renewable resource, one that might lead to mass energy production in developed and developing countries alike.

More information: Batteries for Efficient Energy Extraction from a Water Salinity Difference, by Fabio La Mantia et al., Nano Lett., Article ASAP

PhysOrg

TStzmmalaysia
post Mar 28 2011, 09:44 AM

RESEARCH

Smaller Particles Could Make Solar Panels More Efficient

Studies done by Mark Lusk and colleagues at the Colorado School of Mines could significantly improve the efficiency of solar cells. Their latest work describes how the size of light-absorbing particles--quantum dots--affects the particles' ability to transfer energy to electrons to generate electricity.

The results are published in the April issue of the journal ACS Nano.


The advance provides evidence to support a controversial idea, called multiple-exciton generation (MEG), which theorizes that it is possible for an electron that has absorbed light energy, called an exciton, to transfer that energy to more than one electron, resulting in more electricity from the same amount of absorbed light.


Quantum dots are human-made atoms that confine electrons to a small space. They have atomic-like behavior that results in unusual electronic properties on a nanoscale. These unique properties may be particularly valuable in tailoring the way light interacts with matter.


Experimental verification of the link between MEG and quantum dot size is a hot topic due to a large degree of variation in previously published studies. The ability to generate an electrical current following MEG is now receiving a great deal of attention because this will be a necessary component of any commercial realization of MEG.


For this study, Lusk and collaborators used a National Science Foundation (NSF)-supported high performance computer cluster to quantify the relationship between the rate of MEG and quantum dot size.


They found that each dot has a slice of the solar spectrum for which it is best suited to perform MEG and that smaller dots carry out MEG for their slice more efficiently than larger dots. This implies that solar cells made of quantum dots specifically tuned to the solar spectrum would be much more efficient than solar cells made of material that is not fabricated with quantum dots.
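Energy conservation puts a simple ceiling on MEG: one absorbed photon can yield at most as many excitons as its energy contains multiples of the dot's effective gap. A toy sketch of that bound, using illustrative gap values that are not from the study:

```python
# Energy-conservation bound on multiple-exciton generation (MEG):
# a photon of energy E_photon can create at most floor(E_photon / E_gap)
# excitons in a dot with effective gap E_gap. Values are illustrative.
def max_excitons(e_photon_ev, e_gap_ev):
    """Upper bound on the number of excitons one absorbed photon can yield."""
    return int(e_photon_ev // e_gap_ev)

# A 3.1 eV (blue, ~400 nm) photon in a dot with a 1.0 eV effective gap
# could in principle yield up to 3 excitons; a dot with a smaller
# 0.7 eV gap could yield up to 4.
print(max_excitons(3.1, 1.0))
print(max_excitons(3.1, 0.7))
```

This is only the thermodynamic ceiling; the study's point is that how close real dots get to it, and for which slice of the spectrum, depends on dot size.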


According to Lusk, "We can now design nanostructured materials that generate more than one exciton from a single photon of light, putting to good use a large portion of the energy that would otherwise just heat up a solar cell."


The research team, which includes participation from the National Renewable Energy Laboratory, is part of the NSF-funded Renewable Energy Materials Research Science and Engineering Center at the Colorado School of Mines in Golden, Colo. The center focuses on materials and innovations that will significantly impact renewable energy technologies. Harnessing the unique properties of nanostructured materials to enhance the performance of solar panels is an area of particular interest to the center.


"These results are exciting because they go far towards resolving a long-standing debate within the field," said Mary Galvin, a program director for the Division of Materials Research at NSF. "Equally important, they will contribute to establishment of new design techniques that can be used to make more efficient solar cells."

Science Daily

TStzmmalaysia
post Mar 28 2011, 09:45 AM

APPLIED SCIENCES

A New Way to Turn Out Cheap LED Lighting

A startup in California has developed a manufacturing technique that could substantially cut the cost of LED lightbulbs—a more energy-efficient type of lighting.

LEDs are conventionally made on a relatively costly substrate of silicon carbide or sapphire. Bridgelux has come up with a new process that takes advantage of existing fabrication machines used to make silicon computer chips, potentially cutting LED production costs by 75 percent, according to the company.

Despite their higher efficiencies and longer life, few homes and businesses use LED lighting—largely because of the initial cost. An LED chip makes up 30 to 60 percent of the cost of a commercial LED lightbulb; electronic control circuits and heat-management components account for the rest. So for a 60-watt-equivalent bulb that costs $40, Bridgelux's technology could bring the cost down by $9 to $18. Integrating the light chip with the electronics might further reduce costs.
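The $9-to-$18 figure follows directly from the numbers quoted above; a quick arithmetic check:

```python
# Quick check of the cost arithmetic above: a 75% cut in LED chip
# production cost, where the chip is 30-60% of a $40 bulb, should save
# $9-$18 per bulb.
bulb_cost = 40.0           # dollars, from the article
chip_share = (0.30, 0.60)  # chip's fraction of the bulb cost
chip_cost_cut = 0.75       # Bridgelux's claimed production-cost reduction

savings = [bulb_cost * share * chip_cost_cut for share in chip_share]
print(f"Savings per bulb: ${savings[0]:.0f} to ${savings[1]:.0f}")
```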

LEDs made with the new technique produce 135 lumens for each watt of power. The U.S. Department of Energy's Lighting Technology Roadmap calls for an efficiency of 150 lumens per watt by 2012. Some LED makers, such as Cree, in Durham, North Carolina, already sell LED lamps with efficiencies in that range. In contrast, incandescent bulbs emit around 15 lumens per watt, and fluorescent lightbulbs emit 50 to 100 lumens per watt.

Manufacturers typically make white LEDs by coating blue gallium-nitride devices with yellow phosphors. The gallium nitride is grown on two- to four-inch sapphire or silicon carbide wafers. Cree builds its chips on silicon-carbide wafers, "because we believe it produces superior LEDs," says company spokesperson Michelle Murray.

Larger wafers mean more devices fabricated at once, which brings down cost. But large sapphire or silicon carbide wafers are more difficult, and expensive, to make. Companies such as Osram Opto Semiconductors in Germany are now moving to 15-centimeter sapphire wafers, most likely the largest size possible. Making 20-centimeter silicon wafers, on the other hand, is routine in the semiconductor chip-making industry. Bridgelux's new silicon wafers were, in fact, made at an old silicon fabrication plant in Silicon Valley.

It is hard to grow gallium nitride on silicon, mainly because the materials expand and contract at very different rates, explains Colin Humphreys, a materials science researcher at Cambridge University. The process is carried out at temperatures around 1,000 °C, and, upon cooling, the gallium nitride cracks because it is under tension, Humphreys says. One way to solve the problem is to insert additional thin films around the gallium nitride to compress the material and balance out the tension produced during cooling. In fact, Humphreys and his colleagues have used this trick to make gallium-nitride LEDs on silicon; their devices produce 70 lumens per watt. Bridgelux might be using a similar technique. "The result from Bridgelux is impressive," Humphreys says. "It offers the promise of a large cost reduction without any reduction of efficiency."

Other LED makers, including Osram, are also trying to make gallium-nitride LEDs on silicon. Bridgelux expects to deliver its first silicon-based LEDs in two to three years.

Technology Review

TStzmmalaysia
post Mar 28 2011, 09:46 AM

RESEARCH

Debut of the first practical 'artificial leaf'

Scientists today claimed one of the milestones in the drive for sustainable energy — development of the first practical artificial leaf. Speaking here at the 241st National Meeting of the American Chemical Society, they described an advanced solar cell the size of a poker card that mimics the process, called photosynthesis, that green plants use to convert sunlight and water into energy.

"A practical artificial leaf has been one of the Holy Grails of science for decades," said Daniel Nocera, Ph.D., who led the research team. "We believe we have done it. The artificial leaf shows particular promise as an inexpensive source of electricity for homes of the poor in developing countries. Our goal is to make each home its own power station," he said. "One can envision villages in India and Africa not long from now purchasing an affordable basic power system based on this technology."

The device bears no resemblance to Mother Nature's counterparts on oaks, maples and other green plants, which scientists have used as the model for their efforts to develop this new genre of solar cells. About the shape of a poker card but thinner, the device is fashioned from silicon, electronics and catalysts, substances that accelerate chemical reactions that otherwise would not occur, or would run slowly. Placed in a single gallon of water in a bright sunlight, the device could produce enough electricity to supply a house in a developing country with electricity for a day, Nocera said. It does so by splitting water into its two components, hydrogen and oxygen.

The hydrogen and oxygen gases would be stored in a fuel cell, which uses those two materials to produce electricity, located either on top of the house or beside it.
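As a back-of-envelope check on the "gallon of water" claim, the energy recoverable from fully splitting that much water and recombining it in a fuel cell can be estimated from textbook constants; none of the figures below come from Nocera's talk:

```python
# Back-of-envelope: how much energy could splitting one gallon of water
# store as hydrogen? (Assumes complete splitting; a real device would
# split only a fraction of the water per day.)
GALLON_L = 3.785           # liters per US gallon
MOLAR_MASS_H2O = 18.0      # g/mol
GIBBS_KJ_PER_MOL = 237.0   # kJ/mol recoverable by an ideal fuel cell

moles = GALLON_L * 1000.0 / MOLAR_MASS_H2O      # ~210 mol of water
energy_kwh = moles * GIBBS_KJ_PER_MOL / 3600.0  # kJ -> kWh
print(f"~{energy_kwh:.0f} kWh if fully split")
```

The result, on the order of 14 kWh, is indeed comparable to a modest household's daily electricity use, so the claim is at least dimensionally plausible.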

Nocera, who is with the Massachusetts Institute of Technology, points out that the "artificial leaf" is not a new concept. The first artificial leaf was developed more than a decade ago by John Turner of the U.S. National Renewable Energy Laboratory in Boulder, Colorado. Although highly efficient at carrying out photosynthesis, Turner's device was impractical for wider use, as it was composed of rare, expensive metals and was highly unstable — with a lifespan of barely one day.

Nocera's new leaf overcomes these problems. It is made of inexpensive materials that are widely available, works under simple conditions and is highly stable. In laboratory studies, he showed that an artificial leaf prototype could operate continuously for at least 45 hours without a drop in activity.

The key to this breakthrough is Nocera's recent discovery of several powerful new, inexpensive catalysts, made of nickel and cobalt, that are capable of efficiently splitting water into its two components, hydrogen and oxygen, under simple conditions. Right now, Nocera's leaf is about 10 times more efficient at carrying out photosynthesis than a natural leaf. However, he is optimistic that he can boost the efficiency of the artificial leaf much higher in the future.

"Nature is powered by photosynthesis, and I think that the future world will be powered by photosynthesis as well in the form of this artificial leaf," said Nocera, a chemist at Massachusetts Institute of Technology in Cambridge, Mass.

PhysOrg

Research Provided by American Chemical Society

TStzmmalaysia
post Mar 28 2011, 09:49 AM

RESEARCH

High-temperature superconductor spills secret: A new phase of matter

Scientists from the U.S. Department of Energy's Lawrence Berkeley National Laboratory (Berkeley Lab) and the University of California at Berkeley have joined with researchers at Stanford University and the SLAC National Accelerator Laboratory to mount a three-pronged attack on one of the most obstinate puzzles in materials sciences: what is the pseudogap?

A collaboration organized by Zhi-Xun Shen, a member of the Stanford Institute for Materials and Energy Science (SIMES) at SLAC and a professor of physics at Stanford University, used three complementary experimental approaches to investigate a single material, the high-temperature superconductor Pb-Bi2201 (lead bismuth strontium lanthanum copper-oxide). Their results are the strongest evidence yet that the pseudogap phase, a mysterious electronic state peculiar to high-temperature superconductors, is not a gradual transition to superconductivity in these materials, as many have long believed. It is in fact a distinct phase of matter.

"This is a paradigm shift in the way we understand high-temperature superconductivity," says Ruihua He, lead author with Makoto Hashimoto of the paper in the March 25 issue of the journal Science that describes the team's findings. "The involvement of an additional phase, once fully understood, might open up new possibilities for achieving superconductivity at even higher temperatures in these materials." When the research was done Hashimoto and He were members of SIMES, of Stanford's Department of Applied Physics, and of Berkeley Lab's Advanced Light Source (ALS), where He is now a postdoctoral fellow.

The pseudogap mystery

Superconductivity is the total absence of resistance to the flow of electric current. Discovered in 1911, it was long thought to occur only in metals and only below a critical temperature (Tc) not far above absolute zero. "Ordinary" superconductivity commonly takes place at 30 kelvins (30 K) or less, equivalent to more than 400 degrees below zero Fahrenheit. Awkward as reaching such low temperatures may be, ordinary superconductivity is widely exploited in industry, health, and science.

High-Tc superconductors were discovered in 1986. "High" is a relative term; the highest-Tc superconductors function at temperatures five times higher than ordinary superconductors but still only about twice that of liquid nitrogen. Many high-Tc superconductors have been found, but the record holders for critical temperature remain the kind first discovered, the cuprates — brittle oxides whose structure includes layers of copper and oxygen atoms where current flows.

In all known superconductors electrons join in pairs (Cooper pairs) to move in correlated fashion through the material. It takes a certain amount of energy to break Cooper pairs apart; in ordinary superconductors, the absence of single-electron states below this energy constitutes a superconducting gap, which vanishes when the temperature rises above Tc. Once in the normal state the electrons revert to unpaired, uncorrelated behavior.

Not so for cuprate superconductors. A similar superconducting gap exists below Tc, but when superconductivity ceases at Tc the gap doesn't close. A "pseudogap" persists and doesn't go away until the material reaches a higher temperature, designated T* (T-star). The existence of a pseudogap in the normal state is itself anything but normal; its nature has been heatedly debated ever since it was identified in cuprates more than 15 years ago.

Attempts to explain what's going on in the pseudogap have coalesced around two main schools of thought. Traditional thinking holds that the pseudogap represents a foreshadowing of the superconducting phase. As the temperature is lowered, first reaching T*, a few electron pairs start to form, but they are sparse and lack the long-range coherence necessary for superconductivity — they can't "talk" to one another. As the temperature continues to fall, more such pairs are formed until, upon reaching Tc, virtually all conducting electrons are paired and act in correlation; they're all talking. In this scheme, there's only a single phase transition, which occurs at Tc.

Another school of thought argues that the appearance of the pseudogap at T* is also a true phase transition. The pseudogap does not represent a smooth shift to the superconducting state but is itself a state distinct from both superconductivity and normal "metallicity" (the usual state of delocalized, uncorrelated electrons). This new phase implies the existence of a "quantum critical point" — a point along a line at zero temperature where competing phases meet. In theory, with competing phases wildly fluctuating in the neighborhood of a quantum critical point, there may be entirely new routes to superconductivity.

"Promising as the 'quantum critical' paradigm is for explaining a wide range of exotic materials, high-Tc superconductivity in cuprates has stubbornly refused to fit the mold," says Joseph Orenstein of Berkeley Lab's Materials Sciences Division, a professor in physics at UC Berkeley, whose group conducted one of the research team's three experiments. "For 20 years, the cuprates managed to conceal any evidence of a phase-transition line where the quantum critical point is supposed to be found."

In recent years, however, hints have emerged. "New ultrasensitive probes have found fingerprints of phase transitions in high-Tc materials," Orenstein says, "although there's been no smoking gun. The burning question is whether we can discover the nature of the new phase or phases."

A multipronged attack on the pseudogap

In the Stanford-Berkeley study, three groups of researchers joined forces to probe the pseudogap phase on the same sample.

"Pb-Bi2201 was chosen because, first, it is structurally simple, and second, it has a relatively wide temperature range between Tc and T*," says Ruihua He. "This permits a clean separation of any remnant effect of superconductivity from genuine pseudogap physics."

Groups led by Z.-X. Shen at beamline 5-4 of the Stanford Synchrotron Radiation Lightsource (SSRL) at SLAC and by Zahid Hussain, ALS Division Deputy for Scientific Support, at beamline 10.0.1 of Berkeley Lab's ALS, studied the sample with angle-resolved photoemission spectroscopy (ARPES). In ARPES, a beam of x-rays directed at the sample surface excites the emission of valence electrons. By monitoring the kinetic energy and momentum of the emitted electrons over a wide temperature range, the researchers map out the material's low-energy electronic band structure, which determines much of its electrical and magnetic properties.
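The energy bookkeeping behind ARPES is simple even though the measurement is not: the photon energy, minus the material's work function, minus the measured kinetic energy gives the electron's binding energy inside the material. A minimal sketch with assumed, illustrative values (not the beamline's actual settings):

```python
# Sketch of the energy bookkeeping behind ARPES: the measured kinetic
# energy of an emitted electron maps back to its binding energy.
# Both constants below are illustrative assumptions.
H_NU = 22.7     # eV, an assumed photon energy
WORK_FN = 4.3   # eV, an assumed sample work function

def binding_energy(e_kinetic_ev, h_nu=H_NU, work_fn=WORK_FN):
    """Binding energy E_B = h*nu - phi - E_kin (all energies in eV)."""
    return h_nu - work_fn - e_kinetic_ev

# An electron detected with 18.3 eV of kinetic energy sits ~0.1 eV
# below the Fermi level under these assumptions.
print(f"{binding_energy(18.3):.2f} eV")
```

Scanning this relation over emission angle (which fixes the momentum) and temperature is what lets the researchers watch a gap open and close in the band structure.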

At Stanford, researchers led by Aharon Kapitulnik of SIMES, a professor in applied physics at Stanford University, studied the same crystal of Pb-Bi2201 with the magneto-optical Kerr effect. In light reflected from the sample under a zero magnetic field, tiny rotations of the plane of polarization are measured as the temperature changes. The rotations are proportional to the net magnetization of the sample at different temperatures.

Finally, Orenstein's group at Berkeley applied time-resolved reflectivity to the sample. A pump pulse from a laser excites electrons many layers of atoms deep, temporarily affecting the sample's reflectivity. Probe pulses, timed to follow less than a trillionth of a second after the pump pulses, reveal changes in reflection at different temperatures.

All these experimental techniques had previously pointed to the possibility of a phase transition in the neighborhood of T* in different cuprate materials. But no single result was strong enough to stand alone.

ARPES experiments performed in 2010 by the same group of experimenters as in the present study revealed the abrupt opening of the pseudogap at T* in Pb-Bi2201. Variations in T* in different materials and even different samples, as well as in the surface conditions to which ARPES is sensitive, had left room for uncertainty, however.

In 2008, the Kerr effect was measured in another cuprate, also by the same group as in the present study, and showed a change in magnetization from zero to finite across T*. This was long-sought thermodynamic evidence for the existence of a phase transition at T*. But compared to the pronounced spectral change seen by ARPES, the extreme weakness of the Kerr-effect signal left doubt that the two results were connected.

Finally, since the late 1990s various experiments with time-resolved reflectivity in different cuprates have reported signals setting in near T* and increasing in strength as the temperature drops, until interrupted by the onset of a separate signal below Tc. The probe is complex and there was a lack of corroborating evidence for the same cuprates; the results did not receive wide attention.

Now the three experimental approaches have all been applied to the same material. All yielded consistent results and all point to the same conclusion: there is a phase transition at the pseudogap phase boundary – the three techniques put it precisely at T*. The electronic states dominating the pseudogap phase do not include Cooper pairs, but nevertheless intrude into the lower-lying superconducting phase and directly influence the motion of Cooper pairs in a way previously overlooked.

"Instead of pairing up, the electrons in the pseudogap phase organize themselves in some very different way," says He. "We currently don't know what exactly it is, and we don't know whether it helps superconductivity or hurts it. But we know the direction to take to move forward."

Says Orenstein, "Coming to grips with a new picture is a little like trying to steer the Titanic, but the fact that all three of these techniques point in the same direction adds to the mounting evidence for the phase change."

Hussain says the critical factor was bringing the Stanford and Berkeley scientists together. "We joined forces to tackle a more complex problem than any of us had tried on our own."

PhysOrg

Provided by Lawrence Berkeley National Laboratory
TStzmmalaysia
post Mar 28 2011, 09:52 AM

APPLIED SCIENCES

Engineers put a damper on 'aeroelastic flutter'

Anyone who has ever flown knows the feeling: an otherwise smooth flight gets a little choppy. If you are lucky, the plane skips a few times like a rock across a pond and then settles. For the not-so-lucky, the captain has turned on that seatbelt sign for a reason, but even the worst turbulence usually fades.

In certain rare situations, however, those vibrations don't settle and the consequences turn dire. The twisting, up-and-down movement in the wing builds upon itself, each wave compounding the next, until the vibration worsens and the wing is ripped from the plane. In an instant, a simple bit of turbulence becomes a matter of life and death.

Aeronautical engineers know it as "aeroelastic flutter." Pilots call it "buzz."



Complicated stuff

Aeronautical engineers have puzzled over the phenomenon as long as there have been planes. They made their planes better. They built them of new materials. They used supercomputers to predict when it might occur. But, try as they might, they could not absolutely eliminate aeroelastic flutter.

Professor Charbel Farhat, chair of the Aeronautics and Astronautics Department at Stanford's School of Engineering, and David Amsallem, a postdoctoral scholar who worked on his PhD thesis with Farhat, have been studying and trying to solve aeroelastic flutter for years. Computers help, but only to a point.

Listening to Farhat is a bit like flying. He talks quickly with great expression, piloting the listener through his world, swooping from idea to idea in great arcs like a stunt plane, always on the edge of control. He is a barnstormer. Amsallem, the mild-mannered mathematical whiz behind it all, smiles gently, tossing in a French-accented word of clarification here and there.

"This is complicated stuff. It takes today's fastest computer an hour to calculate the aeronautical effect of even a small change in a single variable," said Farhat, his voice rising to deliver the word "today's" as if to reinforce that we are not talking about some mid-century mainframe here. "Imagine a plane in flight and you'll quickly grasp that there are hundreds of variables. Now, imagine the rate of change in those variables for an F-16 at full throttle."

Each incremental shift in the pitch of the wing, every inch of altitude, every variation in speed, each milliliter of fuel added or burned as it sloshes back and forth in the tank: all are equally at play. Alter one and you alter the entire system. And each time you alter the system, the supercomputer starts back at go, computing anew – that is, if you can get time on the supercomputer.



"Now you begin to understand. This is complicated stuff," Farhat said. At this point in the conversation, the professor leaned in. His eyes narrowed and his tone grew serious. "We can now predict flutter in real time … on an iPhone." Someday, he predicted, airplanes will have chips on board that will sense and counteract flutter in real time.

The work has caused a sensation in the aeronautics field. When Farhat and Amsallem presented their paper at the Army Science Conference recently, the crowd of aeronautical old hands sat stunned as the two did a live demo of their work on an iPad.

Farhat chose the iPad over a smaller device, he said, not for processor speed – their innovation works fine on an iPhone, he assured – but for pure visual impact: It looks better on an iPad.

The seat of your pants

Over time, aeronautical engineers have been able to "engineer" flutter to a point of virtual oblivion – emphasis on virtual. It is now only a remote risk.

"But, there are instances when even the smallest of risks is too great a risk," Farhat said.

For instance, when you are a fighter pilot in the saddle of a $60 million F-16, one of the fastest, most agile fighter planes ever developed. F-16 pilots – and their planes – regularly endure forces many times that of gravity. A little turbulence can be a big deal. Each time the wing starts to bounce, no matter how slight the bounce, a little voice nags in the pilot's mind, "Is this the one?" If the vibration fails to fade, the pilot must contemplate the "eject" button. At this point, it is life or plane. Call it flying by the seat of your $60 million pants.

How have Farhat and Amsallem succeeded where others have come up short? The answer sounds suitably complex: interpolation on manifolds. What it means, in essence, is approximating unknowns based on known information. The two engineers devised a system of mathematical approximations that break down complex, computationally demanding equations into smaller, more manageable parts. In mathematics, this is known as "reducing." Reducing allows them to make some very educated guesses, very quickly.

Starting with a mostly random, but carefully selected, sample of a few flight conditions – the variables such as air speed, wing angle and altitude – they "pre-computed" a series of reduced-order models using the very supercomputers they aimed to beat. These models are called "snapshots" – mathematical pictures of the fluid dynamics at play at each flight point. Farhat and Amsallem stored these snapshots in a database, which their computer algorithms can later pluck as needed to make more complex calculations, quite literally, on the fly.

That is the easy part.

Next, using a cleverly designed and painstakingly tested methodology, Farhat and Amsallem segmented data into small groups centered on the pre-calculated points in the databases. In essence, Farhat and Amsallem drew circles around the things they knew – those pre-computed flight conditions – and crafted a system to interpolate the value of any point within each circle.

On a graph, these "cells," as they are described, look like living cells with the known data serving as the nucleus. The rest of the cell is considered of similar enough aeronautical characteristic to the nucleus as to allow the engineers to make very educated guesses as to the behavior of the entire cell – allowing them to determine how the wing will respond at any given moment.

What's more, each time they make a new set of calculations, the new numbers are added to the database, bolstering future results and making their interpolations all the more accurate. The mathematicians call this "training," as if they are teaching the numbers and not the other way around – as if they are telling the numbers what they can and can't do.
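The cell-and-nucleus idea can be caricatured in a few lines of code. The sketch below uses plain inverse-distance interpolation over a made-up snapshot table, purely for intuition; the actual method interpolates reduced-order models on matrix manifolds and is far more involved:

```python
import math

# Toy illustration of the snapshot-database idea described above:
# precomputed results at a few flight conditions act as "nuclei," and a
# query inside a cell is answered by inverse-distance interpolation.
# All numbers are invented for illustration.

# (Mach number, altitude in km) -> precomputed flutter metric
snapshots = {
    (0.7, 5.0): 0.12,
    (0.9, 5.0): 0.35,
    (0.8, 10.0): 0.20,
}

def predict(point, db, p=2):
    """Inverse-distance-weighted estimate at `point` from stored snapshots."""
    num = den = 0.0
    for key, value in db.items():
        d = math.dist(point, key)
        if d == 0.0:
            return value          # exact hit on a precomputed condition
        w = 1.0 / d**p
        num += w * value
        den += w
    return num / den

estimate = predict((0.8, 5.0), snapshots)
# "Training": fold the new result back into the database so future
# queries near this flight condition draw on it too.
snapshots[(0.8, 5.0)] = estimate
print(f"{estimate:.3f}")
```

The cheapness of the lookup, compared with re-running a full simulation, is what makes real-time prediction on a phone-class processor plausible.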

Thus, the smartphone beat the supercomputer.



A field aflutter

So, what took so long?

"It sounds simple," said Amsallem, explaining his work, "but the mathematics are exceptionally complex and the stakes of being wrong are great. There's no room for error."

Farhat and Amsallem are able to accurately predict – in real time and where no one had before – whether a plane will experience flutter. Soon, they hope, all planes will have active control mechanisms that continually monitor multiple flight and mechanical data and steer the planes clear of flutter.

While this is good news for pilots and passengers, the implications of the work run far beyond the relatively rare, albeit deadly, phenomenon of flutter.

"Our interpolation method is general enough to work, in principle, on many complex engineering problems," said Farhat, hinting at future possibilities.

And this is what has the field aflutter. Everyone from the Air Force to the Navy to airplane manufacturers to Formula 1 racing teams are lining up – as they once did for time on the supercomputers – to apply the Stanford aeronautical algorithm to their problems.

PhysOrg

TStzmmalaysia
post Mar 29 2011, 05:49 PM

RESEARCH

Structure of DNA Repair Complex Reveals Workings of Powerful Cell Motor

Over the last several years, two teams of researchers at The Scripps Research Institute have steadily built a model of how a powerful DNA repair complex works. Now, their latest discovery provides revolutionary insights into the way the molecular motor inside the complex functions -- findings they say may have implications for treatment of disorders ranging from cancer to cystic fibrosis.

In a paper published in an Advance Online Edition of Nature Structural and Molecular Biology March 27, 2011, the scientists say that the complex's motor molecule, known as Rad50, is a surprisingly flexible protein that can change shape and even rotate depending on the task at hand.

The finding solves the long-standing mystery of how a single protein complex known as MRN (Mre11-Rad50-Nbs1) can repair DNA in a number of different, and tricky, ways that seem impossible for "standard issue" proteins to do, say team leaders Scripps Research Professor John Tainer, Ph.D., and Scripps Research Professor Paul Russell, Ph.D., who also collaborated with members of the Lawrence Berkeley National Laboratory on the study.

They say the finding also provides a critical insight into the ABC-ATPase superfamily of molecular motors, of which Rad50 is a member.

"Rad50 and its brethren proteins in this superfamily are biology's general motors," said Tainer, "and if we know how they work, we might be able to control biological outcomes when we need to."

For example, knowing that Rad50 changes its contour to perform a function suggests it might be possible to therapeutically target unique elements in that specific conformation. "There could be a new generation of drugs that are designed not against an active site, like most drugs now (an approach that can cause side effects), but against the shape the protein needs to be in to work," Tainer said.

Russell added, "Proteins are often viewed as static, but we are showing the moving parts in this complex. They are dynamic. They move about and change shape when engaging with other molecules."

First Responder

The MRN complex is known as a first-responder molecule that rushes in to repair serious double-strand breaks in the DNA helix -- an event that normally occurs about 10 times a day per cell due to causes such as ultraviolet light and radiation damage. If these breaks are not fixed, dangerous chromosomal rearrangements can occur that lead to cancer. Paradoxically, the complex also mends DNA breaks promoted by chemotherapy, protecting cells against cancer treatment.

When MRN senses a break, it activates an alarm telling the cell to shut down division until repairs are made. Then, it binds to ATP (an energy source) and repairs DNA in three different ways, depending on whether two ends of strands need to be joined together or if DNA sequences need to be replicated. "The same complex has to decide the extent of damage and be able to do multiple things," Tainer said. "The mystery was how it can do it all."

To find out, Tainer, head of a structural biology group, and Russell, who leads a yeast genetics laboratory, began collaborating five years ago. With the additional help of team members at Lawrence Berkeley National Laboratory and its Advanced Light Source beamline, called SIBYLS, the collaboration has produced a series of high-resolution images of the crystal structure of parts of all three proteins (Rad50, Mre11, and Nbs1), taken from fission yeast and archaea. The scientists also used the lab's X-ray scattering tool to determine the proteins' overall architecture in solution, which approximates how a protein appears in a natural state.

The scientists say that the parts of the complex, when imagined together as a whole unit, resemble an octopus: the head consists of the repair machinery (the Rad50 motor and the Mre11 protein, which is an enzyme that can break bonds between nucleic acids), and the octopus arms are made up of Nbs1, which can grab the molecules needed to help the machinery mend the strands.

In this study, Tainer and Russell were able to produce crystal and X-ray scattering images of the regions where Rad50 and Mre11 touch each other, capturing what happened when ATP bound to this complex and what it looked like when it didn't.

In these four new structures, they showed that ATP binding allows Rad50 to drastically change its shape. When not bound to ATP, Rad50 is flexible and floppy, but bound to ATP, Rad50 snaps into a ring that presumably closes around DNA in order to repair it.

"We saw a lot of big movement on a molecular scale," said Tainer. "Rad50 is like a rope that can pull. It appears to be a dynamic system of communicating with other molecules, and so we can now see how flexibly linked proteins can alter their physical states to control outcomes in biology."

"We thought ATP allowed Rad50 to change shape, but now we have proof of it and how it works," Russell said. "This is a key part of the MRN puzzle."

An Engine for Many Vehicles

Rad50 and ATP provide the motor and gas for a number of biological machines that operate across species. These machines are linked to a number of disorders, such as cystic fibrosis, which is caused by a defect in the cystic fibrosis transmembrane conductance regulator (CFTR) gene, a member of the ABC ATPase superfamily.

"Our study suggests that ABC ATPase proteins are used so often in biology because they can flexibly hook up to so many different things and produce a specific biological outcome," Tainer said.

Given this new prototypic understanding of these motors, Tainer and Russell envision a future in which therapies might be designed that target Rad50 when it changes into a shape that promotes a disease. For example, chemotherapy could be coupled with an agent that prevents the MRN complex from repairing DNA damage, promoting death of cancer cells.

"There are some potentially very cool applications to these findings that we are only beginning to think about," Russell said.

The study was funded by the National Cancer Institute, the National Institutes of Health, and the Department of Energy.

ScienceDaily

TStzmmalaysia
post Mar 29 2011, 05:50 PM

NANOTECHNOLOGY


'Nano-bricks' may help build better packaging to keep foods fresher longer

Scientists are reporting on a new material containing an ingredient used to make bricks that shows promise as a transparent coating for improving the strength and performance of plastic food packaging. Called "nano-bricks," the film even looks like bricks and mortar under a microscope, they say. The coating could help foods and beverages stay fresh and flavorful longer and may replace some foil packaging currently in use, they note. The scientists described the new, eco-friendly material here today at the 241st National Meeting and Exposition of the American Chemical Society (ACS).

Ordinary plastic soda bottles tend to lose their fizz after just a few months of storage on grocery store shelves. If manufacturers apply the new coating to these bottles, the material could slow the loss of carbon dioxide gas and help sodas stay bubbly for several more months or even years, the scientists said. The coating could also extend the shelf life for those portable food packages known as MREs (Meal, Ready to Eat) that sustain soldiers in the field, with the added benefit of being microwavable, they noted. Although MREs are made to last for at least three years, their shelf life can drop to as little as three months when exposed to harsh conditions such as high heat.

"This is a new, 'outside of the box' technology that gives plastic the superior food preservation properties of glass," said Jaime Grunlan, Ph.D., who reported on the research. "It will give consumers tastier, longer lasting foods and help boost the food packaging industry."

Grunlan notes that manufacturers currently use a variety of advanced packaging materials to preserve foods and beverages. These materials include plastics coated with silicon oxide, a material similar to sand, which provides a barrier against the oxygen that can speed food spoilage. Another example is the so-called metalized plastics — plastics with a thin coating of metal or foil — used in many potato chip bags.

These and other packaging materials have drawbacks, Grunlan said. Some plastics crack easily during transport or impact. Metalized plastics are non-transparent — a turn-off to consumers who would like to be able to see their food prior to purchase. The presence of metal also prevents their use in the microwave. Food pouches made out of metal, such as MREs, provide impact resistance, but they lack both transparency and microwavability. Consumers need better food packaging options.

Grunlan has identified a promising alternative in the form of "nano-bricks." The new film combines particles of montmorillonite clay, a soil ingredient used to make bricks, with a variety of polymer materials. The resulting film is about 70 percent clay and contains a small amount of polymer, making it more eco-friendly than current plastics. The film is less than 100 nanometers thick — or thousands of times thinner than the width of a single human hair — and completely transparent to the naked eye.

"When viewed under an electron microscope, the film looks like bricks and mortar," said Grunlan, an associate professor in the Department of Mechanical Engineering at Texas A&M University in College Station, Texas. "That's why we call it 'nano-bricks'."
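The scale comparison above can be sanity-checked with a quick back-of-envelope calculation. The hair width used here (~100 micrometers) is a common textbook estimate, not a figure from the article itself:

```python
# Sanity check: a film under 100 nm thick versus the width of a human hair.
# The hair width (~100 micrometers) is an assumed typical value.
film_thickness_nm = 100        # upper bound quoted for the nano-brick film
hair_width_nm = 100_000        # ~100 micrometers

ratio = hair_width_nm / film_thickness_nm
print(f"The film is at least {ratio:.0f}x thinner than a human hair")
# → at least 1000x, consistent with "thousands of times thinner"
```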

When layered onto existing plastic packaging, it adds strength and provides an improved barrier to oxygen, he said. Grunlan demonstrated in lab studies that the film is 100 times less permeable to oxygen than existing silicon oxide coatings. This means that it's also likely to be a better oxygen barrier than a metal coating, whose permeability is similar to that of silicon oxide, the scientists noted.
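A simple steady-state permeation model (Fick's law for a thin barrier, flux = permeability × pressure difference / thickness) illustrates why a 100-fold drop in permeability matters for shelf life. The numbers below are illustrative placeholders, not measured values from the study:

```python
# Illustrative Fickian permeation through a barrier film. At equal thickness
# and pressure difference, gas ingress scales directly with permeability,
# so time-to-spoilage stretches by the same factor the permeability drops.
def oxygen_flux(permeability, pressure_diff, thickness):
    """Steady-state gas flux through a barrier film (arbitrary units)."""
    return permeability * pressure_diff / thickness

P_silicon_oxide = 1.0                   # arbitrary reference value
P_nano_brick = P_silicon_oxide / 100    # "100 times less permeable"

flux_ratio = (oxygen_flux(P_silicon_oxide, 1.0, 1.0)
              / oxygen_flux(P_nano_brick, 1.0, 1.0))
print(f"Oxygen ingress is cut by a factor of {flux_ratio:.0f}")
```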

"Others have added clay to polymer to reduce (gas) permeability, but they are thousands of times more permeable than our film," Grunlan said. "We have the most organized structure — a nano-brick wall — which is the source of this exceptional barrier. This is truly the most oxygen impermeable film in existence."

Grunlan is currently trying to improve the quality of the film to make it more appealing to packaging manufacturers, including making it more moisture resistant. He envisions that manufacturers will dip plastics in the coating or spray the coating onto plastics. In the future, he hopes to develop nano-brick films that block sunlight and contain antimicrobial substances to enhance packaging performance.

The new coating also shows promise for use in flexible electronics, scratch-resistant surfaces, tires, and sporting goods, Grunlan said. It could potentially help basketballs and footballs stay inflated longer than existing balls, he added.

PhysOrg

TStzmmalaysia
post Mar 29 2011, 05:51 PM

RESEARCH


Bullying alters brain chemistry, leads to anxiety

Bullies and the brain. Mice that have been repeatedly bullied by dominant males show an unusual reluctance to approach new, even nonthreatening mice. Above a bullied mouse (right) keeps as much distance as it can from its corralled counterpart.

Being low mouse on the totem pole is tough on murine self-esteem. It turns out it has measurable effects on brain chemistry, too, according to recent experiments at Rockefeller University. Researchers found that mice that were bullied persistently by dominant males grew unusually nervous around new company, and that the change in behavior was accompanied by heightened sensitivity to vasopressin, a hormone involved in a variety of social behaviors. The findings suggest how bullying could contribute to long-term social anxiety at the molecular level.

“We found that chronic social stress affects neuroendocrine systems that are paramount for adaptive mammalian social behaviors such as courtship, pair-bonding and parental behaviors,” says Yoav Litvin, M. S. Stoffel Postdoctoral Fellow in Mind, Brain and Behavior. “Changes in components of these systems have been implicated in human disorders, such as social phobias, depression, schizophrenia and autism.”

Litvin and colleagues in Donald Pfaff's Laboratory of Neurobiology and Behavior set up a rough-and-tumble school yard scenario in which a young mouse is placed in a cage with a series of larger, older mice — a different one on each of 10 days. The mice, being territorial, fight it out in a contest that the new arrival invariably loses. Following the 10-minute battle, the mice were separated in the same cage by a partition that keeps them physically apart but allows them to see, smell and hear one another, a stressful experience for the loser.

Given a day to rest, the test mice are then put in the company of nonthreatening mice of comparable size and age. The biggest change in behavior was that the traumatized mice were more reluctant to socialize with their fellow mice, preferring to keep their distance compared to their unbullied counterparts. The mice that had lost their battles were also more likely to “freeze” in place for longer periods of time and to frequently display “risk assessment” behaviors toward their new cage-mates, behaviors that have been shown to be valid indices of fear and anxiety in humans. The researchers also gave a group of mice a drug that blocked vasopressin receptors, which partly curbed some of the anxious behavior in the bullied mice.

The researchers then examined the brains of the mice, particularly sections in the middle of the forebrain known to be associated with emotion and social behavior. They found that mRNA expression for vasopressin receptors — specifically V1bRs — had increased in the bullied mice, making them more sensitive to the hormone, which is found in high levels in rats with innate high anxiety. In humans, the hormone is associated with aggression, stress and anxiety disorders. The surge of vasopressin receptors was especially notable in the amygdala, Litvin and colleagues reported this month in Physiology & Behavior.

How long these effects last remains an open question. Other studies have found, for instance, that chronic stress can impair some cognitive functions in rodents and people, but that their brains can bounce back, given time to recuperate.

Still, many studies in rodents, primates and people have shown that early psychological trauma can have ill effects on health throughout life. Litvin says his study suggests that victims of bullying may have difficulty forming new relationships, and it identifies the possible role for a specific vasopressin receptor.

“The identification of brain neuroendocrine systems that are affected by stress opens the door for possible pharmacological interventions,” Litvin says. “Additionally, studies have shown that the formation and maintenance of positive social relationships may heal some of the damage of bullying. These dynamic neuroendocrine systems may be involved.”

PhysOrg

TStzmmalaysia
post Mar 29 2011, 05:53 PM

NANOTECHNOLOGY


Dutch researchers identify huge potential of nanocrystals in fuel cells

The addition of extremely small crystals to solid electrolyte material has the potential to considerably raise the efficiency of fuel cells. Researchers at TU Delft were the first to document this accurately. Their second article on the subject in a very short time was published in the scientific journal, Advanced Functional Materials.

The researchers at the Faculty of Applied Sciences at TU Delft were concentrating their efforts on improving electrolyte materials. This is the material between two electrodes, for example in a fuel cell or a battery. The better the electrolyte's characteristics, the better, more compact, or more efficient the fuel cell or battery.

The electrolyte is usually a liquid, but this has a number of drawbacks. The liquid has to be very well enclosed, for example, and it takes up a relatively large amount of space. "It would therefore be preferable to have an electrolyte made of solid matter," says PhD student Lucas Haverkate. "Unfortunately though, that has disadvantages as well. The conductivity in solid matter is not as good as it is in a liquid."

"In solid matter you have a network of ions, in which virtually every position in the network is taken. This makes it difficult for the charged particles (protons) to move from one electrode to another. It’s a bit like a traffic jam on a motorway. What you need to do is to create free spaces in the network."

One of the ways of achieving this, and therefore of increasing conductivity in solid electrolytes, is to add nanocrystals (from seven to around fifty nanometres) of titanium dioxide. "A characteristic of these TiO2 crystals is that they attract protons, and this creates more space in the network." The nanocrystals are mixed into the electrolyte with a solid acid (CsHSO4). This latter material 'delivers' the protons to the crystals. "The addition of the crystals appears to cause an enormous leap in the conductive capacity, up to a factor of 100," concludes Haverkate.
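The "traffic jam" picture maps onto the standard expression for ionic conductivity, sigma = n × q × mu (carrier density × charge × mobility): clearing lattice sites lets protons hop, which shows up as a higher effective mobility. The numbers below are illustrative, not measurements from the TU Delft work:

```python
# Toy ionic-conductivity model, sigma = n * q * mu. Freeing up sites in the
# proton network raises the effective mobility; a 100x mobility gain maps
# directly onto the factor-of-100 conductivity leap quoted above.
Q_PROTON = 1.602e-19  # elementary charge, in coulombs

def conductivity(carrier_density, mobility, charge=Q_PROTON):
    """Ionic conductivity from carrier density, mobility, and charge."""
    return carrier_density * charge * mobility

# Illustrative values: same proton density, but vacancies created by the
# TiO2 crystals let protons move 100x more easily.
sigma_blocked = conductivity(carrier_density=1e24, mobility=1e-9)
sigma_doped = conductivity(carrier_density=1e24, mobility=1e-7)
print(f"Enhancement factor: {sigma_doped / sigma_blocked:.0f}x")
```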

This remarkable achievement by TU Delft has already led to two publications in the scientific journal Advanced Functional Materials. Last December, Haverkate published an article on the theory behind the results. His fellow PhD student, Wing Kee Chan, is the main author of a second item that appeared in the same publication this week. Chan focused on the experimental side of the research. "The nice thing about these two publications is that the experimental results and the theoretical underpinning strongly complement each other," says Haverkate.

Chan carried out measurements on the electrolyte material using the neutron diffraction method. This involves sending neutrons through the material. The way in which the neutrons are dispersed makes it possible to deduce certain characteristics of the material, such as the density of protons in the crystals. Haverkate: "It is the first time that measurements have been taken of solid-material electrolytes in this way, and on such a small scale. The fact that we had nuclear research technologies at the Reactor Institute Delft at our disposal was tremendously valuable."
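Diffraction measurements like these rest on Bragg's law, n·λ = 2d·sin(θ): the angles at which neutrons scatter reveal the spacings in the material. The wavelength and angle below are generic thermal-neutron values chosen for illustration, not figures reported by the Delft group:

```python
import math

# Bragg's law: recover the lattice spacing d from a diffraction peak
# observed at scattering angle 2*theta with neutrons of known wavelength.
def bragg_d_spacing(wavelength_nm, two_theta_deg, order=1):
    """Lattice spacing d (nm) for an order-n peak at angle 2*theta."""
    theta = math.radians(two_theta_deg / 2)
    return order * wavelength_nm / (2 * math.sin(theta))

# Typical thermal-neutron wavelength (~0.18 nm) and an assumed peak angle.
d = bragg_d_spacing(wavelength_nm=0.18, two_theta_deg=30.0)
print(f"d-spacing: {d:.3f} nm")
```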

However, the combination of TiO2 and CsHSO4 does not mark the end of the search for a suitable solid-material electrolyte. Other material combinations will be tested that may achieve better scores in the area of stability, for example. Professor Fokko Mulder, who is Haverkate’s and Chan’s PhD supervisor, says: "At this stage, we are more concerned about acquiring a fundamental understanding and a useful model, than the concrete issue of finding out what the most suitable material is. It is important that we identify the effect of nanocrystals, and give it a theoretical basis. I think there is great potential for these electrolytes. They also have the extra benefit of continuing to function well over a wide range of temperatures, which is of particular relevance for applying them in fuel cells."

PhysOrg

TStzmmalaysia
post Mar 29 2011, 05:54 PM

RESEARCH


'Green' cars could be made from pineapples and bananas

Your next new car hopefully won't be a lemon. But it could be a pineapple or a banana. That's because scientists in Brazil have developed a more effective way to use fibers from these and other plants in a new generation of automotive plastics that are stronger, lighter, and more eco-friendly than plastics now in use. They described the work, which could lead to stronger, lighter, and more sustainable materials for cars and other products, here today at the 241st National Meeting & Exposition of the American Chemical Society (ACS).

Study leader Alcides Leão, Ph.D., said the fibers used to reinforce the new plastics may come from delicate fruits like bananas and pineapples, but they are super strong. Some of these so-called nano-cellulose fibers are almost as stiff as Kevlar, the renowned super-strong material used in armor and bulletproof vests. Unlike Kevlar and other traditional plastics, which are made from petroleum or natural gas, nano-cellulose fibers are completely renewable.

"The properties of these plastics are incredible," Leão said. "They are light, but very strong — 30 per cent lighter and three to four times stronger. We believe that a lot of car parts, including dashboards, bumpers, side panels, will be made of nano-sized fruit fibers in the future. For one thing, they will help reduce the weight of cars and that will improve fuel economy."

Besides weight reduction, nano-cellulose-reinforced plastics have mechanical advantages over conventional automotive plastics, Leão added. These include greater resistance to damage from heat, spilled gasoline, water, and oxygen. Automobile manufacturers are already testing nano-cellulose-reinforced plastics with promising results, and he predicted the materials would be in use within two years.

Cellulose is the main material that makes up the wood in trees and other parts ofplants. Its ordinary-size fibers have been used for centuries to make paper, extracted from wood that is ground up and processed. In more recent years, scientists have discovered that intensive processing of wood releases ultra-small, or "nano" cellulose fibers, so tiny that 50,000 could fit inside across the width of a single strand of human hair. Like fibers made from glass, carbon, and other materials, nano-cellulose fibers can be added to raw material used to make plastics, producing reinforced plastics that are stronger and more durable.

Leão, who is with Sao Paulo State University in Sao Paulo, Brazil, said that pineapple leaves and stems, rather than wood, may be the most promising source of nano-cellulose. Another is curaua, a plant related to pineapple that is cultivated in South America. Other good sources include bananas; coir fibers found in coconut shells; typha, or "cattails"; sisal fibers produced from the agave plant; and fique, another plant related to pineapples.

To prepare the nano-fibers, the scientists insert the leaves and stems of pineapples or other plants into a device similar to a pressure cooker. They then add certain chemicals to the plants and heat the mixture over several cycles, producing a fine material that resembles talcum powder. The process is costly, but it takes just one pound of nano-cellulose to produce 100 pounds of super-strong, lightweight plastic, the scientists said.

"So far, we're focusing on replacing automotive plastics," said Leão. "But in the future, we may be able to replace steel and aluminum automotive parts using these plant-based nanocellulose materials."

Similar plastics also show promise for future use in medical applications, such as replacement materials for artificial heart valves, artificial ligaments, and hip joints, Leão and colleagues said.

PhysOrg

Provided by American Chemical Society

