
 Science & Technology Today

tzmmalaysia
post Dec 14 2010, 10:06 AM, updated 15y ago

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


ABOUT

I will use this thread as an information hub which aims to inform as many people as possible about the current advancements and capabilities of science and technology. Technology is rapidly developing -- faster than many of us realize -- and with this thread I aim to provide a clear picture of just how quickly this is happening through daily reporting of developments and breakthroughs around the world.

I hope to engage as many people as possible in recognizing that our current technological development is far more advanced than most people realize. By raising awareness of emerging technologies that can free humans from unnecessary work while practically and efficiently providing for everyone with minimal human effort, I hope to build a global understanding of how this technology can improve everyday life for human beings, making the human species far more likely to achieve its productive and innovative potential.

The news will fall under different categories as below:
ENERGY
TRANSPORTATION
BIOTECHNOLOGY
ROBOTICS
APPLIED SCIENCES
NANOTECHNOLOGY
SPACE SCIENCE
RESEARCH

UPDATE: Some of you might have found the news here interesting; I'm glad you did. However, people often view new technology passively: they are amazed by it, and stop there. I hope people can look at technology from a different perspective: none of these technologies require money. I repeat, they do not require money (money is just printed paper, with no function other than the value people assign to it). What they require is people interested in the research, the tools and resources for that research, and finally production and application. Unfortunately, money often gets in the way of technology progressing as fast as possible. In some of these stories you can see that nations are adopting new technology, but mainly for military, political, or business purposes, when all of this wonderful technology could be used to improve the lives of billions of people and rid our fellow human beings of unnecessary suffering all around the world. This is the perspective I hope you can take.

Please view these videos on what science & technology can do for us:
Awakening

Automation is Here video series (Directed and produced by Alfred Henry III):

Logistics

Restaurants

TED: 7 Species of Robots

I.T.
ISP-Free Internet
Computing a theory of everything
Wolfram Alpha

A.I.
Artificial Human Brain 10 Years Away
Supercomputer Beats Human Champs in 'Jeopardy!' Game


For the sake of integrity, I never alter the content of the articles, even where they portray money and politics. They are accurate, and people need to see them.

Note: Kindly refrain from posting here, in order to keep the thread clean; you may discuss the news in your own thread. If no source is provided, just copy the news title and search for it.

This post has been edited by tzmmalaysia: Dec 27 2010, 03:19 PM
tzmmalaysia
post Dec 14 2010, 10:10 AM


ENERGY


World's most efficient solar cell 'more reliable than a Swiss clock'

Manufacturers of the world's most efficient solar cell claim the device is more reliable than that benchmark of dependability - the Swiss clock.

The latest in the HIT series of solar cells, the HIT-N240SE10 manufactured by SANYO Component Europe GmbH has the world's highest cell conversion efficiency rating of 21.6 percent. Essentially this means that fewer units are needed to generate larger amounts of energy.

Based on the percentage of warranty cases against solar units supplied, SANYO claim that the cell has a reliability rating of 99.9962 percent, meaning that the new cells are more reliable than Swiss clocks, which SANYO claim have a reliability rating of 99.9930 percent.
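
As a sanity check, the quoted reliability percentages can be converted into expected warranty cases per million units. The per-million framing is mine, for illustration; only the two percentages come from the article.

```python
def failures_per_million(reliability_pct):
    """Expected number of warranty cases per million units supplied."""
    return (100.0 - reliability_pct) / 100.0 * 1_000_000

hit_cell = failures_per_million(99.9962)    # SANYO's HIT-N240SE10
swiss_clock = failures_per_million(99.9930) # SANYO's figure for Swiss clocks

print(f"HIT cell:    {hit_cell:.0f} failures per million")   # ~38
print(f"Swiss clock: {swiss_clock:.0f} failures per million") # ~70
```

So the headline claim amounts to roughly 38 warranty cases per million units versus about 70 for the clocks.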

Due to its high efficiency rating and reliability, the product is targeted towards home owners hoping to benefit from alternative energy but constrained by limited installation space and concerns over reliability.

Other alternative energy solutions for home owners with limited installation space may come in the form of ground breaking technology being developed by Cambridge University. In September 2010 Cambridge University in partnership with the Carbon Trust announced the development of organic solar cells attached to transparent, flexible sheeting which could simply be "rolled out" and attached to surfaces such as windows, therefore reducing the need for a traditional installation space such as a roof.

The HIT-N240SE10 is expected to be released on a commercial basis across Europe in early 2011.

Prior to the release of this new photovoltaic module the highest efficiency rating of a commercial solar cell was 21.1 percent, which was also achieved by a SANYO HIT cell.

This post has been edited by tzmmalaysia: Dec 14 2010, 10:19 AM
tzmmalaysia
post Dec 14 2010, 10:14 AM


ENERGY


Engineers Mimic Photosynthesis to Harvest Light Energy

Plants take advantage of quantum mechanics to harvest sunlight with near-perfect efficiency—though only roughly 2 percent of that captured sunlight ultimately gets stored as chemical energy. Now scientists are studying how this light-harvesting step of photosynthesis is optimized by nature to learn how to mimic it in engineered systems for use in solar cells or artificial leaves that produce fuels directly from the sun.

Plants rely on chromophores—molecules that absorb certain wavelengths of visible light while reflecting others—to harvest energy from the sun. When sunlight hits a plant, electrons in the topmost chromophores absorb energy from incoming photons and then transfer it from the newly energized molecule to another molecule at a lower energy state. That transfer repeats itself via a chain of molecules, a cascade of rapid energy pass-offs that ultimately separates an electron from the last chromophore in the chain, which provides energy that is stored by the plant as a carbohydrate.
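
The cascade described above can be caricatured with a toy hopping model: an excitation enters at the highest-energy chromophore and hops down the chain, with some chance of being lost at each step. Everything here (chain length, energies, the 2% loss per hop) is invented for illustration; this is a classical sketch, not the quantum dynamics the researchers study.

```python
import random

random.seed(1)
energies = [2.0, 1.8, 1.6, 1.4, 1.2, 1.0]  # downhill chain, arbitrary units
loss_per_hop = 0.02                         # assumed chance of losing the excitation

def run_cascade(trials=10_000):
    """Fraction of excitations that survive the whole chain of hops."""
    delivered = 0
    for _ in range(trials):
        for _step in range(len(energies) - 1):
            if random.random() < loss_per_hop:
                break  # excitation lost (e.g. as heat or fluorescence)
        else:
            delivered += 1
    return delivered / trials

print(f"delivered: {run_cascade():.1%}")  # close to (1 - 0.02)**5, about 90%
```

Even with small per-hop losses the overall yield falls geometrically with chain length, which is one reason the arrangement of donors, acceptors, and bridges matters so much.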

In this way chromophores perform three functions: they absorb energy from sunlight (acting as "acceptors"); they donate the energy they absorbed (as "donors"); and they transfer energy to another molecule (as "bridges"). Using measurements from other researchers of the intensity of photons absorbed and emitted by chromophores, chemist Jianshu Cao and his colleagues at the Massachusetts Institute of Technology developed a computer model to arrive at the ratio of acceptors, donors and bridges that optimizes the efficiency of the light-harvesting step of photosynthesis.

The findings: there is an optimal ratio of 10 donors for each acceptor in order to efficiently transfer energy in a natural photosynthetic system with just those two chromophore functions. Adding bridges to an arrangement of donors and acceptors then further increases the efficiency of energy transfer, Cao says.

Chromophores are arranged in bundles in plant cells, and these structures and configurations influence light-harvesting efficiency as well. University of California, Berkeley, chemist Matt Francis created artificial light-harvesting systems by attaching chromophores to tobacco mosaic virus molecules. Modeling these genetically engineered systems, Cao found that one structure—stacks of chromophore disks—could be tuned to improve the overall efficiency by combining multiple disks of similar size but different combinations of bridges, acceptors and donors. One particular configuration of two disks comprising bridges and acceptors stacked between disks made entirely of donors is a good candidate for designing artificial light-harvesting devices, according to the study published October 21 in The Journal of Physical Chemistry B.

Earlier research found that photosynthesis takes advantage of an effect known as quantum coherence. In one study researchers found that the energy absorbed by a chromophore travels through multiple networks at the same time in order to take the quickest path. Other research observed that "noise," or random fluctuations, at the quantum level helps move energy from chromophores to the reaction centers of photosynthesis. Building on this work, Cao and M.I.T. chemist Robert Silbey modeled a light-harvesting system in green sulfur bacteria and found that photosynthesis is most efficient when there is an intermediate amount of noise in the system. "In experimental conditions one always tries to reduce noise," Cao says, "but in a quantum mechanical system, it's actually useful to have some noise."

This post has been edited by tzmmalaysia: Dec 14 2010, 10:19 AM
tzmmalaysia
post Dec 14 2010, 10:16 AM


APPLIED SCIENCES


Wireless at the speed of plasma

BEFORE you leave for work in the morning, your smartphone downloads the latest episode of a television series. Your drive to work is easy in spite of fog, thanks to in-car radar and the intelligent transport software that automatically guides you around traffic jams, allowing you to arrive in time for a presentation in which high-definition video is streamed flawlessly to your tablet computer in real time.

This vision of the future may not be far off, thanks to a new type of antenna that makes use of plasma consisting of only electrons. It could revolutionise high-speed wireless communications, miniature radar and even energy weapons.

Existing directional antennas that transmit high-frequency radio waves require expensive materials or precise manufacturing. But the new antenna, called Plasma Silicon Antenna, or PSiAN, relies on existing low-cost manufacturing techniques developed for silicon chips. It has been developed by Plasma Antennas of Winchester, UK.

PSiAN consists of thousands of diodes on a silicon chip. When activated, each diode generates a cloud of electrons - the plasma - about 0.1 millimetres across. At a high enough electron density, each cloud reflects high-frequency radio waves like a mirror. By selectively activating diodes, the shape of the reflecting area can be changed to focus and steer a beam of radio waves. This "beam-forming" capability makes the antennas crucial to ultrafast wireless applications, because they can focus a stream of high-frequency radio waves that would quickly dissipate using normal antennas.
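
The beam-forming idea (many small elements whose contributions add in phase only in the chosen direction) can be sketched with a generic uniform-linear-array calculation. This is not Plasma Antennas' actual design; the element count and half-wavelength spacing below are assumptions for illustration.

```python
import math

def array_gain(theta_deg, steer_deg, n=32, spacing_wl=0.5):
    """Normalised power gain of an n-element linear array steered to steer_deg."""
    theta = math.radians(theta_deg)
    steer = math.radians(steer_deg)
    # phase difference between adjacent elements, in radians
    psi = 2 * math.pi * spacing_wl * (math.sin(theta) - math.sin(steer))
    re = sum(math.cos(k * psi) for k in range(n))
    im = sum(math.sin(k * psi) for k in range(n))
    return (re * re + im * im) / n**2

# The main lobe follows the steering angle: gain is maximal there
# and falls off sharply away from it.
print(array_gain(20, 20))  # 1.0 in the steered direction
print(array_gain(0, 20))   # far smaller off-axis
```

Selectively activating diodes in the plasma antenna plays the role of choosing which elements contribute, reshaping where that in-phase direction points.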

"Beam-forming antennas are the key for enabling next-generation, high-data-rate indoor wireless applications," says Anmol Sheth, at Intel Labs in Seattle. "Without beam-forming antennas it would be difficult to scale to the levels of density of wireless devices we expect to have in future homes."

There are two types of plasma antenna: semiconductor or solid-state antennas, such as PSiAN, and gas antennas. Both could fit the bill, but solid-state antennas are favoured as they are more compact and have no moving parts.

That makes them attractive for use in a new generation of ultrafast Wi-Fi, known as Wi-Gig. Existing Wi-Fi tops out at 54 megabits of data per second, whereas the Wi-Gig standard is expected to go up to between 1 and 7 gigabits per second - fast enough to download a television programme in seconds. Wi-Gig requires higher radio wave frequencies, though: 60 gigahertz rather than the 2.4 GHz used by Wi-Fi. Signals at these frequencies disperse rapidly unless they are tightly focused, which is where PSiAN comes in.
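
A back-of-envelope check of the "download a television programme in seconds" claim, assuming a hypothetical 1.5 GB file (the file size is my assumption, not from the article):

```python
def download_seconds(size_gigabytes, rate_megabits_per_s):
    """Time to transfer a file, using decimal gigabytes and megabits."""
    bits = size_gigabytes * 8 * 1000**3
    return bits / (rate_megabits_per_s * 1e6)

print(f"54 Mbps Wi-Fi: {download_seconds(1.5, 54):.0f} s")    # ~222 s
print(f"7 Gbps Wi-Gig: {download_seconds(1.5, 7000):.1f} s")  # ~1.7 s
```

Roughly four minutes at the old Wi-Fi ceiling versus under two seconds at the top of the Wi-Gig range, which matches the article's claim.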

Ian Russell, business development director at Plasma Antennas, says that PSiAN is small enough to fit inside a cellphone. "Higher frequencies mean shorter wavelengths and hence smaller antennas," he says. "The antenna actually becomes cheaper at the smaller scales because you need less silicon."

The antennas shouldn't raise any health issues, as they are covered by existing safety standards. The narrow beam means there is less "overspill" of radiation than with existing omnidirectional antennas.

As well as speeding up Wi-Fi, plasma antennas could also allow cars to come with low-cost miniature radar systems to help drivers avoid collisions. Their millimetre wavelengths could be used to "see" through fog or rain, and another set of antennas could listen for real-time updates on traffic and road conditions.

This post has been edited by tzmmalaysia: Dec 14 2010, 10:20 AM
tzmmalaysia
post Dec 14 2010, 10:18 AM


BIOTECHNOLOGY


Injecting New Bone

An artificial bone-like material could speed up recovery from injury.

Today, a broken hip usually means surgery and extensive rehab. But what if all you needed was an injection and a shorter recovery period? That's the vision that inspires Thomas Webster, an associate professor of engineering at Brown University.

Webster has developed a nanomaterial that quickly solidifies at body temperature into a bone-like substance. This week, Brown announced a deal with medical device maker Audax Medical of Littleton, Massachusetts, to further develop the material and launch trials in animals.

The material contains the same nucleic acids as DNA, Webster says. Each molecule has two covalent bonds and links with other molecules to form a tube. Hence it's called a "twin-base linker." (Audax will develop it under the name Arxis.)

"It self-assembles into a nano structure, emulates natural tissue, solidifies quickly at body temperature, and can be made to match the mechanical properties of the tissue you inject it into," Webster says.

That sounds great, says tissue engineer Kevin Shakesheff, of the University of Nottingham in the United Kingdom, but it will also need to sustain weight like bone can.

He and his colleagues have developed a different material for the same purpose. "If you press down on our material, it's as strong as bone, but if you try and snap it, it's nowhere near as strong," he says.

Webster says he's confident that his material, which has so far only been tested in a laboratory, will be able to bear weight like bone.

"It will have that strength after solidifying in the body—after a couple of minutes," he says.

Ali Khademhosseini, an assistant professor of medicine at Brigham and Women's Hospital and Harvard Medical School in Boston says Webster's material sounds interesting, and there's plenty of room for innovation in the area of bone-like materials.

Today, metal plates are often inserted to provide strength and support while bones, such as the hip joint, slowly heal. But the metal degrades over time, and particularly in younger patients, it may eventually have to be replaced. Khademhosseini says tissue engineers are looking for materials that will better integrate with the body and last longer. If Webster succeeds in developing such a material to replace metal entirely, that would transform the field, he says.

Audax will begin testing Arxis in the hip and knee, according to company president and CEO Mark Johanson. Johanson hopes to have the first product ready for market in 2013. The company recently raised $1 million and plans to raise more capital soon, Johanson says. If Arxis is injectable on an outpatient basis, the sales volume will be high and the price relatively low, Johanson predicts. An injection is likely to run $1,000 to $1,500.

"The material can be processed and manufactured relatively inexpensively, which positions it well for the higher-volume-procedural market," Johanson says.
tzmmalaysia
post Dec 14 2010, 10:28 AM


SPACE SCIENCE


Curving Mirrors in Space

NASA's James Webb Space Telescope is a wonder of modern engineering. As the planned successor to the Hubble Space Telescope, even the smallest of parts on this giant observatory will play a critical role in its performance. A new video takes viewers behind the Webb's mirrors to investigate "actuators," one component that will help Webb focus on some of the earliest objects in the universe.

The video called "Got Your Back" is part of an on-going video series about the Webb telescope called "Behind the Webb." It was produced at the Space Telescope Science Institute (STScI) in Baltimore, Md. and takes viewers behind the scenes with scientists and engineers who are creating the Webb telescope's components. During the 3 minute and 12 second video, STScI host Mary Estacion interviewed people involved in the project at Ball Aerospace in Boulder, Colo. and showed the actuators in action.

The Webb telescope will study every phase in the history of our universe, ranging from the first luminous glows after the big bang, to the formation of solar systems capable of supporting life on planets like Earth, to the evolution of our own solar system.

Measuring light this distant requires a primary mirror 6.5 meters (21 feet 4 inches) across -- six times larger than the Hubble Space Telescope's mirror! Launching a mirror this large into space isn't feasible. Instead, Webb engineers and scientists devised a unique solution -- building 18 mirrors that will act in unison as one large mirror. These mirrors are packaged together into three sections that fold up -- much easier to fit inside a rocket. Each mirror is made from beryllium and weighs approximately 20 kilograms (46 pounds).

Once in space, getting these mirrors to focus correctly on faraway galaxies is another challenge entirely. Actuators, or tiny mechanical motors, provide the answer to achieving a single perfect focus.

The primary and secondary mirror segments are both moved by six actuators that are attached to the back of the mirrors. The primary segment has an additional actuator at the center of the mirror that adjusts its curvature. The third mirror segment remains stationary. Lee Feinberg, Webb Optical Telescope Element Manager at NASA's Goddard Space Flight Center in Greenbelt, Md., explained: "Aligning the primary mirror segments as though they are a single large mirror means each mirror is aligned to 1/10,000th the thickness of a human hair. This alignment has to be done at 50 degrees above absolute zero! What's even more amazing is that the engineers and scientists working on the Webb telescope literally had to invent how to do this."
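
Two of the numbers above can be sanity-checked. The human-hair thickness (about 70 micrometres) is an assumed typical value, not from the article; and a simple filled-circle estimate of the mirror areas slightly overshoots the quoted "six times", since it ignores the segmented, obstructed geometry of the real mirrors.

```python
webb_d, hubble_d = 6.5, 2.4               # primary mirror diameters, metres
area_ratio = (webb_d / hubble_d) ** 2     # naive filled-circle estimate
print(f"collecting-area ratio: {area_ratio:.1f}x")  # ~7.3x by this estimate

hair_m = 70e-6                            # assumed typical hair thickness
tolerance_m = hair_m / 10_000             # "1/10,000th the thickness of a hair"
print(f"alignment tolerance: {tolerance_m * 1e9:.0f} nm")  # ~7 nm
```

An alignment tolerance of a few nanometres, maintained at cryogenic temperatures, gives a sense of why the actuator engineering had to be invented from scratch.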

With the actuators in place, Brad Shogrin, Webb Telescope Manager at Ball Aerospace, Boulder, Colo., details the next step: attaching the hexapod (meaning six-footed) assembly and radius of curvature subsystem (ROC). "Radius of curvature" refers to the distance to the center point of the curvature of the mirror. Feinberg added: "To understand the concept in a more basic sense, if you change that radius of curvature, you change the mirror's focus."



This post has been edited by tzmmalaysia: Dec 15 2010, 10:45 AM
tzmmalaysia
post Dec 15 2010, 10:33 AM


APPLIED SCIENCES


BendDesk introduced: the desk that is a touch screen

A research project from the RWTH Aachen University Media Computing Group and Department of Work and Cognitive Psychology in Germany is developing a desk in which the entire curved surface is a multi-touch touch screen and display, removing the need for keyboard, mouse and separate display.

Most desks these days include a vertical display for digital information, such as a PC or laptop screen, and user interfaces and input devices on the horizontal surface, such as a keyboard and mouse. The desk surface would also often be covered with papers, and objects such as pens and coffee mugs.

The designers of BendDesk say the vertical and horizontal areas of the desk are separated and this makes it difficult to move documents from one surface to the other. They also point out that the user interacts differently with the vertical and horizontal areas of the desk, for example, interacting with objects on the vertical area with a mouse, and the horizontal with a pen. They say the project is their “vision of a future workspace that allows continuous interaction between both areas.”

The result of their vision is BendDesk, which has horizontal and vertical surfaces made of a single 104 cm x 104 cm piece of bendable acrylic. The entire area serves as both display and multi-touch screen, which enables the user to interact with virtual objects anywhere on the surface. The system uses two projectors, three cameras for touch input, and strips of infrared light emitting diodes (IR-LEDs) set into the sides of the desk surface.



The developers (Malte Weiss, Simon Voelker, and Professor Jan Borchers, head of the Media Computing Group, together with Christine Sutter from the Department of Work and Cognitive Psychology at RWTH Aachen University) say they took special care over the ergonomics: users can sit comfortably at the desk and can still place physical objects on it. However, they note in their paper that some test volunteers became fatigued after only a few minutes, that the volunteers were almost all males between the ages of 24 and 32, and that more work is needed to explore the ergonomic aspects.

Possible applications for the BendDesk include manipulation via multi-touch gestures of objects such as photographs, documents, or videos, and video games.

This post has been edited by tzmmalaysia: Dec 15 2010, 10:45 AM
tzmmalaysia
post Dec 15 2010, 10:35 AM


APPLIED SCIENCES


Chip provides its own power

Microchips that 'harvest' the energy they need from their own surroundings, without depending on batteries or mains electricity, will soon be possible: Dutch researchers from the University of Twente's MESA+ Institute for Nanotechnology, together with colleagues from the universities of Nankai (China) and Utrecht, have for the first time succeeded in manufacturing a microchip with an efficient solar cell placed on top of the microelectronics.

The researchers presented their findings at the International Electron Device Meeting that was held from 5 to 8 December in San Francisco.

The placement of a solar cell directly on top of the electronics means the autonomous chip does not need batteries. In this way, for example, a sensor chip can be produced, complete with the necessary intelligence and even an antenna for wireless communication. However, the chip's energy use must be well below 1 milliwatt, say the researchers. The chip can then even collect enough energy to operate indoors.

The simplest solution would seem to be to manufacture the solar cell separately and then fit it on top of the electronics, but this is not the most efficient production process. Instead, the researchers use the chip as a base and apply the solar cell to it layer by layer. This uses fewer materials and ultimately performs better. But the combination is not trouble-free: there is a risk that the steps in the production of the solar cell will damage the electronics so that they function less efficiently.

For this reason the researchers decided to use solar cells made of amorphous silicon or CIGS (copper - indium - gallium - selenide). The manufacturing procedure for these cells does not influence the electronics, and these types of solar cells also produce sufficient power, even in low light. Tests have shown that the electronics and the solar cells function properly, and the manufacturing process is also highly suitable for industrial serial production with the use of standard processes.
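
A rough energy budget shows why consumption "well below 1 milliwatt" matters indoors. All three numbers below (indoor optical power density, cell area, indoor conversion efficiency) are assumptions for illustration, not figures from the researchers.

```python
indoor_irradiance = 1.0   # W/m^2, assumed typical office lighting
cell_area = 10e-6         # 10 mm^2 of on-chip cell, expressed in m^2
efficiency = 0.10         # assumed conversion efficiency in low light

harvested_w = indoor_irradiance * cell_area * efficiency
print(f"harvested: {harvested_w * 1e6:.1f} microwatts")  # ~1 microwatt
```

Under these assumptions the chip harvests on the order of a microwatt indoors, so the electronics must sip power at well below the 1 mW level for the "no batteries" scenario to work.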
tzmmalaysia
post Dec 15 2010, 10:36 AM


ENERGY


Salty solar plant stores sun's heat

DRIVING through the baking landscape of Almería, it is no mystery why this Spanish province is home to a novel type of power station that generates electricity by harnessing the heat of the sun.

For over 20 years, the Plataforma Solar de Almería, sited on an almost rainless plain in the south of the province, has been at the forefront of research into solar thermal power generation. Helped by Spain's sunny climate and generous government subsidies, this has led to the construction of 10 solar thermal plants across the country in the last three years alone. Some 50 more are planned.

Within the centre, parabolic dishes lie strewn about like huge discarded toys, but the site is dominated by a giant white tower. Thousands of mirrors, known as heliostats, surround it, catching sunlight and focusing it onto a receiver on top of the tower. This concentrated sunlight produces superheated steam that drives a turbine to generate electricity.

Till now, the mainstay of solar thermal power has been the parabolic trough system, in which carefully shaped parabolic mirrors direct solar energy onto glass tubes containing a heat-absorbing fluid. One of the drawbacks of such installations is that to keep costs down they need large areas of flat ground.

With solar towers this is unnecessary. The heliostats can hug the land at different levels and be individually calibrated to beam their rays to the receiver atop the tower.

Another advantage of towers is that they can operate at higher temperatures. The heat-absorbing liquid used in the trough system is an oil that can only cope with temperatures up to 400 °C. With the tower there is no need for an intermediate fluid, and steam passing through the receiver is heated directly to around 550 °C. The higher temperature means the heat energy can be converted to electricity more efficiently.
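
The efficiency gain from hotter steam can be illustrated with the ideal Carnot limit, assuming a 40 °C cold side (my assumption; real plant efficiencies sit well below these ideal bounds).

```python
def carnot_efficiency(t_hot_c, t_cold_c=40.0):
    """Ideal heat-engine efficiency between two temperatures given in Celsius."""
    t_hot = t_hot_c + 273.15
    t_cold = t_cold_c + 273.15
    return 1.0 - t_cold / t_hot

print(f"trough oil, 400 degC: {carnot_efficiency(400):.1%}")   # ~53%
print(f"tower steam, 550 degC: {carnot_efficiency(550):.1%}")  # ~62%
```

Raising the hot-side temperature from 400 °C to 550 °C lifts the theoretical ceiling by roughly nine percentage points, which is the advantage the article alludes to.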

However, because the towers produce steam directly, they cannot store the heat they collect and so stop generating electricity once the sun sets. A new Spanish project, the Gemasolar tower near Seville, may have solved this problem. The 19-megawatt tower will be the first in the world to use a mixture of molten salts to transfer heat from the receiver on top of the tower to a heat exchanger where steam to drive the turbines is generated.

The salt mixture, made up of sodium and potassium nitrates, can operate at the high temperatures generated in a solar tower's receiver. Because the hot molten salt can be stored until the heat it contains is needed, the Gemasolar plant is expected to be able to run for 15 hours without sunlight. The best parabolic trough plants can only manage about half that time.
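
Taken at face value, the article's figures imply roughly the following storage arithmetic (a naive sketch that ignores ramping, losses, and part-load operation):

```python
def stored_mwh(plant_mw, hours):
    """Electrical energy delivered from storage, in MWh, at constant output."""
    return plant_mw * hours

gemasolar = stored_mwh(19, 15)     # 19 MW for 15 sunless hours
best_trough = stored_mwh(19, 7.5)  # same-size plant, about half the hours
print(gemasolar, best_trough)      # 285 vs 142.5 MWh
```

So the molten-salt tower would need to bank heat equivalent to almost 300 MWh of electrical output, roughly double what the best parabolic trough plants manage.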

If all goes well when Gemasolar launches next year, Spain should be able to profit from its scorching climate for some time to come.
tzmmalaysia
post Dec 15 2010, 10:38 AM


APPLIED SCIENCES


Researchers open the door to biological computers

Genetically modified cells can be made to communicate with each other as if they were electronic circuits. Using yeast cells, a group of researchers at the University of Gothenburg, Sweden, has taken a groundbreaking step towards being able to build complex systems in the future where the body’s own cells help to keep us healthy. The study was presented recently in an article in the scientific journal Nature.

“Even though engineered cells can’t do the same job as a real computer, our study paves the way for building complex constructions from these cells,” says Kentaro Furukawa at the University of Gothenburg’s Department of Cell- and Molecular Biology, one of the researchers behind the study. “In the future we expect that it will be possible to use similar cell-to-cell communication systems in the human body to detect changes in the state of health, to help fight illness at an early stage, or to act as biosensors to detect pollutants in connection with our ability to break down toxic substances in the environment.”

Combining biology and technology

Synthetic biology is a relatively new area of research. One application is the design of biological systems that are not found in nature. For example, researchers have successfully constructed a number of different artificial connections within genetically modified cells, such as circuit breakers, oscillators and sensors.

Some of these artificial networks could be used for industrial or medical applications. Despite the huge potential for these artificial connections, there have been many technical limitations to date, mainly because the artificial systems in individual cells rarely work as expected, which has a major impact on the results.

Biotechnology challenges the world of computers

Using yeast cells, the research team at the University of Gothenburg has now produced synthetic circuits based on gene-regulated communication between cells. The yeast cells have been genetically modified so that they sense their surroundings on the basis of set criteria and then send signals to other yeast cells by secreting molecules. The various cells can thus be combined like Lego bricks to produce more complicated circuits. Using a construction of yeast cells with different genetic modifications, it is possible to carry out more complicated "electronic" functions than would be the case with just one type of cell.
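
The "Lego brick" composition can be sketched in code: each engineered cell senses one or two signalling molecules in a shared medium and secretes another, so wiring cells together composes logic. The gate choices and molecule names below are invented for illustration; they are not the circuits from the Nature paper.

```python
class Cell:
    """A cell that senses input molecules and secretes an output molecule."""
    def __init__(self, senses, secretes, logic):
        self.senses = senses        # molecule names this cell detects
        self.secretes = secretes    # molecule name this cell emits
        self.logic = logic          # boolean function of the sensed inputs

    def respond(self, medium):
        inputs = [medium.get(m, False) for m in self.senses]
        if self.logic(*inputs):
            medium[self.secretes] = True

def run_circuit(cells, medium):
    """Let cells respond until the shared medium stops changing."""
    while True:
        before = dict(medium)
        for cell in cells:
            cell.respond(medium)
        if medium == before:
            return medium

# Two cells chained to compute A AND B: cell1 relays A as molecule X,
# cell2 secretes OUT only when both X and B are present in the medium.
cell1 = Cell(["A"], "X", lambda a: a)
cell2 = Cell(["X", "B"], "OUT", lambda x, b: x and b)

result = run_circuit([cell1, cell2], {"A": True, "B": True})
print(result.get("OUT", False))  # True: both inputs present
```

The point of the sketch is the composability: because cells communicate only through secreted molecules, new functions come from rewiring which molecules each cell type senses and secretes, not from re-engineering the cells themselves.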

The University of Gothenburg research team is headed by Professor Stefan Hohmann, and also comprises Kentaro Furukawa and Jimmy Kjellén.

The article Distributed biological computation with multicellular engineered networks, published in the scientific journal Nature on 8 December, was the result of a partnership with two Spanish research teams at Universitat Pompeu Fabra in Barcelona. The work forms part of the EU CELLCOMPUT project.
tzmmalaysia
post Dec 15 2010, 10:40 AM


BIOTECHNOLOGY


Sugar and Cornstarch Make Environmentally Safer Plastics

Environmentalists around the world agree -- plastic bags are choking our landfills and polluting our seas. Now a Tel Aviv University researcher is developing new laboratory methods using corn starch and sugar to help sustainable plastics -- those that biodegrade and are even tougher than those made from petrochemicals -- compete in the industry.

The answer to the problem, Prof. Moshe Kol of Tel Aviv University's School of Chemistry says, is a new variety of catalysts -- substances that initiate or sustain chemical reactions in other substances. His team has already developed several of these new catalysts, and it's currently expanding its activities in partnership with the University of Aachen in Germany and the University of Bath in England.

Prof. Kol is improving the process of making these "green" plastics stronger and more heat-resistant, allowing them to be used in a variety of ways, from the automotive industry to Starbucks coffee cups. The type of plastic the partners are working on, polylactic acid or PLA, is a kind of biodegradable plastic made from renewable plant sources such as corn, wheat or sugarcane. It's already used in bottles, bags, and film, and like polyester can even be woven into clothes.

Making stronger and biodegradable "Lego blocks"

The new catalysts enable the polymerization of lactide, which is the building block of a corn-based plastic. Conventional catalysts have limited control over the way in which these building blocks -- the corn-based molecules -- are assembled, and they may be toxic. But Prof. Kol's catalysts can be used more safely and efficiently, making "green" plastics more commercially feasible.

"The structure of these corn-based plastics depends on several parameters. The most important is the character of the building blocks, like Lego blocks, that hold the material together," says Prof. Kol. He aims to make sustainable corn-based plastics complement or replace petroleum-based plastics, which can take a millennium to degrade, leaving harmful pollutants in the soil and water. Corn-based plastic wouldn't cause any adverse health effects and would be expected to biodegrade in a compost bin in a matter of months.

Lord of the plastic ring

Plastics won't be going away any time soon, Prof. Kol suggests, pointing to the movement from concrete or stainless steel to plastic in a variety of industries. Replacing the steel manifold of a car with a plastic substitute would cut down on fuel consumption, and replacing a water pipe made of concrete or metal with one made of corrosion- and crack-resistant plastic may improve the quality of our drinking water.

For disposable items, the perfect plastic would be a polymer made from renewable resources that degrades back to its original non-toxic form. Plastics made from corn sugar are currently the most desirable in the industry.
The preliminary results of Prof. Kol's efforts are in, and the plastics that he and his team produce in the lab look and feel like polystyrene, which could be used for making drinking cups, for example. Rigid and transparent, the cups currently work only for liquids under 122 degrees Fahrenheit (50 °C), but they represent a first big step toward greening plastics and the chemical industry.
tzmmalaysia
post Dec 16 2010, 06:58 AM

ROBOTICS

Robot Arm Improves Performance of Brain-Controlled Device

During the experiment, monkeys used their brain signals to move a computer cursor (red circle) to randomly placed targets (squares). When visual and proprioceptive feedback were included, the monkey's hand was moved by a robotic exoskeleton. The additional sensory information resulted in the cursor hitting the target faster and more directly. (Credit: Courtesy, with permission: Hatsopoulos, et al. The Journal of Neuroscience 2010.)

Devices that translate brain activity into the movement of a computer cursor or an external robotic arm have already proven successful in humans. But in these early systems, vision was the only tool a subject could use to help control the motion.
Adding a robot arm that provided kinesthetic information about movement and position in space improved the performance of monkeys using a brain-machine interface in a study published December 14 in The Journal of Neuroscience. Incorporating this sense may improve the design of "wearable robots" to help patients with spinal cord injuries, researchers said.

"A lot of patients that are motor-disabled might have partial sensory feedback," said Nicholas Hatsopoulos, PhD, Associate Professor and Chair of Computational Neuroscience at the University of Chicago. "That got us thinking that maybe we could use this natural form of feedback with wearable robots to provide that kind of feedback."

In the experiments, monkeys controlled a cursor without actively moving their arm, via a device that translated activity in the primary motor cortex of their brain into cursor motion. When the monkeys also wore a sleeve-like robotic exoskeleton that moved their arm in tandem with the cursor, their control of the cursor improved: they hit targets faster and via straighter paths than without the exoskeleton.


"We saw a 40 percent improvement in cursor control when the robotic exoskeleton passively moved the monkeys' arm," Hatsopoulos said. "This could be quite significant for daily activities being performed by a paralyzed patient that was equipped with such a system."
When a person moves their arm or hand, they use sensory feedback called proprioception to control that motion. For example, if one reaches out to grab a coffee mug, sensory neurons in the arm and hand send information back to the brain about where one's limbs are positioned and moving. Proprioception tells a person where their arm is positioned, even if their eyes are closed.
But in patients with conditions where sensory neurons die out, executing basic motor tasks such as buttoning a shirt or even walking becomes exceptionally difficult. Paraplegic subjects in the early clinical trials of brain-machine interfaces faced similar difficulty in attempting to move a computer cursor or robot arm using only visual cues. Those troubles helped researchers realize the importance of proprioception feedback, Hatsopoulos said.

"In the early days when we were doing this, we didn't even consider sensory feedback as an important component of the system," Hatsopoulos said. "We really thought it was just one-way: signals were coming from the brain, and then out to control the limb. It's only more recently that the community has really realized that there is this loop with feedback coming back."

Reflecting this loop, the researchers on the new study also observed changes in the brain activity recorded from the monkeys when sensory feedback was added to the set-up. With proprioceptive feedback, the cell firing patterns of the primary motor cortex carried more information than in trials with only visual feedback, Hatsopoulos said, reflecting an improved signal-to-noise ratio.
The improvement seen from adding proprioception feedback may inform the next generation of brain-machine interface devices, Hatsopoulos said. Already, scientists are developing different types of "wearable robots" to augment a person's natural abilities. Combining a decoder of cortical activity with a robotic exoskeleton for the arm or hand can serve a dual purpose: allowing a paralyzed subject to move the limb, while also providing sensory feedback.
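The decoding step described here -- translating motor-cortex firing into cursor motion -- is typically a linear mapping fit by regression. The sketch below illustrates that idea on synthetic data; the neuron counts, firing rates, and weights are invented for illustration and are not from the study.

```python
import numpy as np

# Illustrative only: a generic linear decoder of the kind used in
# brain-machine interfaces, fit on synthetic "firing rate" data.
rng = np.random.default_rng(0)

n_neurons, n_samples = 20, 500
true_weights = rng.normal(size=(n_neurons, 2))  # each neuron's tuning to (vx, vy)
rates = rng.poisson(lam=5.0, size=(n_samples, n_neurons)).astype(float)
velocity = rates @ true_weights + rng.normal(scale=0.5, size=(n_samples, 2))

# Fit decoder weights by least squares: velocity ≈ rates @ W
W, *_ = np.linalg.lstsq(rates, velocity, rcond=None)

# Decode a cursor velocity from a new burst of activity
new_rates = rng.poisson(lam=5.0, size=(1, n_neurons)).astype(float)
decoded_v = new_rates @ W
print(decoded_v.shape)  # one (vx, vy) pair for the cursor
```

In a closed-loop system, the decoded velocity would drive the cursor (or exoskeleton) at each time step, while visual and proprioceptive feedback let the brain correct the next burst of activity.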

To benefit from this solution, a paralyzed patient must have retained some residual sensory information from the limbs despite the loss of motor function -- a common occurrence, Hatsopoulos said, particularly in patients with ALS, locked-in syndrome, or incomplete spinal cord injury. For patients without both motor and sensory function, direct stimulation of sensory cortex may be able to simulate the sensation of limb movement. Further research in that direction is currently underway, Hatsopoulos said.
"I think all the components are there; there's nothing here that's holding us back conceptually," Hatsopoulos said. "I think using these wearable robots and controlling them with the brain is, in my opinion, probably the most promising approach to take in helping paralyzed individuals regain the ability to move."
tzmmalaysia
post Dec 16 2010, 07:00 AM

BIOTECHNOLOGY

Plasma therapy: An alternative to antibiotics?

Cold plasma jets could be a safe, effective alternative to antibiotics to treat multi-drug resistant infections, says a study published this week in the January issue of the Journal of Medical Microbiology.

The team of Russian and German researchers showed that a ten-minute treatment with low-temperature plasma was not only able to kill drug-resistant bacteria causing wound infections in rats but also increased the rate of wound healing. The findings suggest that cold plasmas might be a promising method to treat chronic wound infections where other approaches fail.

The team from the Gamaleya Institute of Epidemiology and Microbiology in Moscow tested a low-temperature plasma torch against bacterial species including Pseudomonas aeruginosa and Staphylococcus aureus. These species are common culprits of chronic wound infections and are able to resist the action of antibiotics because they can grow together in protective layers called biofilms. The scientists showed not only that plasma was lethal to up to 99% of bacteria in laboratory-grown biofilms after five minutes, but also that it killed about 90% of the bacteria (on average) infecting skin wounds in rats after ten minutes.
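The kill rates quoted above can be restated as log reductions, the standard disinfection metric (99% killed is a 2-log reduction, 90% a 1-log reduction). A minimal sketch of the conversion:

```python
import math

def log_reduction(kill_fraction):
    """Log10 reduction corresponding to a kill fraction (e.g. 0.99 -> 2.0)."""
    surviving = 1.0 - kill_fraction
    return -math.log10(surviving)

print(log_reduction(0.99))  # 5-minute in vitro biofilm result: a 2-log reduction
print(log_reduction(0.90))  # 10-minute in vivo wound result: a 1-log reduction
```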

Plasmas are known as the fourth state of matter after solids, liquids and gases and are formed when high-energy processes strip atoms of their electrons to produce ionized gas flows at high temperature. They have an increasing number of technical and medical applications, and hot plasmas are already used to disinfect surgical instruments.

Dr Svetlana Ermolaeva who conducted the research explained that the recent development of cold plasmas with temperatures of 35-40°C makes the technology an attractive option for treating infections. "Cold plasmas are able to kill bacteria by damaging microbial DNA and surface structures without being harmful to human tissues. Importantly we have shown that plasma is able to kill bacteria growing in biofilms in wounds, although thicker biofilms show some resistance to treatment."

Plasma technology could eventually represent a better alternative to antibiotics, according to Dr Ermolaeva. "Our work demonstrates that plasma is effective against pathogenic bacteria with multiple-antibiotic resistance - not just in Petri dishes but in actual infected wounds," she said. "Another huge advantage to plasma therapy is that it is non-specific, meaning it is much harder for bacteria to develop resistance. It's a method that is contact free, painless and does not contribute to chemical contamination of the environment."
tzmmalaysia
post Dec 16 2010, 07:02 AM

ROBOTICS

PR2 Robot Community Continues Expansion in Asia, Europe and North America

Willow Garage announced today that the PR2 community has expanded to 16 leading research labs worldwide. Scientists and engineers at four leading research institutions will now be able to explore the innovative capabilities of personal robots at a much faster pace because of the PR2 robot platform they have purchased from Willow Garage. In the past, researchers had to spend a substantial amount of their time building a robot and its operating system before they could start designing and deploying applications for personal robotics use in homes and offices. The four institutions are:

CNRS Laboratory of Analysis and Architecture of Systems (LAAS-CNRS) in Toulouse, France;
George Washington University in Washington, DC;
Samsung Electronics in Suwon, Korea; and
University of Washington in Seattle, WA.

The goal at Willow Garage is to lay the groundwork for a revolution in personal robotics by providing the hardware and software platforms upon which robot scientists can develop applications. The combination of PR2 and the open source Robot Operating System (ROS) means that researchers benefit from immediate time to innovation. Right out of the box, the PR2 and ROS provide a complete platform for research and development in the personal robotics field.
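Much of that "immediate time to innovation" comes from ROS's messaging model: nodes publish and subscribe to named topics without knowing about each other. The toy bus below sketches that pattern in plain Python; it mimics the concept only (real ROS nodes use the rospy/roscpp APIs), and the topic name is just an example.

```python
# A toy publish/subscribe bus illustrating ROS's core messaging pattern.
# Conceptual sketch only -- not the actual rospy API.
from collections import defaultdict

class TopicBus:
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        # Every subscriber to this topic receives the message
        for callback in self._subscribers[topic]:
            callback(message)

bus = TopicBus()
received = []
# A "node" listening for base velocity commands, as a robot's motor driver might
bus.subscribe("/base_controller/command", received.append)
bus.publish("/base_controller/command", {"linear_x": 0.25, "angular_z": 0.0})
print(received)
```

The decoupling is the point: a researcher can swap in a new perception or planning node, and as long as it publishes to the same topics, the rest of the robot's software keeps working unchanged.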
PR2 was first delivered to eleven leading robotics research institutions at no cost in May 2010. In September, Willow Garage announced that the PR2 was available for purchase.

"The PR2 has only been commercially available a short time, but we are proud to say that there are already PR2 robots on three continents," according to Steve Cousins, President and CEO of Willow Garage. "It's inspiring to see the PR2 community grow so quickly. All of us at Willow Garage are looking forward to hearing about the research conducted at Samsung Electronics, UW, LAAS-CNRS and GWU."

One PR2 has already arrived at Samsung Electronics Co., Ltd. in Suwon, Korea. Samsung Electronics, the world's largest electronics company, is using the PR2 to enhance their existing robotics research. South Korea is one of the most technologically advanced countries in the world and one that has enthusiastically embraced personal robotics. The country is hoping to put a robot in every home by the year 2020.

This post has been edited by tzmmalaysia: Dec 16 2010, 07:05 AM
tzmmalaysia
post Dec 16 2010, 10:58 AM

APPLIED SCIENCES

Take rambling to the next level with holographic digital maps


It wasn't so long ago that those wanting to visualize the landscape around them had to use a topographic map and a fair bit of imagination. Nowadays we are spoilt by the immersive opportunities offered by the likes of Google Earth, or even GPS technology, but there's nothing quite like a holographic image for recreating a 3D representation of the surrounding terrain on a 2D surface. While the digital holographic prints produced by Zebra Imaging are not exactly as pocket-friendly as maps, they are quite simply stunning.

With the speed at which technology is moving, it won't be too long before those of us who like to wander through the streets and ramble through the countryside will be able to reach into our pockets for a lifelike three-dimensional representation of the terrain in front of us. The cutting edge combination of laser, optics and image processing technology used by Zebra Imaging already allows for the creation of lifelike 3D holographic visualizations in a format that can be rolled up and transported around.

The company takes three-dimensional digital data from sources like CAD models, laser scans and satellite imagery and produces a single portable, film-based hologram that can be made to jump out from the surface with the aid of a halogen or LED light source. Zebra explains that "the light is reflected and controlled by hogels [holographic elements] and combines and emerges from the hologram surface in the same way it would if a solid physical model were actually there."

The end product can bring a landscape to life or allow engineers to view component designs from a number of different perspectives... or just wow an audience. The technology offers a three-dimensional viewing experience of static objects or scenes without the need for special glasses, and is said to allow onlookers to enjoy 360 degrees of image continuity.

Zebra Imaging's full-color or monochrome solutions benefit from protective coatings for durability and can also be annotated. Up to four images can be combined in one holographic print and, being actual physical products, there's no need to worry about keeping up to date with firmware updates on computerized equipment.

The company says that thousands of prints have been used for visualization and defense planning by the U.S. military. Mapping and surveying, engineering and construction, and architecture and design are amongst Zebra's main production markets.


tzmmalaysia
post Dec 17 2010, 10:03 AM

ENERGY

Seaweed as Biofuel? Metabolic Engineering Makes It a Viable Option

Is red seaweed a viable future biofuel? Now that a University of Illinois metabolic engineer has developed a strain of yeast that can make short work of fermenting galactose, the answer is an unequivocal yes.

"When Americans think about biofuel crops, they think of corn, miscanthus, and switchgrass. In small island or peninsular nations, though, the natural, obvious choice is marine biomass," said Yong-Su Jin, a U of I assistant professor of microbial genomics and a faculty member in its Institute for Genomic Biology.
Producers of biofuels made from terrestrial biomass crops have had difficulty breaking down recalcitrant fibers and extracting fermentable sugars. The harsh pretreatment processes used to release the sugars also resulted in toxic byproducts, inhibiting subsequent microbial fermentation, he said.
But marine biomass can be easily degraded to fermentable sugars, and production rates and range of distribution are higher than terrestrial biomass, he said.
"However, making biofuels from red seaweed has been problematic because the process yields both glucose and galactose, and until now galactose fermentation has been very inefficient," he said.

But Jin and his colleagues have recently identified three genes in Saccharomyces cerevisiae, the microbe most often used to ferment the sugars, whose overexpression increased galactose fermentation by 250 percent when compared to a control strain.

"This discovery greatly improves the economic viability of marine biofuels," he said. Overexpression of one gene in particular, a truncated form of the TUP1 gene, sent galactose fermentation numbers soaring. The new strain consumed both sugars (glucose and galactose) almost three times faster than the control strain -- 8 versus 24 hours, he said.

"When we targeted this protein, the metabolic enzymes in galactose became very active. We can see that this gene is part of a regulating or controlling system," he said. According to Jin, galactose is one of the most abundant sugars in marine biomass, so its enhanced fermentation will be industrially useful for seaweed biofuel producers.
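The reported speed-up (complete sugar consumption in 8 hours versus 24) corresponds to roughly a threefold average consumption rate. A first-order decay sketch, with hypothetical rate constants and sugar loading chosen only to match those completion times, makes the comparison concrete:

```python
import math

def sugar_remaining(initial_g_per_l, rate_per_h, hours):
    """First-order decay model of sugar consumption (illustrative only)."""
    return initial_g_per_l * math.exp(-rate_per_h * hours)

# Hypothetical rate constants chosen so ~99% of the sugar is gone at the
# reported completion times: 24 h (control) vs 8 h (engineered strain).
k_control = math.log(100) / 24
k_engineered = math.log(100) / 8

print(sugar_remaining(20.0, k_control, 24))    # ~0.2 g/L left after 24 h
print(sugar_remaining(20.0, k_engineered, 8))  # ~0.2 g/L left after 8 h
print(k_engineered / k_control)                # the threefold rate difference
```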

Marine biomass is an attractive renewable source for the production of biofuels for three reasons:

- production yields of marine plant biomass per unit area are much higher than those of terrestrial biomass

- marine biomass can be depolymerized relatively easily compared to other biomass crops because it does not contain recalcitrant lignin and cellulose crystalline structures

- the rate of carbon dioxide fixation by marine biomass is much higher than by terrestrial biomass, making it an appealing option for sequestration and recycling of carbon dioxide, he said.
ScienceDaily

tzmmalaysia
post Dec 17 2010, 10:05 AM

APPLIED SCIENCES

A Cheaper Way to Clean Water

Oasys Water, a company that has been developing a novel, inexpensive desalination technology, showed off a new development facility in Boston this week. The company, which has been demonstrating commercial-scale components of its system in recent months, plans to begin testing a complete system early next year and to start selling the systems by the end of 2011.

Currently, desalination is done mainly in one of two ways: water is either heated until it evaporates (called a thermal process) or forced through a membrane that allows water molecules but not salt ions to pass (known as reverse osmosis). Oasys's method uses a combination of ordinary (or forward) osmosis and heat to turn sea water into drinking water.

On one side of a membrane is sea water; on the other is a solution containing high concentrations of carbon dioxide and ammonia. Water naturally moves toward this more concentrated "draw" solution, and the membrane blocks salt and other impurities as it does so. The resulting mixture is then heated, causing the carbon dioxide and ammonia to evaporate. Fresh water is left behind, and the ammonia and carbon dioxide are captured and reused.
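Forward osmosis works because the draw solution's osmotic pressure exceeds that of sea water, so water crosses the membrane without applied pressure. The van 't Hoff estimate, pi = iMRT, gives a feel for the magnitudes; the concentrations below are illustrative assumptions, not Oasys's figures.

```python
# Van 't Hoff estimate of osmotic pressure: pi = i * M * R * T.
# Concentrations are illustrative assumptions, not Oasys's actual values.
R = 0.08314  # gas constant, L·bar/(mol·K)
T = 298.15   # temperature, K (25 °C)

def osmotic_pressure_bar(molarity, vant_hoff_i):
    return vant_hoff_i * molarity * R * T

seawater = osmotic_pressure_bar(0.6, 2)  # ~0.6 M NaCl, dissociating into 2 ions
draw = osmotic_pressure_bar(3.0, 2)      # a much more concentrated draw solution

print(round(seawater, 1))  # roughly 30 bar
print(round(draw, 1))
print(draw > seawater)     # so water flows toward the draw solution
```

The same estimate shows why reverse osmosis is power-hungry: a pump must exceed that roughly 30-bar seawater osmotic pressure mechanically, whereas forward osmosis lets the concentration gradient do the work and spends low-grade heat on recovering the solutes instead.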

Oasys says the technology could make desalination economically attractive not only in arid regions where there are no alternatives to desalination, but also in places where fresh water must be transported long distances. In California, for example, a massive aqueduct system now transports water from north to south.

"The cost will be low enough to make aqueduct and dam projects look expensive in comparison," says Oasys cofounder and chief technology officer Robert McGinnis, who invented the company's core technology. The process could also require substantially less power than other desalination options. "The fuel consumption and carbon emissions will be lower than those of almost any other water source besides a local lake or aquifer," he says.

The key to making the process work was developing a draw solution with easy-to-remove solutes, something that was done at a lab at Yale University. "Others have tried to develop other solutes for desalination," McGinnis says, "but they haven't been successful so far."

The next-biggest technical challenge has been developing the membrane. The membranes used in reverse osmosis are unsuitable for this process because they work best at high pressures. Forward osmosis doesn't use high pressures, so water moves through these membranes too slowly for the system to be practical. McGinnis and colleagues reëngineered the membranes, reducing the thickness of the supporting material and increasing its porosity without changing a very thin layer that blocks salts. These changes enabled water to pass through 25 times faster, McGinnis says.

The system uses far less energy than thermal desalination because the draw solution has to be heated only to 40 to 50 °C, McGinnis says, whereas thermal systems heat water to 70 to 100 °C. These low temperatures can be achieved using waste heat from power plants. Thermal-desalination plants are often located at power plants now, but it takes extra fuel to generate enough heat for them. The new system, on the other hand, could run on heat that otherwise would have been released into the atmosphere.

The Oasys system requires just one-tenth as much electricity as a reverse-osmosis system, McGinnis says, because water doesn't have to be forced through a membrane at high pressure. That's a crucial source of savings, since electricity can account for nearly half the cost of reverse-osmosis technology. Not working with pressurized water also decreases the cost of building the plant—there is no need for expensive pipes that can withstand high pressures. The combination of lower power consumption and cheaper equipment results in lower overall costs.

The Oasys system will not help everyone. For example, it is unlikely to do much for farmers; although they account for about 80 percent of fresh-water consumption, it wouldn't be cost-effective for them, in part because farms are often located closer to aquifers and other water supplies than are large coastal cities such as L.A. In addition, "there's a minimum amount of energy needed to strip salt ions out of water," says Peter Gleick, president of the Pacific Institute for Studies in Development, Environment, and Security in Oakland, California. "I don't think it will ever be cheap enough for irrigation." In agricultural areas where water is scarce, he says, it's cheaper to switch to better irrigation practices.

As coastal cities grow, however, so will their need for desalination services, says Kenneth Herd, director of the water supply program at the Southwest Florida Water Management District. "It's not a matter of if," he says, "but a matter of when."

MIT Tech Review

tzmmalaysia
post Dec 17 2010, 10:06 AM

APPLIED SCIENCES

New Recycling Process Could Recycle 100% of Plastic Packaging

One of the most disappointing aspects of Christmas (besides the crazed consumerism) is the piles of plastic packaging left over after the morning commotion. In fact, Science Daily reports that each American consumes an average of 120 grams of plastic wrapping on Christmas gifts, most of which is not recyclable. But that might just change (the recycling part, not the wasteful plastic packaging) thanks to a new technique for processing practically any type of plastic. Now that is some Christmas wish come true...

According to Science Daily, researchers at the University of Warwick have figured out a way to deal with the plastic packaging that so far can't go in the recycling bin. Where normally only about 12% of plastic waste is really recycled (though that number might actually be higher, at least in the US; for example, plastic bottle recycling hovers at around 25%, and recycling businesses have tripled in the US in recent years), the new process could deal with 100% of plastics.

University of Warwick reports, "The Warwick researchers have devised a unit which uses pyrolysis (using heat in the absence of oxygen to decompose materials) in a "fluidised bed" reactor. Tests completed in the last week have shown that the researchers have been able to literally shovel into such a reactor a wide range of mixed plastics, which can then be reduced down to useful products, many of which can then be retrieved by simple distillation."

So far, the process can reclaim wax, original monomers, terephthalic acid used to make PET plastic products, methylmetacrylate for making acrylic sheets, carbon for making paint pigments and tires, and even char.

If the simple process proves to be this useful, it could mean far more effective, cheaper and profitable recycling in cities. Right now the researchers are working on scaling up the process for large-scale reactor units that can be used by cities. Lead researcher Jan Baeyens says that the team envisions a large-scale plant capable of dealing with 10,000 tons of plastic a year being able to generate £5 million ($7.8 million) worth of recycled chemicals.
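Baeyens' figures work out to a recoverable-chemical value of £500 per ton of waste plastic, a quick check worth making explicit:

```python
# Back-of-the-envelope check of the reported economics (illustrative only):
# a plant handling 10,000 tons of mixed plastic a year generating
# £5 million worth of recovered chemicals.
plant_tons_per_year = 10_000
revenue_gbp = 5_000_000

value_per_ton = revenue_gbp / plant_tons_per_year
print(value_per_ton)  # £500 of recoverable chemicals per ton of waste plastic
```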

Dealing with a variety of plastics types all in one recycling plant seems too good to be true. If the technology does pan out, however, it could be more like a miracle for recyclers, landfills, the environment, and everyone having to put plastic in their trash bin rather than their recycling bin.

Treehugger

tzmmalaysia
post Dec 17 2010, 10:07 AM

ROBOTICS

Medical robotics to improve heart surgery

In a truly interdisciplinary effort, a team of biomedical scientists and engineers from the University of Houston (UH) and physicians from The Methodist Hospital Research Institute (TMHRI) are collaborating to develop a minimally invasive platform for image-guided, robot-assisted surgery on beating hearts.

Supported by a $1.4 million grant from the National Science Foundation (NSF), researchers in UH's Medical Robotics Laboratory (MRL) are creating a robotic system for cardiothoracic surgeries that would not only automate them but also increase their precision. This surgical system, guided by magnetic resonance imaging (MRI), incorporates a highly flexible robotic device that maneuvers through the body, past organs and within the beating heart, resulting in minimal trauma to patients.

Funded through NSF's cyber-physical system (CPS) program, the project is led by Nikolaos V. Tsekos, the director of the MRL, associate professor of computer science at UH and principal investigator on this grant. The award is funding the development of methodology for performing surgical procedures using robotic devices with real-time MRI image guidance. In particular, Tsekos and his team are pursuing the development of something called MIROS – Multimodal Image-guided RObot-assisted Surgeries – that is a novel CPS for performing heart surgeries.

"MIROS is an integrated system composed of multiple cyber and physical elements, ranging from computer algorithms to biomedical sensors to robotic manipulators, that coordinate their operation to achieve a task," Tsekos said. "It is the final system that integrates the robot, the MRI scanner, computers and software, and the tactile and visualization interfaces with the patient and the operator. MIROS is modular and upgradeable, so we can add new pieces of software, reprogram the existing ones, change parts of the robot or even change out the entire layout of the robot, according to the needs of the specific surgery."

Three critical features of MIROS are that it can combine, merge and use multimodality imaging techniques that carry complementary types of information; enable the performance of surgeries on a beating heart without having to stop its natural motion; and immerse the operator into the procedure through a visual-tactile interface. With multimodality imaging, such as MRI, the team aims to provide surgeons with real-time information to enhance diagnosis and in-situ assessment before, during and after the surgery.

"For performing the procedures on a beating heart, a special robotic device is under construction," Tsekos said. "Imagine that instead of using a straight surgical tool you use one that looks like a snake, needing to maneuver it through a maze of moving walls that you must avoid. With robots we can address the issues of limited access to the patient in the MRI machine, as well as increase maneuverability."

The first version of MIROS is specially designed for procedures that involve entering the heart through the myocardium, which is the muscular tissue of the heart. Without limiting the versatility and generality of the system, the MIROS team has selected the challenging clinical paradigm of an alternate aortic valve replacement, with hopes it will result in a quicker procedure, less trauma and faster recovery.

Robots are nothing new in laparoscopic procedures, but manual or robot-assisted laparoscopy offers visualization inside the patient only by means of cameras and video. Tsekos' plan is to incorporate real-time image guidance that continuously collects images from the targeted area and guides the robot. This takes these surgeries from keyhole-type viewing with cameras to a larger, 3-D field of view with imaging.
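The core control problem here -- keeping a tool locked onto tissue that moves with every heartbeat -- can be sketched as a simple tracking loop. The proportional controller below follows a sinusoidal stand-in for myocardial motion; the gain, heart rate, and amplitude are hypothetical, not parameters of the MIROS system.

```python
import math

def track(heartbeat_hz=1.2, amplitude_mm=8.0, gain=0.5, dt=0.01, steps=400):
    """Proportional tracking of a sinusoidally moving target (toy model)."""
    tool = 0.0
    max_error_late = 0.0
    for n in range(steps):
        t = n * dt
        # Sinusoid standing in for myocardial wall motion
        target = amplitude_mm * math.sin(2 * math.pi * heartbeat_hz * t)
        error = target - tool
        tool += gain * error  # close a fraction of the gap each control step
        if t > 2.0:           # measure error after transients settle
            max_error_late = max(max_error_late, abs(error))
    return max_error_late

print(track())                            # residual tracking error, in mm
print(track(gain=0.9) < track(gain=0.2))  # a higher gain tracks more tightly
```

In the real system the "target" would come from real-time MRI tissue tracking rather than a formula, and the controller would have to respect the robot's dynamics and safety limits, but the feedback structure is the same.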

"The system we are developing is comprised of many dedicated, specially designed components, incorporating not only the robot, but also tools to collect and process the MRI data from patients during surgery. This ranges from real-time tissue tracking to augmented reality and tactile interfacing for surgeons," Tsekos said. "These are enabling technologies and methodologies that then can be adopted and used for other types of surgeries or with other existing or developing robotic systems."

To reduce the number of animals used in this research, the team will perform all of the work on this particular NSF grant without animal subjects. To do this, the MRL group is creating what Tsekos called an "actuated heart phantom," a computer-controlled machine that has 27 motors that move different structures to mimic the anatomy and motion of the beating heart.

"We will use our actuated phantom inside an MRI scanner, as well as in the lab, to replicate any desired type of cardiac motion, such as simulating the conditions of arrhythmias and breathing patterns," Tsekos said. "We will test and investigate the operation of each component and the MIROS system as a whole by simulating surgeries on this phantom instead of animals for the first phase of this project. Additionally, our team will work with Methodist on another version of this heart phantom onto which we will ultimately attach a heart from a porcine cadaver acquired from a slaughterhouse. Once attached, the simple push of a button will replicate the heart's real motion in the lab and inside an MRI scanner."

In addition to Tsekos' MRL group, the research team includes a number of other professors from UH. From the computer science department, Zhigang Deng leads the effort for the novel visual-tactile interface, while Ioannis Kakadiaris and Shishir Shah work on new image-processing concepts. Mechanical engineers Karolos Grigoriadis and Javad Mohammadpour are focused on the control and construction of the tactile device. The TMHRI surgeons, Mark Davies, M.D., Barbara L. Bass, M.D., and Dipan Shah, M.D., also play a critical role in this project. Davies' focus is on the design of particular cardiac robotic applications, Bass' focus is on the overall design of the system in view of its potential use for surgeries in other organs and Shah's contribution is in the area of cardiac MRI.

EurekAlert


tzmmalaysia
post Dec 17 2010, 10:09 AM

APPLIED SCIENCES

New Building Material Offers New Design Options

Films instead of walls. This is an idea that fascinates architects all over the world. The Eden Project in Southern England, the National Aquatics Center built for the swimming events at the Olympics in Beijing and the Allianz Arena in Munich are only three examples of what you can make from plastic sheets. Ethylene tetrafluoroethylene (ETFE), a transparent membrane, is especially popular because it enables buildings that shine in all colors, as in Munich and Beijing. But we are not just talking about colors. The new foil can also be used to intelligently upgrade existing buildings -- regulating heat, cooling and light precisely according to need. Experts see film construction as a market poised for the future.
Whether this market develops, and if so how quickly, is not a question of taste but of the technical -- and financial -- possibilities. Films will have to be low-cost, easy to process and free of health hazards for them to have a chance in the international construction business. This is the target that six Fraunhofer institutes are working jointly toward in the Multifunctional Membrane Cushion Construction project.

Engineers have been able to use coatings to change the properties of ETFE foils specifically. For example, membrane cushions with an inner coating of tungsten trioxide turn blue when they come into contact with hydrogen and lose their color if the cushions are filled with oxygen. Thus the passage of light can easily be regulated. Project coordinator Andreas Kaufmann of the Fraunhofer Institute for Building Physics (IBP) states "you could use a foil such as this to cover the entire façade of a house and have light pass depending upon sunlight conditions." The researchers were also able to solve another problem. To date, ETFE membranes have hardly been able to create a heat barrier, but a coat of paper-thin (and therefore transparent) layers of aluminum and paint makes sure that heat radiation is effectively reflected. Kaufmann explains that "the challenge was overcoming the anti-adhesive properties of the membrane. ETFE is related to the anti-stick substance Teflon and hardly reacts with other substances chemically. This is why the surface of the foil first has to be pretreated chemically before coating." In the meantime, the researchers have come up with not only heat-insulating but also antibacterial layers that inhibit the growth of mold and yeasts, which form ugly black coatings.

As Robert Hodann, the CEO at film manufacturer Nowofol and industrial partner of the research project, puts it, "we believe that ETFE will emerge as a strong market of its own. The captivating thing about ETFE foil is its transparency combined with its great strength -- no other plastic membrane can compete." For instance, it will be possible to make LED façades with ETFE foil behind which thousands of light-emitting diodes can be installed. This would be an easy way to transform facades into gigantic illuminated screens.

ScienceDaily

TStzmmalaysia
post Dec 18 2010, 08:58 AM



ENERGY


Waste-to-Energy: a mountain of trash, or a pile of energy?

Collect trash, burn it, and then generate electricity. The technology is called Waste-to-Energy, and it uses our waste streams to produce electricity that can be cleaner than the average kilowatt-hour (kWh) generated in the United States today. A mountain of trash becomes a pile of energy. But, will this domestic renewable resource be able to move beyond its “dirty” reputation to become a larger portion of the U.S. electricity supply?

European countries have embraced Waste-to-Energy (WTE) as a way to reduce landfill growth as well as dependence on imported fuels. Today, about 400 WTE facilities are operating in Europe, using municipal solid waste as their primary fuel source. In Denmark alone, 29 WTE plants are currently in operation with 10 more on the way. In Sweden, the city of Kristianstad has essentially weaned itself off of fossil fuels in just ten years by replacing these energy sources with the city’s own waste.

In the United States, only 86 plants use municipal solid waste as fuel. The amount of trash that these facilities process is dropping -- by more than 7% from 2006 to 2008. And very few new facilities are being discussed.

Why is the United States so far behind?

On the surface, WTE looks like a feasible option in the United States’ search for renewable and domestic energy resources. Today’s WTE technology is safe – assuming you can separate out hazardous materials (like batteries) in the incoming fuel (trash). In areas where tipping fees are high – primarily major metropolitan areas – these facilities can be economically viable without government subsidies.

If deployed nationwide, WTE facilities could reduce the volume of the more than 250 million tons of material being thrown away each year by up to 90%. If burned properly, the remaining 10% would be mostly inert ash. With proper filtering systems in place, WTE facilities can meet and even exceed federal air emissions standards. But, despite these positive environmental attributes, the concept of burning trash does not appear able to shake its "dirty" image in America.

In Austin, Texas, Waste-to-Energy has been a "dirty" word for more than 30 years. In 1984, voters authorized bond money for the construction of a WTE plant on the edge of the city. But vocal opposition from environmental activists and resulting runaway project costs led to a showdown that would scuttle the project 8 years later. The bad blood resulting from this ordeal still runs thick today. Even when faced with a 30% renewable energy requirement (WTE qualifies), Austin's city government and local utility (Austin Energy) did not seriously consider any WTE proposals. Instead, city residents receive renewable power primarily from wind farms in West Texas. They will soon buy power from a 100 MW biomass facility that will burn wood chips (not trash), located hundreds of miles to the east of Austin.

A different story has played out on the East Coast – home to high tipping fees and limited land availability for new landfills. In 1984, while Austinites were arguing over the environmental benefits (and costs) of a WTE facility, Baltimore residents were celebrating the opening of the city’s own Waste-to-Energy plant. Still in operation today – and profitable without government subsidies – the Refuse and Energy Systems Company (RESCO) facility processes about 2,250 tons of trash per day. The facility produces enough electricity to power 40,000 homes, as well as steam for the heating and cooling of local commercial buildings. According to RESCO employees, this no-sort facility has also improved local water quality, because of strict discharge requirements for power generation facilities.
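The RESCO figures above imply a useful rule of thumb for energy recovered per ton of trash. A quick back-of-the-envelope check, noting that the per-home consumption below is an assumed U.S. average rather than a number from the article:

```python
# Energy yield implied by the RESCO plant figures:
# 2,250 tons of trash per day powering ~40,000 homes.
TONS_PER_DAY = 2250
HOMES_POWERED = 40000
KWH_PER_HOME_PER_DAY = 30  # assumed average U.S. household consumption

daily_output_kwh = HOMES_POWERED * KWH_PER_HOME_PER_DAY
kwh_per_ton = daily_output_kwh / TONS_PER_DAY

print(f"Daily output: {daily_output_kwh / 1e6:.1f} GWh")
print(f"Energy recovered per ton of trash: {kwh_per_ton:.0f} kWh")
```

That works out to roughly 500 kWh per ton, which is why WTE economics hinge so heavily on local tipping fees and electricity prices.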

So, why were these experiences so different?

Economics and land availability played significant roles in the decisions in Austin and Baltimore. But, these experiences also showed how WTE’s image problem can be a significant roadblock. Its “dirty” stigma has and could continue to prevent the expanded use of this technology in the United States even under favorable economic conditions. Until Americans become tired of dedicating space for new landfills, the country’s mountain of trash is unlikely to become a pile of energy. And waste will remain an untapped domestic renewable energy resource.

Scientific American

TStzmmalaysia
post Dec 18 2010, 09:00 AM



APPLIED SCIENCES


Science's breakthrough of the year: The first quantum machine

A mechanical device that operates in the quantum realm tops the journal's list of advances in 2010.

Until this year, all human-made objects have moved according to the laws of classical mechanics. Back in March, however, a group of researchers designed a gadget that moves in ways that can only be described by quantum mechanics—the set of rules that governs the behavior of tiny things like molecules, atoms, and subatomic particles. In recognition of the conceptual ground their experiment breaks, the ingenuity behind it and its many potential applications, Science has called this discovery the most significant scientific advance of 2010.

Physicists Andrew Cleland and John Martinis from the University of California at Santa Barbara and their colleagues designed the machine—a tiny metal paddle of semiconductor, visible to the naked eye—and coaxed it into dancing with a quantum groove. First, they cooled the paddle until it reached its "ground state," or the lowest energy state permitted by the laws of quantum mechanics (a goal long-sought by physicists). Then they raised the widget's energy by a single quantum to produce a purely quantum-mechanical state of motion. They even managed to put the gadget in both states at once, so that it literally vibrated a little and a lot at the same time—a bizarre phenomenon allowed by the weird rules of quantum mechanics.
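The cooling step above can be put in rough numbers: the paddle reaches its ground state only when thermal energy kT falls well below one quantum of its vibration, hf. The resonator frequency used here (~6 GHz) is an assumption in the regime reported for this class of experiment, not a figure from the article:

```python
# Temperature scale at which thermal energy equals one vibrational quantum.
h = 6.626e-34   # Planck constant, J*s
kB = 1.381e-23  # Boltzmann constant, J/K

f = 6e9  # Hz; assumed resonator frequency for this kind of device
T_quantum = h * f / kB  # temperature where kT equals one quantum h*f
print(f"kT matches one quantum at about {T_quantum * 1000:.0f} mK")
```

Since the device must sit well below that crossover temperature, dilution-refrigerator territory of tens of millikelvin is required -- which is why reaching the ground state took so long.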

Science and its publisher, AAAS, the nonprofit science society, have recognized this first quantum machine as the 2010 Breakthrough of the Year. They have also compiled nine other important scientific accomplishments from this past year into a top ten list, appearing in a special news feature in the journal's 17 December 2010 issue. Additionally, Science news writers and editors have chosen to spotlight 10 "Insights of the Decade" that have transformed the landscape of science in the 21st Century.

"This year's Breakthrough of the Year represents the first time that scientists have demonstrated quantum effects in the motion of a human-made object," said Adrian Cho, a news writer for Science. "On a conceptual level that's cool because it extends quantum mechanics into a whole new realm. On a practical level, it opens up a variety of possibilities ranging from new experiments that meld quantum control over light, electrical currents and motion to, perhaps someday, tests of the bounds of quantum mechanics and our sense of reality."

The quantum machine proves that the principles of quantum mechanics can apply to the motion of macroscopic objects, as well as atomic and subatomic particles. It provides the key first step toward gaining complete control over an object's vibrations at the quantum level. Such control over the motion of an engineered device should allow scientists to manipulate those minuscule movements, much as they now control electrical currents and particles of light. In turn, that capability may lead to new devices to control the quantum states of light, ultra-sensitive force detectors and, ultimately, investigations into the bounds of quantum mechanics and our sense of reality. (This last grand goal might be achieved by trying to put a macroscopic object in a state in which it's literally in two slightly different places at the same time—an experiment that might reveal precisely why something as big as a human can't be in two places at the same time.)

"Mind you, physicists still haven't achieved a two-places-at-once state with a tiny object like this one," said Cho. "But now that they have reached the simplest state of quantum motion, it seems a whole lot more obtainable—more like a matter of 'when' than 'if.'"

EurekAlert


TStzmmalaysia
post Dec 18 2010, 09:04 AM



BIOTECHNOLOGY


Eco Skyscrapers

The word "skyscraper" could hardly be further removed from idyllic gardens producing a plethora of different crops. When most people think of skyscrapers, they picture tall rectangular buildings, behemoths devoted to man's commercial achievements.

Despite this general perception, however, some see a different future for skyscrapers: one that incorporates hydroponics and can also function as a city in its own right, independent of its surroundings.

Nowhere has this vision been more evident than in Josephine Turner's design for the Bangaroo Sky Village, imagined within the hustle and bustle of Sydney, Australia. The design aims to turn preconceived notions of skyscrapers on their head, using stacked triangular forms and incorporating hydroponic gardens, shops, commercial areas and plazas, all interconnected with "sky-bridges".

Because of the architectural strength of triangles, the design allows buildings of different heights to be stacked together, interconnected by the sky-bridges to encourage a pedestrian community that can live, work, shop and eat within the system of plazas.

The buildings themselves are set out in levels, with agricultural levels growing produce for the inhabitants, interspersed with commercial levels, residential areas and communal quarters. A hydroponic system is fitted throughout the buildings to ensure that all of the agricultural areas receive sufficient water.

The environmentally friendly design also incorporates wind power to provide lighting and electricity within the building, and uses sunlight cleverly thanks to its triangular faces. Ultimately, Turner's design gives us a hint of where our cities could be headed, and a clear idea of how important hydroponics will be in the future.

Hydroponics Guide


TStzmmalaysia
post Dec 18 2010, 08:06 PM



APPLIED SCIENCES


Word Lens app turns your phone into a real-time translator

Word Lens translates printed words in real time on your iPhone. Can our jet packs be far behind? Developed by Quest Visual, Word Lens is an augmented-reality translation app that uses your phone's camera to view printed words and translate them into another language as you watch. If you’re traveling for business or on vacation and need to read a street sign or a menu, point your phone and Word Lens instantly translates it, maintaining the color and font as it goes.

Word Lens is currently available for the iPhone via iTunes. The app is free, with languages available for in-app purchase at US$5 each. So far, only Spanish-to-English and English-to-Spanish are available, but Quest Visual has plans to offer additional languages soon. After you purchase the languages, they are downloaded to your phone so you do not need a network connection to use Word Lens.

Quest Visual says the app is as easy to use as taking a picture with your phone. The app offers a zoom feature so you can crop out extraneous details, and a flashlight feature to light up the text if necessary. In addition, you can translate words by typing them in. Word Lens works best on clearly printed text, and does not work with decorative fonts or handwriting.

The app uses optical character recognition (OCR) technology to analyze the image and translate the words it finds. You can test out the Word Lens OCR capability in the free app using two cute features that spell all the words in the image backwards, or digitally erase all the words in the image. Like most translation software, Word Lens is not perfect, but Quest Visual promises that you can at least get the general meaning of the text.
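The post-OCR stage of such a pipeline can be sketched in a few lines. This is an illustrative toy only: the mini-dictionary is invented, the OCR step itself is omitted, and Quest Visual's actual method is not published here.

```python
# Toy word-by-word lookup translation, roughly the style the article
# describes: each recognized word is mapped through a lexicon.
ES_TO_EN = {"salida": "exit", "peligro": "danger", "calle": "street"}  # invented sample

def translate_words(ocr_words, lexicon):
    # Unknown words pass through unchanged, as real translators often do.
    return [lexicon.get(w.lower(), w) for w in ocr_words]

print(translate_words(["Salida", "peligro"], ES_TO_EN))  # -> ['exit', 'danger']
```

Word-by-word lookup is exactly why such apps capture "the general meaning" rather than fluent sentences: there is no grammar model, only substitution.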

Word Lens works with the iPhone 4, iPhone 3GS, and iPod Touch with camera, and requires iOS 4.0 or later (the iPhone 3GS and iPod Touch do not support the zoom and flashlight features). There is no official word on an Android version yet.

Quest Visual seems to have a hit on its hands. In its first day of release on iTunes, Word Lens quickly climbed into the top 40 app chart. Hopefully we can look forward to its continued development, with more languages and improved translations. Details are available on iTunes.



Gizmag


This post has been edited by tzmmalaysia: Dec 18 2010, 08:10 PM
TStzmmalaysia
post Dec 19 2010, 09:54 PM



APPLIED SCIENCES


Biodegradable Styrofoam Alternative Earns Cradle to Cradle Certification

An alternative to foam packaging that is biodegradable and based on sugar cane has earned Cradle to Cradle certification.

Synbra Technology's BioFoam product is a packaging material like expanded polystyrene foam (the type of packaging filler commonly referred to as Styrofoam), but it is made out of polylactic acid, a material derived from sugar cane processing.

BioFoam will initially be used by companies within the Synbra Group that provide packaging products, but the company says that it's possible that BioFoam could be used to replace expanded polystyrene within building materials.

Since BioFoam is made from polylactic acid, it can biodegrade under certain conditions when it's no longer needed. The material can also be reformed to fit around different products.

Synbra is building its first plant for commercializing the material in the Netherlands, with plans to be up and running this year. The company says the plant will be able to produce 5,000 tons a year.

Cradle to Cradle certification was given by the Environmental Protection and Encouragement Agency (EPEA), which was founded by Michael Braungart, one of the co-founders of McDonough Braungart Design Chemistry (MBDC). The EPEA is able to grant Cradle to Cradle certification based on criteria set forth by MBDC, which includes materials, material reuse, energy use, water use and social responsibility.

GreenBiz

TStzmmalaysia
post Dec 19 2010, 09:56 PM



SPACE SCIENCE


NASA Engineers Propose Combining a Rail Gun and a Scramjet to Fire Spacecraft Into Orbit

NASA has been working on a new, cheaper method to launch spacecraft. The latest proposal involves train tracks, a rail gun and a scramjet. Here's what they're trying to do:

In April, President Obama urged NASA to come up with, among other things, a less expensive method than conventional rocketry for launching spacecraft. By September, the agency's engineers floated a plan that would save millions of dollars in propellant, improve astronaut safety, and allow for more frequent flights. All it will take is two miles of train track, an airplane that can fly at 10 times the speed of sound, and a jolt of electricity big enough to light a small town.

The system calls for a two-mile-long rail gun that will launch a scramjet, which will then fly to 200,000 feet. The scramjet will then fire a payload into orbit and return to Earth. The process is more complex than a rocket launch, but engineers say it's also more flexible. With it, NASA could orbit a 10,000-pound satellite one day and send a manned ship toward the moon the next, on a fraction of the propellant used by today's rockets.

It may sound too awesome to ever be a reality. But unlike other rocket-less plans for space entry, each relevant technology is advanced enough that tests could take place in 10 years, says Stan Starr, a physicist at NASA's Kennedy Space Center. NASA's scramjets have hit Mach 10 for 12 seconds; last spring, Boeing's X-51 scramjet did Mach 5 for a record 200 seconds. Rail guns are coming along too. The Navy is testing an electromagnetic launch system to replace the hydraulics that catapult fighter jets from aircraft carriers. "We have all the ingredients," says Paul Bartolotta, a NASA aerospace engineer working on the project. "Now we just have to figure out how to bake the cake."

How To Fly Into Orbit:

Rev Up The Rail Gun
A 240,000-horsepower linear motor converts 180 megawatts into an electromagnetic force that propels a scramjet carrying a spacecraft down a two-mile-long track. The craft accelerates from 0 to 1,100 mph (Mach 1.5) in under 60 seconds -- fast, but at less than 3 Gs, safe for manned flight.
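A rough kinematics check of those figures, assuming constant acceleration along the full two-mile track (a simplification; a real launch profile would vary and need not use the whole track at constant thrust):

```python
# Constant-acceleration check: what does reaching 1,100 mph over a
# two-mile track imply?
MPH_TO_MS = 0.44704
track_m = 2 * 1609.34           # two miles in metres
v_exit = 1100 * MPH_TO_MS       # exit speed in m/s

a = v_exit**2 / (2 * track_m)   # required constant acceleration, v^2 = 2*a*d
t = v_exit / a                  # time to reach exit speed
print(f"acceleration ~{a:.0f} m/s^2 ({a / 9.81:.1f} g), reached in ~{t:.0f} s")
```

These are order-of-magnitude numbers only; the point is that the run takes on the order of tens of seconds, comfortably inside the quoted 60-second window.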

Fire The Scramjet

The pilot fires a high-speed turbojet and launches from the track. Once the craft hits Mach 4, the air flowing through the jet intake is fast enough that it compresses, heats to 3,000ºF, and ignites hydrogen in the combustion chamber, producing tens of thousands of pounds of thrust.

Get Into Orbit

At an altitude of 200,000 feet, there isn't enough air for the scramjet, now traveling at Mach 10, to generate thrust. Here spaceflight begins. The two craft separate, and the scramjet pitches downward to get out of the way as the upper spacecraft fires tail rockets that shoot it into orbit.

Stick The Landing

The scramjet slows and uses its turbojets to fly back to Earth for a runway landing. Once the spacecraft delivers its payload into orbit, it reenters the atmosphere and glides back to the launch site. The two craft can be ready for another mission within 24 hours of landing.

Gizmodo


TStzmmalaysia
post Dec 20 2010, 09:13 PM



ENERGY


Is night falling on classic solar panels?

Solar cells that work at night. It sounds like an oxymoron, but a new breed of nanoscale light-sensitive antennas could soon make this possible, heralding a novel form of renewable energy that avoids many of the problems that beset solar cells.

The key to these new devices is their ability to harvest infrared (IR) radiation, says Steven Novack, one of the pioneers of the technology at the US Department of Energy's Idaho National Laboratory in Idaho Falls. Nearly half of the available energy in the solar spectrum resides in the infrared band, and IR is re-emitted by the Earth's surface after the sun has gone down, meaning that the antennas can even capture some energy during the night.

Lab tests have already shown that, under ideal conditions, the antennas can collect 84 per cent of incoming photons. Novack's team calculates that a complete system would have an overall efficiency of 46 per cent; the most efficient silicon solar cells are stalled at about 25 per cent. What's more, while those ideal conditions are relatively narrowly constrained for silicon solar cells - if the sun is in the wrong position, light reflects off a silicon solar cell instead of being absorbed - the antennas absorb radiation at a variety of angles. If the antennas can be produced cheaply, the technology could prove to be truly disruptive, says Novack.

Unlike photovoltaic cells, which use photons to liberate electrons, the new antennas resonate when hit by light waves, and that generates an alternating current that can be harnessed.

To build an array that could capture both visible and infrared radiation, researchers envision multiple layers of antennas, with each layer tuned to a different optical frequency.

So far, two main challenges have stood in the way of fomenting a revolution in solar power. First, the length of the antennas must be close to the size of the wavelength being captured, which in the case of the solar spectrum can be very small - from millimetres down to a few hundred nanometres.
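Those length scales follow directly from the wavelengths involved. As a rough illustration (the band boundaries below are common conventions, and the half-wavelength sizing is a generic resonant-antenna rule of thumb, not figures from the article):

```python
# A resonant antenna element is roughly half the target wavelength.
bands_um = {"near-IR": 1.0, "mid-IR": 10.0, "far-IR": 100.0}  # wavelengths, micrometres

element_nm = {name: wl * 1000 / 2 for name, wl in bands_um.items()}
for name, length in element_nm.items():
    print(f"{name}: lambda = {bands_um[name]} um -> element ~ {length:.0f} nm")
```

So near-infrared collection demands features of a few hundred nanometres, which is why fabricating billions of such antennas cheaply is the central challenge.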

Second, the currents produced will be alternating at frequencies too high to be of use unless they are first converted into a steady direct current. The problem here is that silicon diodes, which are crucial to the conversion, typically don't operate at the high frequencies required, says Aimin Song, a nanoelectronic engineer at the University of Manchester, UK.

Both of these barriers are now being broken down. Earlier this year, Novack and colleagues perfected a technique for creating arrays of billions of antennas. Although these antennas were only just small enough to harvest energy at the far end of the infrared spectrum, Novack says it should be possible to modify the process and build smaller antennas to work with mid and near-infrared.

Meanwhile Song, and Garret Moddel's team at the University of Colorado in Boulder, have independently taken a significant step in tackling the current-conversion problem by creating novel diodes capable of handling high optical frequencies (see "The devil's in the diodes"). Both groups expect to combine the diodes and antennas into working prototypes within months. "There's a potential for this to be a real game-changer," says Moddel.

NewScientist


TStzmmalaysia
post Dec 21 2010, 09:48 AM



NANOTECHNOLOGY


New nanotechnology could slash sequencing time

Scientists from Imperial College London are developing technology that could ultimately sequence a person's genome in mere minutes, at a fraction of the cost of current commercial techniques.

The researchers have patented an early prototype technology that they believe could lead to an ultrafast commercial DNA sequencing tool within ten years. Their work is described in a study published this month in the journal Nano Letters and is supported by the Wellcome Trust Translational Award and the Corrigan Foundation.

The research suggests that scientists could eventually sequence an entire genome in a single lab procedure, whereas at present it can only be sequenced after being broken into pieces in a highly complex and time-consuming process. Fast and inexpensive genome sequencing could allow ordinary people to unlock the secrets of their own DNA, revealing their personal susceptibility to diseases such as Alzheimer's, diabetes and cancer. Medical professionals are already using genome sequencing to understand population-wide health issues and research ways to tailor individualised treatments or preventions.

Dr Joshua Edel, one of the authors on the study from the Department of Chemistry at Imperial College London, said: "Compared with current technology, this device could lead to much cheaper sequencing: just a few dollars, compared with $1m to sequence an entire genome in 2007. We haven't tried it on a whole genome yet but our initial experiments suggest that you could theoretically do a complete scan of the 3,165 million bases in the human genome within minutes, providing huge benefits for medical tests, or DNA profiles for police and security work. It should be significantly faster and more reliable, and would be easy to scale up to create a device with the capacity to read up to 10 million bases per second, versus the typical 10 bases per second you get with the present day single molecule real-time techniques."
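The throughput figures in Dr Edel's quote imply the "within minutes" claim directly:

```python
# Whole-genome read time implied by the quoted rates.
GENOME_BASES = 3_165_000_000   # bases in the human genome, per the article
NANOPORE_RATE = 10_000_000     # bases/second, projected scaled-up device
CURRENT_RATE = 10              # bases/second, present single-molecule methods

nanopore_minutes = GENOME_BASES / NANOPORE_RATE / 60
current_days = GENOME_BASES / CURRENT_RATE / 86400
print(f"Nanopore device: ~{nanopore_minutes:.1f} minutes")
print(f"Current single-molecule rate: ~{current_days:.0f} days")
```

That is roughly five minutes for the scaled-up device versus about a decade at today's per-molecule rate -- a factor-of-a-million gap.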

In the new study, the researchers demonstrated that it is possible to propel a DNA strand at high speed through a tiny 50 nanometer (nm) hole - or nanopore - cut in a silicon chip, using an electrical charge. As the strand emerges from the back of the chip, its coding sequence (bases A, C, T or G) is read by a 'tunnelling electrode junction'. This 2 nm gap between two wires supports an electrical current that interacts with the distinct electrical signal from each base code. A powerful computer can then interpret the base code’s signal to construct the genome sequence, making it possible to combine all these well-documented techniques for the first time.
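The read-out idea described above -- each base perturbing the tunnelling current in a distinct way -- can be sketched as a nearest-signature classifier. The current values below are invented for illustration; they are not measured data from the study:

```python
# Toy base caller: pick the base whose known current signature is
# closest to the measured tunnelling current.
SIGNATURES = {"A": 1.0, "C": 2.0, "G": 3.0, "T": 4.0}  # arbitrary units, invented

def call_base(measured_current):
    return min(SIGNATURES, key=lambda b: abs(SIGNATURES[b] - measured_current))

reads = [0.9, 3.1, 2.2, 4.05]  # simulated noisy current samples
print("".join(call_base(i) for i in reads))  # -> AGCT
```

Real devices face overlapping, noisy signal distributions, so production base callers use statistical models rather than simple nearest-value matching; the sketch only shows the mapping the article describes.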

Sequencing using nanopores has long been considered the next big development for DNA technology, thanks to its potential for high speed and high-capacity sequencing. However, designs for an accurate and fast reader have not been demonstrated until now.

Co-author Dr Emanuele Instuli, from the Department of Chemistry at Imperial College London, explained the challenges they faced in this research: "Getting the DNA strand through the nanopore is a bit like sucking up spaghetti. Until now it has been difficult to precisely align the junction and the nanopore. Furthermore, engineering the electrode wires with such dimensions approaches the atomic scale and is effectively at the limit of existing instrumentation.

However, in this experiment we were able to make two tiny platinum wires into an electrode junction with a gap sufficiently small to allow the electron current to flow between them."

This technology would have several distinct advantages over current techniques, according to co-author, Aleksandar Ivanov from the Department of Chemistry at Imperial College London: "Nanopore sequencing would be a fast, simple procedure, unlike available commercial methods, which require time-consuming and destructive chemical processes to break down and replicate small sections of the DNA molecules to determine their sequence. Additionally, these silicon chips are incredibly durable compared with some of the more delicate materials currently used. They can be handled, washed and reused many times over without degrading their performance."

Dr Tim Albrecht, another author on the study, from the Department of Chemistry at Imperial College London, says: "The next step will be to differentiate between different DNA samples and, ultimately, between individual bases within the DNA strand (ie true sequencing). I think we know the way forward, but it is a challenging project and we have to make many more incremental steps before our vision can be realised."

PhysOrg

TStzmmalaysia
post Dec 21 2010, 09:50 AM



ROBOTICS


Join Singularity!

Are you worried about the coming robot apocalypse? Does research into artificial general intelligence freak you out? Do you look at computers and fear for the safety of your children? If so, world-renowned physicist and science champion Michio Kaku has a message for you: instead of fearing technology, humanity should learn how to become part of it. Kaku hosts a great TV show called Sci Fi Science wherein he explores some of the amazing technologies being developed today, and the out-of-this-world consequences they could bring tomorrow.

Recently, he took a look at the Technological Singularity, which as he puts it will be “a time when computer power grows without limit, surpassing human intelligence, sweeping aside everything in its path.” Sounds scary, huh? Well, after talking to experts around the world, Kaku decides there’s only one reasonable way to deal with the Singularity: join it. By merging with computers, humanity will not only preserve itself, it will expand into realms it cannot comprehend in the present. Watch clips from Kaku’s Singularity episode in the video below. By the end Kaku is openly recruiting us to join him in embracing the machine.

Michio Kaku is one of the most famous modern-day presenters of science. He's authored several best sellers, hosts two radio programs, has the Sci Fi Science TV show (now in its second season), and writes a blog on BigThink. As we've shown in the past, he seems to delight in answering questions from average readers and viewers, and no subject seems too far-fetched for him to consider and explain. It's only fitting, then, that Kaku take a long look at the Singularity and consider the implications of exponential growth in artificial intelligence. While Kaku doesn't espouse as optimistic an approach to the subject as many others do, he ends up with a very positive outlook on what the growth of AI could mean for humanity. I tend to share his view. Accelerating technology is only scary when you view it as separate from ourselves. Once you realize that we are already expanding the ways we incorporate it into our everyday lives (how long do you go between uses of your phone, the internet, or your computer?), continuing along that path starts to make sense. Don't fear the machine conquering humanity. The machine will become one with humanity.



Singularity Hub


This post has been edited by tzmmalaysia: Dec 21 2010, 09:51 AM
TStzmmalaysia
post Dec 21 2010, 09:52 AM



TRANSPORTATION


All-Electric Sonex Aircraft Completes Maiden Flight

We’ve seen electric aircraft advance the horizons of carbon-free aviation by leaps and bounds lately, and this month a brand new electric plane successfully conducted its maiden flight. The all-electric Sonex aircraft completed its first flight on December 3, 2010 at Wittman Regional Airport in Oshkosh, WI.

The airplane was piloted by Sonex Founder and E-Flight team leader John Monnett, who took the plane on a short jaunt in order to break ground-effect and analyze in-flight system performance. This short test run represented four years of development by the E-Flight design team in engineering, building and testing what has been dubbed “one of the most advanced electric flight packages ever conceived.”

“We are very proud of this achievement,” said Jeremy Monnett, CEO and General Manager of Sonex Aircraft. “We have a flight envelope expansion plan and will be working on this in the coming weeks and months. We have also already started our motor v4.0 design and motor controller v12.0 to be integrated on N270DC. Many more great things to come on this project!”

The Sonex Aircraft is a standard Waiex kit aircraft that has been modified with the installation of proprietary E-Flight electric power components. These include an E-Flight 54kw brushless DC electric motor, E-Flight electronic motor controller, a 14.5kw-hr lithium polymer battery system, the E-Flight battery management system, and E-Flight cockpit instrumentation and controls.
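The published power figures allow a rough endurance estimate. The cruise-power fraction below is an assumption for illustration; the article gives only the motor and battery ratings:

```python
# Endurance implied by the E-Flight power figures.
BATTERY_KWH = 14.5     # lithium polymer battery capacity, per the article
MOTOR_KW = 54.0        # brushless DC motor rating, per the article
CRUISE_FRACTION = 0.5  # assumed: cruise at roughly half rated power

full_power_min = BATTERY_KWH / MOTOR_KW * 60
cruise_min = BATTERY_KWH / (MOTOR_KW * CRUISE_FRACTION) * 60
print(f"~{full_power_min:.0f} min at full power, ~{cruise_min:.0f} min at assumed cruise")
```

Roughly a quarter-hour at full power, or perhaps half an hour at a gentler cruise setting -- consistent with a short-hop proof-of-concept flight rather than a touring aircraft.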

The E-Flight Initiative electric flight project was first announced in 2007, but has seen many challenges since then, including designing and testing electric components that use cutting-edge technology. With innovation such as this, electric aircraft could rapidly become more and more common.

Inhabitat





TStzmmalaysia
post Dec 21 2010, 09:54 AM



BIOTECHNOLOGY

Attached Image

Doctors use sick boy's DNA in diagnosis, treatment

Doctors and scientists in Wisconsin have published the first detailed account of a groundbreaking medical case in which they sequenced all the genes of a very sick young boy from Monona, Wis., and used the information to treat the child.

Genetic experts said the Wisconsin case signals a new era in medicine in which doctors will be able to read our genetic script to diagnose and sometimes treat maladies, especially cancers and rare hereditary diseases.

The boy, whose story is the subject of a Milwaukee Journal Sentinel series starting Sunday, suffered from a disease and mutation never before seen in medicine. When he ate, painful holes called fistulas would open, leading from his intestine to his skin. The child, now 6, became so sick that doctors had to remove his colon in the spring of 2009.

In a paper published online Friday in the journal Genetics in Medicine, doctors and scientists at Children's Hospital of Wisconsin and the Medical College of Wisconsin described how they were able to read the boy's genetic script in the summer of 2009 and pinpoint the mutation responsible for his disease.

"For the patient and his family, it's a benefit and something we all feel really good about," said Howard Jacob, one of the paper's authors and director of the Medical College's Human and Molecular Genetics Center. "It demonstrates that this technology can start being used in the clinic today."

The case is believed to be one of the first in the world in which the sequencing of a patient's DNA has led to a diagnosis and treatment.

"Everyone's talking the talk about personalized medicine, and this is a real example. We don't have too many of those," said Richard Gibbs, director of the Human Genome Sequencing Center at Baylor College of Medicine.

A team at Yale University accomplished the feat earlier, using a similar technique to diagnose a baby in Turkey born with congenital chloride diarrhea.

The Wisconsin case shows how, with a patient's life at stake, doctors, geneticists and computer experts can work together to make sense out of the vast ocean of information in our 21,000 genes.

"It's a wonderful paper," said Eric D. Green, director of the National Human Genome Research Institute, part of the National Institutes of Health. "It seems like every month now there's a publication like this that demonstrates the power of this technology.
"The novelty of this story is that it was done in real time to help make a decision about clinical management."

As a result of sequencing, the Wisconsin scientists learned that the boy had a defect in his immune system caused by a single mutation in a gene called XIAP. The mutation also caused a second extremely rare disease called XLP, which affects just 400 boys worldwide, rendering them unable to survive Epstein-Barr virus.

Doctors had not previously known that the child had XLP, which can be treated with a bone marrow transplant. In mid-July the boy received an umbilical cord blood transplant, which is similar to a bone marrow transplant. He was discharged from the hospital in October.

As the doctors point out in their commentary, the new case underscores how the fast-moving revolution in technology is driving breakthroughs in genomics.

"The tools available to make this diagnosis," they write, "were not available when the child first (was hospitalized) four years ago."

Before deciding to sequence his DNA, doctors conducted dozens of tests on individual genes and on his immune system. Yet they were unable to reach a diagnosis.

In Wisconsin, as in the Yale case, scientists decided against sequencing the patient's entire genome, all 3.2 billion chemical base pairs. Instead, they read a little more than 1 percent of the genome, the exons. Exons, part of every gene, carry the instructions for making proteins. The failure to make proteins correctly causes many diseases. The sequencing and analysis of the Wisconsin child's exons cost roughly $75,000 in 2009, though the cost in a couple of years should be $1,000, Jacob said.
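The quoted figures make for some quick arithmetic (my own derivation, not from the article):

```python
# Scale and cost arithmetic from the figures quoted above.
genome_bases = 3_200_000_000   # whole genome, chemical base pairs
exome_fraction = 0.01          # exons: "a little more than 1 percent"
cost_2009_usd = 75_000         # sequencing + analysis of the exons
cost_projected_usd = 1_000     # Jacob's projection "in a couple of years"

exome_bases = int(genome_bases * exome_fraction)
print(f"exome: ~{exome_bases // 1_000_000} million bases")        # prints: exome: ~32 million bases
print(f"projected cost drop: {cost_2009_usd // cost_projected_usd}x")  # prints: projected cost drop: 75x
```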

Even when doctors decided to sequence all of the boy's genes using this more efficient technique, the task was not simply a matter of waiting for a machine to spit out an answer. Initially, they got a list of 16,124 potential answers - differences between his genetic script and the reference genome that is used as a yardstick of what is "normal."

The Wisconsin researchers developed a software program to help them weed out harmless variations. The child's DNA was run through the sequencing machine five times to reduce the chances of missing a mutation.

Researchers also consulted the medical literature on many genes and the latest database of genetic variations. They were able to drop their list of potential suspects to 32, then to eight and finally to one, XIAP, a gene involved in regulating the immune system. Scientists then ran two tests of the boy's cells in order to confirm that the XIAP mutation caused his illness.
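The triage the researchers describe, filtering thousands of candidate variants down to a single suspect gene, can be sketched roughly like this (a minimal illustration only; the record fields, filter order and example variants are hypothetical, not the Wisconsin team's actual software):

```python
# Sketch of exome-variant triage as described above: start from all
# differences against the reference genome, then successively discard
# variants that are known to be harmless or that do not alter a protein,
# and prioritize genes plausibly tied to the phenotype (here, immune
# regulation, as with XIAP). All data below is made up for illustration.

def triage(variants, known_benign, candidate_genes):
    # 1. Drop variants already catalogued as harmless polymorphisms.
    remaining = [v for v in variants if v["id"] not in known_benign]
    # 2. Keep only protein-altering variants (exons code for proteins).
    remaining = [v for v in remaining if v["protein_altering"]]
    # 3. Keep variants in genes related to the patient's symptoms.
    return [v for v in remaining if v["gene"] in candidate_genes]

variants = [
    {"id": "rs100", "gene": "ABC1", "protein_altering": True},
    {"id": "rs200", "gene": "XIAP", "protein_altering": True},
    {"id": "rs300", "gene": "XIAP", "protein_altering": False},
]
suspects = triage(variants, known_benign={"rs100"}, candidate_genes={"XIAP"})
print([v["id"] for v in suspects])  # prints: ['rs200']
```

In the real case each of these stages was far more involved (population databases, the medical literature, five sequencing runs to avoid missed calls), but the funnel shape, 16,124 to 32 to 8 to 1, is the same idea.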

Eric J. Topol, director of the Scripps Translational Science Institute in La Jolla, Calif., called the Wisconsin case "impressive" and said he believes more institutions will use sequencing in the practice of medicine.

"You can just see it taking off," Topol said. "There will be hundreds of these in the next few years, if not much more."

PhysOrg


deadsnow
post Dec 21 2010, 11:49 AM

New Member
*
Junior Member
5 posts

Joined: Aug 2010
interesting posts smile.gif thx.
lopo90
post Dec 21 2010, 12:04 PM

On my way
****
Junior Member
695 posts

Joined: Nov 2010


yea i agree ^^ very very interesting
TStzmmalaysia
post Dec 22 2010, 07:33 AM



APPLIED SCIENCES

Attached Image

Desalination Plants To Hit $87.8 Billion in Investments

Desalination technology -- turning brackish or salt water into fresh water -- has been a hotly debated issue for years. The main problem is that the technology is incredibly energy-intensive, and therefore financially and environmentally expensive. However, as water supplies run short, desalination is looking more attractive, so innovators are coming up with ways to make the process more energy-efficient and reduce the environmental impact of plants. As desalination nears a boom phase, investments in plants are set to jump over just the next five years, according to new data from Pike Research.

Pike Research has found that the cost of running a desalination plant is falling as technologies improve, and the lower costs are making plants look more attractive to investors. The research firm states that new construction for plants will create a cumulative global investment of $87.8 billion between 2010 and 2016.

From improved membranes filtering salts out of water to energy-recovery units that trim a plant's energy bill, new technology is making desalination look like an ideal option for coastal areas experiencing water shortages, such as California. Reuters reports that the state is likely to lead the country in implementing desalination plants to meet the state's water needs.

While it may be the next big thing for the water industry, it might not be the best environmental move. Smart water technologies such as improved metering, water-conservation technologies for everything from data centers to irrigation systems, and improved policies from agriculture to manufacturing are the first place to start when crafting a future with sustainable water supplies. For some communities, desalination is the only solution. But not in all cases... or even in most. Desalination has a long way to go before it can be considered a sustainable, environmentally friendly source of water. In true band-aid fashion, however, it seems that rather than fix our poor water-use practices, money will simply be poured into desalination plants to provide more water now.

Treehugger

mieza
post Dec 22 2010, 09:21 AM

On my way
****
Senior Member
541 posts

Joined: Jun 2008
From: KL



Currently doing research on aerogel.. I love science n technology smile.gif make life easier..
TStzmmalaysia
post Dec 23 2010, 09:04 AM



TRANSPORTATION

Attached Image

All-Electric Trash Truck Cleans Up a Dirty Job

The makers of a new all-electric trash truck soon to be plying the streets of a Paris suburb promise that the only fumes coming from the truck will involve rotten fruit and expired cheese, not clouds of diesel exhaust.


Despite what gets loaded into the hopper, the 26-ton truck’s emissions are clean, with each truck saving an estimated 130 tons of CO2 emissions each year over a diesel-powered model. The trash truck, built by PVI Electric Powertrain, features liquid cooled lithium-ion battery packs from Dow Kokam that tout a 10-year usable life and stability in extreme climates. Each truck will have five strings of seven battery packs, which provide the equivalent of 250 kilowatt hours of energy.
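The quoted figures check out with some quick arithmetic (my own derivation, not from the article):

```python
# Quick arithmetic on the figures quoted in the article.
strings, packs_per_string = 5, 7   # battery packs per truck
total_kwh = 250.0                  # equivalent energy per truck
co2_tons_saved_per_truck = 130     # per year, vs. a diesel model
trucks_by_end_2011 = 12            # the first truck plus eleven more

packs = strings * packs_per_string
print(packs, round(total_kwh / packs, 1))           # prints: 35 7.1
print(co2_tons_saved_per_truck * trucks_by_end_2011)  # prints: 1560
```

So each truck carries 35 packs of roughly 7.1 kWh apiece, and the full twelve-truck fleet would avoid on the order of 1,560 tons of CO2 per year.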

“This achievement demonstrates that real advanced battery solutions exist for the commercial and fleet industry today,” said Dow Kokam Vice President Jean-Francois Herchin. The company claims that it’s the first fully-electric trash truck with the performance of a diesel-powered truck, and perhaps one of the largest electric utility trucks on the road.

Utility vehicles like trash trucks — and the hybrid street sweeper we told you about last week — are ideal for electrification, as they travel fixed routes at predetermined times and often replace noisy, smelly vehicles in dense urban cores. The PVI electric trash truck is no exception. Drivers can pick up 16 tons of trash in two rounds of service with a recharge or battery swap during the driver’s lunch break or a shift change.

Anyone who has ever been stuck behind a slowly accelerating trash truck will be glad to hear that PVI designed a gearbox that allows the truck to climb hills without impeding traffic, and the electric drivetrain means that 100 percent of torque is available at acceleration.

The first truck will debut in the Paris suburb of Courbevoie as part of the fleet of SITA Ile de France, a division of Suez. By the end of 2011, another eleven electric trash trucks will hit the road.

Wired

TStzmmalaysia
post Dec 23 2010, 09:07 AM



BIOTECHNOLOGY

Attached Image

Implantable Smart Chip Fights Chronic Pain

Sydney researchers are getting ready to conduct human trials next year of a smart chip, which, when implanted in the spinal cord, can measure and stop pain signals from traveling to the brain.

The technology, targeting chronic pain, was developed in Sydney by National ICT Australia (NICTA) over the last two years by experts in biomedical, electrical and mechanical engineering, as well as textile technology and software applications.

The smart chip is put into a biocompatible device, which is a little smaller than the head of a match. A couple of the devices are sewn into a 1.22mm wide micro-lead made from polymer yarn and electronic wires. The wires are then inserted into the spine (or elsewhere) and connected to a device containing a battery and a computer processor. The battery can be charged wirelessly.

This set-up, according to NICTA, can then measure the properties of nerves carrying pain signals to the brain and can send a 10V electric pulse to block the signals, which tricks the brain into thinking there's no pain.

According to Dr John Parker, NICTA's CTO of implant technologies, current devices used to block pain signals to the brain are larger, around the size of a matchbox.

The smaller size of the NICTA device improves its reliability as it can be implanted closer to the spine and needs shorter connection leads.

The device could be used to treat chronic back pain, leg pain and pain from nerve damage, but could also help those suffering from migraines, Parkinson's disease tremors or epileptic seizures.

ZDNet

TStzmmalaysia
post Dec 24 2010, 10:01 AM



ROBOTICS

Attached Image

Gostai Jazz Robot at Your Service

Gostai, a company specialized in Artificial Intelligence solutions, is launching Jazz, its telepresence robot that can be remotely operated via a web-based user interface. The robot can be used for video conferencing, visiting a place or telesurveillance. Jazz can effectively patrol at night, thanks to its infrared camera, laser system and a map of its surroundings.

In case of a security alert, the remote operator can take control of the robot in real time using a regular web browser and check on the situation. Check the complete feature list in the full post.

Jazz Security features:


Jazz Security is equipped with a camera that detects motion and can be controlled by a person via a web-based interface
Jazz Security records video while patrolling and sends alerts via SMS or email in case of suspicious activity
A laser range finder will soon be offered with Jazz Security. This powerful device allows the robot to build a map of the area around it, and then use that map to localize itself.
Using this map displayed on the user's screen, it is possible to set up waypoints for the Jazz robot to follow
Random patrolling can be used as well, to avoid regular patterns that can be monitored by potential thieves
Jazz Connect features:


The Jazz Connect robot stands in a remote location and serves as your personal avatar. It can move and perceive its surroundings with its embedded camera, speaker and microphone, and the user remotely controls Jazz Connect via a web browser from a computer or a smartphone.
Easy to use: the robot connects to the Internet via WiFi, and it is operated using a 3D pointer on the real-time image displayed in the web interface to indicate the direction to follow.
An optional LCD screen (not in the product picture here) can display the user’s face during a video conference meeting, so that people know who is controlling Jazz Connect.
The rotating head enables the robot to better (video-)capture the surroundings

ubergizmo


TStzmmalaysia
post Dec 24 2010, 10:09 AM



ROBOTICS

Attached Image

Nao robot receives a much sexier body

Aldebaran Robotics' latest version of their Nao robot made an impact at the Humanoids 2010 conference in Atlanta, where Nao's body has been re-engineered to look a whole lot more attractive and robust. It features longer, curved arms that give the bot more reach to pick things up and throw them, preferably not at you for forgetting to oil its joints.

Of course, it doesn't hurt that a new motion engine has been installed in the Nao, making it move a whole lot more fluidly, more like a human than ever before. Nao has also been given a new head housing an upgraded brain, capable of recognizing speech and images, performing a certain degree of facial recognition, and reading text.



ubergizmo


This post has been edited by tzmmalaysia: Dec 24 2010, 10:11 AM
TStzmmalaysia
post Dec 24 2010, 10:11 AM



ROBOTICS

Attached Image

Robot Hand Copies Your Movements, Mimics Your Gestures

2010 may go down in history as the year of gesture recognition. We’ve seen it in TVs, we have it in our video games (thanks, Kinect!), and now we have it in our robots. The Biological Cybernetics Lab at Tsukuba University, headed by Kiyoshi Hoshino, recently demonstrated a robotic arm that can mimic the position and movements of your own.

Using two cameras, the system tracks your hand and arm more than 100 times per second and relays the information to the robot so that it can repeat what you do. The system is fast enough that there is relatively little lag time between your gesture and the robot’s copied motion.

Hoshino and his students have pushed the system even further and taught it how to recognize 100 distinct hand shapes. This allows the robot arm to not only track the location and movement of your arm, but to reliably perform the same actions like picking up an object or waving hello. The robot arm was demonstrated at the recent 3DExpo in Yokohama, and you can see it in action in the video below. This sort of intuitive interface could let almost anyone control and program a robot. Hooray for the democratization of technology!



SingularityHub



TStzmmalaysia
post Dec 24 2010, 10:13 AM



APPLIED SCIENCES

Attached Image

Solar Powered Rain Catchment Offers Shelter and a Fresh Drink

Here is an interesting concept for rainwater catchment. Created by Mostafa Bonakdar, a design student from Tehran, Iran, the structure is both a shelter during rain as well as a drinking fountain. It features both solar power and rainwater collection, with the solar power running a purification system inside.

The structure can act as a bus shelter, a cover for benches in the park, or a number of other locations where both an awning and a bit of fresh water are welcome.

Perhaps the shelters could be temporary, installed during springtime when the weather varies enough to offer rain one day, and a warm day the next. Or it could be set up to release extra water into the ground after a certain duration in the tank -- after all, it'd take quite a bit of energy to constantly filter and sterilize the water held in the tank, which means the solar panel would have to be incredibly efficient or quite a bit larger than pictured here.

Sure it's not exactly practical nor realistic, but it's an interesting concept nonetheless for providing several services at once with renewable energy and resources.

TreeHugger

TStzmmalaysia
post Dec 24 2010, 10:17 AM



ROBOTICS

Attached Image

Robots learn to walk like a senior citizen


Today's humanoid robots are able to run, somersault and even dance – now comes a robot that walks like a senior citizen. It leans on objects in its environment for support to help it move around and complete tasks.

Robots, and more importantly roboticists, are looking at objects in the wrong way, thinks Sébastien Lengagne of Japan's National Institute of Advanced Industrial Science and Technology in Tsukuba. "Roboticists usually just see objects as obstacles to be avoided," he says "But they can help us."

Lengagne and his colleagues are developing a system to allow humanoid robots to use their entire bodies, and any surrounding objects, to help them move around cluttered environments and complete complex balancing tasks without getting stuck or falling over.

"If I ask you to look below your desktop, you will put your hand on the desktop for support," he says. "But most methods will try to get the robot to do the task without touching the desktop."

The team's robot, HRP-2, acts more like a human. It will place both arms on a table to maintain its balance when trying to sit down in a chair, or use one arm for support when taking a big swinging kick at a ball.

Video

The system breaks down tasks into two stages. In the first stage, software developed by Lengagne's colleague Karim Bouyarmane identifies objects in the robot's surroundings that it can use to help complete a task – for instance, leaning on a table with its forearms to sidle past it and into a nearby chair. The software then calculates a number of poses that the robot could strike to make best use of the table for stability while it shuffles towards the chair and sits on it. Lengagne's software then converts these static poses into one smooth motion, taking into account the forces operating on the robot in each position to ensure it does not lose balance.

At present, the system has to run these calculations on an external computer, but the team hope that ultimately a robot's onboard computer will be able to carry out the process in one step.

New Scientist

TStzmmalaysia
post Dec 24 2010, 10:18 AM



ROBOTICS

Attached Image

Robot waiters in China

Service with a smile also comes with an electronic voice at the Dalu Robot restaurant, where the hotpot meals are not as famous yet as the staff who never lose their patience and never take tips.

The restaurant, which opened this month in Jinan in northern Shandong province, is touted as China's first robot hotpot eatery where robots resembling Star Wars droids circle the room carrying trays of food in a conveyor belt-like system.

More than a dozen robots operate in the restaurant as entertainers, servers, greeters and receptionists. Each robot has a motion sensor that tells it to stop when someone is in its path so customers can reach for dishes they want.

The service industry in China has not always kept up with the country's rapid economic growth, and can be quite basic in some restaurants, leading customers in the Dalu restaurant to praise the robots.

"They have a better service attitude than humans," said Li Xiaomei, 35, who was visiting the restaurant for the first time.

"Humans can be temperamental or impatient, but they don't feel tired, they just keep working and moving round and round the restaurant all night," Li said.

Inspired by space exploration, robot technology and global innovation, the restaurant's owner, Zhang Yongpei, said he hopes his restaurant will show the world China is a serious competitor in developing technology.

"I hope this new concept shows that China is forward-thinking and innovative," Zhang said.

As customers enter the dimly lit restaurant lined with blinking neon lights to simulate a futuristic environment, a female robot decorated with batting eyelashes greets people with an electronic "welcome."

During the meal, crowds of up to 100 customers are entertained by a dancing and talking robot that looks more like a mannequin in a dress, flapping its arms around in a stiff motion.

Zhang said he hopes to roll out 30 robots — which cost $6,000 each — in the coming months and eventually develop robots with human-like qualities that serve customers at their table and can walk up and down the stairs.

YahooNews


SUSzeitgeist
post Dec 24 2010, 12:06 PM

On my way
****
Senior Member
691 posts

Joined: Aug 2010


dont forget Vanadium Redox Battery, potential energy storage is the most important aspect before all those advanced, science & technology able to operate.

This post has been edited by zeitgeist: Dec 24 2010, 12:06 PM
TStzmmalaysia
post Dec 24 2010, 12:15 PM



ENERGY

The Vanadium Battery: The Ultimate Energy Storage Solution

Many of us feel that this generation has passed on a heavy burden to our kids, especially regarding the ever-increasing energy needs of society. It's not all doom and gloom, however: the Vanadium Battery might just return a little spring to your step and put a bigger smile on your face when you next see your grandkids.

Attached Image

A new mass energy storage technology is on the cusp of entering mainstream society. The Japanese are currently using it on a grand scale, the Canadians have comprehensively evaluated it and soon Australians will have the opportunity to replace their old lead-acid batteries with a Vanadium Redox Battery alternative. There are no emissions, no disposal issues, no loss of charge, the construction materials are 'green' and the battery can be charged and discharged simultaneously. So, is the Vanadium Battery as good as it sounds and more importantly, is it the solution to our energy storage problems?

Quite simply...Yes.

The potential of this system can be easily summed up in one word: 100% recharge/discharge. Well that's slightly more than one word, but still it is an impressive group of words. I'm a little excited here, so let me back track a little and explain the importance of Vanadium Batteries to our very existence.

It has been possible for quite some time to successfully gather energy through a variety of renewable energy sources, in particular solar and wind. The main problem, however, which is also true for fossil fuel energy generation, is the storage of the energy. There is no point in generating surplus uber-watts on one sunny and windy day only to find the next day is still and raining and, worst of all, there is no power to play the new DVD of Stainless Steel Rat on your souped-up 80 inch LCD screen (sorry... just wishful thinking). If the energy cannot be stored on the day of bountiful bliss, then a renewable energy system is useless.

In small scale alternative energy systems usually found in off-grid houses, lead-acid batteries are commonly used to store energy. The main problem with this storage system is that lead-acid batteries aren't too efficient. In order to obtain the most cycles possible (300-1500), the batteries are designed to only use 10% of their storage capacity - that's like only being able to use your iPod for one hour instead of the battery's 10 hour capacity. If more energy is sucked out of them, the amount of times they can be recharged and discharged is drastically shortened. Large scale power companies also have a little problem with storage.

Basically, they can't be bothered. It's cheaper for them to estimate the daily power needs of a city and make sure that they produce enough electricity to satisfy all vested interests - that usually takes the form of direct support for industry not individual consumers as many North Americans are finding out on an all too regular basis.

Because a powerhouse can't instantaneously lower or raise output, at night there is usually surplus electricity and the crazy situation occurs where it is pumped into the ground. For all the skeptics out there mumbling conspiracy theory, treehugging pinky, just look it up in any dictionary under 'colossal waste'. Which brings me back to the amazing invention of the Vanadium Battery.

This battery, as the name so intelligently suggests, uses a metal called Vanadium. The soon-to-be Nobel Prize recipients (if there is any justice in this world) from Australia and Europe have found a substance that can store energy indefinitely. On top of all that, it is possible to use 100% of the stored energy without any side effects. The number of times the Vanadium Battery can be recharged/discharged is also a tad worrying for other battery makers (over 10,000 cycles), who must be searching desperately for new employment opportunities - possibly in the oil industry.
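Using the depth-of-discharge and cycle-life figures quoted above, a back-of-envelope comparison (my own arithmetic; the 1,500-cycle figure is the optimistic end of the lead-acid range given earlier):

```python
# Lifetime energy throughput per kWh of nameplate capacity, comparing
# the lead-acid figures above (10% usable, 300-1500 cycles) against
# the vanadium figures (100% usable, 10,000+ cycles).
def lifetime_throughput_kwh(depth_of_discharge, cycles, capacity_kwh=1.0):
    return capacity_kwh * depth_of_discharge * cycles

lead_acid = lifetime_throughput_kwh(0.10, 1500)   # best case from the text
vanadium = lifetime_throughput_kwh(1.00, 10000)   # conservative end of 10,000+

print(lead_acid)                        # prints: 150.0
print(vanadium)                         # prints: 10000.0
print(f"{vanadium / lead_acid:.0f}x")   # prints: 67x
```

Even granting lead-acid its best-case cycle life, each kWh of vanadium capacity delivers on the order of 67 times more energy over its lifetime.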

Attached Image

In all honesty the word 'battery' falls a little shy of an accurate description of this epoch-creating invention. In very basic terms (which is all I can manage after trying to read the manual) the Vanadium is stored in two separate containers in liquid form - one is charged with energy and one has a depleted energy charge. When new energy is gathered, non-charged Vanadium gets spinached-up and popeye's your uncle, you have lots of energy to expend on a 30 inch Cinema Display connected to 17 inch Powerbook playing Doom until your fingers hurt...um and ah all those other things that use power in a normal household, like lights, fridges, blah, blah, blah.

If you decide one day that you need a little more storage capacity, perhaps for that air-conditioner or hairdryer (for the uninitiated, the banes of lead-acid batteries), no worries, just get bigger storage tanks to hold more Vanadium and all of a sudden you have storage to spare. Try that with a lead-acid battery system.

On a final and semi-serious note (which is the best I can do after thinking about my dream Mac setup), Vanadium Batteries have profound implications for normal households that don't have an alternative energy system supplying power to their house. As Japan is demonstrating, the amount of energy that their power stations produce can be cut by 1/3 simply by storing their previously dumped excess nightly energy into huge Vanadium Batteries. This form of load-levelling can be utilised by every power station throughout the world.

Attached Image

So the next time your power has been cut on your desktop while you are smack in the middle of a frag-fest...or should I say, thesis...or perhaps while buying Vanadium stock on-line, remember that your saviour Vanadium is just around the corner and who knows, if all goes well, perhaps your next car might be using charged Vanadium as fuel, which has been the case for a few University of New South Wales professors on their local golf course.

TreeHugger

QUOTE(zeitgeist @ Dec 24 2010, 12:06 PM)
dont forget Vanadium Redox Battery, potential energy storage is the most important aspect before all those advanced, science & technology able to operate.
*
Didn't know that. Thanks for telling!



TStzmmalaysia
post Dec 25 2010, 12:59 PM



ROBOTICS

Attached Image

Robot Uses Air Powered Muscles To Run Like A Human

When you go for a jog, do you precisely measure the angle of each of your joints to keep from falling over? No, of course not, only robots do that... and if Ryuma Niiyama has his way, they won't be doing it for much longer either. The former University of Tokyo post-doctoral student created "Athlete", a bipedal robot that runs on legs powered by pneumatic "muscles". Instead of rotating its joints electrically like most robots, Athlete contracts its muscles in the same pattern and with the same timing as you do. To simplify the task, the lower legs were replaced with the sort of prosthetic running blades used in the Paralympics. Niiyama presented Athlete's progress at the recent IEEE Humanoids 2010 conference in Tennessee. You can watch its attempts at running in the video below. Athlete only makes it about five or six steps before falling over, but it looks amazingly life-like while doing so.

Like the human body, Athlete’s mechanical frame is a complex work of art. In each leg of the robot there are seven sets of artificial muscles. Each set contains one to six pneumatic actuators that actually look like a sort of over-sized muscle fiber. The location and power of these actuators correspond, roughly, to the muscles in the human body including the gluteus, quadriceps, etc. There are pressure sensors in each bladed foot and an inertial sensor system in the torso. The goal of Athlete is to have it learn how to flex these artificial pneumatic muscles in order to run like a human and stay balanced while doing so. As you can see below, Athlete is still in the very early stages of that education. The actual running test isn’t until the end (1:25), but check out the rest for some neat background information about the robot.



While at the University of Tokyo, Niiyama was advised by Professor Kuniyoshi and worked with Satoshi Nishikawa to get Athlete running. According to their presentation at the IEEE Humanoids 2010 conference, they were inspired by Oscar Pistorius, the world-renowned athlete who runs on bladed prosthetics. Pistorius shows that bipedal running on blades is not only possible, but also highly efficient. With repeated trials, it's possible that Athlete will be able to manage the muscle coordination needed to keep itself upright while running. Niiyama's previous work, a robot named Mowgli, was able to learn to jump in just 150 trials. You can see it leap more than 50% of its body height in the video below. While Athlete is prone to falling now, it could prove as quick a learner as Mowgli.



I’m sure some of you are curious as to why I’m spending time discussing a robot that can’t get more than 15 feet without falling on its ass. Well, Athlete may not be up to performance standards yet, but it does represent an important concept in robotics. Many of the top bipedal walkers out there (like Asimo, or HRP-4) monitor their joint positions very carefully to keep from toppling over. They know exactly how much each leg should be bent at each portion in their gait. Even small errors in positioning can lead to catastrophic failure. Now, a lot can be done with this technique, including modifying it slightly to allow the foot to fall during portions of the stride, but it’s not how humans control their walking. Similarly, we’ve seen robots that can move pretty fast on two legs and have some great dynamic power and control. We say that these robots are ‘running’ but they don’t quite get both of their feet completely off the ground the way most humans do when sprinting. Athlete actually runs, and it runs like a human – with little regard for exact positioning of its joints. If Athlete can manage to learn how to react quickly and balance itself dynamically, it has the potential to run as smoothly as any human. In the long run it’s not clear how much we’ll want that kind of movement from a robot. Stiff-jointed walking, if performed fast enough, may be better for many applications. Yet the development of robots like Athlete provide an important alternative that I’m glad is available.

SingularityHub



TStzmmalaysia
post Dec 28 2010, 09:09 AM



ROBOTICS

Attached Image

Robotic Surgery for Head and Neck Cancer Shows Promise

Less-invasive robotic surgery for malignant tumors of the upper airway and digestive tract is as effective as other minimally invasive surgical techniques, as measured by patient function and survival, according to University of Alabama at Birmingham researchers.

Head and neck squamous cell carcinomas account for about 4 percent of malignant tumors diagnosed in the United States each year. Currently the standard minimally invasive surgery for these tumors is transoral laser microsurgery.

Previous studies have shown that robotic surgery was better at helping patients regain the ability to swallow (impaired swallowing is a common and serious side effect), but they never looked at cure rates. Magnuson wanted to know if you could preserve function and get rid of the cancer at the same time. This study, published Dec. 20, 2010, in the Archives of Otolaryngology -- Head & Neck Surgery, showed you could.

UAB otolaryngologist and the study's senior author J. Scott Magnuson, M.D., and colleagues from UAB and the Mayo Clinic looked at 89 patients with various stages of head and neck squamous cell carcinomas whose primary tumor was resected using the da Vinci Robot. All of the patients were monitored during their hospital stay and up to 33 months after surgery.

"The overall two-year survival rate for these patients was 86.3 percent, which is comparable to the standard treatment," Magnuson, also a scientist in the UAB Comprehensive Cancer Center, said. "Those with earlier-stage tumors appeared to have slightly better recurrence-free survival than those with later stages, but it was not statistically significant."

Magnuson said patient swallowing varied depending on the location of the tumor, preoperative swallowing ability, cancer stage and patient age, and their findings on function were consistent with previous research. Some patients, he said, tolerated an oral diet one to two days after surgery while some were discharged with a short-term nasal feeding tube or long-term gastric feeding tube, including some who were feeding tube-dependent prior to surgery.

"Of note," he added, "all of the patients in the study had regained full swallowing ability at the time of the last follow up visit and none remained feeding-tube dependent."

Magnuson said the study's results are encouraging and show robotic surgery offers a technically feasible and oncologically sound alternative treatment for some patients with head and neck squamous cell carcinomas, but he cautions more work needs to be done.

"This is a relatively new technique, and long-term oncologic outcomes are not available," he said. "However, the early functional and oncologic results justify the continued treatment of select patients with head and neck squamous cell carcinomas with robotic-assisted surgeries."

ScienceDaily







TStzmmalaysia
post Dec 29 2010, 09:08 AM



APPLIED SCIENCES

Attached Image

New chemical-free, anti-bacterial plastic 'skins'

Taking a leaf from animals like dolphins and pilot whales that are known to have anti-fouling skins, researchers from A*STAR's Industrial Consortium On Nanoimprint (ICON) are using nanotechnology to create synthetic, chemical-free, anti-bacterial surfaces. The surfaces can reduce infections caused by pathogens such as S. aureus and E. coli and can be used on common plastics, medical devices, lenses and even ship hulls. Conventional methods for preventing bacterial surface attachment may use potentially harmful metal ions, nanoparticles, chemicals or UV-radiation.

Nanoimprint technology, a form of nanotechnology, is a simple technique that has been developed by IMRE to make complex nanometer-sized patterns on surfaces to mimic the texture of natural surfaces. This gives the engineered material 'natural' properties such as luminescence, adhesiveness, water-proofing and anti-reflectivity.

The anti-bacterial surfaces research is ICON's second industry-themed project and will involve A*STAR's Institute of Materials Research and Engineering (IMRE) and companies like Nypro Inc (USA), Hoya Corporation (Japan), Advanced Technologies and Regenerative Medicine, LLC (ATRM) (USA), NIL Technology ApS (Denmark) and Akzo Nobel (UK). This is also the first time that three local polytechnics, namely Singapore Polytechnic, Temasek Polytechnic and Ngee Ann Polytechnic, are working with the consortium partners under a special arrangement.

"With millions of years of experience behind her, nature has produced some of the most rugged, adaptable life forms. Who better to learn engineering from than Mother Nature?", said Dr Low Hong Yee, IMRE's Director for Research and Innovation and head of the consortium. She added that the anti-microbial surfaces project will demonstrate the versatility of nanoimprinting technology and its benefits to a wide range of industries.

"The strong support given by industry to this second project and to the consortium is a resounding seal of approval of the research, the talent expertise, the technology and its real-world applications", said Prof Andy Hor, Executive Director of IMRE.

Dr Raj Thampuran, A*STAR Science and Engineering Research Council's (SERC) Executive Director added, "Working closely with companies ensures that our R&D and expertise is translated at the earliest possible time and contributes value to the economy. Borrowing intimately from characteristics in nature represents some of the most frontier and innovative ideas in science and engineering. I am pleased that IMRE's research will help companies challenge difficult engineering problems".

"ICON and nanoimprint research gives our own R&D an added dimension and provides us with alternative options on how our existing technology can be applied", said Mr Steve Ferriday, Technical Manager, Worldwide Marine Foul Release, International Paint Ltd (UK), which is part of Akzo Nobel, the world's largest global paints and coatings company. The company recently established their worldwide marine research laboratory in Singapore and is keen to explore how these surfaces might work in a marine environment.

"Chemical additives in biomedical devices can adversely affect different users in different ways. The anti-microbial surfaces derived from nanoimprint technology without the need for additional chemicals and coatings may offer us an alternative solution to this issue", said Mr Tsuyoshi Watanabe, General Manager, R&D Center of Hoya Corporation, a Japanese-based company dealing in advanced electronics and optics technologies. The company has a plant in Singapore producing implanted lenses for the eye.

"Nypro is excited to be a part of this second project. Our participation in such a world class collaborative programme gives Nypro a competitive advantage in bringing innovation to our customers", commented Mr Michael McGee, Director of Technology from Nypro Inc., a leading global solutions provider in the field of manufactured precision plastic products.

"This collaboration will enable the R&D partners to leverage on their areas of expertise to investigate how bacteria attach to specially designed surfaces of different materials. The industrial applications are tremendous and Ngee Ann Polytechnic is excited to be part of the team. Our student interns from various courses at the School of Life Sciences & Chemical Technology will also benefit from working on projects under the supervision of top researchers," said Mrs Tang-Lim Guek Im, Senior Director for Technology Collaboration at Ngee Ann Polytechnic, Singapore.

TStzmmalaysia
post Dec 29 2010, 09:10 AM



ROBOTICS

Attached Image

Robot solves Rubik's cube in 15 seconds

A Rubik's cube is probably not on most people's Christmas list this year, but it's still inspiring engineers to create novel ways of solving it. Zachary Grady and Joe Ridgeway, students at Rowan University in New Jersey, built a robotic arm from scratch and created software that can solve the cube in just 15 seconds.

The system uses a camera to capture how the cube is scrambled and sends the images to a computer, which determines the pattern on each face and runs algorithms to solve the cube. The solution is then translated into commands for the arm's pneumatics and motors (see video above).

Ridgeway was inspired by his own expertise - he can consistently solve the cube in about 45 seconds. "We knew the device was capable of doing these movements," he says. The team designed the arm so that the cube could be held in one corner, allowing it to be quickly rotated without having to re-grip it after each move.
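The pipeline, then, is camera image, face-pattern recognition, a solving algorithm, and arm commands. A full solver is far beyond a short example, but one piece of bookkeeping any such software needs, turning a move sequence into the sequence that undoes it, is easy to sketch (this is my illustration, not the students' actual code):

```python
def invert_moves(scramble):
    """Return the move sequence that undoes `scramble`.

    Moves use standard cube notation: 'U' is a clockwise quarter turn,
    "U'" the counter-clockwise turn, and 'U2' a half turn (its own
    inverse). To undo a sequence, reverse the order and invert each turn.
    """
    inverse = {"": "'", "'": "", "2": "2"}
    return [m[0] + inverse[m[1:]] for m in reversed(scramble)]
```

For example, `invert_moves(["U", "R'", "F2"])` returns `["F2", "R", "U'"]`; applying it after the scramble restores the solved state.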

VIDEO

NewScientist

TStzmmalaysia
post Dec 29 2010, 09:14 AM



APPLIED SCIENCES

Attached Image

Self-Cleaning Fridge Concept Makes Meals, Saves Money

When I first saw this futuristic fridge, my thought was "What ever happened to the simple insulated box that makes things cold?" The full-size touch screens look like energy wasters and the whole thing seems like an ugly addition to your kitchen. However, looking a little more closely, this fridge could actually be a solution for shrinking your overall environmental footprint.

The concept was dreamed up as a project between University of Central Lancashire and online supermarket Ocado, a UK-based grocery with a mind for the planet. In fact, their tag line is "Quality groceries that won't cost the earth." A significant part of minimizing the eco-impact of food is minimizing waste -- and that is a primary purpose of this concept fridge.

According to the Daily Mail, the fridge scans its contents and comes up with recipes you can make from whatever you have in there, including your leftovers. This helps to ensure that you never waste food by forgetting about it and letting it go bad, or never figuring out what to cook with it before it expires.

It can move food around its shelves according to expiration date, so the stuff that needs to be used first is up front. And it can also reorder fresh food when needed. Plus, the designers dreamed up coordinating trashcan technology that would scan foods tossed out -- the fridge would read the data and reduce these ingredients in future meals.
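The sorting and matching behaviour the designers describe is, at bottom, ordinary data-structure work. A minimal sketch, with invented fridge contents and recipes (all names and dates below are hypothetical):

```python
from datetime import date

# Hypothetical fridge contents: ingredient -> expiry date
contents = {
    "milk": date(2011, 1, 4),
    "eggs": date(2011, 1, 10),
    "spinach": date(2011, 1, 3),
    "cheese": date(2011, 2, 1),
}

# Hypothetical recipe book: name -> required ingredients
recipes = {
    "omelette": {"eggs", "cheese"},
    "spinach quiche": {"eggs", "spinach", "milk"},
    "beef stew": {"beef", "carrots"},
}

def front_shelf(contents):
    """Items that need using first: sorted by expiry date, soonest first."""
    return sorted(contents, key=contents.get)

def cookable(recipes, contents):
    """Recipes whose every ingredient is already in the fridge."""
    have = set(contents)
    return [name for name, needs in recipes.items() if needs <= have]
```

Here the spinach ends up at the front of the shelf, and the fridge would suggest the omelette and the quiche but not the stew, since beef and carrots are missing.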

How easy would it be to come home after a long day and just look at the fridge to know what you can whip up without having to go to the store for ingredients?

Attached Image

The screens also show you what you have inside so you don't have to open the doors to find out, and that alone is a big energy saver.

So let's say the fridge is built to the highest energy-efficiency standards, and the screens are e-paper with a small touch panel for navigation, or some other efficient display that stays off until you're ready to grab a recipe or a snack. Throw in zero food waste and the energy saved by opening the door as few times as possible, and suddenly the giant screens aren't so wasteful. There is still the energy use of the scanners and the shelf tiles that shuffle food around inside, but maybe it could be built to at least break even with the average fridge's energy use. The technology dreamed up for this fridge is still years from being realistic, so who knows what could happen.

TreeHugger

TStzmmalaysia
post Dec 29 2010, 09:16 AM



APPLIED SCIENCES

Attached Image

Scientists Plan "Living Earth Simulator" to Track Disease, Disasters and Traffic

It could be one of the most ambitious computer projects ever conceived.

An international group of scientists are aiming to create a simulator that can replicate everything happening on Earth - from global weather patterns and the spread of diseases to international financial transactions or congestion on Milton Keynes' roads.

Nicknamed the Living Earth Simulator (LES), the project aims to advance the scientific understanding of what is taking place on the planet, encapsulating the human actions that shape societies and the environmental forces that define the physical world.

"Many problems we have today - including social and economic instabilities, wars, disease spreading - are related to human behaviour, but there is apparently a serious lack of understanding regarding how society and the economy work," says Dr Dirk Helbing of the Swiss Federal Institute of Technology, who chairs the FuturICT project, which aims to create the simulator.
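Disease spread, one of the behaviours LES would have to capture, is classically modelled with the SIR (susceptible-infected-recovered) equations. A minimal discrete-time sketch, with illustrative parameters of my choosing (beta = 0.3, gamma = 0.1, i.e. a basic reproduction number of 3):

```python
def sir_step(s, i, r, beta=0.3, gamma=0.1):
    """One day of the SIR epidemic model, on population fractions.
    beta: infection rate, gamma: recovery rate -- illustrative values."""
    new_infections = beta * s * i
    new_recoveries = gamma * i
    return (s - new_infections,
            i + new_infections - new_recoveries,
            r + new_recoveries)

def simulate(days=200, i0=0.001):
    """Run the outbreak from a 0.1% seed; track the infection peak."""
    s, i, r = 1.0 - i0, i0, 0.0
    peak = i
    for _ in range(days):
        s, i, r = sir_step(s, i, r)
        peak = max(peak, i)
    return s, i, r, peak
```

With these parameters the outbreak grows, peaks, and burns out, leaving most of the population in the recovered pool. A planetary simulator would layer many such coupled models (traffic, finance, weather) on real data streams, which is exactly what makes the project so ambitious.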

BBC Science

TStzmmalaysia
post Jan 1 2011, 10:13 PM



APPLIED SCIENCES

Attached Image

Water purification made simpler

Inside a growing number of homes in the developing world, sand and biological organisms are collaborating to decontaminate drinking water.

Arranged in barrels of concrete or plastic, these biosand water filtration systems (BSFs) remove 95 to 99 percent of the bacteria, viruses, worms and particles contained in rain or surface water. A layer of microorganisms at the top of the sand bed consumes biological and other organic contaminants, while the sand below removes contaminants that cause cloudiness and odor.
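Water engineers usually quote such removal efficiencies as log-reduction values (LRV): the quoted 95 to 99 percent corresponds to roughly 1.3 to 2 logs. The conversion is simple (the function names here are mine):

```python
import math

def log_reduction(removal_fraction):
    """Log-reduction value: 0.99 removal -> 2.0 logs, 0.999 -> 3.0 logs."""
    return -math.log10(1.0 - removal_fraction)

def remaining(count, removal_fraction):
    """Organisms left after filtration, e.g. 1e6 -> 1e4 at 99% removal."""
    return count * (1.0 - removal_fraction)
```

The log scale makes the stakes clear: moving from 95 to 99 percent removal sounds incremental, but it means five times fewer organisms getting through.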

A BSF can produce several dozen liters of clean water in an hour. But it can weigh several hundred pounds and cost up to $30, an expense some families in developing countries cannot afford.

Kristen Jellison and her students are trying to build a BSF that is smaller than the standard system, but just as effective.

“Smaller, lighter BSFs,” says Jellison, an associate professor of civil and environmental engineering, “would be cheaper, easier to transport and available to a broader global market. Preliminary research has shown the potential for smaller systems to remove most disease-causing organisms, except possibly viruses.”

Jellison, who is affiliated with the university’s STEPS (Science, Technology, Environment, Policy and Society) initiative, has devoted most of her career to improving drinking water. As a co-adviser to Lehigh’s chapter of Engineers Without Borders, she helped lead efforts to design and build a 20,000-gallon water-storage tank and chlorination system in Pueblo Nuevo, Honduras.

With support from NSF and the Philadelphia Water Department, she has spent five years studying the parasite Cryptosporidium parvum and its transport and fate in water bodies. The parasite is found in multiple hosts, is difficult to eradicate, and can be deadly to people with compromised immune systems.

In an effort to identify possible sources of Cryptosporidium contamination in the Philadelphia watershed, Jellison studies the DNA of various species using a technique called polymerase chain reaction (PCR). She also studies the impact on Cryptosporidium of biofilms, the slimy layers of microorganisms that form on rocks, pipes and other surfaces in water.

Jellison’s group is conducting experiments on BSFs of various sizes, including systems that fit inside two- and five-gallon plastic pails. (The typical BSF is 3 feet high.) The group will change the depth of the sand column, add rusty nails to several pails (in an effort to increase virus removal), and alter other parameters.

“BSFs were developed in the 1980s,” says Jellison. “This is the most comprehensive study to date to characterize the efficiency of different filter types.”

PhysOrg



TStzmmalaysia
post Jan 3 2011, 10:40 AM



ENERGY

Attached Image

NASA takes a look at the Jet Stream to get 50 times more wind power

NASA aerospace engineer Mark Moore is using a $100,000 federal grant to research what it will take to create a jet stream-based wind industry 30,000 feet above the ground.

The reason the US government is interested in tapping the jet stream is that up there, winds blow consistently at 150 miles per hour, so futuristic satellite-based wind turbines or kite-type turbines such as those from Kitegen and Magenn flying at that altitude have the potential to generate 50 times the gigawatts that ground-based turbines can. So far, the early Magenn prototype flies at 1,000 feet.

“At 2,000 feet, there is two to three times the wind velocity compared to ground level,” Moore said. “The power goes up with the cube of that wind velocity, so it’s eight to 27 times the power production just by getting 2,000 feet up, and the wind velocity is more consistent.”
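Moore's arithmetic is the standard cube law for wind: power density is one half times air density times the cube of wind speed, so doubling or tripling the wind speed multiplies available power by 8 or 27. A quick check, assuming sea-level air density (in reality density falls with altitude, which trims the gain somewhat):

```python
def power_density(v, rho=1.225):
    """Wind power density in W/m^2: 0.5 * rho * v**3.
    rho = 1.225 kg/m^3 is sea-level air density; it is lower aloft."""
    return 0.5 * rho * v ** 3

def gain(velocity_ratio):
    """Power multiplier implied by a wind-speed multiplier: the cube law."""
    return velocity_ratio ** 3
```

`gain(2)` is 8 and `gain(3)` is 27, matching Moore's "eight to 27 times the power production" for two to three times the wind velocity.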

50 times greater energy density

Higher still: 30,000 feet is where this new resource will play out. If you can send turbines further up, to 30,000 feet, into the jet stream, “instead of 500 watts per meter for ground-based wind turbines, you’re talking about 20,000, 40,000 watts per square meter,” Moore said. “That’s very high energy density and potentially lower cost wind energy because of the 50-plus fold increase in energy density.”

Moore has undertaken the wind-power study to streamline R&D and to reduce friction between competitors for airspace. As more kite-type wind turbines move from pie-in-the-sky idea to deployment, one entity needs to develop a plan that makes it possible for them to coexist in the same airspace; only NASA has that kind of experience.

That means dealing with current Federal Aviation Administration regulations and with those that might be necessary to accommodate an airspace that includes manned aircraft, the unmanned aircraft in the future, plus wind-borne energy turbines. The jet stream is very useful to commercial airlines, because the much greater wind speeds greatly reduce their need for fuel.

One solution? Site future potential jet-stream-based wind farms in little-traveled areas of the jet stream over the ocean.

“Offshore deployment of these airborne systems probably makes the most sense in terms of both airspace and land use”, says Moore, “because there is little to no demand for low altitude flight over oceans 12 miles offshore.”

His research also involves some of the core capabilities of NASA in aeronautics, composite materials and air space management. So leaders in this area of the wind power industry, as well as other government agencies, including the Department of Energy and National Renewable Energy Laboratory, have been working with NASA on the research.

“They welcome this study because they’ve never dealt with flying systems and NASA has,” Moore said. “You can’t come up with advanced concepts until you understand the requirements well, and frankly, I don’t think anybody understands the requirements well.”

As we catapult into a real clean energy future, the sky is the limit.

CleanTechnica




TStzmmalaysia
post Jan 3 2011, 10:45 AM



ROBOTICS

Attached Image

IBM's TriviaBot Watson to Take on Ken Jennings in Man Vs. Machine Episode of Jeopardy

Watson, an artificial intelligence program created by IBM (and named after Thomas J. Watson, IBM's founder, not Sherlock Holmes's roommate), is designed as a question-and-answer bot, able to interpret and respond to questions posed in normal human language patterns. The natural use for such a program is, of course, the greatest game show that ever was or ever will be: Jeopardy!. In February, Watson will be facing off against two of Jeopardy!'s toughest competitors ever: Ken Jennings, whose 74-day winning streak was the longest in the show's history, and Brad Rutter, whose $3.3 million winnings are the show's highest.

Creating an AI that can compete on Jeopardy! is an incredibly difficult task for any programmer. The venerable show poses its clues not only as simple tests of trivia knowledge, but also as puns, various forms of wordplay, trick questions, riddles, and other complex queries. It takes uniquely flexible and quick thinking to succeed at Jeopardy!, placing Watson on a pedestal with chess-playing computer (and IBM sibling) Deep Blue.



On February 14th, Watson will go head-to-head-to-head on a very special episode of Jeopardy!, playing against trivia legends Ken Jennings and Brad Rutter. (In case you were wondering, Watson's spot at the challenger podium will be taken by an avatar--no word on what strained anecdote Trebek will coax out of him and then casually mock.) Watson has been prepping for the battle by sparring with other Tournament of Champions competitors, though neither IBM nor Jeopardy! has released the trivia-bot's record against human competitors. You can see in the video above that some wordings can trip him up, so nobody knows exactly how capable a competitor he'll be. The winner will receive $1 million--if Watson wins, IBM will donate the money to charity, and both of the human competitors have pledged to donate half the prize if either wins.

PopSci



TStzmmalaysia
post Jan 3 2011, 10:46 AM



APPLIED SCIENCES

Attached Image

Army evaluating transportable solar-powered tents

ARLINGTON, Va. (Army News Service, Dec. 8, 2010) -- The U.S. Army is evaluating a host of flexible, portable, lightweight solar-powered shades and tent-like technologies.

The products are designed to allow expeditionary units to deploy with transferrable, exportable electrical power that can charge batteries, computers and other essential gear without needing fuel or a generator, service officials said.

Using a fast-evolving technology known as Flexible Photovoltaics (PV), the solar-powered tent structures convert light energy into electricity, thus removing the need to haul generators and large amounts of fuel.

“They are ideal for charging up batteries, making sure your (communications), night vision goggles and computers are powered up. You don’t want a generator on top of a mountain, and you don’t want to have to bring fuel to a generator or haul batteries,” said Katherine Hammack, assistant secretary of the Army for installations, energy and environment.

Technological advances in the area of photovoltaics have made it possible to build lightweight, portable materials which are flexible and can easily travel with dismounted units.

In fact, the Army has already deployed some of these technologies to forward locations around the world for additional evaluation, sending some to places such as Afghanistan, said Steven Tucker, a senior engineer in the Shelters Technology, Engineering and Fabrication Directorate at the Natick Soldier Research Design and Engineering Center.

In addition, Hammack said the Army is hoping to deploy more of the solar-powered tents in the near future.

“The technology has reached the point where the testing has shown they [solar-powered tents] are proven. Our teams have worked on the inverters and the durability of the systems. The durability of the tent covers has evolved to a point where we would like to see more of them deployed,” Hammack said.

Some of the Flexible PV products being evaluated are the Power Shade, TEMPER Fly and QUADrant, military shelter items of various sizes and configurations that use flexible solar panels to harness light energy and convert it into transferable electricity.

“The technology we are using is called amorphous silicon. It’s been around since the early 80s. It takes the energy from the sun – photons. They [photons] go into the PV materials and they essentially knock loose electrons. Those electrons are then gathered and utilized for power, converting solar power to electrical power,” Tucker explained.

The TEMPER Fly is a roughly 16-by-20-foot tent structure able to generate 800 watts of electricity. A QUADrant is a smaller variant of the TEMPER Fly, able to generate 200 watts of power, and the Power Shades range in size and are capable of generating up to 3 kilowatts of exportable electrical power, Tucker said. The PV integrated military shelter items use a lamination process to combine the PV materials into the textile substrate, Tucker explained.
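Those figures allow a rough power-per-area estimate (the unit conversion and the comparison are mine, not from the article):

```python
SQFT_TO_M2 = 0.09290304  # exact, by definition of the foot

def watts_per_m2(watts, length_ft, width_ft):
    """Rough electrical output per unit of fabric footprint area."""
    return watts / (length_ft * width_ft * SQFT_TO_M2)

# TEMPER Fly: roughly 16 x 20 ft of fabric generating 800 W
temper_fly = watts_per_m2(800, 16, 20)
```

That works out to roughly 27 W per square metre of footprint, well below rigid crystalline panels; the lower yield of amorphous silicon is the trade-off for a laminate light and flexible enough to pack with a tent.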

“Alternative energy sources are really going to shine in mission scenarios where you don’t want to use a generator because you don’t want the noise or heat signature that goes along with it, or where re-supplying that generator with fuel doesn’t make sense,” said Tucker.

US Army

TStzmmalaysia
post Jan 5 2011, 09:26 AM



NANOTECHNOLOGY

Attached Image

Japanese researchers create palladium-like alloy using nanotechnology, 'present-day alchemy'

Japanese researchers have created an alloy with properties similar to palladium, a precious metal used in many high-tech goods, a news report said Thursday, dubbing the breakthrough "present-day alchemy".

Kyoto University professor Hiroshi Kitagawa and his team said they used nano-technology to combine rhodium and silver, elements which do not usually mix, to produce the new composite, the Yomiuri daily said.

The alloy has similar properties to palladium, which is used in cars' emission-reducing catalytic converters as well as in computers, mobile phones, flatscreen TVs and dentistry instruments.

Like other white metals, such as silver and platinum, palladium is expensive, with its deposits largely limited to South Africa and Russia.

Palladium also has applications in the production of fuel cells -- a clean and renewable energy source that produces electricity by combining hydrogen and oxygen, with water as the only byproduct.

To make the new alloy, the Kyoto team used nano-technology to "nebulise" the rhodium and silver and gradually mixed them with heated alcohol, with the two metals mixed stably at the atomic level, the report said.

Japan's industry ministry has listed 31 rare metals, including palladium and lithium, which are used in industrial products, such as electronic devices and batteries. Of these, 17 elements are called rare earth minerals.

Resource-poor Japan has tried to shift from its dependence on China, which controls the bulk of global rare earth production.

Kitagawa said he hopes to create more alloys using nano-technology, without specifying which ones, the Yomiuri said.

PhysOrg

TStzmmalaysia
post Jan 5 2011, 09:32 AM



ENERGY

Attached Image

New Floating Wind Turbine Harvests Energy from on High

Think of highly portable wind turbines that can adjust their height to take advantage of the best winds, and you’ve got the next generation of airborne wind energy devices. A 100-kW model from Magenn Power, Inc. is about to go on the market, so let’s dig a little deeper into the idea of harvesting energy through a kite string.

Airborne Wind Energy

The basic principle is simple: instead of anchoring a wind turbine to the ground, you float it up and make its tether double as a grid connector. Their portability, ease of installation and minimal use of land space could make airborne turbines ideal for innumerable small scale uses, including disaster relief and other emergency services.

Many Places for Airborne Wind Energy

One potential use for airborne wind energy is at sites that are not suitable for on-ground alternative energy installations. For example, airborne turbines could be tethered at brownfields as part of the U.S. EPA’s RE-Powering America's Land program, or at construction sites where extra space is minimal. They could also become an important alternative energy source for outdoor festivals and other temporary events (which are already beginning to introduce solar power and pedal power, by the way).

Magenn’s Airborne Turbine

The Magenn Power Wind Turbine, called MARS, differs from a kite-style wind power system in that it’s held aloft by helium rather than relying on the force of wind. It’s basically a blimp that houses rotors which spin on a horizontal axis. It can fly at altitudes up to 1,000 feet, and the system includes a battery so energy can be used immediately on site, stored for later use, or transferred to the grid. The company foresees a diverse market that includes isolated communities and remote facilities such as cell towers or mines, as well as farms, factories and the aforementioned disaster relief.

CleanTechnica


TStzmmalaysia
post Jan 5 2011, 09:34 AM



ENERGY

Attached Image

China says it knows how to reprocess nuclear fuel

Chinese scientists have mastered the technology for reprocessing fuel from nuclear power plants, potentially boosting the supplies of carbon-free electricity to keep the country's economy booming, state television reported Monday.

The breakthrough will extend by many times the amount of power that can be generated from China's nuclear plants as fissile and fertile materials are recovered to be new fuel, CCTV said.

Several European countries, Russia, India and Japan already reprocess nuclear fuel - the actual materials used to make nuclear energy - to separate and recover the unused uranium and plutonium, reduce waste and safely close the nuclear cycle.

The CCTV report gave no details on whether or when China would begin reprocessing on an industrial scale.

China overtook the United States as the world's largest energy consumer in 2009, years before it was expected to do so, according to the Paris-based International Energy Agency.

But it is heavily dependent on coal, a major pollutant. It has 13 nuclear power plants in use now and ambitiously plans to add potentially hundreds more.

Reprocessing nuclear fuel costs significantly more than using it once and storing it as waste. It is also controversial because extracted plutonium can be used in nuclear weapons, although China has long had a nuclear arsenal.

U.S. commercial reprocessing of plutonium was halted by then-President Jimmy Carter because of nuclear proliferation worries. Then-President George W. Bush proposed a resumption, but the National Research Council found it not economically justifiable. President Barack Obama scrapped the Bush effort.

Recovered plutonium and - when prices are high - uranium can be re-used. Some reactors can use other reprocessed components, potentially multiplying the amount of energy that results from the original uranium fuel by about 60 times.

Wang Junfeng, project director for the state-run China National Nuclear Corporation, told CCTV the Chinese scientists employed a chemical process that was effective and safe.

"In this last experiment, we made a preparation of standard quality uranium products and standard quality plutonium products, so we can say we were successful," Wang said.

CCTV said the country has enough fuel now to last up to 70 years and the breakthrough could yield enough to last 3,000 years.
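Those two reported horizons can be cross-checked against the "about 60 times" multiplier mentioned earlier (the figures are as reported; the arithmetic is mine):

```python
def implied_multiplier(years_after, years_before):
    """Fuel-supply multiplier implied by the reported before/after horizons."""
    return years_after / years_before

# CCTV's figures: fuel for up to 70 years now, up to 3,000 years after
# reprocessing -- an implied multiplier of about 43x, the same order as
# the roughly 60x theoretical gain cited for recycling spent fuel.
m = implied_multiplier(3000, 70)
```

The two claims are therefore broadly consistent, with the reported horizon somewhat below the theoretical ceiling.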

To produce that amount of fuel, however, China would have to build a hugely expensive and highly dangerous breeder reactor, said Matthew Bunn, an expert on the Chinese nuclear program at Harvard University's John F. Kennedy School of Government.

Rather than build a breeder reactor or even start reprocessing on a commercial scale, China should simply store used fuel for the next several decades while safer and less expensive technology emerges, Bunn said.

"Reprocessing the spent fuel is much more dangerous," Bunn said, adding that it increased the risk of nuclear terrorism if recovered fuel were stolen.

CCTV says the details of the process the Chinese scientists developed after 20 years' work are being kept secret. The technologies used in other countries also are considered industrial secrets and generally not shared.

Bunn said China built a pilot-scale reprocessing plant several years ago but repeatedly postponed using it, possibly because of technical problems.

"My interpretation of this statement is that they have resolved whatever issues were delaying that," Bunn said.

China's total 2009 energy consumption, including sources ranging from oil and coal to wind and solar power, was equal to 2.265 billion tons of oil, compared with 2.169 billion tons used by the U.S., the IEA said.

The consumption boom reflects China's transformation from a nation of subsistence farmers to one of workers increasingly trading bicycles for cars and buying air conditioners and other energy-hungry home electronics.

That has also bestowed on China status as the world's biggest polluter, although Beijing has long pointed at developed nations in climate change talks and resists international pressure for it to take a larger role in curbing greenhouse gas emissions.

PhysOrg






TStzmmalaysia
post Jan 5 2011, 09:39 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


ENERGY

Attached Image

New solar cell self-repairs like natural plant systems

WEST LAFAYETTE, Ind. - Researchers are creating a new type of solar cell designed to self-repair like natural photosynthetic systems in plants by using carbon nanotubes and DNA, an approach aimed at increasing service life and reducing cost.

"We've created artificial photosystems using optical nanomaterials to harvest solar energy that is converted to electrical power," said Jong Hyun Choi, an assistant professor of mechanical engineering at Purdue University.

The design exploits the unusual electrical properties of structures called single-wall carbon nanotubes, using them as "molecular wires in light harvesting cells," said Choi, whose research group is based at the Birck Nanotechnology and Bindley Bioscience centers at Purdue's Discovery Park.

"I think our approach offers promise for industrialization, but we're still in the basic research stage," he said.

Photoelectrochemical cells convert sunlight into electricity and use an electrolyte - a liquid that conducts electricity - to transport electrons and create the current. The cells contain light-absorbing dyes called chromophores, chlorophyll-like molecules that degrade due to exposure to sunlight.

"The critical disadvantage of conventional photoelectrochemical cells is this degradation," Choi said.

The new technology overcomes this problem just as nature does: by continuously replacing the photo-damaged dyes with new ones.

"This sort of self-regeneration is done in plants every hour," Choi said.

The new concept could make possible an innovative type of photoelectrochemical cell that continues operating at full capacity indefinitely, as long as new chromophores are added.

Findings were detailed in a November presentation during the International Mechanical Engineering Congress and Exhibition in Vancouver. The concept also was unveiled in an online article (http://spie.org/x41475.xml?ArticleID=x41475) featured on the Web site for SPIE, an international society for optics and photonics.

The talk and article were written by Choi, doctoral students Benjamin A. Baker and Tae-Gon Cha, and undergraduate students M. Dane Sauffer and Yujun Wu.

The carbon nanotubes work as a platform to anchor strands of DNA. The DNA is engineered to have specific sequences of building blocks called nucleotides, enabling them to recognize and attach to the chromophores.

"The DNA recognizes the dye molecules, and then the system spontaneously self-assembles," Choi said.

When the chromophores are ready to be replaced, they might be removed by using chemical processes or by adding new DNA strands with different nucleotide sequences, kicking off the damaged dye molecules. New chromophores would then be added.

Two elements are critical for the technology to mimic nature's self-repair mechanism: molecular recognition and thermodynamic metastability, or the ability of the system to continuously be dissolved and reassembled.

The research is an extension of work that Choi collaborated on with researchers at the Massachusetts Institute of Technology and the University of Illinois. The earlier work used biological chromophores taken from bacteria, and findings were detailed in a research paper published in November in the journal Nature Chemistry (http://www.nature.com/nchem/journal/v2/n11.../nchem.822.html).

However, using natural chromophores is difficult, and they must be harvested and isolated from bacteria, a process that would be expensive to reproduce on an industrial scale, Choi said.

"So instead of using biological chromophores, we want to use synthetic ones made of dyes called porphyrins," he said.

Eurekalert

TStzmmalaysia
post Jan 5 2011, 09:44 AM



ROBOTICS

Attached Image

NASA Tests Handy-Man Space Robots For Orbital Repairs

Springing off the heels of a successful repair mission to the Hubble Space Telescope, NASA has been quietly working on developing a new specialty: satellite repair-bots.

The goal of the NASA project is to demonstrate to commercial firms the feasibility of refueling, repairing and servicing spacecraft in orbit.

There are more than 360 operational commercial satellites and hundreds of government spacecraft currently in orbit, many of which will run out of fuel long before they sustain electronics or other systems failures.

"It's our idea to stimulate a pathfinder kind of mission," said Frank Cepolina, a Hubble mission development manager now spearheading NASA's new Satellite Servicing Development Office at the Goddard Space Flight Center in Greenbelt, Md. "Once we're done, commercial takes over."

The first of what could be several demonstration missions is expected to fly on the International Space Station sometime next year. The plan is to use Dextre, the station's Canadian-built robot, to demonstrate autonomous orbital refueling.

Outfitted with smart sensors and tools, Dextre would basically pump fuel through tank valves that are identical to equipment flying on hundreds of satellites today. The robot would have to remove insulation, disconnect safety wires and prepare ports as part of the job.

"We want to demonstrate our ability to get up there...and pass fuel into valves and into a receiving tank, and do this test in many configurations, many different times," said Cepolina.

Dextre already has been through the paces. Before it was launched to the space station, the Hubble team used it to test robotic servicing options for fixing the telescope. NASA initially canceled the shuttle servicing mission after the 2003 Columbia accident, believing it was too risky to fly astronauts anywhere but the space station. In the end, NASA reinstated the shuttle's mission to Hubble, which was successfully completed last year.

The robotics work, however, was not in vain. Tools and techniques developed for the robotic servicing of Hubble were adapted for the shuttle mission, boosting the productivity of the spacewalking service teams. After the flight, NASA began thinking more generically about robotic satellite servicing.

"We're trying to build an industry, not a government program in whatever NASA does and leave something behind," Dave Huntsman, who oversees commercial space initiatives at NASA Headquarters in Washington, D.C., told Discovery News.

"In area after area, we're falling behind. We're fourth in commercial launches now, with less than 10 percent of the market. We used to have 100 percent of the commercial satellite launch market," he said.

"It isn't just a matter of money," Huntsman added. "It's whether the government leverages the money to leave a sustainable industry behind. The U.S. hasn't been doing that. That's why our competitive industries are falling behind."

Cepolina's team would like to get its refueling demonstration gear on the last shuttle flight to the station, currently scheduled for September, but with just four flights remaining, there is a lot of competition. Ground demonstrations of robotic refueling are under way.

DiscoveryNews

TStzmmalaysia
post Jan 5 2011, 09:48 AM



APPLIED SCIENCES

Attached Image

Ceiling lights in Minn. send coded Internet data

(AP) -- Flickering ceiling lights are usually a nuisance, but in city offices in St. Cloud, they will actually be a pathway to the Internet.

The lights will transmit data to specially equipped computers on desks below by flickering faster than the eye can see. Ultimately, the technique could ease wireless congestion by opening up new expressways for short-range communications.

The first few light fixtures built by LVX System, a local startup, will be installed Wednesday in six municipal buildings in this city of 66,000 in the snowy farm fields of central Minnesota.

The LVX system puts clusters of its light-emitting diodes, or LEDs, in a standard-sized light fixture. The LEDs transmit coded messages - as a series of 1s and 0s in computer speak - to special modems attached to computers.

A light on the modem talks back to the fixture overhead, where there is a sensor to receive the return signal and transmit the data over the Internet. The computers on the desks aren't connected to the Internet except through these light signals, much as Wi-Fi allows people to connect wirelessly.
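The "series of 1s and 0s" is, at its simplest, on-off keying: the LED switches on for a 1 and off for a 0, far faster than the eye can follow. LVX's actual framing and modulation are not described in the article, so the following is only a minimal sketch of the general idea:

```python
def to_ook_symbols(data: bytes) -> list[int]:
    """Encode bytes as on/off keying symbols: 1 = LED on, 0 = LED off."""
    symbols = []
    for byte in data:
        for bit in range(7, -1, -1):      # most significant bit first
            symbols.append((byte >> bit) & 1)
    return symbols

def from_ook_symbols(symbols: list[int]) -> bytes:
    """Decode symbols back into bytes (assumes perfect synchronization)."""
    out = bytearray()
    for i in range(0, len(symbols), 8):
        byte = 0
        for bit in symbols[i:i + 8]:
            byte = (byte << 1) | bit
        out.append(byte)
    return bytes(out)
```

A real visible-light link adds framing, error correction, and DC balancing on top of this so that the average brightness stays constant and the light never appears to flicker or dim.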

LVX takes its name from the Latin word for light, but the underlying concept is older than Rome; the ancient Greeks signaled each other over long distances using flashes of sunlight off mirrors and polished shields. The Navy uses a Morse-coded version with lamps.

The first generation of the LVX system will transmit data at speeds of about 3 megabits per second, roughly as fast as a residential DSL line.

Mohsen Kavehrad, a Penn State electrical engineering professor who has been working with optical network technology for about 10 years, said the approach could be a vital complement to the existing wireless system.

He said the radio spectrum usually used for short-range transmissions, such as Wi-Fi, is getting increasingly crowded, which can lead to slower connections.

"Light can be the way out of this mess," said Kavehrad, who is not involved in the LVX project.

But there are significant hurdles. For one, smart phones and computers already work on Wi-Fi networks that are much faster than the LVX system.

Technology analyst Craig Mathias of the Farpoint Group said the problems with wireless congestion will ease as Wi-Fi evolves, leaving LVX's light system to niche applications such as indoor advertising displays and energy management.

LVX Chief Executive Officer John Pederson said a second-generation system that will roll out in about a year will permit speeds on par with commercial Wi-Fi networks. It will also permit lights that can be programmed to change intensity and color.

For the city, the data networking capability is secondary. The main reason it paid a $10,000 installation fee for LVX is to save money on electricity down the line, thanks to the energy-efficient LEDs. Pederson said one of his LED fixtures uses about 36 watts of power to provide the same illumination that 100 watts provides with a standard fluorescent fixture.
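Using the article's own figures (36 W per LED fixture versus 100 W for an equivalent fluorescent fixture), the electricity savings are easy to estimate. The daily usage hours and electricity price below are assumptions for illustration, not figures from the article:

```python
led_watts = 36.0           # per LVX fixture (from the article)
fluorescent_watts = 100.0  # standard fixture, same illumination (from the article)
hours_per_year = 12 * 365  # assumed: 12 hours of use per day
price_per_kwh = 0.10       # assumed average electricity price, USD

saved_kwh = (fluorescent_watts - led_watts) * hours_per_year / 1000
saved_usd = saved_kwh * price_per_kwh
print(f"{saved_kwh:.0f} kWh and ${saved_usd:.2f} saved per fixture per year")
```

Under those assumptions each fixture saves roughly 280 kWh a year, which is why the lighting savings alone, before any networking benefit, can justify the installation fee.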

Besides installation costs, customers such as St. Cloud will pay LVX a monthly fee that's less than their current lighting expenses. LVX plans to make money because the LED fixtures are more durable and efficient than standard lighting. At least initially, the data transmission system is essentially a bonus for customers.

Pederson said the next generation of the system should get even more efficient as fixtures become "smart" so the lights would dim when bright sunlight is coming through a window or when a conference room or hallway is empty.

Because the lights can also change color, Pederson said they could be combined with personal locators or tiny video cameras to help guide people through large buildings. The lights could show a trail of green lights to an emergency exit, for instance.

While Kavehrad and Mathias credited LVX for being the first company in the United States to bring the technology to market, Kavehrad said it trails researchers and consumer electronics companies in Japan and Korea in developing products for visible-light networks.

Pederson's previous company, 911 EP, built high-powered LED roof lights for squad cars and other emergency vehicles. He said he sold the company in 2002, and that the visible-light network grew out of his interest in LEDs, which goes back to the mid-1990s.

The Minneapolis-St. Paul International Airport, which pays for 24-hour lighting and replacing fluorescent bulbs on high ceilings, is considering an LVX system, said Jeffrey W. Hamiel, executive director of the Metropolitan Airports Commission.

The system might include mounting cameras on the light fixtures to bolster the airport security system, but the real attraction is the savings on electricity and maintenance.

"Anything we can do to save costs is worth consideration," he said.

Michael Williams, the city administrator in St. Cloud, said the city had been considering LVX for some time.

"It's pretty wild stuff," he said. "They have been talking about it with us for a couple of years, and frankly it took a while for it to sink in."

PhysOrg
TStzmmalaysia
post Jan 5 2011, 09:51 AM



APPLIED SCIENCES

Attached Image

Electronics on Anything: Chemical trick puts solar cells and other electronics on rice paper, Saran wrap, and more practical things, too

There's probably not much call for printing solar cells on toilet paper, but a method developed at MIT can do just that, if it's ever needed.

More to the point, oxidative chemical vapor deposition (oCVD) could allow low-cost production of solar cells and other electronic devices on thin, flexible materials that other processes can't easily handle. Miles Barr, a graduate student in the lab of MIT chemical engineering professor Karen Gleason, described the process at the fall meeting of the Materials Research Society, in Boston.

The technique deposits conjugated polymers, plastics with good conductivity and semiconductor properties that are also flexible, stretchable, and even foldable. "We're particularly interested in polymers because of their good mechanical properties," Barr says.

The process sprays a vapor of a monomer and an oxidizing agent onto a substrate. When they meet on the surface, they polymerize, joining into long chains to form a plastic popularly known as PEDOT. Varying the surface temperature of the substrate between 20 °C and 100 °C dictates how the surface of the film forms; it can range from smooth to studded with nanopores. The polymer is conductive on its own, but lacing the nanopores with silver particles can increase conductivity up to a thousandfold. Barr says the process allows users to synthesize, deposit, and pattern the conjugated polymer all in one step.

To show off oCVD's abilities, Barr and his colleagues used the process on a number of extremely delicate materials. Rice paper, used to make spring rolls in restaurants, would dissolve in most processes, but because this one is free of solvents, it remained intact. A plastic film, such as Saran wrap—hard to coat because it repels water—could be coated with this dry process. The researchers even constructed a solar cell printed on toilet paper.

"This is kind of just to illustrate the versatility, not that these are substrates we necessarily want to process with electronics," Barr says. "You don't typically think of paper as a good substrate for photovoltaics, because it's not very transparent."

There may, however, be applications where the ability to build electronics, such as flexible displays, on fabrics or paper will come in handy. And engineers are increasingly looking to roll-to-roll printing—in which inks are printed onto plastic or another flexible material as it unspools from one machine and is wound up on another—as a faster, less costly method for producing some electronics, including photovoltaics.

The team built solar cells on a commonly used plastic and bent them to a radius of less than 5 millimeters more than 1000 times, then tested them to see if they still worked. Their efficiency was still greater than 99 percent of what it had been before bending, Barr said. Electrodes were bent to a radius of less than 1 mm, creased more than 100 times, and stretched to approximately 200 percent and still maintained high conductivity. A solar cell built on Saran wrap performed well even while it was stretched to about 180 percent, at which point the wrap pulled apart, destroying the cell.

To illustrate the point, Barr printed a solar cell on a piece of paper. In a video he showed at the conference, a student folded the paper into the shape of an airplane, attached leads, and shone a light on the folded device. It still generated current.



"I don't know if paper airplanes are the future of solar cells," Barr concedes. But in case they are, he's got it covered.

IEEESpectrum


TStzmmalaysia
post Jan 5 2011, 09:53 AM



NANOTECHNOLOGY

Attached Image

'Nanoscoops' could spark new generation of electric automobile batteries

Troy, N.Y. – An entirely new type of nanomaterial developed at Rensselaer Polytechnic Institute could enable the next generation of high-power rechargeable lithium (Li)-ion batteries for electric automobiles, as well as batteries for laptop computers, mobile phones, and other portable devices.

The new material, dubbed a "nanoscoop" because its shape resembles a cone with a scoop of ice cream on top, can withstand extremely high rates of charge and discharge that would cause conventional electrodes used in today's Li-ion batteries to rapidly deteriorate and fail. The nanoscoop's success lies in its unique material composition, structure, and size.

The Rensselaer research team, led by Professor Nikhil Koratkar, demonstrated how a nanoscoop electrode could be charged and discharged at a rate 40 to 60 times faster than conventional battery anodes, while maintaining a comparable energy density. This stellar performance, which was achieved over 100 continuous charge/discharge cycles, has the team confident that their new technology holds significant potential for the design and realization of high-power, high-capacity Li-ion rechargeable batteries.

"Charging my laptop or cell phone in a few minutes, rather than an hour, sounds pretty good to me," said Koratkar, a professor in the Department of Mechanical, Aerospace, and Nuclear Engineering at Rensselaer. "By using our nanoscoops as the anode architecture for Li-ion rechargeable batteries, this is a very real prospect. Moreover, this technology could potentially be ramped up to suit the demanding needs of batteries for electric automobiles."

Batteries for all-electric vehicles must deliver high power densities in addition to high energy densities, Koratkar said. These vehicles today use supercapacitors to perform power-intensive functions, such as starting the vehicle and rapid acceleration, in conjunction with conventional batteries that deliver high energy density for normal cruise driving and other operations. Koratkar said the invention of nanoscoops may enable these two separate systems to be combined into a single, more efficient battery unit.

Results of the study were detailed in the paper "Functionally Strain-Graded Nanoscoops for High Power Li-Ion Battery Anodes," published Thursday by the journal Nano Letters. See the full paper at: http://pubs.acs.org/doi/abs/10.1021/nl102981d

The anode structure of a Li-ion battery physically grows and shrinks as the battery charges or discharges. When charging, the addition of Li ions increases the volume of the anode, while discharging has the opposite effect. These volume changes result in a buildup of stress in the anode. Too great a stress that builds up too quickly, as in the case of a battery charging or discharging at high speeds, can cause the battery to fail prematurely. This is why most batteries in today's portable electronic devices like cell phones and laptops charge very slowly – the slow charge rate is intentional and designed to protect the battery from stress-induced damage.

The Rensselaer team's nanoscoop, however, was engineered to withstand this buildup of stress. Made from a carbon (C) nanorod base topped with a thin layer of nanoscale aluminum (Al) and a "scoop" of nanoscale silicon (Si), the structures are flexible and able to quickly accept and discharge Li ions at extremely fast rates without sustaining significant damage. The segmented structure of the nanoscoop allows the strain to be gradually transferred from the C base to the Al layer, and finally to the Si scoop. This natural strain gradation provides for a less abrupt transition in stress across the material interfaces, leading to improved structural integrity of the electrode.

The nanoscale size of the scoop is also vital since nanostructures are less prone to cracking than bulk materials, according to Koratkar.

"Due to their nanoscale size, our nanoscoops can soak and release Li at high rates far more effectively than the macroscale anodes used in today's Li-ion batteries," he said. "This means our nanoscoop may be the solution to a critical problem facing auto companies and other battery manufacturers – how can you increase the power density of a battery while still keeping the energy density high?"

A limitation of the nanoscoop architecture is the relatively low total mass of the electrode, Koratkar said. To solve this, the team's next steps are to try growing longer scoops with greater mass, or develop a method for stacking layers of nanoscoops on top of each other. Another possibility the team is exploring includes growing the nanoscoops on large flexible substrates that can be rolled or shaped to fit along the contours or chassis of the automobile.

Eurekalert

TStzmmalaysia
post Jan 6 2011, 10:49 AM



TRANSPORTATION

Attached Image

Eaton Hybrid Trucks Have Logged 100 Million Miles, Saving 4 Million Gallons of Fuel

Hybridization Isn't Just for Private Vehicles

Eaton makes, among other things, hybrid drivetrains for commercial trucks and buses. The most familiar model to most people is probably the one used by UPS for local deliveries, but their hybrid vehicles are also used as city buses, school buses, package delivery trucks, beverage delivery trucks, refrigerated delivery trucks, refuse and recycling trucks, utility vehicles, etc. With more than 4,500 hybrids on the road, Eaton has just hit a milestone: "customers of its hybrid systems have collectively accumulated more than 100 million miles of service, reducing fuel consumption by 4 million gallons of diesel fuel and harmful emissions by 40,000 metric tons."
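Those fleet figures are easy to sanity-check. Dividing 40,000 metric tons of avoided emissions by 4 million gallons gives 10 kg per gallon, close to the roughly 10.2 kg of CO2 usually attributed to burning a gallon of diesel (assuming, as seems likely, that the emissions figure is dominated by CO2):

```python
miles = 100e6            # total fleet miles (from Eaton's announcement)
gallons_saved = 4e6      # diesel saved
emissions_tonnes = 40_000  # "harmful emissions" avoided, metric tons

kg_per_gallon = emissions_tonnes * 1000 / gallons_saved
gallons_saved_per_100mi = gallons_saved / miles * 100
print(f"{kg_per_gallon:.1f} kg of emissions avoided per gallon saved")
print(f"{gallons_saved_per_100mi:.1f} gallons saved per 100 miles driven")
```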

The hybrid system provides an "up to 50 percent improvement in fuel economy and emissions," and the regenerative brakes reduce brake and engine wear. It also considerably reduces emissions when the trucks would otherwise be left idling, and since they are mostly diesel and operate in urban areas where air quality is especially important, this is a very good thing.

Let's hope that the market for hybrid and electric commercial vehicles will stay healthy, because while many cars can be taken off the roads by doing things like expanding public transit and walking/biking infrastructure, commercial vehicles are harder to eliminate (long-haul trucking can be replaced by rail, but you still need to move goods around inside cities).

TreeHugger

TStzmmalaysia
post Jan 6 2011, 10:50 AM



TRANSPORTATION

Attached Image

Hybrid Car Drives 2269.3 Miles Across U.S.A. On Less Than 2 Tanks of Fuel


The ultimate test for a fuel efficient vehicle is a long drive with an expert hypermiler at the wheel. In the past few years, it's been done with a variety of cars, including the Honda Insight, Toyota Prius, Ford Fusion hybrid, Toyota iQ, etc. Sadly, it's not the best way to compare these cars because the driving conditions are different each time, but it still gives a good idea of what's possible with extra-careful driving. The latest long-distance hypermiling stunt was done by Wayne Gerdes (who coined "hypermiling") with the new Hyundai Sonata hybrid. Read on for more details.

Wayne drove the Hyundai from San Diego to Jekyll Island, Georgia – a 2,269.3-mile trip that used only 38 gallons of gasoline, for an average of 59.58 miles per gallon (MPG). Not bad for a car that is rated at 37 MPG combined by the EPA!

The first tank of fuel was enough for 1221.2 miles, and the second one for 1048.1 miles with 2.5 gallons remaining.
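The trip figures check out arithmetically. Note that dividing the reported miles by a flat 38 gallons gives slightly more than the quoted 59.58 MPG, which suggests the actual fuel used was a shade over 38 gallons:

```python
miles = 2269.3             # San Diego to Jekyll Island, as reported
first_tank_miles = 1221.2
second_tank_miles = 1048.1
gallons_used = 38.0        # as reported; 59.58 MPG implies closer to 38.09

# the two tank legs add up to the full trip distance
assert abs((first_tank_miles + second_tank_miles) - miles) < 1e-6

mpg = miles / gallons_used
print(f"{mpg:.2f} MPG")    # compare with the EPA combined rating of 37 MPG
```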

This is impressive considering that the driving conditions were real-world and not ideal (mountains, winter weather, sometimes traffic, etc.). This isn't like other hypermiling records that are achieved at much lower speeds while looping around on a track. Wayne could no doubt beat 59.58 MPG in such conditions...

Of the drive, Gerdes said: "As a fan of fuel-efficient vehicles, I enjoy the challenge of putting new technology to the test. This demonstration shows how the Hyundai Sonata Hybrid can deliver extremely impressive fuel economy and range for drivers who value fuel savings. This is the first time I've driven a car that 'does it right!' Driving on the interstate at the posted speed limit (or 65 mph, whichever is slower), the Sonata Hybrid will exceed or equal its competition while offering a much larger, roomier, and comfortable car."

But remember that he's a pro, and that some hypermiling techniques are a bit too extreme (and sometimes even unsafe, though I don't think he went that far) for the average person. If you buy a Sonata hybrid, don't expect to get 60 MPG with just normal driving.

TreeHugger

TStzmmalaysia
post Jan 6 2011, 10:52 AM



TRANSPORTATION

Attached Image

South-Korea's Capital Starts Commercial Operation of Electric Buses

Seoul's residents now have access to high-tech full-sized electric buses on the Mt. Namsan circular routes. So far only 5 out of the 14 buses that serve this route have been swapped for EVs, but over time they will all be replaced. This is an initiative of the Seoul Metropolitan Government (SMG) and part of a larger goal: By 2020, the SMG would like to see 120,000 electric vehicles on the city's roads, "which will account for 50% of all public transport vehicles, 10% of sedans and 1% of trucks and vans."


The buses are being produced by Hyundai Heavy Industries and Hankuk Fiber.


The electric coaches now serving on the Mt. Namsan circular routes are 11.05 meters long and run up to 83 km (52 miles) with a single charge. They can be fully charged in less than 30 minutes with a high-speed battery charger. The electric bus, with a maximum speed of 100 km/h (62 mph), has a low floor and a 240 kW motor. It features a high-capacity lithium-ion battery and a regenerative braking system.

Its body is made of a carbon composite material which considerably reduces the vehicle's weight while reinforcing durability. The electric buses are also equipped with automatic slant boards for wheelchair users. (source)

The 30-minute charging time is particularly impressive. It's easy to extrapolate a bit in the future and imagine electric buses that have a longer range and a similar or shorter charging time, making them a perfect fit for even more types of routes.
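The article gives the range (83 km) and the charge time (under 30 minutes) but not the battery capacity, so here is a rough back-of-envelope estimate of what the fast charger must deliver. The per-kilometre consumption figure is an assumption typical of full-size electric buses, not a published specification for this vehicle:

```python
range_km = 83               # from the article
assumed_kwh_per_km = 1.2    # assumed consumption for a full-size electric bus
charge_time_hours = 0.5     # "less than 30 minutes", from the article

pack_kwh = range_km * assumed_kwh_per_km   # implied usable pack capacity
charger_kw = pack_kwh / charge_time_hours  # minimum average charging power
print(f"~{pack_kwh:.0f} kWh pack, requiring a ~{charger_kw:.0f} kW charger")
```

Under those assumptions the charger must sustain roughly 200 kW, which is why a dedicated high-speed battery charger at the depot, rather than an ordinary connection, is part of the system.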

TreeHugger

TStzmmalaysia
post Jan 6 2011, 10:54 AM



TRANSPORTATION

Attached Image

A.N.T. Aid Necessities Transporter concept vehicle


The initial inspiration for the A.N.T came from wanting to design a better vehicle for disaster relief

A multi-purpose vehicle capable of delivering emergency housing and supplies to disaster areas then rapidly returning to base ready for another mission – that is the concept behind the Aid Necessities Transporter (A.N.T.). The idea takes inspiration from its namesake in the insect world – creating more than just a unique concept vehicle but an entirely new aid distribution system. The A.N.T. has been designed to traverse rough terrain that would be impossible for conventional trucks to navigate, delivering supply pods and temporary shelter to disaster-stricken communities. The vehicle then transforms itself into a low-profile form for a swift return to headquarters.

The concept was developed by Melbourne (Australia) designer Bryan Lee for his graduate design project while studying Industrial Design at Monash University in 2009. Lee told Gizmag his initial inspiration came from wanting to design a better vehicle for disaster relief.

“At the beginning of 2009 I took notice of the increasing numbers of natural disasters due to global warming,” he said. “From this, I decided that I would design a vehicle to address this problem, however through a different direction. Instead of addressing the issue by preventing global warming, I thought that we will need some solutions to address problems when natural disasters hit. This is where my path began which led me to research on organizations such as the United Nations.”

While researching for his project the student designer came across a documentary that changed the direction of the concept.

“It was about ants and their colony,” Lee told Gizmag. “I was fascinated with their aesthetics, ability and the system they run on, which led me to believe that they are truly nature's transporters. One of the biggest inspirations I took from ants was how they transported their food back to their nest. In groups. From this, instead of using the conventional way organizations deliver supplies all at once by land, I decided to create a new system where although it carries slightly less, the A.N.T.s will travel back and forth from HQ to the disaster zone delivering supplies faster and earlier.”

Just like their insect namesakes, the A.N.T in Transport mode would head towards the disaster stricken area traveling in groups. On arrival the vehicle would quickly deploy the supply unit, which doubles as a temporary housing module. The A.N.T would then rotate its cockpit section 90 degrees downwards transforming to Rapid mode allowing the swarm of A.N.T.s to quickly travel back to headquarters ready to load up another supply unit.



The A.N.T employs a number of intelligent design features that allow for swift aid distribution. These include six independent electric in-wheel motors, a large unique all-terrain suspension system that also allows for rapid loading and deployment of supply units, rotational hydraulics for transformation from Transport to Rapid modes, and a well thought out supply unit design that doubles as temporary housing.

Lee graduated from Monash University in Australia with Bachelor of Industrial Design with honours in 2009. He now works at Ford Australia in the Visualization department.

GizMag



TStzmmalaysia
post Jan 6 2011, 10:57 AM



TRANSPORTATION

Attached Image

Anti Sleep Pilot detects drowsy drivers

The Anti Sleep Pilot is a dashboard device that lets drivers know when they're becoming too fatigued

According to a 2008 study by the Swedish National Road and Transport Research Institute, about 20 percent of all road traffic accidents are caused by driver fatigue. Tired motorists are also eight times more likely than rested motorists to get in an accident, displaying driving abilities similar to those of someone who is intoxicated. The problem is, we often don’t know when we’ve reached that “too tired” state – a situation that the Anti Sleep Pilot was created to address. The Danish-designed device sits on your dashboard, monitoring you and your driving conditions, and lets you know when it’s time to pull over and take a ten-minute rest.

To start using the Anti Sleep Pilot, you complete a short test to determine your personal risk profile. This information is stored by twisting a knob on the bottom of the unit, so several drivers can keep and access their profiles on one device. An adhesive-backed magnetic base attaches to the dashboard, which provides a mount for the device when in use.

Once you start driving, the Pilot continuously calculates your fatigue level and displays your status. Its calculations combine 26 different parameters, including your personal risk profile, your fatigue status when you started driving, and input from a clock and accelerometer. It also measures and helps maintain driver alertness through occasional reaction tests, in which you must touch the device as soon as it indicates. The longer you take to react, the slower your reaction time is getting – it sounds kind of like having a little Simon on your dashboard.
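
The device's actual 26 parameters and weightings are not public, but the idea of folding a personal risk profile, time behind the wheel, and reaction-test results into one fatigue score can be sketched roughly as follows. All names, weights and thresholds here are invented for illustration only:

```python
class FatigueEstimator:
    """Toy fatigue model combining a few illustrative inputs."""

    def __init__(self, risk_profile, start_fatigue):
        self.risk_profile = risk_profile  # 0.0 (low risk) .. 1.0 (high risk)
        self.fatigue = start_fatigue      # self-assessed fatigue at trip start

    def record_reaction(self, seconds):
        # Slower responses to the touch test push the estimate upward.
        baseline = 0.5                    # assumed "fully alert" reaction time
        self.fatigue += max(0.0, seconds - baseline) * 0.1

    def score(self, hours_driven):
        # Fatigue accumulates with driving time, scaled by the risk profile.
        return min(1.0, self.fatigue + hours_driven * 0.15 * (1 + self.risk_profile))

    def should_rest(self, hours_driven, threshold=0.8):
        return self.score(hours_driven) >= threshold
```

In this toy model, a driver with a mid-range profile who starts mildly tired and responds slowly to one touch test would be told to pull over somewhere between the second and third hour of driving.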

Unlike systems developed by Fraunhofer and Lexus, it does not use cameras to monitor the driver’s eyes.

When the combination of variables indicates that you’re reaching your limit, the Pilot’s visible and audible signals alert you to the fact that you need to take a break – the device is light- and sound-sensitive, so its display and alarm automatically adjust for cabin conditions. As the unit is able to monitor time and vehicle speed, it also knows how long you’ve stopped for, so there’s no pulling over for only a few seconds just to shut it up.

GizMag

TStzmmalaysia
post Jan 6 2011, 11:00 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


ENERGY

Attached Image

NANO VENT-SKIN: Solar, wind and nanotechnology combine to deliver a new green alternative


Mexican-born Agustin Otegui is reaching out to Calgarians and Albertans to help him bring a new green alternative to the market. Analyzing architecture and understanding leading-edge power technologies inspired Otegui to design Nano Vent-Skin (NVS). “Since Architecture is focused mainly in building habitable spaces, all the industry behind it invests more money in developing products that solve almost the same needs. Watching how all the architects were focusing on building bigger and bigger structures with turbines at the same scale, I thought why shouldn’t we think on a smaller scale,” explains Otegui. Nano Vent-Skin (NVS) was designed as a green alternative for existing buildings or smaller scale projects.

Nano Vent-Skin works in a very efficient way. Basically, it combines solar power and miniature wind turbines to create energy. The outer skin of the structure absorbs sunlight through an organic photovoltaic layer, which transfers the energy to the nano-fibers inside the nano-wires. The energy is then sent to the storage units at the end of each panel.

Each turbine on the panel generates energy by chemical reactions on each end where it makes contact with the structure. “Polarized organisms are responsible for this process on every turbine’s turn. The inner skin of each turbine works as a filter absorbing CO2 from the environment as wind passes through it. By using nano-bioengineering and nano-manufacturing as means of production, it achieves an efficient zero emission material which uses the right kind and amount of material where needed. These microorganisms have not been genetically altered; they work as a trained colony where each member has a specific task in this symbiotic process. For example, an ant or a bee colony, where the queen knows what has to be done and distributes the tasks between the members,” explains Otegui.

Comparing Nano Vent-Skin to human skin makes it easier to understand. When we suffer a cut, our brain sends signals and resources to that specific region to get it repaired immediately. NVS works in a similar way. Every panel has a sensor on each corner with a material reservoir. When one of the turbines fails or breaks, a signal is sent through the nano-wires to the central system, and the building material (microorganisms) is sent through the central tube to regenerate the area via a self-assembly process.

To maximize energy capture, the blades of each turbine are symmetrically designed. With this feature, even if the wind's direction changes, each turbine adapts by rotating clockwise or anti-clockwise, depending on the situation.

“NVS is not trying to reinvent or reshape nature. It’s just acting as a merger of different means and approaches into energy absorption and transformation which will never happen in nature. For example, a palm tree can never learn from an arctic raspberry bush or a bonsai tree if they never coexist within the same surroundings. NVS takes advantage of globalized knowledge of different species and resources and turns them into a joint organism where three different ways of absorbing and transforming energy work in symbiosis. By using nano-manufacturing with bioengineered organisms as a production method, NVS merges different kinds of microorganisms that work together to absorb and transform natural energy from the environment,” clarifies Otegui.

According to Otegui, what comes out of this merging of living organisms is a skin that transforms two of the most abundant sources of green energy on earth: sunlight and wind. The other advantage of using living organisms is the absorption of CO2 from the air. Currently, biologists are creating these materials on a much smaller scale.

In Otegui’s opinion, since the main materials involved in this process are living organisms, the only comparable process being investigated at a medium scale is in the medical industry, where tissues are grown to restore organs.

The Examiner

TStzmmalaysia
post Jan 6 2011, 11:04 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


BIOTECHNOLOGY

Attached Image

Amazing new kidney treatment

When Mirror journalist Steve Purcell, 52, needed a kidney his son didn’t hesitate to volunteer. But as their blood types were different, doctors had to use a revolutionary transplant technique.

I was on the M25 driving to work when the doctor called to tell me that my kidneys had ceased to function. Deadly toxins were building up in my body and I could have a heart attack at any time.

This bolt from the blue was a bit of a heart-stopper itself and, having first told me to pull over onto the hard shoulder, the doctor said I had to get someone to drive me to hospital at once for emergency dialysis.

Within hours I was facing the prospect of spending the next 20 years or more having debilitating dialysis three days every week.

Now though, just 12 months on, I am living a “normal” life following a groundbreaking new type of kidney transplant which allows a person with a different blood group to donate an organ.

My son Patrick, 28, volunteered to give me a kidney the moment he heard the news that mine had failed, but he had his mum’s blood group, not mine.

Most transplants have to meet a stringent set of criteria matching the donor to the recipient – with the same blood group being one of the most important. Currently a third of all potential donors are rejected because of blood incompatibility.

At first it was a crushing disappointment to both of us but the transplant co-ordinator explained about the possibility of a new ABO Incompatible transplant.

While Patrick’s blood was type A and mine was type B, the transplant team at St George’s Hospital, South London, were prepared to go ahead.

It meant, however, that I would be the first patient at St George’s, one of Britain’s leading transplant centres, to have this treatment and one of only a handful in this country so far.

Rules governing transplants have been relaxed in recent years due to the growing numbers of people desperately needing this life-changing surgery. More than 8,000 are currently waiting for a kidney donor and three die every day while on the waiting list.

But results in Sweden have shown the ABO Incompatible transplants to be every bit as successful as normal live donor ones.

And so, almost 10 months later, Patrick and I were at St George’s for the transplant. Everything went well.

The following day Patrick was able to shuffle round to my room where we planned a celebration with juicy steak dinners – when you’re “nil by mouth” you can’t think about anything other than food!

It’s hard to begin describing your emotions and what you owe to your donor and the medical professionals who’ve made your new life possible all on the much-maligned NHS.

Article continues on The Mirror

Scientific Study Publication HERE
TStzmmalaysia
post Jan 7 2011, 07:39 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


ROBOTICS

Attached Image

Willow Garage’s PR2 Robot Operates Autonomously for 139 km!

What do you do when your robotic child is all grown up? I guess you make it run a marathon. Over the past few years, Willow Garage has taught its PR2 Robot to be autonomous. The Silicon Valley startup showed the bot how to plug itself in. They taught it to navigate on its own. They even taught it how to call for help in case of emergency. On December 8th it was time to see how well those lessons were learned. Willow Garage told the PR2 to start rolling and not stop. It traveled over 70 km in 7 days…then it kept going for six days more! On December 21st the bot ended its marathon having gone a total of 138.9 km (~ 86 miles). The PR2’s epic journey probably doesn’t look like much from the outside, after all the robot was simply wandering around the Willow Garage office, but it represents a major accomplishment for the company and for open source robotics. In the future, robots like the PR2 will be able to perform a wide variety of jobs, and without human supervision.

This marathon run is important because it pushed the limits of the PR2’s autonomous operation, not because it could gauge the physical stamina of the machine’s parts. I don’t just care if a wheel breaks, I want to know how long I can leave a robot alone and not have to worry about whether it is stuck or fallen over. To that end, this in-office demonstration was a great example of how the robot could function on its own. When the PR2 encountered an obstacle, it moved around it. When its batteries were low, it found an outlet and plugged itself in. Most importantly, when it got desperately stuck it sent a text message to an engineer who could log on to a web portal and remotely get the robot rolling again. According to Willow Garage, that scenario only happened twice: “During the run, there were only two interventions: one to help the robot maneuver around a chair, and another to tell the robot where it was (“re-localization”). In both cases, the robot noticed there was an issue and sent a message for help, and the issue was resolved over the web.” Two bugs in 139 km? Not bad.
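
The recovery behaviour described above – route around obstacles, self-charge when low, and contact a human only as a last resort – amounts to a simple priority ordering. Here is a toy sketch of such a loop, with an invented robot interface standing in for the real ROS stack (none of these method names come from Willow Garage's code):

```python
class SimRobot:
    """Minimal simulated robot used to exercise the loop below."""
    def __init__(self, battery=1.0, stuck=False, obstacle=False):
        self.battery, self.stuck, self.obstacle = battery, stuck, obstacle
        self.log = []
    def is_stuck(self): return self.stuck
    def battery_level(self): return self.battery
    def obstacle_ahead(self): return self.obstacle
    def send_help_message(self): self.log.append("texted engineer")
    def navigate_to_outlet(self): self.log.append("drove to outlet")
    def plug_in(self): self.battery = 1.0
    def replan_path(self): self.obstacle = False
    def drive(self): self.battery -= 0.05

def autonomy_step(robot):
    """One tick of the autonomy loop: ask for help only as a last resort."""
    if robot.is_stuck():
        robot.send_help_message()   # as in the marathon: message an engineer
        return "waiting_for_human"
    if robot.battery_level() < 0.2:
        robot.navigate_to_outlet()
        robot.plug_in()             # the PR2 plugs itself in
        return "charging"
    if robot.obstacle_ahead():
        robot.replan_path()         # route around chairs and the like
    robot.drive()
    return "roaming"
```

The key design point the marathon demonstrated is the ordering: the expensive option, human intervention, sits behind every cheaper automatic recovery, which is why it was needed only twice in 139 km.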

What finally did the robot in? Its plug got stuck somewhere the robot couldn’t reach. That simple. If this robot were in your house, doing chores, you could have solved that problem in five seconds. This 139 km marathon run shows that personal robots will not only be able to operate autonomously, but that their most common failures might be very easy to fix.

Certainly engineers have created robots that have gone farther than the PR2. NASA's Mars Rover? That has Willow Garage beat thousands of times over. Yet the PR2 is both more practical and more accessible. The techniques used to keep the PR2 running autonomously are available to all robot developers through the open source ROS software library. There are now 16 different groups around the world developing the PR2, and many more using ROS. PR2 developers can write code for the robot that takes days, or even weeks, to finish and remain confident that the bot can finish the task. The PR2’s marathon session is just a tiny step in the grand history of robotics, but with the connectivity of the open source community it will be a step multiplied many thousands of times.

SingularityHub

TStzmmalaysia
post Jan 7 2011, 12:29 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


BIOTECHNOLOGY

Attached Image

Electrifying New Way to Clean Dirty Water

Newswise — University of Utah researchers developed a new concept in water treatment: an electrobiochemical reactor in which a low electrical voltage is applied to microbes to help them quickly and efficiently remove pollutants from mining, industrial and agricultural wastewater.

The patented electrobiochemical reactor (EBR) process replaces tons of chemicals with a small amount of electricity that feeds microbes with electrons. Tests have shown that the electrons accelerate how quickly the microbes remove pollutants such as arsenic, selenium, mercury and other materials, significantly reducing the cost of wastewater cleanup.

The research is now being used by a University of Utah startup company named INOTEC, which was honored at the 2010 Cleantech Open competition in San Jose, Calif. INOTEC and its EBR technology won the $40,000 Rocky Mountain regional award in what is nicknamed the “Academy Awards of Clean Technology.” INOTEC was one of 18 finalist teams chosen from 271 entrants in the event.

Metallurgical engineer Jack Adams of the College of Mines and Earth Sciences pioneered the process. He and graduate student Mike Peoples, who co-founded INOTEC, say the award is validation that their research can save the wastewater industry money.

“It is great to be recognized for an innovative clean technology,” says Adams, president of INOTEC and a research professor in the Department of Metallurgical Engineering. “We’re currently in the early stages of growing the company, and every bit of recognition and support we get fits in with our go-to-market model. It will open new opportunities for securing partnerships and investor funding that will allow us and a partner to take the technology further faster.”

Adams says the new method can enhance just about any type of wastewater treatment. It now is being tested primarily for removing metals from mining wastewater, but also could be used for other industrial and agricultural wastes, he adds.

INOTEC has received support and an exclusive license to the EBR technology from the University of Utah’s Technology Commercialization Office, which protects and manages the university’s intellectual property and helps faculty members create startup companies. INOTEC is working with the office’s new Energy Commercialization Center to secure business partners and funding.

In conventional wastewater treatment, microbes or chemicals alter or remove contaminants by adding or removing electrons. The electrons come from large excesses of nutrients and chemicals added to the systems to adjust the reactor chemistry for microbial growth and contaminant removal. Those large excesses must be added to compensate for changes in water chemistry and other factors that limit the availability of electrons to remove pollutants.

The electrobiochemical reactor or EBR system overcomes these shortcomings by directly supplying excess electrons to the reactor and microbes using low voltage and no current, unlike other systems that provide large electrical currents. One volt supplies about one trillion trillion electrons (note: trillion twice is correct). These electrons replace the electrons normally supplied by excess nutrients and chemicals, at a considerable savings and with greater efficiency.
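
Strictly speaking, electron counts are a matter of charge (coulombs) rather than voltage, so the "trillion trillion electrons" figure is easiest to sanity-check against the elementary charge. A quick back-of-envelope calculation (the constant is standard physics, not from the article):

```python
# Electron counts are fixed by charge, not voltage: one coulomb of
# charge corresponds to 1/e electrons, where e is the elementary charge.
ELEMENTARY_CHARGE = 1.602e-19            # coulombs per electron
electrons_per_coulomb = 1 / ELEMENTARY_CHARGE

print(f"{electrons_per_coulomb:.3e} electrons per coulomb")
# A "trillion trillion" (1e24) electrons therefore corresponds to
# roughly 160,000 coulombs of charge:
coulombs_for_1e24 = 1e24 * ELEMENTARY_CHARGE
print(f"{coulombs_for_1e24:.0f} C")
```

So the headline number describes an enormous supply of electrons, delivered at very low energy cost because the voltage is small.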

The electrons needed for a full-scale facility can easily be supplied by a small solar power grid. “The provided electrons make reactors more efficient, stable and controllable,” Adams says.

The researchers, through INOTEC, have successfully completed five laboratory tests of waters from various metal and coal mines in North America containing selenium, arsenic, mercury and nitrates.

INOTEC recently completed its first on-site, pilot-scale contract, treating wastewater containing arsenic and nitrate from an inactive gold mine. This demonstration was partially funded through a University of Utah Virtual Incubator Program grant.

INOTEC has also secured its own contract for a second pilot-scale test at a mine for silver and other metals in the Yukon in spring 2011.

Newswise

TStzmmalaysia
post Jan 7 2011, 12:32 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


NANOTECHNOLOGY

Attached Image

Building 3D Batteries from the Bottom Up with Coated Nanowires

To continue the string of nanotechnology developments from the end of last year (here and here) aimed at improving the battery, I start this New Year with another such story, this time coming from researchers at Rice University.

This news actually broke in December when it was originally published in the December 6th online edition of Nanoletters.

The research, which was led by Pulickel Ajayan, found a way to coat nanowires with a PMMA layer that provides good insulation from the counter electrode while still allowing ions to pass through easily.

This minimized separation between the two electrodes makes the battery much more efficient.

"In a battery, you have two electrodes separated by a thick barrier," said Ajayan, professor in mechanical engineering and materials science and of chemistry. "The challenge is to bring everything into close proximity so this electrochemistry becomes much more efficient."

To achieve this, Ajayan and his lead researchers Sanketh Gowda and Arava Leela Mohana Reddy took the concept of 3D batteries and coated millions of nanowires to create the 3D structure from the bottom up.

“We wanted to figure out how the proposed 3-D designs of batteries can be built from the nanoscale up," said Gowda, a graduate student in Ajayan's lab. "By increasing the height of the nanowires, we can increase the amount of energy stored while keeping the lithium ion diffusion distance constant."

As Gowda readily admits in the news release, 3D designs are nothing new. However, the achievement here was the process they developed for coating the nanowires in the PMMA without any break in the coating.

The process involves growing 10-micron-long nanowires through electrodeposition in the pores of an anodized alumina template. They then drop-coated PMMA onto the nanowire array, resulting in an even casing from top to bottom.

The result of this work is ultimately expected to be batteries for scalable microdevices that possess a greater surface area than thin-film batteries.

IEEESpectrum

TStzmmalaysia
post Jan 7 2011, 12:44 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


ENERGY

Attached Image

Humans – A Viable Source of Green Energy

Science has advanced enormously, especially in the past few years, but what really surprises is that the human body can be used as a battery. It sounds odd, but it is a fact, especially for a few Parisian scientists who have invented an unusual way to power the Metro.

Their idea is quite simple: they installed an experimental heating system in a public housing project, with a very strange source of energy – the human bodies in the nearby Metro station.

Those bodies are the source that will power the heating system. The human body generates more bio-electricity than a 120-volt battery and over 25,000 BTUs of body heat – a well-known fact – so the scientists decided to use it to heat nearly 17 apartments.

Body heat is actually an efficient source: using it cuts carbon emissions by a third compared to a boiler heating system. If the Parisian project goes ahead, it could also change the way we heat buildings.

In case you think this news is too strange, here is another: urine can also generate power. Yes, our excrement is a source of clean energy, believe it or not. The news comes from the UK, where the chemist Shanwen Tao is trying to create a cell that turns urine into energy. Urine contains urea, a compound from which energy can be extracted.

Another point in urine's favour as an energy source is its chemical composition. The UK scientist's work isn't finished yet, but the key finding is that adding urine to a fuel cell triggers a chemical reaction, and that reaction can power the cell.

And there is odder news still, this time about our poo. So-called poo power was discovered by researchers at the California-based Orange County Sanitation District, who developed a fuel cell that turns human waste into hydrogen fuel.

Their invention is still in development, but it looks promising. And if you think that's all the human body has to offer, you are wrong.

Blood and sweat can also be used as energy sources. Scientists at Rensselaer Polytechnic Institute are trying to create a battery that turns the electrolytes in our blood and sweat into fuel.

TheNewEcologist



TStzmmalaysia
post Jan 7 2011, 12:47 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


APPLIED SCIENCES

Attached Image

CES: Peep Wireless says it makes cell towers redundant

LAS VEGAS--At CES, the start-up Peep Wireless Technology is trying to find partners to adopt its mobile phone mesh networking technology. It looks like the company has a long road ahead of it.

The mesh concept, which is not new, is that instead of phone voice or data moving as it does now, from low-powered mobile devices to high-powered, fixed towers, phones (and possibly other radio-equipped devices) would act as miniature cell towers and repeaters on their own, handling data transmission for nearby devices. So if you're calling someone across the street, you might be able to connect to their device directly, or maybe in just one or two "hops," using other people's devices as the towers and repeaters of your ad-hoc network.

Without cell towers, of course, there's no need for cellular carriers, no expensive private infrastructure to support, and no need for big recurring bills. A peer-to-peer mesh network is, in some cases, more robust than the traditional cellular infrastructure. It's certainly faster and cheaper to build. Mesh networks are in use today. Dust Networks, for example, provides technologies for sensors that are used in industrial and military applications for which there is no infrastructure. In a mesh network, the devices are the infrastructure.
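
Finding a route of "one or two hops" through such a mesh is, at its simplest, a shortest-path search over whichever devices are currently within radio range of one another. A minimal sketch (an illustrative model, not Peep's actual routing software):

```python
from collections import deque

def hop_count(mesh, src, dst):
    """Fewest device-to-device hops from src to dst, by breadth-first
    search. `mesh` maps each device to the devices currently within
    radio range of it."""
    seen, queue = {src}, deque([(src, 0)])
    while queue:
        node, hops = queue.popleft()
        if node == dst:
            return hops
        for neighbor in mesh.get(node, ()):
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append((neighbor, hops + 1))
    return None  # no path: the call cannot be routed through the mesh

# Calling someone across the street might take a few hops:
mesh = {
    "you":      ["passerby"],
    "passerby": ["you", "neighbor"],
    "neighbor": ["passerby", "friend"],
    "friend":   ["neighbor"],
}
```

Here `hop_count(mesh, "you", "friend")` finds a three-hop route. The catch is that the mesh map changes constantly as devices move, join and leave, so routes must be recomputed on the fly – the nontrivial route-finding cost the article goes on to raise.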

On the other hand, building a mesh network of smartphones presents serious challenges that I don't think Peep has solved. The battery hit is a big one; many modern smartphones can barely make it through a day of use right now. Turning them into mini data repeaters would take even more power. And once a mesh network gets big, route-finding for data packets becomes a nontrivial computational task, and that introduces delay or lag into communications. Security, at least, should not be a big issue, since Peep's data is broken up and AES-encrypted end-to-end.

But the real challenge is getting the chiefs of the smartphone universe--the carriers--to play ball and invest in this technology. Peep President Scott Redmond is here at CES meeting with the carriers, he says.

I hope those meetings go better than his talk with me did.

I found it hard to get a grip on what kind of company Peep Wireless wants to be. At first I thought it was a mesh company. It has software that hops voice and data from device to device. But as I talked to Redmond I learned that Peep also has interesting technology that aggressively uses any available radio channel on a device to send its signal--Wi-Fi, Bluetooth, cellular, it doesn't matter. That sounded pretty cool, but it's a different pitch from the mesh story.

And in addition to the radio-agnostic transmission technology, Peep is also working on a system by which data can be sent via light pulses. Redmond said he's creating a technology demo for AT&T in which a bright light at the top of a building will be modulated to send HD-quality video data to iPhones across town. What this has to do with mesh networking I'm not sure. Redmond said he's doing the work because it was trivial for his team to create the technology, and AT&T seemed interested in it.

Peep is also building a key-fob-size walkie-talkie-like device that will transmit voice point-to-point up to 36 miles. It requires a Bluetooth headset, as it has no speaker or mic of its own. My credulity was strained when Redmond told me the device will be charged by harvesting and storing energy from ambient radio waves. He said such a device would be able to transmit for 8 to 15 minutes a day, more if it's also charged in a more traditional fashion. His "Peep Pod" device will sell for just $20, he says. (Redmond says radio wave trickle chargers will be shown by other companies at CES, and the concept has been proven.)

And then there's the viral marketing strategy, in which, Redmond says, companies like Starbucks will distribute branded mesh-based free phone-calling apps and leverage them as marketing tools as users are rewarded financially for forwarding the app to their friends.

Redmond's plan for getting his products out the door involves talking to a long list of partner candidates, starting with the cellular carriers, for whom he believes his mesh technology will save billions of dollars. If they don't bite, he'll pitch to upstart and wanna-be phone companies, like Google and Facebook. And if they aren't interested, he says, he'll give his products away to consumers (and make money from advertising deals, I surmise).

Separately, all the technologies and business concepts Redmond described sound plausible, even if some only just, but for one self-funded 12-person company to try to wrap them all up sounds completely manic. Redmond is certainly passionate, and his ideas are attractive, but the business he described to me was not a start-up; it was more a psychotic Bell Labs. It's too much for one small company to tackle, and the lack of focus makes me wonder if Peep has the discipline to develop any of its ideas into viable businesses.

Cnet





TStzmmalaysia
post Jan 7 2011, 12:50 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


BIOTECHNOLOGY

Attached Image

Software for Programming Microbes

Genetically modified microbes could perform many useful jobs, from making biofuels and drugs, to cleaning up toxic waste. But designing the complex biochemical pathways inside such microbes is a time-consuming process of trial and error.

Christopher Voigt, an associate professor at the University of California, San Francisco, hopes to change that with software that automates the creation of "genetic circuits" in microbes. These circuits are the pathways of genes, proteins, and other biomolecules that the cells use to perform a particular task, such as breaking down sugar and turning it into fuel. Voigt and colleagues have so far made basic circuit components in E. coli. They are working with the large California biotechnology company Life Technologies to develop software that would let bioengineers design complete genetic circuits more easily.

Designing a microbe for a particular task would then be much like writing a new computer program, says Voigt. Just as programmers do not have to think about how electrons move through the gates in an integrated circuit, he says, biological engineers may eventually be able to design circuits of genes, proteins, and other biomolecules at a higher level of abstraction. "If we apply computational processes to things that bacteria can already do, we can get complete control over making spider silk, or drugs, or other chemicals," he says.

Certain types of circuits could, for instance, help regulate the activity of bacteria that produce biofuels. Instead of outside controls, internal circuits could maintain the chemical levels and other conditions needed to keep bacteria producing at high yields. "We're trying to make the cell understand where it is and what it should be doing based on its understanding of the world," says Voigt. Trying to design such a control circuit without the help of a computer would take a lot of trial and error.

Voigt has now made a type of circuit component called a NOR gate in E. coli bacteria. NOR gates can be combined to perform any logical operation. In work described in the journal Nature, Voigt's group also showed they could improve the quality of the output of bacterial circuits by having them work collectively, forming a circuit of NOR gates, one in each cell. Voigt has designed bacterial circuits to hook into natural bacterial communication systems called quorum sensing, so that the cells can "vote" on an output. This increases the quality of the computation performed.
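
The claim that NOR gates can be combined to perform any logical operation is a standard result from digital logic: NOR is functionally complete. A quick illustration in ordinary code (this shows the Boolean principle only, not Voigt's genetic implementation):

```python
def NOR(a, b):
    """The one gate type engineered into each cell."""
    return not (a or b)

# NOR is universal: the other basic gates fall out of it directly.
def NOT(a):
    return NOR(a, a)

def OR(a, b):
    return NOT(NOR(a, b))

def AND(a, b):
    return NOR(NOT(a), NOT(b))
```

Because any Boolean function reduces to NOR gates, a colony in which each cell implements one NOR gate – with quorum-sensing signals acting as the wires between cells – can in principle compute any logic function.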

"This breakthrough work in synthetic biology expands our capacity to construct functional, programmable bacteria," says James Collins, professor of biomedical engineering at Boston University, who is not affiliated with Voigt's team. Collins observes that the California researchers have learned to combine simple circuits in individual cells to make a more complex circuit at the population level. "This represents an important step towards harnessing the power of synthetic ecosystems for biotech applications," he says.

The University of California researchers are now entering the second year of a research agreement with Life Technologies to develop software to automate the biological design process. "The vision is to take these software modules and develop them so that the process of biological parts selection and circuit design is far more automated and simplified than it is today," says Todd Peterson, vice president of synthetic biology research and development at the company. The company hopes to incorporate most of the software modules being designed by Voigt's group into its Vector NTI software by the end of spring 2012.

TechnologyReview

This post has been edited by tzmmalaysia: Jan 7 2011, 12:51 PM
TStzmmalaysia
post Jan 7 2011, 12:53 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


ENERGY

Attached Image

New Dyes Improve Solar Technologies for Generating Clean Electricity and Hydrogen Fuel

BUFFALO, N.Y. -- Chemists at the University at Buffalo have synthesized a new class of photosensitizing dyes that greatly increase the efficiency of light-driven systems that produce two kinds of green energy: Solar electricity and clean-burning hydrogen fuel.

On a commercial scale, these advancements could form the basis of cost-effective technologies to power everything from household appliances to hydrogen vehicles.

To produce electricity, the dyes--called chalcogenorhodamine dyes--operate as part of a Grätzel-type solar cell that converts sunlight into an electric current. When sunlight strikes the dyes, the energy knocks loose electrons in the dyes that travel through the solar cell, forming the current.

The mechanism for producing hydrogen begins the same way: Sunlight strikes the dyes, freeing electrons. But instead of forming a current, the electrons flow into a catalyst, where they drive a chemical reaction that splits water into its basic elements: hydrogen and oxygen.

In laboratory tests, scientists at UB and the University of Rochester have shown that these chalcogenorhodamine systems produce hydrogen at unprecedented rates, in part because the dyes absorb light more intensely and transfer their electrons more efficiently than conventional dyes.

The research team, led by UB Professor Michael Detty and University of Rochester Professor Richard Eisenberg, reported some of their findings in the Journal of the American Chemical Society in October 2010.

Detty, who worked in the private sector for 17 years before joining UB's faculty, hopes his research will lead to the development of better commercial technologies for producing solar electricity and hydrogen on demand.

"Sunlight in one hour could power the world for a year, but we don't tap into it for either electricity or for making solar fuels," Detty said, explaining the importance of his work. "Plants use sunlight to make their own fuels. Humans don't. We use oil. So if we want to have energy independence, it will come from solar."

UB has received a Notice of Allowance from the U.S. Patent and Trademark Office approving the issue of a patent to cover the composition of the dyes. A separate patent application seeks to protect the dyes' use in hydrogen evolution and lists Detty and Eisenberg, along with Brandon Calitree, Alexandra Orchard and Theresa McCormick, as co-inventors of the process.

The collaborators found that chalcogenorhodamines work efficiently in homogeneous hydrogen-production systems that employ cobalt as the catalyst, as well as in heterogeneous systems that employ platinum deposited on titanium dioxide as the catalyst.

UB's Office of Science, Technology Transfer and Economic Outreach is handling licensing of the discoveries.

The University at Buffalo is a premier research-intensive public university, a flagship institution in the State University of New York system and its largest and most comprehensive campus. UB's more than 28,000 students pursue their academic interests through more than 300 undergraduate, graduate and professional degree programs. Founded in 1846, the University at Buffalo is a member of the Association of American Universities.

University at Buffalo



TStzmmalaysia
post Jan 8 2011, 09:40 AM

NANOTECHNOLOGY

Nanotube yarns let smart clothing survive the laundry

One of the biggest hurdles in the way of "smart clothing" may finally have been jumped. Nanotechnologists have developed conducting fabrics that can survive a washing machine.

Garment makers would like to introduce novel materials into textiles to create conducting paths that, say, connect sports performance sensors or your music player to your phone.

"But until now, such multifunctional applications have been limited by the ability to spin important materials into yarns and make sure they stay there even after washing," says Ray Baughman of the Alan G. MacDiarmid NanoTech Institute at the University of Texas in Dallas.

To solve this problem, his team set about making a yarn that could be peppered with "guest" particles of interest – titanium dioxide to create self-cleaning fabrics, for instance – and hold onto them through a hot dunking in detergent. What better way to do that, they thought, than to wrap the particles up in a tightly scrolled web?

When a commercially produced "forest" of multiwalled carbon nanotubes is cut into with a razor, drawing the blade out slowly pulls out an exquisitely fine web of nanotubes held together by intermolecular van der Waals forces.

"As you pull, nanotubes stick to the blade, and that pulls the next nanotube, and that one the next, and so on," says Baughman. "So you end up with a sheet, a web of nanotubes. And once you have drawn out a sheet, you can twist it into a yarn."

But the researchers don't spin straight away: first they need to introduce the guest particles they want to trap within their yarn. To do this, they take their nanotube web – which is 1 centimetre wide and just 50 nanometres thick – and place it on a filter paper soaked in a solution of the guest material. Or the solution can be deposited as an aerosol from an electrostatic paint gun.

Once the particle-populated nanoweb is dry, it is clamped at one end while the other is twisted by a spinning magnet, of the type used to stir fluids in the lab. The result: a yarn that holds onto the guest particles within it and can be woven alongside woollen and cotton threads for clothing manufacture.
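For readers curious about the spinning step, the surface twist angle of an idealized twisted yarn follows a standard textile relation, alpha = arctan(pi·d·T). This is generic geometry, not a formula from the team; the numbers below are illustrative.

```python
# Surface twist angle of an idealized twisted yarn: alpha = arctan(pi * d * T),
# where d is the yarn diameter and T the inserted twist (turns per unit length).
import math

def twist_angle_deg(diameter_m, turns_per_m):
    """Helix angle (degrees) of fibres at the yarn surface."""
    return math.degrees(math.atan(math.pi * diameter_m * turns_per_m))

# e.g. a 10-micrometre-diameter yarn twisted at 20,000 turns per metre:
print(f"Twist angle: {twist_angle_deg(10e-6, 20000):.1f} degrees")
# -> about 32 degrees
```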

Would the guest material be released and lost in the washing machine, though? To find out, Baughman ran tests in a Maytag washing machine at standard 40 °C washing temperatures – but also in a three-hour soak at 80 °C. In neither case did the team find the guest material to have been depleted.

They plan more tests, however, because different physical stresses that are used in spinning change the way the yarn is "scrolled up" – and that may affect guest particle retention.

The Texan team have now made yarns containing titanium dioxide, various conductors and even "high-temperature" superconductors such as magnesium diboride.

Why a superconducting yarn? Because this is not all about clothes: there could be many engineering applications for smart yarns in superconducting linear motors, batteries, supercapacitors and hydrogen storage systems.

Theoretically, the thin conducting skins that could be woven with this material could also have applications in stealth aircraft, as the material would be an ultralight radio-frequency radiation absorber that could foil radar. Baughman wouldn't comment on whether that is a target application, though he says aerospace firms are interested.

The team have lodged an international patent filing on the idea and are now working with what Baughman describes as "an agency" on the most immediate applications for it.

NewScientist

TStzmmalaysia
post Jan 8 2011, 09:42 AM

BIOTECHNOLOGY

NASA Satellites Can Help Farmers Save Massive Amounts of Water

Why is this important? Because irrigation for food production is about 70% (!) of water use in the U.S., and even more in some other countries. This is a gigantic amount of water, and while it is not destroyed (unlike fossil fuels), it could be used much more efficiently and we could keep many aquifers, rivers, and lakes in much better condition.

NASA researchers have developed a computer program to help farmers better manage irrigation systems in real time. The software uses data from NASA satellites, local weather observations, and wireless sensor networks installed in agricultural fields to calculate water balance across a field and provide farmers with information on crop water needs and forecasts that can be accessed from computers or handheld devices.

This system is being beta-tested by farmers and vineyard managers in the San Joaquin Valley in California as part of an 18-month research project to optimize irrigation management. The project is the first to combine satellite and surface observations to estimate irrigation needs at the scale of an individual field or vineyard, and distribute the information to farmers in near real time.
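A minimal sketch of the kind of water-balance bookkeeping such software performs. The function name, the Kc coefficient and all numbers are illustrative, not NASA's:

```python
# Field water balance for one period: demand minus what nature already supplied.
def irrigation_need_mm(et0_mm, crop_coeff, rainfall_mm, soil_storage_mm):
    """Irrigation depth (mm) needed to cover crop water use for a period.

    et0_mm: reference evapotranspiration from local weather data
    crop_coeff: Kc, crop-specific factor (satellite vegetation data can refine it)
    rainfall_mm: effective rainfall over the period
    soil_storage_mm: plant-available water already in the root zone
    """
    crop_water_use = et0_mm * crop_coeff                 # ETc, actual crop demand
    deficit = crop_water_use - rainfall_mm - soil_storage_mm
    return max(deficit, 0.0)                             # never negative

# Example week: 40 mm reference ET, vineyard Kc ~0.7, 5 mm rain, 10 mm stored
print(f"Irrigate: {irrigation_need_mm(40.0, 0.7, 5.0, 10.0):.1f} mm")  # -> 13.0 mm
```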

We're still in the early stages of this technology, but the trend is obvious, especially in countries where agriculture is still very inefficient (uses lots of water, chemicals, fertilizer, etc).

TreeHugger

TStzmmalaysia
post Jan 8 2011, 09:44 AM

ENERGY

Massive Energy-Generating Wind Tower Proposed for Japan

When considering wind power as an energy source, it’s best to think big. Japan-based ZENA Systems is working on developing a new type of wind energy generator that will dwarf anything before it. The 50-meter-tall hexagonal building essentially acts as a huge scoop that compresses wind from all directions and then runs the rushing air through a series of ground-based generators. The ambitious project is the first of its kind, and it includes a desalination plant, on-site energy storage, and a visitors center. Details are sketchy on the viability of the design, but if it works out this wind concept could someday reach high into the sky to power our grid.

The company explains the operation as a three-point compression technique that takes wind from any direction and compresses and accelerates it through a wind tunnel in the middle of the hexagonal tower. The air flows downward to a series of turbines, which convert the wind’s energy to electricity. The company claims that the system is not constrained by the Betz limit, which states that a turbine can harvest at most 59.3 percent of the kinetic energy in the wind.
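For reference, the Betz limit follows from simple wind-power arithmetic. A short sketch with illustrative numbers (not ZENA's figures):

```python
# The Betz limit in numbers: an open-flow turbine can capture at most
# 16/27 (~59.3%) of the kinetic power passing through its capture area.
def wind_power_watts(area_m2, wind_speed_ms, air_density=1.225):
    """Kinetic power in the wind through an area: P = 0.5 * rho * A * v^3."""
    return 0.5 * air_density * area_m2 * wind_speed_ms**3

BETZ_LIMIT = 16.0 / 27.0  # ~0.593

raw = wind_power_watts(100.0, 10.0)       # 100 m^2 capture area, 10 m/s wind
print(f"Power in the wind:     {raw / 1000:.1f} kW")
print(f"Betz-limited maximum:  {raw * BETZ_LIMIT / 1000:.1f} kW")
```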

To stir the industry up even more, the design calls for on-site energy storage, which ZENA explains on their website: “The E.A.S. is a new energy storage system used to stock the energy generated by the Wind Tower system. This system uses vanadium concentrated solution diluted with nano water and pure water.” They have picked a location in Kurume, Fukuoka, Japan for the massive project.

Inhabitat

TStzmmalaysia
post Jan 8 2011, 09:46 AM

APPLIED SCIENCES

Discreet Desert Eco Homes Planned for the Mojave Desert

A new desert home community has been planned for La Quinta, CA in the Mojave Desert, just outside of Palm Springs. The rich desert landscape covered in spiny Joshua Trees will continue to be respected in this new discreet eco-community. Designed by Platform for Architecture + Research, 18 homes will be constructed to fit within the natural topography, carefully minimizing any environmental disturbances. Each home has been designed for the hot desert climate of the Mojave and includes solar shading, solar power, natural ventilation, xeriscaping and more.

The 15-acre plot of land set aside for the desert home community has been subdivided into 18 plots, each almost 1 acre in size. In order to minimize the disturbance to the desert landscape, roads and driveways have been sited discreetly and paved naturally. Footprints for the homes have been minimized, while outdoor living spaces are maximized to take advantage of the moderate temperatures in the fall, winter and spring. Villas were also oriented to take advantage of the prevailing northwesterly winds for natural ventilation.

As the homes are located in a desert, extreme temperatures are of concern. As such, the villas were designed to take advantage of the sun in the winter, while hiding beneath the shade during the summer. Each single family residence will feature solar roofs, solar passive design with appropriate shading and thermal massing, shaded glazing as well as xeriscaping around the homes. Three different villa designs are organized around their own courtyard, and are well-thought-out extensions of the surrounding landscape.

Inhabitat



TStzmmalaysia
post Jan 8 2011, 09:49 AM

ENERGY

Graphene electrodes for organic solar cells

The standard material for the transparent electrodes on organic solar cells has so far been indium tin oxide, or ITO. But indium is expensive and relatively rare, so the search has been on for a suitable replacement. Now, a team of MIT researchers has come up with a practical way of using a possible substitute made from inexpensive and ubiquitous carbon. The proposed material is graphene, a form of carbon in which the atoms form a flat sheet just one atom thick, arranged in a chicken-wire-like formation.

An analysis of how to use graphene as an electrode for such solar cells was published on Dec. 17 in the journal Nanotechnology, in a paper by MIT professors Jing Kong and Vladimir Bulović along with two of their students and a postdoctoral researcher.

Graphene is transparent, so that electrodes made from it can be applied to the transparent organic solar cells without blocking any of the incoming light. In addition, it is flexible, like the organic solar cells themselves, so it could be part of installations that require the panel to follow the contours of a structure, such as a patterned roof. ITO, by contrast, is stiff and brittle.

The biggest problem with getting graphene to work as an electrode for organic solar cells has been getting the material to adhere to the panel. Graphene repels water, so typical procedures for producing an electrode on the surface by depositing the material from a solution won’t work.

The team tried a variety of approaches to alter the surface properties of the cell or to use solutions other than water to deposit the carbon on the surface, but none of these performed well, Kong says. But then they found that “doping” the surface — that is, introducing a set of impurities into the surface — changed the way it behaved, and allowed the graphene to bond tightly. As a bonus, it turned out the doping also improved the material’s electrical conductivity.

While the specific characteristics of the graphene electrode differ from those of the ITO it would replace, its overall performance in a solar cell is very similar, Kong says. And the flexibility and light weight of organic solar cells with graphene electrodes could open up a variety of different applications that would not be possible with today’s conventional silicon-based solar panels, she says. For example, because of their transparency they could be applied directly to windows without blocking the view, and they could be applied to irregular wall or rooftop surfaces. In addition, they could be stacked on top of other solar panels, increasing the amount of power generated from a given area. And they could even be folded or rolled up for easy transportation.

While this research looked at how to adapt graphene to replace one of the two electrodes on a solar panel, Kong and her co-workers are now trying to adapt it to the other electrode as well. In addition, widespread use of this technology will require new techniques for large-scale manufacturing of graphene — an area of very active research. The ongoing work has been funded by the Eni-MIT Alliance Solar Frontiers Center and an NSF research fellowship.

Peter Peumans, an assistant professor of electrical engineering at Stanford University, who was not involved in this study, says organic solar cells will probably become practical only with the development of transparent electrode technology that is both cheaper and more robust than conventional metal oxides. Other materials are being studied as possible substitutes, he says, but this work represents “very important progress” toward making graphene a credible replacement transparent electrode.

“Other groups had already shown that graphene exhibits good combinations of transparency and sheet resistance, but no one was able to achieve a performance with graphene electrodes that matches that of devices on conventional metal oxide (ITO) electrodes,” Peumans says. “This work is a substantial push toward making graphene a leading candidate.”
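As context for how transparency and sheet resistance are traded off, a standard thin-film relation from the transparent-conductor literature (not taken from this article) can be sketched; the film values below are illustrative:

```python
# A common way to compare transparent electrodes:
#   T = (1 + Z0/(2*Rs) * sigma_op/sigma_dc)^-2
# Z0 is the impedance of free space, Rs the sheet resistance; a lower
# sigma_op/sigma_dc ratio means a better transparent-conductor material.
Z0 = 377.0  # ohm, impedance of free space

def transmittance(sheet_resistance_ohm_sq, sigma_ratio):
    """Optical transmittance of a thin conducting film."""
    return (1.0 + Z0 / (2.0 * sheet_resistance_ohm_sq) * sigma_ratio) ** -2

# Illustrative film: Rs = 100 ohm/sq with sigma_op/sigma_dc = 0.03
print(f"T = {transmittance(100.0, 0.03):.1%}")
```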

PhysOrg

TStzmmalaysia
post Jan 9 2011, 10:49 AM

SPACE SCIENCE

NASA Tests New Propulsion System For Robotic Lander Prototype

NASA's Robotic Lunar Lander Development Project at Marshall Space Flight Center in Huntsville, Ala., has completed a series of hot fire tests and taken delivery of a new propulsion system for integration into a more sophisticated free-flying autonomous robotic lander prototype. The project is partnered with the Johns Hopkins University Applied Physics Laboratory in Laurel, Md., to develop a new generation of small, smart, versatile robotic landers to achieve scientific and exploration goals on the surface of the moon and near-Earth asteroids.

The new robotic lander prototype will continue to mature the robotic lander capability by bringing online an autonomous free-flying test lander capable of flights of up to sixty seconds, exercising the guidance, navigation and control system in demonstrations of controlled landing in a simulated low-gravity environment.

By the spring of 2011, the new prototype lander will begin flight tests at the U.S. Army's Redstone Arsenal Test Center in Huntsville, Ala.

The prototype’s new propulsion system consists of 12 small attitude control thrusters, three primary descent thrusters to control the vehicle’s altitude, and one large "gravity-canceling" thruster which offsets a portion of the prototype’s weight to simulate a lower-gravity environment, such as that of the moon or asteroids. The prototype uses a green propellant, hydrogen peroxide, at a higher concentration than the solution commonly used in homes as a disinfectant. The by-products after use are water and oxygen.
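The gravity-canceling idea reduces to simple arithmetic: the thruster must continuously supply the difference between the vehicle's Earth weight and its simulated weight. A sketch with a made-up prototype mass (the article gives none):

```python
# Thrust needed so a lander on Earth "feels" lunar gravity.
G_EARTH = 9.81  # m/s^2
G_MOON = 1.62   # m/s^2

def gravity_cancel_thrust(mass_kg, g_target=G_MOON):
    """Continuous upward thrust (N) so residual downward accel is g_target."""
    return mass_kg * (G_EARTH - g_target)

mass = 300.0  # kg, illustrative prototype mass
thrust = gravity_cancel_thrust(mass)
print(f"Required thrust: {thrust:.0f} N "
      f"({thrust / (mass * G_EARTH):.0%} of Earth weight)")
# -> about 83% of the vehicle's Earth weight must be offset to mimic the moon
```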

"The propulsion hardware acceptance test consisted of a series of tests that verified the performance of each thruster in the propulsion system," said Julie Bassler, Robotic Lunar Lander Development Project Manager. "The series culminated in a test that characterized the entire system by running a scripted set of thruster firings based on a flight scenario simulation."

The propulsion system is currently at Teledyne Brown’s manufacturing facility in Huntsville, Ala., for integration with the structure and avionics to complete the new robotic lander prototype. Dynetics Corp. developed the robotic lander prototype propulsion system under the management of the Von Braun Center for Science and Innovation, both located in Huntsville, Ala.

"This is the second phase of a robotic lander prototype development program," said Bassler. "Our initial 'cold gas' prototype was built, delivered and successfully flight tested at the Marshall Center in a record nine months, providing a physical and tangible demonstration of capabilities related to the critical terminal descent and landing phases for an airless body mission."

The first robotic lander prototype set a record flight time of ten seconds, descending from an altitude of three meters. It began flight tests in September 2009 and has completed 142 flight tests, providing a platform to develop and test algorithms, sensors, avionics, ground and flight software and ground systems to support autonomous landings on airless bodies, where aero-braking and parachutes are not options.

NASA

TStzmmalaysia
post Jan 9 2011, 10:58 AM

ENERGY

Charge Your Electric Vehicle Wirelessly

We’ve seen mats that allow you to charge your gadgets cordlessly just by placing them down (Chevy actually just unveiled the one that is going to be in all of their new Volts), but how about something a little bit larger – like a whole car? Well, that’s exactly what Fulton Innovation’s eCoupled technology does. Showcased at CES this week, the induction charger can re-juice your electric vehicle with no unruly wires necessary – all you have to do is park it.

Fulton’s eCoupled wireless charging tech was originally created for smaller electronics, and they say that this is the first time it’s been able to wirelessly charge a “high-powered device.” The company demonstrated the new technology, dubbed the PowerSpot, by powering up a shiny, red Tesla Roadster. The “spot” appears as a blue halo on the floor of your garage, and you can engage the accompanying adapter fitted to the underbody of your electric vehicle as long as you park it about 4 inches (in the case of the Tesla) above the induction pad.

You won’t be able to buy it just yet, but the technology seems promising and Fulton is looking to make its system the go-to wireless kit.



Inhabitat

TStzmmalaysia
post Jan 10 2011, 07:46 AM

APPLIED SCIENCES

Igloo-shaped 'Poo-Gloos' eat sewage

Inexpensive igloo-shaped, pollution-eating devices nicknamed "Poo-Gloos" can clean up sewage just as effectively as multimillion-dollar treatment facilities for towns outgrowing their waste-treatment lagoons, according to a new study.

"The results of this study show that it is possible to save communities with existing lagoon systems hundreds of thousands, if not millions of dollars, by retrofitting their existing wastewater treatment facilities with Poo-Gloos," says Fred Jaeger, chief executive officer of Wastewater Compliance Systems, Inc., which sells the Poo-Gloo under the name Bio-Dome.

Kraig Johnson, chief technology officer for Wastewater Compliance Systems, will present the study Jan. 13 in Miami during the Water Environment Federation's Impaired Water Symposium. It also will be published in the symposium program.

Wastewater treatment in small, rural communities is an important and challenging engineering task. Proper treatment includes disinfection and the removal of unwanted pollutants. Most rural communities rely on wastewater lagoons as their primary method of treatment because they are simple and inexpensive to operate. Lagoons are large ponds in which sewage is held for a month to a year so that solids settle and sunlight, bacteria, wind and other natural processes clean the water, sometimes with the help of aeration.

But as communities grow and-or pollution discharge requirements become more stringent, typical wastewater lagoons no longer can provide adequate treatment. Until now, the only alternative for these communities was to replace lagoons with mechanical treatment plants, which are expensive to build and operate. Mechanical plants treat water in 30 days or less, using moving parts to mix and aerate the sewage, speeding the cleanup. They require electricity, manpower and sometimes chemicals.

Johnson and his research team developed the Poo-Gloo when he worked as a research assistant professor of civil and environmental engineering at the University of Utah. The Poo-Gloo was designed to address the problem faced by communities outgrowing their sewage lagoons. The device provides a large surface area on which bacteria can grow, providing the microbes with air and a dark environment so they consume wastewater pollutants continuously with minimal competition from algae.

The new study outlines results of a pilot project conducted in 2009 at Salt Lake City's Central Valley Water Reclamation Facility. Wastewater Compliance Systems has since obtained an exclusive license from the University of Utah to commercialize Poo-Gloos, and the devices have now been deployed in six states in either full-scale installations or pilot demonstrations. Every installation showed that Poo-Gloos provide treatment that meets pollution-control requirements.

Lynn Forsberg, public works director for Elko County, Nev., recently started using Poo-Gloos in a county sewage treatment lagoon system in Jackpot, Nev., after a successful pilot test. "Our alternative was to go with a full-blown [mechanical] treatment plant that would cost about four times as much and be much more labor intensive," he says.

Poo-Gloos use a thriving bacterial biofilm to consume pollutants. Two dozen or more igloo-shaped Poo-Gloos are installed on the bottom of the lagoon, fully submerged and arrayed in rows. Each Poo-Gloo consists of a set of four progressively smaller, plastic domes nested within each other like Russian nesting dolls and filled with plastic packing to provide a large surface area for bacterial growth.

Rings of bubble-release tubes sit at the base of every Poo-Gloo and bubble air up through the cavities between domes. The air exits a hole in the top of each dome. As air moves through the dome, it draws water from the bottom of the lagoon up through the dome and out the top.

Each Poo-Gloo occupies 28 square feet of space on the bottom of a lagoon while creating 2,800 square feet of surface area for bacterial growth. The combination of large surface area, aeration, constant mixing and a dark environment that limits algae make Poo-Gloos capable of consuming pollutants at rates comparable with mechanical plants.
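The footprint-versus-surface-area arithmetic works out as follows (numbers from the article, script mine):

```python
# One Poo-Gloo: 28 ft^2 of lagoon floor, 2,800 ft^2 of biofilm growth surface.
FOOTPRINT_FT2 = 28.0
GROWTH_AREA_FT2 = 2800.0

multiplier = GROWTH_AREA_FT2 / FOOTPRINT_FT2
print(f"Surface-area multiplier per unit: {multiplier:.0f}x")

units = 24  # the "two dozen or more" per lagoon mentioned above
print(f"{units} units: {units * FOOTPRINT_FT2:.0f} ft^2 of floor -> "
      f"{units * GROWTH_AREA_FT2:,.0f} ft^2 of biofilm surface")
```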

Johnson spent time in the wastewater industry before obtaining his master's and doctoral degrees in civil and environmental engineering. In 2002, he set about developing a product that could be used to retrofit wastewater lagoons easily and inexpensively. After seven years, with the help of fellow professors, graduate students and a lot of laboratory tests, Johnson was ready for his first field test.

Johnson built a pilot unit using a large construction dumpster welded shut so it was water-tight. The container held seven Poo-Gloos. Johnson enlisted the help of Salt Lake's Central Valley Water Reclamation Facility to test it. The researchers ran multiple tests using untreated wastewater from the plant to determine the extent to which commonly regulated pollutants could be removed from the wastewater before discharge back to the treatment facility.

The study aimed to determine optimal operating conditions for Poo-Gloos and evaluate their performance at different water temperatures, levels of aeration, and sewage volumes and concentrations. The study found the devices consistently achieved high levels of treatment that were affected only slightly by changing water temperatures and aeration levels:

- Biological oxygen demand – a measure of organic waste in water – was reduced consistently by 85 percent using Poo-Gloos, and by as much as 92 percent.
- Total suspended solids fell consistently by 85 percent, and by as much as 95 percent.
- Ammonia levels dropped more than 98 percent with Poo-Gloo treatment in warmer water and, more important, by as much as 93 percent when temperatures dropped below 50 degrees Fahrenheit – conditions that normally slow bacterial breakdown of sewage.
- Total nitrogen levels fell 68 percent in warmer water and 55 percent in cooler water.

"The removal rates we saw during the pilot test are comparable to removal rates from a rotating biological contactor, which is a commonly used device in mechanical treatment facilities," Johnson says. "We couldn't be happier with the performance of the Poo-Gloos."

Johnson conducted the study with Hua Xu, a postdoctoral fellow in civil and environmental engineering at the University of Utah, and Youngik Choi, a professor of environmental engineering at Dong-A University in South Korea.

There may be uses for the Poo-Gloos beyond municipal wastewater treatment.

"The bugs will adapt to consume whatever is available," says Johnson. "In addition to the pollutants discussed in our paper, we've also seen great results in the consumption of other significant pollutants that I can't discuss now because we're in the process of filing patents. Poo-Gloos – or Bio-Domes as we call them – have a lot of potential, and we've only just scratched the surface."

Johnson and his team originally nicknamed the devices Poo-Gloos because they are shaped like igloos. But as possible uses began to expand to industries beyond municipal sewage treatment, Wastewater Compliance Systems decided to sell them as Bio-Domes.

"Every day I speak with community officials who need to upgrade their treatment facilities," says Taylor Reynolds, director of sales for Wastewater Compliance Systems. "They come to us because they receive an engineering report recommending a $4 million to $10 million mechanical plant project that is impossible for them to pay for with their existing tax base. Not only can our Poo-Gloos or Bio-Domes help communities comply with pollution limits, but most of the projects I quote cost between $150,000 and $500,000, and the operating expenses are a fraction of those at a mechanical plant."

Each Poo-Gloo requires little maintenance and the same amount of electricity as a 75-watt bulb, putting operating costs for Poo-Gloo systems at hundreds of dollars per month rather than thousands, which is typical of mechanical treatment plants. And some communities may operate Poo-Gloos "off-the-grid" by powering them with solar or wind energy systems.
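The electricity claim is easy to verify. The $0.10/kWh price is my assumption (a rough US average), not the article's:

```python
# Operating-power arithmetic: 75 W per unit, run continuously.
POWER_W = 75.0
PRICE_PER_KWH = 0.10  # USD, assumed electricity price

def monthly_cost_usd(units, hours=24 * 30):
    """Electricity cost for a month of continuous operation."""
    kwh = units * POWER_W * hours / 1000.0
    return kwh * PRICE_PER_KWH

print(f"24 units: ${monthly_cost_usd(24):.2f}/month")
# -> low hundreds of dollars, consistent with the article's claim
```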

The results of the new study prompted a number of communities to abandon more expensive alternatives in favor of installing Poo-Gloos. These early adopters can be found in the Nevada town of Jackpot in Elko County, Glacier National Park in Montana, and Plain City and Wellsville in Utah. Wastewater Compliance Systems also has deployed mobile pilot Poo-Gloos in Louisiana, Alabama and Wisconsin so potential customers, engineering firms and regulators can see first-hand how well they work before they commit tax dollars to the new technology.

"We know that small communities have limited budgets," Reynolds says. "That's why we developed our mobile pilot units. Even when our technology has the potential to save hundreds of thousands of dollars on an upgrade project, we like to provide our customers with peace of mind in knowing that our products will solve their problems for years to come."

WorldNews

TStzmmalaysia
post Jan 10 2011, 07:48 AM

SPACE SCIENCE

NASA Antarctica Discovery Might Foreshadow Potential Extraterrestrial Life Forms

A 2009 discovery at the bottom of the world could have implications for the search for extraterrestrial life. NASA researchers were astounded to find an amphipod swimming beneath the massive Ross Ice Shelf, about 12.5 miles from open water. NASA scientists were using a borehole camera to look back up towards the ice surface when they spotted the pinkish-orange creature swimming 600 feet below the ice, where the team had expected to find no higher life form than some microbes.

An image provided by NASA, taken in December 2009, shows the Lyssianasid amphipod, a relative of the shrimp. The NASA team had lowered a video camera to get the first long look at the underbelly of an ice sheet when the curious shrimp-like creature came swimming by and even parked itself on the cable attached to the camera. In a surprising discovery that shakes the idea of where higher life can thrive, scientists for the first time found a shrimp-like creature and a jellyfish frolicking beneath a massive Antarctic ice sheet.

About 150 new types of fish were among 500 new marine species, including furry crabs and a lobster off Madagascar, found in the seas in 2006, according to researchers in the 70-nation Census of Marine Life.

Many species were found in places long thought too hostile for life -- including by a vent spewing liquids at 407 Celsius (764.6F) and other habitats that were as inhospitable as planets such as Mars or Venus.

"The age of discovery is not over," said Jesse Ausubel, a program manager at the U.S. Sloan Foundation which is a sponsor of the 10-year Census. Finds "are provocative for NASA and for people who are interested in life in places other than Earth."

Among discoveries in 2006 were shrimps, clams and bacteria living by the searing 407C vent on the floor of the Atlantic Ocean north of Ascension Island, the hottest sea vent ever documented and more than hot enough to melt lead.

"This is the most extreme environment and there is plenty of life around it," said Chris German, of Britain's Southampton Oceanography Center and a leader of the Atlantic survey.

He said one big puzzle was how creatures coped with shifts in temperatures -- water on the seabed at 3,000 meters (9,842 ft) was just 2C yet many creatures withstood near-boiling temperatures of up to 80C from the thermal vent.

Researchers had not yet probed how hardy the microbes nearest the hottest part of the vent were -- a type of bacteria called "Strain 121" found in the Pacific in 2003 holds the record by being able to withstand temperatures of 121 Celsius. Another expedition found crustaceans, jellyfish and single-celled animals living in darkness in the Weddell Sea off Antarctica under ice 700 meters thick and 200 km (125 miles) from open water.

"You can think of it as a cave, one of the remotest caves on earth," Ausubel said of findings by a robot camera. "Wherever we've gone on earth we've continued to find life," German said. He said recent discoveries could be encouraging for the search for life elsewhere in the universe.

Astronomers speculate that Jupiter's moon Europa could hide an ocean beneath its frozen surface and Ausubel noted life has been found on Earth beside subsea methane seeps -- Saturn's moon Titan also has methane. And NASA said last week it had found signs of liquid water on Mars.

Among other 2006 finds by the census, due for completion in 2010, was a "Jurassic shrimp" in the Coral Sea east of Australia, previously thought to have gone extinct 50 million years ago.

DailyGalaxy

TStzmmalaysia
post Jan 10 2011, 09:08 AM

NANOTECHNOLOGY

Molecules can be guided by light alone

A few months ago, Münster scientists showed that certain molecules – so-called nano-containers – can be guided by light alone. This study has now been singled out for special praise: according to the journal Optics and Photonics News it is among the 30 best pieces of work done in 2010. In its special issue published at the end of the year, the journal – an international opinion-former – traditionally looks back on the research highlights of the past year in the fields of optics and photonics. The Münster study, which involved scientists working with Prof. Cornelia Denz (physics) and Prof. Luisa De Cola (chemistry), even made it onto the front page of the special issue – a particular honour.

"We're delighted that the hierarchical arrangement of molecules by means of light has received such a good reception by the research community," says physicist Mike Wördemann, who played a key role in the study. "This interdisciplinary work was only possible as a result of the first-class collaboration between physicists and chemists." The study was carried out jointly with Münster University's Centre for Nonlinear Science as part of the first German-Chinese Transregio Collaborative Research Centre (TRR 61) of the German Research Foundation.

The team of scientists has developed a new type of method to arrange minuscule nano-containers which are able to transport a wide variety of "guest molecules" such as medicines or other active ingredients in cavities inside themselves. It is not even necessary to touch the nano-containers – they are guided solely – as if by magic – by means of light from a non-visible, infrared high-performance laser. In this way completely new possibilities are created for the extremely precise control of artificial, nano-structured materials. In medical applications, for example, a container's filling – whether a medicine or another active ingredient – could be precisely positioned and its effect likewise precisely controlled.

The basic idea had already been published by the researchers in August in the interdisciplinary journal "Advanced Materials". Inspired by the universal principles of self-organization in nature, the scientists arranged the containers – which themselves contained highly ordered "guest molecules" – in an ordered structure and thus created a so-called hierarchical, supermolecular arrangement. This method, developed by close collaboration between physicists and chemists, has made it possible for the first time to steer each individual nano-container directly and arrange them on the nanometer scale with the greatest possible precision.

One special quality the containers have is their internal structure with innumerable, strictly arranged cavities which can be filled with a wide variety of "guest molecules". This high degree of order on the nanometer scale can, in itself, lead to fascinating new properties in the materials produced which cannot be realized with the "guest molecules" alone, say the scientists.

Universität Münster
TStzmmalaysia
post Jan 11 2011, 07:32 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


ENERGY

Attached Image

Making batteries last longer in electric vehicles

A potentially ‘green’ energy storage device which will help to power electric transport is to be officially launched at The University of Nottingham’s Malaysia Campus (UNMC) this week.

The Sahz-UNMC Pilot Plant produces ‘supercapacitors’ which are electrochemical storage devices with high power density. They are used to improve the lifetime of the batteries in electric vehicles.

The plant has developed the new supercapacitors under the brand name Enerstora. They are cost-saving and more environmentally-friendly when used in the manufacturing of electric cars, trains and other electric transportation. The supercapacitor also has important applications in other areas like solar energy and mobile devices where extremely fast charging is a valuable feature.

The unit was established by the UNMC Faculty of Engineering with industry partner Sahz Holdings to design and manufacture the devices with the eventual aim of building a high volume manufacturing plant in Malaysia. The fabrication process for this new technology was developed in collaboration with the Chemical Engineering Department at the University Park campus in Nottingham, UK.

The plant is sponsored by the Malaysian government and the country’s former Prime Minister, Tun Mahathir, will conduct the official opening on Tuesday 11 January 2011. The event will also see the signing of a memorandum of understanding between Sahz and two future partners, 2M Engineering of the Netherlands and Semyung Ever Energy Co.Ltd of South Korea.

Electric vehicles and ‘green’ cars will account for up to a third of total global sales by 2020, according to a recent report from Deloitte's global manufacturing industry group. At present most electric vehicle batteries typically need replacing every three to five years. The motivation for establishing the pilot plant was to develop the supercapacitor which will extend and maximise the life of the batteries to help conserve the natural environment and global energy resources.

A low state of battery charge leads to sulphation and stratification, both of which shorten the life of the cell. Short battery life is also caused by continuous draining and charging, which has a detrimental effect on the battery. By using the new supercapacitor-battery hybrid technology for energy storage in electric vehicles, the battery life can be lengthened, the battery size reduced and a higher state of charge maintained for a longer period of time.
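The benefit of the hybrid scheme described above can be seen with simple arithmetic: the supercapacitor sources the brief high-current pulses, so the battery only ever sees the much gentler average load. A back-of-the-envelope sketch (all currents and duty cycles below are illustrative assumptions, not Enerstora specifications):

```python
# Why a supercapacitor buffer helps a battery: batteries age fastest under
# high pulse currents, while a supercapacitor can source the pulses and let
# the battery supply only the steady average.

pulse_current_a = 2.0   # hypothetical peak demand (e.g. a motor surge)
idle_current_a = 0.2    # hypothetical steady background demand
pulse_fraction = 0.1    # fraction of the time spent at peak

# Average current the battery must supply when the supercap absorbs pulses.
avg_current_a = (pulse_fraction * pulse_current_a
                 + (1 - pulse_fraction) * idle_current_a)

print(f"battery peak without supercap: {pulse_current_a:.2f} A")
print(f"battery load with supercap:    {avg_current_a:.2f} A")  # 0.38 A
```

With these made-up numbers the battery's worst-case load drops from 2.0 A to 0.38 A, which is the mechanism behind the longer life and higher maintained state of charge described above.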

The pilot plant is now manufacturing Electric Double-Layer Capacitors (EDLCs) in a reproducible and economically viable way. Historically it has been difficult to scale up production to economically viable levels without losing quality and performance, because the equipment and processes in a large plant differ significantly from those in a pilot plant. The Sahz-UNMC plant is also aiming to design a system capable of predicting the yield and quality of the product when it is eventually manufactured on a large scale.

At the launch and signing ceremony Malaysia’s former Prime Minister Tun Mahathir will be given a guided tour of the lab facilities. Leading the pilot plant at UNMC, Professor Dino Isa said:

“The project is one which is leading the market. We are gambling on the demise of hydrocarbon driven vehicles and the eventual emergence of the pure electric vehicle as the dominant means of transport in the next ten years.

“We are also looking to capitalize on the popularity of mobile devices which, as platforms and technologies converge, will require energy storage systems to deliver pulsed voltage and current that drains and eventually kills normal batteries if unaided by a supercapacitor integrated within the system. We thank the Malaysian Ministry of Science, Technology and Innovation (MOSTI) and Sahz for their trust and insight, I personally thank the University for providing me with an environment conducive to inquisitive research, my students for their patience and hard work and not least Prof George Chen for his input during the initial stages of the project."

PhysOrg

TStzmmalaysia
post Jan 11 2011, 07:34 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


ROBOTICS

Attached Image

Cockroach inspires robotic hand to get a grip

No one thinks twice about picking up a cup of coffee, but this task has vexed robots for three decades. A new type of mechanical hand developed by researchers at Harvard and Yale promises to solve this problem.

In a makeover inspired by cockroach legs, the engineers chose not to make their robotic hand smarter but to redesign its form to suit a dumb robot.

"People have been trying to build robotic hands for 20 or 30 years, but those hands have rarely been able to perform dexterous tasks," explained Robert D. Howe, who heads Harvard's BioRobotics Laboratory. Howe worked with Aaron Dollar, a former graduate student and now an assistant professor of engineering at Yale, to develop the new hand.

In the real world, Howe explained, both robots and humans have trouble estimating the relationship between their hand and the object they want to grasp. Humans compensate for errors by opening their hands and making their fingers soft and flexible, so they can glide along an object's edge before wrapping around it to pick it up.

"The traditional approach in robotics research is to deal with errors using elaborate sensors, motors, and controls," Howe explained. The resulting mechanical hands were very complex and expensive. They were also slow, since they required lots of computing power to perform even the simplest tasks.

Consider a robotic hand trying to pick up a wine glass. Unless it moved at glacially slow speeds, it might knock over the goblet before it could react to the sensor signal that it had made contact.

"We took the opposite approach and tried to understand the fundamental mechanics using good mechanical design practices," Dollar said. Their goal was to reinvent the mechanical hand so that it automatically compensated for errors and adapted to grasp a variety of shapes.

Surprisingly, Howe said, their inspiration came from cockroach legs.

Starting in the late 1980s, University of California, Berkeley professor Robert Full began investigating how cockroaches could walk and run over uneven surfaces. Cockroaches have minuscule brains, so Full knew that they could not possibly be computing their movements so quickly.

Full analyzed the mechanics of cockroach legs to see how they worked. It turns out that their legs are flexible and springy. This lets them adjust to uneven surfaces automatically, without thinking. Full created robotic legs that duplicated these properties using springs and hinges and built an eight-legged robot that could run over uneven ground at breakneck speeds, something no robot had ever done before.

Full's demonstration startled roboticists, and Dollar and Howe decided to take a similar approach to building a hand. If they could get the springs and finger shapes and sizes just right, the hand would be flexible enough to glide along objects until it wrapped around them, just like a human hand lifting a coffee cup.

First, Dollar and Howe stripped the hand down to its essentials: a claw of two double-jointed plastic fingers with a single motor controlling them via cables and pulleys. Dollar then built a mathematical model to simulate how the hand would react to various shapes and sizes at different levels of springiness and flexibility.

Dollar eventually added another set of fingers for a surer grip. Despite its four-fingered form, the resulting hand has several human characteristics. At rest, its joints are opened from 25 to 45 degrees, and the joints at the base (our palm) are more flexible than the joints of the fingers. Dollar also added sensors to detect when a finger touches an object and the angle of the joints. While the hand automatically adjusts for many small errors, the sensors enable it to compensate for larger miscues.

The result is a very simple hand that can grasp a wide range of objects. It could become a platform for future household and service robots, where the ability to grasp different objects is important.

Dollar is also pursuing prosthetic hands. Each finger weighs under 1.5 ounces, an advantage because many amputees abandon mechanical devices on account of their weight.

The hand cannot grasp and manipulate small objects, such as keys or forks. Such dexterity will require additional motors, which would increase the hand's weight and complexity. Dollar is also looking at a new configuration with an opposable thumb.

For three decades, researchers sought to make hands better by making them more complex. By embracing the simplicity of nature-based design, Dollar and Howe have given roboticists a new grip on building mechanical hands. There is still much work to do.

PhysOrg

TStzmmalaysia
post Jan 11 2011, 07:36 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


TRANSPORTATION

Attached Image

'Superstreet' traffic design improves travel time, safety

The so-called "superstreet" traffic design results in significantly faster travel times, and leads to a drastic reduction in automobile collisions and injuries, according to North Carolina State University researchers who have conducted the largest-ever study of superstreets and their impacts.

Superstreets are surface roads, not freeways. A superstreet is a thoroughfare where left-hand turns from side streets are re-routed, as is traffic from side streets that needs to cross the thoroughfare. In both instances, drivers are first required to make a right turn and then a U-turn around a broad median. While this may seem time-consuming, the study shows that it actually results in significant time savings, since drivers are not stuck waiting to make left-hand turns or for traffic from cross-streets to go across the thoroughfare.

"The study shows a 20 percent overall reduction in travel time compared to similar intersections that use conventional traffic designs," says Dr. Joe Hummer, professor of civil, construction and environmental engineering at NC State and one of the researchers who conducted the study. "We also found that superstreet intersections experience an average of 46 percent fewer reported automobile collisions – and 63 percent fewer collisions that result in personal injury."

The researchers assessed travel time at superstreet intersections as the amount of time it takes a vehicle to pass through an intersection from the moment it reaches the intersection – whether traveling left, right or straight ahead. The travel-time data were collected from three superstreets located in eastern and central North Carolina, all of which have traffic signals. The superstreet collision data were collected from 13 superstreets located across North Carolina, none of which have traffic signals.

The superstreet concept has been around for over 20 years, but little research had been done to assess its effectiveness under real-world conditions. The NC State study is the largest analysis ever performed of the impact of superstreets in real traffic conditions.

EurekAlert

TStzmmalaysia
post Jan 12 2011, 09:23 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


SPACE SCIENCE

Attached Image

First “solid” exoplanet found!

Astronomers have just announced the discovery of the first planet orbiting another star that is unequivocally not a gas giant: it must be a very dense, rocky-metallic object not much bigger than the Earth!

The planet, discovered by the orbiting Kepler telescope, is called Kepler-10b. Its star, Kepler-10, is roughly the same mass and temperature as the Sun and is located over 500 light years away.

The planet was detected because it passes directly between us and the star as it orbits. When it does that, it makes a mini-eclipse, blocking a bit of light from the star. By knowing how big the star is and how much light is blocked, the size of the planet can be measured (the bigger the planet, the more light is blocked). In this case, Kepler-10b is only about 1.4 times the diameter of the Earth, making it the smallest exoplanet ever found!
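The geometry here reduces to one relation: the fractional dip in brightness is roughly the square of the planet-to-star radius ratio. A quick sketch of that arithmetic, assuming a Sun-sized star for Kepler-10:

```python
# Transit depth: fraction of starlight blocked ~ (R_planet / R_star)^2.
R_SUN_KM = 696_000    # solar radius in km
R_EARTH_KM = 6_371    # Earth radius in km

def transit_depth(planet_radius_earths, star_radius_suns=1.0):
    """Fractional dip in stellar brightness during a transit."""
    ratio = (planet_radius_earths * R_EARTH_KM) / (star_radius_suns * R_SUN_KM)
    return ratio ** 2

# Kepler-10b: ~1.4 Earth radii around a roughly Sun-sized star.
depth = transit_depth(1.4)
print(f"brightness dip: {depth:.2e}")  # ~1.6e-4, a dip of about 0.016%
```

A dip of less than two parts in ten thousand gives a sense of the photometric precision Kepler needs to find planets this small.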

However, there’s more. The planet’s gravity tugs on the star as it orbits, so as the planet makes a big circle around the star, the star makes a little circle in response (I like to use the analogy of a father dancing with his small daughter; as he swings her around she makes a big circle around him and he makes a little circle, because he’s much more massive than she is). As the star moves slightly toward and away from us we can measure the change in velocity using the Doppler shift, and that in turn tells us the mass of the planet. It turns out Kepler-10b is a lot more massive than the Earth, tipping the scales at 4.6 times the Earth’s mass.

So it’s not terribly earth-like; if you stood on its surface you’d weigh almost 2.5 times what you do now!
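That "almost 2.5 times" figure follows directly from the numbers quoted above, since surface gravity scales as mass over radius squared (working in Earth units):

```python
# Surface gravity relative to Earth: g ~ M / R^2, with M and R in Earth units.
mass_earths = 4.6     # Kepler-10b mass from the radial velocity measurement
radius_earths = 1.4   # Kepler-10b radius from the transit depth

g_relative = mass_earths / radius_earths ** 2
print(f"surface gravity: {g_relative:.2f} g")  # ~2.35 g, i.e. "almost 2.5x"
```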


Even worse, it orbits extremely close in to its star, circling over the star’s surface at a distance of roughly 3 million kilometers (1.8 million miles) — amazingly, it takes less than an Earth day to make one circuit. But being that close to a star comes at a price: the surface temperature of the planet must be several thousand degrees! So yeah, you’d weigh more there, but not for long. You’d burn through those extra calories pretty rapidly. Literally.

And even worse, it’s almost certain the planet is tidally locked to its star, meaning it always shows one face to the star (like the Moon does to the Earth). So the side facing the star is scorching hot and probably glowing brightly with heat, as shown in the artist’s depiction above. This is truly a scary, hellish world.

I’ve seen a lot of reports already calling the planet "solid", but I think it’s clear that it must actually be molten. I think the reports are trying to distinguish it from the usual hot super-Jupiters found around other stars, planets that are bloated gas giants. Kepler-10b is certainly much smaller and therefore not a gas giant.

So this planet is not even close to being earth-like and habitable, but it’s still the lowest-mass and smallest planet ever found orbiting a sun-like star. This is a huge milestone, because it shows without doubt that Kepler has the potential to fulfill its mission of finding a truly earth-like planet orbiting a distant star. How long will it be before we see one of those? If they exist and Kepler can spot one it’ll still be a couple of years, sadly, but the good news is we should be able to see it. And that’s exciting enough for me for now.

Discovery

TStzmmalaysia
post Jan 12 2011, 09:24 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


BIOTECHNOLOGY

Attached Image

Fruit-bearing solar crop dryer could provide for thousands

A solar crop dryer developed by a UNSW photovoltaic and solar energy engineering student has the potential to provide a living for thousands of people throughout Vanuatu.

Telia Curtis, 29, who developed the solar tunnel dryer as part of her Masters thesis, wanted to create a unit that would allow families to dry and sell the abundant fruits and nuts found through the South Pacific island nation.

“I want to create a design that could be built out of easily accessible materials,” says Telia of her bamboo, wood, corrugated iron and polythene film construction, which she adapted from a German design. It was developed in conjunction with Charles Longwah of the Vanuatu Kava Store.

The dryer works with solar powered fans for forced convection, with products being laid out on mesh trays and air forced over them to extract moisture from the foods.

"Charles Longwah is running electric dryers at the moment and his energy bills are enormous," says Telia. "This is a very simple design that works with a lot of different materials using radiation from the sun."

The units can easily be used by people living in the more remote islands where poor transport infrastructure currently blocks trade in many fresh local products. Telia demonstrated her prototype dryers at an open day in Port Vila to an overwhelmingly enthusiastic audience.

Now local builders and community groups are seeking government and aid funding to build and distribute the units.

Jacob Kapere, of the Vanuatu Cultural Centre, said the units could allow people to make a living on their home islands instead of having to move to Port Vila for employment.

"People were so excited," says Telia. "It’s particularly great for women because they are the ones that sell foods through the local markets. Mango, paw paws, tamarind and nangai nuts. There’s great potential for all of these."

Contacts for Telia’s project were developed by Dr. Richard Corkish, head of the School of Photovoltaic and Renewable Energy Engineering, who has been running student projects in Vanuatu since 2008, and the project is co-supervised by Professor Robert Fuller, a renowned solar dryer expert from Deakin University. "I have been talking to the shop owner who we developed this with on every visit to Vanuatu I’ve made, so it’s great to come up with the goods at last," said Dr. Corkish. "To see people so interested at the launch has enlivened my commitment to the place."

PhysOrg


TStzmmalaysia
post Jan 12 2011, 09:25 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


APPLIED SCIENCES

Attached Image

Kinetic energy harvester charges smartphones off-the-grid

The nPower PEG we first tried in prototype form at CES 2009 finally goes on sale tomorrow (May 3, 2010). The PEG is a lightweight, titanium-encased portable generator that can recharge a handheld device (phone, media player, camera, GPS etc.) when you are away from the grid. It is unlike other mobile power solutions in that you don't need any fuel, don't need to turn hand cranks and don't need the sun. The US$150 PEG is 9 inches long, weighs 9 ounces and harvests kinetic energy as you move about in your daily life. Just put it in your backpack, bumbag, handbag, briefcase or glovebox and it will collect and store energy from your movements. The first 1000 units will appropriately be engraved as “First Mover” Editions.

The personal energy generator uses the same principle as battery-less flashlights to harvest energy from the environment - Faraday's Principle of Electromagnetic Induction.
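Faraday's law puts a number on that principle: the EMF induced in a coil is the number of turns times the rate of change of magnetic flux through it. A rough sketch for a magnet oscillating through a coil, modelled sinusoidally (turn count, peak flux and shake frequency are all invented illustrative values, not the PEG's actual design parameters):

```python
import math

# Faraday's law: emf = -N * dPhi/dt. For flux varying as
# Phi(t) = Phi_max * sin(2*pi*f*t), the peak emf is N * Phi_max * 2*pi*f.

N = 500            # coil turns (assumed)
phi_max_wb = 1e-4  # peak flux per turn in webers (assumed)
f_hz = 2.0         # oscillation frequency, roughly walking pace (assumed)

peak_emf = N * phi_max_wb * 2 * math.pi * f_hz
print(f"peak emf: {peak_emf:.2f} V")  # ~0.63 V for these numbers
```

The point of the sketch is the scaling: more turns, stronger magnets or faster motion all raise the induced voltage, which is why the device harvests more on a jog than in a glovebox.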

The nPower® PEG’s charge rate varies widely based on the specific user’s movement. On average, consumers lose battery power in their cell phones once per week. When the PEG is carried during one’s daily activities, it continually tops off its internal power storage, so that users will never be without power.

Meeting the USB 2.0 standard, the nPower® PEG’s 1000 mAh battery charges handheld electronics at the same rate as a wall outlet.

The amount of “talk” or “play time” that the PEG will provide varies based on the type of device being charged. The nPower® PEG uses a fully recyclable lithium polymer battery to stockpile energy, and through its standard USB 2.0 interface and universal iGo tip system it can deliver that energy to more than 90% of hand-held devices on the market.

The PEG is a product from kinetic energy harvesting specialist Tremont Electric. The nPower® technology is scalable both up and down in size and will eventually power a product range which extends from small, implantable biomedical generators to large commercial-scale wave energy converters that sit in the ocean.

Gizmag


TStzmmalaysia
post Jan 12 2011, 09:28 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

Spintronic Nanoelectronics based on Magneto-Logic Gates

This paper presents a novel design concept for spintronic nanoelectronics that emphasizes a seamless integration of spin-based memory and logic circuits. The building blocks are magneto-logic gates [1] based on a hybrid graphene/ferromagnet material system. We use network search engines as a technology demonstration vehicle and present a spin-based circuit design with smaller area, faster speed, and lower energy consumption than the state-of-the-art CMOS counterparts. This design can also be applied in applications such as data compression [2]–[5], coding [6] and image recognition [7], [8]. In the proposed scheme, over 100 spin-based logic operations are carried out before any need for a spin-charge conversion. Consequently, the supporting CMOS electronics consumes little power. The spintronic-CMOS integrated system can be implemented on a single 3-D chip. These nonvolatile logic circuits hold potential for a paradigm shift in computing applications.

The continued Moore’s law scaling in CMOS integrated circuits poses increasing challenges to provide low energy consumption, sufficient processor speed, bandwidth of interconnects, and memory storage [9], [10]. CMOS logic circuits rely on the von Neumann computer architecture, consisting of central processing units (CPU) connected by some communication channel to memory. The bottleneck caused by the communication (sometimes dubbed the von Neumann bottleneck) and memory accesses is the underlying reason for the significant and widening gap between fast-improving transistor performance and our relatively stagnant ability to provide correspondingly faster program execution. Such bottlenecks are especially obvious for data-intensive applications where most of the actions involve accessing or checking data (rather than doing complex computation). Network routers are a classical example, where the Internet protocol (IP) address is compared to a list of patterns to find the best match for further processing. In a typical router, special-purpose associative tables are used where the search data (the key) is simultaneously compared with all of the table entries to find matches. Such tables allow us to fundamentally improve the processing speed. Unfortunately, conventional CMOS implementations of these circuits also suffer from scalability issues, making them ineffective for larger search problems that are increasingly relevant to modern workloads.
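The router example can be made concrete. An associative table compares the key against every stored prefix in a single parallel step; plain software must instead scan the table, which is exactly the scalability problem described above. A minimal longest-prefix-match sketch (the routes and next-hop names are invented for illustration):

```python
# Longest-prefix match over IPv4 routes: the operation a router's
# associative table performs in one parallel lookup, emulated here
# with a linear scan. Routes and next hops are made up.

ROUTES = {
    "10.0.0.0/8": "hop-A",
    "10.1.0.0/16": "hop-B",
    "10.1.2.0/24": "hop-C",
}

def ip_to_int(ip):
    """Pack a dotted-quad IPv4 address into a 32-bit integer."""
    a, b, c, d = (int(x) for x in ip.split("."))
    return (a << 24) | (b << 16) | (c << 8) | d

def longest_prefix_match(ip):
    """Return the next hop of the most specific matching route, or None."""
    addr = ip_to_int(ip)
    best_len, best_hop = -1, None
    for route, hop in ROUTES.items():
        net, plen = route.split("/")
        plen = int(plen)
        mask = 0 if plen == 0 else (0xFFFFFFFF << (32 - plen)) & 0xFFFFFFFF
        if (addr & mask) == (ip_to_int(net) & mask) and plen > best_len:
            best_len, best_hop = plen, hop
    return best_hop

print(longest_prefix_match("10.1.2.7"))  # hop-C (most specific match wins)
print(longest_prefix_match("10.9.9.9"))  # hop-A (only the /8 matches)
```

The scan is O(table size) per lookup; a hardware associative table does the same comparison against all entries at once, which is the speed advantage the spin-based design aims to deliver at lower area and energy.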

Paper continues HERE
TStzmmalaysia
post Jan 12 2011, 09:31 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

Generation of Medical X-ray and Terahertz Beams of Radiation Using Table-Top Accelerators

Theoretical and experimental studies of PXR and diffracted radiation of an oscillator in crystals, combined with the development of VFEL generators with photonic crystals, give a promising basis for the creation of X-ray and THz sources using the same table-top accelerator. A multi-modal medical facility can be developed on the basis of one dedicated table-top electron accelerator of some tens of MeV energy. Such a system could find many applications in medical practice and biomedical investigations.

The Executive Summary of the first Workshop “Physics for Health in Europe”, held in February 2010 at CERN, stresses that dose reduction during diagnostic radiology and CT examinations is a hot research topic worldwide [1]. Development of new intensive (quasi-)monochromatic tunable X-ray and terahertz sources is an important part of such research.

Medical quasi-monochromatic X-ray beams must have a substantial integral flux to provide high-quality, high-contrast imaging. Also, realistic sources of these beams must have laboratory sizes and an affordable price to be used in clinics and hospitals. The main problem in the development of monochromatic X-ray sources is the gap between the achievable photon generation efficiency (photons per electron) and the existing electron beam current in table-top accelerators. Another problem is the strong scattering of an electron moving through a single-crystal target. We showed [2] that an X-ray source with the required properties can be developed using parametric X-rays (PXR) from charged particles in a crystal [3] and table-top accelerators like a compact storage ring or pulsed race-track microtron [4, 5, 6, 7, 8].

Accelerators of this kind can also be used for the development of an intense terahertz source based on the mechanism of the volume free electron laser [9]. Today, THz technology is finding a variety of applications: information and communications technology; biology and medical sciences; non-destructive evaluation and homeland security; quality control of food and agricultural products; global environmental monitoring; space research and ultra-fast computing [10]. High-power tunable T-ray sources are very important devices to bring THz research from promising prospects to wide use in science and technology.

T-rays are also very promising for biomedical applications: the low energy of the photons prevents them from ionizing biological media, but this energy corresponds to vibrational levels of important biomolecules including DNA and RNA. This allows direct action for stimulating viruses, cells and their components, and provides control of biochemical reactions. Thus, T-rays may be applied in therapy, surgery, imaging, and tomography. Terahertz radiation is extremely important for biomedical applications, and its wider use depends on progress in the development of THz sources [11].

In the present paper we discuss prospects for the application of diffracted radiation of an oscillator (DRO), sometimes called diffracted channeling radiation (DCR), to X-ray source creation. DRO is the coherent process of diffracted X-ray photon emission by a relativistic oscillator (an electron channeling in a crystal). DRO formation was first considered in [12]; a detailed review and references may be found in [13, 14]. Evaluations show that the DRO generation efficiency per electron is several times higher than that of PXR and diffracted Bremsstrahlung.

The same radiation mechanisms can work in the terahertz range if the single-crystal target is replaced with a photonic crystal of appropriate properties [15, 16, 17, 18]. Thus, a multi-modal medical facility can be developed on the basis of one dedicated table-top electron accelerator of several tens of MeV energy.

Please read the rest of the paper HERE

TStzmmalaysia
post Jan 12 2011, 09:33 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

Planck Satellite Team Uncovers Secrets of the Universe

University of British Columbia researchers are part of European Space Agency's Planck satellite mission that is revealing thousands of "exotic" astronomical objects, including extremely cold dust clouds, galaxies with powerful nuclei, and giant clusters of galaxies.

The international collaboration of scientists from 15 countries is presenting more than 25 scientific papers on January 11 in Paris, France, on the first results from the Planck mission. Launched in 2009, the Planck satellite is probing the entire sky at microwave wavelengths from 0.35 millimetre to one centimetre. By measuring the cosmic microwave background -- the oldest source of light in the Universe -- at these wavelengths, Planck has provided an unprecedented view of the sky.

Amongst the results is the Early Release Compact Source Catalogue, a guidebook of 10,000 extraterrestrial objects that, while appearing unremarkable through optical telescopes from Earth, may provide vital information about the structure and evolution of the Universe.

"In addition to studying the microwave background, Planck maps the 'foreground' emission from our own galaxy and all the stuff between here and the background," says UBC astronomy professor Douglas Scott, who led the UBC team that helped develop software to analyze and calibrate the vast amounts of data before they were used to create maps of the sky.

"This new information will help scientists learn about the coldest clumps of gas and dust where stars are forming, the properties of unusual forms of cosmic dust, huge clusters of never-before-seen galaxies, and how dust-filled galaxies evolve over the history of the Universe," says Scott.

Planck is a European Space Agency mission, with contributions from other agencies including the Canadian Space Agency (CSA). The CSA supports two Canadian research groups based at UBC and at the University of Toronto.

The UBC team consists of Prof. Scott and research associates Adam Moss, Jim Zibin and Andrew Walker in the Dept. of Physics and Astronomy.

Highlights from the latest research include:

- A new list of the coldest clumps of gas and dust within our own galaxy: Planck's multi-colour survey allows these objects to be picked out easily. Dense cores of dust and gas were found with temperatures as low as just seven degrees above absolute zero (or -266 degrees Celsius). Now that their locations are known, follow-up studies with other telescopes will help understand how these clouds are turning into stars.

- Strong evidence for the existence of what astronomers call "anomalous" dust: interstellar grains of material that behave in an unexpected way. The anomalous dust emits a different light spectrum from other thermally emitting grains; the most likely explanation is that the spectrum peaks at such short wavelengths because the dust particles are spinning billions of times a second.

- Clusters of hundreds of galaxies discovered through their effect on the cosmic "background": This "SZ effect," named after Russian astrophysicists Rashid Sunyaev and Yakov Zeldovich, results from the scattering of the background by hot electrons in the clusters. This action is akin to putting a coloured filter onto the microwave sky, altering the spectrum in the direction of the cluster. Planck's combination of multiple wavelength bands has allowed the detection of about 200 clusters.

- The Cosmic Infrared Background: a glow at infrared and microwave wavelengths, coming from the total light emitted by dust across all the galaxies in the Universe. Planck allows astronomers to study the evolution of these galaxies by comparing variations across the sky among different wavelength bands.

ScienceDaily

TStzmmalaysia
post Jan 12 2011, 09:40 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH


Nanodisk gene therapy

One of the challenges of gene therapy - a set of methodologies aimed at treating disease by delivering nucleic acids (DNA or RNA) - is to ensure that this material arrives directly at the nucleus of the cell without losing a substantial amount along the way and without producing any undesired side effects. With this aim, scientists experiment with different types of vectors: molecules capable of transporting genetic material to the correct place. Presently, natural "deactivated" viruses are the most commonly used vectors in clinical trials; however, their side effects often limit therapeutic application.

One of the most promising alternatives in this field is the use of artificial viruses. These viruses can be constructed through genetic engineering by assembling minute protein structures made up of peptides, the building blocks of proteins.

The team of scientists, led by Antonio Villaverde, lecturer in the Department of Genetics and Microbiology, researcher at the UAB Institute of Biotechnology and Biomedicine and at the Biomedical Research Networking Center in Bioengineering, Biomaterials and Nanomedicine (CIBER-BBN), demonstrated that the peptide R9, formed from a specific amino acid (arginine), can encapsulate genetic material, assemble with other identical molecules to form nanoparticles and enter directly into the cell nucleus to release the material it contains. The nanoparticles have the shape of a disk, with a diameter of 20 nanometres and a height of 3 nm.

The study, published recently in the journals Biomaterials and Nanomedicine, describes how scientists studied the behaviour of R9 nanodisks in the interior of cells using confocal microscopy techniques provided by the UAB Microscopy Service and applied by Dr Mònica Roldán. The images show that once the particles cross the cell membrane, they travel directly to the nucleus at a rate of 0.0044 micrometres per second, ten times faster than if they dispersed passively in the cell interior. The nanoparticles accumulate in the nucleus rather than in the cytoplasm - the thick liquid between the cell membrane and nucleus - which increases their effectiveness. One of the photos was selected by the journal Biomaterials as one of its 12 images of the year for 2010.
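
As a back-of-envelope check of the transport figures above (the ~10 micrometre membrane-to-nucleus distance is an assumed, typical value for a cultured cell, not stated in the article):

```python
# Transit times implied by the reported nanodisk speed.
active_speed_um_per_s = 0.0044                        # measured speed
passive_speed_um_per_s = active_speed_um_per_s / 10   # "ten times faster"
distance_um = 10.0                                    # assumed distance

active_time_min = distance_um / active_speed_um_per_s / 60
passive_time_min = distance_um / passive_speed_um_per_s / 60
print(f"active transport:  ~{active_time_min:.0f} min")   # ~38 min
print(f"passive dispersal: ~{passive_time_min:.0f} min")  # ~379 min
```

Even with active transport, delivery takes tens of minutes; the ten-fold speed-up is what keeps the cargo from being lost or degraded en route.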

Participating in this discovery were researchers from the Institute of Material Science of Barcelona (ICMAB-CSIC), the Catalan Institute for Research and Advanced Studies (ICREA), and the Technical University of Catalonia. The discovery represents a new category of nanoparticles offering therapeutic benefits. According to Dr Esther Vázquez, director of the project, "nanodisks assemble automatically, move rapidly, remain stable and travel to the interior of the nucleus. This makes them a promising tool as a prototype for the safe administration of nucleic acids and functional proteins."

EurekAlert

TStzmmalaysia
post Jan 12 2011, 09:43 AM



RESEARCH


Lithium-ion ultracapacitor could recharge power tools in minutes

Although many people keep a few power tools in the garage or basement for weekend projects, the tools usually don’t get used very often. Fully recharging the battery in a drill or saw can take several hours, even if the tool is only used for a few minutes. But with a hybrid energy-storage device that combines a lithium-ion battery with an ultracapacitor, power tools could be recharged in about one minute and have a lifetime of more than 20,000 charges. The downside is that the power tool could run for only about 1/15 as long as it would on a normal battery.

The new lithium-ion ultracapacitor was developed by Ioxus, a company based in Oneonta, New York. The company specializes in making ultracapacitors for hybrid-electric buses and engine start-stop systems in fuel-efficient cars.

In general, hybrid lithium-ion ultracapacitors are similar to traditional lithium-ion batteries, except that they store charge at the surface of the electrodes instead of within the electrodes. Although the concept of hybrid lithium-ion ultracapacitors has been around for 20 years, demand for alternative energy-storage devices has inspired recent improvements.

Typically, standard ultracapacitors can store only about 5% as much energy as lithium-ion batteries. Ioxus’ new hybrid system can store about twice as much as standard ultracapacitors, although this is still much less than standard lithium-ion batteries. However, the advantage of ultracapacitors is that they can capture and release energy in seconds, providing a much faster recharge time compared with lithium-ion batteries. In addition, traditional lithium-ion batteries can be recharged only a few hundred times, which is much less than the 20,000 cycles provided by the hybrid system. In other words, the hybrid lithium-ion ultracapacitors have more power than lithium-ion batteries, but less energy storage.
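
The trade-offs quoted above can be put side by side. The ~150 Wh/kg lithium-ion baseline is an assumed nominal figure, not from the article:

```python
# Capacity vs. cycle life for the devices compared in the article.
li_ion_wh_per_kg = 150.0                          # assumed nominal baseline
std_ultracap_wh_per_kg = 0.05 * li_ion_wh_per_kg  # "about 5%"
hybrid_wh_per_kg = 2 * std_ultracap_wh_per_kg     # "about twice" standard

li_ion_cycles = 500      # "a few hundred times"
hybrid_cycles = 20_000

capacity_ratio = hybrid_wh_per_kg / li_ion_wh_per_kg
throughput_ratio = (hybrid_wh_per_kg * hybrid_cycles) / (li_ion_wh_per_kg * li_ion_cycles)
print(f"hybrid capacity: {hybrid_wh_per_kg:.0f} Wh/kg ({capacity_ratio:.0%} of li-ion)")
print(f"lifetime energy throughput, hybrid vs li-ion: {throughput_ratio:.1f}x")
```

Per charge the hybrid stores roughly a tenth as much, but over its whole cycle life it can move several times more total energy - which is why it suits frequent shallow cycling such as start-stop systems and regenerative braking.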

In the future, the hybrid lithium-ion ultracapacitor could also be used for regenerative braking in vehicles, especially if it could be scaled up to provide greater energy storage. Since vehicle braking systems need to be recharged hundreds of thousands of times, the hybrid system’s cycle life will also need to be improved.

PhysOrg

TStzmmalaysia
post Jan 12 2011, 09:46 AM



RESEARCH


Metamorphosis key to creating stable walking robots

Virtual robots learning to walk are steadiest on their feet when they start out with no legs and are allowed to evolve limbs over time. As well as helping to design more stable robots, the implication is that creatures whose body plans morph as they grow may have an evolutionary advantage.

Programming robots to walk without falling over is arduous, so Josh Bongard at the University of Vermont in Burlington set out to find the quickest way to evolve walking behaviours. To do this, he ran simulations of several types of robots and gave each the same goal: to seek out a virtual light source and evolve a walking gait to reach it.

Bongard added another twist to the simulation, though. Some of his virtual bots could change their body plan over time, "in much the same way that an initially legless tadpole develops into a legged frog over its lifetime", he says. One slithered along the ground like a snake, but gradually grew four vertical legs. Another began with four legs splayed out horizontally in lizard fashion – its legs gradually shifted into a vertical position beneath its body. A third type of virtual bot had four upright legs to start with and lacked the ability to evolve its body plan.

Each bot used a software routine called a genetic algorithm to evolve a slithering or walking gait that would best get it to the light source given its current body plan. Once each bot had evolved to the point where it could walk upright on four legs and reach the light source within a fixed time period, Bongard ran the algorithms on a real four-legged walking robot to assess the results.
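
The evolutionary loop described here can be sketched as a minimal genetic algorithm. Everything below is a hypothetical stand-in (the fitness function, parameters and gait encoding are illustrative; a real version would score a physics simulation of distance travelled toward the light source):

```python
import random

def fitness(gait):
    # Stand-in score: reward gaits whose joint amplitudes sum near a
    # target value; a real simulator would measure progress to the light.
    return -abs(sum(gait) - 10.0)

def evolve(n_joints=4, pop_size=20, generations=100, seed=0):
    rng = random.Random(seed)
    pop = [[rng.uniform(0, 5) for _ in range(n_joints)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]                   # selection
        children = [[g + rng.gauss(0, 0.2) for g in p]     # mutation
                    for p in survivors]
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
print(f"best gait, fitness {fitness(best):.3f}")
```

Bongard's twist is that the body plan itself changes between generations, so the evolving gait must stay viable across several morphologies rather than overfitting to one.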


He found that the four-legged robot was stable when programmed to walk like any of the virtual bots that had metamorphosed with time. "Metamorphosed robots were able to continue walking even if they were randomly pushed around," he says. However, when the four-legged robot adopted the walking style of a virtual bot with a fixed body plan, it was far more prone to falling over when pushed. Bongard thinks that's because the morphed robots had to remain balanced and on course through many body plans, so the gait they finally adopted had greater stability.

In terms of biology, evolving behaviours like locomotion may be easier if the animal progresses through body plans that allow for gradual learning over time, says Bongard. "This is what human infants do: they progress from crawling to walking gradually, even as the bones in the legs and feet change to accommodate the change in behaviour."

The results are useful for engineers, says Hod Lipson, a roboticist at Cornell University in Ithaca, New York: "We may now need to examine new ways to allow robots to automatically adapt their hardware, not just their software, if we are to achieve higher levels of performance."

Bongard agrees, but thinks Lipson's call to action might be premature. "We need advances in materials to realise robots that can grow new legs," he says.

Even without those advanced materials, though, roboticists have begun to experiment with some of these concepts by manually altering the body plans of their robots. In 2009, Chris MacLeod at Robert Gordon University in Aberdeen, UK, used a similar algorithm to teach a simple two-legged walking robot how to adjust its gait to cope with the addition of an extra pair of legs – a step towards more advanced and adaptive robots, MacLeod thinks.

He is interested by Bongard's "embryological approach" – the way that his simulations mimic the changes seen in some animals between birth and adulthood. "It is certainly worthy of further study," he says.

NewScientist


TStzmmalaysia
post Jan 12 2011, 09:47 AM



RESEARCH


Antimatter caught streaming from thunderstorms on Earth

A space telescope has accidentally spotted thunderstorms on Earth producing beams of antimatter. Such storms have long been known to give rise to fleeting sparks of light called terrestrial gamma-ray flashes. But results from the Fermi telescope show they also give out streams of electrons and their antimatter counterparts, positrons.

The surprise result was presented by researchers at the American Astronomical Society meeting in the US. It deepens a mystery about terrestrial gamma-ray flashes, or TGFs - sparks of light that are estimated to occur 500 times a day in thunderstorms on Earth. They are a complex interplay of light and matter whose origin is poorly understood.

Thunderstorms are known to create tremendously high electric fields - evidenced by lightning strikes. Electrons in storm regions are accelerated by the fields, reaching speeds near that of light and emitting high-energy light rays - gamma rays - as they are deflected by atoms and molecules they encounter. These flashes are intense - for a thousandth of a second, they can produce as many charged particles from one flash as are passing through the entire Earth's atmosphere from all other processes.

Scaling down
The Fermi space telescope is designed to capture gamma rays from all corners of the cosmos, and sports specific detectors for short bursts of gamma rays that both distant objects and TGFs can produce.

"One of the great things about the Gamma-ray Burst Monitor is that it detects flashes of gamma rays all across the cosmic scale," explained Julie McEnery, Fermi project scientist at Nasa.

"We see gamma-ray bursts, one of the most distant phenomena we know about in the Universe, we see bursts from soft gamma-ray repeaters in our galaxy, flashes of gamma rays from solar flares, our solar neighbourhood - and now we're also seeing gamma rays from thunderstorms right here on Earth," she told BBC News.

Since Fermi launched in mid-2008, the Gamma-ray Burst Monitor (GBM) has spotted 130 TGFs, picking up on the gamma rays in low Earth orbit as storms came within its scope.

But within that gamma-ray data lies an even more interesting result described at the meeting by Dr McEnery and her collaborators Michael Briggs of the University of Alabama Huntsville and Joseph Dwyer of the Florida Institute of Technology. "We expected to see TGFs; they had been seen by the GBM's predecessor," Dr McEnery explained.

"But what absolutely intrigues us is the discovery that TGFs produce not just gamma rays but also produce positrons, the antimatter equivalent to electrons." When gamma rays pass near the nuclei of atoms, they can turn their energy into two particles: an electron-positron pair.

Because electrons and positrons are charged, they align along the Earth's magnetic field lines and can travel vast distances, gathered into tightly focused beams of matter and antimatter heading in opposite directions. The dance of light and matter continues when positrons encounter electrons again; they recombine and produce a flash of light of a precise and characteristic colour. It is this colour of light, picked up by Fermi's GBM, that is the giveaway that antimatter has been produced. The magnetic field can transport the particles vast distances before this characteristic flash, and one of the Fermi detections was from a storm that was happening completely beyond the horizon.
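
The numbers behind that "characteristic colour" follow from the electron rest-mass energy; a quick check using CODATA constants:

```python
# Pair production needs a photon (near a nucleus) with at least twice
# the electron rest energy; annihilation returns two photons of exactly
# that energy - the 511 keV line Fermi's GBM identifies.
m_e = 9.1093837015e-31   # electron mass, kg
c = 2.99792458e8         # speed of light, m/s
eV = 1.602176634e-19     # joules per electronvolt

mec2_keV = m_e * c**2 / eV / 1e3
print(f"electron rest energy:      {mec2_keV:.1f} keV")   # ~511.0 keV
print(f"pair-production threshold: {2 * mec2_keV:.1f} keV")
```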

The results will be published in the journal Geophysical Research Letters. Steven Cummer, an atmospheric electricity researcher from Duke University in North Carolina, called the find "truly amazing". "I think this is one of the most exciting discoveries in the geosciences in quite a long time - the idea that any planet has thunderstorms that can create antimatter and then launch it into space in narrow beams that can be detected by orbiting spacecraft to me sounds like something straight out of science fiction," he said.

"It has some very important implications for our understanding of lightning itself. We don't really understand a lot of the detail about how lightning works. It's a little bit premature to say what the implications of this are going to be going forward, but I'm very confident this is an important piece of the puzzle."

BBC News

TStzmmalaysia
post Jan 12 2011, 09:48 AM



RESEARCH


UNC researchers inch closer to unlocking potential of synthetic blood

A team of scientists has created particles that closely mirror some of the key properties of red blood cells, potentially helping pave the way for the development of synthetic blood.

The new discovery – outlined in a study appearing in the online Early Edition of the Proceedings of the National Academy of Sciences during the week of Jan. 10, 2011 – could also lead to more effective treatments for life-threatening medical conditions such as cancer.

University of North Carolina at Chapel Hill researchers used technology known as PRINT (Particle Replication in Non-wetting Templates) to produce very soft hydrogel particles that mimic the size, shape and flexibility of red blood cells, allowing the particles to circulate in the body for extended periods of time.

Tests of the particles' ability to perform functions such as transporting oxygen or carrying therapeutic drugs have not been conducted, and they do not remain in the cardiovascular system as long as real red blood cells.

However, the researchers believe the findings – especially regarding flexibility – are significant because red blood cells naturally deform in order to pass through microscopic pores in organs and narrow blood vessels. Over their 120-day lifespan, real cells gradually become stiffer and eventually are filtered out of circulation when they can no longer deform enough to pass through pores in the spleen. To date, attempts to create effective red blood cell mimics have been limited because the particles tend to be quickly filtered out of circulation due to their inflexibility.

Beyond moving closer to producing fully synthetic blood, the findings could affect approaches to treating cancer. Cancer cells are softer than healthy cells, enabling them to lodge in different places in the body, leading to the disease's spread. Particles loaded with cancer-fighting medicines that can remain in circulation longer may open the door to more aggressive treatment approaches.

"Creating particles for extended circulation in the blood stream has been a significant challenge in the development of drug delivery systems from the beginning," said Joseph DeSimone, Ph.D., the study's co-lead investigator, Chancellor's Eminent Professor of Chemistry in UNC's College of Arts and Sciences, a member of UNC's Lineberger Comprehensive Cancer Center and William R. Kenan Jr. Distinguished Professor of Chemical Engineering at N.C. State University. "Although we will have to consider particle deformability along with other parameters when we study the behavior of particles in the human body, we believe this study represents a real game changer for the future of nanomedicine."

Chad Mirkin, Ph.D., George B. Rathmann Professor of Chemistry at Northwestern University, said the ability to mimic the natural processes of a body for medicinal purposes has been a long-standing but elusive goal for researchers. "These findings are significant, since the ability to reproducibly synthesize micron-scale particles with tunable deformability that can move through the body unrestricted, as red blood cells do, opens the door to a new frontier in treating disease," said Mirkin, who also is a member of President Obama's Council of Advisors on Science and Technology and director of Northwestern's International Institute for Nanotechnology.

UNC researchers designed the hydrogel material for the study to make particles of varying stiffness. Then, using PRINT technology — a technique invented in DeSimone's lab to produce nanoparticles with control over size, shape and chemistry — they created molds, which were filled with the hydrogel solution and processed to produce thousands of red blood cell-like discs, each a mere 6 micrometers in diameter.

The team then tested the particles to determine their ability to circulate in the body without being filtered out by various organs. When tested in mice, the more flexible particles lasted 30 times longer than stiffer ones: the least flexible particles disappeared from circulation with a half-life of 2.88 hours, compared to 93.29 hours for the most flexible ones. Stiffness also influenced where particles eventually ended up: more rigid particles tended to lodge in the lungs, but the more flexible particles did not; instead, they were removed by the spleen, the organ that typically removes old real red blood cells.
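
The reported half-lives (whose ratio, 93.29/2.88, is the "30 times longer" in the text) make the flexibility effect concrete. The circulating fraction follows simple exponential decay:

```python
# Fraction of particles still circulating after t hours, from the
# half-lives reported for the stiffest and most flexible particles.
def fraction_remaining(hours, half_life_h):
    return 0.5 ** (hours / half_life_h)

stiff_t_half = 2.88    # hours, least flexible
soft_t_half = 93.29    # hours, most flexible

for t in (6, 24, 72):
    print(f"after {t:>2} h: stiff {fraction_remaining(t, stiff_t_half):.1%}, "
          f"flexible {fraction_remaining(t, soft_t_half):.1%}")
```

After a single day, almost none of the stiff particles remain while most of the flexible ones are still circulating - exactly the property a long-lived drug carrier would need.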

EurekAlert

TStzmmalaysia
post Jan 13 2011, 11:33 AM



ENERGY


New Small Wind Turbine Unveiled at CES

Southwest Windpower, maker of the Skystream 3.7, unveiled a new version of the popular turbine at CES 2011 called Skystream 600. The turbine features an improved design with larger blades, enhanced software, and an improved integrated inverter. And, according to a press release, Skystream 600 will be the “first fully smart grid-enabled wind turbine” on the market when available in April 2011.

With the improvements, Skystream 600 is estimated to produce about 74% more energy than Skystream 3.7. The small wind turbine can provide an average of 7,400 kWh of energy per year in 12 mph average annual wind speeds.

These numbers are pretty good — about 60% of an average American’s home energy needs — but everything depends on siting, wind conditions, tower height, and several other factors.
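
Those two percentages imply baseline figures the article leaves unstated; a quick sanity check:

```python
# Implied baselines behind the Skystream 600 figures quoted above.
annual_output_kwh = 7_400
home_coverage = 0.60          # "about 60% of an average American's home energy needs"
improvement = 0.74            # "about 74% more energy than Skystream 3.7"

implied_home_use = annual_output_kwh / home_coverage
implied_old_output = annual_output_kwh / (1 + improvement)
print(f"implied average home use:     ~{implied_home_use:,.0f} kWh/year")
print(f"implied Skystream 3.7 output: ~{implied_old_output:,.0f} kWh/year")
```

The implied ~12,300 kWh/year is in line with commonly cited US household averages, so the two claims are at least mutually consistent.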

Skystream 600 comes with the internet-accessible Skyview system showing users how much energy is produced in real time. Southwest Windpower told Jetson Green in an email that the company has not yet decided on a price for the new turbine.

Jetson Green

TStzmmalaysia
post Jan 13 2011, 11:35 AM



NANOTECHNOLOGY


New research shows how light can control electrical properties of graphene

New research published today, shows how light can be used to control the electrical properties of graphene, paving the way for graphene-based optoelectronic devices and highly sensitive sensors.

This year's Nobel Prize for Physics was awarded for research into graphene, recognising its potential for many applications in modern life, from high-speed electronics to touchscreen technology. The UK's National Physical Laboratory, along with a team of international scientists, has further developed our understanding of graphene by showing that when this remarkable material is combined with particular polymers, its electrical properties can be precisely controlled by light and exploited in a new generation of optoelectronic devices. The polymers retain a memory of the light, so the graphene device keeps its modified properties until the memory is erased by heating.

Light-modified graphene chips have already been used at NPL in ultra-precision experiments to measure the quantum of the electrical resistance.

In the future, similar polymers could be used to effectively 'translate' information from their surroundings and influence how graphene behaves. This effect could be exploited to develop robust reliable sensors for smoke, poisonous gases, or any targeted molecule.

Graphene is an extraordinary two-dimensional material made of a single atomic layer of carbon atoms. It is the thinnest material known to man, and yet is one of the strongest ever tested.

Graphene does not have volume, only surface – its entire structure is exposed to its environment, and responds to any molecule that touches it. This makes it in principle a very exciting material for super-sensors capable of detecting single molecules of toxic gases. Polymers can make graphene respond to specific molecules and ignore all others at the same time, which also protects it from contamination.

The research team included scientists from the National Physical Laboratory (UK), Chalmers University of Technology (Sweden), University of Copenhagen (Denmark), University of California Berkeley (USA), Linköping University (Sweden) and Lancaster University (UK).

EurekAlert




TStzmmalaysia
post Jan 13 2011, 11:37 AM



BIOTECHNOLOGY


Embryonic stem cells help deliver 'good genes' in a model of inherited blood disorder

Researchers at Nationwide Children's Hospital report a gene therapy strategy that improves the condition of a mouse model of an inherited blood disorder, Beta Thalassemia. The gene correction involves using unfertilized eggs from afflicted mice to produce a batch of embryonic stem cell lines. Some of these stem cell lines do not inherit the disease gene and can thus be used for transplantation-based treatments of the same mice. Findings could hold promise for a new treatment strategy for autosomal dominant diseases like certain forms of Beta Thalassemia, tuberous sclerosis or Huntington's disease.

Embryonic stem cells have the potential to produce unlimited quantities of any cell type and are therefore being explored as a new therapeutic option for many diseases. Unfertilized eggs can be cultured to form embryonic stem cells, so-called parthenogenetic embryonic stem cells.

"Parthenogenetic embryonic stem cells can differentiate into multiple tissue types as do stem cells from fertilized embryos," said K. John McLaughlin, PhD, principal investigator in the Center for Molecular and Human Genetics at The Research Institute at Nationwide Children's Hospital. Previously, the group demonstrated that blood cells derived from parthenogenetic cells could provide healthy, long-term blood replacement in mice.

"Advantages of parthenogenetic stem cells are not only that fertilization is not needed, but also that the recipient's immune system may potentially not view them as foreign, minimizing rejection problems. Furthermore, since parthenogenetic embryonic stem cells are derived from reproductive cells, which contain only a single set of genetic information instead of the double set present in body cells, they may not contain certain abnormal genes present in the other copy," said Dr. McLaughlin, also one of the study authors.

A single copy of an abnormal gene inherited from one parent can cause so-called autosomal dominant diseases such as tuberous sclerosis or Huntington's disease. The affected person has one defective and one normal copy of the gene, but the abnormal gene overrides the normal gene, causing disease. In normal sexual reproduction, each parent provides one gene copy to offspring via their reproductive cells. Therefore, the reproductive cells of a patient with an autosomal dominant disease could either pass along a defective copy or a normal copy.

"As the donor patient has one defective gene copy and one normal, and only one copy is used for normal reproduction, we can select egg-cell-derived embryonic stem cells with two normal copies," said Dr. McLaughlin. "These single-parent/patient-derived embryonic stem cells can theoretically be used for correction of a diverse number of diseases that occur when one copy of the gene is abnormal," said Dr. McLaughlin.

To test this theory, Dr. McLaughlin and colleagues from the University of Pennsylvania, University of North Carolina and University of Minnesota, examined whether parthenogenetic embryonic stem cells could be used for tissue repair in a mouse model of thalassemia intermedia. Thalassemia intermedia is an inherited blood disorder in which the body lacks sufficient normal hemoglobin, leading to excessive destruction of red blood cells and anemia. They used a mouse model in which one defective gene copy causes anemia.

Using approaches developed in a previous study by this group, Nationwide Children's Research Fellow Sigrid Eckardt, PhD, derived embryonic stem cells from the unfertilized eggs of female mice with the disease, and identified those stem cell lines that contained only the "healthy" hemoglobin genes. These "genetically clean" embryonic stem cell lines were converted into cells that were transplanted into afflicted mice carrying the disease-causing gene. Blood samples drawn five weeks after transplantation revealed that the delivered cells were present in the recipients' blood. The recipients' red blood cells were also restored to a size similar to that of normal mice, and red blood cell count, hematocrit and hemoglobin levels became normal.

"Overall, we observed long-term improvement of thalassemia in this model," said Dr. Eckardt. "Our findings suggest that using reproductive cells to generate embryonic stem cells that are 'disease-free' may be a solution for genetic diseases involving large, complex or poorly identified deletions in the genome or that are not treatable by current gene therapy approaches." Dr. McLaughlin says that this approach also contrasts with typical gene therapy approaches in that it requires no engineering of the genome, which is currently difficult to achieve in human embryonic and embryonic-like (IPS) stem cells.

EurekAlert

TStzmmalaysia
post Jan 13 2011, 11:38 AM



BIOTECHNOLOGY


First strawberry genome sequence promises better berries

DURHAM, N.H. – An international team of researchers, including several from the University of New Hampshire, has completed the first DNA sequence of any strawberry plant, giving breeders much-needed tools to create tastier, healthier strawberries. Tom Davis, professor of biological sciences at UNH, and postdoctoral researcher Bo Liu were significant contributors to the genome sequence of the woodland strawberry, which was published last month in the journal Nature Genetics.

"We now have a resource for everybody who's interested in strawberry genetics. We can answer questions that before would have been impossible to address," says Davis, who has been working on the strawberry genome project since 2006 as part of the international Strawberry Genome Sequencing Consortium.

For instance, says Davis, breeders can now look at the DNA "fingerprint" of strawberry plants to more easily breed those with enhanced flavor, aroma, or antioxidant properties. Or they could breed more disease-resistant berries, decreasing the significant amount of spraying that cultivated strawberries currently need to thrive and thus enhancing the berry's healthful qualities.

Further, the woodland strawberry is a member of the Rosaceae family, which includes apples, peaches, cherries, raspberries, and almonds, all economically important and popular crops; researchers say the DNA sequence of the strawberry genome will inform the breeding of these other fruits. "We can now begin to understand how evolution works at the level of the genome on this family of plants we all enjoy," says Davis.

The genome sequencing effort, led by researchers at the University of Florida and Virginia Tech, found that the woodland strawberry - Fragaria vesca - has 240 million base pairs of DNA (compared to 3 billion for humans), making it one of the smallest genomes of economically significant plants. The consortium focused first on sequencing the wild woodland strawberry because its cultivated cousins, all hybrids, are far more complex.

Building upon prior publications in which he described a one percent genomic sampling of a native New Hampshire wild strawberry, Davis played multiple roles in genome project planning, data interpretation, and manuscript preparation. Liu's unique contribution to this effort was to independently document the locations of specific sequences called ribosomal gene clusters on the chromosomes themselves, using an advanced microscopic technique known as fluorescent in situ hybridization.

EurekAlert


TStzmmalaysia
post Jan 13 2011, 11:41 AM



BIOTECHNOLOGY


Synthetic Genes Sub for Natural Ones in Microbe Experiment

A handful of bacterial genes crucial to survival were successfully replaced by artificial ones in a new synthetic biology experiment.

It's not clear how the synthetic genes rescued the doomed Escherichia coli bacteria, which had had several important sequences of DNA knocked out of their genomes. But scientists think synthetic proteins produced by the new genes replaced the missing natural versions.

“To enable life you need genes and proteins, which are information and machines,” said molecular biologist Michael Hecht of Princeton University, co-author of the study published online in PLoS ONE. “These evolved over a very long period of time, but we wanted to ask, ‘Are they really special, or can we make stuff like them from scratch?’ It seems we can.”

One of synthetic biology’s primary goals is to create customizable organisms able to produce food, fuel or pharmaceuticals, clean toxins from the environment, or even function as computers.

Most synthetic biology experiments, including the J. Craig Venter Institute's recent creation of a synthetic life form, rely on existing genes in nature. In the new experiment, however, Hecht and his team engineered a semi-random library of 1.5 million made-from-scratch genes.

Genes contain instructions for building proteins, which are made of units called amino acids. Proteins can be built from 20 different amino acids, and particular sequences cause a protein to fold into three dimensions.

Each gene in Hecht's synthetic library calls for a protein made of 102 amino acids, yet instead of randomly filling each position, the genes include sequences prompting the proteins to fold into four-helix structures.

“Folding in three dimensions means functionality when it comes to proteins, so the library isn’t completely randomized,” Hecht said. “You might think of it as a targeted shotgun of randomness instead of a bomb of it.”
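
A "targeted shotgun" library can be sketched as random sequences constrained by a fixed pattern. The pattern and residue sets below are purely illustrative, not the study's actual design:

```python
import random

# Semi-random library: each position is random, but a fixed
# polar (P) / nonpolar (N) pattern biases every 102-residue
# protein toward folding as a helical bundle.
POLAR = "DEKNQRHS"      # illustrative polar residue set
NONPOLAR = "AFILMV"     # illustrative nonpolar residue set

def patterned_protein(pattern, rng):
    return "".join(rng.choice(POLAR if p == "P" else NONPOLAR)
                   for p in pattern)

rng = random.Random(1)
pattern = ("PNPPNNP" * 15)[:102]   # repeat a 7-residue motif to length 102
library = [patterned_protein(pattern, rng) for _ in range(5)]
for seq in library:
    print(seq[:30], "...")
```

Constraining only the polar/nonpolar pattern still leaves an astronomical number of distinct sequences - easily enough to fill a 1.5-million-member library - while steering each one toward a folded structure.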

Twenty-seven strains of E. coli, each with one missing gene critical for survival, individually mingled with the synthetic gene library. Four strains of bacteria incorporated a synthetic gene into their DNA and grew on Petri dishes containing only the bare nutrients for survival. Without the new genes, the four strains didn’t grow at all.

Taking the rescue of doomed microbes further, Hecht’s team mixed the library with a strain of E. coli missing all four genes. It, too, was saved by synthetic genes in the library.

“I think this is a very interesting start to some more research,” said biotechnologist Andrew Ellington of the University of Texas at Austin, who was not involved with the research. “I’d like to see more proof that these proteins are doing what [Hecht] says they’re doing … There may be some weird things going on.”

Hecht said “it would be nice” to untangle the biochemistry of his genetic rescues, adding that the synthetic genes weren’t exactly optimum replacements for nature’s versions chiseled over billions of years of evolution. But he said that’s not the biggest takeaway from the experiment.

“We know which specific genetic sequences rescue the strains, even if we don’t yet know how they work,” Hecht said.

In addition to following up on the biochemistry of the genes that revived the bacteria, Hecht’s laboratory plans to engineer more complex libraries and knock out even more crucial genes.

“‘How far can you go with this?’ is what we want to know. Could you knock out 100 genes and rescue all of them? Eventually a whole genome?” Hecht said.

Wired

TStzmmalaysia
post Jan 13 2011, 11:42 AM

ENERGY


An alternative and cleaner power supply for ships

A new application of the Molten Carbonate Fuel Cell (MCFC) has been developed by the European-funded MC WAP research project for eventual use as an alternative power supply for ships. This will be cleaner, avoiding the pollution produced by the marine diesel engines that currently power the vast majority of the world's ships.

Research into molten carbonate fuel cells has its origins as early as the 1930s, when Emil Baur and H. Preis in Switzerland experimented with high-temperature, solid oxide electrolytes. In its initial stages the research applied to both molten carbonate and solid oxide fuel cells, which are both high-temperature devices, and the technical histories of the two cells follow a similar line of research until a divergence in the late 1950s. From then on, the Molten Carbonate Fuel Cell (MCFC) developed from a purely experimental prototype into today’s practical demonstrator.

Molten carbonate fuel cells demand very high operating temperatures (600°C and above), so most applications for this kind of cell are limited to large, stationary power plants, where the waste heat can be used for industrial processing or to drive steam turbines that generate additional electricity.

The MC WAP project has developed a molten carbonate fuel cell which uses hydrogen obtained from a system that converts diesel oil into a hydrogen-rich gas, and air coming from the compressor of a microturbine. The reaction produces electricity and heat, without combustion.

To operate the MCFC on board a ship, researchers of the MC-WAP project have developed two major elements: the Fuel Processor Module and the Fuel Cells Module. The Fuel Cells Module is a chemical plant: it is fed from one side by compressed air and from the other by syngas, a hydrogen-rich gas produced from diesel by the Fuel Processor Module. This gas is currently being tested in Germany, at the University of Freiberg. The chemical reaction between air and syngas then generates electricity.

The current system produces about 250 kilowatts, a single unit of reserve power that can run the essential systems on board, such as the control systems, communication, lighting and main auxiliary systems. Although at this stage it will not power the propulsion, it will be able to contribute to it in some cases.
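As a rough plausibility check on the 250-kilowatt figure, the hydrogen consumption can be estimated from Faraday's law. The per-cell operating voltage below is an assumed typical MCFC value, not a number reported by the MC WAP project:

```python
# Back-of-envelope check on the 250 kW figure via Faraday's law.
F = 96485.0           # Faraday constant, C per mol of electrons
power_w = 250e3       # electrical output, W
v_cell = 0.75         # assumed operating voltage per cell, V
electrons_per_h2 = 2  # anode: H2 + CO3^2- -> H2O + CO2 + 2e-

h2_mol_per_s = power_w / (v_cell * electrons_per_h2 * F)
print(f"{h2_mol_per_s:.2f} mol H2/s")  # 1.73 mol H2/s
```

So the Fuel Processor Module would need to deliver on the order of a couple of moles of hydrogen per second under these assumptions.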

No combustion means fewer greenhouse gas emissions from the many tourist and cargo ships that carry the millions of people and goods around the coasts of Europe and the world. The cleaner ship exhausts are better for the environment and will help the operators to meet the new green legislation.

Cleaning the exhausts involves removing the traces of sulphur and carbon dioxide that remain after normal combustion, resulting in clean exhaust gases. The system releases virtually no harmful substances: the fuel is transformed into synthetic gas which is then used in the fuel cell, without creating pollution. Furthermore, the lack of moving parts in the MCFC will reduce the overall ship vibrations, which will result in a more comfortable journey for the passengers.

PhysOrg


TStzmmalaysia
post Jan 13 2011, 11:44 AM

BIOTECHNOLOGY


Bell Book & Candle Restaurant in NYC Features Aeroponic Rooftop Garden

Bell, Book & Candle restaurant in the West Village is using its roof to grow sixty percent of its ingredients via energy-efficient aeroponic growing towers. Guided by Chef John Mooney’s locavore beliefs, the restaurant's initiatives transcend the typical local farm-to-table model. By growing vertically, Mooney is able to control what goes into each vegetable, with a contained water system and no soil – meaning no additives or pesticides are necessary.

Furthermore, since Bell, Book & Candle is self-supplying, fruits and veggies are picked at their peak and used immediately, rather than being stored for long periods of time. A carbon-neutral pulley brings each day’s harvest down into the kitchen to be turned into mouth-watering dishes such as Roasted Heirloom Pumpkin Soup, Rooftop Mixed Green Salad, and Gin & Tonic Wild Salmon with caramelized cauliflower. Any produce that isn’t sourced from above is instead purchased from a cold-weather greenhouse in nearby Lancaster, PA. But it really doesn’t get any fresher than rooftop-to-table!

Inhabitat

TStzmmalaysia
post Jan 14 2011, 09:32 AM

APPLIED SCIENCES


First electronic focusing eyewear

We have previously reported on the development of prototype adaptive focus glasses at the University of Arizona (UA) that were able to switch focus electronically. Unlike manually adjustable focus glasses, such as TruFocals, that place a flexible liquid lens between two rigid lenses, the lenses of the prototype glasses consisted of a layer of liquid crystals sandwiched between two pieces of glass. By applying an electric charge, the orientation of the liquid crystals – and therefore the optical path length through the lens – was able to be changed, resulting in glasses that changed focus electronically. This technology is now on its way to consumers with PixelOptics showing its emPower! glasses at CES 2011.
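The optical path change the UA prototype relies on can be put into rough numbers. The cell thickness and birefringence below are assumptions typical of nematic liquid crystal cells, not PixelOptics specifications:

```python
# Illustrative numbers for the electrically switched lens: reorienting
# the liquid crystals changes the refractive index, and hence the
# optical path length, through the cell.
d = 5e-6             # liquid crystal layer thickness, m (assumed)
delta_n = 0.2        # index change between the two LC orientations
wavelength = 550e-9  # green light, m

opd = delta_n * d               # optical path difference, m
phase_waves = opd / wavelength  # the same shift, in wavelengths
print(f"OPD = {opd * 1e6:.1f} um = {phase_waves:.1f} waves")
```

Even a few-micron cell can retard the light by more than a full wavelength, which is what lets a thin, flat layer act as a switchable lens element.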

Relying on liquid crystals, the glasses, which PixelOptics will bring to market under the brand name emPower!, are able to switch focus in the blink of an eye and with no moving parts – unless you count the reorientation of the liquid crystals. Being electronically activated also allows for a neat feature: the wearer can manually trigger the change of focus by touching the arm of the emPower! glasses, and, thanks to an accelerometer embedded in the arm, a swipe can set the glasses to change focus automatically whenever the wearer looks down to read.



Being electronic also means batteries. The battery embedded in the glasses can be recharged in around two hours using an inductive charger and is good for two to three days, depending on usage patterns.

Calling them the “world’s first electronic focusing eyewear,” PixelOptics' emPower! glasses are based on the technology originally developed at UA, which licensed three patents to Johnson & Johnson, who then sold the patent licenses to PixelOptics to commercialize the technology. That commercialization is set to happen some time this year when PixelOptics plans to launch its emPower! glasses in around 36 different styles.

GizMag


TStzmmalaysia
post Jan 14 2011, 09:37 AM

NANOTECHNOLOGY


Fast, easy way to make hydrogen nanosensors found by scientists

A team of Northern Illinois University scientists, with a major role played by NIU Ph.D. students, has discovered a new, convenient and inexpensive way to make high performance hydrogen sensors using palladium nanowires.

The technology could help enable a scale-up for potential industrial applications, such as safety monitors in future hydrogen-powered vehicles.

Highly flammable hydrogen gas cannot be odorized like natural gas. The new technology produces nanoscale sensors that respond extremely fast and would allow safety valves to close before dangerous concentrations of the gas are reached.

Scientists have known that palladium nanowires show promise as hydrogen gas sensors, offering speed, sensitivity and ultra-low power consumption. But the use of single palladium nanowires faced challenges in several areas, including nanofabrication.

“We report on hydrogen sensors that take advantage of single palladium nanowires in high speed and sensitivity and that can be easily and cheaply made,” said lead author Xiaoqiao (Vivian) Zeng, a Ph.D. student in chemistry and biochemistry at NIU. The new research is published in the January edition of the American Chemical Society's prestigious journal Nano Letters.

“The new types of hydrogen sensors are based on networks of ultra-small (< 10 nanometers) palladium nanowires achieved by sputter-depositing palladium onto the surface of a commercially available and inexpensive filtration membrane,” Zeng said.

The research was conducted at both Northern Illinois University and Argonne National Laboratory. The scientists also found that the speed of the sensors increases with decreasing thicknesses of the palladium nanowires. The sensors are 10 to 100 times faster than their counterparts made of a continuous palladium film of the same thickness.
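The reported trend of speed increasing as thickness decreases is what simple diffusion physics predicts: the time for hydrogen to penetrate the metal scales with the square of the thickness. A hedged sketch with illustrative dimensions (not the paper's measurements):

```python
# Why thinner palladium responds faster: hydrogen must diffuse into the
# metal, and diffusion time scales as thickness squared (t ~ L^2 / D).
# Dimensions below are illustrative, not from the Nano Letters study.
def response_time_ratio(thick_nm, thin_nm):
    return (thick_nm / thin_nm) ** 2

print(response_time_ratio(30, 10))   # 9.0   (10 nm wire vs 30 nm layer)
print(response_time_ratio(100, 10))  # 100.0 (vs a 100 nm layer)
```

The 10x to 100x speedup over continuous films of the same thickness reported in the study involves the nanowire geometry as well, but the square-law scaling shows why sub-10-nanometer structures are so attractive.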

“The superior performance of the ultra-small palladium nanowire network-based sensors demonstrates the novelty of the fabrication approach, which can be used to fabricate high-performance sensors for other gases,” said NIU Presidential Research Professor of Physics Zhili Xiao, leader of the research team and co-adviser to Zeng.

Xiao noted that Zeng’s exceptional contribution to the research is particularly impressive for a Ph.D. candidate. Zeng came to NIU in the fall of 2008 after earning her master’s degree from the University of Science and Technology Beijing. She is now a recipient of the NIU Nanoscience Fellowship, jointly supported by the university and Argonne.

“It is extremely competitive to publish an article in Nano Letters, which has a very high impact factor that is better even than the traditionally prestigious chemical and physical journals,” Xiao said. “We’re proud of Vivian’s achievements and grateful for her creativity and diligence.

“Nanoresearch is truly interdisciplinary,” Xiao added. “Chemists have undoubtedly demonstrated advantages in nanofabrication by utilizing methods of chemical synthesis to obtain extreme nanostructures, while physicists have strengths in exploration of new physical properties at the nanoscale. This research benefitted tremendously from Vivian’s expertise in chemistry. In fact, the substrates used to form the novel networks of palladium nanowires are common filtration membranes known to chemists.”

PhysOrg


TStzmmalaysia
post Jan 14 2011, 09:39 AM

SPACE SCIENCE


The 123,000 MPH Plasma Engine That Could Finally Take Astronauts To Mars

You might expect to find our brightest hope for sending astronauts to other planets in Houston, at NASA’s Johnson Space Center, inside a high-security multibillion-dollar facility. But it’s actually a few miles down the street, in a large warehouse behind a strip mall. This bland and uninviting building is the private aerospace start-up Ad Astra Rocket Company, and inside, founder Franklin Chang Díaz is building a rocket engine that’s faster and more powerful than anything NASA has ever flown before. Speed, Chang Díaz believes, is the key to getting to Mars alive. In fact, he tells me as we peer into a three-story test chamber, his engine will one day travel not just to the Red Planet, but to Jupiter and beyond.

I look skeptical, and Chang Díaz smiles politely. He’s used to this reaction. He has been developing the concept of a plasma rocket since 1973, when he became a doctoral student at the Massachusetts Institute of Technology. His idea was this: Rocket fuel is a heavy and inefficient propellant. So instead he imagined building a spaceship engine that uses nuclear reactors to heat plasma to two million degrees. Magnetic fields would eject the hot gas out of the back of the engine. His calculations showed that a spaceship using such an engine could reach 123,000 miles per hour—New York to Los Angeles in about a minute.
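The article's arithmetic checks out, using an approximate New York to Los Angeles distance:

```python
# Quick check of the quoted comparison: 123,000 mph over roughly the
# 2,450 miles between New York and Los Angeles (distance approximate).
speed_mph = 123_000
distance_miles = 2_450
minutes = distance_miles / speed_mph * 60
print(f"{minutes:.1f} minutes")  # 1.2 minutes
```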

Chang Díaz has spent nearly his entire career laboring to convince anyone who would listen that his idea will work, but that career has also taken several turns in the process. One day in 1980, he was pitching the unlimited potential of plasma rockets to yet another MIT professor. The professor listened patiently. “It sounds like borderline science fiction, I know,” Chang Díaz was saying. Then the telephone rang. The professor held up a finger. “Why, yes, he’s right here,” the surprised engineer said into the receiver, then handed it over. “Franklin, it’s for you.” NASA was on the line. The standout student from Costa Rica had been selected to become an astronaut, the first naturalized American ever chosen for NASA’s most elite corps. “I was so excited, I was practically dancing,” Chang Díaz recalls. “I almost accidentally strangled my professor with the telephone cord.”

All astronauts have big dreams, but Franklin Chang Díaz’s dreams are huge. As a college student, as a 25-year astronaut and as an entrepreneur, his single animating intention has always been to build—and fly—a rocketship to Mars. “Of course I wanted to be an astronaut, and of course I want to be able to fly in this,” he says of his plasma-thrust rocket. “I mean, I just can’t imagine not flying in a rocket I would build.” And now he’s close. In four years Chang Díaz will deploy his technology for the first time in space, when his company, aided by up to $100 million in private funding, plans to test a small rocket on the International Space Station. If this rocket, most commonly known by its loose acronym, Vasimr, for Variable Specific Impulse Magnetoplasma Rocket, proves itself worthy, he has an aggressive timetable for constructing increasingly bigger plasma-thrust space vehicles.

Chang Díaz describes his dreams in relatively practical terms. He doesn’t intend to go straight to Mars. First he will develop rockets that perform the more quotidian aspects of space maintenance needed by private companies and by the government: fixing, repositioning, or reboosting wayward satellites; clearing out the ever-growing whirl of “space junk” up there; fetching the stuff that can be salvaged. “Absolutely, fine, I’m not too proud to say it. We’re basically running a trucking business here,” he says. “We’ll be sort of a Triple-A tow truck in space. We’re happy to be a local garbage collector in space. That’s a reliable, sustainable, affordable business, and that’s how you grow.”

Eventually, though, Chang Díaz intends to build more than an extraterrestrial trucking business, and his ambitions happen to coincide with Barack Obama’s call for a privatized space industry that supports exploration well beyond the moon. “We’ll start by sending astronauts to an asteroid for the first time in history,” Obama said in a major NASA-related address earlier this year at Kennedy Space Center. “By the mid-2030s, I believe we can send humans to orbit Mars and return them safely to Earth.”

Such a belief may seem overly ambitious, but the goals of aviation have always seemed that way. In October 1903, for instance, astronomer Simon Newcomb, the founding president of the American Astronomical Society, spelled out a series of reasons why the concept of powered flight was dubious. “May not our mechanicians,” he asked, “be ultimately forced to admit that aerial flight is one of the great class of problems with which man can never cope, and give up all attempts to grapple with it?” Less than two months later, the Wright brothers flew at Kitty Hawk. And in the 1920s a young man named Frank Whittle was coming up with drawings for a theoretical engine very different from the propeller-driven kind, one that might scoop in air through turbines and fire it through a series of “jet” nozzles. “Very interesting, Whittle, my boy,” said one of his professors of aeronautical engineering at the University of Cambridge. “But it will never work.”

More on this: PopSci


TStzmmalaysia
post Jan 14 2011, 09:42 AM

BIOTECHNOLOGY


'Liquid Pistons' Could Drive New Advances in Camera Lenses and Drug Delivery

A few unassuming drops of liquid locked in a very precise game of "follow the leader" could one day be found in mobile phone cameras, medical imaging equipment, implantable drug delivery devices, and even implantable eye lenses.

Engineering researchers at Rensselaer Polytechnic Institute have developed liquid pistons, in which oscillating droplets of ferrofluid precisely displace a surrounding liquid. The pulsating motion of the ferrofluid droplets, which are saturated with metal nanoparticles, can be used to pump small volumes of liquid. The study also demonstrated how droplets can function as liquid lenses that constantly move, bringing objects into and out of focus.

These liquid pistons are highly tunable, scalable, and -- because they lack any solid moving parts -- suffer no wear and tear. The research team, led by Rensselaer Professor Amir H. Hirsa, is confident this new discovery can be exploited to create a host of new devices ranging from micro displacement pumps and liquid switches, to adaptive lenses and advanced drug delivery systems.

"It is possible to make mechanical pumps that are small enough for use in lab-on-a-chip applications, but it's a very complex, expensive proposition," said Hirsa, a professor in the Department of Mechanical, Aerospace, and Nuclear Engineering at Rensselaer. "Our electromagnetic liquid pistons present a new strategy for tackling the challenge of microscale liquid pumping. Additionally, we have shown how these pistons are well-suited for chip-level, fast-acting adaptive liquid lenses."

Hirsa's team developed a liquid piston comprising two ferrofluid droplets situated on a substrate about the size of a piece of chewing gum. The substrate has two holes in it, each hosting one of the droplets. The entire device is situated in a chamber filled with water.

Pulses from an electromagnet provoke one of the ferrofluid droplets, the driver, to vibrate back and forth. This vibration, in turn, prompts a combination of magnetic, capillary, and inertial forces that cause the second droplet to vibrate in an inverted pattern. The two droplets create a piston, resonating back and forth with great speed and a spring-like force. Researchers can finely control the strength and speed of these vibrations by exposing the driver ferrofluid to different magnetic fields.
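The resonating droplet pair behaves like a driven, damped oscillator whose response peaks when the drive frequency matches the natural frequency set by the "spring" (here, capillary forces). A minimal sketch with invented parameters, not measured droplet properties:

```python
import math

# Steady-state response of a driven, damped oscillator: the amplitude
# peaks when the drive frequency nears sqrt(k/m). All values invented.
def amplitude(omega, m=1.0, k=100.0, c=0.5, force=1.0):
    """Amplitude of m*x'' + c*x' + k*x = force*cos(omega*t)."""
    return force / math.sqrt((k - m * omega**2) ** 2 + (c * omega) ** 2)

omega_n = math.sqrt(100.0 / 1.0)  # natural frequency sqrt(k/m) = 10 rad/s
peak = max(range(1, 200), key=lambda w: amplitude(w / 10))
print(peak / 10, omega_n)  # 10.0 10.0 -- response peaks at resonance
```

Tuning the electromagnet's pulse rate to this resonance is what gives the droplets their "great speed and spring-like force".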

In this way, the droplets become a liquid resonator, capable of moving the surrounding liquid back and forth from one chamber to another. Similarly, the liquid piston can also function as a pump: as a droplet moves, the shift in volume displaces an equal volume of the surrounding liquid from the chamber. Hirsa said he can envision the liquid piston integrated into an implantable device that very accurately releases tiny, timed doses of drugs into the body of a patient.

As the droplets vibrate, their shape is always changing. By passing light through these droplets, the device is transformed into a miniature camera lens. As the droplets move back and forth, the lens automatically changes its focal length, eliminating the usual chore of manually focusing a camera on a specific object. The images are captured electronically, so software can be used to edit out any unfocused frames, leaving the user with a stream of clear, focused video.
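The step of editing out unfocused frames can be sketched as scoring each frame by local contrast and keeping only the sharp ones. The frames and threshold below are invented 1-D toy data; a real implementation would score 2-D image gradients:

```python
# Toy focus filter: a sharp frame has high-variance neighbor-to-neighbor
# differences; a blurred frame's differences are small and uniform.
def sharpness(frame):
    diffs = [b - a for a, b in zip(frame, frame[1:])]
    mean = sum(diffs) / len(diffs)
    return sum((d - mean) ** 2 for d in diffs) / len(diffs)

frames = {
    "focused": [0, 10, 0, 10, 0, 10],  # high contrast
    "blurred": [4, 6, 5, 5, 4, 6],     # low contrast
}
keep = [name for name, f in frames.items() if sharpness(f) > 10.0]
print(keep)  # ['focused']
```

Because the lens sweeps through focus continuously, software only needs to retain the frames whose score clears the threshold to deliver steady, focused video.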

The frame rate of video captured through these liquid lenses has surpassed 30 hertz, roughly the quality of a typical computer web cam. Liquid lenses could mean lighter camera lenses that require only a fraction of the energy demanded by today's digital cameras. Along with handheld and other electronic devices, and homeland security applications, Hirsa said this technology could even hold the key to replacement eye lenses that can be fine-tuned using only high-powered magnets.

"There's really a lot we can do with these liquid pistons. It's an exciting new technology with great potential, and we're looking forward to moving the project even further along," he said.



ScienceDaily


TStzmmalaysia
post Jan 14 2011, 09:45 AM

RESEARCH


New Laboratory Aims to Revolutionize Surgery With Real-Time Metabolic Profiling

Metabolic profiling of tissue samples could transform the way surgeons make decisions in the operating theatre, say researchers at a new laboratory being launched. Scientists at Imperial College London, in partnership with clinicians at Imperial College Healthcare NHS Trust, have installed a high resolution solid state nuclear magnetic resonance (NMR) spectrometer in St Mary's Hospital. Researchers will use the machine to analyse intact tissue samples from patients taking part in studies, to investigate whether it can ultimately give surgeons detailed diagnostic information while their patients are under the knife.

The Surgical Metabonomics Laboratory will be led by the surgical innovator Professor Lord Ara Darzi and Professor Jeremy Nicholson, a leading researcher in biomolecular medicine and Head of the Department of Surgery and Cancer.

The science of metabonomics, which involves comprehensively measuring the metabolic changes in a person's body, has been pioneered by the Imperial team over the last 20 years. Techniques from analytical chemistry, such as NMR spectroscopy and mass spectrometry, can allow researchers to measure simultaneously all of the chemicals produced by the body's metabolism. With knowledge of which molecules correspond to which conditions in the body, this "metabolic fingerprint" can provide a wealth of information about the state of a person's health.
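One simple way such a "metabolic fingerprint" could be turned into a diagnosis is nearest-centroid classification: assign an unknown sample to the condition whose average profile it most resembles. The profiles and labels below are invented for illustration and are not Imperial's actual method:

```python
# Toy nearest-centroid classifier over metabolite-intensity vectors.
def distance(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def centroid(samples):
    return [sum(col) / len(samples) for col in zip(*samples)]

profiles = {  # invented training data: 3 metabolite intensities each
    "healthy":  [[1.0, 0.2, 0.1], [0.9, 0.3, 0.2]],
    "infected": [[0.2, 1.1, 0.9], [0.3, 0.9, 1.0]],
}
centroids = {k: centroid(v) for k, v in profiles.items()}

unknown = [0.25, 1.0, 0.95]
label = min(centroids, key=lambda k: distance(unknown, centroids[k]))
print(label)  # infected
```

Real pipelines measure hundreds of metabolites and use far more sophisticated multivariate models, but the principle of matching a whole profile rather than a single marker is the same.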

Metabonomics has previously been applied to samples of bodily fluids such as blood and urine to look for indicators of disease or of how a person might respond to a particular drug. Now the Imperial team have acquired an NMR machine -- the first to be installed in a hospital setting -- that will analyse solid tissue samples from patients undergoing surgery with Imperial College Healthcare.

The research projects are funded by Imperial's Comprehensive Biomedical Research Centre. Imperial's is one of five Comprehensive Biomedical Research Centres in the UK; it was awarded to Imperial College Healthcare NHS Trust by the National Institute for Health Research following a national competition. The new laboratory forms part of the Academic Health Science Centre, a unique partnership between the Trust and Imperial College London, which aims to improve the quality of life of patients and populations by taking new discoveries and translating them into new therapies as quickly as possible.

Professor Darzi, Chairman of the Institute of Global Health Innovation at Imperial College London and an Honorary Consultant Surgeon with Imperial College Healthcare NHS Trust, said: "People respond differently to the physical trauma of surgery, but currently the tools we have to measure how they respond are very limited. Blood tests are slow and they can only measure one chemical component at a time; the doctor simply looks at whether a particular measure has gone up or down. Using NMR, we can simultaneously measure all of the chemicals that the body is producing, and analyse those data to give the surgeon real-time information about the patient's condition which will help him make decisions."

Surgeons will be able to take tissue samples and have them loaded straight into the NMR machine without the need to prepare them. The research team think it will be possible to give the surgeon a readily interpretable readout from the analysis within 20 minutes, which would provide information such as whether the tissue is infected or how good its blood supply is. Surgeons might also use the technology to determine exactly which areas of tissue are cancerous.

One project that the team will undertake at the new laboratory is to develop an "intelligent knife." Surgeons commonly use a technique called electrocautery in operations to seal blood vessels by burning them with a hot iron. By sucking up the smoke produced in this procedure into a mass spectrometer, researchers believe they will be able to tell the surgeon whether the tissue they are burning is healthy, cancerous or infected.

Professor Nicholson, Head of the Department of Surgery and Cancer at Imperial College London, said: "This is a radical change of approach that doesn't just apply to surgery. We want to be able to provide a metabolic map of the entire patient journey. Before surgery, metabonomics could tell the doctor how risky surgery might be for that patient, or how best to prepare him for surgery. After the operation, metabonomics might help the doctor to monitor the patient's recovery and prescribe the most suitable drugs or diet. Ultimately we hope to apply this approach to every area of medicine.

"It's no small task. The analytical chemistry and mathematical modelling involved are challenging, and not everything we try will work. But we hope that within two to three years, we'll have robust evidence that metabolic profiling can be a really useful tool in surgery."

Dr James Kinross, a Clinical Lecturer in the Division of Surgery at Imperial College London, said: "People have been talking about personalised medicine for many years now, but so far there have been few meaningful steps towards delivering on that promise. Genome sequencing is currently quite slow and expensive, and it can only tell you so much. Metabonomics takes into account not only what genes somebody has, but also all of the environmental factors that influence their biology, such as their diet, what drugs they're taking, and what bacteria they have in their body.

"Because of the world class expertise we have here and the close links between surgeons and biomolecular scientists, Imperial is uniquely placed to be able to make major advances in this field. Almost no other institution is in a position to take on the challenges involved."

To help realise the vision of the new centre to enhance surgical safety and patient care, Imperial has partnered with two of the world's leading spectroscopic instrument manufacturers, Bruker BioSpin and the Waters Corporation, who will help to develop, optimise and implement NMR and mass spectrometric technologies for real time diagnostics and prognostic modelling.

"By combining bioinformatics and surgical expertise with advanced mass spectrometry technology, Imperial College London is setting a powerful vision for innovative new techniques in the operating room," said Rohit Khanna PhD, Vice President of Worldwide Marketing for Waters. "At Waters, our success is based upon the ability and imagination of scientists to apply advances in analytical technology to solve their most difficult challenges. Bringing metabolic profiling to the surgical suite is a great example of how a disruptive innovation can potentially improve patient care with a radical new approach. On behalf of all Waters employees, we congratulate Imperial on the launch of the Surgical Metabonomics Laboratory. We look forward to working together on tomorrow's innovations."

ScienceDaily

TStzmmalaysia
post Jan 14 2011, 09:46 AM

RESEARCH


New nanoparticles mimic properties of red blood cells


A team of scientists has created particles that closely mirror some of the key properties of red blood cells, potentially helping pave the way for the development of synthetic blood.

The new discovery – outlined in a study appearing in the online Early Edition of the Proceedings of the National Academy of Sciences during the week of Jan. 10, 2011 – also could lead to more effective treatments for life-threatening medical conditions such as cancer.

University of North Carolina at Chapel Hill researchers used technology known as PRINT (Particle Replication in Non-wetting Templates) to produce very soft hydrogel particles that mimic the size, shape and flexibility of red blood cells, allowing the particles to circulate in the body for extended periods of time.

Tests of the particles' ability to perform functions such as transporting oxygen or carrying therapeutic drugs have not been conducted, and they do not remain in the cardiovascular system as long as real red blood cells.

However, the researchers believe the findings – especially regarding flexibility – are significant because red blood cells naturally deform in order to pass through microscopic pores in organs and narrow blood vessels. Over their 120-day lifespan, real cells gradually become stiffer and eventually are filtered out of circulation when they can no longer deform enough to pass through pores in the spleen. To date, attempts to create effective red blood cell mimics have been limited because the particles tend to be quickly filtered out of circulation due to their inflexibility.

Beyond moving closer to producing fully synthetic blood, the findings could affect approaches to treating cancer. Cancer cells are softer than healthy cells, enabling them to lodge in different places in the body, leading to the disease's spread. Particles loaded with cancer-fighting medicines that can remain in circulation longer may open the door to more aggressive treatment approaches.

"Creating particles for extended circulation in the blood stream has been a significant challenge in the development of drug delivery systems from the beginning," said Joseph DeSimone, Ph.D., the study's co-lead investigator, Chancellor's Eminent Professor of Chemistry in UNC's College of Arts and Sciences, a member of UNC's Lineberger Comprehensive Cancer Center and William R. Kenan Jr. Distinguished Professor of Chemical Engineering at N.C. State University. "Although we will have to consider particle deformability along with other parameters when we study the behavior of particles in the human body, we believe this study represents a real game changer for the future of nanomedicine."

Chad Mirkin, Ph.D., George B. Rathmann Professor of Chemistry at Northwestern University, said the ability to mimic the natural processes of a body for medicinal purposes has been a long-standing but evasive goal for researchers. "These findings are significant since the ability to reproducibly synthesize micron-scale particles with tunable deformability that can move through the body unrestricted as do red blood cells, opens the door to a new frontier in treating disease," said Mirkin, who also is a member of President Obama's Council of Advisors on Science and Technology and director of Northwestern's International Institute for Nanotechnology.

UNC researchers designed the hydrogel material for the study to make particles of varying stiffness. Then, using PRINT technology — a technique invented in DeSimone's lab to produce nanoparticles with control over size, shape and chemistry — they created molds, which were filled with the hydrogel solution and processed to produce thousands of red blood cell-like discs, each a mere 6 micrometers in diameter.

The team then tested the particles to determine their ability to circulate in the body without being filtered out by various organs. When tested in mice, the more flexible particles lasted 30 times longer than stiffer ones: the least flexible particles disappeared from circulation with a half-life of 2.88 hours, compared to 93.29 hours for the most flexible ones. Stiffness also influenced where particles eventually ended up: more rigid particles tended to lodge in the lungs, but the more flexible particles did not; instead, they were removed by the spleen, the organ that typically removes old red blood cells.
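As a sanity check on those half-lives, a quick calculation makes the difference vivid. This sketch assumes simple first-order (exponential) clearance, which the article itself does not state; it only reports half-lives.

```python
def fraction_remaining(t_hours, half_life_hours):
    """Fraction of particles still in circulation after t hours,
    assuming simple first-order (exponential) clearance."""
    return 0.5 ** (t_hours / half_life_hours)

# Stiffest vs. most flexible particles, one day after injection:
print(fraction_remaining(24, 2.88))   # ~0.003 -> about 0.3% still circulating
print(fraction_remaining(24, 93.29))  # ~0.84  -> about 84% still circulating
```

Under that assumption, nearly all of the stiff particles are gone within a day while the vast majority of the flexible ones are still at work.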

Nano


TStzmmalaysia
post Jan 14 2011, 09:48 AM



RESEARCH


Spaser - groundbreaking nano-laser for medicine and electronics

Lasers have revolutionized the communications and medical industries. They focus light to zap tumors and send digital TV signals and telephone communications around the world.
But the physical length of an ordinary laser cannot be less than one half of the wavelength of its light, which limits its application in many industries. Now the Spaser, a new invention developed in part by Tel Aviv University, can be as small as needed to fuel nanotechnologies of the future.

Prof. David Bergman of Tel Aviv University's Department of Physics and Astronomy developed and patented the theory behind the Spaser device in 2003 with Prof. Mark Stockman of Georgia State University in Atlanta. It is now being developed into a practical tool by research teams in the United States and around the world.
"Spaser" is an acronym for "surface plasmon amplification by stimulated emission of radiation" — and despite its mouthfilling definition, it's a number one buzzword in the nanotechnologies industry. The Spaser has been presented at recent meetings and symposia around the world, including a recent European Optical Society Annual Meeting.

Spasers are considered a critical component for future technologies based on nanophotonics: technologies that could lead to radical innovations in medicine and science, such as sensors and microscopes 10 times more powerful than anything used today. A Spaser-based microscope might be so sensitive that it could see genetic base pairs in DNA.
It could also lead to computers and electronics that operate at speeds 100 times greater than today's devices, using light instead of electrons to communicate and compute. More efficient solar energy collectors in renewable energy are another proposed application.

"It rhymes with laser, but our Spaser is different," says Prof. Bergman, who owns the Spaser patent with his American partner. "Based on pure physics, it's like a laser, but much, much, much smaller." The Spaser uses surface plasma waves, whose wavelength can be much smaller than that of the light it produces. That's why a Spaser can be less than 100 nanometers, or one-tenth of a micron, long. This is much less than the wavelength of visible light, explains Prof. Bergman.
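To put those sizes in perspective, here is a quick comparison. The wavelength values are standard figures for visible light, not numbers from the article:

```python
# Ordinary lasers cannot be shorter than half the wavelength of the
# light they produce; the Spaser sidesteps that limit entirely.
visible_nm = {"violet": 400, "green": 530, "red": 700}

for color, wavelength in visible_nm.items():
    min_laser_nm = wavelength / 2   # half-wavelength lower bound for a laser
    print(f"{color}: smallest laser ~{min_laser_nm:.0f} nm vs Spaser < 100 nm")
```

Even at the violet end of the spectrum, a conventional laser is at least twice the size of a sub-100 nm Spaser.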
Fuelling the buzz

In the next year, the research team expects even more buzz to be created around their invention. In 2009, a team from Norfolk State University, Purdue University, and Cornell University managed to create a practical prototype.
The Spaser will extend the range of what's possible in modern electronics and optical devices, well beyond today's computer chips and memories, Prof. Bergman believes. The physical limitations of current materials are overcome in the Spaser because it uses plasmons, and not photons. With the development of surface plasma waves — electromagnetic waves combined with an electron fluid wave in a metal — future nano-devices will operate photonic circuitry on the surface of a metal. But a source of those waves will be needed. That's where the Spaser comes in.

Smaller than the wavelength of light, nano-sized plasmonic devices will be fast and small. Currently the research team is working on commercializing their invention, which they suggest could represent a quantum leap in the development of nano-sized devices.

Nanowerk



TStzmmalaysia
post Jan 14 2011, 09:49 AM



RESEARCH


Polymer with amazing self-healing properties

Sooner or later, a cut to the skin or a broken bone will heal on its own; however, a scratch to a car's paint or a tear in the wing of an airplane will not. Materials with self-healing properties could help extend the durability of products and make repairs easier.
Krzysztof Matyjaszewski and his co-workers at Carnegie Mellon University (Pittsburgh, USA) and Kyushu University (Japan) have now developed a polymer that can repair itself when irradiated with UV light -- over and over again. As the scientists report in the journal Angewandte Chemie, this is the first material in which capped covalent bonds repeatedly reattach, even allowing fully separated pieces to be fused back together.

Some previous solid self-healing materials contain tiny capsules that tear open to release a chemical agent when the material is damaged, and they have been able to repair themselves only once. Other materials, including some gels, can repair themselves repeatedly but lack the covalent bonds that give materials strength and stability.

In contrast, the new polymeric material produced by the American and Japanese team is stable and repairs itself again and again. The secret to their success is that the polymer is cross-linked through trithiocarbonate units. These are carbon atoms bonded to three sulfur atoms, two of which use their second bonding position to attach to another carbon atom. These groups have a special property: they can restructure under UV light. The light breaks one carbon–sulfur bond in the trithiocarbonate groups. This produces two radicals -- molecules with a free, unpaired electron. The radicals are very reactive and attack other trithiocarbonate groups to form new carbon–sulfur bonds while breaking others to form more free radicals. The chain reaction stops when two radicals react with each other.
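The chain reaction above lends itself to a toy simulation. This sketch is purely illustrative: it assumes each UV-broken bond yields two radicals (as the article describes) but uses a made-up chain-transfer probability, and it is in no way a chemical model from the paper.

```python
import random

def heal_cycle(bonds_split=50, p_transfer=0.9, rng=None):
    """Toy model of the UV-driven trithiocarbonate exchange: each broken
    C-S bond yields two radicals; a radical either attacks another
    trithiocarbonate group (reforming one bond and passing the radical
    on) or meets a second radical and terminates the chain."""
    rng = rng or random.Random(0)    # fixed seed: reproducible toy run
    radicals = 2 * bonds_split
    bonds_reformed = 0
    while radicals >= 2:
        if rng.random() < p_transfer:
            bonds_reformed += 1      # chain transfer: a new C-S bond forms
        else:
            radicals -= 2            # two radicals recombine: chain stops
    return bonds_reformed

print(heal_cycle())
```

The point of the toy model is that a modest number of UV-broken bonds can drive many bond-reorganization events before all the radical chains terminate, which is what lets cut edges knit back together.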

The researchers were able to heal cut polymer fragments with irradiation—either immersed in liquid or in bulk. They only had to firmly press the cut edges together and irradiate them. The edges grew back together by means of the radical re-organization process described above.

The self-healing effect goes much further: even shredded polymer samples could simply be pressed together and irradiated to be fused into a continuous piece. The resulting object was in the shape of the cylindrical tube in which the procedure was carried out. This self-healing process can be carried out repeatedly on the same sample. The material is thus also interesting as a new recyclable product.

PhysOrg

TStzmmalaysia
post Jan 14 2011, 09:53 AM



RESEARCH


Hydrocarbon Breakthrough Made Using Gold Catalyst

Hydrocarbons are an extremely important energy resource but, although widely available from fossil fuels, are extremely difficult to activate and require very high temperatures in current industrial processes.
For the first time, the Cardiff study has shown that the primary carbon-hydrogen bonds in toluene, a hydrocarbon widely used as an industrial material, can be activated selectively at low temperatures.

Professor Graham Hutchings FRS, one of the study's co-authors and Cardiff University's Pro Vice-Chancellor for Research, said: "One of the key challenges facing chemists today is to activate primary carbon-hydrogen bonds in hydrocarbons to make more valuable and reactive molecules. This is crucial for the sustainable exploitation of available industrial feedstocks.
"Our research resulted in unprecedented yields of a single product of over 90%. We achieved this using a gold catalyst, an unexpected result as gold is the most noble of the elements."

This opens up the possibility of using hydrocarbon feedstocks in a new way to form intermediates and final products for use in the chemical, pharmaceutical, and agricultural business sectors.
The research was carried out by a large team at the Cardiff Catalysis Institute, in collaboration with researchers at Lehigh University, Pennsylvania. It was funded by a major research grant won by Cardiff University's School of Chemistry in 2008, when it was selected from hundreds of international bids as part of the Dow Methane Challenge.
The challenge was initiated by the Dow Chemical Company to identify collaborators and approaches in the area of methane conversion to chemicals.

ScienceDaily


TStzmmalaysia
post Jan 15 2011, 11:50 PM



ENERGY


Sun-Tracker Skylights Pump Daylight Indoors

Daylighting is now considered one of the cornerstones of sustainable building, but it has been a challenge to implement it in large buildings — especially retrofits. Sun Tracker skylights are an intriguing solution that can introduce daylight into the huge offices, retail box stores, and warehouses that line our cities. The skylight works by tracking the sun with a set of mirrors that redirect light into the interior. The design significantly reduces the need for artificial lighting and heat compared to traditional skylights and creates a much happier work environment. Read on to see how they work.

The Sun Tracker turned a lot of heads when we first saw it at the GreenBuild 2007 show. Now that it has a few years under its belt the Sun Tracker is proving its worth in installations ranging from industrial and retail buildings to offices. It is basically a mini, fully-autonomous sun tracking motor which uses GPS to align itself and a solar electric panel for power. The double-layered dome is set high off the curb, making the incorporated mirror especially adept at catching early morning and afternoon sun while helping to deflect intense daytime heat.

Each unit replaces the equivalent of 800 watts of fluorescent lighting, and when hooked up to a light sensor saves not only two-thirds of the lighting power but also a third of the energy-intensive cooling load. This was proven out at a huge installation of Sun Trackers at the Patagonia warehouse in Reno, Nevada. Hard numbers such as energy savings and simple payback are important, but so are better working conditions in daylit spaces, which can improve productivity and even retail sales. The Sun Tracker had some fairly high failure rates early on, but the units come with a ten-year warranty and, hopefully, an improved record of reliability.
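A rough back-of-envelope of what those figures imply for one unit. The operating hours and electricity price are assumptions, not numbers from the article, and the cooling fraction is taken relative to the same 800 W for simplicity:

```python
# Rough annual savings for a single Sun Tracker unit.
lamp_watts = 800             # fluorescent lighting each unit replaces
saved_fraction = 2/3 + 1/3   # 2/3 of lighting energy, plus 1/3 more in
                             # cooling, both relative to the 800 W load
hours_per_year = 3000        # assumed daylight operating hours
price_per_kwh = 0.12         # assumed electricity price, $/kWh

kwh_saved = lamp_watts / 1000 * hours_per_year * saved_fraction
print(f"~{kwh_saved:.0f} kWh and ~${kwh_saved * price_per_kwh:.0f} per year")
```

Multiply that by the hundreds of units in a warehouse-scale installation and the payback case becomes easy to see.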

Inhabitat

TStzmmalaysia
post Jan 15 2011, 11:53 PM



ROBOTICS


Winter is No Match for the ROBOPLOW!

Are the winter months getting you down? Are piles of snow trapping you in your home? Are you in need of a hero? Fear not, gentle reader, for technology will provide you with one. Four guys calling themselves ideaLaboratories have produced a kickass robot tough enough to punch winter in the face and laugh while doing it. The ROBOPLOW (caps are a necessity) is a six-wheeled snow-devastating machine with a 50-inch pneumatic blade and 660 amps of plowing power.

Equipped with 10 watt LED lamps and an onboard camera, the ROBOPLOW lets you remotely clean up the harsh winter wonderland while staying safe and warm in your own home, driving the bot via a computer. The following video of ROBOPLOW in action isn’t so much a demonstration of its capabilities as it is a warning to snowflakes everywhere. I’m not sure what I enjoy more about this clip: the way that ROBOPLOW goes Die Hard on snow-covered sidewalks, or the over-the-top awe it inspires in the child watching. That’s right, kid, anyone can shovel snow, but it takes a real badass to design a robot to do it for him.

Look, most of the time I’ll tell you about a robot because of its potential to reshape human-machine interactions or because it does something humans can’t do. ROBOPLOW is not that kind of bot. It’s a snow-killer, plain and simple, and that’s enough for me. Thankfully I’ve moved to warmer climates, but I still remember trudging through New York and Boston winter streets and cursing the demon in a parka that is Jack Frost. If I had a machine like ROBOPLOW I would have ridden it through the iced-over sidewalks cackling maniacally like Slim Pickens in Dr. Strangelove.



For those who dream of buying their own ROBOPLOW, I’m sorry to say that it looks like this bot is a one-shot deal. The video of its plowing prowess was released back in February, and we haven’t seen any new information out of ideaLaboratories since. That’s disappointing, but ROBOPLOW does give hope to winter-ravaged robot enthusiasts everywhere. When the ice hits the fan we will be able to watch the robopocalypse and snowpocalypse duke it out on the front lawn while huddling in our garages for safety.

SingularityHub


TStzmmalaysia
post Jan 15 2011, 11:55 PM



ROBOTICS


Robots to Replace People for Patrolling Power Lines

Robots are becoming a more important tool for keeping our infrastructure working properly, from scurrying through underground water pipes in search of leaks to swinging along power lines to inspect their condition. The Electric Power Research Institute, a non-profit utility consortium, has come up with a robot that can patrol power lines in remote places like forests and deserts, where sending people out can be tedious and expensive.

The New York Times reports that the institute has already devised a prototype that can crawl a few miles each day over transmission line shield wires -- the metal wires that draw lightning strikes away from the power lines -- looking for problems with sensors that detect electrical disturbances, a lidar that shows whether there is enough distance between the power lines and the ground or trees, an infrared sensor for finding hotspots, and a camera for checking tower structural integrity and other wear and tear.

Powering the prototype is fairly easy: it simply taps the energy of the high-voltage lines it travels near, without even having to touch them. The robot, dubbed Ti for Transmission Inspection, will help take utility company trucks off the roads and helicopters out of the air, and help prevent power failures. Soon utilities will be able to send robots to areas with a suspected problem and likely pinpoint where and what the issue is before a human crew could.

TreeHugger


TStzmmalaysia
post Jan 15 2011, 11:57 PM



BIOTECHNOLOGY


Stanford researcher uses living cells to create 'biotic' video games

Video game designers are always striving to make games more lifelike, but they'll have a hard time topping what Stanford researcher Ingmar Riedel-Kruse is up to. He's introducing life itself into games.

Riedel-Kruse and his lab group have developed the first video games in which a player's actions influence the behavior of living microorganisms in real time – while the game is being played.

These "biotic games" involve a variety of basic biological processes and some simple single-celled organisms (such as paramecia) in combination with biotechnology.

The goal is for players to have fun interacting with biological processes, without dealing with the rigor of conducting a formal experiment, said Riedel-Kruse, an assistant professor of bioengineering.

"We hope that by playing games involving biology of a scale too small to see with the naked eye, people will realize how amazing these processes are and they'll get curious and want to know more," he said.

"The applications we can envision so far are on the one hand educational, for people to learn about biology, but we are also thinking perhaps we could have people running real experiments as they play these games.

"That is something to figure out for the future, what are good research problems which a lay person could really be involved in and make substantial contributions. This approach is often referred to as crowd-sourcing."

Applying their lab equipment and knowledge to game development, Riedel-Kruse's group came up with eight games falling broadly into three classes, depending on whether players directly interact with biological processes on the scale of molecules, single cells or colonies of single cells.

The results of their design efforts are presented in a paper published in the 10th anniversary issue of Lab on a Chip (the first issue of 2011), published by the Royal Society of Chemistry. The paper is available online now.

Initially, Riedel-Kruse said, the researchers just wanted to see whether they could design such biotic games at all, so this first round of development produced fairly simple games.

"We tried to mimic some classic video games," he said. For example, one game in which players guide paramecia to "gobble up" little balls, a la Pac-Man, was christened PAC-mecium. Then there is Biotic Pinball, POND PONG and Ciliaball. The latter game is named for the tiny hairs, called cilia, that paramecia use in a flipper-like fashion to swim around – and in the game enables the kicking of a virtual soccer ball.

The basic design of the games involving paramecia – the single-celled organisms used in countless biology experiments from grade school classes to university research labs – consists of a small fluid chamber within which the paramecia can roam freely. A camera sends live images to a video screen, with the "game board" superimposed on the image of the paramecia. A microprocessor tracks the movements of the paramecia and keeps score.

The player attempts to control the paramecia using a controller that is much like a typical video game controller. In some games, such as PAC-mecium, the player controls the polarity of a mild electrical field applied across the fluid chamber, which influences the direction the paramecia move. In Biotic Pinball, the player injects occasional whiffs of a chemical into the fluid, causing the paramecia to swim one direction or another.
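The game loop described above (camera tracking, field-polarity input, scoring) can be sketched in a few lines. Everything here, from the class names to the speed and "gobble" radius, is hypothetical and illustrative; it is not code from the Stanford system.

```python
from dataclasses import dataclass

@dataclass
class Paramecium:
    x: float
    y: float

def step(cells, field_polarity, balls, speed=1.0, reach=1.0):
    """One frame of a PAC-mecium-style game: cells drift along the
    applied electric field (galvanotaxis, simplified here to pure
    horizontal motion) and any ball within reach is gobbled for a point."""
    score = 0
    for cell in cells:
        cell.x += field_polarity * speed      # +1 pulls right, -1 pulls left
        for ball in [b for b in balls
                     if abs(b[0] - cell.x) < reach and abs(b[1] - cell.y) < reach]:
            balls.remove(ball)
            score += 1
    return score

# Steer one paramecium rightward until it reaches the ball at x = 3:
cells, balls = [Paramecium(0.0, 0.0)], [(3.0, 0.0)]
total = sum(step(cells, +1, balls) for _ in range(4))
print(total)  # 1 point, ball gobbled on the third frame
```

In the real games the "cells" positions come from camera-based tracking of live paramecia rather than simulated motion, which is exactly what makes them biotic.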

Riedel-Kruse emphasized that paramecia, being single-celled organisms, lack a brain and the capacity to feel pain. "We are talking about microbiology with these games, very primitive life forms. We do not use any higher-level organisms," he said. "Since multiple test players raised the question of exactly where one should draw this line, these games could be a good tool to stimulate discussions in schools on bioethical issues."

The game on the molecular level involves a common laboratory technique called polymerase chain reaction, or PCR, an automated process that lets researchers make millions of copies of an organism's DNA in as little as two hours.

In this game, called PolymerRace, the player is linked to the output of a PCR machine that is running different reactions simultaneously. While the reactions are running, the players can bet on which reactions will be run the fastest.

"The game PolymerRace is inspired by horse races, where you have different jockeys riding different horses," Riedel-Kruse said. "There is a little bit of biomolecular logic involved and a little bit of chance."

The third game uses colonies of yeast cells that players have to distinguish based on their bread-vinegar-like smell – an olfactory stimulus anyone can experience just by walking into a bakery.

"The idea is that while we as humans play the game, we interact with real biological processes or material," he said. His research group thinks that aspect of the games could help motivate children and even adults to learn more about biology, which is increasingly important to society.

"We would argue that modern biotechnology will influence our life at an accelerating pace, most prominently in the personal biomedical choices that we will be faced with more and more often," Riedel-Kruse said. "Therefore everyone should have sufficient knowledge about the basics of biomedicine and biotechnology. Biotic games could promote that."

Riedel-Kruse wants to maximize the educational potential of these games and enable lay people to contribute to biomedical research. He hopes that by publishing his group's initial efforts, other researchers in the life sciences will be prompted to explore how their own research could be adapted to "biotic" video games.

Other researchers have developed biologically relevant Internet-based video games such as Fold-It, which lets players try different approaches to folding proteins, and EteRNA, developed in a collaboration between Stanford and Carnegie Mellon University, which lets players propose new molecular structures for ribonucleic acids (RNA).

Fold-It and EteRNA were developed to address specific research questions. Fold-It was strictly a simulation; and although EteRNA will actually test some proposed structures in the laboratory, the players themselves do not have direct interaction with biological processes in real time as in Riedel-Kruse's biotic games.

EurekAlert

TStzmmalaysia
post Jan 16 2011, 12:00 AM



TRANSPORTATION


Japanese carmakers in push for hydrogen vehicles

Nissan Motors' X-Trail Fuel Cell Vehicle, seen here in 2006.

Toyota, Honda and Nissan, along with 10 Japanese energy groups including natural gas refiners and distributors, are aiming to build 100 hydrogen filling stations by 2015 in Tokyo, Nagoya, Osaka and Fukuoka, the companies said in a statement Thursday.

The automakers are making a renewed push behind Fuel Cell Vehicles (FCVs), which convert hydrogen into electricity and emit nothing more harmful than water vapour.

The companies say that the creation of a hydrogen supply infrastructure network is crucial as manufacturers work to reduce the production cost of hydrogen-powered vehicles in order to make them commercially viable.

"Japanese automakers are continuing to drastically reduce the cost of manufacturing such systems and are aiming to launch FCVs in the Japanese market -- mainly in the country's four major metropolitan areas -- in 2015," they said.

"With an aim to significantly reduce the amount of CO2 emitted by the transportation sector, automakers and hydrogen fuel suppliers will work together to expand the introduction of FCVs and develop the hydrogen supply network throughout Japan."

Water drains from the tailpipe of a hydrogen fuel cell-powered vehicle. Toyota, Honda and Nissan have united with Japanese energy firms in a push to commercialise greener hydrogen fuel cell cars and build a network of fuelling stations.

The companies did not say how much they planned to invest in the project.
While all-electric vehicles such as Nissan's Leaf or hybrids like Toyota's Prius have hogged the limelight recently, fuel cells are seen as a more powerful alternative, but expensive production and a lack of a comprehensive fuelling network has been seen as prohibitive.

Toyota, pioneer of hybrids powered by a petrol engine and an electric motor, has said it plans to launch a fuel-cell car by 2015. It is applying its hybrid technology to the vehicles, swapping the petrol engine for a fuel-cell stack.

Honda in 2008 began delivering about 200 FCX Clarity hydrogen-powered cars on lease to customers in the United States, Japan and later in Europe.

PhysOrg

TStzmmalaysia
post Jan 16 2011, 12:02 AM



APPLIED SCIENCES


World's lightest solid material, known as 'frozen smoke', gets even lighter

Researchers have created a new aerogel that boasts amazing strength and an incredibly large surface area. Nicknamed ‘frozen smoke’ for their translucent appearance, aerogels are manufactured materials derived from a gel in which the liquid component has been replaced with a gas, making them the world’s lightest solid materials. The new so-called “multiwalled carbon nanotube (MWCNT) aerogel” could be used in sensors to detect pollutants and toxic substances, chemical reactors, and electronics components.

Although aerogels have been fabricated from silica, metal oxides, polymers, and carbon-based materials and are already used in thermal insulation in windows and buildings, tennis racquets, sponges to clean up oil spills, and other products, few scientists have succeeded in making aerogels from carbon nanotubes.

The researchers succeeded where so many before them had failed by using a wet gel of well-dispersed pristine MWCNTs. After removing the liquid component from the MWCNT wet gel, they created the lightest-ever free-standing MWCNT aerogel monolith, with a density of 4 mg/cm3.

MWCNT aerogels infused with a plastic material are flexible, like a spring that can be stretched thousands of times, and if the nanotubes in a one-ounce cube were unraveled and placed side-to-side and end-to-end, they would carpet three football fields. The MWCNT aerogels are also excellent conductors of electricity, which is what makes them ideal for sensing applications and offers great potential for their use in electronics components.
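For a sense of just how light 4 mg/cm3 is, compare it with air and water. The reference densities below are standard textbook values, not figures from the article:

```python
# How light is 4 mg/cm^3?  Compare a one-litre block of the MWCNT
# aerogel with air and water.
aerogel_mg_cm3 = 4.0
air_mg_cm3 = 1.2        # air at roughly room conditions
water_mg_cm3 = 1000.0

litre_cm3 = 1000
print(f"1 L of aerogel weighs {aerogel_mg_cm3 * litre_cm3 / 1000:.0f} g")  # 4 g
print(f"only {aerogel_mg_cm3 / air_mg_cm3:.1f}x the density of air")       # ~3.3x
print(f"{water_mg_cm3 / aerogel_mg_cm3:.0f}x lighter than water")          # 250x
```

A one-litre block of this material weighs about as much as a sheet of paper and a half, yet still conducts electricity.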

A report describing the process for making MWCNT aerogels and tests to determine their properties appears in ACS Nano.
TStzmmalaysia
post Jan 16 2011, 02:14 PM



APPLIED SCIENCES


Red Cross Uses Solar-Powered Pumps to Increase Water Access in Sudan

However the referendum in South Sudan turns out, one thing will not go away quickly: the lack of water in the region. The International Committee of the Red Cross, however, is at work on a project that will mitigate that problem in at least one town and, if successful, will hopefully be replicated in other regions: solar-powered water pumps.

Akobo is in southeast Sudan, near the Ethiopian border, and is at least a temporary home to thousands who fled violence in their own regions in 2009. More than 55,000 people there don't have enough water—meaning they are often living with less than two liters of clean water a day.

ICRC explains the project:

A powerful pump extracts water from tens of metres below the ground and transfers it to elevated tanks. Under the effect of gravity, the water then flows from the tanks through pipes to public water distribution points in town. Those pumps need electricity, and the supporting structures for a total of 420 solar panels are now in place, with the components of the system currently en route from Germany to Akobo. We expect the project to be completed in the first quarter of 2011.
Once functional, ICRC says, the project will be able to supply 10 liters of water a day for people, as well as provide water for a school, hospital, several new administrative buildings, and other distribution points used by both people and livestock.
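A back-of-envelope estimate of the pumping power involved helps explain why 420 solar panels are needed. This sketch assumes the 10-litre figure is per person per day, and the lift height, pumping window, and pump efficiency are assumed values, none of which come from the ICRC:

```python
# Rough electrical pumping power for the Akobo system.
people = 55_000
litres_per_person_day = 10
lift_m = 50          # assumed: "tens of metres" down plus elevated-tank height
pump_hours = 6       # assumed peak-sun pumping window per day
efficiency = 0.5     # assumed wire-to-water pump efficiency

flow_m3_s = people * litres_per_person_day / 1000 / (pump_hours * 3600)
hydraulic_w = 1000 * 9.81 * flow_m3_s * lift_m    # rho * g * Q * h
electrical_kw = hydraulic_w / efficiency / 1000
print(f"~{electrical_kw:.0f} kW of electrical pumping power")  # ~25 kW
```

Tens of kilowatts sustained over the sunniest hours of the day is exactly the scale at which a few hundred solar panels become the natural power source.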

Importantly, the project will also be training the water authorities to use the solar pumping system. The ICRC says it's had positive experiences with similar projects in Eritrea, and is considering expanding the technology into Kenya and elsewhere in Africa.

This certainly isn't the first project to take advantage of solar power to access water in arid regions, but every one brings more promise to the people who will benefit. Hopefully this won't be the last.

Treehugger

TStzmmalaysia
post Jan 17 2011, 10:43 AM



NANOTECHNOLOGY


Transparent conductive material could lead to power-generating windows

Scientists at the U.S. Department of Energy's (DOE) Brookhaven National Laboratory and Los Alamos National Laboratory have fabricated transparent thin films capable of absorbing light and generating electric charge over a relatively large area. The material, described in the journal Chemistry of Materials, could be used to develop transparent solar panels or even windows that absorb solar energy to generate electricity.

The material consists of a semiconducting polymer doped with carbon-rich fullerenes. Under carefully controlled conditions, the material self-assembles to form a reproducible pattern of micron-size hexagon-shaped cells over a relatively large area (up to several millimeters).
"Though such honeycomb-patterned thin films have previously been made using conventional polymers like polystyrene, this is the first report of such a material that blends semiconductors and fullerenes to absorb light and efficiently generate charge and charge separation," said lead scientist Mircea Cotlet, a physical chemist at Brookhaven's Center for Functional Nanomaterials.
Furthermore, the material remains largely transparent because the polymer chains pack densely only at the edges of the hexagons, while remaining loosely packed and spread very thin across the centers. "The densely packed edges strongly absorb light and may also facilitate conducting electricity," Cotlet explained, "while the centers do not absorb much light and are relatively transparent."
"Combining these traits and achieving large-scale patterning could enable a wide range of practical applications, such as energy-generating solar windows, transparent solar panels, and new kinds of optical displays," said co-author Zhihua Xu, a materials scientist at the CFN.
"Imagine a house with windows made of this kind of material, which, combined with a solar roof, would cut its electricity costs significantly. This is pretty exciting," Cotlet said.
The scientists fabricated the honeycomb thin films by creating a flow of micrometer-size water droplets across a thin layer of the polymer/fullerene blend solution. The water droplets self-assembled into large arrays within the polymer solution; as the solvent evaporated completely, the polymer formed a hexagonal honeycomb pattern over a large area.
"This is a cost-effective method, with potential to be scaled up from the laboratory to industrial-scale production," Xu said.
The scientists verified the uniformity of the honeycomb structure with various scanning probe and electron microscopy techniques, and tested the optical properties and charge generation at various parts of the honeycomb structure (edges, centers, and nodes where individual cells connect) using time-resolved confocal fluorescence microscopy.
The scientists also found that the degree of polymer packing was determined by the rate of solvent evaporation, which in turn determines the rate of charge transport through the material.
"The slower the solvent evaporates, the more tightly packed the polymer, and the better the charge transport," Cotlet said.
"Our work provides a deeper understanding of the optical properties of the honeycomb structure. The next step will be to use these honeycomb thin films to fabricate transparent and flexible organic solar cells and other devices," he said.
More information: Structural dynamics and charge transfer via complexation with fullerene in large area conjugated polymer honeycomb thin films: http://pubs.acs.org/doi/full/10.1021/cm102160m

PhysOrg


TStzmmalaysia
post Jan 17 2011, 10:44 AM



APPLIED SCIENCES


Electricity-Generating T-Shirts Could Someday Power Your iPhone

Adios wall warts, hello wearable power supplies. University of Texas at Dallas scientists are spinning yarn out of powder-infused carbon nanotubes in the hopes of creating textiles that can power the latest iFad. The nanotubes give superconducting particles, such as boron and magnesium powder, a more manageable form without binders or lasers. Their goal: To weave this energy-transmitting yarn into lightweight batteries you can wear.

Powdered materials like boron and magnesium play a vital role in battery electrodes, superconducting wires, and even catalysts in fuel cells, but they are difficult to work with without complicated processes to bind their shape. By “growing” a web of nanotubes and then spraying it with the powder, any finely ground material can be turned into a “sewable, knittable, knotable, braidable yarn,” says Ray Baughman, director of the university’s Alan G. MacDiarmid NanoTech Institute. The powder, which can account for 95 to 99 percent of the yarn’s weight, is trapped inside the twists of the nanotube web. “When you wash it, almost all the powder is retained,” he adds.

The researchers at the University of Texas aren’t the only ones working on conductive textiles. Researchers at Rice University are making carbon-nanotube fibers that are very dense and conductive. These fibers may someday be used in low-loss electrical-transmission cables or in super-strong structural materials. Another lab, at Stanford University, is developing textile-based energy-storage devices. Yi Cui, an associate professor of materials science and engineering there, believes that wearable batteries could one day power our gadgets.

The concept of wearable batteries that look like regular clothing could be a major game-changer. Designers worldwide will be clamoring to incorporate energy-storing textiles into their collections. No need to search for a power outlet—charging your device could be as simple as plugging it into your T-shirt.

Ecouterre


TStzmmalaysia
post Jan 17 2011, 10:45 AM



ROBOTICS

Attached Image

Autonomous Quadrotor Teams May Build Your Next House

UPenn’s GRASP Lab taught their quadrotors to work together to grasp and move things. The next step, it seems, is teaching the quadrotors to work together to grasp and move things and actually build buildings. The video above shows a team of quadrotors cooperating to construct the framework of a (rather small) building. The building’s structure is held together with magnets, and the quadrotors are able to verify that the alignment is correct by attempting to wiggle the structural components around, which is pretty cool.

It’s fun to speculate about how this technology might grow out of the lab into the real world… To build actual buildings, you’d either need much bigger quadrotors (which is possible), lots of small quadrotors cooperating on big pieces (also possible), or buildings built out of much smaller components (which might be the way to go). The quadrotors probably wouldn’t be able to do all the work, but they have the potential to make construction projects significantly more efficient.



Each quadrotor weighs 500 grams and can deliver some 1250 grams of thrust, making its individual payload capacity somewhere around half a kilogram. This means that a couple together could lift a kilogram, and you can do the math from there, but there are lots of reasons why you might want a bunch of extra robots cooperating on the lift, which gets back to why swarm robotics has so much potential in the first place. For example, having extra bots protects against mechanical failure of an individual bot. It also protects against complications like wind. Or maybe whatever you’re lifting has a long distance to go or needs to be in the air for a while, and the bots can switch off to go recharge themselves.
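The lift figures above invite a quick back-of-the-envelope check. A minimal sketch, using the article's 500 g mass and 1250 g thrust numbers; the 25 percent control-margin reserve and the team sizes are my own illustrative assumptions, not figures from the GRASP Lab:

```python
# Rough payload arithmetic for a quadrotor team.
CRAFT_MASS_G = 500   # mass of one quadrotor (from the article)
THRUST_G = 1250      # thrust of one quadrotor (from the article)

def team_payload(n_craft, control_margin=0.25):
    """Payload (grams) a team of n quadrotors can carry together.

    Each craft must first lift its own weight; a fraction of the
    remaining thrust (control_margin, an assumed value) is held in
    reserve for maneuvering and gusts.
    """
    spare = THRUST_G - CRAFT_MASS_G          # 750 g spare thrust per craft
    usable = spare * (1 - control_margin)    # keep some thrust in reserve
    return n_craft * usable

for n in (1, 2, 4):
    print(n, "quadrotors can carry about", team_payload(n), "g")
```

With the margin included, one craft carries a bit over half a kilogram, which lines up with the article's "somewhere around half a kilogram" estimate; capacity then scales linearly with team size.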

It’s interesting to compare these cooperative quadrotors with that distributed flight array from ETH Zurich that we wrote about last month. It’s a different approach, certainly, but the premise is similar, and it’ll be lots of fun to see how each of these projects evolves.

Botjunkie


TStzmmalaysia
post Jan 18 2011, 11:10 AM



BIOTECHNOLOGY

Attached Image

Unnatural Genes Used to Replace Missing DNA Keep Cells Alive

Synthetic biology garnered national headlines in May 2010 when a team led by J. Craig Venter announced it had created the world’s first “synthetic cell.” The group used computers to copy an entire bacterial genome that, when inserted into a cell whose own genome had been removed, “booted up” the cell, which then passed the synthesized genome to its offspring.

This accomplishment was no small feat but the new genome, although man-made, was almost entirely a replication of one that already existed in nature. Now, a new study published January 4 in PLoS One has shown that DNA sequences designed in the laboratory and distinct from any found in nature can, when inserted into cells missing genes necessary for survival, "rescue" some of those cells.

They were not random sequences, explains Michael Hecht, a professor of chemistry at Princeton University who led the research. Instead they were intentionally patterned to code for amino acid arrangements, which can fold into relatively crude three-dimensional protein structures that are distinct from any natural proteins. In the past three decades scientists have refined methods for designing entirely new proteins from scratch, and they have shown that some can even catalyze reactions. "Since proteins are basically molecular machines that work in cells," Hecht says, the next logical question was: "Can you get one that you design from scratch to work in a cell?"

To find the answer, Hecht and his colleagues employed 27 Escherichia coli strains, each of which lacked a gene that, given the nutrient-poor medium on which they were growing, should have left them unable to survive. The researchers then introduced to the cells more than one million synthetic DNA sequences, each known to code for a previously designed protein. Explains Hecht: "If we give them an opportunity to pick up one of our genes, and if that gene allows them to survive under these selective conditions, then that cell will form a colony where all of its neighbors have died."

Sure enough, after several days of incubation, four separate experimental strains formed colonies, whereas cells in a control group all perished. To assure the surviving cells had done so because they had incorporated a novel gene, and that survival was not the result of adaptive mutations on the original chromosomes, the researchers purified the DNA from these new colonies and inserted it into new cells with the same original gene deletion. "We transferred it over and over again to make sure that the phenotype—survival—transferred with the genotype, which was the piece of DNA that we put in," Hecht explains. "With something as shocking as this, you don’t believe it until you’ve done a lot of controls."

How did this happen? That is still not exactly clear, Hecht says. Whereas the new proteins could have replaced the catalytic activity of the missing ones, they could also have sustained the cells through an entirely separate, still unknown mechanism. Hecht adds that experiments aimed at elucidating the mechanisms are underway and are "really, really important."

Benjamin Davis, a professor of organic chemistry at the University of Oxford in England who was not part of the research group but studies artificial proteins in his own lab, says Hecht and his team performed a "very clean experiment," adding that they "are very clear that there are lots of unanswered questions about this. But they are great unanswered questions."

Specifically, even though certain processes have evolved in nature, they are not necessarily the only processes that can work, Davis explains. "There are so many possibilities that nature has not surveyed."

As the authors of the new study point out, the DNA and amino acid sequences that have emerged in nature represent only a "minuscule fraction of the theoretical sequence space that is possible for genes and proteins."

"Evolution does not work in a way that tries to map out everything—it can't. It's not possible," Davis says. Therefore, if Hecht's group's new proteins are not replacing the exact catalytic activity that was deleted, "there could be umpteen different ways they are working," he says, and investigations into those possibilities "are exactly the types of experiments we should be doing in synthetic biology."

In the meantime, does the new result bring scientists any closer to creating artificial life? Only very slightly, if at all, Hecht says. "If you imagine a toolbox that sustains life, we've replaced a few screwdrivers. The question with artificial genomes is: Could you sustain life with an entirely new toolbox? And we are nowhere near that point."

Scientific American


TStzmmalaysia
post Jan 18 2011, 11:12 AM



ENERGY

Attached Image

Asia’s First Tidal Power Plant Coming to India

Wind and solar power may get the most attention in the realm of green energy, but tidal power is slowly edging its way in: Scotland, Korea, and even New York City all have tidal power projects currently underway. Most recently, India joined the tidal power wave with the approval of a commercial-scale tidal power plant in the Gulf of Kutch. The 50 MW plant will be developed by the London-based company Atlantis Resources Corporation in partnership with Gujarat Power Corporation, and construction will start this year. The plant should be operating by 2013, making it the first of its kind in Asia (Korea’s tidal power plant won’t be finished until 2015).

Last year, Atlantis unveiled the world’s largest tidal turbine in Scotland, where they installed a 378 MW tidal power project. An initial economic and technical study by the company indicated that the Gulf of Kutch could produce as much as 300 MW of extractable tidal power, and the plant is likely to be scaled up to a capacity of 250 MW. The total cost will be about $165 million. Costs for tidal power plants are high because of the extensive civil works required. To make the Gulf of Kutch project commercially viable, attractive tariffs will be offered to the power offtakers.

Tidal plants usually take between eight and twelve years to break even, but their environmental advantages are unparalleled. Tidal power is incredibly predictable, making it one of the most reliable sources of renewable energy.
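Tidal power's predictability comes from the physics: the power available to a tidal stream turbine follows the standard kinetic-energy formula P = ½ρAv³, and tidal current speeds can be forecast years ahead. A hedged sketch of that formula; the 40 percent overall efficiency, rotor size, and current speed below are illustrative assumptions, not figures from the Gulf of Kutch project:

```python
import math

def tidal_stream_power_kw(rotor_diameter_m, current_speed_ms,
                          efficiency=0.4, rho=1025.0):
    """Electrical power (kW) from one tidal stream turbine.

    P = 1/2 * rho * A * v^3, derated by an assumed water-to-wire
    efficiency. rho is seawater density in kg/m^3.
    """
    area = math.pi * (rotor_diameter_m / 2) ** 2   # swept rotor area, m^2
    return 0.5 * rho * area * current_speed_ms ** 3 * efficiency / 1000.0

# An assumed 18 m rotor in a 2.5 m/s tidal current:
print(round(tidal_stream_power_kw(18, 2.5)), "kW")
```

Under these assumed numbers a single turbine yields on the order of 0.8 MW, which is why a 50 MW plant implies an array of several dozen machines; the cubic dependence on current speed also shows why siting in a strong tidal channel matters so much.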

Inhabitat

TStzmmalaysia
post Jan 18 2011, 11:13 AM



APPLIED SCIENCES

Attached Image

Better than the human eye

Researchers from Northwestern University and the University of Illinois at Urbana-Champaign are the first to develop a curvilinear camera, much like the human eye, but with one significant feature the human eye lacks: a zoom capability.

The "eyeball camera" has a 3.5x optical zoom, takes sharp images, is inexpensive to make and is only the size of a nickel. (A higher zoom is possible with the technology.)

While the camera won't be appearing at Best Buy any time soon, the tunable camera -- once optimized -- should be useful in many applications, including night-vision surveillance, robotic vision, endoscopic imaging and consumer electronics.

"We were inspired by the human eye, but we wanted to go beyond the human eye," said Yonggang Huang, Joseph Cummings Professor of Civil and Environmental Engineering and Mechanical Engineering at Northwestern's McCormick School of Engineering and Applied Science. "Our goal was to develop something simple that can zoom and capture good images, and we've achieved that."

The tiny camera combines the best of both the human eye and an expensive single-lens reflex (SLR) camera with a zoom lens. It has the simple lens of the human eye, allowing the device to be small, and the zoom capability of the SLR camera without the bulk and weight of a complex lens. The key is that both the simple lens and photodetectors are on flexible substrates, and a hydraulic system can change the shape of the substrates appropriately, enabling a variable zoom.

The research will be published the week of Jan. 17 by the Proceedings of the National Academy of Sciences (PNAS).

Huang, co-corresponding author of the PNAS paper, led the theory and design work at Northwestern. His colleague John Rogers, the Lee J. Flory Founder Chair in Engineering and professor of materials science and engineering at the University of Illinois, led the design, experimental and fabrication work. Rogers is a co-corresponding author of the paper.

Earlier eyeball camera designs are incompatible with variable zoom because these cameras have rigid detectors. The detector must change shape as the in-focus image changes shape with magnification. Huang and Rogers and their team use an array of interconnected and flexible silicon photodetectors on a thin, elastic membrane, which can easily change shape. This flexibility opens up the field of possible uses for such a system. (The array builds on their work in stretchable electronics.)

The camera system also has an integrated lens constructed by putting a thin, elastic membrane on a water chamber, with a clear glass window underneath.

Initially both detector and lens are flat. Beneath both the membranes of the detector and the simple lens are chambers filled with water. By extracting water from the detector's chamber, the detector surface becomes a concave hemisphere. (Injecting water back returns the detector to a flat surface.) Injecting water into the chamber of the lens makes the thin membrane become a convex hemisphere.

To achieve an in-focus and magnified image, the researchers actuate the hydraulics to change the curvatures of the lens and detector in a coordinated manner. The shape of the detector must match the varying curvature of the image surface to accommodate continuously adjustable zoom, and this is easily done with this new hemispherical eye camera.
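The hydraulic zoom idea can be illustrated with the thin-lens lensmaker's equation: a membrane bulged into a spherical cap of radius R acts as a plano-convex water lens with focal length f = R / (n - 1). This is a generic optics sketch under assumed numbers, not the paper's actual geometry or fluid parameters:

```python
# Why pumping water zooms the lens: changing the membrane's radius of
# curvature R changes the focal length f = R / (n - 1).
N_WATER = 1.33  # refractive index of water

def focal_length_mm(radius_of_curvature_mm):
    """Thin plano-convex water lens: f = R / (n - 1)."""
    return radius_of_curvature_mm / (N_WATER - 1)

# Flattening the membrane (larger R) lengthens f; bulging it (smaller R)
# shortens f. Coordinated with the detector's curvature, this varies the
# magnification continuously. Radii below are illustrative.
for r_mm in (5.0, 10.0, 20.0):
    print("R =", r_mm, "mm  ->  f =", round(focal_length_mm(r_mm), 1), "mm")
```

Because water's refractive index is low, the focal length is roughly three times the radius of curvature, which is one reason a modest hydraulic deflection is enough to sweep a useful zoom range.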

PhysOrg

TStzmmalaysia
post Jan 19 2011, 09:47 AM



ENERGY

Attached Image

Incredible Sahara Forest Project Moves From Concept To Reality

The Sahara Forest Project proposes to use two separate technologies together, Concentrated Solar Power (CSP) and Seawater Greenhouses, to provide an array of sustainable energy and agricultural solutions, in the usually inhospitable desert environment, through the desalination of seawater into freshwater.

After joining forces with the Norwegian environmental group the Bellona Foundation in 2009 The Sahara Forest Project team, including biomimicry architect Michael Pawlyn, Seawater Greenhouse designer Charlie Paton and structural engineer Bill Watts, presented their proposal at COP15.

Having been well received in Copenhagen, the fast-rising profile of the project led to an audience with His Majesty King Abdullah II of Jordan in Oslo in June 2010. The King was so impressed that he invited the SFP team to visit Jordan in October 2010 to scope out a feasibility study. The result of these fast-moving developments is the deal that was signed last week between the Aqaba Special Economic Zone Authority and The Sahara Forest Project in Amman, Jordan.

The Aqaba Special Economic Zone Authority (ASEZA) is the catchy name for the Jordanian Government's strategic development zone by the Red Sea. A perfect location for the Sahara Forest Project, which needs to be located very specifically near the coastline in order to pump seawater to the power plant.

With financial backing from Norway it has been agreed that ASEZA will:

"facilitate the necessary land area for The Sahara Forest Test and Demonstration Centre, including a corridor for the salt water pipe from the Red Sea. The area needed will be 20 hectares (200,000 sqm). ASEZA will also assist SFP in identifying and securing 200 hectares for possible later expansion."

The SFP team is now committed to developing the project from concept to reality in Aqaba, Jordan. The plan is to conduct comprehensive feasibility studies in 2011 and develop the Test and Demonstration Centre in 2012, with a large-scale roll-out of the project likely in 2015.

It is exciting to see how this international collaboration between Norway and Jordan can produce the necessary land and funding for such an ambitiously innovative project. We look forward to following the speedy progress towards making these sustainable solutions a reality and then to a large scale roll out that will see many more countries benefitting from these symbiotic technologies.

Undoubtedly there will be trials, tribulations and all sorts of social and technical challenges along the way, but the good news is that the starter gun has gone off. As we so often hear, we don't have any time to waste when it comes to developing and implementing renewable energy sources.

The President of the Bellona Foundation and founding partner of SFP, Frederic Hauge, says of the new deal:

"We are very happy with the strong support from both Jordanian and Norwegian authorities. It is encouraging to see that we share the vision of a more holistic approach towards solving challenges in the food, water and energy sector. The Sahara Forest Project has unfolded at a remarkable pace since first presented, and I am confident that the SFP facility in Jordan can be a reality within a very short time frame."

Treehugger



TStzmmalaysia
post Jan 19 2011, 09:49 AM



ENERGY

Attached Image

Biolamps use CO2 to light up the streets

One of the key problems we currently face in our world is the pollution produced by cars. Electric cars are one solution, but they are still much too costly for most people to afford, so a designer has come up with a proposal that tackles the emissions themselves. Called Biolamps, these street lamps contain algae mixed with water; the algae absorb CO2 and release oxygen back into the atmosphere. In addition to creating oxygen, the lamp converts the captured CO2 into biomass, which can be used to power the lamp itself. If the lamp has more biomass than it needs, an underground tube system pushes the excess to the nearest filler station, where it can be converted into a biofuel for eco cars. Talk about innovative! It’s great to see technology being used to push our world into a greener state; hopefully technology like this will be implemented before it’s too late to save our world.

Ubergizmo

TStzmmalaysia
post Jan 19 2011, 09:51 AM



NANOTECHNOLOGY

Attached Image

Nanoengineered 'Green' Asphalt Reduces Climate Impact of Asphalt

Emerald Cities™ USA Ltd. (div. of Emerald Cities™ International Ltd.) has just resurfaced the world's first solar reflective "Green" asphalt parking lot to demonstrate the importance of "Cool Pavement" in addressing the urban heat island effect in cities. This 24,000 sq.ft. parking lot was donated to Mayor Gordon's Phoenix Recovery Zone at the Duffy Charter School, where asphalt temperatures soared to 200 degrees (F) last summer in Arizona. The importance of "Cool Pavement" in a school setting cannot be overstated. Heat radiated from asphalt lingers between 1 and 4 feet above the surface, a "danger zone" where children play during recess, putting them at risk of heat stroke. This is an Emerald Cities "Cool Schools" project.

"Zero Carbon" Asphalt

Black asphalt covers 60% of city surfaces as a silent contributor to heat, smog and CO2 from parking lots, airports, amusement parks, shopping malls and roadways. Green building can never be "zero carbon" until the asphalt portion of the project is addressed. EC "Cool Pavement" reduces surface heat by 30-50 degrees (F) on summer days, reducing smog and CO2. It is a nano-engineered ultra high performance thin concrete proven roadworthy at 1/6" thickness. It is 4300+ psi, 100+ skid resistant, impervious to UV, non-delaminating, comes in beautiful colors, extends the service life of existing asphalt, and is easy to apply with no milling required.

"Green Filling Stations"

Emerald Cities™ International Ltd. seeks JV partners in global regions to establish inventory warehousing to serve as "hubs" for further distribution to EC "Green Filling Stations". The first "Green Filling Station" prototype has been opened in Scottsdale, Arizona. These "Franchises" are available at $100K and include a showroom, equipment and supplies for a drive-in dispensing port where contractors can fill-up and pick up fresh materials daily for projects.

100 Cities by 2012

Solar reflective color now plays a significant role in protecting the environment. According to Steven Chu (Obama's Secretary of Energy), "Changing surface colors in 100 of the world's largest cities could save as much as 44 billion tons of carbon dioxide. This is equivalent to the rise in global carbon emissions anticipated by 2020." With 2010 now the hottest year on record, NASA predicts a 2 to 6 degree rise in the next few years will result in altered rainfall, storms, coastal floods, melting glaciers, and droughts contributing to the increase in infectious diseases.

"Cool Communities" Initiative

Lawrence Berkeley's Heat Island Group has worked on climate change solutions for 8 years, and developed a "Cool Communities" Initiative targeting Mayors, Community Organizations and Corporations. On January 12, 2011, the Lawrence Berkeley Heat Island Group signed an Agreement with Emerald Cities™ USA Ltd. to take this seminar nationwide. Parties interested in hosting the "Cool Communities" Seminar may contact Emerald Cities™ USA Ltd. directly.

To read the full story, go to http://www.emeraldcoolpavements.com/pressr...nstallation.pdf.

Nanowerk News

TStzmmalaysia
post Jan 19 2011, 09:53 AM



ENERGY

Attached Image

Breakthrough in Converting Heat Waste to Electricity

Researchers at Northwestern University have placed nanocrystals of strontium telluride, a rock salt-structured material, into lead telluride, creating a material that can harness electricity from heat sources such as vehicle exhaust systems, industrial processes and equipment, and sunlight more efficiently than scientists have seen in the past.

The material exhibits a high thermoelectric figure of merit that is expected to enable conversion of about 14 percent of waste heat into electricity, a scientific first. Chemists, physicists and materials scientists at Northwestern collaborated to develop the material. The results of the study are published in the journal Nature Chemistry.
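The figure of merit mentioned above (usually written ZT) maps to conversion efficiency through a standard thermoelectric-generator formula. A minimal sketch; the hot- and cold-side temperatures below are assumed for illustration and are not numbers from the Northwestern study:

```python
import math

def thermoelectric_efficiency(zt, t_hot_k, t_cold_k):
    """Maximum conversion efficiency of a thermoelectric generator.

    Standard formula:
      eta = (1 - Tc/Th) * (sqrt(1 + ZT) - 1) / (sqrt(1 + ZT) + Tc/Th)
    i.e. the Carnot limit times a ZT-dependent penalty factor.
    """
    carnot = 1.0 - t_cold_k / t_hot_k
    m = math.sqrt(1.0 + zt)
    return carnot * (m - 1.0) / (m + t_cold_k / t_hot_k)

# Illustrative: exhaust-like hot side at 750 K, ambient-ish cold side at 350 K.
for zt in (0.5, 1.0, 2.0):
    eta = thermoelectric_efficiency(zt, 750.0, 350.0)
    print("ZT =", zt, "->", round(100 * eta, 1), "% of the heat flow")
```

The formula makes the trade-off explicit: efficiency is bounded by the Carnot limit set by the temperature difference, and raising ZT closes part of the remaining gap, which is why a higher figure of merit translates directly into more recovered waste heat.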

"It has been known for 100 years that semiconductors have this property that can harness electricity," said Mercouri Kanatzidis, the Charles E. and Emma H. Morrison Professor of Chemistry in The Weinberg College of Arts and Sciences. "To make this an efficient process, all you need is the right material, and we have found a recipe or system to make this material."

Kanatzidis, co-author of the study, and his team dispersed nanocrystals of strontium telluride (SrTe), which adopts the rock salt crystal structure, into the material lead telluride (PbTe). Past attempts at this kind of nanoscale inclusion in bulk material have improved the energy conversion efficiency of lead telluride, but the nano inclusions also increased the scattering of electrons, which reduced overall conductivity. In this study, the Northwestern team offers the first example of using nanostructures in lead telluride to reduce electron scattering and increase the energy conversion efficiency of the material.

"We can put this material inside of an inexpensive device with a few electrical wires and attach it to something like a light bulb," said Vinayak Dravid, professor of materials science and engineering at Northwestern's McCormick School of Engineering and Applied Science and co-author of the paper. "The device can make the light bulb more efficient by taking the heat it generates and converting part of the heat, 10 to 15 percent, into a more useful energy like electricity."

The automotive, chemical, brick, glass and any industry that uses heat to make products could make their system more efficient with the use of this scientific breakthrough, said Kanatzidis, who also has a joint appointment at the Argonne National Laboratory.

"The energy crisis and the environment are two major reasons to be excited about this discovery, but this could just be the beginning," Dravid said. "These types of structures may have other implications in the scientific community that we haven't thought of yet, in areas such as mechanical behavior and improving strength or toughness. Hopefully others will pick up this system and use it."

BeforeItsNews


TStzmmalaysia
post Jan 20 2011, 09:38 AM



SPACE SCIENCE

Attached Image

NASA Mars rover will check for ingredients of life

Technicians and engineers inside a clean room at NASA's Jet Propulsion Laboratory, Pasadena, Calif., prepare to install SAM into the mission's Mars rover, Curiosity. Image Credit: NASA/JPL-Caltech

The specific work planned for this instrument on Mars requires more all-covering protective garb for these specialized workers than was needed for the building of NASA's earlier Mars rovers.

The instrument is Sample Analysis at Mars, or SAM, built by NASA's Goddard Space Flight Center, Greenbelt, Md. At the carefully selected landing site for the Mars rover named Curiosity, one of SAM's key jobs will be to check for carbon-containing compounds called organic molecules, which are among the building blocks of life on Earth. The clean-room suits worn by Curiosity's builders at NASA's Jet Propulsion Laboratory, Pasadena, Calif., are just part of the care being taken to keep biological material from Earth from showing up in results from SAM.

Organic chemicals consist of carbon and hydrogen and, in many cases, additional elements. They can exist without life, but life as we know it cannot exist without them. SAM can detect a fainter trace of organics and identify a wider variety of them than any instrument yet sent to Mars. It also can provide information about other ingredients of life and clues to past environments.

Researchers will use SAM and nine other science instruments on Curiosity to study whether one of the most intriguing areas on Mars has offered environmental conditions favorable for life and favorable for preserving evidence about whether life has ever existed there. NASA will launch Curiosity from Florida between Nov. 25 and Dec. 18, 2011, as part of the Mars Science Laboratory mission's spacecraft. The spacecraft will deliver the rover to the Martian surface in August 2012. The mission plan is to operate Curiosity on Mars for two years.

"If we don't find any organics, that's useful information," said Paul Mahaffy of NASA's Goddard Space Flight Center, SAM's principal investigator. "That would mean the best place to look for evidence about life on Mars may not be near the surface. It may push us to look deeper." It would also aid understanding of the environmental conditions that remove organics.

"If we do find detectable organics, that would be an encouraging sign that the immediate environment in the rocks we're sampling is preserving these clues," he said. "Then we would use the tools we have to try to determine where the organics may have come from." Organics delivered by meteorites without involvement of biology come with more random chemical structures than the patterns seen in mixtures of organic chemicals produced by organisms.

Mahaffy paused in describing what SAM will do on Mars while engineers and technicians lowered the instrument into its position inside Curiosity this month. A veteran of using earlier spacecraft instruments to study planetary atmospheres, he has coordinated work of hundreds of people in several states and Europe to develop, build and test SAM after NASA selected his team's proposal for it in 2004.

"It has been a long haul getting to this point," he said. "We've taken a set of experiments that would occupy a good portion of a room on Earth and put them into that box the size of a microwave oven."

SAM has three laboratory tools for analyzing chemistry. The tools will examine gases from the Martian atmosphere, as well as gases that ovens and solvents pull from powdered rock and soil samples. Curiosity's robotic arm will deliver the powdered samples to an inlet funnel. SAM's ovens will heat most samples to about 1,000 degrees Celsius (about 1,800 degrees Fahrenheit).

One tool, a mass spectrometer, identifies gases by the molecular weight and electrical charge of their ionized states. It will check for several elements important for life as we know it, including nitrogen, phosphorous, sulfur, oxygen and carbon.

Another tool, a laser spectrometer, uses absorption of light at specific wavelengths to measure concentrations of selected chemicals, such as methane and water vapor. It also identifies the proportions of different isotopes in those gases. Isotopes are variants of the same element with different atomic weights, such as carbon-13 and carbon-12, or oxygen-18 and oxygen-16. Ratios of isotopes can be signatures of planetary processes. For example, Mars once had a much denser atmosphere than it does today, and if the loss occurred at the top of the atmosphere, the process would favor increased concentration of heavier isotopes in the retained, modern atmosphere.
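Isotope ratios like the ones SAM's laser spectrometer measures are conventionally reported in "delta" notation: the per-mil (parts-per-thousand) deviation of a sample's heavy-to-light ratio from a reference standard. A minimal sketch; the VPDB carbon-13 standard below is the accepted terrestrial reference value, while the sample ratios are hypothetical:

```python
# Delta notation for isotope ratios, as used in planetary science.
VPDB_13C_12C = 0.0112372  # 13C/12C ratio of the VPDB reference standard

def delta_permil(sample_ratio, standard_ratio=VPDB_13C_12C):
    """delta = (R_sample / R_standard - 1) * 1000, in per mil.

    Positive values mean the sample is enriched in the heavy isotope
    relative to the standard -- the signature expected if a planet lost
    much of its atmosphere from the top, preferentially shedding light
    isotopes.
    """
    return (sample_ratio / standard_ratio - 1.0) * 1000.0

# Hypothetical sample 5% enriched in 13C relative to the standard:
print(round(delta_permil(VPDB_13C_12C * 1.05), 1), "per mil")
```

This is why the article notes that isotope ratios can be signatures of planetary processes: a strongly positive delta in the modern Martian atmosphere would point to massive escape of gas to space over geologic time.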

More on PhysOrg
TStzmmalaysia
post Jan 20 2011, 09:41 AM



BIOTECHNOLOGY

Attached Image

Killer paper for next-generation food packaging

Scientists are reporting development and successful lab tests of "killer paper," a material intended for use as a new food packaging material that helps preserve foods by fighting the bacteria that cause spoilage. The paper, described in ACS' journal, Langmuir, contains a coating of silver nanoparticles, which are powerful anti-bacterial agents.

Aharon Gedanken and colleagues note that silver already finds wide use as a bacteria fighter in certain medicinal ointments, kitchen and bathroom surfaces, and even odor-resistant socks. Recently, scientists have been exploring the use of silver nanoparticles — each 1/50,000 the width of a human hair — as germ-fighting coatings for plastics, fabrics, and metals.

Nanoparticles, which have a longer-lasting effect than larger silver particles, could help overcome the growing problem of antibiotic resistance, in which bacteria develop the ability to shrug off existing antibiotics. Paper coated with silver nanoparticles could provide an alternative to common food preservation methods such as radiation, heat treatment, and low temperature storage, they note. However, producing "killer paper" suitable for commercial use has proven difficult.

The scientists describe development of an effective, long-lasting method for depositing silver nanoparticles on the surface of paper that involves ultrasound, or the use of high frequency sound waves. The coated paper showed potent antibacterial activity against E. coli and S. aureus, two causes of bacterial food poisoning, killing all of the bacteria in just three hours. This suggests its potential application as a food packaging material for promoting longer shelf life, they note.

PhysOrg



TStzmmalaysia
post Jan 21 2011, 09:39 AM



ENERGY

Attached Image

New Magnets Could Solve Our Rare-Earth Problems

Stronger, lighter magnets could enter the market in the next few years, making more efficient car engines and wind turbines possible. Researchers need the new materials because today's best magnets use rare-earth metals, whose supply is becoming unreliable even as demand grows.

So researchers are now working on new types of nanostructured magnets that would use smaller amounts of rare-earth metals than standard magnets. Many hurdles remain, but GE Global Research hopes to demonstrate new magnet materials within the next two years.

The strongest magnets rely on an alloy of the rare-earth metal neodymium that also includes iron and boron. Magnet makers sometimes add other rare-earth metals, including dysprosium and terbium, to these magnets to improve their properties. Supplies of all three of these rare earths are at risk because of increasing demand and the possibility that China, which produces most of them, will restrict exports.

However, it's not clear if the new magnets will get to market before the demand for rare-earth metals exceeds the supply. The U.S. Department of Energy projects that worldwide production of neodymium oxide, a key ingredient in magnets, will total 30,657 tons in 2015. In one of the DOE's projected scenarios, demand for that metal will be a bit higher than that number in 2015. The DOE's scenarios involve some guesswork, but the most conservative estimate has demand for neodymium exceeding supply by about 2020.
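As a rough illustration of how such a supply/demand crossover can be projected, the toy calculation below compounds annual growth from the 2015 figures. Only the 30,657-ton supply projection comes from the article; the starting demand and both growth rates are invented for illustration.

```python
# Toy supply/demand crossover projection. Only the 2015 supply figure
# (30,657 tonnes of neodymium oxide) comes from the DOE projection cited
# above; the starting demand and both growth rates are assumed.
year = 2015
supply = 30_657.0   # tonnes/year (DOE projection for 2015)
demand = 28_000.0   # assumed: demand starts a little below supply
while demand <= supply:
    year += 1
    supply *= 1.03   # assumed 3% annual growth in production
    demand *= 1.049  # assumed ~5% annual growth in demand

print(year)  # with these assumptions, demand overtakes supply in 2020
```

Under these invented rates the crossover lands at 2020, matching the most conservative DOE scenario described above; steeper demand growth pulls the date earlier.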

"A lot of the story about rare earths has focused around China and mining," says Steven Duclos, manager of material sustainability at GE Global Research. "We believe technology can play a role in addressing this." The DOE is funding GE's magnet project, and one led by researchers at the University of Delaware, through the Advanced Research Projects Agency-Energy (ARPA-E) program, which fosters research into disruptive technology.

Coming up with new magnet materials is not easy, says George Hadjipanayis, chair of the physics and astronomy department at the University of Delaware. Hadjipanayis was involved in the development of neodymium magnets in the 1980s while working at Kollmorgen. "At that time, maybe we all got lucky," he says of the initial development of neodymium magnets. The way researchers made new magnets in the past was to crystallize alloys and look for new forms with better properties. This approach won't work going forward. "Neodymium magnet performance has plateaued," says Frank Johnson, who heads GE's magnet research program. Hadjipanayis agrees. "The hope now is nanocomposites," he says.

Nanocomposite magnet materials are made up of nanoparticles of the metals that are found in today's magnetic alloys. These composites have, for example, neodymium-based nanoparticles mixed with iron-based nanoparticles. These nanostructured regions in the magnet interact in a way that leads to greater magnetic properties than those found in conventional magnetic alloys.

The advantage of nanocomposites for magnets is twofold: nanocomposites promise to be stronger than other magnets of similar weight, and they should use less rare-earth metal. What enables better magnetic properties in these nanocomposites is a property called exchange coupling. The physics is complex, but coupling between different nanoparticles in the composite leads to overall magnetic properties that are greater than the sum of the parts.

Exchange coupling can't happen in pure magnet materials, but emerges in composites made of mixtures of nanoparticles of the same metals that are used to make conventional magnets. "The advantage of stronger magnets is that the machines you put them in can be smaller and lighter," says Johnson.

GE would not disclose which materials it's using to make the magnets, or what its manufacturing methods would be, but Johnson says the company will rely on techniques it has developed to work with other metals. The main problem the company faces, says Johnson, is scaling up production to make large magnets—so far it's only been possible to make thin films of the nanocomposites. The company has about $2.25 million in funding from ARPA-E.

Hadjipanayis reports his group, a multi-institute consortium, has received nearly $4.5 million in ARPA-E funding. It's possible to make the necessary nanoparticles in small quantities in the lab, but scaling up will be difficult. "They're very reactive materials," he says.

The group is experimenting with a wide range of different types of nanoparticles, including combinations of neodymium-based nanoparticles with iron-cobalt nanoparticles. Another challenge is assembling the nanoparticles in a mixture that ensures they have enough contact with each other to get exchange coupling. "It's one step at a time," says Hadjipanayis.

TechnologyReview

TStzmmalaysia
post Jan 21 2011, 09:41 AM



APPLIED SCIENCES

Attached Image

Robotic Emergency Shelter is Entirely Self-Sufficient

Japan just unveiled the future of emergency housing in the form of the Emergency Disaster Vehicle, or EDV-01. The innovative design is fully automated, capable of doubling its size on its own, and can be transported anywhere you can set a shipping container. Developed by Daiwa Lease, the shelter is a paragon of high-tech Japanese engineering: call it an emergency shelter robot.

The 10-ton container-sized unit can travel by ship, truck, or even helicopter. Set the EDV-01 down and it quickly transforms into a full-service shelter for medical aid, storage, logistics, or whatever you can think of. Hydraulic pads level it quickly on uneven ground. Once set, an outer skin rises to create a full-sized second story. Solar electric panels cover the roof and one side of the building, enabling it to be self-sufficient. Other self-sufficiency measures include a system that collects water condensation, a hydrogen electrolysis system, and a satellite dish.

Inside is a sink and shower, with a waterless toilet accessible via a separate outdoor room. The rest of the lower half looks to be for storage and on-board equipment. A ladder provides access to the second story, a large room with a desk at one end and a couple of bunks at the other. The design features a perforated skin reminiscent of Virginia Tech’s Lumenhaus, and it’s every bit as transportable to boot. Check out this action-packed video to learn more.



Inhabitat

TStzmmalaysia
post Jan 21 2011, 09:43 AM



APPLIED SCIENCES

Attached Image

Loud and Clear: 360 Degree Glass Speakers

A speaker’s sound comes from the diaphragm, a flat or cone-shaped piece that pushes air. It can be made from almost anything: metal, carbon, fabric, paper, even wood. California-based Greensound Technology, however, has taken a new approach, sending vibrations pulsing through a half-inch pane of tempered glass.

Since sound comes off both sides of the glass, the speaker projects a full 360 degrees of sound that envelops listeners. By design, this glass sheet can reproduce notes from nearly every instrument, from an upright bass to a piccolo. A sound generator in its 10-inch-tall base sends the pane vibrating, and its sail-like shape helps it handle varied tones: Bass emanates from the lower portion, midrange comes off the midsection, and high frequencies radiate from the tip. The heft of the base deepens low frequencies, while three holes near the top cut mass to tune high notes.

The glass’s range and wide reach replicate a studio soundstage better than other speakers—a pair of these and a subwoofer can put you virtually dead center at a concert hall.

POPSCI

TStzmmalaysia
post Jan 21 2011, 06:23 PM



TRANSPORTATION

Attached Image

'Spiral' escalator could give crowds a lift

An Israeli designer has developed the idea as a way of transporting larger numbers of people than a lift in a vertical space too narrow for a traditional escalator. This could reduce the floor space needed in buildings for personal transporters and cut the cost of putting escalators into underground railway stations.

The escalator, which is technically helix shaped rather than spiral, overcomes the problem experienced by conventional machines that are angled so that they travel further horizontally than vertically. While curved escalators have existed since the 1980s and are sometimes stacked on top of each other to give the appearance of a helix, they still rely on the same basic mechanism as straight escalators and are similarly limited in height and length.

But the “Helixator”, which currently exists as a prototype scale model, has several design innovations that offer the possibility of a continuous helix. ‘The big technical breakthrough is the monorail system, which didn’t exist before in the industry,’ designer Michel David told The Engineer.

‘Traditional curved escalator designs comprise multiple rails from both sides of the steps and have to be supported by complex and heavy structures. The monorail solution allows us to concentrate all necessary rails in one central and light structure.’ The Helixator is also driven by a linear motor instead of a chain system where the top link in the chain carries all the weight of the steps.

The Helixator is claimed to lift more people in a small space than a conventional lift system. ‘This apparatus spreads the force equally along the machine path and does not apply excessive force on any of the chain’s links,’ said David, who is also chief technology officer of the Berlin-based Helixator company.

‘It allows us to build a particularly flexible system and presents no mechanical limits for geometry, length or height.’

David has produced numerous designs for different sized and shaped Helixators, as well as several 3D-printed scale models. One example is a 100m-high model with accelerating walkways to transport around 20,000 people per hour in both directions, saving floorspace equivalent to 15 elevators.

David is now looking for investment to develop an industry-standard prototype to take to manufacturers. ‘A lot of people see the design and are afraid they’re going to get dizzy,’ he said. ‘So this is something I’m now working on – what the speed and radius should be. I’m searching for the limit because I want it to go fast and operate in small shafts.’

Attached Image

The three-stage system accelerates passengers before taking them through the spiral section

He said it was too early to predict how much a manufactured unit would cost but that its cost-effectiveness would lie in the reduced supporting architecture needed to fit an escalator in a vertical shaft rather than spreading it across a building.

‘Escalators have been manufactured for the last 100 years and today you can buy a unit for €12,000 and that’s really difficult to compete with from an economical point of view.

‘But modern architecture is constantly searching for new things. In some Asian buildings there are already escalators going to the 10th or 15th floors.’

When building the Helixator, David took inspiration from monorails and roller coasters, as well as designs for spiral escalators by Gilbert Luna in the 1970s. Two Helixators could be wrapped around each other to form a double helix.

He also examined a helical moving walkway ‘elevator’ that was briefly installed in Holloway Road tube station in London in 1906. The Engineer reported on the technology behind this machine in 1900 when it was first used in the UK for a straight elevator system at the Crystal Palace. But the spiral version was never deemed safe enough for public use. Its remains are now held by the London Transport Museum.

‘The idea of a helical escalator is not new at all – it is 100 years old,’ said David. ‘Studying these attempts is very important to understand why they didn’t work.’

TheEngineer





TStzmmalaysia
post Jan 22 2011, 09:52 AM



ROBOTICS

Attached Image

Robotic suit has enabled paraplegics to walk

Good news for some 125,000 paraplegics in the U.S.: new progress has been made in the field of exoskeletons. A device called ReWalk, which got its start at the Technion incubator, will enable those who are paralyzed from the waist down to get back on their feet and walk.

ReWalk is a lightweight, robotic exoskeleton that allows paraplegics to stand, walk, and take stairs themselves. Worn around the legs and torso, the device works using a combination of motion sensors, electric motors, and a computerized backpack - controlled by a wristband. “It shifts a person from a wheelchair user status to a crutch user status, which is a whole difference,” says its designer, Technion alum Amit Goffer of Argo Medical Technologies in Haifa.

After a 1997 accident left Goffer paralyzed from the chest down, “I looked around and was wondering how come the wheelchair is the only solution," he says. Goffer, who was formerly an electrical engineer, quickly got to work on the invention. He soon made a selfless design choice that meant he personally could not use the device: if the wearer could use crutches, it would simplify balance (and conserve energy), as the device wouldn’t have to keep the person upright all on its own.

ReWalk has been used in clinical trials in Israel and at MossRehab, part of the Albert Einstein Healthcare Network in Philadelphia, with impressive results. Researchers there are finding that the very act of standing and walking again offers not only emotional rewards but also natural exercise for the heart and bones, and lessens some of the complications associated with being wheelchair-bound. The device recently received FDA approval for institutional use and is scheduled for sale to rehab centers as early as January.

“In the near future, we are going to continue to develop the device so that a quadriplegic or tetraplegic like myself will be able to use it,” Goffer says.



PhysOrg


TStzmmalaysia
post Jan 22 2011, 09:54 AM



ROBOTICS

Attached Image

The human touch, in robots

In 2005, when Martin Saerbeck was studying computer science at Bielefeld University in Germany, he programmed a service robot called BIRON. Mounted with a pan-tilt camera on top, BIRON was able to follow a human pointing gesture and focus on the object pointed at. On one occasion, however, BIRON’s camera lost track of Saerbeck’s hand and the robot appeared to be sleeping or to have lost interest. Without thinking, Saerbeck waved his hand in front of BIRON and said "Wake up!"
“But then, I thought, ‘What am I doing?’” says Saerbeck. “I didn’t program waving detection so I should have known it wouldn’t move, but it was natural for me to talk to the robot in that way.” The experience is a demonstration of how people are naturally inclined to use normal human social skills when interacting with technology.

Inspired by such experiences, Saerbeck went on to develop a series of programs and architectures that enable robots to mimic human-to-human communication in a natural and readily understandable manner. “I don’t want to think too much about how to interact with the device or how to control it,” he says. Saerbeck, now a research scientist at the A*STAR Institute of High Performance Computing, is interested in developing robots that can be programmed to express words or reactions in response to a dialog with a human instead of simply responding to a few preset keywords. The goal is to enable the user to understand the current state of the robot in a natural way.

In recent years, ‘social’ robots—cleaning robots, nursing-care robots, robot pets and the like—have started to penetrate into people’s everyday lives. Saerbeck and other robotics researchers are now scrambling to develop more sophisticated robotic capabilities that can reduce the ‘strangeness’ of robot interaction. “When robots come to live in a human space, we need to take care of many more things than for manufacturing robots installed on the factory floor,” says Haizhou Li, head of the Human Language Technology Department at the A*STAR Institute for Infocomm Research. “Everything from design to the cognitive process needs to be considered.”
Mimicking human interaction

Li leads the ASORO program, which was launched in 2008 as A*STAR’s first multi-disciplinary team for robot development covering robotic engineering, navigation and mechatronics, computer vision and speech recognition. The program’s 35 members have developed seven robots, including a robot butler and a robot home assistant. Their flagship robot is OLIVIA, a robot receptionist who also acts as a research platform for evaluating various technologies related to social robotics.

Li unveiled the latest version of OLIVIA—‘OLIVIA 2.1’—at the RoboCup 2010 competition in Singapore. In receptionist mode, OLIVIA welcomes guests and responds to a few key phrases such as “Can you introduce yourself?” She is also able to track human faces and eyes, and detect the lip motions of speakers. Eight head-mounted microphones enable OLIVIA to accurately locate the source of human speech and turn to the speaker. Intriguingly, OLIVIA even performs certain gestures learned from human demonstrators. Li is now discussing collaborations with Saerbeck to upgrade the OLIVIA platform.

One of the core premises of enhanced human–robot interaction is the concept of ‘believability’, says Saerbeck. A robot is controlled by a highly sophisticated and technical architecture of programs, sensors and actuators, and without detailed attention to the robot’s ‘animation’, the robot can appear mechanical and alien. For example, if a vacuum cleaning robot were to bump into an object and say “Ouch,” people would understand that it is simulating that it was hurt. However, static animations are not sufficient to give the impression of a life-like character. If the robot continuously bumped and repeated the same reaction, the animation would no longer be convincing. “What we are investigating now is how we can take context into account, and how behaviors can develop over time,” says Saerbeck. The research is guided by psychology, social science and linguistics to create models for appropriate actions using animation frameworks. He is also working on more sophisticated programming so that robots can autonomously cope with a wider range of situations instead of resorting to conventional programming of prescribed sequences of actions that humans are expected to perform.
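The context-sensitive reaction idea above can be sketched in a few lines. In this toy example, instead of one static "Ouch" animation, the response depends on how often the bump has recently recurred; the reaction strings, thresholds, and reset rule are all invented for illustration.

```python
# Toy sketch of context-dependent robot 'animation': the same event (a bump)
# produces different reactions as it repeats, so the behavior stays believable.
class BumpBehavior:
    def __init__(self):
        self.recent_bumps = 0

    def on_bump(self):
        self.recent_bumps += 1
        if self.recent_bumps == 1:
            return "Ouch!"              # first bump: surprised
        if self.recent_bumps <= 3:
            return "Ouch, again?"       # repeated: mildly annoyed
        return "I seem to be stuck."    # persistent: signal a strategy change

    def on_clear_path(self):
        self.recent_bumps = 0           # context resets once the path is clear

robot = BumpBehavior()
print([robot.on_bump() for _ in range(4)])
# ['Ouch!', 'Ouch, again?', 'Ouch, again?', 'I seem to be stuck.']
```

A real system would, as the article notes, draw its models from psychology and linguistics rather than hand-coded thresholds like these.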


Bringing all these technological elements together, Saerbeck is currently developing a robotic tutor that assists vocabulary-learning tasks for school children. The project dates back to the time when he worked at Philips Research in the Netherlands. In one experiment, his team divided 16 children aged 10–11 years into two groups, and varied the degree of social interaction with a cat-shaped robot named iCAT. All of the children had good language skills and were given the same artificial language to study. The result was that children in the class with a more socially responsive robot scored significantly higher than the children that interacted with a robot in the style of current learning programs. The children with the social iCAT also showed significantly higher intrinsic and task motivation.

At A*STAR, Saerbeck is still using iCAT as a prototype platform. His team is now planning to build a completely new desktop-based static robot tailored for tutoring applications and equipped with specialized hardware for teaching. The design has yet to be fixed, but the researchers are evaluating various technologies including touch screens, flash cards, projectors and gesture-based interfaces.

Meanwhile, Li is also upgrading OLIVIA to enable her to learn from speakers and to understand naturally spoken queries. Ultimately, he plans OLIVIA to be able to deliver information or take actions such as making a taxi booking and shaking hands. However, Li admits that there are many technological gaps that need to be filled before ‘OLIVIA 3.0’ becomes a reality. OLIVIA’s current technologies are already at a sufficiently advanced stage of development to attract commercial interest—aspects of OLIVIA have been applied in a commercial surveillance system. But, according to Li, there remains much scope for improvement, for example, in the accuracy of visual and speech understanding and real-time compliant control.

One of the biggest challenges is to improve robotic ‘attention’, says Li. “In human-to-human interaction, we share a natural concept of communication—we know when the conversation starts and ends, and when we can start talking in a group. We are now trying to facilitate this kind of ability in a robot.” Li’s team is developing new algorithms and cognitive processes that could enable a robot to engage in conversation with both visual and auditory attention, accompanied by natural body language.

As the context of communications is also influenced by the robot design, OLIVIA 3 is being designed in a completely different style from the version demonstrated at RoboCup 2010. The design is based on market research garnering public opinion about how people want a robotic receptionist to look. As opinions varied with age, ethnicity and social status, OLIVIA 3’s final appearance is still under discussion. “We also have to consider cultural contexts when we build robots—our aim is to develop social robots that people can interact with comfortably,” says Li.

PhysOrg

TStzmmalaysia
post Jan 22 2011, 09:55 AM



TRANSPORTATION

Attached Image

Global Observer UAV Makes Hydrogen-Powered Maiden Flight

Unmanned aerial vehicles (UAVs) have been making a lot of headlines in recent months, ever since the Zephyr solar-powered plane made several record-breaking endurance flights. Now, however, the UAV industry may have another power option on its hands: hydrogen fuel. AeroVironment’s high-altitude, long-endurance (HALE) Global Observer unmanned aircraft just made its first hydrogen-fueled flight!

Last year, the HALE aircraft made a successful maiden voyage using battery power, but this is the first time it has flown on alternative fuel. During its four-hour flight, the aircraft reached an altitude of 5,000 feet at Edwards Air Force Base (EAFB) in California. Further test flights are planned, with the aim of increasing the plane’s endurance and taking it to altitudes of 55,000 to 65,000 feet.

This is all the more amazing when you consider the aircraft’s size: it’s hardly small, with a wingspan of 175 feet and a length of 70 feet. It can also carry payloads of up to 400 lbs while its liquid-hydrogen propulsion system drives four electric motors.

The Global Observer UAV was designed as a constant remote imaging, surveillance and communications platform that is more cost-effective and controllable than satellites. The UAV can also cover greater areas than standard low-flying aircraft.

It is hoped that once it is operating at peak efficiency, the HALE aircraft will be able to remain aloft for up to a week. AeroVironment has even speculated that two Global Observers “would provide persistent satellite-like coverage over any location on the globe” at 20 percent of the cost of existing solutions.

“Global Observer has moved quickly from development and testing toward demonstrating mission-ready, affordable persistence,” said Tim Conver, AV chairman and chief executive officer. “Similar to a satellite, Global Observer is the first system designed to provide a 24/7/365 unblinking eye and continuous communications link over any location on the earth’s surface for as long as needed.”

As of 2008, the world’s 16,000 commercial jet aircraft emit more than 600 million tons of CO2 every year, so it’s clear that we need to look into alternative means of air travel in order to cut emissions and stem the onset of climate change. Hydrogen-powered vehicles feature efficient engines that emit only water vapor, making them one possible solution.

Inhabitat

TStzmmalaysia
post Jan 22 2011, 09:57 AM



ROBOTICS

Attached Image

Robots Dominate Manufacturing

Every product in your home tells a story. Listening to the memory card in your camera or computer will give you a sneak peek into the international world of automated manufacturing. Lexar Media is one of the largest producers of memory chips (SD cards, memory sticks, keydrives, CompactFlash, etc.), and they recently released a promo that shows the start-to-finish process for their goods. What does this video demonstrate? Total automation domination. From the creation of the silicon wafer in Utah to the packaging of the final product in Asia, machines are center stage. Watching these slick industrial robots do their thing is something else. You have to check out the video below and see what I mean. Considering how crucial automation is in modern production, maybe the big surprise at this point isn’t the robotic dominance but that Lexar still employs so many humans.

When you watch the following video, I want you to compare the creation of the silicon wafers in Utah to the rest of the assembly and testing performed in Asia. The wafer building at Micron Technology (which owns Lexar, by the way) displays a minimum of human presence, even if it has “thousands of workers.” The factories in Asia are very different. Starting around 3:28, the process goes from being almost purely robotic to relying more on human-machine cooperation.

Two of the major themes in the public eye here in the US are the outsourcing of jobs abroad and the current economic crisis. We’ve recently explored how workers in the US are (rightfully?) concerned about losing jobs to robots as well as to overseas factories. I think the video above shows that these are, in fact, competing interests. Look at the work done in Asia: is there any doubt that many of the steps humans are performing could be automated? The plugging in of cards for testing, the assembling of components, the placement of products in packaging – I could imagine a robot doing any of these tasks. Which probably means that, in a few years or so, they will. Asia, with its relatively cheap labor markets, is where we have exported manufacturing jobs. Yet if these processes become as automated as the wafer factories in Utah, would labor costs still be the deciding factor?



As robots get cheaper and better you have to pay your workers less and less to compete. At some point, machines win. And not just in manufacturing – automation is creeping in at all levels of the economy, including research science. In the years ahead, human labor is likely to be removed from any repetitive task. Manufacturing, tech services, legal research and other fields could try to make their way to even cheaper labor markets, but I think that robots will ultimately make such geographic concerns a thing of the past. Watching Lexar’s tour through their creation process leaves little doubt that the domination of automation is well underway. It’s only going to get more robotic from here.

SingularityHub


TStzmmalaysia
post Jan 23 2011, 09:21 AM



APPLIED SCIENCES

Attached Image

Face-recognition technology provides more relevant product suggestions

A face-recognising vending machine developed in Taiwan is able to offer products relevant to each particular customer: "hair-growing tonic to balding men and razors to people with beards," one of the inventors said Friday.

The vending machine, from the Taipei-based Innovative DigiTech-Enabled Applications and Services Institute, is equipped with a camera that reads the faces of shoppers and then suggests products according to their gender, age and appearance. Of course, the customer can then choose whether or not to follow the suggestions.

"Our facial-recognition technology is more advanced than what has been developed in the United States and Japan, because it can actually offer advice," said Tsai Chi-hung, a researcher at the institute.

As well as perceiving male baldness or facial hair, it is also able to suggest beauty products for young women and health drinks for older ones, according to Tsai.

The machine can also record the choices of shoppers who do not follow its tips, learning from its "mistakes" so it can offer better suggestions in the future, he said.

If you wish, the software can also remember your choices and preferences, allowing it to adapt further to your particular needs in the future and even offer suggestions you had not yet considered yourself.

This type of technology could prove very useful in the future. By recording which kinds of people tend to demand a certain product, the production and distribution of those products can be made much more efficient. Because the software has the ability to 'predict' your needs, it could even pre-order certain goods for you, which could virtually eliminate shipping times.

If, for example, a certain area tends to demand a lot of blue jeans, the production and distribution of those jeans can be adjusted accordingly, reducing overproduction and lowering the amount of energy needed for shipping and distributing the product.
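The suggest-then-learn loop described above can be sketched in a few lines. This is a toy sketch, loosely modelled on the vending machine in the article; the attribute names, products, and acceptance-rate scoring are all invented for illustration.

```python
# Toy attribute-based suggestion engine with feedback: offer a product based
# on perceived attributes, then learn from whether the shopper accepts it.
from collections import defaultdict

RULES = {
    "balding": "hair tonic",
    "beard": "razor",
    "young_woman": "beauty product",
    "older_woman": "health drink",
}

# smoothed [accepts, offers] per (attribute, product) pair
feedback = defaultdict(lambda: [1, 1])

def suggest(attributes):
    """Offer the rule-based product with the best acceptance rate so far."""
    candidates = [(a, RULES[a]) for a in attributes if a in RULES]
    if not candidates:
        return None
    return max(candidates, key=lambda ap: feedback[ap][0] / feedback[ap][1])[1]

def record(attribute, product, accepted):
    """Learn from the shopper's decision, as the article's machine does."""
    feedback[(attribute, product)][1] += 1
    if accepted:
        feedback[(attribute, product)][0] += 1

print(suggest(["beard"]))                 # razor
record("beard", "razor", accepted=False)
record("beard", "razor", accepted=False)
print(suggest(["beard", "balding"]))      # hair tonic, after two rejected razors
```

After two rejections, the razor's acceptance rate drops below the untried hair tonic's, so the machine switches its suggestion: the "learning from mistakes" the researcher describes.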

PhysOrg

TStzmmalaysia
post Jan 23 2011, 09:31 AM



ROBOTICS

Attached Image

New Cognitive Robotics Lab Tests Theories of Human Thought

In a new Cognitive Robotics Lab, students at Rensselaer are exploring how human thought outwits brute force computing in the real world. The lab's 20 programmable robots allow students to test the real-world performance of computer models that mimic human thought.

"The real world has a lot of inconsistency that humans handle almost without noticing -- for example, we walk on uneven terrain, we see in shifting light,"
said Professor Vladislav Daniel Veksler, who is currently teaching Cognitive Robotics. "With robots, we can see the problems humans face when navigating their environment."

Cognitive Robotics marries the study of cognitive science -- how the brain represents and transforms information -- with the challenges of a physical environment. Advances in cognitive robotics transfer to artificial intelligence, which seeks to develop more efficient computer systems patterned on the versatility of human thought.

Professor Bram Van Heuveln, who organized the lab, said cognitive scientists have developed a suite of elements -- perception/action, planning, reasoning, memory, decision-making -- that are believed to constitute human thought. When properly modeled and connected, those elements are capable of solving complex problems without the raw power required by precise mathematical computations.

"Suppose we wanted to build a robot to catch fly balls in an outfield. There are two approaches: one uses a lot of calculations -- Newton's law, mechanics, trigonometry, calculus -- to get the robot to be in the right spot at the right time," said Van Heuveln. "But that's not the way humans do it. We just keep moving toward the ball. It's a very simple solution that doesn't involve a lot of computation but it gets the job done."
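Van Heuveln's "just keep moving toward the ball" strategy (akin to the gaze heuristic studied in cognitive science) can be sketched without any trajectory math. The one-dimensional setup, step sizes, and numbers below are invented for illustration: the fielder simply steps toward the ball's current position each tick.

```python
# Minimal sketch of the heuristic approach: no Newton, no trigonometry,
# just repeatedly closing the gap to the ball's currently observed position.
def chase(fielder_x, ball_xs, speed=3.0):
    """Each time step, move (at most `speed`) toward the ball's current x."""
    for ball_x in ball_xs:
        step = max(-speed, min(speed, ball_x - fielder_x))
        fielder_x += step
    return fielder_x

# Ball drifts from x=0 to x=20 over 10 ticks; fielder starts at x=30 and
# ends up at the landing point without ever solving the ball's trajectory.
trajectory = [2 * t for t in range(11)]
print(chase(30.0, trajectory))  # 20.0
```

The point, as in the quote above, is that a simple feedback rule gets the job done where an explicit physics computation would be far more expensive and fragile.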

Robotics is an ideal testing ground for that principle because robots act in the real world, and a correct cognitive solution will withstand the unexpected variables the real world presents.

"The physical world can help us to drive science because it's different from any simulated world we could come up with -- the camera shakes, the motors slip, there's friction, the light changes," Veksler said. "This platform -- robotics -- allows us to see that you can't rely on calculations. You have to be adaptive."

The lab is open to all students at Rensselaer. In its first semester, the lab has largely attracted computer science and cognitive science students enrolled in a Cognitive Robotics course taught by Veksler, but Veksler and Van Heuveln hope it will attract more engineering and art students as word of the facility spreads.

"We want different students together in one space -- a place where we can bring the different disciplines and perspectives together," said Van Heuveln. "I would like students to use this space for independent research: they come up with the research project, they say 'let's look at this.'"

The lab is equipped with five "Create" robots -- essentially a Roomba robotic vacuum cleaner paired with a laptop; three hand-eye systems; one Chiara (which looks like a large metal crab); and 10 LEGO robots paired with the Sony Handy Board robotic controller.

On a recent day, Jacqui Brunelli and Benno Lee were working on their robot "cat" and "mouse" pair, which try to chase and evade each other respectively; Shane Reilly was improving the computer "vision" of his robotic arm; and Ben Ball was programming his robot to maintain a fixed distance from a pink object waved in front of its "eye."

"The thing that I've learned is that the sensor data isn't exact -- what it 'sees' constantly changes by a few pixels -- and to try to go by that isn't going to work," said Ball, a junior and student of computer science and physics.

Ball said he is trying to pattern his robot on a more human approach.

"We don't just look at an object and walk toward it. We check our position, adjusting our course," Ball said. "I need to devise an iterative approach where the robot looks at something, then moves, then looks again to check its results."
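Ball's look-move-look idea is essentially a feedback loop: never trust a single noisy reading; move a fraction of the measured error and sense again. A minimal sketch of that loop (the gain, noise level, and distances are made-up numbers, not taken from his robot):

```python
import random

def hold_distance(start=200.0, target=100.0, steps=80, gain=0.4, noise=3.0, seed=0):
    """Look-move-look: take a noisy reading, move only a fraction of the
    measured error, then look again rather than trusting one reading."""
    rng = random.Random(seed)
    pos = start  # actual distance to the object
    for _ in range(steps):
        measured = pos + rng.uniform(-noise, noise)  # sensor jitters by a few 'pixels'
        pos -= gain * (measured - target)            # move part-way, then re-check
    return pos

final = hold_distance()
```

Even though no single reading is exact, the iterative check-and-correct cycle settles the robot near the target distance, which is exactly why "going by" one measurement isn't going to work but re-looking after every move does.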


The work of the students, who program their robots with the Tekkotsu open-source software, could be applied in future projects, said Van Heuveln.

"As a cognitive scientist, I want this to be built on elements that are cognitively plausible and that are recyclable -- parts of cognition that I can apply to other solutions as well," said Van Heuveln. "To me, that's a heck of a lot more interesting than the computational solution."

In a generic domain, their early investigations clearly show how a more cognitive approach employing limited resources can easily outpace more powerful computers using a brute force approach, said Veksler.

"We look to humans not just because we want to simulate what we do, which is an interesting problem in itself, but also because we're smart," said Veksler. "Some of the things we have, like limited working memory -- which may seem like a bad thing -- are actually optimal for solving problems in our environment. If you remembered everything, how would you know what's important?"

ScienceDaily



TStzmmalaysia
post Jan 24 2011, 09:25 AM



ROBOTICS


Ping Pong Robot Learns by Doing

Despite all the recent advances in robotics, one fundamental task has always been very difficult: robot programming. New research in the field is moving in more natural and exciting directions.

To be sure, robot programming in industrial settings has evolved significantly, from a series of mechanical switches to advanced programming languages and teach-pendant devices for trajectory planning. But getting robots to do their jobs still requires a great deal of human labor -- and human intelligence.

The situation is even worse when it comes to programming robots to do things in non-industrial environments. Homes, offices, and hospitals are unstructured spaces, where robots need to deal with more uncertainty and act more safely.

To overcome this programming bottleneck, engineers need to create robots that are more flexible and adaptable -- robots that, like humans, learn by doing.

That's what a team led by Dr. Jan Peters at the Robot Learning Lab, part of the Max-Planck Institute for Biological Cybernetics, in Tübingen, Germany, is trying to do. Peters wants to transform robot programming into robot learning. In other words, he wants to design robots that can learn tasks effortlessly instead of requiring people to painstakingly determine their every move.

In the video below, you can see his students taking their robot "by the hand" to teach it motor skills needed for three tasks: paddle a ball on a string, play the ball-in-a-cup game, and hit a ping pong ball.

Here's how Dr. Peters explained to Automaton his team's approach: "Take the example of a person learning tennis. The teacher takes the student by the hand and shows basic movements: This is a forehand, this is a backhand, this is a serve. Still, it will take hours and hours of training before the student even feels comfortable at performing these behaviors. Even more practice is needed for the student to be able to play an actual game with these elementary behaviors." But still, he adds, humans succeed at learning the task. Why can't robots do the same? "That's what we're trying to do: Make our robots mimic the way humans learn new behaviors."

In the first part of the video, graduate student Katharina Muelling shows the robot how to paddle a ball on a string by performing the action while holding the robot's "hand." The robot decomposes the movement into primitive motor behaviors -- a discrete motor primitive that modulates the rhythmic paddling with an increasing amplitude until it becomes a stable rhythmic behavior -- and quickly "learns" how to perform the task.
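The decomposition described above -- a discrete primitive ramping the amplitude of a rhythmic one -- can be illustrated with a toy signal. This is a simplified stand-in, not the team's learned primitives; the time constant and frequency are arbitrary:

```python
import math

def paddle_primitive(t, goal_amp=1.0, tau=0.5, freq=2.0):
    """A discrete primitive (exponential rise toward a goal amplitude)
    modulating a rhythmic primitive (a sinusoid): the paddling starts
    small and grows into a stable rhythm."""
    amp = goal_amp * (1.0 - math.exp(-t / tau))   # ramps from 0 to goal_amp
    return amp * math.sin(2.0 * math.pi * freq * t)
```

Early on the output is nearly flat; after a few time constants the envelope saturates and the motion settles into the steady rhythmic paddling the robot ultimately performs.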

For comparison purposes, the researchers tried to manually program the robot's motors to perform the same task. It took them three months and the result wasn't as good as the imitation learning experiment, which took less than an hour, Dr. Peters says.


In the second part of the video, Muelling teaches the robot the ball-in-a-cup game. [See photo on the right; the robot has to swing the yellow ball, which is attached to a string, and make it land in the blue cup.] This skill is significantly more difficult than paddling the ball on a string, and the robot doesn't have enough data to simply imitate what the human did. In fact, when the robot attempts to reproduce the human action, it can't match the accelerations of the human hand and the ball misses the cup by a large margin. Here, self-improvement becomes key, Dr. Peters says.

"For every new attempt, when the robot reduces the distance by which the ball misses the cup, the robot receives a 'reward,' " he says. "The robot subsequently self-improves on a trial-by-trial basis. It usually gets the ball in the cup for the first time after 40 to 45 trials and it succeeds all the time after about 90 to 95 trials."
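The trial-by-trial self-improvement Dr. Peters describes can be sketched as a toy reward-driven search over a single motion parameter. This is only an illustration of the keep-what-reduces-the-miss idea -- the target value, step size, and trial count below are invented, not the team's actual learning algorithm:

```python
import random

def self_improve(trials=90, seed=1):
    """Keep a motion-parameter change only when the ball misses the cup
    by less than before (a bigger 'reward'); repeat trial after trial."""
    rng = random.Random(seed)
    target = 3.7                     # hypothetical ideal parameter value
    param, best_miss = 0.0, 3.7      # start far from the target
    for _ in range(trials):
        candidate = param + rng.uniform(-0.5, 0.5)   # small random tweak
        miss = abs(candidate - target)
        if miss < best_miss:                         # improved => keep it
            param, best_miss = candidate, miss
    return best_miss

final_miss = self_improve()
```

After on the order of 90 trials the miss distance has collapsed toward zero -- loosely mirroring the robot's 90-to-95-trial path to reliable success, though the real system learns over a far richer parameter space.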

How does the robot's learning ability compare to that of a human being? PhD student Jens Kober, who led this particular experiment, wanted to find out: He went home for a holiday last year and enjoyed the benefit of a large extended family -- always good subjects for a scientific experiment. He showed his many cousins the ball-in-a-cup game and rewarded them with chocolate. It turned out that the younger ones (around 6 years old) would not learn the behavior at all, the ones in their early teens (10 to 12) would learn it within 30 to 35 trials, and the grownups would be much faster.

"His supervisor may be the only person in his lab who has not managed to learn this task," Dr. Peters quips.

In the last part of the video, the researchers tackle an even harder task: ping pong. Again, Muelling teaches the robot by holding its "hand," this time to hit a ping pong ball sent by a ball gun. The challenge here is to use -- and modify -- previously learned basic motions and combine them with visual stimuli: The robot needs to keep track of the ball, which may come from different directions, and then execute the right set of motions.


Some of their work, part of the GeRT consortium, a program that aims to generalize robot manipulation tasks, is still preliminary, Dr. Peters notes. But he's confident they can teach their robot to become a good ping pong player. How good? Maybe not as good as Forrest Gump, but good enough to beat everyone in the lab.

For a video of the robot in action: Click Here

From: Spectrum



TStzmmalaysia
post Jan 24 2011, 09:27 AM



BIOTECHNOLOGY


Woman speaks after pioneering voice box transplant

Just 13 days after receiving a pioneering larynx transplant, a Californian woman was able to speak her first words in a decade. Her own larynx was permanently damaged by an operation 11 years ago.

The first combined larynx and thyroid transplant was performed in 1998, but in the latest operation Brenda Charett Jensen of Modesto, California, received a section of trachea too. The feat, which took 18 hours, was performed last October at the Medical Center of the University of California, Davis, but announced only yesterday.

The transplant also works far better than the first because more of the donated organs' nerves have been plugged into the 52-year-old woman's own nervous system. This enables her to move muscles that control speaking by moving the vocal cords, and others that will eventually allow her to swallow again, once she relearns how to do it.

"It is a miracle," says Jensen. "I'm talking, talking, talking, which just amazes my family and friends." The sound of her voice is her own, rather than that of the donor.

Her own voice again

Jensen lost her speech 11 years ago through complications during surgery that blocked her airway. The blockage stopped her larynx working, so for years she has communicated with a handheld voice synthesiser. That operation also left her breathing dependent on a tracheotomy – a tube inserted into her windpipe. With the new trachea, the hope is that she should also be able to breathe normally and dispense with the tracheotomy.

One of the reasons that Jensen was chosen was that she was already on immunosuppressive drugs because of a previous kidney-pancreas transplant, reducing the risk of organ rejection.

Led by surgeon Gregory Farwell, the team transplanted the larynx, thyroid and trachea of a woman who died in an accident. The thyroid has to be transplanted too, because it supplies blood to the larynx.

Farwell and his colleagues plumbed numerous blood vessels from the donated organs into Jensen's own, and also reconnected five major nerves to maximise her control over the muscle tissue that came with the transplant.

"The first larynx transplant only reconnected three nerves," says Martin Birchall of University College London, who served as chief scientific adviser to the team, specialising in reconnection of the nerves. "Here, we've done five nerves with the intention of restoring much more laryngeal function than the original, and eventually getting rid of the tracheotomy."

Rapid progress

Birchall said that although the man who received the original larynx transplant at the Cleveland Clinic in Ohio in 1998 is doing well and has recovered some speech, he still has a tracheotomy. His vocal cords have never moved, whereas Jensen's were moving in just a fortnight. "We've already seen much quicker progress in speech," says Birchall.

The breakthrough is the latest to exploit rapid improvements in microsurgical techniques since the first face transplant in 2005. The increasingly ambitious use of more complex transplants including muscles, nerves and bones has also highlighted the greater functionality that this allows the recipient.

Birchall believes that recipients will benefit even more if their own stem cells are extracted and used to coat donated organs chemically stripped of all donor cells. Because all that's then left of the donated organ is a "scaffold" of the protein collagen, it can be covered with the recipient's own cells and transplanted into their body with no fear of rejection.

In 2008, Birchall was part of a team that demonstrated this can be done by performing the world's first trachea transplant.

Complex challenge

Birchall told New Scientist that such an approach would be possible with the larynx, but unlike the trachea – which is simply a tube – a recoated larynx would also have to include artificially constructed muscles and blood vessels because of its much more complex function.

"It's much more complex than the trachea, but we do have ways to address these things," says Birchall. "Regenerative medicine using stem cells is now moving at a furious pace, and the airways and plumbing systems are at the forefront," he says.

Hear and see her voice here.

NewScientist


TStzmmalaysia
post Jan 26 2011, 07:34 AM



BIOTECHNOLOGY


Rewiring the Human Arm

Although modern prosthetic devices are more lifelike and easier for amputees to control than ever before, they still lack a sense of touch. Patients depend on visual feedback to operate their prostheses – they know that they’ve touched an object when they see their prosthetic hand hitting it. Without sensation, patients cannot accurately judge the force of their grip or perceive temperature and texture.

Todd Kuiken, a professor at Northwestern University and director of the Neural Engineering Center for Artificial Limbs at the Rehabilitation Institute of Chicago, has led the development of a new technique known as targeted reinnervation, which can help amputees control motorized prosthetic arms. He and his team now hope to extend the applications of targeted reinnervation to help patients regain sensory capabilities.

In targeted reinnervation, the motor nerves of a nearby target muscle (usually the chest) are deactivated. Then the residual motor nerves at the end of an amputated arm are transplanted from the stump to the chest. The nerves rewire themselves and grow into the chest muscle. Since amputation of a limb does not prevent the nerves left in the residual limb from signaling, the reinnervation procedure simply gives the signals a new destination.

After the procedure, when a person thinks about moving a muscle in the missing arm or hand, the chest muscle twitches. Electrodes pick up these signals and pass them on to a motorized prosthetic arm, allowing patients to control multiple motor functions like the simultaneous movement of both the elbow and hand to throw a ball.
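The decoding step -- electrodes on the chest picking up muscle signals and mapping them to arm motions -- can be illustrated with a toy nearest-template classifier. The electrode patterns, motion names, and decoding rule here are hypothetical stand-ins, not Kuiken's actual signal processing:

```python
def decode_intent(emg, templates):
    """Nearest-template decoding: pick the motion whose stored
    electrode-amplitude pattern is closest to the measured signals."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(templates, key=lambda motion: dist(emg, templates[motion]))

# hypothetical 4-electrode amplitude patterns for two intended motions
templates = {
    "close_hand": [0.9, 0.1, 0.2, 0.1],
    "bend_elbow": [0.1, 0.8, 0.1, 0.3],
}
intent = decode_intent([0.85, 0.15, 0.25, 0.05], templates)
```

Because distinct thoughts ("close the hand," "bend the elbow") twitch different regions of the reinnervated chest muscle, even a crude pattern-matcher like this can tell the intended motions apart -- which is what makes simultaneous multi-joint control possible.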

The regrowth of sensory nerves after this procedure was discovered by accident. The first patient to undergo targeted reinnervation told Kuiken and his other doctors about an interesting sensation he experienced: when someone touched the area of his chest where his nerves had regrown, he felt as if someone was touching his missing hand. The sensory nerves from his arm stump had reinnervated the skin above his chest muscle. He was experiencing touch to the reinnervated skin as being applied to his missing limb. It turned out that sensory reinnervation such as this was common following the procedure.

Kuiken and his colleagues are currently exploring how to take advantage of sensory reinnervation to build prosthetic arms with sensors on the fingers that can transfer touch information from the prosthetic to the chest, allowing patients to “feel” what they are touching with their prostheses.

The next step is to figure out the mechanisms that guide reinnervation, with the hope of someday being able to direct the regrowth of nerves for more refined results. To better understand how sensory reinnervation affects brain reorganization, Kuiken and his colleague Paul Marasco examined the brains of rats after amputation and targeted reinnervation. In this experiment, published in The Journal of Neuroscience, Marasco and Kuiken looked at how the somatosensory cortex, the brain area that receives and processes input from sensory organs, changed in rats following forelimb amputation with and without the targeted reinnervation procedure.

One group of rats underwent forelimb amputation and then targeted reinnervation, while another group of rats underwent only the amputation. The rats that did not undergo targeted reinnervation effectively had the input between the cortex and the forepaw silenced. After thirteen weeks of recovery, the experimenters recorded brain activity in the primary somatosensory cortex of all the animals. Marasco and Kuiken were especially interested in the region known as the forelimb barrel subfield, which would normally process touch input from the amputated forepaw.

As expected, the rats that underwent amputation without targeted reinnervation showed an almost complete silencing of brain activity in the forelimb barrel subfield. The receptive fields for the few active areas in this region were located on the residual shoulder.

In contrast, the rats that underwent targeted reinnervation showed extensive activity in the forelimb barrel subfield. The receptive fields for the active sites in these rats were small and densely clustered on the far end of the stump, and differed in proportion from the large and diffuse receptive fields observed on the residual limb of the amputation-only rats. It appeared that the sensory input from the reinnervated skin was processed within the cortical representation of the missing forepaw.

This helps explain why Kuiken’s earlier human patient reported feeling a touch on his chest as occurring on his missing hand. His somatosensory cortex, in particular the area devoted to the missing limb, had reorganized to accommodate the new sensory input. Sensations from the skin on his chest were being processed within the hand representation area of his somatosensory cortex.

Further somatosensory reorganization was evident in the rats. In most of the animals that underwent targeted reinnervation following amputation, there were regions of the forelimb barrel subfield (called dual receptive fields) that were responsive to both the stump and other regions of the body (the whiskers, lower lip, and hindlimb). The presence of dual receptive fields in these rats, but not in the amputation-only rats, suggests that the adjacent brain areas expanded into the denervated regions following the amputation. The sharing of space allowed those sensory nerves to keep transmitting signals, even after amputation.

Marasco and Kuiken’s results provide important insights into the sensory phenomena observed in human targeted reinnervation patients. The reorganization of somatosensory cortex in rats following the procedure supports the hypothesis that the reinnervated skin is able to act as a direct line of communication from a prosthetic device to the regions of the brain that process hand and limb sensations. This is likely the mechanism by which targeted reinnervation provides sensation that is perceived as coming from an amputated limb.

Ultimately, Marasco and Kuiken hope that this experiment will contribute to the building of better prosthetic limbs. Motorized prostheses that also provide sensory feedback have the potential to be more effective, capable of more functions, and easier to manipulate. Most importantly, they would not only function like a real human arm but also feel like one, allowing the prosthetic to be integrated more naturally into the patient’s self image.

Technology Review

TStzmmalaysia
post Jan 26 2011, 07:36 AM



TRANSPORTATION


Hydrogen-Powered “Flying Yacht”

Australian designer Jaron Dickson has come up with a concept for a hydrogen-powered, flying yacht. If that wasn’t cool enough, the yacht is based on the legendary Soviet super-vehicle the Ekranoplan. As a result, this cool little boat, which has been shortlisted for an Australian Design Award, is called the EkranoYacht – read on for a look!

For those of you not familiar with the Ekranoplan, it was a massive vehicle conceived in Scandinavia but realized by the Soviets during the Cold War. Essentially half plane, half boat, the Ekranoplan would ‘skim’ across the water (or land) on short wings using the ‘ground effect’. This allowed the vehicle to ‘fly’ just above the surface on a cushion of high-pressure air created by the aerodynamic interaction between the wings and the surface. The Soviets designed a massive 550-ton Ekranoplan that could transport vehicles and troops at a stunning 450 mph – all while remaining up to 66 feet over the water. Unsurprisingly, it earned the nickname “The Caspian Sea Monster”. Production ceased with the collapse of the Soviet Union.

Ok, science and history lesson over! Back to the EkranoYacht!

According to Dickson, who is a student at Monash University in Australia, the EkranoYacht is a “hydrogen powered wing-in-ground effect vehicle for permanent residence – set for 2025. The conventional ways of living have changed dramatically; people are less bound by the country or topographical location in which they reside. Using hydrogen power and flying 4m above the water’s surface, the project focuses on more efficient sea travel and protecting the environment. To truly show your wealth and success is freedom, and the ultimate freedom is bringing your home wherever you go.”

Dickson added that, “Humans are always thinking of new ways to travel and improve their dynamic lives. My design is a ‘blue-sky concept’, but this type of forward and different thinking could possibly turn into a reality one day. My project has the livability of a yacht and the convenience of an aeroplane.”

Inhabitat

TStzmmalaysia
post Jan 26 2011, 07:37 AM



BIOTECHNOLOGY


Sushi freezing technique could allow cryopreservation of internal organs

CAS technology could mean organ transplants of the future won't be the frantic race against time they are now. It has been possible to successfully cryopreserve semen, blood, embryos, oocytes, stem cells and other thin samples of small clumps of cells for a few decades now. However, cryopreservation of human internal organs, such as livers and hearts for storage and transplant, currently requires toxic doses of cryoprotectants – substances that protect biological tissue from freezing damage due to ice formation – for the organs to survive the cooling process. A solution could be at hand in the form of a technology used to preserve sushi that can instantly freeze water, meaning there is no time for cell-damaging ice crystals to form. In fact, it’s already being used to preserve teeth.

The common misconception is that the freezing of organs will cause the cells to burst due to the formation of ice crystals within the cell, but this only occurs if the freezing rate exceeds the osmotic loss of water to the extracellular space. This makes it possible to freeze small biological samples in a process known as slow programmable freezing (SPF) or controlled-rate and slow freezing. This approach is currently used for oocyte, skin, blood product, embryo, sperm, stem cell and general tissue preservation in hospitals, veterinary practices and research labs around the world.

However, freezing larger organs or even whole human beings can cause serious damage as a result of ice forming between cells, causing mechanical and chemical damage. To prevent this, larger doses of toxic cryoprotectants are used to remove and replace water inside cells with chemicals that prevent freezing. While this can reduce the damage from freezing, the toxicity can still cause serious injuries that aren’t reversible with present technology – meaning we’re not likely to see James Bedford – the first person whose body was frozen and remains cryopreserved – up and about any time soon.

A newer technique known as vitrification, which involves an extremely rapid drop in temperature, claims to provide the benefits of cryopreservation without damage due to ice crystal formation. But again, damaging levels of cryoprotectants are needed, this time to increase the viscosity and lower the freezing temperature inside the cell.

Freezing without cryoprotectants

Now a research group at Hiroshima University, borrowing supercooling technology used to preserve sushi and high-end food delicacies, has proven it is possible to freeze cells without the use of toxic cryoprotectants. As reported on Singularity Hub, the “Cells Alive System” (CAS) produced by Japanese company ABI prevents freezing at supercool temperatures by vibrating the water using magnetic fields. This allows the water to be supercooled so that when the magnetic field is turned off, the water freezes instantaneously – too fast for damaging ice crystals to form.

The patented CAS technology is already being used at the world’s first commercial tooth bank, The Teeth Bank, to preserve teeth and potentially as an alternative source for harvesting stem cells. The Teeth Bank allows removed teeth, previously disposed of as worthless medical waste, to be stored and re-implanted when needed. Preserved teeth can even be sculpted into different teeth before re-implantation.

The CAS technology has even been shown to preserve the tooth ligaments – an important factor for re-implantation. A 2010 study published in the journal Cryobiology detailed how the ligaments of a fresh tooth slow-frozen without the CAS technology were severely damaged, while the ligaments of another tooth frozen using the CAS technology survived with only minor damage and grew as well as those from a fresh tooth.

The technology could obviously have potential for the preservation of internal organs that currently have only a brief window of viability after being removed from a donor. Thanks to burgeoning technology, it could also provide a way for people to store replacement organs grown from their own stem cells instead of waiting on a compatible donor.

GizMag

TStzmmalaysia
post Jan 26 2011, 07:39 AM



APPLIED SCIENCES


Incredibly Small Pocket Projector works

Researchers in Germany have developed the world's thinnest "pico" video projector.

The prototype device contains an array of carefully shaped microlenses, each with its own miniature liquid-crystal display (LCD). The device is just six millimeters thick, but it produces images that are 10 times brighter than would normally be possible with such a small device.

Handheld pico projectors can be used to display movies, maps, or presentations on nearby surfaces. But the projections can be difficult to view in direct sunlight because the light source isn't very powerful. The new lens system is small enough to be incorporated into a slim smart phone.

Increasing the brightness of a projection normally means increasing the area of the light source used, says Marcel Sieler, a researcher at the Fraunhofer Institute for Applied Optics and Precision Engineering in Germany, who was involved in developing the prototype. But increasing the area in this way requires a thicker lens to focus the larger image. "As the area of the light source increases, so does the volume of the lens," says Sieler. The result is a much bigger projector.

Sieler and colleagues created a novel type of lens that focuses light from a relatively large light source while remaining thin. The prototype video projector consists of 45 microlenses colored red, green, or blue. Each lens has an LCD with 200 by 200 pixels behind it. The light passing through each LCD is focused by its own lens, and the 45 resulting images are superimposed on one another to produce the final image. The design was inspired by a type of microlens array known as a "fly's eye condenser," which is normally used to mix light from different sources.
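The brightness trick is additive: each microlens projects its own small LCD image onto the same spot, so the intensities from all 45 lenses stack up. A toy sketch of that superposition (ignoring lens losses and the per-lens color filtering):

```python
def superimpose(subimage, n_lenses=45):
    """Every microlens projects the same small LCD image onto the same
    spot, so the projected intensities simply add up."""
    return [[px * n_lenses for px in row] for row in subimage]

sub = [[0.1, 0.2], [0.3, 0.0]]   # toy 2x2 sub-image intensities
final = superimpose(sub)
```

This is why a 6 mm-thick stack of many small lenses can deliver roughly an order of magnitude more brightness than a single lens of the same thickness: the final image is the sum of many dim copies rather than one bright projection.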

Source: Technology Review
TStzmmalaysia
post Jan 27 2011, 09:27 AM



APPLIED SCIENCES


New Solar Powered Desalination Process

In coastal desert countries like the United Arab Emirates, fresh, clean water is a scarce resource. Desalination is necessary to obtain fresh water, but it often harms the environment by returning concentrated saline to the ocean. Abu Dhabi's Environment Agency recently announced that it has developed a new solar-powered desalination system that would cut costs and be more eco-friendly.

In desert areas like the Emirates, dust and high temperatures often impair the efficiency of the solar panels in existing desalination plants. The new technologies remedy these problems while reducing the cost of water treatment. In a press release, the agency said that two pilot sites, in Sweihan and Hameem, have shown that the negative environmental impact of desalination can be reduced, along with operating costs. Each plant produces about 35 kilowatts, giving the trial sites a combined capacity of 1,050 kilowatts.


The new solar-powered system is being trialled at 30 different locations within the country. If the process proves successful, Saudi Arabia and other countries will begin to use the system.

Desalination is the most effective way of obtaining clean water in the region, and several plants already exist in the United Arab Emirates. However, the process can have an adverse environmental impact, mainly from concentrated saltwater being returned to the sea and killing marine life. New systems like the one developed by the UAE can eliminate or mitigate this problem, producing the same amount of clean water with less damage to the environment.

Via: The Green Optimistic
TStzmmalaysia
post Jan 27 2011, 09:29 AM



NANOTECHNOLOGY


Novel nanoparticle filter

Israeli researchers have created a recyclable membrane based on supramolecular linkages that can be used to filter nanoparticles. The membrane, which unusually comprises non-covalent bonds, performs just as well as conventional sieves, offering a green and versatile alternative for size-separation and purification of nanoparticles.

Standard filtration membranes are usually held together by strong covalent bonds, which give membranes suitable strength to withstand the pressures involved in filtration processes. The problem is that when membranes become clogged up they have to be discarded and replaced.

One idea is to make membranes based on supramolecular (non-covalent) interactions which can undergo reversible self-assembly. Since the bonds can be undone easily, they offer a recyclable and adaptable option. But making such membranes with a level of robustness to rival conventional options has remained a challenge.

Now, Boris Rybtchinski and colleagues at the Weizmann Institute of Science in Rehovot, Israel, have managed to make a robust and recyclable ultrafiltration membrane with non-covalent hydrophobic linkages. 'This results in easy fabrication, recyclability, and versatility that cannot be achieved with regular covalent materials,' says Rybtchinski.

The team created a compound that self-assembles in water. It has a specially designed large and flat hydrophobic surface. 'In water, these surfaces experience very large attractive forces that hold them together, eventually forming porous nanostructured 3D networks possessing high robustness,' Rybtchinski explains. By filtering these structures onto a cheap commercial support with 400nm pores, they form a nanostructured membrane that works as a nanoparticle sieve.

The hydrophobic interactions are strong enough to hold together the membrane and withstand the flow of particles, says Rybtchinski. Experiments with solutions containing gold nanoparticles of various sizes revealed that only particles smaller than 5nm could pass through a 12µm thick membrane. By increasing the thickness to 45µm, the team discovered that the membrane could separate smaller particles (CdTe quantum dots of 2-4nm in size) because of a time delay between different sized particles passing through the membrane, resulting in size-selective chromatography.
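The behavior the experiments describe amounts to a hard size cutoff plus a size-dependent transit delay that grows with membrane thickness. A toy model of both effects (the transit-time rule is an illustrative simplification, not the paper's physics):

```python
def filtrate(particles_nm, cutoff_nm=5.0):
    """Size sieve: only particles smaller than the cutoff pass through."""
    return [d for d in particles_nm if d < cutoff_nm]

def transit_order(particles_nm, thickness_um):
    """In a thicker membrane, larger particles are delayed longer, so the
    particles that do pass emerge smallest-first -- the size-selective
    chromatography effect seen with the 45 micrometre membrane."""
    passed = filtrate(particles_nm)
    return sorted(passed, key=lambda d: thickness_um * d)

mix = [2.0, 3.5, 4.8, 7.0, 12.0]   # nanoparticle diameters in nm
passed = filtrate(mix)
```

The 12 µm membrane acts like `filtrate` alone (a yes/no cutoff at 5 nm), while thickening to 45 µm adds the ordering effect of `transit_order`, separating 2-4 nm quantum dots by size.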

The membrane is easily disassembled by adding solvents such as ethanol, which weaken the hydrophobic interactions. 'This way the material can be retrieved, cleaned, and reused for fabrication of another membrane,' says Rybtchinski. Furthermore, particles that are stuck in the filter can be recycled too, which is not always possible with conventional membranes.

Jonathan Nitschke, who researches self-assembling polymers at the University of Cambridge, UK, says that Rybtchinski's use of non-covalent interactions to knit together a filtration membrane is innovative. 'Supramolecular linkages can be undone under certain conditions, allowing the membranes to be dissolved and recreated, so it's an excellent way of cleaning and recycling them.'

James Urquhart

Royal Society of Chemistry

TStzmmalaysia
post Jan 27 2011, 09:30 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


SPACE SCIENCE

Attached Image

Beaming Rockets into Space

Space launches have evoked the same image for decades: bright orange flames exploding beneath a rocket as it lifts, hovers and takes off into the sky. But an alternative propulsion system proposed by some researchers could change that vision.

Instead of explosive chemical reactions on-board a rocket, the new concept, called beamed thermal propulsion, involves propelling a rocket by shining laser light or microwaves at it from the ground. The technology would make possible a reusable single-stage rocket that has two to five times more payload space than conventional rockets, which would cut the cost of sending payloads into low-Earth orbit.

NASA is now conducting a study to examine the possibility of using beamed energy propulsion for space launches. The study is expected to conclude by March 2011.

In a traditional chemical rocket propulsion system, fuel and oxidizer are pumped into the combustion chamber under high pressure and burnt, which creates exhaust gases that are ejected down from a nozzle at high velocity, thrusting the rocket upwards.

A beamed thermal propulsion system would involve focusing microwave or laser beams on a heat exchanger aboard the rocket. The heat exchanger would transfer the radiation's energy to the liquid propellant, most likely hydrogen, converting it into a hot gas that is pushed out of the nozzle.

“The basic idea is to build rockets that leave their energy source on the ground,” says Jordin Kare, president of Kare Technical Consulting, who developed the laser thermal launch system concept in 1991. “You transmit the energy from the ground to the vehicle.”

With the beam shining on the vehicle continually, it would take 8 to 10 minutes for a laser to put a craft into orbit, while microwaves would do the trick in 3 to 4 minutes. The vehicle would have to be designed without shiny surfaces that could reflect dangerous beams, and aircraft and satellites would have to be kept out of the beam’s path. Any launch system would be built in high-altitude desert areas, so danger to wildlife shouldn’t be a concern, Kare says.

Thermal propulsion vehicles would be safer than chemical rockets since they can’t explode and don’t drop off pieces as they fly. They are also smaller and lighter because most of the complexity is on the ground, which makes them easier and cheaper to launch.

“People can launch small satellites for education, science experiments, engineering tests, etc. whenever they want, instead of having to wait for a chance to share a ride with a large satellite,” Kare says.

Another cost advantage comes from larger payload space. While conventional propulsion systems are limited by the amount of chemical energy in the propellant that's released by combustion, in beamed systems you can add more energy externally. That means a spacecraft can gain a certain momentum using less than half the amount of propellant of a conventional system, allowing more room for the payload.
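The propellant saving can be sketched with the Tsiolkovsky rocket equation. The specific-impulse figures below are illustrative assumptions, not numbers from the article: roughly 450 s for a good chemical engine versus roughly 800 s sometimes quoted for beamed-thermal hydrogen propulsion.

```python
import math

G0 = 9.80665  # standard gravity, m/s^2

def propellant_fraction(delta_v, isp_s):
    """Fraction of liftoff mass that must be propellant (Tsiolkovsky)."""
    return 1.0 - math.exp(-delta_v / (isp_s * G0))

DELTA_V = 9300.0  # m/s, rough delta-v to low Earth orbit incl. losses (assumed)

chem = propellant_fraction(DELTA_V, 450.0)  # chemical rocket (assumed Isp)
beam = propellant_fraction(DELTA_V, 800.0)  # beamed-thermal H2 (assumed Isp)
# chem is roughly 0.88, beam roughly 0.69: with the energy supplied from
# the ground, far more of the liftoff mass is left for structure and payload
```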

“Usually in a conventional rocket you have to have three stages with a payload fraction of three percent overall,” says Kevin Parkin, leader of the Microwave Thermal Rocket project at the NASA Ames Research Center. “This propulsion system will be single stage with a payload fraction of five to fifteen percent.”

Having a higher payload space along with a reusable rocket could make beamed thermal propulsion a low-cost way to get material into low Earth orbit, Parkin says.

Parkin developed the idea of microwave thermal propulsion in 2001 and described a laboratory prototype in his 2006 PhD thesis. A practical real-world system should be possible to build now because microwave sources called gyrotrons have advanced dramatically over the last five decades, he says. One-megawatt devices are now on the market for about a million US dollars.

"They're going up in power and down in cost by orders of magnitude over the last few decades,” he says. “We've reached a point where you can combine about a hundred and make a launch system."

Meanwhile, the biggest obstacle to using lasers to beam energy has been the misconception that it would require a very large, expensive laser, Kare says. But you could buy commercially available lasers that fit on a shipping container and build an array of a few hundred. "Each would have its own telescope and pointing system," he says. "The array would cover an area about the size of a golf course."

The smallest real laser launch system would have 25 to 100 megawatts of power while a microwave system would have 100 to 200 megawatts. Building such an array would be expensive, says Kare, although similar to or even less expensive than developing and testing a chemical rocket. The system would make most economic sense if it was used for at least a few hundred launches a year.

In addition, says Parkin, “the main components of the beam facility should last for well over ten thousand hours of operation, typical of this class of hardware, so the savings can more than repay the initial cost.”

In the near term, beamed energy propulsion would be useful for putting microsatellites into low Earth orbit, for altitude changes or for slowing down spacecraft as they descend to Earth. But the technology could in the future be used to send missions to the Moon or to other planets and for space tourism.

Kare has looked into the possibility of using lasers to propel interstellar probes for NASA’s Institute of Advanced Concepts. A deep space launch would require higher power lasers with larger telescope systems as well as laser relay stations in space. Powering missions over interplanetary distance would require even bigger lasers and telescopes, as well as different propulsion techniques using propellants easier to store than liquid hydrogen.

Sending a spacecraft to a moon of Jupiter, for instance, would require a laser that gives billions of watts of power. "You'd have to have another couple of generations of space-based telescopes to do something like that,” Kare says. “You can in fact launch an interstellar probe that way but now you’re talking about lasers that might be hundreds of billions of watts of power." Laser technology could reach those levels in another 50 years, he says.

Astrobio


TStzmmalaysia
post Jan 27 2011, 09:32 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


ROBOTICS

Attached Image

More and More Thai Robots In Restaurants

Every nation is known for its specialty goods. Switzerland has its chocolates, France has its wine… and Thailand has its robot waiters. MK Restaurant has over 300 locations in Thailand, and has been looking for ways to modernize its operation. That includes robot greeters and servers. The MK Robot Project has encouraged Bangkok University to develop several different models that could do the job. They’ll have competition from Thailand-based CT Asia Robotics, which has already developed one restaurant-bot (Dinsow) and will soon launch another (Yumbo). Check out Dinsow and Bangkok University’s serving robot in the videos below. With a major restaurant chain backing the idea, and both academic and industrial support, Thailand might be the proving ground for the future of automated dining.

Dinsow is a wheeled robot with some decent speed and an overly happy disposition. While its batteries only last two to three hours, its personality seems to say it could go for miles just to get you a cup of coffee. The bot can be controlled via voice commands or a PC base station up to 80 meters away. Ten copies of the Dinsow have served as greeters in MK Restaurants, but not as waiters. Dinsow’s arms look to be mostly for show. Not true for its sibling, Yumbo: the second CT Asia robot will be able to carry a tray and deliver food.

Yumbo is CT Asia's robot directly targeted to restaurants as a way to boost their sales. It can carry trays and change its facial expressions.

Bangkok University looks to be experimenting with several different possible forms for the MK Robot Project, all wheeled. The most promising is a model with a tray built into its chest. It can follow a line, escort someone by the arm, and avoid collisions with ultrasonics. They’ve even put it on a limited test run in a real restaurant, though you’ll have to judge its success for yourself:



Of course, Thailand’s efforts towards robotic waiters aren’t unique. China recently unveiled its own robot-themed restaurant, with their own home grown bots. Japan, too, is clearly into the concept, considering all the weird ways they’ve gotten industrial robots to serve food. In fact, Thailand’s first bot-enabled restaurant featured Motoman robots from Japan. Yet the Thai are making a strong effort to get restaurant-friendly robots into their mainstream. Dinsow only costs about $30,000 – not bad for this kind of market where restaurants might spend close to a million dollars to attract a crowd with automated servers. A low price point might be what restaurants need to take a chance and invest in the technology. In any case, whether they be from Thailand or elsewhere the robots of the world are here to serve.

SingularityHub



TStzmmalaysia
post Jan 27 2011, 09:34 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


TRANSPORTATION

Attached Image

Breakthrough promises $1.50 per gallon synthetic gasoline with no carbon emissions

UK-based Cella Energy has developed a synthetic fuel that could lead to US$1.50 per gallon gasoline. Apart from promising a future transportation fuel with a stable price regardless of oil prices, the fuel is hydrogen based and produces no carbon emissions when burned. The technology is based on complex hydrides, and has been developed over a four-year top secret program at the prestigious Rutherford Appleton Laboratory near Oxford. Early indications are that the fuel can be used in existing internal combustion engine vehicles without engine modification.

According to Stephen Voller CEO at Cella Energy, the technology was developed using advanced materials science, taking high energy materials and encapsulating them using a nanostructuring technique called coaxial electrospraying.

“We have developed new micro-beads that can be used in an existing gasoline or petrol vehicle to replace oil-based fuels,” said Voller. “Early indications are that the micro-beads can be used in existing vehicles without engine modification.”

“The materials are hydrogen-based, and so when used produce no carbon emissions at the point of use, in a similar way to electric vehicles”, said Voller.

The development team is led by Professor Stephen Bennington in collaboration with scientists from University College London and Oxford University.

Professor Bennington, Chief Scientific Officer at Cella Energy said, “our technology is based on materials called complex hydrides that contain hydrogen. When encapsulated using our unique patented process, they are safer to handle than regular gasoline.”

Gizmag

TStzmmalaysia
post Jan 27 2011, 09:42 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


ROBOTICS

Attached Image

Scientists Smash a Super-Tough Robotic Hand With a Hammer

German robotics researchers have built a hyper-strong hand that can withstand hammer blows!

This hand and its high-tech robophalanges come to you courtesy of the Institute of Robotics and Mechatronics at The German Aerospace Center (Deutsches Zentrum für Luft- und Raumfahrt e.V.).

The DLR hand is one of the most durable robotic hands ever built and was specifically built tough for jobs that might ding it up.

As IEEE Spectrum describes: The hand has the shape and size of a human hand, with five articulated fingers powered by a web of 38 tendons, each connected to an individual motor on the forearm.

The main capability that makes the DLR hand different from other robot hands is that it can control its stiffness. The motors can tension the tendons, allowing the hand to absorb violent shocks. In one test, the researchers hit the hand with a baseball bat, a 66 g impact. The hand survived.

The hand has a total of 19 degrees of freedom, or only one less than the real thing, and it can move the fingers independently to grasp varied objects. The fingers can exert a force of up to 30 newtons at the fingertips, which makes this hand also one of the strongest ever built.

Additionally, the hand can catch heavy balls, adjust its level of stiffness to accomplish tasks that require a daintier touch, and snap its fingers. That's right, we're looking at the next star of the future's all-robot revue of West Side Story.

This type of robot, which is so incredibly versatile, could be applied in a number of areas. It could, for example, be used in manufacturing, in places where handling the product demands high strength and durability, or where machines are at risk of being damaged, as in mining. It also helps give robot hands the flexibility of the human hand with a dozen times its strength, which could prove very useful in household robots, for example.


See the hand take the hammer blows: here.

PopSci


TStzmmalaysia
post Jan 27 2011, 09:44 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


ENERGY

Attached Image

The world can be powered by alternative energy, using today's technology, in 20-40 years

If someone told you there was a way you could save 2.5 million to 3 million lives a year and simultaneously halt global warming, reduce air and water pollution and develop secure, reliable energy sources – nearly all with existing technology and at costs comparable with what we spend on energy today – why wouldn't you do it?

According to a new study coauthored by Stanford researcher Mark Z. Jacobson, we could accomplish all that by converting the world to clean, renewable energy sources and forgoing fossil fuels.

"Based on our findings, there are no technological or economic barriers to converting the entire world to clean, renewable energy sources," said Jacobson, a professor of civil and environmental engineering. "It is a question of whether we have the societal and political will."

He and Mark Delucchi, of the University of California-Davis, have written a two-part paper in Energy Policy in which they assess the costs, technology and material requirements of converting the planet, using a plan they developed.

The world they envision would run largely on electricity. Their plan calls for using wind, water and solar energy to generate power, with wind and solar power contributing 90 percent of the needed energy.

Geothermal and hydroelectric sources would each contribute about 4 percent in their plan (70 percent of the hydroelectric is already in place), with the remaining 2 percent from wave and tidal power.

Vehicles, ships and trains would be powered by electricity and hydrogen fuel cells. Aircraft would run on liquid hydrogen. Homes would be cooled and warmed with electric heaters – no more natural gas or coal – and water would be preheated by the sun.

Commercial processes would be powered by electricity and hydrogen. In all cases, the hydrogen would be produced from electricity. Thus, wind, water and sun would power the world.

The researchers approached the conversion with the goal that by 2030, all new energy generation would come from wind, water and solar, and by 2050, all pre-existing energy production would be converted as well.

"We wanted to quantify what is necessary in order to replace all the current energy infrastructure – for all purposes – with a really clean and sustainable energy infrastructure within 20 to 40 years," said Jacobson.

One of the benefits of the plan is that it results in a 30 percent reduction in world energy demand since it involves converting combustion processes to electrical or hydrogen fuel cell processes. Electricity is much more efficient than combustion.
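The claimed 30 percent demand reduction is an end-use efficiency effect, which a back-of-envelope calculation can make concrete. The share and efficiency figures below are illustrative assumptions for the sketch, not numbers from the Jacobson/Delucchi paper.

```python
# Back-of-envelope sketch of why electrification shrinks total demand:
# a share of end uses switches to electric processes that need less
# input energy for the same useful output.
def demand_after_electrification(current_demand, share_converted, eff_gain):
    """New total demand if `share_converted` of end uses needs
    `eff_gain` times less input energy."""
    converted = current_demand * share_converted / eff_gain
    unconverted = current_demand * (1 - share_converted)
    return converted + unconverted

# Example (assumed numbers): 60% of demand moves to electric drivetrains
# and heating that are ~2x as efficient at the point of use.
new_demand = demand_after_electrification(100.0, 0.60, 2.0)
# -> 70.0, i.e. a 30% reduction, the same scale as the paper's claim
```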

That reduction in the amount of power needed, along with the millions of lives saved by the reduction in air pollution from elimination of fossil fuels, would help keep the costs of the conversion down.

"When you actually account for all the costs to society – including medical costs – of the current fuel structure, the costs of our plan are relatively similar to what we have today," Jacobson said.

One of the biggest hurdles with wind and solar energy is that both can be highly variable, which has raised doubts about whether either source is reliable enough to provide "base load" energy, the minimum amount of energy that must be available to customers at any given hour of the day.

Jacobson said that the variability can be overcome.

"The most important thing is to combine renewable energy sources into a bundle," he said. "If you combine them as one commodity and use hydroelectric to fill in gaps, it is a lot easier to match demand."

Wind and solar are complementary, Jacobson said, as wind often peaks at night and sunlight peaks during the day. Using hydroelectric power to fill in the gaps, as it does in our current infrastructure, allows demand to be precisely met by supply in most cases. Other renewable sources such as geothermal and tidal power can also be used to supplement the power from wind and solar sources.

“One of the most promising methods of ensuring that supply matches demand is using long-distance transmission to connect widely dispersed sites," said Delucchi. Even if conditions are poor for wind or solar energy generation in one area on a given day, a few hundred miles away the winds could be blowing steadily and the sun shining.

"With a system that is 100 percent wind, water and solar, you can't use normal methods for matching supply and demand. You have to have what people call a supergrid, with long-distance transmission and really good management," he said.

Another method of meeting demand could entail building a bigger renewable-energy infrastructure to match peak hourly demand and use the off-hours excess electricity to produce hydrogen for the industrial and transportation sectors.

Using pricing to control peak demands, a tool that is used today, would also help.

Jacobson and Delucchi assessed whether their plan might run into problems with the amounts of material needed to build all the turbines, solar collectors and other devices.

They found that even materials such as platinum and the rare earth metals, the most obvious potential supply bottlenecks, are available in sufficient amounts. And recycling could effectively extend the supply.

"For solar cells there are different materials, but there are so many choices that if one becomes short, you can switch," Jacobson said. "Major materials for wind energy are concrete and steel and there is no shortage of those."

Jacobson and Delucchi calculated the number of wind turbines needed to implement their plan, as well as the number of solar plants, rooftop photovoltaic cells, geothermal, hydroelectric, tidal and wave-energy installations.

They found that to power 100 percent of the world for all purposes from wind, water and solar resources, the footprint needed is about 0.4 percent of the world's land (mostly solar footprint) and the spacing between installations is another 0.6 percent of the world's land (mostly wind-turbine spacing), Jacobson said.
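To put those percentages in absolute terms, the only added assumption below is Earth's total land area of roughly 149 million km²; the 0.4 and 0.6 percent figures come from the study as reported above.

```python
LAND_AREA_KM2 = 149e6  # Earth's total land area, approximate

footprint = 0.004 * LAND_AREA_KM2  # 0.4% direct footprint (mostly solar)
spacing = 0.006 * LAND_AREA_KM2    # 0.6% turbine spacing (still usable
                                   # for pasture or farming, per the study)
# footprint is about 596,000 km2; spacing is about 894,000 km2
```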

One of the criticisms of wind power is that wind farms require large amounts of land, due to the spacing required between the windmills to prevent interference of turbulence from one turbine on another.

"Most of the land between wind turbines is available for other uses, such as pasture or farming," Jacobson said. "The actual footprint required by wind turbines to power half the world's energy is less than the area of Manhattan." If half the wind farms were located offshore, a single Manhattan would suffice.

Jacobson said that about 1 percent of the wind turbines required are already in place, and a lesser percentage for solar power.

"This really involves a large scale transformation," he said. "It would require an effort comparable to the Apollo moon project or constructing the interstate highway system."

"But it is possible, without even having to go to new technologies," Jacobson said. "We really need to just decide collectively that this is the direction we want to head as a society."

EurekAlert

TStzmmalaysia
post Jan 28 2011, 09:11 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


APPLIED SCIENCES

Attached Image

Amazing Floating City Design

There are very few urban design solutions that address housing the tide of displaced people that could arise as oceans swell under global warming. Certainly few are as spectacular as this one.

The Lilypad, by Vincent Callebaut, is a concept for a completely self-sufficient floating city intended to provide shelter for future climate change refugees. The intent of the concept itself is laudable, but it is Callebaut’s phenomenal design that has captured our imagination.


Biomimicry was clearly the inspiration behind the design. The Lilypad, which was designed to look like a waterlily, is intended to be a zero-emission city afloat in the ocean. Through a number of technologies (solar, wind, tidal, biomass), it is envisioned that the project would be able to not only produce its own energy, but also process CO2 in the atmosphere and absorb it into its titanium dioxide skin.

Each of these floating cities is designed to hold approximately 50,000 people. A mixed-terrain man-made landscape, provided by an artificial lagoon and three ridges, creates a diverse environment for the inhabitants. Each Lilypad is intended either to sit near a coast or to float around the ocean, traveling from the equator to the northern seas wherever the Gulf Stream takes it.

The project isn’t even close to happening anytime soon, but there is value in future-forward designs like the Lilypad. They inspire creative thinking which, at some point, may actually provide real solutions to various problems.

The oceans take up roughly 70% of the earth's surface. The land left to live on is scattered among mountains, deserts and ice caps, which has kept humans from settling in many places. The oceans, however, are flat and full of life, space and energy. These factors could provide us with all the necessities for a future life at sea...

Attached Image Attached Image

via: Inhabitat

This post has been edited by tzmmalaysia: Jan 28 2011, 09:13 AM
BrachialPlexus
post Jan 28 2011, 12:00 PM

Getting Started
**
Junior Member
246 posts

Joined: Oct 2010


Fascinating stuff. Thank you for sharing. smile.gif
TStzmmalaysia
post Jan 29 2011, 09:19 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

Universal solvent no match for new self-healing sticky gel

Scientists can now manufacture a synthetic version of the self-healing sticky substance that mussels use to anchor themselves to rocks in pounding ocean surf and surging tidal basins. A patent is pending on the substance, whose potential applications include use as an adhesive or coating for underwater machinery or in biomedical settings as a surgical adhesive or bonding agent for implants.

Inspiring the invention were the hair-thin holdfast fibers that mussels secrete to stick against rocks in lakes, rivers and oceans. "Everything amazingly just self-assembles underwater in a matter of minutes, which is a process that's still not understood that well," said Niels Holten-Andersen, a postdoctoral scholar with chemistry professor Ka Yee Lee at the University of Chicago.

Holten-Andersen, Lee and an international team of colleagues are publishing the details of their invention this week in the Proceedings of the National Academy of Sciences Early Edition. Holten-Andersen views the evolution of life on Earth as "this beautiful, amazingly huge experiment" in which natural selection has enabled organisms to evolve an optimal use of materials over many millions of years.

"The mussels that live right on the coast where the waves really come crashing in have had to adapt to that environment and build their materials accordingly," he said.

Many existing synthetic coatings involve a compromise between strength and brittleness. Those coatings rely on permanent covalent bonds, a common type of chemical bond that is held together by two atoms that share two or more electrons. The bonds of the mussel-inspired material, however, are linked via metals and exhibit both strength and reversibility.

"These metal bonds are stable, yet if they break, they automatically self-heal without adding any extra energy to the system," Holten-Andersen said.

A key ingredient of the material is a polymer, which consists of long chains of molecules, synthesized by co-author Phillip Messersmith of Northwestern University. When mixed with metal salts at low pH, the polymer appears as a green solution. But the solution immediately transforms into a gel when mixed with sodium hydroxide to change the pH from high acidity to high alkalinity.

"Instead of it being this green solution, it turned into this red, self-healing sticky gel that you can play with, kind of like Silly Putty," he said. Holten-Andersen and his colleagues found that the gel could repair tears within minutes.

"You can change the property of the system by dialing in a pH," said Ka Yee Lee, a professor in chemistry at UChicago and co-author of the PNAS paper. The type of metal ion (an electrically charged atom of, for example. iron, titanium or aluminum) added to the mix provides yet another knob for tuning the material's properties, even at the same pH.

The sticky material that mussels have evolved has inspired an international team of scientists to design a new artificial, self-healing gel that lends itself to underwater applications. The mussels pictured here are attached to a rock on Onetangi Beach of Waiheke Island, New Zealand. Credit: Steve Koppes

"You can tune the stiffness, the strength of the material, by now having two knobs. The question is, what other knobs are out there?" Lee said.

This week's PNAS study reports the most recent in a series of advances related to sticky mussel fibers that various research collaborations have posted in recent years. A 2006 PNAS paper by Haeshin Lee, now of the Korea Advanced Institute of Science and Technology, Northwestern's Phillip Messersmith and UChicago's Norbert Scherer demonstrated an elusive but previously suspected fact. Using atomic-force microscopy, they established that an unusual amino acid called "dopa" was indeed the key ingredient in the adhesive protein mussels use to adhere to rocky surfaces.

Last year in the journal Science, scientists at Germany's Max Planck Institute documented still more details about mussel-fiber chemical bonds. The Max Planck collaboration included Holten-Andersen and Herbert Waite of the University of California, Santa Barbara. Holten-Andersen began researching the hardness and composition of mussel coatings as a graduate student in Waite's laboratory.

"Our aspiration is to learn some new design principles from nature that we haven't yet actually been using in man-made materials that we can then apply to make man-made materials even better," he said.

Being able to manufacture green materials is another advantage of drawing inspiration from nature. "A lot of our traditional materials are hard to get rid of once we're done with them, whereas nature's materials are obviously made in a way that's environmentally friendly," Holten-Andersen said.

PhysOrg

TStzmmalaysia
post Jan 29 2011, 09:22 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


APPLIED SCIENCES

Attached Image

Ingenious Cardboard Packaging Folds to Fit Parcels of Any Shape

Excessive packaging is responsible for a lot of waste.

Because of this we were really inspired by this flat cardboard sheet that is capable of conforming to the shape of any object, saving a bundle on wasteful filler. Designed by Patrick Sung, the packaging design concept features triangulated perforations that allow it to bend around odd forms. This could also save on fuel for shipping, since all of that wasted box filler is eliminated.

We could see how the concept would not be the most practical for all applications, but it could be really great for mailing a surprise gift to a friend! Soft items like clothing or shoes, or even products that are rigid, like a funky reusable water bottle, could be perfect for this packaging. Not to mention that the perforated lines give the package an interesting graphic pattern style. There is something to be said about the efficiency of boxes that stack, which is why it is great that the sheet can also be folded into standard 6-sided boxes.

Attached Image

Sung has branded his concept the UPACKS (Universal Packaging System).

from: Inhabitat



TStzmmalaysia
post Jan 29 2011, 09:23 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


APPLIED SCIENCES

Attached Image

Automakers Show Interest in an Unusual Engine Design

An engine development company called the Scuderi Group recently announced progress in its effort to build an engine that can reduce fuel consumption by 25 to 36 percent compared to a conventional design. Such an improvement would be roughly equal to a 50 percent increase in fuel economy.
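The consumption-to-economy conversion is simple arithmetic: if a vehicle burns a fraction r less fuel over the same distance, its fuel economy rises by 1/(1-r) - 1, which is why a roughly one-third cut in consumption reads as a 50 percent economy gain.

```python
def economy_increase(consumption_reduction):
    """Convert a fractional fuel-consumption cut into a fuel-economy gain.

    E.g. burning 1/3 less fuel over the same distance means the same
    gallon now covers 1 / (2/3) = 1.5x the miles: a 50% economy gain.
    """
    return 1.0 / (1.0 - consumption_reduction) - 1.0

low = economy_increase(0.25)    # 25% less fuel -> ~33% better economy
high = economy_increase(0.36)   # 36% less fuel -> ~56% better economy
mid = economy_increase(1 / 3)   # a one-third cut -> exactly 50% better
```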

Sal Scuderi says that nine major automotive companies have signed nondisclosure agreements that allow them access to detailed data about the engine. Scuderi says he is hopeful that at least one of the automakers will sign a licensing deal before the year is over. Historically, major automakers have been reluctant to license engine technology because they prefer to develop the engines themselves as the core technology of their products. But as pressure mounts to meet new fuel-economy regulations, automakers have become more interested in looking at outside technology.

Although Scuderi has built a prototype engine to demonstrate the basic design, the fuel savings figures are based not on the performance of the prototype but on computer simulations that compare the Scuderi engine to the conventional engine in a 2004 Chevrolet Cavalier, a vehicle for which extensive simulation data is publicly available, Scuderi says. Since 2004, automakers have introduced significant improvements to engines, but these generally improve fuel economy by roughly 20 percent, compared with the approximately 50 percent improvement the Scuderi simulations show.

There's a big difference, however, between simulation results and data from engines in actual vehicles, says Larry Rinek, a senior consultant with Frost and Sullivan, an analyst firm. "So far things are looking encouraging—but will they really meet the lofty claims?" he says. Automakers should wait to see data from an actual engine installed in a vehicle before they license the technology, he says.

A conventional engine uses a four-stroke cycle: air is pulled into the chamber, the air is compressed, fuel is added and a spark ignites the mixture, and finally the combustion gases are forced out of the cylinder. In the Scuderi engine, known as a split-cycle engine, these functions are divided between two adjacent cylinders. One cylinder draws in air and compresses it. The compressed air moves through a tube into a second cylinder, where fuel is added and combustion occurs.
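
The division of labor described above can be sketched as a simple mapping (the labels are mine, for illustration; they are not Scuderi's terminology):

```python
# Conventional engine: each cylinder performs all four strokes in turn.
conventional = ["intake", "compression", "combustion", "exhaust"]

# Split-cycle engine: the strokes are split between two paired cylinders,
# linked by a crossover passage that carries the compressed air.
split_cycle = {
    "compression cylinder": ["intake", "compression"],
    "power cylinder": ["combustion", "exhaust"],
}

# Together, the paired cylinders still cover the full four-stroke cycle.
assert sorted(sum(split_cycle.values(), [])) == sorted(conventional)
```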

Splitting these functions gives engineers flexibility in how they design and control the engine. In the case of the Scuderi engine, there are two main changes from what happens in a conventional internal-combustion engine. The first is a change to when combustion occurs as the piston moves up and down in the cylinder. The second is the addition of a compressed-air storage tank.

TechnologyReview

TStzmmalaysia
post Jan 29 2011, 09:25 AM



TRANSPORTATION


eStar Electric Vehicle Will Provide Emission Free Inner-City Deliveries

Any city depends on a logistics system that keeps goods flowing efficiently from one point to another. Fleets of vehicles currently travel around the country delivering food, clothing and other stock, but at the cost of heavy emissions. While there are plans to increase the use of rail freight, inner-city deliveries remain plagued by carbon-heavy emissions. So what’s the solution? Enter the eStar Electric Vehicle, a green delivery vehicle that boasts an impressive 4,000-pound payload and can travel 100 miles on a single charge.

Designed by green vehicles specialist Navistar, the eStar aims to press companies to re-evaluate how they outfit their fleets. Navistar says of their ‘peppy’ little vehicle that it is “designed from the ground up to be electric, eStar is more efficient and effective than typical electric conversions.”

It is also easy to charge: the eStar offers a simple 220-volt split-phase charging process that’s quick and efficient. This means the eStar can recharge overnight in approximately 8 hours and be ready for work in the morning.

Spec-wise, the eStar runs on an 80 kWh Li-ion cassette battery providing 102 horsepower and 70 kW of power. With a top speed of 50 mph and zero emissions, the eStar is ideal for inner-city work. The vehicle’s size also gives it room to move: its 36-foot turning diameter helps in navigating small spaces, while the open view from the cab gives the driver a clear 180-degree view.
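
The quoted figures can be sanity-checked with back-of-the-envelope arithmetic (assuming the article's 80 kWh pack, 100-mile range and 8-hour charge; real-world consumption and charger efficiency will vary):

```python
battery_kwh = 80.0    # quoted pack capacity
range_miles = 100.0   # quoted range on a single charge
charge_hours = 8.0    # quoted overnight charge time

energy_per_mile = battery_kwh / range_miles   # kWh used per mile driven
avg_charge_kw = battery_kwh / charge_hours    # average power drawn while charging

print(f"~{energy_per_mile:.1f} kWh/mile, ~{avg_charge_kw:.0f} kW average charging draw")
```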

If you run a business in a big city, be it flower deliveries or catering, and are looking to green up your fleet, this could provide the answer.

Beyond the transport of passengers, to ensure that we are reducing our carbon footprint, we need to consider clean automotive technology in all applications, even the local delivery of goods.

Inhabitat

TStzmmalaysia
post Jan 29 2011, 09:27 AM



RESEARCH


Artificial Hydrogen Tests Quantum Theory

Scientists have created ultra-light and ultra-heavy forms of the element hydrogen, and have investigated their chemical properties.

Donald Fleming, a chemist at the University of British Columbia in Vancouver, Canada, and his colleagues generated two artificial analogues of hydrogen: one with a mass a little over one-tenth that of ordinary hydrogen, and one a little over four times heavier. These pseudo-hydrogens both contain short-lived subatomic particles called muons -- super-heavy versions of the electron.

The researchers tested the behaviour of these new atoms in a chemical reaction called a hydrogen exchange, in which a lone hydrogen atom plucks another from a two-atom hydrogen molecule -- just about the simplest chemical reaction conceivable. In a paper in Science, they report that both the weedy and the bloated hydrogen atoms behave just as quantum theory predicts they should -- which is itself surprising.

The experiment is a "tour de force", says Paul Percival, a muonium chemist at Simon Fraser University in Burnaby, Canada.

"I would never attempt such a difficult task myself," he says, "and when I first saw the proposal I was very doubtful that anything of value could be gained from the Herculean effort. Don Fleming proved me wrong. I doubt if anyone else could have achieved these results."

A normal hydrogen atom contains a single negatively charged electron orbiting a nucleus made of a single positively charged proton. About 0.015% of natural hydrogen consists of the heavy isotope deuterium, in which the nucleus contains a proton and an electrically neutral neutron, and which has a mass twice that of normal hydrogen. And there is a third isotope with a proton and two neutrons: tritium, three times as massive as hydrogen, which is produced in trace quantities by cosmic rays interacting with the atmosphere, but is too dangerously radioactive for use in such experiments.

The chemical behaviour of atoms depends on the number of electrons they have rather than their masses, so the three hydrogen isotopes are chemically almost identical. But the greater mass of the heavy isotopes means that they vibrate at different frequencies, and quantum theory suggests that this will produce a small difference in the rates of their chemical reactions.

To rigorously test that theory, isotopes of hydrogen are needed with greater differences between their masses. Fleming and his colleagues created some, using muons produced by collisions in the TRIUMF particle accelerator in Vancouver.

Muons have many properties similar to electrons, but are more massive. "A muon is an overgrown electron -- an electron on steroids -- with a mass about 200 times that of an electron," says Richard Zare, a physical chemist at Stanford University in California. "But unlike the free electron, the free muon falls apart, with a mean lifetime of about 2.2 microseconds." This meant that the researchers had to work fast to study their pseudo-hydrogen.

To make the ultra-light isotope, they swapped the proton with a positively charged muon, which has just 11% of the mass of a proton. And to make ultra-heavy hydrogen, they replaced one of the electrons in a helium atom with a negative muon.

Helium has two electrons, two protons and two neutrons. But because the negative muon is far more massive than an electron, it orbits the nucleus much more closely, masking the positive charge of one of the protons. In effect, the atom becomes a hydrogen-like composite: a 'nucleus' made of the existing two-proton, two-neutron nucleus and the muon, orbited by the remaining electron. It has a mass a little over four times that of hydrogen.
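
A quick mass tally reproduces both ratios (approximate particle masses in units of the electron mass; values rounded, for illustration only):

```python
m_e = 1.0                # electron
m_mu = 206.77 * m_e      # muon, roughly 207 electron masses
m_p = 1836.15 * m_e      # proton
m_alpha = 7294.30 * m_e  # helium-4 nucleus (alpha particle)

hydrogen = m_p + m_e          # ordinary hydrogen atom
light = m_mu + m_e            # positive muon plus electron ("muonium")
heavy = m_alpha + m_mu + m_e  # helium nucleus plus negative muon plus electron

print(f"light analogue: {light / hydrogen:.3f} of hydrogen's mass")
print(f"heavy analogue: {heavy / hydrogen:.2f} times hydrogen's mass")
```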

Fleming and colleagues found that the reaction rates for hydrogen exchange involving these analogues, as calculated from quantum theory, were close to those measured experimentally. "This gives confidence in similar theoretical methods applied to more complex systems," says Fleming.

The close match between experiment and theory wasn't necessarily to be expected, because quantum calculations use a simplification called the Born-Oppenheimer approximation, which assumes that the electrons adapt their trajectories instantly to any movement of the nuclei. This is generally true for electrons, which are nearly 2,000 times lighter than protons. But it wasn't obvious that it would hold for muons, which have a tenth of the proton's mass.

"It surprises me at first blush that the theoretical treatments hold up so well," says Zare. "The Born-Oppenheimer approximation is based on the small ratio of the mass of the electron to the mass of the nucleus. Yet suddenly the mass of the electron is increased two-hundred-fold and all seems to be well."

Because the muon has such a short lifetime, extending such studies to more chemically complex systems is very challenging. But Fleming and his colleagues propose now to look at the 'hydrogen' exchange reaction between the super-heavy 'hydrogen' and methane (CH4).

Scientific American

TStzmmalaysia
post Jan 29 2011, 09:28 AM



RESEARCH


Carbon Touchscreens Offer Cheaper Alternative

Touchscreens are in – although the technology still has its price. The little screens contain rare and expensive elements. This is the reason why researchers at Fraunhofer are coming up with an alternative display made of low-priced renewable raw materials available all over the world. The researchers are presenting touchscreens that contain carbon nanotubes at the nano tech 2011 fair in Tokyo (Hall 5, Stand E-18-11) from February 16-18.

Just touching it slightly with the tips of your fingers is enough. You can effortlessly write, navigate, open menu windows or rotate images on touchscreens. Within fractions of a second your touch is translated into control commands that a computer understands. At first glance this technology borders on the miraculous, but in real life the mystery is just a wafer-thin electrode under the glass surface of the display, made of indium tin oxide (ITO). This material is nothing short of ideal for use in touchscreens because it is excellent at conducting slight currents and lets the colors of the display pass through unhindered. But there is a little problem: there are very few deposits of indium anywhere in the world. In the long term, manufacturers of electronic gadgets fear they will be dependent on the prices set by suppliers. This is why indium is counted among the so-called "strategic metals."

Therefore, private industry is very interested in alternatives to ITO that are similarly efficient. The researchers at Fraunhofer have succeeded in developing a new electrode material that performs on the same level as ITO and, on top of that, is much cheaper. Its main components are carbon nanotubes and low-cost polymers. The new electrode foil is composed of two layers. One is the carrier, a thin foil made of polyethylene terephthalate (PET), the inexpensive plastic used for making bottles. A mixture of carbon nanotubes and electrically conducting polymers is then applied to the PET as a solution, forming a thin film when it dries.

In comparison to ITO, such polymer coatings have not been particularly durable: humidity, pressure and UV light put a strain on the polymers, and the layers became brittle and broke down. Carbon nanotubes are what make them stable. The nanotubes harden on the PET to create a network in which the electrically conducting polymers can be firmly anchored, making the layer durable in the long run. Ivica Kolaric, project manager at the Fraunhofer Institute for Manufacturing Engineering and Automation IPA, concedes that "the electrical resistance of our layer is somewhat greater than that of the ITO, but it’s easily enough for an application in electrical systems." Its merits are unbeatable: carbon is not only low-cost and available all over the world, it is also a renewable resource that can be obtained from organic matter such as wood. Kolaric and his colleagues will be presenting their carbon touch display at the 2011 nano tech fair, where Fraunhofer researchers have shown their developments every year since 2003.

The new technology has a whole series of potential applications, since the foil is flexible and can be used in a variety of ways. Kolaric sums up by saying, "we could even make photovoltaic foils out of it to line corrugated roofs or other uneven structures." The researcher has already set up a pilot production line where the foil can be refined for a wide range of applications.

nano

TStzmmalaysia
post Jan 29 2011, 09:31 AM



ENERGY


Practical Full-Spectrum Solar Cell Comes Closer

Solar cells are made from semiconductors whose ability to respond to light is determined by their band gaps (energy gaps). Different colors have different energies, and no single semiconductor has a band gap that can respond to sunlight's full range, from low-energy infrared through visible light to high-energy ultraviolet.

Although full-spectrum solar cells have been made, none yet have been suitable for manufacture at a consumer-friendly price. Now Wladek Walukiewicz, who leads the Solar Energy Materials Research Group in the Materials Sciences Division (MSD) at the U.S. Department of Energy's Lawrence Berkeley National Laboratory (Berkeley Lab), and his colleagues have demonstrated a solar cell that not only responds to virtually the entire solar spectrum, it can also readily be made using one of the semiconductor industry's most common manufacturing techniques.

The new design promises highly efficient solar cells that are practical to produce. The results are reported in a recent issue of Physical Review Letters.

How to make a full-spectrum solar cell

"Since no one material is sensitive to all wavelengths, the underlying principle of a successful full-spectrum solar cell is to combine different semiconductors with different energy gaps," says Walukiewicz.

One way to combine different band gaps is to stack layers of different semiconductors and wire them in series. This is the principle of current high-efficiency solar cell technology that uses three different semiconductor alloys with different energy gaps. In 2002, Walukiewicz and Kin Man Yu of Berkeley Lab's MSD found that by adjusting the amounts of indium and gallium in the same alloy, indium gallium nitride, each different mixture in effect became a different kind of semiconductor that responded to different wavelengths. By stacking several of the crystalline layers, all closely matched but with different indium content, they made a photovoltaic device that was sensitive to the full solar spectrum.

However, says Walukiewicz, "Even when the different layers are well matched, these structures are still complex -- and so is the process of manufacturing them. Another way to make a full-spectrum cell is to make a single alloy with more than one band gap."

In 2004 Walukiewicz and Yu made an alloy of highly mismatched semiconductors based on a common alloy, zinc (plus manganese) and tellurium. By doping this alloy with oxygen, they added a third distinct energy band between the existing two -- thus creating three different band gaps that spanned the solar spectrum. Unfortunately, says Walukiewicz, "to manufacture this alloy is complex and time-consuming, and these solar cells are also expensive to produce in quantity."

The new solar cell material from Walukiewicz and Yu and their colleagues in Berkeley Lab's MSD and RoseStreet Labs Energy, working with Sumika Electronics Materials in Phoenix, Arizona, is another multiband semiconductor made from a highly mismatched alloy. In this case the alloy is gallium arsenide nitride, similar in composition to one of the most familiar semiconductors, gallium arsenide. By replacing some of the arsenic atoms with nitrogen, a third, intermediate energy band is created. The good news is that the alloy can be made by metalorganic chemical vapor deposition (MOCVD), one of the most common methods of fabricating compound semiconductors.

How band gaps work

Band gaps arise because semiconductors are insulators at a temperature of absolute zero but inch closer to conductivity as they warm up. To conduct electricity, some of the electrons normally bound to atoms (those in the valence band) must gain enough energy to flow freely -- that is, move into the conduction band. The band gap is the energy needed to do this.

When an electron moves into the conduction band it leaves behind a "hole" in the valence band, which also carries charge, just as the electrons in the conduction band do; holes are simply positive instead of negative.

A large band gap means high energy, and thus a wide-band-gap material responds only to the more energetic segments of the solar spectrum, such as ultraviolet light. By introducing a third band, intermediate between the valence band and the conduction band, the same basic semiconductor can respond to lower and middle-energy wavelengths as well.

This is because in a multiband semiconductor, there is a narrow band gap between the valence band and the intermediate band that responds to low energies. Between the intermediate band and the conduction band is another relatively narrow band gap, one that responds to intermediate energies. And finally, the original wide band gap is still there to take care of high energies.

"The major issue in creating a full-spectrum solar cell is finding the right material," says Kin Man Yu. "The challenge is to balance the proper composition with the proper doping."

In solar cells made of some highly mismatched alloys, a third band of electronic states can be created inside the band gap of the host material by replacing atoms of one component with a small amount of oxygen or nitrogen. In so-called II-VI semiconductors (which combine elements from these two groups of Mendeleev's original periodic table), replacing some group VI atoms with oxygen produces an intermediate band whose width and location can be controlled by varying the amount of oxygen. Walukiewicz and Yu's original multiband solar cell was a II-VI compound that replaced group VI tellurium atoms with oxygen atoms. Their current solar cell material is a III-V alloy. The intermediate third band is made by replacing some of the group V component's atoms -- arsenic, in this case -- with nitrogen atoms.

Finding the right combination of alloys, and determining the right doping levels to put an intermediate band right where it's needed, is mostly based on theory, using the band anticrossing model developed at Berkeley Lab over the past 10 years.

"We knew that two-percent nitrogen ought to do the job," says Yu. "We knew where the intermediate band ought to be and what to expect. The challenge was designing the actual device."

Passing the test

Using their new multiband material as the core of a test cell, the researchers illuminated it with the full spectrum of sunlight to measure how much current was produced by different colors of light. The key to making a multiband cell work is to make sure the intermediate band is isolated from the contacts where current is collected.

"The intermediate band must absorb light, but it acts only as a stepping stone and must not be allowed to conduct charge, or else it basically shorts out the device," Walukiewicz explains.

The test device had negatively doped semiconductor contacts on the substrate to collect electrons from the conduction band, and positively doped semiconductor contacts on the surface to collect holes from the valence band. Current from the intermediate band was blocked by additional layers on top and bottom.

For comparison purposes, the researchers built a cell that was almost identical but not blocked at the bottom, allowing current to flow directly from the intermediate band to the substrate.

The results of the test showed that light penetrating the blocked device efficiently yielded current from all three energy bands -- valence to intermediate, intermediate to conduction, and valence to conduction -- and responded strongly to all parts of the spectrum, from infrared with an energy of about 1.1 electron volts (1.1 eV), to over 3.2 eV, well into the ultraviolet.
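
Those endpoints translate directly into wavelengths via the standard photon-energy relation E (eV) ≈ 1240 / λ (nm):

```python
def ev_to_nm(energy_ev):
    # E = hc / wavelength; with E in eV and wavelength in nm, hc ~= 1240 eV*nm
    return 1240.0 / energy_ev

low_ev, high_ev = 1.1, 3.2  # response range reported for the blocked device
print(f"{low_ev} eV is about {ev_to_nm(low_ev):.0f} nm (near-infrared)")
print(f"{high_ev} eV is about {ev_to_nm(high_ev):.0f} nm (ultraviolet)")
```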

By comparison, the unblocked device responded well only in the near infrared, declining sharply in the visible part of the spectrum and missing the highest-energy sunlight. Because it was unblocked, the intermediate band had essentially usurped the conduction band, intercepting low-energy electrons from the valence band and shuttling them directly to the contact layer.

Further support for the success of the multiband device and its method of operation came from tests "in reverse" -- operating the device as a light-emitting diode (LED). At low voltage, the device emitted four peaks in the infrared and visible regions of the spectrum. Although the material is primarily intended for solar cells, this performance as an LED may suggest additional possibilities for gallium arsenide nitride: it is a dilute nitride very similar to indium gallium arsenide nitride, which is used in commercial "vertical cavity surface-emitting lasers" (VCSELs) -- devices that have found wide use because of their many advantages over other semiconductor lasers.

With the new, multiband photovoltaic device based on gallium arsenide nitride, the research team has demonstrated a simple solar cell that responds to virtually the entire solar spectrum -- and can readily be made using one of the semiconductor industry's most common manufacturing techniques. The results promise highly efficient solar cells that are practical to produce.

ScienceDaily
TStzmmalaysia
post Jan 29 2011, 12:33 PM



ENERGY


Bugs Might Convert Biodiesel Waste Into New Fuel

A strain of bacteria found in soil is being studied for its ability to convert waste from a promising alternative fuel into several useful materials, including another alternative fuel.

A graduate student at The University of Alabama in Huntsville is developing biological tools to make products from crude glycerol -- a waste material from the production of biodiesel. The research is being funded by the National Science Foundation.

Disposing of glycerol has been a problem for the biodiesel industry, according to Keerthi Venkataramanan, a student in UAHuntsville's biotechnology Ph.D. program. "Many companies have had problems disposing of it. The glycerol you get as a byproduct isn't pure, so it can't be used in cosmetics or animal feeds. And purifying it costs three times as much as the glycerol is worth."

The volume of glycerol produced is also daunting: About 100,000 gallons of glycerol is produced with every million gallons of biodiesel manufactured from animal fats or vegetable oils. (In 2009 more than 500 million gallons of biodiesel were produced in the U.S. while more than 2.75 billion gallons were produced in Europe.)
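
Applying the article's roughly one-to-ten glycerol-to-biodiesel ratio to the 2009 production figures gives the scale of the disposal problem:

```python
glycerol_fraction = 100_000 / 1_000_000  # ~10% of biodiesel volume, per the article

us_biodiesel = 500e6   # gallons produced in the U.S. in 2009
eu_biodiesel = 2.75e9  # gallons produced in Europe in 2009

us_glycerol = us_biodiesel * glycerol_fraction
eu_glycerol = eu_biodiesel * glycerol_fraction
print(f"US: ~{us_glycerol / 1e6:.0f} million gallons of crude glycerol")
print(f"Europe: ~{eu_glycerol / 1e6:.0f} million gallons of crude glycerol")
```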

Two major American companies "were made to close biodiesel plants in Europe because they couldn't dispose of their crude glycerol," Venkataramanan said. He is working with the Clostridium pasteurianum bacteria, which "eat" glycerol and produce several potentially useful byproducts.

"This strain is found deep in the soil," he said. "It was originally studied for its ability to 'fix' nitrogen from the air."

The bacteria use glycerol as a carbohydrate source. From it they produce three alcohol byproducts -- butanol, propanediol and ethanol -- plus acetic acid and butyric acid. Butanol is a particularly interesting byproduct.

"Butanol is a big alcohol molecule, twice as big as ethanol," Venkataramanan said. "You can use it as an industrial solvent and it can be used in cars, replacing gasoline with no modifications. It doesn't have some of the problems you have with ethanol, such as rapid evaporation. And ethanol is a two-carbon molecule, but butanol is a four-carbon molecule so its energy value is much higher. In fact, there are plans to use it for jet fuel.

"You can also get butanol from crude oil, but this biological process is less polluting."

In their present form, the bacteria convert about 30 to 35 percent of their glycerol meals into butanol and another 25 to 30 percent into a chemical used to make plastics.
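
Taken at face value, those percentages give a rough mass balance per tonne of crude glycerol (an illustration of the quoted yields, not measured data):

```python
glycerol_kg = 1000.0         # one metric tonne of crude glycerol
butanol_frac = (0.30, 0.35)  # quoted butanol yield range
plastic_frac = (0.25, 0.30)  # quoted yield range for the plastics chemical

butanol_kg = [glycerol_kg * f for f in butanol_frac]
plastic_kg = [glycerol_kg * f for f in plastic_frac]
print(f"butanol: {butanol_kg[0]:.0f}-{butanol_kg[1]:.0f} kg")
print(f"plastics chemical: {plastic_kg[0]:.0f}-{plastic_kg[1]:.0f} kg")
```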

Venkataramanan is looking at different strategies to improve that yield. He is also studying the bacteria's genes to see if a more productive strain can be bioengineered.

Other groups in the U.S. and abroad are studying a variety of fungi, bacteria and algae for glycerol conversion, but Venkataramanan says his strain has several advantages. Some of the bacteria being studied are dangerous pathogens, while Clostridium pasteurianum "is a completely non-pathogenic strain," he said. "An accidental release is not a big deal. You get it from the soil, so if you spill any you're putting it back in the soil."

ScienceDaily

TStzmmalaysia
post Jan 29 2011, 12:35 PM



ENERGY


Agave Fuels Excitement as a Bioenergy Crop

Agave, currently known for its use in the production of alcoholic beverages and fibers, thrives in semi-arid regions where it is less likely to conflict with food and feed production. Agave is a unique feedstock because of its high water use efficiency and ability to survive without water between rainfalls. An article in the current issue of Global Change Biology Bioenergy evaluates the potential of Agave as a sustainable biofuel feedstock.

Scientists found that in 14 independent studies, the yields of two Agave species greatly exceeded the yields of other biofuel feedstocks such as corn, soybean, sorghum, and wheat. Additionally, even more productive Agave species exist that have not yet been evaluated.

According to bioenergy analyst Sarah Davis, "We need bioenergy crops that have a low risk of unintended land use change. Biomass from Agave can be harvested as a co-product of tequila production without additional land demands. Also, abandoned Agave plantations in Mexico and Africa that previously supported the natural fiber market could be reclaimed as bioenergy cropland. More research on Agave species is warranted to determine the tolerance ranges of the highest yielding varieties that would be most viable for bioenergy production in semi-arid regions of the world."

Agave is not only an exciting new bioenergy crop; its economically and environmentally sustainable production could also stimulate economies in Africa, Australia, and Mexico, provided political and legislative challenges are overcome.

ScienceDaily

TStzmmalaysia
post Jan 29 2011, 12:36 PM



BIOTECHNOLOGY


Stem Cells Show Promise in Repairing a Child's Heart

Visionaries in the field of cardiac therapeutics have long looked to the future when a damaged heart could be rebuilt or repaired by using one's own heart cells. A study published in the February issue of Circulation, a scientific journal of the American Heart Association, shows that heart stem cells from children with congenital heart disease were able to rebuild the damaged heart in the laboratory.

Sunjay Kaushal, MD, PhD, surgeon in the Division of Cardiovascular Thoracic Surgery at Children's Memorial Hospital and assistant professor of surgery at Northwestern University Feinberg School of Medicine, who headed the study, believes these results show great promise for the growing number of children with congenital heart problems. With this potential therapy option these children may avoid the need for a heart transplant.

"Due to the advances in surgical and medical therapies, many children born with cardiomyopathy or other congenital heart defects are living longer but may eventually succumb to heart failure," said Kaushal. "This project has generated important pre-clinical laboratory data showing that we may be able to use the patient's own heart stem cells to rebuild their hearts, allowing these children to potentially live longer and have more productive lives."

Cells were obtained from patients ranging in age from a few days after birth to 13 years who were undergoing routine congenital cardiac surgery. Findings show that the number of heart stem cells was greatest in neonates and then rapidly decreased with age, and that the highest numbers of these stem cells are located in the upper right chamber of the heart, or the right atrium. The study also showed that the cardiac stem cells are functional and have the potential for use in repairing the damaged heart. Up until now, heart stem cell studies have addressed the adult diseased heart, but this is the first and largest systematic study to focus on children.

"Heart disease in children is different than heart disease in adults," said Kaushal. "Whereas adults might suffer heart failure from coronary artery disease or atherosclerosis, heart failure in children primarily occurs because they acquire cardiomyopathy or have a congenital condition in which the heart chambers are small or in the wrong position causing the heart to pump inefficiently. The potential of cardiac stem cell therapy for children is truly exciting," said Kaushal. Pending FDA approval, Kaushal hopes to begin clinical trials with children in the fall.

The study was funded by grants from the National Institutes of Health, the Thoracic Surgical Foundation for Research and Education, the Children's Heart Foundation and the North Suburban Medical Research Junior Board.

ScienceDaily



TStzmmalaysia
post Jan 29 2011, 12:37 PM



APPLIED SCIENCES


The Superstar is a Star-Shaped Self-Sustaining City of the Future

The city of tomorrow takes to the skies in an incredible new concept from Beijing-based MAD Architects. Although its spires may look menacing, the aptly named Superstar is a completely self-sustaining city that is capable of producing all of its own power and food while recycling all of its waste. Conceived as a future-forward update to the contemporary Chinatown, the Superstar will travel the globe, supplying its host cities with energy, commerce, and cultural activities.

Looking suspiciously like a Cylon Base Star from Battlestar Galactica, the utopian Superstar is “a fusion of technology and nature, future and humanity”. MAD Architects conceived of the sparkling modern superstructure as an update to the faded facades and cluttered kitsch of Chinatowns around the globe:

“Superstar: A Mobile China Town is MAD’s response to the redundant and increasingly out-of-date nature of the contemporary Chinatown. Rather than a sloppy patchwork of poor construction and nostalgia, the superstar is a fully integrated, coherent, and above all modern upgrade of the 20th century Chinatown model.”

As a completely self-sustaining city, the Superstar will be capable of housing 15,000 people. It will grow its own food, recycle all of its waste, and produce its own power, even feeding some energy back into its host city’s grid.

The Superstar will also be capable of traveling around the globe, sharing Chinese culture with the cities in which it docks. Inside, one can experience fine Chinese cuisine, purchase quality Chinese goods, and participate in cultural events and celebrations. It will also offer health resorts, sports facilities, drinking-water lakes, and even a digital cemetery to remember those who have passed. According to MAD’s website, the first destination of the Superstar will be on the outskirts of Rome, where it will provide “an unexpected, ever-changing future imbedded in the Eternal Past”.

While it is questionable whether or not the full-scale Superstar will ever go on tour, visitors and residents in and around Venice are invited to view a model at the 11th Venice Biennale.

Inhabitat

TStzmmalaysia
post Jan 30 2011, 09:10 AM



APPLIED SCIENCES


New Technique Makes LED Lighting More Efficient

Light-emitting diodes (LEDs) are an increasingly popular technology for use in energy-efficient lighting. Researchers from North Carolina State University have now developed a new technique that reduces defects in the gallium nitride (GaN) films used to create LEDs, making them more efficient.

LED lighting relies on GaN thin films to create the diode structure that produces light. The new technique reduces the number of defects in those films by two to three orders of magnitude. “This improves the quality of the material that emits light,” says Dr. Salah Bedair, a professor of electrical and computer engineering at NC State and co-author, with NC State materials science professor Nadia El-Masry, of a paper describing the research. “So, for a given input of electrical power, the output of light can be increased by a factor of two – which is very big.” This is particularly true for low electrical power input and for LEDs emitting in the ultraviolet range.

The researchers started with a GaN film that was two microns, or two millionths of a meter, thick and embedded half of that thickness with large voids – empty spaces that were one to two microns long and 0.25 microns in diameter. The researchers found that defects in the film were drawn to the voids and became trapped – leaving the portions of the film above the voids with far fewer defects.

Defects are slight dislocations in the crystalline structure of the GaN films. These dislocations run through the material until they reach the surface. By placing voids in the film, the researchers effectively placed a “surface” in the middle of the material, preventing the defects from traveling through the rest of the film.

The voids make an impressive difference.

“Without voids, the GaN films have approximately 10^10 defects per square centimeter,” Bedair says. “With the voids, they have 10^7 defects. This technique would add an extra step to the manufacturing process for LEDs, but it would result in higher quality, more efficient LEDs.”
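The quoted defect densities are easy to sanity-check against the "two to three orders of magnitude" claim; a quick back-of-envelope calculation (illustrative only):

```python
import math

# Defect densities quoted in the article, in defects per square centimeter
without_voids = 1e10   # conventional GaN film
with_voids = 1e7       # film grown with embedded voids

# Improvement factor and its order of magnitude
factor = without_voids / with_voids
orders = math.log10(factor)
print(f"{factor:.0f}x fewer defects ({orders:.0f} orders of magnitude)")
```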

The paper, “Embedded voids approach for low defect density in epitaxial GaN films,” was published online Jan. 17 by Applied Physics Letters. The paper was co-authored by Bedair; Pavel Frajtag, a Ph.D. student at NC State; Dr. Nadia El-Masry, a professor of material science and engineering at NC State; and Dr. N. Nepal, a former post-doctoral researcher at NC State now working at the Naval Research Laboratory. The research was funded by the U.S. Army Research Office.

From: NCSU News

TStzmmalaysia
post Jan 31 2011, 09:09 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


APPLIED SCIENCES

Attached Image

Ingenious BioSphere design for Emergency Habitats

We have seen some beautiful new designs for future cities over the last few years, and these images from Russian architectural firm Remistudio are nothing less than amazing. Remistudio has designed a massive hotel concept that can be erected on land or at sea, is completely self-sustaining and is able to endure extreme floods.

The arch-shaped building, dubbed the Ark, has a structure that enables it to float and exist autonomously on the surface of the water. The Ark was also designed to be a bioclimatic house with independent life-support systems, including elements ensuring a closed-functioning cycle.

The Ark concept, which Remistudio designed in connection with the International Union of Architects’ program “Architecture for Disaster Relief,” can be built in various climates and in seismically dangerous regions because its basement is a shell structure, devoid of ledges or angles. A load-bearing system of arches and cables allows weight redistribution along the entire corpus in case of an earthquake. The building’s clever design enables an optimal relationship between its volume and its outer surface, significantly saving materials and providing energy efficiency. Its prefabricated frame also allows for fast construction.

Attached Image

The Ark constitutes a single energy system. Its shape is convenient for installing photovoltaic cells at an optimal angle toward the sun. The cupola, in the upper part, collects warm air, which is gathered in seasonal heat accumulators to provide an uninterrupted energy supply for the whole complex independently of outer environmental conditions. The heat from the surrounding environment — the outer air, water or ground — is also used.

Alexander Remizov, of Remistudio, said: "For architecture there are two major concerns. The first is maintenance of security and precautions against extreme environmental conditions and climate changes. The second is protection of the natural environment from human activities. The Ark is an attempt to answer the challenges of our time. Provision is made for an independent life-support system. All the plants are chosen according to compatibility, illumination and efficiency of oxygen production, and with the aim of creating an attractive and comfortable space. Through the transparent roof there is enough light for plants and for illuminating the inner rooms."

Attached Image
Attached Image

Inhabitat


TStzmmalaysia
post Jan 31 2011, 09:11 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


BIOTECHNOLOGY

Attached Image

Ancient body clock discovered that helps to keep all living things on time

The mechanism that controls the internal 24-hour clock of all forms of life from human cells to algae has been identified by scientists.

Not only does the research provide important insight into health-related problems linked to individuals with disrupted clocks – such as pilots and shift workers – it also indicates that the 24-hour circadian clock found in human cells is the same as that found in algae and dates back millions of years to early life on Earth.

Two new studies out today in the journal Nature from the Universities of Cambridge and Edinburgh give insight into the circadian clock which controls patterns of daily and seasonal activity, from sleep cycles to butterfly migrations to flower opening.

One study, from the University of Cambridge's Institute of Metabolic Science, has for the first time identified 24-hour rhythms in red blood cells. This is significant because circadian rhythms have always been assumed to be linked to DNA and gene activity, but – unlike most of the other cells in the body – red blood cells do not have DNA.

Akhilesh Reddy, from the University of Cambridge and lead author of the study, said: "We know that clocks exist in all our cells; they're hard-wired into the cell. Imagine what we'd be like without a clock to guide us through our days. The cell would be in the same position if it didn't have a clock to coordinate its daily activities.

"The implications of this for health are manifold. We already know that disrupted clocks – for example, caused by shift-work and jet-lag – are associated with metabolic disorders such as diabetes, mental health problems and even cancer. By furthering our knowledge of how the 24-hour clock in cells works, we hope that the links to these disorders – and others – will be made clearer. This will, in the longer term, lead to new therapies that we couldn't even have thought about a couple of years ago."

For the study, the scientists, funded by the Wellcome Trust, incubated purified red blood cells from healthy volunteers in the dark and at body temperature, and sampled them at regular intervals for several days. They then examined the levels of biochemical markers – proteins called peroxiredoxins – that are produced in high levels in blood and found that they underwent a 24-hour cycle. Peroxiredoxins are found in virtually all known organisms.
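The analysis described above (sampling at regular intervals, then looking for a 24-hour cycle) can be sketched with synthetic data. This is an illustration of period detection in general, not the study's actual method or measurements:

```python
import numpy as np

# Synthetic illustration (not the study's data): a 24 h rhythm sampled
# every 2 hours for 4 days, plus measurement noise.
rng = np.random.default_rng(0)
hours = np.arange(0, 96, 2.0)
rhythm = 1.0 + 0.4 * np.sin(2 * np.pi * hours / 24.0)
samples = rhythm + rng.normal(0.0, 0.05, hours.size)

# Fourier transform of the de-meaned series; the strongest non-zero
# frequency gives the dominant period.
spectrum = np.abs(np.fft.rfft(samples - samples.mean()))
freqs = np.fft.rfftfreq(hours.size, d=2.0)      # cycles per hour
peak = np.argmax(spectrum[1:]) + 1              # skip the DC bin
dominant_period = 1.0 / freqs[peak]
print(f"dominant period: {dominant_period:.1f} h")
```

With real measurements, the sampling interval and duration would come from the experiment; the Fourier peak simply identifies the strongest periodic component.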

A further study, by scientists working together at the Universities of Edinburgh and Cambridge, and the Observatoire Oceanologique in Banyuls, France, found a similar 24-hour cycle in marine algae, indicating that internal body clocks have always been important, even for ancient forms of life.

The researchers in this study found the rhythms by sampling the peroxiredoxins in algae at regular intervals over several days. When the algae were kept in darkness, their DNA was no longer active, but the algae kept their circadian clocks ticking without active genes. Scientists had thought that the circadian clock was driven by gene activity, but both the algae and the red blood cells kept time without it.

Andrew Millar of the University of Edinburgh's School of Biological Sciences, who led the study, said: "This groundbreaking research shows that body clocks are ancient mechanisms that have stayed with us through a billion years of evolution. They must be far more important and sophisticated than we previously realised. More work is needed to determine how and why these clocks developed in people – and most likely all other living things on earth – and what role they play in controlling our bodies."

PhysOrg

TStzmmalaysia
post Feb 1 2011, 11:07 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


ENERGY

Attached Image

Lunar Cubit: Gigantic Solar Pyramids to Power Abu Dhabi

The motto of these sleek black pyramids is “Renewable Energy Can Be Beautiful.” No, the Luxor Casino in Las Vegas did not suddenly develop a green streak — the project, called Lunar Cubit, is a pyramid-shaped solar power complex designed to power thousands of homes in the Abu Dhabi desert.

The proposal placed first in the Land Art Generator Initiative, a contest which asks designers to integrate art and interdisciplinary processes with the concept of renewable energy. Each proposed project must generate enough green energy to power thousands of homes, while also serving as an innovative public art installation.

Lunar Cubit consists of eight small glossy solar panel pyramids that surround a central large pyramid in a semicircle. The pyramids act as a lunar calendar, and the central pyramid is inversely illuminated according to the phases of the moon — meaning it is at full illumination with the new moon. The surrounding smaller pyramids act like the hands of a clock for the eight phases of the lunar calendar, illuminating in different combinations to indicate the waxing or waning of the moon. Each is outfitted with energy-efficient LED lights, of course.
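The inverse-illumination scheme can be sketched in a few lines. The brightness mapping and phase encoding below are hypothetical simplifications, since the article does not specify the actual control logic:

```python
def pyramid_brightness(illuminated_fraction):
    """Central pyramid: fully lit at new moon (0.0), dark at full moon (1.0)."""
    if not 0.0 <= illuminated_fraction <= 1.0:
        raise ValueError("illuminated_fraction must be in [0, 1]")
    return 1.0 - illuminated_fraction

def lit_outer_pyramids(phase_index):
    """One hypothetical encoding: light pyramids 0..phase_index for the
    eight lunar phases (0 = new moon, 4 = full moon, 7 = waning crescent)."""
    if not 0 <= phase_index <= 7:
        raise ValueError("phase_index must be in 0..7")
    return list(range(phase_index + 1))

print(pyramid_brightness(0.0))   # new moon -> fully lit
print(lit_outer_pyramids(3))     # pyramids lit at the third phase
```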

By day, the pyramids function as solar energy-producing power plants. Each of the frameless solar panels is made of glass and amorphous silicon, and they’re able to produce enough renewable energy to power 250 homes. That may not seem as productive as a solar power farm, yet it is truly exceptional considering it is also a public art installation. If actually constructed, Lunar Cubit would pay back its cost of construction in five years, through the megawatt-hours of clean renewable energy that it produces.

Lunar Cubit, designed by Robert Flottemesch, Jen DeNike, Johanna Ballhaus, and Adrian P. De Luca, combines the power of the moon, ancient measurement (it is proportional to the Great Pyramid of Cheops in Giza), and renewable energy. The Land Art Generator Initiative is currently seeking partnership to start construction, and if completed, it is anticipated to become a tourist destination in itself.

Inhabitat


TStzmmalaysia
post Feb 1 2011, 11:09 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


APPLIED SCIENCES

Attached Image

Energy-efficient intelligent house that can learn our routines

The first home in the UK that can learn from its residents and take decisive action will be unveiled at a competition in Cairo this week.

InterHome, a model for a home developed by researchers at the University of Hertfordshire, incorporates modular custom design units and draws on standard home automation systems which have been adapted so that the house ‘learns’ and ‘adapts’ to its users’ lifestyles.

It will be unveiled at the finals of the Microsoft Imagine Cup, which will be held in Cairo from 3-7 July. The prototype of the home, which has been developed in a doll’s house, integrates embedded devices with the industry-standard X10 protocol, providing convenience and security to the home owner while also enabling them to reduce energy use and cut greenhouse gas and carbon emissions.

InterHome incorporates an intuitive touch-screen control panel and also allows the house to be monitored and controlled using web browsers, Windows Mobile devices and any SMS-capable mobile phone.

“InterHome improves on its competitors by being modular, adaptable and able to ‘learn’ our routines,” said Johann Siau, Senior Lecturer at the University’s School of Electronic, Communication & Electrical Engineering. “The technology enables the system to learn rapidly when we need the lights on or whether we are at home or at work and how the house needs to be at certain times of the day. If we forget to lock the front door or turn off the lights, it can text us and our response can reprogramme the system.”

Through this approach, InterHome can cut wasted energy within UK homes and, if installed in enough houses, make a measurable difference to CO2 emissions. The prototype is now ready to be adopted by industry, and the team led by Johann Siau has been approached by several companies and is in discussion with the Building Research Establishment. The other members of the team are Ellis Percival and Carol Chen.

Developments in 'home intelligence' like this will continue to improve our systems. Some day every home could have a centralized computer that monitors and regulates data from in and around the house. Fed with the user's personal preferences and patterns, the software could evolve over time, learning each individual's needs and habits to reach peak efficiency for every household — reducing energy waste while maximizing comfort.
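A routine-learning system of the kind described can be sketched minimally. This toy model is hypothetical (InterHome's internals aren't public); it learns the typical lighting state for each hour of the day from observation logs:

```python
from collections import defaultdict

# Toy sketch of routine learning (not InterHome's actual software):
# record when the lights are on, then predict the usual state per hour.
class RoutineLearner:
    def __init__(self):
        self.on_counts = defaultdict(int)
        self.totals = defaultdict(int)

    def observe(self, hour, lights_on):
        """Log one observation of the lighting state at a given hour."""
        self.totals[hour] += 1
        if lights_on:
            self.on_counts[hour] += 1

    def expected_on(self, hour):
        """Majority vote over the history: are the lights usually on?"""
        if self.totals[hour] == 0:
            return False
        return self.on_counts[hour] / self.totals[hour] >= 0.5

learner = RoutineLearner()
for day in range(14):                             # two weeks of history
    for hour in range(24):
        learner.observe(hour, 18 <= hour <= 23)   # evenings lit

print(learner.expected_on(20))   # usually on at 8 pm
print(learner.expected_on(3))    # usually off at 3 am
```

A real system would compare the learned expectation against the current state and alert the owner (e.g. by SMS) on a mismatch.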

Herts

TStzmmalaysia
post Feb 1 2011, 11:12 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


APPLIED SCIENCES

Attached Image Attached Image Attached Image

Behold another innovative design joining the ranks of promising architecture: the Envision Green Hotel, designed by Richard Moreta Architecture with interiors by Miami-based MRA Design for Hospitality Design’s Radical Innovation competition.

Part wind tower, part urban eco-resort, and all egg, this “lobular” structure is touted as one of the most recognizable landmarks for the city in which it would ultimately be…laid.

The Envision Green Hotel combines wind technology and resort architecture for a one-of-a-kind eco-friendly design. Shaped like a giant egg, it is just a concept for now, but having been submitted for Hospitality Design’s Radical Innovation competition, you never know if it will become more.

Designed by Richard Moreta Architecture, the Envision Green Hotel features interiors by Miami-based MRA Design. Powered by a photovoltaic exterior sheathing, the interior is rife with fresh gardens. On the whole, this building is like a living organism.

Operating like a living organism, the Envision literally breathes through its wind and atmospheric conversion systems, which allow natural air into the interior of the building without mechanical intervention. Photovoltaic exterior sheathing provides the building’s energy, while indoor gardens at various levels of the structure act as upward extensions of the earth, creating mini-microclimates that filter the air and act as added insulation. Recycled pools of water around the structure serve as catch basins, water reservoirs, fire barriers, and indispensable decorative aquatic features. Power from the wind turbine heats the boiler and creates steam for the chiller water plant beneath the structure to cool and heat the hotel.

Within the hotel, rooms would be designed on a 4-foot module to conform to standard-sized materials and reduce construction waste. A high-efficiency LED system would illuminate the interiors, and non-toxic, non-off-gassing finishes would be employed. Besides the typical water-efficient fixtures, this eco resort would use recovered rainwater for flushing and irrigation. The rooms would also include a mood pad control unit that would allow each guest to control the lighting and choose groovy digital images that would reflect behind glass walls and ceilings.

To help this giant eco-egg blend even more subtly into the surrounding urban context, exterior LED curtain walls would change color throughout the night to indicate the progression of time, making the Envision glow in the night like a giant, moody Fabergé egg.

Inhabitat

TStzmmalaysia
post Feb 1 2011, 11:16 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


ENERGY

Attached Image

Capturing More Light with a Single Solar Cell

The most efficient solar cells typically have several layers of semiconductor materials, each tuned to convert different colors of light into electricity. Researchers at Lawrence Berkeley National Lab have now made a single semiconductor that performs almost the same job. More importantly, they made the material using a common manufacturing technique, suggesting it could be made relatively inexpensively.

Several research groups are developing semiconductor materials that harness more of the energy in sunlight, based on an idea that dates back to 1960 for changing how semiconductor materials in solar cells interact with light. But the materials used in that research tend to be very difficult to make.

Much work remains before the Lawrence Berkeley lab material could be used in a practical solar cell, but in theory it could convert nearly half of the energy in sunlight into electricity—three times as much as most single-layer (or single-junction) solar cells. Such a solar cell could also cost less than the layered (or multi-junction) solar cells currently needed to achieve high efficiencies, since it would require only one semiconductor material.

In a conventional semiconductor material, it takes a certain amount of energy to free an electron and generate electricity. Photons that have less energy—say, the photons in infrared light—don't generate electricity. And if a photon has more than the minimum—for instance a photon in energetic ultraviolet light—the extra energy is wasted as heat.
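The band-gap argument can be made concrete with photon energies, E = hc/λ. The numbers below are textbook values, not figures from the article; ~1.42 eV is the approximate room-temperature band gap of unmodified gallium arsenide:

```python
# Photon energy E = hc / wavelength, with hc ~ 1239.84 eV*nm, compared
# with the ~1.42 eV band gap of plain gallium arsenide (textbook
# values, not figures from the article).
HC_EV_NM = 1239.84
GAAS_GAP_EV = 1.42

for name, wavelength_nm in [("infrared", 1500), ("red", 650), ("ultraviolet", 300)]:
    energy_ev = HC_EV_NM / wavelength_nm
    absorbed = energy_ev > GAAS_GAP_EV
    note = "absorbed, excess becomes heat" if absorbed else "passes through unused"
    print(f"{name:12s} {wavelength_nm:4d} nm  {energy_ev:.2f} eV  {note}")
```

The intermediate "stepping stone" states described next let the material harvest those sub-gap infrared photons in two steps instead of wasting them.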

The new semiconductor material is based on gallium arsenide. Normally, this material requires high-energy photons to generate electricity. But the researchers modified it so that the energy from more than one photon is used to free an electron—energy adds up until an electron is freed. Replacing some of the arsenic atoms in the material with nitrogen atoms creates regions that act as stepping stones for electrons that have absorbed some energy from low-energy photons, where they can wait to receive energy from more photons, says Wladek Walukiewicz, who leads the Solar Energy Materials Research Group at the Lawrence Berkeley lab, and also led the project.

Technology Review

TStzmmalaysia
post Feb 1 2011, 11:17 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


TRANSPORTATION

Attached Image

Around the world in 0.083 days: Acabion's vision for future transport

Acabion foresees elevated roadways will be needed to accommodate the streamliner's speed

Pneumatic Futurama-style transport systems were proposed as far back as the late 1800s, following the invention of pneumatic tubes for carrying mail around buildings. Swiss company Acabion sees such vacuum tube-based mass transport systems becoming a reality by 2100 and has conceived a vehicle capable of traveling at speeds of almost 12,500 mph (20,000 km/h) on such a platform. The company envisages a global network that would let users circle the globe in less than two hours and make transcontinental journeys possible in less than the time it currently takes to get across town.
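The headline's "0.083 days" follows directly from the quoted top speed and Earth's equatorial circumference:

```python
# Sanity check on the headline figure using Earth's equatorial
# circumference and the proposed vacuum-tube top speed.
EQUATOR_KM = 40_075
SPEED_KMH = 20_000

hours = EQUATOR_KM / SPEED_KMH
days = hours / 24.0
print(f"{hours:.2f} hours = {days:.3f} days")  # ~2 hours, ~0.083 days
```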

The first step in Acabion’s grand vision for the future is the latest version of its GTBO road-ready streamliner – the GTBO VII “da vinci.” This fully electric vehicle would have a top speed of 373 mph (600 km/h) and would be orders of magnitude more efficient than a current fully electric compact car.

Thanks to its reduced projected area, turbulence and aerodynamic drag, weight and rolling resistance, Acabion says the vehicle is eight times (800 percent) more efficient than a current fully electric vehicle at 12.4 mph (20 km/h), and ten times more efficient at 124 mph (200 km/h). The company claims efficiencies 25 times (2,500 percent) greater than such vehicles are ultimately possible.

Attached Image

The GTBO has been designed for speed and efficiency. Like the Zerotracer, it drives on two main wheels like a motorcycle, with two additional side wheels activated when driving at slow speeds or for parking. Acabion hopes to start selling its streamliners by 2015 for an estimated US$3 million but says prices will drop with mass production.

The company anticipates that, due to the streamliner’s speed potential, by 2050 new elevated roadways – like those mooted for cyclists in the Kolelinia concept – will be needed to separate it from its dilly-dallying forebears.

These fully automated high speed tracks would initially transport people at speeds of around 186.4 mph (300 km/h), before stepping up to 373 mph (600 km/h) in subsequent decades. The tracks would be used for both city and continental mid- and long-range trips with a 1,700+ mile (2,735 km) trip from Los Angeles to Memphis that would currently take more than a day cut to around four hours.

Additionally, the vehicles wouldn’t rely on their own battery packs for power but would draw their energy inductively from the roads themselves, which would be supplied with 100 percent solar power.

Attached Image

But even elevated tracks won’t suffice for the speeds people will be expecting by 2100. For long continental and intercontinental journeys Acabion envisages a global network of maglev-driven vacuum tubes, dubbed the “traffic internet”, which will make it possible to travel at speeds of 12,427 mph (20,000 km/h). The company says that public commuter vacuum tube transport systems, which would require larger tubes, wouldn’t be feasible, but a 10-foot (3 m) diameter tube that would fit a streamliner would be. Such a system would still transport the same number of passengers, as it would allow a constant stream of vehicles instead of trains spaced at regular intervals. The network would not only cross land, but also stretch through oceans, making a 30 minute commute from New York to Paris or San Francisco to Prague a reality.

While such a vision might seem outlandish now, Acabion is confident vacuum tube transport systems will come – and when they do, the company hopes its streamliners will be right inside, ferrying drivers around the globe at breakneck speed.

GizMag


This post has been edited by tzmmalaysia: Feb 1 2011, 11:19 PM
TStzmmalaysia
post Feb 2 2011, 09:17 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


ENERGY

Attached Image

Green machine: Fill up your car with hydrogen beads


Filling up your tank with hydrogen could be as simple as topping up with petrol thanks to a liquid fuel made of plastic nanobeads.

Storing hydrogen aboard cars has long been a headache for engineers, as liquid hydrogen must be held at extremely low temperatures, while the compressed gas requires very large tanks. Another compact option is materials called hydrides, which release their hydrogen when heated.

But there are problems to be overcome, says Stephen Bennington, chief scientific officer at Cella Energy in Didcot, UK. "If you expose them to air some of them will burst into flames, while others will get an oxide coating on their surface so they no longer work," he says.

The materials also tend to release their hydrogen content very slowly, over around 30 to 60 minutes, making them unsuitable for use in cars – you just want to turn the key and go, he adds.

Bead craft

To get around these problems, Cella is developing nanobeads of ammonia-borane hydride. These are protected from oxygen in the air by a porous polymer coating, meaning the fuel is safe to use. And when heated to around 80 °C they discharge their hydrogen content in just a few minutes.

Since the beads behave as a fluid, they could be transported with only minor changes to the existing infrastructure, says Bennington. "So you could pump them into the car, heat up the hydrides and drive off using up the hydrogen," he says. "Then the waste beads would be directed into a separate tank and taken out of the car to be rehydrogenated elsewhere."

To produce the hydride beads, the team used a technique called coaxial electrospinning, in which an electrical charge draws small pellets from a liquid – in this case the ammonia-borane hydride liquid surrounded by the polymer.

However, unlike conventional coaxial electrospinning, in which the two materials do not mix, the team allowed them to mix slightly. This resulted in the production of nanobeads with porous, foam-like structures. It is this structure that allows the beads to discharge their hydrogen so quickly, says Bennington.

The coating also filters out chemicals such as ammonia and borazine that can be released by conventional hydrides and can damage fuel cells.

Gas matters

As well as acting as a fuel in their own right, the beads could also reduce the greenhouse gas emissions produced by gasoline if used as an additive, says Bennington. "If you added 20 per cent of this additive, you would remove 30 per cent of the fuel's carbon dioxide emissions," he says.

However, he admits the company still has a lot of work to do to establish how much of the liquid can be added to fuels without affecting their viscosity, for example.

What's more, the ammonia-borane beads are not currently easy to restock with hydrogen after use, so the company is now working with researchers at the University of Oxford and University College London to investigate other hydride materials they can encapsulate in their porous polymer nano-beads.

New Scientist

TStzmmalaysia
post Feb 2 2011, 09:20 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


APPLIED SCIENCES

Attached Image

3D-Printer Creates Largest Rubik's Cube in the World

They said it couldn’t be done, but Oskar van Deventer—a longtime puzzle maker living in the Netherlands—created it anyhow: a 17-by-17-by-17 tile Rubik’s cube that, as far as we know, is an unofficial world record for the world’s largest and most complex Rubik’s puzzle.

Oskar started designing his puzzles as a boy at the age of 12 in the Netherlands. More than 30 years later, he has a reputation as one of the world's most prolific puzzle creators. Oskar first started 3D printing twisty puzzles thanks to Bram Cohen, who began posing challenges to Oskar back in 2008.

Van Deventer, an electrical engineer by trade, didn’t carve out the 1,539 constituent pieces by hand, but rather tapped the increasing availability of 3-D printing technology to manufacture his high-tech puzzle. That’s not to say it was easy; the design itself took more than 60 hours (and three tries) before a successful prototype was printed at Shapeways, an online commercial provider of rapid prototyping technologies. Dyeing the individual pieces required another 10 hours.

The finished product, titled “Over the Top,” measures 5.5 inches (17 tiles) along each side when assembled, blowing the current record of 12-by-12-by-12 out of the water (though, as noted above, this world record is currently unofficial).
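The piece count scales sharply with cube size. A naive geometric count, which ignores the internal mechanism and so differs slightly from the 1,539 printed parts reported for the actual puzzle, goes like this:

```python
# Naive counts for an n-by-n-by-n twisty puzzle: outer cubies are the
# full cube minus the hidden (n-2)^3 core, and each face shows n*n tiles.
def surface_pieces(n):
    return n**3 - (n - 2)**3

def tiles(n):
    return 6 * n * n

print(surface_pieces(3), tiles(3))    # classic cube: 26 pieces, 54 tiles
print(surface_pieces(17), tiles(17))  # 1538 pieces, 1734 tiles
```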

Want to give it a shot?



PopSci




TStzmmalaysia
post Feb 2 2011, 09:21 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


APPLIED SCIENCES

Attached Image

When Buildings Think With Their Surroundings

Karl Chu knows he is a man way ahead of his time. It’s a time when, he posits, humans have transcended their bodies to exist on multiple planes, contribute to a global brain, and write apps with their genomes.

But it’s the implications for architecture that are really exciting, says Chu. The founder of the innovative architectural firm metaxy, he imagines “genetic architects” creating buildings and other objects that can build themselves, that are endowed with a certain kind of intelligence, and that make up a massive "self-aware" built ecosystem.

As George Dvorsky writes at the Institute for Ethics and Emerging Technologies blog,

QUOTE
Future "genetic buildings" could, for example, be self-assessing, self-healing and self-modifying, thus minimizing their need to be repaired or maintained by external sources. They will morph, process, and react. These buildings could even meet the needs of its inhabitants by sensing the moods or health of its occupants and act accordingly. Needless to say, the potential for sustainability is substantial.
Chu spoke about genetic architecture (not to be confused with the genomic term) at TEDxBrooklyn recently, showing two hundred and fifty slides in about 20 minutes. Watch it here, ponder it, maybe giggle; just don’t expect to get it right away.

Though it sounds like science-fiction (the connections Chu draws with quantum mechanics and the multiverse verge toward the poetic), his vision can trace some of its roots to the tenets of the organic architecture that Frank Lloyd Wright helped popularize, and to the 1960s, when architects around the world blended utopian ideals with the promise of new technologies.

The Japanese Metabolists, for instance, envisioned large scale, flexible, and expandable structures that echoed the processes of organic growth; in the U.S. Nicholas Negroponte coined the idea of a responsive architecture that was mechanically and dynamically integrated with its surroundings, an idea that lives on in projects like Columbia’s Living Architecture Lab, in design philosophies like biomimicry, and in concepts like the digitally-networked intelligent city.

Since ancient times, architecture has been linked closely with its surroundings. But in an era when our present-day engineering capacity proposes a less synchronous path, one not just of strip malls and parking lots and mountain-top removal, but also of entire landforms and climates being geo-engineered, the prospect of genetic architecture sounds like a tantalizing corrective. Or, at the very least, an exciting prompt for thinking about how we want to design our future.

Treehugger



TStzmmalaysia
post Feb 2 2011, 09:23 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


SPACE SCIENCE

Attached Image

NASA's NEOWISE completes scan for asteroids and comets

NASA's NEOWISE mission has completed its survey of small bodies, asteroids and comets, in our solar system. The mission's discoveries of previously unknown objects include 20 comets, more than 33,000 asteroids in the main belt between Mars and Jupiter, and 134 near-Earth objects (NEOs). The NEOs are asteroids and comets with orbits that come within 45 million kilometers (28 million miles) of Earth's path around the sun.

NEOWISE is an enhancement of the Wide-field Infrared Survey Explorer, or WISE, mission that launched in December 2009. WISE scanned the entire celestial sky in infrared light about 1.5 times. It captured more than 2.7 million images of objects in space, ranging from faraway galaxies to asteroids and comets close to Earth.

In early October 2010, after completing its prime science mission, the spacecraft ran out of the frozen coolant that keeps its instrumentation cold. However, two of its four infrared cameras remained operational. These two channels were still useful for asteroid hunting, so NASA extended the NEOWISE portion of the WISE mission by four months, with the primary purpose of hunting for more asteroids and comets, and to finish one complete scan of the main asteroid belt.

"Even just one year of observations from the NEOWISE project has significantly increased our catalog of data on NEOs and the other small bodies of the solar system," said Lindley Johnson, NASA's program executive for the NEO Observation Program.

Now that NEOWISE has successfully completed a full sweep of the main asteroid belt, the WISE spacecraft will go into hibernation mode and remain in polar orbit around Earth, where it could be called back into service in the future.

In addition to discovering new asteroids and comets, NEOWISE also confirmed the presence of objects in the main belt that had already been detected. In just one year, it observed about 153,000 rocky bodies out of approximately 500,000 known objects. Those include the 33,000 that NEOWISE discovered.

NEOWISE also observed known objects both closer to and farther from us than the main belt, including roughly 2,000 asteroids that orbit along with Jupiter, hundreds of NEOs and more than 100 comets.

These observations will be key to determining the objects' sizes and compositions. Visible-light data alone reveal how much sunlight reflects off an asteroid, whereas infrared data is much more directly related to the object's size. By combining visible and infrared measurements, astronomers also can learn about the compositions of the rocky bodies -- for example, whether they are solid or crumbly. The findings will lead to a much-improved picture of the various asteroid populations.
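The visible/infrared complementarity described here underlies a standard asteroid sizing relation, D = (1329 km / √p) · 10^(−H/5), where H is the absolute visual magnitude and p the geometric albedo (which the infrared flux helps constrain). The example values below are illustrative, not from the article:

```python
import math

# Standard asteroid sizing relation (not specific to this article):
# D [km] = 1329 / sqrt(albedo) * 10^(-H / 5)
def asteroid_diameter_km(h_magnitude, albedo):
    return 1329.0 / math.sqrt(albedo) * 10 ** (-h_magnitude / 5.0)

# Illustrative values: the same brightness (H = 18) implies a much
# larger body if the surface is dark than if it is reflective.
dark = asteroid_diameter_km(18.0, 0.05)
bright = asteroid_diameter_km(18.0, 0.25)
print(f"dark surface:   {dark:.2f} km")
print(f"bright surface: {bright:.2f} km")
```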
NEOWISE took longer to survey the whole asteroid belt than WISE took to scan the entire sky because most of the asteroids are moving in the same direction around the sun as the spacecraft moves while it orbits Earth. The spacecraft's field of view had to catch up to, and lap, the movement of the asteroids in order to see them all.

"You can think of Earth and the asteroids as racehorses moving along in a track," said Amy Mainzer, the principal investigator of NEOWISE at NASA's Jet Propulsion Laboratory in Pasadena, Calif. "We're moving along together around the sun, but the main belt asteroids are like horses on the outer part of the track. They take longer to orbit than us, so we eventually lap them."
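Mainzer's racetrack analogy corresponds to the synodic period, the time it takes Earth to lap an outer body. As a back-of-the-envelope sketch (the 2.8 AU orbit and the Kepler's-third-law shortcut are illustrative assumptions, not mission numbers):

```python
def synodic_period_years(t_inner, t_outer):
    """Time (years) for the inner body to lap the outer one, from
    1/T_syn = 1/T_inner - 1/T_outer (both periods in years)."""
    return 1.0 / (1.0 / t_inner - 1.0 / t_outer)

# Earth (1 yr) vs. a main-belt asteroid at ~2.8 AU; by Kepler's third law
# its period is roughly 2.8**1.5 ≈ 4.7 years.
lap = synodic_period_years(1.0, 2.8 ** 1.5)  # ≈ 1.27 years
```

That lap time of a bit over a year is why one full sweep of the belt took NEOWISE longer than WISE's all-sky scan.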

NEOWISE data on the asteroid and comet orbits are catalogued at the NASA-funded International Astronomical Union's Minor Planet Center, a clearinghouse for information about all solar system bodies at the Smithsonian Astrophysical Observatory in Cambridge, Mass. The science team is analyzing the infrared observations now and will publish new findings in the coming months.

When combined with WISE observations, NEOWISE data will aid in the discovery of the closest dim stars, called brown dwarfs. These observations have the potential to reveal a brown dwarf even closer to us than our closest known star, Proxima Centauri, if such an object does exist. Likewise, if there is a hidden gas-giant planet in the outer reaches of our solar system, data from WISE and NEOWISE could detect it.

The first batch of observations from the WISE mission will be available to the public and astronomical community in April.

"WISE has unearthed a mother lode of amazing sources, and we're having a great time figuring out their nature," said Edward (Ned) Wright, the principal investigator of WISE at UCLA.

PhysOrg

TStzmmalaysia
post Feb 3 2011, 02:25 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


SPACE SCIENCE

Attached Image

NASA finds planets a plenty outside solar system

This is a Jan. 2011 handout artist's rendering provided by NASA. NASA's Kepler telescope is finding that relatively smaller planets (still larger than Earth, but tinier than Jupiter) are proving more common outside our solar system than once thought. This drawing is of one of the smallest planets Kepler has found, a rocky planet called Kepler-10b, which measures 1.4 times the size of Earth and where the temperature is more than 2,500 degrees Fahrenheit. (AP Photo/Dana Berry, SkyWorks Digital Inc., Kepler Mission, NASA Ames Research Center)

(AP) -- NASA's planet-hunting telescope is finding whole new worlds of possibilities in the search for alien life. An early report from a cosmic census indicates that relatively small planets and stable multi-planet systems are far more plentiful than previous searches showed.

NASA released new data Wednesday from its Kepler telescope on more than 1,000 possible new planets outside our solar system - more than doubling the count of what astronomers call exoplanets. They haven't been confirmed as planets yet, but some astronomers estimate that 90 percent of what Kepler has found will eventually be verified.

Kepler, launched in 2009, has been orbiting the sun between Earth and Mars, conducting a planet census and searching for Earth-like planets since last year. It has found there are more planets that are much smaller than Jupiter - the biggest planet in our solar system - than there are giant planets.

Some of these even approach Earth's size. That means they are better potential candidates for life than the behemoths that are more easily spotted, astronomers say. While Kepler hasn't yet found planets that are as small as Earth, all the results are "pointing in the right direction," said University of California Santa Cruz astronomer Jonathan Fortney, a Kepler researcher.

Yale University exoplanet expert Debra Fischer, who wasn't part of the Kepler team but serves as an outside expert for NASA, said the new information "gives us a much firmer footing" in eventual hopes for worlds that could harbor life. "I feel different today knowing these new Kepler results than I did a week ago," Fischer said.

PhysOrg


TStzmmalaysia
post Feb 3 2011, 02:26 PM



TRANSPORTATION

Attached Image

UA engineers study hybrid systems to design robust unmanned vehicles

The UA College of Engineering's Hybrid Dynamics and Control Laboratory is developing mathematical analysis and design methods that could radically advance the capabilities of unmanned aircraft and ground vehicles, as well as many other systems that rely on autonomous decision making.
Researchers in the lab design computer control systems that may one day allow robotic surveillance aircraft to stay aloft indefinitely. These systems also might be used to safely guide aircraft and automobiles through small openings as they enter buildings. Or they could help airplanes and ground vehicles navigate in cluttered environments without colliding.

In addition, the research can be applied to multiple programmable devices aboard vehicles or in stationary locations, allowing them to communicate in the presence of adversaries.

The lab's research focuses on mathematical analysis and design of control systems that have applications in robotics, biology and aerospace engineering.

"What we do here in our lab is mainly theory," said Ricardo Sanfelice, an assistant professor of aerospace and mechanical engineering, who directs the lab. "We model dynamical systems, analyze them mathematically, devise ways to control them, test them in simulations and, when possible, validate them in our test bed.

"But, because of the complexity of movement in some systems that can include sudden transitions in speed or direction, we have to be very careful to be sure computer simulations reflect the real world, and that's where the experimental lab provides a place for us to check our results."

The lab, which is located in UA's Aerospace and Mechanical Engineering building, consists of a computer room, where Sanfelice and his students devise the computer control systems, and a cavernous test lab, topped with eight motion-capture cameras.

The cameras, which were originally designed to create animated figures from the recorded movements of humans and animals, sit on a rail 20 feet off the floor and track the movements of radio-controlled model airplanes, helicopters and automobiles that are flown or driven by computers. The cameras take the place of satellites in this indoor GPS system.
"We can test our theories 24/7 in this test lab without the weather constraints involved in outdoor testing," Sanfelice said. "We mathematically model the systems we want to control and design a set of computer instructions to accomplish a particular task, such as hovering."

Testing outdoors is more time consuming and costly, and conditions are more difficult to control, he explained. Testing indoors allows the computing brains to stay safely on the ground making decisions based on data coming from the motion-capture cameras. The cameras continuously record the vehicle's position, orientation, and velocity.

"When we transition to outdoor testing, the computer has to be onboard, like a traditional airplane autopilot, receiving information from satellite-based GPS systems, whereas in the lab, the cameras function as the GPS," he said.
"After extensive testing indoors, we're in a better position to use our resources more efficiently when we transfer to outdoor, real-world experiments to validate and fine tune our controllers."
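As a generic illustration of how camera fixes can stand in for GPS (a sketch of the idea, not the lab's actual software), a vehicle's velocity can be recovered by finite-differencing successive 3-D position samples from the motion-capture system:

```python
# Hypothetical sketch: the motion-capture cameras report a position fix
# for the vehicle every dt seconds; velocity is the finite difference of
# two consecutive fixes.
def estimate_velocity(p_prev, p_curr, dt):
    """Finite-difference velocity (units/s) from two 3-D position fixes."""
    return tuple((c - p) / dt for p, c in zip(p_prev, p_curr))

# Two fixes 0.1 s apart: the vehicle moved 0.5 m in x and climbed 0.2 m.
v = estimate_velocity((0.0, 0.0, 1.0), (0.5, 0.0, 1.2), 0.1)
```

A real pipeline would filter this estimate (e.g. with a Kalman filter) to smooth camera noise, but the finite difference is the core of turning position fixes into the velocity data the controller consumes.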

Sanfelice and his students currently are studying ways to extract energy from wind gusts and thermals to gain altitude without using power, just like birds do when soaring to greater altitudes. "This is very different from traditional control system design, where you want to nullify the effects of perturbations. Here, we're exploiting them," he said.

Sanfelice noted that hybrid control system theory is a relatively new field, having evolved during the past 20 years or so. As a result, theoretical tools for analysis, design, and simulation of hybrid control systems are in the early stages of development. "We are developing a toolbox for such systems, to make them more designer- and user-friendly," he explained. "We hope that our simulation software for these systems will eventually become part of a commercial simulation product."

PhysOrg

TStzmmalaysia
post Feb 3 2011, 02:27 PM



APPLIED SCIENCES

Attached Image Attached Image Attached Image

Cactus Tower Design

The Minister of Municipal Affairs & Agriculture (MMAA) in Qatar is getting a brand new office building that takes the form of a towering cactus.

Designed by Bangkok-based Aesthetics Architects, the modern office and adjoining botanical dome take cues from cacti and the way that they successfully survive in hot, dry environments.

Qatar is fairly barren, covered by sand, and receives an average annual rainfall of 3.2 inches. Qatar has constructed spectacular buildings that can be very efficient in the hot desert environment. Aesthetics Architects GO Group decided to model the MMAA's new office upon the cactus, taking inspiration from the way these plants deal with the scorching desert climate.

An excellent example of desert architecture, the MMAA's new building is designed to be very energy efficient and utilizes sun shades on its windows. Depending on the intensity of the sun during the day, the sun shades can open or close to keep out the heat when it is too much. This is similar to how a cactus chooses to perform transpiration at night rather than during the day in order to retain water – another great example of biomimicry. The botanic dome at the base of the tower will house a botanical garden. Hopefully it will include an edible garden and a living machine as well.

Inhabitat

TStzmmalaysia
post Feb 3 2011, 02:30 PM



APPLIED SCIENCES

Attached Image Attached Image Attached Image

Office Building Covered In A Wall Of Solar Panels

Another good looking design has caught our attention. C. F. Møller's latest project - an office building for the Municipality of Aarhus, Denmark and it's covered in solar panels!

We are thrilled to see building integrated solar panels at work on a real building. Not only is the office a striking example of energy architecture, but it's also designed to German Passive House standards, meaning it needs practically no energy for heating or cooling.

The new office building houses the Technical Administration of the Municipality of Aarhus, and from the start the goal for the building’s design was to create an example of progressive office building construction. The city and C. F. Møller also wanted the energy consumption to be at a ‘passive house’ standard and to supply the employees with good indoor air quality. The building has a total heat consumption of a maximum of 15 kWh/m2/year, and an overall energy consumption of at most 50 kWh/m2/year.

Located in a development zone of the Port of Aarhus, the six-story building will now serve as a landmark for the city and become known for its use of solar power. The wide South-facing facade is clad in natural stone and incorporates a 170 m2 solar wall and a 200 m2 slatted wall of solar panels that also provide shade. The 170 m2 solar wall acts as a vertical sculptural element in the corner of the building and the energy from this system is used to pre-heat the ventilation air intake in winter and to cool the offices in summer. Windows for the offices are recessed and protected from the sun with shade panels that are covered in photovoltaics, which provide electricity for the offices.

The municipality’s new office building is twice as airtight as required by the Danish building regulations, and features energy-friendly materials and elements with ultra-low thermal conductivity, like vacuum-insulated windows. “Normally you would try to hide the energy-efficient elements of the building, but we have decided to make a virtue of necessity and use them as sculptural elements in the facade, so that the building stands out as a distinctive image of energy architecture,” states architect and partner Mads Møller, C. F. Møller Architects.

Inhabitat

TStzmmalaysia
post Feb 5 2011, 12:28 PM



ENERGY

Attached Image

Building a bridge to renewable energy

Bridges are generally exposed to the elements, meaning they generally get a nice dose of sunlight often coupled with some fairly strong crosswinds. For these reasons this “Solar Wind” bridge design would seem to make a lot of sense. The proposed bridge would harness solar energy through a grid of solar cells embedded in the road surface, while wind turbines integrated into the spaces between the bridge’s pillars would be used to generate electricity from the crosswinds.

The brainchild of Italian designers Francesco Colarossi, Giovanna Saracino and Luisa Saracino, the Solar Wind concept was designed for the Solar Park Works – Solar Highway competition that asked entrants to modernize sections of a decommissioned elevated highway stretching between Bagnera and Scilla in Italy.

The road surface would replace traditional asphalt with 20 km (12.4 miles) of “solar roadways” consisting of a dense grid of solar cells coated with a transparent and durable plastic coating providing 11.2 million kWh per year. The designers say this system, combined with the 26 wind turbines integrated underneath the bridge generating 36 million kWh per year, would provide enough electricity to power approximately 15,000 homes.
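A quick back-of-the-envelope check of the designers' figures (all numbers are the article's, not independently verified) shows what "15,000 homes" implies per household:

```python
# The article's annual generation figures for the Solar Wind bridge concept.
solar_kwh = 11.2e6   # "solar roadway" output, kWh per year
wind_kwh = 36.0e6    # 26 integrated wind turbines, kWh per year
homes = 15000

# Implied annual consumption per home served.
per_home = (solar_kwh + wind_kwh) / homes  # kWh per home per year
```

That works out to roughly 3,150 kWh per home per year, a plausible figure for an average European household, so the claim is at least internally consistent.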

In addition to the “solar roadways,” the top surface of the bridge would also include a “green promenade” along its length comprising solar greenhouses for growing local produce. Drivers would be able to stop along the bridge to buy some fresh fruit and veggies while enjoying panoramic bridge views (an idea which strikes us as "a bridge too far" for this concept).

The Solar Wind entry was awarded second prize in the Solar Park Works – Solar Highway competition and the design clearly has merit. The integration of wind turbines into the underside of a high-altitude bridge exposed to constant strong winds seems like a particularly good idea – assuming it could be achieved from a structural engineering point of view. Let's hope someone will see the concept and run with it.

Attached Image

GizMag

TStzmmalaysia
post Feb 5 2011, 12:30 PM



APPLIED SCIENCES

Attached Image Attached Image Attached Image Attached Image

Mammoth Pyramid Megacity for New Orleans

An Arcology is essentially a mega city that packs a ginormous population into one hyperstructure.

Now, a real-life group of ambitious designers has taken their looming pyramidal arcology and placed it smack dab on the Mississippi River as a proposal for rebuilding New Orleans. This 30 million square foot beast-building with an array of green features is aptly named NOAH (Get it? Noah and the Arcology?), and is meant to house 40,000 residents.

NOAH, which stands for New Orleans Arcology Habitat, boasts 20,000 residential units as well as 3 hotels and 1,500 timeshare units. But that’s not all. Also housed within the triangular walls of this one-stop-building will be commercial space (stores), parking for 8,000 cars, cultural spaces, public works, schools, an administrative office, and a health care facility. This means that you could live your whole life within NOAH if you wanted to. Although that doesn’t sound very fun, it may be prudent, since NOAH has been specifically designed to withstand the hurricanes that have ravaged the city on the Mississippi in the past. Its floating base and open-wall structure are meant to allow “all severe weather/winds to in effect blow through the structure in any direction with the minimum of massing interference.”

In terms of sustainability, we were at first skeptical as we are with most supermassive structures, but the fact that so many inhabitants are meant to occupy the space offsets NOAH’s giant footprint. Another plus is that NOAH will supposedly “eliminate the need for cars within the urban structure” via vertical and horizontal internal electric transport links, creating a pedestrian-friendly community. Other eco-friendly elements include secured wind turbines, fresh water recovery and storage systems, a passive glazing system, sky garden heating/cooling vents, grey water treatment, solar array banding panels, and river based water turbines. And if NOAH truly is hurricane-proof, that will make the city more sustainable than any wind turbines or solar panels ever could.

Inhabitat

TStzmmalaysia
post Feb 5 2011, 12:32 PM



ROBOTICS

Attached Image

Future surgeons may use robotic nurse, 'gesture recognition'

Surgeons of the future might use a system that recognizes hand gestures as commands to control a robotic scrub nurse or tell a computer to display medical images of the patient during an operation.

Both the hand-gesture recognition and robotic nurse innovations might help to reduce the length of surgeries and the potential for infection, said Juan Pablo Wachs, an assistant professor of industrial engineering at Purdue University.

The "vision-based hand gesture recognition" technology could have other applications, including the coordination of emergency response activities during disasters.

"It's a concept Tom Cruise demonstrated vividly in the film 'Minority Report,'" Wachs said.

Surgeons routinely need to review medical images and records during surgery, but stepping away from the operating table and touching a keyboard and mouse can delay the surgery and increase the risk of spreading infection-causing bacteria.

The new approach is a system that uses a camera and specialized algorithms to recognize hand gestures as commands to instruct a computer or robot.

At the same time, a robotic scrub nurse represents a potential new tool that might improve operating-room efficiency, Wachs said.

Findings from the research will be detailed in a paper appearing in the February issue of Communications of the ACM, the flagship publication of the Association for Computing Machinery. The paper, featured on the journal's cover, was written by researchers at Purdue, the Naval Postgraduate School in Monterey, Calif., and Ben-Gurion University of the Negev, Israel.

Research into hand-gesture recognition began several years ago in work led by the Washington Hospital Center and Ben-Gurion University, where Wachs was a research fellow and doctoral student, respectively.

He is now working to extend the system's capabilities in research with Purdue's School of Veterinary Medicine and the Department of Speech, Language, and Hearing Sciences.

"One challenge will be to develop the proper shapes of hand poses and the proper hand trajectory movements to reflect and express certain medical functions," Wachs said. "You want to use intuitive and natural gestures for the surgeon, to express medical image navigation activities, but you also need to consider cultural and physical differences between surgeons. They may have different preferences regarding what gestures they may want to use."

Other challenges include providing computers with the ability to understand the context in which gestures are made and to discriminate between intended gestures versus unintended gestures.

"Say the surgeon starts talking to another person in the operating room and makes conversational gestures," Wachs said. "You don't want the robot handing the surgeon a hemostat."

A scrub nurse assists the surgeon and hands the proper surgical instruments to the doctor when needed.

"While it will be very difficult using a robot to achieve the same level of performance as an experienced nurse who has been working with the same surgeon for years, often scrub nurses have had very limited experience with a particular surgeon, maximizing the chances for misunderstandings, delays and sometimes mistakes in the operating room," Wachs said. "In that case, a robotic scrub nurse could be better."

The Purdue researcher has developed a prototype robotic scrub nurse, in work with faculty in the university's School of Veterinary Medicine.

Researchers at other institutions developing robotic scrub nurses have focused on voice recognition. However, little work has been done in the area of gesture recognition, Wachs said.

"Another big difference between our focus and the others is that we are also working on prediction, to anticipate what images the surgeon will need to see next and what instruments will be needed," he said.

Wachs is developing advanced algorithms that isolate the hands and apply "anthropometry," or predicting the position of the hands based on knowledge of where the surgeon's head is. The tracking is achieved through a camera mounted over the screen used for visualization of images.

"Another contribution is that by tracking a surgical instrument inside the patient's body, we can predict the most likely area that the surgeon may want to inspect using the electronic image medical record, and therefore saving browsing time between the images," Wachs said. "This is done using a different sensor mounted over the surgical lights."

The hand-gesture recognition system uses a new type of camera developed by Microsoft, called Kinect, which senses three-dimensional space. The camera is found in new consumer electronics games that can track a person's hands without the use of a wand.

"You just step into the operating room, and automatically your body is mapped in 3-D," he said.

Accuracy and gesture-recognition speed depend on advanced software algorithms.

"Even if you have the best camera, you have to know how to program the camera, how to use the images," Wachs said. "Otherwise, the system will work very slowly."

The research paper defines a set of requirements, including recommendations that the system should:

- Use a small vocabulary of simple, easily recognizable gestures.
- Not require the user to wear special virtual reality gloves or certain types of clothing.
- Be as low-cost as possible.
- Be responsive and able to keep up with the speed of a surgeon's hand gestures.
- Let the user know whether it understands the hand gestures by providing feedback, perhaps just a simple "OK."
- Use gestures that are easy for surgeons to learn, remember and carry out with little physical exertion.
- Be highly accurate in recognizing hand gestures.
- Use intuitive gestures, such as two fingers held apart to mimic a pair of scissors.
- Be able to disregard unintended gestures by the surgeon, perhaps made in conversation with colleagues in the operating room.
- Be able to quickly configure itself to work properly in different operating rooms, under various lighting conditions and other criteria.
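To make those requirements concrete, here is a minimal sketch of the dispatch layer such a system might sit on: a small gesture vocabulary mapped to commands, a confidence threshold so low-confidence (likely conversational) gestures are disregarded, and an "OK" acknowledgement as feedback. The gesture names, command strings and 0-to-1 confidence score are illustrative assumptions, not part of the published system.

```python
# Hypothetical gesture vocabulary: small, intuitive, easy to recognize.
GESTURE_COMMANDS = {
    "swipe_left": "previous_image",
    "swipe_right": "next_image",
    "two_finger_spread": "zoom_in",   # mimics a pair of scissors opening
}

def dispatch(gesture, confidence, threshold=0.8):
    """Map a recognized gesture to a command.

    Returns (command, feedback). Unknown gestures, or gestures recognized
    below the confidence threshold (probably made in conversation), are
    ignored so the robot never acts on an unintended motion.
    """
    if confidence < threshold or gesture not in GESTURE_COMMANDS:
        return None, ""            # disregard: likely conversational
    return GESTURE_COMMANDS[gesture], "OK"
```

The recognizer itself (segmenting hands from Kinect depth data and classifying trajectories) is the hard part the article describes; this layer only shows how the intended/unintended distinction and the feedback requirement can be enforced at the command boundary.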

"Eventually we also want to integrate voice recognition, but the biggest challenges are in gesture recognition," Wachs said. "Much is already known about voice recognition."

EurekAlert
TStzmmalaysia
post Feb 5 2011, 12:34 PM



APPLIED SCIENCES

Attached Image Attached Image Attached Image

Field Architecture Empowers South African Township With Sustainable Ubuntu Center

How do you lift up a formerly beleaguered community in South Africa? For starters, build an education and health center as the township's nucleus.

That is what the Ubuntu Education Fund did with California-based Field Architecture's award-winning design. Located in Port Elizabeth's Zwide Township, and based on the Bantu concept that no human lives in isolation, the Ubuntu Center features various sustainable components such as locally-sourced materials, passive design and renewable energy sources.

Inspired by Zwide’s footprints, Stan Field designed a space for people to walk through, not to. Born in Port Elizabeth during the apartheid era, he felt that long footpaths through a loosely aggregated center could transform how the community sees itself. Ngonyama Okpanum Hewitt-Coleman Architects, a local, Black Economic Empowerment (BEE) registered firm, oversaw the building’s construction.

Along with passive cooling and heating design, the building’s thermal mass reduces its reliance on mechanical systems. Thick concrete walls radiate heat back into the building, while large spaces outside maximize solar penetration for additional heat gain. Also included are photovoltaic panels that collect and convert solar energy into electricity.

For cooling, carefully placed windows create a convection effect. Low, ventilated windows circulate cool air, while hotter air is released through higher, stacked windows.

Irrigated with grey water, the rooftop garden further insulates the building. It also encourages once disenfranchised people to reconnect in a healthy, positive way. The Center opened in September, 2010, and includes a pediatric HIV clinic, a community theatre, an education wing, and office space.

The Ubuntu Center feeds 2,000 children each day, provides holistic support to 3,500 clients and their families, delivers after-school education to 250 students, and issues HIV counseling and testing to 6,000 community members.

In 2009, Architect magazine awarded Field Architecture the Progressive Architecture Award for their design.

Inhabitat
TStzmmalaysia
post Feb 5 2011, 12:36 PM



ENERGY

Attached Image

New Energy Technologies demonstrates electricity-generating SolarWindow prototype

Over the past several years, a number of companies and institutions have been developing technologies that could allow windows to double as solar panels. These have included EnSol’s metal nanoparticle-based spray-on product, RSi’s photovoltaic glass and Octillion’s NanoPower window. Last September, Maryland-based New Energy Technologies joined the party by demonstrating a 4 x 4 inch (10.2 x 10.2 cm) prototype of its SolarWindow product. This Tuesday, the company unveiled a working 12 x 12 inch (30.5 x 30.5 cm) prototype, which takes it significantly closer to becoming commercially-viable.

As is the case with EnSol’s technology, SolarWindow incorporates a spray-on photosensitive film. It is applied at room temperature, allows the window to remain transparent, and is capable of generating electricity from both artificial and natural light – the company's intention is that it would be used primarily on the exterior of windows, where it would be exposed to sunlight.

While the details of how the system works aren’t being fully disclosed, the company has stated that the film “replaces visibility-blocking metal [used in most solar panels] with environmentally-friendly and more transparent compounds."

New Energy Technologies claims that SolarWindow is superior to similar products in that its coating doesn’t have to be applied at a high temperature or in a vacuum, it is less than one-tenth the thickness of other “thin films,” and the solar cells used in each window are the world’s smallest functional models – less than a quarter the size of a grain of rice. It is also said to outperform other technologies by up to ten-fold when it comes to generating electricity from artificial light.

Although precise figures on efficiency aren’t available, the company estimates that when applied to the facade of an office tower, its product could generate over 300 percent the energy savings of traditional rooftop panels.

New Energy Technologies has also received some publicity for its experimental MotionPower system, which generates power from traffic driving over small plates embedded in roads.

GizMag

TStzmmalaysia
post Feb 5 2011, 12:38 PM



APPLIED SCIENCES

Attached Image

Solar Cooker Concept For Off-Grid Communities

Solar cookers for rural communities and villages in developing countries are a hot topic among designers.

They can help cut down on the GHGs emitted by burning typical fuels, from wood to dung to kerosene, but they are often inefficient or impractical. It doesn't stop designers from trying, however, and Yonggu Do and Eunha Seo have created the Hot Liner, a flexible solar panel that can be formed into a cooking surface.

It seems that this design suffers from the same flaw many of these life-saver designs have: once it's broken, it's broken. That sort of non-repairability makes the product unrealistic for its core audience. These would also likely be expensive; a less fancy solar cooker consisting of little more than a reflective dish seems more practical.

There are, however, some smart elements to the design, including minimal parts, and ease of use. Durability is in question, but if the solar panels can be made scratch-resistant enough, then perhaps they make a practical solution for a daily need. The potential is there, perhaps, and it was an award-winning entrant at the 2010 Seoul International Design Competition co-hosted by Designboom.

Attached Image Attached Image Attached Image

TreeHugger




TStzmmalaysia
post Feb 5 2011, 12:39 PM



ENERGY

Attached Image

A New Twist on Floating Wind Power

Wind turbines attached to floating buoys can harness stronger, more sustained winds in the open ocean. But the floats now used for such deep-water installations may prove prohibitively expensive because the buoys needed to keep them above water are enormous. Now a project in France is turning the turbine design on its head for what developers hope will be a low-cost alternative.

French oil and gas engineering company Technip and wind-power startup Nenuphar recently announced Vertiwind, a two-megawatt wind turbine that they plan to float in Mediterranean waters by the end of 2013. The project employs a turbine with a main rotor shaft that is set vertically, like a spinning top, rather than horizontally, as in a conventional wind turbine.

The benefit of the vertical-axis design is that it lowers the turbine's center of gravity. Vertiwind's design stands 100 meters tall, but places the generator, which weighs 50 tons, inside a sealed tube beneath the turbine's rotating blades, 20 meters above the sea. This makes the turbine less top-heavy, allowing for a significantly smaller flotation system, which would extend only nine meters below the surface of the ocean.

In contrast, a horizontal-axis turbine with the same power output and blades also reaching 100 meters high would need its generator to be 60 meters above the sea. A buoy built by Technip for a 2.3 megawatt horizontal-axis floating turbine prototype, owned by the Norwegian energy company Statoil, extends 100 meters below the surface.
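The top-heaviness argument is easy to see from the one component the article quantifies: the 50-ton generator sits 20 meters above the sea on Vertiwind versus 60 meters on a comparable horizontal-axis machine, so its moment about the waterline is a third as large. A trivial check (illustrative arithmetic only; a real stability analysis accounts for the whole mass distribution):

```python
# Single-component moment comparison using the article's figures.
generator_mass_t = 50.0           # generator mass, metric tons
moment_vertical = generator_mass_t * 20.0    # ton-metres, vertical-axis hub
moment_horizontal = generator_mass_t * 60.0  # ton-metres, horizontal-axis hub
ratio = moment_horizontal / moment_vertical  # how much more top-heavy
```

That factor-of-three difference in the generator's contribution is what lets the flotation system shrink from 100 meters of draft to nine.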



"You save a lot of material" with a vertical axis, says Stephane His, vice president of biofuels and renewable energy at Technip. "But more than that, you ease the process of installing the machine itself."

Technip and Nenuphar plan to build two vertical-axis turbines with a power output of two megawatts each, one onshore and one offshore, at a cost of $28 million. That figure is still significantly more than shallow-water turbines fixed to the seafloor (which cost around $5 million per megawatt), but much less than the approximately $70 million Statoil spent on construction, deployment, and ongoing research for its prototype.

By pursuing a vertical-axis design, Vertiwind is using technology that was all but abandoned for onshore wind power more than a decade ago. Vertical-axis designs, which are inherently low to the ground, usually cannot compete with taller horizontal wind turbines that catch stronger winds at higher altitudes.

This should be less of a disadvantage offshore, since wind speed increases less with height over open water than it does over land, says Walter Musial, who leads offshore wind energy research activities for the U.S. Department of Energy's National Renewable Energy Laboratory in Golden, Colorado.

However, Musial has doubts about the design that Vertiwind is pursuing. Few large-scale vertical-axis turbines have been built, and all have had a curved-blade design that connects to the turbine's main shaft at the top and bottom of the blade, thereby evenly distributing the load placed on the structure, Musial says. Vertiwind, however, will use a straight-blade design that is only connected to the central shaft by two supports or struts near the bottom of the blade.

"That blade is going to bend as it is rotating due to centrifugal force, and the connections of those struts are carrying all the load," Musial says. "Those joints are getting hammered—that is the most challenging engineering aspect of this design."

Nenuphar CEO Charles Smadja says he is confident that the design will handle the strain, based on tests of a 35-kilowatt prototype that spins at higher speeds. Smadja concedes, however, that ramping up to a two-megawatt turbine will present new challenges. "What you can do properly at a small scale can be difficult to do at a very big scale," he says.

TechnologyReview

TStzmmalaysia
post Feb 6 2011, 08:48 PM



APPLIED SCIENCES


The Floating MORPHotel

Almost nothing says "unique and innovative" like the MORPHotel. First conceptualized by the brilliant Gianluca Santosuosso, the MORPHotel allows individuals to live within this buoyant system as it moves around the world.

Considering its vast size and manta shape, the Morphotel is far too ambitious for just a week-long cruise.

In fact, it's designed for the opposite: The Morphotel travels across the globe nonstop and docks only occasionally to re-supply. Oh, and we almost forgot to mention: the Morphotel has a bit of Transformers in it too. See its exoskeleton-like structure? Like the manta ray that inspired it, the Morphotel can "flex" its variously sized sections to better travel in choppy oceans (and perhaps avoid icebergs as well) and comply with docking space constraints.

You’ve heard of submerged hotels and hotels in space, but what about a hotel that adapts to the weather conditions via a linear structure developed around its vertebral spine? Unlike a cruise ship, the MORPHotel is slow-moving, making for one fabulous, endless and luxurious voyage.

The main idea is to exploit the sea not only as a medium to move people from one point to another, but also to discover unknown places. Plug-in City-Arbour: During its continuous movement, this artificial organism will stop for short or long periods in different cities, thereby becoming a temporary extension of them.

TrendHunter

TStzmmalaysia
post Feb 6 2011, 08:50 PM



TRANSPORTATION


Researchers work to develop a vehicle that can be driven by the blind

Last Saturday, a blind driver dodged cardboard boxes thrown in front of him while driving a modified Ford Hybrid Escape around the Daytona International Speedway. He had only seconds to react to the obstacles.

"If we just put boxes on the track, people might think we planned the route," said Dennis Hong, whose robotics and mechanisms lab at Virginia Tech modified the cars.

Instead, Hong's team threw boxes from a van so they bounced around. “That shows everyone that their position is random, and that the drivers are really driving,” said Hong.

In addition to avoiding boxes and taking the raceway's turns, the driver, Mark Riccobono, also passed the van.

Fortunately, Riccobono and a second blind driver, Anil Lewis, had done it before.

"The other day, when we got to drive by ourselves, it surpassed any perception of thrilling," Riccobono wrote in an e-mail after a mid-January test run at Virginia International Speedway. Riccobono, who lost his sight at the age of five, is executive director of the National Federation of the Blind's research arm, the Jernigan Institute.

"It's scary and exciting," said Lewis, 46, who lost his sight 21 years ago and thought he would never drive again.

The demonstration at Daytona took place before blind supporters and fans gathered for the Rolex 24 sports car race, as part of the National Federation of the Blind's Blind Driver Challenge to develop a car that blind people can drive independently.

"We're trying to change people's minds about what blind people can do, and driving is going to change minds," said Lewis, the Federation's communications director. "This is taking it to whole different level."

Riccobono reached speeds of more than 25 mph and did not hit a single box.


Lewis was not worried about a small accident or two. "If it goes off too perfectly, people won't believe its credibility," Lewis said. After all, he explained, the car is a research project and people should expect some failures. Besides, "sighted" drivers have traffic accidents too.

Hong had other goals. "I want it to be perfect," he said. "This is a controversial project. I'm getting hundreds of e-mails from people saying, 'You won't believe how much hope this brings to us.' But I'm also getting hate mail, saying, 'Are you insane?'"

Hong understands. In fact, he had doubts until blind drivers tested a prototype.

Development

The National Federation of the Blind announced the Blind Driver Challenge in 2004. Only Virginia Tech responded, and it did not sign up until 2006.

At the time, Hong's lab was developing a driverless car for the DARPA Urban Challenge. Sponsored by the Defense Advanced Research Projects Agency, it called for robot cars to navigate through 60 miles of traffic, lights, stop signs, and obstacles at an old military base. Virginia Tech's upstart team surprised everyone by claiming the $500,000 third prize.

Hong thought he could use a similar approach to build a car for the blind. That was not the Federation's goal.

"They wanted the blind person to make active decisions, and actually drive and control the vehicle," Hong said.

Yet the two vehicles had much in common. Both use similar sensors. A laser light detection and ranging (LIDAR) system identified cars and other obstacles in the road. In addition, two forward-pointing cameras monitored the road as well as lights and stop signs. A GPS system located the car on a map, while an inertial measurement unit tracked the car's speed and direction in case it lost GPS contact.

The vehicles' computers automatically gathered all the sensory information and blended it together to create a model of the car's rapidly changing environment.
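The inertial fallback described above can be sketched in a few lines: when the GPS fix drops out, the last known position is propagated from the IMU's speed and heading. This is a minimal sketch assuming a fixed update rate; none of the names come from Virginia Tech's actual software:

```python
import math

# Minimal dead-reckoning sketch: when GPS drops out, propagate the last
# known fix using the inertial unit's speed and heading, as the article
# describes. All names and the fixed 0.1 s time step are assumptions.

def dead_reckon(x, y, speed_mps, heading_rad, dt=0.1):
    """Advance position one time step from speed and heading."""
    return (x + speed_mps * math.cos(heading_rad) * dt,
            y + speed_mps * math.sin(heading_rad) * dt)

def update_position(gps_fix, last_pos, speed, heading):
    """Prefer a fresh GPS fix; otherwise fall back to dead reckoning."""
    if gps_fix is not None:
        return gps_fix
    return dead_reckon(*last_pos, speed, heading)

pos = (0.0, 0.0)
pos = update_position(None, pos, speed=10.0, heading=0.0)  # GPS lost: ~1 m east
print(pos)
```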

In the driverless car, the computer assessed the model, picked out the route, and used the drive-by-wire system to drive the car.

The Blind Driver Challenge vehicle was different. Instead of telling the car how to drive, it had to communicate this information to the driver, who then had to respond accordingly.

Hong's team faced two initial hurdles. First came money. Autonomous cars are expensive, but the Federation offered Hong only $5,000 to get started. He put together a team of 12 undergraduates and they bought a used gasoline-powered dune buggy on eBay for $2,000.

They bought whatever equipment was not donated. The LIDAR made by Hokuyo Automatic Co. was the most expensive component, and ordinarily cost about $8,000.

"Hokuyo originally donated it for another robotics project, but we used it on this," Hong related. "I was afraid to tell them, but when they saw what we did, they became big fans."

Driving By Touch

The second problem was more profound. How could Virginia Tech convey information fast enough to a driver who cannot see? After all, most computers communicate with humans through visual displays. For the blind, that wasn't possible.

Hong started by letting the computer pick the best route and communicate driving information -- fast, slow, right, left, stop -- to the driver.

For the dune buggy, the students built a vest from a massage chair vibrator. Different massage patterns told the driver when to speed up, slow down, or stop. In the Ford Escape, the expanded team of engineers built the vibrator, now branded SpeedStrip, into the driver's seat.

"We're experimenting with straight up-and-down and zigzag patterns, still figuring out which ones are most effective," Hong explained.

The dune buggy's original steering system used a steering wheel that made clicking noises when it turned. The car's computer told the driver how many clicks to turn the wheel. It was awkward, and forced blind drivers to listen to the computer rather than use their hearing to assess their environment.

The Ford Escape, by contrast, uses DriveGrip, a glove with a small vibrating motor on the knuckle of each finger. The more motors that vibrate, the sharper the driver needs to turn.
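The DriveGrip mapping (more vibrating motors for a sharper turn) can be sketched like this; the angle thresholds and the four-level cap are hypothetical, not Virginia Tech's:

```python
# Sketch of the DriveGrip idea from the article: the sharper the required
# turn, the more of the glove's finger motors vibrate. The 15-degree
# steps and four-motor cap are invented for illustration.

def drivegrip_signal(steer_deg):
    """Return (hand, motor_count) for a requested steering angle."""
    if steer_deg == 0:
        return ("none", 0)
    hand = "left" if steer_deg < 0 else "right"
    motors = min(4, 1 + int(abs(steer_deg) // 15))
    return hand, motors

print(drivegrip_signal(-50))  # hard left: ('left', 4)
print(drivegrip_signal(10))   # gentle right: ('right', 1)
```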

Finding the right vibration pattern proved a challenge. Hong considered which hand to signal for which turn; whether the motors should vibrate all at once or sequentially; and whether they should turn off suddenly or gradually as the driver completed the turn.

He also had to face the human-in-the-loop problem. An autonomous vehicle is easy for a computer to control because it responds to commands the same way every time.

Humans are another story. Different drivers may interpret the signal to turn hard differently. Even the same driver may respond differently on different days.

This showed up during testing. The computer would define the "road" as an 18-foot-wide path on the 30-foot-wide racetrack. "If I steered out of that lane, the car would shut down," Lewis recounted. This often happened when cornering.

"Sighted people get a chance to correct their errors," Lewis said. "We asked for the same chance, and when we got it. We got better at staying on the road."

In this case, the solution to the human-in-the-loop problem was to modify the car's controls so that the driver could learn to adapt to its signals.
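The lane model Lewis describes is easy to picture in code. A toy sketch, reducing everything to the car's lateral offset from the track centerline in feet:

```python
# Toy version of the lane model from the article: the computer treats an
# 18-foot path centered on the 30-foot track as the "road" and shuts the
# car down if the driver leaves it.

LANE_HALF_WIDTH_FT = 18 / 2    # the virtual lane from the article
TRACK_HALF_WIDTH_FT = 30 / 2   # the physical racetrack

def check_lane(offset_ft):
    """Classify the car's lateral position relative to the centerline."""
    if abs(offset_ft) <= LANE_HALF_WIDTH_FT:
        return "in lane"
    if abs(offset_ft) <= TRACK_HALF_WIDTH_FT:
        return "out of lane: shut down"
    return "off track"

print(check_lane(4.0))   # in lane
print(check_lane(11.0))  # out of lane: shut down
```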

The blind drivers and a blind engineer on the Virginia Tech team also suggested cutting off the ends of the DriveGrip gloves, since blind people need touch to sense their environment. They also collaborated on finding the best vibration patterns.

Lewis and Riccobono practiced for a total of seven days during the months before Daytona. Lewis says the Blind Driver Challenge will help blind people stay on the forefront of technology.

"We were worried that the knobs and buttons we used on appliances and phones were going to touch screens. We wanted to make sure there are non-visual interfaces out there for us," Lewis said.

Those interfaces are coming. They are what Hong calls "informational" interfaces. On the Escape, they communicate what is going on outside the car.

One is AirPix, a pad with pressurized air flowing through a grid of pinholes. It works like an air hockey table, but AirPix controls each pinhole's air pulses to form "pictures" that a blind person can view with his or her fingers, like Braille. A similar technology uses a 3-dimensional rubber membrane that changes shape to show road conditions.
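The AirPix idea (a grid of pinholes pulsing air to form a tactile picture) can be illustrated with a toy renderer; the grid size and the glyph below are invented for illustration, since the article doesn't give the device's resolution:

```python
# Toy rendering of the AirPix idea: a binary "picture" maps to which
# pinholes in the grid pulse air. Grid size and glyph are made up.

GLYPH = [  # a crude arrow on a 5x5 grid
    "..#..",
    ".##..",
    "#####",
    ".##..",
    "..#..",
]

def pulse_map(glyph):
    """Return the set of (row, col) pinholes that should pulse."""
    return {(r, c) for r, row in enumerate(glyph)
            for c, ch in enumerate(row) if ch == "#"}

holes = pulse_map(GLYPH)
print(len(holes), "pinholes pulsing")
```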

The technology is too new to use at Daytona. Yet one day, similar technologies may not only help the blind to drive, but perhaps to navigate streets, use portable computer devices, or view notes from an electronic blackboard.

PhysOrg

TStzmmalaysia
post Feb 6 2011, 08:54 PM



ROBOTICS


Texas Student Attends School as a Robot – A Sign of Things to Come

Freshman Lyndon Baty's immune system is so fragile he can't risk being surrounded by people his own age, yet he attends classes at his high school in Knox City, Texas every day. All thanks to a robot. The Vgo telepresence platform is a four-foot-tall bot on wheels with a small screen, camera, speakers and microphone at the top. Baty logs into the robot remotely from his home, using his PC and a webcam to teleconference into his classes. Baty can drive Vgo around his school, switching between classes just like regular students. For a boy who has spent much of his life sick and isolated from his peers, Vgo not only represents a chance at a better education, it's also an opportunity for freedom and camaraderie. Learn more about his story in the local news segment video below. Lyndon Baty's circumstances may be far from typical, but stories like his are going to become much more common in the future as telepresence robotics makes its way into the mainstream.

Baty’s situation is a rare combination of marketing, bad/good timing, and innovative thinking from school officials. The young man has polycystic kidney disease and recent treatments have left his immune system too damaged for him to attend school directly. Representatives from Vgo contacted the Knox City school district to offer their services. While attending school through a robot isn’t quite the same as being there in person, you can tell from Lyndon’s smile in the following video that Vgo is a more than welcome improvement in his life:
While we haven't covered the Vgo robot in the past, it reminds me of several other telerobots we have seen, especially Anybot's QB. Only Vgo is supposedly retailing for around $6000 (including ~$1200/year for the service contract), considerably less than the QB's $15k price tag. Differences in maneuverability, reliability, and video quality may make the cost difference appropriate, but that's not really my concern. Vgo is representative of the telerobotics market as a whole right now: reasonable run times (battery life is between 6 and 12 hours depending on upgrade options), Skype-level video quality, and compatibility with standard WiFi. If you can afford the $6k (or $15k) price tag, you can probably have this setup in your home or office right now. In other words, this isn't the technology of tomorrow, it's here today and ready to go. Vgo launched sales in 2010 and has been marketing their product to a variety of applications, as you'll see in the following video:



Not to sound cynical, but I’m guessing that Lyndon Baty’s use of Vgo is just another part of that marketing plan. I’m totally fine with that, by the way. Giving a child (and a school district) a reasonable solution for a terrible predicament is great. If it comes with a moderate price tag, so be it. So, while Lyndon’s personal story of perseverance and increasing freedom is exceptional, the underlying technological implications are pretty mundane: telepresence is gearing up to try to make a big splash in the market.

We’ve seen plenty of indications of this. South Korea is testing telerobots in their schools. They could have one of these devices in every kindergarten classroom by 2013. Researchers in Japan are experimenting with robots aimed towards emotional connections (with mixed results). As we said above, Anybots has their own platform on the market already. iRobot recently unveiled a prototype robotic platform that would transform any teleconference-enabled tablet computer into a telerobot. I’m guessing that in the next five years, one or more of these attempts at telerobotics is going to actually gain some traction and start moving some serious product.

Education may be a natural market. As we learned from Fred Nikgohar, head of telerobotics firm RoboDynamics, there are some big hurdles in other applications of telepresence robots. Offices value secrecy. Medical facilities worry about patient privacy. There’s a lot of bureaucracy standing in the way of widespread adoption of telerobotics. Schools have some of the same problems, but (to be perfectly honest) they also have sick kids who you can’t say no to. Or they’re run by governments who have nationalistic goals in science and technology (exemplified by South Korea). Get the price of telerobotics low enough, and we could see it expand into different niches of education including homeschooling, remote expert instructors (like the English tutors in South Korea), or online universities.

I’m hoping that these early applications for telepresence will keep driving the price down. $15k is way too much for home use. Even $6k is an order of magnitude too large. Given that kind of option we’ll always default to the $30 webcam and Skype video conferencing, even if it’s not mobile.

But give us a $500 telerobot and things could change considerably. Remotely controlling a robot while talking to someone in a far-off location is an amazing opportunity. I'd love to visit my distant family members every week if I could actually roll around and interact with them in a more 'natural' way.

Hopefully Vgo and other telepresence companies will continue to gather momentum in the years ahead and push their products into the mainstream. If we can make them cheap enough, the benefits of telerobotics will sell themselves. The mobility that comes with a telerobot is something that sets it head and shoulders above video conferencing on a laptop. It transforms a restricting experience into a freeing one. Just ask Lyndon Baty.

SingularityHub


TStzmmalaysia
post Feb 6 2011, 08:58 PM



BIOTECHNOLOGY


Robotic Hydroponic Lettuce Farm in Belgium

The steady march of industrial automation continues to envelop the world. Many of us have a quaint early 20th century vision of agriculture, but the truth is that the farmer of the 21st century is a machine. Case in point: lettuce. With new hydroponic techniques, companies are able to grow lettuce in large indoor fields where crops are sorted, planted, and grown automatically. Robots are instrumental to all steps of the process. Don’t believe me? Check out the video below. It shows the day to day automation for a hydroponic farm in Belgium. Those bots seem to have a pretty green thumb.

While I’m unsure of the exact location of the farm, I can tell that the robotics systems used are from Hortiplan, a horticultural automation company based out of the Netherlands. The hydroponic rows you see being filled and maneuvered are part of their Mobile Gully System. The MGS not only plants the lettuce and arranges it in the field, it also moves the crop along as it develops, and delivers it to the right part of the greenhouse for harvest. That picking is done by hand. You can see some (mediocre) clips of the process on the Hortiplan YouTube channel.

Hydroponics and automation seem to go together very well. Hortiplan’s MGS uses what’s known as the nutrient film technique. Essentially, the gullies (trays) have a very thin layer of nutrient rich water flowing through them. That water is pumped to one end of the field and then flows downhill (there’s a very slight slope on the trays). Lettuce is continually watered, slowly moved across the field by the MGS, which also increases the spaces between gullies so the plants have room to grow. By the time they reach the far end they’re ready for harvest.
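The spacing behavior described above can be sketched as a simple schedule; the crop duration and spacing values below are hypothetical, chosen only to illustrate the idea of widening gully spacing as the lettuce moves toward harvest:

```python
# Sketch of the Mobile Gully System idea from the article: as lettuce
# advances across the greenhouse, spacing between gullies widens so the
# plants have room to grow. Day counts and spacings are hypothetical.

def gully_spacing_cm(day, total_days=35, min_cm=5.0, max_cm=30.0):
    """Linearly widen spacing from transplant (day 0) to harvest."""
    frac = min(max(day / total_days, 0.0), 1.0)
    return min_cm + frac * (max_cm - min_cm)

for day in (0, 17, 35):
    print(f"day {day:2d}: {gully_spacing_cm(day):.1f} cm between gullies")
```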



The nutrient film technique, and various levels of automated lettuce production, have been around for years, but they haven't really reached the public consciousness. So few of us stop to think about where our food really comes from. Industrial agriculture is just another soldier in the automated legion that is changing the way we produce practically everything. As it continues to march on, we'll see more crops like the Belgian lettuce that can be grown with less labor and fewer resources. Hopefully they will be leveraged to create more food and less hunger all over the world. Awesome! When was the last time you were this excited by lettuce?

SingularityHub

TStzmmalaysia
post Feb 6 2011, 09:00 PM



ENERGY


High-Efficiency Photovoltaic Cells Developed

The Micro- and Nanotechnology Research Group of the Universitat Politècnica de Catalunya (UPC) has produced silicon photovoltaic cells with a conversion efficiency of 20.5%, the highest level achieved in Spain using this material. This figure is comparable to results obtained by leading research groups in the field at the international level.

The cells developed by the UPC researchers have surpassed the 15% barrier -- the average efficiency of the most common photovoltaic cells. Specifically, a conversion efficiency (of incident light to electric power) of 20.5% has been achieved, which means the energy produced per unit of area can be increased by one third.

For example, thanks to the high efficiency of this new cell type, only 4.8 m² of photovoltaic panels would be needed to meet one family's annual energy needs (an average of about 4 kWh per day). This compares to an area of 6.5 m² for traditional cells.
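The quoted areas are consistent with simple scaling: required panel area falls in proportion to the efficiency gain. A quick check using only the article's figures:

```python
# Quick check of the article's panel-area figures: required area scales
# inversely with conversion efficiency, so the new-cell area should be
# roughly the old area times (15% / 20.5%).

OLD_EFF, NEW_EFF = 0.15, 0.205
OLD_AREA_M2 = 6.5  # conventional cells, per the article

new_area = OLD_AREA_M2 * OLD_EFF / NEW_EFF
print(f"Predicted area with 20.5% cells: {new_area:.1f} m^2")  # close to the quoted 4.8
```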

The cells are made of crystalline silicon and work in a simple way, much as conventional cells do. The light captured by the cells generates charges that are drawn off at the panel contacts and transformed into an electric current. "The goal is to generate a lot of charges that don't get lost -- that make it to the contacts," says Alcubilla, a member of the research group. Finally, after the light from the sun has been converted into electric current, it is fed into the power grid for domestic and industrial use.

The key to the success of the project was therefore to minimize losses, and by pursuing this approach the UPC researchers have managed to produce the most efficient silicon cells in Spain. "We've done a lot of work on the conception and development of new materials and structures, and on the technology needed to optimize the entire process and achieve high levels of efficiency," says Alcubilla. The next step is to develop procedures that facilitate large-scale production.

The result achieved in this research (which has involved 38 trials since 2002) is comparable to those obtained in other research projects carried out in countries that are taking the lead in the field of photovoltaic energy. The maximum efficiency obtained for cells of this type is 24.7%, a record set by an Australian group at the University of New South Wales.

ScienceDaily

TStzmmalaysia
post Feb 6 2011, 09:03 PM



BIOTECHNOLOGY


Philippines Shows Commitment to Aquaponic/Hydroponic Future

News from the Philippines shows once again that the future of hydroponics and aquaponics is certainly burning bright. Increasingly news is reaching us here at the Hydroponics Guide of various projects taking place around the world that show the current popularity of hydroponics and also its viability as a food production solution.

Butuan City, a city with prospects, is to play host to a new aquaponics/hydroponics project. The project is hoping to set up a modern and highly capable resource for the cultivation of plant produce as well as seafood. The project will be bankrolled by Greenphil Aquaculture and Hydroponics Holding Inc, which will invest millions of dollars with the support of foreign investment groups. According to reports there will be a joint management element that will provide the local government and private sector companies with a small economic dividend.

The project is anything but small scale however, with Greenphil committed to creating a production line for high quality seafood predominantly for the export market and a range of fruits and vegetables. As regular readers will be aware, aquaponics is based upon the symbiotic relationship between fish and plants, in which the plants are fed by the waste products of the fish and the fish are fed using special foods and also waste plant matter. The result of such systems is produce of exceptional quality and also healthy, sustainable farmed fish.

The Butuan City project is just one of eight projects that will be built in the Philippines over the next few years that will bring employment and a lucrative export industry to the country, once again demonstrating the viability of aquaponics and hydroponics as food production methods.


Hydroponics Guide

TStzmmalaysia
post Feb 7 2011, 08:02 AM



ROBOTICS


Hairwasher Robot Revealed

Panasonic has unveiled robots that scan a person’s head in 3D, then wash his or her hair.

The hair-washing robot ensures its 16 fingers exert just the right amount of pressure, and it can remember each customer's head shape and massage preferences, according to a Panasonic press release. Each arm contains three motors to control swing, press, and massage functions.
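Getting "just the right amount of pressure" can be pictured as a simple proportional control loop: each finger nudges its applied force toward a per-customer target. The target force and gain below are invented for illustration and are not Panasonic's:

```python
# Minimal proportional-control sketch of per-finger pressure regulation.
# Target force and gain are hypothetical illustrative values.

def adjust_pressure(current_n, target_n, gain=0.5):
    """One control step: move applied force toward the target."""
    return current_n + gain * (target_n - current_n)

force = 0.0
for _ in range(8):  # a few control iterations
    force = adjust_pressure(force, target_n=2.0)
print(f"applied force after settling: {force:.2f} N")
```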

Panasonic says the robots are meant for the elderly and people with limited mobility. They could reduce caregivers' burden by moving patients out of their Panasonic robotic beds and into a special chair for a scalp massage and shampoo.

In the future, more functions could be integrated into these types of robots. Think of a haircut, or maybe a shave. With precision robotics, these things are not that far-fetched. As in other sectors of labor, this development in automation offers a viable solution to many problems.

Prototype salon robots were on display at the 37th International Home Care & Rehabilitation Exhibition at the Tokyo Big Sight.


TStzmmalaysia
post Feb 7 2011, 08:04 AM



ROBOTICS


Internet for Robots Lets Bots Share Instructions and Learn from One Another

Well, we've seen this movie before (literally speaking). A group of robotics engineers at Eindhoven University of Technology is developing an Internet for robots: a kind of online database from which robots can download instructions and to which they can upload "experience."

According to its creators, their RoboEarth system will allow robots to share information and learn from each other, allowing the benefits of machine cognition and learning to proliferate through a network of bots. Cue the SkyNet comparisons.

But barring a declaration of war against humans, RoboEarth is actually a pretty neat idea. In fact, we were somewhat surprised to learn that, according to RoboEarth, it is the first system to allow robots to download their instructions from the Internet. Using RoboEarth or a system like it, helper robots could learn how to do things better, pushing machine learning to new places and enhancing human-machine interaction.


How does it work? According to RoboEarth:

RoboEarth will include everything needed to close the loop from robot to RoboEarth to robot. The RoboEarth World-Wide-Web style database will be implemented on a Server with Internet and Intranet functionality. It stores information required for object recognition (e.g., images, object models), navigation (e.g., maps, world models), tasks (e.g., action recipes, manipulation strategies) and hosts intelligent services (e.g., image annotation, offline learning).

To complete their closed loop, the team will offer ROS-compatible, non-bot-specific components that robot builders can take off the shelf and implement in their creations, hooking them up to the robo-web. Put on your raving shoes and see their own robot, AMIGO, download and carry out instructions in this video.
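The upload/download loop can be pictured with a toy recipe store: one robot shares a learned "action recipe," another fetches it. The dict-backed class and recipe format below are stand-ins, not RoboEarth's actual interface:

```python
# Toy version of the RoboEarth loop described above: one robot uploads an
# "action recipe", another downloads it. The in-memory store is a stand-in
# for RoboEarth's web-style database.

class RecipeStore:
    def __init__(self):
        self._recipes = {}

    def upload(self, task, steps):
        """Share a learned action recipe under a task name."""
        self._recipes[task] = steps

    def download(self, task):
        """Fetch a recipe another robot shared, or None if unknown."""
        return self._recipes.get(task)

store = RecipeStore()
store.upload("serve_drink", ["locate bottle", "grasp", "pour", "deliver"])

# A second robot retrieves and executes the shared recipe.
for step in store.download("serve_drink"):
    print("executing:", step)
```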



PopSci


TStzmmalaysia
post Feb 8 2011, 09:12 AM



ROBOTICS


Helper Robot HRP-2 Will Do Your Dishes

For now, mechanized household servants are pretty much limited to floor-cleaners -- though they do hold a special place in our hearts.

But what if a robot, after a long dinner party, could bus the table and head to the sink? Now there's a relationship we can build on. Japan's HRP-2 humanoid 'bot, pictured here, has learned to do just that.

Researchers at the University of Tokyo's Jouhou System Kougaku Laboratory have used human motion-capture and video game simulations to teach HRP-2 how to handle different types of dishware. They also had to develop waterproof robot gloves.

The researchers are also working with 3-D sensors to help the robots "see" their surroundings so they can move freely.

Along with washing dishes, the HRP-2 also cleans the floor with a regular, old-fashioned hose vacuum.

The 'domestication' of robots for household tasks is a concept that has been around for a long time. Until recently, all the prototypes were limited in their capacity as helpers. Despite this, the development in this area of robotics is promising for the future.

However, research for these robots is very expensive. On top of that, some people still don't feel comfortable with robots in their surroundings. This is strange, given that virtually every cellphone, vacuum cleaner, or TV is essentially a robot. The human-like appearance of the humanoids still makes some people feel uncomfortable. However, if development continues, it won't be long before we can all relax while HRP-2 does our dishes and laundry, fixes the broken sink, and handles all our housekeeping.

PopSci

TStzmmalaysia
post Feb 8 2011, 09:14 AM



APPLIED SCIENCES


Emergency detection systems for senior citizens


Elderly people living alone lead a risky life: after a fall, they often spend hours lying on the floor before their situation comes to anyone's attention and a doctor is contacted. A new system automatically detects predicaments like this and informs a trusted person. This makes it possible to live an independent life within one's own four walls.

Ms. K. is vision-impaired and can't get around very well any more. Still, the 80-year-old, who lives alone, has no intention whatsoever of moving to a retirement home. Most elderly people think the same way. They want to stay in their accustomed surroundings as long as possible, where they can lead an autonomous life. What many fail to realize is that they are risking their health in the process. Cardiovascular problems are more frequent among the elderly, and the risk of falling is more prevalent: one person in three above the age of 65 falls once a year; among those over 80 the ratio is nearly one in two. Many of these accidents occur in private homes in the course of everyday activities, and often at night. Frequently it is hours before the injured are cared for.

Even home emergency-call systems are of limited help when senior citizens cannot sound the emergency signal. They may be injured or disoriented, or may simply not have the emergency button on their person. Help could be forthcoming from an intelligent system that automatically identifies and responds to emergency situations such as these. One such solution is under development by researchers at the Fraunhofer Institute for Experimental Software Engineering IESE in Kaiserslautern, Germany. Their project is dubbed "ProAssist4Life" – shorthand for "Proactive Assistance for Critical Life Situations." Project partners include the company CIBEK technology + trading, Binder Elektronik and the Westpfalz Klinikum.

IESE scientists are working on an unobtrusive system that provides constant "companionship" to elderly people living in single households or in retirement facilities. Multisensory nodes mounted to the ceiling of a room register an individual's movements.

"Our system records how long a person spends in what part of the home," notes Holger Storf, a scientist at IESE. A radio signal transmits the data to a computer. Software documents the individual's daily activities, constantly learning the person's "normal behavior." The analytical software compares the resident's current activity with the model that has been generated. This is how it identifies situations that deviate from the norm – situations that could be an indication that the person has fallen, is lying unconscious on the ground and is in a helpless situation. "If a person spends considerably longer in the bathroom, for instance, or in some other place in the home, this is registered. To prevent false alarms, the first response is to prompt the individual," Storf explains.

This can be accomplished with a telephone call, for instance, or by means of a touchscreen monitor with an integrated speaker. The individual can then respond by touching the monitor. Should the elderly person fail to respond, the software sends a text message to a trusted individual such as a family member or caregiver.
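The learn-then-compare logic described above can be sketched in a few lines. This is a hypothetical illustration, not Fraunhofer's software: the z-score threshold, the history of room stays, and the escalation function are all invented for the example.

```python
from statistics import mean, stdev

def is_anomalous(history_minutes, current_minutes, z_threshold=3.0):
    """Flag a room stay that deviates strongly from the learned routine.

    history_minutes: past stays in this room at a comparable time of day
    current_minutes: length of the ongoing stay
    """
    if len(history_minutes) < 2:
        return False  # not enough history to judge yet
    mu, sigma = mean(history_minutes), stdev(history_minutes) or 1e-9
    return (current_minutes - mu) / sigma > z_threshold

def respond(anomalous, prompt_answered):
    """Escalation: prompt the resident first; alert a contact only on silence."""
    if not anomalous:
        return "ok"
    return "ok" if prompt_answered else "alert caregiver"

# Bathroom stays are normally ~8 minutes; a 45-minute stay with no answer
# to the touchscreen prompt escalates to the trusted contact.
history = [7, 9, 8, 6, 10, 8, 9]
print(respond(is_anomalous(history, 45), prompt_answered=False))
```

The prompt-before-alert step mirrors the false-alarm safeguard Storf describes.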

"Our solution is not designed to replace home emergency-call systems but is intended to serve as a kind of airbag to give people living in single households a sense of safety," the researcher emphasizes. Unlike comparable competitor products, neither cameras nor microphones are required. Senior citizens do not need to carry sensors on their person, either. Because the system operates via radio signal, there is no need to install wiring. The system is easy to install.

"To date, there has been no comparable, learning-capable system on the market that constantly adapts to an individual's behavior," Storf notes. The researcher and his team have applied for patents of the software and the multisensory nodes. The experts will be exhibiting a prototype of "emergency detection in the home" at the CeBit 2011 in Hanover, Germany, where they will demonstrate its operation, in a kitchen built for the purpose, at the Fraunhofer joint stand.

PhysOrg


TStzmmalaysia
post Feb 8 2011, 09:15 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


ENERGY

Attached Image

Filter feeding basking shark inspires more efficient hydroelectric turbine

Studying the bumpy protrusions on the fins of humpback whales has already led to more efficient wind and tidal power turbines, and now nature is once again the source of inspiration for a new and more efficient hydroelectric turbine. The latest source of biomimicry is the basking shark, which industrial design student Anthony Reale has drawn on to create "Strait Power," a water-powered turbine generator that tests have shown is 40 percent more efficient than current designs.

Despite being the second largest shark in the ocean, the basking shark is generally considered harmless to humans as it is a filter feeder. It swims with its mouth open to sift zooplankton, small fish and invertebrates from the water before the water is expelled through extended gill slits that nearly encircle its whole head. Although this flow of water assists in the shark’s swimming, Reale recognized that the shape of the shark’s body also played an important role.

With the basking shark’s jaw able to stretch up to 1.2 meters (3.9 ft) in width, a pressure differential is created as the shark swims. As with the wings of an airplane, the water pressure is greater along the straight bottom, while the curved surface of the shark’s body increases the distance the water has to travel, resulting in lower pressure across the shark’s top.

This pressure differential helps draw the water out of the basking shark’s gills and makes the basking shark the only filter-feeding shark that relies solely on the passive flow of water through its pharynx to feed. The other filter-feeding sharks, the whale shark and the megamouth shark, assist the process by suction or by actively pumping water into their pharynxes.

With this in mind, Reale designed his ‘Strait Power’ turbine with a double converging nozzle – an opening within an opening. The water enters the turbine through the first opening, and the second nozzle – like the shark’s gills – constricts the flow and creates a low-pressure zone to draw the water through and generate more energy.
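The physics the nozzle exploits can be checked with the continuity and Bernoulli equations. The areas and inflow speed below are illustrative values, not taken from Reale's design:

```python
RHO = 1000.0  # kg/m^3, density of fresh water

def nozzle_exit_speed(v1, a1, a2):
    """Continuity: A1*v1 = A2*v2 -- a smaller throat means faster flow."""
    return v1 * a1 / a2

def pressure_drop(v1, v2):
    """Bernoulli for a level, lossless flow: p1 - p2 = 0.5*rho*(v2^2 - v1^2)."""
    return 0.5 * RHO * (v2**2 - v1**2)

v1 = 1.0                                     # m/s entering the first opening
v2 = nozzle_exit_speed(v1, a1=0.5, a2=0.25)  # halving the area doubles speed
print(v2, pressure_drop(v1, v2))             # 2.0 m/s and 1500.0 Pa of suction
```

The low-pressure zone at the throat is what "draws the water through," just as the shark's gill slits do.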

Reale came up with the design for his senior project at the College for Creative Studies (CCS) in Detroit and recently had the opportunity to put it to the test at the University of Michigan’s (UM) Marine Hydrodynamics Laboratory. The UM researchers with whom Reale collaborated were interested as they had been working on something similar to provide power for remote research camps in Alaska.

Subjected to 200 hours of testing in UM’s 100-yard-long (91 m), 22-foot-wide (6.7 m), 10-foot-deep (3 m) tow tank, Reale’s 900-pound (408 kg) turbine model, made mostly of wood, screwed together and sealed with marine paint, came out looking battered and bruised. But the results were promising, with the researchers saying the design improved the power output of a single blade by around 40 percent – a figure Reale expects to improve upon in future versions.

Reale has filed a patent for the technology and has designed five potential commercial uses of the Strait Power system ranging from a portable and collapsible version for charging small electrical devices designed for outdoor and military use, up to industrial versions with 10-foot (3 m) diameter blades for powering high-power electrical generators of 40,000 watts and higher.

GizMag

TStzmmalaysia
post Feb 8 2011, 09:21 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


BIOTECHNOLOGY

Attached Image

Pyramid Farm is a Vision of Vertical Agriculture

The Pyramid Farm is an incredible concept for the future of agriculture envisioned by professors Eric Ellingsen and Dickson Despommier. The design is based on the growing belief that vertical farming will soon become a necessary lifeline in cities throughout the world: the human population is growing exponentially and becoming increasingly urban while the global food supply is shrinking.

Despommier speculates that if nothing is done to advance current farming techniques, 3 billion people could face starvation by 2060. The Pyramid Farm offers a solution in the form of a complete self-sufficient ecosystem that covers everything from food production to waste management.

The Vertical Farm Project, which grew out of one of Despommier’s class projects at Columbia University, features urban farming concepts and resources in the hope of securing the world’s food supply by design. His vertical farms are intended to be complete ecosystems, capable of producing even fish and poultry while reusing internal waste.

The Pyramid Farm, among others, would include a heating and pressurization system separating sewage into water and carbon to fuel machinery and lighting. He estimates that the greenhouses can be made to use only 10 percent of the water and five percent of the land needed by farm fields.
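Despommier's resource claim is easy to restate as arithmetic. The field-agriculture baseline figures below are placeholders for illustration, not numbers from the article:

```python
def vertical_farm_needs(field_land_ha, field_water_m3,
                        land_frac=0.05, water_frac=0.10):
    """Resources a vertical farm would need for the same output, per the
    article's claim of 5 percent of the land and 10 percent of the water."""
    return field_land_ha * land_frac, field_water_m3 * water_frac

# Hypothetical baseline: 1,000 ha of fields using 5 million m^3 of water.
land_ha, water_m3 = vertical_farm_needs(field_land_ha=1000,
                                        field_water_m3=5_000_000)
print(round(land_ha), round(water_m3))  # 50 ha and 500000 m^3
```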

Beyond creating a sustainable and local source for food, Despommier envisions a healing process for today’s horizontal farms. Native plant life will be replaced and allowed to grow wild and replenish the depleted soil for future generations. For more information, listen to Despommier’s vertical farming podcast from the Earth Sky network.

Inhabitat

TStzmmalaysia
post Feb 8 2011, 09:23 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


APPLIED SCIENCES

Attached Image

The Science of Bike-Sharing

The new environmentally-friendly concept of municipal "bike-sharing" is taking over European cities like Paris, and American cities like New York are also looking into the idea. It allows a subscriber to "borrow" a bike from one of hundreds of locations in the city, use it, and return it to another location at the end of the journey. It's good for commuters and for running short errands.

While the idea is gaining speed and subscribers at the 400 locations around the world where it has been implemented, there have been growing pains — partly because the projects have been so successful. About seven percent of the time, users aren't able to return a bike because the station at their journey's destination is full. And sometimes stations experience bike shortages, causing frustration with the system.

To solve the problem, Dr. Tal Raviv and Prof. Michal Tzur of Tel Aviv University's Department of Industrial Engineering are developing a mathematical model to lead to a software solution. "These stations are managed imperfectly, based on what the station managers see. They use their best guesses to move bikes to different locations around the city using trucks," explains Dr. Raviv. "There is no system for more scientifically managing the availability of bikes, creating dissatisfaction among users in popular parts of the city."

Their research was presented in November 2010 at the INFORMS 2010 annual meeting in Austin, Texas.

Biking with computers

An environmentalist, Dr. Raviv wants to see more cities in America adopt the bike-sharing system. In Paris alone, there are 1,700 pick-up and drop-off stations. In New York, there soon might be double or triple that amount, making the management of bike availability an extremely daunting task.

Dr. Raviv, Prof. Tzur and their students have created a mathematical model to predict which bike stations should be refilled or emptied — and when that needs to happen. In small towns with 100 stations, mere manpower can suffice, they say. But anything more and it's really just a guessing game. A computer program will be more effective.

The researchers are the first to tackle bike-sharing system management using mathematical models and are currently developing a practical algorithmic solution. "Our research involves devising methods and algorithms to solve the routing and scheduling problems of the trucks that move fleets, as well as other operational and design challenges within this system," says Dr. Raviv.
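A toy version of the rebalancing decision Raviv and Tzur formalize might look like this. The fill-level thresholds and station data are invented for illustration; the real model also optimizes the routing and scheduling of the trucks:

```python
def rebalance_plan(stations, low=0.25, high=0.85):
    """Split stations into pickup (too full) and drop-off (too empty) lists,
    sorted by urgency."""
    fill = lambda s: s["bikes"] / s["docks"]
    surplus = sorted((s for s in stations if fill(s) > high), key=fill, reverse=True)
    deficit = sorted((s for s in stations if fill(s) < low), key=fill)
    return surplus, deficit

stations = [
    {"name": "station A", "bikes": 19, "docks": 20},  # nearly full
    {"name": "station B", "bikes": 2,  "docks": 20},  # nearly empty
    {"name": "station C", "bikes": 10, "docks": 20},  # fine as-is
]
surplus, deficit = rebalance_plan(stations)
print([s["name"] for s in surplus], [s["name"] for s in deficit])
# ['station A'] ['station B']
```

A truck would pick up at the surplus stations and drop off at the deficit ones; turning that into efficient routes is the hard part the algorithms address.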

For the built environment

The benefits of bike-sharing programs in any city are plentiful. They cut down traffic congestion and alleviate parking shortages; reduce air pollution and health effects such as asthma and bronchitis; promote fitness; and enable good complementary public transportation by allowing commuters to ride from and to train or bus stations.

Because of the low cost of implementing bike-sharing programs, cities can benefit without significant financial outlay. And in some cities today, bicycles are also the fastest form of transport during rush hour.

PhysOrg


The city of Tel Aviv is now in the process of deploying a bike sharing system to ease transport around the city, and improve the quality of life for its residents. Tel Aviv University research is contributing to this plan, and the results will be used in a pilot site in Israel.

TStzmmalaysia
post Feb 8 2011, 09:26 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


APPLIED SCIENCES

Attached Image

Woodpecker's Head Inspires Shock Absorbers

When aircrash investigators of the future retrieve a flight recorder from the wreckage of a plane they may have the golden-fronted woodpecker, Melanerpes aurifons, to thank for the survival of the flight data. The reason? A shock absorber inspired by the bird's ability to withstand severe deceleration.

A woodpecker's head experiences decelerations of 1200g as it drums on a tree at up to 22 times per second. Humans are often left concussed if they experience 80 to 100g, so how the woodpecker avoids brain damage was unclear.
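The decelerations quoted above are easier to appreciate in SI units. A quick conversion sketch, taking 1 g as 9.81 m/s², with a helper for estimating the g-load of a given impact:

```python
G = 9.81  # m/s^2 per g

def g_to_ms2(g_load):
    """Convert a deceleration expressed in g to m/s^2."""
    return g_load * G

def impact_g(speed_ms, stop_distance_m):
    """Average deceleration, in g, when stopping from speed_ms over
    stop_distance_m: a = v^2 / (2*d), divided by G."""
    return speed_ms**2 / (2 * stop_distance_m * G)

print(round(g_to_ms2(1200)))   # woodpecker drumming: ~11772 m/s^2
print(round(g_to_ms2(100)))    # human concussion threshold: ~981 m/s^2
print(round(impact_g(10.0, 0.005)))  # 10 m/s stopped in 5 mm (example values)
```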

So Sang-Hee Yoon and Sungmin Park of the University of California, Berkeley, studied video and CT scans of the bird's head and neck and found that it has four structures that absorb mechanical shock.

These are its hard-but-elastic beak; a sinewy, springy tongue-supporting structure that extends behind the skull called the hyoid; an area of spongy bone in its skull; and the way the skull and cerebrospinal fluid interact to suppress vibration.

Artificial analogues

The researchers then set out to find artificial analogues for all these factors so they could build a mechanical shock absorbing system to protect microelectronics that works in a similar way.

To mimic the beak's deformation resistance, they use a cylindrical metal enclosure. The hyoid's ability to distribute mechanical loads is mimicked by a layer of rubber within that cylinder, and the skull/cerebrospinal fluid by an aluminium layer. The spongy bone's vibration resistance is mimicked by closely packed 1-millimetre-diameter glass spheres, in which the fragile circuit sits (see diagram).
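One way to see why the compliant layers help is a lumped mass-spring-damper model: a softer mount spreads the impact out over time and lowers the peak acceleration on the payload. The parameter values below are invented; this only illustrates the principle, not Yoon and Park's actual device.

```python
def peak_acceleration(m, k, c, v0, dt=1e-6, t_end=0.05):
    """Integrate m*x'' = -k*x - c*x' (semi-implicit Euler) after an impact
    that imparts velocity v0, and return the peak |acceleration|."""
    x, v, peak = 0.0, v0, 0.0
    for _ in range(int(t_end / dt)):
        a = (-k * x - c * v) / m
        peak = max(peak, abs(a))
        v += a * dt
        x += v * dt
    return peak

# Same 10 g circuit, same 10 m/s impact; only the mount stiffness differs.
soft = peak_acceleration(m=0.01, k=1e4, c=0.5, v0=10.0)
stiff = peak_acceleration(m=0.01, k=1e6, c=0.5, v0=10.0)
print(soft < stiff)  # True
```

The glass-sphere bed plays roughly the "soft k" role here, trading a longer stopping distance for a lower peak load.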

To test their system, Yoon and Park placed it inside a bullet and used an airgun to fire it at an aluminium wall. They found their system protected the electronics ensconced within it against shocks of up to 60,000g. Today's flight recorders can withstand shocks of 1000g.

"We now know how to prevent the fracture of microdevices from mechanical shock," says Yoon.

Overcoming space debris

As well as a possible role protecting flight recorder electronics, the shock absorber could also be used for protecting spacecraft from collisions with micrometeorites and space debris. It could also be used to protect electronics in cars.

"This study is a fascinating example of how nature develops highly advanced structures in combination to solve what at first seems to be an impossible challenge," says Kim Blackburn, an engineer at Cranfield University in the UK, which specialises in automotive impact studies.

"It may inform our thinking on regenerative dampers for vehicles, redirecting the energy into a form more easily recoverable than dumping it to heat," Blackburn adds. "Ultimately, we need to learn from the woodpecker to recover energy and not give the driver a headache."

Nick Fry, chief executive of Formula One team Mercedes GP Petronas based in Brackley, UK, says such ideas could feed into crash protection for drivers taking part in motorsport: "One big issue with Formula One is protecting the driver by getting them to decelerate in an accident situation in such a way that his internal organs and brain aren't turned to mush."

"We do that with clever design of composites, very sophisticated seatbelts and a head and neck restraint system," Fry says. "But this research might be something we can draw on in future – it could be very interesting."

NewScientist

Diagram of the mechanics of the concept here.

TStzmmalaysia
post Feb 9 2011, 09:31 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


TRANSPORTATION

Attached Image

Antistatic Concept: ultralight, micro segment electric three-wheeler

The recently finalized Double Challenge project required MA students at London’s Royal College of Art to design an ultra-compact electric vehicle for event sponsor Citroën. Not surprisingly from such distilled intelligence, the winning entry is a new type of personal urban commuter positioned between bicycles and cars – light, aerodynamically efficient, cheap to build and economical in its use of energy and hence run. Heikki Juvonen’s “E-3POD Antistatic” is an ultralight, micro segment electric three-wheeler with the driver sitting inside a large hub-less third wheel.

Juvonen’s E-3POD was conceived as an entry-level electric commuter for young people and students and so that owners of conventional automobiles could have a second low-cost vehicle which fits below the petrol-engined machinery expected to remain the primary means of long distance travel for some time yet. The E-3POD is of simple construction with a low frontal area, excellent aerodynamics and minimal weight in order to minimize the energy requirements of the vehicle and hence the required battery size. Minimal usage of materials also lowers construction costs.
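The link between frontal area, drag and battery size comes from the drag-power equation, which grows with the cube of speed. The drag coefficients and areas below are illustrative guesses, not Citroën's figures:

```python
RHO_AIR = 1.2  # kg/m^3, air density at roughly sea level

def drag_power(cd, frontal_area_m2, speed_ms):
    """Aerodynamic drag power: P = 0.5 * rho * Cd * A * v^3, in watts."""
    return 0.5 * RHO_AIR * cd * frontal_area_m2 * speed_ms**3

# A slim single-seat three-wheeler vs. a typical small car at 50 km/h (13.9 m/s):
pod = drag_power(cd=0.25, frontal_area_m2=0.9, speed_ms=13.9)
car = drag_power(cd=0.32, frontal_area_m2=2.1, speed_ms=13.9)
print(round(pod), round(car))  # the pod needs roughly a third of the power
```

Halving frontal area roughly halves drag power at a given speed, which directly shrinks the battery needed for a given range.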

The lowered weight is emphasized in design elements such as the rear wheel, which works as a supportive structural element, the shared suspension for both front wheels, and the use of scratch-resistant plastic for the canopy. The silent electric motors also make sound insulation redundant, allowing for lighter material selection.

The E-3POD provides the user with easy, cost-efficient transport, with easier parking thanks to the vehicle’s small footprint; the likelihood that parking costs will continue to rise will further enhance its attractiveness. The design also provides comfortable, isolated personal space – a welcome addition compared with bicycles or public transport. The short length of the vehicle makes it agile in urban environments. At higher speeds the E-3POD tilts slightly to provide solid grip and an emphasised stance, giving cornering a more responsive feel.

Attached Image

Heikki Juvonen’s E-3POD Antistatic was chosen as the best overall design by representatives from Citroën’s Style Centre and Electric Vehicle Development Team. As his prize, Heikki receives a six month employment contract to work at the prestigious PSA Design Centre in Paris. Heikki commented, “I’m thrilled Citroën selected my design as their favourite and I can’t wait to work with their talented team in France. As a designer I strive for new and better solutions. Good and sustainable design not only improves manufacturer brand image and sales, but can also help to preserve our environment.”

The project was jointly sponsored by Citroën and EXA, a France-based aerodynamic simulation software company. Citroen has a long association with the Royal College of Art. Mark Lloyd, the chief designer of the Citroen DS3, studied at the Royal College of Art.

Citroën had significant involvement throughout the Double Challenge project, providing industry figures to lend the students their expertise and experience, as well as organising a trip to the PSA Design Centre and Le Conservatoire, Citroën’s in-house museum of historic models.

Philippe Holland, Responsable Style Graphique at Citroën, said; “We’re delighted to be involved in this important RCA project. The students have produced some truly exceptional ideas for the future design of electric Citroën vehicles. This type of powertrain is increasingly recognised as an important solution for economically and environmentally viable urban transport; so it’s fantastic to see the electric visions of these potential car designers of tomorrow.”

GizMag

TStzmmalaysia
post Feb 9 2011, 09:32 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


APPLIED SCIENCES

Attached Image

Amphibious 1000 semi-submerged Architecture offers the best of both worlds

The Amphibious 1000 luxury resort is the ultimate dream destination. As the world’s first semi-submerged hotel, this opulent resort commissioned by Arabia and designed by Giancarlo Zema Design Group offers the best of two worlds – land and sea.

The resort resembles a big aquatic animal stretching out from the land into the sea, extending horizontally for 1km. The land area houses residential buildings, office buildings and a marina with a modern and flexible harbor. All the structures are situated in a semi-circle around the tower with the panoramic restaurant.

Four semi-submerged hotels, each offering 75 luxury suites, will be located in the sea section of the resort. Reminding us of the soft lines of the superyachts anchored on land, the hotels include water exhibition galleries, an interactive museum on marine life and an aquarium with a glass tunnel leading to an underwater observatory located in the center of this marine park. The hotels have large diagonal glass windows that offer a fascinating view of marine life. Connected to the welcome area by the long arms are fitness areas, gardens and a special outdoor theater with a moving stage that opens out on the sea.

In addition, there will be 80 floating suites called “Jelly-fish“ with underwater views available at the smaller floating platforms, each ending with a lighthouse. Electric vehicles will take guests through the marine park without harming the park’s eco-system. The teak floor complements the main steel structure perfectly, creating a stunning eco-destination. The bridge connecting the land and sea sections is draped in plants, giving the impression of projecting the land flora into the sea. Water transport is also provided by 20-meter aluminum yachts called Trilobis that are equipped with hydrogen engines and an underwater observatory globe. Considering the comfort, luxury and green environment the Amphibious 1000 offers, it is definitely the ultimate eco-destination for the well-heeled.

EcoFriend

TStzmmalaysia
post Feb 9 2011, 09:34 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


APPLIED SCIENCES

Attached Image

Crossing Over: Modular Green Wildlife Bridge Concept

We’re used to seeing projects that help humans get around (like highways and pedestrian crossings) but it is less common to see projects that help nature navigate around us.

This wildlife crossing was designed by Olin Studio for West Vail Pass, Colorado as a way to help animals pass safely over the street. The design, called “Wild (X)ing,” is one entry in a design competition that aims to find a way for both wildlife and humans to travel safely in the same area.

Because a highway runs through the very large White River National Forest, it poses a very real threat to the animals that make their homes there. The green bridge concept would help wildlife in the White River National Forest cross over a busy highway while staying at a safe distance from the vehicles.

The wildlife bridge concept uses a repeating rhomboid shape because of its inherent strength and functionality as a modular component. The bridge is designed to be expandable when needed; if the highway is widened in the future the bridge can easily be widened along with it.

Each rhomboid is actually what the designers call a “habitat module,” which is a segment of habitat naturally found in the area. Six different types of habitats have been identified for inclusion in the project, ranging from xeric grassland to wet meadow to spruce and fir forest.

By combining these modules on the wildlife-friendly crossing, the designers hope to create a landscape that connects the man-made structure to the surrounding wildlife and provides a comfortable environment for fauna. If the surrounding landscape should happen to change in the future, modules can be lifted out by cranes and replaced. According to the designers, this module approach is the safest and most cost-effective way to integrate a wildlife bridge into the national park.

WebEcoist

TStzmmalaysia
post Feb 9 2011, 09:36 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


ENERGY

Attached Image

Photovoltaic Farm To Cosy Up With Wind Farm

Late last month, the NSW government granted planning approval for a $150 million AUD solar farm to be built near the town of Bungendore, not far from the Australian national capital of Canberra. The ABC reported that the solar photovoltaic arrays will cover 100 hectares (nearly 250 acres) and produce enough power (50 megawatts) to supply 10,000 homes.

The solar farm will be built alongside Infigen Energy's existing $400 million AUD Capital Wind Farm, Australia's second biggest wind farm, with 67 turbines, generating 141 megawatts of power, which the Braidwood Times says, could supply one third of Canberra's household needs.
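"Enough to power N homes" figures imply an average draw per home once nameplate capacity is discounted by a capacity factor (the fraction of time a farm effectively runs at full output). The capacity factors and household demand below are typical assumptions, not numbers from the article:

```python
def avg_homes_powered(nameplate_mw, capacity_factor, avg_home_kw=0.9):
    """Homes supplied on average, assuming ~0.9 kW mean household demand."""
    return nameplate_mw * 1000 * capacity_factor / avg_home_kw

print(round(avg_homes_powered(50, 0.20)))   # Bungendore solar farm, CF ~20%
print(round(avg_homes_powered(141, 0.35)))  # Capital Wind Farm, CF ~35%
```

With these assumed factors the 50 MW solar farm lands near the article's 10,000-home figure.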

The siting of the solar farm next to the wind farm allows it to easily connect to the existing electricity grid infrastructure that was put in place for the wind turbines.

Solar farms tend not to polarise communities like paddocks full of wind turbines do. According to the ABC report, the Member for the electorate of Monaro, Steve Whan, says he does not expect there will be any opposition to the project. "While wind farms are sometimes controversial, solar farms seem to have very broad support, and particularly in this region."

And it does seem a clever move by Infigen Energy. We wonder if they'll combine the two renewable energy systems at their other energy farms, as they own and operate wind farms not only in Australia, but also in the United States, Germany and France, including 41 wind farms that have a total installed capacity of approximately 2,246MW.

This is the second planning approval obtained by Infigen Energy for a solar farm in NSW. They'd previously been granted approval for a 200 hectare photovoltaic farm with 100 megawatt capacity, which Business Review Australia says would "have the potential to generate enough renewable energy to power 20,000 homes; and reduce greenhouse gas emissions equivalent to taking 30,000 cars off NSW roads each year."

TreeHugger

TStzmmalaysia
post Feb 9 2011, 09:38 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


APPLIED SCIENCES

Attached Image

Concept for renewable energy power plant for Deserts

Another daring new design for producing clean, renewable energy.

The ‘Metallica Towers’ are two towers that were designed to provide renewable energy to the desert emirate of Dubai.

The towers were conceptualized by Egyptian architect Karim Elnabawy and feature wind turbines and solar panels.

The design of the ‘Metallica Towers’ aims to take advantage of Dubai’s harsh weather conditions: heat absorbed by the towers drives air through the wind turbines, while solar panels generate additional power. The ‘Metallica Towers’ have a gorgeous design that looks nothing like conventional power plants.

The design features two large towers that are mounted on a circular metallic surface. As the sun heats up the air, hot air ascends through the tower while powering wind turbines. At the same time solar panels, which are installed on the entire surface, generate renewable electricity as well. The wind-exposed facades of the towers incorporate several filaments that oscillate with the wind and produce energy as well.
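The updraft principle can be bounded with the standard solar-chimney efficiency, g·H/(cp·T0), which is why such towers need height. The tower height, collector area and irradiance below are assumptions for illustration, not Elnabawy's specifications:

```python
G, CP, T0 = 9.81, 1005.0, 300.0  # gravity, air heat capacity J/(kg*K), ambient K

def updraft_power(height_m, collector_area_m2, irradiance_wm2,
                  collector_eff=0.5, turbine_eff=0.8):
    """Ideal chimney efficiency g*H/(cp*T0), scaled by assumed collector
    and turbine efficiencies."""
    chimney_eff = G * height_m / (CP * T0)
    return (irradiance_wm2 * collector_area_m2
            * collector_eff * chimney_eff * turbine_eff)

# A 500 m tower over a 100,000 m^2 collector in strong desert sun:
print(round(updraft_power(500, 100_000, 1000) / 1000))  # ~651 kW
```

Even under generous assumptions the chimney term is below 2 percent, which is why real updraft proposals pair great height with vast collectors.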

Read more at:

EcoFriend

or

Trendhunter


TStzmmalaysia
post Feb 10 2011, 09:23 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


APPLIED SCIENCES

Attached Image

Digital 3D Cave Exploration

British researchers have used 3D laser scans to create maps of underground caves.

Just under the surface of Nottingham, England, there’s a hidden world ripe for exploration, from a 14th century dungeon that once reputedly held a king prisoner to a 19th century butchery. Not just anyone can get to most of these 450+ sandstone caves, many of which are located under Nottingham Castle, and they’ve never even been accurately mapped – until cutting-edge laser technology made these incredible 3D scans possible. The Nottingham Caves Survey has already recorded the shape and surface details of 35 caves, layering them with above-ground photos to give us an unprecedented and surprisingly artistic view.

As part of the Caves of Nottingham Regeneration Project, the Nottingham Caves Survey is taking 3D laser scanners into the depths beneath the city to photograph the caves, survey them with the scanner and note their condition. Many of these caves have major historical significance for Nottingham and for England – the earliest written record of caves beneath what was then a Saxon settlement dates to the year 868. The project aims to protect the caves, in the hopes that they won’t simply be forgotten and allowed to deteriorate.

To capture these strange digital imprints of vast underground spaces, the Nottingham Caves Survey crew hauls equipment below the surface on bike trailers. The scanners send beams of laser light deep into the caves and measure the amount of time it takes for the light to return. The scanners can capture an incredible 500,000 survey points per second, creating a ‘point cloud’ that results in a 3D image.
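Time-of-flight ranging reduces to one formula: distance is half the round-trip travel time multiplied by the speed of light. A minimal sketch, using the 500,000-points-per-second rate from the article:

```python
C = 299_792_458  # speed of light, m/s

def tof_distance(round_trip_s):
    """Range from a laser pulse's round-trip travel time."""
    return C * round_trip_s / 2

print(round(tof_distance(100e-9), 2))  # a 100 ns echo -> wall ~14.99 m away
print(500_000 * 60)                    # points captured in a one-minute sweep
```

Each such range, paired with the beam's direction, becomes one point in the 'point cloud' the survey renders in 3D.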

“The experience of visiting these domestic caves is far removed from the clean regularity of modern urban living and offers a tangible link to medieval Nottingham,” explains the project team. “This is particularly significant in a city with such a strong past personality but so few medieval structures still standing above ground.”

Animations of the scans and photos of the equipment are available at the source.
WebEcoist


TStzmmalaysia
post Feb 10 2011, 09:26 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


APPLIED SCIENCES

Attached Image

Shell Architecture Blends in with Nature over Time

A large shell-shaped structure sits in the middle of the woods. It is hard to determine what exactly the structure is; unlike the surrounding caves and rocks, it clearly is not a part of nature – nor is it a ruin.

A frame, a shape, made in a completely different place for a completely different purpose. Within this shell-shaped structure, floors are constructed, walls separate spaces, and rooms are furnished. The scenery conjures an image out of a science-fiction film, in which locals inhabit an abandoned spacecraft. With time, trees start to grow around the spacecraft, harmonizing it into the landscape.

Desiring a place that will be occupied frequently over many years and yet at the same time be in sync with nature, the architects came up with the aforementioned scenery of a large shell structure floating above the ground.

Being in sync with nature isn’t about yielding to nature – it’s about coexistence. The existence of the structure depends on its power to endure nature. By isolating living space from the wilderness, and upgrading its quality as a shelter, the house will be protected from nature and will provide a comfortable environment. With this, the house will be taken care of and used frequently and continuously. Specifically in cases of villas, frequent use is what leads it to blend in with its surroundings.

The region’s low temperatures and high humidity make for a harsh climate. As a result, many houses built along traditional lines are decaying. Is that in sync with nature? Perhaps. But the whole idea of comfort seems to be put into question. Consequently, large numbers of villas have gone unused for many years, bringing them into further dilapidation. Despite the general avoidance of concrete in the region, its use, together with the lifted structure, has helped the villa protect itself from the humidity.

Leaving the boundary between human life and nature ambiguous is a Japanese virtue. Yet this ideal can only be achieved through meticulous daily attention to, and care of, the wilderness. This might be attainable at our homes, but it isn’t practical for villas. If a visit to the villa inevitably leads to hours and days of maintenance, why bother going? It clearly goes against the purpose of a villa. Having a living space that merges with nature can be appealing, but it seems natural to consider this option only when one is ready to devote large amounts of time solely to maintenance.

It goes without saying that villas should not only be functional spaces for the weekend. Their greatest goal is to provide us with good rest, leisure, and picturesque views that never become dull – all in the vicinity of nature. In the style of many modern sculptures, they aimed to enhance the surrounding nature by incorporating it within the spatial structure.

The central control system enables all mechanical and electrical equipment to be managed with three buttons. In addition, the biometric lock and security system reduce anxiety and stress over household safety. The installation of the custom-made floor-heating system minimizes the use of heating energy while avoiding the trouble of emptying the drainage in cold regions.

Furthermore, it is highly effective in mold prevention. In addition, it works as a cold-draft blocking system which enables the luxury of enjoying a hefty amount of space with large openings. The system integrates itself within the architectural form.

Assuming future interior and equipment maintenance (including the window sash) for continual use, the frame is kept completely separate. The building frame is expected to assimilate with its surroundings with the passage of time. For efficiency during maintenance, the concrete was left exposed and finished with a penetrative concrete sealer.

All air intake and exhaust outlets are installed beneath the sash, letting air run outside through the terrace louver. In addition, by devising unfixed windows, the designers tried to maximize natural ventilation (the general areas have no air conditioning). While at a glance the oval cylindrical space might appear a wasteful use of space, its functional use is maximized by the installation of furniture in the lower half of the oval cylinder.

For more visuals, visit Artechnic website.

Or Dezeen.




TStzmmalaysia
post Feb 10 2011, 09:28 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


APPLIED SCIENCES

Attached Image

Almeisan Tower is a Solar Concentrating Skyscraper

Architect Robert Ferry recently unveiled a stunning design for a sustainable spire in Dubai that requires zero energy and produces zero waste and zero emissions.

The Almeisan Tower is a concept created for Za’abeel Park that generates all of its own energy using concentrating solar power technology. The tower itself is actually a solar power tower (much like Solar One in California) that uses heliostats positioned at the top of the tower to direct sunlight onto a central receiver.

Almeisan is the Arabic name for one of the brightest stars in the sky in the Gemini constellation. It is also derived from the word Al Maisan, which means “the Shining One.” The solar power tower would be capable of generating 600 kW of solar power, which is enough to meet the energy demands of the tower as well as Za’abeel Park. The 224 heliostats placed around the top of the tower would track the sun and reflect its light onto the central receiver, which would then heat liquid sodium to 500 degrees Celsius to drive a steam turbine.
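As a rough plausibility check on the stated numbers (the irradiance and efficiency values below are assumptions for illustration, not figures from the article), one can estimate the mirror area each heliostat would need:

```python
# Back-of-envelope sizing for the tower's heliostat field.
# Assumed values (NOT from the article): direct normal irradiance
# and an overall optical + thermal + turbine efficiency.
TARGET_ELECTRIC_KW = 600.0   # stated plant output
N_HELIOSTATS = 224           # stated heliostat count
DNI_KW_PER_M2 = 1.0          # assumed peak direct normal irradiance
OVERALL_EFFICIENCY = 0.20    # assumed sunlight-to-electricity efficiency

# Total sunlight the field must intercept to deliver the target output.
intercepted_kw = TARGET_ELECTRIC_KW / OVERALL_EFFICIENCY
# Mirror area implied per heliostat.
area_per_heliostat_m2 = intercepted_kw / DNI_KW_PER_M2 / N_HELIOSTATS

print(f"Field must intercept ~{intercepted_kw:.0f} kW of sunlight")
print(f"~{area_per_heliostat_m2:.1f} m² of mirror per heliostat")
```

Under those assumptions each heliostat needs roughly 13 m² of mirror, a plausible size for a rooftop field, which suggests the 600 kW figure is at least self-consistent.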

Eight concrete pillars curve up and out to provide the base for the rooftop solar concentrating plant. The pillars are held together in the middle with a large tension ring. The interior of the structure features a cafe and an observation deck, which would offer spectacular views of the city. Near the bottom, the tower features a conference facility, a children’s library, and a cultural center.

The spire’s construction features eight wind towers that are used to help provide a natural cooling effect, where hot air is drawn up and out of the structure via chimney effect and cool air is drawn in. Living walls and roofs also help cool the building by helping to moderate temperatures. The vegetation acts as a “heat sink for modulating the temperature variations in a similar way to mud walls in traditional indigenous huts.”

This tower has an intriguing design and is the first one that we have come across to incorporate concentrating solar thermal energy into skyscraper design. The building was also designed to qualify as Triple Zero – zero waste, zero energy and zero emissions, which is a very ambitious design goal. Although this concept did not win the competition, we hope to see its ambitious array of sustainable features integrated into future projects.

Inhabitat

TStzmmalaysia
post Feb 10 2011, 09:30 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


APPLIED SCIENCES

Attached Image

Structural Geodesics

This intriguing skyscraper proposal by Vahan Misakyan designed for the city of Yerevan in Armenia consists of an assemblage of structural geodesics that form three piercing towers linked by habitable bridges at the top and bottom.

Different programs, including offices, residences, and a hotel, are located in the towers – the geodesics change in size and configuration depending on the program. The bridges are used as commercial and recreational areas for the general public.

One of the main concepts of the proposal is to create a soft transition between the vertical and horizontal planes by creating surfaces that peel off from the ground and transform into habitable areas. A transportation hub for the entire region emerges from one of these structures while a second one creates a bridge and a recreational park.

The building is designed with the latest green technologies. An “intelligent” skin controls, through mechanical openings, the amount of incident light, and could also be used to reduce heat and provide natural ventilation. This skin is also equipped with rainwater collection systems, photovoltaic cells, and wind turbines.

Evolo

TStzmmalaysia
post Feb 10 2011, 09:32 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


APPLIED SCIENCES

Attached Image

Futuristic London Bridge Sprouts Solar Powered Vertical Farm

Recently Chetwood Architects unveiled a stunning proposal for a futuristic London Bridge that sprouts a towering vertical farm in the midst of the River Thames. The bridge’s solar-powered spires are crowned with wind turbines and house a self-sufficient organic farm and commercial center that takes advantage of renewable energy generation, efficient use of water, solar heating, and natural ventilation.

In Medieval times, the London Bridge was an active place covered with buildings and merchants on both sides, and a major thoroughfare for people and carts to travel from one side of London to the other. That bridge is long since gone, with many bridges having replaced it since. One of those bridges was even sold to a wealthy American and reconstructed in Lake Havasu City, Arizona.

Recently, the Worshipful Company of Chartered Architects (WCCA), along with the Royal Institute of British Architects (RIBA), held a design competition for a new inhabited version of the London Bridge. The winner of that design competition was Laurie Chetwood with his vertical farm and public market.

Taking cues from the old bridge, Chetwood designed a concept that not only made the London Bridge a central meeting spot and place to gather, but also a place of commerce. The updated bridge, which crosses the Thames, would not only sell food, but would also produce it via a vertical farm. The bridge is centered around two main elements – a vertical farm and a commercial center for fresh food markets, cafes, restaurants, and residential accommodations. A pier connected to the bridge allows goods to be delivered and bought at the water level and even more produce to be grown via hydroponics. Two produce markets will be placed on either side of the bridge, one a wholesale market and the other a public organic market.

Beyond its organic farm, the new bridge will also take advantage of renewable energy generation, efficient use of water, and efficient heating and cooling technologies. First, the vertical farm acts as a cooling tower, drawing cool air in at bridge level while hot air is pushed out through the top. This natural ventilation also powers a vertical-axis wind turbine placed at the top of the tower. Solar heating for hot water occurs in convection coils, while ETFE over the core of the farm provides a lightweight solar PV skin for electricity generation. Any excess heat not needed for the farm will be provided to the retailers. Rainwater collection will support the restrooms and the hydroponic farm, and greywater will be treated and recycled.

The judges declared Chetwood’s design to be “A beautifully presented scheme, wildly imaginative yet very thoroughly considered, both in terms of its construction but also how it could sit within the wider context. The design refers to the surrounding buildings, using them as reference points and inspiration behind the form. It is also full of interesting ecological ideas and on all levels seems to work well. This was a unanimous first choice amongst the panel.”

Inhabitat


TStzmmalaysia
post Feb 10 2011, 09:34 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


ENERGY

Attached Image

The Key to Better Solar Cells: Bumpy Mirrors

Dye-sensitized thin-film solar cells are cheaper to make than conventional silicon cells, but they're still relatively inefficient.

Now researchers at Stanford University have used a specially designed metal reflector to boost the efficiency of solid electrolyte dye-sensitized solar cells by as much as 20 percent. The reflector is a thin silver film with an array of nanoscale bumps. The researchers use the film to coat the cells' back surface; the film helps trap more light inside the cells. "We get about 5 to 20 percent more absorption depending on the dye," says Michael McGehee, director of the Center for Advanced Molecular Photovoltaics at Stanford. McGehee led the research, which was published online this week in the journal Advanced Energy Materials.

Dye-sensitized thin-film cells with a light-to-electricity conversion efficiency of around 11 percent recently made their commercial debut. However, they use liquid electrolytes that are volatile and could leak. Cells with solid electrolytes have only shown efficiencies of about 5 percent.

"They took the best solid-state dye cell they could, and made it better," says David Ginger, a chemistry professor at the University of Washington, of the Stanford researchers. "Even better, they did it using technology and methods that could potentially be used in a production environment."


Dye-based solar cells are composed of semiconductor nanocrystals (typically titanium dioxide, or titania) that are coated with dye molecules and sandwiched—along with an electrolyte—between glass or plastic sheets. The dye absorbs light and creates electrons and positively charged holes. The crystals transfer the electrons to one electrode to produce an electrical current, while the electrolyte carries the holes to the other electrode.

Solid electrolytes are not as efficient as liquid ones, though, and the electrons and holes recombine more easily. To prevent that, the titania layer is very thin—typically two micrometers. But the thinner the cells, the more quickly light passes through them without getting absorbed. Research efforts to improve the efficiency of these cells have typically focused on developing stronger dyes and new types of nanocrystals. But McGehee and his colleagues used plasmonic reflectors to improve their cell's efficiency.
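The trade-off described above, thinner layers absorb less light, can be sketched with the Beer-Lambert law. The absorption coefficient below is an assumed illustrative value, and a real plasmonic reflector traps light more subtly than the simple path-doubling modeled here:

```python
import math

# Beer-Lambert sketch: fraction of light absorbed in a single pass
# through a dye-sensitized layer of thickness d is A = 1 - exp(-alpha*d).
# alpha is an assumed effective absorption coefficient, NOT a measured
# value from the Stanford work.
ALPHA_PER_UM = 0.3  # assumed effective absorption coefficient (1/µm)

def absorbed_fraction(thickness_um, passes=1):
    """Fraction of incoming light absorbed after `passes` traversals."""
    return 1.0 - math.exp(-ALPHA_PER_UM * thickness_um * passes)

single = absorbed_fraction(2.0)           # one pass through a 2 µm layer
with_mirror = absorbed_fraction(2.0, 2)   # a back reflector doubles the path

print(f"one pass: {single:.2f}, with reflector: {with_mirror:.2f}")
```

Even this crude model shows why a back reflector helps a thin cell: doubling the optical path raises absorption substantially without thickening the layer and worsening recombination.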

Plasmons are the oscillations of electrons at a metal surface when they are excited by light. By controlling the shape of the surface, you can control the type of plasmons created, which in turn influences how light interacts with the material.

More on Technology Review

TStzmmalaysia
post Feb 11 2011, 09:30 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


BIOTECHNOLOGY

Attached Image

Zundel and Cristea’s Urban Farms

Urban food production is not a new concept. We’ve seen countless designs here on eVolo for vertical farms, urban ecosystems, and arcologies, but French firm Zundel and Cristea has taken the urban farm concept in an entirely different direction. Instead of proposing a monumental project like a vertical farm, they put together a design for smaller urban farm centers planted throughout a city.

The centers are designed to grow food, process it, and in some cases even serve it in on-site restaurants. The inner bowls of the spiraling structures hold the green space where various types of food and greenery are grown. Visitors and urban farmers would go out onto the spirals to harvest and enjoy the green space. Food would then be taken into the superstructure and processed, where it could be served or packaged and brought to market.

The small scale of each of the double-spiral structures allows Zundel and Cristea’s urban farms to be regional centers for the districts they individually serve, a sort of park and bazaar in one. Placing them in urban landscapes also reduces the greenhouse gas emissions that would normally be produced transporting produce from rural farms to city centers. The centers would be topped with wind turbines as well, creating energy-sustainable landmarks that are economically, socially, and agriculturally productive.

Zundel and Cristea designed their urban farms as a part of what they see as a shifting focus of city planning and architectural design from industrial functionality to the environmentally conscious and ecological. Their straightforward and refined design makes urban farming on a large scale a feasible element in the city of the future.

Attached Image

Evolo




TStzmmalaysia
post Feb 11 2011, 09:33 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


NANOTECHNOLOGY

Attached Image

Revolutionary Microchip Technology

A team of scientists at the Tyndall National Institute, UCC, has made the world's first junctionless transistor even smaller. The transistor is the building block of the microchip. The development of the world's first junctionless transistor by Tyndall's Professor Jean-Pierre Colinge had already sparked huge interest among leading semiconductor manufacturers around the globe when it was published in Nature Nanotechnology.

The announcement was made as part of the programme of events taking place for Nanoweek which runs from 31st January to 4th February.

"The semiconductor industry was excited by the development of the junctionless transistor as it could represent simpler manufacturing processes of transistors. Considering that there are approximately 2 billion transistors on a single microprocessor, any improvement in the performance or structure of the transistor is always hugely significant for the semiconductor industry. Once we had developed the junctionless transistor our attention went towards making it even smaller. We have succeeded in making it at 50 nanometres, which is 20 times smaller than the transistors that were published in Nature Nanotechnology," explains Professor Jean-Pierre Colinge, Tyndall National Institute.

Today's electronic devices are power hungry and feature hungry. The electronics industry is looking for ways to pack more features into their devices while making them more energy efficient.

"The new smaller junctionless transistor is now 30% more energy efficient and outperforms current transistors on the market. Working with my colleagues in the Theory Group at Tyndall, we had predicted that the transistor could perform on a smaller scale and I am happy to say that we were correct in our predictions. It can be difficult to imagine the actual size of a transistor. However, if we look at a strand of our hair and imagine that the 50 nanometer junctionless transistor made in Tyndall is 2,000 times smaller, we can perhaps get a better idea of just what size scale we are working on," says Colinge.
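Colinge's hair comparison checks out arithmetically. A quick sanity check (the typical hair-width range cited in the comment is general knowledge, not from the article):

```python
# Sanity check of the scale comparison in the quote: a 50 nm transistor
# that is 2,000 times smaller than a hair implies a hair width of
# 50 nm * 2000 = 100 µm, which sits inside the typical human-hair
# range of roughly 17-180 µm.
transistor_nm = 50
factor = 2_000
implied_hair_um = transistor_nm * factor / 1_000  # nm -> µm

print(f"implied hair width: {implied_hair_um:.0f} µm")
```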

ScienceDaily

TStzmmalaysia
post Feb 11 2011, 09:37 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


BIOTECHNOLOGY

Attached Image

Dragonfly concept aims for ecological self-sufficiency in New York

The latest concept design from Vincent Callebaut Architects – the Dragonfly – has been designed to ease the ever-increasing need for ecological and environmental self-sufficiency in the urban cityscape. The proposed development, sited on the southern bank of Roosevelt Island in New York, follows a vertical farm design which, it is hoped, would combine food cultivation, farming and renewable energy generation in an urban setting.

The unique 128-floor, 700m concept design is spread over two oblong towers and proposes a prototype urban farm in which a mixed programme of housing, offices, laboratories and farming spaces is laid out vertically over several floors and cultivated by its inhabitants. The architecture of the design proposes reinventing the vertical building, so associated with the New York skyline of the 19th and 20th centuries, structurally, functionally and ecologically.

The functional organisation of the design is arranged around two 600m towers, symmetrically placed around a huge climatic greenhouse of glass and steel that links them. This greenhouse, which defines the shape of the design, supports the load of the building and is directly inspired by the structural exoskeleton of dragonfly wings. Two inhabited rings buttress the ‘wings,’ and along their exterior are solar panels, which would provide up to half the building’s electricity, with the rest supplied by three wind machines along the vertical axes of the building.

While most would argue that the unconventional design of Dragonfly would be more suited to Dubailand than New York, the conceptual design tackles the contemporary dilemma of food production and agriculture in a city sorely lacking in the horizontal space required to do so, as well as attempting to achieve this in an ecologically sound and renewable way by merging production and consumption in the heart of the city.

Attached Image

World Architecture News

TStzmmalaysia
post Feb 11 2011, 09:38 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


BIOTECHNOLOGY

Attached Image

'Robotic' growth factor speeds healing of chronic wounds

Chronic wounds, such as diabetic foot ulcers and burns, can be very difficult to heal. This can result in pain, infection, or worse. Proteins known as growth factors have been shown to help such wounds heal, although purifying these proteins can be pricey, and they don’t last very long once applied to a wound. There is now hope, however, in a nanometer-sized drug that its creators are describing as “robotic.”

The drug is a genetically-engineered protein, consisting of elastin-like peptides and keratinocyte growth factor. It was created by a team of scientists from the Hebrew University of Jerusalem, Harvard Medical School and others in the U.S. and Japan. They call it “robotic” because like a robot, it responds to its environment by carrying out a specific activity – when exposed to heat, dozens of the growth factors will fold together, forming a nanoparticle that is over 200 times smaller than a single hair.

This characteristic makes purification of the protein much easier, and thus a lot cheaper to produce than other growth factors. It is also better at remaining on wound sites.

So far, the drug has been applied in a topical ointment to wounds on diabetic mice, where it caused a dramatic increase in the rate of healing. Human clinical trials should follow in the future, once the drug has been further tested and refined.

GizMag

TStzmmalaysia
post Feb 11 2011, 09:40 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


APPLIED SCIENCES

Attached Image

Water Droplet Resort Will Convert Air into Purified Water

Architecturally and thematically designed in the shape of a drop of water, the Water Building Resort intends to become the first building ever to convert air into water with the help of solar power. What sounds like magic will be achieved with the following combination of nature and technology: A sunny, southerly facing facade made of photovoltaic glass will harness solar energy, allowing light to pass through. The northern facade features a latticed design for ventilation as well as unprecedented Teex Micron equipment that will convert humid air and condensation into pure drinking water.

Attached Image

Designed for construction on warm, humid coasts, the Water Building Resort will also house a water treatment facility on the bottom floor for purifying salty sea water and rain water, along with a technological research center to monitor and certify water quality. Restaurants, gyms, exhibition halls, hotel and conference rooms, and spa services will fill the upper floors – all based on the theme of water, the environment and renewable energy. An underwater aquarium will sit at the base of the Water Building Resort, rounding out the water-conscious theme and practices.

Inhabitat
TStzmmalaysia
post Feb 12 2011, 01:16 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


APPLIED SCIENCES

Attached Image

How a new manufacturing technology will change the world

THE industrial revolution of the late 18th century made possible the mass production of goods, thereby creating economies of scale which changed the economy—and society—in ways that nobody could have imagined at the time. Now a new manufacturing technology has emerged which does the opposite. Three-dimensional printing makes it as cheap to create single items as it is to produce thousands and thus undermines economies of scale. It may have as profound an impact on the world as the coming of the factory did.

It works like this. First you call up a blueprint on your computer screen and tinker with its shape and colour where necessary. Then you press print. A machine nearby whirrs into life and builds up the object gradually, either by depositing material from a nozzle, or by selectively solidifying a thin layer of plastic or metal dust using tiny drops of glue or a tightly focused beam. Products are thus built up by progressively adding material, one layer at a time: hence the technology’s other name, additive manufacturing. Eventually the object in question—a spare part for your car, a lampshade, a violin—pops out. The beauty of the technology is that it does not need to happen in a factory. Small items can be made by a machine like a desktop printer, in the corner of an office, a shop or even a house; big items—bicycle frames, panels for cars, aircraft parts—need a larger machine, and a bit more space.
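The layer-by-layer principle described above can be illustrated with a toy "slicer" that stacks thin horizontal cross-sections into a solid. The sphere, radius, and layer height below are arbitrary illustrative choices, not any real printer's parameters:

```python
import math

# Toy illustration of additive manufacturing: an object is built as a
# stack of thin horizontal layers. Here we "slice" a sphere of radius
# 10 mm into 0.1 mm layers and sum up the material each pass deposits.
RADIUS_MM = 10.0
LAYER_MM = 0.1

def layer_area(z):
    """Cross-sectional area (mm^2) of the sphere at height z above centre."""
    r_sq = RADIUS_MM**2 - z**2
    return math.pi * r_sq if r_sq > 0 else 0.0

n_layers = round(2 * RADIUS_MM / LAYER_MM)
# Evaluate each layer at its mid-height, as a slicer would.
zs = [-RADIUS_MM + (i + 0.5) * LAYER_MM for i in range(n_layers)]
volume = sum(layer_area(z) for z in zs) * LAYER_MM

print(f"{n_layers} layers; printed volume ≈ {volume:.0f} mm³")
print(f"exact sphere volume: {4 / 3 * math.pi * RADIUS_MM**3:.0f} mm³")
```

The stacked layers reproduce the exact volume almost perfectly, which is the essence of additive manufacturing: any shape reduces to a sequence of 2D deposition passes.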

At the moment the process is possible only with certain materials (plastics, resins and metals) and with a precision of around a tenth of a millimetre. As with computing in the late 1970s, it is currently the preserve of hobbyists and workers in a few academic and industrial niches. But like computing before it, 3D printing is spreading fast as the technology improves and costs fall. A basic 3D printer, also known as a fabricator or “fabber”, now costs less than a laser printer did in 1985.

Just press print

The additive approach to manufacturing has several big advantages over the conventional one. It cuts costs by getting rid of production lines. It reduces waste enormously, requiring as little as one-tenth of the amount of material. It allows the creation of parts in shapes that conventional techniques cannot achieve, resulting in new, much more efficient designs in aircraft wings or heat exchangers, for example. It enables the production of a single item quickly and cheaply—and then another one after the design has been refined.

For many years 3D printers were used in this way for prototyping, mainly in the aerospace, medical and automotive industries. Once a design was finalised, a production line would be set up and parts would be manufactured and assembled using conventional methods. But 3D printing has now improved to the point that it is starting to be used to produce the finished items themselves. It is already competitive with plastic injection-moulding for runs of around 1,000 items, and this figure will rise as the technology matures. And because each item is created individually, rather than from a single mould, each can be made slightly differently at almost no extra cost. Mass production could, in short, give way to mass customisation for all kinds of products, from shoes to spectacles to kitchenware.
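The ~1,000-unit break-even with injection moulding follows from the cost structure: moulding amortises a large one-off tooling cost over the run, printing does not. The cost figures below are hypothetical, chosen only so the crossover lands near the article's number:

```python
# Illustrative unit-cost crossover between injection moulding (high fixed
# tooling cost, low per-part cost) and 3D printing (no tooling, higher
# per-part cost). All figures are hypothetical.
MOULD_TOOLING = 9_500.0   # hypothetical one-off cost of a steel mould
MOULD_PER_PART = 0.50     # hypothetical per-part moulding cost
PRINT_PER_PART = 10.0     # hypothetical per-part 3D-printing cost

def unit_cost_moulding(n):
    """Tooling amortised over the run, plus the per-part cost."""
    return MOULD_TOOLING / n + MOULD_PER_PART

def unit_cost_printing(n):
    """No tooling to amortise: flat cost per part."""
    return PRINT_PER_PART

# Smallest run for which moulding beats printing.
breakeven = next(n for n in range(1, 100_000)
                 if unit_cost_moulding(n) <= unit_cost_printing(n))
print(f"moulding wins from ~{breakeven} units")
```

Below the crossover the mould never pays for itself, which is exactly why printing suits short runs and one-offs; above it, moulding's tiny marginal cost dominates.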

By reducing the barriers to entry for manufacturing, 3D printing should also promote innovation. If you can design a shape on a computer, you can turn it into an object. You can print a dozen, see if there is a market for them, and print 50 more if there is, modifying the design using feedback from early users. This will be a boon to inventors and start-ups, because trying out new products will become less risky and expensive. And just as open-source programmers collaborate by sharing software code, engineers are already starting to collaborate on open-source designs for objects and hardware.

The jobless technology

A technological change so profound will reset the economics of manufacturing. Some believe it will decentralise the business completely, reversing the urbanisation that accompanies industrialisation. There will be no need for factories, goes the logic, when every village has a fabricator that can produce items when needed. Up to a point, perhaps. But the economic and social benefits of cities go far beyond their ability to attract workers to man assembly lines.

Others maintain that, by reducing the need for factory workers, 3D printing will undermine the advantage of low-cost, low-wage countries and thus repatriate manufacturing capacity to the rich world. It might; but Asian manufacturers are just as well placed as anyone else to adopt the technology. And even if 3D printing does bring manufacturing back to developed countries, it may not create many jobs, since it is less labour-intensive than standard manufacturing.

The technology will have implications not just for the distribution of capital and jobs, but also for intellectual-property (IP) rules. When objects can be described in a digital file, they become much easier to copy and distribute—and, of course, to pirate. Just ask the music industry. When the blueprints for a new toy, or a designer shoe, escape onto the internet, the chances that the owner of the IP will lose out are greater.

There are sure to be calls for restrictions on the use of 3D printers, and lawsuits about how existing IP laws should be applied. As with open-source software, new non-commercial models will emerge. It is unclear whether 3D printing requires existing rules to be tightened (which could hamper innovation) or loosened (which could encourage piracy). The lawyers are, no doubt, rubbing their hands.

Just as nobody could have predicted the impact of the steam engine in 1750—or the printing press in 1450, or the transistor in 1950—it is impossible to foresee the long-term impact of 3D printing. But the technology is coming, and it is likely to disrupt every field it touches. Companies, regulators and entrepreneurs should start thinking about it now. One thing, at least, seems clear: although 3D printing will create winners and losers in the short term, in the long run it will expand the realm of industry—and imagination.

The ECONOMIST

TStzmmalaysia
post Feb 12 2011, 01:19 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


APPLIED SCIENCES

Attached Image

New Automated Technology Speeds Up E-Waste Recycling

One of the key reasons why we have a problem with old electronics being exported to toxic e-waste dumps is because that's where it's cheapest to deal with the stuff. Solving that problem means becoming more efficient with our recycling processes here in the US, so that there are no excuses for not dealing with our own mess. A recycling plant newly opened in Mississauga outside Toronto, Sims Recycling Solutions, is leading the way in smart methods for amping up the recycling process.

CNET has a great slideshow of images from an e-waste shredder, so you can get an idea of what the process looks like, and reports on what Sims Recycling Solutions is doing to automate the recycling stream. You can check out animations of the process as well. The facility takes technology already in use in recycling facilities, like optical sensors and metal-separating machines, and applies it specifically to e-waste. It can even collect the dust and process it so nothing goes to waste.

The upgrade comes as the company tries to make recycling profitable, which is tough to do. They rely on the fee charged on electronics at the time of purchase as well as collection fees. That means that yes, even the dust is valuable. The company will be able to process and resell 75,000 metric tons of e-waste annually. That's a whole lot of old gadgets, from monitors to computers to cell phones, whose parts can be re-purposed.



CNET's Martin LaMonica writes that toxic materials like fluorescent bulbs and batteries are sorted out beforehand; then gadgets are shredded and sent down conveyor belts, where magnets separate the metals and sensors pick out glass from plastics.

While the process is more high-tech and efficient, and is therefore a model for the recycling industry, it also underscores the larger problem of recycling -- it's expensive and a big hassle. That's why we love gadget design that incorporates repairability and easy separation of components for easy, inexpensive recycling. The best example of this design out there right now is the Bloom Laptop, designed by college students.

Smart tech for recycling plants is a must-have for a future filled with obsolete gadgets. But even more exciting than automated recycling, with optical sensors sorting conveyor belts full of shredded electronics, is designing that obsolescence out of the devices in the first place.

Treehugger


TStzmmalaysia
post Feb 12 2011, 01:20 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


APPLIED SCIENCES

Attached Image

Hydropolis Underwater Hotel

Currently under construction in Dubai, Hydropolis will be, along with the Poseidon, one of the world’s first luxury underwater hotels. It will comprise three elements: the land station, where guests will be welcomed; the connecting tunnel, which will transport people by train to the main area of the hotel; and the 220 suites within the submarine leisure complex. It is one of the largest contemporary construction projects in the world, covering an area of 260 hectares, about the size of London’s Hyde Park.

“Hydropolis is not a project; it’s a passion,” enthuses Joachim Hauser, the developer and designer of the hotel. His futuristic vision is about to take shape 20m below the surface of the Arabian Gulf, just off the Jumeirah Beach coastline in Dubai. The 220-suite hotel will incorporate a host of innovations that will take it far beyond the original blueprint for an underwater complex worthy of Jules Verne.

“There have been many visions of colonising the sea – Jules Verne, Jacques Cousteau and several Japanese architects – but no one has ever managed to realise this dream,” says Hauser. “That was the most challenging factor, and that’s what makes it so fascinating. Despite being a dream of mankind for centuries, nobody has ever been able to make living underwater possible.”



UNDERWATER HOTEL DESIGN

The original idea for Hydropolis developed out of Hauser’s passion for water and the sea, and goes much deeper than just building a hotel underwater. More than just curiosity, it is a commitment to a more far-reaching philosophy. “Once you start digging deeper and deeper into the subject, you can’t help being fascinated and you start caring about all the associated issues,” he explains. “Humans consist of 80% water, the earth consists of 80% water; without water there is no life.”

Hydropolis reproduces the human organism in an architectural design. There is a direct analogy between the physiology of man and the architecture. The geometrical element is a figure eight lying on its side and inscribed in a circle. The spaces created in the basin will contain function areas, such as restaurants, bars, meeting rooms and theme suites. These can be compared to the components of the human organism: the motor functions and the nervous and cardiovascular systems, with the central sinus knot representing the pulse of all life.

The ballroom, located at this nerve centre, will have asymmetrical pathways connecting the different storeys along ramps. A large, petal-like retracting roof will enable the staging of open-sky events. Staircases, lifts and ramps will provide access to the ballroom, while flanking catering areas will supply banquets and receptions.



HYDROPOLIS LAND STATION

In order to enter this surreal space, visitors will begin at the land station. This 120m woven, semicircular cylinder will arch over a multi-storey building. On the lowest level passengers board a noiseless train propelled by fully automated cable along a modular, self-supporting steel guideway to Hydropolis. A just-in-time and on-demand logistical system will facilitate efficient supply of goods to the hotel.

The upper storeys of the land station house a variety of facilities, including a cosmetic surgical clinic, a marine biological research laboratory and conference facilities. On the lower levels are the staff rooms, goods storage and loading areas, and hotel and parking areas.

The land station also includes a restaurant and high-tech cinema screening the evolution of life in the ocean and the history of underwater architecture. As a finale, the screen will open to reveal the real-life Hydropolis. A viewing platform at the front opening of the spanning roof will allow views of the architecture as well as the light shows of Hydropolis.



MARINE ARCHITECTURE

This structure promises to be a conceptual as well as a physical landmark. While human beings accept the existence of water, we have only a superficial appreciation of its significance. “We waste it, go swimming in it and generally take it for granted,” says Hauser. “Humans could actually live self-sufficiently underwater, generating energy, nurturing food supplies and so on. This is why we are starting a foundation to demonstrate something of the importance of water in our lives.

“My general plan was to create a living space in the sea. My initial proposal was a deep-sea project, which looked very different. I had to adjust to the local reality of the natural surroundings and change to a shallow water construction.

“We want to create the first ever faculty for marine architecture because I believe that the future lies in the sea, including the future of city planning. I am certain that one day a whole city will be built in the sea. Our aim is to lay the first mosaic by colonising the sea.”

Hauser plans to incorporate many different elements associated with the sea. The cosmetics will be ocean-based, the cinemas will screen films that focus on aquatic themes and a children’s seaworld will educate as well as entertain.

He views his creation as a place where those who do not dive – or do not even swim – can experience the tranquillity and inspiration of the underwater world. “We are expecting around 3,000 visitors per day in addition to the hotel guests. The aim is to inspire people to develop a new awareness of the sea.”

As well as emphasising the positive aspects of water, Hauser also believes we are systematically destroying marine life, and thus wishes to draw attention to various dangers and problems, such as the loss of algae and the destruction of the coral reefs.

TheBuilderBlog

TStzmmalaysia
post Feb 12 2011, 01:42 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


ENERGY

Attached Image

Tidal Power Streetlights Light Up Quays

We’ve covered a number of solar-powered streetlights over the past year or so. Now, a concept design for illumination via renewable energy near docks or quays: the Flowlight tidal power streetlight system by Shane Molloy (which comes to us via Tuvie).

The system is composed of a series of streetlamp-style carbon fiber poles without lamps; instead, they hold aloft a series of LED strip lights. Below the waterline, the poles are connected to water turbine blades (protected by sub-flotation housing units that shield the blades from the rock-strewn river bed), designed to rotate both clockwise and anticlockwise with the river's flow. As the tide rises and falls, the turbines harvest energy, which in turn illuminates the LED strip lights after dark.
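As a rough plausibility check, the power a small water turbine can extract follows the standard hydrokinetic formula P = ½ρAv³·Cp. Every number below is an illustrative assumption, not a Flowlight specification:

```python
# Back-of-envelope hydrokinetic power: P = 0.5 * rho * A * v^3 * Cp.
# All values are assumed, illustrative figures.
rho = 1025.0  # kg/m^3, density of seawater
A = 0.5       # m^2, swept area of a small blade set
v = 1.5       # m/s, tidal current speed
Cp = 0.35     # power coefficient, well under the ~0.59 Betz limit

P = 0.5 * rho * A * v**3 * Cp
print(round(P), "W")  # a few hundred watts, ample for LED strip lighting
```

Because power scales with the cube of current speed, siting the poles where tidal flow is fastest matters far more than blade size.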

While the Flowlight system was designed specifically for the quayside of the River Suir in Ireland, we think it could work equally well anywhere with a river close enough to sea level to respond to the tides, or even around deep seaside harbors.

Attached Image

From a functional point of view, it makes good sense to harvest renewable energy close to the point of consumption, and from an aesthetic point of view, that LED strip lighting is bound to look pretty cool reflected in the water.

EarthTechling


rahizan
post Feb 13 2011, 12:21 PM

Regular
******
Senior Member
1,366 posts

Joined: Dec 2010



TStzmmalaysia
post Feb 13 2011, 02:03 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

The 'new' kilogram is approaching: Avogadro constant determined with enriched silicon-28

A milestone in the international Avogadro project coordinated by the Physikalisch-Technische Bundesanstalt (PTB) has been reached: with the aid of a single crystal of highly enriched 28Si, the Avogadro constant has now been measured more precisely than ever before, with a relative overall uncertainty of 3 × 10⁻⁸. Within the scope of the redefinition of the kilogram, the value NA = 6.02214078(18) × 10²³ mol⁻¹ permits the currently most exact realization of this unit. The results have been published in the most recent edition of the journal Physical Review Letters.
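With the Avogadro constant fixed, a kilogram can in principle be realized by counting atoms: the sphere's mass equals its number of atoms times the molar mass divided by NA. A minimal sketch of the arithmetic, using an approximate molar mass for pure 28Si rather than the crystal's measured value:

```python
# Atom counting behind the sphere method (approximate, illustrative values).
N_A = 6.02214078e23   # mol^-1, the value reported above
M = 27.977e-3         # kg/mol, approximate molar mass of 28Si

atoms_per_kg = 1.0 * N_A / M   # atoms needed to make up exactly 1 kg
print(f"{atoms_per_kg:.3e} atoms in a 1 kg sphere")
```

Roughly 2.15 × 10²⁵ atoms, which is why the project must pin down the crystal's lattice parameter, volume and surface layer so precisely: the atoms are counted geometrically, not one by one.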

The crucial phase of the long-term Avogadro project - which is coordinated by PTB - started in 2003: in that year, several national metrology institutes, together with the Bureau International des Poids et Mesures (BIPM) and in cooperation with Russian research institutes, launched the ambitious project of manufacturing approximately 5 kg of highly enriched 28Si (99.99 %) as a single crystal, measuring the Avogadro constant with it, and achieving - by the year 2010 - a measurement uncertainty of approx. 2 × 10⁻⁸. Meanwhile, the first measurements have been completed on the two 1 kg spheres of 28Si - which had been polished in Australia - and their density, lattice parameter and surface quality have been determined.

The individual steps: after an extensive check of the crystal's perfection, the influence of crystal lattice defects was assessed. Then the lattice parameter was determined at the Italian metrology institute (INRIM) by means of an X-ray interferometer, and confirmed by comparison measurements with a natural Si crystal at the American NIST. At BIPM, NMIJ (Japan) and PTB, the masses of the two silicon spheres were linked in vacuum to the international mass standards. In the respective working groups of NMIJ, NMI-A (Australia) and PTB, the sphere volume was measured optically - with excellent agreement - by means of interferometers with different beam geometries. The surface layer (basically composed of silicon dioxide) was analyzed spectroscopically with electron, X-ray and synchrotron radiation in accordance with different procedures and taken into account in the determination of the silicon density. The unexpectedly high metallic contamination of the sphere surfaces with copper and nickel silicides, which occurred during the polishing process, was measured, and its influence on the results for the sphere volume and the sphere mass was assessed. This resulted in a higher measurement uncertainty.

What was decisive for the success achieved - i.e. a relative overall measurement uncertainty of 3 × 10⁻⁸ - was the development of a new mass-spectrometric method for the determination of the molar mass at PTB.

The result is a milestone on the way towards a successful realization of the new kilogram definition on the basis of fundamental constants whose values have been fixed. At present, the agreement of this value with other realizations of the kilogram is not good enough to change the existing definition of the mass unit. The present state of the Avogadro project is, however, so promising that - on the basis of new measurements with improved sphere interferometers - the measurement uncertainty of 2 × 10⁻⁸ demanded by the Consultative Committee for Mass (CCM) will in the near future be achieved on contamination-free spheres and will probably even be undercut.

PhysOrg

TStzmmalaysia
post Feb 13 2011, 02:05 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

Making a Point: Method Prints Nanostructures Using Hard, Sharp 'Pen' Tips Floating on Soft Polymer Springs

Northwestern University researchers have developed a new technique for rapidly prototyping nanoscale devices and structures that is so inexpensive the "print head" can be thrown away when done.

Hard-tip, soft-spring lithography (HSL) rolls into one method the best of scanning-probe lithography -- high resolution -- and the best of polymer pen lithography -- low cost and easy implementation.
HSL could be used in the areas of electronics (electronic circuits), medical diagnostics (gene chips and arrays of biomolecules) and pharmaceuticals (arrays for screening drug candidates), among others.


To demonstrate the method's capabilities, the researchers duplicated the pyramid on the U.S. one-dollar bill and the surrounding words approximately 19,000 times at 855 million dots per square inch. Each image consists of 6,982 dots. (They reproduced a bitmap representation of the pyramid, including the "Eye of Providence.") This exercise highlights the sub-50-nanometer resolution and the scalability of the method.
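The quoted figures can be cross-checked with simple arithmetic (treating the stated dot density as exact, which it surely is not):

```python
copies = 19_000        # reproductions of the pyramid image
dots_per_copy = 6_982  # dots per image
density = 855e6        # dots per square inch, as quoted

total_dots = copies * dots_per_copy       # dots written in all
pitch_nm = (1 / density) ** 0.5 * 25.4e6  # dot-to-dot spacing in nm
print(total_dots, round(pitch_nm))
```

That is over 130 million dots, with centres roughly 870 nm apart; sub-50-nanometre features are thus well separated from their neighbours at this density.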

The results will be published Jan. 27 by the journal Nature.

"Hard-tip, soft-spring lithography is to scanning-probe lithography what the disposable razor is to the razor industry," said Chad A. Mirkin, the paper's senior author. "This is a major step forward in the realization of desktop fabrication that will allow researchers in academia and industry to create and study nanostructure prototypes on the fly."
Mirkin is the George B. Rathmann Professor of Chemistry in the Weinberg College of Arts and Sciences and professor of medicine, chemical and biological engineering, biomedical engineering and materials science and engineering and director of Northwestern's International Institute for Nanotechnology.

Micro- and nanolithographic techniques are used to create patterns and build surface architectures of materials on a small scale.
Scanning probe lithography, with its high resolution and registration accuracy, currently is a popular method for building nanostructures. The method is, however, difficult to scale up and produce multiple copies of a device or structure at low cost.

Scanning probe lithographies typically rely on the use of cantilevers as the printing device components. Cantilevers are microscopic levers with tips, typically used to deposit materials on surfaces in a printing experiment. They are fragile, expensive, cumbersome and difficult to implement in an array-based experiment.

"Scaling cantilever-based architectures at low cost is not trivial and often leads to devices that are difficult to operate and limited with respect to the scope of application," Mirkin said.
Hard-tip, soft-spring lithography uses a soft polymer backing that supports sharp silicon tips as its "print head." The spring polymer backing allows all of the tips to come in contact with the surface in a uniform manner and eliminates the need to use cantilevers. Essentially, hard tips are floating on soft polymeric springs, allowing either materials or energy to be delivered to a surface.
HSL offers a method that quickly and inexpensively produces patterns of high quality and with high resolution and density. The prototype arrays containing 4,750 tips can be fabricated for the cost of a single cantilever-based tip and made in mass, Mirkin said.

Mirkin and his team demonstrated an array of 4,750 ultra-sharp silicon tips aligned over an area of one square centimeter, with larger arrays possible. Patterns of features with sub-50-nanometer resolution can be made with feature size controlled by tip contact time with the substrate.
They produced patterns "writing" with molecules and showed that as the tips push against the substrate the flexible backing compresses, indicating the tips are in contact with the surface and writing is occurring. (The silicon tips do not deform under pressure.)

"Eventually we should be able to build arrays with millions of pens, where each pen is independently actuated," Mirkin said.
The researchers also demonstrated the ability to use hard-tip, soft-spring lithography to transfer mechanical and electrical energy to a surface.

ScienceDaily

TStzmmalaysia
post Feb 13 2011, 02:09 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

Biomedical Imaging: Ultrasound Guide Star and Time-Reversal Mirror Can Focus Light Deep Under the Skin

Astronomers have a neat trick they sometimes use to compensate for the turbulence of the atmosphere that blurs images made by ground-based telescopes. They create an artificial star called a guide star and use its twinkling to compensate for the atmospheric turbulence.

Lihong Wang, PhD, the Gene K. Beare Distinguished Professor of Biomedical Engineering at Washington University in St. Louis, has invented a guide star for biomedical rather than celestial imaging, a breakthrough that promises game-changing improvements in biomedical imaging and light therapy.

Wang's guide star is an ultrasound beam that "tags" light that passes through it. When it emerges from the tissue, the tagged light, together with a reference beam, creates a hologram.

When a "reading beam" is then shown back through the hologram, it acts as a time-reversal mirror, creating light waves that follow their own paths backward through the tissue, coming to a focus at their virtual source, the spot where the ultrasound is focused.

The technique, called time-reversed ultrasonically encoded (TRUE) optical focusing, thus allows the scientist to focus light to a controllable position within tissue.

Wang thinks TRUE will lead to more effective light imaging, sensing, manipulation and therapy, all of which could be a boon for medical research, diagnostics and therapeutics.

In photothermal therapy, for example, scientists have had trouble delivering enough photons to a tumor to heat and kill the cells. So they either have to treat the tumor for a long time or use very strong light to get enough photons to the site, Wang says. But TRUE will allow them to focus light right on the tumor, ideally without losing a single tagged photon to scattering.

"Focusing light into a scattering medium such as tissue has been a dream for years and years, since the beginning of biomedical optics," Wang says. "We couldn't focus beyond say a millimeter, the width of a hair, and now you can focus wherever you wish without any invasive measure."

The new method was published in Nature Photonics, which appeared online Jan. 16, and has since been spotlighted by Physics Today (both online and in print) and in a Nature Photonics Backstage Interview.

The problem

Light is in many ways the ideal form of electromagnetic radiation for imaging and treating biological tissues, but it suffers from an overwhelming drawback: light photons ricochet off nonuniformities in tissue the way a steel ball ricochets off the bumpers of an old-fashioned pinball machine.

This scattering prevents you from seeing even a short distance through tissue; you can't, for example, see the bones in your hand. Light of the right color can penetrate several centimeters into biological tissue, but even with the best current technology, it isn't possible to produce high-resolution images of objects more than a millimeter below the skin with light alone.

Ultrasound's advantages and drawbacks are in many ways complementary to those of light. Ultrasound scattering is a thousand times weaker than optical scattering.

Ultrasound, however, reveals only a tissue's density and compressibility, which are often not very telling: the density of early-stage tumors, for example, differs little from that of healthy tissue.

Ultrasound tagging

The TRUE technique overcomes these problems by combining for the first time two tricks of biomedical imaging science: ultrasound tagging and time reversal.

Wang had experimented with ultrasound tagging of light in 1994 when he was working at the M.D. Anderson Cancer Center in Houston, Texas. In experiments using a tissue phantom (a model that mimics the opacity of tissue), he focused ultrasound into the phantom from above, and then probed the phantom with a laser beam from the side.

The laser light had only one frequency as it entered the tissue sample, but the ultrasound, which is a pressure wave, changed the tissue's density and the positions of its scattering centers. Light passing through the precise point where the ultrasound was focused acquired different frequency components, a change that "tagged" these photons for further manipulation.

By tuning a detector to these frequencies, it is possible to sort photons arriving from one spot (the ultrasound focus) within the tissue and to discard others that have bypassed the ultrasonic beam and carry no information about that spot. The tagged photons can then be used to paint an image of the tissue at the ultrasound focus.

Ultrasound modulation of light allowed Wang to make clearer images of objects in tissue phantoms than could be made with light alone. But this technology selects only photons that have traversed the ultrasound field and cannot focus light.

Time reversal

While Wang was working on ultrasound modulation of light, a lab at the Langevin Institute in Paris, led by Mathias Fink, was working on time reversal of sound waves.

No law of physics is violated if waves run backward instead of forward. So for every burst of sound (or light) that diverges from a source, there is in theory a set of waves that could precisely retrace the path of the sound back to the source.

To make this happen, however, you need a time-reversal mirror, a device to send the waves backward along exactly the same path by which they arrived. In Fink's experiments, the mirror consisted of a line of transducers that detected arriving sound and fed the signal to a computer.

Each transducer then played back its sound in reverse -- in synchrony with the other transducers. This created what is called the conjugate of the original wave, a copy of the wave that traveled backward rather than forward and refocused on the original point source.

The idea of time reversal is so remote from everyday experience it is difficult to grasp, but as Scientific American reported at the time, if you stood in front of Fink's time-reversal "mirror" and said "hello," you would hear "olleh," and even more bizarrely, the sound of the "olleh," instead of spreading throughout the room from the loudspeakers, would converge onto your mouth.

In a 1994 experiment, Fink and his colleagues sent sound through a set of 2000 steel rods immersed in a tank of water. The sound scattered along all the possible paths through the rods, arriving at the transducer array as a chaotic wave. These signals were time-reversed and sent back through the forest of rods, refocusing to a point at the source location.

In effect, time reversal is a way to undo scattering.
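The same cancellation can be sketched numerically with phase conjugation, the optical stand-in for acoustic playback that TRUE relies on. Modeling the scattering medium as a random unitary matrix (an idealization: lossless and reciprocal, a stand-in for Fink's forest of rods), replaying the conjugated speckle field refocuses all the energy on the source mode:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 64  # number of spatial modes

# Idealized scattering medium: a random unitary "transmission matrix"
g = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
S, _ = np.linalg.qr(g)

# A point source inside the medium (the role the guide star plays in TRUE)
src = np.zeros(n)
src[n // 2] = 1.0

scattered = S @ src                 # chaotic speckle field at the detectors
replay = S.T @ np.conj(scattered)   # conjugate replay; reciprocity gives S.T

intensity = np.abs(replay) ** 2
print(intensity.argmax())           # energy refocuses on the source mode
```

For a unitary medium the refocusing is exact (S.T applied to the conjugated field reproduces the source); in real tissue, loss and incomplete wavefront capture make the focus sharp but imperfect.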

Combining the tricks

Wang was aware of the work with time reversal, but at first couldn't see how it might help solve his problem with tissue scattering.

In 2004, Michael Feld, a physicist interested in biomedical imaging, invited Wang to give a seminar at the Massachusetts Institute of Technology. "At dinner we talked about time reversal," Wang says. "Feld was thinking about time reversal, I was thinking about time reversal, and so was another colleague dining with us."

"The trouble was, we couldn't figure out how to use it. You know, if you send light through a piece of tissue, the light will scatter all over the place, and if you capture it and reverse it, sending it back, it will still be scattered all over the place, so it won't concentrate photons."

"And then 13 years after the initial ultrasound-tagging experiments, I suddenly realized I could combine these two techniques.

"If you added ultrasound, then you could focus light into tissue instead of through tissue. Ultrasound tagging lets you reverse and send back only those photons you know are going to converge to a focus in the tissue."

"Ultrasound provides a virtual guide star, and to make optical time reversal effective, you need a guide star," Wang says.

A time-reversal mirror for light

It's much easier to make a time-reversal mirror for ultrasound than for light. Because sound travels slowly, it is easy to record the entire time course of a sound signal and then broadcast that signal in reverse order.

But a light wave arrives so fast it isn't possible to record a time course with sufficient time resolution. No detector will respond fast enough. The solution is to record an interference pattern instead of a time course.

The beam that has gone through the tissue and a reference beam form an interference pattern, which is recorded as a hologram by a special photorefractive crystal.

Then the wavefront is reconstructed by sending a reading beam through the crystal from the direction opposite to that of the reference beam. The reading beam reconstitutes a reversed copy of the original wavefront, one that comes to a focus at the ultrasound focus.

Unlike the usual hologram, the TRUE hologram is dynamic and constantly changing. Thus it is able to compensate for natural motions, such as breathing and the flow of blood, and it adapts instantly when the experimenter moves the ultrasonic focus to a new spot.

More photons to work with

Wang expects the TRUE technique for focusing light within tissue will have many applications, including optical imaging, sensing, manipulation and therapy. He also mentions its likely impact on the emerging field of optogenetics.

In optogenetics, light is used to probe and control living neurons that are expressing light-activatable molecules or structures. Optogenetics may allow the neural circuits of living animals to be probed at the high speeds needed to understand brain information processing.

But until now, optogenetics has suffered from the same limitation that plagues optical methods for studying biological tissues. Areas of the brain near the surface can be stimulated with light sources directly mounted on the skull, but to study deeper areas, optical fibers must be inserted into the brain.

TRUE will allow light to be focused on these deeper areas without invasive procedures, finally achieving the goal of making tissue transparent at optical frequencies.

ScienceDaily

TStzmmalaysia
post Feb 13 2011, 02:11 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


APPLIED SCIENCES

Attached Image

Particles That Flock: Strange Synchronization Behavior at the Large Hadron Collider

In its first six months of operation, the Large Hadron Collider near Geneva has yet to find the Higgs boson, solve the mystery of dark matter or discover hidden dimensions of spacetime. It has, however, uncovered a tantalizing puzzle, one that scientists will take up again when the collider restarts in February following a holiday break.

Last summer physicists noticed that some of the particles created by their proton collisions appeared to be synchronizing their flight paths, like flocks of birds. The findings were so bizarre that “we’ve spent all the time since [then] convincing ourselves that what we were seeing was real,” says Guido Tonelli, a spokesperson for CMS, one of two general-purpose experiments at the LHC.

The effect is subtle. When proton collisions result in the release of more than 110 new particles, the scientists found, the emerging particles seem to fly in the same direction. The high-energy collisions of protons in the LHC may be uncovering “a new deep internal structure of the initial protons,” says Frank Wilczek of the Massachusetts Institute of Technology, winner of a Nobel Prize for his explanation of the action of gluons. Or the particles may have more interconnections than scientists had realized. “At these higher energies [of the LHC], one is taking a snapshot of the proton with higher spatial and time resolution than ever before,” Wilczek says.
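What is actually measured is a two-particle angular correlation: an excess of particle pairs at small relative azimuth. A toy version of the analysis, with a hand-built "flock" of invented numbers rather than real CMS data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy event: 90 particles at random azimuthal angles plus a 20-particle
# "flock" sharing a common direction (illustrative, not CMS data).
phi = np.concatenate([
    rng.uniform(0.0, 2 * np.pi, 90),
    rng.normal(1.0, 0.1, 20),
])

# Two-particle correlation: histogram the azimuthal angle between pairs.
dphi = np.abs(phi[:, None] - phi[None, :])
dphi = np.minimum(dphi, 2 * np.pi - dphi)   # wrap into [0, pi]
iu = np.triu_indices(len(phi), k=1)         # count each pair once
counts, _ = np.histogram(dphi[iu], bins=16, range=(0, np.pi))

print(counts.argmax())  # bin 0: an excess of small-angle pairs from the flock
```

Against a flat background from the uncorrelated particles, the flock shows up as a peak in the first bin; the real CMS analysis does the same bookkeeping in both azimuth and pseudorapidity over millions of events.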

When seen with such high resolution, protons, according to a theory developed by Wilczek and his colleagues, consist of a dense medium of gluons—massless particles that act inside the protons and neutrons, controlling the behavior of quarks, the constituents of all protons and neutrons. “It is not implausible,” Wilczek says, “that the gluons in that medium interact and are correlated with one another, and these interactions are passed on to the new particles.”

If confirmed by other LHC physicists, the phenomenon would be a fascinating new finding about one of the most common particles in our universe and one scientists thought they understood well.


TStzmmalaysia
post Feb 13 2011, 02:15 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

How the brain compresses visual information

In the Feb. 10 online issue of Current Biology, a Johns Hopkins team led by neuroscientists Ed Connor and Kechen Zhang describes what appears to be the next step in understanding how the brain compresses visual information down to the essentials.

Most of us are familiar with the idea of image compression in computers. File extensions like “.jpg” or “.png” signify that millions of pixel values have been compressed into a more efficient format, reducing file size by a factor of 10 or more with little or no apparent change in image quality. The full set of original pixel values would occupy too much space in computer memory and take too long to transmit across networks.

The brain is faced with a similar problem. The images captured by light-sensitive cells in the retina are on the order of a megapixel, and the brain does not have the transmission or memory capacity to deal with a lifetime of megapixel images. Instead, it must select only the most vital information for understanding the visual world.

They found that cells in area “V4,” a midlevel stage in the primate brain’s object vision pathway, are highly selective for image regions containing acute curvature. Experiments by doctoral student Eric Carlson showed that V4 cells are very responsive to sharply curved or angled edges, and much less responsive to flat edges or shallow curves.

To understand how selectivity for acute curvature might help with compression of visual information, co-author Russell Rasquinha (now at University of Toronto) created a computer model of hundreds of V4-like cells, training them on thousands of natural object images. After training, each image evoked responses from a large proportion of the virtual V4 cells — the opposite of a compressed format. And, somewhat surprisingly, these virtual V4 cells responded mostly to flat edges and shallow curvatures, just the opposite of what was observed for real V4 cells.

The results were quite different when the model was trained to limit the number of virtual V4 cells responding to each image. As this limit on responsive cells was tightened, the selectivity of the cells shifted from shallow to acute curvature. The tightest limit produced an eight-fold decrease in the number of cells responding to each image, comparable to the file size reduction achieved by compressing photographs into the .jpeg format. At this level, the computer model produced the same strong bias toward high curvature observed in the real V4 cells.

Why would focusing on acute curvature regions produce such savings? Because, as the group’s analyses showed, high-curvature regions are relatively rare in natural objects, compared to flat and shallow curvature. Responding to rare features rather than common features is automatically economical.
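The economy argument fits in one line: if each cell detects one feature type, the expected number of cells firing per image scales with that feature's frequency. The counts below are invented for illustration; the eight-fold figure in the study is an empirical result, not derived from these numbers:

```python
# Hypothetical feature statistics (illustrative, not the paper's data):
# counts of 1000 natural-image patches containing each feature type.
n_patches = 1000
with_flat = 800     # flat or shallowly curved edges are common
with_sharp = 100    # acute curvature is rare

# A cell fires whenever its preferred feature appears, so expected
# responses per patch scale with feature frequency:
savings = with_flat / with_sharp
print(savings)  # → 8.0-fold fewer responses when tuned to the rare feature
```

Tuning detectors to the rare feature thus directly reduces the number of active cells per image, which is exactly the compression the model reproduced.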

Despite the fact that they are relatively rare, high-curvature regions are very useful for distinguishing and recognizing objects, said Connor, a professor in the Solomon H. Snyder Department of Neuroscience in the School of Medicine, and director of the Zanvyl Krieger Mind/Brain Institute.

“Psychological experiments have shown that subjects can still recognize line drawings of objects when flat edges are erased. But erasing angles and other regions of high curvature makes recognition difficult,” he explained.

Brain mechanisms such as the V4 coding scheme described by Connor and colleagues help explain why we are all visual geniuses.

“Computers can beat us at math and chess,” said Connor, “but they can’t match our ability to distinguish, recognize, understand, remember, and manipulate the objects that make up our world.” This core human ability depends in part on condensing visual information to a tractable level. For now, at least, the .brain format seems to be the best compression algorithm around.

Kurzweil AI

TStzmmalaysia
post Feb 15 2011, 08:41 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

The Recycled Port? An Alternative to Dumping at Sea

In search of a sustainable alternative to dumping at sea or disposal on land, a Scandinavian consortium blended contaminated sediment with a special mix of binders to produce a safe construction material for use in ports and harbours.

Stricter regulations have reduced the use of hazardous chemicals and heavy metals in industrial activities, but their legacy lives on in the environment, notably in polluted soils and sediments. One sector where they present a particular headache is in the shipping and port industry, where dredging routinely turns up sediment contaminated with the likes of carcinogenic PCBs, TBT, cadmium, lead and mercury. Port owners are caught between constraints on dumping sediment at sea, the cheap but polluting option, and removing it to be treated for landfill, an expensive alternative.

Enter a recent EUREKA project, STABCON, in which a Swedish-Norwegian consortium -- of research bodies, binder manufacturers, port authorities and design consultants -- sought to adapt the 'stabilisation and solidification' method to treat polluted sediments and other dredged material commonly found in Scandinavia.

Having worked together on an earlier study into the potential of the stabilisation and solidification technique in Sweden for the country's environment protection agency, the project participants teamed up to test the method in a pilot study and draw up guidelines for ports.

A cost-effective solution

Led by Merox, a subsidiary of Swedish steelmaker Svenskt Stål AB (SSAB), they first compared the three alternatives for handling sediments -- dumping, stabilisation and solidification, and dredging and disposal on land -- from a sustainability perspective. Stabilisation and solidification proved to be a sustainable and cost-effective solution: contaminated sediments are mixed, on site, with binding products to create a solid material that locks in the hazardous substances.

As well as being more environmentally friendly than dumping and cheaper than landfilling, "this method offers a number of additional benefits," explained Göran Holm, R&D director of the Swedish Geotechnical Institute, one of the project partners. "It reduces the demand for natural resources, such as blasted rock; and by treating the sediments in situ and using them in port areas, the need for transport is reduced, along with the associated health risks."

Supported by funding from EUREKA member countries, the project partners conducted tests in a pilot project to identify the most suitable binder composition and ideal mixing procedure for a variety of contaminants and sediment types. Researchers observed the behaviour of the treated sediment for leakage, permeability, strength and durability. The binder they used was a mixture of cement and a Merox product, Merit 5000, a derivative from the steel-making process. The slag is able to bind heavy metals chemically at the same time as it cures.

Putting it to the test

The final step of the project translated the results into a report and guidelines for port authorities, to enable them to assess options for using stabilisation and solidification and select the best binder for their local conditions, while providing design principles for using treated sediments in harbour structures, such as paved areas, loading zones and buildings. The STABCON test site was the Swedish port of Oxelösund, itself a partner in the project. The port wanted to build a new harbour area, and needed to remove contaminated sediment while at the same time respecting Sweden's strict environmental regulations.

Its aim was to dredge a section of harbour and treat the sediment for use in the new land area. The team dredged about 500 cubic metres of soft sediment, and strengthened it with a mix of cement and Merit 5000. They placed the composition on gravel and sand, and studied its properties, taking samples and conducting laboratory tests for leakage, including in nearby waters. The results were impressive. Once stabilised, there was no degradation from a chemical point of view, and no physical damage either.

The new material also passed the test for durability. "We are pretty confident that it will last for the long term," said Therese Stark, a research and development engineer at Merox. "The main thing is to keep the sediments in as natural an environment as possible, which you can't do if you take them away to deposit elsewhere. We are trying to keep them as they were in the ocean." Key to the outcome of the project was a close working relationship between partners and the expertise that each brought to the table.

ScienceDaily

TStzmmalaysia
post Feb 15 2011, 08:43 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


SPACE SCIENCE

Attached Image

Europe set for landmark launch with robot freighter

A robot freighter is poised to blast into the skies on Tuesday in the heftiest liftoff in Europe's space programme, a mission that will also bring its tally of launches to a historic 200.

Designed to supply mankind's outpost in orbit, the Johannes Kepler will be hoisted by an Ariane 5 ES super-rocket from Kourou, French Guiana. Liftoff is pencilled for 2208 GMT.

A successful mission will boost the case for scientists who want the ATV to be the template of a manned spacecraft, placing ESA on an equal footing with the United States, Russia and China.

With a mass of more than 20 tonnes, the payload is the biggest ever taken aloft by the European Space Agency (ESA).

It is a monster compared with the 1.6-tonne test satellite launched in 1979 by Ariane 1, the pencil-thin trailblazer in ESA's exploration of space.

The unmanned supply ship is scheduled to navigate by starlight towards the International Space Station (ISS) and dock with it automatically, a feat of precision unmatched by any other space power.

"We will be working at a speed of around 28,000 kilometers (17,500 miles) per hour and our approach will be at seven centimetres (2.8 inches) a second, so although we are moving at this high speed, we will really be approaching the ISS very gently," explained mission director Kris Capelle.

The Johannes Kepler is the second of five Automated Transfer Vehicles (ATVs) that the ESA is building for the ISS.

The prototype ATV, the Jules Verne, carried out a flawless mission in 2008, silencing those who predicted an expensive firework display or a lethal collision with the space station.

If all goes well, its successor will dock with the ISS on February 23, carrying 7.1 tonnes of fuel, dry goods, oxygen and a scientific experiment, more than three times the load of Russia's Progress supply ship.

But it will not be bringing water, as the six ISS crew already have plenty of the precious stuff, says ESA.

It will then serve as a spare room and storage space, easing the cramped conditions for the ISS crew, and will fire its onboard engines to boost the station's altitude in six steps.

The ISS is in low orbit, but loses altitude because it is tugged by the tendrils of Earth's atmosphere. It is currently at around 360 kilometres (225 miles) and needs boosting to some 400 km (250 miles).
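
As a back-of-envelope check (my own sketch, not from the article), the "28,000 kilometres per hour" figure quoted earlier is just the circular-orbit speed at these altitudes:

```python
import math

# Earth's standard gravitational parameter (m^3/s^2) and mean radius (m)
MU = 3.986004418e14
R_EARTH = 6.371e6

def circular_velocity(altitude_m):
    """Speed of a circular orbit at the given altitude (vis-viva with e = 0)."""
    return math.sqrt(MU / (R_EARTH + altitude_m))

v360 = circular_velocity(360e3)  # current ISS altitude
v400 = circular_velocity(400e3)  # target altitude after the reboost

# About 27,700 km/h -- consistent with the ~28,000 km/h quoted above.
print(f"{v360 * 3.6:.0f} km/h at 360 km, {v400 * 3.6:.0f} km/h at 400 km")
```

Note the slightly counter-intuitive result: the higher orbit is actually a touch slower, even though boosting to it takes energy.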

On June 4, the Johannes Kepler will undock, laden with rubbish, human waste and unwanted hardware, and then go on a suicide plunge, burning up over the South Pacific.

Widely applauded for its robot missions, ESA has never had its own manned spaceflight capability. Its astronauts hitch rides with the US space shuttle -- due to be phased out this year -- and Russia's Soyuz.

"In the ATV, there are technological elements which are absolutely fine for transporting astronauts," said Olivier de la Bourdonnaye, director of the ATV 2 programme at Astrium Space Transportation.

"The docking system and propulsion system in particular meet all the safety standards for manned flight. However, to carry a crew, you need a whole lot more, notably a spacecraft that can cope with re-entry."

The first step towards this has been taken with a study for an experimental re-entry vehicle which will carry instruments back to Earth as a test of survival.

The Johannes Kepler is named after the great German mathematician of the 16th and 17th centuries who first calculated the movement of planetary bodies in elliptical orbits, paving the way to Isaac Newton's theories of gravitation.

PhysOrg
TStzmmalaysia
post Feb 15 2011, 08:50 AM


RESEARCH

Attached Image

The heaviest known antimatter

All ordinary nuclei are made of protons and neutrons (which in turn contain only up and down quarks). The standard Periodic Table of Elements is arranged according to the number of protons, which determine each element’s chemical properties. There is also a more complex, three-dimensional chart that conveys information about the number of neutrons, which may change in different isotopes of the same element, and a quantum number known as “strangeness,” which depends on the presence of strange quarks. Nuclei containing one or more strange quarks are called hypernuclei. For all ordinary matter, with no strange quarks, the strangeness value is zero and the chart is flat. Hypernuclei appear above the plane of the chart.

Last year, members of the STAR detector collaboration at RHIC published evidence of a form of strange antimatter with an anti-strange quark — an antihypernucleus — making it the first entry below the plane of the 3D chart of nuclides, laying the first stake in a new frontier of physics.

Collisions at RHIC fleetingly produce conditions that existed a few microseconds after the Big Bang, which scientists believe gave birth to the universe as we know it some 13.7 billion years ago. In both nucleus-nucleus collisions at RHIC and in the Big Bang, quarks and antiquarks emerge with equal abundance. Nuclear collisions are unique and distinct from elementary particle collisions because they deposit large amounts of energy into a more extended volume. In contrast to the Big Bang, the small amount of energy in nuclear collisions produces negligible gravitational attraction, which allows the resulting quark-gluon plasma to expand rapidly and to cool down and transition into a hadron gas, producing nucleons and their antiparticles.

At RHIC, among the collision fragments that survive to the final state, matter and antimatter are still close to equally abundant, even in the case of the relatively complex antinucleus and its normal-matter partner featured in the present study. In contrast, antimatter appears to be largely absent from the present-day universe.

The STAR team has found that the rate at which their heaviest antinucleus is produced is consistent with expectations based on a statistical collection of antiquarks from the soup of quarks and antiquarks generated in RHIC collisions. Extrapolating from this result, the experimenters believe they should be able to discover even heavier antinuclei in upcoming collider running periods. The next antimatter nucleus in line for discovery, and the most abundantly produced, is the antimatter Helium-4 nucleus, also known as the antimatter α (alpha) particle.

Dr. Xu, a member of the STAR collaboration, will describe the discovery of the first antimatter hypernucleus, the models which can describe the production mechanism and the abundance of these antimatter nuclei, and the remaining heaviest antimatter nucleus to be discovered in the foreseeable future at RHIC.

The STAR collaboration is composed of 54 institutions from 13 countries. Research at RHIC is funded primarily by the U.S. Department of Energy’s Office of Science, Office of Nuclear Physics, and by various national and international collaborating institutions.

PhysOrg

TStzmmalaysia
post Feb 15 2011, 08:59 AM


RESEARCH

Attached Image

Scientists Develop Control System to Allow Spacecraft to Think for Themselves

The world's first control system that will allow engineers to programme satellites and spacecraft to think for themselves has been developed by scientists from the University of Southampton.

Professor Sandor Veres and his team of engineers have developed an artificially intelligent control system called 'sysbrain'.

Using natural language programming (NLP), the software agents can read special English language technical documents on control methods. This gives the vehicles advanced guidance, navigation and feedback capabilities to stop them crashing into other objects and the ability to adapt during missions, identify problems, carry out repairs and make their own decisions about how best to carry out a task.

Professor Veres, who is leading the EPSRC-funded project, says: "This is the world's first publishing system of technical knowledge for machines and opens the way for engineers to publish control instructions to machines directly. As well as spacecraft and satellites, this innovative technology is transferable to other types of autonomous vehicles, such as autonomous underwater, ground and aerial vehicles."

To test the control systems that could be applied in a space environment, Professor Veres and his team constructed a unique test facility and a fleet of satellite models, which are controlled by the sysbrain cognitive agent control system.

The 'Autonomous Systems Testbed' consists of a glass covered precision level table, surrounded by a metal framework, which is used to mount overhead visual markers, observation cameras and isolation curtains to prevent any external light sources interfering with experimentation. Visual navigation is performed using onboard cameras to observe the overhead marker system located above the test area. This replicates how spacecraft would use points in the solar system to determine their orientation.
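
The core idea of that marker-based navigation can be sketched in a few lines (a toy illustration only; the real testbed uses calibrated cameras and a full pose solver, and all coordinates below are invented): if the onboard camera can see two known overhead markers, the model's rotation falls out of a single atan2.

```python
import math

def heading_from_markers(p1, p2):
    """Estimate heading (degrees) from the image positions of two overhead
    markers whose true baseline is aligned with the table's x-axis.
    The rotation of the observed baseline equals the model's rotation
    about the vertical axis."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    return math.degrees(math.atan2(dy, dx))

# Markers seen at these (hypothetical) pixel coordinates:
print(heading_from_markers((120, 240), (420, 240)))  # 0.0  -> no rotation
print(heading_from_markers((120, 240), (120, 540)))  # 90.0 -> quarter turn
```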

The perfectly-balanced model satellites, which rotate around a pivot point with mechanical properties similar to real satellites, are placed on the table and glide across it on roller bearings almost without friction to mimic the zero-gravity properties of space. Each model has eight propellers to control movement, a set of inertia sensors and additional cameras to be 'spatially aware' and to 'see' each other. The model's skeletal robot frame also allows various forms of hardware to be fitted and experimented with.

Professor Veres adds: "We have invented sysbrains to control intelligent machines. Sysbrain is a special breed of software agents with unique features such as natural language programming to create them, human-like reasoning, and most importantly they can read special English language documents in 'system English' or 'sEnglish'. Human authors of sEnglish documents can put them on the web as publications and sysbrain can read them to enhance their physical and problem solving skills. This allows engineers to write technical papers directly for sysbrain that control the machines."
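
sEnglish itself is a full natural-language-programming system, but the basic idea of machine-readable technical prose can be shown with a toy: reduce a restricted-English document to a lookup of sentence patterns onto machine actions. This is purely my illustration; the patterns, actions and data here are invented, not taken from sysbrain.

```python
# Toy illustration of restricted-English documents driving a machine.
# NOT sEnglish -- every name below is invented for the sketch.
ACTIONS = {
    "increase altitude": lambda sat: sat.update(altitude=sat["altitude"] + 10),
    "hold position": lambda sat: None,
}

def execute_document(sentences, satellite):
    """Run each sentence the machine recognizes; skip the rest as prose."""
    for sentence in sentences:
        key = sentence.lower().rstrip(".")
        if key in ACTIONS:          # known instruction: act on it
            ACTIONS[key](satellite)

sat = {"altitude": 360}
execute_document(["Increase altitude.", "Hold position."], sat)
print(sat["altitude"])  # 370
```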

ScienceDaily

TStzmmalaysia
post Feb 15 2011, 09:01 AM


APPLIED SCIENCE

Attached Image

System for manipulation of an actuated surface

Created by Anthony DeVincenzi, David Lakatos, Matthew Blackshaw, Daniel Leithinger and Hiroshi Ishii at the MIT Media Lab, Recompose is a system for manipulation of an actuated surface. By utilising openCV, Kinect (I believe) and gesture recognition, the team is working on an array of 120 individually addressable pins whose height can be actuated and read back simultaneously, creating an ever-transforming landscape responding to body behaviour.

Our system builds upon the Relief table, developed by Leithinger. The table consists of an array of 120 individually addressable pins, whose height can be actuated and read back simultaneously, thus allowing the user to utilize them as both input and output. Building upon this system, we have furthered the design by placing a depth camera above the tabletop surface. By gaining access to the depth information we are able to detect basic gestures from the user. In order to provide visual feedback related to user interaction, a projector is mounted above the table and calibrated to be coincident with the depth camera. Computer vision is utilized to determine and recognize the position, orientation, and height of hands and fingers, in order to detect gestural input.
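
The pin array's dual input/output role is the interesting part, and can be sketched as a simple data structure (my sketch only; the real Relief hardware uses motorized pins with position feedback, and the dimensions and travel range here are assumptions):

```python
class PinTable:
    """Sketch of an actuated pin surface: each pin's height can be both
    set (output) and read back (input), so the same surface serves as
    display and sensor."""
    def __init__(self, rows=12, cols=10):       # 120 pins, as in Relief
        self.height = [[0.0] * cols for _ in range(rows)]

    def actuate(self, r, c, h):
        """Drive one pin to height h, clamped to an assumed 0-100 mm travel."""
        self.height[r][c] = max(0.0, min(100.0, h))

    def read(self, r, c):
        """Read a pin back -- e.g. after the user has pressed it down."""
        return self.height[r][c]

table = PinTable()
table.actuate(3, 4, 55.0)
print(table.read(3, 4))  # 55.0
```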

Attached Image

Creative Applications

TStzmmalaysia
post Feb 15 2011, 09:03 AM


ROBOTICS

Attached Image

Researchers working toward automating sedation in intensive care units

Researchers at the Georgia Institute of Technology and the Northeast Georgia Medical Center are one step closer to their goal of automating the management of sedation in hospital intensive care units (ICUs). They have developed control algorithms that use clinical data to accurately determine a patient's level of sedation and can notify medical staff if there is a change in the level.

"ICU nurses have one of the most task-laden jobs in medicine and typically take care of multiple patients at the same time, so if we can use control system technology to automate the task of sedation, patient safety will be enhanced and drug delivery will improve in the ICU," said James Bailey, the chief medical informatics officer at the Northeast Georgia Medical Center in Gainesville, Ga. Bailey is also a certified anesthesiologist and intensive care specialist.

During a presentation at the IEEE Conference on Decision and Control, the researchers reported on their analysis of more than 15,000 clinical measurements from 366 ICU patients they classified as "agitated" or "not agitated." Agitation is a measure of the level of patient sedation. The algorithm returned the same results as the assessment by hospital staff 92 percent of the time.

"Manual sedation control can be tedious, imprecise, time-consuming and sometimes of poor quality, depending on the skills and judgment of the ICU nurse," said Wassim Haddad, a professor in the Georgia Tech School of Aerospace Engineering. "Ultimately, we envision an automated system in which the ICU nurse evaluates the ICU patient, enters the patient's sedation level into a controller, which then adjusts the sedative dosing regimen to maintain sedation at the desired level by continuously collecting and analyzing quantitative clinical data on the patient."

This project is supported in part by the U.S. Army. On the battlefield, military physicians sometimes face demanding critical care situations and the use of advanced control technologies is essential for extending the capabilities of the health care system to handle large numbers of injured soldiers.

Working with Haddad and Bailey on this project are Allen Tannenbaum and Behnood Gholami. Tannenbaum holds a joint appointment as the Julian Hightower Chair in the Georgia Tech School of Electrical and Computer Engineering and the Wallace H. Coulter Department of Biomedical Engineering at Georgia Tech and Emory University, while Gholami is currently a postdoctoral fellow in the Georgia Tech School of Electrical and Computer Engineering.

This research builds on Haddad and Bailey's previous work automating anesthesia in hospital operating rooms. The adaptive control algorithms developed by Haddad and Bailey control the infusion of an anesthetic drug agent in order to maintain a desired constant level of depth of anesthesia during surgery in the operating room. Clinical trial results that will be published in the March issue of the journal IEEE Transactions on Control Systems Technology demonstrate excellent regulation of unconsciousness allowing for a safe and effective administration of an anesthetic agent.

Critically ill patients in the ICU frequently require invasive monitoring and other support that can lead to anxiety, agitation and pain. Sedation is essential for the comfort and safety of these patients.

"The challenge in developing closed-loop control systems for sedating critically ill patients is finding the appropriate performance variable or variables that measure the level of sedation of a patient, in turn allowing an automated controller to provide adequate sedation without oversedation," said Gholami.

In the ICU, the researchers used information detailing each patient's facial expression, gross motor movement, response to a potentially noxious stimulus, heart rate and blood pressure stability, noncardiac sympathetic stability, and nonverbal pain scale to determine a level of sedation.

The researchers classified the clinical data for each variable into categories. For example, a patient's facial expression was categorized as "relaxed," "grimacing and moaning," or "grimacing and crying." A patient's noncardiac sympathetic stability was classified as "warm and dry skin," "flushed and sweaty," or "pale and sweaty."

They also recorded each patient's score on the motor activity and assessment scale (MAAS), which is used by clinicians to evaluate level of sedation on a scale of zero to six. In the MAAS system, a score of zero represents an "unresponsive patient," three represents a "calm and cooperative patient," and six represents a "dangerously agitated patient." The MAAS score is subjective and can result in inconsistencies and variability in sedation administration.

Using a Bayesian network, the researchers used the clinical data to compute the probability that a patient was agitated. Twelve thousand measurements collected from patients admitted to the ICU at the Northeast Georgia Medical Center during a one-year period were used to train the Bayesian network, and the remaining 3,000 were used to test it.

In 18 percent of the test cases, the computer classified a patient as "agitated" but the MAAS score described the same patient as "not agitated." In five percent of the test cases, the computer classified a patient as "not agitated," whereas the MAAS score indicated "agitated." These probabilities signify an 18 percent false-positive rate and a five percent false-negative rate.
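
Those percentages are simply disagreement counts over the 3,000 held-out cases. A quick sketch (the raw counts below are back-derived from the stated percentages, not taken from the paper; note this convention divides by the whole test set, not the usual FP/(FP+TN) definition):

```python
def rates(n_test, n_false_pos, n_false_neg):
    """Error rates as fractions of the whole test set, the convention
    used in the article."""
    return n_false_pos / n_test, n_false_neg / n_test

# 540 and 150 cases are 18% and 5% of the 3,000-case test set.
fp, fn = rates(3000, 540, 150)
print(f"false positives {fp:.0%}, false negatives {fn:.0%}")
```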

"This level of performance would allow a significant reduction in the workload of the intensive care unit nurse, but it would in no way replace the nurse as the ultimate judge of the adequacy of sedation," said Bailey. "However, by relieving the nurse of some of the work associated with titration of sedation, it would allow the nurse to better focus on other aspects of his or her demanding job."

The researchers' next step toward closed-loop control of sedation in the ICU will be to continuously collect clinical data from ICU patients in real time. Future work will involve the development of objective techniques for assessing ICU sedation using movement, facial expression and responsiveness to stimuli.

Digital imaging will be used to assess a patient's facial expression and also gross motor movement. In a study published in the June 2010 issue of the journal IEEE Transactions on Biomedical Engineering, the researchers showed that machine learning methods could be used to assess the level of pain in patients using facial expressions.

"We will explore the relationship between the data we can extract from these multiple sensors and the subjective clinical MAAS score," said Haddad. "We will then use the knowledge we have gained in developing feedback control algorithms for anesthesia dosage levels in the operating room to develop an expert system to automate drug dosage in the ICU."

EurekAlert

TStzmmalaysia
post Feb 15 2011, 09:05 AM


ENERGY

Attached Image

Getting a charge out of solar 'paint'

Have you seen those big, bulky, breakable photovoltaic cells that now collect the sun's rays? Well, what if solar energy could be harnessed using tiny collectors that could be spray painted on a roof, a wall or even a window?

The science of converting sunlight into electrical energy is more than a century old, but the quest to do it efficiently and affordably continues.

"Not only does it involve fundamental science in terms of physics and chemistry, and in some cases biology, but there are major engineering challenges as well," notes Brian Korgel, a nanomaterials chemist at the University of Texas at Austin.

Korgel and his colleagues are a new breed of chemical engineers, looking for answers to the world's big problems.

"There was a time where the field of chemical engineering had a reputation of being really conservative. You'd get your degree in chemical engineering, and you'd work for a chemical plant with a hard hat or in a giant refinery," says Korgel.

That's no longer the only option.

"Chemical engineers are now able to take these new chemicals, like nanomaterials, and we're trying to create the technologies that can meet the global challenge of, say, energy sustainability. We're taking chemistry, we're inventing new ways to actually make materials that can't be made any other way," he continues.

With support from the National Science Foundation (NSF), that's what Korgel and his team are doing to create solar cells that are light, flexible, efficient and--often the biggest obstacle--affordable.

"It's challenging to get high efficiencies of conversion. For example, the basic single junction solar cell is fundamentally limited to an efficiency of 30 percent. So, if you made a perfect solar cell, the highest efficiency would be 30 percent," explains Korgel at his Austin lab.

Currently, manufacturing cells with anything near that level of efficiency requires high heat, a vacuum and is very expensive. Korgel's approach, using nanotechnology, is completely different.

"What we're doing right now in my research group is making nanocrystals. We're focused on 'CIGS'--copper, indium, gallium, selenide--and we make small particles of this inorganic material that we can disperse in a solvent, creating an ink or paint," he says.

This solar "paint" would have the same function as the large photovoltaic (PV) solar collectors on buildings and "solar farms" around the world.

Korgel describes the tiny collection devices as a "solar sandwich."

"So these devices are 'sandwiches,' where you have the metal contact on the bottom and metal contact on the top to extract the charge out; and the middle part is the part that absorbs out the light," explains Korgel.

This paint, made of the CIGS nanocrystals, can be sprayed on plastic, glass and even fabric to create a solar cell.

"So what we're able to do is create radically new ways of depositing inorganic films to make solar cells, and so we're trying to meet this challenge of much lower cost of manufacturing," he says.

One way to create these cells on a very large scale would be to print them on thin, flexible sheets, the same way huge presses now print newspapers. "And the final product would ideally look something like today's shingles," says Vahid Akhavan, one of Korgel's graduate research assistants. "You want to produce something that is very user friendly. So you could go to your local hardware store, buy them and install them on your roof."

These shingles would do double duty, generating electricity while serving as roofing material. They would also stand up better in bad weather, such as hail and windstorms, than some of today's more fragile solar collectors.

A lot of challenges need to be conquered before solar energy becomes so commonplace. High on that list is improving the efficiency of these nanomaterial cells. "Right now, we have made devices that have an efficiency of 3 percent, and to be commercial, you really need to be at 10 percent," says Korgel. "But I think we can get to 10 percent. Those are just engineering challenges; they are not necessarily easy, but they are not fundamental roadblocks."
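
What those efficiency numbers mean in delivered power is simple arithmetic (my sketch, assuming the standard 1,000 W/m² test irradiance):

```python
IRRADIANCE = 1000.0  # W/m^2, standard test-condition sunlight

def panel_output(efficiency, area_m2):
    """Electrical power (watts) from a panel of the given efficiency."""
    return IRRADIANCE * efficiency * area_m2

# A 10 m^2 rooftop patch at today's 3% versus the 10% commercial target:
print(f"{panel_output(0.03, 10.0):.0f} W")  # 300 W
print(f"{panel_output(0.10, 10.0):.0f} W")  # 1000 W
```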

Depending on what part of the world is looking to transition to solar energy, that improved efficiency is critical.

"I did my post-doc in Dublin, Ireland, so I know cloudy days with five hours of sunlight," says Korgel. "So if you want to use solar, you need to have efficient devices that can harvest the sun under those conditions."

Another obstacle will be determining what raw materials can be used if this technology is to be mass produced. Copper, indium, gallium and selenide are not all cheap or readily available.

"Ultimately, thinking much further out, you want to go with a technology where you use elements that are earth-abundant," says Korgel.

One possibility is silicon, which is made from sand, abundant across our planet. But extracting the silicon from the sand is now an incredibly energy-intensive process and the chemicals it takes to do that are pretty harsh on the environment.

Korgel, his students and colleagues see all those problems as having answers. And he's also motivated by non-scientists eager to wean the world from diminishing fossil fuels.

"Everyone realizes this is a major problem, and so many people want to see it solved and are incredibly enthusiastic and supportive of the scientific and engineering community. And it's inspiring," says Korgel. "What it's given me is a deep appreciation of how important this problem of meeting energy sustainability is. It drives you further on to try and meet that need."

The Korgel lab is also investigating medical uses for nanomaterials. "These nanomaterials have unique properties. They might be fluorescent and give off light, they can be magnetically responsive. If you shine light on them, they can generate heat. So you can take all of these unique properties, and then they're so small that they can flow around in your bloodstream and get into organs," he says.

For example, a nano probe could detect a cancer cell and then deliver the medicine to kill it. "So, if you could come up with a nanoscopic unit that could detect a variety of different types of cancers or different diseases and then carry out a therapy of some sort, that would be a big deal," he says.

PhysOrg

TStzmmalaysia
post Feb 15 2011, 09:08 AM


RESEARCH

Attached Image

Delving Into Manganite Conductivity

Chemical compounds called manganites have been studied for many years since the discovery of colossal magnetoresistance, a property that promises important applications in the fields of magnetic sensors, magnetic random access memories and spintronic devices. However, understanding -- and ultimately controlling -- this effect remains a challenge, because much about manganite physics is still not known. A research team led by Maria Baldini from Stanford University and Carnegie Geophysical Laboratory scientists Viktor Struzhkin and Alexander Goncharov has made an important breakthrough in our understanding of the mysterious ways manganites respond when subjected to intense pressure.

At ambient conditions, manganites have insulating properties, meaning they do not conduct electric charges. When pressure of about 340,000 atmospheres is applied, these compounds change from an insulating state to a metallic state, which easily conducts charges. Scientists have long debated the trigger for this change in conductivity.

The research team's new evidence, published online Feb. 11 in Physical Review Letters, shows that for the manganite LaMnO3, this insulator-to-metal transition is strongly linked to a phenomenon called the Jahn-Teller effect. This effect actually causes a unique distortion of the compound's structure. The team's measurements were carried out at the Geophysical Laboratory.

Counter to expectations, the Jahn-Teller distortion persists while LaMnO3 is still in its non-conductive insulating state. It is therefore reasonable to conclude that the switch from insulator to metal occurs when the distortion is suppressed, settling a longstanding debate about the nature of the manganite insulating state. The formation of inhomogeneous domains -- some with and some without distortion -- was also observed. This evidence suggests that the manganite becomes metallic when the ratio of undistorted to distorted domains reaches a critical threshold in favor of the undistorted.

"Separation into domains may be a ubiquitous phenomenon at high pressure and opens up the possibility of inducing colossal magnetoresistance by applying pressure" said Baldini, who was with Stanford at the time the research was conducted, but has now joined Carnegie as a research scientist.

Some of the researchers were supported by various grants from the Department of Energy, Office of Science and National Nuclear Security Administration. Some of the experiments were supported by DOE and Carnegie Canada.

ScienceDaily

TStzmmalaysia
post Feb 15 2011, 09:12 AM


APPLIED SCIENCES

Attached Image

Carbon-Scrubbing Artificial Trees for Boston City Streets

Trees naturally filter and clean our air, but in today's heavily polluted world it is too huge a task to expect Mother Nature to take care of it all herself. Taking this into account, designers Mario Caceres and Cristian Canonico have designed a set of beautiful air-filtering trees for the SHIFTboston urban intervention contest. Called TREEPODS, the designs harness biomimicry to efficiently emulate the carbon filtration qualities of trees.

The TREEPOD systems are capable of removing carbon dioxide from the air and releasing oxygen using a carbon dioxide removal process called "humidity swing." In addition to their air-cleansing abilities, TREEPODS will also include solar energy panels and will harvest kinetic energy through an interactive seesaw that visitors can play with at the TREEPOD's base. As passersby play on the seesaws, they power displays that explain the TREEPODS' de-carbonization process. Both the solar panels and the kinetic energy station will power the air filtration process, as well as interior lights.

The TREEPODS themselves will be made entirely of recycled/recyclable plastic from drink bottles. Based not only on trees, but on the human lung, the design of the “branches” will feature multiple contact points that serve as tiny CO2 filters. The proposed design, giant white and translucent canopies of trees, can be installed among existing trees or on their own. Interestingly, the TREEPODS have been compared to “urban furniture”: sleek yet functional design pieces that would fit into any urban environment. At night, the TREEPODS light up in an array of eye-catching colors.

Caceres and Canonico hope that these “trees” will function not just as examples of gorgeous urban design and sources of sustainable energy, but also as meeting places, allowing citizens to have an air purifying tree to sit under with friends and enjoy the day.

Attached Image

Inhabitat


TStzmmalaysia
post Feb 16 2011, 02:27 PM


RESEARCH

Attached Image

Researchers develop new hydrogen storage technology

Working with scientists from the STFC’s Rutherford Appleton Laboratory and the University of Oxford, LCN researchers Zeynep Kurban and Professor Neal Skipper and UCL graduate Dr Arthur Lovell have developed a new technology that allows hydrogen to be stored in a cheap and practical way, making it promising for widespread use as a carbon-free alternative to petrol.

The team have developed a new nano-structuring technique called ‘co-electrospinning’ to produce tiny plastic micro-fibres 30 times smaller than a human hair. These hollow micro-fibres have then been used to encapsulate hydrogen-rich chemicals known as hydrides, in a way that allows the hydrogen to be released at much faster rates and at lower temperatures than was previously possible. The encapsulation also protects the hydrides from oxygen and water, prolonging their life and making it possible to handle them safely in air.

This new nano-material contains as much hydrogen for a given weight as the high pressure tanks currently used in prototype hydrogen vehicles, and can also be made in the form of micro beads that can be poured and pumped like a liquid. These properties mean that the beads could be used to fill up tanks in cars and aeroplanes in a very similar way to current fuels, but crucially without producing the carbon emissions. This technology underpins the new spin-out company Cella Energy Ltd, which is based at the Harwell Science and Innovation Campus, Oxfordshire.

UCL doctoral student Zeynep Kurban (pictured), who played a key role in the scientific development while studying for her EngD in Molecular Modeling and Materials Science, said: “This new technology provides solutions to some of the key issues surrounding hydrogen storage systems, bringing us a step closer to commercialisation of these materials for clean energy applications.”

The lead A round investor in Cella Energy is Thomas Swan & Co. Ltd., a specialist UK chemical company established in 1926. Thomas Swan’s Advanced Materials Division is dedicated to the development of high specification materials for emerging technologies with particular focus on carbon nanomaterials and advanced coatings. Shareholders also include STFC Innovations Ltd, UCL Business PLC and the Chancellor, Masters and Scholars of the University of Oxford.

Dr Tim Fishlock, Business Manager at UCL Business said: “Cella Energy is capitalising on an innovative technology developed within the research labs of three world class research centres at RAL, UCL and Oxford, which brings the large scale adoption of hydrogen powered vehicles closer to reality. Thomas Swan & Co is a fantastic partner for Cella Energy and we wish the team every success with their future plans.”

PhysOrg

TStzmmalaysia
post Feb 16 2011, 02:29 PM



APPLIED SCIENCES

Attached Image

Full duplex radio technology developed that doubles speed of existing wireless networks

The full duplex radio with two transmitting antennas that cancel each other out at the receiving antenna

Whether it be over walkie-talkies or Wi-Fi, wireless communication is a one-way street, meaning radio traffic can flow in only one direction at a time on a specific frequency. To get around this limitation, mobile phone networks use a workaround that is expensive and requires careful planning, making the technique not feasible for other wireless networks. Now researchers at Stanford University have created a full duplex radio that allows wireless signals to be sent and received simultaneously, thereby instantly doubling the speed of existing networks.

The problem the researchers had to overcome is that when a radio is transmitting and receiving at the same time, the incoming signals are drowned out by the radio’s own transmissions.

"When a radio is transmitting, its own transmission is millions, billions of times stronger than anything else it might hear [from another radio]," says Philip Levis, assistant professor of computer science and of electrical engineering at Stanford. "It's trying to hear a whisper while you yourself are shouting," he says.

To overcome this problem a trio of electrical engineering graduate students, Jung Il Choi, Mayank Jain and Kannan Srinivasan, hit upon the idea of developing a radio receiver that could filter out the signal from its own transmitter so that the weak incoming signals could be heard. Similar to the way in which noise-canceling headphones filter out ambient noise, each radio would know exactly what it is transmitting, and therefore what it should filter out.



The idea seems so obvious that other researchers even told the students it wouldn’t work, reasoning that something so obvious must already have been tried unsuccessfully. Luckily for the students, the naysayers were wrong, and the team successfully developed the first full duplex radio device by designing a radio with two transmit antennas located on either side of a single receiving antenna. When the signals from the two transmitting antennas meet at the receiving antenna, they effectively cancel each other out – not completely, but enough to allow the receiving antenna to pick up signals from other radios.
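The cancellation trick can be sketched numerically. The snippet below is an illustrative model only (the carrier frequency and signal amplitudes are made-up values, not Stanford's): two copies of the same transmission whose path lengths differ by half a wavelength arrive 180 degrees out of phase, so they cancel at the receiver and leave the far weaker incoming signal intact.

```python
import numpy as np

# Illustrative sketch of two-transmitter self-interference cancellation.
t = np.linspace(0, 1e-6, 1000)           # 1 microsecond of samples
f = 2.4e9                                 # assumed carrier frequency, Hz

tx = np.sin(2 * np.pi * f * t)            # signal from transmit antenna 1
tx_shifted = np.sin(2 * np.pi * f * t + np.pi)    # antenna 2, half-wave offset
weak_rx = 1e-6 * np.sin(2 * np.pi * f * t + 0.3)  # distant radio's signal

# Superposition of all three signals at the receiving antenna.
at_receiver = tx + tx_shifted + weak_rx

# The two transmissions cancel almost exactly, leaving the weak signal.
residual = at_receiver - weak_rx
print(np.max(np.abs(residual)))           # effectively zero
print(np.allclose(at_receiver, weak_rx))  # True
```

In the real device the cancellation is imperfect, as the article notes, but it only needs to suppress the self-interference enough for the receiver to hear other radios.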

The most obvious advantage of the technology is that it instantly doubles the amount of information that can be transmitted, but it also has other benefits. With current air traffic control systems, when two aircraft try to call the control tower at the same time on the same frequency neither will get through. The new system would prevent such potentially disastrous scenarios.

Before the technology is practical for use in Wi-Fi networks the team will need to increase both the strength of the transmissions and the distance over which they work. They are currently working on this but are even more excited about the possibilities once hardware and software are built to take advantage of simultaneous two-way transmission.

The Stanford University students demonstrated their device last year at MobiCom 2010, where they took out first prize for best demonstration. The group has a provisional patent on the technology and is now working to commercialize it.

GizMag

TStzmmalaysia
post Feb 16 2011, 02:32 PM



BIOTECHNOLOGY

Attached Image

Steps Towards a Bionic Eye

The human eye is a biological marvel. Charles Darwin considered it one of the biggest challenges to his theory of evolution, famously writing that “To suppose that the eye with all its inimitable contrivances for adjusting the focus to different distances, for admitting different amounts of light, and for the correction of spherical and chromatic aberration, could have been formed by natural selection, seems, I freely confess, absurd in the highest degree.” Of course he did go on to explain how natural selection could account for the eye, but we can see why he wrote these words under the heading of “Organs of Extreme Perfection and Complication.”

The complexity and perfection of the eye has meant that, to date, it’s been all but impossible to reproduce its function artificially. Artificial hearts, kidneys (albeit outside the body), and ears (cochlear implants) are all in widespread medical use -- but not eyes.

That might be about to change. In a remarkable achievement, a team of ophthalmologists and engineers has managed to partially restore vision to the blind, using an electronic device which acts as a replacement for the retina. The results are reported in a paper by Professor Eberhart Zrenner, Director of the Institute for Ophthalmic Research at the University Eye Hospital in Tuebingen, Germany.

The implant consists of a tiny panel, 3 by 3.1 mm in size, containing a 38 by 40 array of 1,500 light-sensitive microphotodiodes. These sensors detect light, and control the output of a pulsed electrical current. The brighter the light, the stronger the resulting current. Each sensor has its own microelectrode, and these are placed in contact with nerve cells in the retina, called bipolar cells, the first step on the pathway from the eye to the brain. The sensors therefore mimic the way the eye’s own photoreceptor cells normally function, turning light into a pattern of electrical impulses.
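That light-to-current mapping can be sketched as follows. The 38 by 40 array size is from the article; the transfer function, the units, and the `sensitivity` parameter (standing in for the patient-adjustable gain described later) are illustrative assumptions, not the device's actual electronics.

```python
import numpy as np

# Illustrative sketch: each microphotodiode turns local brightness into a
# pulsed stimulation current -- brighter light, stronger pulse -- clipped
# at an assumed per-electrode ceiling.
ROWS, COLS = 38, 40
MAX_CURRENT = 1.0   # assumed per-electrode ceiling, arbitrary units

def stimulation_currents(brightness, sensitivity=1.0):
    """Map a [0, 1] brightness image to per-electrode pulse amplitudes."""
    assert brightness.shape == (ROWS, COLS)
    return np.clip(brightness * sensitivity, 0.0, 1.0) * MAX_CURRENT

image = np.random.default_rng(0).random((ROWS, COLS))  # stand-in "scene"
currents = stimulation_currents(image, sensitivity=0.8)
print(currents.shape)  # (38, 40): one pulse amplitude per microelectrode
```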

The implant is not a complete artificial eye. It relies on an intact eyeball, an intact retina with functioning bipolar cells, and an optic nerve to convey the information to the brain. This means that the technology is only useful in forms of blindness caused by selective damage to photoreceptor cells.

However, such blindness is unfortunately common. Retinitis pigmentosa is a disease that causes progressive loss of vision, as the photoreceptor cells degenerate, and eventually die. There are many different forms of the disorder, each caused by mutations in a different gene. In some people, the loss of vision is gradual, and they remain able to see for most of their lives. In others, it rapidly leads to blindness. It’s estimated that about 400,000 Americans suffer some form of the disease.

Zrenner and his team implanted their device in three patients, all of whom had been born with normal vision, but had become almost totally blind due to retinal degeneration. Two of them suffered from retinitis pigmentosa, while the third had a similar disease.

The surgical procedure was, naturally, delicate. It involved inserting a metal tube behind and into one of the patient’s eyes, through which the implant was put into place. The chip comes connected to a cable that provides it with power from an external battery. It also allows the patient to control the sensitivity of the electrodes – essentially, manually adjusting the “brightness” of the image, to compensate for changes in the overall level of light. This is something that the eye normally does so effortlessly that we’re rarely aware of it.

So what happened? All three patients regained vision to some extent. Patient 2, a 44 year old man with retinitis pigmentosa, experienced the most dramatic benefits. He began to lose his sight at the age of 16. The first problem he noticed was a difficulty seeing at night, a common early symptom. By the time of the study, he was virtually blind, although he could still tell the direction from which a light was shining.



Thanks to the implant, he gained the ability to recognize everyday objects including spoons, bananas, and apples; he could read a clock; and he could read letters, albeit slowly, and they had to be printed extremely large (about 5-8 cm high).

This subretinal implant is not the only “bionic eye” idea under development, however. Other researchers have been working on using an external camera which transmits information to a relay chip placed on the retina, the "epiretinal” approach.

However, Zrenner’s team argues that their subretinal implant technique has some important advantages. Epiretinal devices have to pre-process the image before sending it to the retina, and patients need time to learn how to process the information that their brain receives, because the camera isn’t able to provide an exact simulation of normal retina outputs.

Zrenner et al’s subretinal method, however, took little “getting used to” because the implant is such a close analogue of the healthy retina. Also, they say that epiretinal approaches have so far only provided up to 60 pixels, as opposed to their 1,500.

Still, the technology has limitations. The image has no color, and it’s much less detailed than normal vision. The sensor has a resolution of 38 by 40 pixels, compared to the 960 by 640 resolution of an iPhone screen.

Being so small, it only covers a small fraction of the normal retinal field. However, this is actually less of a problem than it might first appear, because all of our detailed vision takes place in a tiny part of the retina, called the fovea. By placing the implant where the fovea used to be, the quality of the images was maximized.

The chip also requires an external power supply, so patients need to carry the battery pack and control unit around with them. Finally, they have a fairly hefty wire coming out of the side of their head.

So, at the moment, science is very far from being able to fully restore vision, but it’s still an exciting step forward. Technical improvements are sure to bring higher-quality images in the future.

Other researchers are working on using gene therapy to cure the underlying molecular cause of the disease, preventing the photoreceptors from dying in the first place. This approach has shown promise in animal models, and the results of the first human trials of gene therapy in another genetic eye disease, Leber’s amaurosis, have recently appeared.

So whether this device will become widely used in the treatment of people with diseases like retinitis pigmentosa is unclear. But it joins other emerging technologies, from deep brain stimulation to brain-computer interfaces, which are blurring the boundaries between the nervous system and machines.

Scientific American



TStzmmalaysia
post Feb 16 2011, 02:33 PM



ENERGY

Attached Image

Everyday Tech From Space: Lithium-Ion Batteries for Electric Cars

Electric cars have been around since the 19th century, so they're not exactly space-age technology. But the recent surge in electric vehicles springs, at least in part, from NASA know-how.

A carmaker, Hybrid Technologies Inc., signed a deal with NASA's Kennedy Space Center in 2006 to help test and improve the lithium-ion battery systems in its vehicles. With NASA's contributions, the company, now known as Li-Ion Motors, was able to deploy New York City's first all-electric taxi and develop a broad range of other lithium-powered vehicles, including converted PT Cruisers and Mini Coopers.

Long history of battery expertise

With its long history of battery development and testing, NASA was a natural partner for an aspiring electric-car maker. Many of the space agency's craft rely on battery power. The moon buggy, for example, was driven around the lunar surface in the early 1970s on non-rechargeable silver-zinc potassium hydroxide batteries. The Mars Opportunity rover — which landed on the Red Planet in January 2004 and is still going strong — is primarily solar-powered, but rechargeable lithium-ion batteries store energy for use at night.

The batteries that power rovers, satellites and other spacecraft must be reliable and robust. They must be able to withstand the jolts and shudders of a launch, and they must be able to work in tough conditions — in extreme heat or cold and high radiation, for example. Their premature failure could end the mission, since replacement or repair would be either difficult (and expensive) or simply impossible.

Testing electric cars

Li-Ion Motors, which is based in Las Vegas, converts cars, trucks and other vehicles to run on rechargeable lithium-ion batteries. In March 2006, the company entered into a Space Act Agreement with the Kennedy Space Center.

Under the deal, the company supplied a fleet of electric cars — including PT Cruisers, smart cars and high-performance all-terrain vehicles — for KSC engineers to use and test. The aim was to help determine the utility of lithium-powered vehicles, and to that end the NASA engineers studied the cars' advanced battery-management system.

NASA's efforts helped improve the vehicles, and the company soon was selling them. Lithium-powered Mini Coopers and smart cars were available in the 2007 Sam's Club catalog.

These days, electric vehicles are becoming increasingly common on highways around the world, and many carmakers have gotten into the game. This surge is due in large part to improvements in lithium-ion batteries that allow cars to go farther on a single charge.

Meanwhile, NASA engineers are working to improve lithium-ion systems for spacesuits, to make the batteries less prone to "thermal runaway" — when batteries heat up dramatically and, in some cases, explode.

These improvements are likely to find their way to the road at some point, too, making next-generation electric vehicles more robust and more reliable.

SPACE.com

TStzmmalaysia
post Feb 17 2011, 09:51 AM



SPACE SCIENCE

Attached Image

NASA spacecraft unravels comet mystery

The Valentine's night encounter was not easy for the US space agency's Stardust-NExT spacecraft, which had to fight an onslaught of debris from the comet in order to snap dozens of revealing pictures. "Comets, unlike any other body in the solar system, are unique when they are in the inner part of the solar system where the Earth is," said Don Brownlee, Stardust-NExT co-investigator.

"They are literally coming apart and sending tons and tons of gas and rocks and dust out in space," he said. "They don't just spew off things in a uniform way. They send off clods of dirt and ice and rock that come apart," Brownlee said, playing audio of the impact sustained by the spacecraft. The sound was like rapid firecracker bursts. "A good analogy is thinking of a B-17 in World War II flying through flak -- sometimes a large number of impacts in less than a tenth of a second -- so it is a very dramatic environment."

The pictures that Stardust snapped showed some erosion over the past five years, and for the first time allowed scientists to see the crater made by a NASA probe, an impact which was obscured by a huge dust cloud the first time around. "We never saw the crater as we went by, it was there somewhere that created a lot of mystery, it also helped to create this mission," said co-investigator Pete Schultz of Brown University.

Tempel 1 was last glimpsed in 2005 by NASA's Deep Impact mission as the comet was shooting toward the Sun on its five-year orbit between Mars and Jupiter. Deep Impact pummeled the comet with a special impactor spacecraft and the material that came out was a surprise to scientists: a cloud of fine powdery material emerged, not the water, ice and dirt that was expected. Deep Impact also found evidence of ice on the surface of the comet, not just inside it.

This time, the approach had to be carefully orchestrated so that the spacecraft could snap pictures of the right area of the comet at just the right moment.
"We planned it so on approach we would see the Deep Impact area," said principal investigator Joe Veverka of Cornell University. "That meant arriving at precisely the right time and the right place."

"We saw the crater, we really did see it," said Schultz, making a joke when an image of the crater failed to appear as prompted during a press conference. "It is subdued. It is about 150 meters across and has a small central mound in the center. It looks as if from the impact the stuff went up and came back down," he said. "This surface of the comet where we hit is very weak. It is fragile so the crater partly healed itself."

The spacecraft came closest to the comet at 11:39 pm Eastern time in the United States on Monday, or 0439 GMT Tuesday, passing at a distance of 181 kilometers (112 miles), NASA said. Tempel 1 is about six kilometers (3.7 miles) wide and travels on an orbit that brings it as close to the Sun as Mars and as far away as Jupiter. Comparing pictures of the comet taken in 2005 to the latest ones, Veverka said experts could detect that "erosion on a scale of twenty or thirty meters has occurred in the five years since we took this picture."

Other areas seen for the first time appear to show layers of material that have been deposited, a phenomenon that deserves further study, Veverka said.

"They have been interpreted as places where a very volatile gas from below the surface has erupted carrying with it small particles of ice and dust and while some of the stuff leaves into space some of it just flows downhill because a comet does have a little bit of gravity," he said.

"We are seeing changes that we have to spend time quantifying to understand what they mean."

PhysOrg




TStzmmalaysia
post Feb 17 2011, 09:52 AM



RESEARCH

Attached Image

Milestone in Path to Large-Scale Quantum Computing Reached: New Level of Quantum Control of Light

An important milestone toward the realization of a large-scale quantum computer, and further demonstration of a new level of the quantum control of light, were accomplished by a team of scientists at UC Santa Barbara and in China and Japan.

The study, published in the Feb. 7 issue of the journal Physical Review Letters, involved scientists from Zhejiang University, China, and NEC Corporation, Japan. The experimental effort was pursued in the research groups of UCSB physics professors Andrew Cleland and John Martinis.

The team described how they used a superconducting quantum integrated circuit to generate unique quantum states of light known as "NOON" states. These states, generated from microwave-frequency photons, the quantum unit of light, were created and stored in two physically separated microwave storage cavities, explained first author Haohua Wang, postdoctoral fellow in physics at UCSB. The quantum NOON states were created using one, two, or three photons, with all the photons in one cavity and the other cavity empty, superposed with the reverse situation: the first cavity empty and all the photons stored in the second.

"This seemingly impossible situation, allowed by quantum mechanics, led to interesting results when we looked inside the cavities," said second author Matteo Mariantoni, postdoctoral fellow in physics at UCSB. "There was a 50 percent chance of seeing all the photons in one cavity, and a 50 percent chance of not finding any -- in which case all the photons could always be found in the other cavity."

However, if one of the cavities was gently probed before looking inside, thus changing the quantum state, the effect of the probing could be seen, even if that cavity was subsequently found to be empty, he added.

"It's kind of like the states are ghostly twins or triplets," said Wang. "They are always together, but somehow you never know where they are. They also have a mysterious way of communicating, so they always seem to know what is going to happen." Indeed, these types of states display what Einstein famously termed, "spooky action at a distance," where prodding or measuring a quantum state in one location affects its behavior elsewhere.
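The 50/50 statistics the researchers describe follow directly from the form of a NOON state, (|N,0> + |0,N>)/sqrt(2). A minimal sketch, keeping only the two relevant basis states ("all N photons in cavity A" and "all N photons in cavity B"):

```python
import numpy as np

# Equal superposition of the two basis states of an N-photon NOON state.
rng = np.random.default_rng(42)
amplitudes = np.array([1, 1]) / np.sqrt(2)
probs = np.abs(amplitudes) ** 2        # Born rule: each outcome 50%

# Simulate many "look inside the cavities" measurements.
outcomes = rng.choice(["all_in_A", "all_in_B"], size=10_000, p=probs)
frac_a = np.mean(outcomes == "all_in_A")
print(frac_a)  # close to 0.5: half the time cavity A holds every photon
```

This toy model only reproduces the measurement statistics; it says nothing about the superconducting circuit itself or about the probing effect Mariantoni describes, which requires tracking the full quantum state.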

The quantum integrated circuit, which includes superconducting quantum bits in addition to the microwave storage cavities, forms part of what eventually may become a quantum computational architecture.

ScienceDaily


TStzmmalaysia
post Feb 17 2011, 09:54 AM



APPLIED SCIENCES

Attached Image

Futuristic Farm House Designed to Collect Rainwater and Shelter Migrant Workers

Two things that fluctuate considerably in the rich farmland of the central plains in the US are water for irrigation and migrant farmers. Aquifers are being depleted and water is becoming an even scarcer resource. Meanwhile, migrant farmers are crucial in our food production model, but often live in inadequate housing during their brief stay. Endemic Architecture proposes a modest farm house that could alleviate both problems. The futuristic home collects, stores and disseminates rainwater for crop irrigation, while providing a safe and sturdy home for farmers.

Safe, durable and comfortable housing should be a given for all migrant workers, but often it is not. Endemic’s Farm House is a modest 800 sq ft home with a central living space surrounded by a kitchen and dining area, storage, a bathroom and a sleeping area. Modular by design, these homes could aggregate to form grouped housing or could stand alone next to the field.

The geometry of the house is shaped entirely by rainwater collection: it is covered in watertight pouches, or ‘canteens’, that store up to 34,000 gallons of water. This is enough to irrigate 50 acres for nearly a month during the cultivation season. In the central plains and mid-western regions of the US, enough rain falls that the canteens might fill to capacity as many as 20 times a year. The home and its water storage are then connected into the farm’s irrigation system, whether for drip irrigation or surface watering.
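A quick back-of-envelope check of those figures (the storage, acreage and fill-rate numbers are from the article; the derived rates are our own arithmetic, not Endemic's):

```python
# Figures quoted in the article.
CANTEEN_CAPACITY_GAL = 34_000   # full canteen storage
ACRES_IRRIGATED = 50
DAYS_PER_FILL = 30              # "nearly a month"
FILLS_PER_YEAR = 20             # "as many as 20 times a year"

# Derived rates (our own arithmetic).
gal_per_acre_per_day = CANTEEN_CAPACITY_GAL / (ACRES_IRRIGATED * DAYS_PER_FILL)
annual_capture_gal = CANTEEN_CAPACITY_GAL * FILLS_PER_YEAR

print(round(gal_per_acre_per_day, 1))  # 22.7 gallons per acre per day
print(annual_capture_gal)              # 680000 gallons captured per year
```

The implied ~23 gallons per acre per day is very low for surface watering, which suggests the one-month claim assumes low-volume drip irrigation.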

Inhabitat


TStzmmalaysia
post Feb 17 2011, 10:00 AM



APPLIED SCIENCES

Attached Image

Amphibious floating garden concept would clean rivers across Europe

Concerned about our rising population having serious water supply issues in the not too distant future, Lilypad floating city designer Vincent Callebaut has come up with a floating amphibious garden that can clean our rivers as it travels the waterways of Europe. His proposed "audacious, avant-garde" Physalia project will be a self-sufficient, nomadic research vessel which can also host aqua-focused exhibitions and conferences.

Our survival depends on getting a regular supply of safe, clean drinking water. According to figures provided by water.org though, about one in eight people currently lack access to safe water supplies. The United Nations says that nearly 4,000 children die every day due to dirty water or poor hygiene. Its World Water Development Report says that by 2050, at least a quarter of the world's population is likely to live in a country affected by chronic or recurring shortages of fresh water.

With these stark facts in mind, Callebaut has designed Physalia. He says that the vessel's structural design was inspired by Physalia physalis (commonly known as the jellyfish-like Portuguese man o' war), whose name comes from the Greek for bladder or water bubble. As the craft wanders the waterways between the Danube and the Volga, the Rhine and the Guadalquivir, or the Euphrates and the Tigris, a hydraulic network in its aluminum-covered, double hull will enable it to filter river water through to the planted roof for some natural purification.

The Belgian-born architect says that, like his Lilypad floating city concept, the self-cleaning vessel will have a titanium dioxide photocatalyst covering on the "silver-plated dress" which will help to further reduce water pollution. It's also envisaged that Physalia will be able to drag itself out of the water and operate on land too, although exactly how such a feat is achieved is not discussed.

Attached Image

He also plans for Physalia to be completely self-sufficient, producing more energy than it consumes (like the Plus Energy housing projects). To this end, photovoltaic panels in the roofing will harvest energy from the sun while underwater turbines will convert energy from the flowing river.

The interior of the vessel will be divided into four themed gardens representing the four elements. The main entrance will open into a water garden where exhibitions can be hosted, and an earth garden will also serve as a laboratory for international aquatic research initiatives. Peeking below the surface of the water will be the fire garden for dedicated exhibitions, and where visitors can look out into the river environment through underwater windows. Lastly, an amphitheater air garden will incorporate meeting and conference space.

"It is an ecosystem reacting to its environment, a fragment of living earth, inviting the fauna and the flora of the fluvial biodiversity to come and make its nest in the city," says Callebaut. It is aimed "at mixing people around the notion of water respect, sharing in movement and dynamic balance," he added. "After the Copenhagen conference, it is a project of transeuropean leadership and a positive innovation of ecologic resilience."

The architect doesn't go into any great detail on how all the various technologies will be integrated into Physalia, but concentrates more on how stunning it will look. In that regard, we have to agree. As to whether such a craft will ever leap from the drawing board and meander down Europe's rivers as intended, cleaning them as it goes, we'll have to wait and see ...

GizMag

TStzmmalaysia
post Feb 17 2011, 10:02 AM



APPLIED SCIENCES

Attached Image

Underwater Skyscraper is a Self-Sufficient City at Sea

Ocean levels are rising around the globe, so rather than tethering our buildings to the sinking shoreline why not suit them for a life at sea? That’s the approach behind the Water-Scraper, a futuristic self-sufficient floating city. A special mention in the eVolo Skyscraper Competition, the design expands the concept of a floating island into a full-fledged underwater skyscraper that harvests renewable energy and grows its own food.

Touted as a self-sufficient floating city, Sarly Adre Bin Sarkum’s Water-Scraper utilizes a variety of green technologies. It generates its own electricity using wave, wind, and solar power, and it produces its own food through farming, aquaculture, and hydroponic techniques. The surface of the submerged skyscraper sustains a small forest, while the lower levels contain spaces for its inhabitants to live and work. The building is kept upright by a system of ballasts aided by a set of squid-like tentacles that generate kinetic energy.

The architects “envision a future where land as a resource will be scarce; it is only natural progression that we create our own. Approximately 71% of the Earth’s surface is ocean, even more if climate change has its way, hence it is only natural progression that we will populate the seas someday.” As anyone who has seen Waterworld will attest, it’s a grim future indeed — which is why it’s essential that we do what we can to stem the course of the world’s rising tides.

Attached Image

Inhabitat


TStzmmalaysia
post Feb 17 2011, 10:04 AM



RESEARCH

Attached Image

A new dimension for mathematics – the Periodic Table of shapes

Mathematicians are creating their own version of the periodic table that will provide a vast directory of all the possible shapes in the universe across three, four and five dimensions, linking shapes together in the same way as the periodic table links groups of chemical elements. The three-year project, announced today, should provide a resource that mathematicians, physicists and other scientists can use for calculations and research in a range of areas, including computer vision, number theory, and theoretical physics.

The researchers, from Imperial College London and institutions in Australia, Japan and Russia, are aiming to identify all the shapes across three, four and five dimensions that cannot be divided into other shapes.

As these building block shapes are revealed, the mathematicians will work out the equations that describe each shape and through this, they expect to develop a better understanding of the shapes' geometric properties and how different shapes are related to one another.

The work is funded by the Engineering and Physical Sciences Research Council, the Leverhulme Trust, the Royal Society and the European Research Council.

Project leader Professor Alessio Corti, from the Department of Mathematics at Imperial College London, explained: "The periodic table is one of the most important tools in chemistry. It lists the atoms from which everything else is made, and explains their chemical properties. Our work aims to do the same thing for three, four and five-dimensional shapes - to create a directory that lists all the geometric building blocks and breaks down each one's properties using relatively simple equations. We think we may find vast numbers of these shapes, so you probably won't be able to stick our table on your wall, but we expect it to be a very useful tool."

The scientists will be analysing shapes that involve dimensions that cannot be 'seen' in a conventional sense in the physical world. In addition to the three dimensions of length, width and depth found in a three-dimensional shape, the scientists will explore shapes that involve other dimensions. For example, the space-time described by Einstein's Theory of Relativity has four dimensions - the three spatial dimensions, plus time. String theorists believe that the universe is made up of many additional hidden dimensions that cannot be seen.

Professor Corti's colleague on the project, Dr Tom Coates, has created a computer modelling programme that should enable the researchers to pinpoint the basic building blocks for these multi-dimensional shapes from a pool of hundreds of millions of shapes. The researchers will be using this programme to identify shapes that can be defined by algebraic equations and that cannot be divided any further. They do not yet know how many such shapes there might be. The researchers calculate that there are around 500 million shapes that can be defined algebraically in four dimensions and they anticipate that they will find a few thousand building blocks from which all these shapes are made.

Dr Coates, from the Department of Mathematics at Imperial College London, added: "Most people are familiar with the idea of three-dimensional shapes, but for those who don't work in our field, it might be hard to get your head around the idea of shapes in four and five dimensions. However, understanding these kinds of shapes is really important for lots of aspects of science. If you are working in robotics, you might need to work out the equation for a five dimensional shape in order to figure out how to instruct a robot to look at an object and then move its arm to pick that object up. If you are a physicist, you might need to analyse the shapes of hidden dimensions in the universe in order to understand how sub-atomic particles work. We think the work that we're doing in our new project will ultimately help our colleagues in many different branches of science.

"In our project we are looking for the basic building blocks of shapes. You can think of these basic building blocks as 'atoms', and think of larger shapes as 'molecules.' The next challenge is to understand how properties of the larger shapes depend on the 'atoms' that they are made from. In other words, we want to build a theory of chemistry for shapes," added Dr Coates.
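To make "shapes defined by algebraic equations" concrete, here is a standard illustration (ours, not an example from the project): a single polynomial equation cuts out a sphere in any number of dimensions, including ones we cannot visualise.

```latex
% The (n-1)-dimensional sphere inside n-dimensional space is defined by
% one algebraic equation; for n = 5 it is a four-dimensional shape.
S^{n-1} = \left\{ (x_1, \dots, x_n) \in \mathbb{R}^n :
          x_1^2 + x_2^2 + \cdots + x_n^2 = 1 \right\}
```

The shapes the researchers classify are of this kind, only far more varied: different polynomials carve out different shapes, and the project asks which of them are "atoms" that cannot be broken down further.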

Dr Coates has recently won a prestigious Philip Leverhulme Prize worth GBP70,000 from the Leverhulme Trust, providing some of the funding for this project. Philip Leverhulme prizes are awarded to outstanding scholars under the age of 36 who have "made a substantial contribution to their particular field of study, recognised at an international level, and where the expectation is that their greatest achievement is yet to come."

GizMag

TStzmmalaysia
post Feb 17 2011, 10:06 AM


ENERGY

New fuel cell system produces grid electricity from natural gas

VTT, the Technical Research Centre of Finland, is currently field testing a prototype large-scale solid oxide fuel cell (SOFC) that the organization hopes will provide efficient, cheap grid power from natural gas and biogas. The VTT system is unique in that it uses a single 10 kW planar SOFC stack to produce a year’s worth of electricity for a typical apartment block.
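The "year's worth of electricity" claim is easy to sanity-check. A back-of-the-envelope sketch (the 4,000 kWh per household per year figure is our assumption for illustration, not VTT's):

```python
# Annual energy from a 10 kW stack running continuously, and how many
# typical households that could cover, assuming a hypothetical consumption
# of 4,000 kWh per household per year (roughly a small apartment block).
STACK_POWER_KW = 10
HOURS_PER_YEAR = 365 * 24                 # 8,760 h

annual_output_kwh = STACK_POWER_KW * HOURS_PER_YEAR
households = annual_output_kwh / 4000     # assumed per-household usage

print(annual_output_kwh)  # 87600
print(households)         # 21.9
```

Real output would be somewhat lower after maintenance downtime and conversion losses, but the order of magnitude fits an apartment block.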

The SOFC system is being developed as part of VTT’s Tekes Fuel Cell Program, and represents the first time a 10 kW power class planar SOFC fuel stack has been operated as part of a complete fuel cell system. The VTT fuel cell system is larger in scale than the Bloom Energy Server (or “Bloom Box”) that was revealed in 2010. Where the Bloom system is designed to power office buildings and similar applications, VTT sees its high-power fuel cell stacks powering the commercial electrical grid.

The VTT system is currently undergoing endurance testing for reliability, durability, and to determine further development needs. Although some of the system’s components are prototypes developed at VTT that have not yet reached mass production, VTT says the system has operated reliably for more than 1,500 hours since the beginning of November 2010.

An SOFC uses electrochemical conversion to produce electricity from the oxidation of a fuel. An SOFC’s electrolyte is made from solid oxide, or ceramic, material. The advantages of this type of fuel cell are low emissions, stability, and fuel flexibility. VTT says its SOFC can use a wide range of fuels including biogas. SOFCs generally have a higher operating temperature than other types of fuel cells, which can affect their mechanical and chemical design.

The VTT SOFC technology is the result of a partnership that includes Lappeenranta University of Technology and Aalto University. Lappeenranta developed the system’s power electronics, used to transform the SOFC direct current into alternating current suitable for the grid. Aalto University participated in the unit’s mechanical design. In addition, the SOFC stack was supplied by Versa Power Systems Inc. of Canada.

The VTT Technical Research Centre of Finland is a non-profit research organization that specializes in multi-technology applied research in energy, industrial systems, applied materials, and bio- and chemical processes. The Tekes Fuel Cell Program is intended to help Finnish industry develop fuel cell technology and products.

GizMag

TStzmmalaysia
post Feb 18 2011, 10:30 AM


ENERGY

HyperSolar concentrator could boost solar panel light input by 400 percent

Solar cells are the most expensive part of a solar panel, so it follows that if panels could produce the same amount of electricity with fewer cells, their prices would come down. For panels to do so using existing cell technology, however, they would need to deliver more light to the remaining cells. Mounting the panels on the end of vertical poles to get them closer to the sun is an approach that might work in the town of Bedrock or on Gilligan’s Island. A better idea, though, is to apply a clear layer of solar concentrators to the surface of a panel – and that’s just what HyperSolar intends to do.

The California company claims that it has just completed the prototype design of “the world’s first thin and flat solar concentrator for direct placement on top of existing solar cells.” Each sheet will contain a matrix of optical concentrators capable of collecting sunlight from a variety of angles. Beneath those concentrators will be a “photonics network” that will channel light from all the collection points on the top to concentrated output points on the bottom. This network will also be able to separate the sunlight into different spectrum ranges, so that specific ranges can be sent to specific cells designed to absorb them.

The sheets will also incorporate a photonics thermal management system that will keep unusable parts of the solar spectrum from reaching the cells. This should keep the cells from overheating and becoming less efficient.

While HyperSolar predicts that its product will be able to magnify the sun’s rays by 300 to 400 percent, not all cells will necessarily be able to handle that kind of intensity. For that reason, the solar concentrators will come in Low Magnification, High Magnification, and Mix-Mode Magnification models. At the 400 percent level, the company states that a concentrator-equipped panel could use 75 percent fewer cells than one without.
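The 75 percent figure follows directly from the magnification claim. A sketch of the arithmetic (our illustration of the reasoning, not HyperSolar's published numbers):

```python
# If each remaining cell receives 4x the light ("400 percent"), one quarter
# of the original cell count delivers the same electrical output.
magnification = 4.0                  # 400 percent light input per cell
cells_needed = 1 / magnification     # fraction of the original cell count
cells_saved = 1 - cells_needed

print(cells_saved)  # 0.75, i.e. 75 percent fewer cells
```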

HyperSolar’s next step will be to produce an actual physical prototype and see if it works as envisioned. We’ll keep you posted.

Gizmag


TStzmmalaysia
post Feb 19 2011, 12:42 PM


ROBOTICS

Robots learn human perception

How do robots learn to move? Michael J. Black, founding director of the Max Planck Institute for Intelligent Systems, tries to answer this question. © Michael J. Black

"I really like Tübingen," says Michael J. Black. "The city, its surroundings. My wife and I love hiking. The Swabian Alb is fantastic!" Black intends to stay here permanently. He is now in his late forties. "That means I still have about 20 years to carry out my research." And he is pretty certain about what he wants to achieve in this time. "I want to teach a robot to be as familiar with the world as a two-year-old child." This may not sound like much at first, but it would actually be a sensation – after all, two-year-olds are pretty darned clever.

Newborn babies have a strong grasp reflex, evident when they grab your finger, for example, but that is about all they can do. A two-year-old child, however, is already an expert at grasping and has dozens of gripping variations. Small children can gently lift objects and hold a spoon; they can competently turn angular and pointed objects around in their hands; and they are also capable of abstraction. They can recognise angular objects as angular and round objects as round, regardless of whether the object has three, four or five corners or curves, and regardless of whether this is the first time they have seen it.

It is this abstraction ability that is still missing from the brain of a computer today. "Human beings analyse their environment within fractions of a second," says Black. "All we need to do is glance at a surface to know whether it is slippery or not." A computer has to carry out extensive calculations before it can disentangle the input from its sensors and identify what something is. The human brain, however, picks a few basic characteristics from the storm of sensory stimuli it receives and comes to an accurate conclusion about the nature of the world around us.

This is precisely what Black envisages for his computers: to be capable of generalisation and abstraction, to be able to infer characteristics from a small amount of information. Yes, a technical system can process thousands of data, figures and measurement values and analyse the atomic structure of a floor tile – yet a robot would probably still slip on a floor that has been freshly mopped. Black's central question is which environmental stimuli are important. How does the brain manage to deduce so much from so little – and safely guide us through our lives? And how do we teach this to computers and robots?

Black is one of three founding directors who will head the restructuring of the Max Planck Institute for Metal Research to focus on intelligent systems. Until a few weeks ago, he held the Chair for Computer Science at Brown University in Providence, in the US state of Rhode Island, where he worked closely with neurosurgeons. It was clear to him that he had to understand both the computer and the workings of the human brain. He developed statistical computing processes, so-called estimators, which reduce the complexity of environmental stimuli to the required extent, just as the brain does. Thanks to these estimators, the computer does not get lost in the massive volume of data.

With this procedure, he is gradually approximating the environment – something he calls “approximative visual physics”. Black is focusing primarily on vision, on movements in particular, as these are especially strong stimuli for the human brain. From the jumble of light reflexes, shadows and roaming pixels of a film sequence, his computing processes can now extract objects that have been moved – just not as swiftly or as simply as the brain. For this reason, the brain continues to be his greatest teacher.

Black's medical colleagues in the US implanted tiny electrodes in the brains of paraplegic patients, in the area of the brain responsible for movement – the motor cortex. They then analysed the activity of the nerve cells. Nerve cells send out extremely fine electrical impulses when they are stimulated, and the electrodes detect these signals. At first, this raw activity looks little different from a noisy television screen, but Black has succeeded in identifying and interpreting clear activation patterns within the flickering. Thanks to his mathematical procedures, the computer was able to translate the thoughts of the patients into real movements: simply through the power of thought, the patients could move the cursor on a computer monitor. Experts call such links between the brain and a computer brain-computer interfaces.
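The cursor-decoding step can be sketched in a few lines. This is our illustration of the general idea, not Black's actual estimator: learn a linear map from neural firing rates to cursor velocity, then apply it to new activity.

```python
import numpy as np

# Simulated training data (real systems record these from the electrodes).
rng = np.random.default_rng(0)
n_neurons, n_samples = 20, 500
true_map = rng.normal(size=(n_neurons, 2))           # hidden rates-to-velocity map
rates = rng.normal(size=(n_samples, n_neurons))      # firing rates per time bin
velocity = rates @ true_map + 0.1 * rng.normal(size=(n_samples, 2))  # noisy targets

# Fit the decoder by least squares, a simple stand-in for the statistical
# estimators mentioned in the article.
decoder, *_ = np.linalg.lstsq(rates, velocity, rcond=None)

# Decode a new burst of activity into an (x, y) cursor-velocity command.
cursor_velocity = rng.normal(size=(1, n_neurons)) @ decoder
print(cursor_velocity.shape)  # (1, 2)
```

Production brain-computer interfaces use more sophisticated estimators (e.g. Kalman filters), but the pipeline of recorded rates in, movement command out, is the same.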

Black has analysed these activity patterns from the motor cortex and hopes to draw conclusions for the programming of computers. Particularly interesting is that a person's motor cortex also becomes active when the person merely observes movement, even if the body itself is completely motionless. "Apparently, there is a relationship between a person's knowledge of movement and the person's observation of movement," says Black.

Such findings could be extremely important for future learning strategies for computers. In Tübingen, Black is thus not only working closely with his direct colleagues Bernhard Schölkopf and Joachim Spatz, but also with experts from the neighbouring Max Planck Institute for Biological Cybernetics, such as the working group headed by Heinrich H. Bülthoff, which is researching perception phenomena and the interaction between humans and robots.

A fundamental reason for moving to Tübingen was the opportunity to work on "something really big". "The Max Planck Society allows a scientist to carry out basic research over a period of years in order to understand the basic principles. This is extremely rare in the US," says Black. "Many projects have short terms of about three years, and they then have to deliver clear results. You end up gradually swinging from one small project and one small scientific question to the next – and you simply hope that you will be able to fit everything into the big picture at the end." In Tübingen, on the other hand, the topic of artificial perception and artificial vision can be tackled from the ground up – and over a period of years.

"I would like to understand the neuronal control of movement, understand the models on which the brain is patterned, and then transfer these basic principles to the artificial world." So we will know in 20 years whether his machines can actually be as clever as a small child. The challenge is immense, as machines can fail at even the most trivial tasks. Put a spoon, a rubber ball and a cleaning cloth into the hand of a robot one after the other and you certainly won't be bored, as the steel servant will awkwardly play around with these objects for a while before finally understanding what to do with them. In that time, a two-year-old child will have long since lost interest and moved on to the next toy.

PhysOrg

TStzmmalaysia
post Feb 19 2011, 12:44 PM


RESEARCH

Scientists build world's first anti-laser

More than 50 years after the invention of the laser, scientists at Yale University have built the world's first anti-laser, in which incoming beams of light interfere with one another in such a way as to perfectly cancel each other out. The discovery could pave the way for a number of novel technologies with applications in everything from optical computing to radiology.

Conventional lasers, which were first invented in 1960, use a so-called "gain medium," usually a semiconductor like gallium arsenide, to produce a focused beam of coherent light—light waves with the same frequency and amplitude that are in step with one another.

Last summer, Yale physicist A. Douglas Stone and his team published a study explaining the theory behind an anti-laser, demonstrating that such a device could be built using silicon, the most common semiconductor material. But it wasn't until now, after joining forces with the experimental group of his colleague Hui Cao, that the team actually built a functioning anti-laser, which they call a coherent perfect absorber (CPA).

The team, whose results appear in the Feb. 18 issue of the journal Science, focused two laser beams with a specific frequency into a cavity containing a silicon wafer that acted as a "loss medium." The wafer aligned the light waves in such a way that they became perfectly trapped, bouncing back and forth indefinitely until they were eventually absorbed and transformed into heat.

Stone believes that CPAs could one day be used as optical switches, detectors and other components in the next generation of computers, called optical computers, which will be powered by light in addition to electrons. Another application might be in radiology, where Stone said the principle of the CPA could be employed to target electromagnetic radiation to a small region within normally opaque human tissue, either for therapeutic or imaging purposes.

Theoretically, the CPA should be able to absorb 99.999 percent of the incoming light. Due to experimental limitations, the team's current CPA absorbs 99.4 percent. "But the CPA we built is just a proof of concept," Stone said. "I'm confident we will start to approach the theoretical limit as we build more sophisticated CPAs." Similarly, the team's first CPA is about one centimeter across at the moment, but Stone said that computer simulations have shown how to build one as small as six microns (about one-twentieth the width of an average human hair).

The team that built the CPA, led by Cao and another Yale physicist, Wenjie Wan, demonstrated the effect for near-infrared radiation, which is slightly "redder" than the eye can see and which is the frequency of light that the device naturally absorbs when ordinary silicon is used. But the team expects that, with some tinkering of the cavity and loss medium in future versions, the CPA will be able to absorb visible light as well as the specific infrared frequencies used in fiber optic communications.

It was while explaining the complex physics behind lasers to a visiting professor that Stone first came up with the idea of an anti-laser. When Stone suggested his colleague think about a laser working in reverse in order to help him understand how a conventional laser works, Stone began contemplating whether it was possible to actually build a laser that would work backwards, absorbing light at specific frequencies rather than emitting it.

"It went from being a useful thought experiment to having me wondering whether you could really do that," Stone said. "After some research, we found that several physicists had hinted at the concept in books and scientific papers, but no one had ever developed the idea."

PhysOrg

TStzmmalaysia
post Feb 19 2011, 12:45 PM


RESEARCH

Algae clean wastewater, convert to biodiesel

Let algae do the dirty work. Researchers at Rochester Institute of Technology are developing biodiesel from microalgae grown in wastewater. The project is doubly "green" because algae consume nitrates and phosphates and reduce bacteria and toxins in the water. The end result: clean wastewater and stock for a promising biofuel.

The purified wastewater can be channeled back into receiving bodies of water at treatment plants, while the biodiesel can fuel buses, construction vehicles and farm equipment. Algae could replace diesel's telltale black puffs of exhaust with cleaner emissions low in the sulfur and particulates that accompany fossil fuels.

Algae have a lot of advantages. They are cheaper and faster to grow than corn, which requires nutrient-rich soil, fertilizer and insecticide. Factor in the fuel used to harvest and transport corn and ethanol starts to look complicated.

In contrast, algae are much simpler organisms. They use photosynthesis to convert sunlight into energy. They need only water—ponds or tanks to grow in—sunlight and carbon dioxide.

"Algae—as a renewable feedstock—grow a lot quicker than crops of corn or soybeans," says Eric Lannan, who is working on his master's degree in mechanical engineering at RIT. "We can start a new batch of algae about every seven days. It's a more continuous source that could offset 50 percent of our total gas use for equipment that uses diesel."

Cold weather is an issue for biodiesel fuels.

"The one big drawback is that biodiesel does freeze at a higher temperature," says Jeff Lodge, associate professor of biological sciences at RIT. "It doesn't matter what kind of diesel fuel you have, if it gets too cold, the engine's not starting. It gels up. It's possible to blend various types of biodiesel—algae derived with soybeans or some other type—to generate a biodiesel with a more favorable pour point that flows easily."

Lannan's graduate research in biofuels led him to Lodge's biology lab. With the help of chemistry major Emily Young, they isolated and extracted the valuable fats, or lipids, that algae produce, yielding tiny amounts of a golden-colored biodiesel. They are growing the alga strain Scenedesmus, a single-cell organism, using wastewater from the Frank E. Van Lare Wastewater Treatment Plant in Irondequoit, N.Y.

"It's key to what we're doing here," Lodge says. "Algae will take out all the ammonia—99 percent—88 percent of the nitrate and 99 percent of the phosphate from the wastewater — all those nutrients you worry about dumping into the receiving water. In three to five days, pathogens are gone. We've got data to show that the coliform counts are dramatically reduced below the level that's allowed to go out into Lake Ontario."
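The removal rates Lodge quotes translate into a simple effluent calculation. The influent concentrations below are invented for illustration; only the removal percentages come from the article.

```python
# Effluent concentration = influent * (1 - removal fraction).
# Influent values (mg/L) are made-up example numbers; removal fractions
# are the figures quoted in the article.
influent_mg_l = {"ammonia": 25.0, "nitrate": 10.0, "phosphate": 6.0}
removal = {"ammonia": 0.99, "nitrate": 0.88, "phosphate": 0.99}

effluent_mg_l = {k: round(v * (1 - removal[k]), 3) for k, v in influent_mg_l.items()}
print(effluent_mg_l)  # {'ammonia': 0.25, 'nitrate': 1.2, 'phosphate': 0.06}
```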

Lodge and Lannan ramped up their algae production from 30 gallons of wastewater in a lab at RIT to 100 gallons in a 4-foot-by-7-foot tank at Environmental Energy Technologies, an RIT spinoff. Lannan's graduate thesis advisor Ali Ogut, professor of mechanical engineering, is the company's president and CTO. In the spring, the researchers will build a mobile greenhouse at the Irondequoit wastewater treatment plant and scale up production to as much as 1,000 gallons of wastewater.

Northern Biodiesel, located in Wayne County, will purify the lipids from the algae and convert them into biodiesel for the RIT researchers.

PhysOrg

TStzmmalaysia
post Feb 19 2011, 12:46 PM


RESEARCH

Higher levels of social activity decrease the risk of developing disability in old age

Afraid of becoming disabled in old age, not being able to dress yourself or walk up and down the stairs? Staying physically active before symptoms set in could help. But so could going out to eat, playing bingo and taking overnight trips.

According to research conducted at Rush University Medical Center, higher levels of social activity are associated with a decreased risk of becoming disabled. The study has just been posted online and will be published in the April issue of the Journal of Gerontology: Medical Sciences.

"Social activity has long been recognized as an essential component of healthy aging, but now we have strong evidence that it is also related to better everyday functioning and less disability in old age," said lead researcher Bryan James, PhD, postdoctoral fellow in the epidemiology of aging and dementia in the Rush Alzheimer's Disease Center. "The findings are exciting because social activity is potentially a risk factor that can be modified to help older adults avoid the burdens of disability."

The study included 954 older adults with a mean age of 82 who are participating in the Rush Memory and Aging Project, an ongoing longitudinal study of common chronic conditions of aging. At the start of the investigation, none of the participants had any form of disability. They each underwent yearly evaluations that included a medical history and neurological and neuropsychological tests.

Social activity was measured based on a questionnaire that assessed whether, and how often, participants went to restaurants, sporting events or the teletract (off-track betting) or played bingo; went on day trips or overnight trips; did volunteer work; visited relatives or friends; participated in groups such as the Knights of Columbus; or attended religious services.

To assess disability, participants were asked whether they could perform six activities of daily living without help: feeding, bathing, dressing, toileting, transferring and walking across a small room. They were also asked whether they could perform three tasks that require mobility and strength: walking up and down a flight of stairs, walking a half mile and doing heavy housework. Finally, they were asked about their ability to perform what are referred to as "instrumental" activities of daily living, such as using the telephone, preparing meals and managing medications. Difficulties with household management and mobility are more common and represent less severe disability than difficulty with self-care tasks, so the measures represented a range of disability.

Results showed that a person who reported a high level of social activity was about twice as likely to remain free of a disability involving activities of daily living as a person with a low level of social activity, and about 1.5 times as likely to remain free of disability involving instrumental activities of daily living or mobility.

Why social activity plays a role in the development of disability is not clear, James said. Possibly, social activity may reinforce the neural networks and musculoskeletal function required to maintain functional independence.

EurekAlert

TStzmmalaysia
post Feb 19 2011, 12:47 PM


APPLIED SCIENCES

Cycloclean Purifies Drinking Water Using Bicycle Pedal Power

Pedaling your way to clean water? You can do just that thanks to Japan’s Nippon Basic Company, which has introduced “Cycloclean,” a durable bike equipped with a pedal-powered water purifier. Initially designed for disaster zones and remote villages, the bike lets users ride to virtually anywhere there is a water source – rivers, ponds, pools and even wells – and produce their own clean water. The device is in fact powerful enough to siphon water from a depth of five meters! An innovative design, the Cycloclean could transform living conditions in remote villages and communities with contaminated water around the globe.

Aside from remote villages and disaster zones, the Cycloclean would be great for long-distance bike riders. Pedaling the Cycloclean for just one minute can purify up to five liters of drinking water, which is more than the average person drinks in a day. (Imagine biking a days-long mountain trail without having to tote or stop for clean water!)
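A rough sizing sketch based on the quoted five liters per minute (the 60-liter daily household figure is our assumption, not Nippon Basic's):

```python
# Minutes of pedaling needed to cover a household's daily drinking and
# cooking water, at the article's rate of five liters per minute.
LITERS_PER_MINUTE = 5
daily_need_liters = 60            # assumed need for a small family

minutes_of_pedaling = daily_need_liters / LITERS_PER_MINUTE
print(minutes_of_pedaling)  # 12.0
```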

But there’s a catch – as of now, with Japanese production, each system carries a price tag of around $6,600. As a result, most of the bikes thus far have been sold to local Japanese governments, with only a small number distributed to neighboring countries. However, Nippon Basic does have plans to start production in Bangladesh, both to help reduce the cost and to bring water-purifying opportunities to the Bangladeshi people.

We hope that the Cycloclean will reach mass production, providing purified water for all at only the cost of a little aerobic exercise.

Inhabitat

TStzmmalaysia
post Feb 19 2011, 12:49 PM


APPLIED SCIENCES

Solar-Powered Mobile Eco Shelter for the Swiss Alps

As demand for recreational facilities in the pristine wilderness grows, so does our footprint. At the same time, the prospect of staying in a mountaintop cabin in the Swiss Alps with views from the top of the world is irresistible. The Eco-Temporary Refuge is one proposed solution that consists of a small cabin that can be helicoptered to the site and set up on embedded pipe piers. Cimini Architettura‘s solar-powered retreat is designed to be fully self-supporting, and it can be removed with minimum impact to the site.

The simple design has room for six bunks, a bath, a foyer, and living space. One huge window overlooking the breathtaking views also lets in precious solar heat. When the sun goes down, a thermal curtain helps keep the heat in. The cabin is powered by a 4 kW solar array mounted on the walls, where snow cannot cover it. The array supplies basic electricity for the occupants, of course, but the designers also intend to heat the cabin with an under-floor heating system. Details are vague as to how this system works – suffice to say we are guessing it is not electrically heated.

Snow is melted by the solar system to feed into the unit’s water supply, and all appliances are powered by the solar electric system. A supply of bioethanol is available for emergency heating and cooking needs. The system details are sketchy for the proposal, but the intent is clear: low-impact and easily removable buildings can help us to experience the wilderness while maintaining a pristine environment.

Inhabitat


TStzmmalaysia
post Feb 19 2011, 12:52 PM


TRANSPORTATION

Japan to Build World’s Fastest Train: A 310MPH Maglev Monster

China and Japan have been constantly trying to outdo each other when it comes to high-speed rail. Now, the Central Japan Railway Company has announced that it plans to build the world’s fastest train. The train, to be completed in 2027, will travel between Tokyo and Nagoya on a 178-mile extension line, estimated to cost about $64 billion. The speedy train will run at up to 310 miles per hour, cutting the current hour-and-a-half journey down by 40 minutes — the current line runs at about 167 miles per hour.

To gain additional speed, the new rail will use magnetic levitation — where powerful magnets elevate the train above the track, cutting down friction. The current high-speed record is held by a Chinese passenger train that traveled 302 miles per hour (486 kilometers per hour) during a test run on a still unopened line between Beijing and Shanghai.
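A quick check of the quoted speeds over the planned 178-mile line, ignoring stops, acceleration and route differences (a simplification, so real schedules will be slower than these figures):

```python
# Travel time in minutes at a constant cruise speed over the new line.
distance_miles = 178
times_min = {name: distance_miles / mph * 60
             for name, mph in [("maglev", 310), ("conventional", 167)]}

for name, minutes in times_min.items():
    print(name, round(minutes, 1))
# At a constant 310 mph the run would take about 34 minutes; the planned
# 50-minute schedule presumably accounts for stops and acceleration.
```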

Inhabitat

TStzmmalaysia
post Feb 19 2011, 12:53 PM


TRANSPORTATION

Scientists Steer Car With the Power of Thought

You need to keep your thoughts from wandering if you drive using the new technology from the AutoNOMOS innovation labs of Freie Universität Berlin. The computer scientists have developed a system that makes it possible to steer a car with your thoughts. Using new commercially available sensors for measuring brain waves -- sensors for recording electroencephalograms (EEG) -- the scientists were able to distinguish the bioelectrical wave patterns for control commands such as "left," "right," "accelerate" and "brake" in a test subject.

They then succeeded in developing an interface to connect the sensors to their otherwise purely computer-controlled vehicle, so that it can now be "controlled" via thoughts. Driving by thought control was tested on the site of the former Tempelhof Airport.

The scientists from Freie Universität first adapted the brain-wave sensors so that a person could move a virtual cube in different directions with the power of his or her thoughts. The test subject thinks of four situations associated with driving, for example "turn left" or "accelerate." In this way the person trains the computer to interpret the bioelectrical wave patterns emitted by his or her brain and to link them to commands that can later be used to control the car. The computer scientists connected the measuring device to the steering, accelerator, and brakes of a computer-controlled vehicle, which made it possible for the subject to influence the movement of the car using his or her thoughts alone.
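The training procedure described above can be sketched as a nearest-centroid classifier. This is our toy illustration, not the actual AutoNOMOS code: the subject repeats each imagined command, the system stores the average EEG feature vector per command, and a new pattern is matched to the closest stored average.

```python
import numpy as np

rng = np.random.default_rng(1)
commands = ["left", "right", "accelerate", "brake"]

# Hypothetical "true" EEG signatures and 50 noisy training repetitions each
# (real features would be e.g. band powers from the EEG channels).
signatures = {c: rng.normal(size=8) for c in commands}
training = {c: signatures[c] + 0.3 * rng.normal(size=(50, 8)) for c in commands}

# Training: one centroid (average feature vector) per command.
centroids = {c: training[c].mean(axis=0) for c in commands}

def decode(features):
    """Return the trained command whose centroid is closest to `features`."""
    return min(centroids, key=lambda c: np.linalg.norm(features - centroids[c]))

# A fresh pattern resembling the "left" signature decodes to a driving command.
print(decode(signatures["left"] + 0.1 * rng.normal(size=8)))
```

Real EEG decoding needs filtering, artifact rejection and stronger classifiers, but the train-then-match loop is the core of the interface.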

"In our test runs, a driver equipped with EEG sensors was able to control the car with no problem -- there was only a slight delay between the envisaged commands and the response of the car," said Prof. Raúl Rojas, who heads the AutoNOMOS project at Freie Universität Berlin. In a second test version, the car drove largely automatically, but via the EEG sensors the driver was able to determine the direction at intersections.

The AutoNOMOS Project at Freie Universität Berlin is studying the technology for the autonomous vehicles of the future. With the EEG experiments they investigate hybrid control approaches, i.e., those in which people work with machines.

ScienceDaily

TStzmmalaysia
post Feb 19 2011, 12:55 PM


APPLIED SCIENCES

World’s Largest Landfill Transformed into Freshkills Park 3X the Size of Central Park

Mention "Freshkills" to any New Yorker and the reaction you'll most likely get is "P.U." The name has long been associated with the world's largest landfill, and has made life for many Staten Islanders (where the dump is located), well, stink. But what many people aren't yet aware of is that the long suffering of those folks is soon to be rewarded because a monumental new green space is being developed on the site of the old landfill, which received its last barge of garbage on March 22, 2001, and from what we've seen, it's incredible.

At 2,200 acres, Freshkills Park is set to be almost three times the size of Central Park and will be the largest park developed in New York City in over 100 years. That's right, what was the world's biggest collection of garbage is being transformed into a beautiful green space for New Yorkers to hike, play and even ski - and with numerous sustainable strategies already in the works, it's also promising to be one of the most eco-friendly developments in the city.

Sure, we all love Central Park, but Freshkills Park is set to eclipse it in both size and the wide range of recreational opportunities that will be available. The park will also offer ecological restoration and cultural and educational programming, including possible demonstrations teaching the public about renewable energy, that will echo its environmental mission. About 45 percent of the planned park site was once used for landfill operations, but the remainder of the land is currently composed of wetlands, open waterways, and unfilled lowland areas.

The full transformation and build-out will continue over the next 30 years, with phases over the next few years focusing on providing the public with the opportunity to see the interior of the site, which will be a unique combination of natural and engineered beauty. One of the coolest examples of what people will be able to experience is standing atop the landfill mounds themselves to check out a breathtaking view of lower Manhattan.

Freshkills will also be a showcase for sustainable strategies, some of which are already in place. The NYC Department of Sanitation is already using advanced landfill gas collection infrastructure throughout the landfill to actively harvest methane from the buried decomposing waste. The methane is sold to National Grid to heat close to 22,000 homes on Staten Island, and the city generates approximately $11 million in annual revenue from the sale. In addition to turning landfill gas into fuel, some of the other strategies the city is considering for the park are solar panels, wind turbines, solar thermal cells in water heating systems, geothermal heating and cooling, and following LEED building principles.
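
For scale, the two revenue figures above work out to a simple per-home ratio; this quick check uses only the article's numbers:

```python
homes_heated = 22_000         # homes heated by Freshkills methane (from the article)
annual_revenue = 11_000_000   # dollars per year (from the article)

revenue_per_home = annual_revenue / homes_heated
print(f"${revenue_per_home:.0f} of revenue per heated home per year")
```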

A brief history of Freshkills Landfill: Fresh Kills Landfill was established in 1948, before there was any large-scale development in the area. Over the years, it became the largest landfill in the world, amassing most of the household garbage collected in New York City. To give you an idea of how much trash was coming into the dump: at its peak, Fresh Kills received as much as 29,000 tons of trash per day, and the four landfill mounds on the site are made up of approximately 150 million tons of solid waste.

By 1997, two of the four landfill mounds were closed off and covered with an impermeable cap. Freshkills received its last barge of garbage on March 22, 2001. New York City’s garbage is now shipped to landfill locations in places such as Pennsylvania and Virginia.

Inhabitat

TStzmmalaysia
post Feb 19 2011, 12:56 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


APPLIED SCIENCES


California Bill Brings Rainwater to Your Toilet

While rainwater collection is a contentious issue in some areas, California is looking to make it a little easier to use your rain barrels as a water source. This week, AB 275, the Rainwater Capture Act of 2011, was introduced into the California State Assembly, a bill that would grant landowners the authority to install rain barrel systems and capture water not only for outdoor use, but for indoor use as well.

The NRDC Switchboard reports, "The bill would also authorize landowners to install systems to capture rainwater for use, with proper treatment, in indoor non-potable applications, such as toilet or urinal flushing. Allowing rainwater to be used for indoor applications would greatly expand the opportunities to capture and use rainwater in the state. The more uses rainwater can be directed to, the faster storage tanks can be emptied, and the more water can be captured."

With a growing population and water sources maxed out, especially in the face of long-lasting droughts, California needs to use whatever water it can get. Embracing the refreshing burst of rain the state has experienced over the last couple of months with an improved policy on rainwater catchment could help alleviate the strain on other sources like the Sierra snowpack and the Colorado River. It is also a way to utilize all the rain that simply falls on rooftops and streets, only to be whooshed away into stormwater systems rather than back into groundwater supplies.
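
To get a feel for the indoor-use math, here is a back-of-envelope sizing sketch. The roof area, rainfall, runoff coefficient, and flushing figures are all illustrative assumptions, not numbers from the bill:

```python
# How much of a household's toilet flushing could rooftop capture cover?
roof_area_m2 = 150      # assumed roof footprint
annual_rain_m = 0.5     # assumed ~500 mm/yr of rainfall
runoff_coeff = 0.8      # assumed fraction of roof rainfall actually captured

captured_l = roof_area_m2 * annual_rain_m * runoff_coeff * 1000  # litres per year

flush_l = 6             # litres per flush for a modern low-flow toilet
flushes_per_day = 10    # assumed household total
demand_l = flush_l * flushes_per_day * 365

print(f"captured ~{captured_l:,.0f} L/yr vs flushing demand ~{demand_l:,.0f} L/yr")
```

Under these assumptions a modest roof captures more than enough to cover flushing, which is why expanding indoor uses lets tanks empty faster and capture more of each storm.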

Of course, the benefits of rainwater catchment systems would be dependent on homeowners actually installing the systems. But it would be wonderful to be able to utilize rainwater for such uses as flushing the toilet, rather than wasting drinkable water.

TreeHugger

TStzmmalaysia
post Feb 19 2011, 12:58 PM


ROBOTICS


Robotic hand nearly identical to a human one

When it comes to finding the single best tool for building, digging, grasping, drawing, writing, and many other tasks, nothing beats the human hand. Human hands have evolved over millions of years into four fingers and a thumb that can precisely manipulate a wide variety of objects. In a recent study, researchers have attempted to recreate the human hand by building a biomimetic robotic hand that they have optimized to achieve near-human appearance and performance.

The researchers, Nicholas Thayer and Shashank Priya from Virginia Tech in Blacksburg, Virginia, have published their study on the robotic hand in a recent issue of Smart Materials and Structures.

The researchers call the hand a dexterous anthropomorphic robotic typing hand, or DART hand, as the main objective was to demonstrate that the hand could type on a computer keyboard. They showed that a single DART hand could type at a rate of 20 words per minute, compared to the average human typing speed of 33 words per minute with two hands. The researchers predict that two DART hands could type at least 30 words per minute. Ultimately, the DART hand could be integrated into a humanoid robot for assisting the elderly or disabled people, performing tasks such as typing, reaching objects, and opening doors.

To design the DART hand, the researchers began by investigating the physiology of the human hand, including its musculoskeletal structure, range of motion, and grasp force. The human hand has about 40 muscles that provide 23 degrees of freedom in the hand and wrist. To replicate these muscles, the researchers used servo motors and wires extending throughout the robotic hand, wrist, and forearm. The robotic hand encompassed a total of 19 motors and achieved 19 degrees of freedom.

“[The greatest significance of our work is the] optimization of the hand design to reduce the number of motors in order to achieve a similar degree of freedom and range of motion as the human hand,” Priya told PhysOrg.com. “This also allowed us to achieve dimensions that are on par with the human hand. We were also able to program the hand in such a manner that a high typing efficiency can be obtained.”


One small difference between the DART hand and the human hand is that each finger in the robotic hand is controlled independently. In the human hand, muscles are sometimes connected at the tendons so they can move joints in more than one finger (which is particularly noticeable with the ring and pinky fingers).

The robotic hand can be controlled by input text, which comes from either a keyboard or a voice recognition program. When typing, a finger receives a command to position itself above the correct letter on the keyboard. The finger presses the key with a specific force, and the letter is checked for accuracy; if there is a typo, the hand presses the delete key. By moving the forearm and wrist, a single DART hand can type any key on the main part of a keyboard.
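
The press-verify-delete loop described above can be sketched as follows. The key layout, actuator, and single-retry policy are hypothetical simplifications, not the paper's actual control scheme:

```python
# Hypothetical one-row key layout: character -> (x, y) position for the finger.
KEY_POSITIONS = {c: (i, 0) for i, c in enumerate("abcdefghijklmnopqrstuvwxyz ")}

def type_text(target, press_key):
    """Drive a press_key(char) actuator, verifying each keystroke and
    deleting + retrying (once, in this sketch) whenever a typo is reported."""
    typed = []
    for char in target:
        pos = KEY_POSITIONS[char]          # position the finger above the key
        actual = press_key(char)           # press; returns the key actually hit
        typed.append(actual)
        if actual != char:                 # verify; delete and retry on a typo
            typed.pop()
            actual = press_key(char)
            typed.append(actual)
    return "".join(typed)

# A flaky simulated actuator: misses the first press of "o".
misses = {"o": 1}
def press_key(char):
    if misses.get(char, 0) > 0:
        misses[char] -= 1
        return "p"                         # adjacent-key miss
    return char

print(type_text("hello world", press_key))  # → hello world
```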

The DART hand isn’t the first robotic hand to be designed. During the past several years, robotic hands with varying numbers of fingers have been developed for a variety of purposes, from prosthetics to manufacturing. But as far as the researchers know, no robotic hand can accurately type at a keyboard at human speed. When the researchers compared the functional potential of the DART hand to other robotic hands, the DART hand had an overall functional advantage. In addition, the researchers used rapid prototyping to fabricate all the components, significantly reducing the cost, weight, and fabrication time.

In the future, the researchers plan to make further improvements to the robotic hand, including covering the mechanical hand in a silicone skin, as well as adding temperature sensors, tactile sensors, and tension sensors for improved force-feedback control. These improvements should give the robotic hand the ability to perform more diverse tasks.

“We have already experimented with grasping tasks,” Priya said. “In the current form it is not optimized for grasping, but in our next version there will be enough sensors to provide feedback for controlling the grasping action.”

PhysOrg

TStzmmalaysia
post Feb 19 2011, 01:38 PM


ENERGY


Magma power for geothermal energy?

When a team of scientists drilling near an Icelandic volcano hit magma in 2009, they had to abandon their planned experiments on geothermal energy. But the mishap could point the way to an alternative source of geothermal power.

"Because we drilled into magma, this borehole could now be a really high-quality geothermal well," said Peter Schiffmann, professor of geology at UC Davis and a member of the research team along with fellow UC Davis geology professor Robert Zierenberg and UC Davis graduate student Naomi Marks. The project was led by Wilfred Elders, a geology professor at UC Riverside.

A paper describing geological results from the well was published this month in the journal Geology.

When tested, the magma well produced dry steam at 750 degrees Fahrenheit (400 degrees Celsius). The team estimated that this steam could generate up to 25 megawatts of electricity -- enough to power 25,000 to 30,000 homes.

That compares to 5 to 8 megawatts produced by a typical geothermal well, Elders said. Iceland already gets about one-third of its electricity and almost all of its home heating from geothermal sources.
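
A quick sanity check on the well figures above, assuming roughly 1 kW of average demand per home (which is what 25 MW serving 25,000 to 30,000 homes implies):

```python
kw_per_home = 1.0                    # implied by 25 MW serving ~25,000-30,000 homes

magma_well_mw = 25                   # estimated output of the magma well
typical_well_mw = (5, 8)             # range for a typical geothermal well

homes_magma = magma_well_mw * 1000 / kw_per_home
homes_typical = [mw * 1000 / kw_per_home for mw in typical_well_mw]

print(f"magma well: ~{homes_magma:,.0f} homes")
print(f"typical well: ~{homes_typical[0]:,.0f} to {homes_typical[1]:,.0f} homes")
```

On the same assumption, the magma well would serve three to five times as many homes as a conventional well.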

The team was drilling into the Krafla caldera as part of the Iceland Deep Drilling Project, an industry-government consortium, to test whether "supercritical" water -- very hot water under very high pressure -- could be exploited as a source of power.

They planned to drill to 15,000 feet -- more than two miles deep -- but at 6,900 feet, magma (molten rock rising from deep in the crust) flowed into the well, forcing them to stop.

The composition of magma from the borehole is also providing insight into how magmas form beneath Iceland, Schiffmann said.

PhysOrg

TStzmmalaysia
post Feb 20 2011, 08:16 AM


ROBOTICS


World’s first hummingbird-like unmanned aircraft system takes flight

AeroVironment, the California-based company behind the largest, highest and longest flying unmanned aircraft system (UAS), the Global Observer, has now achieved a remarkable technical milestone with a much smaller aircraft. With its "Nano Hummingbird" the company has for the first time achieved controlled precision hovering and fast-forward flight of a two-wing, flapping wing aircraft that carries its own energy source and relies only on its flapping wings for propulsion and control.

The hand-made final concept demonstrator Nano Hummingbird has a wingspan of 16 cm (6.3 in) and weighs just 19 g (2/3 oz), less than the weight of an AA battery. Into this tiny, lightweight package the AeroVironment UAS team has managed to cram all the systems required for flight, including batteries, motors, communications systems and even a video camera.

The aircraft can climb and descend vertically, fly sideways left and right, fly forward and backward, and rotate clockwise and counter-clockwise -- all under remote control and while carrying a video camera payload. It is even capable of performing a 360-degree loop.

The Nano Hummingbird can be fitted with a removable body fairing, which is shaped to have the appearance of a real hummingbird and, although it is larger and heavier than an average hummingbird, the aircraft is actually smaller and lighter than the largest hummingbird found in nature.

The achievement was part of the Phase II contract awarded by DARPA to AeroVironment to design and build a flying prototype “hummingbird-like” aircraft for the Nano Air Vehicle (NAV) program.

To meet the technical goals of the contract AeroVironment needed to:

- Demonstrate precision hover flight within a virtual two-meter diameter sphere for one minute.

- Demonstrate hover stability in wind gusts, which required the aircraft to hover and tolerate a two-meter-per-second (five mph) gust from the side without drifting downwind more than one meter.

- Demonstrate a continuous hover endurance of eight minutes with no external power source.

- Fly and demonstrate controlled, transition flight from hover to 11 mph (17.7 km/h) fast forward flight and back to hover flight.

- Demonstrate flying from outdoors to indoors, and back outdoors through a normal-size doorway.

- Demonstrate flying indoors 'heads-down' where the pilot operates the aircraft only looking at the live video image stream from the aircraft, without looking at or hearing the aircraft directly.

- Fly the aircraft in hover and fast forward flight with bird-shaped body and bird-shaped wings.

AeroVironment says that not only did its Nano Hummingbird meet all of these requirements, but that it also exceeded many of them.

Check out the two videos below to see the Nano Hummingbird in action. The first video, from 2009, shows the Nano Hummingbird at an early stage of development, while the second shows a flight of the final concept demonstrator.



GizMag

TStzmmalaysia
post Feb 22 2011, 07:38 AM


ENERGY


New Technology for Cheaper, More Efficient Solar Cells

The sun provides more than enough energy for all our needs, if only we could harness it cheaply and efficiently. Solar energy could provide a clean alternative to fossil fuels, but the high cost of solar cells has been a major barrier to their widespread use.

Stanford researchers have found that adding a single layer of organic molecules to a solar cell can increase its efficiency three-fold and could lead to cheaper, more efficient solar panels. Their results were published online in ACS Nano on Feb. 7.

Professor of chemical engineering Stacey Bent first became interested in a new kind of solar technology two years ago. These solar cells used tiny particles of semiconductors called "quantum dots." Quantum dot solar cells are cheaper to produce than traditional ones, as they can be made using simple chemical reactions. But despite their promise, they lagged well behind existing solar cells in efficiency.

"I wondered if we could use our knowledge of chemistry to improve their efficiency," Bent said. If she could do that, the reduced cost of these solar cells could lead to mass adoption of the technology.

Bent discussed her research on Feb. 20, at the annual meeting of the American Association for the Advancement of Science in Washington, D.C.

In principle, quantum dot cells can reach much higher efficiency, Bent said, because of a fundamental limitation of traditional solar cells.

Solar cells work by using energy from the sun to excite electrons. The excited electrons jump from a lower energy level to a higher one, leaving behind a "hole" where the electron used to be. Solar cells use a semiconductor to pull an electron in one direction, and another material to pull the hole in the other direction. This flow of electron and hole in different directions leads to an electric current.

But it takes a certain minimum energy to fully separate the electron and the hole. The amount of energy required is specific to different materials and affects what color, or wavelength, of light the material best absorbs. Silicon is commonly used to make solar cells because the energy required to excite its electrons corresponds closely to the wavelength of visible light.

But solar cells made of a single material have a maximum efficiency of about 31 percent, a limit imposed by the single, fixed energy threshold at which the material absorbs light.

Quantum dot solar cells do not share this limitation and can in theory be far more efficient. The energy levels of electrons in quantum dot semiconductors depend on their size -- the smaller the quantum dot, the larger the energy needed to excite electrons to the next level.

So quantum dots can be tuned to absorb a certain wavelength of light just by changing their size. And they can be used to build more complex solar cells that have more than one size of quantum dot, allowing them to absorb multiple wavelengths of light.
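
The size dependence can be illustrated with the simplest possible model, an electron in a one-dimensional box, using the free-electron mass. Real quantum dots involve effective masses and three dimensions, so this is only a scaling sketch: confinement energy grows as 1/d², which is why smaller dots absorb bluer light.

```python
H = 6.626e-34      # Planck constant, J*s
M_E = 9.109e-31    # electron mass, kg
EV = 1.602e-19     # joules per electron-volt

def confinement_energy_ev(d_nm):
    """Ground-state energy of an electron in a 1-D box of width d (in nm)."""
    d = d_nm * 1e-9
    return H**2 / (8 * M_E * d**2) / EV

# Halving the box size quadruples the confinement energy.
for d_nm in (2, 4, 8):
    print(f"d = {d_nm} nm -> E = {confinement_energy_ev(d_nm):.4f} eV")
```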

Because of these advantages, Bent and her students have been investigating ways to improve the efficiency of quantum dot solar cells, along with associate Professor Michael McGehee of the department of Materials Science and Engineering.

The researchers coated a titanium dioxide semiconductor in their quantum dot solar cell with a very thin single layer of organic molecules. These molecules were self-assembling, meaning that their interactions caused them to pack together in an ordered way. The quantum dots were present at the interface of this organic layer and the semiconductor. Bent's students tried several different organic molecules in an attempt to learn which ones would most increase the efficiency of the solar cells.

But she found that the exact molecule didn't matter -- just having a single organic layer less than a nanometer thick was enough to triple the efficiency of the solar cells. "We were surprised, we thought it would be very sensitive to what we put down," said Bent.

But she said the result made sense in hindsight, and the researchers came up with a new model -- it's the length of the molecule, and not its exact nature, that matters. Molecules that are too long don't allow the quantum dots to interact well with the semiconductor.

Bent's theory is that once the sun's energy creates an electron and a hole, the thin organic layer helps keep them apart, preventing them from recombining and being wasted. The group has yet to optimize the solar cells, and they have currently achieved an efficiency of, at most, 0.4 percent. But the group can tune several aspects of the cell, and once they do, the three-fold increase caused by the organic layer would be even more significant.

Bent said the cadmium sulfide quantum dots she is currently using are not ideal for solar cells, and the group will try different materials. She said she would also try other molecules for the organic layer, and could change the design of the solar cell to try to absorb more light and produce more electrical charge. Once Bent has found a way to increase the efficiency of quantum dot solar cells, she said she hopes their lower cost will lead to wider acceptance of solar energy.

ScienceDaily
TStzmmalaysia
post Feb 22 2011, 07:39 AM


RESEARCH


Advanced NASA Instrument Gets Close-Up on Mars Rocks

NASA's Mars Science Laboratory rover, Curiosity, will carry a next generation, onboard "chemical element reader" to measure the chemical ingredients in Martian rocks and soil. The instrument is one of 10 that will help the rover in its upcoming mission to determine the past and present habitability of a specific area on the Red Planet. Launch is scheduled between Nov. 25 and Dec. 18, 2011, with landing in August 2012.

The Alpha Particle X-Ray Spectrometer (APXS) instrument, designed by physics professor Ralf Gellert of the University of Guelph in Ontario, Canada, uses alpha particles, or helium nuclei, and X-rays to bombard a target, causing the target to give off its own characteristic alpha particles and X-ray radiation. This radiation is read by an X-ray detector inside the sensor head, revealing which elements are in the rock or soil and how much of each.

Identifying the elemental composition of lighter elements such as sodium, magnesium or aluminum, as well as heavier elements like iron, nickel or zinc, will help scientists identify the building blocks of the Martian crust. By comparing these findings with those of previous Mars rover findings, scientists can determine if any weathering has taken place since the rock formed ages ago.
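
As an illustration of the element-identification step, a toy version might match a measured peak to the nearest characteristic K-alpha line. The real APXS pipeline fits entire spectra; the lookup table below is only a small sample of standard tabulated line energies:

```python
# Characteristic K-alpha line energies in keV (standard tabulated values).
K_ALPHA_KEV = {
    "Na": 1.04, "Mg": 1.25, "Al": 1.49, "Si": 1.74,
    "Fe": 6.40, "Ni": 7.48, "Zn": 8.64,
}

def identify(peak_kev):
    """Return the element whose K-alpha line is closest to the measured peak."""
    return min(K_ALPHA_KEV, key=lambda el: abs(K_ALPHA_KEV[el] - peak_kev))

print(identify(6.41))  # → Fe
print(identify(1.48))  # → Al
```

Peak intensities (not shown here) are what give the "how much of each" part of the measurement.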

All NASA Mars rovers have carried a similar instrument -- Pathfinder's rover Sojourner, Spirit and Opportunity, and now Curiosity, too. Improvements have been made with each generation, but the basic design of the instrument has remained the same.

"APXS was modified for Mars Science Laboratory to be faster so it could make quicker measurements. On the Mars Exploration Rovers [Spirit and Opportunity] it took us five to 10 hours to get information that we will now collect in two to three hours," said Gellert, the instrument's principal investigator. "We hope this will help us to investigate more samples."

Another significant change to the next-generation APXS is the cooling system on the X-ray detector chip. The instruments used on Spirit and Opportunity were able to take measurements only at night. But the new cooling system will allow the instrument on Curiosity to take measurements during the day, too.

The main electronics portion of the tissue-box-sized instrument lives in the rover's body, while the sensor head, the size of a soft drink can, is mounted on the robotic arm. With the help of Curiosity's remote sensing instruments -- the Chemistry and Camera (ChemCam) instrument and the Mastcam -- the rover team will decide where to drive Curiosity for a closer look with the instruments, including APXS. Measurements are taken with the APXS by deploying the sensor head to make direct contact with the desired sample.

The rover's brush will be used to remove dust from rocks to prepare them for inspection by APXS and by MAHLI, the rover's arm-mounted, close-up camera. Whenever promising samples are found, the rover will then use its drill to extract a few grains and feed them into the rover's analytical instruments, SAM and CheMin, which will then make very detailed mineralogical and other investigations.

Scientists will use information from APXS and the other instruments to find the interesting spots and to figure out the present and past environmental conditions that are preserved in the rocks and soils.

"The rovers have answered a lot of questions, but they've also opened up new questions," said Gellert. "Curiosity was designed to pick up where Spirit and Opportunity left off."

ScienceDaily

TStzmmalaysia
post Feb 22 2011, 07:41 AM


BIOTECHNOLOGY


Ink-jet printers inspire scientists to make skin

Ink-jet printing technology has inspired scientists to look for ways to build sheets of skin that could one day be used for grafts in burn victims, experts said Sunday.

One technique involves a portable bioprinter that could be carried to wounded soldiers on the battlefield where it would scan the injury, take cells from the patient and print a section of compatible skin.

Another uses a three-dimensional printer combining donor cells, biofriendly gel and other materials to build cartilage. The 3-D printer was shown at work, building a prototype of an ear during a half-hour demonstration at a Washington science conference. Hod Lipson of Cornell University in New York said it worked much like an ink-jet printer.

"It spits out plastic to gradually build an object layer by layer... after a couple of hours you end up with a real physical object that you can hold in your hand," he said. "Just imagine -- if you could take cells from a donor, culture them, put them into an ink and recreate an implant that is alive and made of the original cells from the donor -- how useful that would be in terms of avoiding rejection," said Lipson. "That is where we are going. Let's see how far we can go."

Studies using the technology in animals have shown promise, particularly with printed cartilage, which is relatively simple in its construction and is tough so it can withstand the rigors of printing. "There are very severe limitations," Lipson said. "We are right now limited to cells... that can handle being printed."

Scientist James Yoo of Wake Forest University in North Carolina said his team's approach to printing skin has shown positive results in repairing skin in mouse and pig models. "One approach is to directly deploy cells to the wound site and the other approach is to build a tissue construct outside the body and transfer it into the body," said Yoo. The technology works in part via a scanner that takes a measure of the affected area and identifies the depth and extent of the injury, informing the bioprinter of how many layers of cells need to be made.
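
The scan-to-print planning step might be sketched like this: convert a measured wound depth map into per-site layer counts. The layer thickness and depth map are invented numbers, and the real system is far more involved:

```python
LAYER_THICKNESS_MM = 0.1   # assumed thickness deposited per printed cell layer

def plan_layers(depth_map_mm):
    """For each scanned site, how many layers must the bioprinter deposit?"""
    return [[round(depth / LAYER_THICKNESS_MM) for depth in row]
            for row in depth_map_mm]

# A tiny invented depth map (mm) from the scanner: deeper wound -> more layers.
wound = [[0.0, 0.3, 0.5],
         [0.2, 0.6, 0.4]]
print(plan_layers(wound))  # → [[0, 3, 5], [2, 6, 4]]
```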

Both scientists said the advances were still in their early stages and required more research and refinement before they are ready for human patients. "One of the challenges that we will eventually face is like anything else, when you are trying to transfer the technology into the body, how can we create and connect those tissues?" said Yoo. "Whatever you put in the body has to be connected with the body's blood vessels, blood supply and oxygen."

PhysOrg

TStzmmalaysia
post Feb 22 2011, 07:42 AM


APPLIED SCIENCES


Star Wars-Inspired Marine Research Facility

Is it from outer space? No, it's a Mellard pod on a futuristic aquatic research station!

Star Wars inspiration and biomimicry combine for the design of the Facility at Sea, a sustainable marine research platform and feat of offshore building engineering. The concept came together in an architecture studio at the University of Texas, which evaluated potential applications of the soaring structural designs of Star Wars for a marine research facility. Designer Jason Mellard took further inspiration from the engineering acumen of Santiago Calatrava and present-day offshore oil platforms.

Taking a cue from the trunk of a tree, the central structural element of the Facility at Sea will serve as its lifeline, providing structure to the building while housing the facility’s mechanical and vertical circulation, including energy storage, waste removal systems, a control and engine room, and emergency generators. The trunk will be surrounded by three main branches, two research spheres, and a “habitation disk” that will not only move up and down vertically but also open and close (weather permitting), literally bringing the clam out of the water.

The research spheres will house laboratories, classrooms, computer labs, viewing platforms, holding tanks, offices, and storage. The “habitat disks” will house sleeping and living areas, including a communal dining and food preparation kitchen, a medical clinic, recreational areas, as well as Star Wars-inspired observation decks and docking platforms.

Much like the International Space Station, the Facility at Sea will house scientists for 6-12 month periods. Because the clam structure will be airtight when closed, the Facility at Sea will be suited for life both above and under water. The underwater environment will not only provide an excellent vantage point for marine research, it will also protect the Facility at Sea from the sometimes harsh conditions above sea level, including storms and extremely windy weather.

The Facility at Sea would be remiss not to take advantage of the surrounding air turbulence. The automated shells will therefore harness wind energy through their integrated rotating pinwheel design, and solar power through incorporated PV cells. The natural resources do not stop there: the facility will also be equipped to generate electricity from the ocean current and to use its trunk as a lightning attractor, charging the storage batteries during seasonal (and unpredicted) thunderstorms.

TheBuilderBlog

TStzmmalaysia
post Feb 22 2011, 07:43 AM


RESEARCH


Chicago Solar Tower

The proposed Solar Tower for Chicago by Zoka Zola Architects features an active solar array mounted on the façade that maximizes solar gain throughout the day.

The spherically based design takes advantage of a building's large vertical surface by mounting the panels on the vertical plane. By incorporating tracking arms on which the solar units are mounted, summer electrical production can be improved by as much as 40% compared to a static solar array, and even more compared to traditional vertically mounted solar façades. The array's full potential is then realized, creating the greatest kWh production per square foot of any design. Wind pressure exerted on the solar panel holding mechanisms can also be converted into energy.
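
The benefit of tracking can be illustrated with a toy two-dimensional model that averages the cosine of the sun-panel incidence angle as the sun sweeps from horizon to horizon. This idealized sketch is not the architects' 40% figure, which compares specific mounting schemes under real conditions:

```python
import math

def avg_exposure(panel, steps=1000):
    """Average cos(incidence) over a day in a 2-D sky.
    `panel` is the panel's unit normal vector (x toward one horizon, y up)."""
    total = 0.0
    for i in range(steps):
        theta = math.pi * (i + 0.5) / steps       # sun angle, sunrise to sunset
        sun = (math.cos(theta), math.sin(theta))  # unit vector toward the sun
        cosi = sun[0] * panel[0] + sun[1] * panel[1]
        total += max(0.0, cosi)                   # no output when the sun is behind
    return total / steps

tracked = 1.0                        # a perfect tracker keeps cos(incidence) = 1
flat = avg_exposure((0.0, 1.0))      # fixed horizontal panel
vertical = avg_exposure((1.0, 0.0))  # fixed vertical facade facing one horizon
print(f"tracking gain over horizontal: {tracked / flat - 1:.0%}")
print(f"tracking gain over a fixed vertical facade: {tracked / vertical - 1:.0%}")
```

Even this crude model shows why vertical façades gain disproportionately from tracking: a fixed vertical panel spends much of the day at a poor angle to the sun.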

The spherical panels are mounted in such a way as to maintain views from the interior while reducing heat gain, minimizing dependency on a cooling plant. The panels are visible from the interiors of the tower to showcase the technology. The siting of the tower will have a dramatic effect on its power production: being isolated, or adjacent to a southerly body of water or park, is preferable. The entire building will have a kinetic profile, raising onlookers' awareness of renewable onsite energy production and sustainable urban design.

TStzmmalaysia
post Feb 23 2011, 07:39 AM


RESEARCH


"Zeppelin Could Revolutionize Airship Transportation"

Zeppelins are pie-in-the-sky dreams: every year or so there's a new concept for one, but they never seem to get built. Ever since the Hindenburg disaster, people have been wary of the concept altogether; not to mention, zeppelins are incredibly expensive. Lieven Standaert, a Belgian engineer, is hard at work designing a zeppelin that addresses the major design issues of airships.

He proposes to build a long, pointed airship called the Aeromodeller II, made of low-cost materials, that generates its own hydrogen via wind power and never needs to land. If his theories prove correct, he could revolutionize airship design and propel hydrogen power into the future of zero-emissions transportation.

Current designs for airships have numerous problems, including that they rely on helium, they’re expensive to build, are vulnerable due to their over-pressured skins and require expensive hangars to park them in. Standaert’s design for the Aeromodeller II would eliminate many of these problems. The zeppelin’s shape is modified to be longer and skinnier to reduce the need for pressurized skins. Lower cost materials, like light thermoplastic foil could be used instead of a woven skin material. The Aeromodeller II is also designed so it never needs to land, which eliminates the need for expensive hangars.

The zeppelin would run on hydrogen generated on board, so it never needs to stop at a refueling station. While the airship is attached to ground anchors, its propellers would switch gears to become turbines, harnessing the power of the wind to generate hydrogen. In this way, the zeppelin could remain aloft indefinitely, resupplying hydrogen whenever it needs to.
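
A back-of-envelope sketch of the anchored recharging idea: the turbine output and anchoring time below are invented, and the electrolyzer figure is a typical electricity cost for electrolytic hydrogen, not a number from Standaert's design.

```python
turbine_power_kw = 10            # assumed combined output of the props-as-turbines
anchored_hours = 12              # assumed windy, anchored period
electrolyzer_kwh_per_kg = 55     # typical electricity cost of electrolytic hydrogen

energy_kwh = turbine_power_kw * anchored_hours
h2_kg = energy_kwh / electrolyzer_kwh_per_kg
print(f"{energy_kwh} kWh of wind energy -> ~{h2_kg:.1f} kg of hydrogen")
```

The point of the sketch is the trade-off: hydrogen production per anchoring session scales directly with turbine power and wind duration, which is why a slow, patient, sailboat-like craft suits the scheme.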

More like a sailboat than a powerboat, the Aeromodeller II isn't designed for speed and would probably only achieve about 80 km/h (50 mph). This low-speed transportation, though, would be completely zero-emissions and rely entirely on renewable energy. Standaert is currently showing a model of his design in Antwerp, Belgium, until the end of February.

Inhabitat
TStzmmalaysia
post Feb 23 2011, 07:41 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH


How much information is there in the world? Scientists calculate the world's total technological capacity

Think you're overloaded with information? Not even close. A study appearing on Feb. 10 in Science Express calculates the world's total technological capacity -- how much information humankind is able to store, communicate and compute.

"We live in a world where economies, political freedom and cultural growth increasingly depend on our technological capabilities," said lead author Martin Hilbert of the USC Annenberg School for Communication & Journalism. "This is the first time-series study to quantify humankind's ability to handle information."

So how much information is there in the world? How much has it grown?

Prepare for some big numbers:

• Looking at both digital memory and analog devices, the researchers calculate that humankind is able to store at least 295 exabytes of information. (Yes, that's a number with 20 zeroes in it.)

Put another way, if a single star is a bit of information, that's a galaxy of information for every person in the world. That's 315 times the number of grains of sand in the world. But it's still less than one percent of the information that is stored in all the DNA molecules of a human being.
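The galaxy analogy checks out with simple arithmetic. A sketch, assuming a 2007 world population of roughly 6.6 billion and a few hundred billion stars in a large galaxy (both assumptions, not figures from the study):

```python
# Sanity-check the "a galaxy of information per person" analogy.
EXABYTE = 10**18
total_bits = 295 * EXABYTE * 8     # the study's 295 exabytes, as bits
people = 6.6e9                     # assumed 2007 world population

bits_per_person = total_bits / people
print(f"{bits_per_person:.2e} bits per person")
# ~3.6e11: hundreds of billions of "stars" each, which is indeed
# the order of magnitude of the star count of a large galaxy.
```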

• 2002 could be considered the beginning of the digital age, the first year worldwide digital storage capacity overtook total analog capacity. As of 2007, almost 94 percent of our memory is in digital form.

• In 2007, humankind successfully sent 1.9 zettabytes of information through broadcast technology such as televisions and GPS. That's equivalent to every person in the world reading 174 newspapers every day.

• On two-way communications technology, such as cell phones, humankind shared 65 exabytes of information through telecommunications in 2007, the equivalent of every person in the world communicating the contents of six newspapers every day.

• In 2007, all the general-purpose computers in the world computed 6.4 x 10^18 instructions per second, in the same general order of magnitude as the number of nerve impulses executed by a single human brain. Doing these instructions by hand would take 2,200 times the period since the Big Bang.

• From 1986 to 2007, the period of time examined in the study, worldwide computing capacity grew 58 percent a year, ten times faster than the United States' GDP.

• Telecommunications grew 28 percent annually, and storage capacity grew 23 percent a year.
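Those annual rates compound dramatically over the study's 1986-2007 window. The rates below are from the study; the doubling-time framing is my own:

```python
import math

def doubling_time(rate):
    """Years for capacity to double at a given annual growth rate."""
    return math.log(2) / math.log(1 + rate)

def total_growth(rate, years=21):
    """Overall multiplication factor over the 21-year study period."""
    return (1 + rate) ** years

for name, rate in [("computation", 0.58), ("telecom", 0.28), ("storage", 0.23)]:
    print(f"{name}: doubles every {doubling_time(rate):.1f} yr, "
          f"x{total_growth(rate):,.0f} over 1986-2007")
```

At 58% per year, computing capacity doubles roughly every year and a half, multiplying about fifteen-thousand-fold over the study period.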

"These numbers are impressive, but still miniscule compared to the order of magnitude at which nature handles information" Hilbert said. "Compared to nature, we are but humble apprentices. However, while the natural world is mind-boggling in its size, it remains fairly constant. In contrast, the world's technological information processing capacities are growing at exponential rates."

PhysOrg




TStzmmalaysia
post Feb 23 2011, 07:43 AM



APPLIED SCIENCES


'25 Hours' City Concept Design

As we move toward the year 2040, the demands for energy, mobility and space in Los Angeles continue to grow in a region already overwhelmed by urban sprawl, traffic congestion, scarce open space, and inferior public transportation. The 25-Hour City opposes the Los Angeles urban model of autonomy by creating a hyper-dense, vibrant urban environment that incorporates everything, everywhere, all the time.

The hyper-mixing of program allows for the freedom of continuous work or leisure at anytime of the day or night. This urban configuration is coupled with the programmatic dispersal of commercial, residential, retail, public, and recreational space to fulfill the 25-Hour City concept.

This vertical proposal achieves the ultimate level of sustainable responsibility through hyper-efficient land use. By condensing 75,000 people into one tower, transportation needs are reduced substantially while open space at the ground level is maintained. In a hyper-dense situation of 80,000 people/km², natural light and ventilation become the most prominent influences on zoning and massing throughout the city. Once light and air set the limits of the density achievable at ground level, the only logical way for a city to grow is vertically. Through this logic, swells in the urban fabric are created that evolve into vertical cities where the limits of density cease to exist.
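The land-use arithmetic implied by those two figures is straightforward (a sketch; only the 75,000-person and 80,000 people/km² values come from the concept text):

```python
# Land area implied by housing the tower's population at the stated density.
people = 75_000
density = 80_000                   # people per km^2
footprint_km2 = people / density
print(f"{footprint_km2:.2f} km^2") # under 1 km^2 of city fabric per tower
```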

Cities as we know them achieve variety through zoning distributions along the horizontal plane; this 25-Hour Tower proposal investigates how that programmatic variety can be accomplished in a vertical format. To create a city that includes a multitude of program types and sizes in a vertical arrangement, the overall building area must increase substantially compared to a typical high-rise in order to provide an extensive amount of programmatic and spatial diversity within the structure. The varieties embedded within a city are developed through different program types and sizes and their mixture. This vertical city proposal reaches a level of ultimate diversity by being composed in a fashion that creates micro-autonomous areas of a single program typology, a multitude of mixed program areas, as well as areas of hyper-diversity.

The vertical city is composed of networked strands of program that expand and contract in their spacing in order to create the desired level of variety. This alteration in the spacing of the strands does not affect the amount of program embedded within the building; the degree of compaction determines the level of programmatic diversity that occurs. A vertical city also faces technical issues such as vertical circulation, natural light, ventilation, and structure. The control of these technical parameters is embedded within the massing strategy used for programmatic distribution. The spacing of the strands develops voids for natural light and ventilation, and the continuity of the strand network maintains structural integrity for the massing. A dynamic and robust vertical circulation system is produced by developing a hierarchical system of express and local elevators that creates multiple ways to reach a single destination.

HoustonDrum

TStzmmalaysia
post Feb 23 2011, 07:44 AM



BIOTECHNOLOGY


The Growth and Future of Urban Farming

With a large share of the world's population now living in cities and growing demand for healthy, sustainable living, it is understandable that in recent years urban farming has become a hot topic not only among gardeners and businesspeople but also among academics.

Here at the Hydroponics Guide we have seen countless projects over the last few years cropping up all over the world. In the Middle East hydroponics is becoming an increasingly viable form of agriculture whilst rooftop farms and farm shops are becoming an increasingly common sight in cities such as London, New York and Montreal.

Urban farming, however, serves an important purpose in the developing world, where water and arable land are in short supply. Traditional agriculture is in many instances simply not an option; hydroponics provides a far more sustainable and appropriate solution. Urban farming projects have been utilised across the developing world, in countries such as Botswana and Cuba, as a means of extracting the greatest food production from the fewest possible resources.

In other areas of the world hydroponic urban farms are also growing in popularity. In the developed world, however, resource pressures are not the predominant driver of hydroponic projects; instead, the desire to avoid pesticides and chemicals and to reduce the food miles intrinsic to industrial agriculture stands out.

There is currently some debate as to exactly how urban farming projects should be implemented. So far the majority of projects have been carved out of disused space in cities, one noteworthy example being the construction of rooftop hydroponic farms above office blocks and supermarkets. However, many see a future of vertical farms in skyscrapers in excess of 30 floors. These buildings may incorporate aquaponic elements to ensure a sustainable source of fish and, according to some scientists, will even have space to raise poultry. Understandably, the major benefits of any such project would be the minimisation of food miles and the removal of harmful pesticides and chemicals from food supplies.

Whilst there are a growing number of hydroponic projects, no such skyscraper yet exists. Vertical farming racks, however, have been used successfully in a number of different applications. These racks are typically mobile, acting like a conveyor belt for plants to ensure that each plant is exposed to the light within the system. Whilst such vertical farming projects are currently small scale, the argument that they could be scaled up is sound.

Whether vertical farming racks are the future of urban farming is difficult to ascertain. Urban farming as a concept however will arguably gain traction in the coming years as pressure on resources creeps into the global agricultural market. The increased number of medium sized hydroponics projects stands testament to the increased plausibility of urban farming and provides evidence of how it could become a major component in worldwide food production in the future.

Hydroponics Guide

TStzmmalaysia
post Feb 23 2011, 07:46 AM



TRANSPORTATION


New Pedestrian Detector Brakes Car For You

In the next decade Volvo wants to end fatalities in its cars. The first step toward that goal is giving human drivers a computer supervisor. The Swedish car company’s latest safety feature is the Collision Warning with Full Auto Brake and Pedestrian Detection – a system that will automatically stop your car if you’re about to hit someone. Using an onboard radar, Volvo’s 2011 S60 model can determine the distance and speed of objects as it approaches them.

A camera mounted in front of the rear view mirror then determines what those objects are and whether or not they are moving towards a collision with the car. Volvo’s Pedestrian Detection system will give you a warning buzz if you’re about to hit a human or another vehicle, but if you don’t react fast enough you don’t have to worry – it will slam on the brakes for you. Check out commercials for the system, along with some real world tests, in the videos below. I think it’s entirely possible that Volvo could prevent the majority of automotive deaths associated with their cars by 2020. The question is whether or not they’ll ultimately have to remove humans from behind the wheel in order to do so.
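The warn-then-brake behaviour described above amounts to a time-to-collision (TTC) calculation. Here is a minimal sketch; the thresholds, function, and two-stage structure are illustrative assumptions, not Volvo's actual algorithm.

```python
# Illustrative time-to-collision decision logic (assumed thresholds).
WARN_TTC = 2.5    # seconds until impact: sound the warning buzz (assumed)
BRAKE_TTC = 1.0   # seconds until impact: apply full auto-brake (assumed)

def collision_response(distance_m, closing_speed_ms):
    """Return the system's action given radar distance and closing speed."""
    if closing_speed_ms <= 0:
        return "no action"              # object is not getting closer
    ttc = distance_m / closing_speed_ms # seconds until impact
    if ttc < BRAKE_TTC:
        return "full auto-brake"
    if ttc < WARN_TTC:
        return "warn driver"
    return "no action"

# At ~35 km/h (9.7 m/s), the outcome depends on how close the pedestrian is:
print(collision_response(8.0, 9.7))    # full auto-brake
print(collision_response(20.0, 9.7))   # warn driver
print(collision_response(60.0, 9.7))   # no action
```

The two-stage design mirrors the article's description: the driver is warned first, and the car only intervenes if the driver fails to react in time.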

Here’s the official promotional video for Volvo’s Collision Warning with Full Auto Brake and Pedestrian Detection. It gives you a good overview on how the system works. (Those who want to watch the emotional story without the dubbed narration can find it here.)



Announced in early 2010, the Pedestrian Detection system has subsequently received good reviews from automotive journals and magazines. Boston.com and Popular Mechanics were among the many that put the device to the test with real world collisions using a dummy. As you can see in the following videos, Volvo’s anti-collision sensors work fairly well. At speeds up to 35 kilometers per hour (~22mph) it can completely avoid a collision. At higher velocities it will reduce the impact significantly.

In the last trial in the following video, the vehicle actually collides with the dummy despite traveling under 35 kph. According to Boston.com this may have been due to crowds gathered on either side of the test track.

The Pedestrian Detection system is just part of a suite of advanced safety features that use computers to augment human drivers. The Volvo S60 retails for around $40,000 in the US when equipped with a technology package that includes:

Driver Alert Control (DAC): An alarm that warns if the car begins to weave randomly or uncontrollably – signs of a tired or distracted driver.

Blind Spot Information System (BLIS): Warning lights on the outside mirrors alert drivers to unseen vehicles.

Dual Xenon with Active Bending Lights (ABL): The headlamps move to follow the curve of the road.

Lane Departure Warning (LDW): System that monitors the lane dividers and warns the driver if the car moves across them without first using the turn indicator.

Many of these features, including the Pedestrian Detection, are upgraded from components found on Volvo’s City Safety system that was released a few years ago and continues to be updated:



Of course, all of these driver enhancing technologies are just a step towards fully automated vehicles. We’ve already seen Volvo take bigger moves in that direction with their work in SARTRE – a project that allows drivers to join ‘convoys’ on the highway, slaving the control of their car to a lead driver. Pedestrian Detection, DAC, BLIS, ABL, LDW and all the other safety features on the S60 are just the calming pats on the head that are trying to prepare us for the day when computers completely take over driving.

Why the need for all these baby steps? As I mentioned back when Google unveiled their robot car project, humans simply aren’t ready to cede control of their vehicles to computers. There are too many social and legal hurdles (not to mention psychological ones) that bar autonomous automobiles from taking their rightful place on the highway. Robot car technology will probably be ready long before humanity is ready to accept it.

So we’ll keep getting these tiny moves towards autonomous vehicles in the years ahead, letting computers take more control one dangerous scenario at a time. By 2020 we may have stumbled all the way into fully robotic cars. I doubt it will happen that fast, but it’s possible. Even if we do go full robot, however, driving fatalities will always be with us. Volvo’s Vision for 2020 is laudable, but no driver, human or computer, is 100% safe.

The irony may be that acknowledging this fact is the only way to make us become brave enough to accept computers behind the wheel.

SingularityHub





TStzmmalaysia
post Feb 23 2011, 07:47 AM



APPLIED SCIENCES


Massive Green-Roofed Innovation Hub Set To Launch Botswana Into High Tech Business

Botswana is undertaking a huge development called the Botswana Innovation Hub.

They've contracted New York-based SHoP Architects to design a sexy green innovation hub in their capital city, and when the proposal breaks ground it could help launch the African nation into a higher class of business, technology and innovation. The high-tech campus will be completely covered with a green roof and will feature photovoltaics, rainwater collection and much more -- all in hopes of becoming the first LEED certified project in the country.

The Botswana Innovation Hub is set to be located on a 57-hectare veldt (wide open space in Africa) outside of Gaborone. 270,000 square feet of office and laboratory space will be provided for technology and knowledge-intensive foreign and local businesses, as well as research and advanced training institutes. An AIDS research center has already expressed interest in having space inside the campus, which is being designed less as a business park and more as a community with interconnecting bridges, communal parks and meeting areas.

SHoP Architects won an international competition to design the campus and is including a slew of green strategies in hopes of achieving LEED certification. Botswana, to date, has no green buildings, so this center would itself be an innovation for the country and help show what is possible. The hub is composed of a series of long, slender buildings that reach out into the open land and are covered with a concept SHoP calls the “Energy Blanket” roofscape, which combines passive and active sustainable energy techniques. Large overhangs will passively shade the building’s interior volumes while photovoltaics collect the sun’s energy. The building will be covered by a large xeriscaped roof garden planted with indigenous species that collects rainwater for reuse. Overall, the green strategies will realize a 50% energy savings over US ASHRAE building standards.

Inhabitat
TStzmmalaysia
post Feb 23 2011, 07:48 AM



APPLIED SCIENCES


Rotterdam's Cactus Building

Urban Cactus is a housing project in the Vuurplaat section of Rotterdam by UCX Architects / Ben Huygen and Jasper Jaegers, done for Vestia Rotterdam Feijenoord/Estrade Projecten. Due to its siting at the end of a harbor, the architects chose to conceptualize the project as belonging to the “green nerve” rather than the surrounding urban structure.

They placed the 98 residential units on 19 floors, using the pattern of outdoor spaces to determine the overall appearance of the project. The slightly irregular pattern staggers these outdoor spaces to create what are in effect double-height terraces, so each unit receives more sunlight than in a typical stacked composition. The total terrace area may be equivalent to a constant strip of, say, two meters extended around the perimeter, but this configuration creates larger “rooms” for gardening and for enjoying the outdoors and the city views.

The ground floor has commercial space and a deepened plinth containing the entrance, depositories and parking space. The building includes four types of residences, varying from 65 to 110 square meters per residence. Two floor types with 4 apartments each and two more with 6 apartments each are enough to accommodate the different unit types. By rotating each floor type in accordance with the layout, a versatile pattern of thorny outside areas comes into existence. Every residence has its place in the overall façade. The size of the outside areas, their outline and the provisions for horticulture maximize the garden feeling. A view over the city from one’s private garden is the summit of urban living.
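Assuming every residential floor is one of the stated types (two types with 4 apartments, two with 6), the published totals pin the mix down uniquely. A quick check (my own arithmetic, not from the architects):

```python
# Which mix of 4-apartment and 6-apartment floors yields 98 units on 19 floors?
solutions = [
    (four, six)
    for four in range(20)
    for six in range(20)
    if four + six == 19 and 4 * four + 6 * six == 98
]
print(solutions)  # [(8, 11)]: eight 4-unit floors and eleven 6-unit floors
```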

TheBuilderBlog
TStzmmalaysia
post Feb 23 2011, 07:51 AM



RESEARCH


Model shows how scientific paradigms rise and fall

Scientific concepts such as climate change, nanotechnology, and chaos theory can sometimes spring up and capture the attention of both the scientific and public communities, only to be replaced by new ideas later on. Although many factors influence the emergence and decline of such scientific paradigms, a new model has captured how these ideas spread, providing a better understanding of paradigm shifts and the culture of innovation.

The researchers, Stefan Bornholdt from the University of Bremen in Bremen, Germany, and Mogens Høgh Jensen and Kim Sneppen from the Niels Bohr Institute in Copenhagen, Denmark, have published their study called “Emergence and Decline of Scientific Paradigms” in a recent issue of Physical Review Letters.

“Our model addresses the interplay between a new idea and the difficulty it has in displacing old ideas in a world where alignment of interests is dominating,” Bornholdt told PhysOrg.com.

Several models of opinion formation already exist, but the new model differs from earlier models in a few important ways. Unlike previous models, the new model allows for an infinite variety of ideas, although each idea has a small probability of being initiated. Also, each idea can appear only once, and an agent (or individual) cannot return to any of the ideas that they have previously held, reflecting scientists’ ongoing hunt for new ideas.

In the model, which is defined on a 2D square lattice, ideas spread in two possible ways. In the first way, an agent adopts a new idea held by its neighbors, with a probability proportional to how many agents already hold this particular idea. In the second way, an agent randomly gets a new idea that does not appear anywhere else in the system with a probability that depends on the “innovation rate.” The first way represents cooperative effects in social systems, while the second way represents innovation.
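A minimal simulation conveys how these two rules interact. The sketch below is my own reading of the rules summarized above, not the authors' code; the lattice size, innovation rate, and exact adoption probability are assumptions.

```python
import random

random.seed(1)                       # reproducible run

SIDE = 30                            # lattice side (assumed)
ALPHA = 0.001                        # innovation rate (assumed value)
STEPS = 200_000

grid = [[0] * SIDE for _ in range(SIDE)]   # every agent starts with idea 0
seen = [[{0} for _ in range(SIDE)] for _ in range(SIDE)]  # ideas ever held
next_idea = 1

def neighbors(i, j):
    """Four nearest neighbours on a periodic square lattice."""
    return [((i + di) % SIDE, (j + dj) % SIDE)
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))]

for _ in range(STEPS):
    i, j = random.randrange(SIDE), random.randrange(SIDE)
    if random.random() < ALPHA:      # innovation: a brand-new idea appears
        grid[i][j] = next_idea
        seen[i][j].add(next_idea)
        next_idea += 1
    else:                            # imitation of a random neighbour's idea
        ni, nj = random.choice(neighbors(i, j))
        idea = grid[ni][nj]
        support = sum(grid[x][y] == idea for x, y in neighbors(i, j))
        # adopt with probability proportional to neighbourhood support,
        # and never return to a previously held idea
        if idea not in seen[i][j] and random.random() < support / 4:
            grid[i][j] = idea
            seen[i][j].add(idea)

coexisting = len({idea for row in grid for idea in row})
print(f"{coexisting} ideas coexist on the lattice after {STEPS} updates")
```

Varying `ALPHA` reproduces the qualitative behaviour described below: high innovation rates leave many small, noisy domains, while low rates let a single idea dominate for long stretches.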

The model shows how a system with one dominating scientific paradigm transitions into a system with small clusters of ideas, some of which continue to grow until one dominates, and the process repeats with new ideas. The dynamics of the rise and fall of scientific paradigms depends on the system’s innovation rate. Systems with high innovation rates tend to contain a high degree of noise, along with many small domains of ideas that are constantly generated and replaced. In contrast, systems with low innovation rates tend to have low noise and a state that remains dominant for a long time until a single event replaces it.

In addition to providing a theoretical understanding of how scientific paradigms rise and fall, the model also provides insight that helps explain some observations in real life. For instance, the model shows how small systems have the potential to be more dynamic than large systems, which explains why large companies sometimes acquire small start-up firms as a source of innovation.

“Our model indicates that social cooperation makes it more difficult for new ideas to nucleate because of social pressure,” Bornholdt said. “Accordingly, our model finds a ‘winner take all’ dynamic, suggesting a fashion-like dynamic for the prevailing focus of contemporary science.

“Even though our model is extremely simplified and does not deal with right and wrong, it explores the effect of herd mentality in the propagation of ideas,” he added. “Our model suggests that herd mentality makes a larger system less innovative than several smaller ones. In short, for innovation it’s better to listen to yourself than to others.”

Overall, the model shows how new paradigms have a tendency to quickly rise to dominance, to decline slowly, and to quickly be replaced by other paradigms. When the innovation rate is high, the takeover process is chaotic, with many new ideas competing for dominance. Regardless of the idea itself, the model shows that the pattern of paradigm shifts remains fairly consistent over time.

The results could have implications for science philosophy and science policy, as the model suggests that scientific diversity may need special attention. In addition, the researchers are applying the model to the study of the spread of epidemics.

“We are currently studying the ideas of ‘new’ and ‘old’ in epidemics modeling,” Bornholdt said, “where the ‘never-return-policy’ of ideas in the above model is associated with immunity of infected hosts: A host that has been infected by a particular virus in the past will be immune to this virus in the future and, thus, will never acquire the same infection twice.”

PhysOrg

More information: S. Bornholdt, M. H. Jensen, and K. Sneppen. “Emergence and Decline of Scientific Paradigms.” Physical Review Letters 106, 058701 (2011). DOI: 10.1103/PhysRevLett.106.058701

TStzmmalaysia
post Feb 23 2011, 09:20 AM



RESEARCH


Physicists build bigger 'bottles' of antimatter to unlock nature's secrets

Once regarded as the stuff of science fiction, antimatter—the mirror image of the ordinary matter in our observable universe—is now the focus of laboratory studies around the world.

While physicists routinely produce antimatter with radioisotopes and particle colliders, cooling these antiparticles and containing them for any length of time is another story. Once antimatter comes into contact with ordinary matter it "annihilates"—or disappears in a flash of gamma radiation.

Clifford Surko, a professor of physics at UC San Diego who is constructing what he hopes will be the world's largest antimatter container, said physicists have recently developed new methods to make special states of antimatter in which they can create large clouds of antiparticles, compress them and make specially tailored beams for a variety of uses.

He described the progress made in this area, including his own efforts, at the annual meeting in Washington, DC, of the American Association for the Advancement of Science. His talk, "Taming Dirac's Particle," led off the session entitled "Through the Looking Glass: Recent Adventures in Antimatter," on February 18.

Surko said that since "positrons"—the anti-electrons predicted by English physicist Paul Dirac some 80 years ago—disappear in a burst of gamma rays whenever they come in contact with ordinary matter, accumulating and storing these antimatter particles is no small feat. But over the past few years, he added, researchers have developed new techniques to store billions of positrons for hours or more and cool them to low temperatures in order to slow their movements so they can be studied.

Surko said physicists are now able to slow positrons from radioactive sources to low energy and accumulate and store them for days in specially designed "bottles" that have magnetic and electric fields as walls rather than matter. They have also developed methods to cool them to temperatures as low as that of liquid helium and to compress them to high densities.

"One can then carefully push them out of the bottle in a thin stream, a beam, much like squeezing a tube of toothpaste," said Surko, adding that there are a variety of uses for such positrons.

A familiar positron technique that does not use this new technology is the PET scan, also known as Positron Emission Tomography, which is now used routinely to study human metabolic processes and help design new drugs.

In the new methods being developed by physicists, beams of positrons will be used in other ways. "These beams provide new ways to study how antiparticles interact or react with ordinary matter," said Surko. "They are very useful, for example, in understanding the properties of material surfaces."

Surko and his collaborators at UC San Diego are studying how positrons bind to ordinary matter, such as atoms and molecules. "While these complexes only last a billionth of a second or so," he said, "the 'stickiness' of the positron is an important facet of the chemistry of matter and antimatter."

Surko and his colleagues are building the world's largest trap for low-energy positrons in his laboratory at UC San Diego, capable of storing more than a trillion antimatter particles at one time.

"We are now working to accumulate trillions of positrons or more in a novel 'multi-cell' trap—an array of magnetic bottles akin to a hotel with many rooms, with each room containing tens of billions of antiparticles," he said.

"These developments are enabling many new studies of nature. Examples include the formation and study of antihydrogen, the antimatter counterpart of hydrogen; the investigation of electron-positron plasmas, similar to those believed to be present at the magnetic poles of neutron stars, using a device now being developed at Columbia University; and the creation of much larger bursts of positrons which could eventually enable the creation of an annihilation gamma ray laser."

"An exciting long-term goal of the work is the creation of portable traps for antimatter," added Surko. "This would increase greatly the ability to use and exploit antiparticles in our matter world in situations where radioisotope- or accelerator-based positron sources are inconvenient to arrange."

PhysOrg

Provided by the University of California, San Diego.

TStzmmalaysia
post Feb 23 2011, 09:23 AM



RESEARCH


New Plastics can Conduct Electricity

A newly discovered technique makes it possible to create a whole new array of plastics with metallic or even superconducting properties.

Plastics usually conduct electricity so poorly that they are used to insulate electric cables but, by placing a thin film of metal onto a plastic sheet and mixing it into the polymer surface with an ion beam, Australian researchers have shown that the method can be used to make cheap, strong, flexible and conductive plastic films.

The research has been published in the journal ChemPhysChem by a team led by Professor Paul Meredith and Associate Professor Ben Powell, both at the University of Queensland, and Associate Professor Adam Micolich of the UNSW School of Physics. The paper reports experiments by former UQ PhD student Dr Andrew Stephenson.

Ion beam techniques are widely used in the microelectronics industry to tailor the conductivity of semiconductors such as silicon, but attempts to adapt this process to plastic films have been made since the 1980s with only limited success – until now.

"What the team has been able to do here is use an ion beam to tune the properties of a plastic film so that it conducts electricity like the metals used in the electrical wires themselves, and even to act as a superconductor and pass electric current without resistance if cooled to low enough temperature," says Professor Meredith.

To demonstrate a potential application of this new material, the team produced electrical resistance thermometers that meet industrial standards. Tested against an industry-standard platinum resistance thermometer, they showed comparable or even superior accuracy.

"This material is so interesting because we can take all the desirable aspects of polymers - such as mechanical flexibility, robustness and low cost - and into the mix add good electrical conductivity, something not normally associated with plastics," says Professor Micolich. "It opens new avenues to making plastic electronics."

Andrew Stephenson says the most exciting part about the discovery is how precisely the film’s ability to conduct or resist the flow of electrical current can be tuned. It opens up a very broad potential for useful applications.

"In fact, we can vary the electrical resistivity over 10 orders of magnitude – put simply, that means we have ten billion options to adjust the recipe when we're making the plastic film. In theory, we can make plastics that conduct no electricity at all or as well as metals do – and everything in between,” Dr Stephenson says.

These new materials can be easily produced with equipment commonly used in the microelectronics industry and are vastly more tolerant of exposure to oxygen compared to standard semiconducting polymers.

Combined, these advantages may give ion beam processed polymer films a bright future in the on-going development of soft materials for plastic electronics applications – a fusion between current and next generation technology, the researchers say.

PhysOrg

Provided by University of New South Wales.

TStzmmalaysia
post Feb 23 2011, 09:24 AM



ENERGY


Silicon Solar Cells Ditch the Wafers

Startup Crystal Solar hopes to take some of the cost out of high-performance single-crystalline solar cells by eliminating conventional silicon wafers. The company says it has developed a wafer-free process for making 50-micrometer-thick solar cells with over 15 percent efficiency, with the possibility of higher efficiencies. Because the process doesn't waste much silicon, Crystal Solar expects to produce cells for half or even a third of the cost of conventional cells.

Earlier this month, the National Renewable Energy Laboratory (NREL) announced it would give the company up to $4 million over the next 18 months to fund development of the technology. Crystal Solar, based in Santa Clara, California, will open a small-scale pilot plant by early 2013.

For solar electricity to compete with coal-fired power, silicon solar cells must get still less expensive and more efficient. (NREL's goal is grid parity by 2017.) In today's conventional solar cells, silicon accounts for about two-thirds of the materials costs. During the four-day process of creating a pure, single-crystal silicon ingot and sawing it down into thin pieces, about half the starting material is lost. Using less silicon in each finished solar cell would further save on materials costs.
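The cost arithmetic above can be made concrete. This is our own back-of-envelope calculation from the article's two figures (silicon is about two-thirds of materials cost; sawing loses about half the silicon), assuming an idealized wafer-free process that wastes no silicon at all:

```python
# Our own arithmetic from the article's figures: silicon is about two-thirds
# of materials cost, and sawing an ingot into wafers loses about half the
# silicon. Normalize total materials cost to 1.0.
silicon_share = 2 / 3
kerf_loss = 0.5

# Idealized wafer-free process: the sawing loss disappears entirely.
silicon_saving = silicon_share * kerf_loss   # ~0.33 of total materials cost
materials_cost_after = 1.0 - silicon_saving
print(round(materials_cost_after, 2))  # 0.67 -- roughly a one-third saving
```

Avoiding the kerf loss alone gets part of the way to Crystal Solar's "half or even a third of the cost" claim; the rest has to come from using thinner cells and cheaper processing.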

Crystal Solar uses a process called epitaxial growth to deposit silicon films directly from gases, eliminating silicon wafers from the process. Over the past two years, the company has adapted the process to make very thin single-crystalline silicon solar cells. Crystal Solar says it can make silicon cells that are highly efficient, but thinner than a piece of paper. The sweet spot, it believes, is 40 to 50 micrometers thick, approaching the lower limit of how thin a solar cell can be while still performing up to the material's theoretical potential. (Much thinner than this, and it won't absorb enough light.)
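The "too thin to absorb enough light" trade-off can be sketched with the Beer–Lambert law. The absorption coefficient below is an illustrative assumed value (silicon's varies strongly with wavelength), and this single-pass estimate ignores the light-trapping tricks real cells use:

```python
import math

# Illustrative number only: silicon's absorption coefficient varies strongly
# with wavelength; ~100 per cm is a rough near-infrared value (assumed).
ALPHA_PER_CM = 100.0

def absorbed_fraction(thickness_um: float) -> float:
    """Single-pass Beer-Lambert absorption, ignoring light trapping."""
    d_cm = thickness_um * 1e-4
    return 1.0 - math.exp(-ALPHA_PER_CM * d_cm)

for d_um in (5, 50, 200):
    print(d_um, round(absorbed_fraction(d_um), 2))  # 0.05, 0.39, 0.86
```

Real cells recover much of the single-pass loss with textured surfaces and back reflectors, which is why a 40-to-50-micrometer cell can still approach the material's theoretical potential while a much thinner one cannot.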


For many years, researchers have tried to adapt epitaxial growth methods to make thin single-crystalline solar cells. The chip industry has been using this method for decades—in fact, modern microelectronics has been made possible by machinery that uses high-temperature vacuum chambers to deposit different forms of silicon on top of silicon wafers. (Before starting Crystal Solar, chief technology officer K. V. Ravi was the director of renewables and environment at Applied Materials, one of the world's biggest suppliers of semiconductor manufacturing equipment—including equipment used to grow various forms of epitaxial silicon for computer chips, display electronics, and solar cells.)

But the epitaxial method hasn't been workable for making thin-film single-crystalline solar cells—the kind with the highest performance. To make the process work for single-crystalline solar cells, Crystal Solar had to remake the processing equipment from the ground up.

Crystal Solar says it has now made the process practical. The semiconductor industry utilizes only about 5 percent of the silicon in trichlorosilane gas. Ravi says Crystal Solar's equipment uses 60 to 70 percent of the silicon, and can make a solar cell 20 times faster than conventional epitaxial growth equipment can. Academic labs have made similar efficiency demonstrations with very small test cells that have never been scaled up; Crystal Solar has made standard-size solar cells with its process.
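The utilization figures imply a large gap in how far the feedstock gas goes (our arithmetic, not the article's):

```python
# Our own arithmetic based on the figures quoted in the article.
conventional_utilization = 0.05       # semiconductor-industry epitaxy
crystal_solar_range = (0.60, 0.70)    # Crystal Solar's claimed range

low = crystal_solar_range[0] / conventional_utilization
high = crystal_solar_range[1] / conventional_utilization
print(round(low), round(high))  # 12 14 -- a 12- to 14-fold better use of the gas
```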

Making these thin, high-quality silicon films is one thing, but handling them is quite another. Ravi says the company has also developed equipment to handle, finish, and package the thin silicon sheets to make solar cells, though it is not disclosing details on how it does this.

The company has been in stealth mode since its founding in 2008, and Ravi would not name the sources that have provided Crystal Solar with unspecified tens of millions of dollars in two rounds of funding. He says the company does not intend to become an equipment manufacturer, but will partner with another company to make and sell panels.

TechnologyReview

TStzmmalaysia
post Feb 24 2011, 07:34 AM


NANOTECHNOLOGY

Nano-sized vaccines

The new particles, described in the Feb. 20 issue of Nature Materials, consist of concentric fatty spheres that can carry synthetic versions of proteins normally produced by viruses. These synthetic particles elicit a strong immune response — comparable to that produced by live virus vaccines — but should be much safer, says Darrell Irvine, author of the paper and an associate professor of materials science and engineering and biological engineering.

Such particles could help scientists develop vaccines against cancer as well as infectious diseases. In collaboration with scientists at the Walter Reed Army Institute of Research, Irvine and his students are now testing the nanoparticles’ ability to deliver an experimental malaria vaccine in mice.

Vaccines protect the body by exposing it to an infectious agent that primes the immune system to respond quickly when it encounters the pathogen again. In many cases, such as with the polio and smallpox vaccines, a dead or disabled form of the virus is used. Other vaccines, such as the diphtheria vaccine, consist of a synthetic version of a protein or other molecule normally made by the pathogen.

When designing a vaccine, scientists try to provoke at least one of the human body’s two major players in the immune response: T cells, which attack body cells that have been infected with a pathogen; or B cells, which secrete antibodies that target viruses or bacteria present in the blood and other body fluids.

For diseases in which the pathogen tends to stay inside cells, such as HIV, a strong response from a type of T cell known as the “killer” T cell is required. The best way to provoke these cells into action is to use a killed or disabled virus, but that cannot be done with HIV because it’s difficult to render the virus harmless.

To get around the danger of using live viruses, scientists are working on synthetic vaccines for HIV and other viral infections such as hepatitis B. However, these vaccines, while safer, do not elicit a very strong T cell response. Recently, scientists have tried encasing the vaccines in fatty droplets called liposomes, which could help promote T cell responses by packaging the protein in a virus-like particle. However, these liposomes have poor stability in blood and body fluids.

MIT engineers created vaccine-delivering nanoparticles by placing lipid spheres inside one another. Credit: Peter DeMuth and James Moon

Irvine, who is a member of MIT’s David H. Koch Institute for Integrative Cancer Research, decided to build on the liposome approach by packaging many of the droplets together in concentric spheres. Once the liposomes are fused together, adjacent liposome walls are chemically “stapled” to each other, making the structure more stable and less likely to break down too quickly following injection. However, once the nanoparticles are absorbed by a cell, they degrade quickly, releasing the vaccine and provoking a T cell response.

In tests with mice, Irvine and his colleagues used the nanoparticles to deliver a protein called ovalbumin, an egg-white protein commonly used in immunology studies because biochemical tools are available to track the immune response to this molecule. They found that three low-dose immunizations of the vaccine produced a strong T cell response — after immunization, up to 30 percent of all killer T cells in the mice were specific to the vaccine protein.

That is one of the strongest T cell responses generated by a protein vaccine, and comparable to strong viral vaccines, but without the safety concerns of live viruses, says Irvine. Importantly, the particles also elicit a strong antibody response. Niren Murthy, associate professor at Georgia Institute of Technology, says the new particles represent “a fairly large advance,” though he says that more experiments are needed to show that they can elicit an immune response against human disease, in human subjects. “There’s definitely enough potential to be worth exploring it with more sophisticated and expensive experiments,” he says.

In addition to the malaria studies with scientists at Walter Reed, Irvine is also working on developing the nanoparticles to deliver cancer vaccines and HIV vaccines. Translation of this approach to HIV is being done in collaboration with colleagues at the Ragon Institute of MIT, Harvard and Massachusetts General Hospital. The institute, which funded this study along with the Gates Foundation, Department of Defense and National Institutes of Health, was established in 2009 with the goal of developing an HIV vaccine.

PhysOrg

TStzmmalaysia
post Feb 24 2011, 07:37 AM


RESEARCH

Gas rich galaxies confirm prediction of modified gravity theory

Recent data for gas rich galaxies precisely match predictions of a modified theory of gravity known as MOND, according to a new analysis by University of Maryland Astronomy Professor Stacy McGaugh. This -- the latest of several successful MOND predictions -- raises new questions about the accuracy of the reigning cosmological model of the universe, writes McGaugh in a paper to be published in March in Physical Review Letters.

Modern cosmology says that for the universe to behave as it does, the mass-energy of the universe must be dominated by dark matter and dark energy. However, direct evidence for the existence of these invisible components remains lacking. An alternate, though unpopular, possibility is that the current theory of gravity does not suffice to describe the dynamics of cosmic systems.

A few theories that would modify our understanding of gravity have been proposed. One of these is Modified Newtonian Dynamics (MOND), which was hypothesized in 1983 by Moti Milgrom, a physicist at the Weizmann Institute of Science in Rehovot, Israel. One of MOND's predictions specifies the relative relationship between the mass of any galaxy and its flat rotation velocity. However, uncertainties in the estimates of the masses of stars in star-dominated spiral galaxies (such as our own Milky Way) previously had precluded a definitive test.

To avoid this problem, McGaugh examined gas rich galaxies, which have relatively fewer stars and a preponderance of mass in the form of interstellar gas. "We understand the physics of the absorption and release of energy by atoms in the interstellar gas, such that counting photons is like counting atoms. This gives us an accurate estimate of the mass of such galaxies," McGaugh said.

Using recently published work that he and other scientists had done to determine both the mass and flat rotation velocity of many gas rich galaxies, McGaugh compiled a sample of 47 of these and compared each galaxy's mass and rotation velocity with the relationship expected by MOND. All 47 galaxies fell on or very close to the MOND prediction. No dark matter model performed as well.
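The mass–velocity relationship MOND predicts is the baryonic Tully–Fisher relation, M = v⁴/(G·a₀). A minimal sketch, using the commonly quoted value of Milgrom's acceleration constant a₀ ≈ 1.2 × 10⁻¹⁰ m/s² (a value not given in the article):

```python
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
A0 = 1.2e-10       # Milgrom's acceleration constant, m/s^2 (commonly quoted)
M_SUN = 1.989e30   # solar mass, kg

def mond_baryonic_mass(v_flat_kms: float) -> float:
    """Baryonic mass implied by MOND's relation M = v_flat^4 / (G * a0)."""
    v = v_flat_kms * 1e3  # convert km/s to m/s
    return v ** 4 / (G * A0)

# A galaxy with a flat rotation speed of 100 km/s:
print(f"{mond_baryonic_mass(100) / M_SUN:.2e} solar masses")  # ~6.3e+09
```

The steep fourth-power dependence is what makes the test sharp: doubling the rotation speed predicts sixteen times the baryonic mass, so 47 galaxies landing on the line is hard to achieve by accident.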

"I find it remarkable that the prediction made by Milgrom over a quarter century ago performs so well in matching these findings for gas rich galaxies," McGaugh said.

MOND vs. Dark Matter - Dark Energy

Almost everyone agrees that on scales of large galaxy clusters and up, the Universe is well described by dark matter - dark energy theory. However, according to McGaugh this cosmology does not account well for what happens at the scales of galaxies and smaller.

"MOND is just the opposite," he said. "It accounts well for the 'small' scale of individual galaxies, but MOND doesn't tell you much about the larger universe."

Of course, McGaugh said, one can start from the assumption of dark matter and adjust its models for smaller scales until they fit the current finding. "This is not as impressive as making a prediction ahead of [new findings], especially since we can't see dark matter. We can make any adjustment we need. This is rather like fitting planetary orbits with epicycles," he said. Epicycles were erroneously used by the ancient Greek scientist Ptolemy to explain observed planetary motions within the context of a theory of the universe that placed the Earth at its center.

"If we're right about dark matter, why does MOND work at all?" asks McGaugh. "Ultimately, the correct theory - be it dark matter or a modification of gravity - needs to explain this."

Article from PhysOrg

More information: Preprint of original paper on arXiv.org

Read more about dark energy and dark matter on this NASA page.
TStzmmalaysia
post Feb 24 2011, 07:38 AM


RESEARCH

'Fingerprints' match molecular simulations with reality

A theoretical technique developed at the Department of Energy's Oak Ridge National Laboratory is bringing supercomputer simulations and experimental results closer together by identifying common "fingerprints."

ORNL's Jeremy Smith collaborated on devising a method -- dynamical fingerprints -- that reconciles the different signals between experiments and computer simulations to strengthen analyses of molecules in motion. The research will be published in the Proceedings of the National Academy of Sciences.

"Experiments tend to produce relatively simple and smooth-looking signals, as they only 'see' a molecule's motions at low resolution," said Smith, who directs ORNL's Center for Molecular Biophysics and holds a Governor's Chair at the University of Tennessee. "In contrast, data from a supercomputer simulation are complex and difficult to analyze, as the atoms move around in the simulation in a multitude of jumps, wiggles and jiggles. How to reconcile these different views of the same phenomenon has been a long-standing problem."

The new method solves the problem by calculating peaks within the simulated and experimental data, creating distinct "dynamical fingerprints." The technique, conceived by Smith's former graduate student Frank Noe, now at the Free University of Berlin, can then link the two datasets.
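The peak-matching idea can be sketched in a toy form. This is only an illustration of the "fingerprint" concept — the published method derives relaxation-time spectra from a much more sophisticated analysis of the simulation — and every number below is made up:

```python
# Toy illustration only: represent each dataset as peaks in a relaxation-time
# spectrum, (timescale, amplitude) pairs, and pair up simulation peaks with
# experimental peaks at similar timescales.

def match_peaks(sim_peaks, exp_peaks, rel_tol=0.2):
    """Pair simulation peaks with experimental peaks of similar timescale."""
    matches = []
    for t_sim, amp_sim in sim_peaks:
        for t_exp, amp_exp in exp_peaks:
            if abs(t_sim - t_exp) / t_exp <= rel_tol:
                matches.append((t_sim, t_exp))
    return matches

# Timescales in nanoseconds (made-up numbers):
simulation = [(0.5, 1.0), (12.0, 0.4), (310.0, 0.1)]
experiment = [(11.0, 0.6), (300.0, 0.2)]
print(match_peaks(simulation, experiment))  # [(12.0, 11.0), (310.0, 300.0)]
```

In this toy picture, the simulation peak at 0.5 ns has no experimental partner — exactly the kind of "unseen motion" that, per Smith, tells you which other experiments should be done.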

Supercomputer simulations and modeling capabilities can add a layer of complexity missing from many types of molecular experiments.

"When we started the research, we had hoped to find a way to use computer simulation to tell us which molecular motions the experiment actually sees," Smith said. "When we were finished we got much more -- a method that could also tell us which other experiments should be done to see all the other motions present in the simulation. This method should allow major facilities like the ORNL's Spallation Neutron Source to be used more efficiently."

Combining the power of simulations and experiments will help researchers tackle scientific challenges in areas like biofuels, drug development, materials design and fundamental biological processes, which require a thorough understanding of how molecules move and interact.

"Many important things in science depend on atoms and molecules moving," Smith said. "We want to create movies of molecules in motion and check experimentally if these motions are actually happening."

"The aim is to seamlessly integrate supercomputing with the Spallation Neutron Source so as to make full use of the major facilities we have here at ORNL for bioenergy and materials science development," Smith said.

PhysOrg

Provided by Oak Ridge National Laboratory

TStzmmalaysia
post Feb 24 2011, 07:40 AM


ENERGY

New stretchable solar cells will power artificial electronic 'super skin'

Ultrasensitive electronic skin developed by Stanford researcher Zhenan Bao is getting even better. Now she's demonstrated that it can detect chemicals and biological molecules, in addition to sensing an incredibly light touch. And it can now be powered by a new, stretchable solar cell she's developed in her lab, opening up more applications in clothing, robots, prosthetic limbs and more.

"Super skin" is what Stanford researcher Zhenan Bao wants to create. She's already developed a flexible sensor that is so sensitive to pressure it can feel a fly touch down. Now she's working to add the ability to detect chemicals and sense various kinds of biological molecules. She's also making the skin self-powering, using polymer solar cells to generate electricity. And the new solar cells are not just flexible, but stretchable – they can be stretched up to 30 percent beyond their original length and snap back without any damage or loss of power.

Super skin, indeed.

"With artificial skin, we can basically incorporate any function we desire," said Bao, a professor of chemical engineering. "That is why I call our skin 'super skin.' It is much more than what we think of as normal skin."

The foundation for the artificial skin is a flexible organic transistor, made with flexible polymers and carbon-based materials. To allow touch sensing, the transistor contains a thin, highly elastic rubber layer, molded into a grid of tiny inverted pyramids. When pressed, this layer changes thickness, which changes the current flow through the transistor. The sensors have from several hundred thousand to 25 million pyramids per square centimeter, corresponding to the desired level of sensitivity.
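The pressure-to-current idea can be illustrated with a toy model. This is our own assumption for illustration, not Bao's published device physics — the stiffness, current scale, and linear response below are all invented:

```python
# Toy model -- our own assumption, not Bao's published device physics. Pressing
# the pyramidal rubber layer compresses it, and the transistor current is taken
# to rise linearly with that compression until the layer is fully squashed.

def sensor_current(pressure_pa: float,
                   stiffness_pa: float = 1e5,
                   i_max_ua: float = 10.0) -> float:
    """Current (in microamps) for a linear-elastic compression model."""
    strain = min(pressure_pa / stiffness_pa, 1.0)  # fractional compression
    return i_max_ua * strain

print(sensor_current(1_000))   # light touch -> small current
print(sensor_current(50_000))  # firm press  -> larger current
```

The pyramid density then sets how much of the layer deforms under a given load, which is how the researchers trade off sensitivity against range.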

To sense a particular biological molecule, the surface of the transistor has to be coated with another molecule to which the first one will bind when it comes into contact. The coating layer only needs to be a nanometer or two thick.

"Depending on what kind of material we put on the sensors and how we modify the semiconducting material in the transistor, we can adjust the sensors to sense chemicals or biological material," she said.

Bao's team has successfully demonstrated the concept by detecting a certain kind of DNA. The researchers are now working on extending the technique to detect proteins, which could prove useful for medical diagnostics purposes.

"For any particular disease, there are usually one or more specific proteins associated with it – called biomarkers – that are akin to a 'smoking gun,' and detecting those protein biomarkers will allow us to diagnose the disease," Bao said.

The same approach would allow the sensors to detect chemicals, she said. By adjusting aspects of the transistor structure, the super skin can detect chemical substances in either vapor or liquid environments.

Regardless of what the sensors are detecting, they have to transmit electronic signals to get their data to the processing center, whether it is a human brain or a computer.

Having the sensors run on the sun's energy makes generating the needed power simpler than using batteries or hooking up to the electrical grid, allowing the sensors to be lighter and more mobile. And having solar cells that are stretchable opens up other applications.

A recent research paper by Bao, describing the stretchable solar cells, will appear in an upcoming issue of Advanced Materials. The paper details the ability of the cells to be stretched in one direction, but she said her group has since demonstrated that the cells can be designed to stretch along two axes.

The cells have a wavy microstructure that extends like an accordion when stretched. A liquid metal electrode conforms to the wavy surface of the device in both its relaxed and stretched states.

"One of the applications where stretchable solar cells would be useful is in fabrics for uniforms and other clothes," said Darren Lipomi, a postdoctoral fellow in Bao's lab and lead author of the paper.

"There are parts of the body, at the elbow for example, where movement stretches the skin and clothes," he said. "A device that was only flexible, not stretchable, would crack if bonded to parts of machines or of the body that extend when moved." Stretchability would be useful in bonding solar cells to curved surfaces without cracking or wrinkling, such as the exteriors of cars, lenses and architectural elements.

The solar cells continue to generate electricity while they are stretched out, producing a continuous flow of electricity for data transmission from the sensors.

Bao said she sees the super skin as much more than a super mimic of human skin; it could allow robots or other devices to perform functions beyond what human skin can do.

"You can imagine a robot hand that can be used to touch some liquid and detect certain markers or a certain protein that is associated with some kind of disease and the robot will be able to effectively say, 'Oh, this person has that disease,'" she said. "Or the robot might touch the sweat from somebody and be able to say, 'Oh, this person is drunk.'"

Finally, Bao has figured out how to replace the materials used in earlier versions of the transistor with biodegradable materials. Now, not only will the super skin be more versatile and powerful, it will also be more eco-friendly.

Chemical engineering Professor Zhenan Bao presented her work on Feb. 20 at the AAAS annual meeting in Washington, D.C.

PhysOrg

TStzmmalaysia
post Feb 24 2011, 07:41 AM


APPLIED SCIENCES

The First Full-Color Display with Quantum Dots

Researchers at Samsung Electronics have made the first full-color display that uses quantum dots. Quantum-dot displays promise to be brighter, cheaper, and more energy-efficient than those found in today's cell phones and MP3 players.

Samsung's four-inch diagonal display is controlled using an active matrix, which means each of its color quantum-dot pixels is turned on and off with a thin-film transistor. The researchers have made the prototype on glass as well as on flexible plastic, as reported in Nature Photonics this week. "We have converted a scientific challenge into a real technological achievement," says Jong Min Kim, a fellow at the Samsung Advanced Institute of Technology.

Quantum dots are semiconductor nanocrystals that glow when exposed to current or light. They emit different colors depending on their size and the material they're made from. Their bright, pure colors and low power consumption make them very appealing for displays. Most computer monitors and TVs use power-hungry liquid-crystal displays (LCDs). Organic light-emitting diode (OLED) displays are more brilliant and energy-efficient, but are confined to small gadgets because they are too expensive for TV screens, and their organic materials have limited lifetimes.

Quantum-dot displays would consume less than a fifth of the power of LCDs, says Samsung researcher Tae-Ho Kim. They promise to be brighter and longer-lasting than OLEDs. What's more, they could be manufactured for less than half of what it costs to make LCD or OLED screens.

To make their prototype, the Samsung researchers start by coating a solution of quantum dots on a silicon plate and evaporating the solvent. Then they gently press a rubber stamp with a ridged surface into the quantum-dot layer, peel it off, and then press it on the desired glass or plastic substrate. This transfers stripes of quantum dots onto the substrate.

In a color display, each pixel contains red, green, and blue subpixels, and these colors are combined in varying intensities to produce millions of colors. By repeating their stamping technique, the researchers can create a regular pattern of red, green, and blue stripes.
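With the common display assumption of 8 bits of intensity per subpixel (not stated in the article), "millions of colors" works out as:

```python
# Common display assumption (not stated in the article): 8 bits, i.e. 256
# intensity levels, per red, green and blue subpixel.
levels = 256
colors = levels ** 3
print(f"{colors:,}")  # 16,777,216 distinct colors
```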

They transfer the stripes directly onto an array of thin-film transistors. The transistors are made of amorphous hafnium-indium-zinc oxide, which provides higher, more stable current than conventional amorphous-silicon transistors. The resulting display has subpixels that are about 50 micrometers wide and 100 micrometers long, small enough for use in cell-phone screens.

"This is a powerful demonstration," says Seth Coe-Sullivan, cofounder and chief technology officer of QD Vision. "The individual technology elements aren't necessarily new. Samsung definitely did a lot of good engineering to put all the pieces together in an impressive way."

He cautions, though, that there are many more research and engineering issues to be solved, and that quantum-dot displays are still at least three years away from commercialization. The best quantum-dot devices are still not as power-efficient as OLEDs. They also need to last longer—right now, they start losing their brightness after about 10,000 hours. Finally, researchers will have to develop ways to manufacture them at low cost and large scale.

TechnologyReview

TStzmmalaysia
post Feb 24 2011, 07:42 AM


BIOTECHNOLOGY

Producing clean water in an emergency

Disasters such as floods, tsunamis, and earthquakes often result in the spread of diseases like gastroenteritis, giardiasis and even cholera because of an immediate shortage of clean drinking water. Now, chemistry researchers at McGill University have taken a key step towards making a cheap, portable, paper-based filter coated with silver nanoparticles to be used in these emergency settings.

"Silver has been used to clean water for a very long time. The Greeks and Romans kept their water in silver jugs," says Prof. Derek Gray, from McGill's Department of Chemistry. But though silver is used to get rid of bacteria in a variety of settings, from bandages to antibacterial socks, no one has used it systematically to clean water before. "It's because it seems too simple," affirms Gray.

Prof. Gray's team, which included graduate student Theresa Dankovich, coated thick (0.5 mm), hand-sized sheets of an absorbent porous paper with silver nanoparticles and then poured live bacteria through it. "Viewed in an electron microscope, the paper looks as though there are silver polka dots all over," says Dankovich, "and the neat thing is that the silver nanoparticles stay on the paper even when the contaminated water goes through." The results were definitive. Even when the paper contains a small quantity of silver (5.9 mg of silver per dry gram of paper), the filter is able to kill nearly all the bacteria and produce water that meets the standards set by the U.S. Environmental Protection Agency (EPA).

The filter is not envisaged as a routine water purification system, but as a way of providing rapid small-scale assistance in emergency settings. "It works well in the lab," says Gray, "now we need to improve it and test it in the field."

The research was funded by the National Sciences and Engineering Council of Canada (NSERC) and the work is part of the NSERC Sentinel Bioactive Paper Network.

EurekAlert

TStzmmalaysia
post Feb 25 2011, 07:35 AM


BIOTECHNOLOGY

3D bio-printers to print skin and body parts

The range of uses for three-dimensional printers is increasing all the time, but now scientists are developing 3D "bioprinters" that will be able to print out skin, cartilage, bone, and other body parts.

3D printers build objects by depositing material line by line and then, vertically, layer by layer. They have been used to make and repair sculptures, to create three-dimensional objects out of plastics and polymers, and even to print food.

Professor James Yoo, from the Institute of Regenerative Medicine at Wake Forest University in Winston-Salem, North Carolina, told the annual meeting of the American Association for the Advancement of Science (AAAS) that his group is developing a system that will allow them to print skin directly onto burn wounds.

Yoo’s team were motivated to develop a portable bioprinting system by the injuries arising on battlefields in Iraq and Afghanistan, where around 30 percent of injuries involve the skin. Their research is funded by the US Department of Defense.

The bioprinter has a built-in laser scanner that scans the wound and determines its depth and area. The scan is converted into three-dimensional digital images that enable the device to calculate how many layers of skin cells need to be printed on the wound to restore it to its original configuration. The system has successfully printed skin patches 10 cm square on a pig.
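The layer calculation the article describes can be sketched as follows. The layer thickness and the depth map are hypothetical illustration values, not the system's actual parameters:

```python
# Hypothetical sketch of the layer calculation described in the article: from a
# scanned depth map of the wound, work out how many printed cell layers each
# spot needs to restore the original surface.
LAYER_THICKNESS_MM = 0.1  # assumed thickness of one printed layer

def layers_needed(depth_map_mm):
    """Map each scanned wound depth to a number of print layers."""
    return [[round(depth / LAYER_THICKNESS_MM) for depth in row]
            for row in depth_map_mm]

scan = [[0.0, 0.2, 0.3],
        [0.1, 0.5, 0.4]]
print(layers_needed(scan))  # [[0, 2, 3], [1, 5, 4]]
```

The real device presumably works from a much finer 3D scan, but the principle is the same: deeper regions of the wound receive more passes of the print head.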

Also at the AAAS meeting was the director of Cornell University’s Computational Synthesis Laboratory, Professor Hod Lipson, who demonstrated a bioprinter by printing an ear, working from a scan of a human ear and a computer file containing the three-dimensional coordinates. The ear was printed using silicone gel instead of real human ear cells.

The Cornell team has already published results on their experiments to bioprint repairs to damaged animal bones, but Professor Lipson said there were a number of technical challenges still to overcome. He said the first use is likely to be repairs to cartilage, since it has a fairly simple internal structure with little vascularization.

Bioprinting cartilage has been tried "fairly successfully" in animal models, and the team have successfully printed cartilage cells directly into the meniscus of an injured knee to reconstruct it.

One of the major challenges to be faced in bioprinting is the connection between the bioprinted material and the rest of the body, especially with larger tissues, since any organ or body part that is printed will need to be connected to the body’s blood vessels, and this can be very difficult. Regardless of the challenges, Professor Lipson believes bioprinting will become a standard technique within a couple of decades.

PhysOrg
TStzmmalaysia
post Feb 25 2011, 07:39 AM


NANOTECHNOLOGY

Tiny silicon-oxygen-based polyhedron enters cellular nuclei to light them up selectively

Nuclei are complex, well-defined organelles carrying genetic information that is critical to the cell. Visualizing these organelles through fluorescence imaging techniques promises to reveal the mechanisms that govern genetic information and provide ways to predict and treat genetic diseases. Working closely with Xinhai Zhang at the A*STAR Institute of Materials Research and Engineering, a research team led by Bin Liu at the National University of Singapore has now developed a method to create ultrasmall, highly selective fluorescent nanoprobes for a cellular nucleus imaging technique known as two-photon excited fluorescence (TPEF) microscopy.

Researchers have proposed a number of fluorescent substances to illuminate nuclei within cells. However, light-induced phenomena, such as cellular autofluorescence and severe photodamage, tend to degrade the performance of these probes.

In the TPEF technique, each nanoprobe produces a fluorescent signal by absorbing not one but two low-energy photons of near-infrared light. This two-photon process significantly reduces the effects of photodamage and cellular autofluorescence while enhancing resolution, making TPEF advantageous over traditional one-photon fluorescence microscopy.

“TPEF imaging is more powerful than one-photon imaging, in particular for in vivo and tissue imaging where strong biological autofluorescence exists,” says Zhang.

Instead of a traditional step-by-step synthesis, the researchers adopted a ‘bottom-up’ approach to synthesize the nanoprobes for their TPEF scheme. These nanoprobes consist of tiny inorganic silicon–oxygen cages surrounded by short positively charged polymer chains. The team obtained cages and chains separately before joining them together, and the synthesis lends itself well to producing TPEF nanoprobes with various light-emission colors and bio-recognition capabilities.

The small, rigid cages facilitate the incorporation of the probes into cellular nuclei, while the positively charged and light-sensitive chains contribute to water-solubility and optical properties. According to Liu, these features combine to ultimately produce TPEF-suitable light-up probes.

The team discovered that the fluorescence of the probes became substantially more intense upon exposure to nucleic acids, such as double-stranded DNA and RNA. This is because the positively charged probes bind tightly to the negatively charged nucleic acids through attractive electrostatic interactions, increasing the micro-environmental hydrophobicity of the probes and their fluorescence. Furthermore, the probes selectively stained the nuclei of breast cancer and healthy cells with low toxicity.

The researchers are currently expanding their probe collection to include other intracellular targets. They are also further optimizing the TPEF performance of the probes. “These nanoprobes can open up new ways of interrogating biological systems in a high-contrast and safe fashion,” says Zhang.

PhysOrg

TStzmmalaysia
post Feb 25 2011, 07:41 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


APPLIED SCIENCES

Attached Image

RainPerfect: Rainwater Recycling System With a Solar-Powered Pump

Disconnect that garden hose from your faucet! We’ve found a much more eco-friendly alternative that harnesses the power of the sun and the gift of rain to store life-giving water for your plants.

RainPerfect is a solar-powered pump system that collects seasonal rainwater in a barrel and pumps it out using a NiMH battery charged by a small 3.5 W solar panel. With 15 feet of wire, the solar panel can soak up the sun on a nearby wall or fence or on the ground, and each charge can pump up to 100 gallons at a maximum pressure of 13 pounds per square inch.

The RainPerfect pump and solar panel install easily and provide plenty of pressure to run water through a garden hose. The pump provides enough pressure to run most low pressure sprinklers, wash a car or water just about anything around your home.

A convenient solar panel captures natural energy from the sun, eliminating the need for mains electricity to charge the battery. This makes the RainPerfect pump ready to go anywhere, anytime, at no additional utility cost to you.

The pump provides pressurized pumping through a garden hose (13 PSI).

It runs on a solar-rechargeable NiMH battery for operation anytime, day or night. It also connects to all standard garden hoses, adapts to most styles of rain barrel, and can pump up to 100 gallons of water on a single charge.
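As a rough sanity check on those figures (using the article's numbers and standard unit conversions; the pump's actual efficiency is not published), the hydraulic energy involved is modest:

```python
# Sanity check (illustrative, not manufacturer data): hydraulic energy needed
# to move 100 gallons at the pump's rated 13 psi, vs a small NiMH pack.

PSI_TO_PA = 6894.76          # pascals per psi
GAL_TO_M3 = 0.00378541       # cubic metres per US gallon

pressure_pa = 13 * PSI_TO_PA            # ~89.6 kPa
volume_m3 = 100 * GAL_TO_M3             # ~0.379 m^3

hydraulic_energy_j = pressure_pa * volume_m3   # E = P * V
hydraulic_energy_wh = hydraulic_energy_j / 3600

print(round(hydraulic_energy_wh, 1))   # ~9.4 Wh of useful hydraulic work
```

Even at a modest pump efficiency of, say, 50%, that works out to roughly 20 Wh per charge, which is plausible for a small solar-charged NiMH pack.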

Inhabitat
TStzmmalaysia
post Feb 25 2011, 07:43 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

Applied physicists discover that migrating cells flow like glass

By studying cellular movements at the level of both the individual cell and the collective group, applied physicists have discovered that migrating tissues flow very much like colloidal glass.

The research, led by investigators at Harvard's School of Engineering and Applied Sciences (SEAS) and the University of Florida, advances scientists' understanding of wound healing, cancer metastasis, and embryonic development.

The finding was published online February 14 in Proceedings of the National Academy of Sciences.

Cells often move from one part of the body to another. In a developing embryo, for example, cells in the three germ layers have to arrange themselves spatially so that the cells that will become skin are all on the outside. Similarly, as a cancerous tumor expands, the cells proliferate and push others aside. In wound healing, too, new cells have to move in to replace damaged tissue.

It is well known that cells accomplish these movements through internal cytoskeletal rearrangements that allow them to extend, retract, and divide. At some point during the migration, though, the new tissue settles into place and stops.

"We're trying to understand it from a fundamental point of view," says principal investigator David Weitz, Mallinckrodt Professor of Physics and Applied Physics at SEAS. "What we're really trying to get at is, why do things stop moving?"

The glass under discussion here is not the kind used in windows—though that is part of the larger category. Glasses include any amorphous materials that are viscous enough to remain solid for a reasonable period of time (often considered to be 24 hours) but which flow over longer periods.

Cream that is churned into butter goes through a sort of glass transition, as the increasing density of particles in the fatty emulsion forces it to become solid. Like any glass, butter will lose its form if the temperature rises.


As supercooled fluids and colloids (like cream) become more dense and approach the glass transition, the particles exhibit certain characteristic motions.


"We study this extensively," says Weitz, who leads the Experimental Soft Condensed Matter Group at SEAS. "We take small particles, and we increase their concentration more and more until they stop moving and they become a glass—and we understand how that behaves very well."


Living cells, though, add several levels of complexity to the system: they vary in size, shape, and rigidity; they divide; they sense their environment; and they exert their own forces on their surroundings.

"What is really surprising to us in this research with tissues," says Weitz, "is that many of the features that inert particles exhibit as their concentration increases are also exhibited by cells. The real qualitative difference is that small particles move only because of thermal motion, whereas cells actually move themselves."

This is an artist's representation of epithelial cells (black) approaching the glass transition (blue). Increasingly large groups of cells (green, purple, red) are able to move together more rapidly than the surrounding cells. Credit: Image courtesy of Thomas E. Angelini, University of Florida.

To simulate and study the migration of living tissue, Weitz's team deposited thousands of epithelial cells—specifically, canine kidney cells—onto a polyacrylamide gel containing the protein collagen. The researchers watched them grow and move under a microscope while measuring the individual and collective cellular movements, as well as the changes in density caused by proliferation.

The researchers found that when the cells are in a confluent layer (meaning that the cells are close enough to be touching), they flow like a liquid. However, when cell density increases past a certain threshold, the tightly packed cells begin to inhibit each other's movement. As a result, some cells are able to travel in groups, while others hardly get to move at all.

In other words, they behave just like a supercooled fluid or colloidal suspension transitioning into a glass.
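A standard way to quantify this kind of dynamic arrest is the mean-square displacement (MSD) of tracked particles or cells: in a freely flowing system the MSD keeps growing with lag time, while in a glassy one it plateaus at roughly the cage size. A toy sketch on synthetic trajectories (not the authors' analysis code):

```python
# Illustrative sketch: mean-square displacement (MSD) from 2-D trajectories,
# the standard observable for spotting glassy (arrested) dynamics.
# Synthetic data for demonstration, not the study's measurements.
import random

def msd(trajectory, lag):
    """Mean-square displacement of one 2-D trajectory at a given lag."""
    disp = [((trajectory[i + lag][0] - trajectory[i][0]) ** 2 +
             (trajectory[i + lag][1] - trajectory[i][1]) ** 2)
            for i in range(len(trajectory) - lag)]
    return sum(disp) / len(disp)

random.seed(0)

# Freely moving "cell": unbounded random walk -> MSD grows with lag.
free = [(0.0, 0.0)]
for _ in range(500):
    x, y = free[-1]
    free.append((x + random.gauss(0, 1), y + random.gauss(0, 1)))

# "Caged" cell near the glass transition: rattles inside a small cage
# -> MSD saturates at roughly the cage size.
caged = [(random.gauss(0, 0.5), random.gauss(0, 0.5)) for _ in range(501)]

print(msd(free, 5) < msd(free, 50))     # flowing: MSD keeps growing
print(msd(caged, 50) < msd(free, 50))   # caged: MSD plateaus far below
```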

"The implications for biological processes are very surprising," says lead author Thomas E. Angelini, formerly a postdoctoral researcher at SEAS and now an Assistant Professor at the University of Florida.

"Imagine a model wound in which a large group of cells are removed from the middle of a confluent layer," he says. "Cells will migrate inward to fill the void. Our results demonstrate that the low density of cells in the center of the wound is analogous to a raised temperature in the center of a molecular glass, causing flow within the hotter region."

"You could say that a wound is melted glass."

Provided by Harvard University (news : web)

PhysOrg

TStzmmalaysia
post Feb 25 2011, 09:13 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


APPLIED SCIENCES

Attached Image

Exoskeleton Tower in Cheongna City by Emergent Architecture

Tom Wiscombe’s (Emergent Architecture) design for the Cheongna City Tower is based on creating an innovative spatial, structural, and energy production device which will become an operational symbol of the future for the IFEZ Cheongna region.

Located at the intersection of the main pedestrian passageway from east to west and the main artificial waterway from north to south in Lake Park, the Tower is intended to be a hub of urban activity and a new destination for the region. It is 400 m tall and offers views of the ocean, the Incheon Airport to the west and Mt. Geyang to the east. The lower levels of the Tower contain various leisure and cultural activities such as art and design exhibition spaces, an assembly and lecture space, gift shops, and bars. The mid-levels of the Tower contain public Sky-Terraces every 50 m as well as a Business Spine which contains showroom office space for various technology companies and cultural institutions. The upper levels of the Tower contain an astronomical observatory, a seasonal high-end restaurant with star chefs, and various lookout points and observation decks.

The structure of the Tower is based on a steel exoskeleton rather than a traditional structural core model. Three main structural spines weave along the facades, varying in terms of depth, width, and rotation in response to vertical and lateral forces as well as geometrical rules set by the design team. Similar to the inside of a turtle’s shell, these spines are merged together to form a hybrid of monocoque and frame-and-skin construction types.

Evolo

Emergent Architecture



TStzmmalaysia
post Feb 25 2011, 09:14 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

Torque vectoring gears for smaller, more efficient wind turbines

Torque vectoring is a relatively new technology that has been employed in automobile differentials, most commonly all-wheel-drive vehicles, that allows the amount of power sent to each wheel to be varied. Scientists at the Technische Universitaet Muenchen (TUM) have now adapted this technology to wind turbines, to eliminate the need for converting the alternating current produced by the turbines into direct current and back again before it is fed into the grid.

As the rotational speed of the wind turbine, and thus the generator that is connected to the rotor via a gearbox, changes depending on the force of the wind, the alternating current it produces must first be rectified so that it can be fed into the grid with the correct frequency – usually 50 or 60 hertz. To accomplish this, the alternating current from the wind turbine's generators is transformed into direct current using giant rectifiers before being transformed back into alternating current of the right frequency. This twofold conversion process results in a loss of close to five percent.

To attain the desired grid frequency of 50 hertz, a generator with the usual two pole pairs must operate at a synchronous speed of exactly 1500 revolutions per minute. The scientists at TUM developed an active torque-vectoring gear, similar to a controlled differential in a motor vehicle, that lets the generator operate at this speed in spite of the variable input rotational speed of the rotor.

In the TUM system, as in conventional designs, planetary gears generate most of the transmission required, but these are supplemented by a torque-vectoring gear with a supplemental electric motor that can be used as both a drive and a generator. This motor allows the power from the rotor to be either boosted or diverted to ensure a constant rotational speed for the generator. The researchers say that an electric motor of about 80 kW is sufficient for a 1.5 MW wind turbine.
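The kinematics behind this can be sketched with the Willis equation for a planetary stage, where a small servo on one member holds the generator shaft at synchronous speed while the rotor-side shaft varies. The member assignments and tooth ratio below are illustrative assumptions, not TUM's actual gearing:

```python
# Illustrative sketch of the torque-vectoring idea (not TUM's actual design):
# in a planetary stage the shaft speeds obey the Willis equation,
#   w_sun + k * w_ring = (1 + k) * w_carrier,   k = ring/sun tooth ratio.
# Driving the ring with a small servo motor lets the sun (generator) shaft
# hold synchronous speed while the carrier (rotor-side) speed varies.

GRID_HZ = 50
POLE_PAIRS = 2
SYNC_RPM = 60 * GRID_HZ / POLE_PAIRS      # 1500 rpm for two pole pairs

K = 3.0                                   # assumed ring/sun tooth ratio

def ring_speed_for_sync(carrier_rpm, k=K, sync_rpm=SYNC_RPM):
    """Servo (ring) speed needed so the sun shaft turns at sync_rpm."""
    # Rearranged Willis equation: w_ring = ((1+k)*w_carrier - w_sun) / k
    return ((1 + k) * carrier_rpm - sync_rpm) / k

for carrier_rpm in (300, 375, 450):       # varying rotor-side speed
    w_ring = ring_speed_for_sync(carrier_rpm)
    # Plugging back in recovers exactly the synchronous speed.
    sun_rpm = (1 + K) * carrier_rpm - K * w_ring
    print(carrier_rpm, round(w_ring, 1), sun_rpm)
```

Note the sign of the servo speed flips as the rotor crosses its nominal speed: below it the servo injects power (motor), above it the servo extracts power (generator), which is why a machine far smaller than the main generator suffices.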

By doing away with the need for giant rectifiers, the TUM system results in a lighter power train that doesn't require as large a wind turbine nacelle. Also, the researchers say that because a robust, low-maintenance synchronous generator can be used, there's no need for power electronics for frequency adjustment, which results in a boost to the overall efficiency of the wind farm.

GizMag



TStzmmalaysia
post Feb 25 2011, 09:16 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


ROBOTICS

Attached Image

Automaton, Know Thyself: Robots Become Self-Aware

Robots might one day trace the origin of their consciousness to recent experiments aimed at instilling them with the ability to reflect on their own thinking.

Although granting machines self-awareness might seem more like the stuff of science fiction than science, there are solid practical reasons for doing so, explains roboticist Hod Lipson at Cornell University's Computational Synthesis Laboratory.

"The greatest challenge for robots today is figuring out how to adapt to new situations," he says. "There are millions of robots out there, mostly in factories, and if everything is in the right place at the right time for them, they are superhuman in their precision, in their power, in their speed, in their ability to work repetitively 24/7 in hazardous environments—but if a bolt falls out of place, game over."

This lack of adaptability "is the reason we don't have many robots in the home, which is much more unstructured than the factory," Lipson adds. "The key is for robots to create a model of themselves to figure out what is working and not working in order to adapt."

So, Lipson and his colleagues developed a robot shaped like a four-legged starfish whose brain, or controller, developed a model of what its body was like. The researchers started the droid off with an idea of what motors and other parts it had, but not how they were arranged, and gave it a directive to move. By trial and error, receiving feedback from its sensors with each motion, the machine used repeated simulations to figure out how its body was put together and evolved an ungainly but effective form of movement all on its own. Then "we removed a leg," and over time the robot's self-image changed and learned how to move without it, Lipson says.

Now, instead of having robots model their own bodies, Lipson and Juan Zagal, now at the University of Chile in Santiago, have developed ones that essentially reflect on their own thoughts. They achieve such thinking about thinking, or metacognition, by placing two minds in one bot. One controller was rewarded for chasing dots of blue light moving in random circular patterns and avoiding red dots as if they were poison, whereas a second controller modeled how the first behaved and whether it was successful or not.

So why might two brains be better than one? The researchers changed the rules so that chasing red dots and avoiding blue dots were rewarded instead. By reflecting on the first controller's actions, the second one could make changes to adapt to failures—for instance, it filtered sensory data to make red dots seem blue and blue dots seem red, Lipson says. In this way the robot could adapt after just four to 10 physical experiments instead of the thousands it would take using traditional evolutionary robotic techniques.
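The two-controller idea can be caricatured in a few lines of code. This toy is not the Lipson/Zagal architecture; it only illustrates the core trick: a second "mind" that watches outcomes and re-maps the senses can rescue a fixed policy after a rule change, far faster than relearning the policy itself:

```python
# Toy caricature of the two-controller ("metacognitive") setup described
# above -- NOT the actual Lipson/Zagal system, just the core idea:
# controller 1 acts; controller 2 watches outcomes and, after repeated
# failure, re-maps the senses instead of relearning the policy from scratch.

def controller1(percept):
    """Fixed policy learned under the old rules: chase blue, avoid red."""
    return "chase" if percept == "blue" else "avoid"

def reward(true_color, action, rules):
    """rules maps each color to the rewarded action."""
    return 1 if rules[true_color] == action else 0

def run(episodes, rules, sensor_filter):
    score = 0
    for true_color in episodes:
        action = controller1(sensor_filter(true_color))
        score += reward(true_color, action, rules)
    return score

episodes = ["red", "blue", "red", "blue", "red", "blue"]
new_rules = {"red": "chase", "blue": "avoid"}      # the rules were swapped

identity = lambda c: c
score_before = run(episodes, new_rules, identity)

# Controller 2 notices the failures and inserts a color-swapping sensory
# filter after a handful of trials, instead of thousands of policy updates.
swap = lambda c: {"red": "blue", "blue": "red"}[c]
score_after = run(episodes, new_rules, swap)

print(score_before, score_after)   # 0 of 6 before, 6 of 6 after
```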

"This could lead to a way to identify dangerous situations, learning from them without having to physically go through them—that's something that's been missing in robotics," says computer scientist Josh Bongard at the University of Vermont, a past collaborator of Lipson's who did not take part in this study.

Beyond robots that think about what they are thinking, Lipson and his colleagues are also exploring if robots can model what others are thinking, a property that psychologists call "theory of mind". For instance, the team had one robot observe another wheeling about in an erratic spiraling manner toward a light. Over time, the observer could predict the other's movements well enough to know where to lay a "trap" for it on the ground. "It's basically mind reading," Lipson says.

"Our holy grail is to give machines the same kind of self-awareness capabilities that humans have," Lipson says. "This research might also shed new light on the very difficult topic of our self-awareness from a new angle—how it works, why and how it developed."

One potential application they have tested for self-aware machines is with a model bridge, with sensors continuously monitoring vibrations across its frame to develop a self-image of its "body". "In simulations we've shown that it could identify weakened joints a lot sooner than via traditional civil engineering methods," Lipson says. "The bridge isn't going to suddenly wake up one day and say hello, but in a primitive sense you can say it has self-image, enough to turn on a red light if something's wrong."

A key question for this research concerns how far it can actually go. "These are very simple robots, maybe eight or a dozen moving parts, so it's relatively easy to construct models of everything. But if you scale it up, will it still be able to make a good model of self?" Bongard asks. "That question also extends to social robots observing a human or something else complex. The question of scalability is what research is examining at the moment."

Intriguingly, the research also revealed what mental illness robots might develop. For instance, the starfishlike robot that developed a body image "spontaneously developed 'phantom limb' syndrome, thinking it had arms and legs where it didn't," Lipson says. "As robots become more complex and evolve themselves, we could see the same kinds of disorders we [humans can] have appear in machines."

Lipson detailed his team's research February 19 at the annual meeting of the American Association for the Advancement of Science in Washington, D.C.

Scientific American

TStzmmalaysia
post Feb 25 2011, 09:17 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


APPLIED SCIENCES

Attached Image

Bendable Microchips: The World's First Organic Microprocessor Is Unveiled

This week at the International Solid-State Circuits Conference (you didn’t forget about the International Solid-State Circuits Conference, did you?) a team of European researchers will unveil a 4,000-transistor, 8-bit logic microprocessor with processing power equivalent to a simple silicon chip circa 1977. But this chip is different. This chip is flexible. The world’s first organic microprocessor is here.

Flexible, organic chips have long been on technology’s to-do list, but coaxing consistency out of organic transistors has been something of a chore. Organic transistors lack the monocrystalline structure of silicon, which makes their behavior somewhat unpredictable--each one can have a slightly different voltage threshold--an undesirable characteristic for a transistor to possess.

The Belgian team at nanotech researcher Imec in Leuven sorted out the problem by building an extra gate into the back of each transistor, a backdoor that allows for better control of the semiconductor's electric field, solving the switching problems usually associated with organic chips.

However, the organic microprocessor they’ve built is still not exactly busting through the upper limits of Moore’s Law; as mentioned above, it’s roughly equivalent to a 1970s-era silicon chip, hardly a game changer for the overall chip industry. But it does have its place. The tiny, flexible, 25-micrometer-thick microprocessors could lead to better and cheaper flexible displays and sensors that can be embedded into just about anything, from clothing to construction materials to foods or pharmaceuticals. This basically means everyday physical objects could harness processing power and, ultimately, a degree of intelligence.

PopSci
TStzmmalaysia
post Feb 25 2011, 09:18 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


APPLIED SCIENCES

Attached Image

Green Goose sensors monitor your life, you earn experience points

We're pretty certain that once embedded wireless sensors catch on, they'll pervade every aspect of our lives, and Green Goose is building a microcosm of that eventuality in the form of a role-playing game.

The five-person SF Bay Area startup has embedded custom 915MHz radios and MEMS accelerometers in a variety of tiny transmitters which you can mount to household objects -- like a water bottle, bicycle, or the toothbrush pictured above -- which report back to the receiver with your actions and thereby increase your score. Brush your teeth on time, take your vitamins, or exercise repeatedly within a couple hundred feet of the receiver, and you'll eventually level up.
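A hypothetical sketch of how such a tag might turn raw accelerometer readings into scoreable events (the thresholds and logic are invented for illustration, not Green Goose's firmware):

```python
# Hypothetical sketch (not Green Goose's actual firmware) of how an
# accelerometer tag could turn raw motion into scoreable events: compute
# the acceleration magnitude and count sustained bursts of activity.
import math

def count_events(samples, threshold=0.5, min_run=3):
    """Count runs of >= min_run consecutive samples whose acceleration
    magnitude deviates from 1 g (gravity at rest) by more than `threshold`."""
    events, run = 0, 0
    for ax, ay, az in samples:
        magnitude = math.sqrt(ax**2 + ay**2 + az**2)
        if abs(magnitude - 1.0) > threshold:
            run += 1
        else:
            if run >= min_run:
                events += 1
            run = 0
    if run >= min_run:
        events += 1
    return events

still = [(0.0, 0.0, 1.0)] * 10                    # tag at rest: gravity only
brushing = still + [(1.2, 0.8, 1.1)] * 5 + still  # one burst of shaking

print(count_events(still), count_events(brushing))   # 0 events, then 1 event
```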

Presently, that level isn't worth anything, but founder Brian Krejcarek says there are tentative plans to tie these points into a real game and an API to build the idea out, and he's presently looking for partner companies here at the Launch Conference in San Francisco to help roll out the sensors.

The concept of using playful games like these to encourage healthier behavior seems promising. With the integration of wireless internet into everyday objects, data like vitamin intake, nutrition levels and overall health status will be fed back to you in real time. With this information, efficiency in all parts of life could be improved quite drastically.

Engadget

TStzmmalaysia
post Feb 26 2011, 09:48 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


TRANSPORTATION

Attached Image

The world’s first wireless electric bike

Got a problem with the various gear and brake cables winding their way around your bike frame? If you're riding a standard pedal-powered bike, the answer is probably 'no.' But if you're one of the increasing numbers of people getting around town on an electric bike, then your answer may be different, with faulty wiring one of the most common sources of failures found in such vehicles. While some hide their electrical wiring away inside the frame, many e-bikes have wires running down the outside. Like so many of today's electrical devices, the new Shadow Ebike does away with this unsightly mess and potential point of weakness using wireless technology.

Toronto-based Daymak Inc. has dubbed its Shadow Ebike "the world's first wireless power-assist electric bicycle." Through the integration of ISM 2.4 GHz wireless using frequency-hopping spread spectrum technology to prevent interference, the Shadow has no brake or gear cables, and no visible electric wires running from the motor to the batteries, the controller or throttle. Turning the electric motor on or off, the magnetic regenerative brakes, the throttle and the pedal assist are all controlled wirelessly via the Daymak Drive controller.

What wiring and electronics there are, including the motor, lithium polymer battery and wireless Daymak Drive controller, are all packed inside the bike's front wheel, which is accommodated in a custom-designed fork and frame. The wheel also includes a USB port, charging port and an LED battery power display. When the brakes are applied from the wireless throttle, the regenerative braking system kicks in to send current back to the batteries, and the wheel can also be used as a generator to recharge devices via the USB port.

Daymak offers the Shadow Ebike with a 250W or 350W electric motor, and a 36V 10AH lithium-ion battery, which provides an average range of around 20 to 25 km (12 to 15 miles) running on just motor power, or around 35 to 40 km (22 to 25 miles) with pedal-assist. The included battery takes around 4-5 hours to completely recharge and is good for 750 to 800 cycles.

While the concept of a wireless bike throws up the possibility of interference from other wireless devices, or even someone hacking into the bike's controls and slamming on the brakes to send you flying over the handlebars, Daymak says that each Shadow Ebike wireless component is paired, and that the odds of it being affected by another device are less than one in a billion.

Daymak says the use of wireless technology also means the Shadow is set up for future upgrades to interact with smartphones and even PCs – possibly to give it similar remote monitoring capabilities to the PiCycle, an electric bike that could also lay claim to the title of "world's first wireless electric bicycle" with its use of Wi-Fi-based technology.

GizMag

TStzmmalaysia
post Feb 26 2011, 09:50 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


ROBOTICS

Attached Image

Robonaut 2 Arrives At International Space Station

About 264 miles above the Earth, Robonaut 2, the dexterous humanoid robot developed by General Motors and the NASA Johnson Space Center, has finally begun its first mission.

R2 has been packed aboard the space shuttle orbiter Discovery since fall 2010 and was originally scheduled to go into space in early November. The final launch of Discovery on Thursday was delayed by a combination of weather and technical issues with the orbiter. Docking with the International Space Station is expected to occur Saturday.

Just days before Discovery lifted off from the Kennedy Space Center in Florida, R2 won the “Robot of the Year” award from the popular technology website Engadget.com. R2 captured nearly 44 percent of the votes from the site’s readers, ahead of five other contenders.

Before R2 begins regular duties alongside the astronauts, it will go through a period of testing and further development. A twin to the robot on the ISS remains at the Johnson Space Center in Houston, where engineers are refining its sensing and control systems.

Along with R2, a rack with a variety of interchangeable task boards was shipped to the ISS. The astronauts and engineers will evaluate R2's performance in a range of simulated tasks while operating in the microgravity environment of space. Based on the data measured on the orbiting station, the engineers on the ground will provide updated software and hardware.

R2 will remain aboard the ISS indefinitely and if all goes well, it will eventually be used to perform mundane maintenance and service tasks. Upgraded versions of R2 could eventually perform space walks.

“GM engineers are also studying how the technology embedded within R2 can be put to use within manufacturing facilities to help create a safer working environment,” said Marty Linn, principal robotics engineer. “The dexterity and endurance of R2 can be used alongside people to help reduce repetitive stress injuries and the R2 sensing capabilities can be used in collision avoidance systems.”

DailyMarkets

TStzmmalaysia
post Feb 26 2011, 09:52 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


BIOTECHNOLOGY

Attached Image

Mouse heart 're-grows when cut', study shows

Scientists in the United States have found newborn mice can re-grow their own hearts. The mice had a large chunk of their heart removed a day after birth, only for the heart to restore itself within three weeks. Fish and amphibians are known to have the power to re-grow heart tissue, but the study in Science is the first time the process has been seen in mammals.

British experts said understanding the process could help human heart care. The researchers at the University of Texas Southwestern Medical Center surgically removed what is known as the left ventricular apex of the heart (about 15% of the heart muscle) from mice just a day after birth.

The heart was then quickly seen to regenerate and was fully restored after 21 days. After two months, the organ still appeared to be functioning normally. But when the same procedure was tested on mice aged one week, the heart failed to regenerate, suggesting this power of self-repair is extremely short-lived in mice.

The belief is that heart cells within the mouse have a narrow window after birth within which they can continue to replicate and repair. Subsequent tests suggested that these repair cells were coming from within the heart muscle.

"What our results show are that the new heart muscle cells which repair the amputated region of the heart came from proliferation and migration of pre-existing heart muscle cells," said Professor Eric Olson, who worked on the study. "We have no evidence they came from a stem-cell population." Many amphibians and fish, most famously the zebrafish, have the ability to renew heart muscle right into adulthood.

This new study suggests mammals too have such capacity for self-repair, if only for a limited time after birth. Professor Olson believes future research will show humans have a similar capacity, although no experiments involving human heart tissue are currently planned. "There's no reason to believe that the same window would not exist in the human heart. Everything we know about development and early function of the mouse heart is comparable to the human heart, so we're quite confident that this process does exist in humans, although that of course still has to be shown."

The team's focus is now on looking at ways to "re-awaken" this capacity to self repair in adult mice, with the ultimate ambition to do the same in humans to repair damage sustained during heart attacks.

"We've identified a micro-RNA (a small piece of genetic material) which regulates this process, so we're trying to use that as a way of further enhancing cardiac regeneration later in life, and we're also screening for new drugs which can re-awaken this mechanism in adult mice," he said.

Professor Jeremy Pearson, associate medical director of the British Heart Foundation, said the study showed heart regeneration was not the exclusive preserve of zebrafish and newts, but said more work needed to be done to understand what was actually going on inside the healing heart.

"This exciting research shows for the first time that young mice, like fish and amphibians, can heal their damaged hearts," he said. "It strengthens the view that understanding how this happens could provide the key to healing adult human hearts." Professor Olson concedes there will be problems ahead. What works in the low-pressured heart of a zebrafish, might not work in the high-pressured multi-chambered heart of humans. Meddling with heart muscle cells could, for instance, trigger arrhythmias in the heart, he said.

BBC Science


TStzmmalaysia
post Feb 26 2011, 09:53 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

Happy children make happy adults

New research links well-being in adolescence with life satisfaction in adulthood. Being a 'happy' teenager is linked to increased well-being in adulthood, new research finds.

Much is known about the associations between a troubled childhood and mental health problems, but little research has examined the effect of a positive childhood. For the first time, researchers from the University of Cambridge and the MRC Unit for Lifelong Health and Ageing have analysed the link between a positive adolescence and well-being in midlife.

Using information from 2776 individuals who participated in the 1946 British birth cohort study, the scientists tested associations between having a positive childhood and well-being in adulthood.

A 'positive' childhood was based on teacher evaluations of students' levels of happiness, friendship and energy at the ages of 13 and 15. A student was given a positive point for each of the following four items - whether the child was 'very popular with other children', 'unusually happy and contented', 'makes friends extremely easily' and 'extremely energetic, never tired'. Teachers also rated conduct problems (restlessness, daydreaming, disobedience, lying, etc) and emotional problems (anxiety, fearfulness, diffidence, avoidance of attention, etc).

The researchers then linked these ratings to the individuals' mental health, work experience, relationships and social activities several decades later. They found that teenagers rated positively by their teachers were significantly more likely than those who received no positive ratings to have higher levels of well-being later in life, including a higher work satisfaction, more frequent contact with family and friends, and more regular engagement in social and leisure activities.

Happy children were also much less likely than others to develop mental disorders throughout their lives – 60% less likely than young teens who had no positive ratings.

Not only did the study fail to find a link between being a happy child and an increased likelihood of marrying, it found that people who had been happy children were actually more likely to get divorced. One possible factor suggested by the researchers is that happier people have higher self-esteem or self-efficacy and are therefore more willing and able to leave an unhappy marriage.

"The benefits to individuals, families and to society of good mental health, positive relationships and satisfying work are likely to be substantial," said Professor Felicia Huppert, one of the authors of the paper and Director of the Well-being Institute at the University of Cambridge. "The findings support the view that even at this time of great financial hardship, policymakers should prioritise the well-being of our children so they have the best possible start in life."

Dr Marcus Richards, co-author of the paper from the MRC Unit for Lifelong Health and Ageing, said: "Most longitudinal studies focus on the negative impact of early mental problems, but the 1946 birth cohort also shows clear and very long-lasting positive consequences of mental well-being in childhood."

For the study, the researchers adjusted for social class of origin, childhood intelligence and education.

EurekAlert

TStzmmalaysia
post Feb 26 2011, 09:54 AM



RESEARCH


Intel's 'Thunderbolt' interconnect can transfer data at the breathtaking speed of 10Gb/s

Intel has finally launched its new peripheral interconnect technology—formerly codenamed "Light Peak"—now branded "Thunderbolt." The new interconnect is designed to bring workstation-class I/O throughput to mobile workflows as well as serve as a next-generation connector for peripherals, including displays, storage, and video and audio devices.

The proposed standard was intended to replace interconnects like FireWire, USB, and others with fiber optic connections capable of up to 100Gbps bi-directional throughput. Moving to fiber instead of copper allowed increased speeds as well as dramatically longer cable runs. The original demos used a 30m fiber optic cable to transmit dual 1080p video streams, LAN traffic, and files to an SSD RAID setup.

In its initial out-of-the-lab incarnation, Thunderbolt can use either copper or fiber connections for 10Gbps bidirectional communication. That speed is 20 times faster than the theoretical limit of USB 2.0, 12 times faster than FireWire 800, and twice as fast as USB3. According to Intel, however, the 10Gbps isn't just a theoretical peak speed, but usable bandwidth. This allows a single port to communicate with multiple devices simultaneously for a combined throughput of 10Gbps.

That 10Gbps is much faster than most current I/O technologies. With two devices pushing data at the maximum rate, you could back up a full Blu-ray movie in 30 seconds, or sync 64GB of music to a portable device in about a minute. Copying the entire contents of the Library of Congress in digital form—approximately 20TB of data—would take about four and a half hours.
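Those comparisons are straightforward line-rate arithmetic. The sketch below checks them; the file sizes are illustrative assumptions on our part, not figures given by Intel:

```python
# Back-of-envelope check of the Thunderbolt bandwidth claims.
TB_BPS = 10e9                            # usable Thunderbolt bandwidth, bits/s
USB2, FW800, USB3 = 480e6, 800e6, 5e9    # theoretical peak rates, bits/s

def transfer_seconds(size_bytes, bps=TB_BPS):
    """Ideal transfer time at a given line rate, ignoring protocol overhead."""
    return size_bytes * 8 / bps

print(TB_BPS / USB2)    # ~20.8x USB 2.0 ("20 times faster")
print(TB_BPS / FW800)   # 12.5x FireWire 800 ("12 times faster")
print(TB_BPS / USB3)    # 2.0x USB 3.0 ("twice as fast")

print(transfer_seconds(50e9))   # dual-layer Blu-ray (50 GB assumed): 40 s
print(transfer_seconds(64e9))   # 64 GB of music: ~51 s, "about a minute"
```

Real-world transfers will come in somewhat slower once protocol overhead and device speed are factored in.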

Active electrical-only cables can be up to 3 meters (just under 10 feet) in length, similar to current FireWire and USB standards. Active optical cables, which use fiber for data transmission and copper for up to 10W of power, can be "tens of meters" in length. Passive fiber-only cables could potentially be hundreds of meters long. These lengths enable more flexible positioning between devices and computers instead of relying on specialized connections or relatively pokey wireless solutions.

Because each Thunderbolt device will include a tiny Intel-made controller, similar to FireWire, multiple Thunderbolt devices can be daisy-chained to a single port and can communicate directly peer-to-peer. It doesn't require hubs like USB does, nor does it depend on the CPU to initiate and handle device communication. Also like FireWire, Thunderbolt ports can supply power to connected devices—up to 10W total per port. Furthermore, powered devices in the chain can pass 10W of power further down the chain if needed.

Thunderbolt supports both DisplayPort and PCI Express protocols over its 10Gbps transport layer. A major benefit of this design is that Thunderbolt devices can leverage native OS drivers for PCI Express and DisplayPort for compatibility—no additional drivers are needed. Existing Mini DisplayPort-equipped monitors are already compatible and can be plugged in directly, and Mini DisplayPort adapters for VGA, DVI, or HDMI will also work. Intel said that adapters can be made using a Thunderbolt controller and common PCI bridges to adapt existing FireWire, USB, eSATA, and even Ethernet connectors.

Intel's controllers handle all the necessary protocol switching between PCI Express and DisplayPort, which enables simultaneous transmission of data via both protocols over the same cable. The controllers are also optimized for extremely low-latency communication with quality-of-service support, which is critical for handling pro video and audio applications. Connected devices can be clock-synchronized to within 8 nanoseconds.

The combination of a compact, inexpensive controller with the tiny Mini DisplayPort makes Thunderbolt particularly suited to mobile computing. In particular, thin and light ultraportable laptops like the MacBook Air and Sony Vaio S series could connect to a high-performance audio controller for recording, a RAID for nearly instantaneous backups, and an external HD monitor—all from a single port.

Arstechnica

TStzmmalaysia
post Feb 26 2011, 09:55 AM



RESEARCH


New kind of optical fiber developed

A team of scientists led by John Badding, a professor of chemistry at Penn State University, has developed the very first optical fiber made with a core of zinc selenide -- a light-yellow compound that can be used as a semiconductor. The new class of optical fiber, which allows for a more effective and liberal manipulation of light, promises to open the door to more versatile laser-radar technology. Such technology could be applied to the development of improved surgical and medical lasers, better countermeasure lasers used by the military, and superior environment-sensing lasers such as those used to measure pollutants and to detect the dissemination of bioterrorist chemical agents. The team's research will be published in the journal Advanced Materials.

"It has become almost a cliché to say that optical fibers are the cornerstone of the modern information age," said Badding. "These long, thin fibers, which are three times as thick as a human hair, can transmit over a terabyte -- the equivalent of 250 DVDs -- of information per second. Still, there always are ways to improve on existing technology." Badding explained that optical-fiber technology always has been limited by the use of a glass core. "Glass has a haphazard arrangement of atoms," Badding said. "In contrast, a crystalline substance like zinc selenide is highly ordered. That order allows light to be transported over longer wavelengths, specifically those in the mid-infrared."
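Badding's DVD equivalence is easy to verify; the 4.7 GB single-layer DVD capacity below is an assumed value, not one from the article:

```python
DVD_BYTES = 4.7e9  # assumed single-layer DVD capacity, bytes

# 250 DVDs' worth of data per second, as in Badding's comparison
per_second_tb = 250 * DVD_BYTES / 1e12
print(per_second_tb)  # ~1.175 -- "over a terabyte ... per second"
```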

Unlike silica glass, which traditionally is used in optical fibers, zinc selenide is a compound semiconductor. "We've known for a long time that zinc selenide is a useful compound, capable of manipulating light in ways that silica can't," Badding said. "The trick was to get this compound into a fiber structure, something that had never been done before." Using an innovative high-pressure chemical-deposition technique developed by Justin Sparks, a graduate student in the Department of Chemistry, Badding and his team deposited zinc selenide waveguiding cores inside of silica glass capillaries to form the new class of optical fibers. "The high-pressure deposition is unique in allowing formation of such long, thin, zinc selenide fiber cores in a very confined space," Badding said.

The scientists found that the optical fibers made of zinc selenide could be useful in two ways. First, they observed that the new fibers were more efficient at converting light from one color to another. "When traditional optical fibers are used for signs, displays, and art, it's not always possible to get the colors you want," Badding explained. "Zinc selenide, using a process called nonlinear frequency conversion, is more capable of changing colors."

Second, as Badding and his team expected, they found that the new class of fiber provided more versatility not just in the visible spectrum, but also in the infrared -- electromagnetic radiation with wavelengths longer than those of visible light. Existing optical-fiber technology is inefficient at transmitting infrared light. However, the zinc selenide optical fibers that Badding's team developed are able to transmit the longer wavelengths of infrared light. "Exploiting these wavelengths is exciting because it represents a step toward making fibers that can serve as infrared lasers," Badding explained. "For example, the military currently uses laser-radar technology that can handle the near-infrared, or 2 to 2.5-micron range. A device capable of handling the mid-infrared, or over 5-micron range, would be more accurate. The fibers we created can transmit wavelengths of up to 15 microns."

Badding also explained that the detection of pollutants and environmental toxins could be yet another application of better laser-radar technology capable of interacting with light of longer wavelengths. "Different molecules absorb light of different wavelengths; for example, water absorbs, or stops, light at the wavelengths of 2.6 microns," Badding said. "But the molecules of certain pollutants or other toxic substances may absorb light of much longer wavelengths. If we can transport light over longer wavelengths through the atmosphere, we can see what substances are out there much more clearly." In addition, Badding mentioned that zinc selenide optical fibers also may open new avenues of research that could improve laser-assisted surgical techniques, such as corrective eye surgery.

Provided by Pennsylvania State University

PhysOrg

TStzmmalaysia
post Feb 26 2011, 09:57 AM



RESEARCH


Large Hadron Collider powers up to unravel mysteries of nature

Outside the small village of Meyrin, Switzerland, horses graze quietly in fields lined by the Jura mountains. You'd never know it by the idyllic landscape, but 300 feet below the Swiss-French border, the Large Hadron Collider is searching for the secrets of the universe. A 17-mile circular tunnel houses the world’s largest atom smasher that is once again firing high-energy proton beams.

On Monday, the CERN Control Center turned on the LHC beams to begin the next two-year run of the particle collider. CERN directors decided to extend the run through the end of 2012, instead of shutting down in 2011 for repairs as previously planned, and spirits are running high among scientists working in the field of new physics.

Researchers from across the world engineer detectors and seek to solve the mysteries of matter in an international collaboration that reaches from Chicago to Mumbai.

“Recently there was a convention in Chamonix,” said Georgios Choudalakis, a Greek physicist on the ATLAS experiment at the LHC. “The heads of the experiments and the director of the laboratory decided that we will take data for 2 years. And the decisive criterion for this was the sensitivity to the Higgs, so we’re optimistic.”

This comes on the heels of the news that 2011 will be the end of the run for the Tevatron, the second most powerful particle collider in the world located at the Fermi National Laboratory in Batavia. After setbacks and shutdowns, the LHC had collisions in 2010 that went even better than expected. “We made it clear even to ourselves that the page has turned,” said Choudalakis. “The energy frontier is not at the Tevatron anymore. We are cutting more ice here.”

Now, the hunt for the Higgs is on at CERN, the Conseil Européen pour la Recherche Nucléaire.

The elusive Higgs particle is, according to the theory, a fundamental building block of matter and the reason everything has mass.

“Nobody can explain where mass comes from, but we know it’s there,” said Pauline Gagnon, a French physicist at the ATLAS experiment. This conundrum is the most important question physicists have to answer, she said.

“If you think of a one pound bag of salt and you add up the weights of each grain of salt, they will logically equal one pound,” said Gagnon. But, when physicists break down atoms in this way and try to determine the weight of the pieces inside, the calculations of the weight of atomic building blocks such as quarks and electrons don’t add up, she said. Here’s where the Higgs comes in.

The proposed Higgs particle is a part of a field that permeates everything. According to theory, it is a particle’s interaction with the Higgs field that creates drag on a particle, giving it mass. Picture a business man walking through a pool in a suit. The water in the pool is like the Higgs field and, as it soaks into his clothing, it will weigh him down and he will move more slowly. He becomes massive.

Although it has been predicted as the final puzzle piece that completes the Standard Model of physics – the leading explanation for atomic interaction – the existence of the Higgs has never been proven. If such a particle exists, experiments at the LHC should be able to detect it.

“We have indirect suggestions of where it should be if the Higgs exists,” said Choudalakis. “But that is if it exists.” If it can’t be found, the Standard Model will have to be rewritten.

In the next two years, the LHC will resume collisions at 7 TeV (teraelectronvolts) between the two beams – the highest energy levels achieved in recent years. The decision at Chamonix mandates that energy in the beams will be kept at this conservative level to avoid the types of machine failures that shut down the LHC in 2008, Gagnon said.

At those energy levels, the two particle beams travel through miles of tubes surrounded by 1,700 superconducting magnets that force bunches of charged particles around the ring and through accelerators designed to increase their speed to within a tiny fraction of a percent of the speed of light.

The particle beams, each made up of hundreds of billions of protons, began making their way around a series of underground tubes Monday, guided magnetically and gaining speed and energy. The beams start out about as big around as an index finger and, within microseconds, they complete their journey through the four accelerator rings into the LHC and are compressed down to the size of a human hair.

“The beams, at 7 TeV, will have an energy which is the same kinetic energy of a 747 landing, so imagine a big airplane which is landing and smashing against a wall – this is the energy of the LHC beams,” said Mirko Pojer, an Italian and the engineer in charge of the CERN Control Center.
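Pojer's comparison can be checked against nominal LHC design parameters. The bunch count (2808), the protons per bunch (~1.15 × 10^11), and the aircraft's mass and touchdown speed are not from the article; they are typical published values, used here only for a rough sketch:

```python
EV_TO_J = 1.602e-19                 # joules per electronvolt

protons_per_beam = 2808 * 1.15e11   # nominal LHC fill: ~3.2e14 protons
proton_energy_j = 7e12 * EV_TO_J    # 7 TeV per proton, in joules

beam_energy_mj = protons_per_beam * proton_energy_j / 1e6
print(beam_energy_mj)               # ~362 MJ stored in one beam

# A landing 747: assume ~200 tonnes touching down at ~70 m/s.
ke_747_mj = 0.5 * 200e3 * 70**2 / 1e6
print(ke_747_mj)                    # ~490 MJ -- the same order of magnitude
```

The two figures land within a factor of two of each other, which is all the airplane analogy claims.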

The two beams, moving in opposite directions, collide at 4 points along the ring where the separate LHC experiments house their detectors, which gather data to analyze the collisions that occur every 25 nanoseconds.

Two of the experiments at CERN are designed to detect a possible Higgs particle. Though they have the same goal, the ATLAS and CMS detectors are designed to look at particle collisions differently.

ATLAS, the larger of the two detectors, stands 82 feet high and houses an enormous magnet system that bends the paths of charged particles after collisions in order to measure their momentum, which identifies them.

CMS, the Compact Muon Solenoid, as its name suggests, is more compact than ATLAS. The CMS detector is designed around a large coiled magnet, which creates a uniform magnetic field that is 100,000 times stronger than the Earth's. CMS measures the subatomic debris of the collisions, hunting for signs of the Higgs.

The 3,000 scientists at ATLAS and the 2,000 scientists at CMS are in a race to be the first to make the biggest scientific discovery of the century.

The next two years of operation will provide enough particle collisions for a groundbreaking discovery. If the Higgs exists, the physicists at CMS or ATLAS will see evidence of the particle. With the amount of data that the LHC will be able to gather in the next two years, scientists expect to confirm the existence or absence of the Higgs, said Gigi Rolandi, an Italian and the physics coordinator for the CMS.

These detectors are truly wonders of the modern age. Like the Acropolis or the Great Wall of China, the LHC is just as incredible a feat of engineering, though it cannot be seen as readily. “It’s a real pity that these detectors are underground,” said Choudalakis. “If they were on the surface, everybody would be very proud of what mankind has done.”

At CERN, the discovery of the fundamental building blocks of nature are just around the corner. “People are thrilled by this, and well deservedly,” said Choudalakis. “What is beautiful about science, especially on a big scale like this, is that it makes you feel like you have a little chance in your life, from your humble starting point, to touch your finger on history and leave a fingerprint on it. Imagine history as a big piece of glass. Most people don’t even get close, and you have a chance to leave your fingerprint on it. I think it’s one of the noblest missions a person can have.”

While the discovery of the Higgs would indeed be a milestone for the world of physics, and a tidy completion of the Standard Model, sometimes messy is far more interesting.

“Like we say, if we do not discover the Higgs particle, it will be even more interesting to find out what else is there in the physics and in nature which then controls the amplitudes of interactions of the particles,” said Slawomir Tkaczyk, a Polish physicist on the CMS experiment. “So, after 10 or 15 years of hard work, the most exciting times are still ahead of us.”

This story is republished courtesy of Medill Reports. Medill Reports is written and produced by graduate journalism students at Northwestern University's Medill school.

Source: Medill Reports

PhysOrg

TStzmmalaysia
post Feb 26 2011, 09:59 AM



RESEARCH


Scientists investigate the possibility of wormholes between stars

Wormholes are one of the stranger objects that arise in general relativity. Although no experimental evidence for wormholes exists, scientists predict that they would appear to serve as shortcuts between one point of spacetime and another. Scientists usually imagine wormholes connecting regions of empty space, but now a new study suggests that wormholes might exist between distant stars. Instead of being empty tunnels, these wormholes would contain a perfect fluid that flows back and forth between the two stars, possibly giving them a detectable signature.

The scientists, Vladimir Dzhunushaliev at the Eurasian National University in Kazakhstan and coauthors, have posted their investigation of the possibility of wormholes between stars on arXiv.org.

The scientists began investigating the idea of wormholes between stars when they were researching what kinds of astrophysical objects could serve as entrances to wormholes. According to previous models, some of these objects could look similar to stars.

This idea led the scientists to wonder if wormholes might exist in otherwise ordinary stars and neutron stars. From a distance, these stars would look very much like normal stars (and normal neutron stars), but they might have a few differences that could be detectable.

To investigate these differences, the researchers developed a model of an ordinary star with a tunnel at the star’s center, through which matter could move. Two stars that share a wormhole would have a unique connection, since they are associated with the two mouths of the wormhole. Because exotic matter in the wormhole could flow like a fluid between the stars, both stars would likely pulse in an unusual way. This pulsing could lead to the release of various kinds of energy, such as ultrahigh-energy cosmic rays.

For now, the difficult part is calculating exactly what kinds of oscillations are occurring, and what kind of energy is being released. This information would allow scientists to predict what a wormhole-containing star might look like from Earth, and begin searching for these otherwise normal-looking stars.

PhysOrg

TStzmmalaysia
post Feb 26 2011, 10:00 AM



ENERGY


Floating Solar Panels: Solar Installations on Water

Most of the solar energy systems on the market today bear two major weaknesses: they require vast land areas in order to be built, and the costs related to solar-cell fabrication and maintenance are high. A new technology is about to overcome these challenges and many more: floating solar power plants.

Developed by a Franco-Israeli partnership,* this innovative solar power technology introduces a new paradigm in energy production. Solar power plays a dominant role in the worldwide effort to reduce greenhouse gases; it is considered a clean and efficient source of electricity. Yet several obstacles have been undermining the expansion of this sector, and many of its actors are looking for a new approach to the markets.

A win-win Situation

Soon after the design phase was over, at the end of March 2010, the fabrication of a prototype began and the team is now aiming to launch the implementation phase in September 2011. The tests will take place at Cadarache, in the South East of France, the site having a privileged position on the French electric grid and being close to a local hydro-electric facility providing the water surface to be used for the installation of the system. It will operate on-site during a period of nine months, while assessing the system's performances and productivity through seasonal changes and various water levels. The research team members believe that by June 2012, they will have all the information required to allow the technology's entry on the market.

As even leading photovoltaic companies struggle to find land on which to install solar power plants, the project team identified the almost untouched potential of solar installations on water. The water basins on which the plants could be built are not natural reserves, tourist resorts or open sea; rather, they are industrial water basins already in use for other purposes. This ensures that the new solar plants will not have a negative impact on natural landscapes. "It's a win-win situation," declares Dr. Kassel, "since there are many water reservoirs with energy, industrial or agricultural uses that are open for energy production use."

After solving the question of space, the team also took on the problem of cost. "It sounds magical to combine sun and water to produce electricity, but we also have to prove that it carries a financial logic for the long run," explains Dr. Kassel. The developers were able to reduce the costs linked to the implementation of the technology in two ways. First, they reduced the quantity of solar cells used thanks to a mirror-based sun-concentration system, while keeping the amount of power produced steady.

Made of modules

Secondly, the team devised a creative cooling system that uses the water on which the solar panels float. Thanks to this efficient cooling method, the photovoltaic system can use silicon solar cells, which are prone to overheating and need to be cooled for the system to work correctly, instead of more expensive standard cells. The particular type of solar cell used also achieves a higher efficiency than standard ones, combining reliability with cost reduction.

Still with the aim of making the technology efficient and ready for market, the system is designed in such a way that as many identical modules as needed can be assembled on a solar platform for the power rating desired. Each module produces a standard 200 kilowatts of electricity, and more power can be achieved by simply adding more modules to the plant.
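Because capacity scales linearly in identical 200 kW modules, sizing a plant is simple arithmetic; the target ratings below are illustrative, not figures from the project:

```python
import math

MODULE_KW = 200   # nominal output of one module, kilowatts

def modules_needed(target_kw):
    """Smallest number of identical modules meeting a desired plant rating."""
    return math.ceil(target_kw / MODULE_KW)

print(modules_needed(1000))   # a 1 MW plant: 5 modules
print(modules_needed(4500))   # a 4.5 MW plant: 23 modules
```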

The team also worked on the environmental impact of the technology. The floating platform in fact acts as a breathing surface through which oxygen can reach the water. This feature ensures that sufficient oxygen will maintain the underwater life of plants and animals. Dr. Kassel adds: "One of the implementation phase's goals is to closely monitor the possible effects of this new technology on the environment with the help of specialists" and "a preliminary check shows no detrimental environmental impact on water quality, flora or fauna. Our choices of materials were always made with this concern in mind."

*The project results from a collaboration between Solaris Synergy from Israel and the EDF Group from France. EUREKA provided the supporting platform that helped strengthen the two companies' partnership. After receiving the "EUREKA label", the project, called AQUASUN, also found support from the Israeli Ministry of Industry, Trade and Labor.

ScienceDaily

TStzmmalaysia
post Feb 26 2011, 08:46 PM



TRANSPORTATION


Infinyte Marine hopes its electric i4 will be a quiet success

For many people who own lakefront property, noisy combustion-engined motorboats that leave clouds of exhaust and oil slicks in their wakes have pretty much become a given. Hopefully, however, quiet and clean-running electric watercraft may soon take over a significant portion of the pleasure-boating market. While consumers can already pre-order the planned 8-passenger solar-electric Loon pontoon boat, another option is the smaller Infinyte i4 catamaran, which began production in 2010. Its maker, Canada's Infinyte Marine, also has plans for a larger boat.

First of all, the 5-passenger i4 does indeed look kind of weird – viewed in profile, it's hard to distinguish the bow from the stern. This design reportedly allows for maximum efficiency as it moves through the water.

The 14-foot (4.3-meter) boat is propelled by twin 24 V motors, made by Mercury Marine's MotorGuide division. It manages a top speed of 8 mph (13 kph), and has an estimated runtime of ten hours – depending on use and battery type. It can be recharged from a household 240 V outlet, and also features its own onboard battery charger for getting back to shore, should you need it.
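Taken at face value, those two figures give a theoretical ceiling on range; a quick back-of-envelope sketch (actual range will be lower, since runtime depends on use and battery type, as noted above):

```python
TOP_SPEED_MPH = 8     # quoted top speed
RUNTIME_HOURS = 10    # estimated runtime on a charge

max_range_miles = TOP_SPEED_MPH * RUNTIME_HOURS
print(max_range_miles)                 # 80 miles at full throttle
print(round(max_range_miles * 1.609))  # ~129 km
```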

Passengers steer the i4 with a joystick control that displays the remaining battery life, and which allows them to pivot the boat 360 degrees on the spot. With a total weight of 710 pounds (223 kg), the company claims that it's light enough to be towed behind almost any size of vehicle.

The company also intends to produce a larger, faster, covered watercraft, called the i8. A 25-foot (7.6-meter) catamaran designed to fill the same niche as pontoon boats, it will feature seating for 10 passengers, and an estimated top speed of 20 mph (32 kph). Integrated rooftop solar panels and an optional biodiesel generator will help lengthen its battery range.

Infinyte hopes to introduce it sometime this year or next.

Gizmag

TStzmmalaysia
post Feb 26 2011, 08:47 PM



TRANSPORTATION


Eco Marine Power Developing Solar Sails to Power Ships

Traditionally wind power has enabled man to travel the seas, but in recent years, fossil fuels and nuclear power have taken over. However, Japanese company Eco Marine Power are harking back to days of yore, but this time with a renewable energy twist. The energy firm has developed rigid sails fitted with solar modules, and the system, called the Aquarius, is able to collect both solar and wind energy to power ships.

Eco Marine Power have been developing the technology for large ships such as oil tankers, but believe it could also benefit smaller vessels such as passenger ferries, tourist boats and coastal freighters. The company also hope that governments could benefit from the technology on naval vessels.

The Eco Marine Power Aquarius System will allow ships not only to utilise wind power and solar energy, but also to reduce fuel consumption and lower greenhouse gas emissions. It would enable shipping fleets the world over to reduce their CO2 footprint.

The latest design by Eco Marine allows the array of rigid sail panels to be controlled via an on-board computer system, so the solar sail panels can be positioned to best collect either solar or wind energy in variable weather conditions.

Imagine that, not only a future where oil tankers utilize wind energy, but also solar power as well. Oh, the irony.

Inhabitat

TStzmmalaysia
post Feb 27 2011, 12:56 PM



ENERGY


Modern WAT LED Lamp Powered By Life-Giving Water

Don’t let a lack of outlets spoil your perfectly planned lighting intentions. Here’s a lamp from designer Manon Leblanc that harnesses the power of water to provide lovely interior lighting – completely cord-free! Dubbed the WAT, it needs just a little H2O and a hydroelectric battery (a carbon stick coated with magnesium); the pair combine in a stellar electrochemical reaction that generates enough power to light up a series of warm-light LED strips. A simple and modern design, this is WAT green lighting is all about!

Inhabitat

TStzmmalaysia
post Feb 28 2011, 09:35 AM



RESEARCH


Vortices get organized

Exotic entities that arrange into a crystalline structure near room temperature could lead to a new approach to electronic memory.

A crystal consisting not of atoms but of exotic swirling magnetic entities, called skyrmions, has been identified near room temperature by Yoshinori Tokura of the RIKEN Advanced Science Institute, Wako, and his colleagues from several other institutes in Japan. Previous observations of a skyrmion crystal state, in transition-metal–silicide materials, have been at cryogenic temperatures below 40 kelvin. The existence of skyrmions at room temperature improves the practicality of harnessing their potential for use in novel computer memories.

Skyrmions are formed on some surfaces when the spins of the electrons—think of an arrow about which each electron rotates—collectively arrange such that they wrap around the surface of a sphere (Fig. 1). This pattern spirals in such a way that the spins on the outside point up whereas those at the core point down. This collection of spins can display many properties associated with a single particle. “A skyrmion crystal is the periodic array of these particle-like entities,” explains Tokura.

Earlier neutron-scattering experiments by other researchers identified this unusual effect in both iron–cobalt silicide and manganese silicide. Tokura and his team, however, investigated skyrmions in iron germanium. This alloy has the same cubic atomic crystal structure as iron–cobalt silicide and manganese silicide—the two materials in which skyrmions have been observed at low temperatures—but it remains in the necessary magnetic structure up to a much higher temperature.

Using a transmission electron microscope, the researchers probed the magnetization on the surface of polished layers of the iron–germanium alloy. They found tell-tale signs of skyrmions at temperatures up to 260 kelvin, particularly when they applied a small magnetic field perpendicularly to the surface.

This material also provides an excellent opportunity to investigate the stability of the skyrmion crystal, the team notes. Previous studies focused on very thin layers of material. Tokura and his team investigated the influence of film thickness and found that for thicknesses greater than the distance between skyrmions, about 75 nanometers in this case, the skyrmion crystal state is suppressed and a more conventional ferromagnetic phase starts to dominate.

Skyrmions could play an important role in the development of spintronics—using electron spin to carry information in the same way that electron charge is used in conventional electronics. “Skyrmion crystals could also be applied in memory and logic devices,” says Tokura. The advantage over conventional systems is that control is achieved using electric, rather than magnetic, fields, which is known to be more efficient.

More information: Yu, X.Z., et al. Near room-temperature formation of a skyrmion crystal in thin-films of the helimagnet FeGe. Nature Materials 10, 106–109 (2011). http://www.nature. … mat2916.html

Article from PhysOrg

Provided by RIKEN (news : web)
TStzmmalaysia
post Feb 28 2011, 09:37 AM



APPLIED SCIENCES

Attached Image

inFORM’s Providence Pedestrian Bridge will connect, vitalize, and re-imagine

inFORM’s design for a multi-purpose pedestrian bridge replaces what used to be the main artery bridge for Interstate 195. A new, more efficient I-195 bridge has since been constructed, and so Providence looks to take advantage of the prime location with a pedestrian connector.

inFORM’s bridge is more a landmark than a means of passage, what the firm calls an "urban intervention". The boardwalk design includes gardens, spaces for sculptures, a sundeck, outdoor seating and even a built-in café. The bridge will integrate with existing and planned green space along the river as well as with the existing riverwalk.

inFORM designed the Pedestrian Bridge to coincide directly with existing programmatic elements. The bridge will integrate with the perennial WaterFire events held along the Providence River, and provide space for popular onshore fishing, local street vendors, buskers, and street entertainment. It will connect the Fox Point and College Hill areas with downtown Providence and the Knowledge District, areas Providence hopes to vitalize with the Pedestrian Bridge and other future projects.

The city will build the bridge in phases as additional funds are allocated. The city commissioned the competition with the requirement that the design take advantage of the five existing granite piers left over from the previous I-195 bridge, and hopes to save much of the construction cost this way. The bridge is expected to take around 18 months to complete, aiming for completion by 2013.

Evolo


TStzmmalaysia
post Feb 28 2011, 09:38 AM



ROBOTICS

Attached Image

Bloom with a View: Robot Subs Help Researchers Study Mysterious Antarctic Sea Life

After decades of riding icebreakers in Antarctica's icy waters hoping to better understand the fragile ecosystem on and around this frigid continent, scientists have begun delegating data collection to satellite-guided robotic subs. The hope is that these sea gliders, which can dive hundreds of meters and stay in the water for months at a time, will help to unlock the secrets of phytoplankton blooms that nourish the organisms in Antarctica's Ross Sea for a few months each year before mysteriously disappearing.

There are neither green plants in Antarctica nor macro-algae in the surrounding waters, says Vernon Asper, a marine science professor at the University of Southern Mississippi's (U.S.M.) Department of Marine Science at the NASA Stennis Space Center. "Essentially, everything that eats, lives and breathes in Antarctica is fed from phytoplankton in the ocean."

These blooms, which turn the sea from deep blue to lush green, begin sometime in November in a large polynya (an opening in the ice) in the Ross Sea when the liquid water is exposed to sunlight. The heat stratifies the sea surface in this roughly 160- by 320-kilometer area. (Its expanse varies from year to year.) This allows microscopic phytoplankton to flourish.

"In January, generally, the bloom is going great, the algae is growing like crazy, and then all of a sudden it quits," Asper says. "By mid-January the algae falls out of the water column and it gets clear again. We'd like to see what is different between mid-November and mid-January." Researchers have determined that there is no appreciable difference in water nutrients, sunlight, water temperature or stratification from the time the blooms begin in November to the time they disappear in January.

Researchers have been traveling to this polynya for decades, studying the phytoplankton during ship expeditions or collecting data from moorings placed in the water. The information has been incomplete because ice floes prevent ships from traveling to the area prior to late December (the beginning of southern summer). (They have been able to determine when the blooms begin with the aid of satellite images.) "It's hard to study the onset of the bloom," says Asper, who has been making the trip regularly since the early 1990s. "If you want to study the phytoplankton from a ship, it's like missing the first acts of a play."

Interest in the phytoplankton blooms began when ships traveling to the U.S. Antarctic Program's McMurdo Station would report on how green the water was. "We're interested in understanding the ecosystem around the Southern Ocean because it's an extremely productive place," Asper says. "Given that the entire ecosystem depends on the phytoplankton, we really want to nail down what they depend on. You could stretch that and say we're interested in this for a global significance, because we want to get baseline data to study global climate change, and we want to be able to monitor how things change."

Late last year, Asper and a diverse team of colleagues from U.S.M., the University of Washington in Seattle, Old Dominion University in Norfolk, Va., the Virginia Institute of Marine Sciences, and the U.K.'s University of East Anglia turned to robot subs to help get more comprehensive readings of the phytoplankton and surrounding water. During a trip funded by the National Science Foundation, the researchers dropped two Seaglider unmanned underwater vehicles (UUVs) built by Bedford, Mass.–based iRobot Corp. into the Ross Sea polynya. Seaglider is essentially a pointy-nosed, 1.8-meter-long yellow torpedo with two rear fins having a one-meter wingspan. The tetherless Seaglider moves through the water at a speed of about 1.8 kilometers per hour, propelled by changes in buoyancy rather than by a propeller.

The GPS-guided Seagliders at times operated beneath 100 meters of ice, although the researchers generally tried to avoid directing the subs below ice where they cannot get a satellite signal for prolonged periods of time. When they are out of GPS range the gliders use deduced reckoning (aka "dead" reckoning) to calculate their position and navigate using their last known coordinates until they can resurface and reestablish a GPS link. The deepest dives were about 500 meters and lasted no more than three hours at a time.
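The dead-reckoning step described above can be illustrated with a simple position update. This is a toy sketch, not iRobot's navigation code; the function name and the constant-heading, constant-speed assumption (which ignores currents, a real source of drift for gliders) are mine.

```python
import math

def dead_reckon(last_fix, heading_deg, speed_mps, dt_s):
    """Estimate a new (x, y) position in meters from the last known GPS fix,
    assuming constant heading and speed since that fix.
    Heading convention: 0 deg = north (+y), 90 deg = east (+x)."""
    x, y = last_fix
    theta = math.radians(heading_deg)
    x += speed_mps * dt_s * math.sin(theta)  # eastward displacement
    y += speed_mps * dt_s * math.cos(theta)  # northward displacement
    return (x, y)

# A glider heading due east at 0.5 m/s (about 1.8 km/h) for one hour
# ends up roughly 1.8 km east of its last fix:
pos = dead_reckon((0.0, 0.0), 90.0, 0.5, 3600)
```

Real gliders accumulate error between fixes, which is why the subs resurface to reestablish a GPS link as soon as they clear the ice.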

The scientists are still parsing the data collected by the gliders' sensors. A conductivity, temperature and depth (CTD) sensor was used to measure water temperature and salinity, which will help researchers calculate the water's density and its potential to mix with water flowing in from other locations. Researchers are hoping that the Seagliders' three optical sensors will shed light on the water's biological properties, including the amount of oxygen and the presence of chlorophyll.

Asper and his colleagues were not the only researchers in the area using a UUV to study the phytoplankton blooms. A team led by Rutgers University oceanography professor Josh Kohut deployed an autonomous glider of its own design called the RU26 to track very small changes in the polynya's temperature and salinity. The RU26 was placed in the water December 11 and traversed 1,180 kilometers through the Ross Sea before it had to be removed on February 5 due to a mechanical problem.

Ocean scientists got their first taste of how gliders could help with their research last summer, following BP's Deepwater Horizon oil spill. iRobot provided Seagliders in May to help detect the presence of oil by measuring temperature, salinity and other ocean properties at depths of up to 1,000 meters.

As data from the Ross Sea is analyzed, Asper and his colleagues plan to build computerized models that help them visualize the phytoplankton blooms below the surface. Asper expects gliders and other autonomous subs are the future of oceanographic data collection. "They can work in conditions where ships can't," he says. "They don't care about hurricanes, they don't get tired or sick—they just work."

Scientific American




TStzmmalaysia
post Feb 28 2011, 09:41 AM



RESEARCH

Attached Image

How to keep up with processor power's doubling time, Part 1: Designing the Hardware


Computer chips' clocks have stopped getting faster. To maintain the regular doubling of computer power that we now take for granted, chip makers have been giving chips more "cores," or processing units. But how to distribute computations across multiple cores is a hard problem, and this series of articles examines the different levels at which MIT researchers are tackling it, from hardware design up to the development of new programming languages.

With the multicore chips in today's personal computers, which might have four or six or even eight cores, splitting computational tasks hasn't proved a huge problem. If the chip is running four programs — say, a word processor, an e-mail program, a Web browser and a media player — the operating system can assign each its own core. But in future chips, with hundreds or even thousands of cores, a single program will be split among multiple cores, which drastically complicates things. The cores will have to exchange data much more often; but in today’s chips, the connections between cores are much slower than the connections within cores. Cores executing a single program may also have to modify the same chunk of data, but the performance of the program could be radically different depending on which of them gets to it first. At MIT, a host of researchers are exploring how to reinvent chip architecture from the ground up, to ensure that adding more cores makes chips perform better, not worse.

In August 2010, the U.S. Department of Defense’s Defense Advanced Research Projects Agency announced that it was dividing almost $80 million among four research teams as part of a “ubiquitous high-performance computing” initiative. Three of those teams are led by commercial chip manufacturers. The fourth, which includes researchers from Mercury Computer, Freescale, the University of Maryland and Lockheed Martin, is led by MIT’s Computer Science and Artificial Intelligence Lab and will concentrate on the development of multicore systems.

The MIT project, called Angstrom, involves 19 MIT researchers (so far) and is headed by Anant Agarwal, a professor in the Department of Electrical Engineering and Computer Science. In 2004, Agarwal cofounded the chip company Tilera to commercialize research he’d done at MIT, and today, Tilera’s 64-core processor is the state of the art for multicore technology.

One way to improve communication between cores, which the Angstrom project is investigating, is optical communication — using light instead of electricity to move data. Though prototype chips with optical-communications systems have been built in the lab, they rely on exotic materials that are difficult to integrate into existing chip-manufacturing processes. Two of the Angstrom researchers are investigating optical-communications schemes that use more practical materials.

In early 2010, an MIT research group led by Lionel Kimerling, the Thomas Lord Professor of Materials Science and Engineering, demonstrated the first germanium laser. Germanium is already used in many commercial chips simply to improve the speed of electrical circuits, but it has much better optical properties than silicon. Another Angstrom member, Vladimir Stojanović of the Microsystems Technology Laboratory, is collaborating with several chip manufacturers to build prototype chips with polysilicon waveguides. Waveguides are ridges on the surface of a chip that can direct optical signals; polysilicon is a type of silicon that consists of tiny, distinct crystals of silicon clumped together. Typically used in the transistor element called the gate, polysilicon has been part of the standard chip-manufacturing process for decades.

Other Angstrom researchers, however, are working on improving electrical connections between cores. In today’s multicore chips, adjacent cores typically have two high-capacity connections between them, which carry data in opposite directions, like the lanes of a two-lane highway. But in future chips, cores’ bandwidth requirements could fluctuate wildly. A core performing a calculation that requires information updates from dozens of other cores would need much more receiving capacity than sending. But once it completes its calculation, it might have to broadcast the results, so its requirements would invert. Srini Devadas, a professor in the Computer Science and Artificial Intelligence Lab, is researching chip designs in which cores are connected by eight or maybe 16 lower-capacity connections, each of which can carry data in either direction. As the bandwidth requirements of the chip change, so can the number of connections carrying data in each direction. Devadas has demonstrated that small circuits connected to the cores can calculate the allotment of bandwidth and switch the direction of the connections in a single clock cycle.
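The direction-switching idea above can be sketched as a simple allocation policy. This is a toy model of my own devising, not Devadas' circuit: it splits a pool of bidirectional links between the two directions in proportion to current demand, the decision his hardware makes in a single clock cycle.

```python
def allocate_links(demand_ab, demand_ba, n_links=8):
    """Assign n_links switchable connections between two cores A and B
    in proportion to the traffic demand in each direction, keeping at
    least one link for any direction with nonzero demand.
    Returns (links carrying A->B, links carrying B->A)."""
    total = demand_ab + demand_ba
    if total == 0:
        # No traffic: split evenly as a neutral default.
        return n_links // 2, n_links - n_links // 2
    ab = round(n_links * demand_ab / total)
    # Clamp so neither nonzero-demand direction is starved.
    ab = min(max(ab, 1 if demand_ab else 0),
             n_links - (1 if demand_ba else 0))
    return ab, n_links - ab

# A core gathering updates from many neighbors needs mostly receive capacity;
# once it broadcasts its result, the allocation inverts.
gather = allocate_links(10, 70, 8)     # mostly B->A
broadcast = allocate_links(70, 10, 8)  # mostly A->B
```

The proportional policy and the one-link floor are illustrative assumptions; the article does not specify the actual allotment rule.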

In theory, a computer chip has two main components: a processor and a memory circuit. The processor retrieves data from the memory, performs an operation on it, then returns it to memory. But in practice, chips have for decades featured an additional, smaller memory circuit called a cache, which is closer to the processor, can be accessed much more rapidly than main memory, and stores frequently used data. The processor might perform dozens or hundreds of operations on a chunk of data in the cache before relinquishing it to memory.

In multicore chips, however, multiple cores may have cached copies of the same data. If one of the cores modifies its copy, all the other copies have to be updated. There are two general approaches to maintaining “cache coherence”: one is to keep a table of all the cached copies of the data, which has a cost in the computation time required to look up or modify entries in the table; the other is to simply broadcast any data updates to all the cores, which has a cost in bandwidth and, consequently, energy consumption. But Li-Shiuan Peh, an Angstrom researcher who joined the MIT faculty in 2009, is advocating a third approach. She has developed a system in which each core has its own “router,” which, like the routers on the Internet, knows only where to forward the data it receives. Peh’s simulations show that a network of routers is more computationally and energy efficient than either of the standard alternatives.
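The first approach the paragraph names, keeping a table of cached copies, is commonly called directory-based coherence. The sketch below is a minimal toy of that bookkeeping (class and method names are mine): the directory records which cores hold each address, and a write forces every other holder to invalidate its copy, which is exactly the lookup-and-update cost the article mentions.

```python
from collections import defaultdict

class Directory:
    """Toy directory for cache coherence: maps each memory address to the
    set of cores currently holding a cached copy of it."""
    def __init__(self):
        self.sharers = defaultdict(set)

    def read(self, core, addr):
        # A read adds the core to the sharer set for that address.
        self.sharers[addr].add(core)

    def write(self, core, addr):
        # A write makes the writer the sole holder; every other cached
        # copy is now stale and must be invalidated.
        invalidated = self.sharers[addr] - {core}
        self.sharers[addr] = {core}
        return invalidated

d = Directory()
d.read(0, 0x100)
d.read(1, 0x100)
d.read(2, 0x100)
stale = d.write(1, 0x100)  # cores 0 and 2 must drop their copies
```

The broadcast alternative skips the table entirely and sends every update to all cores, trading table maintenance for bandwidth and energy.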

Whether the Angstrom project settles on electrical or optical connections remains to be seen. But Agarwal says that future multicore chips could well use both: Electrical connections would move data between individual cores; but optical connections would provide chip-wide broadcasts.

Not all the MIT faculty researching multicore architectures are affiliated with the Angstrom project, however. Jack Dennis is technically an emeritus professor, but together with colleagues at the University of Delaware and Rice University, he’s received National Science Foundation funding to research a radically different multicore architecture. In Dennis’ system, a computer’s memory is divided into chunks of uniform size, each of which can store data but can also point to as many as 16 other chunks. If a data structure — say, a frame of video — is too large to fit in a single chunk, the system creates additional chunks to share the burden and links them to existing chunks.

Dennis’ data chunks have three unusual properties. First, they are abstractions: Several chunks storing a single data structure might be found in a core’s cache, but if the cache fills up, other chunks might be recruited from main memory or even from a flash drive. The system doesn’t care how the chunks are instantiated. Second, and perhaps most counterintuitively, once a chunk has been created, it may never be altered. If a core performs an operation on data stored in a chunk, it must create a new chunk to store the results. This solves the problems of multiple cores trying to modify the same location in memory and of cache coherence. Once a chunk is no longer in use by any core, it’s simply released for general use. Another chunk, storing the result of a computation, could take its place in the network of links. Finally, because any operation of any core could result in the creation or deletion of chunks, the allocation of the chunks is performed by circuits hard-wired into the chip, not by the computer’s operating system. “As far as I know, nothing like this is going on anywhere else,” Dennis says.
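The write-once, linked-chunk idea can be modeled in a few lines. This is a software toy of my own (Dennis' design is hardware, with allocation done by on-chip circuits): chunks are immutable once created, each may link to at most 16 others, and "updating" data means creating a replacement chunk.

```python
class ChunkStore:
    """Toy model of write-once memory chunks that can link to up to
    16 other chunks. Chunks are never altered after creation."""
    MAX_LINKS = 16

    def __init__(self):
        self._chunks = {}   # chunk id -> (data, tuple of linked ids)
        self._next_id = 0

    def create(self, data, links=()):
        if len(links) > self.MAX_LINKS:
            raise ValueError("a chunk may point to at most 16 others")
        cid = self._next_id
        self._next_id += 1
        self._chunks[cid] = (data, tuple(links))  # frozen once stored
        return cid

    def update(self, cid, new_data):
        # No in-place mutation: a "modification" is a brand-new chunk
        # carrying the same links; the old chunk is untouched.
        _, links = self._chunks[cid]
        return self.create(new_data, links)

store = ChunkStore()
a = store.create("frame-part-1")
b = store.create("frame-part-2", links=(a,))
c = store.update(b, "frame-part-2-edited")  # b itself is unchanged
```

Because no chunk is ever rewritten, two cores can never race to modify the same location, which is how the scheme sidesteps cache coherence.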

Whatever the architectural challenges posed by multicore computing, however, they’re only the tip of the iceberg. The next installment in this series will begin to look at MIT research on software.

This story is republished courtesy of MIT News (http://web.mit.edu/newsoffice/), a popular site that covers news about MIT research, innovation and teaching.

PhysOrg

Provided by Massachusetts Institute of Technology (news : web)

TStzmmalaysia
post Mar 1 2011, 10:22 AM



SPACE SCIENCE

Attached Image

A date with comet Tempel 1

On Valentine's Day, while we were all cooing over our loved ones or lamenting the obvious negligence of the postman, scientists at the mission's control center near Denver were cooing over something rather larger. On February 14th this year, NASA's Stardust probe made its second visit to the comet Tempel 1 at 8:40 pm PST, skimming past the comet at a distance of 111 miles (178 km) and traveling at a relative speed of 24,300 mph (10.9 km per second). This is the first time scientists have been able to get a second look at a comet, which allows them to compare data from the first visit in order to learn more about these icy inhabitants of our solar system.
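As a quick back-of-the-envelope check (mine, not the article's), the imperial and metric figures quoted above are consistent:

```python
MPH_TO_MPS = 0.44704      # exact: 1 mile = 1609.344 m, 1 hour = 3600 s
MILE_TO_KM = 1.609344

speed_kms = 24_300 * MPH_TO_MPS / 1000  # 24,300 mph -> ~10.9 km/s
dist_km = 111 * MILE_TO_KM              # 111 miles  -> ~178.6 km
```

Both match the rounded values given in the report (10.9 km/s and 178 km).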

"Here's a chance where we can see what has changed, how much has changed" on Tempel 1, so we'll start unraveling the history of a comet's surface," said the mission's lead investigator Joe Veverka of Cornell University. "That could help us better understand the life cycle of a comet. We have no idea whether we're talking about things that have been there for a hundred years, a thousand years, a million years."

The Stardust probe was initially launched in 1999, and spent the first five years traveling towards its mission target, another comet called Wild 2. Having collected dust samples and returned them to Earth in 2006, it was still fully functional, so NASA recommissioned it to visit Tempel 1, which had first been visited by the Deep Impact spacecraft in 2005. On that occasion Deep Impact slammed a projectile into the comet to excavate cometary ice and dust, and scientists were particularly interested to examine the crater left in the surface, and hopefully learn more about the comet.

The scientists are intrigued by the crater scar, which shows a small mound in the center, indicating that some ejecta exploded upwards and immediately back down again. "This tells us this cometary nucleus is fragile and weak based on how subdued the crater is we see today," said Pete Schultz of Brown University, Rhode Island.

The Stardust-NExT mission had three particular goals: to observe changed surface features, to get images of new terrain, and to view the crater generated by Deep Impact's projectile. All three were completed successfully, and Stardust's NavCam instrument took 72 high-resolution images of the comet and collected 468 kilobytes of data about its 'coma' dust – the cloud that constitutes a comet's atmosphere.

"This mission is 100 percent successful," said Veverka. "We saw a lot of new things that we didn't expect, and we'll be working hard to figure out what Tempel 1 is trying to tell us."

Stardust sustained some damage during the closest approach, with a dozen impacts of disintegrating cometary particles penetrating more than one layer of its protective shielding. "The data indicate Stardust went through something similar to a B-17 bomber flying through flak in World War II," said Don Brownlee, Stardust-NExT co-investigator from the University of Washington in Seattle. "Instead of having a little stream of uniform particles coming out, they apparently came out in chunks and crumbled."

As the mission winds down, the departure phase will include snapping an image of the receding comet every five minutes for five days and then every 12 minutes for the following six days. Several weeks after this, the Stardust spacecraft will finally be retired. "This spacecraft has logged over 3.5 billion miles since launch, and while its last close encounter is complete, its mission of discovery is not," said Tim Larson, Stardust-NExT project manager at the Jet Propulsion Laboratory (JPL). "We'll continue imaging the comet as long as the science team can gain useful information, and then Stardust will get its well-deserved rest."

JPL, part of the California Institute of Technology, manages Stardust-NExT for the NASA Science Mission Directorate. Lockheed Martin Space Systems built the spacecraft and manages day-to-day mission operations.

"This little spacecraft has really been around the block. Even through the odometer is high and the fuel is low, it did everything we asked of it and the results are visually amazing," said Allan Cheuvront, Lockheed Martin Space Systems Company program manager for Stardust-NExT.

Mission controllers from the University of Maryland-led EPOXI mission celebrated last week as NASA's Deep Impact space probe flew close by the Hartley 2 comet, sending back rare and valuable data about the comet. This is only the fifth time that a comet core has been viewed from such a near distance by a space probe, and it is hoped that by understanding comets better we can learn more about the origin and history of our solar system.

On November 4th at 10am EDT the spacecraft passed within 700 kilometers (435 miles) of the Hartley 2 comet, and within twenty minutes the first images of the encounter were being viewed 37 million kilometers (23 million miles) away on Earth. The initial images provided new information about the comet's volume – the peanut-shaped Hartley 2 comet is the smallest so far to be examined at such proximity – and showed jets of CO2 gas gushing from its surface in plumes.

“Early observations of the comet show that, for the first time, we may be able to connect activity to individual features on the nucleus. We certainly have our hands full," said Michael A'Hearn, University of Maryland astronomer. "The images are full of great cometary data, and that's what we hoped for.” A'Hearn is one of the originators of, and science team leader for, both the Deep Impact mission and its follow-on mission EPOXI.

"There'll be enough data downloaded to keep researchers busy for the next five, 10, 15 years probably. It's proving to be very interesting," said Malcolm Hartley, the man who discovered the comet in 1986.

Comets are believed to be made up of leftover debris material that didn't get incorporated into the planets at the birth of the solar system, and remain gravitationally bound to the sun. As they are thought to be composed of unchanged primitive material, they are extremely interesting to scientists who wish to learn about conditions during the early stages of the solar system.

Hartley 2 is the fifth comet nucleus visited by any spacecraft and the second one visited by the Deep Impact spacecraft. Launched in January 2005, Deep Impact found widespread fame when it slammed a probe into the Tempel 1 comet on July 4th that year. It was this successful mission that won approval for the spacecraft to study a second comet.

The name EPOXI is itself a combination of the names of two extended missions: the extrasolar planet observations, known as Extrasolar Planet Observations and Characterization (EPOCh), and the flyby of comet Hartley 2, known as the Deep Impact Extended Investigation (DIXI). During the EPOCh phase of EPOXI, the Deep Impact spacecraft provided information on possible extrasolar planets and was one of three spacecraft that for the first time found clear evidence of water on the Moon. It has also provided data for a paper due to appear in the Astrophysical Journal about identifying planetary bodies by their colors, which could help us to identify planets similar to Earth.

GizMag


TStzmmalaysia
post Mar 1 2011, 10:24 AM



RESEARCH

Attached Image

Biomimetic patch to be tested on tricky tendon-to-bone repairs

Two Washington University in St. Louis scientists are imitating nature as they attempt to solve one of the most difficult problems in orthopedic surgery: reattaching tendon to bone.

Their goal is to improve the success rate of rotator cuff repairs. The rotator cuff is the group of four tendons and muscles that surround the shoulder joint. When it is injured, the tendons tear or detach from the bone.

Commonly thought of as a sports injury — or perhaps the scourge of orchestra conductors — rotator cuff tears actually become more common with age. The highest incidence is in patients older than 60, the scientists say.
For the doctor or surgeon, the big challenge of these injuries is the limited ability of the tendon to repair itself or to reattach to the bone. Reported failure rates for rotator cuff surgeries range from 20 percent to 94 percent.

Younan Xia, PhD, the James M. McKelvey Professor in the School of Engineering & Applied Science, and Stavros Thomopoulos, PhD, associate professor of orthopaedic surgery in the School of Medicine, demonstrated an innovative solution to this problem several years ago, publishing their results in Nano Letters in 2009.

Together with co-principal investigator Leesa M. Galatz, associate professor of orthopaedic surgery in the School of Medicine, the research group has just received more than $2 million from the National Institutes of Health to take the next step toward clinical use of the biomimetic patch.
Over the next few years, they will be testing its success in repairing cuff tears in a small animal model, the rat.

The most flexible of joints

Like your hip, your shoulder is a ball and socket joint. But the head of your femur sits deep in its socket in the pelvis, so your hip joint is stable and hard to dislocate.

Not so your shoulder. The ball at the end of the humerus sits in a shallow indentation in the scapula like a golf ball sitting on a tee. This allows your arm to swing much more freely than your leg, but it also means the shoulder joint is inherently unstable.
To solve this problem, the shoulder has an active stability system called the rotator cuff. Four muscles that fan out from the humerus continually pull it back into the socket so that the joint can't dislocate. Around the ball of the humerus, the tendons form a nearly continuous band like the cuff on a man's shirt.

The rotator cuff system is in many ways a miracle of biomechanical engineering. The tendons are relatively compliant and stringy, like a rope, and bone is hard and porous, like cement. Some of their mechanical properties differ by a hundred-fold or more.
"Attaching a compliant material like tendon to a relatively stiff material like bone is a fundamental engineering challenge," says Thomopoulos.
Merging tendon with bone

Nature has solved this problem by grading the mechanical structure and stiffness of the tissues across the interface between the tendon and the bone.
In an infant, both the bone and the tendon consist of collagen, says Thomopoulos. "Soon after birth, the bone starts to mineralize and a linear gradient forms between the bone and the tendon that slowly stiffens the material and transitions it from hard bone to compliant tendon."
The difficulty, says Thomopoulos, is that this unique transitional tissue is not recreated after injury.

Attached Image

"If you fall off a ladder and your rotator cuff tears, even if the surgeons goes in and puts the tendon back on the bone, he's putting the tendon right against bone. You don't have the graded interface. Biologically, it doesn't reform."
As a result, the failure rate for this surgery can be quite high — as high as 94 percent in one study co-authored by Galatz. Many of the patients in the study she published experienced relief from pain, but ultrasound revealed the cuff tear had reopened or a new one had formed.

The biomimetic fix

Because the grading between tendon and bone occurs at both microscopic and nanoscopic levels, Thomopoulos teamed up with Xia, an expert in nanotechnology, to find a solution to this problem.
In 2009, Xia and Thomopoulos proposed a simple solution: a temporary scaffold that will guide the healing process along the path it follows during development.

"A few scaffolds have been attempted in rotator cuff repair in humans," says Thomopoulos, "with only limited success. These patches are fairly compliant materials, even compared to tendon. The hope is that they'll stimulate a better healing response, but it hasn't happened yet. Our scaffold approach is to mimic the natural tissue by stepping up gradually in stiffness from tendon to bone."

The WUSTL scaffold consists of a mat of nanoscopic fibers electrospun in Xia's lab that mimics the structure of the collagen fibers in a tendon. The mat is then coated with a continuous gradient of hydroxyapatite, a mineral containing calcium and phosphorus that gives strength to bone, so that it is stiff and bone-like toward one end and compliant and tendon-like toward the other.
Finally the scaffold is seeded with adult mesenchymal stem cells, a type of stem cell that can mature into osteoblasts (bone-forming cells) or fibroblasts (cells common in tendon).

The idea is that as the fibers disintegrate over the course of a few months, the mineral gradient will promote the graduated differentiation of the stem cells. Stem cells toward the bone end will be coaxed by the presence of mineral to differentiate into osteoblasts, while the stem cells at the tendon end, surrounded by aligned, unmineralized fibers, will form fibroblasts.
"The tendon side is a little bit easier because stem cells tend to go toward the fibroblast-like lineage if you don't do much to them," says Thomopoulos.
The presence of either type of cell can be confirmed by looking for markers of each type of mature cell.

Tiny shoulders

"The newly manufactured scaffold looks like a sheet of paper and can be cut to fit the tendon tear," says Xia.
The team has taken the method as far as they can in vitro and are now ready to try it in a small animal model, the rat.

Galatz, who will perform the surgeries, will create a tear in the rat's rotator cuff, repair it with a suture, and then lay a piece of the biomimetic scaffold (a few millimeters wide and long) over the repair site. "You wouldn't think it," says Thomopoulos, "but a rat's shoulder is surprisingly similar to ours. A study done years ago compared the shoulder anatomy of 34 different species, and aside from the primates, the rat's bony and muscular anatomy was actually closest to ours. Rats are quadrupeds, but if you observe rats in their cages, they're reaching overhead quite a bit," he says.

PhysOrg



TStzmmalaysia
post Mar 1 2011, 10:27 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

Stretched rubber offers simpler method for assembling nanowires

Researchers at North Carolina State University have developed a cheap and easy method for assembling nanowires, controlling their alignment and density. The researchers hope the findings will foster additional research into a range of device applications using nanowires, from nanoelectronics to nanosensors, especially on unconventional substrates such as rubber, plastic and paper.

"Alignment is a critical first step for developing devices that use nanowires," says Dr. Yong Zhu, an assistant professor of mechanical and aerospace engineering at NC State and co-author of a paper describing the research. "Hopefully our simple and cost-effective method will facilitate research in this field."

Aligning nanowires is challenging because, when they are created, the user is faced with a profusion of randomly oriented nanoscale wires that are, by definition, incredibly small. For example, the nanowires are between 10 and 100 nanometers in diameter, whereas a white blood cell is approximately 10,000 nanometers in diameter. Before any practical applications can be pursued, the user must assemble the nanowires in an orderly way. Specifically, users need to align the nanowires in a common direction and define their density – meaning the number of nanowires in a given area. Controlling both alignment and density is commonly called "assembling" the nanowires.

In the new method, Zhu's team deposited the nanowires on a stretched rubber substrate, and then released the tension on the substrate. When the nanowires settled, they aligned at a right angle to where the tension was coming from. Picture a rubber band being stretched to the east and west. If nanowires were placed on the rubber band, and the band was allowed to snap back to its original shape, the nanowires would be oriented to the north and south. The more the rubber substrate is stretched, the more aligned the nanowires will be, and the greater the nanowire density will be.
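The geometry described above can be sketched with a toy affine model (purely illustrative; it ignores Poisson contraction, wire stiffness and slippage, none of which the article quantifies):

```python
import math

def released_angle(theta_deg, stretch_ratio):
    """Orientation of a nanowire after a substrate pre-stretched along x
    by `stretch_ratio` is released. Angles are measured from the stretch
    axis; relaxation contracts x by 1/stretch_ratio, rotating wires
    toward 90 degrees (perpendicular to the stretch)."""
    theta = math.radians(theta_deg)
    # Affine contraction of the wire's direction vector:
    x, y = math.cos(theta) / stretch_ratio, math.sin(theta)
    return math.degrees(math.atan2(y, x))

# A wire deposited 45 degrees off-axis on a substrate stretched 3x
# ends up at atan(3), roughly 71.6 degrees; more stretch, closer to 90.
angle = released_angle(45, 3.0)
```

Plugging in larger stretch ratios shows why stronger pre-stretch yields tighter alignment; density rises for the same reason, since the same wires end up on a smaller relaxed area.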

Previous research has presented a number of other methods for assembling nanowires. But the new method offers a number of distinct advantages. "Our method is cost-effective," says Feng Xu, a Ph.D. student working on this project, "because it is so simple. It can also be used for nanowires synthesized by different methods or processed in different conditions, for instance, silver nanowires synthesized in solution and silicon nanowires synthesized by the vapor-liquid-solid method, as demonstrated in our work." In addition, the new method can be used in conjunction with previous methods to achieve even better nanowire assembly.

The use of a rubber substrate in this method facilitates broad research and manufacturing sectors. For example, a key element of research into stretchable nanoelectronics involves aligning nanowires on a stretchable rubber substrate. Similarly, rubber is also the material used as "stamps" in transfer printing – a critical fabrication method used in manufacturing nanodevices on diverse substrates ranging from silicon to glass to plastic.

Zhu notes that the initial step of the method, when the nanowires are first deposited on stretched rubber, sometimes yields an inconsistent degree of nanowire alignment. The team is currently working to understand the fundamental interface mechanics, including adhesion and static friction, between nanowires and rubber substrates, which is expected to lead to better control of the assembly process and hence a higher yield.

EurekAlert

TStzmmalaysia
post Mar 1 2011, 10:29 AM



NANOTECHNOLOGY

Attached Image

Gene Fuelled Transporter Causes Breast Cancer Cells to Self-Destruct

Scientists at Queen's University Belfast have shown that they can deliver a gene directly into breast cancer cells causing them to self-destruct, using an innovative, minuscule gene transport system, according to research published February 28 in the International Journal of Pharmaceutics.

Using a transport system called a Designer Biomimetic Vector (DBV), Dr Helen McCarthy, from Queen's School of Pharmacy, funded by Breast Cancer Campaign, packaged a gene into a nanoparticle 400 times smaller than the width of a human hair, allowing it to be delivered straight into breast cancer cells in the laboratory.

The gene, called iNOS, is targeted specifically to breast cancer cells using the DBV, where it forces the cells to produce poisonous nitric oxide, either killing the cells outright or making them more vulnerable to being destroyed by chemotherapy and radiotherapy. Because this approach leaves normal, healthy breast cells unaffected, it could overcome many of the toxic side effects of current treatments.

Further investigation is needed but it could be trialled in patients in as little as five years. Dr McCarthy's next step is to turn the nanoparticles into a dried powder that could be easily transported and reconstituted before being given to patients.

Dr McCarthy said: "A major stumbling block to using gene therapy in the past has been the lack of an effective delivery system. Combining the Designer Biomimetic Vector with the iNOS gene has proved successful in killing breast cancer cells in the laboratory. In the long term, I see this being used to treat people with metastatic breast cancer that has spread to the bones, ideally administered before radiotherapy and chemotherapy."

Dr Lisa Wilde, Research Information Senior Manager, Breast Cancer Campaign said: "Gene therapy could potentially be an exciting avenue for treating breast cancer. Although at an early stage, Dr McCarthy's laboratory research shows that this system for delivering toxic genes to tumour cells holds great promise and we look forward to seeing how it is translated into patients."

ScienceDaily

TStzmmalaysia
post Mar 1 2011, 10:31 AM



ENERGY

Attached Image

Power of cool: Liquid air to store clean energy

STANDING in a container full of pipes and valves, wearing a hard hat and sturdy boots, we gaze at a dull grey panel with green and red on and off switches.

"This panel means we're connected to the grid," says my companion Rob Morgan, grinning proudly. "To an engineer, this is really exciting."

We are on an industrial estate in Slough in the UK, on the grounds of a 100-megawatt biomass plant owned by energy firm Scottish and Southern. But what we've come to see is the small cluster of containers and a gleaming white liquid nitrogen tank tucked away in one corner of the site. Here Morgan, chief engineer at Highview Power Storage, London, and his colleagues have been running a pilot plant designed to store potential energy in the form of liquid air.

Until recently, the only way grid operators could store energy was in huge hydropower reservoirs. In the future, intermittent renewable generators will form a larger part of our energy mix, so we are going to need ways of storing the power they produce for use in the hours or even days when the sun isn't shining or the wind isn't blowing. Since we cannot build a huge new reservoir near every large town and city, more compact storage systems are key to the future of green power.

This is why Highview has been testing its 300-kilowatt pilot plant for the past nine months, supplying electricity to the UK's National Grid. The process stores excess energy at times of low demand by using it to cool air to around -190 °C. Excess electricity powers refrigerators that chill the air, and the resulting liquid air, or cryogen, is then stored in a tank at ambient pressure (1 bar). When electricity is needed, the cryogen is subjected to a pressure of 70 bars and warmed in a heat exchanger. This produces a high-pressure gas that drives a turbine to generate electricity. The cold air emerging from the turbine is captured and reused to make more cryogen. Using ambient heat to warm it, the process recovers around 50 per cent of the electricity that is fed in, says Highview's chief executive Gareth Brett. The efficiency rises to around 70 per cent if you harness waste heat from a nearby industrial or power plant to heat the cryogen to a higher than ambient temperature, which increases the turbine's force, he says.
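The round-trip numbers quoted above reduce to simple bookkeeping (a sketch using only the article's figures; real plant efficiency would vary with operating conditions):

```python
def recovered_electricity(stored_kwh, waste_heat=False):
    """Electricity recovered from liquid-air storage, using the round-trip
    figures quoted by Highview: ~50% with ambient heat alone, ~70% when
    external waste heat superheats the cryogen before the turbine."""
    efficiency = 0.70 if waste_heat else 0.50
    return stored_kwh * efficiency

# Storing 1000 kWh of off-peak electricity:
recovered_electricity(1000)                   # ambient heat only
recovered_electricity(1000, waste_heat=True)  # co-located waste heat
```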

Unlike pumped-storage hydropower, which requires large reservoirs, the cryogen plants can be located anywhere, says Brett. Batteries under development in Japan have efficiencies of around 80 to 90 per cent, but cost around $4000 per kilowatt of generating capacity. Cryogenic storage would cost just $1000 per kilowatt because it requires fewer expensive materials, claims Brett.

"Lower costs are always better for energy storage, even if it comes at the price of slightly reduced efficiency," says Aidan Rhodes at the UK Energy Research Centre in London. Highview has so far been receiving cryogen from an external source and using it to store and produce electricity. But the firm has recently added an on-site liquefaction plant, and will begin producing its own cryogen from late March. It plans to build a 3.5-megawatt, commercial-scale system by late 2012, which will be increased to an 8 to 10 megawatt plant by early 2014.

New Scientist

TStzmmalaysia
post Mar 1 2011, 10:34 AM



RESEARCH

Attached Image

How to keep up with processor power's doubling time, Part 2: The next operating system

At the most basic level, a computer is something that receives zeroes and ones from either memory or an input device — like a keyboard — combines them in some systematic way, and ships the results off to either memory or some output device — like a screen or speaker. An operating system, whether Windows, the Apple OS, Linux or any other, is software that mediates between applications, like word processors and Web browsers, and those rudimentary bit operations. Like everything else, operating systems will have to be reimagined for a world in which computer chips have hundreds or thousands of cores.

Project Angstrom, an ambitious initiative to create tomorrow’s computing systems from the ground up, funded by the U.S. Defense Department and drawing on the work of 19 MIT researchers, is concerned with multicore computing at all levels, from chip architecture up to the design of programming languages. But at its heart is the development of a new operating system.

A computer with hundreds of cores tackling different aspects of a problem and exchanging data offers much more opportunity than an ordinary computer does for something to go badly wrong. At the same time, it has more resources to throw at any problems that do arise. So, says Anant Agarwal, who leads the Angstrom project, a multicore operating system needs both to be more self-aware — to have better information about the computer’s performance as a whole — and to have more control of the operations executed by the hardware.

To some extent, increasing self-awareness requires hardware: Each core in the Angstrom chip, for instance, will have its own thermometer, so that the operating system can tell if the chip is overheating. But crucial to the Angstrom operating system — dubbed FOS, for factored operating system — is a software-based performance measure, which Agarwal calls “heartbeats.” Programmers writing applications to run on FOS will have the option of setting performance goals: A video player, for instance, may specify that the playback rate needs to be an industry standard 30 frames per second. FOS will automatically interpret that requirement and emit a simple signal — a heartbeat — each time a frame displays.
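The heartbeat idea can be sketched as a small monitor (the class and method names here are hypothetical, not FOS's actual interface):

```python
import time
from collections import deque

class HeartbeatMonitor:
    """Illustrative sketch of the 'heartbeats' scheme: an application
    emits one beat per unit of work (e.g. per video frame), and the
    system measures the beat rate against a stated performance goal."""

    def __init__(self, target_rate_hz, window=30):
        self.target = target_rate_hz
        self.beats = deque(maxlen=window)  # timestamps of recent beats

    def beat(self, now=None):
        self.beats.append(time.monotonic() if now is None else now)

    def rate(self):
        if len(self.beats) < 2:
            return 0.0
        span = self.beats[-1] - self.beats[0]
        return (len(self.beats) - 1) / span if span > 0 else 0.0

    def below_goal(self):
        # A True result would signal the OS to take computational
        # shortcuts, such as the loop perforation described below.
        return self.rate() < self.target

# A video player targeting 30 frames per second:
mon = HeartbeatMonitor(target_rate_hz=30)
for i in range(31):
    mon.beat(now=i / 25)   # simulate playback running at only 25 fps
# mon.below_goal() is now True
```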

If the heartbeat fell below 30, FOS could adopt some computational short cuts in order to get it back up again. Computer-science professor Martin Rinard’s group has been investigating cases where accuracy can be traded for speed and has developed a technique it calls “loop perforation.” A loop is an operation that’s repeated on successive pieces of data — like, say, pixels in a frame of video — and to perforate a loop is simply to skip some iterations of the operation. Graduate student Hank Hoffmann has been working with Agarwal to give FOS the ability to perforate loops on the fly.
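A minimal illustration of loop perforation (a toy stand-in for per-pixel work, not Rinard's actual implementation):

```python
def perforated_mean(pixels, skip=2):
    """Loop perforation sketch: process only every `skip`-th element,
    trading a little accuracy for speed. Here the 'loop' computes a
    mean brightness, standing in for per-pixel work on a video frame."""
    sampled = pixels[::skip]          # skip iterations of the loop
    return sum(sampled) / len(sampled)

frame = list(range(100))                   # toy stand-in for pixel values
exact = sum(frame) / len(frame)            # full loop: 49.5
approx = perforated_mean(frame, skip=2)    # half the work: 49.0
```

The answer drifts slightly, but for workloads like video playback a small error in one frame is invisible, while halving the work can restore the frame rate.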

Saman Amarasinghe, another computer-science professor who (unlike Rinard) is officially part of the Angstrom project, has been working on something similar. In computer science, there are usually multiple algorithms that can solve a given problem, with different performance under different circumstances; programmers select the ones that seem to best fit the anticipated uses of an application. But Amarasinghe has been developing tools that allow programmers to specify several different algorithms for each task a program performs, and the operating system automatically selects the one that works best under any given circumstances. That functionality will be a feature of FOS, Agarwal says.
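The idea of registering several algorithms and letting the system choose can be mocked up as follows (an illustrative sketch; Amarasinghe's actual tools work at the language and compiler level):

```python
def insertion_sort(xs):
    """Simple quadratic sort; cheap on very small inputs."""
    out = []
    for x in xs:
        i = len(out)
        while i > 0 and out[i - 1] > x:
            i -= 1
        out.insert(i, x)
    return out

class AlgorithmChooser:
    """A task registers interchangeable algorithms, each with a predicate
    saying when it is preferable; the runtime picks one per call."""

    def __init__(self):
        self.choices = []   # (predicate, algorithm), in priority order

    def register(self, predicate, algorithm):
        self.choices.append((predicate, algorithm))

    def run(self, data):
        for fits, algo in self.choices:
            if fits(data):
                return algo(data)
        raise ValueError("no registered algorithm fits this input")

sort = AlgorithmChooser()
sort.register(lambda xs: len(xs) < 32, insertion_sort)  # tiny inputs
sort.register(lambda xs: True, sorted)                  # general fallback
```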

Dynamic response to changing circumstances has been a feature of Agarwal’s own recent work. An operating system is, essentially, a collection of smaller programs that execute rudimentary tasks. One example is the file system, which keeps track of where different chunks of data are stored, so that they can be retrieved when a core or an output device requires them. Agarwal and his research group have figured out how to break up several such rudimentary tasks so that they can run in parallel on multiple cores. If requests from application software begin to proliferate, the operating system can simply call up additional cores to handle the load. Agarwal envisions each subsystem of FOS as a “fleet of services” that can expand and contract as circumstances warrant.

In theory, a computer chip has two main components: a processor and a memory circuit. The processor retrieves data from the memory, performs an operation on it, and returns it to memory. But in practice, chips have for decades featured an additional, smaller memory circuit called a cache, which is closer to the processor, can be accessed much more rapidly than main memory, and stores frequently used data.

In a multicore chip, each core has its own cache. A core could probably access the caches of neighboring cores more efficiently than it could main memory, but current operating systems offer no way for one core to get at the cache of another. Work by Angstrom member and computer-science professor Frans Kaashoek can help redress that limitation. Kaashoek has demonstrated how the set of primitive operations executed by a computer chip can be expanded to allow cores access to each other’s caches. In order to streamline a program’s execution, an operating system with Kaashoek’s expanded instruction set could, for instance, swap the contents of two caches, so that data moves to the core that needs it without any trips to main memory; or one core could ask another whether it contains the data stored at some specific location in main memory.

Since Angstrom has the luxury of building a chip from the ground up, it’s also going to draw on work that Kaashoek has done with assistant professor Nickolai Zeldovich to secure operating systems from outside attack. An operating system must be granted some access to primitive chip-level instructions — like Kaashoek’s cache swap command and cache address request. But Kaashoek and Zeldovich have been working to minimize the number of operating-system subroutines that require that privileged access. The fewer routes there are to the chip’s most basic controls, the harder they are for attackers to exploit.

Computer-science professor Srini Devadas has done work on electrical data connections that Angstrom is adopting (and which the previous article in this series described), but outside Angstrom, he’s working on his own approach to multicore operating systems that in some sense inverts Kaashoek’s primitive cache-swap procedure. Instead of moving data to the cores that require it, Devadas’ system assigns computations to the cores with the required data in their caches. Sending a core its assignment actually consumes four times as much bandwidth as swapping the contents of caches does, so it also consumes more energy. But in multicore chips, multiple cores will frequently have cached copies of the same data. If one core modifies its copy, all the other copies have to be updated, too, which eats up both energy and time. By reducing the need for cache updates, Devadas says, a multicore system that uses his approach could outperform one that uses the traditional approach. And the disparity could grow if chips with more and more cores end up caching more copies of the same data.

PhysOrg

This story is republished courtesy of MIT News (http://web.mit.edu/newsoffice/), a popular site that covers news about MIT research, innovation and teaching.

More information: Computer chips’ clocks have stopped getting faster. To maintain the regular doubling of computer power that we now take for granted, chip makers have been giving chips more “cores,” or processing units. But how to distribute computations across multiple cores is a hard problem, and this five-part series of articles examines the different levels at which MIT researchers are tackling it, from hardware design up to the development of new programming languages.

Part 1: Designing the hardware

Provided by Massachusetts Institute of Technology

TStzmmalaysia
post Mar 1 2011, 10:36 AM



APPLIED SCIENCES

Attached Image

Developing Sensors for Seniors

Japan's top telecoms company is developing a simple wristwatch-like device to monitor the well-being of the elderly, part of a growing effort to improve care of the old in a nation whose population is aging faster than anywhere else.

The device, worn like a watch, has a built-in camera, microphone and accelerometer, which measure the pace and direction of hand movements to discern what wearers are doing - from brushing their teeth to vacuuming or making coffee.

In a demonstration at Nippon Telegraph and Telephone Corp.'s research facility, the test subject's movements were collected as data that popped up as lines on a graph, with each kind of activity showing up as a different pattern of lines. Using this technology, what an elderly person is doing during each hour of the day can be shown on a chart.
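As a rough illustration of how activities separate into different signal patterns, consider classifying a trace by its spread (the thresholds and labels below are invented, not NTT's model):

```python
import statistics

def classify_activity(accel_magnitudes):
    """Crude sketch: different activities leave different statistical
    signatures in an accelerometer trace. The cutoffs are arbitrary,
    chosen only to show the shape of such a classifier."""
    spread = statistics.pstdev(accel_magnitudes)
    if spread < 0.05:
        return "resting"
    if spread < 0.5:
        return "light activity"      # e.g. making coffee
    return "vigorous activity"       # e.g. vacuuming, brushing teeth

classify_activity([1.0] * 50)        # flat trace, classified as resting
```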

The prototype was connected to a personal computer for the demonstration, but researchers said such data could also be relayed by wireless or stored in a memory card to be looked at later.

In the U.S., the Institute on Aging at the University of Virginia has been carrying out studies in practical applications of what it calls "body area sensor networks" to promote senior independent living.

What's important is that wearable sensors be easy to use, unobtrusive, ergonomic and even stylish, according to the institute, based in Charlottesville, Virginia. Costs, safety and privacy issues are also key.

George Demiris, associate professor at the School of Medicine at the University of Washington, in Seattle, says technology for the elderly is complex, requiring more than just coming up with sophisticated technology.

Getting too much data, for instance, could simply burden already overworked health care professionals, and overly relying on technology could even make the elderly miserable, reducing opportunities for them to interact with real people, he said.

"Having more data alone does not mean we will have better care for older adults," Demiris said in an e-mail.

"We can have the most sophisticated technology in place, but if the response at the other end is not designed to address what the data show in a timely and efficient way, the technology itself is not useful," he said.

PhysOrg

TStzmmalaysia
post Mar 2 2011, 09:51 AM



RESEARCH

Attached Image

CeBIT: Laptop tracks gaze, taking eye-tracking out of lab

Ever wish your eyes were lasers? A laptop prototype brings that wish closer to reality.

It tracks your gaze and figures out where you're looking on the screen. That means, among other things, that you can play a game where you burn up incoming asteroids with a laser that hits where you look.

In another demonstration this week, the computer scrolled a text on the screen in response to eye movements, sensing when the reader reached the end of the visible text.

In the future, a laptop like this could make the mouse cursor appear where you're looking, or make a game character maintain eye contact with you, according to Tobii Technology Inc., the Swedish firm that's behind the tracking technology.

The eye tracker works by shining two invisible infrared lights at you. Two hidden cameras then look for the "glints" off your eyeballs and reflections from each retina. It needs to be calibrated for each person. It works for people with or without eyeglasses.

Rather than a replacement for the traditional mouse and keyboard or the newer touch screen, the eye-tracking could be a complement, making a computer faster and more efficient to use, said Barbara Barclay, general manager of Tobii's Analysis Solutions business.

Tobii has been making eye-tracking devices for researchers and the disabled for nearly a decade. The laptop is its way of showing that eye-tracking could expand beyond those niches, Barclay said, calling it an "idea generator."

The laptop is made by Lenovo Corp., and incorporates Tobii's eye-tracking cameras in a "hump" on the cover, making the entire package about twice as thick as a regular laptop. But future commercial versions could be slimmer, and are perhaps two years away, Barclay said.

Lenovo and Tobii made 20 of the laptops and planned to demonstrate them at the CeBIT technology trade show in Hanover, Germany, on Tuesday.

New ways to use computers have been proliferating in recent years. Touch screens are becoming popular on smart phones and tablet computers such as the iPad. Nintendo Corp.'s Wii game console brought motion-sensing technology to the masses. Microsoft Corp. released an accessory for its Xbox games console last year that uses an infrared camera to sense the movement of bodies in three dimensions.

PhysOrg

TStzmmalaysia
post Mar 2 2011, 09:53 AM



ROBOTICS

Attached Image

Talk To The Hand: A New Interface For Bionic Limbs

In the real world, robotic limbs have limited motion and the user can't feel what he or she is "touching." A new approach using optical fibers implanted around nerves could transmit more data and let prosthetics speak to the brain.

Previously, scientists surgically connected electrodes to the nervous system, but the electrodes seemed to harm the body's tissues, making the implants fail within months. In 2005, scientists discovered that they could stimulate a neuron to send a message by shining infrared light on it. Last September, DARPA, the Pentagon's R&D branch, awarded $4 million to a project led by Southern Methodist University engineers to attempt to connect nerves to artificial limbs using fiber optics.

The team suspects that flexible glass or polymer fiber optics will be more flesh-friendly than rigid electrodes. In addition, optical fibers transmit several signals at once, carrying 10 times as much data as their electrical counterparts. “Our goal is to do for neural interfaces what fiber optics did for the telecom industry,” says electrical engineer Marc Christensen, who is leading the SMU group. Transmitting more information faster should give bionic limbs more lifelike movements.

This month, the team will implant optical fibers to stimulate a rat’s rear leg. If it works, Christensen says, in about a decade, robotic arms could be as graceful as Steve Austin’s six-million-dollar one.

Artificial Nerves: Using fiber optics, these nerves control a new generation of bionic limbs. Rajeev Doshi

Attached Image

How Artificial Nerves Work

Sensing The Limb

When someone’s prosthetic hand touches a ball, for example, it would trigger an optical fiber in the arm to pulse a pattern of infrared light like Morse code. These light messages stimulate a sensory nerve to fire in a similar pattern, instructing the brain that the hand is feeling a round object.
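The Morse-code analogy can be sketched like this (the pulse codes and timing are invented for illustration; the article doesn't specify the real encoding):

```python
# Hypothetical pulse patterns for a few sensed stimuli; 1 = IR emitter on.
PULSE_CODE = {
    "round":  [1, 0, 1, 0, 1],
    "edge":   [1, 1, 0, 0, 1],
    "smooth": [1, 0, 0, 0, 1],
}

def pulse_train(stimulus, pulse_ms=5):
    """Turn a sensed stimulus into a timed on/off schedule for the
    infrared emitter, Morse-code style: each (state, duration) pair is
    one step of the light pattern that stimulates the sensory nerve."""
    pattern = PULSE_CODE[stimulus]
    return [(state, pulse_ms) for state in pattern]

pulse_train("round")   # [(1, 5), (0, 5), (1, 5), (0, 5), (1, 5)]
```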

Moving The Limb

Thinking about squeezing the ball sends electrical impulses from the brain to a motor nerve. When it reaches the optical fiber implanted in the nerve, the signal deforms thousands of the fiber’s spheres. This changes the pattern of light in the fiber, which instructs the prosthetic hand to grip the ball.

PopSci


TStzmmalaysia
post Mar 2 2011, 09:55 AM



ENERGY

Attached Image

Yill is one wheely useful mobile office energy storage unit

Although HP and others are breaking new ground in notebook battery life, there are times when you might find yourself away from the grid for longer than your laptop battery can last. A mobile energy storage unit like Yill, from Germany's Younicos, is said to be capable of autonomously meeting the power needs of a computer workstation for two to three days on a single charge of its quick-charge batteries. Deploying the drum-like powerhouses throughout an office could even help cut energy bills.

There are numerous reasons why you might consider using something like Yill. You may find yourself having to work for a few days in a remote location where a stable supply of power cannot be guaranteed. You might generate your own electricity from renewable resources, such as photovoltaic panels, and want to spend more time completely off-grid, even during days without sunshine. As a designer, you might want to minimize heat loss in a new office by eliminating the elevated floors or suspended ceilings usually needed for extensive cabling.

The standalone power storage unit is said to be capable of meeting the energy needs of a small mobile office for two to three days before its own rechargeable lithium titanium battery needs some juice. The battery pack benefits from quick recharge times (about four hours) and a long operating life. Yill can supply devices with up to 300 Watts of electricity, and stores about 1 kWh of energy.
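The quoted capacity and power limit make the runtime claim easy to check (a sketch that ignores conversion losses, so treat the result as an upper bound):

```python
def runtime_hours(load_watts, capacity_kwh=1.0, max_watts=300):
    """How long Yill's quoted ~1 kWh could run a given load. The capacity
    and the 300 W output limit come from the article; conversion losses
    are ignored, so this is an optimistic estimate."""
    if load_watts > max_watts:
        raise ValueError("exceeds Yill's 300 W output limit")
    return capacity_kwh * 1000 / load_watts

runtime_hours(15)   # a ~15 W laptop: about 67 hours, i.e. 2-3 working days
```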

Younicos says that when Yill needs some energy, it can be plugged into a charging station that draws power from renewable energy sources, or from the grid. Its 20.8-inch (530 mm) diameter wheels and a pull-out handle also give it mobility.

The Yill mobile energy storage unit, designed by Werner Aisslinger, will be launched during Milan Design Week from April 12 to 17.

Attached Image

Gizmag


TStzmmalaysia
post Mar 2 2011, 09:56 AM



TRANSPORTATION

Attached Image

Making waves with the Manta amphibious three-wheeler concept

The Manta three-wheeler amphibious concept vehicle has an electric motor on each of the rear wheels, which can be rotated 90 degrees to provide propulsion in water.

Although I live by a river, I don't own a boat so am not faced with having to drag a trailer down to the water's edge and unload my dinghy every time I want to cross the great expanse. I might just be persuaded to spend more time on the water, though, if there was something like the Manta waiting outside my front door. The sporty-looking three-wheeler concept is designed to be run on twin electric motors, with the rear wheels taking care of propulsion on water as well as on the road. The design is amongst the entries chosen by this year's Michelin Challenge Design judges for display at a recent auto show.

First launched in 2001, the Michelin Challenge Design was created by Michelin North America to spotlight creative thinking and innovative vehicle design. The first Challenge managed to attract just 125 entries, but this year there have been a record 970, and 34 of those were recently chosen by judges to be shown at the North American International Auto Show last month.

The Manta, by Belgium's David Cardoso Loureiro, was amongst those chosen and has been designed for those who live near water but don't want to bother swapping vehicles for travel on water and land. Many amphibious vehicle examples already exist of course, but this three-wheeler concept is a little different. The single-occupancy vehicle is electrically powered – although exact motor details are not mentioned – and the two rear wheels are designed to power the craft on land or on water.

Many modern cars sport wheels with covers/hubs that look like propeller blades, but the Manta wheels are actual blades. The wheels turn 90 degrees to provide propulsion for the craft when in water. It isn't clear whether this process would also be used to control direction on the water, as there is no rudder visible on the renderings, but each rear wheel would be independently powered.

It's unlikely to achieve the kind of speeds or distance offered by the likes of the WaterCar Python, but could be just the thing for a short jaunt over to the other side of the lake, and beyond.

Would such a design actually work? Loureiro seems to think so, saying that his concept would be "relatively simple to make, it can be a low cost vehicle that gives the driver great driving sensations." What do you think?

Gizmag

TStzmmalaysia
post Mar 3 2011, 05:47 PM



NANOTECHNOLOGY

Attached Image

Using artificial, cell-like 'honey pots' to entrap deadly viruses

Researchers from the National Institute of Standards and Technology and the Weill Cornell Medical College have designed artificial "protocells" that can lure, entrap and inactivate a class of deadly human viruses—think decoys with teeth. The technique offers a new research tool that can be used to study in detail the mechanism by which viruses attack cells, and might even become the basis for a new class of antiviral drugs.

A new paper* details how the novel artificial cells achieved a near 100 percent success rate in deactivating experimental analogs of Nipah and Hendra viruses, two emerging henipaviruses that can cause fatal encephalitis (inflammation of the brain) in humans.

"We often call them honey pot protocells," says NIST materials scientist David LaVan. "The lure, the irresistibly sweet bait that you can use to capture something."

Henipaviruses, LaVan explains, belong to a broad class of human pathogens—other examples include parainfluenza, respiratory syncytial virus, mumps and measles—called enveloped viruses because they are surrounded by a two-layer lipid membrane similar to that enclosing animal cells. A pair of proteins embedded in this membrane act in concert to infect host cells. One, the so-called "G" protein, acts as a spotter, recognizing and binding to a specific "receptor" protein on the surface of the target cell.

The G protein then signals the "F" protein, explains LaVan, though the exact mechanism isn't well understood. "The F protein cocks like a spring, and once it gets close enough, fires its harpoon, which penetrates the cell's bilayer and allows the virus to pull itself into the cell. Then the membranes fuse and the payload can get delivered into the cell and take over." It can only do it once, however.

The "honey pot" protocells have a core of nanoporous silica—inert but providing structural strength—wrapped in a lipid membrane like a normal cell. In this membrane the research team embedded bait, the protein Ephrin-B2, a known target of henipaviruses. To test it, they exposed the protocells to experimental analogs of the henipaviruses developed at Weill Cornell. The analogs are nearly identical to henipaviruses on the outside, but instead of henipaviral RNA, they bear the genome of a nonpathogenic virus that is engineered to express a fluorescent protein upon infection. This enables counting and visualizing infected cells.

In controlled experiments, the team demonstrated that the protocells are amazingly effective decoys, essentially clearing a test solution of active viruses, as measured by using the fluorescent protein to determine how many normal cells are infected by the remaining viruses.

The immediate benefit, LaVan says, is a powerful research tool for studying how envelope viruses work. "This is a nice system to study this sort of choreography between a virus and a cell, which has been very hard to study. A normal cell will have tens of thousands of membrane proteins. You might be studying this one, but maybe it's one of the others that are really influencing your experiment. You reduce this essentially impossibly complicated natural cell to a very pure system, so you now can vary the parameters and try to figure out how you can trick the viruses."

In the long run, say the researchers, the honey pot protocells could become a whole new class of antiviral drugs. Viruses, they point out, are notorious for rapidly evolving to become resistant to drugs, but because the honey pots use the virus's basic infection mechanism, any virus that evolved to avoid them likely would be less effective at infecting normal cells as well.

More information: * M. Porotto, F. Yi, A. Moscona and D.A. LaVan. Synthetic protocells interact with viral nanomachinery and inactivate pathogenic human virus. PLoS ONE published online on March 1, 2011.

PhysOrg

TStzmmalaysia
post Mar 3 2011, 09:18 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


ROBOTICS

Attached Image

Cubelets help make robotics a snap

The Cubelets robotic construction kit allows anyone to build simple robots using blocks that magnetically snap together, the overall behavior determined by the interaction between them

Do you remember those colored building blocks you would use to learn words and numbers, or just to construct mighty castles to keep your enemies outside? Well, they've now received a 21st century update in the form of the Cubelets system. Currently made up of 20 colored blocks that snap together with the help of magnets, each one has a little computer inside that gives it a function different from the others. One might be a sensor, another might have wheels, and another might sport a potentiometer. The fun starts when you put them together: the behavior of the resulting robot depends on how the blocks talk to each other. Sweet.

The Cubelets robotic construction kit is described as "a toy that fosters computational thinking about complex systems" and is based on roBlocks modules created by Eric Schweikardt and Mark D. Gross from the Computational Design Lab at Carnegie Mellon University in Pittsburgh, and brought to market by a spin-off company founded in 2008. Each kit contains 20 magnetic blocks: five Sense blocks, six Action blocks and nine Think/Utility blocks. Each block has a tiny computer inside and is described as a robot in its own right, but when combined with another block, and another, they talk to each other and become a wholly different machine.

VIDEO

Every creation needs to include a battery block for power, but beyond that requirement, experimentation is the way to go. For instance, you might connect a light-sensing block to a speaker block, and the noise from the speaker will get louder as the light increases. Then try it with a temperature-sensing block, a thinking block and a drive block that sports a motor and rollers, and see what happens. Just like Johnny 5, they can appear to be alive.
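The light-and-speaker experiment can be caricatured as a simple dataflow chain, where each block reads the value produced by its neighbor. The blocks themselves need no programming; this is just a toy model of how composition determines behavior, and all class names below are illustrative, not the actual Cubelets firmware:

```python
# Toy model of Cubelets-style composition: each block is a small unit
# that transforms its neighbor's output; the robot's overall behavior
# emerges from the order in which blocks are snapped together.
# All names here are hypothetical, not the real Cubelets API.

class Block:
    def output(self, upstream):
        raise NotImplementedError

class LightSensor(Block):
    def __init__(self, level):
        self.level = level            # 0.0 (dark) .. 1.0 (bright)
    def output(self, upstream=None):
        return self.level

class Inverse(Block):                 # a "think" block: flips the signal
    def output(self, upstream):
        return 1.0 - upstream

class Speaker(Block):                 # an "action" block: signal -> volume
    def output(self, upstream):
        return round(upstream * 100)  # volume, 0..100

def run_chain(blocks):
    """Propagate a value down a snapped-together chain of blocks."""
    value = None
    for block in blocks:
        value = block.output(value)
    return value

# Bright light feeding a speaker: louder as the light increases.
print(run_chain([LightSensor(0.8), Speaker()]))             # volume 80
# Snap an Inverse think block in between: now louder in the dark.
print(run_chain([LightSensor(0.8), Inverse(), Speaker()]))  # volume 20
```

Swapping or reordering blocks in the list changes the machine's behavior without touching any of the block classes, which is the essence of the kit's appeal.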

There's no programming involved: each block has a purpose of its own, and the overall functionality adjusts automatically according to how the different blocks are snapped together. With safety in mind, each of the magnets that hold the blocks together is mechanically attached to the housing, and each Cubelet has either five or six snap points. This allows numerous configuration permutations, enough to keep any would-be robotics engineer (young or old) amused for some time.

Gizmag

TStzmmalaysia
post Mar 3 2011, 09:21 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

Fast tunable coupler could lead to better quantum computing models

One of the subjects of immense interest to scientists (and non-scientists as well) is the development of quantum computers. However, there are many challenges associated with quantum computing. One of the difficulties in achieving practical quantum computing relates to the way the quantum bits (qubits) that make up a quantum computer are connected together.

“The easiest way to couple superconducting qubits is via fixed coupling, where the coupling strength does not change,” Bialczak says. “Many experiments done with superconducting qubits so far have been with fixed coupling, but there are lots of problems with fixed coupling architectures because they are difficult to scale up to many qubits.”

Bialczak worked with a team at UC Santa Barbara to look for a way around the problems caused by fixed coupling. They developed a novel tunable coupler design that has the potential to be scalable, possibly improving current quantum computer designs. The team’s work can be found in the Physical Review Letters article “Fast Tunable Coupler for Superconducting Qubits.”

With fixed coupling, Bialczak explains, it is easy to couple the qubits so that they can exchange information, but keeping the qubits from interacting with each other is difficult and causes errors in single-qubit operations and measurement. On top of that, problems with fixed coupling multiply as you add more qubits. “As you increase the number of qubits,” Bialczak says, “it gets increasingly difficult to isolate an individual qubit from the others. It’s like having a room full of people and wanting to isolate each person from the conversation of the other people. You won’t be able to do so because there are so many people…each person will hear one or more of the other persons’ [conversations].”

In order to overcome these issues, the team at UC Santa Barbara developed a method of fast tunable coupling. “With tunable coupling, you can directly turn off the interaction between qubits,” Bialczak says. “The coupler can also arbitrarily tune the coupling strength on nanosecond timescales, allowing for fast qubit interaction times while minimizing errors in single-qubit operations and measurement.”

For realistic quantum operations, a practical tunable coupler is needed. Such a coupler would need to be tuned quickly, on the order of nanoseconds. Additionally, large on/off ratios are required, as well as scalability so that many qubits can be coupled. “Previous demonstrations of tunable coupling were able to show one or more of the above in a given device, but were unable to combine all the criteria in one device, making them of limited use in realistic quantum computing experiments,” Bialczak points out. The UCSB team hopes that their new tunable coupler will satisfy all these criteria.

In order to create the tunable coupler circuit, two superconducting qubits are coupled using a fixed negative mutual inductance. This mutual inductance is shunted with a current-biased Josephson junction. The junction acts as a tunable positive inductance and can therefore cancel out the fixed coupling due to the mutual inductance.
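The cancellation trick can be sketched numerically. Assuming the textbook expression for a current-biased junction's inductance, L_J = Φ0/(2π·I_c·cos δ) with sin δ = I/I_c, and treating the net coupling as simply the tunable junction inductance minus the fixed mutual inductance (a toy model with hypothetical component values, not the circuit parameters from the paper):

```python
import math

PHI0 = 2.067833848e-15   # magnetic flux quantum, in webers

def josephson_inductance(i_bias, i_c):
    """Textbook inductance of a current-biased Josephson junction:
    L_J = Phi0 / (2*pi*I_c*cos(delta)), with sin(delta) = I_bias/I_c.
    L_J grows as the bias approaches the critical current I_c."""
    cos_delta = math.sqrt(1.0 - (i_bias / i_c) ** 2)
    return PHI0 / (2 * math.pi * i_c * cos_delta)

def effective_coupling(i_bias, i_c, m_fixed):
    """Toy net coupling: tunable positive junction inductance minus the
    fixed negative mutual inductance (unit proportionality assumed)."""
    return josephson_inductance(i_bias, i_c) - m_fixed

i_c = 1e-6                                        # hypothetical 1 uA critical current
m_fixed = 1.2 * josephson_inductance(0.0, i_c)    # hypothetical fixed coupling

# Raising the bias current increases L_J, sweeping the net coupling
# through zero (the "off" point) and out the other side:
for frac in (0.0, 0.5, 0.8, 0.9):
    print(frac, effective_coupling(frac * i_c, i_c, m_fixed))
```

The zero crossing is what lets the interaction be switched off entirely, and since the junction responds on nanosecond timescales, the sweep can be fast.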

“Our coupler also has the added feature of being modular and being able to couple elements over large spatial distances. We can also couple them to other devices and possibly even to qubits from other architectures,” Bialczak says.

Right now, the coupler is being used to develop a new measurement scheme that doesn’t destroy prepared quantum states. “This is commonly called a quantum non-demolition measurement scheme,” Bialczak says.

Bialczak has great hopes for the applications of this coupler. “We feel this is a general modular superconducting circuit element that can have many applications, even outside of quantum computation.”

PhysOrg

More information: R. Bialczak, et. al., “Fast Tunable Coupler for Superconducting Qubits,” Physical Review Letters (2011). Available online.

TStzmmalaysia
post Mar 3 2011, 09:26 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

Scientists create one-dimensional ferroelectric ice

Everyone knows that when water freezes, it forms ice. But a lesser known fact is that there is not one, but many different kinds of ice, depending on the way the ice crystals are arranged. In a new study, a team of chemists has developed a new method for synthesizing a type of ferroelectric ice, which is crystallized so that all of its bonds line up in the same direction, producing a large electric field.

The researchers, including Hai-Xia Zhao, Xiang-Jian Kong and La-Sheng Long, along with their coauthors from Xiamen University in Xiamen, China, and Hui Li and Xiao Cheng Zeng from the University of Nebraska in the US, have published their study in a recent issue of the Proceedings of the National Academy of Sciences.

Every water molecule carries a tiny electric field. But because water molecules usually freeze in a somewhat random arrangement, with their bonds pointing in different directions, the ice’s total electric field tends to cancel out. In contrast, the bonds in ferroelectric ice all point in the same direction at low enough temperatures, so that it has a net polarization in one direction that produces an electric field.
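The cancellation-versus-alignment picture is easy to check numerically with a toy 2D dipole model (purely illustrative, not a simulation of real ice):

```python
import math, random

def net_polarization(n, aligned, seed=0):
    """Sum n unit dipoles in 2D and return the net moment per molecule:
    aligned dipoles add coherently, random orientations mostly cancel."""
    rng = random.Random(seed)
    px = py = 0.0
    for _ in range(n):
        theta = 0.0 if aligned else rng.uniform(0, 2 * math.pi)
        px += math.cos(theta)
        py += math.sin(theta)
    return math.hypot(px, py) / n

print(net_polarization(10_000, aligned=True))    # 1.0: fully polarized
print(net_polarization(10_000, aligned=False))   # ~0.01: fields cancel
```

For random orientations the residual moment per molecule shrinks like 1/sqrt(n), which is why ordinary ice shows no macroscopic field while the aligned, ferroelectric arrangement keeps its full polarization.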

Ferroelectric ice is thought to be extremely rare; in fact, scientists are still investigating whether or not pure three-dimensional ferroelectric ice exists in nature. Some researchers have proposed that ferroelectric ice may exist on Uranus, Neptune, or Pluto. Creating pure 3D ferroelectric ice in the laboratory seems next to impossible, since it would take an estimated 100,000 years to form without the assistance of catalysts. So far, all ferroelectric ices produced in the laboratory are less than three dimensions and in mixed phases (heterogeneous).

In the new study, the scientists have synthesized a one-dimensional, single-phase (homogeneous) ferroelectric ice by freezing a one-dimensional water ‘wire.’ As far as the scientists know, this is the first single-phase ferroelectric ice synthesized in the laboratory.

To create the water wire, the researchers designed very thin nanochannels that can hold just 96 H2O molecules per crystalline unit cell. By lowering the temperature from a starting point of 350 K (77°C, 171°F), they found that the water wire undergoes a phase transition below 277 K (4°C, 39°F), transforming from 1D liquid to 1D ice. The ice also exhibits a large dielectric anomaly at this temperature and at 175 K (-98°C, -144°F).

“We know the freezing point should be different from normal water because the water is confined to nanochannels and not in a normal environment,” Zeng told PhysOrg.com. “Why 1D water has a higher temperature in this case is still an open question.”

As the scientists explained, the hydrogen-bonding interactions among H2O molecules in the water wire and the nanochannel play an important role in the ferroelectricity of the ice. While the hydrogen bonds between the water and nanochannel do not break, the remaining hydrogen atoms in the ice rotate under an opposite electric field. As a result, the polarity of the ferroelectric ice can be reversed by reversing the external electric field, a property not seen in everyday water and ice.

Overall, the production of a 1D, single-phase ferroelectric ice using water confined to a nanochannel provides a new way to synthesize ferroelectric materials. The new method could also help scientists better understand the unique properties of ferroelectric ice, which could have applications in the biological sciences, geoscience, and nanoscience. As Zeng noted, ferroelectric ice could potentially have electrical applications, with the efforts of engineers working in nanotechnology.

“[The study] shows that the freezing of water can be greatly affected by the confinement and water/surface interaction,” Zeng said. “So knowledge and insights gained through research in this field will help scientists to control some properties of water through designing different confinements.”

PhysOrg

More information: Hai-Xia Zhao, et al. “Transition from one-dimensional water to ferroelectric ice within a supramolecular architecture.” PNAS Early Edition.



TStzmmalaysia
post Mar 3 2011, 09:28 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

Bacteria Communicate With Each Other Through Nanotubes

A pathway whereby bacteria communicate with each other has been discovered by researchers at the Hebrew University of Jerusalem. The discovery has important implications for efforts to cope with the spread of harmful bacteria in the body.

Bacteria are known to communicate in nature primarily via the secretion and receipt of extracellular signaling molecules, said Prof. Sigal Ben-Yehuda of the Institute for Medical Research Israel-Canada (IMRIC) at the Hebrew University Faculty of Medicine, head of the research team on the phenomenon, whose work is currently reported in the journal Cell. This communication enables bacteria to execute sophisticated tasks such as dealing with antibiotic production and secretion of virulence factors.

Ben-Yehuda's group identified a previously uncharacterized type of bacterial communication mediated by nanotubes that bridge neighboring cells. The researchers showed that these nanotubes connect bacteria of the same and different species. Via these tubes, bacteria are able to exchange small molecules, proteins and even small genetic elements (known as plasmids).

This mechanism can facilitate the acquisition of new features in nature, such as antibiotic resistance. In this view, gaining a better molecular understanding of nanotube formation could lead to the development of novel strategies to fight against pathogenic bacteria, said Ben-Yehuda.

ScienceDaily

TStzmmalaysia
post Mar 3 2011, 09:30 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

Miniaturized chips very energy efficient thanks to water-cooling

Energy consumption poses a critical challenge in the development of next-generation supercomputers and IT systems. Within the next 10 years, IBM scientists and developers aim to build computers featuring exascale computing performance, but with an absolute energy consumption that is not much higher than that of today’s largest systems. Exascale computers are capable of reaching a performance of one ExaFLOP/s, which corresponds to 10^18 floating point operations per second. This is about 300 times faster than today’s fastest supercomputer.

New water-cooling technologies that wick off heat right where it is being generated—directly at the chip—offer a promising route to boost significantly the overall energy efficiency of computers. At this year's CeBIT, IBM is presenting its first so-called hot-water-cooled systems, which will provide a sneak preview of future innovations: Supercomputers the size of sugar cubes.

First successful projects

Cooling with hot water combines several advantages: It eliminates the need for energy-intensive coolers, which greatly reduces overall energy consumption. In addition, the removed heat can be reused directly for such purposes as to heat buildings or as process heat. This will improve the CO2 balance of computers and data centers significantly compared to similar air-cooled systems.

In a pilot project, IBM scientists and engineers from Zurich, Switzerland, and Böblingen, Germany, built a revolutionary hot-water-cooled supercomputer for ETH Zurich, a world-renowned technical university. Dubbed Aquasar, the novel supercomputer is cooled with hot water instead of cold air, and the dissipated heat is used to heat the ETH campus. The innovative system consumes up to 40% less energy than a comparable air-cooled computer. And the CO2 balance is impressive: By directly using waste heat, the system reduces net emissions by up to 85%.

This is made possible by an innovative cooling system and powerful microchannel coolers attached to the back of the processors. Computer chips, which develop ten times more heat per square centimeter than a hotplate, can thus be cooled efficiently even with hot water at 60°C. The entire cooling system of the computer is a closed, hermetically sealed circuit, enabling valuable waste heat to be recovered. Up to 80% of that waste heat can be repurposed by a heat exchanger that delivers it to a second, external heat cycle. In the case of Aquasar at ETH Zurich, the waste heat is fed into the campus heating system. Dr. Ingmar Meijer, Aquasar project manager at IBM Research - Zurich, explains: "With Aquasar we reached an important milestone in the development of low-energy and CO2-neutral data centers. This sends an important signal to the industry."
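The heat-reuse arithmetic is straightforward: essentially all of the electrical power drawn by the system ends up as heat, and the 80% recovery figure quoted for Aquasar determines how much of it can be fed into the heating loop. A toy balance (the 20 kW system size here is hypothetical, chosen only for illustration):

```python
def heat_balance(it_power_kw, recovery_fraction=0.80):
    """Toy energy balance for a hot-water-cooled system: electrical power
    ends up as heat; a fraction (80% per the Aquasar figures) is captured
    by the heat exchanger and fed into the building-heating loop."""
    reused = recovery_fraction * it_power_kw     # delivered to campus heating
    rejected = it_power_kw - reused              # still dumped to ambient
    return reused, rejected

reused, rejected = heat_balance(20.0)   # hypothetical 20 kW system
print(reused, rejected)                 # 16.0 kW reused, 4.0 kW rejected
```

Every kilowatt reused offsets heating that would otherwise be generated separately, which is where the quoted CO2 reduction comes from.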

The next hot-water-cooled IBM system is already on the drawing board, this time in Germany. It will be significantly larger than Aquasar and is expected to go into operation at the Leibniz Supercomputing Centre (LRZ) in Munich, Germany, by 2012. Called SuperMUC, this new computer will be part of the Partnership for Advanced Computing in Europe (PRACE) HPC infrastructure and made available to scientists and research institutes throughout Europe. The system has a peak performance of 3 petaflop/s (10^15 arithmetic operations per second) and is based on an IBM System x iDataPlex, which contains more than 14,000 Intel Xeon next-generation processors. SuperMUC will be more powerful than 110,000 PCs, enabling LRZ scientists to verify theories, develop experiments and predict results to an unprecedented extent—all this, while still requiring massively less energy.

Dr. Michael Malms, Director of Open Systems and High Performance Computing at the IBM Research and Development Center in Böblingen, Germany, states: "With the installation of SuperMUC we are taking the first step of research towards a system that can be used equally by academia and businesses. It is energy- and cost-effective as well as flexible, which makes it useful for many applications."

Looking further ahead, so-called 3D chips promise even higher performance with lower energy consumption. Paving the way for exascale computers, IBM scientists are pursuing extensive research on 3D integration. 3D chip architectures, in which processors are stacked on top of each other, not only reduce the surface area of the chip but also shorten the communication distance between the chips and increase the bandwidth for data transmission on the chip many times.

One of the main limitations in developing 3D chip layouts currently lies in the performance of conventional coolers. More complex designs with extremely thin, stacked processors can reach power densities of up to 5 kW/cm3 (kilowatts per cubic centimeter)—a power density which exceeds that of any current heat engine, such as internal combustion engines, by ten times.

At IBM Research – Zurich, novel concepts to scale cooling technologies for 3D chip stacks are being explored. In test systems, water is piped directly between the individual chip layers through microscopic channels measuring only about 50 microns. Such designs allow 3D stacks of heating elements to be cooled very efficiently with the heat fluxes released by today's processors.

Before first fully functional prototypes can be realized, which is expected to happen in the next seven to ten years, researchers must still overcome several technical hurdles. Their aim is to develop a system with an optimized flow of water through the thin layers, which at the same time reliably isolates the electronics from the water. A special difficulty is posed by the thousands of electronic connections that run vertically through the chip stack. In fact, the density of the components in such a system would be comparable to that of the human brain, which is intersected by millions of nerve fibers for signal processing, and features tens of thousands of blood capillaries for nutrients and heat transfer—without interfering with each other.

The three-dimensional integration of computer chips is one of the most promising approaches to boost performance tremendously while reducing energy consumption considerably. Supercomputers as small as sugar cubes could thus become reality.

From: PhysOrg

Article Provided by IBM
TStzmmalaysia
post Mar 3 2011, 09:33 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


APPLIED SCIENCES

Attached Image

Improved method developed to locate ships in storms

There are already systems that detect ships at sea, but a group of engineers from the UAH, led by the researcher Raúl Vicen, has introduced a new development, involving "the use of artificial intelligence techniques and improvements in the templates used to select input data".

The team has come up with a new detection method "that outperforms the one that has generally been used until now, as well as offering the advantages of low computational costs, and which can also be used in real time".

The new system, the details of which are published in the journal IET Radar, Sonar & Navigation, involves firstly gathering information from radar data using a series of templates designed by the scientists. This phase makes use of regular radar tracking data (both horizontal and vertical), as well as other more advanced modes (diagonal).

An artificial neural network architecture called a "multilayer perceptron", which is capable of learning from its environment, is then used. This makes it possible to differentiate between ships and waves in the confused radar images seen during storms.
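For the curious, a multilayer perceptron is just a stack of weighted sums passed through nonlinearities. A minimal forward pass might look like the sketch below; the weights are random placeholders and the 12-input size is an assumption for illustration, not the trained detector from the study:

```python
import numpy as np

def mlp_forward(x, w1, b1, w2, b2):
    """Minimal multilayer perceptron: one hidden layer with a sigmoid
    nonlinearity, producing a ship/clutter score in (0, 1)."""
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
    hidden = sigmoid(x @ w1 + b1)        # hidden-layer activations
    return sigmoid(hidden @ w2 + b2)     # detection score

rng = np.random.default_rng(0)
n_inputs, n_hidden = 12, 8               # e.g. 12 template-selected radar cells
w1 = rng.normal(size=(n_inputs, n_hidden))
b1 = np.zeros(n_hidden)
w2 = rng.normal(size=(n_hidden, 1))
b2 = np.zeros(1)

score = mlp_forward(rng.normal(size=n_inputs), w1, b1, w2, b2)
print(float(score))   # a value in (0, 1); threshold it to declare "ship"
```

In practice the weights would be trained on labeled radar returns (ship vs. sea clutter) rather than drawn at random, which is the "learning from its environment" the article refers to.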

Test passed in the North Sea

The technique has been successfully trialled using data from an X-band sea radar system (the most common in these kinds of devices, with frequencies of between 7 and 12.5 gigahertz), located on the German FINO-1 research platform in the North Sea.

"The fact that we obtained results with real data shows that this method can be installed in ship and ocean platform radar systems, without any problem", the authors explain.

According to the study, this system offers "substantial" improvements in comparison with the conventional systems used for detecting ships, such as the CA-CFAR technique (Cell Averaging-Constant False Alarm Rate). Radar systems usually use these algorithms to detect targets among the waves, or 'sea clutter', but the proposed system "outperforms the current systems in terms of its detection rates".
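The CA-CFAR baseline mentioned here is simple enough to sketch: each cell under test is compared against a threshold derived from the average power of its neighboring reference cells, with a few guard cells skipped to avoid the target leaking into its own background estimate. A minimal version with hypothetical parameters:

```python
import numpy as np

def ca_cfar(power, n_ref=8, n_guard=2, scale=4.0):
    """Cell Averaging CFAR: for each cell under test, estimate the local
    clutter level from n_ref reference cells on each side (skipping
    n_guard guard cells) and flag cells exceeding scale * local mean."""
    detections = []
    for i in range(len(power)):
        lo = power[max(0, i - n_guard - n_ref): max(0, i - n_guard)]
        hi = power[i + n_guard + 1: i + n_guard + 1 + n_ref]
        ref = np.concatenate([lo, hi])
        if ref.size and power[i] > scale * ref.mean():
            detections.append(i)
    return detections

rng = np.random.default_rng(1)
clutter = rng.exponential(1.0, size=200)   # toy sea-clutter power samples
clutter[120] += 40.0                       # inject a strong ship echo
dets = ca_cfar(clutter)
print(dets)   # cell 120 is flagged (a few clutter spikes may appear too)
```

Because the threshold tracks the local average, the false-alarm rate stays roughly constant as the sea state changes; the study's point is that a trained MLP outperforms this adaptive-threshold approach on real storm data.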

Article from PhysOrg

More information: R. Vicen-Bueno, R. Carrasco-Álvarez, M.P. Jarabo-Amores, J.C. Nieto-Borge and M. Rosa-Zurera. "Ship detection by different data selection templates and multilayer perceptrons from incoherent maritime radar data". IET Radar, Sonar & Navigation 5(2): 144-154, February 2011. http://dx.doi.org/10.1049/iet-rsn.2010.0001
TStzmmalaysia
post Mar 3 2011, 09:43 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

Time travel experiment demonstrates how to avoid Einstein's grandfather paradox

Among the many intriguing concepts in Einstein’s relativity theories is the idea of closed timelike curves (CTCs), which are paths in spacetime that return to their starting points. As such, CTCs offer the possibility of traveling back in time. But, as many science fiction films have addressed, time travel is full of potential paradoxes. Perhaps the most notable of these is the grandfather paradox, in which a time traveler goes back in time and kills her grandfather, preventing her own birth.

In a new study, a team of researchers has proposed a new theory of CTCs that can resolve the grandfather paradox, and they also perform an experiment showing how such a scheme works. The researchers, led by Seth Lloyd from MIT, along with scientists from the Scuola Normale Superiore in Pisa, Italy; the University of Pavia in Pavia, Italy; the Tokyo Institute of Technology; and the University of Toronto, have published their study in a recent issue of Physical Review Letters. The concepts in the study are similar to an earlier study by some of the same authors that was posted at arXiv.org last year.

“Einstein's theory of general relativity supports closed timelike curves,” Lloyd told PhysOrg.com. “For decades researchers have argued over how to treat such objects quantum mechanically. We believe that our theory is the correct theory of such objects. Moreover, our theory shows how time travel might be accomplished even in the absence of general relativistic closed timelike curves.”

In the new theory, CTCs are required to behave like ideal quantum channels of the sort involved in teleportation. In this theory, self-consistent CTCs (those that don’t result in paradoxes) are postselected, and are called “P-CTCs.” As the scientists explain, this theory differs from the widely accepted quantum theory of CTCs proposed by physicist David Deutsch, in which a time traveler maintains self-consistency by traveling back into a different past than the one she remembers. In the P-CTC formulation, time travelers must travel to the past they remember.

Although postselecting CTCs may seem complicated, it can actually be investigated experimentally in laboratory simulations. By sending a “living” qubit (i.e., a bit in the state 1) a few billionths of a second back in time to try to “kill” its former self (i.e., flip to the state 0), the scientists show that only photons that don’t kill themselves can make the journey.

“P-CTCs work by projecting out part of the quantum state,” Lloyd said. “Another way of thinking about closed timelike curves is the following. In normal physics (i.e., without closed timelike curves), one specifies the state of a system in the past, and the laws of physics then tell how that state evolves in the future. In the presence of CTCs, this prescription breaks down: the state in the past plus the laws of physics no longer suffice to specify the state in the future. In addition, one has to supply final conditions as well as initial conditions. In our case, these final conditions specify the state when it enters the closed timelike curve in the future. These final conditions are what project out part of the quantum state as described above.

“Although one would need a real general relativistic CTC actually to impose final conditions, we can still simulate how such a CTC would work by setting up the initial condition, letting the system evolve, and then making a measurement. One of the possible outcomes of the measurement corresponds to the final condition that we would like to impose. Whenever that outcome occurs, then everything that has happened in the experiment up to that point is exactly the same as if the photon had gone backward in time and tried to kill its former self. So when we ‘post-select’ that outcome, the experiment is equivalent to a real CTC.”

To demonstrate, the scientists stored two qubits in a single photon, one of which represents the forward-traveling qubit, and one of which represents the backward-traveling qubit. The backward-traveling qubit can teleport through a quantum channel (CTC) only if the CTC ends by projecting the two entangled qubits into the same state.

After the qubits are entangled, their states are measured by two probe qubits. Next, a “quantum gun” is fired at the forward-traveling qubit, which, depending on the gun’s angle, may or may not rotate the qubit’s polarization. The qubits’ states are measured again to find out if the gun has flipped the forward-traveling qubit’s polarization or not. If both qubits are in the same state (00 or 11), then the gun has not flipped the polarization and the photon “survives.” If the qubits’ states are not equal (01 or 10), then the photon has “killed” its past self. The experiment’s results showed that the qubits’ states were almost always equal, showing that a qubit cannot kill its former self.
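The post-selection logic can be caricatured with a classical Monte Carlo (a deliberately crude analogy, not the actual quantum-optics experiment): the gun flips the "living" qubit with probability sin²(θ/2), and only runs where the record is self-consistent, i.e. where the two qubit copies agree, are kept:

```python
import math, random

def pctc_gun(theta, trials=100_000, seed=42):
    """Crude classical caricature of the post-selected 'quantum gun':
    the gun flips the living qubit (1 -> 0) with probability
    sin^2(theta/2); post-selection keeps only self-consistent runs,
    those in which the forward and backward qubit copies end up equal."""
    rng = random.Random(seed)
    p_flip = math.sin(theta / 2) ** 2
    kept = alive = 0
    for _ in range(trials):
        final = 0 if rng.random() < p_flip else 1   # gun fires at qubit = 1
        if final == 1:                              # copies agree: keep run
            kept += 1
            alive += 1
    return kept / trials, (alive / kept if kept else float("nan"))

kept_frac, survival = pctc_gun(theta=math.pi / 2)
print(kept_frac)  # ~0.5: at theta = pi/2, half the runs pass post-selection
print(survival)   # 1.0: no post-selected run shows the qubit killing itself
```

The fraction of runs surviving post-selection drops as the gun gets "deadlier" (θ approaching π), but within the kept ensemble the paradoxical outcome never appears, mirroring the experiment's finding that the qubits' states were almost always equal.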

The scientists noted that their experiment cannot test whether an actual CTC obeys their new theory, since it is currently unknown whether CTCs exist at all. In the future, they plan to perform more tests to better understand time travel paradoxes.

“We want to perform the so-called ‘unproved theorem paradox’ experiment, in which the time traveler sees an elegant proof of a theorem in a book,” Lloyd said. “She goes back in time and shows the proof to a mathematician, who includes the proof in the book that he is writing. Of course, the book is the same book from which the time traveler took the proof in the first place. Where did the proof come from? Our theory has a specific prediction/retrodiction for this paradox, which we wish to test experimentally.”
TStzmmalaysia
post Mar 3 2011, 09:50 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


ROBOTICS

Attached Image

Sticky feet send insect-bot climbing up the walls

An open window hundreds of metres up in a sheer glass tower block. No machine could reach it, surely? Step forward an insect-bot, with sticky feet that help it climb.

All insects squirt a sticky fluid from their feet as they walk. This creates a liquid "bridge" between foot and surface, forming a strong glue-like bond through surface tension and molecular forces.

Minghe Li, a roboticist at Tongji University in Shanghai, China, is trying to replicate this effect to create the next generation of climbing robots. He has designed an insect-bot that releases a mixture of honey and water onto its feet when it wants to climb. This fluid creates the liquid bridges favoured by insects.

But the prototype has not been as successful as an insect. So Li took another look at insects, such as stick insects, and saw they also have grooves on their feet, which enhance their sticking power.

Li is now replicating these grooves in a silicon foot pad etched with hexagons that spread out when pressure is applied as a robot walks.

The grooves lead to the creation of more fluid bridges between the foot and surface than occur with a smooth foot. This makes the pads 50 per cent more adhesive, according to Li's mathematical models.
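The strength of such a liquid bridge can be estimated with the textbook thin-film formula: the Laplace pressure under a wetting film of thickness h is roughly 2γ·cos θ/h, and multiplying by the wetted area gives the adhesion force. A back-of-the-envelope sketch (all numbers hypothetical, and this is not Li's actual model):

```python
import math

def capillary_force(gamma, contact_angle_deg, film_thickness, area):
    """Textbook thin-film estimate of capillary adhesion: the Laplace
    pressure under a wetting film is ~ 2*gamma*cos(theta)/h; multiply
    by the wetted area to get the holding force."""
    pressure = (2 * gamma * math.cos(math.radians(contact_angle_deg))
                / film_thickness)
    return pressure * area

gamma = 0.07      # surface tension of a watery fluid, N/m (approximate)
area = 1e-4       # 1 cm^2 of wetted footpad (hypothetical)

f_thin = capillary_force(gamma, 10, 1e-6, area)    # 1 um film: ~14 N
f_thick = capillary_force(gamma, 10, 1e-5, area)   # 10 um film: ~1.4 N
print(f_thin, f_thick)
```

Spreading the same fluid into thinner, more numerous bridges sharply increases the total force, which is consistent with the grooves' modeled 50 per cent gain and with Li's hope that they will thin out his robot's oversized droplets.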

This design may help Li resolve another problem he has with his groove-free prototype: getting the robot to secrete the sticky fluid slowly enough. Insects release a nanometre-thick layer of fluid, but Li's robot currently releases large droplets. He hopes the grooves will spread the droplets more evenly across the footpads.

Li's insect-bot faces stiff competition from robots modelled on another climber: the gecko. Instead of relying on sticky fluid, geckos use millions of hairs on their footpads to stick to surfaces. The hairs bend on contact with the surface, and the intermolecular van der Waals force causes the two to stick together.

These hairs have been mimicked in the lab using carbon nanotubes. So far, gecko-bots, such as the Stickybot developed by Mark Cutkosky and his team at Stanford University in California, are winning the race to the top as they can scale vertical walls, while Li's insect-bot can't handle inclines over 75 degrees. However, gecko-bots struggle on rough, wet or salty surfaces, as water and salt can interfere with van der Waals forces, says Kellar Autumn of Lewis & Clark College in Portland, Oregon, who studies gecko adhesion and how to apply it to robots.

Not only that, but making hairs that are both ultra-thin and bendy is difficult. As nanotubes get thinner, they also become stiffer and the force needed to bend them means larger motors are required, resulting in bigger robots.

Li's work is preliminary and gecko-bots have their own challenges, says Autumn, but combining the adhesion methods of insects and geckos could one day lead to the ultimate sticking machine.

NewScientist

TStzmmalaysia
post Mar 3 2011, 09:53 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

Penn physicists develop scalable method for making graphene

As explained in a recently published study, a Penn research team was able to create high-quality graphene that is just a single atom thick over 95% of its area, using readily available materials and manufacturing processes that can be scaled up to industrial levels.

"I'm aware of reports of about 90%, so this research is pushing it closer to the ultimate goal, which is 100%," said the study's principal investigator, A.T. Charlie Johnson, professor of physics. "We have a vision of a fully industrial process."

Other team members on the project included postdoctoral fellows Zhengtang Luo and Brett Goldsmith, graduate students Ye Lu and Luke Somers and undergraduate students Daniel Singer and Matthew Berck, all of Penn's Department of Physics and Astronomy in the School of Arts and Sciences.

The group's findings were published on Feb. 10 in the journal Chemistry of Materials.

Graphene is a chicken-wire-like lattice of carbon atoms arranged in thin sheets a single atomic layer thick. Its unique physical properties, including unbeatable electrical conductivity, could lead to major advances in solar power, energy storage, computer memory and a host of other technologies. But complicated manufacturing processes and often-unpredictable results currently hamper graphene's widespread adoption.

Producing graphene at industrial scales isn't inhibited by the high cost or rarity of natural resources – a small amount of graphene is likely made every time a pencil is used – but by the difficulty of making meaningful quantities with consistent thinness.

One of the more promising manufacturing techniques is CVD, or chemical vapor deposition, which involves blowing methane over thin sheets of metal. The carbon atoms in methane form a thin film of graphene on the metal sheets, but the process must be done in a near vacuum to prevent multiple layers of carbon from accumulating into unusable clumps.


The Penn team's research shows that single-layer-thick graphene can be reliably produced at normal pressures if the metal sheets are smooth enough.

"The fact that this is done at atmospheric pressure makes it possible to produce graphene at a lower cost and in a more flexible way," Luo, the study's lead author, said.

Whereas other methods involved meticulously preparing custom copper sheets in a costly process, Johnson's group used commercially available copper foil in their experiment.

"You could practically buy it at the hardware store," Johnson said.

Other methods make expensive custom copper sheets in an effort to get them as smooth as possible; defects in the surface cause the graphene to accumulate in unpredictable ways. Instead, Johnson's group "electropolished" their copper foil, a common industrial technique used in finishing silverware and surgical tools. The polished foil was smooth enough to produce single-layer graphene over 95% of its surface area.

Working with commercially available materials and chemical processes that are already widely used in manufacturing could lower the bar for commercial applications.

"The overall production system is simpler, less expensive, and more flexible," Luo said.

The most important simplification may be the ability to create graphene at ambient pressures, as it would take some potentially costly steps out of future graphene assembly lines.

"If you need to work in high vacuum, you need to worry about getting it into and out of a vacuum chamber without having a leak," Johnson said. "If you're working at atmospheric pressure, you can imagine electropolishing the copper, depositing the graphene onto it and then moving it along a conveyor belt to another process in the factory."

EurekAlert

This research was supported by Penn's Nano/Bio Interface Center through the National Science Foundation.

TStzmmalaysia
post Mar 3 2011, 09:54 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


ENERGY

Attached Image

Power Flower Wind Turbine Trees Could Domesticate Wind Energy

Large, utility-scale wind turbines are fantastic and can generate a lot of electricity, but they also require a lot of land and infrastructure. What if we could harness the power of the wind closer to home in an unobtrusive way? NL Architects explored this idea of 'domesticating' wind power with Power Flowers - tree-like structures outfitted with multiple vertical axis wind turbines. These small, almost noiseless generators could be placed in urban settings and serve as both distributed generation and artistic sculptures.

NL Architects has been exploring advanced wind power generation since 2006, and Power Flowers is its latest iteration. Advances in vertical axis wind turbine technology have allowed its designs to capture more wind, more efficiently. Power Flowers relies on 4 kW Eddy turbines by Urban Green Energy, mounted in groupings of either 3 or 12. These wind trees need less space, can be installed where the power is actually used, create very little noise, do not need to be pointed into the wind and have an interesting and aesthetically pleasing design.
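As a rough sanity check on scale, the nameplate capacity of each grouping follows directly from the 4 kW rating quoted above. A minimal sketch (capacity figures only; actual output depends on wind conditions and would be lower):

```python
TURBINE_KW = 4  # rated power of one Eddy turbine, per the article

# nameplate capacity per "tree" for the two groupings mentioned
capacities = {count: count * TURBINE_KW for count in (3, 12)}
# a 3-turbine tree is rated at 12 kW, a 12-turbine tree at 48 kW
```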

Power Flowers could easily be integrated into the urban landscape similar to cell phone towers, lights, electric poles and art sculptures. Less land intensive than even a pole-mounted residential horizontal axis wind turbine, the Power Flowers could easily fit in a back yard. By domesticating wind energy into a form that can be used as distributed generation, we can make wind power more accessible for everyone and as easy to integrate as rooftop solar systems.

Inhabitat

TStzmmalaysia
post Mar 3 2011, 09:57 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


ENERGY

Attached Image

Urban Field: Piezoelectric Trees Harvest Rainwater and Generate Energy

We recently showcased Mario Caceres and Christian Canonico’s TREEPODS concept, which sought to sow hundreds of air-cleaning artificial trees through the streets of Boston. Designer Anthony DiMari has conceived of another impressive entry – a piezoelectric field of artificial trees that is able to collect rainwater and generate electricity. Dubbed URBAN FIELD, the project is a finalist in the SHIFTboston contest to create a synthetic tree that yields all the benefits of Mother Nature without requiring water and soil.

DiMari’s URBAN FIELD absorbs water from rainfall and the natural water table to irrigate the Rose F. Kennedy Greenway Landscape. The structures also collect wind energy, which powers L.E.D. lights within that glow at night. The grid is not random, but arranged to take advantage of the intensity of the wind currents that come off the harbor. Much like a tree, the field is used to take advantage of solar energy and water resources.

URBAN FIELD is designed to utilize the given sources of energy to their full potential, maximizing what is already there. The grid is completely changeable and adjustable to coincide with changing wind conditions as well, and creates an interactive park space for visitors to engage with.

Inhabitat

TStzmmalaysia
post Mar 3 2011, 10:00 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

Spinal Cord Injury: Human Cells Derived from Stem Cells Restore Movement in Animal Models

For the first time, scientists have shown that a specific type of human cell, generated from stem cells and transplanted into spinal cord injured rats, provides tremendous benefit -- not only repairing damage to the nervous system but also helping the animals regain locomotor function.

The study, published March 2 in the journal PLoS ONE, focuses on human astrocytes -- the major support cells in the central nervous system -- and indicates that transplantation of these cells represents a potential new avenue for the treatment of spinal cord injuries and other central nervous system disorders.

Working together closely, research teams at the University of Colorado School of Medicine and University of Rochester Medical Center have made a major breakthrough in the use of human astrocytes for repairing injured spinal cords in rats.

"We've shown in previous research that the right types of rat astrocytes are beneficial, but this study brings it up to the human level, which is a huge step," said Chris Proschel, Ph.D., lead study author and assistant professor of Genetics at the University of Rochester Medical Center. "What's really striking is the robustness of the effect. Scientists have claimed repair of spinal cord injuries in rats before, but the benefits have been variable and rarely as strong as what we've seen with our transplants."

There is one caveat to the finding -- not just any old astrocyte will do. Using stem cells known as human fetal glial precursor cells, researchers generated two types of astrocytes by switching on or off different signals in the cells. Once implanted in the animals, they discovered that one type of human astrocyte promoted significant recovery following spinal cord injury, while another did not.

"Our study is unique in showing that different types of human astrocytes, derived from the exact same population of human precursor cells, have completely different effects when it comes to repairing the injured spinal cord," noted Stephen Davies, Ph.D., first author and associate professor in the Department of Neurosurgery at the University of Colorado School of Medicine. "Clearly, not all human astrocytes are equal when it comes to promoting repair of the injured central nervous system."

The research teams from Rochester and Colorado also found that transplanting the original stem cells directly into spinal cord injured rats did not aid recovery. Researchers believe this approach -- transplanting undifferentiated stem cells into the damaged area and hoping the injury will cause the stem cells to turn into the most useful cell types -- is probably not the best strategy for injury repair.

According to Mark Noble, director of the University of Rochester Stem Cell and Regenerative Medicine Institute, "This study is a critical step toward the development of improved therapies for spinal cord injury, both in providing very effective human astrocytes and in demonstrating that it is essential to first create the most beneficial cell type in tissue culture before transplantation. It is clear that we cannot rely on the injured tissue to induce the most useful differentiation of these precursor cells."

Attached Image

To create the different types of astrocytes used in the experiment, researchers isolated human glial precursor cells, first identified by Margot Mayer-Proschel, Ph.D., associate professor of Genetics at the University of Rochester Medical Center, and exposed these precursor cells to two different signaling molecules used to instruct different astrocytic cell fates -- BMP (bone morphogenetic protein) or CNTF (ciliary neurotrophic factor).

Transplantation of the BMP human astrocytes provided extensive benefit, including up to a 70% increase in protection of injured spinal cord neurons, support for nerve fiber growth and recovery of locomotor function, as measured by a rat's ability to cross a ladder-like track.

In contrast, transplantation of the CNTF astrocytes, or of the stem cells themselves, failed to provide these benefits. Researchers are currently investigating why BMP astrocytes performed so much better than CNTF astrocytes, but believe multiple complex cellular mechanisms are probably involved.

"It is estimated that astrocytes make up the vast majority of all cell types in the human brain and spinal cord, and provide multiple different types of support to neurons and other cells of the central nervous system," said Jeannette Davies, Ph.D., assistant professor at the University of Colorado School of Medicine and co-lead author of the study. "These multiple functions are likely to all be contributing to the ability of the right human astrocytes to repair the injured spinal cord."

With these results, the Proschel and Davies teams are moving forward on the necessary next steps before they can implement the approach in humans, including testing the transplanted human astrocytes in different injury models that resemble severe, complex human spinal cord injuries at early and late stages after injury.

"Studies like this one bring increasing hope for our patients with spinal cord injuries," said Jason Huang, M.D., associate professor of Neurosurgery at the University of Rochester Medical Center and Chief of Neurosurgery at Highland Hospital. "Treating spinal cord injuries will require a multi-disciplinary approach, but this study is a promising one showing the importance of modifying human astrocytes prior to transplantation and has significant clinical implications."

In addition to Proschel and Noble, Davies and Davies, Mayer-Proschel and Chung-Hsuan Shih from the University of Rochester Medical Center contributed to the research. Portions of this research were funded by the New York State Spinal Cord Injury Research Program, the Carlson Stem Cell Fund and private donations by the international spinal cord injury community.

ScienceDaily

TStzmmalaysia
post Mar 4 2011, 10:10 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

Could the combination of general relativity and quantum mechanics lead to spintronics?

In the early 20th century, two famous discoveries about spin were made. The first, by Albert Einstein and Wander Johannes de Haas, revealed a relationship between magnetism and angular momentum -- that is, between the spin of elementary particles and mechanical rotation. (Around that time, Einstein also put forth his theory of general relativity.) A little more than a decade later, Paul Dirac unveiled his relativistic quantum mechanical wave equation, explaining electrons as elementary spin-1/2 particles.
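For reference, Dirac's equation for a free spin-1/2 particle of mass m can be written compactly (in standard covariant notation, not taken from the paper discussed here) as:

```latex
\left( i\hbar \gamma^{\mu} \partial_{\mu} - mc \right) \psi = 0
```

where the gamma matrices encode both special relativity and the particle's spin; it is this structure that Maekawa's group extends to rotating (noninertial) frames.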

Even though both of these discoveries have existed for nearly a century, Sadamichi Maekawa tells PhysOrg.com, no one thought about combining them. “For nearly 100 years, people did not study putting these together. We decided to combine different ideas to come up with a fundamental Hamiltonian to investigate mechanical rotations in the Dirac equation.”

Maekawa, a scientist working with the Japan Atomic Energy Agency, as well as the Japan Science and Technology Agency, worked with scientists associated with Kyoto University and Tohoku University, to come up with a new model of spin that could be helpful in the development of spintronics. Mamoru Matsuo, Jun’ichi Ieda and Eiji Saitoh were all involved in creating the new model, which is published in Physical Review Letters: “Effects of Mechanical Rotation on Spin Currents.”

“The Einstein-de Haas effect is brought about by the angular momentum conservation between magnetism and rotational motion,” Maekawa explains. “Quantum mechanics tells us that the origin of magnetism is electron spin. Recent progress in nanotechnology enables us to manipulate the flow of electron spins, or ‘spin current’”. He points out that the relationship between spin current and magnets has been studied for nanodevice applications, but there has been little attention paid to the way rotational motion can be used to control spin current.

In Japan, Maekawa and his colleagues decided that studying how to use mechanical rotation to direct spin current could be advantageous in the development of spintronic devices that scientists think could eventually replace silicon-based electronics. “We found that we needed to add general relativity to the equation,” Maekawa says. “Dirac included special relativity, but general relativity was needed as well. We combined the two Einstein theories, and added them to the theory of quantum mechanics. This way, we added mechanical rotation to the quantum equation.”

Part of this new model involves extending the physical system from its present inertial frame into a noninertial frame. Maekawa and his colleagues relied on the fact that the dynamics of spin currents is closely related to the spin-orbit interaction, which results from taking the low-energy limit of the Dirac equation. “We tried to combine general relativity and spin current, even though general relativity is not so popular in condensed matter physics right now,” he explains. The result, Maekawa points out, is that it should be possible to control spin current using mechanical means.

For now, the model is theory. “We formulated an equation, and in the future we hope to try the theory,” Maekawa says. He believes that, “this theory will give birth to nanoscale motor and dynamo,” providing a practical way to realize spintronics in the future.


More information: Mamoru Matsuo, Jun’ichi Ieda, Eiji Saitoh, and Sadamichi Maekawa, “Effects of Mechanical Rotation on Spin Currents,” Physical Review Letters (2011). Available online:
PhysOrg

TStzmmalaysia
post Mar 4 2011, 10:12 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

Algae Converted to Butanol; Fuel Can Be Used in Automobiles

A team of chemical engineers at the University of Arkansas has developed a method for converting common algae into butanol, a renewable fuel that can be used in existing combustible engines. The green technology benefits from and adds greater value to a process being used now to clean and oxygenate U.S. waterways by removing excess nitrogen and phosphorus from fertilizer in runoff.

"We can make cars go," said Jamie Hestekin, assistant professor and leader of the project. "Our conversion process is efficient and inexpensive. Butanol has many advantages compared to ethanol, but the coolest thing about this process is that we're actually making rivers and lakes healthier by growing and harvesting the raw material."

Hestekin and his research team -- undergraduates from the Honors College and several graduate students, including a doctoral student who has discovered a more efficient and technologically superior fermentation method -- grow algae on "raceways": long troughs, usually 2 feet wide and ranging from 5 to 80 feet long, depending on the scale of the operation. The troughs are made of screens or carpet, although Hestekin said algae will grow on almost any surface.

Algae survive on nitrogen, phosphorus, carbon dioxide and natural sunlight, so the researchers grow algae by running nitrogen- and phosphorus-rich creek water over the surface of the troughs. They enhance this growth by delivering high concentrations of carbon dioxide through hollow fiber membranes that look like long strands of spaghetti. Municipal and state governments, primarily on the East Coast, have implemented large-scale processes similar to this to address so-called "dead zones," where excess nitrogen and phosphorus have killed fish and plants.

The researchers harvest the algae every five to eight days by vacuuming or scraping it off the screens. After waiting for it to dry, they crush and grind the algae into a fine powder as the means to extract carbohydrates from the plant cells. Carbohydrates are made of sugars and starches. For this project, Hestekin's team works with starches. They treat the carbohydrates with acid and then heat them to break apart the starches and convert them into simple, natural sugars. They then begin a unique, two-step fermentation process in which organisms turn the sugars into organic acids -- butyric, lactic and acetic.

The second stage of the fermentation process focuses on butyric acid and its conversion into butanol. The researchers use a unique process called electrodeionization, a technique developed by one of Hestekin's doctoral students. This technique involves the use of a special membrane that rapidly and efficiently separates the acids during the application of electrical charges. By quickly isolating butyric acid, the process increases productivity, which makes the conversion process easier and less expensive.

As Hestekin mentioned, butanol has several significant advantages over ethanol, the current primary additive in gasoline. Butanol releases more energy per unit mass and can be mixed in higher concentrations than ethanol. It is less corrosive than ethanol and can be shipped through existing pipelines. These attributes are in addition to the advantages gleaned from butanol's source. Unlike corn, algae are not in demand by the food industry. Furthermore, it can be grown virtually anywhere and thus does not require large tracts of valuable farmland.

Hestekin's team is currently working with the New York City Department of Environmental Protection to create biofuel from algae grown at the Rockaway Wastewater Treatment Plant in Queens.

Research articles detailing findings from the algae-to-fuel project have been submitted to Biotechnology and Bioengineering and to Separation Science and Technology.

ScienceDaily

TStzmmalaysia
post Mar 4 2011, 10:31 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

New generation of optical integrated devices for future quantum computers

A research group led by scientists from the University of Bristol has demonstrated the quantum operation of new components that will enable compact circuits for future photonic quantum computers.

Quantum computers, which hold the promise of tremendous computational power for particular tasks, have been the goal of worldwide scientific efforts for several years. Great advances have been made, but there is still a long way to go.

Building a quantum computer will require a large number of interconnected components – gates – which work in a similar way to the microprocessors in current personal computers. Currently, most quantum gates are large structures and the bulky nature of these devices prevents scalability to the large and complex circuits required for practical applications.

This is a representation of a circuit for quantum information processing that makes use of multiport MMI devices. These circuits will be more compact and robust to fabrication tolerances than current 2x2-port devices. Credit: Alberto Peruzzo

Recently, the researchers from the University of Bristol's Centre for Quantum Photonics showed, in several important breakthroughs, that quantum information can be manipulated with integrated photonic circuits. Such circuits are compact (enabling scalability) and stable (with low noise) and could lead in the near future to mass production of chips for quantum computers.

Now the team, in collaboration with Dr Terry Rudolph at Imperial College, London, has demonstrated a new class of integrated devices that promises a further reduction in the number of components used to build future quantum circuits.

These devices, based on optical multimode interference (and therefore often called MMIs) have been widely employed in classical optics as they are compact and very robust to fabrication tolerances. "While building a complex quantum network requires a large number of basic components, MMIs can often enable the implementation with much fewer resources," said Alberto Peruzzo, PhD student working on the experiment.



This is a simulation of classical light propagating in a multimode interference device. The multimode propagation results in equal intensity in each of the four output waveguides. Credit: Alberto Peruzzo

Until now it was not clear how these devices would work in the quantum regime. Bristol researchers have demonstrated that MMIs can perform quantum interference at the high fidelity required.
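The equal power splitting that makes MMIs attractive can be sketched numerically. The toy model below, with an illustrative phase profile (not taken from the Bristol paper), checks that an idealized 1-to-4 splitter divides the input power evenly across its outputs while conserving total power:

```python
import cmath

def mmi_split(amp_in, n=4):
    """Idealized 1-to-n MMI power splitter.

    Each output carries 1/n of the input power. The quadratic phase
    profile used here is purely illustrative.
    """
    return [amp_in / n ** 0.5 * cmath.exp(1j * cmath.pi * k * k / n)
            for k in range(n)]

outputs = mmi_split(1.0)
powers = [abs(a) ** 2 for a in outputs]  # each output carries 0.25 of the power
```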

Scientists will now be able to implement more compact photonics circuits for quantum computing. MMIs can generate large entangled states, at the heart of the exponential speedup promised by quantum computing.

"Applications will range from new circuits for quantum computation to ultra precise measurement and secure quantum communication," said Professor Jeremy O'Brien, director of the Centre for Quantum Photonics.

The team now plans to build new sophisticated circuits for quantum computation and quantum metrology using MMI devices.


article from PhysOrg

More information: The research will be published online in the next issue of Nature Communications (Tuesday 1 March). The open-access paper can be downloaded from: Nature

TStzmmalaysia
post Mar 4 2011, 10:32 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


APPLIED SCIENCES

Attached Image

New camera makes seeing the 'invisible' possible

Science similar to that used in airport body scanners could soon be used to detect everything from defects in aerospace vehicles and concrete bridges to skin cancer, thanks to researchers at Missouri University of Science and Technology.

The research team, led by Dr. Reza Zoughi, the Schlumberger Distinguished Professor of Electrical Engineering at Missouri S&T, has developed a patented handheld camera that uses millimeter and microwave signals to non-intrusively peek inside materials and structures in real time. His contributions to this field, in part, have earned him the 2011 Joseph F. Keithley Award in Instrumentation and Measurement from the Institute of Electrical and Electronics Engineers (IEEE).

"In the not-so-distant future, the technology may be customized to address many critical inspection needs, including detecting defects in thermal insulating materials that are found in spacecraft heat insulating foam and tiles, space habitat structures, aircraft radomes and composite-strengthened concrete bridge members," Zoughi says.

The technology could help medical professionals detect and monitor a variety of skin conditions in humans, including cancer and burns. Even homeowners could see a direct benefit from the technology as it potentially could be used to detect termite damage.


How it works

The compact system can produce synthetically focused images of objects - at different planes in front of the camera - at speeds of up to 30 images per second. A laptop computer then collects the signal and displays the image in real time for review. The entire system, powered by a battery similar in size to those used in laptops, can run for several hours.

"Unlike X-rays, microwaves are non-ionizing and may only cause some heating effect," Zoughi says. "However, the high sensitivity and other characteristics of this camera enable it to operate at a low power level."

The idea for developing a real-time, portable camera came to Zoughi in 1998 while he was on sabbatical in France. In 2007, Zoughi's research group completed the first prototype and has spent the past three years decreasing its size, while improving its overall efficiency.

Currently the camera operates in the transmission mode, meaning objects must pass between a transmitting source and its collector to be reviewed. The team is working on designing and developing a one-sided version of it, which will make it operate in a similar fashion to a video camera.

"Further down the road, we plan to develop a wide-band camera capable of producing real-time 3-D or holographic images," Zoughi adds.

Physorg

Provided by Missouri University of Science and Technology

TStzmmalaysia
post Mar 4 2011, 10:36 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


APPLIED SCIENCES

Attached Image

Put your thinking cap on and type with your mind

Guger Technologies, an Austria-based company, has developed a computer interface that works directly with the human brain. The interface allows a user to "type" short messages by staring at letters on the screen. Those messages can then be translated with text-to-speech software, giving a voice -- albeit a funky, robotic one -- to those who cannot speak for themselves. If speech isn't what the user wants, the messages can be sent over the web the same way any other text would be.

Since the system only requires eye movement to function, it could be used by people with severe spinal cord injuries or other conditions that have rendered arm movement and vocal cord use impossible or impractical.

The device, which has been named intendiX, was shown off at Cebit.Labs. For those of you who are not familiar with the event, Cebit.Labs is a section of the Cebit trade show devoted exclusively to showing off research projects.

IntendiX features a tight-fitting skull cap with a number of electroencephalograph (EEG) electrodes attached to it. These are wet-style electrodes, so they require a gel to function properly, though a dry version of the cap is in the works. There is also a pocket-sized brainwave amplifier, and a Windows-based application that analyzes the received brainwaves and translates them into letters on the screen. The setup can be connected via Bluetooth.

Currently the fastest in-lab time has been 0.9 seconds per character, but that is after users have been trained on the system. Untrained users testing the device have been as slow as 40 seconds per character. No word yet on when the device will be available to the public.
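For a sense of throughput, the quoted per-character times convert to characters per minute with simple arithmetic (figures derived here, not from intendiX documentation):

```python
def chars_per_minute(seconds_per_char):
    """Convert a per-character typing time into a characters-per-minute rate."""
    return 60.0 / seconds_per_char

trained = chars_per_minute(0.9)    # roughly 67 characters per minute
untrained = chars_per_minute(40.0) # just 1.5 characters per minute
```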

More information: http://www.intendix.com/

PhysOrg

TStzmmalaysia
post Mar 4 2011, 10:39 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

How to keep up with processor-power's doubling time, Part 4: Minimizing Communication between Cores

In the mid-1990s, Matteo Frigo, a graduate student in the research group of computer-science professor Charles Leiserson (whose work was profiled in the previous installment in this series), developed a parallel version of a fast Fourier transform (FFT). One of the most frequently used classes of algorithms in computer science, FFTs are useful for signal processing, image processing, and data compression, among other things.

Steven Johnson, then a graduate student in physics, was using Fourier transforms to solve differential equations and needed FFT software that would run on multiple machines, including parallel machines. “Matteo says, ‘Steven, I have the code for you,’” says Johnson, now an associate professor of applied mathematics. “This is fast and parallel and so forth. I didn’t take his word for it. I took it and I downloaded a half a dozen other FFT programs on the Web, and I benchmarked them on three or four machines, and I made a graph, and I put it up on my Web page. His code was pretty fast, but sometimes faster, sometimes slower than the other codes. His wife said he came home that day and said, ‘Steven put up a Web page that said my code wasn’t the fastest. This has to change.’”

Together, Johnson and Frigo went on to develop software called the fastest Fourier transform in the West, or FFTW, which is, indeed, among the fastest implementations of FFT algorithms for general-purpose computers.

Most FFTs use the divide-and-conquer approach described in the last article in this series: data — an incoming audio or video signal, or an image, or a mathematical description of a physical system — is split into smaller parts, whose Fourier transforms are calculated; but those calculations in turn rely on splitting the data into smaller chunks and calculating their Fourier transforms, and so on.
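The divide-and-conquer structure described above can be sketched as a minimal recursive radix-2 FFT (a textbook Cooley-Tukey implementation, not FFTW's actual code):

```python
import cmath

def fft(x):
    """Recursive radix-2 Cooley-Tukey FFT; len(x) must be a power of two."""
    n = len(x)
    if n == 1:
        return list(x)
    even = fft(x[0::2])  # transform of even-indexed samples
    odd = fft(x[1::2])   # transform of odd-indexed samples
    # combine the halves using the "twiddle factors" e^{-2*pi*i*k/n}
    t = [cmath.exp(-2j * cmath.pi * k / n) * odd[k] for k in range(n // 2)]
    return [even[k] + t[k] for k in range(n // 2)] + \
           [even[k] - t[k] for k in range(n // 2)]
```

Each call splits the input into even- and odd-indexed halves, transforms them recursively, and merges the results; it is exactly this recursive splitting that forces data to move between levels of the memory hierarchy.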

A program that performed all the steps of the FFT calculation in their natural order — splitting the problem into smaller and smaller chunks and then assembling the solution from the bottom up — would end up spending much of its time transferring data between different types of memory. Much of the work that went into FFTW involved maximizing the number of steps a single core could perform without having to transfer the results.

MIT researchers developed one of the fastest software implementations of the Fourier transform, a technique for splitting a signal into its constituent frequencies. Graphic: Christine Daniloff

The parallel implementation of FFTW compounds the communication problem, because cores working on separate chunks of the calculation also have to exchange information with each other. If the chunks get too small, communication ends up taking longer than the calculations, and the advantages of parallelization are lost. So every time it’s called upon to run on a new machine, FFTW runs a series of tests to determine how many chunks, of what type, to split the data into at each stage of the process, and how big the smallest chunks should be. FFTW also includes software that automatically generates code tailored to chunks of specific size. Such special-purpose code maximizes the efficiency of the computations, but it would be prohibitively time consuming to write by hand.
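FFTW's self-tuning idea -- benchmark several decompositions on the target machine and keep the fastest -- can be illustrated with a toy planner. The recursion below switches to a naive DFT once chunks shrink to a tunable base-case size, and `plan` times each candidate size (a drastic simplification of what FFTW actually does):

```python
import cmath
import time

def dft(x):
    """Naive O(n^2) transform used at the leaves of the recursion."""
    n = len(x)
    return [sum(x[k] * cmath.exp(-2j * cmath.pi * j * k / n) for k in range(n))
            for j in range(n)]

def fft(x, base):
    """Radix-2 FFT that stops recursing once chunks reach `base` samples."""
    n = len(x)
    if n <= base:
        return dft(x)
    even, odd = fft(x[0::2], base), fft(x[1::2], base)
    t = [cmath.exp(-2j * cmath.pi * k / n) * odd[k] for k in range(n // 2)]
    return [even[k] + t[k] for k in range(n // 2)] + \
           [even[k] - t[k] for k in range(n // 2)]

def plan(n, candidates=(1, 2, 8, 32)):
    """Benchmark each base-case size on this machine and keep the fastest."""
    signal = [complex(k % 7, 0) for k in range(n)]
    timings = {}
    for base in candidates:
        start = time.perf_counter()
        fft(signal, base)
        timings[base] = time.perf_counter() - start
    return min(timings, key=timings.get)
```

The winning base-case size varies from machine to machine, which is precisely why FFTW runs its tests once per platform rather than hard-coding a decomposition.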

According to Jonathan Ragan-Kelley, a graduate student in the Computer Graphics Group at the Computer Science and Artificial Intelligence Laboratory, “Real-time graphics has been probably the most successful mass-market use of parallel processors.” Because updates to different regions of a two-million-pixel image can be calculated largely independently of each other, graphics naturally lends itself to parallel processing. “Your 3-D world is described by a whole bunch of triangles that are made up of vertices, and you need to run some math over all those vertices to figure out where they go on screen,” Ragan-Kelley says. “Then based on where they go on screen, you figure out what pixels they cover, and for each of those covered pixels, you have to run some other program that computes the color of that pixel.” Moreover, he says, computing the color of a pixel also requires looking up the texture of the surface that the pixel represents, and then calculating how that texture would reflect light, given the shadows cast by other objects in the scene. “So you have lots of parallelism, over the vertices and over the pixels,” Ragan-Kelley says.
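The two stages Ragan-Kelley describes — projecting vertices to screen coordinates, then finding which pixels a triangle covers so each can be shaded independently — can be sketched as a toy rasterizer (the screen size, focal length, and inside-test below are illustrative assumptions, not any real graphics API):

```python
def project(v, width=640, height=480, f=1.0):
    """Perspective-project a 3-D vertex (x, y, z) to pixel coordinates."""
    x, y, z = v
    return ((f * x / z * 0.5 + 0.5) * width,
            (f * y / z * 0.5 + 0.5) * height)

def edge(a, b, p):
    # Signed-area test: the sign tells which side of edge a->b point p is on.
    return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

def covered_pixels(tri):
    """Pixels whose centers fall inside the projected triangle."""
    pts = [project(v) for v in tri]
    xs = [p[0] for p in pts]
    ys = [p[1] for p in pts]
    pixels = []
    for px in range(int(min(xs)), int(max(xs)) + 1):
        for py in range(int(min(ys)), int(max(ys)) + 1):
            c = (px + 0.5, py + 0.5)
            w = (edge(pts[0], pts[1], c), edge(pts[1], pts[2], c),
                 edge(pts[2], pts[0], c))
            if all(s >= 0 for s in w) or all(s <= 0 for s in w):
                # Each covered pixel can now be shaded on any core,
                # in parallel with every other pixel.
                pixels.append((px, py))
    return pixels
```

Every vertex projection and every pixel test here is independent of the others — the parallelism "over the vertices and over the pixels" that GPUs exploit.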

If a parallel machine were to complete each of the stages in the graphics pipeline — the myriad computations that constitute triangle manipulation, pixel mapping, and color calculation — before the next stage began, it would run into the same type of problem that FFT algorithms can: it would spend much of its time just moving data around. Some commercial graphics software — say, the software that generates images on the Microsoft Xbox — is designed to avoid this problem when it encounters calculations that arise frequently — say, those typical of Xbox games. Like FFTW, the software executes as many successive steps as it can on a single core before transferring data. But outside the narrow range of problems that the software is tailored to, Ragan-Kelley says, “you basically have to give up this optimization.” Ragan-Kelley is investigating whether software could be designed to apply the same type of efficiency-enhancing tricks to problems of graphical rendering generally, rather than just those whose structure is known in advance.

At the International Solid-State Circuits conference in San Diego in February 2011, professor of electrical engineering Anantha Chandrakasan and Vivienne Sze, who received her PhD from MIT the previous spring, presented a new, parallel version of the H.264 video algorithm, a staple of most computer video systems. Rather than storing every pixel of every frame of video separately, software using the H.264 standard stores a lot of information about blocks of pixels. For instance, one block might be described as simply having the same color value as the block to its left, or the one below it; another block of pixels might be described as moving six pixels to the right and five down from one frame to the next. Information about pixels ends up taking up less memory than the values of the pixels themselves, which makes it easier to stream video over the Internet.
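A toy version of that pixel-block bookkeeping (real H.264 has roughly 20 prediction modes plus a genuine motion search; the two relative descriptions here are purely illustrative):

```python
def describe_blocks(prev_row, cur_row):
    """Describe each pixel block relative to a neighbour when possible.

    Falls back to raw pixel values only when no cheap description fits,
    which is why the descriptions take less memory than the pixels.
    """
    out = []
    for i, block in enumerate(cur_row):
        if i > 0 and block == cur_row[i - 1]:
            out.append(("same_as_left",))        # e.g. a run of sky pixels
        elif block == prev_row[i]:
            out.append(("same_as_prev_frame",))  # static background
        else:
            out.append(("raw", block))           # store the pixels themselves
    return out
```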

In all, H.264 offers about 20 different ways to describe pixel blocks, or “syntax elements.” To save even more space in memory, the syntax elements are subjected to a further round of data compression. Syntax elements that occur frequently are encoded using very short sequences of bits; syntax elements that occur infrequently are encoded using longer sequences of bits.
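That frequent-gets-short, rare-gets-long scheme is classic entropy coding. A Huffman-style sketch that derives code lengths from symbol frequencies (H.264 itself uses fixed CAVLC/CABAC tables rather than building a tree per stream):

```python
import heapq
from collections import Counter

def huffman_code_lengths(symbols):
    """Return {symbol: code length in bits}, shortest for the most frequent."""
    freq = Counter(symbols)
    if len(freq) == 1:
        return {next(iter(freq)): 1}
    # Heap entries: (total weight, unique tiebreak, {symbol: depth so far}).
    heap = [(w, i, {s: 0}) for i, (s, w) in enumerate(freq.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        w1, _, d1 = heapq.heappop(heap)  # two lightest subtrees...
        w2, _, d2 = heapq.heappop(heap)
        # ...merge them, pushing every contained symbol one level deeper.
        merged = {s: d + 1 for s, d in {**d1, **d2}.items()}
        heapq.heappush(heap, (w1 + w2, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]
```

Running it on a stream where "a" dominates gives "a" a 1-bit code and the rarer symbols longer codes, mirroring how common syntax elements get the shortest bit sequences.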

During playback, however, H.264 has to convert these strings of bits into the corresponding syntax elements. Although today’s high-definition TVs are able to decode the syntax elements sequentially without intolerable time lags, the TVs of tomorrow, with more than 10 times as many pixels, won’t be. Sze and Chandrakasan devised a way to assign the decoding of different types of syntax elements to different cores. Their proposal is currently under review with the MPEG and ITU-T standards bodies, and it could very well end up being incorporated into future video standards.

This story is republished courtesy of MIT News (http://web.mit.edu/newsoffice/).

PhysOrg

More information: Computer chips’ clocks have stopped getting faster. To maintain the regular doubling of computer power that we now take for granted, chip makers have been giving chips more “cores,” or processing units. But how to distribute computations across multiple cores is a hard problem, and this five-part series of articles examines the different levels at which MIT researchers are tackling it, from hardware design up to the development of new programming languages.

Designing the hardware

The next operating system

Retooling algorithms





TStzmmalaysia
post Mar 5 2011, 07:46 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


BIOTECHNOLOGY

Attached Image

Scientists create cell assembly line

Borrowing a page from modern manufacturing, scientists from the Florida campus of The Scripps Research Institute have built a microscopic assembly line that mass produces synthetic cell-like compartments.

The new computer-controlled system represents a technological leap forward in the race to create the complex membrane structures of biological cells from simple chemical starting materials.

"Biology is full of synthetic targets that have inspired chemists for more than a century," said Brian Paegel, Scripps Research assistant professor and lead author of a new study published in the Journal of the American Chemical Society. "The lipid membrane assemblies of cells and their organelles pose a daunting challenge to the chemist who wants to synthesize these structures with the same rational approaches used in the preparation of small molecules."

While most cellular components such as genes or proteins are easily prepared in the laboratory, little has been done to develop a method of synthesizing cell membranes in a uniform, automated way. Current approaches are capricious in nature, yielding complex mixtures of products and inefficient cargo loading into the resultant cell-like structures. The new technology transforms the previously difficult synthesis of cell membranes into a controlled process, customizable over a range of cell sizes, and highly efficient in terms of cargo encapsulation.

The membrane that surrounds all cells, organelles and vesicles – small subcellular compartments – consists of a phospholipid bilayer that serves as a barrier, separating an internal space from the external medium. The new process creates a laboratory version of this bilayer that is formed into small, cell-sized compartments.

Attached Image

How It Works

"The assembly-line process is simple and, from a chemistry standpoint, mechanistically clear," said Sandro Matosevic, research associate and co-author of the study.

A microfluidic circuit generates water droplets in lipid-containing oil. The lipid-coated droplets travel down one branch of a Y-shaped circuit and merge with a second water stream at the Y-junction. The combined flows of droplets in oil and water travel in parallel streams toward a triangular guidepost.

Then, the triangular guide diverts the lipid-coated droplets into the parallel water stream as a wing dam might divert a line of small boats into another part of a river. As the droplets cross the oil-water interface, a second layer of lipids deposits on the droplet, forming a bilayer. The end result is a continuous stream of uniformly shaped cell-like compartments. The newly created vesicles range from 20 to 70 micrometers in diameter—from about the size of a skin cell to that of a human hair. The entire circuit fits on a glass chip roughly the size of a poker chip.

The researchers also tested the synthetic bilayers for their ability to house a prototypical membrane protein. The proteins correctly inserted into the synthetic membrane, proving that they resemble membranes found in biological cells.

"Membranes and compartmentalization are ubiquitous themes in biology," noted Paegel. "We are constructing these synthetic systems to understand why compartmentalized chemistry is a hallmark of life, and how it might be leveraged in therapeutic delivery."

PhysOrg


Added on March 5, 2011, 7:48 pm

BIOTECHNOLOGY

Surgeon creates new kidney on TED stage

A surgeon specializing in regenerative medicine on Thursday "printed" a real kidney using a machine that eliminates the need for donors when it comes to organ transplants. "It's like baking a cake," Anthony Atala of the Wake Forest Institute of Regenerative Medicine said as he cooked up a fresh kidney on stage at a TED Conference in the California city of Long Beach.


Scanners are used to take a 3-D image of a kidney that needs replacing, then a tissue sample about half the size of a postage stamp is used to seed the computerized process, Atala explained.

The organ "printer" then works layer-by-layer to build a replacement kidney replicating the patient's tissue.

College student Luke Massella was among the first people to receive a printed kidney during experimental research a decade ago when he was just 10 years old. He said he was born with Spina Bifida and his kidneys were not working. "Now, I'm in college and basically trying to live life like a normal kid," said Massella, who was reunited with Atala at TED. "This surgery saved my life and made me who I am today."

About 90 percent of people waiting for transplants are in need of kidneys, and the need far outweighs the supply of donated organs, according to Atala. "There is a major health crisis today in terms of the shortage of organs," Atala said. "Medicine has done a much better job of making us live longer, and as we age our organs don't last."

PhysOrg


This post has been edited by tzmmalaysia: Mar 5 2011, 07:48 PM
TStzmmalaysia
post Mar 5 2011, 07:49 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

Human stem cells transformed into key neurons lost in Alzheimer's

Northwestern Medicine researchers for the first time have transformed a human embryonic stem cell into a critical type of neuron that dies early in Alzheimer's disease and is a major cause of memory loss.

This new ability to reprogram stem cells and grow a limitless supply of the human neurons will enable a rapid wave of drug testing for Alzheimer's disease, allow researchers to study why the neurons die and could potentially lead to transplanting the new neurons into people with Alzheimer's.

These critical neurons, called basal forebrain cholinergic neurons, help the hippocampus retrieve memories in the brain. In early Alzheimer's, the ability to retrieve memories is lost, not the memories themselves. There is a relatively small population of these neurons in the brain, and their loss has a swift and devastating effect on the ability to remember.

"Now that we have learned how to make these cells, we can study them in a tissue culture dish and figure out what we can do to prevent them from dying," said senior study author Jack Kessler, M.D., chair of neurology and the Davee Professor of Stem Cell Biology at Northwestern University Feinberg School of Medicine and a physician at Northwestern Memorial Hospital.

The lead author of the paper is Christopher Bissonnette, a former doctoral student in neurology who labored for six years in Kessler's lab to crack the genetic code of the stem cells to produce the neurons. His research was motivated by his grandfather's death from Alzheimer's.

"This technique to produce the neurons allows for an almost infinite number of these cells to be grown in labs, allowing other scientists the ability to study why this one population of cells selectively dies in Alzheimer's disease," Bissonnette said.

The ability to make the cells also means researchers can quickly test thousands of different drugs to see which ones may keep the cells alive when they are in a challenging environment. This rapid testing technique is called high-throughput screening.

Kessler and Bissonnette demonstrated the newly produced neurons work just like the originals. They transplanted the new neurons into the hippocampus of mice and showed the neurons functioned normally. The neurons produced axons, or connecting fibers, to the hippocampus and pumped out acetylcholine, a chemical needed by the hippocampus to retrieve memories from other parts of the brain.

Human skin cells transformed into stem cells and then neurons

In new, unpublished research, Northwestern Medicine scientists also have discovered a second novel way to make the neurons. They reprogrammed human skin cells into embryonic-like stem cells (called induced pluripotent stem cells) and then transformed these into the neurons.

Scientists made these stem cells and neurons from skin cells of three groups of people: Alzheimer's patients, healthy patients with no family history of Alzheimer's, and healthy patients with an increased likelihood of developing the disease due to a family history of Alzheimer's because of genetic mutations or unknown reasons.

"This gives us a new way to study diseased human Alzheimer's cells," Kessler said. "These are real people with real disease. That's why it's exciting."

Researcher motivated by his grandfather's Alzheimer's disease

Bissonnette's persistence in the face of often frustrating research was fueled by the childhood memory of watching his grandfather die from Alzheimer's. "I watched the disease slowly and relentlessly destroy his memory and individuality, and I was powerless to help him," Bissonnette recalled. "That drove me to become a scientist. I wanted to discover new treatments to reverse the damage caused by Alzheimer's disease."

"My goal was to make human stem cells become new healthy replacement cells so that they could one day be transplanted into a patient's brain, helping their memory function again," he said.

Bissonnette had to grow and test millions of cells to figure out how to turn on the exact sequence of genes to transform the stem cell into the cholinergic neuron. "A stem cell has the potential to become virtually any cell in the body, from a heart cell to a layer of skin," he explained. "Its development is caused by a cascade of things that slowly bump it into a final cell type."

But it wasn't enough just to develop the neurons. Bissonnette then had to learn how to stabilize them so they lived for at least 20 days in order to prove they were the correct cells. "Since this was brand new research, people didn't know what kind of tissue culture mature human neurons would like to live in," he said. "Once we figured it out, they could live indefinitely."

PhysOrg

TStzmmalaysia
post Mar 5 2011, 07:50 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


SPACE SCIENCE

Attached Image

New robotic lander tested at historic test site

Today, engineers at NASA's Marshall Space Flight Center in Huntsville, Ala., began the first phase of integrated system tests on a new robotic lander prototype at Redstone Test Center’s propulsion test facility on the U.S. Army Redstone Arsenal, also in Huntsville. These tests will aid in the design and development of a new generation of small, smart, versatile robotic landers capable of performing science and exploration research on the surface of the moon or other airless bodies, including near-Earth asteroids.

This initial test phase, or strapdown testing, allows the engineering team to fully check out the integrated lander prototype before moving to more complex free flight tests. The team secures, or straps down, the prototype during hot fire tests to validate the propulsion system's response to the flight guidance, navigation and control algorithms and flight software prior to autonomous free flight testing.

"Moving the robotic lander tests to the Redstone Test Center facility is a good example of intergovernmental collaboration at its best," said Larry Hill, Robotic Lunar Lander Development Project Test Director at the Marshall Center. "Engineers and technicians from NASA, the Army and our Huntsville-based support contractor, Teledyne Brown Engineering, have worked tirelessly over the last month to modify the historic test facility formerly used for missile testing to accommodate NASA's lander test in record time, saving NASA time and money."

"Our team has been on a record-paced design and development schedule to deliver the robotic lander prototype to the test site," said Julie Bassler, Robotic Lunar Lander Development Project Manager. "We have succeeded in designing, building and testing this new lander prototype in a short 17 months with an in-house NASA Marshall team in collaboration with our partners" -- Johns Hopkins Applied Physics Laboratory of Laurel, Md., and the Von Braun Center for Science and Innovation in Huntsville.

The flight test program includes three phases of testing culminating in free flight testing for periods up to sixty seconds scheduled for summer 2011. The prototype provides a platform to develop and test algorithms, sensors, avionics, software, landing legs, and integrated system elements to support autonomous landings on airless bodies, where aero-braking and parachutes are not options. The test program furthers NASA’s capability to conduct science and exploration activities on airless bodies in the solar system.

Development and integration of the lander prototype is a cooperative endeavor led by the Robotic Lunar Lander Development Project at the Marshall Center, Johns Hopkins Applied Physics Laboratory and the Von Braun Center for Science and Innovation, which includes the Science Applications International Corporation, Dynetics Corp., Teledyne Brown Engineering Inc., and Millennium Engineering and Integration Company, all of Huntsville.

The project is partnered with the U.S. Army’s Test and Evaluation Command’s test center located at Redstone Arsenal. Redstone Test Center is one of six centers under the U.S. Army Test and Evaluation Command and has been a leading test facility for defense systems since the 1950s. Utilizing an historic test site at the Arsenal, the project is leveraging the Redstone Test Center’s advanced capability for propulsion testing.

PhysOrg

TStzmmalaysia
post Mar 5 2011, 07:52 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

Cements that self-repair cracks and store latent heat energy

Dr. Idurre Kaltzakorta introduced capsules filled with organic material into the cement, in a Ph.D. thesis undertaken at Tecnalia and defended at the University of the Basque Country

Cement (and derivatives thereof) is one of the materials most commonly used in construction, given its good performance at low cost. Over recent years, part of scientific and technological research has aimed at incorporating additional functions into these materials. Specifically, Dr. Idurre Kaltzakorta studied the possibility of adding capacities to cement such as the self-repair of cracks and the storage of latent heat energy. Her PhD thesis, undertaken at Tecnalia's Construction Unit, was presented at the University of the Basque Country (UPV/EHU) and entitled: Synthesis of silica microcapsules encapsulating different organic compounds for addition in the cement paste.

As the title of her research suggests, Dr Kaltzakorta created silica microcapsules (silica is, for instance, the base of glass) with organic material inside, the idea being to provide the cement with new functions. She opted for two types of organic materials, each corresponding to one of the two added features mentioned above. Thus, on the one hand, the microcapsules were filled with various epoxy resins (used in the manufacture of adhesives), to provide the cement with the capacity for the self-repair of cracks. On the other, phase change materials were encapsulated. These are materials which absorb or release a great quantity of heat when the material changes phase (from solid to liquid or liquid to gas and vice versa), and enable the storage of latent heat energy in the cement material.


Sol-gel and emulsion

Ms Kaltzakorta studied the synthesis of the encapsulated material, opting for synthesising microcapsules by combining sol-gel chemistry with emulsion technology. This route enabled the encapsulation of organic material, difficult with other routes, under mild temperature and pressure conditions.

Once the microcapsules were obtained, the thesis analysed the effect of adding them to the cement matrix, to verify the viability of the technique. With this in mind, Ms Kaltzakorta used a number of techniques with which the features of the new cement material could be studied, such as X-ray tomography, scanning electron microscopy, mechanical testing and differential scanning calorimetry.

In conclusion, the thesis shows the viability of the development of a new generation of cements capable of the self-repair of cracks as well as storing latent heat energy, based on the application of silica microcapsules with various organic materials. In fact, the research for developing the new cement with the capacity for self-sealing of cracks has given rise to a patent. Moreover, according to Ms Kaltzakorta, the proposal presented in this thesis is a commitment to sustainability. On the one hand, getting the cement material to self-repair increases the useful life of the structures. On the other, using a material capable of regulating the temperature within the buildings will enhance their energy efficiency.

EurekAlert


TStzmmalaysia
post Mar 5 2011, 07:54 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

Researchers use human cues to improve computer user-friendliness

Lijun Yin wants computers to understand inputs from humans that go beyond the traditional keyboard and mouse.

"Our research in computer graphics and computer vision tries to make using computers easier," says the Binghamton University computer scientist. "Can we find a more comfortable, intuitive and intelligent way to use the computer? It should feel like you're talking to a friend. This could also help disabled people use computers the way everyone else does."

Yin's team has developed ways to provide information to the computer based on where a user is looking as well as through gestures or speech. One of the basic challenges in this area is "computer vision." That is, how can a simple webcam work more like the human eye? Can camera-captured data understand a real-world object? Can this data be used to "see" the user and "understand" what the user wants to do?

To some extent, that's already possible. Witness one of Yin's graduate students giving a PowerPoint presentation and using only his eyes to highlight content on various slides. When Yin demonstrated this technology for Air Force experts last year, the only hardware he brought was a webcam attached to a laptop computer.

Yin says the next step would be enabling the computer to recognize a user's emotional state. He works with a well-established set of six basic emotions — anger, disgust, fear, joy, sadness, and surprise — and is experimenting with different ways to allow the computer to distinguish among them. Is there enough data in the way the lines around the eyes change? Could focusing on the user's mouth provide sufficient clues? What happens if the user's face is only partially visible, perhaps turned to one side?

"Computers only understand zeroes and ones," Yin says. "Everything is about patterns. We want to find out how to recognize each emotion using only the most important features."
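As a purely hypothetical illustration of that pattern-over-features idea, here is a nearest-centroid classifier; the two-emotion training set and the feature values (imagine eye-wrinkle and mouth-corner measurements) are invented for the sketch:

```python
import math

# Invented training data: each face reduced to two geometric features,
# e.g. (eye-wrinkle intensity, mouth-corner lift), labelled by emotion.
TRAINING = {
    "joy":     [(0.9, 0.8), (0.8, 0.9)],
    "sadness": [(0.2, 0.1), (0.1, 0.2)],
}

def centroid(points):
    """Mean of each feature across the example faces."""
    return tuple(sum(col) / len(points) for col in zip(*points))

def classify(features):
    """Label a new face by its nearest class centroid in feature space."""
    cents = {label: centroid(pts) for label, pts in TRAINING.items()}
    return min(cents, key=lambda lb: math.dist(features, cents[lb]))
```

Real systems extract far richer features and train on thousands of faces, but the principle is the same: pick the few measurements that best separate the patterns.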

He's partnering with Binghamton University psychologist Peter Gerhardstein to explore ways this work could benefit children with autism. Many people with autism have difficulty interpreting others' emotions; therapists sometimes use photographs of people to teach children how to understand when someone is happy or sad and so forth. Yin could produce not just photographs, but three-dimensional avatars that are able to display a range of emotions. Given the right pictures, Yin could even produce avatars of people from a child's family for use in this type of therapy.

Yin and Gerhardstein's previous collaboration led to the creation of a 3D facial expression database, which includes 100 subjects with 2,500 facial expression models. The database is available at no cost to the nonprofit research community and has become a worldwide test bed for those working on related projects in fields such as biomedicine, law enforcement and computer science.

Once Yin became interested in human-computer interaction, he naturally grew more excited about the possibilities for artificial intelligence.

"We want not only to create a virtual-person model, we want to understand a real person's emotions and feelings," Yin says. "We want the computer to be able to understand how you feel, too. That's hard, even harder than my other work."

Imagine if a computer could understand when people are in pain. Some may ask a doctor for help. But others — young children, for instance — cannot express themselves or are unable to speak for some reason. Yin wants to develop an algorithm that would enable a computer to determine when someone is in pain based just on a photograph.

Yin describes that health-care application and, almost in the next breath, points out that the same system that could identify pain might also be used to figure out when someone is lying. Perhaps a computer could offer insights like the ones provided by Tim Roth's character, Dr. Cal Lightman, on the television show Lie to Me. The fictional character is a psychologist with an expertise in tracking deception who often partners with law-enforcement agencies.

"This technology," Yin says, "could help us to train the computer to do facial-recognition analysis in place of experts."

Article from EurekAlert

About Lijun Yin

Lijun Yin, associate professor of computer science and director of the Graphics and Image Computing Laboratory, joined the Binghamton University faculty in 2001. He earned a doctorate from the University of Alberta in 2000, after receiving undergraduate and master's degrees from schools in China. His research has been sponsored by the National Science Foundation, the Air Force Research Laboratory and the New York State Office of Science, Technology and Academic Research.


For more Binghamton University research news, see http://discovere.binghamton.edu/

A related video clip is available here.
TStzmmalaysia
post Mar 5 2011, 07:56 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


APPLIED SCIENCES

Attached Image

Virtual Reality Can Improve Design Skills in Younger Generation

Rapidly improving technology is changing everyday life for all generations. This constantly changing environment can be a difficult adjustment for older generations. However, for the current generation known as “Generation Y”, this sense of constant technology adaptation isn’t an adjustment; it is a way of life. A University of Missouri researcher says a widening gap is occurring between educators and students due to the difference in how older and younger generations approach evolving technologies. Newton D’Souza, an assistant professor of architectural studies at MU, is looking for ways to move beyond traditional teaching methods and to bridge the technology gap between teachers and students.

“In a traditional educational model, learning only occurs in the classroom,” MU researcher Newton D’Souza said. “Now, with technology like laptops and mobile phones, learning can occur anywhere from classrooms to hallways to coffee shops. For older generations, technology is a separate fixture. For Generation Y, it’s a part of their lives. On one hand, it is exciting; on the other hand, it is challenging because we must find ways to adjust teaching styles.”

Researchers at the University of Missouri are studying ways to integrate technology into design learning, specifically to learn how to teach children design basics. In an effort to study how children who have grown up in a wired, video game culture use technology, D’Souza engaged young students using a 3D virtual reality platform to teach design. Using a popular existing virtual reality platform called 2nd Life, researchers directed students to design a small zoo. The zoo project involved a topic that young students could relate to, while providing adequate research constraints.

The 2nd Life platform provided a realistic 3D spatial simulation for students to explore. They were given instructions on certain design specifics and then allowed to work within the simulation. By studying how the students worked within the virtual reality platform and their eventual design product, D’Souza was able to observe the improvement of specific design skills.

D’Souza found that students working within the 3D virtual reality environment tended to improve spatial skills, including kinesthetic and logical abilities. However, verbal and intrapersonal skills seemed to suffer. He took this mixed result as a reminder to keep working on better interfaces for today’s learners. D’Souza also was surprised by how quickly the students grasped the virtual reality concept and were able to begin working with it.

“Because they are wired in media, the kids entered into the system much faster than we expected,” D’Souza said. “Today’s students already exist in a 3D environment; we need to find a way to teach them where they already are.”

Ultimately, D’Souza says that because each individual learns differently, new media technologies like the 2nd Life platform will teach researchers even more about how students learn. He believes it is important to continually question the assumptions about how humans learn.

“Right now we are failing to communicate with younger children,” D’Souza said. “Learning is effective when previous assumptions are questioned, and nothing is taken for granted. It’s not that we should entirely abandon our traditional teaching techniques; we need to consolidate what we have, and yet improvise to meet the needs of current day learners.”

MU News Bureau

This post has been edited by tzmmalaysia: Mar 5 2011, 07:58 PM
TStzmmalaysia
post Mar 5 2011, 07:59 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

Creasing to cratering: Voltage breaks down plastic

A Duke University team has seen for the first time how soft polymers, such as wire insulation, can break down under exposure to electrical current.

Researchers have known for decades that polymers, such as those insulating wires, may break down due to deformation. But the process had never been seen.

In a series of experiments, Duke University engineers have documented at the microscopic level how plastic deforms to the point of breakdown as it is subjected to ever-increasing electric voltage. Polymers can be found almost everywhere, most commonly as an insulator for electrical wires, cables and capacitors.

The findings by the Duke engineers could help in developing new materials to improve the durability and efficiency of any polymer that must come into contact with electrical currents, as well as in the emerging field of energy harvesting.

"We have long known that these polymers will eventually break down, or fail, when subjected to an increasing electrical voltage," said Xuanhe Zhao, assistant professor of mechanical engineering and materials science at Duke's Pratt School of Engineering. He is the senior scientist in the series of experiments performed by graduate student Qiming Wang and published online in Physical Review Letters. "Now we can actually watch the process as it happens in real time."

The innovation the Duke team developed was attaching the soft polymer to another rigid polymer layer, or protective substrate, which enabled observation of the deformation process without incurring the breakdown. They then subjected this polymer-substrate unit to various electrical voltages and observed the effects under a microscope.

"As bread dough rises in a bowl, the top surface of the dough may fold in upon itself to form creases due to compressive stresses developing in the dough," Zhao said, "Surprisingly, this phenomenon may be related to failures of electrical polymers that are widely used in energy-related applications."

"When the voltage reached a critical point, the compressive stress induced a pattern of creases, or folds, on the polymer," Zhao. "If the voltage is increased further, the creases evolved into craters or divots in the polymer as the electrical stress pulls the creases open. Polymers usually break down electrically immediately after the creasing, which can cause failures of insulating cables and organic capacitors."

The substrate the researchers developed for the experiments not only allowed for the visualization of the creasing-to-cratering phenomenon, it could also be the foundation of a new approach to improving the ability of wires to carry electricity.
TStzmmalaysia
post Mar 5 2011, 08:01 PM



RESEARCH


Breakthrough in Molecular Motors: First Molecular Piston Capable of Self-Assembly

Living organisms make extensive use of molecular motors in fulfilling some of their vital functions, such as storing energy, enabling cell transport or even moving about in the case of bacteria. Since the molecular layouts of such motors are extremely complex, scientists seek to create their own, simpler versions. The motor developed by the international team headed by Ivan Huc, CNRS researcher in the "Chimie et Biologie des Membranes et des Nanoobjets" Unit (CNRS/Université de Bordeaux), is a "molecular piston." Like a real piston, it comprises a rod on which a moving part slides, except that the rod and the moving part are only several nanometers long.

More specifically, the rod is formed of a slender molecule, whereas the moving part is a helix-shaped molecule (both are derivatives of organic compounds especially synthesized for the purpose). How can the helicoidal molecule move along the rod? The acidity of the medium in which the molecular motor is immersed controls the progress of the helix along the rod: by increasing the acidity, the helix is drawn towards one end of the rod, as it then has an affinity for that portion of the slender molecule. By reducing the acidity, the process is reversed and the helix goes in the other direction.

This device has a crucial advantage compared to existing molecular pistons: self-assembly. In previous versions, which take the form of a ring sliding along a rod, the moving part is mechanically passed onto the rod with extreme difficulty. Conversely, the new piston is self-constructing: the researchers designed the helicoidal molecule specifically so that it winds itself spontaneously around the rod, while retaining enough flexibility for its lateral movements.

By allowing the large scale manufacturing of such molecular pistons, this self-assembly capacity augurs well for the rapid development of applications in various disciplines: biophysics, electronics, chemistry, etc. By grafting several pistons together end-to-end, it could be possible, for example, to produce a simplified version of an artificial muscle, capable of contracting on demand. A surface bristling with molecular pistons could, as and when required, become an electrical conductor or insulator. Finally, a large-scale version of the rod on which several helices could slide would provide a polymer of adjustable mechanical stiffness. This goes to show that the possibilities for this new molecular piston are (almost) infinite.

ScienceDaily

TStzmmalaysia
post Mar 5 2011, 08:02 PM



NANOTECHNOLOGY


New 'Frozen Smoke' May Improve Robotic Surgery, Energy Storage

A spongy substance that could be mistaken for packing material has the nanotechnology world buzzing. University of Central Florida Associate Professor Lei Zhai and postdoctoral associate Jianhua Zou have engineered the world's lightest carbon material in such a way that it could be used to detect pollutants and toxic substances, improve robotic surgery techniques and store energy more efficiently.

The new material belongs to the family of the world's lightest solids, known technically as aerogels and commonly nicknamed "frozen smoke."

Zhai's team worked with UCF professors Saiful Khondaker, Sudipta Seal and Quanfang Chen to create multiwalled carbon nanotube (MWCNT) aerogel. Carbon nanotubes are so small that thousands fit on a single strand of human hair. Using nanotubes instead of silica (the main component of sand), the foundation of traditional aerogel, increases the material's practical usefulness.

For the first time, even the tiniest pressure change can be detected and tracked. Strips of MWCNT aerogel could be used in robotic fingers and hands to make them super sensitive and give them the ability to distinguish between holding a power saw or a scalpel -- a distinction necessary for use in surgery.

Because the nanotubes have a large surface area, great amounts of energy could be stored in the aerogel, increasing the capacity of lithium batteries or supercapacitors used to store energy generated from renewable resources such as wind and the sun.

Combining the larger surface area and improved electrical conductivity is also important in developing sensors that can detect toxins capable of invading the food or water supply. And the same technique can be used to develop equipment capable of detecting even trace amounts of explosives.

"This has many potential applications and could really open up new areas to explore that we haven't even imagined yet," Zhai said.

ScienceDaily

TStzmmalaysia
post Mar 6 2011, 10:22 AM



ENERGY


Supercritical Carbon Dioxide Powered Brayton Cycle Turbine to Increase Efficiency by 50%
by Timon Singh

While one can always look for new sources of renewable energy, there is a lot to be said for simply improving the efficiency of current systems. That is what researchers at the Sandia National Laboratories are working on by developing a gas turbine system that should increase thermal-to-electric conversion efficiency by as much as 50 percent. This could see a dramatic improvement for nuclear powered stations, which use steam turbines to generate energy.

The research team is using supercritical carbon dioxide (S-CO2) not only to reduce costs but also to increase the efficiency of their design. The system, based on Brayton-cycle turbines, would ideally replace steam-driven Rankine-cycle turbines, which have lower efficiency and are corrosive at high temperatures. Those old designs are also massive, occupying 30 times as much space because of the need for very large turbines and condensers. By comparison, the Brayton-cycle system yields 20 megawatts of electricity while taking up only four cubic meters.

The Brayton-cycle, named after George Brayton, uses heated air and directs it in a particular direction – much like jet engines.

“This machine is basically a jet engine running on a hot liquid,” said principal investigator Steve Wright of Sandia’s Advanced Nuclear Concepts group. “There is a tremendous amount of industrial and scientific interest in supercritical CO2 systems for power generation using all potential heat sources including solar, geothermal, fossil fuel, biofuel and nuclear.”

With other companies working on similar systems, the technology is set to be a game-changer in the energy-generating industry. The only question is who will dominate the market first. Wright, however, has confidence in his team.

“Sandia is not alone in this field, but we are in the lead,” Wright said. “We’re past the point of wondering if these power systems are going to be developed; the question remains of who will be first to market.”

Inhabitat

TStzmmalaysia
post Mar 6 2011, 08:01 PM



RESEARCH


Supercritical carbon dioxide Brayton Cycle turbines promise giant leap

Sandia National Laboratories researchers are moving into the demonstration phase of a novel gas turbine system for power generation, with the promise that thermal-to-electric conversion efficiency will be increased to as much as 50 percent — an improvement of 50 percent for nuclear power stations equipped with steam turbines, or a 40 percent improvement for simple gas turbines. The system is also very compact, meaning that capital costs would be relatively low.

Research focuses on supercritical carbon dioxide (S-CO2) Brayton-cycle turbines, which typically would be used for bulk thermal and nuclear generation of electricity, including next-generation power reactors. The goal is eventually to replace steam-driven Rankine cycle turbines, which have lower efficiency, are corrosive at high temperature and occupy 30 times as much space because of the need for very large turbines and condensers to dispose of excess steam. The Brayton cycle could yield 20 megawatts of electricity from a package with a volume as small as four cubic meters.

The Brayton cycle, named after George Brayton, originally functioned by heating air in a confined space and then releasing it in a particular direction. The same principle is used to power jet engines today.

"This machine is basically a jet engine running on a hot liquid," said principal investigator Steve Wright of Sandia's Advanced Nuclear Concepts group. "There is a tremendous amount of industrial and scientific interest in supercritical CO2 systems for power generation using all potential heat sources including solar, geothermal, fossil fuel, biofuel and nuclear."

Sandia currently has two supercritical CO2 test loops. (The term "loop" derives from the shape taken by the working fluid as it completes each circuit.) A power production loop is located at the Arvada, Colo., site of contractor Barber Nichols Inc., where it has been running and producing approximately 240 kilowatts of electricity during the developmental phase that began in March 2010. It is now being upgraded and is expected to be shipped to Sandia this summer.

A second loop, located at Sandia in Albuquerque, is used to research the unusual issues of compression, bearings, seals, and friction that exist near the critical point, where the carbon dioxide has the density of liquid but otherwise has many of the properties of a gas.

Immediate plans call for Sandia to continue to develop and operate the small test loops to identify key features and technologies. Test results will illustrate the capability of the concept, particularly its compactness, efficiency and scalability to larger systems. Future plans call for commercialization of the technology and development of an industrial demonstration plant at 10 MW of electricity.

A competing system, also at Sandia and using Brayton cycles with helium as the working fluid, is designed to operate at about 925 degrees C and is expected to produce electrical power at 43 percent to 46 percent efficiency. By contrast, the supercritical CO2 Brayton cycle provides the same efficiency as helium Brayton systems but at a considerably lower temperature (250-300 C). The S-CO2 equipment is also more compact than that of the helium cycle, which in turn is more compact than the conventional steam cycle.

Under normal conditions materials behave in a predictable, classical, "ideal" way as conditions cause them to change phase, as when water turns to steam. But this model tends not to work at lower temperatures or higher pressures than those that exist at these critical points. In the case of carbon dioxide, it becomes an unusually dense "supercritical" liquid at the point where it is held between the gas phase and liquid phase. The supercritical properties of carbon dioxide at temperatures above 500 C and pressures above 7.6 megapascals enable the system to operate with very high thermal efficiency, exceeding even that of a large coal-fired power plant and nearly double that of a gasoline engine (about 25 percent).

In other words, as compared with other gas turbines the S-CO2 Brayton system could increase the electrical power produced per unit of fuel by 40 percent or more. The combination of low temperatures, high efficiency and high power density allows for the development of very compact, transportable systems that are more affordable because only standard engineering materials (stainless steel) are required, less material is needed, and the small size allows for advanced-modular manufacturing processes.
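
A back-of-envelope check of the "40 percent or more" figure: taking roughly 35 percent as an assumed efficiency for a conventional gas turbine (an illustrative baseline, not a number from the article) against the projected ~50 percent for S-CO2 gives approximately the claimed gain.

```python
# Back-of-envelope check of the "40 percent or more" claim. The 35 percent
# baseline is an assumed figure for a conventional gas turbine (not from
# the article); the 50 percent S-CO2 figure is the article's projection.
baseline_eff = 0.35   # assumed conventional thermal-to-electric efficiency
sco2_eff = 0.50       # projected S-CO2 Brayton-cycle efficiency

gain = (sco2_eff - baseline_eff) / baseline_eff
print(f"{gain:.0%} more electricity per unit of fuel")  # about 43%
```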

"Sandia is not alone in this field, but we are in the lead," Wright said. "We're past the point of wondering if these power systems are going to be developed; the question remains of who will be first to market. Sandia and DOE have a wonderful opportunity in the commercialization effort."

Provided by Sandia National Laboratories.

Sandia National Laboratories is a multiprogram laboratory operated and managed by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration. With main facilities in Albuquerque, N.M., and Livermore, Calif., Sandia has major R&D responsibilities in national security, energy and environmental technologies, and economic competitiveness.

EurekAlert

TStzmmalaysia
post Mar 6 2011, 08:04 PM



RESEARCH


Graphene etching to usher in computing revolution

Move over sticky tape: a splattering of zinc atoms and a dash of acid is the best way to peel off single layers of graphene, the atom-thin form of carbon that electrons can zip through with incredible efficiency and speed.

This technique is so precise that it might be possible to create electrical circuits using only graphene components, which could allow this exotic material to realise its potential as the basis for ultra-efficient, super-fast computer chips.

Graphene was first discovered in 2004 when Andre Geim and Konstantin Novoselov, both at the University of Manchester, UK, used sticky tape to pull single layers of the stuff off a piece of graphite.

But although its discoverers picked up the physics Nobel prize, until now it has proved difficult to remove single layers of graphene from specific locations, which is essential if you want to use it to build circuits on computer chips.

Sputtering zinc

Now, James Tour from Rice University in Houston, Texas, and colleagues have come up with a simple but effective way of doing this. They used a common laboratory technique known as sputtering to coat the top layer of a stack of graphene sheets with zinc metal: the zinc atoms collide with the stack but only have enough energy to damage the first layer.

Hydrochloric acid is then used to dissolve the zinc, removing this weakened first layer but leaving the other layers intact.

This gives researchers the ability to etch graphene with unprecedented precision – and create samples of a very exact thickness. "We are able to remove one layer at a time. Before this, lithography could never give you single atom precision. If you wanted to remove a layer, you would have to remove lots and lots of layers," says Tour.

He envisages using the technique to scrape off just the right number of layers from multi-layer graphene stacks, leaving behind pre-determined spots that are exactly one, two or three layers thick.

Graphene revolution

This level of control is important because the number of layers in a graphene stack determines its properties. For example, a single layer of graphene behaves like a metal whereas a double layer is like a semiconductor and can be built into a transistor. "The ability to have a single layer right next to a double layer next to triple layer is very attractive," says Tour. "You could build a series of devices very close to each other in any pattern you want, just by removing portions of each layer."

"This could result in a set of different electronic components all made of, and interconnected by, graphene," says Vitor Pereira from the National University of Singapore, who is not part of the research team. Such all-graphene devices would "explore the advantages of graphene to the fullest", he says and "help realise one of the ultimate goals in graphene-based electronics: all-graphene electronic circuits".

Vertical control

Zakaria Moktadir of the Nano Research Group at the University of Southampton, UK, agrees, adding that all-graphene circuits could bring us a step closer to building ultra-fast computer chips, as well as more sophisticated sensors and touchscreens.

Tour is confident that his "new tool" will open the door for even more complex and exotic devices to be built with graphene. "We have provided a wrench for people who have never had one before. It is up to them to see what they can do with it," he says.

Now that this degree of precision has been reached in the vertical direction, the next coup would be to achieve the same level of control laterally. Horizontal precision would allow trillions of one-atom-thick transistors to fit onto a chip, says Moktadir.

NewScientist

Journal reference: Science, DOI: 10.1126/science.1199183

TStzmmalaysia
post Mar 7 2011, 07:43 AM



RESEARCH


Language Barrier: To Take Advantage of Multicore Chips, Programmers will need Fundamentally New Software

For decades, computer scientists tried to develop software that could automatically turn a conventional computer program -- a long sequence of instructions intended to be executed in order -- into a parallel program -- multiple sets of instructions that can be executed at the same time. Now, most agree that that was a forlorn hope: Code that can be parallelized is too hard to recognize, and the means for parallelizing it are too diverse and context-dependent. "If you want to get parallel performance, you have to start writing parallel code," says MIT computer-science professor Saman Amarasinghe. And MIT researchers are investigating a host of techniques to make writing parallel code easier.

One of the most prominent is a software development system created by computer-science professor Charles Leiserson and his Supertech Research Group. Initially, the system used the programming language C — hence its name, Cilk. Cilk, Leiserson says, adds three commands to C: “spawn,” “sync,” and a variation of the standard command “for.” If a programmer has identified a section of a program that can be executed in parallel — if, say, the same operation has to be performed on a lot of different data — he or she simply inserts the command “spawn” before it. When the program is running, the Cilk system automatically allocates to the spawned computation as many cores as are free to handle it. If the results of the spawned computations need to be aggregated before the program moves on to the next instruction, the programmer simply inserts the command “sync.”
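
Cilk itself extends C, but the spawn/sync pattern described here can be sketched in plain Python with the standard `concurrent.futures` module. This is a loose analogy for illustration, not Cilk's actual runtime; `square` and `parallel_squares` are made-up names:

```python
# Loose Python analogy to Cilk's "spawn"/"sync" (not Cilk's runtime):
# "spawn" ~ submit a task to a worker pool; "sync" ~ wait for all results.
from concurrent.futures import ThreadPoolExecutor

def square(x):               # the per-item work to run in parallel
    return x * x

def parallel_squares(data):
    with ThreadPoolExecutor() as pool:
        futures = [pool.submit(square, x) for x in data]  # "spawn" each task
        return [f.result() for f in futures]              # "sync": gather

print(parallel_squares([1, 2, 3, 4]))  # [1, 4, 9, 16]
```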

The reason Leiserson’s group could get away with such minimal alteration of C is the “runtime” system that underlies programs written in Cilk. A runtime is an extra layer of software between a program and a computer’s operating system, which allows the same program to run on many different machines; the most familiar example is probably the runtime that interprets programs written in Java. “All the smartness is underneath, in the runtime system,” Leiserson explains.

One unusual feature of Cilk’s runtime is the way it allocates tasks to different cores. Many parallel systems, Leiserson explains, use a technique called “work sharing,” in which a core with a parallel task queries the other cores on the chip to see which are free to take on some additional work. But passing messages between cores is much more time-consuming than executing computations on a given core, and it ends up eating into the gains afforded by parallel execution. The Cilk runtime instead uses a technique called “work stealing.” A core that generates a host of tasks that could, in principle, be executed in parallel just queues them up in its own memory, as it would if there were no other cores on the chip. A core that finds itself without work, on the other hand, simply selects one other core at random and pulls tasks out of its queue. As long as the program has enough parallelism in it, this drastically reduces the communication overhead.
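
The work-stealing idea can be illustrated with a toy single-threaded simulation (an assumption-laden sketch, not Cilk's scheduler): busy workers drain their own queues, while an idle worker pulls a task from a randomly chosen victim's queue.

```python
# Toy single-threaded simulation of "work stealing" (a sketch, not Cilk's
# scheduler): a busy worker executes from its own queue; an idle worker
# steals a task from a randomly chosen victim instead of waiting.
import random
from collections import deque

def run(num_workers, tasks, seed=0):
    rng = random.Random(seed)
    queues = [deque() for _ in range(num_workers)]
    queues[0].extend(tasks)              # worker 0 generated all the work
    done = []
    while any(queues):
        for worker, q in enumerate(queues):
            if q:                        # busy: execute from own queue
                done.append((worker, q.popleft()))
            else:                        # idle: try to steal from a victim
                victim = rng.randrange(num_workers)
                if queues[victim]:
                    q.append(queues[victim].pop())
    return done

executed = run(4, list(range(8)))
assert sorted(task for _, task in executed) == list(range(8))  # each ran once
```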

One of the advantages of Cilk, Leiserson explains, is that the programmer writes the same program whether it’s going to run on a multicore computer or a single-core computer. Execution on a single-core computer is no different than execution on a computer with multiple cores, all but one of which is busy. Indeed, Cilk’s advantages are evident enough that Intel now designs its compilers — the programs that convert code into instructions intelligible to computers — to work with Cilk.

Amarasinghe is attacking the problem of parallel programming on several fronts. One difficulty with parallel programs is that their behavior can be unpredictable: If, for instance, a computation is split between two cores, the program could yield a very different result depending on which core finishes its computation first. That can cause headaches for programmers, who often try to identify bugs by changing a line or two of code and seeing what happens. That approach works only if the rest of the program executes in exactly the same way. Amarasinghe and his students have developed a system in which cores report their results in an order determined by the number of instructions they’ve executed, not the time at which they finished their computations. If a core with a short list of instructions runs into an unexpected snag — if, say, its request for data from main memory gets hung up — the other cores will wait for it to finish before reporting their own results, preserving the order in which the results arrive.
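
A minimal sketch of that ordering rule: results are sorted by a logical instruction count rather than wall-clock finish time, so two runs whose cores finish in different orders still report identically. The tuple format here is an invented illustration of the idea, not MIT's actual system.

```python
# Sketch of deterministic reporting: order core results by a logical
# instruction count instead of completion time. The (core, instructions,
# value) tuple layout is a hypothetical illustration.
def deterministic_order(results):
    return [value for _, _, value in
            sorted(results, key=lambda r: (r[1], r[0]))]

# Two runs whose cores finish in different orders report identically:
run_a = [(1, 200, "b"), (0, 100, "a")]   # core 1 happened to finish first
run_b = [(0, 100, "a"), (1, 200, "b")]
assert deterministic_order(run_a) == deterministic_order(run_b) == ["a", "b"]
```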

Another project, called StreamIt, exploits the parallelism inherent in much digital signal processing. Before a computer can display an Internet video, for instance, it needs to perform a slew of decoding steps — including several different types of decompression and color correction, motion compensation and equalization. Traditionally, Amarasinghe says, video software will take a chunk of incoming data, pass it through all those decoding steps, and then grab the next chunk. But with StreamIt, as one chunk of data is exiting a step, another chunk is entering it. The programmer just has to specify what each step does, and the system automatically divides up the data, passes it between cores, and synthesizes the results.
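
Python generators give a compact picture of that streaming overlap: as one chunk leaves a stage, the next enters it. This is only a structural analogy — real StreamIt schedules the stages across cores — and the two stage functions are placeholder stand-ins for decoding steps:

```python
# Structural analogy for the streaming pipeline: generators pass chunks
# stage-to-stage lazily, so one chunk exits a stage as the next enters it.
# Both stages are placeholder stand-ins, not real video-decoding code.
def decompress(chunks):
    for chunk in chunks:
        yield chunk * 2      # stand-in for a decompression step

def color_correct(chunks):
    for chunk in chunks:
        yield chunk + 1      # stand-in for a color-correction step

stream = color_correct(decompress(iter([1, 2, 3])))
print(list(stream))  # [3, 5, 7]
```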

A programmer trying to decide how to perform a particular computation generally has a range of algorithms to choose from, and which will work best depends on the data it’s handling and the hardware it’s running on. Together with professor of applied mathematics Alan Edelman, Amarasinghe has developed a language called PetaBricks that allows programmers to specify different ways to perform the same computation. When a PetaBricks program launches, it performs a series of measurements to determine which types of operations will work best on that machine under what circumstances. Although PetaBricks could offer mild advantages even on single-core computers, Amarasinghe explains, it’s much more useful on multicore machines. On a single-core machine, one way of performing a computation might, in rare cases, prove two or three times as efficient as another; but because of the complexities of parallel computing, the difference on a multicore machine could be a factor of 100 or more.
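
The measure-then-choose idea can be sketched with a toy autotuner in the PetaBricks spirit (this is not PetaBricks itself, and the candidate functions are illustrative): time each candidate on a representative input, then keep the fastest.

```python
# Toy autotuner in the PetaBricks spirit (not PetaBricks itself): measure
# candidate implementations on a representative input, keep the fastest.
import timeit

def sum_loop(data):          # candidate 1: explicit Python loop
    total = 0
    for x in data:
        total += x
    return total

def sum_builtin(data):       # candidate 2: built-in C-level sum
    return sum(data)

def autotune(candidates, sample, trials=50):
    timings = {f: timeit.timeit(lambda: f(sample), number=trials)
               for f in candidates}
    return min(timings, key=timings.get)

best = autotune([sum_loop, sum_builtin], list(range(10_000)))
print(best.__name__)  # usually sum_builtin on CPython, but machine-dependent
```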

One of the more radical parallel-programming proposals at the Computer Science and Artificial Intelligence Laboratory comes from the lab of Panasonic Professor of Electrical Engineering Gerald Sussman. Traditionally, computer scientists have thought of computers as having two fundamental but distinct components: a logic circuit and a memory bank. In practice, that distinction has been complicated by evolving hardware designs, but for purposes of theoretical analysis, it’s generally taken for granted.

Sussman and his former postdoc Alexey Radul, who completed his PhD at MIT in 2009 and is now at the Hamilton Institute in Maynooth, Ireland, suggest that we instead envision a computer as a fleet of simple processors and memory cells, and programming as wiring those elements together in different patterns. That conception, Radul believes, would make it easier to design software to solve problems common in artificial intelligence, such as constraint-satisfaction problems, whose solutions need to meet several sometimes-contradictory conditions at once. Sudoku puzzles are a simple example.

Radul’s network is an abstraction, designed to make things easier for AI researchers: It could, in principle, be implemented on a single core. But it obviously lends itself to multicore computing. Either way, one of the central problems it poses is how to handle situations in which multiple processing units are trying to store different values in a shared memory cell. In his doctoral thesis, Radul demonstrated how to design memory cells that store information about data rather than storing the data themselves, much like a Sudoku solver who jots several possibilities in the corner of an empty square. But Radul acknowledges that designing the memory cells is just a first step in making his and Sussman’s system feasible. Reinventing computing from the ground up will take more work than that.

PhysOrg

This story is republished courtesy of MIT News (http://web.mit.edu/newsoffice/), a popular site that covers news about MIT research, innovation and teaching.

More information: Computer chips' clocks have stopped getting faster. To maintain the regular doubling of computer power that we now take for granted, chip makers have been giving chips more “cores,” or processing units. But how to distribute computations across multiple cores is a hard problem, and this five-part series of articles examines the different levels at which MIT researchers are tackling it, from hardware design up to the development of new programming languages.

Part 1: Designing the hardware
Part 2: The next operating system
Part 3: Retooling Algorithms
Part 4: Minimizing communication between cores
TStzmmalaysia
post Mar 7 2011, 07:46 AM



TRANSPORTATION


Thin-Nosed "Hayabusa" Bullet Train About to Make Its Debut

Japan's latest bullet train, the thin-nosed "Hayabusa" or Falcon, will make its 300 kilometre per hour (186 mph) debut Saturday, boasting a luxury carriage modelled on airline business class.

Japan has built up a network of cutting-edge Shinkansen train lines since the 1960s that criss-cross the island nation and now hopes to sell the infrastructure technology abroad, including to the United States.

The latest ultra-fast tech-marvel will make three trips a day from Tokyo to the city of Aomori, in a scenic rural backwater on the northern tip of the main Honshu island that has until now been off Japan's bullet train map.

The green-and-silver E5 series Hayabusa will travel at up to 300 kmh to make the 675 kilometre trip in three hours and 10 minutes. From next year, it will push its top speed to 320 kmh to become Japan's fastest train.

Passengers will glide quietly through the straights and tunnels that cut through Japan's mountainous countryside, says operator East Japan Railway Co, which has heavily promoted the launch of the new service.

Those willing to pay 26,360 yen ($320) for a one-way trip can enjoy the comfort of a 'GranClass' car, where a cabin attendant will serve them as they enjoy deeply reclining leather seats and thick woollen carpets.

To promote the service, the train company has also heavily advertised Aomori as a tourist destination, praising its landscape, seafood and winter snow.

Japan's ultra-fast, frequent and punctual bullet trains have made them the preferred choice for many travellers, rather than flying or road travel, ever since the first Shinkansen was launched in time for the 1964 Tokyo Olympics.

But as Japan, and its railway companies, struggle with a fast-greying and shrinking population and falling domestic demand, the government and industry are aggressively seeking to promote the bullet trains abroad.

Japan has in the past sold Shinkansen technology to Taiwan and hopes to capture other overseas markets, such as Brazil and Vietnam, but faces stiff competition from train manufacturers in China, France and Germany.

The biggest prize is a future high-speed US rail network that President Barack Obama has promoted, to be backed by 13 billion dollars in public funding.

California's then-governor Arnold Schwarzenegger was treated to an early test ride on the Hayabusa when he visited Japan in September.

Japan says its trains boast a strong safety record: despite running in an earthquake-prone country, no passenger has ever died due to a Shinkansen derailment or collision -- although people have committed suicide by jumping in front of the trains.

Japan has also been developing a magnetic levitation or maglev train that, its operator says, reached a world record speed of 581 kilometres per hour in 2003 on a test track near Mount Fuji in Tsuru, west of Tokyo.

The plan is to launch maglev services between Tokyo and the central city of Nagoya by 2027. By 2045 they are expected to link Tokyo with the main western city of Osaka in just one hour and seven minutes, compared with the current two hours and 25 minutes.

PhysOrg

TStzmmalaysia
post Mar 7 2011, 09:03 PM



BIOTECHNOLOGY


A Stretchy Sensing Tool for Surgery

A new surgical tool covered in stretchable sensors could reduce the time required to map electrical problems in the heart from over an hour to just a few minutes. The tool could be one of the first commercial applications for an innovative method for making dense arrays of stretchable, biocompatible electronics using high-performance materials including silicon. The tool, which senses temperature and electrical activity, could also lead to better monitoring during other types of surgery, potentially reducing the rate of complications.

Putting such devices on a stretchy surface is not possible using conventional electronics manufacturing. The stretchable silicon electronics used were developed by John Rogers, professor of materials science and engineering at the University of Illinois at Urbana-Champaign and a cofounder of MC10, a startup that is commercializing the technology.

The surgical tool has performed well in animal tests designed to mimic a disorder called atrial fibrillation. This results from electrical problems in the heart tissue around the pulmonary vein, which carries blood back to the heart from the lungs. The condition, in which the upper chambers of the heart quiver instead of beating, is seen in over 2 million Americans, and in 15 percent of all people who have strokes. Atrial fibrillation is difficult to control with drugs, and the drugs that are used, including blood thinners, can have serious side effects. But the problem can be corrected with surgery. First, surgeons map the source of the electrical problem with a probe, and then they knock out the electrical trouble spots by heating and damaging those tissues.

The new multifunctional surgical tools could help speed this surgery, lowering the risk that something will go wrong.

Mapping electrical activity in heart tissue is conventionally done using a tool called a balloon catheter—a soft, inflatable probe fitted with one or two electrodes. The catheter is moved back and forth over the damaged tissue, taking thousands of electrical readings one at a time, and these become the basis for a map of electrical activity. But the process is time-consuming—in the case of some fibrillations it takes over an hour.

The new catheter is covered in a mesh of hundreds of thousands of high-performance sensors and other electronics. It can be placed in the area of interest and inflated, making hundreds of thousands of contacts at once without the need to move it. When fitted with heating elements, it can also be used to perform the ablation—the destruction of the malfunctioning tissue—which normally requires the use of a second catheter. "You can keep it registered with the tissues and increase effectiveness and safety by being more accurate," says Marvin Slepian, a cardiologist at the University of Arizona Sarver Heart Center, who led the animal trials and is a cofounder of MC10.
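The speedup described here comes from parallel acquisition. A minimal sketch below contrasts the two approaches (the 32 × 32 electrode grid, the synthetic activity field and all names are our assumptions for illustration, not the MC10 device's actual geometry or firmware): a single-electrode probe must visit each site in turn, while a dense array samples every site in one pass.

```python
import math

GRID = 32  # assume a 32 x 32 electrode mesh for the sketch

def activation(x, y):
    """Synthetic electrical-activity field with one 'trouble spot'."""
    return math.exp(-((x - 20) ** 2 + (y - 11) ** 2) / 30.0)

# Array catheter: one simultaneous acquisition of the whole map.
array_map = [[activation(x, y) for x in range(GRID)] for y in range(GRID)]
array_passes = 1

# Conventional balloon catheter: one reading per site, thousands of passes.
sequential_passes = GRID * GRID

# Locate the strongest source, i.e. the candidate ablation target.
peak = max((array_map[y][x], x, y) for y in range(GRID) for x in range(GRID))
_, px, py = peak
print(px, py, sequential_passes, array_passes)
```

With one acquisition instead of roughly a thousand, the activity map and the ablation target fall out of the same pass, which is the time saving the article describes.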

The temperature sensors in the catheter also enhance safety. If the heart tissue gets too hot during surgery, it can fuse with esophageal tissue, causing a fatal complication called a fistula. Temperature is currently monitored during surgery using a probe placed in the patient's esophagus. But by the time the tissue there heats up, it's often too late, says Slepian.

The results of the initial tests of the catheter are described this week in the journal Nature Materials. Slepian is now leading tests of the sensor-covered catheters in larger animals and is testing their use as a way to map and treat more complicated arrhythmias in the ventricles of the heart. He says the company is still deciding what path to take toward clinical approval of the tool, pending continued successful tests in animals. If clinical trials are required by the Food and Drug Administration, he says, it will be a few years before the tool reaches the market.

MIT Technology Review


TStzmmalaysia
post Mar 7 2011, 09:05 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

New instrument keeps an 'eye' on nanoparticles

Precision measurement in the world of nanoparticles has now become a possibility, thanks to scientists at UC Santa Barbara.

The UCSB research team has developed a new instrument capable of detecting individual nanoparticles with diameters as small as a few tens of nanometers. The study will be published online this week by Nature Nanotechnology, and will appear in the April print issue of the journal.

"This device opens up a wide range of potential applications in nanoparticle analysis," said Jean-Luc Fraikin, the lead author on the study. "Applications in water analysis, pharmaceutical development, and other biomedical areas are likely to be developed using this new technology." The instrument was developed in the lab of Andrew Cleland, professor of physics at UCSB, in collaboration with the group of Erkki Ruoslahti, Distinguished Professor, Sanford-Burnham Medical Research Institute at UCSB.

Fraikin is presently a postdoctoral associate in the Marth Lab at the Sanford-Burnham Medical Research Institute's Center for Nanomedicine, and in the Soh Lab in the Department of Mechanical Engineering at UC Santa Barbara.

The device detects the tiny particles, suspended in fluid, as they flow one by one through the instrument at rates estimated to be as high as half a million particles per second. Fraikin compares the device to a nanoscale turnstile, which can count –– and measure –– particles as they pass individually through the electronic "eye" of the instrument.

The instrument measures the volume of each nanoparticle, allowing for very rapid and precise size analysis of complex mixtures. Additionally, the researchers showed that the instrument could detect bacterial virus particles, both in saline solution as well as in mouse blood plasma.
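Since the device reports particle volume, converting to an equivalent spherical diameter is a one-line application of V = (π/6)d³. A small sketch (the function name and sample values are ours, for illustration only, assuming roughly spherical particles):

```python
import math

def diameter_nm(volume_nm3):
    """Equivalent spherical diameter for a measured volume: V = (pi/6) d^3."""
    return (6.0 * volume_nm3 / math.pi) ** (1.0 / 3.0)

# Round-trip check: a 30 nm sphere has volume (pi/6) * 30^3 ~ 14,137 nm^3.
v30 = math.pi / 6.0 * 30.0 ** 3
print(round(diameter_nm(v30), 6))  # 30.0

# Converting a stream of measured volumes gives the size distribution directly.
volumes = [v30 * f for f in (0.5, 0.9, 1.0, 1.1, 2.0)]
diameters = sorted(diameter_nm(v) for v in volumes)
```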

In this study, the researchers further discovered a surprisingly high concentration of nanoparticles present in the native blood plasma. These particles exhibited an intriguing size distribution, with particle concentration increasing as diameters fell toward 30 to 40 nanometers, an as-yet unexplained result.

EurekAlert

TStzmmalaysia
post Mar 7 2011, 09:08 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


BIOTECHNOLOGY

Attached Image

A Bionic Eye Comes to Market

After more than two decades of research and development, the first retinal prosthesis has received European approval for clinical and commercial use. People blinded by degenerative eye disease will have the option of buying an implant that can restore their vision at least partially.

"It marks the beginning of an era in which sight will be restored at ever more astonishing levels," says Robert Greenberg, president and CEO of Second Sight, the California company that developed the device.

Walter Wrobel, CEO of Retina Implant AG of Reutlingen, Germany, a startup that is carrying out trials of a similar device in several countries, says the approval is an exciting development for hundreds of thousands of people who suffer from diseases like retinitis pigmentosa.

With the Argus II system, a camera mounted on a pair of glasses captures images, and corresponding signals are fed wirelessly to a chip implanted near the retina. These signals are sent to an array of implanted electrodes that stimulate retinal cells, producing light in the patient's field of view. The process works for people with retinitis pigmentosa because the disease damages only the light-sensing photoreceptors, leaving the remaining retinal cells healthy.

So far, the Argus II can restore only limited vision. "Patients can locate and recognize simple objects, see people in front of them, and follow their movement," says Greenberg. "They can find doors and windows, follow lines, and in the best cases read large print slowly," he says.

Getting this device to market is an important achievement, says Eberhart Zrenner, director of the Institute for Ophthalmic Research at the University of Tübingen in Germany and founder of Retina Implant AG. "On the other hand, the type of vision the Argus II can provide with 60 electrodes is quite limited," he says.

Zrenner is developing a device for Retina Implant that has more than 1,500 electrodes and captures images using light-sensitive photodiodes on the chip within the eye, instead of with an external camera. "It has the light-sensitive photodiodes positioned under the retina right at the place of the degenerated photoreceptors and therefore needs no camera outside," he says.
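The practical difference between 60 and 1,500 electrodes is spatial resolution. A rough intuition pump (our illustration only, not either device's actual signal chain; the grid shapes are assumed): average-pool a simple scene onto each electrode count and compare how much detail survives.

```python
def downsample(image, rows, cols):
    """Average-pool a 2D grayscale image onto a rows x cols electrode grid."""
    h, w = len(image), len(image[0])
    out = []
    for r in range(rows):
        row = []
        for c in range(cols):
            ys = range(r * h // rows, (r + 1) * h // rows)
            xs = range(c * w // cols, (c + 1) * w // cols)
            vals = [image[y][x] for y in ys for x in xs]
            row.append(sum(vals) / len(vals))
        out.append(row)
    return out

# A 60 x 60 test scene: a bright vertical bar on a dark background.
scene = [[255 if 25 <= x < 35 else 0 for x in range(60)] for y in range(60)]

coarse = downsample(scene, 6, 10)   # 60 "electrodes"
finer = downsample(scene, 30, 50)   # 1,500 "electrodes"
print(len(coarse) * len(coarse[0]), len(finer) * len(finer[0]))  # 60 1500
```

At 60 cells the bar is a blurry bright column; at 1,500 its edges become resolvable, which is roughly the gap between locating objects and reading large print.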

Second Sight is also working on larger arrays. But for now, what distinguishes the Argus II from all other devices is its ability to survive long-term implantation in the human body. The Argus II has been tested in trials involving 30 patients. "We have done something that many people would have thought and did think was impossible," says Greenberg.

MIT Technology Review


TStzmmalaysia
post Mar 8 2011, 08:35 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

Black holes: a model for superconductors?

Black holes are some of the heaviest objects in the universe. Electrons are some of the lightest. Now physicists at the University of Illinois at Urbana-Champaign have shown how charged black holes can be used to model the behavior of interacting electrons in unconventional superconductors.

"The context of this problem is high-temperature superconductivity," said Phillips. "One of the great unsolved problems in physics is the origin of superconductivity (a conducting state with zero resistance) in the copper oxide ceramics discovered in 1986." The results of research by Phillips and his colleagues Robert G. Leigh, Mohammad Edalati, and Ka Wai Lo were published online in Physical Review Letters on March 1 and in Physical Review D on February 25.

Unlike the old superconductors, which were all metals, the new superconductors start off their lives as insulators. In the insulating state of the copper-oxide materials, there are plenty of places for the electrons to hop but nonetheless no current flows. Such a state of matter, known as a Mott insulator after the pioneering work of Sir Nevill Mott, arises from the strong repulsions between the electrons. Although this much is agreed upon, much of the physics of Mott insulators remains unsolved, because there is no exact solution to the Mott problem that is directly applicable to the copper-oxide materials.

Enter string theory—an evolving theoretical effort that seeks to describe the known fundamental forces of nature, including gravity, and their interactions with matter in a single, mathematically complete system.

Fourteen years ago, a string theorist, Juan Maldacena, conjectured that some strongly interacting quantum mechanical systems could be modeled by classical gravity in a spacetime having constant negative curvature. The charges in the quantum system are replaced by a charged black hole in the curved spacetime, thereby wedding the geometry of spacetime with quantum mechanics.

Since the Mott problem is an example of strongly interacting particles, Phillips and colleagues asked the question: "Is it possible to devise a theory of gravity that mimics a Mott insulator?" Indeed it is, as they have shown.

The researchers built on Maldacena's mapping and devised a model for electrons moving in a curved spacetime in the presence of a charged black hole that captures two of the striking features of the normal state of high-temperature superconductors: 1) the presence of a barrier for electron motion in the Mott state, and 2) the strange metal regime in which the electrical resistivity scales as a linear function of temperature, as opposed to the quadratic dependence exhibited by standard metals.
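The two temperature dependences contrasted in point 2 can be written out explicitly (the symbols below are ours: ρ₀ is the residual resistivity, A and B are material-dependent coefficients):

```latex
% Strange metal (linear in T) vs. conventional Fermi-liquid metal (quadratic in T)
\rho_{\text{strange}}(T) \approx \rho_{0} + A\,T,
\qquad
\rho_{\text{conventional}}(T) \approx \rho_{0} + B\,T^{2}
```

The linear-in-T behavior of the strange-metal regime is one of the signatures the gravity-side model is asked to reproduce.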

The treatment advanced in the paper published in Physical Review Letters shows surprisingly that the boundary of the spacetime consisting of a charged black hole and weakly interacting electrons exhibits a barrier for electrons moving in that region, just as in the Mott state. This work represents the first time the Mott problem has been solved (essentially exactly) in a two-dimensional system, the relevant dimension for the high-temperature superconductors.

"The next big question that we must address," said Phillips, "is how does superconductivity emerge from the gravity theory of a Mott insulator?"

PhysOrg

Provided by University of Illinois College of Engineering

TStzmmalaysia
post Mar 8 2011, 08:37 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


APPLIED SCIENCES

Attached Image

New Free Software Enables Researchers to Collaborate with Colleagues around the World

New free software, launched by Oxford University scientists, gives researchers the tools they need to collaborate more efficiently and quickly with colleagues scattered around the world and working in a variety of different research areas.

The colwiz (‘collective wizdom’) R&D platform manages the entire research lifecycle from an initial idea, through a complex collaboration, to publication of the results. It is being launched through Isis Innovation’s Software Incubator – a new programme designed to promote software start-ups from the University of Oxford.

"At the moment researchers are using a dizzying array of different applications to communicate and collaborate," said colwiz Chief Scientist Professor David Gavaghan of Oxford University. "These might include Google Apps, Microsoft Live Services, LinkedIn, Yammer and Social Text. But because these are separate applications they don’t do everything and don’t always talk to each other, and this slows researchers down. colwiz replaces this hotchpotch with an integrated suite of tools custom-built for fast and efficient management of the research process."

At the heart of the colwiz platform is a publication library that enables users to manage publications using both a desktop application (for Windows, Linux and Mac) and a version ‘in the cloud’ that can be accessed from anywhere over the Internet. This is combined with communications and collaboration tools for brainstorming, research tasks and schedule management.

Tahir Mansoori, CEO and co-Founder of colwiz, said: ‘We are working with some of the leading researchers in Oxford who are undertaking projects funded by hundreds of millions of pounds in grant funding, but without any underpinning IT platform. So we thought: “why not build a platform that really supports these research activities?” That’s how colwiz was born and now we’re hoping researchers from institutions around the world will reap the benefits.’

Whilst useful in its own right for researchers writing their own publications, and keeping up with and citing the latest research in their area, the colwiz platform comes into its own with the sort of large interdisciplinary research collaborations needed to tackle some of the grand challenges of 21st Century science where collaboration tools are essential. "By breaking down the research process into its key components we have figured out which tools were potentially the most important. We then custom-built each tool from scratch and integrated them seamlessly into a single platform for individual and group productivity," said Mansoori.

"Over the last ten years my own research in the field of computational biology has become increasingly interdisciplinary, and I now work with a large number of colleagues not only from different departments in Oxford but from different Institutions around the world," said Professor Gavaghan. "I hadn’t found software that enabled me to manage the entire research process – from concept through collaborative execution to published results – within a single platform and neither had my colleagues. colwiz is the first platform to address these needs, and will significantly simplify research activities across the board – from individual students and researchers in universities to corporate R&D departments."

PhysOrg

More information: Members of academic institutions from US and UK universities can sign up for free and start using the colwiz platform. There are plans to extend the support to further academic institutions, government R&D organisations and commercial enterprises in the near future.

Provided by Oxford University.

TStzmmalaysia
post Mar 8 2011, 08:38 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


APPLIED SCIENCES

Attached Image

RA sufferers armed with kitchen safety tool

For sufferers of rheumatoid arthritis (RA), cooking tasks can be both difficult and dangerous. However, a new assistive technology invented by a student from Queensland University of Technology in Brisbane, Australia offers a safe way for people to lift cookware, relying on the strength of their forearms.

His design has earned a spot on the first-round shortlist of one of the world's most prestigious design competitions - the Australian Design Award/James Dyson Award.

Twenty-four-year-old Ching-Hao (Howard) Hsu, who graduated with a Bachelor of Design (Industrial Design) at the end of 2010, designed the 'arthritis handle' after observing several sufferers of rheumatoid arthritis performing cooking tasks in their own kitchens.

RA is a chronic disease affecting one percent of the population - about 500,000 Australians. It involves inflammation of the joints, which can lead to stiffness, swelling and sometimes disablement in the hands.

"After several observations and lots of interviews, I found that lifting was a major problem for sufferers of RA during cooking preparation," Mr Hsu said.

"It was difficult for sufferers of RA to lift things with their hands, due to having limited strength and flexibility. So they had to lift with their forearms. This limited them to using cookware with handles on both sides.

"If a saucepan only had one handle, most people put a towel over their other forearm to grasp the opposite side of the pot, but this was a slippery and dangerous way of lifting, exposing the person to the risk of burns.

"The arthritis handle allows sufferers of RA to use any kind of cookware, and not be limited to double-handled products. "Due to the limited flexibility of a hand with RA, the ergonomically-designed finger holder at the front of the arthritis handle fits comfortably on the user's hand without twisting the user's fingers. "The shape of the arthritis handle is also ergonomic, in that it spreads the weight of the cookware across the user's forearm."

Mr Hsu said the arthritis handle featured a silicone coating with heat resistance up to 200 degrees Celsius, to prevent heat from being directed to the forearm. "The thermoplastic elastomer (TPE) used in the product provides grip, while a magnetic strip enhances stability for people lifting metal cookware," he said.

"I want to make sure that the arthritis handle is eventually made available in various colours. People using assistive technologies often hate sticking out as being a 'special' person. So I want this to look like a normal kitchen tool, with the inner frame available in bright orange, yellow or green, with a white outer frame."

Mr Hsu, who grew up in Taiwan, began his masters in lighting engineering at QUT in February, and hopes to work on environmentally friendly products in the future.

PhysOrg

Provided by Queensland University of Technology.
TStzmmalaysia
post Mar 8 2011, 08:40 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

Laboratory-grown urethras implanted in patients

Researchers at the Institute for Regenerative Medicine at Wake Forest University Baptist Medical Center and colleagues reported today on a new advance in tissue engineering. The team is the first in the world to use patients' own cells to build tailor-made urinary tubes and successfully replace damaged tissue.

In an article published Online First by The Lancet, the research team reports replacing damaged segments of urinary tubes (urethras) in five boys. Tests to measure urine flow and tube diameter showed that the engineered tissue remained functional throughout the six-year (median) follow-up period.

"These findings suggest that engineered urethras can be used successfully in patients and may be an alternative to the current treatment, which has a high failure rate," said Anthony Atala, M.D., senior author, director of the Wake Forest Institute for Regenerative Medicine and a pediatric urologic surgeon. "This is an example of how the strategies of tissue engineering can be applied to multiple tissues and organs."

Atala's team used a similar approach to engineer replacement bladders that were implanted in nine children beginning in 1998, becoming the first in the world to implant laboratory-grown organs in humans. Researchers at the institute are currently working to engineer more than 30 different replacement tissues and organs.

Defective urethras can be the result of injury, disease or birth defects. While short defects in the tube are often easily repairable, larger defects can require a tissue graft, usually taken from skin or from the lining of the cheek.

"These grafts, which can have failure rates of more than 50 percent, often become narrowed, leading to infections, difficulty urinating, pain and bleeding," said Atlantida-Raya Rivera, lead author and director of the HIMFG Tissue Engineering Laboratory at the Metropolitan Autonomous University in Mexico City.

Between March 2004 and July 2007, the research team built engineered urethras for five boys, ages 10 to 14, using the patients' own cells. Three patients had widespread injury due to pelvic trauma and two patients had previous urethra repairs that had failed. The engineered tubes were used to replace entire segments of damaged urethra in the section that runs between the penis and the prostate (posterior section) -- considered the most difficult to repair.

The first step in engineering the replacement urethral segments was taking a small (one-half inch by one-half inch) bladder biopsy from each patient. From each sample, scientists isolated smooth muscle cells and epithelial cells, the cells that line the urethra and other tubular structures. These cells were multiplied in the lab for three to six weeks and were then placed on a three-dimensional scaffold shaped like a urethral tube. Smooth muscle cells were placed on the outside of the scaffold and epithelial cells on the inside. The scaffolds, which were sized for each individual patient, were made of a biodegradable mesh material. After cell placement, the scaffolds were incubated for seven days – with the total time for construction ranging from four to seven weeks. By day six, all surface areas were completely covered with cells.

After incubation, the tubes were surgically implanted by removing the defective segment of the urethra and scar tissue and sewing the replacement tubes in place. Once in the body, the cells continued to expand and tissue formation began. Biopsies showed that the engineered urethras had normal layers of epithelial and smooth muscle within three months after implantation. Flow measurements, urine tests and patient questionnaires confirmed patient satisfaction as measured by lack of nighttime leaking, straining to urinate, and urinary tract infections – common symptoms when urethral tubes become narrowed.

EurekAlert

TStzmmalaysia
post Mar 8 2011, 08:45 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


ROBOTICS

Attached Image

Teaching robots to move like humans

When people communicate, the way they move has as much to do with what they're saying as the words that come out of their mouths. But what about when robots communicate with people? How can robots use non-verbal communication to interact more naturally with humans? Researchers at the Georgia Institute of Technology found that when robots move in a more human-like fashion, with one movement leading into the next, people can not only better recognize what the robot is doing, but can also better mimic it themselves. The research is being presented today at the Human-Robot Interaction conference in Lausanne, Switzerland.

"It's important to build robots that meet people's social expectations because we think that will make it easier for people to understand how to approach them and how to interact with them," said Andrea Thomaz, assistant professor in the School of Interactive Computing at Georgia Tech's College of Computing.

Thomaz, along with Ph.D. student Michael Gielniak, conducted a study in which they asked how easily people can recognize what a robot is doing by watching its movements.

"Robot motion is typically characterized by jerky movements, with a lot of stops and starts, unlike human movement which is more fluid and dynamic," said Gielniak. "We want humans to interact with robots just as they might interact with other humans, so that it's intuitive."

Using a series of human movements taken in a motion-capture lab, they programmed the robot, Simon, to perform the movements. They also optimized that motion to allow for more joints to move at the same time and for the movements to flow into each other in an attempt to be more human-like. They asked their human subjects to watch Simon and identify the movements he made.
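The "movements flowing into each other" idea can be caricatured in a few lines. This is a toy sketch under our own assumptions, not Gielniak and Thomaz's actual optimizer: instead of stopping dead between motion segments, cross-fade the end of one joint-angle segment into the start of the next.

```python
def blend(seg_a, seg_b, overlap):
    """Cross-fade the last `overlap` samples of seg_a into seg_b's first."""
    merged = seg_a[:-overlap]
    for i in range(overlap):
        w = (i + 1) / (overlap + 1)  # weight ramps toward seg_b across the overlap
        merged.append((1 - w) * seg_a[-overlap + i] + w * seg_b[i])
    merged.extend(seg_b[overlap:])
    return merged

# Two joint-angle segments (degrees) for a wave-like gesture.
reach = [0, 10, 20, 30, 40]
wave = [40, 35, 40, 35, 40]

stop_and_go = reach + wave      # jerky: duplicated endpoint, hard stop between moves
fluid = blend(reach, wave, 2)   # one segment flows into the next

print(len(stop_and_go), len(fluid))
```

Real systems blend velocities and many joints at once, but the effect is the same: no stop-start boundary between gestures.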

"When the motion was more human-like, human beings were able to watch the motion and perceive what the robot was doing more easily," said Gielniak.

In addition, they tested the algorithm they used to create the optimized motion by asking humans to perform the movements they saw Simon making. The thinking was that if the movement created by the algorithm was indeed more human-like, then the subjects should have an easier time mimicking it. Turns out they did.

"We found that this optimization we do to create more life-like motion allows people to identify the motion more easily and mimic it more exactly," said Thomaz.

The research that Thomaz and Gielniak are doing is part of a broader effort to get robots to move more like humans do. In future work, the pair plan to look at how to get Simon to perform the same movements in various ways.

"So, instead of having the robot move the exact same way every single time you want the robot to perform a similar action like waving, you always want to see a different wave so that people forget that this is a robot they're interacting with," said Gielniak.

EurekAlert


TStzmmalaysia
post Mar 8 2011, 08:46 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


BIOTECHNOLOGY

Attached Image

Molecule that can erase or restore long-term memories – in rats

If you're struggling to remember the names of classmates from high school, or just can't forget that time you made a complete ass of yourself in front of your high school crush, then a single molecule known as PKMzeta could be to blame – and increasing or decreasing its activity in the brain could either help you remember those names that seem on the tip of your tongue or drive that embarrassing memory from your head.

In a new study, researchers have demonstrated that a memory in rats can either be enhanced or erased long after it is formed by manipulating the activity of the brain enzyme PKMzeta.

In earlier studies, the researchers conditioned rats to associate a nauseating sensation with saccharin by pairing it with lithium, so that they shunned the sweet taste. However, after the rats received a chemical that blocked the enzyme PKMzeta in the brain's neocortex, where long-term memories are stored, their sweet tooth returned within a couple of hours. The effect only worked retroactively and appeared to be permanent, suggesting that PKMzeta may be required for sustaining memories throughout the brain.

To confirm the findings of these earlier studies and to demonstrate the opposite effect, the researchers carried out a new study funded by the National Institutes of Health (NIH) using the same aversive learning model and paired it with genetic engineering to increase PKMzeta. To produce an overexpression of the enzyme, they harnessed a virus to infect the neocortex with the PKMzeta gene and saw an enhancement in the rats' memory function.

Conversely, replacing the naturally occurring PKMzeta with a mutant, inactive form of the enzyme erased the memory in much the same way as the chemical blocker used in the previous studies did.

"One explanation of the memory enhancement is that PKMzeta might go to some synapses, or connections between brain cells, and not others," said Todd Sacktor, of the SUNY Downstate Medical Center, New York City, a grantee of the NIH's National Institute of Mental Health (NIMH). "Overexpressed PKMzeta may be selectively captured by molecular tags that mark just those brain connections where it's needed – likely synapses that were holding the memory from the training."

Earlier this year, another study funded by the NIH found that treating rats with an insulin-like growth factor (IGF-II) significantly boosted retention and prevented forgetting of a fear memory, as long as the naturally occurring growth factor was injected into the rats' memory circuitry during time-limited windows just after learning and upon retrieval, when memories become fragile and changeable.

In contrast, the researchers say the PKMzeta mechanism appears to work anytime and the effects applied generally to multiple memories stored in the target brain area, which raises questions about how specific memories might be targeted in any future therapeutic applications.

"This pivotal mechanism could become a target for treatments to help manage debilitating emotional memories in anxiety disorders and for enhancing faltering memories in disorders of aging," said NIMH Director Thomas R. Insel, M.D.

A paper detailing the findings of the study is published in the journal Science.

Gizmag

TStzmmalaysia
post Mar 8 2011, 08:48 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


ROBOTICS

Attached Image

What humans really want - creating computers that understand users

Binghamton University computer scientist Lijun Yin thinks that using a computer should be a comfortable and intuitive experience, like talking to a friend. As anyone who has ever yelled "Why did you go and do that?" at their PC or Mac will know, however, using a computer is currently sometimes more like talking to an overly-literal government bureaucrat who just doesn't get you. Thanks to Yin's work with things like emotion recognition, however, that might be on its way to becoming a thing of the past.

Most of Yin's research in this area centers around the field of computer vision – improving the ways in which computers gather data with their webcams. More specifically, he's interested in getting computers to "see" their users, and to be able to guess what they want by looking at them.

Already, one of his graduate students has given a PowerPoint presentation, in which content on the slides was highlighted via eye-tracking software that monitored the student's face.

A potentially more revolutionary area of his work, however, involves getting computers to distinguish between human emotions. By obtaining 3D scans of the faces of 100 subjects, Yin and Binghamton psychologist Peter Gerhardstein have created a digital database that includes 2,500 facial expressions. The emotions conveyed by these expressions all fall under the headings of anger, disgust, fear, joy, sadness, and surprise. By mapping the differences in the subjects' faces from emotion to emotion, he is working on creating algorithms that can visually identify not only the six main emotions, but even subtle variations between them. The database is available free of charge to the nonprofit research community.
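A common baseline for this kind of expression classification is nearest-centroid matching over facial feature vectors. The sketch below is hypothetical (the feature names, numbers and pipeline are our assumptions; the article does not describe Yin's actual algorithms), but it shows the general shape of the approach:

```python
def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

# Toy 2-feature descriptors, e.g. (mouth-corner lift, brow raise),
# averaged per emotion; real systems use dense 3D landmark geometry.
centroids = {
    "joy": (0.9, 0.2),
    "surprise": (0.4, 0.9),
    "anger": (-0.5, -0.6),
    "sadness": (-0.7, 0.1),
    "fear": (-0.2, 0.8),
    "disgust": (-0.4, -0.2),
}

def classify(features):
    """Label a face by its nearest per-emotion mean feature vector."""
    return min(centroids, key=lambda e: distance(features, centroids[e]))

print(classify((0.8, 0.3)))
```

The "subtle variations" the article mentions correspond to measuring where a face falls between centroids rather than snapping it to the nearest one.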

Besides its use in avoiding squabbles between humans and computers, Yin hopes that his software could be used for telling when medical patients with communication problems are in pain. He also believes it could be used for lie detection, and to teach autistic children how to recognize the emotions of other people.

This is by no means the first foray into computer emotion recognition. Researchers at Cambridge University have developed a facial-expression-reading driving assistance robot, while Unilever has demonstrated a machine that dispenses free ice cream to people who smile at it.

Gizmag

TStzmmalaysia
post Mar 9 2011, 08:56 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


SPACE SCIENCE

Attached Image

Tropical Water Ice Discovered On Mars

In recent years, evidence of water ice on Mars has rapidly grown. At the poles, the ice is there all year round. And there is good evidence that it can be found just below the surface at latitudes down to 45 degrees, the extent of the southern ice cap in winter.

But the polar regions are cold, ever-changing and hazardous. If humans ever visit Mars, they'll have to first land near the equator, where the red planet is milder and more hospitable.

That may just have become more feasible thanks to the announcement of evidence of water ice beneath the surface of Mars at tropical latitudes as low as 25 degrees.

Mapping water ice deposits on Mars is a tricky business. Most of the water ice we know about is beneath the surface and cannot be seen directly. Its presence is inferred from the thermal characteristics of the areas we can see and measure.

For example, carbon dioxide ice can often form on the surface only if there is a cold layer beneath it, and the thermal properties of that layer constrain what it can be made of.

The data on which these calculations are based come from cameras on the Mars Express and Mars Reconnaissance Orbiter spacecraft which have been circling the planet since 2004 and 2006 respectively.

These cameras clearly show that CO2 ice forms on polar-facing slopes throughout the year at tropical latitudes.

Mathieu Vincendon at Brown University in Rhode Island and buddies have created a detailed model of the thermal budgets involved to explain how this ice forms. Their conclusion is that it is only possible if there is a cold layer that helps to store and release heat two or three metres beneath the surface.

Two materials match the thermal properties of this layer: water ice or ordinary bedrock.
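
As a rough illustration of why both candidates fit: what the orbital thermal measurements essentially constrain is thermal inertia, I = sqrt(k * rho * c), and water ice and bedrock happen to have comparable values. The numbers below are generic textbook figures, not the paper's:

```python
import math

def thermal_inertia(k, rho, c):
    """I = sqrt(k * rho * c).
    k: conductivity W/(m K), rho: density kg/m^3, c: heat capacity J/(kg K)."""
    return math.sqrt(k * rho * c)

ice     = thermal_inertia(2.2, 920, 2000)   # water ice
bedrock = thermal_inertia(2.0, 2900, 840)   # basaltic rock
dust    = thermal_inertia(0.02, 1300, 800)  # loose regolith, for contrast

# Ice and rock come out comparable (~2000 in SI units), while loose dust is
# an order of magnitude lower -- so thermal data alone cannot separate the
# two candidates, and the CO2-ice distribution argument is needed.
print(round(ice), round(bedrock), round(dust))
```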

However, Vincendon and co say that the distribution of CO2 ice patches around Mars rules out the possibility that bedrock is responsible. To explain this distribution, the bedrock layer would need to be uniformly buried in longitude but buried increasingly deeply with latitude, they say.

Furthermore, such a layer of bedrock, although stretching across vast swathes of the planet, is not visible in images from space. That it has never been revealed by processes such as erosion or meteor impacts makes its existence unlikely.

The only other option is that there is water ice just beneath the surface of Mars at tropical latitudes.

That's good news for the next generation of robotic explorers. One of the candidate landing sites for the Mars Science Laboratory, due for launch later this year, is the Holden Crater at a latitude of 26 degrees south.

This is right next to an area that Vincendon and co say has vast reserves of subsurface water ice. And that means the Mars Science Laboratory could get its robotic paws on the stuff within the next couple of years.

MIT Technology Review

This post has been edited by tzmmalaysia: Mar 9 2011, 08:56 AM
TStzmmalaysia
post Mar 9 2011, 08:58 AM



ROBOTICS

Attached Image

Robot fish can trick the real thing


Scientists have long turned to nature for inspiration and innovation. From unlocking the secrets of spider silk to create super-strong materials to taking hints from geckos for new adhesives, clues from the natural world often lead to advances in our practical world. But the relationship between engineering and nature has been largely one-directional, with humans reaping the majority of the benefits of discovery.

What if it was possible to close the loop, and combine human ingenuity and nature's wisdom to protect a species or ecosystem?

Maurizio Porfiri, assistant professor of mechanical engineering at the Polytechnic Institute of New York University, is one step closer to that goal through his research into the behavior of schooling fish, which is funded by a prestigious NSF Faculty Early Career Development (CAREER) award. Porfiri's findings led him to create a series of biologically inspired robots that may help preserve and protect marine life.

"Studies of schools of fish, flocks of birds and herds of animals have inspired robotic systems designed for our own applications," said Porfiri. "But I wanted to see if I could close the gap, bringing some of those benefits back into the natural world."

A lifelong animal lover who recalls childhood aspirations of becoming a zookeeper, Porfiri began his studies of fish schooling by examining how leadership is established within these populations. "Schooling fish have a rich system of information sharing," explains Porfiri. "They decide when to school based on a wide variety of factors, including vision and pressure cues from other fish. By studying these cues, we can learn how school members recognize--and follow--a leader."

Porfiri posited that if he could enforce leadership by an external member--in this case, a robot that actively engages the group--he could influence the direction and behavior of schooling fish. This could prove a life-saving advantage for marine populations in the event of oil or chemical spills or other natural disasters. Porfiri also envisions the ability to lead fish away from man-made dangers like turbines.

Porfiri's background in dynamical systems, mechanics of advanced materials and underwater robotics aided in the creation of robotic "leader" fish that, while not especially lifelike at first glance, are deceptively agile swimmers. When deployed in an environment with groups of gregarious fish, these robotic members have been effective at influencing the school's behavior. Porfiri suggests that one of the secrets to the robots' ability to successfully school with real fish may lie in their mimicry of the swim characteristics of real fish.

This first generation of robotic fish is capable of swimming along a plane, and future generations will be able to dive and surface. In laboratory observations, Porfiri and his team have noted a variety of interaction patterns between groups of gregarious fish and the underwater robot, including tracking, milling and following, hinting that the group's behavior can be altered by a robotic member.

In the meantime, the NSF CAREER grant, which also supports community outreach, gives Porfiri the opportunity to take his work beyond the lab to recapture the old dream of spending his days at the zoo. Throughout the academic year, he and his students can be found at the New York Aquarium, where they nurture a passion for math, science and engineering among local elementary and middle school students. The young students engage in authentic robot design experiments, creating custom caudal fins for robotic fish. By deploying robots equipped with these fins during test swims, the classes learn how fin size and shape affect swimming performance.

PhysOrg


TStzmmalaysia
post Mar 9 2011, 08:59 AM



RESEARCH

Attached Image

New 1TB hard drive platters enable 4TB hard drives

The world of hardware is, in some ways, a race. We want it faster, smaller, with a longer battery life, and with more storage. While we cannot always get all of those things from one device, when one of them is done really well, or really big, it is usually enough to garner attention, and today all eyes are on Samsung.

They have found a way to build 1TB platters for hard drives. That means they will be able to build 4TB hard drives for computers in the near future, and not just for desktops. They also showed off a 1TB drive with two platters that clocks in at just 2.5 inches, a standard size compatible with most notebooks.

This news is of note because right now we can only produce 3TB hard drives, with each of the platters coming in at 750GB, in the 3.5-inch form factor. In other words, the new drive offers more for less.
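
The headline numbers follow from simple arithmetic (an illustration, not Samsung's spec sheet): total capacity is just platter count times capacity per platter.

```python
def drive_capacity_tb(platters, gb_per_platter):
    """Total drive capacity in TB from platter count and per-platter capacity in GB."""
    return platters * gb_per_platter / 1000

print(drive_capacity_tb(4, 1000))  # four 1 TB platters -> 4.0 TB
print(drive_capacity_tb(4, 750))   # four 750 GB platters -> today's 3.0 TB limit
print(drive_capacity_tb(2, 500))   # two platters in the 2.5-inch notebook drive -> 1.0 TB
```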

In a more ambitious plan for the future, Samsung mentioned that, with some modifications, this technology could also allow for the creation of 10TB drives, though no specifics were given on this tantalizing prospect.

PhysOrg

TStzmmalaysia
post Mar 9 2011, 09:01 AM



RESEARCH

Attached Image

How long does a tuning fork ring? Quantum mechanics solves a very classical problem

Austrian and German researchers at the University of Vienna and Technische Universitaet Muenchen have solved a long-standing problem in the design of mechanical resonators: the numerical prediction of the design-limited damping. They report their achievement, which has a broad impact on diverse fields, in the forthcoming issue of Nature Communications. The article describes both a numerical method to calculate the mechanical damping as well as a stringent test of its performance on a set of mechanical microstructures.

From the wooden bars of a xylophone or the head of a drum to the strings and sound box of a guitar or violin, musical instruments are the most familiar examples of mechanical resonators. The mechanical vibrations of these instruments create acoustic waves that we hear as sound. The purity of the emitted tone is intimately related to the decay of the vibration amplitude, that is, to the mechanical losses of the system. A figure of merit for mechanical losses is the quality factor, simply called "Q", which describes the number of oscillations before the amplitude has decayed to a minute fraction of its starting value. The larger the Q, the purer the tone and the longer the system will vibrate before the sound dies away.
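
The Q described above can be estimated directly from a ringdown measurement. A minimal sketch using one common convention (amplitude decays as exp(-pi * f * t / Q), so Q = pi * f * tau for a 1/e amplitude decay time tau); the tuning-fork numbers are illustrative:

```python
import math

def quality_factor(freq_hz, tau_s):
    """Q from resonance frequency and 1/e amplitude ringdown time."""
    return math.pi * freq_hz * tau_s

# A 440 Hz tuning fork whose amplitude takes ~5 s to fall to 1/e:
q = quality_factor(440.0, 5.0)
print(round(q))  # ~6912: roughly the number of 'pure' oscillations in the ring
```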

In addition to the aesthetic examples found in a concert hall, mechanical resonators have become increasingly important for a wide variety of advanced technological applications, with such diverse uses as filtering elements in wireless communications systems, timing oscillators for commercial electronics, and cutting-edge research tools which include advanced biological sensors and emerging quantum electro- and optomechanical devices. Rather than producing pleasing acoustics, these applications rely on very "pure" vibrations for isolating a desired signal or for monitoring minute frequency shifts in order to probe external stimuli.

For many of these applications it is necessary to minimize the mechanical loss. However, it had previously remained a challenge to make numerical predictions of the attainable Q for even relatively straightforward geometries. Researchers from Vienna and Munich have now overcome this hurdle by developing a finite-element-based numerical solver that is capable of predicting the design-limited damping of almost arbitrary mechanical resonators. "We calculate how elementary mechanical excitations, or phonons, radiate from the mechanical resonator into the supports of the device", says Garrett Cole, Senior Researcher in the Aspelmeyer group at the University of Vienna. "This represents a significant breakthrough in the design of such devices."

The idea goes back to a previous work by Ignacio Wilson-Rae, physicist at the Technische Universitaet Muenchen. In collaboration with the Vienna group the team managed to come up with a numerical solution to compute this radiation in a simple manner that works on any standard PC. The predictive power of the numerical Q-solver removes the guesswork that is currently involved (e.g., trial and error prototype fabrication) in the design of resonant mechanical structures. The researchers point out that their "Q-solver" is scale independent and thus can be applied to a wide range of scenarios, from nanoscale devices all the way up to macroscopic systems.

PhysOrg

More information: Phonon-tunnelling dissipation in mechanical resonators, Garrett D. Cole, Ignacio Wilson-Rae, Katharina Werbach, Michael R. Vanner, Markus Aspelmeyer, Nature Communications, 8 March 2011, DOI: 10.1038/ncomms1212

Provided by Technische Universitaet Muenchen

TStzmmalaysia
post Mar 9 2011, 09:03 AM



RESEARCH

Attached Image

Molecular tug-of-war could lead to new materials

Tug-of-war isn't just for play. In the chemistry world, the game could identify a Saran-wrap-like material that instantly heals microscopic tears in its own structure.

Duke scientists are testing this idea using atomic forceps to tug on individual molecules. They’ve already discovered that a slight pull can pop open rare, triangle-shaped molecular structures in milliseconds.

The usual way to open these molecules is to heat them at high temperatures – overnight, said chemist Stephen Craig, who described his research at a colloquium on March 3. With the molecular tug-of-war, Craig foresees a microscopic world where scientists could almost instantly move molecules and atoms to create new materials and even new chemistry.

Craig and his colleagues recently explored how molecule chains, called polymers, can snap back to structures smaller than their original forms. The team also trapped a molecule in the middle of the reaction that made it shrink. Typically that halfway point, called a “transition state,” lasts for less than one millionth of a millionth of a second, but Craig’s team succeeded in “catching lightning in a bottle,” which may be useful in understanding the electronic properties of the transition state.

Quantifying the tug-of-war at the molecular level requires an atomic force microscope. Craig likens the tool to a diving board: when a particularly heavy person, or a tough molecule, is on the end, the board bends way down. By measuring the bend of the microscope's "board," the team can put a number to the force or strength of the molecule being tugged.

The microscope can pull harder and harder on the chain until it breaks, which shows the polymers that can endure the “heaviest diver” or most force. His team can also use the tool to watch if specific molecules change their shapes, such as opening and closing their triangle structures, as the polymer starts to break apart.
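
The diving-board picture translates directly into Hooke's law: force equals cantilever stiffness times deflection. A minimal sketch with illustrative numbers, not the Duke group's actual calibration:

```python
def afm_force_pn(stiffness_n_per_m, deflection_nm):
    """Force in piconewtons: 1 (N/m) * 1 nm = 1e-9 N = 1000 pN."""
    return stiffness_n_per_m * deflection_nm * 1000.0

# A soft 0.05 N/m cantilever deflected by 2 nm registers a 100 pN pull,
# roughly the force scale at which single molecular bonds start to respond.
print(afm_force_pn(0.05, 2.0))
```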

By seeing this new chemistry as it happens, Craig and other scientists could learn how to move atoms and molecules where they need them. The manipulation provides scientists with another way to create new materials for applications from longer-lasting coatings on artificial hips to plastic wrapping that never gets holes.

PhysOrg-chemistry

Provided by Duke University (news : web)


TStzmmalaysia
post Mar 9 2011, 09:04 AM



ENERGY

Attached Image

Molten salts for efficient solar thermal plants

Researchers from Siemens intend to substantially boost the efficiency of solar thermal power plants and thus reduce the costs of this climate-neutral method of power generation. They intend to use mixtures of molten salts as heat transfer media in the High Performance Solar Thermal Power project. In conjunction with partners, scientists from Siemens will construct a pilot plant in Portugal and test the use of molten salt mixtures in parabolic trough power plants.
This type of power plant uses concave parabolic mirrors that focus sunlight on an absorber tube at the mirrors' focus. A heat transfer medium flows along the tube. The heat is transferred to a conventional water-steam cycle in a downstream steam generator, where it is converted into electricity by a steam turbine and a generator. The main factor determining the efficiency of the power generation process is the maximum working temperature of the heat transfer medium. As this temperature increases, the utilization of the steam turbine approaches its optimum value.
Siemens intends to use molten salts instead of thermal oil, thereby increasing the working temperature from 400 to more than 500 degrees Celsius. Eliminating the use of thermal oil would also prove beneficial as it has a relatively high vapor pressure and is highly flammable. Salts suitable for use as heat transfer media consist of, for example, a mixture of sodium and potassium nitrates. These are non-flammable and have almost zero vapor pressure. As a result, the plant can be operated without pressure—and that means more safely. Furthermore, salts have a higher heat storage capacity than thermal oil and are considerably cheaper. The solidification temperature of the salt previously used for this purpose must, however, be reduced from the current temperature of approximately 220 degrees to less than 150 degrees Celsius so that it doesn’t “freeze” overnight. Optimizing the composition of the salt mixture and thus its physical properties is an important goal being pursued by scientists from Siemens Corporate Technology and from Siemens Energy.
The pilot plant will be constructed on the grounds of the University of Evora, Portugal. The solar components, the steam generator system, the pipework system, and the pumps will be adapted to cope with the higher temperatures and the properties of the molten salt mixture. The researchers will use the results gained to plan, and implement verification procedures at, commercial facilities with installed powers in excess of 50 megawatts. The project is being funded by Germany’s Federal Ministry for the Environment. The participants include the German Aerospace Center (DLR) and other companies.
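
The payoff of the higher working temperature can be sanity-checked against the ideal Carnot bound. This is a theoretical upper limit, not Siemens' projected plant efficiency, and the 40-degree condenser temperature below is an assumption:

```python
def carnot_efficiency(t_hot_c, t_cold_c=40.0):
    """Ideal Carnot efficiency from hot/cold temperatures in degrees Celsius."""
    t_hot, t_cold = t_hot_c + 273.15, t_cold_c + 273.15
    return 1.0 - t_cold / t_hot

oil  = carnot_efficiency(400.0)  # thermal-oil working temperature
salt = carnot_efficiency(500.0)  # molten-salt target temperature

# Raising the working temperature from 400 to 500 C lifts the theoretical
# ceiling by about six percentage points; real plants sit well below both.
print(f"{oil:.1%} -> {salt:.1%}")
```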

PhysOrg


TStzmmalaysia
post Mar 10 2011, 12:59 PM



SPACE SCIENCE

Attached Image

The most distant mature galaxy cluster

"We have measured the distance to the most distant mature cluster of galaxies ever found", says the lead author of the study in which the observations from ESO's VLT have been used, Raphael Gobat (CEA, Paris). "The surprising thing is that when we look closely at this galaxy cluster it doesn't look young -- many of the galaxies have settled down and don't resemble the usual star-forming galaxies seen in the early Universe."

Clusters of galaxies are the largest structures in the Universe that are held together by gravity. Astronomers expect these clusters to grow through time and hence that massive clusters would be rare in the early Universe. Although even more distant clusters have been seen, they appear to be young clusters in the process of formation and are not settled mature systems.

The international team of astronomers used the powerful VIMOS and FORS2 instruments on ESO's Very Large Telescope (VLT) to measure the distances to some of the blobs in a curious patch of very faint red objects first observed with the Spitzer space telescope. This grouping, named CL J1449+0856, had all the hallmarks of being a very remote cluster of galaxies. The results showed that we are indeed seeing a galaxy cluster as it was when the Universe was about three billion years old -- less than one quarter of its current age.
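
The "three billion years old" figure can be roughly reproduced from the cluster's redshift (about z = 2) with a standard flat cosmology. H0 = 70 km/s/Mpc and the density parameters below are assumptions for illustration, not values from the paper:

```python
import math

H0 = 70.0 / 3.086e19   # Hubble constant, km/s/Mpc converted to 1/s
GYR = 3.156e16         # seconds per gigayear

def age_gyr(z, steps=100000):
    """Cosmic age at redshift z for a flat universe with Omega_m=0.3, Omega_L=0.7.
    Integrates t(z) = integral_0^{1/(1+z)} da / (a * H(a)) by a simple Riemann sum."""
    a_end = 1.0 / (1.0 + z)
    da = a_end / steps
    total = 0.0
    for i in range(1, steps + 1):
        a = i * da
        # a * H(a) = H0 * sqrt(0.3/a + 0.7*a^2) for this cosmology
        total += da / (H0 * math.sqrt(0.3 / a + 0.7 * a * a))
    return total / GYR

print(round(age_gyr(2.07), 1))  # ~3 Gyr, consistent with "about three billion years old"
```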

Once the team knew the distance to this very rare object they looked carefully at the component galaxies using both the NASA/ESA Hubble Space Telescope and ground-based telescopes, including the VLT. They found evidence suggesting that most of the galaxies in the cluster were not forming stars, but were composed of stars that were already about one billion years old. This makes the cluster a mature object, similar in mass to the Virgo Cluster, the nearest rich galaxy cluster to the Milky Way.

Further evidence that this is a mature cluster comes from observations of X-rays coming from CL J1449+0856 made with ESA's XMM-Newton space observatory. The cluster is giving off X-rays that must be coming from a very hot cloud of tenuous gas filling the space between the galaxies and concentrated towards the centre of the cluster. This is another sign of a mature galaxy cluster, held firmly together by its own gravity, as very young clusters have not had time to trap hot gas in this way.

As Gobat concludes: "These new results support the idea that mature clusters existed when the Universe was less than one quarter of its current age. Such clusters are expected to be very rare according to current theory, and we have been very lucky to spot one. But if further observations find many more then this may mean that our understanding of the early Universe needs to be revised."

EurekAlert
TStzmmalaysia
post Mar 10 2011, 02:52 PM



NANOTECHNOLOGY

Attached Image

Physicists grow micro-machines from carbon

We’ve seen some creative ways of making tiny BYU logos before, like engraving these nano-sized letters in silica and shaping these even smaller letters from DNA strands. But growing a nano-logo? That’s probably a first on campus.

Here is how BYU physics professor Robert Davis and his student Taylor Wood do it: They start by patterning the carbon seeds of the logo onto an iron plate. Next they send heated gas flowing across the surface, and a forest of carbon nano-tubes springs up.

“It’s a really fragile structure at this point – blowing on it or touching it would destroy it,” Davis said. “We developed a process to coat and strengthen the tubes so that we can make microstructures that have practical applications.”

Another student, Jun Song, used the process to make devices that quickly and neatly separate the various chemicals contained in a solution. The technique is detailed by the BYU physicists in a newly published study.

As demonstrated in the paper, their approach using carbon nanotubes is more precise than current chemical separation methods because it gives more control over the channels that the fluids flow through. That’s why the company US Synthetic purchased the commercial rights from BYU.

Designing little logos and separating chemicals isn’t all the BYU researchers are doing, either. They’re also building several kinds of micro-machines including actuators, switches and humidity-detecting cantilevers. Next on their agenda is to create filtration devices.

PhysOrg

TStzmmalaysia
post Mar 10 2011, 02:53 PM



ENERGY

Attached Image

U.S. Department of Energy Announces New Biofuel for the Replacement of Gasoline

The U.S. Department of Energy (DOE) has just announced a breakthrough discovery in the world of biofuels. Led by Energy Secretary Steven Chu, the research team headed up by the Department's BioEnergy Science Center has developed a cost-effective method for converting woody plants straight into isobutanol, which can be used in conventional car engines like gasoline. The new discovery will not only provide a feasible and important alternative to oil, but also has the potential to create a considerable number of new jobs in rural parts of the country.

Non-edible woody plant matter is the focus material for the biofuel endeavor, and scientists have been on the hunt for a cost-effective way to break down the cellulose to obtain the soft innards that could be used for fuel. Scientists have now pinpointed a microbe, Clostridium cellulolyticum, able to process the cellulose. The same microbes have also been proven effective in cleaning up polluted sites, powering fuel cells, and even transforming wastewater into bioplastic. The new super microbe is also able to break down plant matter and produce isobutanol in one relatively inexpensive step, compared with conventional biofuel production, which requires a multi-stage process using various microbes that perform different functions.

In his announcement Chu also pointed out that biofuel production has the potential to create new jobs in rural parts of the country by putting more farmland into production. But it is worth noting that the DOE's new isobutanol process does not necessarily rely on new agriculture: apart from cultivated biofuel crops, the microbes can also process woody waste from other crops, including wheat and rice straw, corn stover, and lumber waste. It is the handling, transporting and refining of the waste that could potentially generate new jobs.

Inhabitat

TStzmmalaysia
post Mar 10 2011, 02:57 PM



APPLIED SCIENCES

Attached Image

"Gastric Pacemaker" Fakes Fullness as You Eat

An implant intended as a less drastic alternative to stomach stapling or stomach bypass surgery for the morbidly obese is now being sold in Europe. The device senses when a person is eating and generates a premature sensation of fullness by stimulating nerves that curl around the stomach.

The implant, developed by Intrapace, a company based in Mountain View, California, is available in Germany, Spain, and the U.K.; the first implantations are scheduled to be carried out later this week. In trials involving 65 patients, the company says, the device led to an average weight loss of 22 percent after one year, with some patients losing as much as 38 percent of their body weight. These results have not yet been published.

The implant, called Abiliti, is also equipped with an accelerometer that shows a physician how much exercise the patient is getting. Data from the device can be uploaded to a computer wirelessly.

Abiliti is about the size of a pacemaker and is designed to be implanted within the abdominal cavity but outside the stomach through minimally invasive laparoscopic surgery. Two leads connect it to the stomach—one for sensing and the other for stimulating. A sensor is passed through the stomach wall; this detects when food enters. A stimulating electrode that's been attached to stretch receptor nerves outside the stomach then sends sensations of fullness to the brain via the vagus nerve.

By detecting when food starts to enter the stomach, the implant is able to stimulate nerves prematurely and simulate sensations of being full before the stomach actually starts to fill, says Chuck Brynelsen, president and CEO of Intrapace.
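
The sense-then-stimulate idea can be sketched as a toy control loop; Intrapace has not published its algorithm, so everything here is illustrative:

```python
def stimulation_plan(intake_events):
    """intake_events: list of booleans, one per time step (True = food sensed
    entering the stomach). Returns the steps at which the stimulator fires."""
    return [t for t, eating in enumerate(intake_events) if eating]

# Food is sensed at steps 2-4, so stimulation happens only then -- unlike
# continuous vagus-nerve stimulation, which risks the habituation problem
# described below.
print(stimulation_plan([False, False, True, True, True, False]))  # [2, 3, 4]
```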

Gastric neurostimulation has been proposed previously as a way of tackling obesity, but the success of such devices so far has been poor. A similar device developed by Transneuronix, a company in New Jersey, also used vagus-nerve stimulation. But after the company was bought by medical device giant Medtronic, trials with hundreds of patients showed no clear benefit, and the technology appears to have been scrapped.

Devices that stimulate the vagus nerve continuously can lead to habituation, in which the nerve adapts to the stimulus and essentially learns to ignore it.

"What's unique about the Abiliti device, and what attracted me to it, is that it's got the capacity for intelligent sensing," says Abeezar Sarela, a bariatric surgeon at the University of Leeds School of Medicine in the U.K., who plans to start offering the device to his patients.

At a cost of around $21,000 for the device and surgery, Abiliti is slightly more expensive than a stomach bypass and nearly twice the price of a gastric band, says Sarela. However, it should have none of the side effects, such as the nausea or vomiting that can be caused by structurally altering the stomach, he says.

The only notable new side effect is a slight risk of infection due to the incision in the stomach, says Sarela. But even so, while Abiliti will be particularly suited to some patients, Sarela doesn't see it replacing stomach stapling or bypasses for the vast majority of cases, at least not for the foreseeable future. "These will remain the workhorses of bariatric surgery," he says.

Technology Review

TStzmmalaysia
post Mar 10 2011, 02:59 PM



APPLIED SCIENCES

Attached Image

Bionic Eye will Enable Blind to See

After more than two decades of research and development, the first retinal prosthesis has received European approval for clinical and commercial use. People blinded by degenerative eye disease will have the option of buying an implant that can restore their vision at least partially.

"It marks the beginning of an era in which sight will be restored at ever more astonishing levels," says Robert Greenberg, president and CEO of Second Sight, the California company that developed the device.

Walter Wrobel, CEO of Retina Implant AG of Reutlingen, Germany, a startup that is carrying out trials of a similar device in several countries, says the approval is an exciting development for hundreds of thousands of people who suffer from diseases like retinitis pigmentosa.

Second Sight's device, the Argus II, will cost around $115,000 and be available only through a small number of clinics in Switzerland, France, and the U.K. The company hopes to receive approval from the U.S. Food and Drug Administration by next year.

With the Argus II system, a camera mounted on a pair of glasses captures images, and corresponding signals are fed wirelessly to a chip implanted near the retina. These signals are sent to an array of implanted electrodes that stimulate retinal cells, producing points of light in the patient's field of view. The process works for people with retinitis pigmentosa because the disease damages only the light-sensing photoreceptors, leaving the remaining retinal cells healthy.

So far, the Argus II can restore only limited vision. "Patients can locate and recognize simple objects, see people in front of them, and follow their movement," says Greenberg. "They can find doors and windows, follow lines, and in the best cases read large print slowly," he says.
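
A rough sketch of why 60 electrodes means coarse vision: the camera frame must be collapsed to one stimulation level per electrode. The 6x10 grid shape and block-averaging scheme here are assumptions for illustration, not Second Sight's actual signal processing:

```python
def to_electrode_grid(image, rows=6, cols=10):
    """image: 2D list of brightness values whose dimensions divide evenly by
    rows/cols. Returns a rows x cols grid of block-averaged brightness."""
    h, w = len(image), len(image[0])
    bh, bw = h // rows, w // cols
    grid = []
    for r in range(rows):
        row = []
        for c in range(cols):
            block = [image[y][x]
                     for y in range(r * bh, (r + 1) * bh)
                     for x in range(c * bw, (c + 1) * bw)]
            row.append(sum(block) / len(block))
        grid.append(row)
    return grid

# A 60x100-pixel frame collapses to just 6x10 stimulation levels:
frame = [[(x + y) % 256 for x in range(100)] for y in range(60)]
grid = to_electrode_grid(frame)
print(len(grid), len(grid[0]))  # 6 10
```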

Getting this device to market is an important achievement, says Eberhart Zrenner, director of the Institute for Ophthalmic Research at the University of Tübingen in Germany and founder of Retina Implant AG. "On the other hand, the type of vision the Argus II can provide with 60 electrodes is quite limited," he says.

Zrenner is developing a device for Retina Implant that has more than 1,500 electrodes and captures images using light-sensitive photodiodes on the chip within the eye, instead of with an external camera. "It has the light-sensitive photodiodes positioned under the retina right at the place of the degenerated photoreceptors and therefore needs no camera outside," he says.

Second Sight is also working on larger arrays. But for now, what distinguishes the Argus II from all other devices is its ability to survive long-term implantation in the human body. The Argus II has been tested in trials involving 30 patients. "We have done something that many people would have thought and did think was impossible," says Greenberg.

Technology Review

TStzmmalaysia
post Mar 10 2011, 03:01 PM



APPLIED SCIENCES

Attached Image

Wearable Sensor Reveals what Overwhelms You

What do you think most stresses you out during the day? A new type of wearable stress sensor, which constantly checks for signs of anxiety, could give you a precise answer. And it might not be what you think.

That's what I found when I tested the Q Sensor, a device made by Affectiva, a company based in Waltham, Massachusetts. It looks like a large digital watch with no readout. A button on its surface lights up in different colors to convey the level of battery charge. Two small silver electrodes on the underside of the device continually send out a low electric current to measure skin conductance. Skin conductance rises along with physiological levels of stress, including both excitement and fear.

Over the last year, the Q Sensor has been snapped up by researchers studying everything from sleep to game design, eating habits, and brand design. Scientists are using it to tailor new treatments for autistic children; others are planning studies to see if information about stress can help treat people with drug addictions or post-traumatic stress disorder. But anyone might benefit from the information the sensor provides. Knowing our daily state of stress could help us understand ourselves and our daily lives better. It might also, perhaps, help us de-stress more effectively.

"We know stress exacerbates medical conditions," says Rosalind Picard, a professor of media arts and sciences at MIT and lead inventor of the Q Sensor. "Stress takes a huge toll on people's health. It's starting to be more biologically understood."

As I wore the Q Sensor throughout the day, I took notes on moments that seemed particularly stressful or relaxing. I assumed a meeting would cause the highest level of stress, and lunch the lowest. At the end of the day, I went to Picard's office at the MIT Media Lab to see my readout. She explained that the raw data can be hard to understand. A peak doesn't necessarily indicate negative stress—it could reflect excitement or an artifact like a hot room. Indeed, there were some artifacts in my data—places where the stress line mysteriously drops suddenly and slowly builds back up. This usually occurs when the sensor is bumped accidentally, Picard explained: when the sensor moves slightly, it comes in contact with dry skin. Because skin conductance goes up with heat as well as stress, an accompanying temperature sensor helps identify artifacts of another kind. And an accelerometer keeps track of the wearer's motion, to indicate, for example, if the person is biking or running.
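
The bump-artifact heuristic Picard describes (a sudden conductance drop is more likely a jolted sensor than real physiology) can be sketched in a few lines; the threshold value is invented for illustration:

```python
def flag_artifacts(samples, max_drop=0.5):
    """samples: skin conductance readings (microsiemens) at a fixed sample rate.
    Returns indices where the signal falls by more than max_drop in one step,
    which physiological skin conductance rarely does."""
    return [i for i in range(1, len(samples))
            if samples[i - 1] - samples[i] > max_drop]

trace = [2.0, 2.1, 2.2, 0.9, 1.0, 1.1, 1.2]  # sensor bumped at index 3
print(flag_artifacts(trace))  # [3]
```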

I wore the sensor on my left, nondominant wrist, which Picard noted moved a lot during the day, probably when I was typing. There were small spikes of stress leading up to the afternoon meeting at which I had to present ideas in front of colleagues, but surprisingly, the largest spikes occurred when I was responding to a bevy of e-mails in the morning. Picard showed me a graph of her own data recording from the day she took her son to an amusement park. Her stress levels were high on the roller coasters—but they were even higher in the morning, when she was getting everyone organized.

Now that I know multitasking can be more stressful to me than a meeting, what can I do with the information? Picard is working on ways for the Q Sensor to give immediate feedback by, for example, transmitting data to a smart-phone app. The device could then issue an alert to serve as a reminder to relax.

Picard says the sensor could also help in more dire situations – for example, helping to prevent drug relapses (researchers have shown that drug cravings trigger peak levels of physiological stress). Picard is setting up a study with post-traumatic stress patients being treated for addiction at a Veterans Affairs rehabilitation center. For the study, phones will read each participant's Q Sensor and respond with psychological surveys and positive messages.

Kevin Laugero, a professor in the department of nutrition at the University of California, Davis, studies the neurophysiology of eating and the ways in which stress can affect decision-making related to food intake. He is using the Q Sensor in combination with other tools to look into whether preschool children are more likely to eat a snack when their stress levels are high. In the past, Laugero and his team had to measure stress by taking samples of saliva and checking the levels of the stress hormone cortisol, a process that yielded intermittent rather than continuous data.

Ross is planning to build an extensive Q Sensor database to learn about patterns in larger groups and predict consumer reactions to different situations. "Our goal is to have the largest database of shopper physiological response of any company in North America by the end of the year," says Ross.

Picard hopes the device could eventually have broad appeal. A lot of people simply don't know or believe they're stressed. "This is technology that can transform people's ability to understand themselves and participate in the process of health and medicine," she says.

Technology Review

TStzmmalaysia
post Mar 10 2011, 03:02 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Gravity's Bias for Left may be 'Written in the Sky'

Is gravity left-handed? An answer could provide a clue to a long-sought theory of quantum gravity - and might be within our grasp by 2013.

General relativity describes gravity's actions at large scales. At tiny scales, however, a theory of quantum gravity incorporating quantum mechanics is needed. But first physicists need to understand gravitons, the hypothetical quantum particles that mediate the gravitational force. These likely come in left- and right-handed varieties: in the former, the particle's spin would be aligned with its direction of motion; in the latter, it would point the opposite way.

General relativity does not distinguish between right and left, so you might expect gravity to be transmitted by both varieties. But the quantum world may play favourites. When it comes to the ghostly particles known as neutrinos, for example, the weak force only interacts with the left-handed variety.

To find out whether gravitons fall into the "ambidextrous" camp of general relativity or exhibit quantum asymmetry much like a neutrino, João Magueijo and Dionigi Benincasa of Imperial College London suggest looking to the cosmic microwave background, relic radiation from the big bang. During inflation, the faster-than-light expansion of the nascent universe, powerful gravitational waves may have rippled through space-time, polarising the CMB's photons in a telltale pattern.

The pair calculate that if gravity depended on just left or right-handed gravitons, that would have skewed the polarisation pattern in an obvious way. What's more, inflation would have stretched these effects to astronomical proportions, making them easily visible to astronomers, write Magueijo and Benincasa in an analysis to appear in Physical Review Letters. The European Space Agency's Planck telescope will image the CMB's polarisation and will release the data in 2013.

A theory called loop quantum gravity, an attempt to unite quantum mechanics and general relativity, already suggests that an asymmetry might be embedded deep into the laws of the universe and that this should render gravity left-handed.

Evidence of left-handed gravitons in the CMB would be "a triple discovery", says Lee Smolin of the Perimeter Institute in Waterloo, Ontario, Canada, who has worked with Magueijo and Benincasa on the subject. "It would confirm inflation, that gravity is quantum mechanical and that there is left-right asymmetry in quantum gravity."

NewScientist

TStzmmalaysia
post Mar 11 2011, 09:16 AM



RESEARCH

Nanotech to boost solid state hydrogen storage

Hydrogen has great potential as a clean fuel source for powering our cars and airplanes, but it also poses some big hurdles – namely production, distribution infrastructure and storage. Storing hydrogen as a gas or liquid onboard a vehicle raises difficulties in terms of volume and pressurization – a hydrogen gas tank for a car would need to be around four times larger than current petroleum tanks. Another possible solution is solid-state hydrogen storage, and the European Aeronautic Defence and Space Company (EADS), together with the University of Glasgow, hopes to advance this approach by developing a new storage system that uses materials modified at the nanoscale to absorb and release hydrogen at a faster rate.

The research collaboration will involve changing the composition and microstructure of the current Hydrisafe hydrogen storage tank. This involves replacing the currently used lanthanum nickel (LaNi5) storage alloy with other hydride materials such as magnesium hydride (MgH2).

It's hoped the research can deliver a storage solution able to feed a fuel cell at the energy densities required of an aeroplane.

"Using new active nanomaterials in combination with novel storage tank design principles presents a hugely exciting opportunity to address the considerable challenges of introducing hydrogen as a fuel for aviation," says Professor Gregory of the University of Glasgow's School of Chemistry.

If the research is successful, EADS plans to fly an unmanned hydrogen-powered test plane in 2014.

"Replacing traditional hydrocarbon-based fuels with pollution-free hydrogen in aeroplane and car engines would deliver huge benefits to the environment because carbon emissions would be dramatically reduced," says Dr.-Ing. Agata Godula-Jopek, Fuel Cells Expert in the EADS Power Generation Team.

Gizmag


TStzmmalaysia
post Mar 11 2011, 09:18 AM



ROBOTICS

New robot system to test 10,000 chemicals for toxicity

Several federal agencies, including the National Institutes of Health, today unveiled a new high-speed robot screening system that will test 10,000 different chemicals for potential toxicity. The system marks the beginning of a new phase of an ongoing collaboration, referred to as Tox21, that is working to protect human health by improving how chemicals are tested in the United States.

The robot system, which is located at the NIH Chemical Genomics Center (NCGC) in Rockville, Md., was purchased as part of the Tox21 collaboration. Tox21 was established in 2008 between the National Institute of Environmental Health Sciences National Toxicology Program (NTP), the National Human Genome Research Institute (NHGRI), and the U.S. Environmental Protection Agency (EPA), with the addition of the U.S. Food and Drug Administration (FDA) in 2010. Tox21 merges existing agency resources (research, funding, and testing tools) to develop ways to more effectively predict how chemicals will affect human health and the environment.

The 10,000 chemicals to be screened by the robot system include compounds found in industrial and consumer products, food additives, and drugs. They were selected through a thorough analysis and prioritization of more than 200 public databases of chemicals and drugs used in the United States and abroad. Testing results will provide information useful for evaluating whether these chemicals have the potential to disrupt human body processes enough to lead to adverse health effects.



"Tox21 has used robots to screen chemicals since 2008, but this new robotic system is dedicated to screening a much larger compound library," said NHGRI Director Eric Green, M.D., Ph.D. The director of the NCGC at NHGRI, Christopher Austin, M.D., added, "The Tox21 collaboration will transform our understanding of toxicology with the ability to test in a day what would take one year for a person to do by hand."

"The addition of this new robot system will allow the National Toxicology Program to advance its mission of testing chemicals smarter, better, and faster," said Linda Birnbaum, Ph.D., NIEHS and NTP director. "We will be able to more quickly provide information about potentially dangerous substances to health and regulatory decision makers, and others, so they can make informed decisions to protect public health."

Tox21 has already screened more than 2,500 chemicals for potential toxicity, using robots and other innovative chemical screening technologies.

"Understanding the molecular basis of hazard is fundamental to the protection of human health and the environment," said Paul Anastas, Ph.D., assistant administrator of the EPA Office of Research and Development. "Tox21 allows us to obtain deeper understanding and more powerful insights, faster than ever before."

PhysOrg


TStzmmalaysia
post Mar 11 2011, 09:19 AM



RESEARCH

The more secure you feel, the less you value your stuff, UNH research shows

People who feel more secure in receiving love and acceptance from others place less monetary value on their possessions, according to new research from the University of New Hampshire.

The research was conducted by Edward Lemay, assistant professor of psychology at UNH, and colleagues at Yale University. The research is presented in the Journal of Experimental Social Psychology in the article "Heightened interpersonal security diminishes the monetary value of possessions."

Lemay and his colleagues found that people who had heightened feelings of interpersonal security - a sense of being loved and accepted by others - placed a lower monetary value on their possessions than people who did not.

In their experiments, the researchers measured how much people valued specific items, such as a blanket and a pen. In some instances, people who did not feel secure placed a value on an item that was five times greater than the value placed on the same item by more secure people.

"People value possessions, in part, because they afford a sense of protection, insurance, and comfort," Lemay says. "But what we found was that if people already have a feeling of being loved and accepted by others, which also can provide a sense of protection, insurance, and comfort, those possessions decrease in value."

The researchers theorize that the study results could be used to help people with hoarding disorders.

"These findings seem particularly relevant to understanding why people may hang onto goods that are no longer useful. They also may be relevant to understanding why family members often fight over items from estates that they feel are rightfully theirs and to which they are already attached. Inherited items may be especially valued because the associated death threatens a person's sense of personal security," Lemay says.

SOTT

TStzmmalaysia
post Mar 11 2011, 09:24 AM



ENERGY

Hydrogen-Producing Skyscraper Harvests Energy From Bolts of Lightning

Hydrogen power is an exciting alternative energy source because it burns clean and emits only water vapor and heat -- however the tech is crippled by the fact that it takes a lot of energy to produce hydrogen fuel. This eye-popping Hydra Tower aims to solve the hydrogen conundrum in the most logical (and awesome) way possible -- by harnessing bolts of lightning to split water molecules into hydrogen and oxygen. The spire's sinuous exoskeleton is made from graphene, a carbon super-material that is 200 times stronger than steel and highly conductive to heat and electricity - the better to channel incredible amounts of energy straight from the sky.

When lightning strikes, the spire's super-conductive graphene skin channels electricity into a massive array of batteries in the tower's base. This energy is then used to split water into hydrogen gas through electrolysis. The tower's twisting form was inspired by the Hydra, a simple freshwater animal. The project also includes a research facility, housing, and recreational areas for scientists and families – which we assume are a pleasure to use when the skyscraper isn't being blasted with a billion volts of electricity.

Part lightning spire and part futuristic super-tower, the Hydra Skyscraper was designed by Milos Vlastic, Vuk Djordjevic, Ana Lazovic, and Milica Stankovic, and was an honorable mention in the 2011 eVolo Skyscraper Competition. It's meant to be implemented in the tropics, where 70% of all lightning occurs – this includes areas like Singapore, Central Florida, Venezuela, and Kifuka in the Democratic Republic of the Congo.
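Setting aside the architecture, a back-of-envelope calculation hints at the scale involved. The figures below – roughly 5 GJ delivered by a large cloud-to-ground strike and about 50 kWh of electricity per kilogram of hydrogen from practical electrolysis – are commonly cited approximations, not numbers from the Hydra proposal:

```python
# Back-of-envelope: hydrogen yield from one captured lightning strike.
# All figures are rough textbook approximations, not the Hydra design's numbers.

STRIKE_ENERGY_J = 5e9    # ~5 GJ delivered by a large cloud-to-ground strike
KWH_PER_KG_H2 = 50.0     # practical electrolysis energy per kg of hydrogen
J_PER_KWH = 3.6e6

def h2_per_strike(capture_efficiency=0.5):
    """Kilograms of hydrogen producible from one captured strike."""
    usable_kwh = STRIKE_ENERGY_J * capture_efficiency / J_PER_KWH
    return usable_kwh / KWH_PER_KG_H2

print(round(h2_per_strike(), 1))  # → 13.9 kg of H2 per strike at 50% capture
```

Even with generous assumptions, one strike yields only on the order of ten-odd kilograms of hydrogen – one reason the concept targets the planet's most lightning-dense regions.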

Inhabitat


TStzmmalaysia
post Mar 11 2011, 09:25 AM



ENERGY

Iceland Harnesses Geothermal Sources Able to Power Over 1.2 Million Homes

Iceland may have fallen in the hole when the global recession hit, but the country could be soon emerging with one of the world’s largest energy sources. Iceland’s biggest energy company, Landsvirkjun, is planning on constructing the world’s longest underwater electric cable so that the country can sell its vast geothermal and volcanic energy to the rest of Europe. The sub-sea cable, if built, will have the potential to deliver as many as five terawatt-hours (5 billion kilowatt-hours) annually to the continent – this would be enough to power 1.25 million homes with clean energy.
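The article's headline numbers are easy to sanity-check; the per-home figure in this sketch is derived from the quoted totals, not stated independently:

```python
# Sanity check: the cable's quoted annual capacity vs. homes powered.
ANNUAL_TWH = 5.0      # quoted annual delivery over the sub-sea cable
HOMES = 1.25e6        # quoted number of homes powered

kwh_per_home = ANNUAL_TWH * 1e9 / HOMES   # 1 TWh = 1e9 kWh
print(kwh_per_home)  # → 4000.0 kWh per home per year
```

Around 4,000 kWh per household per year is a plausible European consumption figure, so the two quoted numbers are at least consistent with each other.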

It appears that Iceland could soon become the world's next big powerhouse – literally!

The news comes on the heels of a discovery made by scientists at the University of California, Davis and a team that was initially drilling a well as part of the Icelandic Deep Drilling Project. The team was originally searching for a source of hot water under high pressure deep underground that it hoped to use as a source of energy. Instead, it discovered a rich seam of molten rock relatively close to the earth's surface, and scientists believe this magma could be a new source of clean energy that could easily be harnessed for widespread use.

While the discovery was actually made in 2009, it has taken years to test the magma. Today, researchers believe that its steam could generate up to 25 MW of energy, enough to power up to 30,000 homes. The accidental discovery also suggests that any place with young volcanic rocks, anywhere in the world, may harbor easy-to-find, reasonably shallow bodies of magma that could be used as energy sources.

The proposed Icelandic cable could extend as long as 1,180 miles, depending on its destination. The company is currently considering connections to some of the largest cities in Britain, Norway, Holland, and Germany. At present, Landsvirkjun produces about 75 percent of Iceland's electricity from geothermal sources.

Inhabitat

TStzmmalaysia
post Mar 11 2011, 09:26 AM



RESEARCH

Ultrafast laser 'scribing' technique to cut cost, hike efficiency of solar cells

Researchers are developing a technology that aims to help make solar cells more affordable and efficient by using a new manufacturing method that employs an ultrafast pulsing laser.

The innovation may help to overcome two major obstacles that hinder widespread adoption of solar cells: the need to reduce manufacturing costs and increase the efficiency of converting sunlight into an electric current, said Yung Shin, a professor of mechanical engineering and director of Purdue University's Center for Laser-Based Manufacturing.

Critical to both are tiny "microchannels" needed to interconnect a series of solar panels into an array capable of generating useable amounts of power, he said. Conventional "scribing" methods, which create the channels mechanically with a stylus, are slow and expensive and produce imperfect channels, impeding solar cells' performance.

"Production costs of solar cells have been greatly reduced by making them out of thin films instead of wafers, but it is difficult to create high-quality microchannels in these thin films," Shin said. "The mechanical scribing methods in commercial use do not create high-quality, well-defined channels. Although laser scribing has been studied extensively, until now we haven't been able to precisely control lasers to accurately create the microchannels to the exacting specifications required."

The researchers hope to increase efficiency while cutting cost significantly using an "ultrashort pulse laser" to create the microchannels in thin-film solar cells, he said.

The work, funded with a three-year, $425,000 grant from the National Science Foundation, is led by Shin and Gary Cheng, an associate professor of industrial engineering. A research paper demonstrating the feasibility of the technique was published in Proceedings of the 2011 NSF Engineering Research and Innovation Conference in January. The paper was written by Shin, Cheng, and graduate students Wenqian Hu, Martin Yi Zhang and Seunghyun Lee.

"The efficiency of solar cells depends largely on how accurate your scribing of microchannels is," Shin said. "If they are made as accurately as possible, efficiency goes up."

Research results have shown that the fast-pulsing laser accurately formed microchannels with precise depths and sharp boundaries. The laser pulses last only a matter of picoseconds, or trillionths of a second. Because the pulses are so fleeting, the laser does not cause heat damage to the thin film, removing material in precise patterns in a process called "cold ablation."

"It creates very clean microchannels on the surface of each layer," Shin said. "You can do this at very high speed, meters per second, which is not possible with a mechanical scribe. This is very tricky because the laser must be precisely controlled so that it penetrates only one layer of the thin film at a time, and the layers are extremely thin. You can do that with this kind of laser because you have a very precise control of the depth, to about 10 to 20 nanometers."
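One way to see why such short pulses avoid heat damage is the thermal diffusion length, roughly sqrt(D·t) – the distance heat can spread during a pulse of duration t. The diffusivity used below is an order-of-magnitude illustration, not a property of any specific thin-film material:

```python
# Thermal diffusion length L ~ sqrt(D * t) for a single laser pulse.
# D here is an illustrative order-of-magnitude value, not a real film property.
import math

def diffusion_length_nm(pulse_s, diffusivity_m2s=1e-4):
    """Approximate distance heat spreads during one laser pulse, in nanometres."""
    return math.sqrt(diffusivity_m2s * pulse_s) * 1e9

print(round(diffusion_length_nm(10e-12)))  # → 32 nm for a 10 ps pulse
print(round(diffusion_length_nm(10e-9)))   # → 1000 nm for a 10 ns pulse
```

A 10 ps pulse confines heating to tens of nanometres – the same scale as the depth control Shin describes – whereas a nanosecond pulse would let heat spread across an entire film layer.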

Traditional solar cells are usually flat and rigid, but emerging thin-film solar cells are flexible, allowing them to be used as rooftop shingles and tiles, building facades, or the glazing for skylights. Thin-film solar cells account for about 20 percent of the photovoltaic market globally in terms of watts generated and are expected to account for 31 percent by 2013.

The researchers plan to establish the scientific basis for the laser-ablation technique by the end of the three-year period. The work is funded through NSF's Civil, Mechanical and Manufacturing Innovation division.

EurekAlert


TStzmmalaysia
post Mar 11 2011, 09:34 AM



APPLIED SCIENCES

Disposable endoscopic camera is the size of a grain of salt

German engineers have developed a low-cost disposable endoscopic camera the size of a coarse grain of salt.

Tiny video cameras mounted on the end of long thin fiber optic cables, commonly known as endoscopes, have proven invaluable to doctors and researchers wishing to peer inside the human body. Endoscopes can be rather pricey, however, and like anything else that gets put inside people's bodies, they need to be sanitized after each use. A newly-developed type of endoscope is claimed to address those drawbacks by being so inexpensive to produce that it can be thrown away after each use. Not only that, but it also features what is likely the world's smallest complete video camera, which is just one cubic millimeter in size.

The prototype endoscope was designed at Germany's Fraunhofer Institute for Reliability and Microintegration, in collaboration with Awaiba GmbH and the Fraunhofer Institute for Applied Optics and Precision Engineering.

Ordinarily, digital video cameras consist of a lens, a sensor, and electrical contacts that relay the data from the sensor. Up to 28,000 sensors are cut out from a silicon disc known as a wafer, after which each one must be individually wired up with contacts and mounted to a lens.

In Fraunhofer's system, contacts are added to one side of the sensor wafer while it's still all in one piece. That wafer can then be joined face-to-face with a lens wafer, after which complete grain-of-salt-sized cameras can be cut out from the two joined wafers. Not only is this approach reportedly much more cost-effective, but it also allows the cameras to be smaller and more self-contained – usually, endoscopic cameras consist of a lens at one end of the cable, with a sensor at the other.

The new camera has a resolution of 62,500 pixels, and it transmits its images via an electrical cable, as opposed to an optical fiber. Its creators believe it could be used not only in medicine, but also in fields such as automotive design, where it could act as an aerodynamic replacement for side mirrors, or be used to monitor drivers for signs of fatigue.

Gizmag

TStzmmalaysia
post Mar 11 2011, 09:35 AM



APPLIED SCIENCES

Massive Living Mountain Skyscraper Transforms the Desert Into a Habitable Environment – by Bridgette Meinhold

Desertification is a serious issue we're going to have to contend with as climate change turns our fertile land into dry, sandy, windswept dunes. In order to adapt, we will need to use every drop of knowledge and experience in sustainable living to coax water from the desert in hopes of restoring the land.

Anna-Maria Simatou and Marianthe Dendrou of Greece have proposed the Living Mountain, a skyscraper city located in the Taklamakan Desert in northwestern China, which extracts water from the region and smartly utilizes it to create a microclimate and, eventually, a new landscape to beat back the desert. Simatou and Dendrou's Living Mountain just received an honorable mention in the 2011 eVolo Skyscraper Competition.

The high-rise conglomerate is built from local materials and stands on a reinforced concrete pier foundation to withstand the shifting sands. Steel and reinforced concrete will be used to construct the main superstructure, which is covered in a translucent material, much like a greenhouse. Inside the structure are prefabricated living pods, roughly 2,000 square feet each, also constructed of a light translucent material. Residents of the Living Mountain have easy access to all of the facilities via elevator.

Rainwater is collected from the top of the structure and cascades to a central atrium all while filtering pollutants and encouraging growth of indoor vegetation. A lake is constructed underneath the living mountain to store excess water and help restore the landscape into a habitable environment. Eventually, more Living Mountains can be constructed nearby and further help with vegetation growth. These living mountains would be connected together via cable cars.

Inhabitat

TStzmmalaysia
post Mar 11 2011, 09:37 AM



APPLIED SCIENCES

Underwater Skyscrapers Are Like Moss Covered Icebergs That Recycle Waste From Great Pacific Garbage Patch

We know that the plastic waste in our oceans is heinous. The Great Pacific Garbage Patch is a monstrosity. First of course we need to stop polluting our waters, and secondly, we need a way to clean up our ugly mess.

As part of the eVolo Skyscraper competition, Milorad Vidojević, Jelena Pucarević and Milica Pihler of Serbia have proposed a giant underwater skyscraper that would collect and recycle the waste into a source of energy.

The underwater skyscraper was inspired by the Eiffel Tower, flipped upside down in the water. The lower portion of the tower collects and stores plastic waste until it can be recycled in the middle section. Above the recycling facilities are offices, and above those, residential space that sticks out of the water.

Ballasts take in or release water to control the buoyancy of the tower depending on how much waste has been collected. Meanwhile, the recycled trash is processed to create fuel to run the facility or for use elsewhere. The waste will be heated in the recycling chamber and converted into a gas that will be stored in massive battery-like structures.

The Lady Landfill Skyscraper took home an honorable mention in the 2011 architectural competition. We love the combination of recycling and futuristic architecture, all located under the sea. We're especially fond of the lush vegetation draping off the part that sticks out of the water, like a moss-covered iceberg.

Inhabitat

TStzmmalaysia
post Mar 11 2011, 09:39 AM



TRANSPORTATION

Like a Swiss Army Knife you can ride: The Voltitude folding electric bike

This bicycle has turned its back on the traditional concept of a bike.

Users are said to be able to fold or unfold the Voltitude bike in about one second, and with one hand, thanks to its unique EasyFold system. Swiss and EU legislation limits the electric assist to 15.5 mph (25 kph), although some frantic footwork could see it achieve faster speeds if required, and the onboard battery is good for between 12 and 25 miles (20 to 40 km) between charges.

Looking at the Voltitude bike, you can't help thinking of a Swiss Army knife, and as a last-mile transport solution it could prove just as useful. Unlike the YikeBike, it's not designed to be lifted and carried. Once the bike is folded, a special button on the handlebar activates the motor to trundle it along at walking pace – which is probably just as well, as the bike can weigh up to 48 pounds (22 kg) before a battery even enters the equation.

Its 43-inch (1087 mm) wheelbase and seat height/handlebar position make it comparable to a standard bicycle, but you won't find many bikes with the wide scooter wheels and low center of gravity. Onboard sensors determine how much electric assist the rider gets from the rear wheel 250W electric motor – the more you give to the pedals, the more power will be added to the wheel. The onboard 9.5Ah/36v lithium-polymer battery, which is said to take just four hours to fully recharge, also provides juice for the integrated front and rear LED lighting.
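The pedal-assist behaviour described above amounts to a simple control rule. In this sketch the proportional gain and the shape of the cutoff are illustrative assumptions; the 250 W motor rating and the 25 km/h (15.5 mph) limit are the figures the article cites:

```python
# Illustrative proportional pedal-assist rule with the EU/Swiss 25 km/h cutoff.
# Gain and cutoff shape are assumptions; 250 W and 25 km/h come from the article.

MOTOR_MAX_W = 250.0
CUTOFF_KPH = 25.0

def assist_power(rider_power_w, speed_kph, gain=1.0):
    """Motor watts added for a given rider input and current speed."""
    if speed_kph >= CUTOFF_KPH:
        return 0.0                    # legal limit: no assist above 25 km/h
    assist = rider_power_w * gain     # more pedal effort -> more assist
    return min(assist, MOTOR_MAX_W)   # never exceed the 250 W motor rating

print(assist_power(120, 18))   # → 120.0 W of assist while cruising
print(assist_power(400, 18))   # → 250.0 W: capped at the motor rating
print(assist_power(120, 26))   # → 0.0 W: above the legal cutoff
```

A production controller would ramp the assist down smoothly near the cutoff rather than cutting it abruptly, but the give-more-get-more principle is the same.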

The Voltitude bike benefits from a strong-but-lightweight aluminum frame and wheels and hydraulic disk brakes, and can be equipped with front or rear luggage racks. Other options include sequential 3-, 5- or 8-speed gearboxes.

Although this may vary from country to country, the Voltitude bike is considered a normal bicycle in the EU and Switzerland, so there's no need for a driving license or registration plate. A helmet is also not a legal requirement, but is recommended.

The creation of Eric and André-Marcel Collombin was shown off recently at the Geneva Motor Show, and is currently being manufactured in small numbers for a limited number of Swiss customers. International pre-orders for the Voltitude are now being accepted, but final specs and price are yet to be announced. The first non-Swiss shipments will likely be made towards the end of 2011.

Gizmag

TStzmmalaysia
post Mar 11 2011, 09:41 AM



ROBOTICS

x-Ar exoskeleton takes the weight off your arm

Exoskeletons are mechanical systems that human users wear over their bodies, to augment their own physical abilities. While exoskeletons are already available and in use today, they're sometimes a bit more machine than what is needed.

After all, why put on an expensive full- or half-body contraption, when you're performing a task that mostly just requires the use of one arm? That's where the x-Ar exoskeletal arm support comes in. Users wear it on their dominant arm, and it moves with them, providing support as they do things such as holding tools out in front of themselves.

The x-Ar is the latest product from Equipois, which has already been selling a little something called the zeroG. Also reminiscent of Aliens technology, it's a Steadicam-inspired device that can either be worn via a back brace-like arrangement, or mounted on a work stand, wall track, or other apparatus beside the worker. Just like a Steadicam allows a movie camera to seemingly float weightlessly beside its operator, using nothing but hinges, springs and counterweights, the zeroG (pictured below) does the same thing for whatever tool is mounted on it.

Unlike the zeroG, which doesn't connect with the user's arms at all, the x-Ar attaches to the wrist of the dominant arm via a cuff – users still hold whatever tool they're using in their hand, and it will fall to the floor if they let go. Even for people whose jobs don't involve heavy tools, the x-Ar could still be of great assistance if they are simply required to hold an arm outstretched for long periods.

Equipois CEO Eric Golden has been quoted as saying that besides reducing workplace injuries, the device could also find use in assisting paralyzed people in moving their arms, and that it could eventually even be used with a brain-machine interface.

For a lower-tech gizmo that offers some of the same features, also check out the Portable Support Tool Balancer.

Gizmag

Fast Company

TStzmmalaysia
post Mar 12 2011, 10:49 PM



BIOTECHNOLOGY

Nanoscale whiskers from sea creatures could grow human muscle tissue

Minute whiskers of nanoscale dimensions taken from sea creatures could hold the key to creating working human muscle tissue, University of Manchester researchers have discovered. Scientists have found that cellulose from tunicates, commonly known as sea squirts, can influence the behaviour of skeletal muscle cells in the laboratory. These nanostructures are several thousand times smaller than muscle cells and are the smallest physical feature found to cause cell alignment.

Alignment is important since a lot of tissue in the body, including muscle, contains aligned fibres which give it strength and stiffness. Cellulose is a polysaccharide – a long chain of sugars joined together – usually found in plants and is the main component of paper and certain textiles such as cotton. It is already being used for a number of different medical applications, including wound dressings, but this is the first time it has been proposed for creating skeletal muscle tissue. Tunicates grow on rocks and man-made structures in coastal waters around the world.

Cellulose extracted from tunicates is particularly well suited to making muscle tissue due to its unique properties. University of Manchester academics Dr Stephen Eichhorn and Dr Julie Gough, working with PhD student James Dugan, chemically extract the cellulose in the form of nanowhiskers. One nanometre is one billionth of a metre, and these minute whiskers are only tens of nanometres wide – far thinner than a human hair. When aligned parallel to each other, they cause rapid muscle cell alignment and fusion.

The method is both simple and relatively quick, which could lead to doctors and scientists having the ability to create the normal aligned architecture of skeletal muscle tissue. This tissue could be used to help repair existing muscle or even grow muscle from scratch. Creating artificial tissue which can be used to replace damaged or diseased human muscles could revolutionise healthcare, and be of huge benefit to millions of people all over the world.

Dr Eichhorn thinks the cellulose extracted from the creatures could lead to a significant medical advancement. He added: "Although it is quite a detailed chemical process, the potential applications are very interesting. "Cellulose is being looked at very closely around the world because of its unique properties, and because it is a renewable resource, but this is the first time that it has been used for skeletal muscle tissue engineering applications. "There is potential for muscle precision engineering, but also for other architecturally aligned structures such as ligaments and nerves."

PhD student James Dugan has become the first UK student to win the American Chemical Society's Cellulose and Renewable Material Division award for his work on nanowhiskers.

EurekAlert

TStzmmalaysia
post Mar 12 2011, 10:55 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


APPLIED SCIENCES

Attached Image

"Repair-Ware" Household Gadgets Designed To Last Forever With Easy Fixability

Please behold this excellent concept for small household appliances by designer Samuel James Davies. The idea is to have a small appliance that can be easily taken apart and repaired by the user; in fact, the entire premise of the design right down to the visual appeal is to promote a culture of repair. The overall appearance is one of accessibility and takes away the high-tech intimidation factor.

This steam iron is part of a larger project called Repair-Ware, which is intended to promote repairability among household gadgets.

Yanko Design pointed us to the design by Davies, who writes, "The brand and range of appliances aim to create a culture of repair amongst their users. This brings together not only the manufacturer's knowledge of the product but also that of the user. This culture is created through a website and forum which allow the user to share knowledge, learn, buy new parts and ultimately carry out their own repairs."

In other words, something along the lines of the iFixit community, only with the manufacturers' input and assistance as well.

Davies states that the products are designed to be fully and easily taken apart and put together intuitively -- completely taking out the scary part of wanting to fix something but being afraid you'll just break it more. Plus, they just look awesome, like a mix of old-fashioned appliance and quirky Tonka toy. And that cloth cord just brings flashbacks of my grandmother's iron (and telephone and table lamp and...).

"Aesthetically they take cues from older products, hinting towards a time when products were built to last and trying to reduce throwawayism. Whilst using a traditional set of materials to create a feeling of longevity, they do so with more contemporary forms, allowing a fresh product that won't easily succumb to fashion."

The concept of repairing worn-out products yourself does not come naturally in today's system. Maintenance and repair are good moneymakers for producers, and if a product is 'beyond repair' they are happy to sell you a new cell phone, computer, washing machine, etc. This mentality creates an endless stream of waste, pollution and inefficiency.

However, new technologies such as these, along with 3D printing, will enable people to produce and repair products themselves. This approach will tremendously reduce waste, pollution and obsolescence.

Inhabitat

TStzmmalaysia
post Mar 12 2011, 11:06 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

New method improves modeling of electrons' motions in complex molecules

David Mazziotti has significantly improved a quantum computational method that he introduced in 2004 for efficiently modeling the electrons in atoms and molecules.

Although in principle quantum mechanics can describe the properties of molecules and materials in which the electrons’ motions are strongly correlated, in practice such computations are formidable. Molecules can have from 10 to hundreds or thousands of electrons, and the computational cost of modeling molecules increases exponentially with the number of strongly correlated electrons.

Mazziotti, an associate professor in chemistry at the University of Chicago, has been developing a new approach in which any molecule’s energies and properties can be computed as a function of just two of the molecule’s many electrons. Such a strategy provides accurate approximations for strongly correlated electrons without an exponential computational scaling. In the Feb. 25 issue of Physical Review Letters, Mazziotti announced a newly improved method that is at least 10 to 20 times faster than previous methods.

Mazziotti’s original approach already has been applied to studies of aromatic rings, which are employed in computer displays, and of the energy-transfer process that enables fireflies to glow in the dark.

“The present advance will enable treatment of larger molecules and materials with strongly correlated electrons,” he said.

In the Physical Review Letters article, Mazziotti applied this method to the metal-insulator transition of metallic hydrogen, which forms under the intense pressure found at the cores of Jupiter and Saturn. Computing the electronic properties of a dissociating chain of 50 hydrogen atoms during this transition would require 10 octillion (10^28) variables using traditional quantum methods, while the world's largest supercomputers can treat approximately a billion (10^9) variables. The two-electron approach, however, requires only 9.4 million variables and 3.9 million constraints.
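The gap between those variable counts comes from exponential versus polynomial scaling. A minimal sketch, assuming (as a simplification) that a full wavefunction treatment needs on the order of 2^n amplitudes for n orbitals while a two-electron reduced-density-matrix treatment needs on the order of n^4 variables; both formulas are illustrative stand-ins, not Mazziotti's actual counts:

```python
# Toy comparison of exponential vs. polynomial scaling in system size.
# The formulas are simplified stand-ins for the real electronic-structure
# variable counts described in the article.

def full_wavefunction_variables(n_orbitals):
    # A full wavefunction is taken to need ~2**n amplitudes (assumption).
    return 2 ** n_orbitals

def two_electron_rdm_variables(n_orbitals):
    # A two-electron reduced density matrix has ~n**4 elements (assumption).
    return n_orbitals ** 4

for n in (10, 25, 50):
    print(n, full_wavefunction_variables(n), two_electron_rdm_variables(n))
```

Even with these crude stand-ins, the exponential count overtakes the polynomial one long before 50 atoms, which is why the two-electron formulation fits on existing computers.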

The algorithm in Mazziotti’s method is a member of a special family of algorithms known to mathematicians as semidefinite programming. The advance in the Physical Review Letters article also has applications in engineering, computer science, statistics, finance, and economics.

“Remarkably, behind seemingly unrelated phenomena, there lies a common mathematical thread,” Mazziotti said.

In Mazziotti’s method, the energy of a molecule with many electrons is minimized as a function of two electrons, which are constrained to represent all of the electrons.

“In the same fashion, in finance, one might be optimizing profit over a set that is constrained to represent a certain amount of money or a given inventory of products,” he explained. “Both problems require a search — or optimization —of a quantity subject to real-world constraints. In finance these constraints will follow from the laws of business while in chemistry they will follow from the laws of quantum mechanics.”

PhysOrg

More information: “Large-Scale Semidefinite Programming for Many-Electron Quantum Mechanics,” David A. Mazziotti, Physical Review Letters, Vol. 108, No. 8, Feb. 25, 2011.

Provided by University of Chicago




TStzmmalaysia
post Mar 12 2011, 11:11 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


ROBOTICS

Attached Image

UC San Francisco hospital integrating robotic pharmacy

The University of California at San Francisco Medical Center is now starting to use robots, not humans, to dispense medication from its hospital pharmacy. While robots are often brought into workplaces as a cost-cutting measure, UCSF claims that in this case, it's to minimize the chances of patients receiving the wrong medication. So far, it seems to be working out well – out of 350,000 doses of oral and injectable medication prepared to date, not a single error has occurred.

Utilizing Swisslog's PillPick system, bulk batches of pills are separated out into individual doses, bagged and stored. UCSF physicians electronically send orders into the system, which then proceeds to pick and dispense the appropriate pills. All the bagged doses of all the pills that a patient will need within a 12-hour period are strung together on a plastic ring, and bar-coded. There are plans for nurses to use bar code readers to confirm that the right medication ends up going to the right patients.
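The planned bar-code confirmation step boils down to a simple match between the patient's wristband and the ID encoded on the medication ring. A hypothetical sketch; the bar-code layout and field names are assumptions for illustration, not Swisslog's actual format:

```python
# Hypothetical dose-verification check: scan the patient's wristband and the
# medication ring, and give the dose only if the encoded patient IDs match.

def verify_dose(wristband_patient_id, ring_barcode):
    # Assume the ring bar code encodes "patient_id:medication:dose".
    patient_id, medication, dose = ring_barcode.split(":")
    return patient_id == wristband_patient_id

# Matching IDs pass; mismatched IDs are flagged before administration.
ok = verify_dose("P12345", "P12345:amoxicillin:500mg")
bad = verify_dose("P12345", "P99999:amoxicillin:500mg")
```

The point of the design is that the final safety check is mechanical and binary, so a tired human at the bedside cannot silently skip it.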

An automated inventory system keeps track of how much medication is in stock.

Three RIVA robots, made by Intelligent Hospital Systems, are able to dispense doses of liquid medication, such as IV syringes or bags.

All of the robots work within a secure, sterile environment, which is said to greatly reduce the chances of medication getting contaminated. Because human pharmacists don't handle the drugs themselves, there is also less risk of them being exposed to toxic drugs, such as those used for chemotherapy.

Hopefully the robots won't be putting any pharmacists out of work, but will instead allow them to put their training to better use. "Automated medication dispensing frees pharmacists from the mechanical aspects of the practice," said Mary Anne Koda-Kimble, dean of the UCSF School of Pharmacy. "This technology, with others, will allow pharmacists to use their pharmaceutical care expertise to assure that patients are treated with medicines tailored to their individual needs."

The phase-in period for the system began in October 2010, and will continue until next year.

Gizmag

TStzmmalaysia
post Mar 12 2011, 11:17 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

New Software can Simulate Future Facial Changes

A Concordia graduate student has designed a promising computer program that could serve as a new tool in missing-child investigations. Khoa Luu has developed a more effective computer-based technique to age photographic images of people's faces – an advance that could help to identify missing kids.

"Research into computer-based age estimation and face aging is a relatively young field," says Luu, a PhD candidate from Concordia's Department of Computer Science and Software Engineering whose master's thesis explores new and highly effective ways to estimate age and predict future appearance. His work is being supervised by professors Tien Dai Bui and Ching Suen.


Best recorded technique

"We pioneered a novel technique that combines two previous approaches, known as active appearance models (AAMs) and support vector regression (SVR)," says Luu. "This combination dramatically improves the accuracy of age estimation. In tests, our method achieved the most promising results of any published approach."

Most face-aged images are currently rendered by forensic artists. Although these artists are trained in the anatomy and geometry of faces, they rely on art rather than science. As a result, predicted faces drawn by artists can differ widely.



Face changes at different stages

"Our approach to computerized face aging relies on combining existing techniques," says Luu. "The human face changes in different ways at different stages of life. During the growth and development stage, the physical structure of the face changes, becoming longer and wider; in the adult aging phase, the primary changes to the face are in soft tissue. Wrinkles and lines form, and muscles begin to lose their tone."

All this information has to be incorporated into the computer algorithm. Since there are two periods with fundamentally different aging mechanisms, Luu had to construct two different 'aging functions' for this project.

To develop his face aging technique, Luu first used a combination of AAMs and SVR methods to interpret faces and "teach" the computer aging rules. Then, he input information from a database of facial characteristics of siblings and parents taken over an extended period. Using this data, the computer then predicts an individual's facial appearance at a future period.

EurekAlert

TStzmmalaysia
post Mar 12 2011, 11:20 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

Smart materials for high-tech products

The droning of a car driving along the highway can be nerve-racking. Often, a driver cannot understand the passengers in the rear seat, not to mention the pianissimo on the car stereo. Actually, though, there are ways to drive this disruptive vibration out of the car.

This is possible thanks to "smart materials" – intelligent materials that can adapt their own condition to changing situations at extremely high speed. The possible applications are diverse and promising – not just for carmakers but also for mechanical engineering and the electronics industry. This is why 11 Fraunhofer Institutes have joined forces in the "Adaptronics Alliance" to create new, "smart" solutions.


Piezoceramic bearings to counteract car noise

Vibrations inside a moving car are just one example among many. Researchers use piezoceramics, a material that transforms electrical energy into motion and, conversely, dampens vibrations by converting them into electrical energy. They are currently using an upmarket passenger car to test piezoceramic bearings attached between the chassis and a metal frame positioned atop it. Normally, rubber components are used for this purpose, but they are not ideal absorbers of annoying vibroacoustics; as a result, vibrations are audible in the car as noise. The piezo bearings, on the other hand, are electromechanical energy transducers, electronically controlled to counteract and neutralize these bothersome vibrations. The result is a quiet ride.

In another project, researchers are taking the opposite approach: they are developing piezo components that convert the oscillations in a structure – such as a high-traffic bridge – into electrical energy. This energy can be used to supply tiny, energy-autonomous sensors that monitor the condition of the bridge and notify a control center of any damage.
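The cancellation principle behind the piezo bearings is that the actuator emits a counter-vibration in antiphase, so the two signals sum to (nearly) zero. A minimal sketch with made-up frequency and amplitude; a real adaptronic controller would measure the vibration and adapt its output continuously:

```python
import math

# Illustrative antiphase cancellation: a disturbance signal plus the same
# signal shifted by half a period (180 degrees) sums to ~zero.

def vibration(t, amplitude=1.0, freq_hz=50.0):
    return amplitude * math.sin(2 * math.pi * freq_hz * t)

def counter_vibration(t, amplitude=1.0, freq_hz=50.0):
    # Same waveform, phase-shifted by pi radians.
    return amplitude * math.sin(2 * math.pi * freq_hz * t + math.pi)

# Sample one tenth of a second at 1 kHz and check the residual.
samples = [vibration(t / 1000) + counter_vibration(t / 1000)
           for t in range(100)]
residual = max(abs(s) for s in samples)  # near zero after cancellation
```

The same mathematics underlies noise-cancelling headphones; the engineering difficulty is generating the antiphase signal fast and accurately enough in a mechanical structure.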


Hard, viscous or watery at the touch of a button

Piezoceramics are not the only materials that can be "smart." Another class of materials of interest to Fraunhofer researchers is "magneto-rheological fluids." These fluids contain tiny particles that align themselves into fixed chains in a magnetic field, solidifying the fluid. Depending on the strength of the field, the fluid is hard, viscous or watery. The Alliance partners have used it to develop a safety clutch for machinery – for use in motor vehicle drives or milling machines. During operation, the fluid is solid; in this state, it creates a rigid linkage between the drive shaft and the cutter head. Pressing the emergency shutoff button switches off the magnetic field: the substance returns to its fluid state, the drive shaft spins freely, and the cutter head comes to a standstill.

Specialists from different disciplines work together in the Alliance: materials developers, structural mechanics experts, electronics specialists and system engineers assemble all of the findings into a coherent whole. With the current economic upturn, industry experts expect to see additional products based on smart materials on the market in the next two years. "The technology is ready. Work is moving forward on other exciting solutions – from mechanical engineering to the consumer-goods market," notes head of the Alliance Tobias Melz of the Fraunhofer Institute for Structural Durability and System Reliability LBF in Darmstadt. At the HANNOVER MESSE, at a joint stand with other Adaptronics partners, the Alliance is presenting a variety of developments – including a table with vibration-damping bearings, an aircraft component with piezoceramic monitoring sensors and an upmarket passenger car with a smart interior.

EurekAlert

TStzmmalaysia
post Mar 12 2011, 11:23 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

A small quantum leap

New switching device could help build a dream: the ultrafast quantum Internet

Northwestern University researchers have developed a new switching device that takes quantum communication to a new level. The device is a practical step toward creating a network that takes advantage of the mysterious and powerful world of quantum mechanics.

The researchers can route quantum bits, or entangled particles of light, at very high speeds along a shared network of fiber-optic cable without losing the entanglement information embedded in the quantum bits. The switch could be used toward achieving two goals of the information technology world: a quantum Internet, where encrypted information would be completely secure, and networking superfast quantum computers.

The device would enable a common transport mechanism, such as the ubiquitous fiber-optic infrastructure, to be shared among many users of quantum information. Such a system could route a quantum bit, such as a photon, to its final destination just like an e-mail is routed across the Internet today.

The research -- a demonstration of the first all-optical switch suitable for single-photon quantum communications -- is published by the journal Physical Review Letters.

"My goal is to make quantum communication devices very practical," said Prem Kumar, AT&T Professor of Information Technology in the McCormick School of Engineering and Applied Science and senior author of the paper. "We work in fiber optics so that as quantum communication matures it can easily be integrated into the existing telecommunication infrastructure."

The bits we all know through standard, or classical, communications only exist in one of two states, either "1" or "0." All classical information is encoded using these ones and zeros. What makes a quantum bit, or qubit, so attractive is that it can be both one and zero simultaneously, as well as just one or just zero. Additionally, two or more qubits at different locations can be entangled -- a mysterious connection that is not possible with ordinary bits.

Researchers need to build an infrastructure that can transport this "superposition and entanglement" (being one and zero simultaneously) for quantum communications and computing to succeed.
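The "one and zero simultaneously" idea has a compact mathematical form: a qubit is a pair of complex amplitudes whose squared magnitudes give the measurement probabilities. This is textbook quantum notation, not a model of Kumar's hardware:

```python
import math

# A qubit state |psi> = alpha|0> + beta|1>. Measurement yields 0 with
# probability |alpha|^2 and 1 with probability |beta|^2; the amplitudes
# must be normalized so the probabilities sum to one.

def probabilities(alpha, beta):
    norm = abs(alpha) ** 2 + abs(beta) ** 2
    assert abs(norm - 1.0) < 1e-9, "state must be normalized"
    return abs(alpha) ** 2, abs(beta) ** 2

# Equal superposition: both outcomes equally likely.
p0, p1 = probabilities(1 / math.sqrt(2), 1 / math.sqrt(2))
```

A photonic switch suitable for quantum networking must route the photon without collapsing or distorting these amplitudes, which is exactly what the Northwestern experiment verified.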

The qubit Kumar works with is the photon, a particle of light. A photonic quantum network will require switches that don't disturb the physical characteristics (superposition and entanglement properties) of the photons being transmitted, Kumar says. He and his team built an all-optical, fiber-based switch that does just that while operating at very high speeds.

To demonstrate their switch, the researchers first produced pairs of entangled photons using another device developed by Kumar, called an Entangled Photon Source. "Entangled" means that some physical characteristic (such as polarization as used in 3-D TV) of each pair of photons emitted by this device are inextricably linked. If one photon assumes one state, its mate assumes a corresponding state; this holds even if the two photons are hundreds of kilometers apart.

The researchers used pairs of polarization-entangled photons emitted into standard telecom-grade fiber. One photon of the pair was transmitted through the all-optical switch. Using single-photon detectors, the researchers found that the quantum state of the pair of photons was not disturbed; the encoded entanglement information was intact.

"Quantum communication can achieve things that are not possible with classical communication," said Kumar, director of Northwestern's Center for Photonic Communication and Computing. "This switch opens new doors for many applications, including distributed quantum processing where nodes of small-scale quantum processors are connected via quantum communication links."

EurekAlert

The title of the paper is "Ultrafast Switching of Photonic Entanglement." In addition to Kumar, other authors of the paper are Matthew A. Hall and Joseph B. Altepeter, both from Northwestern.

TStzmmalaysia
post Mar 13 2011, 11:46 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


ENERGY

Attached Image

Wireless electricity could be just months away

In 1905 the Serbian inventor Nikola Tesla built a huge 18-storey tower on Long Island.

His aim was to create the world's first power station that would transmit wireless electricity around the globe.
Unfortunately the dream was short-lived. His financiers, including JP Morgan, grew cautious and withdrew funding. The project was considered too audacious and ill-thought out, and was eventually abandoned.
The tower, torn down to pay Tesla's mounting debts, became his bold failure. But now, 100 years on, can his ambition be realised?
WiTricity, a US-based firm set up by physicists from the Massachusetts Institute of Technology (MIT), is one of a number of companies around the globe developing different models of powering up gadgets without using cables.
Inspired by Tesla's vision, WiTricity believes it can launch wirelessly-powered products within the year.
A wireless world
Wireless electricity is transmitted between a device and its power source via fitted metal coils, explained researcher Aristeidis Karalis.
"One coil is the source, the other is the device. The source generates a magnetic field which induces a current in the device. This is converted into the power the device wants to use."
The goal is to transmit electricity over mid-range distances - so electricity from a wall to the middle of a room.
The firm's president Eric Giler says its research is developing rapidly.
"Imagine your house. Look under the table and there's a coil. You see multiple devices working from a distance away. So you come home and your phone is in your purse - you don't have to think about where to put it."
However some say the technology is still far from perfect.
According to Menno Treffers from the Wireless Power Consortium, energy transfer becomes rapidly inefficient the further a device is moved from the source.
"You can make it work over a coil diameter, but what's the point in transmitting power from the wall to the TV - 20 to 30cm away - if you can't take it outside?"
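Treffers' objection reflects basic physics: for loosely coupled coils, the magnetic coupling falls off roughly with the cube of the separation relative to coil size. A minimal sketch under that textbook dipole approximation; the coil radius and the formula's constants are illustrative, not WiTricity's figures:

```python
# Illustrative model of coupling vs. distance for inductive power transfer.
# Assumption: coupling ~1 at contact, falling as (r/d)**3 once the
# separation d exceeds the coil radius r (simplified dipole falloff).

def coupling(distance_m, coil_radius_m=0.15):
    if distance_m <= coil_radius_m:
        return 1.0
    return (coil_radius_m / distance_m) ** 3

for d in (0.15, 0.3, 0.6, 1.2):
    print(f"{d:.2f} m -> coupling {coupling(d):.4f}")
```

Doubling the distance cuts this toy coupling by a factor of eight, which is why resonant techniques (driving both coils at a shared resonant frequency) are needed to make mid-range transfer efficient at all.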
Climate challenge
Eric Giler agrees that wireless power is not as efficient as using a cable, but the environmental difference it makes is considerable.
"A wireless keyboard uses four batteries. We made about 40 billion of these in the world this year. Building one of those batteries is the same as driving three miles in your car.
"The greenhouse gas emissions are huge. But if you put a coil in the table and put the keyboard on the table - on it comes - without a battery."
The WiTricity physicists did look at Tesla's patents for their research, but for now their plans do not include global power transmission. Still, Eric Giler shares Tesla's expectations about the idea's potential.
"Imagine a pacemaker that never needs to be replaced or a car that starts charging as soon as it's parked. Five years from now it will seem obvious - it's only today it sounds futuristic."

BBC News

TStzmmalaysia
post Mar 14 2011, 08:51 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


APPLIED SCIENCES

Attached Image

'Dunehouse' that Grows Like an Artificial Organism

Italian architect Gianluca Santosuosso unveiled the 'DUNEhouse', a flexible design tool that gives clients the possibility to define their own house.

The idea is based on a system of distinct volumes, each representing a program or a cluster of programs (bedroom and bathroom, kitchen, swimming pool, etc.); the client can distribute them across the plot, creating whatever spaces and connections he prefers.

In the project, the most important parameter is the wind, so the base geometry of the house (a sequence of air-flow analyses shows some of the wind tests) is organised to maximise the wind flow and consequently the natural ventilation inside the building. To increase this effect, all of the house's volumes (made of reinforced, insulated concrete shells) are placed on top of small artificial dunes that work as thermal mass, regulating the temperature of the incoming air.

Moreover, the shells are insulated by a covering envelope of soil 'bubbles' that works as a green wall, allowing plants to grow and thereby regulating the microclimate surrounding the house.

The second parameter evaluated is the house's view exposure toward the sea. To understand and maximise this aspect, a diagram was produced to define where to place the windows. These windows are made of mirrored glass, and solar radiation is controlled through a louver system placed on the inner face of the glass.

The house grows like an artificial organism that takes advantage of natural resources. 'Melting' the shell of the house into the ground – starting at level 0.00 with an artificial dune and transitioning into spheres of soil, seeds and natural fabric – also exploits thermal inertia as much as possible, saving energy for the heating and cooling system.


TStzmmalaysia
post Mar 14 2011, 08:53 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

Cartesian Wax – Prototype for a Breathing Skin

This project by Neri Oxman from the MIT Media Lab explores the notion of material organization as it is informed by structural and environmental performance.

A continuous tiling system is differentiated across its entire surface area to accommodate a range of physical conditions of light transmission, heat flux, stored energy modulation and structural support. The surface is thickened locally where it is structurally required to support itself, and modulates its transparency according to the light conditions of its hosting environment. Twenty tiles are assembled as a continuum composed of multiple resin types – rigid and/or flexible.

Each tile is designed as a structural composite representing the local performance criteria as manifested in the mixtures of liquid resin. A single 3-D milled, semi-adjustable mold made of machinable wax is used to generate multiple tiles. Each tile is cast from a high-temperature-curing plastic that deforms the original mold with each casting procedure; the deformation is controlled by the temperature gradient across the surface area of the mold. These processes speculate about light- and/or heat-sensitive, environment-specific construction techniques.

The work is inspired by the Cartesian Wax thesis as elucidated by Descartes in the 1640s. The thesis relates to the construction of material perception and effect in our experience of the physical world. According to Descartes, the essence of the wax is whatever survives the various changes in the wax's physical form. Not unlike the Cartesian Wax, "materials that think" embody processes of formation that have generated their physical form.

Evolo

Neri Oxman


TStzmmalaysia
post Mar 15 2011, 09:53 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


ROBOTICS

Attached Image

Rubbing elbows with robotics

Researchers at Intel Labs located on the campus of Carnegie Mellon University in Pittsburgh are figuring out a way to take the drudgery out of house work. Credit: Science Nation, National Science Foundation

Brian Zenowich will sometimes spend his workdays doing a little arm-in-arm dancing. His dance partners manage to stay in step, duplicating his every move almost flawlessly. The "twist" here isn't the type of dance he's doing. It's the fact that Zenowich isn't dancing with humans. He's actually a robotics engineer for Barrett Technologies in Cambridge, Mass., where the company makes robotic arms and hands.

They're called WAM arms and "W-A-M" is an acronym for "Whole Arm Manipulator." The arms can be sold with attachable robotic hands, too. Zenowich demonstrates how to control the robotic WAM arms and hands of one robot by gracefully moving similar parts on another robot.

"It's a very visceral experience," says Zenowich. "You are in contact with this machine and you're working back and forth with the robot and it's like a dance that you're doing with the robot."

Zenowich operates a master robot, while the slave copies his moves remotely. He's able to pick up a small box, place a coffee thermos on top of the box, and top the thermos with a baseball cap. "You get both the skill of the person, the intelligence of the person and the speed of the robot working together to perform a task," explains Zenowich.

For robot parts to act and react like people parts, Barrett needed to make them small and portable, with maximum agility. With help from the National Science Foundation's Small Business Innovation Research (SBIR) program, the company developed a computerized device that looks like a small hockey puck. So much so that they even call it a "puck."

"A puck is a motor-controller, which goes into all of our products that controls the motors and moves them with as much force--as much torque, as we need them to move," says Zenowich. "Just as people move around, we want our robots to move around so they needed to be small and lightweight and low power."

The puck allows the arms and hands to be utilized in many ways. The operator can actually sense virtual objects through the touch of the robot. "When you are in contact with the robot, you can actually feel objects in 3-D space: the robot will create that virtual environment for you in a physical sense."

That sensing of virtual objects can be programmed into the robot's memory and is useful for applications such as physical therapy. "The Rehabilitation Institute of Chicago is using our robot to perform rehabilitation on stroke patients to make them stronger; to get their brains to really understand what the aftereffects of the stroke were." Patients will push on the arm and hit imaginary objects they can feel through the robot arm.
There are other functions of the arms and hands made by Barrett. Telerobotics allows for remote operation of the arms and hands, which could be a boon for the military. "The robot can go in remotely under human control and can disarm the explosive device," says Zenowich.

Another application is "teach and play."

"It's very easy to teach our robot to make even complex motions like writing," he says. The robot can be programmed to automatically understand what to do. "So if you put a pen in the robot's hand, it would know it's supposed to write something with that pen."

With improved technology, robots can be used more efficiently and in wider applications. According to SBIR program manager, Muralidharan Nair, in the past, most robots were assigned to repetitive tasks in industry with only recent entry into product assembly. "However, the growing needs of aging populations will dwarf these traditional industrial uses," says Nair.

And addressing these quality-of-life needs with robots will require continued improvements in intelligence and sensing capability. "The NSF/SBIR program investments in robotics technology have typically been made in the areas of human assistive technologies, healthcare robotics, education, robotics in manufacturing, and emergency response. The program has been critical to Barrett's success with the robotic arm," adds Nair.

The WAM arms use roughly the same amount of power as a light bulb and can also be used in factories. Most robotic arms don't know their own strength, and on assembly lines they can fatally crush their human counterparts. But the WAM arm is designed to know its own strength. That's because the robotic arm functions with truly sensitive, sophisticated controls, so humans are not at risk in the presence of these machines.

"If the robot is in a manufacturing environment and gets in the way accidentally, the robot's not going to push through that person. The person is always stronger than the robot, and can push the robot out of the way," explains Zenowich.
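The "person is always stronger than the robot" property comes down to capping the force the controller is allowed to command. A minimal sketch of that idea; the limit value and function names are illustrative assumptions, not Barrett's actual controller:

```python
# Hypothetical per-joint torque limiter: whatever the motion planner asks
# for, the output torque never exceeds a safe ceiling in either direction,
# so a human can always overpower the arm.

SAFE_TORQUE_LIMIT_NM = 5.0  # assumed per-joint ceiling, newton-metres

def clamp_torque(commanded_nm):
    return max(-SAFE_TORQUE_LIMIT_NM, min(SAFE_TORQUE_LIMIT_NM, commanded_nm))
```

Real compliant arms go further, sensing contact forces and actively yielding, but a hard output clamp is the simplest way to guarantee the robot cannot "push through" a person.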

"If technologies developed by Barrett are leveraged by a large U.S. corporation, the NSF/SBIR investment in Barrett can have a profound effect on the U.S. economy and quality-of-life for the aging population around the globe," says Nair. "Robotics will also become critical in healthcare, biomedical research and outcomes, and surgical procedures."

Most of the WAM arms and hands are sold to research wings of corporations or universities. Zenowich is impressed with how many different ways the robot arms are being used. "It's amazing to see all the applications where our robots are working directly with people and helping them in some amazing way," notes Zenowich.

PhysOrg

TStzmmalaysia
post Mar 15 2011, 09:54 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


ROBOTICS

Attached Image

Robots to the rescue

Researchers are exploring ways to make rescue robots less "creepy" and more user-friendly, incorporating lessons learned from studies of how humans interact with technology

Everyone knows machines don't have feelings. But try telling that to your brain.

"We have one social brain, and it's the same whether we're dealing with a person or a machine," said Clifford I. Nass, the Thomas M. Storke Professor at Stanford University, who studies the social aspects of technology. "People use the same social graces with machines, especially computers, as they do with people."

Nass has devoted much of his research career to studying the ways in which humans respond socially to technology. Despite what most people know intellectually, they still often automatically treat computers and other devices like human beings, he said.

In a 1993 study, for example, he found that people unconsciously use social rules when interacting with computers. His subjects were much "nicer" to the computer they had worked with--responding favorably to the computer when it "asked" how it performed--than they were to another computer that "asked" the same question about the first computer. "It was as if they didn't want to hurt the first computer's feelings," Nass said.

Several years ago, his unusual research led to a collaboration with Robin Murphy, director of the Center for Robot-Assisted Search and Rescue of Texas A&M University, and a professor of computer science and engineering. He and Murphy, who is regarded as a founder of the field of rescue robotics, are working together to design a rescue robot that is user-friendly.

Rescue robots serve as a trapped disaster victim's lifeline to the outside world. But they are worthless if the victim finds them scary, bossy, out-of-control or just plain creepy.

"Robots don't make eye contact. Their tone doesn't change. When they get closer to people, they start to violate their personal space," Murphy said. "If you are stuck somewhere for 10 hours, and something scares you, or annoys you for long enough, you might start disregarding what it is asking you to do. The term that keeps coming up is 'creepy.' People find the robots that are supposed to be helping them creepy."

Nass and Murphy are working to ease the "creep" factor in rescue robots, hoping to reduce anxiety, and bolster existing rescue efforts. The National Science Foundation (NSF) has funded the three-year project with a $1.2 million grant shared by the two universities as part of the American Recovery and Reinvestment Act (ARRA) of 2009. As an economic stimulus, the work will create at least five new research jobs in the short term, but, more importantly, the researchers expect it to jump-start a new industry.

"Several of these people will go out and start new companies based on this technology, and students will go out and work for these companies," Murphy said. "There is a burgeoning emergency response market--think about Haiti. We need more technology that is helpful for these situations. We are creating more knowledgeable people, and encouraging them to go into this sector."

Rescue robots have been used for more than a decade, but the early prototypes were mechanically primitive. "The 1995 Oklahoma City bombing and the earthquake in Kobe (Japan) created a great interest in rescue robots," Murphy said. "These events served as motivation to start focusing on rescue robots. But they weren't ready to go into the field until 1999."

The researchers hope to improve the devices in ways that will make them more valuable to law enforcement, in situations such as hostage negotiation, as well as in emergency response situations, where they already are in use. The robots also show promise in the health care setting, where the researchers believe they could have huge economic potential.

The current project, also supported by Microsoft, will create a multi-media "head" attachment called the "survivor buddy" that can fit on any traditional robot and serve as the interface between trapped victims and the rest of the world for the 10 or more hours it might take to extract them. An animator from Pixar--the company involved in such popular films as "Wall-E" and "Up"--has volunteered to help design the motions.

"How do you design a robot that is socially appropriate at a time when a person is under extreme stress?" Nass asks. "My role is to come up with all the social aspects. We're doing work on body distance, for example, if the robot comes up too close, and rolls right up next to you, that's pretty horrible. It has to do with the various social tricks humans use--it has to respect your personal space."

"But the robot can't be too far away," he adds. "What if the robot stood 100 feet back and said: 'I am very concerned about you. I am here to help you.' That also would be worrisome--the message is: 'I don't really care about you, because I am too far away.' It seems insincere--so insincerity is a very bad thing."

Robots must be programmed to pick up on human cues and respond appropriately--just as humans do with other humans, Nass said.

"We need to design a robot that knows social graces and can garner trust and show respect and expertise," he said. "If you send down a robot that seems like a moron, that's not going to help. It's not going to make you like it. If it's going to be a companion, a buddy, then you'd better like it. Think of all the things you need to be an effective search-and-rescue buddy. The robot has to likeable, seem smart, be trustworthy and seem caring, optimistic--but not overly optimistic."

He recalls the lessons learned many years ago when car company BMW introduced its early navigation system featuring a female voice. Ultimately, the system was recalled. "German male drivers would not take directions from a woman," Nass said. The experience motivated a series of studies "that showed people gender stereotype like crazy," he adds.

The "survivor buddy" will have features to allow victims to engage in two-way video-conferencing, watch the news and listen to music. The media component emerged following a 2005 mine accident--not involving rescue robots--but where trapped miners asked if workers could lower them an MP3 player. "We know people get bored," Murphy said. "These miners got tired of talking to responders on the other side."

The survivor buddy prototype was completed last summer, but hasn't yet been used in a disaster. It is a new robot head that the researchers hope will support any web-based activity and two-way video conferencing, as well as play music and television, among other things. It also will be more user-friendly, which they hope will make it less creepy.

"The head will constantly maintain gaze control with you, always maintaining eye contact," Murphy said. "Social gaze is important. Another important thing is the motions--we want it to move more slowly when it's close to you."

Nass adds: "Consider doctors in an emergency room. Doctors move sort of fast--but not insanely fast. You don't see them running really fast--and you don't see them sauntering. There's a right speed for an emergency between wild, frantic speed and sauntering."

The scientists also plan to adjust the volume so that the device speaks more softly the closer it gets to a victim, and it will likely change its coloration. "Most robots now are painted black and have bright head lights," Murphy said.
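
Taken together, these rules -- slower motion and a softer voice the closer the robot gets -- amount to scaling behaviour with distance. A toy sketch of such a policy follows; every threshold and scale factor here is an assumption for illustration, not a value from the project.

```python
def approach_behavior(distance_m, personal_space_m=0.5):
    """Sketch of distance-scaled behaviour: the closer the robot is to a
    person, the slower it moves and the more softly it speaks.
    All numbers are illustrative assumptions."""
    if distance_m <= personal_space_m:
        speed = 0.0   # stop at the personal-space boundary
    else:
        # Speed ramps up with distance, capped at full speed.
        speed = min(1.0, 0.25 * (distance_m - personal_space_m))
    # Speech volume also rises with distance, capped at full volume.
    volume = min(1.0, 0.2 + 0.1 * distance_m)
    return speed, volume

print(approach_behavior(0.4))  # inside personal space: stop, speak softly
print(approach_behavior(5.0))  # farther away: move faster, speak louder
```

The point of the sketch is the shape of the policy, not the numbers: motion and volume are monotone in distance, with a hard stop at the personal-space boundary.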

This can be disconcerting when "you come in the dark at people and blind them--what's more, you can't see the robots in the dark because they are black," she said. "Those are the things we want to avoid. We hope to make it colorful and backlit--and turn the headlights down a little bit."

The scientists plan to test the device in simulated rescue situations using actual people within scenarios as close as possible to the real thing, "without endangering anyone," Murphy said. "You can make people feel they are in a collapse--put them in a dark room, cover them with a blanket."

Previous testing on earlier robots--which prompted the "creep factor" finding--convinced the researchers they needed to make modifications if the rescue robots were to be effective.

"People who were well-fed and well-rested and just in there for an hour were showing significant reactions to the robot," Murphy said. "Imagine if you are already disoriented, or in a lot of pain or fear. The impact will be even more significant. It shows you how important it is to get it right."

For better or worse, research has shown that responses "we thought only applied to people also apply to technology," and that most people are unaware of this, Nass said.

In that earliest computer study, for example, his subjects insisted after the experiment that they would never give different responses to different computers--even though they did.

Moreover, "they were graduate students in electrical engineering in the computer science program at Stanford," Nass adds. "So if anybody knew that computers don't have feelings, these guys did."


TStzmmalaysia
post Mar 15 2011, 09:57 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

Neuro Signals Study Gives New Insight Into Brain Disorders

Research into how the brain transmits messages to other parts of the body could improve understanding of disorders such as epilepsy, dementia, multiple sclerosis and stroke.

Scientists at the University of Edinburgh have identified a protein crucial for maintaining the health and function of the segment of nerve fibres that controls transmission of messages within the brain.

The study, published in the journal Neuron, could help direct research into neurodegenerative disorders, in which electrical impulses from the brain are disrupted. This can lead to inability to control movement, causing muscles to waste away.

Professor Peter Brophy, Director of the University of Edinburgh's Centre for Neuroregeneration, said: "Knowing more about how signals in the brain work will help us better understand neurodegenerative disorders and why, when these illnesses strike, the brain can no longer send signals to parts of the body."

The brain works like an electrical circuit, sending impulses along nerve fibres in the same way that current is sent through wires.

These fibres can measure up to a metre, but the area covered by the segment of nerve that controls transmission of messages is no bigger than the width of a human hair.

Dr Matthew Nolan, of the University's Centre for Integrative Physiology, said: "At any moment tens of thousands of electrical impulses are transmitting messages between nerve cells in our brains. Identifying proteins that are critical for the precise initiation of these impulses will help unravel the complexities of how brains work and may lead to new insights into how brains evolved."

ScienceDaily

TStzmmalaysia
post Mar 15 2011, 10:00 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

The Marangoni effect: A fluid phenom

What do a wine glass on Earth and an International Space Station experiment have in common? Well, observing the wine glass would be one of few ways to see and understand the experiment being performed in space.

Ever heard someone say their wine has "legs" or "tears of wine?"

Wine legs or tears of wine is a phenomenon manifested as a ring of clear liquid that forms near the top of a glass above the surface of wine. The drops continuously form and fall in rivulets back into the liquid. One factor in the way fluid moves is called Marangoni convection, or flow, and Japan Aerospace Exploration Agency researchers are very interested in studying it in a gravity-free environment.

Marangoni convection is the tendency for heat and mass to travel to areas of higher surface tension within a liquid. Surface tension is a property of a liquid that causes the surface portion of liquid to be attracted to another surface, such as a drop of mercury that forms a cohesive ball in a thermometer or droplets of water on a well-waxed car. This phenomenon is named after Italian physicist Carlo Marangoni who first studied the phenomenon in the 19th century.
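
The strength of this surface-tension-driven flow is commonly characterized by the dimensionless Marangoni number (a standard definition, not taken from the article):

Ma = -(dσ/dT) · ΔT · L / (μ · α)

where σ is the surface tension, ΔT the applied temperature difference, L a characteristic length, μ the dynamic viscosity and α the thermal diffusivity of the liquid. The larger Ma is, the stronger the Marangoni convection relative to simple diffusion of heat.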

"We are clarifying an unknown phenomenon and that’s very exciting," said Satoshi Matsumoto, a Marangoni science coordinator from the Japan Aerospace Exploration Agency. "Marangoni negatively affects the quality of crystal growth such as semiconductors, optical materials or bio technology materials. The convection also occurs in a heat pipe for heat radiation devices in personal computers, and degrades the radiation performance. Therefore, increased understanding of Marangoni convection not only expands our knowledge of fluid behavior, but also has great significance for production of semiconductor materials and equipment development for both space and ground use."

JAXA has been conducting four Marangoni experiments to fully understand surface-tension-driven flow in microgravity; the series is scheduled to be completed in 2015.

To study how heat and mass move within a fluid in microgravity, investigators are using a large bridge of silicone oil suspended between two discs. On Earth, a bridge that size couldn't exist. One of the primary ways heat is transferred on Earth is by buoyancy, where warm air rises and cold air sinks. In space, there is no buoyancy. So investigators heat one disc more than the other to induce Marangoni convection in the bridge of silicone oil. They are looking at patterns of how fluids move to learn more about how heat is transferred in microgravity.

"It is difficult to observe the effects of Marangoni convection on Earth because the convection is weaker than convection caused by gravity," added Matsumoto. "That is why space experiments of Marangoni convection in a microgravity environment are helpful."

PhysOrg

Provided by JPL/NASA

TStzmmalaysia
post Mar 15 2011, 10:01 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

Using quantum methods to read classical memories offers surprising advantages

Currently, the data stored in classical digital memories such as CDs, DVDs, and barcodes is read by classical light. But as a new study shows, using quantum light to read these classical memories can bring surprising advantages. Quantum light can read digital data using very few photons, an ability that could lead to faster digital readers and optical memories with larger storage capacities than before.

Quantum physicist Stefano Pirandola from the University of York, UK, has published the study on the quantum readout of classical memories in a recent issue of Physical Review Letters.

"This is the first demonstration showing that the use of nonclassical light is beneficial for the readout of digital memories, reminiscent of current optical storage devices," Pirandola told PhysOrg.com.

As Pirandola explains in his study, there is an important difference between classical light – the light that is used in practically all of today’s technology applications – and quantum light. In classical light, the states of the electromagnetic field can be decomposed as probabilistic sums of coherent states. In contrast, when this decomposition is not possible, the states of an electromagnetic field are considered to be nonclassical (quantum). Important examples of nonclassical states are those that are entangled, in particular those with Einstein-Podolsky-Rosen (EPR) correlations. When two modes of light are described by these kinds of entangled states, their position and momentum "quadratures" are extremely correlated with each other.
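
The distinction Pirandola draws can be stated compactly in the standard Glauber-Sudarshan form (a textbook formulation, cited here for context): a state of light ρ is classical when it can be written as

ρ = ∫ P(α) |α⟩⟨α| d²α, with P(α) ≥ 0 a genuine probability density over coherent states |α⟩.

When no such non-negative P exists, the state is nonclassical; entangled two-mode states with EPR correlations are an important example.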

In the proposed method, a classical digital memory consists of many reflective cells, each of which has two possible reflectivities that represent the states 0 and 1 (the two values of a bit). To read the memory, light is irradiated on the cells, and a detector measures the reflected light to determine each cell’s state. Currently, classical light is used for these kinds of memories. However, when its energy is decreased, classical light can only retrieve a limited amount of information from each cell.

Quantum light, on the other hand, doesn’t face the theoretical limits that classical light does. Pirandola’s calculations showed that EPR transmitters (those that use quantum light) can retrieve much more information than classical transmitters in the regime of few photons. He calculates that the enhancement provided by quantum light can be quite large – even up to 1 bit per cell, which corresponds to the extreme situation where only quantum light can retrieve information, and classical light cannot retrieve any information at all.
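
The few-photon limit on classical readout can be illustrated with two textbook results that are consistent with, but much simpler than, the paper's full analysis: a coherent probe reflecting off a cell with reflectivity r0 or r1 yields one of two coherent states, the minimum error in telling them apart is given by the Helstrom bound, and the information retrieved per cell is then at most that of the resulting binary symmetric channel. A hedged sketch (the function name and parameters are my own):

```python
import math

def classical_bits_per_cell(n_photons, r0, r1):
    """Rough upper bound on classical (coherent-state) readout.

    A probe of mean photon number n reflecting off a cell of amplitude
    reflectivity r0 or r1 produces coherent states separated by
    |a - b|^2 = |r0 - r1|^2 * n.  The Helstrom bound for distinguishing
    two pure states gives p_err = (1 - sqrt(1 - exp(-|a-b|^2))) / 2, and
    the retrievable information per cell is at most 1 - H2(p_err), the
    capacity of the matching binary symmetric channel.
    (Textbook formulas for illustration; not the paper's derivation.)"""
    d2 = abs(r0 - r1) ** 2 * n_photons
    p_err = 0.5 * (1.0 - math.sqrt(1.0 - math.exp(-d2)))

    def h2(p):  # binary entropy in bits
        if p <= 0.0 or p >= 1.0:
            return 0.0
        return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

    return 1.0 - h2(p_err)

# With a single photon, classical light reads far less than 1 bit per cell;
# with many photons the readout approaches a full bit.
print(classical_bits_per_cell(1, 0.9, 1.0))
print(classical_bits_per_cell(1000, 0.9, 1.0))
```

This makes the article's point concrete: with similar reflectivities and very few photons, a classical probe extracts almost no information per cell, which is exactly the regime where the quantum advantage appears.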

Since quantum light can read digital information with significantly fewer photons than classical light, it can greatly reduce the reading time of the memory, resulting in higher data transfer rates. For instance, quantum light could increase the rotational speed of a DVD in such a way that only a few photons are irradiated in each data sector. Alternatively, if the reading time is fixed, the quantum light method can offer increased storage capacity compared to reading with classical light.

“The enhancement will be clearer in the future once quantum technology provides more efficient sources of quantum light,” Pirandola said. “Using quantum light, we could read memories using a few photons per bit, while today we use around 10^10 photons per bit. This can give an idea of the possible improvement, but I am not able to give good estimates.”

Pirandola also shows that EPR transmitters can be used in error-corrected memory models, in which each bit of information is stored in multiple cells to provide nearly flawless data readout. In contrast, low-energy classical transmitters are basically useless in this situation because they require many more cells for retrieving a single bit of information.

One other possible advantage for reading with quantum light lies in photodegradable organic memories, which contain confidential information. Faint quantum light may be able to read this data since it uses so few photons, whereas energetic classical light would destroy these memories.

“The challenging part [of experimentally demonstrating this concept] is clearly in the [light] source which should be fast and efficient,” Pirandola said. “Despite this, a pilot experiment is within the catch of current technology.”

PhysOrg

More information: Stefano Pirandola. “Quantum Reading of a Classical Digital Memory.” Physical Review Letters 106, 090504 (2011). DOI:10.1103/PhysRevLett.106.090504

TStzmmalaysia
post Mar 15 2011, 10:04 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

Good Vibrations Lead to Molecular Revelation

A little luck and the wisdom to recognize what they were seeing helped Rice University researchers solve a molecular conundrum in a way that could be a boon to chemists.

Rice chemist Junrong Zheng and his colleagues in Houston and China have improved upon a long-standing theory for electrolytes through the discovery that vibrational energy transfer can be used to probe how ions cluster in such aqueous solutions.

The discovery opens a path for Zheng, an assistant professor of chemistry at Rice, to measure short-range intermolecular distances without fluorescent or other labeling devices that could skew results. The tool that makes it possible is an ultrafast, time-resolved infrared spectroscope he had custom-built to understand dynamic processes at the subnanoscale.

Zheng is planting a flag in a new field of research with the paper that appeared this week in the Proceedings of the National Academy of Sciences. The paper specifically shows how the vibrational energy common to all molecules can reveal the mechanics of ion clusters in a solution.

Very dilute solutions of electrolytes are well understood through Debye-Hueckel theory, which was developed in 1923, but as these solutions become slightly more concentrated the theory breaks down, as its authors predicted. A better understanding of electrolyte solutions is essential to scientists who study electrochemistry, atmospheric aerosols and biological systems. Zheng's work provides a new way to enlarge this understanding.
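
For reference, the Debye-Hueckel limiting law predicts the mean activity coefficient of a dilute electrolyte from its ionic strength alone (standard form; the constant is for water at 25 °C):

log10 γ± = -A |z+ z-| √I, with A ≈ 0.509 (mol/kg)^(-1/2),

where z+ and z- are the ion charges and I = ½ Σ c_i z_i² is the ionic strength. Because the law treats ions as independently distributed point charges, same-charge clustering of the kind described below falls entirely outside it.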

The Rice discovery is important for several reasons. First, the strength of an electrolyte -- think of the liquid in a car battery -- depends strongly on how well ions from salts or acids dissolve in the solution. More ions in a solution make it less ideal because of more opposite-charge attraction, according to Debye-Hueckel.

Zheng and his team found unexpected but clear evidence that a significant portion of ions of the same charge form clusters even in less-saturated solutions, in direct opposition to the equation.

Second, the Rice team's technique provides a window for scientists who want to view, for instance, concentrations of sodium and potassium ions in living cells, the ion-dependent movement of proteins or the properties of ion channels in cell membranes.

"Junrong's remarkable accomplishment is to devise a completely new way to learn much more about the structures present in concentrated ionic solutions," said Nobel laureate Robert Curl, Rice's University Professor Emeritus and the Kenneth S. Pitzer-Schlumberger Professor Emeritus of Natural Sciences, who advised Zheng on the paper. "This is exciting and it is important and should be expandable to other important situations."

"Our understanding of concentrated salt solutions is poor, yet these are highly relevant to practical applications, such as solar cells and batteries,” said Gerald Meyer, the Bernard N. Baker Professor of Chemistry at Johns Hopkins University. “Junrong's approach is clever and provides some valuable insights from which new models can be developed. The use of isotopes for the demonstration of energy transfer within the clusters was particularly novel."

The path revealed itself to Zheng and Rice postdoctoral researcher Hongtao Bian last October. "This particular work was not intentionally designed," Zheng said. "It came from a small observation by Bian when I asked him to measure the rotation time constant of an anion in a concentrated solution.

"When he told me the rotation time was only 2.5 picoseconds, I knew something was wrong. I remembered the rotational time constant of this anion in a very dilute solution was around 3.7 picoseconds. We've measured this.

"In a dilute solution, the viscosity is very small," Zheng explained. "People move fast in an easy environment, but when it's crowded you cannot go so fast -- and the same applies here. When a solut is diluted, the molecules should move faster.

"But here in this very viscous solution, the molecules were moving too fast," he said. "Something was up. That's when I realized we weren't actually seeing the molecules rotate at all."

What the probe saw as a too-fast rotation was the vibrational energy as it transferred from one molecule to another with a different orientation. "I had thought that this, at some point, should happen, but I really couldn't experiment to demonstrate it," Zheng said. "The tools didn't exist. Then, just by this small accident, we're developing a whole methodology."
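
Zheng's expectation that rotation should slow down in a crowded solution follows from the Stokes-Einstein-Debye relation (a standard hydrodynamic result, cited here for context):

τ_r = η V / (k_B T),

where η is the viscosity, V the effective hydrodynamic volume of the molecule, k_B Boltzmann's constant and T the temperature. A more viscous, more concentrated solution should therefore give a longer rotational time, not the shorter 2.5 picoseconds that was measured -- which is how the team knew the apparent "rotation" was really energy transfer.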

Zheng said his calculations may not apply to all electrolytes, but should cover a wide range of those of interest to researchers. "Only certain types of ions have active infrared modes whose vibrational lifetimes are comparable to the energy transfer time scales -- but that includes many important ions in biology or electrochemistry. Certainly this method is not limited to ions. It is, in principle, applicable to any molecules with active vibrational modes," he said.

Zheng, a native of China who came to Rice three years ago after completing his doctorate at Stanford, relied on the steady hands of Rice colleagues Anatoly Kolomeisky, an associate professor of chemistry, and Curl, who in fact carried his calculations one step beyond.

"It took us months to figure out a mathematical model to explain the data, and when Bob read it, he said, 'You know, I don't believe you're right,'" Zheng recalled. He said Curl objected to the fact that calculations were based on the average distribution of clusters in a given solution. "It was statistically right," Zheng said, "but it wasn't rigorous.

"Bob took our physical picture and counted the signal size from each cluster. He came up with a very rigorous model that accounts for the distribution of clusters. So his model is perfectly right. No assumptions."

Zheng said both his and Curl's models were in close agreement, since the averaged sample was so large. "But he really helped me to question every detail of the math, to make sure that this is really right.

"I had spent a month with a student who has a bachelor's degree in physics -- in math -- creating our model. It's hard to imagine that Bob could have figured all this out, by himself, in a week."

The paper's co-authors include graduate students Xiewen Wen, Jiebo Li and Hailong Chen, all of Rice; Suzee Han, a student at Clements High School in Sugar Land, Texas, who volunteered in Zheng's lab; and Xiuquan Sun, Jian Song and Wei Zhuang of the Chinese Academy of Sciences, Dalian, China.

PhysOrg

Provided by Rice University
TStzmmalaysia
post Mar 15 2011, 10:05 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

Breakthrough in Nanocomposite for High-Capacity Hydrogen Storage

Since the 1970s, hydrogen has been touted as a promising alternative to fossil fuels due to its clean combustion -- unlike hydrocarbon-based fuels, which spew greenhouse gases and harmful pollutants, hydrogen's only combustion by-product is water. Compared to gasoline, hydrogen is lightweight, can provide a higher energy density and is readily available. But there's a reason we're not already living in a hydrogen economy: to replace gasoline as a fuel, hydrogen must be safely and densely stored, yet easily accessed. Limited by materials unable to leap these conflicting hurdles, hydrogen storage technology has lagged behind other clean energy candidates.

In recent years, researchers have attempted to tackle both issues by locking hydrogen into solids, packing larger quantities into smaller volumes with low reactivity -- a necessity in keeping this volatile gas stable. However, most of these solids can only absorb a small amount of hydrogen and require extreme heating or cooling to boost their overall energy efficiency.

Now, scientists with the U.S. Department of Energy (DOE) Lawrence Berkeley National Laboratory (Berkeley Lab) have designed a new composite material for hydrogen storage consisting of nanoparticles of magnesium metal sprinkled through a matrix of polymethyl methacrylate, a polymer related to Plexiglas. This pliable nanocomposite rapidly absorbs and releases hydrogen at modest temperatures without oxidizing the metal after cycling -- a major breakthrough in materials design for hydrogen storage, batteries and fuel cells.

"This work showcases our ability to design composite nanoscale materials that overcome fundamental thermodynamic and kinetic barriers to realize a materials combination that has been very elusive historically," says Jeff Urban, Deputy Director of the Inorganic Nanostructures Facility at the Molecular Foundry, a DOE Office of Science nanoscience center and national user facility located at Berkeley Lab. "Moreover, we are able to productively leverage the unique properties of both the polymer and nanoparticle in this new composite material, which may have broad applicability to related problems in other areas of energy research."

Urban, along with coauthors Ki-Joon Jeon and Christian Kisielowski, used the TEAM 0.5 microscope at the National Center for Electron Microscopy (NCEM), another DOE Office of Science national user facility housed at Berkeley Lab, to observe individual magnesium nanocrystals dispersed throughout the polymer. With the high-resolution imaging capabilities of TEAM 0.5, the world's most powerful electron microscope, the researchers were also able to track defects -- atomic vacancies in an otherwise-ordered crystalline framework -- providing unprecedented insight into the behavior of hydrogen within this new class of storage materials.

"Discovering new materials that could help us find a more sustainable energy solution is at the core of the Department of Energy's mission. Our lab provides outstanding experiments to support this mission with great success," says Kisielowski. "We confirmed the presence of hydrogen in this material through time-dependent spectroscopic investigations with the TEAM 0.5 microscope. This investigation suggests that even direct imaging of hydrogen columns in such materials can be attempted using the TEAM microscope."

"The unique nature of Berkeley Lab encourages cross-division collaborations without any limitations," said Jeon, now at the Ulsan National Institute of Science and Technology, whose postdoctoral work with Urban led to this publication.

To investigate the uptake and release of hydrogen in their nanocomposite material, the team turned to Berkeley Lab's Energy and Environmental Technologies Division (EETD), whose research is aimed at developing more environmentally friendly technologies for generating and storing energy, including hydrogen storage.

"Here at EETD, we have been working closely with industry to maintain a hydrogen storage facility as well as develop hydrogen storage property testing protocols," says Samuel Mao, director of the Clean Energy Laboratory at Berkeley Lab and an adjunct engineering faculty member at the University of California (UC), Berkeley. "We very much enjoy this collaboration with Jeff and his team in the Materials Sciences Division, where they developed and synthesized this new material, and were then able to use our facility for their hydrogen storage research."

Adds Urban, "This ambitious science is uniquely well-positioned to be pursued within the strong collaborative ethos here at Berkeley Lab. The successes we achieve depend critically upon close ties between cutting-edge microscopy at NCEM, tools and expertise from EETD, and the characterization and materials know-how from MSD."

This research is reported in a paper titled, "Air-stable magnesium nanocomposites provide rapid and high-capacity hydrogen storage without heavy metal catalysts," appearing in the journal Nature Materials. Co-authoring the paper with Urban, Kisielowski and Jeon were Hoi Ri Moon, Anne M. Ruminski, Bin Jiang and Rizia Bardhan.

ScienceDaily

TStzmmalaysia
post Mar 15 2011, 09:25 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

Nanorods developed in UC Riverside lab could greatly improve visual display of information

Chemists at the University of California, Riverside have developed tiny, nanoscale-size rods of iron oxide particles in the lab that respond to an external magnetic field in a way that could dramatically improve how visual information is displayed in the future.

Previously, Yadong Yin's lab showed that when an external magnetic field is applied to iron oxide particles in solution, the solution changes color in response to the strength and orientation of the magnetic field. Now his lab has succeeded in applying a coating of silica (silicon dioxide) to the iron oxide particles so that when they come together in solution, like linearly connected spheres, they eventually form tiny rods – or "nanorods" – that permanently retain their peapod-like structure.

When an external magnetic field is applied to the solution of nanorods, they align themselves parallel to one another like a set of tiny flashlights turned in one direction, and display a brilliant color.

"We have essentially developed tunable photonic materials whose properties can be manipulated by changing their orientation with external fields," said Yin, an assistant professor of chemistry. "These nanorods with configurable internal periodicity represent the smallest possible photonic structures that can effectively diffract visible light. This work paves the way for fabricating magnetically responsive photonic structures with significantly reduced dimensions so that color manipulation with higher resolution can be realized."

Applications of the technology include high-definition pattern formation, posters, pictures, energy efficient color displays, and devices like traffic signals that routinely use a set of colors. Other applications are in bio- and chemical sensing as well as biomedical labeling and imaging. Color displays that currently cannot be seen easily in sunlight – for example, a laptop screen – will be seen more clearly and brightly on devices that utilize the nanorod technology since the rods simply diffract a color from the visible light incident on them.

The researchers note that a simple and convenient way to change the periodicity in the rods is to use iron oxide clusters of different sizes. This, they argue, would make it possible to produce photonic rods with diffraction wavelengths across a wide range of the spectrum, from the near ultraviolet to the near infrared.
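As a rough illustration of how the periodicity sets the diffracted color, first-order Bragg diffraction at normal incidence gives a wavelength of roughly 2nd for spacing d. The sketch below assumes water-like surroundings (n ≈ 1.33) and hypothetical spacings; neither value is from the paper:

```python
def diffracted_wavelength_nm(spacing_nm, refractive_index=1.33, order=1):
    """First-order Bragg diffraction at normal incidence: lambda = 2 * n * d / m."""
    return 2 * refractive_index * spacing_nm / order

# Illustrative spacings (assumed, not from the paper): larger iron oxide
# clusters give larger periodicity and hence longer diffracted wavelengths.
for d in (120, 180, 250):   # nm
    print(d, "nm spacing ->", round(diffracted_wavelength_nm(d)), "nm")
```

With these assumed numbers the diffracted color runs from the near ultraviolet (~319 nm) through blue to red (~665 nm), which matches the near-UV-to-near-IR tunability the researchers describe.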

"One major advantage of the new technology is that it hardly requires any energy to change the orientation of the nanorods and achieve brightness or a color," Yin said. "A current drawback, however, is that the interparticle spacing within the chains gets fixed once the silica coating is applied, allowing for no flexibility and only one color to be displayed."

His lab is working now on achieving bistability for the nanorods. If the lab is successful, the nanorods would be capable of diffracting two colors, one at a time.

"This would allow the same device or pixel to display one color for a while and a different color later," said Yin, a Cottrell Scholar.

EurekAlert

TStzmmalaysia
post Mar 15 2011, 09:28 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

NASA's Hubble Rules out One Alternative to Dark Energy

Astronomers using NASA's Hubble Space Telescope have ruled out an alternate theory on the nature of dark energy after recalculating the expansion rate of the universe to unprecedented accuracy.

The universe appears to be expanding at an increasing rate. Some believe that is because the universe is filled with a dark energy that works in the opposite way of gravity. One alternative to that hypothesis is that an enormous bubble of relatively empty space eight billion light-years across surrounds our galactic neighborhood. If we lived near the center of this void, observations of galaxies being pushed away from each other at accelerating speeds would be an illusion.

This hypothesis has been invalidated because astronomers have refined their understanding of the universe's present expansion rate. Adam Riess of the Space Telescope Science Institute (STScI) and Johns Hopkins University in Baltimore, Md., led the research. The Hubble observations were conducted by the SHOES (Supernova Ho for the Equation of State) team that works to refine the accuracy of the Hubble constant to a precision that allows for a better characterization of dark energy's behavior. The observations helped determine a figure for the universe's current expansion rate to an uncertainty of just 3.3 percent. The new measurement reduces the error margin by 30 percent over Hubble's previous best measurement in 2009. Riess's results appear in the April 1 issue of The Astrophysical Journal.

"We are using the new camera on Hubble like a policeman's radar gun to catch the universe speeding," Riess said. "It looks more like it's dark energy that's pressing the gas pedal."

Riess' team first had to determine accurate distances to galaxies near and far from Earth. The team compared those distances with the speed at which the galaxies are apparently receding because of the expansion of space. They used those two values to calculate the Hubble constant, the number that relates the speed at which a galaxy appears to recede to its distance from the Milky Way. Because astronomers cannot physically measure the distances to galaxies, researchers had to find stars or other objects that serve as reliable cosmic yardsticks. These are objects whose intrinsic brightness, the brightness they have before being dimmed by distance, an atmosphere, or stellar dust, is known. Their distances, therefore, can be inferred by comparing their true brightness with their apparent brightness as seen from Earth.

To calculate longer distances, Riess' team chose a special class of exploding stars called Type Ia supernovae. These stellar explosions all flare with similar luminosity and are brilliant enough to be seen far across the universe. By comparing the apparent brightness of Type Ia supernovae and pulsating Cepheid stars, the astronomers could accurately measure their intrinsic brightness and therefore calculate distances to Type Ia supernovae in far-flung galaxies.
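The distance-ladder arithmetic described above can be sketched in a few lines: a standard candle's distance follows from the distance modulus m − M = 5 log10(d / 10 pc), and the Hubble constant from the Hubble law v = H0 d. The magnitudes and velocity below are illustrative values, not the study's data:

```python
def distance_from_magnitudes(apparent_m, absolute_M):
    """Distance in parsecs from the distance modulus m - M = 5*log10(d / 10 pc)."""
    return 10 ** ((apparent_m - absolute_M) / 5 + 1)

def hubble_constant(velocity_km_s, distance_pc):
    """H0 in km/s/Mpc from the Hubble law v = H0 * d."""
    return velocity_km_s / (distance_pc / 1e6)

# Illustrative numbers (not from the study): a Type Ia supernova with
# absolute magnitude -19.3 observed at apparent magnitude 16.7,
# in a galaxy receding at 11,000 km/s.
d_pc = distance_from_magnitudes(16.7, -19.3)
H0 = hubble_constant(11_000, d_pc)
print(f"distance: {d_pc / 1e6:.1f} Mpc, H0: {H0:.1f} km/s/Mpc")
```

With these assumed inputs the galaxy sits at roughly 158 Mpc and the inferred expansion rate lands near 69 km/s/Mpc, in the ballpark of modern measurements.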

Using the sharpness of the new Wide Field Camera 3 (WFC3) to study more stars in visible and near-infrared light, scientists eliminated systematic errors introduced by comparing measurements from different telescopes.

"WFC3 is the best camera ever flown on Hubble for making these measurements, improving the precision of prior measurements in a small fraction of the time it previously took," said Lucas Macri, a collaborator on the SHOES Team from Texas A&M in College Station.

Knowing the precise value of the universe's expansion rate further restricts the range of dark energy's strength and helps astronomers tighten up their estimates of other cosmic properties, including the universe's shape and its roster of neutrinos, or ghostly particles, that filled the early universe.

"Thomas Edison once said 'every wrong attempt discarded is a step forward,' and this principle still governs how scientists approach the mysteries of the cosmos," said Jon Morse, astrophysics division director at NASA Headquarters in Washington. "By falsifying the bubble hypothesis of the accelerating expansion, NASA missions like Hubble bring us closer to the ultimate goal of understanding this remarkable property of our universe."

ScienceDaily


TStzmmalaysia
post Mar 15 2011, 09:30 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

New microscope captures 3D movies of living cells

In some cases, looking at a living cell under a microscope can damage it or, worse, kill it. Now, a new kind of microscope has been invented by researchers from the Howard Hughes Medical Institute that is able to non-invasively take a three-dimensional look inside living cells with stunning results. The device uses a thin sheet of light like that used to scan supermarket bar codes and could help biologists to achieve their goal of understanding the rules that govern molecular processes within a cell.

Veteran microscope innovator Eric Betzig says that the field of microscopy has been hindered by the fact that many techniques require cells to be killed and fixed before being viewed. Light produced by microscopes used for live-cell techniques can, in some cases, actually cause damage to the cells. The light also floods the whole area being examined, not just the small portion that's in focus – producing blur from the out-of-focus regions.

Two years after arriving at HHMI's Janelia Farm Research Campus, Betzig started working on ways to overcome these problems.

"The question was, is there a way of minimizing the amount of damage you're doing so that you can then study cells in a physiological manner while also studying them at high spatial and temporal resolution for a long time?" said Betzig.

First developed around 100 years ago, plane illumination microscopy involves shining light through the side of a sample rather than from the top. While the technique showed promise, Betzig's group found that it still exposed too much of the sample. A much thinner sheet of light was produced by sweeping a Bessel beam – a kind of non-diffracting light beam – across the sample, but a Bessel beam's cross-section looks like a bullseye: concentric rings surround the central spot, and these rings still illuminated more of the sample than the central sheet alone.
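The bullseye structure can be seen directly from the textbook radial profile of an ideal Bessel beam, where intensity is proportional to J0(kr·r) squared: a bright central lobe surrounded by rings that carry a significant share of the light. A minimal sketch (arbitrary units; this is the idealized profile, not the team's measured beam):

```python
import numpy as np
from scipy.special import j0, jn_zeros

# Ideal Bessel beam: radial intensity I(r) proportional to J0(kr * r)^2.
kr = 1.0                                    # transverse wavenumber, arbitrary units
central_lobe_edge = jn_zeros(0, 1)[0]       # first zero of J0, the edge of the central spot
r = np.linspace(0.0, 20.0, 4000)
intensity = j0(kr * r) ** 2                 # normalized so the on-axis peak is 1.0

# Peak intensity of the first ring relative to the central maximum:
ring_mask = (r > central_lobe_edge) & (r < 2 * central_lobe_edge)
first_ring_peak = intensity[ring_mask].max()
print(f"central lobe edge: {central_lobe_edge:.3f}, first ring peak: {first_ring_peak:.3f}")
```

The first ring alone peaks at roughly 16% of the central intensity, and the rings go on indefinitely, which is why the raw beam exposed too much of the sample and motivated the structured-illumination and two-photon refinements described below.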

Working with postdoctoral researchers Thomas Planchon and Liang Gao, Betzig has spent the last couple of years refining the process to try and overcome the problem. First, instead of sweeping the Bessel beam across the sample, the group rapidly switched it off and on – a method known as structured illumination. Then by concentrating the light to a narrow central part of the Bessel beam using something called two-photon microscopy, they were able to build 3D stacks of the sample at nearly 200 images per second to generate movies of processes like cell division in stunning detail.

Betzig says that Bessel beam plane illumination microscopy will prove a powerful tool for cell biologists, since it non-invasively images the rapidly evolving three-dimensional complexity of cells.

The research is described in detail in a paper entitled Rapid three-dimensional isotropic imaging of living cells using Bessel beam plane illumination, which was recently published in the journal Nature Methods.

Gizmag

TStzmmalaysia
post Mar 16 2011, 11:25 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


NANOTECHNOLOGY

Attached Image

Using light to build nanoparticles into superstructures

Scientists in the Center for Nanoscale Materials and Argonne's Biosciences Division have demonstrated a remarkably simple, elegant, and cost-effective way of assembling nanoparticles into larger structures of any desired shape and form at will via a process called "optically directed assembly."

Optically directed assembly (ODA) involves suspensions of gold and carbon nanoparticles in water. A small droplet of the suspension is placed on a glass slide, and a low-power laser is focused onto a small region within the droplet near its surface. Through a complex process involving optical trapping, heating, evaporation, convective fluid flow, and chemical interactions, the nanoparticles fuse near the laser focus and as the experimenter moves the laser focus around in the droplet, a continuous filament of the fused material follows.

Gold-carbon nanoparticle interactions. (a) TEM image of the tip of a gold-carbon filament; (b) TEM image of encapsulated gold nanoparticles within the tip; (c) initial gold-carbon nanoparticle configuration for a molecular dynamics simulation; and (d) molecular dynamics result after 10 ns showing wetting of a gold nanoparticle by carbon atoms in the 450K range. These results indicate the possibility of encapsulation of gold nanoparticles by carbon.

These remarkable structures remain completely intact even after the fluid is drained off. In this manner “handcrafted” filaments of up to millimeter lengths and 10-60 times wider than the original nanoparticles can be formed with arbitrary shape and design. The resulting hierarchical architectures may be useful for a variety of applications, including biological sensing, electronics, optics, and emerging energy technologies. As a first demonstration, the researchers handcrafted a microscopic glyph -- the Chinese symbol for “king.”

Irreversible metal-metal aggregation is observed only when carbon is present. Scientists in CNM's Theory & Modeling Group used molecular dynamics simulations to model gold-carbon nanoparticle configurations and wetting behavior.

PhysOrg
TStzmmalaysia
post Mar 16 2011, 11:27 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

Oscillating gels could find many uses

Self-oscillating gels are materials that continuously change back and forth between different states — such as color or size — without provocation from external stimuli.

These changes are caused by the Belousov-Zhabotinsky chemical reaction, which was discovered during the 1950s. Without stirring or other outside influence, wave patterns from this chemical reaction can develop within the material or cause the entire gel itself to pulsate.

Irene Chou Chen, a doctoral candidate in the lab of Krystyn J. Van Vliet, the Paul M. Cook Career Development Associate Professor of Materials Science and Engineering, has been studying exactly how adjusting the size and shape of these gels can affect their behavior.

By integrating experiments with computer simulations conducted by collaborators Olga Kuksenok, Victor Yashin and Anna C. Balazs at the University of Pittsburgh, the MIT researchers have shown that pattern formation within the material can be controlled by changing the gel's size or shape. When the reaction is restricted to a sub-millimeter-sized gel, the material exhibits chemical oscillations that cause it to mechanically swell and shrink. Lasting for several hours, these self-sustained oscillations exemplify chemomechanical coupling — where chemical reactions cause mechanical changes. The work will be published in the March issue of the journal Soft Matter as part of a special focus on “active soft matter.”
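The chemical oscillations driving the gel are commonly modeled with the two-variable Oregonator, a standard reduction of the Belousov-Zhabotinsky kinetics. The sketch below uses typical textbook parameter values (not fitted to the MIT gels) and shows the self-sustained oscillations of the activator concentration that, in the gel, couple to swelling and shrinking:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Two-variable Oregonator model of the Belousov-Zhabotinsky reaction
# (u: activator HBrO2, v: oxidized catalyst). Parameters are illustrative
# textbook choices inside the oscillatory regime, not fitted to any gel.
eps, q, f = 0.04, 8e-4, 1.0

def oregonator(t, y):
    u, v = y
    du = (u * (1.0 - u) - f * v * (u - q) / (u + q)) / eps
    dv = u - v
    return [du, dv]

sol = solve_ivp(oregonator, (0.0, 30.0), [0.1, 0.1], method="LSODA", max_step=0.01)
u = sol.y[0]
print(f"activator u oscillates between {u.min():.4f} and {u.max():.4f}")
```

The activator repeatedly spikes and collapses without any external forcing, which is the chemical engine behind the gel's mechanical pulsations.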


The self-sustained pulsations could enable unique applications for this material, the researchers say, such as using it as an environmental sensor or as an actuator that could react to specific conditions. The simulations developed by the University of Pittsburgh group could also help to make such applications easier to implement.

PhysOrg



Provided by Massachusetts Institute of Technology
TStzmmalaysia
post Mar 16 2011, 11:29 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

Miniature lasers could help launch new age of the Internet

A new laser device created at the University of Central Florida could make high-speed computing faster and more reliable, opening the door to a new age of the Internet.

Professor Dennis Deppe's miniature laser diode emits more intense light than those currently used. The light emits at a single wavelength, making it ideal for use in compact disc players, laser pointers and optical mice for computers, in addition to high-speed data transmission.

Until now, the biggest challenge has been the failure rate of these tiny devices. They don't work very well when they face huge workloads; the stress makes them crack.

The smaller size and elimination of non-semiconductor materials means the new devices could potentially be used in heavy data transmission, which is critical in developing the next generation of the Internet. By incorporating laser diodes into cables in the future, massive amounts of data could be moved across great distances almost instantaneously. By using the tiny lasers in optical clocks, the precision of GPS and high-speed wireless data communications also would increase.

"The new laser diodes represent a sharp departure from past commercial devices in how they are made," Deppe said from his lab inside the College of Optics and Photonics. "The new devices show almost no change in operation under stress conditions that cause commercial devices to rapidly fail."

"At the speed at which the industry is moving, I wouldn't be surprised if in four to five years, when you go to Best Buy to buy cables for all your electronics, you'll be selecting cables with laser diodes embedded in them," he added.

Deppe and Sabine Freisem, a senior research scientist who has been collaborating with Deppe for the past eight years, presented their findings in January at the SPIE Photonics West conference in San Francisco, where PhysOrg.com participated as a media sponsor.

Deppe has spent 21 years researching semiconductor lasers, and he is considered an international expert in the area. sdPhotonics is working on the commercialization of many of his creations and has several ongoing contracts.

"This is definitely a milestone," Freisem said. "The implications for the future are huge."

But there is still one challenge that the team is working to resolve. The voltage necessary to make the laser diodes work more efficiently must be optimized.

Deppe said once that problem is resolved, the uses for the laser diodes will multiply, ranging from lasers in space to lasers that remove unwanted hair.

"We usually have no idea how often we use this technology in our everyday life already," Deppe said. "Most of us just don't think about it. With further development, it will only become more commonplace."

PhysOrg

Deppe joined UCF in 2006 after several years at the University of Texas at Austin. He has a Ph.D. in Electrical Engineering from the University of Illinois, and he has earned many prestigious awards. He was the 1999 Optical Society of America Nicholas Holonyak Award winner and was named a fellow by the OSA and Institute of Electrical and Electronics Engineers in 2000.

Freisem has a Ph.D. in Physics from Leiden University in the Netherlands. She worked with Deppe at UT before moving to UCF in 2006.

Provided by University of Central Florida


TStzmmalaysia
post Mar 16 2011, 11:31 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

Room-Temperature Spintronic Computers with Silicon Spin Transistors?

University of Utah researchers built "spintronic" transistors and used them to align the magnetic "spins" of electrons for a record period of time in silicon chips at room temperature. The study is a step toward computers, phones and other spintronic devices that are faster and use less energy than their electronic counterparts.

"Electronic devices mostly use the charge of the electrons - a negative charge that is moving," says Ashutosh Tiwari, an associate professor of materials science and engineering at the University of Utah. "Spintronic devices will use both the charge and the spin of the electrons. With spintronics, we want smaller, faster and more power-efficient computers and other devices."


Tiwari and Ph.D. student Nathan Gray report their creation of room-temperature, spintronic transistors on a silicon semiconductor this month in the journal Applied Physics Letters. The research - in which electron "spin" aligned in a certain way was injected into silicon chips and maintained for a record 276 trillionths of a second - was funded by the National Science Foundation.

"Almost every electronic device has silicon-based transistors in it," Gray says. "The current thrust of industry has been to make those transistors smaller and to add more of them into the same device" to process more data. He says his and Tiwari's research takes a different approach.

"Instead of just making transistors smaller and adding more of them, we make the transistors do more work at the same size because they have two different ways [electron charge and spin] to manipulate and process data," says Gray.

A Quick Spin through Spintronics

Modern computers and other electronic devices work because negatively charged electrons flow as electrical current. Transistors are switches that reduce computerized data to a binary code of ones or zeros represented by the presence or absence of electrons in semiconductors, most commonly silicon.

In addition to electric charge, electrons have another property known as spin, which is like the electron's intrinsic angular momentum. An electron's spin often is described as a bar magnet that points up or down, which also can represent ones and zeroes for computing.

Most previous research on spintronic transistors involved using optical radiation - in the form of polarized light from lasers - to orient the electron spins in non-silicon materials such as gallium arsenide or organic semiconductors at supercold temperatures.

"Optical methods cannot do that with silicon, which is the workhorse of the semiconductor and electronics industry, and the industry doesn't want to retool for another material," Tiwari says.

"Spintronics will become useful only if we use silicon," he adds.


The Experiment

In the new study, Tiwari and Gray used electricity and magnetic fields to inject "spin polarized carriers" - namely, electrons with their spins aligned either all up or all down - into silicon at room temperature.

Their trick was to use magnesium oxide as a "tunnel barrier" to get the aligned electron spins to travel from one nickel-iron electrode through the silicon semiconductor to another nickel-iron electrode. Without the magnesium oxide, the spins would get randomized almost immediately, with half up and half down, Gray says.

"This thing works at room temperature," Tiwari says. "Most of the devices in earlier studies have to be cooled to very low temperatures" - colder than 200 below zero Fahrenheit - to align the electrons' spins either all up or all down. "Our new way of putting spin inside the silicon does not require any cooling."

The experiment used a flat piece of silicon about 1 inch long, about 0.3 inches wide and one-fiftieth of an inch thick. An ultra-thin layer of magnesium oxide was deposited on the silicon wafer. Then, one dozen tiny transistors were deposited on the silicon wafer so they could be used to inject electrons with aligned spins into the silicon and later detect them.

Each nickel-iron transistor had three contacts or electrodes: one through which electrons with aligned spins were injected into the silicon and detected, a negative electrode and a positive electrode used to measure voltage.

During the experiment, the researchers send direct current through the spin-injector electrode and negative electrode of each transistor. The current is kept steady, and the researchers measure variations in voltage while applying a magnetic field to the apparatus.

"By looking at the change in the voltage when we apply a magnetic field, we can find how much spin has been injected and the spin lifetime," Tiwari says.



A 328 Nanometer, 276 Picosecond Step for Spintronics

For spintronic devices to be practical, electrons with aligned spins need to be able to move adequate distances and retain their spin alignments for an adequate time.

During the new study, the electrons retained their spins for 276 picoseconds, or 276 trillionths of a second. And based on that lifetime, the researchers calculate the spin-aligned electrons moved through the silicon 328 nanometers, which is 328 billionths of a meter or about 13 millionths of an inch.
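Assuming simple exponential spin relaxation, P(t) = P0 · exp(−t/τ), the measured 276-picosecond lifetime translates directly into the fraction of spin alignment surviving a given transit time. The transit times below are illustrative, not from the paper:

```python
import math

TAU_PS = 276.0   # spin lifetime in silicon reported in the study, in picoseconds

def spin_polarization(t_ps, p0=1.0):
    """Exponential spin relaxation: P(t) = P0 * exp(-t / tau)."""
    return p0 * math.exp(-t_ps / TAU_PS)

# Illustrative transit times (assumed, not from the paper): fraction of the
# injected spin alignment remaining when electrons reach the detector.
for t in (50, 276, 1000):
    print(f"{t:>5} ps -> {spin_polarization(t):.3f}")
```

After one lifetime (276 ps) only 1/e, about 37%, of the alignment remains, which is why both a long lifetime and a short transit distance matter for a working spintronic device.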

"It's a tiny distance for us, but in transistor technology, it is huge," Gray says. "Transistors are so small today that that's more than enough to get the electron where we need it to go."

"Those are very good numbers," Tiwari says. "These numbers are almost 10 times bigger than what we need [for spintronic devices] and two times bigger than if you use aluminum oxide" instead of the magnesium oxide in his study.

He says Dutch researchers previously were able to inject aligned spins into silicon using aluminum oxide as the "tunneling medium," but the new study shows magnesium oxide works better.

The new study's use of electronic spin injection is much more practical than using optical methods such as lasers because lasers are too big for chips in consumer electronic devices, Tiwari says.

He adds that spintronic computer processors require little power compared with electronic devices, so a battery that may power an electronic computer for eight hours might last more than 24 hours on a spintronic computer.

Gray says spintronics is "the next big step to push the limits of semiconductor technology that we see in every aspect of our lives: computers, cell phones, GPS (navigation) devices, iPods, TVs."

PhysOrg

Provided by University of Utah


TStzmmalaysia
post Mar 16 2011, 11:34 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

Tying the knot with computer-generated holograms: Winding optical path moves matter

In the latest twist on optical knots, New York University physicists have discovered a new method to create extended and knotted optical traps in three dimensions. This method, which the NYU scientists describe in the Optical Society's open-access journal Optics Express, produces "bright" knots, where the maximum of the light intensity traces out a knotted trajectory in space, for the first time allowing microscopic objects to be trapped along the path of the knot. The method may even, one day, help enable fusion energy as a practical power source, according to the NYU team.

Optical traps can be used to confine and manipulate small objects—ranging in size from a few nanometers to several hundred micrometers—in 3-D. They work because variations in the intensity of the light produce forces that push small objects toward bright regions. The trapping of small objects is widely used for a broad range of research applications in biophysics, condensed matter physics and medical diagnostics.

Ordinary optical traps use Gaussian laser beams that focus to a spot. The beams being used to create extended optical traps focus instead to curves, much like the bright patterns on the bottom of swimming pools. And these bright curves can be tied in knots.

Knotted traps are made by imprinting a computer-generated hologram on the wavefronts of an otherwise ordinary beam of light. NYU undergraduate student Elisabeth Shanblatt and NYU physicist David Grier, the authors of the Optics Express paper, use a "liquid-crystal spatial light modulator" to project their holograms. This is essentially the first cousin of a conventional LCD television screen. The spatial light modulator imprints a calculated pattern of phase shifts onto the light. When the modified beam is brought to a focus with a high-power lens, the region of maximum intensity takes the form of a 3-D curve. This curve can cross over and through itself to trace out a knot. Moreover, the same hologram can redirect the light's radiation pressure to have a component along the curve, so that the total optical force "threads the knot."
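To make the idea of a "bright knot" concrete, here is a standard parametric trefoil, the kind of closed 3-D curve along which such a hologram could place the trap's intensity maximum. The parametrization is a textbook one, not taken from the paper:

```python
import numpy as np

# Parametric trefoil knot: an example of a closed, genuinely three-dimensional
# curve of the sort a knotted optical trap traces (illustrative, not the
# paper's actual trap geometry).
t = np.linspace(0.0, 2.0 * np.pi, 500, endpoint=False)
x = np.sin(t) + 2.0 * np.sin(2.0 * t)
y = np.cos(t) - 2.0 * np.cos(2.0 * t)
z = -np.sin(3.0 * t)
curve = np.column_stack([x, y, z])

# The sampled curve is (nearly) closed, and it has real extent along z,
# so it cannot be flattened into a plane without self-intersection.
gap = np.linalg.norm(curve[0] - curve[-1])
print(f"closure gap: {gap:.3f}, z extent: {np.ptp(z):.3f}")
```

In the experiment, the spatial light modulator's phase pattern is computed so that the focused intensity maximum follows a curve like this one, with phase gradients supplying the force component that threads trapped objects along the knot.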

When Shanblatt and Grier began this investigation, they thought that creating knots would be a compelling and aesthetically pleasing demonstration of their method's power. Once the knots actually worked, they realized that there are very few—if any—other practical ways to create knotted force fields. Previously reported knotted vortex fields have intensity minima along the knot, rather than the intensity maxima, or "bright knots" that can be created using the computer-generated holograms.

Shanblatt was working on a project with Grier investigating these holographic optical traps, when they discovered a method for projecting holographic optical traps along arbitrary curves in 3-D, with amplitude and phase profiles independently specified (See figure above).

"The knotted optical force fields we created use intensity gradients to hold microscopic objects in place and phase gradients to thread them through the knot," says Shanblatt, describing their method. "These optical knots are a special type of a very general class of 3-D optical traps that can be created using holographic techniques."

Ordinary optical traps have current applications in biophysics, where they are used as surgical tools and to probe the elastic properties of biomolecules, and in condensed matter physics, where they assemble nanomaterials into 3-D functional structures and gauge the forces between microscopic objects. Extended optical traps are especially handy in moving small objects such as biological cells through microfluidic lab-on-a-chip devices. And they can be used to measure very small interactions among such objects, which is helpful for medical diagnostic tests.

Perhaps the most exciting and futuristic potential application the NYU team sees for their method is to create knotted current loops of charged particles in high-temperature plasmas. This is a long-sought-after goal for developing fusion energy as a practical power source.

How can their knots of light solve problems of fusion energy? Fusion reactors work by slamming light atomic nuclei into each other so hard that the nuclei fuse into heavier elements, releasing lots of energy. The best way to accomplish this, Grier says, is to heat the atoms to a high enough temperature so that they can overcome all of the barriers to fusion. At these temperatures, the atoms' electrons ionize and the gas becomes a plasma.

This is doubly good, notes Grier, because you can pass large electric currents through the plasma, therefore heating it still more. "You can also act on the currents with magnetic fields to contain the hot plasma, preventing it from destroying its physical container. These fusion plasmas are literally as hot as the core of the sun," he adds.

A problem occurs when currents flowing through plasma in a fusion reactor become unstable; this is similar to what occurs when the currents flowing through the plasma in a neon sign flicker. The currents thrash around, cool the plasma, damage the container, and generally prevent the process from generating useful energy.

"If the currents in a plasma are tied into a knot, the knot can eliminate most, if not all, of these instabilities because the magnetic field lines generated by the knotted current can't pass though each other," explains Grier.

Shanblatt and Grier believe that projecting a knotted optical force field into a plasma might prove to be a good way to initiate a knotted current loop. If so, the knotted current could then be ramped up by other conventional means. The result? Perhaps, a stable, high-temperature plasma capable of producing bountiful fusion energy.

PhysOrg

More information: "Extended and Knotted Optical Traps in Three Dimensions," Elisabeth R. Shanblatt, David G. Grier, Optics Express, Vol. 19, Issue 7, pp. 5833-5838.

Study Provided by Optical Society of America

TStzmmalaysia
post Mar 16 2011, 11:35 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


ENERGY

Attached Image

Hotter Solar Energy

Solar thermal power plants that produce hotter steam can capture more solar energy. That's why Siemens is exploring an upgrade for solar thermal technology to push its temperature limit 160 °C higher than current designs. The idea is to expand the use of molten salts, which many plants already use to store extra heat. If the idea proves viable, it will boost the plants' steam temperature up to 540 °C—the maximum temperature that steam turbines can take.
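The payoff from hotter steam can be bounded with the ideal Carnot efficiency, η = 1 − Tc/Th. The sketch below assumes a 40 °C condenser temperature (my assumption; real turbines run well below the Carnot limit) and compares the synthetic-oil cap of 390 °C with the 540 °C molten-salt target from the article:

```python
def carnot_efficiency(t_hot_c, t_cold_c=40.0):
    """Ideal Carnot efficiency between hot-side and condenser temperatures (Celsius in, fraction out)."""
    t_hot_k = t_hot_c + 273.15
    t_cold_k = t_cold_c + 273.15
    return 1.0 - t_cold_k / t_hot_k

oil_limit = carnot_efficiency(390)    # synthetic-oil trough plants
salt_limit = carnot_efficiency(540)   # molten-salt design temperature
print(f"Carnot limit: {oil_limit:.1%} at 390 C -> {salt_limit:.1%} at 540 C")
```

The theoretical ceiling rises from about 52.8% to about 61.5%, which is why pushing the working fluid 160 °C hotter is worth the engineering effort even though real plants capture only a fraction of the Carnot bound.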

Siemens's new solar thermal plant design, like all large solar thermal power plants now operating, captures solar heat via trough-shaped rows of parabolic mirrors that focus sunlight on steel collector tubes. The design's Achilles' heel is the synthetic oil that flows through the tubes and conveys captured heat to the plants' centralized generators: the synthetic oil breaks down above 390 °C, capping the plants' design temperature.

Startups such as BrightSource, eSolar, and SolarReserve propose to evade synthetic oil's temperature cap by building so-called power tower plants, which use fields of mirrors to focus sunlight on a central tower. But Siemens hopes to upgrade the trough design, swapping in heat-stable molten salt to collect heat from the troughs. The resulting design should not only be more efficient than today's existing trough-based plants, but also cheaper to build. "A logical next step is to just replace the oil with salt," says Peter Mürau, Siemens's molten salt technology program manager.

The German engineering giant will actually be the second player to try to push molten salts through solar collector tubes. Last summer, the Italian utility Enel began running molten salt through a field of about 30,000 square meters of trough mirrors adjacent to its natural gas-fired power plant near Syracuse, Sicily. The salt exits the 5.4 kilometers of collector pipe at 565 °C, boosting the power plant's output by 5 percent.

Enel's plant uses collector tubes from Italy's Archimede Solar Energy, the only producer of collector tubes designed to handle molten salts. Their collector tubes use a heat-stable metalloceramic coating to maximize heat absorption, as well as thicker titanium-stabilized steel pipes to resist bending at high temperatures. Paolo Martini, Archimede's business development director, says the plant is operating well. Enel plans to build a 30-megawatt plant in Sicily.

Technology Review

saigetsu
post Mar 16 2011, 02:11 PM

Enthusiast
*****
Senior Member
707 posts

Joined: Feb 2010


i wonder why...

solar power can directly produce electricity, so why bother using it to steam up water?

i thought energy would be lost when we transform it from light to heat to kinetic and then to electricity.
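
As a rough answer to the question above: yes, each conversion stage loses energy, but the two chains end up with comparable overall efficiency. A back-of-envelope sketch, using illustrative (not measured) stage efficiencies:

```python
# Rough comparison of the two conversion chains raised in the question
# above. The stage efficiencies are ballpark assumptions, not measured
# figures for any specific plant.

from math import prod

pv_chain = {"light -> electricity (PV cell)": 0.18}

thermal_chain = {
    "light -> heat (mirrors + collector)": 0.60,
    "heat -> steam turbine work":          0.40,
    "turbine work -> electricity":         0.95,
}

pv_total = prod(pv_chain.values())
thermal_total = prod(thermal_chain.values())

print(f"PV chain overall:      {pv_total:.1%}")
print(f"Thermal chain overall: {thermal_total:.1%}")
# Despite the extra steps, the totals are comparable, and a thermal plant
# can store heat cheaply in molten salt, which PV cannot do with light.
```
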
TStzmmalaysia
post Mar 17 2011, 09:04 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


BIOTECHNOLOGY

Attached Image

Insulin-releasing switch discovered

Johns Hopkins researchers believe they have uncovered the molecular switch for the secretion of insulin — the hormone that regulates blood sugar — providing for the first time an explanation of this process. In a report published online March 1 in Cell Metabolism, the researchers say the work solves a longtime mystery and may lead to better treatments for type 2 diabetes, the most common form of the disease.

"Before our discovery, the mechanism behind how exactly the insulin-producing beta cells in the islet of Langerhans of the pancreas fail in type 2 diabetes was incompletely understood, making it difficult to design new and better therapies," says Mehboob Hussain, M.D., associate professor of pediatrics, medicine and biological chemistry. "Our research cracks open a decades-long mystery."

After a meal, the pancreas produces insulin to move glucose from the blood into cells for fuel. People with type 2 diabetes either don't secrete enough insulin or their cells are resistant to its effects.

In a study designed to figure out more precisely how the pancreas releases insulin, Hussain's group looked at how other cells in the body release chemicals. One particular protein, Snapin, found in nerve cells, caught their eye because it's used by nerve cells to release chemicals necessary for cell communication. Snapin also is found in the insulin-secreting pancreatic beta cells.

To test the role of Snapin, researchers engineered a change to the Snapin gene in mice to keep Snapin permanently "on" in the pancreas. Researchers removed the pancreas cells and grew them in a dish for a day, then added glucose to the cells and took samples to measure how much insulin was released.

When the scientists compared that measurement to what was released by pancreas cells in normal mice, they found that normal mice released about 2.8 billionths of a gram of insulin per cell, whereas the cells from "Snapin-on" mice released 7.3 billionths of a gram of insulin per cell — about three times the normal amount.

"We were surprised to find that the Snapin-on mice didn't have more or bigger pancreas cells, they just made more insulin naturally," says Hussain. "This means all our insulin-secreting cells have this amazing reserve of insulin that we didn't really know existed and a switch that controls it."

To see if permanently turning off Snapin would reduce insulin release and further demonstrate that Snapin controls the process, the researchers first grew normal mouse pancreas cells in a dish, and treated them with a chemical that stopped them from making the Snapin protein. They again bathed the cells in glucose and measured how much insulin was released by the cells. Normal cells released 5.8 billionths of a gram of insulin, whereas cells with no Snapin only released 1.1 billionths of a gram of insulin — about 80 percent less.
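
The arithmetic behind the reported figures can be checked directly (units are billionths of a gram, i.e. nanograms, of insulin per cell):

```python
# Quick check of the insulin numbers reported above.

snapin_on, normal_a = 7.3, 2.8      # first experiment (Snapin kept "on")
normal_b, snapin_off = 5.8, 1.1     # knockdown experiment (Snapin removed)

fold_increase = snapin_on / normal_a
percent_drop = 1 - snapin_off / normal_b

print(f"Snapin-on cells released {fold_increase:.1f}x the normal amount")
print(f"Snapin-off cells released {percent_drop:.0%} less")
```
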

"These results convinced us that Snapin is indeed the switch that releases insulin from the pancreas," says Hussain.

Normally, according to Hussain, when we ingest glucose, the pancreatic beta cells release an initial burst of insulin almost immediately, then gradually release more insulin about 15 minutes later. However, people with type 2 diabetes and mice engineered to react metabolically like people with type 2 diabetes don't release this initial spurt of insulin when fed glucose, but still have the later gradual insulin release.

"We knew how important the first burst of insulin is for controlling our blood sugar, but we did not know what really went wrong in our beta cells in people with type 2 diabetes," says Hussain. "We have drugs that restore the first burst of insulin and yet we did not completely understand how they work."

Hussain then questioned whether Snapin could be used to fix the defects in cells from a diabetic animal.

Since the cells with Snapin on made too much insulin, researchers wanted to see if they could use this to restore these mice's ability to secrete the initial burst of insulin. After growing pancreatic beta cells from type 2 diabetes mice in a dish and engineering them to make the Snapin-on protein, the researchers fed the cells glucose and found that they did indeed regain the ability to release that initial insulin burst.

"While keeping Snapin on in these mouse cells corrects the problem in this animal model of type 2 diabetes, we're still a long way from knowing if the same mechanism will work in people, but this gives us an encouraging start," says Hussain.

PhysOrg

TStzmmalaysia
post Mar 17 2011, 09:05 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


SPACE SCIENCE

Attached Image

CU-Boulder space scientists ready for orbital insertion of Mercury spacecraft

NASA's MESSENGER mission, launched in 2004, is slated to slide into Mercury's orbit March 17 after a harrowing 4.7 billion-mile journey that involved 15 loops around the sun, bringing relief and renewed excitement to the University of Colorado Boulder team that designed and built an $8.7 million instrument onboard.

"In 2004, this milestone seemed like it was a long, long way away," said Senior Research Associate William McClintock, a mission co-investigator from CU-Boulder's Laboratory for Atmospheric and Space Physics. "But here we are at last, poised to help solve some of the many tantalizing mysteries about Mercury."

The smallest of the solar system's four rocky planets, Mercury orbits about two-thirds closer to the sun than Earth does and has been visited by only one other spacecraft, NASA's Mariner 10, in 1974 and 1975. CU-Boulder scientists say learning what makes the hot, rocky planet tick will help them better understand the formation and evolution of planetary systems.
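
A quick sanity check of the "two-thirds closer" figure, using Mercury's well-known mean orbital distance (a standard value, not stated in the article):

```python
# Mercury's semi-major axis is about 0.387 AU, with Earth's defined as 1 AU,
# so Mercury sits roughly 61% of the way in toward the sun.

mercury_au = 0.387  # mean orbital distance in astronomical units
fraction_closer = 1 - mercury_au
print(f"Mercury orbits about {fraction_closer:.0%} closer to the sun than Earth")
```
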

The refrigerator-sized spacecraft is carrying seven instruments -- a camera, a magnetometer, an altimeter and four spectrometers. Designed and built by CU-Boulder's LASP, the Mercury Atmospheric and Surface Composition Spectrometer, or MASCS, is a power-packed, 7-pound instrument that will make measurements of Mercury's surface and its tenuous atmosphere, called the exosphere.

MASCS breaks up light like a prism, and since each element and compound has a unique spectral signature, scientists can determine the distribution and abundance of various minerals and gases on the planet's surface and exosphere, said McClintock. "We now know Mercury's exosphere is constantly changing," he said.

During a 2009 MESSENGER flyby of Mercury, MASCS detected magnesium, an element created inside exploding stars, clumped in the exosphere. The team determined magnesium, sodium and potassium and several other kinds of atoms flying off Mercury's surface were being accelerated by solar radiation pressure to form a gigantic tail of material flowing away from the sun, said McClintock.

"All of the instruments on MESSENGER had to be extremely light, which stretched our imaginations and creativity," said LASP's Mark Lankton, program manager for MASCS. "We have learned a lot, and wound up getting a lot of bang for our buck."

LASP Director Daniel Baker, also a co-investigator on the MESSENGER mission, is studying Mercury's magnetic field and its interaction with the solar wind, including violent "sub-storms" that occur in the planet's vicinity. Since Mercury is the closest planet to the sun, MESSENGER is equipped with a large sunshade and heat-resistant ceramic fabric to protect it, said Baker.

"The three successful flybys of MESSENGER past Mercury have already rewritten the textbooks about the sun's nearest neighbor," Baker said. "We are pleased by all we have learned about the space environment of the planet. But we think there is so much more to learn -- we've probably just scratched the surface, so to speak."

Baker said the Mercury orbit insertion will be celebrated by all of LASP, including a solar science team that saw its $28 million instrument crash into the sea March 4 due to problems with a NASA-contracted launch vehicle. "A very important aspect of LASP is that it is like a big family," Baker said. "Everyone shares the joys of success and the sorrow of failure, which has been a blessedly rare occurrence in our history."

"We have all of our appendages crossed for a successful orbit insertion," said Lankton. "MESSENGER is part of NASA's Discovery Program, and I'd be surprised if we don't continue to be surprised. Once we are in Mercury's orbit we are going to be getting a bounty of new data every day."

Dozens of undergraduates and graduate students will be involved in analyzing data as information and images begin pouring back to Earth from MESSENGER, dubbed "the little spacecraft that could" by LASP scientists. "This mission is going to be a field day for students, not only at CU-Boulder, but for students all over the world," said Baker.

CU-Boulder's LASP is the only space institute in the world to have designed and flown instruments that have visited or are en route to every planet in the solar system. LASP also has a student-built dust-counting instrument on NASA's New Horizons Mission, launched in 2006 to Pluto and now approaching the orbit of Uranus.

"LASP has some of the best people in the world pursuing great science, great engineering, wonderful mission operations, and superb administrative and managerial achievement," said Baker. "When such a team is given the facilities and resources to thrive, the sky is the limit. But it all starts with our people, including our students."

The data will be sent via NASA's Deep Space Network to the Applied Physics Laboratory at Johns Hopkins University -- which is managing the mission for NASA -- where mission scientists, including researchers and students at LASP's Space Technology Building at the CU Research Park, will access it electronically, he said.

EurekAlert

TStzmmalaysia
post Mar 17 2011, 09:07 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


NANOTECHNOLOGY

Attached Image

Localized delivery of an anti-cancer drug by remote-controlled microcarriers

Soon, drug delivery that precisely targets cancerous cells without exposing the healthy surrounding tissue to the medication's toxic effects will no longer be an oncologist's dream but a medical reality, thanks to the work of Professor Sylvain Martel, Director of the Nanorobotics Laboratory at Polytechnique Montréal.

Known for being the world's first researcher to have guided a magnetic sphere through a living artery, Professor Martel is announcing a spectacular new breakthrough in the field of nanomedicine. Using a magnetic resonance imaging (MRI) system, his team successfully guided microcarriers loaded with a dose of anti-cancer drug through the bloodstream of a living rabbit, right up to a targeted area in the liver, where the drug was successfully administered. This is a medical first that will help improve chemoembolization, a current treatment for liver cancer.

Microcarriers on a mission

The therapeutic magnetic microcarriers (TMMCs) were developed by Pierre Pouponneau, a PhD candidate under the joint direction of Professors Jean-Christophe Leroux and Martel. These tiny drug-delivery agents, made from biodegradable polymer and measuring 50 micrometers in diameter — just under the breadth of a hair — encapsulate a dose of a therapeutic agent (in this case, doxorubicin) as well as magnetic nanoparticles. Essentially tiny magnets, the nanoparticles are what allow the upgraded MRI system to guide the microcarriers through the blood vessels to the targeted organ.

During the experiments, the TMMCs injected into the bloodstream were guided through the hepatic artery to the targeted part of the liver where the drug was progressively released. The results of these in-vivo experiments have recently been published in the prestigious journal Biomaterials and the patent describing this technology has just been issued in the United States.

EurekAlert


TStzmmalaysia
post Mar 17 2011, 09:08 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


BIOTECHNOLOGY

Attached Image

Natural sequence farming

Improving land management and farming practices in Australia could have an effect on global climate change, according to a study published in the International Journal of Water.

Natural Sequence Farming (NSF) describes sustainable agriculture that mimics the once highly efficient functions of the Australian landscape. NSF pioneer Peter Andrews of Denman, New South Wales, and NSF movement coordinator Duane Norris of Hardy's Bay, New South Wales, explain how NSF techniques could re-couple environmental carbon and water cycles not only to improve farming yields but to avoid soil erosion and reduce carbon dioxide emissions.

Agricultural practices such as clearing, burning, plowing, draining, and irrigation have become commonplace across the Australian continent, as they have elsewhere. Their effect on the organic carbon content of soil has led to a decline in soil quality across farmland on the continent, with levels currently a tenth of what they were 200 years ago, prior to the major European settling of Australia.

Andrews and Norris point out that this has had implications for atmospheric carbon dioxide levels and will continue to impact global warming if farming practices are not modified. "Soils hold twice as much carbon as the atmosphere, and three times as much as vegetation," the team explains. "But carbon in soil exposed by common agricultural practices leads to the oxidation of the carbon and the release of carbon dioxide into the atmosphere." Estimates suggest that soils that once contained carbon matter 4,000 to 10,000 years old are now holding carbon that is a mere two years old, because poor management of livestock grazing leaves soil de-vegetated and in an oxidizing state.

Plants extract carbon from carbon dioxide in the air by photosynthesis, the team says. This carbon is critical to soil health and plant fertility, but it is lost when a ploughed paddock is left bare with no plant cover. More carbon is released when grassland and trees are cleared. However, when vegetation is allowed to break down, even if it is weedy cover, the carbon content of the soil is raised and growing conditions improve.

But plants also have another critical function, relevant both to soil fertility and to climate stabilization: the atmospheric cooling that takes place as moisture evaporates from leaves, rises to form rain clouds, and then falls again, restoring the small water cycle to a local area. In this respect, hands-on NSF research in Australia converges with cutting-edge scientific research elsewhere.

The team adds that careful water management, planting, and mulch farming all work together in NSF practices to remediate eroded land. NSF techniques have been developed to restore ecosystems by re-coupling the carbon and water cycles, and could overcome the calamitous decline in soil carbon content caused by oxidation, soil erosion and loss to the sea through fast-running water flows and floods.

There are four guidelines for Natural Sequence Farming:

First, restore fertility held by nutrients and organic matter to improve the biological function of soils. Second, reinstate the hydrological balance to increase groundwater storage in the floodplain aquifer, increasing freshwater recharge and hence reducing saline groundwater discharge. Third, re-establish natural vegetation succession through pioneer species to promote the healthy growth of native plant communities. Fourth, understand the hydrological and biogeochemical processes that drive the natural landscape system, so that their management can restore ecological function.

The researchers recognize that Australians cannot turn the clock back 100,000 years to recreate the forested continent of mega fauna and sediment-carrying flood plains that existed before humans arrived.

EurekAlert


TStzmmalaysia
post Mar 17 2011, 09:10 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


APPLIED SCIENCES

Attached Image

3-D printing method advances electrically small antenna design

While most electronic components benefit from decreased size, antennas—whether in a cell phone or on an aircraft—suffer limitations in gain, efficiency, system range, and bandwidth when their size is reduced below a quarter-wavelength.

"Recent attention has been directed toward producing antennas by screen-printing, inkjet printing, and liquid metal-filled microfluidics in simple motifs, such as dipoles and loops," explained Jennifer T. Bernhard, a professor of electrical and computer engineering at the University of Illinois at Urbana-Champaign. "However, these fabrication techniques are limited in both spatial resolution and dimensionality, yielding planar antennas that occupy a large area relative to the achieved performance."

"Omnidirectional printing of metallic nanoparticle inks offers an attractive alternative for meeting the demanding form factors of 3D electrically small antennas (ESAs)," stated Jennifer A. Lewis, the Hans Thurnauer Professor of Materials Science and Engineering and director of the Frederick Seitz Materials Research Laboratory at Illinois.

"To our knowledge, this is the first demonstration of 3D printed antennas on curvilinear surfaces," Lewis stated. The research findings and fabrication methods developed by Bernhard, Lewis, and their colleagues are featured in the cover article of the March 18 issue of Advanced Materials ("Conformal Printing of Electrically Small Antennas on Three-Dimensional Surfaces").

According to Bernhard, these antennas are electrically small relative to a wavelength (typically a twelfth of a wavelength or less) and exhibit performance metrics that are an order of magnitude better than those realized by monopole antenna designs.

"There has been a long-standing problem of minimizing the ratio of energy stored to energy radiated—the Q—of an ESA," Bernhard explained. "By printing directly on the hemispherical substrate, we have a highly versatile single-mode antenna with a Q that very closely approaches the fundamental limit dictated by physics (known as the Chu limit)."
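
The Chu limit mentioned above has a simple closed form for a lossless single-mode antenna: Q = 1/(ka)^3 + 1/(ka), where k is the free-space wavenumber and a is the radius of the smallest sphere enclosing the antenna. A sketch using a hypothetical frequency and size, not the paper's actual design values:

```python
import math

def chu_limit_q(freq_hz, radius_m):
    """Minimum radiation Q for a lossless single-mode antenna enclosed in
    a sphere of the given radius (Chu limit: Q = 1/(ka)^3 + 1/(ka))."""
    c = 299_792_458.0               # speed of light, m/s
    k = 2 * math.pi * freq_hz / c   # free-space wavenumber, rad/m
    ka = k * radius_m
    return 1 / ka**3 + 1 / ka

# Hypothetical example: a 1 GHz antenna on a 1 cm hemispherical substrate.
# ka is well below 0.5 here, so the antenna counts as electrically small.
print(f"Chu-limit Q: {chu_limit_q(1e9, 0.01):.1f}")
```

The smaller the antenna relative to its wavelength, the larger this minimum Q, which is why approaching the limit on a compact 3D substrate is notable.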

Conformal printing allows the antenna's meander lines to be printed on the outside or inside of hemispherical substrates, adding to its flexibility.

"Unlike planar substrates, the surface normal is constantly changing on curvilinear surfaces, which presents added fabrication challenges," Lewis noted. To conformally print features on hemispherical substrates, the silver ink must strongly wet the surface to facilitate patterning even when the deposition nozzle (100 µm diameter) is perpendicular to the printing surface.

To fabricate an antenna that can withstand mechanical handling, for example, the silver nanoparticle ink is printed on the interior surface of glass hemispheres. Other non-spherical ESAs can be designed and printed using a similar approach to enable integration of low Q antennas on, for example, the inside of a cell phone case or the wing of an unmanned aerial vehicle. The antenna's operating frequency is determined primarily by the printed conductor cross-section and the spacing (or pitch) between meander lines within each arm.

According to the researchers, their design can be rapidly adapted to new specifications, including other operating frequencies, device sizes, or encapsulated designs that offer enhanced mechanical robustness.

"This conformal printing technique can be extended to other potential applications, including flexible, implantable, and wearable antennas, electronics, and sensors," Lewis said.

EurekAlert

TStzmmalaysia
post Mar 17 2011, 09:11 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


APPLIED SCIENCES

Attached Image

Blu Homes Announces That Its Prefab Houses Save Homeowners 50-70% in Energy Costs

By Jessica Dailey

Blu Homes is a modular home maker that designs eco-friendly prefab houses. There are already dozens of reasons to love Blu Homes, but today the company gives us another: the highly efficient, steel-frame homes cost homeowners 50 to 70 percent less to operate than a conventional home.

Most green prefabs are Energy Star rated, which means they save homeowners at least 20 percent, but Blu Homes' data shows that their structures perform significantly better than that.

Blu Homes likes to compare its designs to hybrid cars. "Just as we make the investment to buy a hybrid or electric car because it will be more sustainable in the long run and because it reflects our personal values, many people are starting to think about building homes that operate under the exact same principle."

The energy savings in Blu Homes are the result of a variety of green building elements. Their passive energy design combines high-quality insulation, radiant floor heating, a high-efficiency HVAC system, and rainwater and sunlight catchment. These green features enhance the savings that come from Blu Homes' proprietary steel framing, which by itself yields a highly insulated structure. On top of that, the frames are foldable, which means that even transportation to the building site is eco-friendly.

After one of the coldest, snowiest winters on record, Blu Homes owners can speak to the effectiveness of the designs. “The passive solar light and natural beauty outside feel like they’re ‘entering in’ from the big windows, keeping it warm and comfortable inside, while keeping our energy bills low,” said Long Island Blu homeowner Chris Howitt.

Blu Homes recently reengineered its Breezehouse design for maximum efficiency. Just unveiled last month, the model was redesigned to use Blu Homes’ folding steel frame, making it more efficient and cheaper to construct. The first Breezehouse homes are being sold this month and will be complete by the end of the year.

Inhabitat

TStzmmalaysia
post Mar 17 2011, 09:13 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

Physicists control light scattering in graphene

Scientists at the Lawrence Berkeley National Laboratory (Berkeley Lab) and the University of California, Berkeley have learned to control the quantum pathways determining how light scatters in graphene. Controlled scattering provides a new tool for the study of this unique material – graphene is a single sheet of carbon just one atom thick – and may point to practical applications for controlling light and electronic states in graphene nanodevices.

The research team, led by Feng Wang of Berkeley Lab's Materials Sciences Division, made the first direct observation, in graphene, of so-called quantum interference in Raman scattering. Raman scattering is a form of "inelastic" light scattering. Unlike elastic scattering, in which the scattered light has the same color (the same energy) as the incident light, inelastically scattered light either loses energy or gains it.

Raman scattering occurs in graphene and other crystals when an incoming photon, a particle of light, excites an electron, which in turn generates a phonon together with a lower-energy photon. Phonons are vibrations of the crystal lattice, which are also treated as particles by quantum mechanics.
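
The energy bookkeeping described above can be made concrete: in Stokes Raman scattering, the emitted photon's energy equals the incident photon's energy minus the phonon's. Assuming a 785 nm excitation laser (a common Raman wavelength; the article only says near-infrared) and graphene's well-known G-band phonon near 1580 cm^-1:

```python
# Stokes Raman scattering: scattered photon energy = incident photon
# energy - phonon energy. Spectroscopists work in wavenumbers (cm^-1),
# which are proportional to photon energy. The 785 nm laser wavelength
# is an assumption; the G-band value is a standard graphene figure.

laser_nm = 785.0
g_phonon_cm1 = 1580.0   # graphene G-band phonon, in wavenumbers

incident_cm1 = 1e7 / laser_nm            # photon energy in cm^-1
scattered_cm1 = incident_cm1 - g_phonon_cm1
scattered_nm = 1e7 / scattered_cm1       # back to wavelength

print(f"Incident:  {laser_nm:.0f} nm ({incident_cm1:.0f} cm^-1)")
print(f"Scattered: {scattered_nm:.0f} nm (longer wavelength, lower energy)")
```
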

Quantum particles are as much waves as particles, so they can interfere with one another and even with themselves. The researchers showed that light emission can be controlled by controlling these interference pathways. They present their results in a forthcoming issue of the journal Nature, now available in Advance Online Publication.



Manipulating quantum interference, in life and in the lab

"A familiar example of quantum interference in everyday life is antireflective coating on eyeglasses," says Wang, who is also an assistant professor of physics at UC Berkeley. "A photon can follow two pathways, scattering from the coating or from the glass. Because of its quantum nature it actually follows both, and the coating is designed so that the two pathways interfere with each other and cancel light that would otherwise cause reflection."

Wang adds, "The hallmark of quantum mechanics is that if different paths are nondistinguishable, they must always interfere with each other. We can manipulate the interference among the quantum pathways that are responsible for Raman scattering in graphene because of graphene's peculiar electronic structure."

In Raman scattering, the quantum pathways are electronic excitations, which are optically stimulated by the incoming photons. These excitations can only happen when the initial electronic state is filled (by a charged particle such as an electron), and the final electronic state is empty.

Quantum mechanics describes electrons filling a material's available electronic states much as water fills the space in a glass: the "water surface" is called the Fermi level. All the electronic states below it are filled and all the states above it are empty. The filled states can be reduced by "doping" the material in order to shift the Fermi energy lower. As the Fermi energy is lowered, the electronic states just above it are removed, and the excitation pathways originating from these states are also removed.

"We were able to control the excitation pathways in graphene by electrostatically doping it – applying voltage to drive down the Fermi energy and eliminate selected states," Wang says. "An amazing thing about graphene is that its Fermi energy can be shifted by orders of magnitude larger than conventional materials. This is ultimately due to graphene's two-dimensionality and its unusual electronic bands."

The Fermi energy of undoped graphene is located at a single point, where its electronically filled bands, graphically represented as an upward-pointing cone, meet its electronically empty bands, represented as a downward-pointing cone. To move the Fermi energy appreciably requires a strong electric field.

Team member Rachel Segalman, an associate professor of chemical engineering at UC Berkeley and a faculty scientist in Berkeley Lab's Materials Sciences Division, provided the ion gel that was key to the experimental device. An ion gel confines a strongly conducting liquid in a polymer matrix. The gel was laid over a flake of graphene, grown on copper and transferred onto an insulating substrate. The charge in the graphene was adjusted by the gate voltage on the ion gel.

"So by cranking up the voltage we lowered the graphene's Fermi energy, sequentially getting rid of the higher energy electrons," says Wang. Eliminating electrons, from the highest energies on down, effectively eliminated the pathways that, when impinged upon by incoming photons, could absorb them and then emit Raman-scattered photons.



What comes of interference, constructive and destructive

"People have always known that quantum interference is important in Raman scattering, but it's been hard to see," says Wang. "Here it's really easy to see the contribution of each state."

Removing quantum pathways one by one alters the ways they can interfere. The changes are visible in the Raman-scattering intensity emitted by the experimental device when it was illuminated by a beam of near-infrared laser light. Although the glow from scattering is much fainter than the near-infrared excitation, changes in its brightness can be measured precisely.

Feng Wang beside a diagram showing how lowering the Fermi energy eliminates quantum pathways in graphene (lower left). The upper plot reveals that when destructively interfering quantum pathways are blocked, Raman scattering intensity is strongly enhanced (pale blue vertical, labeled G). At the same scattering, and at specific values of the Fermi energy, the plot reveals “hot electron luminescence” (labeled H.L.). Credit: Roy Kaltschmidt, Lawrence Berkeley National Laboratory

"In classical physics, you'd expect to see the scattered light get dimmer as you remove excitation pathways," says Wang, but the results of the experiment came as a surprise to everyone. "Instead the signal got stronger!"

The scattered light grew brighter as the excitation pathways were reduced – what Wang calls "a canonical signature of destructive quantum interference."

Why "destructively"? Because phonons and scattered photons can be excited by many different, nondistinguishable pathways that interfere with one another, blocking one path can either decrease or increase the scattered light, depending on whether that pathway was interfering constructively or destructively with the others. In graphene, the lower- and higher-energy pathways interfered destructively. Removing one of them thus increased the brightness of the emission.
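
The counterintuitive brightening can be sketched with a toy amplitude model: quantum pathways add as complex amplitudes before being squared, so removing a destructively interfering term can raise the total intensity. The amplitudes below are invented for illustration and are not fitted to the graphene data:

```python
# Toy model of the effect described above. Each pathway contributes a
# complex amplitude; the observed intensity is |sum of amplitudes|^2.

paths = [1.0 + 0.0j, 0.8 + 0.1j, -1.5 + 0.2j]  # third path interferes destructively

def intensity(amplitudes):
    return abs(sum(amplitudes)) ** 2

full = intensity(paths)
blocked = intensity(paths[:-1])  # "dope away" the destructive pathway

print(f"All pathways open:        intensity = {full:.2f}")
print(f"Destructive path removed: intensity = {blocked:.2f}")
# Classical intuition says removing a path should dim the light;
# interference makes the emission brighter instead.
```
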

"What we've demonstrated is the quantum-interference nature of Raman scattering," Wang says. "It was always there, but it was so hard to see that it was often overlooked."

In a second observation, the researchers found yet another unexpected example of inelastic light scattering. This one, "hot electron luminescence," didn't result from blocked quantum pathways, however.

When a strong voltage is applied and the graphene's Fermi energy is lowered, higher-energy electron states are emptied from the filled band. Electrons that are highly excited by incoming photons, enough to jump to the unfilled band, thus find additional chances to fall back to the now-vacant states in what was the filled band. But these "hot" electrons can only fall back if they emit a photon of the right frequency. The hot electron luminescence observed by the researchers has an integrated intensity a hundred times stronger than the Raman scattering.


The road taken

The poet Robert Frost wrote of coming upon two roads that diverged in a wood, and was sorry he could not travel both. Not only can quantum processes take both roads at once, they can interfere with themselves in doing so.

The research team, working at UC Berkeley and at Berkeley Lab's Advanced Light Source, has shown that inelastic light scattering can be controlled by controlling interference between the intermediate states between photon absorption and emission. Manipulating that interference has enabled new kinds of quantum control of chemical reactions, as well as of "spintronic" states, in which not charge but the quantum spins of electrons are affected. Strongly enhanced Raman scattering can be a boon to nanoscale materials research. Hot luminescence is potentially attractive for optoelectronics and biological research, in which near-infrared tags – even weak ones – could be very useful.

"Likewise the phenomenon of hot electron luminescence, because it immediately follows excitation by a probe laser, could become a valuable research tool," says Wang, "particularly for studying ultrafast electron dynamics, one of the chief unusual characteristics of graphene."



More information: "Controlling Inelastic Light Scattering Quantum Pathways in Graphene," by Chi-Fan Chen, Cheol-Hwan Park, Bryan W. Boudouris, Jason Horng, Baisong Geng, Caglar Girit, Alex Zettl, Michael F. Crommie, Rachel A. Segalman, Steven G. Louie, and Feng Wang, appears in a forthcoming issue of Nature.

PhysOrg

Provided by Lawrence Berkeley National Laboratory

TStzmmalaysia
post Mar 17 2011, 09:14 AM



ENERGY

Attached Image

Underwater Kite Turbines Harvest Energy From Ocean Waves

Wave power generators generally take the form of sea-snake-like designs that sit atop rolling waves, gathering kinetic energy and turning it into power. However, a new idea combines the worlds of wind power and wave power to create underwater kite-turbines. Designed by Swedish renewable energy company Minesto, each underwater kite spans 8-14 meters and features a turbine attached to its underbelly. Each kite is tethered to the sea floor and “flies” with the tidal stream in a figure-8 motion.

Working under the same principles as the Seagen Turbine (pictured above), the kite-turbine’s blades are turned by the movement of passing tides. But the swooping motion of the kites amplifies the speed of the water flowing through the turbine tenfold – similar to the way sailboats gather speed by cutting across the wind. Furthermore, the kite has neutral buoyancy, so it doesn’t sink as the tide turns, and the turbine mouth is protected to keep fish from swimming through.
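
The tenfold speed amplification matters because extractable turbine power grows with the cube of flow speed. A quick sketch of that relationship (the rotor area and power coefficient below are assumed illustrative values, not Minesto figures):

```python
# Illustrative only: turbine power follows P = 0.5 * rho * A * Cp * v^3.
# Area and Cp here are assumptions for the sake of the arithmetic.

RHO_SEAWATER = 1025.0  # kg/m^3

def turbine_power_kw(area_m2, flow_m_s, cp=0.35):
    """Power (kW) extracted by a turbine of swept area `area_m2` in a flow."""
    return 0.5 * RHO_SEAWATER * area_m2 * cp * flow_m_s**3 / 1000.0

tidal = 1.5             # m/s: a flow too slow for first-generation turbines
amplified = 10 * tidal  # the kite's figure-8 sweep boosts flow through the rotor

p_static = turbine_power_kw(1.0, tidal)
p_kite = turbine_power_kw(1.0, amplified)
print(f"static: {p_static:.2f} kW/m^2, kite: {p_kite:.0f} kW/m^2, "
      f"ratio: {p_kite / p_static:.0f}x")
```

Cubing the tenfold speed gain yields a thousandfold power gain per unit of rotor area, which is why a small kite-mounted turbine can reach the 150-800 kW capacities quoted below.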

Anders Jansson, Minesto’s managing director, told the Guardian that the kite should work in flows of 1 – 2.5 meters per second, while first-generation devices need over 2.5 meters per second. Depending on the location and size of the kites, each will have a capacity of between 150 and 800 kW, and be deployed in waters 50-300 meters deep. The technology’s first test, at Strangford Lough in Northern Ireland, will be at one-tenth scale.

Over the next 18 months the UK Carbon Trust and Invest Northern Ireland will spend $564,000 to fund trials as the UK strives to meet its ever-more ambitious environmental goals. Harnessing the power of the ocean presents many challenges, not least the economic cost. Two turbines with a combined capacity of 1MW could cost over $3m, and commercial viability will rely heavily on economies of scale.

Inhabitat


TStzmmalaysia
post Mar 18 2011, 05:54 PM



APPLIED SCIENCES

Attached Image

Shockwave-Generating Wave Discs Could Replace Internal Combustion Engines

Michigan researchers have built a prototype of a new auto motor that does away with pistons, crankshafts and valves, replacing the old internal combustion engine with a disc-shaped shock wave generator. It could slash the weight of hybrid cars and reduce auto emissions by 90 percent.

The generator is about the size of a saucepot, and would replace the 1,000-pound power train in most cars — no transmission, cooling system, emissions regulation or fluids needed. Norbert Müller and colleagues at Michigan State University showed off the new motor prototype at a meeting with the Department of Energy’s Advanced Research Projects Agency.

It consists of a rotor carved with wave-like channels. Fuel and air enter through central inlets, and the rotor spins to block their exit through a separate outlet. The sudden build-up of pressure generates a shock wave, compressing the mixture. Then it’s ignited, and as the rotor keeps spinning, the outlet opens again to let the hot gases escape. New Scientist explains in further detail.

The novel generator would use about 60 percent of fuel for propulsion, according to MSU. This is a dramatic improvement over typical car engines, which use only 15 percent of fuel for forward movement. The system could also make cars 20 percent lighter, improving fuel economy even more.
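
The claimed gains compound, as a back-of-envelope calculation from the article's own numbers shows (treating the 20 percent weight saving as a proportional cut in propulsion demand, which is a simplification):

```python
# Rough arithmetic from the figures quoted by MSU; not an engine model.

conventional_eff = 0.15  # fraction of fuel energy reaching the wheels today
wave_disc_eff = 0.60     # claimed for the shock wave generator

fuel_ratio = conventional_eff / wave_disc_eff  # fuel needed per unit of work
print(f"fuel per unit of propulsion work: {fuel_ratio:.2f}x the conventional amount")

weight_factor = 0.80  # the car is 20 percent lighter
combined = fuel_ratio * weight_factor
print(f"with the weight saving: {combined:.2f}x, i.e. about "
      f"{(1 - combined) * 100:.0f}% less fuel")
```

In other words, the same propulsion work would take roughly a quarter of the fuel, and the lighter vehicle pushes the combined saving toward 80 percent, under these simplifying assumptions.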

Müller said he hopes to have an even larger 25-kilowatt prototype by the end of this year.

PopSci

TStzmmalaysia
post Mar 18 2011, 05:56 PM



BIOTECHNOLOGY

Attached Image

Wearable Scanner Opens New Frontier in Neuroscience

A tiny wearable scanner has been used to track chemical activity in the brains of unrestrained animals for the first time. By revealing neurological circuitry as the subjects perform normal tasks, researchers say, the technology could greatly broaden the understanding of learning, addiction, depression, and other conditions.

The device was designed to be used with rats—the main animal model used by behavioral neuroscientists. But the researchers who developed the device, at Brookhaven National Laboratory, say it would be straightforward to engineer a similar device for people.

Positron emission tomography, or PET, is already broadly used in neuroscience research and in clinical treatment. It allows researchers to track the location of radioactively labeled neurotransmitters (the chemicals that carry signals between neurons) or drugs within the brain. Images of the way neurotransmitters and drugs move through the brain can reveal the processes that underpin normal behavior such as learning as well as pathologies including addiction. PET has been used to map drug-binding sites in the brains of addicts and healthy people, and to study how those sites change over time and with therapy.

A conventional PET scanner is so large that these studies have to be performed with the subject lying inside a large tube. Large photomultiplier tubes amplify signals from gamma rays emitted by labeled chemicals in the brain. The signals then pass through a desk-sized rack of electronics that process them and map them to a particular region of the brain. To get good readings during animal studies, the subjects are typically anaesthetized or restrained. What's being measured is not normal waking behavior.

"We have very limited data about what brains do in the real world," says Paul Glimcher, professor of neuroscience, economics, and psychology at New York University. Glimcher was not involved with the work.

The new portable scanner is designed to provide the same information about brain chemistry while an animal behaves naturally. It is small and lightweight enough that a rat can carry it around on its head. "[The rat] can move freely, interact with other animals, and at the same time we can make a 3-D map of, for example, dopamine receptors throughout the brain," says David Schlyer, a senior scientist at Brookhaven who led the work.

Schlyer's group worked for years to engineer a miniature PET scanner that could be worn by a moving subject. The device consists of a metal ring hanging from a support structure that bears the device's weight and allows the rat to move around. The rat's head goes inside the ring, which contains both detectors and electronics.

The key to miniaturizing the device, Schlyer says, was integrating all the electronics for each detector in the ring on a single, specialized chip. An avalanche photodiode also replaces the large photomultiplier tubes of conventional PET, amplifying the signals emitted by the labeled chemicals in the brain. "The rats take about an hour to acclimate, then begin behaving normally," says Schlyer. The Brookhaven device is described this week in the journal Nature Methods.

Technology Review

TStzmmalaysia
post Mar 18 2011, 05:57 PM



RESEARCH

Attached Image

Researchers demonstrate self-repairing chip

As chips continue to get smaller, the technological possibilities just get larger. One of the trade-offs of miniaturization, however, is that smaller things are also often more fragile and less dependable. Anticipating a point at which chips will become too tiny to maintain their current level of resilience, a team of four companies and two universities in The Netherlands, Germany, and Finland have created what they say could be the solution – a chip that monitors its own performance, and redirects tasks as needed.

"Because of the rapidly growing transistor density on chips, it has become a real challenge to ensure high system dependability," said Hans Kerkhoff of The Netherlands' University of Twente, and part of the CRISP (Cutting-edge Reconfigurable ICs for Stream Processing) consortium. "The solution is not to make non-degradable chips, it's to make architectures that can degrade while they keep functioning, which we call graceful degradation."

In order to make that graceful degradation possible, the CRISP chip incorporates multiple cores. Different tasks are assigned to different cores, by a built-in resource manager. The connections of those cores are continuously tested, and when a fault is detected, the task assigned to that core is simply reallocated to another one.
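
The reallocation scheme described above can be sketched in software terms. This is a toy model with invented names, not CRISP's actual interfaces (the real chip does this in hardware): a resource manager assigns tasks to the least-loaded healthy core and, when a built-in test reports a fault, migrates that core's tasks elsewhere.

```python
# Toy sketch of graceful degradation via task reallocation.
# All class and method names are illustrative assumptions.

class ResourceManager:
    def __init__(self, num_cores):
        self.healthy = set(range(num_cores))
        self.assignments = {}  # task name -> core index

    def _load(self, core):
        return sum(1 for c in self.assignments.values() if c == core)

    def assign(self, task):
        if not self.healthy:
            raise RuntimeError("no healthy cores left")
        core = min(self.healthy, key=self._load)  # least-loaded healthy core
        self.assignments[task] = core
        return core

    def report_fault(self, core):
        # Retire the faulty core and migrate its tasks to the survivors.
        self.healthy.discard(core)
        for task in [t for t, c in self.assignments.items() if c == core]:
            del self.assignments[task]
            self.assign(task)

rm = ResourceManager(num_cores=4)
for task in ("decode", "filter", "output"):
    rm.assign(task)
rm.report_fault(0)  # a detected fault retires core 0; its tasks move on
```

Every task stays assigned, just across fewer cores, which matches the article's point: the chip is no stronger, but it keeps functioning as parts of it degrade.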

Although the chip itself isn't actually any stronger, it can function at full capacity for a longer period of time.

CRISP's self-testing, self-repairing chip was recently demonstrated at the DATE2011 conference in Grenoble, France.

Gizmag

TStzmmalaysia
post Mar 18 2011, 05:59 PM



RESEARCH

Attached Image

Skinput turns your hand into a touchscreen and your fingers into a keypad

Always thought your skin was more than just a thing to stop your insides falling out? Well, you were right. Chris Harrison has developed Skinput, a way in which your skin can become a touchscreen device, or your fingers the buttons on an MP3 controller. Harrison says that as electronics get smaller and smaller, they become better suited to being worn on our bodies, but the monitor and keypad/keyboard still have to be big enough for us to operate the equipment. This can defeat the purpose of small devices, but with clever acoustics and impact-sensing software, Harrison and his team can give your skin the same functionality as a keypad. Add a pico projector attached to an armband, and your wrist becomes a touchscreen.

In the past, Harrison says he has used tables and walls as touch screens but has experimented using the surface area of our bodies because technology is now small enough to be carried around with us and we can’t always find an appropriate surface.

A third-year PhD student in the Human-Computer Interaction Institute at Carnegie Mellon University, Harrison says we have roughly two square meters of external surface area, and most of it is easily accessible by our hands (e.g. arms, upper legs, torso).

He has used the myriad sounds our body makes when tapped by a finger on different areas of, say, an arm, a hand or other fingers, and married these sounds to computer functions.

A beauty of this type of functionality is our ability to “operate” our body without needing to use our eyes: we can snap our fingers, touch the tip of our nose or pull our ear without having to look. This sense is called proprioception.

Harrison says this ability is great for operating equipment while on the move – say, changing tracks on an MP3 player while out jogging, answering a phone call or starting a stopwatch. Not to mention that the possibility of using your hand as a calculator means you really can count on your fingers.

The team has created its own bio-acoustic sensing array that is worn on the arm, meaning that no electronics are attached to the skin. Harrison explains that when a finger taps the body, bone densities, soft tissues, joint proximity and so on affect the sound the motion makes. The software he has created recognizes these different acoustic patterns and interprets them as function commands.
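
The classification step behind this can be illustrated with a toy nearest-centroid classifier. The locations and two-number feature vectors below are invented for the sketch; the real system trains on much richer bio-acoustic features.

```python
# Toy version of mapping a tap's acoustic signature to a body location.
# Centroids and features are made up; illustration only.

import math

# Pretend trained centroids: (low-freq energy, high-freq energy) per location.
centroids = {
    "wrist":     (0.9, 0.2),
    "forearm":   (0.6, 0.5),
    "fingertip": (0.2, 0.9),
}

def classify(sample):
    """Return the tap location whose centroid is nearest to `sample`."""
    return min(centroids, key=lambda loc: math.dist(sample, centroids[loc]))

print(classify((0.85, 0.25)))  # prints "wrist": closest to that signature
```

Each new tap is simply assigned to whichever stored signature it most resembles, which is why distinct-sounding body locations can act as distinct "buttons".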

He says he has achieved accuracies as high as 95.5 percent and enough buttons to control many devices; his video even shows someone playing Tetris using only their fingertips as a controller.

Harrison's research paper, co-authored by Desney Tan and Dan Morris from Microsoft Research and titled "Skinput: Appropriating the Body as an Input Surface," will appear in Proceedings of the 28th Annual SIGCHI Conference on Human Factors in Computing Systems (Atlanta, Georgia) in April.

Gizmag

TStzmmalaysia
post Mar 18 2011, 06:02 PM



ROBOTICS

Attached Image

Rubbery muscle motors to make robots more lifelike

IT WOBBLES like a jelly, but could make robots more flexible than ever before. Soft artificial muscles have been used to make a motor with only a few parts, and no gears, bearings or cogs.

The motor signals a new dawn for artificial muscles, says Iain Anderson, head of the Auckland Bioengineering Institute's Biomimetics Lab in New Zealand, where it was created.

The muscles themselves are electroactive structures consisting of two layers of conducting carbon grease separated by an extremely stretchy insulating polymer film, says Anderson. "It can stretch by more than 300 per cent."

When a voltage is applied, the configuration behaves like a capacitor, with positive and negative charges accumulating on either side of the insulator. As the opposite charges attract one another the insulator is squashed between them and flattens and stretches. Turn the voltage off and it contracts again to its original size.

The motor looks rather like a bicycle wheel, with the elastic muscles stretched between the edge of the wheel and the centre, like flat spokes. To turn a shaft, six of the muscles work in concert, contracting one after the other. Although the device looks as if it is wobbling like jelly, the spokes are connected to a foam ring wrapped tightly around the central shaft, and this arrangement exerts a continuous rotational force.
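
The firing pattern works much like electronic commutation in a stepper motor. A minimal sketch of the round-robin sequence, with an invented step size (the article does not give the rotation per contraction):

```python
# Toy commutation model: six muscle "spokes" fire in sequence so the foam
# ring walks the shaft around. The step size is purely illustrative.

NUM_MUSCLES = 6
STEP_DEG = 360 / NUM_MUSCLES  # assumed: one sixth of a turn per firing

def run_motor(firings):
    """Fire muscles in round-robin order; return (firing order, shaft angle)."""
    order = [step % NUM_MUSCLES for step in range(firings)]
    angle = (firings * STEP_DEG) % 360
    return order, angle

order, angle = run_motor(9)  # fires muscles 0..5, then 0..2 again
print(order, angle)
```

Because each spoke hands off smoothly to the next via the foam ring, the real motor produces continuous torque rather than the discrete steps this toy suggests.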

This is not the first time electroactive polymers (EAPs) have been used to create rotary motion, says Anderson. But previous efforts used a sort of ratcheting mechanism instead of the foam ring. Anderson's design removes the need for bearings, gears or anything else that is rigid.

"There's huge potential for this kind of actuator," says Chris Melhuish, director of the Bristol Robotics Lab in the UK. "We are going to have a different class of robot." Robots made of artificial muscle would feel soft and flesh-like and would be able to mimic the dexterity and mobility of living creatures, without the need for rigid mechanical components.

These kinds of EAPs are extremely strong, says Yosef Bar-Cohen, who specialises in electroactive materials at NASA's Jet Propulsion Laboratory in Pasadena, California - many times stronger than their biological counterparts. The simplified motor that Anderson has built opens up a whole new range of uses for artificial muscles, he says.

VIDEO

For example, they could be used to make instruments for keyhole surgery that are soft enough to be squeezed through tiny incisions but still able to perform the jobs of more rigid mechanical devices.

A company called Artificial Muscle in Sunnyvale, California, is developing EAP-based motors designed to behave like haptic displays, which respond when they are touched. They provide tactile feedback for cellphones, computer mice and touchscreens, producing a range of different feelings beneath a user's fingertips. They can produce the satisfying "click" sensation of a real button when typing on a touchscreen, for example.

The first of these, designed for the iPhone, will become available in May. It will replace the traditional vibrate function, and its diaphragm-like design allows it to respond much faster than the motors currently used in cellphones and produce a broader range of frequencies, says Andy Chen of Artificial Muscle.

Anderson's design will be presented at the Electroactive Polymer Actuators and Devices conference in San Diego, California, this week, along with the iPhone device. Also on display will be EAPs capable of generating electricity by a reverse process, creating a current when physically pumped, as well as robots that can harvest their own power and muscles that provide sensory feedback.

New Scientist

TStzmmalaysia
post Mar 18 2011, 06:03 PM



ROBOTICS

Attached Image

The Adept Quattro robot beats iPhone game 1to50 in 6.67 seconds

Would you want a robot to play your iPhone games for you? Adept Technologies thinks so. The company has created the Adept Quattro, a robot that plays iPhone games and does it pretty well. Its current game du jour is 1to50 – for those not familiar with it, a game that asks you to tap the numbers one to fifty on your screen as fast as you can.

A good human time is about 20 seconds, with the fastest human player clocking in at 7.85 seconds. The Adept Quattro can beat the game in about 6.67 seconds. The feat is all the more impressive when you consider that most human players who achieve a good time use multiple fingers, while the Adept Quattro uses only one.

The funny part is that the Adept Quattro is actually faster than the app: once it clears the screen, it has to wait for the app to catch up.

Real-world applications for the Adept Quattro are mostly industrial: the machine is built to move products between assembly lines and to sort goods into piles for packaging.
This isn't the Adept Quattro's first attempt at showing off. On April 21, 2010, this same model of robot was pitted against a human armed with a Wii remote. So really, this is its second victory.



PhysOrg

TStzmmalaysia
post Mar 18 2011, 06:05 PM



ENERGY

Attached Image

CryoEnergy System uses liquid air to store energy

Balancing demand for energy with timely production is a juggling act that is particularly relevant to renewable sources such as wind and solar. Because the wind isn't always blowing and the sun isn't always shining, the energy produced by these systems needs to be stored efficiently so it can be used when it's needed. While some scientists are looking into storing such energy by converting it to natural gas, Britain's Highview Power Storage has its own approach, which is already in use in a pilot project. In a nutshell, the company is storing excess energy as liquid air.

In Highview's CryoEnergy System (CES), excess energy is used to run refrigeration units which cool air down to a temperature of -196C (-320.8F), at which point it liquefies. The liquid air, also known as cryogen, can be stored in an insulated tank at an ambient pressure of about 1 bar.

At higher-demand periods, when the direct output of existing energy sources can't meet the needs of the municipal power grid, the liquid air is released into a confined space. The liquid boils as soon as it is heated above -196C, so even room temperatures will superheat it, causing it to regasify and expand in volume by approximately 700 percent. From there, a steam engine effect comes into play, with the high-pressure gas spinning a turbine which in turn powers a generator.

When exposed to ambient air temperatures, the liquid air returns about 50 percent of the energy that went into creating it. If exposed to heated air, however, the phase change from liquid to gas is more intense, resulting in an efficiency of up to 70 percent. If a CES were to be installed at an existing facility where waste heat were already present, the system could use that heat to boost its own efficiency. Conversely, because the only by-product of the system is cold air, it could also be used to provide air conditioning, refrigeration, or even to create more liquid air.
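
The round-trip arithmetic follows directly from the percentages quoted above. A sketch, where the stored amount is an assumed example rather than a Highview specification:

```python
# Rough numbers from the article: ~50% of input energy is returned at
# ambient temperature, up to ~70% when waste heat drives the regasification.

def recovered_kwh(stored_kwh, waste_heat=False):
    """Energy returned by the CES for a given amount of input energy."""
    efficiency = 0.70 if waste_heat else 0.50
    return stored_kwh * efficiency

stored = 1000.0  # kWh of surplus wind/solar energy used to liquefy air
print(f"ambient:    {recovered_kwh(stored):.0f} kWh back")
print(f"waste heat: {recovered_kwh(stored, waste_heat=True):.0f} kWh back")
```

The 20-point gap is why siting a CES next to a facility with free waste heat changes the economics so much: the same stored liquid air returns 40 percent more electricity.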

Currently, one of the most common forms of large-scale energy storage is the pumped hydro method. In this system, excess energy is used to pump water from one body of water up to a reservoir at a higher elevation. When power is required, water is released from that reservoir through a dam, spinning turbines as it cascades back down. While that system is more energy-efficient than CES, it is also reportedly more expensive to build and operate, requires a mountainous topography, offers a much lower energy density, and the energy it stores isn't portable. Tanks of liquid air, on the other hand, can be loaded onto a truck and transported to where power is needed – assuming that the energy required to run the truck is less than the amount that is being delivered.

Attached Image

According to a study conducted by consulting firm Frost and Sullivan, the use of batteries for energy storage is inferior to CES, when all criteria are combined. While some types of batteries offer more energy efficiency, they cost about US$4,000 per kilowatt of generating capacity, while CES only costs a quarter of that amount.

A 300-kilowatt pilot demonstrator of the CryoEnergy System has been in use at the Slough Heat & Power plant in England for the past nine months, where it has been utilizing the plant's waste heat and regularly exporting electricity to the national grid. So far, the air has been liquefied off-site, but the next phase of the project will integrate a liquefier into the system. The company plans to have a 3.5-megawatt commercial-scale plant operational by late next year, with plans to increase its capacity to 8 to 10 megawatts by early 2014.

Gizmag

TStzmmalaysia
post Mar 19 2011, 06:12 PM



BIOTECHNOLOGY

Attached Image

Scientists find a key to maintaining our DNA

DNA contains all of the genetic instructions that make us who we are, and maintaining the integrity of our DNA over the course of a lifetime is a critical, yet complex part of the aging process. In an important, albeit early step forward, scientists have discovered how DNA maintenance is regulated, opening the door to interventions that may enhance the body's natural preservation of genetic information.

The new findings may help researchers delay the onset of aging and aging-related diseases by curbing the loss or damage of our genetic makeup, which makes us more susceptible to cancers and neurodegenerative diseases, such as Alzheimer's. Keeping our DNA intact longer into our later years could help eliminate the sickness and suffering that often goes hand-in-hand with old age.

"Our research is in the very early stages, but there is great potential here, with the capacity to change the human experience," said Robert Bambara, Ph.D., chair of the Department of Biochemistry and Biophysics at the University of Rochester Medical Center and leader of the research. "Just the very notion is inspiring."

In the Journal of Biological Chemistry, Bambara and colleagues report that a process called acetylation regulates the maintenance of our DNA. The team has discovered that acetylation determines the degree of fidelity of both DNA replication and repair.

The finding builds on past research, which established that as humans evolved, we created two routes for DNA replication and repair – a standard route that eliminates some damage and a moderate amount of errors, and an elite route that eliminates the large majority of damage and errors from our DNA.

Only the small portion of our DNA that directs the creation of all the proteins we are made of – proteins in blood cells, heart cells, liver cells and so on – takes the elite route, which uses much more energy and so "costs" the body more. The remaining majority of our DNA, which is not responsible for creating proteins, takes the standard route, which requires fewer resources and moves more quickly.

But scientists have never understood what controls which pathway a given piece of DNA goes down. The study authors found that, like a policeman directing traffic at a busy intersection, acetylation directs which proteins take which route, favoring the protection of protein-coding DNA by shuttling it down the elite, more accurate course.

"If we found a way to improve the protection of DNA that guides protein production, basically boosting what our body already does to eliminate errors, it could help us live longer," said Lata Balakrishnan, Ph.D., postdoctoral research associate at the Medical Center, who helped lead the work. "A medication that would cause a small alteration in this acetylation-based regulatory mechanism might change the average onset of cancers or neurological diseases to well beyond the current human lifespan."

"Clearly, a simple preventative approach would be a key, not to immortality, but to longer, disease-free lives," added Bambara.

DNA replication is an intricate, error-prone process that takes place when our cells divide and our DNA is duplicated. Duplicate copies of DNA are first made in separate pieces that must later be joined to create a new, full strand. The first half of each separate DNA segment usually contains the most errors, while mistakes are less likely to appear in the latter half.

For DNA that travels down the standard route, the first 20 percent of each separate DNA segment is tagged, cut off and removed. This empty space is then backfilled with the latter part – which is the more accurate section – of the adjoining piece of DNA as the two segments come together to form a full strand.

In contrast, DNA that travels down the elite route gets special treatment: the first 30 to 40 percent of each separate DNA segment is tagged, removed and backfilled, meaning more mistakes and errors are eliminated before the segments are joined. The end result is a more accurate copy of DNA.
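
The difference between the two routes can be illustrated with a toy simulation built on the article's premise that errors concentrate in the first half of each segment. All numbers here are invented for illustration; this is not a model of real replication chemistry.

```python
# Toy model: trimming and backfilling the first fraction of each segment
# removes any errors in that region. Elite route trims more, so fewer survive.

import random

random.seed(0)
SEGMENT_LEN = 100

def make_segment():
    """Error positions, drawn mostly from the error-prone first half."""
    first_half = {random.randint(0, 49) for _ in range(8)}
    second_half = {random.randint(50, 99) for _ in range(2)}
    return first_half | second_half

def errors_after_trim(errors, trim_fraction):
    """Errors surviving after the first `trim_fraction` is cut and backfilled."""
    cutoff = int(SEGMENT_LEN * trim_fraction)
    return {pos for pos in errors if pos >= cutoff}

segments = [make_segment() for _ in range(1000)]
standard = sum(len(errors_after_trim(s, 0.20)) for s in segments)  # ~20% trimmed
elite = sum(len(errors_after_trim(s, 0.40)) for s in segments)     # ~40% trimmed
print(f"errors left, standard route: {standard}, elite route: {elite}")
```

Because every error removed by the 20 percent trim is also removed by the 40 percent trim, the elite route always ends with at most as many errors, and in practice far fewer.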

The same situation occurs with the DNA repair process, as the body works to remove damaged pieces of DNA.

Unlike the current work, the majority of aging-related research zeroes in on specific agents that damage our DNA, called reactive oxygen species, and how to reduce them. The new research represents a small piece of the pie, but has the potential to be a very important one.

Bambara's team is investigating the newly identified acetylation regulatory process further to figure out how they might be able to intervene to augment the body's natural safeguarding of important genetic information. They are studying human and yeast cell systems to determine how proteins in cells work together to trigger acetylation, which adds a specific chemical to the proteins involved in DNA replication and repair. Researchers are manipulating cells in various ways, through damage or genetic alterations, to see if these changes activate or influence acetylation in any way.

Though they are far from identifying compounds or existing drugs to test, they do see this research having an impact in the future.

"The translational rate is becoming better and better. Today, the course between initial discovery and drug development is intrinsically faster. I could see having some sort of therapeutic that helps us live longer and healthier lives in 25 years," said Bambara.

EurekAlert

TStzmmalaysia
post Mar 19 2011, 06:16 PM



ENERGY

Attached Image

Wave Energy Prototype, 'SeaRay,' Exceeds Expectations

Add another notch to Oregon’s growing wave power industry. The case for commercialized wave energy is enjoying another surge forward now that Columbia Power Technologies has officially deployed a prototype wave energy device and secured fresh funding from both private and government backers.

Just a few months ago we reported that the Corvallis, Oregon company appeared to be gaining ground in the effort to fund the next phase of its R&D. Now, their prototype device, called SeaRay, is floating in the Puget Sound and sending back performance data for analysis.

“The SeaRay is performing beyond our expectations and tracking well with modeling predictions,” said Reenst Lesemann, CEO of Columbia Power Technologies. “Our task is to demonstrate to utilities and independent power producers that we can help them deliver power predictably, reliably, and at a cost that is competitive. At this stage, we are making this happen in a very rapid and capital-efficient manner.”

According to Columbia Power Technologies, the SeaRay's design allows it to extract up to twice the energy from ocean waves as other developing technologies. By employing what the company calls a "heave and surge" energy-capture design, the SeaRay reportedly taps the full energy potential of passing waves. Its design also appears to leave it well conditioned to survive a harsh battering at sea.

Columbia Power Technologies indicated its longer-term goal is "to deliver megawatt-scale devices, capable of operating in the widest range of temperate zone coastal load centers around the globe." To do that, they'll need funding and, it would seem, they now have it. Though the amount of funding attained was not disclosed, Columbia Power Technologies did confirm that private backers were on board, saying: "…the closing of Columbia Power's recent private capital signifies excellent validation of the company's vision and technical development capabilities."

For those who wonder if there is money to be made from wave power for companies like Columbia, consider this: according to the start-up, "the world's oceans are estimated to contain enough practically extractable energy to provide over 6,000 terawatt hours of electricity each year, which is enough to power over 600 million homes and is worth over $900 billion annually." It looks like there might be gold in them there ocean waves after all.
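
Those quoted figures are internally consistent, as a quick back-of-envelope check shows:

```python
# Sanity check of the company's numbers: 6,000 TWh/yr, 600 million homes,
# $900 billion/yr. Back out the implied per-home usage and price per kWh.

annual_twh = 6000.0
homes = 600e6
value_usd = 900e9

kwh_total = annual_twh * 1e9       # 1 TWh = 1e9 kWh
kwh_per_home = kwh_total / homes   # implied annual usage per home
price_per_kwh = value_usd / kwh_total

print(f"{kwh_per_home:.0f} kWh per home per year")
print(f"${price_per_kwh:.2f} per kWh")
```

The implied 10,000 kWh per home per year and roughly $0.15 per kWh are both plausible ballpark figures, so the claim at least hangs together arithmetically.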

Huffington Post

TStzmmalaysia
post Mar 20 2011, 02:54 PM



APPLIED SCIENCES

Attached Image

Pepsi creates plastic bottle from 100% plant material

Regardless of your preference for Pepsi or Coca-Cola, buying either in bottles isn't great for the environment. The petroleum-based plastic used to create them can take decades to break down and can't easily be recycled.

Coca-Cola has already started experimenting with renewable materials, which make up around 30% of each of its bottles. But Pepsi has gone a step further and managed to make a new plastic bottle consisting only of plant materials, breaking its reliance on petroleum.

The new bottles use a mix of switchgrass, pine bark, corn husks, and a number of other renewable materials. Importantly, Pepsi has managed to make the new bottle indistinguishable from the old one, so it will have no impact on marketing or brand recognition. It has done this by matching the molecular structure of the old bottles using the plant materials, so not only does it look the same, but it has the same feel and strength.

The best news is for the environment, though. Pepsi produces billions of bottles every year and starting in 2012 they will begin moving over to renewable bottle production that’s 100% recyclable. Eventually all Pepsi bottles will be made of plant materials.

Anyone worried about Pepsi running out of the plant materials used to produce so many bottles shouldn't be. The design of the bottle allows other plant materials, including orange peel, potato peel and oat hulls, to be used as alternatives in the mix. Other plant materials are also being considered, making for a very versatile plastic that could certainly be used in other industries.

Geek.com

TStzmmalaysia
post Mar 20 2011, 02:57 PM



APPLIED SCIENCES

Attached Image

'The Pearl' dome house - passive solar design with a touch of high-tech

Like its stablemate the Domespace house, David Fanchon's eco-friendly design is aimed at maximizing passive solar energy – though unlike the Domespace there's no rotating option. Dubbed "The Pearl," the standout features of the elegant domed structure are its integrated solar panels which can be adjusted to different angles to provide additional shade and optimize energy collection through the changing seasons.

The pictures tell the story of the way in which The Pearl takes advantage of passive solar principles. Large south-facing (or north-facing if you reside below the equator) bay windows fitted with an automated venting system soak up the winter sun and allow light to enter every room, while the white steel roof reflects the sun in summer.

Some additional energy saving options are not as apparent from the designs – the roofing shell can be insulated with a layer of air and cork beads (>R28), external walls are made of 12" thick compressed straw and the design can incorporate geothermal and wood pellet fed heating systems. There's also a rain water storage tank located at the base of the northern pedestal.

The aerodynamic dome shape delivers protection from high winds and wild weather and the arch shape also provides resistance to earthquakes.

The timber is FSC certified and the interior layout is fully customizable – the trick would be to make sure your property's best views lie to the south so you can make the most of the full 180-degree view from the main living area.

Gizmag

More info at the Solaleya site.


TStzmmalaysia
post Mar 20 2011, 02:59 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


APPLIED SCIENCES

Attached Image

'Domespace' is Rotating Wooden House

Taking up a large section of the Eco Habitat zone at the recent Viv'expo exhibition in Bordeaux was a walk-in cutaway model of a rotating wooden house known as Domespace. Built on a central concrete pedestal, the Domespace home benefits from little or no damp penetration, and its aerodynamic shape has been found to be resistant to cyclonic winds of up to 174mph (280kph). It also makes the most of passive solar energy, has a central chimney with a designer open fire and is surprisingly spacious.

Approaching a Domespace home from a distance, you might be forgiven for thinking that you've wandered onto the set of a 1950s B-movie about aliens invading the earth. However, this particular saucer doesn't fly, even in high winds. In tests, the curved structure has been found to stand firm in winds of up to 149 mph (240kph), although one home in Taiwan managed to survive Cyclone Tim's winds of around 174mph in 1994.

The protective cocoon doesn't stop at wind resistance either. Domespace says that "every Domespace is erected over an elastomeric belt that works as a 'silencer block'... like a piece of rubber that cushions vibrations." The structure has been found to withstand earthquake activity registering up to eight on the Richter scale.


A wooden cocoon

The first thing to hit us as we wandered through the cutaway at Bordeaux's Viv'expo exhibition was the pleasant smell of wood and the feeling of warmth it seemed to generate, even in the fairly bland environment of the exhibition hall. In the center of the demonstration model sat a chimney which would normally house an open fireplace, many of which have been designed by Dominique Imbert.

Although placing an open fire chimney at the heart of a wooden house might seem like you're asking for trouble, the designers point out that the glued laminated timber construction is actually quite resistant to fire. Indeed, the American Institute of Timber Construction agrees that if fire breaks out in buildings using such heavy timber construction, the "wood retains a significantly higher percentage of its original strength for a longer period of time, losing strength only as material is lost through surface charring. Fire fighting is safer due to elimination of concealed spaces and the inherent structural integrity of large glued laminated timbers."

A Domespace home is constructed using FSC-certified (or equivalent) wood, including spruce beams, red cedar roof, cork or pulped wood insulation and plywood or oriented strand board. Strategically-placed sloped openings on both static and rotating versions of the design are said to benefit from light pouring through at some point during the day, brightening up the interior with more natural light than vertical windows can offer.

One disadvantage to such sloped windows is that when it rains heavily, it may sound like the ghosts of Cozy Powell and Keith Moon are having a showdown on the roof, but the double glazing is said to help a little. Shutters can be installed on the outside which may also help in this regard.



Following the sun

Those choosing the rotating Harmonique version are offered remote control positioning to make the most of passive solar energy to heat or cool areas of the house, or to ensure that any externally-fitted photovoltaic panels are given maximum exposure to bright sunlight, or even just to change the view. Also, being able to rotate your house about an axis means that if the neighbors are throwing a party, rather than move over to the quieter side of the house, you can just move the house around to suit.

The designers say that rotation is slow and smooth enough so as to be hardly noticeable – the user can choose between one and four inches per second with either manual or automated options available. The system doesn't simply spin around and around but rather swivels anywhere from 180 degrees to 330 and back again. Flexible utility cables running through the central structure turn with the house using much the same principle that allows us to drink water while we turn our heads.

Should the electric motor that drives the movement fail, the house can also be moved with a bit of manual persuasion. Of course, if the home is built on a sloping landscape, you'd need to be mindful of returning the door to a safe position before attempting to exit.
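For a sense of how slow that rotation really is, here is a back-of-the-envelope sketch. It assumes the quoted one-to-four-inches-per-second figure is the linear speed at the dome's outer edge, using the 7.2 m radius of the model described in this post:

```python
import math

RADIUS_M = 7.2    # radius of the 23.6 ft (7.2 m) model described in this post
INCH_M = 0.0254   # meters per inch

def sweep_time_minutes(sweep_deg, rim_speed_in_per_s):
    """Minutes needed to rotate through sweep_deg degrees, assuming the
    quoted speed is the linear speed at the dome's outer edge."""
    arc_m = 2 * math.pi * RADIUS_M * (sweep_deg / 360.0)
    return arc_m / (rim_speed_in_per_s * INCH_M) / 60.0

# Full 330-degree swivel at the fastest (4 in/s) and slowest (1 in/s) settings:
print(f"fastest: {sweep_time_minutes(330, 4):.1f} minutes")  # about 6.8 min
print(f"slowest: {sweep_time_minutes(330, 1):.1f} minutes")  # about 27 min
```

Even at full speed the house takes nearly seven minutes to complete its maximum swivel, which squares with the claim that the motion is barely noticeable.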

Other claimed benefits of the basic design include little (if any) penetrating humidity thanks to the bottom of the home being raised off the ground on a concrete pedestal, and marked energy savings inherent in the design. Green power and heating technologies such as photovoltaic, geothermal, aerothermal and water recycling can also be included if required. Such a design also lends itself to placement on land that's otherwise harder to develop, such as a steep incline, mountain side or ocean front.


Building a Domespace home

The company doesn't actually build a Domespace home itself. Rather, it provides the plans and materials, and is on hand to help with any design, construction and planning issues. There are a number of different build options and various sizes available. For example, a 23.6 feet (7.2m) radius property is said to offer some 2,260 square feet (210 square meters) of total living space over two floors (although due to the shape of the design, actual usable living space is likely to be in the region of 1,754 square feet, or 163 square meters).
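Those figures check out with simple geometry: the roughly 1,754 sq ft of usable space quoted above is almost exactly the footprint of one full-radius floor.

```python
import math

RADIUS_M = 7.2          # radius of the 23.6 ft model
SQM_TO_SQFT = 10.7639   # square feet per square meter

footprint_sqm = math.pi * RADIUS_M ** 2
footprint_sqft = footprint_sqm * SQM_TO_SQFT

print(f"Ground-floor footprint: {footprint_sqm:.0f} m^2 "
      f"({footprint_sqft:.0f} sq ft)")  # about 163 m^2, or about 1753 sq ft
```

The second floor and the sloping outer shell account for the difference between that usable figure and the 2,260 sq ft total.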

Buyers could go down the do-it-yourself route, whereby Domespace would provide all of the materials needed to build the home, with the buyer putting all the bits together and hopefully not being left with any spare bits at the end. The company says that the most popular option is for a contractor to erect the basic structure and then leave the customer to complete the final internal design stage. The path of least resistance is to engage a specialized company to do the lot.

It generally takes anywhere from six to nine months for a Domespace construction to be completed, and while every construction shares common characteristics, opportunities for unique installations abound, opening up numerous domestic, commercial or community possibilities.

The brainchild of Patrick Marsilli – who is said to have been inspired by similar shapes used in nature, traditional human dwellings and even churches and cathedrals – the very first Domespace was built in 1988 in Brittany using sustainably-sourced wood for its construction. Since then, hundreds of dome residences have popped up in France, Switzerland, Germany, Spain, Taiwan, and the United States. Visits to existing installations may be possible by contacting the company.

Domespace construction is offered in rotating Harmonique or static Elevation versions of various sizes and from one to three floors. There's also a smaller Transit version available for use as mini-lofts, guest rooms and so on. U.S. readers can download a Domespace brochure from Solaleya in zip format and more detailed information is available from Domespace International.

Gizmag

More information on Domespace.com




TStzmmalaysia
post Mar 20 2011, 03:04 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

Tiny iron oxide particles promise big benefits for display technology

Chemists at the University of California, Riverside, are developing a future display technology using nanoscale iron oxide rods that shine when exposed to an external magnetic field. Though in its early stages, the research could pave the way for magnetically responsive, ultra-high-resolution displays with significantly reduced dimensions and power demands.

The researchers have shown in the past that by using a simple magnet, the color of iron oxide particles suspended in water can be manipulated in response to the strength and orientation of a magnetic field.

The latest development is the application of silica to the iron oxide particles to form chains of light-emitting particles that diffract visible light into brilliant colors when a magnetic field is applied.

To do this, a thin layer of silica is applied to iron oxide particles in a water solution. Then, a magnetic field is applied to assemble the particles into chains. Next, in order to stabilize the chain structure, the chains are coated with additional silica to form a shell, creating tiny rods, or nanorods. When an external magnetic field is applied, the nanorods align themselves parallel to one another like a set of tiny flashlights.

It is the arrangement of the nanorods that effectively diffracts light and displays a color, while it is the spacing between the particles that determines the actual color that is shown.

Currently, the process can show a single color as the particle spacing is fixed after the silica coating is applied to the chains. In order to show different colors the researchers explain that the nanorods could be used in clusters of different sizes, effectively varying the interparticle spacing. The researchers are now working on achieving two colors, one at a time. If successful, this would allow a screen or pixel to display one color for a while, and a different one later.
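The spacing-to-color relationship can be sketched with a first-order Bragg condition at normal incidence. The effective refractive index below is an illustrative assumption for a silica-in-water structure, not a value from the UC Riverside paper:

```python
def diffracted_wavelength_nm(spacing_nm, n_eff=1.43, order=1):
    """First-order Bragg wavelength for a periodic chain of particles
    at normal incidence: lambda = 2 * n_eff * d / m.
    n_eff is an illustrative effective refractive index, not a value
    taken from the paper."""
    return 2.0 * n_eff * spacing_nm / order

# Larger interparticle spacing shifts the reflected color toward red:
for d in (150, 175, 200, 225):
    print(f"spacing {d} nm -> reflects near {diffracted_wavelength_nm(d):.0f} nm")
```

A change of less than 100 nm in spacing is enough to sweep the reflected color across the whole visible range, which is why clusters of different sizes could, in principle, give different colors.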

"We have essentially developed tunable photonic materials whose properties can be manipulated by changing their orientation with external fields," said Yadong Yin, an assistant professor of chemistry at University of California. "These nanorods with configurable internal periodicity represent the smallest possible photonic structures that can effectively diffract visible light. This work paves the way for fabricating magnetically responsive photonic structures with significantly reduced dimensions so that color manipulation with higher resolution can be realized."

Future applications of this research include high definition posters, pictures, and energy efficient color displays. Laptop screens would be much more visible in bright sunlight as the nanorod technology diffracts color from the visible light around it. Battery life would also be greatly extended.

Another positive of this technology, Yin says, is that iron oxide is a cheap, non-toxic and plentiful resource.

Gizmag

This video shows the technology in action (Video credit: Yin lab, UC Riverside)


TStzmmalaysia
post Mar 20 2011, 03:05 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

'Pruned' microchips are leaner and meaner

If you had to use a commuting bicycle in a race, you would probably set about removing the kickstand, fenders, racks and lights to make the thing as fast and efficient as possible. When engineers at Houston's Rice University are developing small, fast, energy-efficient chips for use in devices like hearing aids, it turns out they do pretty much the same thing. The removal of portions of circuits that aren't essential to the task at hand is known as "probabilistic pruning," and it results in chips that are twice as fast, use half the power, and are half the size of conventional chips.

"I believe this is the first time someone has taken an integrated circuit and said, 'Let's get rid of the part that we don't need,'" said principal investigator Krishna Palem, a Professor of Computing at Rice. "What we've shown is that we can boost performance and cut energy use simultaneously if we prune the unnecessary portions of the digital application-specific integrated circuits that are typically used in hearing aids, cameras and other multimedia devices."

There is a cost for those improvements, however – the pruned chips gain an error magnitude of 8 percent. The researchers are OK with that, though. The chips are designed to allow for the probability of errors, and to limit which calculations cause them. Even with that additional 8 percent error magnitude, they should reportedly still be fine for hearing aids and similar devices, which the team states can tolerate error magnitudes as high as 10 percent.
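A toy model of the idea: remove the circuitry for the low-order bits of an adder and measure the resulting relative error magnitude. This is only a sketch of the trade-off, not the Rice team's actual design:

```python
import random

def pruned_add(a, b, pruned_bits):
    """Toy model of a 'pruned' adder: the circuitry for the lowest
    pruned_bits bits has been removed, so those input bits are simply
    ignored. An illustration of the trade-off, not the actual Rice
    University chip design."""
    mask = ~((1 << pruned_bits) - 1)
    return (a & mask) + (b & mask)

def mean_relative_error(pruned_bits, width=16, trials=5000, seed=42):
    """Average relative error magnitude of the pruned adder on random
    width-bit operands."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        a = rng.randrange(1, 1 << width)
        b = rng.randrange(1, 1 << width)
        exact = a + b
        total += abs(exact - pruned_add(a, b, pruned_bits)) / exact
    return total / trials

# More aggressive pruning buys smaller, faster hardware at the cost of
# a larger (but bounded) error magnitude:
for bits in (2, 4, 6, 8):
    print(f"pruning {bits} low bits -> mean relative error "
          f"{mean_relative_error(bits):.3%}")
```

The real chips prune more selectively, steering errors away from the most significant calculations, but the shape of the trade-off is the same: each pruned stage saves area and power while the error magnitude stays within an application-chosen bound.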

Pruned and traditional chips that were created together will be presented in a side-by-side comparison at this week's DATE11 microelectronics conference in Grenoble, France. While the pruned chip is expected to perform at twice the speed while using half the power of the regular chip, Palem has even higher hopes for a hearing aid-specific chip that he is about to design: four to five times more run time on a set of batteries.

Rice University is collaborating on the research project with Nanyang Technological University in Singapore and Switzerland's Center for Electronics and Microtechnology.

Probabilistic pruning stands in sharp contrast to the self-monitoring, self-repairing chips that we just featured on ZeitNews. In that case, researchers have added cores to chips, so that tasks allocated to cores that are found to be defective can be reassigned to functioning ones.

Gizmag


TStzmmalaysia
post Mar 20 2011, 03:08 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

Physicists Discover New Theory for the Structure of Space

Physicists at UCLA set out to design a better transistor and ended up discovering a new way to think about the structure of space.

Space is usually considered infinitely divisible — given any two positions, there is always a position halfway between. But in a recent study aimed at developing ultra-fast transistors using graphene, researchers from the UCLA Department of Physics and Astronomy and the California NanoSystems Institute show that dividing space into discrete locations, like a chessboard, may explain how point-like electrons, which have no finite radius, manage to carry their intrinsic angular momentum, or "spin."

While studying graphene's electronic properties, professor Chris Regan and graduate student Matthew Mecklenburg found that a particle can acquire spin by living in a space with two types of positions — dark tiles and light tiles. The particle seems to spin if the tiles are so close together that their separation cannot be detected.

"An electron's spin might arise because space at very small distances is not smooth, but rather segmented, like a chessboard," Regan said.

Their findings are published in the March 18 edition of the journal Physical Review Letters.



In quantum mechanics, "spin up" and "spin down" refer to the two types of states that can be assigned to an electron. That the electron's spin can have only two values — not one, three or an infinite number — helps explain the stability of matter, the nature of the chemical bond and many other fundamental phenomena.

However, it is not clear how the electron manages the rotational motion implied by its spin. If the electron had a radius, the implied surface would have to be moving faster than the speed of light, violating the theory of relativity. And experiments show that the electron does not have a radius; it is thought to be a pure point particle with no surface or substructure that could possibly spin.

Image caption: Electrons are thought to spin even though they are pure point particles with no surface that could possibly rotate. The standard cartoon of an electron shows a spinning sphere with positive or negative angular momentum, illustrated in blue or gold, but compelling experimental evidence indicates that electrons are ideal point particles, with no finite radius or internal structure that could possibly "spin". A quantum mechanical model of electron transport in graphene, a single layer of graphite (shown as a black honeycomb), presents a possible resolution to this puzzle: an electron in graphene hops from carbon atom to carbon atom as if moving on a chessboard with triangular tiles. At low energies the individual tiles are unresolved, but the electron acquires an "internal" spin quantum number which reflects whether it is on the blue or the gold tiles. Thus the electron's spin could arise not from rotational motion of its substructure, but rather from the discrete, chessboard-like structure of space. (Image: Chris Regan/CNSI)

In 1928, British physicist Paul Dirac showed that the spin of the electron is intimately related to the structure of space-time. His elegant argument combined quantum mechanics with special relativity, Einstein's theory of space-time (famously represented by the equation E=mc2).

Dirac's equation, far from merely accommodating spin, actually demands it. But while showing that relativistic quantum mechanics requires spin, the equation does not give a mechanical picture explaining how a point particle manages to carry angular momentum, nor why this spin is two-valued.

Unveiling a concept that is at once novel and deceptively simple, Regan and Mecklenburg found that electrons' two-valued spin can arise from having two types of tiles — light and dark — in a chessboard-like space. And they developed this quantum mechanical model while working on the surprisingly practical problem of how to make better transistors out of a new material called graphene.

Graphene, a single sheet of graphite, is an atomically-thin layer of carbon atoms arranged in a honeycomb structure. First isolated in 2004 by Andre Geim and Kostya Novoselov, graphene has a wealth of extraordinary electronic properties, such as high electron mobility and current capacity. In fact, these properties hold such promise for revolutionary advances that Geim and Novoselov were awarded the 2010 Nobel Prize a mere six years after their achievement.

Regan and Mecklenburg are part of a UCLA effort to develop extremely fast transistors using this new material.

"We wanted to calculate the amplification of a graphene transistor," Mecklenburg said. "Our collaboration was building them and needed to know how well they were going to work."

This calculation involved understanding how light interacts with the electrons in graphene.

The electrons in graphene move by hopping from carbon atom to carbon atom, as if hopping on a chessboard. The graphene chessboard tiles are triangular, with the dark tiles pointing "up" and light ones pointing "down." When an electron in graphene absorbs a photon, it hops from light tiles to dark ones. Mecklenburg and Regan showed that this transition is equivalent to flipping a spin from "up" to "down."

In other words, confining the electrons in graphene to specific, discrete positions in space gives them spin. This spin, which derives from the special geometry of graphene's honeycomb lattice, is in addition to and distinct from the usual spin carried by the electron. In graphene, the additional spin reflects the unresolved chessboard-like structure of the space that the electron occupies.
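The "dark tile / light tile" picture corresponds to the standard tight-binding model of graphene, in which the electron's state carries a two-component sublattice index, the pseudospin. A minimal sketch of that textbook model, using a typical nearest-neighbor hopping energy of 2.7 eV (an assumption; the paper's transport calculation is more involved):

```python
import cmath
import math

T = 2.7  # nearest-neighbor hopping energy in eV (typical textbook value)

def band_energies(kx, ky, a=1.0):
    """Energy bands of the standard nearest-neighbor tight-binding model
    of graphene. In the sublattice ('dark tile' / 'light tile') basis the
    2x2 Bloch Hamiltonian is H(k) = -t * [[0, f], [conj(f), 0]], whose
    eigenvalues are -t|f| and +t|f|; the eigenvector's weight on the two
    sublattices plays the role of the electron's pseudospin."""
    # Vectors from an A-site atom to its three nearest (B-site) neighbors.
    deltas = [(a, 0.0),
              (-a / 2.0,  a * math.sqrt(3) / 2.0),
              (-a / 2.0, -a * math.sqrt(3) / 2.0)]
    f = sum(cmath.exp(1j * (kx * dx + ky * dy)) for dx, dy in deltas)
    return (-T * abs(f), T * abs(f))

# At the zone center the two bands sit at -3t and +3t:
print(band_energies(0.0, 0.0))   # approximately (-8.1, 8.1)

# At the Dirac point K the gap closes and both bands touch zero:
K = (2 * math.pi / 3, 2 * math.pi / (3 * math.sqrt(3)))
print(band_energies(*K))         # approximately (0, 0)
```

The gap closing at the Dirac points is where graphene's electrons behave like massless relativistic particles, which is exactly the regime in which the sublattice index acts like a two-valued spin.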

"My adviser [Regan] spent his Ph.D. studying the structure of the electron," Mecklenburg said. "So he was very excited to see that spin can emerge from a lattice. It makes you wonder if the usual electron spin could be generated in the same way."

"It's not yet clear if this work will be more useful in particle or condensed matter physics," Regan said, "but it would be odd if graphene's honeycomb structure was the only lattice capable of generating spin."

PhysOrg

TStzmmalaysia
post Mar 20 2011, 03:11 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

3D, 360-degree Holographic Display Works

Any fan of Star Trek knows about the joys of the holodeck. The idea of a 3D, 360-degree immersive digital environment, projected on demand, is an enticing one that has thus far been confined to the silver screen, but technologies are emerging that just may make this possible. In 2008 Physorg reported on a 3D fog display on a room-sized scale, but it could not give a 360 degree experience. New developments in this area may make this possible at some point in the future.

Researchers at Osaka University in Japan have made a 3D, 360-degree display that projects from a variety of different angles onto a cylindrical fog screen. This combination of multiple point-of-view projectors and the cylinder allows for a display that is 3D no matter which side you view it from, though in order to get a holodeck style of projection, a much larger set of projectors, and a lot more fog, would need to be on hand. Projecting into just one cylinder of fog took three projectors. So for now, don't expect to get your virtual playground on for at least a few more years, since a system like this would undoubtedly be expensive to install and maintain.

You can watch this technology in action HERE.

The researchers expect that at some unnamed time in the future the technology will have applications in both the entertainment and health care arenas. As is usually the case with experimental prototypes there is no word yet on when we can expect to see these 360, 3D displays in use, in the real world, so don't hold your breath.

PhysOrg

This post has been edited by tzmmalaysia: Mar 20 2011, 03:12 PM
TStzmmalaysia
post Mar 22 2011, 10:44 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

A Hidden Universe Could Exist Inside Every Black Hole

“Our own Universe may be the interior of a black hole existing in another universe.” In a remarkable paper about the nature of space and the origin of time, Nikodem Poplawski, a physicist at Indiana University, suggests that a small change to the theory of gravity implies that our universe inherited its arrow of time from the black hole in which it was born.

Poplawski proposes that the universe in which the Earth exists may be located within the wormhole of a black hole that itself exists in an even larger universe.

Using an adaptation of Einstein's general theory of relativity, Poplawski analysed the theoretical motion of particles entering a black hole. He concluded that it was possible for a whole new universe to exist inside every black hole, which could mean that our own universe could be inside a black hole as well.

"Maybe the huge black holes at the centre of the Milky Way and other galaxies are bridges to different universes," he told New Scientist.

Explaining his theory in the journal Physics Letters B, he said he used the Einstein-Cartan-Kibble-Sciama (ECKS) theory of gravity in his analysis to account for the angular momentum of particles in a black hole. Doing this made it possible to calculate a property of space-time called torsion, which is believed to repel gravity.

He says that instead of matter reaching the infinite density called a "singularity" in Einstein's theory of relativity, space-time in a black hole behaves more like a spring being compressed, with matter rebounding and expanding continuously.

Dr Poplawski explains that this "bounce-back" effect is caused by the torsion of space-time having a repulsive force against the gargantuan strength of gravity in a black hole.

Dr Poplawski also claims that this recoiling effect could be what has led to our expanding universe that we observe today and could explain why our universe is flat, homogeneous and isotropic without needing cosmic inflation.

It is hard to see how we could test whether or not Dr Poplawski's theory is correct; the force of gravity in black holes is such that nothing can escape, so no information about what is going on inside one can ever reach us.

However, according to Dr Poplawski, if we were living in a spinning black hole then the spin would transfer to the space-time inside, meaning the universe would have a preferred direction - something we would be able to measure. Such a preferred direction could be related to the observed imbalance of matter and anti-matter in the universe and could explain the oscillation of neutrinos.

Poplawski says that the idea that black holes are the cosmic mothers of new universes is a natural consequence of a simple new assumption about the nature of spacetime. Poplawski points out that the standard derivation of general relativity takes no account of the intrinsic angular momentum of spin-half particles. However, there is another version of the theory, called the Einstein-Cartan-Kibble-Sciama theory of gravity, which does.

This theory predicts that particles with half integer spin should interact, generating a tiny repulsive force called torsion. In ordinary circumstances, torsion is too small to have any effect. But when densities become much higher than those in nuclear matter, it becomes significant. In particular, says Poplawski, torsion prevents the formation of singularities inside a black hole.

Astrophysicists have long known that our universe is so big that it could not have reached its current size given the rate of expansion we see now. Instead, they believe it grew by many orders of magnitude in a fraction of a second after the Big Bang, the period known as inflation.

Poplawski's approach immediately addresses the inflation problem: torsion caused the rapid early expansion, which means the universe as we see it today can be explained by a single theory of gravity without any additional assumptions about inflation.

Another important corollary of Poplawski's approach is that it makes it possible for universes to be born inside the event horizons of certain kinds of black hole where torsion prevents the formation of a singularity but allows energy density to build up, which leads to the creation of particles on a massive scale via pair production followed by the expansion of the new universe.

"Such an expansion is not visible for observers outside the black hole, for whom the horizon's formation and all subsequent processes occur after infinite time," says Poplawski. For this reason, he emphasizes, the new universe is a separate branch of space time and evolves accordingly.

Poplawski's theory also suggests a solution to why time seems to flow in one direction but not in the other, even though the laws of physics are time symmetric.

Poplawski says the origin of the arrow of time comes from the asymmetry of the flow of matter into the black hole from the mother universe. "The arrow of cosmic time of a universe inside a black hole would then be fixed by the time-asymmetric collapse of matter through the event horizon," he says.

Translated, this means that our universe inherited its arrow of time from its source. "Daughter universes," he says, "may inherit other properties from their mothers," implying that it may be possible to detect these properties, providing an experimentally falsifiable test of his idea.

DailyGalaxy

TStzmmalaysia
post Mar 22 2011, 10:47 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

Large Hadron Collider could be world's first time machine

If the latest theory of Tom Weiler and Chiu Man Ho is right, the Large Hadron Collider – the world's largest atom smasher, which started regular operation last year – could be the first machine capable of causing matter to travel backwards in time.

"Our theory is a long shot," admitted Weiler, who is a physics professor at Vanderbilt University, "but it doesn't violate any laws of physics or experimental constraints."

One of the major goals of the collider is to find the elusive Higgs boson: the particle that physicists invoke to explain why particles like protons, neutrons and electrons have mass. If the collider succeeds in producing the Higgs boson, some scientists predict that it will create a second particle, called the Higgs singlet, at the same time.

According to Weiler and Ho's theory, these singlets should have the ability to jump into an extra, fifth dimension where they can move either forward or backward in time and reappear in the future or past.

"One of the attractive things about this approach to time travel is that it avoids all the big paradoxes," Weiler said. "Because time travel is limited to these special particles, it is not possible for a man to travel back in time and murder one of his parents before he himself is born, for example. However, if scientists could control the production of Higgs singlets, they might be able to send messages to the past or future."


Unsticking the "brane"

The test of the researchers' theory will be whether the physicists monitoring the collider begin seeing Higgs singlet particles and their decay products spontaneously appearing. If they do, Weiler and Ho believe that they will have been produced by particles that travel back in time to appear before the collisions that produced them.

Weiler and Ho's theory is based on M-theory, a "theory of everything." A small cadre of theoretical physicists have developed M-theory to the point that it can accommodate the properties of all the known subatomic particles and forces, including gravity, but it requires 10 or 11 dimensions instead of our familiar four. This has led to the suggestion that our universe may be like a four-dimensional membrane or "brane" floating in a multi-dimensional space-time called the "bulk."

According to this view, the basic building blocks of our universe are permanently stuck to the brane and so cannot travel in other dimensions. There are some exceptions, however. Some argue that gravity, for example, is weaker than other fundamental forces because it diffuses into other dimensions. Another possible exception is the proposed Higgs singlet, which responds to gravity but not to any of the other basic forces.


Answers in neutrinos?

Weiler began looking at time travel six years ago to explain anomalies that had been observed in several experiments with neutrinos. Neutrinos are nicknamed ghost particles because they react so rarely with ordinary matter: Trillions of neutrinos hit our bodies every second, yet we don't notice them because they zip through without affecting us.

Weiler and colleagues Heinrich Päs and Sandip Pakvasa at the University of Hawaii came up with an explanation of the anomalies based on the existence of a hypothetical particle called the sterile neutrino. In theory, sterile neutrinos are even less detectable than regular neutrinos because they interact only with gravitational force. As a result, sterile neutrinos are another particle that is not attached to the brane and so should be capable of traveling through extra dimensions.

Weiler, Päs and Pakvasa proposed that sterile neutrinos travel faster than light by taking shortcuts through extra dimensions. According to Einstein's general theory of relativity, there are certain conditions where traveling faster than the speed of light is equivalent to traveling backward in time. This led the physicists into the speculative realm of time travel.
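That equivalence falls straight out of the Lorentz transformation: for a signal moving faster than light, there exist ordinary (slower-than-light) observer frames in which it arrives before it was sent. A quick numeric check, in units where c = 1:

```python
import math

C = 1.0  # work in units where the speed of light is 1

def lorentz_dt(dt, dx, frame_v):
    """Time separation between two events as seen from a frame moving at
    frame_v (|frame_v| < c), via dt' = gamma * (dt - v * dx / c**2)."""
    gamma = 1.0 / math.sqrt(1.0 - (frame_v / C) ** 2)
    return gamma * (dt - frame_v * dx / C ** 2)

# A signal covering dx = 2 light-seconds in dt = 1 second moves at twice
# the speed of light. Ordinary observers disagree about its time order:
dt, dx = 1.0, 2.0
print(lorentz_dt(dt, dx, 0.4))   # positive: arrival still after emission
print(lorentz_dt(dt, dx, 0.8))   # negative: arrival BEFORE emission
```

Any superluminal signal, sterile neutrino or Higgs singlet alike, has frames in which its time order is reversed, which is exactly the loophole these theories exploit.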


Ideas impact science fiction

In 2007, the researchers, along with Vanderbilt graduate fellow James Dent, posted a paper titled "Neutrino time travel" on the preprint server that generated a considerable amount of buzz.

Their ideas found their way into two science fiction novels. Final Theory by Mark Alpert, which was described in the New York Times as a "physics-based version of The Da Vinci Code," is based on the researchers' idea of neutrinos taking shortcuts in extra dimensions. Joe Haldeman's novel The Accidental Time Machine is about an MIT graduate student and includes an author's note that describes the novel's relationship to the type of time travel described by Dent, Päs, Pakvasa and Weiler.


Ho is a graduate fellow working with Weiler. Their theory is described in a paper posted March 7 on the arXiv.org physics preprint website.

PhysOrg

More information: Causality-Violating Higgs Singlets at the LHC, Chiu Man Ho, Thomas J. Weiler, arXiv:1103.1373v1 [hep-ph]. http://arxiv.org/abs/1103.1373

Provided by Vanderbilt University


TStzmmalaysia
post Mar 22 2011, 11:00 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


APPLIED SCIENCES

Attached Image

Kinect to help the blind 'see' in augmented reality

It seems there is no shortage of uses for the Kinect system. The device, initially created by Microsoft as an add-on to its popular Xbox 360 video game console to let users ditch the controller, has been getting around. Now it has been integrated into a system designed to help the blind.

The system in question, designed by graduate students at the Universität Konstanz in Germany, has been dubbed NAVI, for Navigational Aids for the Visually Impaired.

NAVI works something like this: the infrared camera from a Kinect system is mounted on a helmet worn by a blind person. The visual data from that camera is turned into a set of audio instructions that are then transmitted to the wearer via a wireless headset. A standard camera is added as well, allowing for a kind of three-camera stereoscopic vision. Certain items, such as doors, will trigger events, such as a countdown, to keep users from walking into them, in a kind of augmented reality.

The goal is to give a blind person warnings about potential obstructions and directions for navigating set spaces, at a longer range than current systems. This reporter, for one, thinks the Kinect will be much more a supplement to seeing-eye dogs than a replacement for them.

The system is also paired with a vibro-tactile Arduino system in the belt that will most likely act as a warning system should an obstacle come perilously close to the user. No plans for a commercial release of the system have been mentioned at this time.
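The core idea of converting depth data into warnings can be sketched in a few lines. This is a toy illustration with invented thresholds and invented function names, not the NAVI team's code:

```python
def proximity_warning(depth_mm, near=800, caution=1500):
    """Map a depth frame (list of rows, values in mm, 0 = no reading)
    to a coarse warning level. The nearest obstacle in the central
    field of view decides how urgent the audio cue should be.
    Thresholds are invented for the example."""
    h, w = len(depth_mm), len(depth_mm[0])
    readings = [depth_mm[r][c]
                for r in range(h // 4, 3 * h // 4)
                for c in range(w // 4, 3 * w // 4)
                if depth_mm[r][c] > 0]
    if not readings:
        return "no-data"
    nearest = min(readings)
    if nearest < near:
        return "stop"        # e.g. rapid beeps plus belt vibration
    if nearest < caution:
        return "caution"     # e.g. slow beeps
    return "clear"

frame = [[3000] * 640 for _ in range(480)]   # empty room, 3 m away
for r in range(200, 280):                    # obstacle 0.6 m ahead
    for c in range(300, 340):
        frame[r][c] = 600
print(proximity_warning(frame))              # "stop"
```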

PhysOrg



More information HERE


TStzmmalaysia
post Mar 22 2011, 11:06 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

Physicists investigate lower dimensions of the universe

Several speculative theories in physics involve extra dimensions beyond our well-known four (which are broken down into three dimensions of space and one of time). Some theories have suggested 5, 10, 26, or more, with the extra spatial dimensions "hiding" within our observable three dimensions. One thing that all of these extra dimensions have in common is that none has ever been experimentally detected; they are all mathematical predictions.

More recently, physicists have been theorizing the possibility of lower dimensionality, in which the universe has only two or even one spatial dimension(s), along with one dimension of time. The theories suggest that the lower dimensions occurred in the past when the universe was much smaller and had a much higher energy level (and temperature) than today. Further, it appears that the concept of lower dimensions may already have some experimental evidence in cosmic ray observations.

Now in a new study, physicists Jonas Mureika from Loyola Marymount University in Los Angeles, California, and Dejan Stojkovic from SUNY at Buffalo in Buffalo, New York, have proposed a new and independent method for experimentally detecting lower dimensions. They’ve published their study in a recent issue of Physical Review Letters.

In 2010, a team of physicists including Stojkovic proposed a lower-dimensional framework in which spacetime is fundamentally a (1 + 1)-dimensional universe (meaning it contains one spatial dimension and one time dimension). In other words, the universe is a straight line that is “wrapped up” in such a way so that it appears (3 + 1)-dimensional at today’s higher energy scales, which is what we see.

The scientists don’t know the exact energy levels (or the exact age of the universe) when the transitions between dimensions occurred. However, they think that the universe’s energy level and size directly determine its number of dimensions, and that the number of dimensions evolves over time as the energy and size change. They predict that the transition from a (1 + 1)- to a (2 + 1)-dimensional universe happened when the temperature of the universe was about 100 TeV (teraelectronvolts) or less, and the transition from a (2 + 1)- to a (3 + 1)-dimensional universe happened later at about 1 TeV. Today, the temperature of the universe is about 10⁻³ eV.
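The predicted sequence of transitions can be summarized as a simple threshold function. The crossover energies below are the rough estimates quoted in the article; the true values are unknown:

```python
def spatial_dimensions(energy_ev):
    """Number of spatial dimensions at a given energy scale, using
    the approximate crossover energies quoted by the researchers
    (~100 TeV and ~1 TeV)."""
    TEV = 1e12                  # electronvolts per TeV
    if energy_ev >= 100 * TEV:
        return 1                # (1 + 1)-dimensional epoch
    if energy_ev >= 1 * TEV:
        return 2                # (2 + 1)-dimensional epoch
    return 3                    # today's (3 + 1) spacetime

print(spatial_dimensions(1e-3))   # today's ~10^-3 eV universe -> 3
print(spatial_dimensions(5e12))   # 5 TeV -> 2
print(spatial_dimensions(2e14))   # 200 TeV -> 1
```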

So far, there may already be one piece of experimental evidence for the existence of a lower-dimensional structure at a higher energy scale. When observing families of cosmic ray particles in space, scientists found that, at energies higher than 1 TeV, the main energy fluxes appear to align in a two-dimensional plane. This means that, above a certain energy level, particles propagate in two dimensions rather than three dimensions.

In the current study, Mureika and Stojkovic have proposed a second test for lower dimensions that would provide independent evidence for their existence. The test is based on the assumption that a (2 + 1)-dimensional spacetime, which is a flat plane, has no gravitational degrees of freedom. This means that gravity waves and gravitons cannot have been produced during this epoch. So the physicists suggest that a future gravitational wave detector looking deep into space might find that primordial gravity waves cannot be produced beyond a certain frequency, and this frequency would represent the transition between dimensions. Looking backwards, it would appear that one of our spatial dimensions has “vanished.”

The scientists added that it should be possible, though perhaps more difficult, to test for the existence of (1 + 1)-dimensional spacetime.

“It will be challenging with the current experiments,” Stojkovic told PhysOrg.com. “But it is within the reach of both the LHC and cosmic ray experiments if the two-dimensional to one-dimensional crossover scale is 10 TeV.”

Lower dimensions at higher energies could have several advantages for cosmologists. For instance, models of quantum gravity in (2 + 1) and (1 + 1) dimensions could overcome some of the problems that plague quantum gravity theories in (3 + 1) dimensions. Also, reducing the dimensions of spacetime might solve the cosmological constant problem, which is that the cosmological constant is fine-tuned to fit observations and does not match theoretical calculations. A solution may lie in the existence of energy that is currently hiding between two folds of our (3 + 1)-dimensional spacetime, which will open up into (4 + 1)-dimensional spacetime in the future when the universe’s decreasing energy level reaches another transition point.

“A change of paradigm,” Stojkovic said about the significance of lower dimensions. “It is a new avenue to attack long-standing problems in physics.”

PhysOrg

More information: Jonas Mureika and Dejan Stojkovic. “Detecting Vanishing Dimensions via Primordial Gravitational Wave Astronomy.” Physical Review Letters 106, 101101 (2011). DOI: 10.1103/PhysRevLett.106.101101

TStzmmalaysia
post Mar 22 2011, 11:09 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


NANOTECHNOLOGY

Attached Image

Batteries charge quickly and retain capacity, thanks to new structure


Braun's group developed a three-dimensional nanostructure for battery cathodes that allows for dramatically faster charging and discharging without sacrificing energy storage capacity. The researchers' findings will be published in the March 20 advance online edition of the journal Nature Nanotechnology.

Aside from quick-charge consumer electronics, batteries that can store a lot of energy, release it fast and recharge quickly are desirable for electric vehicles, medical devices, lasers and military applications.

"This system that we have gives you capacitor-like power with battery-like energy," said Braun, a professor of materials science and engineering. "Most capacitors store very little energy. They can release it very fast, but they can't hold much. Most batteries store a reasonably large amount of energy, but they can't provide or receive energy rapidly. This does both."

The performance of typical lithium-ion (Li-ion) or nickel metal hydride (NiMH) rechargeable batteries degrades significantly when they are rapidly charged or discharged. Making the active material in the battery a thin film allows for very fast charging and discharging, but reduces the capacity to nearly zero because the active material lacks volume to store energy.

Braun's group wraps a thin film into a three-dimensional structure, achieving both high active volume (high capacity) and large current. They have demonstrated battery electrodes that can charge or discharge in a few seconds, 10 to 100 times faster than equivalent bulk electrodes, yet perform normally in existing devices.
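The tradeoff Braun describes follows from a standard diffusion scaling argument: the characteristic time for ions to cross an active layer grows with the square of its thickness. A minimal sketch, with an assumed diffusivity and assumed thicknesses (none of these numbers are from the paper):

```python
def charge_time_scale(thickness_m, diffusivity_m2s):
    """Characteristic diffusion time t ~ L^2 / D for ions crossing an
    active layer of thickness L. Standard scaling argument showing
    why thin films charge fast; not a calculation from the paper."""
    return thickness_m ** 2 / diffusivity_m2s

D = 1e-14                                # m^2/s, assumed solid-state diffusivity
bulk = charge_time_scale(50e-6, D)       # 50 micron bulk electrode (assumed)
film = charge_time_scale(100e-9, D)      # 100 nm film on the scaffold (assumed)
print(bulk / film)                       # thin film is ~250,000x faster
```

The point of the 3-D scaffold is that it gets this thin-film speedup while still packing enough total active material into the electrode to keep the capacity high.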

This kind of performance could lead to phones that charge in seconds or laptops that charge in minutes, as well as high-power lasers and defibrillators that don't need time to power up before or between pulses.

Braun is particularly optimistic about the batteries' potential in electric vehicles, where battery life and recharging time are major limitations. Long-distance road trips can become their own form of start-and-stop driving if the battery lasts only 100 miles and then requires an hour to recharge.

"If you had the ability to charge rapidly, instead of taking hours to charge the vehicle you could potentially have vehicles that would charge in similar times as needed to refuel a car with gasoline," Braun said. "If you had five-minute charge capability, you would think of this the same way you do an internal combustion engine. You would just pull up to a charging station and fill up."

All of the processes the group used are also used at large scales in industry so the technique could be scaled up for manufacturing.

The key to the group's novel 3-D structure is self-assembly. They begin by coating a surface with tiny spheres, packing them tightly together to form a lattice. Trying to create such a uniform lattice by other means is time-consuming and impractical, but the inexpensive spheres settle into place automatically.

Then the researchers fill the space between and around the spheres with metal. The spheres are melted or dissolved, leaving a porous 3-D metal scaffolding, like a sponge. Next, a process called electropolishing uniformly etches away the surface of the scaffold to enlarge the pores and make an open framework. Finally, the researchers coat the frame with a thin film of the active material.
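As a rough geometric aside (standard sphere-packing math, not a figure from the paper): close-packed equal spheres occupy about 74% of space, so dissolving them away leaves a scaffold that is already about 74% pore by volume before electropolishing opens it up further.

```python
import math

# Close-packed equal spheres fill pi / (3 * sqrt(2)) of the volume,
# so once they are melted or dissolved out, roughly that fraction of
# the structure becomes pore space. Standard geometry, illustrative.
packing_fraction = math.pi / (3 * math.sqrt(2))
print(f"sphere (future pore) volume: {packing_fraction:.1%}")
print(f"metal scaffold volume:      {1 - packing_fraction:.1%}")
```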

The result is a bicontinuous electrode structure with small interconnects, so the lithium ions can move rapidly; a thin-film active material, so the diffusion kinetics are rapid; and a metal framework with good electrical conductivity.

The group demonstrated both NiMH and Li-ion batteries, but the structure is general, so any battery material that can be deposited on the metal frame could be used.

"We like that it's very universal, so if someone comes up with a better battery chemistry, this concept applies," said Braun, who is also affiliated with the Materials Research Laboratory and the Beckman Institute for Advanced Science and Technology at Illinois. "This is not linked to one very specific kind of battery, but rather it's a new paradigm in thinking about a battery in three dimensions for enhancing properties."

EurekAlert


TStzmmalaysia
post Mar 22 2011, 11:12 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

Researchers create organic nanoparticle that uses sound and heat to find, treat tumors

A team of scientists from Princess Margaret Hospital has created an organic nanoparticle that is completely non-toxic, biodegradable and nimble in the way it uses light and heat to treat cancer and deliver drugs. (A nanoparticle is a minute particle with novel properties.)

The findings, published online today in Nature Materials are significant because unlike other nanoparticles, the new nanoparticle has a unique and versatile structure that could potentially change the way tumors are treated, says principal investigator Dr. Gang Zheng, Senior Scientist, Ontario Cancer Institute (OCI), Princess Margaret Hospital at University Health Network.

Dr. Zheng says: "In the lab, we combined two naturally occurring molecules (chlorophyll and lipid) to create a unique nanoparticle that shows promise for numerous diverse light-based (biophotonic) applications. The structure of the nanoparticle, which is like a miniature and colorful water balloon, means it can also be filled with drugs to treat the tumor it is targeting."

It works this way, explains first author Jonathan Lovell, a doctoral student at OCI: "Photothermal therapy uses light and heat to destroy tumors. With the nanoparticle's ability to absorb so much light and accumulate in tumors, a laser can rapidly heat the tumor to a temperature of 60 degrees and destroy it. The nanoparticle can also be used for photoacoustic imaging, which combines light and sound to produce a very high-resolution image that can be used to find and target tumors." He adds that once the nanoparticle hits its tumor target, it becomes fluorescent to signal "mission accomplished".

"There are many nanoparticles out there, but this one is the complete package, a kind of one-stop shopping for various types of cancer imaging and treatment options that can now be mixed and matched in ways previously unimaginable. The unprecedented safety of this nanoparticle in the body is the icing on the cake. We are excited by the possibilities for its use in the clinic," says Dr. Zheng.

PhysOrg

TStzmmalaysia
post Mar 22 2011, 11:24 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


TRANSPORTATION

Attached Image

The drive toward hydrogen vehicles just got shorter

Researchers have revealed a new single-stage method for recharging the hydrogen storage compound ammonia borane. The breakthrough makes hydrogen a more attractive fuel for vehicles and other transportation modes.

In an article appearing today in Science magazine, Los Alamos National Laboratory (LANL) and University of Alabama researchers working within the U.S. Department of Energy's Chemical Hydrogen Storage Center of Excellence describe a significant advance in hydrogen storage science.

Hydrogen is in many ways an ideal fuel. It possesses a high energy content per unit mass when compared to petroleum, and it can be used to run a fuel cell, which in turn can be used to power a very clean engine. On the down side, H2 has a low energy content per unit volume versus petroleum (it is very light and bulky). The crux of the hydrogen issue has been how to get enough of the element on board a vehicle to power it a reasonable distance.
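The mass-versus-volume point can be made concrete with commonly cited lower-heating-value figures. These are approximate textbook numbers, not values from the article:

```python
# Approximate energy densities (lower heating value); hydrogen wins
# decisively per kilogram but loses badly per litre, which is the
# storage problem the article describes.
fuels = {
    #                          MJ/kg   MJ/L
    "hydrogen (700 bar gas)": (120.0,  5.0),
    "gasoline":               ( 44.0, 32.0),
}
for name, (per_kg, per_l) in fuels.items():
    print(f"{name:24s} {per_kg:6.1f} MJ/kg  {per_l:5.1f} MJ/L")
```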

Work at LANL and elsewhere has focused on chemical hydrides for storing hydrogen, with one material in particular, ammonia borane, taking center stage. Ammonia borane is attractive because its hydrogen storage capacity approaches a whopping 20 percent by weight—enough that it should, with appropriate engineering, permit hydrogen-fueled vehicles to go farther than 300 miles on a single "tank," a benchmark set by the U.S. Department of Energy.
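The roughly 20 percent figure is easy to sanity-check from standard atomic masses, since ammonia borane (NH3BH3) carries six hydrogen atoms per formula unit:

```python
# Hydrogen mass fraction of ammonia borane, NH3BH3, from standard
# atomic masses. This is a back-of-envelope check on the article's
# "approaches 20 percent by weight" claim, not a figure from it.
m_H, m_N, m_B = 1.008, 14.007, 10.811   # g/mol
molar_mass = m_N + m_B + 6 * m_H        # ~30.87 g/mol
h_fraction = 6 * m_H / molar_mass
print(f"{h_fraction:.1%}")              # ~19.6%
```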

Hydrogen release from ammonia borane has been well demonstrated; the chief drawback to its use has been the lack of energy-efficient methods to reintroduce hydrogen into the spent fuel. In other words, until now, the ammonia borane couldn't be recycled efficiently enough after its hydrogen was released.

The Science paper describes a simple scheme that regenerates ammonia borane from a hydrogen-depleted "spent fuel" form (called polyborazylene) back into usable fuel via reactions taking place in a single container. This "one pot" method represents a significant step toward the practical use of hydrogen in vehicles by potentially reducing the expense and complexity of the recycle stage. Regeneration takes place in a sealed pressure vessel using hydrazine and liquid ammonia at 40 degrees Celsius, and necessarily takes place off-board a vehicle. The researchers envision vehicles with interchangeable hydrogen storage "tanks" containing ammonia borane that are used and then sent back to a factory for recharging.

The Chemical Hydrogen Storage Center of Excellence was one of three Center efforts funded by DOE. The other two focused on hydrogen sorption technologies and storage in metal hydrides. The Center of Excellence was a collaboration between Los Alamos, Pacific Northwest National Laboratory, and academic and industrial partners.

LANL researcher Dr. John Gordon, a corresponding author for the paper, credits collaboration encouraged by the Center model with the breakthrough.

"Crucial predictive calculations carried out by University of Alabama Professor Dave Dixon's group guided the experimental work of the Los Alamos team, which included researchers from both the Chemistry Division and the Materials Physics and Applications Division at LANL," Gordon said.

The success of this particular advance built on earlier work by this team (see: Angew. Chem. Int. Ed. 2009, 37, 6812). Input from colleagues at Dow Chemical (also a Center Partner), indicated that an alternative approach to the work in the Angew. Chem. paper would be required if ammonia borane recycle were to be feasible on a large scale. Armed with this information, it was "the insight, creativity and hard work of Dr. Andrew Sutton of Chemistry Division at LANL that provided the key to unlocking the 'one-pot' chemistry," Gordon said.

EurekAlert

TStzmmalaysia
post Mar 22 2011, 11:32 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


APPLIED SCIENCES

Attached Image

3D Printed Air Bike is as Strong as Steel but 1/3rd the Weight of Aluminium


This funky bike is making news today not just for its design but for how it is made. UK engineers printed the bike using a powder composed of nylon and metal, which results in a frame that has the strength of steel while being 65% lighter than aluminum. The Air Bike is a demonstration of a technology called additive layer manufacturing, which was able to create the fully working bike with only six parts. The goal is to show how the technology can revolutionize product design, from airplanes to satellites to more down-to-earth items like bikes.

EADS, the European Aerospace and Defense Group and parent company of Airbus, is developing the technology to make strong, lightweight and robust materials for aeronautical equipment. The company decided to create the bike to show that the technology is not just for hidden parts, but can revolutionize the production of many common things.

As the term implies, additive layer manufacturing creates a product by joining material together rather than by subtracting from it, as in machining steel. A fine powder of nylon and metal is laid out and melted with a laser. As in a 3D printer, consecutive layers are added and fused to create a bond that rivals the strength of steel.

The process also uses one tenth of the material, which makes it a very green substitute. There's no word on whether the material can be reused, so its cradle-to-cradle potential is still up in the air, so to speak. Don't expect to see the Air Bike on the street any time soon, as it is just a demonstration of the technology, but the potential is enormous.

Inhabitat


TStzmmalaysia
post Mar 22 2011, 11:51 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


APPLIED SCIENCES

Attached Image

LAVA’s Home Of The Future is a Geodesic Plant-Filled Bubble

LAVA’s Home of the Future is a showcase for ultramodern living with man, nature and technology in harmony and is actually going to be built in late 2011 on a rooftop of a new furniture mall in Beijing, China.

This home is set inside a geodesic skydome made out of ETFE bubbles and will create a year-round microclimate for the gardens contained inside.

A series of living zones are spread throughout the enclosure and surrounded by the gardens enjoying abundant daylighting, fresh air and a quiet repose away from the city noise. Visitors to the mall will be able to tour the futuristic home and the 15 different living spaces that include internal/external bathroom zones, kitchens flowing to veggie patches, barbecues and sunken bedrooms with ‘dream inducing lighting’.

Inspired by nature’s efficiencies like coral, cells and bubbles, the home is an environment where technology is invisibly integrated to satisfy every day needs. As the sun goes down, the home and the tropical garden turn into an electric experience, with a vein-like lighting system illuminating the space, plants and bubble enclosure. Alongside the futuristic home, LAVA has also designed the Future Hotel, which is a demonstration project focusing on meeting the requirements of hotel guests using tomorrow’s technology.

Chris Bosse, Director of LAVA says: “The Home of the Future acts as a metaphor for the questions of our times, our relationship with nature, with technology and with ourselves.” LAVA’s Home of the Future is currently on exhibit as part of the annual Art + Architecture 11 show, HOME – Real and Ideal, at the Boutwell Draper Gallery in Sydney from the 17th of March to the 31st.

Inhabitat


TStzmmalaysia
post Mar 22 2011, 11:56 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


APPLIED SCIENCES

Attached Image

Seeing in stereo: Engineers invent lens for 3-D microscope

Engineers at Ohio State University have invented a lens that enables microscopic objects to be seen from nine different angles at once to create a 3D image.

Other 3D microscopes use multiple lenses or cameras that move around an object; the new lens is the first single, stationary lens to create microscopic 3D images by itself.

Allen Yi, associate professor of integrated systems engineering at Ohio State, and postdoctoral researcher Lei Li described the lens in a recent issue of the Journal of the Optical Society of America A.

Yi called the lens a proof of concept for manufacturers of microelectronics and medical devices, who currently use very complex machinery to view the tiny components that they assemble.

Though the engineers milled their prototype thermoplastic lens on a precision cutting machine, the same lens could be manufactured less expensively through traditional molding techniques, Yi said.

"Ultimately, we hope to help manufacturers reduce the number and sizes of equipment they need to miniaturize products," he added.

The prototype lens, which is about the size of a fingernail, looks at first glance like a gem cut for a ring, with a flat top surrounded by eight facets. But while gemstones are cut for symmetry, this lens is not symmetric. The sizes and angles of the facets vary in minute ways that are hard to see with the naked eye.

"No matter which direction you look at this lens, you see a different shape," Yi explained. Such a lens is called a "freeform lens," a type of freeform optics.

Freeform optics have been in use for more than a decade. But Lei Li was able to write a computer program to design a freeform lens capable of imaging microscopic objects.

Then Yi and Li used a commercially available milling tool with a diamond blade to cut the shape from a piece of the common thermoplastic material polymethyl methacrylate, a transparent plastic that is sometimes called acrylic glass. The machine shaved bits of plastic from the lens in increments of 10 nanometers, or 10 billionths of a meter – a distance about 5,000 times smaller than the diameter of a human hair.

The final lens resembled a rhinestone, with a faceted top and a wide, flat bottom. They installed the lens on a microscope with a camera looking down through the faceted side, and centered tiny objects beneath the flat side.

Each facet captured an image of the objects from a different angle, which can be combined on a computer into a 3D image.
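Recovering depth from views taken at different angles rests on triangulation. Here is a minimal two-view sketch of that geometry, with invented numbers; it is not the authors' reconstruction software:

```python
def depth_from_disparity(focal_px, baseline_mm, disparity_px):
    """Classic two-view triangulation: z = f * B / d, where a feature
    shifted d pixels between views taken a baseline B apart lies at
    depth z. The faceted lens gives nine such views at once; this
    shows only the core geometry."""
    return focal_px * baseline_mm / disparity_px

# Assumed example: a 2000 px focal length, a 4 mm effective baseline
# between facets, and a feature shifted 40 px between the two views.
z = depth_from_disparity(2000, 4.0, 40)
print(z, "mm")   # 200.0 mm
```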

The engineers successfully recorded 3D images of the tip of a ballpoint pen – which has a diameter of about 1 millimeter – and a mini drill bit with a diameter of 0.2 millimeters.

"Using our lens is basically like putting several microscopes into one microscope," said Li. "For us, the most attractive part of this project is we will be able to see the real shape of micro-samples instead of just a two-dimensional projection."

In the future, Yi would like to develop the technology for manufacturers. He pointed to the medical testing industry, which is working to shrink devices that analyze fluid samples. Cutting tiny reservoirs and channels in plastic requires a clear view, and the depths must be carved with precision.

Computer-controlled machines – rather than humans – do the carving, and Yi says that the new lens can be placed in front of equipment that is already in use. It can also simplify the design of future machine vision equipment, since multiple lenses or moving cameras would no longer be necessary.

Other devices could use the tiny lens, and he and Li have since produced a grid-shaped array of lenses made to fit an optical sensor. Another dome-shaped lens is actually made of more than 1,000 tiny lenses, similar in appearance to an insect's eye.

EurekAlert

TStzmmalaysia
post Mar 22 2011, 11:58 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

New imaging technique provides rapid, high-definition chemistry

With intensity a million times brighter than sunlight, a new synchrotron-based imaging technique offers high-resolution pictures of the molecular composition of tissues with unprecedented speed and quality. Carol Hirschmugl, a physicist at the University of Wisconsin-Milwaukee (UWM), led a team of researchers from UWM, the University of Illinois at Urbana-Champaign and University of Illinois at Chicago (UIC) to demonstrate these new capabilities.

Hirschmugl and UWM scientist Michael Nasse have built a facility called "Infrared Environmental Imaging (IRENI)," to perform the technique at the Synchrotron Radiation Center (SRC) at UW-Madison. The new technique employs multiple beams of synchrotron light to illuminate a state-of-the-art camera, instead of just one beam.

IRENI cuts the amount of time needed to image a sample from hours to minutes, while quadrupling the range of the sample size and producing high-resolution images of samples that do not have to be tagged or stained as they would for imaging with an optical microscope.

"Since IRENI reveals the molecular composition of a tissue sample, you can choose to look at the distribution of functional groups, such as proteins, carbohydrates and lipids," says Hirschmugl, "so you concurrently get detailed structure and chemistry."

The technique could have broad applications not only in medicine, but also in pharmaceutical drug analysis, art conservation, forensics, biofuel production, and advanced materials, such as graphene, she says.

The work is a collaboration with the labs of Rohit Bhargava, assistant professor of bioengineering at the University of Illinois at Urbana-Champaign and pathologists Dr. Virgilia Macias and Dr. André Kajdacsy-Balla at UIC. "It has taken three years to establish IRENI as a national user facility located at the SRC," says Nasse. "It is the only facility of its kind worldwide."



Chemical fingerprints

The unique features of the synchrotron make it a highly versatile light source in spectroscopy. Streams of speeding electrons emit continuous light across the entire electromagnetic spectrum so that researchers can access whatever wavelength is best absorbed for a particular purpose.

Although not visible to the human eye, the mid-infrared range of light used by the team documents the light absorbed at thousands of locations on the sample, forming graphic "fingerprints" of biochemically important molecules.

Using 12 beams of synchrotron light in this range allows researchers to collect thousands of these chemical fingerprints simultaneously, producing an image that is 100 times less pixelated than conventional infrared imaging.

"We did not realize until now the improvement in detail and quality that sampling at this pixel size would bring," says Bhargava. "The quality of the chemical images is now quite similar to that of optical microscopy and the approach presents exciting new possibilities."



Testing for future applications

The team tested the technique on breast and prostate tissue samples to determine its capabilities for potential use in diagnostics for cancer and other diseases. The researchers were able to detect features that distinguished the epithelial cells, in which cancers begin, from the stromal cells, which are the type found in deeper tissues, with unprecedented detail.

Separating the two layers of cells is a "basement membrane" which prevents malignant cells from spreading from the epithelial cells into the stromal cells. Early-stage cancers are concentrated in the epithelial cells, but metastasis occurs when the basement membrane is breached. Using a prostate cancer sample, the team had encouraging results in locating spectra of the basement membrane, but more work needs to be done.

"IRENI provides us a new opportunity to study tissues and provides lessons for the development of the next generation of IR imaging instruments," says Michael Walsh, a Carle Foundation Hospital-Beckman Institute post-doctoral fellow at the University of Illinois at Urbana-Champaign and co-author on the paper.

It opens the door for development of synchrotron-based imaging that can monitor cellular processes, from simple metabolism to stem cell specialization.

EurekAlert
TStzmmalaysia
post Mar 22 2011, 12:01 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


ENERGY

Attached Image

Chicago’s Willis Tower to Become a Vertical Solar Farm

Chicago’s iconic Willis Tower (formerly the Sears Tower) is set to become a massive solar electric plant with the installation of a pilot solar electric glass project. The high-profile project on the south side of the 56th floor will replace the windows with a new type of photovoltaic glass developed by Pythagoras Solar which preserves daylighting and views while reducing heat gain and producing the same energy as a conventional solar panel. The project could grow to 2 MW in size — which is comparable to a 10 acre field of solar panels — turning North America’s tallest building into a huge urban vertical solar farm.
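The "10 acre field" comparison works out to a modest power density, which can be sanity-checked with a quick division (illustrative arithmetic only, not figures from the project):

```python
# Average capacity per square metre implied by "2 MW is comparable
# to a 10 acre field of solar panels".
ACRE_M2 = 4046.86                 # square metres per acre
field_area = 10 * ACRE_M2         # ~40,469 m^2
watts_per_m2 = 2e6 / field_area
print(round(watts_per_m2, 1))     # ~49.4 W of capacity per m^2
```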

The project is a collaboration between the tower’s owner and the manufacturer to prove the viability of the building-integrated photovoltaic (BIPV) system, which will also save energy by reducing heat gain and cooling costs. The new windows, dubbed high power density photovoltaic glass units (PVGU), are a clever hybrid technology that lays a typical monocrystalline silicon solar cell horizontally between two layers of glass to form an individual tile. An internal plastic reflective prism directs angled sunlight onto the solar cells but allows diffuse daylight and horizontal light through. Think of it as a louvered shade that allows for views but cuts out the harsh direct sun.

The manufacturer claims that the vertically integrated solar cells will produce the same amount of energy as normal rooftop-mounted solar panels. This is great news for cities that have precious little rooftop space and towering walls of glass. The product is also a potential breakthrough in energy efficiency in glass towers, where solar heat gain is the bane of energy-efficient design.


Inhabitat

TStzmmalaysia
post Mar 22 2011, 12:06 PM



BIOTECHNOLOGY


Molecular Determinant of Cell Identity Discovered

If a big bunch of your brain cells suddenly went rogue and decided to become fat cells, it could cloud your decision-making capacity a bit. Fortunately, early in an organism's development, cells make firm and more-or-less permanent decisions about whether they will live their lives as, say, skin cells, brain cells or, well, fat cells.

Those decisions essentially boil down to which proteins, among all the possible candidates encoded in a cell's genes, the cell will tend to make under ordinary circumstances. But exactly how a cell chooses its default protein selections from an overwhelmingly diverse genetic menu is somewhat mysterious.

A new study from the Stanford University School of Medicine may help solve the mystery. The researchers discovered how a particular variety of the biomolecule RNA that had been thought to be largely irrelevant to cellular processes plays a dynamic regulatory role in protein selection. In unraveling this molecular mechanism, the study also offers enticing clues as to how certain cancers may arise.

Howard Chang, MD, PhD, associate professor of dermatology, is the senior author of the study, to be published online March 20 in Nature.

"All the cells in your body have the same genes, but they don't all make the same proteins," said Chang, who is also a Howard Hughes Medical Institute Early Career Scientist. In this new study, Chang and his colleagues identified a novel action by a subset of RNA that reinforces cells' decisions about which combinations of their genes are to be active and which must stay silent.

RNA is a chemical lookalike of DNA -- the stuff our genes are made of -- that, according to standard textbooks, mainly functions as a messenger: a copy of a gene, made by a cell's gene-reading machinery, that can float away from the chromosomes where genes reside to other places in the cell where proteins are made. There the messenger-RNA molecule serves as an instruction manual for the production of proteins.

Scientists used to see RNA mostly as a stodgy servant of its kingly commander, DNA, in the protein-production process. But in recent decades scientists have learned of several ways RNA can influence the production of proteins besides merely conveying information from genes to a cell's protein-making apparatus.

In the Nature study, the researchers identified a novel regulatory role for a class of RNA molecules called lincRNA (for long intergenic noncoding RNA). A typical cell spawns as many as 10,000 distinct species of lincRNA molecules -- on a par with the number of conventional protein-coding genes -- but lincRNAs don't spell out recipes for making proteins. For years, many biochemists were skeptical that lincRNA played any important role in a cell and considered the molecules mere "noise," perhaps vestigial protein-coding genes that had mutated to become nonfunctional. Chang's group has been instrumental in proving that lincRNAs can play a critical regulatory role: determining what proteins a cell produces and, thereby, what identity it assumes.

To do so, Chang and his associates turned to human fibroblasts, which are easily grown in culture. Fibroblasts are cells that lie just beneath the skin and secrete factors determining skin cells' local character. "You'll never see hair growing out of someone's palm," Chang said. The factors that fibroblasts secrete vary depending on where in the body they're located.

Remarkably, cultured fibroblasts from different parts of the body somehow remember their sense of where they belong, continuing to maintain characteristic patterns of genes that are "on" or "off" even over dozens of generations of cell division in a petri dish. "Why is that?" Chang asked.

A related question intrigued the study's first author, Kevin Wang, MD, PhD, an instructor of dermatology and a postdoctoral scholar in Chang's lab. "I was initially interested in conditions like psoriasis, a skin disease whose manifestations in the body are region-specific," he said. "Cells that have the same DNA, that look the same under a microscope -- what made them act differently?"

Chang has been using cultured fibroblasts as workhorse cells to help answer these questions. In a study published last year in Science, his group showed that one species of lincRNA, which he and his labmates had discovered and named HOTAIR, acts quite differently from your standard mRNA molecule: It contorts into a kind of adapter plug and then latches onto massive protein complexes, which have the ability to silence genes. Once hooked up to such complexes, HOTAIR shuttles them to particular spots along a chromosome -- "positional identity" genes. Defects in these genes, first identified in fruit flies, can result in bizarre outcomes such as a fly with legs growing out of its head, instead of antennae. Particular on/off patterns of a cell's positional-identity genes lead the cell to behave in a characteristic way (palm versus scalp, for example).

In a nutshell, HOTAIR locks cells' positional identities into place by marking key genes with the biochemical equivalent of "gone fishing" signs, so that they remain closed for business.

The new study, in contrast, demonstrates how another lincRNA, dubbed HOTTIP, grabs onto an opposing type of protein complex, which marks similar positional-identity genes as "open for business." The researchers observed that this complex wheels into action once HOTTIP links to it, and then biochemically fixes cell-position-appropriate genes in the "on" position.
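The contrast between the two lincRNAs can be pictured as simple bookkeeping over gene states. Everything below (gene names, function name, the mapping itself) is an invented toy illustration, not real biochemistry:

```python
# Toy model of the lincRNA-guided chromatin marking described above.
# Gene names and states are illustrative placeholders, not real loci.

def apply_lincRNA(gene_states, targets, mark):
    """Fix each target gene's state to the given mark ('off' for a
    HOTAIR-like silencer, 'on' for a HOTTIP-like activator)."""
    for gene in targets:
        gene_states[gene] = mark
    return gene_states

genes = {"posA": "undecided", "posB": "undecided", "posC": "undecided"}
apply_lincRNA(genes, ["posA"], "off")          # HOTAIR-like: "gone fishing"
apply_lincRNA(genes, ["posB", "posC"], "on")   # HOTTIP-like: "open for business"
print(genes)
```

The point of the toy is only that both molecules write a persistent mark; they differ in which mark they write.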

An ability to act as a mute button for protein production has been demonstrated for other RNA types besides lincRNA. But, said Chang, HOTTIP is the first example of an RNA molecule that creates a memory of gene activation rather than gene silencing. "When we experimentally impeded HOTTIP activity, fibroblasts that were supposed to express certain positional-identity genes didn't," he said.

Interestingly, the particular genes that HOTTIP caused to retain a switched-on status were fairly remote from one another along the stretch of chromosome where they reside. To learn more about how this works, Chang, Wang and their Stanford colleagues teamed up with a group at the University of Massachusetts Medical School, in Worcester, whose research focuses on the three-dimensional organization of genomes.

What they learned from this holds implications for how some cancers could get started. The investigators found that DNA can form complicated looping structures that bring genes distant from one another on a chromosome, or on entirely different chromosomes, physically close. This lets HOTTIP and the protein complex it's linked to efficiently mark appropriate genes as "open for business."

But it could also lead to things going awry, possibly triggering certain cancers. Biochemical interactions at close range among these ordinarily distant genes can cause their fusion -- or even an exchange in their positions -- and resulting faulty protein production characteristic of a number of cancers, Chang said.

The study was funded by the California Institute for Regenerative Medicine, the National Institutes of Health, the Scleroderma Research and W.M. Keck foundations and the Howard Hughes Medical Institute. Other Stanford co-authors are Joanna Wysocka, PhD, assistant professor of chemical and systems biology and of developmental biology; Jill Helms, PhD, DDS, professor of surgery; Rajnish Gupta, MD, PhD, clinical assistant professor of dermatology; Bo Liu PhD, a research associate in Helms' laboratory; medical and graduate student Yul Yang; graduate student Ryan Corces-Zimmerman; medical student Ryan Flynn; and research assistant Angeline Protacio. In addition to the team at the University of Massachusetts, the study also involved a researcher at the University of Michigan.

ScienceDaily

TStzmmalaysia
post Mar 22 2011, 12:10 PM



NANOTECHNOLOGY


Templated growth technique produces graphene nanoribbons with metallic properties

Georgia Tech graduate students Yike Hu and John Hankinson observe a high-temperature furnace used to produce epitaxial graphene on a silicon carbide wafer. A new "templated growth" technique allows fabrication of nanoribbons with smooth edges and high conductivity. Credit: Georgia Tech Photo: Gary Meek

A new "templated growth" technique for fabricating nanoribbons of epitaxial graphene has produced structures just 15 to 40 nanometers wide that conduct current with almost no resistance. These structures could address the challenge of connecting graphene devices made with conventional architectures – and set the stage for a new generation of devices that take advantage of the quantum properties of electrons.

"We can now make very narrow, conductive nanoribbons that have quantum ballistic properties," said Walt de Heer, a professor in the School of Physics at the Georgia Institute of Technology. "These narrow ribbons become almost like a perfect metal. Electrons can move through them without scattering, just like they do in carbon nanotubes."

De Heer was scheduled to discuss recent results of this graphene growth process March 21st at the American Physical Society's March 2011 Meeting in Dallas. The research was sponsored by the National Science Foundation-supported Materials Research Science and Engineering Center (MRSEC).

First reported Oct. 3 in the advance online edition of the journal Nature Nanotechnology, the new fabrication technique allows production of epitaxial graphene structures with smooth edges. Earlier fabrication techniques that used electron beams to cut graphene sheets produced nanoribbon structures with rough edges that scattered electrons, causing interference. The resulting nanoribbons had properties more like insulators than conductors.

"In our templated growth approach, we have essentially eliminated the edges that take away from the desirable properties of graphene," de Heer explained. "The edges of the epitaxial graphene merge into the silicon carbide, producing properties that are really quite interesting."

The "templated growth" technique begins with etching patterns into the silicon carbide surfaces on which epitaxial graphene is grown. The patterns serve as templates directing the growth of graphene structures, allowing the formation of nanoribbons and other structures of specific widths and shapes without the use of cutting techniques that produce the rough edges.

In creating these graphene nanostructures, de Heer and his research team first use conventional microelectronics techniques to etch tiny "steps" – or contours – into a silicon carbide wafer whose surface has been made extremely flat. They then heat the contoured wafer to approximately 1,500 degrees Celsius, which initiates melting that polishes any rough edges left by the etching process.

Established techniques are then used for growing graphene from silicon carbide by driving off the silicon atoms from the surface. Instead of producing a consistent layer of graphene across the entire surface of the wafer, however, the researchers limit the heating time so that graphene grows only on portions of the contours.

The width of the resulting nanoribbons is proportional to the depth of the contours, providing a mechanism for precisely controlling the nanoribbon structures. To form complex structures, multiple etching steps can be carried out to create complex templates.
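The stated proportionality between contour depth and ribbon width can be sketched as a one-line relation; the constant k below is a hypothetical placeholder, since no calibration value is given in the article:

```python
# Nanoribbon width assumed proportional to etched step depth.
# k is a made-up constant for illustration only.
def ribbon_width_nm(step_depth_nm, k=1.5):
    return k * step_depth_nm

# With this assumed k, step depths of 10-27 nm would span the
# reported 15-40 nm ribbon widths.
widths = [ribbon_width_nm(d) for d in (10, 20, 27)]
print(widths)  # [15.0, 30.0, 40.5]
```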

"This technique allows us to avoid the complicated e-beam lithography steps that people have been using to create structures in epitaxial graphene," de Heer noted. "We are seeing very good properties that show these structures can be used for real electronic applications."

Since publication of the Nature Nanotechnology paper, de Heer's team has been refining its technique. "We have taken this to an extreme – the cleanest and narrowest ribbons we can make," he said. "We expect to be able to do everything we need with the size ribbons that we are able to make right now, though we probably could reduce the width to 10 nanometers or less."

While the Georgia Tech team is continuing to develop high-frequency transistors – perhaps even at the terahertz range – its primary effort now focuses on developing quantum devices, de Heer said. Such devices were envisioned in the patents Georgia Tech holds on various epitaxial graphene processes.

"This means that the way we will be doing graphene electronics will be different," he explained. "We will not be following the model of using standard field-effect transistors (FETs), but will pursue devices that use ballistic conductors and quantum interference. We are headed straight into using the electron wave effects in graphene."

Taking advantage of the wave properties will allow electrons to be manipulated with techniques similar to those used by optical engineers. For instance, switching may be carried out using interference effects – separating beams of electrons and then recombining them in opposite phases to extinguish the signals.
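The switching idea (recombining split beams in or out of phase) is ordinary wave superposition, which a few lines of NumPy can illustrate; this is a generic wave sketch, not a model of any actual device:

```python
import numpy as np

# Split a wave into two half-amplitude arms, then recombine either
# in phase (signal passes) or with a pi phase shift (extinguished).
t = np.linspace(0, 1, 1000)
beam = 0.5 * np.cos(2 * np.pi * 5 * t)       # half-amplitude in each arm

on_state = beam + beam                                        # in phase
off_state = beam + 0.5 * np.cos(2 * np.pi * 5 * t + np.pi)    # opposite phase

print(np.max(np.abs(on_state)))   # ~1.0: full signal
print(np.max(np.abs(off_state)))  # ~0: extinguished
```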

Quantum devices would be smaller than conventional transistors and operate at lower power. Because of its ability to transport electrons with virtually no resistance, epitaxial graphene may be the ideal material for such devices, de Heer said.

"Using the quantum properties of electrons rather than the standard charged-particle properties means opening up new ways of looking at electronics," he predicted. "This is probably the way that electronics will evolve, and it appears that graphene is the ideal material for making this transition."

De Heer's research team hopes to demonstrate a rudimentary switch operating on the quantum interference principle within a year.

Epitaxial graphene may be the basis for a new generation of high-performance devices that will take advantage of the material's unique properties in applications where higher costs can be justified. Silicon, today's electronic material of choice, will continue to be used in applications where high performance is not required, de Heer said.

"This is an important step in the process," he added. "There are going to be a lot of surprises as we move into these quantum devices and find out how they work. We have good reason to believe that this can be the basis for a new generation of transistors based on quantum interference."

PhysOrg

TStzmmalaysia
post Mar 22 2011, 12:13 PM



BIOTECHNOLOGY


Scientists grow personalized collections of intestinal microbes

Each of us carries a unique collection of trillions of friendly microbes in our intestines that helps break down food our bodies otherwise couldn't digest.

This relationship between humans and their microbes is generally a healthy one, but changes to the mix of microbes in the digestive tract are suspected to play a role in obesity, malnutrition, Crohn's disease and other ailments.

Now, scientists at Washington University School of Medicine in St. Louis show they can grow and manipulate personalized collections of human intestinal microbes in the laboratory and pluck out particular microbes of interest.

The research sets the stage for identifying new probiotics and evaluating in preclinical trials whether microbe transplants can restore the natural balance of intestinal bacteria in "sick" microbial communities.

The research, by Jeffrey I. Gordon, MD, the Dr. Robert J. Glaser Distinguished University Professor and director of the Center for Genome Sciences & Systems Biology, and his team is reported online March 21 in the early online edition of the Proceedings of the National Academy of Sciences.

"This research helps set up a discovery pipeline in which we can deliberately manipulate collections of human intestinal microbes from people of different ages and cultures who are either healthy or sick," says Gordon, whose research first established a possible link between obesity and other facets of nutritional status and the mix of microbes that inhabit the intestine. "This gives us the opportunity to identify new groups of microbes that may be extremely beneficial in various therapeutic settings."

Researchers have grown bacterial microbes in the laboratory before, but until recently there's been no reliable way to know whether communities captured in a Petri dish mirror the extensive bacterial collections that exist in particular habitats of the body, such as the intestine.

"There are so many types of bacteria that live in different parts of our bodies, as well as substantial differences in these collections from person to person, that most scientists have thought we're probably missing a lot of the richness of microbial communities when we try to grow them in the laboratory," Gordon says. "But we found that the ability to successfully grow collections of gut microbes is much greater than had been expected."

For the study, the researchers obtained stool samples from two unrelated people. A portion of each sample was grown in the laboratory under strict "anaerobic" conditions because gut microbes live in an environment that lacks oxygen.

Then, they used the latest DNA sequencing technology to sequence a gene found in all microbes. This gene, 16S rDNA, functions as a barcode of life to determine "who" is there and can be used to inventory the various species present in a microbial community.
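The inventory step described here amounts to tallying barcode reads; a minimal sketch, with invented placeholder reads standing in for real 16S sequences:

```python
from collections import Counter

# Each sequencing read carries a "barcode" identifying its source taxon;
# tallying reads estimates the community's composition. The reads below
# are invented placeholders.
reads = ["Bacteroides", "Firmicutes", "Bacteroides", "Bacteroides",
         "Actinobacteria", "Firmicutes"]

counts = Counter(reads)
total = sum(counts.values())
abundances = {taxon: n / total for taxon, n in counts.items()}
print(abundances)
```

Real pipelines cluster raw sequences into taxa before counting, but the bookkeeping is the same.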

In all, they discovered that most of the different groups of intestinal bacteria found in an individual also were present in their corresponding bacterial collections that were grown, or cultured, in the laboratory.

"We were able to capture a remarkable proportion of the diversity of each person's intestinal bacteria in the samples we grew in the laboratory," says first author Andrew Goodman, PhD, a former postdoctoral student in Gordon's lab who is now on the faculty at Yale University.

The researchers then transplanted collections of microbial communities from the cultured and uncultured samples into the intestinal tracts of formerly germ-free mice. The mice, in essence, acquired a collection of gut microbes that mimicked the community in the original human donor.

By analyzing these "humanized" mice, the researchers demonstrated that both cultured and uncultured gut microbial communities from the same person behaved in the same manner when the mice were switched from their typical diet — a low-fat, plant-based mouse chow — to a standard western diet that is high in fat and sugar. Some species became more dominant and others less so, but the changes were virtually identical, regardless of whether the original sample was cultured or not.
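The cultured-versus-uncultured comparison boils down to checking that each taxon shifts in the same direction in both communities after the diet switch. The abundance numbers below are invented for illustration:

```python
# Relative abundances per taxon before/after the diet switch, for the
# cultured and uncultured versions of one donor's community (made-up data).
before = {"cultured":   {"A": 0.50, "B": 0.30, "C": 0.20},
          "uncultured": {"A": 0.48, "B": 0.32, "C": 0.20}}
after  = {"cultured":   {"A": 0.65, "B": 0.15, "C": 0.20},
          "uncultured": {"A": 0.63, "B": 0.17, "C": 0.20}}

def shifts(community):
    """Per-taxon change in relative abundance after the diet switch."""
    return {t: after[community][t] - before[community][t] for t in before[community]}

cultured, uncultured = shifts("cultured"), shifts("uncultured")
agree = all((cultured[t] > 0) == (uncultured[t] > 0) for t in cultured)
print(agree)
```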

The researchers also demonstrated they could split apart an entire community of cultured intestinal microbes and create a "personalized" library of bacterial species.

Microbes that react strongly to changes in diet or exposure to antibiotics, for example, can be retrieved from these libraries and their genomes can be sequenced to help understand why they respond as they do. Then, these microbes can be reunited with other members of a microbial community in germ-free mice to create more simplified models of human gut communities.

Gordon envisions that this approach makes it possible to obtain personalized microbial communities from people around the world who consume different diets and from individuals who are obese or malnourished or who have Crohn's or other diseases.

"This gives us the ability to test the contributions of specific microbes or groups of microbes and their influence on a person's health," Gordon explains. "One central question we hope to answer is how much of a person's overall nutritional status can be ascribed to their gut microbes and whether nutritional status can be improved by therapeutic interventions directed to gut microbial communities."

More information: Goodman AL, Kallstrom G, Faith JJ, Reyes A, Moore A, Dantas G, Gordon JI. Extensive personal human gut microbiota culture collections characterized and manipulated in gnotobiotic mice. Proceedings of the National Academy of Sciences Early Edition. March 21, 2011.

PhysOrg

TStzmmalaysia
post Mar 23 2011, 10:50 AM



NANOTECHNOLOGY


First Lens to Produce Nanometre Images With Visible light

When it comes to microscopy, the smallest thing you can resolve is limited by the wavelength of light you're using. With visible light, the limit is about 200 nm, that's about the size of a measles virus.

Today, Allard Mosk at the University of Twente in The Netherlands and a few pals demonstrate an entirely new type of microscopy that doubles this resolution. To show that it works, they use 561 nm laser light to image gold nanoparticles just 97 nanometres across and say it should be possible to do even better.
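The ~200 nm figure quoted above is the classical Abbe diffraction limit, roughly the wavelength divided by twice the numerical aperture; the quick arithmetic below assumes NA close to 1:

```python
# Abbe diffraction limit: d = wavelength / (2 * NA).
def abbe_limit_nm(wavelength_nm, numerical_aperture=1.0):
    return wavelength_nm / (2 * numerical_aperture)

print(abbe_limit_nm(561))  # 280.5 nm for the 561 nm laser used here
print(abbe_limit_nm(400))  # 200.0 nm: the visible-light figure in the text
```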

But the most amazing thing about this technique is the lens it uses. Mosk and co achieve their trick using a flat piece of frosted glass (i.e., a transparent slab, etched on one side so that it completely scatters any light passing through). Here's how.

First, imagine what this frosted slab does to a plane wave of light passing through it. The flat wave hits the etched surface and scatters in all directions. Some of the light then continues into the glass although the wave front is no longer flat but dramatically distorted. The distorted light then emerges from the other (clear) side of the glass and now appears as a kind of random speckle.


Mosk and co record this distorted wavefront using a CCD chip and work out its shape.

Now imagine the set up again with a slight difference. This time, before the plane light wave hits the scattering surface, Mosk and co send it through a spatial light modulator which can distort the wave in any way they like.

Mosk and co could use the information from the first experiment to bend the incoming wave in exactly the right way to cancel out the distortion due to the scattering layer. Astronomers use this approach to correct light from stars that is distorted by the atmosphere.

But actually Mosk and buddies go further. They distort the incoming plane wave in such a way that the scattering layer causes it to come to a focus. The important point, however, is that this focal point is much tighter than can be achieved with an ordinary lens that relies on refraction alone. This is what allows the higher resolution.

Their equipment is so precise that it can control exactly where the focal point appears, and can even move it around. That allows them to build up a 2D image by scanning the focal point back and forth across the object under investigation.
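The general wavefront-shaping trick can be simulated numerically: model the scattering layer as a random complex transmission matrix, then drive it with the phase conjugate of one row so the scattered contributions add in phase at a chosen output spot. This is a toy of the general principle, not the authors' actual optical setup:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 256                                     # number of input/output modes
# Random complex transmission matrix standing in for the scattering layer.
T = (rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))) / np.sqrt(2 * n)

target = 100
flat_input = np.ones(n) / np.sqrt(n)        # unshaped plane wave
# Conjugate of the target row: every path arrives in phase at `target`.
shaped_input = np.conj(T[target]) / np.linalg.norm(T[target])

speckle = np.abs(T @ flat_input) ** 2       # random speckle pattern
focused = np.abs(T @ shaped_input) ** 2     # bright focus at `target`

enhancement = focused[target] / speckle.mean()
print(f"intensity enhancement at target: {enhancement:.0f}x")
```

With n modes, the focus intensity comes out on the order of n times the average speckle background, which is why more controlled modes give a tighter, brighter spot.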

Mosk and co demonstrate the technique by imaging gold nanoparticles just 97 nm across but say it should work down to 72 nm. "Our work is the first lens that provides a resolution in the nanometer regime at visible wavelengths," they say.

That's an elegant and powerful technique that could have wide application. The lens, which is actually a flat slab of gallium phosphide that has been acid-etched on one side, is cheap and easy to make. It is also free of the aberrations and distortions that plague conventional refraction-based designs.

It's easy to imagine it being rapidly adopted in many labs.

MIT Technology Review


TStzmmalaysia
post Mar 23 2011, 10:55 AM



BIOTECHNOLOGY


Bug Creates Butanol Direct from Cellulose

Butanol—a promising next-generation biofuel—packs more energy than ethanol and can be shipped via oil pipelines. But, like ethanol, biobutanol production is focused on using edible feedstocks such as beets, corn starch, and sugarcane.

Now James Liao, a biomolecular engineer at the University of California, Los Angeles, has developed two routes to liberate butanol from its dependence on food crops. Liao, who has a track record for commercializing innovative biofuels processes, has proven that microbes can produce the advanced biofuel directly from agricultural wastes, as well as from protein feedstocks such as algae.

Liao's demonstration of direct cellulose-to-butanol conversion could bring down the cost of cellulosic biofuels, which is currently prohibitively high. His protein-based process provides the biofuels field with entirely novel feedstock options.

While they're renewable, biofuels face attacks from environmental and food activists, and biobutanol is no exception: the first generation of biobutanol plants under development will run on corn-based sugar and starch. "Butanol has some technical benefits, but the real problem is the amount of food that goes into making a gallon of fuel," says Jeremy Martin, a senior scientist at the Union of Concerned Scientists, a Cambridge, Massachusetts-based advocacy group that is part of a broad coalition pushing Congress to end lucrative tax credits for corn ethanol.

Liao's innovations could end biobutanol's association with corn—an association that, ironically, is partly of his making. In 2008, Liao developed a microbial pathway for converting sugar into isobutanol, a high-octane isomer of butanol. That innovation is now being commercialized by Gevo, an Englewood, Colorado-based startup that Liao cofounded. Gevo raised $107 million in an IPO last month to support its plans to retrofit corn ethanol plants to produce isobutanol instead.

Plans for a shift to biofuels production from biomass feedstocks such as switchgrass, corn stalks, and sugarcane bagasse (or plant residue) are, meanwhile, moving slowly because of higher costs. The U.S. Environmental Protection Agency mandated use of just 6.6 million gallons of cellulosic ethanol this year—less than 3 percent of the 250-million-gallon goal set by Congress four years ago. The holdup is from added processing steps required to break down these cellulosic feedstocks and thus generate sugars for fermentation; the processing boosts costs considerably, making production facilities difficult to finance.

Liao's direct cellulose-to-butanol process, developed in collaboration with researchers at Oak Ridge National Laboratory, promises to simplify things by expanding the capabilities of fermentation microbes. The key was adding Liao's sugar-to-isobutanol pathway to a microbe, Clostridium cellulolyticum, that likes chewing on biomass but does not normally make butanol. The microbe was originally isolated from composted grass, and two years ago, the U.S. Department of Energy's Joint Genome Institute completed a sequence of its genome.

The result of the genetic engineering, published this month in the journal Applied and Environmental Microbiology, is a single organism that takes in cellulose and cranks out isobutanol. Liao says the output and conversion rate are low, but says this "proof of principle" is likely the trickiest part of the development process. "The rest is relatively straightforward. Not trivial, but straightforward. It becomes a matter of funding and resources," says Liao.

The next step is to move the genetic modifications to a faster-growing variant of Clostridium or some other microbe. Liao bets the technology could be production-ready in as little as two years.

One speed bump that could slow things down is litigation over rights to use Liao's technology. Gevo is being sued for patent infringement by competitor Butamax Advanced Biofuels, a joint venture between BP and DuPont that, like Gevo, plans to convert corn-based ethanol plants to isobutanol. Butamax alleges that Gevo's use of genetic engineering to make butanol violates a broad U.S. patent issued to Butamax in December 2010.

Another obstacle is concern about the environmental impact of heavy biomass use. In January, the EPA issued a draft report to Congress on the environmental impacts from biofuels production. The report outlined several concerns with production of biomass-based fuels. It noted that using corn stover (the leaves and stalks left after harvest) to produce fuels, instead of plowing the stover back into farmlands, could result in soil degradation and choke streams and rivers with increased runoff. Environmental activists have raised concerns about the cultivation of marginal lands that have been set aside to boost biodiversity and provide protective barriers around water bodies.

Liao's demonstration of genetically engineered E. coli that can turn protein into isobutanol also provides a potential alternative to biomass feedstocks: fast-growing photosynthetic algae. Current R&D projects developing algae-based biofuels seek to convert algal-produced fats, which make up about a quarter of algal mass. Proteins, in contrast, make up roughly two-thirds.

It would be possible, says Liao, to create a recycling production system in which isobutanol-producing microbes are sustained by algal protein as well as industrial fermentation residues recovered from prior rounds of butanol production. Like algae, fermentation residues are composed largely of proteins.

"These results show the feasibility of using proteins for biorefineries," Liao and UCLA colleagues wrote this month in the journal Nature Biotechnology.

Liao says protein-fed biorefineries cranking out isobutanol are probably five to 10 years from realization, so cellulosic isobutanol is likely to come first. He acknowledges that algae-based protein feedstocks may, like cellulosic biomass, turn out to have unforeseen costs. But one thing is certain, says Liao: "They're certainly much more sustainable than petroleum or coal or sugar."

MIT Technology Review

TStzmmalaysia
post Mar 23 2011, 10:57 AM



NANOTECHNOLOGY


Cheap catalyst made easy

Catalysts made of carbon nanotubes dipped in a polymer solution equal the energy output and otherwise outperform platinum catalysts in fuel cells, a team of Case Western Reserve University engineers has found.

The researchers are certain that they'll be able to boost the power output and maintain the other advantages by matching the best nanotube layout and type of polymer.

But already they've proved the simple technique can knock down one of the major roadblocks to fuel cell use: cost.

Platinum, which represents at least a quarter of the cost of fuel cells, currently sells for about $65,000 per kilogram. These researchers say their activated carbon nanotubes cost about $100 per kilogram.
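The quoted prices make the scale of the saving easy to check; the baseline total cost below is arbitrary, used only to show the effect of the "at least a quarter" figure:

```python
# Cost comparison from the figures quoted above: platinum at ~$65,000/kg
# versus activated nanotubes at ~$100/kg.
pt_per_kg, cnt_per_kg = 65_000, 100
catalyst_cost_ratio = pt_per_kg / cnt_per_kg
print(f"catalyst is ~{catalyst_cost_ratio:.0f}x cheaper per kg")

# If platinum is 25% of a fuel cell's cost, the swap nearly removes
# that whole quarter (baseline total cost is arbitrary):
total = 100.0
new_total = total * 0.75 + total * 0.25 * (cnt_per_kg / pt_per_kg)
print(f"remaining cost: {new_total:.1f}% of baseline")
```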

Their work is published in the online edition of Journal of the American Chemical Society at http://pubs.acs.org/doi/full/10.1021/ja1112904.

"This is a breakthrough," said Liming Dai, a professor of chemical engineering and the research team leader.

Dai and research associates Shuangyin Wang and Dingshan Yu found that by simply soaking carbon nanotubes in a water solution of the polymer poly(diallyldimethylammonium) chloride for a couple of hours, the polymer coats the nanotube surface and pulls an electron partially from the carbon, creating a net positive charge.

They placed the nanotubes on the cathode of an alkaline fuel cell. There, the charged material acts as a catalyst for the oxygen-reduction reaction that produces electricity while electrochemically combining hydrogen and oxygen.

In testing, the fuel cell produced as much power as an identical cell using a platinum catalyst.

But the activated nanotubes last longer and are more stable, the researchers said. Unlike platinum, the carbon-based catalyst doesn't lose catalytic activity (and therefore efficiency) over time, isn't fouled by carbon monoxide poisoning, and is free from the crossover effect with methanol. Methanol, a liquid fuel that's easier to store and transport than hydrogen, reduces the activity of a platinum catalyst when the fuel crosses over from the anode to the cathode in a fuel cell.

The new process builds on the Dai lab's earlier work using nitrogen-doped carbon nanotubes as a catalyst. In that process, nitrogen, which was chemically bonded to the carbon, pulled electrons partially from the carbon to create a charge. Testing showed the doped tubes tripled the energy output of platinum.

Dai said the new process is far simpler and cheaper than using nitrogen-doped carbon nanotubes and he's confident his lab will increase the energy output as well. "We have not optimized the system yet."

EurekAlert

TStzmmalaysia
post Mar 23 2011, 11:00 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


ROBOTICS

Attached Image

Voice command-based robot feeding arm unveiled

Eating a good meal is one of the few things in life that is both absolutely necessary, and extremely pleasurable at the same time. But what would you do if you could not pick up the knife and fork to eat with? You would have to rely on a caregiver to feed you. Up until now that caregiver has been a human, but what if it could be a robot?

No, I'm not playing with you. Someone has developed a feeding robot. That someone is an undergraduate student named Isao Wakabayashi, who studies at Chukyo University in Japan.

The robot works simply enough. It detects the food on your plate, and feeds you what you want to eat. How does it know what you want to eat? Simple, you tell it. The robot has the ability to understand a limited set of voice commands. So, if you tell it to feed you the broccoli, it will pick up the green stalk and bring it right to your mouth.

The body of the robot was made from a Rascal robot set by Robix, but Isao Wakabayashi wrote the image processing software that allows the system to tell meatloaf from pudding, and this is the innovation that really sets this feeding bot apart. Companies such as Japan's Secom have been selling robots as feeding assistants for the differently abled for several years now, but the voice system will help make the robots more user-friendly, as most current models cannot be used independently unless the user can operate a joystick.



PhysOrg


TStzmmalaysia
post Mar 23 2011, 11:08 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


APPLIED SCIENCES

Attached Image

Portable solar device creates potable water

By harnessing the power of the sun, a Monash University graduate has designed a simple, sustainable and affordable water-purification device, which has the potential to help eradicate disease and save lives.

The Solarball, developed as Mr Jonathan Liow’s final year project during his Bachelor of Industrial Design, can produce up to three litres of clean water every day. The spherical unit absorbs sunlight and causes dirty water contained inside to evaporate. As evaporation occurs, contaminants are separated from the water, generating drinkable condensation. The condensation is collected and stored, ready for drinking.
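
As a rough sanity check on the quoted throughput, a back-of-envelope evaporation budget works out as follows; the latent heat and insolation figures below are textbook assumptions, not values from the article:

```python
# Rough energy budget for evaporating 3 L of water per day. The latent heat
# and insolation values are standard textbook figures (assumptions, not
# numbers from the article).
litres_per_day = 3.0
latent_heat_mj_per_kg = 2.26        # latent heat of vaporization of water
insolation_mj_per_m2_day = 18.0     # ~5 kWh/m2/day, typical tropical sun

energy_needed = litres_per_day * latent_heat_mj_per_kg   # MJ per day
min_collector_area = energy_needed / insolation_mj_per_m2_day
print(f"Energy to evaporate: {energy_needed:.1f} MJ/day")
print(f"Ideal collector area: {min_collector_area:.2f} m^2 (at 100% efficiency)")
```

The ideal area comes out well under half a square metre, which is consistent with a small spherical unit left in direct tropical sun, though real efficiency losses would push the required size up.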

Liow’s design was driven by a need to help the 900 million people around the world who lack access to safe drinking water. Over two million children die annually from preventable causes, triggered largely by contaminated water. It is an increasing problem in developing nations due to rapid urbanisation and population growth.

‘After visiting Cambodia in 2008, and seeing the immense lack of everyday products we take for granted, I was inspired to use my design skills to help others,’ Mr Liow said.

Mr Liow’s simple but effective design is user-friendly and durable, with a weather-resistant construction, making it well suited to people in hot, wet, tropical climates with limited access to resources.

‘The challenge was coming up with a way to make the device more efficient than other products available, without making it too complicated, expensive, or technical,’ Mr Liow said.

Mr Liow and a working prototype of his Solarball were featured on ABC1’s ‘The New Inventors’. The product has been named as a finalist in the 2011 Australian Design Awards - James Dyson Award. It will also be exhibited at the Milan International Design Fair (Salone Internazionale del Mobile) in April 2011.

PhysOrg

Provided by Monash University (news : web)

TStzmmalaysia
post Mar 23 2011, 11:10 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


APPLIED SCIENCES

Attached Image

Solar Panel Factory in Greece is Powered by the Sun

Henning Larsen Architects take the ugly grind out of industry with this handsome solar panel factory in Greece. The Solar Panel Factory is located in Kilkis - an industrial area north of Thessaloniki in Greece. The building comprises two production lines, input and output warehouses, technical support areas as well as administration and staff areas.

The factory’s dual purpose as a production facility and office space required a comfortable, practical, and sustainable design. Like other Henning Larsen projects, such as the Campus Roskilde and EnergyFlex house, the 100,000 square meter facility rises to the challenge by marrying passive design and modern technology to create a factory that is almost entirely self-sufficient.

Photovoltaic panels installed on the roof double as sun protection and energy generators, while solar gain is maximized by the building’s south-facing orientation. These PVs generate the majority of the factory’s required energy.

The construction materials and coloring selected to mitigate energy loss also improve the building’s aesthetic appeal. Unlike some factories that are dull and mechanized, the solar panel production and administrative areas are bright and welcoming thanks to a series of large skylights that let in plenty of natural lighting.

Design

The overall architectural design is based on a ‘form follows function’ concept. The clear and rational design follows the line of the production process from the input preparation area through the production zone where the solar panels are assembled to the output and storage warehouse where the prepacked panels are collected by lorries.

The building generates the majority of energy used itself by means of photovoltaic panels installed on the roof as part of the sun protection of the skylights.

Energy losses have been reduced to a minimum by means of a rational selection of building materials and colouring. The building faces southwards in order to achieve maximum energy generation from sunrise to sunset. In the production and administration areas of the building, the roof features large skylights that contribute to creating a bright and comfortable indoor environment.

With its winning combination of large side windows, tall ceilings, and stacked appearance, Henning Larsen’s latest industrial facility is both stunning and sustainable.

Inhabitat

Henning Larsen

TStzmmalaysia
post Mar 23 2011, 11:12 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


APPLIED SCIENCES

Attached Image

Princeton engineers make breakthrough in ultra-sensitive sensor technology

Princeton researchers have invented an extremely sensitive sensor that opens up new ways to detect a wide range of substances, such as tell-tale signs of cancer.

The sensor, which is the most sensitive of its kind to date, relies on a completely new architecture and fabrication technique developed by the Princeton researchers. The device boosts faint signals generated by the scattering of laser light from a material placed on it, allowing the identification of various substances based on the color of light they reflect. The sample could be as small as a single molecule.

The technology is a major advance in a decades-long search to identify materials using Raman scattering, a phenomenon discovered in the 1920s by the Indian physicist Chandrasekhara Raman, in which light reflecting off an object carries a signature of its molecular composition and structure.

"Raman scattering has enormous potential in biological and chemical sensing, and could have many applications in industry, medicine and other fields," said Stephen Y. Chou, the professor of electrical engineering who led the research team. "But current Raman sensors are so weak that their use has been very limited outside of research. We've developed a way to significantly enhance the signal over the entire sensor and that could change the landscape of how Raman scattering can be used."

Chou and his collaborators, electrical engineering graduate students, Wen-Di Li and Fei Ding, and post-doctoral fellow, Jonathan Hu, published a paper on their innovation in February in the journal Optics Express. The research was funded by the Defense Advance Research Projects Agency.

In Raman scattering, a beam of pure one-color light is focused on a target, but the reflected light from the object contains two extra colors of light. The frequencies of these extra colors are unique to the molecular make-up of the substance, providing a potentially powerful method to determine the identity of the substance, analogous to the way a fingerprint or DNA signature helps identify a person.
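
Those two extra colors (the Stokes and anti-Stokes lines) sit at fixed frequency offsets from the laser line; a quick numerical sketch shows the idea. The 532 nm laser wavelength and 1000 cm^-1 vibrational shift below are illustrative values, not figures from the article:

```python
# Stokes/anti-Stokes wavelengths for an illustrative Raman shift.
# The 532 nm laser line and 1000 cm^-1 shift are example values,
# not figures from the article.
laser_nm = 532.0
shift_cm1 = 1000.0                               # molecular vibrational shift

laser_cm1 = 1e7 / laser_nm                       # wavelength -> wavenumber
stokes_nm = 1e7 / (laser_cm1 - shift_cm1)        # red-shifted extra color
anti_stokes_nm = 1e7 / (laser_cm1 + shift_cm1)   # blue-shifted extra color
print(f"Stokes line: {stokes_nm:.1f} nm, anti-Stokes line: {anti_stokes_nm:.1f} nm")
```

A different molecule would have a different shift, moving both lines; that offset is the "fingerprint" the sensor reads out.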

Since Raman first discovered the phenomenon – a breakthrough that earned him the Nobel Prize – engineers have dreamed of using it in everyday devices to identify the molecular composition and structures of substances, but for many materials the strength of the extra colors of reflected light was too weak to be seen even with the most sophisticated laboratory equipment.

Researchers discovered in the 1970s that the Raman signals were much stronger if the substance to be identified is placed on a rough metal surface or tiny particles of gold or silver. The technique, known as surface enhanced Raman scattering (SERS), showed great promise, but even after four decades of research has proven difficult to put to practical use. The strong signals appeared only at a few random points on the sensor surface, making it difficult to predict where to measure the signal and resulting in a weak overall signal for such a sensor.

Abandoning the previous methods for designing and manufacturing the sensors, Chou and his colleagues developed a completely new SERS architecture: a chip studded with uniform rows of tiny pillars made of metals and insulators.

One secret of the Chou team's design is that their pillar arrays are fundamentally different from those explored by other researchers. Their structure has two key components: a cavity formed by metal on the top and at the base of each pillar; and metal particles of about 20 nanometers in diameter, known as plasmonic nanodots, on the pillar wall, with small gaps of about 2 nanometers between the metal components.

The small particles and gaps significantly boost the Raman signal. The cavities serve as antennae, trapping light from the laser so it passes the plasmonic nanodots multiple times to generate the Raman signal rather than only once. The cavities also enhance the outgoing Raman signal.

Chou's team named their new sensor "disk-coupled dots-on-pillar antenna-array" or D2PA, for short.

So far, the chip is a billion times (10^9) more sensitive than was possible without SERS boosting of Raman signals, and the sensor is uniformly sensitive, making it more reliable for use in sensing devices. Such sensitivity is several orders of magnitude higher than previously reported.

Already, researchers at the U.S. Naval Research Laboratory are experimenting with a less sensitive chip to explore whether the military could use the technology pioneered at Princeton for detecting chemicals, biological agents and explosives.

In addition to being far more sensitive than its predecessors, the Princeton chip can be manufactured inexpensively at large sizes and in large quantities. This is due to the easy-to-build nature of the sensor and a new combination of two powerful nanofabrication technologies: nanoimprint, a method that allows tiny structures to be produced in cookie-cutter fashion; and self-assembly, a technique where tiny particles form on their own. Chou's team has produced these sensors on 4-inch wafers (the basis of electronic chips) and can scale the fabrication to much larger wafer size.

"This is a very powerful method to identify molecules," Chou said. "The combination of a sensor that enhances signals far beyond what was previously possible, that's uniform in its sensitivity and that's easy to mass produce could change the landscape of sensor technology and what's possible with sensing."

EurekAlert
TStzmmalaysia
post Mar 23 2011, 11:16 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


ROBOTICS

Attached Image

3-D-Printed Ornithopter Insect Hovers, Flapping Delicate Wings

Manufacturing tiny, fragile wings is a delicate business, but using 3-D printing tech a team of Cornell roboticists have trimmed days off their usual production process by printing their fragile robots right on the workbench. That’s not just good news for ornithopter enthusiasts (surely they exist), but for aerospace engineers and entomologists as well.

Hovering ornithopters (flapping wing aircraft) usually mimic insects or something like a hummingbird, which means they need a unique set of wings that are delicate, lightweight, thin, and durable enough to stand up to high-speed flapping. Making these wings usually takes researchers a good deal of time and effort, making experimentation with new wing designs a lengthy and tedious process.

Leveraging advances in rapid prototyping tech, the Cornell team was able to cut that time--often up to a few days--to just a few minutes. Printing the translucently thin wings (constructed of a thin polyester film stretched on a carbon fiber frame) on a desktop printer allowed them not only to cut down on production time, but allows for much faster experimentation with different wing designs.

For the Cornell team, that resulted in a functioning 3.89-gram ornithopter capable of hovering for 85 seconds, the lightest and longest-duration model yet. But the ability to experiment quickly and precisely with various wing shapes and constructions should also allow researchers to more closely mimic real insect wing designs and study the lift dynamics powering a variety of natural flying organisms.

PopSci



TStzmmalaysia
post Mar 24 2011, 09:34 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

Physicists create heaviest form of antimatter ever seen

A newly created form of antimatter is the heaviest and most complex anti-thing ever seen. Anti-helium nuclei, each containing two anti-protons and two anti-neutrons, have been created and detected at the Relativistic Heavy Ion Collider (RHIC) in Upton, New York.

Anti-particles have the opposite electrical charge to ordinary matter particles (anti-neutrons, which are electrically neutral, are made up of antiquarks that have the opposite charge to their normal quark counterparts). They annihilate on contact with matter, making them notoriously tricky to find and work with. Until recently, the most complex unit of antimatter ever seen was the counterpart of the helium-3 nucleus, which contains two protons and one neutron.

But experiments at RHIC are changing that. RHIC collides heavy atomic nuclei such as lead and gold to form microscopic fireballs, where energy is so densely packed that many new particles can be created.

Last year RHIC announced the creation of a new variety of antimatter. Called the anti-hypertriton, it is made of one anti-proton, one anti-neutron and one unstable particle called an anti-lambda. The anti-hypertriton was then the heaviest antiparticle known, but the 18 anti-helium-4 nuclei seen at RHIC now take the record.



Anti-periodic table

"They have moved us up to the next element in the anti-periodic table," says Frank Close of the University of Oxford in the UK.

But he adds, "It doesn't take us nearer to the big question of why is the universe at large not full of antimatter?" Indeed, standard theories say that matter and antimatter were created in equal amounts in the universe's first instants, but for unknown reasons, matter prevailed.

An experiment called the Alpha Magnetic Spectrometer, due to launch to the International Space Station in April, will try to chip away at the problem. Anti-protons are known to occur naturally in small quantities among the high-energy particles called cosmic rays that hit Earth.

The AMS will search for heavier anti-particles. But if anti-helium is produced only rarely in collisions, as shown by RHIC, then the AMS should see no anti-helium. If it finds higher levels of anti-helium, that could bolster a theory that antimatter was not all destroyed in the early universe but merely separated in a different part of space, where it would not come into contact with matter.

The next heaviest anti-element, anti-lithium, could in theory form solid antimatter at room temperature – but it will be much harder to make. The RHIC team calculates that it will occur in their collisions less than one-millionth as often as anti-helium, putting it beyond the reach of today's colliders.
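
Taking the article's numbers at face value, the scarcity of anti-lithium is easy to see:

```python
# Using the article's figures: 18 anti-helium-4 nuclei were detected, and
# anti-lithium is expected less than one-millionth as often per collision.
anti_helium_seen = 18
relative_rate = 1e-6            # anti-lithium yield relative to anti-helium

expected_anti_lithium = anti_helium_seen * relative_rate
runs_for_one = 1 / expected_anti_lithium
print(f"Expected anti-lithium in the same dataset: {expected_anti_lithium:.1e}")
print(f"Equivalent runs needed to expect one nucleus: ~{runs_for_one:,.0f}")
```

With an expectation of roughly two in a hundred thousand, the same experiment would have to run tens of thousands of times over to expect a single anti-lithium nucleus, which is why the RHIC team puts it beyond today's colliders.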

NewScientist

Journal reference HERE




TStzmmalaysia
post Mar 24 2011, 09:38 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


NANOTECHNOLOGY

Attached Image

The 'coolest' semiconductor nanowires

Semiconductor nanowires are essential materials in the development of cheaper and more efficient solar cells, as well as batteries with higher storage capacity. Moreover, they are important building blocks in nanoelectronics. However, manufacturing semiconductor nanowires on an industrial scale is very expensive. The main reason for this is the high temperatures at which they are produced (600–900 °C), as well as the use of expensive catalysts, such as gold. Scientists at the Max Planck Institute for Intelligent Systems in Stuttgart, formerly the Max Planck Institute for Metals Research, have now been able to produce crystalline semiconductor nanowires at a much lower temperature (150 °C) while using inexpensive catalysts, such as aluminium. In this way, nanostructured semiconductors can even be deposited directly on heat-sensitive plastic substrates.

Nanowires made of semiconductors such as silicon (Si) or germanium (Ge) will be indispensable for many technical applications in the future. Until now, they have been manufactured using a process that was first described in 1964. The so-called vapour-liquid-solid (VLS) mechanism utilises tiny particles of metal catalysts as seeds for the growth of the nanowires. The metal seeds are deposited on a solid substrate, melted and exposed to a gas atmosphere containing silicon or germanium. The metal droplets will then take up semiconductor atoms from the gas until they are supersaturated, and the excess semiconductor material precipitates at the boundary with the substrate: a nanowire grows. In most cases, gold is used as a catalyst, since it can dissolve a lot of silicon or germanium when molten. The use of this expensive catalyst and the high processing temperature of 600 to 900 °C lead, however, to high production costs.

Materials scientists from Eric Mittemeijer’s department at the Max Planck Institute for Intelligent Systems have now discovered a method to produce semiconductor nanowires at a strikingly lower temperature of only 150 C, while using cheap catalysts like aluminium. Together with colleagues from the Stuttgart Center for Electron Microscopy, a research facility at the same Institute, the scientists have managed to observe nanowire growth at an atomic scale in real time.

To this end, the scientists prepared a bilayer of crystalline aluminium and amorphous silicon. The layer was produced in vacuum and at room temperature using thermal evaporation. Whereas the atoms are disordered in the amorphous silicon phase, they are arranged in an ordered crystalline lattice in the aluminium layer. In fact, the Al layer consists of billions of tiny aluminium crystals, each as small as about 50 nanometres. The crystal grains are in tight contact with each other. Their boundaries thus form a two-dimensional grain-boundary network within the aluminium layer.

Transmission electron micrograph (plan view) showing the formation of a silicon nanowire structure along the boundaries between adjacent aluminium crystals at 170 °C (red: silicon; blue/green: aluminium). Right: Scanning electron microscope image (at a tilt angle of 30 degrees) which shows the silicon nanowire pattern after removal of the aluminium by chemical etching.

Using analytical transmission electron microscopy, the scientists were able to directly observe that silicon atoms begin to flow from the silicon layer into the aluminium catalyst at a temperature as low as 120 °C. At such low temperatures, the aluminium catalyst is solid and cannot dissolve any silicon atoms. Microscopic investigations reveal that the silicon atoms are instead accommodated at the boundaries between the aluminium crystals. As more and more silicon atoms gather at the aluminium grain boundaries, they are restructured into tiny crystalline nanowires, as this reduces the total energy of the system. This produces a network of crystalline nanowires, the pattern of which is precisely determined by the aluminium grain-boundary network. Wires as thin as 15 nanometres can thus be produced.

Clearly, the growth mechanism of nanowires discovered by the materials scientists in Stuttgart is fundamentally different from the conventional VLS growth mechanism. Most strikingly, the new growth method does not require semiconductor solubility in the metal catalyst and can therefore be realized at low temperatures (150 °C), while using cheap catalysts like aluminium.

The major benefits of the new method are therefore that it does not require high substrate temperatures or expensive catalysts. In addition, materials scientists can tailor the size of the aluminium grains and thereby the form of the aluminium grain-boundary network, to produce the desired pattern of silicon nanowires. The Al catalyst can easily be removed through selective etching. Since aluminium films have been used in microelectronics for decades, their production and processing are widely established. Other catalysts may also be suitable for the method. Another advantage is that nanostructured silicon devices can be grown directly on most plastic substrates, even if they are heat-sensitive.

PhysOrg



TStzmmalaysia
post Mar 24 2011, 09:40 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

Silicon Chips Wired With Nerve Cells Could Enable New Brain/Machine Interfaces

It’s reminiscent of Cartman’s runaway Trapper Keeper notebook in that long-ago episode of South Park, but researchers at the University of Wisconsin-Madison may be scratching the surface of a new kind of brain/machine interface by creating computer chips that are wired together with living nerve cells.

A team there has found that mouse nerve cells will connect with each other across a network of tiny tubes threaded through a semiconductor material. It’s not exactly clear at this point how the nerve cells are functioning, but what is clear is that the cells seem to have an affinity for the tiny tubes, and that alone has some interesting implications.

To create the nerve-chip hybrid, the researchers created tubes of layered silicon and germanium that are large enough for the nerve cells’ tendrils to navigate but too small for the actual body of the cell to pass through. They then introduced nerve cells to the tubes and found that the cells will readily thread their tendrils through them--even through complex geometries like helical curves--to connect with each other physically.

What isn’t clear is whether or not the cells are actually communicating with each other the way they would naturally. Going forward, the team aims to get sensors into the chips to see exactly how they are interacting. But the fact that nerve cells will follow the tubes along a preset path designed by researchers hints at thrilling prospects.

For instance, nerve-electronic hybrid chips would make great places to test neurological drugs or to study the way nerve cells afflicted with disorders like Parkinson’s communicate. But even more tantalizing is the idea of a nerve-computer interface that would enable the kind of Skywalker-esque control of artificial limbs that is the holy grail of prosthetics research.

PopSci

[Discovery News]


TStzmmalaysia
post Mar 24 2011, 09:41 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


NANOTECHNOLOGY

Attached Image

Self-Strengthening Nanocomposite Created

Researchers at Rice University have created a synthetic material that gets stronger from repeated stress much like the body strengthens bones and muscles after repeated workouts.

Work by the Rice lab of Pulickel Ajayan, professor in mechanical engineering and materials science and of chemistry, shows the potential of stiffening polymer-based nanocomposites with carbon nanotube fillers. The team reported its discovery this month in the journal ACS Nano.

The trick, it seems, lies in the complex, dynamic interface between nanostructures and polymers in carefully engineered nanocomposite materials.

Brent Carey, a graduate student in Ajayan's lab, found the interesting property while testing the high-cycle fatigue properties of a composite he made by infiltrating a forest of vertically aligned, multiwalled nanotubes with polydimethylsiloxane (PDMS), an inert, rubbery polymer. To his great surprise, repeatedly loading the material didn't seem to damage it at all. In fact, the stress made it stiffer.

Carey, whose research is sponsored by a NASA fellowship, used dynamic mechanical analysis (DMA) to test their material. He found that after an astounding 3.5 million compressions (five per second) over about a week's time, the stiffness of the composite had increased by 12 percent and showed the potential for even further improvement.
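
The quoted figures check out as a back-of-envelope:

```python
# Sanity check on the fatigue test figures quoted in the article.
compressions = 3_500_000
rate_hz = 5                       # five compressions per second

seconds = compressions / rate_hz  # 700,000 s of continuous cycling
days = seconds / 86_400
print(f"{days:.1f} days of continuous cycling")  # ~8.1 days, i.e. "about a week"
```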

"It took a bit of tweaking to get the instrument to do this," Carey said. "DMA generally assumes that your material isn't changing in any permanent way. In the early tests, the software kept telling me, 'I've damaged the sample!' as the stiffness increased. I also had to trick it with an unsolvable program loop to achieve the high number of cycles."

Materials scientists know that metals can strain-harden during repeated deformation, a result of the creation and jamming of defects -- known as dislocations -- in their crystalline lattice. Polymers, which are made of long, repeating chains of atoms, don't behave the same way.

The team is not sure precisely why their synthetic material behaves as it does. "We were able to rule out further cross-linking in the polymer as an explanation," Carey said. "The data shows that there's very little chemical interaction, if any, between the polymer and the nanotubes, and it seems that this fluid interface is evolving during stressing."

"The use of nanomaterials as a filler increases this interfacial area tremendously for the same amount of filler material added," Ajayan said. "Hence, the resulting interfacial effects are amplified as compared with conventional composites. "For engineered materials, people would love to have a composite like this," he said. "This work shows how nanomaterials in composites can be creatively used."

They also found one other truth about this unique phenomenon: Simply compressing the material didn't change its properties; only dynamic stress -- deforming it again and again -- made it stiffer.

Carey drew an analogy between their material and bones. "As long as you're regularly stressing a bone in the body, it will remain strong," he said. "For example, the bones in the racket arm of a tennis player are denser. Essentially, this is an adaptive effect our body uses to withstand the loads applied to it. Our material is similar in the sense that a static load on our composite doesn't cause a change. You have to dynamically stress it in order to improve it."

Cartilage may be a better comparison -- and possibly even a future candidate for nanocomposite replacement. "We can envision this response being attractive for developing artificial cartilage that can respond to the forces being applied to it but remains pliable in areas that are not being stressed," Carey said.

Both researchers noted this is the kind of basic research that asks more questions than it answers. While they can easily measure the material's bulk properties, it's an entirely different story to understand how the polymer and nanotubes interact at the nanoscale.

"People have been trying to address the question of how the polymer layer around a nanoparticle behaves," Ajayan said. "It's a very complicated problem. But fundamentally, it's important if you're an engineer of nanocomposites. "From that perspective, I think this is a beautiful result. It tells us that it's feasible to engineer interfaces that make the material do unconventional things."

ScienceDaily

TStzmmalaysia
post Mar 24 2011, 09:44 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

Does Belief in Free Will Lead to Action?

Free will may be an illusion. Yet we persist in believing we are the masters of our fates -- and that belief affects how we act. Think you determine the course of your life and you're likely to work harder toward your goals and feel better about yourself too. Think you don't, and you're likelier to behave in ways that fulfill that prophecy.

"Folk psychology tells us if you feel in control, you perform better," says Davide Rigoni, an experimental psychologist now at the University of Marseille. "What is crucial is that these effects are present at a very basic motor level, a deep level of brain activity."

Working with Marcel Brass and Simone Kuhn of the University of Gent and the University of Padova's Giuseppe Sartori, Rigoni showed that shaking people's belief in self-mastery impairs their brains' readiness to act, even before they're aware of the intention to move. The study is published in an upcoming issue of Psychological Science, a journal of the Association for Psychological Science.

To see how free-will beliefs affect pre-conscious aspects of motor control, the team observed a well-known brain marker of voluntary action: the negative electrical wave of "readiness potential," which first fires in preparation to move and then, milliseconds later, activates as the brain sends signals to the muscles. Because the first part is not conscious but is modulated by intention, the researchers thought its strength might reflect belief -- or disbelief -- in free will.

The study divided 30 men and women ages 18 to 24 into two groups. The experimental group read a text stating that scientists had discovered free will to be an illusion. The control group read about consciousness with no mention of free will. They were instructed to read carefully in preparation for a quiz.

Then the participants performed a "Libet task": pressing a button whenever and however many times they chose, while indicating on a screen the time they became aware of the intention to act. Meanwhile, an EEG recorded their brain activity.

Finally, participants answered questions assessing their beliefs in free will and determinism, both regarding people in general and themselves in particular.

The questionnaires showed the text worked: the first group's belief in their own self-determination was weaker than the control group's.

The same effect showed up in the Libet test. The no-free-will group's EEGs measured brain activity far lower than the control group's during that first, unconscious phase of readiness potential. Deep in the brain, the gumption to act flagged along with the belief in self-determination.

Impatient with the biologically deterministic bent of science -- "that genes and brains control us and we have no control" -- Rigoni was motivated by a more philosophical question: "Is it better to believe or not believe we are free? What if we all disbelieved in free will?" The study gives scientific support to his intuition that it is better to believe. "If we are not free," he says, "it makes no sense to put effort into actions and to be motivated."

ScienceDaily

TStzmmalaysia
post Mar 25 2011, 05:58 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


APPLIED SCIENCES

Attached Image

Memsys Creates a Solar-Powered Shipping Container Desalination System

Using desalination to provide water to large communities is a controversial topic. But desalination has uses where no one would dispute its benefits -- in a disaster area, for instance, it could be life-saving. But how does one go about getting a plant to an area in need of fresh water supplies?

Well, convert a shipping container. But how does a shipping-container desalination plant get the energy to run a process like reverse osmosis? One option: solar power. A company from Singapore called memsys has developed a new technology that can trim the costs of desalination and make it more mobile. They're using a process called vacuum multi-effect membrane distillation -- a mouthful to say, but promising for fresh water supplies.

The new process by memsys combines the two most popular forms of desal tech, thermal distillation and membrane distillation. The process includes a vacuum in which water is boiled at low temperatures, from 50 to 80 degrees Celsius, and the steam is passed through several membrane distillation stages at progressively lower temperatures and pressures. Energy is recovered during each step in order to power the next, making the entire process far more energy efficient.
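As a rough illustration of why chaining stages saves energy, here is a toy energy balance: each stage recycles a fraction of the latent heat it releases on condensation to drive evaporation in the next, cooler stage. All numbers are assumptions for illustration, not memsys specifications.

```python
# Toy model of multi-effect distillation (illustrative only).
LATENT_HEAT = 2326.0  # kJ per kg of steam near 60 degrees C (approximate)

def distillate_per_heat(stages, recovery=0.8, heat_input_kj=1000.0):
    """Fresh water produced (kg) from one external heat input.

    Each stage condenses steam and passes a fraction `recovery` of
    its latent heat on to evaporate water in the next stage.
    """
    heat = heat_input_kj
    total_kg = 0.0
    for _ in range(stages):
        total_kg += heat / LATENT_HEAT   # kg evaporated in this stage
        heat *= recovery                 # latent heat recycled onward
    return total_kg

# More stages -> more fresh water from the same external heat input.
```

With the assumed 80% recovery, four stages yield roughly three times the fresh water of a single-stage still from the same heat input.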

"We have the first modular thermal separation process," Götz Lange, managing director of memsys, said during an interview with The New York Times. "We didn't change the thermal technology itself -- you can't change physics -- we are just the first to put this advanced technology of thermal separation into a very tiny, cheap and reliable modular concept."

The article reports, "After seven years of development, a small demonstration unit, powered by solar energy for extra sustainability, was installed last year at Marina Barrage, a dam completed in 2008 across the mouth of Marina Bay that has converted what is left of the old Singapore harbor after massive land reclamation into a freshwater reservoir."

Designed for disaster relief, the shipping container unit can run entirely on solar power and can produce 265 gallons of fresh water daily. The technology is already getting attention from companies such as IBM, which is using it in a desalination project powered by waste heat from a concentrated photovoltaic generator, and from an Australian company that wants to use it to clean brackish groundwater in a remote part of Western Australia.

While the technology might have a high per-unit price to start, that cost could come down. And when it comes to providing relief for disaster areas, it would be well worth the cost.

TreeHugger

TStzmmalaysia
post Mar 25 2011, 06:00 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


TRANSPORTATION

Attached Image

MIT's electric CityCar still on track

Since the MIT CityCar first generated buzz with its nod from TIME magazine as a best invention of 2007, a lot has happened. The world has changed: electric vehicles are more widely accepted as the price of oil has become more volatile, economic pressure to go green has increased, stimulus plans to ignite the green economy have been put in place, and policymakers and the public are more aware than ever of the issues surrounding climate change.

Ryan Chin, a research specialist in the Smart Cities group of the Media Lab at the Massachusetts Institute of Technology, told New Scientist at the MIT Energy Conference last week that in the last four years, "We've developed almost all of the core elements [for the car], including the folding mechanism, the wheel motors, and the control system for the vehicle."

He went on to explain that battery technology has improved, so the car can drive greater distances and is lighter. Electric motors have also become lighter, faster, cheaper, and more powerful, because the demand for them is greater.

Since 2010, the Smart Cities group has collaborated with the Spanish company DenokInn. With the support of the Spanish government, private investors, and banks, DenokInn established the Hiriko project, an initiative to build the initial full-scale, drivable prototype of the CityCar, and then scale up and commercialise its production and distribution. Chin told New Scientist that he estimates the first full-scale prototype will be built by the end of the summer, and that the CityCar will be on the road within three years.

NewScientist

TStzmmalaysia
post Mar 25 2011, 06:01 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


ROBOTICS

Attached Image

iMobot rolls, crawls and creeps

Graham Ryland and Professor Harry Cheng hope their "iMobot" will be a useful research and teaching tool. They also say the technology could be used in industrial applications for rapidly prototyping complex robotics — and may eventually form the basis of robots for search-and-rescue operations in difficult terrain. The university has filed a patent on the robot.
Ryland and Cheng developed the iMobot while Ryland was studying for his master's degree in mechanical engineering and conducting research in Cheng's Integration Engineering Laboratory at UC Davis.

A single iMobot module has four controllable degrees of freedom, with two joints in the center section and two wheels, one on each end. An individual module can drive on its wheels, crawl like an inchworm, or raise one end of its body and pan around as a camera platform.
Individual modules could be assembled into larger robots for particular tasks, such as a snakelike robot that could get into confined spaces, or a larger, wheeled robot for smoother terrain.
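The module's four degrees of freedom can be pictured with a small sketch. The class and method names below are invented for illustration; this is not the real iMobot interface:

```python
# Hypothetical model of an iMobot-like module: two controllable
# joints in the center section plus one wheel at each end.
class Module:
    def __init__(self):
        self.joints = [0.0, 0.0]   # center-joint angles, degrees
        self.wheels = [0.0, 0.0]   # wheel speeds, degrees/second

    def drive(self, speed):
        """Roll on both end wheels over smooth terrain."""
        self.wheels = [speed, speed]

    def inchworm(self, step=30.0):
        """Alternate the two center joints to crawl like an inchworm."""
        self.joints[0] += step
        self.joints[1] -= step

    def camera_pan(self, angle):
        """Raise one end of the body and pan around as a camera platform."""
        self.joints[0] = 90.0
        self.joints[1] = angle % 360.0

bot = Module()
bot.camera_pan(450.0)   # joints now at [90.0, 90.0]
```

Chaining several such modules, joint to wheel, is what would give the snakelike or wheeled composite robots described above.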

"We wanted to create a robot that was modular and could be assembled together, but was also mobile and useful by itself. We feel this hardware platform could drastically speed up university and industry research in the field of robotics," Ryland said.
Commercial robots are usually built for a specific application. But there is a lot of interest in modular robots -- machines made up of durable subunits that can function alone or be configured for a specific task.

The iMobot could be used as a testbed tool for engineers studying control systems for individual robots or groups of robots, Cheng said.
"It's very difficult to build the kind of robot with flexibility, modularity, and reconfigurability that people want to use for research and teaching," he said.

By using an off-the-shelf commercial robot like iMobot, researchers can focus on solving problems in areas such as artificial intelligence, robot collaboration, and reconfigurable and adaptive systems, without having to first develop the hardware part of the robot.
Currently, there are no commercial research-grade modular robots available, Ryland said.

PhysOrg

TStzmmalaysia
post Mar 25 2011, 06:04 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

NASA's successful 'can crush' will aid heavy-lift rocket design


This trailblazing project is examining the safety margins needed in the design of future, large launch vehicle structures. Test results will be used to develop and validate structural analysis models and generate new "shell-buckling knockdown factors" -- complex engineering design standards essential to launch vehicle design.

"This type of research is critical to NASA developing a new heavy-lift vehicle," said NASA Administrator Charlie Bolden. "The Authorization Act of 2010 gave us direction to take the nation beyond low-Earth orbit, but it is the work of our dedicated team of engineers and researchers that will make future NASA exploration missions a reality."

The aerospace industry's shell buckling knockdown factors date back to Apollo-era studies when current materials, manufacturing processes and high-fidelity computer modeling did not exist. These new analyses will update essential design factors and calculations that are a significant performance and safety driver in designing large structures like the main fuel tank of a future heavy-lift launch vehicle.
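The Apollo-era standards referred to here are, most likely, the empirical lower-bound knockdown factors of NASA SP-8007. A sketch of how such a factor discounts the classical buckling stress of an axially compressed cylinder follows; the 6 mm wall thickness used in the comment is an assumed figure, not the test article's:

```python
import math

def knockdown_factor(radius, thickness):
    """Apollo-era empirical lower-bound knockdown for axially
    compressed cylinders (NASA SP-8007 design curve)."""
    phi = math.sqrt(radius / thickness) / 16.0
    return 1.0 - 0.901 * (1.0 - math.exp(-phi))

def critical_stress(E, radius, thickness, nu=0.33):
    """Classical thin-shell buckling stress, reduced by the knockdown."""
    sigma_cl = E * thickness / (radius * math.sqrt(3.0 * (1.0 - nu**2)))
    return knockdown_factor(radius, thickness) * sigma_cl

# For the 27.5-foot-diameter test article (R ~ 4.19 m) with an assumed
# 6 mm wall, R/t ~ 700 and the knockdown leaves only about a quarter of
# the classical load -- the conservatism the new tests aim to reduce.
```

The curve penalizes thinner shells more heavily, which is why updated factors promise such large weight savings on big tanks.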

During the test, a massive aluminum-lithium cylinder, 27.5 feet in diameter and 20 feet tall, was subjected to almost one million pounds of force until it failed. More than 800 sensors measured strain and local deformations. In addition, advanced optical measurement techniques were used to monitor tiny deformations over the entire outer surface of the test article.

The Shell Buckling Knockdown Factor Project is led by engineers at NASA's Engineering and Safety Center (NESC), and NASA's Langley Research Center in Hampton, Va. NASA's heavy-lift space launch system will be developed and managed at Marshall.
"Launch vehicles are thin walled, cylindrical structures and buckling is one of the primary failure modes," said Mark Hilburger, a senior research engineer in the Structural Mechanics and Concepts Branch at Langley and the principal investigator of the NESC's Shell Buckling Knockdown Factor project. "Only by studying the fundamental physics of buckling through careful testing and analysis can we confidently apply the new knowledge to updated design factors. The outcome will be safer, lighter, more efficient launch vehicles."

Leading up to this full-scale test, the shell buckling team tested four 8-foot-diameter aluminum-lithium cylinders. Current research suggests applying the new design factors and incorporating new technology could reduce the weight of large heavy-lift launch vehicles by as much as 20 percent.
"Marshall's Structural and Dynamics Engineering Test laboratory is uniquely suited for shell buckling testing," said Mike Roberts, an engineer in Marshall's structural strength test branch and the center lead for this activity. "Originally built to test Saturn rocket stages, the capabilities found here were essential to developing the lightweight space shuttle external tank flying today and for testing International Space Station modules."

For this test, Marshall led all test operations including the engineering, test equipment design and safety assurance. Lockheed Martin Space Systems Company fabricated the test article at Marshall's Advance Weld Process Development Facility using state-of-the-art welding and inspection techniques. Langley engineers led the design and analysis of the test articles, defined the test requirements, and developed new optical displacement measurement standards that enabled highly accurate assessment of the large-scale test article response during the test.

In the future, the shell buckling team will test carbon-fiber composite structures that are 20-30 percent lighter than aluminum and widely used in the automotive and aerospace industries.

PhysOrg


TStzmmalaysia
post Mar 25 2011, 06:08 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


ROBOTICS

Attached Image

Festo creates SmartBird flying robotic seagull

Festo has added to its robotic menagerie with the creation of a robotic seagull that weighs just 450 g (15.87 oz) and boasts a wingspan of 1.96 m (6.4 ft). Dubbed the SmartBird, the ultralight flying robot was inspired by the herring gull and can take off, fly and land autonomously, without the help of any additional drive systems.

In creating the SmartBird, Festo says it has succeeded in deciphering the flight of birds. The robot's wings not only beat up and down, with a lever mechanism that increases the degree of deflection from the torso to the wing tip, but also twist at specific angles along their length, just as a real bird's do, so that the leading edge is directed upwards during the upward stroke.

Directional control is achieved through the opposing movement of the robot's head and torso sections, which is synchronized by means of two electric motors and cables. This enables it to bend aerodynamically, with simultaneous weight displacement, and is responsible for the SmartBird's agility and maneuverability.

As with a real bird, the SmartBird's tail isn't just for show either. It produces lift and functions as both a pitch elevator and yaw rudder. In addition to stabilizing the robot in a similar way to an aircraft's conventional vertical stabilizer, the tail also tilts to initiate left and right turns and rotates about the longitudinal axis to produce yaw.

Packed inside the SmartBird's torso are the battery, engine and transmission, the crank transmission and control and regulation electronics. Wing position and torsion can be monitored via two-way ZigBee protocol radio communication and can be adjusted and optimized in real time during flight.

Festo says developing the SmartBird has provided insights that will help it in a variety of areas. The robot's minimal use of materials and lightweight construction will help increase efficiencies in resource and energy consumption, while the functional integration of its coupled drive units has provided ideas the company says it can transfer to the development of hybrid drive technology. Additionally, analysis of its flow characteristics during development has provided insights into ways to optimize future designs. Another plus is that it won't try to steal your chips at the beach.

Gizmag

TStzmmalaysia
post Mar 25 2011, 06:10 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


BIOTECHNOLOGY

Attached Image

Mosquito needle helps take sting out of injections

LOOK away now if you are afraid of needles. A motorised, harpoon-like needle sounds painful, but in fact hurts far less than a regular injection because it resembles a mosquito's mouth parts.

Seiji Aoyagi and colleagues at Kansai University in Osaka, Japan, have developed a needle that mimics a mosquito's proboscis, which is serrated and barely touches the skin so you don't feel the initial bite. A smooth hypodermic, on the other hand, leaves a lot of metal in contact with the skin, stimulating the nerves and causing pain.

Aoyagi hopes his design could help diabetic people who have to take blood samples. Etched from silicon, the needle imitates three of the creature's seven mobile mouthparts: the two serrated maxillae and the tubular labrum (see diagram).

Unlike Aoyagi's previous attempts to mimic a mosquito's bite, each of these parts is driven by tiny motors based on lead zirconium titanate (PZT) - a piezoelectric crystal that expands very slightly when you apply an alternating voltage (Sensors and Actuators, DOI: 10.1016/j.sna.2010.02.010). The vibrations of the crystal can be used as a simple motor to control how the needle enters the skin.

The sections of the needle break the skin in the same sequence as they do with a mosquito, vibrating at about 15 hertz to ease it into the skin - as observed in mosquitoes under high-speed video microscopes. Aoyagi has tested his needle on himself and three volunteers, who agree that the pain is much reduced but lasts longer than with a conventional syringe. He thinks that by mimicking more of the creature's mouthparts, including an addition to steady the needle's entry, he'll be able to reduce that dull pain.
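The phased actuation could be sketched as three alternating drive voltages offset in phase, so the mouthparts enter in sequence. The 15 Hz frequency comes from the article; the amplitudes and phase spacing below are assumptions for illustration, not Aoyagi's published parameters:

```python
import math

def drive_voltage(t, freq=15.0, amplitude=20.0, phase=0.0):
    """Alternating voltage applied to one PZT actuator at time t (s)."""
    return amplitude * math.sin(2.0 * math.pi * freq * t + phase)

def mouthpart_drives(t):
    """Voltages for the two maxillae and the labrum at time t, spaced
    a third of a cycle apart so the parts break the skin in sequence."""
    third = 2.0 * math.pi / 3.0
    return [drive_voltage(t, phase=k * third) for k in range(3)]
```

Staggering the phases is what reproduces the alternating, sawing entry seen in the high-speed video of real mosquitoes.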

Microfluidics engineer Suman Chakraborty of the Indian Institute of Technology in Kharagpur, who has also worked on similar designs in the past, is impressed by Aoyagi's progress. "It's a substantial move towards improving the technology," he says.

NewScientist

TStzmmalaysia
post Mar 26 2011, 10:39 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

Nanotechnology points the way to greener pastures

Nourishing crops with synthetic ammonia (NH3) fertilizers has pushed agricultural yields ever higher, but such productivity comes at a price. Over-application of this chemical can build up nitrate ion (NO3–) concentrations in the soil -- a potential groundwater poison and a food source for harmful algal blooms. Furthermore, industrial manufacturing of ammonia is an energy-intensive process that contributes significantly to atmospheric greenhouse gases.

A research team led by Miho Yamauchi and Masaki Takata from the RIKEN SPring-8 Center in Harima has now discovered an almost ideal way to detoxify the effects of ammonia fertilizers. By synthesizing photoactive bimetallic nanocatalysts that generate hydrogen gas from water using solar energy, the team can catalytically convert NO3– back into NH3 through an efficient route free from carbon dioxide emissions.

Replacing the oxygen atoms of NO3– with hydrogen is a difficult chemical trick, but chemists can achieve this feat by using nanoparticles of copper–palladium (CuPd) alloys to immobilize nitrates at their surfaces and catalyzing a reduction reaction with dissolved hydrogen atoms. However, the atomic distribution at the ‘nanoalloy’ surface affects the outcome of this procedure: regions with large domains of Pd atoms tend to create nitrogen gas, while well-mixed alloys preferentially produce ammonia.

According to Yamauchi, the challenge in synthesizing homogenously mixed CuPd alloys is getting the timing right—the two metal ions transform into atomic states at different rates, causing phase separation. Yamauchi and her team used the powerful x-rays of the SPring-8 Center’s synchrotron to characterize the atomic structure of CuPd synthesized with harsh or mild reagents. Their experiments revealed that a relatively strong reducing reagent called sodium borohydride gave alloys with near-perfect mixing down to nanoscale dimensions.

Most ammonia syntheses use hydrogen gas produced from fossil fuels, but the use of solar energy by the researchers avoids this. They found that depositing the nanoalloy onto photosensitive titanium dioxide (TiO2) yielded a material able to convert ultraviolet radiation into energetic electrons; in turn, these electrons stimulated hydrogen gas generation from a simple water/methanol solution. When they added nitrate ions to this mixture, the CuPd/TiO2 catalyst converted nearly 80% into ammonia—a remarkable chemical selectivity that the researchers attribute to high concentrations of reactive hydrogen photocatalytically produced near the CuPd surface.

Yamauchi is confident that this approach can help reduce the ecological impact of many classical chemical hydrogenation reactions. “Considering the environmental problems we face, we have to switch from chemical synthesis using fossil-based hydrogen to other clean processes,” she says.

PhysOrg

TStzmmalaysia
post Mar 26 2011, 10:43 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


BIOTECHNOLOGY

Attached Image

Where robots labour to overcome genetic disease

I'M PEERING through an internal window into an eerie blue-lit room packed with high-tech machinery. The temperature inside is held at 28 °C and the humidity is high. Technicians scan computer screens to ensure the robots are happy, but seem oblivious to my presence.

Replace the blue light with green and it would resemble the interior of one of Star Trek's Borg cubes. Thankfully, the technicians look nothing like cyborgs, and the goal is not the assimilation of humanity. Rather, this is the production line that Complete Genomics, a start-up in Mountain View, California, bets will revolutionise the discovery of disease genes.

It is already the world's largest human genome sequencing factory. In a room half the size of a tennis court sit 16 robots that sequenced the genomes of 800 people last year. Going full tilt, they can now churn out 400 genomes a month.

The current price, offered to researchers and pharma companies but not yet to private individuals, is $9500 per genome; place an order for 1000 or more, and it drops to $5500. When you consider that the first human genome was completed a decade ago for billions, DNA sequencing has come a long way, fast.

As I survey the scene, Jennifer Turcotte, the company's marketing chief, explains why it looks different from other sequencing labs. DNA is usually read inside hermetically sealed machines, but here the robots work with their guts exposed, for ease of maintenance. The heat and humidity suit the biochemistry of the sequencing reactions, and the dim blue light avoids frequencies that would bleach the fluorescent probes used to detect each letter of the genetic code.

The technicians wear clean-room gear, as dust would interfere with reading the sequences. Unless something goes awry, there is no need for them to intervene. The robots add the required reagents, and manoeuvre the samples so that a camera can record the light signals that reveal the DNA sequence of 70 bases at a time.

The formidable computation needed to assemble these snippets into 3-billion base-pair human genomes is done at a fully automated data centre about 20 minutes' drive away in Santa Clara - electricity is cheaper there, and data storage is charged by the kilowatt-hour, explains Clifford Reid, the company's CEO.
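The assembly step can be illustrated with a deliberately tiny toy: greedily merging reads by their longest suffix-prefix overlap. Real assemblers use far more sophisticated graph algorithms and must handle sequencing errors and repeats; this only conveys the idea:

```python
def merge_reads(reads, min_overlap=3):
    """Greedy toy assembly: repeatedly merge the pair of reads with
    the longest suffix-prefix overlap until none remains."""
    def overlap(a, b):
        # longest suffix of a that matches a prefix of b
        for n in range(min(len(a), len(b)), min_overlap - 1, -1):
            if a.endswith(b[:n]):
                return n
        return 0

    reads = list(reads)
    while len(reads) > 1:
        best = (0, 0, 1)
        for i, a in enumerate(reads):
            for j, b in enumerate(reads):
                if i != j and (n := overlap(a, b)) > best[0]:
                    best = (n, i, j)
        n, i, j = best
        if n == 0:
            break  # no remaining overlaps to merge
        merged = reads[i] + reads[j][n:]
        reads = [r for k, r in enumerate(reads) if k not in (i, j)]
        reads.append(merged)
    return reads

print(merge_reads(["ATTAGAC", "GACCTA", "CTAAGT"]))  # → ['ATTAGACCTAAGT']
```

Scaling this idea to billions of 70-base reads is what demands the dedicated data centre described above.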

With the production line essentially running itself, most of the 185 staff are busy improving the company's sequencing technology, or liaising with customers. "We need people to interact with people, but not to interact with the DNA," says Reid.

In a cute twist, the company has even automated its reception area. When I arrived, I was greeted by a computer terminal, which asked for my name and who I had come to see. A label printer spat out a visitor badge while an email summoned Turcotte to lead me into the inner sanctum.

The culture of automation has a serious scientific goal. Geneticists had hoped that mutations determining our susceptibility to disease would emerge from limited scans, which record common variants at some 1 million positions across the genome. But the discoveries so far explain a small part of the heritability of many conditions. Gene hunters are starting to hit a wall.

Looking for the missing mutations means sequencing entire genomes and pinpointing rare anomalies that are inherited with the disease in question. The principle was exhibited last year by a team led by Leroy Hood of the Institute for Systems Biology in Seattle, which narrowed to a list of four the mutations responsible for the craniofacial condition Miller syndrome in one affected family (Science, DOI: 10.1126/science.1186802). Complete Genomics did the sequencing.

Several companies are pushing the envelope of cost and speed in DNA sequencing. But Complete Genomics is unusual in tailoring its technology to the task of churning out whole human genomes, and deciding not to sell machines but to offer a contract sequencing service. Hood sees little point in scientists doing the work: "We want to put our efforts into developing the tools to interpret the information."

New Scientist


TStzmmalaysia
post Mar 26 2011, 10:54 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

The First Plastic Computer Processor

Two recent developments—a plastic processor and printed memory—show that computing doesn't have to rely on inflexible silicon.

Silicon may underpin the computers that surround us, but the rigid inflexibility of the semiconductor means it cannot reach everywhere. The first computer processor and memory chips made out of plastic semiconductors suggest that, someday, nowhere will be out of bounds for computer power.

Researchers in Europe used 4,000 plastic, or organic, transistors to create the plastic microprocessor, which measures roughly two centimeters square and is built on top of flexible plastic foil. "Compared to using silicon, this has the advantage of lower price and that it can be flexible," says Jan Genoe at the IMEC nanotechnology center in Leuven, Belgium. Genoe and IMEC colleagues worked with researchers at the TNO research organization and display company Polymer Vision, both in the Netherlands.

The processor can so far run only one simple program of 16 instructions. The commands are hardcoded into a second foil etched with plastic circuits that can be connected to the processor to "load" the program. This allows the processor to calculate a running average of an incoming signal, something that a chip involved in processing the signal from a sensor might do, says Genoe. The chip runs at a speed of six hertz -- on the order of a million times slower than a modern desktop machine -- and can only process information in eight-bit chunks at most, compared to 128 bits for modern computer processors.
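A running average like the one described might look as follows in ordinary code; the window size and the 8-bit clamping are illustrative assumptions, not the actual 16-instruction program:

```python
# Sketch of the kind of computation the plastic chip performs:
# a running average over an incoming signal, kept within the
# eight-bit range the processor works in.
def running_average(samples, window=4):
    buf = []
    for s in samples:
        buf.append(s & 0xFF)      # inputs treated as 8-bit values
        if len(buf) > window:
            buf.pop(0)            # drop the oldest sample
        yield (sum(buf) // len(buf)) & 0xFF

print(list(running_average([10, 20, 30, 40, 50])))  # → [10, 15, 20, 25, 35]
```

Smoothing a noisy sensor signal this way is exactly the "clean-up" role Genoe imagines for flexible processors wrapped around pipes or packaging.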

Organic transistors have already been used in certain LED displays and RFID tags, but have not been combined in such numbers, or used to make a processor of any kind. The microprocessor was presented at the ISSCC conference in San Jose, California, last month.



Making the processor begins with a 25-micrometer thick sheet of flexible plastic, "like what you might wrap your lunch with," says Genoe. A layer of gold electrodes is deposited on top, followed by an insulating layer of plastic, and the plastic semiconductors that make up the processor's 4,000 transistors. Those transistors were made by spinning the plastic foil to spread a drop of organic liquid into a thin, even layer. When the foil is heated gently, the liquid converts into solid pentacene, a commonly used organic semiconductor. The pentacene layer was then etched using photolithography to make the final pattern for transistors.

In the future, such processors could be made more cheaply by printing the organic components like ink, says Genoe. "There are research groups working on roll-to-roll or sheet-to-sheet printing," he says, "but there is still some progress needed to make organic transistors at small sizes that aren't wobbly," meaning physically irregular. The best lab-scale printing methods so far can only deliver reliable transistors in the tens of micrometers, he says.

Creating a processor made from plastic transistors was a challenge because, unlike those made from ordered silicon crystals, no two transistors can be trusted to behave alike. Plastic transistors each behave slightly differently because they are made up of amorphous collections of pentacene molecules. "You won't have two that are equal," says Genoe. "We had to study and simulate that variability to work out a design with the highest chance of behaving correctly."

The team succeeded, but that doesn't mean the stage is set for plastic processors to displace silicon ones in consumer computers. "Organic materials fundamentally limit the speed of operation," Genoe explains. He expects plastic processors to appear in places where silicon is barred by its cost or physical inflexibility. The lower cost of the organic materials used compared to conventional silicon should make the plastic approach around 10 times cheaper.

"You can imagine an organic gas sensor wrapped around a gas pipe to report on any leaks with a flexible microprocessor to clean up the noisy signal," he says. Plastic electronics could also allow disposable interactive displays to be built into packaging, for example for food, says Genoe. "You might press a button to have it add up the calories in the cookies you ate," he says.

But such applications will require more than just plastic processors, says Wei Zhang, who works on organic electronics at the University of Minnesota. At the same conference where the organic processor was unveiled, Zhang and colleagues presented the first printed organic memory of a type known as DRAM, which works alongside the processor in most computers for short-term data storage. The 24-millimeter-square memory array was made by building up several layers of organic "ink" squirted from a nozzle like an aerosol. It can store 64 bits of information.

Previous printed memory has been nonvolatile, meaning it holds data even when the power is off and isn't suitable for short-term storage involving frequent writing, reading, and rewriting, says Zhang. The Minnesota group was able to print DRAM because it devised a form of printed, organic transistor that uses an ion-rich gel for the insulating material that separates its electrodes.

The ions inside enable the gel layer to store more charge than a conventional, ion-free insulator. That addresses two problems that have limited organic memory development. The gel's charge-storing ability reduces the power needed to operate the transistor and memory built from it; it also enables the levels of charge used to represent 1 and 0 in the memory to be very distinct and to persist for as long as a minute without the need for the memory to be refreshed.

Organic, printed DRAM could be used for short-term storage of image frames in displays that are today made with printed organic LEDs, says Zhang. That would enable more devices to be made using printing methods and eliminate some silicon components, reducing costs.

Finding a way to combine organic microprocessors and memory could cut prices further, although Zhang says the two are not yet ready to connect. "These efforts are new techniques, so we cannot guarantee that they will be built and work together," says Zhang. "But in the future, it would make sense."

Technology Review

TStzmmalaysia
post Mar 26 2011, 10:59 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

Video: Magnetic Gels That Swim, Shimmy, and 'Walk'

Miklós Zrínyi of Semmelweis University in Budapest, Hungary, has created some gels that are anything but gellin'. In fact, these gels are moving, shaking, and otherwise getting around with a little help from magnetism. The gel "snakes" -- made from a mix of polymer and metal particles -- bend to match the shape of any magnetic field exerted upon them.

That means with a little ingenuity, these gels can be manipulated in a variety of ways using either permanent magnets or electromagnets, depending on the shape and strength of the fields. As you can see in the video below, that means you can make them do all kinds of quirky things. But as New Scientist notes, a magnetic material that’s also soft and flexible could find an array of applications, like in artificial robot muscles or to replace machine parts that are usually rigid with softer alternatives.


PopSci

VIDEO HERE http://bcove.me/mf6sym79

New Scientist


TStzmmalaysia
post Mar 26 2011, 11:01 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

A Quantum Communications Switch

The Internet is made of photons that zip through fiber-optic cables and flow through devices like switches, modulators, and amplifiers. But those standard devices would be inadequate for superfast quantum computing or communications—experimental approaches that exploit the peculiar properties of particles at the quantum scale to carry out complex calculations incredibly quickly or to prevent anyone from eavesdropping on messages.

Commercial switches have various problems that make them unsuitable for rerouting entangled photons. Those made of micro-electromechanical components keep entangled states intact but operate too slowly. Other opto-electronic switches either add so much noise that single photons are difficult to detect, or they destroy the quantum information completely.

Prem Kumar, professor of electrical engineering and computer science at Northwestern University, has developed a quantum routing switch that can shuttle entangled photons along various paths while keeping the quantum information intact.

The device could be particularly useful for quantum computing, says James Franson, professor of physics at the University of Maryland, Baltimore County. "To build a quantum computer using photons, we need the ability to switch [entangled] photons," says Franson. A quantum switch could also someday allow entangled photons from different quantum computers to be shared over long distances—like cloud computing, but with quantum information.

Kumar says the switch will also make ultra-secure quantum networks a reality. Today's information is typically secured using what's called public key encryption, which relies on the practical impossibility of performing certain mathematical tasks, like factoring extremely large numbers. Quantum networks would offer an even more secure alternative to public key encryption. Using entangled photons to communicate ensures security because any attempt to intercept a message would disturb the particles' quantum state.
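
To see why interception is detectable, here is a toy numerical sketch (our illustration, not from the article) of the classic intercept-resend attack on polarization-encoded bits: an eavesdropper who must guess the measurement basis corrupts roughly a quarter of the bits that sender and receiver later compare, revealing the intrusion.

```python
import numpy as np

rng = np.random.default_rng(1)

n = 10000
send_basis = rng.integers(2, size=n)   # sender's random basis choices
bits = rng.integers(2, size=n)         # the bits actually encoded
eve_basis = rng.integers(2, size=n)    # eavesdropper's random basis guesses

# Where Eve guesses the wrong basis, her measurement disturbs the photon,
# and the receiver (measuring in the sender's basis) gets a random bit.
wrong = eve_basis != send_basis
received = bits.copy()
received[wrong] = rng.integers(2, size=int(wrong.sum()))

error_rate = float(np.mean(received != bits))
print(round(error_rate, 2))  # close to the theoretical 0.25
```

Without an eavesdropper the compared bits would agree perfectly, so an error rate near 25% is an unmistakable alarm.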

To build the new quantum switch, the researchers used commercial fiber-optic cable and other standard optical components, says Kumar. "My goal is to do things in the quantum information space that are very compatible with existing fiber infrastructures," he says.

The first step is to prepare the photons. Entangled photons have properties, such as polarization, that are fundamentally linked. If two photons are entangled, then the measured polarization of one reveals the corresponding state of the other. The researchers used a technique in which they mixed together multiple wavelengths of light within a standard fiber to create entangled photon pairs.
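
The perfect correlation described above can be mimicked with a trivial simulation (a sketch of the statistics, not of the physics): each photon's polarization outcome is random on its own, yet the pair's outcomes always agree.

```python
import numpy as np

rng = np.random.default_rng(0)

def measure_entangled_pair():
    """Toy model of a polarization-entangled pair in the (|HH> + |VV>)/sqrt(2) state:
    each outcome is 50/50 on its own, but the partner's outcome always matches."""
    outcome = int(rng.integers(2))  # 0 = horizontal, 1 = vertical
    return outcome, outcome

results = [measure_entangled_pair() for _ in range(1000)]
matches = sum(a == b for a, b in results)
print(matches)  # -> 1000: the outcomes agree every time
```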

The next step is to send one photon down the optical fiber to the switch, which changes the photon's course. The researchers' switch is made of only optical components, including a spool of 100 meters of optical fiber arranged in a loop. One photon of an entangled pair is sent through one end of the loop, and through a multiplexer, while a powerful laser sends pulses of light into the spool. The photon is shifted in such a way that at the other end of the loop it separates out along a separate path, while remaining entangled with its partner.

The end result is a switch that's very fast, has low background noise, and most importantly, preserves the quantum information. Single photon detectors at the end of the fibers confirm that both photons maintained their entangled state, showing that the quantum information was preserved. The work is described in a recent issue of the journal Physical Review Letters.

"It's an important development, because switching photons is really the main difference in going ahead in further progress in quantum computing using photons," says Franson.

Technology Review

TStzmmalaysia
post Mar 26 2011, 11:05 PM



RESEARCH

Attached Image

3-D Models Created by a Cell Phone

Capturing an object in three dimensions needn't require the budget of Avatar. A new cell phone app developed by Microsoft researchers can be sufficient. The software uses overlapping snapshots to build a photo-realistic 3-D model that can be spun around and viewed from any angle.

"We want everybody with a cell phone or regular digital camera to be able to capture 3-D objects," says Eric Stollnitz, one of the Microsoft researchers who worked on the project.

To capture a car in 3-D, for example, a person needs to take a handful of photos from different viewpoints around it. The photos can be instantly sent to a cloud server for processing. The app then downloads a photo-realistic model of the object that can be smoothly navigated by sliding a finger over the screen. A detailed 360-degree view of a car-sized object needs around 40 photos; a smaller object like a birthday cake needs 25 or fewer.

If captured with a conventional camera instead of a cell phone, the photos have to be uploaded onto a computer for processing in order to view the results. The researchers have also developed a Web browser plug-in that can be used to view the 3-D models, enabling them to be shared online. "You could be selling an item online, taking a picture of a friend for fun, or recording something for insurance purposes," says Stollnitz. "These 3-D scans take up less bandwidth than a video because they are based on only a few images, and are also interactive."

To make a model from the initial snapshots, the software first compares the photos to work out where in 3-D space they were taken from. The same technology was used in a previous Microsoft research project, PhotoSynth, that gave a sense of a 3-D scene by jumping between different views (see video). However, PhotoSynth doesn't directly capture the 3-D information inside photos.

"We also have to calculate the actual depth of objects from the stereo effect," says Stollnitz, "comparing how they appear in different photos." His software uses what it learns through that process to break each image apart and spread what it captures through virtual 3-D space (see video, below). The pieces from different photos are stitched together on the fly as a person navigates around the virtual space to generate his current viewpoint, creating the same view that would be seen if he were walking around the object in physical space.

"This is an interesting piece of software," says Jason Hurst, a product manager with 3DMedia, which makes software that combines pairs of photos to capture a single 3-D view of a scene. However, using still photos does have its limitations, he points out. "Their method, like ours, is effectively time-lapse, so it can't deal with objects that are moving," he says.

3DMedia's technology is targeted at displays like 3-D TVs or Nintendo's new glasses-free 3-D handheld gaming device. But the 3-D information built up by the Microsoft software could be modified to display on such devices, too, says Hurst, because the models it builds contain enough information to create the different viewpoints for a person's eyes.

Hurst says that as more 3-D-capable hardware appears, people will need more tools that let them make 3-D content. "The push of 3-D to consumers has come from TV and computer device makers, but the content is lagging," says Hurst. "Enabling people to make their own is a good complement."

Technology Review

TStzmmalaysia
post Mar 26 2011, 11:14 PM



ENERGY

Attached Image

100% Renewable Energy By 2050 Is Possible - Here's How We Can Do It

We recently examined how Australia can meet 100% of its electricity needs from renewable sources by 2020. Here we will examine how that goal can be scaled up for the rest of the world.

Energy consulting firm Ecofys produced a report detailing how we can meet nearly 100% of global energy needs with renewable sources by 2050. Approximately half of the goal is met through increased energy efficiency to first reduce energy demands, and the other half is achieved by switching to renewable energy sources for electricity production (Figure 1, below).

To achieve the goal of 100% renewable energy production, Ecofys foresees that global energy demand in 2050 will be 15% lower than in 2005, despite a growing population and continued economic development in countries like India and China. In their scenario:

QUOTE
Industry uses more recycled and energy-efficient materials, buildings are constructed or upgraded to need minimal energy for heating and cooling, and there is a shift to more efficient forms of transport.
As far as possible, we use electrical energy rather than solid and liquid fuels. Wind, solar, biomass and hydropower are the main sources of electricity, with solar and geothermal sources, as well as heat pumps providing a large share of heat for buildings and industry. Because supplies of wind and solar power vary, "smart" electricity grids have been developed to store and deliver energy more efficiently.  Bioenergy (liquid biofuels and solid biomass) is used as a last resort where other renewable energy sources are not viable.


Attached Image
Figure 1: Ecofys projected global energy consumption between 2000 and 2050.

To achieve the necessary renewable energy production, Ecofys envisions that solar energy supplies about half of our electricity, half of our building heating, and 15% of our industrial heat and fuel by 2050. This requires an average annual solar energy growth rate much lower than we're currently achieving - an encouraging finding.

The report notes that wind could meet one-quarter of the world's electricity needs by 2050 if current growth rates continue, and sets that as its goal. Ecofys also envisions more than one-third of building heat coming from geothermal sources by 2050. If we double current geothermal electricity production growth rates, it can provide 4% of our total electricity needs by that date. Ocean power, through both waves and tides, accounts for about 1% of global electricity needs in 2050. Hydropower, which currently supplies 15% of global electricity, ultimately supplies 12% in the Ecofys scenario. As you can see in Figure 2, global renewable energy use ramps up gradually between now and 2050.

Attached Image
Figure 2: Energy use by source between 2000 and 2050.

Burning biomass (such as plant and animal waste) will supply 60% of industrial fuels and heat, 13% of building heat, and 13% of electricity needs. Much of the proposed biomass use comes from plant residues from agriculture and food processing, sawdust and residues from forestry and wood processing, manure, and municipal waste. All of these renewable energy technologies currently exist, and it's just a matter of implementing them on a sufficiently large scale.
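
Gathering the rough 2050 electricity shares quoted above gives a quick sanity check; the report's figures are rounded approximations, so they need not sum to exactly 100%:

```python
# Approximate 2050 electricity shares quoted in the Ecofys summary above (percent).
shares = {
    "solar": 50,        # "about half of our electricity"
    "wind": 25,         # "one-quarter of the world's electricity needs"
    "geothermal": 4,
    "ocean": 1,
    "hydro": 12,
    "biomass": 13,
}

total = sum(shares.values())
print(total)  # -> 105: the rounded figures slightly overshoot 100%
```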

Ecofys also envisions using currently existing technology and expertise to "create buildings that require almost no conventional energy for heating or cooling, through airtight construction, heat pumps and sunlight. The Ecofys scenario foresees all new buildings achieving these standards by 2030." 2-3% of existing buildings will also need to be retrofitted per year to improve energy efficiency. Ecofys notes that Germany is already retrofitting buildings at this rate. Transportation must become more efficient, using more fuel efficient vehicles like electric cars, and increasing use of mass public transportation.

Accomplishing all of this will require a major effort, but Ecofys has a number of suggestions for how we can start:

Introduce minimum efficiency standards worldwide for all products that consume energy, including buildings.
Build energy conservation into every stage of product design.
Introduce strict energy efficiency criteria for all new buildings.
Introduce an energy tax, or perhaps a carbon emissions price.
Help developing countries pursue alternatives to inefficient biomass burning, such as improved biomass cooking stoves, solar cookers and small-scale biogas digesters.
Invest substantially in public transportation.
Make individuals, businesses, and communities more aware of their energy consumption, and encourage increased efficiency.

TreeHugger



TStzmmalaysia
post Mar 26 2011, 11:20 PM



ENERGY

Attached Image

Tata & MIT Work on Breakthrough Way to Generate Power From Ordinary Water

The Tata Group has signed a deal with the founder of SunCatalytix, MIT scientist Daniel Nocera, who has discovered how to generate energy from water. Although the terms of their agreement have not yet been disclosed, this breakthrough technology could bring power to as many as three billion people worldwide. What’s more, Nocera’s technology generates energy more efficiently than solar panels, according to the folks at Fast Company.

Nocera and his team recently discovered that an artificial leaf, a silicon sheet coated with cobalt and phosphate, generates power when placed in a jar of water. As in photosynthesis, the process uses energy from the sun to split water into hydrogen and oxygen.

One and a half bottles of water, including wastewater, can power a small house, and a swimming pool filled with water refreshed once a day will generate enough energy to run a plant. Although in preliminary testing stages, Nocera and TATA envision that this technology could improve the standard of living for billions of people. One small caveat from us: often places that are short on electricity are also short on water. Being just 45 days old, the TATA/MIT team still has a ways to go to get this incredible technology off the ground.

Inhabitat

TStzmmalaysia
post Mar 26 2011, 11:32 PM



RESEARCH

Attached Image

Neutron Analysis Yields Insight Into Bacteria for Solar Energy

Structural studies of some of nature's most efficient light-harvesting systems are lighting the way for new generations of biologically inspired solar cell devices.

Researchers from Washington University in St. Louis and the Department of Energy's Oak Ridge National Laboratory used small-angle neutron scattering to analyze the structure of chlorosomes in green photosynthetic bacteria. Chlorosomes are efficient at collecting sunlight for conversion to energy, even in low-light and extreme environments.

"It's one of the most efficient light harvesting antenna complexes found in nature," said co-author and research scientist Volker Urban of ORNL's Center for Structural Molecular Biology, or CSMB.

Neutron analysis performed at the CSMB's Bio-SANS instrument at the High Flux Isotope Reactor allowed the team to examine chlorosome structure under a range of thermal and ionic conditions.

"We found that their structure changed very little under all these conditions, which shows them to be very stable," Urban said. "This is important for potential biohybrid applications -- if you wanted to use them to harvest light in synthetic materials like a hybrid solar cell, for example."

The size, shape and organization of light-harvesting complexes such as chlorosomes are critical factors in electron transfer to semiconductor electrodes in solar devices. Understanding how chlorosomes function in nature could help scientists mimic the chlorosome's efficiency to create robust biohybrid or bio-inspired solar cells.

"What's so amazing about the chlorosome is that this large and complicated assembly is able to capture light effectively across a large area and then funnel the light to the reaction center without losing it along the way," Urban said. "Why this works so well in chlorosomes is not well understood at all."

"We're trying to find out general principles that are important for capturing, harvesting and transporting light efficiently and see how nature has solved that," Urban said.

Small-angle neutron scattering enabled the team to clearly observe the complicated biological systems at a nanoscale level without damaging the samples.

"With neutrons, you have an advantage that you get a very sharp contrast between these two phases, the chlorosome and the deuterated buffer. This gives you something like a clear black and white image," Urban said.

The team, led by Robert Blankenship of Washington University, published its findings in the journal Langmuir. The research was supported through the Photosynthetic Antenna Research Center, an Energy Frontier Research Center funded by DOE's Office of Science. Both HFIR and the Bio-SANS facility at ORNL's Center for Structural Molecular Biology are also supported by DOE's Office of Science.

ScienceDaily

TStzmmalaysia
post Mar 26 2011, 11:35 PM



NANOTECHNOLOGY

Attached Image

New entropy battery pulls energy from difference in salinity between fresh water and seawater

A team of researchers led by Dr. Yi Cui of Stanford and Dr. Bruce Logan of Penn State University has succeeded in developing an entropy battery that pulls energy from the imbalance of salinity between fresh water and seawater. Their paper, published in Nano Letters, describes a deceptively simple process whereby an entropy battery is used to capture the energy that is naturally released when river water flows into the sea.

Up to now, this kind of process has been accomplished by passing seawater through a membrane, which unfortunately is too costly to merit creating large-scale operations.

The new process works like this:

Step 1 - Two types of nanorod electrodes are placed in river water: a silver anionic electrode containing Cl- ions and a manganese dioxide cationic electrode containing Na+ ions. The battery charges as the river water's low salt concentration pulls the chlorine and the sodium out of the respective electrodes.

Step 2 - The river water is slowly replaced with seawater, causing a potential difference between the two concentrations of ions in the combined water. This is due to the Cl- ions, or anions, traveling to the silver electrode and the Na+ sodium ions, or cations, traveling to the manganese dioxide electrode.

Step 3 - Ions in the electrodes discharge into the seawater when the electrodes receive more ions than they can accommodate.

Step 4 - The salt water is slowly replaced with river water, which lessens the potential difference between the two electrodes and recharges the battery. Because more energy was released into the saltwater in Step 3 than is needed to charge the battery, the device collects and stores the net energy that has been building up as the ions move in and out of the crystal lattice of the electrodes.
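
The voltage swing that Steps 2 and 4 exploit can be estimated with the Nernst equation; the concentrations below are illustrative textbook values, not figures taken from the paper:

```python
import math

R = 8.314      # J/(mol*K), gas constant
T = 298.0      # K, room temperature
F = 96485.0    # C/mol, Faraday constant

def nernst_shift(c_sea, c_river):
    """Per-electrode potential shift (V) produced by a salinity ratio,
    via the Nernst equation for a single-charge ion."""
    return (R * T / F) * math.log(c_sea / c_river)

# Illustrative concentrations: seawater ~0.6 M NaCl, river water ~0.01 M.
dV = nernst_shift(0.6, 0.01)
print(round(dV, 3))  # -> 0.105 volts per electrode
```

A swing of roughly a tenth of a volt per electrode is small, which is why the cycle must be repeated continuously to harvest useful energy.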

With the entropy battery, costs are much lower than other ways of accomplishing the same thing due to the absence of replaceable membranes.

Cui believes that the entropy battery might eventually contribute up to 13% of total energy needs. He also believes that by moving the two electrodes closer together, he might be able to improve the efficiency from 74% to 85%.

Because the entropy battery operates in both warm and cold conditions, it is a completely renewable resource, one that might lead to mass energy production in both developed countries and the third world.

More information: Batteries for Efficient Energy Extraction from a Water Salinity Difference, by Fabio La Mantia et al., Nano Lett., Article ASAP

PhysOrg

TStzmmalaysia
post Mar 28 2011, 09:44 AM



RESEARCH

Attached Image

Smaller Particles Could Make Solar Panels More Efficient

Studies done by Mark Lusk and colleagues at the Colorado School of Mines could significantly improve the efficiency of solar cells. Their latest work describes how the size of light-absorbing particles--quantum dots--affects the particles' ability to transfer energy to electrons to generate electricity.

The results are published in the April issue of the journal ACS Nano.


The advance provides evidence to support a controversial idea, called multiple-exciton generation (MEG), which theorizes that it is possible for an electron that has absorbed light energy, called an exciton, to transfer that energy to more than one electron, resulting in more electricity from the same amount of absorbed light.


Quantum dots are human-made atoms that confine electrons to a small space. They have atomic-like behavior that results in unusual electronic properties on a nanoscale. These unique properties may be particularly valuable in tailoring the way light interacts with matter.


Experimental verification of the link between MEG and quantum dot size is a hot topic due to a large degree of variation in previously published studies. The ability to generate an electrical current following MEG is now receiving a great deal of attention because this will be a necessary component of any commercial realization of MEG.


For this study, Lusk and collaborators used a National Science Foundation (NSF)-supported high performance computer cluster to quantify the relationship between the rate of MEG and quantum dot size.


They found that each dot has a slice of the solar spectrum for which it is best suited to perform MEG and that smaller dots carry out MEG for their slice more efficiently than larger dots. This implies that solar cells made of quantum dots specifically tuned to the solar spectrum would be much more efficient than solar cells made of material that is not fabricated with quantum dots.
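
The MEG idea itself reduces to simple energy bookkeeping: in the ideal case, a photon can create as many excitons as its energy contains effective band gaps. A toy sketch with made-up numbers, ignoring all real-world losses:

```python
def max_excitons(photon_ev, gap_ev):
    """Upper bound on excitons per absorbed photon under ideal MEG
    (energy conservation only; real yields are lower)."""
    return max(1, int(photon_ev // gap_ev))

# A 3.0 eV photon absorbed by dots with a 1.0 eV effective gap could,
# at best, produce three excitons instead of one; with a 2.0 eV gap,
# the same photon can only ever produce one.
print(max_excitons(3.0, 1.0))  # -> 3
print(max_excitons(3.0, 2.0))  # -> 1
```

This is why tuning dot size (and hence gap) to a slice of the solar spectrum matters: the excess photon energy becomes extra excitons rather than waste heat.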


According to Lusk, "We can now design nanostructured materials that generate more than one exciton from a single photon of light, putting to good use a large portion of the energy that would otherwise just heat up a solar cell."


The research team, which includes participation from the National Renewable Energy Laboratory, is part of the NSF-funded Renewable Energy Materials Research Science and Engineering Center at the Colorado School of Mines in Golden, Colo. The center focuses on materials and innovations that will significantly impact renewable energy technologies. Harnessing the unique properties of nanostructured materials to enhance the performance of solar panels is an area of particular interest to the center.


"These results are exciting because they go far towards resolving a long-standing debate within the field," said Mary Galvin, a program director for the Division of Materials Research at NSF. "Equally important, they will contribute to establishment of new design techniques that can be used to make more efficient solar cells."

Science Daily

TStzmmalaysia
post Mar 28 2011, 09:45 AM



APPLIED SCIENCES

Attached Image

A New Way to Turn Out Cheap LED Lighting

A startup in California has developed a manufacturing technique that could substantially cut the cost of LED lightbulbs—a more energy-efficient type of lighting.

LEDs are conventionally made on a relatively costly substrate of silicon carbide or sapphire. Bridgelux has come up with a new process that takes advantage of existing fabrication machines used to make silicon computer chips, potentially cutting LED production costs by 75 percent, according to the company.

Despite their higher efficiencies and longer life, few homes and businesses use LED lighting—largely because of the initial cost. An LED chip makes up 30 to 60 percent of the cost of a commercial LED lightbulb; electronic control circuits and heat-management components account for the rest. So for a 60-watt-equivalent bulb that costs $40, Bridgelux's technology could bring the cost down by $9 to $18. Integrating the light chip with the electronics might further reduce costs.
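
The quoted savings follow directly from the figures in this paragraph, as a quick check shows: a 75% cut to the chip's 30-60% share of a $40 bulb.

```python
bulb_cost = 40.0           # 60 W-equivalent LED bulb price, from the article
chip_share = (0.30, 0.60)  # LED chip is 30-60% of the bulb's cost
cut = 0.75                 # Bridgelux's claimed production-cost reduction

savings = [round(bulb_cost * s * cut, 2) for s in chip_share]
print(savings)  # -> [9.0, 18.0], matching the article's $9 to $18
```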

LEDs made with the new technique produce 135 lumens for each watt of power. The U.S. Department of Energy's Lighting Technology Roadmap calls for an efficiency of 150 lumens per watt by 2012. Some LED makers, such as Cree, in Durham, North Carolina, already sell LED lamps with efficiencies in that range. In contrast, incandescent bulbs emit around 15 lumens per watt, and fluorescent lightbulbs emit 50 to 100 lumens per watt.
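
Those efficacy figures translate directly into power draw for a fixed light output. A small sketch using the numbers above (the fluorescent value is the midpoint of the quoted 50-100 range):

```python
# Luminous efficacies quoted in the article (lumens per watt).
efficacy = {"incandescent": 15, "fluorescent": 75, "bridgelux_led": 135}

def watts_for(lumens, source):
    """Electrical power (W) needed to produce a given light output."""
    return lumens / efficacy[source]

# For roughly 800 lumens (about a 60 W incandescent bulb's output):
for src in efficacy:
    print(src, round(watts_for(800, src), 1))
# incandescent needs ~53 W, fluorescent ~11 W, the new LEDs ~6 W
```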

Manufacturers typically make white LEDs by coating blue gallium-nitride devices with yellow phosphors. The gallium nitride is grown on two- to four-inch sapphire or silicon carbide wafers. Cree builds its chips on silicon-carbide wafers, "because we believe it produces superior LEDs," says company spokesperson Michelle Murray.

Larger wafers mean more devices fabricated at once, which brings down cost. But large sapphire or silicon carbide wafers are more difficult, and expensive, to make. Companies such as Osram Opto Semiconductors in Germany are now moving to 15-centimeter sapphire wafers, most likely the largest size possible. Making 20-centimeter silicon wafers, on the other hand, is routine in the semiconductor chip-making industry. Bridgelux's new silicon wafers were, in fact, made at an old silicon fabrication plant in Silicon Valley.

It is hard to grow gallium nitride on silicon, mainly because the materials expand and contract at very different rates, explains Colin Humphreys, a materials science researcher at Cambridge University. The process is carried out at temperatures around 1,000 °C, and, upon cooling, the gallium nitride cracks because it is under tension, Humphreys says. One way to solve the problem is to insert additional thin films around the gallium nitride to compress the material and balance out the tension produced during cooling. In fact, Humphreys and his colleagues have used this trick to make gallium-nitride LEDs on silicon; their devices produce 70 lumens per watt. Bridgelux might be using a similar technique. "The result from Bridgelux is impressive," Humphreys says. "It offers the promise of a large cost reduction without any reduction of efficiency."

Other LED makers, including Osram, are also trying to make gallium-nitride LEDs on silicon. Bridgelux expects to deliver its first silicon-based LEDs in two to three years.

Technology Review

TStzmmalaysia
post Mar 28 2011, 09:46 AM



RESEARCH

Attached Image

Debut of the first practical 'artificial leaf'

Scientists today claimed one of the milestones in the drive for sustainable energy — development of the first practical artificial leaf. Speaking here at the 241st National Meeting of the American Chemical Society, they described an advanced solar cell the size of a poker card that mimics the process, called photosynthesis, that green plants use to convert sunlight and water into energy.

"A practical artificial leaf has been one of the Holy Grails of science for decades," said Daniel Nocera, Ph.D., who led the research team. "We believe we have done it. The artificial leaf shows particular promise as an inexpensive source of electricity for homes of the poor in developing countries. Our goal is to make each home its own power station," he said. "One can envision villages in India and Africa not long from now purchasing an affordable basic power system based on this technology."

The device bears no resemblance to Mother Nature's counterparts on oaks, maples and other green plants, which scientists have used as the model for their efforts to develop this new genre of solar cells. About the shape of a poker card but thinner, the device is fashioned from silicon, electronics and catalysts, substances that accelerate chemical reactions that otherwise would not occur, or would run slowly. Placed in a single gallon of water in bright sunlight, the device could produce enough electricity to supply a house in a developing country with electricity for a day, Nocera said. It does so by splitting water into its two components, hydrogen and oxygen.
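
As a rough plausibility check (our own back-of-envelope arithmetic, not Nocera's figures), fully splitting one gallon of water and later recombining the hydrogen in a fuel cell stores on the order of a modest household's daily electricity:

```python
# Upper-bound estimate: energy recoverable from the hydrogen produced by
# splitting one gallon of water, at hydrogen's higher heating value.
GALLON_L = 3.785            # liters (and ~kg) of water in one US gallon
WATER_G_PER_MOL = 18.0      # molar mass of water
H2_HHV_KJ_PER_MOL = 286.0   # higher heating value of hydrogen

mol_h2 = (GALLON_L * 1000) / WATER_G_PER_MOL  # one H2 per H2O split
energy_kwh = mol_h2 * H2_HHV_KJ_PER_MOL / 3600
print(round(energy_kwh, 1))  # -> 16.7 kWh, a plausible day's supply
```

Real fuel cells recover only a fraction of this, but even half would cover basic lighting and appliances in the homes the article describes.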

The hydrogen and oxygen gases would be stored in a fuel cell, which uses those two materials to produce electricity, located either on top of the house or beside it.

Nocera, who is with the Massachusetts Institute of Technology, points out that the "artificial leaf" is not a new concept. The first artificial leaf was developed more than a decade ago by John Turner of the U.S. National Renewable Energy Laboratory in Boulder, Colorado. Although highly efficient at carrying out photosynthesis, Turner's device was impractical for wider use, as it was composed of rare, expensive metals and was highly unstable — with a lifespan of barely one day.

Nocera's new leaf overcomes these problems. It is made of inexpensive materials that are widely available, works under simple conditions and is highly stable. In laboratory studies, he showed that an artificial leaf prototype could operate continuously for at least 45 hours without a drop in activity.

The key to this breakthrough is Nocera's recent discovery of several powerful new, inexpensive catalysts, made of nickel and cobalt, that are capable of efficiently splitting water into its two components, hydrogen and oxygen, under simple conditions. Right now, Nocera's leaf is about 10 times more efficient at carrying out photosynthesis than a natural leaf. However, he is optimistic that he can boost the efficiency of the artificial leaf much higher in the future.

"Nature is powered by photosynthesis, and I think that the future world will be powered by photosynthesis as well in the form of this artificial leaf," said Nocera, a chemist at Massachusetts Institute of Technology in Cambridge, Mass.

PhysOrg

Research Provided by American Chemical Society

TStzmmalaysia
post Mar 28 2011, 09:49 AM



RESEARCH

Attached Image

High-temperature superconductor spills secret: A new phase of matter

Scientists from the U.S. Department of Energy's Lawrence Berkeley National Laboratory (Berkeley Lab) and the University of California at Berkeley have joined with researchers at Stanford University and the SLAC National Accelerator Laboratory to mount a three-pronged attack on one of the most obstinate puzzles in materials sciences: what is the pseudogap?

A collaboration organized by Zhi-Xun Shen, a member of the Stanford Institute for Materials and Energy Science (SIMES) at SLAC and a professor of physics at Stanford University, used three complementary experimental approaches to investigate a single material, the high-temperature superconductor Pb-Bi2201 (lead bismuth strontium lanthanum copper-oxide). Their results are the strongest evidence yet that the pseudogap phase, a mysterious electronic state peculiar to high-temperature superconductors, is not a gradual transition to superconductivity in these materials, as many have long believed. It is in fact a distinct phase of matter.

"This is a paradigm shift in the way we understand high-temperature superconductivity," says Ruihua He, lead author with Makoto Hashimoto of the paper in the March 25 issue of the journal Science that describes the team's findings. "The involvement of an additional phase, once fully understood, might open up new possibilities for achieving superconductivity at even higher temperatures in these materials." When the research was done Hashimoto and He were members of SIMES, of Stanford's Department of Applied Physics, and of Berkeley Lab's Advanced Light Source (ALS), where He is now a postdoctoral fellow.

The pseudogap mystery

Superconductivity is the total absence of resistance to the flow of electric current. Discovered in 1911, it was long thought to occur only in metals and only below a critical temperature (Tc) not far above absolute zero. "Ordinary" superconductivity commonly takes place at 30 kelvins (30 K) or less, equivalent to more than 400 degrees below zero Fahrenheit. Awkward as reaching such low temperatures may be, ordinary superconductivity is widely exploited in industry, health, and science.

High-Tc superconductors were discovered in 1986. "High" is a relative term; the highest-Tc superconductors function at temperatures five times higher than ordinary superconductors but still only about twice that of liquid nitrogen. Many high-Tc superconductors have been found, but the record holders for critical temperature remain the kind first discovered, the cuprates — brittle oxides whose structure includes layers of copper and oxygen atoms where current flows.

In all known superconductors electrons join in pairs (Cooper pairs) to move in correlated fashion through the material. It takes a certain amount of energy to break Cooper pairs apart; in ordinary superconductors, the absence of single-electron states below this energy constitutes a superconducting gap, which vanishes when the temperature rises above Tc. Once in the normal state the electrons revert to unpaired, uncorrelated behavior.

Not so for cuprate superconductors. A similar superconducting gap exists below Tc, but when superconductivity ceases at Tc the gap doesn't close. A "pseudogap" persists and doesn't go away until the material reaches a higher temperature, designated T* (T-star). The existence of a pseudogap in the normal state is itself anything but normal; its nature has been heatedly debated ever since it was identified in cuprates more than 15 years ago.

Attempts to explain what's going on in the pseudogap have coalesced around two main schools of thought. Traditional thinking holds that the pseudogap represents a foreshadowing of the superconducting phase. As the temperature is lowered, first reaching T*, a few electron pairs start to form, but they are sparse and lack the long-range coherence necessary for superconductivity — they can't "talk" to one another. As the temperature continues to fall, more such pairs are formed until, upon reaching Tc, virtually all conducting electrons are paired and act in correlation; they're all talking. In this scheme, there's only a single phase transition, which occurs at Tc.

Another school of thought argues that the appearance of the pseudogap at T* is also a true phase transition. The pseudogap does not represent a smooth shift to the superconducting state but is itself a state distinct from both superconductivity and normal "metallicity" (the usual state of delocalized, uncorrelated electrons). This new phase implies the existence of a "quantum critical point" — a point along a line at zero temperature where competing phases meet. In theory, with competing phases wildly fluctuating in the neighborhood of a quantum critical point, there may be entirely new routes to superconductivity.

"Promising as the 'quantum critical' paradigm is for explaining a wide range of exotic materials, high-Tc superconductivity in cuprates has stubbornly refused to fit the mold," says Joseph Orenstein of Berkeley Lab's Materials Sciences Division, a professor in physics at UC Berkeley, whose group conducted one of the research team's three experiments. "For 20 years, the cuprates managed to conceal any evidence of a phase-transition line where the quantum critical point is supposed to be found."

In recent years, however, hints have emerged. "New ultrasensitive probes have found fingerprints of phase transitions in high-Tc materials," Orenstein says, "although there's been no smoking gun. The burning question is whether we can discover the nature of the new phase or phases."

A multipronged attack on the pseudogap

In the Stanford-Berkeley study, three groups of researchers joined forces to probe the pseudogap phase on the same sample.

"Pb-Bi2201 was chosen because, first, it is structurally simple, and second, it has a relatively wide temperature range between Tc and T*," says Ruihua He. "This permits a clean separation of any remnant effect of superconductivity from genuine pseudogap physics."

Groups led by Z.-X. Shen at beamline 5-4 of the Stanford Synchrotron Radiation Lightsource (SSRL) at SLAC and by Zahid Hussain, ALS Division Deputy for Scientific Support, at beamline 10.0.1 of Berkeley Lab's ALS, studied the sample with angle-resolved photoemission spectroscopy (ARPES). In ARPES, a beam of x-rays directed at the sample surface excites the emission of valence electrons. By monitoring the kinetic energy and momentum of the emitted electrons over a wide temperature range the researchers map out the material's low-energy electronic band structure, which determines much of its electrical and magnetic properties.
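The energy-and-momentum bookkeeping behind ARPES is standard photoemission kinematics: binding energy follows from the photon energy, the work function and the measured kinetic energy, while the in-plane momentum follows from the kinetic energy and the emission angle. A minimal sketch (illustrative values, not the beamlines' actual analysis software):

```python
import math

HBAR = 1.054571817e-34   # reduced Planck constant, J*s
M_E  = 9.1093837015e-31  # electron mass, kg
EV   = 1.602176634e-19   # electron volt, J

def arpes_state(hv, work_fn, e_kin, theta_deg):
    """Map one photoemitted electron back to its initial state.

    hv, work_fn and e_kin are in eV; theta_deg is the emission angle
    from the surface normal. Returns (binding energy in eV, in-plane
    momentum in inverse angstroms) -- the two coordinates ARPES uses
    to build up the band structure point by point.
    """
    e_binding = hv - work_fn - e_kin
    k_par = math.sqrt(2 * M_E * e_kin * EV) / HBAR       # 1/m
    k_par *= math.sin(math.radians(theta_deg)) * 1e-10   # 1/angstrom
    return e_binding, k_par

# An electron carrying all the available energy sits at the Fermi
# level, i.e. its binding energy comes out at essentially zero.
print(arpes_state(hv=21.2, work_fn=4.3, e_kin=16.9, theta_deg=30.0))
```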

At Stanford, researchers led by Aharon Kapitulnik of SIMES, a professor in applied physics at Stanford University, studied the same crystal of Pb-Bi2201 with the magneto-optical Kerr effect. In light reflected from the sample under a zero magnetic field, tiny rotations of the plane of polarization are measured as the temperature changes. The rotations are proportional to the net magnetization of the sample at different temperatures.

Finally, Orenstein's group at Berkeley applied time-resolved reflectivity to the sample. A pump pulse from a laser excites electrons many layers of atoms deep, temporarily affecting the sample's reflectivity. Probe pulses, timed to follow less than a trillionth of a second after the pump pulses, reveal changes in reflection at different temperatures.

All these experimental techniques had previously pointed to the possibility of a phase transition in the neighborhood of T* in different cuprate materials. But no single result was strong enough to stand alone.

ARPES experiments performed in 2010 by the same group of experimenters as in the present study revealed the abrupt opening of the pseudogap at T* in Pb-Bi2201. Variations in T* in different materials and even different samples, as well as in the surface conditions to which ARPES is sensitive, had left room for uncertainty, however.

In 2008, the Kerr effect was measured in another cuprate, also by the same group as in the present study, and showed a change in magnetization from zero to finite across T*. This was long-sought thermodynamic evidence for the existence of a phase transition at T*. But compared to the pronounced spectral change seen by ARPES, the extreme weakness of the Kerr-effect signal left doubt that the two results were connected.

Finally, since the late 1990s various experiments with time-resolved reflectivity in different cuprates have reported signals setting in near T* and increasing in strength as the temperature drops, until interrupted by the onset of a separate signal below Tc. The probe is complex and there was a lack of corroborating evidence for the same cuprates; the results did not receive wide attention.

Now the three experimental approaches have all been applied to the same material. All yielded consistent results and all point to the same conclusion: there is a phase transition at the pseudogap phase boundary – the three techniques put it precisely at T*. The electronic states dominating the pseudogap phase do not include Cooper pairs, but nevertheless intrude into the lower-lying superconducting phase and directly influence the motion of Cooper pairs in a way previously overlooked.

"Instead of pairing up, the electrons in the pseudogap phase organize themselves in some very different way," says He. "We currently don't know what exactly it is, and we don't know whether it helps superconductivity or hurts it. But we know the direction to take to move forward."

Says Orenstein, "Coming to grips with a new picture is a little like trying to steer the Titanic, but the fact that all three of these techniques point in the same direction adds to the mounting evidence for the phase change."

Hussain says the critical factor was bringing the Stanford and Berkeley scientists together. "We joined forces to tackle a more complex problem than any of us had tried on our own."

PhysOrg

Provided by Lawrence Berkeley National Laboratory
TStzmmalaysia
post Mar 28 2011, 09:52 AM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


APPLIED SCIENCES

Attached Image

Engineers put a damper on 'aeroelastic flutter'

Anyone who has ever flown knows the feeling: an otherwise smooth flight gets a little choppy. If you are lucky, the plane skips a few times like a rock across a pond and then settles. For the not-so-lucky, the captain has turned on that seatbelt sign for a reason, but even the worst turbulence usually fades.

In certain rare situations, however, those vibrations don't settle and the consequences turn dire. The twisting, up-and-down movement in the wing builds upon itself, each wave compounding the next, until the vibration worsens and the wing is ripped from the plane. In an instant, a simple bit of turbulence becomes a matter of life and death.

Aeronautical engineers know it as "aeroelastic flutter." Pilots call it "buzz."



Complicated stuff

Aeronautical engineers have puzzled over the phenomenon as long as there have been planes. They made their planes better. They built them of new materials. They used supercomputers to predict when it might occur. But, try as they might, they could not absolutely eliminate aeroelastic flutter.

Professor Charbel Farhat, chair of the Aeronautics and Astronautics Department at Stanford's School of Engineering, and David Amsallem, a postdoctoral scholar who worked on his PhD thesis with Farhat, have been studying and trying to solve aeroelastic flutter for years. Computers help, but only to a point.

Listening to Farhat is a bit like flying. He talks quickly with great expression, piloting the listener through his world, swooping from idea to idea in great arcs like a stunt plane, always on the edge of control. He is a barnstormer. Amsallem, the mild-mannered mathematical whiz behind it all, smiles gently, tossing in a French-accented word of clarification here and there.

"This is complicated stuff. It takes today's fastest computer an hour to calculate the aeronautical effect of even a small change in a single variable," said Farhat, his voice rising to deliver the word "today's" as if to reinforce that we are not talking about some mid-century mainframe here. "Imagine a plane in flight and you'll quickly grasp that there are hundreds of variables. Now, imagine the rate of change in those variables for an F-16 at full throttle."

Each incremental shift in pitch of the wing, every inch of altitude, every variation in speed, each milliliter of fuel added or burned sloshing back and forth in the tank are equally at play – alter one, you alter the entire system. And each time you alter the system, the supercomputer starts back at go, computing anew – that is, if you can get time on the supercomputer.



"Now you begin to understand. This is complicated stuff," Farhat said. At this point in the conversation, the professor leaned in. His eyes narrowed and his tone grew serious. "We can now predict flutter in real time … on an iPhone." Someday, he predicted, airplanes will have chips on board that will sense and counteract flutter in real time.

The work has caused a sensation in the aeronautics field. When Farhat and Amsallem presented their paper at the Army Science Conference recently, the crowd of aeronautical old hands sat stunned as the two did a live demo of their work on an iPad.

Farhat chose the iPad over a smaller device, he said, not for processor speed – their innovation works fine on an iPhone, he assured – but for pure visual impact: It looks better on an iPad.

The seat of your pants

Over time, aeronautical engineers have been able to "engineer" flutter to a point of virtual oblivion – emphasis on virtual. It is now only a remote risk.

"But, there are instances when even the smallest of risks is too great a risk," Farhat said.

For instance, when you are a fighter pilot in the saddle of a $60 million F-16, one of the fastest, most agile fighter planes ever developed. F-16 pilots – and their planes – regularly endure forces many times that of gravity. A little turbulence can be a big deal. Each time the wing starts to bounce, no matter how slight the bounce, a little voice nags in the pilot's mind, "Is this the one?" If the vibration fails to fade, the pilot must contemplate the "eject" button. At this point, it is life or plane. Call it flying by the seat of your $60 million pants.

How have Farhat and Amsallem succeeded where others have come up short? The answer sounds suitably complex: interpolation on manifolds. What it means, in essence, is approximating unknowns based on known information. The two engineers devised a system of mathematical approximations that break down complex, computationally demanding equations into smaller, more manageable parts. In mathematics, this is known as "reducing." Reducing allows them to make some very educated guesses, very quickly.

Starting with a mostly random, but carefully selected, sample of a few flight conditions – the variables such as air speed, wing angle and altitude – they "pre-computed" a series of reduced-order models using the very supercomputers they aimed to beat. These models are called "snapshots" – mathematical pictures of the fluid dynamics at play at each flight point. Farhat and Amsallem stored these snapshots in a database, which their computer algorithms can later pluck as needed to make more complex calculations, quite literally, on the fly.

That is the easy part.

Next, using a cleverly designed and painstakingly tested methodology, Farhat and Amsallem segmented data into small groups centered on the pre-calculated points in the databases. In essence, Farhat and Amsallem drew circles around the things they knew – those pre-computed flight conditions – and crafted a system to interpolate the value of any point within each circle.

On a graph, these "cells," as they are described, look like living cells with the known data serving as the nucleus. The rest of the cell is considered of similar enough aeronautical characteristic to the nucleus as to allow the engineers to make very educated guesses as to the behavior of the entire cell – allowing them to determine how the wing will respond at any given moment.

What's more, each time they make a new set of calculations, the new numbers are added to the database, bolstering future results and making their interpolations all the more accurate. The mathematicians call this "training," as if they are teaching the numbers and not the other way around – as if they are telling the numbers what they can and can't do.
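The snapshot-and-interpolate idea can be caricatured in a few lines. The toy sketch below is not Farhat and Amsallem's manifold interpolation, just the general pattern it builds on: store expensive precomputed results keyed by flight condition, answer new queries by weighting the nearest stored "nuclei," and fold each new answer back into the database. All numbers are made up.

```python
import math

class SnapshotDatabase:
    """Toy stand-in for a precomputed reduced-order-model database.

    Keys are flight conditions (airspeed, wing angle, ...); values are
    whatever expensive quantity the supercomputer produced there.
    Queries inside a cell are answered by inverse-distance weighting
    over the k nearest precomputed points -- an educated guess rather
    than a full simulation.
    """
    def __init__(self, snapshots):
        self.snapshots = dict(snapshots)  # condition tuple -> value

    def query(self, cond, k=3):
        ranked = sorted(self.snapshots.items(),
                        key=lambda kv: math.dist(kv[0], cond))[:k]
        if ranked[0][0] == cond:          # exact hit: no guessing needed
            return ranked[0][1]
        w = [1.0 / math.dist(c, cond) for c, _ in ranked]
        est = sum(wi * v for wi, (_, v) in zip(w, ranked)) / sum(w)
        self.snapshots[cond] = est        # "training": keep the new point
        return est

# Hypothetical (airspeed, wing angle) -> damping snapshots; a negative
# damping value would mean flutter grows instead of dying out.
db = SnapshotDatabase({(300.0, 2.0): 0.10,
                       (400.0, 2.0): 0.04,
                       (500.0, 2.0): -0.02})
print(db.query((450.0, 2.0)))  # cheap estimate between known points
```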

Thus, the smartphone beat the supercomputer.



A field aflutter

So, what took so long?

"It sounds simple," said Amsallem, explaining his work, "but the mathematics are exceptionally complex and the stakes of being wrong are great. There's no room for error."

Farhat and Amsallem are able to accurately predict – in real time and where no one had before – whether a plane will experience flutter. Soon, they hope, all planes will have active control mechanisms that continually monitor multiple flight and mechanical data and steer the planes clear of flutter.

While this is good news for pilots and passengers, the implications of the work run far beyond the relatively rare, albeit deadly, phenomenon of flutter.

"Our interpolation method is general enough to work, in principle, on many complex engineering problems," said Farhat, hinting at future possibilities.

And this is what has the field aflutter. The Air Force, the Navy, airplane manufacturers and Formula 1 racing teams are all lining up – as they once did for time on the supercomputers – to apply the Stanford aeronautical algorithm to their problems.

PhysOrg

TStzmmalaysia
post Mar 29 2011, 05:49 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

Structure of DNA Repair Complex Reveals Workings of Powerful Cell Motor

Over the past few years, two teams of researchers at The Scripps Research Institute have steadily built a model of how a powerful DNA repair complex works. Now, their latest discovery provides revolutionary insights into the way the molecular motor inside the complex functions -- findings they say may have implications for the treatment of disorders ranging from cancer to cystic fibrosis.

In a paper published in an Advance Online Edition of Nature Structural and Molecular Biology March 27, 2011, the scientists say that the complex's motor molecule, known as Rad50, is a surprisingly flexible protein that can change shape and even rotate depending on the task at hand.

The finding solves the long-standing mystery of how a single protein complex known as MRN (Mre11-Rad50-Nbs1) can repair DNA in a number of different, and tricky, ways that seem impossible for "standard issue" proteins to do, say team leaders Scripps Research Professor John Tainer, Ph.D., and Scripps Research Professor Paul Russell, Ph.D., who also collaborated with members of the Lawrence Berkeley National Laboratory on the study.

They say the finding also provides a critical insight into the ABC-ATPase superfamily of molecular motors, of which Rad50 is a member.

"Rad50 and its brethren proteins in this superfamily are biology's general motors," said Tainer, "and if we know how they work, we might be able to control biological outcomes when we need to."

For example, knowing that Rad50 changes its contour to perform a function suggests it might be possible to therapeutically target unique elements in that specific conformation. "There could be a new generation of drugs that are designed not against an active site, like most drugs now (an approach that can cause side effects), but against the shape the protein needs to be in to work," Tainer said.

Russell added, "Proteins are often viewed as static, but we are showing the moving parts in this complex. They are dynamic. They move about and change shape when engaging with other molecules."

First Responder

The MRN complex is known as a first-responder molecule that rushes in to repair serious double-strand breaks in the DNA helix -- an event that normally occurs about 10 times a day per cell due to ultraviolet light and radiation damage, etc. If these breaks are not fixed, dangerous chromosomal rearrangements can occur that lead to cancer. Paradoxically, the complex also mends DNA breaks promoted by chemotherapy, protecting cells against cancer treatment.

When MRN senses a break, it activates an alarm telling the cell to shut down division until repairs are made. Then, it binds to ATP (an energy source) and repairs DNA in three different ways, depending on whether two ends of strands need to be joined together or if DNA sequences need to be replicated. "The same complex has to decide the extent of damage and be able to do multiple things," Tainer said. "The mystery was how it can do it all."

To find out, Tainer, head of a structural biology group, and Russell, who leads a yeast genetics laboratory, began collaborating five years ago. With the additional help of team members at Lawrence Berkeley National Laboratory and its Advanced Light Source beamline, called SIBYLS, the collaboration has produced a series of high-resolution images of the crystal structure of parts of all three proteins (Rad50, Mre11, and Nbs1), taken from fission yeast and archaea. The scientists also used the lab's X-ray scattering tool to determine the proteins' overall architecture in solution, which approximates how a protein appears in a natural state.

The scientists say that the parts of the complex, when imagined together as a whole unit, resemble an octopus: the head consists of the repair machinery (the Rad50 motor and the Mre11 protein, which is an enzyme that can break bonds between nucleic acids) and the octopus arms are made up of Nbs1 which can grab the molecules needed to help the machinery mend the strands.

In this study, Tainer and Russell were able to produce crystal-structure and X-ray scattering images of the regions where Rad50 and Mre11 touch each other, showing what happened when ATP bound to this complex and what it looked like when it didn't.

In these four new structures, they showed that ATP binding allows Rad50 to drastically change its shape. When not bound to ATP, Rad50 is flexible and floppy, but bound to ATP, Rad50 snaps into a ring that presumably closes around DNA in order to repair it.

"We saw a lot of big movement on a molecular scale," said Tainer. "Rad50 is like a rope that can pull. It appears to be a dynamic system of communicating with other molecules, and so we can now see how flexibly linked proteins can alter their physical states to control outcomes in biology."

"We thought ATP allowed Rad50 to change shape, but now we have proof of it and how it works," Russell said. "This is a key part of the MRN puzzle."

An Engine for Many Vehicles

Rad50 and ATP provide the motor and gas for a number of biological machines that operate across species. These machines are linked to a number of disorders, such as cystic fibrosis, which is caused by a defect in the cystic fibrosis transmembrane conductance regulator (CFTR) gene, which is a member of the ABC ATPase superfamily.

"Our study suggests that ABC ATPase proteins are used so often in biology because they can flexibly hook up to so many different things and produce a specific biological outcome," Tainer said.

Given this new prototypic understanding of these motors, Tainer and Russell envision a future in which therapies might be designed that target Rad50 when it changes into a shape that promotes a disease. For example, chemotherapy could be coupled with an agent that prevents the MRN complex from repairing DNA damage, promoting death of cancer cells.

"There are some potentially very cool applications to these findings that we are only beginning to think about," Russell said.

The study was funded by the National Cancer Institute, the National Institutes of Health, and the Department of Energy.

ScienceDaily

TStzmmalaysia
post Mar 29 2011, 05:50 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


NANOTECHNOLOGY

Attached Image

'Nano-bricks' may help build better packaging to keep foods fresher longer

Scientists are reporting on a new material containing an ingredient used to make bricks that shows promise as a transparent coating for improving the strength and performance of plastic food packaging. Called "nano-bricks," the film even looks like bricks and mortar under a microscope, they say. The coating could help foods and beverages stay fresh and flavorful longer and may replace some foil packaging currently in use, they note. The scientists described the new, eco-friendly material here today at the 241st National Meeting and Exposition of the American Chemical Society (ACS).

Ordinary plastic soda bottles tend to lose their fizz after just a few months of storage on grocery store shelves. If manufacturers apply the new coating to these bottles, the material could slow the loss of carbon dioxide gas and help sodas stay bubbly for several more months or even years, the scientists said. The coating could also extend the shelf life for those portable food packages known as MREs (Meal, Ready to Eat) that sustain soldiers in the field, with the added benefit of being microwavable, they noted. Although made to last for at least three years, their shelf life can drop to as little as three months when exposed to harsh conditions such as high heat.

"This is a new, 'outside of the box' technology that gives plastic the superior food preservation properties of glass," said Jaime Grunlan, Ph.D., who reported on the research. "It will give consumers tastier, longer lasting foods and help boost the food packaging industry."

Grunlan notes that manufacturers currently use a variety of advanced packaging materials to preserve foods and beverages. These include plastics coated with silicon oxide, a material similar to sand, which provides a barrier against oxygen, a gas that speeds food spoilage. Another example is the so-called metalized plastics — plastics with a thin coating of metal or foil — used in many potato chip bags.

These and other packaging materials have drawbacks, Grunlan said. Some plastics crack easily during transport or impact. Metalized plastics are non-transparent — a turn-off to consumers who would like to be able to see their food prior to purchase. The presence of metal also prevents their use in the microwave. Food pouches made out of metal, such as MREs, provide impact resistance, but they lack both transparency and microwavability. Consumers need better food packaging options.

Grunlan has identified a promising alternative in the form of "nano-bricks." The new film combines particles of montmorillonite clay, a soil ingredient used to make bricks, with a variety of polymer materials. The resulting film is about 70 percent clay and contains a small amount of polymer, making it more eco-friendly than current plastics. The film is less than 100 nanometers thick — or thousands of times thinner than the width of a single human hair — and completely transparent to the naked eye.

"When viewed under an electron microscope, the film looks like bricks and mortar," said Grunlan, an associate professor in the Department of Mechanical Engineering at Texas A&M University in College Station, Texas. "That's why we call it 'nano-bricks'."

When layered onto existing plastic packaging, it adds strength and provides an improved barrier to oxygen, he said. Grunlan demonstrated in lab studies that the film is 100 times less permeable to oxygen than existing silicon oxide coatings. This means that it's also likely to be a better oxygen barrier than a metal coating, whose permeability is similar to that of silicon oxide, the scientists noted.
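The reason a very thin but very tight coating can dominate the whole package is the standard series-resistance model for multilayer barriers (a textbook result, not a calculation from the article): permeation resistances of stacked layers add, so the laminate's total transmission rate is set largely by its least permeable layer. The oxygen transmission rate (OTR) numbers below are hypothetical:

```python
def total_otr(otrs):
    """Oxygen transmission rate of a laminate (standard series model).

    For stacked layers, permeation resistances (1/OTR) add, so
    1/OTR_total = sum(1/OTR_i). Units are arbitrary but must be
    consistent across all layers.
    """
    return 1.0 / sum(1.0 / r for r in otrs)

plain_plastic = 100.0                              # bare substrate
sio2_coated = total_otr([plain_plastic, 10.0])     # silicon oxide layer
nanobrick = total_otr([plain_plastic, 0.1])        # coating ~100x tighter

# The tighter the coating, the more completely it controls the total.
print(plain_plastic, sio2_coated, nanobrick)
```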

"Others have added clay to polymer to reduce (gas) permeability, but they are thousands of times more permeable than our film," Grunlan said. "We have the most organized structure — a nano-brick wall — which is the source of this exceptional barrier. This is truly the most oxygen impermeable film in existence."

Grunlan is currently trying to improve the quality of the film to make it more appealing to packaging manufacturers, including making it more moisture resistant. He envisions that manufacturers will dip plastics in the coating or spray the coating onto plastics. In the future, he hopes to develop nano-brick films that block sunlight and contain antimicrobial substances to enhance packaging performance.

The new coating also shows promise for use in flexible electronics, scratch-resistant surfaces, tires, and sporting goods, Grunlan said. It could potentially help basketballs and footballs stay inflated longer than existing balls, he added.

PhysOrg

TStzmmalaysia
post Mar 29 2011, 05:51 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


RESEARCH

Attached Image

Bullying alters brain chemistry, leads to anxiety

Bullies and the brain: mice that have been repeatedly bullied by dominant males show an unusual reluctance to approach new, even nonthreatening, mice. Above, a bullied mouse (right) keeps as much distance as it can from its corralled counterpart.

Being low mouse on the totem pole is tough on murine self-esteem. It turns out it has measurable effects on brain chemistry, too, according to recent experiments at Rockefeller University. Researchers found that mice that were bullied persistently by dominant males grew unusually nervous around new company, and that the change in behavior was accompanied by heightened sensitivity to vasopressin, a hormone involved in a variety of social behaviors. The findings suggest how bullying could contribute to long-term social anxiety at the molecular level.

“We found that chronic social stress affects neuroendocrine systems that are paramount for adaptive mammalian social behaviors such as courtship, pair-bonding and parental behaviors,” says Yoav Litvin, M. S. Stoffel Postdoctoral Fellow in Mind, Brain and Behavior. “Changes in components of these systems have been implicated in human disorders, such as social phobias, depression, schizophrenia and autism.”

Litvin and colleagues in Donald Pfaff’s Laboratory of Neurobiology and Behavior set up a rough-and-tumble schoolyard scenario in which a young mouse is placed in a cage with a series of larger, older mice — a different one on each of 10 days. The mice, being territorial, fight it out in a contest that the new arrival invariably loses. Following each 10-minute battle, the mice were separated in the same cage by a partition that kept them physically apart but allowed them to see, smell and hear one another — a stressful experience for the loser.

Given a day to rest, the test mice are then put in the company of nonthreatening mice of comparable size and age. The biggest change in behavior was that the traumatized mice were more reluctant to socialize with their fellow mice, preferring to keep their distance compared to their unbullied counterparts. The mice that had lost their battles were also more likely to “freeze” in place for longer periods of time and to frequently display “risk assessment” behaviors toward their new cage-mates, behaviors that have been shown to be valid indices of fear and anxiety in humans. The researchers also gave a group of mice a drug that blocked vasopressin receptors, which partly curbed some of the anxious behavior in the bullied mice.

The researchers then examined the brains of the mice, particularly sections in the middle of the forebrain known to be associated with emotion and social behavior. They found that mRNA expression for vasopressin receptors — specifically V1bRs — had increased in the bullied mice, making them more sensitive to the hormone, which is found in high levels in rats with innate high anxiety. In humans, the hormone is associated with aggression, stress and anxiety disorders. The surge of vasopressin receptors was especially notable in the amygdala, Litvin and colleagues reported this month in Physiology & Behavior.

How long these effects last remains an open question. Other studies have found, for instance, that chronic stress can impair some cognitive functions in rodents and people, but that their brains can bounce back, given time to recuperate.

Still, many studies in rodents, primates and people have shown that early psychological trauma can have ill effects on health throughout life. Litvin says his study suggests that victims of bullying may have difficulty forming new relationships, and it identifies the possible role for a specific vasopressin receptor.

“The identification of brain neuroendocrine systems that are affected by stress opens the door for possible pharmacological interventions,” Litvin says. “Additionally, studies have shown that the formation and maintenance of positive social relationships may heal some of the damage of bullying. These dynamic neuroendocrine systems may be involved.”

PhysOrg

TStzmmalaysia
post Mar 29 2011, 05:53 PM

Enthusiast
*****
Senior Member
869 posts

Joined: Oct 2010


NANOTECHNOLOGY

Attached Image

Dutch researchers identify huge potential of nanocrystals in fuel cells

The addition of extremely small crystals to solid electrolyte material has the potential to considerably raise the efficiency of fuel cells. Researchers at TU Delft were the first to document this accurately. Their second article on the subject in a very short time was published in the scientific journal, Advanced Functional Materials.

The researchers at the Faculty of Applied Sciences at TU Delft were concentrating their efforts on improving electrolyte materials. This is the material between two electrodes, for example in a fuel cell or a battery. The better the characteristics of the electrolyte, the better, more compactly or more efficiently the fuel cell or battery works.

The electrolyte is usually a liquid, but this has a number of drawbacks. The liquid has to be very well enclosed, for example, and it takes up a relatively large amount of space. "It would therefore be preferable to have an electrolyte made of solid matter," says PhD student Lucas Haverkate. "Unfortunately though, that has disadvantages as well. The conductivity in solid matter is not as good as it is in a liquid."

"In a solid you have a network of ions in which virtually every position is taken. This makes it difficult for the charged particles (protons) to move from one electrode to another. It’s a bit like a traffic jam on a motorway. What you need to do is to create free spaces in the network."

One of the ways of achieving this, and therefore of increasing conductivity in solid electrolytes, is to add nanocrystals of titanium dioxide, between roughly seven and fifty nanometres in size. "A characteristic of these TiO2 crystals is that they attract protons, and this creates more space in the network." The nanocrystals are mixed into the electrolyte with a solid acid (CsHSO4). This latter material 'delivers' the protons to the crystals. "The addition of the crystals appears to cause an enormous leap in the conductive capacity, up to a factor of 100," concludes Haverkate.

This remarkable achievement by TU Delft has already led to two publications in the scientific journal Advanced Functional Materials. Last December, Haverkate published an article on the theory behind the results. His fellow PhD student, Wing Kee Chan, is the main author of a second item that appeared in the same publication this week. Chan focused on the experimental side of the research. "The nice thing about these two publications is that the experimental results and the theoretical underpinning strongly complement each other," says Haverkate.

Chan carried out measurements on the electrolyte material using the neutron diffraction method. This involves sending neutrons through the material. The way in which the neutrons are dispersed makes it possible to deduce certain characteristics of the material, such as the density of protons in the crystals. Haverkate: "It is the first time that measurements have been taken of solid-material electrolytes in this way, and on such a small scale. The fact that we had nuclear research technologies at the Reactor Institute Delft at our disposal was tremendously valuable."

However, the combination of TiO2 and CsHSO4 does not mark the end of the search for a suitable solid-material electrolyte. Other material combinations will be tested that may achieve better scores in the area of stability, for example. Professor Fokko Mulder, who is Haverkate’s and Chan’s PhD supervisor, says: "At this stage, we are more concerned about acquiring a fundamental understanding and a useful model than the concrete issue of finding out what the most suitable material is. It is important that we identify the effect of nanocrystals, and give it a theoretical basis. I think there is great potential for these electrolytes. They also have the extra benefit of continuing to function well over a wide range of temperatures, which is of particular relevance for applying them in fuel cells."

PhysOrg

TStzmmalaysia
post Mar 29 2011, 05:54 PM



RESEARCH


'Green' cars could be made from pineapples and bananas

Your next new car hopefully won't be a lemon. But it could be a pineapple or a banana. That's because scientists in Brazil have developed a more effective way to use fibers from these and other plants in a new generation of automotive plastics that are stronger, lighter, and more eco-friendly than plastics now in use. They described the work, which could lead to stronger, lighter, and more sustainable materials for cars and other products, here today at the 241st National Meeting & Exposition of the American Chemical Society (ACS).

Study leader Alcides Leão, Ph.D., said the fibers used to reinforce the new plastics may come from delicate fruits like bananas and pineapples, but they are super strong. Some of these so-called nano-cellulose fibers are almost as stiff as Kevlar, the renowned super-strong material used in armor and bulletproof vests. Unlike Kevlar and other traditional plastics, which are made from petroleum or natural gas, nano-cellulose fibers are completely renewable.

"The properties of these plastics are incredible," Leão said, "They are light, but very strong — 30 per cent lighter and 3-to-4 times stronger. We believe that a lot of car parts, including dashboards, bumpers, side panels, will be made of nano-sized fruit fibers in the future. For one thing, they will help reduce the weight of cars and that will improve fuel economy."

Besides weight reduction, nano-cellulose-reinforced plastics have mechanical advantages over conventional automotive plastics, Leão added. These include greater resistance to damage from heat, spilled gasoline, water, and oxygen. With automobile manufacturers already testing nano-cellulose-reinforced plastics and reporting promising results, he predicted the materials would be in use within two years.

Cellulose is the main material that makes up the wood in trees and other parts of plants. Its ordinary-size fibers, extracted from wood that is ground up and processed, have been used for centuries to make paper. In more recent years, scientists have discovered that intensive processing of wood releases ultra-small, or "nano," cellulose fibers, so tiny that 50,000 could fit across the width of a single strand of human hair. Like fibers made from glass, carbon, and other materials, nano-cellulose fibers can be added to the raw material used to make plastics, producing reinforced plastics that are stronger and more durable.

Leão, who is with Sao Paulo State University in Sao Paulo, Brazil, said that pineapple leaves and stems, rather than wood, may be the most promising source of nano-cellulose. Another is curaua, a plant related to pineapple that is cultivated in South America. Other good sources include bananas; coir fibers found in coconut shells; typha, or "cattails"; sisal fibers produced from the agave plant; and fique, another plant related to pineapples.

To prepare the nano-fibers, the scientists insert the leaves and stems of pineapples or other plants into a device similar to a pressure cooker. They then add certain chemicals to the plants and heat the mixture over several cycles, producing a fine material that resembles talcum powder. The process is costly, but it takes just one pound of nano-cellulose to produce 100 pounds of super-strong, lightweight plastic, the scientists said.
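The one-pound-to-100-pounds figure is worth spelling out: it means the costly nano-cellulose is only a small fraction of the finished composite, which is what makes the expensive preparation step economically tolerable. The arithmetic:

```python
def filler_mass_fraction(filler_lb, composite_lb):
    """Mass fraction of nano-cellulose filler in the finished plastic."""
    return filler_lb / composite_lb

# The article's ratio: 1 lb of nano-cellulose per 100 lb of reinforced plastic.
frac = filler_mass_fraction(1.0, 100.0)

print(f"{frac:.0%} nano-cellulose by mass")  # 1% nano-cellulose by mass
```

So even if the filler costs many times more per pound than the base resin, its contribution to the composite's overall material cost stays small.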

"So far, we're focusing on replacing automotive plastics," said Leão. "But in the future, we may be able to replace steel and aluminum automotive parts using these plant-based nanocellulose materials."

Similar plastics also show promise for future use in medical applications, such as replacement materials for artificial heart valves, artificial ligaments, and hip joints, Leão and colleagues said.

PhysOrg

Provided by American Chemical Society

