Archive for September 21st, 2011

September 21, 2011

Scientists Turn Back the Clock on Adult Stem Cells Aging

The main function of adult stem cells is to enable the replacement of old or damaged cells of most types, from neurons to skin to the liver. One of the main reasons that organisms as a whole suffer from aging is that their adult stem cells age too, much like any other cell type.

An important difference between stem cells and other types of cells is that there is a limit to how often an ordinary cell can divide (to create new cells of the same type). This limit is controlled by telomeres – structures on the ends of chromosomes that are gradually shortened every time a cell divides, in part because DNA copying mechanisms cannot completely copy the ends of DNA strands. In stem cells, however, an active mechanism rebuilds shortened telomeres. (This also happens in cancer cells, unfortunately.)
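
The division limit described above (the Hayflick limit) can be sketched as a toy simulation. All numbers here are illustrative placeholders, not measured biological values:

```python
# Toy model of the Hayflick limit described above. All numbers are
# illustrative placeholders, not measured biological values.
TELOMERE_START = 10000   # starting telomere length in base pairs (assumed)
LOSS_PER_DIVISION = 100  # base pairs lost per division (assumed)
SENESCENCE_LIMIT = 5000  # below this length the cell stops dividing (assumed)

def divisions_until_senescence(telomerase_active=False, max_divisions=1000):
    """Count cell divisions before telomeres shorten past the limit.

    With telomerase active (as in stem cells and, unfortunately,
    cancer cells), the shortening is rebuilt each division, so the
    count is unbounded and we cap it at max_divisions.
    """
    length = TELOMERE_START
    divisions = 0
    while length > SENESCENCE_LIMIT and divisions < max_divisions:
        divisions += 1
        length -= LOSS_PER_DIVISION
        if telomerase_active:
            length += LOSS_PER_DIVISION  # telomerase rebuilds the ends
    return divisions

# An ordinary cell hits the limit; a telomerase-active cell never does.
```

As the article notes, telomere repair alone does not stop stem-cell aging, which is why the DNA-damage and retrotransposon findings below matter.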

However, in spite of telomere repair in stem cells, they still experience aging, so there must be more to aging than telomeres. One factor is the accumulation of DNA damage due to the inherent imperfections in DNA repair mechanisms.

The research in question here compared young adult stem cells with cells of the same type that had been allowed to divide repeatedly in cultures, in order to determine what changed. One important difference found was the accumulation of DNA segments called Alu element retrotransposons. This type of noncoding DNA is common in primate genomes. However, the accumulation that occurs in aging stem cells appears to be toxic to the cells and eventually forces them into a senescent state.

The good news is that when copying of these Alu elements is suppressed, stem cells are able to regain their self-renewing properties. Naturally, this is being investigated further for possible applications in slowing the overall aging process.

Scientists Turn Back the Clock on Adult Stem Cells Aging

The regenerative power of tissues and organs declines as we age. The modern stem cell hypothesis of aging suggests that a living organism is only as old as its tissue-specific, or adult, stem cells. Therefore, an understanding of the molecules and processes that enable human adult stem cells to initiate self-renewal and to divide, proliferate and then differentiate in order to rejuvenate damaged tissue might be the key to regenerative medicine and an eventual cure for many age-related diseases. A research group led by the Buck Institute for Research on Aging, in collaboration with the Georgia Institute of Technology, conducted the study that pinpoints what goes wrong with the biological clock underlying the limited division of human adult stem cells as they age.

“We demonstrated that we were able to reverse the process of aging for human adult stem cells by intervening with the activity of non-protein coding RNAs originated from genomic regions once dismissed as non-functional ‘genomic junk’,” said Victoria Lunyak, associate professor at the Buck Institute for Research on Aging.

Further reading:

Inhibition of activated pericentromeric SINE/Alu repeat transcription in senescent human adult stem cells reinstates self-renewal

September 21, 2011

Deep Oceans May Mask Global Warming for Years at a Time

Very few processes in nature proceed uniformly in one direction when examined at increasingly small time intervals. One would not expect rising temperatures every day from February through July at a place in the northern hemisphere. But on average, if measured over a number of years, one certainly would. On a global scale, there will be fluctuations lasting as much as a decade, but the long-term trend remains.
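
The point about short-term fluctuations versus long-term trends can be illustrated with synthetic data: a steady warming trend plus bounded year-to-year noise. The trend size and noise level here are made up for illustration:

```python
import random

# Synthetic illustration only: a steady warming trend plus bounded
# year-to-year noise. Trend and noise sizes are made up.
random.seed(0)
years = range(100)
trend = [0.02 * y for y in years]                    # +0.02 degrees per year
noise = [random.uniform(-0.3, 0.3) for _ in years]   # short-term fluctuation
temps = [t + n for t, n in zip(trend, noise)]

def decade_mean(series, start):
    """Average over the 10-year window starting at `start`."""
    window = series[start:start + 10]
    return sum(window) / len(window)

# Any single year can buck the trend, but decade averages reveal it:
early = decade_mean(temps, 0)    # first decade
late = decade_mean(temps, 90)    # last decade, roughly 1.8 degrees warmer
```

In the real climate system the "noise" includes decade-scale heat exchange with the deep ocean, which is exactly what the NCAR study below models.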

Deep Oceans May Mask Global Warming for Years at a Time – US National Science Foundation

Earth’s deep oceans may absorb enough heat at times to flatten the rate of global warming for periods of as long as a decade–even in the midst of longer-term warming. This according to a new analysis led by scientists at the National Center for Atmospheric Research (NCAR).

The study, based on computer simulations of global climate, points to ocean layers deeper than 1,000 feet as the main location of the “missing heat” during periods such as the past decade when global air temperatures showed little trend.

The findings also suggest that several more intervals like this can be expected over the next century, even as the trend toward overall warming continues.

Further reading:

Model-based evidence of deep-ocean heat uptake during surface-temperature hiatus periods

September 21, 2011

First Quantum Computer With Quantum CPU And Separate Quantum RAM

Quantum computers are still at a very early experimental stage. Even the underlying technology best suited to implementing a quantum computer hasn’t yet been settled on. The problem is that qubits, the basic objects a quantum computer works with, are very difficult to control. So it’s not surprising that the larger-scale architecture of quantum computers is also not yet determined. Up until now, experimental architectures have not been like the easily reprogrammable von Neumann architecture of all modern electronic computers. But finally that architecture is being explored for quantum computers.

First Quantum Computer With Quantum CPU And Separate Quantum RAM – Technology Review

Today, Matteo Mariantoni at UC Santa Barbara and pals reveal the first quantum computer with an information processing unit and a separate random access memory.

Their machine is a superconducting device that stores quantum bits or qubits as counter-rotating currents in a circuit (this allows the qubit to be both a 0 and 1 at the same time). These qubits are manipulated using superconducting quantum logic gates, transferred using a quantum bus and stored in separate microwave resonators.

Let’s say upfront that the result is not a particularly powerful computer. Mariantoni and co show off their device by demonstrating a couple of simple and unspectacular algorithms – but ones that were carefully chosen as the building blocks of more impressive tasks such as error correction and factoring large numbers.

Not that they’ve actually done any of those things. What’s impressive, however, is that they soon could since this approach is eminently scalable.
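
The idea that a qubit can be "both a 0 and 1 at the same time", mentioned in the excerpt above, is textbook linear algebra: a qubit is a normalized pair of complex amplitudes, and the Hadamard gate turns the |0⟩ state into an equal superposition. A minimal sketch (generic quantum math, not the UCSB device’s actual control code):

```python
import math

# A qubit is a normalized pair of amplitudes (a, b) for the basis
# states |0> and |1>. The Hadamard gate maps |0> to an equal
# superposition. Textbook linear algebra, not the device's control code.
def hadamard(a, b):
    """Apply the Hadamard gate H to the state a|0> + b|1>."""
    s = 1 / math.sqrt(2)
    return s * (a + b), s * (a - b)

a, b = hadamard(1.0, 0.0)            # start in |0>, apply H
p0, p1 = abs(a) ** 2, abs(b) ** 2    # Born-rule measurement probabilities
# p0 and p1 are both 0.5: the qubit is "both 0 and 1" until measured
```

The hard engineering problem the UCSB group solved is doing such gate operations on physical qubits, moving the results over a quantum bus, and parking them in resonator memory without losing coherence.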

Further reading:

Could this quantum computer be the real deal?

UCSB physicists demonstrate the quantum von Neumann architecture

Implementing the Quantum von Neumann Architecture with Superconducting Circuits

September 21, 2011

NASA’S WISE Mission Captures Black Hole’s Wildly Flaring Jet

Supernovae that originate from the collapse of massive stars and are the power behind gamma-ray bursts have them. Supermassive black holes at the centers of galaxies have them. And so too do stellar mass black holes that are the remnants of supernovae that flared long ago. Jets. Intense jets of radiation and particles moving at nearly the speed of light.

Jets occur frequently in objects of interest to researchers working on high-energy astrophysics. Yet they still aren’t well understood. GX 339-4 is the designation given to a black hole with a mass about 6 times that of our Sun and located 20,000 light years away.

NASA’s Wide-field Infrared Survey Explorer (WISE) records images covering the whole sky every 11 seconds. This made it possible to capture repeated images of a small area around GX 339-4 at the base of its jets. Its jets are fueled by an accretion disk which in turn is fed by gas sucked from a companion star. What was surprising about the observations was the wide variability of both the size and the brightness of the small region where the jets originate.

NASA’S WISE Mission Captures Black Hole’s Wildly Flaring Jet

The results surprised the team, showing huge and erratic fluctuations in the jet activity on timescales ranging from 11 seconds to a few hours. The observations are like a dance of infrared colors and show that the size of the jet’s base varies. Its radius is approximately 15,000 miles (24,140 kilometers), with dramatic changes as large as a factor of 10 or more.

“If you think of the black hole’s jet as a firehose, then it’s as if we’ve discovered the flow is intermittent and the hose itself is varying wildly in size,” Poshak said.

The new data also allowed astronomers to make the best measurements yet of the black hole’s magnetic field, which is 30,000 times more powerful than the one generated by Earth at its surface. Such a strong field is required for accelerating and channeling the flow of matter into a narrow jet. The WISE data are bringing astronomers closer than ever to understanding how this exotic phenomenon works.

Further reading:

Black Hole Jets Gone Wild

A variable mid-infrared synchrotron break associated with the compact jet in GX 339-4

Rapid optical and X-ray timing observations of GX 339-4: multi-component optical variability in the low/hard state

Rapid optical and X-ray timing observations of GX 339-4: flux correlations at the onset of a low/hard state

Related articles:

Scientists precisely locate black hole using the material it ejects

September 21, 2011

Gamma-ray bursts shed light on the nature of dark energy

Dark energy was “discovered” unexpectedly in 1997 (published in 1998), when a survey of relatively nearby Type Ia supernovae found something strange. These supernovae should all have about the same intrinsic luminosity. But instead, under the then-current assumptions about the rate of expansion of the universe, the farther away these supernovae were, the less their intrinsic luminosity would have to be.

If in fact these supernovae have pretty much the same intrinsic luminosity in all cases, then they are “standard candles”, and a simple application of the inverse square law (apparent luminosity is proportional to the inverse square of the distance) determines how far away each object should be. However, the distance can also be calculated from the observed redshift in the objects’ spectra using “Hubble’s law”, which states that relative velocity (inferred from redshift) is proportional to distance, with a constant of proportionality called “Hubble’s constant”. (The law is named after Edwin Hubble, who first formulated it.)
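
The two distance estimates just described can be put side by side in a short sketch. The numbers are assumptions for illustration – a round value for Hubble’s constant and the low-redshift approximation v = cz – not the study’s actual calculation:

```python
import math

# Illustrative values only: H0 is an assumed round number, and the
# low-redshift approximation v = c*z is used.
C = 299792.458   # speed of light, km/s
H0 = 70.0        # Hubble constant, km/s/Mpc (assumed)

def hubble_distance_mpc(z):
    """Hubble's law: recession velocity v = c*z, distance d = v / H0."""
    return C * z / H0

def apparent_flux(luminosity, distance_mpc):
    """Inverse square law: observed flux falls off as 1/distance^2
    (arbitrary units)."""
    return luminosity / (4 * math.pi * distance_mpc ** 2)

# A standard candle at twice the distance appears one quarter as bright,
# so an observed flux and a Hubble's-law distance can be cross-checked.
d = hubble_distance_mpc(0.001)   # ~4.3 Mpc for a tiny redshift
```

The 1998 surprise was precisely that these two estimates disagreed for distant Type Ia supernovae.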

The problem was that the supernovae appeared too faint for the distances predicted by Hubble’s law. Otherwise stated, if the remote supernovae were actually as close as the Hubble’s-law distances implied, they could not be as intrinsically luminous as nearby supernovae of the same type. This problem would go away if the Hubble’s-law distances were underestimates – which could happen if something now called “dark energy” existed and were accelerating the expansion.

To verify the dark-energy hypothesis, astrophysicists want another type of standard candle for gauging very large distances – ideally one that doesn’t depend on the behavior of Type Ia supernovae and works out to much larger distances. They have long suspected that another type of supernova, responsible for a much more energetic pulse of electromagnetic radiation – a gamma-ray burst – can fill the bill.

Research just announced has studied the properties of relatively nearby gamma-ray bursts and identified certain characteristics that allow predicting the intrinsic brightness of the burst. Comparing that brightness to what’s actually observed determines the distance of the event.

Gamma-ray bursts shed light on the nature of dark energy – University of Warsaw

Dark energy is the basic constituent of the Universe today, one that is responsible for its accelerated expansion. Although astronomers observe the cosmological effects of the impact of dark energy, they still do not know exactly what it is. A new method for measuring the largest distances in the Universe developed by scientists from the Faculty of Physics, University of Warsaw and the University of Naples Federico II helps solve the mystery. A key role is played by the most powerful cosmic explosions – gamma-ray bursts.

What is the nature of dark energy, a recently discovered dominant constituent of the Universe today? Is expansion-accelerating dark energy an intrinsic property of space-time itself or rather a field unknown to science? A new distance-measuring method developed by scientists from the Faculty of Physics, University of Warsaw (FUW) and the University of Naples Federico II can provide the answer. “We are able to determine the distance of an explosion on the basis of the properties of the radiation emitted during gamma-ray bursts. Given that some of these explosions are related to the most remote objects in space that we know about, we are able, for the first time, to assess the speed of space-time expansion even in the relatively early periods after the Big Bang,” says Prof. Marek Demiański (FUW). The method was used to verify models of the structure of the Universe containing dark energy.
