Cancer Causing Gene Protein Found

Dr. Tak Mak and scientists at The Campbell Family Institute for Breast Cancer Research at Princess Margaret Hospital have discovered the role of two "cousins" in the genetic family tree of cancer development.

The findings, published online today in the journal Genes & Development, plant the seed for a critical new branch of scientific inquiry, says Dr. Mak, the principal investigator. Dr. Mak, Director of The Campbell Family Institute, is also a professor at the University of Toronto in the Departments of Medical Biophysics and Immunology.

The cousins are proteins related to the p53 gene family; the patriarch, p53, has been known for two decades as the master gatekeeper that controls cancer development. When p53 is defective, it loses its ability to regulate healthy cells and suppress cancer.

"Until now, we thought these cousins (TAp73 protein isoforms) were not involved in cancer. Our results prove that they are. This is fundamental to understanding every human cancer and furthering the science."

In the lab, Dr. Mak and his team challenged traditional thinking about the role of these proteins. "Before, scientists studied only whether these proteins were present or absent. We decided to study how they interact with each other and discovered that they actually have a split personality. When we turn one 'on' or 'off', the other changes behavior and becomes part of the cancer-causing process. The key is understanding the ratio of the interaction."

"The next step is to understand how the ratio affects cell division that leads to human cancer," says Dr. Mak, whose work was supported by the Canadian Institutes for Health Research.
Read More...>>

A Unique Way to Measure Dark Energy with Galaxies and Quasars

The Sloan Digital Sky Survey (SDSS) uses a 2.5-meter telescope with a wider field of view than any other large telescope; it sits atop Apache Point, a mountaintop in New Mexico, and is devoted solely to mapping the universe. We now know that some three-quarters of the universe consists of dark energy, whose very existence was unsuspected when telescope construction began in 1994 and was still controversial when the first Sloan survey started in 2000.

Investigating dark energy has since emerged as one of the most crucial tasks of the SDSS. SDSS-III, the third major mapping program, started in midsummer 2008, with the biggest of its four component surveys being a dark-energy probe called BOSS, the Baryon Oscillation Spectroscopic Survey.

Astrophysicist David Schlegel, since 2004 a member of the Physics Division at the U.S. Department of Energy's Lawrence Berkeley National Laboratory, is the principal investigator of BOSS; formerly of Princeton University, Schlegel has been part of the SDSS team since its beginning.

"To tell the truth, the first time I heard about dark energy I was skeptical," Schlegel says.

The evidence came from studies, pioneered by the Supernova Cosmology Project based at Berkeley Lab, comparing the brightness and redshift of distant Type Ia supernovae. The results showed that the expansion of the universe was accelerating, driven by something which, being unknown, was soon tagged with the label dark energy.

Schlegel was soon won over, however, and immediately realized that the Sloan telescope, "which has an enormous field of view," could be used for quite a different kind of dark energy measurement, one completely independent of supernova studies. Baryon acoustic oscillation is a fancy name for the way galaxies are distributed: their density varies regularly, bunching up roughly every 500 million light years, providing a natural "ruler" for measuring how much the universe has expanded since early in its history.

To use a 500-million-light-year ruler, however, one has to have at least a few billion light years to use it in. The Sloan telescope was custom-designed to take in such an enormous volume of space.
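How such a ruler is used can be sketched in a few lines. The code below is a minimal illustration, not the SDSS analysis: it assumes a flat Lambda-CDM cosmology with placeholder parameters (H0 = 70 km/s/Mpc, Omega_m = 0.3) and a comoving BAO scale of about 150 Mpc, roughly 500 million light years, and shows how the apparent angular size of that fixed ruler shrinks with redshift.

```python
import numpy as np
from scipy.integrate import quad

# Illustrative, assumed cosmology (not the values used by the BOSS analysis):
H0 = 70.0          # Hubble constant, km/s/Mpc
OMEGA_M = 0.3      # matter density parameter (flat universe assumed)
C = 299792.458     # speed of light, km/s
R_BAO = 150.0      # comoving BAO ruler, Mpc (~500 million light years)

def E(z):
    """Dimensionless expansion rate H(z)/H0 for flat Lambda-CDM."""
    return np.sqrt(OMEGA_M * (1 + z) ** 3 + (1 - OMEGA_M))

def comoving_distance(z):
    """Line-of-sight comoving distance in Mpc."""
    integral, _ = quad(lambda zp: 1.0 / E(zp), 0.0, z)
    return (C / H0) * integral

def bao_angle_degrees(z):
    """Apparent angular size of the BAO ruler at redshift z (flat universe,
    so the comoving angular diameter distance equals the comoving distance)."""
    return np.degrees(R_BAO / comoving_distance(z))

for z in (0.35, 0.7, 2.5):
    print(f"z = {z:4.2f}:  BAO scale subtends ~{bao_angle_degrees(z):.1f} degrees")
```

Comparing the angle the ruler actually subtends at each redshift with the scale fixed in the early universe is what turns a galaxy map into an expansion history.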
[Figure: Anisotropies in the cosmic microwave background, originating when the universe was less than 400,000 years old, are directly related to variations in the density of galaxies as observed today.]

The distribution of visible mass in the universe

"Baryon" (meaning protons and neutrons and other relatively massive particles) is shorthand for ordinary matter. For almost the first 400,000 years, the universe was so dense that particles of matter were thoroughly entangled with particles of light (photons), the whole a vast, quivering, liquid-like blob where density variations caused sound waves (pressure waves) to move spherically outward at over half the speed of light.

Suddenly the expanding universe cooled enough for light and matter to "decouple." Photons shot through transparent space unimpeded; the speed of sound plummeted. What had been variations in the density of the liquid universe left two marks in the now-transparent sky.

Variations in the temperature of the radiation that filled the early universe have descended to us as anisotropies in the cosmic microwave background (CMB). Variations in the density of matter persist in the clustering of galaxies, as baryon acoustic oscillations (BAO). The two scales, the roughly one-degree anisotropy of the CMB and the 500-million-light-year clustering of BAO, are closely related; the standard ruler of the universe measured from BAO can be calculated from the CMB for any epoch since decoupling.

Schlegel and his colleague Nikhil Padmanabhan, who came to Berkeley Lab from Princeton in late 2006, first used the SDSS telescope to complete what was then the largest three-dimensional map of the universe ever made: 8,000 square degrees of sky out to a distance of 5.6 billion light years, determining the clustering of 60,000 luminous red galaxies. This program, part of SDSS-II, measured galactic distances to a redshift of z = 0.35 and detected the 500-million-light-year scale of BAO.

"We were mostly excited that we could make a measurement," Schlegel says. "We proved we had a ruler we could use."

"With BOSS, we're going from a measurement to a much more precise measurement that we can use to constrain dark energy," says Padmanabhan.

BOSS will double the volume of space in which red luminous galaxies will be studied, observing 10,000 square degrees of sky out to redshifts of z = 0.7; the galaxy sample will increase from 60,000 to 1.5 million. BOSS will also include a new kind of object, measuring up to 200,000 quasars at even more extreme redshifts of z = 2 or more.
[Photo: David Schlegel, principal investigator of BOSS, shows one of the numerous "plug plates" used to map and select hundreds of galaxies for each exposure. Light from each galaxy enters a...]

"The epoch of the most common quasars was at redshifts between 2 and 3," Schlegel says, "just enough that we can still see them at optical wavelengths in the blue and ultraviolet; in this respect, nature was kind to us. BOSS will be the first look at dark energy at these redshifts. Only a few Type Ia supernovae have been found beyond redshift 1."

Galaxies and quasars provide different ways to measure the expansion of the universe using baryon acoustic oscillation. The angle of separation among galaxies across the sky and the distance of their separation along the line of sight (at different redshifts) show how much the BAO cosmic ruler has changed since baryons and photons decoupled.

"At the moment of decoupling, baryon oscillations were frozen," says Padmanabhan. "A really simple geometry test tells us how much the scale has expanded – and accelerated – since the oscillations were frozen in."

Quasars allow a look at baryon oscillation in a different tracer than the distribution of galaxies: the varying density of the gas in the universe, which can be probed at hundreds of points along the line of sight to each quasar.

The measured spectrum of an individual quasar is conditioned by the absorption of its light by clouds of hydrogen gas between the quasar and the viewer. "If the universe were empty, the spectra would be featureless," Schlegel says. "We are using the quasars as a backlight to measure hydrogen absorption."
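The sketch below is a purely synthetic illustration of that backlight idea (none of the numbers come from BOSS data): dividing an observed quasar spectrum by an assumed unabsorbed continuum gives the transmitted fraction F = exp(-tau) in each pixel, and the fluctuations in F trace the hydrogen density along that sight line.

```python
import numpy as np

rng = np.random.default_rng(0)

# Entirely synthetic example: an assumed smooth quasar continuum and an
# observed spectrum suppressed by intervening hydrogen absorption.
wavelength = np.linspace(3600.0, 4800.0, 600)            # Angstroms (observed frame)
continuum = 1.0 + 0.2 * (wavelength - 3600.0) / 1200.0   # hypothetical continuum shape
optical_depth = rng.lognormal(mean=-0.5, sigma=1.0, size=wavelength.size)
observed_flux = continuum * np.exp(-optical_depth)       # mock absorbed spectrum

# Transmitted fraction and its fluctuation field along this sight line.
transmission = observed_flux / continuum                  # F = exp(-tau)
delta_F = transmission / transmission.mean() - 1.0        # density-tracing fluctuations

print(f"mean transmitted fraction <F> = {transmission.mean():.2f}")
print(f"rms fluctuation of delta_F    = {delta_F.std():.2f}")
```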

Because the CMB precisely locks in the value of BAO at the moment of decoupling (equivalent to a redshift of z = 1,089), it greatly leverages the accuracy of baryon acoustic oscillations at much more recent epochs and lower redshifts. Padmanabhan says that "BOSS will be within a factor of 2 of the best possible map of BAO in the universe."

Schlegel adds, "No one will repeat this experiment."

What it will take

Good as the Sloan telescope is, BOSS will require some major improvements in instrumentation, to be carried out under the direction of Berkeley Lab physicist Natalie Roe. The first task is to increase the number of objects that can be measured with each exposure.

Objects to be included in the survey (galaxies or quasars) are first chosen and located from previous photos. For each object, a hole is drilled in a metal "plug plate," which acts as a mask to reduce ambient light. An optical fiber is plugged into each hole to carry the light from the object directly to the CCDs. The diameter, separation, and other characteristics of the fibers put a limit on how many can be used at once. At present the limit is 640; BOSS will increase that number to 1,000 fibers with improved optics, allowing more objects and less sky contamination in each exposure.
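The sketch below illustrates the kind of constraint this hardware imposes. It is a toy example, not the SDSS plate-design software, and the collision radius and plate size are assumed round numbers: two targets closer together than the minimum fiber separation cannot both be observed on one plate, so a simple greedy pass assigns at most 1,000 non-colliding targets.

```python
from math import hypot
import numpy as np

rng = np.random.default_rng(1)

N_FIBERS = 1000            # BOSS fibers available per plate
MIN_SEP_ARCSEC = 62.0      # assumed minimum fiber separation (illustrative value)
PLATE_RADIUS_DEG = 1.49    # approximate radius of one Sloan plate on the sky

# Hypothetical target list: random (x, y) positions in degrees within one plate.
candidates = rng.uniform(-PLATE_RADIUS_DEG, PLATE_RADIUS_DEG, size=(3000, 2))
candidates = candidates[np.hypot(candidates[:, 0], candidates[:, 1]) < PLATE_RADIUS_DEG]

min_sep_deg = MIN_SEP_ARCSEC / 3600.0
assigned = []
for x, y in candidates:
    # Greedy pass: skip any target that would collide with an already-assigned fiber.
    if all(hypot(x - ax, y - ay) >= min_sep_deg for ax, ay in assigned):
        assigned.append((x, y))
    if len(assigned) == N_FIBERS:
        break

print(f"{len(assigned)} of {len(candidates)} candidate targets receive a fiber")
```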

Higher redshifts call for CCDs with better sensitivity to the red and near-infrared end of the spectrum. The Berkeley Lab high-resistivity CCD, descended from silicon detectors used in high-energy physics, is particularly suited to this purpose and is integral to the Lab's proposed SuperNova/Acceleration Probe (SNAP) satellite, the inspiration and leading contender for the NASA/DOE Joint Dark Energy Mission (JDEM). SNAP will study both supernovae and "weak lensing" (a third approach to measuring dark energy). BOSS will use the same rugged, highly red-sensitive CCDs to study luminous red galaxies.

BAO and supernovae are two highly complementary ways of approaching the wide-open questions of dark energy – for example, whether dark energy has been constant or variable over time, or even whether it may be illusory, with its most obvious effect, the accelerating expansion of the universe, resulting instead from some unperceived flaw in Einstein's General Theory of Relativity. Although BAO and supernova studies are both purely geometric, they are independent.

To study expansion using distant Type Ia supernovae, scientists must determine their brightness as measured against "nearby" Type Ias. Baryon oscillation studies do almost the reverse, measuring the expansion of the universe by calibrating the dimensions of the BAO cosmic ruler, seen in relatively nearby objects, against the scale that was frozen in when the universe was less than 400,000 years old. BOSS will be able to determine BAO with an accuracy approaching one percent, one of the most precise possible measurements of the expansion of the universe.


3-D virtual reality environment by UC San Diego

Its name sounds like something out of science fiction, but the StarCAVE at the University of California, San Diego is now a science fact. The virtual-reality environment allows groups of scientists to venture into worlds as small as nanoparticles and as big as the cosmos – permitting new insights that could fuel discoveries in many fields. Early users of the StarCAVE include UC San Diego researchers in biomedicine, neuroscience, structural engineering, archaeology, earth science, genomics, art history and other disciplines.
[Photo: Calit2 researchers explore proteins in 3-D from the Protein Data Bank, displayed inside the StarCAVE.]

The StarCAVE is a five-sided virtual reality (VR) room where scientific models and animations are projected in stereo on 360-degree screens surrounding the viewer, and onto the floor as well. It was constructed by the UC San Diego division of the California Institute for Telecommunications and Information Technology (Calit2). At less than $1 million, the StarCAVE immersive environment cost approximately the same as earlier VR systems, while offering much higher resolution and contrast.

“When you’re inside the StarCAVE the quality of the image is stunning,” said Thomas A. DeFanti, director of visualization at Calit2 and one of the pioneers of VR systems. “The StarCAVE supports 20/40 vision and the images are very high contrast, thanks to the room’s unique shape and special screens that allow viewers to use 3-D polarizing glasses. You can fly over a strand of DNA and look in front, behind and below you, or navigate through the superstructure of a building to detect where damage from an earthquake may have occurred.”

A research paper about the design and construction of the StarCAVE appears in the current issue of the Elsevier journal Future Generation Computer Systems (FGCS) and is available online at ScienceDirect. DeFanti’s co-authors on “The StarCAVE, a Third-Generation CAVE and Virtual Reality OptIPortal” include Calit2’s Gregory Dawe, Jurgen P. Schulze, Peter Otto, Falko Kuester, Larry Smarr and Ramesh Rao (all at UC San Diego), as well as Daniel J. Sandin of the University of Illinois at Chicago’s Electronic Visualization Lab (EVL) and Javier Girado (now at Qualcomm Inc.).

The StarCAVE represents the third generation of surround-VR rooms. DeFanti’s team built and named the original Cave Automatic Virtual Environment (CAVE) at the University of Illinois at Chicago in 1991. A second-generation model built ten years later at EVL is now the standard surround-VR technology, widely used around the world and marketed by Mechdyne Corp. The first- and second-generation CAVEs require viewers to wear battery-powered ‘shutter’ glasses; the StarCAVE provides an improved 3-D experience and allows viewers to wear only lightweight, polarized ‘sun’ glasses.
[Photo: Calit2 Director of Visualization Tom DeFanti inside the StarCAVE virtual reality system.]

The room operates at a combined resolution of over 68 million pixels – 34 million per eye – distributed over 15 rear-projected walls and two floor screens. Each side of the pentagon-shaped room has three stacked screens, with the bottom and top screens tilted inward by 15 degrees to increase the feeling of immersion (while also reducing the ghosting, or ‘seeing double’, that bedevils VR systems).

Because the StarCAVE is designed to help scientists, DeFanti and his team made sure to incorporate the latest in computer graphics processing – using 34 of the newest nVIDIA chips that can generate highly complex images. Thirty-four high-definition projectors (two per screen) create very bright left- and right-eye visuals, i.e. stereo or 3-D images, on each screen. Each pair of projectors is powered by a high-end, quad-core PC running on Linux, with dual graphics processing units and dual network cards to achieve gigabit Ethernet or 10GigE networking.
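A quick arithmetic check of those figures, assuming each projector drives a roughly HD (1920 x 1080) image, an assumption since the per-projector resolution is not given here:

```python
# Quick check of the pixel counts quoted for the StarCAVE, assuming each
# projector drives a roughly HD image (an assumption; the exact projector
# resolution is not stated in this article).
SIDES = 5
SCREENS_PER_SIDE = 3
FLOOR_SCREENS = 2
PROJECTORS_PER_SCREEN = 2            # one per eye for passive stereo
PIXELS_PER_PROJECTOR = 1920 * 1080   # assumed ~2-megapixel projectors

screens = SIDES * SCREENS_PER_SIDE + FLOOR_SCREENS   # 17 screens
projectors = screens * PROJECTORS_PER_SCREEN         # 34 projectors
pixels_per_eye = screens * PIXELS_PER_PROJECTOR      # one projector per screen per eye
total_pixels = projectors * PIXELS_PER_PROJECTOR

print(f"screens: {screens}, projectors: {projectors}")
print(f"~{pixels_per_eye / 1e6:.0f} million pixels per eye, "
      f"~{total_pixels / 1e6:.0f} million combined")
```

Under that assumption the tally lands close to the quoted figures of 34 million pixels per eye and over 68 million combined.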

“With its advanced networking, the StarCAVE is the best virtual-reality portal within the OptIPuter network,” said Calit2’s DeFanti, co-principal investigator on the National Science Foundation-funded OptIPuter project. “That network now connects more than 20 so-called OptIPortals which are up and running around the world. They are ultra-high-resolution, tiled display walls, but for some researchers, flat display walls just aren’t enough: they want the realism that comes from the fully immersive 3-D experience that only a 360-degree VR room such as the StarCAVE can offer.”

Adding to the virtual reality in the StarCAVE is the surround sound system, which harnesses recent advances in wave field synthesis – a way to maximize the perception of many channels of sound emanating from different sides of the room. Calit2 also worked closely with Meyer Sound, Inc., to customize the installation of three arrays of five conventional high-quality speakers to provide 5.1 surround sound or up to 15 channels of discrete audio diffusion (with a subwoofer channel built into the floor structure).

Users of the StarCAVE can interact with the visuals on the 360-degree display by pointing a “wand” that makes it easy to fly through the 3-D images and zoom in or out. The exact position of the wand and the user is determined by a multi-camera wireless tracking system.

Among the VR room’s other features, it is wheelchair accessible, and it was designed to withstand earthquakes. One of the StarCAVE’s five walls (along with its six projectors, three screens and three computers) rolls back on steel rails to provide access for users into the space, and the wall rolls back into place to provide the full 360-degree, immersive VR experience.

While the StarCAVE was in development, computer scientists were working on new applications to adapt computer programs for display in the VR environment. The room connects to the Protein Data Bank, so users can pull up one or multiple proteins and fly around them to find similarities and differences between the proteins. A virtual replica of Calit2’s headquarters building at UC San Diego, Atkinson Hall, has been used by neuroscientists who want to know if the human brain operates differently in virtual reality versus reality in ‘wayfinding’ situations.
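The StarCAVE’s protein viewer is custom software, but as a minimal, self-contained illustration of pulling a structure from the Protein Data Bank, the sketch below fetches one entry over the RCSB public file service and counts its atom records; the URL pattern and the arbitrary entry ID 1CRN are assumptions for illustration only.

```python
from urllib.request import urlopen

# Minimal illustration only: fetch one structure from the Protein Data Bank's
# public file server and count its atom records. The entry ID is arbitrary.
PDB_ID = "1CRN"
url = f"https://files.rcsb.org/download/{PDB_ID}.pdb"

with urlopen(url) as response:
    lines = response.read().decode("utf-8").splitlines()

atom_count = sum(1 for line in lines if line.startswith(("ATOM", "HETATM")))
print(f"{PDB_ID}: {atom_count} atom records")
```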

“We also created an application which displays computer-aided design models of parts of the new San Francisco-Oakland Bay Bridge,” said DeFanti. “The application allows users to walk/fly through these parts at their real size, to find material clashes, construction errors, and generally to draw conclusions about whether the structure could be built as designed.”
Read More...>>

Superhard, Super Slippery Diamonds

They call diamonds “ice,” and not just because they sparkle. Engineers and physicists have long studied diamond because even though the material is as hard as an ice ball to the head, diamond slips and slides with remarkably low friction, making it an ideal material or coating for seals, high performance tools and high-tech moving parts.

Robert Carpick, associate professor in the Department of Mechanical Engineering and Applied Mechanics at the University of Pennsylvania, and his group led a collaboration with researchers from Argonne National Laboratory, the University of Wisconsin-Madison and the University of Florida to determine what makes diamond films such slippery customers, settling a debate on the scientific origin of these properties and providing new knowledge that will help create the next generation of super low friction materials.

The Penn experiments, the first study of diamond friction convincingly supported by spectroscopy, looked at two of the main hypotheses posited for years as to why diamonds demonstrate such low friction and wear properties. Using a highly specialized technique known as photoelectron emission microscopy, or PEEM, the study reveals that this slippery behavior comes from passivation of atomic bonds at the diamond surface that were broken during sliding and not from the diamond turning into its more stable form, graphite. The bonds are passivated by dissociative adsorption of water molecules from the surrounding environment. The researchers also found that friction increases dramatically if there is not enough water vapor in the environment.

Some previous explanations for the source of diamond’s super low friction and wear assumed that the friction between sliding diamond surfaces imparted energy to the material, converting diamond into graphite, itself a lubricating material. However, until this study no detailed spectroscopic tests had ever been performed to determine the legitimacy of this hypothesis. The PEEM instrument, part of the Advanced Light Source at Lawrence Berkeley National Laboratory, allowed the group to image and identify the chemical changes on the diamond surface that occurred during the sliding experiment.

The team tested a thin film form of diamond known as ultrananocrystalline diamond and found super low friction (a friction coefficient ~0.01, which is more slippery than typical ice) and low wear, even in extremely dry conditions (relative humidity ~1.0%). Using a microtribometer, a precise friction tester, and X-ray photoelectron emission microscopy, a spatially resolved X-ray spectroscopy technique, they examined wear tracks produced by sliding ultrananocrystalline diamond surfaces together at different relative humidities and loads. They found no detectable formation of graphite and just a small amount of carbon re-bonded from diamond to amorphous carbon. However, oxygen was present on the worn part of the surface, indicating that bonds broken during sliding were eventually passivated by the water molecules in the environment.
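As a simple illustration of how a friction coefficient like the ~0.01 quoted above is extracted from such measurements, the sketch below fits friction force as a linear function of applied load and reads the coefficient off the slope; the numbers here are synthetic, not the Penn team's data.

```python
import numpy as np

# Synthetic illustration only: lateral vs. normal force pairs such as a
# microtribometer might record, generated with a built-in coefficient of ~0.01.
rng = np.random.default_rng(2)
normal_force_mN = np.linspace(10.0, 100.0, 20)                          # applied loads
lateral_force_mN = 0.01 * normal_force_mN + rng.normal(0.0, 0.02, 20)   # measured friction

# Amontons' law: friction force = mu * normal load; fit mu as the slope.
mu, intercept = np.polyfit(normal_force_mN, lateral_force_mN, 1)
print(f"fitted friction coefficient mu ~ {mu:.3f}")
```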
Read More...>>
