SETI statistics :
SETI (Search for Extraterrestrial Intelligence) is a scientific area whose goal is to detect intelligent life outside Earth. One approach, known as radio SETI, uses radio telescopes to listen for narrow-bandwidth radio signals from space. Such signals are not known to occur naturally, so a detection would provide evidence of extraterrestrial technology.
Radio telescope signals consist primarily of noise (from celestial sources and the receiver’s electronics) and man-made signals such as TV stations, radar, and satellites. Modern radio SETI projects analyze the data digitally. More computing power enables searches to cover greater frequency ranges with more sensitivity. Radio SETI, therefore, has an insatiable appetite for computing power. Previous radio SETI projects have used special-purpose supercomputers, located at the telescope, to do the bulk of the data analysis. In 1995, David Gedye proposed doing radio SETI using a virtual supercomputer composed of large numbers of Internet-connected computers, and he organized the SETI@home project to explore this idea. SETI@home was originally launched in May 1999.
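The digital analysis at the heart of radio SETI boils down to finding a narrow spike of power in a wide, noisy spectrum: a narrow-band carrier concentrates its power into a single FFT bin, while noise spreads across all of them. A minimal sketch of that idea (a toy illustration with made-up sample rate and frequency, not SETI@home's actual pipeline):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy illustration: a weak narrow-band carrier buried in receiver noise,
# recovered via an FFT power spectrum. All numbers are hypothetical.
sample_rate = 1_000_000          # 1 MHz of baseband bandwidth
n = 2 ** 20                      # ~1 s of samples
t = np.arange(n) / sample_rate

signal_hz = 123_456.0            # hypothetical narrow-band carrier
noise = rng.normal(0.0, 1.0, n)
carrier = 0.05 * np.sin(2 * np.pi * signal_hz * t)   # far below the noise floor
data = noise + carrier

# Power spectrum: the carrier's power piles up in one bin,
# while the noise power is spread over all n/2 bins.
spectrum = np.abs(np.fft.rfft(data)) ** 2
freqs = np.fft.rfftfreq(n, d=1.0 / sample_rate)

peak = freqs[np.argmax(spectrum)]
print(f"strongest bin: {peak:.1f} Hz")
```

The carrier here is invisible in the raw samples (amplitude 0.05 against unit-variance noise), yet its FFT bin stands far above the noise floor; real searches repeat this over many frequency resolutions and correct for Doppler drift.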
Climate predictions :
The aim of climateprediction.net is to investigate the approximations that have to be made in state-of-the-art climate models. By running the model thousands of times (a ‘large ensemble’) we hope to find out how the model responds to slight tweaks to these approximations – slight enough not to make the approximations any less realistic. This will improve our understanding of how sensitive our models are to small changes, and also to things like changes in carbon dioxide and the sulphur cycle, and will allow us to explore how climate may change in the next century under a wide range of different scenarios.

In the past, estimates of climate change have had to be made using one or, at best, a very small ensemble (tens rather than thousands!) of model runs. By using your computers, we will be able to improve our understanding of, and confidence in, climate change predictions far beyond what would be possible using the supercomputers currently available to scientists. The climateprediction.net experiment should help to “improve methods to quantify uncertainties of climate projections and scenarios, including long-term ensemble simulations using complex models”, identified by the Intergovernmental Panel on Climate Change (IPCC) in 2001 as a high priority. Hopefully, the experiment will give decision makers a better scientific basis for addressing one of the biggest potential global problems of the 21st century. The results from the climateprediction.net experiment will be fed into the work of the Quantifying Uncertainty in Model Predictions (QUMP) team at the Met Office and will form part of the UK contribution to the Fourth Assessment Report of the IPCC.

To help make participation in climateprediction.net more rewarding and fun, we are developing educational resources to help participants learn more about what their model is telling them. These include materials for schools, an Open University short course, and a lively, interactive web-based community where participants can compare, discuss, analyse and learn about their model runs.
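The perturbed-parameter ensemble idea can be sketched with a toy model (this is my own illustration, not the climateprediction.net model): perturb two uncertain parameters of a zero-dimensional energy-balance model by small amounts, run the "model" thousands of times, and look at the spread of the predicted equilibrium temperature.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy perturbed-parameter ensemble: a zero-dimensional energy-balance model
# whose equilibrium temperature T satisfies
#     S * (1 - albedo) / 4 = eps * sigma * T**4
# The parameter values and perturbation sizes are illustrative assumptions.
S = 1361.0          # solar constant, W/m^2
sigma = 5.670e-8    # Stefan-Boltzmann constant, W/m^2/K^4

n_runs = 10_000     # a "large ensemble" of slightly tweaked parameter sets
albedo = rng.normal(0.30, 0.005, n_runs)   # small tweaks to planetary albedo
eps = rng.normal(0.61, 0.005, n_runs)      # small tweaks to effective emissivity

T_eq = (S * (1.0 - albedo) / (4.0 * eps * sigma)) ** 0.25

print(f"ensemble mean:   {T_eq.mean():.1f} K")
print(f"ensemble spread: {T_eq.std():.2f} K (1 sigma)")
```

Even sub-percent parameter tweaks produce a spread of the order of a kelvin in this toy, which is why a large ensemble, rather than a single "best" run, is needed to quantify the uncertainty.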
Einstein@Home :
Einstein@Home is a project developed to search data from the Laser Interferometer Gravitational-Wave Observatory (LIGO) in the US and from the GEO 600 gravitational wave observatory in Germany for signals coming from extremely dense, rapidly rotating stars. Such sources are believed to be either quark stars or neutron stars, and a subclass of these is already observed by conventional means as pulsars or X-ray-emitting celestial objects. Scientists believe that some of these compact stars may not be perfectly spherical, and if so, they should emit characteristic gravitational waves, which LIGO and GEO 600 may begin to detect in the coming months. Bruce Allen of the University of Wisconsin-Milwaukee’s (UWM) LIGO Scientific Collaboration (LSC) group is leading the development of the Einstein@Home project. Einstein@Home is one small part of the LSC scientific program. It is being set up as a distributed computing project, which means that it relies on computer time donated by private computer users like you to search for gravitational wave-emitting compact stars.
Predictor@home :
Predictor@home is a world-community effort to use distributed volunteer resources on the World Wide Web to assemble a supercomputer able to predict protein structure from protein sequence. Our work is aimed at testing and evaluating new algorithms and methods of protein structure prediction. We recently performed such tests in the context of the Sixth Biennial CASP (Critical Assessment of Techniques for Protein Structure Prediction) experiment, and now need to continue this development and testing with applications to real biological targets. Our goal is to use these approaches, together with the immense computing power that can be harnessed through the internet from volunteers all over the world (you!), to address critical biomedical questions of protein-related diseases. Predictor@home is a pilot project of the Berkeley Open Infrastructure for Network Computing (BOINC).
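At its core, structure prediction is a search for low-energy conformations over a rugged energy landscape. A toy sketch of the kind of stochastic search many such algorithms build on (simulated annealing on a one-dimensional stand-in for a torsion-angle energy surface; this is an illustration, not Predictor@home's actual method):

```python
import math
import random

random.seed(1)

def energy(x):
    # Rugged toy "energy landscape": a quadratic well with sinusoidal
    # ripples creating local traps, standing in for a protein's
    # conformational energy surface.
    return (x - 2.0) ** 2 + 1.5 * math.sin(5.0 * x)

x = random.uniform(-4.0, 4.0)   # random starting "conformation"
temperature = 5.0
for step in range(20_000):
    candidate = x + random.gauss(0.0, 0.2)      # small conformational move
    dE = energy(candidate) - energy(x)
    # Metropolis criterion: always accept downhill moves; accept uphill
    # moves with probability exp(-dE/T), so the search can escape traps.
    if dE < 0 or random.random() < math.exp(-dE / temperature):
        x = candidate
    temperature = max(0.01, temperature * 0.9995)  # slow cooling schedule

print(f"final conformation x = {x:.2f}, energy = {energy(x):.2f}")
```

Real methods work in thousands of dimensions, which is exactly why the problem is so compute-hungry and why splitting independent search trajectories across volunteers' machines works well.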
The Large Hadron Collider :
The Large Hadron Collider (LHC) is a particle accelerator being built at CERN, the European Organization for Nuclear Research, the world’s largest particle physics laboratory. When it switches on in 2007, it will be the most powerful instrument ever built to investigate the properties of particles. The LHC will take the place of CERN’s Large Electron Positron (LEP) collider, and will sit in its 27 km long tunnel, about 100 m underground. It will accelerate 2 separate beams of protons up to an energy of 7 TeV, and then bring them into head-on collisions (hence the name “collider”). The proton-proton collision energy will then be 14 TeV. But the LHC will not be limited to the study of proton-proton collisions: it can also collide heavy ions, such as lead, with a collision energy of 1148 TeV.
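The energy figures above follow from simple arithmetic: for head-on collisions of equal-energy beams, the collision energy is the sum of the two beam energies, and for lead ions every proton-equivalent of the nucleus is bent by the same magnets, so the energy per nucleon scales as Z/A. A quick back-of-the-envelope check (my own, reproducing the document's numbers):

```python
# Head-on collisions of equal-energy beams: the collision energy is the
# sum of the two beam energies (particle masses are negligible at TeV scale).
beam_energy_tev = 7.0
print(f"p-p collision energy: {2 * beam_energy_tev:.0f} TeV")

# Heavy ions: a lead nucleus (Z = 82 protons, A = 208 nucleons) is bent by
# the same magnets as a proton beam, so its energy per nucleon is 7 TeV * Z/A.
Z, A = 82, 208
per_nucleon = beam_energy_tev * Z / A          # ~2.76 TeV per nucleon
total = 2 * A * per_nucleon                    # both beams, all nucleons
print(f"Pb-Pb collision energy: {total:.0f} TeV")
```

The lead-lead total works out to exactly 2 × 7 TeV × 82 = 1148 TeV, matching the figure quoted above.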
Before being injected into the LHC, proton beams will be prepared by CERN’s existing “accelerator complex”. This is a succession of machines of increasingly higher energy, each one injecting the beam into the next, which takes over to bring it to an even higher energy.
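The staged hand-off described above can be pictured as a short table. The machine names and approximate extraction energies below are indicative figures from outside this document, not values it states:

```python
# CERN's injector chain feeding the LHC, with approximate beam energies
# at extraction (indicative figures, in TeV; not taken from this document).
chain = [
    ("Linac2 (linear accelerator)", 0.00005),   # ~50 MeV
    ("PS Booster",                  0.0014),    # ~1.4 GeV
    ("Proton Synchrotron (PS)",     0.025),     # ~25 GeV
    ("Super Proton Synchrotron",    0.450),     # ~450 GeV
    ("LHC",                         7.0),       # 7 TeV per beam
]

# Each machine hands the beam to the next at a strictly higher energy.
for machine, energy_tev in chain:
    print(f"{machine:<30} {energy_tev * 1000:>10.2f} GeV")
```

Each stage spans roughly one to two orders of magnitude in energy, which is why a single machine cannot do the whole job.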
To bend the 7 TeV protons around the ring, the LHC dipoles must be able to produce magnetic fields of 8.36 tesla, a value made possible by the use of “superconductivity”. This is the ability of certain materials, usually at very low temperatures, to conduct electric current without resistance and power losses, and therefore produce high magnetic fields. The LHC will operate at about 300 degrees below room temperature (1.9 K – even colder than outer space!) and use the most advanced superconducting magnet and accelerator technologies ever employed. 1,296 superconducting dipoles and more than 2,500 other magnets will guide and collide the LHC beams. They range from small, normally conducting bending magnets to large, superconducting focusing quadrupoles. When completed, the accelerator will be the largest superconducting installation in the world.
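Why such a strong field? A charged particle's momentum, the bending field, and the bending radius are linked by the standard accelerator rule of thumb p [GeV/c] ≈ 0.3 · B [T] · ρ [m]. A back-of-the-envelope check (my own, using the figures quoted above) shows the numbers are self-consistent with the 27 km tunnel:

```python
import math

# Rule of thumb for magnetic bending of a relativistic charged particle:
#     p [GeV/c] ≈ 0.3 * B [tesla] * rho [metres]
p_gev = 7000.0          # 7 TeV proton momentum
B = 8.36                # dipole field quoted above, tesla

rho = p_gev / (0.3 * B)                 # bending radius, metres
bending_length = 2 * math.pi * rho      # total arc length spent bending

print(f"bending radius:      {rho:.0f} m")
print(f"total dipole length: {bending_length / 1000:.1f} km")
```

The result, roughly 17.5 km of bending, fits inside the 27 km tunnel; the remainder of the circumference holds straight sections, focusing magnets, and the experiments.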
Five experiments, with huge detectors, will study what happens when the LHC’s beams collide. They will handle as much information as the entire European telecommunications network does today!