
Thursday, November 05, 2009

Soil Moisture and Ocean Salinity (SMOS) Satellite Forms Three-pointed Star In The Sky


Following the launch of ESA's SMOS satellite on 2 November, the French space agency CNES, which is responsible for operating the satellite, has confirmed that the instrument's three antenna arms have deployed as planned, and that the instrument is in good health.


During launch and the first few orbits around Earth, the Soil Moisture and Ocean Salinity (SMOS) instrument's antenna arms remained safely folded up. Today, these three arms were folded out and now form a large three-pointed star shape. With its unusual shape, measuring eight metres across, SMOS can be dubbed a 'star in the sky'.
The SMOS instrument is called MIRAS -- short for Microwave Imaging Radiometer with Aperture Synthesis -- and is actually bigger than the satellite platform. It consists of a central hub and the three arms that have just deployed. The deployment is crucial to the success of the mission because the arms carry the key measuring devices: most of the 69 small antenna receivers, called LICEFs.
To acquire data on soil moisture and ocean salinity, each of the LICEF antenna-receivers measures radiation emitted from Earth's surface within the 'L-band', around a frequency of 1.4 GHz. This frequency provides the best sensitivity to variations in moisture in the soil and changes in the salinity of the surface waters of the oceans. In addition, this frequency is relatively unaffected by weather, the atmosphere and vegetation cover.
To achieve the spatial resolution required by the data users, the MIRAS instrument employs technology in a novel way. Under normal circumstances, measuring these two environmental variables in the L-band would only work with a huge antenna -- far too big to be carried by a satellite. To overcome this challenge, the SMOS mission has borrowed techniques used in radio astronomy.
Radio astronomers, searching for celestial objects that are not detectable in optical astronomy, faced a similar challenge: detecting faint signals from point sources in space at long wavelengths requires a very large antenna. Since signals are detected as waves, signals from different telescopes can be combined to synthesise the resolving power of a much larger telescope. The Very Large Array in New Mexico, US, does exactly this: it combines 27 radio telescopes, each 25 m in diameter, deployed on a Y-shaped track that can be extended up to 35 km.
Like the Very Large Array, the SMOS instrument also forms a Y-shape, and through a process of interferometry its 69 small antenna receivers mimic a much larger antenna.
[Image: SMOS mimics the 'Very Large Array']
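As a rough back-of-envelope illustration of why such a large synthesised aperture is needed, here is a minimal sketch in Python. The 1.4 GHz frequency and the 8 m span are from the article; the orbit altitude is an assumed, typical value for such a mission and is labelled as such in the code.

# Back-of-envelope: resolution of an 8 m aperture at L-band, seen from orbit.
C = 3.0e8           # speed of light, m/s
FREQ = 1.4e9        # L-band frequency (from the article), Hz
APERTURE = 8.0      # MIRAS span (from the article), m
ALTITUDE = 760e3    # ASSUMED orbit altitude, m (not stated in the article)

wavelength = C / FREQ                   # about 0.21 m
theta = wavelength / APERTURE           # diffraction-limited angle, ~0.027 rad
footprint_km = theta * ALTITUDE / 1e3   # ground footprint at nadir

print(f"wavelength: {wavelength:.2f} m")
print(f"nadir footprint: ~{footprint_km:.0f} km")   # roughly 20 km

Even a coarse footprint of tens of kilometres thus demands an aperture several metres across at this wavelength, which is why a single solid antenna is impractical and the mission synthesises one from many small receivers instead.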
The deployment of the SMOS arms marks another significant milestone for ESA's water mission. The satellite will now undergo a series of health checks within its six-month commissioning phase. So far, however, all the signs are good that this second of ESA's Earth Explorer satellites in orbit is fit and healthy following launch and will be able to deliver the data to advance our understanding of Earth's water cycle.
Adapted from materials provided by the European Space Agency.

Thursday, September 24, 2009

Keeping An Eye On The Oceans


In the last ten years, scientists have set up a global observing system to monitor the world's oceans. The observation system works by combining satellite observations with data from in-water recording devices such as buoys, tide gauges and an array of more than 3000 Argo robots.


Now that the initial system is up and running, scientists are meeting next week at OceanObs’09 in Venice (21-25 September) to see how they can expand the system and, perhaps most importantly, secure it for the long term.
OceanObs’09 is organized by UNESCO’s Intergovernmental Oceanographic Commission and the European Space Agency (ESA) and will be attended by EUMETSAT and over 580 participants from 36 countries.
EUMETSAT's role in ocean observations is to establish, maintain and use European systems of operational meteorological satellites; to contribute to the operational monitoring of the climate and the oceans - for instance, by monitoring sea level rise with the Jason 2 altimetry satellite; and to establish new ocean-monitoring missions, such as Jason 3.
So how does the ocean observing system operate?
In the water, recording devices such as tide gauges, mooring buoys and drifting buoys monitor aspects of the sea such as tides, water temperature and currents. Over the last 10 years, scientists have also dropped more than 3000 Argo robots into the sea, and these robots are now methodically rising and falling around the world’s oceans, recording temperature and salinity profiles and transmitting this data via satellite back to scientists every ten days. The Argo robots are also joined by pilotless ocean gliders, which bristle with recording instruments and soar and glide through the oceans - sometimes down to depths of 6 km - collecting data.
Alongside the gliders, scientists have also sporadically enlisted the help of marine animals, such as elephant seals, by attaching miniature data loggers to record the temperature, salinity and depth conditions they experience on their daily travels. And even ships and ferries are playing a part in monitoring the ocean, as boats on regular passage around the world tow plankton recorders or pipe in water to sophisticated on-board FerryBox systems, which are like mini oceanographic laboratories.
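To make the Argo measurements concrete, here is a minimal sketch of what one float's profile might look like as a data record. The schema and the numbers are hypothetical, purely for illustration; the ten-day cycle is from the article.

from dataclasses import dataclass, field
from typing import List

@dataclass
class ArgoProfile:
    """One cycle's measurements from a single float (hypothetical schema)."""
    float_id: str
    cycle: int                  # floats surface to transmit roughly every ten days
    depth_m: List[float] = field(default_factory=list)
    temperature_c: List[float] = field(default_factory=list)
    salinity_psu: List[float] = field(default_factory=list)

# A toy profile: temperature falling and salinity varying with depth.
profile = ArgoProfile(
    float_id="HYPOTHETICAL-001",
    cycle=42,
    depth_m=[0, 100, 500, 1000, 2000],
    temperature_c=[18.2, 14.1, 8.3, 4.6, 2.5],
    salinity_psu=[35.1, 35.3, 34.9, 34.7, 34.6],
)
print(f"float {profile.float_id}, cycle {profile.cycle}: "
      f"{len(profile.depth_m)} depth levels recorded")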
Space observations
All this data from the in-water samplers - the so-called in situ data - provides the detail on conditions in specific locations, but for the big picture of what is happening in the oceans, scientists rely on satellites. One of the key tools in understanding issues such as global sea level rise is the Jason 2 satellite, operated by EUMETSAT, whose onboard altimeter scans the world’s oceans, recording global sea level to the nearest centimetre. When this information is combined with information from satellite-based gravity measurements, tide gauges, Argo floats and other devices, it gives scientists the ability to monitor global sea levels precisely. Satellites are also monitoring a host of other ocean variables - from sea surface temperature to wind, ocean colour and sea ice cover.
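In outline, the bookkeeping behind an altimetry measurement is simple: the satellite's precisely determined height above a reference ellipsoid, minus the radar range down to the sea surface (after corrections), gives the sea surface height. The sketch below illustrates the idea; the function, the lumped correction term and all the numbers are illustrative placeholders, not Jason 2 processing.

def sea_surface_height(orbit_altitude_m, radar_range_m, corrections_m=0.0):
    """Sea surface height above the reference ellipsoid (illustrative).

    orbit_altitude_m: satellite height above the ellipsoid, from precise orbit determination
    radar_range_m:    altimeter-measured distance down to the sea surface
    corrections_m:    path-delay and geophysical corrections (ionosphere, troposphere,
                      tides, ...), lumped into one placeholder term here
    """
    return orbit_altitude_m - (radar_range_m + corrections_m)

# Illustrative numbers only; a Jason-class orbit is roughly 1336 km high.
ssh = sea_surface_height(1_336_000.00, 1_335_987.55, corrections_m=2.31)
print(f"sea surface height: {ssh:.2f} m above the ellipsoid")  # -> 10.14 m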
One of the most important features of any ocean observing system is that it must be a long-term system if changes are to be understood in the right context. As an example, satellite monitoring of sea levels began in 1992 with the launch of the TOPEX/Poseidon satellite, which was followed by Jason 1 (2001), Envisat (2002) and more recently Jason 2 (2008), which will be joined in 2012 by Sentinel-3, another satellite carrying altimetry equipment.
Dr Hans Bonekamp, Ocean Mission Scientist at EUMETSAT said: “The long-term datasets on sea levels that the satellite altimeters are collecting are enabling scientists to establish how sea levels have changed in the last two decades and understand the effects of global warming at regional and global levels.”
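At its simplest, extracting such a long-term trend is a straight-line fit to the time series. A toy example with synthetic data follows; the 3 mm/yr rate and the noise level are assumptions for illustration, not results from the article.

import random

# Synthetic global mean sea level anomalies: an assumed ~3 mm/yr trend plus noise.
random.seed(0)
years = [1993 + 0.5 * i for i in range(40)]                # two decades, twice-yearly
gmsl_mm = [3.0 * (t - 1993) + random.gauss(0, 4) for t in years]

# Ordinary least-squares slope: the estimated rate of sea level rise in mm/yr.
n = len(years)
mean_t = sum(years) / n
mean_y = sum(gmsl_mm) / n
slope = (sum((t - mean_t) * (y - mean_y) for t, y in zip(years, gmsl_mm))
         / sum((t - mean_t) ** 2 for t in years))
print(f"estimated trend: {slope:.2f} mm/yr")   # close to the 3.0 mm/yr built in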
Making sure the existing ocean observation system, both satellite and in situ, is sustainable in the long term is one of the key aims of OceanObs’09, where the ocean observing community will take stock of progress to date and map out the priorities for the next decade - a task that is unlikely to be easy in the current financial climate.
But the benefits that an operational ocean observing system will bring are an extremely strong justification: the system is already providing data for the Intergovernmental Panel on Climate Change assessments, and it will also provide better data for maritime security, oil spill prevention, management of marine resources, marine meteorology, seasonal and long-term weather forecasting, coastal activities, and monitoring of water quality.
Adapted from materials provided by European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT), via AlphaGalileo.

Thursday, April 16, 2009

Harnessing cloud computing for data-intensive research on oceans, galaxies


Private companies, universities and government agencies are joining forces to bring scientific research into the era of "cloud computing," the name for massive clusters of computers connected through the Internet. The University of Washington has won three recent awards from the National Science Foundation related to cloud computing. Two of the grants will fund projects examining ocean climate simulations and analyzing astronomical images. Both provide tools so researchers can use cloud computing to interact easily with the massive datasets that are becoming more and more common in science. A third grant to the UW provides curriculum and training to teach cloud computing. The projects are funded through NSF's Cluster Exploratory program, which will access a cloud datacenter established for educational use in 2007 through a partnership between Google, IBM and six academic institutions, of which the UW was the first member. NSF joined the group last year.

Climate modelers are beginning to use computer simulations in more exploratory ways, said Bill Howe, a researcher at the UW's eScience Institute, a newly established group to support data-intensive research at the university. Instead of running a simulation to test a single hypothesis, climate scientists are now running long-term simulations and then sifting through tens of thousands of gigabytes of resulting data to discover trends.

"Using current tools, you can comfortably analyze and visualize datasets that fit in the computer underneath your desk," Howe said. "But you can't comfortably and interactively explore datasets at this new scale."

Howe's project aims to provide that interactivity for tens of thousands of gigabytes of simulation results. He created a tool, GridFields, to visualize the polygonal mesh of climate simulation output, and is now working to redesign GridFields to be efficient in a cloud computing environment. Collaborators at the University of Utah have an award under the same program to extend an accompanying system that makes it easier to write and keep track of computer programs.

"We need to get smart sooner rather than later on how to design and build a system that doesn't just live out on these machines at government or company data centers, but extends the cloud right down to your computer," Howe said. Someday the tool should be easy enough that undergraduates and high-school students could sift through raw data themselves, he said.

A second grant will use cloud computing to study astronomical images. Astronomy has changed dramatically during the past decade, said Andrew Connolly, a UW associate professor of astronomy who was awarded the grant with UW research scientist Jeffrey Gardner. Scientists once competed for time on telescopes, recorded data and then studied the individual images in detail. Now telescopes continuously record high-resolution images that are available to all, providing millions of times more information.

"In the past I could have spent a couple of hours working on a single image. But now, if I have to multiply it by factors of many tens of thousands, that couple of hours each becomes something that's not feasible," Connolly said.

Companies such as Google, Microsoft, Amazon and Yahoo! have now created frameworks that make it easier to store and process information in the cloud and make the information available over the Web.
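The programming model these frameworks popularized is the map/reduce pattern: a 'map' step processes each record independently, which is what lets the work spread across thousands of machines, and a 'reduce' step merges the partial results. Here is a toy, single-machine sketch of the pattern in Python (hypothetical data, not the UW projects' actual code).

from functools import reduce

# Toy "dataset": per-chunk pixel counts standing in for image or simulation chunks.
chunks = [("img_001", 4_200_000), ("img_002", 3_900_000), ("img_003", 4_050_000)]

def map_chunk(chunk):
    """Map step: process one chunk independently (this is the parallelizable part)."""
    name, n_pixels = chunk
    return n_pixels

# Reduce step: merge the per-chunk results into a single answer.
total = reduce(lambda a, b: a + b, map(map_chunk, chunks))
print(f"total pixels processed: {total:,}")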
"We want to use these frameworks to enable science, and make it so that astronomers can come in and do the work that they need to do without needing to learn the intricacies of how to work with thousands of machines," Connolly said. His grant will prepare astronomers to deal with data coming from telescopes scheduled to come online in coming years, such as the Large Synoptic Survey Telescope, of which the UW is a founding institution. The telescope's 27-foot mirror is connected to a 3.2 billion-pixel camera that takes pictures every 15 seconds. It is expected to record more than 30,000 gigabytes of data and detect more than 100 million astronomical sources every night. "Cloud computing enables us to scale to the point where we can actually analyze that sort of data," Connolly said. The third grant funded a 3-day workshop held in Seattle last July in which computer science professors learned from UW computer science and engineering faculty and students how to teach cloud computing skills. "The rapid evolution of sensors is transforming all sciences from data-poor to data-rich," said Ed Lazowska, a UW professor of computer science and engineering who led the workshops. "The challenge is to use modern cloud computing resources, such as Amazon Web Services, and modern computer science advances, such as data mining and machine learning, to explore these massive volumes of data. This new computational science will be pervasive and will have enormous impact. UW is fortunate to be in on the ground floor." The UW is the only institution to have won three awards through NSF's new data-intensive computing programs, and it has the largest total award value of nearly $700,000.