DGFI-TUM recently published the latest in a series of empirical ocean tide (EOT) models. The new model, named EOT20, shows improved results compared to other global tide models (including our earlier model EOT11a), especially in the coastal region.
Ocean tides play a vital role in various practical applications, especially in the coastal environment. In addition, tides are of importance in geodetic data analysis, for example in improving the observation of sea surface processes from along-track satellite altimetry and in determining high-resolution gravity fields from missions such as GRACE. Although tide models have made significant progress in recent years in estimating tides from satellite altimetry, the coastal region remains a challenge due to the complexity of shorelines, poorly resolved bathymetry and land contamination of altimetry radar echoes.
EOT20 benefits from advances in coastal altimetry, particularly in the use of the ALES retracker. The EOT20 approach relies on residual tidal analysis with respect to a reference tide model (FES2014) to estimate residual signals of the ocean tides. Further developments include the incorporation of more altimetry data, improved coastline representation and triangular gridding.
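The residual tidal analysis can be illustrated with a minimal harmonic fit: residual heights (observed sea level minus the reference-model tide) are decomposed into constituent amplitudes and phases by least squares. The constituent set, sampling and synthetic data below are illustrative assumptions, not the EOT20 processing chain:

```python
import numpy as np

# Angular frequencies (rad/h) of two major tidal constituents (M2, S2)
FREQS = {"M2": 2 * np.pi / 12.4206012, "S2": 2 * np.pi / 12.0}

def fit_residual_tide(t, residual):
    """Least-squares fit of cosine/sine terms for each constituent to
    residual heights (observed minus reference-model tide)."""
    cols = []
    for w in FREQS.values():
        cols += [np.cos(w * t), np.sin(w * t)]
    A = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(A, residual, rcond=None)
    out = {}
    for i, name in enumerate(FREQS):
        c, s = coef[2 * i], coef[2 * i + 1]
        out[name] = (np.hypot(c, s), np.degrees(np.arctan2(s, c)))
    return out  # per constituent: (amplitude, phase in degrees)

# Synthetic residual: a 5 cm M2 signal with 30 deg phase, sampled hourly for 30 days
t = np.arange(0.0, 720.0, 1.0)
res = 0.05 * np.cos(FREQS["M2"] * t - np.radians(30.0))
amp, phase = fit_residual_tide(t, res)["M2"]
print(round(amp, 3), round(phase, 1))  # amplitude ≈ 0.05 m, phase ≈ 30°
```

Because the record is longer than the roughly 15 days needed to separate M2 from S2, the fit recovers the injected amplitude and phase exactly.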
The model's accuracy was evaluated using in-situ tide gauge data from DGFI-TUM's TICON dataset. For the eight major tidal constituents, EOT20 shows reduced errors in the coastal region compared to other global ocean tide models, with an error reduction of ~0.2 cm relative to the next best model (FES2014). EOT20 is on par with the best tide models in shelf regions and the open ocean, with improvement over EOT11a throughout. When used as a tidal correction for satellite altimetry, EOT20 reduced the sea level variance compared to both EOT11a and FES2014. These improvements, particularly in the coastal region, encourage the use of EOT20 as a tidal correction for satellite altimetry in sea-level research.
The ocean tide and load tide datasets of EOT20 are available in our Science Data Products section. Methodology and results are described in the publication EOT20: a global ocean tide model from multi-mission satellite altimetry (Earth System Science Data, 2021, DOI: 10.5194/essd-13-3869-2021, [PDF]).
Precise data for improved coastline protection: Led by DGFI-TUM, an international team of researchers has created the first comprehensive data sets of regional sea level rise in the North Sea and the Baltic Sea, including coastal areas and regions covered by sea ice. The data sets provide new insights into long-term and seasonal sea level changes over the past quarter century. This information is of vital importance for planning protective measures and for understanding dynamic processes in the oceans and the climate system.
Especially near coastlines, where many cities and industrial facilities are located, the quality and quantity of data collected by the satellites are compromised by strong perturbations of the radar signal. Another problem is sea ice, which covers parts of the oceans in winter and is impenetrable to radar. In the ESA Baltic Sea Level project (Baltic SEAL), the researchers developed algorithms to process the measurement data from radar satellites that permit precise and high-resolution measurements of sea level changes even in coastal areas and beneath sea ice. In this effort, the Baltic Sea serves as a model region: the complex shape of the coastline and the presence of sea ice make the data analysis particularly difficult, so analytical methods that work here can easily be adapted to other regions. Hundreds of millions of radar measurements taken between 1995 and 2019 were processed in a newly developed multi-stage procedure: first, signals from ice-covered sea water were identified in the radar reflections produced along cracks and fissures; then, new computational methods were developed to improve the quality of sea level data close to land; finally, the measurements from the various satellite missions were calibrated and combined.
The analysis of these data for the Baltic Sea shows that the sea level has risen at an annual rate of 2-3 millimeters in the south, on the German and Danish coasts, compared to 6 millimeters in the north-east, in the Bay of Bothnia. The cause of this above-average rise is strong south-westerly winds, connected to the North Atlantic Oscillation (NAO), that drive the waters to the north-east. The method has also been applied to the North Sea, where the sea level is rising by 2.6 millimeters per year, and by 3.2 millimeters per year in the German Bight.
The Baltic SEAL and North SEAL data sets of sea level changes are available for download in our Science Data Products section. Methods and results are described in the respective publications Absolute Baltic Sea Level Trends in the Satellite Altimetry Era: A Revisit (Frontiers in Marine Science, 2021, doi: 10.3389/fmars.2021.647607, [PDF]) and North SEAL: A new Dataset of Sea Level Changes in the North Sea from Satellite Altimetry (Earth System Science Data, 2021, doi: 10.5194/essd-13-3733-2021, [PDF]).
Knowledge of ocean wave heights at the coast is essential for several operational applications, ranging from coastal protection to energy exploitation. In this context, the Significant Wave Height (SWH) is one of the most general quantitative parameters that describe the sea state at a particular location. SWH, representing the average height of the highest third of the waves, can be measured from satellites using radar altimeters. Over the open ocean, such measurements are routinely used, for example, for ocean weather predictions. In the coastal zone, however, radar measurements have long not been considered reliable. As an alternative, in-situ buoys or high-resolution ocean models are employed. While the network of in-situ buoys is very sparse and can only provide data at specific locations, appropriate ocean models are computationally very expensive and not globally available, besides requiring constant validation.
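The classical SWH definition (H1/3, the mean of the highest third of individual wave heights) can be sketched with a plain list of hypothetical wave heights:

```python
import numpy as np

def significant_wave_height(wave_heights):
    """H1/3: the mean of the highest third of individual wave heights."""
    h = np.sort(np.asarray(wave_heights, dtype=float))[::-1]  # descending
    n = max(1, len(h) // 3)
    return h[:n].mean()

# Nine hypothetical wave heights (m); the highest third is 2.0, 1.5, 1.2
heights = [0.5, 1.0, 1.2, 0.8, 2.0, 1.5, 0.9, 1.1, 0.7]
swh = significant_wave_height(heights)
print(round(swh, 2))  # (2.0 + 1.5 + 1.2) / 3 ≈ 1.57
```

In altimetry practice, SWH is instead estimated from the slope of the radar waveform's leading edge, but the quantity retrieved is calibrated to this sea-state definition.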
Led by DGFI-TUM, an international team has now analyzed reprocessed data from radar altimetry, specifically tailored to improve the quality and quantity of coastal measurements. The results, published in the article Global coastal attenuation of wind-waves observed with radar altimetry (Nature Communications, 2021, doi: 10.1038/s41467-021-23982-4, [PDF]), provide a global picture of the average wave climate when going from offshore (about 30 km) to the coast (up to 3 km from land). The typical attenuation of the waves when approaching the coast, for example due to the shading effect from the land, is quantified to be about 20% of the wave height reached offshore. As a consequence, the energy flux transported by the waves is calculated to decline by about 40% on a global average. This result is paramount for coastal assessments, which until now have often been based on models validated against offshore satellite altimetry data.
Very Long Baseline Interferometry (VLBI) is a geodetic space technique that measures the difference in arrival times (delay) of extra-galactic radio signals at separate antennas across the Earth. The delay depends on the distance between each pair of antennas, the so-called “baseline”. The observed delays allow for estimating the absolute positions of the antennas in the terrestrial reference frame (TRF), the positions of the radio sources in the celestial reference frame (CRF), as well as the complete set of Earth Orientation Parameters (EOP), linking TRF and CRF.
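The basic geometric relation between baseline, source direction and delay can be sketched as follows; the baseline and source direction are hypothetical, and real VLBI analysis adds Earth rotation, atmospheric and relativistic terms:

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def geometric_delay(baseline_m, source_unit_vec):
    """First-order geometric VLBI delay (one common sign convention):
    the projection of the baseline vector onto the source direction,
    divided by c. Earth rotation, atmosphere and relativistic terms
    are neglected in this sketch."""
    return -np.dot(baseline_m, source_unit_vec) / C

# Hypothetical 6000 km baseline along x, source 45 deg above the x-y plane
b = np.array([6.0e6, 0.0, 0.0])
s = np.array([np.cos(np.radians(45.0)), 0.0, np.sin(np.radians(45.0))])
tau = geometric_delay(b, s)
print(f"{tau * 1e3:.3f} ms")  # about -14.15 ms
```

Sub-millimeter station positions thus require delays measured to a few picoseconds, which is why the modelling of small station displacements matters.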
The positions of the antennas vary during VLBI measurements, and the instantaneous displacements with respect to the long-term linear motion as provided by the TRF are caused by different geophysical effects. One such effect is the deformation of the Earth surface by non-tidal loading, driven by the redistribution of air and water masses within the atmosphere, ocean and continental hydrosphere. However, oceanic and hydrological loading are usually omitted in routine VLBI processing. In the recent study Benefits of non-tidal loading applied at distinct levels in VLBI analysis (Journal of Geodesy, 2020, doi: 10.1007/s00190-020-01418-z, [PDF]), researchers of DGFI-TUM applied all three non-tidal loading types in the analysis of VLBI sessions between 1984 and 2017 and investigated the impact on various geodetic parameters.
Loading data in terms of three-dimensional station site displacements was applied at two distinct levels of the parameter estimation process: The “observation level” represents the rigorous application, while only average site displacements are considered in the approximation at the “normal equation level”. The study revealed that each baseline is most sensitive to a different loading type (Figure). Considering all types jointly provides the best results, as the variation in estimated heights decreases to a larger extent and for more stations than with any of the single loading types. In particular, the inclusion of hydrological loading leads to a significant reduction in the annual residual signal of station heights. These effects, which improve the stability of station positions, were observed for both application levels with a similar magnitude, and hence the correction for non-tidal loading at normal equation level proved to be a suitable approximation in VLBI analysis.
Many coastal regions are exposed to sea level rise and are thus increasingly threatened by the risk of flooding during extreme events. Risk assessment and the development of appropriate adaptation measures are complex and require a reliable data basis of regional coastal sea level changes from precise observations over long time spans. But systematic coastal sea level observations are lacking along most of the world coastlines. Coastal zones are highly under-sampled by tide gauges, and altimetry data are largely defective because of land contamination of the radar signals.
Now, in the framework of the Climate Change Initiative (CCI) Sea Level project of the European Space Agency (ESA), a novel altimetry-based coastal sea level data record has been created. It consists of high-resolution (~300 m) monthly sea level data along the satellite tracks, at distances of less than 3-4 km from the coastlines in general, sometimes even closer, within 1-2 km from the coast. The data set is based on a complete reprocessing of altimetry radar observations from the Jason-1/2/3 missions and provides coastal sea level trends over 2002-2018 at 429 coastal sites located in six regions (Northeast Atlantic, Mediterranean Sea, West Africa, North Indian Ocean, Southeast Asia and Australia). DGFI-TUM is involved in the CCI Sea Level project by designing and testing improved radar signal processing techniques to exploit the radar signal in the coastal zone and to correct the measurements. The procedure and the new coastal sea level record are described in the article Coastal sea level anomalies and associated trends from Jason satellite altimetry over 2002–2018 (Nature Scientific Data, 2020, doi: 10.1038/s41597-020-00694-w, [PDF]). The data are freely available at the SEANOE repository (doi: 10.17882/74354).
Space weather and natural disaster monitoring, navigation, positioning and other applications imply an increasing need for low-latency ionosphere information. To create such information, a suitable estimator is required that makes use of observation data as soon as they are available. In this sense, the Kalman Filter (KF) is often applied in (ultra) rapid and (near) real-time applications. A drawback of the standard KF implementation is that the model uncertainties must be defined in advance, although they can vary over time. The implementation of adaptive approaches into the KF is a way to tune the stochastic model parameters during the filter run-time.
In recent years, DGFI-TUM has developed approaches for modeling the global vertical total electron content (VTEC) of the ionosphere as a series expansion in terms of localizing B-spline basis functions from unevenly distributed input data, such as the dual-frequency GNSS measurements of the IGS network. Scientists of DGFI-TUM have now taken the next step and developed adaptive methods for ultra-rapid VTEC modelling that tune the associated model uncertainties in a self-learning manner. The adaptive approach relies on the method of Variance Component Estimation (VCE) and significantly reduces the effort to set up the measurement model and the associated uncertainties for different groups of observations. To define the dynamic (prediction) model of the ionosphere target parameters, advantages of the B-spline representation are exploited. For instance, since the coefficients of the B-spline representation resemble the VTEC signal, physical interpretations can be directly deduced from the coefficients. This allows the empirical prediction model to be developed very efficiently.
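The self-tuning idea can be sketched with a scalar Kalman filter whose measurement variance is re-estimated from recent innovations during the filter run. This generic innovation-based stand-in illustrates run-time tuning of a stochastic model parameter; it is not DGFI-TUM's VCE-based B-spline implementation:

```python
import numpy as np

def adaptive_kf(z, q=1e-4, r0=1.0, window=20):
    """Scalar random-walk Kalman filter whose measurement variance r is
    re-estimated on the fly from recent innovations."""
    x, p, r = z[0], 1.0, r0
    innovations, estimates = [], []
    for zk in z[1:]:
        p += q                       # predict step (random-walk dynamics)
        nu = zk - x                  # innovation: measurement minus prediction
        innovations.append(nu)
        if len(innovations) >= window:
            # innovation variance is approximately p + r, so r ~ var(nu) - p
            r = max(np.var(innovations[-window:]) - p, 1e-6)
        k = p / (p + r)              # Kalman gain
        x += k * nu                  # update state
        p *= 1.0 - k                 # update variance
        estimates.append(x)
    return np.array(estimates), r

rng = np.random.default_rng(0)
z = 5.0 + rng.normal(0.0, 0.3, 500)   # constant truth, noise std 0.3 (var 0.09)
est, r_hat = adaptive_kf(z)
print(round(est[-1], 2), round(r_hat, 3))
```

Despite the deliberately wrong initial guess r0 = 1.0, the filter converges toward the true state and a measurement variance near the actual 0.09, which is the practical benefit of tuning the uncertainties during run-time.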
The approach has been applied to ultra-rapid VTEC modeling with a maximum latency of about 2.5 hours, using ionosphere measurements from GPS and GLONASS, and can be extended to additional GNSS constellations such as Galileo or to other measurement techniques. Details are presented in the article Adaptive Modeling of the Global Ionosphere Vertical Total Electron Content (Remote Sensing, 2020, doi:10.3390/rs12111822, [PDF]).
In the debate about the impacts of climate change, the availability and accessibility of freshwater on Earth is an extremely important topic. About 0.25% of the Earth’s freshwater is stored in lakes and reservoirs. A large fraction of these water bodies is characterized by strong storage changes, not only with the seasons but also in the long term as a consequence of, e.g., human interference or climate-related phenomena. Furthermore, the number of reported flood events relating to inland waters is steadily increasing, from about 150 in 1980 to more than 400 in recent years. Various tasks, such as water resource management, water supply or civil protection, require accurate and current information about water storage. In light of declining ground-based measurements, remote sensing techniques have become extremely relevant for the monitoring of lakes and reservoirs worldwide.
For many years, DGFI-TUM has been working on the determination of accurate water level changes from satellite altimetry, also for small inland water bodies, and provides respective time series for more than 2740 targets in its Database for Hydrological Time Series of Inland Waters (DAHITI). A newly developed approach now combines these data with areal information from remote sensing images and thus enables the determination of volume changes. For each water body, the underlying algorithm creates a fixed water level/surface area relation (so-called hypsometry) and a high-resolution bathymetry above the lowest observed water level. Details can be found in the article Volume Variations of Small Inland Water Bodies from a Combination of Satellite Altimetry and Optical Imagery (Remote Sensing, 2020, doi: 10.3390/rs12101606, [PDF]).
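Once a hypsometric relation is available, the volume above the lowest level follows from integrating the surface area over the water level. A minimal sketch, assuming a hypothetical linear hypsometry (the published algorithm fits the hypsometry and bathymetry from data):

```python
import numpy as np

def volume_change(levels, hyps_levels, hyps_areas):
    """Volume above the lowest hypsometry level, obtained by integrating
    the level -> surface area relation with the trapezoidal rule."""
    vols = []
    for l in levels:
        hh = np.linspace(hyps_levels[0], l, 50)       # height samples
        aa = np.interp(hh, hyps_levels, hyps_areas)   # interpolated area
        vols.append(np.sum((aa[:-1] + aa[1:]) / 2.0 * np.diff(hh)))
    return np.array(vols)

# Hypothetical linear hypsometry: area grows from 10 to 30 km^2
# as the water level rises from 100 m to 110 m.
hyp_levels = np.array([100.0, 110.0])   # water level, m
hyp_areas = np.array([10.0, 30.0])      # surface area, km^2
v = volume_change(np.array([100.0, 105.0, 110.0]), hyp_levels, hyp_areas)
print(v)  # volumes in km^2 * m; divide by 1000 for km^3
```

For the linear example the trapezoidal rule is exact: a rise from 100 m to 105 m adds a mean area of 15 km² over 5 m, i.e. 75 km²·m (0.075 km³).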
The procedure was applied to 28 lakes and reservoirs located in Texas, USA, with volumes between 0.1 km³ and 6.0 km³. Validation with ground data yields correlation coefficients between 0.80 and 0.99. The relative errors vary between 1.5% and 6.4%, with an average of 3.1%. All data are publicly accessible via DAHITI.
Gravity field determination is a major topic in geodesy, supporting applications ranging from Earth system science and orbit determination to the realization of physical height systems. Coarse-resolution global gravity field information from satellite observations can be combined with high-resolution gravity data from airborne, shipborne or terrestrial measurements for regional gravity refinement. In this process, regularization is in most cases inevitable, and choosing an appropriate value for the regularization parameter is a crucial issue. Variance component estimation (VCE) and the L-curve method are two frequently used procedures for choosing the regularization parameter.
VCE simultaneously determines the relative weighting between different observation types and the regularization parameter. The prior information is regarded as another observation type and is required to be stochastic. However, in most regional gravity modeling studies, a background model serves as prior information, which has no random character but is deterministic. In this case, the regularization parameter estimated by VCE can be unreliable. On the other hand, the L-curve method (or other conventional regularization methods) cannot weight heterogeneous observations.
To overcome these drawbacks, scientists of DGFI-TUM developed two ‘combined approaches’ for determining the regularization parameter when different data sets are to be combined. The two approaches combine VCE and the L-curve method in such a way that the relative weights are estimated by VCE, but the regularization parameters are determined by the L-curve method. They differ in the order of the two steps: VCE-Lc first determines the relative weights between the observation types by VCE, while Lc-VCE first determines the regularization parameter by the L-curve method. Numerical investigations show that the two proposed approaches deliver lower RMS errors with respect to the validation data than either the L-curve method or VCE alone. Details are provided in the recent article Determination of the Regularization Parameter to Combine Heterogeneous Observations in Regional Gravity Field Modeling (Remote Sensing, 2020, doi: 10.3390/rs12101617, [PDF]).
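The trade-off the L-curve method inspects can be illustrated with a plain Tikhonov problem: as the regularization parameter grows, the solution norm shrinks while the residual norm grows, and the "corner" of the curve is the usual compromise choice. The matrix, data and parameter range below are synthetic assumptions, not a gravity-field setup:

```python
import numpy as np

def tikhonov(A, y, lam):
    """Regularized least squares: minimize ||A x - y||^2 + lam * ||x||^2."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ y)

rng = np.random.default_rng(1)
A = rng.normal(size=(50, 10))
y = A @ rng.normal(size=10) + rng.normal(0.0, 0.1, size=50)

# Sample the L-curve: residual norm versus solution norm over a lambda range
lams = np.logspace(-4, 2, 13)
res = [np.linalg.norm(A @ tikhonov(A, y, l) - y) for l in lams]
sol = [np.linalg.norm(tikhonov(A, y, l)) for l in lams]
# Larger lambda shrinks the solution but degrades the data fit; the
# corner of the (log res, log sol) curve balances the two effects.
print(round(res[0], 2), round(res[-1], 2), round(sol[0], 2), round(sol[-1], 2))
```

The VCE-Lc and Lc-VCE approaches of the article add the weighting of heterogeneous observation groups on top of this parameter choice, which the plain L-curve cannot provide.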
Satellite altimetry is a key technique for the observation of the world’s oceans. Initially, it was introduced to determine the ocean surface topography and changes of the sea level by repeatedly measuring the distance between a satellite and the water surface. This distance relates to the round-trip travel time of a radio pulse emitted by the satellite and reflected by the water. But the shape of the received radar echo also makes it possible to study other relevant conditions at the ocean surface, such as significant wave height (SWH) and wind speed. Both of these quantities are related to the sea state, the knowledge of which is essential for numerous applications, e.g. ocean wave monitoring (for fishing or shipping route planning), weather forecasting, or wave climate studies. Information on the sea state is retrieved from the radar echo using an algorithmic approach called retracking.
In the framework of the European Space Agency Sea State Climate Change Initiative (SSCCI) project, a competitive exercise (round robin) has been conducted to determine the best retracking algorithm for sea state retrieval. The assessment focused on the Jason-3 and Sentinel-3A missions, representing the two main satellite altimeter technologies: the so-called "Low Resolution Mode", which encompasses over 25 years of data, and the newest "Delay-Doppler Mode", which exhibits an improved along-satellite-track resolution and signal-to-noise ratio. In total, 19 retracking algorithms from six international research groups were included in the study. Results showed that all novel retracking algorithms perform better in the majority of the metrics than the baseline algorithms currently used for operational generation of the products. According to an objective weighting scheme based on the SSCCI criteria, DGFI-TUM’s retracking algorithms WHALES (Low Resolution Mode) and WHALES-SAR (Delay-Doppler Mode) were ranked second best for all scenarios. Considering coastal scenarios only, WHALES showed the best performance among the Low Resolution Mode retracking algorithms. More details on the study are provided in the publication Round Robin Assessment of Radar Altimeter Low Resolution Mode and Delay-Doppler Retracking Algorithms for Significant Wave Height (Remote Sensing, 2020, doi: 10.3390/rs12081254, [PDF]).
Ionospheric signal delay is one of the largest error sources in GNSS (Global Navigation Satellite Systems) applications and can cause positioning errors in the order of several meters. Especially for single-frequency users, who cannot correct for ionospheric signal delay, external information about the state of the ionosphere is essential. The International GNSS Service (IGS) and its Ionosphere Associated Analysis Centers (IAAC) routinely provide this information in terms of global ionosphere maps (GIM) representing the Vertical Total Electron Content (VTEC). The GIMs are typically limited in their spatial and spectral resolution (spherical harmonic degree 15) caused by the globally inhomogeneous distribution of GNSS observations used for GIM generation. Regional GNSS networks, however, offer dense clusters of observations, which can be used to generate regional VTEC solutions with a higher spectral resolution.
Based on a two-step approach which comprises a global model and a regional densification, scientists of DGFI-TUM have developed an algorithm to generate regional VTEC maps with higher spectral resolution accounting for the finer signal structures. The algorithm is integrated in a software package to provide VTEC maps using hourly GNSS data and ultra-rapid orbits. It allows for the dissemination of VTEC maps with a latency of 2 to 3 hours.
A numerical study for March 2015 based on hourly GNSS data provided by the IGS and the EUREF network is presented in the article Global and Regional High-Resolution VTEC Modelling Using a Two-Step B-Spline Approach (Remote Sensing, 2020, doi:10.3390/rs12071198, [PDF]). Validation with independent data shows that the generated regional high-resolution VTEC maps are of comparable accuracy to the regional maps provided by the Royal Observatory of Belgium, currently the best known publicly available product for Europe. While the latter is provided with a latency of about one week, DGFI-TUM’s VTEC maps are characterized by low latency, allowing for high-precision positioning and navigation.
Low Earth orbiting satellites are strongly influenced by perturbing accelerations. For the precise orbit determination (POD) of a non-spherical satellite, accurate modelling of the satellite-body attitude and solar panel orientation is important as the acceleration is directly related to the satellite’s effective cross-sectional area. Moreover, the positions of tracking instruments mounted on the satellite are affected by its attitude.
The altimeter satellites Jason-1/-2/-3 have been providing continuous precise monitoring data of the sea level since December 2001. They have a complex shape, comprising the main spacecraft body on which solar panels and numerous measurement and positioning instruments are mounted. The attitude in space is commonly modelled using a so-called nominal yaw steering model. Scientists of DGFI-TUM have now implemented an observation-based algorithm in which extensively preprocessed quaternions of the satellite body orientation (measured by star tracking cameras) and rotation angles of the solar arrays are used to derive improved attitude information. Processing steps comprise a detailed data analysis, outlier elimination, temporal resampling of the observation data and the optimal interpolation of missing values. The procedure is described in the recent publication Observation-based attitude realization for accurate Jason satellite orbits and its impact on geodetic and altimetry results (Remote Sensing, 2020, doi: 10.3390/rs12040682, [PDF]).
The study investigates the benefit of using preprocessed observation-based attitude in contrast to a nominal yaw steering model for the POD of Jason satellites by Satellite Laser Ranging (SLR) over more than two decades. Results show that the new algorithm improves the root mean square (RMS) of SLR observation residuals by 5.9%, 8.3% and 4.5% for Jason-1, Jason-2 and Jason-3, respectively, compared to the nominal attitude realization. The analysis of single-satellite crossover differences revealed a reduction of the mean of absolute differences by 6%, 15%, and 16%. Furthermore, it could be shown that artificial (non-geophysical) signals in station coordinate time series due to orbit modelling deficiencies are significantly reduced. The observation-based attitude data of Jason-1/-2/-3 can be made available on request.
Deeper knowledge about geostrophic ocean surface currents in the northern Nordic Seas supports the understanding of ocean dynamics in this region characterized by rapidly changing environmental conditions. Monitoring the sea-ice-affected area by satellite altimetry results in fragmented and irregularly distributed data sampling and prevents the creation of homogeneous and highly resolved spatio-temporal datasets. In order to overcome this problem, an ocean model can be used to fill in data where altimetry observations are missing.
The joint study Geostrophic currents in the northern Nordic Seas from a combination of multi-mission satellite altimetry and ocean modeling (Earth System Science Data, 2019, doi: 10.5194/essd-11-1765-2019, [PDF]) of DGFI-TUM and the Alfred Wegener Institute (AWI) resulted in a novel dataset of geostrophic currents based on a combination of along-track satellite altimetry data and simulated differential water heights from the Finite Element Sea ice Ocean Model (FESOM). The combination approach is based on principal component analysis (PCA) and links the temporal variability of along-track ocean topography from satellite altimetry with the most dominant spatial patterns of FESOM differential water heights. Annual variability and constant offsets were removed from both datasets before the combination; afterwards, the altimetry-derived signals were added back to the combined dataset. Surface currents were computed by applying the geostrophic flow equations to the combined topography. The final product is characterized by the spatial resolution of the ocean model (around 1 km) and the temporal variability of the along-track dynamic ocean topography (DOT) heights derived from altimetry. Comparisons to in-situ surface drifter observations demonstrate good agreement in spatial patterns, magnitude and flow direction. Mean differences of 0.004 m/s in the zonal and 0.02 m/s in the meridional component are observed. A direct pointwise comparison between the combined geostrophic velocity components interpolated to drifter locations indicates that about 94% of all residuals are smaller than 0.15 m/s.
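The geostrophic flow equations relate surface currents to the slope of the ocean topography. A minimal sketch on a synthetic grid (the published dataset applies this to the combined altimetry/FESOM topography instead):

```python
import numpy as np

G = 9.81               # gravitational acceleration, m/s^2
OMEGA = 7.292115e-5    # Earth rotation rate, rad/s

def geostrophic_velocity(eta, dx, dy, lat_deg):
    """Surface geostrophic currents from gridded ocean topography eta (m):
    u = -(g/f) * d(eta)/dy,  v = (g/f) * d(eta)/dx,
    with f = 2 * Omega * sin(latitude) the Coriolis parameter."""
    f = 2.0 * OMEGA * np.sin(np.radians(lat_deg))
    deta_dy, deta_dx = np.gradient(eta, dy, dx)   # axis 0 = y, axis 1 = x
    return -G / f * deta_dy, G / f * deta_dx

# Hypothetical topography at 75 N rising 1 cm per 100 km toward the north
dx = dy = 10_000.0                                 # 10 km grid spacing, m
eta = 1e-7 * np.arange(5)[:, None] * dy * np.ones((1, 5))
u, v = geostrophic_velocity(eta, dx, dy, 75.0)
print(round(float(u[2, 2]), 4), round(float(v[2, 2]), 4))
```

A northward tilt of only 1 cm per 100 km already corresponds to a westward current of roughly 0.7 cm/s at this latitude, which illustrates why the topography must be known very precisely.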
The dataset provides surface circulation information within the sea ice area and will support a deeper comprehension of ocean currents in the northern Nordic Seas between 1995 and 2012. Data are available at https://doi.org/10.1594/PANGAEA.900691.
Ionospheric disturbances are the main error source in single-frequency precise point positioning. The International GNSS Service (IGS) and its Ionosphere Associated Analysis Centers (IAAC) routinely provide maps of the Vertical Total Electron Content (VTEC) of the ionosphere to correct for ionospheric influences. Since these maps are based on post-processed observations and final orbits, they are usually disseminated to the user with latencies of days to weeks. Precise dual- and multi-frequency GNSS applications, however, such as autonomous driving or precision farming, require high-resolution ionosphere products in (near) real-time (NRT).
To meet this requirement, scientists of DGFI-TUM developed a new procedure to create NRT high-resolution information about the state of the ionosphere based on raw observation data and ultra-rapid orbits. To additionally account for the inhomogeneous distribution of the GNSS data, DGFI-TUM’s modelling approach is based on localizing basis functions (polynomial and trigonometric B-splines). In this way, in contrast to the aforementioned IAAC products, data gaps can be handled appropriately, and a multi-scale representation (MSR) allows for generating VTEC maps of higher or lower spectral resolution by applying a Kalman filter estimation procedure and the pyramid algorithm known from wavelet decomposition.
The realization of the MSR and the generation of the VTEC maps with different temporal and spectral resolutions (Figure) are presented in the article High-resolution vertical total electron content maps based on multi-scale B-spline representations (Annales Geophysicae, 2019, doi: 10.5194/angeo-37-699-2019, [PDF]). The latency of DGFI-TUM’s high-resolution product amounts to only 2-3 hours. A validation against the most prominent final and rapid products from the IAACs in Berne (CODE) and Barcelona (UPC) revealed a quality improvement of a few TECU, where a variation of 1 TECU [=10^16 electrons/m^2] corresponds to an error of around 16 centimeters in the distance between a GNSS satellite and a receiver on the Earth’s surface.
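The quoted conversion between TECU and range error follows from the first-order ionospheric delay formula, sketched here for the GPS L1 frequency:

```python
TECU = 1e16  # 1 TEC unit in electrons per m^2

def iono_delay_m(tec_in_tecu, freq_hz):
    """First-order ionospheric range delay in meters: 40.3 * TEC / f^2,
    with TEC in electrons/m^2 and the frequency f in Hz."""
    return 40.3 * tec_in_tecu * TECU / freq_hz ** 2

delay = iono_delay_m(1.0, 1.57542e9)  # 1 TECU at the GPS L1 frequency
print(round(delay, 3))  # ~0.162 m, i.e. about 16 cm per TECU
```

The frequency dependence of this delay is also why dual-frequency receivers can eliminate most of the ionospheric error, while single-frequency users depend on external VTEC maps.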
Satellite Laser Ranging (SLR) is one of the four fundamental geodetic space techniques for the accurate determination of geodetic key parameters related to Terrestrial Reference Frames (TRFs), the Earth’s rotation and its gravity field. Those parameters provide the basis for precise geodetic reference frames and thus for geo-referencing and quantifying geodynamic processes and the effects of global change. SLR is a technique based on the measurement of the two-way travel time of laser pulses between stations on the Earth’s surface and satellites in various orbits. These measurements contribute, inter alia, to the determination of satellite orbits, the study of tectonic processes, and the realization of the Coordinated Universal Time (UTC) that is linked to variations of the Earth’s rotation.
However, the inhomogeneity of the current global SLR station network is one of the major obstacles to the determination of the geodetic parameters of interest under the ambitious accuracy requirements on the millimeter level of IAG’s Global Geodetic Observing System (GGOS). But the installation of an additional SLR station is not only a geographical but also a financial decision and thus requires detailed pre-investigation of its potential benefit.
A recent study at DGFI-TUM aimed to determine the locations where new SLR stations would be most valuable. In a simulation, the existing SLR network was augmented by one additional SLR station, and the required geodetic parameters were estimated. This simulation was performed repeatedly, whereby the additional station was placed at 42 different locations distributed homogeneously over the globe. The results showed that priority should be given to an additional station in the Antarctic. It would significantly improve the observation geometry and thus the quality of orbits, TRF and Earth rotation parameters. The latter would also benefit from an additional station in the vicinity of the equator, and the datum realization of TRFs would improve with additional stations in the Atlantic and Pacific Ocean regions. Details and results are provided in the open access article Future TRFs and GGOS - where to put the next SLR station? (Advances in Geosciences, 2019, doi: 10.5194/adgeo-50-17-2019, [PDF]).
Tracking down climate change with radar eyes: Over the past 22 years, the sea level in the Arctic Ocean has risen by an average of 2.2 millimeters per year. This is the conclusion of an investigation performed jointly by DTU Space and DGFI-TUM as part of ESA's Sea Level Climate Change Initiative (CCI) project.
The most complete and precise overview of the sea level changes in the Arctic Ocean to date was obtained after evaluating 1.5 billion radar measurements of various altimetry satellites. A major challenge for a comprehensive analysis is the presence of sea ice, which covers vast areas of the Arctic Ocean and obscures the ocean surface underneath. By applying DGFI-TUM's dedicated retracking algorithm ALES+ to the original measurements of ENVISAT and ERS-2, radar echoes reflected even from small water openings in the ice could be identified and analysed. After harmonizing observation data from ice-covered and open water areas, maps of monthly sea level elevations were computed for 1996-2018.
Analysis of the long-term measurements revealed significant regional differences in sea level trends: Within the Beaufort Gyre north of Greenland, Canada and Alaska, the water level rose twice as fast as the average. Low-salinity meltwater collects here, while a steady east wind produces currents that prevent the meltwater from mixing with other ocean currents. Along the coast of Greenland, on the other hand, the sea level is falling, on the west coast by more than 5 mm per year. Here, the melting glaciers weaken the gravitational attraction. More information about the study can be found in the open access article Arctic Ocean Sea Level Record from the Complete Radar Altimetry Era: 1991–2018 (Remote Sensing, 2019, DOI: 10.3390/rs11141672, [PDF]). The results are also subject of a current TUM press release (English, German).
Continuous monitoring of lakes, rivers and reservoirs is of great importance for various hydrological, societal and economic questions. In many regions, hydrological parameters of inland water bodies, such as water level, surface extent, volume and discharge, and their temporal changes are directly related to security-relevant aspects. Among those are water supply, water management and the protection of population and infrastructure, in particular in view of global change, where population growth and more frequent extreme weather situations pose various challenges. The availability of up-to-date observation data is essential for the measurement of flooded regions and for the quantification of water storage in lakes and reservoirs.
Scientists of DGFI-TUM developed a new approach for the automated extraction of high-resolution time-variable water surface masks of lakes and reservoirs from optical satellite images, using observations from Landsat and Sentinel-2 between 1984 and 2018. The algorithm first extracts land-water masks from the images by combining five different water indices and applying an automated threshold computation. These monthly masks are used to compute a long-term water probability mask, which is then applied to fill data gaps caused by voids, clouds, cloud shadows or snow. Iteratively, all data gaps in all monthly masks are filled, which leads to a gap-free surface area time series. The resulting surface area changes were compared with water level time series from gauging stations, or, if in-situ data was not available, water level time series from satellite altimetry. Overall, 32 globally distributed lakes and reservoirs of different extents up to 2480 km² were investigated. Filling of the data gaps improved the average correlation coefficients between the time series of surface area and water levels from 0.61 to 0.86, an improvement of 41%. A cross-validation showed RMS errors smaller than 40 km² for all study areas, corresponding to relative errors below 8%. This demonstrates the strong impact of a reliable gap-filling approach and the quality enhancement of the land-water masks resulting from the new algorithm. Details are provided in the publication Automated Extraction of Consistent Time-Variable Water Surfaces of Lakes and Reservoirs Based on Landsat and Sentinel-2 (Remote Sensing, 2019, DOI: 10.3390/rs11091010, [PDF]). All presented surface area time series are freely available via DGFI-TUM’s Database of Hydrological Time Series of Inland Waters (DAHITI).
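The gap-filling step can be sketched with a simplified, non-iterative NumPy version. The published algorithm combines five water indices, computes thresholds automatically and fills the gaps iteratively; the single pass over a long-term water probability below is only an illustration of the core idea:

```python
import numpy as np

def fill_gaps(monthly_masks):
    """Fill data gaps in monthly land-water masks with a long-term
    water probability (single pass; the published algorithm iterates).

    monthly_masks: array (n_months, rows, cols) with 1 = water,
    0 = land, NaN = data gap (cloud, cloud shadow, snow, void)."""
    masks = np.asarray(monthly_masks, dtype=float)
    # long-term probability of water at each pixel, ignoring gaps
    probability = np.nanmean(masks, axis=0)
    filled = masks.copy()
    gaps = np.isnan(filled)
    # label a gap pixel as water if it was water most of the time
    filled[gaps] = np.broadcast_to(probability >= 0.5, masks.shape)[gaps]
    return filled
```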
The fundamental role of reference systems in space geodesy was already expressed 30 years ago. In a famous article, Kovalevsky stated: "The Earth, its environment and the celestial bodies in the universe are not static: they move, rotate and undergo deformations. Motions and positions are not absolute concepts and can be described only with respect to some reference."
In the last decades, the measurement accuracy of modern space geodetic techniques has improved by orders of magnitude. This development requires that today's realizations of reference systems be accurate at the millimeter level on a global scale. Such accuracy is required for precise positioning and navigation on Earth and in space, and it is a prerequisite for the long-term monitoring and quantification of dynamic processes in the Earth system, such as seismic deformations, post-glacial uplift, or global and regional sea level changes.
Triggered by the need for a precise reference on Earth and in space, the DFG research unit “Space-Time Reference Systems for Monitoring Global Change and for Precise Navigation in Space” (FOR1503) aims at developing integrative methods and procedures for a consistent definition and realization of geodetic reference systems, and at accomplishing the computations for their establishment and maintenance. This international consortium of scientists has now published a 10-article special issue of the Journal of Geodesy. Scientists from DGFI-TUM were involved in two of the published studies:
The article Consistent estimation of geodetic parameters from SLR satellite constellation measurements (Journal of Geodesy, 2018, DOI: 10.1007/s00190-018-1166-7) describes the scientific exploitation of satellite laser ranging (SLR) observations to up to 11 satellites with various altitudes and orbit inclinations. The observations are used for the joint estimation of reference frame parameters, satellite orbits, gravity field changes and variations of Earth rotation. In contrast to the standard 4-satellite constellation currently used by the International Laser Ranging Service (ILRS), the extended constellation allows for a significant reduction of correlations between the respective parameters and thus for a higher precision.
One of the key goals of the research unit, namely the first simultaneous and consistent realization of the global Terrestrial Reference Frame (TRF), the Celestial Reference Frame (CRF) and the Earth orientation parameters (EOP) from the space-geodetic observing techniques VLBI, SLR, GNSS and DORIS, is described in the article Consistent realization of Celestial and Terrestrial Reference Frames (Journal of Geodesy, 2018, DOI: 10.1007/s00190-018-1130-6, [PDF]). For the first time, this study realized the IUGG Resolution R3 (2011), which urges that the highest consistency between TRF, CRF and EOP should be a primary goal of all future realizations.
The 4,300 kilometer Mekong River is a lifeline for South-East Asia. If this mighty river system bursts its banks, flooding can affect the lives and livelihoods of millions of people. Permanent monitoring of the river's water stage is thus essential. Using the example of the Mekong River with its pronounced changes in water level, an innovative method to monitor complex river basins solely based on satellite data has been developed in collaboration between TUM scientists from geodesy and mathematics. The approach allows for modelling how water levels on various sections of the river are affected by extreme weather events such as heavy rainfall or drought over extended periods.
The approach uses measurement data collected from various altimetry satellite missions. In a first step, their raw observations are analysed by applying specially developed retracking algorithms in order to create precise time series of water levels for the crossing points of the satellites' tracks with the river. Altimetry satellites on repetitive orbits usually pass over the same points on a repeating cycle of 10 to 35 days. As a result, water level data are captured for each of these points at regular intervals. But the study also integrates observations collected by Cryosat-2, a SAR altimetry satellite on a long-repeat orbit. The SAR altimetry method is superior to conventional systems in terms of accuracy, and the long-repeat orbit results in a very dense spatial resolution of the observations. But at the same time, the temporal resolution of Cryosat-2 data is very low. Thus, the points observed by Cryosat-2 are well distributed throughout the entire river system, but each of the points is only measured once or twice.
The flow patterns of the river, with its complex network of tributaries, are modelled using the statistical method known as universal kriging. The model allows for linking satellite data from different altimetry missions, including Cryosat-2, and makes it possible to extrapolate water levels observed at certain points to determine the levels at almost any location in the entire river system. It was demonstrated that including the precise and densely distributed SAR measurements in the model greatly improved the quality of the results. Details on the study including data processing and results are presented in the publication Observing water level extremes in the Mekong River Basin: The benefit of long-repeat orbit missions in a multi-mission satellite altimetry approach (Journal of Hydrology, 2019, DOI: 10.1016/j.jhydrol.2018.12.041). The study is also subject of a current TUM press release (English, German).
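The interpolation idea behind the model can be illustrated with a minimal ordinary-kriging sketch in one dimension. The published approach is universal kriging along the river network, which additionally estimates a trend and accounts for remote relationships between catchments; the linear variogram below is purely illustrative:

```python
import numpy as np

def ordinary_kriging(x_obs, y_obs, x_new, variogram):
    """Ordinary-kriging prediction at x_new from observations (x_obs, y_obs).

    variogram: callable gamma(h) giving the semivariance at distance h."""
    n = len(x_obs)
    # kriging system: semivariances between observations plus the
    # unbiasedness constraint (weights sum to one)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = variogram(np.abs(x_obs[:, None] - x_obs[None, :]))
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = variogram(np.abs(x_obs - x_new))
    weights = np.linalg.solve(A, b)[:n]
    return weights @ y_obs
```

At an observation point the estimator reproduces the observed value exactly; between points it returns a weighted average whose weights follow from the variogram.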
The Strait of Gibraltar is the only gateway between the Mediterranean Sea and the Atlantic Ocean. Both seas have very different characteristics in terms of temperature, salinity and nutrients. Water exchange that takes place here may have consequences much farther away, for example contributing to the high salinity of the Nordic Seas, a key area for deep water formation. Due to the prominent role of the ocean as a climate regulator, understanding the dynamics of this water exchange is essential to understand the climate in the Mediterranean as well as important features of the global ocean circulation.
In the recent study Wind-induced cross-strait sea level variability in the Strait of Gibraltar from coastal altimetry and in-situ measurements (Remote Sensing of Environment, 2018, DOI: 10.1016/j.rse.2018.11.042), a group of scientists led by J. Gómez-Enri from the University of Cadiz has used satellite altimetry and model data to monitor the water level from the two sides of the strait. Differences between the two sides are a way to monitor the surface water flow out and into the Mediterranean Sea. DGFI-TUM contributed to this work by recomputing a mean sea state using several years of dedicated reprocessed coastal data from satellite altimetry. This data allowed linking the sense of the surface currents in the strait to the wind regime. It is observed that specific wind events are able to reverse the mean circulation (which normally drives surface waters out of the Mediterranean) and therefore weaken the net Atlantic water inflow toward the Mediterranean Sea.
The study is a prime example of the value of innovative coastal sea level data from satellites to improve the knowledge of ocean dynamics in areas where previously only sparse in-situ data could offer a localised view. It is also an example of the need for coastal oceanography to evolve as a synergy of different remote sensing, model and in-situ data.
Precise information on sea level changes is required for various questions of societal, economic and scientific relevance. DGFI-TUM researchers have produced a new correction to sea level data measured from space which improves their precision by about 30%.
For more than two decades, scientists have been using radar antennas (altimeters) installed on satellites orbiting the Earth. The altimeters send a pulse of energy that expands onto an area of the ocean surface that can be several kilometers wide. The illuminated area is influenced by wind and waves (the so-called sea state), which interact with the radar signal sent by the satellite. Moreover, the algorithms used to analyse the reflection from the ocean surface estimate sea level, waves and wind at the same time, causing interconnections between the estimation errors. This generates the need for a correction to the measured sea level, called the sea state bias.
A recent study demonstrates a strong improvement in the precision of sea level data by estimating a new sea state bias correction. The innovative approach applies a data correction to measurements recorded every 300 m ("high-frequency" measurements), while in the past this correction was generated for 7 km averages ("low-frequency"). The study focused on two oceanic regions: the North Sea and the Mediterranean Sea. In order to verify the new correction, sea level measurements from nearby locations were compared against each other. It was shown that the noise of the high-frequency measurements is reduced by 30%: While sea level anomalies (i.e. deviations of the sea level from a mean state, typically up to 2 m) could previously be measured with a precision of 8 cm, this can now be done with a precision of less than 6 cm. This is particularly valuable for oceanographic applications, such as the study of ocean currents, whose determination is strongly affected by the noise in the data provided by satellites.
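The comparison of nearby measurements amounts to a noise estimate derived from differences of neighbouring values. A minimal sketch, under the assumption that the true sea level varies slowly between adjacent along-track points so that the differences are dominated by noise:

```python
import numpy as np

def noise_level(sla):
    """Estimate the measurement noise of a sea level anomaly profile
    from first differences of neighbouring values (assumes the true
    signal varies slowly compared to the sampling)."""
    diffs = np.diff(np.asarray(sla, dtype=float))
    # a difference of two independent noisy values has variance 2*sigma^2
    return diffs.std(ddof=1) / np.sqrt(2.0)
```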
The study was partially funded by the European Space Agency’s Sea Level Climate Change Initiative and is presented in the publication Improving the precision of sea level data from satellite altimetry with high-frequency and regional sea state bias corrections (Remote Sensing of Environment, 2018, DOI: 10.1016/j.rse.2018.09.007).
The Alps are on the go: The mountain range drifts northwards an average of one-half millimeter every year and rises 1.8 millimeters. This is the result of a recent study for which scientists of DGFI-TUM analyzed 12 years of measurements from more than 300 GPS stations distributed over the entire chain of the Alps and its foreland. The scientists identified the positions of the GPS stations, accurate down to fractions of a millimeter. A large number of the stations were set up in the EU project ALPS-GPSQUAKENET and are in part operated by DGFI-TUM itself.
The greatest challenge was the homogeneous processing of half a million observations. The measurements are impaired by several interference factors, e.g. atmospheric signal delays, that have to be detected and corrected. The study used the corrected measured values to create a computer model that illustrates horizontal and vertical shifts as well as lateral spreading and compression over the entire Alpine region at a resolution of 25 kilometers.
The model visibly depicts both large-scale movement patterns and regional particularities: For example, each year the Alps grow an average of 1.8 millimeters in height and move to the northeast at a speed of up to 1.3 millimeters. In South and East Tyrol, however, a rotation towards the east is superimposed on this movement, while at the same time the mountain range is being compressed. And the rise in height is not identical everywhere either: Very small in the southern part of the western Alps, it reaches its maximum, with a speed of more than 2 millimeters per year, in the central Alps at the boundaries of Austria, Switzerland and Italy. These changes in the surface of the earth serve as the basis for inferences regarding underground plate tectonics. The research was conducted in collaboration with the Geodesy and Glaciology project of the Bavarian Academy of Sciences and Humanities. Details on the data processing and the results of the study are presented in the open-access publication Present-day surface deformation of the Alpine Region inferred from geodetic techniques (Earth System Science Data, 2018, DOI: 10.5194/essd-10-1503-2018). The study is also subject of a current TUM press release (English, German).
The precise knowledge of the density of the Earth’s thermosphere is relevant for satellite mission planning, precise orbit determination (POD), re-entry predictions, and collision avoidance for Low Earth Orbiting (LEO) satellites. Empirical thermosphere models have been derived since the beginning of the space era from observations, e.g., from mass spectrometers and later from accelerometer data of CHAMP and GRACE.
Scientists of DGFI-TUM developed a new approach for the estimation of the thermospheric density using satellite laser ranging (SLR) observations to spherical LEO satellites in combination with a full POD. The approach is based on a detailed analysis of the thermospheric drag. The drag coefficient is computed analytically using a gas-surface interaction model. In a case study, the derived procedure was applied to the spherical satellite ANDE-P at a mean altitude of around 350 km. From the analysis of the SLR observations to ANDE-P between August 16 and October 3, 2009, the scientists derived time series of estimated scale factors for the thermospheric density provided by four empirical models (see Figure). The results show that all models overestimate the true thermospheric density along the ANDE-P trajectory during the processed period. Furthermore, the study revealed partly high correlations between the thermospheric scale factors and the orbital elements of the satellite’s motion. The approach and detailed results are described in the paper Towards thermospheric density estimation from SLR observations of LEO satellites: a case study with ANDE-Pollux satellite (Journal of Geodesy, 2018, DOI: 10.1007/s00190-018-1165-8).
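In the study the scale factors are estimated within a full precise orbit determination; as a stand-alone illustration, the core idea reduces to a least-squares fit of a factor f such that f times the model density best explains the observed drag accelerations. All parameter values below are illustrative:

```python
import numpy as np

def density_scale_factor(a_observed, rho_model, v, cd, area_to_mass):
    """Least-squares factor f so that f * rho_model best explains the
    observed along-track drag accelerations a_observed.

    Drag model (magnitudes only): a = 0.5 * cd * (A/m) * rho * v**2."""
    a_model = 0.5 * cd * area_to_mass * rho_model * v ** 2
    return float(a_observed @ a_model) / float(a_model @ a_model)
```

A factor below 1 indicates that the empirical model overestimates the true density, as was found for all four models along the ANDE-P trajectory.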
The ability to predict tidal elevations in coastal areas is of crucial importance for society. In certain regions, tidal events combined with extreme meteorological conditions are responsible for severe flooding and consequent environmental issues. Precise ocean tide information is also required to correct satellite observations, e.g. for the accurate determination of sea level changes or gravity field variations. The accuracy of ocean tide models has largely improved over the last decades as a result of the enormous amount of globally distributed sea level measurements from 25 years of satellite altimetry and enhanced data analysis capabilities.
Difficulties still remain in coastal areas where complex tidal conditions exist. Satellite measurements close to the coast are reduced in quantity and of poorer quality due to contamination of the radar echoes by land. Now, scientists of DGFI-TUM have shown how a dedicated coastal processing of altimetry observations can help to improve tidal modelling. By reprocessing the satellite altimetry observations with the ALES algorithm, the precision and accuracy of coastal sea level measurements are significantly improved. A comparison with globally distributed in-situ tide gauge measurements reveals a substantial reduction of errors at distances closer than 20 km from land when using ALES data to compute tidal constituents. Moreover, the absolute differences at single locations feature improvements larger than 2 cm, with an average impact of over 10% for individual tidal constituents. The entire study is described in the open-access publication Coastal Improvements for Tide Models: The Impact of ALES Retracker (Remote Sensing, 2018, DOI: 10.3390/rs10050700, [PDF]).
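The tidal constituents compared in such analyses are typically obtained by a harmonic least-squares fit to a sea level time series. A minimal sketch (operational tidal analysis additionally handles nodal corrections and many more constituents; only the M2 period used in the test is an astronomical constant):

```python
import numpy as np

def tidal_constituents(t, h, frequencies):
    """Least-squares estimate of amplitude and phase lag of tidal
    constituents with known angular frequencies.

    t : observation epochs (hours), h : sea level heights,
    frequencies : angular frequencies in rad/hour."""
    columns = [np.ones_like(t)]                     # mean sea level
    for w in frequencies:
        columns += [np.cos(w * t), np.sin(w * t)]   # one cos/sin pair each
    A = np.column_stack(columns)
    x, *_ = np.linalg.lstsq(A, h, rcond=None)
    result = []
    for k in range(len(frequencies)):
        c, s = x[1 + 2 * k], x[2 + 2 * k]
        result.append((np.hypot(c, s), np.arctan2(s, c)))  # amplitude, phase
    return result
```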
An improved and homogeneous altimeter sea level record has been derived within the second phase of the Sea Level project of the European Space Agency’s Climate Change Initiative. The record covers a 23-year long time span (1993-2015) and is based on the data from nine satellite altimeter missions. It includes the monthly gridded time series of multi-mission merged sea level anomalies at a 0.25° spatial resolution and derived products suitable for climate studies: global mean sea level time series, regional mean sea level trends (see Figure), and maps of the amplitude and phase of the annual and semi-annual signals of the sea level.
The use of improved geophysical corrections and improved orbit solutions, careful bias reduction between missions and inclusion of two new altimeter missions (SARAL/AltiKa and CryoSat-2) improved the sea level products and reduced their uncertainties on different spatial and temporal scales. The derived global mean sea level trend is 3.3 mm/a over the entire time span, the regional sea level trends range between -5 and +10 mm/a.
Scientists of DGFI-TUM contributed to the development of this new sea level record, in particular, by the analysis of new orbit solutions and the assessment of the impact of orbit improvements on regional sea level results. Processing strategy, results, quality assessment and validation of the new data set are described in the open-access publication An improved and homogeneous altimeter sea level record from the ESA Climate Change Initiative (Earth System Science Data, 2018, DOI:10.5194/essd-10-281-2018, [PDF]).
CryoSat-2, launched in 2010, is the first altimetry satellite partly operating in synthetic-aperture radar (SAR) mode. Compared to conventional radar altimeters, the SAR mode enables altimetry measurements at increased along-track resolution and with a smaller footprint. This opens new possibilities for the determination of water levels also of smaller inland water bodies that could not reliably be observed by the classical altimetry satellites. With its repeat time of 369 days, CryoSat-2 does not provide a high temporal resolution, but on the other hand this long-repeat orbit configuration results in a very dense spatial sampling of the water level, e.g. along a river.
The first step in the calculation of the water levels from CryoSat-2 data is the classification of the recorded radar returns into signals from water and land surfaces. Usually the classification is based on a predefined land-water mask. Such masks are often invariant with respect to time, i.e. they neither account for seasonal variations of the water extent nor for inter-annually shifting river banks. The determination of dynamic land-water masks (e.g. from remote sensing images) is difficult, in particular in regions with frequent cloud coverage, and usually satellite images are not available simultaneously with the altimetry measurements.
By example of the Mekong river basin, scientists of DGFI-TUM and DTU Space (Denmark) developed and validated a new method for the identification of water returns in CryoSat-2 data that is independent from a land-water mask and relies solely on features derived from the SAR and range-integrated power (RIP) waveforms. The new approach has proven its effectiveness especially in the upstream region of the basin that is characterized by small- and medium-sized rivers. Compared to an approach based on a land-water mask, the new method significantly increased the number of valid measurements and their precision (1700 with 2% outliers vs. 1500 with 7% outliers). The approach and results are described in the open-access publication River Levels Derived with CryoSat-2 SAR Data Classification - A Case Study in the Mekong River Basin (Remote Sensing, 2017, DOI:10.3390/rs9121238, [PDF]).
Consistent and accurate global terrestrial reference frames (TRFs, i.e., realizations of coordinate systems attached to the Earth) are a fundamental prerequisite for navigation, positioning, and for Earth system science, as they allow referencing the smallest changes of our planet in space and time. As Earth system research relies increasingly on satellite-based Earth observation data, the precise orientation of the Earth with respect to inertial space is also required to relate the satellite observations to coordinates in the Earth reference frame.
Satellite Laser Ranging (SLR) is among the most important space geodetic techniques contributing to the determination of TRFs. SLR relies on the two-way travel time of laser pulses from stations on the Earth’s surface to satellites equipped with retro-reflectors. The SLR measurement principle allows for the realization of origin (Earth’s center of mass) and scale of global networks with very high accuracy. One of the major factors limiting the present accuracy of SLR-derived TRFs and Earth orientation parameters (EOP), however, is the currently unbalanced global SLR station distribution with a lack of stations particularly in the southern hemisphere.
The benefit of a potential future development of the SLR network for the accuracy of the TRF (including origin and scale) and EOP has been investigated in the study Future global SLR network evolution and its impact on the terrestrial reference frame (Journal of Geodesy, 2017, DOI: 10.1007/s00190-017-1083-1). It compares the current network with a simulated SLR network that includes potential future stations improving the network geometry. The study highlights that an improved SLR network geometry is a cornerstone for meeting the ambitious accuracy requirements for reference frames defined by IAG’s Global Geodetic Observing System (GGOS).
The Arctic and Antarctic oceans are located in areas that are experiencing new conditions due to climate change (higher atmospheric temperatures, melting of ice sheets). Nevertheless, it is difficult to understand the changes in sea level, due to the fact that large areas are seasonally or permanently covered by sea ice. Scientists at DGFI-TUM have developed a new technique to spot the leads, i.e. openings in sea ice that uncover the sea surface, by analysing the data of ESA's successful Cryosat-2 mission.
The radar altimeter on board sends electromagnetic waves and collects the reflections from the ocean surface at different incidence angles. But the retrieval of meaningful sea level estimates requires not only the recognition of the leads. It also needs to be ensured that the openings are located directly beneath the satellite (at the nadir position).
As the water within a lead is calm and flat, leads act like mirrors for the satellite: they are easy to recognise, but their reflection is so strong that it leaves a signature in the data also at other incidence angles. An undetected lead is a missed opportunity to measure sea level, but a lead detected when not at nadir can cause a wrong estimation.
By tracking the signature that the leads leave on the collected data, it is possible to improve the detection capabilities. This is shown in the publication Lead Detection using Cryosat-2 Delay-Doppler Processing and Sentinel-1 SAR images (Advances in Space Research, 2017, DOI: 10.1016/j.asr.2017.07.011, [PDF]). The technique is also validated using radar images from Sentinel-1 (see picture above). This study will increase the reliability of the sea level analysis at high latitudes and thus contributes to improving the knowledge of the sea level dynamics in the Arctic and Antarctic oceans.
Open water areas in sea ice regions significantly influence the ocean-ice-atmosphere interaction. For the debate about Arctic climate change, the monitoring and quantification of such openings, so-called leads and polynyas, is of high relevance.
In a recent study, scientists from DGFI-TUM demonstrated the potential of high-frequency satellite altimetry data from the missions SARAL and Envisat for the detection of open water areas in the ice-covered Greenland Sea. In comparison with Synthetic Aperture Radar (SAR) images, they obtained a consistency rate of 76.9% for SARAL and 70.7% for Envisat. Some samples even resulted in true water detection rates of up to 94%.
The study is based on an innovative, unsupervised classification approach that relates the radar altimetry echoes (so-called waveforms) with different surface conditions, among them open water and sea ice. The algorithm has successfully been used for the detection of water areas with different spatial extent, and it can be applied to all pulse-limited altimetry data sets. The procedure and results are described in the article Monitoring the Arctic Seas: How Satellite Altimetry can be used to detect open water in sea-ice regions (Remote Sensing, 2017, available via open access).
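A standard waveform feature used to separate open water from sea ice is the pulse peakiness: specular echoes from calm water in leads and polynyas concentrate their power in very few range gates. The sketch below is illustrative only; the fixed threshold is hypothetical, whereas the published approach derives its class boundaries without supervision from several waveform features:

```python
import numpy as np

def pulse_peakiness(waveform):
    """Ratio of peak power to total power of a radar altimetry waveform.
    Specular echoes from calm open water concentrate their power in few
    range gates (high peakiness); diffuse echoes from sea ice or rough
    ocean spread it out (low peakiness)."""
    w = np.asarray(waveform, dtype=float)
    return w.max() / w.sum()

def classify_waveform(waveform, threshold=0.3):
    # the 0.3 threshold is hypothetical, for illustration only
    return "open water" if pulse_peakiness(waveform) > threshold else "sea ice"
```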
The rapidly growing number of terrestrial GNSS (Global Navigation Satellite System) receivers providing double frequency measurements in real-time and near real-time enables the computation of ionosphere parameters such as the Vertical Total Electron Content (VTEC) with increasing accuracy and decreasing latency.
Scientists of DGFI-TUM have now developed a comprehensive processing framework to compute VTEC maps in near real-time from low latency GNSS measurements using compactly supported B-splines and recursive filtering methods. Details and results are described in the article Near real-time estimation of ionosphere vertical total electron content from GNSS satellites using B-splines in a Kalman filter (Annales Geophysicae, 2017, DOI: 10.5194/angeo-35-263-2017).
Series expansions in terms of B-spline functions allow for an appropriate handling of heterogeneously distributed input data. Kalman filtering enables the processing of the data immediately after acquisition and paves the way for sequential (near) real-time estimation of the unknown parameters, i.e. VTEC B-spline coefficients and differential code biases. Under the investigated conditions, the validation tests of our near real-time products show promising results in terms of accuracy and agreement with the post-processed final products of the International GNSS Service (IGS) and its analysis centers, which are usually publicly available only with several days of latency.
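The sequential estimation can be sketched as one prediction/update step of a linear Kalman filter, where the state vector holds the VTEC B-spline coefficients and each row of the observation matrix contains the B-spline basis values at a measurement's ionospheric pierce point. The random-walk prediction model below is a simplifying assumption:

```python
import numpy as np

def kalman_update(x, P, z, H, R, Q):
    """One prediction + update step of a linear Kalman filter.

    x, P : state (e.g. B-spline coefficients) and its covariance
    z    : new observations, modelled as z = H @ x + noise
    H    : observation matrix (B-spline basis values at pierce points)
    R, Q : observation-noise and process-noise covariances."""
    P = P + Q                          # prediction: random-walk model
    S = H @ P @ H.T + R                # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    x = x + K @ (z - H @ x)            # corrected state
    P = (np.eye(len(x)) - K @ H) @ P   # corrected covariance
    return x, P
```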
Satellite altimetry has been monitoring the sea level for more than 25 years. Measurements in the coastal zone, however, were routinely discarded due to poor quality. Recently, several studies addressed various techniques to improve the precision and the accuracy of coastal sea level measurements.
Scientists at DGFI-TUM are reprocessing the satellite signals using the ALES algorithm and have set up an experimental platform to distribute corrected sea level anomalies with a documented procedure. The COastal Sea level Tailored ALES (COSTA) dataset is now available in the Mediterranean and in the North Sea and provides the user with time series at each point along the satellite tracks of two satellite missions (ERS-2 and Envisat) covering the years 1996-2010. The COSTA dataset improves the precision of the standard product in over 70% of the domain. Details are provided in the recent presentation ALES Coastal Processing Applied to ERS: Extending the Coastal Sea Level Time Series (10th Coastal Altimetry Workshop, 21-24 February 2017, Florence, Italy). COSTA therefore not only improves the coastal data, but also marks a step forward in the precision of satellite altimetry in the open sea.
COSTA data is available for download in our Science Data Products section.
Classical approaches of inland altimetry determine water level variations of rivers at so-called virtual stations, i.e. fixed locations given by the crossings of altimeter tracks and rivers. Depending on the repeat cycles of the individual altimeter missions, the temporal resolution of these time series is limited to 35 or 10 days.
Now, scientists of DGFI-TUM have developed a method to provide a complete spatio-temporal description of the Mekong River based on multi-mission altimetry data. Details and results are described in the article Combination of Multi-Mission Altimetry Data Along the Mekong River with Spatio-Temporal Kriging (Journal of Geodesy, 2016, DOI: 10.1007/s00190-016-0980-z). The approach uses robust statistical interpolation methods to combine measurements of different altimetry missions, considering diverse flow velocities of the river as well as remote relationships between different catchment areas. With this approach, water level time series can be determined for arbitrary locations along the river at a significantly increased temporal resolution. For the test case of the Mekong River, a resolution of 5 days could be achieved, and the accuracy was improved by 23 to 34% compared to standard methods.
Two new flyers have recently been issued: The first one contains background information and directions for data access for DGFI-TUM's most recent realization of the International Terrestrial Reference System, the DTRF2014 (for more information about the DTRF2014 see message below).
The second flyer advertises DAHITI, DGFI-TUM's Database for Hydrological Time Series of Inland Waters that has been operated since 2013. DAHITI provides time series of water levels of lakes, reservoirs, rivers, and wetlands derived from multi-mission satellite altimetry for more than 400 globally distributed targets.
Both flyers can be accessed by clicking on the images on the right.
The DTRF2014 is DGFI-TUM’s new realization of the International Terrestrial Reference System (ITRS). It comprises positions and velocities of 1712 globally distributed stations of the space geodetic observation techniques VLBI, SLR, GNSS and DORIS as well as consistently estimated Earth orientation parameters. The DTRF2014 includes six additional years of data compared to the previous realization, i.e., the DTRF2008 (Seitz et al., 2012). Additionally, for the first time, non-tidal atmospheric and hydrological loading is considered in the DTRF2014.
In its role as an "ITRS Combination Centre" within the International Earth Rotation and Reference Systems Service (IERS), DGFI-TUM has taken on the responsibility of providing realizations of the ITRS at regular intervals. An up-to-date ITRS realization of the highest accuracy and long-term stability is an indispensable requirement for various applications in daily life (e.g., navigation and positioning, the realization of height systems and precise time systems, or the computation of spacecraft and satellite orbits). Furthermore, it is the backbone of Earth system research, providing the metrological basis and a uniform reference for monitoring processes in the context of global change (e.g., ice melting, sea level rise). Read more about the DTRF2014 here (also available in German here).
The DTRF2014 is available for download in our Science Data Products section.
Strong earthquakes cause changes in positions (up to several meters) and velocities of geodetic reference stations. Hence, existing global and regional reference frames become unusable in the affected regions. To ensure the long-term stability of the geodetic reference frames, the transformation of station positions between different epochs requires the availability of reliable continuous surface deformation models.
Scientists of DGFI-TUM have now published the new continental continuous surface deformation model VEMOS2015 (Velocity Model for SIRGAS [Sistema de Referencia Geocéntrico para Las Américas]) for Latin America and the Caribbean, inferred from GNSS (GPS+GLONASS) measurements acquired after the strong 2010 earthquakes in Chile and Mexico. VEMOS2015 is based on a multi-year velocity solution for a network of 456 continuously operating GNSS stations covering the five-year period from March 14, 2010 to April 11, 2015. The approach and results are described in the article Crustal deformation and surface kinematics after the 2010 earthquakes in Latin America (Journal of Geodynamics, 2016, DOI: 10.1016/j.jog.2016.06.005).
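As a minimal illustration of why such deformation models are needed, the sketch below propagates one coordinate component of a station across an earthquake using a piecewise linear model (secular velocity plus a co-seismic jump). The function and all numerical values are invented for illustration; a real transformation would interpolate the VEMOS2015 velocity field and handle full 3-D coordinates and post-seismic relaxation.

```python
def propagate(x0, t0, v_pre, v_post, t_eq, jump, t):
    """Position at epoch t for a station with reference position x0 (m) at
    epoch t0, pre-/post-seismic velocities (m/yr) and a co-seismic
    offset (m) at the earthquake epoch t_eq."""
    if t < t_eq:
        return x0 + v_pre * (t - t0)
    # position just before the quake, plus the jump, plus post-seismic motion
    return x0 + v_pre * (t_eq - t0) + jump + v_post * (t - t_eq)

# East component of a hypothetical station near a rupture (values invented)
x_2015 = propagate(x0=100.000, t0=2005.0, v_pre=0.020, v_post=0.035,
                   t_eq=2010.16, jump=-3.1, t=2015.0)
print(round(x_2015, 3))   # → 97.173
```

Without the jump and the changed post-seismic velocity, a naive linear extrapolation from 2005 would be off by more than three meters at the 2015 epoch, which is why frames in affected regions become unusable.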
VEMOS2015 as well as the SIRGAS reference frame realization SIR15P01 (multi-year solution for 456 GNSS stations, including weekly residual time series) are provided in our Science Data Products section. More information on DGFI-TUM's research activities related to SIRGAS can be found here.
In recent years, satellite altimetry has proven its potential to monitor water level variations not only over the oceans but also over inland water bodies. DGFI-TUM provides altimetry-derived time series of water stage variations of various globally distributed rivers and lakes via its web service "Database for Hydrological Time Series over Inland Waters" (DAHITI; see below).
Now, scientists of DGFI-TUM have developed an innovative processing method for monitoring and analyzing water level variations in wetlands and flooded areas. The approach is based on automated altimeter data selection by waveform classification and an optimized waveform retracking. It is described in the article Potential of ENVISAT Radar Altimetry for Water Level Monitoring in the Pantanal Wetland (Remote Sensing, 2016, available via open-access).
Using the example of the Pantanal wetland in South America, this study demonstrates the capabilities and limitations of the ENVISAT radar altimeter for monitoring water levels in inundation areas. The accuracy of the water stages varies between 30 and 50 cm (RMSE) and is of the same order of magnitude as reported for smaller rivers. Most areas of the Pantanal show clear annual water level variations with maximum water stages between January and June. The amplitudes can reach up to about 1.5 m for larger rivers and their floodplains. However, some areas of the wetland show water level variations of less than a few decimeters, which is below the accuracy of the method. These areas cannot be reliably monitored by ENVISAT. Further investigations will show whether the use of Delay-Doppler altimeter data (such as measured by the recently launched Sentinel-3 mission) might improve the results there.
The Global Geodetic Observing System (GGOS) of the International Association of Geodesy (IAG) promotes, through its Focus Area 1 Unified Height System, the definition and realization of a global vertical reference system with homogeneous consistency and long-term stability. For the term 2011-2015, DGFI-TUM coordinated the Working Group Vertical Datum Standardization, whose main purpose was to determine an updated value for the gravity potential W0 of the geoid to be introduced as the conventional reference level for the realization of a global height system.
The derived value was officially adopted by the IAG in its Resolution No. 1, July 2015, as the conventional W0 value for the definition and realization of the International Height Reference System. A detailed description of DGFI-TUM's computation strategy for W0, the applied models, conventions and standards, as well as the results is presented in the recent publication A conventional value for the geoid reference potential W0 (Journal of Geodesy, 2016, DOI: 10.1007/s00190-016-0913-x).
Read more about the background and DGFI-TUM's activities related to the determination of the new W0 value here.
Different measurement techniques of the Earth's gravity field are characterized by different spectral sensitivities, i.e., they allow for detecting structures of the gravity field at different spatial scales. By combining observations from various measurement techniques, a data set covering a broad spectral range can be obtained. Typically, high-resolution gravity data from regional measurements are combined with global satellite information of lower spatial resolution.
To exploit the gravitational information as well as possible, scientists of DGFI-TUM have set up a regional modeling approach. It uses radial spherical basis functions and plays to the strengths of the various data sets by a flexible combination of high- and medium-resolution terrestrial, airborne, shipborne, and altimetry measurements. The resulting regional models can serve as a basis for various applications, such as the refinement of global gravity field models, national geoid determination, and the detection of mass anomalies in the Earth's interior. Details can be found in the recently published article Combination of various observation techniques for regional modeling of the gravity field (Journal of Geophysical Research, 2016, DOI: 10.1002/2015JB012586).
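The core of such a combination can be sketched as weighted least squares on a common set of basis-function coefficients. The sketch below is a simplified assumption: the `combine` function, the fixed per-dataset weights, and the synthetic data are illustrative only, whereas the published approach uses radial spherical basis functions and determines the relative weighting more rigorously.

```python
import numpy as np

def combine(datasets, n_coeff):
    """datasets: list of (A, y, sigma) with design matrix A, observation
    vector y, and noise level sigma; returns the combined coefficients."""
    N = np.zeros((n_coeff, n_coeff))   # accumulated normal-equation matrix
    b = np.zeros(n_coeff)
    for A, y, sigma in datasets:
        w = 1.0 / sigma ** 2           # more accurate data -> higher weight
        N += w * (A.T @ A)
        b += w * (A.T @ y)
    return np.linalg.solve(N, b)

# Synthetic test: two "techniques" observing the same 3 coefficients
rng = np.random.default_rng(42)
x_true = np.array([1.0, -2.0, 0.5])
A1 = rng.normal(size=(20, 3))          # e.g. dense, accurate regional data
A2 = rng.normal(size=(50, 3))          # e.g. coarser satellite information
est = combine([(A1, A1 @ x_true, 0.05), (A2, A2 @ x_true, 0.20)], 3)
print(np.round(est, 6))
```

Because the normal equations of each data set are simply accumulated, data sets of very different resolution and accuracy contribute to the same coefficients, each according to its weight.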
Scientists of DGFI-TUM have developed a new approach for the automated estimation of water levels of inland water bodies based on satellite observations from multi-mission altimetry. Time series of water stage variations of various globally distributed rivers and lakes are made available through DGFI-TUM's web service “Database for Hydrological Time Series over Inland Waters” (DAHITI).
The approach is described in the publication DAHITI – an innovative approach for estimating water level time series over inland waters using multi-mission satellite altimetry (Hydrology and Earth System Sciences, 2015, available via open-access).
The new method is based on an extended outlier rejection and a Kalman filter incorporating cross-calibrated multi-mission altimeter data from Envisat, ERS-2, Jason-1, Jason-2, TOPEX/Poseidon, and SARAL/AltiKa, including their uncertainties. The paper presents water level time series for a variety of lakes and rivers in North and South America. A comprehensive validation is performed by comparison with in situ gauge data and results from external inland altimeter databases. The new approach yields RMS differences with respect to in situ data between 4 and 36 cm for lakes and between 8 and 114 cm for rivers. For most study cases, it achieves more accurate height information than other available altimeter databases.
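The filtering idea can be sketched with a minimal one-dimensional random-walk Kalman filter and an innovation-based outlier test. Everything below is illustrative: the function, its parameters, and the toy data are invented, while the operational DAHITI processing uses cross-calibrated multi-mission data and a more elaborate rejection scheme.

```python
def kalman_water_level(times, heights, sigmas, q=0.01, reject_k=3.0):
    """Filter noisy water-level observations into a smoothed time series.
    q: assumed random-walk process noise (m^2 per day)."""
    x = heights[0]                      # initial water level state (m)
    P = sigmas[0] ** 2                  # initial state variance (m^2)
    series = [(times[0], x)]
    for t_prev, t, z, s in zip(times, times[1:], heights[1:], sigmas[1:]):
        P += q * (t - t_prev)           # predict: uncertainty grows with gap
        innov = z - x                   # innovation (observation - prediction)
        S = P + s ** 2                  # innovation variance
        if innov ** 2 > reject_k ** 2 * S:
            series.append((t, x))       # gross outlier: keep the prediction
            continue
        K = P / S                       # Kalman gain
        x += K * innov                  # state update
        P *= 1.0 - K                    # variance update
        series.append((t, x))
    return series

t = [0, 10, 20, 30, 40]                 # days
h = [3.0, 3.2, 9.9, 3.4, 3.5]           # water levels (m); 9.9 is an outlier
s = [0.1] * 5                           # observation uncertainties (m)
print([round(level, 2) for _, level in kalman_water_level(t, h, s)])
# → [3.0, 3.18, 3.18, 3.39, 3.49]
```

The 9.9 m observation fails the innovation test and is skipped, so the filtered series stays near the true signal; letting the variance grow with the time gap naturally accommodates the irregular sampling of multi-mission altimetry.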
In February 2015, the UN General Assembly adopted its first geospatial resolution, "A Global Geodetic Reference Frame for Sustainable Development". This resolution recognizes the importance of geodesy for many societal and economic benefit areas, including navigation and transport, construction and monitoring of infrastructure, process control, surveying and mapping, and the growing demand for precise observation of our planet's changes in space and time. The resolution stresses the significance of the global reference frame for accomplishing these tasks, for natural disaster management, and for providing reliable information to decision-makers.
The United Nations Global Geospatial Information Management (UN-GGIM) Working Group on the Global Geodetic Reference Frame (GGRF) is tasked with drafting a roadmap for the enhancement of the GGRF under a UN mandate.
Based on its competence in the realization of reference frames, DGFI-TUM is involved in this activity by contributing to the compilation of a concept paper within the framework of the International Association of Geodesy (IAG). The main purpose of this paper is to provide a common understanding of the definition of the GGRF and the scientific basis for the preparation of the roadmap to be accomplished by the UN-GGIM Working Group on the GGRF. [more]