Recent news:

Assessment of non-tidal loading data for DGFI-TUM’s upcoming ITRS 2020 realization DTRF2020

[Contributions to geocenter motion computed from NTL site displacements of ESMGFZ (blue) and GCTI20 (red). Geocenter motion estimated by SLR is shown in grey. Time-series for GCTI20 and SLR have been shifted by 10 and 20 mm, respectively.]

Site displacements caused by non-tidal loading (NTL) are among the major limiting factors for the accuracy of today's reference frames. On the occasion of the current 2020 realization of the International Terrestrial Reference System (ITRS), we investigated the NTL models of two providers with respect to their suitability for the determination of long-term stable reference frames.

As one of the three ITRS Combination Centers of the International Earth Rotation and Reference Systems Service (IERS), DGFI-TUM is in charge of realizing the ITRS in regular intervals. Our upcoming solution, the DTRF2020, will correct for site displacements caused by mass load changes of atmosphere, ocean and hydrology.

Since there is no conventional model for the application of NTL yet, the best approach and data for the DTRF2020 need to be investigated. The appropriate modeling of NTL is crucial to our ITRS realization: accounting for NTL site displacements will affect the estimated coordinates of reference points (i.e., geodetic observatories), but should not affect the realized geocenter. We analyzed two sets of site displacements based on different geophysical models. One is the Global Geophysical Fluid Center contribution (labelled GCTI20) to the ITRS 2020 realization, the other is the operational NTL data from the Earth System Modelling group of the Deutsches GeoForschungsZentrum (ESMGFZ; already applied at DGFI-TUM, e.g. for VLBI analysis). Among other things, we compared the displacements with time series of GNSS station residuals and calculated the contributions to geocenter motion (see Figure).

While the correlations are satisfactory, neither data set could be identified as having the better agreement with the residual GNSS station positions. The main differences between GCTI20 and ESMGFZ are the hydrological loading components and the presence of artificial trend changes in ESMGFZ site displacements (and hence geocenter motion contributions). The latter is a hindrance to realizing a secular reference frame. As a result, GCTI20 will be applied for the DTRF2020. The study is published in the article Comparison of non‐tidal loading data for application in a secular terrestrial reference frame (Earth, Planets and Space, 2022, DOI: 10.1186/s40623-022-01634-1, [PDF]).
 

Improved modeling of atmospheric drag in precise orbit determination

[POD of a spherical satellite by Satellite Laser Ranging: The calculated orbit depends on gravitational and non-gravitational accelerations, such as air drag (Left). In-situ measurement of thermospheric density along the orbit of GRACE satellites using on-board accelerometers (Right).]

A major problem in the precise orbit determination (POD) of satellites at altitudes below 1,000 km is the modelling of atmospheric drag, which depends mainly on thermospheric density and causes the largest non-gravitational acceleration. Thermospheric densities at satellite positions are normally taken from empirical models, which have limited accuracy. Conversely, satellites orbiting the Earth within the thermosphere can themselves be used to derive thermospheric density information because of their sensitivity to perturbing accelerations.

Scientists from DGFI-TUM and the Institute of Geodesy and Geoinformation at the University of Bonn (IGG Bonn) have for the first time compared thermospheric density corrections in the form of scale factors for the NRLMSISE-00 model with a temporal resolution of 12 hours. It was shown that time-averaged scale factors from in-situ acceleration measurements on board CHAMP and GRACE fit well to arc-wise scale factors from the Satellite Laser Ranging (SLR) technique applied to the spherical satellites Larets, Stella, WESTPAC and Starlette. The estimated scale factors vary by up to 30% around the value of 1 at low solar activity and by up to 70% at high solar activity. This shows the extent to which the NRLMSISE-00 model values of thermospheric density deviate from the observed values. On average, at low solar activity the model overestimates the thermospheric density and has to be scaled down with the estimated scale factors, while at high solar activity the model underestimates the density values and has to be scaled up.
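To illustrate where such a scale factor enters the computation, the sketch below shows a simple cannonball drag model in which the density from an empirical model such as NRLMSISE-00 is multiplied by an estimated factor. This is only a minimal illustration with placeholder values (roughly a Starlette-like sphere), not the implementation used in DOGS-OC or GROOPS.

```python
import numpy as np

def drag_acceleration(rho_model, scale_factor, v_rel, c_d, area, mass):
    """Cannonball drag acceleration with a scaled model density.

    rho_model    : thermospheric density from an empirical model, e.g. NRLMSISE-00 [kg/m^3]
    scale_factor : estimated correction factor (1.0 = model density unchanged)
    v_rel        : satellite velocity relative to the co-rotating atmosphere [m/s], shape (3,)
    c_d, area, mass : drag coefficient [-], cross-section [m^2], satellite mass [kg]
    """
    rho = scale_factor * rho_model                              # scaled model density
    speed = np.linalg.norm(v_rel)
    return -0.5 * c_d * (area / mass) * rho * speed * v_rel     # acceleration [m/s^2]

# Illustrative numbers only (placeholder density and velocity):
a = drag_acceleration(rho_model=1e-14, scale_factor=0.8,
                      v_rel=np.array([7500.0, 0.0, 0.0]),
                      c_d=2.2, area=0.045, mass=47.0)
print(a)   # drag acceleration vector in m/s^2
```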

Depending on the altitude, there are correlations of up to 0.8 between the scale factors derived from accelerometer data and those estimated from SLR. To check the reliability of the latter, the POD results from two different software packages were compared, namely DOGS-OC at DGFI-TUM and GROOPS at IGG Bonn. Above 680 km altitude, a linear decrease in the estimated thermospheric density scale factors of about 5% per decade was observed, possibly related to climate change. The results of this study are published in the article Scale Factors of the Thermospheric Density: A Comparison of Satellite Laser Ranging and Accelerometer Solutions (Journal of Geophysical Research: Space Physics, 2021, DOI: 10.1029/2021JA029708, [PDF]).
 

New global ocean tide model EOT20 from multi-mission satellite altimetry

[Amplitudes and phases of tidal constituents M2 (top) and K2 (bottom).]

DGFI-TUM recently published the latest in a series of empirical ocean tide (EOT) models. The new model, named EOT20, shows improved results compared to other global tide models (including our earlier model EOT11a), especially in the coastal region.

Ocean tides play a vital role in various practical applications, especially in the coastal environment. In addition, tides are of importance in geodetic data analysis, for example in improving the observation of sea surface processes from along-track satellite altimetry and in determining high-resolution gravity fields from missions such as GRACE. Although in recent years tide models have made significant progress in the estimation of tides using satellite altimetry, the coastal region remains a challenge due to the complexity of shorelines, poorly resolved bathymetry and land contamination of altimetry radar echoes.

EOT20 benefits from advances in coastal altimetry, particularly in the use of the ALES retracker. The EOT20 approach relies on residual tidal analysis with respect to a reference tide model (FES2014) to estimate residual signals of the ocean tides. Further developments include the incorporation of more altimetry data, improved coastline representation and triangular gridding.
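As a minimal illustration of the residual analysis idea, the sketch below fits amplitudes and phases of two major constituents to sea level residuals (observation minus the FES2014 reference tide) by least squares. The actual EOT20 processing works on along-track altimetry data with many more constituents and a subsequent triangular gridding step, so this is only a conceptual sketch.

```python
import numpy as np

# Well-known periods of two major constituents, expressed as frequencies in cycles per hour
FREQ = {"M2": 1.0 / 12.4206012, "K1": 1.0 / 23.9344696}   # [1/h]

def fit_residual_tides(t_hours, residual_ssh, constituents=("M2", "K1")):
    """Least-squares fit of residual tidal amplitudes and phases.

    t_hours      : observation epochs [h]
    residual_ssh : sea surface height minus the reference tide model [m]
    Returns a dict of (amplitude [m], phase [deg]) per constituent.
    """
    cols = [np.ones_like(t_hours)]                          # mean offset
    for name in constituents:
        omega = 2.0 * np.pi * FREQ[name]
        cols += [np.cos(omega * t_hours), np.sin(omega * t_hours)]
    A = np.column_stack(cols)
    x, *_ = np.linalg.lstsq(A, residual_ssh, rcond=None)
    out = {}
    for i, name in enumerate(constituents):
        c, s = x[1 + 2 * i], x[2 + 2 * i]
        out[name] = (np.hypot(c, s), np.degrees(np.arctan2(s, c)))
    return out
```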

The model's accuracy was evaluated using in-situ tide gauge data from DGFI-TUM's TICON dataset. For the eight major tidal constituents, EOT20 shows smaller errors in the coastal region than other global ocean tide models, with an error reduction of ~0.2 cm compared to the next best model (FES2014). EOT20 is on par with the best tide models in shelf regions and the open ocean, with improvement over EOT11a throughout. When used as a tidal correction for satellite altimetry, EOT20 reduced the sea level variance compared to both EOT11a and FES2014. These improvements, particularly in the coastal region, encourage the use of EOT20 as a tidal correction for satellite altimetry in sea-level research.

The ocean tide and load tide datasets of EOT20 are available in our Science Data Products section. Methodology and results are described in the publication EOT20: a global ocean tide model from multi-mission satellite altimetry (Earth System Science Data, 2021, doi: 10.5194/essd-13-3869-2021, [PDF]).
 

First comprehensive measurements of sea level changes in the Baltic Sea and the North Sea

[Rise of mean sea level in the North and Baltic Sea between 1995 and 2019. Gray shading indicates areas with high statistical uncertainty.]

Precise data for improved coastline protection: Led by DGFI-TUM, an international team of researchers has created the first comprehensive data sets of regional sea level rise in the North Sea and the Baltic Sea, including coastal areas and regions covered by sea ice. The data sets provide new insights into long-term and seasonal sea level changes over the past quarter century. This information is of vital importance for planning protective measures and for understanding dynamic processes in the oceans and the climate system.

Especially near coastlines, where many cities and industrial facilities are located, the quality and quantity of data collected by the satellites are compromised by strong perturbations of the radar signal. Another problem is sea ice, which covers parts of the oceans in winter and is impenetrable to radar. In the ESA Baltic Sea Level project (Baltic SEAL), the researchers developed algorithms to process the measurement data from radar satellites so as to permit precise and high-resolution measurements of sea level changes even in coastal areas and beneath sea ice. In this effort, the Baltic Sea serves as a model region: the complex shape of the coastline and the sea ice make the data analysis particularly difficult, so analytical methods that work here can easily be adapted to other regions. Hundreds of millions of radar measurements taken between 1995 and 2019 were processed in a newly developed multi-stage procedure: signals reflected from ice-covered sea water along cracks and fissures were identified in the radar echoes, new computational methods were developed to improve the quality of sea level data close to land, and finally the measurements from the various satellite missions were calibrated and combined.

The analysis of these data for the Baltic Sea shows that the sea level has risen at an annual rate of 2-3 millimeters in the south, on the German and Danish coasts, as compared to 6 millimeters in the north-east, in the Bay of Bothnia. The cause of this above-average rise: strong south-westerly winds connected to the North Atlantic Oscillation (NAO) drive the waters to the north-east. The developed method has also been applied to the North Sea, where the sea level is rising by 2.6 millimeters per year, and by 3.2 millimeters per year in the German Bight.

The data sets Baltic SEAL and North SEAL of sea level changes are available for download in our Science Data Products section. Methods and results are described in the respective publications Absolute Baltic Sea Level Trends in the Satellite Altimetry Era: A Revisit (Frontiers in Marine Science, 2021, doi: 10.3389/fmars.2021.647607, [PDF]) and North SEAL: A new Dataset of Sea Level Changes in the North Sea from Satellite Altimetry (Earth System Science Data, 2021, doi: 10.5194/essd-13-3733-2021, [PDF]).

The study is also the subject of a current TUM press release (English, German).
 

Global coastal attenuation of wind-waves observed with radar altimetry

Knowledge of ocean wave heights at the coast is essential for several operational applications, ranging from coastal protection to energy exploitation. In this context, the Significant Wave Height (SWH) is one of the most general quantitative parameters describing the sea state at a particular location. SWH, the average height of the highest one-third of waves, can be measured from satellites using radar altimeters. Over the open ocean, such measurements are routinely used, for example, for ocean weather predictions. In the coastal zone, however, the radar measurements have not been considered reliable. As an alternative, in-situ buoys or high-resolution ocean models are employed. While the network of in-situ buoys is very sparse and can only provide data at specific locations, appropriate ocean models are computationally very expensive, not globally available, and require constant validation.

Led by DGFI-TUM, an international team has now analyzed reprocessed data from radar altimetry, specifically tailored to improve the quality and quantity of coastal measurements. The results, published in the article Global coastal attenuation of wind-waves observed with radar altimetry (Nature Communications, 2021, doi: 10.1038/s41467-021-23982-4, [PDF]), provide a global picture of the average wave climate when going from offshore (about 30 km) to the coast (up to 3 km from land). The typical attenuation of the waves when approaching the coast, for example due to the shading effect from the land, is quantified to be about 20% of the wave height reached offshore. As a consequence, the energy flux transported by the waves is calculated to decline by about 40% on a global average. This result is paramount for coastal assessments, which until now are often based on models with validation relative to offshore satellite altimetry data.
 

Improved parameters of geodetic VLBI by correcting for all types of non-tidal loading

[Change in the baseline length between the VLBI antennas Wettzell and Badary due to different effects of non-tidal loading.]

Very Long Baseline Interferometry (VLBI) is a geodetic space technique which measures the difference in arrival times (delay) of extra-galactic radio signals at separate antennas across the Earth. The delay depends on the distances between pairs of antennas, the so-called "baselines". The observed delays allow for estimating the absolute positions of the antennas in the terrestrial reference frame (TRF), the positions of the radio sources in the celestial reference frame (CRF), as well as the complete set of Earth Orientation Parameters (EOP) linking TRF and CRF.

The positions of the antennas vary during VLBI measurements, and the instantaneous displacements with respect to the long-term linear motion provided by the TRF are caused by different geophysical effects. One such effect is the deformation of the Earth's surface by non-tidal loading, driven by the redistribution of air and water masses within the atmosphere, ocean and continental hydrosphere. However, oceanic and hydrological loading are usually omitted in routine VLBI processing. In the recent study Benefits of non-tidal loading applied at distinct levels in VLBI analysis (Journal of Geodesy, 2020, doi: 10.1007/s00190-020-01418-z, [PDF]), researchers of DGFI-TUM applied all three non-tidal loading types in the analysis of VLBI sessions between 1984 and 2017 and investigated the impact on various geodetic parameters.

Loading data in terms of three-dimensional station site displacements was applied at two distinct levels of the parameter estimation process: The “observation level” represents the rigorous application, while only average site displacements are considered in the approximation at the “normal equation level”. The study revealed that each baseline is most sensitive to a different loading type (Figure). Considering all types jointly provides the best results, as the variation in estimated heights decreases to a larger extent and for more stations than with any of the single loading types. In particular, the inclusion of hydrological loading leads to a significant reduction in the annual residual signal of station heights. These effects, which improve the stability of station positions, were observed for both application levels with a similar magnitude, and hence the correction for non-tidal loading at normal equation level proved to be a suitable approximation in VLBI analysis.
 

New coastal sea level record from reprocessed Jason satellite altimetry

[Coastal sea level trends (mm/yr) at the 429 selected coastal sites of the study.]

Many coastal regions are exposed to sea level rise and are thus increasingly threatened by the risk of flooding during extreme events. Risk assessment and the development of appropriate adaptation measures are complex and require a reliable data basis of regional coastal sea level changes from precise observations over long time spans. But systematic coastal sea level observations are lacking along most of the world coastlines. Coastal zones are highly under-sampled by tide gauges, and altimetry data are largely defective because of land contamination of the radar signals.

Now, in the framework of the Climate Change Initiative (CCI) Sea Level project of the European Space Agency (ESA), a novel altimetry-based coastal sea level data record has been created. It consists of high-resolution (~300 m) monthly sea level data along the satellite tracks, generally at distances of less than 3-4 km from the coastlines and sometimes even closer, within 1-2 km from the coast. The data set is based on a complete reprocessing of altimetry radar observations from the Jason-1/2/3 missions and provides coastal sea level trends over 2002-2018 at 429 coastal sites located in six regions (Northeast Atlantic, Mediterranean Sea, West Africa, North Indian Ocean, Southeast Asia and Australia). DGFI-TUM is involved in the CCI Sea Level project by designing and testing improved radar signal processing techniques to exploit the radar signal in the coastal zone and to correct the measurements. The procedure and the new coastal sea level record are described in the article Coastal sea level anomalies and associated trends from Jason satellite altimetry over 2002–2018 (Nature Scientific Data, 2020, doi: 10.1038/s41597-020-00694-w, [PDF]). The data is freely available at the SEANOE repository (doi: 10.17882/74354).
 

Adaptive Modeling of the Global Ionosphere Vertical Total Electron Content

[VTEC maps covering three days in 2015 around the St. Patrick's Day geomagnetic storm, generated by the developed modeling approach, which adapts the estimator autonomously to environmental changes: March 16 (day before the storm, top), March 17 (storm day, middle) and March 18 (day after the storm, bottom).]

Space weather and natural disaster monitoring, navigation, positioning and other applications imply an increasing need for low-latency ionosphere information. In order to create such information, a suitable estimator is required that makes use of observation data as soon as they are available. In this sense, the Kalman Filter (KF) is often applied in (ultra) rapid and (near) real-time applications. A drawback of the standard KF implementation is that the model uncertainties have to be defined in advance, although they can vary over time. Implementing adaptive approaches in the KF is a way to tune the stochastic model parameters during the filter run-time.

In recent years, DGFI-TUM has developed approaches for modeling the global vertical total electron content (VTEC) of the ionosphere as a series expansion in terms of localizing B-spline basis functions from unevenly distributed input data, such as the dual-frequency GNSS measurements of the IGS network. Scientists of DGFI-TUM have now taken the next step and developed adaptive methods for ultra-rapid VTEC modelling that tune the associated model uncertainties in a self-learning manner. The adaptive approach relies on the method of Variance Component Estimation (VCE) and significantly reduces the effort of setting up the measurement model and the associated uncertainties for different groups of observations. In order to define the dynamic (prediction) model of the ionosphere target parameters, advantages of the B-spline representation are exploited: since the coefficients of the B-spline representation resemble the VTEC signal, physical interpretations can be deduced directly from the coefficients. This allows the empirical prediction model to be set up very efficiently.
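The core idea can be sketched in a few lines: after the prediction step of a Kalman filter, the measurement variance is re-estimated from the innovations so that the stochastic model adapts during run-time. The snippet below is a strongly simplified stand-in for the published VCE-based tuning, with a generic random-walk prediction model and a single observation group; all variable names are illustrative.

```python
import numpy as np

def adaptive_kf_step(x, P, y, H, Q, sigma2):
    """One Kalman filter step with a simple adaptive measurement variance.

    x, P   : state vector and covariance from the previous epoch
    y, H   : observations and design matrix of the current epoch
    Q      : process-noise covariance of the random-walk prediction
    sigma2 : current estimate of the measurement variance
    Note: the innovation-based variance update below is a simple stand-in for the
    variance component estimation (VCE) used in the published approach.
    """
    # Prediction (random-walk dynamic model)
    x_pred, P_pred = x, P + Q

    # Innovations and their modelled covariance
    d = y - H @ x_pred
    S = H @ P_pred @ H.T + sigma2 * np.eye(len(y))

    # Update
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ d
    P_new = (np.eye(len(x)) - K @ H) @ P_pred

    # Adapt the measurement variance from the innovations:
    # E[d d^T] = H P_pred H^T + sigma^2 I, so sigma^2 ~ (d'd - tr(H P_pred H^T)) / m
    m = len(y)
    sigma2_new = max((d @ d - np.trace(H @ P_pred @ H.T)) / m, 1e-6)
    return x_new, P_new, sigma2_new
```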

The approach is applied to ultra-rapid VTEC modeling with a maximum latency of about 2.5 hours, using ionosphere measurements from GPS and GLONASS, and can be extended to additional GNSS constellations such as GALILEO or to other measurement techniques. Details are presented in the article Adaptive Modeling of the Global Ionosphere Vertical Total Electron Content (Remote Sensing, 2020, doi:10.3390/rs12111822, [PDF]).
 

Satellite-based time series of volume variations of small inland water bodies

[Derived bathymetry and volume time series (1982-2019) of Hubbard Creek Lake, USA. The dotted line displays the ground track of the altimeter satellite.]

In the debate on climate change impacts, the availability and accessibility of freshwater on Earth is an extremely important topic. About 0.25% of the Earth's freshwater is stored in lakes and reservoirs. A large fraction of these water bodies is characterized by strong storage changes, not only with the seasons but also in the long term as a consequence of, e.g., human interference or climate-related phenomena. Furthermore, the number of reported flood events relating to inland waters is steadily increasing, from about 150 in 1980 to more than 400 in recent years. Various tasks, such as water resource management, water supply or civil protection, require accurate and current information about water storage. In light of the declining number of ground-based measurements, remote sensing techniques have become extremely relevant for the monitoring of lakes and reservoirs worldwide.

For many years, DGFI-TUM has been working on the determination of accurate water level changes from satellite altimetry, also for small inland water bodies, and provides respective time series for more than 2740 targets in its Database for Hydrological Time Series of Inland Waters (DAHITI). A newly developed approach now combines these data with areal information from remote sensing images and thus enables the determination of volume changes. For each water body, the underlying algorithm creates a fixed water level/surface area relation (so-called hypsometry) and a high-resolution bathymetry above the lowest observed water level. Details can be found in the article Volume Variations of Small Inland Water Bodies from a Combination of Satellite Altimetry and Optical Imagery (Remote Sensing, 2020, doi: 10.3390/rs12101606, [PDF]).
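Once the hypsometric curve is available, the volume change between two observed water levels follows from integrating the surface area over the water level. The sketch below illustrates this step with an invented hypsometry; the published algorithm derives the curve and the bathymetry from the combined altimetry and imagery data.

```python
import numpy as np

def volume_change(hypso_levels, hypso_areas, level_from, level_to, n=200):
    """Volume change between two water levels from a hypsometric curve.

    hypso_levels : water levels [m] at which the surface area is known
    hypso_areas  : corresponding surface areas [km^2]
    Returns the volume change in km^3 (trapezoidal integration of area over height).
    """
    h = np.linspace(level_from, level_to, n)
    area = np.interp(h, hypso_levels, hypso_areas)       # interpolated area [km^2]
    return np.trapz(area, h) / 1000.0                    # m * km^2 -> km^3

# Illustrative hypsometry (not real data): area grows from 20 to 80 km^2 over 10 m
levels = np.array([290.0, 292.0, 294.0, 296.0, 298.0, 300.0])
areas  = np.array([20.0, 28.0, 40.0, 55.0, 68.0, 80.0])
print(volume_change(levels, areas, 292.0, 298.0))        # volume gained between the two levels
```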

The procedure was applied to 28 lakes and reservoirs located in Texas, USA, with volumes between 0.1 km³ and 6.0 km³. Validation with ground data yields correlation coefficients between 0.80 and 0.99. The relative errors vary between 1.5% and 6.4%, with an average of 3.1%. All data are publicly accessible via DAHITI.
 

Combining heterogeneous observations in regional gravity field modeling

[High-resolution regional gravity data (airborne and terrestrial measurements) in combination with global satellite data.]

Gravity field determination is a major topic in geodesy, supporting applications ranging from Earth system science and orbit determination to the realization of physical height systems. Coarse-resolution global gravity field information from satellite observations can be combined with high-resolution gravity data from airborne, shipborne, or terrestrial measurements for regional gravity refinement. In this process, regularization is in most cases inevitable, and choosing an appropriate value for the regularization parameter is a crucial issue. Variance component estimation (VCE) and the L-curve method are two frequently used procedures for choosing the regularization parameter.

VCE simultaneously determines the relative weighting between different observation types and the regularization parameter. The prior information is treated as another observation type and is required to be stochastic. However, in most regional gravity modeling studies, a background model serves as prior information, which has no random character but is deterministic. In this case, the regularization parameter estimated by VCE can be unreliable. The L-curve method (or other conventional regularization methods), on the other hand, cannot weight heterogeneous observations.

To overcome these drawbacks, scientists of DGFI-TUM developed two 'combined approaches' for determining the regularization parameter when different data sets are to be combined. The two approaches combine VCE and the L-curve method in such a way that the relative weights are estimated by VCE, while the regularization parameter is determined by the L-curve method. They differ in whether the relative weights between the observation types are determined first (VCE-Lc) or the regularization parameter is determined first by the L-curve method (Lc-VCE). Numerical investigations show that the two proposed approaches deliver lower RMS errors with respect to the validation data than either the L-curve method or VCE alone. Details are provided in the recent article Determination of the Regularization Parameter to Combine Heterogeneous Observations in Regional Gravity Field Modeling (Remote Sensing, 2020, doi: 10.3390/rs12101617, [PDF]).
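A compact sketch of the VCE-Lc idea is given below, under simplifying assumptions (generic observation groups, zero prior information, an identity regularization matrix and a simple maximum-curvature corner search): relative weights come from a simplified variance component estimation, then the regularization parameter from the corner of the L-curve. The published approaches operate on full regional gravity models and differ in detail.

```python
import numpy as np

def vce_lc(groups, lambdas, n_iter=5):
    """Sketch of 'VCE-Lc': group weights from simplified VCE, regularization
    parameter from the corner (maximum curvature) of the L-curve.

    groups  : list of (A_i, y_i) tuples (design matrix and observations per group)
    lambdas : candidate regularization parameters, e.g. np.logspace(-6, 2, 60)
    """
    n = groups[0][0].shape[1]
    s = [1.0] * len(groups)                              # variance components (1/weight)

    # Step 1: relative weighting by simplified VCE (without regularization)
    for _ in range(n_iter):
        N = sum(A.T @ A / si for (A, _), si in zip(groups, s))
        b = sum(A.T @ y / si for (A, y), si in zip(groups, s))
        x = np.linalg.solve(N, b)
        Ninv = np.linalg.inv(N)
        for i, ((A, y), si) in enumerate(zip(groups, s)):
            v = y - A @ x                                # residuals of group i
            r = len(y) - np.trace(A @ Ninv @ A.T) / si   # group redundancy
            s[i] = float(v @ v / r)

    # Normal equations with the final weights
    N = sum(A.T @ A / si for (A, _), si in zip(groups, s))
    b = sum(A.T @ y / si for (A, y), si in zip(groups, s))

    # Step 2: regularization parameter from the L-curve with fixed weights
    log_res, log_sol = [], []
    for lam in lambdas:
        x_lam = np.linalg.solve(N + lam * np.eye(n), b)
        res2 = sum(float((y - A @ x_lam) @ (y - A @ x_lam)) / si
                   for (A, y), si in zip(groups, s))
        log_res.append(0.5 * np.log(res2))
        log_sol.append(np.log(np.linalg.norm(x_lam)))

    dx, dy = np.gradient(np.array(log_res)), np.gradient(np.array(log_sol))
    ddx, ddy = np.gradient(dx), np.gradient(dy)
    curvature = (dx * ddy - dy * ddx) / np.clip((dx**2 + dy**2) ** 1.5, 1e-12, None)
    lam_best = lambdas[int(np.argmax(curvature))]
    return s, lam_best, np.linalg.solve(N + lam_best * np.eye(n), b)
```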
 

Wave heights in the ocean: A round robin assessment of satellite altimetry retracking algorithms

Satellite altimetry is a key technique for the observation of the world's oceans. It was initially introduced to determine the ocean surface topography and changes of the sea level by repeatedly measuring the distance between a satellite and the water surface. This distance relates to the round-trip travel time of a radio pulse emitted by the satellite and reflected by the water. But the shape of the received radar echo also makes it possible to study other relevant conditions at the ocean surface, such as significant wave height (SWH) and wind speed. Both of these quantities are related to the sea state, the knowledge of which is essential for numerous applications, e.g. ocean wave monitoring (for fishing or shipping route planning), weather forecasting, or wave climate studies. Information on the sea state is retrieved from the radar echo using an algorithmic approach called retracking.

In the framework of the European Space Agency Sea State Climate Change Initiative (SSCCI) project, a competitive exercise (round robin) has been conducted to determine the best retracking algorithm for sea state retrieval. The assessment focused on the Jason-3 and Sentinel-3A missions, representing the two main satellite altimeter technologies: the so-called "Low Resolution Mode", which encompasses over 25 years of data, and the newest "Delay-Doppler Mode", which exhibits an improved along-track resolution and signal-to-noise ratio. 19 retracking algorithms from six international research groups were included in the study. Results showed that all novel retracking algorithms perform better in the majority of the metrics than the baseline algorithms currently used for operational generation of the products. According to an objective weighting scheme based on the SSCCI criteria, DGFI-TUM's retracking algorithms WHALES (Low Resolution Mode) and WHALES-SAR (Delay-Doppler Mode) were ranked second best for all scenarios. Considering coastal scenarios only, WHALES showed the best performance among the Low Resolution Mode retracking algorithms. More details on the study are provided in the publication Round Robin Assessment of Radar Altimeter Low Resolution Mode and Delay-Doppler Retracking Algorithms for Significant Wave Height (Remote Sensing, 2020, doi: 10.3390/rs12081254, [PDF]).
 

Regional high-resolution ionosphere maps for continental regions

[Global VTEC map (top) represents the coarse structures comparable to the products of the IAACs. In combination with the delta VTEC map (bottom), the regional high-resolution VTEC map with a spectral content up to degree 48 can be generated.]

Ionospheric signal delay is one of the largest error sources in GNSS (Global Navigation Satellite Systems) applications and can cause positioning errors in the order of several meters. Especially for single-frequency users, who cannot correct for ionospheric signal delay, external information about the state of the ionosphere is essential. The International GNSS Service (IGS) and its Ionosphere Associated Analysis Centers (IAAC) routinely provide this information in terms of global ionosphere maps (GIM) representing the Vertical Total Electron Content (VTEC). The GIMs are typically limited in their spatial and spectral resolution (spherical harmonic degree 15) caused by the globally inhomogeneous distribution of GNSS observations used for GIM generation. Regional GNSS networks, however, offer dense clusters of observations, which can be used to generate regional VTEC solutions with a higher spectral resolution.

Based on a two-step approach which comprises a global model and a regional densification, scientists of DGFI-TUM have developed an algorithm to generate regional VTEC maps with higher spectral resolution accounting for the finer signal structures. The algorithm is integrated in a software package to provide VTEC maps using hourly GNSS data and ultra-rapid orbits. It allows for the dissemination of VTEC maps with a latency of 2 to 3 hours.

A numerical study for March 2015 based on hourly GNSS data provided by the IGS and the EUREF network is presented in the article Global and Regional High-Resolution VTEC Modelling Using a Two-Step B-Spline Approach (Remote Sensing, 2020, doi:10.3390/rs12071198, [PDF]). Validation with independent data shows that the generated regional high-resolution VTEC maps are of comparable accuracy to the regional maps provided by the Royal Observatory of Belgium, currently the best known publicly available product for Europe. While the latter are provided with a latency of about one week, DGFI-TUM's VTEC maps are characterized by a low latency, allowing for high-precision positioning and navigation.
 

Observation-based attitude model improves the orbits of Jason altimetry satellites

[Several coordinate transformations are required to describe the satellite's actual attitude with respect to the celestial reference system (GCRS).]

Low Earth orbiting satellites are strongly influenced by perturbing accelerations. For the precise orbit determination (POD) of a non-spherical satellite, accurate modelling of the satellite-body attitude and solar panel orientation is important as the acceleration is directly related to the satellite’s effective cross-sectional area. Moreover, the positions of tracking instruments mounted on the satellite are affected by its attitude.

The altimeter satellites Jason-1/-2/-3 have been providing continuous precise monitoring data of the sea level since December 2001. They have a complex shape, comprising the main spacecraft body on which solar panels and numerous measurement and positioning instruments are mounted. The attitude in space is commonly modelled using a so-called nominal yaw steering model. Scientists of DGFI-TUM have now implemented an observation-based algorithm in which extensively preprocessed quaternions of the satellite body orientation (measured by star tracking cameras) and rotation angles of the solar arrays are used to derive improved attitude information. Processing steps comprise a detailed data analysis, outlier elimination, temporal resampling of the observation data and the optimal interpolation of missing values. The procedure is described in the recent publication Observation-based attitude realization for accurate Jason satellite orbits and its impact on geodetic and altimetry results (Remote Sensing, 2020, doi: 10.3390/rs12040682, [PDF]).
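As an illustration of the resampling step, the sketch below interpolates star-tracker quaternions to new epochs with spherical linear interpolation (SLERP) using SciPy. The screening, outlier elimination and optimal gap handling of the published workflow are omitted here; input names and the example rotation are purely illustrative.

```python
import numpy as np
from scipy.spatial.transform import Rotation, Slerp

def resample_attitude(t_obs, quats_xyzw, t_new):
    """Resample star-tracker body quaternions to new epochs via SLERP.

    t_obs      : observation epochs [s], strictly increasing
    quats_xyzw : quaternions in scalar-last (x, y, z, w) order, shape (n, 4)
    t_new      : target epochs [s] within [t_obs[0], t_obs[-1]]
    """
    rotations = Rotation.from_quat(quats_xyzw)   # normalizes the quaternions
    slerp = Slerp(t_obs, rotations)
    return slerp(t_new).as_quat()                # interpolated quaternions

# Illustrative use: a 90-degree rotation about z over 10 s, resampled at 1 Hz
t = np.array([0.0, 10.0])
q = Rotation.from_euler("z", [0.0, 90.0], degrees=True).as_quat()
print(resample_attitude(t, q, np.arange(0.0, 10.1, 1.0)))
```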

The study investigates the benefit of using preprocessed observation-based attitude, in contrast to a nominal yaw steering model, for the POD of the Jason satellites by Satellite Laser Ranging (SLR) over nearly two decades. Results show that the new algorithm improves the root mean square (RMS) of SLR observation residuals by 5.9%, 8.3% and 4.5% for Jason-1, Jason-2 and Jason-3, respectively, compared to the nominal attitude realization. The analysis of single-satellite crossover differences revealed a reduction of the mean of absolute differences by 6%, 15%, and 16%. Furthermore, it could be shown that artificial (non-geophysical) signals in station coordinate time series due to orbit modelling deficiencies are significantly reduced. The observation-based attitude data of Jason-1/-2/-3 can be made available on request.
 

Surface currents in polar oceans from satellite altimetry complemented by ocean modelling:
A new dataset for the northern Nordic Seas

[Geostrophic velocities of ocean currents in the northern Nordic Seas]

Deeper knowledge about geostrophic ocean surface currents in the northern Nordic Seas supports the understanding of ocean dynamics in this region characterized by rapidly changing environmental conditions. Monitoring the sea-ice-affected area by satellite altimetry results in fragmented and irregularly distributed data sampling and prevents the creation of homogeneous and highly resolved spatio-temporal datasets. In order to overcome this problem, an ocean model can be used to fill in data where altimetry observations are missing.

The joint study Geostrophic currents in the northern Nordic Seas from a combination of multi-mission satellite altimetry and ocean modeling (Earth System Science Data, 2019, doi: 10.5194/essd-11-1765-2019, [PDF]) of DGFI-TUM and the Alfred Wegener Institute (AWI) resulted in a novel dataset of geostrophic currents based on a combination of along-track satellite altimetry data and simulated differential water heights from the Finite Element Sea ice Ocean Model (FESOM). The combination approach is based on principal component analysis (PCA) and links the temporal variability of along-track ocean topography from satellite altimetry with the most dominant spatial patterns of the FESOM differential water heights. Annual variability and constant offsets were removed from both datasets before the combination; afterwards, those derived from altimetry were restored to the combined dataset. Surface currents were computed by applying the geostrophic flow equations to the combined topography. The final product is characterized by the spatial resolution of the ocean model (around 1 km) and the temporal variability of the dynamic ocean topography (DOT) heights derived from along-track altimetry. Comparisons to in-situ surface drifter observations demonstrate good agreement in spatial patterns, magnitude and flow direction. Mean differences of 0.004 m/s in the zonal and 0.02 m/s in the meridional component are observed. A direct pointwise comparison of the combined geostrophic velocity components interpolated to the drifter locations indicates that about 94% of all residuals are smaller than 0.15 m/s.
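A toy sketch of the PCA-based combination idea is given below: the leading spatial patterns (EOFs) are taken from the model fields, and their time-varying amplitudes are estimated from whatever altimetry observations are available at each epoch. Variable names and dimensions are illustrative; the published approach additionally handles annual signals, offsets and the geostrophic velocity computation.

```python
import numpy as np

def combine_altimetry_with_model(model_fields, alt_values, alt_grid_idx, n_modes=5):
    """Toy PCA-based combination: spatial patterns (EOFs) from the ocean model,
    time-varying amplitudes from the altimetry observations of each epoch.

    model_fields : model differential water heights, shape (n_model_epochs, n_gridpoints)
    alt_values   : list over altimetry epochs, each holding along-track values
    alt_grid_idx : list over altimetry epochs, grid-point index of each along-track value
    Returns the combined gridded fields, shape (n_altimetry_epochs, n_gridpoints).
    """
    mean_field = model_fields.mean(axis=0)
    anomalies = model_fields - mean_field
    _, _, vt = np.linalg.svd(anomalies, full_matrices=False)
    eofs = vt[:n_modes]                                # (n_modes, n_gridpoints)

    combined = []
    for values, idx in zip(alt_values, alt_grid_idx):
        E = eofs[:, idx].T                             # EOFs sampled at the track points
        coeff, *_ = np.linalg.lstsq(E, values - mean_field[idx], rcond=None)
        combined.append(mean_field + coeff @ eofs)     # reconstructed field for this epoch
    return np.array(combined)
```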

The dataset provides surface circulation information within the sea ice area and will support a deeper comprehension of ocean currents in the northern Nordic Seas between 1995 and 2012. Data are available at https://doi.org/10.1594/PANGAEA.900691.
 

High-resolution ionosphere maps for precise GNSS applications

[MSR of VTEC for the St. Patrick's Day storm (March 17, 2015, 19:00 UT), the largest solar event for more than 15 years. DGFI-TUM's high-resolution VTEC map (upper panel) contains finer signal structures than the moderate-resolution VTEC map (bottom panel).]

Ionospheric disturbances are the main error source in single-frequency precise point positioning. The International GNSS Service (IGS) and its Ionosphere Associated Analysis Centers (IAAC) routinely provide maps of the Vertical Total Electron Content (VTEC) of the ionosphere to correct for ionospheric influences. Since these maps are based on post-processed observations and final orbits, they are usually disseminated to the user with latencies of days to weeks. Precise dual- and multi-frequency GNSS applications, however, such as autonomous driving or precision farming, require high-resolution ionosphere products in (near) real-time (NRT).

To meet this requirement, scientists of DGFI-TUM developed a new procedure to create NRT high-resolution information about the state of the ionosphere based on raw observation data and ultra-rapid orbits. To additionally account for the inhomogeneous distribution of the GNSS data, DGFI-TUM's modelling approach is based on localizing basis functions (polynomial and trigonometric B-splines). In this way, in contrast to the aforementioned IAAC products, data gaps can be handled appropriately, and a multi-scale representation (MSR) allows for generating VTEC maps of higher or lower spectral resolution by applying a Kalman filter estimation procedure and the pyramid algorithm known from wavelet decomposition.

The realization of the MSR and the generation of the VTEC maps with different temporal and spectral resolutions (Figure) are presented in the article High-resolution vertical total electron content maps based on multi-scale B-spline representations (Annales Geophysicae, 2019, doi: 10.5194/angeo-37-699-2019, [PDF]). The latency of DGFI-TUM’s high-resolution product amounts to only 2-3 hours. A validation against the most prominent final and rapid products from the IAACs in Berne (CODE) and Barcelona (UPC) revealed a quality improvement of a few TECU, where a variation of 1 TECU [=10^16 electrons/m^2] corresponds to an error of around 16 centimeters in the distance between a GNSS satellite and a receiver on the Earth’s surface.
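The quoted rule of thumb follows from the well-known first-order ionospheric range delay, 40.3 · TEC / f², as the short sketch below illustrates for the GPS L1 frequency.

```python
def iono_delay_m(tec_el_per_m2, freq_hz):
    """First-order ionospheric range delay in meters: 40.3 * TEC / f^2
    (TEC in electrons/m^2, frequency in Hz)."""
    return 40.3 * tec_el_per_m2 / freq_hz**2

# 1 TECU = 1e16 electrons/m^2 on the GPS L1 frequency (1575.42 MHz)
print(iono_delay_m(1e16, 1575.42e6))   # ~0.16 m, i.e. the ~16 cm quoted above
```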
 

Enhancement of the Global Geodetic Observing System: Where to put the next SLR station?

[Image Credit: NASA]

Satellite Laser Ranging (SLR) is one of the four fundamental geodetic space techniques for the accurate determination of geodetic key parameters related to Terrestrial Reference Frames (TRFs), the Earth’s rotation and its gravity field. Those parameters provide the basis for precise geodetic reference frames and thus for geo-referencing and quantifying geodynamic processes and the effects of global change. SLR is a technique based on the measurement of the two-way travel time of laser pulses between stations on the Earth’s surface and satellites in various orbits. These measurements contribute, inter alia, to the determination of satellite orbits, the study of tectonic processes, and the realization of the Coordinated Universal Time (UTC) that is linked to variations of the Earth’s rotation.

However, the inhomogeneity of the current global SLR station network is one of the major obstacles to determining the geodetic parameters of interest under the ambitious millimeter-level accuracy requirements of IAG's Global Geodetic Observing System (GGOS). But the installation of an additional SLR station is not only a geographical but also a financial decision and thus requires a detailed pre-investigation of its potential benefit.

A recent study at DGFI-TUM aimed to determine the locations where new SLR stations would be most valuable. In a simulation, the existing SLR network was amended by one additional SLR station, and the required geodetic parameters were estimated. This simulation was performed repeatedly, whereby the additional station was placed at 42 different locations distributed homogeneously over the globe. The results showed that priority should be given to an additional station in the Antarctic. It would significantly improve the observation geometry and thus the quality of orbits, TRF and Earth rotation parameters. The latter would also benefit from an additional station in the vicinity of the equator, and the datum realization of TRFs would improve with an additional station in the Atlantic and Pacific Ocean regions. Details and results are provided in the open access article Future TRFs and GGOS - where to put the next SLR station? (Advances in Geosciences, 2019, doi: 10.5194/adgeo-50-17-2019, [PDF]).
 

Long-term measurements document sea level rise in the Arctic

[Sea level change in the Arctic Ocean: The map illustrates the strong regional differences.]

Tracking down climate change with radar eyes: Over the past 22 years, the sea level in the Arctic Ocean has risen an average of 2.2 millimeters per year. This is the conclusion of an investigation performed jointly by DTU Space and DGFI-TUM as part of ESA's Sea Level Climate Change Initiative (CCI) project.

The most complete and precise overview of the sea level changes in the Arctic Ocean to date was obtained after evaluating 1.5 billion radar measurements of various altimetry satellites. A major challenge for a comprehensive analysis is the presence of sea ice which covers vast areas of the Arctic Ocean and obscures the ocean surface underneath. Applying DGFI-TUM's dedicated retracking algorithm ALES+ to ENVISAT and ERS-2 original measurements, radar echoes reflected even from small water openings in the ice could be identified and analysed. After harmonizing observation data from ice-covered and open water areas, maps of monthly sea level elevations were computed for 1996-2018.

Analysis of the long-term measurements revealed significant regional differences in sea level trends: within the Beaufort Gyre north of Greenland, Canada and Alaska, the sea level rose at twice the average rate. Low-salinity meltwater collects here, while a steady east wind produces currents that prevent the meltwater from mixing with other ocean currents. Along the coast of Greenland, on the other hand, the sea level is falling, on the west coast by more than 5 mm per year; here, the melting glaciers weaken the gravitational attraction. More information about the study can be found in the open access article Arctic Ocean Sea Level Record from the Complete Radar Altimetry Era: 1991–2018 (Remote Sensing, 2019, DOI: 10.3390/rs11141672, [PDF]). The results are also the subject of a current TUM press release (English, German).

Time-variable surface areas of lakes and reservoirs monitored from space

[Water probability mask of Pires Ferreira Reservoir, Brazil (DAHITI-ID 8671)]

Continuous monitoring of lakes, rivers and reservoirs is of great importance for various hydrological, societal and economic questions. In many regions, hydrological parameters of inland water bodies, such as water level, surface extent, volume and discharge and their temporal changes are directly related to security-relevant aspects. Among those are water supply, water management and the protection of population and infrastructure, in particular in the view of global change, where the growth of population and more frequent extreme weather situations pose various challenges. The availability of up-to-date observation data is essential for the measurement of flooded regions and for the quantification of water storage in lakes and reservoirs.

Scientists of DGFI-TUM developed a new approach for the automated extraction of high-resolution time-variable water surface masks of lakes and reservoirs from optical satellite images, using observations from Landsat and Sentinel-2 between 1984 and 2018. The algorithm first extracts land-water masks from the images by combining five different water indices and applying an automated threshold computation. These monthly masks are used to compute a long-term water probability mask, which is then applied to fill data gaps caused by voids, clouds, cloud shadows or snow. Iteratively, all data gaps in all monthly masks are filled, which leads to a gap-free surface area time series. The resulting surface area changes were compared with water level time series from gauging stations, or, if in-situ data was not available, water level time series from satellite altimetry. Overall, 32 globally distributed lakes and reservoirs of different extents up to 2480 km² were investigated. Filling of the data gaps improved the average correlation coefficients between the time series of surface area and water levels from 0.61 to 0.86. This means an improvement of 41%. A cross-validation showed RMS errors smaller than 40 km² for all study areas, corresponding to relative errors below 8%. This demonstrates the strong impact of a reliable gap-filling approach and the quality enhancement of the land-water masks resulting from the new algorithm. Details are provided in the publication Automated Extraction of Consistent Time-Variable Water Surfaces of Lakes and Reservoirs Based on Landsat and Sentinel-2 (Remote Sensing, 2019, DOI: 10.3390/rs11091010, [PDF]). All presented surface area time series are freely available via DGFI-TUM’s Database of Hydrological Time Series of Inland (DAHITI).
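A toy version of the masking step is sketched below: a single water index with an automatic (Otsu) threshold yields a monthly land-water mask, and averaging a stack of such masks (with NaN marking clouds, cloud shadows or snow) gives a long-term water probability that can be used for gap filling. The published algorithm combines five different water indices and an automated threshold computation; band names and inputs here are illustrative reflectance arrays.

```python
import numpy as np
from skimage.filters import threshold_otsu

def monthly_water_mask(green, nir):
    """Water mask from one optical scene using a single water index (NDWI) and an
    automatic Otsu threshold; the published algorithm combines five indices."""
    ndwi = (green - nir) / np.clip(green + nir, 1e-6, None)
    return (ndwi > threshold_otsu(ndwi)).astype(float)      # 1 = water, 0 = land

def water_probability(monthly_masks):
    """Long-term water probability per pixel from a stack of monthly masks;
    NaN in a mask marks data gaps (clouds, cloud shadows, snow)."""
    return np.nanmean(np.array(monthly_masks, dtype=float), axis=0)
```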
 

Reference Systems – the Backbone for Positioning, Navigation and Earth System Research

[Image Credit: ESO/A.Santerne [CC BY 4.0]]

The fundamental role of reference systems in space geodesy was expressed already 30 years ago. In a famous article, Kovalevsky [1989] states: "The Earth, its environment and the celestial bodies in the universe are not static: they move, rotate and undergo deformations. Motions and positions are not absolute concepts and can be described only with respect to some reference."

In the last decades, the measurement accuracy of modern space geodetic techniques has improved by orders of magnitude. This development requires that today's realizations of reference systems be accurate at the millimeter level on a global scale. Such accuracy is required for precise positioning and navigation on Earth and in space, and it is a prerequisite for the long-term monitoring and quantification of dynamic processes in the Earth system, such as seismic deformations, post-glacial uplift, or global and regional sea level changes.

Triggered by the need for a precise reference on Earth and in space, the DFG research unit “Space-Time Reference Systems for Monitoring Global Change and for Precise Navigation in Space” (FOR1503) aims at developing integrative methods and procedures for a consistent definition and realization of geodetic reference systems, as well as at accomplishing the computations for their establishment and maintenance. This international consortium of scientists has now published a 10-article special issue of the Journal of Geodesy. Scientists from DGFI-TUM were involved in two of the published studies:

The article Consistent estimation of geodetic parameters from SLR satellite constellation measurements (Journal of Geodesy, 2018, DOI: 10.1007/s00190-018-1166-7) describes the scientific exploitation of satellite laser ranging (SLR) observations to up to 11 satellites with various altitudes and orbit inclinations. The observations are used for the joint estimation of reference frame parameters, satellite orbits, gravity field changes and variations of Earth rotation. In contrast to the standard 4-satellite constellation currently used by the International Laser Ranging Service (ILRS), the extended constellation allows for a significant reduction of correlations between the respective parameters and thus for a higher precision.

One of the key goals of the research unit, the first simultaneous and consistent realization of the global Terrestrial Reference Frame (TRF), the Celestial Reference Frame (CRF) and the Earth orientation parameters (EOP) from the space-geodetic observing techniques VLBI, SLR, GNSS and DORIS, is described in the article Consistent realization of Celestial and Terrestrial Reference Frames (Journal of Geodesy, 2018, DOI: 10.1007/s00190-018-1130-6, [PDF]). For the first time, this study realized the IUGG Resolution R3 (2011), which urges that highest consistency between TRF, CRF and EOP should be a primary goal of all future realizations.
 

River levels tracked from space

[Distribution and number of altimetry observations in the Mekong river system applied in the universal kriging approach.]

The 4,300-kilometer Mekong River is a lifeline for South-East Asia. If this mighty river system bursts its banks, flooding can affect the lives and livelihoods of millions of people. Permanent monitoring of the river's water stage is thus essential. Using the example of the Mekong River with its pronounced changes in water level, an innovative method to monitor complex river basins solely based on satellite data has been developed in a collaboration between TUM scientists from geodesy and mathematics. The approach allows for modelling how water levels along various sections of the river are impacted by extreme weather events such as heavy rainfall or drought over extended periods.

The approach uses measurement data collected from various altimetry satellite missions. In a first step, their raw observations are analysed by applying specially developed retracking algorithms in order to create precise time series of water levels for the crossing points of the satellites' tracks with the river. Altimetry satellites on repetitive orbits usually pass over the same points on a repeating cycle of 10 to 35 days. As a result, water level data are captured for each of these points at regular intervals. The study also integrates observations collected by Cryosat-2, a SAR altimetry satellite on a long-repeat orbit. The SAR altimetry method is superior to conventional systems in terms of accuracy, and the long-repeat orbit results in a very dense spatial coverage of the observations, but at the same time the temporal resolution of Cryosat-2 data is very low. Thus, the points observed by Cryosat-2 are well distributed throughout the entire river system, but each of them is only measured once or twice.

The flow patterns of the river, with its complex network of tributaries, are modelled using the statistical method known as universal kriging. The model allows for linking satellite data from different altimetry missions, including Cryosat-2, and makes it possible to extrapolate water levels observed at certain points to determine the levels at almost any location in the entire river system. It was demonstrated that including the precise and densely distributed SAR measurements in the model greatly improved the quality of the results. Details on the study including data processing and results are presented in the publication Observing water level extremes in the Mekong River Basin: The benefit of long-repeat orbit missions in a multi-mission satellite altimetry approach (Journal of Hydrology, 2019, DOI: 10.1016/j.jhydrol.2018.12.041). The study is also subject of a current TUM press release (English, German).
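For illustration, a minimal universal-kriging predictor is sketched below, with an exponential covariance and a simple linear spatial drift; parameter values and coordinates are placeholders. The published model for the Mekong uses a considerably more elaborate spatio-temporal formulation.

```python
import numpy as np

def universal_kriging(coords, values, target, sill=1.0, corr_length=50.0):
    """Minimal universal-kriging sketch with an exponential covariance and a
    linear spatial drift.

    coords : observation locations, shape (n, 2) (e.g. along-river coordinates in km)
    values : observed water-level anomalies at those locations
    target : location at which to predict, shape (2,)
    """
    def cov(d):
        return sill * np.exp(-d / corr_length)

    n = len(values)
    d_obs = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    C = cov(d_obs)                                      # observation covariance matrix
    F = np.column_stack([np.ones(n), coords])           # drift: constant + linear in x, y

    # Kriging system: [[C, F], [F.T, 0]] [weights, mu] = [c0, f0]
    K = np.block([[C, F], [F.T, np.zeros((F.shape[1], F.shape[1]))]])
    c0 = cov(np.linalg.norm(coords - target, axis=1))
    f0 = np.concatenate([[1.0], target])
    sol = np.linalg.solve(K, np.concatenate([c0, f0]))
    return sol[:n] @ values                              # predicted value at the target
```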
 

Coastal altimetry reveals wind-induced cross-strait sea level variability in the Strait of Gibraltar

[Oceanic internal waves in the Strait of Gibraltar (ERS-2 SAR image; Credit: ESA).]

The Strait of Gibraltar is the only gateway between the Mediterranean Sea and the Atlantic Ocean. Both seas have very different characteristics in terms of temperature, salinity and nutrients. Water exchange that takes place here may have consequences much farther away, for example contributing to the high salinity of the Nordic Seas, a key area for deep water formation. Due to the prominent role of the ocean as a climate regulator, understanding the dynamics of this water exchange is essential to understand the climate in the Mediterranean as well as important features of the global ocean circulation.

In the recent study Wind-induced cross-strait sea level variability in the Strait of Gibraltar from coastal altimetry and in-situ measurements (Remote Sensing of Environment, 2018, DOI: 10.1016/j.rse.2018.11.042), a group of scientists led by J. Gómez-Enri from the University of Cadiz has used satellite altimetry and model data to monitor the water level on the two sides of the strait. Differences between the two sides are a way to monitor the surface water flow out of and into the Mediterranean Sea. DGFI-TUM contributed to this work by recomputing a mean sea state using several years of dedicated reprocessed coastal data from satellite altimetry. These data allowed the direction of the surface currents in the strait to be linked to the wind regime. It is observed that specific wind events are able to reverse the mean circulation (which normally drives surface waters out of the Mediterranean) and therefore weaken the net Atlantic water inflow toward the Mediterranean Sea.

The study is a paramount example of the value of innovative coastal sea level data from satellites for improving the knowledge of ocean dynamics in areas where previously only sparse in-situ data could offer a localised view. It is also an example of the need for coastal oceanography to evolve as a synergy of different remote sensing, model and in-situ data.
 

Improving the precision of sea level data

[Noise of the sea level anomalies for different significant wave heights in the Mediterranean Sea using different sea state bias (SSB) corrections.]

Precise information on sea level changes is required for various questions of societal, economic and scientific relevance. DGFI-TUM researchers have produced a new correction to sea level data measured from space which improves their precision by about 30%.

For more than two decades, scientists have been using radar antennas (altimeters) installed on satellites orbiting the Earth. The altimeters send a pulse of energy that expands onto an area of the ocean surface that can be several kilometers wide. The illuminated area is influenced by wind and waves (the so-called sea state), which interact with the radar signal sent by the satellite. Moreover, the algorithms used to analyse the reflection from the ocean surface estimate sea level, waves and wind at the same time, causing interconnections between the estimation errors. This generates the need for a correction to the measured sea level, called the Sea State Bias.

A recent study demonstrates a strong improvement in the precision of sea level data by estimating a new sea state bias correction. The innovative approach applies a data correction to measurements recorded every 300 m ("high-frequency” measurements), while in the past this correction was generated for 7 km averages (“low-frequency"). The study focused on two oceanic regions: the North Sea and the Mediterranean Sea. In order to verify the new correction, sea level measurements from nearby locations were compared against each other. It was shown that the noise of the high-frequency measurements is lower by 30%: While sea level anomalies (i.e. deviations of the sea level from a mean state that are typically up to 2 m) could previously be measured with a precision of 8 cm, this can be now done with a precision of less than 6 cm. This is very valuable in particular for oceanographic applications, such as for example the study of ocean currents, whose determination is strongly affected by the noise in the data provided by satellites.

The study was partially funded by the European Space Agency’s Sea Level Climate Change Initiative and is presented in the publication Improving the precision of sea level data from satellite altimetry with high-frequency and regional sea state bias corrections (Remote Sensing of Environment, 2018, DOI: 10.1016/j.rse.2018.09.007).
 

The Alps in motion

[Horizontal strain field derived from the GPS data: Red areas indicate compression, blue indicates lateral spreading. Image: DGFI-TUM]

The Alps are on the go: The mountain range drifts northwards an average of one-half millimeter every year and rises 1.8 millimeters. This is the result of a recent study for which scientists of DGFI-TUM analyzed 12 years of measurements from more than 300 GPS stations distributed over the entire chain of the Alps and its foreland. The scientists identified the positions of the GPS stations, accurate down to fractions of a millimeter. A large number of the stations were set up in the EU project ALPS-GPSQUAKENET and are in part operated by DGFI-TUM itself.

The greatest challenge was the homogeneous processing of one-half million observed data items. The measurements are impaired by several interference factors, e.g. atmospheric signal delay, that have to be detected and corrected. The study used the corrected measured values to create a computer model that illustrates horizontal and vertical shifts as well as lateral spreading and compression over the entire Alpine region at a resolution of 25 kilometers.
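To illustrate what such a deformation model encodes, the sketch below derives dilatation (compression vs. lateral spreading) and maximum shear rates from a gridded horizontal velocity field. This is only a conceptual illustration; the actual study estimates the strain field within a rigorous geodetic adjustment of the GPS station velocities.

```python
import numpy as np

def horizontal_strain_rates(vx, vy, spacing_km=25.0):
    """Horizontal strain rates from a gridded velocity field.

    vx, vy     : eastward and northward velocities [mm/yr] on a regular grid
    spacing_km : grid spacing [km]
    Returns the dilatation rate (negative = compression, positive = lateral
    spreading) and the maximum shear rate, both in 1/yr.
    """
    dy = dx = spacing_km * 1e6                   # grid spacing in mm, so rates come out in 1/yr
    dvx_dy, dvx_dx = np.gradient(vx, dy, dx)     # velocity gradients (axis 0 = north, axis 1 = east)
    dvy_dy, dvy_dx = np.gradient(vy, dy, dx)
    eps_xx, eps_yy = dvx_dx, dvy_dy
    eps_xy = 0.5 * (dvx_dy + dvy_dx)
    dilatation = eps_xx + eps_yy
    max_shear = np.sqrt(0.25 * (eps_xx - eps_yy) ** 2 + eps_xy ** 2)
    return dilatation, max_shear
```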

The model clearly depicts both large-scale patterns of movement and regional peculiarities: each year the Alps grow an average of 1.8 millimeters in height and move to the north-east at a speed of up to 1.3 millimeters. In South and East Tyrol, however, a rotation towards the east is superimposed on this movement, while at the same time the mountain range is being compressed. The rise in height is not identical everywhere either: very small in the southern part of the western Alps, it reaches its maximum of more than 2 millimeters per year in the central Alps, at the boundaries of Austria, Switzerland and Italy. These changes in the Earth's surface serve as the basis for inferences regarding plate tectonics below the surface. The research was conducted in collaboration with the Geodesy and Glaciology project of the Bavarian Academy of Sciences and Humanities. Details on the data processing and the results of the study are presented in the open-access publication Present-day surface deformation of the Alpine Region inferred from geodetic techniques (Earth System Science Data, 2018, DOI: 10.5194/essd-10-1503-2018). The study is also the subject of a current TUM press release (English, German).
 

DGFI-TUM contributes to the implementation of a UN Resolution for a Global Geodetic Reference Frame

[A highly precise and long-term stable global geodetic reference frame is an indispensable requirement for a reliable determination of global sea level rise over many decades.]

In February 2015, the UN General Assembly adopted its first geospatial resolution "A Global Geodetic Reference Frame for Sustainable Development". This resolution recognizes the importance of geodesy for many societal and economic benefit areas, including navigation and transport, construction and monitoring of infrastructure, process control, surveying and mapping, and the growing demand for precisely observing our planet's changes in space and time. The resolution stresses the significance of the global reference frame for accomplishing these tasks, for natural disaster management, and for providing reliable information to decision-makers.

The United Nations Global Geospatial Information Management (UN-GGIM) Working Group on the Global Geodetic Reference Frame (GGRF) is tasked with drafting a roadmap for the enhancement of the GGRF under UN mandate.

Based on its competence in the realization of reference frames, DGFI-TUM is involved in this activity by contributing to the compilation of a concept paper within the framework of the International Association of Geodesy (IAG). The main purpose of this paper is to provide a common understanding of the definition of the GGRF and the scientific basis for the preparation of the roadmap to be accomplished by the UN-GGIM Working Group on the GGRF. [more]
