Recent news:


Monitoring thin ice and open water in the Arctic Ocean with CryoSat-2 SAR altimetry

[Classified CryoSat-2 observations versus HH-polarized Sentinel-1A image (Laptev Sea, February 2018): Detection of thin-ice (orange), sea ice (yellow) and leads (cyan).]

Dwindling sea ice and an increasing number of open water areas have a significant impact on sea ice dynamics in the Arctic Ocean and the energy exchange between ocean and atmosphere. Areas of open water (polynyas) and fractures in the sea ice (leads) are not permanently open, but partially frozen over and covered by a thin layer of ice up to about 25 cm thick. The surface temperature of this so-called thin ice lies between that of open water and that of the thicker sea ice, affecting the heat flux between ocean and atmosphere. This must be taken into account in climate models and predictions.

DGFI-TUM and the Alfred Wegener Institute (AWI) have jointly developed an unsupervised classification of altimeter radar echoes from ESA's CryoSat-2 for thin ice layer detection. The classification results were compared and validated with thin ice thickness derived from MODIS thermal imagery and with radar images from ESA's Sentinel-1 Copernicus mission.
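As a rough illustration of how an unsupervised classification can separate surface types from radar echoes, the sketch below clusters two waveform features with a plain two-cluster k-means. The feature values, distributions, and initialization are invented for illustration; this is not the published CryoSat-2 classifier.

```python
import numpy as np

def kmeans(X, iters=20):
    """Plain two-cluster k-means with a deterministic farthest-point
    initialization (illustrative only, not the published method)."""
    c0 = X[0]
    c1 = X[np.argmax(((X - c0) ** 2).sum(axis=1))]
    centers = np.stack([c0, c1]).astype(float)
    for _ in range(iters):
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d.argmin(axis=1)
        for j in range(2):
            centers[j] = X[labels == j].mean(axis=0)
    return labels, centers

# Hypothetical waveform features: leads produce peaky, specular echoes
# (high pulse peakiness, low stack standard deviation), while sea ice
# returns diffuse echoes. Both distributions are invented.
rng = np.random.default_rng(1)
leads = rng.normal([30.0, 2.0], [3.0, 0.5], size=(50, 2))
ice = rng.normal([5.0, 10.0], [1.0, 1.5], size=(50, 2))
X = np.vstack([leads, ice])
labels, centers = kmeans(X)
```

In the real classification, additional echo features and classes (sea ice, thin ice, leads) are involved, but the principle of grouping echoes without training labels is the same.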

The results demonstrate how monitoring of the polar oceans can be improved. They contribute to the knowledge of the time-varying Arctic ice cover, particularly by monitoring the overall sea ice thickness distribution. Moreover, the CryoSat-2 classification supports the development of improved waveform retracking algorithms that provide more reliable estimates of sea ice freeboard or sea level in the polar oceans. Details are provided in the article Monitoring Arctic thin ice: a comparison between CryoSat-2 SAR altimetry data and MODIS thermal-infrared imagery (The Cryosphere, 2023, DOI: 10.5194/tc-17-809-2023, [PDF]).

Reached a milestone: DAHITI provides hydrological data for 10,000 targets worldwide

The Database for Hydrological Time Series of Inland Waters (DAHITI) of DGFI-TUM now provides free hydrological information for 10,000 inland waters distributed worldwide. For rivers, lakes, reservoirs and wetlands, it provides water levels as well as surface area and volume changes based on satellite data in near-real-time.

With several thousand registered users and more than two hundred thousand downloads in recent years, DAHITI is a widely used data source for numerous applications in science and practice. The Global Climate Observing System (GCOS) lists the database as an openly accessible data source for the Essential Climate Variable (ECV) "Lakes". To date, DAHITI provides water level and water extent for more than 1400 lakes and reservoirs, continuously and automatically updated with the latest satellite information.

Station-dependent satellite laser ranging (SLR) correction improves the orbit of the TOPEX/Poseidon altimetry satellite

[Geodetic Observatory Wettzell, SLR telescope (Image: Hessels)]

TOPEX/Poseidon (T/P) was one of the first major altimetry satellite missions. It was operational between 1992 and 2006 and serves as the reference mission for its successors in the Jason and Sentinel-6 series. As such, it is of great importance for studies of global and regional sea level change, ocean circulation, and climate phenomena such as El Niño. With a diameter of over 160 cm, its Laser Retroreflector Array (LRA), the on-board target for Satellite Laser Ranging (SLR) measurements, is not ideally designed for centimeter orbit accuracies. The resulting large phase center fluctuations are a major limiting factor for precise orbit determination of T/P and thus for ocean surface surveys.

Scientists of DGFI-TUM have developed a correction function for SLR observations that resolves LRA-related phase center variations as well as effects that depend on satellite and observing station, such as range biases. The function uses the viewing angles of the observation to determine a correction value that is added to the SLR distance measurement. Since the function is continuous, interpolation between tabulated values is not required and interpolation errors are avoided. The correction reduces the root mean square (RMS) of the SLR observation residuals (observed - computed) for the entire T/P mission from 33.78 cm to 1.97 cm (1.59 cm for SLR core stations). Details can be found in the article Station-dependent satellite laser ranging measurement corrections for TOPEX/Poseidon (Advances in Space Research, 2023, DOI: 10.1016/j.asr.2022.09.002). The station-dependent correction parameters are published as a supplement to the article, so that the measurement correction can be implemented in any precise orbit determination software.
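The principle of an angle-dependent range correction can be sketched as follows. The low-order harmonic form and the coefficients below are hypothetical stand-ins for the published correction function and its station-dependent parameters:

```python
import math

def slr_range_correction(elevation_deg, azimuth_deg, coeffs):
    """Continuous, station-dependent range correction [m] evaluated from
    the viewing angles of an SLR observation. The functional form and
    coefficients are hypothetical, not the published function."""
    e = math.radians(elevation_deg)
    a = math.radians(azimuth_deg)
    c0, c1, c2, c3 = coeffs
    return c0 + c1 * math.sin(e) + c2 * math.cos(a) + c3 * math.sin(a)

coeffs = (0.012, -0.008, 0.002, 0.001)   # hypothetical station parameters [m]
measured_range = 1342567.831             # raw SLR range measurement [m]
corrected = measured_range + slr_range_correction(35.0, 120.0, coeffs)
```

Because the correction is a continuous function of the viewing angles, it can be evaluated at any observation geometry without interpolating between tabulated values.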

Novel multi-resolution representation scheme for regional gravity field refinement

[Differences (in terms of disturbing potential) between the calculated gravity model and validation data, delivered by the single-level approach (top) and the MRR based on the pyramid algorithm (bottom)]

The optimal combination of different observation types is key to obtaining high-resolution and high-precision regional gravity field models. Existing literature suggests that single-level models based on spherical radial basis functions (SRBFs) may be biased toward high-resolution measurements. To avoid this effect, a spectral combination by means of a multi-resolution representation (MRR) decomposes the gravity signal into an expansion in spherical harmonics for the global long-wavelength part and a number of detail signals in the form of wavelet functions for the regional medium- and high-frequency parts.

The article Combination of different observation types through a multi-resolution representation of the regional gravity field using the pyramid algorithm and parameter estimation (Journal of Geodesy, 2022, DOI: 10.1007/s00190-022-01670-5, [PDF]) presents an innovative MRR method based on the pyramid algorithm and sequential parameter estimation. Instead of estimating the coefficients of the detail signals at each resolution level independently, the pyramid algorithm connects the different levels and estimates the coefficients sequentially. Numerical investigations based on simulated and real gravity data demonstrate the ability of the MRR scheme based on the pyramid algorithm to adequately capture gravity information not only from high-resolution terrestrial data, but also from medium- to low-resolution measurements. It actually provides a model accuracy improvement of more than 30% in terms of RMS error over validation data, compared to the single-level approach.
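The core idea of the sequential, level-connecting estimation can be illustrated in one dimension: a coarse basis is fitted first, and a finer basis is then fitted to the residual only. Gaussian bumps stand in for SRBFs and wavelets, and all numbers are illustrative:

```python
import numpy as np

def gaussian_basis(x, centers, width):
    """Design matrix of 1-D Gaussian bumps, standing in for SRBFs."""
    return np.exp(-((x[:, None] - centers[None, :]) / width) ** 2)

x = np.linspace(0.0, 1.0, 200)
signal = np.sin(2 * np.pi * x) + 0.2 * np.sin(12 * np.pi * x)  # toy signal

# Level 1 (coarse): a wide basis captures the long-wavelength part.
A1 = gaussian_basis(x, np.linspace(0.0, 1.0, 5), width=0.3)
c1, *_ = np.linalg.lstsq(A1, signal, rcond=None)
residual = signal - A1 @ c1

# Level 2 (fine): a narrow basis is fitted to the residual only, so the
# levels are estimated sequentially rather than independently.
A2 = gaussian_basis(x, np.linspace(0.0, 1.0, 25), width=0.05)
c2, *_ = np.linalg.lstsq(A2, residual, rcond=None)

model = A1 @ c1 + A2 @ c2
rms = float(np.sqrt(np.mean((signal - model) ** 2)))
```

The actual pyramid algorithm additionally low-pass filters between levels and propagates full covariance information, but the sequential residual fitting shown here is the structural idea.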

River slopes determined from spaceborne lidar onboard ICESat-2

Knowledge of the water surface slope (WSS) is essential for estimating the discharge and flow velocity of rivers. These parameters are among the Essential Climate Variables (ECVs) as defined by the Global Climate Observing System (GCOS). They critically contribute to the characterization of Earth’s climate, and their determination on a global scale is thus of great scientific relevance.

Field surveys of WSS are, however, very costly, while remote sensing approaches are limited by several factors. Wide-swath SAR interferometric measurements, for example, are relatively inaccurate, while the precise point measurements of radar altimetry lack simultaneous observations over short distances. This obstructs the observation of a WSS that varies greatly in space and time. In contrast, the unique measurement geometry of ICESat-2 with six parallel laser beams enables instantaneous, highly accurate WSS observations. A new approach uses two different methods, depending on the intersection angle between the satellite orbit and the river: If multiple beams cross a river reach nearly perpendicularly, the WSS between the crossings can be calculated with the across-track approach. Otherwise, if satellite orbit and river are nearly parallel, the along-track approach derives the WSS directly from the continuous water level observations along a single crossing beam. The method can be applied globally, and the long-repeat orbit pattern of ICESat-2 allows continuous WSS monitoring, revealing details of the highly varying WSS with little effort.
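In the across-track case the computation reduces to a water level difference over an along-river distance. A minimal sketch, with hypothetical water levels and reach length:

```python
def water_surface_slope(h_upstream_m, h_downstream_m, distance_m):
    """Across-track WSS: water level drop between two beam crossings
    divided by the along-river distance, in the common unit mm/km."""
    return (h_upstream_m - h_downstream_m) / distance_m * 1e6

# Hypothetical example: two ICESat-2 beams cross a river reach 3.3 km
# apart and observe water levels of 104.215 m and 104.182 m.
slope = water_surface_slope(104.215, 104.182, 3300.0)   # ≈ 10 mm/km
```

The along-track approach instead fits a slope to the continuous water level profile along one beam; the unit and interpretation are the same.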

WSS is not only relevant for the derivation of river discharge, but also as a correction for the ground track variability of altimetry satellites. In this way, errors at regularly observed sites (so-called virtual stations) can be reduced by up to 30 cm or 66 %. Details of the study are presented in the article ICESat‐2 based River Surface Slope and its Impact on Water Level Time Series from Satellite Altimetry (Water Resources Research, 2022, DOI: 10.1029/2022WR032842).

Machine learning techniques for Vertical Total Electron Content forecasting

[Solar missions ACE, DSCOVR, SOHO in Lagrange points L1/L2, and GPS satellites. Diagrams compare 1 h forecasted VTEC from a VR model at three locations along the 10°E meridian (green) with corresponding ground truth VTEC maps (GIMs of CODE; orange).]

Space weather is considered the greatest risk to global navigation satellite systems (GNSS). High-precision GNSS applications, such as positioning and navigation, require advanced forecast methods for the effects of space weather and the ionosphere on GNSS. Impacts are difficult to model adequately using conventional mathematical approaches since the relationships are often non-linear.

DGFI-TUM and ETH Zurich jointly developed a novel forecast model for the ionospheric vertical total electron content (VTEC) using the decision tree-based machine learning techniques Random Forest, Adaptive Boosting (AdaBoost) and eXtreme Gradient Boosting (XGBoost). A novelty of the approach is that an ensemble meta-estimator, a so-called Voting Regressor (VR) model, combines the predictions of several powerful machine learning techniques to create a final VTEC forecast with higher accuracy and better generalization to data outside the training set. The VR model forecasts VTEC 1 hour and 24 hours ahead, for both calm and stormy conditions. In addition, the relative importance of individual input features for VTEC forecasting was estimated to reveal which input data the model learns from particularly well. In this way, the influence of a predictor on the target variable can be quantified and the physical understanding improved. Details of the study are given in the article Ensemble Machine Learning of Random Forest, AdaBoost and XGBoost for Vertical Total Electron Content Forecasting (Remote Sensing, 2022, DOI: 10.3390/rs14153547, [PDF]).
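The role of the Voting Regressor, averaging the predictions of several fitted base models, can be sketched with two trivial stand-in learners (the study itself uses Random Forest, AdaBoost and XGBoost as base models):

```python
class VotingRegressor:
    """Minimal voting ensemble: the final prediction is the unweighted
    average of the base models' predictions."""
    def __init__(self, models):
        self.models = models
    def fit(self, x, y):
        for m in self.models:
            m.fit(x, y)
        return self
    def predict(self, x):
        preds = [m.predict(x) for m in self.models]
        return [sum(p) / len(p) for p in zip(*preds)]

class MeanModel:
    """Trivial base learner: predicts the training mean."""
    def fit(self, x, y): self.mean = sum(y) / len(y)
    def predict(self, x): return [self.mean] * len(x)

class LinearModel:
    """Trivial base learner: ordinary least-squares line."""
    def fit(self, x, y):
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        self.b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
                  / sum((xi - mx) ** 2 for xi in x))
        self.a = my - self.b * mx
    def predict(self, x): return [self.a + self.b * xi for xi in x]

x = [0.0, 1.0, 2.0, 3.0]
y = [10.0, 12.0, 14.0, 16.0]   # toy VTEC-like series [TECU]
vr = VotingRegressor([MeanModel(), LinearModel()]).fit(x, y)
forecast = vr.predict([4.0])   # average of 13.0 (mean) and 18.0 (line)
```

With more capable, diverse base learners, this averaging tends to reduce variance and improve generalization, which is the motivation for the ensemble meta-estimator in the study.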

DiscoTimeS: New approach to detect change points in GNSS, satellite altimetry, tide gauge and other geophysical time series

[Bayesian model fit for SATTG (top) and GNSS (bottom) time series at Kujiranami (Japan). Orange: Observed height changes [m]. Blue lines: Best-fit piecewise trend. Blue shading displays the two-sigma confidence intervals (CI) of the fit. Detected discontinuities are indicated by dashed vertical lines.]

Precise knowledge of coastal vertical land motion (VLM) is essential for assessing the impact of sea level rise on coastal regions and its consequences for local populations. It links relative sea level, measured by tide gauges with a fixed connection to land, with absolute sea level, measured by satellite altimetry. VLM can be determined pointwise using Global Navigation Satellite Systems (GNSS) or from the combination of tide gauge measurements with absolute sea level changes measured by satellite altimetry.

Among the largest sources of uncertainty in the determination of VLM are discontinuities and trend changes in the observed time series. Discontinuities are often caused by instrumental changes, especially in GNSS. Trend changes often have seismic causes, but can also be due to long-term changes in surface loading or to local effects. Although these issues have been addressed extensively for GNSS data analysis, there is limited knowledge of how to directly detect and mitigate such events when determining VLM from the difference of altimetry and tide gauge (SATTG).

The novel Bayesian approach DiscoTimeS automatically and simultaneously detects discontinuities and trend changes, for the first time not only for GNSS time series but also for VLM derived from SATTG. The study shows that accounting for time-varying VLM significantly increases the agreement of SATTG with GNSS measurements (by 0.36 mm/year on average for 339 globally distributed station pairs). Bayesian change point detection is applied to 606 SATTG and 381 GNSS time series. One of the main results of this work is that the determination of time-varying VLM now makes it possible to avoid extrapolation errors in coastal VLM and sea level change projections. Details of the study are presented in the article Bayesian modeling of piecewise trends and discontinuities to improve the estimation of coastal vertical land motion (Journal of Geodesy, 2022, DOI: 10.1007/s00190-022-01645-6, [PDF]).
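A deliberately simplified version of change point detection, a least-squares grid search for a single offset in a linear trend rather than the full Bayesian DiscoTimeS model, can be sketched as:

```python
import numpy as np

def detect_discontinuity(t, y):
    """Grid search for the epoch of a single offset in an
    'intercept + trend + step' model. A simplified stand-in for the
    Bayesian estimator: one change point, no trend change, no MCMC."""
    best_rss, best_k = np.inf, None
    for k in range(2, len(t) - 2):
        step = (np.arange(len(t)) >= k).astype(float)
        A = np.column_stack([np.ones_like(t), t, step])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        rss = float(np.sum((y - A @ coef) ** 2))
        if rss < best_rss:
            best_rss, best_k = rss, k
    return best_k

# Synthetic height series [m]: 2 mm/yr trend, 3 mm noise, 5 cm jump.
rng = np.random.default_rng(1)
t = np.arange(100, dtype=float)             # epochs [years]
y = 0.002 * t + rng.normal(0.0, 0.003, t.size)
y[60:] += 0.05                              # e.g. an instrument change
k = detect_discontinuity(t, y)
```

The Bayesian approach additionally estimates the number of change points, piecewise trends, and full uncertainty intervals, which a grid search of this kind cannot provide.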

Assessment of non-tidal loading data for DGFI-TUM’s upcoming ITRS 2020 realization DTRF2020

[Contributions to geocenter motion computed from NTL site displacements of ESMGFZ (blue) and GCTI20 (red). Geocenter motion estimated by SLR is shown in grey. Time-series for GCTI20 and SLR have been shifted by 10 and 20 mm, respectively.]

Site displacements caused by non-tidal loading (NTL) are among the major limiting factors for the accuracy of today's reference frames. On the occasion of the current 2020 realization of the International Terrestrial Reference System (ITRS), we investigated the NTL models of two providers with respect to their suitability for the determination of long-term stable reference frames.

As one of the three ITRS Combination Centers of the International Earth Rotation and Reference Systems Service (IERS), DGFI-TUM is in charge of realizing the ITRS in regular intervals. Our upcoming solution, the DTRF2020, will correct for site displacements caused by mass load changes of atmosphere, ocean and hydrology.

Since there is no conventional model for the application of NTL yet, the best approach and data for the DTRF2020 need to be investigated. The appropriate modeling of NTL is crucial to our ITRS realization as accounting for NTL site displacements will affect the estimated coordinates of reference points (i.e., geodetic observatories), but should not affect the realized geocenter. We analyzed two sets of site displacements based on different geophysical models. One is the Global Geophysical Fluid Center contribution (labelled GCTI20) to the ITRS 2020 realization, the other is the operational NTL data from the Earth System Modelling group of the Deutsches GeoForschungsZentrum (ESMGFZ; already applied at DGFI-TUM e.g. for VLBI analysis). Among other things, we compared the displacements with time series of GNSS station residuals and calculated the contributions to geocenter motion (see Figure).

While the correlations are satisfactory, neither data set could be identified as having the better agreement with the residual GNSS station positions. The main differences between GCTI20 and ESMGFZ are the hydrological loading components and the presence of artificial trend changes in ESMGFZ site displacements (and hence geocenter motion contributions). The latter is a hindrance to realizing a secular reference frame. As a result, GCTI20 will be applied for the DTRF2020. The study is published in the article Comparison of non‐tidal loading data for application in a secular terrestrial reference frame (Earth, Planets and Space, 2022, DOI: 10.1186/s40623-022-01634-1, [PDF]).

Improved modeling of atmospheric drag in precise orbit determination

[POD of a spherical satellite by Satellite Laser Ranging: The calculated orbit depends on gravitational and non-gravitational accelerations, such as air drag (Left). In-situ measurement of thermospheric density along the orbit of GRACE satellites using on-board accelerometers (Right).]

A major problem in the precise orbit determination (POD) of satellites at altitudes below 1,000 km is the modelling of atmospheric drag, which depends mainly on thermospheric density and causes the largest non-gravitational acceleration. Normally, thermospheric densities at satellite positions are determined by empirical models, which have limited accuracy. But conversely, satellites orbiting the Earth within the thermosphere can be used to derive thermospheric density information because of their sensitivity to perturbing accelerations.

Scientists from DGFI-TUM and the Institute of Geodesy and Geoinformation at the University of Bonn (IGG Bonn) have for the first time compared thermospheric density corrections in the form of scale factors for the NRLMSISE-00 model with a temporal resolution of 12 hours. It was shown that time-averaged scale factors from in-situ acceleration measurements on board CHAMP and GRACE fit well to arc-wise scale factors from the Satellite Laser Ranging (SLR) technique applied to the spherical satellites Larets, Stella, WESTPAC and Starlette. The estimated scale factors vary by up to 30% around the value of 1 at low solar activity and by up to 70% at high solar activity. This shows the extent to which the NRLMSISE-00 model values of thermospheric density deviate from the observed values. On average, at low solar activity the model overestimates the thermospheric density and has to be scaled down with the estimated scale factors, while at high solar activity the model underestimates the density values and has to be scaled up.
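The scale-factor principle follows directly from the drag equation: since the drag acceleration is proportional to density, the factor is the ratio of observed to modelled drag. All numerical values below are illustrative, not taken from the study:

```python
def drag_acceleration(rho, v, cd, area, mass):
    """Atmospheric drag deceleration: a = 0.5 * Cd * (A/m) * rho * v**2."""
    return 0.5 * cd * (area / mass) * rho * v ** 2

def density_scale_factor(a_observed, rho_model, v, cd, area, mass):
    """Factor by which the model density must be scaled so that the
    modelled drag matches the observed along-track acceleration."""
    return a_observed / drag_acceleration(rho_model, v, cd, area, mass)

# Hypothetical values for a small spherical satellite:
rho_model = 2.0e-14                    # model density [kg/m^3]
v = 7450.0                             # orbital velocity [m/s]
cd, area, mass = 2.3, 0.0510, 14.3     # drag coeff., cross-section [m^2], mass [kg]

a_obs = drag_acceleration(1.4e-14, v, cd, area, mass)            # "observed" drag
scale = density_scale_factor(a_obs, rho_model, v, cd, area, mass)  # ≈ 0.7
```

A factor below 1, as in this example, corresponds to the low-solar-activity case described above, where the model density has to be scaled down.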

Depending on the altitude, there are correlations of up to 0.8 between the scale factors derived from accelerometer data and those estimated from SLR. To check the reliability of the latter, the POD results from two different software packages were compared, namely DOGS-OC at DGFI-TUM and GROOPS at IGG Bonn. Above 680 km altitude, a linear decrease in the estimated thermospheric density scale factors of about -5% per decade was observed, possibly related to climate change. The results of this study are published in the article Scale Factors of the Thermospheric Density: A Comparison of Satellite Laser Ranging and Accelerometer Solutions (Journal of Geophysical Research: Space Physics, 2021, DOI: 10.1029/2021JA029708, [PDF]).

New global ocean tide model EOT20 from multi-mission satellite altimetry

[Amplitudes and phases of tidal constituents M2 (top) and K2 (bottom).]

DGFI-TUM recently published the latest in a series of empirical ocean tide (EOT) models. The new model, named EOT20, shows improved results compared to other global tide models (including our earlier model EOT11a), especially in the coastal region.

Ocean tides play a vital role in various practical applications, especially in the coastal environment. In addition, tides are of importance in geodetic data analysis, for example in improving the observation of sea surface processes from along-track satellite altimetry and in determining high-resolution gravity fields from missions such as GRACE. Although in recent years tide models have made significant progress in the estimation of tides using satellite altimetry, the coastal region remains a challenge due to the complexity of shorelines, poorly resolved bathymetry and land contamination of altimetry radar echoes.

EOT20 benefits from advances in coastal altimetry, particularly in the use of the ALES retracker. The EOT20 approach relies on residual tidal analysis with respect to a reference tide model (FES2014) to estimate residual signals of the ocean tides. Further developments include the incorporation of more altimetry data, improved coastline representation and triangular gridding.
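Residual tidal analysis can be illustrated in miniature: the harmonics of a constituent are fitted by least squares to the residual that remains after subtracting the reference-model tide. The synthetic residual below is an illustrative assumption:

```python
import numpy as np

omega = 2.0 * np.pi / 12.4206        # M2 tidal frequency [rad/h]
t = np.linspace(0.0, 30 * 24, 2000)  # 30 days of observations [h]

# Synthetic residual [m]: sea level minus reference-model tide, with a
# remaining M2 signal of 4 cm (cos) and 2 cm (sin) plus 1 cm noise.
rng = np.random.default_rng(2)
residual = (0.04 * np.cos(omega * t) + 0.02 * np.sin(omega * t)
            + rng.normal(0.0, 0.01, t.size))

# Least-squares fit of the M2 harmonics to the residual.
A = np.column_stack([np.cos(omega * t), np.sin(omega * t)])
(c, s), *_ = np.linalg.lstsq(A, residual, rcond=None)

amplitude = float(np.hypot(c, s))              # residual M2 amplitude [m]
phase = float(np.degrees(np.arctan2(s, c)))    # residual M2 phase [deg]
```

In EOT20 this is done per constituent on the gridded altimetry residuals, and the estimated residual tide is added back to the FES2014 reference to form the final model.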

The model's accuracy was evaluated using in-situ tide gauge data from DGFI-TUM's TICON dataset. Error reduction was found for the eight major tidal constituents in EOT20 compared to other global ocean tide models in the coastal region, with an error reduction of ~0.2 cm compared to the next best model (FES2014). EOT20 is on par with the best tide models in shelf regions and the open ocean, with improvement over EOT11a throughout. When used as a tidal correction for satellite altimetry, EOT20 reduced the sea level variance compared to both EOT11a and FES2014. These improvements, particularly in the coastal region, encourage the use of EOT20 as a tidal correction for satellite altimetry in sea-level research.

The ocean tide and load tide datasets of EOT20 are available in our Science Data Products section. Methodology and results are described in the publication EOT20: a global ocean tide model from multi-mission satellite altimetry (Earth System Science Data, 2021, doi: 10.5194/essd-13-3869-2021, [PDF]).

First comprehensive measurements of sea level changes in the Baltic Sea and the North Sea

[Rise of mean sea level in the North and Baltic Sea between 1995 and 2019. Gray shading indicates areas with high statistical uncertainty.]

Precise data for improved coastline protection: Led by DGFI-TUM, an international team of researchers has created the first comprehensive data sets of regional sea level rise in the North Sea and the Baltic Sea, including coastal areas and regions covered by sea ice. The data sets provide new insights into long-term and seasonal sea level changes over the past quarter century. This information is of vital importance for planning protective measures and for understanding dynamic processes in the oceans and the climate system.

Especially near coastlines, where many cities and industrial facilities are located, the quality and quantity of the data collected by the satellites are compromised by strong perturbations of the radar signal. Another problem is sea ice, which covers parts of the oceans in winter and is impenetrable to radar. In the ESA Baltic Sea Level project (Baltic SEAL), the researchers developed algorithms to process the measurement data from radar satellites in a way that permits precise and high-resolution measurements of sea level changes even in coastal areas and beneath sea ice. In this effort, the Baltic Sea serves as a model region: the complex shape of the coastline and the sea ice cover make the data analysis particularly difficult, so analytical methods that work here can easily be adapted to other regions. Hundreds of millions of radar measurements taken between 1995 and 2019 were processed in a newly developed multi-stage procedure: identifying signals from the ice-covered sea water in the radar reflections produced along cracks and fissures, developing new computational methods to improve the quality of sea level data close to land, and finally calibrating and combining the measurements from the various satellite missions.

The analysis of these data for the Baltic Sea shows that the sea level has risen at an annual rate of 2-3 millimeters in the south, on the German and Danish coasts, compared to 6 millimeters in the north-east, in the Bay of Bothnia. The cause of this above-average rise: strong south-westerly winds connected to the North Atlantic Oscillation (NAO) drive the waters to the north-east. The developed method has also been applied to the North Sea, where the sea level is rising by 2.6 millimeters per year, and by 3.2 millimeters per year in the German Bight.

The data sets Baltic SEAL and North SEAL of sea level changes are available for download in our Science Data Products section. Methods and results are described in the respective publications Absolute Baltic Sea Level Trends in the Satellite Altimetry Era: A Revisit (Frontiers in Marine Science, 2021, doi: 10.3389/fmars.2021.647607, [PDF]) and North SEAL: A new Dataset of Sea Level Changes in the North Sea from Satellite Altimetry (Earth System Science Data, 2021, doi: 10.5194/essd-13-3733-2021, [PDF]).

The study is also subject of a current TUM press release (English, German).

Global coastal attenuation of wind-waves observed with radar altimetry

Knowledge of ocean wave heights at the coast is essential for several operational applications, ranging from coastal protection to energy exploitation. In this context, the Significant Wave Height (SWH) is one of the most general quantitative parameters describing the sea state at a particular location. SWH, representing the average height of the highest third of the waves, can be measured from satellites using radar altimeters. Over the open ocean, such measurements are routinely used, for example, for ocean weather predictions. In the coastal zone, however, the radar measurements were long not considered reliable. As an alternative, in-situ buoys or high-resolution ocean models are employed. While the network of in-situ buoys is very sparse and can only provide data at specific locations, appropriate ocean models are computationally very expensive, not globally available, and require constant validation.

Led by DGFI-TUM, an international team has now analyzed reprocessed data from radar altimetry, specifically tailored to improve the quality and quantity of coastal measurements. The results, published in the article Global coastal attenuation of wind-waves observed with radar altimetry (Nature Communications, 2021, doi: 10.1038/s41467-021-23982-4, [PDF]), provide a global picture of the average wave climate when going from offshore (about 30 km) to the coast (up to 3 km from land). The typical attenuation of the waves when approaching the coast, for example due to the shading effect from the land, is quantified to be about 20% of the wave height reached offshore. As a consequence, the energy flux transported by the waves is calculated to decline by about 40% on a global average. This result is paramount for coastal assessments, which until now are often based on models with validation relative to offshore satellite altimetry data.
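The two figures quoted above are mutually consistent, because wave energy flux scales (for a fixed wave period) with the square of the wave height:

```python
# A ~20 % attenuation of the wave height implies a ~36 % reduction of
# the energy flux, consistent with the "about 40 %" quoted above.
h_ratio = 1.0 - 0.20              # coastal / offshore wave height
flux_ratio = h_ratio ** 2         # energy flux scales with height squared
flux_decline = 1.0 - flux_ratio   # ≈ 0.36, i.e. roughly 40 %
```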

Improved parameters of geodetic VLBI by correcting for all types of non-tidal loading

[Change in the baseline length between the VLBI antennas Wettzell and Badary due to different effects of non-tidal loading.]

Very Long Baseline Interferometry (VLBI) is a geodetic space technique which measures the difference in arrival times (delay) of extra-galactic radio signals at separate antennas across the Earth. These delays depend on the distances between pairs of antennas, the so-called “baselines”. The observed delays allow for estimating the absolute positions of the antennas in the terrestrial reference frame (TRF), the positions of the radio sources in the celestial reference frame (CRF), as well as the complete set of Earth Orientation Parameters (EOP) linking TRF and CRF.

The positions of the antennas vary during VLBI measurements, and the instantaneous displacements with respect to the long-term linear motion provided by the TRF are caused by various geophysical effects. One such effect is the deformation of the Earth's surface by non-tidal loading, driven by the redistribution of air and water masses within the atmosphere, ocean and continental hydrosphere. However, oceanic and hydrological loading are usually omitted in routine VLBI processing. In the recent study Benefits of non-tidal loading applied at distinct levels in VLBI analysis (Journal of Geodesy, 2020, doi: 10.1007/s00190-020-01418-z, [PDF]), researchers of DGFI-TUM applied all three non-tidal loading types in the analysis of VLBI sessions between 1984 and 2017 and investigated the impact on various geodetic parameters.

Loading data in terms of three-dimensional station site displacements was applied at two distinct levels of the parameter estimation process: The “observation level” represents the rigorous application, while only average site displacements are considered in the approximation at the “normal equation level”. The study revealed that each baseline is most sensitive to a different loading type (Figure). Considering all types jointly provides the best results, as the variation in estimated heights decreases to a larger extent and for more stations than with any of the single loading types. In particular, the inclusion of hydrological loading leads to a significant reduction in the annual residual signal of station heights. These effects, which improve the stability of station positions, were observed for both application levels with a similar magnitude, and hence the correction for non-tidal loading at normal equation level proved to be a suitable approximation in VLBI analysis.
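The difference between the two application levels can be illustrated with a toy height series: the observation-level correction removes the epoch-wise displacement, whereas the normal-equation-level approximation removes only a session-average displacement. All numbers are invented:

```python
# Toy VLBI station height series [mm] containing only an NTL signal.
ntl = [3.1, 2.4, -1.0, -2.8, 0.6]        # epoch-wise NTL displacement
heights = [0.0 + d for d in ntl]         # "observed" heights

# Observation level: the epoch-wise displacement is removed rigorously,
# so the corrected series is free of the loading signal.
obs_level = [h - d for h, d in zip(heights, ntl)]

# Normal equation level: only a session-average displacement is removed,
# so sub-session variations of the loading signal remain.
mean_ntl = sum(ntl) / len(ntl)
neq_level = [h - mean_ntl for h in heights]
```

The study's finding is that, for the parameters investigated, the residual sub-session signal is small enough that the normal-equation-level approximation performs almost as well as the rigorous correction.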

New coastal sea level record from reprocessed Jason satellite altimetry

[Coastal sea level trends (mm/yr) at the 429 selected coastal sites of the study.]

Many coastal regions are exposed to sea level rise and are thus increasingly threatened by the risk of flooding during extreme events. Risk assessment and the development of appropriate adaptation measures are complex and require a reliable data basis of regional coastal sea level changes from precise observations over long time spans. But systematic coastal sea level observations are lacking along most of the world coastlines. Coastal zones are highly under-sampled by tide gauges, and altimetry data are largely defective because of land contamination of the radar signals.

Now, in the framework of the Climate Change Initiative (CCI) Sea Level project of the European Space Agency (ESA), a novel altimetry-based coastal sea level data record has been created. It consists of high-resolution (~300 m) monthly sea level data along the satellite tracks, at distances of less than 3-4 km from the coastlines in general, sometimes even closer, within 1-2 km from the coast. The data set is based on a complete reprocessing of altimetry radar observations from the Jason-1/2/3 missions and provides coastal sea level trends over 2002-2018 at 429 coastal sites located in six regions (Northeast Atlantic, Mediterranean Sea, West Africa, North Indian Ocean, Southeast Asia and Australia). DGFI-TUM is involved in the CCI Sea Level project by designing and testing of improved radar signal processing techniques to exploit the radar signal in the coastal zone and to correct the measurements. The procedure and the new coastal sea level record are described in the article Coastal sea level anomalies and associated trends from Jason satellite altimetry over 2002–2018 (Nature Scientific Data, 2020, doi: 10.1038/s41597-020-00694-w, [PDF]). The data is freely available at the SEANOE repository (doi: 10.17882/74354).

Adaptive Modeling of the Global Ionosphere Vertical Total Electron Content

[VTEC maps covering three days in 2015 around the St Patrick geomagnetic storm generated by the developed modeling approach adapting the estimator to the environmental changes autonomously: March 16 (day before storm, top), March 17 (storm day, mid) and March 18 (day after storm, bottom).]

Space weather and natural disaster monitoring, navigation, positioning and other applications imply an increasing need for low-latency ionosphere information. To create such information, a suitable estimator is required that makes use of observation data as soon as they become available. In this sense, the Kalman Filter (KF) is often applied in (ultra-)rapid and (near) real-time applications. A drawback of the standard KF implementation is that the model uncertainties must be defined in advance, although they can exhibit temporal variations. Implementing adaptive approaches in the KF is a way to tune the stochastic model parameters during the filter run-time.

In the last years, DGFI-TUM developed approaches for modeling the global vertical total electron content (VTEC) of the ionosphere as a series expansion in terms of localizing B-spline basis functions from unevenly distributed input data, such as the dual-frequency GNSS measurements of the IGS network. Scientists of DGFI-TUM have now taken the next step and developed adaptive methods for ultra-rapid VTEC modeling that tune the associated model uncertainties in a self-learning manner. The adaptive approach relies on the method of Variance Component Estimation (VCE) and significantly reduces the effort to set up the measurement model and the associated uncertainties for different groups of observations. To define the dynamic (prediction) model of the ionosphere target parameters, advantages of the B-spline representation are exploited. For instance, since the coefficients of the B-spline representation resemble the VTEC signal, physical interpretations can be deduced directly from the coefficients. This allows the empirical prediction model to be developed very efficiently.
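A minimal sketch of the idea, a Kalman filter whose process-noise variance is re-tuned from its own innovations during run-time, is shown below. The simple innovation-based update stands in for the actual variance component estimation, and all numbers are illustrative:

```python
def adaptive_kalman_1d(obs, r, q0=0.5, alpha=0.3):
    """1-D random-walk Kalman filter whose process-noise variance q is
    re-tuned from the filter innovations at run-time. This innovation-
    based adaptation is an illustrative stand-in for VCE."""
    x, p, q = obs[0], r, q0
    states = [x]
    for z in obs[1:]:
        p_pred = p + q                        # predict (random walk)
        innovation = z - x                    # predicted state equals x
        k = p_pred / (p_pred + r)             # Kalman gain
        x = x + k * innovation                # measurement update
        p = (1.0 - k) * p_pred
        # Adapt q: blend in the innovation's excess variance (floored),
        # so q grows when the state changes faster than assumed.
        excess = max(innovation ** 2 - (p_pred + r), 1e-4)
        q = (1.0 - alpha) * q + alpha * excess
        states.append(x)
    return states

vtec = [10.0, 10.4, 10.9, 14.0, 18.5, 19.0]   # toy VTEC series [TECU]
filtered = adaptive_kalman_1d(vtec, r=0.25)
```

During a storm-like jump in the series, the adapted process noise increases and lets the filter follow the data more closely, which is the behaviour the self-learning tuning is designed to achieve.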

The approach is applied to ultra-rapid VTEC modeling with a maximum latency of about 2.5 hours, using ionosphere measurements from GPS and GLONASS, and can be extended to additional GNSS constellations such as Galileo or to other measurement techniques. Details are presented in the article Adaptive Modeling of the Global Ionosphere Vertical Total Electron Content (Remote Sensing, 2020, DOI: 10.3390/rs12111822, [PDF]).

Long-term measurements document sea level rise in the Arctic

[Sea level change in the Arctic Ocean: The map illustrates the strong regional differences.]

Tracking down climate change with radar eyes: Over the past 22 years, the sea level in the Arctic Ocean has risen by an average of 2.2 millimeters per year. This is the conclusion of an investigation performed jointly by DTU Space and DGFI-TUM as part of ESA's Sea Level Climate Change Initiative (CCI) project.

The most complete and precise overview of sea level changes in the Arctic Ocean to date was obtained by evaluating 1.5 billion radar measurements from various altimetry satellites. A major challenge for a comprehensive analysis is the sea ice that covers vast areas of the Arctic Ocean and obscures the ocean surface underneath. By applying DGFI-TUM's dedicated retracking algorithm ALES+ to the original ENVISAT and ERS-2 measurements, radar echoes reflected even from small water openings in the ice could be identified and analysed. After harmonizing the observation data from ice-covered and open-water areas, maps of monthly sea level elevations were computed for 1996-2018.
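As a simplified illustration (not the CCI processing chain), the trend in such monthly sea level maps can be estimated point-wise with a least-squares line fit. The synthetic series below assumes the study's Arctic-average rate of 2.2 mm/yr plus noise:

```python
import numpy as np

def sea_level_trend(t_years, anomalies_mm):
    """Least-squares linear trend (mm/yr) of a monthly sea level
    anomaly series -- a minimal stand-in for the trend analysis of
    gridded monthly sea level maps."""
    A = np.column_stack([t_years, np.ones_like(t_years)])
    (slope, intercept), *_ = np.linalg.lstsq(A, anomalies_mm, rcond=None)
    return slope

# Synthetic example: 22 years of monthly anomalies rising at the
# Arctic-average rate of 2.2 mm/yr reported in the study, plus noise.
rng = np.random.default_rng(42)
t = 1996 + np.arange(22 * 12) / 12.0
h = 2.2 * (t - t[0]) + rng.normal(0.0, 15.0, t.size)
print(f"estimated trend: {sea_level_trend(t, h):.2f} mm/yr")
```

Repeating such a fit for every grid cell of the monthly maps yields trend maps like the one shown above, with their strong regional differences.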

Analysis of the long-term measurements revealed significant regional differences in sea level trends: within the Beaufort Gyre north of Greenland, Canada and Alaska, the water level rose twice as fast as the Arctic average. Low-salinity meltwater collects here, while a steady east wind produces currents that prevent it from mixing with other ocean currents. Along the coast of Greenland, on the other hand, the sea level is falling, on the west coast by more than 5 mm per year: here, the melting glaciers weaken the gravitational attraction on the surrounding water. More information about the study can be found in the open-access article Arctic Ocean Sea Level Record from the Complete Radar Altimetry Era: 1991–2018 (Remote Sensing, 2019, DOI: 10.3390/rs11141672, [PDF]). The results are also the subject of a current TUM press release (English, German).

DGFI-TUM contributes to the implementation of a UN Resolution for a Global Geodetic Reference Frame

[A highly precise and long-term stable global geodetic reference frame is an indispensable requirement for a reliable determination of global sea level rise over many decades.]

In February 2015, the UN General Assembly adopted its first geospatial resolution, "A Global Geodetic Reference Frame for Sustainable Development". This resolution recognizes the importance of geodesy for many societal and economic benefit areas, including navigation and transport, construction and monitoring of infrastructure, process control, surveying and mapping, and the growing demand for precisely observing our planet's changes in space and time. The resolution stresses the significance of the global reference frame for accomplishing these tasks, for natural disaster management, and for providing reliable information to decision-makers.

The United Nations Global Geospatial Information Management (UN-GGIM) Working Group on the Global Geodetic Reference Frame (GGRF) is tasked with drafting a roadmap for the enhancement of the GGRF under a UN mandate.

Based on its competence in the realization of reference frames, DGFI-TUM is involved in this activity by contributing to the compilation of a concept paper within the frame of the International Association of Geodesy (IAG). The main purpose of this paper is to provide a common understanding of the definition of the GGRF and the scientific basis for the preparation of the roadmap to be accomplished by the UN-GGIM Working Group on the GGRF. [more]

News Archive

Find more topics on the central web site of the Technical University of Munich: