Abstract
Deep learning techniques capture the intricate structure of large-scale data by employing computational models with multiple processing layers that learn and represent the data at multiple levels of abstraction [1]. Such methods include Convolutional Neural Networks, stacked auto-encoders, and Long Short-Term Memory (LSTM) architectures. LSTM networks are suitable for time-dependent data, mapping input sequences to output sequences as is done, for instance, in language modeling and speech recognition. One application that has recently attracted considerable attention within the geodetic community is the use of these techniques to account for the adverse effects of ionospheric delays on GNSS satellite signals. Because LSTM architectures model long-range dependencies in time series, they are well suited to ionospheric modeling in GNSS positioning. This paper presents an LSTM-based modeling approach for predicting the ionospheric delay at different stations of the IGS network.
We also incorporate a Bayesian optimization method for selecting the best configuration parameters of the LSTM network, thereby improving the network's performance.
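
As a rough illustration of this workflow (not the paper's implementation), the sketch below tunes two configuration parameters of a one-layer LSTM delay predictor with Gaussian-process Bayesian optimization, using TensorFlow/Keras and scikit-optimize. The window length, search bounds, training epochs, and data splits are illustrative assumptions only.

```python
# Minimal sketch: Bayesian optimization of an LSTM ionospheric-delay predictor.
# Assumes TensorFlow/Keras and scikit-optimize; all bounds and sizes are placeholders.
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense
from tensorflow.keras.optimizers import Adam
from skopt import gp_minimize
from skopt.space import Integer, Real

WINDOW = 24  # hypothetical: 24 past epochs of delay/TEC values per input sample

def make_model(units, learning_rate):
    """Single-layer LSTM mapping a delay window to the next delay value."""
    model = Sequential([
        LSTM(units, input_shape=(WINDOW, 1)),
        Dense(1),
    ])
    model.compile(optimizer=Adam(learning_rate=learning_rate), loss="mse")
    return model

def objective(params, X_train, y_train, X_val, y_val):
    """Validation MSE obtained with one candidate configuration."""
    units, learning_rate = params
    model = make_model(int(units), float(learning_rate))
    model.fit(X_train, y_train, epochs=20, batch_size=32, verbose=0)
    return float(model.evaluate(X_val, y_val, verbose=0))

def tune(X_train, y_train, X_val, y_val, n_calls=20):
    """Gaussian-process Bayesian search over LSTM units and learning rate."""
    space = [
        Integer(16, 256, name="units"),
        Real(1e-4, 1e-2, prior="log-uniform", name="learning_rate"),
    ]
    result = gp_minimize(
        lambda p: objective(p, X_train, y_train, X_val, y_val),
        space, n_calls=n_calls, random_state=0)
    return result.x, result.fun  # best (units, learning_rate) and its validation loss
```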