Extreme fluctuations in the horizontal geomagnetic field (dBh/dt) may be generated at the Earth’s surface by electrical currents in the ionosphere and magnetosphere. Using a global database of 125 magnetometers covering several decades, we present occurrence statistics for fluctuations exceeding the 99.97th percentile (P99.97) for both ramp changes (Rn) and the root-mean-square (Sn) of fluctuations over periods, τ, from 1 to 60 min, and describe their variation with geomagnetic latitude and magnetic local time (MLT). Rates of exceedance are explained by reference to the magneto-ionospheric processes dominant in different latitude and MLT sectors, including ULF waves, interplanetary shocks, auroral substorm currents, and travelling convection vortices. By fitting Generalised Pareto tail distributions above P99.97 we predict return levels (RLs) for Rn and Sn over return periods up to 500 years. Both P99.97 and the RLs increase monotonically with frequency (1/τ), with a few exceptions at auroral latitudes, and this increase is well modelled by quadratic functions whose coefficients vary smoothly with latitude. For UK magnetometers providing 1-s cadence measurements, the analysis is extended to cover periods from 1 to 60 s, and empirical magnetotelluric transfer functions are used to predict percentiles and return levels of the geoelectric field over a wide frequency range (2 × 10⁻⁴ to 4 × 10⁻² Hz), assuming a sinusoidal field fluctuation. These results help identify the principal causes of field fluctuations leading to extreme geomagnetically induced currents (GIC) in ground infrastructure over a range of timescales, and they inform the choice of frequency dependence to use with dBh/dt as a GIC proxy.
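As a rough illustration of the peaks-over-threshold step summarised above (not the authors' exact pipeline, which may involve declustering and site-specific threshold choices), the sketch below fits a Generalised Pareto distribution to exceedances of a high threshold such as P99.97 and converts the fit into a return level for a chosen return period. It assumes scipy is available; the function name and arguments are hypothetical.

```python
import numpy as np
from scipy.stats import genpareto

def return_level(exceedances, threshold, rate_per_year, return_period_years):
    """Return level from a Generalised Pareto fit to threshold exceedances.

    exceedances         : array of observations above `threshold` (same units as the data)
    threshold           : high threshold used for the fit (e.g. the P99.97 value)
    rate_per_year       : mean number of threshold exceedances per year
    return_period_years : desired return period, e.g. 100 or 500 years
    """
    exceedances = np.asarray(exceedances, dtype=float)
    # Fit the GPD to the excesses over the threshold, with location fixed at zero
    shape, _, scale = genpareto.fit(exceedances - threshold, floc=0.0)
    # Expected number of exceedances within the return period
    m = rate_per_year * return_period_years
    if abs(shape) > 1e-6:
        return threshold + (scale / shape) * (m**shape - 1.0)
    # Exponential (shape -> 0) limit of the GPD return-level formula
    return threshold + scale * np.log(m)
```

This uses the standard return-level expression x_T = u + (σ/ξ)[(λT)^ξ − 1] for shape ξ ≠ 0, where u is the threshold, σ the fitted scale, and λ the annual exceedance rate.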
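The geoelectric-field estimate rests on the stated sinusoidal assumption: a sinusoid whose time derivative has amplitude R at frequency f = 1/τ has magnetic-field amplitude R/(2πf), which the magnetotelluric transfer function then maps to an electric field. A minimal sketch under that assumption follows; it further assumes the transfer-function magnitude |Z(f)| is supplied in practical units of (mV/km)/nT, and the function name and units are illustrative rather than taken from the paper.

```python
import numpy as np

def geoelectric_amplitude(dbdt_amplitude_nT_per_s, freq_hz, z_mag_mV_per_km_per_nT):
    """Peak geoelectric field (mV/km) for a sinusoidal dB/dt of given amplitude.

    Assumes a pure sinusoid at `freq_hz`, so the magnetic-field amplitude is the
    dB/dt amplitude divided by 2*pi*f, and assumes |Z(f)| is given in (mV/km)/nT.
    """
    b_amplitude_nT = dbdt_amplitude_nT_per_s / (2.0 * np.pi * freq_hz)
    return z_mag_mV_per_km_per_nT * b_amplitude_nT
```

In this form, percentiles or return levels of Rn at a given period τ can be mapped directly to the corresponding geoelectric-field levels by evaluating |Z| at f = 1/τ.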