This paper proposes a novel approach to reconstructing the far-field radiation pattern of an antenna from phaseless electric-field measurements taken on a single near-field sphere. The method adopts dipole equivalence to project the near-field electric field onto a spherically distributed array of electric dipoles. We introduce a term, the linear correlation residue (LCR), that quantifies the residual remaining after the linearly correlated portion is extracted in the least-squares problem associated with the dipole equivalence. It is demonstrated that by iteratively minimizing the LCR with the Covariance Matrix Adaptation Evolution Strategy (CMA-ES), the near-field phase distribution can be recovered efficiently from the magnitude-only near-field data, after which the far-field radiation pattern can be computed. Two representative case studies validate the proposed method, with results showing good agreement between computations and simulations.
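The following is a minimal sketch of the LCR-based phase search described above, not the authors' implementation. It assumes the pycma package (Hansen's CMA-ES library) and a precomputed dipole-to-near-field radiation matrix; the names `A`, `e_mag`, and `lcr` are illustrative assumptions, and `A` here is a random stand-in for the true dipole-equivalence operator.

```python
import numpy as np
import cma  # pycma: pip install cma

rng = np.random.default_rng(0)
M, N = 64, 16  # near-field sample points, equivalent electric dipoles

# Stand-in for the matrix mapping dipole weights to the sampled near field;
# in the actual method this comes from the dipole-equivalence formulation.
A = rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N))

# Synthetic magnitude-only near-field data (phase discarded, as in measurement).
e_true = A @ (rng.standard_normal(N) + 1j * rng.standard_normal(N))
e_mag = np.abs(e_true)

def lcr(phase):
    """Linear correlation residue: the part of the candidate complex
    near field that the least-squares dipole fit cannot reproduce."""
    e = e_mag * np.exp(1j * phase)             # candidate complex near field
    x, *_ = np.linalg.lstsq(A, e, rcond=None)  # dipole-equivalence least squares
    return np.linalg.norm(A @ x - e)           # residue to be minimized

# CMA-ES search over the unknown phase distribution at the sample points.
es = cma.CMAEvolutionStrategy(np.zeros(M), 0.5)
while not es.stop():
    candidates = es.ask()
    es.tell(candidates, [lcr(np.asarray(p)) for p in candidates])
best_phase = es.result.xbest  # recovered phases (up to a global phase offset)
```

Once the phase distribution is recovered, the complex near field `e_mag * exp(1j * best_phase)` can be fitted to the dipole array one final time, and the far-field pattern follows from the dipoles' known far-field radiation. Note that the LCR is invariant to a global phase shift, so the recovered phases are unique only up to a constant offset, which does not affect the radiation pattern.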