
Approximation Error Estimates by Noise-injected Neural Networks
  • Keito AKIYAMA
Tohoku University - Aobayama Campus

Corresponding Author: [email protected]


Abstract

One-hidden-layer feedforward neural networks are described as functions with many real-valued parameters. As the number of parameters grows, neural networks can approximate a wider class of functions (the universal approximation property). The essentially optimal order of the approximation bound was derived in 1996. Motivated by numerical experiments indicating that neural networks whose parameters carry stochastic perturbations perform better than ordinary neural networks, we explored the approximation properties of neural networks with stochastic perturbations. In this paper, we derive a quantitative order of the variance of the stochastic perturbations that achieves the essentially optimal approximation order.
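The setting described in the abstract can be illustrated with a minimal sketch: a one-hidden-layer network whose inner parameters receive additive Gaussian perturbations of standard deviation sigma before the forward pass. The ReLU activation, the parameter shapes, and the name `one_hidden_layer` are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def one_hidden_layer(x, W, b, c, sigma=0.0, rng=rng):
    """One-hidden-layer network f(x) = sum_j c_j * relu(w_j . x + b_j).

    When sigma > 0, additive Gaussian perturbations are injected into
    the inner parameters (W, b), mimicking the stochastic perturbations
    described in the abstract. The ReLU activation and the choice of
    which parameters are perturbed are illustrative assumptions.
    """
    W_noisy = W + sigma * rng.standard_normal(W.shape)
    b_noisy = b + sigma * rng.standard_normal(b.shape)
    hidden = np.maximum(0.0, x @ W_noisy.T + b_noisy)  # ReLU hidden layer
    return hidden @ c  # linear output layer

# Toy usage: a 1-D input, 16 hidden units, with and without noise.
n_hidden = 16
W = rng.standard_normal((n_hidden, 1))
b = rng.standard_normal(n_hidden)
c = rng.standard_normal(n_hidden)
x = np.linspace(-1.0, 1.0, 5).reshape(-1, 1)

y_plain = one_hidden_layer(x, W, b, c, sigma=0.0)   # deterministic network
y_noisy = one_hidden_layer(x, W, b, c, sigma=0.01)  # perturbed parameters
print(np.max(np.abs(y_plain - y_noisy)))
```

For small sigma the perturbed output stays close to the deterministic one; the paper's question is how fast sigma must shrink (as a function of the number of hidden units) for such networks to retain the optimal approximation order.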
08 Aug 2023 Submitted to Mathematical Methods in the Applied Sciences
08 Aug 2023 Submission Checks Completed
08 Aug 2023 Assigned to Editor
16 Aug 2023 Review(s) Completed, Editorial Evaluation Pending
16 Aug 2023 Reviewer(s) Assigned
05 Nov 2023 Editorial Decision: Revise Major
14 Feb 2024 Reviewer(s) Assigned
07 Apr 2024 Review(s) Completed, Editorial Evaluation Pending
15 Apr 2024 Editorial Decision: Revise Minor
04 Jun 2024 Review(s) Completed, Editorial Evaluation Pending