*Result*: Hypothesis spaces for deep learning.

Title:
Hypothesis spaces for deep learning.
Authors:
Wang R; School of Mathematics, Jilin University, Changchun 130012, PR China. Electronic address: rwang11@jlu.edu.cn.
Xu Y; Department of Mathematics and Statistics, Old Dominion University, Norfolk, VA 23529, USA. Electronic address: y1xu@odu.edu.
Yan M; Department of Mathematics and Statistics, Old Dominion University, Norfolk, VA 23529, USA. Electronic address: myan007@odu.edu.
Source:
Neural networks : the official journal of the International Neural Network Society [Neural Netw] 2026 Jan; Vol. 193, pp. 107995. Date of Electronic Publication: 2025 Aug 21.
Publication Type:
Journal Article
Language:
English
Journal Info:
Publisher: Pergamon Press. Country of Publication: United States. NLM ID: 8805018. Publication Model: Print-Electronic. Cited Medium: Internet. ISSN: 1879-2782 (Electronic). Linking ISSN: 0893-6080. NLM ISO Abbreviation: Neural Netw. Subsets: MEDLINE.
Imprint Name(s):
Original Publication: New York : Pergamon Press, [c1988-
Contributed Indexing:
Keywords: Deep learning; Deep neural network; Representer theorem for deep learning; Reproducing kernel Banach space
Entry Date(s):
Date Created: 20250829. Date Completed: 20251217. Latest Revision: 20251217.
Update Code:
20260130
DOI:
10.1016/j.neunet.2025.107995
PMID:
40882408
Database:
MEDLINE

*Further Information*

*This paper introduces a hypothesis space for deep learning based on deep neural networks (DNNs). By treating a DNN as a function of two variables, the input variable and the parameter variable, we consider the set of DNNs whose parameter variable ranges over a space of weight matrices and biases determined by a prescribed depth and layer widths. To construct a Banach space of functions of the input variable, we take the weak* closure of the linear span of this DNN set. We prove that the resulting Banach space is a reproducing kernel Banach space (RKBS) and explicitly construct its reproducing kernel. Furthermore, we investigate two learning models, regularized learning and the minimum norm interpolation (MNI) problem, within the RKBS framework by establishing representer theorems. These theorems reveal that the solutions to these learning problems can be expressed as a finite sum of kernel expansions based on training data.
(Copyright © 2025 Elsevier Ltd. All rights reserved.)*
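For orientation, the two learning models named in the abstract can be sketched in the classical representer-theorem form that the paper's RKBS results generalize; the symbols below (loss L, regularization parameter λ, coefficients c_i, kernel K) are illustrative notation, not taken from the paper itself:

```latex
% Regularized learning over a hypothesis space \mathcal{B} with norm \|\cdot\|_{\mathcal{B}}:
\min_{f \in \mathcal{B}} \; \sum_{i=1}^{n} L\big(f(x_i), y_i\big) + \lambda \, \|f\|_{\mathcal{B}}

% Minimum norm interpolation (MNI) over the same space:
\min_{f \in \mathcal{B}} \; \|f\|_{\mathcal{B}}
\quad \text{subject to} \quad f(x_i) = y_i, \; i = 1, \dots, n

% A representer theorem asserts that a solution admits a finite
% kernel expansion at the training inputs, schematically:
f^{*} = \sum_{i=1}^{n} c_i \, K(\cdot, x_i)
```

In the RKHS setting this expansion is the classical representer theorem; the paper's contribution is establishing analogous theorems when \(\mathcal{B}\) is the DNN-induced RKBS with its explicitly constructed reproducing kernel.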

*Declaration of competing interest The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.*