Numerical analysis
A functional equation with polynomial solutions and application to Neural Networks
Comptes Rendus. Mathématique, Volume 358 (2020) no. 9-10, pp. 1059-1072.

We construct and discuss a functional equation with a contraction property. Its solutions are real univariate polynomials. The series solving the natural fixed-point iterations have an immediate interpretation in terms of Neural Networks with recursive properties and controlled accuracy.
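As background for the contraction framework evoked in the abstract, the sketch below illustrates a generic fixed-point iteration together with the geometric a priori error bound provided by Banach's theorem. The map T used here is a placeholder contraction chosen only for illustration; it is not the functional equation studied in the paper.

import math

def fixed_point(T, x0, L, tol=1e-12, max_iter=200):
    """Iterate x_{n+1} = T(x_n) for a contraction T with Lipschitz constant L < 1.

    Banach's fixed-point theorem gives the a priori bound
        |x_n - x*| <= L**n / (1 - L) * |x_1 - x_0|,
    used here to stop once the guaranteed error drops below `tol`.
    """
    x = x0
    first_step = abs(T(x0) - x0)
    n = 0
    for n in range(1, max_iter + 1):
        x = T(x)
        if L**n / (1.0 - L) * first_step < tol:
            break
    return x, n

# Placeholder contraction with Lipschitz constant <= 0.5 (not the paper's equation).
T = lambda x: 0.5 * math.cos(x)
x_star, n_iter = fixed_point(T, x0=0.0, L=0.5)
print(f"approximate fixed point {x_star:.12f} reached after {n_iter} iterations")

The stopping rule reflects the "controlled accuracy" mentioned in the abstract: for a contraction, the number of iterations needed to reach a target tolerance is determined by the contraction constant alone.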

Received:
Revised:
Accepted:
Published:
DOI : 10.5802/crmath.124
Classification : 65Q20, 65Y99, 78M32
Després, Bruno (1); Ancellin, Matthieu (2)

(1) Laboratoire Jacques-Louis Lions, Sorbonne Université, 4 place Jussieu, 75005 Paris, France, and Institut Universitaire de France, France
(2) Université Paris-Saclay, ENS Paris-Saclay, CNRS, Centre Borelli, F-91190 Gif-sur-Yvette, France
@article{CRMATH_2020__358_9-10_1059_0,
     author = {Despr\'es, Bruno and Ancellin, Matthieu},
     title = {A functional equation with polynomial solutions and application to {Neural} {Networks}},
     journal = {Comptes Rendus. Math\'ematique},
     pages = {1059--1072},
     publisher = {Acad\'emie des sciences, Paris},
     volume = {358},
     number = {9-10},
     year = {2020},
     doi = {10.5802/crmath.124},
     language = {en},
     url = {http://www.numdam.org/articles/10.5802/crmath.124/}
}
Després, Bruno; Ancellin, Matthieu. A functional equation with polynomial solutions and application to Neural Networks. Comptes Rendus. Mathématique, Volume 358 (2020) no. 9-10, pp. 1059-1072. doi: 10.5802/crmath.124. http://www.numdam.org/articles/10.5802/crmath.124/
