Open Access
ESAIM: COCV, Volume 28 (2022)
Article Number: 3
Number of pages: 44
DOI: https://doi.org/10.1051/cocv/2021100
Published online: 11 January 2022
  1. R.A. Adams and J.J.F. Fournier, Sobolev spaces. Vol. 140 of Pure and Applied Mathematics (Amsterdam). Elsevier/Academic Press, Amsterdam (2003), second edition.
  2. J. Adler and O. Öktem, Solving ill-posed inverse problems using iterative deep neural networks. Inverse Probl. 33 (2017) 124007.
  3. C.D. Aliprantis and K.C. Border, Infinite Dimensional Analysis: A Hitchhiker's Guide. Springer (2006).
  4. S. Arridge, P. Maass, O. Öktem and C.B. Schönlieb, Solving inverse problems using data-driven models. Acta Numer. 28 (2019) 1–174.
  5. H. Attouch, G. Buttazzo and G. Michaille, Variational analysis in Sobolev and BV spaces. Vol. 17 of MOS-SIAM Series on Optimization. Society for Industrial and Applied Mathematics (SIAM), Philadelphia, PA; Mathematical Optimization Society, Philadelphia, PA (2014), second edition.
  6. T. Bachlechner, B.P. Majumder, H.H. Mao, G.W. Cottrell and J. McAuley, ReZero is all you need: Fast convergence at large depth. Preprint arXiv:2003.04887 (2020).
  7. B. Baker, O. Gupta, N. Naik and R. Raskar, Designing neural network architectures using reinforcement learning. In International Conference on Learning Representations (ICLR) (2017) 1–18. https://openreview.net/pdf?id=S1c2cvqee.
  8. F. Balsiger, A. Shridhar Konar, S. Chikop, V. Chandran, O. Scheidegger, S. Geethanath and M. Reyes, Magnetic resonance fingerprinting reconstruction via spatiotemporal convolutional neural networks. In Machine Learning for Medical Image Reconstruction. MLMIR 2018, edited by D. Rueckert, F. Knoll and A. Maier. Vol. 11074 of LNCS. Springer, Cham (2018) 39–46. https://doi.org/10.1007/978-3-030-00129-2_5.
  9. F. Bloch, Nuclear induction. Phys. Rev. 70 (1946) 460–473.
  10. L. Bottou, F.E. Curtis and J. Nocedal, Optimization methods for large-scale machine learning. SIAM Rev. 60 (2018) 223–311.
  11. A. Braides, Convergence of local minimizers. In Local Minimization, Variational Evolution and Γ-Convergence. Springer (2014) 67–78.
  12. BrainWeb: Simulated brain database. http://www.bic.mni.mcgill.ca/brainweb/.
  13. L. Bungert, R. Raab, T. Roith, L. Schwinn and D. Tenbrinck, CLIP: Cheap Lipschitz training of neural networks. Preprint arXiv:2103.12531 (2021).
  14. D.L. Collins, A.P. Zijdenbos, V. Kollokian, J.G. Sled, N.J. Kabani, C.J. Holmes and A.C. Evans, Design and construction of a realistic digital brain phantom. IEEE Trans. Med. Imag. 17 (1998) 463–468.
  15. W.M. Czarnecki, S. Osindero, M. Jaderberg, G. Swirszcz and R. Pascanu, Sobolev training for neural networks. In Proceedings of the 31st International Conference on Neural Information Processing Systems, NIPS'17 (2017) 4281–4290.
  16. G. Dal Maso, Introduction to Γ-convergence. Birkhäuser (1993).
  17. H. Daniels and M. Velikova, Monotone and partially monotone neural networks. IEEE Trans. Neural Netw. 21 (2010) 906–917.
  18. M. Davies, G. Puy, P. Vandergheynst and Y. Wiaux, A compressed sensing framework for magnetic resonance fingerprinting. SIAM J. Imag. Sci. 7 (2014) 2623–2656.
  19. S.P. Dirkse and M.C. Ferris, The PATH solver: A non-monotone stabilization scheme for mixed complementarity problems. Optim. Methods Softw. 5 (1995) 123–156.
  20. G. Dong, M. Hintermüller and K. Papafitsoros, Quantitative magnetic resonance imaging: From fingerprinting to integrated physics-based models. SIAM J. Imag. Sci. 12 (2019). https://doi.org/10.1137/18M1222211.
  21. W. E, A proposal on machine learning via dynamical systems. Commun. Math. Stat. 5 (2017) 1–11.
  22. H.W. Engl, M. Hanke and A. Neubauer, Regularization of inverse problems. Vol. 375 of Mathematics and its Applications. Kluwer Academic Publishers, Dordrecht (1996). https://www.springer.com/gp/book/9780792341574.
  23. L.C. Evans, Partial differential equations. Vol. 19 of Graduate Studies in Mathematics. American Mathematical Society, second edition (2010).
  24. H.O. Fattorini, Infinite-dimensional optimization and control theory. Vol. 62 of Encyclopedia of Mathematics and its Applications. Cambridge University Press, Cambridge (1999).
  25. I. Goodfellow, Y. Bengio and A. Courville, Deep Learning. MIT Press (2016).
  26. I. Gühring, G. Kutyniok and P. Petersen, Error bounds for approximations with deep ReLU neural networks in W^{s,p} norms. Anal. Appl. 18 (2020) 803–859.
  27. E. Haber and L. Ruthotto, Stable architectures for deep neural networks. Inverse Probl. 34 (2018) 014004.
  28. J. Han, A. Jentzen and W. E, Solving high-dimensional partial differential equations using deep learning. Proc. Natl. Acad. Sci. 115 (2018) 8505–8510.
  29. M. Hanke, The regularizing Levenberg-Marquardt scheme is of optimal order. J. Integr. Equ. Appl. 22 (2010) 259–283.
  30. K. He, X. Zhang, S. Ren and J. Sun, Deep residual learning for image recognition. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR) (2016). https://openaccess.thecvf.com/content_cvpr_2016/papers/He_Deep_Residual_Learning_CVPR_2016_paper.pdf.
  31. M. Hintermüller, Mesh independence and fast local convergence of a primal-dual active-set method for mixed control-state constrained elliptic control problems. ANZIAM J. 49 (2007) 1–38.
  32. M. Hintermüller, K. Ito and K. Kunisch, The primal-dual active set strategy as a semismooth Newton method. SIAM J. Optim. 13 (2002) 865–888.
  33. M. Hintermüller and K. Kunisch, Feasible and noninterior path-following in constrained minimization with low multiplier regularity. SIAM J. Control Optim. 45 (2006) 1198–1221.
  34. M. Hintermüller and M. Ulbrich, A mesh-independence result for semismooth Newton methods. Math. Program., Ser. B 101 (2004) 151–184.
  35. M. Leshno, V.Y. Lin, A. Pinkus and S. Schocken, Multilayer feedforward networks with a nonpolynomial activation function can approximate any function. Neural Netw. 6 (1993) 861–867.
  36. J.-L. Lions, Optimal control of systems governed by partial differential equations. Translated from the French by S.K. Mitter. Die Grundlehren der mathematischen Wissenschaften, Band 170. Springer-Verlag, New York-Berlin (1971).
  37. X. Liu, X. Han, N. Zhang and Q. Liu, Certified monotonic neural networks. In Vol. 33 of Advances in Neural Information Processing Systems, edited by H. Larochelle, M. Ranzato, R. Hadsell, M.F. Balcan and H. Lin (2020) 15427–15438.
  38. Z. Long, Y. Lu, X. Ma and B. Dong, PDE-Net: Learning PDEs from data. Proc. Mach. Learn. Res. 80 (2018) 3208–3216.
  39. S. Lu and J. Flemming, Convergence rate analysis of Tikhonov regularization for nonlinear ill-posed problems with noisy operators. Inverse Probl. 28 (2012) 104003.
  40. D. Ma, V. Gulani, N. Seiberlich, K. Liu, J. Sunshine, J.L. Duerk and M.A. Griswold, Magnetic resonance fingerprinting. Nature 495 (2013) 187–193.
  41. D.J.C. MacKay, Bayesian interpolation. Neural Comput. 4 (1992) 415–447.
  42. G. Mazor, L. Weizman, A. Tal and Y.C. Eldar, Low-rank magnetic resonance fingerprinting. Med. Phys. 45 (2018) 4066–4084.
  43. P. Neittaanmäki, J. Sprekels and D. Tiba, Optimization of elliptic systems: Theory and applications. Springer Monographs in Mathematics. Springer, New York (2006).
  44. D. Nguyen and B. Widrow, Improving the learning speed of 2-layer neural networks by choosing initial values of the adaptive weights. 1990 IJCNN International Joint Conference on Neural Networks 3 (1990) 21–26.
  45. A. Pinkus, Approximation theory of the MLP model in neural networks. Acta Numer. 8 (1999) 143–195.
  46. M.J.D. Powell, A view of unconstrained optimization. In Optimization in Action, edited by L.C.W. Dixon. Academic Press, London and New York (1976) 117–152.
  47. T. Qin, K. Wu and D. Xiu, Data driven governing equations approximation using deep neural networks. J. Comput. Phys. 395 (2019) 620–635.
  48. D. Ralph, Global convergence of damped Newton's method for nonsmooth equations via the path search. Math. Oper. Res. 19 (1994) 352–389.
  49. K. Scheffler, A pictorial description of steady-states in rapid magnetic resonance imaging. Concepts Magn. Reson. 11 (1999) 291–304.
  50. J. Sirignano and K. Spiliopoulos, DGM: A deep learning algorithm for solving partial differential equations. J. Comput. Phys. 375 (2018) 1339–1364.
  51. A. Sivaraman, G. Farnadi, T. Millstein and G. Van den Broeck, Counterexample-guided learning of monotonic neural networks. Preprint arXiv:2006.08852 (2020).
  52. G. Teschl, Ordinary Differential Equations and Dynamical Systems. Vol. 140 of Graduate Studies in Mathematics. American Mathematical Society, first edition (2012).
  53. F. Tröltzsch, Optimal Control of Partial Differential Equations: Theory, Methods and Applications. Vol. 112 of Graduate Studies in Mathematics. American Mathematical Society (2010).
  54. J. Zowe and S. Kurcyusz, Regularity and stability for the mathematical programming problem in Banach spaces. Appl. Math. Optim. 5 (1979) 49–62.
