Generalized fixed point theory in b-metric spaces with applications to optimization and machine learning algorithms

Authors

  • Shilpa Patra, Department of Mathematics, Narajole Raj College
  • Sudipta Sarkar, Department of Mathematics, Heritage Institute of Technology
  • Kulbhushan Agnihotri, Department of Mathematics, Panjab University
  • Krishna Pada Das, Department of Mathematics, Mahadevananda Mahavidyalaya, Monirampore, P.O. Barrackpore, Kol-120

Abstract

This paper develops a comprehensive fixed point framework in b-metric spaces and demonstrates its relevance to modern optimization and machine learning. By introducing an auxiliary control function, we establish a generalized contraction condition ensuring existence, uniqueness, and geometric convergence of iterative schemes. The theoretical results extend the classical Banach contraction principle to settings where distances are non-Euclidean or structurally modified, providing greater flexibility for high-dimensional learning problems. Gradient descent, proximal, and inertial optimization algorithms are reformulated as fixed point iterations within this framework, yielding improved convergence guarantees. Applications to deep learning, particularly recurrent neural networks, highlight stability conditions based on spectral radius and Lipschitz properties. Furthermore, we show that fixed point theory naturally models equilibria in biological systems, including population dynamics and epidemiological models. The results unify classical mathematical models and contemporary data-driven methods, supporting the development of robust algorithms in generalized metric environments.
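The abstract's central idea, viewing an optimization algorithm as a fixed point iteration whose contractiveness yields geometric convergence, can be illustrated with a minimal sketch. This example is not from the paper itself: it assumes a strongly convex quadratic objective and the ordinary Euclidean metric, under which gradient descent is the map T(x) = x - α∇f(x), a contraction for suitable step sizes α, so the Banach contraction principle guarantees convergence to the unique minimizer.

```python
import numpy as np

# Illustrative example (assumed, not from the paper): gradient descent on the
# strongly convex quadratic f(x) = 0.5 x^T A x - b^T x, recast as the
# fixed-point iteration x_{k+1} = T(x_k) with T(x) = x - alpha * (A x - b).
# For 0 < alpha < 2 / lambda_max(A), T is a contraction, so iterates converge
# geometrically to the unique fixed point x* = A^{-1} b (the minimizer).

A = np.array([[3.0, 1.0], [1.0, 2.0]])   # symmetric positive definite
b = np.array([1.0, 1.0])
alpha = 0.2                               # step size within the contractive range

def T(x):
    """One gradient-descent step, viewed as a fixed-point map."""
    return x - alpha * (A @ x - b)

x = np.zeros(2)
for _ in range(200):
    x = T(x)

x_star = np.linalg.solve(A, b)            # exact fixed point / minimizer
print(np.allclose(x, x_star))             # True: geometric convergence to x*
```

Here the contraction factor is max over eigenvalues λ of |1 - αλ|, which is strictly below 1 for the chosen α; the paper's framework replaces the Euclidean distance with a b-metric and the plain contraction with a generalized condition governed by an auxiliary control function.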

Published

2026-02-28

How to Cite

Generalized fixed point theory in b-metric spaces with applications to optimization and machine learning algorithms. (2026). Nonlinear Studies, 33(1), 299-313. https://www.nonlinearstudies.com/index.php/nonlinear/article/view/4129