
    • CommentRowNumber1.
    • CommentAuthorDavid_Corfield
    • CommentTimeFeb 6th 2023

    A stub to note some references.

    v1, current

    • CommentRowNumber2.
    • CommentAuthorDavid_Corfield
    • CommentTimeFeb 6th 2023
    • (edited Feb 6th 2023)

    I remember during my machine learning phase thinking that deep neural networks should have more to tell us than support vector machines:

    Physicists have known for decades that the macroscopic behavior of the systems we care about is the consequence of critical points in the energy landscape: global behavior is dominated by the local behavior of a small set of singularities. This is true everywhere from statistical physics and condensed matter theory to string theory. Singular learning theory tells us that learning machines are no different: the geometry of singularities is fundamental to the dynamics of learning and generalization. (Hoogland).
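
    For concreteness, here is a rough sketch of the standard result behind this, in Watanabe's formulation and with notation chosen here: call a model $p(x|w)$ singular if the map $w \mapsto p(\cdot|w)$ is non-injective or its Fisher information matrix degenerates somewhere on the set of true parameters. Then, under Watanabe's standard assumptions, the Bayes free energy $F_n = -\log \int \prod_{i=1}^n p(x_i|w)\, \varphi(w)\, dw$ (with prior $\varphi$) satisfies

    $$ \mathbb{E}[F_n] \;=\; n S \;+\; \lambda \log n \;-\; (m-1)\log\log n \;+\; O(1) \,, $$

    where $S$ is the entropy of the true distribution, $\lambda$ is the real log canonical threshold of the Kullback–Leibler divergence $K(w) = \int q(x)\,\log\frac{q(x)}{p(x|w)}\, dx$, and $m$ is its multiplicity. Regular models have $\lambda = d/2$ and $m = 1$, recovering the BIC penalty; in the singular case $\lambda$ is computed from a resolution of the singularities of $K(w)$, which is the precise sense in which "global behavior is dominated by the local behavior of a small set of singularities".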

    • CommentRowNumber3.
    • CommentAuthorDavid_Corfield
    • CommentTimeApr 7th 2023

    Added reference to

    • Singular Learning Theory seminar, (webpage)

    diff, v3, current

    • CommentRowNumber4.
    • CommentAuthorUrs
    • CommentTimeApr 7th 2023

    have hyperlinked deep neural network and statistical distribution, and have given this and related entries the context menu “Probability theory”

    diff, v4, current

    • CommentRowNumber5.
    • CommentAuthorUrs
    • CommentTimeApr 7th 2023

    found links for these books:

    diff, v4, current

    • CommentRowNumber6.
    • CommentAuthorzskoda
    • CommentTimeApr 7th 2023
    • (edited Apr 7th 2023)

    Hông Vân Lê is using diffeological spaces for information geometry on singular statistical models (partly motivated by machine learning), but I do not know whether this relates directly to singular optimization problems as understood by Watanabe. The following are some references to her work.

    For singular statistical models (including those arising in machine learning) one needs a version of the Fisher metric that goes beyond manifolds; one possibility is the framework of diffeologies (the classical manifold formula that these works generalize is sketched after the references),

    • Hông Vân Lê, Diffeological statistical models and diffeological Hausdorff measures, video yt, slides pdf

    • Hông Vân Lê, Natural differentiable structures on statistical models and the Fisher metric, Information Geometry (2022) arXiv:2208.06539 doi

    • Hông Vân Lê, Alexey A. Tuzhilin, Nonparametric estimations and the diffeological Fisher metric, in: F. Barbaresco, F. Nielsen (eds.), Geometric Structures of Statistical Physics, Information Geometry, and Learning (SPIGL 2020), Springer Proceedings in Mathematics & Statistics 361, pp. 120–138, doi, arXiv:2011.13418

    In this paper, first, we survey the concept of diffeological Fisher metric and its naturality, using functorial language of probability morphisms, and slightly extending Lê’s theory in (Le2020) to include weakly $C^k$-diffeological statistical models. Then we introduce the resulting notions of the diffeological Fisher distance, the diffeological Hausdorff–Jeffrey measure and explain their role in classical and Bayesian nonparametric estimation problems in statistics.

    • Hông Vân Lê, Diffeological statistical models, the Fisher metric and probabilistic mappings, Mathematics 2020, 8(2), 167, arXiv:1912.02090
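
    As far as I understand, the classical formula these papers generalize is the Fisher metric on a smooth, identifiable parametric family $\{p(x|\theta)\}_{\theta \in \Theta}$ (a sketch, with notation chosen here):

    $$ g_{jk}(\theta) \;=\; \int_{\mathcal{X}} \partial_j \log p(x|\theta)\; \partial_k \log p(x|\theta)\; p(x|\theta)\, dx \,. $$

    This defines a Riemannian metric only when $\Theta$ is a manifold and $g$ is positive definite; for singular models the matrix degenerates, or the statistical model fails to be a manifold in the first place, and the diffeological Fisher metric of the references above is meant to make sense of $g$ in that generality.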