    • CommentRowNumber1.
    • CommentAuthorUrs
    • CommentTimeOct 15th 2019

    Stub. For the moment just for providing a place to record this reference:

    • Jean Thierry-Mieg, Connections between physics, mathematics and deep learning, Letters in High Energy Physics, vol 2 no 3 (2019) (doi:10.31526/lhep.3.2019.110)

    v1, current

    • CommentRowNumber2.
    • CommentAuthorUrs
    • CommentTimeOct 15th 2019

    added these references viewing the learning algorithm as analogous to the AdS/CFT correspondence:

    • Yi-Zhuang You, Zhao Yang, Xiao-Liang Qi, Machine Learning Spatial Geometry from Entanglement Features, Phys. Rev. B 97, 045153 (2018) (arxiv:1709.01223)

    • W. C. Gan and F. W. Shu, Holography as deep learning, Int. J. Mod. Phys. D 26, no. 12, 1743020 (2017) (arXiv:1705.05750)

    • J. W. Lee, Quantum fields as deep learning (arXiv:1708.07408)

    • Koji Hashimoto, Sotaro Sugishita, Akinori Tanaka, Akio Tomiya, Deep Learning and AdS/CFT, Phys. Rev. D 98, 046019 (2018) (arxiv:1802.08313)

    v1, current

    • CommentRowNumber3.
    • CommentAuthorDavid_Corfield
    • CommentTimeOct 15th 2019
    • (edited Oct 15th 2019)

    I never got round to looking at

    • Brendan Fong, David Spivak, Rémy Tuyéras, Backprop as Functor: A compositional perspective on supervised learning (arXiv:1711.10455)

    • CommentRowNumber4.
    • CommentAuthorUrs
    • CommentTimeOct 15th 2019

    What I don’t understand yet in HSTT 18 is where the non-linear activation functions enter the story, i.e. how what they have differs from a discretized solution of a differential equation. But I don’t really have time to look into this properly.
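
    (For concreteness, one minimal way to state the comparison, in generic notation not taken from the paper: an Euler discretization of a flow $\dot x = F(x)$ takes steps

    $$x_{k+1} = x_k + \epsilon\, F(x_k),$$

    while a generic deep-network layer is of the form

    $$x_{k+1} = \sigma(W_k x_k + b_k)$$

    with $\sigma$ a fixed non-linear activation applied componentwise. The question is where $\sigma$ sits in the discretized-bulk picture.)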

    • CommentRowNumber5.
    • CommentAuthorDavid_Corfield
    • CommentTimeOct 16th 2019

    Added that article in #3.

    diff, v3, current

    • CommentRowNumber6.
    • CommentAuthorDavid_Corfield
    • CommentTimeMar 4th 2021

    Added two more category-theoretic treatments

    Removed the redirect to ’machine learning’, as this is far more general.

    diff, v7, current


    • CommentRowNumber8.
    • CommentAuthorUrs
    • CommentTimeMar 4th 2021

    Sorry for raising a trivial point on formatting:

    In a reference, let’s not have a comma before the parenthesis with the arXiv number; it doesn’t seem to be needed. What do you think?

    • CommentRowNumber9.
    • CommentAuthorDavid_Corfield
    • CommentTimeMar 4th 2021

    But I would always punctuate after a title:

    • Corfield, D. 2003. Towards a philosophy of real mathematics. CUP.

    or something like that.

    • CommentRowNumber10.
    • CommentAuthorUrs
    • CommentTimeMar 4th 2021

    I see. That reminds me that we should use Richard’s new BibTeX-like functionality to harmonize formatting. Maybe once that is a little more convenient to use: I just tried to offer it the BibTeX data as produced by the arXiv

      @misc{spivak2021learners,
            title={Learners' languages}, 
            author={David I. Spivak},
            year={2021},
            eprint={2103.01189},
            archivePrefix={arXiv},
            primaryClass={math.CT}
      }
    

    but it does not swallow that.

  1. Yes, it only accepts the ’article’ document type for now. But I completely agree this is exactly the kind of thing the bibliography is for :-)! I need to complete the ability to edit references in the bibliography, and then I will add support for all the common document types. I have had to focus on other things in the nLab software recently, but I’ll work on this when I have a chance.

    • CommentRowNumber12.
    • CommentAuthorUrs
    • CommentTimeMar 4th 2021

    Thanks. In fact ’article’ would be more sensible. To refer to an arXiv preprint as a “miscellaneous” reference is a weird anachronism!

    So I am happy to stick with your (currently) supported fields!

    But when I go to the edit pane you made, all I get to see is a big white box and no indication what to do.

    If you could just make the edit pane show a rudimentary template of fields into which the user could then type their data, that would already get us started!

  2. I will indeed add something like this, but have not had a chance yet. Am prioritising the fundamental functionality first; I wouldn’t be averse to adding something sooner, but I need to think it through a bit (I’d rather not put in place some quick hack that people get used to, which later causes problems!).

    For example, I think we probably should stick to BibTeX’s convention of having article only refer to published articles, because this allows validation: we can require that a journal, etc., is given, and in fact such a requirement is already implemented. Of course we could make up our own new document type such as ’preprint’ or something (but allow use of ’misc’ or ’unpublished’, converting to ’preprint’ behind the scenes if, for example, the arXiv field is present).
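
    For illustration, the two shapes this validation would distinguish might look as follows. This is only a sketch: the field names are modeled on the arXiv export above, and ’preprint’ is the hypothetical new type under discussion here, not a standard BibTeX type.

      @article{hashimoto2018deep,
            title={Deep Learning and AdS/CFT},
            author={Koji Hashimoto and Sotaro Sugishita and Akinori Tanaka and Akio Tomiya},
            journal={Phys. Rev. D},
            volume={98},
            pages={046019},
            year={2018},
            eprint={1802.08313},
            archivePrefix={arXiv}
      }

      @preprint{spivak2021learners,
            title={Learners' languages},
            author={David I. Spivak},
            year={2021},
            eprint={2103.01189},
            archivePrefix={arXiv},
            primaryClass={math.CT}
      }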

    • CommentRowNumber14.
    • CommentAuthorLuigi
    • CommentTimeApr 29th 2021

    Added a small mention of the relation with renormalisation group flow
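
    (Schematically, the analogy in this literature identifies network depth with renormalization scale, so that a layer map plays the role of one coarse-graining step,

    $$x_{\ell+1} = f_\ell(x_\ell) \qquad \leftrightarrow \qquad \text{one RG step at scale } \ell,$$

    and in the holographic variants the bulk radial direction parametrizes this flow.)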

    diff, v9, current

    • CommentRowNumber15.
    • CommentAuthorUrs
    • CommentTimeApr 29th 2021
    • (edited Apr 29th 2021)

    Good. I have added these further references in this direction:


    Further discussion under the relation of renormalization group flow to bulk-flow in the context of the AdS/CFT correspondence:

    • Yi-Zhuang You, Zhao Yang, Xiao-Liang Qi, Machine Learning Spatial Geometry from Entanglement Features, Phys. Rev. B 97, 045153 (2018) (arxiv:1709.01223)

    • W. C. Gan and F. W. Shu, Holography as deep learning, Int. J. Mod. Phys. D 26, no. 12, 1743020 (2017) (arXiv:1705.05750)

    • J. W. Lee, Quantum fields as deep learning (arXiv:1708.07408)

    • Koji Hashimoto, Sotaro Sugishita, Akinori Tanaka, Akio Tomiya, Deep Learning and AdS/CFT, Phys. Rev. D 98, 046019 (2018) (arxiv:1802.08313)


    diff, v10, current

    • CommentRowNumber16.
    • CommentAuthorUrs
    • CommentTimeJun 21st 2021

    added pointer to today’s:

    • Daniel A. Roberts, Sho Yaida, Boris Hanin, The Principles of Deep Learning Theory, Cambridge University Press 2022 (arXiv:2106.10165)

    diff, v15, current

  3. Adding information about how neural networks are related to differential equations/dynamical systems.
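
    (A minimal sketch of the relation in question: a residual layer

    $$x_{k+1} = x_k + f(x_k, \theta_k)$$

    is the explicit Euler step, with unit step size, of the dynamical system

    $$\dot x(t) = f\big(x(t), \theta(t)\big),$$

    so that deep residual networks can be read as discretized flows, and in the many-layer limit as neural ODEs.)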

    Anonymous

    diff, v18, current

    • CommentRowNumber18.
    • CommentAuthorUrs
    • CommentTimeJun 3rd 2022

    Since revision 4 the Idea-section starts out with

    A neural network is a class of functions used…

    This seems a little strange. Maybe what is meant is:

    Neural networks are a class of functions used…

    But either way, the sentence conveys no information about the nature of neural networks.

    diff, v19, current

    • CommentRowNumber19.
    • CommentAuthorDavid_Corfield
    • CommentTimeApr 24th 2023

    Added some references for topological deep learning

    • Ephy R. Love, Benjamin Filippenko, Vasileios Maroulas, Gunnar Carlsson, Topological Deep Learning (arXiv:2101.05778)

    • Mathilde Papillon, Sophia Sanborn, Mustafa Hajij, Nina Miolane, Architectures of Topological Deep Learning: A Survey on Topological Neural Networks (arXiv:2304.10031)

    • Mustafa Hajij et al., Topological Deep Learning: Going Beyond Graph Data (pdf)

    diff, v22, current

    • CommentRowNumber20.
    • CommentAuthorNikolajK
    • CommentTimeDec 29th 2023
    • (edited Dec 29th 2023)

    Added a breakdown of neural network Gaussian process (NNGP) results, neural tangent kernel (NTK) theory, and more recent approaches to QFT (neural network field theory, NNFT, and the latest paper in that direction).

    One could also think of making a separate page for neural tangent kernel theory and moving some of the large-width build-up there. There are some obvious reference links one could add there, and I might do so at some point. (Style-wise, I’d like to clean up the over-reliance on brackets in my paragraph and replace the explanation-by-comparison to classical mechanics with the actual formulas, though even the Wikipedia breakdown isn’t that bad.) There were already references in that direction, but no main text. Feel free to alter any running text.

    I’m personally mostly interested in the field theory and stochastics stuff, but the article could also bridge to information geometry results.
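
    (For orientation, the two kernels mentioned above, in generic notation: in the infinite-width limit at initialization the network function is a Gaussian process,

    $$f(x) \sim \mathcal{GP}\big(0,\, K(x,x')\big),$$

    with NNGP kernel $K$; while the neural tangent kernel

    $$\Theta(x,x') = \nabla_\theta f(x;\theta)^\top\, \nabla_\theta f(x';\theta)$$

    governs training by gradient flow: for square loss the outputs on the training set evolve as $\dot f_t = -\Theta\,(f_t - y)$, with $\Theta$ remaining constant in the infinite-width limit.)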

    diff, v23, current

    • CommentRowNumber21.
    • CommentAuthorDavid_Corfield
    • CommentTimeMay 14th 2024

    Added two articles

    • Bruno Gavranović, Paul Lessard, Andrew Dudzik, Tamara von Glehn, João G. M. Araújo, Petar Veličković, Categorical Deep Learning: An Algebraic Theory of Architectures [arXiv:2402.15332]

    • Theodore Papamarkou, Tolga Birdal, Michael Bronstein, Gunnar Carlsson, Justin Curry, Yue Gao, Mustafa Hajij, Roland Kwitt, Pietro Liò, Paolo Di Lorenzo, Vasileios Maroulas, Nina Miolane, Farzana Nasrin, Karthikeyan Natesan Ramamurthy, Bastian Rieck, Simone Scardapane, Michael T. Schaub, Petar Veličković, Bei Wang, Yusu Wang, Guo-Wei Wei, Ghada Zamzmi, Position Paper: Challenges and Opportunities in Topological Deep Learning [arXiv:2402.08871]

    diff, v28, current

  4. Added tags

    Fabio Zanasi

    diff, v29, current

  5. Updated reference

    Fabio Zanasi

    diff, v29, current