
• CommentRowNumber1.
• CommentAuthorUrs
• CommentTimeOct 15th 2019

Stub. For the moment just for providing a place to record this reference:

• Jean Thierry-Mieg, Connections between physics, mathematics and deep learning, Letters in High Energy Physics, vol 2 no 3 (2019) (doi:10.31526/lhep.3.2019.110)
• CommentRowNumber2.
• CommentAuthorUrs
• CommentTimeOct 15th 2019

added these references on the learning algorithm as analogous to the AdS/CFT correspondence:

• Yi-Zhuang You, Zhao Yang, Xiao-Liang Qi, Machine Learning Spatial Geometry from Entanglement Features, Phys. Rev. B 97, 045153 (2018) (arXiv:1709.01223)

• W. C. Gan and F. W. Shu, Holography as deep learning, Int. J. Mod. Phys. D 26, no. 12, 1743020 (2017) (arXiv:1705.05750)

• J. W. Lee, Quantum fields as deep learning (arXiv:1708.07408)

• Koji Hashimoto, Sotaro Sugishita, Akinori Tanaka, Akio Tomiya, Deep Learning and AdS/CFT, Phys. Rev. D 98, 046019 (2018) (arXiv:1802.08313)

• CommentRowNumber3.
• CommentAuthorDavid_Corfield
• CommentTimeOct 15th 2019
• (edited Oct 15th 2019)

I never got round to looking at

• Brendan Fong, David Spivak, Rémy Tuyéras, Backprop as Functor: A compositional perspective on supervised learning (arXiv:1711.10455)
• CommentRowNumber4.
• CommentAuthorUrs
• CommentTimeOct 15th 2019

What I don’t understand yet in HSTT 18 is where the non-linear activation functions enter the story, i.e. how what they have differs from a discretized solution of a differential equation. But I don’t really have time to look into this properly.
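For comparison, the relation in question can be stated concretely: a residual-network layer of the form x ↦ x + h·σ(Wx + b) is exactly a forward-Euler step of the ODE dx/dt = σ(Wx + b), with the nonlinear activation σ sitting inside the vector field. A minimal sketch (shapes, step size and the choice of tanh are illustrative, not taken from HSTT 18):

```python
import numpy as np

def residual_layer(x, W, b, h=0.1):
    """One residual-network layer: a forward-Euler step of
    dx/dt = tanh(W x + b), with the nonlinearity inside the vector field."""
    return x + h * np.tanh(W @ x + b)

rng = np.random.default_rng(0)
W = rng.standard_normal((4, 4)) * 0.1  # illustrative weights
b = np.zeros(4)

# Composing 10 layers with step size h = 0.1 approximates
# the ODE flow from t = 0 to t = 1.
x = np.ones(4)
for _ in range(10):
    x = residual_layer(x, W, b)
```

In this picture "removing the activation" would mean integrating a linear ODE, so the question of where σ appears is the question of where the discretization becomes genuinely nonlinear.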

• CommentRowNumber5.
• CommentAuthorDavid_Corfield
• CommentTimeOct 16th 2019

Added that article in #3.

• CommentRowNumber6.
• CommentAuthorDavid_Corfield
• CommentTimeMar 4th 2021

Added two more category-theoretic treatments

Removed the redirect to ’machine learning’, as this is far more general.


• CommentRowNumber8.
• CommentAuthorUrs
• CommentTimeMar 4th 2021

Sorry for raising a trivial point on formatting:

In a reference, let’s not have a comma before the parenthesis with the arXiv number, it doesn’t seem to be needed. What do you think?

• CommentRowNumber9.
• CommentAuthorDavid_Corfield
• CommentTimeMar 4th 2021

But I would always punctuate after a title

• Corfield, D. 2003. Towards a philosophy of real mathematics. CUP.

or something like that.

• CommentRowNumber10.
• CommentAuthorUrs
• CommentTimeMar 4th 2021

I see. That reminds me that we should use Richard’s new bibtex-like functionality to harmonize formatting. Maybe once that is a little more convenient to use: I just tried to offer it the bibtex data as produced by the arXiv

    @misc{spivak2021learners,
      title={Learners' languages},
      author={David I. Spivak},
      year={2021},
      eprint={2103.01189},
      archivePrefix={arXiv},
      primaryClass={math.CT}
    }


but it does not swallow that.

1. Yes, it only accepts the ’article’ document type for now. But completely agree this is exactly the kind of thing the bibliography is for :-)! I need to complete the ability to edit references in the bibliography, and then I will add support for all the common document types. Have had to focus on other things in the nLab software recently, but I’ll work on this when I have a chance.

• CommentRowNumber12.
• CommentAuthorUrs
• CommentTimeMar 4th 2021

Thanks. In fact article would be more sensible. To refer to an arXiv preprint as a “miscellaneous” reference is a weird anachronism!

So I am happy to stick with your (currently) supported fields!

But when I go to the edit pane you made, all I get to see is a big white box and no indication what to do.

If you could just make the edit pane show a rudimentary template of fields into which the user could then type their data, that would already get us started!

2. I will indeed add something like this, but have not had a chance yet. Am prioritising the fundamental functionality first; I wouldn’t be averse to adding something sooner, but I need to think it through a bit (I’d rather not put in place some quick hack that people get used to, which later causes problems!).

For example, I think we probably should stick to BibTeX’s convention of having ’article’ refer only to published articles, because this allows validation: we can require that a journal, etc., is given, and in fact such a requirement is already implemented. Of course we could make up our own new document type such as ’preprint’ or something (but allow use of ’misc’ or ’unpublished’, converting to ’preprint’ behind the scenes if, for example, the arXiv field is present).
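A sketch of the two entry types being discussed, using references from this thread — a published ’article’ carrying the required journal data, and a hypothetical ’preprint’ type carrying only the arXiv identifier (the field names and the ’preprint’ type are illustrative, not the nLab software’s actual schema):

```bibtex
@article{hashimoto2018deep,
  author  = {Koji Hashimoto and Sotaro Sugishita and Akinori Tanaka and Akio Tomiya},
  title   = {Deep Learning and AdS/CFT},
  journal = {Phys. Rev. D},
  volume  = {98},
  pages   = {046019},
  year    = {2018},
  eprint  = {1802.08313}
}

@preprint{spivak2021learners,
  author = {David I. Spivak},
  title  = {Learners' languages},
  year   = {2021},
  eprint = {2103.01189}
}
```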

• CommentRowNumber14.
• CommentAuthorLuigi
• CommentTimeApr 29th 2021

Added a small mention of the relation with renormalisation group flow

• CommentRowNumber15.
• CommentAuthorUrs
• CommentTimeApr 29th 2021
• (edited Apr 29th 2021)

Good. I have added these further references in this direction:

Further discussion on the relation of renormalization group flow to bulk flow in the context of the AdS/CFT correspondence:

• Yi-Zhuang You, Zhao Yang, Xiao-Liang Qi, Machine Learning Spatial Geometry from Entanglement Features, Phys. Rev. B 97, 045153 (2018) (arXiv:1709.01223)

• W. C. Gan and F. W. Shu, Holography as deep learning, Int. J. Mod. Phys. D 26, no. 12, 1743020 (2017) (arXiv:1705.05750)

• J. W. Lee, Quantum fields as deep learning (arXiv:1708.07408)

• Koji Hashimoto, Sotaro Sugishita, Akinori Tanaka, Akio Tomiya, Deep Learning and AdS/CFT, Phys. Rev. D 98, 046019 (2018) (arXiv:1802.08313)

• CommentRowNumber16.
• CommentAuthorUrs
• CommentTimeJun 21st 2021

added pointer to today’s:

• Daniel A. Roberts, Sho Yaida, Boris Hanin, The Principles of Deep Learning Theory, Cambridge University Press 2022 (arXiv:2106.10165)
3. adding information about how neural networks are related to differential equations/dynamical systems.

Anonymous

• CommentRowNumber18.
• CommentAuthorUrs
• CommentTimeJun 3rd 2022

Since revision 4 the Idea-section starts out with

A neural network is a class of functions used…

This seems a little strange. Maybe what is meant is:

Neural networks are a class of functions used…

But either way, the sentence conveys no information about the nature of neural networks.