
Welcome to nForum
    • CommentRowNumber1.
    • CommentAuthortomr
    • CommentTimeJan 10th 2020

    I tried to find material on symbolic, non-parametric methods that could serve as a kind of GUT for AI (especially given the "reinforcement learning as inference" paradigm), but I found nothing, which is why I came up with this musing. I hope there are people here who are interested in such generalizations and can answer.

    Formal translation from one formal grammar into another grammar is possible, e.g. using the framework of multilingual abstract categorial grammars. Formal translation from a grammar into another space is also possible, e.g. encoding theorems as Gödel numbers. Obviously, such translations and encodings can be chained together.
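To make the grammar-to-space direction concrete, here is a minimal sketch of classic Gödel numbering: each symbol gets a positive integer code, and the string s_1 s_2 … s_n is encoded as 2^c(s_1) · 3^c(s_2) · 5^c(s_3) · …. The toy alphabet below is a made-up example, not something from the discussion.

```python
# Classic Goedel numbering: encode a symbol string as a product of
# prime powers, and decode by factoring. The alphabet is a toy example.

ALPHABET = {"0": 1, "S": 2, "=": 3, "+": 4, "(": 5, ")": 6}

def primes():
    """Yield 2, 3, 5, 7, ... by trial division (fine for short strings)."""
    found = []
    n = 2
    while True:
        if all(n % p for p in found):
            found.append(n)
            yield n
        n += 1

def goedel_encode(expr: str) -> int:
    """Encode a string over ALPHABET as a single natural number."""
    code = 1
    for symbol, p in zip(expr, primes()):
        code *= p ** ALPHABET[symbol]
    return code

def goedel_decode(code: int) -> str:
    """Recover the string by reading off prime-power exponents."""
    inverse = {v: k for k, v in ALPHABET.items()}
    out = []
    for p in primes():
        if code == 1:
            break
        exp = 0
        while code % p == 0:
            code //= p
            exp += 1
        out.append(inverse[exp])
    return "".join(out)
```

Such encodings compose: one can Gödel-encode the output of a grammar-to-grammar translation, which is the kind of chaining mentioned above.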

    My question is about the existence of a framework for such translations/encodings and for chaining them.

    Specifically, I would like such a framework to answer the following:

    1) What are the possible languages (and their Chomsky-hierarchy positions, expressivity, complexity) for specifying translations and encodings?
    2) Do such translations and encodings form a group structure?
    3) There are various measures on expressions in an arbitrary language, e.g. Kolmogorov complexity for programs in some language. How do concrete implementations of those measures behave from language to language, or from language to space, when a translation or encoding is applied?
    4) Is there some "canonical" grammar or "canonical" space with particularly nice properties, i.e. one that admits the shortest program lengths, or the simplest expressions of the measure functions, such as the simplest expression for computing Kolmogorov complexity?
    5) Maybe there exist optimal encodings for mapping mathematical-language expressions into a real (multidimensional) space, so that a problem can be solved in this real space using classical analysis and the solution then mapped (decoded) back into the grammar of the mathematical language, and the problem thereby solved?
    6) Maybe there exist optimal encodings for mapping mathematical-language expressions into a vector space (neural networks), so that the solutions of math problems map to solutions of NN learning problems?
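On the question of how complexity measures behave across languages: Kolmogorov complexity is uncomputable, but compressed length gives an upper bound, and comparing two compressors illustrates the invariance theorem, i.e. complexities measured relative to two description languages differ by at most a constant that depends on the languages but not on the string. In this sketch zlib and bz2 stand in for two different "languages"; this is an illustration I am adding, not a claim from the discussion.

```python
# Approximate "Kolmogorov complexity" of a byte string via two different
# compressors, standing in for two description languages. The invariance
# theorem says the two measures differ by a string-independent constant.

import bz2
import zlib

def K_zlib(s: bytes) -> int:
    """Upper bound on K(s) relative to the zlib 'language'."""
    return len(zlib.compress(s, 9))

def K_bz2(s: bytes) -> int:
    """Upper bound on K(s) relative to the bz2 'language'."""
    return len(bz2.compress(s, 9))

samples = [
    b"a" * 10_000,                  # highly regular: tiny in both languages
    bytes(range(256)) * 40,         # periodic
    b"the quick brown fox " * 500,  # repetitive text
]

for s in samples:
    print(len(s), K_zlib(s), K_bz2(s), abs(K_zlib(s) - K_bz2(s)))
```

Regular strings compress to a small size under both measures, and the gap between the two measures stays small relative to the string length, which is the "translation changes complexity only by a constant" phenomenon asked about in 3).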

    Of course, this is a lot to ask in one question, but the essence of my question is simple, narrow and very concrete: is there mathematics that studies the full set of possible transformations among grammars (translations) and transformations language(grammar)->real_space(analysis) (encodings), and that studies the optimal elements of this set (optimal translation, optimal encoding)?

    I am aware of institution theory, which has a category of signatures with functors as mappings among sentences (of some logics). Similarly, maybe category theory can be used for specifying languages/translations and spaces/encodings?

    Is there some research in this direction?

    Of course, those translations and encodings are themselves written in some kind of (meta)language and have their own Kolmogorov complexity, so some recursion arises here (higher-order categories?), but still: are there efforts to handle all of this?

    • CommentRowNumber2.
    • CommentAuthortomr
    • CommentTimeJan 12th 2020

    Well, someone closed my Stack Exchange question, but it is good that it is still live here. There is an extraordinary article about how to encode the universe of symbolic expressions/mathematical functions using a set of numerical parameters and how to find the optimal expression (modelling data) by doing gradient descent over those numerical parameters. It involves the Meijer G-function (and there are generalizations of it).
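A toy version of that idea: a family of symbolic expressions is indexed by continuous parameters, and gradient descent over those parameters selects the expression that best models the data. The approach described above uses the Meijer G-function as the parametric family; the sketch below substitutes a much smaller made-up family, f(x) = a·sin(b·x) + c, optimized with finite-difference gradients. It is purely illustrative, not the method from the article.

```python
# Fit the parameters of a small symbolic family to data by gradient
# descent, using finite-difference gradients. The family a*sin(b*x) + c
# is a stand-in for the Meijer-G-based family used in the real work.

import math

def f(theta, x):
    a, b, c = theta
    return a * math.sin(b * x) + c

def loss(theta, data):
    """Mean squared error of the symbolic expression on the data."""
    return sum((f(theta, x) - y) ** 2 for x, y in data) / len(data)

# Synthetic data generated from a=2, b=3, c=0.5 on x in [0, 2).
data = [(x / 50, 2 * math.sin(3 * x / 50) + 0.5) for x in range(100)]

theta = [1.0, 2.5, 0.0]  # start near the truth to keep the demo short
lr, eps = 0.05, 1e-6
for _ in range(2000):
    grad = []
    for i in range(3):
        bumped = theta[:]
        bumped[i] += eps
        grad.append((loss(bumped, data) - loss(theta, data)) / eps)
    theta = [t - lr * g for t, g in zip(theta, grad)]

print(theta, loss(theta, data))
```

The final theta can then be read back as a symbolic expression, which is the decode step of the encoding discussed in the first comment.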

    I think this is an unbelievable achievement, because the lack of a more or less efficient encoding of symbolic math into numerical parameters was the last barrier between neural networks and artificial general intelligence. And now this barrier has been broken. 2020 should be an exciting year thanks to such ideas!