    • CommentRowNumber1.
    • CommentAuthortomr
    • CommentTimeJan 10th 2020

    I tried to find material about symbolic, non-parametric methods for https://en.wikipedia.org/wiki/Bayesian_programming, which is something of a grand unified theory (GUT) for AI (especially given the “reinforcement learning as inference” paradigm), but I found nothing. That is why I came up with this musing https://math.stackexchange.com/questions/3503933/do-translations-among-grammars-and-encodings-between-grammar-and-other-space-f and I hope there are people here who are interested in such generalizations and can answer.

    Formal translation from one formal grammar into another is possible, e.g. using the framework of multilingual abstract categorial grammars (see https://www.grammaticalframework.org/doc/tutorial/gf-tutorial.html#toc28). Formal translation from a formal grammar into another space is also possible, e.g. the encoding of theorems as Gödel numbers. Obviously, such translations and encodings can be chained together.
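    To make the Gödel-numbering example concrete, here is a minimal sketch (my own toy illustration, not taken from any particular framework; the alphabet and function names are made up) that encodes a formula over a finite alphabet as a single natural number via prime exponents, and decodes it back:

```python
from sympy import prime, factorint  # prime(i) = i-th prime, factorint(n) = {prime: exponent}

ALPHABET = "()∀∃¬∧∨→=Sxyz0"          # toy alphabet for formulas; symbol codes are 1-based

def goedel_number(formula: str) -> int:
    """Encode a formula as the product of p_i ** code(symbol_i)."""
    n = 1
    for i, symbol in enumerate(formula):
        code = ALPHABET.index(symbol) + 1
        n *= prime(i + 1) ** code
    return n

def decode(n: int) -> str:
    """Recover the formula from its Gödel number by factoring."""
    exponents = factorint(n)
    symbols = []
    i = 1
    while prime(i) in exponents:
        symbols.append(ALPHABET[int(exponents[prime(i)]) - 1])
        i += 1
    return "".join(symbols)

g = goedel_number("∀x(x=x)")
assert decode(g) == "∀x(x=x)"
```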

    My question is about the existence of a framework for such translations/encodings and for chaining them.

    Specifically, I would like such a framework to answer the following:

    1) What are the possible languages (and their Chomsky-type hierarchies, expressivity, complexity) for specifying translations and encodings?
    2) Do such translations and encodings form a group structure? There are various measures on the expressions of an arbitrary language, e.g. Kolmogorov complexity for programs in some language. How do concrete implementations of those measures behave from language to language, or from language to space, when a translation or encoding is applied? (See the invariance-theorem note after this list.)
    3) Is there some "canonical" grammar or "canonical" space with particularly nice properties, e.g. one that admits the shortest program lengths, or the simplest expressions of measure functions, such as the simplest expression for computing Kolmogorov complexity?
    4) Maybe there exist optimal encodings that map mathematical-language expressions into a real (multidimensional) space, so that https://en.wikipedia.org/wiki/Symbolic_regression can be solved in this real space using classical analysis, and the solution then mapped (decoded) back into the grammar of the mathematical language, solving the original problem?
    5) Maybe there exist optimal encodings that map mathematical-language expressions into a vector space (neural networks), so that the solution of math problems can be mapped to the solution of NN learning problems?
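    Regarding the behaviour of Kolmogorov complexity under translation in question 2, one standard fact worth recalling (my addition here, not part of the original question) is the invariance theorem: for any two universal description languages $U$ and $V$ there is a constant $c_{U,V}$, depending only on the pair of languages and not on the string, such that
    $$K_U(x) \le K_V(x) + c_{U,V} \qquad \text{for all strings } x,$$
    so translating between universal languages changes the measure by at most an additive constant.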
    

    Of course, this is a lot to ask in one question, but the essence of my question is simple, narrow and very concrete: is there mathematics that studies the full set of possible transformations among grammars (translations) and transformations language(grammar) -> real_space(analysis) (encodings), and that studies the optimal elements of this set (optimal translation, optimal encoding)?

    I am aware of institution theory (https://en.wikipedia.org/wiki/Institution_(computer_science)), which has a category of signatures together with a sentence functor that sends signature morphisms to mappings among sentences (of some logic). Similarly, maybe category theory can be used for specifying languages/translations and spaces/encodings?
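    As a toy illustration of this category-theoretic picture (entirely my own sketch, not an existing framework; the class `Translation`, the method `then` and the example translations are made up for illustration): take languages as objects and translations as morphisms, with chaining of translations/encodings as categorical composition.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass(frozen=True)
class Translation:
    """A morphism: a map from expressions of `source` to expressions of `target`."""
    source: str                      # name of the source language/grammar
    target: str                      # name of the target language/grammar
    apply: Callable[[str], str]      # the translation itself

    def then(self, other: "Translation") -> "Translation":
        """Composition of translations (chaining), defined when target matches source."""
        assert self.target == other.source, "translations do not compose"
        return Translation(self.source, other.target,
                           lambda expr: other.apply(self.apply(expr)))

def identity(lang: str) -> Translation:
    """Identity morphism on a language."""
    return Translation(lang, lang, lambda expr: expr)

# Two toy arrows: infix arithmetic -> Polish notation -> a crude numeric "encoding".
infix_to_polish = Translation("Infix", "Polish", lambda e: "+ 1 2" if e == "1+2" else e)
polish_to_code = Translation("Polish", "Numbers", lambda e: str(sum(map(ord, e))))

chained = infix_to_polish.then(polish_to_code)   # composite morphism Infix -> Numbers
print(chained.apply("1+2"))
```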

    Is there some research in this direction?

    Of course, those translations and encodings are themselves written in some kind of (meta)language and have their own Kolmogorov complexity, so some recursion arises here (higher-order categories?), but still: are there efforts to handle all of this?

    • CommentRowNumber2.
    • CommentAuthortomr
    • CommentTimeJan 12th 2020

    Well, someone closed my Stack Exchange question, but it is good that it is still live here.

    https://papers.nips.cc/paper/9308-demystifying-black-box-models-with-symbolic-metamodels is an extraordinary article on how to encode the universe of symbolic expressions/mathematical functions using a set of numerical parameters, and how to find the optimal expression (modelling the data) by doing gradient descent over those numerical parameters. It involves the Meijer G-function (and there are generalizations of it).
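    To illustrate only the general idea (a much simplified sketch of my own; it is not the metamodel construction from the paper, which parameterizes expressions via the Meijer G-function): fix a small parametric family of closed-form expressions, fit its continuous parameters by gradient descent, then read the fitted parameters back as a symbolic formula.

```python
import numpy as np

# Toy data generated from the "unknown" function 2.0 * x**1.5 + 1.0.
rng = np.random.default_rng(0)
x = np.linspace(0.5, 3.0, 200)
y = 2.0 * x**1.5 + 1.0 + 0.01 * rng.standard_normal(x.size)

# Parametric family f(x) = a * x**b + c; gradient descent on mean squared error.
a, b, c = 1.0, 1.0, 0.0
lr = 1e-3
for _ in range(20000):
    pred = a * x**b + c
    err = pred - y
    grad_a = 2 * np.mean(err * x**b)
    grad_b = 2 * np.mean(err * a * x**b * np.log(x))
    grad_c = 2 * np.mean(err)
    a, b, c = a - lr * grad_a, b - lr * grad_b, c - lr * grad_c

# Read the fitted continuous parameters back as a symbolic expression.
print(f"recovered expression: {a:.2f} * x**{b:.2f} + {c:.2f}")
```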

    I think this is an unbelievable achievement, because the lack of a more or less efficient encoding of symbolic math into numerical parameters was the last barrier between neural networks and artificial general intelligence. And now this barrier has been broken. 2020 should be an exciting year thanks to such ideas!