I tried to find material about symbolic, non-parametric methods for https://en.wikipedia.org/wiki/Bayesian_programming, which is a kind of grand unified theory for AI (especially given the "reinforcement learning as inference" paradigm), but I found nothing. That is why I came up with this musing: https://math.stackexchange.com/questions/3503933/do-translations-among-grammars-and-encodings-between-grammar-and-other-space-f. I hope there are people here who are interested in such generalizations and can answer.
Formal translation from one formal grammar into another is possible, e.g. using the framework of multilingual abstract categorial grammars; see https://www.grammaticalframework.org/doc/tutorial/gf-tutorial.html#toc28. Formal translation from a formal grammar into another space is also possible, e.g. the encoding of theorems as Gödel numbers. Obviously, such translations and encodings can be chained together.
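To fix intuition, here is a minimal Python sketch of such a chain, with a toy grammar, made-up symbol codes, and an illustrative translation of my own (none of this comes from any particular framework): an infix-to-prefix grammar translation composed with the classic prime-exponent Gödel encoding into the naturals.

```python
# Toy illustration: encode expressions of a tiny grammar as Goedel numbers, and
# chain a grammar-to-grammar translation with a grammar-to-number encoding.
# The grammar, the symbol codes, and both maps are illustrative only.

def first_primes(k):
    """Return the first k primes (for the classic prime-exponent encoding)."""
    primes, n = [], 2
    while len(primes) < k:
        if all(n % p for p in primes):
            primes.append(n)
        n += 1
    return primes

SYMBOL_CODE = {'+': 1, '*': 2, 'x': 3, 'y': 4, '1': 5}

def goedel_number(tokens):
    """Encode tokens s1..sk as 2^c(s1) * 3^c(s2) * ... * p_k^c(sk)."""
    n = 1
    for p, tok in zip(first_primes(len(tokens)), tokens):
        n *= p ** SYMBOL_CODE[tok]
    return n

def infix_to_prefix(expr):
    """A trivial grammar-to-grammar translation: (a, op, b) -> [op, a, b]."""
    a, op, b = expr
    return [op, a, b]

# Chaining: infix grammar -> prefix grammar -> natural number.
prefix = infix_to_prefix(('x', '+', 'y'))   # ['+', 'x', 'y']
print(prefix, goedel_number(prefix))        # 2^1 * 3^3 * 5^4 = 33750
```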
My question is about the existence of a framework for such translations/encodings and for chaining them.
Specifically, I would like to get the following answers from such a framework:
1) What are the possible languages (and their Chomsky-hierarchy positions, expressivity, complexity) for specifying translations and encodings?
2) Do such translations and encodings form a group structure?
There are various measures on the expressions of an arbitrary language, e.g. Kolmogorov complexity for programs in some language. What is the behaviour of concrete implementations of those measures from language to language, or from language to space, when a translation or encoding is applied? (See the note after this list.)
3) Is there some "canonical" grammar or "canonical" space with particularly nice properties, e.g. one that admits the shortest program lengths, or the simplest expressions of measure functions, such as the simplest expression for computing Kolmogorov complexity?
4) Maybe there exist optimal encodings for mapping mathematical-language expressions into real (multidimensional) space, so that https://en.wikipedia.org/wiki/Symbolic_regression can be solved in this real space using classical analysis, and the solution then mapped (decoded) back into the grammar of the mathematical language, solving the original problem? (See the sketch after this list.)
5) Maybe there exist optimal encodings for mapping mathematical-language expressions into a vector space (neural networks), so that the solutions of math problems can be mapped to solutions of NN learning problems?
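On question 2) and the measure behaviour, one classical fact is directly relevant (stated here as a pointer, from the standard Kolmogorov-complexity literature): the invariance theorem says that switching the description language changes the complexity by at most an additive constant.

```latex
% Invariance theorem: for any two universal description languages U and V there is
% a constant c_{U,V}, depending on U and V but not on x, such that for all x:
\[
  \lvert K_U(x) - K_V(x) \rvert \;\le\; c_{U,V}.
\]
```

So translations act on this particular measure only up to O(1), and a "canonical" grammar in the sense of question 3) can at best be canonical up to such constants.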
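On question 4), here is a minimal Python sketch of the encode-optimize-decode loop, with a deliberately naive encoding of my own (coefficients over a fixed monomial basis, fitted by least squares in real space, then decoded back to an expression by dropping near-zero coefficients). Finding genuinely optimal encodings is exactly what the question asks for; this only shows the shape of the pipeline.

```python
# Sketch of question 4's pipeline: grammar -> real space -> optimize -> decode back.
# The encoding (coefficient vectors over a fixed monomial basis) is illustrative only.
import numpy as np

BASIS = [lambda x: np.ones_like(x), lambda x: x, lambda x: x**2, lambda x: x**3]
NAMES = ['1', 'x', 'x^2', 'x^3']

def encode_and_fit(xs, ys):
    """'Encoding' side: the search over expressions becomes least squares in R^4."""
    A = np.stack([b(xs) for b in BASIS], axis=1)
    coeffs, *_ = np.linalg.lstsq(A, ys, rcond=None)
    return coeffs

def decode(coeffs, tol=1e-6):
    """'Decoding' side: map the real vector back to a symbolic expression string."""
    terms = [f"{c:.3g}*{n}" for c, n in zip(coeffs, NAMES) if abs(c) > tol]
    return " + ".join(terms) if terms else "0"

xs = np.linspace(-2, 2, 50)
ys = 3 * xs**2 + 1                       # hidden target expression
print(decode(encode_and_fit(xs, ys)))    # recovers something like "1*1 + 3*x^2"
```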
Of course, this is a lot to ask in one question, but the essence of my question is simple, narrow and very concrete: is there mathematics that studies the full set of transformations among grammars (translations) and of transformations language (grammar) -> real space (analysis) (encodings), and that studies the optimal elements of this set (the optimal translation, the optimal encoding)?
I am aware of institution theory, https://en.wikipedia.org/wiki/Institution_(computer_science), which has a category of signatures and functors that act as mappings among sentences (of some logic). Similarly, maybe category theory can be used for specifying languages/translations and spaces/encodings?
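To make that categorical reading concrete, a toy Python sketch of my own (an illustration of the idea, not institution theory itself): languages as objects, translations as morphisms, with composition only defined when source and target match; an encoding into a space is then just a morphism into a distinguished object.

```python
# Toy category of languages and translations: objects are language tags, morphisms
# are translation functions between them, composition is function composition.
from dataclasses import dataclass
from typing import Any, Callable

@dataclass
class Translation:
    src: str                      # source language (object)
    dst: str                      # target language (object)
    fn: Callable[[Any], Any]      # the actual translation map

    def __call__(self, expr):
        return self.fn(expr)

def compose(g: Translation, f: Translation) -> Translation:
    """g after f; only defined when f.dst == g.src, as in any category."""
    assert f.dst == g.src, "translations do not compose"
    return Translation(f.src, g.dst, lambda e: g.fn(f.fn(e)))

def identity(lang: str) -> Translation:
    return Translation(lang, lang, lambda e: e)

# Two chained arrows: infix -> prefix -> numeric token codes.
to_prefix = Translation('infix', 'prefix', lambda e: [e[1], e[0], e[2]])
to_codes  = Translation('prefix', 'numbers', lambda e: [ord(t) for t in e])

chained = compose(to_codes, to_prefix)
print(chained(('x', '+', 'y')))   # [43, 120, 121]
```

Note that in this picture translations form a category rather than a group: composition is partial and most translations have no inverse, which already suggests one answer to question 2).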
Is there some research in this direction?
Of course, these translations and encodings are themselves written in some kind of (meta)language and have their own Kolmogorov complexity, so some recursion arises here (higher-order categories?), but still: are there efforts to handle all of this?
Well, someone closed my Stack Exchange question, but it is good that it is still live here.
https://papers.nips.cc/paper/9308-demystifying-black-box-models-with-symbolic-metamodels is an extraordinary article about how to encode the universe of symbolic expressions/mathematical functions using a set of numerical parameters, and how to find the optimal expression (modelling the data) by doing gradient descent over those numerical parameters. It involves the Meijer G-function (and there are generalizations of it).
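A stripped-down caricature of that idea in Python, using a tiny hand-rolled parametric family f(x) = a * x^b as a stand-in for the paper's Meijer G-functions (which would need special-function machinery): fit the continuous parameters by gradient descent, then snap the exponent to an integer to read off a symbolic expression.

```python
# Minimal caricature of the symbolic-metamodel idea: a symbolic family indexed by
# continuous parameters, fitted by gradient descent, then read off symbolically.
# The family f(x) = a * x^b is an illustrative stand-in for Meijer G-functions.
import numpy as np

xs = np.linspace(0.5, 3.0, 60)
ys = 2.0 * xs**3                      # hidden target expression: 2 * x^3

log_a, b, lr = 0.0, 1.0, 0.1          # optimize in log space for stability
u, v = np.log(xs), np.log(ys)
for _ in range(2000):
    r = log_a + b * u - v             # residual of log f(x) = log a + b log x
    log_a -= lr * np.mean(2 * r)      # gradient of the mean squared log-error
    b     -= lr * np.mean(2 * r * u)

a = np.exp(log_a)
print(f"fitted a={a:.3f}, b={b:.3f}")               # approaches a=2, b=3
print(f"symbolic readout: {a:.2g} * x^{round(b)}")  # snap exponent to an integer
```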
I think this is an unbelievable achievement, because the lack of a more or less efficient encoding of symbolic math into numerical parameters was the last barrier between neural networks and artificial general intelligence. And now this barrier has been broken. 2020 should be an exciting year thanks to such ideas!