So if this is a critique of Chomskyan grammar, then at least it ought to mention Chomskyan grammar, rather than expunge such mention outright.
I am tempted just to roll back to the previous revision. The usual modus operandi here at the nLab is not simply to erase the efforts of previous contributors, but instead to add to them where possible. I encourage you to find a decorous way of doing so.
I actually feel that the entry is better after the edit. Montague’s work is not a continuation of Chomsky’s, and so I agree with the editor that it is better not to mention it than to have something potentially misleading.
According to Wikipedia’s entry Montague grammar, the quote by Montague which currently constitutes half of the nLab entry continues with the very words:
and mathematically precise theory. On this point I differ from a number of philosophers, but agree, I believe, with Chomsky and his associates.
The SEP entry Montague semantics sees Montague as following in the footsteps of Chomsky:
constituted a revolution: after the Chomskyan revolution that brought mathematical methods into syntax, now such methods were introduced in semantics.
and sees the main difference in whether linguistics is “regarded” as a branch of psychology or of mathematics:
When Montague grammar emerged, the leading theory about syntax was Chomskyan grammar. That approach claimed that it revealed processes that went on in the brain, and that linguistics was a branch of biology. In those days it was experimentally shown that the passive transformation was a real process in the brain. Chomskyan grammar still is a leading theory, and although most of the theory has changed considerably (there is no passive transformation anymore), it still considers itself to be revealing psychologically real processes. Montague had no psychological claim for his theory; on the contrary, he considered linguistics as a branch of mathematics and not of psychology.
In any case, neither Wikipedia nor the SEP say that Montague’s approach is based on “compositionality” (whatever that means; the nLab entry claims this without substantiating it), but both are clear that it is based on formal logic. I’d urge that this be clarified in the nLab entry, lest it appear to be hijacking a well-established concept. The role of the “compositionality” newspeak should either be substantiated or be removed.
Compositionality has a decent philosophical pedigree (see SEP: compositionality), and Montague features in that tradition:
Montague (1970) suggested a perspicuous way to capture the principle of compositionality formally. The key idea is that compositionality requires the existence of a homomorphism between the expressions of a language and the meanings of those expressions.
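To make the homomorphism idea concrete, here is a minimal toy sketch (the fragment, the model and all names are invented for illustration; this is not Montague’s actual fragment): each syntactic constructor is paired with a semantic operation and interpretation is defined by structural recursion, so the meaning of the whole is determined by the meanings of its parts and the way they are combined.

```haskell
type Entity = String

-- a toy syntax algebra
data NP = Mary | John                -- noun phrases
data VP = Walks | Likes NP           -- verb phrases
data S  = Pred NP VP | SAnd S S      -- sentences

-- the meaning algebra: NPs denote entities, VPs denote predicates,
-- sentences denote truth values
npMeaning :: NP -> Entity
npMeaning Mary = "mary"
npMeaning John = "john"

vpMeaning :: VP -> (Entity -> Bool)
vpMeaning Walks      = \x -> x == "mary"        -- toy model: only Mary walks
vpMeaning (Likes np) = \x -> x /= npMeaning np  -- toy model: everyone likes everyone else

-- the interpretation is a homomorphism: each syntactic constructor
-- is sent to a corresponding semantic operation
sMeaning :: S -> Bool
sMeaning (Pred np vp) = vpMeaning vp (npMeaning np)  -- function application
sMeaning (SAnd s t)   = sMeaning s && sMeaning t     -- conjunction

-- sMeaning (SAnd (Pred Mary Walks) (Pred John (Likes Mary)))  ==>  True
```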
Montague’s work was highly innovative. Nobody at the time thought that one could treat natural language semantics formally. There was no ’obvious step’ from syntactics (where Chomsky had done work along with many others) to semantics.
If one reads the full text of the work where the first quote that Urs cites comes from, it is clear that the mention of Chomsky is a single aside; Montague’s contributions were completely new.
I see, thanks. So I have reorganized the References-section at compositionality so as to avoid the impression that the origin of the term is a recent workshop.
Compositionality is a fundamental concept in linguistics, going back to Frege. It basically just means that the meaning of something is composed of the meaning of its parts. This point of view is essential to the formal way in which Montague semantics is defined (as it is to almost all work in formal linguistics).
Edited: our posts crossed :-).
Richard, thanks, but please add that to compositionality!
Am just passing by this subject, but, for what it’s worth, I am left with the impression that it comes down to first implicitly deciding to ignore those aspects of natural language that are crucially different from formal languages (poetry is not compositional…) and then concluding from that assumption that natural language is not essentially different from formal language.
This matches my general impression that it is easy to fall into the trap of shoehorning emergent aspects of the wet, complex macroscopic world into mathematical tools that are only really applicable to the dry, crystalline nature of the microscopic world.
For what it’s worth, I’ve travelled somewhat along the path from your scepticism to something more positive, especially if dependent type theory is used, thanks to Ranta:
In Stockholm, when I first discussed the project with Per Martin-Löf, he said that he had designed type theory for mathematics, and that natural language is something else. I said that similar work had been done within predicate calculus, which is just a part of type theory, to which he replied that he found it equally problematic. But his general attitude was far from discouraging: it was more that he was so serious about natural language and saw the problems of my enterprise more clearly than I, who had already assumed the point of view of logical semantics. His criticism was penetrating but patient, and he was generous in telling me about his own ideas. So we gradually developed a view that satisfied both of us, that formal grammar begins with what is well understood formally, and then tries to see how this formal structure is manifested in natural language, instead of starting with natural language in all its unlimitedness and trying to force it into some given formalism. (Ranta)
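To give a flavour of what “beginning with what is well understood formally” looks like in this setting (the example is the standard textbook one from type-theoretical grammar, not taken from this thread): indefinites are rendered as Σ-types and universals as Π-types, e.g.

$$
\text{a man walks} \;\rightsquigarrow\; \sum_{x \colon \mathrm{Man}} \mathrm{Walks}(x),
\qquad
\text{every man walks} \;\rightsquigarrow\; \prod_{x \colon \mathrm{Man}} \mathrm{Walks}(x).
$$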
Speculatively, if DTT is there in our language, it could provide the first step in the passages described in Gell-Mann’s third quote and Hartle’s second at Nature conformable to herself.
You yourself have described the passage taken in physics as refinement of the dependent type theory, e.g., here.
How about this resolution: it’s exceedingly useful to isolate precise parts within natural language and, conversely, to attain the ability to speak with precision in natural language, say when speaking “informal type theory”. But what seems clearly wrong is to behave as if this precise fragment could be all of natural language. That’s the I-have-a-hammer-so-I-see-everything-as-nails fallacy, isn’t it?
Right. There’s got to be some kind of middle ground. But I’m amazed so many philosophers thought they were getting anywhere towards this with an untyped predicate logic.
Sure, I completely agree. In #12 I was reflecting on my encounter (just a few hours old, admittedly) with Montague, who apparently meant to make the sweeping claim that natural language in its entirety may be thought of as not much different from a formal language. That seems blatantly wrong, and clearly motivated by nothing but the wish that it were so, much like the economist who wishes their subjects behaved as rational agents.
A priori I don’t see it as any more sweeping than the idea that mathematics can be thought of as a formal language. Both are simplifications based on a formalist/logicist philosophical point of view. There is a vast field of ’computational linguistics’ that makes practical use in an essential way of formal approaches both to syntax and semantics, the latter originating with Montague.
The difference between mathematics and natural language is that the latter expresses all of the former and, on top of that, joking, cursing, babbling, praying, mumbling, stuttering, weeping, irony, poetry, litany, blasphemy, etc. pp.
’Contrariwise,’ continued Tweedledee, ’if it was so, it might be; and if it were so, it would be; but as it isn’t, it ain’t. That’s logic.’
Yes, I realised that this is what you meant, and was observing that the distinction is less than one might think. Geometric topology is undoubtedly mathematics, but it is a convenient fiction that any existing formal language for mathematics accurately expresses it. Similarly for natural language: there is a logical structure to much of it, and one can study formalisations of that structure. Part of what linguists do is study nuances of the kind you mention. And, as I mentioned, one measure of the point of doing something is its usefulness, and computational linguistics for example, which is definitely useful, relies on formalisations of natural language syntax and semantics.
Re #18, yes, that was quite a departure when ’ordinary language philosophy’ started to take note of functions of language other than description, such as promising. Regarding cursing, there’s an attempt to use monads to capture curses as side effects, e.g., to explain why, when we deny a statement containing ’damned’, the force of ’damned’ remains.
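Just to illustrate the flavour of that idea, here is a made-up minimal sketch using a writer monad (not the actual formalism of the work alluded to): the truth-conditional content lives in the value, the expressive content lives in a side channel, and negation acts only on the value, so the force of ’damned’ survives.

```haskell
import Control.Monad.Writer

-- truth-conditional content paired with a side channel of expressive content
type Expressive a = Writer [String] a

-- hypothetical toy lexical entries
damned :: String -> Expressive String
damned n = do
  tell ["speaker is annoyed about the " ++ n]  -- expressive content as a side effect
  return n                                     -- descriptive content passes through unchanged

stole :: String -> String -> Expressive Bool
stole _agent _thing = return True              -- toy model: the bare proposition holds

-- negation acts only on the truth-conditional value,
-- leaving the expressive side channel untouched
neg :: Expressive Bool -> Expressive Bool
neg = fmap not

-- "Sam didn't steal that damned bike"
example :: (Bool, [String])
example = runWriter (neg (damned "bike" >>= stole "Sam"))
-- ==> (False, ["speaker is annoyed about the bike"])
```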
But even sticking with description, it seems to me that we spend a lot of time speaking about activities which lead to changes of state, and I don’t see that well represented by typical formalisms of philosophers.
For those not aware, it should also maybe be said that Montague’s formalism, for example, uses modal logic in a fundamental way and is also higher-order; I think it is wrong to assume that he was blissfully unaware of the subtleties of natural language vs. mathematics!
Richard, David, thanks for insisting. I certainly appreciate the points you are making; it’s just that I still feel all these niceties only apply to that fragment of natural language to which they apply, tautologically. Maybe it’s a moot point, but the Carrollian lawlessness of natural language as in #18 is not going to be part of that fragment. There is a lesson to be learned from how Google Translate overcame decades of vain attempts to shoehorn natural language into compositional rules by forgetting about rules altogether and passing to the free association of deep nets.
Out of interest, can you understand what Google Translate translates as the following?
Ich habe zwei Tage gebraucht, um in 60 Sekunden mehr als eine Stunde lang das Spielen des Minutenwalzers zu lernen.
I’m claiming somewhere that (in the English version) parsing relies on an ’iteration’ function from the type of accomplishments to the type of activities.
(The reverse translation gets the order wrong: ’It took me two days to learn to play the minute reel for over an hour in 60 seconds’.)
David, now you are making my point, no? I think natural understanding of natural language works by grabbing all the ingredients offered and trying to connect them in any way that makes sense, using a wealth of background knowledge, with whatever rules are adhered to being there just to ease that process by providing further background structure to go by. That’s why poetry, irony and humor can even exist. If natural language were completely based on compositional rules, we’d be conversing entirely as in law texts or, of course, as in maths texts. Natural language can do that, and it’s good to be able to do that when necessary, but that’s not all there is to it.
But I have said this before, and I take it that it’s not really controversial, and I need to be looking into something else, and should bow out of this discussion here, and I don’t even know if this sentence still adheres to the rules of grammar, as it seems to keep rambling on, and so it’s probably time to quit and which rule do I break by ending with an emoticon, but anyways :-)
Yes, time to quit. But just to point out that the German was generated by a perfectly understandable English sentence:
It took me two days to learn to play the Minute Waltz in 60 seconds for over an hour.
Parsing is achieved by knowing that ’in’ with times occurs with accomplishments, and ’for’ with times occurs with activities. The only solution is to transform the accomplishment ’play the Minute Waltz in 60 seconds’ into an activity by iteration.
There is an intricate, and largely implicit, conception of time, process, event, state, etc. embodied in our instinctive use of natural language.
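For what it’s worth, here is how I picture that ’iteration’ coercion, as a made-up typed sketch (all names are invented for illustration, not a claim about any particular formalism): ’in’-adverbials select accomplishments, ’for’-adverbials select activities, and iteration mediates between the two.

```haskell
-- Vendler-style event classes as distinct types (a toy encoding)
newtype Accomplishment = Accomplishment String
newtype Activity       = Activity String

-- "in 60 seconds" combines with an accomplishment description
inSeconds :: Int -> String -> Accomplishment
inSeconds n ev = Accomplishment (ev ++ " in " ++ show n ++ " seconds")

-- the coercion doing the work: repeating an accomplishment yields an activity
iterated :: Accomplishment -> Activity
iterated (Accomplishment ev) = Activity ("repeatedly " ++ ev)

-- "for over an hour" combines only with activities
forOverAnHour :: Activity -> String
forOverAnHour (Activity ev) = ev ++ " for over an hour"

-- forOverAnHour (iterated (inSeconds 60 "play the Minute Waltz"))
-- ==> "repeatedly play the Minute Waltz in 60 seconds for over an hour"
```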
Re #22: Hehe, yes, I do have sympathy with that point of view too!