Speculation / Beyond Generative Grammar
September 15, 2018
The complexity of human language, together with the success of modern computational complexity research, has seduced many scientists into believing that there is a computably enumerable generative grammar for human language, perhaps even an innate one, up to convention. However, the ordinary human mind, given unbounded logical depth, is Turing-complete: complex enough to simulate any Turing machine. The generative grammar doctrine therefore demands that nature favor one particular computable enumeration at the expense of all others. Does nature embrace grammatical favoritism?
Arithmetic statements may be formulated in human-language notation, and a true arithmetic statement may be considered grammatical in the context of number theory. It follows that the generative grammar doctrine entails a computable enumeration E of all human knowledge about arithmetic.
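To make the notion concrete, here is a toy sketch (my own illustration, not part of the doctrine) of a computable enumeration for a tiny decidable fragment of arithmetic: true equations of the form a + b = c. The essay's point is that no such enumerator can cover all arithmetic knowledge; this merely shows what "computable enumeration" means.

```python
from itertools import count

def enumerate_true_sums():
    """Yield true statements 'a + b = c' in a fixed computable order."""
    for n in count():            # n = a + b, so every pair (a, b) appears
        for a in range(n + 1):
            b = n - a
            yield f"{a} + {b} = {a + b}"

gen = enumerate_true_sums()
first_four = [next(gen) for _ in range(4)]
# ['0 + 0 = 0', '0 + 1 = 1', '1 + 0 = 1', '0 + 2 = 2']
```

Every true statement of this restricted form eventually appears, and nothing false ever does; the doctrine, on this essay's reading, commits itself to such an enumerator for the whole of arithmetic knowledge.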
Suppose that computational logical depth is not a material constraint, so that all human knowledge about arithmetic may be enumerated by E. Because E consists of true arithmetic statements, E cannot include the statement G asserting that E is consistent, by Gödel's second incompleteness theorem. The generative grammar doctrine thus declares that G cannot be learned, which is dubious enough, for readers have already learned it here.
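Schematically, the appeal to the second incompleteness theorem can be written as follows (my paraphrase, assuming E is computably enumerable and strong enough to interpret elementary arithmetic):

```latex
% Goedel's second incompleteness theorem, applied to E:
E \ \text{consistent} \;\Longrightarrow\; E \nvdash \mathrm{Con}(E)
% Since E contains only truths, E is consistent, so the true
% statement G = Con(E) lies outside everything E enumerates.
```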
It is natural to speculate that E does not exist in the context of number theory, even granted sufficient computational resources and logical depth, or given a size limit on E that is reasonably human rather than astronomical.
Generative grammar scientists might, with luck, discover candidate statement generators that emit true or false arithmetic statements, and allow us to run them in order to determine E.
Suppose that, before E is determined, we articulate and learn the statement that such a generator is consistent, even though we do not yet know whether it actually is.
If the statement generator is inconsistent, it will prove both its own consistency and its own inconsistency within finite logical depth; since it then proves a false statement, it cannot be E. On the other hand, if the statement generator is consistent and generative grammar scientists identify it with E, then E will not prove its own consistency within finite logical depth, by Gödel's second incompleteness theorem. Thus the learned statement G, that the statement generator is consistent, cannot be derived from E. Generative grammar scientists must then claim that the successfully learned statement G is not learnable, which is absurd and shows that E does not exist in the context of number theory.
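The case analysis above can be summarized in one schema (my notation; ⊥ marks a contradiction):

```latex
\begin{cases}
E \vdash \bot \;(\text{inconsistent})
  & \Longrightarrow\; E \vdash \mathrm{Con}(E)
    \ \text{and}\ E \vdash \neg\mathrm{Con}(E),
    \ \text{so $E$ proves falsehoods,}\\[4pt]
E \nvdash \bot \;(\text{consistent})
  & \Longrightarrow\; E \nvdash \mathrm{Con}(E)
    \quad (\text{G\"odel II}),
\end{cases}
```

so in either case the learned statement G escapes whatever the generator can derive.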
To be fair, there may exist humans of limited mental capacity for whom G is not learnable. However, the existence of such humans deals another blow to generative grammar scientists who claim that all humans share a mother tongue: if G is learnable for some but not for others, G cannot possibly be a shared feature derived from common human genetic factors.
Generative grammar scientists may try to exclude arithmetic from human language, but since arithmetic is expressible in ordinary human language, the exclusion amounts to an explicit denigration of human creativity.