Grammatical incompleteness

From a thread on the "Corpora" email list (http://www.uib.no/mailman/public/corpora/2007-September/005000.html):

Personally I think we can clear up a lot of the mess, and get a very
predictive model, by abandoning just one assumption. I believe much of
machine learning to be quite sound, for instance; we can use it right from
the level of sound waves. We can even keep grammar, in a sense.

The assumption I believe we need to abandon is the one that there is only
one grammar to be found.

Why do we insist on the assumption of global generalizations?

Instead we need to accept that there are many possible grammatical
perspectives, and that many of them contradict one another. (On this view,
the grammars we learn now, "statistical language models" and the like, only
appear random because they are many contradictory grammars summed together.)
Instead of attempting global generalizations we should keep the corpus, and
let context select the generalizations we need at the time we need them.
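One way to read "keep the corpus, and let context select the generalizations" is as memory-based prediction: rather than fitting one global model, retain the raw data and, at query time, generalize only from the examples whose context matches the current one. The sketch below is my own hypothetical illustration of that idea (the toy corpus, the two-word context window, and the function names are all assumptions, not anything from the thread):

```python
from collections import Counter, defaultdict

# A toy corpus standing in for the retained data; in practice this
# would be a large text collection kept verbatim, not summarized.
corpus = [
    "the dog chased the cat",
    "the dog bit the mailman",
    "the cat chased the mouse",
    "time flies like an arrow",
    "fruit flies like a banana",
]

# Index every (context, next-word) pair instead of fitting one
# global grammar over the whole corpus.
index = defaultdict(Counter)
for sentence in corpus:
    words = sentence.split()
    for i in range(1, len(words)):
        context = tuple(words[max(0, i - 2):i])  # up to two words of context
        index[context][words[i]] += 1

def continuations(context_words):
    """Select a generalization at query time: the continuations
    actually observed after this exact context in the corpus."""
    return dict(index[tuple(context_words)])

print(continuations(["flies", "like"]))  # {'an': 1, 'a': 1}
print(continuations(["the", "dog"]))     # {'chased': 1, 'bit': 1}
```

Note how the context "flies like" pulls out two contradictory readings (verb vs. noun "flies") that a single summed model would blur together; each query selects only the generalization its context supports.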

I think Chomsky saw this too. That is the "value" I see in his rejection of
the phoneme. But he preferred to keep his assumption of a single grammar,
and instead abandon the idea that grammar could be learned.