Patterns among objects


Chomsky's "loss of generality in the formulation of the rule-governed regularities of the language."

From a discussion about "loss of generality" on the Funknet discussion list (http://lloyd.emich.edu/cgi-bin/wa?A2=ind0406&L=funknet&D=0&P=3622):

'The particular analysis which interests me is one I found in a historical
retrospective by Fritz Newmeyer and others "Chomsky's 1962 programme for
linguistics" (in Newmeyer's "Generative Linguistics -- A Historical
Perspective", Routledge, 1996, and apparently also published in "Proc. of the
XVth International Congress of Linguists".)

Newmeyer is talking mostly about Chomsky's "Logical basis of linguistic

Nicolai Hartmann

"the modern discoverer of emergence—originally called by him categorial novum."

http://en.wikipedia.org/wiki/Nicolai_Hartmann

"Games", Conway, names, and meaning

"One of the most brilliant mathematicians of the last and current century is John Horton Conway. Near the middle of the last century he formalized a notion of game in terms of a certain recursive data structure. He went on to show that every notion of number that has made it into the canon of numerical notions could be given representations in terms of this data structure. These ideas are documented in his delightful On Numbers and Games. Knuth popularized some of these ideas in his writings on surreal numbers."

http://biosimilarity.blogspot.com/2008/03/naming-as-dialectic.html
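Conway's construction can be sketched in a few lines. Below is a minimal sketch, not Conway's own notation: a game is a pair of sets of previously built games (left options and right options), and the ordering is defined by mutual recursion, which is enough to recover 0, 1 and -1 as in On Numbers and Games.

```python
# A game is a pair (left_options, right_options), each a frozenset of
# earlier games. The simplest numbers fall out immediately:
ZERO = (frozenset(), frozenset())           # { | }   -> 0
ONE = (frozenset([ZERO]), frozenset())      # { 0 | } -> 1
NEG_ONE = (frozenset(), frozenset([ZERO]))  # { | 0 } -> -1

def leq(x, y):
    """x <= y iff no left option of x is >= y, and x is >= no right option of y."""
    x_left, _ = x
    _, y_right = y
    return (not any(leq(y, l) for l in x_left)) and \
           (not any(leq(r, x) for r in y_right))

def eq(x, y):
    """Two games are equal (as numbers) when each is <= the other."""
    return leq(x, y) and leq(y, x)
```

With these definitions `leq(ZERO, ONE)` holds while `leq(ONE, ZERO)` does not, so the expected ordering -1 < 0 < 1 emerges from nothing but the recursion.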

Randomness and cellular automata

Rudy Rucker on cellular automata: "I was first hooked on modern cellular automata by [Wolfram84]. In this article, Wolfram suggested that many physical processes that seem random are in fact the deterministic outcome of computations that are simply so convoluted that they cannot be compressed into shorter form and predicted in advance. He spoke of these computations as "incompressible," and cited cellular automata as good examples."

http://www.fourmilab.ch/cellab/manual/chap5.html
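Wolfram's standard example of such a seemingly random but fully deterministic computation is elementary Rule 30. A minimal sketch (row width and iteration count are arbitrary choices here): each cell's next state is looked up from the bits of the number 30, indexed by its three-cell neighbourhood, and the centre column behaves like a pseudo-random bit stream.

```python
RULE = 30  # the 8 output bits, one per 3-cell neighbourhood pattern

def step(cells):
    """Advance one generation of the elementary CA on a cyclic row."""
    n = len(cells)
    return [
        (RULE >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

row = [0] * 31
row[15] = 1                      # a single live cell in the middle
centre_bits = []
for _ in range(16):
    centre_bits.append(row[15])  # the "random-looking" centre column
    row = step(row)
# centre_bits begins 1, 1, 0, 1, ... with no evident period
```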

Complexity of patterns: cellular automata

"We have seen that the Glider Gun generates ordered gliders every 30 generations, and that the generation process is chaotic: it exhibits the butterfly effect."

http://www.upscale.utoronto.ca/GeneralInterest/Harrison/LifeEnergy/LifeA...
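The Life dynamics behind that page can be sketched very compactly by tracking only the set of live cells; the glider shape below is the standard one, and the (1, 1) shift per four generations is its well-known period.

```python
from collections import Counter

def life_step(live):
    """One Game of Life generation; `live` is a set of (x, y) cells."""
    counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # A cell is live next generation if it has exactly 3 live neighbours,
    # or 2 live neighbours and is already live.
    return {c for c, n in counts.items() if n == 3 or (n == 2 and c in live)}

glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
g = glider
for _ in range(4):
    g = life_step(g)
# after 4 generations the glider reappears shifted by (1, 1)
```

The butterfly effect mentioned above shows up if you perturb a single cell anywhere near a pattern like the Glider Gun: the perturbed and unperturbed runs diverge completely within a few dozen generations, even though each step is fully deterministic.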

"Nativelike selection"

Classic work such as that of Pawley and Syder demonstrates that natural language is far from random, but equally far from regular:

p. 2
"The problem we are addressing is that native speakers do not
exercise the creative potential of syntactic rules to anything like
their full extent, and that, indeed, if they did do so they would
not be accepted as exhibiting nativelike control of the language.
The fact is that only a small proportion of the total set of grammatical
sentences are nativelike in form - in the sense of being

Grammar: formally incomplete or just random?

Natural language appears to be random (cf. the Hutter Prize page):

"...in 1950, Claude Shannon estimated the entropy (compression limit)
of written English to be about 1 bit per character [3]. To date, no
compression program has achieved this level."
(http://cs.fit.edu/~mmahoney/compression/rationale.html)
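The gap is easy to see with any general-purpose compressor. A rough sketch using Python's zlib (the sample text is our own, and zlib is nowhere near the state of the art, which is exactly the point: it lands far above Shannon's ~1 bit per character):

```python
import zlib

# An ordinary, non-repetitive English sample (any prose will do).
text = (
    "In 1950 Claude Shannon asked how predictable written English is, "
    "and estimated its entropy at about one bit per character by having "
    "people guess the next letter of a passage they had never seen."
).encode("ascii")

compressed = zlib.compress(text, 9)
bits_per_char = 8 * len(compressed) / len(text)
# bits_per_char comes out well above 1.0 for a sample this size
```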

The most successful contemporary natural language technologies are probabilistic.

The usual explanation is that something external selects between alternatives which are equally probable on linguistic grounds. Commonly this external factor is assumed to be "meaning".
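The probabilistic approach can be illustrated with the simplest possible model: a bigram model over a toy corpus (the corpus and counts below are invented purely for illustration), which ranks alternative continuations by conditional probability rather than ruling them grammatical or ungrammatical.

```python
from collections import Counter, defaultdict

corpus = "the cat sat on the mat the cat ate the fish".split()

# Count how often each word follows each other word.
bigrams = defaultdict(Counter)
for w1, w2 in zip(corpus, corpus[1:]):
    bigrams[w1][w2] += 1

def p_next(w1, w2):
    """Estimate P(w2 | w1) from the bigram counts."""
    total = sum(bigrams[w1].values())
    return bigrams[w1][w2] / total if total else 0.0

# "cat", "mat" and "fish" are all grammatical after "the"; the model
# merely says which is more probable: P(cat | the) = 0.5 here.
```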

Relationship of Goedel's incompleteness theorem to uncertainty principles and randomness

From Heisenberg to Goedel via Chaitin
Authors: C.S. Calude, M.A. Stay
(Submitted on 26 Feb 2004 (v1), last revised 11 Jul 2006 (this version, v6))

Gosper's glider gun

"Conway conjectured on the existence of infinitely growing patterns, and offered a reward for an example. Gosper was the first to find such a pattern (specifically, the Glider gun), and won the prize."

http://en.wikipedia.org/wiki/Bill_Gosper

Number of predictive classes you can find in text

From a discussion on the "Hutter-Prize" Google group (http://groups.google.com/group/Hutter-Prize/browse_thread/thread/bfea185...):

...If A_1, A_2,... A_n
are the contexts of A in some text, and X_1, X_2,...X_n are contexts of
other tokens, then the number of ways A can have common contexts with
other tokens in the text, and thus uniquely specify some new
paradigmatic class, are just Matt's "(n choose k) = n!/(k!(n-k)!)
possible sets", where k is the number of common contexts between A and
some other token.

The syntagmatic distribution of sequences AX_? specified by these
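The counting step in the quote is just the binomial coefficient, and can be checked directly (n and k below are arbitrary example values, not from the discussion):

```python
from math import comb

n, k = 10, 3  # n contexts of token A, k of them shared with some other token

# "(n choose k) = n!/(k!(n-k)!) possible sets" of shared contexts:
shared_context_sets = comb(n, k)
# for n = 10, k = 3 this is 120 distinct possible paradigmatic classes
```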
