Of Minds and Language

Editors: Massimo Piattelli-Palmarini; Juan Uriagereka; Pello Salaburu


Now, the combination of basic grouping on the one hand, and copying on the other, gives you endocentric structures. It gives you Merge, which is in the linguistic sense a very specific kind of hierarchical structure. Not the type of structure that you get even in phonology. If you take, say, the syllable structure in phonology, that is a type of hierarchy that is not headed in the same way that syntax is. It is not endocentric (a VP is a V, but a syllable is not a nucleus). So what we should target precisely is that process of combining those two presumably fairly basic operations or processes, namely Concatenate and Copy, and it is the result of these two operations that gives you a very specific representation of vocabulary that we call Merge. Now notice that those two operations, Basic Grouping and Copy, need not be linguistically specific. These might have been recruited from other systems that presumably exist. I haven't checked, but other systems may make use of copying operations or operations that basically combine things. But it is the combination of these two presumably general processes that gives you the specificity that linguistic structures display.

That is actually a welcome consequence of work in linguistics, trying to decompose Merge. It is an arcane question, if you want, but it should be a welcome consequence for biologists, because biologists have long noted that typically novel things, like novel abilities, are very rare in nature. That is, novelty as such is usually not what you find in the biological world. Typically, what you find is a recombination of old processes that are put together in new ways that give you novelty. But you do not develop novelty out of nowhere. It is typically ancient things that you recombine. Now presumably Copy and Basic Grouping are ancient processes that you find elsewhere, and it is the combination of them that could actually define a good chunk of FLN. So the specificity for language would come from the combination of old things.

Stephen Jay Gould was very fond of making a distinction between the German terms Neubildung, that is “new formation,” which is very, very rare in the biological world, and novelty coming about by what he called Umbildung, “recombination,” the topological variation of old things, which is very, very common. That is what I think Jacob (1977) had in mind when he was talking about tinkering. He really did not have in mind what evolutionary psychologists like to use “tinkering” for (the less than optimal connotation of the term). Instead I think that what he wanted to stress was that if you have something that emerges as a novel aspect of the world, what you should first explore is the possibility that that novelty is just the result of recombination of old parts (which is not at all incompatible with suboptimal results). I think that decomposing Merge in that sense is what linguists can contribute, by saying that there is a way of making Merge not completely novel, outlandish, and very different from everything else that we know in the cognitive world; instead we should find basic operations that, once put together, result in a unique, specific structure that can be used for language and that may be recruited for other systems.

Now admittedly, this does not give us everything that has to evolve for language to become this very special object that we have. So for example I have not mentioned anything about phonology, about parameters, or about the lexicon or things of that sort. But it seems to me that Merge is the central component that has been taken, even in the recent literature, as something that is so unique and unlike anything else, that it is hard to see how it could have evolved even in a short period. By contrast, if you decompose it into more basic components, I think you can get a better handle on that question. If you can do that, if you can reduce Darwin's Problem to more basic questions, then it seems not implausible to think that, just as we solved Plato's Problem at least conceptually (though not in detail), we may at least begin to have a better handle on Darwin's Problem. And that is the suggestion I'd like to leave on the table here.

Discussion

LAKA: I agree that headedness seems to be an outstanding formal feature of language. The point you were trying to make is that we should think of Merge as a combination of two operations, and if I understood you correctly, that these two operations are likely to be independently found in other cognitive domains; and you also said that you think headedness is a good candidate for the language faculty in the narrow sense (FLN), which I assume we agree would be that part of language where you find novelty that is specific for language. My question is, if Merge is decomposed into two different operations, you might as well say it belongs to the faculty of language broadly understood (FLB), because you could also say that all those other things we find in FLB form a constellation that is unique to human language.

BOECKX: Yes, my intention is to say that some of the very specific aspects that define language, and headedness is an obvious one, may not be the result of completely new processes as such, but of the very novel or specific combinations of things that might actually be part of FLB. So that FLN might be, say, a new representation of vocabulary that results from the combination of processes that may be part of FLB, for example. So it is just a different take on the FLB/FLN distinction. I think the distinction makes an awful lot of sense, but sometimes some of the content of FLN, you don't want to make it too specific, so that it becomes this weird thing that we don't know how it could have evolved. It could be that these are just a new combination of old parts basically, so they might be part of FLB, okay? But you don't want to say that FLN is an empty set. You just want to say that some of the specificity of FLN could be the result of things that are in FLB and that are recruited for FLN.

PARTICIPANT: Suppose we agree that language to some extent is conceptually innovative. It is one thing to state that, but the question is how does it do that? How would language do that? And I want to send this out as a kind of test to my fellow linguists here. What is it about current thinking about syntax that makes us expect that language could have the conceptual and semantic consequences that have been discussed here? In particular, if you have such an impoverished view of Merge, if you think that the materials that enter into structure building are so conservative and you just bundle them together in a new way, why would language lead to the new ways of seeing the world that Marc Hauser mentions, for example?⁴

BOECKX: It's not implausible to think that as soon as you have a new representation in the vocabulary – even if it builds on old processes for combining things – that once you have that, you can use it as an exaptation for other things, giving you a completely different cognitive mind. Take, for example, the hypothesis that Liz Spelke and others have explored, that once you have language as a concept booster, you could have a very different conceptual world that results from that. Namely, you would have enough with basic Merge to combine modular information that is otherwise encapsulated, yielding as a result cross-modular concepts. That's something which, for example, Paul Pietroski⁵ has explored. Now, once you have that (as a result of just using those basic operations, but using those operations to cross modules that have not been crossed in other animals), you get a very different kind of mind. It is not the only possibility, but it is one possibility, I think.

URIAGEREKA: A technical question for you, Cedric. Once you have talked about concatenation and copying, an immediate question that comes to mind is that you have concatenation in Markovian systems and you have copying in loops. So I wonder if that is a possibility you have thought about, that you exapt from those?

BOECKX: A very short answer: yes, that is exactly what I had in mind when I was saying that these could be exapted from more basic systems, and once you combine them you get a much more powerful system.

PARTICIPANT: I have a question about the proposal to decompose Merge. There are a few things I didn't really understand. First of all, I'm not really clear why concatenation is somehow simpler, less mysterious than Merge. In particular I thought that, at least in the version of Merge that I'm familiar with, it's not linearly ordered for all elements. So the flow of speech, one word after another – I take this to be a feature that is due to restrictions on the phonological interface in minimalism, so you probably don't want narrow syntax to have this constraint already built in. But now concatenation, at least in my computer, is a function that is ordered. AB and BA are two different results from the same elements and the same concatenation function. It seems like you're building order into it.

BOECKX: Yes, it's unfortunately built into the notion of concatenation for some, but it's not what I intended, so if you don't like “concatenation,” use “combine” or “set formation” or something else that's very simple. There is no linear order meant there. It's just putting A and B together. That I think is a very basic and general operation, but I didn't intend to put linear order into the picture.
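
The distinction at issue here can be put concretely. The following is a minimal illustration, not part of the discussion itself: the function names `concatenate` and `combine` are just labels for the two notions, with an ordered pair standing in for string-like concatenation and an unordered set standing in for the order-free combination Boeckx intends.

```python
# Ordered concatenation: the result depends on the order of the arguments,
# so AB and BA are two different objects (the participant's point).
def concatenate(a, b):
    return (a, b)

# Order-free "combine" / set formation: {A, B} and {B, A} are the same
# object, which is the notion Boeckx says he intended.
def combine(a, b):
    return frozenset({a, b})

assert concatenate("A", "B") != concatenate("B", "A")  # order matters
assert combine("A", "B") == combine("B", "A")          # order does not
```

On this construal, linear order is left to the phonological interface, exactly as the participant suggests narrow syntax should have it.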

CHOMSKY: Actually, there is a question I wanted to raise, but technically, what the last person just said was correct. “Concatenate” means order, so it is more complex than Merge. But if you take the order away from “concatenate,” it just is Merge. Merge simply says, “Take two objects, make another object.” I think you are right in saying that something ought to be decomposed, but it seems to me that there is a better way to do it. In my talk earlier,⁶ I just mentioned in a phrase that you can get rid of labeling, and I didn't explain it then, but I'll try to do so now. I don't agree that headedness is a property of language. I think it is an epiphenomenon, and there is nothing simpler than Merge. You can't decompose it, and when you take order away from concatenation, well that is what you have. But the crucial thing about language is not Merge; it is unbounded Merge. So just the fact that things are hierarchic elsewhere doesn't really tell you anything. They have to be unboundedly hierarchic. Now there is a way to decompose it coming from a different perspective, which I think might be promising. The crucial fact about Merge – the “almost true generalization” about Merge for language – is that it is a head plus an XP.⁷ That is virtually everything. Now, there is a pretty good, plausible reason for that. For one thing it follows from theta-theory. It is a property of semantic roles that they are kind of localized in particular kinds of heads, so that means when you are assigning semantic roles, you are typically putting together a head and something. It is also implicit in the cartographical approach. So when you add functional structures, there is only one way to do it, and that is to take a head and something else, so almost everything is head-XP; but when you have head-XP, that kind of construction, then headedness is a triviality; it comes from minimal search. If the element that you formed, the head-XP, is going to participate in further combinatorial operations, some information about it is relevant, and the simplest way to find the information – minimal search for the information – will be to take one of the two objects. Well, one of them has no information, because you have to find its head, and that is too deep down, so you find the other one. So the trivial consequence of an optimization procedure (presumably nonlinguistic and not organic, or maybe a law of nature) is in H-XP, take H.
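
The minimal-search idea can be sketched as a toy procedure. This is a hypothetical model, not anything in the text: lexical heads are represented as strings, Merge output as a two-member set, and `label` simply returns the shallowest lexical item – in {H, XP}, the head H, since the head of XP is "too deep down."

```python
def is_head(obj):
    """A lexical head is represented here as a bare string."""
    return isinstance(obj, str)

def label(obj):
    """Minimal search: look only at the two immediate members.
    In {H, XP}, only H is a lexical item at the top level, so H labels."""
    if is_head(obj):
        return obj
    heads = [x for x in obj if is_head(x)]
    if len(heads) == 1:
        return heads[0]
    # {XP, YP} or {H, H}: minimal search alone does not decide
    raise ValueError("labeling is not determined by minimal search here")

# {V, DP}: the head "eat" sits at the top level; the head of the DP
# is buried one layer down, so minimal search returns "eat".
vp = frozenset({"eat", frozenset({"the", "apple"})})
print(label(vp))  # -> eat
```

The two cases the function refuses to label correspond to the exceptional configurations discussed next: XP-YP structures, and the head-head ambiguity Donati identified.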

Okay, that takes care of almost everything. It takes care of selection, it takes care of probe–goal relations – virtually everything. That eliminates any need for a copying operation. I don't see any reason for a copying operation. Copying just takes two objects, one of which happens to be inside the other. That is one of the two logical possibilities. Either one is inside the other, or one is outside the other. So that is just logical. We don't need a copying operation. All that this leaves out, and it is an interesting class that it leaves out, is XP-YP structures. Well, there are two types of those. One of them is coming from Internal Merge, where you pick something from the inside and you tack it on, on the outside, but in that case again, minimal search gives you a kind of obvious algorithm for which piece of the structure is relevant to further combination – labeling. Namely, keep being conservative, i.e. pick the one that did the work. The one that did the work is the probe of what would be Y, which itself was an H-XP thing, and that, for all kinds of probe–goal reasons that we know, found the internal one. Put it on the outside; OK, just keep that as the identifying element, the label for the next thing. And here Caterina Donati's⁸ discovery was important, that if the thing you are adding happens to be a head, you do get an ambiguity. You can take either the conservative one to be the head, or the new head to be the head, but that is correct, as she showed. It shows up in various ways.

Well, that leaves only one case, and it is a striking case because it is exceptional, and that is the external argument. The only other plausible case that exists (sorry, this is getting too internal to linguistics) is the external argument in the V. That is the one case that remains. We intuitively take the V, not the external argument, and you need an answer for that. But in order to answer that, we first ought to notice how exceptional this case is. For one thing, this new object that you form when you put on an external argument never behaves like a constituent, so for example it never fronts, never moves, and it cannot remain that way. Something has to pull out of it. It is an old problem, with various proposals (I don't think they are very convincing), but it doesn't act like a constituent the way everything else does. You have to take something out of it; it can't remain. Furthermore, these things have different kinds of semantic roles. Actually, there's a paper of Jim Higginbotham's,⁹ about subjects and noun phrases, where Jim argues that they just don't have the same kinds of semantic roles as the subjects of verb phrases, or they may have no semantic role, but it is different than a theta-role, and that is the other case of XP-YP. It is the specifier of a noun phrase. So it is different in that respect.

