
Chapter Fifteen

ON GRAMMAR AND GRAMMATICS (1996)

The problem

Most of us are familiar with the feeling that there must be something odd about linguistics. We recognize this as a problem in the interpersonal sphere because as linguists, probably more than other professionals, we are always being required to explain and justify our existence. This suggests, however, that others see it as a problem in the ideational sphere. The problem seems to arise from something like the following. All systematic knowledge takes the form of language about some phenomenon; but whereas the natural sciences are language about nature, and the social sciences are language about society, linguistics is language about language: "language turned back on itself", in Firth's often-quoted formulation. So, leaving aside the moral indignation some people seem to feel, as if linguistics was a form of intellectual incest, there is a real problem involved in drawing the boundary: where does language end and linguistics begin? How does one keep apart the object language from the metalanguage, the phenomenon itself from the theoretical study of that phenomenon? The discursive evidence rather suggests that we don't, at least not very consistently. For example, the adjective linguistic means both "of language", as in linguistic variation, and "of linguistics", as in linguistic association (we never know, in fact, whether to call our professional bodies linguistic associations or linguistics associations). But a situation
First published in Functional Descriptions: Theory in Practice, 1996, edited by Ruqaiya Hasan, Carmel Cloran and David G. Butt. Amsterdam: John Benjamins, pp. 1-38.

384

on grammar and grammatics

analogous to this occurs in many disciplines: objects in nature have physical properties, physicists have physical laboratories; there are astronomical societies and astronomical forces (not to mention astronomical proportions). It is easy to see where this kind of slippage takes place: astronomers observe stars, and an expression such as astronomical observations could equally well be glossed as "observations of stars" or as "observations made during the course of doing astronomy". Likewise linguistic theory is theory of language, but it is just as plausibly theory in the field of linguistics. To a certain extent this is a pathological peculiarity of the English language, because in English the ambiguity appears even in the nouns: whereas sociology is the study of society, psychology (originally the study of the psyche) has since slipped across to mean not only the study but also that which is studied, and we talk about criminal psychology (which means the psyche characteristic of criminals, though it ought to mean theories of the psyche developed by scholarly criminals). So now psychology is the study of psychology; and an expression such as Australian psychology is unambiguously ambiguous. Such confusion is not normally found, for example, in Chinese, where typically a clear distinction is made between a phenomenon and its scientific study; thus shehui : shehuixue :: xinli : xinlixue (society : sociology :: psyche : psychology) and so on. But one can see other evidence for the special difficulties associated with linguistics. For example, it is a feature of linguistics departments that, in their actual practice, what they teach is often not so much the study of language as the study of linguistics. (And one of the few fields where the terminological distinction is not consistently maintained in Chinese is that of grammar, where yufa often does duty also for yufaxue.) There do seem to be special category problems arising where language is turned back on itself.

Grammar and grammatics

In fact the ambiguity that I myself first became aware of, as a teacher of linguistics (and before that, as a teacher of languages), was that embodied in the term grammar. Here the slippage is in the opposite direction to that of psychology: grammar, the name of the phenomenon (as in the grammar of English), slides over to become the name of the study of the phenomenon (as in a grammar of English). This was already confusion enough; it was made worse by the popular use of the term to mean rules of linguistic etiquette (for example "bad grammar"). As a way of getting round part of the problem I started using the term

construing and enacting

grammatics; I think the first published occasion was in a discussion of ineffability (see Chapter 12 above). This was based on the simple proportion grammatics : grammar :: linguistics : language. I assumed it was unproblematic: the study of language is called linguistics; grammar is part of language; so, within that general domain, the study of grammar may be called grammatics. But this proportion is not quite as simple as it seems. The relationship of linguistics to language is unproblematic as long as we leave language undefined; and we can do this: as linguists, we can take language for granted, as sociologists take society for granted, treating it as a primitive term. Grammar, on the other hand, needs defining. Although the word is used in a non-technical sense, as in the "bad grammar" example, one cannot take this usage over to define a domain of systematic study: in so far as it has any objective correlate at all, this would refer to an inventory of certain marginal features of a language defined by the fact that they carry a certain sort of social value for its speakers. We can study ethnographically the patterns of this evaluation, and their place in the social process; but that is a distinct phenomenal domain. Grammatics, in fact, has no domain until it defines one for itself (or until one is defined for it within general linguistics; exactly at what point the term grammatics takes over from linguistics is immaterial). And it is this that makes the boundary hard to draw. Since both the grammar and the grammatics are made of language, then if, in addition, each has to be used to define the other, it is not surprising if they get confused. Now you may say, as indeed I said to myself when first trying to think this through: it doesn't matter. It does no harm if we just talk about grammar without any clear distinction between the thing and the study of the thing. They are in any case much alike: if you turn language back on itself, it is bound to mimic itself in certain respects.
But this comforting dismissal of the problem was belied by my own experience. If I had become aware of the polysemy in the word grammar it was because it got in the way of clear thinking: my own, and that of the students I was trying to teach. (It does not help, incidentally, to take refuge in the term syntax, where precisely the same polysemy occurs.) There was confusion in certain concepts, such as "universals of grammar" and "rule of grammar", and in the status and scope of grammatical categories of various kinds. But also, I suspect, a problem that has been so vexing in recent years, that of relating the system to the text (so often discourse is analysed as if there were no general principles of meaning behind it), is ultimately part of the same overall unclarity.

Defining grammar

In the simplest definition grammar is part of language. If we pick up a book purporting to describe a language, or to help us to learn it, we expect to find some portion or portions of the book (but not the whole of the book) devoted to grammar. In my own work, I have operated with the concept of lexicogrammar (that is, grammar and vocabulary as a single unity), while usually referring to it simply as grammar for short; this is a stratal concept, with grammar as one among an ordered series comprising (at least) semantics / lexicogrammar / phonology. But whatever part-whole model is adopted, language remains the more inclusive term. But there is a further step, by which grammar is not just one among various parts of language; it is a privileged part. The exact nature of this privilege will be interpreted differently by different linguists, and some might deny it altogether; but most would probably accept it in one form or another. I would be inclined to characterize grammar in the first instance as the part of language where the work is done. Language is powered by grammatical energy, so to speak. Let me approach the definition of grammar, however, from a somewhat different angle. I shall assume here, as a general theoretical foundation, the account of language given by Lemke (1993). Lemke characterizes human communities as eco-social systems which persist in time through ongoing exchange with their environment; and the same holds true of each of their many sub-systems. The social practices by which such systems are constituted are at once both material and semiotic, with a constant dynamic interplay between the two. Note that by semiotic I mean having to do with meaning, not having to do with signs; thus, practices of doing and practices of meaning. The important feature of the material-semiotic interplay is that, as Lemke points out, the two sets of practices are strongly coupled: there is a high degree of redundancy between them.
We may recall here Firth's concept of mutual expectancy between text and situation. Underlying the semiotic practices are semiotic systems of various kinds. In fact, we usually use the term system to cover both system and process: both the potential and the instances that occur; thus a semiotic system is a meaning potential together with its instantiation in acts of meaning. Now, one special kind of semiotic system is one that has a grammar in it: such a system means in two phases, having a distinct phase of wording serving as the base for the construction of meaning. In other words, its content plane contains a grammar as
well as a semantics. We could characterize this special kind of semiotic system as a grammatico-semantic system. It is the presence of a grammar that gives such a system its unique potential for creating (as distinct from merely reflecting) meaning.

The emergence of grammar through time

We could locate grammatico-semantic systems within the framework of an evolutionary typology of systems, as in Figure 1. In this frame, semiotic systems appear as systems of a fourth order of complexity, in that they are at once physical and biological and social and semiotic. Within semiotic systems, those with a grammar in them are more complex than those without.
S1: physical
S2: physical + life = biological
S3: biological + value = social
S4.1: social + meaning = semiotic-1 [primary]
S4.2: semiotic-1 + grammar = semiotic-2 [higher order, i.e. grammatico-semantic]

Figure 1 Evolutionary typology of systems

Semiotic systems first evolve in the form of what Edelman (1992) calls "primary consciousness". They evolve as inventories of signs, a sign being a content/expression pair. Systems of this kind, which may be called primary semiotics, are found among numerous species: all higher animals, including our household pets; and such a system is also developed by human infants in the first year of their lives; I referred to this as the "protolanguage" (Halliday 1975). Primary semiotic systems have no grammar. The more complex type of semiotic system is that which evolves in the form of Edelman's "higher-order consciousness". This higher-order semiotic is what we call language. It has a grammar; and it appears to be unique to mature (i.e. post-infancy) human beings. In other words, it evolved as the sapiens in homo sapiens. (I say this without prejudice; I would be happy, indeed very excited, to learn
that higher-order, stratified semiotics had evolved also with other species, such as cetaceans, or higher primates. But I am not aware of any convincing argument or demonstration that they have.1) Certain features of the human protolanguage, our primary semiotic, persist into adult life; for example expressions of pain, anger, astonishment or fear (rephonologized as interjections, like ouch!, oy!, wow! . . .). On the other hand, human adults also develop numerous non-linguistic semiotic systems: forms of ritual, art forms, and the like; these have no grammar of their own, but they are parasitic on natural language: their meaning potential derives from the fact that those who use them already have a grammar. (See O'Toole (1994) for a rich interpretation of visual semiotics in grammatico-semantic terms.) Thus all human semiotic activity, from early childhood onwards, is as it were filtered through our grammar-based higher-order consciousness. What then is a grammar, if we look at it historically in this way, as evolving (in the species) and developing (in the individual)? A grammar is an entirely abstract semiotic construct that emerges between the content and the expression levels of the original, sign-based primary semiotic system. By entirely abstract I mean one that does not interface directly with either of the phenomenal realms that comprise the material environment of language. The expression system (prototypically, the phonology) interfaces with the human body; the (semantic component of the) content interfaces with the entire realm of human experience; whereas the grammar evolves as an interface between these two interfaces, shoving them apart, so to speak, in such a way that there arises an indefinite amount of play between the two.

Grammar in semiotic function

The grammar is thus the latest part of human language to have evolved; and it is likewise the last part to develop in the growth of the individual child. It emerges through deconstructing the original sign and reconstructing with the content plane split into two distinct strata, semantics and lexicogrammar. Such a system (a higher-order semiotic organized around a grammar) is therefore said to be stratified (Lamb 1964; 1992; Martin 1992; 1993). A stratified semiotic has the unique property of being able to create meaning. A primary semiotic, such as an infant's protolanguage, means by a process of reflection: its meanings are given, like "here I am!", "I'm in pain", "let's be together!", "that's nice"; and hence they cannot modify each other or change in the course of unfolding. By
contrast, a stratified semiotic can constitute: it does not simply reflect, or correspond to pre-existing states of affairs. The stratal pattern of organization, with an entirely substance-free stratum of grammar at its core, makes it possible to construct complex open-ended networks of semantic potential in which meanings are defined relative to one another and hence can modify each other, and also can change in interaction with changes in the ongoing (semiotic and material) environment. The grammar does not, of course, evolve in isolation; meanings are brought into being in contexts of function. The functional contexts of language fall into two major types, and the constitutive function that the grammar performs differs as between the two types. On the one hand, language constitutes human experience; and in this context, the grammar's function is to construe: the grammar transforms experience into meaning, imposing order in the form of categories and their interrelations. On the other hand, language constitutes social processes and the social order; and here the grammar's function is to enact: the grammar brings about the processes, and the order, through meaning. And, as we know, the grammar achieves this metafunctional synthesis, of semiotic transformation with semiotic enactment (of knowledge with action, if you like), by constituting in yet a third sense: creating a parallel universe of its own, a phenomenal realm that is itself made out of meaning. This enables the semiotic process to unfold, through time, in cahoots with material processes, each providing the environment for the other. To put this in other terms, the grammar enables the flow of information to coincide with, and interact with, the flow of events (Matthiessen 1992; 1995). This metafunctional interdependence is central to the evolution of language, and to its persistence through constant interaction with its environment.
In the experiential (or, to give it its more inclusive name, the ideational) metafunction, the grammar takes over the material conditions of human existence and transforms them into meanings. We tend to become aware of the grammatical energy involved in this process only when we have to write a scientific paper; hence, this semiotic transformation may appear to be just a feature of knowledge that is systematic. But all knowledge is like this: to know something is to have transformed it into meaning, and what we call understanding is the process of that transformation. But experience is understood in the course of, and by means of, being acted out interpersonally; and, in the same way, interpersonal relations are enacted in the course of, and by means of, being construed ideationally. The grammar flows
these two modes of meaning together into a single current, such that everything we say (or write, or listen to, or read) means in both these functions at once. Thus every instance of semiotic practice, every act of meaning, involves both talking about the world and acting on those who are in it. Either of these sets of phenomena may of course be purely imaginary; that in itself is a demonstration of the constitutive power of a grammar.

Grammar as theory

So far I have been talking about various properties of grammar. But in talking about grammar, I have been doing grammatics: it is my discourse that has been construing grammar in this way. Naturally, I have also been doing grammar: the properties have been being construed in lexicogrammatical terms. In other words I have been using grammar to construct a theory about itself. Every scientific theory (in fact every theory of any kind, whether scientific or otherwise) is constructed in similar fashion, by means of the resources of grammar. A theory is a semiotic construct (see Lemke (1990) for a powerful presentation of this point). That we are able to use a grammar as a resource for constructing theories is because a grammar is itself a theory. As I suggested in the previous section, the grammar functions simultaneously as a mode of knowing and a mode of doing; the former mode, the construction of knowledge, is the transformation of experience into meaning. A grammar is a theory of human experience. Construing experience is a highly theoretical process, involving setting up categories and relating each category to the rest. As Ellis (1993) points out, there are no natural classes: the categories of experience have to be created by the grammar itself. Or, we might say, there are indefinitely many natural classes: indefinitely many ways in which the phenomena of our experience may be perceived as being alike. In whichever of these terms we conceive the matter, the grammar has to sort things out, assigning functional value selectively to the various possible dimensions of perceptual order. The grammar's model of experience is constantly being challenged and reinforced in daily life; thus it tends to change when there are major changes in the conditions of human existence, not as a consequence, but as a necessary and integral element, of these changes. The difference between a grammar, as a commonsense theory of experience, and a scientific theory (such as grammatics) is that grammars
evolve, whereas scientific theories are at least partially designed. (The nearest to an independent, fully designed semiotic system is mathematics. Mathematics is grounded in the grammar of natural language; but it has taken off to the point where its operations can probably no longer be construed in natural language wordings.) But it is still the grammar of natural language that is deployed in the designing of scientific theories (cf. Halliday and Martin 1993). In the next few sections I shall discuss some of the properties of grammars that enable them to function as they do: to theorize about human experience and to enact human relationships. In addition to their metafunctional organization, already alluded to as enabling the integration of knowledge and action, I shall mention (a) their size and ability to expand, (b) their multiplicity of perspective, and (c) their indeterminacy. In talking about these features, of course, I shall still be doing grammatics. Then, in the final sections, I shall turn to talking about grammatics.

How big is a grammar?

The semogenic operations performed by a grammar are, obviously, extremely complex. Neuroscientists explain the evolution of the mammalian brain, including that of homo sapiens, in terms of its modelling the increasingly complex relationships between the organism and its environment. This explanation foregrounds the construal of experience (the ideational metafunction); so we need to make explicit also its bringing about the increasingly complex interactions between one organism and another (the interpersonal metafunction). To this must be added the further complexity, in a grammar-based higher-order semiotic, of creating a parallel reality in the form of a continuous flow of meaning (the textual metafunction). It could be argued that, since language has to encompass all other phenomena, language itself must be the most complex phenomenon of all. While we may not want to go as far as this, there is still the problem of how language achieves the complexity that it has. Let us pose the simple question: how big is a language? (It seems strange how seldom this question is actually asked.) A simple (though not trivial) answer might be: a language is as big as it needs to be. There is no sign, as far as I know, that languages are failing to meet the immense demands made on them by the explosion of knowledge that has taken place this century. In major languages of technology and science, such as English, Russian or Chinese, there must be well over a million words in use, if
we put together the full range of specialized dictionaries; and the dictionaries can never be absolutely exhaustive. Of course, no one person uses more than a small fraction of these. But counting words in any case tells us very little; what we are concerned with is the total meaning potential, which is construed in the lexicogrammar as a whole. And here again we have to say that there seems no indication that languages are collapsing under the weight. From this point of view, then, it seems as if all we can say is that a language is indefinitely large; however many meanings it construes, it can always be made to add more. Is it possible to quantify in some way its overall meaning potential? At this point we have to bring in a specific model from the grammatics, in which a grammar is represented paradigmatically as a network of given alternatives (a system network). Given any system network it should in principle be possible to count the number of alternatives shown to be available. In practice, it is quite difficult to calculate the number of different selection expressions that are generated by a network of any considerable complexity. If we pretend for the moment that all systems are binary, then given a network containing n systems, the number of selection expressions it generates will be greater than n (n + 1 if the systems are maximally dependent) and not greater than 2^n (the figure that is obtained if all systems are independent). But that does not help very much. Given a network of, say, 40 systems, which is not a very large network, all it tells us is that the size of the grammar it generates lies somewhere between 41 and 2^40 (which is somewhere around 10^12). We do not know how to predict whereabouts it will fall in between these two figures. So let me take an actual example of a network from the grammar of English.
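The bounds just stated can be checked with a few lines of code; this is a sketch of the arithmetic only, and the function name bounds is mine, not part of the grammatics:

```python
# Bounds on the number of selection expressions generated by a network
# of n binary systems: more than n (n + 1 when the systems form a single
# chain of maximal dependence), and at most 2^n (when all n systems are
# independent, so that choices combine freely).
def bounds(n: int) -> tuple[int, int]:
    return n + 1, 2 ** n

low, high = bounds(40)
print(low)   # 41
print(high)  # 2**40, somewhere around 10^12
```

For the 40-system case discussed above this gives exactly the spread in the text: from 41 up to 2^40, which is about 1.1 x 10^12.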
Figure 2 shows a system network for the English verbal group (based on the description given in Halliday 1994, but with tense treated non-recursively in order to simplify). This network contains 28 systems, and generates just over seventy thousand selection expressions (70,992 to be exact). That is a little way over 2^16. (Not all the systems in it are binary.) This network is relatively unconstrained: it shows no conjunct entry conditions, and it shows an unusually high degree of independence among constituent systems, probably more than there should be, although in this respect the English verbal group is somewhat untypical of (English and other) grammars as a whole. On the other hand, it is not outstandingly delicate: it does not distinguish between can and may, for example, or could and might, or between [they] aren't and [they]'re not; or among the
Figure 2 The English verbal group: a simplified system network

various possible locations of contrast in a verbal group selecting more than one secondary tense. (And, it should be pointed out, the options shown are all simply the variant forms of one single verb.) So when I prepared a network of the English clause as the first grammar for William Mann's Penman text generation project in 1980, which had
81 systems in it, Mann was probably not far wrong when he estimated off the cuff that it would generate somewhere between 10^8 and 10^9 clause types. Of course there are lots of mistakes in these complex networks, and the only way to test them is by programming them and setting them to generate at random. It is not difficult to generate the paradigm of selection expressions from a reasonably small network (already in 1966 Henrici developed a program for this purpose; cf. Halliday and Martin 1981), where you can inspect the output and see where it has gone wrong. But even if the program could list half a billion expressions it would take a little while to check them over. As far as their overall capacity is concerned, however, they are probably not orders of magnitude out. It has been objected that the human brain could not possibly process a grammar that size, or run through all the alternative options whenever its owner said or listened to a clause. I am not sure this is so impossible. But in any case it is irrelevant. For one thing, this is a purely abstract model; for another thing, the number of choice points encountered in generating or parsing a clause is actually rather small: in the network of the verbal group it took only 28 systems to produce some 70,000 selection expressions, and in any one pass the maximum number of systems encountered would be even less, probably under half the total, in a representative network. In other words, in selecting one out of half a billion clause types the speaker/listener would be traversing at the most about forty choice points. So although the system network is not a model of neural processes, there is nothing impossible about a grammar of this complexity; that is, where the complexity is such that it can be modelled in this way, as the product of the intersection of a not very large number of choices each of which by itself is extremely simple.
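Generating the paradigm of selection expressions from a small network, in the spirit of Henrici's program, can be sketched as follows. The network below is a toy one invented purely for illustration; its systems, terms and entry conditions are not taken from Figure 2:

```python
# A toy system network: each system offers a choice among terms, and may
# carry an entry condition (a feature that must already have been chosen
# before the system becomes accessible). All names here are invented.
SYSTEMS = {
    "FINITENESS": {"entry": None,       "terms": ("finite", "nonfinite")},
    "TENSE":      {"entry": "finite",   "terms": ("past", "present", "future")},
    "POLARITY":   {"entry": None,       "terms": ("positive", "negative")},
    "CONTRAST":   {"entry": "negative", "terms": ("contrastive", "neutral")},
}

def selection_expressions():
    """List every selection expression the network generates."""
    def expand(chosen):
        for spec in SYSTEMS.values():
            entered = any(t in chosen for t in spec["terms"])
            accessible = spec["entry"] is None or spec["entry"] in chosen
            if accessible and not entered:
                # enter this system: branch once per term
                for term in spec["terms"]:
                    yield from expand(chosen + (term,))
                return
        yield chosen  # no further system can be entered
    return list(expand(()))

paths = selection_expressions()
print(len(paths))  # 12 selection expressions from these four systems
```

Exhaustive listing and inspection is feasible at this scale; with half a billion selection expressions it is not, which is why generating at random becomes the practical test.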

How does your grammar grow?

Grammars do not remain static. They tend to grow; not at an even rate, but with acceleration at certain moments in the history of a culture. On the one hand, they grow by moving into new domains. This happens particularly when there is an expansion in the culture's knowledge and control: in our present era, new domains are opened up by developments in technology and science. We are likely to become aware of this when we meet with a crop of unfamiliar words,
like those associated with the recent move into nanotechnology (engineering the very small); but the expansion may take place anywhere in the lexicogrammar, as new wording, in any form. The grammar is not simply tagging along behind; technological developments, like other historical processes, are simultaneously both material and semiotic: the two modes are interdependent. Early on in his researches into science and technology in China, Needham noted how in the medieval period, when there was no adequate institutional mechanism for keeping new meanings alive, the same material advances were sometimes made two or three times over, without anyone realizing that the same technology had been developed before (Needham 1958). On the other hand, grammars grow by increasing the delicacy in their construction of existing domains. (This has been referred to by various metaphors: refining the grid or mesh, sharpening the focus, increasing the granularity and so on. I shall retain the term delicacy, first suggested by Angus McIntosh in 1959.) This is a complex notion; it is not equivalent to subcategorizing, which is simply the limiting case, although also the one that is likely to be the most easily recognized. The grammar does construct strict taxonomies: fruit is a kind of food, a berry is a kind of fruit, a raspberry is a kind of berry, a wild raspberry is a kind of raspberry; these are typically hyponymic and can always be extended further, with new words or new compositions of words in a grammatical structure, like the nominal group in English and many other languages. But greater delicacy is often achieved by intersecting semantic features in new combinations; and this is less open to casual inspection, except in isolated instances which happen to be in some way striking (like certain politically correct expressions in present-day English).
The massive semantic innovations brought about by computing, word processing, networking, multimedia, the information superhighway and the like, although in part construing these activities as new technological domains, more typically constitute them as new conjunctions of existing meanings, as a glance at any one of thousands of current periodicals will reveal. On a somewhat less dramatic scale, we are all aware of the much more elaborate variations in the discourse of environmental pollution and destruction than were available a generation ago. Even a seemingly transparent piece of wording such as smoke-free construes a new confluence of meanings; indeed the whole semogenic potential of -free as a derivational morpheme has recently been transformed. (Similar expansions have happened with -wise and -hood.) There is a special case of this second heading, perhaps even a third
type of grammar growth, in the form of semantic junction brought about by grammatical metaphor. Here what happens is a kind of reconstrual of some aspect of experience at a more abstract level, brought about by the metaphoric potential inherent in the nature of grammar. A new meaning is synthesized out of two existing ones: (a) a lexicalized meaning and (b) the category meaning of a particular grammatical class. So, for example, when [weapons] that kill more people was first reworded as [weapons] of greater lethality, a new meaning arose at the intersection of "kill" with "thingness" (the prototypical meaning of a noun). Much technical, commercial, bureaucratic and technocratic discourse is locked in to this kind of metaphoric mode. We can observe all these processes of grammar growth when we interact with children who are growing up (Painter 1992; Derewianka 1995). This is a good context in which to get a sense of the open-endedness of a grammar. In the last resort, and in some sense that is still unclear, there must be a limit to how big a grammar can grow: that is, to the semiotic potential of the individual meaner; after all, the capacity of the human brain, though undoubtedly large, is also undoubtedly finite. But there is no sign, as far as I know, that the limit is yet being approached.

Grammar as multiple perspectives

In a stratified semiotic system, where grammar is decoupled from semantics, the two strata may differ in the arrangement of their internal space. Things which are shown to be topologically distant at one stratum may appear in the same systemic neighbourhood at the other. (See Martin and Matthiessen 1992, where the distinction is interpreted as between topological (semantics) and typological (lexicogrammar).) It is this degree of freedom, the different alignment of semogenic resources between the semantics and the grammar, that enables language to extend indefinitely its meaning-making potential (a striking example of this is grammatical metaphor, mentioned at the end of the previous section). It is also this characteristic which explains how syndromes of grammatical features scattered throughout different regions of the grammar may cluster semantically to form what Whorf called "frames of consistency"; cf. Hasan's "ways of meaning" (1984b), Martin's "grammatical conspiracies" (1988). This amount of play is obviously to be encountered across the (typically arbitrary) boundary between content and expression: we do not expect things which mean the same to sound the same, although
there is considerable seepage, which Firth labelled 'phonaesthesia' (Firth 1957). But between the semantics and the grammar, this new frontier (typically non-arbitrary) within the content plane, we expect to find more isomorphism: things which mean alike might reasonably be worded alike. As a general rule, they are: grammatical proportionalities typically construe semantic ones. But not always. On the one hand, there are regions of considerable drift in both directions; an obvious one in English is the semantic domain of probability and subjective assessment, which is construed in many different regions of the grammar, each of which may in turn construe other semantic features, such as obligation or mental process. On the other hand, there are the syndromes mentioned above: high-level semantic motifs which are located all around the terrain of the lexicogrammar, such as the complex edifice of meanings that goes to make up a standard language. People make much use of these realignments in reasoning and inferencing with language. This stratified vision of things enables the grammar to compromise among competing models of reality. As pointed out above in Section 6, a grammar sorts out and selects among the many proportionalities that could arise in the construal of experience. It does this by making adjustments among the different strata. Things may appear alike from any of three different angles: (i) from above: similarity of function in context; (ii) from below: similarity of formal make-up; and (iii) from the same level: fit with the other categories that are being construed in the overall organization of the system. The grammar looks at objects and events from all three angles of orientation. It takes account of their function: phenomena which have like value for human existence and survival will tend to be categorized as alike. It takes account of their form: phenomena which resemble each other to human perceptions will tend to be categorized as alike.
And it takes account of how things relate to one another: phenomena are not categorized in isolation but in sets, syndromes and domains. In other words, the grammar adopts what we may call a trinocular perspective. It often happens that the various criteria conflict: things (whether material or semiotic) that are alike in form are often not alike in function; and the way they relate to each other may not reflect either kind of likeness. Other things being equal, the grammar tends to give some precedence to functional considerations: consider any crowded lexical domain, such as that of maps, plans, charts, figures, diagrams, tables and graphs in English; or grammatical systems that are highly critical for survival, like that of polarity in any language. But the construal of
categories must make sense as a whole. And this means that it needs to be founded on compromise. The grammar of every natural language is a massive exercise in compromise, accommodating multiple perspectives that are different and often contradictory. Such compromise demands a considerable degree of indeterminacy in the system.

10 Indeterminacy in grammar

It seems obvious that grammars are indeterminate (or 'fuzzy', to borrow the term from its origins in Zadeh's fuzzy logic), if only because of the effort that goes into tidying them up. Formal logic and even mathematics can be seen as the result of tidying up the indeterminacies of natural language grammars. The typology of indeterminacy is itself somewhat indeterminate. For the present discussion I will identify three types: (a) clines, (b) blends, and (c) complementarities, with (d) probability as a fourth, though rather different, case. Clines are distinctions in meaning which take the form of continuous variables instead of discrete terms. The prototype examples in grammar are those distinctions which are construed prosodically, typically by intonation (tone contour): for example, in English, force, from strong to mild, realized as a continuum from wide to narrow pitch movement: if the tone is falling, then from wide fall (high to low) to narrow fall (mid-low to low). But one can include in this category those distinctions where, although the realizations are discrete (i.e. different wordings are involved), the categories themselves are shaded, like a colour spectrum: for example, colours themselves; types of motorized vehicle (car, bus, van, lorry, truck, limousine . . . etc.); types of process (as illustrated on the cover of the revised edition of my Introduction to Functional Grammar, 1994). In this sense, since in the grammar's categorization of experience fuzziness is the norm, almost any scalar set will form a cline: cf. humps, mounds, hillocks, hills and mountains; or must, ought, should, will, would, can, could, may, might. Blends are forms of wording which ought to be ambiguous but are not. Ambiguity in the strict sense, as in lexical or structural puns, is not a form of indeterminacy as considered here, because it does not involve indeterminacy of categorization.
Blends also construe two (or more) different meanings; but the meanings are fused: it is not a matter of selecting one or the other. A favourite area for blends, apparently in many languages, is modality; in English, oblique modal finites like
should provide typical examples, for example the brake should be on, meaning both 'ought to be' and 'probably is'. There is then the further indeterminacy between an ambiguity and a blend, because a wording which is clearly ambiguous in one context may be blended when it occurs in another. A metaphor is the limiting case of a blend. Complementarities are found in those regions of (typically experiential) semantic space where some domain of experience is construed in two mutually contradictory ways. An obvious example in English is in the grammar of mental processes, where there is a regular complementarity between the 'like' type (I like it; cf. notice, enjoy, believe, fear, admire, forget, resent . . . ) and the 'please' type (it pleases me; cf. strike, delight, convince, frighten, impress, escape, annoy . . .). The feature of complementarities is that two conflicting proportionalities are set up, the implication being that this is a complex domain of experience which can be construed in different ways: here, in a process of consciousness, the conscious being is on the one hand doing, with some phenomenon defining the scope of the deed, and on the other hand being done to, with the phenomenon functioning as the doer. All languages (presumably) embody complementarities; but not always in the same regions of semantic space (note for example the striking complementarity of tense and aspect in Russian). One favourite domain is causation and agency, often manifested in the complementarity of transitive and ergative construals. Strictly speaking probability is not a fuzzy concept; but probability in grammar adds indeterminacy to the definition of a category. Consider the network of the English verbal group in Figure 2 above. As an exercise in grammatics this network is incomplete, in that there are distinctions made by the grammar that the network fails to show: in that sense, as already suggested, no network ever can be complete.
But it is incomplete also in another sense: it does not show probabilities. If you are generating from that network, you are as likely to come up with won't be taken as with took; whereas in real life positive is significantly more likely than negative, active than passive, and past than future. Similarly a typical dictionary does not tell you that go is more likely than walk and walk is more likely than stroll, though you might guess it from the relative length of the entries. A grammar is an inherently probabilistic system, in which an important part of the meaning of any feature is its probability relative to other features with which it is mutually defining. Furthermore the critical factor in register variation is probabilistic: the extent to which local probabilities depart from the global patterns of the language as a whole; for example a
register of weather forecasting (and no doubt other kinds of forecasting as well), where future becomes more probable than past; or one in which negative and passive suddenly come to the fore, like that of bureaucratic regulations (Halliday 1991). Probabilities are significant both in ideational and in interpersonal meanings, as well as in the textual component; they provide a fundamental resource for the constitutive potential of the grammar.
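The notion that each grammatical system carries a probability profile, and that a register is a local resetting of those probabilities, can be sketched in a few lines of code. This is an illustrative toy, not an implementation of any actual systemic description: the system names are taken from the discussion above, but all the figures are invented for the example.

```python
import random

# Global probabilities for two binary grammatical systems.
# The skews (positive over negative, active over passive) follow the
# direction described in the text; the exact figures are invented.
GLOBAL = {
    "polarity": {"positive": 0.9, "negative": 0.1},
    "voice":    {"active": 0.9, "passive": 0.1},
}

# A register modelled as a local resetting of the global probabilities,
# e.g. bureaucratic regulations foregrounding negative and passive.
REGULATIONS = {
    "polarity": {"positive": 0.6, "negative": 0.4},
    "voice":    {"active": 0.5, "passive": 0.5},
}

def choose(systems):
    """Select one term from each system, weighted by its probability."""
    selection = {}
    for system, terms in systems.items():
        names = list(terms)
        weights = [terms[name] for name in names]
        selection[system] = random.choices(names, weights=weights)[0]
    return selection

# An unweighted generator would produce "won't be taken" as often as
# "took"; a weighted one respects the skew of the system.
print(choose(GLOBAL))
print(choose(REGULATIONS))
```

A generator that ignored the weights would treat every path through the network as equally likely, which is exactly the incompleteness the text points to.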

11 Some matching features

In the last few sections I have picked out certain features of natural language grammars which a theory of grammar, a 'grammatics', is designed to account for. The purpose of doing this was to provide a context for asking the questions: how does the grammatics face up to this kind of requirement? Given that every theory is, in some sense, a lexicogrammatical metaphor for what it is theorizing, is there anything different about a theory where what it is theorizing is also a lexicogrammar? There is (as far as I can see) no way of formally testing a grammar in its role as a theory of human experience: there are no extrinsic criteria for measuring its excellence of fit. We can of course seek to evaluate the grammar by asking how well it works; and whatever language we choose, it clearly does: grammars have made it possible for humanity to survive and prosper. They have transmitted the wisdom of accumulated experience from one generation to the next, and enabled us to interact in highly complex ways with our environment. (At the same time, it seems to me, grammars can have quite pernicious side-effects, now that we have suddenly crossed the barrier from being dominated by that environment to being in control of it, and therefore also responsible for it; cf. Halliday 1993.) I suspect that the same holds true for the grammatics as a theory of grammar: we can evaluate such a theory, by seeing how far it helps in solving problems where language is centrally involved (problems in education, in health, in information management and so on); but we cannot test it for being right or wrong. (This point was made by Hjelmslev many years ago, as the general distinction between a theory and a hypothesis.) By the same token a grammatics can also have its negative effects, if it becomes reductionist or pathologically one-sided. The special quality of a theory of grammar, I think, is the nature of the metaphoric relationship that it sets up with its object of enquiry.
If we consider just those features of language brought into the discussion
above, the size (and growth) of the grammar, its trinocular perspective, and its fuzz, how does the grammatics handle these various parameters? To put this in very general terms: how do we construe the grammatics so as to be able to manage the complexity of language? It seems to me that there are certain matching properties. The grammatics copes with the immense size of the grammar, and its propensity for growing bigger, by orienting itself along the paradigmatic axis, and by building into this orientation a variable delicacy; this ensures that the grammar will be viewed comprehensively, and that however closely we focus on any one typological or topological domain this will always be contextualized in terms of the meaning potential of the grammar as a whole. It copes with the trinocular vision of the grammar by also adopting a trinocular perspective, based on the stratal organization of the grammar itself. And it copes with the indeterminacy of the grammar by also being indeterminate, so that the categories of the theory of grammar are like the categories that the grammar itself construes. Theories in other fields, concerned with non-semiotic systems, begin by generalizing and abstracting; but they then take off, as it were, to become semiotic constructs in their own right, related only very indirectly and obliquely to observations from experience. The prototype of such a theory is a mathematical model; and one can theorize grammatics in this way, construing it as a formal system. But a grammatics does not need to be self-contained in this same manner. It is, as theory, a semiotic construct; but this does not create any disjunction between it and what it is theorizing: it remains permeable at all points on its surface. The grammatics thus retains a mimetic character: it explains the grammar by mimicking its crucial properties. One could say that it is based on grammatical logic rather than on mathematical logic.
In some respects this will appear as a weakness: it will lack the rigour of a mathematical theory. But in other respects it can be a source of strength. It is likely to be more relevant to understanding other semiotic systems: not only verbal art, but also other, non-verbal art forms, as demonstrated by O'Toole's masterly interpretation of painting, architecture and sculpture in terms of systemic grammatics, referred to already (O'Toole 1994). And the new field of intelligent computing, associated with the work of Sugeno, and explicitly defined by him as 'computing with (natural) language', requires a theory that celebrates indeterminacy (it is a development of fuzzy computing) and that allows full play to the interface between wording and meaning (see Section 20 below).
In the next few sections I will make a few observations about these matching properties of the grammatics, as they seem to me to emerge in a systemic perspective.

12 Paradigmatic orientation and delicacy

When many years ago I first tried to describe grammar privileging the paradigmatic axis of representation (the 'system' in Firth's framework of system and structure), the immediate reasons related to the theoretical and practical tasks that faced a grammatics at the time (the middle 1960s): computational (machine translation); educational (first and second language teaching; language across the curriculum); sociological (language and cultural transmission, in Bernstein's theoretical framework, for example Bernstein (1971)); functional-variational (development of register theory); and textual (stylistics and analysis of spoken discourse). All these tasks had in common a strong orientation towards meaning, and demanded an approach which stretched the grammar in the direction of semantics. There were perhaps five main considerations. i: The paradigmatic representation frees the grammar from the constraints of structure; structure, obviously, is still to be accounted for (a point sometimes overlooked when people draw networks, as Fawcett (1988) has thoughtfully pointed out), but structural considerations no longer determine the construal of the lexicogrammatical space. The place of any feature in the grammar can be determined from the same level, as a function of its relationship to other features: its line-up in a system, and the interdependency between that system and others. ii: Secondly, and by the same token, there is no distinction made, in a paradigmatic representation, between describing some feature and relating it to other features: describing anything consists precisely in relating it to everything else. iii: Thirdly, the paradigmatic mode of description models language as a resource, not as an inventory; it defines the notion of meaning potential and provides an interpretation of the system in the other, Saussurean sense, but without setting up a duality between a langue and a parole. iv: Fourthly, it motivates and makes sense of the probabilistic modelling of grammar.
Probability can only be understood as the relative probabilities of the terms in a (closed) system.
v: Fifthly, representing grammar paradigmatically shapes it naturally into a lexicogrammar; the bricks-&-mortar model of a lexicon of words stuck together by grammatical cement can be abandoned as an outmoded relic of structuralist ways of thinking. This last point was adumbrated many years ago under the formulation 'lexis as delicate grammar' (see above, Chapter 2); it has subsequently been worked out theoretically and illustrated in two important papers by Hasan (1985; 1987). The principle is that grammar and lexis are not two distinct orders of phenomena; there is just one stratum here, that of (lexico)grammar, and one among the various resources that the grammar has for making meaning (i.e. for realizing its systemic features) is by lexicalizing, that is, choosing words. In general, the choice of words represents a delicate phase in the grammar, in the sense that it is only after attaining quite some degree of delicacy that we reach systems where the options are realized by the choice of the lexical item. The lexicogrammar is thus construed by the grammatics as a cline, from most grammatical to most lexical; but it is also a complementarity, because we can also view lexis and grammar as different perspectives on the whole. The reason people write grammars on the one hand and dictionaries on the other is that options at the most general (least delicate) end of the cline are best illuminated by one set of techniques, while options at the most delicate (least general) end are best illuminated by a different set of techniques. One can employ either set of techniques all the way across; but in each case there will be diminishing returns (increasing expenditure of energy, with decreasing gains).
To say that, as the description moves towards the lexical end, one eventually reaches systems where the options are realized by the choice of a lexical item, does not mean, on the other hand, that these are systems where there is a direct correspondence of feature to item, such that feature 1 is realized by lexical item a, feature 2 by lexical item b and so on. What it means is that one reaches systems where the features are components of lexical items. (Thus, they are like the features of a standard componential analysis, except that they form part of the overall system network and no distinction is made between features that are lexical and those that are grammatical.) Any given lexical item then appears as the conjunct realization of a set of systemic features; and the same lexical item may appear many times over, in different locations, much as happens in a thesaurus (where however the organization is taxonomic rather than componential).
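The idea that a lexical item is the conjunct realization of a set of systemic features, and that the same item may recur at different locations in the network (as in a thesaurus), can be mimicked with a small data structure. The features and items below are invented for illustration and do not reproduce any actual systemic description.

```python
# Each lexical item realizes a conjunction of systemic features; the
# feature names here are purely illustrative stand-ins for the terminal
# options of a system network.
LEXICON = {
    frozenset({"motion", "on-foot", "leisurely"}): "stroll",
    frozenset({"motion", "on-foot", "unmarked"}): "walk",
    frozenset({"motion", "unmarked"}): "go",
    # The same item appearing at a second network address:
    frozenset({"motion", "departure"}): "go",
}

def realize(features):
    """Return the lexical item that conjointly realizes this feature set,
    or None if no single item realizes exactly these features."""
    return LEXICON.get(frozenset(features))

assert realize({"motion", "on-foot", "leisurely"}) == "stroll"
# 'go' is reachable from more than one feature bundle:
assert realize({"motion", "unmarked"}) == realize({"motion", "departure"})
```

Note that there is no one-to-one mapping of feature to item: each item corresponds to a whole set of features, which is the point the paragraph above makes against treating lexical realization as a simple feature-to-word correspondence.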
13 A note on delicacy

Inherent in the paradigmatic orientation is the concept of variable delicacy, in which again the grammatics mimics the grammar: delicacy in the construal of grammar (by the grammatics) is analogous to delicacy in the construal of experiential phenomena (by the grammar). Since for the most part the lexicalized mode of realization is associated with fairly delicate categories in the grammar, we can talk of lexis as delicate grammar (this refers to lexical items in the sense of content words; grammatical items, or function words, like the, of, it, not, as, turn up in the realization of very general systemic features). But this is not the same thing as saying that when one reaches the stage of lexical realization one has arrived at the endpoint in delicacy. What is the endpoint, on the delicacy scale? How far can the grammatics go in refining the categories of the grammar? In one sense there can be no endpoint, because every instance is categorially different from every other instance, since it has a unique instantial context of situation. We tend to become aware of this when an instance is codified in the work of a major writer and hence becomes immortalized as a quotation. It seems trivial; but it may not be trivial in the context of intelligent computing, where the program might need to recognize that, say, 'turn left!', as an instruction to the car, has a different meaning and therefore a different description at every instance of its use. This is the sense in which a grammar can be said to be an infinite (i.e. indefinitely large) system. But if we are literate, then in our commonsense engagements with language, in daily life, we behave as if there is an endpoint in delicacy: namely, that which is defined by the orthography. We assume, in other words, that if two instances look different (i.e.
are represented as different forms in writing) they should be described as different types; whereas if two instances are written alike they should be described as tokens of the same type: however delicate the description, it will not tease them apart. The orthography is taken as the arbiter of paradigmatic boundaries: the way things are written determines their identity. There is sense in this: writing represents the unconscious collective wisdom of generations of speakers/listeners. And we do allow exceptions. (a) We recognize homonymy and, more significantly, polysemy, where the delicacy of categorization does not stop at the barrier created by the writing system. (b) We accept that there are systematic distinctions which orthography simply ignores: for example, in English, all those realized by intonation and rhythm. (c) And, as already noted, it
never was assumed, except perhaps among a very few linguists, that a function word like of has only one location in the terrain described by the grammatics. These exceptional cases challenge the implicit generalization that the orthographic form always defines a type within the wording. A more explicit principle could be formulated: that, as far as the grammatics is concerned, the endpoint in delicacy is defined by what is systemic: the point where proportionalities no longer continue to hold. As long as we can predict that a : a′ :: b : b′ :: . . . , we are still dealing with types, construed as distinct categories for purposes of grammatical description. In practice, of course, we are nowhere near this endpoint in writing our systemic grammars. (I find it disturbing when the very sketchy description of English grammar contained in Halliday (1994) is taken as some kind of endpoint. Every paragraph in it needs to be expanded into a book, or perhaps some more appropriate form of hypertext; then we will be starting to see inside the grammar and be able to rewrite the introductory sketch!) We are only now beginning to get access to a reasonable quantity of data. This has been the major problem for linguistics: probably no other defined sphere of intellectual activity has ever been so top-heavy, so much theory built overhead with so little data to support it. The trouble was that until there were first of all tape recorders and then computers, it was impossible to assemble the data a grammarian needs. Since grammars are very big, and very complex, an effective grammatics depends on having accessible a very large corpus of diverse texts, with a solid foundation in spontaneous spoken language; together with the sophisticated software that turns it into an effective source of information.

14 A note on the corpus

A corpus is not simply a repository of useful examples. It is a treasury of acts of meaning which can be explored and interrogated from all illuminating angles, including in quantitative terms (cf. Hasan 1992a). But the corpus does not write the grammar for us. Descriptive categories do not emerge out of the data. Description is a theoretical activity; and as already said, a theory is a designed semiotic system, designed so that we can explain the processes being observed (and, perhaps, intervene in them). A corpus grammar will be (a description based on) a grammatics that is so designed as to make optimum use of the corpus data available, maximizing its value as an information source
for the description. ('Corpus-based grammar' might be a less misleading term.) It is not a grammatics that is mysteriously theory-free (cf. Matthiessen and Nesbitt 1996). Not even the most intelligent computer can perform the alchemy of transmuting instances of a grammar into the description of a grammatical system. Corpus-based does not mean lexis-based. One may choose to take the lexicologist's standpoint, as Sinclair does (1991), and approach the grammar from the lexical end; such a decision will of course affect the initial design and implementation of the corpus itself, but there is nothing inherent in the nature of a corpus that requires one to take that decision. A corpus is equally well suited to lexis-driven or to grammar-driven description. It is worth recalling that the first major corpus of English, the Survey of English Usage set up by Quirk at University College London, was explicitly designed as a resource for writing a grammar in the traditional sense, that is, one that would be complementary to a dictionary. The most obvious characteristic of the corpus as a data base is its authenticity: what is presented is real language, rather than sentences concocted in the philosopher's den. Typically in trawling through a corpus one comes across instances of usage one had never previously thought of. But, more significantly, any kind of principled sampling is likely to bring out proportionalities that have remained entirely beneath one's conscious awareness. I would contend that it is precisely the most unconscious patterns in the grammar, the 'cryptogrammatic' ones, that are the most powerful in their constitutive effect, in construing experience and in enacting the social process, and hence in the construction of our ideological makeup. Secondly, the corpus enables us to establish the probability profiles of major grammatical systems.
Again, I would contend that quantitative patterns revealed in the corpus as relative frequencies of terms in grammatical systems are the manifestation of fundamental grammatical properties. The grammar is an inherently probabilistic system, and the quantitative patterns in the discourse that children hear around them are critical to the way they learn their mother tongues. Thirdly, the corpus makes it possible to test the realization statements, by using a general parser and, perhaps more effectively, by devising pattern-matching programs for specific grammatical systems; one can match the results against one's own analysis of samples taken from the data. Some form of dedicated parsing or pattern matching is in any case needed for quantitative investigations, since the numbers to be counted are far above what one could hope to process manually (cf. Halliday and James 1993). Fourthly, since modern
corpuses are organized according to register, it becomes possible to investigate register variation in grammatical terms: more particularly, in quantitative terms, with register defined as the local resetting of the global probabilities of the system.
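Computationally, establishing the probability profile of a grammatical system from a register-organized corpus amounts to counting the relative frequencies of its terms per register and comparing them with the global figures. A minimal sketch, with invented tags standing in for the output of a parser or dedicated pattern-matcher:

```python
from collections import Counter

# A toy register-organized 'corpus': each clause is tagged for polarity.
# In practice the tags would come from a parser or pattern-matching
# program, and the samples would be many orders of magnitude larger.
corpus = {
    "conversation": ["positive"] * 9 + ["negative"] * 1,
    "regulations":  ["positive"] * 6 + ["negative"] * 4,
}

def profile(clauses):
    """Relative frequencies of the terms of one system in a sample."""
    counts = Counter(clauses)
    total = sum(counts.values())
    return {term: n / total for term, n in counts.items()}

# Global profile across all registers, against which each register's
# local resetting of the probabilities can be measured.
global_profile = profile([c for sample in corpus.values() for c in sample])
for register, sample in corpus.items():
    print(register, profile(sample))
```

The comparison of each register's profile with `global_profile` is the quantitative counterpart of the definition just given: register variation as a local resetting of global probabilities.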

15 Trinocular vision

The trinocular principle in the grammatics can be simply stated. In categorizing the grammar, the grammarian works 'from above', 'from roundabout' and 'from below'; and these three perspectives are defined in terms of strata. Since the stratum under attention is the lexicogrammar, 'from roundabout' means from the standpoint of the lexicogrammar itself. 'From above' means from the standpoint of the semantics: how the given category relates to the meaning (what it realizes). 'From below' means from the standpoint of morphology and phonology: how the given category relates to the expression (what it is realized by). What are being taken into account are the regularities (proportionalities) at each of the three strata. Since the patterns seen from these three angles tend to conflict, the resulting description of the grammar, like the grammar's own description of experience, must be founded on compromise. This is easy to say; it is not so easy to achieve. Often one finds oneself hooked on one oculation: obsessed, say, with giving the most elegant account of how some pattern is realized, and so according excessive priority to the view from below; then, on looking down on it from above, one finds one has committed oneself to a system that is semantically vacuous. If the view from below is consistently given priority, the resulting description will be a collapsed grammar, so flat that only an impoverished semantics can be raised up on it. On the other hand, if one is biased towards the view from above, the grammar will be so inflated that it is impossible to generate any output. And if one looks from both vertical angles but forgets the view from roundabout (surprisingly, perhaps, the commonest form of trap), the result will be a collection of isolated systems, having no internal impact upon each other. In this case the grammar is not so much inflated or collapsed; it is simply curdled.
Thus the categories of the grammatics, like those of the grammar, rest on considerations of underlying function, internal organization (with mutual definition) and outward appearance and recognition. But there is more than a simple analogy embodied here. I referred above to the notion of semiotic transformation: that the grammar transforms
experience into meaning. The trinocular perspective is simply that: it is the process of transforming anything into meaning, of semioticizing it in terms of a higher order, stratified semiotic. Construing the phenomena of experience means parsing them into meanings, wordings and expressions (you only have to do this, of course, when form and function cease to match; this is why the task is inescapably one of achieving compromise). The entire stratal organization of language is simply the manifestation of this trinocular principle. Making this principle explicit in the grammatics is perhaps giving substance to the notion of 'language turned back upon itself'.

16 Indeterminacy in grammatics

That the grammatics should accommodate indeterminacy does not need explaining: indeterminacy is an inherent and necessary feature of a grammar, and hence something to be accounted for, and indeed celebrated, in the grammatics, not idealized out of the picture, just as the grammar's construal of experience recognizes indeterminacy as an inherent and necessary feature of the human condition. But construing indeterminacy is not just a matter of leaving things as they are. Construing, after all, is a form of complexity management; and just as, in a material practice such as looking after a wilderness, once you have perturbed the complex equilibrium of its ecosystem you have to intervene and actively manage it, so in semiotic practice, when you transform something into meaning (i.e. perturb it semiotically) you also have to manage the complexity. We can note how the grammar manages the complexity of human experience. In the first instance, it imposes artificial determinacy, in the form of discontinuities: thus, a growing plant has to be construed either as tree or as bush or as shrub (or . . .); the line of arbitrariness precludes us from creating intermediate categories like 'shrush'. Likewise, one thing must be in or on another; you are either walking or running, and so on. At the same time, however, each of these categories construes a fuzzy set, whose boundaries are indeterminate: on and run and tree are all fuzzy sets in this sense. Furthermore, the grammar explicitly construes indeterminacy as a semantic domain, with expressions like half in and half on, in between a bush and a tree, almost running and the like. The specific types of indeterminacy discussed in Section 10 above, involving complex relationships between categories, are thus only special cases, foregrounding something which is a property of the grammar as a whole. Now consider the grammatics from this same point of view. The
categories used for construing the grammar, things like noun and subject and aspect and hypotaxis and phrase, are also like everyday terms: they impose discontinuity. Either something is a noun or it is a verb (or . . .); we cannot decide to construe it as a 'nerb'. But, in turn, each one of these itself denotes a fuzzy set. And, thirdly, the same resources exist, if in a somewhat fancier form, for making the indeterminacy explicit: verbal noun, pseudo-passive, underlying subject, and so on. What then about the specific construction of indeterminacy in the overall edifice constructed by such categories? Here we see rather clearly the grammatics as complexity management. On the one hand, it has specific strategies for defuzzifying, for imposing discontinuity on the relations between one category and another; for example, for digitalizing the grammar's clines (to return to the example of force, cited in Section 10, it can establish criteria for recognizing a small, discrete set of contrasting degrees of force). A system network is a case in point: qualitative relationships both within and between systems may be ironed out, so that (i) the system is construed simply as a or b (or . . .), without probabilities, and (ii) one system is either dependent on or independent of another, with no degrees of partial association. But, at the same time, the grammatics exploits the various types of indeterminacy as resources for managing the complexity. I have already suggested that the concept of lexicogrammar (itself a cline from most grammatical to most lexical) embodies a complementarity in which lexis and grammar compete as theoretical models of the whole. There are many blends of different types of structure, for example the English nominal group, construed both as multivariate (configurational) and as univariate (iterative), but without ambiguity between them. And the two most fundamental relationships in the grammatics, realization and instantiation, are both examples of indeterminacy.
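Digitalizing a cline, such as the continuum of force realized by width of pitch movement, can be pictured as binning a continuous variable into a small, discrete set of contrasting terms. The thresholds, labels and the normalization of width to a 0..1 scale below are arbitrary assumptions, chosen only to make the point:

```python
# Force construed as a continuum (width of pitch movement, here
# normalized to the range 0..1) is defuzzified into a discrete set of
# contrasting degrees. Cut-off points and labels are illustrative only.
DEGREES = [(0.33, "mild"), (0.66, "neutral"), (1.01, "strong")]

def digitalize(width):
    """Map a continuous pitch-movement width onto a discrete degree of force."""
    if not 0.0 <= width <= 1.0:
        raise ValueError("width must lie in the range 0..1")
    for threshold, degree in DEGREES:
        if width < threshold:
            return degree

assert digitalize(0.1) == "mild"
assert digitalize(0.5) == "neutral"
assert digitalize(0.9) == "strong"
```

The loss involved is exactly what the text describes: the underlying variable remains continuous, but the description recognizes only a small set of contrasting terms, with the boundaries imposed rather than found.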
I have said that a grammar is a theory of human experience. But that does not mean, on the other hand, that it is not also part of that experience; it is. We will not be surprised, therefore, if we find that its own complexity comes to be managed in ways that are analogous to the ways in which it itself manages the complexity of the rest. In the last resort, we are only seeing how the grammar construes itself.

17

A note on realization and instantiation

I referred earlier to these two concepts as being critical when we come to construe a higher order semiotic. Realization is the name given to the relationship between the strata; the verb 'realize' faces upwards, such that the lower stratum realizes the higher one. (Realization is also extended to refer to the intrastratal relation between a systemic feature and its structural (or other) manifestation.) Instantiation is the relationship between the system and the instance; the instance is said to instantiate the system. It can be said that, in the elements of a primary semiotic (signs), the signifier realizes the signified; but this relationship is unproblematic: although the sign may undergo complex transformations of one kind or another, there is no intermediate structure between the two (no distinct stratum of grammar). With a higher order semiotic, where a grammar intervenes, this opens up the possibility of many different types of realization. It is not necessary to spell these out here; they are enumerated and discussed in many places (for example Berry 1977; Fawcett 1980; Martin 1984; Hasan 1987; Matthiessen 1988; Eggins 1994).
But there is another opening-up effect which is relevant to the present topic: this concerns the nature and location of the stratal boundary between the grammar and the semantics. This is, of course, a construct of the grammatics; many fundamental aspects of language can be explained if one models them in stratal terms, such as metaphor (and indeed rhetorical resources in general), the epigenetic nature of children's language development, and metafunctional unity and diversity, among others. But this does not force us to locate the boundary at any particular place. One can, in fact, map it on to the boundary between system and structure, as Fawcett does (system as semantics, structure as lexicogrammar); whereas I have found it more valuable to set up two distinct strata of paradigmatic (systemic) organization.
But the point is that the boundary is indeterminate: it can be shifted; and this indeterminacy enables us to extend the stratal model outside language proper so as to model the relationship of a language to its cultural and situational environments. Instantiation is the relationship which defines what is usually thought of as a 'fact', in the sense of a physical fact, a social fact and so on. Facts are not given; they are constructed by the theorist, out of the dialectic between observation and theory. This has always been a problem area for linguistics: whereas the concept of a physical principle became clear once the experimental method had been established (a 'law of nature' was a theoretical abstraction constructed mathematically by the experimenter), the concept of a linguistic principle has proved much more difficult to elucidate.
Saussure problematized the nature of the linguistic fact; but he confused the issue of instantiation by setting up langue and parole as if they had been two distinct classes of phenomena. But they are not. There is only one set of phenomena here, not two; langue (the linguistic system) differs from parole (the linguistic instance) only in the position taken up by the observer. Langue is parole seen from a distance, and hence on the way to being theorized about. I tried to make this explicit by using the term 'meaning potential' to characterize the system, and referring to the instance as an 'act of meaning'; both implying the concept of a 'meaning group' as the social-semiotic milieu in which semiotic practices occur, and meanings are produced and understood. Instantiation is a cline, with (like lexicogrammar) a complementarity of perspective. I have often drawn an analogy with the climate and the weather: when people ask, as they do, about global warming, 'is this a blip in the weather, or is it a long-term shift in the climate?', what they are asking is: from which standpoint should I observe it: the system end, or the instance end? We see the same problem arising if we raise the question of functional variation in the grammar: is this a cluster of similar instances (a text type, like a pattern of semiotic weather), or is it a special alignment of the system (a register, like a localized semiotic climate)? The observer can focus at different points along the cline; and, whatever is under focus, the observation can be from either of the two points of vantage.
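The sense in which 'langue is parole seen from a distance' can be given a toy computational rendering (mine, not Halliday's; the mood categories and proportions are invented): the system is just the relative frequencies that a stream of instances settles into, and a register shows up as a local realignment of those probabilities.

```python
from collections import Counter

def system_from_instances(instances):
    """Derive systemic probabilities from a stream of instances:
    the 'climate' read off from the accumulated 'weather'."""
    counts = Counter(instances)
    total = sum(counts.values())
    return {feature: n / total for feature, n in counts.items()}

# An invented stream of mood selections across many clauses:
whole_language = ["declarative"] * 90 + ["interrogative"] * 10
# The same system locally realigned in a register, say casual conversation:
conversation = ["declarative"] * 60 + ["interrogative"] * 40

print(system_from_instances(whole_language))  # {'declarative': 0.9, 'interrogative': 0.1}
print(system_from_instances(conversation))    # {'declarative': 0.6, 'interrogative': 0.4}
```

Nothing new is added in moving from the instance list to the probability table; only the observer's distance changes, which is the point of the cline.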

18

Realization and instantiation: some specific analogies

It is safe to say that neither of these concepts has yet been thoroughly explored. Problems arise with instantiation, for example, in using the corpus as data for describing a grammar (why a special category of 'corpus grammar'?); in relating features of discourse to systemic patterns in grammar (why a separate discipline of 'pragmatics'?); and in construing intermediate categories (such as Bernstein's 'code', which remains elusive (like global warming!) from whichever end it is observed, which is what makes it so powerful as an agency of cultural reproduction). (See Francis 1993 for the concept of corpus grammar; Martin 1992 for showing that there can be a system-based theory of text; Bernstein 1990 for code; Hasan 1989, 1992b for interpretation of coding orientation; and also Sadovnik 1995 for discussion of Bernstein's ideas.) As far as realization is concerned, Lemke has theorized this powerfully as 'metaredundancy' (Lemke 1984) (and cf. Chapter 14 above); but this still leaves problems in understanding how metafunctional diversity is achieved, and especially the non-referential, interpersonal aspects of meaning; and in explaining the realization principles at work at strata outside language itself (see Thibault (1992) and Matthiessen (1993a) on issues relating to the construal of interpersonal meanings; Eggins and Martin in press, Hasan (1995), Matthiessen (1993b), on issues involving the higher strata of register and genre). I am not pursuing these issues further here. But as a final step I will shift to another angle of vision and look at realization and instantiation from inside the grammar, turning the tables by using the grammar as a way of thinking about the grammatics.
One of the most complex areas in the grammar of English is that of relational processes: processes of being, in the broadest sense. I have analysed these as falling into two major types: (i) attributive, and (ii) identifying. The former are those such as 'Paula is a poet', 'this case is very heavy', where some entity is assigned to a class by virtue of some particular attribute. The latter are those such as 'Fred is the treasurer' / 'the treasurer is Fred', 'the shortest day is 22nd June' / '22nd June is the shortest day', where some entity is identified by being matched bi-uniquely with some particular other. (See Halliday 1967–8; 1994.) The identifying relationship, as construed in the grammar of English, involves two particular functions, mutually defining, such that one is the outward form, that by which the entity is recognized, while the other is the function the entity serves. This relationship of course takes a variety of more specific guises: form / function, occupant / role, sign / meaning, and so on. I labelled these grammatical functions Token and Value.
This Token / Value relationship in the grammar is exactly one of realization: the Token realizes the Value, the Value is realized by the Token. It is thus analogous to the relationship defined in the grammatics as that holding between different strata. The grammar is modelling one of the prototypical processes of experience as constructing a semiotic relationship, precisely the one that is fundamental to the evolution of the grammar itself. The attributive relationship involves a Carrier and an Attribute, where the Attribute does not identify the Carrier as unique but places it as one among a larger set. It was pointed out by Davidse (1992) that this Carrier / Attribute relationship in the grammar is actually one of instantiation: the Carrier is an instance of, or instantiates, the Attribute. It is thus analogous to the relationship defined in the grammatics as that holding between an instance and the (categories of the) system.
(In that respect the original term 'ascriptive', which I had used earlier to name this type of process, might better have been retained, rather than being replaced by 'attributive'.) Here too, then, the grammar is construing a significant aspect of human experience, the perception of a phenomenon as an instance of a general class, in terms of a property of language itself, where each act of meaning is an instance of the systemic meaning potential. Of course, the boot is really on the other foot: the grammatics is parasitic on the grammar, not the other way around. It is because of the existence of clause types such as those exemplified above that we are able to model the linguistic system in the way we do. The grammatics evolves (or rather one should say the grammatics is evolved, to suggest that it is a partially designed system) as a metaphoric transformation of the grammar itself. This is a further aspect of the special character of grammatics: while all theories are made of grammar (to the extent that they can be construed in natural language), one which is a grammar about a grammar has the distinctive metaphoric property of being a theory about itself.
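The two analogies can be made concrete in a toy rendering (an editorial illustration; the clause examples are the chapter's, the code is not): the attributive relation behaves like class membership, while the identifying relation behaves like a bi-unique Token / Value mapping, readable in both directions.

```python
# (i) Attributive ~ instantiation: the Carrier instantiates the Attribute,
# i.e. it is placed as one member among a larger set.
poets = {"Paula", "Rilke", "Du Fu"}

def is_attributive(carrier, attribute_class):
    return carrier in attribute_class  # 'Paula is a poet'

# (ii) Identifying ~ realization: Token and Value are matched bi-uniquely,
# so the same pairing supports both clause orders.
token_of = {"the treasurer": "Fred"}            # Value -> Token
value_of = {v: k for k, v in token_of.items()}  # Token -> Value

assert is_attributive("Paula", poets)           # Paula is one among the poets
assert token_of["the treasurer"] == "Fred"      # 'the treasurer is Fred'
assert value_of["Fred"] == "the treasurer"      # 'Fred is the treasurer'
```

The asymmetry is visible in the code: the attributive test runs only one way (a poet is not thereby 'Paula'), whereas the identifying mapping inverts without loss.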

19

Centricity

Since the grammatics is a theory about language, it is logocentric, or rather perhaps semocentric: its task is to put semiotic systems in the centre of attention. In the same way, biological sciences are bio-centric: biased towards living things; and so on. I think it is also a valid goal to explore the relevance of grammatics to semiotic systems other than language, and even to systems of other types. The grammatics is also totalizing, because that is the job of a theory. Of course, it focuses on the micro as well as on the macro, the semiotic weather as well as the semiotic climate; but that again is a feature of any theoretical activity. It has always been a problem for linguists to discover what are the properties of human language as such, and what are features specific to a given language. The problem is compounded by the fact that there is more than one way of incorporating the distinction (wherever it is drawn) into one's descriptive practice. Firth articulated the difference between two approaches: what is being sketched here is 'a general linguistic theory applicable to particular linguistic descriptions, not a theory of universals for general linguistic description' (Firth 1957: 21; Firth's emphasis). I have preferred to avoid talking about 'universals', because it seems to me that this term usually refers to descriptive categories being treated as if they were theoretical ones.
As I see it, the theory models what is being treated as universal to human language; the description models each language sui generis, because that is the way to avoid misrepresenting it. Thus while the theory as a whole is logocentric, the description of each language is what we might call glottocentric: it privileges the language concerned. The description of English is anglocentric, that of Chinese sinocentric, that of French gallocentric, and so on. (Note that the theory is not anglocentric; the description of English is.) This is not an easy aim to achieve, since it involves asking oneself the question: how would I describe this language as if English (or other languages that might get used as a descriptive model) did not exist? But it is important if we are to avoid the anglocentric descriptions that have dominated much of linguistics during the second half of the century. In practice, of course, English does exist, and it has been extensively described; so inevitably people tend to think in terms of categories set up for English or for other relatively well-described languages. I have suggested elsewhere some considerations which seem to me relevant to descriptive practice (Halliday 1992). As far as my own personal history is concerned, I worked first of all for many years on the grammar of Chinese; I mention this here because when I started working on English people told me I was making English look like Chinese! (It seems ironic that, now that systemic theory is being widely applied to Chinese studies, the work of mine most often cited as point of reference is the descriptive grammar of English.) In my view an important corollary of the characterological approach (that is, each language being described in its own terms) is that each language is described in its own tongue.
The protocol version of the grammar of English is that written in English; the protocol version of the grammar of Chinese is that written in Chinese; and so on. The principle of 'each language its own metalanguage' is important, because all descriptive terminology carries with it a load of semantic baggage from its use in the daily language, or in other technical and scientific discourses; and this semantic baggage has some metalinguistic value. This applies particularly, perhaps, to the use of theoretical terms as metacategories in the description; words such as (the equivalents of) 'option', 'selection', 'rank', 'delicacy' are likely to have quite significant (but variable) loadings. But the principle also helps to guard against transferring categories inappropriately.
Even if descriptive terms have been translated from English (or Russian, or other source) in the first place, once they are translated they get relocated in the semantic terrain of the new language, and it becomes easier to avoid carrying over the connotations that went with the original. So if, say, the term 'subject' or 'theme' appears in a description of Chinese written in English, its status is as a translation equivalent of the definitive term in Chinese. Perhaps one should point out, in this connection, that there can be no general answer to the question of how much alike two things have to be for them to be called by the same name!

20

A final note on grammatics

As I said at the beginning, when I first used the term 'grammatics' I was concerned simply to escape from the ambiguity whereby 'grammar' meant both the phenomenon itself (a particular stratum in language) and the study of that phenomenon; I was simply setting up a proportion such that grammatics is to grammar as linguistics is to language. But over the years since then I have found it useful to have 'grammatics' available as a term for a specific view of grammatical theory, whereby it is not just a theory about grammar but also a way of using grammar to think with. In other words, in grammatics, we are certainly modelling natural language; but we are trying to do so in such a way as to throw light on other things besides. It is using grammar as a kind of logic. There is mathematical logic and there is grammatical logic, and both are semiotic systems; but they are complementary, and in some contexts we may need the evolved logic of grammar rather than, or as well as, the designed logic of mathematics. This reflects the fact that, as I see it, grammatics develops in the context of its application to different tasks. As Matthiessen (1991b) has pointed out, this, in general, is the way that systemic theory has moved forward.
Recently, a new sphere of application has been suggested. As mentioned above in Section 10, Sugeno has introduced the concept of intelligent (fuzzy) computing: this is computing based on natural language (Sugeno 1995). He has also called it 'computing with words', although as I have commented elsewhere (Halliday 1995) this is really computing with meanings. Sugeno's idea is that for computers to advance to the point where they really become intelligent they have to function the way human beings do, namely, through natural (human) language. This view (and it is more than a gleam in the eye: Sugeno has taken significant steps towards putting it into practice) derives ultimately from Zadeh's fuzzy logic; it depends on reasoning and inferencing with fuzzy sets and fuzzy matching processes. But to use natural language requires a grammatics: that is, a way of modelling natural language that makes sense in this particular context. Systemic theory has been used extensively in computational linguistics; and the Penman Nigel grammar, and Fawcett's COMMUNAL grammar, are among the most comprehensive grammars yet to appear in computational form (Matthiessen 1991a; Matthiessen and Bateman 1992; Fawcett and Tucker 1990; Fawcett, Tucker and Lin 1993). But, more importantly perhaps, systemic grammatics is not uncomfortable with fuzziness. That is, no doubt, one of the main criticisms that has been made of it; but it is an essential property that a grammatics must have if it is to have any value for intelligent computing. This is an exciting new field of application; if it prospers, then any grammarian privileged to interact with Sugeno's enterprise will learn a lot about human language, as we always do from applications to real-life challenging tasks.
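To make the Zadeh-style reasoning referred to here concrete (a minimal editorial sketch; the membership function, categories and numbers are invented and are not drawn from Sugeno's system), a fuzzy set assigns graded rather than all-or-nothing membership, and matching is done over those grades:

```python
def tall(height_cm):
    """Graded membership in the fuzzy set 'tall':
    0 at or below 160 cm, 1 at or above 190 cm, a cline in between."""
    return min(1.0, max(0.0, (height_cm - 160) / 30))

def fuzzy_and(a, b):
    """Zadeh's min-conjunction of two membership degrees."""
    return min(a, b)

def fuzzy_match(profile, prototype):
    """Degree to which a profile matches a prototype: the weakest
    agreement across the prototype's features."""
    degrees = [1 - abs(profile[k] - prototype[k]) for k in prototype]
    return min(degrees)

print(tall(175))  # 0.5: neither clearly 'tall' nor clearly not
```

The contrast with the defuzzified view is exactly the one drawn earlier for the grammar's categories: a grammatics that is 'not uncomfortable with fuzziness' works with the graded memberships directly rather than forcing a cut.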

Note
1. This is not to question the semiotic achievements of the bonobo chimpanzees (cf. Introduction, p. 3). The issue is whether their construal of human language is an equivalent stratified system, with a lexicogrammar at the core.
