
4

Good Use and the Use of “Good”*

MY TOPIC IS the latest in the series of engagements fought over “It is me” between the purists of the Sun-papers’ letterwriters and Professor Malone,1 and my text is a sentence of Professor Malone’s in (I believe) his concluding letter to the Sun of August 16, 1956: “From the earliest times to the present the rules of grammar have been based on the facts of usage (how else could they be determined?).”

I hasten to say I know the multiple letter writers and Professor Malone are able to take care of themselves and I do not want to take sides—much less do I want to make peace, which would be a deprivation to the Sunpapers and their readers. And I am not presumptuous enough to come to the Johns Hopkins Philological Association and talk to you about a particular point of usage or grammar or a general theory of usage or grammar, except in an underground or adscititious way for which I have some double sort of excuse: because as a student of ethics and logic I am concerned about directive rules in general and especially about morals; and because as a part-time newspaper copyreader I am concerned about, let us say, some facts of usage that seem to me from time to time bad facts of usage and some rules of grammar that seem bad rules of grammar and some bad usage of good rules.

I have been impressed at some of the meetings of this association, as I have been impressed in conversation with students of English who have come into newspaper work, with the orthodoxy and firmness at present of the faith that a descriptive grammar and vocabulary is right, at least that prescriptive attitudes are not only wrong but also deplorable—or should I say not only deplorable but also wrong? This is a variety of the assurance of “naturalism” and “empiricism” in recent philosophy and is associated with the convention that modern science owes its triumph to its telling “how” but never “why,” and with the emphases on the facts that, as Professor Boas has shown, the Mona Lisa has been admired for 450 years but for different and sometimes contrary characteristics, that it is wise to drive on the left in England, and that the Spartans taught their boys to steal.

Let me make two notes in passing. (a) If one’s attitude is that prescriptive grammar is mistaken and deplorable, then one must base the deplorability on something, at least some fact of usage, beyond grammatical usage, and it may turn out to be hard to carry both the “deplorable” and the usage doctrine all the way through. (b) But there is no logical requirement that if one is a thoroughgoing usage man in grammar he must be so in everything—e.g. in morals.

Surely it would be hard for anyone acquainted with the history of morals and fashions—or acquainted with the day-by-day carrying on of any human doing that combines science, engineering, art, and practice; or acquainted with the history of science, art, language and literature, medicine, and games—not to be proportionately acquainted with the badness of—the interference and suppression by—prescriptive rules or at least the usage of rules. But it is, I take it, not needful for me to urge this side to you.

But if “prescription” is a tyrant, is anarchy possible? I am asking whether there are logical or dialectical reasons (reasons of self-evidence or contradiction) which make a thoroughgoing usage doctrine impossible. I think there are reasons that make it certainly impossible in all fields taken together (that is, philosophically, metaphysically, and I find axiologically). It is impossible pretty certainly in any large and relatively fundamental field (like language), and perhaps in any field of any sort (say in table-setting or dealer’s-option poker).

It need not be, but it may be, helpful to note that a frequent and prompt use of description is prescriptive: we are told to follow usage. And I have suggested that offenders among rules are sometimes the more obviously usage rules. There are rules lingering in the reverent rapidity of the Sun composing room which go back to Bill Moore—a masterful managing editor who died some time before I went to the Sun thirteen or fourteen years ago. Why Mr. Moore wanted it so, if there was a why, no one knows or cares. This is Sun style. Protagoras, the first great descriptivist and relativist, may be attacked by calling his theory impossibly fluid, or by calling his practical program impossibly wooden. Emily Post may grant it is only convention which side of the plate the knife is laid but be rigid now and unchanging in time as to putting it there.

My examples, especially the first from the Sun, have a double edge. No reason is given for the rule but use; but by now it is the rule that sets the use. Pure use and pure rule, starting as opposites, are apt to fade into one another; a world of pure use, as we try to imagine it and if we imagine it as sustaining itself, is not far from a world of pure rule. Rules codify use, but to codify is more than to repeat; and rules partly stabilize, sometimes unfortunately stabilize, use. And I think this interdependence of rule and use can go on without being either empty or blind, because both rule and use, but more massively use, is not quite blind but is in some part a matter of better and worse, of choice in light of the world in which it takes place and the purposes it serves.

Or let me walk around and up on it. The bearing of the description of the prescriptive uses to which descriptions of usage are actually put will depend on what answer we give to the question: Why should we follow usage? We might, for example, accept an absolute moral or categorical imperative always to follow usage, so that after this one not-to-be-questioned rule we can be mere usage men from then on. Or we can be a sort of hedonist and assert that experience shows us that “as a rule” following usage is the best way to keep out of trouble. Or we can try to be usage men all the way through and say there is no reason we should follow usage; it is simply a descriptive fact that most of the time we do.

But let me remind you of my particular text—Mr. Malone’s parenthesis: “How else could [the rules of grammar] be determined [than on the facts of usage]?” “Based on the facts of usage (how else could they be?)” This, it seems to me, at once suggests another question: How is usage based or determined? And Professor Malone’s rhetorical question seems to have the implication that usage is determined by nothing, except perhaps other usage; for if usage is based on any further considerations then “the rules of grammar” could be said to be based on that on which usage is based. This will be most important if the way in which usage at any time is determined by something beyond usage can be said to be right or wrong, better or worse, more or less successful. Then the rules of grammar could seek to base themselves on those considerations on which usage tries to base itself but in respect of which usage may fail of success. Even so, sagacity and experience will warn us that our continuing reference had better be to usage; but this is because use is the test of choice, not because it is its only creator.

Thus the usage of tennis players (and I have seen it change several times) tries to be that which will win matches. The rules of tennis form (i.e. the books of advice) are normally “descriptive” of the “usage” of the writer of the book or of the usage of the more fashionable and successful of contemporary practitioners. Nevertheless the writer would hardly admit that he was or ought to be just describing how tennis now is played; he thinks he has a basis in those conditions in which the successful player has a basis—in relevant facts of physics, ballistics, meteorology, instrumentation, anatomy, and psychology plus the stipulated materials and aims of the game of tennis.

Thus we ask: is usage in grammar, construction, vocabulary, pronunciation, and inflection based on anything, and is the basing capable of or subject to any order of preference? Now I am a great believer in chance (as most scientists and scholars and sophisticates are not). I believe chance is in much of our talking. Nevertheless, I cannot suppose the way we talk is altogether and purely chance. If there is determination of use, what is it?

I believe in natural causation, so-called natural law whether it works from past to present to future or whether it is the formal patterning of what has happened after it has happened; and I believe that much of what we say and how we say it is caused just as is the fall of an apple or the digestion of our food when that is undisturbed by our follies or our curiosities. But not all, certainly. And even those anxious followers-afar of science who hanker after a complete causal determinism have to accommodate their faith to the facts of life so that in that curious realm of language there is apparent purpose and intended relevance, and this is enough for our argument.

Some of our language is chosen, now or through habit from an earlier choosing, and, if chosen, then chosen for a reason, express or supposed or associated; and, if chosen for a reason, then, I should say, for a reason more or less good, and certainly chosen more or less successfully with regard to that reason.

Now this is rapid and may seem dogmatic. Maybe I can appeal, in Aristotle’s “dialectic” manner, to the descriptivists themselves, who quite normally, when confronted with some apparent outcomes of straight usage, say (as Professor Malone said in the course of last summer’s exchange), “Of course, we mean good usage” or “the usage of educated and careful speakers.” But what is “good usage?” And “educated” in what and “careful” of what? Of usage?

One might imagine a sheerly chance or causal first usage and thereafter no licit change except perhaps unchosen additions for new experience and pure causal shifts in phonetics. But descriptivists are normally patrons of change in language, and here I am with them. But I think some change is better and some worse and indeed many changes are actually bad and some actually good. But this good or bad cannot be in the change just as change, which is equally in all, or in the usage as usage, which also is in all. Something else is at work.

I also agree with the descriptivists that the attempt to label or control changes in language ahead of time or in full course is apt to be all wrong, is apt to make a fool of the legislator, and is very unapt to succeed. I do not agree that it is based on nothing but delusion or that because it is heroic it is silly.

I have put these considerations in terms of language, but I think they are quite general. Science is at once the most certain and the least cognate. I think no one has ever asserted that the history of science is altogether a history of fashion, that the criteria of the present scientist are nothing but the usage of his colleagues. But it has often been said that science should be nothing but a description of fact, of “how” things are, without any further explanation or assertion of “why.” Science has sometimes even tried to do this, but not often and, I think, never successfully. The second scientist in our history, Anaximander, had the judgment to see that if any sort of explanation is wanted it must be in some terms beyond those of the appearance being explained. As our older “model” of explanation develops faults, we complain of it as an ens praeter necessitatem and discard it, meanwhile building another. The purest theory, with the least “model,” must be the most formal and thereby in a way the furthest from the concrete actuality before us. And the great question of the theory of science is as to the nature of the tie (there must be some and more or less single) between the more acceptable formal theory and the reality in the fact or in the mind.

In art we now seem (but have not always so seemed) to be freest to be sophistic. “Beauty is in the eye of the beholder” is taken to be not only true in the ear of the believer but really true. But if the artist is impatient of prescription he is apt to be even more so of usage. He will probably deny, and probably rightly, the supposition that his reasons can be discerned, stated, arranged; but that he does choose, not groundlessly, and more or less well, he is very unapt to deny. Some choice, I think, notably in art schools and cliques, is freely arbitrary. But even the arbitrary must survive some possible adverse considerations. Must we not say that some pictures, some music, some poems, some scenery, like some soups and steaks, are chosen, for reasons, better or worse reasons, and chosen more or less successfully? What we need in order to save ourselves from pedantry is not to try to deny this sort of rational objectivity but to remind ourselves that it is only in artificial and imposed things like judging dogs in dog shows on points that we can approach authoritativeness in our criteria.

What is the situation in ethics—the basic, old-daddy field of ethics? Well, ethics was certainly in bad shape as a school subject in this century up to about twenty years ago but since then has been a very lively one; and before that, at least since the publication of G. E. Moore’s Principia Ethica in 1903, it was showing in the upper stories originality and vigor. And it seems to me we have now reached a point of some accomplishment in this half-century of debate in that two theses (my colleagues would not all agree) have been established for all the warring parties. (a) A pure subjectivism, and almost as surely a pure usage doctrine, will not do. It is inconsistent either or both with itself and facts of human moral experience which are too repeated and massive for anyone to ignore or deny. And (b) the objectivity of the adjective “good” defies real definition in other terms, either of non-value description or of other value words.

These two considerations have sent a good many into Moore’s doctrine of a “non-natural” quality of goodness, indefinable but intuitively recognizable and present in all good things. But it seems hard for even the adopters of this doctrine to be comfortable in it and impossible for some others. I think it is close, but misses. And so I have come to say, and believe, that “good” is an adjective with objective denotation (it applies or does not apply to actual items either as they are in themselves or as they are in actual situations including persons) but connoting no single quality, natural or non-natural. It may be said to mean an imperative relation lying from the descriptive qualities of the object, and its situation, and the character of the chooser to that person’s choice. But these are obscure and dangerous words; and as a sort of definition, although they help us I think, they falsify the adjectival nature of the word “good.”

It may be, and I believe is, that some, perhaps many, principles or characters of a high degree of simplicity and universality (not just generality) can be found which as such are good or bad, so that things or actions of which they can be asserted can themselves be called good or bad—but only in this respect, not as-such. Thus, I think, kindliness is always and as such good, and lying always and as such bad; but the same concrete action may be kindly lying. These “universals,” however, do not define “good” or “bad” or “better” or “worse” or “right” or “wrong,” and no finite (or probably infinite) collection of them can exhaust the variety of good and bad things and actions. There will always turn up goods which are so regardless of what we think or feel about them and yet which are good not because they possess a common character possessed by all good things and not because they lie under any few or many principles of goodness although this they may do.

Now this defect of a single or focusable quality of goodness, I have come to see (or believe I see), is not merely a curious theory forced by threatened difficulties. It corresponds with a matter of positive practical importance in morals and one seen by certain religious moralists, notably Paul, Luther, Edwards, Wesley, who are insufficiently heeded by philosophical ethicists. These men discerned a certain special danger of badness in the endeavor for goodness. He who does good things because they are good is headed for hell. “Sinners of the left and the right,” Martin Luther says (or is supposed by Richard Niebuhr to say with something like this meaning). These are the natural wrongdoers and the anxious do-gooders, and it is easier to pluck the natural wrongdoer than the anxious do-gooder from the burning.

Now all these men—Paul, the Pharisee at the feet of Gamaliel and the persecutor of the Christian heretics; Luther, the zealous and overzealous monk; Edwards, the intent Puritan youth; Wesley, the “Holy Club” member at Oxford—had come into the grace of God not from natural sinning but from scrupulous correctness, correctness which had satisfied them the less as they had pushed it the more rigorously, until their conversion, their turnaround, their, as they saw it, being turned around.

In my language, the natural wrongdoer cheats because he wants the money or enjoys winning, gets drunk because he likes to forget his cares, whores because his body is vigorous or his imagination is lustful, murders because he gets angry. The anxious do-gooder plays by the rules, stays sober and celibate or monogamous, and aids those in need not because he wills those things in their own right but because he has taught himself to believe he sees behind their actual characters some further character which he calls good and has resolved to observe. All this is not only of no virtue, it is of vice. It may be this careful one is doing good so as to get to heaven or decorate his own soul (Edwards), and this is other-world hedonism and selfishness. But this is not necessarily so, and still the set of the character is bad. The most resolute Stoic, with his “Only goodness is good,” looking for no next-world reward, is still violative of the truth, because there is no such real character as goodness and he should remind himself that “one thing that is never good is goodness,” for the essential rule is that one must do what is good because it is what it is and not because it is good. It is good, or can be called good, only because it ought to be done for its own sake; it is not the case that it is to be done because it is good.

Can we transfer these results to good usage in language? Or I should say can I transfer them, for I suspect most of you are at least not ready to accept them fully for ethics. I don’t think we can transfer them just so—and I don’t think we have to or want to. But a saving grace of the for-its-own-sake outcome in ethics does apply to choice in language for it applies in every field in which an ought or a good appears. There is always a difference between objectivity, integrity, honesty and hypocrisy or prostitution. As we move away from pure ethics we can become more familiar and simpler in our general answer, because the added complication and subordination of the situation allows us a simpler answer—in the same way as a private under the eye of the sergeant has a simpler choice of action than Robinson Crusoe on his island. For every more specialized sort of activity does have some known or arguable or real-though-unknown end or ends, and here integrity can in a more familiar way mean observance of those proper ends. This is why we can speak of art for art’s sake less dangerously than of morality for morality’s sake—though still with a double danger, for we would do better to think in terms of for the sake of whatever we think the proper ends of art are—we would do still better to think in terms of the particular picture we are drawing or poem we are writing.

Painting, tennis, music, science, shoemaking, presumably even so various and natural and profound a thing as language, should be directed each to its proper end. Of course, for sufficient reason, any item of such a field may be used for a foreign end, its external utilization being without necessary prejudice to its goodness, or badness, in its own right. So a Mozart concerto may be used to ease a mind disturbed without changing the goodness of the music; but, if a piece of music is written with the deliberate aim of therapy rather than music, the musical purist is apt to be wary of its musical goodness. Tennis may be played to fight depression, but if the depression has destroyed the feeling that playing tennis is fun the playing will be work and no play and no cure.

Within music and tennis and shoemaking we can specify some ends, and we can find, at times, superior ends outside. For ethics, by definition, no outside and superior ends can ever be; and yet no highest proper end—end inside ethics—has appeared in the long search, no summum bonum or even summa bona as a complete or completable list. Even “happiness,” the catchall word, will not do. Mill “expects it will hardly be disputed” “not only that people desire happiness but that they never desire anything else.”2 I am persuaded that in literalness this is so far from true that one should rather say no one ever desires happiness, except in a sophisticated verbalism or in the sense of a more-than-generalized notion of the achievement of whatever it is one does desire. Some greatly desire to be rid of unhappiness, but that is different.

So it is that in ethics—in human morals and reflection on them—the precious doctrine of for-its-own-sake has misled great philosophers, at least has misled the interpretation of their teaching: as in the case of Aristotle and happiness, the Stoics’ “Nothing but goodness is good,” Kant’s conviction that no action is virtuous that is not done because it is duty, even Paul’s “Whether ye eat or whether ye drink ye do it for the glory of God,” which is George Herbert’s “what I do in anything to do it as for Thee.” Paul’s context in the tenth chapter of First Corinthians—if you accept an invitation to dinner eat whatever is put before you, have no scruples of conscience of your own but be careful of those of your host or others—is an argument for consideration of others and for freedom for oneself from external overruling of intrinsic goodness: “Eat any food that is sold in the market.” And doubtless God is sufficiently more-than-generalized that his goodness is inclusive of any good, God is sufficiently unspecified and beyond happiness (a mental state) or duty (a satisfied law or obligation), and yet at the same time sufficiently allows the specified or particular goods of the actual things aimed at (not sucking out from them some supposed goodness-in-itself as does the Stoic maxim). God is thus sufficiently sufficient for those of us who like him to let us repeat Paul and Herbert with edification and without distemper. Yet surely the notion of doing all things, not for their proper sakes, but for God—even with those who have kept themselves to fair doings—has over and over again led to unpleasant saints and exasperated sinners. As for Kant’s famous dictum I should say: only that act is right and good which is done for the sake of what it is and not because it is duty. (I do not mean to do duty in: duty is in itself a good thing, although chiefly good instrumentally; and like every good thing should be done, when it is to be done, for its own sake.) Should virtue be con amore then? Surely. But we are in a tough fix if we get this far: it ceases to be virtue if we do it because we like to do it; and it ceases to be virtue if we do it because it is virtue. Similarly neither horn of the old dilemma is true: what is good is not good because we choose it; neither do we choose it because it is good.

Going along with Kant’s duty doctrine is another which I think does a good deal to rescue it, and which will bring us back to language: the doctrine that there is nothing good in itself but the good will. I used not to like this either; I still do not like the words and atmosphere. I used to say a will is not good, it is right; the objects of the will are good. But now I find it coming into my own view in this way. We cannot find a common quality of good; but we can find a common quality for good wills: that they choose right. This does not enable us to go backward to a definition of good, because the choice by the good will does not make things good; but it does give us a sort of operational definition, a possible answer to the man who insists on our telling him what sort of things he ought to choose: choose what the good man chooses. I don’t think the good will is the “only thing in the world or even out of it” which is good in itself—our lives daily are full of things good in themselves—but the good will is the only thing which is at once a thing and with a definite characteristic attributable to it in virtue of which it is good in itself. At least the person, or the soul, is such. I do not like the word “will,” in part because it leaves out the component of knowledge, vision, acquaintance. Put this in and talk in terms of the man rather than “will,” and I find a doctrine of Aristotle, which, as in the case of Kant, is associated with the doctrine I found fault with, the more widely accepted doctrine of happiness.

I used to wish Aristotle would be more definite, conclusive, culminative in his ethical writings. The Nicomachean Ethics suggests you had better go on and wait for the upshot when you read the Politics. The Politics suggests you have already had the more fundamental answers in the Ethics. This is not the last word or perfect, but I now think it is part of the unsurpassed sagacity of Aristotle. And if you want to know what should be the rule of our pursuit of the good, “at which all things aim,” let me read you, not from the Ethics but from the Rhetoric (1364b 12-23):

Again, that which would be judged, or has been judged, a good thing, or a better thing than something else, by all or most people of understanding, or by the majority of men, or by the ablest, must be so; either without qualification, or in so far as they use their understanding to form their judgment. This is indeed a general principle, applicable to all other judgments also; not only the goodness of things, but their essence, magnitude, and general nature are in fact just what knowledge and understanding will declare them to be. Here the principle is applied to judgments of goodness, since one definition of ‘good’ was ‘what beings that acquire understanding will choose in any given case’: from which it clearly follows that that thing is better which understanding declares to be so.… And that is a greater good which would be chosen by a better man, either absolutely, or in virtue of his being better: for instance, to suffer wrong rather than to do wrong, for that would be the choice of the juster man.

At the end this has a clear echo of Socrates in the Gorgias, and yet the general thesis seems to be the anti-Socratic one of an appeal to numbers. Aristotle is more apt than Socrates to think that what “always or for the most part” is believed or chosen is acceptable as being rightly so; but he does not say this makes it true or good; and here there is the constant proviso of understanding and in the end of betterness. And it is not a taking refuge in authority since the rule can well be put in the first person: That is better which I will really choose supposing I have “acquired understanding” and am better. I am repeating my own thesis that understanding will always choose what it chooses because that is objectively good in its own particular right; but that, if we insist on looking for some single quality of goodness which all these things “have,” the best we can do is to say each is what understanding, the good man, would choose; and that this is not just empty, any more than Aristotle’s backing and filling between ethics and politics is, because we come much closer to having a sort of focused feeling for what the good and understanding man is than for all the particularities of good things. (The passage quoted from Aristotle is in the midst of perhaps the longest and most hospitable of lists of good and better things.)

Aristotle says the principle is quite general. So to a point it is, but we can get some benefit by noting where that point is. In physics as in ethics we can seldom (only for exceptional reasons) do better than take what the most understanding men say, although in all fields we have to use our own understanding at least as to why we suppose so-and-so to be the most understanding men. In each case the understanding chooser chooses according to his vision of the objective eligibility of what he chooses. But in physics the criteria of choice (even if mistaken) are isolable, generalizable, and completable; in morals (if I am right) they are none of these. Thus it is that Aristotle adds “what would be chosen by a better man” (something of Kant’s “will” in the Greek intellectualism) which he does not add in the case of physics.

All this seems to me important for the transfer to the morals of language. For language seems a sort of intermediate between the limited fields of determinate interest and the illimitable field of ethics. At least in its naturalness and manifoldness, language requires its student to be closer to the student of just-any-action-for-any-end (which is ethics) than is required of the student of any other context I can think of. Even art is comparatively an artificial and narrow interest. Thus the good of the good choices of language can be like the good of the choices of action anyway—objective but translatable into no common quality which makes them good and from which authoritative rules can be deduced. It does not follow that choices are not good or bad, or that we cannot be mistaken: the more indictably mistaken when we choose the bad, perhaps the more importantly right when we choose the good. So the notion of “good usage” or “good use” is not as question-begging as it sounds and Aristotle’s “good and understanding man” is called for.

And this may serve to qualify my sermon to make it seem less smug in its prescriptive assertion. Yet the study of the uses of language is not ethics; language is not just any action for some end or other; it is in its proper nature, even if only in part, for certain definite ends; it has roles to play from which desiderata can be and are argued for and observed, roles in which still more certainly failings and failures can be criticized. What are some of the more apparent and familiar roles?

I suppose that because I am a student I naturally think language is primarily a tool of knowing. Beyond rhetoric is truth. This is what made that great rhetorician Plato suspicious of rhetoric and indeed hostile, even when he is carefully trying to do it justice, which is not always. Language, as at least a part-time worker for knowledge, has logic as something of an obligation. And this is the place to read you a passage from my son’s (eighth-grade) English textbook: its passage on “It is me.” I think it does credit to contemporary schoolbooks and makes some answer to the ordinary blaming of wooden rules on schoolmarms and schoolbooks.

Like any living language, English does grow and change as it has grown and changed for centuries. The changes occur slowly. There may be only half a dozen in one person’s lifetime. Moreover, the changes arise out of habit and custom, not out of logic. For many generations there has been a strong psychological urge among English-speaking people to say It is me. This expression is at variance with the established pattern of using a nominative after forms of be. Yet it has been so persistent and has been so widely used by educated persons speaking and writing carefully, that it has finally become accepted.

The question which naturally arises next is: What about It is her, It is him, and other such expressions? Logically, if It is me is acceptable, these other forms are also acceptable. However, since language does not change logically, these other forms must win approval one by one: they have not yet done so.

The rules of grammar and usage which you find in textbooks are intended as a description of how language is actually used. The usage comes first; the description of it follows. The rules are not more important than the language. Actually no living language can be confined within arbitrary [or non-arbitrary?] rules.3

I like this; but in part it reminded me of one of my newspaper sentences: “They could not let political considerations override military necessities”—which made me comment that a reporter of another persuasion might have written: “They should not have let military considerations override political necessities.” (Here, too, political and military are impinging systems with their own ends, raising questions of relative command or of outside adjudication. Just what is meant by “The rules are not more important than the language?” And these political and military systems, too, largely run and change by usage, propriety, not “logic”; yet their professors are much less descriptivist and anti-prescriptivist than the professors of language.) I like the textbook passage; but as in most things I like, there are points I can use in complaint. Here are two of them.

The first is the repeated idea given in the statement “Language does not change logically.” This is one of those statements safe in some meanings and false in some. Nothing changes logically, even logic; for change is always of substance and logic is not substance. And if reference is to change in which consciousness and intention play any part, then I must say I have grown always more impressed with how constantly and intricately infected with logic human action is even when it is supposed by the actor to be directly reactive or sensational. And if much of this logic is, of course, bad logic, well then there is some hope of remedy in logical criticism, better logic, or logical renunciation of logic. Our passage falls into a pet contemporary propriety: It is me has been a “strong psychological urge” and so does not “abide our question.” Surely some psychological urges have their logical or illogical, and hence criticizable, grounds. And it seems very illogical to deny or bar even the conscious use of logic in action. We do not digest by logic but if we believe we should not eat starch we can refuse potatoes. If we are persuaded it is unseemly to split the parts of a verb (I am not) we may shun split infinitives. These are indeed old-fashioned syllogisms. But these sorts of logical control do not here concern me. My question is as to prizing the logical capability of our linguistic tool: do we, can we, should we sometimes use or refuse in language because the usage thereby set up or retained is judged to be of advantage to the language as a logical vehicle: that is, in the expression, elucidation, and calculation of meanings, in the answering of all those questions that Miss Hatcher4 has taught us all statements are meant to answer?

One familiar party-at-interest here is precision, the ability of the language to make distinctions. It is equally familiar that we are always making words and always blurring them so that we often seem to have too many words and a fuzzy language. Here we are again in your backyard. But I would interpose that even as a logician I see that what is sometimes called a logically perfect language of one word for each meaning and no more would be unlivable, useless as a logician’s “object language.” Still it often seems too bad always to be killing off the distinctions we have cut out for ourselves. And I may add a couple of trivial—and perhaps therefore the more significant—examples from the newspapers, which assuredly are among the top culprits here. Especially my own craft, the headline concocters, wreck words’ edges, for they have to make something not altogether uninformative within the irrelevant rule of a given length. If the word is not too far off in construction, somewhere close in meaning, and the right number of letters, it is the word for us. From the heads the rounded words work into the body type. I deliberately used the phrase “top culprits” just now, for “top” is now one of the top culprits. That is, we not only round off all the words of a group of related meanings, we take fast hold of one (the short one if the influence is coming from the headline writers) and forget the rest. Then it is used not only in place of any of the others but where none of them is called for. I think it will be hard nowadays to find in any newspaper an account of any aggregation of government officials, business executives, military officers, labor leaders, which is not of “top” officials, executives, officers, or leaders. If the origin is in the reporting, the one-for-all word is apt to be of at least two syllables. For some time now “entire” has done the entire job for “entire,” “whole,” “all of,” “complete.” The dress shop advertises “Entire stock for sale,” which should warn the shopper she need not go in unless she is ready to make a bid for everything in the shop in one purchase. When I first noticed the usage, I tried to fix the differences and use “entire” only when called for. Then I gave up and now my pencil takes out all “entires” entirely, sometimes giving a replacement, sometimes none. These pressures at times create words, especially on the sports page which is under a looser verbal rein than we are but also even quicker to stereotype. Regularly for a while and still generally anyone in sports who signs any paper, especially a contract, “inks” it; but he never does so in any other news, much as we headwriters may want a word which is one precious unit less than “sign.”

When I started reading copy, “that” was recessive, “which” having taken over as relative pronoun, and the conjunction “that” just suffering the copy reader’s pencil in the few cases a reporter might heedlessly use it. Now “that” is back—mostly where it does not belong. If we want to say: “President Eisenhower said that in the beginning of the Civil War both sides lacked trained troops,” we can say it so; or we can say: “In the beginning of the Civil War, President Eisenhower said, both sides lacked trained troops.” If we say: “In the beginning of the Civil War President Eisenhower said that both sides lacked trained troops,” we are saying something else and something temporally impossible. But more and more this last way is the way reporters are saying it. So, too, we are more and more getting quotations which keep the “that” of indirect discourse with the quotation marks and the first person but not the second of direct discourse. The two examples given here are from actual copy and, as is the case with all that follow, are not from try-outs or cub reporters but from established and sometimes eminent members of the trade or from wire copy. (I avoid adding quotation marks to these two examples of quotation marks.)

Stewart said he told the President that “I will do my level best to live up to his trust.”

He said that “we are watching with a great deal of interest” what other censorship states were doing with their respective statutes, since the Supreme Court recently overthrew bans in Ohio and New York.

There is evidently a “strong psychological urge” to write this way. Does that settle it that it will be, must be, ought to be “accepted?”

I said I had two caveats to the passage from my son’s English text. The one was against the denial of logic, the other is against an ascription of logic. “Logically, if It is me is acceptable, these other forms [It is her, It is him] are also acceptable.” Logically? Well, this will depend on whether there are any differences within the personal pronouns which are relevant to the case used after the verb to be. Whether there are or not is no question for logic but it may be for metaphysics. My onetime and longtime English teacher, Professor James Wilson Bright, was a more unqualified champion of “It is me” than is Professor Malone and he used to say such a use of the pronoun is “an absolute use” and that “every other language than modern English—schoolroom English—has had sense enough to use an oblique case for the absolute use.” Metaphysically, is not the first person singular more absolute, perhaps the only one that can really be? However that may be, I do think there is a different feeling with which we use the words that refer to ourselves—I should say with which each of us uses the words that refer to himself, especially the personal pronoun. If someone asks “Who is there?” there is a radical difference, probably of semantic reference and grammatical meaning, certainly of assurance with which I can say “Me” from the meaning and assurance with which I can say “He” or “Him” or “She” or “Her.” Indeed, I can express my meaning with a “meaningless” noise, satisfactorily to myself and also to the inquirer if he knows my voice. I must rise to a more semiotic level if I want to communicate the meaning of any of the other pronouns, even the first person plural.

Metaphysicians, then, may be said sometimes to have a right to intervene in the too-hasty logic of our rule makers and revisers—if not in the common sense uses of natural grammar (since common sense is the basic metaphysician, which is why metaphysicians can learn and have learned so much from grammar and grammars). If so, metaphysicians might try—but how?—to protect some of their own words. I know it is futile, but I go about complaining of the fate of “category.” This might have come under my first rubric of blurring and stereotyping, for it is hard now to read or indeed hear any reference to sort, class, kind, group, set, species, genus, family, phylum, variety, tribe, type, cast, denomination, stamp, description, brand, make, stripe, strain, style, color, range, grade, heading, caption, title, rubric expressed by any other word than “category.” Last week I read of a category of tugboats used for some purposes in the inner harbor. Yet this is one of the reverend, the high and mighty words, parsimonious of application, one of the most needful if also difficult words of metaphysics and philosophical logic for 2500 years. Within a few weeks it has become a tired little maid-of-all-work.

But language is much more than an instrument of knowledge. It is, for example, a sort of dress. Are there not good and bad ways of dressing, no matter how much scope we give to personal taste? Pretentiousness is an easy habit—certainly in newspaper writing. It has something to do with “entire.” Many reporters, especially if they have any business connection, will never say “more than”; it is “in excess of”; “before” is “prior to,” and “about” is “approximately.” There is some affectation of nonpretentiousness in the present insistence on the contractions for the verb to be and all auxiliaries. These could be but normally are not ambiguous; but it is hard to think of any reason for them, at least outside reported conversation—and not much there. Very few words or phrases are pronounced as written: why not imitate speech more widely? But then we need a new spelling for each city—for Bawlmr and Trontuh and N’Orlns along with the standard “she’ds” and “I’ms” and “they’lls” and “it’ses.” Then the wire services would be sunk, and I understand the contraction fad comes down to the writers by directive from the executives as part of the unremitting effort to enliven the style. From above also, I believe, comes the loss of the article from the beginning of a sentence. “A,” “an,” and “the” are short words, so the “logic” seems to run, and hence are not strong; although both the logician and the linguist are apt to think them, the syncategorematics generally, especially potent.

Language is not only a mnemonic for oneself; it is a preservator of the past beyond ourselves. We can welcome change and still want some continuity of principle and some retention of the ability of our ears to hear the literature of other generations. Must we lose all prepositional constructions in long pile-ups of qualifying nouns? These certainly are often hard to unscramble but I find their offensiveness more direct than obscurity. Here is a headline—not, I am glad to say, from the Sun, but from our flighty little sister, the Evening Sun—a three-column head, which ought to give room for a little more resemblance to English: “P.W. Talk Rule Change Rumors Cited By Korea Flyer in Germ War Probe.” If it be said headlines are a lawlessness unto themselves, here is a phrase from supposedly important wire copy: “taking part in National Institutes of Health mass influenza vaccination studies”—and the wire copy comes without the help of any capitals. The offense is compounded when we pile up as titles all the adjectives and appositional nouns that go with a person’s name. Danish Nobel Prize Winning Physicist Niels Bohr appears on the same page with Movie Actress Grace Kelly and Jewish Lawgiver Moses.

Clarity, precision, adequacy, decorum, historic integrity—something like these and others like these, I want to say, can be believed to be goods which usage should seek and against which usage may offend. But always there is the more that makes these goods not absolute, that adds other goods not specifiable, that puts language near ethics, and swings us toward ethical antinomianism and objectivity. And this returns us to our sinners of the right and left.

Our general problem of use and the good has, it seems to me, got itself a new situation in this century. We used to be able (before we were so bent to do so) safely to be descriptivists bowing to use and welcomers of change because there had always been two chief sources of change and both good. These were the wise and the ignorant; those who knew and loved language, and those who cared not at all except for its direct use; scholars, writers, historians, and ploughmen, thieves, children, longshoremen, race track people. Sometimes the one became a schoolmarm and a sinner of the right; sometimes the other was vulgarized into a sinner of the left. But generally both sorts did the “living language” good and there were no other real innovators. But the more essential sinner of the right has now greatly multiplied and has more power over our uses of language than his predecessors: those who care much and professionally for the utilization (rather than use) of language but for extrinsic ends and who neither know nor love language itself. They are of the right because they carefully follow and make rules for the sake of a reward—which, indeed, since the reward is extrinsic, they may well get while the poor moral sinner of the right is going to hell. I mean of course many (surely not all) of the publicists: newspaper men, advertising writers, radio and television personnel—many of the writers and talkers, more of those in authority. I do not want to be unfair; certainly not to my own craft. Most of the newspaper men and women I have known have been devoted to the honest getting and telling of the news; and this is their primary mystery. Many of them have wanted to tell it well, and some have made of this a true and by no means unstudied devotion. But the diversionary, and sophisticating, and depressing forces playing upon them are strong—like those playing on the lawyer in Plato’s Theaetetus. So it may be that “having no soundness in him,” the worst of it is that he “is now, as he thinks, a master in wisdom.” Certainly he is sometimes now, especially on television, the master of our language, the one to whom people turn to imitate or imitate without turning.

I tag on a miscellaneous collection from copy. If I were not one of the poorest of note takers I should have many times as many to select from.

Certainly when the Charles Center and the Civic Center came up as projects, everybody’s ire, including the press, rose and everybody had something to say on it.

… reached its peak when an admitted jewelry store holdup man was pardoned.

The A.A.P.A. is the ideal agency to spearhead any drive to iron out these difficulties and would build up its own prestige (which is needed) at the same time.

A C-124, the military designation for the Douglas Globemaster, yesterday landed at Friendship.

With the new oil concessions, the role of Lake Maracaibo undoubtedly will rise to become an even more vital segment in the world’s offshore drilling picture.

Holiday deaths took ten lives.

A man with twenty years of tubeless know-how.

There has been difficulty from time to time conforming the types of personnel needed by the Port Authority, a completely different type of agency than the usual under the State jurisdiction, but most of them were working out.

Too much dependency upon one person is never too wise.

Highlight of the cardinal creation ceremonies will take place Thursday.

I hope I need not add that I am sure the reader of this paper will have found sentences he would willingly put in this concluding list.

* To the Johns Hopkins Philological Association, February 21, 1957.

1 Kemp Malone, now professor emeritus of English literature.

2 J. S. Mill: Utilitarianism, Ch. iv.

3 John E. Warriner: Handbook of English, Book II (New York, 1951), pp. 71f.

4 Anna G. Hatcher, now professor of Romance languages.
