Between you and I, what's up with the misuse of "I"?

You keep saying that you reject the ignorant prescriptivist tropes, but somehow you just can’t help repeating them. What else are you doing here but a weak parody implying that abandoning prescriptivism must mean that “anything goes” for pronouns?

Your subsequent comments about schooling and written language acquisition are simply moving the goalposts. The primary rules of oral language are acquired by implicit unconscious learning early in life. You can “object” to this all you want, but it will make no more difference than objecting to the theory of gravitation would make to the orbit of the moon.

Of course, when we learn written language at school the process is quite different: it is didactic and prescriptivist - the symbols that we use to write are explicitly articulated conventions that we learn to follow by imitation and practice. Later in school we learn the subtleties of literary appreciation, and develop our eloquence, our fluency, our style. These are subjective aesthetic matters - and no less important for that - but clearly qualitatively different from the fundamental rules of the structure of language that we learn implicitly and unconsciously as much younger children.

Bolding mine.

Perhaps we’re having a dispute partly over the level of meta- applied and what the heck “language” means.

How is your bolded assertion different from this modification of it:

How many Mommies are spending vast amounts of every day telling children “This is a ball. That’s a doggie. What’s this? What’s that?” And correcting the errors when a kid calls a dog a cat or a fork a spoon. And parents provide prescriptive feedback: “No dear, we say ‘I want juice.’ not ‘juice I want.’”

Naturally the process of 2-year-olds acquiring spoken language is not as stilted as that of 8-year-olds in 3rd grade English class. The younger ages and the less-skilled teachers guarantee that. As does the one-on-one experience.

I’m not suggesting you’re dead wrong. I’m just suggesting it’s not quite as clear-cut a distinction as you’ve made. This isn’t my area of expertise. I’m happy to be shown where I’ve missed the point.

The distinction is that every human (barring brain damage or other extraordinary conditions) automatically learns spoken language regardless of any level of formal explicit instruction in such. Humans have been using spoken language in this way since long before there was any such thing as schools or grammar teachers or whatever. As much as walking, eating, defecating, and whatever else, the use of spoken language is an instinctual ability humans naturally possess.

However, no human automatically learns written language without a great deal of explicit formal instruction in such. This is not a skill humans instinctually, naturally possess.

Spoken language and written language are far, far further apart, in their nature as human behaviors, than is often naively presumed in these discussions. The difference in the way they are learnt is like the difference between learning to walk or to run and learning to ride a bike or drive a car.

A child cannot possibly learn language by simple imitation, because there is a vast number of possible sentences to which the child will never be exposed. Instead, children instinctively acquire generalized rules about how to build valid sentences, refining those rules progressively as they mature. But parents do not teach these rules to their children in any explicit way. Parents themselves cannot explicitly articulate many of the rules of language that they themselves use to build sentences, and even if they could, young children cannot consciously understand grammatical terminology. Instead, parents (and others in the child’s environment) expose the child to many valid example sentences, from which the child unconsciously infers generalized rules. Of course, parents do give feedback when the child gets a rule slightly wrong, but more often than not the feedback is still by way of a specific correction and other valid example sentences that allow the child to update the rule.

The details of the process are a matter of widespread research and some controversy, but the basic principles are not in doubt. Chomsky’s insight was that there seems to be a paucity of information available to the child – the child learns “too fast” – which led him to propose that there is a certain amount of innate pre-programmed language ability, the notion of Universal Grammar.

By contrast, when we learn written language at school, the teacher explicitly articulates a rule – that a certain sound in oral language corresponds to a certain squiggle on the page, etc., and the child consciously absorbs the rule, and deliberately practices over and over again in order to remember it.

[ETA - obviously repeating some of Indistinguishable’s post that I hadn’t seen, but I’ll leave it here anyway.]

One of the reasons that learning a second language fluently is so difficult is that as we get older we lose the ability to absorb a language in the instinctive way a young child does.

Here’s an illustration I’ve used on these boards before. Let me give you some genuine rules of the English language:

The adverb “only” can be used to express an upper bound (“Only God can make a tree. No one else can.”) but not a lower bound. However, the adverb “alone” can be used to express both an upper bound (“God alone can make a tree. No one else can.”) and a lower bound (“The title alone makes Rebel Without a Cause worth watching, on top of which there is James Dean’s acting and the film’s historical importance.” Note that one couldn’t say “Only the title makes Rebel Without a Cause worth watching…” to mean the same thing.). “Just” can be used to express an upper bound (“Just twelve men have ever walked on the moon. No one else has.”) and also a lower bound, but only when appearing adjacent to the focus (one could say “Just the title would make Rebel Without a Cause worth watching…” but one couldn’t say “The title would just make Rebel Without a Cause worth watching…” to convey the same meaning).

Furthermore, like most restrictive focusing adverbs, “only” can be used in a noun phrase (pre-head, as in “Only God can make a tree”), adverbial phrase (“Only with great difficulty did he complete the race”), or verb phrase (“Things can only get better from here”). However, in the same role, “alone” can only be used in a noun phrase whose head is its focus, appearing after the head (“God alone can make a tree”, not “Alone God can make a tree”, or “With great difficulty alone did he complete the race” or “Things can get better alone from here”).

I sure as hell wasn’t taught any of these rules in English class (or by parental reminder or whatever). Hell, I wasn’t even explicitly aware of them till looking them up a few years ago in the Cambridge Grammar of the English Language (and it’s even possible I’ve made some errors transcribing them). Nonetheless, I’ve been unconsciously following them quite well ever since childhood. I’ve picked them up unthinkingly from the speech I’ve heard around me. That’s how (spoken) language works.

[As it happens, written language doesn’t work this way. (Speech is in our blood; writing is just a thing we’ve figured out). No one learns purely by osmosis the particular correspondence of shape-strings and sounds we’ve devised, nor even instinctually that there should be such a correspondence. It’s not even taking any position on descriptivism or prescriptivism to note this distinction; it’s just observing an empirical fact. The manner in which spoken and written language are learnt is completely different.]

Yes, those are really nice examples. Strict rules that we didn’t know we knew, but that we all now realize we follow rigorously, and that you don’t need any complicated analysis of sentence structure to articulate.

For the most part, I speak proper English, so I tend to write it as well. Plus I’m a voracious reader, so I see a lot of properly written things. But I still would be unlikely to quote chapter and verse from any sort of English grammar rules, because I was taught that rather indifferently in 7th grade, and haven’t revisited it since.

How do you think someone of lesser education who doesn’t read a lot is going to fare?

Then why did it take me three times as long to read, and I’m still not sure what it meant? If you had used grammatically-correct language, I would have understood perfectly after one quick reading.

Those are interesting examples and I get your point that we do have a natural sense of language that serves us well in some aspects of spoken language (and even spills over somewhat into the written form).

And yet even a simple word like “only” does not escape abuse. A sentence like “Only I like dancing at a party” can have at least four completely different meanings depending on where the word “only” is placed, and I’m not sure that the appropriate usage is necessarily obvious to all. And I’m pretty sure that the distinction between “He only went to the store” and “He went only to the store” isn’t something that is informed by some intrinsic sense of language because using the wrong variant is a common mistake.

I do agree, of course, that there’s a fundamental difference between spoken language skills and writing skills, and most of what I’ve been saying here is about the written language. [ETA: that fact may be the reason for some of the disagreements here, where we were just talking past each other – I have no issue with the premise of how we acquire basic spoken language skills.] The central point is well described in that article I linked: students graduate from high school, go to college, graduate from there, and finally go out into the world with their heads stuffed full of education and they still can’t write worth a damn. The skills that they’re lacking are not superficial, and they are a lot more than just a matter of “style” or conformance with “arbitrary” rules. They are, in fact, foundational to effective written communication.

I don’t claim that the answer is to memorize a bunch of grammar rules, because as suggested upthread practice and exposure to good writing may be more important, but some sort of process is needed, whatever it may be. The competing notion that I reject is that nothing is needed because whatever words someone puts to paper are just fine as long as they conform to basic S-V-O structure and they’re more or less comprehensible. It’s not fine, and if someone wants to call the missing skill set “style” then we simply have a semantic disagreement, and I would argue that in that case “style” is a crucially important aspect of the written language.

The matter of “I” and “me” spans both the spoken and the written language, providing an outstanding opportunity to blunder in both domains. If one wants to take a populist approach, and say that the majority of native English speakers and not prescriptivist peevers should be the determiners of what constitutes correct language, then I give you the NPR Grammar Hall of Shame audience survey. At the top of the list, #1 in annoying grammar mistakes that the audience would like to see eradicated from the universe forever, is the “I” vs. “me” debacle.

I found myself wondering what the world would look like if we did, in fact, manage to achieve grammatically accurate use of “I” vs. “me”, and was also perusing the venerable Geoff Pullum’s “Language Log” (although it’s predictable what the contrarian old Scotsman would have to say on the matter! ;)).

My concession to descriptivism is that it’s too late for some of those forms of usage to change. Granted, I find some of Pullum’s arguments unpersuasive. When he states unequivocally that “the copular verb takes accusative pronoun complements” he’s stating it like a law of nature or a broad consensus, which it plainly is not – he’s simply stating an opinion, allegedly bolstered by the fact that such things are “heard constantly in the conversation of people whose status as speakers of Standard English is clear”. But how many such people, and where is it written that “speakers of Standard English” are immune from the fault of hypercorrection?

But Pullum acknowledged that in the course of his work on the massive Cambridge Grammar of the English Language he changed his mind about many things, and so do we all. I’m particularly inspired by his comment that if someone knocks on your door, and when you ask who’s there they reply “It is I”, you shouldn’t let them in because this is probably someone you don’t want to know!

He probably has a point, and I think what we’re dealing with here is a matter of gradation rather than absolutes. I’m willing to concede that “It’s me” is so entrenched in the language that it’s become de facto standard. I think it’s the same with “He isn’t any better than me” mentioned upthread. I don’t know whether it’s because the errant pronoun is hiding in an adjectival phrase or if it’s something else, but regardless, it’s established. It sounds fine. This notwithstanding the obvious meaning that “He isn’t any better than I [am]”.

That’s it. Cases like that are a lost cause. But IMHO a construct like “between you and I” or my previous example “That doesn’t make sense to Jane and I” are so obviously wrong to so many speakers of standard English (see the NPR survey I linked above) that they should just be regarded as hypercorrection mistakes, even if Pullum thinks it’s perfectly fine.

Incidentally, Pullum, contrarian as always, also tried to make the case that “that” was perfectly permissible for a non-restrictive clause, citing a journalist who had used it that way. The interesting thing is that, just exactly as I said in discussing this issue earlier, he couldn’t figure out for sure what the sentence meant until he actually spoke with the author. (Substitute “which” for “that” to create one sentence, then put “that” back in but remove the comma – you have two different meanings.) He also acknowledges that the usage is as rare as an ivory-billed woodpecker, which is about as close as he will ever come to saying that something is “wrong” or at least “non-standard”. :slight_smile:

Are you honestly trying to misrepresent that I was saying that S-V-O word order is the only rule of syntax? You seem to be clutching at straws here. Obviously it was an example of one of many thousands of rules that we all instinctively acquire as children and use unconsciously and flawlessly without the help of prescriptivists - the true rules that actually allow us to communicate. You know (well, okay you don’t), the ones you never mention.

Despite what you say, you just can’t let go of the standard ignorant prescriptivist trope that if the non-rule that you so desperately want to exist does not exist, then no rules can possibly exist and linguistic anarchy follows. At the same time, you’re unwilling to back off to the point of expressing your opinion about usage the way you should have expressed it originally - as perfectly reasonable style advice - because for some reason you so desperately want the people who don’t speak the way you want them to speak to be objectively wrong.

As for the rest of what you’ve said here: well, it’s standard tedious prescriptivist stuff that people have been saying for centuries. The language is going to hell, this or that grammatical error is a lost cause, young people these days can’t communicate clearly, etc. etc. Old man yells at cloud.

Re Geoff Pullum - if you’re trying to make some kind of weak claim that a pre-eminent linguist is coming around to your misguided view of language, think again. There are no linguists that agree with you, because your view of language is entirely at odds with reality.

This is kind of an aside, but one instinctive rule in English (and languages in general) I’ve always found interesting is adjective order. I’ve never seen this prescriptively addressed (at least not ever in any of my English grammar classes in school), yet all native English speakers seem to agree that “big yellow taxi” is grammatical, or more “natural”, but not so much “yellow big taxi.” Or “big old yellow taxi” but not “yellow old big taxi.” Or “the beautiful big old yellow taxi” but not the “old big yellow beautiful taxi.” Here’s the order, for those interested.
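For illustration, that conventional ordering can be sketched as a simple sort over category slots. This is a toy sketch, not a real linguistic model: the category list follows the commonly cited opinion > size > age > shape > color > origin > material > purpose ordering, and the tiny lexicon and its category assignments are my own assumptions for the example.

```python
# Conventional prenominal adjective slots, outermost first (a common
# textbook ordering; real usage has exceptions this sketch ignores).
ORDER = ["opinion", "size", "age", "shape", "color",
         "origin", "material", "purpose"]

# Illustrative hand-made lexicon (an assumption, not a real dictionary).
LEXICON = {
    "beautiful": "opinion",
    "big": "size",
    "old": "age",
    "round": "shape",
    "yellow": "color",
    "French": "origin",
    "metal": "material",
    "racing": "purpose",
}

def natural_order(adjectives):
    """Sort adjectives into their conventional prenominal order."""
    return sorted(adjectives, key=lambda a: ORDER.index(LEXICON[a]))

print(natural_order(["yellow", "big", "old", "beautiful"]))
# -> ['beautiful', 'big', 'old', 'yellow']  ("the beautiful big old yellow taxi")
```

Of course, no native speaker consults a table like this; the point of the thread is precisely that we all apply something equivalent to it unconsciously.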

On the same topic, this popped up on Twitter recently:

As with the great examples that Indistinguishable gave above, it’s fascinating how rigid some of the unconscious consensus rules seem to be - in situations where there’s no obvious advantage to having such a strict rule. I wonder if this is a side-effect of the strong instinct toward rapidly acquiring the standard rules of syntax of the environmental language in childhood?

Prompted by the tweet I mention above, some extensive commentary today from Mark Liberman on adjective order:

http://languagelog.ldc.upenn.edu/nll/?p=27890

Actually, I was trying to make the opposite point – that I’ve partially come around to Pullum’s view on some of the “I/me” constructions, like “it’s me” and those in the pattern of the other one I cited. Pullum himself (in The Cambridge Grammar of the English Language) asserts different levels of acceptance for different forms of these expressions with pronoun case disagreements depending on how commonly used they are, though elsewhere he seems accepting of all of them.

Steven Pinker seems to take a position somewhere in the middle, arguing that the case of a pronoun in a coordinate structure can be independent of the case of the coordinate itself, so that “Jane and I” can be deemed grammatical even when used in the object case, but he calls such usage a hypercorrected solecism.

Not at all. I’m simply highlighting the inadequacy of that criterion alone, and I was thinking of an earlier position in which (unless I’m misinterpreting) you went so far as to excuse “would of” and “could of” as “subjective choice and a matter of style and appropriateness given social perceptions, but there is no correct or incorrect in the empirical sense (not even to ‘could of’ …”

That’s fine if the assumption is correct and this is some sort of in-group dialect, but it’s more likely the result of ignorance, the crucial difference being that it then gets propagated into more formal writing where it has no standing as a subjective choice or a matter of style. Hence the first article I linked, which incidentally is titled The Case for Teaching Grammar and is worth a read; another is this one from The Atlantic: Why American Students Can’t Write, or American Students Can’t Write… Because Schools Never Taught Them How, or Novelist teaches freshman writing, is shocked by students’ inability to construct basic sentences.

I appreciate that this is now far removed from the perfectly valid points you were making about how we learn spoken language, about linguistics, and about the ultimately empirical nature of language. But this isn’t just about peeved prescriptivists shouting at clouds, and the grammar sections of the online Oxford Dictionary and all its equivalents aren’t just there to cater to prescriptivist peevers as you claim.

So here you’re simply restating your rather bizarre misrepresentation that I think S-V-O is the only rule of syntax.

It’s certainly true that the variant “could of” arose as a mistake, because in many dialects “could have” is often orally contracted as “could’ve” and is then aurally almost indistinguishable from “could of”. It’s also true that in current usage there’s no dialect that I’m aware of where this variant would not be considered a marker of some degree of poor literacy, since the distinction is clear in writing.

However, it may be that in several decades the language will evolve, and “could of” will be considered a perfectly acceptable variant even in formal registers, who knows? It’s not a question of “excusing” the usage, but language just evolves, and we can’t stop it. And why is it a problem if this does happen?

The point is that the concept of an error in language only ever makes sense relative to current empirical usage. There is no absolute inviolable standard. By the standards of a couple of hundred years ago, everything that well-educated literate speakers say or write today would be considered to be littered with mistakes.

The thing that you fail to grasp is that the evolution of language is not, never has been, and never will be, a battle between “upholding standards” and “decay”. That’s simply not the way language works, and to think about language that way is just stultifying.

There’s an important social aspect to this, too. When I criticize your use of the words “wrong” and “error” when you express stylistic views on variants, or when you comment on variants that are stylistically appropriate in different registers or social contexts, it’s not just a question of a minor semantic quibble about what “wrong” means. Derleth’s excellent post #51 was on point, I’ll re-quote part of it:

When variants are associated with social class, or with non-dominant social groups, things can get remarkably ugly if we lose sight of the fact that linguistic standards are empirical and arbitrary, the idea that

[QUOTE=Max Weinreich]
A language is a dialect with an army and a navy.
[/QUOTE]

The treatment of Rachel Jeantel, a witness in the George Zimmerman trial, is a sobering tale.

http://languagelog.ldc.upenn.edu/nll/?p=5161

The sole basis for this appalling treatment was the fact that her native dialect is AAVE rather than “standard” English.

I reread your post #69 carefully and I see where the misunderstanding came from. I was describing why I felt the hypercorrection that substitutes “I” for “me” was wrong. Your issue was with the word “wrong” and you replied that only something that deviates from universal consensus on something like the S-V-O structure could be deemed “wrong”. I incorrectly took that to be a claim that S-V-O was the only thing that really mattered. I see that you weren’t actually saying that, so apologies for the misread.

Furthermore, I think when I use a word like “wrong” and you refute it as above, we’re talking about different things and talking at cross-purposes. I’m evaluating usage from the standpoint of consensus prescriptive grammar rules and guidelines like some of those I linked to, but as Steven Pinker has pointed out, words like “rules” and “grammar” mean completely different things in common parlance than they do in linguistics. When I use a word like “wrong” you should read it as meaning “non-standard with respect to widely accepted prescriptive rules of grammar” or even just “not preferred”.

Prescriptive rules are not silly or superfluous, nor are they particularly “arbitrary”, except for the bad ones. And their purpose isn’t to block the evolution of language or prevent some apocalyptic decay, but to protect and enhance its value here and now by promoting better and more consistent communication. Good prescriptive rules support that goal. Those that don’t should be discarded. The real problem that we have isn’t too many meddling prescriptivists, it’s English speakers (or more aptly, people attempting to write English) with poor levels of literacy. Have a look at some of the links I posted previously. I see prescriptive rules as well-intentioned if imperfect attempts to codify the principles of basic literacy – not fearsome strictures, but helpful guidelines.

BTW, in the interest of accuracy, I just read Pinker’s analysis of “I/me” usage in coordinates in its full context, and I think it would be misleading to claim that he really thought the substitution of “I” for “me” was a “hypercorrected solecism.” He used that term in passing, almost sarcastically, but it’s clear that he’s fully committed to the idea that the case of the pronouns in a coordinate can be independent of the case in which the coordinate is used. His logic is that if this is true for number (“Ellen” and “she” are both singular, but the coordinate “Ellen and she” is treated as plural) then it should also be true for nominative/objective case. He offers no opinion on preferred usage, though, but the prescriptive rules certainly do.

It was ugly, all right, and her language was unquestionably a big part of the reason for her treatment, but this isn’t a problem that linguistics is going to solve. To demean someone because of their dialect is exactly the same as demeaning someone because of their skin color or heritage. Bigotry arises from difference, from the concept of “not like us” as understood by the hateful narrow-minded. There will be bigotry as long as there are bigots.

Preach it! And pass the Cheez Doodles…

So this begs the question* - please explain the source for these “rules”, which you claim have authority that goes beyond stylistic or social preference to the extent that they can show syntactic variants in common usage to be objectively “wrong” (i.e. the rules are not empirical), yet are not arbitrary?

[*I dare you]