Begs the question

Now we get into my own pet peeve. I’ve been frustrated by this argument before. :slight_smile: Personally, I’m of the mind that a shift in speech based on incorrect spelling is illegitimate and evidence that people are, as a rule, insane. However, I recognize that it happens and that ranting about it is as useless and counterproductive as pissing against the wind, so I don’t let it get my hackles up any more. I just correct it when the situation calls for correcting and ignore it otherwise.

No! They’ll have to pry this dictionary from my cold, dead hands first. :smiley:

Dude, when did you get it into your head that language change is in any way ‘logical’? Yeah, there are certain patterns that are usually followed, and some changes are unlikely to ever happen, but the logic of language change is not the logic you wish it to be.

All people, barring the severely mentally disabled, can learn to use language quickly, often astoundingly so. It is an inherent aspect of humanness. Writing, however, was thought up only a few times in human history, and cultures took the idea from each other. You need specific schooling to learn how to read and write, but learning to speak is a matter of interacting with any other people. Language is not manufactured. Writing is. Your ignorance on all things linguistic is astounding.
**HazelNutCoffee**, on the other hand, has a point that academic writing is all about pedantic, prescriptive nonsense, so it’s important to know these types of rules. Outside of that very limited domain – not so much.

IANABosstone, but as I’ve said before on these boards, the basis of my own stance is that “correctness” is a flawed concept to use when discussing language. Language is a tool developed by humans for communication. Its purpose is to get ideas across. If a given use of language succeeds at communicating what was intended, then it has been used as “correctly” as it can be.

That’s not to say that certain uses of language aren’t preferable to others for various reasons. The specific words you choose can alter the way you, not just your message, are perceived by your audience. Also, certain phrasings might improve clarity, or efficiency, or make the statement accessible to a more diverse group of people. The descriptivist who denies these things is a fool. The driving point of my brand of descriptivism is that the optimal usage of language, inasmuch as it exists, depends on what you are trying to communicate, and to whom. There is no “right” or “wrong” in this; it’s just a matter of choosing the right tool (or words) for the job (getting your message across).

Pounding a nail into wood with a monkey wrench; using your fingernail to turn a screw; saying “the point is mute” to mean “the point is irrelevant”…all of these things are inefficient, and probably less than optimal, but in each case, it’ll get the job done. Lifting a 100-pound rock with an inch of sewing thread; using wet sand as brick mortar; saying “an apple” to refer to what you know your audience calls “a peach”…these just plain won’t work, and you’ll need to do something different to accomplish your intended goal.

Looking at it that way, the stereotypical prescriptivist is a person who says that you can’t use a butter knife to turn a screw, because everyone knows that’s what a screwdriver is for. The stereotypical descriptivist is a person who says he can make a window out of sheet rock, and who are you to tell him otherwise? Thing is, we’re not stereotypes, and nobody should hold to either of these clearly ridiculous extremes.

In short, can’t we all just…get along?

Just because I’m in a rambling mood, the phrases that stymie me in this case are “could of,” “would of,” “should of,” etc. Anyone with a modicum of self-awareness should realize that this is wrong and not what the word “of” is used for (while content words like nouns, adjectives, verbs, and idioms are easily mutable, function words are very, very stable; even linguists recognize that). But people hear the contraction “should’ve” and misunderstand it, writing it down as “should of.” It then gets picked back up, and a friend of mine has insisted that she’s heard the difference between “should’ve” and “should of” and can attest that people are indeed saying the latter. I find it absolutely insane, but at the same time I can’t see how it’s not hypocritical of me to split this sort of hair, so I’ve more or less come to terms with how the language is changing with the advent of mass text communication.

Check the time stamps on our posts. Your qualification came in right at the buzzer.

Seriously, how much do you actually know about linguistics? Semantics? Syntax? Morphology? What is the formal definition of a noun?

I got no hard cites here, but I’d bet 99.9% of people can’t answer these questions. They speak fully grammatical sentences every single day, but they can’t describe even the most basic structures that come out of their own mouths. And that 99.9% of people includes most primary and secondary English teachers. I would guess that you, HazelNutCoffee, could not, without checking a source, give an adequate definition of “noun” that would stand up to fifteen seconds of scrutiny from a trained linguist.

If there is a difference between biology and linguistics (besides the natural/social science angle), it’s that biology is much better understood. Creationists peddle their filth in their own unrespected rags. But the prescriptivist Safire was allowed to vomit obviously incorrect garbage about language on the pages of the New York Times.

If you follow both the creationist and prescriptivist threads, as I do, you’d notice that they resemble nothing so closely as a game of Whack-a-Mole. No matter how many times we shoot down stupid shit, someone new to the party decides that it’s the height of cleverness to repeat something about the Second Law of Thermodynamics (in biology threads) or to start writing nonstandardly (in these threads) as if that, by itself, has made the point for them.

With respect to both grammar and semantics, correctness is determined by usage. This is why Two and a Half Inches of Fun’s statement is so patently obtuse: with a 30-second check of a reputable online dictionary, we can find “irregardless” clearly labelled as nonstandard, which is another way of saying that it is not common usage. Another 30 seconds of minimal research would have revealed that “ain’t” is commonly used both in speech and writing in order to effect an informal style. This happens even in edited publications such as newspapers.

The particular sounds, words, and grammatical structures are arbitrary, but speech development itself is a biological fact, the same as breathing. Every healthy human being will learn to speak, regardless of their level of formal education.

Writing, in contrast, is artificial. Its conventions must be memorized by bored students and repeatedly reinforced by stern teachers with sharp sticks. And even after a decade of formal training, many students remain unable to create compositions within the bounds of even the most widely accepted of these conventions, whereas their speech will be an astoundingly precise reproduction of the patterns of the people around them.

This is how people learn language. Children especially are like little supersponges that soak up new meanings of words based entirely on context, and often on an extremely limited sample size. Make no mistake, lugging out a dictionary to be sure is the weird way of doing things, even in the age of literacy.

I am not arguing that it is good or right or logical that our language works this way. I am simply repeating that it works this way and that the best way to cope is to enjoy the aesthetic of change.

I agree with both of you, Voyager and Bosstone. :slight_smile: I just place “begs the question” in the same general category.

Roland:

All fair enough. But the trouble is that the new usage of “begs the question” makes it difficult for me to clearly communicate what I mean when I use the term in its original sense. There is no efficient way to restate it (“assumes the conclusion” is closest, but it’s definitionally incomplete and it sounds highfalutin). Unlike the new usage of “begs the question,” which is precisely equivalent to “raises the question” or, as I said, “raises the obvious question.”

For what it’s worth, I tend to say ‘raise the question’ myself, just because I know someone’s bound to go on a tear about it if I don’t.

So you language mavens are effecting some change, even if minor. :smiley:

Are you aware of what proportion of the population, even among educated adults, would read this sentence and be convinced that you’ve misspelled “affecting”? :slight_smile:

Jesus, I wasn’t aware we were in GD. I will fully admit that I am not a linguist and have only a shallow knowledge of linguistic theory. Yes, my definition of a noun stops at the “person, place, thing, or idea” definition. (FTR, I teach college-level English, which is more about composition and literature; grammar is only covered in the context of composition. Plus my degree is in literature, not linguistics. I took one linguistics class and was bored to tears.) I’m frankly not interested in the level of debate you’re trying to engage in here; I simply came in to express my annoyance at the nonstandard :rolleyes: usage of certain phrases. If this were a serious debate, I wouldn’t have even bothered to read the thread.

I’m not attacking the validity of your points; I’m simply not qualified enough to comment on them. So my post can be summed up thusly: :stuck_out_tongue:

How about the use of “individual”? I can (and just did) look it up in a dictionary and see that it should not be used in place of “a person,” but this usage is extremely common. Is it a problem because the dictionary says so? Or is it not a problem because it is easily understood and commonly used?

If not in contact with people using speech, will they learn to speak or just make noises?

It’s arguable that not being in contact with other people precludes calling a child healthy, but that’s hairsplitting. There is evidence that a child who does not come in contact with speech during their early years will not develop language.

That does not mean language must be taught, mind. A child will learn language just by hearing it spoken; they don’t need structured lessons to figure out what is and is not correct. Situations where a child never encounters speech are exceptionally rare.

Language is being taught to the child in informal lessons. When a mother says, “Do you want some milk?” and gives the child milk, she is teaching the child a lesson. Whether the lessons are structured or not is irrelevant.

How very semantic.

What problem do these two solutions solve? Namely, what is wrong with using “begging the question” instead of these?

Because, as I mentioned previously, it effectively forecloses my use of the phrase as it was originally meant, if I want to be understood. There is no corresponding equivalent for “beg the question” in its original sense.

Okay, I can grok that. I’m fully in support of a person defining their terms to others for clarity, and making arguments that a particular usage is preferable. You may be fighting the tide, to be sure, but you’ve every right to use language in the way you prefer, just like everyone else. The key concept here, though, is preference. Where I take issue with the typical prescriptivist stance is the insistence on applying the notion of correctness to it, as if their choice of how to use “begging the question” is somehow inherently superior to someone else’s in all situations, when in reality it’s all just a means of getting an idea across.

If I were speaking to you, I would use your preferred meaning; if I were talking to my friend Dave, I’d use the other, because I’m pretty sure he doesn’t know that your usage exists. It baffles me that someone would look at the latter situation and tell Dave and me that we were “incorrect”. We wanted to communicate and we did so, efficiently and clearly, with no need for added explanation.

In one-on-one conversations, it basically boils down to this: if you have a preferred usage for a certain phrase, and my alternate usage is causing confusion, let me know about it. I may switch usages for that conversation, or maybe we’ll discuss it and I’ll decide your usage better suits my broader purposes too. Using language in this way furthers communication and understanding. Hearing my usage and telling me I’m “wrong” because someone used it your way in a book from 1953 is just being contrarian for no good reason, and actively impedes communication and understanding. Why would anyone want language to do that?

Also, every time I see it, I think for a second: is there a circular argument here?

As I said, I can’t think of many sentences where anybody who actually understood petitio principii could get the two confused. Besides, wouldn’t the difference be obvious from context? If the sentence isn’t immediately followed by another, relevant question, then the chances that the informal usage was meant are pretty slim, no?

“Circular reasoning”. Which is handy, as it’s actually obvious what is meant by that phrase, as opposed to “begs the question”, whose literal parsing is most accurately reflected by how it’s used in informal practice :slight_smile: .

Failing that, there’s also petitio principii - seeing as we use ad hominem tu quoque and ex falso quodlibet in analysing arguments, what is wrong with the original Latin?

Aside: does anybody actually have any evidence that the informal usage of “begs the question” is derived from an improper understanding of Aristotle’s principle?

I do not really care how people speak to each other in conversations. I also do not really care about informal writing, including internet postings. Sure, sometimes something will annoy me, and I will overreact. But I do try to let things slide.

The real problem I have is when usage problems occur in publications that have editors. It is just lazy. The writer and the editors should be using a style guide to avoid incorrect and confusing usage. These writers are professionals and should be putting out a better product.