Badly designed surveys

I’m not talking about push surveys, which are specifically designed to get a particular result. I’m talking about an actual, legitimate survey that ends up misrepresenting its results because it’s designed to leave off certain responses.

I was thinking about this last night, after I got a call from a Marist College survey asking about government consolidation.

Many of the questions went like this:

Them: “Do you favor or oppose consolidation of police departments in your area?”
Me: “It would depend on the details: what is being consolidated and what the potential savings, if any, would be.”
Them: “But in general, do you favor or oppose?”
Me: “I can’t answer that question. It would depend on the details.”

She kept going through the questions, and on most I would say, “It depends.” I get the feeling that my opinion will be completely disregarded; it doesn’t fit neatly into the options on the checklist.

Eventually, they will release the survey, and it will be useless because of this. There should have been an “it depends on the situation” option, but that makes surveying difficult, and the results can’t be simply reported one way or another.

“Which did you admire more about George W. Bush, his unwavering faith or his hard-line stance on terrorism?”

I have only answered a couple of surveys without reaching a point where none of the answers apply or the question can be taken two ways. At that point I have to quit or give a wrong or random answer to continue, and I don’t care if I lie to push my agenda, because the poll is crap anyway. I no longer eat bacon twice a month, it’s every day.

A while ago I took the “Minnesota Multiphasic Personality Inventory” for a class.

I had the same problem answering many of the items. There were many statements that I couldn’t simply agree (or disagree) with, as my reaction would really depend on the circumstances.

Items like “I would rather go to a social event than stay at home to work on something” (I’m paraphrasing a bit here, but you get the idea).

What kind of social event is it? Who is holding it? Is my attendance expected? What am I working on? Is it for my job, or a hobby? Do I have a deadline? Is it the weekend, or a weekday?

It was quite a frustrating experience.

Heh… no snark intended, but if this is really your reaction, I would bet that you should be answering “stay at home and do work.”

I used to work for a polling company, and we had to design questions really carefully to avoid this. Getting too many “don’t know/don’t care/can’t say/other” responses does not please clients at all. Is consolidation of police departments a big issue in your area? If it’s not something most people are likely to have an opinion about, it’s not a good question. They should have broken it up into more detailed plans and done an agree/disagree on each of those.

Interestingly, there are some things you just can’t ask people if you want accurate answers. If people say they have seen a company advertised lately and you ask them where they saw the ads, a lot of them will claim to have seen ads where they couldn’t possibly have been - like on the side of a bus when the company only advertised on TV.

That depends. What kind of work is it?

:stuck_out_tongue:

When we were trying for baby 2 on a schedule, I did get calls to hurry up and finish at the office so we would get our “homework” done.

I became self-employed four years ago; I’ve been picked twice for government surveys on “use of IT in small/medium companies (PYMEs).” The survey from two years ago was a lot longer, but both had the same glaring design defect, noted by many of the respondents (I know three other people who got it two years ago, and all of us made a note in the final “other remarks” space).

There is no branching based on company size. Yes, there is a question that can be used to separate answers after the fact, but many of the questions are completely irrelevant for many small companies, or for companies in certain sectors.

One example (required and paraphrased from memory):
When your company receives an order, which of the following departments receive it without someone needing to forward it?

  1. Sales Y/N
  2. Production Y/N
  3. Finance Y/N

It doesn’t make sense for companies without a “production” department (for example retail stores, many of which are PYMEs; I’m sure many professionals such as accountants, architects, and lawyers would be surprised to hear they have a “production” department too); it doesn’t make much sense for me either… I mean, yeah, when I get a contract all of my departments get the memo at the same time, but that’s because they are me, myself, and I.

The one two years ago had several questions about whether and how the company used ERP software internally. Well, no, I don’t: I get paid to implement them, though; it’s my freaking primary line of work… so while I don’t “use” ERP software for my Head of Finance to go over reports with my CEO while the Head of R&D brings the coffee, I wouldn’t even exist as a business if that software hadn’t been written.

There was a question in both editions about “what would be required for you to use IT more?” What, more? Well, gee, I work in IT - isn’t that enough IT use for you? Sure, it’s nice when I get remote clients, but that’s not my choice - nor an available answer (there was no “Others, please list”).

There was also a clarification in the instructions that had those of us who work alone, or nearly so, scratching our heads in confusion and asking for clarification “in non-accountant terms, please.” In Spain, there is something called the “General Accounting Plan,” which defines which account numbers are used to reflect which items; thing is, many microcompanies don’t use an accountant (we’re not required to; in my case, accountants hear I may be billing abroad, shit their pants, and cower behind their chairs shaking and muttering “go away, go away…”). For us, “Please state your income (bills actually received plus pending bills)” makes sense - “Please state your inputs (701, 702 and 703 groups)” makes us go “Mommy?”

That’s interesting; I just received a phone survey about the election (I live in Canada), and it was pretty much what you were talking about.

When I read the title, I thought of something totally different. There’s a survey I take from time to time where the “Next” button is on the left instead of the right, which is really awkward. The only reason I can see for it is that it lines up with the radio buttons… but that’s not what you were asking.

What I hate is survey questions with black-and-white answers. On OKCupid, which is of course the pinnacle of well-written survey questions… being authored by random OKC users, I’ll regularly run across questions along the lines of “Have you ever had sex with someone the same gender as yourself? A) No, that’s just wrong; B) No, but I’d like to; C) Yes, and I enjoyed it.” Hmmm, how about a simple “Yes/No”?
Again though, this is OKCupid and the questions are written by the users.

I just took a Rate the Music survey featuring two songs by a “hot new artist.” The music itself was OK, but I hated the singer. At the end, I got to choose whether I’d buy the album, wanted to listen to more of it, or didn’t like it. OK, that was good. The last question was which of the songs I enjoyed more. Well, I didn’t enjoy EITHER of them! But that wasn’t an option.

This is pretty interesting.

Reading the article doesn’t give enough info to understand exactly what went on, but it’s unbelievable that more test marketing wasn’t done.

Anyone in business should be able to tell you the limitations of surveys and focus groups.

Basically, they took all of their high-margin merchandise out of heavily trafficked areas and stashed it away on side counters. That completely killed off nearly all impulse purchases.