Transparency on Teacher Effectiveness

What do you think of this list from the St Pete Times that shows the results by school, teacher and subject on AP tests? I found it pretty interesting. These are two of the biggest counties in the Tampa Bay area and two of the largest school districts in the United States. Of course this only shows the results of AP tests, which are elective, but I find some of the results intriguing, especially when the same subject is taught by two different teachers at the same school.

Here’s a link directly to the complete article.

Here’s an example from the school my daughter would like to attend next year:

Teacher A - AP Calculus AB: 29 students tested, 10 passed, 34% pass rate
Teacher C - AP Calculus AB: 15 students tested, 15 passed, 100% pass rate

Now, it could be that Teacher C is better at presenting the material in a way that prepares students to pass the test. Or it could be that Teacher A encourages all of her students to take the test, while Teacher C coaches only those she feels are ready to take it.

A more extreme example:

Teacher D - AP World History: 38 students tested, 1 passed, 3% pass rate
Teacher K - AP World History: 49 students tested, 37 passed, 76% pass rate

In the above example, these are teachers at different schools. Teacher D works at an IB magnet school, and most, if not all, of the students in her class are likely to be IB students. They may not be as concerned with the AP tests as the kids at the other school, which is just a “regular” school. It’s also possible that students take the course in different years at the two schools.

Some questions to discuss:

  • Do you think this kind of information should be publicly available?
  • Do you think that any useful conclusions could be drawn from the data? Would it differ if it were a single point in time or tracked over a period of time?
  • If your child were a student in one of the schools, would it affect how you dealt with the school - for example, would you try to get your child into one of the more effective teachers’ classes?
  • Should there be merit pay for those teachers who produce the best results? Could that even work?
  • Other thoughts or comments?

Note to mods: feel free to move to GD if you feel that this fits better there.

Personally, I don’t think this should be publicly available unless it is listed without teachers’ names. Publishing it by name seems like it could push teachers toward behavior that raises real ethical questions. Knowing that their pass rates may be published could lead teachers to teach narrowly to the test rather than for actual knowledge. Even worse, it could tempt a teacher to provide copies of test answers to select students beforehand to inflate their pass rate, and that temptation would only grow if monetary gain were attached to pass rates. If the information is released at all, the most useful presentation would probably be an average of pass rates over several years, along with some additional context.

This may be because of my profession, but I feel it is important to look not just at pass/fail rates but also at outcomes. If Teacher A had 5 out of 25 pass the test, but 20 out of 25 graduate, attend a post-secondary option, and ultimately succeed, is that so bad? If Teacher B had 20 out of 27 pass the AP test (which isn’t unrealistic) but those students only learned test material and don’t actually retain the information, is it worth it? What happens when they need to apply that information later in life? The scores only tell part of the story of what goes on in the classroom.

Also, I am in a fortunate situation. My family has been involved with the school district here since 1972 in one capacity or another. My parents have both worked for the local schools, as does one of my sisters, and I work for an alternative education/vocational program in the same area that is not affiliated with the district. My son will be attending the same elementary, middle, and high schools that both my wife and I attended, in a small town. We already know most of the teachers he will have, and went to school with some of the newer ones who have started since we left. I still don’t feel we should go to the school and ask for him to be in a specific class based on any criteria other than extreme ones (illegal or unethical activity, or the like). That just seems like a form of helicopter parenting to me.

I think the biggest thing to remember is that classes should be challenging, and that getting A’s is not the most important part (at least not to me). My concern with my students is that they retain the information they are taught. A student who earns a solid B or C because they worked their butt off to learn everything they know in a subject seems like a better outcome than a student who got an A by completing a lot of easy, essentially meaningless assignments; grade inflation is a related but separate issue.

This also started a great discussion in my home, so thanks!

Brendon Small

One: There are merit programs out there. There has been one in Dallas for over 15 years, and AP scores have gone up considerably. At my school, the students get $100 for every math, science, or English test they pass. The teacher gets $150 per test, and I believe there is some sort of bonus for the principal. We used to get paid for logged tutoring time outside the school day.

Without a doubt, the incentive program helps attract better teachers. That said, I am not sure how much of a difference it makes in effort once those teachers are in place: I teach one class that pays an incentive (AP English Language) and one class that doesn’t (AP Macroeconomics). I don’t think I work any less hard at Economics because of this, though the grading load is certainly easier.

Because we have had this program in Dallas for so long, it seems to me it would be a perfect way to gather data: we have thousands of AP tests taken by thousands of students and incentives given for some but not all of them. It seems like you should be able to look nationally at the correlation rate between, say, AP English Language and AP US History scores and see if that rate is different in a system that gives an incentive for one but not the other. It seems to me that if there is a significant difference, that would strongly suggest that such programs have an impact.
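
To make that concrete, here is a rough sketch of the comparison I have in mind. The numbers are entirely made up (I don’t have the actual district data), the simulate_scores helper is hypothetical, and a real analysis would need significance tests and controls for which students take which courses; this just shows the mechanics of comparing the correlation between two subjects’ scores in an incentivized versus a non-incentivized setting.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_scores(n_students, incentive_boost=0.0):
    """Generate hypothetical paired AP scores (1-5 scale) for one district.

    Each student has an underlying ability; the incentivized subject gets an
    extra, ability-independent "effort" bump, which raises its scores and
    loosens its link to the non-incentivized subject. Purely illustrative.
    """
    ability = rng.normal(3.0, 1.0, n_students)
    effort = rng.normal(incentive_boost, 0.5, n_students) if incentive_boost else 0.0
    english = np.clip(np.rint(ability + effort + rng.normal(0, 0.6, n_students)), 1, 5)
    history = np.clip(np.rint(ability + rng.normal(0, 0.6, n_students)), 1, 5)
    return english, history

# District with an incentive on AP English Language but not AP US History.
eng_inc, hist_inc = simulate_scores(2000, incentive_boost=0.5)
# District with no incentive on either subject.
eng_no, hist_no = simulate_scores(2000)

r_inc = np.corrcoef(eng_inc, hist_inc)[0, 1]
r_no = np.corrcoef(eng_no, hist_no)[0, 1]

print(f"English-History correlation, incentive district:    {r_inc:.3f}")
print(f"English-History correlation, no-incentive district: {r_no:.3f}")
print(f"difference: {r_inc - r_no:+.3f}")
```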

As far as stats go, they are valuable up to a point, but we are talking about fairly small sample sizes and lots of variables. For example:

  1. Is an AP course an elective (like AP sciences, most places) or is it in lieu of another course that is a graduation requirement? The sort of kid who takes a science elective instead of early release or a blow-off class is very different than the kid who has to take English anyway, and so opts for the AP version.

  2. Does the school encourage borderline kids to take AP courses? I accept kids into my AP courses that I know, absolutely know, cannot pass the exam. For one thing, I am never going to tell a kid not to try, and for another, I know that earning a C in my class is better college prep than an A in a regular class. That holds for English because it is a spiral curriculum, so skills are always being reinforced. In math, though, doing well in pre-calculus is better prep for college than flailing about in calculus. For example, my school’s calculus teacher gets a lot of credit (and deserves it) for her 95-100% pass rate. However, I have many more students pass each year, even though my passing percentage is much lower.

  3. Schedules can create weird outliers. There may be three sections of AP US History, but only one that students who are also in AP BC Calculus, AP Chemistry, Pre-AP Physics, and on the newspaper staff can be in. So you end up with a “power section” of AP US History because the top ten kids in the class are all in there. If one teacher has that section and another has the other two, you end up with a disparity that may have nothing to do with the teachers.

  4. There is a lot of variation from year to year. I had a 75% pass rate on Macroeconomics last year, and 4 kids passed the *microeconomics* exam (with 4s or 5s) based on 4-5 study sessions we had after school. I’m not a bad teacher, by any means, but I also had a group of kids that loved economics. I honestly think 20% of them have gone on to be econ majors or minors, and another 20% are business students. This year, I have a terrible group. Their AP scores have been low all the way through high school, and if I have a 50% passing rate, I’ll be happy.