Interesting thread. I am a lifetime certified teacher in grades PK-6 in the state of Texas, and currently a doctoral student in education. While pedagogy isn’t my main focus, it does touch on much of what I do.
A previous poster explained quite well why certification is needed in K-12 teaching. Kids tend to follow a fairly regular pattern of intellectual development. As was said, knowing that 2+2 equals 4 will not necessarily make you a good teacher. You have to understand how children learn at a certain age in order to get that information - and the method by which you arrived at it - across. As a fourth grade teacher, I learned in child development classes that my least favorite pedagogical method - rote memorization - actually had significant value for kids in that age group. Remember learning the times tables without even knowing what was happening arithmetically?
But there is a fine question here - are we assuming that after age 18, students stop developing as learners? In higher education, typically, yes. My university has a teaching center that offers tons of resources for profs and teaching fellows, but no one is required to attend any of the sessions. In fact, I'd say it's the rare prof who has the time, interest, or self-critical ability to voluntarily go to a session to learn more about teaching.
Most universities need research dollars to stay in business. To exist predominantly as a teaching institution, you need tons of endowment money and will have to charge high tuition, because good teaching doesn't pay the way patents do. Some schools do manage this - small liberal arts colleges like Bennington charge a premium for great teaching, and can sustain it because graduates enjoy the experience and give money after they graduate to perpetuate the process. Most schools can't do this, though, and need grants from industry and the government to make money - hence, the number of schools that can honestly say they prioritize teaching over research is very small.
There’s also the feeling that in higher education, the output is new knowledge - research. That’s what academia produces. If students are learning the exact same content as they did a generation before, it can be argued that the institution isn’t really doing much new. So occasionally a scholar will pen a revised perspective on Shakespeare’s life, or study his works from a critical discourse perspective… this keeps the knowledge fresh and appealing.
I help teach a course on the economics of higher education. One of the big questions we tackle in the course is signaling versus productivity. Take, for instance, Stanford and Xavier University in New Orleans. Stanford's admissions standards are among the most selective in the country - high test scores, stellar grades, and activities are the price of admission. Xavier is a historically Black college with a modest endowment, but it looks for students who demonstrate initiative and drive - even if their preparation is lacking due to weaker academic standards at their high schools and the like. Yet in four years, Xavier places more African American students in medical school than Stanford - as well as Harvard, Johns Hopkins, and a number of other highly ranked institutions combined. (This fact is chronicled in Ellis Cose's Colorblind, as well as in a number of studies conducted by the top medical schools in the nation.) At Xavier, there seems to be a strong tendency toward productivity - in other words, students are really learning something and can demonstrate that learning in assessments like grades, MCATs, and interviews. At Stanford, it's possible that the students who come into the institution aren't learning as much - but the fact of the matter is that Stanford graduates, with the institution's prestigious reputation and strong alumni network, will probably end up doing just fine after college. The institution sends a strong signal about the quality and capabilities of its students.
This is of course a sweeping generalization - there are Stanford students who grow in leaps and bounds intellectually, and there are Xavier students who don't get into medical school. But it makes you think about what the purpose of college is, and what students are actually paying for. I say that because the common perception of college as a service - like a meal in a restaurant, where the price you pay covers the preparation and serving of your dinner - is inaccurate. In fact, virtually all colleges subsidize the cost of attendance through tuition discounts before students even see the bill. This is especially true at state institutions, where legislation may cap tuition and colleges have to find the cash to cover their students' costs. Part of the cost is often passed along in tuition hikes, but the majority comes out of state coffers (paid for in part, of course, through the tax dollars of state residents who never attend college).
All of this is to say that the prioritization of research over teaching is a reality in higher education unlikely to change - Stanford was among the first institutions to realize that research labs were lucrative, prestige-granting, and one of the few ways to attract the best and brightest to academic life (see Rebecca S. Lowen, "Transforming the University: Administrators, Physicists, and Industrial and Federal Patronage at Stanford, 1935-49," History of Education Quarterly 31 (Autumn 1991): 365-388). Higher education is indeed a business, and a competitive one at that - to gain prestige, you have to play along, or else stake out a position so contrary to the majority of institutions that you may attract a handful of students at the cost of alienating most of your potential customers…