HERE is a small school in South Dakota. One thing I like about them is that they put out a “Placement Report” on their graduates 6 months after graduation. HERE and HERE are examples. They show:
Number of graduates
Percentage who responded to the survey
Percentage who got jobs
Percentage who got jobs in the field
Percentage who continued their education elsewhere or went into the military
Percentage who got jobs in state
Average salary 6 months after graduation.
Now, other schools will just have general statistics, like nationwide average salaries, but give few specifics.
Do you think other colleges should do this? How much influence would such a report have on prospective students or on the staff?
In this school’s case it must be pretty good, because they have been named the #1 tech school in the country. In 2015 Obama even gave the commencement address, which was because of their high graduation rate. Over 80% of students who start at the school stay and finish their programs on time. The school has grown every year of its 15-year history.
My experience with tech schools through my son is that this is normal, although you sometimes need to dig or ask for the information.
For liberal arts colleges it tends to be different, since “getting a job in your field” is not necessarily the end goal for someone with a B.A. in Media Studies. It tends to be more “has a job” or “is in grad school” - but those are published at all the schools we looked at for my daughter.
(Liberal arts colleges were not designed to “get you a job” - they were designed to make you into a well-rounded human being. Some still embrace that idea and really don’t want to be a four-year trade school.)
It sounds like useful information, and if I were in the market for a college, seeing good numbers in a report like that would certainly incline me favorably to the school. But I’m not sure it would make sense to mandate such reporting, if that’s what you’re asking. It sounds like a task better suited for the free market.
Yeah, for a technical institute, whose focus is job training, it makes sense. For other kinds of institutions, not as much (or at least, the survey might have to be different to be meaningful).
35 years ago, when I was getting packets of information from colleges all over the place, many of them (particularly the colleges focused on engineering) included this kind of information. I recall Rose-Hulman in particular advertised their placement rate (and apparently still does: 97 Percent Placement Rate Affirms Return on Investment | Rose-Hulman).
Such a report would be pretty useless across a large university: as mentioned, it’s hard to determine what a “job in your field” even is (I mean, if you were an English major and you get a job in a bank or with a financial services company, is that “in your field”? Because superficially, it’s not, but in reality that’s the sort of entry-level job that a liberal arts degree can get you, and which can serve as a gateway to a pretty solid career). Beyond that, however, is the fact that not all programs in a university are equal: for example, med school placement is kinda its own thing: there are schools that are very good at getting kids into med school, way out of proportion to the rest of their school. Honors programs also often have extraordinarily effective placement records, because top companies recruit through them. But that doesn’t say much about students not in the honors program.
There’s also a really basic fact that in any school, students who work with/through Career Services often are dramatically more successful after college–this starts freshman year, when you use Career placement to find internships that give you some sort of specialized skill set and something to talk about at interviews. Not nearly enough students take advantage of Career Services.
Finally, graduation rate is probably more important than anything, and that’s certainly publicly available knowledge that, in my experience, very few people look at. For example, here in Dallas we have two UT regionals–UT Arlington and UT Dallas. UT Arlington has an 18% 4-year graduation rate and a 42% 6-year graduation rate. UT Dallas has a 51% 4-year graduation rate and a 70% 6-year rate–despite a much higher percentage of tech degrees, which require more hours. This is a huge difference and it’s public information–but I bet not 1 student in 100 on either campus knows this. People don’t think stats apply to them, so they don’t care.
Statistics have to be put into some kind of perspective. The OP noted the school gives the number of graduates. But what percentage of the original class went on to graduate? Of the rest, what percentage managed to get a job in their field before graduation, what percentage shifted to another field, and what percentage just quit or failed?
And my son is a weird sort of reverse example. He took a seven year break between getting his B.S. and going to graduate school. Even though he was listed as “completing” his degree and getting a job in his field, he always considered his status to be temporary until he went back to school.
I’m a university statistician whose job is to prepare these kinds of reports. Two reasons why this is not a good idea.
The first is just that it’s impossible. Universities have no way of compelling a graduate to tell us where they are working, if they are working, what kind of work they do or anything else, and it’s really none of our business. We do send out surveys, and we get maybe a quarter back if we’re lucky. It’s really impossible to guess who the people not returning the surveys are and whether they are more or less likely to be employed in the field of their choosing.
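To put rough numbers on the non-response problem, here is a minimal sketch with entirely made-up figures (the class size, response rate, and employment counts below are hypothetical, just to illustrate the bounds):

```python
# Hypothetical illustration of how non-response can distort a placement rate.
graduates = 400
respondents = 100          # assume only 25% return the survey
employed_respondents = 90  # 90% of those who answered say they work in-field

# The rate a glossy report would show: employed among respondents only.
reported_rate = employed_respondents / respondents

# Bounds on the true rate, depending on what the 300 silent graduates are doing.
worst_case = employed_respondents / graduates                               # no non-respondent works in-field
best_case = (employed_respondents + (graduates - respondents)) / graduates  # every non-respondent works in-field

print(f"Reported placement rate: {reported_rate:.0%}")  # 90%
print(f"True rate could be anywhere between {worst_case:.0%} and {best_case:.0%}")
```

With a 25% response rate, the headline number tells you almost nothing unless the non-respondents happen to look just like the respondents, and there is no way to verify that.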
The second is that it’s just one more metric in the recruitment wars. Here’s the dirty not-so-secret about universities. Largely, the quality of education between them is more a matter of degrees than any huge difference. We really know this. There are some resource differences, but if you look within a Carnegie Classification, the differences are small. What happened in the 80s is that US News (a conservative magazine and lover of the free market) decided that it could quantify these minute differences so that the consumer could be better informed. So what they did was take things like graduation rate and selectivity and said, “That’s how we stick numbers to the problem.” Unfortunately, what this meant was that schools that were highly selective got shuffled to the ‘best’ category while schools that were serving, say, disadvantaged populations or minorities became ‘low-tiered’ for obvious reasons. If you recruit students who had 4.0 GPAs in high school, they tend to do really well, and the quality of education you are providing has little if anything to do with it. If you recruit, say, Hispanic, first-generation students who struggled in high school, they tend to do a lot worse, even if you’re the best freaking educators in the country.
In practice it really meant that states with large populations have ‘better’ schools simply because there’s a bigger pool of students to pick from, so they can shunt low-income students, who typically have worse schools, onto their branch campuses. It also meant that in order to get ‘better’ you have to continually up your selectivity. There are two ways to do this: find a bigger pool of applicants or take in a smaller freshman class. Smaller classes mean smaller budgets, which ain’t happening, so the alternative is a bigger pool of applicants. The problem is that all of the schools are similarly trying to get ‘better’ and there are only so many high school students in the country. It thus became a war to attract the best ones. How do you do that? Facilities and amenities requiring larger staffs and capital outlays. Where does the money for these things come from? Tuition. So you hike tuition to fight these recruitment wars, all in the name of ‘serving the students better.’ You end up in a situation where students are paying way, way more for school, but not really being much better educated, if at all. Now, to be fair, they do have some serious amenities. Going to a big state school is basically like living at a resort for four or five years. You have people catering to your every need, but you’re paying for it.
Thank you for sharing that perspective from the inside. It validates and reinforces criticisms I have heard about higher ed from others. The “resort” aspect is a particular bugaboo of some of the people I talk to. The other point you make about the marginal difference in education is one of my talking points when discussing school choices with young people. I am always trying to make the point that, unless you are looking for a specific program that is not available at any in-state school, the additional cost of going to an out-of-state school is not reflected in any discernible difference in education. But young people often don’t make choices based on pure reason, and if they can swing the out-of-state costs with their parents’ money, who am I to discourage them?
That type of info is not unusual at universities that offer accredited professional programs (BS degrees in our case). In my department we have two such programs, and we are required by the accreditation agencies to post graduation rates, 6 mos post-graduation employment data, and licensure exam pass rates.
Since these programs lead to specific jobs that require certification, knowing whether a program is successfully graduating students who can pass the exam and be hired is important.
I’d also add that often LATI only has 12-20 or so students to look at per program, so keeping track of them is a lot easier than what you are dealing with.
Exactly. I think a program whose specific purpose is to get its graduates a particular type of job should provide information on how successfully it’s fulfilling that purpose.
But the utility of such information becomes more doubtful in the case of a general university/liberal-arts education whose graduates aren’t necessarily trying to get placed into one particular type of job. In that case, it’s more useful to ask people in the program(s) you’re interested in what their students generally do for internships, summer jobs, and jobs post-graduation, whether they have a strong alumni network, etc.
The vast majority of all graduates with a bachelor’s degree go on to some kind of gainful employment. Whether or not they are ultimately successful and happy in their careers is not something that can really be measured with a placement report.
It also makes you think about what a school would be like without the “resort” luxuries you talk about.
Back to LATI as an example: it has NO gym, dorms, sports, clubs, or any of the other activities or facilities one would associate with a “college”. Every time something like that is added to a college, the school’s operational costs go up, along with the tuition and student fees to pay for them. But are they truly needed? I moved out of the dorms and spent little time on campus after my second year of college, so I never used any of the on-campus stuff like the workout areas. Plus, nowadays many students are adult, distance, or even online students who don’t care about a fancy dining hall or activity center.
I know LATI brings in the students. They say soon almost a quarter of their student body will be from outside South Dakota, and frankly it’s better than similar schools here in Kansas.
If a school would just focus on the basics and on getting students through their programs in the least amount of time, while still setting them up for future success, we would see college costs plummet.
Well, AFAICT, it isn’t a college: it’s more a vocational training institute which operates largely online. It provides mostly certification and diploma programs in various technologies and skilled trades, which usually take 9-24 months to complete and involve a significant amount of online coursework, with campus visits as needed for specific hands-on training.
Which AFAICT is succeeding admirably at what it does, but which is not the same thing as a four-year residential college. A lot of students go to college partly for the student life experience, which includes gym, dorms, sports, clubs and so on.
I agree that those students shouldn’t be applying to four-year residential colleges if they’re not interested in the four-year residential college experience.
I’m not convinced that that’s really the primary issue. For example, residential college costs were much lower in real terms back before the 1990s, even though the vast majority of students lived on-campus and there was no such thing as “distance learning”.
There’s an argument to be made that colleges’ residential amenities are getting needlessly fancy, but I haven’t seen any evidence that that’s the primary driver in skyrocketing college costs. One of the biggest factors in the case of state colleges and universities, AFAICT, is shrinking support in state and local budgets. Administration bloat, with administrative staffs doubling or more in size over the past few decades, also plays a significant role.
Yes, with very large systems it’s difficult to get meaningful data, as senoy explains. That said, all schools should have outcomes data available, especially public schools.
Nevertheless, you should be skeptical of placement reports that private schools issue. They often are misleading or outright false, and can lead to sanctions if the school is drawing upon federal financial aid. (From a cursory view of the web page, the school in the OP seems on the up-and-up.)
The largest system of higher education in the nation (California Community Colleges, which, in addition to AA degrees, offer CTE training, like this school in the OP), is approaching the same issue from a different angle. The funding allocated to separate districts is going to be based in large part on student outcomes, including employment (or increase in wages) after attending. The data will come from the California Employment Development Department, tracked by Social Security numbers, and will be publicly available (without individual identifiers, of course).
Private schools can say whatever they want, but there’s really no accountability unless they get investigated for financial aid fraud.
Your choice of “bad career fields” appears to be a little out of date. Although back around 2012 there were lots of pop-press articles like this one bemoaning the high rate of unemployment among architecture graduates, the trends have shifted since then, as described in this pop-press article from July 2017:
Which is just one of many reasons why it’s kind of a dumb-ass move to try to base the assessment of general degree programs on the short-term employment rates of their graduates. Economies change in the course of a few years, and what looked like a solid career field to a high-school junior looking at colleges may be very tough to get a job in by the time they get their bachelor’s degree. Or vice versa.
Hrm, the information I would be interested in isn’t tracked by the California Community Colleges…which is good, because if it were, programs like a History Education Ph.D. would be gone.
The “attainment of regional living wage” criterion is a nice bar, but it doesn’t seem to capture the placement of graduates based on their education. Outside of the coastal tech towns, a typical garbage man would, by that definition, meet the standard.
It would be nice to see how many people with Masters degrees in Architecture work in the field and aren’t struggling to pay off their loans.
I am not anti-free-choice in degrees, but I want numbers to help advocate for cost controls and a return to forms of education funding that don’t accumulate student debt.
That said, I am in a field where you could get a job or go to school and the experience is basically 1:1, mostly because it is possible to eke through a degree without learning the concepts, which are the real value a degree should provide. In my field, degrees often just mean you have more debt (although non-field degrees like math or physics are typically of higher value).
While, as stated above, there is value in a liberal arts education, I do wish there were numbers on how many of those undergrads finish their 5600 hours of intern work before taking their ARE.
The system is highly biased in favor of making schools money and I think more students would choose to do liberal arts and not a profession had they been given the information by their counselors.
People should be free to follow a fine arts track and not take the architecture track just because they think it will be more marketable, for example. I want to hire people who have a passion for learning and diverse skills to bring to the table, and I feel like the current system incentivizes degrees that provide less “additional value” than promised.