Should they be required to have a bachelor’s degree, with a well-rounded education including subjects unrelated to programming? Maybe not. Is there value in such an education? Probably so. I can think of ways some background in physiology might wind up benefiting a programmer, either in his personal or his professional life.
And so the jizzer becomes the jizzee. It’s the circle-jerk of life, or something.
Ummm… judging from your other college thread, you’re a teacher, Olbaid3?
What level? If it’s undergrad, you might be giving too much content in a non-conversational format for anyone to digest it all.
It’s what economists call a cost disease (the Baumol effect; a toy sketch of the mechanism follows the list below). As far as what can be done, colleges could address it by doing the following:
- Embrace Technology. Some colleges are doing this, but far too few are using technology in ways that reduce labor costs and/or add value to their product. For example, online courses can greatly reduce costs while providing similar value.
- Embrace (More) Cost-Shifting. Colleges are not only more expensive because of cost diseases; they also DO far more, and pick up the slack for a number of other institutions and industries. At some point in the recent past, we decided that the vast majority of research and job training should occur on college campuses. That choice comes with added costs. There is no reason why, if Exxon wants more competent geologists, it shouldn’t contribute to the costs of training them. Or, if Google needs better software engineers, it should help universities train them to its specifications. Colleges can then use the money and guidance to incorporate more real-world training into their curricula.
- More Cooperation. Right now, competition in a number of educational arenas is what is increasing costs. One school gets a new stadium, then everyone else has to have one. One school gets a new hospital, then others in the area do. Universities, especially those in the same area, need to cooperate more often in non-competitive areas, and in areas of innovation. They generally do so when they have to in the latter, but not the former. For example, I live in the DC area. There are six 4-year, not-for-profit, comprehensive liberal arts schools in DC proper. The fact that on each campus, students are taking basic intro courses taught by different people is more expensive than it should be. These basic courses could all be taught in one place, or by fewer people, at a lower cost, if the universities decided to cooperate more often.
- More Specialization. Universities should specialize more often, and limit the scope of their goals. It’s better to have 10 schools each focusing on one or two broad areas (e.g., science, tech, teaching, economics) than 10 schools trying to do everything.
- Expand Successful Brands. This is another thing colleges are only starting to catch on to. For example, NYU now has academic centers on every continent, and another full university in the UAE. There is no reason why good schools shouldn’t expand this way, except that they have not adjusted for the times. The population has expanded far faster than the spots at elite schools have.
- Use Price Discrimination. At most schools, tuition does not cover costs. Furthermore, the sticker price is rarely what students pay. That said, there is a small but fairly numerous group who can afford to pay more, just as airlines charge more based on how early you commit to buying. Responsiveness to students’ price sensitivity, as it currently exists, only lowers revenue for universities, since they only ever discount. The price should be what the market will bear, with some consideration for lower-income individuals.
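To make the cost-disease mechanism above concrete, here is a toy back-of-the-envelope sketch in Python (all of the numbers are invented for illustration, not taken from any real school):

```python
# Toy Baumol-effect illustration: a professor teaches the same number of
# students every year (flat productivity), but faculty pay has to roughly
# track economy-wide wage growth.
salary = 50_000          # starting salary in dollars (assumed)
students_per_year = 100  # unchanged over the whole period (assumed)
wage_growth = 0.02       # 2% annual economy-wide wage growth (assumed)

for year in (0, 10, 20, 30, 40):
    pay = salary * (1 + wage_growth) ** year
    print(f"year {year:2d}: cost per student = ${pay / students_per_year:,.2f}")

# Cost per student roughly doubles over 40 years even though the "output"
# (students taught) never changes -- that is the cost disease.
```

The point is that tuition can climb even with zero waste: as long as teaching productivity stays flat while wages elsewhere grow, per-student cost rises on autopilot, which is why the responses above focus on productivity (technology), cost-shifting, and pooling.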
I wasn’t saying that there haven’t been new languages - just that going into computer science with the goal of developing a new one is not a great career path. I’m not in the ACM any more, but I can’t remember the last time there was a paper, let alone a special issue of Computer, on languages. I’d actually read that.
In engineering there is some support of universities by companies, both directly and through consortia like the SRC. If that were encouraged more, perhaps by tax breaks, it would naturally support those faculty members for whom there is salary competition.
Heh. At one point we had a lex/yacc program called JTran, which converted English text into Jive.
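I don’t have the original lex/yacc source, but the idea is simple enough for a rough Python sketch (the vocabulary below is made up for illustration; a real lex version would express each substitution as a pattern/action rule):

```python
import re

# A few illustrative English -> Jive rewrite rules. These are invented
# examples, not JTran's actual rules.
RULES = [
    (r"\bhello\b", "hey, man"),
    (r"\bfriend\b", "homey"),
    (r"\bvery\b", "real"),
    (r"\bcomputer\b", "box"),
]

def to_jive(text: str) -> str:
    """Apply each rewrite rule in order, case-insensitively."""
    for pattern, replacement in RULES:
        text = re.sub(pattern, replacement, text, flags=re.IGNORECASE)
    return text

print(to_jive("Hello, my friend, that is a very nice computer."))
# -> "hey, man, my homey, that is a real nice box."
```

lex buys you one generated scanner instead of repeated regex passes over the text, but the substitution idea is the same.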
We can save this thread yet!
MIT is putting its stuff online. Lots of for-profit colleges are heavily online. Whether this method works remains to be seen. It can probably impart information just fine, but I’m not sure about the other aspects of an education. If we are preparing people for jobs above that of drones, there is a lot of interaction and teamwork, something which can’t be found in an online environment.
If I’m not mistaken, in Japan companies assume they must teach new hires everything. Here we are going in the opposite direction, as companies want to hire people who can start immediately, and consider college an ivory tower for not teaching the application of the day.
Unless Google can indenture the students whose education it is supporting, why support training for kids who will work for its rivals? In my area there is a lot of university support at the graduate level, with the partial expectation that the student will join the company, but it often does not happen.
We also must worry about the long term. The more theoretical classes I took in college and grad school - like programming linguistics, graph theory and complexity - have been a lot more useful 40 years later than IBM 360 Assembler, PL/1 and Multics. The call for real world training is sometimes an excuse to let companies hire new grads with the latest knowledge and toss out old grads whose knowledge is now obsolete.
Also, real-world training can be expensive, in time and money. In my area a lot of research requires circuits, and I’ve been very active in providing some real-world benchmarks for universities. However, most papers are still using benchmarks from 23 years ago, which were fairly simple even back then. A lot of that is because most students just don’t have the time to get into an open-source microprocessor design and do anything interesting. That part can be handled for free, more or less; if you try to implement things, it gets really expensive. I built logic circuits out of some small components and wires - now you need FPGA development systems and test stations.
Good idea, but one which might be only accomplished by funding for cooperation and no funding for going it alone.
This happens to some extent, since some universities are a lot stronger in some departments than others, but you don’t want too much of it or else you might lose synergy. My daughter’s research is somewhere in the mix of psychology, economics and marketing. Not having a business school at her university would be very limiting.
Actually, they’re catching on quite quickly. There is a CMU software engineering institute in Silicon Valley. MIT is doing its online courses specifically so as not to open lots of satellite locations, which might cheapen the brand. So while colleges have monetary incentives to do this, it can be dangerous. University of California schools have satellite campuses also, but they make the distinction quite clear between taking a class there and really attending.
Universities are big on the list of industries to be disrupted. As Stanford, MIT and others put their intellectual property (lectures) online for free, it changes the value proposition of what the school can provide. What is the difference between a 200-person lecture hall and a recorded video, after all? At what point do we require X hours of videos with tests to qualify for attendance and accreditation to take a small seminar with extensive interaction with faculty?
If I go and watch 5 Stanford videos in computer science and submit some programs, can I finish my CS degree? (I stopped halfway.) Can I watch 2 videos and get my Econ bachelor’s? I only had 1.5 classes left to double major. Why not?
Well, MIT started the open courseware movement in 2002. Only recently have you seen others putting a decent effort into it. The problem is still that there are too many brands, platforms, standards, and participants. There is also the problem that you don’t get actual credit for completing the work.
Well, typically you would have a system where you have disincentives for free riders. So the big companies would hopefully contribute equally. The system I am imagining would allow Google, for example, to sponsor interns who would work for them part-time, while receiving credit, and college instruction that relates to the work they do for the company.
True. I imagine courses would be closer to the former than the latter.
Forgive me as I am not particularly conversant or knowledgeable about circuitry, but what kinds of costs are you talking about? That said, my goal would be that where colleges are now (largely) training people for technical fields, in the future, companies would take more of a role in assisting financially and materially.
For example, law school is usually 3 years. Summer associates from elite schools are paid a good sum of money by top firms to train. The summer associates do very little real work; they just get exposure to real work and the corporate culture. The firms pay a lot as a matter of prestige, and to curry favor with the students once they graduate. This informal system exists already. Why not have top firms expose a larger number of people to real work so that they actually gain skills, rather than run this informal contest of egos and reputations that only serves to subsidize students at elite schools? In the long run, a well-formed system would save everyone money. If graduates came out of school competent in doc review, the rules of evidence, or how to research, everyone would be better off.
The other important part is that sponsorship brings with it awareness of one’s company. In many ways, sponsorships are awareness advertising campaigns. That has its own benefits, assuming it doesn’t become commercialized.
True. In most major cities with good public transport, this could be done if the universities were so inclined, with little extra money.
Good point.
Well, that’s part of the problem in my opinion. The value of the brand should be based on the value added, not exclusivity. For too long in higher ed, exclusivity has determined how we view quality. It’s just a dumb way to do things, and leads to worse outcomes as it provides little pressure to improve beyond what is necessary to maintain the illusion of quality.
I have a question:
WHY IN THE NAME OF GOD IS THIS THREAD IN THE PIT???!!!???
Wow, all those pretty letters on that pretty wall. I wish there were colors.
The Semiconductor Research Corporation was an attempt to do this. A consortium of semiconductor companies jointly sponsored research and hoped to get early access to results and to students. Plus, I know of several PhD students who are spending lots of time working at companies as they finish, in no small part because it gives them access to information they can’t get at a university. But this isn’t going to help the undergrad problem, since there are too many undergrads, and interns take up a lot of time and sometimes don’t pay off (though mine from last summer was a superstar).
And then clueless CIOs complain about the schools not teaching real world stuff.
Ya can’t win.
The stuff I’m talking about is all free. The problem is time. If you are teaching some methods and want to give class assignments, it is much easier to use 1,000-gate circuits than million-gate circuits. The problem is, the interesting problems show up in the big designs. Too much research is on the easy stuff, which commercial tools already handle really well. Some professors are clueless about this, but others know the problem full well. But you don’t want a student to spend an entire semester getting lost in complexity, even when complexity is where the action is. I don’t know the answer.
Definitely done through intern programs at engineering companies. But it is mostly limited to people in PhD programs, and it doesn’t help the undergrad problem. We pay our interns - it appears that in many fields interns get to work for nothing, which brings up all sorts of problems. It is also hostage to the economy - internships for law students seem to have been way down lately.
But quality leads to exclusivity as more people try to get in. And I don’t think quality has suffered at the high end. I see lots of resumes from MIT students, and it seems that the current crew is a hell of a lot smarter than my class was.