"Buy not build"--is that the standard in commercial IT shops?

(Since this question could elicit a variety of answers, I think it belongs in IMHO; if not, please move it where you will)

Recently we had an all-hands meeting of the IT department where I work. Among the 12 guiding principles that our new CIO set forth was that our first approach should be to buy a solution rather than build one. This is disheartening to me because it seems to be the first step on a path that will leave our work little more than that of glorified Radio Shack clerks. If your primary purpose is to recommend and implement existing solutions rather than build your own, then the value of your development skills seems sharply diminished. Another issue is that when you buy a solution, you usually have to get the business side to agree to change their procedures to accord with the way the canned solution works, rather than building something that does exactly what they want.

I’d like to hear from other IT people…is buy-don’t-build becoming the standard?

Not necessarily. I think it depends upon a lot of things, first and foremost the complexity of the application compared to the resources of the IT shop. For example, in the managed care world, almost no health plan builds their claims adjudication system from scratch anymore, simply because the rules are so complex and arcane.

Many, but not all, of our applications are purchased; this allows my staff to focus on integration, reporting/data warehousing/decision support, etc.

I agree that there are significant trade-offs, but as a health care CIO I could never justify the expense and risk of building a really major application in-house when there is proven off-the-shelf software out there.

Hopefully this isn’t too disheartening an answer - my staff has plenty to keep them occupied, and since they’re writing in .NET over SQL Server they don’t feel like they’re missing much.

I’ve done a lot of integration work myself, and it can be challenging and interesting. But sometimes I miss the days when the IT group would write an entire A/R system in some 3GL. It was a great feeling to be one of the ones who really made those screens do what the users wanted, from the ground up.

It was inevitable that commercially available business applications would reach a level of maturity that would make it difficult for companies (whose main product is not software and who were, quite frankly, never very good at producing it) to justify ‘rolling their own’. As far as the business folks are concerned, most of the problems we built into our systems were to preserve the way they worked rather than rethink the way they worked. To a large extent we have been held back by clients who were not prepared to change. Many of them would probably benefit from being given a well designed system and told to adapt to it rather than commissioning a system that perpetuates their current inefficiencies.

As this trend progresses I think that in house development projects will be limited to those rare instances where management believes that a unique vision will give them a strategic advantage. Most of the time, they will be wrong.

It will certainly change our jobs. It already has.

Where I work, the philosophy is to buy, then customize so heavily that it costs much more than either buying or building alone and turns into a maintenance nightmare, then call it a failure, and start all over again.

We’re definitely moving in that direction. We’ve purchased, or are evaluating for purchase, several different apps that we would have built in-house a couple of years ago.

It’s getting extensive enough that we’re pushing to hire an ‘application support’ person who would be dedicated to it. Another developer and I have spent HUGE amounts of time over the past year on this: researching, evaluating, and recommending apps for purchase to meet business needs; handling install and setup of the apps; client support; and doing custom reporting/interfaces where the software didn’t meet needs (or, frequently, where they didn’t want to pony up for additional modules).

As one example, I obtained several quotes for custom programming for a job that I didn’t have time/resources to do myself (at the time, I was the only developer). The business side couldn’t understand why it didn’t just magically happen until quotes came back in the $70K range. We bought a prebuilt app that will do most of what they wanted for $35. (Of course, I’ve also spent a large chunk o’ time on that customization I talked about - the business office never counts that cost, since it doesn’t show up in their expenses.)

We’ve never really been a full-on “IT shop”, so this works for us. We’ve got plenty to do in programming small jobs that wouldn’t be cost-efficient to farm out, doing datawarehousing, websites, reporting, etc.

I tend to agree with ethelbert - oftentimes, building something to match ‘the way we’ve always done it’ is a mistake. Business people seem to take “you’ll have to change the way you do it” much better when it comes from the sales guy than from our office. However, I also agree with porcupine that the endless customizations needed to finagle the purchased app are extremely costly.

Definitely not the standard where I work. We are a consulting company and we do custom stuff for a number of big clients. We build nearly everything in-house, all based on open-source stuff that we heavily modify, and a lot of original code.

I’ve never heard of a canned solution that worked out of the box. IME the difficulty of doing only integration work is that it’s harder to understand the core of the application, because you never develop or maintain it - it’s a black box.

Often, too, you have another level of troubleshooting to go through in case of problems, and frequent contact with the vendor, whereas in the old days you’d just find the developer or team who did the module in house.

It sounds like he is a pretty smart guy.

I am an IT consultant and business systems analyst. I can’t think of many types of businesses and industries that would demand a large custom application in this day and age. Every business is convinced that it is unique, and it is not. That perspective is usually just the result of inefficient, non-standard business practices and sloppy data that need to be fixed for their own sake. It is not wise to custom-build a system around existing business flaws.

I worked for a company (a shoe company) once that wanted a new ERP system. They selected a package from a startup software company and were one of its first customers. Midway through implementation, the software company closed its doors. The shoe company decided that it could become a software company as well and brought in a slew of consultants to finish the implementation. $24 million later they had a functioning software package and a nearly bankrupt company that still hasn’t recovered. There are a couple of lessons in that.

As someone mentioned, out-of-the box software packages are far from development and maintenance free once they are installed. I have been a staff developer for several companies that bought large packages. Systems like SAP and Oracle Financials require an in-house crew of developers to run.

I just went through a week of technical architectural design school, and buying packages was absolutely the way to go. Developers were to be kept busy customizing the front ends of all the packages and figuring out how to get all the parts to play nicely with one another. We couldn’t identify any specific requirement which was best met by developing something completely on our own. We didn’t have all the facts necessary to come up with a detailed architecture, but it seemed like there were packages out there to cover all the bases.

“Buy, don’t build” generally makes good sense. The software company that sells the “canned” solution can spread the development cost across all its customers, whereas the cost of an application built in-house is borne by one company alone.
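As a back-of-the-envelope sketch of that amortization argument (all figures here are invented for illustration, not from anyone in the thread):

```python
# Rough buy-vs-build economics: the same development cost, borne by
# one company versus spread across a vendor's customer base.
# All numbers below are hypothetical.

def build_cost(dev_cost):
    # In-house: one company bears the entire development cost.
    return dev_cost

def buy_cost(dev_cost, num_customers, vendor_margin=0.5):
    # Vendor: development cost is split across all customers,
    # plus an assumed 50% margin on top.
    per_customer = dev_cost / num_customers
    return per_customer * (1 + vendor_margin)

# e.g. a $2M application, vendor with 100 customers
print(build_cost(2_000_000))     # in-house: 2000000
print(buy_cost(2_000_000, 100))  # per license: 30000.0
```

Even with the vendor taking a healthy margin, each buyer pays a small fraction of what building the same thing alone would cost - which is why the break-even case for in-house development usually requires the application to be genuinely unique.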

I worked IT for an open source software company for a while. Many of the programmers were pretty upset when we’d choose a closed-source software package to do the things we needed done. They argued that they could write it themselves. But that completely ignored the fact that they already had things they were supposed to be working on. The dev group managers (who were also incensed about the closed-source solutions) stopped griping pretty damn fast once they were asked flat out: Which of your engineers will you be devoting, full time, to developing and supporting this app?

Aside from manpower issues, time is a factor. Purchased solutions work now (or at least, soonish) as opposed to having to go through a whole development cycle, testing, etc. Even if buying it is more expensive, getting things done faster is often more cost effective in the long run.

Purchased products also tend to have employable experts floating around in the work force that you can hire. Need someone versed in Oracle or Remedy? The headhunter can find someone. Need someone who knows your homegrown app after your current employee walks off the job? You’re hosed.

Also, if ready-made solutions reduce the IT force to glorified Radio Shack clerks (they don’t, of course; Oracle gurus’ hourly rates reflect this fact), that’s good for the business. Any time you can get by with less skilled and, more importantly, lower-paid employees, you have the opportunity to cut costs. Since I’m an IT guy, I can’t say I like that a lot, but it makes good sense from a business standpoint.

There’s also the issue of capital assets vs. ongoing salaries. Having your regular IT guy (who’s going to be a fairly well-paid guy if he’s also qualified to develop apps) build something shows up on the books as a recurring salary. Buying some expensive software (and even hiring contractors to make it run properly) can be jotted down as a one-time expense, which generally looks better on the annual budget report. (I can’t say I fully understand this, but that’s what the bean counters have told me.)

So, yeah, I’d say “buy, don’t build” is the standard in IT shops.