Did this happen? (’80s computers)

Sure, but the OP, and many of the replies, have been about intentionally disabling certain features of a perfectly good product.

Another example: my employer makes industrial products, often with several optional features in every unit, and those options are enabled only if ordered. Many times these options are physical pieces of hardware, but it’s less expensive for us to include them all in production. A side benefit for our customers is that they can upgrade to get a feature without having anything physically installed.

As I alluded to above, sometimes people just don’t want to give away something for nothing. In the case described in the OP, the manufacturer spent extra money to downgrade the product. The company may want to avoid a reputation for easily providing a reduced-price product. In that case the extra costs would have been wrapped into the price, and they’d let the purchaser know they could have it for even less if they ordered enough quantity to justify new designs and production. For smaller companies making other types of products, someone may just be obstinate and refuse to deliver any functionality not paid for. Certainly for low-quantity purchases it’s not surprising to hear that you’ll need to buy another unit at full price if you want the additional features.

I am now starting to wonder if my car has two cylinders that are disabled. Maybe I can hack my engine into a 6.

“Golden Screwdriver” is a term I’ve heard used to describe when a company (IBM is usually the company in question) leases a slower version of a mainframe, then when the customer needs more capacity and is willing to pay, they send a tech to go in, kick everyone out of the room, and essentially turn a screw enabling faster performance.

Slightly differently, back in the 90’s, Apple would sell Macs with their hard drives partitioned to a specific size regardless of the capacity of the physical hard drives they put in the machine. Late in the production run of the Mac IIsi & LC they were putting 120 MB drives in, but only formatting them for 80 MB*. You could reformat and instantly get 50% more disk storage. They would sell the machines in specific configurations, say, 25/8/80 (25 MHz processor, 8 MB of RAM, 80 MB hard drive), and for some reason they decided to nerf the drives rather than update their marketing materials. This was, probably not coincidentally, also the time that Apple was constantly on the verge of going under.

*I think those were the numbers, going by distant memory at this point.

Also similar is the practice of shipping DLC on game discs that requires you to buy a code to unlock but doesn’t require a download. I remember this being a big deal at the time, but I suspect it’s largely irrelevant in these days of game downloads.

I’ve heard the same thing. I wonder, however, if the reason for this was that the first product set included inherently slower machines. As yields and processes improved it became cheaper to make only faster machines while not eliminating the slower product from the product line.

The range in performance over the 360 line was real, with the cheaper, slower models having their instruction set implemented with microprogramming, while the more expensive, faster processors had it hardwired.

My first thought on reading this was “What? No way-- Pentiums didn’t go that slow. They started at 100 MHz”. Then I realized you had typed “Pentium 486”, when you obviously meant “Intel 486”.

It’s obviously been a while-- we’ve been accustomed to Pentiums for so long! :o Thanks for pointing that out!

My first home computer was actually a DEC Vaxmate, a rather nice integrated system based on a 286 (and running Windows 2.x – either 2.0 or 2.1, can’t remember) that looked vaguely like modern iMacs, and that I got for free from work because it was obsolete (its original base price with a single floppy drive was just under $5K and went up as you added options). Once I realized how much fun PCs were, I pined for a 486 which were just coming out at the time, and spent real money on that Intel 486/25.

Anyone remember the “inHell Pentagram Pro” joke ad from the late 1990’s? It was 666 MHz (which would have been quite uber leet in those days) and taglined as “A devilishly fast CPU”. No virgin sacrifices required. The actual Pentium Pro only went up to 200 MHz.

Nitpick, the earliest and slowest Pentiums ran at 60 MHz.

http://www.cpu-world.com/CPUs/Pentium/

My computer has a 7870 XT graphics card, which is a 79xx Tahiti card where some units had a bunch of defects, so it runs a little slower and worse but is a lot cheaper. I wouldn’t be surprised if AMD introduced the defects knowingly by buying cheaper silicon or whatever, because they know they can sell the slightly broken cards too.

It doesn’t quite work that way. It’s a very significant effort to change fabs, so you can’t just take a design and have it easily fabbed at some cheap but low-quality house.

That said, there are lots of things that affect the defect rate. For instance, chips at the center of the wafer tend to have lower defect rates than ones on the edge. If you can disable subunits, then you can sell those otherwise defective edge chips.

There are also complex relations between defect rate, clocks, and power. A given chip may not be reliable at default clocks and voltage, but work fine if you lower clocks or boost the voltage a bit. Desktop graphics cards can get away with higher power use, so you might use the “bad” chips on desktop while saving the “good” ones for laptops. Or you might just disable the worst subunits; this could even have the effect of increasing performance (say you disabled one out of 20 units for a 5% loss, but this enabled increasing the clocks by 10%).
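The tradeoff in that last parenthetical is easy to check with back-of-the-envelope arithmetic. Here’s a minimal Python sketch using the hypothetical numbers from the example above (20 subunits, one disabled, a 10% clock bump), under the simplifying assumption that throughput scales linearly with both unit count and clock speed:

```python
def relative_throughput(units_enabled, total_units, clock_multiplier):
    """Throughput relative to a fully enabled chip at stock clocks,
    assuming performance scales linearly with unit count and clock."""
    return (units_enabled / total_units) * clock_multiplier

# Fully enabled chip at stock clocks: the 1.0 baseline.
baseline = relative_throughput(20, 20, 1.00)

# Disable 1 of 20 units (a 5% loss), but raise clocks by 10%:
# 0.95 * 1.10 = 1.045, a net gain of roughly 4.5% over the baseline.
binned = relative_throughput(19, 20, 1.10)

print(f"baseline: {baseline:.3f}, binned: {binned:.3f}")
```

Real chips don’t scale perfectly linearly, of course, but the sketch shows why disabling a marginal subunit can still come out ahead.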

At some points in time and space, customers would only pay fabs for “perfect” chips. This made the idea even more compelling: a 95%-good chip was free! I don’t think anyone operates under this business model any more, since there were some undesirable effects, but it did happen in the past.

They did the same thing previously described on AS/400s, and it wasn’t due to manufacturing changes; it was designed into the product line.

Some sort of governor existed that could be dialed back for a price.

Interesting. I have an anecdote which might shed light on (or confuse) this.

Almost 50 years ago (:eek:) I ran programs on a CDC 6400 (running CDC’s Scope operating system?) A guy I knew (who ended up an IBM Fellow!) asked me to put an XJ instruction into a piece of code, told me it would crash the OS, and said he was trying to teach somebody a lesson! (The lesson, I suppose, was that they needed to reinstall the wire Senegoid refers to.)

I might have been naive and/or mischievous in those days, but I didn’t fall for this …

ETA: I can’t remember what I had for dinner two days ago, but I managed to conjure up this memory which I’ve not thought of in 3 or 4 decades!

Doesn’t Econ 101 suggest that what would instead happen is that everyone would be offered the high-end product at the low-end price? Am I to believe that Econ 101 is not a perfect model of the world?

Certainly. That’s because we technocrats have this stuff burned into our brains. :smiley:

For instance, while I’m unfamiliar with the Control Data architecture, the XJ instruction sounds vaguely like the UUO (Unimplemented User Operation) on the PDP-10, which I of course likewise will never forget! From the hardware standpoint UUOs were just a set of unimplemented instructions whose opcode began with 0 which caused a hardware interrupt and instruction trap. Half of them trapped to locations 40-41 in user space, the other half to 40-41 in executive space. Hence the second half were used to execute system calls to the OS – they had mnemonics like TTCALL and LOOKUP which today seem like the haunting names of ghostly old friends long gone but beloved still.

I was thinking, as a follow-on to the preceding, about examples where DEC might have pulled the kind of trick that’s been the subject of this thread. I’ve worked with DEC computers and peripherals and known DEC as a company for a great many years, yet I can’t think of a single example. DEC certainly had product lines with comparable products with different capabilities and prices, but I can’t think of a case where a product’s relative price wasn’t a direct reflection of its relative cost of manufacture.

Perhaps it’s the fact that DEC was so markedly an engineering company and so not a marketing company. Perhaps, one might even say, it was because Ken Olsen was a man of such integrity, and promulgated that culture in his company.

When a mainframe manufacturer builds a $400,000 (retail) box, cripples it and sells it for $300,000, one would guess that their profit margins are huge. But it might have been more expensive to design a separate mainframe for the lower-price target. What they’d save in manufacturing cost, they’d lose in engineering cost.

IBM, OTOH, had such gargantuan profit margins that even that added engineering cost wouldn’t be a problem. Consequently, IBM didn’t need to resort to tricks like the one-wire upgrade on the NAS mainframe.

They may not have been above forcing customers to buy more powerful mainframes than needed, however. For example, one model might max out at half a megabyte of memory(*). If the customer needed more memory than that, he had to buy a more powerful mainframe even if he didn’t need the extra processor speed. (This was partly motivated by IBM’s huge underestimation of what memory needs would be – their manufacturing capacity was inadequate to meet demand for their mainframes’ memory.)

(* - Half a megabyte may seem small, by today’s standards, for a large computer. Yes, that’s Megabyte with an M.)

Yeah, but once Intel discovered they could do this, they went a step further. Back then motherboards still had a separate socket to add a math co-processor alongside the main CPU (if it didn’t have a built-in one). So Intel started making & selling fully functional 486DX CPUs (except with a different pin pattern) as an 80487 math co-processor for the 486SX chip. And it literally was a scam, because when you installed this supposed 80487 math co-processor, all it did was shut down the crippled (non-math-co-processor-having) main 486SX CPU and then function as a normal 486DX CPU (because that’s what it was!). The only reason they changed the pin pattern was that otherwise you could have bought the much less expensive 80487 ‘fake’ math co-processor, plugged it into the main CPU socket, and it would have functioned as a 486DX CPU (again, because internally that’s exactly what it was).

Intel was for its first 25 years or so a provider of industrial components. Then, almost overnight, they became a consumer product company. IOW they went from being a company like Rockwell to being one like Panasonic, and it took them a while to understand that marketing products for consumers was **not** like bidding on govt contracts…

Maybe not anything quite so simplistic, but they DID engage in similar tactics of “crippling” – there were several examples in this thread. My point was that, so far as I am aware, DEC never did.

The thing is, it wasn’t only IBM’s profits that were gargantuan, it was also their market share. Back in the day, IBM was business computing, and to a large extent, scientific computing, too (e.g.- the 70xx series). IBM developed their own needlessly obscure technobabble, their own standards (EBCDIC), and with their tacit approval it even entered the vernacular of the common language: a computer was an “IBM machine”, a punch card was an “IBM card”. IBM’s marketeers had the luxury of entertaining fanciful notions that what it charged for its products should have little relationship to their real costs, but rather to what it was worth to the business to have IBM incorporate it into a total solution to a business problem, packaged with IBM’s famed customer-focused services.

In such an environment no IBM marketeer would think twice about taking, say, a product with “x” capability that could profitably rent at say $50K/mo, and crippling it so that the original could rent at say $70K/mo. I have no idea how often they did it, but as with my IBM printer example, they certainly did it. IBM was – and is – the epitome of a marketing company, while DEC was so purely an engineering company that it was its ultimate undoing. The difference is vast indeed. Had it not been for historical happenstance DEC could have overtaken IBM and probably deserved to, at least to the extent that a company focused on making the best possible products probably deserves to prevail over a company focused on making the most possible money.