New Cheap PC for Windows 11 or supporting Discourse discussion {Also obsoleting Browsers & Operating Systems}

Yeah, same here, and I think that’s worth emphasizing. Even among people not normally resistant to change, recent Windowses can be an exception.

Microsoft has so thoroughly enshittified Windows 11 that the user experience is actively horrifying these days. It feels like a free-to-play mobile game that tries to trick you into accidental purchases, with ads, upsells, and dark patterns everywhere and different Copilots forced down your throat at every turn. It reeks of desperation.

Personally, in the last few years, I’ve switched browsers, operating systems, search engines, software stacks at work, email clients, project management systems, and so much more. They were all slightly annoying, but nothing terrible or even worth remembering.

Then I had to do a fresh install of Windows 11 for a friend, which I hadn’t had to do for a while since I typically just upgrade and keep my settings intact (which prevents a lot of bloat). I could not believe how much spam and crapware a new install came with. It was unfathomable that anyone would willingly pay money for this crap. I’ve been using computers and installing operating systems for three and a half decades, professionally for half of that, and I have never seen a paid software program as bad as Windows 11… especially an operating system. If I were the Microsoft Defender app, I’d classify Windows itself as malware and force it to commit seppuku.

How the hell did it get so bad?! Who is running the Windows group these days? Microsoft is making gazillions on AI and cloud stuff, and they couldn’t just leave Windows well enough alone and survive off the enterprise licenses, they had to go and actively enshittify it to this degree?! I am not just disappointed, I am horrified at how bad it has gotten. 11 makes Vista seem lean and light in comparison.

Not only is it not worth “upgrading” to, it completely turned me away from Windows. I was in the market for a new gaming desktop, but opted not to buy one after seeing the mess Windows has become. No way that’s worth dealing with on a day-to-day basis; it’s an operating system that’s actively hostile towards its user. Ugh.

Microsoft seems to have forgotten their typical pattern of “shitty OS release, followed by a much improved one”. From Windows 10 to 11, they’ve gone from “shitty OS release” to one that’s “almost unbelievably shittier”. Windows 12 better be something amazing, it better come quick, and it better be free for Windows 11 users.

Not in the slightest. The biggest problem with Vista was the huge change to the driver model, which most vendors weren’t ready for. Microsoft probably could have waited longer for things to settle, but frankly I doubt it would have helped.

Win7 had the same new driver model, but by then a few years had passed and the drivers were in better shape. That plus a little spit and polish gave it a better reception even though the internals were basically the same as Vista’s; under the hood there was almost no resemblance to XP.

And the driver model improvement was crucial. XP put basically everything in kernel mode, where any driver bug could take the whole machine down. Vista moved large segments out to user mode, where they can only crash the app, not the system. Drivers have only gotten more complicated since then, increasing the security surface area, making it even more important that they be in user mode.

While I’m not familiar with the internals of Windows drivers, this sounds like Microsoft had finally clued in to a very old idea first introduced by Digital Equipment in the PDP-11 in 1970 (55 years ago!) and later in the VAX – the distinction between port drivers and class drivers. The port driver has to run in kernel mode and handles the actual interactions with the hardware, while the class driver handles generic functions that are the same for all devices of that class. If it took Microsoft 55 years to understand the distinction, that’s not exactly a recommendation!

Sorta. The kernel-mode driver is a minimal (unfortunately not as minimal as it should be) interface to the hardware. The user-mode portion does as much work as possible converting submitted work into something the hardware can understand (typically, “command buffers” filled with hardware-specific commands). Note that this is still device-specific, but the user-mode driver can’t actually submit the work without the kernel-mode driver’s help (writing registers, that kind of thing).

But anyway, yes, they were pretty late in having the distinction at all. It’s a big improvement since creating those command buffers (along with all the auxiliary work) is much more complicated than just programming a few hardware bits.

Also, you can have multiple user-mode drivers all using the same kernel-mode driver. OpenGL, Direct3D, etc. all share the same KM driver.

Well, my machine boots up faster and seems overall more stable going from 10 to 11.

I like the UI better in 11. Once I got used to it. It’s cleaner and easier for me to navigate.

I upgraded from 7 to 10 so long ago that I can’t relate my experience going to 10 anymore. But I don’t remember any problems.

I will say that 10 and 11 have been a lot more reliable than 7 ever was. With 7 it seemed like I was always tracking down one error after another. After going to 10, it became much less of a hassle. And with 11 it’s about the same but everything runs faster.

Maybe I just notice these changes because fixing computers is my job.

:laughing:

Case in point. I just wanted to get some footage of playing Minesweeper on Windows vs. Linux recently.
On Linux, Minesweeper (or something closely resembling it) looks like Minesweeper did on Windows from 3.1 onwards.
Playing Minesweeper on modern Windows involves installing it from the store (OK so far); when you launch it to play, there’s a 30-second delay while it says ‘connecting to Xbox Live to get your data’; once that’s done, there’s a popup that offers you different themes for the game (some of which are in-app purchases), but the default skin is free. You start playing the game and there are ad banners. There are also popups offering you the opportunity to pay for hints. When you lose, there’s a screen suggesting you could pay to have extra lives, so you can undo a fatal click. And it goes on like that: popups, ads, upsells, data harvesting, interference and interruption to the gameplay of what should be a pleasant little toy. What. The actual. Fuck.

The game is unplayable because of all the ‘improvement’ that has been visited upon it.

First, it was DrDeth who brought up television. I was responding.

Second, before Philo Farnsworth there were televisions using a sort of mechanical disk system. Once Farnsworth’s invention became the standard, those televisions were no longer supported.

OTTOMH, I cannot remember how long it took for broadcasters to start broadcasting only in color. It was decades. Yes, once they did start broadcasting only in color, you could still use a black and white set.

The switch to stereo took how much longer? Yes, you could continue to use an old set.

I don’t recall for certain whether I had to request a discount converter coupon from the government. I think I did. Even with the coupon, I had to spend about $30 to buy a converter box. The box required a television with the right kind of inputs (IIRC coaxial or RCA). If your television was old enough not to have those, you needed to buy a new television. The converter needed to be plugged in to an AC outlet. Portable television sets were thus made obsolete.

Television is treated differently because broadcasts are regulated by the US government and the air waves are considered to belong collectively to the American people. Additionally, as I have said television took a long time to change and, as you have agreed, the changes were not really that significant for many decades. Computers have changed much more frequently and in much more significant ways since home computers were first sold.

Again, after Farnsworth, television technology took decades to change to color, then decades more to go to stereo, then decades more to move to a digital signal. Computer technology has changed much more profoundly and at a much faster pace.

Luckily, right now is absolutely the very best time to buy a Mac. The new base M4 Mac mini launched at $599, a price point that reviewers widely agree no comparable Windows machine can come close to matching like for like. (Where they get you is the absurd upgrade pricing, but they also just upped the base RAM to 16GB because of AI, so the base model is a fantastic machine for 90+% of people’s needs.) If you can borrow some kind of vague academic affiliation from someone, you’re also eligible for the educational discount, which drops another $100 off the price.

At the same time, this has driven down the price of used M1 Macs and the M1 was such a good chip that it’s likely to be one of the most future-proof Apple chips ever released.

And when I say right now, I mean literally right now, squeaking in before the tariffs come into play and Apple adjusts their pricing. The window for buying a new machine is closing by the minute.

For software companies, the accumulation of legacy stuff causes a whole lot of rework over time. When they write, say, a Win 7 version using one API, a 10 version using a later one, and an 11 version using a third, even later one, they have to keep the talent and the systems available to support and maintain those earlier versions. And often the developers of those systems aren’t interested in supporting or licensing some ancient API from 2009.

So they eventually have to move away from those old systems and all that supports them. It’s NO different than manufacturers stopping production of parts for 2009 era appliances, cars, or whatever.

And really… unless you’ve been under a rock for the past three decades, you know already that PCs have about a seven or eight year useful cycle at most, before they start becoming PITAs to deal with all that old garbage and getting/setting up a new one becomes the easier option. Having a machine running Win 7 and griping that it’s no longer supported (EOL was in 2020, FWIW) is pretty ridiculous.

I don’t even think it sucks; it’s just the way of the world, and fighting it is absurd.

What Microsoft is doing is pretty different.

It’s not like these are old hardware parts “wearing out” and needing replacements that are simply no longer manufactured.

It’s not even a question of supporting older APIs (which, by the way, Microsoft is going to keep doing anyway because that’s 80% of the reason legacy enterprises stick with Windows… Microsoft does a lot of things poorly, but backward compatibility with old apps is something they really care about, in contrast to Apple or Google, say.)

In this case, Microsoft is artificially disabling Windows 11 installs on computers that could otherwise run it just fine. There is no real hardware or software limitation there, they just want to drive upgrades.

Windows is largely just a UI for launching Office and a web browser anyway. It doesn’t really need to keep evolving; it hasn’t had any significant new features in more than a decade and probably won’t for another decade. And since most major desktop apps are cross-platform anyway, they don’t use platform-specific APIs any more than they absolutely have to, and certainly don’t peg their software only to the latest versions of those APIs.

Many apps are just webviews now anyway, precisely because of how irrelevant desktop OSes have become and how much more work it is to keep up with each platform’s latest APIs and native UI refreshes. Spotify, Slack, Discord, etc. are just thin wrappers around web apps, while many others use UI kits derived from Linux projects and other open-source spinoffs that aren’t pegged to anything Microsoft made. There are vanishingly few truly Windows-native apps anymore. Microsoft itself ships webview-based apps like Teams and VS Code.

The big exception there is PC games, and gamers are the rare kind of user who actually does need modern hardware. Most others don’t.

There is nothing inherent about a 10-year-old computer that renders it unfit for the job, as long as it’s fast enough for a given user’s needs. You can put Linux on there and run a modern browser, and most websites will continue to work fine.

All the TPM bullshit, Secure Boot, maybe UEFI… those are just lame excuses for Microsoft trying to create artificial upgrade cycles that would otherwise not be there.

This is not like Apple making a new architecture and thus needing to retire Intel support because of that. (Microsoft is doing that too, with a gentle nudge towards Arm and Snapdragon, but even there they include an x86 emulator inside Windows for Arm).

It’s just not really comparable. There is no inherent need to deprecate any of these OSes or machines. They can run most apps just fine. Word processors and Gmail haven’t changed much in the last few decades, and so much personal computing is done on phones now anyway (which do get more frequent upgrades, if only because of 2G/3G/4G/LTE/5G modem changes and non-user-replaceable batteries). It is fear of becoming irrelevant, not meaningful technological progress, that drives Microsoft’s shitty behavior around Windows. This is not the “way of the world”. This is simple, old-fashioned corporate greed.

My primary personal computer is still a vintage-2011 MacBook Pro running MacOS 10.11.6. Having virtual machines makes that possible: I have a couple of websites for which MacOS 12.4 and its browser is necessary, and a tiny handful of applications (e.g., FileMaker’s latest incarnation) that I need to run under MacOS 14 but I have those available to me without having to upgrade my native OS.

My professional computer, the one I use for the majority of paid work, is a bit newer (vintage 2018 MacBook Pro running MacOS 10.14.6) but it’s basically in the same situation.

Meanwhile, I get to run MacOS 10.6.8 where I still use Eudora as my email client. I even have MacOS 8.6 and System 7 available to me — not to mention Windows builds ranging from Windows 11 back to Windows for Workgroups — so I hardly ever have to give up old software or buy a new computer just to run the latest and greatest.

I admit that setting up virtual machines and emulators is a bit of an investment but I save money by still being able to run everything I need to run on these older computers.

This is true.

Obsolescence is inevitable and normal and has always been part of the computer industry.

But on top of that, Microsoft is artificially enforcing obsolescence for what seem like arbitrary reasons. At least with 11 they are; I don’t remember these shenanigans with 8 or 10.

It’s true that 7 is really old, and there is nothing unusual about an OS as old as it is becoming functionally unusable for a person who wants to stay connected to the internet. But the frustrating roadblocks that Microsoft has put on people wanting to upgrade to 11 using reasonably recent equipment are documented.

It is indeed a new thing with Windows 11, and one they used to be smart enough to realize was shortsighted. It’s the main reason why Windows 11 adoption is so poor that Windows 10 has even gained on it several times.

The way they pulled off getting people off of Windows 7 (after Windows 8 failed to do so) was to offer Windows 10 as a free upgrade. It wasn’t a big enough leap to get people off on their own, just like Windows 11 very much isn’t.

They also decided to force updates, which meant the vast majority of systems were secure, preventing problems. They got nearly everyone using the same OS and the most secure versions.

Then, supposedly for security reasons, they started requiring this extra feature (the TPM) that so many computers don’t have, and, worse, refused to even support otherwise-capable older CPUs, even though you could add the TPM separately. Some big companies got them to leave in a workaround, but Microsoft has actively been trying to close it.

People will maybe put up with installing a new OS, but replacing a computer that, from their perspective, does all they need it to do? That’s a hard sell. And some will even get invested in keeping it.

Throw in refusing to listen to actual complaints about Windows 11 removing features, and they just set themselves up for the toughest migration when Windows 10 goes end-of-life. It’s so dumb.

One interesting thing I remember about Windows 10 is that Microsoft was so desperate to promote it that there were numerous ways to legitimately get it for free (if you were already a licensed user of a previous OS going as far back as Windows 7). Which I attribute to the fact, mentioned multiple times by multiple posters, that Windows 10 offered no actual functional benefits over Windows 7 or 8.x.

A few years ago, back when Windows 10 was still current, I helped my son build a high-powered new gaming computer. Then it came time to install an operating system. I happened to have about half a dozen license keys for Windows 7 Professional, some of which were still unused. I downloaded an install image of Windows 10, and he installed the “Pro” version, and then entered the Windows 7 Pro license key I had given him. Based on things I had read, I was hopeful that it would work. And it did, and successfully activated Windows 10! He’s still running it today.

But I’m not sure that you could activate Windows 10 that way any more, and certainly not Windows 11.

It’s still possible.

I’ve read others saying they’ve done it successfully in the past year. It’s not guaranteed to work but it looks like many have managed it. Of course, W10 won’t be supported after October of this year so I wouldn’t be surprised if that’s when that window finally ends for good.

And then anyone with an activated copy of W10 can upgrade to W11 for free (if their machine is eligible).

Mostly you need to make sure TPM is enabled in BIOS. As long as your computer isn’t ridiculously old, it should be an option.

It’s surprising that Microsoft would allow a free upgrade to the next OS version. They were publicizing the ability to upgrade from Win 7 or 8.x to Win 10 because they were trying to promote it, but continuing that indefinitely seems to defeat their business model! Do you have a cite for a Windows 10 user (Win 10 Pro, specifically) being able to upgrade for free to Win 11 Pro?

I’ll pass that info on to him. I still have a bunch of the manuals for the different components that he never bothered to take with him, including the one for the Asus ROG Strix X570-E Gaming motherboard. This board has gone through many revisions over the years, but according to the manual, it does support “Windows Secure Boot” which Microsoft says is a requirement for Windows 11 and I presume means the same thing as TPM.

Have you been under the impression this whole time that people have to pay to upgrade from Windows 10 to 11?

The biggest killer in the latest versions of Windows 11 for me is the taskbar change. I have always had the taskbar (or equivalent) at the top of the screen. It only covers up the rarely used title bar of a window, and with auto-hide on, that doesn’t really affect even those operations. The result is conservation of screen space.

But the real benefit of a top-screen task bar is that I use it a lot and the mouse cursor is often near the top of the screen anyway. Having to move the cursor all the way down to the bottom over and over is a pain.

But MS has decided that I don’t want that. Um, beg your pardon?

Google also only allows side and bottom taskbars on Chromebooks. I have no idea why these companies think their choices should override user choices.

I am hoping someone adds a way to put it at the top of the screen in something like Open-Shell at some point. Until then, no switching.

I believe I have mentioned this: if you install Windows 11 IoT Enterprise (not Pro), the installer does not care about a TPM, and it does not care about Secure Boot. Your Windows 10 Pro license may or may not work to activate it, though; it is a different “version”.

As for Linux, Windows applications can be run via Wine (or Proton, etc.), but it is not 100% guaranteed any given program will work.