The funny thing is that Windows Enterprise does not do this. If this is what is happening, they bought a computer from Best Buy.
But the Microsoft EA (Enterprise Agreement) (PDF) requires the purchase of a minimum of 500 licenses. A TV station may be a fairly major business but is very unlikely to be that IT-intensive!
I’m hugely resentful of software development organizations acting like they own your devices and using the ubiquity of the internet to enforce that practice. The most common excuses are (a) but the new version is better, and who cares what you think, and (b) it’s for security!!! (and also, who cares what you think!)
The problem with (a) is that it’s self-evidently false if I, the user, feel that it’s not better at all, but much worse. As the customer, I should at least have the option of refusing the “improvement”.
The problem with (b) is even more insidious. There were comments about it in another thread, about providers' insistence on OAuth2 support in email clients, which made a lot of legacy email clients stop working – clients that millions of us were very much accustomed to.
The only reason I could possibly fathom for this sudden “concern” on the part of email providers for my security, where they clearly don’t give a shit about anything else about me, is the matter of liability. Apparently in the event of a hack, I could sue them with the argument “you should have had better security”. I feel this just shows how America is the most litigious nation on earth, as illustrated by the following fact:
The US has less than 5% of the world’s population, but two-thirds of its lawyers
I know that’s an old clip (the upgrade is from W8 to W10), but Microsoft 365 E3 and higher subscriptions now include Windows Pro to Enterprise upgrades with no minimum – we sell it. The old Microsoft EA is essentially obsolete.
OK, but my point stands about the extreme intrusiveness that internet connectivity has enabled in the current software development paradigm. A software product that you buy is no longer a product that you own, it’s more like a service that you license. Good for the vendors, not so good for the customers when it’s out of their control.
And it’s not just Microsoft. I have my Kindle permanently in “airplane mode” because the last time I was foolish enough to leave it online, it tried to auto-update the firmware and the update failed. The nice folks at Amazon helped me recover it (though it took level-2 support to do it) and at the end of it all the newer firmware was, to me, worse than the previous one. Now I leave it offline all the time. When I buy a book, I buy it via a Kindle app on a laptop and transfer it offline to the Kindle.
Not a software guy but I worked closely with software guys and attended their scrum meetings. This is indeed the answer.

The one that drives me crazy is McDonald’s. I don’t go often, but when I do, I always think “wow, I will use the app! Apps for drive-throughs are such a good idea!” And then I pull it up and it needs to update before it can be used at all. Since I’m likely in the parking lot of the grocery store or something (because who leaves their house just to go to McDonald’s?), it’s too much of a PITA.
My phone is set to auto-update. In the middle of the night, all of the updates happen and I never even know about it. I can understand why people might not want to do that but I have never had an issue.

My phone is set to auto-update. In the middle of the night, all of the updates happen and I never even know about it. I can understand why people might not want to do that but I have never had an issue.
Oh, man, not about phones but just about updates in general, I could write a book about my experiences with “updates” over many decades and all the many, many things they have broken. I’d love to round up all the software development executives in the world and have them sit in auditoriums all around the world and be forced to watch a screen that for three hours displays only the following in bright white letters:
If it ain’t broke, don’t fix it!

I’d dispute the better code quality item on that list. Back when I was in college and you submitted card decks for programming assignments, you carefully checked everything. When you could just type cc on the command line, syntax errors could get fixed in a second, so desk checking no longer happened. But if you shipped something out the door, you had to be careful, because it was expensive to fix.
Certainly quicker cycles let you be more sloppy at the first step, but CI also provides numerous ways to be more careful as well. None of my code gets checked in without passing through a bunch of linters and static analysis, and all unit tests have to pass. It requires a good testing culture to work, but I’d put it ahead of anything I could do just by my own “being careful” on my own, even before the integration and staging tests.
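The gate described above (linters, static analysis, unit tests, all of which must pass before check-in) can be sketched as a fail-fast loop. This is a minimal toy illustration, not any particular CI system; the check commands here are harmless stand-ins for real tools like a linter or test runner.

```python
# Minimal sketch of a CI quality gate: run every check in order,
# and block the merge the moment any one of them fails.
# The commands below are illustrative placeholders, not real tools.
import subprocess

CHECKS = [
    ["python", "-c", "print('lint ok')"],   # stand-in for a linter
    ["python", "-c", "print('types ok')"],  # stand-in for static analysis
    ["python", "-c", "print('tests ok')"],  # stand-in for the unit test suite
]

def gate(checks=CHECKS) -> bool:
    """Return True only if every check exits 0 (i.e. the change may merge)."""
    for cmd in checks:
        if subprocess.run(cmd).returncode != 0:
            return False  # fail fast: one red check blocks the whole pipeline
    return True

if __name__ == "__main__":
    print("merge allowed" if gate() else "merge blocked")
```

The point of the pattern is that “being careful” becomes automated and non-optional: no individual discipline is required for the checks to run on every change.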
Microsoft pushes updates to Office weekly, or more often.
It makes me nuts - the damn Update Assistant is constantly asking to install some update or other, and I yell at it that I JUST DID THAT!

I can only guess the makers’ primary concern is to keep us thinking about their apps and keep us using them as frequently as possible.
Well, as with website design, frequent updates to apps are necessary due to the perception that a lack of change means they’re falling behind the competition, becoming obsolete or just looking dated.
At least, that’s what software developers and/or their bosses seem to think.
The constant churning works out great for everyone except users.

The problem with (a) is that it’s self-evidently false if I, the user, feel that it’s not better at all, but much worse. As the customer, I should at least have the option of refusing the “improvement”.
You raise an interesting point. One other downside to CI/CD is that most users for most products are going to develop comfort/familiarity with the product in whatever state it’s in, and any future change is likely to run counter to the user’s idea of “better” (which is often: if the software behaves in the way I expect based on prior uses, that’s better. If I have to change or learn a new pattern because some invisible sky fairy (aka software dev) decided today was the day, that’s worse than whatever I’m used to).

You raise an interesting point. One other downside to CI/CD is that most users for most products are going to develop comfort/familiarity with the product in whatever state it’s in, and any future change is likely to run counter to the user’s idea of “better” (which is often: if the software behaves in the way I expect based on prior uses, that’s better. If I have to change or learn a new pattern because some invisible sky fairy (aka software dev) decided today was the day, that’s worse than whatever I’m used to).
This is ultimately the issue for me. I despise interface changes after I’ve become used to a certain configuration, not because I can’t adjust, but because the changes almost always seem so arbitrary and unnecessary. It’s as if a car manufacturer suddenly decided to switch positions of the gas and brake pedals.
I mentioned earlier the recent changes to the iPhone calling screen moving around the buttons. Whatever security improvements were made in the update, whatever code was untangled I just can’t believe it was necessary to move the button positions around. They literally swapped the mute and speaker buttons’ position on the screen - why do that? What was gained by that change?
If it matters to the app makers (and I’m not saying it does, because they seem to have an amazing ability to ignore the wishes of users) I have deleted apps for making unnecessary interface changes. To my thinking, if I have to learn a new interface because of pointless updates, I may as well get a whole different app to accomplish the same task. More often than not there’s a similar app I can use.
Is it an “each sprint takes two weeks, and we’ll release whatever’s finished at the end of each sprint” kind of thing?
I wasn’t really thinking of that sort of thing as scheduled, per se, but rather that they’re just pushing the updates when they’re finished, and that cycle takes two weeks. So sort of scheduled, sort of not: releases go out every couple of weeks, but any particular change isn’t slated for a specific release – it just ships in the first scheduled release after its completion.

This is ultimately the issue for me. I despise interface changes after I’ve become used to a certain configuration, not because I can’t adjust, but because the changes almost always seem so arbitrary and unnecessary. It’s as if a car manufacturer suddenly decided to switch positions of the gas and brake pedals.
Isn’t that more of a different question though? I mean, unsolicited and dubious interface changes are kind of independent of the release schedule; they’re going to be annoying and perplexing if they do them annually, quarterly, or CI/CD style.
I’m with you – I generally don’t see a point in moving stuff around on an interface unless there’s good reason to do so (like 70% of users bitch about it), and even then, it should be well socialized ahead of time so that it’s not such a shock. I mean, if I realized that 70% of users absolutely hated something I was OK with, I’d understand why they changed it. But they typically put something cryptic about it in the release notes, which aren’t always readily visible on mobile apps, so we’re left wondering why something changed.
Sometimes that’s the case (that’s how my current teams work, based on certain limitations we have), but the C means continuous, and that reflects the desirable end state. As soon as a requirement or ticket is complete, it goes into the pipeline, and that pipeline ends at the consumer. Major websites from big tech companies most likely get updates several times an hour, for example.
As for the need for constant change, keep in mind that the entire software ecosystem is changing all the time. Library versions go out of date, browsers get updated, new phones come out… there’s always going to be something to do.
eta: And yes, CI/CD doesn’t mean that interface changes need to happen frequently. That’s not really my area of expertise, but as I understand it, users may complain about frequent UI changes, but the churn keeps them more flexible, so they’re better able to adapt to major UI changes.
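To make the “scheduled release train vs. continuous” distinction in this exchange concrete, here is a toy model. The two-week interval and all dates are invented purely for illustration.

```python
# Toy model contrasting the two release styles discussed above:
# a two-week "release train" vs. continuous deployment.
# The start date and interval are made-up example values.
from datetime import date, timedelta

TRAIN_INTERVAL = timedelta(weeks=2)
TRAIN_START = date(2024, 1, 1)  # arbitrary first scheduled release

def train_release_date(finished: date) -> date:
    """Ship at the first scheduled release on or after completion."""
    elapsed = finished - TRAIN_START
    trains = -(-elapsed.days // TRAIN_INTERVAL.days)  # ceiling division
    return TRAIN_START + trains * TRAIN_INTERVAL

def continuous_release_date(finished: date) -> date:
    """Ship as soon as the change clears the pipeline (same day here)."""
    return finished

if __name__ == "__main__":
    done = date(2024, 1, 10)
    print(train_release_date(done))       # waits for the Jan 15 train
    print(continuous_release_date(done))  # ships immediately
```

Under the train model a change can sit finished for up to two weeks; under the continuous model the delay is only however long the pipeline itself takes.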

Isn’t that more of a different question though?
I don’t know - again, I’m speculating, but I suspect it’s related. With frequent updates come frequent nonsensical interface changes. Or so it seems to me.
If it’s true that this annoys many people, surely the app makers know it. Yet it still happens all the time. Which brings me back to my original suspicion - this is done for a reason. What, I don’t know. I’m not a conspiracy theory guy. But it sure seems bizarre that an incredibly annoying practice (pointless interface changes) happens so often, despite user dissatisfaction, with such frequency (sooooo many updates).
Assuming the CI/CD model is the answer, or a big part of it, how do we account for so much user dissatisfaction? The production line just keeps running at speed, regardless? Seems like a bad business model. Reminds me of an ongoing argument I used to have with a programmer friend of mine. He’d tell me about how all kinds of bad code went out in products, and I’d say if we built physical infrastructure the same way all the buildings would fall down. He would then get upset and DEFEND the system that permitted bad code to be shipped.
Whatever. All I know is, as a user, I’m not letting my devices update unless I explicitly allow it. And I’m ditching apps that piss me off with pointless changes.
The only thing that’s sometimes screwed me up is major OS updates. I always do those at the end of the year when work slows down for me in case any software breaks and there isn’t an update or patch for it available yet. And I wait for the x.1 version at least, if not x.2.
Keep in mind… for a lot of apps, you are not the customer, you are the product. Facebook sells you to advertisers. The advertisers are Facebook’s customers. User dissatisfaction doesn’t concern them, as long as they’re making money off you.

Assuming the CI/CD model is the answer, or a big part of it, how do we account for so much user dissatisfaction? The production line just keeps running at speed, regardless? Seems like a bad business model.
I think part of the problem is that your average user doesn’t have visibility into everything that’s going on, so from their perspective a lot of the changes seem nonsensical at best, and actively bad at worst.
An interface change may be due to a library change, or something like a tweak to the design standards that Apple promulgates, or something else that the developers don’t have much control over. To the end user, it looks like something just got changed for no reason, and without any advantage. But on the back end, they made sure that the app continues to work correctly.
It’s largely about communication and visibility, and a lot of mobile app development shops aren’t great about that sort of communication, and users are definitely not good about actually reading it in the first place.

Keep in mind… for a lot of apps, you are not the customer, you are the product. Facebook sells you to advertisers. The advertisers are Facebook’s customers. User dissatisfaction doesn’t concern them, as long as they’re making money off you.
I mean, yeah, but even if an app’s sole purpose is to gather information to sell to Al Qaeda, retaining users is still a goal of the app. User dissatisfaction results in less user engagement which results in less data collected which results in less money being made.