Why does Windows demand that I reboot?

Yes, if something like that doesn’t get built that way from the start, the effort grows substantially. I was really referring to the decision being made at the beginning, with no preexisting dependencies.

This. It’s just simpler to dump Windows altogether. I mean, I think everyone has a breaking point where Windows-weariness sets in, what with all the Winrot, incessant rebooting, malware, bloat, terrible customer service, and so on. There are other alternatives out there. For me this was about 3 years ago.

That particular decision was made back in the Windows 0.x phase, when the target processor was a real-mode 80x86 at best (where x is nothing, 1, or 2). The CPU addressing scheme was segment:offset, and segments were 64 KB max. To implement DLL addressing, Bill Gates and team developed a cunning plan involving editing return addresses on the stack to make them references to a DLL entry point, then re-editing the stack once the DLL was reloaded and the reference was resolved. It was complex and well dodgy. The move to i386 Protected Mode (with CPU-supported virtual memory, paging support, and 32-bit address segments) made DLL reference management much easier. They went with the CPU-supported addressing option, but (because efficiency was paramount in the days of CPUs running at tens of MHz) they did not use that extra level of indirection you are suggesting.
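
For what it’s worth, here’s a rough sketch of the indirection idea in Python, purely to illustrate it - this is not how Windows resolves DLL calls, and the routine names are invented. Callers go through a slot in a table rather than holding a direct address, so the routine behind the slot can be swapped without touching any caller:

```python
# Illustrative only: a "library" routine reached through an indirection table.
# None of this reflects actual Windows internals; names are made up.

# Hypothetical v1 of a library routine.
def draw_widget_v1(name):
    return f"[v1] drawing {name}"

# Hypothetical v2, e.g. installed by an update.
def draw_widget_v2(name):
    return f"[v2] drawing {name}"

# The indirection table: one slot per exported routine.
import_table = {"draw_widget": draw_widget_v1}

def call(entry, *args):
    """Every call goes through the table, never through a saved address."""
    return import_table[entry](*args)

print(call("draw_widget", "button"))   # [v1] drawing button

# "Hot-patching" the library is just rebinding the slot; no caller is touched.
import_table["draw_widget"] = draw_widget_v2
print(call("draw_widget", "button"))   # [v2] drawing button
```

The cost, of course, is exactly that extra table lookup on every call, which is the overhead being weighed above.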

Kids these days. When I were 't lad, we had to bootstrap our micro with an EPROM programmed with a hand-cranked phone magneto, 1 bit at a time…

Si

um, no, that’s even more difficult. Especially when the software you need to run only runs on Windows.

can we stick to things that exist?

I reboot once a month for Patch Tuesday. If that’s something you consider “incessant” then good lord.

I didn’t feel like addressing the rest of your vague “points.”

Reasonable points.

Most of my machine and assembly coding was on a 6809 (a whopping 1 MHz), and I used this technique frequently despite being very cycle-conscious (video games), so to me it’s not obvious that it’s simply too much overhead. But we would have to go back in time to test, so, oh well.

I liked the 6809; you could write relocatable code, so you could shove an asm routine in a BASIC string, get the address of the string in memory, and jump to it even if the string had moved. That was pretty cool in my book (when I was 14 or so, anyhow), compared to a ZX81 (Z80 CPU), where asm routines had to be stored in a massive REM line that was the first line of the BASIC program so you knew where it was, and not all asm instructions could use relative addressing.

Si

I liked the 6809 also. It was my start in computers (at 15, also), and I didn’t realize how good I had it until I compared notes with friends working on the 6502 - the difference in capabilities was like night and day.

sounds like that would be a security risk in a modern system.

I am just a user, not a programmer so am unqualified to address any of the technical issues here presented. But I have to point out that at least one other popular OS, Apple’s OSX, requires reboots for almost all operating system changes as well. There are indeed fewer of them, but it is not at all uncommon for Apple Update to tell me it has 3 updates to install, one of which requires a reboot and gives me the option of installing/rebooting now or sometime later.

No one above said anything specifically to the contrary, but the impression given is, I submit, somewhat overstated.

Mhhmm, mhhmmm, yeah, that’s what I used to think also once upon a time.

Turns out that when I did my homework, I found out I do have a choice. All the professional software I use is thankfully not Windows-only, as I had mistakenly believed.

find me non-Windows software which will work with these:

http://www.datatranslation.com/products/dataacquisition/usb/DT9834/default.asp

Well looks like you’re fresh SOL jack.

Then again, data acquisition companies have been behind the curve for ages, what with their inability to support 64-bit drivers all this time.

Of course I’m not an audio expert, but I don’t see why you couldn’t use LabVIEW…

To get (somewhat) back to the original thread – what’s the reason for the iterative updates Windows sometimes makes you do? Meaning, whenever I have a fresh Windows install, I can’t just install, boot, install updates, reboot, and be good. Often the cumulative updates come in 2-3 “batches” and I have to download, install, and reboot after each one.

Okay, you may say “dependencies,” and I’d say fine, except that unless the updater itself is being updated, all that should really matter if you’re INSTALLING new DLL files (and such) is that you maintain the correct order and make sure they’re all installed before you link them. Indeed, the purpose of Windows Service Packs is to glue all the updates created up 'til that point (and more) into one meaty package. So what’s the difference between a Service Pack and installing all the standalone files in a single batch?

Many years ago we were introducing NT 3.1. We also had VMS servers running Pathworks, and so had experience of a more heavyweight OS. When we saw the “please reboot” after any update on NT, we decided to investigate. In most cases a stop/start of a service or process worked just as well. I think the reboot, needed in some cases no doubt, is just a catch-all.

On the Novell subject: I remember being the joke attendee at a regular PC networking conference, where everyone else was running NetWare and I was responsible for a Pathworks (aka PCSA) environment. All the jokes stopped when, during a session on reliability, I was called on to comment. The great NetWare users were all congratulating each other on reaching one week of business-hours uptime, but I just mentioned celebrating one year of continuous operation across 10 international sites for over 20,000 PCs…

no, I’m not - I use Windows.

what you see as “behind the curve” I see as “no business case to support OSes that constitute a low-single-digit percentage of the market.”

yeah, just let me pay someone thousands of dollars to write routines that do exactly what the manufacturer-provided software already does just so I can thumb my nose at Microsoft.

yeah, no.

Code changing ain’t an easy process. A given bit of patch code is written to be applied to its preceding bit of code, for example. If you were to just slap the latest update onto code three generations back, it would break the hell out of everything. Some patching doesn’t require reboots because it’s fairly straightforward; other patching requires a ton of work to implement.
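
To make that concrete, here’s a toy sketch in Python (my own invention - made-up version numbers, not any real patch format): each delta records the version it was built against and refuses to apply to anything else, which is why you can’t slap the newest patch onto code three generations back.

```python
# Toy model of a delta patch: it only applies cleanly to the exact version
# it was built against. Versions and fields are invented for illustration.

from dataclasses import dataclass

@dataclass
class Patch:
    expects: str   # version the patch was built against
    produces: str  # version you end up with
    delta: str     # stand-in for the actual binary diff

def apply_patch(current_version, patch):
    if current_version != patch.expects:
        raise RuntimeError(
            f"patch expects {patch.expects}, but the installed file is "
            f"{current_version} - applying it anyway would corrupt the file"
        )
    return patch.produces

version = "1.0"
for p in [Patch("1.0", "1.1", "delta A"), Patch("1.1", "1.2", "delta B")]:
    version = apply_patch(version, p)
print(version)  # 1.2

# Jumping straight to a patch built against a later version fails loudly
# instead of silently producing garbage:
# apply_patch("1.0", Patch("1.3", "1.4", "delta C"))  -> RuntimeError
```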

Be grateful that you’ve got a mostly automated process. Perhaps in 100 years, patching will be so easy that we don’t have to lose 5 minutes of our precious lives in order to improve security and reliability.

Sorry for the sarcastic tone - the “patching is a pain” argument is one of my pet peeves.

I’ll give you one situation where it is a pain - try installing Windows 7 from an original release DVD. I did this last weekend and I shit you not, I had to run Windows Update 8 times before everything was up to date. Install, then Windows Update has 110 MB of updates. Do that, reboot, and there are now 28 MB of updates. Do those, reboot, now there’s 300 MB of updates because it finally allowed Service Pack 1 to show up. Install that, reboot, and now there’s 40 MB of updates for the .NET frameworks. And so on.

At least Apple is better at providing combo/rollup updates.

I never said to slap the newest code on top of the old code; my point is that you encode information in the files that imposes a relative ordering (ID numbers should be sufficient in most cases, since they generally get bigger). Then, when one file requires six updates, you apply the first, second, third, fourth, fifth, and sixth updates in order. As long as the system is in a reboot state before the file is in use, it shouldn’t matter whether all six updates are done in succession or spread across six different download/boot cycles.

If the code relies on other dependencies, you make sure the updater system installs the dependencies first, and then if any dependency fails, you skip the rest of the update tree that’s ultimately dependent on that file and show an error message. Dependency trees aren’t exactly out there; it’s practically a makefile.
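
Something like this rough sketch is what I have in mind (toy Python with made-up update IDs and dependencies, obviously not the actual Windows Update logic): the IDs give you the relative ordering, and anything downstream of a failed dependency gets skipped with an error instead of being half-applied.

```python
# Toy update installer: IDs impose the order, dependencies gate installation,
# and failures propagate as "skipped" down the tree. All data is invented.

updates = {
    # id: (depends_on, description)
    101: ([],         "base fix for foo.dll"),
    102: ([101],      "second fix for foo.dll"),
    103: ([101],      "fix for bar.dll"),
    104: ([102, 103], "rollup that needs both of the above"),
}

def install(update_id):
    """Stand-in for actually applying an update; pretend 103 fails."""
    return update_id != 103

installed, skipped = set(), set()
for uid in sorted(updates):                 # IDs impose the relative order
    deps, desc = updates[uid]
    if any(d in skipped or d not in installed for d in deps):
        skipped.add(uid)
        print(f"{uid}: skipped ({desc}) - a dependency failed or was skipped")
        continue
    if install(uid):
        installed.add(uid)
        print(f"{uid}: installed ({desc})")
    else:
        skipped.add(uid)
        print(f"{uid}: FAILED ({desc})")
```

Whether those four updates arrive in one batch or across four reboots shouldn’t change the result, as long as they’re applied in ID order.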

I assume there is a reason for this, but if there is, you didn’t really give it. It’s not really that big of a deal; I was mostly just wondering if there was a technical reason that you have to do it in multiple batches rather than downloading one big pack and sorting through the mess at install time.