Computer issues are difficult for most people because computers are complicated, multifunction devices, and the vast majority of humans do not understand how they work.
I’ve had a weirdly privileged life when it comes to computing. I was using a mainframe in school in the 70s, and while it wasn’t my major, I took a few basic classes on the design and operation of computers (they were actually taught in the electrical engineering department back then; today they would probably be Computer Science courses). Before computers were at all common in everyday life, I understood how they worked: how they stored and retrieved information, the basic operation of the processing system, what a compiler and an operating system were, and the basic parameters of what a computer program was and how it could operate. Interestingly, most of the paradigms I learned all those years ago are still basically valid. Someone who learned those same things back then, transported to 2021, would be shocked by the proliferation of GUIs and by the miniaturization, but they would likely grok very quickly what a lot of major applications were doing, on at least some level.
In the 1970s you really couldn’t easily be a casual computer user. There were a few secretarial-type positions with some level of casual terminal setup and things of that nature, but to really use computers back then you had to actually understand how they worked. While many companies, universities, and government agencies were using computers, managers and most staff didn’t have a terminal on their desk and wouldn’t have known how to use the computer if their life depended on it; a small staff that understood the machines would run the programs management wanted and deliver the outputs.
As the 70s gave way to the 80s, the age of the personal computer arrived, and the computer went from being a “big iron” machine that many people interacted with indirectly but few people used, to a small machine that far more people were expected to be able to operate. To handle this shift, the personal computers of the 1980s all took a swing at some basic layer of accessibility (the first GUIs) for non-technical people with zero interest in understanding how a computer worked. On non-GUI systems, lay users would usually keep a little note card telling them the sequence of commands to type to open “their program”, and they’d usually have to reboot if they got confused.
GUIs got better in the 90s and 2000s, and a whole generation (the majority of people ever to really use desktop personal computers) became familiar with them under this paradigm: you learned “how to use a computer” but had no reason to understand how it worked or many of the underlying concepts.
This creates “task”-oriented users. Like the OP: someone who maybe wants to send emails and use a word processor. To that person, that’s all a computer is. Another person just wants to boot up video games. Another wants to use a spreadsheet and nothing else. Still another is a creative professional using graphic design software who does almost nothing else on a computer.
If you actually understand the underlying paradigms of computing, frankly none of the evolutions since the 1970s have been difficult to adapt to; it’s just that most people have never had a need to understand computing on that level.
What’s interesting is that, from educators and other people more connected to the youngest generation, I hear things are actually starting to swing back the other way. While the desktop computers of the late 80s, 90s, and 00s were very GUI-driven and “easy”, in many ways you still had to understand certain paradigms that modern smartphone OSes largely abstract away. I’ve heard that younger people today have great difficulty understanding things like a file system, or even how to do very basic troubleshooting. Most are not very comfortable using desktop or laptop computers, and many rarely touch them outside of school computer labs. That form factor is “for work”; their game consoles, phones, and tablets are for entertainment, and they don’t recognize how many paradigms those devices share.
I have a friend who said his daughter, who is in her teens and has used a smartphone for 10+ years, was almost in tears over having to install software on a computer; she just couldn’t understand it. Meanwhile, I think even my least techie friends in the 90s could get a shrink-wrapped software package home and installed relatively easily.