Yeah, I was thinking about the Minitel. I used it to look up the weather, play games, find hookups, that kind of stuff.
This may be a bit political for this forum, but without the internet, your new God King Trump would not have got anywhere near the presidency.
I mean, that’s just one example of the bad side of the internet. You can find your specific bubble. I mean, you can find online forums dealing with quilting, weird diets, incel discussion, paragliding, whatever. It all exists. Rule 34 does not only apply to porn.
Everyone can find their bubble. In the Trump case, it is unfortunately very large.
I see lots of people online who claim to know about something because they read about it. IMO you can’t say you know something until you’ve been tested on it and passed. They either only read things that support their preexisting beliefs, or ignore the parts of what they read that don’t.
This is mostly online, but I know people in real life just like this.
Certainly you need to at least be challenged on what you’ve learned and be able to correctly restate it in your own words. If you can’t explain how a pulley works, or the basic workings of the Krebs cycle, or the principle of moral relativism in your own terms without referencing a text or online source, you don’t really understand these concepts.
Chatbot LLMs have complicated this further because they can produce plausible-sounding gibberish that a user can cut and paste into their own venue without actually understanding or checking the response, presenting themselves as knowledgeable without actually knowing anything. Ultimately it essentially just becomes people trading nonsense symbols with no real semantic content, like two toddlers babbling at each other.
Stranger
The effectiveness of the internet as a learning resource largely depends on the individual using it. If you’re someone who genuinely wants to learn a new subject out of true interest and bring even a sprinkle of critical thinking skills to the table, then the internet is an excellent tool. It allows you to dive deeply into a wide array of subjects, cross-reference information, check sources, and gain a well-rounded understanding with minimal effort and without burning a hole in your wallet. You can even find, or AI-generate, relevant tests to fine-tune your knowledge. It’s self-education at its finest, and it’s about satisfying your own curiosity, not trying to outwit your dinner party guests or win online debates.
But if you’re someone who lacks critical thinking skills and merely wants to appear knowledgeable to impress others with your supposed cognitive brilliance, the internet unfortunately enables that behavior as well. It’s a stage for self-proclaimed “experts” to grandstand. For example, when such folks try to impress their doctor with the latest unfounded woo theories they’ve discovered online, the doctor may nod and seem interested, but in reality they’re most likely thinking what a gullible, misguided buffoon the patient is, or planning their weekend golf outing.
I encounter both types of folks. I prefer the former.
I can’t deny the good aspects, but I think the negative impact on society nearly cancels them out. As with many technological advances, mankind doesn’t appear to be capable of not misusing it.
My first job out of college in 1987 was with a partnership of Citibank (banking), NYNEX (telecom), and RCA (electronics manufacturing) that was established to leverage its partners’ domain expertise to develop a product that was very much like Minitel (without the hookup aspect).
We made what looked like a standard table phone, but with a screen (ASCII, like an old ATM) and a slide-out keyboard, so you could receive and pay bills, do 411 lookups, order groceries, etc. The guys who worked on connecting it all were using ARPAnet concepts.
It all fell apart when RCA got bought by GE, and NYNEX’s lawyers decided that it would be violating the consent decree.
Of course, soon afterward the internet we know today got going and it would have been quickly obsolete.
When toddlers do that they are practicing language, and I think they are getting something out of it. When they babble at adults they get feedback.
An annoying online ad I’ve seen is for the Adobe AI, in which a woman who owns some sort of candy or chocolate company has the AI write a flyer, which she then sends out apparently without reading. I just think someday she is going to get into big trouble doing that. Maybe when enough people do (like those lawyers) they will learn critical thinking and know they need to review what they send out.
But probably not.
You’re a very cynical guy.
It’s one of your most admirable qualities.
Stranger
Exactly. I came here to make just this point. Except that the development of the internet as a public resource was such an obvious and inevitable next step from what was already there that I would have used the word “step” instead of “leap”.
Back in the days when the internet’s predecessor, the ARPAnet, was still a small military and academic network, we already had:
- Private and public timesharing services like CompuServe, with capabilities for collaborative information sharing, including email
- Message boards (BBS) and public information services like AOL
- Internal corporate networks that in many ways resembled a private internet
On that last point, the largest such corporate network at the time was the one operated by Digital Equipment Corporation (DEC), which provided email and information sharing services that enabled employee collaboration around the world. It was based on a proprietary set of protocols called DECnet, which even at the time was functionally superior to TCP/IP. Of course it lacked HTML and browser capabilities, which wouldn’t have made any sense at the time anyway since most of the connected terminal devices were dumb character-mode terminals, but it otherwise provided many of the services we associate with the internet today.
Also of note is that if the rapid evolution of the ARPAnet and its transition to a public internet had not made TCP/IP the default protocol set for global digital communications, the internet would have evolved anyway and would probably have been based on the much superior ISO/OSI standards, the Open Systems Interconnection model from the International Organization for Standardization. OSI started to be adopted in Europe, but for several reasons, including its somewhat daunting complexity and, above all, the rapid growth of the ARPAnet-based internet making TCP/IP the de facto world standard, it never really went anywhere.
The point is, however, that all these things were lurking in the background ready to jump to the forefront had the internet not happened to evolve the way it did. Also, while Tim Berners-Lee was undoubtedly a visionary for inventing HTML and the World Wide Web, the advent of the personal computer with its GUI as pretty much the ubiquitous terminal device made the development of a graphical hyperlinked user interface to the internet pretty much inevitable, too. If Berners-Lee had not come up with it first, someone else would have.
The problem wouldn’t be technology, it would be business interests. The most likely alternative, as I have said, would be multiple incompatible time sharing systems.
As an example consider the home computer market. In the very early days IBM clones were often incompatible, but Intel, IBM and Microsoft got everyone into line. Yet still today we have an incompatible island of Apple products. It’s not the technology, it’s the business.
I remember a story in Analog from way back then that used the analogy of cars for PCs. In this story, cars had to go to only certain gas stations, since their fuel tanks were incompatible with other gas pumps. I think there were also limitations on what roads they could drive on. It didn’t seem that wildly exaggerated when the story came out.
Except that the need for a universal protocol standard in wide-area digital communications had been well recognized for many years, independent of a public internet. Enormous effort had been put into the development of both TCP/IP and OSI for just that purpose. If TCP/IP had not prevailed, OSI would have.
In fact even DECnet evolved in that direction. Despite the proprietary advantage it gave DEC to have a common protocol that could connect any DEC computer to any other, thus encouraging customers to maintain a “walled garden” of only DEC computers, DEC chose to make it open. In DECnet Phase V, it was renamed DECnet/OSI and conformed to OSI open standards, even back when the internet was not yet widespread and was still often referred to as “ARPAnet”.
ETA: Actually, the fact that the internet had not yet taken over the world was precisely why DEC thought that OSI would prevail as the global communications standard.
I was going to post something else, but this popped up and caught my eye.
What? In what way? As an Apple user I’d like to know.
Anyway.
If we posit that the internet would’ve come about one way or another, the topic is almost moot. But what if we’d gotten another type of internet? @scudsucker made a point about DT47 and I think he’s right. Watch when almost all the fringe right-wing movements started gaining momentum and later power, and track that against the rise of so-called social media.
They are correct. MSM/legacy media did filter out their idiocies. That’s the job of a (good) journo: asking critical questions. It was only when they got unfiltered access to the public that they could get traction. Remember, most people would rather hear an entertaining lie, even knowing it’s a lie, than a boring truth.
What I was going to say was about using phones as phones. About ten years ago, I noticed that my students didn’t know how to use a phone. I don’t mean physically, i.e. dialing a number. I mean that they didn’t know how to use the phone to get in touch with someone specific that they didn’t know in person.
However, since there are infinite layers of call centers, robot services and the like, the possibility of actually getting connected to someone is small and approaching zero. If the call is put through, people don’t answer their phones. So maybe the skill of using the phone to talk to another person is going the way of the pager/fax/telegraph.
I assume he means that applications that run on Windows won’t run on macOS and vice versa. But in the context of communications, despite Apple’s penchant for making everything possible proprietary, Apple computers are of course compatible with physical-layer network protocols like wireless and Ethernet and with higher level protocols in the TCP/IP protocol stack. That was precisely my point. The evolution of common networking standards was pretty much guaranteed and preceded rather than followed the rise of the internet.
For example, the Ethernet LAN was first developed in 1972, long before the internet as we know it, and the ARPAnet which spawned the TCP/IP WAN standard was first experimentally demonstrated in 1969. Both soon became industry-wide standards. It’s true that for a while IBM standardized on token ring for their LAN, but that was just IBM being IBM. Word came down from on high that IBM needed a LAN technology, and it must not be Ethernet! They spent the next couple of decades fighting a ridiculous losing battle to prove that token ring was the superior LAN technology, which it definitely was not.
Anyway, this is now getting off topic. To return to my original point, the standardization of LAN and WAN communications protocols meant that global digital connectivity was inevitable, and therefore so was the internet. The internet might have evolved quite differently – for example, it might have been much more commercialized and rigidly controlled by the telcos – but we would have had it one way or another. To imagine a world without internet we’d have to assume the absence of ubiquitous inexpensive computers and standardized networking protocols.

What? In what way? As an Apple user I’d like to know
I have a Mac as my primary work laptop. I have an Android phone. I can move data/photos/whatever between them with little effort. I have mentioned elsewhere how I used to run Windows XP, OS X, and Linux all together, using a software “KVM switch” (Synergy; it used to be free, but I liked it so much I still have a valid license) across three computers to manage client expectations for web dev. Linux was overkill, but the rendering engines for the browsers were quite different in the early second-wave internet boom.
I don’t see an “Apple Island”, really, as much as Apple would like one.

The internet has transformed how we gain knowledge. Before its arrival, most people’s knowledge was limited to their jobs or personal interests, and learning more required effort—through books, professional journals, libraries, or asking experts.
…
I think the main difference is that ignorant people were content to just wallow in their own ignorance, as opposed to having the ability to link with thousands of like-minded morons to validate their moronic ideas.
Again, it depends what you mean by “internet”, but I recall growing up pre-internet and starting my career in the early days of mid 90s dot-com tech. To me, the world seemed more “real” and “tangible” than it does today. Things may have moved at a slower pace, but it seemed more deliberate. If you were doing something cool, it didn’t get blasted all over social media, nor did you have every loser with access to social media trying to get in on it.

What? In what way? As an Apple user I’d like to know.
At work, when Apple users wanted to plug their laptops into a projector, they had to use a special adapter. And of course there are the programs not available on one platform or the other. And the locking in of Apple customers to their hardware.
When I buy a new PC I can switch brands easily - but if I decided to switch to Apple it would be a major pain. And vice versa, I’m sure.

At work, when Apple users wanted to plug their laptops into a projector, they had to use a special adapter. And of course there are the programs not available on one platform or the other. And the locking in of Apple customers to their hardware.
That has been mostly rectified; although Apple is still trying to sell the public on the merits of Thunderbolt (which is a good standard but not widely adopted), they also provide HDMI ports on MacBook Pro and desktop machines (and you can buy a USB-C-to-HDMI adapter). Thunderbolt 4 uses the same connector as USB-C and will interoperate mostly seamlessly with USB-C devices. While you used to be mostly locked into Apple peripherals, there is now a broad market for aftermarket peripherals certified for compliance by Apple, and the interoperability and file conversion problems of pre-OSX Macs are a thing of the past. Except for not being able to use Windows software without an emulator and the limitations on access for upgrade or repair, the interoperability and general access problems are minimal. And while I’ve had some interface design criticisms about recent releases of MacOS, it is far more intuitive and functional than Windows, in my opinion.
I switched from Windows NT to OSX/MacOS after brief interludes with Slackware, Caldera, and FreeBSD, while continually using most versions of Windows at work until end of support (and maintaining Red Hat Enterprise on clusters), so I’m reasonably familiar with the major alternatives (I’ve also run OS/2 ‘Warp’, BeOS, System/360, and AS/400, too many flavors of Unix to count, and of course many versions of the ubiquitous MS-DOS in college and post-college hobbifying). I would rather work on MacOS than any other operating system, and despite the upgrade and repair limitations of Apple hardware I reliably get 5+ years out of MacBooks, iPads, and iPhones. While they certainly have created a ‘walled garden’ that offends my ethical sensibilities, these devices work together with high reliability, which is more than I can say about Google and other Linux-based mobile devices.
Stranger
If we had multiple islands of “internets” I suspect there would be a push towards more interoperability also.
As for me, I’ve always found the Apple GUI unintuitive. But I’ve spent my entire career working on the command line, which has great benefits. I got to teach my daughter, who was doing research on a supercomputer running Linux, how to extract, sort, and filter data all in one UNIX command line. She loved the newfound power.
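In case anyone’s curious, the kind of one-liner I mean looks something like this (the file name and column number here are made up purely for illustration):

    # Count the most common values in column 3 of a comma-separated file,
    # most frequent first. "results.csv" is a hypothetical example file.
    cut -d',' -f3 results.csv | sort | uniq -c | sort -rn | head -10

Extract a column, sort it, count the duplicates, rank the counts, and show the top ten, all in one pipeline and without ever opening a spreadsheet.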

But I’ve spent my entire career working on the command line, which has great benefits. I got to teach my daughter, who was doing research on a supercomputer running Linux, how to extract, sort, and filter data all in one UNIX command line. She loved the newfound power.
I mean, if you want to work on the command line, you can do almost everything you need to do in the MacOS Terminal (which is cromulent and reasonably configurable, if not outstanding) or in one of the third-party command line applications, in which you can run any shell that anyone has ever created. OSX/MacOS is fundamentally a Unix system (well, a FreeBSD-derived userland on top of the hybrid XNU kernel) with a powerful windowing system and graphical user interface layer, and you can literally do anything on it that you can do with any other Unix shell except reconfigure the kernel and core services. (I mean, you can try to do that, but it is very likely to break things.) The main reason I migrated over to OSX/MacOS is that I could run *nix and X Window applications natively and configure the system without delving into the gothic horrors of the arcane and dreaded “Windows Registry”.
If your experience with Apple and their operating systems is OS 9 (ironically advertised as “the best OS ever”, which it definitely was not) and previous, then…I’m sorry? But if you are a Linux user who gets frustrated with having to manually configure drivers for ‘non-standard’ hardware or the still often poor graphics performance of Linux boxes, MacOS is kind of the best of all worlds. Of course, you’re paying for it and you’re buying into Tim Cook’s walled garden, so you’re kind of picking your flavor of evil, but it is at least one case where there is actually an upside to evil, even if I agree with Cory Doctorow on his philosophical (and most practical) criticisms.
Stranger