Come on, it’s 1986 already; upgrade to a terminal program which supports zmodem! It comes with resume and CRC checking.
I remember scheduling all my downloads for the night and hoping nothing went wrong.
How many of you remember reading this book?
To answer the question about how dial-up works:
The two copper wires that go from your home to the telephone company office were designed to carry voice, and the key thing to understand is that voice comes in the 0-4 kHz voice band. Humans can hear up to 20 kHz, but 4 kHz is fine for voice.
Poor sound and interference are caused by the wires acting as antennas and picking up radio-induced electrical noise. Electrical signals can also leak between lines when they are bundled together on the way into the telephone company office. To solve that problem, the lines were voice ‘conditioned’: filtered to cut out all frequencies on the wire above 4 kHz.
When computers came along, and with them the requirement to connect them, this conditioning presented a problem. 4 kHz is a very small fraction of the frequency-carrying capacity of a copper wire. Telephone companies would take the filters off a line, but that service was rationed, because using the full range of possible frequencies would cause noise on the far greater number of voice lines. A digital line was a premium service priced out of reach of the general public; the expensive digital lines of the time ran at only 64 kbps or 2 Mbps.
However, it was possible to use a device called a modem on a voice line. This takes in digital data from a serial port on the computer and turns it into what sounds like a screeching set of tones that can be understood by the modem at the far end, which decodes it and sends it to the far computer’s serial port. Digital data can be passed back and forth, but it was slow.
A modem is a modulator/demodulator. Modulation is a technique for sending data by changing the signal at one end of a copper cable such that the change can be detected at the other end and associated with a digital value. So what can you change on a copper wire? The answer is three things: the frequency, the amplitude, and something called the phase shift. These are all properties of an electrical wave on the copper wire. Change the amplitude (the loudness) at the sender and the receiver can detect it; the same goes for the frequency and the phase. These changes can be associated with numbers, and you can make more associations if you have several levels of amplitude, several changes of frequency, and several phase shifts. Associate all these combinations with digital numbers and you have a scheme for sending and receiving digital data.
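The idea above can be sketched in a few lines of code. This is a toy illustration, not any real modem standard: two bits of each symbol pick one of four phase shifts and two bits pick one of four amplitudes, so each symbol carries four bits on a carrier tone inside the voice band. All the numbers (carrier frequency, symbol rate) are made up for the example.

```python
import math

# Toy modulation sketch (illustrative only, not a real standard).
# Two bits select one of four phases, two bits one of four amplitudes,
# so each 4-bit symbol is one combination of (amplitude, phase).
PHASES = [0, math.pi / 2, math.pi, 3 * math.pi / 2]
AMPLITUDES = [0.25, 0.5, 0.75, 1.0]

CARRIER_HZ = 1800          # a tone comfortably inside the 0-4 kHz voice band
SAMPLE_RATE = 8000
SAMPLES_PER_SYMBOL = 40    # 8000/40 = 200 symbols/s -> 800 bits/s here

def bits_to_symbols(data: bytes):
    """Split each byte into two 4-bit symbols (high nibble first)."""
    for byte in data:
        yield byte >> 4
        yield byte & 0x0F

def modulate(data: bytes):
    """Return carrier samples for the data, one symbol at a time."""
    samples = []
    t = 0
    for sym in bits_to_symbols(data):
        phase = PHASES[sym >> 2]        # top two bits choose the phase
        amp = AMPLITUDES[sym & 0x3]     # bottom two bits choose the amplitude
        for _ in range(SAMPLES_PER_SYMBOL):
            samples.append(
                amp * math.sin(2 * math.pi * CARRIER_HZ * t / SAMPLE_RATE + phase))
            t += 1
    return samples

wave = modulate(b"Hi")
print(len(wave))   # 2 bytes -> 4 symbols -> 160 samples
```

The demodulator at the other end does the reverse: it estimates the amplitude and phase of each received symbol and looks up which four bits that combination stands for.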
Ever since the early days of electricity, there have been modulation schemes like this. Simply turning the electricity on and off for short durations and using the Morse coding system was one of the first examples, used by the telegraph system. The first practical use of radio for sending messages started simply by connecting and disconnecting an electrical source to an antenna: a spark gap could be used to send Morse by radio. These early efforts gave way to many types of modulation schemes. The first ones were analog, dealing with continuously varying streams like voice. The analog modulation schemes that powered radio and TV could be implemented using regular electronics. Digital modulation schemes are quite recent and had to wait for the development of silicon chips.
Now, the personal computer boom of the 1980s created a demand for modems that could use dial-up lines to connect computers. Chip makers took the same large-scale integration techniques as the makers of microprocessors and applied them to processing signals. This technical progression is somewhat neglected, but it played a crucial role in the development of computer communication and made possible today’s Internet.
There was a race between chip companies to design the codec chips to go in modems that could squeeze as much data-carrying capacity as possible out of the 4 kHz available on a dial-up voice line. How many bits per second? 300, 1200, 2400, 4800, 9600… it topped out at 56,000.
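For the curious, there is a theoretical answer to how far that race could go: Shannon’s channel-capacity formula, C = B · log2(1 + SNR). Plugging in rough, assumed numbers for a decent analog phone line (about 3.1 kHz of usable bandwidth and a signal-to-noise ratio around 35 dB; these are typical textbook figures, not measurements) lands close to the ~33.6 kbps where fully analog modems stalled. The 56K modems only beat that by having a digital connection on the provider’s side, so the signal crossed just one noisy analog leg.

```python
import math

# Shannon capacity: C = B * log2(1 + SNR), with SNR as a linear ratio.
# Assumed figures for a good voice-grade line (illustrative only):
bandwidth_hz = 3100        # usable voice-channel bandwidth, ~300-3400 Hz
snr_db = 35                # a decent line; real lines vary widely
snr_linear = 10 ** (snr_db / 10)

capacity_bps = bandwidth_hz * math.log2(1 + snr_linear)
print(round(capacity_bps))   # roughly 36,000 bits/s with these assumptions
```

Change `snr_db` to see why line quality mattered so much: a noisy 25 dB line drops the ceiling to roughly 26 kbps.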
Now what if you had a codec that could handle not just the bottom 4 kHz band, but each successive band going up and up in frequency in 4 kHz chunks? They did precisely that and produced chips that had 256 of these codecs, each handling a different frequency band. At the other end, in the telephone office, the filters had to be taken off and the copper wire terminated on a device that handles many lines at the same time: a DSLAM.
This is called Digital Subscriber Line, or DSL, technology, and there are various schemes for probing which frequency bands work reliably and allocating them to sending or receiving. Home users, who are very numerous, use Asymmetric DSL, which has more 4 kHz channels for downloading than uploading. This minimises the interference caused by signals leaking between wires at the telephone company end. DSL technologies have developed consistently over the past two decades, and it is quite remarkable how much data they can carry over a couple of copper wires. The copper wire network going to each home is a huge resource, and DSL technology squeezes every bit of capacity out of it. It will last until the nation is rewired with fibre optic cable to the home, or 5G makes that unnecessary.
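To make the band plan concrete, here is a sketch of how ADSL’s discrete multitone (DMT) scheme divides the wire. Real ADSL uses a tone spacing of 4.3125 kHz with 256 tones; the exact upstream/downstream tone ranges below are approximate and vary by standard and profile, so treat them as illustrative.

```python
TONE_SPACING_HZ = 4312.5   # ADSL DMT tone spacing
NUM_TONES = 256            # tones 0..255 span DC to ~1.1 MHz

def tone_freq(n):
    """Centre frequency of DMT tone number n."""
    return n * TONE_SPACING_HZ

# Approximate band plan. The lowest tones are left clear so the 0-4 kHz
# voice band still works on the same pair of wires (that's why you can
# talk on the phone while the DSL link is up).
upstream = range(7, 32)      # roughly 30-134 kHz
downstream = range(33, 256)  # roughly 142 kHz - 1.1 MHz

print(f"upstream:   {len(upstream)} tones, "
      f"{tone_freq(upstream[0])/1e3:.0f}-{tone_freq(upstream[-1])/1e3:.0f} kHz")
print(f"downstream: {len(downstream)} tones, "
      f"{tone_freq(downstream[0])/1e3:.0f}-{tone_freq(downstream[-1])/1e3:.0f} kHz")
```

The asymmetry in “Asymmetric DSL” falls straight out of this split: far more tones are assigned to the downstream direction than the upstream one, and the modem probes each tone at startup to decide how many bits it can reliably carry.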
But I do feel an odd twinge of nostalgia when I hear the coquettish burbling that is the sound of a dialup modem trying to seduce another at some distant location to do the dance and start exchanging data.
I’ll throw you guys another question about modems. What’s different between dial-up modems and broadband modems? Why couldn’t I use my current modem to connect for dial-up internet connection? Like, just connect a phone cord to the back of my modem? Why would I need a 56K modem? Don’t all modems do the same thing?
For the same reason you can’t connect your USB cable to your Ethernet port.
Dial-up modems are designed to connect to POTS (the Plain Old Telephone System). Broadband cable modems connect to coaxial cable. The signals on those two lines are completely different.
Coax if you have cable; if you have a regular phone line, it’s a pair of copper wires.
You can’t connect a voice-band modem to a line conditioned for broadband, because the DSLAM in the telephone company office is expecting a broadband modem, and it expects to decode the data and send it onward to an Internet Service Provider and their data network.
When connections are first made between voice modems, you can hear them having a bit of to-and-fro chirping. Fax machines do much the same sort of thing. They are saying hello, agreeing a speed, and choosing an error correction scheme they both know. Broadband modems do something similar, but they use a different language (protocol).
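The “saying hello and agreeing a speed” step boils down to capability negotiation. Here is a toy sketch of the idea; the message names, speeds, and error-scheme labels are illustrative, not the actual V.8/V.42 handshake messages real modems exchange.

```python
# Toy capability negotiation: each side advertises what it supports,
# and both settle on the best common options (illustrative only).

def negotiate(local, remote):
    """Pick the highest common speed and the first common error scheme."""
    speeds = sorted(set(local["speeds"]) & set(remote["speeds"]))
    schemes = [s for s in local["error_schemes"]
               if s in remote["error_schemes"]]
    if not speeds or not schemes:
        raise ConnectionError("no common capabilities -- hang up")
    return {"speed": speeds[-1], "error_scheme": schemes[0]}

caller = {"speeds": [2400, 9600, 14400, 33600],
          "error_schemes": ["LAPM", "MNP4", "none"]}
answerer = {"speeds": [2400, 9600, 14400],
            "error_schemes": ["MNP4", "none"]}

print(negotiate(caller, answerer))
# -> the link runs at 14400 with MNP4 error correction
```

This is why a fast modem could always talk to a slow one: the negotiation settles on the best the slower side supports, and the chirping you heard was exactly this exchange happening in audio.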
While you can hear voice modems doing their exchange, the most you get out of broadband modems these days is a few blinking lights.
For a start, your current modem doesn’t contain a dialer…
The more critical point is that your DSL or cable modem is a kind of radio transmitter (designed and built for conducted transmission — along your telephone lines). Unless you have a matching radio receiver at the telephone exchange, you can’t communicate.
The old modems were “noise” modems. They transmitted noise you could hear along your telephone lines. Unless there is a modem (or device) listening for that kind of noise at the other end, you can’t communicate.
I say ‘or device’ because a 56K modem is a special case – at the other end there is not a modem but a device that directly decodes the ‘modulated’ noise, instead of demodulating it first.
The better question is “what’s the similarity?” and the answer is “none”, except the word “modem”. The simply stated reason is that they work using vastly different signaling protocols over carrier frequencies that differ in bandwidth capacity by many orders of magnitude, broadband being on the order of thousands of times faster or more. Pushing broadband speeds over voice-grade carriers is something like trying to push Niagara Falls through a straw.
Your DSL modem is connecting to a DSLAM, and the SLIC (Subscriber Line Interface Card) that makes POTS (Plain Old Telephone Service) signaling work isn’t there. There’s an old acronym called BORSCHT (Battery, Overvoltage, Ringing, Supervision, Codec, Hybrid, Test) that describes what’s needed for a POTS connection – more specifically, it’s what’s produced or managed by a SLIC – and almost all of that soup is missing in a DSL connection.
More to the point, WHY would you want to use a dial-up connection today?
It actually goes back a good bit earlier to 1968 and what’s known as the Carterfone Decision. And then there’s Hush-A-Phone, which was another landmark ruling in 1956 that permitted mechanical connections to a phone, which paved the way for acoustic coupler modems where you’d stuff the telephone’s receiver into a pair of rubber cups. Ma Bell was seriously unhappy at the thought of people connecting things to their equipment. The Hayes Smartmodem was introduced in 1981 and was the first directly-connected modem. All of this happened before the Bell divestiture.
ISDN: The backronym I remember for it was It Still Does Nothing. It was confusing to order, challenging to install, and frustrating to use. If you got both bearer channels to stay up for the day, it was a really good day. By the time Bell got the kinks worked out, DSL was on the scene and considerably faster and cheaper.
ISDN was much more popular in the UK and very commonly used to provide dial-around backup for private leased circuits, once router suppliers like Cisco introduced software that worked reliably. It would ramp up and down in 64k increments according to demand. It was also used for voice and video conferencing. Video conferencing was quite a prestige item in the 1990s. Doing it internationally over ISDN was especially difficult with the US, because the US implementation was not as reliably standardised or supported. The result was more than a bit flaky: the quality was rather like some Zoom calls today, but very businesslike. You were shown into a ‘video conferencing suite’ which was outfitted at great expense and connected using ISDN. Modems were very often used by support technicians to dial in and configure equipment, and these were sometimes upgraded to ISDN because it ran a lot faster and connected quicker. ISDN was expensive and got less attention from hackers, who were on the lookout for telephone numbers that connected to modems.
Europe in general seems to like to be an early adopter of nascent technologies that end up going nowhere, even though sometimes they were making technically good choices. A good example was the Open Systems Interconnect (OSI) standard which was far superior to TCP/IP. But TCP/IP was being pushed and heavily funded by the US DoD as part of ARPAnet and its own internal networks, and when ARPAnet transitioned to a public internet, OSI was pretty much doomed.
OTOH, if I’m not mistaken, it was some European company that insisted that DEC provide them with Ethernet over unshielded twisted pair (UTP). DEC engineers believed that the best solution was their proprietary “Thinwire” Ethernet which replaced the big fat traditional coax cable but was still shielded coax, and that UTP was sheer folly. Today, of course, pretty much all Ethernet LANs run over some form of UTP.
As noted, “modem” is just short for “modulator demodulator”. And the physical effect that gets modulated (whether sound waves, radio waves, electrical impulses, light, gravitational waves, or something else) is different between modems. The form of the modulation may also differ (not all cable modems are intercompatible, for instance).
Whether a thing is actually called a modem is also fairly arbitrary and just a historical artifact. An Ethernet card is a modem, since it modulates/demodulates signals to go over twisted pair wires. It’s just not usually called that, for whatever reason–maybe because “Ethernet” or “network card” is more specific. A cell phone has a modem in it to modulate to radio frequencies, but it’s rarely called that, unless you’re talking about a cell connection add-on for a computer. They’re both doing exactly the same thing, but we think of modems as a kind of computer add-on for data connectivity. That’s not the original meaning but that’s how it’s come to be understood.
In short, every long-distance data transmission device has a modem, even if it’s not called that. And none of them are compatible with each other unless they adhere to a specific standard, like 802.11 or DOCSIS.
Europe in general used different exchange equipment, and ISDN was a feature of that equipment. In the USA you could get a T5 connection – an AT&T carrier connection, not generally available in Europe.
The USA in general was an early adopter of nascent technologies that ended up being inferior to later international developments. (color TV and mobile phones are good examples).
And ISDN isn’t a good counter example: it was already a dead end technology that was in the process of being replaced by fibre when suddenly ADSL came out of nowhere and revolutionised the whole business.
This isn’t really right. DSL is a digital medium and it uses phone lines. Cable internet uses an analog signal (yes, to this very day, though on the way out) despite running over coax.
All analog vs digital means is how the signal is encoded. That’s it. Analog and digital signals can coexist on the same medium (wire, fiber optic, radio) at the same time, as long as there is sufficient bandwidth for the signals (here, bandwidth refers to the available frequency range and not upload or download speeds). As long as the signals do not overlap in frequency, multiple signals can be sent down the same medium at the same time. This is called multiplexing.
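Frequency-division multiplexing is easy to demonstrate numerically. In this sketch two “channels” at different frequencies are simply added together on one simulated wire, and each receiver recovers its own signal by measuring power at just its frequency (a single-bin DFT). The frequencies, amplitudes, and sample rate are arbitrary example values.

```python
import math

# Two signals share one wire; each receiver filters out only its own band.
SAMPLE_RATE = 8000
N = 800                      # 0.1 s of samples

def tone(freq_hz, amplitude):
    return [amplitude * math.sin(2 * math.pi * freq_hz * n / SAMPLE_RATE)
            for n in range(N)]

# "Multiplexing": the wire just carries the sum of both signals.
wire = [a + b for a, b in zip(tone(500, 1.0), tone(2000, 0.5))]

def amplitude_at(samples, freq_hz):
    """Recover a signal's amplitude at one frequency (single DFT bin)."""
    re = sum(s * math.cos(2 * math.pi * freq_hz * n / SAMPLE_RATE)
             for n, s in enumerate(samples))
    im = sum(s * math.sin(2 * math.pi * freq_hz * n / SAMPLE_RATE)
             for n, s in enumerate(samples))
    return 2 * math.sqrt(re * re + im * im) / len(samples)

# Each receiver sees only its own channel despite sharing the wire:
print(round(amplitude_at(wire, 500), 2))    # ~1.0
print(round(amplitude_at(wire, 2000), 2))   # ~0.5
```

Because the two tones don’t overlap in frequency, each measurement recovers its own channel’s amplitude cleanly; this is exactly the trick DSL uses to share a pair of wires between the voice band and hundreds of data subcarriers.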
A modem is a device that converts between an analog and a digital signal. Telephone modems converted the digital data from the computer’s serial port into an analog modulated audio signal that could be transmitted the same as a voice signal and vice versa. A DSL terminal adapter uses the same phone lines as voice but takes advantage of high-frequency signals digitally encoded that cannot be heard. A cable modem takes digital data and encodes it into an analog radio frequency signal that is transmitted along the coax cable, and in modern systems, is again converted into an analog fiber optic signal (where the signal is represented by the intensity of the light rather than it being off or on) that is transported to the cable headend, where it is finally converted into a digital signal again.
A modem is not an analog to digital converter, (ADC) nor is it a DAC. It has a digital signal on both sides.
You may read about the ‘analog’ side of a modem, and the ‘digital’ side of a modem. Those are just terms of reference: both sides are digital. Some people still don’t understand that. So they often try to ‘explain’ the naming convention in ways that really don’t make any sense.
In particular, analog vs digital does NOT mean anything about how the signal is encoded, EXCEPT when you use the words just as labels:
‘Analog’ meant it connected to a system designed to handle telephone calls. ‘Digital’ meant that it connects to a system that is NOT designed to handle telephone calls.
So, using those special domain-specific definitions: "A modem is a device that converts between a system designed to handle telephone calls and a system that is NOT designed to handle telephone calls ".
In every other context, “analog” has a quite different meaning, and confusion between the two quite different meanings is just … unhelpful. When using the word “analog” in its normal, general, historical, engineering sense, there are no analog signals on either side of a computer modem.
Well, this isn’t right either. First of all DSL is not a “medium”, it’s a multi-layer signaling, frame-, and packet-level data transmission protocol. The medium itself can be copper, fiber, or potentially other things. And it’s false or at least extremely misleading to refer to broadband cable internet as “analog”. It’s digital in the same sense that cable TV channels today are digital, sending digital bitstreams modulated using quadrature amplitude modulation (QAM) protocols, analogous to the ATSC standard for digital OTA broadcasting. The DOCSIS cable broadband standards also use variants that are all essentially QAM modulation. Pretty much everything on a modern cable service is digital – television, internet, and even phone service (if you subscribe to that). Maybe you’re thinking of old obsolete cable systems.
OK, substitute “customer wiring” for “medium” if you must. It’s no more useful to invoke the entire DSL backend than it is to explain how a CMTS works or how the digital part of the PSTN works when the topic is how the customer’s equipment communicates upstream.
I am not thinking of analog TV. A QAM waveform is continually varying. It is a digital signal encoded onto an analog RF carrier. This is meaningfully different from something like Ethernet, which does not use RF and encodes data in its signal transitions at baseband. This is ultimately a matter of semantics, but I stand behind my description.
This is not the place to get into an extended debate about this, but it’s just wrong. The only difference between classic Ethernet and carrier modulation protocols is that classic Ethernet is baseband, meaning it uses no carrier and depends on time-division multiplexing. The fact that in broadband systems a carrier is being modulated is irrelevant to the discussion; it just means that it’s a broadband rather than baseband physical layer infrastructure. Yes, QAM has two continuously variable carriers, but the actual data that it encodes can be either analog or digital, and in most cases today it’s a digital bitstream. When this is the case, it is a digital transmission system by any rational definition.
Like I said in my other reply, this is semantics. You are basically saying that only telephone modems exist and that there is no other kind of modem. That is not useful. There is no meaningful difference between the concept of converting a discretely encoded data stream from a computer into an amplitude-modulated (partly) audible frequency signal in a telephone modem and encoding it into a QAM-modulated RF signal in a cable modem. Both were created by a DAC and both will be sampled by an ADC. But there is a meaningful difference between that and a medium like Ethernet. And in the end the semantics you are trying to argue is the same kind of thing as when it’s pointed out that digital signals are an abstraction and that no real-world signal can truly behave that way.