I wish any other industry had seen the kind of progress we’ve had with microprocessors.
Lol. My first system with a hard drive had 300MB of storage total.
(Goes into Four Yorkshiremen mode)
300MB? What luxury! When the first IBM PCs with hard drives came out, we had a 20MB hard drive, and we were grateful to get it!
PDP-11/45 with a pair of 5 MB disks.
And you had to toggle the bootstrap into the front panel in binary.
Paper tape reader for data transfer.
Ran an early Unix as well, written in C.
2D routing? What an amusing concept. I wasn’t talking about routing difficulty though, but about how routing area takes up space that could be used for transistors.
However I wrote a column, a take on Heinlein’s “And He Built a Crooked House,” about 4D routing. With a similar ending.
As for “can we do better?” - of course we can. No argument there.
No large memory is built these days without redundancy: spare rows and columns that can be swapped in after manufacturing to make the memory work. It is also possible to do built-in self-repair, so memory cell failures in the field can be repaired.
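The swapping can be sketched in C. The remap table, sizes, and function names below are all made up for illustration; this is not how any particular DRAM implements it, just the shape of the idea: a small programmable table sits in front of the row decoder.

```c
#include <assert.h>
#include <stdbool.h>

// Toy model of row redundancy: a decoder consults a small remap table
// before addressing the main array. Entries get programmed (e.g. by
// blowing fuses) after manufacturing test finds bad rows.
#define NUM_ROWS   1024
#define NUM_SPARES 4

typedef struct {
    bool     used;     // spare is allocated
    unsigned bad_row;  // failed row it replaces
} spare_entry;

static spare_entry remap[NUM_SPARES];

// Program a spare to cover a failed row; returns false if none left.
bool repair_row(unsigned bad_row) {
    for (int i = 0; i < NUM_SPARES; i++) {
        if (!remap[i].used) {
            remap[i].used = true;
            remap[i].bad_row = bad_row;
            return true;
        }
    }
    return false;  // out of spares: the part is scrap
}

// Address decode: redirect accesses to repaired rows into the spare
// region (modeled here as rows NUM_ROWS .. NUM_ROWS+NUM_SPARES-1).
unsigned decode_row(unsigned row) {
    for (int i = 0; i < NUM_SPARES; i++)
        if (remap[i].used && remap[i].bad_row == row)
            return NUM_ROWS + i;
    return row;
}
```

Built-in self-repair is the same mechanism with the table writable in the field instead of fuse-programmed at the factory.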
15 years back or so there was a startup building a 100-core processor, and they built extra cores to make the yields reasonable. And big processors and computing systems have error detection and correction. We found that you need to turn off these alerts so the users don’t get freaked out and demand replacement of good parts.
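The detect-and-correct idea can be shown with a minimal Hamming(7,4) code in C. Real memory ECC uses wider SECDED codes across a whole data word plus check bits, and the function names here are my own, but the principle is the same: a nonzero syndrome pinpoints the flipped bit so it can be silently corrected.

```c
#include <assert.h>
#include <stdint.h>

// Hamming(7,4): encode 4 data bits into 7, correct any single-bit error.
// Bit positions are 1..7; parity bits live at positions 1, 2, and 4.
uint8_t hamming74_encode(uint8_t data) {   // data in the low 4 bits
    uint8_t d0 = (data >> 0) & 1, d1 = (data >> 1) & 1;
    uint8_t d2 = (data >> 2) & 1, d3 = (data >> 3) & 1;
    uint8_t p1 = d0 ^ d1 ^ d3;   // covers positions 3, 5, 7
    uint8_t p2 = d0 ^ d2 ^ d3;   // covers positions 3, 6, 7
    uint8_t p4 = d1 ^ d2 ^ d3;   // covers positions 5, 6, 7
    // Layout: byte bit index = position - 1
    return (uint8_t)(p1 << 0 | p2 << 1 | d0 << 2 |
                     p4 << 3 | d1 << 4 | d2 << 5 | d3 << 6);
}

uint8_t hamming74_decode(uint8_t code) {   // returns corrected data bits
    uint8_t b[8] = {0};
    for (int i = 1; i <= 7; i++) b[i] = (code >> (i - 1)) & 1;
    // Syndrome: which parity checks fail; the value is the error position.
    uint8_t s = (uint8_t)((b[1] ^ b[3] ^ b[5] ^ b[7]) << 0 |
                          (b[2] ^ b[3] ^ b[6] ^ b[7]) << 1 |
                          (b[4] ^ b[5] ^ b[6] ^ b[7]) << 2);
    if (s) b[s] ^= 1;  // flip the bad bit back
    return (uint8_t)(b[3] | b[5] << 1 | b[6] << 2 | b[7] << 3);
}
```

The point about alerts follows directly: the correction above succeeds and the program never sees the flip, so reporting every corrected error to the user just causes needless part swaps.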
The question is when we will see these problems with logic. We will someday; I’m just glad I retired before it happens. I’ve seen some proposed solutions, none of them particularly practical. You’re probably safe with bit flips in random logic; bit flips in flip-flops are a bit more problematic. If you’re lucky it will trigger a failure, the processor will get rebooted, and it won’t happen for another year. Track your hardware residing in mountains for an early warning of problems, though.
Reliability as I meant it is distinct from yield issues. I was able to prove that early life failures are mostly the result of defects that don’t get detected by manufacturing tests but which do get detected by system tests. Thus the bathtub curve we saw was from lack of coverage, not reliability. We burned in everything, and it worked pretty well, so I never saw many random reliability failures - since the warranty expired before they happened and it wasn’t our problem. Field failures we did see came from specific problems and weren’t the traditional kind of reliability failure.
In my last five years or so I got to see all the manufacturing test data. Luxury! I know several professors who would have offered their firstborn for it.
Pascal, which was a language intentionally designed on the theory that the language would be complete, naturally always came with smart runtime optimization, resulting in a small runtime size. It could do this because the large infrastructure was part of the language, part of what the compiler did during compilation, so the compiler always knew exactly what was required and had complete control over it.
c, which was a language intentionally designed on the theory that the language was part of a larger system which provided I/O capacity, naturally had poor runtime optimization, always resulting in a larger runtime size. An important distinction is that any sort of function or macro that a programmer wishes to use is general purpose, compiled separately, and linked using only the information available in the header files and library definitions.
*Here I use ‘always’ in the Great Debates sense, rather than in the General Questions sense.
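The separate-compilation point can be sketched in C. In real use the declaration would live in a header and the definition in a separately compiled library object; everything is folded into one file here, with made-up names, so the sketch stands alone.

```c
#include <assert.h>

/* In real use these are two translation units: the caller is compiled
 * against nothing but this declaration (found in a header), and the
 * linker later matches it to a definition in a library. The compiler
 * never sees the body when compiling the caller. */

/* what a header (say, mylib.h) would provide to the caller: */
int scale(int x, int factor);

/* the caller: the compiler knows only the prototype above */
int caller(void) {
    return scale(21, 2);
}

/* the separately compiled definition the linker resolves later */
int scale(int x, int factor) {
    return x * factor;
}
```

This is why the C toolchain can know only what the header promises: the call is emitted against a name, and correctness of the match is the linker's problem, not the compiler's.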
Pascal was designed and written as a minimal teaching language. The first implementation ran on a CDC Cyber series machine, a machine that didn’t even have a stack in the ISA. The Pascal compiler was written in itself but cross compiled from an initial version written in Fortran. It emitted p-code, a code stream that was interpreted by a p-machine, itself running on the CDC instruction set. Famously Pascal didn’t even have a full set of trig functions. Its IO was very limited.
Later Pascal implementations on different machines and from commercial vendors extended the capability of the language to a full system-capable language. I wrote a lot of code in the early 80’s in Pascal for VAX/VMS and that implementation compiled to machine code and was able to take advantage of the entire system run-time library available on VMS. None of that library was part of the language, but since VAX/VMS provided a common call standard it was possible to cross call functions written in different languages (within limits). They did provide a definitional header that defined all the system calls and structures in Pascal.
Most of the VMS utilities were written in Bliss or assembler, and early on some (the file system for instance) were actually run in PDP emulation mode.
In its heyday there were even efforts to microcode a p-machine onto an LSI-11. Pascal ran out of steam pretty quickly, however. Modula-2 and C++ outcompeted it.
The compiler I wrote for my dissertation was for a language I designed that was Pascal-based, and I used the Wirth and Jensen Pascal compiler as the base. First thing I had to do was to get it to run on Multics. They assumed 60 bit words for their implementation of sets. But for some reason they were into two or maybe three letter variable names. If you ever read Wirth’s book on data structures, the variable names in the examples were in the same style as in the compiler.
My language was for microprogramming, so all the stuff Pascal didn’t implement was a win for me.
Yeah. CDC Cyber series was a 60 bit machine. It used a 6 bit character set, so each word held 10 characters, and strings often sat on 10 character boundaries. 60 bit sets make perfect sense. I remember hitting the same limit. Seymour really didn’t care about anything except numerical performance. And he was king of that.
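What a Pascal `set of` bought you can be sketched in C; the names and sizes below are illustrative, with 64-bit words standing in for the Cyber's 60-bit ones.

```c
#include <assert.h>
#include <stdbool.h>
#include <stdint.h>

// What a Pascal "set of" compiles down to: one bit per member, packed
// into machine words. On the CDC Cyber that was one 60-bit word; here
// it is 64-bit words, which is exactly the portability trap described
// above: code that assumed 60 bits per word broke on other machines.
#define SET_MAX   256
#define WORD_BITS 64

typedef struct { uint64_t w[SET_MAX / WORD_BITS]; } bitset;

void set_insert(bitset *s, unsigned e) {
    s->w[e / WORD_BITS] |= 1ULL << (e % WORD_BITS);
}

bool set_member(const bitset *s, unsigned e) {
    return (s->w[e / WORD_BITS] >> (e % WORD_BITS)) & 1;
}

// Set union is a word-at-a-time OR, one instruction per word, which is
// why sets were so cheap for things like scanner character classes.
bitset set_union(bitset a, bitset b) {
    for (unsigned i = 0; i < SET_MAX / WORD_BITS; i++)
        a.w[i] |= b.w[i];
    return a;
}
```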
I can’t believe you’re still going on about this. Let’s recap. This all started when I took issue with your claim that other languages were “closer to assembler” than C, and mentioned in passing that, in addition to language features that are close to machine code, another benefit is that C doesn’t require the kind of runtime infrastructure that other languages do, which is what makes it a good language for systems programming (and embedded systems). You insisted that “The c runtime was always bigger than the FORTRAN or Pascal runtime”.
I tried – and apparently failed – to explain what I meant by “runtime”, and the important distinction between a library of program-level callable functions linked at build time (the C standard library) and the mandatory, monolithic runtime system that is a major component of most other language execution environments. It may seem like quibbling over definitions but it’s a really important practical distinction when building real systems.
I’m sufficiently exasperated at this point that all I can do is give you some links (emphasis mine):
As compared to other languages, C has a very small runtime. And unlike other programming languages, C has absolutely no runtime dependencies.
Why C Continues to Be the Preferred Systems Programming Language
C has a very small runtime. And the memory footprint for its code is smaller than for most other languages.
Why the C Programming Language Still Runs the World | Toptal
The concept of a runtime library should not be confused with an ordinary program library like that created by an application programmer or delivered by a third party, nor with a dynamic library, meaning a program library linked at run time. For example, the C programming language requires only a minimal runtime library (commonly called crt0), but defines a large standard library (called C standard library) that has to be provided by each implementation.
Runtime library - Wikipedia
Pedantically, crt0 is the bootstrap code that calls your C program from the OS and, depending upon the OS and its process model, is responsible for things like setting up the program arguments and finally calling main(). I would not characterise it as the runtime environment, but that might be splitting hairs. The C language expects the basic libc, which is defined as part of the ANSI C language. That is where the intrinsic support operations are defined: memory management (malloc etc.), file IO (stdio), threads (via pthreads), floating point formats and handling, math functions, exceptions, and so on. It is pretty minimal. File IO and socket IO are perhaps the most complex, but they only interface to the OS-provided functionality, and don’t actually implement the capability.
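What crt0 amounts to can be modeled in ordinary C. Real startup code is a few lines of OS-specific assembly; `toy_crt0` and `fake_main` are made-up stand-ins so the sketch can run as a normal program.

```c
#include <assert.h>

/* Toy model of crt0. The real thing is OS-specific assembly that picks
 * argc/argv off the stack, does minimal libc setup, calls main(), and
 * hands main's return value to exit(). Modeled here as an ordinary
 * function calling a stand-in main. */

static int fake_main(int argc, char **argv) {
    (void)argv;
    return argc == 2 ? 0 : 1;  // succeed iff given exactly one argument
}

/* roughly the whole "runtime" between the OS and your program */
static int toy_crt0(int argc, char **argv) {
    /* a real crt0 would also: arrange for .bss to be zeroed (or rely
     * on the loader), set up environ, and run init functions */
    int status = fake_main(argc, argv);
    /* a real crt0 calls exit(status); returned here so it can be tested */
    return status;
}
```

That's the sense in which crt0 is bootstrap glue rather than a runtime system: it sets the stage and gets out of the way.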
I think what was meant was that “The C executable was always bigger…”
That gets you into all manner of weird issues. Most modern OSs don’t place any part of the runtime inside the executable. Early Unix implementations did link the entire mess into one whole; I remember a hello world program under SunOS occupied many megabytes, since it had a copy of the entire runtime in the executable. Yet a VMS hello world was a few tens of bytes long: the OS memory-mapped the runtime into the process. As Unix got better at using virtual memory it quickly went down much the same path. An executable contains a list of the needed libraries, and the loader invokes the dynamic linker to map in all the needed externally provided libraries. You can still force the linker to create statically linked executables. Those residing in sbin are traditionally statically linked, as there isn’t enough of the OS running during the bootstrap process to let them run if dynamically linked.
I can’t believe you’re still going on about this. I tried to explain that Pascal and FORTRAN did not have a ‘monolithic’ runtime, and that the ‘monolithic’ description betrayed ignorance of both Pascal and c.
A discussion of c which excludes things that are created by an application programmer or delivered by a third party has no relevance to this thread, or to … anything …
I’m sufficiently exasperated at this point that all I can do is emphasize that I actually have compiled (and decompiled) programs in Pascal and c and FORTRAN.
I’ll have to root my copy out. I went through the entire book in undergraduate study. 2nd year data structures was the first couple of chapters, then advanced data structures was the next 2, and compiler construction was heavily based on chapter 5. Lordy this was a long time ago. 2nd year was on punch cards on a CDC Cyber 173. We got the Vaxen in 3rd year.
I remember annotating the code errors in the book as we went through.
I think we’re all talking at cross-purposes here, and using terms to mean different things.
There’s a difference between stand-alone applications and applications that depend on a runtime environment.
If you write and compile an application in C or Fortran or Pascal, it functions as a stand-alone executable with no dependencies (other than the OS itself). In other words, if you deliver an application to a client, it will consist of a single exe file.
In the case of other languages such as Java, Python, and .NET, they depend on a set of runtime libraries or a runtime environment which has to be installed on any system where you want to run the application.
Depends what you mean by other than the OS itself. It is usually expected that a Unix OS provides libc and libm for C programs; it needs them to run itself. If you ship a compiled C program it will expect that these exist on the host OS, and they are not statically linked into the executable. If you have custom libraries it would still be unusual at best to statically link them into the executable. Typically you will ship your libraries as dynamically linkable libraries and expect the OS loader to map them in at runtime, setting LD_LIBRARY_PATH to include your runtimes or simply installing them somewhere like /usr/local/lib (or one of the many variants we see in different Unixes).
But for, say, Fortran, the runtime is nowadays not installed by default. So if you ship a Fortran program you will typically ship it without the runtimes and expect the user to have explicitly installed the runtime; it depends on the compiler you use. Having the user run apt-get to install the runtimes for a language is not exactly an imposition, and it frees a software vendor from having to track versions across different OS builds.
For small embedded systems, yes, you will statically link; the OS usually has no dynamic linking capability. But on any system that provides dynamic linking, it is usually expected that the runtime environment is provided locally. This will be true of any Windows or Unix/Linux system.
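The loader machinery being described here is also exposed to programs directly, as dlopen/dlsym on POSIX systems. A small sketch (older glibc needs -ldl at link time; `dynamic_strlen` is a made-up name, and the lookup assumes libc is already mapped into the process, which is true for any dynamically linked program):

```c
#include <assert.h>
#include <dlfcn.h>    // POSIX dynamic loading: dlopen, dlsym, dlclose
#include <stddef.h>

/* Resolve a libc symbol by name at run time instead of link time.
 * dlopen(NULL, ...) returns a handle to the already-loaded process
 * image, which includes libc on any dynamically linked program, so we
 * can look up "strlen" the same way the loader resolves it at startup. */
size_t dynamic_strlen(const char *s) {
    void *self = dlopen(NULL, RTLD_LAZY);
    if (!self) return 0;
    size_t (*fn)(const char *) =
        (size_t (*)(const char *))dlsym(self, "strlen");
    size_t n = fn ? fn(s) : 0;
    dlclose(self);
    return n;
}
```

This is the same expectation the OS loader relies on: the library is assumed to be present on the host, found by name, and mapped in at run time.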
Have a look in /usr/lib (or the equivalent on your Unix variant) and you will find all the libraries, including libc. Try removing libc.so and see how much of your system still works.
You did? If you did, you never posted it. The first time the word “monolithic” appeared in this discussion was when I used it just now. I used it to mean a big chunk of code that needs to be present in its entirety in order to support the compiled program, irrespective of any functions the program explicitly calls, i.e., a runtime system.
A discussion of any language that muddles the distinction between a user-callable library of functions versus a runtime infrastructure has no relevance to the real world. I provided cites showing that your claim that “The c runtime was always bigger than the FORTRAN or Pascal runtime” was flat-out wrong, presumably because of your misconception of what a “runtime system” actually is.
This argument is going around in circles and I don’t think any further discussion would be productive.
I’m sure that’s the case on Linux, but I just checked on Windows, and I don’t see libc.dll or any similar system files that could be identified as C supporting dlls.
It’s a long time since I’ve worked in C or Fortran, but modern Pascal intrinsically expects no runtimes or supportive dynamically linked libraries, either on Windows or Linux. An application may or may not use extra dlls of its own.
In general, in Pascal, there are numerous statically linked libraries (also written in Pascal) that the linker will only include if required.
On Windows it’s called ucrtbase.dll.
IIRC the implementation got as many members of the set in a word as possible, which is perfectly reasonable. I was complaining that the implementation was not the slightest bit portable.
I taught data structures from it, but since that was one of three classes I was teaching for the first (and last) time I didn’t have the bandwidth to look too much at the code.
The department chair reneged on the promise that I wouldn’t have to teach, so I made a deal that if they made me faculty and paid me semi-reasonably I’d teach 3 classes one term and then never have to do it again. One class would have screwed up my research almost as much as 3, so I figured I came out ahead.