What's the relationship between C and C++?

Surprising. At that time I thought most students were learning BASIC as their first language.

I was in grad school when I saw the Bell Labs C Memo, and as a PDP-11 programmer my first reaction was "this is sugared PDP-11 assembler: ++, --, and so on."
But C started as a Systems Implementation Language, based in part on BCPL which was used on Multics.
I was at Bell Labs when C++ came in, but I was already a manager by then so I didn’t get to program much anymore; I did take a class, though. Back then object-oriented languages were the coming thing, and C++ was the first popular one, though not the first. I did a seminar on designing an object-oriented language in 1976 at Illinois, and the language I did for my dissertation was an object-oriented microprogramming language, published in 1980.

I was doing physics and computer science, and Fortran seemed to be their language of choice. I think I only learned BASIC when the first PCs came out.

I don’t think any student (in the sense of a proper student of computer science) would have ever been taught BASIC. It was fine for hobby programmers and for high school students doing a computing elective or the like. But as to using it as a formal teaching language - yuk! BASIC was a cut-down, Noddy (Mickey Mouse for those in the US), FORTRAN-like language. You either got taught Fortran or, when it caught on, Pascal.

Back then you could solve problems much faster with BASIC, since you used BASIC on an interactive time-sharing system, whereas with FORTRAN you submitted punched-card decks to the big mainframe and found out a couple of hours later that you had made a typo on the 50th card.

Unfortunately true; I’ve written many powerful apps in VB, but I tend to feel judged if I mention that to anybody else with a programming background!

I remember starting at Uni and a lecturer shaking her head when she asked how I first learned programming, and I told her that I self-learned QBasic.

She then proceeded to teach two semesters of Smalltalk, where we learned to hate OO (I’ve been very much a fan of OO for many years; that was just a horrible intro) and only ever wrote programs that performed trivial functions yet were difficult to understand. Progress!

Very helpful replies, thanks a lot everyone. I don’t really have specific use cases or projects in mind that I want to do; it’s more that I want to broaden my scope a little on top of the Python I’ve done so far. Compilability is a criterion for me, not because of any requirements for the programs I want to write but simply to explore it. I see that the difference between interpreted and compiled languages might not matter much in the end, but I thought that since old-school programming is compiled I might as well try that just to see what it’s like.

I’ve been playing around in C these past few days, and my first impression is that things such as getting the number of digits in an integer or getting the nth digit of an integer, for which Python provides straightforward ready-made functions and indexing, already require a lot of cumbersome loops and modulo operations in C. But I guess that’s just part of the closeness to hardware that people have mentioned, and it’s interesting to see what is going on under the hood of a simple ready-to-eat Python function.
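For instance, something like this rough C sketch (the function names are just my own illustration) shows the loop-and-modulo approach I mean, where Python would just be len(str(n)) and str(n)[i]:

#include <stdio.h>
#include <stdlib.h>

/* Number of decimal digits in n (Python would be len(str(abs(n)))). */
int digit_count(int n)
{
    int count = 1;
    n = abs(n);
    while (n >= 10) {
        n /= 10;
        count++;
    }
    return count;
}

/* The i-th decimal digit of n, counting from the left, 0-based
   (Python would be int(str(abs(n))[i])). */
int nth_digit(int n, int i)
{
    int k;
    n = abs(n);
    for (k = digit_count(n) - 1 - i; k > 0; k--)
        n /= 10;
    return n % 10;
}

int main(void)
{
    printf("%d has %d digits; digit 2 is %d\n",
           31415, digit_count(31415), nth_digit(31415, 2));
    return 0;
}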

You’re both right!

Like many people of my era, my first experience with programming was in BASIC. But that was on my own, on home computers where BASIC was the default. When I got to college and took computer science classes, the introductory language was Pascal, which had many of the features and structures that BASIC lacked and that serious programmers and CS students would be expected to become familiar with (like functions and parameters, or data types and structures).

Later, versions and variations of BASIC came along that did include some of those features.

In a more literal sense, I believe, the relationship between C and C++ was created by a translator (often loosely called a preprocessor): a computer program (which could be written in any language) that takes as its input the text of a program written according to C++ rules and generates as its output the text of another (generally significantly longer) program written according to ordinary C rules. Or at least when C++ was invented, this was how it was done.
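Very roughly, and with invented names, that translation turns a class method into a plain struct plus an ordinary C function that takes the object pointer as an explicit argument. A sketch of the idea:

/* Hypothetical C++ input:
 *
 *   class Counter {
 *       int value;
 *   public:
 *       void add(int n);
 *   };
 *   void Counter::add(int n) { value += n; }
 *
 * Roughly the kind of plain C a C++-to-C translator might emit
 * (names invented for illustration):
 */
struct Counter {
    int value;
};

/* The member function becomes an ordinary C function whose first
   argument is the (previously implicit) object pointer. */
void Counter__add(struct Counter *self, int n)
{
    self->value += n;
}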

A good example is how C handles things like linked lists versus other languages. Other languages basically give you methods to manipulate them, while C lets you directly manipulate the pointers (actually, I think it requires you to do so).
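A bare-bones sketch of what that looks like (names mine), just prepending a value to a list in C, where you allocate the node and rewire the pointers yourself:

#include <stdlib.h>

struct node {
    int value;
    struct node *next;
};

/* Prepend a value to the list and return the new head.  In most other
   languages this is something like list.insert(0, v); in C you allocate
   the node and hook up the pointers by hand. */
struct node *push(struct node *head, int value)
{
    struct node *n = malloc(sizeof *n);
    if (n == NULL)
        return head;   /* allocation failed; leave the list unchanged */
    n->value = value;
    n->next = head;
    return n;
}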

While C may be a simple language, it does require some advanced understanding of a lot of concepts. Pointer arithmetic, machine architecture, memory management, etc. If you are new to the language, misunderstanding some concepts can cause bedeviling bugs that will drive you crazy.
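A classic example of the sort of thing I mean: this compiles without complaint, but the loop writes one element past the end of the array, and the symptom (if any) can show up somewhere else entirely:

#include <stdio.h>

int main(void)
{
    int a[4] = {1, 2, 3, 4};
    int i;

    /* Off-by-one: i <= 4 runs the loop five times, and a[4] is one past
       the end of the array.  That is undefined behaviour: it may "work",
       crash, or quietly corrupt whatever happens to sit next in memory. */
    for (i = 0; i <= 4; i++)
        a[i] = 0;

    printf("%d\n", a[0]);
    return 0;
}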

I started out learning BASIC and assembler back in the day, then moved to C, Modula-2, Pascal, C++, JavaScript, C#, then Java.

My recommendation, if you want to be in the Microsoft ecosystem with their free Visual Studio, would be to learn C#, especially if you want to build native Windows apps. The Windows Presentation Foundation is pretty good for UI work, and you can learn all the professional stuff like unit tests and version control in that environment.

But I think the best way to go would be to learn Java. There are free IDEs like Eclipse you can use, and GitHub is an awesome code repository/version control system that is also free. Java has memory management and a huge number of available classes for doing all the grunt work. You can write code for multiple platforms with it. And there are tons of good Java tutorials online.

One thing that should be noted is that “object oriented programming” (OOP) is more of a philosophy than a particular programming methodology, and every programming language implements its version of OOP with a different set of mechanics. If you learn OOP in Python, for instance, you are going to be quite disappointed if you then try to transfer your learning to C++ and discover that it lacks many capabilities you would consider critical, and something like Smalltalk or Objective-C is going to look pretty alien even though they aren’t all that different in what can be done with them.

OOP is one of those things that seems really ‘important’ when you are first introduced to it, but it is really only helpful if you are doing a certain type of application programming: particularly wide-scale projects where your code may need to integrate with other existing components or hypothetical future projects, where everything is controlled via interfaces without insight into the particulars of each part. The cost of doing that is a lot of overhead and having to design interfaces in a very specific, tightly controlled way. If you are building an enterprise-wide system or some kind of extensible operating system then it makes sense, but for most general programming (especially anything a hobbyist would be doing) it is largely unnecessary, or at least adds extra burden. Not that it isn’t good to understand the principles and learn them in a particular language (and if you are going to do that, Python is one of the best), but don’t expect that it will substantially reduce the learning curve when doing OOP in a different language.

I know many people believe that Fortran is a “dead language”, largely based on the fact that it is no longer in use for general-purpose scientific data manipulation and visualization (for which it has been replaced by applications like Matlab, LabView, Python/NumPy/SciPy, et cetera) and was never in use for general computing or any kind of web application programming. But Fortran is the foundational language for much of the high-performance engineering and scientific data tools in use, from finite element analysis (FEA) and computational fluid dynamics (CFD) solvers to astrodynamics and hydrodynamics codes used in astrophysics and plasma/fusion physics. The widely used BLAS and LAPACK numerical libraries are written in Fortran, as are the global climate circulation models run on the world’s most powerful supercomputers, and unless you are using a high-performance computing tool that has been written completely from scratch in the last twenty years, there is a high probability that at its core there is a substantial amount of Fortran code which needs to be maintained and updated. Most engineers and scientists who do not work in the development of these codes will have at best a passing familiarity with Fortran, but there is a whole subset of the scientific computing community that still actively uses Fortran on a daily basis.

@Francis_Vaughan is correct that the interpreted vs compiled comparison doesn’t really have much validity today for the vast majority of uses. Computing hardware is fast enough, and with a language like Python you can either access libraries (typically written in C or, yes, Fortran) to do the heavy lifting or pre-compile Cython code, to the extent that performance differences are pretty negligible. This isn’t to say that you want to write a finite element solver in pure Python, but you can certainly write the front end for it, including GUI applications, and then have your external libraries that are optimized to do all of the computationally intensive work. The only application I can think of where having an interpreted language is a hindrance is in safety-critical real-time embedded applications where you want to run with minimal hardware or latency; even for that, you can build a Raspberry Pi or BeagleBone-based application using an interpreted language that is as responsive as the attached physical hardware will be.

If you want to do some kind of programming with a graphical interface, I honestly wouldn’t bother with VB.NET or some other OS-dedicated framework; there are so many different libraries and frameworks that will let you present your interface as a web browser application portable to any platform that, unless you need to work in a specific environment, it just isn’t worth being hamstrung and having to carry the overhead of bloated and overpurposed general application frameworks. It is far quicker to gin up a pretty sophisticated graphical application in Flask or Pyramid with Python than it is to even write a “Hello, World” application in VB.NET. Again, the only real reason to work in an OS-specific application framework is if you are doing something like writing applications for your phone/tablet, or working on some kind of commercial application like an office productivity suite or a CAD tool.

Stranger

Hmmm … I’m not sure that I fully agree with this. It may matter less in these days of computers that are mostly very fast relative to our typical everyday needs, but it certainly mattered a lot years ago when computer cycles were precious. The fact is that an interpreter is always going to be slower than native code; exactly how much slower depends on what kind of intermediate code, if any, has been generated. If the thing being interpreted is the original source code itself, as was the case with early LISP interpreters, then it can be very slow indeed. If it’s some intermediate code that maps more closely to machine instructions, it will be correspondingly faster.

I’m unimpressed by the claim that sometimes processors themselves are “interpreters”. To me that’s stretching the meaning of the term. Sure, a great many CISC processors were microcoded, and the instructions could be said to be “interpreted” in firmware, but the actual hardware was extremely specialized for that purpose and the firmware architecture itself was equally specialized and exotic (i.e., it had no resemblance to a general-purpose instruction set). Microcoded processors certainly ran slower than hardwired ones, but the firmware was so close to the hardware that they were at least in the same ballpark. Whereas a wholly interpreted high-level language – I give you as an example FOCAL on a PDP-8 – was orders of magnitude slower than machine language, to the point that it was sometimes just impractically slow.

So you’re implying that those who remember FORTRAN-77 are old geezers? How about those, like me, who remember FORTRAN II? :grinning:

OK, it’s true, but FORTRAN IV existed even in my youth. But for a long time FORTRAN II was all DEC could do on the humble PDP-8. An excellent version existed on the PDP-10, and FORTRAN IV came along for the PDP-8 with the much more advanced instruction set available with the optional FPP-8 floating point processor (which wasn’t just a floating point add-on – it actually had a full Turing complete instruction set). Speaking of interpreters, because the FPP-8 was so expensive, DEC wrote an FPP-8 interpreter for the PDP-8 so that anyone could now write FORTRAN IV programs for the PDP-8. Of course, it ran like a pig. :wink:

I completely disagree with that. Object-oriented design works well at every scale. I just did a little microcontroller project in C++ for an LED lamp, and I certainly could have just written a big file with a bunch of functions. But it was just better to build objects and interfaces for the lamp, the effects, the commands, etc. It’s easily extensible, very understandable, and maintainable.

There’s actually not all that much code, and it certainly could have been written without objects, but doing it the object way is just superior. If I want a new effect I just add a new effect class, and I can tie it into the command system easily to add the UI for it. I built a little phone interface using REST calls, and none of the code had to change anywhere other than the command class.

If you are going to go to the trouble of learning a compiled language, and not just hacking out 100-line Python scripts or something, learn Object-Oriented Design.

C++ is too ugly to be a Ferrari. Also, a Ferrari isn’t actively trying to kill you.

Maybe a Bugatti Veyron.

OK, here’s the “Hello World” program in C:

#include <stdio.h>
int main() {
   // printf() displays the string inside the quotation marks
   printf("Hello, World!\n");
   return 0;
}

And here it is in C#:

// Hello World! program
namespace HelloWorld
{
    class Hello {         
        static void Main(string[] args)
        {
            System.Console.WriteLine("Hello World!");
        }
    }
}

I think programming languages have tended to build up more and more baggage. I did think C# had a lot to teach me, but I also fell in love with FORTH:

: HelloWorld ." Hello World!" CR ;

Is there ANY design that works well at any scale?

It’s great to see how elegantly simple modern languages have become! :smirk: :grinning: :grinning:

In order to output “Hello world” in FORTRAN, back in the day, I would have had to type all this:

     WRITE(6,100)
100  FORMAT(' Hello World!')

There are many different sub-fields of programming, and people who know a lot about one kind of programming don’t necessarily know much about other kinds.

The languages, the skill sets, and the whole approach tend to be different in different environments.

I’d say the major fields today (with some cross-over) are:

  • Database and commercial
  • Web and internet (front-end and back-end)
  • Mobile apps
  • Desktop apps
  • Scientific and engineering
  • Embedded systems and low-level programming in general
  • Game development and high performance graphics
  • Cryptography and security
  • Networks and communications
  • AI, robotics, virtual reality

I’ve probably left out a few, but there are any number of niche areas, niche languages, and specialities.

Perhaps you might like something like Julia. It doesn’t have quite the ecosystem of Python, but it’s definitely gaining ground in some Python-centric verticals.