How do I become a good programmer? Some advice needed

I have rather recently decided that I want to take up computer programming as a hobby in my spare time.

However, I am hopelessly lost when it comes to deciding what to do next.

My basic plan is that I think I should start learning some languages. But there’s just so many (or so it seems to me - a complete newbie) that at the end of the day I don’t know where to begin.

Realistically, all of this would have to be self-taught. I’ve heard (and read) that some of the best programmers are actually self-taught, but in any case that’s the only realistic option available to me at the moment.

Okay, so I have a slew of questions for the programming Dopers here.

  • When beginning my programming “career”, what language is it advisable (given the nature of being “self-taught”) to start off with?

I know that some languages are more difficult than others, and I have been advised never to start off with C or its variants. So what would be an ideal starter?

  • How many languages is it realistically possible to learn?

A few people I have known seemed to know at least 6 different ones. Is there an “invisible” cap on the total number of languages that you can digest?

  • Is it worth knowing a large number of diverse languages?

Obviously I know that this will depend on what you use them for. However is it correct to assume that for the vast majority of tasks I wish to accomplish, a certain spread of languages (e.g. Java, Perl, HTML, C, Linux) will be sufficient?

  • How much does my hardware know-how figure into this equation?

This splits into two parts:

a) Do I need a particularly powerful computer to run any language(s)?
Reading around seems to indicate that I need nothing more powerful than an ancient 486 to use any and all of these languages; however, I’m no techie, so I turn to you.

b) When programming, can the software overcome any of the weaknesses of the hardware?
This question may seem strange, but here’s kinda what it means. See I always reckoned that languages are standard, regardless of how powerful the machine you program 'em into. So my worry is that I won’t be able to do many of the things that someone with a more souped up computer will be able to (programming-wise). Are my fears unfounded?

  • How do I learn?

Meaning should I just get out some books from the library, and learn how to program from them? How did you learn? Are there any good web-tutorials?

I’m preferably looking for FREE information here, but I’d like any recommendations for learning materials you could give me. So this can include paid for stuff too.

  • How long did it take YOU to become proficient in a certain language?
  • Is it more a case of knowledge or practice?

I’ve always said that if you want to learn a new (spoken) language, you’ve got to practice speaking it. In the same way, is learning programming more about the practice of writing good code than it is about simply reading texts?

  • Anything else I should know before I begin?

Lots of questions here. For what it’s worth, I’ve been programming for about seven years now, so I like to think my comments are relevant.

Java’s a good starting language because of all the GUI facilities it offers. C/C++ are definitely viable options for a beginner. You were recommended to avoid them because they require you to manually manage pointers, which some folks find difficult. However, you’re eventually going to have to learn how to do that, even in Java, and it’s probably easier when all the pointers are explicit.

Many. There’s one underlying paradigm to a lot of languages, which can be viewed as dialects of that fundamental approach. Some features vary, and the syntax is always different, but there’s that fundamental sameness that makes it easier.

Granted, keeping track of the difference in features can be pretty tricky. Most people have experience with a lot of languages but are probably only fluent in two or three.

Um, Linux is an operating system, and HTML is a document markup language. You might want to brush up on that kind of stuff.

It really depends on what you want to do. C++/Java will get you very far, but it’s definitely good to know other languages if you’re looking to make a career out of it. Perl and SQL are also good.

Very little unless you’re writing drivers.

In C, I can give you a program that will run quickly on a 286. I can also give you a program that will take years to complete on the supercomputer of your choice.

You’re fundamentally limited by your hardware speed, but you can make the best of what you have by programming intelligently and being patient.

The absolute best thing you can do is to have someone teach you. It doesn’t have to be a class–offer one of your programmer friends money to sit down with you occasionally and explain stuff. It’s really worthwhile to pick a good book, too. I can’t give you any recommendations because I’m university-taught and none of my language-specific textbooks would be worth much without a class to go along with them.

What does proficient mean? From the day I started programming in C++, I was able to turn out some kind of program, but I’m still learning (and sometimes re-learning) about everything the language does.

Practice, definitely. Reading texts counts for a lot less than you might think.

Do you want to be a programmer or an engineer?

Programmers IMO are the people who write code from detailed specs. They are fairly low on the food chain. Engineers understand the problems and come up with solutions. They are far more valued.

Aside from using a language to learn, I wouldn’t concentrate on specific languages. What is important is that you understand the concepts, the specific languages can come along later. It would be good to have knowledge of object oriented programming, so Java or C++ would be good languages to learn those concepts, but I wouldn’t be too concerned with which one you chose.

It’s worth having a diverse toolbox, but if I see someone with lots of languages on their resume and no practical experience, I will assume it’s all book learning and essentially useless.

Ignore hardware unless you want to be a hardware engineer or tech ops. There’s little or no need as a developer.

Take some courses on theory and design, not “How to Program” courses. When hiring I’m not usually looking for someone who knows how to use a hammer. I want someone who understands what goes into building a house, even if they can’t do it all by themselves yet.

So you mean for example, a computer science degree, as opposed to Java 101?

A CS degree will invariably contain Java 101 as part of its curriculum. But don’t worry too much about that right now–you’re not going to get very far without knowing how to code. Learn that first.

Told you I was a newbie.

Can you expand on that a little? Is it some complex algorithm?

OK, thanks.

Look at MIT’s OpenCourseWare computer science curriculum. The first course uses Scheme, which is a functional language. And yes, despite the inevitable complaints of the undergrads in the intro course I’ve TA’d for (not at MIT), Scheme (and functional languages in general) are an excellent place to start. You can download Dr. Scheme for free for many different platforms.

No, it could be something very simple, like enumerating every possible permutation of the numbers 1 to 1000.

In a lot of cases, complex algorithms are faster than simple ones. There has to be some payoff for the extra complexity, or no one would bother writing them.

Software engineer here, with ~13-15 years experience, depending on how you count. “Self taught” in the sense that the last programming language class I had was Fortran in high school in the mid-eighties. So yeah, it’s possible to teach yourself, at least to the point where you can get an entry-level job coding.

I’d start with a language that includes a GUI-based IDE. In simple terms, that means a language that comes with a front end that steps you through stuff, allows you to design screens with a minimum amount of hassle, etc. Java & Perl and all those freebie languages are powerful, but IMO very hard for a beginner to pick up when compared to something like Visual Basic. You can probably pick up a used copy of Visual Basic off eBay or something for very cheap, along with a few VB books to get you started.

Another idea would be MS Access. You can learn a lot about programming with it, as well as database skills, which is good for any coder.

Yeah, I’d stay away from C or C++ to start out with. They’re much more powerful than VB and Java, but there are more gotchas as well. C#, however, would be good for a beginner.

Programming, once you get beyond the basics, is more about logic, the ability to model data, and debugging skills than about the specific language. Once you know a few languages, picking up a new one is relatively easy. It’s the logic and all the other stuff that takes years to perfect.

Case in point: in the last 18 months, I wrote a C# application from start to finish on my own, and then got a job doing Java programming where I’m now tech lead for a project. I’d never touched C# or Java before I got either of those jobs.

Sure, but like I said above, they’re easy to pick up after the first few.

Well, I don’t know how much you know about hardware, but in general most programmers are fairly clueless. Before I started programming I was a hardware type person, and used to make fun of programmers who couldn’t do the first thing when it came to hardware. A decade later, I’m now more or less one of those programmers. The hardware has changed so much that my knowledge of how things work is very out of date. On the other hand, I know much more about software guts. It’s funny, but the two are not very related.

As far as how much hardware you need, you don’t need much to learn how to program. You can use a fairly outdated computer with no problem.

Start with books. Find some good forums where you can post questions. If you can find a mentor, go for it.

I do believe that once you get to a certain point, you’re not going to be able to go very far past it without finding someone to work with you. I’ve seen self-taught programmers who work on their own for more than a year or so; skill-wise, they don’t compare to people who’ve worked in teams with people who know more than them. The worst case is people who never learn basic skills, and produce really, really bad code, because a whole portion of their knowledge base is missing.

Note that the above applies equally to people with Computer Science degrees as self-taught people. I do think that if you’re motivated, you can teach yourself enough programming skills that you’ll just about equal someone fresh out of school with a CS degree. You’re both green newbies :smiley:

Good question. I started out with DB type languages - FoxPro and dBase and such. Those were pretty easy to get OK at. Then I moved to C, which was harder. I was productive with C after about 2 weeks of study (I got a job where they agreed to work with me to learn the language.) I was proficient within a couple of months. I wouldn’t say I was really good at C for another couple years. I then spent another 5 or 8 years doing C and C++ and getting really, really good at 'em (if I say so myself!)

Remember that a lot of that time was me learning how to program in general. I now can take the skills I learned with C and C++ and immediately use them in any language, even if the syntax is different. So nowadays I figure I can be immediately proficient in just about anything.

Both, except I think programming is more like riding a bike than speaking a language; once you learn how to program, it’ll come back even if you don’t do it for a few years. Still, you don’t get good at programming by reading a book about it; you have to do it. There’s nothing like really writing code to learn a language; in fact, I don’t think you can do it without writing code, and writing lots of it.

Remember that writing the code is only about 25% of the job; design and debugging are the majority of the work. I’m currently mentoring two junior programmers, and I’ve found the real difference in our skill sets isn’t the code we write, but the way I can design and debug. I can home in on where the bugs are in code in a fraction of the time that it takes them, and what seems obvious to me isn’t to them. I’m not criticizing them; it’s simply a skill that comes with practice, and I’ve practiced more than they have.

Anything else? Prepare to be frustrated. Prepare to want to throw your computer out the window. Learn logic, and learn how to look at everything in the smallest detail. Remember that writing the code is the smallest portion of time; the code that takes you an hour to write might take 3 (or 6, or 8) to debug. Learn when to ask questions (and find someone who you can ask). And have fun!

As an addition to the advice given above, one thing I’d recommend is something that isn’t programming per se, but it will help you immensely down the road:

Play around with logic-based puzzles and experiment with writing down directions on how to do something as clearly as possible. (A classic example is to make a paper airplane, then write down the directions for how to make that specific paper airplane. Give the directions to someone else, but don’t let them see your original airplane. See if, by following your directions, the other person can duplicate your airplane. If a person, who can jump to logical conclusions and draw on previous experience, can’t follow your directions, then how could a computer, which takes everything exactly as written, follow them?)

Why? The idea is to get your mind trained to think like a good programmer. Making sure that you can think something through logically and clearly will help when you deal with ‘large’ programs, and learning how to quickly spot patterns and similarities will save time, effort, and lead to better code later on. It is better to get good habits now, rather than try to break yourself of bad ones later on if you find your hobby leading you to larger programming efforts. :slight_smile:

I know there are various rants in the Pit about the code beginning (and so-called ‘professional’!) programmers write; you may want to seek these out to get hints and tips about what not to do.

Lastly, as to the question of knowledge versus experience: yes. :slight_smile: Knowledge will get you going (learning how to print something to the screen, etc.), and experience will help you learn what to do when your knowledge fails you.


<< Programming is an art form that fights back. :wink: >>

I have a couple of things to add, though I’m by no means the most experienced person here.

About the diversity of languages - languages are divided up into a couple of different types. Once you’ve learnt C++, Java will be quite easy to learn because they’re both object-oriented languages. However, learning Prolog might well be a mind-boggling experience, because it’s a logical programming language and doesn’t have all that much in common with C++ or Java. I think the different types are procedural, object-oriented, logical, and functional. If you want diversity in what you learn, it’s more important to spread the languages you learn over these groups rather than just learning heaps of languages. The first two are the most useful to learn, by most people’s reckoning at least.

The other thing was that you should try to find a textbook with lots of info about coding style, and follow its advice. It might seem odd to a non-programmer that there’s such a thing as coding style, but it’s very, very important. However, be wary of online tutorials and quick courses, because they tend to focus only on the concrete details (in my experience), and always doing that will make you into a bad programmer.

~ Isaac

There are also dataflow languages, but those are unusual.

Steve McConnell’s “Code Complete” is the essential text in this regard.

I’m going to take a different tack: what do you do now, and do you enjoy it?

If you look at Telemark’s post, you will note that he distinguishes between a programmer and an engineer. This indicates that he works on the “tech” side of the house.

On the other hand, on the “apps” side of the house, no one gets called an engineer, we are analysts.

(Business) applications are used to drive commerce: accounting, sales, marketing, payroll, etc. If you think you might enjoy crawling around in the guts of an operating system, then you should pick a “tech” career. If you think you would prefer to solve data problems for people, you might consider the “apps” side of the business. They are both equally (dis)honorable professions, but it helps to know which direction you’d like to go.

This does not really change any of the advice that has already been given. You need to get familiar with at least one language so as to know whether you have the oddly wired brain that enjoys programming. However, as you begin to look for directions to take, be aware that there is no such thing as a “programmer” any more (if there ever was). All programmers have to do some amount of specialization in order to keep up with new advancements and the guy who does data mining for the CEO is going to be different than the Sys Admin who will be different than the guy who writes or tweaks operating systems who will be different than the guy who knows all about communication but could not care less what is being communicated. Then there are specialty jobs such as game writers. It’s a big field.

How far do you want to progress as a programmer?
Personally I wanted to be a badass, make-the-computer-stand-up-and-sing programmer, so I started with assembly. But only about 0.1% of programmers in their twenties know assembly (and fewer still well enough to reliably beat a C compiler, say by 2x), as it is too arcane to be applicable to 99.9% of all modern programming. But knowing it makes learning C a snap, and you learn how to make code run better.
But I only recommend it for me as that’s what I wanted to do. For most people I usually say start with Java or C#.

Any language is fine to start with, it’s just a question of how you want to progress. C would be good if you want to start towards the deep end, and Visual Basic if you wanted to start with something easy. Both can get you to the same position–it’s just a question of what works for you and whether you are willing to progress into using stuff that you don’t inherently like.

How much time do you have?
I know at least 10 or 12 and have only been programming for 4 years.

No, but there is on how much you can remember without a refresher or good reference book.

Yes, and necessary if you want to be a hardcore coder. There are Java Programmers who can only do Java and essentially refuse to use anything harder, or VB programmers, etc. They find employment just fine I’m sure, but will never be as proficient or as sought after as someone who will pick up new languages at need.

There is a certain core set of languages that, right now, would give you pretty much all the knowledge you need to do most anything if you knew them; or at least they would be close enough to whatever else you needed to use that you could move into it very easily.

At this moment, I would say:
C
C#
Python
SQL (PostgreSQL is a good free implementation)
PHP/HTML (HTML itself is not a language, but you have to know it to use PHP)

None really. Having an understanding of some hardware issues can help you make code that runs faster, but in most instances people are more concerned that your code runs correctly than fast.

No. A language can make code that isn’t particularly efficient, but that’s a matter of inefficient coding not hardware. Of course, fast hardware can (though not necessarily will) make your code run faster–but that’s just a bonus not a necessity.

Certainly. When people started making 3D games, they had to do all of the math by themselves, now there is hardware that will (to an extent) just take a list of 3D coordinates and colors and figure out how to turn that into a 2D image for you. But this will always be true, sometimes there will be nifty toys you can use, other times there won’t. And sometimes something is just impossible.

You will be able to do everything that you could do on a better computer, it just might take your computer longer to process it.
Computers don’t get bored. If you tell it to do something 8 billion times, it will do it 8 billion times regardless of whether it is a supercomputer or a PDA. But the PDA will take a few hours, whereas the supercomputer will finish in half a second.

Computer books are expensive, but I would recommend buying. Things go out of date so fast that any book older than a year will already start not matching what you see on your screen (or at least verify that the version of the language you’re using matches the book’s).
But yep, I bought books on various languages and went through them and made stuff that worked.
There may be good web tutorials–but generally these don’t have enough in-depth info to be of any use to a beginner. A guy who knows several languages can go through a tutorial and get some good info, but he’s just looking for the little thingies that are different from the languages he knows.

Library will work, you just want to watch out for the print date.
Also, Visual Basic and C# aren’t free. So you might go for Java over C# (they’re essentially the same) as you can get it for free.

You’re always getting better, so proficiency is relative. And personally, learning programming was like learning how to chew for me–so I’m probably not a good benchmark to compare against.

Probably more practice–but only where you realise what you are practising and actively consider what the good and bad bits were, before and after you’re done.

See if you can get usenet. Then find the group that discusses the language you’re studying and ask lots of questions, and try and answer the questions that other newbies ask.

There’s been a lot of good advice here from some very knowledgeable people, but based on my reading of the OP I think you need to back up the train a little bit. There are a few concepts and definitions you should focus on grasping before you start talking about what languages to learn and how to “be a programmer”.

I say this because some of the questions you ask are a little vague.

First off, I think you should focus on getting a top-down understanding of how a computer works. Nothing too technical, just learning what all the components do and where a machine’s physical limitations arise. This won’t have direct bearing on the practice of programming, because as has been said, hardware is almost completely irrelevant to writing code. In this analysis I’d expect you to learn the relationships between different tasks, components of which will be addressed in learning to code.

For example, once you understand how a computer (desktop) that is connected to the internet talks to another computer connected to the internet (server) – I’m speaking of a high-level understanding here, you don’t need to delve into ports and packets etc. – you’ll have an easier time understanding how VBScript with ASP works alongside Java and SQL. All three are different languages, but more importantly they function differently and serve very different tasks. Learn the definitions of all those tasks and which languages apply to each, and you’ll end up with a better idea of how to teach things to yourself.

The next step isn’t digging into syntax and code theory. That’s probably going to leave you lost, and even if you do understand it, you won’t have any practical knowledge on the subject. The first step of “learning to program” is to learn how computer logic works. Things like boolean math, conditional statements (if-then, do-while) and the concept of datatypes and objects. Once you know these concepts, most languages come easy. Being a “good programmer” and “knowing languages” are very rarely related. Knowing how to program is a very important skill that too few code-monkeys have, whereas knowing a language basically amounts to having all the syntax memorized. A good programmer can grab a reference book and write a program in a language he’s never heard of before just by looking up syntax. So, let’s move away from the idea of “learning languages” and start planning to understand program logic and scripts.

Last, but not least, is the suggestion that once you do begin understanding the scope of what you’re undertaking and have found a direction you should, before actually learning a language, read a book that focuses on good programming practices. You don’t want to start by learning bad habits. A book like this will focus on the concept of properly commenting code, declaring variables, using descriptive variables, using compartmentalized code that contains functions and/or stored procedures etc. These concepts are very important, and most self taught people start writing code for simple exercises without knowing these concepts. Those exercises aren’t complex enough to illustrate why good practices are critical, and they end up learning to code using poor fundamentals. Unteaching these bad habits can be tough.

Once you have a handle on all that, then you can decide better what language to worry about learning. I personally would suggest learning a database language, probably Transact-SQL; a server-side scripting language along with HTML (PHP is a good start); and an object-oriented language such as Java or JavaScript. Using those tools you will get a pretty strong base of understanding for writing web-based code. Not everyone chooses web development as the final destination, but I think it’s the most accessible and scalable way to learn a variety of languages and to understand the basic concepts and challenges of coding. Plus there’s no shortage of sample code to work from…though most of it violates all the fundamentals I rambled on about above.

First off, I wouldn’t pay too much attention to what you call yourself. There’s a post here that says a programmer is just writing code, and an engineer is coming up with solutions, or something to that effect. I know others who consider the term “programmer” to be reserved for those highest on the totem pole, e.g., people who write compilers. It’s like when I was in grad school (for geology) and one of my professors had us read a paper on the definition of a reef. It was 60 pages of what various other professors thought a reef was. It was 99% bull.

But now I’m a programmer, or a software developer, or some such thing. I write specs and then write the code to do what the spec says. Or I do it the other way around, because people are often so eager for the code that there’s no time to write the spec till after I’m done. Whatever.

My two cents is to go straight for C. Not C++; that gives you too many ways to hang yourself. But I think it’s very important to know something about what’s logically going on in the machine. Assembly is probably best for this, but it takes too many lines of code to get anything done. Java and other such high-level languages protect you too much, so you don’t get a sense of what’s really going on. But C sits between assembly and the higher-level languages, really closer in many ways to the former, and it will teach you things you need to know.

I’m very strongly in the anything-but-C crowd. IMHO, the language you first start out in shapes how you view programming and how computers work, as well as your whole programming philosophy. Afterwards, it can be very hard to change your outlook.

For example, I was once working with a team member on a medium-sized project that looked to have a fairly tight schedule. He came from a traditional C/C++, every-processor-cycle-counts background, whereas I got into programming much later and was more of a “processor cycles are cheap, time isn’t; build it now and optimise later” type of person. Reasonably early in development, he realised that he could exploit some assumptions to make a linked list that would work faster than the standard library functions. I pointed out that the list code was unlikely to be a significant factor in performance, and that a profile later on would demonstrate this. He acknowledged intellectually that this was the case, yet he still felt compelled to write the code, as he said he felt uncomfortable if he didn’t. He then devoted two days to writing a separate linked-list library (which was radically faster). But late profiling did indeed prove my point: even though his code was 1000x faster in some cases, at best it sped up the program by 0.1%. Granted, that’s a rather radical example, but it shows how hard it can be to go against built-in habits.

It’s a lot like human languages. If you know English and Latin, then it’s probably pretty easy to pick up French and German, although occasionally you get the vocab wrong or need to look something up in the dictionary. Learning Chinese, however, is going to be a lot harder. Still, the more languages you learn, the faster you can learn them. I would say that 6 is about the maximum you can juggle at one time; more, and it becomes confusing trying to sort out the details of each one. But you can easily know 30 or 40 quite varied languages, as long as you’re willing to spend 2 hours re-acquainting yourself to get up to speed.

Yes, absolutely; diversity matters more than sheer number. Knowing only a few similar languages hampers how you think about problems, because a lot of how you approach a problem centers around the particular quirks of a language.

Furthermore, there are some things you can only really do efficiently in certain, specialised languages. You can do a horribly, kludge job that takes 10x as long in a general purpose language but it’s worth knowing a wide variety of domain specific languages just to get a feel for what is possible.

It’s absolutely essential in my mind that a good programmer know a wide range of languages, not just to know the syntax, but to truly “get” the philosophy behind them. It’s like when you first start out learning French, say: you mentally arrange your sentence in English and then translate it to French. Once you’re fluent, you find yourself thinking in French concepts.

By far the largest family of languages is the C family, which comprises C, C++, Java, C# and, marginally, Perl, Python and Ruby. The syntax of all of these is largely similar and the way programs are structured is pretty much the same. This is very deceptive, as the philosophies behind them are pretty radically different, and the similarity in syntax often hampers truly understanding the differing concepts. However, you’ll find that once you understand one of them, you should be able to read most of the code in any of the others.

It depends on what you’re doing. Unless you’re writing high-performance applications, not too much hardware knowledge is involved. But it does come in handy. Knowing, for example, that reading from memory is, as a rule of thumb, around 1000x faster than reading from disk is occasionally useful, and knowing about caches and page sizes etc. sometimes helps when you need to pick a magic number.

Well, any application that you can run on a 486, you could probably program on it. Simple text-based applications which do minimal processing run perfectly well and are great for learning. However, if you want to write real applications that only run on modern machines, say 3D games, then it’s obviously going to take more computing power. Still, unless your program requires constant user input, as long as you can put up with waiting longer, then any machine will do.

You would have a hard time finding many programs that would not run acceptably on an 800MHz machine but would on a 4GHz one. So anything from within the last 5 years is more than enough.

I learnt by going to a good university, and I would strongly recommend it to anybody who has the opportunity. A bad university is not worth the effort. The key thing university brings is that it teaches you in a way that encourages good habits and discourages bad ones, with constant feedback, and exposes you to a diverse variety of languages. I never tried self-teaching, but I hear the biggest drawback is that you tend to become insular and bad habits fester unchecked.

A good rule of thumb, I would say, is that your first language should take about a year. The second should take 6 months. The 3rd, 3 months. The 4th, a month, and then 3 weeks for every one after that. Double the time if you’re moving to a radically different paradigm that you’ve never encountered before. However, this all depends on your definition of “proficient”. Most programmers tend to find a little niche of certain features in a language and never stray far from it. It’s very rare for a programmer to be aware of all the intricacies of a certain language.

I would say that knowledge helps you learn how to learn a language, and practise helps you learn the actual language. The basic concepts behind most programs are all very similar, and once you’ve understood certain theories, many languages are just the application of a subset of features with a peculiar syntax that needs to be mastered.

Good Luck!

Y’know, I’d be really interested if the people here advocating C as a first language have actually learned C as their own first language. I can’t think of a better way to really frustrate a beginning programmer than insisting they learn C before languages that protect you from pointer errors. Add in the fact that a great deal of the programming jobs out there have gravitated towards languages like Java, C#, and VB, and I can’t think of why one would want to pick up C first.

Sure, knowing C is a great thing - once you’ve wrestled with all those memory overflows and overwrites and pointer errors and all that you really know how to code well - but as a first language? Never.

C++ was my first language, and I credit that with making me a good programmer.

Languages that protect you from pointer errors also protect you from having to know anything about what your program actually looks like in memory. IMO, that’s an essential point of view to have mastered to be a good programmer. Any of the other popular languages just don’t force you to learn that, and it’s not something that’s terribly appealing.

Is it frustrating? Sure. But that doesn’t mean it’s not worthwhile.