I just did a test to find the smallest Object Pascal (Free Pascal) program I could write.
Here it is in its entirety:
program HelloWorld;
begin
WriteLn('Hello world!');
end.
It compiles and runs!
I just did a test to find the smallest Object Pascal (Free Pascal) program I could write.
Here it is in its entirety:
program HelloWorld;
begin
WriteLn('Hello world!');
end.
It compiles and runs!
Yes. We use it in our Android projects when we need something lightning fast.
I completely disagree with that. Object oriented design works well at every scale. I just did a little microcontroller project in C++ for an LED lamp, and I certainly could have just written a big file with a bunch of functions. But it was just better to build objects and interfaces for the lamp, the effects, the commands, etc. It’s easily extensible, very understandable, and maintainable.
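That kind of structure is easy to sketch. None of this is the actual lamp code, of course (the names are invented, and it's Python rather than C++ for brevity), but the idea of an effect interface that the lamp drives without knowing any effect's particulars looks roughly like this:

```python
from abc import ABC, abstractmethod

class Effect(ABC):
    """One LED effect; a new effect implements step() and nothing else."""
    @abstractmethod
    def step(self) -> list[int]:
        """Return the next frame of LED brightness values (0-255)."""

class Blink(Effect):
    """Toggle the LED on and off every `period` ticks."""
    def __init__(self, period: int = 1):
        self.period = period
        self.tick = -1

    def step(self) -> list[int]:
        self.tick += 1
        on = (self.tick // self.period) % 2 == 0
        return [255 if on else 0]

class Lamp:
    """The lamp only knows the Effect interface, so adding a new effect
    never requires touching the Lamp class itself."""
    def __init__(self, effect: Effect):
        self.effect = effect

    def update(self) -> list[int]:
        return self.effect.step()
```

Adding a `Fade` or `Rainbow` effect is then just another `Effect` subclass, which is the extensibility argument in a nutshell.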
Sure, you can use OOP at any scale, but the question is whether you really need to or are getting any real advantage from doing so. Making objects and associating methods with abstracted interfaces makes it easier to add functionality than “a big file with functions”, but I’m guessing your line count of actual operational code is probably in the few tens, which anyone could sit down and work through in half an hour unless you’ve actually made the effort to obfuscate the code, so the time you save in extensibility is probably a wash for the extra overhead. And frankly, unless you are careful about how you apply OOP principles and document interfaces, it can end up being much harder for another programmer to understand, which is fundamentally my complaint about casual and reflexive use of OOP.
There are, of course, other reasons to use OOP, particularly if you want to use abstraction to make your code reusable. As an example, I wrote a Python library to do signal processing and cohort statistics for dynamic environments from accelerometer time history data, but I used a class factory so a user could come along and inherit the base signal class and then add methods and properties to do different things with time history data from pressure transducers, thermocouples, strain gauges, et cetera, and then be able to use the same cohort stat methods to generate summary statistics. In that case, it was worth the extra overhead because I can use it to handle and manipulate any time history data, and modifying the class factory (which actually has its own set of abstract interfaces) is trivial. In fact, the class factory is designed to actually import parameters and predefined methods from an external spreadsheet (as well as the predefined classes in the module), so unless the user needs some kind of visualization or processing methods that aren’t available in the module they can actually generate a whole new class type without doing any programming at all. Unfortunately, I have not been able to wean users away from their bag o’ functions Matlab scripting and into Python, but that is another issue.
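The general shape of that pattern is easy to show, though none of the names below come from the actual library; it's a factory that stamps out new signal classes on top of a base class carrying the shared statistics:

```python
import statistics

class BaseSignal:
    """Shared time-history behavior: every generated class gets these stats."""
    def __init__(self, samples):
        self.samples = list(samples)

    def mean(self):
        return statistics.fmean(self.samples)

    def rms(self):
        return (sum(x * x for x in self.samples) / len(self.samples)) ** 0.5

def make_signal_class(name, units, extra_methods=None):
    """Class factory: build a new signal type with instrument-specific extras.

    The names, units, and methods here are hypothetical; in principle the
    same parameters could be read from a spreadsheet row, letting a user
    define a new class type without writing any code.
    """
    attrs = {"units": units, **(extra_methods or {})}
    return type(name, (BaseSignal,), attrs)

# An accelerometer class and a thermocouple class share the same stats API,
# but each can carry its own extras.
Accel = make_signal_class("Accel", "g")
Thermo = make_signal_class(
    "Thermo", "degC",
    {"span": lambda self: max(self.samples) - min(self.samples)},
)
```

So `Thermo([20.0, 25.0]).mean()` and `Accel([3.0, 4.0]).rms()` come from the same base class, while `span()` exists only on `Thermo`.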
Anyway, someone who is experienced and conversant can write good OOP code probably about as fast as functional code, but a lot of OOP code is not well thought out or doesn’t provide any real utility, at the cost of making the project harder to understand or reuse instead of easier, which is my main gripe about ‘needing’ to use OOP. In other words, if you can’t think of a compelling benefit to use OOP design in your project, you probably either don’t need it or don’t understand OOP well enough to apply it correctly.
I just did a test to find the smallest Object Pascal (Free Pascal) program I could write.
Here it is in its entirety:
program HelloWorld; begin WriteLn('Hello world!'); end.
It compiles and runs!
There is nothing about that program that is object oriented, or even functional since it is really just a single imperative statement in the main body.
Perhaps you might like something like Julia. It doesn’t have quite the ecosystem of Python, but it’s definitely gaining ground in some Python-centric verticals.
I’ve been trying to find an excuse to use Julia for something because I would really like to learn it but I’m honestly hard pressed at this point to come up with any data analysis and visualization that I can’t just as easily do in Python with NumPy and SciPy, and the milliseconds I might gain in additional performance would be lost in time spent learning Julia and having to interface with existing Python libraries for functionality not directly available in Julia modules. Maybe when I start writing that astro code that has been sitting in my personal ToDo pile for the last decade…
C++ is too ugly to be a Ferrari. Also, a Ferrari isn’t actively trying to kill you.
C is a Piper Cub. C++ is a Caproni Noviplano.
Stranger
I’ve been trying to find an excuse to use Julia for something because I would really like to learn it but I’m honestly hard pressed at this point to come up with any data analysis and visualization that I can’t just as easily do in Python with NumPy and SciPy
Sure, but the OP was simply looking for something that compiles to play around with, not trying to get things done in the most efficient manner. If he enjoys Python, I think Julia might be in his wheelhouse, since most Julia programmers that I know come most recently from the Python world.
And I agree with your C to C++ analogy much more than a Honda Civic to a Ferrari. My first thought was to keep it in the automotive world, with C as the Ariel Atom and C++ as the Aston Martin Lagonda.
There is nothing about that program that is object oriented, or even functional since it is really just a single imperative statement in the main body.
Sure, but I think you missed the point.
You can write the most sophisticated OOP you like in Object Pascal, but you are not obliged to put in the complexities if you don’t want to. Unlike the C# example above.
Sure, but the OP was simply looking for something that compiles to play around with, not trying to get things done in the most efficient manner. If he enjoys Python, I think Julia might be in his wheelhouse, since most Julia programmers that I know come most recently from the Python world.
Oh, agreed; I was mostly just lamenting that I don’t have a good reason to ‘need’ to learn Julia for its performance improvements. I suspect Julia has a bright future in scientific and data analysis programming once the selection of libraries gets up there, although whether it will actually displace Python for general scientific use in the foreseeable future is questionable.
Stranger
Anyway, someone who is experienced and conversant can write good OOP code probably about as fast as functional code, but a lot of OOP code is not well thought out or doesn’t provide any real utility, at the cost of making the project harder to understand or reuse instead of easier, which is my main gripe about ‘needing’ to use OOP.
Perhaps you haven’t done much complex desktop GUI programming, or complex database programming, because in those cases OOP is absolutely indispensable. It would be impossible to write and maintain real-world applications without it.
Perhaps you haven’t done much complex desktop GUI programming, or complex database programming, because in those cases OOP is absolutely indispensable. It would be impossible to write and maintain real-world applications without it.
Looking back at what I wrote (emphasis added):
OOP is one of those things that seems really ‘important’ when you are first introduced to it but really is only helpful if you are doing a certain type of application programming; particularly wide scale projects where your code may need to integrate with other existing components or hypothetical future projects where everything is controlled via interface without insight into the particulars of each part, and the cost of doing that is a lot of overhead and having to design interfaces in a very specific, tightly controlled way. If you are building an enterprise-wide system or some kind of extensible operating system then it makes sense, but for most general programming—especially anything a hobbyist would be doing—it is largely unnecessary or at least adds extra burden.
Doing “complex desktop GUI programming” and “complex database programming” are self-evidently large scale and enterprise wide types of applications where OOP makes sense because these are applications that involve thousands of programmers and often have lifetimes of decades. However, most hobbyist programmers are not working on projects of anything like that scale or interfacing with other abstractly defined projects, and the overhead of writing OOP is not really necessary, although it may be useful for reuse or extensibility, and of course to learn basic OOP design patterns.
Stranger
I don’t think any student (in the sense of a proper student of computer science) would have ever been taught BASIC. It was fine for hobby programmers and for high school students doing a computing elective or the like. But as to using it as a formal teaching language - Yuk! BASIC was a cut-down, Noddy (Mickey Mouse for those in the US) FORTRAN-like language. You either got taught Fortran or, when it caught on, Pascal.
Let me remind everyone what Dijkstra said about BASIC
“It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration.”
Back when PCs were infiltrating homes my CS professor friends dreaded Septembers when the freshmen who taught themselves BASIC at home would arrive at school, sure they were already expert programmers.
I’m not an OOP fanatic by any means, and never wrote any big programs in C++ or Python, but I think it is far better that anyone learning to program, even as a hobbyist, should learn it right from the beginning. It will prevent them from having bad habits if they ever do anything significant.
I had some code in Perl - not interacting with anything but databases - which I wish I would have done in Python. The deadline I was given made that impossible - I never could have learned it in time. It would have made many things much easier, though, and I encourage people rewriting parts of my code to do it in Python.
I had some code in Perl - not interacting with anything but databases - which I wish I would have done in Python. The deadline I was given made that impossible - I never could have learned it in time.
I think this sentiment is shared by anyone who has done real work with Perl. Unfortunately, Perl is incredibly useful for doing certain types of sorting/filtering/processing tasks, and if you are coming from a *nix background it seems like someone ‘fixed’ sed, awk, and regex functionality into a real scripting language that doesn’t require piping a bunch of oblique bash shell commands together. But if you ever find yourself writing more than about ten lines of Perl, you should probably be rethinking your choices in life starting with your preference in programming languages.
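For what it's worth, the sed/awk-style line munging Perl is famous for is only slightly more ceremony in Python. A sketch, with a made-up log format:

```python
import re

def extract_errors(lines):
    """Pull the timestamp and message out of ERROR lines -- the sort of
    filter that would be a Perl one-liner."""
    pattern = re.compile(r"^(\S+)\s+ERROR\s+(.*)$")
    return [m.groups() for line in lines if (m := pattern.match(line))]

log = [
    "12:00:01 INFO  starting up",
    "12:00:02 ERROR disk full",
    "12:00:03 ERROR cannot write log",
]
```

Here `extract_errors(log)` returns `[('12:00:02', 'disk full'), ('12:00:03', 'cannot write log')]`, and once it grows past ten lines you at least have a real function with a name instead of a pile of punctuation.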
Stranger
And it’s closer to the hardware than pretty much anything other than assembly.
For those of you who didn’t learn Pascal or FORTRAN or BASIC.
Let me remind everyone what Dijkstra said about BASIC
Let me remind everyone that Dijkstra believed that Computer Science was the development, understanding, and teaching of algorithms, and that practically nothing he said about any other subject was of any value.
Another reason to start with just regular C is that you’ll have to learn memory management. Both C and C++ require you to carve out bytes of memory for your data and free it up when you are done. With languages like Python, you don’t need to worry about where the memory comes from and cleaning stuff up. Python automatically carves out memory for new data and frees up memory when you’re no longer using it. It’s called automatic garbage collection. So for something like this in Python:
string3 = string1 + string2
You don’t do anything about memory storage. Python does it all for you. The corresponding C code would look something like this:
char *string3 = malloc(strlen(string1) + strlen(string2) + 1); // Allocate room for string1+string2, plus the terminating '\0'
strcpy(string3, string1); // Copy string1 into the space allocated for string3
strcat(string3, string2); // Concatenate string2 onto the bytes in string3
free(string1); // Don't need string1 or string2 anymore, so free the memory
free(string2); // associated with them (assuming they were malloc'd as well).
As you can see, there’s a lot more to deal with when you have to do your own memory management. Improper memory management is one of the most frequent causes of bugs in these kinds of programs. Both C and C++ require the programmer to be aware of these memory issues and make sure it’s done properly. Mistakes will often cause a program to crash. It’s better to get comfortable with memory management in a simple language like C before you have to deal with it in the more complex environment of C++.
But if you ever find yourself writing more than about ten lines of Perl, you should probably be rethinking your choices in life starting with your preference in programming languages.
Well remember Larry and his followers, recognizing that Perl has lots of problems, have switched to the development of Raku (formerly known as Perl 6):
Raku is a member of the Perl family of programming languages. Formerly known as Perl 6, it was renamed in October 2019. Raku introduces elements of many modern and historical languages. Compatibility with Perl was not a goal, though a compatibility mode is part of the specification. The design process for Raku began in 2000. In Perl 6, we decided it would be better to fix the language than fix the user.
It’s great to see how elegantly simple modern languages have become!
In order to output “Hello world” in FORTRAN, back in the day, I would have had to type all this:
      WRITE(6,100)
  100 FORMAT(' Hello World!')
Sometimes Hello World is a bit misleading though. In C# you can print a message to the console or a message box with a single line of code.
It’s just that the “minimum viable” program requires a couple of lines of declaration of the program itself.
In fact, isn’t this true with FORTRAN too? Don’t you need a PROGRAM line, and wouldn’t most real world programs have IMPLICIT NONE, which is not even functional (it just turns off a dangerous feature)?
</language war>
Well remember Larry and his followers, recognizing that Perl has lots of problems, have switched to the development of Raku (formerly known as Perl 6):
I did not know that. Looking through the Wikipedia page makes me wonder who this language is supposed to appeal to; Perl had a natural audience in sysadmins looking to automate tasks without stringing together shell commands, and then morphed into CGI scripting tool among other things before PHP became the dominant language. I can’t imagine it is anyone’s primary language today, and I’d only use Perl for some quick text handling like renaming files, and would probably just do that in Python even at the expense of having to import a couple of libraries because it would still be faster than refamiliarizing myself with Perl syntax. I know there are some Perl Data Language users out there in bioinformatics but I don’t hear anyone threatening to storm the castle for a new Perl-like language, and frankly kind of assumed Perl 6 was pretty much dead.
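The file-renaming case really is just a couple of standard-library imports. A sketch of one such task (normalizing extensions to lowercase; the task itself is a made-up example):

```python
import pathlib

def rename_lowercase_ext(directory):
    """Rename files so their extensions are lowercase, e.g. DATA.TXT -> DATA.txt."""
    for path in pathlib.Path(directory).iterdir():
        if path.is_file() and path.suffix != path.suffix.lower():
            path.rename(path.with_suffix(path.suffix.lower()))
```

Not a one-liner like the Perl rename idiom, but there is no syntax to refamiliarize yourself with beyond `pathlib`.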
Stranger
In fact, isn’t this true with FORTRAN too? Don’t you need a PROGRAM line, and wouldn’t most real world programs have IMPLICIT NONE, which is not even functional (it just turns off a dangerous feature)?
Isn’t an empty file a valid Fortran program?
No. Not according to the Fortran 95 standard or the ifort compiler.
Or is this alluding to some historic event?
Isn’t an empty file a valid Fortran program?
A valid FORTRAN 77 program has to at least have an END card.
Stranger