Q for engineers (programming language)

You have intrinsic types (real, complex, integer, character, and logical), and derived data types (structures with other intrinsic and derived types as components). On top of that, of course you have (multi-dimensional) arrays, pointers, etc.

You can tell Fortran’s purpose as a scientific computation language from the ways the basic types work. For example, if you need 15 digits of precision and an exponent range of at least 307 then you can declare a variable like



! selected_real_kind returns a kind with >= 15 decimal digits
! and a decimal exponent range >= 307 (double precision, typically):
integer, parameter :: dp = selected_real_kind(15, 307)
real(kind=dp) :: a


and of course complex numbers are intrinsic…
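For instance (a quick sketch of my own, reusing the dp kind from above):

program complex_demo
  integer, parameter :: dp = selected_real_kind(15, 307)
  complex(kind=dp) :: z1, z2
  z1 = (1.0_dp, 2.0_dp)        ! 1 + 2i, at double precision
  z2 = conjg(z1)               ! intrinsic complex conjugate
  print *, z1 * z2             ! (5,0): |z1|**2 as a complex value
  print *, abs(z1), sqrt(z1)   ! modulus and principal square root
end program complex_demo

No libraries or operator overloading to set up: arithmetic, conjg, abs, and sqrt all work on complex values out of the box.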

Elixir and Erlang are great for fault-tolerant, distributed applications, but AFAIK not really suitable for high-performance computation per se. Erlang was originally designed by Ericsson for telephony applications.
(See: Starting Out (for real) | Learn You Some Erlang for Great Good!)

Milliseconds is “mighty fast”? I definitely work in a different world than they do.

Now… I never mentioned Erlang, and I never suggested Elixir is top of the heap for high-performance computation. I will suggest that the BEAM ecosystem is supreme in its niche (high concurrency and availability).

And… rightly or wrongly, Elixir is being seen as a competitor to other backend languages like Python or Ruby, and it absolutely destroys them in terms of concurrency and computational performance.

Personally I feel like absolute performance is kind of a secondary concern now that horizontal scaling is so cheap. So I like a language that performs well, and strikes a reasonable balance between expressing how we talk to the machine vs. how we talk to other humans. And humans are more fluent in machine-talk than we tend to assume, so I find it useful to avoid all of the fripperies of Ruby/Rails.

Elixir is very pleasant to work with, you should give it a try.

In my world, we get 16 milliseconds… to render a frame of an entire virtual world (actually, these days it’s really 8 milliseconds). I’d kill for an extra 100 microseconds. A lot of the time, I’m fighting nanoseconds.

I do not see any quote from me saying that anything is alive or dead. I have said that it had been supplanted, is on the way out, and that the sorts of things it has been hanging on to are, similarly, on the way out, to be replaced by Hadoop.

If you have a case then there is some metric you can use to demonstrate otherwise. As yet, you have provided none, and none of your argument has fallen outside the range of what I have conceded and originally did concede. You have shown no reason to believe that usage is growing, nor even that it is holding steady, nor any reason to believe that it would do anything but shrink.

This is data:

https://trends.google.com/trends/explore?date=all&geo=US&q=Numpy,Lapack,ggplot2

If it is ideal for supercomputers, for example, that ignores that it used to be popular throughout science and mathematics. You are pretending that current usage matches past usage. I'm reasonably sure that's false. In everyday, non-supercomputer usage, it has been supplanted, except for maybe some tendrils of dependencies tracing back to legacy usage within that share. Arguing that it is king of a niche and thus invincible ignores that it has already been bested in most other spaces outside that niche.

Assembler has its niche. Lisp probably still has its niche. Erlang has its niche. And while that certainly means there are jobs for those, any presentation of these as anything beyond niche, esoteric-bound languages would be misleading. Trying to pretend that Fortran is the language of math and science, and always will be, is demonstrably false.

And even the argument that it had a special place in supercomputing does not seem to be terribly plausible. If you look through the list of scientific modeling software to see which languages they're implemented in, Fortran is not the clear winner.

New branches of computationally intensive mathematics like machine learning or crypto mining are not basing themselves on Fortran. Only communities with a legacy there are continuing with it at all and they are not telling their younglings that there’s a need to learn Fortran.

Supercomputers are not going to win against Hadoop-like frameworks. Even if we were to take your argument as true that Fortran is king and uncontested in its space, that doesn’t matter if that space is itself going to disappear.

https://trends.google.com/trends/explore?date=all&geo=US&q=Hadoop,supercomputer

And, like I said, just based on the libraries available (while I grant that this is outside my scope of knowledge and experience), I'm not seeing a clear reason to believe that Fortran is uncontested or dominant in the land of high-computation frameworks. It was my understanding, when I got into the business, that a lot of that realm was moving toward GPU usage, and that those libraries were largely C/C++ based because they use the hardware and libraries that were built for computer games. Looking at the existing frameworks, while I haven't confirmed that they are GPU-focused, I see no shortage of what looks like C/C++ based stuff intended for the same hardcore sort of thing as the Fortran frameworks. And knowing at least something of the pluses and minuses of hardware and human laziness, I would expect GPUs to be very popular with scientists, and I would expect them to take the path of least resistance and use the C/C++ frameworks (DirectX, OpenGL, etc.) to interface with that hardware.

And, similarly, I would expect Hadoop to destroy all of that for any bulk calculation need.

Crypto, given that the result is time sensitive, will stay with GPUs. Science will go for cheap and bulk where it needs that. Everywhere else, it has patently and provably already stampeded away from Fortran to new languages.

Again, I am not saying that it does not exist, that there are not people making a living working in it, or anything else. I have stated simply that the first page of this thread gave a false impression that it was a dominant, or even significant, language within the programming world and that it would continue to be that way.

It is not. It is not even that in the mathematics or science world. It is shrinking, it will almost certainly continue to be replaced, and to the extent that it is adopting modern standards and incorporating new ideas, that's not saying much.

You have not countered any of that. Your statements may correctly give the current status of the language within a particular space, but that doesn't conflict with what I have said. I concede that space. But that's not a particularly useful frame of reference for almost anyone, for almost any purpose, and they would be at a disadvantage approaching things in the modern day from that vantage, without the wider frame of reference.

If you want to get into math and science, learn Python (or R). If you want to get into mega-processing within that space, learn Hadoop. I don’t know that the science world is quite there with that, but it will be and you’ll be the hero for introducing it.

And if you’re a Fortran coder and you’re not approaching retirement, that would be the same advice. The tendrils of legacy might see you safely employed for the rest of your life. I won’t deny that. But I’d, personally, not bank on it.

Hadoop is not a language, it is a library and system to distribute processing of large datasets across several computers.

If you want to do science, Python and R are probably your best choices today. As I said before though, the actual work in Python and R is occasionally being done by libraries written in Fortran. If instead of doing just generic “science” you want to write an R package to do some numerical analysis that you’ve developed, it is entirely possible that you’ll decide that Fortran is your best choice for some, or all, of the parts of that package.
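To make that concrete, the numerical core of such a package could be a plain Fortran subroutine along these lines (a hypothetical sketch, not from a real package; R's .Fortran() interface expects this flat, by-reference argument style):

subroutine wmean(n, x, w, result)
  ! Hypothetical kernel for an R package: a weighted mean.
  ! R's .Fortran() passes every argument by reference, so a flat
  ! list of typed scalars and arrays is the natural shape.
  integer, intent(in) :: n
  double precision, intent(in) :: x(n), w(n)
  double precision, intent(out) :: result
  result = sum(w * x) / sum(w)
end subroutine wmean

Compile it into a shared library (R CMD SHLIB wmean.f90) and call it from R with .Fortran("wmean", ...).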

Just because Fortran is not even in the top 20 list of languages doesn’t mean that it isn’t used or important. The idea that it is dead is pretty easily refuted by the fact that open source projects and corporations continue to release and support Fortran compilers and libraries. If I removed Fortran compilers from the systems I maintain, then I would not be able to install the software my users need.

You said that you suspected there is no active development of FORTRAN-based applications. I’m not sure where you’d put that on the alive/dead continuum, but the difference between “dead” and “zero active development” seems vanishingly slim to me.

Plus, I provided counterexamples of robust, actively-developed applications written in FORTRAN, as did several others. You’re moving the goalposts.

Yep. I use an NVIDIA Tesla V100 to help solve my finite element analysis models. Again, all the solvers I use are written in FORTRAN.

Nope. Scientists and writers of scientific software can use C to talk to GPUs, but when they’re already coding in FORTRAN, why switch languages? In this case, the path of least resistance is often NVIDIA’s CUDA FORTRAN compiler.
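For a flavor of it, here is a minimal CUDA Fortran kernel (my own sketch, not taken from NVIDIA’s materials); it is ordinary Fortran plus a few attributes:

module saxpy_mod
contains
  ! attributes(global) marks this as a GPU kernel; each thread
  ! computes one element of y = a*x + y.
  attributes(global) subroutine saxpy(n, a, x, y)
    integer, value :: n
    real, value :: a
    real :: x(n), y(n)
    integer :: i
    i = (blockIdx%x - 1) * blockDim%x + threadIdx%x
    if (i <= n) y(i) = a * x(i) + y(i)
  end subroutine saxpy
end module saxpy_mod

On the host side you declare arrays with the device attribute and launch with the chevron syntax, e.g. call saxpy<<<ceiling(real(n)/256), 256>>>(n, 2.0, x_d, y_d). No C required.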

I get it: you hadn’t encountered FORTRAN in your career and you assumed that it was a lot less important than it is. That’s understandable. But it turns out that it’s still vibrant and useful in scientific computing circles—and you’ve admitted you aren’t familiar with these applications. (If you were familiar with them in any substantial way, you wouldn’t have doubted that any projects are actively developed in FORTRAN; you would also be aware that both NumPy (Python) and Matlab rely on the BLAS and LAPACK libraries, both of which are written in FORTRAN).
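That dependency is easy to see from the Fortran side, too. Here is a minimal sketch of mine (not from anyone’s post) calling LAPACK’s dgesv, the same routine NumPy reaches via numpy.linalg.solve:

program solve_demo
  ! Solve the 2x2 system [2 1; 1 3] x = (5, 10) with LAPACK's
  ! dgesv (LU factorization plus solve). Link with -llapack.
  integer, parameter :: n = 2
  double precision :: a(n,n), b(n)
  integer :: ipiv(n), info
  a = reshape([2.0d0, 1.0d0, 1.0d0, 3.0d0], [n, n])
  b = [5.0d0, 10.0d0]
  call dgesv(n, 1, a, n, ipiv, b, n, info)
  print *, 'x =', b, ' info =', info   ! expect x = (1, 3), info = 0
end program solve_demo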

I’m happy to concede that FORTRAN is fading in importance. Its heyday is well past and it will eventually be superseded by another language or languages. I don’t think anyone here is claiming otherwise. Rather, we’re saying that within the niche of scientific computing, there are still great reasons to use it (including performance). Many people within that niche do use it. That’s a fairly narrow claim.

It’s a small niche, but an important one. Non-specialists like yourself tend not to know much about it, and there’s no shame in that.

Actually, you said you suspected that no FORTRAN projects were actively being developed, and that the only “work” being done in FORTRAN was maintaining legacy programs.

It may not be a significant language in your world, but it’s a central one in mine and others’. Your experience is not the ultimate arbiter of anything. Neither is mine, of course, but you were plainly mistaken in your suspicion that there were no actively developed FORTRAN projects.

Yes, it does conflict with your claims, and no, you didn’t “admit to that space” at first. You clearly don’t understand what “that space” is or which programming languages suit it. If you did, you’d stop mentioning Hadoop.

You keep telling us what you said. Let’s look at what you actually said:

What’s false is your sense of how much FORTRAN is actually used, whether directly or as the guts of a library called from another, more general-purpose language. No one is claiming that FORTRAN is anything but a specialist’s tool. You seem to be a generalist, and the “no new FORTRAN development” assertion makes it clear that you were unaware of this niche.

If I may take the liberty of abstracting your argument a bit: your initial post seemed to assert that FORTRAN is no longer the best tool for any job. Posters like myself and Stranger, who do numerical simulation, pointed out that it’s still the best tool for the job we do. We (and echoreply) also mentioned that Python and many other languages call libraries written in FORTRAN when they need to do linear algebra.

In my experience on these boards, most people acknowledge such moments with two simple words: “ignorance fought.” Then again, there’s a reason that fighting ignorance is taking longer than we thought.

I missed the edit window, but echoreply mentioned (way upthread) the same compiler I linked to in my post.

I’ll grant that there are a lot of libraries out there, including some still in widespread use, that were written in Fortran. But I don’t think that this is because Fortran is better suited for writing such libraries than C. In most cases, those libraries are written in Fortran simply because Fortran came first, and when C came along, nobody wanted to put in the time or effort to re-write them. And a lot of the time, these libraries are self-contained enough that there’s no need to look under the hood, and so nobody actually cares what language they were originally written in.

If, for some reason, it ever became necessary to update those old numerical libraries, it’d probably be easier to re-write them from scratch than to go back to the original source code and edit it. And that re-write would, in most cases, probably be done in C.

That’s true. I don’t mean to overstate the case for FORTRAN. But the rumors of its demise have been exaggerated, at least in this thread.

Certainly some of that is done. ATLAS, an optimized implementation of BLAS, is written in C and assembly language, not Fortran. Since this is the realm where squeezing out every percent of performance is essential, it would be enlightening to know more about the reasons behind their ultimate choice of programming language.

The reason cited to me (in the late 90s) for not translating numerical libraries from Fortran to C is that the Fortran language prohibits pointer aliasing (two separate pointers referring to the same memory location). That means that a Fortran compiler can do a lot more aggressive optimizations.

C99 introduced the “restrict” keyword, and a lot of C++ compilers have (non-standard) annotations to indicate non-aliasing pointers, so they can now perform the same optimizations (and C/C++ compilers today are much cleverer at figuring these things out for themselves, even without help), but there were a couple of decades where C was conspicuously slower than Fortran for these things.
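To make the aliasing point concrete, here is a sketch of mine (not from the post above):

subroutine axpy(n, a, x, y)
  ! Fortran's argument rules forbid the caller from passing
  ! overlapping actual arguments for x and y (since y is modified),
  ! so the compiler can vectorize this loop with no runtime overlap
  ! checks. The equivalent C needs "restrict" on the pointers to
  ! make the same promise.
  integer, intent(in) :: n
  real, intent(in) :: a, x(n)
  real, intent(inout) :: y(n)
  integer :: i
  do i = 1, n
    y(i) = y(i) + a * x(i)
  end do
end subroutine axpy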

I agree with that. If all Fortran libraries and their source code were magically removed from existence, but all the Fortran compilers were still around, there is very little that would be re-implemented in Fortran.

A comment was made in a Reddit thread on a different topic:

I think for the fields of numerical analysis, statistics, modelling, high-performance computing, and other math-heavy fields, that quote also applies perfectly to Fortran. Just because you didn’t use any Fortran code doesn’t mean you don’t depend on any Fortran code.

I fully agree with Echoreply’s post above—a post which evokes the OP’s question:

Despite the tangent about Fortran’s current vitality (which I exacerbated), the answer to this well-bounded question is pretty straightforward:

Yes, engineers and physical scientists who do numerical analysis (and some other kinds of math) still use FORTRAN, and frequently.

Within those same parameters, Python is common for general-purpose coding and prototyping. While FORTRAN is entrenched in these fields, C is making inroads, especially when computational speed matters and the code is being written by those who know C better than FORTRAN.

Does that sound like a reasonable answer to the OP’s question?

You should perhaps look more carefully, then.