I do not see any quote from me saying that anything is alive or dead. I have said that it has been supplanted, that it is on the way out, and that the sorts of things it has been hanging on to are, similarly, on the way out, to be replaced by Hadoop.
If you have a case then there is some metric you can use to demonstrate otherwise. As of yet, you have provided none and none of your argument has fallen outside of the range of what I have and originally did concede. You have shown no reason to believe that the usage is growing, nor even that it is holding still, nor any reason to believe that it would do anything but shrink.
This is data:
https://trends.google.com/trends/explore?date=all&geo=US&q=Numpy,Lapack,ggplot2
If it is ideal for supercomputers, for example, that ignores that it used to be popular throughout science and mathematics. You are pretending that the current usage matches the past usage. I’m reasonably sure that’s false. In everyday, non-supercomputer usage, it has been supplanted, except perhaps for some tendrils of dependencies tracing back to legacy usage within that share. Arguing that it is king of a niche, and thus invincible, ignores that it has already been bested in most other spaces outside that niche.
Assembler has its niche. Lisp probably still has its niche. Erlang has its niche. And while that certainly means that there are jobs for those, any presentation of these as anything beyond niche, esoteric-bound languages would be misleading. Trying to pretend that Fortran is the language of math and science, and always will be, is demonstrably false.
And even the argument that it has a special place in supercomputing does not seem terribly plausible. If you look through the list of scientific modeling software to see which languages they’re implemented in, Fortran is not the clear winner.
New branches of computationally intensive mathematics like machine learning or crypto mining are not basing themselves on Fortran. Only communities with a legacy there are continuing with it at all and they are not telling their younglings that there’s a need to learn Fortran.
Supercomputers are not going to win against Hadoop-like frameworks. Even if we were to take your argument as true that Fortran is king and uncontested in its space, that doesn’t matter if that space is itself going to disappear.
https://trends.google.com/trends/explore?date=all&geo=US&q=Hadoop,supercomputer
And, like I said, just based on the libraries available (while I grant that this is outside my scope of knowledge and experience), I’m not seeing a clear reason to believe that Fortran is uncontested or dominant in the land of high-computation frameworks. It was my understanding, when I got into the business, that much of that realm was moving toward GPU usage, and that those libraries were largely C/C++ based because they use the hardware and libraries that were built for computer games. Looking at the existing frameworks, while I haven’t confirmed that they are GPU-focused, I see no shortage of what look like C/C++ based tools intended for the same hardcore sort of work as the Fortran frameworks. And knowing at least some of the pluses and minuses of hardware and human laziness, I would expect GPUs to be very popular with scientists, and I would expect them to take the path of least resistance and interface with the C/C++ frameworks (DirectX, OpenGL, etc.).
And, similarly, I would expect Hadoop to destroy all of that for any bulk calculation need.
Crypto, given that its results are time-sensitive, will stay with GPUs. Science will go for cheap and bulk where it needs that. Everywhere else, it has demonstrably already stampeded away from Fortran to newer languages.
Again, I am not saying that it does not exist, that there are not people making a living working in it, or anything else. I have stated simply that the first page of this thread gave a false impression that Fortran was a dominant, or even significant, language within the programming world and that it would continue to be.
It is not. It is not even that in the mathematics or science world. It is shrinking, it will almost certainly continue to be replaced, and to the extent that it is adopting modern standards and incorporating new ideas, that’s not saying much.
You have not countered any of that. Your statements may correctly give the current status of the language within a particular space, but that doesn’t conflict with what I have said. I concede that space. But that’s not a particularly useful frame of reference for almost anyone, for almost any purpose, and they would be at a disadvantage approaching things today from that vantage point, without the wider frame of reference.
If you want to get into math and science, learn Python (or R). If you want to get into mega-processing within that space, learn Hadoop. I don’t know that the science world is quite there with that, but it will be and you’ll be the hero for introducing it.
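To illustrate what those "tendrils of legacy" look like in practice: a Python scientist today can lean on NumPy, whose dense linear algebra routines delegate to LAPACK (originally a Fortran library) without the user ever touching Fortran. A minimal sketch:

```python
import numpy as np

# Solve the linear system A @ x = b:
#   3x + 1y = 9
#   1x + 2y = 8
# np.linalg.solve dispatches to a LAPACK gesv routine under the hood,
# so the Fortran heritage is still there -- just buried as a dependency.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([9.0, 8.0])

x = np.linalg.solve(A, b)
print(x)  # -> [2. 3.]
```

The point being: the working language here is Python; Fortran survives only as plumbing the user never sees.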
And if you’re a Fortran coder and you’re not approaching retirement, that would be my advice as well. The tendrils of legacy might see you safely employed for the rest of your life; I won’t deny that. But I’d, personally, not bank on it.