Oh, that's cool to hear. I was under the impression that in research, while a lot of the processing does happen in FORTRAN-written code, it was nearly always by reusing already-written functions and primitives from a higher-level language (such as Python, via the aforementioned SciPy), with those libraries maintained by a handful of wizards on the internet somewhere.
Can you elaborate on the kind of research where people are still actively writing directly in FORTRAN? Did people typically arrive with the skills already, or was there training in how to write it well?
Someone else using Fortran in research checking in. In particle physics, we're basically writing huge, physics-heavy Markov chain Monte Carlo simulations in it. Just one example.
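(Not our actual code, obviously, and sketched in Python rather than Fortran, but for anyone curious what an MCMC loop even looks like, a toy random-walk Metropolis sampler is roughly this; the Gaussian target is just a placeholder:)

```python
import math
import random

# Hypothetical target: unnormalised log-density of a standard Gaussian.
def log_target(x):
    return -0.5 * x * x

def metropolis(n_samples, step=1.0, x0=0.0):
    """Random-walk Metropolis: propose a jump, accept it with
    probability min(1, target ratio), otherwise stay put."""
    x = x0
    samples = []
    for _ in range(n_samples):
        proposal = x + random.gauss(0.0, step)          # symmetric proposal
        log_ratio = log_target(proposal) - log_target(x)
        if log_ratio >= 0 or random.random() < math.exp(log_ratio):
            x = proposal                                 # accept
        samples.append(x)                                # record current state either way
    return samples

chain = metropolis(10_000)
print(sum(chain) / len(chain))  # should hover near the target mean of 0
```

The real codes are the same idea scaled up: the "target" is a physics likelihood that is expensive to evaluate, which is why the inner loop ends up in Fortran.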
Your daily weather forecast likely runs on FORTRAN. It's quite terrible code in many places, because the people writing it are not software engineers but meteorologists, mathematicians, or physicists with little to no formal training in software design, working on a million-line behemoth.
And FORTRAN adds to the suck because it is supremely verbose, lacks generics, has a few really bad language design decisions carried over from the '60s, and a thoroughly half-assed object model tacked on. As a cherry on top, the compilers are terrible because nobody uses the language anymore -- especially the more recent features (Fortran 2003 and later).
Don't get me wrong: Python probably is the main language used in research. However, there's software that needs to be fast at crunching numbers. I work in computational chemistry, and pretty much any reliable software is either Fortran or C++. Indeed you have Python libraries, but most are just wrappers.
You have:
- Gaussian: https://en.wikipedia.org/wiki/Gaussian_%28software%29
- GAMESS: https://en.wikipedia.org/wiki/GAMESS_%28US%29
- CP2K: https://en.wikipedia.org/wiki/CP2K
- MOPAC: https://en.wikipedia.org/wiki/MOPAC
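To make the "just wrappers" point concrete: even everyday SciPy calls bottom out in compiled LAPACK/BLAS routines that historically come from Fortran. A rough sketch, assuming you have numpy and scipy installed:

```python
import numpy as np
from scipy import linalg
from scipy.linalg import lapack

a = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([9.0, 8.0])

# The friendly wrapper most people call:
x = linalg.solve(a, b)

# The LAPACK routine it dispatches to, exposed almost verbatim
# (dgesv = double-precision GEneral SolVe):
lu, piv, x_direct, info = lapack.dgesv(a, b)

print(x, np.ravel(x_direct), info)  # same answer; info == 0 means LAPACK succeeded
```

The dgesv name and calling convention come straight from LAPACK; the Python layer is mostly argument checking on top.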
Now, most people do not work in Fortran, but it is something you learn a little of when you start working in computational chemistry. It sometimes happens that you have to debug software that isn't working, or write a module to test an hypothesis. The people writing those programs are also researchers, but mostly dedicated full-time to the software. Generally, there is a huge lack of investment in the software infrastructure; very few people are dedicated to maintaining software that is used by hundreds of thousands of people.
When hiring people, I am satisfied as long as they know a bit of Python, but knowledge of Fortran really stands out and signals a more thorough education. If I have time, I give everyone an introduction to Fortran, as it is still something you often come across in our field. But yes, unless you're working on the development of such software suites, Fortran is not that common now. You'd publish a proof of concept in Python or Julia and then wait for someone else to implement it in one of those libraries.
I think you mean an 'ypothesis (only vowels take "an"; consonants take "a"; h is a special case, as French and French-influenced English drop the h from the start of words). It's polite to show the letters you have dropped with an apostrophe so readers don't take incorrect ideas from one's writing.
Do you have anything actually relevant to add to the conversation?
As far as I can tell from checking just now, "an hypothesis" can be used, just as "a hypothesis" can. I have never seen anyone write it with an apostrophe and would be very confused reading it.