C
Let’s start off easy, shall we?
Everyone’s been saying that C has been dead for… well, more than a decade.
Some laugh at it. Others mutter darkly about living in fear of the infamous
forbidden curses: undefined behaviour. Sure, there remain loyal
followers, and scattered cults here and there. But almost everyone who still
sees the Sun will happily claim that it’s a dead language, gone forever, and
there’s no reason to use it any more.
Scattered around the world there are… things. Little, unregarded things… that keep C from dying out altogether. Embedded devices, household appliances, single purpose systems and various other unseen, disregarded objects carry C’s compiled code within them. And while they exist, and require maintenance and upgrades, C cannot be killed.
C is… Voldemort, the dark lord of languages.
Fortran
Older even than C, Fortran has lived long enough to see the rise and fall of some of the greatest careers in science and computing. A great many extraordinary feats owe their success to Fortran.
But Fortran is… uh… getting on in years, I suppose is the polite way to put it. I mean, there are plenty who will tell you it’s still capable of some amazing stuff… but then, what language isn’t?
Like C, there are bits of Fortran scattered here and there around the world. But unlike C, these are more memorials than horcruxes. Shining testaments to great acts of insightful mathematics and elegant algorithms remain standing, so that those who wish to build upon great works may one day do so with new and shinier code.
Fortran is Albus Dumbledore (who, incidentally, also frowned upon aliasing pointers when operating on arrays — there’s a little known HP fact for you).
Java
This is a language that found its way into the halls of ingenuity by appealing to all the right people at the top. The whole language, its uptake and its culture seem to be based on managerial interference and fear.
It seems decent enough though. Certainly better than… well, You Know What… that’s for sure! No more buffer overflows, no more uninitialised memory, no casting void pointers… and look at the things it’s accomplished!
But after only a few short months, you look up to realise that blatant, horrific evil has been replaced by a kind of dreary, soul sapping awfulness. Actual achievements have been replaced by studying for useless certifications. Design patterns and frameworks are codified as law and hammered into the whiteboards. Gaslighting abounds. Morale plummets.
There’s nothing in here about using defensive bounds checking.
Using bounds checks? Ha ha! Well, I can’t imagine why you would need to use pointers in my classroom.
We’re not gonna use pointers?
You will be learning about coding in a secure, risk-free way.
Well, what use is that? If we’re going to deploy code, it won’t be risk-free!
Programmers will raise their hands when they speak in my class.
Java is Dolores Umbridge, and you know, deep down… you deserve this implementation of generics.
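(For anyone who missed the generics jab: Java implements generics by type erasure, so the type parameters are simply gone at runtime. A minimal sketch, class name mine, of the implementation you deserve:)

```java
import java.util.ArrayList;
import java.util.List;

// Demo of type erasure: the generic type parameters below exist only at
// compile time. At runtime, both lists are just plain ArrayList.
public class Erasure {
    public static void main(String[] args) {
        List<String> strings = new ArrayList<>();
        List<Integer> ints = new ArrayList<>();
        // Same runtime class object: the <String> and <Integer> were erased.
        System.out.println(strings.getClass() == ints.getClass()); // prints true
    }
}
```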
C++
C++ is a language pulled between two worlds. One is the dark and evil world of… You Know Who… and the other is the bright-but-annoyingly-preppy world of those who wish to overthrow the dark lord.
And C++’s role in all of this is somewhat inscrutable. Is it harbouring and abetting the enemy? Or is it actually keeping you safe from its worst excesses? Is it constantly undermining you to make you weaker, or is it teaching you about your own shortcomings to make you more powerful?
Although it occasionally manages to rise above wretchedness and misery, it just keeps getting clawed back into the clutches of C and its followers. And this will ultimately destroy it. Because it will never displace C. It will never live to see what does. It will just be universally reviled for every wrong thing it does, and then it will die.
Also, I’m pretty sure C++ is what really killed off Fortran. Or at least sharpened the knife.
C++ is the hated, the wretched, the unreadable… Severus Snape.
Ada
Ada is a language that is half pure practicality, half lofty academia, all austere glare. Designed by committee to create a race of military space robots, Ada eschews things like runtime checks and casts in favour of refusing to let you even try to compile until you prove that you’ve been listening.
This language is a strict and unforgiving disciplinarian, but once you’ve earned its approval, you can pretty much blow up any goddamned bridge you want with unerring precision.
Ada is Minerva McGonagall, and if you mess around in its class, you will leave and not come back. You have been warned.
Bash
It’s easy to forget about Bash. It just works away in the background there doing a thousand different tasks in a thousand different places that no other language really wants to do. It’s kind of clunky and oafish, and it doesn’t really like change, but there’s a magnificent ecosystem out there that would devolve into chaos if it disappeared.
Bash is Rubeus Hagrid, the only one who knows where the food is for that SysV init script from 1989 that you didn’t even know still lived in your distro.
Windows Batch File
If Bash is Hagrid, then the Windows batch file syntax is Argus Filch.
INTERCAL
INTERCAL was intended to be completely different from all other computer languages. Common operations in other languages have cryptic and redundant syntax in INTERCAL. From the INTERCAL Reference Manual:
It is a well-known and oft-demonstrated fact that a person whose work is incomprehensible is held in high esteem.
The INTERCAL Reference Manual contains many paradoxical, nonsensical, or otherwise humorous instructions:
Caution! Under no circumstances confuse the mesh with the interleave operator, except under confusing circumstances!
Oh hey, it’s Luna Lovegood. But don’t worry. You’re just as sane as it is.
JavaScript
JavaScript made our websites do oh-so-1337 stuff in the late 90s. That really annoying scripting
language that we had to deal with to make web apps do anything useful, because
its parents at ECMA forced us to let it tag along.
But around about 2010, we started having conversations like this:
What are you using on the server?
JavaScript.
No, the server.
JavaScript.
Do you even understand the words I’m using?
Are we talking about the same language that used to cry when the other languages got on the train to go to the school for real languages? JavaScript is Neville Longbottom.
PHP
PHP is pure mediocrity from top to toe. It is. Yes, it is. No— no, shut up. It is. Even its most notable, most powerful acts are just… half-hearted crap done in unnecessarily obtuse ways. Just kill someone already FFS.
Some people thought that PHP could be improved by bringing it into C’s flock. Some thought it could be turned to greater deeds, and they’re dead now.
PHP might have some money behind it, but it’s not ambitious enough to be a real villain, and it’s too mediocre for any kind of redemption.
PHP is Draco Malfoy, trailed after by slack jawed knuckle-draggers who just keep getting hit on the head but don’t really know what else to hitch their wagon to.
Haskell
Haskell is kind of dismissed as too academic to be useful, but is actually pretty bloody powerful. Here is a language that is built upon the idea that a moment’s quiet reflection is considerably more efficient than hours of ineffective keyboard mashing.
Buuuut if you don’t actually do the keyboard mashing, no-one sees you as a hero (you know, of coding). If you solve a major problem by spending three days contemplating data types, everyone just kind of thanks you in a vaguely patronising way and then goes off to mash keys anyway, albeit now with slightly more of a clue.
So you roll your eyes, and sigh, and secretly continue to work on that stasis spell that will freeze everyone but you in time so you don’t have to deal with these idiots any more… but you’re still willing to roll up your sleeves and do the stupid kind of work anyway, because you don’t want to see your colleagues get their souls eaten more than is strictly necessary for character development.
Haskell is Hermione Granger, rolling her eyes at you discovering list comprehensions.
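(My example, not Hermione’s: here is the sort of thing being discovered, rendered in Python, the language doing the discovering. Haskell had these first, and does them lazily.)

```python
# The list comprehension you just "discovered": all Pythagorean triples
# with hypotenuse up to 20, in a single expression.
triples = [
    (a, b, c)
    for c in range(1, 21)
    for b in range(1, c + 1)
    for a in range(1, b + 1)
    if a * a + b * b == c * c
]

print(triples[:3])  # [(3, 4, 5), (6, 8, 10), (5, 12, 13)]
```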
Perl
Perl cops a lot of flak for a lot of things. From mean people, I mean, not me.
It’s nothing special, they say. It’s suited to the more… dreary kind of data
processing, they sneer. Maybe you should have spent some of those
$s on a
proper type system and gotten some class, they heckle.
If you’re feeling really nasty, you might even say it’s kind of a… homely-looking language. There, I said it.
I mean, who would ever want to program with Perl when there’s Python?
People did, though. They still do. It has loyal friends everywhere: programmers who have counted on it to help them with some huge variety of unpleasant, boring or outright terrifying tasks when every other language just happened to have better things to do.
Perl is Ronald Weasley. Sorry about the class remark by the way. PHP said it. Not me.
Python
Well, gee, everyone just loves this language! Except the jerks who don’t, but we all know who they secretly worship.
Python is smart enough, but not too academic. Python has some pretty influential people behind it, but is always helping out those who don’t have a lot of power. And it seems like every time you’re in trouble, Python just happens to have exactly the right magical doodad in its pocket to save your life.
(Python also seems to be locked into some utterly inexplicable enmity with PHP, with friends of each just sniping and spitting and snarking at each other when they could just be getting some work done. Who knows why.)
Python can be kind of overhyped sometimes, but it’s not its fault. And let’s face it… some days it really is just rescuing people left, right and centre.
But Python harbours a terrible secret. The reason that many of its spells (sorry, libraries) are so powerful is that buried underneath, in its most hidden depths, there lurks a sinister, terrible core of ANSI FLIPPIN' C.
“Why can’t I install this package on Windows,” you innocently wonder. “What’s this about MSVC and assignment from incompatible pointer types hey wait a second BY MERLIN’S BEARD NOOOOO” and then all you can see are visions of being a snake killing the people you love while you hear corrupting whispers in a language you wished you didn’t understand.
One day, however, Python will wake up in the universe’s most brightly lit train station. And it will look down, at the core of C mewling pathetically at its feet, and it will stomp the crap out of that thing.
Harry Potter is Python. Well, CPython, to begin with. But one day… it will be free of the corrupting influence at its core, and we will know only PyPy.
Well we all know who really kills off C, don’t we?