His point about references is no small thing. Other dynamic languages don't make users think much about the distinction between references and values at the syntax level. In Perl you needed the “->” arrow operator constantly, and you needed it exactly when references were involved. So getting at a hash inside an array (or vice versa) had its own syntax, distinct from reading a string out of a hash or an array.
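A rough sketch of the difference (a toy example; the variable names are just for illustration):

    my %config = ( name => 'widget' );          # a plain hash
    my @rows   = ( { id => 1 }, { id => 2 } );  # an array of hash references

    my $name = $config{name};     # plain hash: no arrow needed
    my $id   = $rows[0]->{id};    # hashref inside an array: arrow required
    my $same = $rows[0]{id};      # ...except the arrow may be omitted between subscripts, one more rule to remember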
It also had bolted-on, awkward OO on top of bolted-on, awkward parameter passing: you literally had to shift “self” (or “this”) off a magic array variable (@_).
By default it wouldn't complain if you read from an undeclared variable, used one in a conditional, or assigned from one; you had to add “use strict;” for that. Which wasn't hard! But these awkward things piled up, a bunch of small cuts. Don't forget “use warnings;” too, yet another line to put at the top of every Perl file.
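To make those last two points concrete, here is a minimal sketch of the classic Perl 5 OO boilerplate (the package and method names are made up for illustration):

    package Widget;
    use strict;      # errors on undeclared variables
    use warnings;    # warnings for suspicious constructs, e.g. using undef in arithmetic

    sub new {
        my ( $class, %args ) = @_;       # every sub receives its arguments in @_
        my $self = { name => $args{name} };
        return bless $self, $class;      # bless a plain hashref into the class
    }

    sub name {
        my $self = shift;                # "self" is just the first element of @_
        return $self->{name};
    }

    1;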
To the extent its awkward syntax came out of aping shell and common Unix CLI tools, you could maybe see that as a cultural issue if you squint.
But any language in the mid-90s was infected with the “RTFM” priesthood vibe the author writes about, because the internet then was disproportionately populated by those sysop types, especially the part that could answer programming-language questions on Usenet, which is basically where you had to ask back then.
So, for example, Rails won for technical reasons: it is much more concise, with fewer footguns, than its Perl equivalents. I was actively coding web stuff in Perl when it came along and switched early. It wasn't a cultural thing; having choice in Perl was fine (and Ruby has sadly never grown much culture outside Rails, which it could really use). It probably did help that Rails came along in the mid-aughts, by which time you could ask questions on the web instead of Usenet. And that first Rails demo was a screencast video. So Ruby did end up with a less sysop-y culture, but that had more to do with the timing of its success than with the success itself.
    my ($f) = `fortune`;  # list context: assigns the first line of output to $f
    my $f   = `fortune`;  # scalar context: assigns all output to $f
Which allegedly got a HS kid in hot water[^1].

[^1]: "It's all about context" (2001): https://archive.ph/IB2kR (http://www.stonehenge.com/merlyn/UnixReview/col38.html)
[^1]: >>44790671
A shift to Python or Ruby is fundamentally a shift to a different set of core cognitive patterns. This influences how problems are solved and how sense is made of the world, with the programming languages being tools to facilitate and, more often than not, shepherd thought processes.
The culture shift we have seen with corporations and socialized practices for collaboration, coding conventions, and more coincides with the decline of a language that does in fact have a culture that demands you RTFM. Now the dominant culture in tech is one that either centralizes solutions to extract and rent-seek, or that pretends complexity and nuance do not exist so as to move as quickly as possible, externalizing the consequences until later.
If you've been on this forum for a while, what I am saying should seem familiar, because the foundations have already been laid out in "The Pervert's Guide to Computer Programming", which applies Lacanian psychoanalysis to cognitive patterns present in various languages[1][2]. This explains the so-called decline of Perl—many people still quietly use it in the background. It also explains the conflict between Rust and C culture.
As an aside, I created a tool that can use this analysis to help companies hire devs even if they use unorthodox languages like Zig or Nim. I also briefly explored exposing it as a SaaS to help HR make sense of this (since most HR generalists don't code and so have to go with their gut on interviews, which requires them to repeat what they have already seen). With that stated, I don't believe there is a large enough market for such a tool in this hiring economy. I could be wrong.
[1] [PDF] "The Pervert's Guide to Computer Programming" -- https://s3-us-west-2.amazonaws.com/vulk-blog/ThePervertsGuid...
[2] [YouTube, Vulk Coop] -- https://www.youtube.com/watch?v=mZyvIHYn2zk
Using venvs is trivial (and orders of magnitude more lightweight than a container). And virtually every popular package has a policy of supporting at least all currently supported Python versions with each new release.
You need to set up a venv because of how the language is designed, and how it has always worked since the beginning. Python doesn't accommodate multiple versions of a package in the same runtime environment, full stop. The syntax doesn't provide for version numbers on imports. Imports are cached by symbolic name and everyone is explicitly expected to rely on this for program correctness (i.e., your library can have global state and the client will get a singleton module object). People just didn't notice/care because the entire "ecosystem" concept didn't exist yet.
I have at least one local from-source build of every Python from 3.3-3.14 inclusive (plus 2.7); it's easy to do. But I have them explicitly for testing, not because using someone else's project forces me to. The ecosystem is just not like that unless perhaps you are specifically using some sort of PyTorch/CUDA/Tensorflow related stack.
> It's the standard now because system Python can't do it.
Your system Python absolutely can have packages installed into it. The restrictions are because your Linux distro wants to be able to manage the system environment. The system package manager shouldn't have to grok files that it didn't put there, and system tools shouldn't have to risk picking up a dependency you put there. Please read https://peps.python.org/pep-0668/, especially the motivation and rationale sections.
> major breakages in point version upgrades
I can think of exactly one: `async` becoming a keyword, which broke TensorFlow because it was using that name as a parameter. And they responded to that by introducing the concept of soft keywords. Beyond that, it's just not a thing for your code to become syntactically invalid or to change in semantics because of a 3.x point-version change. It's just the standard library that has changes or removals, and you can trivially fix those by vendoring the old code.
They have tried and succeeded: https://docs.activestate.com/activepython/2.7/
I still consider the result "broken".
It's actually somehow even worse than this. The Perl type system is honestly ridiculous. Most veteran Perl developers don't fully understand it.
https://blogs.perl.org/users/leon_timmermans/2025/02/a-deep-... is a good introduction.
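A small taste of why (a sketch, not a summary of the linked post): the sigil says what you are asking for, not what the variable is, the same name can denote several unrelated variables, and context decides what an expression yields.

    use strict;
    use warnings;

    my $thing = 'scalar';       # $thing, @thing and %thing are three distinct variables
    my @thing = ( 1, 2, 3 );
    my %thing = ( a => 1 );

    my $count   = @thing;       # scalar context: the array's length (3)
    my ($first) = @thing;       # list context: the first element (1)
    print "5 apples" + 3, "\n"; # numeric context: the string coerces to 5, prints 8 (with a warning)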
Also, I think Larry Wall's "Diligence, Patience, Humility"[0] is among my favourite articles about programming.
[0] https://www.oreilly.com/openbook/opensources/book/larry.html
There was strong cultural pressure to be able to write Perl in as few bytes as possible, ideally as a CLI one-liner. Books[1] were written on the topic.
[1] https://www.thriftbooks.com/w/perl-one-liners-130-programs-t...
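For the flavor of it, a couple of typical one-liners of the sort those books celebrated (the file names are placeholders):

    perl -pe 's/foo/bar/g' input.txt             # stream-edit every line and print the result
    perl -i.bak -ne 'print if /ERROR/' app.log   # rewrite app.log in place (keeping a .bak), keeping only ERROR lines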
That is just how I felt about Perl (4 years of full-time dev in the 2000s) and how I now feel about https://raku.org (aka Perl 6). Anyway, I tried to gather some fellow feelings here about 18 months ago:
https://rakujourney.wordpress.com/2024/05/22/perl-love-notes...
It is sad that Perl became so despised after the error of preannouncing a non-compatible upgrade. I understand that people couldn't wait. But Raku is here now and it is worth a second look imo.
Wikipedia says that this source [1] claims early versions of PHP were built on top of mod_perl, but I can't access the archive right now for some reason so I can't confirm.
[1] https://web.archive.org/web/20130625070202/http://www.theper...