so I really liked Zach's essay about pg and his takedown of Arc, and I had some thoughts as well.
I think pg's brevity thing is kind of weird, and it's unfortunate that it's found so much traction in programming circles. Obviously it reflects the time when people were horrified by Java's poor choices and verbosity, but that doesn't mean it should have carried on.
I don't find brevity/concision to be true first-order values. They ought to exist in service of other values, like "does this aid comprehension?" Certainly they can: APL is very intelligible to people who know APL, but APL is readable because it has a coherent design, not merely because it is short.
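To make that concrete, here's a contrived C sketch (the functions and names are invented for illustration, nothing to do with Zach or pg): both versions compute the same thing, and the shorter one is not the more comprehensible one.

```c
#include <stdio.h>

/* Terse: fewer characters, but you have to decode it symbol by symbol. */
int v(const char *s){int n=0;while(*s)n+=*s++=='.';return n>0&&n<4;}

/* Longer, but the intent is legible at a glance. */
int looks_like_dotted_name(const char *s) {
    int dots = 0;
    for (; *s != '\0'; s++) {
        if (*s == '.')
            dots++;
    }
    return dots > 0 && dots < 4;
}

int main(void) {
    const char *input = "a.b.c";
    /* Both agree on every input; only the reading experience differs. */
    printf("%d %d\n", v(input), looks_like_dotted_name(input));
    return 0;
}
```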
I don't think there is any evidence, none at all, that brevity of code contributes to longer-lasting or more impactful systems or languages. That points to brevity not being an indicator, leading or otherwise, of successful languages or technologies.
Brevity of code is not an engineering quality. It says nothing about fitness for purpose, or about the ability to compile for hardware, to run, or to handle load.
It's not a comprehension quality either. While it's certainly true that people are more likely to read things the shorter they are, I don't think there's any evidence that a 40kloc system is in any way provably "more readable" than an equivalent 50kloc system.
Certainly some of the slowest, buggiest, least clear systems I've seen have been in Ruby, a language whose community is definitely on the brevity train. I don't say this to deride Ruby, but to point out that nothing follows from brevity. It is a contextless quality.
So far, the things that have been successful in information technology have been things that (in no order of importance): 1. are stable, 2. are portable, 3. can attract and sustain communities, and 4. can actually solve a problem.
I find pg's and Arc's fascination with brevity to be at odds with the successful technologies that embody the qualities in the previous tweet, for example C/C++ and Excel.
C/C++ and Excel have all of these qualities. They have survived multiple hardware generations over decades, they have large, active communities, and they are obsessed with stability and effectiveness. They'll run on basically anything.
The C/C++ people are obsessed with hardware, and for good reason. All software is hosted on hardware. It's inescapable, regardless of how many layers of VM you have pretending that the machine has unlimited memory or disk.
This matters because they have gotten good at getting their stuff running on new hardware *effectively*. They are well-positioned to advance into any new territory that appears (Apple Silicon, etc.). For communities that don't think about hardware, that move is far more daunting.
Hardware is important, especially if you care about writing code that people directly interact with, like an app, a device, or an appliance. The last 10 years are evidence of what happens to human interface quality when you go "I don't care, the JIT will make it fast."
I'm not saying everyone has to become an electrical engineer or compiler engineer (I'm certainly not), but willful ignorance of, and scorn for, hardware knowledge in favor of beautiful abstract symbology is a red flag.
I wonder if there is anybody out there who thinks Arc will outlast C/C++. If so, I'll bet you $1,000 that no one will be running Arc (or Bel) in 2103 (or 2050 for that matter), and that many, many people will be running C/C++ code.
w/r/t Excel: Excel has a hilarious degree of stability and compatibility that allows people to build on top of it, so they can focus on their domain. Things that let people work on their domain will survive; things that don't will get weeded out.
Further, Excel has a large, active community. Lots of experts, but also lots of new blood ready to solve a problem they're having by using a computer in a way that's new to them. Excel has a compelling story around using spreadsheets as a device to solve problems.
When I hear about Arc, or Bel, or whatever, I think "what are they for?" If there's no story that allows people to see themselves solving their own problems with this thing, then it's dead. Simple as that.
C has this. Excel has this. Even more obscure things like Clojure or Rust have this (to their credit). Brevity isn't a first-order problem that anybody actually has. Nobody wakes up in the morning and is concerned about the length of their source code files. It's not a thing.
This kind of shoddy design thinking is ersatz minimalism: people will productively use a noisy, rough-riding car that gets them somewhere; they won't put up with a car that doesn't have wheels because the creator thought wheels were gauche and you ought to build them yourself.
To awkwardly wrap this up, I guess my final reflections are these: all technology is situated, hardware matters, and self-importance is no predictor of actual importance.