When I was first learning programming - not counting TI BASIC when I was 13 - the more experienced developers I was learning from stressed the importance of good names for things like variables, methods, classes, and modules. I think they were right to an extent, but as I've built and maintained more systems, I've grown less and less enamored with the value of "good names" in software systems relative to other information channels.
For one example, take `i`, `j`, and `k`: those names say nothing about `for` loops, or indexes, or counters. They aren't actually "good names" in any way. They don't say anything about your intent when writing this code or the work your `for` loop happens to be doing. They're just the names we happened to give to that concept in that situation, and they stuck. Maybe they're derived from math in some way, but at this point, their origin matters less than their widespread use in the family of languages that have `for` loops.
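To make that concrete, here's a throwaway TypeScript sketch (the function and its purpose are invented for illustration): renaming the counter changes nothing about what the loop does or why.

```typescript
// A hypothetical loop: the counter's name tells you nothing about
// the loop's purpose (summing squares). It's pure convention.
function sumSquares(n: number): number {
  let total = 0;
  for (let i = 0; i < n; i++) {
    total += i * i; // rename `i` to `index` or `counter`: same work, same meaning
  }
  return total;
}
```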
For another example, when I was working with Ruby at my first real programming job, I remember being incredibly frustrated at how hard it was to tell what anything in our Ruby codebases was or did. Objects and values and classes all had names, and they lived in modules that also had names, but I can still remember the daily frustration of trying to figure out what anything was, what it could do, or how it fit together with anything else in the system. The names just weren't sufficient to determine the capabilities of the objects without reading the various class and method implementations in full. (This was exacerbated by the predominant Ruby tooling at the time, which was almost always some combination of Sublime Text 2, Vim, and grep, but that's another blog post for another time.)
And maybe this is a sign of seniority (probably not) or burnt-out-ness (probably), but when I'm attempting to solve a specific problem, I don't want to read the class and method implementations of a bunch of unrelated features if I don't have to. In the systems that I have seen, the challenge to doing effective work is often not one of construction but reduction, i.e. the act of actually "writing" the code is less important than identifying and understanding the code that is relevant to you in a sea of code that isn't. The likelihood that any given module or function is relevant to the problem you happen to be working on decreases substantially with the size of the system, its longevity, and the number of people working on it.
And names just don't help that much with this kind of filtration! Names don't have anything to say about the importance of one module vs. another, or relationships between modules, or how to correctly use a subsystem. They often don't say anything about history, except in ad-hoc, useless ways like "LegacyQueue" or "APIv3". What a module does, and what it is related to, almost always matters more than what name its author has given it. That so many languages have such poor tools for identifying and presenting what code modules do does not refute this, and in fact confirms it, as in these languages you see programmers really leaning in to naming, as it is one of the few levers they can actually pull in lieu of having real tooling. You can't pull a lever you don't have, so you pull the ones you do, and unfortunately in many languages, names are all you've got.
One obvious mechanism to reach beyond names is types. Crucially, types aren't really syntax in the way that identifiers are. You can name a type, but in most languages that are worth a damn, types encode information at a deeper level than syntax. You can ask the compiler the type of something. You can ask the compiler what that type allows you to do, and how it interacts with other types. You can programmatically manipulate instances of that type. Types encode richer information than that which is conveyed by a function parameter being called `user`. User of what? How? In what context? What can this user do?
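As a sketch of that difference in TypeScript (the `User` shape and `editLink` function here are invented for illustration): the type, not the parameter's name, is what answers those questions.

```typescript
// Hypothetical shape: the type answers the questions the name can't.
interface User {
  id: number;
  email: string;
  canEdit: (docId: string) => boolean;
}

// The compiler guarantees `user` carries an email and a capability check;
// you never have to read whatever code constructed it to find that out.
function editLink(user: User, docId: string): string | null {
  return user.canEdit(docId) ? `/docs/${docId}/edit?as=${user.id}` : null;
}
```

The point isn't this particular interface; it's that the answer to "what can this value do?" lives in a queryable artifact the compiler can verify, rather than in a name the author happened to pick.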
Now, I'm not a types maximalist - though I do like them for some things and Rust is a favorite language of mine - but I think types are a really great tool in a lot of cases. I'm encouraged that the industry seems to agree with me, given the rising popularity of things like TypeScript and mypy. But there are of course other ways to gain greater understanding of our systems: debuggers, REPLs, live reloading, image-based environments like Smalltalk's, and many others. These are all good, but I don't think we're nearly advanced enough. Too many programmers still think programming is mostly naming things in text files, and we really have got to move beyond that if we hope to match the intelligibility and reliability of other, more mature disciplines.