https://mastodon.ar.al/@aral/114160190826192080
"Coding is like taking a lump of clay and slowly working it into the thing you want it to become. It is this process, and your intimacy with the medium and the materials you’re shaping, that teaches you about what you’re making – its qualities, tolerances, and limits – even as you make it. You know the least about what you’re making the moment before you actually start making it. That’s when you think you know what you want to make. The process, which is an iterative one, is what leads you towards understanding what you actually want to make, whether you were aware of it or not at the beginning. Design is not merely about solving problems; it’s about discovering what the right problem to solve is and then solving it. Too often we fail not because we didn’t solve a problem well but because we solved the wrong problem.
When you skip the process of creation you trade the thing you could have learned to make for the simulacrum of the thing you thought you wanted to make. Being handed a baked and glazed artefact that approximates what you thought you wanted to make removes the very human element of discovery and learning that’s at the heart of any authentic practice of creation. Where you know everything about the thing you shaped into being from when it was just a lump of clay, you know nothing about the image of the thing you received for your penny from the vending machine."
That said, the framing feels a bit too poetic for engineering. Software isn't only craft; it's also operations, risk, time, budget, compliance, incident response, and maintenance by people who weren't in the room for the "lump of clay" moment. Those constraints don't make the work less human; they just mean "authentic creation" isn't the goal by itself.
For me the takeaway is: pursue excellence, but treat learning as a means to reliability and outcomes. Tools (including LLMs) are fine with guardrails: clear constraints up front and rigorous review/testing after, so we ship systems we can reason about, operate, and evolve, not just artefacts that feel handcrafted.
I wholeheartedly disagree, but I tend to believe that's going to be highly dependent on what type of developer a person is: one who leans towards the craftsmanship side, or one who leans towards the deliverables side. It will also be impacted by the type of development they are exposed to. Are they in an environment where they can even have a "lump of clay" moment, or is all their time spent on systems that are too old/archaic/complex/whatever to ever really absorb the essence of the problem the code is addressing?
The OP's quote is exactly how I feel about software. I often don't know exactly what I'm going to build. I start with a general idea and it morphs towards excellence through iteration. My idea changes, and is sharpened, as it repeatedly runs into reality. And by that I mean it's sharpened as I write and refactor the code.
I personally don't have the same ability to do that with code review because the amount of time I spend reviewing/absorbing the solution isn't sufficient to really get to know the problem space or the code.