I fundamentally disagree with the "gun to the head" strategy.
One of the major projects I worked on was a virus genome analysis pipeline. Our initial funding was for single-segment virus analysis, but about a year into the project, our grant collaborators needed to demonstrate multi-segment virus support within two weeks. My PI and I agreed on a quick and dirty method that would produce the analyses needed and could be done in the allotted time. That piece of code was fundamental to how the whole pipeline worked, though, so as the pipeline grew, it took on the shape of that "gun to the head" decision, to the point where other, more essential features had to be delayed so I could come up with more workarounds.
There were clearly other issues at play here (scope creep and lack of separation of concerns were huge). My time at the lab came to a close, but if I had the chance to continue that project, I would start over from scratch rather than deal with the baggage that one "gun to the head" moment created.
I understand that it's a heuristic and not meant to be taken as 100% truth for every situation. I also understand that it's trying to avoid the "paralysis by analysis" that's so easy to fall into. I just question how useful it truly is as a heuristic, especially since it seems to go against the "write everything twice" heuristic presented in the rest of the piece.
I'd say it goes with the "write everything twice" heuristic! If you're in an environment where you can write things twice—you have trust and autonomy and aren't encumbered by process—then writing an initial version as fast as possible gets you started faster, lets you play with something concrete and leaves you more room for your second version.
My best projects were like that. I'd figure out something quick—some combination of reducing scope and doing "things that don't scale"—then spend time refining the conceptual design and interfaces, and finally rewrite the initial piece based on that new design. This can absolutely work better than just trying to write something "good" the first time around, but it looks wasteful to somebody superficially tracking individual "tasks" you're working on.
> I just question how useful it truly is as a heuristic
I think the author is presenting them as analytical tools that might or might not be useful depending on the situation.
Very often, when you're faced with a difficult problem, it's hard to know where to start attacking it. Any idea, even a simplistic and wrong one, can be useful for gaining insight into what is going to work and why; even just refuting the original idea with a clear counterargument might suggest alternative avenues.
OT: this is IMO part of the reason why people like LLMs so much. Maybe the answer is trash, but articulating why it's trash gets you unstuck.
As the saying goes, there's nothing more permanent than a temporary solution. If you're going to do this, you either have to explicitly plan for the cost of rewriting it the "right" way after doing it the "wrong" way, or you have to accept that you probably won't revisit the "temporary" solution for a long time.