Rob Pike's 5 Rules of Programming

(cs.unc.edu)

258 points | by vismit2000 3 hours ago

27 comments

  • anymouse123456 47 minutes ago
    There are very few phrases in all of history that have done more damage to the project of software development than:

    "Premature optimization is the root of all evil."

    First, let's not besmirch the good name of Tony Hoare. The quote is from Donald Knuth, and the missing context is essential.

    From his 1974 paper, "Structured Programming with go to Statements":

    "Programmers waste enormous amounts of time thinking about, or worrying about, the speed of noncritical parts of their programs, and these attempts at efficiency actually have a strong negative impact when debugging and maintenance are considered. We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil. Yet we should not pass up our opportunities in that critical 3%."

    He was talking about using go to statements. He was talking about making software much harder to reason about in the name of micro-optimizations. He assumed (incorrectly) that we would respect the machines our software runs on.

    Multiple generations of programmers have now been raised to believe that brutally inefficient, bloated, and slow software is just fine. There is no limit to the amount of boilerplate and indirection a computer can be forced to execute. There is no ceiling to the crystalline abstractions emerging from these geniuses. There is no amount of time too long for a JVM to spend starting.

    I worked at Google many years ago. I have lived the absolute nightmares that evolve from the willful misunderstanding of this quote.

    No thank you. Never again.

    I have committed these sins more than any other, and I'm mad as hell about it.

    • sph 15 minutes ago
      Another one from my personal experience: apply DRY principles (don't repeat yourself) the third time you need something. Or in other words: you're allowed to copy-and-paste the same piece of code in two different places.

      Far too often we generalise a piece of logic that we need in one or two places, making things more complicated for ourselves whenever they inevitably start to differ. And chances are very slim we will actually need it more than twice.

      Premature generalisation is the most common mistake that separates a junior developer from an experienced one.
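      As a hypothetical sketch (the formatting helpers here are made up for illustration, not from any real codebase), the rule of three might look like:

```python
# Hypothetical illustration of the rule of three: tolerate duplication
# twice; extract a shared helper only on the third occurrence.

# First two call sites: plain copy-paste, each free to drift independently.
def format_user(name, email):
    return f"{name.strip().title()} <{email.lower()}>"

def format_admin(name, email):
    return f"{name.strip().title()} <{email.lower()}> [admin]"

# The third occurrence is the signal to generalise.
def format_contact(name, email, suffix=""):
    base = f"{name.strip().title()} <{email.lower()}>"
    return f"{base} {suffix}".rstrip()

print(format_contact("ada lovelace", "ADA@EXAMPLE.COM", "[admin]"))
# Ada Lovelace <ada@example.com> [admin]
```

      Until the third use appears, the two copies cost almost nothing and stay free to diverge.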

      • busfahrer 9 minutes ago
        Agreed, I think even Carmack advocates this rule
    • pjc50 1 minute ago
      Slow code is more of a project management problem. Features are important and visible on the roadmap. Performance usually isn't until it hits "unacceptable", which may take a while to feed back. That's all it is.

      (AI will probably make this worse as well, having a bloat tendency all of its own)

    • dwb 4 minutes ago
      Totally agree. I’ve seen that quote used to justify wilfully ignoring basic performance techniques. Then people are surprised when the app is creaking exactly due to the lack of care taken earlier. I would tend to argue the other way most of the time: a little performance consideration goes a long way!

      Maybe I’ve had an unrepresentative career, but I’ve never worked anywhere where there’s much time to fiddle with performance optimisations, let alone those that make the code/system significantly harder to understand. I expect that’s true of most people working in mainstream tech companies of the last twenty years or so. And so that quote is basically never applicable.

    • devnullbrain 38 minutes ago
      > and the missing context is essential.

      Oh yes, I'd recommend everyone who uses the phrase reads the rest of the paper to see the kinds of optimisations that Knuth considers justified. For example, optimising memory accesses in quicksort.

      • kalaksi 27 minutes ago
        This shows how hard it is to create a generalized and simple rule regarding programming. Context is everything and a lot is relative and subjective.

        Tips like "don't try to write smart code" are often repeated but useless (not to mention that "smart" here means over-engineered or overly complex, not smart).

      • anymouse123456 30 minutes ago
        Exactly!

        I wish Knuth would come out and publicly chastise the many decades of abuse this quote has enabled.

    • twoodfin 19 minutes ago
      Ignoring optimization opportunities until you see the profile only works when you actually profile!

      Profiling never achieved its place in most developers’ core loop the way that compiling, linting, or unit testing did.

      How many real CI/CD pipelines spit out flame graphs alongside test results?
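    For what it's worth, the standard library makes a bare-bones profile step cheap. A sketch, where hot_path is a made-up stand-in for the code under test:

```python
# Hypothetical sketch: running a profiler as routinely as a unit test.
# cProfile ships with Python's standard library, so a "profile step"
# in CI can be this small.
import cProfile
import io
import pstats

def hot_path(n):
    # Stand-in for the code under test.
    return sum(i * i for i in range(n))

profiler = cProfile.Profile()
profiler.enable()
hot_path(100_000)
profiler.disable()

stream = io.StringIO()
stats = pstats.Stats(profiler, stream=stream)
stats.sort_stats("cumulative").print_stats(5)  # top 5 entries by cumulative time

# A CI job could archive, diff, or threshold this text instead of printing it.
print(stream.getvalue().strip().splitlines()[0])
```

    Whether anyone looks at the output is, of course, the actual problem.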

    • bko 21 minutes ago
      > Multiple generations of programmers have now been raised to believe that brutally inefficient, bloated, and slow software is just fine. There is no limit to the amount of boilerplate and indirection a computer can be forced to execute. There is no ceiling to the crystalline abstractions emerging from these geniuses. There is no amount of time too long for a JVM to spend starting.

      I think that's due to people doing premature optimization! If people took the quote to heart, they would be less inclined to increase the amount of boilerplate and indirection.

    • YesThatTom2 19 minutes ago
      I hear you, friend!

      While you were seeing those problems with Java at Google, I was seeing it with Python.

      So many levels of indirection. Holy cow! So many unneeded superclasses and mixins! You can’t reason about code if the indirection is deeper than the human mind can grasp.

      There was also a belief that list comprehensions were magically better somehow and would expand to 10-line monstrosities of unreadable code when a nested for loop would have been more readable and just as fast but because list comprehensions were fetishized nobody would stop at their natural readability limits. The result was like reading the run-on sentence you just suffered through.
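      For contrast, a hypothetical side-by-side (the data is made up; both versions compute the same thing):

```python
# Hypothetical side-by-side: the same flatten-and-filter logic as a dense
# comprehension and as plain loops. Same complexity; only readability differs.
matrix = [[1, -2, 3], [-4, 5, -6], [7, 8, -9]]

# The one-liner that tends to keep growing:
dense = [x * x for row in matrix for x in row if x > 0]

# The version a maintainer can step through:
readable = []
for row in matrix:
    for x in row:
        if x > 0:
            readable.append(x * x)

assert dense == readable
print(readable)  # [1, 9, 25, 49, 64]
```

      At this size either is fine; the trouble starts when the comprehension grows past its natural readability limit.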

    • frereubu 33 minutes ago
      Totally agree. Out of this context, the word "premature" can mean too many things.
    • aljgz 32 minutes ago
      In all honesty, this is one of the less abused quotes, and I have seen more benefit from it than harm.

      Like you, I've seen people produce a lot of slow code, but it's mostly been from people who would have a really hard time writing faster code that's less wrong.

      I hate slow software, but I'd pick it anytime over bogus software. Also, generally, it's easier to fix performance problems than incorrect behavior, especially so when the error has created data that's stored somewhere we might not have access to. But even more so, when the harm has reached the real world.

      • anymouse123456 28 minutes ago
        I don't believe there is any tension at all between fast and simple software.

        We can and should have both.

        This is a fraud, made up by midwits to justify their leaning towers of abstraction.

        • embedding-shape 26 minutes ago
          User-facing, sure, nothing stopping us from doing "simple and fast" software. But when it comes to the code, design and architecture, "simple" is often at odds with "fast", and also "secure". Once you need something to be fast and secure, it often leads to a less simple design, because now you care about more things, it's kind of hard to avoid.
      • gspr 18 minutes ago
        > I have seen more benefit from it than harm.

        Same. I, too, am sick of bloated code. But I use the quote as a reminder to myself: "look, the fact that you could spend the rest of the workday making this function run in linear instead of quadratic time doesn't mean you should – you have so many other tasks to tackle that it's better that you leave the suboptimal-but-obviously-correct implementation of this one little piece as-is for now, and return to it later if you need to".
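        A hypothetical version of that trade-off, with made-up dedupe functions standing in for "this one little piece":

```python
# The obviously-correct quadratic implementation you leave in place,
# next to the linear rewrite you return to if profiling says it matters.
def dedupe_quadratic(items):
    # O(n^2): `in` scans the result list, but correctness is self-evident.
    result = []
    for item in items:
        if item not in result:
            result.append(item)
    return result

def dedupe_linear(items):
    # O(n): an auxiliary set makes each membership check constant-time.
    seen = set()
    result = []
    for item in items:
        if item not in seen:
            seen.add(item)
            result.append(item)
    return result

print(dedupe_quadratic([3, 1, 3, 2, 1]))  # [3, 1, 2]
print(dedupe_linear([3, 1, 3, 2, 1]))     # [3, 1, 2]
```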

  • embedding-shape 2 hours ago
    "Epigrams in Programming" by Alan J. Perlis has a lot more, if you like short snippets of wisdom :) https://www.cs.yale.edu/homes/perlis-alan/quotes.html

    > Rule 5. Data dominates. If you've chosen the right data structures and organized things well, the algorithms will almost always be self-evident. Data structures, not algorithms, are central to programming.

    Always preferred Perlis' version, which might be slightly over-used in functional programming to justify all kinds of hijinks, but with some nuance works out really well in practice:

    > 9. It is better to have 100 functions operate on one data structure than 10 functions on 10 data structures.
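    A hypothetical sketch of what that epigram looks like in practice (the record shape and functions are invented for illustration):

```python
# Many small functions sharing one plain data structure
# (a dict per record) rather than one class per concept.
records = [
    {"name": "alice", "score": 91},
    {"name": "bob", "score": 74},
]

def top(rs):
    return max(rs, key=lambda r: r["score"])

def names(rs):
    return [r["name"] for r in rs]

def curve(rs, bonus):
    return [{**r, "score": r["score"] + bonus} for r in rs]

# Every new function composes with every existing one,
# because they all speak the same structure.
print(names(curve(records, 5)))  # ['alice', 'bob']
print(top(records)["name"])      # alice
```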

    • rsav 2 hours ago
      There's also:

      >I will, in fact, claim that the difference between a bad programmer and a good one is whether he considers his code or his data structures more important. Bad programmers worry about the code. Good programmers worry about data structures and their relationships.

      -- Linus Torvalds

      • mikepurvis 52 minutes ago
        I think this is sometimes a barrier to getting started for me. I know that I need to explore the data structure design in the context of the code that will interact with it, and some of that code will be thrown out as the data structure becomes more clear, but still it can be hard to get off the ground when my gut instinct is that the data design isn't right.

        This kind of exploration can be a really positive use case for AI I think, like show me a sketch of this design vs that design and let's compare them together.

        • ignoramous 43 minutes ago
          > This kind of exploration can be a really positive use case for AI I think

          Not sure if SoTA codegen models are capable of navigating design space and coming up with optimal solutions. Maybe specialized models, if there are any (like DeepMind's Sec-Gemini for cybersecurity), might?

          I reckon a programmer who has already learnt about or explored the design space will be able to prompt more pointedly and evaluate the output qualitatively.

          > sometimes a barrier to getting started for me

          Plenty of great books on the topic (:

          Algorithms + Data Structures = Programs (1976), https://en.wikipedia.org/wiki/Algorithms_%2B_Data_Structures...

          • mikepurvis 26 minutes ago
            Yeah, key word is exploration. It's not "hey Claude, write the design doc for me" but rather: here are two possible directions for how to structure my solution; help me sketch each out a bit further so that I can get a better sense of what roadblocks I may hit 50-100 hours into implementation, when the cost of changing course is far greater.
    • Intermernet 2 hours ago
      I believe the actual quote is:

      "Show me your flowchart and conceal your tables, and I shall continue to be mystified. Show me your tables, and I won't usually need your flowchart; it'll be obvious." -- Fred Brooks, The Mythical Man Month (1975)

      • bfivyvysj 2 hours ago
        This is the biggest issue I see with AI-driven development. The data structures are incredibly naive. Yes, it's easy to steer them in a different direction, but that comes at a long-term cost. The further you move from naive, the more often you will need to re-steer downstream, and no amount of context management will help you: it is fighting against the literal mean.
        • Intermernet 1 hour ago
          Naive doesn't mean bad. 99% of software can be written with well-understood, well-documented data structures. One of the problems with AI is that it allows people to create software without understanding the trade-offs of certain data structures, algorithms and more fundamental hardware management strategies.

          You don't need to be able to pass a leet code interview, but you should know about big O complexity, you should be able to work out if a linked list is better than an array, you should be able to program a trie, and you should be at least aware of concepts like cache coherence / locality. You don't need to be an expert, but these are realities of the way software and hardware work. They're also not super complex to gain a working knowledge of, and various LLMs are probably a really good way to gain that knowledge.
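          Since the trie came up, a minimal sketch of one (nested dicts, with an assumed sentinel key marking word boundaries; names are mine):

```python
# A minimal trie built from nested dicts.
END = "$"  # assumed sentinel; any key that can't collide with a character works

def trie_insert(root, word):
    node = root
    for ch in word:
        node = node.setdefault(ch, {})
    node[END] = True

def trie_contains(root, word):
    node = root
    for ch in word:
        if ch not in node:
            return False
        node = node[ch]
    return END in node

root = {}
for w in ["go", "gopher", "grep"]:
    trie_insert(root, w)

print(trie_contains(root, "gopher"))  # True
print(trie_contains(root, "gop"))     # False (a prefix, not an inserted word)
```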

        • dotancohen 1 hour ago
          Then don't let the AI write the data structures. I don't. I usually don't even let the AI write the class or method names. I give it a skeleton application and let it fill in the code. Works great, and I retain knowledge of how the application works.
        • andsoitis 1 hour ago
          > This is the biggest issue I see with AI driven development. The data structures are incredibly naive.

          Bill Gates, for example, always advocated for thinking through the entire program design and data structures before writing any code, emphasizing that structure is crucial to success.

          • neocron 1 hour ago
            Ah Bill Gates, the epitome of good software
            • andsoitis 1 hour ago
              > Ah Bill Gates, the epitome of good software

              While developing Altair BASIC, his choice of data structures and algorithms enabled him to fit the code into just 4 kilobytes.

            • dotancohen 1 hour ago
              Yes, actually. Gates wrote great software.

              Microsoft is another story.

              • jll29 36 minutes ago
                And Paul Allen wrote a whole Altair emulator so that they could use an (academic) Harvard computer for their little (commercial) project and test/run Bill Gates' BASIC interpreter on it.
    • tangus 52 minutes ago
      Aren't they basically saying opposite things? Perlis is saying "don't choose the right data structure, shoehorn your data into the most popular one". This advice might have made sense before generic programming was widespread; I think it's obsolete.
      • Rygian 23 minutes ago
        Pike: strongly typed logic is great!

        Perlis: stringly typed logic is great!

    • 0xpgm 1 hour ago
      Reminded me of this thread between Alan Kay and Rich Hickey where Alan Kay thinks "data" is a bad idea.

      My interpretation of his point of view is that what you need is a process/interpreter/live object that 'explains' the data.

      https://news.ycombinator.com/item?id=11945722

      EDIT: He writes more about it in Quora. In brief, he says it is 'meaning', not 'data' that is central to programming.

      https://qr.ae/pCVB9m

      • johnmaguire 7 minutes ago
        Hm, not sure. Data on its own (say, a string of numbers) might be meaningless - but structured data? Sure, there may be ambiguity but well-structured data generally ought to have a clear/obvious interpretation. This is the whole idea of nailing your data structures.
      • christophilus 40 minutes ago
        I’m with Rich Hickey on this one, though I generally prefer my data be statically typed.
    • alberto-m 1 hour ago
      This quote from “Dive into Python”, which I read as a fresh graduate, was one of the most impactful lines I ever read in a programming book.

      > Busywork code is not important. Data is important. And data is not difficult. It's only data. If you have too much, filter it. If it's not what you want, map it. Focus on the data; leave the busywork behind.
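      Taken literally (with a made-up readings list), the quote is almost executable as written:

```python
# Too much data? Filter it. Not what you want? Map it.
readings = [3.2, -1.0, 7.5, 0.0, 12.9]

# Too much? Filter it.
positive = [r for r in readings if r > 0]

# Not what you want? Map it.
rounded = [round(r) for r in positive]

print(rounded)  # [3, 8, 13]
```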

    • mchaver 1 hour ago
      I find languages like Haskell, ReScript/OCaml to work really well for CRUD applications because they push you to think about your data and types first. Then you think about the transformations you want to make on the data via functions. When looking at new code I usually look for the types first, specifically what is getting stored and read.
      • embedding-shape 1 hour ago
        Similarly, that approach works really well in Clojure too, albeit with a lot less concern for types, but the "data and data structures first" principle is widespread in the ecosystem.
    • dcuthbertson 45 minutes ago
      But doesn't No. 2 directly conflict with Pike's 5th rule? It seems to me these are all aphorisms that have to be taken with a grain of salt.

      > 2. Functions delay binding; data structures induce binding. Moral: Structure data late in the programming process.

    • TYPE_FASTER 52 minutes ago
      > Rule 5. Data dominates. If you've chosen the right data structures and organized things well, the algorithms will almost always be self-evident. Data structures, not algorithms, are central to programming.

      If I have learned one thing in my 30-40 years spent writing code, it is this.

    • linhns 1 hour ago
      Nice to see Perlis mentioned once in a while. Reading SICP again, still learning new things.
    • JanisErdmanis 1 hour ago
      With 100 functions and one data structure, it is almost like programming with global variables, where a new instance is equivalent to a new process. Doesn’t seem like a good rule to follow.
      • embedding-shape 37 minutes ago
        The scope of where that data structure or functions are available is a different concern though, "100 functions + 1 data structure" doesn't require globals or private, it's a separate thing.
    • Hendrikto 1 hour ago
      I feel like these are far more vague and less actionable than the 5 Pike rules.
    • DaleBiagio 1 hour ago
      " 9. It is better to have 100 functions operate on one data structure than 10 functions on 10 data structures."

      That's great

    • bandrami 1 hour ago
      Also basically everything DHH ever said (I stopped using Rails 15 years ago but just defining data relationships in YAML and typing a single command to get a functioning website and database was in fact pretty cool in the oughts).
    • mpalmer 2 hours ago
      Was the "J" short for "Cassandra"?

          When someone says "I want a programming language in which I need only say what I wish done," give him a lollipop.
    • Pxtl 1 hour ago
      As much as relational DBs have held back enterprise software for a very long time by being so conservative in their development, the fact that they force you to put this relationship absolutely front-of-mind is excellent.
      • embedding-shape 50 minutes ago
        I'd personally consider "persistence" AKA "how to store shit" to be a very different concern compared to the data structures that you use in the program. Ideally, your design shouldn't care about how things are stored, unless there is a particular concern for how fast things read/write.
    • mosura 2 hours ago
      Perlis is just wrong in that way academics so often are.

      Pike is right.

      • Intermernet 1 hour ago
        Hang on, they mostly agree with each other. I've spoken to Rob Pike a few times and I never heard him call out Perlis as being wrong. On this particular point, Perlis and Pike are both extending an existing idea put forward by Fred Brooks.
        • mosura 1 hour ago
          Perlis absolutely is not saying the same thing, and as the commenter notes the functional community interpret it in a particularly extreme way.

          I would guess Pike is simply wise enough not to get involved in such arguments.

      • jacquesm 1 hour ago
        Perlis is right in the way that academics so often are and Pike is right in the way that practitioners often are. They also happen to be in rough agreement on this, unsurprisingly so.
      • hrmtst93837 1 hour ago
        Treating either as gospel is lazy: Perlis was pushing back on dogma and Pike on theory, while legacy code makes both look cleaner on paper.
      • AnimalMuppet 1 hour ago
        Could you be more specific?
        • mosura 58 minutes ago
          Promoting the idea of one data structure with many functions contradicts:

          “If you've chosen the right data structures and organized things well, the algorithms will almost always be self-evident.”

          And:

          “Use simple algorithms as well as simple data structures.”

          A data structure general enough to solve enough problems to be meaningful will either be poorly suited to some problems or have complex algorithms for those problems, or both.

          There are reasons we don’t all use graph databases or triple stores, and rely on abstractions over our byte arrays.

  • dasil003 15 minutes ago
    These rules aged well overall. The only change I would make these days is to invert the order.

    Number 5 is timeless and relevant at all scales; as code iterations have gotten faster and faster, data is all the more relevant. Numbers 4 and 3 have shifted a bit: data sizes and performance have ballooned, so algorithm overhead isn't quite as big a concern, but the simplicity argument is as relevant as ever. Numbers 2 and 1, while still true (Amdahl's law is a mathematical truth, after all), are also clearly a product of their time: the hard constraints programmers had to deal with back then, and the shallowness of the stack. Still good wisdom, though I think on the whole the majority of programmers are less concerned about performance than they should be, especially compared to 50 years ago.
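    Amdahl's law itself fits in one line, which is worth seeing since the comment leans on it:

```python
# Amdahl's law: overall speedup is bounded by the fraction
# of the program that cannot be sped up.
def amdahl_speedup(parallel_fraction, factor):
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / factor)

# Even an effectively unbounded speedup of 95% of the work
# caps the overall gain at 20x.
print(round(amdahl_speedup(0.95, 1e12), 2))  # 20.0
```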

  • CharlieDigital 2 hours ago
    I feel like 1 and 2 are only applicable in cases of novelty.

    The thing is, if you build enough of the same kinds of systems in the same kinds of domains, you can kinda tell where you should optimize ahead of time.

    Most of us tend to build the same kinds of systems and usually spend a career or a good chunk of our careers in a given domain. I feel like you can't really be considered a staff/principal if you can't already tell ahead of time where the perf bottleneck will be just on experience and intuition.

    • PaulKeeble 2 hours ago
      I feel like every time I have expected an area to be the major bottleneck, it has been. Sometimes some areas perform worse than I expected, usually something that hasn't been coded well, but generally it's pretty easy to spot the computationally heavy or remote-call-heavy areas well before you program them.

      I have several times done performance tests before starting a project to confirm it can be made fast enough to be viable, the entire approach can often shift depending on how quickly something can be done.

      • projektfu 1 hour ago
        It really depends on your requirements. C10k requires different design than a web server that sees a few requests per second at most, but the web might never have been invented if the focus was always on that level of optimization.
      • pydry 1 hour ago
        The number 1 issue I've experienced with poor programmers is a belief that they're special snowflakes who can anticipate the future.

        It's the same thing with programmers who believe in BDUF or disbelieve YAGNI - they design architectures for anticipated futures which do not materialize instead of evolving the architecture retrospectively in line with the future which did materialize.

        I think it's a natural human foible. Gambling, for instance, probably wouldn't exist if humans' gut instincts about their ability to predict the future were realistic by default.

        This is why no matter how many brilliant programmers scream YAGNI, don't do BDUF and don't prematurely optimize, there will always be some comment saying the equivalent of "akshually sometimes you should...", remembering that one time when they metaphorically rolled a double six and anticipated the necessary architecture correctly when it wasn't even necessary to do so.

        These programmers are all hopped up on a different kind of roulette these days...

        • rcxdude 45 minutes ago
          Aye. The number one way to make software amenable to future requirements is to keep it simple so that it's easy to change in future. Adding complexity for anticipated changes works against being able to support the unanticipated ones.
    • Bengalilol 1 hour ago
      > you can kinda tell where you should optimize ahead of time

      Rules are "kinda" made to be broken. Be free.

      I've been sticking to these rules (and will keep sticking to them) for as long as I can program (I've been doing it for the last 30 years).

      IMHO, you can feel that a bottleneck is likely to occur, but you definitely can't tell where, when, or how it will actually happen.

    • HunterWare 1 hour ago
      ROFL, I wish Pike had known what he was talking about. /s ;)
    • relaxing 2 hours ago
      Rob Pike wrote Unix and Golang, but sure, you’re built different.
      • Intermernet 1 hour ago
        Rob Pike is responsible for many cool things, but Unix isn't one of them. Go is a wonderful hybrid (with its own faults) of the schools of Thompson and Wirth, with a huge amount of Pike.

        If you'd said Plan 9 and UTF-8 I'd agree with you.

        • jacquesm 1 hour ago
          Rob Pike definitely wrote large chunks of Unix while at Bell Labs. It's wrong to say he wrote all of it like the GP did but it is also wrong to diminish his contributions.

          Unless you meant to imply that UNIX isn't cool.

      • my-next-account 1 hour ago
        Do you think Rob Pike ever decided that maybe what was done before isn't good enough? Stop putting artificial limits on your own competency.
      • andsoitis 1 hour ago
        > Rob Pike wrote Unix

        Unix was created by Ken Thompson and Dennis Ritchie at Bell Labs (AT&T) in 1969. Thompson wrote the initial version, and Ritchie later contributed significantly, including developing the C programming language, which Unix was subsequently rewritten in.

        • 9rx 1 hour ago
          Pike didn’t create Unix initially, but was a contributor to it. He, with a team, unquestionably wrote it.
          • andsoitis 1 hour ago
            > but was a contributor to it. He, with a team, unquestionably wrote it.

            contribute < wrote.

            His credits are huge, but I think saying he wrote Unix is misattribution.

            Credits include: Plan 9 (successor to Unix), the Unix window system, UTF-8 (maybe his most universally impactful contribution), articulation of the Unix philosophy, strings/grep/other tools, regular expressions, and C-successor work that ultimately led him to Go.

            • 9rx 35 minutes ago
              Are you under the impression he was, like, a hands-off project manager or something? His involvement was in writing it. Not singlehandedly, but certainly as part of a team. He unquestionably wrote it. He did not envision it like he did the other projects you mention.
  • ta20211004_1 36 minutes ago
    Can't agree more on 5. I've repeatedly found that any really tricky programming problem is (eventually) solved by iterative refinement of the data structures (and the APIs they expose / are associated with). When you get it right the control flow of a program becomes straightforward to reason about.

    To address our favorite topic: while I use LLMs to assist on coding tasks a lot, I think they're very weak at this. Claude is much more likely to suggest or expand complex control flow logic on small data types than it is to recognize and implement an opportunity to encapsulate ideas in composable chunks. And I don't buy the idea that this doesn't matter since most code will be produced and consumed by LLMs. The LLMs of today are much more effective on code bases that have already been thoughtfully designed. So are humans. Why would that change?

  • tasuki 38 minutes ago
    The first four are kind of related. For me the fifth is the important – and oft overlooked – one:

    > Data dominates. If you've chosen the right data structures and organized things well, the algorithms will almost always be self-evident. Data structures, not algorithms, are central to programming.

  • piranha 1 hour ago
    > Rule 5 is often shortened to "write stupid code that uses smart objects".

    This is probably the worst use of the word "shortened" ever, and it should be more like "mutilated"?

    • andsoitis 1 hour ago
      Syntactic sugar causes cancer of the semicolon.
      • franktankbank 1 hour ago
        Tide goes in tide goes out, can't explain that.
  • DaleBiagio 1 hour ago
    The attribution to Hoare is a common error — "Premature optimization is the root of all evil" first appeared in Knuth's 1974 paper "Structured Programming with go to Statements."

    Knuth later attributed it to Hoare, but Hoare said he had no recollection of it and suggested it might have been Dijkstra.

    Rule 5 aged the best. "Data dominates" is the lesson every senior engineer eventually learns the hard way.

    • YesThatTom2 11 minutes ago
      If Dijkstra blamed Knuth it would have been the best recursive joke ever.
    • zabzonk 49 minutes ago
      I've always thought it was Dijkstra - it even sounds Dijkstra-ish.
  • nateb2022 1 hour ago
    Previous discussion: https://news.ycombinator.com/item?id=15776124 (8 years ago, 18 comments)
  • keyle 1 hour ago
    Rule 5 is definitely king. Code acts on data, if the data is crap, you're already lost.

    edit: s/data/data structure/

    • andsoitis 1 hour ago
      … if the data structures are crap.

      Good software can handle crap data.

      • keyle 1 hour ago
        That is not what I meant. I meant crap data structures. Sorry it's late here.
  • tobwen 2 hours ago
    Added to AGENTS.md :)
    • wwweston 1 hour ago
      How good is your model at picking good data structures?

      There’s several orders of magnitude less available discussion of selecting data structures for problem domains than there is code.

      If the underlying information is implicit in the high volume of code available, then maybe the models are good at it, especially when driven by devs who can and will prompt in that direction. And that seems likely related to how much of that code was written by devs who focus on data.

      • skydhash 57 minutes ago
        > There’s several orders of magnitude less available discussion of selecting data structures for problem domains than there is code.

        I believe that’s what most algorithms books are about. And most OS books talk more about data than algorithms. And if you watch livestreams or read books on practical projects, you’ll see that a lot of refactoring is first selecting a data structure, then adapting the code around it. DDD is about data structure.

    • ozgrakkurt 1 hour ago
      Would be cool to see the live reaction of Rob Pike to this comment
  • kleiba 2 hours ago
    I believe the "premature evil" quote is by Knuth, not Hoare?!
    • swiftcoder 2 hours ago
      Potentially it's by either (or even both independently). Knuth originally attributed it to Hoare, but there's no paper trail to demonstrate Hoare actually coined it first
      • Intermernet 1 hour ago
        Turns out that premature attribution is actually the root of all evil...
      • Bengalilol 1 hour ago
        Every empirical programmer will, at some point, end up yelling it out loud (too).
  • igtztorrero 49 minutes ago
    Rule 4 is one I have always practiced and demanded of junior programmers: make algorithms and structures that are simple to understand for our main user, the one who will modify this code in the future.

    I believe that's why Golang is a very simple but powerful language.

  • Devasta 1 hour ago
    > "Premature optimization is the root of all evil."

    This axiom has caused far and away more damage to software development than premature optimization ever will.

    • gjadi 1 hour ago
      Because people only quote it partially.

      > We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil. Yet we should not pass up our opportunities in that critical 3%.

  • elcapitan 1 hour ago
    Meta: Love the simplicity of the page, no bullshit.

    Funny handwritten html artifact though:

        <title> <h1>Rob Pike's 5 Rules of Programming</h1> </title>
  • Mercuriusdream 2 hours ago
    Never expected it to be a single HTML file, so I'm kind of surprised, but it's straight to the point, to be honest.
  • bsenftner 2 hours ago
    Obvious. Why the elevation of the obvious?
    • DrScientist 2 hours ago
      I think for people starting out, rule 5 perhaps isn't that obvious.

      > Rule 5. Data dominates. If you've chosen the right data structures and organized things well, the algorithms will almost always be self-evident. Data structures, not algorithms, are central to programming.

      If you want to solve a problem, it's natural to think about the logic flow and the code that implements it first, with the data structures as an afterthought, whereas Rule 5 is spot on.

      Computers are machines that transform an input to an output.
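      A tiny illustration of Rule 5 (my own sketch, in Go since this is a Pike thread): once you commit to a map from word to count as the data structure, the word-counting algorithm is self-evident:

```go
package main

import (
	"fmt"
	"strings"
)

// countWords shows Rule 5 in miniature: with the data structure
// chosen (a map from word to count), the loop writes itself.
func countWords(text string) map[string]int {
	counts := make(map[string]int)
	for _, w := range strings.Fields(strings.ToLower(text)) {
		counts[w]++
	}
	return counts
}

func main() {
	fmt.Println(countWords("the quick fox jumps over the lazy dog"))
}
```

      Try designing the same program starting from the control flow instead and the code gets longer, not shorter.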

      • mosura 2 hours ago
        > If you want to solve a problem, it's natural to think about the logic flow and the code that implements it first, with the data structures as an afterthought, whereas Rule 5 is spot on.

        It is?

        How can you conceive of a precise idea of how to solve a problem without a similarly precise idea of how you intend to represent the information fundamental to it? They are inseparable.

        • DrScientist 1 hour ago
          Obviously they are linked - the question is where do you start your thinking.

          Do you start with the logical task first and structure the data second, or do you actually think about the data structures first?

          Let's say I have an optimisation problem: I have a simple scoring function and I just want to find the solution with the best score. Starting with the logic:

          for all solutions, score, keep if max.

          Simple, eh? The problem is that it's a combinatorial solution space. The key to solving it before the entropic death of the universe is to think about the structure of the solution space.
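          A hedged sketch of my own to illustrate (a toy problem, not from the thread): find the best pair summing to at most a target. The brute-force version enumerates the whole pair space; sorting first exposes structure that lets two pointers walk it instead:

```go
package main

import (
	"fmt"
	"sort"
)

// bestPairBrute tries every pair: correct but O(n^2).
// Returns -1 if no pair fits under the target.
func bestPairBrute(xs []int, target int) int {
	best := -1
	for i := 0; i < len(xs); i++ {
		for j := i + 1; j < len(xs); j++ {
			if s := xs[i] + xs[j]; s <= target && s > best {
				best = s
			}
		}
	}
	return best
}

// bestPairSorted exploits the structure of the solution space:
// after sorting, two pointers find the answer in O(n log n).
func bestPairSorted(xs []int, target int) int {
	ys := append([]int(nil), xs...)
	sort.Ints(ys)
	best := -1
	lo, hi := 0, len(ys)-1
	for lo < hi {
		s := ys[lo] + ys[hi]
		if s <= target {
			if s > best {
				best = s
			}
			lo++
		} else {
			hi--
		}
	}
	return best
}

func main() {
	xs := []int{8, 3, 5, 9, 2}
	fmt.Println(bestPairBrute(xs, 12), bestPairSorted(xs, 12)) // prints: 12 12
}
```

          Same answer both ways; the difference is whether you thought about the shape of the data before writing the loop.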

      • TheOtherHobbes 1 hour ago
        I mean - no. If you're coming to a completely new domain you have to decide what the important entities are, and what transformations you want to apply.

        Neither data structures nor algorithms, but entities and tasks, from the user POV, one level up from any kind of implementation detail.

        There's no point trying to do something if you have no idea what you're doing, or why.

        When you know the what and why you can start worrying about the how.

        Iff this is your 50th CRUD app you can probably skip this stage. But if it's green field development - no.

        • DrScientist 1 hour ago
          Sure, context is important - and the important context you appear to have missed is that the 5 rules aren't about building websites. They're about solving the kind of problems which are easy to state but hard to do (well).

          eg sort a list.

    • praptak 2 hours ago
      A good chunk of great advice is obvious things that people still fail to do.

      That's why a collection of "obvious" things formulated in a convincing way by a person with big street cred is still useful and worth elevating.

      • pm215 2 hours ago
        Also, "why these 5 in particular" is definitely not obvious -- there are a great many possible "obvious in some sense but also true in an important way" epigrams to choose from (the Perlis link from another comment has over a hundred). That Pike picked these 5 to emphasise tells you something about his view of programming, and doubly so given that they are rather overlapping in what they're talking about.
    • HunterWare 1 hour ago
      Can't be but so obvious if the first comment I saw here was that the first two rules didn't seem so important. =)
    • bazoom42 2 hours ago
      Definitely not obvious to everybody.
    • knorker 17 minutes ago
      I'd call it more derivative than obvious.

      "Why quote someone who's just quoting someone else?" — Michael Scott — knorker

    • pjc50 2 hours ago
      You've got to elevate some obviously correct things, otherwise social media will fill the void with nonobviously incorrect things.
      • mosura 2 hours ago
        Better to have 100 comments on one topic than 10 comments on 10 topics.
  • Shawn19s83 40 minutes ago
    surprised this isn't talked about more
  • doe88 1 hour ago
    Great rules, but Rule 3: WOW, so true, so well enunciated, masterful.
    • bell-cot 37 minutes ago
      Yes, and I'd say it's more true now than then. Best case, your fancy algorithms are super-sizing code that runs 1% of the time, always kicking more-often-run code out of the most critical CPU caches. Worst case, your fancy algorithms contain security bugs, and the bad guys cash in.
  • anthk 1 hour ago
    9front is distilled Unix. I corrected Russ Cox's 'xword' to work on 9front, and I am just a newbie. No LLMs; that's Idiocratic, like the movie. Just '9intro.us.pdf' and the man pages.

    LLMs' work will never be reproducible, by design.

  • jcmartinezdev 51 minutes ago
    Rule 6: Never disagree with AI slop
  • openclaw01 1 hour ago
    [dead]
  • catchcatchcatch 2 hours ago
    [dead]
  • Iamkkdasari74 1 hour ago
    [dead]
  • seedpi 1 hour ago
    [flagged]