Why do most programmers work so hard at pretending that they’re not doing math?

In the early days programming was considered a subdiscipline of mathematics. In fact, the very first person to write an algorithm was renowned as a mathematical genius. However, somewhere along the way we forgot. We began to think of ourselves as something different, a profession not beholden to rigor or deep understanding of the models we create.

It’s easy to see how this could happen in an industry that puts so much more weight on knowing the API of the year than on understanding base principles or expressing the desire to dig deep. People can make huge amounts of money pushing methodologies when they have little to no evidence of effectiveness. We work in an environment where hearsay and taste drive change instead of studies and models. We are stumbling in the dark.

I have come to attribute our sorry state to a combination of factors. The first is the lack of formal training for most programmers. This isn’t in itself a bad thing, but when combined with a lack of exposure to mathematics beyond arithmetic, due primarily to an inadequate school system, we are left with a huge number of people who think programming and math are unrelated. They see every day how their world is filled with repeating patterns but they miss the beautiful truth that math is really about modeling patterns and that numbers are just a small part of that.

The relationship between math and programming extends far beyond SQL’s foundation in set theory or bits being governed by information theory. Math is intertwined within the code you write every day. The relationships between the different constructs you create and the patterns you use to create them are math too. This is why typed functional programming is so important: it’s only when you formalize these relationships into a model that their inconsistencies become apparent.
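
To make that concrete, here is a minimal, hypothetical sketch in F# (the names are mine, not from any library) of what formalizing a relationship into a model can look like. Instead of keeping a “verified” flag and a verification date that can silently disagree, the type itself rules the inconsistent combinations out:

```fsharp
open System

// Hypothetical domain types; the names are illustrative only.
type EmailVerification =
    | Unverified
    | Verified of verifiedOn: DateTime   // a verification date exists only when verified

type EmailContact =
    { Address      : string
      Verification : EmailVerification }

// There is no way to construct a contact that claims to be verified but has
// no verification date, or one that carries a date while unverified; the
// compiler rejects those states before the program ever runs.
let contact =
    { Address      = "someone@example.com"
      Verification = Verified (DateTime(2012, 1, 1)) }
```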

Most recently it seems to have become a trend to think of testing as a reasonable substitute for models. Imagine if physics was done this way. What if the only way we knew how to predict how an event would turn out was to actually measure it each time? We wouldn’t be able to generalize our findings to even a slightly different set of circumstances. But it gets even worse: How would you even know what your measurement is of without a model to measure it within? To move back into the context of programming: how useful is a test that doesn’t capture all of the necessary input state?
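
As a small, hypothetical F# sketch of that last question: when the relevant input hides inside mutable state, a test can pass without ever naming the conditions it depends on, while the same logic written as a pure function forces every input into the open.

```fsharp
// A deliberately bad example: the discount depends on a hidden mutable field,
// so the test below never captures the input state it actually relies on.
type PriceCalculator() =
    let mutable loyaltyPoints = 0
    member this.RecordPurchase() = loyaltyPoints <- loyaltyPoints + 1
    member this.Discount(price: decimal) =
        if loyaltyPoints > 3 then price * 0.9m else price

let calc = PriceCalculator()
// "Passes" only because nothing happened to call RecordPurchase first; the
// real input, loyaltyPoints, appears nowhere in the test.
let hiddenStateTest = calc.Discount 100.0m = 100.0m

// The same logic as a pure function: every input is explicit, so a test of it
// states exactly which circumstances it covers.
let discount loyaltyPoints (price: decimal) =
    if loyaltyPoints > 3 then price * 0.9m else price

let explicitTests =
    [ discount 0 100.0m = 100.0m
      discount 5 100.0m = 90.0m ]
```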

This is exactly why dynamic programs with mutation become such a mess. Object-oriented structure only makes things worse by hiding information and state. Implicit relationships, complex hidden structure, and mutation each impair reasoning on their own, but when combined they create an explosion of complexity, conspiring to produce a system that defies comprehensive modeling by even the brightest minds.

This is also why programmers who use typed functional languages frequently brag about their low bug count yet eschew methodologies like test driven development: tests pale in comparison to the power of actual models.

Many thanks to my friends on twitter for the discussion leading up to this post: @seanschade, @craigstuntz, @sswistun, @joakinen, @copumpkin, @dibblego, @taylodl, @TheColonial, @tomasekeli, @tim_g_robinson and @CarstKoenig. Special thanks to @danfinch for taking the time to proof read.

Enjoy this post? Continue the conversation with me on twitter.



  1. Awesome, simply awesome.

    I have observed exactly the same thing over 20+ years in the field. The whole programming thing gets really easy if the programmer can create abstract models of the domain. This profession is not about SQL vs. NoSQL, or about how dynamic a methodology is. But the new toy is so shiny, and look ma, no hands!

  2. Do you mind elaborating on these models you’re talking about? Is there a guiding school of thought on this, or am I going to have to buy a book on mathematical modeling and try to apply it to my code in order to attain a system that I trust more than unit tests?

  3. “Imagine if physics was done this way.”


    Imagine if a field where testing things was expensive and difficult to do accurately did testing as often as a field where it is instant and nearly costless.

    Mathematics (more specifically geometry) is important in carpentry as well; it governs how things should be planned out and put together. Economists also started out as geniuses well versed in math, though not so much anymore.

    People strive to keep it simple. In mathematics you seek to develop universal laws and structures, while most programmers are just trying to build a comment system or a CMS, and the carpenter is just trying to build a table or a house. They don’t bother learning what they will scarcely use until it comes time to learn it, and even then they keep it to the bare minimum they can. It’s about laziness and knowing just enough to get the job done right, in a cost-effective amount of time. As fields become better understood it becomes easier to cut out the parts that your average contributor simply doesn’t need.

    • Indeed maths is a particular approach to the problems that programmers implement algorithmic solutions to.

      Maths can study computer science and programmers, and programmers employ maths as a tool. But what they are doing isn’t maths … though it does incorporate elements of it.

      That’s like saying everyone who plays music is an expert in music theory, or is even using music theory to work. That is nonsense: musicians play things that sound good and use music theory to communicate ideas … or their ears. Of course some musicians may use music theory to write, but it’s not necessary or even helpful a lot of the time. At the end of the day all that matters is that it sounds good.

  4. Oh boy you’ve pushed some of my buttons.

    You wrote “people can make huge amounts of money pushing methodologies when they have little to no evidence of effectiveness.”

    Is this what you are doing here now? You praise the benefits of F# and of programming with Hindley-Milnerish languages. But where is the proof of the effectiveness? Not in the “market share”, at least. Individual success stories may say X is the best thing ever… but that is what everyone says.

    But let’s assume you are not doing that. That you are honest. Then, if I’ve read you correctly, the application programming level is where we should make our stand with math. I just can’t agree. I’ll provide two points of view; one is what we could call “boring enterprisey programming realist” and one “Let’s Kill von Neumannist”.

    Yesterday I wrote some code to automate the deployment of a certain system X on a distributed environment. Many parts of that experience rest upon math. The sorting of the dependency graph. The uncompressing code in gzip. The protocols of SSH.

    However, I used these as a tool. My code had no mathematical component to it whatsoever.

    Compare this with the shovelling of snow I’ve done. The whole experience rests on mathematics. The torque I apply to the shovel. The electrostatic fields and the Pauli exclusion principle that keep the snow from falling through the shovel. The parabolic curve traced by the bits of snow while they fly through the air under gravity.

    However, I use my shovel as a tool. The whole shovelling thing has no mathematical component to it whatsoever.

    I know maths! I’d love to use maths while coding. Most of the time, however, I’m just shovelling data. It is not maths. It *rests* on math, but is not it.

    The other point of view. Let’s Kill von Neumann. Computation is what matters. Programming languages, operating systems and processor architectures are the tools we use to achieve desired computational results. If we really would like to reinvent programming, we should start at the source. Why stop at programming languages? The hardware is the only source of computational power; everything else is just layers on top. Are we sure that the foundation is solid? Should we not first investigate other computer architectures? Perhaps there are other ways to get to our goal, ways that completely subvert our current programming models and languages and theories. Perhaps quantum computing does that. Perhaps it does not.

    Should programmers then think of computer architectures and perhaps even the physics beyond them when solving a problem? Or should they just solve the problem at hand?

    Making progress is a delicate thing. We have only so many resources available. While in theory I agree that yes, every problem should be solved deeply and with finality, in practice I see this as a monomaniacal effort that leads to poor resource use.

    Yes, I’d want some better abstractions and more able tools. No, I don’t think static type systems are the final answer.

    • Best comment I’ve ever read.

    • “My code had no mathematical component to it whatsoever.”

      The code itself is a mathematical component.

    • Explained the way I feel clearly and with an excellent example. It is true that we do not need to be math geniuses to write good code, but learning never hurt anyone :)
      By the way, any good books on programming models?

    • Great comment, I agree entirely (I like the snow shovel example). I write code on a daily basis, have formal Comp Sci qualifications with a strong background in maths, and 95% of my code is simply moving and transforming data with little to no need for maths beyond simple arithmetic.

  5. Just as you piqued my interest you stopped!

  6. Euclid’s algorithm for greatest common divisor is more than two thousand years old. The Egyptians had algorithms for multiplication and division based on binary numbers as far back as 1850 BC. I agree with you about the relationship between math and programming, but I believe math provides appropriate tools for handling mutation. State is a fundamental aspect of computers, and is key to their broad applicability; our programming languages and methodology must be able to handle it.
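
For reference, the Euclid’s algorithm mentioned here really is tiny when written as a recursive function; a quick F# sketch:

```fsharp
// Euclid's algorithm: the gcd of a and b is the gcd of b and (a mod b).
let rec gcd a b =
    if b = 0 then a else gcd b (a % b)

// gcd 1071 462 evaluates to 21
```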

  7. I can’t disagree with you, but I feel like you’re giving a lot of responsibility to the title “programmer”. Anyone can be taught to program. It takes theory to learn Computer Science, and it takes a lot of theory and applied knowledge to be a Computer or Software Engineer.

    Better engineers know more about the science that their applied skills stand on. And better engineers know the results are not just about the immediate end product. This holds true in any of the engineering disciplines including Software.

  8. I understand your point and really agree with most of the perspective. However, I feel technology and these wonderful abstractions have given people with a limited understanding of much of anything the power to create tools that are useful in some shape or form. These people tend to have the loudest voices, which can give a bad name to the math/science that it is all based upon. But we can’t forget the whole point is to move humanity forward and empower the individual.

  9. I have conflicted feelings about this post. On one hand, I largely agree that writing programs is essentially akin to “doing math”. You define axioms which are tautologically true – the code – so that a myriad of special cases are handled correctly by a generic set of rules, and running the program is in essence a side-effect of the truth of the code.
    Where I get less comfortable is with your attack on tests. Conceptually, I think I understand where you are coming from. 2 + 2 = 4 is not a proof that addition works. At the same time, I am a huge fan of TDD and unit testing, which I found valuable even when working with F#. I see a unit test playing a role similar to a lemma: it’s not a proof of the overall theory, but it’s a useful intermediary result, which helps me establish some practical result.
    To your question “how useful is a test that doesn’t capture all of the necessary input state?”, I would say, that’s why unit tests help me. Typically, when you cannot write such a test, it is because you cannot properly extract the state that is assumed in the result, and to make it work, you will have to rethink your design with explicit input conditions and no side-effects. In my experience, writing tests is useful mostly in thinking about my code and building intermediary results – I don’t really think of them as a proof that the system works.
    Another way to think about unit tests is that I am trying to construct a scientific theory; each test is an experiment which, if failing, tells me that my theory is incorrect.
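
A brief F# sketch of that last point (hand-rolled checks, hypothetical names): each test is an experiment that can falsify a claim about the code, even though passing proves nothing in general.

```fsharp
// The "theory": reversing a list twice gives back the original list.
let claim (xs: int list) = List.rev (List.rev xs) = xs

// Each entry is one experiment; a property-testing library would generate
// these cases instead of listing them by hand.
let experiments =
    [ claim []
      claim [ 1 ]
      claim [ 1; 2; 3 ]
      claim [ 3; 1; 4; 1; 5; 9 ] ]

// Any false result falsifies the theory; all-true merely fails to refute it.
let theorySurvives = List.forall id experiments
```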

  10. This post is certain to attract the denialists. I am not one of them.

    PS: the answer to your question is fear.

  11. “Sorry *state*” indeed. That’s the crux of the problem.

    One of the more compelling approaches I’ve seen is Functional *Relational* Programming (the other “FRP”), in which the stateful parts of the program are governed according to relational algebra, and the rest (by definition, stateless) is purely functional.

    “Out of the Tar Pit” by Ben Moseley and Peter Marks.

  12. I think you miss a point here. Math is a tool. Computers and software are tools too.

    You speak about math like the ultimate goal to all things on earth. And the ultimate model.

    But most people are not like that. Math doesn’t (at least not yet) properly model love, life… Do you think about physics equations when driving your car? Does an ant think about them when hanging around?

    Many kinds of models are available, not all of them formal. In fact our lives are centered around the informal models we have in our heads and bodies of the environment around us.

    And one should use the best model available to deal with a problem. Do you really think Facebook is about maths? Do you really think the web is about math? Or your CMS, your HR software? The latest game?

    No. They use some math, but that’s not the point. We have room for both researchers and theoreticians on one side, and practical people on the other side.

    People making software and computers are on the practical side. They don’t have the same needs as theoreticians.

  13. Nice article Rick! I’m a big fan of functional languages, but although math is important to software development, I think the subtleties of human sociology and nature are more important. That is one of the big reasons that computers are becoming more social. There is a non-mathematical element to software development (or programming) that involves user interface, user experience, emotion, ease of use, value, and many more aspects that are not very easily expressed mathematically, nor should they be.

    In fact, physics can only get you so far when it comes to modeling. Our most accurate atomic clocks need to be adjusted every so often after taking measurements of the solar clock because the model is too perfect and doesn’t model reality. See http://en.wikipedia.org/wiki/Leap_second

    Good programming is more of an art in my mind than an engineering discipline. That is why Google, Apple, and many other companies hire artists, musicians, and sociologists to develop their software.

    Anyone who does enterprise application development knows that business logic is completely illogical, and that flexibility is much more important than modeling the system perfectly. This is why the agile movement has been so critical in the last decade. Software modeling was taking so long that by the time the workflows, business rules, and user interfaces were built, based on requirements gathered months, sometimes years beforehand, the business had moved on. The business had changed the rules. The business is an organism made up of humans who morph and change over time.

    A perfect model becomes obsolete pretty fast in business software development. That is why the biggest business packages such as SAP are heavily customized and tailored to the client. It is also why SharePoint is so popular, because it is made up of building blocks to help people build solutions instead of pre-canned models.

  14. Spot on!

    One doesn’t have to adopt a functional language to apply these kinds of mathematical models to an application.

    Each method/function/procedure in whatever language offers the opportunity to develop a logically consistent development process. The key is using an understanding of set theory to ensure that all possibilities are “present and accounted for.”

    Any module has input, processing, and output. So the input step should validate and filter all input to ensure that it meets the expectations of the processing step. The processing step should reduce all input into variables named for the process at hand; and again, this should encourage finding “set completeness” in processing. Output transforms the results of processing into return values which are expected by the calling routine. (A small sketch of this shape follows this comment.)

    What this does in any language, much as in functional languages, is couch programming as an algorithm and so provide the ability to validate code, at least to some extent, by reading it (literate programming).

    Object orientation both giveth and taketh away. If not carefully done, an impedance mismatch can arise in creating a class from lower-level functionality, which impedes rather than assists an algorithmic representation of an application.
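
A minimal F# sketch of the input / processing / output shape described in the comment above (all names are hypothetical):

```fsharp
open System

type ValidatedOrder = { ItemCount: int; UnitPrice: decimal }

// Input: validate and filter, accounting for every possible case.
let validate (rawCount: string) (rawPrice: string) =
    match Int32.TryParse rawCount, Decimal.TryParse rawPrice with
    | (true, count), (true, price) when count > 0 && price >= 0m ->
        Some { ItemCount = count; UnitPrice = price }
    | _ -> None

// Processing: operates only on values that already meet its expectations.
let total (order: ValidatedOrder) = decimal order.ItemCount * order.UnitPrice

// Output: transform the result into what the caller expects.
let quote rawCount rawPrice =
    match validate rawCount rawPrice with
    | Some order -> sprintf "Total: %M" (total order)
    | None       -> "Invalid order"
```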

  15. Why stop here? Without a thorough understanding of EE you cannot understand a computer. Or, without a PhD in physics you can’t drive a car: you might know that F = ma, but what if you go really fast?

    People learn enough to get the job done. I doubt if 90% of programmers need to know any more math than how to add numbers together.

  16. I’ve seen a programmer who didn’t know math try to solve a quadratic equation by stepping through many small floating-point values until they “hit” the one that solved it. It didn’t work.

    I heard another non-mathematical programmer claim that pi was “defined” as 22/7.

    I worked with another programmer trying to translate a short BASIC segment into C, who was helpless because they didn’t understand or know how to implement the matrix operations that were built into BASIC, but not C.

    Other programmers waste time trying to solve NP-complete problems with more and more hacks, not even aware that this is a class of problems with no known efficient solution.

    I can’t count the number of times I’ve seen programmers do many floating-point operations and test their results for equality with the results of other floating-point operations, and wonder why they’re not equal.

    Many software engineering problems are really math problems in disguise.

    Math lurks everywhere in software engineering, and programmers who pretend it doesn’t are less effective at developing good solutions and good software.

    • Just as an aside, floating-point inaccuracies are not a mathematical problem but a hardware one; in maths 0.00000001 == 0.00000001 each and every time, not so on the hardware (I’ve seen many a mathematician fall on that, whereas programmers know to beware; a tolerance-based comparison is sketched after this thread).

      Other than that, sometimes hacking away is the only cost effective thing to do, I’ve had the pleasure and displeasure of working with people who had very high mathematical skills but could not produce a working efficient-enough solution to a problem because of over-engineering.

      There are many layers, levels and skills under the umbrella term of programmer; some require more mathematical understanding than others…
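
For completeness, the usual workaround for the exact-equality trap discussed in this thread, sketched in F# with a hand-rolled helper:

```fsharp
// Compare floating-point results against a tolerance instead of with =.
let nearlyEqual tolerance (a: float) (b: float) =
    abs (a - b) <= tolerance

let exact = (0.1 + 0.2 = 0.3)                  // false under IEEE 754 doubles
let close = nearlyEqual 1e-9 (0.1 + 0.2) 0.3   // true
```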

  17. I think the discussion is mixing two different problems: the “science” of programming and the “practical use” of programming. Each is distinct in its own right.
    The “science” uses math, optimization, graphs, etc. to solve general problems (fastest coding) and specific problems (putting a man on the moon), and to state, develop and research the laws, rules and methodologies that HELP the people who write code. Doing this is a hard process, involving research, hypotheses, tests, related papers, etc., and it requires highly skilled people. This is the view of programming as a SCIENCE.
    The “practical use” takes the laws, rules, methodologies, etc. developed by the science to implement more “trivial” problems (an accounting system, a banking system), and it is carried out by other people (without the high skills of the scientists) who nevertheless add their experience and “artist” point of view to the code they write. This is the PRACTICAL view.
    You can see this situation happening in many other professions: electronic engineering (the developers of LCDs, plasmas and lasers vs. the integrators), medicine (the developers of cancer vaccines and genetics vs. the doctors, as the users).
    In both cases it is important that the people on each team have the right knowledge. You can’t use a programming framework that hasn’t proved its effectiveness in a correct way… and you also wouldn’t trust a programmer who doesn’t prove the methodologies he uses.

    Regards, Elias

  18. Bonus points for your first in-article link.
    Everything else is just talk. :)
    I have written several programs which perform no calculations except simple counting and timing, using Visual Studio. The VS environment is so helpful I think someone could, with practice, make a living at coding with very little background in mathematics, never realizing the huge amount of number-crunching being done under their boots.

  19. Nice Post! Some thoughts from my side:
    From my years as a hobby theorist and professional developer I can agree that “programming has a lot to do with maths”. But you should strictly draw a line between “inventing” and “using”. If I find a new solution for a mathematical problem, that’s “inventing”; if I find a new sorting algorithm, that is “inventing” too. If I open the formula book and take formula xy for a problem, or take library xy and use a handy function, that is “using”.
    Most programmers out there are not “inventors” but “users” (of frameworks, libraries, programming languages, hardware, and yes, math).
    By the way, for me this is where the reason lies for the “bad view” most people have of maths. They only learn to “use” math in school (and yes, many learn how to calculate the volume of a sphere, but how many can show the proof of the formula?); the interesting part, the “invention”, is mostly reserved for university-level education.
    So my argument is: what gives you satisfaction is only “inventing”, not “using”. And so many programmers like to “invent” new solutions for programming problems, but they don’t know how to “invent” new maths. On the contrary, if you have to develop “new” maths, this is considered a bad thing (don’t reinvent the wheel, “can’t you use the proven function xy?”).
    My conclusion: your post is like telling a car driver, “Hey, there is a lot of physics involved in your driving; how can you drive if you don’t know how to calculate exactly how much lateral acceleration you will feel through the next corner?” As others have mentioned, in real life none of the “users” do these calculations; we hope the manufacturers do them. The same is true for programming: if I “just want to drive the car” (develop some enterprise data app), only “usage” is needed and “invention” is bad (“every solution is a new problem”, “others won’t get it”, …). But if you want to “build a new car which can drive on the moon”, you have to “invent”, and of course you have to do all the calculations and proofs then.

    So in the end your well-formulated post holds true only for a small subset of the “programming domain”. But then programming covers such a big area that every statement about it is always both true and false! (We could call it “Gödel’s incompleteness theorem of programming”.) Therefore I accept that your other statements about programming (unit tests, OOP) come from your “corner” of programming, and may be true there. From my view (as a C++ veteran) I strongly disagree with them.
    Nevertheless I support your implicit appeal: “Programmers, learn some maths!”

    P.S. Sorry for my bad English – it’s not my native language (in the unlikely case you didn’t notice).

  20. Spoken like a true computer scientist (which is to say, “someone who comes to CS/CE from maths”).

    One of your earliest assertions is, however, incorrect: in some places, the computational program came out of the *EE* department (thus the ‘CE’ part of ‘CS/CE’). It is still where a lot of interest in the field comes from. And if you haven’t dealt with the low-level physics involved in modern EE/CE, you’re just as lacking as if you haven’t studied the more complex points of the math involved.

    As for me, my day-to-day job is to *get something done*. If I’m dealing with large data sets where O(n) will be a significant factor, then I’ll take the time to look up a fast solution. If I’m working with n=10, even O(n!) just isn’t going to take long enough for me to have time to type my first Google query in. The skill is primarily in being able to spot and understand when an approach isn’t going to be workable, and being able to switch tracks and find a different way that will.

    As for modelling vs. testing? The single worst example of unreadable code I’ve ever seen was from a CS researcher. And, lo and behold, when subjected to the real world, it proved to have a tremendous variety of nasty (and in several cases subtle) bugs. Showing that 2+2=4 is by no means a sufficient condition, but it *is* a necessary one.

    And as others have said, one of the points of TDD (or in fact any test-heavy development methodology, ‘driven’ by it or not) is to force the developer to *think* about the problem domain, in particular about what the corner cases and edges of it are and how the logic should behave there.

    And finally, “imagine if physics was done this way”? It is. The first step is to come up with a theory, but the *second* step is to *test* it. Otherwise you’re just pissing about and wasting everyone’s time (your own included).

  21. I have read a lot of, um, “stuff”, to put it politely, written about how great models and formal methods are and how tests aren’t as good.

    I have to disagree. A lot.

    What these people miss is that their model will have bugs. Not silly things like forgetting the brace on an if statement, but deep conceptual problems.

    Testing is, in a way, creating a second model to check the first model. This is a good thing.

  22. Programming is a subset of math? Agreed.
    Economics is math? Also true.

    However, does a basic supermarket assistant need a deeper understanding of math? Not necessarily (although it would still be handier than most people think).

    Thus, why should basic programmers who just cobble a lot of stuff together (“hey, it works!”) excel in math?

    Only a minor part of the software business has accuracy, performance or mission-critical demands important enough for extended mathematical brain work.
    Everything else can be handled well enough with a naive programming approach and some trial-and-error improvements.

  23. People don’t build REAL things with Physics
    People build things from Physics with Engineering
    Engineering is Physics with Testing
    We need Computer Engineers and Computer Scientists and we need to respect both

  24. Dear Richard,

    If Linus Torvalds reads your blog post, he will certainly agree with you 100%; you will find what you are saying here in one of the arguments in his letter.

    Personally, I studied two years of advanced mathematics and algorithmics after my bachelor’s degree. Then I spent three years on my software engineering master’s degree, where I learned algorithms and patterns in a different way and learned how to implement those algorithms and patterns in different languages and different ways. And I can tell you that these skills really help me nowadays in the real business world. I would add that communication skills always come first in the business world, and communication skills are only learned on the ground.

    The languages and the patterns are just tools to transform information into something real. Even if the information (the idea that goes through your mind) is true before being transformed, it can become mistaken after being implemented, because the information goes through plenty of filters in the conscious human brain.

    This can be compared to spoken languages. If you take a French quotation and give it to someone who speaks English fluently but has only started learning French, and ask them to translate it into English, the translated quotation (the information) will certainly end up with a mistaken meaning if they didn’t use the correct words (the right tools).

    Kind regards,

  25. On my honor as an engineer, I always cringe when I read about programmers calling themselves engineers but refusing to touch math. Granted, engineering is hardly all about math; it is also about process design, logic, and project management. But we should embrace math.

    • Frankly, I cringe when most programmers call themselves “engineers”. The bulk of them are only a “software engineer” or “computer engineer” the way that a janitor is a “sanitary engineer” — which is to say, only when the word is treated as being synonymous with “technician”.

      Actual software engineering involves the same sorts of things that any other form of engineering does, just in forms that are applicable to the domain. Things such as studying existing software to find its structure, then extrapolating from available data what the likely failure points are going to be when it is put under loads of various sorts.

      That requires a deep enough understanding of the algorithms and fundamental math to have a solid grasp on what is going on, but also the knowledge of how actual systems “in reality” — as opposed to in a pure model — behave, including nasty little problems that pure math tends to hand-wave away, such as “you ran into an I/O bottleneck”.

      The goal of all of that being the ability to answer questions such as “how gracefully does it degrade under load?”, “are there any severe inflection points in the curve?”, and “at what point does it just fall over and die (for any practical purpose)?”

  26. Programs are games. In as much as game theory is math (kinda yes, kinda no), programming is math.

  27. Programming is better than math.

    i = i+1 :)

  28. Nice article and ideas, and though I agree that math is important to programmers, I will submit that it is for a different reason.

    I’ve been a programmer for many years and still do it all day every day with minimal use of standard math. So I can see why programmers say that they don’t use math.

    That said, basically every if statement used by a programmer (please try to tell me you never use one) is indeed a use of Boolean algebra. So yes, math is very much used daily in programming; I would say that most programmers just don’t know it. I know that taking a class on Boolean algebra very much changed the way I thought about implementing parts of my code (not always for the better, as when experimenting with implementing a simple OR statement using NANDs). It did help me learn different ways to implement logic.
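
A small F# illustration of the Boolean algebra described above, including the OR-from-NAND exercise the commenter mentions (names are illustrative):

```fsharp
// Every 'if' condition is an expression in Boolean algebra.
let nand a b = not (a && b)

// a OR b  =  (a NAND a) NAND (b NAND b)
let orFromNand a b = nand (nand a a) (nand b b)

// De Morgan's law: not (a && b)  =  (not a) || (not b)
let deMorgan a b = not (a && b) = (not a || not b)

// Exhaustively check both identities over every input combination.
let allInputs =
    [ for a in [ true; false ] do
        for b in [ true; false ] do
            yield a, b ]

let orIsCorrect   = allInputs |> List.forall (fun (a, b) -> orFromNand a b = (a || b))
let deMorganHolds = allInputs |> List.forall (fun (a, b) -> deMorgan a b)
```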

  29. Your conflation of rigour and mathematics is not very rigorous. Mathematics is the study of tautologies, and there is far more to software than tautologies.

    The example of physics to illustrate the deficiencies of testing struck me as odd. There are many models of the physical universe that are self-consistent but haven’t turned out useful because empirical testing has shown they aren’t accurate.

    I wholeheartedly agree that we should improve the rigour of software development, but I think we should do this by encouraging the study of literature and the philosophy of science.

    Thanks for writing this post – even though it was not expressed in mathematics it’s a valuable contribution to the discussion. :-)

  30. Math can enhance a person’s comprehension of essentially anything. There is no profound insight to the notion that programming is the same.

    Programming also has a linguistic element, and it all ultimately boils down to physics. Do I need a degree in linguistics or electrical engineering to write code?

    I also find it telling, the way you lump all “math” together as though there are no distinct, largely independent subdisciplines. Advanced philosophers can often think circles around us when it comes to ontology, epistemology, elegance, logic, and certainly abstraction. These are obviously not the same people architecting bridges or working at CERN. So whose set of concerns are closer to those of the typical application programmer?

    Please show me the low-level math I can use to derive a wheel and axle. Show me Mozart’s notes on Fourier or Picasso’s treatise on Euclid. Let’s hear about Gianni Versace’s contributions to the field of textile engineering. Or finance, for that matter.

    Math is obviously extremely useful for programmers, but you take it so far as to undermine the entire point of programming languages.

    Also, I can’t wait to run some of that amazing, bug-free, stateless code on my xbox.

  31. […] they are well-suited. Even though we have computers to do our math for us, programmers need to have superior math skills to tell the computers what kind of math to do. It’s not simple […]

  32. […] Why do most programmers work so hard at pretending that they’re not doing math? « Inviting Epipha… […]

  33. […] you all for your comments on my previous post, I appreciate the time you all took in sharing your perspectives very much.  Many of you have […]

  34. […] heartily agree with Richard Minerich when he says that testing does not replace a strong, theoretically-validated model. It’s the very same reason that pushed me to build most of this application’s engine on paper […]
