
Occam's Norelco

Between two good-enough codes, the longer one is faster

Survivorship bias incarnates in various forms. From the notorious WW2 bombers to good-ol'-days rosy retrospections, we deal with it daily. Today, I encountered another example of it: long codes are faster.

Unless an algorithm employs some exquisite mathematical property, fast codes are typically long, owing to the layers of optimizations and architectural decisions piled into them.

But this statement ignores the selection at work: if a long piece of code isn't fast, there's no reason to keep it long, so it gets shortened or thrown away. The long codes that survive are therefore the fast ones.
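As a toy illustration (a sketch only; whether the long version actually wins depends on the runtime and the data), here is the same sum written the short, obvious way and the long, hand-optimized way:

```ts
// Short and clear: the obvious way to sum an array.
function sumShort(xs: number[]): number {
  return xs.reduce((acc, x) => acc + x, 0);
}

// Long: a classic 4-way unrolled loop with independent accumulators,
// trading brevity for (potential) speed.
function sumLong(xs: number[]): number {
  let a = 0, b = 0, c = 0, d = 0;
  const n = xs.length;
  let i = 0;
  for (; i + 3 < n; i += 4) {
    a += xs[i];
    b += xs[i + 1];
    c += xs[i + 2];
    d += xs[i + 3];
  }
  for (; i < n; i++) a += xs[i]; // leftover elements
  return a + b + c + d;
}
```

Nobody writes sumLong for fun. If you find it in a codebase, it probably survived because someone measured it and decided it was worth keeping.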

Occam’s razor dictates that between two possible explanations, the simpler one is usually correct. Thus, I hereby propose Occam’s Norelco:

Between two good-enough codes, the longer one is faster

Occam’s Norelco

“Good-enough” here means that you didn’t do something crazy to make the code intentionally slow. Conversely, we can also say that between two similarly performing codes, the shorter one is usually correct.

Occam’s back-to-disposable razor?

But here comes the interesting part: Occam’s Norelco can itself be un-Norelco-ed. The Copenhagen interpretation, popular in quantum physics, says that observation disturbs the observed: scale up your microscope far enough and, at some point, the subject is so small that the very photons you observe with knock it around. This is fascinating because the posterior (the observation) ends up affecting the prior (the event). The act of observation, which is caused by the event itself, paradoxically screws with the event! (Pedantically, it doesn’t really, and this is closer to the observer effect than to the Copenhagen interpretation proper, but this is not physics class.)

This happens in Computer Science when you need to ship your optimized code. Optimizing code for deployment goes the other way: you try your best to keep the code as short as possible. You tree-shake, remove polyfills, and do magical things like partial prerendering to make the first paint as light as possible. But this goes against Occam’s Norelco, which says that long codes are faster.
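This is why import style matters so much to bundle size. A minimal sketch with lodash as the stock example (any sizable utility library behaves similarly):

```ts
// Whole-library import: the CommonJS build of lodash ships wholesale,
// and bundlers have a hard time tree-shaking it.
import _ from "lodash";

// Per-method import: only the debounce module (and its dependencies)
// reach the bundle.
import debounce from "lodash/debounce";

// ESM build: static named imports let the bundler drop everything unused.
import { throttle } from "lodash-es";

const onResize = debounce(() => console.log("resized"), 200);
const onScroll = throttle(() => console.log("scrolled"), 200);
console.log(_.clamp(42, 0, 10));
```

Note the irony: the shortest import line produces the heaviest bundle. Brevity in the source and brevity in the shipped artifact are different axes entirely.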

Like observation in the Copenhagen interpretation, optimizing (making the code longer) can, in certain fields, ruin the purpose itself: it paradoxically makes things slower. How fascinating!

Pyrrhic Razor?

This raises the ultimate question: Is optimization worth it?

There’s an interesting law about compiler optimizations, known as Proebsting’s law:

Optimization advances double computing power every 18 years.

Proebsting’s law

That’s a pretty disappointing number if you put it side-by-side with Moore’s law: while compilers take 18 years to double computing power, hardware (doubling roughly every 18 months to two years) manages about a dozen doublings over the same span. In the end, it seems like optimizing good-enough code only gives diminishing returns. What a Pyrrhic victory.
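Back of the envelope (assuming the popular 18-month reading of Moore's law, which is itself a simplification):

```ts
// Doublings accumulated over the same 18-year span.
const years = 18;
const mooreGain = 2 ** (years / 1.5);     // every 18 months => 2^12 = 4096x
const proebstingGain = 2 ** (years / 18); // Proebsting's law => 2^1  = 2x

console.log(`Hardware: ~${mooreGain}x, compiler optimizations: ~${proebstingGain}x`);
```

Three orders of magnitude apart.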

Fun Fact: Occam’s Electric Razor

When I first wrote this article, I named it Occam’s Electric Razor, but there already is an amazing article under that name. Long story short: technology has evolved to make human interactions simpler, so, similarly to Occam’s Razor, we can predict that all future technological enhancements will only simplify human interactions further. The article then shifts gears to American legislation, stating that Americans need to simplify legal documents, which I want to touch on in the future. In fact, reality unfolded in the opposite direction: the invention of word processors made legal documents explode in quantity, which makes the prediction look super myopic to me. But that’s a story for another day.

Written 100% by a human, with a simple spellcheck.
