The underlying truth of this joke is: programming syntax is less confusing than mathematical syntax. There are genuinely ambiguous layouts of syntax in math (to a human reader who hasn't internalized PEMDAS, anyway), whereas in programming you get a compilation error if ANYTHING is ambiguous. (Yes, I am WELL aware of the frustrations of runtime errors.)
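To make that concrete, here's a minimal sketch in Python (my choice of language; the thread doesn't name one) of how a language grammar either fixes one reading by its precedence rules or refuses to run the expression at all, using the viral "8 ÷ 2(2+2)" expression as the example:

```python
# With an explicit operator, precedence is fixed by the grammar:
# parentheses first, then / and * left to right. Deterministic.
print(8 / 2 * (2 + 2))   # always 16.0

# The "implied multiplication" spelling isn't multiplication in Python
# at all: 2(2 + 2) parses as calling the integer 2 like a function.
try:
    8 / 2(2 + 2)
except TypeError as e:
    print(e)             # 'int' object is not callable
```

Either way the outcome is decided by the grammar, not by the reader's intuition about how tightly the juxtaposition binds.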
Also: sometimes a mathematician just has to invent a concept or some syntax to convey something unconventional. The specific use of subscript/superscript, whatever 'phi' stands for, etc. in whatever paper you're reading doesn't have to match how other work uses the same notation. It's bad form, but sometimes it's needed, and if it's useful enough it gets added to the general canon of what we call "math". Meanwhile, you can encapsulate and obfuscate things in software, sure, but you can always dig down to the bedrock of what the language supports; there's no inventing anything new.
Yeah, that's it. Math syntax was created for humans, while programming syntax always had to remain deterministic for computers. That's not an insult to either; it's just interesting how often ambiguities show up when humans are involved. I say 'often' for the general case: math should be just as deterministic as programming, but in some situations it isn't.
Maths is 100% deterministic for order of operations. The issue is people not following all of the rules.
Math is. It's the syntax that's arbitrary in some edge cases.
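A concrete edge case (an illustrative example, not one from the thread): implicit multiplication, i.e. multiplication by juxtaposition, has no universally agreed precedence relative to explicit division, so two readings of the same string are both defensible:

```latex
% Two defensible readings of 8 \div 2(2+2), depending on whether
% juxtaposition binds tighter than the explicit division sign:
\[
  8 \div 2(2+2) = \tfrac{8}{2}\cdot(2+2) = 16
  \quad \text{or} \quad
  8 \div 2(2+2) = \tfrac{8}{2(2+2)} = 1
\]
```

Different style guides and calculators genuinely pick different conventions here, which is exactly the "arbitrary in some edge cases" point.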