"I like to see, right there in front of me, the source-code that the interpreter is going to finally execute."
That makes as much sense as a chocolate teapot, and survives the heat of reasoned thought for about as long.
By that logic, you would never use subroutines -- much less those imported from a different file -- nor symbolic constants, enumerations, or object methods.
Hell. You couldn't possibly use Perl, because it is actually executed by an interpreter written in C; but then the C is really just assembler, and assembler is just a symbolic representation of the processor opcodes, so you would have to program in hex. But then hex is just a representation of the underlying binary, so you would have to code in raw binary to achieve your goal.
Ah! But even then, the processor opcodes are themselves translated into microcode before they can be executed...
A macro -- done right -- is simply an abstraction, just like subroutines or symbolic constants: a way of clarifying source code by letting what the programmer reads more closely reflect the purpose of the code -- the algorithm being implemented -- rather than the mechanics of that implementation.
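To make that concrete, here is a minimal sketch in plain Perl. Stock Perl has no macro facility, so an ordinary subroutine and a symbolic constant stand in for the macro; the argument about abstraction is the same either way. The data and the mean, within and TOLERANCE names are invented purely for illustration: the first block shows the mechanics, the second reads as the algorithm.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use List::Util qw( sum );

    my @samples = ( 12.1, 11.8, 12.4, 11.9 );

    # Mechanics on display: the reader must re-derive the intent.
    my $t = 0;
    $t += $_ for @samples;
    die "out of spec\n" if abs( $t / @samples - 12 ) > 12 * 0.05;

    # Intent on display: the abstractions name the algorithm.
    use constant TOLERANCE => 0.05;       # symbolic constant
    sub mean   { sum( @_ ) / @_ }         # hypothetical helpers, standing in
    sub within {                          # for what a macro would provide
        my( $value, $target ) = @_;
        return abs( $value - $target ) <= $target * TOLERANCE;
    }

    die "out of spec\n" unless within( mean( @samples ), 12 );

Whichever mechanism provides the abstraction, the reader of the second version sees what is being checked, not how.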
But then, logic never was your strong suit.
With the rise and rise of 'Social' network sites: 'Computers are making people easier to use every day'
Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
"Science is about questioning the status quo. Questioning authority".
In the absence of evidence, opinion is indistinguishable from prejudice.