in reply to Re^8: Perl 6 ... dead? (no, just convalescing)
in thread Perl 6 ... dead?

Just out of curiosity, what language(s) do you consider safe from deparsing, decompiling, and general reverse engineering?

I'm not well versed in this area, but I recall a quote along the lines of: "Whatever one man can hide through obfuscation, another man can uncover with sufficient intelligence, knowledge, sweat, research, time, and a good beer."

(OK, I don't have the quote handy, but that was the gist of it.)

I can't think of any absolutely secure way of distributing code that can't be reverse engineered. Sure, if the author could come around and type in a password to decrypt it, and the machine was in a known state so that keystroke grabbers and image snatchers were known not to be present, that would be PDS [Pretty Damn Secure]. Short of that (and I'm sure someone will argue with even that concession), we're all just fooling ourselves, maybe occasionally buying time through indifference, the limited resources of interested folks, and the huge number of interesting projects for those interested folks to attack.

So what do you consider secure?

-QM
--
Quantum Mechanics: The dreams stuff is made of


Replies are listed 'Best First'.
Re^10: Perl 6 ... dead? (no, just convalescing)
by Aristotle (Chancellor) on Sep 02, 2004 at 19:14 UTC

    Actually, you can quite trivially secure software against reverse engineering: if you have control over the hardware it runs on.

    Of course, that's not much help in practice. Particularly because you cannot, by any means, do so if the hardware is not under your control.

    Hence efforts like the TPM chip.

    Makeshifts last the longest.

      Any lock can be opened, and most locks rather trivially as well. Does that mean locks are stupid? Does that mean you never lock your car/bike/house? Just like locks, the point of "securing" source code isn't to make it 100% impossible to break it. If the cost of breaking the security is higher than the gain, then it will do.

        I won't disagree with that (because I can't), but I find it does not apply directly to software. I'm in the camp of those who say that if you don't want this thing or that to be done with your code, it's a legal issue, not a technical one, and so should be kept in the license.

        Making it a technical issue doesn't work.

It would only be worth the effort for software for which you expect broad distribution, anyway: if the code is written for a specific customer under contract, it shouldn't be hard to protect yourself with simple and effective legal measures.

Except that the broad distribution scenario makes things much worse than they'd otherwise be. For one, increasing complexity leads to an increasing rate of bugs; the many copy protection schemes that fail for a small but significant portion of potential customers have provided ample evidence that they're no exception to this law. For another, in contrast to security in the material world, once one copy of your software is cracked, all copies are cracked. The bottom line is that your paranoia is only going to penalize your legitimate users, without particularly deterring the bad guys.

I'm not even going to go into the ethical issues I have with the idea of profiting from an incredible amount of freely provided volunteer effort while refusing to give anything back.

        Makeshifts last the longest.
