Selling Python Software


Will Stuyvesant

Nov 3, 2003, 3:35:15 AM
Suppose I want to sell a small (1000-line) Python program. It is a
command-line program for database applications, and I have a customer.
The customer wants to "try it out" before buying, and the try-out
version should be the full, complete version.

As much as I like the open-source movement, I have a problem now. If I
just give them the Python source code, they can show it to their
programmers and then have no reason to pay me. Sure, that would break
our agreement, but you know business and legal issues. What the
customer is interested in is the algorithm used in the program, and
there is not much I could do effectively about such a scenario.

I tried py2exe before, but someone told me it is always possible to
decompile the result...revealing the source code.

Does anybody have a solution to this, besides more legal paperwork (I
am in Europe...forget about claiming your rights here)? Is it not
possible to give away a Windows .exe file that cannot be decompiled
(or only with *extreme* difficulty)?

Erik Max Francis

Nov 3, 2003, 3:42:02 AM
Will Stuyvesant wrote:

> I tried py2exe before, but someone told me it is always possible to
> decompile...revealing the source code.

It's always possible to decompile programs compiled to machine code, as
well, you know. Ultimately, every software developer must defend
himself with licenses and legal means, not technical ones.

--
Erik Max Francis && m...@alcyone.com && http://www.alcyone.com/max/
__ San Jose, CA, USA && 37 20 N 121 53 W && &tSftDotIotE
/ \ The most exhausting thing in life is being insincere.
\__/ Anne Morrow Lindbergh

Alex Martelli

Nov 3, 2003, 5:21:37 AM
Erik Max Francis wrote:

> Will Stuyvesant wrote:
>
>> I tried py2exe before, but someone told me it is always possible to
>> decompile...revealing the source code.
>
> It's always possible to decompile programs compiled to machine code, as
> well, you know. Ultimately, every software developer must defend
> himself with licenses and legal means, not technical ones.

...unless said SW developer keeps the extremely precious parts of his
SW safely on a network server under his control (yes, it IS possible
to technically secure that -- start with an OpenBSD install...:-) and
only distributes the run-of-the-mill "client-oid" parts he doesn't
particularly mind about. The server-side SW can supply the precious
parts of the overall program's functionality to the client-side SW
via secure webservices or proprietary protocols; this also allows you
to enforce different business models (subscription, pay-per-use, ...)
more easily than distributing things does.

Very little software is WORTH protecting so closely, but for that
1 in 1000, or whatever, this option IS, IMHO, worth considering.
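[Editorially, Alex's split can be sketched in today's Python with nothing but the standard library's XML-RPC modules. The function name, the in-process thread, and the auto-picked port below are all invented for the demo; a real deployment would run the server half on the hardened host he describes.]

```python
from xmlrpc.server import SimpleXMLRPCServer
from xmlrpc.client import ServerProxy
import threading

def precious_algorithm(rows):
    """The part you never ship -- a stand-in for the real thing."""
    return sorted(set(rows))

# Server side, on the host you control (port 0 = pick any free port):
server = SimpleXMLRPCServer(("localhost", 0), logRequests=False)
server.register_function(precious_algorithm)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Client side -- all the customer's copy ever contains is this call:
port = server.server_address[1]
proxy = ServerProxy("http://localhost:%d" % port)
print(proxy.precious_algorithm([3, 1, 3, 2]))   # -> [1, 2, 3]
```

The customer's disk holds only the client call, so there is nothing there to decompile; the algorithm itself never leaves the server.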


Alex

Alex Martelli

Nov 3, 2003, 5:28:07 AM
Will Stuyvesant wrote:

> Suppose I want to sell a (small, 1000 lines) Python program. It is a
> commandline program for database applications and I have a customer.
> The customer wants to "try it out" before buying. The try-out version
> should be a full, complete, version.
>
> As much as I like the opensource movement, I have a problem now. If I
> just give them the Python source code then they can show it to their
> programmers and they have no reason anymore to pay money to me. Sure
> that would break our agreements, but you know bussiness, legal issues.
> The thing the customer is interested in is the algorithm used in the
> program. Not much I could do about such a scenario effectively.
>
> I tried py2exe before, but someone told me it is always possible to
> decompile...revealing the source code.

Yes, exactly as could be done if you coded your precious algorithm
in C, machine language, or whatever: if you distribute executable
code, it CAN be cracked and the algorithm reverse-engineered (see any
warez site: game companies go to HUGE lengths to defend their programs
and they STILL get cracked anyway).


> Anybody has a solution to this, besides more legal paperwork (I am in
> europe...forget about claiming your rights here)? Is it not possible
> to give away a Windows .exe file that can not be decompiled (or only
> with *extreme* difficulty)?

"Can not be decompiled" is impossible whatever language you're using.

"*extreme* difficulty" is in the eye of the beholder. You can, e.g.,
add layers of encryption/decryption to the bytecode, etc., but whatever
you do, somebody else can undo. Depending on the relative skills of
you and the "somebody else", the ratio (your effort to keep things
secret, to theirs to uncover them) can be anything.
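[A toy illustration of this point, with an invented XOR "layer" standing in for any fancier cipher: however the bytecode is wrapped, the interpreter needs it back in plain form before it can run, and at that moment anyone can capture and inspect it.]

```python
import marshal

KEY = 0x5A  # toy XOR "encryption" layer -- a real cipher has the same flaw here

source = "def secret(x):\n    return x * 3 + 1\n"
code = compile(source, "<secret>", "exec")
blob = bytes(b ^ KEY for b in marshal.dumps(code))   # the "protected" payload you ship

# ...but to execute it, the recipient's machine must first undo the layer:
plain_code = marshal.loads(bytes(b ^ KEY for b in blob))
namespace = {}
exec(plain_code, namespace)      # plain_code is a full code object, open to inspection
print(namespace["secret"](7))    # -> 22
```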

Couldn't you keep some crucial part of your precious algorithm OFF
the code you distribute, and have said code access said part via
webservices towards your personally-controlled, secure host?


Alex

Radovan Garabik

Nov 3, 2003, 6:48:56 AM

use upx to pack it. Of course, it is always possible to unpack
the executable, but if the effort to do so is bigger than the money
they would pay you, it would not be in their interest. Or type
"encrypt executable" into Google and it will give you a plethora of links.

--
-----------------------------------------------------------
| Radovan Garabík http://melkor.dnp.fmph.uniba.sk/~garabik/ |
| __..--^^^--..__ garabik @ kassiopeia.juls.savba.sk |
-----------------------------------------------------------
Antivirus alert: file .signature infected by signature virus.
Hi! I'm a signature virus! Copy me into your signature file to help me spread!

John J. Lee

Nov 3, 2003, 8:06:48 AM
to al...@aleax.it
Alex Martelli <al...@aleax.it> writes:
[...]

> "Can not be decompiled" is impossible whatever language you're using.
>
> "*extreme* difficulty" is in the eye of the beholder. You can e.g.
> add layers of encryption/decription to the bytecode, etc, but whatever
> you do somebody else can undo. Depending on the relative skills of
> you and the "somebody else" the ratio (your effort to keep things
> secret, to theirs to uncover them) can be any.
[...]

While this is all true, you seem to put undue emphasis on the fact that
it's always *possible* to decompile stuff. Isn't the point you make
in your last sentence actually crucial here? The game is to make your
opponent (customer ;-) incur more expense in decompiling it than it
would cost to just go ahead and pay you, is it not? And yeah, you
also have to take into account how much it costs you to come up with
the protection scheme, of course.

So, is there a good practical solution of that form, for Python code
of this sort of size (or any other size)? I suspect the answer for
standard Python may be no, while the answer for optimising compilers
may be yes -- but that's just a guess.


John

Peter Hansen

Nov 3, 2003, 8:49:36 AM

I've read the other answers here, which pretty much just repeat past
discussions, but I think the various respondents have to some extent
not paid close attention to your *specific* needs here.

Alex is pretty close to the mark, but there's no need to be as extreme
as his "start with OpenBSD install" response to Erik Max Francis, as
you definitely don't need the high security of this approach.

Nevertheless, the server-based-code approach is definitely the only
one that is worth the effort here, considering that the effort is
practically nil (or should be: if you're really trying to sell
commercial software, having access to an Internet server should be
a pretty trivial thing). You could use Pyro to make the fact that
part of the system is running on a server practically transparent
both to the potential customer and to you. Moving the server-based
code into the real application once they've paid you would be a
tiny last step.

"Only decompiled with *extreme* difficulty"? There is no such approach
within your means, I'm afraid, and there are any number of stories
from the real world that prove it. There *are* hardware-key-based
solutions, but I can tell you from personal experience that they
will really not stop a determined attacker, and in any case they
will cost more to implement than the potential income from your
1000-line Python program.

On another note, having been in the consulting business myself for
years (before my current job) I would give you this advice. Don't
underestimate the value of a trusting business relationship, nor
the value of a decent written contract. If you are right that this
customer would cheat you at the earliest opportunity, you are
likely going to find yourself cheated in some manner no matter
what you do. Without adequate attention to written agreements and
a good mutual understanding, you could just as well find yourself
in the position of having delivered the final, unprotected version
to them and still not get paid because "it doesn't meet the
requirements you promised to implement" or something like that.

A decent license *should* be adequate for just about any customer
with whom you ought to be doing business. IMHO.

-Peter

Alex Martelli

Nov 3, 2003, 8:54:52 AM
John J. Lee wrote:

> Alex Martelli <al...@aleax.it> writes:
> [...]
>> "Can not be decompiled" is impossible whatever language you're using.
>>
>> "*extreme* difficulty" is in the eye of the beholder. You can e.g.
>> add layers of encryption/decription to the bytecode, etc, but whatever
>> you do somebody else can undo. Depending on the relative skills of
>> you and the "somebody else" the ratio (your effort to keep things
>> secret, to theirs to uncover them) can be any.
> [...]
>
> While this is all true, you seem to put undue emphasis on the fact that
> it's always *possible* to decompile stuff. Isn't the point you make
> in your last sentence actually crucial here? The game is to make your

Of course it's crucial. But so what?

> opponent (customer ;-) incur more expense in decompiling it than it
> would cost to just go ahead and pay you, is it not? And yeah, you
> also have to take into account how much it costs you to come up with
> the protection scheme, of course.

Of course. It can be framed as a zero-sum game of incomplete
information on both sides. You don't really know that anybody
will ever try to steal your code -- any eurocent you invest in
protecting it might be a complete waste, if nobody ever even
dreams of so trying. At the other extreme, whoever tries to do
the stealing might be technically good and well informed, as well
as dishonest, so that in 5 minutes they destroy 5 days' worth of
work by you on "protection". In both cases, investing in such
protection is throwing money away from your POV. The hypothetical
adversary, for his part, may not know and be unable to gauge the
effort needed to crack and steal your code -- if he's not well
informed nor competent, he might just be flailing around for 10
days and stop just before the 11th day's effort WOULD deliver the
illegal goods he's after.

Of course, guess what IS the effect on this game's payoff matrix
of discussing technical possibilities in a public forum. "I give
you three guesses, but the first two don't count"...:-).


> So, is there a good practical solution of that form, for Python code
> of this sort of size (or any other size)? I suspect the answer for
> standard Python may be no, while the answer for optimising compilers
> may be yes -- but that's just a guess.

The answer is no for either case. I've spent too high a proportion of
my life (at my previous employer) putting "protection systems" in
place (including optimising compilers, weird machine-code tricks,
even in one case some microcode hacks), and it was the worst waste of
my time anybody could possibly devise.

Part of the problem is that the "warezdoodz culture" is stacked
against you. If you DO come up with a novel approach, that is a
challenge to guys who SPEND THEIR LIVES doing essentially nothing
but cracking software-protection schemes *for fun*. Even if it's
taken you 10 hours and it makes them spend 20 hours, they _do not
account this as a cost_, any more than a crossword enthusiast sees
as "a cost" the hours he spends cracking a particularly devious
crossword -- indeed, once said enthusiast is good enough, unless
the puzzle is hard it's no fun. But don't think that therefore
using an obviously weak scheme is a counter: just below the top
crackerz there are layers and layers of progressively less capable
ones desperate to put notches in their belts.

To me, playing such zero-sum games is a net loss and a waste of time,
because with the same investment of my time and energy I could
be playing games with sum _greater_ than zero, as is normally the
case for technical development not connected to security issues
(i.e., where the only "net benefit" of the development doesn't
boil down to frustrating somebody else's attempts at cracking and
stealing) and even for much security-related work (e.g., most of
OpenBSD's developments help against plain old BUGS and crashes just
as much as they help against wilful attacks against you).

There exist technical solutions that DO make it impossible for
anybody to crack your precious algorithms: just make those precious
algorithms available only from a network server under your total
control, NEVER giving out executable code for them. (You can then
work on securing the network server, etc, but these ARE problems
that are technically susceptible to good solutions). If anybody
refuses this solution (surely a costly one on some parameters) it
probably means their algorithms aren't worth all that much after
all (there may be connectivity, latency or bandwidth problems in
some cases, of course, but with the spread of network technologies
these are progressively less likely to apply as time goes by). If
the algorithms aren't worth all that much, they're not worth me
spending my time in zero-sum games to protect them -- lawyers are
probably more emotionally attuned than engineers to playing zero-
sum games, since so much of legal practice vs so little engineering
practice is like that, so that may be a back-up possibility.

It's not about programming languages at all. In the end, "clever"
schemes that are presumed to let people run code on machines under
their control yet never be able to "read" the code must rely on
machinecode tricks of some sort, anyway, since obviously, from a
technical viewpoint, if the code must be executable, it must be
*read* on the way to the execution engine -- if it's encrypted it
must exist in decrypted form at some point (and it can then be
captured and examined at that point), etc. Some of the code that
I was supposed to "protect" for my previous employer was in C,
Fortran, C++, and other high-level languages; some was in
machine code; other yet was in intermediate-code generated by a
proprietary scripting language; ... in the end it made no real
difference one way or another.


Alex

Lulu of the Lotus-Eaters

Nov 3, 2003, 11:34:37 AM
to Pythonistas
hw...@hotmail.com (Will Stuyvesant) wrote previously:

|Is it not possible to give away a Windows .exe file that can not be
|decompiled (or only with *extreme* difficulty)?

NO! It is NOT possible.

It is not possible in Python. It is not possible in VB. It is not
possible in C#. It is not possible in C. It is not possible in
Fortran. It is, in fact, not possible in Assembly.

However, distributing a .pyc or an .exe makes it require a little bit
of effort to find the underlying code... enough that someone needs to
make a conscious decision to do so, rather than "accidentally" open
the .py file in Notepad.

Yours, Lulu...

--
mertz@ _/_/_/_/ THIS MESSAGE WAS BROUGHT TO YOU BY: \_\_\_\_ n o
gnosis _/_/ Postmodern Enterprises \_\_
.cx _/_/ \_\_ d o
_/_/_/ IN A WORLD W/O WALLS, THERE WOULD BE NO GATES \_\_\_ z e


Erik Max Francis

Nov 3, 2003, 3:39:16 PM
"John J. Lee" wrote:

> While this is all true, you seem to put undue emphasis on the fact that
> it's always *possible* to decompile stuff. Isn't the point you make
> in your last sentence actually crucial here? The game is to make your
> opponent (customer ;-) incur more expense in decompiling it than it
> would cost to just go ahead and pay you, is it not? And yeah, you
> also have to take into account how much it costs you to come up with
> the protection scheme, of course.

The problem here is that it is almost certain that the efforts of
dedicated crackers will be far greater than anything you can come up
with. If someone really wants to crack your program, they will be able
to do so, and there's not much you can do to stop them. Furthermore, in
some senses the monetary analogy, though useful, is flawed. There are
plenty of crackers who simply _will not pay you_ no matter what happens,
even if they fail to compromise your copy-protection scheme. And if
they fail, they may pass it on to friends who are more experienced than
they are, and the fruits of that labor will be shared among the cracker
community.

In essence the battle isn't one against one -- you vs. an individual
cracker -- it's one against many -- you vs. an entire community of
crackers -- and you're hopelessly outnumbered. Any amount of effort
you're willing to spend to come up with a sophisticated copy-protection
scheme can easily be matched by a group of crackers if they so desire.
Of course this brings up issues of how widely distributed and appealing
your application is, but the bottom line here is that this is a battle
you cannot win -- you simply hope to avoid the fight in the first
place.

Most copy protection schemes these days, such as they are, are intended
to discourage casual violations. Things like requiring a serial number
or the right CD in the drive are usually considered fairly good common
ground solutions, because they're well-known territory and the average
customer won't be put off too much by them.

--
Erik Max Francis && m...@alcyone.com && http://www.alcyone.com/max/
__ San Jose, CA, USA && 37 20 N 121 53 W && &tSftDotIotE

/ \ Exercise is wonderful. I could sit and watch it all day.
\__/ Louis Wu

Peter Hansen

Nov 3, 2003, 3:52:02 PM
Erik Max Francis wrote:
>
> Most copy protection schemes these days, such as they are, are intended
> to discourage casual violations. Things like requiring a serial number
> or the right CD in the drive are usually considered fairly good common
> ground solutions, because they're well-known territory and the average
> customer won't be put off too much by them.

And, sadly, there is a large number of "average customers" who seem
to be aware of Warez sites and know enough to download and install the
broken versions of software.

The number of people I know with "free" versions of XP is somewhat
shocking.

I think Quicken is one of the few programs I've seen lately which seems
to remain somewhat intact. (I'm quite sure I'll hear otherwise now, but
so far I haven't seen rampant copying.) The basic technique used there
is online registration, which I assume decrypts and/or enables certain
critical portions of the code after server verification of credentials,
and prevents repeat registrations using the same CD. Maybe that's a
good new baseline for such protection.

-Peter

John J. Lee

Nov 3, 2003, 5:10:21 PM
to al...@aleax.it
Alex Martelli <al...@aleax.it> writes:
[...]
> Part of the problem is, that the "warezdoodz culture" is stacked
> against you. If you DO come up with a novel approach, that is a
[...]

Ah, stop right there (oops, too late!-). I think we're somewhat at
cross-purposes. I was talking about protecting something more at the
level of source code than running programs.

I mostly agree with you on the issue of protecting "binaries", but:

> Part of the problem is, that the "warezdoodz culture" is stacked
> against you. If you DO come up with a novel approach, that is a

[...]

Though information is indeed always incomplete, it seems a good bet
that war3zd00dz are not an issue for a consultant being hired by a
company to write a 1000 line program. Do you disagree?

Anyway, back to source vs. binaries. Obviously, code that's closer to
the "source" end of the spectrum has additional value. I'd got the
impression that something rather similar to the original source could
be recovered from Python byte-code, due to its high-level nature
(albeit obviously missing a lot of stuff -- including all those
valuable names). Certainly that's impossible with optimising
compilers (I should have stated this much more strongly in my last
message, of course -- there's no "may" or "guessing" involved there,
unlike the Python case, where I don't know the answer).


> It's not about programming languages at all. In the end, "clever"
> schemes that are presumed to let people run code on machines under
> their control yet never be able to "read" the code must rely on
> machinecode tricks of some sort, anyway, since obviously, from a

[...]

Until MS get their grubby hands on our CPUs, anyway :-(


John

John J. Lee

Nov 3, 2003, 5:14:57 PM
Erik Max Francis <m...@alcyone.com> writes:

> "John J. Lee" wrote:
[...]


> The problem here is that it is almost certain that the efforts of
> dedicated crackers will be far greater than anything you can come up
> with. If someone really wants to crack your program, they will be able
> to do so, and there's not much you can do to stop them. Furthermore, in
> some senses the monetary analogy, though useful, is flawed. There are

[...]

Oh, absolutely, but we're not (weren't, anyway) talking about a
consumer app here, but a 1000 line Python program written by a
consultant for a company. Companies are not "warez doodz", and tend
to be driven by money. The source (or something closely related to
it, like a decompiled Python program) may be significantly more
valuable to them than the executable.


John

Ron Adam

Nov 3, 2003, 5:16:51 PM
On 3 Nov 2003 00:35:15 -0800, hw...@hotmail.com (Will Stuyvesant)
wrote:

It looks to me like you just want a guarantee that you will be paid
for your work. I don't think that is too much to ask from a company
that will use it to increase productivity.

Encrypting it or not won't make any difference in this case, since the
demo version will be fully functional. They can still make a copy of
it and use it anyway without telling you.

You need to secure the demo process, not the demo itself. Making a web
page and inserting your code into it as a CGI script would be the best
way to do that -- Alex's point of view, I think. It can also serve as
an advertisement if you want to sell it to others.

Another, if they are physically close to you, is to do the
demonstration in person on a laptop that you take with you when you
leave. The program is never run on their computers.

A third is to offer a demo animation: a slide show that presents
example input and output but doesn't actually contain your program.

All of these might be worthwhile if you intend to sell your program
to others.

If this is a one time deal, then probably the simplest way, is to
have them give you some sample data, and you send them back the
output. If they like what they see, then they buy it. And since the
source is included, they can change it if it's not quite what they
want. That could be a selling point.

You will still need some sort of non-resell/distribution agreement,
put that on the sales invoice and in the code.

Does this help?

_Ron


John J. Lee

Nov 3, 2003, 5:20:10 PM
Peter Hansen <pe...@engcorp.com> writes:

> Erik Max Francis wrote:
[...]


> I think Quicken is one of the few programs I've seen lately which seems
> to remain somewhat intact. (I'm quite sure I'll hear otherwise now, but
> so far I haven't seen rampant copying.) The basic technique used there
> is online registration, which I assume decrypts and/or enables certain
> critical portions of the code after server verification of credentials,
> and prevents repeat registrations using the same CD. Maybe that's a
> good new baseline for such protection.

In the end, that's just as vulnerable to hacking as anything else to a
determined binary attacker (and such attackers, as you point out, do
exist for consumer apps).


John

Erik Max Francis

Nov 3, 2003, 7:58:49 PM
"John J. Lee" wrote:

> Though information is indeed always incomplete, it seems a good bet
> that war3zd00dz are not an issue for a consultant being hired by a
> company to write a 1000 line program. Do you disagree?

Yeah, but at the same time, if he's dealing in good faith with a company
that he thinks is reputable enough to be talking to in the first place,
what is the likelihood they're going to rip off his work at all?

--
Erik Max Francis && m...@alcyone.com && http://www.alcyone.com/max/
__ San Jose, CA, USA && 37 20 N 121 53 W && &tSftDotIotE

/ \ Be thine own place, or the world's thy jail.
\__/ John Donne

Erik Max Francis

Nov 3, 2003, 8:00:35 PM
"John J. Lee" wrote:

> Oh, absolutely, but we're not (weren't, anyway) talking about a
> consumer app here, but a 1000 line Python program written by a
> consultant for a company.

If he wrote the software for the company under the agreement that they'd
buy it if they liked it, what's the likelihood that this is a concern at
all? I would hope that if he had a real concern that this was a likely
outcome -- that they'd talk to him, wait for him to deliver a product,
and then refuse to pay him -- that he wouldn't be dealing with them at
all anyway.

--
Erik Max Francis && m...@alcyone.com && http://www.alcyone.com/max/
__ San Jose, CA, USA && 37 20 N 121 53 W && &tSftDotIotE

Bengt Richter

Nov 3, 2003, 10:56:03 PM
On Mon, 03 Nov 2003 13:54:52 GMT, Alex Martelli <al...@aleax.it> wrote:
[...]

>It's not about programming languages at all. In the end, "clever"
>schemes that are presumed to let people run code on machines under
>their control yet never be able to "read" the code must rely on
>machinecode tricks of some sort, anyway, since obviously, from a
>technical viewpoint, if the code must be executable, it must be
>*read* on the way to the execution engine -- if it's encrypted it
>must exist in decrypted form at some point (and it can then be
>captured and examined at that point), etc. Some of the code that
(or maybe it can't, for practical purposes, see below)

>I was supposed to "protect" for my previous employer was in C,
>Fortran, C++, and other high-level languages; some was in
>machine code; other yet was in intermediate-code generated by a
>proprietary scripting language; ... in the end it made no real
>difference one way or another.
OTOH, we are getting to the point where rather big functionality can be
put on a chip, or a tamper-proof-by-anyone-but-a-TLA-group module. I.e.,
visualize the effect of CPUs' having secret-to-everyone private keys,
along with public keys, and built so they can accept your precious
program code wrapped in a PGP encrypted message that you have encrypted
with its public key. The computer owner can say run it or not, but other
than I/O nothing can be observed, because decryption happens in cache
memory whose address/data busses you can't reach, and execution likewise
passes no instructions or temporary data through any tappable wire. A
separate execution unit communicates via a restricted protocol with a
normal CPU unit that has normal OS stuff running and can do the rest,
but can't reach into that private space.

This is not so magic. You could design a PC with a locked enclosure and
a special BIOS to simulate this, except that it wouldn't be so hard to
break into. But the principle is there. Taking the idea to SOC silicon
is a matter of engineering, not an idea breakthrough (though someone
will probably try to patent the on-chip stuff as if it were essentially
different and not obvious ;-/).

I think this will come. It can be a good thing _if used right_, but
that's a big if. I just hope it won't be used to erect artificial
obstacles to interoperability between free and proprietary software.
And doing it between proprietary products just raises the prices of
what can't compete on merit. I have a hunch FSF/EFF lawyers should be
preparing to lobby against laws that will permit such market
manipulations, if they aren't already. Or maybe to get a law enacted
mandating free interoperability.

Maybe I'm just a worrywart ;-)

Regards,
Bengt Richter

Lulu of the Lotus-Eaters

Nov 3, 2003, 11:20:33 PM
to Pythonistas
j...@pobox.com (John J. Lee) wrote previously:

|consumer app here, but a 1000 line Python program written by a
|consultant for a company. Companies are not "warez doodz", and tend
|to be driven by money. The source (or something closely related to
|it, like a decompiled Python program) may be significantly more
|valuable to them than the executable.

To my mind, distributing a .pyc to clients is plenty of protection. I
admit that if you send the .py script itself, it's fairly easy for them
to "accidentally" copy it or alter it in a way your contract doesn't
allow. Basically completely effortless, which makes cheating perhaps a
tad too tempting.

But decompiling a .pyc takes some work. Not a LOT of work, mind you,
but enough that a boss has to ASSIGN a programmer the job of creating
the decompiled version. In other words, a client who will decompile a
.pyc has some actual malice in their intent to violate your license
(assuming that's what the license says). The lack of ambiguity gets you
precisely to the point where contracts and laws are the right framework
for protecting copyright.

Yours, Lulu...

--
Keeping medicines from the bloodstreams of the sick; food from the bellies
of the hungry; books from the hands of the uneducated; technology from the
underdeveloped; and putting advocates of freedom in prisons. Intellectual
property is to the 21st century what the slave trade was to the 16th.

Andrew Dalke

Nov 4, 2003, 4:03:51 AM
Bengt Richter:

> OTOH, we are getting to the point where rather big functionality can be put
> on a chip or tamper-proof-by-anyone-but-a-TLA-group module. I.e., visualize
> the effect of CPUs' having secret-to-everyone private keys, along with
> public keys,

Actually, we aren't. There have been various ways to pull data off
of a smart card (I recall reading some on RISKS, but the hits I
found are about 5+ years old). In-circuit emulators get cheaper and
faster, just like the chips themselves. And when in doubt, you can
buy or even build your own STM pretty cheap -- in hobbyist range
even (a few thousand dollars).

> and built so they can accept your precious program code wrapped in a PGP
> encrypted message that you have encrypted with its public key.

Some of the tricks are subtle, like looking at the power draw.
E.g., suppose the chip stops when it finds the key is invalid. That
time can be measured and gives clues as to how many steps it
went through, and even what operations were done. This can
turn an exponential search of the key space into a linear one.

> This is not so magic. You could design a PC with a locked enclosure and
> special BIOS to simulate this, except that that wouldn't be so hard to
> break into. But the principle is there. Taking the idea to SOC silicon is
> a matter of engineering, not an idea break-through (though someone will
> probably try to patent on-chip stuff as if it were essentially different
> and not obvious ;-/)

But the counter principle (breaking into a locked box in an uncontrolled
environment) is also there. There are a lot of attacks against smart
cards (eg, as used in pay TV systems), which cause improvements (new
generation of cards), which are matched by counter attacks.

These attacks don't require the resources of a No Such Agency,
only dedicated hobbyists with experience and time on their hands.

Andrew
da...@dalkescientific.com
P.S.
I did have fun breaking the license protection on a company's
software. Ended up changing one byte. Took about 12 hours.
Would have been less if I knew Solaris assembly. And I did
ask them for permission to do so. :)

Marijan Tadin

Nov 4, 2003, 3:35:11 AM
Well, I am by no means a Python or programming expert, but I've been
reading posts on this same topic before.
I wonder why nobody mentioned Pyrex in this context:

http://www.cosc.canterbury.ac.nz/~greg/python/Pyrex/

As far as I understand, in a Pyrex module you can mix Python and
C code, and Pyrex will translate the module into C code (a Python
extension module); then all you have to do is compile this C source
file (automatically created by Pyrex). You can also write pure Python
code in your Pyrex module, and Pyrex will translate it into C code.
Then you can use py2exe with these compiled extensions (otherwise the
generated *.exe file is actually a *.zip file with *.pyc files in it).
There might be a problem: AFAIK there are some restrictions, and you
cannot use all the features of Python in Pyrex, but in that case you
can put just some vital parts of your program in Pyrex.
By doing so, you get the same protection you would have if you had
written your program in C, and it would be much more difficult to
decompile your code than from a *.pyc file, and I believe that this is
what many people are asking for.
Of course, if the value of your program is high enough that some
dedicated cracker wants to crack it, then everything that was written
in the other posts fully applies.
I hope this helps a little,
Marijan Tadin


John J. Lee

Nov 4, 2003, 8:11:45 AM
"Andrew Dalke" <ada...@mindspring.com> writes:

> Bengt Richter:
> > OTOH, we are getting to the point where rather big functionality can be put
> > on a chip or tamper-proof-by-anyone-but-a-TLA-group module. I.e., visualize
> > the effect of CPUs' having secret-to-everyone private keys, along with public keys,
>
> Actually, we aren't. There have been various ways to pull data of
> of a smart card (I recall readings some on RISKS, but the hits I

[...]

Right.


> In circuit emulators get cheaper and faster, just like the chips
> themselves.

Though there will always be some consumer apps that will run too slow
like that. Maybe not the most important ones, though (email, sound
and video playing stuff, etc.).


> And when in doubt, you can
> buy or even build your own STM pretty cheap -- in hobbiest range
> even (a few thousand dollars).

That's a thought: somebody is going to know (or be able to find
experimentally) exactly where to look on each chip, and once that fact
is out, I guess people are going to be selling motherboards / CPUs
with their private key stuck to them on a post-it note. :-)

A mass-market for STMs -- maybe it's worth investing ;-)


> > and built so they can accept your precious program code wrapped in a PGP
> > encrypted message that you have encrypted with its public key.
>
> Some of the tricks are subtle, like looking at the power draw.
> Eg, suppose the chip stops when it finds the key is invalid. That
> time can be measured and gives clues as to how many steps it
> went through, and even what operations were done. This can
> turn an exponential search of key space into a linear one.

[...]

Yeah, it's all very cute, and sounds like rocket science (which it is,
in some ways), but really it does all boil down to the same technique
I used as a child to crack my own bicycle chain combination lock (I
forgot the combination). I discovered you could hear the right
position on the dials individually, so I could crack it one dial at a
time. But that was a cheap lock, and there are others that don't do
that. Processors aren't locks, of course, and they give off so much
noise that I wonder if that's possible in their case.


John

Kyler Laird

Nov 4, 2003, 8:13:27 AM
Alex Martelli <al...@aleax.it> writes:

>> It's always possible to decompile programs compiled to machine code, as
>> well, you know. Ultimately, every software developer must defend
>> himself with licenses and legal means, not technical ones.

>...unless said SW developer keeps the extremely precious parts of his
>SW safely on a network server under his control (yes, it IS possible
>to technically secure that -- start with an OpenBSD install...:-) and
>only distributes the run-of-the-mill "client-oid" parts he doesn't
>particularly mind about.

I can imagine the "client-oid" part being a simple shell script that
just runs SSH (perhaps with some tunnel commands for database
connections).

Especially for a command-line app, this seems like an *easy* and
complete answer. Heck, for some situations, it might even be considered
the final product. It'd be a snap for the developer to maintain.

--kyler
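Kyler's thin-client idea can be sketched in a few lines of modern-Python stdlib code (the service URL and parameter names below are made up for illustration): the shipped "client-oid" only builds and sends requests, while the valuable algorithm never leaves the server.

```python
from urllib.parse import urlencode
from urllib.request import urlopen

SERVICE_URL = "https://example.com/cgi-bin/transform"  # hypothetical

def build_request(params):
    # Everything the customer's programmers can inspect: a query
    # string. No algorithm ships with the client.
    return SERVICE_URL + "?" + urlencode(sorted(params.items()))

def run_remote(params):
    # The only "real" work the client does is one network call.
    with urlopen(build_request(params)) as resp:
        return resp.read().decode()
```

An SSH tunnel, as Kyler suggests, would serve the same purpose for a database-backed command-line tool; the point is only that the client is trivially replaceable while the server is not.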

Svenne Krap

Nov 4, 2003, 9:26:19 AM

> But decompiling a .pyc takes some work.

How is that done in the first place? (Not that I am that client, and
not that I need to know; it's just nice to be informed.)

I agree with most of you: copy protection does not work if the software
is essential to a large enough "market". But even simple protections
keep the casual pirate away, and after all, the most effective systems
are contracts and lawyers (I'm not glad it is that way, but it is).

Svenne

Peter Hansen

Nov 4, 2003, 10:42:23 AM

A program called "decompyle" can do it, though it's generally not
up to date with the latest Python version until some time afterward.
It may not even be supported any more; Google would help...

-Peter
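Even without a decompiler, the stdlib's `dis` module (shown here with modern Python; the function is a made-up stand-in) makes clear how much structure a .pyc retains -- argument names, constants, and a readable opcode stream:

```python
import dis, io

def price_with_tax(amount, rate):
    # Stand-in for a "secret" algorithm shipped only as bytecode.
    return round(amount * (1 + rate), 2)

# Capture the disassembly of the compiled code object.
buf = io.StringIO()
dis.dis(price_with_tax, file=buf)
listing = buf.getvalue()
# The listing shows LOAD_FAST for each argument, the constant 1,
# and the global name "round" -- plenty for a decompiler to work with.
```

This is why recovering something close to the original source from a .pyc is so much easier than from optimized machine code: the bytecode sits only a small step below the source.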

Alex Martelli

Nov 4, 2003, 10:54:30 AM
Erik Max Francis wrote:

> "John J. Lee" wrote:
>
>> Oh, absolutely, but we're not (weren't, anyway) talking about a
>> consumer app here, but a 1000 line Python program written by a
>> consultant for a company.
>
> If he wrote the software for the company under the agreement that they'd
> buy it if they liked it, what's the likelihood that this is a concern at
> all? I would hope that if he had a real concern that this was a likely
> outcome -- that they'd talk to him, wait for him to deliver a product,
> and then refuse to pay him -- that he wouldn't be dealing with them at
> all anyway.

Well, in the real world of consulting (in Southern Europe, at least),
getting your invoices actually *paid* by the companies you've done work
for isn't necessarily trivial (which may explain why I "commute" to
AB Strakt Sweden, and also telework for them... they _do_ pay their
bills, and on time too!, besides being a great firm in other ways:-).

This has little to do with "selling software", necessarily -- services
are of course more problematic still, but even if you sell durable
goods that might in theory be repossessed for non-payment, the amount
of trouble, cost and impact on cash-flow can be problematic. Francis
Fukuyama's "Trust: human nature and the reconstitution of social
order" is highly recommended reading on the underlying problems --
but unless one wants to emigrate, even being fully aware of them
doesn't necessarily take one much closer to avoiding them:-).


Alex

Alex Martelli

Nov 4, 2003, 11:31:08 AM
John J. Lee wrote:

> Alex Martelli <al...@aleax.it> writes:
> [...]
>> Part of the problem is, that the "warezdoodz culture" is stacked
>> against you. If you DO come up with a novel approach, that is a
> [...]
>
> Ah, stop right there (oops, too late!-). I think we're somewhat at
> cross-purposes. I was talking about protecting something more at the
> level of source code than running programs.

Oh, "shrouding"? Sure, you can do that. Many programs might
actually be _enhanced_ by that approach (at least the variable
and function names, while not helpful, aren't actively hostile:-),
but probably not Python programs.


> I mostly agree with you on the issue of protecting "binaries", but:
>
>> Part of the problem is, that the "warezdoodz culture" is stacked
>> against you. If you DO come up with a novel approach, that is a
> [...]
>
> Though information is indeed always incomplete, it seems a good bet
> that war3zd00dz are not an issue for a consultant being hired by a
> company to write a 1000 line program. Do you disagree?

A Python 1000-SLOC program may be about 200+ function points --
not exactly trivial (it may be equivalent to more than 10,000
lines of C, easily) though not earth-shaking. But, anyway,
we weren't talking about somebody being _hired_, but rather
wanting to sell what they independently came up with the idea
of developing -- there's a difference! And yes, it wouldn't
be the first time that a company deliberately exploits the warez
"circuit" to get programs cracked -- look around and you'll see
it's definitely NOT just games and the like that end up there.


> Anyway, back to source vs. binaries. Obviously, code that's closer to
> the "source" end of the spectrum has additional value. I'd got the
> impression that something rather similar to the original source could
> be recovered from Python byte-code, due to its high-level nature
> (albeit obviously missing a lot of stuff -- including all those
> valuable names). Certainly that's impossible with optimising
> compilers (I should have stated this much more strongly in my last
> message, of course -- there's no "may" or "guessing" involved there,
> unlike the Python case, where I don't know the answer).

If you think you do, "you're in denial". Check out:

http://www.program-transformation.org/twiki/bin/view/Transform/DecompilationPossible
http://boomerang.sourceforge.net/
http://www.itee.uq.edu.au/~cristina/dcc.html#dcc

I suspect it must in some way be easier (but my multiplicative
constants, not O(N) easier...;-) for lower "semantic gaps" -- but
that intuition might well be misguided (it's close to "it must in
some way be easier to produce optimal machine code for a CISC
than for a RISC", and that's simply not true).


Alex

Bengt Richter

Nov 4, 2003, 2:53:33 PM
On Tue, 04 Nov 2003 09:03:51 GMT, "Andrew Dalke" <ada...@mindspring.com> wrote:

>Bengt Richter:
>> OTOH, we are getting to the point where rather big functionality can be put
>> on a chip or tamper-proof-by-anyone-but-a-TLA-group module. I.e., visualize
>> the effect of CPUs' having secret-to-everyone private keys, along with public keys,
>
>Actually, we aren't. There have been various ways to pull data off
>of a smart card (I recall reading some on RISKS, but the hits I
>found are about 5+ years old). In-circuit emulators get cheaper and
>faster, just like the chips themselves. And when in doubt, you can
>buy or even build your own STM pretty cheap -- in hobbyist range
>even (a few thousand dollars).

Even if you knew exactly where on a chip to look, and it wasn't engineered
to have the key self-destruct when exposed, what would you do with the key?
You'd have the binary image of an executable meant to execute in the secret-room
processing core. How would you make it available to anyone else? You could re-encrypt
it with someone else's specific public key. Or distribute a program that does that,
along with the clear binary. But what if the program contains an auth challenge for the target
executing system? Now you have to reverse engineer the binary and see if you can modify it
to remove challenges and checks and still re-encrypt it to get it executed by other processors.
Or you have to translate the functionality to a program that runs in clear mode on the ordinary cores.
Sounds like real work to me, even if you have a decompyler and the inter-core comm specs.
Of course, someone will think it's fun work. And they would get to start over on the next program,
even assuming programs encrypted with the public key of the compromised system would be provided,
so there better not be a watermark left in the warez images that would identify the compromised
system. Or else they would get to destroy another CPU module to find its key. Probably easier the
second time, assuming no self-destruct stuff ;-)

>
>> and built so they can accept your precious program code wrapped in a PGP
>> encrypted message that you have encrypted with its public key.
>
>Some of the tricks are subtle, like looking at the power draw.
>Eg, suppose the chip stops when it finds the key is invalid. That
>time can be measured and gives clues as to how many steps it
>went through, and even what operations were done. This can
>turn an exponential search of key space into a linear one.

That was then. Plus remember this would not be an isolated card chip that you can
probe, it's one or more specialized cores someplace on a general purpose
multi-cpu chip that you can't get at as a hobbyist, because opening it without destroying
what you want to look at requires non-hobby equipment, by design.

>
>> This is not so magic. You could design a PC with a locked enclosure and special BIOS
>> to simulate this, except that that wouldn't be so hard to break into. But the principle
>> is there. Taking the idea to SOC silicon is a matter of engineering, not an idea break-through
>> (though someone will probably try to patent on-chip stuff as if it were essentially different
>> and not obvious ;-/)
>
>But the counter principle (breaking into a locked box in an uncontrolled
>environment) is also there. There are a lot of attacks against smart
>cards (eg, as used in pay TV systems), which cause improvements (new
>generation of cards), which are matched by counter attacks.
>
>These attacks don't require the resources of a No Such Agency,
>only dedicated hobbyists with experience and time on their hands.
>

Sounds like an article of faith ;-)

> Andrew
> da...@dalkescientific.com
>P.S.
> I did have fun breaking the license protection on a company's
>software. Ended up changing one byte. Took about 12 hours.
>Would have been less if I knew Solaris assembly. And I did
>ask them for permission to do so. :)

Changed a conditional jump to unconditional? Some schemes aren't so
static and centralized ...

I once ran into a scheme that IIRC involved a pre-execution snippet of code that had to
run full bore for _lots_ of cycles doing things to locations and values in
the program-to-be-executed that depended on precise timing and obscured info.
I guess the idea was that if someone tried to trace or step through it, it would
generate wrong locations and info and also stop short and the attacker would have
to set up to record memory addresses and values off the wires to figure out what to change,
but even then they would run into code that did mysterious randomly sprinkled
milestone checks, so capturing the core image after start wasn't free lunch either.
Plus it had to run in a privileged CPU mode, and it wasn't a stand-alone app, it
was part of an OS ... that wasn't open source ... and you didn't have the tools to rebuild...
This was just some code I stumbled on, I may have misunderstood, since I didn't
pursue it, being there for other reasons. But that was primitive compared to what you
could do with specialized chip design.

Regards,
Bengt Richter

Andrew Dalke

Nov 4, 2003, 4:03:09 PM
Bengt Richter:

> Even if you knew exactly where on a chip to look, and it wasn't engineered
> to have the key self-destruct when exposed, what would you do with the key?

As I mentioned, I'm not a hardware guy. What I know is that
trying to hide code on a chip is open to its own sorts of attacks and
that at least some companies which have tried to do so (like pay-TV
companies) have had their systems broken, and not broken by the
efforts of a three letter agency.

> That was then.

Yup.

Crypto software is notoriously hard to write. Even with the good
libraries we have now, people make mistakes and misuse them.
Similarly, while people can make chips that are hard to decode,
doing so in practice is likely to be even harder.

> Plus remember this would not be an isolated card chip that you can
> probe, it's one or more specialized cores someplace on a general purpose
> multi-cpu chip that you can't get at as a hobbyist, because opening it
> without destroying what you want to look at requires non-hobby equipment, by design.

Then perhaps a small business instead of a hobbyist. Still
doesn't require large government-based efforts to crack it.

> Changed a conditional jump to unconditional? Some schemes aren't so
> static and centralized ...

That's what I did for that one.

Another time I was working for a company. We were shipping a time-locked
program and had forgotten about it (there was a big lay off a few months
before I came in). Then one day our customers started complaining that
it wasn't working. The builds were done at another site, so as a backup
plan I figured out how to break our own scheme. In that one I looked
for the time call then added some code to shift the return value some
arbitrary time in the future.

> I once ran into a scheme that IIRC involved a pre-execution snippet of code that had to
> run full bore for _lots_ of cycles doing things to locations and values in
> the program-to-be-executed that depended on precise timing and obscured info.

Another one is the code in an old copy of MS Windows which gave a
warning message when run on top of DR-DOS (which had about 1/300th
of the marketplace). See
http://www.ddj.com/documents/s=1030/ddj9309d/

But these are defeatable. For example, run the program under an
emulator, save the state after the license check occurs, and restart
by reloading from that state. Or write a wrapper which intercepts
all I/O calls and returns the results from a known, good call (a
replay attack).
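Andrew's replay idea is easy to demonstrate against a naive time-lock in Python (a toy check with made-up values, not any real product's scheme): intercept the environment call and feed back a known-good answer.

```python
import time

EXPIRY = 1_000_000_000.0  # some long-past deadline (toy value)

def licensed_feature():
    # Naive time-lock: trusts whatever the environment reports.
    if time.time() > EXPIRY:
        raise RuntimeError("trial period expired")
    return "useful result"

# The replay attack: replace the clock call with a recorded
# known-good value from before the deadline.
real_time = time.time
time.time = lambda: EXPIRY - 1.0
try:
    result = licensed_feature()  # succeeds despite the expired lock
finally:
    time.time = real_time  # restore the real clock
```

The defender's code never sees anything unusual, which is exactly why checks that depend on the honesty of their environment are so fragile.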

The system you propose seems to require a lot of changes to how
existing computer hardware is built, so that people can attach
dedicated compute resources. It's definitely an interesting idea,
and not just for program hiding. But a big change.

I'll end with a quote from E.E. "Doc" Smith's "Gray Lensman"
(p10 in my copy)

Also, there was the apparently insuperable difficulty of
the identification of authorized personnel. Triplanetary's best
scientists had done their best in the way of a non-counterfeitable
badge -- the historic Golden Meteor, which upon touch impressed
upon the toucher's consciousness an unpronounceable, unspellable
symbol -- but that best was not enough. /What physical science
could devise and synthesize, physical science could analyze and
duplicate/; and that analysis and duplication had caused trouble
indeed.

(I realize this is not at issue and that everyone agrees this
to be the case. I just like the quote ;)

Andrew
da...@dalkescientific.com


Will Stuyvesant

Nov 4, 2003, 4:11:16 PM
Thank you all for your input and thoughts! I am replying to my first
post since otherwise I would have to choose one of the threads and I
can't: several are useful.

I got the customer's interest via a CGI based webservice that did show
what I can do. Given this, now it would be very simple for me to keep
it as a webservice and give them a client script using the urllib
module or something like that...it would access the webservice and
they would never see the algorithm (I want to keep that to myself).
That is the solution I would like best. Not only because I want to
keep stuff to myself, but also because then I can easily upgrade to
new versions in one fell swoop if they or others are interested.

But a program that has to be connected to the internet is not
acceptable, their boss says. Humph.

Maybe I should trust them and do normal business as I did before with
other customers: just send them the command-line version of the
program; but somewhere in the back of my mind I feel uneasy about
this. Dunno why. Maybe just because they have a pointy-haired boss.

Alex Martelli told about the software business in southern Europe; well,
it's one big EU now, with the Italian Berlusconi as chairman (soon
appointed invulnerable for that job too?), and I feel it's the same
thing now here in mid-Europe too.

I guess I am going to send them an MIT licence (although I am afraid
those licences are pretty useless in The Netherlands), the .pyc files
for the algorithm and the utility modules, and a .py file for the main
program. Or maybe I am going to use upx or another .exe encrypter
(but then I first have to find out if I can wrap my head around the
usage of the latest py2exe thing). Then if I find out later that they
copied the algorithm, maybe I have a case against them. Let's just
hope it doesn't come to that; I'd rather start another project.
But this little thing I like, and it would be great if I could make
some extra money with it. Curious? It does XML transformations.
With a twist.

Will let you know how it works out, takes a couple of weeks perhaps
until they decide.

Thank you!

Peter Hansen

Nov 4, 2003, 4:45:40 PM
Will Stuyvesant wrote:
>
> I guess I am going to send them an MIT licence (although I am afraid
> those licences are pretty useless in The Netherlands), the .pyc files
> for the algorithm and the utility modules, and a .py file for the main
> program.

No need even for that .py file, is there? You can easily execute
a .pyc file directly if the .py doesn't exist. To create it,
easiest way is just to do "import mymainmodule" from the interactive
prompt, then copy the resulting .pyc somewhere else.

Alternatively, keep your .py as the main entry point, but do nothing
inside it except import another module and execute code from it.
But if you can do that, then invoking that module directly is easy
too: "python -c 'import main; main.main()'" does the trick...

-Peter
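Peter's first suggestion (ship only the .pyc) can be sketched with the stdlib py_compile module in modern Python; the module name and its contents below are made up for illustration:

```python
import os, py_compile, subprocess, sys, tempfile

# Write a stand-in for the "main module" whose source we won't ship.
workdir = tempfile.mkdtemp()
src = os.path.join(workdir, "mymainmodule.py")
with open(src, "w") as f:
    f.write("print('algorithm ran')\n")

# Compile to an explicitly named .pyc, then delete the source.
pyc = os.path.join(workdir, "mymainmodule.pyc")
py_compile.compile(src, cfile=pyc)
os.remove(src)

# The customer can still run the bytecode file directly.
out = subprocess.run([sys.executable, pyc], capture_output=True, text=True)
```

The customer gets a runnable artifact with no .py anywhere, though, as the rest of the thread notes, the bytecode remains decompilable.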

Erik Max Francis

Nov 4, 2003, 7:06:31 PM
Will Stuyvesant wrote:

> Maybe I should trust them and do normal bussiness as I did before with
> other customers already: just send them the commandline version of the
> program; but somewhere in the back of my mind I feel uneasy about
> this. Dunno why. Maybe just because they have a pointy haired boss.

I'd be a little concerned about why they're asking for a trial demo
which they'll decide whether to pay you for afterward, given that, as
you've said, you've already demonstrated the ability to do the job
with a CGI based service that they've already explored and that is
what in fact drew them to you in the first place. At this point I'd
probably press them on what the difficulty would be in agreeing on
contract fees up front.

I agree with Peter's response, though -- it sounds like in this case,
simply distributing .pyc files (make sure you make it clear which
version of Python the .pyc files were created with, since the format
changes with releases!) would be sufficient obscurity to encourage them
to actually pay you once they've gotten the package and have
demonstrated it to work. I'd still wonder why they're insisting on in
effect a second trial demo, here; I now understand your initial
hesitation.
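The version caveat can even be checked mechanically: every .pyc begins with a magic number identifying the interpreter that wrote it. A quick sketch with the modern stdlib (filenames made up):

```python
import importlib.util, os, py_compile, tempfile

# Compile a trivial module and read back its .pyc header.
workdir = tempfile.mkdtemp()
src = os.path.join(workdir, "m.py")
with open(src, "w") as f:
    f.write("x = 1\n")
pyc = os.path.join(workdir, "m.pyc")
py_compile.compile(src, cfile=pyc)

with open(pyc, "rb") as f:
    magic = f.read(4)
# A .pyc only loads if this header matches the running interpreter.
matches = (magic == importlib.util.MAGIC_NUMBER)
```

So shipping .pyc files effectively pins the customer to one interpreter release, which is worth stating in the delivery notes.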

> I guess I am going to send them an MIT licence (although I am afraid
> those licences are pretty useless in The Netherlands) ...

If you're planning on reselling this to other customers, why would you
want an MIT license?

I'd also make it very clear, if you haven't already -- before you give
them anything -- whether they think you're transferring the copyright
to them or merely selling them a license to use it, which leaves you
with the ability to resell it to other clients.

--
Erik Max Francis && m...@alcyone.com && http://www.alcyone.com/max/
__ San Jose, CA, USA && 37 20 N 121 53 W && &tSftDotIotE

/ \ To understand is to forgive, even oneself.
\__/ Alexander Chase

John J. Lee

Nov 4, 2003, 8:53:47 PM
bo...@oz.net (Bengt Richter) writes:

> On Tue, 04 Nov 2003 09:03:51 GMT, "Andrew Dalke" <ada...@mindspring.com> wrote:
>
> >Bengt Richter:
> >> OTOH, we are getting to the point where rather big functionality can be put
> >> on a chip or tamper-proof-by-anyone-but-a-TLA-group module. I.e., visualize
> >> the effect of CPUs' having secret-to-everyone private keys, along with
> >> public keys,
> >
> >Actually, we aren't. There have been various ways to pull data of
> >of a smart card (I recall readings some on RISKS, but the hits I
> >found are about 5+ years old). In circuit emulators get cheaper and
> >faster, just like the chips themselves. And when in doubt, you can
> >buy or even build your own STM pretty cheap -- in hobbiest range
> >even (a few thousand dollars).
>
> Even if you knew exactly where on a chip to look, and it wasn't

(Which knowledge is bound to become available -- I don't think any
leak is required.)


> engineered to have the key self-destruct when exposed, what would

Exposed to what?


> you do with the key? You'd have the binary image of an executable
> meant to execute in the secret-room processing core. How would you

No, you already have that -- it's on your hard drive (the current
scheme is only about the processor & associated gubbins, if I read
Ross Anderson's page right).


> make it available to anyone else?

[...]

Copy it.

I think the idea is something like this (got from Ross Anderson's TC
FAQ). The processor makes sure that a single process can only see
its own memory space. The processor also has a private key, and
knows how to take an md5 sum (or whatever), sign it with the key, and
send that off to the software author's server along with your
identity. The server checks that it was signed with your processor's
private key, and that you've paid for the software, and sends a
signed message back that tells your machine "OK". Obviously (hmm... I
should hesitate to use that word about anything related to security!),
if you have your machine's private key, you can play
"man-in-the-middle".

Presumably the next phase is to make hard drives, etc. 'trusted'. I
couldn't find much useful stuff on this on the web. Anybody have any
good links to overviews of this?
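The attestation round-trip John describes can be sketched with a symmetric MAC standing in for the processor's private-key signature (a toy model with invented names; real schemes use asymmetric keys, nonces, and certificate chains):

```python
import hashlib, hmac

MACHINE_KEY = b"secret-burned-into-the-cpu"  # hypothetical per-CPU key
PAID_CUSTOMERS = {"customer-42"}             # the vendor's records

def attest(program_bytes, identity):
    # The "processor": hash the loaded program and sign the digest.
    digest = hashlib.md5(program_bytes).digest()
    sig = hmac.new(MACHINE_KEY, digest, hashlib.md5).hexdigest()
    return digest, sig, identity

def server_check(digest, sig, identity):
    # The vendor's server: verify the signature and the payment record.
    expected = hmac.new(MACHINE_KEY, digest, hashlib.md5).hexdigest()
    return hmac.compare_digest(sig, expected) and identity in PAID_CUSTOMERS

digest, sig, who = attest(b"...program image...", "customer-42")
ok = server_check(digest, sig, who)
```

The man-in-the-middle John mentions is visible right in the sketch: anyone holding MACHINE_KEY can forge the attestation, which is why extracting the hardware key breaks the whole scheme.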


John

Max M

Nov 5, 2003, 3:51:39 AM
Erik Max Francis wrote:

> I'd be a little concerned why they're asking for a trial demo which
> they'll decide whether to pay you for afterward given that, as you've
> said, you've already demonstrated the ability to do the job with a CGI
> based service that they've already explored and is what in fact drew
> them to you in the first place. At this point I'd probably press on
> about what the difficulty is in agreeing on contract fees up front would
> be.


I find this odd too. I have never had to deliver a working version of a
product before the customer decided whether they wanted to pay me or not.

Just give them an offer for the delivered programme, specifying which
features it will have.

Then they either say yes or no, and you won't have to go through all
that trouble.


regards Max M

John J. Lee

Nov 5, 2003, 10:22:08 AM
to al...@aleax.it
Alex Martelli <al...@aleax.it> writes:

> John J. Lee wrote:
[...]
> > Though information is indeed always incomplete, it seems a good bet
> > that war3zd00dz are not an issue for a consultant being hired by a
> > company to write a 1000 line program. Do you disagree?

[...]


> we weren't talking about somebody being _hired_, but rather
> wanting to sell what they independently came up with the idea
> of developing -- there's a difference! And yes, it wouldn't

Right. Substitute "a consultant selling a not-widely-distributed 1000
line program to a company" in what I said, though, and I think it's
still a good bet.


> be the first time that a company deliberately exploits the warez
> "circuit" to get programs cracked -- look around and you'll see
> it's definitely NOT just games and the like that end up there.

Oh sure, but don't the vast majority tend to be far more widely
distributed than (I imagine, guessing of course) this 1000 line code
is? Maybe I'm just naive.


[...about decompilation: recovering source-like code from compiled code...]


> > valuable names). Certainly that's impossible with optimising
> > compilers (I should have stated this much more strongly in my last
> > message, of course -- there's no "may" or "guessing" involved there,
> > unlike the Python case, where I don't know the answer).
>
> If you think you do, "you're in denial". Check out:
>
> http://www.program-transformation.org/twiki/bin/view/Transform/DecompilationPossible
> http://boomerang.sourceforge.net/
> http://www.itee.uq.edu.au/~cristina/dcc.html#dcc

[...]

OK, OK, not impossible if you have knowledge of the way compilers
actually do things (and, sigh... security is always about 'cheating',
isn't it). Still, even given that, the sample input / output on the
second two pages, though impressive, appear to show that doing it in
practice is far from a solved problem (assuming this is representative
of the state of the art). One would expect that it's far harder to do
this with optimised languages than with Python -- true?


John
