[RC5] [RC5-Mac] the DEATH of d.net?

Greg Delisle gdelisle at indiana.edu
Tue Jul 21 11:14:30 EDT 1998


At 3:48 AM -0500 on 7/21/98, Patrick T Kent wrote [on rc5mac]:

> It seems to me that at the moment we are either wasting time trying to
> crack a code for which a super computer now exists for that sole purpose
> and can achieve the results in a faster time than we can do it. Or we
> are working on a project that again seems a complete waste of time in
> that it is unlikely we will ever complete it before beginning a more
> important (HOPEFULLY!!) project. And even if we continue in that, isn't
> it likely the same super computer can do a better job?

Indeed! This question has been bugging me for a while now. It seems to
me that in the founding days of d.net, the rationale for choosing this
project was that it required resources on a supercomputer scale, and
that it would show that supercomputer-scale work could be done by
those without supercomputer-scale funds. Well, that was a year or two
ago, and now it seems we may have been overtaken by Moore's Law --
which stipulates not merely a doubling of processor power every 18
months or so, but a doubling _at the same price_. Not only more
powerful, but cheaper too. Consider the scale involved: upgrading all
of distributed.net's processors would cost millions of dollars, while
in 18 months or so Deep Crack will either be twice as fast or cost
half of its reported $250,000 price -- about $125,000. True, Deep
Crack is a hardware solution that can't be used for RC5 in its present
form, but since its design is public (as long as you shell out the
dough for the EFF's book), Deep Crack clones are sure to follow, and
possibly hardware designed to crack RC5. If that happens,
distributed.net won't stand a chance against dedicated key-search
hardware.

Likewise, the ramp-up latency that allowed Deep Crack to jump out to
such a commanding lead in DES-II-2 will still exist for DES-II-3 and
all later projects, even if v3 clients manage to shrink that window
through intelligent scheduling. Deep Crack will never have this
problem, and though by the time DES-II-3 rolls around we may have, as
Adam has said, 2.6 times as much processing power, if Deep Crack (or
"Deep Crack II: Crack Deeper") gets the same sort of lead in the first
day, we will never beat it.
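
To make the ramp-up problem concrete, here is a back-of-envelope
sketch in Python. The only number taken from public reports is Deep
Crack's rate of roughly 90 billion DES keys per second; the d.net peak
rate and the one-day linear ramp are assumptions of mine, made up
purely for illustration:

    # Rough model of the head-start effect: Deep Crack runs flat out
    # from hour zero, while the distributed effort ramps up linearly
    # over its first day (an assumed model, not measured behavior).

    KEYSPACE = 2.0 ** 56     # DES keyspace
    DC_RATE = 9.0e10         # keys/sec, Deep Crack (reported ~90e9)
    DN_PEAK = 9.0e10         # keys/sec, assumed d.net peak rate
    RAMP_H = 24.0            # hours to reach peak rate (assumed)

    def covered(rate_at, hours, step=0.1):
        """Integrate a keys/sec rate over 'hours' of wall-clock time."""
        total, t = 0.0, 0.0
        while t < hours:
            total += rate_at(t) * step * 3600.0
            t += step
        return total

    dc = covered(lambda t: DC_RATE, 24.0)
    dn = covered(lambda t: DN_PEAK * min(t / RAMP_H, 1.0), 24.0)
    print("Day 1, Deep Crack: %4.1f%% of keyspace" % (100.0 * dc / KEYSPACE))
    print("Day 1, d.net:      %4.1f%% of keyspace" % (100.0 * dn / KEYSPACE))

With equal peak rates, that first-day deficit of about five percent of
the keyspace is never made up; with a higher peak rate it is made up
only slowly, which is what matters in a contest that may last only a
couple of days.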

The other motivating idea behind distributed.net was to promote the
potential of distributed computing, that is, the sharing of processing
power over a network. Not long ago, Sun introduced its Jini
spec/project/thingy, which promises to be the practical realization of
that idea. One reason code-cracking was chosen to demonstrate d.net's
power is that it can be coded and maintained with little effort
(relative to creating commercially viable apps on a business-cycle
schedule -- no offense to the hard-working folks of d.net). Nobody
needs a code-cracking client to do their daily work, but it can be
built and run relatively trouble-free by a handful of motivated
volunteers. With the advent of Jini, however, all sorts of apps will
be distributed -- spreadsheets, e-mail, graphics, modeling, databases,
multimedia, you name it... but probably not code cracking, because
nobody needs it to get their daily work done. This poses a two-pronged
relevance challenge for d.net: on the one hand, our point has been
proven; on the other, our efforts produce little of value.

Thus, it seems to me that the entire distributed.net project is in
danger of disappearing, caught on the horns of a dilemma -- our claim
of "speed through sharing" is being challenged on the speed end by
Deep Crack and on the sharing end by Jini. With these twin challenges,
I fear that d.net will find it harder and harder to gain new recruits,
and easier to lose current participants. Perhaps distributed.net needs
to rethink things at the top level: instead of spending our efforts
making faster, more efficient clients to do the same work, we should
be looking for more valuable work. Think of this: the most optimistic
estimates of our RC5-64 project's duration are measured in years (see
the back-of-envelope sketch below). While it's true that the winning
key could be found today, not many of us expect that to happen. If it
does in fact take years to dig out the key, how much satisfaction will
you have derived from it? And, as Patrick has wisely pointed out, how
much will it have cost us in resources -- electricity being the major
"waste" -- to find the winning key? I suppose if I were Adam Beberg I
might see things differently, but I'm not, and despite the charities,
the prize money, and my own feelings about government encryption
policies, I'm seriously rethinking my commitment to distributed.net.
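
For the record, the arithmetic behind "measured in years" is simple.
Here is a minimal Python sketch; the sustained keyrate is purely an
assumed figure (substitute the real number from the stats pages):

    # Expected time to find an RC5-64 key by exhaustive search. On
    # average the key turns up after half the keyspace is searched.

    KEYSPACE = 2.0 ** 64     # RC5-64 keyspace
    RATE = 3.0e10            # keys/sec sustained -- an assumed figure
    SECONDS_PER_YEAR = 3600.0 * 24.0 * 365.0

    expected_years = (0.5 * KEYSPACE / RATE) / SECONDS_PER_YEAR
    print("Expected search time: %.1f years" % expected_years)

At an assumed 30 billion keys per second this works out to nearly ten
years, and even a tenfold speedup would only bring it down to about a
year.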

On a personal note -- my individual stats for RC5-64 have touched the
2100th rank, and as of 7/21 my contributions amounted to 93,989 blocks of
keys. I check my stats nearly every day to see if I have gone up a notch or
two. I am a member of Team Evangelist, and it does give me some
satisfaction to see us at the top of the mountain. All the same, I don't
feel like my efforts, my contributions, are doing the world much good.
Unless this feeling changes, I will probably withdraw myself and my
machines from the project soon.

[PS -- I am posting this to both rc5mac and rc5. Anyone else who would like
to forward this to other d.net lists I am not on may do so with my
permission.]


-Greg Delisle
Indiana University Press Journals
http://www.indiana.edu/~iupress/journals/

