[Hardware] Notes... The case for an open client
jbass at dmsd.com
Mon Aug 16 16:29:24 EDT 2004
Elektron <elektron_rc5 at yahoo.ca> writes:
> What about the bits in between? Moving 16-bit segments? At least 5
> loads are required, adding 5 cycles to 300, or about 2%. 2% is a hell
> of a lot when you're talking about a 2^72 keyspace.
The bits in between simply mean searching a small amount of the keyspace early.
In hardware, and in other concurrent programming environments with excess
PEs, any task that is outside the critical path is "free" in terms of
contributing to the function's latency. That is certainly the case here,
since the key increment, and the indirection through the lookup tables, are
completely overlapped by the key test.
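The arithmetic behind that claim can be sketched quickly. The cycle counts below are illustrative placeholders (borrowing the 300-cycle test and 5-cycle load figures from the quoted message), not measurements of any real design:

```python
# Sketch: with one processing element per task, latency is the
# longest single path, not the sum of all work. Cycle counts here
# are hypothetical, taken from the figures quoted above.

KEY_TEST = 300      # cycles for the key test (the critical path)
KEY_INCREMENT = 3   # cycles for the key increment
TABLE_LOOKUP = 5    # cycles for indirection through the lookup tables

# Sequential (single-processor) cost: everything adds up.
sequential = KEY_TEST + KEY_INCREMENT + TABLE_LOOKUP

# Concurrent (hardware) cost: increment and lookup run in parallel
# with the test, so latency is the max over the parallel paths.
concurrent = max(KEY_TEST, KEY_INCREMENT + TABLE_LOOKUP)

print(sequential)  # 308
print(concurrent)  # 300
```

So long as the side work fits under the critical path, it costs nothing in latency, which is why the 2% figure above only applies to a sequential implementation.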
I am not concerned with the impact it might have on traditional processors,
as I do not propose to implement it there.
> I may be argumentative, but at least I am civil.
You are not civil when you state that I have no right to explore this matter
because I lack complete formal training in it.
You are not civil when you ridicule my interest in exploring this line
of thought because it's based on weak evidence and personal experience.
Especially when you beat your chest as being highly enlightened, then
proceed to construct flawed arguments that any Traditional Logic student
would recognize after the second week.
I conjecture that A may be false.
Your flawed proof is "If A is true, then ...", without ever establishing
that A is true.
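The fallacy is easy to demonstrate mechanically. This is the standard truth table for material implication, nothing specific to this discussion:

```python
# Truth table for material implication: (A -> B) == (not A) or B.
# The point: "If A is true, then B" holds vacuously whenever A is
# false, so asserting the implication without first establishing A
# proves nothing about B.
def implies(a, b):
    return (not a) or b

for a in (True, False):
    for b in (True, False):
        print(f"A={a!s:5} B={b!s:5} A->B={implies(a, b)}")
```

With A false, "A -> B" comes out true for either value of B, so the implication alone carries no weight.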
You are not civil when, in the end, your chest-thumping, high-sounding
"proof" boils down to your attempt to claim the right to assert as fact
things which are unsubstantiated and which you cannot prove true.
Things that you take on faith.
"I don't claim to be a math god. I just doubt that RSA would happily use
a flawed algorithm, after having designed a decent cipher."
"I only claim that there's no point doing what you suggest if the keys
are entirely random, and RSA would be smart enough to produce keys that
are as close to 'entirely random' as possible."
I have made my living for 35 years examining and correcting industry
failures rooted in teams' inability to re-examine the fundamental truths
they based their designs on: critical design flaws that resulted
from accepting as fact prior decisions that rested on a foundation
that was no longer true, and thus invalidated the whole chain of decisions
that followed. In short, when someone stands in front of me with the defense
"everyone knows", my first response is to question that assertion and
fully explore the logic chain behind it to uncover all the unstated
assumptions in the design, especially those that are no longer true.
I have given dozens of industry seminars and talks on this subject,
exploring case studies of "everybody knows" gone wrong.
My tag line in these talks is: "It's not what you know that hurts
your projects, it's what you think you know".
So when you stand before me in this discussion, huffing and puffing,
that I don't have a right to explore this line of thought, because
you are the expert:
> Please don't talk about "probabilistic number theory"
> without any knowledge; this is A-level statistics, which I learned at
and then construct fundamentally flawed arguments as your
chest-thumping proof - don't be surprised if someone calls your bluff.
I suspect that I am not the only one in this forum unimpressed
that your proof, in the end, was just trust in RSA.