[rc5] Overclocking (was: Another approach)

Del Conte ddelcon at voicenet.com
Sun Jul 6 16:53:36 EDT 1997

	I must agree that accuracy is very important in this effort, but I do
not feel that overclocking affects the reliability of the results.  The
reason is that most overclockers use Intel chips, and Intel has a habit of
underrating the MHz speed of its chips.  This means that a chip sold as a
150 can often run comfortably at 166.  Other manufacturers tend to rate
their chips closer to their limits, which makes overclocking harder.

	Also, if an error were to occur while a chip was overclocked, it would
likely cause the computer to crash.  The block would then not finish, and
the results would never be sent back.  The chances of a flaw slipping into
the results are so low that I believe it is not even a factor.

	Do others feel that overclocking could cause unreliable data?


At 01:23 PM 7/6/97 -0600, you wrote:
>	I have a concern about increasing the proportion of clients
>running on overclocked machines, possibly due to my ignorance of the
>overclocking experience.
>	My impression is that nominal chip/processor speeds are based
>upon the speed at which that particular component is known to run
>_reliably_, i.e., without errors.  And that practically any chip _can_
>run at a faster speed (i.e., boot up, respond to instructions, output
>results), but that there is a point (probably based on increasing
>internal temperatures) at which errors start to occur.  These errors may
>or may not be detectable by the user.  The computer itself can't tell
>which numbers are accurate and which are bogus, and merrily rolls along,
>like the original Pentium chip with its FDIV bug, until enough
>problematic errors occur that the machine crashes/turns pink/writes
>Hamlet.  Overclockers tolerate this risk because the speed increase of
>10+ % outweighs the negligible risk of a crash or undetected
>errors (Screen goes gray? Reboot, BFD).
>	However, in other contexts, accuracy and reliability of results
>is important.  If I were in charge of an expensive, time-consuming
>project, I would either use the most reliable equipment available to do
>the processing--or use faster, less-reliable equipment to do the
>processing TWICE (on the notion that squaring the small probability of
>heat-related errors generates sufficient certainty).  Any numbers on
>reliability of overclocked processors? Even a 1-in-a-trillion
>probability of undetected errors could render uncertain a significant
>number of potential keys. 
>	- Richard "Let's go through 2^56 keys.  Twice!" Ebling
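Richard's "process TWICE" argument can be made concrete with a toy calculation. A minimal sketch, assuming independent passes and a purely hypothetical per-block silent-error rate (the probability `p` and the 2^28-key block size are illustrative assumptions, not measured figures for any real overclocked CPU):

```python
def undetected_error_prob(p, runs=2):
    """Probability that all `runs` independent passes over a block produce
    the same wrong answer, so a comparison between them misses the error.
    Assumes errors in separate runs are independent, as the quoted post
    suggests ("squaring the small probability")."""
    return p ** runs

p = 1e-12          # hypothetical per-block silent-error rate
blocks = 2 ** 28   # e.g. a 2^56 keyspace split into 2^28-key blocks

single = undetected_error_prob(p, runs=1) * blocks
double = undetected_error_prob(p, runs=2) * blocks
print(f"expected bad blocks, single pass: {single:.3e}")
print(f"expected bad blocks, double pass: {double:.3e}")
```

Under these toy numbers a single pass leaves a small but nonzero expected number of silently bad blocks, while duplicating the work drives the expectation far below one, which is the certainty-by-squaring notion in the quote.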

To unsubscribe, send email to majordomo at llamas.net with 'unsubscribe rc5' in the body.
