The following test code fails on Mac OS 10.5 with "i686-apple-
darwin9-g++-4.0.1 (GCC) 4.0.1 (Apple Inc. build 5490)". It prints out
N8CryptoPP50DL_PrivateKey_WithSignaturePairwiseConsistencyTestINS_16DL_PrivateKey_ECINS_3ECPEEENS_5ECDSAIS2_NS_4SHA1EEEEE
and then the dynamic_cast yields NULL. (Thanks to Brian Warner for
running that test.)
On Mac OS 10.5 with "i686-apple-darwin9-g++-4.0.1 (GCC) 4.0.1 (Apple
Inc. build 5484)" it emits the same type information and then the
dynamic_cast succeeds and yields a pointer to an object that appears
to work. (Thanks to Kevin Reid for running that test.)
On Mac OS 10.4 with "i686-apple-darwin8-g++-4.0.1 (GCC) 4.0.1 (Apple
Computer, Inc. build 5367)" it emits the same type information and
then the dynamic_cast succeeds. (Thanks to myself for running that
test.) Also on many of the other platforms on the pycryptopp
buildbot [1] the equivalent dynamic_cast (built into the pycryptopp
unit tests) succeeds.
So there are two questions I have:
1. What changed between build #5484 and build #5490 of Apple's
version of g++ v4.0.1 that makes this dynamic_cast return NULL?
2. Why does the type information contain the string "SHA1" when the
private key was declared as using Tiger? Brian instrumented his test
code to print the type of the "::Signer" instance s, and it was:
"PN8CryptoPP16PK_FinalTemplateINS_13DL_SignerImplINS_25DL_SignatureSchemeOptionsINS_5DL_SSINS_13DL_Keys_ECDSAINS_3ECPEEENS_18DL_Algorithm_ECDSAIS5_EENS_37DL_SignatureMessageEncodingMethod_DSAENS_5TigerEiEES6_S8_S9_SA_EEEEEE".
(A small sketch for demangling these names appears below, just before
the test code.)
Regards,
Zooko
[1] http://allmydata.org/buildbot-pycryptopp/waterfall
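The mangled strings quoted above are the raw output of typeid().name().
As an aside (not part of the original messages), they can be turned back
into readable C++ type names with the GCC/Apple C++ ABI demangler,
abi::__cxa_demangle; a minimal sketch:
------- begin demangling sketch (not from the original messages)
// Feeds a mangled name from typeid().name() through the GCC/Apple C++ ABI
// demangler so the template parameters become readable.
#include <cxxabi.h>
#include <cstdlib>
#include <iostream>
#include <string>

static std::string demangle(const char *mangled) {
    int status = 0;
    // __cxa_demangle returns a malloc()ed buffer that we must free ourselves.
    char *readable = abi::__cxa_demangle(mangled, 0, 0, &status);
    std::string result = (status == 0 && readable) ? readable : mangled;
    std::free(readable);
    return result;
}

int main() {
    // The name printed by the failing test, pasted in as a string literal.
    const char *name =
        "N8CryptoPP50DL_PrivateKey_WithSignaturePairwiseConsistencyTest"
        "INS_16DL_PrivateKey_ECINS_3ECPEEENS_5ECDSAIS2_NS_4SHA1EEEEE";
    std::cout << demangle(name) << "\n";
    return 0;
}
------- end demangling sketch
The same helper can be applied directly to typeid(x).name() inside the
test program below.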
------- begin test code
#include <cryptopp/filters.h>
#include <cryptopp/osrng.h>
#include <cryptopp/eccrypto.h>
#include <cryptopp/oids.h>
#include <cryptopp/tiger.h>
#include <cryptopp/sha.h>
#include <cryptopp/pubkey.h>
#include <cryptopp/rng.h>
#include <iostream>
#include <cryptopp/ecp.h>
#include <cryptopp/hex.h>
#include <typeinfo>
USING_NAMESPACE(CryptoPP)
int main(int argc, char **argv) {
    ECDSA<ECP, Tiger>::Verifier *v;   // declared but unused in this test
    ECDSA<ECP, Tiger>::Signer *s;

    DL_GroupParameters_EC<ECP> params(ASN1::secp192r1());
    params.SetPointCompression(true);
    RandomPool rng;

    // Generate an ECDSA/Tiger key pair over secp192r1.
    s = new ECDSA<ECP, Tiger>::Signer(rng, params);
    std::cout << "xxx 0" << typeid(s->GetPrivateKey()).name() << "\n";
    std::cout.flush();

    // This is the cast that yields NULL with Apple gcc build 5490.
    const DL_PrivateKey_EC<ECP> *privkey =
        dynamic_cast<const DL_PrivateKey_EC<ECP> *>(&(s->GetPrivateKey()));
    std::cout << "xxx 1" << typeid(privkey).name() << "\n";
    std::cout.flush();

    if (!privkey) {
        std::cout << "dynamic_cast failed for s->GetPrivateKey()" << "\n";
        std::cout.flush();
        return -1;
    }
    return 0;
}
------- end test code
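An aside that was not part of the original messages: the failing operation
is specifically the RTTI downcast from the abstract PrivateKey& returned by
GetPrivateKey(). When the key is held by its concrete class,
ECDSA<ECP, Tiger>::PrivateKey (the pairwise-consistency-test wrapper whose
mangled name the test prints, which inherits from DL_PrivateKey_EC<ECP>),
the EC interface is reachable with an ordinary derived-to-base conversion
and no dynamic_cast at all. A minimal sketch:
------- begin no-RTTI sketch (not from the original messages)
// Sketch only: shows that when the key is created as its concrete class,
// the DL_PrivateKey_EC<ECP> interface is reached by a plain derived-to-base
// conversion, so no dynamic_cast (and no RTTI) is needed.
#include <cryptopp/eccrypto.h>
#include <cryptopp/oids.h>
#include <cryptopp/tiger.h>
#include <cryptopp/osrng.h>
#include <cryptopp/integer.h>
#include <iostream>

USING_NAMESPACE(CryptoPP)

int main() {
    AutoSeededRandomPool rng;

    // ECDSA<ECP, Tiger>::PrivateKey is the concrete key class whose mangled
    // name the test prints (the pairwise-consistency-test wrapper around
    // DL_PrivateKey_EC<ECP>).
    ECDSA<ECP, Tiger>::PrivateKey privKey;
    privKey.Initialize(rng, ASN1::secp192r1());

    // Plain derived-to-base conversion; this compiles because the wrapper
    // inherits from DL_PrivateKey_EC<ECP>.
    const DL_PrivateKey_EC<ECP> &ecPrivKey = privKey;

    std::cout << "private exponent bits: "
              << ecPrivKey.GetPrivateExponent().BitCount() << "\n";
    return 0;
}
------- end no-RTTI sketch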
Mac OS 10.4 with "i686-apple-darwin8-g++-4.0.1 (GCC) 4.0.1 (Apple Computer, Inc. build 5367)" (me): succeeds
Mac OS 10.5 with "i686-apple-darwin9-g++-4.0.1 (GCC) 4.0.1 (Apple Inc. build 5484)" (Kevin Reid): succeeds
Mac OS 10.5 with "i686-apple-darwin9-g++-4.0.1 (GCC) 4.0.1 (Apple Inc. build 5490)" (Brian Warner): fails
Mac OS 10.5 with "gcc version 4.2.1 (Apple Inc. build 5566)" (Mouse): succeeds
All other platforms: succeeds
My other question remains, though: why does the following code result
in a string containing the substring "SHA1" instead of "Tiger"? Is
this indicative of a bug, or is it expected?
> s = new ECDSA<ECP, Tiger>::Signer(rng, params);
>
> std::cout << "xxx 0" << typeid(s->GetPrivateKey()).name() << "\n";
> std::cout.flush();
Regards,
Zooko
This is not a bug. SHA1 is only being used to perform a pairwise consistency
test upon key generation when FIPS mode is enabled.
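For what it's worth, here is a sketch (not part of the original exchange,
written against the documented ECDSA key/signer API) that does a sign/verify
round trip with the ECDSA<ECP, Tiger> scheme from the test code; the digest
used to hash the message is the declared Tiger, while SHA1 appears only in
the type name of the key's pairwise-consistency-test wrapper:
------- begin Tiger sign/verify sketch (not from the original messages)
// Sketch only: a sign/verify round trip with ECDSA<ECP, Tiger>, to show that
// the message digest actually used for signing is the declared Tiger; the
// SHA1 in the key's type name only parameterizes the FIPS pairwise
// consistency test that may run at key-generation time.
#include <cryptopp/eccrypto.h>
#include <cryptopp/oids.h>
#include <cryptopp/tiger.h>
#include <cryptopp/osrng.h>
#include <cryptopp/secblock.h>
#include <iostream>
#include <string>

USING_NAMESPACE(CryptoPP)

int main() {
    AutoSeededRandomPool rng;

    // Generate a key pair on secp192r1, the curve used in the test code.
    ECDSA<ECP, Tiger>::PrivateKey privKey;
    privKey.Initialize(rng, ASN1::secp192r1());
    ECDSA<ECP, Tiger>::PublicKey pubKey;
    privKey.MakePublicKey(pubKey);

    ECDSA<ECP, Tiger>::Signer signer(privKey);
    ECDSA<ECP, Tiger>::Verifier verifier(pubKey);

    // Sign a message and verify the signature.
    std::string message = "signing uses Tiger; SHA1 appears only in the key's type name";
    SecByteBlock signature(signer.MaxSignatureLength());
    size_t sigLen = signer.SignMessage(
        rng, (const byte *)message.data(), message.size(), signature);

    bool ok = verifier.VerifyMessage(
        (const byte *)message.data(), message.size(), signature, sigLen);

    std::cout << (ok ? "signature verified" : "signature did NOT verify") << "\n";
    return ok ? 0 : 1;
}
------- end Tiger sign/verify sketch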