std::basic_string<unsigned char>
mscrypt_derive_key_sha1(std::basic_string<unsigned char> password)
{
unsigned char buf1[64];
unsigned char buf2[64];
std::fill_n(buf1, 0x36, sizeof(buf1));
std::fill_n(buf2, 0x5C, sizeof(buf2));
std::basic_string<unsigned char> hash = hash_sha1(password);
for (std::size_t i = 0; i < hash.size(); ++i)
{
buf1[i] ^= hash[i];
buf2[i] ^= hash[i];
}
return hash_sha1(buf1) + hash_sha1(buf2).substr(0, 4);
}
I'm only trying to emulate CryptDeriveKey for 3DES with SHA1, which is why I
simply return 192 bits for the key (SHA1 yields only 160 bits, so the
remaining 32 bits have to come from a second hash). The function hash_sha1
has been implemented and tested separately.
Can anyone confirm whether my implementation is correct? Or is it possible to
output the key that CryptDeriveKey generates on Windows? Then I could at least
run some tests and compare the generated keys to be sure that my
implementation works.
Thanks in advance,
Boris
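For reference, here is a minimal sketch of one way to dump the key that
CryptDeriveKey actually produces, by exporting it as a PLAINTEXTKEYBLOB. This
is only an illustration and assumes the Microsoft Enhanced provider on WinXP
or later; the key must be created with CRYPT_EXPORTABLE, and error handling
is omitted:

#include <windows.h>
#include <wincrypt.h>
#include <cstdio>

// Derive a 3DES key from a SHA1 hash of the password and print the raw
// key bytes obtained from a PLAINTEXTKEYBLOB export.
void dump_derived_key(const unsigned char *password, DWORD password_len)
{
    HCRYPTPROV hProv = 0;
    HCRYPTHASH hHash = 0;
    HCRYPTKEY hKey = 0;

    // The enhanced provider is needed for CALG_3DES.
    CryptAcquireContext(&hProv, NULL, MS_ENHANCED_PROV, PROV_RSA_FULL,
        CRYPT_VERIFYCONTEXT);
    CryptCreateHash(hProv, CALG_SHA1, 0, 0, &hHash);
    CryptHashData(hHash, password, password_len, 0);

    // The key must be exportable to read it back out in plaintext.
    CryptDeriveKey(hProv, CALG_3DES, hHash, CRYPT_EXPORTABLE, &hKey);

    BYTE blob[64];
    DWORD blob_len = sizeof(blob);
    CryptExportKey(hKey, 0, PLAINTEXTKEYBLOB, 0, blob, &blob_len);

    // The blob is a BLOBHEADER, then a DWORD key length, then the raw
    // key bytes (24 bytes for 3DES).
    DWORD key_len = *(DWORD*)(blob + sizeof(BLOBHEADER));
    BYTE *key = blob + sizeof(BLOBHEADER) + sizeof(DWORD);
    for (DWORD i = 0; i < key_len; ++i)
        std::printf("%02x", key[i]);
    std::printf("\n");

    CryptDestroyKey(hKey);
    CryptDestroyHash(hHash);
    CryptReleaseContext(hProv, 0);
}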
Boris,
here is a test vector made on windows using a python version of the above
algo:
string input : 124-Kelp
derived key (hex): 5ab48b8def0bdca77f16f8c4f4781823e92ecc1c4d40f762
hth,
tlviewer
Thanks, there must be something wrong then. When I derive the key from your
string input with the .NET class PasswordDeriveBytes I get this key:
5bb58a8cef0bdca77f16f8c4f4791923e92fcd1c4c40f762
The C++ algorithm comes pretty close, but it's not quite the same yet. Here's
the .NET code I used:
PasswordDeriveBytes pdb = new PasswordDeriveBytes("124-Kelp", null);
byte[] pwIV = new byte[] { 0, 0, 0, 0, 0, 0, 0, 0 };
byte[] desKey = pdb.CryptDeriveKey("TripleDES", "SHA1", 192, pwIV);
I'll read the description at
http://msdn.microsoft.com/library/default.asp?url=/library/en-us/seccrypto/security/cryptderivekey.asp
a few more times and hopefully figure out what Microsoft means ...
Boris
I have now tested 124-Kelp with my C++ implementation and get a completely
wrong key. Can you show me your Python version? There must be something
different, as otherwise I would expect us to at least get the same keys (even
though they would still differ from what PasswordDeriveBytes returns).
Boris
Yes, that is my result corrected for parity:
5bb58a8cef0bdca77f16f8c4f4791923e92fcd1c4c40f762
You will find that exported session keys (via SIMPLEBLOB) are not corrected
for parity; that's why I didn't add it. Only when calling CryptEncrypt or
CryptHashSessionKey will you see this correction. It's transparent: you don't
have to think about it when using the CryptoAPI. I don't bother with the .NET
stuff.
The derived key I gave you will match the value exported from the CryptoAPI
as a plaintext key blob (PLAINTEXTKEYBLOB) in WinXP or higher.
hth,
tlviewer
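In case it helps, here is a minimal sketch of what that parity correction
amounts to: DES only uses 7 bits of each key byte, and the low bit of every
byte is adjusted so that the byte has odd parity. This is just an illustration
of the adjustment described above, not code taken from the CryptoAPI:

#include <cstddef>

// Adjust each byte of a DES/3DES key so that it has odd parity.
// Only the least significant bit of every byte changes; the effective
// 56 key bits per DES block stay the same.
void des_fix_parity(unsigned char *key, std::size_t len)
{
    for (std::size_t i = 0; i < len; ++i)
    {
        unsigned char b = key[i];
        // Count the set bits in the upper 7 bits.
        int ones = 0;
        for (int bit = 1; bit < 8; ++bit)
            ones += (b >> bit) & 1;
        // Set the low bit so the total number of set bits is odd.
        key[i] = (unsigned char)((b & 0xFE) | ((ones & 1) ? 0 : 1));
    }
}

Applied to the uncorrected key above (5ab48b8d...), this yields the
parity-corrected value 5bb58a8c..., matching the two vectors in this thread.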
There were two bugs in mscrypt_derive_key_sha1. After correcting them,
everything works as expected. As I couldn't find any source code at all that
shows how CryptDeriveKey works, I'm posting a small C++ program for Windows
which might help others in the future.
Boris
#include <windows.h>
#include <wincrypt.h>
#include <algorithm>
#include <cstddef>
#include <cstdlib>
#include <iostream>
#include <string>

std::basic_string<unsigned char> hash_sha1(std::basic_string<unsigned char> data)
{
    HCRYPTPROV hProv = 0;
    HCRYPTHASH hHash = 0;
    if (!CryptAcquireContext(&hProv, NULL, NULL, PROV_RSA_FULL, 0))
        std::exit(EXIT_FAILURE);
    if (!CryptCreateHash(hProv, CALG_SHA1, 0, 0, &hHash))
        std::exit(EXIT_FAILURE);
    if (!CryptHashData(hHash, data.c_str(), static_cast<DWORD>(data.size()), 0))
        std::exit(EXIT_FAILURE);
    // Query the hash size (20 bytes for SHA1), then fetch the digest itself.
    DWORD dwHashLen;
    DWORD dwCount = sizeof(DWORD);
    if (!CryptGetHashParam(hHash, HP_HASHSIZE, (BYTE*)&dwHashLen, &dwCount, 0))
        std::exit(EXIT_FAILURE);
    unsigned char *pbHash = new unsigned char[dwHashLen];
    if (!CryptGetHashParam(hHash, HP_HASHVAL, pbHash, &dwHashLen, 0))
        std::exit(EXIT_FAILURE);
    std::basic_string<unsigned char> hash(pbHash, dwHashLen);
    delete[] pbHash;
    if (hHash)
        CryptDestroyHash(hHash);
    if (hProv)
        CryptReleaseContext(hProv, 0);
    return hash;
}
std::basic_string<unsigned char>
mscrypt_derive_key_sha1(std::basic_string<unsigned char> password)
{
    // CryptDeriveKey-style expansion: XOR the password hash into two
    // 64-byte pads filled with 0x36 and 0x5C, hash each pad, and
    // concatenate the results until enough key material is available.
    unsigned char buf1[64];
    unsigned char buf2[64];
    std::fill_n(buf1, sizeof(buf1), 0x36);
    std::fill_n(buf2, sizeof(buf2), 0x5C);
    std::basic_string<unsigned char> hash = hash_sha1(password);
    for (std::size_t i = 0; i < hash.size(); ++i)
    {
        buf1[i] ^= hash[i];
        buf2[i] ^= hash[i];
    }
    // 3DES needs 192 bits: all 20 bytes of the first hash plus the
    // first 4 bytes of the second.
    return hash_sha1(std::basic_string<unsigned char>(buf1, sizeof(buf1))) +
           hash_sha1(std::basic_string<unsigned char>(buf2, sizeof(buf2))).substr(0, 4);
}
int main(int argc, char* argv[])
{
    if (argc < 2)
        return EXIT_FAILURE;
    std::cout << "Password: " << argv[1] << std::endl;
    std::basic_string<unsigned char> password =
        reinterpret_cast<const unsigned char*>(argv[1]);
    std::basic_string<unsigned char> hash = mscrypt_derive_key_sha1(password);
    // Print the derived key as hex.
    std::string hex;
    for (std::size_t i = 0; i < hash.size(); ++i)
    {
        hex += "0123456789ABCDEF"[hash[i] / 16];
        hex += "0123456789ABCDEF"[hash[i] % 16];
    }
    std::cout << "Key: " << hex << std::endl;
}
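With the bugs fixed, the program should reproduce the uncorrected value from
earlier in this thread, e.g.:

Password: 124-Kelp
Key: 5AB48B8DEF0BDCA77F16F8C4F4781823E92ECC1C4D40F762

i.e. tlviewer's raw key; the parity-corrected form (5BB5...) is what
PasswordDeriveBytes reports.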
The exported session key (for 3DES) in a SIMPLEBLOB is the identical key that
generates the same cipher text in all of .NET, Java and CAPI, as discussed
here:
http://www.jensign.com/JavaScience/dotnet/NetDESEncrypt/
So I gather that if you supply the 24-byte 3DES binary key for interop, you
supply the non-corrected (for parity) key value.
It seems strange that PasswordDeriveBytes would output the parity-corrected
value, as this will create issues in .NET when using that password derivation
function.
- Mitch Gallant
MVP Security
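For what it's worth, a minimal sketch of handing such a raw, non-corrected
24-byte 3DES key to the CryptoAPI as a PLAINTEXTKEYBLOB (assuming WinXP or
later; error handling omitted):

#include <windows.h>
#include <wincrypt.h>
#include <cstring>

// Import a raw 24-byte 3DES key (e.g. one produced in .NET or Java) into
// the CryptoAPI by wrapping it in a PLAINTEXTKEYBLOB.
HCRYPTKEY import_3des_key(HCRYPTPROV hProv, const unsigned char key[24])
{
    struct
    {
        BLOBHEADER hdr;
        DWORD      keySize;
        BYTE       keyData[24];
    } blob;

    blob.hdr.bType    = PLAINTEXTKEYBLOB;
    blob.hdr.bVersion = CUR_BLOB_VERSION;
    blob.hdr.reserved = 0;
    blob.hdr.aiKeyAlg = CALG_3DES;
    blob.keySize      = 24;
    std::memcpy(blob.keyData, key, 24);

    HCRYPTKEY hKey = 0;
    CryptImportKey(hProv, (BYTE*)&blob, sizeof(blob), 0, 0, &hKey);
    return hKey;
}

The CryptoAPI takes care of the parity bits from there, as described above.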
I have had no interop problems so far: encrypting with .NET and decrypting
with the CryptoAPI (only 3DES, though). From what I understand, the parity
bits are not used for 3DES anyway, so they shouldn't matter?
Boris
Talking about .NET, the recommended password-based key derivation function is
now (in .NET 2.0) PBKDF2, i.e. Rfc2898DeriveBytes(..)
- Mitch
"Boris" <bo...@gtemail.net> wrote in message
news:%23K7gKPm...@TK2MSFTNGP05.phx.gbl...