Game Development Community

CryptoPP Auth. System Development Help.

by Robert Fritzen · in Technical Issues · 01/24/2011 (3:06 pm) · 21 replies

Hello. I think I've mentioned in a few places that I have created my own authentication system in source. However, I am now faced with an interesting C++ challenge, and I'm wondering if someone here knows how to patch it up.

Put simply, I've created something that runs along the lines of an X.509 certificate scheme with no expiration time on the certificates, which eliminates the need for a 24/7 account/authentication server. I am more than willing to share my system with the community; heck, I'll even release it as a resource if you can help me solve this issue of mine. I'm using the OpenSSL crypto library.

The key step in my system comes into play when a player tries to join a server: their account is pushed through a special function known as RSA_Verify, which verifies the client's account signature to ensure it came from a valid source. However, my issue is the one in the thread title. OpenSSL requires that the signature be sent in "ASCII" format (each 0xHH pair converted to its character). But there seems to be one noticeable problem: when the signature has a hex pair of 0x20 or 0x00, my code produces it in "whitespace" form, which is incorrect, especially for 0x00, which absolutely must remain 0x00. The true ASCII representation is more of a vertical-rectangle-looking character. Below is my code to convert hex into ASCII. I've already asked the people at cplusplus.com without any luck, so I was hoping someone here could enlighten me on this issue.

// Convert one hex pair, e.g. ('4','1'), to the byte it encodes.
char PGDCrypto::hexToAscii(char first, char second) {
   char hex[5], *stop;
   hex[0] = '0';
   hex[1] = 'x';
   hex[2] = first;
   hex[3] = second;
   hex[4] = '\0';
   return (char)strtol(hex, &stop, 16); // needs <cstdlib>
}

std::string PGDCrypto::hexToStr(const std::string &input) {
   std::ostringstream output;
   int len = (int)input.length();
   for (int i = 0; i + 1 < len; i += 2) {
      output << hexToAscii(input[i], input[i + 1]);
   }
   return output.str();
}
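For ordinary input the pair behaves as I'd expect, by the way. Here's a trimmed, self-contained copy of the two functions (standard library only) that you can compile and run yourself:

```cpp
#include <cassert>
#include <cstdlib>
#include <sstream>
#include <string>

// Standalone copy of the conversion: one hex pair -> one byte.
char hexToAscii(char first, char second) {
    char hex[5] = { '0', 'x', first, second, '\0' };
    char* stop;
    return (char)std::strtol(hex, &stop, 16);
}

// Decode a whole string of hex pairs into characters.
std::string hexToStr(const std::string& input) {
    std::ostringstream output;
    for (std::size_t i = 0; i + 1 < input.length(); i += 2)
        output << hexToAscii(input[i], input[i + 1]);
    return output.str();
}
```

hexToStr("48656c6c6f") gives "Hello", so the basic conversion works; it's only the 0x20/0x00 pairs that look wrong to me.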

So, my question is pretty simple: how can I get the output for 0x20 to be the correct symbol instead of whitespace? For reference, I believe 0x00 also comes out as whitespace, which is likewise incorrect.

Any help would be greatly appreciated, thanks!
#1
01/25/2011 (1:29 am)
If you want to convert a number into a hex string (i.e., convert 'A' into "0x41"), you could do it like this:
std::string convert(char c)
{
    char hex[5];
    snprintf(hex, 5, "0x%02X", (unsigned char)c);
    return std::string(hex);
}
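Here's the same thing as a self-contained sketch, with the headers it needs and a couple of sample results:

```cpp
#include <cassert>
#include <cstdio>
#include <string>

// Format one byte as a "0xHH" hex string.
std::string convert(char c) {
    char hex[5];
    std::snprintf(hex, sizeof(hex), "0x%02X", (unsigned char)c);
    return std::string(hex);
}
```

convert('A') yields "0x41", and convert('\0') yields "0x00".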
#2
01/25/2011 (7:21 am)
Uh, I think you might be a little mistaken about what I'm trying to do. I already have the "hex string". For example, take this account signature hex:

96d013fa9cfe343952a952b671ada6867116948912726c890a732864198a1c3a861e994854796dce89da9a6d6e7f4b6220a8491d37ff113616c9a7812b0deb2d

So, I have that. The issue arises when it hits this block towards the end: 62 and 20.

The 62 doesn't cause the issue, but the conversion function above is not outputting the correct value for the 20. I'm not trying to go from one single character to ASCII; I need to convert a given "hex pair" to ASCII.
#3
01/25/2011 (7:31 am)
Okay, so what should the correct value be? 0x20 in ASCII is a space, so I have no clue what you are expecting the proper output to be. If you can give an example, it would help. Are you maybe thinking about base64?
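To make that concrete, here's a minimal hex-pair decoder (standard library only; the function name is my own). 0x20 decodes to a space byte and 0x00 to a NUL byte, and both are perfectly valid bytes, as long as you track the length yourself instead of relying on C-string functions like strlen(), which stop at the first 0x00:

```cpp
#include <cassert>
#include <cstdlib>
#include <string>

// Decode a string of hex pairs (e.g. "412000") into raw bytes.
// std::string carries its own length, so embedded 0x00 bytes survive.
std::string hexToBytes(const std::string& hex) {
    std::string out;
    for (std::size_t i = 0; i + 1 < hex.size(); i += 2) {
        const char pair[3] = { hex[i], hex[i + 1], '\0' };
        out.push_back(static_cast<char>(std::strtol(pair, NULL, 16)));
    }
    return out;
}
```

hexToBytes("412000") is three bytes, 'A', a space, and a NUL; .size() reports 3, while strlen() on the same buffer would report only 2.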
#4
01/25/2011 (7:38 am)
Interesting, I think I may have been reading the wrong character the whole time, sorry. :(

The error hits here: 948912. I know 94 is the lowercase "d", but 89 or 12 is eluding me and my output. I think it's the 89, because a character appears in the output where the 12 is.

http://www.phantomdev.net/auth/ISRPG/checkSignature.php?name=Dayuppy
http://www.phantomdev.net/rsadebug.txt

The binary (ASCII) output that should be produced (as given by PHP) is shown on that first page. The second page is the output generated by the client. The non-matching value comes at the 89.

*EDIT*
I should probably also mention that only certain accounts are failing; some accounts that are created work completely fine at the verify step.
#5
01/26/2011 (10:04 am)
The BIN sections of both the PHP output and rsadebug.txt are identical; I checked with a hex editor. The BIN section was derived from the S section, while 948912 comes from the HASH section. How is the HASH section being used? Everything seems identical.
#6
01/26/2011 (1:20 pm)
Really? Then this issue sounds even more confusing. This is the code I'm using for the RSA_Verify call that is being rejected. I'm sure you'll understand that, for security reasons, I have omitted the CA key from RSAVerify. Other than that, here is the code used:

ConsoleFunction(GatherAccountDetails, void, 5, 5, "(string) returns a long appended string of account details (E, N, SIG)") {
   argc; // touch to silence the unused-parameter warning
   //RSA_Verify Time :D
   //what is needed:
      //1: Full Details: whirlpool($guid@$name)
      //2: The Signature: base64
   //We are given the guid, email, and name by the client, attach E and N, then calc a whirlpool hash.
   std::string toWhrl = argv[1];
   toWhrl.append(argv[2]);
   toWhrl.append(argv[3]);
   std::string toWhrl_final = toWhrl;
   toWhrl_final = pgd.doWhirlpool(toWhrl.c_str());
   //
   std::string hexSig;
   hexSig.assign(pgd.hexToStr(std::string(argv[4])));
   //
   int rsaverifyresult = pgd.RSAVerify((char *)toWhrl_final.c_str(), (unsigned char *)hexSig.c_str());
   
   fstream debugger("rsadebug.txt", ios::out);
   debugger << "Testing\n";
   debugger << "GUID/Name: " << argv[1] << "\n";
   debugger << "E: " << argv[2] << "\n";
   debugger << "N: " << argv[3] << "\n";
   debugger << "S: " << argv[4] << "\n";
   debugger << "Data Using: " << toWhrl << "\n";
   debugger << "HASH: " << toWhrl_final << "\n";
   debugger << "BIN: " << hexSig << "\n";
   debugger << "Verify: " << rsaverifyresult << "\n";
   debugger << string("Done");
   debugger.close();
   
   if(rsaverifyresult != 1) {
	   //Naughty Naughty... someone is using a bogus certificate
	   //Kill Them!!!
	   Con::printf("RSA_Verify has denied this account certificate.");
	   Con::executef("MessageBoxOk", "Invalid Account", "Invalid Account Certificate");
	   Con::executef("disconnect");
	   return;
   }
   // give our client E/N in an appending sig to work with
   Con::setVariable("AuthenticationDetails", "");
   std::string worker;
   worker.assign(argv[2]);
   worker.append(argv[3]);
   Con::setVariable("AuthenticationDetails", worker.c_str());
   //give the client his signature too!
   Con::setVariable("AuthenticationSignature", "");
   std::string clientSig;
   clientSig.assign(argv[4]);
   Con::setVariable("AuthenticationSignature", clientSig.c_str());
   //
   Con::setVariable("AccountDetails", "");
   std::string hold;
   hold.assign(argv[2]);
   hold.append(":");
   hold.append(argv[3]);
   hold.append(":");
   hold.append(argv[4]);
   //Store it
   Con::setVariable("AccountDetails", hold.c_str());
   Con::printf("AccountDetails is stored.");
}

int PGDCrypto::RSAVerify(char * accountData, unsigned char * signature) {
   unsigned char hash[SHA_DIGEST_LENGTH];

   char pub_key[] = {"OMITTED KEY IN PEM FORMAT"};

   int length = strlen((const char *)signature);
   // WARNING: no error checking for brevity
   BIO* bio = BIO_new_mem_buf(pub_key, sizeof(pub_key) - 1); // exclude the trailing '\0'

   RSA* rsa_key = 0;
   PEM_read_bio_RSA_PUBKEY(bio, &rsa_key, 0,0); 

   hash[0] = 0;
   SHA1((unsigned char*)accountData, strlen(accountData), hash); //strlen(accountData)

   int ret = RSA_verify(NID_sha1, hash, SHA_DIGEST_LENGTH, signature, length, rsa_key);

   return ret;
}

Hopefully this is enough to help solve the issue.
#7
01/27/2011 (3:40 pm)
I just saw in my email that the post you have there was edited, and that you'd mentioned something about my PHP possibly being incorrect.

Here is the PHP code that generates the account signature using the given data:

$FullAccountData = hash('whirlpool', ($guid.$name.$exp.$public)); //what we need
openssl_sign($FullAccountData, $signature, CA_private());
$ok = openssl_verify($FullAccountData, $signature, CA_public());
if ($ok == 1) {
   $sigFinal = bin2hex($signature);
   //successful account creation
   //update the database adding the new user's data
   //openssl_free_key($pkey);
   //openssl_free_key($pubkey);

   $sql = "INSERT INTO Accounts (guid, E, N, D, D_DEC_HASH, signature, username, email, hashpass)
   VALUES ('$guid', '$exp', '$public', '$private', '$priv_dec_hash', '$sigFinal', '$name', '$email', '$accountData')";

   mysql_query($sql, $con) or die("$".'INTERNAL_ERROR'."\n");
   echo "$"."PGD"."$"."CERT ". $name ."\t". $guid ."\t". $email ."\n";
   echo "$"."PGD"."$"."CERT2 ". $exp ."\t". $public ."\t". $sigFinal ."\n"; // complete their public certificate
   echo "$"."PGD"."$"."CERT3 "; //order the client to construct their certificate.
} else {
   die("$"."PGD"."$"."SIGN_ERROR ".$ok."");
}

If nothing there comes up as a red flag, I'll try my next option, which is to just send the client PHP's binary form of the signature to store in a separate file during account creation; then I'll call up that form when needed.
#8
01/27/2011 (8:01 pm)
At this point, I don't know what to say; everything seems correct. Personally, I would check whether $FullAccountData, $signature, and CA_public() in your PHP code match, byte for byte, the hash, signature, and rsa_key you use in your C++ code. I double-checked every single value from your checkSignature.php output and your rsadebug.txt output, and they are completely identical. It could be that a \r or \n is finding its way into your data; double-check that all input values are completely identical.
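If you do find one, trimming is trivial; a sketch (standard library only, the helper name is mine):

```cpp
#include <cassert>
#include <string>

// Strip trailing whitespace (including \r and \n) that can sneak in
// when strings cross HTTP responses or text files.
std::string trimRight(std::string s) {
    const std::size_t end = s.find_last_not_of(" \t\r\n");
    if (end == std::string::npos)
        return std::string(); // the string was all whitespace
    s.erase(end + 1);
    return s;
}
```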
#9
01/28/2011 (7:58 am)
I'll go ahead and check for those; if I find any, I'll just trim them from the string.

If that doesn't work, I'll go with the option I mentioned above: storing PHP's binary form of the signature in its own file and just using that instead of attempting the conversion. I'll report back with my findings and hopefully get this solved.

Thanks for all the help you've provided so far.
#10
01/28/2011 (2:22 pm)
Well, unfortunately I can't seem to figure this out. I went through and tested what you said, and it still gives me a match everywhere.

I think this is a problem with OpenSSL itself, or some hidden error; I guess I won't find out.

I've decided to move away from OpenSSL and re-code my entire RSA system using the Crypto++ library. I figure this entire disadvantage of OpenSSL comes from the fact that it just doesn't check against the hashed string version of the signature. I'm not sure if Crypto++ does, but from the few examples I have looked at, it does, with a single conversion of data type.

So, this may take a few weeks or so for me to get set up with Crypto++, but I will indeed report back with my findings on whether it works. I'll still keep my promise of releasing my auth system as a resource if I can get it working under Crypto++.
#11
01/28/2011 (2:31 pm)
I feel ya. I once had to debug an error when sending zlib compressed data across a network connection. It turned out that, when given a specific type of input string, zlib would return corrupted compressed data; it claimed it returned n bytes of compressed data when in fact it returned far less. This caused the whole packet stream to become corrupt as it tried to read n bytes of compressed data on the other end!

Anyway, I hope everything works out for you.
#12
01/29/2011 (12:12 pm)
Ouch. Well, at least I do have some good news: I got Crypto++ compiled as a static library and re-coded my RSA key generation function for it, so that leaves the encryption and hashing functions.

Hopefully not too much work needed, I have a pretty good idea of what I need to do already. :)
#13
01/30/2011 (3:16 pm)
Actually, maybe you can help me get this done quicker. It would be nice to have this system working.

How much do you know about Crypto++? I've just hit what appears to be a lovely wall (crash) scenario. I'm using Crypto++ 5.6.1 under MSVS 9.0 (2008). I've compiled Crypto++ into a static library following the tutorial at the site, and the hashing functions work again, but this is where it's breaking:

int xxz568::rsaSign(InvertibleRSAFunction rsa, string message, string &output) { 
   AutoSeededRandomPool rng;
   // Signer object
   RSASS< PKCS1v15, SHA1 >::Signer signer(rsa);
   // Create signature space
   byte* signature = new byte[ signer.MaxSignatureLength() ];
   if( NULL == signature ) { 
      return -1; 
   }
   // Sign message
   size_t length = signer.SignMessage(rng, (const byte*) message.c_str(), message.length(), signature);
   return 1;
}

I've asked the people at cplusplus.com, but they don't seem to get that I'm not the "expert" C++ coder they assume everyone to be.

I run that code, this exception spits out in my compiler's output, and my test program dies:
First-chance exception at 0x7625f328 in cppauth.exe: Microsoft C++ exception: CryptoPP::Exception at memory location 0x003df7b0..
Unhandled exception at 0x7625f328 in cppauth.exe: Microsoft C++ exception: CryptoPP::Exception at memory location 0x003df7b0..

The RSA key generation works fine, and it's even getting into that function (I put in a few print statements to see if the RSA key info was there, and it is). I tried a few other examples with the same result, so I'm wondering exactly what's going on and how to fix it.
#14
01/30/2011 (4:31 pm)
Let's catch the exception to find out what it is:
int xxz568::rsaSign(InvertibleRSAFunction rsa, string message, string &output) {
   try {
      AutoSeededRandomPool rng;
      // Signer object
      RSASS< PKCS1v15, SHA1 >::Signer signer(rsa);
      // Create signature space
      byte* signature = new byte[ signer.MaxSignatureLength() ];
      if( NULL == signature ) { 
         return -1; 
      }
      // Sign message
      size_t length = signer.SignMessage(rng, (const byte*) message.c_str(), message.length(), signature);
      return 1;
   }
   catch (const CryptoPP::Exception& e) {
      cout << "GetWhat: " << e.GetWhat() << endl;
      cout << "ErrorType: " << (int)e.GetErrorType() << endl;
      return -1;
   }
}
This will give some insight as to what is going wrong.
#15
01/30/2011 (5:28 pm)
GetWhat: InvertibleRSAFunction: computational error during private key operation.

ErrorType: 6

The key generation process was making bad keys. I added a loop in the generator using the Validate function, and it works now. Thanks! I'll report back if I run into any more problems.

*EDIT*

Time to laugh at my fail again :D.
int xxz568::rsaSign(InvertibleRSAFunction rsa, string message, string &output) {
   AutoSeededRandomPool rng;
   RSA::PrivateKey privateKey(rsa);

   // Signer object
   RSASSA_PKCS1v15_SHA_Signer signer(privateKey);

   // Create signature space
   size_t length = signer.MaxSignatureLength();
   SecByteBlock signature(length);

   // Sign message
   signer.SignMessage(rng, (const byte*) message.c_str(), message.length(), signature);
   std::string holder;
   holder.assign((const char *)signature.BytePtr());

   HexEncode(holder, output);

   return 1;
}

bool xxz568::rsaVerify(InvertibleRSAFunction rsa, std::string &message, std::string &textSig) {
   
   RSA::PublicKey publicKey(rsa);
   RSASSA_PKCS1v15_SHA_Verifier verifier(publicKey);

   // Create signature space
   size_t length = verifier.MaxSignatureLength();
   SecByteBlock signature(length);
   std::string holder;
   HexDecode(textSig, holder);

   memcpy(signature, holder.c_str(), length);

   // Verify
   bool result = verifier.VerifyMessage((const byte*)message.c_str(), message.length(), signature, signature.size());

   return result;
}

int xxz568::HexEncode(std::string input, std::string &output) {
   CryptoPP::StringSource foo(input, true,
      new CryptoPP::HexEncoder(
         new CryptoPP::StringSink(output), false));
   return 1;
}

int xxz568::HexDecode(std::string input, std::string &output) {
   CryptoPP::StringSource foo(input, true,
      new CryptoPP::HexDecoder(
         new CryptoPP::StringSink(output)));
   return 1;
}

This code at least doesn't crash anymore, but the verify function is saying that some of the messages are not correct. I'm wondering if I'm corrupting the data somewhere in here, or passing one of the size arguments incorrectly.
*END EDIT*
#16
02/01/2011 (8:00 am)
Have another update on this: I appear to have gotten verify working using the PKCS #1 v1.5 filters.

int xxz568::rsaSign(InvertibleRSAFunction rsa, string message, string &output) {
   AutoSeededRandomPool rng;
   RSA::PrivateKey privateKey(rsa);
   std::string holder;

   RSASSA_PKCS1v15_SHA_Signer signer(privateKey);

   StringSource(message, true, 
       new SignerFilter(rng, signer,
           new StringSink(holder)
      ) // SignerFilter
   ); // StringSource

   HexEncode(holder, output);

   return 1;
}

bool xxz568::rsaVerify(InvertibleRSAFunction rsa, std::string message, std::string textSig) {
   
   RSA::PublicKey publicKey(rsa);
   std::string holder;

   try {
      RSASSA_PKCS1v15_SHA_Verifier verifier(publicKey);

      HexDecode(textSig, holder);

      StringSource(message+holder, true,
          new SignatureVerificationFilter(
              verifier, NULL,
              SignatureVerificationFilter::THROW_EXCEPTION
         ) // SignatureVerificationFilter
      ); // StringSource
   }
   catch (const std::exception& e) {
      return false;
   }

   return true;
}

Now I need to implement the signing algorithm on the PHP side; I'm going to try phpseclib for that. About the only thing remaining on the to-do list after this is successfully implemented is the AES-256-CBC algorithm for account storage.

I'm getting there, it should hopefully be done here soon.
#17
02/03/2011 (9:42 am)
Sorry about the triple post.

This little issue of mine is standing in the way of completing the AES encryption needed for account creation.

The init vector is hex-encoded and prepended to the front of the output. The decryption function then takes the first x characters of the string as the IV and proceeds. This is where my encrypt code stands now:

int xxz568::AESEncrypt(std::string &key, std::string input, std::string &output) {
   byte keyB[32], iv[AES::BLOCKSIZE];
   std::string hashKey = sha1(key).substr(0, 32);
   memcpy(keyB, hashKey.c_str(), 32); 
   //
   memcpy(iv, generateRandLen(), 16);
   //
   std::string hold, final, hexVec;
   HexEncode((const char *)iv, hexVec);
   output = hexVec;
   cout << "IV: " << iv << endl << hexVec << endl;
   StringSink* sink = new StringSink(hold);
   CBC_Mode<AES>::Encryption aes(keyB, sizeof(keyB), iv);
   StreamTransformationFilter* aes_enc = new StreamTransformationFilter(aes, sink);
   StringSource cipher(input, true, aes_enc);
   //convert to a hash
   HexEncode(hold, final);
   output += final;
   //
   return 1;
}

int xxz568::HexEncode(std::string input, std::string &output) {
   CryptoPP::StringSource foo(input, true,
      new CryptoPP::HexEncoder(
         new CryptoPP::StringSink(output), false));
   return 1;
}

int xxz568::HexDecode(std::string input, std::string &output) {
   CryptoPP::StringSource foo(input, true,
      new CryptoPP::HexDecoder(
         new CryptoPP::StringSink(output)));
   return 1;
}

byte * xxz568::generateRandLen() {
   AutoSeededRandomPool rng;
   byte randLen[16];
   rng.GenerateBlock(randLen, 16);
   //
   return randLen;
}

Basically, where this stands is: the generateRandLen function is called to generate the initialization vector needed for the AES-CBC cipher. However, when I check the length of the vector on the encryption side, it is 3 bytes longer than it's supposed to be. No idea why, or even how that can happen if the array bounds are 16.

Oh, I've renamed this thread accordingly.
#18
02/03/2011 (10:50 am)
Just FYI, array bounds in C++ are not enforced at run time. The only time C++ actually knows the size of an array is at the point you create it. That's why you have to be careful that you're not accessing past the array bounds, or you may read or clobber other data on the stack. Attackers can use this to overwrite a function's return address on the stack and execute whatever code they want.
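A classic instance of this is returning a pointer to a local array; the storage lives on the stack and dies as soon as the function returns, so the caller reads garbage. A sketch of the safer pattern, returning an object that owns its bytes (the fill here is a placeholder, not a real CSPRNG):

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// BAD (dangling): byte* f() { byte buf[16]; /* fill */ return buf; }
// SAFER: return a container that owns its storage.
std::vector<unsigned char> makeIv(std::size_t n = 16) {
    std::vector<unsigned char> buf(n);
    for (std::size_t i = 0; i < n; ++i)
        buf[i] = static_cast<unsigned char>(i * 37 + 11); // placeholder bytes, NOT random
    return buf;
}
```

The caller receives a copy (or move) of the vector, so all 16 bytes stay valid; a real implementation would fill buf from the RNG instead of the placeholder loop.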
#19
02/03/2011 (11:37 am)
What's weird is that if I change the integer value in the memcpy call, making it lower than 16 (e.g. 15) leaves the decrypted output (IV) 3 longer in length, whereas making the memcpy use 16 leaves it 3 shorter.

Then the hex-encoded value of the encrypted pair contains 14 additional characters (7 hex pairs), which completely makes me draw a blank. How can 3 additional ASCII chars make 7 hex pairs?

*EDIT*
http://www.phantomdev.net/StrangeAESError.jpg

There's a picture of my output.
#20
02/03/2011 (3:27 pm)
http://www.phantomdev.net/greatNews.jpg

No more help needed. :)

Just figured out the AES problem; it turns out the encryption worked fine after I made a few simple modifications to only use 16 bytes.

Now for the fun (and probably more challenging) part: implementing it into TGEA and, from there, testing it.