
While looking at this question, I was trying to figure out this function:

_shl: function (a, b){
  for (++b; --b; a = ((a %= 0x7fffffff + 1) & 0x40000000) == 0x40000000 ? a * 2 : (a - 0x40000000) * 2 + 0x7fffffff + 1);
  return a;
}

I figured out the JavaScript syntax and also found an uncompressed version of the function used in an MD5 JavaScript implementation:

function shl1(a) {
  a = a % (0x7fffffff + 1);
  if ((a & 0x40000000) == 0x40000000) { // parens needed: == binds tighter than &
    a -= 0x40000000;
    a *= 2;
    a += (0x7fffffff + 1);
  } else {
    a *= 2;
  }
  return a;
}

function shl(a, b) {
  a = integer(a); // integer() is a helper defined elsewhere in the same MD5 library
  b = integer(b);
  for (var i = 0; i < b; i++) a = shl1(a);
  return a;
}

My question is: what is significant about 0x40000000 and 0x7fffffff? I somewhat understand the idea of a bitwise shift, but I am lost as to the importance of these two numbers.

3 Answers


The 0x7fffffff is, in binary, a 0 followed by 31 1s. Adding one to it gives 0x80000000, a 1 followed by 31 0s. I don't know why the direct constant isn't there. %ing by 0x80000000 will cut off the 32nd and all higher bits.
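A quick sketch (runnable in Node or a browser console) of what the modulo does, using an example value wider than 31 bits:

```javascript
// a % 0x80000000 keeps only the low 31 bits of a non-negative number,
// discarding the 32nd bit and everything above it.
var MOD = 0x7fffffff + 1; // 0x80000000 = 2^31

var a = 0x123456789;      // wider than 31 bits
console.log((a % MOD).toString(16)); // "23456789" - the bits above bit 30 are gone
```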

The 0x40000000 is, in binary, a 0, a 1, and then 30 0s. &ing with 0x40000000 and checking for equality to 0x40000000 checks whether that 31st bit (counting from the right this time) is set.
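The bit test itself can be sketched directly:

```javascript
// (a & 0x40000000) is non-zero exactly when bit 30 (the 31st bit, counting
// from 1 on the right) is set, so comparing the result to 0x40000000
// checks that single bit.
console.log((0x40000000 & 0x40000000) == 0x40000000); // true: the bit is set
console.log((0x3fffffff & 0x40000000) == 0x40000000); // false: the bit is clear
```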

As far as I can tell, the section (a - 0x40000000) * 2 + 0x7fffffff + 1 should be the same as a * 2, i.e. the other branch. Not sure why the extended code is required.
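Since JavaScript numbers are doubles and plain arithmetic does not overflow at 2^31, that equivalence is easy to check; a minimal sketch:

```javascript
// Algebraically, (a - 0x40000000) * 2 + 0x7fffffff + 1 = 2a - 2^31 + 2^31 = 2a,
// and double-precision arithmetic does not overflow here, so the long branch
// computes the same value as plain a * 2.
var a = 0x5fffffff; // an example value with bit 30 set
var longForm = (a - 0x40000000) * 2 + 0x7fffffff + 1;
console.log(longForm === a * 2); // true
```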




0x7fffffff is the first 31 bits all set to 1 (reading from the RHS). So 0x7fffffff + 1 is 0x80000000: a single 1 in the 32nd bit followed by 31 zeros.

0x40000000 represents 1000000000000000000000000000000 where the 1 is the 31st bit (reading RTL).

This has to do with manipulating 32-bit data blocks, I assume.
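Printing the constants in binary makes the bit patterns explicit; a small sketch:

```javascript
// Number.prototype.toString(2) renders a number in binary.
console.log((0x7fffffff).toString(2));     // 31 ones
console.log((0x7fffffff + 1).toString(2)); // a 1 followed by 31 zeros
console.log((0x40000000).toString(2));     // a 1 followed by 30 zeros
```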



The functions that you listed are part of a hash generator. They help generate a unique number based on the value received.
In JavaScript, '0x' is the prefix for a hexadecimal number.

0x40000000 is 1 GB.
0x7fffffff = 2^31 - 1 = 2,147,483,647
The number 2,147,483,647 is also the maximum value for a 32-bit signed integer in computing.

http://en.wikipedia.org/wiki/2147483647
http://en.wikipedia.org/wiki/Gigabyte
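A small sketch of why 2,147,483,647 matters in JavaScript specifically; my assumption is that this wrap-around is why the code simulates the shift with plain arithmetic instead of the << operator:

```javascript
// JavaScript's bitwise operators coerce their operands to 32-bit signed
// integers, so values past 0x7fffffff wrap around to negative numbers.
console.log(0x7fffffff);           // 2147483647, the 32-bit signed maximum
console.log((0x7fffffff + 1) | 0); // -2147483648: wraps to the minimum
console.log(0x40000000 << 1);      // -2147483648 as well, via the << operator
```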

