This is literally the weirdest thing I have ever encountered. So, I have a float called ratingDecimal, and I use some conditions to compare it. Here is what the conditions look like:
if (ratingDecimal >= 0.0) {
    if (ratingDecimal < 0.3 && ratingDecimal != 0.3) {
        ratingDecimal = 0.0;
        NSLog(@"bigger or equal to 0 and smaller than 0.3");
    }
}
if (ratingDecimal >= 0.3) {
    if (ratingDecimal < 0.8 && ratingDecimal != 0.8) {
        ratingDecimal = 0.5;
        NSLog(@"middle");
    }
}
if (ratingDecimal >= 0.8) {
    NSLog(@"bigger or equal to 0.8");
    ratingDecimal = 1.0;
}
Here is the weird part: I have set ratingDecimal to 0.8, and the console logs ratingDecimal as:
0.8
but the message that actually gets logged is:
middle
which should only happen if ratingDecimal is greater than or equal to 0.3, smaller than 0.8, and DOES NOT equal 0.8.
The statement that should be getting called is:
if (ratingDecimal >= 0.8) {
    NSLog(@"bigger or equal to 0.8");
    ratingDecimal = 1.0;
}
But I do not see
bigger or equal to 0.8
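In case it helps, here is a stripped-down sketch of how the value could be dumped with more precision right before the comparisons. This assumes ratingDecimal really is a plain float, and the literal assignment is only a stand-in for however it actually gets set in my real code; %.9f just prints more digits than the default %f:

#import <Foundation/Foundation.h>

int main(void) {
    @autoreleasepool {
        // Stand-in assignment: in the real code ratingDecimal is set elsewhere.
        float ratingDecimal = 0.8f;

        // The default %f only shows six decimal places, so this prints 0.800000.
        NSLog(@"default   : %f", ratingDecimal);
        // Extra digits show the exact value the float actually stores.
        NSLog(@"precise   : %.9f", ratingDecimal);
        // And the value it becomes when promoted to double for the comparison against 0.8.
        NSLog(@"as double : %.17g", ratingDecimal);
    }
    return 0;
}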
You are probably wondering why my conditions are so tedious and complex; they could be as simple as:
if (ratingDecimal >= 0.3 && ratingDecimal < 0.8)
They used to be written that way, but since that never worked I kept breaking the statement down, and it is STILL acting weird.
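For reference, this is roughly the shape the original, simpler version had (the same three bands as above, just written as one if/else chain):

// Roughly the original version, before I started splitting the conditions apart.
// Same bands as above: [0, 0.3) -> 0.0, [0.3, 0.8) -> 0.5, 0.8 and up -> 1.0.
if (ratingDecimal >= 0.0 && ratingDecimal < 0.3) {
    ratingDecimal = 0.0;
} else if (ratingDecimal >= 0.3 && ratingDecimal < 0.8) {
    ratingDecimal = 0.5;
} else if (ratingDecimal >= 0.8) {
    ratingDecimal = 1.0;
}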
Why is this happening!?