
This is literally the weirdest thing I have ever encountered. So, I have a float called ratingDecimal, and I use some conditions to compare it. Here is what the conditions look like:

    if (ratingDecimal >= 0.0) {

        if (ratingDecimal < 0.3 && ratingDecimal != 0.3) {

            ratingDecimal = 0.0;

            NSLog(@"bigger or equal to 0 and smaller than 0.3");
        }
    }

    if (ratingDecimal >= 0.3) {

        if (ratingDecimal < 0.8 && ratingDecimal != 0.8) {

            ratingDecimal = 0.5;

            NSLog(@"middle");
        }
    }

    if (ratingDecimal >= 0.8) {

        NSLog(@"bigger or equal to 0.8");

        ratingDecimal = 1.0;
    }

Here is a weird problem. I have set the ratingDecimal to 0.8, the console logs ratingDecimal as:

0.8

but calls:

middle

Which should only be called if ratingDecimal is bigger than or equal to 0.3, smaller than 0.8, and DOES NOT equal 0.8.

The statement that should be getting called is:

    if (ratingDecimal >= 0.8) {

        NSLog(@"bigger or equal to 0.8");

        ratingDecimal = 1.0;
    }

But I do not see

bigger or equal to 0.8

You are probably wondering why my conditions are so tedious and complex; they could be as simple as:

    if (ratingDecimal >= 0.3 && ratingDecimal < 0.8)

They used to be like that, but since that never worked I kept breaking the statement down, and it is STILL acting weird.

Why is this happening!?

Here is an image example:

(image)

  • If ratingDecimal < 0.3 then there is no point in checking whether ratingDecimal != 0.3. It must be different from 0.3. Commented Apr 15, 2014 at 21:14

3 Answers


Your ratingDecimal value is probably something like 0.7999999999999. When printed to the console, something is probably rounding it to 0.8, but the comparison still sees it as less than 0.8.

This is due to the binary nature of floating point numbers. See What Every Computer Scientist Should Know About Floating-Point Arithmetic for extensive information on this.


1 Comment

It's the same reason it's very hard to get 3 * (1/3) to equal 1 in decimal. There is no fixed-length decimal value that gives 1 when multiplied by 3. In most fixed-length decimal systems, 3 * (1/3) will be printed as 1 but is internally 0.999999…, and so will also test as less than 1.

@Greg gave a good answer. I'd like to point out that you can write that code in a much clearer and simpler fashion:

if (ratingDecimal >= 0.0) {
    if (ratingDecimal < 0.3) {
        ratingDecimal = 0.0;

        NSLog(@"bigger or equal to 0 and smaller than 0.3");
    } else if (ratingDecimal < 0.8) {
        ratingDecimal = 0.5;

        NSLog(@"middle");
    } else {
        ratingDecimal = 1.0;

        NSLog(@"bigger or equal to 0.8");
    }
} else {
    // negative
}

As an FYI - a line such as:

if (ratingDecimal < 0.3 && ratingDecimal != 0.3) {

doesn't make much sense. If a number is less than 0.3, then it must be unequal to 0.3, so the second check doesn't do anything useful.

1 Comment

I know that; in my question I said that the only reason I put that in was to break the if statement down and make 100 percent sure I was doing it right.

You can use functions like roundf() to round the value to the precision you need before comparing.

