
How do I get a float result from the division?

Even though I defined the average array as float, the printed results have no fractional part.

#include <iostream>
using namespace std;

int main()
{
    const int Number = 20;
    int Fibonacci[Number];
    float average[Number];

    for (int i = 0; i <= Number; i++)
    {
        if (i == 0)
            { Fibonacci[i] = 0; }
        else if (i == 1)
            { Fibonacci[i] = 1; }
        else
        {
            Fibonacci[i] = Fibonacci[i - 1] + Fibonacci[i - 2];
            //average[i] = (Fibonacci[i - 1] + Fibonacci[i - 2]) / 2;
        }
    }

    cout << "The first 20 Fibonacci series numbers are: \n";
    for (int i = 1; i <= Number; i++)
    {
        cout << Fibonacci[i] << endl;
    }

    cout << "The average adjacent array numbers are: \n";
    for (int i = 3; i <= Number; i++)
    {
        average[i] = (Fibonacci[i] / 2);
        //cout.precision(0);
        cout << average[i] << endl;     // <----- here the problem!!
    }

    return 0;
}

I appreciate any help. Thanks in advance.

  • You are doing integer division. Fibonacci contains int, and 1 is also an int. You want to convert one of the values to a float first. For example: average[i] = (Fibonacci[i]/2.f) Commented Jun 10, 2020 at 13:11
  • Does this answer your question? Why does dividing two int not yield the right value when assigned to double? Commented Jun 10, 2020 at 13:12
  • for ( int i =0; i <= Number; i++ ) should be for ( int i =0; i < Number; i++ ). As written, the final pass through the loop goes off the end of the array. Commented Jun 10, 2020 at 14:07

3 Answers


When you do the division, you are doing an integer division, so you won't get floating point results. A simple fix would be the following:

average[i] = Fibonacci[i] / 2.0f;

Note that if one of the operands to / is a float, then you get floating point division.

Also, note that your loops index too far into the array. You need to stop before Number, like this:

for ( int i = 0; i < Number; i++)

This is because valid array indexes go from 0 .. (Number - 1).


If Fibonacci[i] is of type int, then (Fibonacci[i]/2) is an integral division, resulting in an integral value (without any fractional part). Assigning this integral result to a float does not change the fact that you have performed an integral division.

You can enforce a floating point division by making one of the operands a floating point value.

So either (1) use a cast...

((float)Fibonacci[i])/2

or (2) divide by 2.0f (which is a float value):

Fibonacci[i]/2.0f

1 Comment

2.0 is a double. 2.0f is a float.

Just typecast the Fibonacci array like this:

average[i] = (float)Fibonacci[i]/2;

This works because, in order to get a float result, at least one of the two operands of the / operation has to be a float.

1 Comment

@PeteBecker thanks. I just used his code with minimal modification. However, I have updated my solution.
