
I'm using DB2 (not sure which version). I'm getting an overflow error when trying to sum a column in a table. I thought I could cast the sum to a BIGINT, which works for the sum total, but I'm also trying to compute a percentage, and when I cast to BIGINT the result is inaccurate. How do I get an accurate percentage for Percent_DeliveredB/A? Converting the numerator and denominator to BIGINT and then dividing is not giving me the correct results.

Here's my script:

SELECT
    FAT.DIM_BUILDING_ID,
    FAT.BUILDING_NAME,
    SUM(CAST(FAT.AMOUNT AS BIGINT)) AS SALES_SUM,
    SUM(CAST(FAT.ORDERS AS BIGINT)) AS ORDERS_SUM,
    SUM(CAST(FAT.CAPABILITY AS BIGINT)) AS CAPABILITY_SUM,
    SUM(FAT.ORDERS_B) / SUM(FAT.AMOUNT) AS Percent_DeliveredB,
    SUM(FAT.ORDERS_A) / SUM(FAT.AMOUNT) AS Percent_DeliveredA,
    SUM(CAST(FTS.GROUP_A AS BIGINT)) AS GROUP_A,
    SUM(CAST(FTS.GROUP_B AS BIGINT)) AS GROUP_B,
    SUM(CAST(FTS.GROUP_C AS BIGINT)) AS GROUP_C
FROM ORDERS AS FAT
INNER JOIN GROUPS AS FTS
    ON FAT.DIM_PROJECT_ID = FTS.DIM_PROJECT_ID
GROUP BY FAT.DIM_BUILDING_ID, FAT.BUILDING_NAME;

I tried the following but it comes back with 0 for the percentage.

SUM(CAST(FAT.ORDERS_B AS BIGINT)) / SUM(CAST(FAT.AMOUNT AS BIGINT)) AS Percent_DeliveredB

3 Answers


I was able to get the correct results by converting to DOUBLE.

SUM(CAST(FAT.ORDERS_B AS DOUBLE)) / SUM(CAST(FAT.AMOUNT AS DOUBLE)) AS Percent_DeliveredB
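For reference, here is a sketch of how that DOUBLE cast slots back into the grouped query from the question (the table, join, and column names are simply reused from the question; the * 100 is optional, for a 0-100 percentage rather than a fraction):

SELECT
    FAT.DIM_BUILDING_ID,
    FAT.BUILDING_NAME,
    SUM(CAST(FAT.AMOUNT AS BIGINT)) AS SALES_SUM,
    -- floating-point division, so the ratio is no longer truncated to 0
    SUM(CAST(FAT.ORDERS_B AS DOUBLE)) / SUM(CAST(FAT.AMOUNT AS DOUBLE)) * 100 AS Percent_DeliveredB,
    SUM(CAST(FAT.ORDERS_A AS DOUBLE)) / SUM(CAST(FAT.AMOUNT AS DOUBLE)) * 100 AS Percent_DeliveredA
FROM ORDERS AS FAT
INNER JOIN GROUPS AS FTS
    ON FAT.DIM_PROJECT_ID = FTS.DIM_PROJECT_ID
GROUP BY FAT.DIM_BUILDING_ID, FAT.BUILDING_NAME;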



It is giving you a correct result: any value less than 1 is truncated to 0 when cast to an integer (or BIGINT, for that matter). If you are expecting a fractional number, use the DECIMAL or FLOAT data types:

CAST(SUM(FAT.ORDERS_B) AS DECIMAL(10,2)) /
CAST(SUM(FAT.AMOUNT) AS DECIMAL(10,2)) AS Percent_DeliveredB

Use the correct precision for your needs, of course.



I got the same error while using SELECT COUNT(*) on a very big table.

I solved it by using COUNT_BIG(*), which can handle counts greater than 2,147,483,647 (the maximum value of a 32-bit INTEGER).
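A minimal sketch of that, using the ORDERS table from the question as a stand-in for the big table:

-- COUNT_BIG returns a DECIMAL result instead of an INTEGER,
-- so it is not capped at 2,147,483,647
SELECT COUNT_BIG(*) AS TOTAL_ROWS
FROM ORDERS;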

