
We are using a licensed edition of SQL Server 2008 R2 Standard on AWS.
We are upgrading our database from SQL Server 2008 R2 (10.50.1600.1) to SQL Server 2016 (13.0.16106.4).
After setting the compatibility level to 130, the query below returns the following error.

Msg 8115, Level 16, State 2, Line 22 Arithmetic overflow error converting expression to data type int.

Both tables wfWorkflows and wfTasks have the same column WorkflowId with the same data type, INT NOT NULL.

declare @CompanyID int = 510, @RecordPkId int = 4551138,@zoneDifference varchar(6) = null

Select Minutes = 
               (
               Select sum(isnull(TimeSpentOnTask,0)) 
               from wfTasks With(nolock) 
               inner join comAdvisers With(nolock)  on 
               comAdvisers.AdviserId = wfTasks.AdviserId 
               Where WorkflowId = wfWorkflows.WorkflowId
               )
from wfWorkflows With(nolock) 
Where companyid = @CompanyID 
And ISNULL(ClientID,AdviserID) = @RecordPkId And ISNULL(InitSave,0) <> 1;

If I comment out either of the subquery columns from the SELECT, the query works fine with no error; with both columns in place I get the error.

declare @CompanyID int = 510, @RecordPkId int = 4551138, @zoneDifference varchar(6) = null

Select Minutes = 
               (
               Select sum(isnull(TimeSpentOnTask,0)) 
               from wfTasks With(nolock) 
               Where WorkflowId = wfWorkflows.WorkflowId
               ),
       EstMinutes = 
               (
               Select sum(isnull(Duration,1)) 
               from wfTasks With(nolock) 
               Where WorkflowId = wfWorkflows.WorkflowId
               )
from wfWorkflows With(nolock) 
Where companyid = @CompanyID 
And ISNULL(ClientID,AdviserID) = @RecordPkId And ISNULL(InitSave,0) <> 1

  • How can anyone answer without knowing what data caused the problem? Don't look for bugs, find the value that can't fit in an int. Commented Jun 21, 2017 at 14:40
  • The WHERE condition in the subquery is causing the issue. If I cast the first column as bigint, it's resolved: Where cast(WorkflowId as bigint) = wfWorkflows.WorkflowId. We have around 4000 DB components, and converting all columns to bigint is not a feasible solution. Commented Jun 21, 2017 at 14:44
  • WITH (NOLOCK) means you are OK with reading dirty data. It means don't respect locks, not don't take locks. This suggests that your query already has performance issues. Simplify your query, comment out parts and bring them back one by one until you find which one causes the overflow. Commented Jun 21, 2017 at 14:44
  • Why ask a question at all then? If the number is too large, the number is too large. That's exactly what the error says. Commented Jun 21, 2017 at 14:45
  • Joining between views, aggregate subqueries with external filtering, NOLOCK, I'd say you have a lot of issues to fix already. Especially if you have so many rows that the sum of TimeSpentOnTask requires a 64-bit integer. Commented Jun 21, 2017 at 14:47

1 Answer


Even when "everything is int", the sum can overflow the int type; that's normal. Look at this:

    declare @t table(col int);
    insert into @t values(2147483647),(2147483647);

    select sum(col)
    from @t;

    --Msg 8115, Level 16, State 2, Line 4
    --Arithmetic overflow error converting expression to data type int.

    select sum(cast(col as bigint))
    from @t;
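    --(no error; the sum is returned as bigint: 4294967294)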

In my first select I get the error, and that's correct, because the result does not fit into int. In my second select I cast every summand to bigint, so the sum is bigint as well and no overflow occurs.

So as a first step you should cast the summands to bigint, to at least get the result and see whether it can actually fit into int. If it cannot, it could not fit on the earlier versions either, and that is a question of your data. If the result is small enough to fit into int, the cause may be the following: when you have negative summands, the order in which the sum is calculated matters. For example, if I do

2147483647 - 2147483647 + 2 

the running total still fits into int, but what happens if I do it in this order:

2147483647 + 2 - 2147483647

It overflows int at the first step. So what may have happened in your case is, for example, a parallel plan that repartitions the input in different ways, so the overall result can still fit into int but one of the streams caused an int overflow.
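Applied to the query from the question, that first diagnostic step could look roughly like this (just a sketch reusing the table and column names from the post, not a final fix):

    declare @CompanyID int = 510, @RecordPkId int = 4551138;

    Select Minutes = 
                   (
                   -- cast each summand to bigint so the SUM is computed as bigint
                   Select sum(cast(isnull(TimeSpentOnTask,0) as bigint)) 
                   from wfTasks With(nolock) 
                   inner join comAdvisers With(nolock) on 
                   comAdvisers.AdviserId = wfTasks.AdviserId 
                   Where WorkflowId = wfWorkflows.WorkflowId
                   )
    from wfWorkflows With(nolock) 
    Where companyid = @CompanyID 
    And ISNULL(ClientID,AdviserID) = @RecordPkId And ISNULL(InitSave,0) <> 1;

If the Minutes value that comes back exceeds 2,147,483,647, the data genuinely no longer fits in int; if it is smaller, the evaluation-order effect described above is the more likely cause.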


2 Comments

I have more than 4000 DB components; converting to BIGINT is not a feasible solution.
I didn't tell you to change a data type; you can cast it when you aggregate, in your query only, in order to get the result and see whether it fits in int. The query from your post does not contain 4000 components; it takes one minute to rewrite it using cast as bigint.
