I have a strange problem with the performance of one specific SQL statement (its contents are, I think, not important). When I run it through my .NET application (using EF for data access; the SQL gets parameterized) and capture the execution with SQL Profiler, I get the following results:
- CPU: 28,253
- Reads: 8,503,799
- Writes: 0
- Duration: 54,274 ms (a little more than 54 seconds)
However, when I run the same SQL statement through SSMS (after first executing `SET STATISTICS IO ON` and `SET STATISTICS TIME ON`), I get the following results:
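For reference, this is exactly what I execute in the SSMS session before the statement (the statement itself is omitted here, it is the same one the application issues):

```sql
SET STATISTICS IO ON;
SET STATISTICS TIME ON;
-- ... the same SELECT statement the application issues ...
```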
Table 'mytable'. Scan count 1, logical reads 4, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'Worktable'. Scan count 0, logical reads 0, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
SQL Server Execution Times:
CPU time = 16 ms, elapsed time = 138 ms.
Both statements are executed under the same user context, though obviously over different connections. As you can see, running the same query through SSMS is dramatically faster than running it through the application.
Where should I look for the difference, i.e. what could trigger such different read counts and durations? Could the problem be parameter sniffing (although I don't see how)?
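In case it helps to narrow this down: two checks I was planning to try are (a) mimicking the application's parameterized call from SSMS via `sp_executesql`, so both sides should hit the same cached plan, and (b) comparing the session-level SET options of the two connections, since I've read they affect plan-cache matching. A rough sketch (the parameter name/type, filter column, and session IDs below are placeholders, not my real ones):

```sql
-- (a) Mimic the application's parameterized call from SSMS
--     (@p0 and the WHERE clause are placeholders for my real query):
EXEC sp_executesql
    N'SELECT * FROM mytable WHERE somecolumn = @p0',
    N'@p0 int',
    @p0 = 12345;

-- (b) Compare SET options between the app connection and the SSMS
--     connection (session_id values are placeholders; I would take
--     them from sp_who2 or the Profiler trace):
SELECT session_id, arithabort, ansi_nulls, quoted_identifier
FROM sys.dm_exec_sessions
WHERE session_id IN (53, 57);
```

Would differences found this way (e.g. in `arithabort`) be enough to explain separate cached plans, and therefore the behavior above?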