I have a table with millions of rows (expected to grow to billions in the future)
that contains a JSON array column of type nvarchar(max).
The table structure is like this:
[SnapshotId] - PK,
[BuildingId],
...
[MeterData],
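For clarity, here is a rough sketch of the table definition (the column types other than [MeterData] are placeholders on my part, and the real table has more columns):

-- Rough sketch only; the SnapshotId and BuildingId types are assumptions.
CREATE TABLE [HourlySnapshot]
(
    [SnapshotId] bigint NOT NULL PRIMARY KEY,  -- PK
    [BuildingId] int NOT NULL,
    -- ... other columns omitted ...
    [MeterData] nvarchar(max) NULL             -- JSON array of meter readings
);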
MeterData contains a JSON array formatted like this:
[{
  "MeterReadingId": 0,
  "BuildingMeterId": 1,
  "Value": 1.0
}, {
  "MeterReadingId": 0,
  "BuildingMeterId": 2,
  "Value": 1.625
}]
I need to filter the table to the rows whose MeterData contains an element with "BuildingMeterId": 255.
For example, I wrote the query below:
SELECT *
FROM [HourlySnapshot] h
CROSS APPLY OPENJSON(h.MeterData)
WITH (
    BuildingMeterId int '$.BuildingMeterId'
) AS MeterDataJson
WHERE MeterDataJson.BuildingMeterId = 255
It works fine, but performance is poor because of the JSON parsing.
I read that you can overcome this performance problem by creating indexes,
so I created a clustered index like the one below:
CREATE CLUSTERED INDEX CL_MeterData
ON HourlySnapshot (MeterData)
but I can't see any improvement in speed.
Have I done something wrong?
What is the best way to improve the query's performance?
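For reference, the indexing approach I read about looks roughly like the sketch below. The column and index names are my own, and as far as I understand it, JSON_VALUE can only target a fixed JSON path (for example the first array element), so I'm not sure it fits my case:

-- Sketch only: computed column over a fixed JSON path, then a regular index on it.
-- The names here are made up; '$[0]' points at the first element of the array.
ALTER TABLE [HourlySnapshot]
    ADD [FirstBuildingMeterId] AS CAST(JSON_VALUE([MeterData], '$[0].BuildingMeterId') AS int);

CREATE NONCLUSTERED INDEX IX_HourlySnapshot_FirstBuildingMeterId
    ON [HourlySnapshot] ([FirstBuildingMeterId]);

I haven't tried this yet because it only indexes one fixed path, which doesn't seem to work for an array with an arbitrary number of readings.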