
I have a nested JSON file, shown below, where condition and rules can be nested to multiple levels:

    {
      "condition": "and",
      "rules": [
        {
          "field": "26",
          "operator": "=",
          "value": "TEST1"
        },
        {
          "field": "36",
          "operator": "=",
          "value": "TEST2"
        },
        {
          "condition": "or",
          "rules": [
            {
              "field": "2",
              "operator": "=",
              "value": 100
            },
            {
              "field": "3",
              "operator": "=",
              "value": 12
            },
            {
              "condition": "or",
              "rules": [
                {
                  "field": "12",
                  "operator": "=",
                  "value": "CA"
                },
                {
                  "field": "12",
                  "operator": "=",
                  "value": "AL"
                }
              ]
            }
          ]
        }
      ]
    }

I want to save this JSON (the condition and rules fields can be nested to multiple levels) into SQL Server tables, and later construct the same JSON from those tables. How can I do this? I also plan to generate other JSON formats from these tables, which is why I decided to split the JSON into table columns.

I think I need to create a recursive SQL function to do this.
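The shape of that recursion can be sketched in a few lines of Python first (purely illustrative, not part of the SQL solution; the function name and row shapes are my own). The rows mirror the Ruleset (RulesetID, Condition, ParentRuleSetID) and Rules (Field, Operator, Value, RulesetID) tables I created below, with IDs assigned in document order the way an IDENTITY column would:

```python
import json

def flatten_ruleset(node, ruleset_rows, rule_rows, parent_id=0):
    """Recursively shred a nested condition/rules document into flat rows.

    ruleset_rows collects (RulesetID, Condition, ParentRuleSetID) tuples;
    rule_rows collects (Field, Operator, Value, RulesetID) tuples.
    """
    ruleset_id = len(ruleset_rows) + 1          # mimic IDENTITY(1, 1)
    ruleset_rows.append((ruleset_id, node["condition"], parent_id))
    for rule in node["rules"]:
        if "condition" in rule:                 # nested ruleset -> recurse
            flatten_ruleset(rule, ruleset_rows, rule_rows, ruleset_id)
        else:                                   # leaf rule -> one Rules row
            rule_rows.append((rule["field"], rule["operator"],
                              str(rule["value"]), ruleset_id))

# Abbreviated version of the document above:
doc = json.loads("""
{"condition": "and",
 "rules": [{"field": "26", "operator": "=", "value": "TEST1"},
           {"condition": "or",
            "rules": [{"field": "2", "operator": "=", "value": 100}]}]}
""")
ruleset_rows, rule_rows = [], []
flatten_ruleset(doc, ruleset_rows, rule_rows)
print(ruleset_rows)   # [(1, 'and', 0), (2, 'or', 1)]
print(rule_rows)      # [('26', '=', 'TEST1', 1), ('2', '=', '100', 2)]
```

The same depth-first walk is what a recursive SQL query over the nested rules arrays would have to perform.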

I have created the following tables to store this JSON:

CREATE TABLE [Ruleset]
([RulesetID]       [BIGINT] IDENTITY(1, 1) NOT NULL PRIMARY KEY,
 [Condition]       [VARCHAR](50) NOT NULL,
 [ParentRuleSetID] [BIGINT] NULL
);
GO
CREATE TABLE [Rules]
([RuleID]    [BIGINT] IDENTITY(1, 1) NOT NULL PRIMARY KEY,
 [Fields]    [VARCHAR](MAX) NOT NULL,
 [Operator]  [VARCHAR](MAX) NOT NULL,
 [Value]     [VARCHAR](MAX) NOT NULL,
 [RulesetID] [BIGINT] NULL
                      FOREIGN KEY REFERENCES [Ruleset](RulesetID)
);

The insert script is as follows:

INSERT INTO [Ruleset] VALUES
    ('AND',0),
    ('OR',1),
    ('OR',2)

INSERT INTO [Rules] VALUES
    ('26','=','TEST1',1),
    ('364','=','TEST2',1),
    ('2','=','100',2),
    ('3','=','12',2),
    ('12','=','CA',3),
    ('12','=','AL',3)

Are these tables enough? Will I be able to save all the details?

Attached is a screenshot of the values I added to these tables manually.


How can I save this JSON into these tables, and later construct the same JSON from them via a stored procedure or queries?

Please provide suggestions and samples!

Comments:
  • What is the reason to deconstruct the JSON tree and store it in the database? I think the best approach is to store the JSON tree in an NVARCHAR column. You can easily use the JSON tree with queries, but also retrieve the complete tree: learn.microsoft.com/en-us/sql/relational-databases/json/… Commented Jun 12, 2020 at 14:28
  • @rfkortekaas: I have to construct other JSON formats from these tables. In that case, is this approach right? Commented Jun 16, 2020 at 5:41
  • If the data is required to be in JSON format for manipulation, then why not read it the way you write it? If serialization and deserialization are not an issue across boundaries, then I would store it as JSON, just as XML is stored. If you need to manipulate it in SQL Server, I would start with the guides for JSON data: learn.microsoft.com/en-us/sql/relational-databases/json/… Commented Jun 17, 2020 at 14:54
  • Are you using JPA in your code? If so, I recently answered such a question; we can use a converter method which will do the job perfectly. Refer to stackoverflow.com/questions/62039910/… Commented Jun 19, 2020 at 4:18
  • @KunalVohra: No. Commented Jun 19, 2020 at 4:57

4 Answers


Actually, you can simply declare the column as NVARCHAR(MAX) and save the JSON string into it.




Since JSON is case sensitive, please check your schema definition and sample data. I see discrepancies between the table definitions, their contents, and your JSON.

All scripts tested on MS SQL Server 2016

I used a table variable in this script, but you can do without it. See an example in SQL Fiddle.

-- JSON -> hierarchy table
DECLARE @ExpectedJSON NVARCHAR(MAX) = '
{
    "condition": "and",
    "rules": [
        {
            "field": "26",
            "operator": "=",
            "value": "TEST1"
        },
        {
            "field": "36",
            "operator": "=",
            "value": "TEST2"
        },
        {
            "condition": "or",
            "rules": [
                {
                    "field": "2",
                    "operator": "=",
                    "value": 100
                },
                {
                    "field": "3",
                    "operator": "=",
                    "value": 12
                },
                {
                    "condition": "or",
                    "rules": [
                        {
                            "field": "12",
                            "operator": "=",
                            "value": "CA"
                        },
                        {
                            "field": "12",
                            "operator": "=",
                            "value": "AL"
                        }
                    ]
                }
            ]
        }
    ]
}
'

DECLARE @TempRuleset AS TABLE 
(RulesetID          BIGINT NOT NULL PRIMARY KEY,
 condition          VARCHAR(50) NOT NULL,
 ParentRuleSetID    BIGINT NOT NULL,
 RulesJSON          NVARCHAR(MAX)
)

;WITH ParseRuleset AS (
    SELECT  1 AS RulesetID,
            p.condition,
            p.rules,
            0 AS ParentRuleSetID
    FROM OPENJSON(@ExpectedJSON, '$') WITH (
        condition   VARCHAR(50),
        rules       NVARCHAR(MAX) AS JSON
    ) AS p

    UNION ALL

    SELECT  RulesetID + 1,
            p.condition,
            p.rules,
            c.RulesetID AS ParentRuleSetID
    FROM ParseRuleset AS c
        CROSS APPLY OPENJSON(c.rules) WITH (
            condition   VARCHAR(50),
            rules       NVARCHAR(MAX) AS JSON
        ) AS p
    WHERE p.rules IS NOT NULL
)

INSERT INTO @TempRuleset (RulesetID, condition, ParentRuleSetID, RulesJSON)
SELECT  RulesetID,
        condition,
        ParentRuleSetID,
        rules
FROM ParseRuleset

-- INSERT INTO Ruleset ...
SELECT RulesetID,
        condition,
        ParentRuleSetID,
        RulesJSON
FROM @TempRuleset

-- INSERT INTO Rules ...
SELECT  RulesetID,
        field,
        operator,
        value
FROM @TempRuleset tmp
     CROSS APPLY OPENJSON(tmp.RulesJSON) WITH (
                field       VARCHAR(MAX),
                operator    VARCHAR(MAX),
                value       VARCHAR(MAX)
             ) AS p
WHERE   p.field IS NOT NULL


Hierarchy tables -> JSON:

CREATE TABLE Ruleset
(RulesetID       BIGINT IDENTITY(1, 1) NOT NULL PRIMARY KEY,
 condition       VARCHAR(50) NOT NULL,
 ParentRuleSetID BIGINT NULL
);
GO
CREATE TABLE Rules
(RuleID     BIGINT IDENTITY(1, 1) NOT NULL PRIMARY KEY,
 field      VARCHAR(MAX) NOT NULL,
 operator   VARCHAR(MAX) NOT NULL,
 value      VARCHAR(MAX) NOT NULL,
 RulesetID  BIGINT NULL FOREIGN KEY REFERENCES Ruleset(RulesetID)
);

INSERT INTO Ruleset values  
    ('and',0),
    ('or',1),
    ('or',2) 

INSERT INTO Rules values  
    ('26','=','TEST1',1),
    ('36','=','TEST2',1),
    ('2','=','100',2),
    ('3','=','12',2),
    ('12','=','CA',3),
    ('12','=','AL',3)

-- hierarchy table -> JSON
;WITH GetLeafLevel AS 
(
    SELECT  Ruleset.RulesetID,
            Ruleset.condition,
            Ruleset.ParentRuleSetID,
            1 AS lvl,
            (   SELECT  field,
                        operator,
                        value
                FROM    Rules
                WHERE   Rules.RulesetID = Ruleset.RulesetID
                FOR JSON AUTO, WITHOUT_ARRAY_WRAPPER 
            ) AS JSON_Rules
    FROM    Ruleset
    WHERE   ParentRuleSetID = 0
    UNION ALL
    SELECT  Ruleset.RulesetID,
            Ruleset.condition,
            Ruleset.ParentRuleSetID,
            GetLeafLevel.lvl + 1,
            (   SELECT  field,
                        operator,
                        value
                FROM    Rules
                WHERE   Rules.RulesetID = Ruleset.RulesetID
                FOR JSON AUTO, WITHOUT_ARRAY_WRAPPER 
            )
    FROM    Ruleset
            INNER JOIN GetLeafLevel ON Ruleset.ParentRuleSetID = GetLeafLevel.RulesetID
),
-- SELECT * FROM GetLeafLevel       -- debug 
ConcatReverseOrder AS 
(
    SELECT  GetLeafLevel.*,
            CONCAT('{"condition":"',
                    GetLeafLevel.condition,
                    '","rules":[',
                    GetLeafLevel.JSON_Rules,
                    ']}'
                    ) AS js
    FROM    GetLeafLevel
    WHERE   GetLeafLevel.lvl = (SELECT MAX(lvl) FROM GetLeafLevel)
    UNION ALL
    SELECT  GetLeafLevel.*,
            CONCAT('{"condition":"',
                            GetLeafLevel.condition,
                            '","rules":[',
                            GetLeafLevel.JSON_Rules,
                            ',',
                            ConcatReverseOrder.js,
                            ']}'
                            ) AS js
    FROM    GetLeafLevel
            INNER JOIN ConcatReverseOrder ON GetLeafLevel.RuleSetID = ConcatReverseOrder.ParentRuleSetID 
)
-- SELECT * FROM ConcatReverseOrder     -- debug 

SELECT  js
FROM    ConcatReverseOrder
WHERE   ParentRuleSetID = 0




I feel like I would need to know more about how you plan to use the data to answer this. My heart tells me there is something wrong, or at least problematic, about storing this information in MSSQL.

If I had to do it, I would convert these conditions into a matrix lookup table of rotatable events within your branch, so that for each conceivable logic branch you could create a row in a lookup table to evaluate it.

Depending on your required output and feature set, you can either do something like the above or just throw everything in an NVARCHAR column as suggested by rfkortekaas.

Comments:

  • I want to create some other JSON formats from these values as well; that is why I am saving them as table and column entries. This concept is part of an existing application built on SQL Server, so I cannot use other databases.
  • In response to your question ("I have to construct other JSON formats from these tables; in that case, is this approach right?"): yes, as long as you are willing to create the procedures and functions to interpret the JSON objects and extract the data you're interested in. It still feels like you could benefit from formatting the data before bringing it into the DB.
  • No, unfortunately. If you were to explain your use case I might be able to elaborate, but for the most part I believe you are going about this the wrong way.

Your use case really does seem a perfect match for a NoSQL option such as MongoDB, Azure Table Storage, or CosmosDB (CosmosDB can be pricey if you don't know your way around it).

An extract from the MongoDB page:

In MongoDB, data is stored as documents. These documents are stored in MongoDB in JSON (JavaScript Object Notation) format. JSON documents support embedded fields, so related data and lists of data can be stored with the document instead of an external table.

However, from here on I'm going to assume you are tied to SQL Server for other reasons.

You have stated that you are just putting the document in and getting the same document out, so it doesn't make sense to go to the effort of splitting out all the fields.

SQL Server is much better at handling text fields than it used to be, in my opinion.

Systems I've worked on before have had the following columns (I would write the SQL, but I'm not at my dev machine):

  • Id [Primary Key, Integer, Incrementing index]
  • UserId [a Foreign Key to what this relates to - probably not 'user' in your case!]
  • Value [nvarchar(1000) contains the json as a string]

The lookup is easily done based on the foreign key.

However, suppose you want it to be more NoSql like, you could have:

  • Id [Primary Key, Integer, Incrementing index]
  • Key [nvarchar(100) a string key that you make, and can easily re-make for looking up the value (e.g. User_43_Level_6_GameData - this column should have an index]
  • Value [nvarchar(1000) contains the json as a string]

The reason I've kept to having an integer ID is to avoid fragmentation. You could obviously make the Value column bigger.

JSON can easily be converted between a JSON object and a JSON string. In JavaScript you would use JSON.parse and JSON.stringify. If you are using C#, you could use the following snippets, though there are many ways to do this task (the objects can be nested as deep as you like).

.NET object to JSON:

Weather w = new Weather("rainy", "windy", "32");
var jsonString = JsonSerializer.Serialize(w);

JSON to .NET object (C#):

var w = JsonSerializer.Deserialize<Weather>(jsonString);

UPDATE

Although this is the way I've done things in the past, it looks like there are newer options in SQL Server to handle JSON: OPENJSON and JSON_QUERY could be potential options, though I haven't used them myself. They still use NVARCHAR for the JSON column.

Comments:

  • Agreed with @HockeyJ. Another option is NoSQL; you could use ravendb.net as well.
  • @HockeyJ: this is part of an existing application built on SQL Server, so I cannot use other databases. I am looking for samples of OPENJSON and JSON_QUERY for this scenario.
