


How to load JSON data into SQL Server in different ways

Ryan Chong
CERTIFIED EXPERT
Data can be kept in different media, and sometimes it needs to be extracted, transformed, or loaded in different ways.

For this article, I'm going to demonstrate some popular ways of importing JSON data into MS SQL Server.
Recently I wrote an article showing how to load Excel data into SQL Server in different ways. It gave me a good opportunity to refresh my technical skills; without a push, there is never a starting point for learning.

In this article, I demonstrate the most effective ways that I know of to load JSON data into Microsoft SQL Server. If you know of other ways that you feel are more efficient, please share them in the comments section below. Note that some customization may still be needed to suit your own project.


Preparation
 
Step 1. Create the table structure for Table TableJSON by executing the SQL command below:

CREATE TABLE [dbo].[TableJSON](
    [Dates] [datetime] NULL,
    [Type] [varchar](1) NULL,
    [Code] [varchar](20) NULL,
    [Price] [money] NULL,
    [Unit] [int] NULL,
    [SubTotal] [money] NULL,
    [Brokeage Rate] [money] NULL,
    [Tax] [money] NULL,
    [Stamp Duty] [money] NULL,
    [Clearing Fee] [money] NULL,
    [Amount] [money] NULL,
    [Svc Cost] [money] NULL
) ON [PRIMARY]
GO

Step 2. Copy and paste the following JSON data into a physical file and name it source.json, or download a copy of the source.json file here.

[{
        "Dates": "2018-02-03T00:00:00",
        "Type": "B",
        "Code": "AB 101",
        "Price": 4.8800,
        "Unit": 500,
        "SubTotal": 2440.0000,
        "Brokeage Rate": 28.0000,
        "Tax": 1.6800,
        "Stamp Duty": 3.0000,
        "Clearing Fee": 0.7400,
        "Amount": 2473.4200,
        "Svc Cost": 33.4200
    }, {
        "Dates": "2018-03-09T00:00:00",
        "Type": "B",
        "Code": "AB 101",
        "Price": 4.8500,
        "Unit": 500,
        "SubTotal": 2425.0000,
        "Brokeage Rate": 28.0000,
        "Tax": 1.6800,
        "Stamp Duty": 3.0000,
        "Clearing Fee": 0.7300,
        "Amount": 2458.4100,
        "Svc Cost": 33.4100
    }, {
        "Dates": "2018-04-25T00:00:00",
        "Type": "B",
        "Code": "DX 012",
        "Price": 8.9300,
        "Unit": 300,
        "SubTotal": 2679.0000,
        "Brokeage Rate": 28.0000,
        "Tax": 1.6800,
        "Stamp Duty": 3.0000,
        "Clearing Fee": 0.8100,
        "Amount": 2712.4900,
        "Svc Cost": 33.4900
    }, {
        "Dates": "2018-06-27T00:00:00",
        "Type": "D",
        "Code": "AB 101",
        "Price": 19.0000,
        "Unit": 1,
        "SubTotal": 19.0000,
        "Tax": 0.0000,
        "Amount": 19.0000
    }, {
        "Dates": "2018-07-26T00:00:00",
        "Type": "B",
        "Code": "ND 252",
        "Price": 26.9000,
        "Unit": 100,
        "SubTotal": 2690.0000,
        "Brokeage Rate": 28.0000,
        "Tax": 1.6800,
        "Stamp Duty": 3.0000,
        "Clearing Fee": 0.8100,
        "Amount": 2723.4900,
        "Svc Cost": 33.4900
    }, {
        "Dates": "2018-08-15T00:00:00",
        "Type": "B",
        "Code": "EW 013",
        "Price": 3.0500,
        "Unit": 800,
        "SubTotal": 2440.0000,
        "Brokeage Rate": 28.0000,
        "Tax": 1.6800,
        "Stamp Duty": 3.0000,
        "Clearing Fee": 0.7400,
        "Amount": 2473.4200,
        "Svc Cost": 33.4200
    }, {
        "Dates": "2018-10-22T00:00:00",
        "Type": "B",
        "Code": "ND 252",
        "Price": 21.9000,
        "Unit": 100,
        "SubTotal": 2190.0000,
        "Brokeage Rate": 28.0000,
        "Tax": 1.6800,
        "Stamp Duty": 3.0000,
        "Clearing Fee": 0.6600,
        "Amount": 2223.3400,
        "Svc Cost": 33.3400
    }, {
        "Dates": "2018-11-22T00:00:00",
        "Type": "D",
        "Code": "ND 252",
        "Price": 0.2600,
        "Unit": 100,
        "SubTotal": 26.0000,
        "Tax": 0.0000,
        "Amount": 26.0000
    }, {
        "Dates": "2018-12-18T00:00:00",
        "Type": "B",
        "Code": "FP 009",
        "Price": 5.7800,
        "Unit": 500,
        "SubTotal": 2890.0000,
        "Brokeage Rate": 28.0000,
        "Tax": 1.6800,
        "Stamp Duty": 3.0000,
        "Clearing Fee": 0.8700,
        "Amount": 2923.5500,
        "Svc Cost": 33.5500
    }, {
        "Dates": "2019-01-02T00:00:00",
        "Type": "D",
        "Code": "DX 012",
        "Price": 0.2500,
        "Unit": 300,
        "SubTotal": 75.0000,
        "Tax": 0.0000,
        "Amount": 75.0000
    }, {
        "Dates": "2019-02-10T00:00:00",
        "Type": "D",
        "Code": "AB 101",
        "Price": 0.0190,
        "Unit": 1000,
        "SubTotal": 19.0000,
        "Tax": 0.0000,
        "Amount": 19.0000
    }, {
        "Dates": "2019-03-11T00:00:00",
        "Type": "D",
        "Code": "EW 013",
        "Price": 0.0600,
        "Unit": 800,
        "SubTotal": 48.0000,
        "Tax": 0.0000,
        "Amount": 48.0000
    }, {
        "Dates": "2019-04-21T00:00:00",
        "Type": "B",
        "Code": "ND 252",
        "Price": 19.6800,
        "Unit": 100,
        "SubTotal": 1968.0000,
        "Brokeage Rate": 28.0000,
        "Tax": 1.6800,
        "Stamp Duty": 2.0000,
        "Clearing Fee": 0.6000,
        "Amount": 2000.2800,
        "Svc Cost": 32.2800
    }, {
        "Dates": "2019-05-12T00:00:00",
        "Type": "B",
        "Code": "DX 012",
        "Price": 8.8200,
        "Unit": 300,
        "SubTotal": 2646.0000,
        "Brokeage Rate": 28.0000,
        "Tax": 1.6800,
        "Stamp Duty": 3.0000,
        "Clearing Fee": 0.8000,
        "Amount": 2679.4800,
        "Svc Cost": 33.4800
    }, {
        "Dates": "2019-06-18T00:00:00",
        "Type": "D",
        "Code": "FP 009",
        "Price": 0.0650,
        "Unit": 500,
        "SubTotal": 32.5000,
        "Tax": 0.0000,
        "Amount": 32.5000
    }, {
        "Dates": "2019-07-26T00:00:00",
        "Type": "D",
        "Code": "ND 252",
        "Price": 0.2900,
        "Unit": 300,
        "SubTotal": 87.0000,
        "Tax": 0.0000,
        "Amount": 87.0000
    }, {
        "Dates": "2019-08-04T00:00:00",
        "Type": "B",
        "Code": "KE 130",
        "Price": 0.9850,
        "Unit": 3200,
        "SubTotal": 3152.0000,
        "Brokeage Rate": 28.0000,
        "Tax": 1.6800,
        "Stamp Duty": 4.0000,
        "Clearing Fee": 0.9500,
        "Amount": 3186.6300,
        "Svc Cost": 34.6300
    }, {
        "Dates": "2019-09-27T00:00:00",
        "Type": "D",
        "Code": "AB 101",
        "Price": 0.0180,
        "Unit": 1000,
        "SubTotal": 18.0000,
        "Tax": 0.0000,
        "Amount": 18.0000
    }, {
        "Dates": "2019-10-14T00:00:00",
        "Type": "B",
        "Code": "DX 012",
        "Price": 8.5900,
        "Unit": 400,
        "SubTotal": 3436.0000,
        "Brokeage Rate": 28.0000,
        "Tax": 1.6800,
        "Stamp Duty": 4.0000,
        "Clearing Fee": 1.0400,
        "Amount": 3470.7200,
        "Svc Cost": 34.7200
    }, {
        "Dates": "2019-11-09T00:00:00",
        "Type": "D",
        "Code": "KE 130",
        "Price": 0.0184,
        "Unit": 3200,
        "SubTotal": 58.8800,
        "Tax": 5.8900,
        "Amount": 52.9900,
        "Svc Cost": 5.8900
    }, {
        "Dates": "2019-12-30T00:00:00",
        "Type": "B",
        "Code": "PC 924",
        "Price": 6.4200,
        "Unit": 600,
        "SubTotal": 3852.0000,
        "Brokeage Rate": 28.0000,
        "Tax": 1.6800,
        "Stamp Duty": 4.0000,
        "Clearing Fee": 1.1600,
        "Amount": 3886.8400,
        "Svc Cost": 34.8400
    }, {
        "Dates": "2020-01-04T00:00:00",
        "Type": "B",
        "Code": "FP 009",
        "Price": 4.9200,
        "Unit": 800,
        "SubTotal": 3936.0000,
        "Brokeage Rate": 28.0000,
        "Tax": 1.6800,
        "Stamp Duty": 4.0000,
        "Clearing Fee": 1.1900,
        "Amount": 3970.8700,
        "Svc Cost": 34.8700
    }, {
        "Dates": "2020-01-18T00:00:00",
        "Type": "D",
        "Code": "ND 252",
        "Price": 0.3300,
        "Unit": 300,
        "SubTotal": 99.0000,
        "Tax": 0.0000,
        "Amount": 99.0000
    }, {
        "Dates": "2020-02-27T00:00:00",
        "Type": "D",
        "Code": "AB 101",
        "Price": 0.0180,
        "Unit": 1000,
        "SubTotal": 18.0000,
        "Tax": 0.0000,
        "Amount": 18.0000
    }, {
        "Dates": "2020-03-27T00:00:00",
        "Type": "D",
        "Code": "PC 924",
        "Price": 0.0700,
        "Unit": 600,
        "SubTotal": 42.0000,
        "Tax": 0.0000,
        "Amount": 42.0000
    }
]
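Before loading, it is worth a quick sanity check that the file parses as valid JSON. A minimal sketch in Python, using a trimmed inline record in the same shape as the data above (to check the real file, pass an open file handle to json.load instead):

```python
import json

# One trimmed record in the same shape as source.json.
sample = '[{"Dates": "2018-02-03T00:00:00", "Type": "B", "Code": "AB 101", "Price": 4.88, "Unit": 500}]'

# json.loads raises a ValueError if the text is not valid JSON.
records = json.loads(sample)

print(len(records), records[0]["Code"])  # 1 AB 101
```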


How to load JSON data into SQL Server
 
1 )   Using SQL Server Management Studio - OPENROWSET + OPENJSON commands
2 )   Using SSIS in Visual Studio
3 )   Using PowerShell
4 )   Using Node.js
5 )   Using Python
6 )   Using SQL Server Management Studio - sp_execute_external_script command
7 )   Using .NET (C#)
8 )   Using PHP
9 )   Using Java
10 ) Using Golang
 
* The above list will be updated in due course.
 
1 ) Using SQL Server Management Studio - OPENROWSET + OPENJSON Commands 
 
To import the data, we can try the following SQL scripts:

--This is for testing purposes, so clean up the table
TRUNCATE TABLE TableJSON;


--Import the data to target table
INSERT INTO TableJSON
SELECT tbl.*
FROM OPENROWSET (BULK 'C:\yourpath\source.json', SINGLE_CLOB) js
CROSS APPLY OPENJSON(BulkColumn)
WITH
(
    [Dates] [datetime],
    [Type] [varchar](1),
    [Code] [varchar](20),
    [Price] [money],
    [Unit] [int],
    [SubTotal] [money],
    [Brokeage Rate] [money],
    [Tax] [money],
    [Stamp Duty] [money],
    [Clearing Fee] [money],
    [Amount] [money],
    [Svc Cost] [money]
) tbl;
Note: You may encounter the following error when you execute the OPENROWSET command.
  SQL Server blocked access to STATEMENT 'OpenRowset/OpenDatasource' of component 'Ad Hoc Distributed Queries' because this component is turned off as part of the security configuration for this server. A system administrator can enable the use of 'Ad Hoc Distributed Queries' by using sp_configure. For more information about enabling 'Ad Hoc Distributed Queries', search for 'Ad Hoc Distributed Queries' in SQL Server Books Online. 
To resolve the issue, you would need to execute the following SQL commands:

EXEC sp_configure 'show advanced options', 1
RECONFIGURE WITH OVERRIDE
GO

EXEC sp_configure 'ad hoc distributed queries', 1
RECONFIGURE WITH OVERRIDE
GO

EXEC master.dbo.sp_MSset_oledb_prop N'Microsoft.ACE.OLEDB.15.0' , N'AllowInProcess' , 1
GO

EXEC master.dbo.sp_MSset_oledb_prop N'Microsoft.ACE.OLEDB.15.0' , N'DynamicParameters' , 1
GO
The OPENJSON command is relatively new (it was introduced in SQL Server 2016), and it has been gaining popularity among SQL Server users because it makes reading JSON-formatted data easy.

If all is done properly, you should be able to see that the data was inserted successfully.


Then you can examine the data that was imported into the TableJSON table in SQL Server.


2 ) Using SSIS in Visual Studio 
 
In order to create an SSIS project in Visual Studio, you would first need to download and install the SQL Server Data Tools (SSDT) separately.
 
  Download SQL Server Data Tools (SSDT) for Visual Studio 
 
Once installed and available, you should be able to see a new tree node called Business Intelligence; then select Integration Services Project to continue.


To make a start, we can drag and drop an Execute SQL Task into the Control Flow.

Then we need to configure the Connection as well as the SQLStatement to be executed, which cleans up the target table with the SQL below:

Truncate Table TableJSON;


After this, drag and drop a Data Flow Task underneath the Execute SQL Task that we have created.

Once this is done, select the Execute SQL Task and a green arrow will appear.


Drag the green arrow to the top of Data Flow Task and link them together.


Now, double click the Data Flow Task, and it will bring us to the Data Flow tab.

The Data Flow is where we select and transform the data source, and then load it into a data destination.

First, we need to select and drag the OLE DB Source from SSIS Toolbox > Other Sources into the Data Flow workspace.

Double click the OLE DB Source to edit the Connection Manager, Columns Mapping, Error Handling.


For the Data access mode, we need to select SQL command and enter the following SQL into the SQL command text, which is exactly the OPENROWSET + OPENJSON query from the previous method.

SELECT tbl.*
FROM OPENROWSET (BULK 'C:\yourpath\source.json', SINGLE_CLOB) js
CROSS APPLY OPENJSON(BulkColumn)
WITH
(
    [Dates] [datetime],
    [Type] [varchar](1),
    [Code] [varchar](20),
    [Price] [money],
    [Unit] [int],
    [SubTotal] [money],
    [Brokeage Rate] [money],
    [Tax] [money],
    [Stamp Duty] [money],
    [Clearing Fee] [money],
    [Amount] [money],
    [Svc Cost] [money]
) tbl;

Once this is done, we need to select and drag the OLE DB Destination from SSIS Toolbox > Other Destinations into the Data Flow workspace.


After this, link the flows together.


Try executing the Package by clicking the Start button (or pressing F5).

You should be able to see that the Package executed successfully.


Then you can examine the data that was imported into the TableJSON table in SQL Server.

3 ) Using PowerShell

PowerShell is a very powerful tool developed in .NET and can facilitate a lot of tasks effectively.

In order to import JSON directly into SQL Server, we need to install one additional library: dbatools.

We can install this library with the Install-Module command:

Install-Module dbatools
Once the library is installed, we can try the following script to import the JSON data into SQL Server:

$TargetServer = 'ServerName'
$TargetDb = 'DatabaseName'
$TableName = 'TableJSON'

$SourcePath = 'C:\yourpath\source.json'

try{

    $data = Get-Content $SourcePath | ConvertFrom-Json
   
    $data | ConvertTo-DbaDataTable | Write-DbaDbTableData -SqlInstance $TargetServer -Database $TargetDb -Table $TableName -Truncate -Confirm:$false;

    ((Get-Date).ToString() + " | File Imported: " + $SourcePath);
}
catch
{
    $ErrorMessage = $_.Exception.Message
    $FailedItem = $_.Exception.ItemName
    ((Get-Date).ToString() + " | Error: " + $ErrorMessage + " - " + $FailedItem);
}

The command ConvertFrom-Json converts a JSON-formatted string to a custom object or a hash table.

The Write-DbaDbTableData command is very powerful; look into its documentation to see what features are available.

You should be able to see that the script executed successfully.


Then you can examine the data that was imported into the TableJSON table in SQL Server.

4 ) Using Node.js

In order to import the JSON data into SQL Server, you may first need to install the mssql library used for the connection (e.g. via npm install mssql).

Once the installation is done, you can prepare and run the code below:

'use strict';
var http = require('http');
var port = process.env.PORT || 1337;


http.createServer(function (req, res) {
    if (req.url === '/favicon.ico') {
        res.writeHead(200, { 'Content-Type': 'image/x-icon' });
        res.end(/* icon content here */);
    } else {
        res.writeHead(200, { 'Content-Type': 'text/plain' });


        try {
            const conn = require('mssql');


            //Settings
            var TableName = 'TableJSON';
            var SourceFile = 'C:\\yourpath\\source.json';


            var config = {
                server: 'ServerName',
                database: 'DatabaseName',
                authentication: {
                    type: 'default',
                    options: {
                        userName: 'UserID',
                        password: 'Password'
                    }
                },
                options: {
                    encrypt: false //Set to true if you're connecting to Azure
                }
            };


            //Load JSON Data
            var fs = require('fs');
            var contents = fs.readFileSync(SourceFile);
            var xlData = JSON.parse(contents);


            conn.connect(config, function (err) {
                if (err) {
                    console.log(err);
                }


                //Clean the Target Table
                var sql = 'Truncate Table ' + TableName;
                conn.query(sql, function (err, result) {
                    if (err) {
                        console.log(err);
                    }
                });


                //Read Target Table Fields
                var sql = 'SELECT COLUMN_NAME, DATA_TYPE, CHARACTER_MAXIMUM_LENGTH, NUMERIC_PRECISION, NUMERIC_SCALE FROM INFORMATION_SCHEMA.COLUMNS WHERE TABLE_NAME = N\'' + TableName + '\' ORDER BY ORDINAL_POSITION';
                conn.query(sql, function (err, result) {
                    if (err) {
                        console.log(err);
                    } else {
                        var cols = result.recordset;


                        var table = new conn.Table(TableName);
                        table.create = false;


                        if ((typeof (cols) !== 'undefined') && (cols !== null)) {
                            for (var i = 0; i < cols.length; i++) {
                                switch (cols[i]['DATA_TYPE']) {
                                    case 'datetime':
                                        table.columns.add(cols[i]['COLUMN_NAME'], conn.Date, { nullable: true });
                                        break;
                                    case 'varchar':
                                        table.columns.add(cols[i]['COLUMN_NAME'], conn.VarChar(cols[i]['CHARACTER_MAXIMUM_LENGTH']), { nullable: true });
                                        break;
                                    case 'money':
                                    case 'decimal':
                                        table.columns.add(cols[i]['COLUMN_NAME'], conn.Decimal(cols[i]['NUMERIC_PRECISION'], cols[i]['NUMERIC_SCALE']), { nullable: true });
                                        break;
                                    case 'int':
                                        table.columns.add(cols[i]['COLUMN_NAME'], conn.Int, { nullable: true });
                                        break;
                                    default:
                                        //Cannot be mapped
                                        break;
                                }
                            }
                        }


                        for (var row in xlData) {
                            table.rows.add(
                                ((typeof (xlData[row][cols[0]['COLUMN_NAME']]) === 'undefined' || (xlData[row][cols[0]['COLUMN_NAME']]) === null) ? null : xlData[row][cols[0]['COLUMN_NAME']] + 'Z'),
                                ((typeof (xlData[row][cols[1]['COLUMN_NAME']]) === 'undefined' || (xlData[row][cols[1]['COLUMN_NAME']]) === null) ? null : xlData[row][cols[1]['COLUMN_NAME']]),
                                ((typeof (xlData[row][cols[2]['COLUMN_NAME']]) === 'undefined' || (xlData[row][cols[2]['COLUMN_NAME']]) === null) ? null : xlData[row][cols[2]['COLUMN_NAME']]),
                                ((typeof (xlData[row][cols[3]['COLUMN_NAME']]) === 'undefined' || (xlData[row][cols[3]['COLUMN_NAME']]) === null) ? null : xlData[row][cols[3]['COLUMN_NAME']]),
                                ((typeof (xlData[row][cols[4]['COLUMN_NAME']]) === 'undefined' || (xlData[row][cols[4]['COLUMN_NAME']]) === null) ? null : xlData[row][cols[4]['COLUMN_NAME']]),
                                ((typeof (xlData[row][cols[5]['COLUMN_NAME']]) === 'undefined' || (xlData[row][cols[5]['COLUMN_NAME']]) === null) ? null : xlData[row][cols[5]['COLUMN_NAME']]),
                                ((typeof (xlData[row][cols[6]['COLUMN_NAME']]) === 'undefined' || (xlData[row][cols[6]['COLUMN_NAME']]) === null) ? null : xlData[row][cols[6]['COLUMN_NAME']]),
                                ((typeof (xlData[row][cols[7]['COLUMN_NAME']]) === 'undefined' || (xlData[row][cols[7]['COLUMN_NAME']]) === null) ? null : xlData[row][cols[7]['COLUMN_NAME']]),
                                ((typeof (xlData[row][cols[8]['COLUMN_NAME']]) === 'undefined' || (xlData[row][cols[8]['COLUMN_NAME']]) === null) ? null : xlData[row][cols[8]['COLUMN_NAME']]),
                                ((typeof (xlData[row][cols[9]['COLUMN_NAME']]) === 'undefined' || (xlData[row][cols[9]['COLUMN_NAME']]) === null) ? null : xlData[row][cols[9]['COLUMN_NAME']]),
                                ((typeof (xlData[row][cols[10]['COLUMN_NAME']]) === 'undefined' || (xlData[row][cols[10]['COLUMN_NAME']]) === null) ? null : xlData[row][cols[10]['COLUMN_NAME']]),
                                ((typeof (xlData[row][cols[11]['COLUMN_NAME']]) === 'undefined' || (xlData[row][cols[11]['COLUMN_NAME']]) === null) ? null : xlData[row][cols[11]['COLUMN_NAME']])
                            );
                        }


                        //Insert into Target Table
                        var req = new conn.Request();
                        conn.connect(config, function (err) {
                            if (err) {
                                console.log(err);
                            }


                            req.bulk(table, function (err, rowCount) {
                                //console.log(table);


                                if (err) {
                                    console.log(err);
                                } else {
                                    console.log(new Date().toLocaleString().replace(/T/, ' ').replace(/\..+/, '') + ' | Data Loaded Successfully | ' + rowCount.rowsAffected + ' rows affected');
                                }


                            });
                        });
                    }
                });
            });


            res.end('done!');


        } catch (err) {
            console.log(err);
        }
    }
}).listen(port);

You will probably get the error shown below when executing the above code.
Failed to connect to YourServerName:1433 - Could not connect (sequence)
To resolve this, we need to make sure the TCP/IP protocol is enabled under Protocols for MSSQLSERVER. We can find this setting by launching Computer Management and looking under Services and Applications > SQL Server Configuration Manager > SQL Server Network Configuration.


Once you've done that and re-executed the code, you should be able to see that the script executes successfully.


Then you can examine the data that was imported into the TableJSON table in SQL Server.

5 ) Using Python

The execution in Python is easy too.

Before the execution, you need to install the following modules: pandas and Flask-SQLAlchemy.

We can install these two libraries with the pip install command:

pip install pandas

pip install Flask-SQLAlchemy
The script is shown as follows:

from sqlalchemy import create_engine
import urllib
import pandas as pd
import datetime as dt

# Settings
TargetServer = 'ServerName'
SchemaName = 'dbo'
TargetDb = 'DatabaseName'
TableName = 'TableJSON'
UserName = 'UserID'
Password = 'Password'
SourceFile = "C:\\yourpath\\source.json"

# Configure the Connection
Params = urllib.parse.quote_plus(r'DRIVER={SQL Server};SERVER=' + TargetServer + ';DATABASE=' + TargetDb + ';UID=' + UserName + ';PWD=' + Password)
ConnStr = 'mssql+pyodbc:///?odbc_connect={}'.format(Params)
Engine = create_engine(ConnStr)

# Load the json into a DataFrame
df = pd.read_json(SourceFile)

# Clear the Data in Target Table
sql = 'Truncate Table ' + TableName
with Engine.begin() as conn:
    conn.execute(sql)

# Load the Data in DataFrame into Table
df.to_sql(TableName, con=Engine, schema=SchemaName, if_exists='append', index=False)

print(dt.datetime.now().strftime("%Y-%m-%d %H:%M:%S") + ' | Data Imported Successfully')
We are using Pandas' read_json method to load the JSON into a DataFrame.
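As a quick illustration of that mapping, using a two-record sample in the same shape as source.json, read_json turns each JSON key into a DataFrame column and each object into a row:

```python
import io

import pandas as pd

# Two records shaped like source.json.
sample = io.StringIO('[{"Code": "AB 101", "Unit": 500}, {"Code": "DX 012", "Unit": 300}]')

df = pd.read_json(sample)

print(list(df.columns))  # ['Code', 'Unit']
print(len(df))           # 2
```

Keys missing from a given record (such as Brokeage Rate on the Type "D" rows) simply become NaN in the DataFrame, which to_sql then writes as NULL.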

You should be able to see that the script executed successfully.


Then you can examine the data that was imported into the TableJSON table in SQL Server.

6 ) Using SQL Server Management Studio - sp_execute_external_script command

Due to the popularity of Machine Learning, from SQL Server 2016 onwards we can execute R scripts (and, from SQL Server 2017, Python scripts) inside the SQL Server environment; in particular, the sp_execute_external_script command is used for this.

Before we run through the scripts, we need to do some configuration:

a) Execute the SQL script below to enable the settings:

sp_configure 'external scripts enabled', 1;
RECONFIGURE WITH OVERRIDE;

b) Restart the SQL Server services.

Once these are done, we can execute the SQL script below:

--Clear the Target Table
Truncate Table TableJSON;

--Import Data into the Target Table
INSERT INTO TableJSON
EXEC sp_execute_external_script
@language = N'Python',
@script =
N'
# Import Libraries
import pandas as pd

# Load JSON
SourceFile = "C:\\yourpath\\source.json"

df = pd.read_json(SourceFile)
df = pd.DataFrame(data=df, columns=[''Dates'', ''Type'', ''Code'', ''Price'', ''Unit'', ''SubTotal'', ''Brokeage Rate'', ''Tax'', ''Stamp Duty'', ''Clearing Fee'', ''Amount'', ''Svc Cost''])

OutputDataSet = df
'
You should be able to see that the script executed successfully.

Then you can examine the data that was imported into the TableJSON table in SQL Server.

7 ) Using .NET (C#)

In the .NET environment, we can use Json.NET's JsonConvert to deserialize the JSON into a DataTable, and then a SqlBulkCopy object to efficiently import the data into the database.

String ConnStr_SQLServer = "Data Source=ServerName;Initial Catalog=DatabaseName;Persist Security Info=True;User ID=UserID;Password=Password";
String TableName = "TableJSON";
String SourceFile = @"C:\yourpath\source.json";

using (SqlConnection ConnSQLServer = new SqlConnection(ConnStr_SQLServer))
{
    //Read JSON data into a DataTable
    String json = File.ReadAllText(SourceFile);
    DataTable dt = (DataTable)JsonConvert.DeserializeObject(json, typeof(DataTable));

    //Open Database Connection
    ConnSQLServer.Open();

    //Clean up the Target Table
    SqlCommand CmdSQLServer = new SqlCommand("Truncate Table " + TableName, ConnSQLServer)
    {
        CommandType = CommandType.Text
    };
    CmdSQLServer.ExecuteNonQuery();

    // Set up the Bulk Copy object.
    // Note that the column positions in the source
    // data table match the column positions in
    // the destination table, so there is no need to
    // map columns.
    using (SqlBulkCopy bulkCopy = new SqlBulkCopy(ConnSQLServer))
    {
        bulkCopy.DestinationTableName = TableName;

        try
        {
            // Write from the source to the destination.
            bulkCopy.WriteToServer(dt);
        }
        catch (Exception ex)
        {
            Console.WriteLine(ex.Message);
        }
        finally
        {
            dt.Clear();
        }
    }
}

MessageBox.Show(DateTime.Now.ToString("yyyy-MM-dd h:mm tt") + " | Data Imported Successfully!", "Process Completed", MessageBoxButtons.OK);
You should be able to see that the code executed successfully.


Then you can examine the data that was imported into the TableJSON table in SQL Server.

8 ) Using PHP

We can use the json_decode function to decode JSON data easily.

<?php
ini_set('error_reporting', E_ALL);
ini_set('display_errors', true);


//DB Settings
$TargetServer = "ServerName";
$TargetDb = "DatabaseName";
$TableName = "TableJSON";
$UserName = "UserID";
$Password = "Password";
$SourceFile = "C:\\yourpath\\source.json";


//Establish Connection
$connectionInfo = array("Database" => "$TargetDb", "UID" => "$UserName", "PWD" => "$Password", "CharacterSet" => "UTF-8");
$conn = sqlsrv_connect($TargetServer, $connectionInfo);
$colcnt = 0;


//Clear the Data
$sql = "Truncate Table $TableName;";
$stmt = sqlsrv_prepare($conn, $sql);
if (sqlsrv_execute($stmt) === false) {
    die(print_r(sqlsrv_errors(), true));
}


//Load the JSON
$json = file_get_contents($SourceFile);
$json_obj = json_decode($json, true);


if (count($json_obj) > 0) {
    foreach ($json_obj as $id=>$row) {
        //Doing the fields and values mapping
        //Not using array_map method since JSON does not have 'header'
        $insertPairs = array();
        foreach ($row as $key=>$val) {
            $insertPairs[addslashes($key)] = addslashes($val);
        }
        $insertKeys = '['. implode('],[', array_keys($insertPairs)).']';
       
        $sql = "INSERT INTO $TableName ({$insertKeys}) VALUES (".str_repeat("?, ", count($row) - 1)."?);";
       
        $stmt = sqlsrv_prepare($conn, $sql, array_values($insertPairs));
        if (!$stmt) {
            die(print_r(sqlsrv_errors(), true));
        }


        if (sqlsrv_execute($stmt) === false) {
            die(print_r(sqlsrv_errors(), true));
        }
    }
    echo date('Y-m-d H:i:s')." | Data Loaded Successfully";
} else {
    echo date('Y-m-d H:i:s')." | Empty Data";
}


//Close Connection
sqlsrv_close($conn);
You should be able to see that the script executed successfully.

Then you can examine the data that was imported into the TableJSON table in SQL Server.

9 ) Using Java

Again, I had an opportunity to touch up my programming skills in Java by implementing the code below.

As far as I know, there is no native way in Java to parse JSON data, but quite a lot of additional libraries are available to fulfil this task.

In this tutorial, I would like to give Jsoniter a try, so we need to add the Jsoniter library to the project.

After that, we can prepare and run the code, as shown below:

package javaapplication1;


import com.jsoniter.JsonIterator;
import com.jsoniter.any.*;


import java.io.*;
import java.sql.*;
import java.text.*;
import java.util.*;


public class ImportJSON {


    public static void main(String[] args) {
        Connection conn = null;
        try {


            // Settings
            String TargetName = "ServerName";
            String SchemaName = "DatabaseName";
            String TableName = "TableJSON";
            String UserName = "UserID";
            String Password = "Password";
            String url = "jdbc:sqlserver://" + TargetName + ";databaseName=" + SchemaName + ";user=" + UserName + ";password=" + Password + ";";
            String SourceFile = "C:\\yourpath\\source.json";


            //Connect to Target Database
            conn = DriverManager.getConnection(url, UserName, Password);
            conn.setAutoCommit(false);


            //Clean the Target Table
            Statement stmt = conn.createStatement();
            String sql = "Truncate Table " + TableName;
            stmt.executeUpdate(sql);
            conn.commit();
           
            //Prepare the Insert SQL statement
            sql = "INSERT INTO " + TableName + " VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?);";
            PreparedStatement prepstmt = conn.prepareStatement(sql);
               
            //Read JSON and Insert its Content to Target Table
            String json = "";
            try {
                StringBuilder stringBuilder;
                try (BufferedReader reader = new BufferedReader(new FileReader(SourceFile))) {
                    stringBuilder = new StringBuilder();
                    String line = null;
                    String ls = System.getProperty("line.separator");
                    while ((line = reader.readLine()) != null) {
                        stringBuilder.append(line);
                        stringBuilder.append(ls);
                    }
                    //Delete the last new line separator
                    stringBuilder.deleteCharAt(stringBuilder.length() - 1);
                }
                json = stringBuilder.toString();
            } catch (IOException ex) {
                ex.printStackTrace(System.out);
            }
           
            Any records = JsonIterator.deserialize(json);
            for (Any record : records) {
               
                //Set parameters to NULL by default
                for (int i = 1; i <= 12; i++) {
                    prepstmt.setNull(i, Types.NULL);
                }
               
                Any.EntryIterator entryIterator = record.entries();
                while (entryIterator.next()) {
                    switch(entryIterator.key()) {
                        case "Dates":
                            if (entryIterator.value() != null) {
                                SimpleDateFormat inputFormat = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss");
                                java.util.Date t;
                                try {
                                    t = inputFormat.parse(entryIterator.value().toString());
                                    prepstmt.setObject(1, t);
                                } catch (ParseException ex) {
                                    ex.printStackTrace(System.out);
                                }                                
                            }
                            break;
                        case "Type":
                            if (entryIterator.value() != null) {
                                prepstmt.setObject(2, entryIterator.value().toString());
                            }
                            break;
                        case "Code":
                            if (entryIterator.value() != null) {
                                prepstmt.setObject(3, entryIterator.value().toString());
                            }
                            break;
                        case "Price":
                            if (entryIterator.value() != null) {
                                prepstmt.setObject(4, entryIterator.value().toBigDecimal());
                            }
                            break;
                        case "Unit":
                            if (entryIterator.value() != null) {
                                prepstmt.setObject(5, entryIterator.value().toBigDecimal());
                            }
                            break;
                        case "SubTotal":
                            if (entryIterator.value() != null) {
                                prepstmt.setObject(6, entryIterator.value().toBigDecimal());
                            }
                            break;
                        case "Brokeage Rate":
                            if (entryIterator.value() != null) {
                                prepstmt.setObject(7, entryIterator.value().toBigDecimal());
                            }
                            break;
                        case "Tax":
                            if (entryIterator.value() != null) {
                                prepstmt.setObject(8, entryIterator.value().toBigDecimal());
                            }
                            break;
                        case "Stamp Duty":
                            if (entryIterator.value() != null) {
                                prepstmt.setObject(9, entryIterator.value().toBigDecimal());
                            }
                            break;
                        case "Clearing Fee":
                            if (entryIterator.value() != null) {
                                prepstmt.setObject(10, entryIterator.value().toBigDecimal());
                            }
                            break;
                        case "Amount":
                            if (entryIterator.value() != null) {
                                prepstmt.setObject(11, entryIterator.value().toBigDecimal());
                            }
                            break;
                        case "Svc Cost":
                            if (entryIterator.value() != null) {
                                prepstmt.setObject(12, entryIterator.value().toBigDecimal());
                            }
                            break;
                    }
                }
                prepstmt.addBatch();
            }
           
            //Insert into Target Table
            prepstmt.executeBatch();
            conn.commit();
           
            String pattern = "dd/MM/yyyy h:mm:ss a z";
            SimpleDateFormat simpleDateFormat = new SimpleDateFormat(pattern, new Locale("en", "US"));
            String date = simpleDateFormat.format(new java.util.Date());


            System.out.println(date + " | Data Loaded Successfully");
        } catch (SQLException ex) {
            ex.printStackTrace(System.out);
        } finally {
            try {
                if (conn != null && !conn.isClosed()) {
                    conn.close();
                }
            } catch (SQLException ex) {
                ex.printStackTrace(System.out);
            }
        }
    }
}
You should see that the program executes successfully.

Then you can examine the data that was imported into the TableJSON table in SQL Server.

10 ) Using Golang


There are different ways to load JSON in Golang, but in this article we are going to use the standard library's encoding/json package.

The code to execute is shown below:

package main


import (
    "context"
    "database/sql"
    "encoding/json"
    "fmt"
    "io/ioutil"
    "log"
    "reflect"
    "strings"
    "time"


    _ "github.com/denisenkom/go-mssqldb"
)


// NullInt64 is an alias for sql.NullInt64 data type
type NullInt64 sql.NullInt64


// Scan implements the Scanner interface for NullInt64
func (ni *NullInt64) Scan(value interface{}) error {
    var i sql.NullInt64
    if err := i.Scan(value); err != nil {
        return err
    }


    // if nil then make Valid false
    if reflect.TypeOf(value) == nil {
        *ni = NullInt64{i.Int64, false}
    } else {
        *ni = NullInt64{i.Int64, true}
    }
    return nil
}


// NullBool is an alias for sql.NullBool data type
type NullBool sql.NullBool


// Scan implements the Scanner interface for NullBool
func (nb *NullBool) Scan(value interface{}) error {
    var b sql.NullBool
    if err := b.Scan(value); err != nil {
        return err
    }


    // if nil then make Valid false
    if reflect.TypeOf(value) == nil {
        *nb = NullBool{b.Bool, false}
    } else {
        *nb = NullBool{b.Bool, true}
    }


    return nil
}


// NullFloat64 is an alias for sql.NullFloat64 data type
type NullFloat64 sql.NullFloat64


// Scan implements the Scanner interface for NullFloat64
func (nf *NullFloat64) Scan(value interface{}) error {
    var f sql.NullFloat64
    if err := f.Scan(value); err != nil {
        return err
    }


    // if nil then make Valid false
    if reflect.TypeOf(value) == nil {
        *nf = NullFloat64{f.Float64, false}
    } else {
        *nf = NullFloat64{f.Float64, true}
    }


    return nil
}


// NullString is an alias for sql.NullString data type
type NullString sql.NullString


// Scan implements the Scanner interface for NullString
func (ns *NullString) Scan(value interface{}) error {
    var s sql.NullString
    if err := s.Scan(value); err != nil {
        return err
    }


    // if nil then make Valid false
    if reflect.TypeOf(value) == nil {
        *ns = NullString{s.String, false}
    } else {
        *ns = NullString{s.String, true}
    }


    return nil
}


// NullTime is an alias for sql.NullTime data type
type NullTime sql.NullTime


// Scan implements the Scanner interface for NullTime
func (nt *NullTime) Scan(value interface{}) error {
    var t sql.NullTime
    if err := t.Scan(value); err != nil {
        return err
    }


    // if nil then make Valid false
    if reflect.TypeOf(value) == nil {
        *nt = NullTime{t.Time, false}
    } else {
        *nt = NullTime{t.Time, true}
    }


    return nil
}


// JSONDate : custom date type
type JSONDate time.Time


//UnmarshalJSON : implement the json.Unmarshaler interface
func (j *JSONDate) UnmarshalJSON(b []byte) error {
    s := strings.Trim(string(b), "\"")
    // use Go's reference-time layout; a literal "00:00:00" would drop any time-of-day
    t, err := time.Parse("2006-01-02T15:04:05", s)
    if err != nil {
        return err
    }
    *j = JSONDate(t)
    return nil
}


//MarshalJSON : implement the json.Marshaler interface
func (j JSONDate) MarshalJSON() ([]byte, error) {
    // convert to time.Time first to avoid recursing back into this method
    return json.Marshal(time.Time(j))
}


//Format : helper to format the date for printing
func (j JSONDate) Format(s string) string {
    t := time.Time(j)
    return t.Format(s)
}


//FormatNow : Format Date Time
func FormatNow() string {
    const layout = "2 Jan 2006 3:04pm"
    now := time.Now().Format(layout)
    return now
}


// Settings
const (
    server     = "ServerName"
    port       = 1433
    database   = "DatabaseName"
    user       = "UserID"
    password   = "Password"
    sourcepath = "C:\\yourpath\\source.json"
    tablename  = "TableJSON"
)


var (
    db  *sql.DB
    err error
)


// TargetTable : Target Table's Structure
type TargetTable struct {
    colName          string
    dataType         string
    chaMaxLen        NullInt64
    numericPrecision NullInt64
    numericScale     NullInt64
}


// DataList : Data List's Structure
type DataList struct {
    DataList []Data `json:"datalist"`
}


// Data : Data's Structure
type Data struct {
    Dates        JSONDate    `json:"Dates"`
    Type         interface{} `json:"Type"`
    Code         interface{} `json:"Code"`
    Price        interface{} `json:"Price"`
    Unit         interface{} `json:"Unit"`
    SubTotal     interface{} `json:"SubTotal"`
    BrokeageRate interface{} `json:"Brokeage Rate"`
    Tax          interface{} `json:"Tax"`
    StampDuty    interface{} `json:"Stamp Duty"`
    ClearingFee  interface{} `json:"Clearing Fee"`
    Amount       interface{} `json:"Amount"`
    SvcCost      interface{} `json:"Svc Cost"`
}


func main() {


    // Create connection string
    connString := fmt.Sprintf("server=%s;user id=%s;password=%s;port=%d;database=%s;", server, user, password, port, database)


    // Create connection pool
    db, err = sql.Open("sqlserver", connString)
    if err != nil {
        log.Fatal(err)
    }


    //Clean the Target Table
    sqlstr := ""
    sqlstr = fmt.Sprintf("Truncate Table %s;", tablename)
    if _, err = db.Exec(sqlstr); err != nil {
        log.Fatal(err)
    }


    //Read Target Table Fields
    sqlstr = fmt.Sprintf("SELECT COLUMN_NAME, DATA_TYPE, CHARACTER_MAXIMUM_LENGTH, NUMERIC_PRECISION, NUMERIC_SCALE FROM INFORMATION_SCHEMA.COLUMNS WHERE TABLE_NAME = N'%s' ORDER BY ORDINAL_POSITION;", tablename)
    trows, err := db.Query(sqlstr)
    if err != nil {
        log.Fatal(err)
    }
    defer trows.Close()
    var records []TargetTable


    for trows.Next() {
        var record TargetTable


        err := trows.Scan(&record.colName, &record.dataType, &record.chaMaxLen, &record.numericPrecision, &record.numericScale)
        if err != nil {
            log.Fatal(err)
        }
        records = append(records, record)
    }


    //Read JSON File Content
    f, err := ioutil.ReadFile(sourcepath)
    if err != nil {
        log.Fatal(err)
    }


    var datalist []Data
    err = json.Unmarshal(f, &datalist)
    if err != nil {
        log.Fatal(err)
    }


    var sqlcmdstr = "insert into %s ([%s], [%s], [%s], [%s], [%s], [%s], [%s], [%s], [%s], [%s], [%s], [%s]) values (@pDates, @pType, @pCode, @pPrice, @pUnit, @pSubTotal, @pBrokeageRate, @pTax, @pStampDuty, @pClearingFee, @pAmount, @pSvcCost);"
    sqlstr = fmt.Sprintf(sqlcmdstr, tablename, records[0].colName, records[1].colName, records[2].colName, records[3].colName, records[4].colName, records[5].colName, records[6].colName, records[7].colName, records[8].colName, records[9].colName, records[10].colName, records[11].colName)
    stmt, err := db.Prepare(sqlstr)
    if err != nil {
        log.Fatal(err)
    }
    // Close the prepared statement and the connection pool when main returns
    defer stmt.Close()
    defer db.Close()


    ctx := context.Background()


    for i := 0; i < len(datalist); i++ {


        var pDates = sql.Named("pDates", datalist[i].Dates.Format("2 Jan 2006"))
        var pType = sql.Named("pType", datalist[i].Type)
        var pCode = sql.Named("pCode", datalist[i].Code)
        var pPrice = sql.Named("pPrice", datalist[i].Price)
        var pUnit = sql.Named("pUnit", datalist[i].Unit)
        var pSubTotal = sql.Named("pSubTotal", datalist[i].SubTotal)
        var pBrokeageRate = sql.Named("pBrokeageRate", datalist[i].BrokeageRate)
        var pTax = sql.Named("pTax", datalist[i].Tax)
        var pStampDuty = sql.Named("pStampDuty", datalist[i].StampDuty)
        var pClearingFee = sql.Named("pClearingFee", datalist[i].ClearingFee)
        var pAmount = sql.Named("pAmount", datalist[i].Amount)
        var pSvcCost = sql.Named("pSvcCost", datalist[i].SvcCost)


        if _, err := stmt.ExecContext(ctx, pDates, pType, pCode, pPrice, pUnit, pSubTotal, pBrokeageRate, pTax, pStampDuty, pClearingFee, pAmount, pSvcCost); err != nil {
            log.Fatal(err)
        }
    }


    fmt.Println(FormatNow() + " | Data Loaded Successfully!")
}
You should see that the program executes successfully.

Then you can examine the data that was imported into the TableJSON table in SQL Server.
Author: Ryan Chong