Eddie Shipman
asked on
"Pivoting" Rows to Columns in C#
We have a function that takes a dataset and "reorganizes" it from rows into columns:
In the demo data below, each record has a KEYWORD that is essentially a column name, and VALUE holds the data for that column, except that the FNAME column gets the value of the FNAME field when the KEYWORD is FNAME. The code also adds a column called DSENT that contains the LASTMOD value from the first "record" in the group.
private DataSet ReorgFileDataset(ref DataSet InDataSet)
{
    DataSet _ResultDataSet = new DataSet();
    DataTable _ResultDataTable = new DataTable();
    _ResultDataSet.Tables.Add(_ResultDataTable);
    DataRow _ResultDataRow = null;   // DataRow has no public constructor; rows come from NewRow()
    string ColumnName = null;
    string Value = null;
    int FileID = 0;
    //*********************************************************
    // Builds the columns for each of the Keywords
    //*********************************************************
    foreach (DataRow _DataRow in InDataSet.Tables[0].Rows)
    {
        ColumnName = _DataRow["KEYWORD"].ToString();
        if (!_ResultDataTable.Columns.Contains(ColumnName))
        {
            _ResultDataTable.Columns.Add(ColumnName);
        }
    }
    _ResultDataTable.Columns.Add("DSENT");
    //*********************************************************
    // Now reads the data into the columns
    //*********************************************************
    foreach (DataRow _DataRow in InDataSet.Tables[0].Rows)
    {
        // Has to keep track of each "record"; its rows all share the same RID.
        // Creates a new row in _ResultDataTable when it changes.
        if (FileID != Convert.ToInt32(_DataRow["RID"]))
        {
            if (_ResultDataRow != null)
            {
                // Adds the previously created row
                _ResultDataTable.Rows.Add(_ResultDataRow);
            }
            FileID = Convert.ToInt32(_DataRow["RID"]);
            // Creates the new row and sets the DSENT value from the LASTMOD value
            _ResultDataRow = _ResultDataTable.NewRow();
            _ResultDataRow["DSENT"] = _DataRow["LASTMOD"];
        }
        // Now read the data into the columns
        if (_DataRow["KEYWORD"] != DBNull.Value && _DataRow["VALUE"] != DBNull.Value)
        {
            ColumnName = _DataRow["KEYWORD"].ToString();
            Value = _DataRow["VALUE"].ToString();
            _ResultDataRow[ColumnName] = Value;
        }
    }
    //*********************************************************
    // Adds the last row
    if (_ResultDataRow != null)
    {
        _ResultDataTable.Rows.Add(_ResultDataRow);
    }
    return _ResultDataSet;
}
But I need it to create JSON like this instead of a DataSet:
[
{"CER":"5010243", "ST":"SC", "RID":"1088741", "DNAME":"CERT", "EC":"20051109SCIP", "FNAME":"\1088741\17\C\SC_20051109SCIP.PDF", "PDATE":"20051130", "DG":"", "PCODE":"", "UD":"", "DSENT":"20150911"},
{"CER":"4925213", "ST":"SC", "RID":"1088741", "DNAME":"CERT", "EC":"20050421SCPA", "FNAME":"\1088741\17\C\SC_20050421SCPA.PDF", "PDATE":"20050509", "DG":"", "PCODE":"", "UD":"", "DSENT":"20150911"},
{"CER":"", "ST":"", "RID":"1088741", "DNAME":"PASS LET", "EC":"20061102SCIP", "FNAME":"\1088741\17\L\CR_PASS_LET20061102SCIP.PDF", "PDATE":"20061116", "DG":"CR", "PCODE":"IP", "UD":"", "DSENT":"20150911"},
{"CER":"", "ST":"", "RID":"1088741", "DNAME":"ID Card", "EC":"", "FNAME":"\1088741\17\LET\ID_CARD_1088741.PDF", "PDATE":"20150913", "DG":"", "PCODE":"", "UD":"", "DSENT":"20150911"},
{"CER":"", "ST":"", "RID":"1088741", "DNAME":"UPD LET", "EC":"", "FNAME":"\1088741\17\L\UPD_LET_20150828SCLH.PDF", "PDATE":"20150911", "DG":"CR", "PCODE":"LH", "UD":"5704220", "DSENT":"20150911"}
]
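Since the reorganized data already lives in a DataTable, one low-effort path to JSON in that shape is Json.NET, whose built-in DataTableConverter serializes a DataTable as an array of objects, one per row, keyed by column name. A minimal sketch, assuming the Newtonsoft.Json NuGet package is referenced:

```csharp
using System.Data;
using Newtonsoft.Json;

public static class JsonPivot
{
    // Serializes the first table of the reorganized DataSet as a JSON
    // array, one object per row, keyed by column name.
    public static string ToJson(DataSet reorganized)
    {
        // Json.NET ships a DataTableConverter, so the table can be
        // passed directly without any intermediate mapping.
        return JsonConvert.SerializeObject(reorganized.Tables[0], Formatting.Indented);
    }
}
```

Note that the backslashes in FNAME will come out escaped (`\\1088741\\...`), which is what valid JSON requires; the target sample above, with single backslashes and no commas between objects as originally posted, is not strictly valid JSON.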
Here's the demo data:
"RID","KEYWORD","VALUE","FNAME","LASTMOD"
162895,"CER","5010243","\1088741\17\C\SC_20051109SCIP.PDF","2/1/2006 9:30:16 PM"
162895,"ST","SC","\1088741\17\C\SC_20051109SCIP.PDF","2/1/2006 9:30:16 PM"
162895,"RID","1088741","\1088741\17\C\SC_20051109SCIP.PDF","2/1/2006 9:30:16 PM"
162895,"DNAME","CERT","\1088741\17\C\SC_20051109SCIP.PDF","2/1/2006 9:30:16 PM"
162895,"EC","20051109SCIP","\1088741\17\C\SC_20051109SCIP.PDF","2/1/2006 9:30:16 PM"
162895,"FNAME","\1088741\17\C\SC_20051109SCIP","\1088741\17\C\SC_20051109SCIP.PDF","2/1/2006 9:30:16 PM"
162895,"PDATE","20051130","\1088741\17\C\SC_20051109SCIP.PDF","2/1/2006 9:30:16 PM"
251893,"CER","4925213","\1088741\17\C\SC_20050421SCPA.PDF","4/25/2006 4:36:16 PM"
251893,"ST","SC","\1088741\17\C\SC_20050421SCPA.PDF","4/25/2006 4:36:16 PM"
251893,"RID","1088741","\1088741\17\C\SC_20050421SCPA.PDF","4/25/2006 4:36:16 PM"
251893,"DNAME","CERT","\1088741\17\C\SC_20050421SCPA.PDF","4/25/2006 4:36:16 PM"
251893,"EC","20050421SCPA","\1088741\17\C\SC_20050421SCPA.PDF","4/25/2006 4:36:16 PM"
251893,"FNAME","\1088741\17\C\SC_20050421SCPA","\1088741\17\C\SC_20050421SCPA.PDF","4/25/2006 4:36:16 PM"
251893,"PDATE","20050509","\1088741\17\C\SC_20050421SCPA.PDF","4/25/2006 4:36:16 PM"
414199,"RID","1088741","\1088741\17\L\CR_PASS_LET20061102SCIP.PDF","1/31/2007 1:20:46 PM"
414199,"DG","CR","\1088741\17\L\CR_PASS_LET20061102SCIP.PDF","1/31/2007 1:20:46 PM"
414199,"DNAME","PASS_LET","\1088741\17\L\CR_PASS_LET20061102SCIP.PDF","1/31/2007 1:20:46 PM"
414199,"EC","20061102SCIP","\1088741\17\L\CR_PASS_LET20061102SCIP.PDF","1/31/2007 1:20:46 PM"
414199,"FNAME","\1088741\17\L\CR_PASS_LET20061102SCIP","\1088741\17\L\CR_PASS_LET20061102SCIP.PDF","1/31/2007 1:20:46 PM"
414199,"PDATE","20070131","\1088741\17\L\CR_PASS_LET20061102SCIP.PDF","1/31/2007 1:20:46 PM"
414199,"PCODE","IP","\1088741\17\L\CR_PASS_LET20061102SCIP.PDF","1/31/2007 1:20:46 PM"
2317695,"RID","1088741","\1088741\17\LET\ID_CARD_1088741.PDF","9/13/2015 8:11:28 AM"
2317695,"DNAME","ID Card","\1088741\17\LET\ID_CARD_1088741.PDF","9/13/2015 8:11:28 AM"
2317695,"FNAME","\1088741\17\Let\ID_CARD_1088741.pdf","\1088741\17\LET\ID_CARD_1088741.PDF","9/13/2015 8:11:28 AM"
2317695,"PDATE","20150913","\1088741\17\LET\NA_ID_CARD_1088741.PDF","9/13/2015 8:11:28 AM"
6714777,"RID","1088741","\1088741\17\L\UPD_LET_20150828SCLH.PDF","10/1/2015 11:02:59 AM"
6714777,"DG","CR","\1088741\17\L\UPD_LET_20150828SCLH.PDF","10/1/2015 11:02:59 AM"
6714777,"DNAME","CR_UPD_LET_CIC","\1088741\17\L\UPD_LET_20150828SCLH.PDF","10/1/2015 11:02:59 AM"
6714777,"EC","20150828SCLH","\1088741\17\L\UPD_LET_20150828SCLH.PDF","10/1/2015 11:02:59 AM"
6714777,"FNAME","\1088741\17\L\Upd_LET_20150828SCLH.pdf","\1088741\17\L\UPD_LET_20150828SCLH.PDF","10/1/2015 11:02:59 AM"
6714777,"PDATE","20151001","\1088741\17\L\UPD_LET_20150828SCLH.PDF","10/1/2015 11:02:59 AM"
6714777,"PCODE","LH","\1088741\17\L\UPD_LET_20150828SCLH.PDF","10/1/2015 11:02:59 AM"
6714777,"UD","5704220","\1088741\17\L\UPD_LET_20150828SCLH.PDF","10/1/2015 11:02:59 AM"
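For anyone wanting to exercise the function against this sample, here is a deliberately naive loader that reads the demo data into a DataTable. It is only a sketch: it assumes the data is saved to a CSV file with the header row shown above, and that no quoted field contains an embedded comma (true for this sample, not for CSV in general). The name `DemoLoader` is made up for illustration.

```csharp
using System.Data;
using System.IO;
using System.Linq;

public static class DemoLoader
{
    // Loads the demo CSV into a DataTable, using the first line as
    // column names. All values are kept as strings.
    public static DataTable Load(string path)
    {
        string[] lines = File.ReadAllLines(path);
        var table = new DataTable();
        foreach (string name in Split(lines[0]))
            table.Columns.Add(name);
        foreach (string line in lines.Skip(1))
            table.Rows.Add(Split(line));
        return table;
    }

    // Naive field splitter: breaks on commas, then strips surrounding quotes.
    private static string[] Split(string line) =>
        line.Split(',').Select(f => f.Trim('"')).ToArray();
}
```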
ASKER
No, the JSON has the same data as in the sample data.
hmm, I don't see it. The RID column does not contain 1088741 as in your JSON.
ASKER
You aren't looking correctly:
The VALUE in the row containing the KEYWORD "RID" should be put into a COLUMN named "RID"
162895,"RID","1088741","\1088741\17\C\SC_20051109SCIP.PDF","2/1/2006 9:30:16 PM"
Each ROW is supposed to be a column in the JSON.
ASKER
I see what you are talking about. The KEYWORD RID should be CRID so the data for the first record would look like this:
"RID","KEYWORD","VALUE","FNAME","LASTMOD"
162895,"CER","5010243","\1088741\17\C\SC_20051109SCIP.PDF","2/1/2006 9:30:16 PM"
162895,"ST","SC","\1088741\17\C\SC_20051109SCIP.PDF","2/1/2006 9:30:16 PM"
162895,"CRID","1088741","\1088741\17\C\SC_20051109SCIP.PDF","2/1/2006 9:30:16 PM"
162895,"DNAME","CERT","\1088741\17\C\SC_20051109SCIP.PDF","2/1/2006 9:30:16 PM"
162895,"EC","20051109SCIP","\1088741\17\C\SC_20051109SCIP.PDF","2/1/2006 9:30:16 PM"
162895,"FNAME","\1088741\17\C\SC_20051109SCIP","\1088741\17\C\SC_20051109SCIP.PDF","2/1/2006 9:30:16 PM"
162895,"PDATE","20051130","\1088741\17\C\SC_20051109SCIP.PDF","2/1/2006 9:30:16 PM"
and the resulting JSON should look like this:
{"CER":"5010243", "ST":"SC", "CRID":"1088741", "DNAME":"CERT", "EC":"20051109SCIP", "FNAME":"\1088741\17\C\SC_20051109SCIP.PDF", "PDATE":"20051130", "DG":"", "PCODE":"", "UD":"", "DSENT":"20150911"}
ASKER
The RID column in the data is ignored.
ASKER
In the JSON, that is ;-)
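Given the clarifications above (the KEYWORD renamed to CRID, the grouping RID column not emitted, FNAME taken from the FNAME column rather than VALUE), the pivot can also be sketched without a DataSet at all, grouping rows by RID and building one dictionary per record. This is only an illustration of the approach discussed in this thread, not the accepted solution. It assumes all columns arrive as strings (as with a CSV load), that the System.Data.DataSetExtensions assembly is referenced for `AsEnumerable()`, and `KnownColumns` is a hypothetical list covering the keys in the sample JSON:

```csharp
using System.Collections.Generic;
using System.Data;
using System.Linq;

public static class DictionaryPivot
{
    // Hypothetical fixed key set for the output objects; keywords missing
    // from a record come out as empty strings, as in the sample JSON.
    private static readonly string[] KnownColumns =
        { "CER", "ST", "CRID", "DNAME", "EC", "FNAME", "PDATE", "DG", "PCODE", "UD" };

    public static List<Dictionary<string, string>> Pivot(DataTable input)
    {
        return input.AsEnumerable()
            .GroupBy(r => r.Field<string>("RID"))   // rows of one record share a RID
            .Select(g =>
            {
                var rec = KnownColumns.ToDictionary(
                    c => c,
                    c => g.Where(r => r.Field<string>("KEYWORD") == c)
                          .Select(r => r.Field<string>("VALUE"))
                          .FirstOrDefault() ?? "");
                // FNAME comes from the FNAME column, not the VALUE column
                rec["FNAME"] = g.First().Field<string>("FNAME");
                // DSENT carries the group's LASTMOD value
                rec["DSENT"] = g.First().Field<string>("LASTMOD");
                return rec;
            })
            .ToList();
    }
}
```

The resulting list of dictionaries can then be handed to any JSON serializer (for example Json.NET's `JsonConvert.SerializeObject`) to produce the array of objects shown earlier.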
ASKER CERTIFIED SOLUTION
ASKER
It seems to work perfectly for what we want, thanks.
The problem is that your JSON uses different data than your table sample.