zakwithu2012
asked on
BULK INSERT skips (ignores) the 1st row (record) in the flat file
Hi expert Team,
I'm using this SQL query to load data from myDatafile.txt to my Database using bulk insert with file format like below:
BULK INSERT CDR FROM 'C:\sqldata\myDatafile.txt' WITH (FORMATFILE='C:\sqldata\CDRimport.txt', FIRSTROW=3)
I have a problem that is causing me a headache, and I'd appreciate your help. Everything works and the data loads successfully; however, I noticed that the 1st data row is always skipped (ignored). I'm setting FIRSTROW=3 to start from the 3rd row because the 1st contains headers and the 2nd contains data types, and I don't want those two rows to be imported.
To test my case you need the following:
First: create this table:
CREATE TABLE [dbo].[CDR](
[cdrRecordType] [varchar](50) NULL,
[globalCallID_callManagerId] [varchar](50) NULL,
[globalCallID_callId] [varchar](50) NULL,
[origLegCallIdentifier] [varchar](50) NULL,
[dateTimeOrigination] [varchar](50) NULL,
[origNodeId] [varchar](50) NULL,
[origSpan] [varchar](50) NULL,
[origIpAddr] [varchar](50) NULL,
[callingPartyNumber] [varchar](50) NULL,
[callingPartyUnicodeLoginUserID] [varchar](50) NULL,
[origCause_location] [varchar](50) NULL,
[origCause_value] [varchar](50) NULL,
[origPrecedenceLevel] [varchar](50) NULL,
[origMediaTransportAddress_IP] [varchar](50) NULL,
[origMediaTransportAddress_Port] [varchar](50) NULL,
[origMediaCap_payloadCapability] [varchar](50) NULL,
[origMediaCap_maxFramesPerPacket] [varchar](50) NULL,
[origMediaCap_g723BitRate] [varchar](50) NULL,
[origVideoCap_Codec] [varchar](50) NULL,
[origVideoCap_Bandwidth] [varchar](50) NULL,
[origVideoCap_Resolution] [varchar](50) NULL,
[origVideoTransportAddress_IP] [varchar](50) NULL,
[origVideoTransportAddress_Port] [varchar](50) NULL,
[origRSVPAudioStat] [varchar](50) NULL,
[origRSVPVideoStat] [varchar](50) NULL,
[destLegIdentifier] [varchar](50) NULL,
[destNodeId] [varchar](50) NULL,
[destSpan] [varchar](50) NULL,
[destIpAddr] [varchar](50) NULL,
[originalCalledPartyNumber] [varchar](50) NULL,
[finalCalledPartyNumber] [varchar](50) NULL,
[finalCalledPartyUnicodeLoginUserID] [varchar](50) NULL,
[destCause_location] [varchar](50) NULL,
[destCause_value] [varchar](50) NULL,
[destPrecedenceLevel] [varchar](50) NULL,
[destMediaTransportAddress_IP] [varchar](50) NULL,
[destMediaTransportAddress_Port] [varchar](50) NULL,
[destMediaCap_payloadCapability] [varchar](50) NULL,
[destMediaCap_maxFramesPerPacket] [varchar](50) NULL,
[destMediaCap_g723BitRate] [varchar](50) NULL,
[destVideoCap_Codec] [varchar](50) NULL,
[destVideoCap_Bandwidth] [varchar](50) NULL,
[destVideoCap_Resolution] [varchar](50) NULL,
[destVideoTransportAddress_IP] [varchar](50) NULL,
[destVideoTransportAddress_Port] [varchar](50) NULL,
[destRSVPAudioStat] [varchar](50) NULL,
[destRSVPVideoStat] [varchar](50) NULL,
[dateTimeConnect] [varchar](50) NULL,
[dateTimeDisconnect] [varchar](50) NULL,
[lastRedirectDn] [varchar](50) NULL,
[pkid] [varchar](50) NULL,
[originalCalledPartyNumberPartition] [varchar](50) NULL,
[callingPartyNumberPartition] [varchar](50) NULL,
[finalCalledPartyNumberPartition] [varchar](50) NULL,
[lastRedirectDnPartition] [varchar](50) NULL,
[duration] [varchar](50) NULL,
[origDeviceName] [varchar](50) NULL,
[destDeviceName] [varchar](50) NULL,
[origCallTerminationOnBehalfOf] [varchar](50) NULL,
[destCallTerminationOnBehalfOf] [varchar](50) NULL,
[origCalledPartyRedirectOnBehalfOf] [varchar](50) NULL,
[lastRedirectRedirectOnBehalfOf] [varchar](50) NULL,
[origCalledPartyRedirectReason] [varchar](50) NULL,
[lastRedirectRedirectReason] [varchar](50) NULL,
[destConversationId] [varchar](50) NULL,
[globalCallId_ClusterID] [varchar](50) NULL,
[joinOnBehalfOf] [varchar](50) NULL,
[comment] [varchar](50) NULL,
[authCodeDescription] [varchar](50) NULL,
[authorizationLevel] [varchar](50) NULL,
[clientMatterCode] [varchar](50) NULL,
[origDTMFMethod] [varchar](50) NULL,
[destDTMFMethod] [varchar](50) NULL,
[callSecuredStatus] [varchar](50) NULL,
[origConversationId] [varchar](50) NULL,
[origMediaCap_Bandwidth] [varchar](50) NULL,
[destMediaCap_Bandwidth] [varchar](50) NULL,
[authorizationCodeValue] [varchar](50) NULL,
[outpulsedCallingPartyNumber] [varchar](50) NULL,
[outpulsedCalledPartyNumber] [varchar](50) NULL,
[origIpv4v6Addr] [varchar](50) NULL,
[destIpv4v6Addr] [varchar](50) NULL,
[origVideoCap_Codec_Channel2] [varchar](50) NULL,
[origVideoCap_Bandwidth_Channel2] [varchar](50) NULL,
[origVideoCap_Resolution_Channel2] [varchar](50) NULL,
[origVideoTransportAddress_IP_Channel2] [varchar](50) NULL,
[origVideoTransportAddress_Port_Channel2] [varchar](50) NULL,
[origVideoChannel_Role_Channel2] [varchar](50) NULL,
[destVideoCap_Codec_Channel2] [varchar](50) NULL,
[destVideoCap_Bandwidth_Channel2] [varchar](50) NULL,
[destVideoCap_Resolution_Channel2] [varchar](50) NULL,
[destVideoTransportAddress_IP_Channel2] [varchar](50) NULL,
[destVideoTransportAddress_Port_Channel2] [varchar](50) NULL,
[destVideoChannel_Role_Channel2] [varchar](50) NULL
) ON [PRIMARY]
GO
Second:
create a new text file 'C:\sqldata\myDatafile.txt', then copy the data below and paste it into that file:
"cdrRecordType","globalCallID_callManagerId","globalCallID_callId","origLegCallIdentifier","dateTimeOrigination","origNodeId","origSpan","origIpAddr","callingPartyNumber","callingPartyUnicodeLoginUserID","origCause_location","origCause_value","origPrecedenceLevel","origMediaTransportAddress_IP","origMediaTransportAddress_Port","origMediaCap_payloadCapability","origMediaCap_maxFramesPerPacket","origMediaCap_g723BitRate","origVideoCap_Codec","origVideoCap_Bandwidth","origVideoCap_Resolution","origVideoTransportAddress_IP","origVideoTransportAddress_Port","origRSVPAudioStat","origRSVPVideoStat","destLegIdentifier","destNodeId","destSpan","destIpAddr","originalCalledPartyNumber","finalCalledPartyNumber","finalCalledPartyUnicodeLoginUserID","destCause_location","destCause_value","destPrecedenceLevel","destMediaTransportAddress_IP","destMediaTransportAddress_Port","destMediaCap_payloadCapability","destMediaCap_maxFramesPerPacket","destMediaCap_g723BitRate","destVideoCap_Codec","destVideoCap_Bandwidth","destVideoCap_Resolution","destVideoTransportAddress_IP","destVideoTransportAddress_Port","destRSVPAudioStat","destRSVPVideoStat","dateTimeConnect","dateTimeDisconnect","lastRedirectDn","pkid","originalCalledPartyNumberPartition","callingPartyNumberPartition","finalCalledPartyNumberPartition","lastRedirectDnPartition","duration","origDeviceName","destDeviceName","origCallTerminationOnBehalfOf","destCallTerminationOnBehalfOf","origCalledPartyRedirectOnBehalfOf","lastRedirectRedirectOnBehalfOf","origCalledPartyRedirectReason","lastRedirectRedirectReason","destConversationId","globalCallId_ClusterID","joinOnBehalfOf","comment","authCodeDescription","authorizationLevel","clientMatterCode","origDTMFMethod","destDTMFMethod","callSecuredStatus","origConversationId","origMediaCap_Bandwidth","destMediaCap_Bandwidth","authorizationCodeValue","outpulsedCallingPartyNumber","outpulsedCalledPartyNumber","origIpv4v6Addr","destIpv4v6Addr","origVideoCap_Codec_Channel2","origVideoCap_Bandwidth_Channel2","origVideoCap_Resolution_Channel2","origVideoTransportAddress_IP_Channel2","origVideoTransportAddress_Port_Channel2","origVideoChannel_Role_Channel2","destVideoCap_Codec_Channel2","destVideoCap_Bandwidth_Channel2","destVideoCap_Resolution_Channel2","destVideoTransportAddress_IP_Channel2","destVideoTransportAddress_Port_Channel2","destVideoChannel_Role_Channel2"
INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,VARCHAR(50),VARCHAR(128),INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,VARCHAR(64),VARCHAR(64),INTEGER,INTEGER,INTEGER,INTEGER,VARCHAR(50),VARCHAR(50),VARCHAR(128),INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,VARCHAR(64),VARCHAR(64),INTEGER,INTEGER,VARCHAR(50),UNIQUEIDENTIFIER,VARCHAR(50),VARCHAR(50),VARCHAR(50),VARCHAR(50),INTEGER,VARCHAR(129),VARCHAR(129),INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,VARCHAR(50),INTEGER,VARCHAR(2048),VARCHAR(50),INTEGER,VARCHAR(32),INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,VARCHAR(32),VARCHAR(50),VARCHAR(50),VARCHAR(64),VARCHAR(64),INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,INTEGER
1,2,2759873,35208306,1330847184,2,0,-637263956,"5838","",0,0,4,-637263956,24692,4,20,0,0,0,0,0,0,"0","0",35208307,6,35208307,-2096103670,"90556138315","90556138315","",0,16,4,-2096103670,23090,4,20,0,0,0,0,0,0,"0","0",1330847195,1330847219,"90556138315","37d523ab-680b-41a9-ab6b-d9867974f044","Part_National","Part_DU_CCs_NAT","Part_National","Part_National",24,"SEP180373C4463E","10.255.15.131",0,12,0,0,0,0,0,"StandAloneCluster",0,"","",0,"",3,1,0,0,64,64,"","043699155","90556138315","172.31.4.218","10.255.15.131",0,0,0,0,0,0,0,0,0,0,0,0
1,2,2759107,35206523,1330846644,2,35206523,-737214710,"042861396","",0,0,4,-2129658102,19000,4,20,0,0,0,0,0,0,"0","0",35206524,7,0,1562582956,"6894","6894","",0,16,4,1562582956,18658,4,20,0,0,0,0,0,0,"0","0",1330846645,1330847219,"6894","a1df3ed4-0b42-4974-8c60-50cb9d696d5c","Part_India","","Part_India","Part_India",574,"10.255.14.212","SEP00269944A143",0,12,0,0,0,0,0,"StandAloneCluster",0,"","",0,"",1,3,0,0,64,64,"","","","10.255.15.129","172.31.35.93",0,0,0,0,0,0,0,0,0,0,0,0
Third:
this is the file format: 'C:\sqldata\CDRimport.txt'
10.0
94
1 SQLCHAR 0 50 "," 1 cdrRecordType Arabic_CI_AS
2 SQLCHAR 0 50 "," 2 globalCallID_callManagerId Arabic_CI_AS
3 SQLCHAR 0 50 "," 3 globalCallID_callId Arabic_CI_AS
4 SQLCHAR 0 50 "," 4 origLegCallIdentifier Arabic_CI_AS
5 SQLCHAR 0 50 "," 5 dateTimeOrigination Arabic_CI_AS
6 SQLCHAR 0 50 "," 6 origNodeId Arabic_CI_AS
7 SQLCHAR 0 50 "," 7 origSpan Arabic_CI_AS
8 SQLCHAR 0 50 ",\"" 8 origIpAddr Arabic_CI_AS
9 SQLCHAR 0 50 "\",\"" 9 callingPartyNumber Arabic_CI_AS
10 SQLCHAR 0 50 "\"," 10 callingPartyUnicodeLoginUserID Arabic_CI_AS
11 SQLCHAR 0 50 "," 11 origCause_location Arabic_CI_AS
12 SQLCHAR 0 50 "," 12 origCause_value Arabic_CI_AS
13 SQLCHAR 0 50 "," 13 origPrecedenceLevel Arabic_CI_AS
14 SQLCHAR 0 50 "," 14 origMediaTransportAddress_IP Arabic_CI_AS
15 SQLCHAR 0 50 "," 15 origMediaTransportAddress_Port Arabic_CI_AS
16 SQLCHAR 0 50 "," 16 origMediaCap_payloadCapability Arabic_CI_AS
17 SQLCHAR 0 50 "," 17 origMediaCap_maxFramesPerPacket Arabic_CI_AS
18 SQLCHAR 0 50 "," 18 origMediaCap_g723BitRate Arabic_CI_AS
19 SQLCHAR 0 50 "," 19 origVideoCap_Codec Arabic_CI_AS
20 SQLCHAR 0 50 "," 20 origVideoCap_Bandwidth Arabic_CI_AS
21 SQLCHAR 0 50 "," 21 origVideoCap_Resolution Arabic_CI_AS
22 SQLCHAR 0 50 "," 22 origVideoTransportAddress_IP Arabic_CI_AS
23 SQLCHAR 0 50 ",\"" 23 origVideoTransportAddress_Port Arabic_CI_AS
24 SQLCHAR 0 50 "\",\"" 24 origRSVPAudioStat Arabic_CI_AS
25 SQLCHAR 0 50 "\"," 25 origRSVPVideoStat Arabic_CI_AS
26 SQLCHAR 0 50 "," 26 destLegIdentifier Arabic_CI_AS
27 SQLCHAR 0 50 "," 27 destNodeId Arabic_CI_AS
28 SQLCHAR 0 50 "," 28 destSpan Arabic_CI_AS
29 SQLCHAR 0 50 ",\"" 29 destIpAddr Arabic_CI_AS
30 SQLCHAR 0 50 "\",\"" 30 originalCalledPartyNumber Arabic_CI_AS
31 SQLCHAR 0 50 "\",\"" 31 finalCalledPartyNumber Arabic_CI_AS
32 SQLCHAR 0 50 "\"," 32 finalCalledPartyUnicodeLoginUserID Arabic_CI_AS
33 SQLCHAR 0 50 "," 33 destCause_location Arabic_CI_AS
34 SQLCHAR 0 50 "," 34 destCause_value Arabic_CI_AS
35 SQLCHAR 0 50 "," 35 destPrecedenceLevel Arabic_CI_AS
36 SQLCHAR 0 50 "," 36 destMediaTransportAddress_IP Arabic_CI_AS
37 SQLCHAR 0 50 "," 37 destMediaTransportAddress_Port Arabic_CI_AS
38 SQLCHAR 0 50 "," 38 destMediaCap_payloadCapability Arabic_CI_AS
39 SQLCHAR 0 50 "," 39 destMediaCap_maxFramesPerPacket Arabic_CI_AS
40 SQLCHAR 0 50 "," 40 destMediaCap_g723BitRate Arabic_CI_AS
41 SQLCHAR 0 50 "," 41 destVideoCap_Codec Arabic_CI_AS
42 SQLCHAR 0 50 "," 42 destVideoCap_Bandwidth Arabic_CI_AS
43 SQLCHAR 0 50 "," 43 destVideoCap_Resolution Arabic_CI_AS
44 SQLCHAR 0 50 "," 44 destVideoTransportAddress_IP Arabic_CI_AS
45 SQLCHAR 0 50 ",\"" 45 destVideoTransportAddress_Port Arabic_CI_AS
46 SQLCHAR 0 50 "\",\"" 46 destRSVPAudioStat Arabic_CI_AS
47 SQLCHAR 0 50 "\"," 47 destRSVPVideoStat Arabic_CI_AS
48 SQLCHAR 0 50 "," 48 dateTimeConnect Arabic_CI_AS
49 SQLCHAR 0 50 ",\"" 49 dateTimeDisconnect Arabic_CI_AS
50 SQLCHAR 0 50 "\",\"" 50 lastRedirectDn Arabic_CI_AS
51 SQLCHAR 0 50 "\",\"" 51 pkid Arabic_CI_AS
52 SQLCHAR 0 50 "\",\"" 52 originalCalledPartyNumberPartition Arabic_CI_AS
53 SQLCHAR 0 50 "\",\"" 53 callingPartyNumberPartition Arabic_CI_AS
54 SQLCHAR 0 50 "\",\"" 54 finalCalledPartyNumberPartition Arabic_CI_AS
55 SQLCHAR 0 50 "\"," 55 lastRedirectDnPartition Arabic_CI_AS
56 SQLCHAR 0 50 ",\"" 56 duration Arabic_CI_AS
57 SQLCHAR 0 50 "\",\"" 57 origDeviceName Arabic_CI_AS
58 SQLCHAR 0 50 "\"," 58 destDeviceName Arabic_CI_AS
59 SQLCHAR 0 50 "," 59 origCallTerminationOnBehalfOf Arabic_CI_AS
60 SQLCHAR 0 50 "," 60 destCallTerminationOnBehalfOf Arabic_CI_AS
61 SQLCHAR 0 50 "," 61 origCalledPartyRedirectOnBehalfOf Arabic_CI_AS
62 SQLCHAR 0 50 "," 62 lastRedirectRedirectOnBehalfOf Arabic_CI_AS
63 SQLCHAR 0 50 "," 63 origCalledPartyRedirectReason Arabic_CI_AS
64 SQLCHAR 0 50 "," 64 lastRedirectRedirectReason Arabic_CI_AS
65 SQLCHAR 0 50 ",\"" 65 destConversationId Arabic_CI_AS
66 SQLCHAR 0 50 "\"," 66 globalCallId_ClusterID Arabic_CI_AS
67 SQLCHAR 0 50 ",\"" 67 joinOnBehalfOf Arabic_CI_AS
68 SQLCHAR 0 50 "\",\"" 68 comment Arabic_CI_AS
69 SQLCHAR 0 50 "\"," 69 authCodeDescription Arabic_CI_AS
70 SQLCHAR 0 50 ",\"" 70 authorizationLevel Arabic_CI_AS
71 SQLCHAR 0 50 "\"," 71 clientMatterCode Arabic_CI_AS
72 SQLCHAR 0 50 "," 72 origDTMFMethod Arabic_CI_AS
73 SQLCHAR 0 50 "," 73 destDTMFMethod Arabic_CI_AS
74 SQLCHAR 0 50 "," 74 callSecuredStatus Arabic_CI_AS
75 SQLCHAR 0 50 "," 75 origConversationId Arabic_CI_AS
76 SQLCHAR 0 50 "," 76 origMediaCap_Bandwidth Arabic_CI_AS
77 SQLCHAR 0 50 ",\"" 77 destMediaCap_Bandwidth Arabic_CI_AS
78 SQLCHAR 0 50 "\",\"" 78 authorizationCodeValue Arabic_CI_AS
79 SQLCHAR 0 50 "\",\"" 79 outpulsedCallingPartyNumber Arabic_CI_AS
80 SQLCHAR 0 50 "\",\"" 80 outpulsedCalledPartyNumber Arabic_CI_AS
81 SQLCHAR 0 50 "\",\"" 81 origIpv4v6Addr Arabic_CI_AS
82 SQLCHAR 0 50 "\"," 82 destIpv4v6Addr Arabic_CI_AS
83 SQLCHAR 0 50 "," 83 origVideoCap_Codec_Channel2 Arabic_CI_AS
84 SQLCHAR 0 50 "," 84 origVideoCap_Bandwidth_Channel2 Arabic_CI_AS
85 SQLCHAR 0 50 "," 85 origVideoCap_Resolution_Channel2 Arabic_CI_AS
86 SQLCHAR 0 50 "," 86 origVideoTransportAddress_IP_Channel2 Arabic_CI_AS
87 SQLCHAR 0 50 "," 87 origVideoTransportAddress_Port_Channel2 Arabic_CI_AS
88 SQLCHAR 0 50 "," 88 origVideoChannel_Role_Channel2 Arabic_CI_AS
89 SQLCHAR 0 50 "," 89 destVideoCap_Codec_Channel2 Arabic_CI_AS
90 SQLCHAR 0 50 "," 90 destVideoCap_Bandwidth_Channel2 Arabic_CI_AS
91 SQLCHAR 0 50 "," 91 destVideoCap_Resolution_Channel2 Arabic_CI_AS
92 SQLCHAR 0 50 "," 92 destVideoTransportAddress_IP_Channel2 Arabic_CI_AS
93 SQLCHAR 0 50 "," 93 destVideoTransportAddress_Port_Channel2 Arabic_CI_AS
94 SQLCHAR 0 50 "\n" 94 destVideoChannel_Role_Channel2 Arabic_CI_AS
Last:
Run this SQL Query:
BULK INSERT CDR FROM 'C:\sqldata\myDatafile.txt' WITH (FORMATFILE='C:\sqldata\CDRimport.txt', FIRSTROW=3)
Logically, since the data file contains 2 records, it should insert 2 rows; however, it inserts only 1 row.
All the above scripts are attached.
My question: any idea? Can anyone tell me what is wrong with my code or file format that makes the first data row be skipped?
myDatafile.txt
CDRimport.txt
Create-table.sql
Check the FIRSTROW=3 in the queries above to make sure that is indeed what you want.
And which file do you really need to import? myDataFile or CDRImport?
ASKER
100% sure. I'm intentionally setting FIRSTROW=3 to start from the 3rd row because the 1st contains headers and the 2nd contains data types, and I don't want them to be imported.
If I omit the header and data-type rows and change FIRSTROW=0 (or remove it from the code), then it works fine. However, I cannot omit those two rows from the file, because the file is system-generated and I don't have control over that system, so I can't generate the files without the first two rows.
As per my testing, I think the issue comes from the 2nd row (the data-type row): when I removed only the 2nd row and changed FIRSTROW=2, it worked fine and the 2nd record was actually inserted. When I added it back to the file and changed FIRSTROW=3, BULK INSERT started inserting from row 4, skipping row 3 (the 1st data record).
I need to know what's wrong with the 2nd (data-type) row, if anything.
Any other suggestion?
ASKER
I need to import myDataFile; the other file, CDRImport, is used for mapping the fields in the file to the fields in the database.
Ok - do you have a hex editor? Or do you know the system that generates myDataFile?
I think that line 2 is actually not terminated like a normal line 2: at the end of it there may be only an LF (hex 0A), not CRLF (hex 0D 0A). That's why it looks like only line 4 gets in; as far as BULK INSERT is concerned, there is no line 2.
Did you try FIRSTROW=2?
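If a hex editor isn't handy, the line terminators can also be checked from a command line. This is just a sketch: it assumes a unix-style `od` is available on the Windows box (for example from a GNU utilities port) and uses the myDatafile.txt name from this thread:

```shell
# Dump the first two lines of the file byte by byte.
# A unix-style file shows only \n at each line end;
# a Windows-style file shows \r followed by \n.
head -n 2 myDatafile.txt | od -c | head -n 5
```

If the output shows `\n` without a preceding `\r`, the file uses LF-only line endings, which is what BULK INSERT's default ROWTERMINATOR of `\r\n` will not match.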
ASKER
I checked with a hex editor and it is indeed only an LF (hex 0A). The file is already attached; can you check from your side and confirm? Also, can you explain more about what that means?
I also tried FIRSTROW=2, but it didn't work.
For testing purposes I put all the 2nd-row values between quotation marks, like below:
"INTEGER","INTEGER","INTEGER","INTEGER","INTEGER","INTEGER","INTEGER","INTEGER","VARCHAR(50)","VARCHAR(128)","INTEGER","INTEGER","INTEGER","INTEGER","INTEGER","INTEGER","INTEGER","INTEGER","INTEGER","INTEGER","INTEGER","INTEGER","INTEGER","VARCHAR(64)","VARCHAR(64)","INTEGER","INTEGER","INTEGER","INTEGER","VARCHAR(50)","VARCHAR(50)","VARCHAR(128)","INTEGER","INTEGER","INTEGER","INTEGER","INTEGER","INTEGER","INTEGER","INTEGER","INTEGER","INTEGER","INTEGER","INTEGER","INTEGER","VARCHAR(64)","VARCHAR(64)","INTEGER","INTEGER","VARCHAR(50)","UNIQUEIDENTIFIER","VARCHAR(50)","VARCHAR(50)","VARCHAR(50)","VARCHAR(50)","INTEGER","VARCHAR(129)","VARCHAR(129)","INTEGER","INTEGER","INTEGER","INTEGER","INTEGER","INTEGER","INTEGER","VARCHAR(50)","INTEGER","VARCHAR(2048)","VARCHAR(50)","INTEGER","VARCHAR(32)","INTEGER","INTEGER","INTEGER","INTEGER","INTEGER","INTEGER","VARCHAR(32)","VARCHAR(50)","VARCHAR(50)","VARCHAR(64)","VARCHAR(64)","INTEGER","INTEGER","INTEGER","INTEGER","INTEGER","INTEGER","INTEGER","INTEGER","INTEGER","INTEGER","INTEGER","INTEGER"
It works, but this is not the solution. Actually, we are near the solution, but I don't want to touch the data file; I have to keep the original generated file as it is.
My question now:
Why does it work fine when I put all the values of the 2nd (data-type) row between quotation marks?
Can anyone think this through with me?
Your input file has a unix LF (ASCII 10) row terminator.
See http://msdn.microsoft.com/en-us/library/ms188365.aspx for a way to use dynamic SQL:

DECLARE @bulk_cmd varchar(1000)
SET @bulk_cmd = 'BULK INSERT AdventureWorks2008R2.Sales.SalesOrderDetail
FROM ''<drive>:\<path>\<filename>''
WITH (ROWTERMINATOR = '''+CHAR(10)+''')'
EXEC(@bulk_cmd)

Or you can just enter a CR in T-SQL, which also works:

BULK INSERT [CDR]
FROM 'C:\sqldata\myDatafile.txt' WITH (FIELDTERMINATOR=',', ROWTERMINATOR='
', FIRSTROW=3)

No need to use the format file. The fields will map to your table columns as they are.
Actually I was sort-of mistaken. You can't actually hit enter to get the LF between quotes in rowterminator. I had executed this in QA text output mode;
select '>'+char(10)+'<'
copied, and pasted the LF between >< into the ROWTERMINATOR='??' value. That worked. I just assumed you could also just hit Enter, until I tried it. So either use dynamic SQL (from here http://msdn.microsoft.com/en-us/library/ms188365.aspx which did not show up as a link the first time) or do what I just described, hopefully with a reasonable comment. Ugh.
In response to your 2nd question above about wrapping the data-type row in "quotes": the FIRSTROW attribute is not intended to skip column headers; skipping headers is not supported by the BULK INSERT statement. When skipping rows, the SQL Server Database Engine looks only at the field terminators and does not validate the data in the fields of skipped rows. That is why wrapping your values in "" makes it work: the quoted row then contains the field terminators your format file defines.
ASKER
thank you all very much for the replies...
The reason I'm insisting on using the format file is that the fields don't all have the same data type; as you can see, some fields are in "quotes" while others are not.
Also, using a format file gives me the flexibility in future to import specific selected fields rather than importing the whole file.
Besides, if the generated data file changes the column order, the importing process will not be impacted.
For the time being, it would be helpful if you could modify the code below so the quotes are not imported into my database.
DECLARE @bulk_cmd varchar(1000)
SET @bulk_cmd = 'BULK INSERT AdventureWorks2008R2.Sales.SalesOrderDetail
FROM ''<drive>:\<path>\<filename>''
WITH (ROWTERMINATOR = '''+CHAR(10)+''')'
EXEC(@bulk_cmd)
How can I achieve that? How can I ignore the quotes and import only the data? E.g. I have a column callingPartyNumber whose data in the file looks like this:
"90556138111"
"90556138222"
i want it to be inserted in my database like this:
90556138111
90556138222
What modification is required here:
BULK INSERT [CDR]
FROM 'C:\sqldata\myDatafile.txt' WITH (FIELDTERMINATOR=',', ROWTERMINATOR='
', FIRSTROW=3)
ASKER
Hi I just tried to do the same as suggested in:
DECLARE @bulk_cmd varchar(1000)
SET @bulk_cmd = 'BULK INSERT AdventureWorks2008R2.Sales.SalesOrderDetail
FROM ''<drive>:\<path>\<filename>''
WITH (ROWTERMINATOR = '''+CHAR(10)+''')'
EXEC(@bulk_cmd)
this is my code:

DECLARE @bulk_cmd varchar(1000)
SET @bulk_cmd = 'BULK INSERT CDR75.dbo.CDR
FROM ''C:\sqldata\myDatafile.txt''
WITH (ROWTERMINATOR = '''+CHAR(10)+''', FIELDTERMINATOR='','', FIRSTROW=3)'
PRINT @bulk_cmd
EXEC(@bulk_cmd)

I ran the above query and it worked fine for inserting the first rows, but the problem is that the data is inserted with quotes wherever it has quotes in the data file.
I attached a screenshot of the output.
Any idea how to remove the quotation marks while inserting? I know how to do that in the format file: I was using escape characters like ",\"" for just the columns whose data is between quotes; check my CDRimport.txt above.
Any idea how to do the same for the above query?
thanks,
wrong.jpg
correct.jpg
ASKER
Also, another workaround would succeed if I were able to remove the first 2 rows (the header and data-type rows) prior to BULK INSERT.
Any idea how to delete the first two lines from a flat file using a SQL query? I'm using SQL 2008, and I think it is possible using PowerShell.
Anyway, so far we have reached two workaround options to solve my issue:
1) Either remove the quotes from the data.
2) Or remove the first two lines from the flat file.
Can anybody help me with either of the above? Or you can suggest a new 3rd option.
Many thanks in advance.
No ideas at the moment. Yes, I understand the problem. The format file tells BULK INSERT that selected fields are delimited with "," others with , others with ", and others with ,"
That is what you have done. The problem is that your second row of data does not conform to those delimiter rules: all the data-type names are delimited with just , which is why putting quotes around those names fixes the problem.
I am afraid you will have to import the data with embedded quotes, then replace the quotes after you load the data.
Out of curiosity, what happens if you do this
INSERT CDR
SELECT *
FROM OPENDATASOURCE('Microsoft.ACE.OLEDB.12.0','Data Source=c:\sqldata;Extended Properties="Text;HDR=Yes;FMT=Delimited"')...[mydatafile#txt]
set rowcount 1 -- Delete the first row containing datatypes
delete from CDR
set rowcount 0
This may work if you have the ACE provider installed. Its default behavior is to strip quotes from text and assume the data is comma delimited.
Also, if you just don't want to type all of the replace() statements to fix the quotes, you can run this;
declare @fixquotes varchar(max)
select
@fixquotes = IsNull(@fixquotes + char(10) + ',', 'update CDR set ') + '['+column_name+'] = replace(['+column_name+'],''"'','''')'
from INFORMATION_SCHEMA.COLUMNS where table_name = 'CDR'
select @fixquotes
exec ( @fixquotes )
A unix-world tool called tail will output the file beginning at line N
tail +3 mydatafile.txt > mydatafile.csv
will strip the first 2 lines and store the output in mydatafile.csv
C:\Users\KSelvia>tail --version
tail (GNU textutils) 2.0
Written by Paul Rubin, David MacKenzie, Ian Lance Taylor, and Jim Meyering.
Copyright (C) 1999 Free Software Foundation, Inc.
This is free software; see the source for copying conditions. There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
I don't remember where I got mine from but it looks like you can download it from here
http://unxutils.sourceforge.net/
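If sed from the same toolkit is also available, the asker's two workaround options can be combined in one pipeline before BULK INSERT ever sees the file. This is only a sketch: it uses the modern `tail -n +3` spelling of the option, the myDatafile.txt name from this thread, and a blanket `s/"//g` that assumes no field legitimately contains an embedded double quote:

```shell
# Drop the header row and the data-type row (start output at line 3),
# strip every double quote, and write to a new file so the original
# system-generated file stays untouched.
tail -n +3 myDatafile.txt | sed 's/"//g' > myDatafile.csv
```

After this, a plain BULK INSERT on myDatafile.csv with FIELDTERMINATOR=',' and an LF row terminator needs neither FIRSTROW nor the quote-handling format file.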
You can also use grep, cat, cut, and awk to massage the files to your liking before bcp'ing them into your tables.
IMO this is the way to go to save time (and energy), given that the files you receive are static formatting-wise.
IMO this is the way to go to save time (and energy) given that the files you receive are static formattingwise.
I would read myDataFile.txt into a SQL temp table, delete the two rows, and export the data back to myDataFile (or better, to another file, so you keep the original untouched in case you need to debug). You can read a file into a SQL table with something like:
set nocount on;
declare @filename sysname,
@sqlstr varchar(max)
set @filename = 'c:\myDatafile.txt'
Create table ##tempfile (line varchar(8000)) -- global temp table, so the separate sqlcmd session can see it
exec ('bulk insert ##tempfile from "' + @filename + '"')
select * from ##tempfile
set ROWCOUNT 2 -- so only the top 2 rows are deleted
delete from ##tempfile;
set ROWCOUNT 0
select * from ##tempfile
set @sqlstr = 'exec master..xp_cmdshell N''sqlcmd -E -Q"select * from ##tempfile" -h-1 -W -o"' + @filename + '"'' --, no_output'
print (@sqlstr)
exec ( @sqlstr )
drop table ##tempfile
ASKER
i think we drifted away from the main problem. let me remind you... my main problem is that I can't read all the data values in my flat text data file using BULK INSERT. The 1st row is always skipped unexpectedly.
I cannot go for the option of reading myDataFile.txt into a SQL temp table, because the first row will be missed:
Create table #tempfile (line varchar(8000))
exec ('bulk insert #tempfile from "' + @filename + '"')
select * from #tempfile
the 1st row will be missed in the #tempfile, which makes it useless.
Now I need to focus on the option of deleting the first 2 records from the file.
How can I delete records using powershell?
Unfortunately I can't do PowerShell, but this seemed helpful:
http://stackoverflow.com/questions/2074271/remove-top-line-of-text-file-with-powershell
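For what it's worth, if PowerShell isn't an option, the unix-style tools mentioned earlier can do the same top-line deletion; a hedged sketch with sed (assumes GNU sed for the in-place `-i` flag, illustrative file name):

```shell
# Sample file with 2 unwanted header lines
printf 'headers\ndatatypes\nrow1\nrow2\n' > /tmp/myDatafile.txt

# Delete lines 1-2 in place; drop -i and redirect instead to keep the original
sed -i '1,2d' /tmp/myDatafile.txt

cat /tmp/myDatafile.txt
```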
ASKER
focusing back on the below code, which works fine except for the quotes that appear with the data:
this code is 95% fine for me. I want to spend some effort on this.
I want the field terminator to skip the quotes from any field, regardless of whether the field already contains quotes or not. I need something like this:
replacing fieldterminator by '"\",\""'
SET @bulk_cmd = 'BULK INSERT CDR75.dbo.CDR
FROM ''C:\sqldata\myDatafile.txt''
WITH (ROWTERMINATOR = '''+CHAR(10)+''', fieldterminator=''"\",\""'', firstrow=3)'
for sure this is not working, although I know the escape character is "\".
any suggestion how to play with the above?
DECLARE @bulk_cmd varchar(1000)
SET @bulk_cmd = 'BULK INSERT CDR75.dbo.CDR
FROM ''C:\sqldata\myDatafile.txt''
WITH (ROWTERMINATOR = '''+CHAR(10)+''',fieldterminator='','', firstrow=3)'
print @bulk_cmd
EXEC(@bulk_cmd)
ASKER
I'm not sure if my problem is a bug in SQL 2008 or not... I found many cases on the web facing the same problem, like this one:
Same problem - the first row is skipped by bulk insert.
any idea if it is a known bug?
ASKER
hi Experts,
waiting for your replies please
OK, let's try to:
1) Import the LF-terminated row file, skipping the first 2 rows, into a single column
2) export the remaining (2 in this case) rows to a CR+LF terminated flat file
3) import the new file, using a format file to specify quoted field delimiters where necessary
Change the last line of C:\sqldata\CDRimport.txt from
94 SQLCHAR 0 50 "\n" 94 destVideoChannel_Role_Channel2 Arabic_CI_AS
to
94 SQLCHAR 0 50 "\r\n" 94 destVideoChannel_Role_Channel2 Arabic_CI_AS
-- Load the records into a single-column table as lchoan suggested, but use dynamic SQL so you can specify ROWTERMINATOR and also skip the first 2 rows;
create table ##tempfile (line varchar(8000) null)
DECLARE @bulk_cmd varchar(1000)
SET @bulk_cmd = 'BULK INSERT ##tempfile
FROM ''C:\sqldata\myDatafile.txt''
WITH (ROWTERMINATOR = '''+CHAR(10)+''', fieldterminator='','', firstrow=3)'
print @bulk_cmd
EXEC(@bulk_cmd)
-- Extract the rows from ##tempfile to mydatafile.csv (which will not include the first 2 rows now)
declare @sqlstr varchar(255)
set @sqlstr = 'exec master..xp_cmdshell N''sqlcmd -E -Q"set nocount on;select * from ##tempfile" -h-1 -W -o"'+'c:\sqldata\mydatafile.csv+'"'' --, no_output'
exec ( @sqlstr )
drop table ##tempfile
-- Insert new .CSV file into CDR table using new format file;
BULK INSERT CDR
FROM 'C:\sqldata\myDatafile.csv' with (formatfile = 'C:\sqldata\CDRimport.txt')
ASKER
i tried to test your suggestion; the below portion is not working:
declare @sqlstr varchar(255)
set @sqlstr = 'exec master..xp_cmdshell N''sqlcmd -E -Q"set nocount on;select * from ##tempfile" -h-1 -W -o"'+'c:\sqldata\mydatafile.csv+'"'' --, no_output'
exec ( @sqlstr )
when I checked the output file mydatafile.csv I found it empty.
any idea?
ASKER
Hi All,
deleting the first two lines of the original data file would be of great help and all will work fine.
I just found a tip on how to delete specific lines from a text file using powershell scripting, from here:
Manually the code succeeded to delete the lines from the powershell command line:
(Get-Content $file | Select-Object -Skip 2) | Set-Content $file
Or simply, for my case:
(gc myDatafile.txt | select -Skip 2) | sc myDatafile.txt
Now I want to run the same command dynamically from T-SQL.
For my case the full path is stored in the T-SQL variable @path and the file name in the variable @file.
How can I run the below command from a sql query using my file variables?
(gc myDatafile.txt | select -Skip 2) | sc myDatafile.txt
this would be the best workaround for my problem.
1) Addressing your previous comment first: it did not work because I must have accidentally deleted a quote somehow. This
set @sqlstr = 'exec master..xp_cmdshell N''sqlcmd -E -Q"set nocount on;select * from ##tempfile" -h-1 -W -o"'+'c:\sqldata\mydatafile.csv+'"'' --, no_output'
should be
set @sqlstr = 'exec master..xp_cmdshell N''sqlcmd -E -Q"set nocount on;select * from ##tempfile" -h-1 -W -o"'+'c:\sqldata\mydatafile.csv'+'"'' --, no_output'
Notice the ' after .csv above:
mydatafile.csv'+'
That method worked fine when I ran it.
2) The method for executing DOS commands from within T-SQL is the built-in extended stored procedure "xp_cmdshell", which has been referenced in several comments so far. To run powershell from within T-SQL using variables for file and path, you would do this:
declare @path varchar(30)
declare @file varchar(30)
declare @cmd varchar(255)
set @path = 'c:\sqldata'
set @file = 'mydatafile.csv'
select @cmd='exec master..xp_cmdshell ''powershell -noprofile -noninteractive -command "(gc ' + @path + '\' + @file + ' | select -Skip 2) | sc c:\sqldata\mydatafile.txt"'''
exec ( @cmd )
In the case above, @cmd =
exec master..xp_cmdshell 'powershell -noprofile -noninteractive -command "(gc c:\sqldata\mydatafile.csv | select -Skip 2) | sc c:\sqldata\mydatafile.txt"'
Notice the input file is mydatafile.csv, and the output file is mydatafile.txt.
You don't want to over-write the input file by using the same name as the output file. (Not sure PowerShell would even allow it.)
Also notice that powershell converts the file from LF-terminated rows to CRLF. You can tell by opening the new file in Notepad: it shows 2 rows instead of 1 LF-terminated row (Notepad ignores bare LFs), so you will still need to change the last line of the CDRImport.txt format file to
94 SQLCHAR 0 50 "\r\n" 94 destVideoChannel_Role_Channel2 Arabic_CI_AS
Then this will work;
BULK INSERT CDR
FROM 'C:\sqldata\myDatafile.txt' with (formatfile = 'C:\sqldata\CDRimport.txt')
Also notice: even though your original input file and path are in variables, the intermediate 2-row file is hard-coded as C:\sqldata\myDatafile.txt. That way you don't have to worry about making BULK INSERT dynamic; it will always import from C:\sqldata\myDatafile.txt.
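As a hedged shell-side equivalent of steps 1-2 above (assuming the sed route from unxutils rather than sqlcmd), dropping the 2 header lines and normalizing each row to CR+LF can be collapsed into one sed pass, so the "\r\n" in the format file then matches (file names and data are illustrative):

```shell
# LF-terminated sample: 2 header lines plus 2 data rows
printf 'headers\ndatatypes\n0,1,abc\n0,1,def\n' > /tmp/myDatafile.txt

# Drop lines 1-2 and append a CR before each LF in a single pass
sed -e '1,2d' -e 's/$/\r/' /tmp/myDatafile.txt > /tmp/myDatafile.csv

# Each surviving row now ends in \r\n
od -c /tmp/myDatafile.csv
```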
ASKER
Hi kselvia, All,
thank you for your efforts and time...
although I didn't get your last comment (starting from "Also notice...")... your code works fine only when the input file is different from the output file. I intentionally want to replace the original file (the input file). I managed to do that from the powershell command line, like this:
PS C:\> (gc myDatafile.txt | select -Skip 2) | sc myDatafile.txt
however I couldn't do the same from a SQL query. I modified your script to make both input and output files the same, like this:
declare @path varchar(30)
declare @file varchar(30)
declare @cmd varchar(255)
set @path = 'c:\sqldata'
set @file = 'testFine.dat'
set @cmd='exec master..xp_cmdshell ''powershell -noprofile -noninteractive -command "(gc ' + @path + '\' + @file + ' | select -Skip 2) | sc ' + @path + '\' + @file + '"'''
--print ( @cmd )
exec ( @cmd )
when I ran it, it gave this error:
Set-Content : Access to the path 'C:\sqldata\testFine.dat' is denied.
At line:1 char:51
+ (gc c:\sqldata\testFine.dat | select -Skip 2) | sc <<<< c:\sqldata\testFine.dat
    + CategoryInfo          : NotSpecified: (:) [Set-Content], UnauthorizedAccessException
    + FullyQualifiedErrorId : System.UnauthorizedAccessException,Microsoft.PowerShell.Commands.SetContentCommand
Any idea how to force overwriting the input file?
thank you again
ASKER CERTIFIED SOLUTION
ASKER
Yes, this command ran successfully from the DOS CLI:
powershell -noprofile -noninteractive -command "(gc C:\sqldata\testFine.dat | select -Skip 2) | sc C:\sqldata\testFine.dat"
Strangely enough, the same doesn't work from a SQL query unless I change the output file.
Anyway, I don't want to spend more time and effort on this. What I have reached so far is pretty good, and I'll go ahead with your advice to use a different file for the output. This will definitely solve my issue for the time being.
I'm very curious to know whether the main issue (skipping the 1st line in BULK INSERT) is a known bug in SQL Server or not. For sure I'll try to avoid using BULK INSERT in coming projects and will go for Integration Services.
I'll take this moment to thank all of you who participated in this discussion. Actually this is my first trial of https://www.experts-exchange.com and really I found it amazing; I got the answer even faster than from my vendor support. I'll renew my subscription and I hope to be more active in discussions within my expertise.
I'll consider the issue resolved.
Again, thank you all.
ASKER
I accepted solution provided by: kselvia
He or she (kselvia) is very active and replies fast. He/she gives out-of-the-box ideas.
thank you kselvia
ASKER
Hi
I tried to execute the command from the DOS CLI:
C:\>powershell -noprofile -noninteractive -command "(gc C:\sqldata\testFine.dat | select -Skip 2) | sc C:\sqldata\testFine2.dat"
but I got this error:
Select-Object : A parameter cannot be found that matches parameter name 'Skip'.
At line:1 char:43
+ (gc C:\sqldata\testFine.dat | select -Skip <<<< 2) | sc C:\sqldata\testFine2.dat
C:\>
Any idea?
ASKER
I managed to figure out the above issue by installing PowerShell v2.0; previously I was testing in PowerShell v1.0.
things went smoothly as expected.
thanks
You are welcome. I am a he (Ken Selvia)
BULK INSERT replaces the \n rowterminator with \r\n
http://msdn.microsoft.com/en-us/library/ms191485.aspx
"When you use bcp interactively and specify \n (newline) as the row terminator, bcp automatically prefixes it with a \r (carriage return) character, which results in a row terminator of \r\n."
Apparently that applies to BULK INSERT as well as bcp. The only way to make it use a LF (char(10)) character is with dynamic sql (as we have done) but you can't make a format file specify a LF as field terminator. Since BULK INSERT converts the \n in your format file to \r\n, that causes it not to see the \n at the end of the first line because it is "eaten" in place of the \r, causing it not to find the next \n until the end of the next line, making it look like it skipped the first row.
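Given that explanation, one hedged workaround sketch: convert the LF-only export to CRLF line endings before the load (unix2dos, or sed as below, assuming GNU sed), so the implicit \r\n terminator actually finds a match at the end of every row, including the first:

```shell
# LF-only file, as the CDR export produces
printf 'row1\nrow2\n' > /tmp/lf.txt
echo "CRLF rows before: $(grep -c $'\r' /tmp/lf.txt || true)"

# Append a CR before each LF (GNU sed understands \r in the replacement)
sed 's/$/\r/' /tmp/lf.txt > /tmp/crlf.txt
echo "CRLF rows after: $(grep -c $'\r' /tmp/crlf.txt)"
```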
ASKER
hi, the 'He'
thank you for your help and explanation...
I understood your explanation; does Microsoft consider this a bug? If not, how can we make it work without involving scripting?
Now it is working fine after I removed the 1st two lines, without even making any changes in the format file; I still use only \n (not \r\n).
I was expecting a minor change in the format file to make it work, like replacing the \n with something else. I tried replacing it with \r\n and it didn't work.