
Solved

powershell delete first 7 rows of CSV

Posted on 2012-08-20
Medium Priority
7,746 Views
Last Modified: 2012-09-02
Hi,

Every day I need to create a SQL bulk insert from a CSV file. I have a PowerShell script that goes in and selects the columns that I want, but I also need to remove the top 7 rows first. There are 7 rows that I don't want (the top 7).

For removing the unwanted columns I do a Select on the column names I want and then Export-Csv.

How can I also remove the first 7 rows?
(The first 7 rows are garbage and the column headers start at row 7.)

Thanks in advance
Question by:neoptoent
21 Comments
 

Author Comment

by:neoptoent
ID: 38312628
Also, I need to delete the last row as well.
 
LVL 9

Expert Comment

by:chrismerritt
ID: 38312928
There are probably multiple things you could do here, but have you considered setting the FIRSTROW value on your BULK INSERT to always skip the first rows?

BULK INSERT <TABLENAME>
FROM <FILENAME>
WITH
(
	FIRSTROW = 2,
	FIELDTERMINATOR = '\t'
)

 
LVL 35

Accepted Solution

by:
Robert Schutt earned 900 total points
ID: 38313043
I don't have a clean solution (still learning PS myself...)

If you don't mind a temporary file you can use:
$a = get-content test.csv
$a[7..($a.length - 2)] > temp.csv
Import-CSV temp.csv
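To make that concrete, here is a self-contained sketch of the same approach, run against a made-up sample file shaped like the one described (7 garbage lines, a header row, data, and a trailing totals row); the file names and contents are placeholders:

```powershell
# Build a sample file shaped like the one described: 7 garbage lines,
# a header row, two data rows, and a trailing totals row.
@(
    'garbage 1'
    'garbage 2'
    'garbage 3'
    'garbage 4'
    'garbage 5'
    'garbage 6'
    'garbage 7'
    'header1,header2,header3'
    '1,2,3'
    '4,5,6'
    'total1,total2,total3'
) | Set-Content test.csv

# Slice off the first 7 lines and the last line (index 0 = first line).
$a = Get-Content test.csv
$a[7..($a.Length - 2)] | Set-Content temp.csv

# temp.csv now starts with the header row, so Import-Csv parses it cleanly.
$data = Import-Csv temp.csv
```

After this, $data holds just the two real data rows, with properties named after the header columns.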

 
LVL 9

Expert Comment

by:chrismerritt
ID: 38313141
You don't really need to use a different file; you can overwrite the existing one.

Be warned this example is a bit long, as you need to prep a file to use etc., and the resulting file is unusable as-is because you are then missing the header columns.

#Prep File
$File = "C:\Temp\Services.csv"
$Services = Get-Service
$Services | Export-Csv $File

#Remove lines
$FileContent = gc $File
$FileContent = $FileContent[7..($FileContent.Count - 2)]
$FileContent | Out-File $File

 

Author Comment

by:neoptoent
ID: 38313249
Not sure I understand what a prep file is.
I am new to this as well.

So how do I do it so that the result is usable?
 
LVL 6

Expert Comment

by:rwskas
ID: 38313263
Use Select to skip the first 7. You are already selecting your columns, so just add in a -Skip.

For example, if your CSV was saved as $Table:

$Table | Select Column1, Column2 -Skip 7
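A quick self-contained sketch of that idea, with made-up objects standing in for the imported CSV (Id and Sq are placeholder column names):

```powershell
# Hypothetical in-memory table: five rows with made-up columns Id and Sq.
$Table = 1..5 | ForEach-Object {
    New-Object PSObject -Property @{ Id = $_; Sq = $_ * $_ }
}

# Skip the first 2 rows while selecting only the Id column;
# only the rows with Id 3, 4 and 5 remain.
$kept = $Table | Select-Object Id -Skip 2
```

Note that -Skip operates on the parsed objects, so the CSV must already start with its header row for Import-Csv to have produced usable objects in the first place.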
 
LVL 9

Expert Comment

by:chrismerritt
ID: 38313290
Skipping the rows will largely depend on how you are getting the results in the first place as well. It's easier to skip them if you are just running a command or similar to get the initial data.

Sorry if my example confused you; it was more a reply to Robert Schutt, to show that his example above mine can use the same file without needing a second one.
 
LVL 35

Expert Comment

by:Robert Schutt
ID: 38313355
Ok, thanks Chris, but the example is a bit unfortunate because Get-Service doesn't have 7 bogus rows before the header (and also no totals row, but that's not important here), so logically the headers end up missing.

@neoptoent: you said you had a csv export file. My example uses "test.csv", replace that with your file and you should be up and running.

Taking the advice from Chris, you can overwrite your existing csv file instead of using temp.csv as well and take that as input for the final part of your script. If you don't know how to integrate it, post the relevant part of your code.
 

Author Comment

by:neoptoent
ID: 38313398
Ok, I will try it. I also need to delete the last row.
How could I do that?
 
LVL 35

Expert Comment

by:Robert Schutt
ID: 38313506
That's already in there. The file is read into an array line by line; the first line is array index 0. The part
$a[7..($a.length - 2)]


gives back line 8 up to the next-to-last line.
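A tiny illustration of that slice, using a throwaway 10-element array:

```powershell
# A throwaway array of the numbers 0 through 9 (indices 0 through 9).
$a = 0..9

# Keep index 7 through the next-to-last index: this drops the first
# 7 elements and the final one, leaving just 7 and 8.
$slice = $a[7..($a.Length - 2)]
```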
 
LVL 9

Expert Comment

by:chrismerritt
ID: 38313513
Maybe it would help if you could post an example CSV file, neo?
 

Author Comment

by:neoptoent
ID: 38313828
Robert,

When I try this it messes up the CSV formatting: the temp file is double the size and is not a CSV anymore.
 
LVL 35

Expert Comment

by:Robert Schutt
ID: 38313889
What code are you using now? There could have been a hiccup when combining your existing code with the new code.

When I tested my original posted code I made an extremely simple csv file, I'll post the input and output for that here.
test.csv:
bogus line 1
bogus line 2
bogus line 3
bogus line 4
bogus line 5
bogus line 6
bogus line 7
header1,header2,header3
1,2,3
4,5,6
total1,total2,total3


temp.csv (output like I first used it):
header1,header2,header3
1,2,3
4,5,6


This seems to me exactly like you described it in your question (including the first follow-up post), but correct me if I'm wrong. There is one glitch in your post: you mention 7 rows to be skipped several times, but at the end you say the headers are on the 7th row. I assumed you meant the 8th row; if not, that's easy to fix of course.

It's about bedtime here for me, so my next answer may come tomorrow morning (European time), unless the heat keeps me awake anyway...
 
LVL 9

Expert Comment

by:chrismerritt
ID: 38313923
Really, the best thing is for neo to post an example .csv file; then we could actually work with it like a live sample! :)
 

Author Comment

by:neoptoent
ID: 38316375
Chris,

I can't post an example. It seems Get-Content removes the formatting associated with it.
 
LVL 9

Expert Comment

by:chrismerritt
ID: 38317257
Not sure why Get-Content comes into it. You have a .csv file before you start to edit it, right, with the extra lines etc.? If so, we just need a copy of that, or an example that is similar enough; remove any sensitive info if you need to.
 

Author Comment

by:neoptoent
ID: 38320838
The whole thing is sensitive.
 
LVL 35

Expert Comment

by:Robert Schutt
ID: 38321268
Ok, why not just compare the _structure_ of your file with what I posted? Also, please post the general idea of your code as you have constructed it now (input - commands - output), or even as it was before you posted here; otherwise I don't see how we can help further.
 
LVL 71

Assisted Solution

by:Qlemo
Qlemo earned 600 total points
ID: 38337162
The file is doubled in size because it is now Unicode (caused by Out-File). Use Export-Csv on the array instead:
  $a[7..($a.length - 2)] | Export-Csv -NoTypeInformation test.csv

Don't read the file in with Get-Content - that might garble it up, as you said. Instead, assign the result of your Select to the variable $a, and apply the code shown above.
Select-Object -Skip 7 should also work to remove the first 7 lines, but since we also have to remove the last row, I would use the code above.
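A self-contained sketch of that idea, with hypothetical objects standing in for the result of your Select (Column1/Column2 and the file name are made up):

```powershell
# Ten hypothetical rows standing in for the objects your Select produced.
$a = 1..10 | ForEach-Object {
    New-Object PSObject -Property @{ Column1 = $_; Column2 = $_ * 10 }
}

# Drop the first 7 objects and the last one, then let Export-Csv write
# real CSV output (header row included, no Unicode size doubling).
$a[7..($a.Length - 2)] | Export-Csv out.csv -NoTypeInformation

# Re-import to check: two rows remain, with Column1 values 8 and 9.
$out = Import-Csv out.csv
```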
 
LVL 35

Assisted Solution

by:Robert Schutt
Robert Schutt earned 900 total points
ID: 38350382
@neoptoent, hope you're still with us here. If the only way to go forward is to share the data privately I am still willing to help and close this off. You can find my email on my website, through my profile page.
 
LVL 71

Expert Comment

by:Qlemo
ID: 38359565
Thanks for closing this question, but two questions arise:

1. Why did you accept http:#a38350382? That comment is not a solution, and does not contribute to one in any way.

2. Why a grade of "B"? Didn't you get the results you needed? A "B" should only be given if you are still left with the majority of the work (the Experts having given you pointers only, or partial info).

Since I'm a Topic Advisor, I'm able to reopen the question (or make any other modification) to allow for a different closure - just say so. http:#a38350382 cannot stay as the accepted answer anyway.
