JWeb Admin asked:
Powershell - run array of SQL scripts and output to Excel tabbed worksheet

I have a directory of SQL files in C:\Scripts\SQL\ which I'd like to execute against the SQL servers listed in a file called ServerNames.txt. What I need to accomplish is to run the SQL queries on all the servers/instances in ServerNames.txt and output the results to CSV. Ideally, I'd like to place the results within a folder that is simply named SERVERNAME.domain.com, and each SQL script's output should go to its own output file.

So far, what I have working is that I'm able to run test.sql on each instance inside ServerNames.txt.  I'm working on the logic to scan the Scripts directory and run all *.sql scripts, and send the output of each to a .csv file:

$Path = "C:\Scripts\"
$OutputPath = "C:\Scripts\Output\"
$Servers = Get-Content "$Path\ServerNames.txt"
foreach ($Server in $Servers ) {
    New-Item -ItemType Directory -Force -Path "$Path\Output\$Server"
    Invoke-sqlcmd -ServerInstance $Server `
    -InputFile ".\SQLScripts\test.sql" | `
   Export-Csv -Path "$OutputPath\$Server\$Server.csv" -NoTypeInformation
}



Here is the logic I was thinking about implementing somehow, using Invoke-Sqlcmd:

foreach ($file in Get-ChildItem -Path $databaseFolder -Filter *.sql | Sort-Object)
{
    Get-Content $file.FullName | ForEach-Object { $fullSqlScript = $fullSqlScript + $_ + "`n" }
}
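
If the intent is to run the concatenated text in a single call, a minimal sketch of how it might be handed to Invoke-Sqlcmd via -Query rather than -InputFile is below; $databaseFolder, $Servers and $OutputPath are assumed to be the variables from the snippets above, and the combined.csv file name is just a placeholder.

# Build one script from every .sql file, then run the combined text per server (sketch only)
$fullSqlScript = (Get-ChildItem -Path $databaseFolder -Filter *.sql |
    Sort-Object Name |
    ForEach-Object { Get-Content $_.FullName -Raw }) -join "`n"

foreach ($Server in $Servers) {
    Invoke-Sqlcmd -ServerInstance $Server -Query $fullSqlScript |
        Export-Csv -Path "$OutputPath\$Server\combined.csv" -NoTypeInformation
}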


JWeb Admin (Asker):

I've made some code changes to try to incorporate the logic. The output files contain only partial data, and I'd like the output files to go into the folder created for each server name. Any assistance is appreciated:

import-module sqlps

$Path = "C:\SQLProject\"
$OutputPath = "C:\SQLProject\Output\"
$SQLScriptsPath = 'C:\SQLProject\SQLScripts\'
$Servers = Get-Content "$Path\ServerNames.txt"
$SQLScripts = Get-ChildItem -path $SQLScriptsPath -Include * | sort-object

# Create a directory with the server name, and run the SQL files in the array
foreach ($f in Get-ChildItem -path "C:\SQLProject\SQLScripts\" -Filter *.sql | sort-object)
{
    foreach ($Server in $Servers) {
        New-Item -ItemType Directory -Force -Path "$Path\Output\$Server"
        try
        {
            $out = $OutputPath + $f.name.split(".")[0] + ".txt" ;
            invoke-sqlcmd -ServerInstance $Server -InputFile $f.fullname | format-table | out-file -filePath $out
        }
        catch
        {
            Write-Host($error)
            #Write-Host "Error connecting to SQL Server:" $Server -ForegroundColor Red
        }
  }
}


Chris Dent:
Whether Export-Csv works depends on what Invoke-SqlCmd returns, but it's in place below in case it does.
Import-Module sqlps

$OutputPath = "C:\SQLProject\Output"
$SQLScriptsPath = 'C:\SQLProject\SQLScripts'
$ServerList = Get-Content "C:\SQLProject\ServerNames.txt"

Get-ChildItem $SqlScriptsPath -Filter *.sql | Sort-Object Name | ForEach-Object {
  $File = $_
  Get-Content $ServerList | ForEach-Object {

    # A directory and file based on the server name from the text file
    New-Item "$OutputPath\$_" -ItemType Directory -Force | Out-Null
    $OutputFile = "$OutputPath\$_\$($File.BaseName).csv"
    
    try {
      Invoke-SqlCmd -ServerInstance $Server -InputFile $File.FullName |
        Export-Csv $OutputFile -NoTypeInformation
    } catch {
      Write-Host $_.Exception.Message -ForegroundColor Red
    }
  }
}


Chris
Your logic makes much more sense. The only thing still happening is that even though we've assigned $OutputPath to C:\SQLProject\Output, the code appears to be trying to create the CSVs in the root of C:\SQLProject rather than in C:\SQLProject\Output.

I've checked the value of $ServerList and it is not null.  The error occurs when Get-Content $ServerList is executed on line 9.  If I replace $ServerList with the actual path, the code works.  Perhaps we aren't passing in the value of $ServerList properly?  Does it need to be inside a -Scriptblock?

Get-Content : Cannot find path 'C:\SQLProject\server1.domain.com' because it does not exist.
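
For reference, the error comes from $ServerList already holding the file's contents ($ServerList was assigned with Get-Content), so the inner Get-Content treats each server name as a path. A minimal sketch of the corrected inner loop, assigning $Server from $_ and iterating the list directly, in line with the working version further down the thread:

# $ServerList already contains the server names, so iterate it directly
$ServerList | ForEach-Object {
    $Server = $_

    New-Item "$OutputPath\$Server" -ItemType Directory -Force | Out-Null
    $OutputFile = "$OutputPath\$Server\$($File.BaseName).csv"

    Invoke-SqlCmd -ServerInstance $Server -InputFile $File.FullName |
        Export-Csv $OutputFile -NoTypeInformation
}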


ASKER CERTIFIED SOLUTION: Chris Dent (the accepted solution is available to Experts Exchange members only).
That's too funny - I just found the error with the paths: Get-Content was trying to fetch an item that didn't exist in that path.

So, by assigning the $File and $Server variables from $_ inside the ForEach-Object loops, since you're using the same $_ variable, how is the second assignment not overwriting the first in memory when we're declaring it twice?

The next thing to solve is how to make the output work correctly. It looks like the script only partially runs; there are several SELECT statements that print their output as separate result sets in the Results pane in SSMS.

I thank you for your help!
Chris was a huge help!  I appreciate his time.
> since you're using the same $_ variable, how is the second assignment not overwriting the first in memory
> when we're declaring it twice?

$_ is the pipeline variable and holds the current element from the pipeline. If we take that part of the script, we can see the values it exposes in each loop:
Get-ChildItem $SqlScriptsPath -Filter *.sql | Sort-Object Name | ForEach-Object {
  Write-Host $_    # $_ here is the current file from Get-ChildItem
  Get-Content $ServerList | ForEach-Object {
    Write-Host $_  # $_ here is the current server name from the inner pipeline
  }
}


Variables are mutable, and are not (except in some special usage scenarios) strongly typed.
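
For a concrete illustration of that (standard PowerShell behaviour, not specific to the script above):

$x = 123            # $x currently holds an Int32
$x.GetType().Name   # Int32
$x = "hello"        # re-assignment silently changes the type
$x.GetType().Name   # String

# A variable only becomes strongly typed if it is cast explicitly:
# [Int]$y = 5
# $y = "abc"        # this would throw: cannot convert "abc" to System.Int32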

Chris
> how is the second assignment not overwriting the first in memory when we're declaring it twice?

One more addition: while the value held in $_ changes with context, we're not creating a reference to $_, so once we've assigned it to another variable that copy won't change unless we explicitly act on the new variable.

When you're dealing with Objects the opposite tends to be true. For example:
$Object = New-Object PSObject -Property @{Value = 123}
$NewVariable = $Object      # copies the reference, not the object
$NewVariable.Value = 234    # changes the single underlying object
$Object                     # Value is now 234
$NewVariable                # Value is 234 here too


It's something we all have to be quite careful about because it's easy to get caught out. I believe this is the behaviour you're talking about?

Chris
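
As an aside on the copy-by-reference example above: when an independent copy of a flat PSObject like that one is needed, one option is PSObject.Copy(), which performs a shallow copy (nested reference-type values are still shared).

$Object = New-Object PSObject -Property @{Value = 123}
$Copy = $Object.PSObject.Copy()   # shallow copy; top-level note properties are duplicated
$Copy.Value = 234

$Object.Value   # still 123
$Copy.Value     # 234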
Excellent info. I have one more question for you if you have time. I have several SQL scripts, and many of them contain syntax such as GO, get some data from the server, print, and then move on to another SELECT statement. It appears as though multiple tables of data are being returned, and only the first result set is output to the file. For instance:

"--- WHEN SCRIPT RAN ---"
"Oct 30 2014 11:33:09:510AM"

There are about 30 more statements that run, but that's as far as the output gets. Do I need to handle the output differently? I've tried some alternate code that populates a data table and exports it to CSV, and I see many errors when it runs, such as "Incorrect syntax near 'GO'" and "Incorrect syntax near the keyword 'INSERT'."

Some of the scripts run and output to file perfectly fine, but a few are causing issues. Any insight is appreciated! Here's the code I was referring to, though if I could use the simplified code instead, that would be much better. I'd be more than happy to open a new question as well.

push-location;
import-module sqlps;
Pop-Location;

$OutputPath = "C:\SQLScripts\Output"
$SQLScriptsPath = 'C:\SQLScripts\SQLScripts'
$ServerList = "C:\SQLScripts\ServerNames.txt"

Get-ChildItem $SqlScriptsPath -Filter *.sql | Sort-Object Name | ForEach-Object {
  $File = $_
  Get-Content $ServerList | ForEach-Object {
    $Server = $_

    # A directory and file based on the server name from the text file
    New-Item "$OutputPath\$Server" -ItemType Directory -Force | Out-Null
    $OutputFile = "$OutputPath\$Server\$($File.BaseName).csv"
    try {
    # Connect to SQL and query data, extract data to SQL Adapter
    $SqlQuery = Get-Content $File.FullName -Raw
    $SqlConnection = New-Object System.Data.SqlClient.SqlConnection
    $SqlConnection.ConnectionString = "Data Source=$Server;Integrated Security = True"
    $SqlCmd = New-Object System.Data.SqlClient.SqlCommand
    $SqlCmd.CommandText = $SqlQuery
    $SqlCmd.Connection = $SqlConnection
    $SqlAdapter = New-Object System.Data.SqlClient.SqlDataAdapter
    $SqlAdapter.SelectCommand = $SqlCmd
    $DataSet = New-Object System.Data.DataSet
    $nRecs = $SqlAdapter.Fill($DataSet)
    $nRecs | Out-Null
    # Grab the first result set (a DataTable) from the DataSet
    $objTable = $DataSet.Tables[0]
    # Export that DataTable to the CSV file (semicolon-delimited, quotes stripped)
    ($objTable | ConvertTo-Csv -Delimiter ";" -NoTypeInformation) -replace "`"", "" | Out-File -Force $OutputFile
    } catch {
      Write-Host $_.Exception.Message -ForegroundColor Red
    }
  }
}
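
On the "Incorrect syntax near 'GO'" errors: GO is not T-SQL, it is a batch separator understood by sqlcmd and SSMS, so System.Data.SqlClient rejects it. One rough workaround, assuming GO always sits on a line of its own, is to split the script text into batches before executing. A sketch, reusing the $File and $SqlConnection objects from the code above:

# Split the script on lines containing only GO and run each batch separately
$SqlScript = Get-Content $File.FullName -Raw
$Batches = $SqlScript -split '(?im)^\s*GO\s*$' | Where-Object { $_.Trim() }

$SqlConnection.Open()
foreach ($Batch in $Batches) {
    $SqlCmd = New-Object System.Data.SqlClient.SqlCommand($Batch, $SqlConnection)
    [Void]$SqlCmd.ExecuteNonQuery()   # swap in a data adapter here if the batch returns rows
}
$SqlConnection.Close()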


Are the multiple SELECT queries in here?
    $SqlQuery = Get-Content $File.FullName -Raw


I do have, in my library, a simplified SQL query function; you're welcome to try it out. It won't help much if there's more than one query at a time, but it does sort out problems with DataTables (although it also loses type information).
function Get-SqlData {
  # .SYNOPSIS
  #   Get information from an SQL server.
  # .DESCRIPTION
  #   Execute a SELECT query against an SQL database defined by a connection string. 
  # .PARAMETER ConnectionString
  #   An SQL connection string.
  # .PARAMETER Query
  #   A SQL query.
  # .INPUTS
  #   System.String
  # .OUTPUTS
  #   System.Object[]
  # .EXAMPLE
  #   C:\PS>$SqlConnectionString = 'Data Source=Server;Initial Catalog=Database;Integrated Security=SSPI;'
  #   C:\PS>$SqlQuery = "SELECT * FROM Table"
  #   C:\PS>Get-SqlData -ConnectionString $SqlConnectionString -Query $SqlQuery
  
  [CmdLetBinding()]
  param(
    [String]$ConnectionString,
    
    [String]$Query
  )

  $SqlConnection = New-Object Data.SqlClient.SqlConnection($ConnectionString)
  $SqlConnection.Open()

  $SqlDataAdapter = New-Object Data.SqlClient.SqlDataAdapter($Query, $SqlConnection)

  $DataTable = New-Object Data.DataTable
  [Void]$SqlDataAdapter.Fill($DataTable)

  $SqlConnection.Close()

  # Extract fields from the DataTable, then forcefully convert the table to PsCustomObject
  # this changes fields set to DBNull to Null and vastly simplifies checking later
  $DataTable | Select-Object * -Exclude Row*, Table, ItemArray, HasErrors | ConvertTo-Csv | ConvertFrom-Csv
}
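
In the context of the earlier loop, it might be called roughly like this (a sketch only; $Server, $File and $OutputFile are the loop variables from the previous code blocks):

# Hypothetical call inside the existing per-server, per-script loop
$ConnectionString = "Data Source=$Server;Integrated Security=SSPI;"
Get-SqlData -ConnectionString $ConnectionString -Query (Get-Content $File.FullName -Raw) |
    Export-Csv $OutputFile -NoTypeInformation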


Chris
Yes sir - there are multiple SELECT queries in some of the SQL scripts. Thank you! I'll give your code a shot!

Update: same results - there are several SELECT ... AS statements that seem to be outputting only a header. Thank you for the help! I think we may be able to make this work by just placing each query into its own SQL file.
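
If splitting every query into its own file turns out to be too much work, another possibility (a sketch, assuming GO separators are handled first, for example by the batch-splitting approach above) is to fill a DataSet and write each returned result set to its own CSV rather than only Tables[0]:

# Sketch: one CSV per result set, named scriptname.1.csv, scriptname.2.csv, ...
# Assumes $Server, $File and $OutputPath are set as in the earlier loop.
$SqlConnection = New-Object System.Data.SqlClient.SqlConnection("Data Source=$Server;Integrated Security=True")
$SqlCmd = New-Object System.Data.SqlClient.SqlCommand((Get-Content $File.FullName -Raw), $SqlConnection)
$SqlAdapter = New-Object System.Data.SqlClient.SqlDataAdapter($SqlCmd)

$DataSet = New-Object System.Data.DataSet
[Void]$SqlAdapter.Fill($DataSet)

$i = 1
foreach ($Table in $DataSet.Tables) {
    $Table | Select-Object * -ExcludeProperty Row*, Table, ItemArray, HasErrors |
        Export-Csv "$OutputPath\$Server\$($File.BaseName).$i.csv" -NoTypeInformation
    $i++
}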