• Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 2086

Uploading Files to a Library in SharePoint 3.0 with PowerShell

Hi, I have trawled the net looking for ways to automate uploading .doc and .docx files from a folder to a SharePoint document library. Here is a script we used to use a while back, but it doesn't seem to work anymore, as I receive this error:

D:\Upload>powershell gci D:\Upload\*.doc | upload.ps1
New-Object : Exception calling ".ctor" with "1" argument(s): "The Web application at http://iccintranet/it/Knowledge%20Base could not be found. Verify that you have typed the URL correctly. If the URL should be serving existing content, the system administrator may need to add a new request URL mapping to the intended application."
At D:\Upload\upload.ps1:8 char:18
+  $site=new-object  <<<< Microsoft.SharePoint.SPSite($docliburl)
You cannot call a method on a null-valued expression.
At D:\Upload\upload.ps1:9 char:20
+  $web=$site.openweb( <<<< $relweburl)
You cannot call a method on a null-valued expression.
At D:\Upload\upload.ps1:10 char:24
+  $folder=$web.getfolder( <<<< $docliburl)
Cannot index into a null array.
At D:\Upload\upload.ps1:11 char:19
+  $list=$web.lists[$ <<<< folder.ContainingDocumentLibrary]
Get-Content : Cannot bind argument to parameter 'Path' because it is null.
At D:\Upload\upload.ps1:18 char:20
+  $bytes=get-content  <<<< $_ -encoding byte
You cannot call a method on a null-valued expression.
At D:\Upload\upload.ps1:20 char:19
+  $folder.files.Add( <<<< $_.Name,$bytes,$propbag, $true) > $null


Here is the code


If anyone has another example I could try, that would be great.
By the way, it was working on 32-bit Windows, and we are currently running 64-bit Server 2008.

Thanks



begin
{
 # Metadata properties applied to each uploaded file
 $propbag=@{ContentType="Document";
            MyCol="PowerShell About Docs"}
 $docliburl="http://iccintranet/it/Knowledge%20Base"
 $relweburl="/it/Knowledge%20Base"
 # Load the SharePoint object model (must run on the SharePoint server)
 [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint") > $null
 $site=New-Object Microsoft.SharePoint.SPSite($docliburl)
 $web=$site.OpenWeb($relweburl)
 $folder=$web.GetFolder($docliburl)
 $list=$web.Lists[$folder.ContainingDocumentLibrary]
}

process
{
 ## I expect FileInfo objects in the pipeline
 $bytes=Get-Content $_.FullName -Encoding Byte
 $bytes=[byte[]]$bytes
 $folder.Files.Add($_.Name,$bytes,$propbag,$true) > $null
}
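
For completeness, this is roughly how I'd expect to invoke it (the .docx pattern and running it in the 64-bit PowerShell console on the SharePoint server itself are assumptions on my part, since SPSite uses the local object model):

Get-ChildItem D:\Upload\*.doc, D:\Upload\*.docx | .\upload.ps1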


quihongCommented:
How about keeping it really simple and just using the copy command?

Something like:

copy *.doc \\server\sites\sitename\doclibname

The Web Client service needs to be running to be able to access the document lib via UNC, which is the default on workstation OSes like XP but is disabled on Server OSes.
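
If the service is missing or stopped on your server, something like this should take care of it (this assumes the standard WebClient service name; on Server 2008 you may first need to add the Desktop Experience feature to get the service at all):

Set-Service -Name WebClient -StartupType Automatic
Start-Service -Name WebClient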
 
IllawarraAuthor Commented:
Yeah, that works,
but what about permissions when they need to be applied, etc.?
 
quihongCommented:
Not sure I understand your question about permissions...
 
IllawarraAuthor Commented:
It's basically about uploading with a username and password, and carrying over the all-important metadata, because "apparently" the metadata gets messed up if the files are just copied to the UNC path?
 
quihongCommented:
Regarding "permissions", whatever is running the process will be the account used to upload the document. You can probably use a runas command if you need to specify a different account.

Metadata wasn't mentioned in your original question and wasn't part of the code snippet. :) Was your powershell script doing that?





 
IllawarraAuthor Commented:
Ah, I actually have no idea, as it was all working a couple of years ago. Since moving servers, my manager has told me to research a way to import metadata along with the documents, and I found this old script lying around.

Do you think you could point me in the direction of something that could be of use, please? :)
 
quihongCommented:
When you say "was all working", are you saying the script set the metadata also? How would the script know what field/value to set? Would love to see the script.

I think you should post your "upload document and set metadata" requirement as a separate question, so that you can get more participation.
 
rdcproCommented:
Usually when you're migrating documents, the metadata is all over the place.  I don't see how a PowerShell script would work, unless you created something for it to read the metadata from.  
If you're going to do that, why re-invent the wheel? I've used this tool in the past:
http://www.vyapin.com/products/sharepoint/moss-2007/dockit/sharepoint-2007-file-migration.htm
It can be used in a variety of ways, but I imported thousands of files for a client by creating a set of spreadsheets with the metadata for the files, and one column with the path to the file.  A simple DOS command got me a list of all the files in a form that worked, and I just pasted it in the spreadsheet.  
dir *.* /b /s > files.txt
Then based on where the document was stored, I set up metadata for each document.  Excel made it pretty easy to do, especially where one piece of metadata was repeated over a lot of lines.  
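If you'd rather build that starting file list in PowerShell, a rough sketch would be something like this (the folder path and metadata column names are just placeholders to fill in):

Get-ChildItem -Path D:\Docs -Recurse -Include *.doc,*.docx |
    Select-Object @{n='FilePath';e={$_.FullName}},
                  @{n='ContentType';e={''}},
                  @{n='MyCol';e={''}} |
    Export-Csv files.csv -NoTypeInformation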
Once I had that ready DocKIT ripped through the spreadsheet line-by-line, uploaded each file, updated the metadata, and then checked the file in.  
IMO, it's well worth the money, but you can always give the trial version a try.  We just had a vendor create something similar for Documentum, and it cost a bundle.
Regards,
Mike Sharp
