SQL*Loader versus UTL_FILE for reading several files at one time: which should I use?

Hi experts,

I have an important task and need help quickly.
The operating system is UNIX. From a defined directory I need to read an arbitrary number of files and load every row from every file that exists in the directory. I know how to do this with UTL_FILE, but if the users want SQL*Loader as the loading tool, how do I write the program so it loads from several files at one time? I only know how to load one file into a table. Is it possible to write a program that reads all the files in the directory?
If so, what would that program look like in general? If you know how to write such a program, please also explain which method you recommend for loading the data, SQL*Loader or UTL_FILE: which is better at controlling bad rows, and which is more efficient and why. If it is not possible to write the program I asked for and you explain why, I will regard that as an answer too. SQL*Loader would run from Oracle Applications.
Thanks in advance!
I would also go for the SQL*Loader solution - it should be much faster than UTL_FILE.
If you want to load all the files from a directory without knowing their names or how many there are, I think the easiest way is to concatenate them into a single file and load that. The name of the file to be loaded is then always the same, so you can keep a simple, fixed control file. Concatenating the files takes a single command:
cat /<directory_path>/* > file_name
For example (note that the output file should be written outside the input directory, otherwise a later run would concatenate the output file back into itself):
cat /u80/home/input_files/* > /u80/home/input_file.dat

Then in the control file you'll have INFILE '/u80/home/input_file.dat'. You just need to run the unix command before running SQL*Loader.
If the files have different formats and/or headers, you'll need to write a script to concatenate them manually.
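As a self-contained sketch of the concatenate step (the scratch directory and file names here are invented for the demo; in practice the source directory would be your real input directory), including a simple row-count check that addresses the worry about the command silently combining only some of the files:

```shell
#!/bin/sh
# Demo: concatenate every file in a directory into one load file,
# then verify no rows were lost by comparing line counts.
# All paths below are scratch files created just for this demo.
SRC_DIR=$(mktemp -d)
printf 'row1\nrow2\n' > "$SRC_DIR/file_a.dat"
printf 'row3\n'       > "$SRC_DIR/file_b.dat"

# Write the combined file OUTSIDE the source directory, otherwise a
# second run would concatenate the output file back into itself.
OUT=$(mktemp)
cat "$SRC_DIR"/* > "$OUT" || { echo "cat failed" >&2; exit 1; }

# Sanity check: total lines across the sources must equal the output.
src_lines=$(cat "$SRC_DIR"/* | wc -l)
out_lines=$(wc -l < "$OUT")
test "$src_lines" -eq "$out_lines" || { echo "row count mismatch" >&2; exit 1; }
echo "combined $out_lines rows into $OUT"
```

The row-count comparison is cheap insurance: if cat failed partway (for example on a full filesystem), the counts would differ and the script would stop before SQL*Loader ever ran.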
Hope that helps!

yeitanAuthor Commented:
Hi pennnn,

Your idea is nice and I will consider it, but there are some problems with it.
If the files are big, is there any problem with combining them into one file? It sounds risky. What if the command combines only some of the files instead of all of them? How would I know about it? Maybe that can't happen; I'm just asking.
When SQL*Loader fails to load some rows, it writes them to a file with the extension .bad, and I would have to try loading those bad rows again in the next loading process. That is difficult to control. How can I load only the rows that had a problem in the next loading process, without the rows that were OK?
When I work with UTL_FILE, if there is an exception I can stop loading, mark or single out the problem file, and leave it in the directory so that (after I correct it) it can be loaded again in the next loading process. (I will think about how to deal with the rows from that file that were inserted without problems the first time, since I don't want duplicate data.)
All the other files (with no exceptions whatsoever) will be removed from the directory so they are not loaded again. Do you understand me? With your idea I don't have all that control. This program loads tables for an interface, a process that might run several times a day. So do you still think your idea is best for our needs?

I'm sorry I didn't get back to you earlier, but I was away for a while.
First, I have to say that the best solution is the one that suits YOU best. If you feel more comfortable with UTL_FILE, then you can try creating a high-level prototype using UTL_FILE and see if it shows acceptable performance.
Now, if you want to try the SQL*Loader/unix-script solution, there are several things that can help you.
You can always check in a shell script whether the last command was successful by testing the $? variable. Example:
if test $? -ne 0
then
   echo "ERROR"
fi

Alternatively, to avoid the problem with big files, you can process each file separately, using a control-file template that is updated for each file that needs to be loaded.
Here's a high-level algorithm (might have syntax errors):
1) Create a control-file template that contains the following (or similar) line for the input file (the string INPUT_FILE_PLACEHOLDER is just an example; you just have to use the same string later):

INFILE 'INPUT_FILE_PLACEHOLDER'

The rest of the control file can be whatever you want it to be; only the INFILE line has to be as shown above.
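For illustration, a minimal template along those lines might look like this (the table name, column names, and delimiter are hypothetical; only the INFILE placeholder line matters for this technique):

```
LOAD DATA
INFILE 'INPUT_FILE_PLACEHOLDER'
APPEND
INTO TABLE my_interface_table
FIELDS TERMINATED BY ','
(col1, col2, col3)
```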
2) The shell script would follow this algorithm ($TEMPLATE is the template from step 1; sed fills in the placeholder for each file):
for file in /your_path/*
do
   sed "s|INPUT_FILE_PLACEHOLDER|$file|" $TEMPLATE > $CONTROLFILE
   $ORACLE_HOME/bin/sqlldr userid=username/password control=$CONTROLFILE
   if test $? -ne 0; then
      echo "ERROR in file "$file >> $yourlogfile
   fi
done

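The placeholder substitution from that loop can be demonstrated on its own (the template contents and the input file name below are invented for the demo; no sqlldr run is involved):

```shell
#!/bin/sh
# Demo: build a control file for one input file by substituting the
# placeholder in a template, as the loop above would do before sqlldr.
TEMPLATE=$(mktemp)
cat > "$TEMPLATE" <<'EOF'
LOAD DATA
INFILE 'INPUT_FILE_PLACEHOLDER'
INTO TABLE my_interface_table
EOF

file=/u80/home/input_files/batch_01.dat    # hypothetical data file
CONTROLFILE=$(mktemp)
sed "s|INPUT_FILE_PLACEHOLDER|$file|" "$TEMPLATE" > "$CONTROLFILE"

# The generated control file now points at the real file name:
grep "INFILE" "$CONTROLFILE"
```

Using `|` as the sed delimiter avoids clashing with the `/` characters in the file path.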
Let me know if it's not clear enough...
Hope that helps!
yeitanAuthor Commented:
Hi pennnn... the solution I picked was like your first comment: I used SQL*Loader, and inside the shell script I used the cat command you gave me, which works very well (it combined 70 files together with no problem)!
