Solved

Passing Parameter To Oracle Procedure From UNIX

Posted on 2007-11-26
Medium Priority
3,071 Views
Last Modified: 2013-12-07
Hi, I am reading lines from a file and passing each one as a parameter to a procedure which inserts the data into my db. Each line is about 800K of data.

Each line from my file is processed as "cmd.sh <data line from file>"
The content of cmd.sh is "sqlplus -S login/pwd@sid @my.sql $1"
The content of my.sql is "execute my_proc(&1); exit"

This works fine if the data passed is not greater than about 270K; if it is, I get an error stating the data is too big. If I cut and paste my data into my.sql, replacing "&1" with a literal quoted string of about 800K of data, it works fine, so there is no size limit problem with the IN parameter declared in the procedure.

Any ideas on how else I can pass this data as a parameter to my procedure?
Question by:adlikon
10 Comments
 
LVL 74

Expert Comment

by:sdstuber
ID: 20353902
Is it an Oracle error or a shell error?
 
LVL 23

Expert Comment

by:David
ID: 20354036
This doesn't address your specific question, but have you ruled out traditional ETL such as SQL*Loader? Alternatively, have you considered defining your source as an external table?
 
LVL 18

Expert Comment

by:rbrooker
ID: 20354925
An external table is probably the best approach for you...
Using the file as a template, you can create a table based on that file; Oracle then reads the file each time you select from it.

The following example reads the alert log into the database... you can change this to read the csv file you want loaded.

good luck :)
CREATE TABLE "<OWNER>"."ALERT_LOG"
   ( "LOG_LINE" VARCHAR2(4000)
   )
   ORGANIZATION EXTERNAL
   ( TYPE ORACLE_LOADER
     DEFAULT DIRECTORY "EXTERNAL_TABLES"
     ACCESS PARAMETERS
     ( RECORDS DELIMITED BY NEWLINE
       BADFILE external_tables:'alert_log_<DB>.bad'
       DISCARDFILE external_tables:'alert_log_<DB>.dsc'
       LOGFILE external_tables:'alert_log_<DB>.log'
       SKIP 0
       FIELDS TERMINATED BY ','
       OPTIONALLY ENCLOSED BY '"' AND '"'
       MISSING FIELD VALUES ARE NULL
       REJECT ROWS WITH ALL NULL FIELDS
       ( log_line CHAR
       )
     )
     LOCATION
     ( "BDUMP_DIR":'alert_<DB>.log'
     )
   )
   REJECT LIMIT UNLIMITED;

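For illustration, a minimal sketch (not from the original comment) of how the external table above could drive the procedure from the question; the ALERT_LOG name and my_proc come from the thread, the loop itself is an assumption:

-- Read each line from the external table and hand it to the
-- poster's procedure; Oracle re-reads the file on every SELECT.
BEGIN
  FOR r IN (SELECT log_line FROM alert_log) LOOP
    my_proc(r.log_line);  -- my_proc is the procedure from the question
  END LOOP;
END;
/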


Author Comment

by:adlikon
ID: 20356427
Hi sdstuber, the problem is reported by SQL*Plus, so this must be a limit on the &1 substitution parameter.
 

Author Comment

by:adlikon
ID: 20356571
Hi guys, I should have explained that the real data I would like to capture comes from a network broadcast; the goal is to have a Java JMX API that captures the subscribed data and then simply calls my stored procedure to insert it into the database. At the moment I have an example dump of this data in a flat file that I am manipulating and extracting via a Perl script which calls my stored procedure; this is just a proof of concept to get my procedure working. As you can see, the external table solution will not solve my problem :-(

My procedure works fine if the string I pass is not over the limit of the "&1" parameter, but the majority of the data I have to process is about 800K per line. Seeing that I cannot use the &1 approach, is there another way of passing this data into the procedure? Can I, for example, push it to a UNIX environment variable and read that?

The only solution I see at the moment is to generate an SQL file with the content "execute my_proc(<data>); exit" for each line I read. I know this will work, but it is an additional clunky step which I do not like.

Any other ideas?
 
LVL 18

Expert Comment

by:rbrooker
ID: 20359460
Can you create a script that inserts the data into a table and then calls the procedure, changed to look in that table? That way you can still use the same method of capturing the data, but with a two-step process to process it:
1/. place the data in a CLOB column, recording the rowid in a variable
2/. process the data using the rowid as the parameter to the procedure; the procedure looks up the value at that rowid.
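A minimal sketch of that two-step flow, using hypothetical names (a DATA_STAGE staging table and a reworked MY_PROC_BY_ROWID procedure, neither of which appears in the thread):

-- Hypothetical staging table: one CLOB per captured line.
CREATE TABLE data_stage (line_data CLOB);

DECLARE
  l_line  CLOB := 'captured line of data goes here';  -- placeholder
  l_rowid ROWID;
BEGIN
  -- Step 1: place the data in a CLOB column, recording the rowid.
  INSERT INTO data_stage (line_data)
  VALUES (l_line)
  RETURNING ROWID INTO l_rowid;

  -- Step 2: pass the rowid to the procedure, which looks up the
  -- value at that rowid instead of taking the data directly.
  my_proc_by_rowid(l_rowid);
END;
/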
 

Author Comment

by:adlikon
ID: 20366588
Hi rbrooker,

That might work for my current proof of concept, but when I go to production I will have a Java API feeding the data.

If my understanding is correct, due to parameter limitations with Oracle procedures and SQL*Plus, I have 3 options:
1) The API generates and runs an SQL file for each line of data read from the stream.
    This will contain "exec proc (<..data..>); exit"
2) Drop the procedure idea and use JDBC instead from my Java API to insert the data into the db.
3) Have the Java API dump to a flat file and use an Oracle external table to read the flat file, passing this data to the procedure as a parameter.

Are these my options?
 
LVL 18

Accepted Solution

by:
rbrooker earned 2000 total points
ID: 20367443
They seem like the easiest.

Point 2 sounds like what I would use, with a few alterations: use JDBC to insert the data into a table with 2 columns, one for the line of data from the Java feed and another saying when it was processed. Have the procedure altered to loop through all records with a null processed date, updating the processed date when complete. Schedule the procedure to run every minute or so... You could also hold a rolling 7 days of data so you can report on numbers, data throughput, etc.
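A minimal sketch of that design, with hypothetical names (a DATA_FEED table and a PROCESS_DATA_FEED wrapper; the scheduling uses DBMS_JOB, the usual choice at the time):

-- Hypothetical staging table fed by the JDBC inserts.
CREATE TABLE data_feed (
  line_data      CLOB,
  processed_date DATE    -- NULL until the row has been processed
);

-- Process every unprocessed row, stamping each one when done.
CREATE OR REPLACE PROCEDURE process_data_feed IS
BEGIN
  FOR r IN (SELECT rowid AS rid, line_data
              FROM data_feed
             WHERE processed_date IS NULL) LOOP
    my_proc(r.line_data);   -- the poster's existing insert procedure
    UPDATE data_feed
       SET processed_date = SYSDATE
     WHERE rowid = r.rid;
  END LOOP;
  COMMIT;
END;
/

-- Run it every minute or so.
VARIABLE jobno NUMBER
BEGIN
  DBMS_JOB.SUBMIT(:jobno, 'process_data_feed;', SYSDATE, 'SYSDATE + 1/1440');
  COMMIT;
END;
/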

With external tables, you run the risk of reprocessing lines, or of leaving lines unprocessed when the file gets overwritten with a new one.

Just random thoughts.
 

Author Closing Comment

by:adlikon
ID: 31411095
Hi, I didn't get a technical solution that I hadn't thought of, but you did suggest useful ideas and confirmed my options, which is just as important - thanks.
 

Author Comment

by:adlikon
ID: 20372959
I have another possible workaround. If the "&" substitution variable is limited to, say, 270K and I want to pass about 800K, I can split my data into, for example, 4 chunks, pass these as 4 parameters "&1, &2, &3, &4", and then just concatenate them all back together in my proc.
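A minimal sketch of that workaround; the 4-argument wrapper MY_PROC4 is hypothetical, and each &n is assumed to arrive already quoted from cmd.sh:

-- my.sql, reworked to take four substitution variables instead of one.
execute my_proc4(&1, &2, &3, &4)
exit

-- Hypothetical wrapper: reassemble the chunks and call the original
-- single-parameter procedure.
CREATE OR REPLACE PROCEDURE my_proc4 (
  p1 IN CLOB, p2 IN CLOB, p3 IN CLOB, p4 IN CLOB
) IS
BEGIN
  my_proc(p1 || p2 || p3 || p4);
END;
/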
