Solved

Passing Parameter To Oracle Procedure From UNIX

Posted on 2007-11-26
10
3,054 Views
Last Modified: 2013-12-07
Hi, I am reading lines from a file and passing each one as a parameter to a procedure which inserts the data into my db. The length of the data is about 800k.

Each line from my file is processed as  "cmd.sh <data line from file>"
The content of cmd.sh is "sqlplus -S login/pwd@sid @my.sql $1"
The content of my.sql is "execute my_proc(&1); exit"

This works fine if the data passed is not greater than roughly 270K; if it is larger, I get an error stating the data is too big. If I cut and paste my data into my.sql, replacing "&1" with a quoted literal string of about 800K of data, it works fine, so there is no limit problem with the IN parameter declared in the procedure.

Any ideas on how else I can pass this data as a parameter to my procedure ?
Question by:adlikon
10 Comments
 
LVL 74

Expert Comment

by:sdstuber
ID: 20353902
Is it an Oracle error or a shell error?
 
LVL 23

Expert Comment

by:David
ID: 20354036
This doesn't address your specific question, but have you ruled out traditional ETL such as SQL*Loader?  Alternatively, defining your source as an external table?
 
LVL 18

Expert Comment

by:rbrooker
ID: 20354925
an external table is probably the best approach for you...
using the file as a template, you can create a table definition based on that file.  the file is then read each time you select from the table.

the following example reads the alert log into the database...  you can change this to read the csv file you want loaded.

good luck :)
CREATE TABLE "<OWNER>"."ALERT_LOG"
   ( "LOG_LINE" VARCHAR2(4000)
   )
   ORGANIZATION EXTERNAL
   ( TYPE ORACLE_LOADER
     DEFAULT DIRECTORY "EXTERNAL_TABLES"
     ACCESS PARAMETERS
     ( RECORDS DELIMITED BY NEWLINE
       BADFILE external_tables:'alert_log_<DB>.bad'
       DISCARDFILE external_tables:'alert_log_<DB>.dsc'
       LOGFILE external_tables:'alert_log_<DB>.log'
       SKIP 0
       FIELDS TERMINATED BY ','
       OPTIONALLY ENCLOSED BY '"' AND '"'
       MISSING FIELD VALUES ARE NULL
       REJECT ROWS WITH ALL NULL FIELDS
       ( log_line CHAR
       )
     )
     LOCATION
     ( "BDUMP_DIR":'alert_<DB>.log'
     )
   )
   REJECT LIMIT UNLIMITED;



Author Comment

by:adlikon
ID: 20356427
Hi sdstuber, the problem is reported by sqlplus, so this must be a limit on the &1 parameter.
 

Author Comment

by:adlikon
ID: 20356571
Hi guys, I should have explained that the real data I would like to capture comes from a network broadcast; the goal is to have a Java JMX API that captures the subscribed data and then simply calls my stored procedure to insert it into the database. At the moment I have an example dump of this data in a flat file that I am manipulating & extracting via a perl script which calls my stored procedure; this is just a proof of concept to get my procedure working. As you can see, the external table solution will not solve my problem :-(

My procedure works fine if the string I pass is not over the limit of the "&1" parameter, but the majority of the data I have to process is about 800K per line. So seeing that I cannot use the &1 approach, is there another way of passing this data into the procedure? Can I, for example, push it into a UNIX environment variable and read that?

The only solution I see at the moment is to actually generate a sql file with the content "execute my_proc(<data>); exit" for each line I read. I know this will work, but it is an additional clunky step which I do not like.

Any other ideas ?
 
LVL 18

Expert Comment

by:rbrooker
ID: 20359460
can you create a script that inserts the data into a table and then calls the procedure, which has been changed to look in the table?  that way you can still use the same method of capturing the data, but there is a two step process to process it:
1/. place the data in a clob column, recording the rowid in a variable
2/. process the data using the rowid as the parameter to the procedure.  the procedure looks up the value at that rowid.
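a minimal sketch of that two-step idea (the table, column and procedure names here are all made up, and you would change my_proc to take a rowid):

```sql
-- staging table: one CLOB per captured line (names are illustrative)
CREATE TABLE data_stage (payload CLOB);

DECLARE
   v_rowid ROWID;
BEGIN
   -- step 1: create the row and remember where it went
   INSERT INTO data_stage (payload)
   VALUES (EMPTY_CLOB())
   RETURNING ROWID INTO v_rowid;
   -- append the ~800K of data to the CLOB here,
   -- e.g. piecewise via DBMS_LOB.WRITEAPPEND

   -- step 2: hand only the rowid to the procedure,
   -- which reads the CLOB for itself
   my_proc(v_rowid);
END;
/
```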
 

Author Comment

by:adlikon
ID: 20366588
Hi rbrooker,

That might work for my current proof of concept but when I go to production I will have a Java API to feed the data.

If my understanding is correct, due to parameter limitations with Oracle procedures & sqlplus I have 3 options:
1) The API generates & runs a sql file for each line of data in the stream read.
    This will contain "exec proc (<..data..>); exit"
2) Drop the procedure idea, use JDBC instead from my Java API to insert the data into the db.
3) Get the Java API to dump to a flat file and use an Oracle External table connection to read the flat file passing this data to the procedure as a parameter.

Are these my options ?
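For what it's worth, option 1 can be sketched in a few lines of shell (my_proc, the connect string and the file path are placeholders, not my real names):

```shell
#!/bin/sh
# Sketch of option 1: generate a throwaway SQL script per data line,
# embedding the data as a quoted literal instead of passing it through &1.
# my_proc, login/pwd@sid and /tmp/run_once.sql are placeholders.
DATA_LINE="$1"

# double any single quotes so the embedded literal stays valid SQL
ESCAPED=$(printf '%s' "$DATA_LINE" | sed "s/'/''/g")

cat > /tmp/run_once.sql <<EOF
execute my_proc('${ESCAPED}');
exit
EOF

# sqlplus -S login/pwd@sid @/tmp/run_once.sql
```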
 
LVL 18

Accepted Solution

by:
rbrooker earned 500 total points
ID: 20367443
they seem like the easiest.

point 2 sounds like what i would use with a few alterations. use jdbc to insert the data into a table with 2 columns, a column for the line of data from the java feed and another column saying when it was processed.  have the procedure altered to loop through all records with a null processed date, updating the processed date when complete.  schedule the procedure to run every minute or so...  you could also hold a rolling 7 days of data so you can report on numbers, data throughput etc...
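that staging-table variant of option 2 might look something like this (every name here is invented, and my_proc stands in for your existing insert logic):

```sql
-- feed table written by the java side via jdbc (names are illustrative)
CREATE TABLE feed_lines (
   line_data     CLOB,
   processed_at  DATE      -- NULL until the procedure has handled the row
);

-- scheduled procedure: process everything not yet marked as done
CREATE OR REPLACE PROCEDURE process_feed IS
BEGIN
   FOR r IN (SELECT ROWID rid, line_data
               FROM feed_lines
              WHERE processed_at IS NULL) LOOP
      my_proc(r.line_data);              -- existing insert logic
      UPDATE feed_lines
         SET processed_at = SYSDATE
       WHERE ROWID = r.rid;
   END LOOP;
   COMMIT;
END;
/
```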

with external tables, you run the risk of reprocessing lines, or of losing lines when the file gets overwritten with a new one before all of them have been processed.

just random thoughts.
 

Author Closing Comment

by:adlikon
ID: 31411095
Hi, I didn't get a technical solution that I hadn't thought of, but you did suggest useful ideas and confirmed my options, which is just as important - thanks.
 

Author Comment

by:adlikon
ID: 20372959
I have another possible workaround. If the "&" substitution variable is limited to say 270K and I want to pass about 800K, I can split my data up into, for example, 4 chunks, pass them as 4 parameters "&1, &2, &3, &4" and then just concat them all together in my proc.
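A rough sketch of that chunking idea in shell (the 4-way split and the my.sql contents are illustrative only):

```shell
#!/bin/sh
# Sketch of the chunking workaround: split one long line into four pieces,
# each below the ~270K limit, and pass them as &1..&4.
LINE="${1:-sample-data-line}"   # fallback sample so the sketch runs standalone
CHUNK=$(( (${#LINE} + 3) / 4 )) # ceiling division: 4 chunks always cover LINE

P1=$(printf '%s' "$LINE" | cut -c1-$CHUNK)
P2=$(printf '%s' "$LINE" | cut -c$((CHUNK + 1))-$((2 * CHUNK)))
P3=$(printf '%s' "$LINE" | cut -c$((2 * CHUNK + 1))-$((3 * CHUNK)))
P4=$(printf '%s' "$LINE" | cut -c$((3 * CHUNK + 1))-)

# sqlplus -S login/pwd@sid @my.sql "$P1" "$P2" "$P3" "$P4"
# where my.sql would contain something like:
#   execute my_proc('&1' || '&2' || '&3' || '&4'); exit
```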



