Solved

Passing Parameter To Oracle Procedure From UNIX

Posted on 2007-11-26
10
Medium Priority
3,066 Views
Last Modified: 2013-12-07
Hi, I am reading lines from a file and passing each one as a parameter to a procedure that inserts the data into my DB. Each line of data is about 800 KB long.

Each line from my file is processed as  "cmd.sh <data line from file>"
The content of cmd.sh is "sqlplus -S login/pwd@sid @my.sql $1"
The content of my.sql is "execute my_proc(&1); exit"

This works fine if the data passed is not greater than roughly 270 KB; beyond that I get an error stating the data is too big. If I cut and paste my data into my.sql, replacing "&1" with a quoted literal string of about 800 KB, it works fine, so there is no limit problem with the IN parameter declared in the procedure.

Any ideas on how else I can pass this data as a parameter to my procedure?
Question by:adlikon
10 Comments
 
LVL 74

Expert Comment

by:sdstuber
ID: 20353902
Is it an Oracle error or a shell error?
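One quick way to tell the two apart (assuming a POSIX-ish shell with getconf; the /tmp path and the 800 KB size are just for illustration) is to compare the kernel's argument-size limit against the length of one data line:

```shell
# Build a fake 800 KB line (stand-in for the real data).
head -c 800000 /dev/zero | tr '\0' 'X' > /tmp/sample_line.txt

arg_max=$(getconf ARG_MAX)               # kernel limit on exec() arguments
line_len=$(wc -c < /tmp/sample_line.txt)
echo "ARG_MAX=$arg_max  line length=$line_len"

if [ "$line_len" -ge "$arg_max" ]; then
    echo "the shell itself would reject this as an argument"
else
    echo "the shell can pass it; the limit is likely inside sqlplus"
fi
```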
 
LVL 23

Expert Comment

by:David
ID: 20354036
This doesn't address your specific question, but have you ruled out traditional ETL such as SQL*Loader?  Alternatively, defining your source as an external table?
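To make the SQL*Loader route concrete, a minimal control file for loading one line per row might look like this (table, column, and file names are all hypothetical; CHAR needs an explicit size well above the default to accept 800 KB lines):

```
-- load_lines.ctl (hypothetical)
LOAD DATA
INFILE 'data.txt'
APPEND
INTO TABLE staging_lines
TRAILING NULLCOLS
( line_data CHAR(1000000) )
```

It would then be run with something like `sqlldr userid=login/pwd@sid control=load_lines.ctl`.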
 
LVL 18

Expert Comment

by:rbrooker
ID: 20354925
an external table is probably the best approach for you...
using the file as a template, you can create a table based on that file.  the table then reads the file at query time, whenever you select from it.

the following example reads the alert log into the database...  you can change this to read the csv file you want loaded.

good luck :)
CREATE TABLE "<OWNER>"."ALERT_LOG"
   ( "LOG_LINE" VARCHAR2(4000)
   )
   ORGANIZATION EXTERNAL
   ( TYPE ORACLE_LOADER
     DEFAULT DIRECTORY "EXTERNAL_TABLES"
     ACCESS PARAMETERS
     ( RECORDS DELIMITED BY NEWLINE
       BADFILE external_tables:'alert_log_<DB>.bad'
       DISCARDFILE external_tables:'alert_log_<DB>.dsc'
       LOGFILE external_tables:'alert_log_<DB>.log'
       SKIP 0
       FIELDS TERMINATED BY ','
       OPTIONALLY ENCLOSED BY '"' AND '"'
       MISSING FIELD VALUES ARE NULL
       REJECT ROWS WITH ALL NULL FIELDS
       ( log_line CHAR )
     )
     LOCATION
     ( "BDUMP_DIR":'alert_<DB>.log'
     )
   )
   REJECT LIMIT UNLIMITED;
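once the table exists, loading becomes a plain query; a sketch of feeding each line to the procedure (names follow the example above, and this assumes the procedure accepts one line at a time):

```sql
-- Read every line of the external table and hand each one to the
-- procedure; no OS-level parameter passing is involved.
BEGIN
  FOR r IN (SELECT log_line FROM alert_log) LOOP
    my_proc(r.log_line);
  END LOOP;
END;
/
```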



 

Author Comment

by:adlikon
ID: 20356427
Hi sdstuber, the problem is reported by sqlplus, so this must be a limit on the &1 substitution variable.
 

Author Comment

by:adlikon
ID: 20356571
Hi guys, I should have explained that the real data I want to capture comes from a network broadcast; the goal is to have a Java JMX API that captures the subscribed data and then simply calls my stored procedure to insert it into the database. At the moment I have an example dump of this data in a flat file that I am manipulating and extracting via a Perl script which calls my stored procedure; this is just a proof of concept to get my procedure working. As you can see, the external table solution will not solve my problem :-(

My procedure works fine if the string I pass is not over the limit of the "&1" parameter, but the majority of the data I have to process is about 800 KB per line. Seeing that I cannot use the &1 approach, is there another way of passing this data into the procedure? Can I, for example, push it into a UNIX environment variable and read that?

The only solution I see at the moment is to actually generate a SQL file with the content "execute my_proc(<data>); exit" for each line I read. I know this will work, but it is an additional clunky step which I do not like.
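For what it's worth, that clunky step is only a few lines of shell (file names here are made up, and the sqlplus call is left commented out; note that any single quotes inside the data would still need doubling before being embedded in the literal):

```shell
#!/bin/sh
# Sketch of the "generate a SQL file per line" workaround.
infile=/tmp/data.txt
printf 'first-line-of-data\nsecond-line-of-data\n' > "$infile"  # stand-in data

while IFS= read -r line; do
    sqlfile=/tmp/one_line.sql
    {
        printf "execute my_proc('%s');\n" "$line"
        echo "exit"
    } > "$sqlfile"
    # sqlplus -S login/pwd@sid @"$sqlfile"
done < "$infile"

cat "$sqlfile"   # shows the last generated script
```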

Any other ideas ?
 
LVL 18

Expert Comment

by:rbrooker
ID: 20359460
can you create a script that inserts the data into a table and then calls the procedure, which has been changed to look in that table?  that way you can still use the same method of capturing the data, but there is a two-step process to process it:
1/. place the data in a clob column, recording the rowid in a variable
2/. process the data using the rowid as the parameter to the procedure.  the procedure looks up the value at that rowid.
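a sketch of those two steps, with every table and procedure name invented for illustration:

```sql
-- Step 1: stage the line in a CLOB column, capturing the rowid.
CREATE TABLE staging_data (payload CLOB);

-- Step 2: pass only the small rowid; the procedure reads the CLOB itself.
DECLARE
  v_rowid ROWID;
BEGIN
  INSERT INTO staging_data (payload)
  VALUES (:big_line)              -- bound from the caller, not a literal
  RETURNING rowid INTO v_rowid;
  my_proc_by_rowid(v_rowid);      -- looks up payload at that rowid
END;
/
```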
 

Author Comment

by:adlikon
ID: 20366588
Hi rbrooker,

That might work for my current proof of concept but when I go to production I will have a Java API to feed the data.

If my understanding is correct, due to parameter limitations with Oracle procedures & sqlplus I have 3 options:
1) The API generates & runs a sql file for each line of data in the stream read.
    This will contain "exec proc (<..data..>); exit"
2) Drop the procedure idea, use JDBC instead from my Java API to insert the data into the db.
3) Have the Java API dump to a flat file and use an Oracle external table to read the flat file, passing this data to the procedure as a parameter.

Are these my options?
 
LVL 18

Accepted Solution

by:
rbrooker earned 2000 total points
ID: 20367443
they seem like the easiest.

point 2 sounds like what i would use, with a few alterations.  use jdbc to insert the data into a table with two columns: one for the line of data from the java feed, and another saying when it was processed.  alter the procedure to loop through all records with a null processed date, updating the processed date when complete.  schedule the procedure to run every minute or so...  you could also hold a rolling 7 days of data so you can report on numbers, data throughput etc...
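a sketch of that design (all names here are hypothetical):

```sql
-- Staging table: raw line plus a processed timestamp.
CREATE TABLE feed_lines (
  line_data    CLOB,
  processed_at DATE            -- NULL until the procedure handles it
);

-- Handle everything not yet processed, then mark it done.
CREATE OR REPLACE PROCEDURE process_feed AS
BEGIN
  FOR r IN (SELECT rowid rid, line_data
              FROM feed_lines
             WHERE processed_at IS NULL) LOOP
    my_proc(r.line_data);                -- the existing insert logic
    UPDATE feed_lines
       SET processed_at = SYSDATE
     WHERE rowid = r.rid;
  END LOOP;
  COMMIT;
END;
/
-- Schedule with DBMS_SCHEDULER (or DBMS_JOB on older releases) to run
-- every minute or so.
```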

with external tables, you run the risk of reprocessing lines, or of not all lines having been processed when the file gets overwritten with a new one.

just random thoughts.
 

Author Closing Comment

by:adlikon
ID: 31411095
Hi, I didn't get a technical solution that I hadn't thought of, but you did suggest useful ideas and confirmed my options, which is just as important - thanks.
 

Author Comment

by:adlikon
ID: 20372959
I have another possible workaround. If the "&" substitution variable is limited to, say, 270 KB and I want to pass about 800 KB, I can split my data into, for example, 4 chunks, pass them as 4 parameters "&1, &2, &3, &4", and then just concatenate them all together in my proc.
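A quick sketch of the chunking on the shell side, with the sizes and names assumed (four 200000-character pieces of an 800000-character line; the sqlplus call is commented out):

```shell
#!/bin/sh
# Split one big line into four chunks and pass them as four parameters.
head -c 800000 /dev/zero | tr '\0' 'X' > /tmp/big_line.txt   # fake data
line=$(cat /tmp/big_line.txt)

c1=$(printf '%s\n' "$line" | cut -c1-200000)
c2=$(printf '%s\n' "$line" | cut -c200001-400000)
c3=$(printf '%s\n' "$line" | cut -c400001-600000)
c4=$(printf '%s\n' "$line" | cut -c600001-800000)

# sqlplus -S login/pwd@sid @my.sql "$c1" "$c2" "$c3" "$c4"
# ...where my.sql would contain:  execute my_proc(&1, &2, &3, &4); exit

printf '%s%s%s%s' "$c1" "$c2" "$c3" "$c4" | wc -c   # should match the original
```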
