Solved

Passing Parameter To Oracle Procedure From UNIX

Posted on 2007-11-26
3,052 Views
Last Modified: 2013-12-07
Hi, I am reading lines from a file and passing each one as a parameter to a procedure which inserts the data into my db. Each line of data is about 800K long.

Each line from my file is processed as  "cmd.sh <data line from file>"
The content of cmd.sh is "sqlplus -S login/pwd@sid @my.sql $1"
The content of my.sql is "execute my_proc(&1); exit"

This works fine if the data passed is not greater than about 270K; if it is, I get an error stating the data is too big. If I cut and paste my data into my.sql, replacing "&1" with a literal quoted string of about 800K of data, it works fine, so there is no limit problem with the IN parameter declared in the procedure.

Any ideas on how else I can pass this data as a parameter to my procedure?
Question by:adlikon
10 Comments

Expert Comment
by:sdstuber
ID: 20353902
Is it an oracle error or a shell error?

Expert Comment
by:David
ID: 20354036
This doesn't address your specific question, but have you ruled out traditional ETL such as SQL*Loader?  Alternatively, defining your source as an external table?

Expert Comment
by:rbrooker
ID: 20354925
an external table is probably the best approach for you...
using the file as a template, you can create a table based on that file.  the table then reads the file each time you select from it.

the following example reads the alert log into the database...  you can change this to read the csv file you want loaded.

good luck :)
CREATE TABLE "<OWNER>"."ALERT_LOG"
   ( "LOG_LINE" VARCHAR2(4000)
   )
   ORGANIZATION EXTERNAL
    ( TYPE ORACLE_LOADER
      DEFAULT DIRECTORY "EXTERNAL_TABLES"
      ACCESS PARAMETERS
      ( RECORDS DELIMITED BY NEWLINE
        BADFILE external_tables:'alert_log_<DB>.bad'
        DISCARDFILE external_tables:'alert_log_<DB>.dsc'
        LOGFILE external_tables:'alert_log_<DB>.log'
        SKIP 0
        FIELDS TERMINATED BY ','
        OPTIONALLY ENCLOSED BY '"' AND '"'
        MISSING FIELD VALUES ARE NULL
        REJECT ROWS WITH ALL NULL FIELDS
        ( log_line    CHAR
        )
      )
      LOCATION
       ( "BDUMP_DIR":'alert_<DB>.log'
       )
    )
   REJECT LIMIT UNLIMITED


Author Comment
by:adlikon
ID: 20356427
Hi sdstuber, the problem is reported by sqlplus; this must be a limit with the &1 parameter.

Author Comment
by:adlikon
ID: 20356571
Hi guys, I should have explained that the real data I would like to capture is from a network broadcast; the goal is to have a Java JMX API that captures the subscribed data and then simply calls my stored procedure to insert it into the database. At the moment I have an example dump of this data in a flat file that I am manipulating & extracting via a perl script which calls my stored procedure - this is just a proof of concept to get my procedure working. As you can see, the external table solution will not solve my problem :-(

My procedure is working fine if the string I pass is not over the limit of the "&1" parameter, but the majority of the data I have to process is about 800K per line. So, seeing that I cannot use the &1 approach, is there another way of passing this data into the procedure? Can I, for example, push it to a UNIX environment variable and read that?

The only solution I see at the moment is to actually generate an sql file with the content "execute my_proc(<data>); exit" for each line I read. I know this will work, but it is an additional clunky step which I do not like.

Any other ideas?

Expert Comment
by:rbrooker
ID: 20359460
can you create a script that inserts the data into a table and then calls the procedure, which has been changed to look in a table?  that way you can still use the same method of capturing the data, but there is a two step process to process it (a rough sketch follows below):
1/. place the data in a clob column, recording the rowid in a variable
2/. process the data using the rowid as the parameter to the procedure.  the procedure looks up the value at that rowid.
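
something along these lines - an untested sketch, where the table name stage_lines and the reworked my_proc signature are assumptions for illustration, not from the thread:

-- untested sketch: stage the line in a CLOB column, then pass only the rowid
CREATE TABLE stage_lines
(
   line_data CLOB
);

-- my_proc reworked to take a rowid and look the value up itself
CREATE OR REPLACE PROCEDURE my_proc (p_rowid IN ROWID)
AS
   v_data CLOB;
BEGIN
   SELECT line_data
     INTO v_data
     FROM stage_lines
    WHERE ROWID = p_rowid;
   -- ... existing insert / processing logic using v_data goes here ...
END;
/

-- caller: step 1 stages the data, step 2 processes it by rowid
DECLARE
   v_rowid ROWID;
BEGIN
   INSERT INTO stage_lines (line_data)
   VALUES ('<data line from file>')   -- the real 800K value would be bound here, not a literal
   RETURNING ROWID INTO v_rowid;

   my_proc(v_rowid);
   COMMIT;
END;
/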

Author Comment
by:adlikon
ID: 20366588
Hi rbrooker,

That might work for my current proof of concept, but when I go to production I will have a Java API feeding the data.

If my understanding is correct, due to parameter limitations with Oracle procedures & sqlplus I have 3 options:
1) The API generates & runs a sql file for each line of data in the stream read.
    This will contain "exec proc (<..data..>); exit"
2) Drop the procedure idea and use JDBC instead from my Java API to insert the data into the db.
3) Get the Java API to dump to a flat file and use an Oracle external table to read the flat file, passing this data to the procedure as a parameter.

Are these my options?

Accepted Solution
by:rbrooker (earned 500 total points)
ID: 20367443
they seem like the easiest.

point 2 sounds like what i would use, with a few alterations: use jdbc to insert the data into a table with 2 columns - a column for the line of data from the java feed and another column saying when it was processed.  have the procedure altered to loop through all records with a null processed date, updating the processed date when complete.  schedule the procedure to run every minute or so...  you could also hold a rolling 7 days of data so you can report on numbers, data throughput, etc...

with external tables, you run the risk of reprocessing lines, or, when the file gets overwritten with a new one, of some lines never being processed.

just random thoughts.
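
as a rough, untested sketch of that variant - the names feed_lines and process_feed_lines are made up, and dbms_scheduler is just one way to get the every-minute run (dbms_job would also work):

-- the 2-column queue table
CREATE TABLE feed_lines
(
   line_data    CLOB,    -- the line from the java feed (inserted via jdbc)
   processed_on DATE     -- null until the procedure has handled the row
);

CREATE OR REPLACE PROCEDURE process_feed_lines
AS
BEGIN
   -- loop through every record with a null processed date
   FOR r IN (SELECT ROWID AS rid, line_data
               FROM feed_lines
              WHERE processed_on IS NULL)
   LOOP
      -- ... existing insert / processing logic using r.line_data goes here ...
      UPDATE feed_lines
         SET processed_on = SYSDATE
       WHERE ROWID = r.rid;
   END LOOP;
   COMMIT;
END;
/

-- run it every minute
BEGIN
   DBMS_SCHEDULER.create_job
   (
      job_name        => 'PROCESS_FEED_LINES_JOB',
      job_type        => 'STORED_PROCEDURE',
      job_action      => 'PROCESS_FEED_LINES',
      repeat_interval => 'FREQ=MINUTELY',
      enabled         => TRUE
   );
END;
/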

Author Closing Comment
by:adlikon
ID: 31411095
Hi, I didn't get a technical solution that I hadn't thought of, but you did suggest useful ideas and confirmed my options, which is just as important - thanks.

Author Comment
by:adlikon
ID: 20372959
I have another possible workaround. If the "&" substitution variable is limited to, say, 270K and I want to pass about 800K, I can split my data up into, for example, 4 chunks, pass these as 4 parameters "&1, &2, &3, &4" and then just concat them all together in my proc.
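
An untested sketch of what this might look like - the 4-chunk split, the quoting in my.sql and the CLOB parameter types are all assumptions:

-- cmd.sh would become: sqlplus -S login/pwd@sid @my.sql "$1" "$2" "$3" "$4"
-- my.sql would become (assuming the chunks contain no single quotes):
--    execute my_proc('&1', '&2', '&3', '&4')
--    exit

-- my_proc reworked to take the chunks and glue them back together
CREATE OR REPLACE PROCEDURE my_proc
(
   p_chunk1 IN CLOB,
   p_chunk2 IN CLOB DEFAULT NULL,
   p_chunk3 IN CLOB DEFAULT NULL,
   p_chunk4 IN CLOB DEFAULT NULL
)
AS
   v_data CLOB;
BEGIN
   -- reassemble the original line before using it
   v_data := p_chunk1 || p_chunk2 || p_chunk3 || p_chunk4;
   -- ... existing insert logic using v_data goes here ...
END;
/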
