pvsbandi (United States of America)

asked on

S3 to Postgres RDS - How to use quotes in the s3 import function

Hi,

We are on Aurora PostgreSQL RDS.

We get CSV files from DB2 that are loaded into S3 buckets, and we then load them from S3 into our RDS Postgres databases. Below is what we are using to load them.

SELECT aws_s3.table_import_from_s3('tb_client', '', '(format csv, DELIMITER ''|'', HEADER false,NULL ''NULL'')',
    'cndnt.xrx.bic.file', 'tb_client.txt', 'us-east-1');


The file is pipe delimited and uses double quotes as the quote character for text fields. But the command above has no place to specify the quote character, so the fields are read along with the quotes, and the load fails saying the field exceeds the defined length.

01 |"Tony   "|"Blazer "|"03/10/1983"|10000|"123 Main Rd"|"Dreamland"|"VA"


Can someone kindly let me know how to deal with this issue?

Thank you!

slightwv (䄆 Netminder)

I don't think the double quotes are causing the error.

From the table_import_from_s3 link:
https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/USER_PostgreSQL.S3Import.html

It says that it uses the COPY arguments.  From there:
https://www.postgresql.org/docs/current/sql-copy.html

QUOTE
Specifies the quoting character to be used when a data value is quoted. The default is double-quote. This must be a single one-byte character. This option is allowed only when using CSV format.

The double quote is the default.

Using the COPY command, this works for me:
drop table if exists bob;
create table bob(col1 int, col2 varchar(5));

\copy bob from stdin with csv
1,"Hello"
2,World
\.

select * from bob;



Just to show this, you can use the QUOTE parameter from the COPY docs:
\copy bob from stdin with (format csv, quote '"')
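Since the COPY docs say QUOTE can be any single one-byte character, the same approach works with a non-default character. A minimal sketch (reusing the example `bob` table above) that loads pipe-delimited, tilde-quoted data:

```sql
-- sketch: pipe-delimited input using ~ as the quote character
\copy bob from stdin with (format csv, delimiter '|', quote '~')
1|~Hello~
2|World
\.

select * from bob;
```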

Triple check the data.  If possible download it and try loading it directly using the COPY command.

I'll be happy to take a look at what you have but I'll need the table definition and example of the data causing the problem.
pvsbandi (Asker)

Thank you! And sorry, I should have stated my actual problem up front.
We use double quotes as the quote character, but the data itself contains double quotes, so the load was failing.
We then changed the quote character to a tilde (~).
Now the question is: how do I specify ~ as the quote character in the S3 import function?
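Per the AWS docs linked above, the third argument of `aws_s3.table_import_from_s3` is passed through as COPY options, so a likely fix (a sketch based on the COPY documentation, not the certified solution's hidden text; the table, bucket, and file names are the asker's originals) is to add the QUOTE option to that options string, doubling the single quotes as with the other options:

```sql
SELECT aws_s3.table_import_from_s3('tb_client', '',
    '(format csv, DELIMITER ''|'', QUOTE ''~'', HEADER false, NULL ''NULL'')',
    'cndnt.xrx.bic.file', 'tb_client.txt', 'us-east-1');
```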
ASKER CERTIFIED SOLUTION
slightwv (䄆 Netminder)

Thanks so much! Sorry, I was on vacation and didn't get to work on this earlier.