Oracle 11G: Import fails: not a valid export file, header failed verification

I am trying to import an Oracle 11G dmp file but it is failing with the error: "not a valid export file, header failed verification"
I don't know what is wrong; my client provided me with a copy of his Oracle 11G database dump.
My client has an Oracle 11G database (Unix).
I have an Oracle 11G database (Windows).

Can somebody help me to fix this issue?

Firstly, I created a new user:

create user userTarget identified by PASSWORD123;
grant connect, unlimited tablespace, resource to userTarget;

This is the Oracle command I used to import:
imp system/ABCDE FROMUSER=usrSource TOUSER=userTarget file=D:\Temp\bd_pruebas_abcde\BD_FULL_21082015.dmp full=yes

José Perez asked:
johnsone (Senior Oracle DBA) commented:
90% of the time when I see this, it is because the file was transferred incorrectly: it was transferred in ASCII mode rather than binary mode. Translations were therefore done on certain characters, and the file is no longer what it originally was.

There is typically an option within the transfer program that specifies the transfer type. If there is an automatic mode, do not use it; explicitly set a binary transfer.
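As a sketch, assuming a command-line ftp client is used for the transfer, the mode can be forced with the `binary` command before fetching the file (the file name is taken from the question):

```
ftp> binary
200 Type set to I.
ftp> get BD_FULL_21082015.dmp
```

Graphical clients (WinSCP, FileZilla, etc.) have an equivalent transfer-mode setting; the point is the same: never let the tool auto-detect the mode for a .dmp file.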
slightwv (Netminder) commented:
Normally I agree with the above statement about the transfer mode.  When Oracle introduced the datapump utilities, transfer mode became less of an issue.

My guess is the file was created with the datapump version of export: expdp.

If so, you need the Data Pump version of import: impdp.

The file needs to be on the database server, in a folder that is pointed to by a database directory object.
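One unofficial way to check this guess: classic exp dump files embed the literal string EXPORT:V&lt;version&gt; in their first few bytes, while Data Pump files do not. A small shell sketch (the function name is made up, and the header format is undocumented by Oracle, so treat the result as a hint, not proof):

```shell
# Heuristic check: classic "exp" dumps begin with a header containing the
# literal text "EXPORT:V<version>"; Data Pump ("expdp") dumps do not.
dump_kind() {
  if head -c 64 "$1" | grep -q 'EXPORT:V'; then
    echo "classic-exp"        # importable with imp
  else
    echo "datapump-or-other"  # likely needs impdp
  fi
}
```

If this prints datapump-or-other for the client's file, that supports the expdp theory.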

José Perez (Author) commented:
Can you please tell me which commands I should use? (I am a SQL Server DBA, not an Oracle one.)
José Perez (Author) commented:
Hmm, you're probably right!
I checked the log file for the export, and it contains:

dumpfile=DATA_PUMP_DIR:DCTM_FULL_21082015.dmp logfile=DATA_PUMP_DIR:DCTM_FULL_21082015.log full=y

Now if you're correct, I just need the commands to import it, can you... please?
johnsone (Senior Oracle DBA) commented:
For file transfer typically ftp (or one of its variants) would be used.

For importing, impdp would be used. Probably the same parameters that were used for the export would be ideal. The DATA_PUMP_DIR directory object would have to exist before you import.

I didn't think that expdp created a human readable file.  I haven't actually looked that closely at one yet.
slightwv (Netminder) commented:
>>Can you please tell what commands should I use?

It really isn't that easy.  If you don't know Oracle, it will likely be confusing and problematic.  I'm afraid I cannot provide the "run this" command because I don't know your system and requirements.

Log into the destination database as a DBA level user and run the following:
select directory_path from dba_directories where directory_name='DATA_PUMP_DIR';

Put the file in that folder. If that drive doesn't have enough free space for the dump, you'll need to create a new directory object pointing somewhere that does:
create or replace directory mydir as 'C:\SOME_PATH';
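If you do create your own directory object, the importing user also needs access to it (SYSTEM normally already has access to DATA_PUMP_DIR). A sketch, reusing the mydir object from above:

```
-- run as a DBA-level user; mydir is the directory object created above
grant read, write on directory mydir to system;
```

You would then pass directory=MYDIR to impdp instead of DATA_PUMP_DIR.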

Now the problematic parts:
Do the tablespaces in the source database already exist in the target database?
Do all the users exist?
Do the tables already exist?
etc...  you get the idea.

If you don't have this information, just running the import can really mess things up. It will likely attempt to create tablespaces using the data file paths from the source database.

It might just fail because things aren't set up right.

The parameters should be the same as the export parameters, with one exception: it looks like you want to change schema names. The fromuser and touser parameters from the original imp do not carry over to Data Pump; the equivalent parameter is remap_schema.

If the tablespaces are different, use remap_tablespace, and so on.

There are examples of these and how to use them in the online documentation and on the web.

Then run impdp with the same parameters as the export.
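Putting the pieces together, a minimal sketch of the import command. It assumes the dump file named in the export log has been copied (in binary mode) into the DATA_PUMP_DIR folder; the system password is the one from the original imp attempt, the schema names come from the question, and the log file name here is made up:

```
impdp system/ABCDE directory=DATA_PUMP_DIR dumpfile=DCTM_FULL_21082015.dmp logfile=DCTM_FULL_imp.log full=y remap_schema=usrSource:userTarget
```

Any missing tablespaces or users will still surface as errors at this point; that is the "problematic parts" list above.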
José Perez (Author) commented:
I created the user in the target database, so the user is new and empty, but it has the same name as in the source database.

I know the client did a full database export.
I have the default tablespaces of the Oracle 11G installation in the 'oradata' folder.

This is the result of the command you suggested:

select directory_path from dba_directories where directory_name='DATA_PUMP_DIR';

slightwv (Netminder) commented:
>>I have the default tablespaces of the Oracle 11G installation in the 'oradata' folder.

You cannot mix and match SQL Server and Oracle terms.  I would encourage you to work directly with a local Oracle person.  They can help educate you about the differences in a way that is difficult in a Q&A site like this.

'default' probably doesn't matter unless the source database was created with 'default' options.

You really need to have a basic understanding of Oracle to understand what it might attempt to do on import.

I'll try to cover the short list:
A user that owns objects in Oracle is a schema.
Those objects exist in a tablespace.
A tablespace can have one or more data files on disk.
A data file belongs to one and only one tablespace.

Files on disk don't matter as far as the import goes as long as the tablespace names are the same.  If the import has an object that exists in the 'BOB' tablespace and 'BOB' doesn't exist in the destination database, the import might try to create it.  It will do so by using the same PATH as the source database.  If that path doesn't exist, the import will likely fail.
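To see which tablespaces already exist in the destination database before running the import, a query such as:

```
select tablespace_name, status from dba_tablespaces;
```

Comparing that list against the tablespaces the dump expects is the quickest way to predict the failures described above.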

Since I doubt you have access to a local Oracle person:
Can you possibly trash the destination database?  If so, just try the import and post the likely errors.

If you cannot trash the destination database, get the specifics from the person that gave you the dump.  At a minimum you'll need to know the tablespaces used by ALL objects exported.

You can generate a SQLFILE from the export dump that captures all the DDL commands without executing any of them. You can then go through the output and make sure you have everything you need. Not an easy process, but doable.
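A sketch of that, assuming the dump is already in DATA_PUMP_DIR (the sqlfile name preview.sql is made up; nothing is imported by this command, it only writes the DDL to the file):

```
impdp system/ABCDE directory=DATA_PUMP_DIR dumpfile=DCTM_FULL_21082015.dmp sqlfile=preview.sql full=y
```

Searching preview.sql for CREATE TABLESPACE and CREATE USER statements shows exactly what the real import would try to create.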
Geert G (Oracle DBA) commented:
You could start by reading the Concepts guide to see what's different in architecture between SQL Server and Oracle.

The 2 Day DBA guide might also help.

And check the import utility, impdp.

It's a lot of reading just to run an import.

As slightwv indicated, a local Oracle hire will be a lot faster.
José Perez (Author) commented:
Finally, the issue was that the export I received was done using expdp. That was the reason for the error.
Thanks all for your comments and answers.