k_murli_krishna asked:
URGENT - EXPORT IMPORT PROBLEM
1) We have a DB2 7.2 EE server on Windows NT 4 and 2000.
2) We have received a non-delimited ASCII text file of data exported from a remote source database table.
3) The table at both ends has the same structure, which cannot be changed: 8 columns of INTEGER, VARCHAR, DECIMAL, DATE and CHAR(1) types, with the last column being CHAR(1). There are no primary key, foreign key or check constraints, and all columns are NOT NULL. We tried to import the file, but since data was missing for the last column it failed; if that column had a default value, the import would have succeeded.
4) The date format used in the ASCII file was not available in the date drop-down while importing, so we edited the ASCII file to match one of the listed formats. If we leave the date, timestamp and datetime drop-downs as null, what happens? Will DB2 work out the date from the start and end co-ordinates? And what if we choose timestamp/datetime but there is no time component in the ASCII date values?
5) So we assumed there would be a space there, created a single space in TextPad, and imported using positional co-ordinates for the columns; this was successful.
6) There were only 2 rows in the ASCII file, but the import reported 4 rows: 2 successful and 2 failed. In Command Center we found that the correct data was imported. In Control Center, data for the VARCHAR columns appeared as 123,456,789 instead of 123456789 as in the ASCII file. In CLP, all data is okay except the last column (CHAR(1)), which shows the value 4 for all rows, which is incorrect.
7) Why is there this difference in display, and which one should we believe to be the correct data in the table?
8) Next we tried to export the table data in the same form as it appears in the ASCII file. To our surprise, non-delimited ASCII is not offered as an export option. Then how was such a file exported from the remote source database table? Could this be due to a difference in the VERSION/TYPE/PLATFORM (OS) of DB2 at their end w.r.t. ours?
9) I know that .ixf dump files are not portable across platforms but ASCII text files are. Will there be differences in export/import features across VERSION/TYPE/PLATFORM (OS)?
10) Will export from higher version source be acceptable to lower version target or the other way round?
11) Would LOAD be more useful than IMPORT in this case?
12) There is also a worksheet format (WSF) for export/import. Worksheet suggests MS Excel, which uses the extension .xls, so how come .wsf?
13) Export/import of .ixf was successful within our own end. Do you think that next time, if they send an .ixf dump from the remote source database table, there is a higher chance of it importing at our end? What will be the effect of differences in VERSION/TYPE/PLATFORM (OS) of DB2 between source and target in this case?
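For reference, the positional import described in point 5) can also be run entirely from the CLP using METHOD L, which takes begin/end byte positions for each column. The file name, table name and positions below are placeholders, not the file's verified layout, and the dateformat modifier should be checked against your fix-pack level:

```sql
-- Hypothetical layout: replace the (start end) pairs with the real byte offsets.
IMPORT FROM datafile.txt OF ASC
  MODIFIED BY striptblanks dateformat="YYYY-MM-DD"
  METHOD L (1 8, 9 18, 19 20, 21 28, 29 32, 33 42, 43 43, 44 44)
  MESSAGES import.msg
  INSERT INTO myschema.mytable
```

The MESSAGES file records per-row warnings and rejections, which helps diagnose counts like the 2-successful/2-failed result described in point 6).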
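On point 11): IMPORT performs ordinary logged inserts, checking constraints row by row, while LOAD writes formatted pages directly and is much faster on large files, at the cost of leaving the table in a pending state if it is interrupted. For a small file like this one, IMPORT is the safer choice. A hypothetical LOAD equivalent, with the same placeholder positions as above, would be:

```sql
-- Sketch only; file name, table name and positions are placeholders.
LOAD FROM datafile.txt OF ASC
  MODIFIED BY striptblanks
  METHOD L (1 8, 9 18, 19 20, 21 28, 29 32, 33 42, 43 43, 44 44)
  MESSAGES load.msg
  INSERT INTO myschema.mytable
```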
ASKER CERTIFIED SOLUTION
ASKER
1) Thank you, mglxxx.
2) We managed to import both the non-delimited file sent from the remote database table and a delimited ASCII file exported from our own table, from the command line, using a custom date format. But for the remotely sent ASCII file we had to introduce an empty space for the last column (CHAR(1)), as follows.
3) I am presenting here the table structure and the ASCII file contents from the remote database table:
CNTR_NUM_ID INTEGER NOT NULL,
BUR_EFF_DTE DATE NOT NULL,
BUR_STT_CDE CHAR(2) NOT NULL,
CLM_ID INTEGER NOT NULL,
FILE_NUM CHAR(4) NOT NULL,
BENF_DOB_DTE DATE NOT NULL,
BENF_DPCY_CDE CHAR(1) NOT NULL,
BENF_REL_CDE CHAR(1) NOT NULL
914604001994-01-0131 21140131002001974-02-02 8
914604001994-01-0131 21140131002001972-01-11 5
4) Here we have data for the columns, but one column is missing and some are merged together. What kind of custom program generates such output, i.e. in which languages is it possible to write such programs?
5) Could it be that they have the same table in 2 different databases and wanted only a non-delimited ASCII file, and so resorted to a custom program?
6) All the more so since there is no non-delimited file option available in export, at least in our DB2 7.2 EE on Windows NT/2000. Can we do this from the command line by framing a suitable command?
7) Please refer to your comment's 6/7) point.
ASKER
1) All columns are NOT NULL, but in some places data is missing, i.e. empty spaces were inserted.
2) There were also some junk characters visible when the file was opened in Notepad/TextPad/WordPad.
How do we import the non-delimited ASCII file keeping all of the above in mind?
Re Point 3: The data format you present seems to be coming from a COBOL-based mainframe. The easiest way to load this type of data, assuming that the format you present is uniform throughout the ASCII file, is to write a C program that stores each field in a variable, adds the required missing blank value, and then inserts (NOT LOAD OR IMPORT) the record into the table. The 'junk' characters you are referring to are actually COBOL manipulators (indicators). Some common ones are &, {, }, and there are numeric indicators for + and - values. For a full list of the 'junk' characters, consult some COBOL sites on the web.
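As a concrete illustration of the fix-up program suggested above (C works, as does any language with simple substring handling), here is a minimal Python sketch. The byte offsets in LAYOUT are assumptions for illustration only, not the file's verified layout, and the INSERT is built here only as a string; a real program would execute it through CLI/ODBC with parameter markers:

```python
# Minimal sketch of the suggested record fix-up program.
# The offsets below are ASSUMED placeholders; replace them with the
# real byte positions of each column in the ASCII file.

# (column name, start, end) -- 0-based start, exclusive end.
LAYOUT = [
    ("CNTR_NUM_ID",    0,  8),
    ("BUR_EFF_DTE",    8, 18),
    ("BUR_STT_CDE",   18, 20),
    ("CLM_ID",        20, 28),
    ("FILE_NUM",      28, 32),
    ("BENF_DOB_DTE",  32, 42),
    ("BENF_DPCY_CDE", 42, 43),
    ("BENF_REL_CDE",  43, 44),
]

def parse_record(line):
    """Slice one fixed-width record into named fields, padding the line
    first so a missing trailing CHAR(1) becomes a single blank."""
    line = line.rstrip("\r\n").ljust(LAYOUT[-1][2])
    return {name: line[s:e] for name, s, e in LAYOUT}

def to_insert(fields, table="MYSCHEMA.MYTABLE"):
    """Build an INSERT statement string from the parsed fields.
    (A real loader would use a parameterized statement instead.)"""
    cols = ", ".join(name for name, _, _ in LAYOUT)
    vals = ", ".join("'%s'" % fields[name].replace("'", "''")
                     for name, _, _ in LAYOUT)
    return "INSERT INTO %s (%s) VALUES (%s)" % (table, cols, vals)
```

With the real offsets filled in, each line of the ASCII file can be fed through parse_record and the resulting statement executed against the target table.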
Re non-delimited export of data, you can accomplish this as follows (non-character columns need a CHAR() cast so the concatenation works):
select coalesce(char(col1), ' ') ||
       coalesce(col2, ' ') ||
       coalesce(col3, ' ') ||
repeat as necessary
       coalesce(char(last_column), ' ')
from table
and then redirect the entire output to a file with the redirection operator > export_file.txt, or whatever name/extension you want to call it.
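One way to run a statement like the one above, assuming a DB2 command window on the target server (the column and table names here are placeholders):

```sh
db2 -x "select coalesce(char(col1), ' ') || coalesce(col2, ' ') from myschema.mytable" > export_file.txt
```

The -x option suppresses the column headings and summary line, so only the row data reaches the file.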
ASKER
ghp7000, your answer was also good. Please refer to https://www.experts-exchange.com/questions/20541972/Points-for-ghp7000.html and just add a comment.
ASKER
15) I have given 200 points, but will give more points for each expert's answer if it is valid and good.