spectrum17
asked on
Can you change the data type in Crystal from say Number to String?
Hello
I'm converting a Crystal 8.5 report into Crystal 11, and in the process I've had to change the database from a CSV file to an Excel file.
After doing this, the data map screen came up and a few of the fields had changed data type, e.g. from String to Number.
Is there a way I can change the data type back to what it was, e.g. via Add Command or something?
ASKER
I've tried ToText and CStr in formulas, but Crystal doesn't seem to be recognising any of the converted text.
I can't edit the file, as it's huge and it gets recreated each week.
Any other ideas?
How does it get created?
Can that program be changed to insert a dummy record or 2?
What do you mean it doesn't recognize the converted text?
mlmcc
ASKER
The data file is created by a payroll program.
The program can't be changed to insert a dummy record.
Re the converted text not being recognised, an example is below. When this formula is used and the value in {deduct.WCODE} is "S1", Crystal returns the value of {deduct.WCODED} instead of "SUPERANNUATION FUND":
if CStr({deduct.WCODE}) = "S1" then "SUPERANNUATION FUND"
else {deduct.WCODED}
ASKER
PS I saw something on Google that said you could change the data type via Add Command, using something like this:
SELECT deduct.WCODE
FROM deduct
CONVERT (STRING,WCODE)
I'm getting a syntax error when I try it.
I also saw a Crystal solution where JAMES0628 said you could change the data type via SQL. He didn't post the syntax though.
I suspect it will be more like
SELECT CONVERT(VarChar, Deduct.WCode) AS WCode
FROM Deduct
mlmcc
CR decides what type each column in an Excel file is by looking at the values in some of the first rows. I'm not sure exactly how many, but it seems to be somewhere around 15 to 25. If it decides that a column is numeric, any string values in that column will simply be ignored/dropped. So, your CStr({deduct.WCODE}) = "S1" test simply can't work, because CR has decided that WCODE is numeric, so it can never be "S1".
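The behavior James describes can be illustrated with a small sketch. This is Python, not Crystal's actual algorithm; the 20-row sampling window and the drop-to-null rule are assumptions based on his observation, used only to show why the CStr test can never see "S1":

```python
# Sketch of a reader that infers a column's type from its first rows
# and then drops values that don't match. Mimics the behavior James
# describes; Crystal's real sampling rules are not documented here.

def infer_type(values, sample_size=20):
    """Guess 'number' if every sampled non-empty value parses as a float."""
    sample = [v for v in values[:sample_size] if v != ""]
    try:
        for v in sample:
            float(v)
        return "number"
    except ValueError:
        return "string"

def read_column(values, sample_size=20):
    """Read a column, nulling out values that don't match the inferred type."""
    inferred = infer_type(values, sample_size)
    if inferred != "number":
        return inferred, list(values)
    out = []
    for v in values:
        try:
            out.append(float(v))
        except ValueError:
            out.append(None)   # an "S1" in a numeric column is lost
    return inferred, out

# 24 numeric rows followed by the string code the report needs:
column = ["1", "2", "3"] * 8 + ["S1"]
kind, data = read_column(column)
print(kind)        # the column is sniffed as numeric
print(data[-1])    # ...so "S1" comes back as None, never as "S1"
```

With the string value arriving after the sampling window, the conversion formula in the report has nothing left to convert.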
I handle this kind of thing by adding around 20 dummy rows to the beginning of the Excel file with the correct type of value in each column, like 0 in the numeric columns and "A" in the string columns, to get CR to interpret the columns the way that I want.
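If the weekly file (or a CSV copy of it) can be pre-processed before the report runs, the dummy-row workaround can be automated. A minimal sketch using Python's csv module; the two-column layout and the "A"/0 dummy values are placeholders, not the real deduct file's schema:

```python
import csv
import io

def prepend_dummy_rows(rows, dummy_row, count=20):
    """Return rows with `count` copies of dummy_row inserted after the
    header, so a type-sniffing reader sees the intended type in every
    column before any real data."""
    header, body = rows[0], rows[1:]
    return [header] + [dummy_row] * count + body

# Hypothetical file: WCODE should stay a string, AMOUNT numeric.
src = "WCODE,AMOUNT\n1,10.00\n2,20.00\nS1,30.00\n"
rows = list(csv.reader(io.StringIO(src)))
fixed = prepend_dummy_rows(rows, ["A", "0"])

out = io.StringIO()
csv.writer(out, lineterminator="\n").writerows(fixed)
print(out.getvalue().splitlines()[1])  # first data row is now the dummy "A,0"
```

The dummy rows would then need to be filtered out in the report (e.g. in the record selection formula) so they don't appear in the output.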
I've never tried to write a CR Command for an Excel file. It's an interesting idea. I just tried it and I was able to use a Command to read an Excel file, but I don't know if that query language includes any kind of conversion function. The MS SQL CAST and CONVERT functions don't seem to be recognized. Either that, or the syntax for them is different for Excel and I just didn't happen to try the right syntax.
It might be possible to use a Command to force the data type for the columns in a different way. I'm not entirely sure. I tried adding a SELECT (before the main SELECT) that put values of the desired type in each column (similar, in concept, to adding dummy rows to the file). I took one of the columns that CR had decided was numeric in the file, and made it a string in the new SELECT. But I got an error when I tried to save that Command, apparently because the first SELECT was just using literal values and not actually reading the file (or anything else). So then I changed the first SELECT so that it was still using the literal values, but also reading the file, and I added a WHERE that would never be true, so that that first SELECT wouldn't actually produce any rows. CR let me save that Command, and it made that one column a string instead of numeric. I didn't do any testing beyond that, but my guess is that CR was getting the data type for each column from the first SELECT, even though it wasn't actually producing any data, and then using the same types for the columns in the second SELECT, where it was actually reading the file. So, that _seemed_ to work, as far as it went, but whether or not the same thing would work for you, I really don't know. And keep in mind that if you do that, the first SELECT is (presumably) going to be reading through the whole Excel file (even though it's not producing any data), so the report would be reading the file twice. If it's a really large file, that could take some time.
James
Another thought:
If you happen to have MS SQL Server, another possibility would be to read the Excel file through MS SQL Server (e.g. using a stored procedure or a CR Command). You could create a linked server that points to the Excel file, or use the OPENROWSET function. If you read the Excel file in MS SQL, you can use the CONVERT/CAST functions to change the data type for the columns.
If you have some other db, like Access, you may be able to do the same kind of thing in that db.
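The idea can be demonstrated without SQL Server: once the file's rows are loaded into any real SQL engine, CAST becomes available, which the Excel driver doesn't offer. A sketch using Python's built-in sqlite3, with sqlite standing in for MS SQL Server/OPENROWSET and a made-up deduct staging table:

```python
import sqlite3

# Stand-in for staging the Excel/CSV rows into a database table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE deduct (WCODE, AMOUNT)")
conn.executemany(
    "INSERT INTO deduct VALUES (?, ?)",
    [(1, 10.0), (2, 20.0), ("S1", 30.0)],  # mixed numeric/string codes
)

# In a real SQL engine, the type can be forced per column with CAST:
rows = conn.execute(
    "SELECT CAST(WCODE AS TEXT) AS WCODE, AMOUNT FROM deduct"
).fetchall()
print(rows[0][0], type(rows[0][0]).__name__)  # '1' str
print(rows[2][0])                             # 'S1'
```

The report would then point at the database query (or stored procedure) instead of at the Excel file directly.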
James
ASKER
Hi mlmcc
I've tried your solution first, and inserted the following command:
SELECT CONVERT(VarChar, deduct.WCODE) AS WCODE
FROM "deduct"
I get the following syntax error:
Database Connector Error: 'DAO Error Code 0xd7a
Source: DAO.database
Description: Syntax Error in Query. Incomplete Query Clause'
Any idea how to get rid of this error?
The problem is probably the CONVERT syntax. I don't know how to fix it, since that is database-dependent. Check how CONVERT is supposed to be written.
I always thought it was
Convert(Field, format); yours is reversed.
mlmcc
ASKER
No luck with that one either, mlmcc. I've had another thought... The 8.5 report (that I am trying to convert to 11) didn't have this problem, probably because it was a CSV file and was run using a text driver.
I've attached a document which shows how I set this up in 8.5. I can't see the option to set up the same connection in 11. Do either of you know how to do this? createdatasource.doc
ASKER CERTIFIED SOLUTION
ASKER
I would so have loved this to have worked!! I used the TEXT option under ACCESS/EXCEL (DAO) and didn't get any data when I refreshed the screen. All gone.
Any ideas why this would be?
First of all, as far as the CONVERT syntax goes, in MS SQL it is data type and then value, as you had it, although with a type like varchar you would probably also include a length (not sure if that's required), so it would be something like CONVERT(VarChar(10), deduct.WCODE). However, as I said, CONVERT and CAST don't seem to work when accessing an Excel file. I tried various parameters in various arrangements and couldn't get them to work. This isn't MS SQL. It's a SQL-like language being used to read an Excel file. I don't know if it has any kind of conversion function.
As for the text option, as mlmcc said, it's under Access and it does work. What did you do exactly? You said that you "used the TEXT option" and "refreshed the screen". Was this in the report that was reading the Excel file, or the old report that read the text file? Were you trying to change the datasource or add a new one? etc., etc.
Going back to your original post, you said that you "had to change the database from a csv file to an excel file". Why? Forgive the obvious, but did you try running the old report under CR XI?
James
ASKER
Hi James, answers to your questions as to what paragraph they were in are below:
Para 1: The number of characters in the field could vary. Would this matter?
Para 2: I went into Set Datasource Location, clicked on Create New Connection > Access/Excel (DAO), selected the CSV file from the drop-down list, changed the database type to 'Text', clicked Finish, then updated the datasource to read this new connection.
Para 3: When I tried to run the old report, it brought up the OLE DB (ADO) screen and asked me to choose an ODBC DSN from a list.
ASKER
PS Re Para 1: I changed the command to:
SELECT CONVERT(VarChar(10), deduct.WCODE) AS WCODE
FROM "deduct"
The same error was received ('Description: Syntax Error in Query. Incomplete Query Clause').
ASKER
PSS
Re Para2:
I was trying to change the datasource location, as Crystal couldn't recognise the old one.
> The number of characters in the field could vary. Would this matter?
That shouldn't be a problem. That's what varchar is. You set a maximum length (e.g. varchar(10)), but the field only uses as many characters as are required for each value. Or, you can use char (e.g. char(10)), and the field will always be that length, with any "extra" characters filled with spaces.
But that's really a moot point, since, as far as I can tell, there is no way to do a conversion like that in an Excel query.
Looking back at your screenshots of how you set up the connection in CR 8.5, it appears that you used a file DSN, which means that the connection information is stored in a file somewhere. You would have selected that DSN file in the last screenshot that you posted, and then gone on from there to enter the connection details that would be saved in that file.
If that's the case, it kind of seems like the old report should have worked under CR XI, since it supports the same kind of connection, but maybe there are some differences between 8.5 and XI that prevented CR from just automatically making the connection (I haven't used 8.5, so I really don't know).
If the old report is using a DSN file, you could try re-creating the connection under CR XI.
Go to the Database Expert in CR XI and open up "Create New Connection" and then ODBC.
Double-click "Make New Connection".
Click the "Find File DSN" option and browse to the DSN file that the old report is using.
If you like, you could also post your report here. Maybe we can see something that might help.
James
ASKER
Hi James,
I've tried to do what you mentioned above, but I'm still not getting any results.
The 8.5 report was written so long ago, I can't remember if it used a File DSN or not. How can I check?
I would post the report on here, but it includes a lot of sensitive information, so instead I've attached a file below of the steps I followed to update the datasource location. I've also posted the code behind the file dsn.
Can you have a look at this and see if I have made any mistakes in re-establishing the connection?
Many Thanks!!
payslip.doc
SOLUTION
ASKER
Bingo, that helped find the problem! There were some selection formulas that weren't working with the report. I have since changed them (added CStr) and now they work.
Now I just have to solve some other conversion issues with the report!!
Glad you were finally able to figure it out.
James
I assume they are numeric fields but that you want them to be strings.
One way to do it in the report is to use formulas for them like
ToText({YourField})
or
CStr({YourField})
You can control the number of decimals and the thousands separator, as in
CStr({YourField}, 0, "")
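For anyone following along outside Crystal, the effect of those extra CStr arguments (decimal places, then thousands separator) is easy to sketch. This is a loose Python analogue for illustration, not Crystal's actual CStr implementation:

```python
def cstr(value, decimals=2, thousands=","):
    """Rough analogue of Crystal's CStr(number, decimals, separator):
    format with the given decimal places, then swap in the separator."""
    s = f"{value:,.{decimals}f}"
    return s.replace(",", thousands)

print(cstr(1234.56))         # '1,234.56'  (default-style output)
print(cstr(1234.56, 0, ""))  # '1235'      (no decimals, no separator)
```

Passing 0 decimals and an empty separator yields a bare digit string, which is the form usually wanted when a numeric code has to be compared against string values like "S1".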
Another idea would be to modify the Excel file so the first row or 2 have dummy values that are definitely strings like xxxxx in those fields. Crystal looks at the first few rows to determine the datatype.
I don't know whether Crystal would treat the numbers as strings if you just put ' in front of them in Excel.
mlmcc