I am using MySQL 4.1 Gamma on WinXP. I am running into a very strange problem where a query that works fine under the mysql command-line client fails when issued from my program (the table has two columns, a DATETIME called time and an INT called total):
catCMD.CommandText = "SELECT SUM(total) FROM MyTable WHERE time > '2004-10-02 09:30:00' AND time < '2004-10-02 16:00:00'";
myReader = catCMD.ExecuteReader();
if (myReader.FieldCount > 0) g++;
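For reference, here is a stripped-down version of the failing path. The provider name and connection details below are placeholders, not my actual setup:

using System;
using System.Data.OleDb;

class Repro
{
    static void Main()
    {
        // Placeholder connection string for a MySQL OLE DB provider.
        string connStr = "Provider=MySQLProv;Data Source=mydb;User Id=me;Password=secret;";
        using (OleDbConnection conn = new OleDbConnection(connStr))
        {
            conn.Open();
            OleDbCommand cmd = conn.CreateCommand();
            cmd.CommandText = "SELECT SUM(total) FROM MyTable " +
                "WHERE time > '2004-10-02 09:30:00' AND time < '2004-10-02 16:00:00'";
            using (OleDbDataReader reader = cmd.ExecuteReader())
            {
                Console.WriteLine("FieldCount = " + reader.FieldCount); // prints 0 here
                if (reader.Read() && reader.FieldCount > 0)
                    Console.WriteLine(reader.GetValue(0));              // never executes
            }
        }
    }
}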
Under the mysql command-line client, I get the correct sum of the total values in the matching set:
+------------+
| SUM(total) |
+------------+
|         72 |
+------------+
But from the program I just get one row with no fields (.FieldCount is 0)!
If I change the select to, for example:
SELECT * FROM MyTable WHERE time > '2004-10-02 09:30:00' AND time < '2004-10-02 16:00:00';
all is well and I get what I expect (many rows, none empty). The same goes for "SELECT total", etc.
From experimentation, it seems the problem element is the SUM() function in the top-level SELECT.
Could an issue in one of the layers of software (such as the .NET OleDb classes or the OLE DB provider) explain this? I had assumed that nothing really looks at the command string except the server itself, but I could be mistaken.
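In case something in between is mangling the quoted date literals, one experiment I plan to try is binding the dates as parameters and reading the aggregate with ExecuteScalar, so no intermediate layer has to parse literals out of the command text. This is just a sketch, reusing the open conn from the snippet above and assuming the provider supports positional '?' placeholders:

// Same query, but with the dates bound as parameters instead of literals.
OleDbCommand cmd = conn.CreateCommand();
cmd.CommandText = "SELECT SUM(total) FROM MyTable WHERE time > ? AND time < ?";
cmd.Parameters.Add("@from", OleDbType.DBTimeStamp).Value = new DateTime(2004, 10, 2, 9, 30, 0);
cmd.Parameters.Add("@to",   OleDbType.DBTimeStamp).Value = new DateTime(2004, 10, 2, 16, 0, 0);
object sum = cmd.ExecuteScalar(); // should be 72 if the server sees the same query

If that returns the expected 72, it would suggest the problem lies in how the literal command string is handled on its way to the server.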