One of the most frequent problems a "newbie" developer may encounter is having to deal with different data formats. One example above all: THE DATE
We, as humans, need to "see" a date and then interpret it (most of the time this is an automatic operation).
What if I show you: "07/05/2013"?
Many of you would certainly recognize it as July 5, 2013.
For just as many (like me), the same format would mean May 7, 2013...
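The ambiguity is easy to reproduce in a few lines of code (Python here, as a stand-in for any language): the very same string yields two different dates depending on the format the parser assumes.

```python
from datetime import datetime

ambiguous = "07/05/2013"

# A reader (or parser) using the US convention: month/day/year
us_date = datetime.strptime(ambiguous, "%m/%d/%Y")

# A reader using the European convention: day/month/year
eu_date = datetime.strptime(ambiguous, "%d/%m/%Y")

print(us_date.date())  # 2013-07-05 (July 5)
print(eu_date.date())  # 2013-05-07 (May 7)
```

Same input, two legitimate answers: this is why the string alone is not enough.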
The problem arises from the fact that the format in which this date is shown (and recognized) varies with the reader's origin and culture.
Human cognition is far outside my field of study, but it is clear that different contexts lead to different interpretations.
Since (until proven otherwise) machines do not think, we should not expect machines to share this problem.
Our programs should limit themselves to "showing" a given piece of data (in this case a date) in the format the user expects, because the user is human and prefers a familiar format. We should keep in mind that date formats exist solely for human interpretation.
Machines don't really need "a format": they handle dates (and times) as numbers stored in a date/time column.
Since formatting exists for "showing" only, it should be an exclusive responsibility of the user interface.
Every other layer, which does not concern the user, MUST NOT deal with the formatting of dates.
So we can say that: UIs ARE FOR HUMANS --> USER INTERFACES MAY FORMAT DATES
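A minimal sketch of this separation, again in Python: internally the application works only with a real date value, and a formatting step runs at the presentation layer, per the user's culture. The format table and the `display` helper are hypothetical names for illustration; a real application would use its platform's localization facilities.

```python
from datetime import date

# The application core stores a real date type, never a formatted string.
release = date(2013, 5, 7)

# Hypothetical per-culture format table, used ONLY by the UI layer.
USER_FORMATS = {
    "en-US": "%m/%d/%Y",  # month/day/year
    "it-IT": "%d/%m/%Y",  # day/month/year
    "iso":   "%Y-%m-%d",  # culture-neutral
}

def display(d: date, culture: str) -> str:
    """Render a date for a human, in the format that user expects."""
    return d.strftime(USER_FORMATS[culture])

print(display(release, "en-US"))  # 05/07/2013
print(display(release, "it-IT"))  # 07/05/2013
```

The same internal value is shown two different ways, and no layer below the UI ever sees a formatted string.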
A well-designed application should not create any "date issue" for the programmer, nor should it confuse the end user.
When we design a new application, we should also consider using the date/time types offered by the programming language we are working with.
For example, in C# or VB.NET we can ("should") use "DateTime".
SQL also gives us the types "date" and "time".
In my opinion, this is the best way to avoid any kind of "conversion" problem.
PROGRAMS ARE FOR MACHINES --> PROGRAMS SHOULD USE THE CORRECT TYPE ("DateTime" and equivalents).
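A small sketch of why the correct type matters (Python's "date" standing in for "DateTime" and its equivalents): real date values compare and subtract correctly, while formatted strings compare only as text and can silently lie.

```python
from datetime import date

# With the correct type, comparison and arithmetic just work.
d1 = date(2013, 5, 7)   # May 7, 2013
d2 = date(2013, 7, 5)   # July 5, 2013
assert d1 < d2
assert (d2 - d1).days == 59  # real calendar arithmetic

# With formatted strings, "comparison" is purely lexical:
s1, s2 = "07/05/2013", "05/07/2013"
assert s2 < s1  # true only because "05" < "07" as text, not as dates
```

The string comparison gives an answer, but it is an answer about characters, not about time: exactly the kind of "conversion" bug that the correct type makes impossible.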