I am working on a project and one of the engineers is recommending a denormalized database instead of one in 3NF. I disagree with this recommendation. He states:
"It is more optimized for querying. This way there will be less tables to cross-reference and have faster data for viewing. This will cause some data redundancy, but this will be more optimal for data visualization for reports"
I disagree, for the reasons below.
Normalized tables are usually smaller and have a smaller footprint because the data is divided vertically among many tables. This lets them perform better, since they are small enough to fit into the buffer pool.
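To make this concrete, here is a minimal sketch of the two designs I have in mind (table and column names are hypothetical, standard SQL):

```sql
-- Normalized (3NF): each fact is stored once, in its own narrow table
CREATE TABLE customers (
    customer_id INT PRIMARY KEY,
    name        VARCHAR(100),
    email       VARCHAR(100)
);

CREATE TABLE orders (
    order_id    INT PRIMARY KEY,
    customer_id INT REFERENCES customers(customer_id),
    order_date  DATE,
    amount      DECIMAL(10, 2)
);

-- Denormalized: customer data is repeated on every order row
CREATE TABLE orders_denormalized (
    order_id       INT PRIMARY KEY,
    customer_id    INT,
    customer_name  VARCHAR(100),
    customer_email VARCHAR(100),
    order_date     DATE,
    amount         DECIMAL(10, 2)
);
```

The normalized tables are narrow, so far more rows of each fit into the same amount of cache.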
Updates are very fast because each piece of data lives in exactly one place; there are no duplicates to keep in sync.
Similarly, inserts are fast because each row is written to a single place and nothing has to be duplicated.
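Using the hypothetical tables above, the update point looks like this: in the normalized schema a change touches one row, while in the denormalized one it touches every duplicated copy, and missing any copy leaves the data inconsistent.

```sql
-- Normalized: one row changes, no matter how many orders the customer has
UPDATE customers
SET    email = 'new@example.com'
WHERE  customer_id = 42;

-- Denormalized: every order row carrying the old value must be rewritten;
-- if any row is missed, the same customer now has two different emails
UPDATE orders_denormalized
SET    customer_email = 'new@example.com'
WHERE  customer_id = 42;
```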
Selects are fast when the data can be fetched from a single table, again because normalized tables are usually small enough to fit into the buffer pool.
Because the data is not duplicated, there is less need for heavy GROUP BY or DISTINCT queries to weed out duplicates.
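Again with the hypothetical tables above: pulling the list of customers out of the denormalized table forces a de-duplication pass, while the normalized schema already holds each customer exactly once.

```sql
-- Normalized: a plain scan of a small table
SELECT customer_id, name, email
FROM   customers;

-- Denormalized: one row per order must be collapsed back to one per customer
SELECT DISTINCT customer_id, customer_name, customer_email
FROM   orders_denormalized;
```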
Finally, denormalizing just to avoid joins strikes me as lazy programming and bad design.
Are there other reasons NOT to use a denormalized database? Or is he right?