SQL Server Denormalized Database

I am working on a project and one of the engineers is recommending working with a denormalized database instead of 3NF.  I disagree with this opinion.  He states:

    "It is more optimized for querying. This way there will be less tables to cross-reference and have faster data for viewing. This will cause some data redundancy, but this will be more optimal for data visualization for reports"

I disagree, for the reasons below:

Normalized tables are usually smaller and have a smaller footprint because the data is divided vertically among many tables. This helps them perform better, as they are often small enough to fit into the buffer cache.
Updates are very fast because the data to be updated lives in a single place, with no duplicates.
Similarly, inserts are very fast because the data has to be inserted in only a single place and does not have to be duplicated.
Selects are fast in cases where data has to be fetched from a single table, again because normalized tables tend to fit into the buffer cache.
Because the data is not duplicated, there is less need for heavy GROUP BY or DISTINCT queries.
Denormalizing out of convenience is lazy programming and bad design.
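A minimal sketch of the single-place-update point above (using SQLite from Python as a stand-in for SQL Server; the `Customer`/`Orders` schema is purely illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Normalized: customer attributes live in exactly one row.
cur.executescript("""
CREATE TABLE Customer (CustomerId INTEGER PRIMARY KEY, Name TEXT, City TEXT);
CREATE TABLE Orders   (OrderId INTEGER PRIMARY KEY,
                       CustomerId INTEGER REFERENCES Customer(CustomerId),
                       Amount REAL);
INSERT INTO Customer VALUES (1, 'Acme', 'Boston');
INSERT INTO Orders VALUES (10, 1, 100.0), (11, 1, 250.0);
""")

# The customer moves: one UPDATE touches one row, no duplicates to chase.
cur.execute("UPDATE Customer SET City = 'Chicago' WHERE CustomerId = 1")

# A denormalized design would repeat Name/City on every order row, so the
# same change would have to find and update every one of those copies.
rows = cur.execute("""
SELECT o.OrderId, c.Name, c.City, o.Amount
FROM Orders o JOIN Customer c ON c.CustomerId = o.CustomerId
ORDER BY o.OrderId
""").fetchall()
print(rows)  # every order now reflects the new city
```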

Are there other reasons to NOT do a Denormalized database?  Or is he right?
CipherIS asked:
Éric Moreau, Senior .Net Consultant, commented:
Denormalized data sometimes helps performance when the data is static. I would never build a denormalized table for data that is continuously updated.

I often create a denormalized table containing the results of long/complex calculation used by reports or dashboard screens.
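A minimal sketch of that pattern (SQLite from Python standing in for SQL Server; the `Sales`/`rpt_RegionTotals` names are illustrative): the expensive aggregation is computed once into a reporting table, rather than repeated by every dashboard query.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE Sales (SaleId INTEGER PRIMARY KEY, Region TEXT, Amount REAL);
INSERT INTO Sales VALUES (1,'East',100.0),(2,'East',50.0),(3,'West',70.0);

-- Precomputed, denormalized reporting table: rebuilt on a schedule,
-- never updated row-by-row by the transactional workload.
CREATE TABLE rpt_RegionTotals AS
  SELECT Region, SUM(Amount) AS Total, COUNT(*) AS SaleCount
  FROM Sales
  GROUP BY Region;
""")

# Reports read the flat result directly, with no joins or aggregation.
totals = cur.execute(
    "SELECT Region, Total FROM rpt_RegionTotals ORDER BY Region"
).fetchall()
print(totals)
```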

CipherIS (Author) commented:
So, is it OK for data being inserted?  What about querying the data?
Jim Horn, Microsoft SQL Server Data Dude, commented:
Generally, if there's a conflict between production data entry (as argued by you) and reporting (as argued by the engineer), production data entry wins every time.

You are correct in that a normalized database is better suited for a transactional model.
No duplication of data goes on here, as everything has a single location.  
Typically for transactional models there are mostly INSERTs and UPDATEs going on, with some SELECTs.

The engineer is correct in that denormalized databases are faster for reporting as they require fewer JOINs.  
This is the same thinking that leads to cubes and star/snowflake data models.
Typically for reporting, only SELECT statements are going on.

I've seen lots of people take a transactional database and create a denormalized database for reporting, with an ETL such as SSIS inserting data into the denormalized database every day/hour/whatever.
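A minimal sketch of that ETL refresh step (SQLite from Python standing in for SQL Server and SSIS; all table names are illustrative). The wide reporting table is wiped and rebuilt from the normalized source in one transaction, so readers never see a half-loaded state:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE Customer (CustomerId INTEGER PRIMARY KEY, Name TEXT);
CREATE TABLE Orders (OrderId INTEGER PRIMARY KEY, CustomerId INTEGER, Amount REAL);
INSERT INTO Customer VALUES (1,'Acme'),(2,'Globex');
INSERT INTO Orders VALUES (10,1,100.0),(11,2,40.0);

-- Target: a flat, join-free table for reporting.
CREATE TABLE rpt_OrdersWide (OrderId INTEGER, CustomerName TEXT, Amount REAL);
""")

def refresh_reporting_table(conn):
    """Hypothetical nightly/hourly ETL step: rebuild the wide
    reporting table from the normalized source in one transaction."""
    with conn:  # commits on success, rolls back on error
        conn.execute("DELETE FROM rpt_OrdersWide")
        conn.execute("""
            INSERT INTO rpt_OrdersWide (OrderId, CustomerName, Amount)
            SELECT o.OrderId, c.Name, o.Amount
            FROM Orders o JOIN Customer c ON c.CustomerId = o.CustomerId
        """)

refresh_reporting_table(conn)
wide = conn.execute(
    "SELECT OrderId, CustomerName FROM rpt_OrdersWide ORDER BY OrderId"
).fetchall()
print(wide)
```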

I have never seen any experienced database developer start with a denormalized database.   Maybe a superuser that hasn't been trained in database concepts can crank something denormalized out, or that's what their source of data is, but it's rarely a conscious decision to start denormalized.

Good luck.
Jim

John Tsioumpris, Software & Systems Engineer, commented:
Instead of spending time finding ways to denormalize data, spend that time creating the views that will give you the information you expect from denormalization. Normalization is the way to go.
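A minimal sketch of the view approach (SQLite from Python standing in for SQL Server; names are illustrative): the view presents the flat, "denormalized" shape that reports want, while the base tables stay normalized and update-friendly.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE Customer (CustomerId INTEGER PRIMARY KEY, Name TEXT);
CREATE TABLE Orders (OrderId INTEGER PRIMARY KEY, CustomerId INTEGER, Amount REAL);
INSERT INTO Customer VALUES (1,'Acme');
INSERT INTO Orders VALUES (10,1,100.0);

-- The join lives in one place; report writers query the view
-- as if it were a single wide table.
CREATE VIEW vw_OrderDetails AS
  SELECT o.OrderId, c.Name AS CustomerName, o.Amount
  FROM Orders o JOIN Customer c ON c.CustomerId = o.CustomerId;
""")

result = conn.execute(
    "SELECT OrderId, CustomerName, Amount FROM vw_OrderDetails"
).fetchall()
print(result)
```

(SQL Server additionally offers indexed views, which can materialize such a view when the join itself becomes the bottleneck.)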
CipherIS (Author) commented:
The engineer is stating that using joins is going to affect performance.  Any arguments against it?
Jim Horn, Microsoft SQL Server Data Dude, commented:
Fewer joins yes, but reporting dictating architecture over data entry is the tail wagging the dog.

Assuming there is data entry going on in this db.  Correct?

Sounds like this engineer knows only what he is arguing, which is a pleasant way of saying he doesn't know what the eff he's talking about.
John Tsioumpris, Software & Systems Engineer, commented:
If the engineer is certain, why don't you try to find out if he's right?
Just set up two similar computers, load the same amount of data into each, and test which is faster.
If he proves right, double or triple the amount of data to project into the future.
Denormalized data will, at some point, start to show its disadvantages.
Scott Pletcher, Senior DBA, commented:
Perhaps with denormalized data you could get incorrect answers more quickly.  However, the joy of getting answers quickly disappears once people figure out the answer is wrong!
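A minimal sketch of how that happens (SQLite from Python; the `OrdersWide` table is illustrative): with the customer's city repeated on every order row, a partial or buggy update leaves the duplicates disagreeing, and queries start returning conflicting answers.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Denormalized: the customer's city is repeated on every order row.
cur.executescript("""
CREATE TABLE OrdersWide (OrderId INTEGER PRIMARY KEY,
                         CustomerName TEXT, City TEXT, Amount REAL);
INSERT INTO OrdersWide VALUES (10,'Acme','Boston',100.0),
                              (11,'Acme','Boston',250.0);
""")

# A buggy or partial update touches only one of the duplicate rows...
cur.execute("UPDATE OrdersWide SET City = 'Chicago' WHERE OrderId = 10")

# ...and now the same customer appears to be in two cities at once.
cities = cur.execute(
    "SELECT DISTINCT City FROM OrdersWide WHERE CustomerName = 'Acme' ORDER BY City"
).fetchall()
print(cities)  # two conflicting answers for one customer
```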
CipherIS (Author) commented:
Thanks for the input.  Still trying to convince them my point is accurate.