How should I handle a daily .csv file import into a database? (MSSQL)

Purely at a theoretical level, I don't know if there is a "correct" answer. I know this is really general, but I would appreciate some advice.

I have a project that requires a file import on a daily basis. I'd like to optimize the database but I'm not sure what's best in this situation. I've never had to deal with a daily import of data before.

The daily list is a .csv file of vehicles with make, model, year, and condition, but it also includes fields like address, city, etc. The database itself will include this imported list plus mainly administration-type things (Admin, Users, Offers) and layout/display info to facilitate searches, display, etc.

Is it worth the time to import the list and then transfer it into tables like Address, Vehicle, Users, and Admin, with foreign keys and a typical normalized database structure?

Or should I just leave the list as is and only optimize the Admin stuff?


By "optimize", I think you mean "normalize". Optimization means designing the database so that you get good performance and/or space utilization. Normalization means designing the database to promote data integrity.

In some cases, normalization and optimization are at odds, but not always. The goal is to find a happy balance within the bounds of your stakeholder requirements.

Since the database seems to be for reporting and analysis, I don't see much point in foreign key constraints (except possibly to identify problems in the source data). Ditto for normalization. However, you may want to consider restructuring the CSV data to address specific performance issues; that includes decomposition as well as aggregation and indexing.
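For illustration, here is a minimal sketch of the staging-table pattern this suggests: load the daily CSV into a dedicated table with a truncate-and-reload, and restructure from there. All table, column, and file names are assumptions, not from the thread.

```sql
-- Hypothetical staging table for the daily vehicle CSV (names assumed).
IF OBJECT_ID('dbo.VehicleImport_Staging') IS NULL
    CREATE TABLE dbo.VehicleImport_Staging (
        Make      varchar(50),
        Model     varchar(50),
        ModelYear smallint,
        Condition varchar(20),
        Address   varchar(100),
        City      varchar(50)
    );

-- Twice-daily load: empty the table, then bulk-load the new file.
TRUNCATE TABLE dbo.VehicleImport_Staging;

BULK INSERT dbo.VehicleImport_Staging
FROM 'C:\imports\vehicles.csv'   -- assumed path
WITH (FIRSTROW = 2,              -- skip the header row
      FIELDTERMINATOR = ',',
      ROWTERMINATOR = '\n',
      TABLOCK);
```

Loading into a staging table first (rather than straight into user-facing tables) keeps a bad file from corrupting live data and gives you a place to validate or reshape before anything downstream sees it.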
How much data are you importing?
Is the data primarily for reporting or for searching?
Is the data read-only, or can users amend it through whichever tool will be doing the searching?
How powerful is the server hosting your database?

As Dqmq said, you need to know what the data is for before you can really plan the database.
Personally, if no unique field exists in the data, I'd add an auto-incrementing integer primary key to help with searching (users love a simple ID key!).
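An auto-incrementing surrogate key in SQL Server is an `IDENTITY` column. A sketch, again with assumed table and column names:

```sql
-- Sketch: surrogate integer key via IDENTITY (names are assumptions).
CREATE TABLE dbo.VehicleImport (
    VehicleImportID int IDENTITY(1,1) NOT NULL
        CONSTRAINT PK_VehicleImport PRIMARY KEY,
    Make      varchar(50),
    Model     varchar(50),
    ModelYear smallint,
    Condition varchar(20),
    City      varchar(50)
);
```

SQL Server assigns `VehicleImportID` automatically on insert, so the CSV rows don't need to carry their own unique identifier.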

The vital step is indexing, e.g. on make/model/year and on state/city; this helps with both searching and reporting. (We recently had a very humbling "Indexing for Dummies" lesson from a Microsoft SQL designer, so make sure you understand indexing.)
After each import, rebuild the indexes through T-SQL, as the built-in re-indexing is a bit dubious, and run DBCC CHECKDB every so often to make sure your database stays healthy.
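The indexing and maintenance steps above might look like the following, assuming the hypothetical import table and database names used here (they are not from the thread):

```sql
-- One-time: index the columns users search and report on (names assumed).
CREATE INDEX IX_VehicleImport_MakeModelYear
    ON dbo.VehicleImport (Make, Model, ModelYear);

CREATE INDEX IX_VehicleImport_City
    ON dbo.VehicleImport (City);

-- After each bulk load: rebuild the table's indexes via T-SQL.
ALTER INDEX ALL ON dbo.VehicleImport REBUILD;

-- Periodically: verify the physical and logical integrity of the database.
DBCC CHECKDB ('VehicleDb') WITH NO_INFOMSGS;
```

Rebuilding after a full reload is cheap relative to the load itself, and it leaves the indexes unfragmented for the day's queries.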
MHenryAuthor Commented:
It's about 250 MB, and it's actually being imported twice daily.

I was just thinking that (for example) if there's an address in the import list and an address for users, the typical design would be a single Address table with the usual foreign keys.

But I've been thinking lately that there is no real reason to try to maneuver all that data into different tables, especially since it's being replaced twice a day. It just seems like more of an opportunity for something to go wrong.

Mainly just wanted to see if there's a compelling reason to do so...

>Mainly just wanted to see if there's a compelling reason to do so...

"Compelling reasons" need to come from your business case, not from an outsider's view. The integrity of a reporting/analysis database depends on the loading process, which can be fairly tightly controlled. Very often, the overhead of referential integrity on top of that is not cost-effective. So, in that sense, I agree with your gut: why add complexity and risk unless there is a justified benefit? Is there?

That said, I can tell you that many report/analysis imports are restructured for better alignment with the anticipated queries. That's done both for performance and to simplify query development. Your call.
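As one way to picture "restructuring for the anticipated queries": after each load, populate a query-shaped table from the staging data. This is only a sketch; the target table and columns are assumptions.

```sql
-- Sketch: reshape the raw import into a table aligned with search queries.
-- Assumes a hypothetical staging table and a pre-aggregated search table.
TRUNCATE TABLE dbo.VehicleSearch;

INSERT INTO dbo.VehicleSearch (Make, Model, ModelYear, City, ListingCount)
SELECT Make, Model, ModelYear, City, COUNT(*)
FROM dbo.VehicleImport_Staging
GROUP BY Make, Model, ModelYear, City;
```

The point is that the reshaping is driven by the queries you expect, not by normalization rules: you trade some storage for simpler, faster reads.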

MHenryAuthor Commented:
I don't think there is a compelling reason. But that's why I'm asking here. ;)

I can move the data into more traditional tables once a customer adds a vehicle to their list, and once it's in the user's list I can do all the more complex queries. The imported list is mainly just a "choose from this list" kind of thing.

At least that's the way I'm thinking about it now.
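That "copy on selection" idea can be sketched as a single INSERT...SELECT: when a user picks a vehicle, copy just that row from the import table into the normalized side. Table names, the surrogate key, and the parameters are all assumptions for illustration.

```sql
-- Sketch: promote one chosen row from the imported list into the
-- user-facing, normalized tables (names and keys are assumptions).
-- @UserID and @ChosenID would come from the application.
INSERT INTO dbo.UserVehicle (UserID, Make, Model, ModelYear, Condition)
SELECT @UserID, Make, Model, ModelYear, Condition
FROM dbo.VehicleImport_Staging
WHERE VehicleImportID = @ChosenID;
```

This keeps the twice-daily reload simple and disposable, while the data users actually act on lives in stable, relationally designed tables.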