Importing inconsistent Excel data into a SQL Server table

I have a bunch of spreadsheets with info listed in multiple rows and columns, similar to the attached file. Each sheet represents a box with multiple files in it. The box number is listed once in column A. Then column D starts the specific data for a file. Some rows have data in columns E, F and/or G. Some files have 1 or 2 rows of info, others may have 3, 4 or 5. (See attached.) The last row of data for each file contains "Attorney Initial(s)" in column D and "Destroy Date" in column G.

What I'm trying to do is develop a process that can read the info and import it into a table where each "File" is in one row.  Column Headers are:

Box #
Attorney Initials
Destroy Date
Client # (the data in column G with numbers)


Nico BontenbalCommented:
If someone asked me to do that, I'd write some VBA code in Excel to process the data. The macro can loop through all the Excel files, open them one by one, and loop through all the sheets in each file. It then loops through all the rows and columns of each sheet, extracts the data, and inserts it into SQL Server. If you have never written VBA code before, this is quite a big task; you might consider hiring someone to do it for you. The code should do something like this:
Loop through all the .xls files in a folder
   Open the file
   Loop through all sheets in the file
      Get the box number from column A
      Loop through all the rows
          If first row or new file, 'remember' the client number from column G
          If not, add and 'remember' all the descriptions
          Until column D starts with "Attorney Initials", then get the initials and the destroy date from that row and store all the data in SQL Server
          Start with the next file

This might not be 100% correct, but you'll need some kind of rule set like this. Also keep in mind that a slight variation in the structure that would be totally clear to a human (e.g. having 'DFG (Attorney Initials)' in a field somewhere) would create undesired results.
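The rule set above could be sketched as a small state machine in Python. This is only a sketch under assumptions: the column layout (A for the box number, D for descriptions and the marker row, G for the client number and destroy date), the "Attorney Initial(s)" marker text, and the label:value format on a file's final row are all taken from the question's prose. A real import would read each workbook with a library such as openpyxl and insert the records via pyodbc rather than returning a list.

```python
def parse_sheet(rows):
    """Group a sheet's rows into one record per file.

    `rows` is a list of 7-element lists representing columns A..G
    (empty cells as ""). The box number appears once in column A;
    description rows carry text in column D and the client number in
    column G; a file's final row has "Attorney Initial(s)" in column D
    and the destroy date in column G.
    """
    records = []
    box = None
    descriptions = []
    client = None
    for row in rows:
        a, d, g = row[0], row[3], row[6]
        if a:                                   # box number, listed once
            box = a
        if d and d.startswith("Attorney Initial"):
            # Final row of a file: take whatever follows the label
            # (assumed "Attorney Initials: XYZ" / "Destroy Date: ...").
            initials = d.split(":", 1)[-1].strip()
            destroy = g.split(":", 1)[-1].strip() if g else ""
            records.append({
                "Box #": box,
                "Attorney Initials": initials,
                "Destroy Date": destroy,
                "Client #": client,
                "Description": " ".join(descriptions),
            })
            descriptions, client = [], None     # start the next file
        else:
            if d:
                descriptions.append(d)          # 'remember' the descriptions
            if g:
                client = g                      # 'remember' the client number
    return records

# Hypothetical sample mirroring the layout described in the question.
sample = [
    ["Tub 7", "", "", "Smith file",             "", "", "1721.002"],
    ["",      "", "", "Correspondence",         "", "", ""],
    ["",      "", "", "Attorney Initials: DFG", "", "", "Destroy Date: 2021-06-01"],
]
records = parse_sheet(sample)
```

As Nico warns, any deviation from the expected layout (a marker spelled differently, initials in a different column) would silently produce wrong groupings, so a real version would want to log rows it cannot classify.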
Dale FyeCommented:
Concur with Nicobo.

From your example, would you like 2 records or 3? The inconsistency that row 6 does not have any initials but has text above it would be a complication, but you could simply leave that field empty.

I assume that the "Box #" field would contain "Tub 7 - J Cool", "EPA", "Tub 8 - Kelly Hump"

But would you want to compile all of the info from Cells D8, E9, D9, D10, D11, D12 into a single description field?

Also, the 1721.002 is a bit confusing as a "Destroy Date".  That value might mean something in the legal profession, but is hard to fathom as a "date" field.
Nico BontenbalCommented:
And I agree with Dale. It's not totally clear how you would like to process the data. I think it would help if you described the exact result (so the data in the SQL table) that you would like from the Sample.xls. Don't forget that you might have been working with these sheets for hours/days/months/years but we see them for the first time.

James ElliottManaging DirectorCommented:
I normally love this sort of thing, but I agree with the above: your data is a complete mess. At the very least (for a free solution) you need to provide an anonymised file of your data so we can see the full problem, and a draft mock-up of how you want it to look afterwards.

RavenTimAuthor Commented:
Hey guys, sorry, but I was on vacation and have been catching up... I'll get some more info together and post it later today.

RavenTimAuthor Commented:
Ok. I've attached two samples showing the data as-is (sheet 1) and the desired result (sheet 2). I'm not sure how doable this is going to be, as I have about 1,500 separate files to process and there is little consistency. Please see the attached samples.

James ElliottManaging DirectorCommented:
I'm slightly confused by the first file. Are we to assume that rows 80 onwards should have been completed in the pro forma cells above? Is this how the files differ, i.e. people have chosen to adhere to the template at random?

If so, I fear a quick generic approach is not going to be possible.

If you were able to provide a list of every partner's initials and surname, then a generic routine might hit 70% accuracy, but I'm guessing the final outcome needs to be better than that?

I'd suggest getting a fast-typing temp for a week :/

RavenTimAuthor Commented:
:-)  That's kind of what I thought...  

Martin LissOlder than dirtCommented:
This question has been classified as abandoned and is closed as part of the Cleanup Program. See the recommendation for more details.