Solved

Database design of large amounts of heterogeneous data

Posted on 2014-04-05
775 Views
Last Modified: 2014-04-06
All,

We are developing an application that collects data from different sources. The data could later be used for visualization [graphs], reporting and calculations [standard deviation, mean, etc.]. The format of the data can differ from source to source. Some typical formats are

1) Key1:Value1
   Key2:Value2
   Key3:Value3
   -----------------

2) Key1:Value1;Key2:Value2;Key3:Value3

3) Key1:Value1,Key2:Value2,Key3:Value3

4) Key1:Value1
   Key2:Value2
   Key3:Value3

The key will always be a string and the value can be a double, integer, etc. How do we design a database for storing such heterogeneous data? The database could be either SQL Server, Oracle or SQLite.
0
Question by:sabarish2u
15 Comments
 
LVL 83

Expert Comment

by:Dave Baldwin
ID: 39979697
You will probably have to convert it all to strings.  I suppose you could store it in binary blobs but then you have to write an application to get it in and out of the database and into forms that you can use.
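A minimal sketch of that all-strings approach on SQLite (the raw_measurement table and its columns are just names made up for illustration):

```python
import sqlite3

# Everything-as-strings key/value table: easy to load any feed into,
# but the database can no longer validate or natively compare the values.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE raw_measurement (
        source TEXT NOT NULL,   -- which feed the row came from
        key    TEXT NOT NULL,
        value  TEXT             -- doubles, integers, dates... all stored as text
    )
""")
conn.execute("INSERT INTO raw_measurement VALUES (?, ?, ?)",
             ("feed1", "Key1", "3.14"))

# Any calculation has to cast back out of text first.
mean = conn.execute(
    "SELECT AVG(CAST(value AS REAL)) FROM raw_measurement WHERE key = 'Key1'"
).fetchone()[0]
print(mean)
```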
0
 
LVL 16

Expert Comment

by:Wasim Akram Shaik
ID: 39979708
As for the design, you should convert them to strings, like Dave suggested.

If you have a segregation of sources, say there are 8 input systems, then you can think of 8 interface tables for those sources, and all the interface tables should have the same structure.

As you said, the key is always a string, so you have to think through the various possibilities for those keys so that you can identify the columns in the tables.
0
 
LVL 74

Expert Comment

by:sdstuber
ID: 39980095
On Oracle - If the values are always numeric, then just use NUMBER.  If not, then consider using the ANYDATA type, which, as the name implies, will store any data type Oracle supports.

Another option, which would work for any platform, is to simply extend your table with columns for different types

Key, IntegerValue, DoubleValue, StringValue, DateValue, IntervalValue, GeoLocValue, etc

This would mean your queries would be slightly different for each type; but then they probably would be anyway, because you'd have to make accommodations for the varying data if it were all jammed into a single data type.

By using the correct type for each data value you'll not only get better efficiency in processing them but also an implicit constraint for data quality.  You can't put a date into an integer; but if they were all strings, you could.

A downside to this method is that you'll have to declare all of your value columns as nullable, to account for only one value being populated in a given row.


And of course the last option, which is probably best...
Simply create different Key,Value tables: one table for each data type.
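A rough SQLite sketch of both layouts; the measurement* table and column names below are just illustrative, not a prescription:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Option 1: one wide table with a nullable column per data type;
    -- only the column matching the value's type is populated.
    CREATE TABLE measurement (
        source       TEXT NOT NULL,
        key          TEXT NOT NULL,
        int_value    INTEGER,
        double_value REAL,
        text_value   TEXT,
        date_value   TEXT     -- SQLite has no native DATE type
    );

    -- Option 2: one Key,Value table per data type.
    CREATE TABLE measurement_int    (source TEXT NOT NULL, key TEXT NOT NULL, value INTEGER);
    CREATE TABLE measurement_double (source TEXT NOT NULL, key TEXT NOT NULL, value REAL);
    CREATE TABLE measurement_text   (source TEXT NOT NULL, key TEXT NOT NULL, value TEXT);
""")

# Option 1: populate only the typed column that matches the value.
conn.execute("INSERT INTO measurement (source, key, double_value) VALUES (?, ?, ?)",
             ("feed1", "Key1", 98.6))

# Option 2: the loader routes each pair to the table for its type.
conn.execute("INSERT INTO measurement_double VALUES (?, ?, ?)", ("feed1", "Key1", 98.6))

# Either way, aggregations run on properly typed columns, not on strings.
print(conn.execute("SELECT AVG(value) FROM measurement_double").fetchone()[0])
```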
0
 
LVL 77

Expert Comment

by:slightwv (Netminder)
ID: 39980540
For the data to be used later for "visualization [Graphs], Reporting and calculations", would you not sort of have to know what you will be getting for input?

I mean, if one feed was average ice cream temps per city and another was average height above sea level for a state, there is not much you could do with those two sets of data.

As far as the various input formats go, would you not also sort of have to know what format you will be getting from what source?

I would set up a parser for every feed I received that would parse the input and then store the data appropriately in relational tables so it can easily be used later.
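For example, assuming the feeds really only differ by the pair delimiter shown in the question, a per-feed parser could be as small as the sketch below (parse_record and coerce are hypothetical helper names):

```python
def parse_record(raw: str, pair_sep: str) -> dict:
    """Split one record into a key -> raw string value dict.

    pair_sep is configured per feed: a newline for formats 1 and 4,
    ";" for format 2 and "," for format 3.
    """
    pairs = {}
    for chunk in raw.split(pair_sep):
        chunk = chunk.strip()
        if not chunk or ":" not in chunk:   # skips blanks and "-----" separator lines
            continue
        key, _, value = chunk.partition(":")
        pairs[key.strip()] = value.strip()
    return pairs


def coerce(value: str):
    """Best-effort typing so each value lands in the right typed column/table."""
    for cast in (int, float):
        try:
            return cast(value)
        except ValueError:
            pass
    return value   # fall back to plain string


print(parse_record("Key1:1;Key2:2.5;Key3:abc", pair_sep=";"))
print([coerce(v) for v in ("1", "2.5", "abc")])   # [1, 2.5, 'abc']
```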
0
 

Author Comment

by:sabarish2u
ID: 39980951
We are planning to write a parser that would parse the input and store the data into a relational database... However, we are not sure how the tables should be designed... The suggestion by sdstuber seems good.
0
 
LVL 33

Expert Comment

by:ste5an
ID: 39981156
It really depends on the amount of data and the performance you need from your reports and calculations.

But basically your data looks like you need an EAV (entity-attribute-value) model.

You may also consider a data-warehouse-style star schema. But this depends on the semantics of your data.
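A bare-bones EAV layout for this kind of data might look roughly like the sketch below (the observation table and its columns are invented for the example):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    -- Plain EAV: one row per entity/attribute/value triple.
    CREATE TABLE observation (
        entity_id INTEGER NOT NULL,   -- e.g. one collected record from one source
        attribute TEXT    NOT NULL,   -- the key
        value     TEXT,               -- untyped in the basic form of the model
        PRIMARY KEY (entity_id, attribute)
    )
""")
conn.executemany("INSERT INTO observation VALUES (?, ?, ?)",
                 [(1, "Key1", "3.5"), (1, "Key2", "42"), (1, "Key3", "north")])
```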
0
 

Author Comment

by:sabarish2u
ID: 39981203
The data can be huge...
0
 
LVL 74

Expert Comment

by:sdstuber
ID: 39981206
>>> the data can be huge ,  

If the data really is going to have large volume, then don't put anything between you and your data to make it less efficient.

Create your tables and columns with the correct data types.
If that means using multiple columns or multiple tables, that's OK.
Better than OK: it's better than trying to stuff different types of data into a single structure.
0
 
LVL 33

Expert Comment

by:ste5an
ID: 39981217
In general I would prefer a type-safe EAV model, that is, one table for each data type in your value column.

On SQL Server 2012+ I would also consider using columnstore indexes with the data warehouse approach.
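As a sketch of that type-safe EAV idea on SQLite: the small attribute table recording each key's declared type is just one possible refinement, and all names below are illustrative:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Type-safe EAV: a shared attribute list plus one value table per data type.
    CREATE TABLE attribute (
        name      TEXT PRIMARY KEY,
        data_type TEXT NOT NULL CHECK (data_type IN ('int', 'real', 'text'))
    );
    CREATE TABLE value_int  (entity_id INTEGER, attribute TEXT REFERENCES attribute(name), value INTEGER);
    CREATE TABLE value_real (entity_id INTEGER, attribute TEXT REFERENCES attribute(name), value REAL);
    CREATE TABLE value_text (entity_id INTEGER, attribute TEXT REFERENCES attribute(name), value TEXT);
""")

# The loader looks up each key's declared type and targets the matching table.
conn.execute("INSERT INTO attribute VALUES ('Key1', 'real')")
conn.execute("INSERT INTO value_real VALUES (1, 'Key1', 3.5)")
```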
0
 

Author Comment

by:sabarish2u
ID: 39981220
One of the problems is that we need to support at least 3 databases: SQL Server, Oracle and SQLite...
0
 
LVL 74

Expert Comment

by:sdstuber
ID: 39981225
>> One of the problems is that we need to support at least 3 databases

I highly recommend NOT trying to make a generic solution.  One of the hallmarks of poor applications is sacrificing quality in the name of portability.

Create tables and columns with consistent names but create them using whatever datatypes are appropriate to that platform.
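For instance, the loader could keep one logical column list and map logical types to platform-appropriate physical types when generating the DDL; the sketch below uses common defaults (VARCHAR2/BINARY_DOUBLE on Oracle, NVARCHAR/FLOAT on SQL Server, TEXT/REAL on SQLite), but those choices are assumptions, not requirements:

```python
# Logical column definitions shared across platforms: (name, logical type).
COLUMNS = [("source", "string"), ("key", "string"), ("value", "double")]

# Per-platform physical types; adjust to your versions and precision needs.
TYPE_MAP = {
    "oracle":    {"string": "VARCHAR2(200)", "double": "BINARY_DOUBLE", "int": "NUMBER(10)"},
    "sqlserver": {"string": "NVARCHAR(200)", "double": "FLOAT",         "int": "INT"},
    "sqlite":    {"string": "TEXT",          "double": "REAL",          "int": "INTEGER"},
}

def create_table_sql(table: str, platform: str) -> str:
    """Emit a CREATE TABLE with consistent names but platform-specific types."""
    cols = ", ".join(f"{name} {TYPE_MAP[platform][ltype]}" for name, ltype in COLUMNS)
    return f"CREATE TABLE {table} ({cols})"

for platform in TYPE_MAP:
    print(create_table_sql("measurement_double", platform))
```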
0
 
LVL 33

Expert Comment

by:ste5an
ID: 39981233
Well, @sdstuber is correct. With your 3 selected RDBMS there is no generic solution that performs best on all of them. This is mainly because of SQLite.

But don't mix the semantic model with the implementation. Accessing your data through the same semantic views and procedures on every platform allows you to use whatever table model is appropriate underneath; that model can be optimized for each RDBMS and thus differ from RDBMS to RDBMS.
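As a small SQLite illustration of that separation (view and table names are mine): the physical tables can differ per platform while the reporting layer always queries the same view:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Platform-specific physical layout (here: one table per type).
    CREATE TABLE measurement_int    (source TEXT, key TEXT, value INTEGER);
    CREATE TABLE measurement_double (source TEXT, key TEXT, value REAL);

    -- Stable semantic layer that reports and calculations are written against.
    CREATE VIEW measurement_all AS
        SELECT source, key, CAST(value AS REAL) AS value FROM measurement_int
        UNION ALL
        SELECT source, key, value FROM measurement_double;
""")
conn.execute("INSERT INTO measurement_int VALUES ('feed1', 'Key2', 42)")
conn.execute("INSERT INTO measurement_double VALUES ('feed1', 'Key1', 3.5)")
print(conn.execute("SELECT AVG(value) FROM measurement_all").fetchone()[0])
```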
0
 

Author Comment

by:sabarish2u
ID: 39981259
Ok... Can I summarize that the most acceptable solution would be to create tables for each data type and store the data there?
0
 
LVL 74

Accepted Solution

by:
sdstuber earned 300 total points
ID: 39981262
>>> Can I summarize that the most acceptable solution would be to create tables for each data type and store the data there?

Yes, that's what I've been recommending, and I think everyone pretty much agrees.
0
 
LVL 33

Expert Comment

by:ste5an
ID: 39981280
+1
0
