William Peck (United States of America)

asked on

Question on the extent of adoption of in-memory computing (Oracle and SAP HANA), and whether it eliminates data warehouse / OLAP systems

I've been learning about in-memory computing (both Oracle and SAP HANA), and this is looking like a major revolution, no? Making the architecture of the '90s a relic of the past? Just trying to understand.

In this SAP presentation (first 5 minutes), Prof. Hasso Plattner talks about the revolution that is HANA. Because it's an in-memory database, performance is lightning-fast, which opens up many possibilities. HANA's PDF is attached.
 
After researching this a bit (including Oracle, here), I can see the value of in-memory computing: faster, reduces complexity, reduces data footprint, etc.

Prof. Plattner also talks about eliminating OLAP databases (and data warehouses, which I suppose are OLAP) because of the speed. So is this true too? He says to bring analysis back to the OLTP system. I also heard HANA can scan 2 billion records per second . . . if so, wow.
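To make the "analysis back on the OLTP system" idea concrete, here is a minimal sketch using Python's built-in sqlite3 module in its ":memory:" mode. SQLite is of course not HANA or Oracle, and the table and data below are invented for illustration, but it shows the shape of the argument: the analytic aggregate is answered directly from the transactional table, with no ETL job and no separate warehouse.

```python
import sqlite3

# SQLite's ":memory:" mode stands in here for a true in-memory engine
# like HANA; the internal layout differs, but the idea of querying the
# transactional rows directly, with no separate warehouse, is the same.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE enrollments (student_id INTEGER, term TEXT, credits INTEGER)"
)
conn.executemany(
    "INSERT INTO enrollments VALUES (?, ?, ?)",
    [(1, "2023F", 15), (2, "2023F", 12), (1, "2024S", 9), (3, "2024S", 15)],
)

# An analytic question answered straight from the OLTP table: no ETL,
# no pre-built cube; the aggregate is computed on the fly at query time.
rows = conn.execute(
    "SELECT term, COUNT(DISTINCT student_id), SUM(credits) "
    "FROM enrollments GROUP BY term ORDER BY term"
).fetchall()
print(rows)  # [('2023F', 2, 27), ('2024S', 2, 24)]
```

The in-memory pitch is that once scans are fast enough, this kind of query is cheap even against live transactional data, so the separate OLAP copy loses its reason to exist.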

I'm old-school Oracle; I came in at the beginning of client-server, three-tier, etc., with a little COBOL thrown in back in the day. I can write awesome SQL statements, did plenty of PL/SQL programming, and lots of performance tuning. But it looks like this is a new world and I'd better get a handle on it.

We have a small database, so size is not even a consideration: about 60,000 main master records (students in a higher-education setting).

So I'm wondering:
- the extent to which in-memory is being implemented
- are OLAP / data warehouse systems going to be obsolete?
- other thoughts for an old dog . . .
SAP-Hana--In-Memory-platform.pdf
SOLUTION
Kyle Hamilton (United States of America)
William Peck (ASKER)

Kyle, thanks.

It's all new to me, so I appreciate your perspective. I'll dig into their documentation a little more, and I'll leave this open until early next week.
Hi, this is a pretty important question and I was hoping to get more feedback on it. How about sdstuber, slightwv, or mlcc?
ASKER CERTIFIED SOLUTION
Sean Stuber
sdstuber,

thank you !

re: reduces data footprint (and reduces complexity)
- the point here is that you don't have to store aggregates, which currently require extra processes to a) build and b) constantly validate. The logic to come up with your totals might be the same, but the overhead is reduced, including the data footprint. Aggregates are then always computed on the fly.
- but good points on the other data-footprint items
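The stored-aggregate overhead can be sketched with a small hypothetical example (Python's stdlib sqlite3; the tables and values are invented): a separately maintained aggregate table needs an extra process to build and re-validate it, and it drifts the moment a write misses that process, while the on-the-fly aggregate is always consistent with the base rows.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE grades (student_id INTEGER, points REAL)")
# The "old" pattern: a separately maintained aggregate table that an
# extra process must build and continually re-validate.
conn.execute("CREATE TABLE grade_totals (student_id INTEGER, total REAL)")

conn.execute("INSERT INTO grades VALUES (42, 3.0)")
conn.execute("INSERT INTO grade_totals VALUES (42, 3.0)")

# A later write that forgets (or races with) the aggregate-maintenance job:
conn.execute("INSERT INTO grades VALUES (42, 4.0)")

stored = conn.execute(
    "SELECT total FROM grade_totals WHERE student_id = 42"
).fetchone()[0]
on_the_fly = conn.execute(
    "SELECT SUM(points) FROM grades WHERE student_id = 42"
).fetchone()[0]
print(stored, on_the_fly)  # 3.0 7.0 : the stored aggregate has drifted
```

Dropping the aggregate table removes both the drift risk and the maintenance processes; the cost is that every query pays the aggregation at read time, which is the bet in-memory engines are making.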

re: In-Memory is an EXTRA cost option
- the HANA guy (Hasso Plattner, co-founder of SAP and pretty much a tech guru) has a segment on why the reduced complexity etc. lowers overall cost (9:54 to 12:30)
- his point is, given that in-memory is the way to go, it costs $XXX to put everything in-memory with your run-of-the-mill database (like Oracle 12c, I presume), but it costs significantly less to build with HANA (and EVEN LESS with their fancy-schmancy S/4HANA).

- good points on OLAP / data warehouse as well

Overall, this seems to be a major change, one that upends old ways of doing things, at least according to Prof. Plattner.

Doesn't it seem that any shop worth its salt should be investigating this? And for the dinosaur development manager who says, "we're just fine, don't worry about this," what should I say? (This is the same development manager who is spending 100% of his time patching a 20-year-old system, including many, many processes that pre-aggregate the data.)

I think you will enjoy and glean very good points from the presentation (see link in my post).

Thanks again !
Avatar of Sean Stuber
Sean Stuber

>>> what should I say ?

for your data volumes, I wouldn't recommend the costs of in-memory unless you're doing a ton of analytics and materialized results.

However, I do recommend testing it out. There's no license cost for trying a feature out. If you have hardware available to allocate the extra memory, then give it a shot in development and see what it does for you.

If the results are significant, let them speak for themselves.  If they aren't, then those results should speak just as loudly against the cost.
Thanks for the tips!