From the memory-resident performance of SAP’s HANA and Oracle’s TimesTen, to Microsoft’s elastic cloud in Azure, to the bite-sized ‘tapas’ analytics on a pay-as-you-go basis from UNIT4, to bundled analytics that sit on top of your existing software, users have more choices and price points than ever to weigh before adopting an analytics solution.
Let’s look at some of these developments.
Role-Based, Ready-to-Use, On Demand
UNIT4 just launched this concept with thirty-five role-based, tapas-sized apps available on demand. Clever marketing phrases aside, the UNIT4 tapas have some important technology foundations that are worth analyzing:
- Event-driven — not merely after-the-fact reporting (think Crystal-type reports), these apps can answer day-to-day questions (the “how am I doing now?”) with analytics such as utilization, spend within budget, and so on.
- On Demand — Users can access the app store and order/activate what they need. You might use the apps off and on — so why pay forever? This is an interesting concept for analytics, because the enterprise landscape is littered with reports.1 Excel spreadsheets and user reports often are never used again — or are redundantly developed. An app store lets users keep track of what they bought, and what they do or do not actually use.
- Highly graphical UI to take advantage of new device platforms — pads, pods and phones.
- Strategic and Tactical — from big financial questions about profit and EBITDA to more tactical questions for a production manager (Figure 1).
Analytics are most often consumed within a role or function. Scorecards — like politics — are local. So the tapas serving size makes sense. This on-demand approach with prebuilt analytics is important: rather than launching a big evaluation and purchase exercise, the end-user community gets what it wants today from the offerings of UNIT4 Business Analytics.
UNIT4 is also addressing what our research indicates users want most — ‘real-time’ and predictive analytics designed for their immediate problems, not just after-the-fact reporting.
Memory Resident Database
Oracle and SAP are chasing BIG and fast with their in-memory products. The database product and the accompanying hardware are designed to run together.
SAP’s HANA, initially deployed for large enterprise database/warehouse applications, leverages HP’s strong history in memory resident architecture2 and its multi-decade investment in high performance chips. IBM has also created several configurations for HANA. Interestingly, SAP is working to make a subset of their analytics on HANA available to SMBs. HANA’s pricing is solely based on the size requirement for your in-memory analytics. Long term, HANA will be available for transaction-based applications. Exciting stuff that Hasso Plattner3 has talked about for ten years. (OK, it takes time for big companies to get on board.) But it is a transformative game changer for the enterprise, all the same.
Oracle’s in-memory database, TimesTen, and its Exalytics appliance are designed for big data analytics.4 Oracle is not leveraging this for the core ERP, and I would posit that, with database software sales so large and profitable, eating their own young (vs. others’ young) is not what Oracle had in mind when they purchased Sun.
IBM has its own strategies, such as free conversions from competitors and delayed payment/subscription pricing, for getting you committed to an all-IBM approach with its cloud version of DB2. I attended an IBM event at which they presented their cloud and DB2 strategy to provide high-performance service for less money than Oracle. They also spent considerable time attacking Oracle’s strategy and pricing.
These three behemoths are going straight for one another’s jugular, attacking each other by name and challenging each other’s strategies. If you like to watch hockey fights amongst the spectators at the game, you will also enjoy the Oracle vs. SAP battle about in-memory going on now. Oracle also has a compelling ad in which they show the Oracle database appliance — one box — stacked against a multi-layered Microsoft solution: database + server + storage + network route. It might be stretching it, but it does make a compelling image of the whole appliance approach (not just from Oracle). Entertainment aside, these battles will only make their solutions better for you over time.
SAP provided us a briefing on their analytics strategy for HANA and the SMB, which gave some good insight into HANA’s capabilities.5 And this won’t be a once-only decision: as your needs change, there is flexibility in pricing, up or down.
A new entry in the in-memory database market is Starcounter, an option for the .NET community to consider. This young company is looking to the software market (and early-adopter end-users) to utilize its database technology. Starcounter’s success will lie with software companies that have analytic and optimization-type solutions and could embed Starcounter’s database technology to torque up their offerings. Of interest to end-users: they do not have to buy a special appliance to run the Starcounter technology.
In-Memory Is a Game Changer
Having worked with memory resident solutions since the mid-nineties, I can attest to their value in situations that demand time reductions in multi-stage business processes, data processing, and big number-crunching problems.6 Today’s systems have only gotten better since then, and the idea of moving your transaction systems — not just your data analytics — to this environment is very appealing. One can solve big modeling problems: How will a new pricing strategy affect our profit? Will it impact other product lines? How will a storm on the East Coast impact our supply chain? What routes will we be able to use to get these needed, high-demand products to that market quickly? Risk, pricing, investing, routing, multistage planning and modeling, and assembling complex designs, to name a few, are all applications that benefit from memory resident approaches. Perhaps you’re an online customer (think Amazon) searching and pricing an overwhelming array of products at sizzling speed.
Today, modeling and optimization are managed as a highly orchestrated exercise used only for planning. But users would really like that capability all day, every day. Memory resident systems have consistently demonstrated their value in terms of cost reductions; yet they could do so much more as part of the transaction system.
In-memory databases also allow the user to simplify database design. The old model was to think through the database design up front, designing indices and creating all sorts of database administration tools so users could manage and quickly access their most frequently used queries. We love in-memory because it can reduce or eliminate some of this pre-thinking and database overhead. If you have to plan the queries and reports you might use so far in advance, it limits your ability to use all that data now for a new need, or to explore, discover and learn from the data.
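A rough way to see the appeal: with an in-memory store, you can load data and ask a brand-new question on the spot, with no pre-designed indices or pre-planned reports. The sketch below uses Python’s built-in sqlite3 module with a purely in-memory database as a small-scale analogy (the table and data are hypothetical, and products like HANA or TimesTen work at an entirely different scale and architecture):

```python
import sqlite3

# An in-memory database: nothing designed up front, no indices,
# no pre-planned reports.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, product TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("East", "Widget", 120.0),
     ("East", "Gadget", 80.0),
     ("West", "Widget", 200.0)],
)

# An ad-hoc question, asked the moment it occurs to you:
# which region spends the most on Widgets?
rows = conn.execute(
    "SELECT region, SUM(amount) FROM sales "
    "WHERE product = 'Widget' GROUP BY region ORDER BY 2 DESC"
).fetchall()
print(rows)  # [('West', 200.0), ('East', 120.0)]
```

The point is not the toy query but the workflow: the question was never anticipated, yet no DBA had to design an index or a report to answer it.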
In-memory is not just about speed. Well-designed analytics take advantage of 64-bit architecture to solve math problems more holistically, creating richer, more complete answers.
So I won’t pick a winner here — Oracle vs. SAP. Let’s just say the one who can reduce database overhead and fulfill my dream of truly converting the enterprise to real-time has my vote.
In the Cloud
Another option is to use elastic cloud services. You can get HANA in the cloud, but the early mover with this strategy was Microsoft, with Azure.7 Microsoft’s platform as a service can be independent of other technology decisions you may make.
A problem for the enterprise is how big can these databases get? The answer: No one knows. Users never give back processing power and software once they get it.
So a big selling point for Azure is the elastic nature of the service. Elasticity is a real value to the enterprise: less up-front planning, more support for ad hoc use, and no paying for what you don’t use. Azure has the benefit of fitting within the overall Microsoft enterprise strategy and can be used with most solution players (most have a .NET option). It has an easy bridge from your on-premises SQL servers. Many companies use SQL, so even without an accompanying enterprise software provider, users can take advantage of Azure SQL Database services in the cloud.
The analytics/BI purchase is topmost in every industry category we research. Alternative offerings abound from BI providers that have libraries and portfolios of reports and database design services. And many application providers have analytics built on top and included in a solution purchase. But there is a lot to ponder as you plan your strategy going forward.
Different approaches may be appropriate for your company’s topology — global vs. local; your technology portfolio — hardware and software vendor preference; and your plans to be a smarter, faster enterprise.
In this short article, we are not going to give you a big architectural rundown on all the pros and cons of these technologies. Suffice it to say that the decision about databases and analytics just got very interesting.
1 I once was given the task of evaluating a Fortune 100’s entire data warehouse and reporting strategy. Besides there being multiple data marts, each data mart had hundreds of users with literally hundreds of thousands of reports per data mart. Too many were redundant. The cost and confusion of running these was in the tens of millions of dollars.
2 First introduced on the Alpha chip from DEC back in the early nineties. Alpha quickly got applied to engineering, manufacturing, analytics and big modeling applications. You can learn about the HP solution with HANA here. IBM also has several hardware configurations for HANA, which you can read about here. Read about Intel, as well.
3 One of the founders of SAP.
4 As you know, Oracle bought Sun Microsystems a few years ago and has been learning to leverage this ‘exclusive’ hardware platform. You can read about TimesTen in-memory here.
5 You can also follow the drama on the SAP blog of Steve Lucas.
6 These approaches have been used by i2 (now JDA) since the early ’90s. Today, optimization engines from ToolsGroup and Logility, and pricing systems from Vistar and Revionics, are good examples of systems that have used these approaches since their inception.
7 You can read about Microsoft Windows SQL Azure database services here.
To view other articles from this issue of the brief, click here.