We can repeat "Mobile-Social-Cloud-Big Data" as fast as the next guy, but exactly what, and how much, those terms mean to an individual enterprise varies widely. Letting a thousand flowers bloom around the next big thing (or around anything) just leads to a field of weeds. Let's skip the hype and get to work.
If VentureBeat's list of startups in the Big Data space represents the state of the art, then nearly all of them are incremental products, or even just features, riding the Big Data wave. But a couple of them have the potential to be real innovators: metamarkets and ParAccel.
Both companies have constructed complete analytical stacks: data integration (certainly with limitations), in-memory datastores with analytic engines, and libraries of analytic functions. The practical consequences: eliminating the costly and painful movement of large datasets from data sources or data warehouses to analytic environments; performance advantages stemming from the in-memory datastores coresident with the analytical engines; and, not least, the potential for sharing analytically derived datasets and custom-built functions within and across analytical teams.
ParAccel has been around for a while and benefits from a level of maturity. But metamarkets' optimization of specialized functions, e.g., certain matrix decompositions, has the potential for real-world impact on hard problems such as detecting anomalies in very large datasets, of interest to law enforcement, to fraud detection, to risk management and to traders, just to name a few.
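To make the anomaly-detection point concrete: a minimal sketch, in Python with numpy, of one common decomposition-based approach (a truncated SVD, our illustration, not metamarkets' actual implementation). Most rows of a large data matrix often lie near a low-rank subspace; rows the low-rank model reconstructs poorly are anomaly candidates.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: 200 observations lying close to a rank-2 subspace,
# standing in for a large, mostly well-behaved dataset.
basis = rng.normal(size=(2, 30))
coeffs = rng.normal(size=(200, 2))
X = coeffs @ basis + 0.05 * rng.normal(size=(200, 30))

# Inject one anomaly that breaks the low-rank pattern.
X[123] += rng.normal(scale=3.0, size=30)

# Truncated SVD: reconstruct the data from the top-k singular components.
k = 2
U, s, Vt = np.linalg.svd(X, full_matrices=False)
X_hat = U[:, :k] @ np.diag(s[:k]) @ Vt[:k]

# Rows poorly explained by the low-rank model are flagged as anomalies.
residual = np.linalg.norm(X - X_hat, axis=1)
flagged = np.argsort(residual)[::-1][:3]
print(flagged)  # the injected row (123) should top the list
```

The same idea scales well beyond this toy: at real data volumes, the win metamarkets is chasing is doing such decompositions fast, in memory, next to the data, rather than shipping matrices to a separate analytic environment.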
The volume of data, the heterogeneity of data, and the number of sources of data all combine into a staggering problem. That is true for data internal to an enterprise as well as for data that is external (and the distinction between the two is fading). Traditional solutions of integrating, warehousing and governing will not work. Decide in which domains "perfect" data matching is an imperative. Everywhere else, start to apply new methods for looking at data across disparate sources.
Organizations need to identify exactly what they are looking for and then determine whether such a person even exists. Often, the practical choice is to identify the required skills and assemble a small team of competent people who, together, cover them.
Big thinkers think big thoughts and then hand them off to a junior "analyst" and hope for the best. It doesn't work and it's the single largest point of failure for projects, whether it's a new ERP system (heaven help us) or your basement remodel. Business Requirements establish the language that links intent and outcome. If that language is ambiguous, the link is broken and the project will not deliver on the intent.
We love glib nonsense at genus2. But declaring that SOA is the foundation or that data governance is the key doesn't advance the business. To be effective, the enterprise architecture function must first understand the business, its processes and its plans - profoundly. It must also be pragmatic and recognize the infrastructure, the applications, the services and the data constructs that are already in place. Only when those pieces are in place can it begin the necessarily flexible process of putting together the rest of the puzzle. And dogma is unaffordable.
We didn't mean to kill it. We just neglected it - we were busy. Every older enterprise worries about its legacy systems and all that 30-year-old COBOL, VSAM and Assembler. In 15 years we'll still be worrying about all that 45-year-old COBOL, but our real nightmares will be the layers upon layers of Services, APIs, ETL, Java, Visual C#, etc. we are leaving behind now. Architect, Design, Document. Replace what you must. Build for backward compatibility and think about forward compatibility.
We can safely assume that everything (and not just "things") will be connected to everything else: every object, every person, every device, every piece of information, every event.
Copyright 2008-2013 genus2 Technology LLC. All rights reserved.