May 12, 2014

Every business today is a Big Data business, whether it realizes it or not. Businesses have access to a wealth of data from disparate sources, both proprietary and third-party. This offers a unique opportunity: mining these rich repositories together can yield insights and intelligence with a profound impact on business growth.

But several challenges make most businesses shy away from leveraging Big Data, the most prominent being technology and usability. Big Data technologies like Hadoop have a steep learning curve, even for IT teams, and they push analysis further away from business users and even analysts. There is a reason for this: the power of these systems lies in the ability to write elaborate code and analyze data in an unbridled manner, unlike the restrictive relational data warehouses of previous generations. Hadoop provides easy storage for varied data, since it requires no fixed schema or complex maintenance and can scale out easily. But this comes at a cost: these systems offer only batch queries, making real-time responses to ad-hoc questions difficult. And because they are not natively relational, the kind of relational analysis across different types of data that was possible in traditional data warehouses becomes difficult.
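To make the batch-query limitation concrete, here is a minimal, purely illustrative sketch in the spirit of MapReduce (the programming model behind classic Hadoop jobs). The data and field names are invented for the example; the point is that every new question over raw data requires another full pass like this one, which is why ad-hoc, real-time queries are hard on batch-first systems.

```python
from collections import defaultdict

# Toy batch job in the MapReduce style: one full pass over the raw
# records to answer one fixed question ("total sales per region").
# A different question means writing and running a different job.

def map_phase(records):
    # Emit (key, value) pairs from each raw record.
    for rec in records:
        yield rec["region"], rec["sales"]

def reduce_phase(pairs):
    # Aggregate all values sharing a key.
    totals = defaultdict(int)
    for key, value in pairs:
        totals[key] += value
    return dict(totals)

raw_logs = [
    {"region": "east", "sales": 120},
    {"region": "west", "sales": 80},
    {"region": "east", "sales": 40},
]

totals = reduce_phase(map_phase(raw_logs))
print(totals)  # {'east': 160, 'west': 80}
```

A relational warehouse, by contrast, would answer the same question with one indexed SQL query and could join the result against other tables on the fly, which is exactly the capability the paragraph above notes is hard to recover on non-relational batch systems.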

Big Data also makes the traditional manual approach to analysis extremely difficult, in which professional services teams create and manage data warehouses and set up analysis cubes based on a comprehensive list of requirements for what the user wants to consume and how. Because Big Data enables such vast analysis, a restrictive, canned setup built by hand limits its power.
Until now, business data analysis has been like finding information on the Web in the early 1990s, the pre-Google era, when all we had were manually curated Web directories like Yahoo!'s. Search was limited to information already listed in those canned directories. But as the Web exploded, it became clear that human editors could not keep pace with the rate of information growth, leaving only a small fraction of the Web discoverable through dated directories.

What’s next in Big Data?
That’s where Google came in. It replaced those manual directories with algorithms that helped people discover information, regardless of where that information resided or when it appeared. Google was the first company to make information discovery truly usable, and it has been instrumental in bringing natural language search to the forefront in recent years.

Big Data discovery requires a similar machine-first approach. With history as our guide, the coming months and years will bring innovations in which algorithms replace manual effort to enable discovery across the growing web of enterprise data. Natural language question-answering systems put an easy, natural interface on top of sophisticated algorithmic platforms that automatically understand data residing across disparate sources, and the relationships among them; this is what will drive Big Data discovery into its true era of success.

This is even more important now that Big Data has grown beyond a buzzword into an initiative companies are investing heavily to tackle successfully. Manual approaches to Big Data modeling and warehousing cannot scale to the volume, velocity and variety of data being produced today.

For Big Data to become truly pervasive, organizations need a system that automates big data management and helps them reach faster, smarter, operational decisions based on all the data a business can access, in the most natural way, without learning complex software or spending months and money on setup. This is achievable only with Smart Machines.

The role played by smart machines
Smart Machines employ cognitive computing, a combination of machine learning, artificial intelligence, natural language processing and statistical algorithms that lets a machine think, act and understand like a human. We have seen Smart Machines and cognitive computing at their best with IBM Watson, which won the popular American TV quiz show Jeopardy!, defeating the show's most successful human champions. While Watson is an information and knowledge processing Smart Machine, Smart Machines for data computation are now on the rise.

A few companies are pioneering Smart Machines for Big Data discovery. These are machine-first systems that can automatically and dynamically identify entities and relationships across disparate data sources, model data into data lakes from the vast ocean of Big Data, and provide a natural language interface that lets people ask questions of their data in real time. This removes the technology and usability barriers that Big Data otherwise poses and makes adoption easy and seamless.
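The pipeline just described can be sketched at toy scale: once a system has modeled the entities in the data (here, "sales" and "region"), a plain-English question can be translated into a structured query without any manual warehouse setup. Everything below is a hypothetical illustration; the dataset, keywords and function names are invented for the example and do not describe any vendor's actual product.

```python
import re

# Pre-modeled data, standing in for entities a Smart Machine would
# have discovered automatically across disparate sources.
DATA = [
    {"region": "east", "sales": 160},
    {"region": "west", "sales": 80},
]

def answer(question):
    """Map a narrow class of English questions onto the modeled data."""
    q = question.lower()
    if "total sales" in q:
        match = re.search(r"\bin (\w+)", q)
        if match:  # question scoped to a single region
            region = match.group(1)
            return sum(r["sales"] for r in DATA if r["region"] == region)
        return sum(r["sales"] for r in DATA)  # company-wide total
    raise ValueError("question not understood")

print(answer("What were total sales in east?"))  # 160
print(answer("What were total sales?"))          # 240
```

A real system would of course use statistical language understanding rather than keyword matching, but the shape is the same: natural language in, structured query over automatically modeled data out.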
For C-level executives looking to justify an investment in Big Data, Smart Machines offer the ultimate solution. They cater to the precise requirements of a C-suite that wants real-time benefits and the added value of context around the original business questions. Algorithms from the front end to the back end, in place of manual setups, will yield big results for companies of all sizes; they can finally expect the returns that traditional BI vendors have long promised.

The new paradigm in data analysis
For decades now, the business world has struggled to analyze its data. Companies have relied on humans to painstakingly understand their data structures and analytics needs and to manually build out data warehouses, canned reports and dashboards. These time-intensive systems were not only expensive but often hurt the business: by the time IT could dive deeper into the analytics platform, the business needs had moved on and a new initiative was under way. This also made data intelligence solutions cost-prohibitive for small and medium businesses that lack the budget and resources for professional services.

Intelligent machines with sophisticated algorithms are now rising to save the business world millions of dollars and countless hours spent building data warehouse models that will likely soon be obsolete. Natural language search, driven by intelligent virtual agents outfitted with sophisticated algorithms, is the future of data discovery and intelligence in 2014, and removing manual setups is the next logical step. In today’s fast-paced world, time is more crucial than ever; organizations need a system that can draw inferences and conclusions for them. Before you hire that next data scientist, remember that making better business decisions and judgments from instant data intelligence is what matters, and algorithms provide that capability. Buckle up, because the rise of the Smart Machine is upon us and will change the way we view data discovery and interact with Big Data in big ways.
