
Oh No, Your Big Data Project is a Success!

12/2/2013
‘Be careful what you wish for.’ We’ve all heard that cautionary advice at one time or another. Sometimes it was merited; sometimes not. With big data initiatives, truer words were never spoken.
 
What’s a frequent big data wish for organizations? That their initial foray will be so successful management will want to implement a large-scale big data initiative enterprise-wide. After all, when a limited-scope project enables such keen insights about customer behavior and organizational processes, imagine what can be accomplished if you unlock the value inside the massive, ever-growing troves of information residing within and beyond your business’ data repositories. 
 
But beware, that initial success can breed a host of challenges.
 
Consider this example. A large retailer decides to evaluate on-time inventory shipment delivery to figure out which routes and carriers are most effective so it can minimize stock outages and increase revenue. Its IT department captures container delivery data for the past 18 months and uses an analytics application to review 25 gigabytes of ocean, rail and truck data for 300,000 shipments. After hearing how a big data approach helped optimize its supply chain, management wants to apply the technology to other aspects of the business.
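In practice, that kind of carrier-and-route analysis can start out as simple grouping and averaging. The sketch below is purely illustrative: the article doesn’t describe the retailer’s actual tooling or schema, so it assumes a hypothetical shipments.csv extract with carrier, route and promised/delivered dates.

```python
# Illustrative sketch only: assumes a hypothetical shipments.csv with one row
# per shipment and columns: carrier, mode (ocean/rail/truck), route,
# promised_date, delivered_date.
import pandas as pd

shipments = pd.read_csv(
    "shipments.csv", parse_dates=["promised_date", "delivered_date"]
)

# A shipment counts as on time if it arrived on or before the promised date.
shipments["on_time"] = shipments["delivered_date"] <= shipments["promised_date"]

# On-time rate and shipment volume by carrier and route.
ranking = (
    shipments.groupby(["carrier", "route"])
    .agg(on_time_rate=("on_time", "mean"), shipments=("on_time", "size"))
    .sort_values("on_time_rate", ascending=False)
)

print(ranking.head(10))
```

At this scale, a single analyst with a laptop can answer the question. The trouble starts when the question set, the data sources and the retention window all grow at once.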
 
Before we look at how IT might tackle this, let’s step back and explore three key variables affecting big data project requirements: use cases, data sources and data retention periods. If any of these grows significantly in scope, you can find yourself scrambling to expand your systems. And if all of them ramp up quickly, as in the case of this retailer, the challenges can be overwhelming:
  1. Use cases: The retailer wants to conduct additional analyses, including booking and invoicing, and price/performance by commodity type and shipping location. It also wants to measure the effect of weather and predict the impact of labor disputes.
  2. Data sources: To handle these use cases, the number and complexity of data sources expand to include new structured and unstructured sources: dramatically more shipment operations data, as well as historical weather data, news feeds and LinkedIn group information for sentiment analysis.
  3. Data retention period: To do long-term trend analyses, IT must keep the data for at least five years.
Now, instead of 25 GB, IT needs to manage a 25-terabyte environment. And when the retailer’s marketing and e-commerce groups decide that they, too, can benefit from big data applications and that they want them implemented by next quarter, the requirements explode.
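To see how the jump from gigabytes to terabytes happens, it helps to treat retention, sources and use cases as rough multipliers. The arithmetic below is a back-of-envelope sketch only; the individual factors are illustrative assumptions, not figures from the project, though they compound to roughly the 25 TB footprint the retailer ends up managing.

```python
# Rough, illustrative arithmetic only. The multipliers are assumptions chosen
# to show how modest-looking growth factors compound from the original 25 GB
# dataset toward a multi-terabyte footprint.
initial_gb = 25             # original 18-month, single-use-case dataset
retention_factor = 60 / 18  # ~5 years of history instead of 18 months
source_factor = 20          # assumed: richer shipment, weather, news and social data
use_case_factor = 15        # assumed: copies, indexes and derived data for new analyses

projected_gb = initial_gb * retention_factor * source_factor * use_case_factor
print(f"Projected footprint: ~{projected_gb / 1024:.1f} TB")  # roughly 24 TB
```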
 
You may think companies like this retailer can scale internal big data environments to accommodate rapidly evolving requirements, but consider what you have to deal with:
  • Capital investment and data center constraints: Building out a big data infrastructure – with the compute, storage, security and bandwidth to handle ever-growing volumes of data – is expensive. Most companies struggle with data center capacity, and either lack the budget to create it or would rather invest in other strategic initiatives.
  • Timing: In an ideal world, IT would have all the time it needs to build an infrastructure. But in this economy, every day you’re not applying big data practices is a day you’re losing ground to competitors who are. Speed is of the essence.
  • Skills shortage: Finding big data professionals to handle your growing big data initiative is far from easy. According to McKinsey & Co., demand for those professionals in the U.S. alone will exceed the available supply by 140,000 to 190,000 positions by 2018.¹ If you’re lucky enough to locate these experts in time, you may not be able to afford to hire them.
  • Focus on technical details vs. higher-level objectives:  Managing an internal environment requires a tremendous amount of time and effort. You run the risk of diverting resources from strategic objectives to day-to-day infrastructure maintenance.
Organizations that embark on big data projects without a game plan for dealing with these challenges are likely to see their initiatives fail — or at least stall long enough that their impact is compromised. 
 
Many businesses are finding the best way to prepare for success is to partner with companies like Savvis, which offer fully hosted and managed big data environments in a managed services model. This allows you to quickly implement your big data project on a secure, reliable infrastructure that scales cost-effectively with your needs, while you benefit from big data experts who can guide you in developing successful strategies. Bottom line: you’ll get maximum value out of ever-growing data sources, accelerated time-to-results and the flexibility to be as innovative and nimble as you need to stay ahead of the competition.
 
¹ Big Data Analysts in Big Demand
 
 

ABOUT THE AUTHOR
Milan Vaclavik is Sr. Director and Solution Lead for Savvis’ big data practice. For more than 20 years, he has been bringing innovative software solutions to market in a variety of industries, including enterprise messaging and collaboration, digital rights management, document automation, supply chain management and physical security.  He has held senior product management, marketing and business development positions with startup software firms, as well as larger organizations such as Lotus Development/IBM, GE and LexisNexis. Milan holds a bachelor’s degree in Regional Science from the University of Pennsylvania and an MBA in Finance and Management of the Organization from Columbia Business School.