
As big data adoption rates grow, so too must enterprise storage and networks


More companies than ever are leveraging big data analytics to glean new insights into operations and customer behavior, but those initiatives will not succeed unless the business makes appropriate investments in enterprise storage and networking infrastructure design, according to John Chambers, chairman and chief executive of Cisco Systems.


In a survey of Fortune 1000 companies conducted last year for the Harvard Business Review, 85 percent of those polled said they either already have a big data initiative or will soon have one in place. Additionally, 80 percent believed that big data will affect many business functions, and 75 percent predicted that big data analytics will involve multiple internal departments. Although big data has the potential to affect just about every aspect of business, the enterprises polled expect their analytics efforts to especially improve their customer experience and decision-making processes.


However, despite the many benefits that big data promises, many organizations still struggle to use analytics effectively. In particular, the HBR study found that the majority of respondents reported difficulty accessing data and handling a variety of data streams.


“Recall that 80 percent of respondents agreed that big data initiatives would reach across multiple lines of business,” Paul Barth and Randy Bean, the authors of the report, wrote in an HBR blog post. “That reality bumps right up against the biggest data challenge respondents identified: ‘integrating a wider variety of data.’ This challenge appears to be more apparent to IT than to business executives. We’d guess that they’re more aware of how silo’d their companies really are, and that this is another reason that they judge the company’s capacity to transform itself using big data more harshly.”


Addressing big data concerns with improved enterprise storage and networking

One of the main barriers to big data analytics success is that legacy networks are not equipped to handle the volume of data that is now analyzed and stored. As a result, companies will likely have to upgrade their existing IT infrastructure design, adding processing power and capacity to their enterprise storage solutions in order to avoid the most common big data analytics pitfalls, according to Chambers.


In a recent interview with The New York Times, Chambers noted that legacy computing systems were built to process certain types of data in specified quantities. Thanks to the rise of big data, however, those parameters are no longer realistic or effective. Not only is the amount of data being processed on the rise – an estimated 2.5 quintillion bytes are now created every day – but so too is the variety of information being analyzed. Today’s tools are increasingly tasked with unstructured data analysis in addition to more traditional structured data.


Companies looking to adequately address the three V’s of big data – volume, velocity and variety – should turn to an IT consulting services firm like FlexITy. The managed services company not only partners with industry leaders like Cisco, but also works with organizations from the outset to map out a comprehensive adoption strategy that takes their distinct business needs into account and addresses them with the appropriate technological solutions.
