Don’t let big data be a big drain on resources
Big data and virtualisation are two of the dominant technology trends of the past few years. But can they work together effectively? Big data requires immense computing power to manage and process effectively, and whilst virtualisation offers flexibility, it lacks the raw performance to truly unlock big data’s potential. How many resources could a business waste trying to process big data in a virtual environment?
The race for big data ROI
Delivering ROI is the holy grail for most technologies, and big data is no exception. But is virtualisation killing big data’s chance to deliver that ROI?
Big data has been one of the most hyped terms in the tech industry over the past five years. Organisations are keen to extract value from their big data and use it to make insightful, informed decisions about their business. The encouraging news is that some early adopters are already starting to do this.
A recent survey by AIIM, the information management professional body, revealed that 60 per cent of big data early adopters considered their ROI to be good. But what demands does big data make on your business? ROI is not just about the gains a technology delivers; those gains must be balanced against the resources used to achieve them.
Big data challenges
The AIIM survey identified a number of reasons why firms have yet to see big data ROI, a skills shortage chief among them. Security is another adoption challenge, whilst the rise of ‘dark data’ (unstructured content that still holds value) was cited by 61 per cent of respondents.
What the research did not cover, however, was the impact of running big data applications in a virtual environment. When we launched Bigstep earlier this year, we conducted extensive research into just how power hungry applications such as Hadoop can be. They offer powerful analytic capabilities but can be a drain on computing power.
So we removed the hypervisor, making our infrastructure entirely bare metal. This has a major impact on speed and performance, key attributes when looking to extract ROI from big data.
A broad array of industry opinion formers have also looked at the impact of a hypervisor on computing power. IBM tested a number of virtual environments and found that they were ‘still unable to approach bare-metal performance’, whilst Nati Shalom, CTO and founder of GigaSpaces, has written that ‘it would be fair to assume that running I/O intensive workloads such as Big Data on a virtualised infrastructure would require 3X more resources than its Baremetal equivalent’.
Three times more resources is a dramatic difference, and it raises the question: why would you not use bare metal for big data applications? Delivering ROI on big data will be one of the key challenges for businesses in 2014, and a virtual environment is a major barrier to achieving it.
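To make the 3X figure concrete, here is a minimal back-of-the-envelope sizing sketch. The workload and per-node figures below are purely hypothetical illustrations, not numbers from the AIIM survey or the GigaSpaces post; only the 3.0 overhead multiplier comes from the estimate quoted above.

```python
import math

def nodes_needed(workload_units: float, units_per_node: float,
                 overhead: float = 1.0) -> int:
    """Return the number of nodes required for a workload, after applying
    a resource-overhead multiplier (1.0 = bare metal, 3.0 = the
    virtualised estimate quoted above)."""
    return math.ceil(workload_units * overhead / units_per_node)

# Hypothetical I/O-intensive job: 1,000 units of work, 50 units per node.
bare_metal = nodes_needed(1000, units_per_node=50)                  # 20 nodes
virtualised = nodes_needed(1000, units_per_node=50, overhead=3.0)   # 60 nodes
print(bare_metal, virtualised)
```

Under these assumptions the same job needs 60 virtualised nodes instead of 20 bare-metal ones: the same hardware budget processes a third of the work.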