Using big data for real-time insight
We recently blogged about some of the different practical uses an organisation can find for big data. But many of the best use cases also have an additional element – real-time analytics.
The actionable insight that organisations derive from their big data is a powerful proposition, but increasingly executives want that insight in real time. What steps do organisations need to take, in terms of infrastructure, to extract meaning from their big data in real time?
Big data in 2014
Industry analyst organisation IDC recently held a webinar, IDC Big Data Predictions 2014: Beyond Irrational Exuberance - Opportunities in the Big Data and Analytics Markets. It predicted that the market for big data will reach $16.1 billion in 2014, growing six times faster than the overall IT market.
Astonishing figures that show big data is coming of age. But given the scale of the market, how can organisations unlock insight from their big data? Its true value emerges when insight can be derived in real time, and that requires an infrastructure fast enough to manage and analyse large volumes of data. An infrastructure that fast and powerful has to use bare metal.
Bare metal for real-time big data
The problem with deploying big data applications is that they are extremely power-hungry. That’s why we aim to reach the greatest possible speed and performance for our customers, and we have achieved this by removing the hypervisor to utilise the full power of bare metal.
A variety of impartial industry benchmarks have shown that even the very best hypervisors waste a minimum of 20 per cent of the bare metal power of servers. The actual figure is almost certainly significantly higher than 20 per cent, which, when you are crunching terabytes of data in real time, is a major drop in power.
In addition to speed, a real-time big data infrastructure needs to be scalable. If an organisation suddenly gets an overwhelming torrent of data – perhaps a social media campaign that sees millions of tweets sent at once – it needs to be able to scale up its infrastructure fast. Yet such spikes might not happen all the time, and when they do, an organisation may need that amount of computing power for only a day or even less.
That’s why we provide our big data infrastructure with all the flexibility you would expect from a cloud provider. Customers are able to use our services on a pay-per-hour basis, paying only for the processing power they actually need and use.
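The economics of that pay-per-hour model can be sketched with some simple arithmetic. The rates below are purely hypothetical (they are not Bigstep's actual pricing); the point is the gap between paying for a short burst and keeping the same capacity provisioned all month.

```python
# Illustrative comparison of hourly "burst" pricing versus an always-on
# fixed monthly server for a short-lived spike in demand.
# All rates are hypothetical, for illustration only.

def burst_cost(hourly_rate: float, hours: int, instances: int) -> float:
    """Cost of renting extra capacity just for the duration of the spike."""
    return hourly_rate * hours * instances

def fixed_cost(monthly_rate: float, instances: int) -> float:
    """Cost of keeping that same capacity provisioned for a whole month."""
    return monthly_rate * instances

# A one-day campaign needing 10 extra servers at a hypothetical $1.50/hour
spike = burst_cost(1.50, 24, 10)    # 360.0
always_on = fixed_cost(500.0, 10)   # 5000.0
print(f"burst: ${spike:.2f}, always-on: ${always_on:.2f}")
```

Under these assumed rates, paying only for the hours actually used costs a fraction of provisioning for the peak year-round, which is the elasticity argument in a nutshell.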
Social media big data
Social media is undoubtedly driving much of the real-time big data explosion. Twitter recently announced some of its 2013 stats: for an event such as Andy Murray winning Wimbledon, around 120,000 tweets per minute were sent.
If an organisation saw even a fraction of those tweets about itself, wanted to analyse the data and take action based on the findings, it would require an awful lot of computing power. The instant nature of Twitter means that any actionable insight would need to be played out in real-time and this needs an infrastructure ready and capable of handling real-time analytics.