Hadoop Adoption Just Got Way Easier
Predictions for Hadoop adoption among enterprises have been highly optimistic. One such prediction anticipated that Hadoop would achieve a 75 percent adoption rate among Fortune 2000 organizations by 2020. However, these predictions have had to be scaled back, because adoption rates, especially among large enterprises, have been unexpectedly slow. Why aren't businesses snapping up the great hope of big data, and more importantly, what is Hortonworks doing to turn the tide?
How Hadoop Has Been Historically Difficult
Setting up Hadoop clusters is nothing short of excruciatingly complex and frustrating. It requires a long series of steps, and the process is not always successful. Coding in MapReduce is another exercise in frustration entirely, as the sketch below illustrates. Even after the clusters are set up, managing system resources and troubleshooting workloads is time-consuming and tedious. Some jobs simply block other jobs, forcing the overseer to hunt down the one that is holding everything else up. Occasionally, users block each other's jobs intentionally, and the Hadoop overseer quickly has to become the workplace referee. And all of this is what happens when things with Hadoop are going right. Users identify HDFS, YARN, Hive, and HBase as the most difficult components of the lot.
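To make that frustration concrete, here is a minimal sketch of the canonical word-count job in raw MapReduce, closely following the standard example from the Hadoop documentation; the input and output paths come from the command line and are purely illustrative.

```java
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Mapper: emits (word, 1) for every token in every input line.
  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, ONE);
      }
    }
  }

  // Reducer: sums the emitted counts for each word.
  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {
    private final IntWritable result = new IntWritable();

    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    // Wire the mapper, combiner, and reducer into a job and submit it.
    Job job = Job.getInstance(new Configuration(), "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```

Roughly fifty lines of ceremony for what is conceptually a one-line aggregation, and this is the simplest job there is.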
How Hortonworks is Making Hadoop Easier to Adopt, Set Up, and Manage
For some time, third-party vendors have built and released tools to make Hadoop setup and use easier. For instance, when it comes to coding in MapReduce (which can easily provoke cursing and fuming even from experienced practitioners), products like Cascading help tremendously. Cascading is essentially a layer on top of MapReduce that makes coding many times easier, as the sketch below shows. Yet these products are a bit like putting bandages on a failing dam.
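For comparison, here is a sketch of the same word count expressed as a Cascading pipe assembly, modeled on Cascading's published word-count tutorial. The class names follow the Cascading 2.x API and may differ across versions; the paths are again illustrative.

```java
import java.util.Properties;

import cascading.flow.Flow;
import cascading.flow.hadoop.HadoopFlowConnector;
import cascading.operation.aggregator.Count;
import cascading.operation.regex.RegexGenerator;
import cascading.pipe.Each;
import cascading.pipe.Every;
import cascading.pipe.GroupBy;
import cascading.pipe.Pipe;
import cascading.property.AppProps;
import cascading.scheme.hadoop.TextLine;
import cascading.tap.SinkMode;
import cascading.tap.Tap;
import cascading.tap.hadoop.Hfs;
import cascading.tuple.Fields;

public class CascadingWordCount {
  public static void main(String[] args) {
    // Source and sink taps over HDFS text files.
    Tap source = new Hfs(new TextLine(new Fields("line")), args[0]);
    Tap sink = new Hfs(new TextLine(), args[1], SinkMode.REPLACE);

    // Split each line into words, group by word, count each group.
    Pipe assembly = new Pipe("wordcount");
    assembly = new Each(assembly, new Fields("line"),
        new RegexGenerator(new Fields("word"), "\\S+"));
    assembly = new GroupBy(assembly, new Fields("word"));
    assembly = new Every(assembly, new Count(new Fields("count")));

    Properties properties = new Properties();
    AppProps.setApplicationJarClass(properties, CascadingWordCount.class);
    Flow flow = new HadoopFlowConnector(properties)
        .connect("word-count", source, sink, assembly);
    flow.complete();
  }
}
```

The mapper and reducer boilerplate disappears behind a few declarative pipe operations, which is precisely Cascading's appeal.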
Faced with far slower adoption than expected (Gartner puts actual Hadoop adoption among enterprises at around ten percent), Hortonworks went to great lengths in HDP 2.3, the latest release of its Hortonworks Data Platform, to make it easier for businesses to adopt and administer Hadoop clusters. This version worked toward clarifying and streamlining the processes users find most difficult, particularly around HDFS, YARN, Hive, and HBase.
Additionally, Apache Atlas should help tremendously with data governance in Hadoop. Atlas makes data searchable and auditable, and it supports data masking so that organizations can stay compliant. Hadoop also tends to run more smoothly and considerably faster on bare-metal infrastructure such as the Full Metal Cloud, which can cut hours off database administration work.
Hortonworks’ Backup Plans if Hadoop Isn’t Widely Adopted by Enterprises
Will Hortonworks go under if Hadoop fails to take off as anticipated? Probably not. Many of the technologies it ships and supports (such as Apache Spark) work extremely well with Hadoop but have also matured into useful stand-alone products. Though the current business model develops these tools in conjunction with Hadoop, they could serve as an excellent backup plan for Hortonworks if Hadoop is surpassed by other big data solutions.