Why Hadoop in the Cloud Makes Sense

  • Karan Kirpalani
  • Jun 20, 2018
  • 3 min read

Despite lukewarm Hadoop adoption over the last three or four years, most Big Data and Distributed Computing experts still maintain that Hadoop is here to stay.

Why, you ask?

Since its introduction over a decade ago, Apache Hadoop has become the dominant big data processing framework, commercialized by leading big data vendors like Cloudera, MapR and Hortonworks. Enterprise-class cloud and big data analytics platforms such as IBM BigInsights and Informatica use their own proprietary distributions of Hadoop, and Microsoft Azure offers a managed cloud service for the Hortonworks Data Platform. The other major driver of Hadoop adoption is the rapid rise in real-world big data use cases across industries such as banking, transportation & logistics, e-commerce and healthcare. A growing need to process, store and manage large data sets is making it imperative for companies to install and run Hadoop clusters.

To do this, many organizations have, over the years, relied on on-premise installations of Hadoop to manage their big data processing needs, but are we about to witness a change in this trend?

We’re now seeing Hadoop functionality offered through cloud platforms as well. According to a March 2017 Forrester report, Hadoop on the cloud will gain prominence among enterprise architecture and business technology professionals, with “public cloud big data services” becoming a priority for 40% of executives. With the global uptake of IoT and user-generated content, the volume and variety of digital data (especially unstructured data) is expected to grow exponentially over the next few years. Of course, Hadoop in the Cloud has already made significant headway in the market: you’d be hard-pressed to find a leading CSP without a Hadoop offering. But does this mean that moving your Hadoop workloads into the Cloud makes more sense than maintaining them on-premise?

Essentially, yes, for three main reasons.

  1. Unless you’re the kind of organization that crunches a steady volume of data around the clock, being able to spawn Hadoop nodes on demand and pay only for the hours you use them makes more financial sense than investing in the hardware, software and manpower to run the same environment on-premise. The elastic pay-per-use model the Cloud provides is almost tailor-made for Hadoop in this regard.
  2. Then there’s the issue of manpower: Hadoop administrators are hard to find, and harder to retain, making it even more sensible for organizations to move these workloads into the Cloud and free themselves of hiring and retention headaches.
  3. Leading CSPs already provide strong cloud options for Hadoop implementations. Netmagic, for example, has a ‘SimpliHadoop’ service that offers end-to-end capabilities for running Hadoop clusters, on either public cloud infrastructure or as part of a dedicated, private cloud infrastructure. Such offerings today assure enterprise-class performance, availability and security. Given the advances in cloud security and data governance, most organizations holding large amounts of consumer information (such as retail, e-commerce and banking firms) can feel safe placing their data sets on virtual private clouds.
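The cost argument in point 1 comes down to simple break-even arithmetic. The sketch below uses purely illustrative, assumed figures (the cluster size, node-hour rate and on-premise annual cost are hypothetical, not quotes from any provider) to show why bursty workloads favour pay-per-use while round-the-clock workloads can still favour on-premise:

```python
# Hypothetical break-even sketch: transient cloud Hadoop cluster vs. an
# always-on on-premise cluster. All figures are illustrative assumptions.

NODES = 10                # assumed cluster size
CLOUD_RATE = 0.50         # assumed $/node-hour for a managed Hadoop node
ONPREM_ANNUAL = 30_000    # assumed yearly cost of hardware, licences, admins

def cloud_cost(hours_per_month: float) -> float:
    """Yearly cost of spawning the cluster only for the hours it is used."""
    return NODES * CLOUD_RATE * hours_per_month * 12

# A nightly 4-hour batch job (~120 cluster-hours per month):
burst = cloud_cost(hours_per_month=4 * 30)
print(f"cloud, bursty use: ${burst:,.0f}/yr")    # $7,200/yr, well under on-prem

# Round-the-clock processing (~720 cluster-hours per month):
steady = cloud_cost(hours_per_month=24 * 30)
print(f"cloud, 24x7 use:   ${steady:,.0f}/yr")   # $43,200/yr, over on-prem
```

Under these assumed prices, the cloud wins decisively for intermittent workloads, while a steadily loaded cluster crosses the break-even point and may be cheaper on-premise, which is exactly the caveat in point 1.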

At the same time, a service like Netmagic’s SimpliHadoop offers a greater degree of security and customization than a purely public-cloud Hadoop service, letting organizations leverage options such as private clouds and bare metal servers for highly customized Hadoop installations. Where security and latency concerns are high (e.g. customer-facing or healthcare applications), running your Hadoop stack on a dedicated hosted server is often the better fit. For data and applications where latency is not a concern, organizations may choose a Hadoop stack on public cloud infrastructure.

Ultimately, the decision to move big data and Hadoop to the cloud depends on each organization’s unique needs. Companies that have already made significant progress with cloud (public or private) will find the journey to migrate Hadoop simpler. For all organizations, it makes sense to work closely with cloud infrastructure vendors and Hadoop experts to find the best route to Hadoop-on-cloud adoption.

Karan Kirpalani

Vice President - Products & Services
Karan is the Vice President - Product Management at NTT-Netmagic.
Karan joined Netmagic in its nascent stage and has been part of the core team for nearly two decades, wearing many hats along the way, from business development to product management and product marketing.
Prior to Netmagic, Karan was associated with ICICI Bank as part of the online content and web application team.
Karan is a graduate in Economics from Mumbai University, and a guest lecturer at the Indian Institute of Technology Bombay, where he teaches engineering students about cloud and other IT infrastructure technologies.
Karan’s dedication and business insight make him a visionary and thought leader in the field. He is an avid reader, and has written articles, whitepapers and blogs on various technologies.
