According to a 2015 survey on big data adoption, only 25% of Hadoop users at the time worked with a Hadoop distribution vendor. Why is the number so low? It's changing. Most early adopters experimented with Hadoop internally in pilot programs. Now that more companies are ready for full-scale deployments, many are realizing they lack the internal skills and expertise to make the transition to production, and they have begun looking for a Hadoop partner.
Fast forward to 2017, and according to interviews Datanami conducted with leaders in big data, Hadoop is far from dead.
If I were in that 75% looking for a partner to help me with Hadoop, there are a number of key questions I'd ask to vet whether a potential Hadoop partner could meet my requirements. Specifically, I would want to understand whether they could help me determine the best big data solutions to support my company's business strategy, develop a sound data management strategy, and build a solid implementation plan.
Here are the top eight questions I would ask when talking with a Hadoop partner.
- Tell me what’s new in the industry today and where you think things are trending. A good partner should have a strong perspective on what’s new and exciting in the big data ecosystem. They should have a complete view and understanding of the whole gamut of technologies, tools, and platforms available now, and of where the technology is headed in the next six to twelve months. There’s a lot of “noise” out there, and a partner who can help you see through that noise is valuable. They should be able to give you insights and information to consider.
- Do you have breadth of experience? You want to make sure your partner has experience with different Hadoop distributions and isn’t committed to just one that might not be suited to your needs.
- Do you have depth of experience? In addition to being experienced with different types of deployments, you want to make sure your partner has experience with successful, longer-term, full-scale production deployments. They should be able to give you examples that go beyond trials and proofs of concept, and talk about building full-scale production use cases that have delivered business value.
- How do you act as a strategic partner? A good partner should be able to guide you in thinking about what core areas to consider when developing your Hadoop strategy and how to demonstrate ROI.
- What is your methodology? Your partner should be able to clearly describe how they would architect a solution and map that solution architecture to your business use cases. What are the processes they follow to determine the right design and architecture?
- Are you results-oriented? You want to make sure your partner understands the business opportunity that justifies your Hadoop investment. They should be able to describe how they will help you measure and demonstrate business outcomes for each use case.
- Describe the customer experience at your company. Ideally, your partner should assign you a dedicated project manager who can help manage the deliverables and be on site if needed. You want a responsive support contact who will be available when you need them.
- Are you connected in the industry? A good partner will have the strategic relationships with technology vendors to support all of your data needs and integrate Hadoop with your existing technologies.
Applications of Hadoop are still evolving, and companies are forecasted to spend nearly $800 million on Hadoop and Hadoop-related services in 2017, according to a report by Forrester. That makes it increasingly critical for IT teams to begin investigating third-party Hadoop partners now. Spend a good amount of time talking to experts in the space (particularly important for larger companies, which will likely need an end-to-end big data solution) so that you get the information you need to make the right decisions.
I invite you to learn more about the Zaloni Data Platform, built specifically for production-level Hadoop data management. If you’d like to dive a little deeper, check out Hadoop Workflow Management and Metadata Management in Hadoop.