1, 2, 3… How do I operate all this?
Your business has decided to adopt big data. You’ve developed the business strategy, assessed the risks and impacts, determined what your first project will be and selected your big data platform. Now you need to choose the operating platform.
Should I install the platform in my company’s datacentre? Does my company have the capacity to host this kind of infrastructure? Is my current infrastructure secure enough? Or will it be compromised? What are the risks of going with an external infrastructure provider like Amazon, Azure or IBM? Should I choose managed services or just infrastructure that my team can manage on their own? Is a hybrid approach that combines local and cloud computing the best option for my company, given the flexibility we need from our infrastructure?
Remember that big data environments need to be high-performance, agile and efficient in how they use hardware resources.
So, what’s the best solution?
As we mentioned in the previous article, a big data environment breaks with established practices. Since your infrastructure needs flexible hardware resources and since big data tools evolve so quickly, there's a strong argument for externalizing your environment to the cloud: the supplier assumes the technological risk at competitive prices.
But just how flexible are these arrangements? Are these environments complete? How often are they updated? Even if they’re good as production environments, shouldn’t development and test environments (which require fewer resources but are more flexible) be kept internal? If I opt for a cloud computing solution, will my telecom costs go through the roof? We have terabytes of data in our current data warehouses, but how do we transfer it to a supplier?
Anyone who claims they can answer your questions without carefully analyzing your situation first will almost certainly lead you to failure.
To figure out the best operating solution for you and your needs, you need to analyze different alternatives, assess the risks and choose the scenario that will minimize risk and help you achieve your goals at the lowest cost.
Big data maturity model
NOVIPRO offers a big data maturity model that’s combined with a complete risk analysis (explained in Article 3 of this series) to help you resolve any confidence issues and eliminate any negative factors that could jeopardize your big data strategy.
This big data maturity model determines the type of architecture you need based on the type of data you want to process. The volume, variety and velocity of your target data will determine whether you need a batch or in-memory processing infrastructure, intensive storage or intensive computing, and so on. Of course, your strategy may call for one type of architecture initially, only to demand another type later.
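To make the idea concrete, here is a minimal sketch of how the "three Vs" of your target data might map to a coarse architecture profile. The thresholds, categories and function name are purely illustrative assumptions, not part of any published maturity model:

```python
# Hypothetical sketch: mapping volume, variety and velocity to a rough
# architecture recommendation. All thresholds and labels are
# illustrative assumptions, not a real assessment tool.

def recommend_architecture(volume_tb, variety, velocity):
    """Return a rough architecture profile for a big data workload.

    volume_tb: estimated data volume in terabytes
    variety:   "structured", "semi-structured" or "unstructured"
    velocity:  "batch" (periodic loads) or "streaming" (continuous feeds)
    """
    # Continuous feeds favour in-memory processing; periodic loads favour batch.
    processing = "in-memory" if velocity == "streaming" else "batch"

    # Large, loosely structured datasets pull toward storage-heavy designs;
    # smaller or structured workloads pull toward compute-heavy ones.
    if volume_tb > 100 and variety != "structured":
        resource_focus = "intensive storage"
    else:
        resource_focus = "intensive computing"

    return {"processing": processing, "resource_focus": resource_focus}

print(recommend_architecture(500, "unstructured", "batch"))
```

A real assessment would weigh many more factors (security, telecom costs, data-transfer volumes), but even a toy rule like this shows why one project's architecture may not fit the next.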
How do you scale your big data environment to meet new needs? That’s one of the benefits of the big data maturity model developed by NOVIPRO.
Now it’s time to verify the big data architecture you’ve chosen with bench testing. The tests can be implemented quickly using NOVIPRO’s E-SPACE service, which offers a cloud-computing business platform for projects that need high-performance, agile and effective infrastructure. E-SPACE allows you to run big data bench tests without putting your existing infrastructure at risk.
We hope this has helped you identify the options for making the best decisions about operating your big data solution.
Read the next article in our Big Data series: From business need to roll-out: a summary of our series.