What is big data? The three Vs: volume, velocity and variety.
What do dedicated servers bring to big data?
Big data tends to refer to the discipline of storing, processing and analysing huge business data sets in real time. It is not just the size of the data source that makes it difficult to address, but also the lack of structure and the speed at which it needs to be processed. This is why big data is typically defined by the three Vs: volume, velocity and variety.
In reality, some form of big data has been practiced for decades, as part of standard business processes, such as data analysis, web analytics, visualisation and information management. However, traditional hardware, software and analysis tools have been unable to tackle the sheer size, complexity and unstructured nature of contemporary data. In contrast, big data typically uses more powerful servers, in conjunction with advanced algorithms, storage and data analytics tools, harnessing the full potential of organisations’ digital data.
For example, big data usually involves NoSQL or NewSQL technology and a distributed architecture to analyse unstructured data from multiple sources (sometimes pooled in a data lake). Similarly, the open-source Apache Hadoop framework – which pairs distributed storage (HDFS) with parallel processing across a cluster – is the number-one technology associated with big data analysis.
Why use dedicated big data servers as opposed to a cloud solution?
Think about what is suited to your business. Start-ups and developers who are experimenting with big data analytics may benefit from a cost-effective, entry-level Public Cloud solution, whereas for enterprise businesses, a dedicated big data server configuration or a hybrid cloud approach might win out.
OVHcloud’s competitively priced HG servers offer a strong price/performance ratio, meaning that if you are implementing a significant big data project or application involving many terabytes of raw data, it could be much cheaper to deploy HG servers than Public Cloud. You also benefit from an unrivalled volume of storage space, and there are no noisy neighbours or hypervisors to wrestle with, as there can be with a virtualised option.
Tip 1. Consider a datacentre extension
Is your on-premises infrastructure running out of space to store your big data? Use OVHcloud Connect or a site-to-site VPN to deposit your data securely in OVHcloud datacentres.
Tip 2. Create redundancy with Hadoop
Operating in clusters, the Hadoop Distributed File System (HDFS) replicates each data block across several nodes and sustains a high rate of data transfer between them. This allows the system to keep working uninterrupted, even if a single node fails.
Build Hadoop clusters using OVHcloud bare-metal servers, then deploy and interconnect multiple Hadoop data nodes using OVHcloud's private vRack. Consider some of the various tools and projects in the Hadoop ecosystem – such as Apache Spark, Kettle, Apache Oozie or Mawazo – to simplify your information management and business analytics processes.
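The redundancy described above comes from HDFS block replication, which can be inspected and tuned from the command line. A sketch, assuming a running Hadoop cluster with the `hdfs` client on the path (the `/datalake` directory and file name are hypothetical):

```shell
# Sketch: load a file into HDFS and control its redundancy
# (assumes a running Hadoop cluster; paths are illustrative only).
hdfs dfs -mkdir -p /datalake/raw
hdfs dfs -put events.json /datalake/raw/

# Set the replication factor to 3 copies and wait for re-replication,
# so the data survives the loss of any single data node
hdfs dfs -setrep -w 3 /datalake/raw/events.json

# Inspect which data nodes hold each block of the file
hdfs fsck /datalake/raw/events.json -files -blocks -locations
```

With data nodes interconnected over the vRack, re-replication traffic after a node failure stays on the private network rather than the public interface.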
Tip 3. Experiment with your big data architecture
The OVHcloud dedicated server range comes with the tools, options and very affordable models you need to experiment with different big data solutions: scale up if a project succeeds, and shut servers down quickly once it is complete.
Take advantage of our short-term contracts, a user-friendly Control Panel with numerous visual analytics tools, 120-second delivery and post-installation scripts.