Big data on high-performance dedicated servers


What is big data? The three Vs: volume, velocity and variety.

Our recommendation

OVHcloud Deals

Infra-4

From $243.10 /month
Intel 2x Xeon Silver 4214
24 cores / 48 threads - 2.2 GHz
96 GB DDR4 ECC
2×960 GB SSD NVMe
Public bandwidth: 1 Gbps

mHG-2019

From $252.99 /month
Intel Xeon Silver 4110
8 cores / 16 threads - 2.1 GHz
96 GB DDR4 ECC
2×6 TB HDD SAS
Public bandwidth: 1 Gbps

HG-2019

From $382.99 /month
Intel Xeon Gold 6132
14 cores / 28 threads - 2.6 GHz
96 GB DDR4 ECC
2×6 TB HDD SAS
Public bandwidth: 1 Gbps

BHG-2019

From $676.99 /month
Intel 2x Xeon Gold 6132
28 cores / 56 threads - 2.6 GHz
192 GB DDR4 ECC
2×6 TB HDD SAS
Public bandwidth: 1 Gbps

What do dedicated servers bring to big data?

Big data refers to the discipline of storing, processing and analysing huge business data sets in real time. It is not just the size of the data source that makes it difficult to address, but also its lack of structure and the speed at which it needs to be processed. This is why big data is typically defined by the three Vs: volume, velocity and variety.

In reality, some form of big data has been practiced for decades, as part of standard business processes, such as data analysis, web analytics, visualisation and information management. However, traditional hardware, software and analysis tools have been unable to tackle the sheer size, complexity and unstructured nature of contemporary data. In contrast, big data typically uses more powerful servers, in conjunction with advanced algorithms, storage and data analytics tools, harnessing the full potential of organisations’ digital data.

For example, big data usually involves NoSQL or NewSQL technology and a distributed architecture to analyse unstructured data from multiple sources (sometimes called a data lake). Similarly, the open-source Apache Hadoop framework – which provides distributed storage and distributed processing across clusters of servers – is the number-one technology associated with big data analysis.
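To illustrate the distributed processing model that Hadoop popularised, here is a toy MapReduce-style word count in plain Python. The record data and function names are invented for the example; in a real Hadoop cluster, the map and reduce phases would run in parallel across many data nodes rather than in one process.

```python
from collections import defaultdict

def map_phase(records):
    """Map step: emit a (word, 1) pair for every word in every record."""
    for record in records:
        for word in record.lower().split():
            yield word, 1

def reduce_phase(pairs):
    """Reduce step: sum the counts for each distinct word."""
    counts = defaultdict(int)
    for word, count in pairs:
        counts[word] += count
    return dict(counts)

# Unstructured "log lines" standing in for a small data-lake sample.
records = [
    "error disk full",
    "info backup complete",
    "error disk timeout",
]
word_counts = reduce_phase(map_phase(records))
print(word_counts["error"])  # → 2
```

The same two-phase pattern scales from this single process to thousands of nodes, which is why it suits the volume and variety of big data workloads.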

 

Storage – volume

To deliver meaningful insights, you need to leverage huge volumes of many types of data, which turns storage into an ongoing challenge. OVHcloud dedicated servers come with 500 GB of storage space as standard, along with the ability to effortlessly add additional secure storage space, whenever and wherever it’s needed.

Performance – velocity

Dedicated servers deliver the raw power and performance necessary to meet the intensive processing demands of big data and real-time analytics.

OVHcloud’s customisable HG servers, designed for big data, are equipped with Intel Xeon Scalable processors with 8 to 36 cores (16 to 72 threads), for consistently high performance.

Price/Performance – value

Dedicated servers give you the best price/performance ratio and unparalleled scalability, for enterprise-level big data projects that transform customer data into powerful business intelligence that drives sustainable growth.

This way, your data will deliver maximum business value, however you choose to utilise it.

Why use dedicated big data servers as opposed to a cloud solution?

Consider what suits your business. Start-ups and developers who are experimenting with big data analytics may benefit from a cost-effective, entry-level Public Cloud solution, whereas for an enterprise business, a dedicated big data server configuration or a hybrid cloud approach might win out.

OVHcloud’s competitively priced HG servers offer a strong price/performance ratio. If you are implementing a significant big data project or application involving many terabytes of raw data, it could be much cheaper to deploy HG servers than a Public Cloud solution. You also benefit from an unrivalled volume of storage space, and there are no noisy neighbours or hypervisors to wrestle with, as there can be with a virtualised option.

Tip 1. Consider a datacentre extension

Is your on-premises infrastructure running out of space to store your big data? Use OVHcloud Connect or a site-to-site VPN to deposit your data securely in OVHcloud datacentres.

Tip 2. Create redundancy with Hadoop

Operating in clusters, Hadoop’s distributed file system promotes a high rate of data transfer between nodes. This allows the system to work uninterrupted, even in the event that one element fails.

Build Hadoop clusters using OVHcloud bare-metal servers, then deploy and interconnect multiple Hadoop data nodes using OVHcloud's private vRack. Consider some of the various tools and projects in the Hadoop ecosystem – such as Apache Spark, Kettle, Oozie or Mawazo – to simplify your information management and business analytics processes.
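As a concrete example of this redundancy, HDFS replication is controlled by the `dfs.replication` property in the cluster's `hdfs-site.xml`; the value shown below is HDFS's common default of three copies per block.

```xml
<!-- hdfs-site.xml: each HDFS block is stored on this many data nodes,
     so the cluster keeps serving data even if one node fails. -->
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>3</value>
  </property>
</configuration>
```

With three data nodes interconnected over the vRack, every block survives the loss of any single node.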

Tip 3. Experiment with your big data architecture

The OVHcloud dedicated server range comes with the tools, options and very affordable models you need to experiment with different big data solutions: scale up if successful and shut down servers quickly when projects are complete.

Take advantage of our short-term contracts, user-friendly Control Panel with numerous visual analytics tools, 120-second delivery and post-installation scripts.