Big Data & Analytics
Big data and Analytics on a Bare Metal server
Big data projects require infrastructures that are sized for mass data processing. Get the most out of your data with the raw power of OVHcloud Bare Metal dedicated servers.
Meet the challenges of big data with OVHcloud servers
Bare Metal cloud servers have been designed to meet the challenges of big data, with security at the heart of the architecture. They deliver all the computing power and storage you need for real-time data processing.
Our Bare Metal servers are delivered with no virtualisation layer, offering maximum raw performance. They are equipped with NVMe disks and the latest-generation Intel and AMD processors, so you get the best hardware for big data usage. You can also easily add resources to expand your infrastructure.
A resilient network
Our 4-link high-speed network ensures service continuity and optimal resilience, backed by a 99.99% SLA. Traffic is unlimited, so you do not need to worry about any extra costs.
A fully-isolated infrastructure
Your servers are fully-dedicated to your project. This guarantees both stable, consistent performance, and security for sensitive data.
High disk capacity
With up to 360TB of storage space, you get very high read/write rates (IOPS) thanks to SSD technology.
OVHcloud Bare Metal servers are well suited to building Hadoop clusters. Deploy and connect multiple data nodes via our guaranteed 50Gbit/s internal network (vRack) on the High Grade range. You can also use a range of tools and projects from the Hadoop ecosystem (Apache Spark, Kettle, Oozie or Mawazo) to simplify your data management and business analytics processes.
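The distributed processing model behind Hadoop and similar frameworks can be illustrated with a minimal word-count sketch. This is a generic, single-machine illustration of the map/shuffle/reduce pattern, not OVHcloud or Hadoop code; in a real cluster, the shuffle step moves data between nodes over the network (which is why guaranteed internal bandwidth matters).

```python
from collections import defaultdict

# Illustrative sketch of the map/shuffle/reduce pattern that frameworks
# such as Hadoop MapReduce and Apache Spark distribute across data nodes.

def map_phase(records):
    # Map: emit a (word, 1) pair for each word in each record.
    for record in records:
        for word in record.split():
            yield word.lower(), 1

def shuffle_phase(pairs):
    # Shuffle: group values by key (done over the network in a cluster).
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    # Reduce: aggregate the grouped values for each key.
    return {key: sum(values) for key, values in grouped.items()}

records = ["big data needs big storage", "data nodes process data"]
counts = reduce_phase(shuffle_phase(map_phase(records)))
# counts["big"] == 2, counts["data"] == 3
```

The same three phases run in parallel across all the nodes of a cluster, which is what makes the approach scale to very large datasets.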
Expand your capacity on demand by adding storage disks, swapping the hot-swap disks available on our High Grade range, or adding nodes to your cluster via the vRack.
Options and services included
- A range of memory and storage options.
- 50Gbit/s private bandwidth guaranteed on High Grade (optional).
- OVHcloud Link Aggregation (OLA) is available at no additional cost.
Scale-6 Dedicated Servers
Server based on dual Intel Xeon Gold 6248R processors (48C/96T @ 3.00/4.00GHz)
Our recommended Bare Metal servers
A range of servers designed for complex, high-resilience infrastructures.
Single and dual processors with 24-48 cores, max storage of 24TB NVMe SSD, guaranteed private bandwidth of up to 25Gbit/s.
High Grade Servers (HGR-HCI)
The most powerful servers, optimised for critical workloads and scalability.
32-64 core dual processors, max storage of 92TB NVMe SSD or SAS SSD, up to 24 disks (hot-swappable), guaranteed private bandwidth of up to 50Gbit/s.
High Grade Servers (HGR-SDS)
Recommended for high distributed storage requirements.
20 and 24 core processors, max storage of 360TB NVMe SSD or SAS SSD, up to 24 disks (hot-swappable) and guaranteed private bandwidth of up to 50Gbit/s.
Epsilon worked in partnership with OVHcloud and Cloudera to deliver a stable, secure big data platform for a major luxury goods customer, allowing them to dramatically enhance their coupon marketing strategies.
Talent, a global job search engine, explains how in just a few years it managed to establish itself as a leading employment search website with OVHcloud’s Bare Metal and Public Cloud solutions.
Swarm64 develops solutions to improve the performance of Postgres, the open-source database management system. It designs solutions for high-volume projects such as data storage, IoT and SaaS systems, offering faster Postgres databases with excellent value for money.
How is a big data infrastructure built?
Whatever the specific needs of the project, every big data infrastructure rests on a common foundation: hardware. This usage requires servers with high computing power (CPU), large volumes of RAM, and large amounts of storage space (hard disks, SSD and NVMe). These servers need to communicate over a high-speed network, with enough bandwidth to handle the many processing jobs a big data project runs in parallel.

Big data infrastructures gather and store an enormous volume of data, analyse it with the shortest possible processing times, and ultimately give the IT teams in charge of the project an exploitable result. The best way of building a big data infrastructure varies depending on the technology used. Many technologies exist, each with its own advantages, complexities, and fit for business needs: these include Apache Hadoop, Apache Spark, MongoDB, Cassandra, RapidMiner, and many others.
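The gather, analyse, and deliver cycle described above can be sketched in miniature. This is a generic illustration in pure Python, not OVHcloud tooling; the sensor data, the partitioning scheme, and the number of workers are all invented for the example, and a real platform would read from distributed storage rather than generate records in memory.

```python
from concurrent.futures import ThreadPoolExecutor

# Generic illustration of the big data processing cycle: ingest raw
# records, analyse partitions in parallel (as cluster nodes would),
# and aggregate an exploitable result for the IT team.

def ingest():
    # Gather: a real platform would read from distributed storage.
    return [{"sensor": i % 3, "value": float(i)} for i in range(12)]

def analyse_partition(partition):
    # Analyse: compute a partial aggregate for one partition of the data.
    total = sum(record["value"] for record in partition)
    return total, len(partition)

def run_pipeline(num_workers=3):
    records = ingest()
    # Partition the data, as a distributed framework spreads it across nodes.
    partitions = [records[i::num_workers] for i in range(num_workers)]
    with ThreadPoolExecutor(max_workers=num_workers) as pool:
        partials = list(pool.map(analyse_partition, partitions))
    # Reduce the partial results into one exploitable figure.
    total = sum(t for t, _ in partials)
    count = sum(c for _, c in partials)
    return total / count  # mean sensor value

mean_value = run_pipeline()
```

The same shape holds at scale: the ingest step becomes distributed storage, the worker pool becomes a cluster of data nodes, and the final reduction is what reaches the business teams.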