Add value with big data
From cryptocurrency in financial services, to big data in healthcare, machine learning in commerce, robotics and IoT in industry, and self-driving vehicles in transport, data is transforming the world we live in. It acts as a driver for digital transformation in all economic sectors.
An overview of data in the world today
In the future, all companies will face new scenarios that are just dots on the horizon for now. If companies try to avoid them, they will risk being quickly beaten by a competitor that has found new ways of using data to boost their productivity, get a better understanding of the market, or offer services that are better adapted to their customers’ needs. They can either innovate or go under.
Innovate or go under
Data is constantly transforming and evolving as it changes the world around us. It is not only increasing in volume, but becoming more complex, with images, audio, social media posts, maps, sensor readings, satellite data, and much more. Storing and processing all of this heterogeneous, unstructured data is a real challenge.
Data — a vital asset for companies
Without a challenge, a revolution cannot happen. A problem has to exist in order for people to offer an innovative solution that represents a paradigm shift — and every revolution carries its own risks. The increasing volume and complexity of data represents a new risk for organisations.
Since data is now so critical for business models, data loss can seriously affect companies’ business continuity. And in sectors such as financial services and healthcare, where security is of paramount importance, organisations may be required to pay penalties and indemnities totalling millions of euros for non-compliance with local standards.
Risk management also increases costs for infrastructures and human resources, which may not be affordable for some organisations.
To face these issues, most companies have taken a new approach, and turned to cloud-based solutions.
The features offered by the cloud help companies tackle the challenges resulting from the exponential growth and diversification of data. They also open up opportunities to set up infrastructures that can host very high volumes of data, support data growth, and handle the requirements of the new technologies associated with it.
Given all of these data challenges, facing them with a standard on-premises infrastructure alone would be almost impossible. And depending on the hosting provider, cloud-based technology can also cut deployment time considerably.
A hosting provider ahead of the game
Not all cloud-based services are the same. Only a smart cloud can help companies maintain control over their data, so that they can innovate and run their businesses freely. Our cloud is simple, and quick to deploy. It is multi-local, close to visitors across the globe, and available at affordable and clear prices. It is also reversible, open, interoperable, transparent and responsible. We believe that a smart cloud like ours is key if the data revolution is to offer everyone a step forward.
Moreover, projects beyond a certain size will face three major challenges that not all cloud service providers can overcome: intelligence, volume and security.
There seems to be a consensus that together, big data and artificial intelligence form one of the main pillars of an organisation’s development. Companies are aware that with data, they can acquire knowledge that can be used to gain a better understanding of the social and economic contexts they are working in, so they can make the best decisions at any given time. And with this data, they can now build new services, based on machine learning and deep learning. In fact, almost all managers confirm that their companies are investing in initiatives like this.
To meet the demands of companies looking to ensure that their systems can manage the complexity of artificial intelligence and machine learning, OVH offers solutions to accelerate these initiatives throughout the entirety of the data journey. As a result, big data and artificial intelligence solutions are more accessible to businesses. These can be used on an hourly or monthly basis, and to make things even simpler, they can be delivered as managed services.
To optimise operations, develop new products and added-value services, and ensure compliance with increasingly strict regulations, companies will need to run predictive analysis tasks on very high volumes of data. To store all the data that powers artificial intelligence systems, and process it quickly, you need a suitable infrastructure.
As a native cloud provider, our solutions help you store and process high volumes of data in a secure, scalable, and reversible environment.
In just a short amount of time, you can have a production infrastructure, powered by next-generation hardware components, offering guaranteed availability. Our services, hosted in our own datacentres, spread across four continents, can be connected to one another via a dedicated network to handle growing volumes of data. They are also based on market standards and open-source solutions.
At any time, you can use our products alongside solutions from another service provider, or switch to another provider altogether. If our customers want to cancel their services, we want the process to be as simple as possible.
As we mentioned earlier on, data security is another major challenge. Over the last few months, the record for the intensity of distributed denial-of-service (DDoS) attacks has been beaten several times. A number of security vulnerabilities have also been made public. Keeping this in mind, it is not surprising to see that one of the biggest security concerns for most businesses is the risk of falling victim to a cyber-attack.
As IoT devices become increasingly popular, sensors and other wearables are likely to collect personal data. Because of this, they present a new source of vulnerability. Companies that collect this data must guarantee its integrity, both to protect their users’ private lives, and to ensure compliance with applicable local and international regulations.
At OVH, we take data protection and security very seriously. All of our services include our anti-DDoS protection as standard.
They also offer high availability, which we can ensure through hardware redundancy, and the very highest-standard compliance certifications. Our datacentres are connected to one another via our own dedicated global network, with a bandwidth of 18 Tbit/s.
And as a company based in Europe, we follow very strict regulations in terms of data protection. When we expanded into North America, we created a legal isolation, so that our services hosted outside of the US would not be subject to the Patriot Act or the Cloud Act. Our firm commitment to data sovereignty is one of the reasons why OVH was chosen as the official cloud provider of the IA4EU (Artificial Intelligence for the European Union) initiative, which aims to promote a European vision of artificial intelligence, centred on ethical values.
Contrary to what many people may think, security isn’t just a technical issue. In practice, finance is often the main obstacle for implementing the right security measures. Building secure infrastructures with no downtime and guaranteed data protection often incurs costs with service providers, which can exponentially increase, or even become completely unpredictable.
At OVH, you manage your own costs. We offer transparent, affordable pricing, to ensure that you always have a clear budget for a business continuity plan. Our cloud services don’t have a subscription period, and depending on the solution, you can choose between regular or on-demand billing, on a pay-as-you-go basis.
OVH solutions to fit your data strategy
OVH is proud to participate in developing a number of businesses worldwide, by supporting their data projects from start to finish. Our solutions cover the entire data journey, from collection, storage, and analysis, to predictions established via machine learning.
Our free OVH Data Collector lab – a powerful collector, supported by the OVH cloud – can be used to collect data easily. You can then replicate, interrogate, and transfer it to power your application.
We offer a range of data storage solutions to meet every requirement, from bare metal machines to turn-key managed platforms:
- dedicated servers specifically designed to store high volumes of data, such as the new Advance STOR range
- HA-NAS centralised storage
- scalable storage solutions, such as Object Storage and Block Storage (powered by OpenStack Swift and Ceph technology respectively)
- the Logs Data Platform and Metrics Data Platform, to store and analyse logs and metrics with almost no limitations
- Cloud Databases, a managed database solution
Once your data has been collected and stored in an adapted service, we offer two ways of building a big data cluster. With our first option, the Data Analytics Platform, you can deploy a secure, ready-to-use Apache Hadoop production cluster in less than an hour. And with the second option, Cloudera Managed, you can harness the expertise of our partners, Claranet and Cloudera, to have a fully-managed Apache Hadoop big data solution.
Finally, to use your data for machine learning and access the full potential of artificial intelligence, OVH will give you the infrastructure you need, with dedicated servers and Public Cloud instances boosted by next-generation NVIDIA GPUs. More recently, we also took this a step further by offering all of the required software. This is why we’re the only European cloud services provider to offer the full catalogue of containers accelerated by the NVIDIA GPU Cloud (NGC), which you can use to install applications like TensorFlow and PyTorch on your instances in just a few clicks. Turn-key data science programs, such as Jupyter and Dataiku DSS, are also available. Finally, you can also take part in our Machine Learning Platform and OVH AI Marketplace labs.
NVIDIA NGC Platform
OVH and NVIDIA Partner to Deliver the Best GPU Acceleration Platform for Deep Learning and High-Performance Computing
OVH's NVIDIA GPU Cloud combines the flexibility of the Public Cloud with the power of the NVIDIA Tesla V100 graphics card to provide a complete catalogue of GPU-accelerated containers that can be deployed and maintained for artificial intelligence applications.
It enables users to run their projects on a reliable and efficient platform that respects data confidentiality, reversibility, and transparency of data location.
Data Analytics Platform
Deploying a big data cluster is usually a long, restrictive process. The OVH Data Analytics Platform simplifies it: in under an hour, we can deliver a preconfigured, ready-to-use Apache Hadoop stack.
Based on a standard open-source Hadoop distribution, we preconfigure all the services you need to process data and secure the flow of data traffic.
The first step to taking advantage of your data is collecting it. With the OVH Data Collector, you can ingest data in record time, while maintaining real-time replication from many sources, including databases and message buses. You can then push the data into a big data cluster (we deliver an Apache Kafka topic natively).
You collect all your data with minimal impact on your production infrastructure.
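The Data Collector's internals are not detailed here, but the incremental-replication idea it describes can be sketched in a few lines: track an offset (a high-water mark) and pull only the rows added since the last pass, so the source system barely notices. This is an illustrative stand-in, not the OVH implementation; sqlite3 plays the role of a production source database, and the table and column names are made up.

```python
import sqlite3

# Illustrative sketch: incremental collection pulls only rows added
# since the last recorded offset, keeping impact on the source low.
def collect_new_rows(conn, last_id):
    """Return rows with id > last_id, plus the new high-water mark."""
    rows = conn.execute(
        "SELECT id, payload FROM events WHERE id > ? ORDER BY id", (last_id,)
    ).fetchall()
    new_last = rows[-1][0] if rows else last_id
    return rows, new_last

# Demo with an in-memory source database standing in for production.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, payload TEXT)")
conn.executemany("INSERT INTO events (payload) VALUES (?)",
                 [("a",), ("b",), ("c",)])

batch, offset = collect_new_rows(conn, last_id=0)   # first pass: all rows
conn.execute("INSERT INTO events (payload) VALUES ('d')")
delta, offset = collect_new_rows(conn, offset)      # second pass: only 'd'
print(len(batch), len(delta), offset)  # 3 1 4
```

In a real pipeline, each batch would be published to the delivered Kafka topic rather than printed, and the offset would be persisted so collection can resume after a restart.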
Design, deploy and use machine learning models quickly, without having to worry about your platform’s infrastructure. With this setup, you can focus on bringing value to your business.
You can use AutoML via a command-line or web interface to automate model selection, training, and deployment. It has a range of uses, including fraud detection, supply chain process optimisation, scientific research, and much more. You can manage all of this in the cloud, paying only for the resources you use, while maintaining full control over your data.
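At its core, what an AutoML loop automates can be sketched simply: fit several candidate models, score each on held-out data, and keep the best. The following toy example (the models, data, and names are illustrative, not the OVH AutoML interface) compares a constant predictor against a linear fit:

```python
# Illustrative sketch of what an AutoML loop automates: fit several
# candidate models, score each on held-out data, keep the best one.
# (Models and data are toy stand-ins, not the OVH implementation.)

train = [(1, 2.1), (2, 3.9), (3, 6.2), (4, 8.0)]
valid = [(5, 10.1), (6, 11.8)]

def fit_mean(data):
    """Baseline: always predict the mean of the training targets."""
    m = sum(y for _, y in data) / len(data)
    return lambda x: m

def fit_linear(data):
    """Least-squares slope through the origin: sum(xy) / sum(x^2)."""
    a = sum(x * y for x, y in data) / sum(x * x for x, _ in data)
    return lambda x: a * x

def mse(model, data):
    """Mean squared error on a held-out dataset."""
    return sum((model(x) - y) ** 2 for x, y in data) / len(data)

candidates = {"mean": fit_mean, "linear": fit_linear}
scores = {name: mse(fit(train), valid) for name, fit in candidates.items()}
best = min(scores, key=scores.get)
print(best)  # the linear model fits y ≈ 2x far better than a constant
```

A production AutoML system searches far larger model and hyperparameter spaces, but the select-score-keep loop is the same, which is why it lends itself so well to a command-line or web front end.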
Data as a driver for innovation
As we’ve already mentioned, a number of companies constantly collect data that needs to be stored progressively on their IT systems, with the maximum level of security guaranteed. For example, this is the case for the European Space Agency (ESA) which, through its partner Serco, hosts data for the Copernicus programme in the OVH Public Cloud. Every year, several petabytes of geographical, thermal, and unstructured data from the Sentinel satellites observing Earth are stored in our Public Cloud. The volume of data that needs to be stored is constantly increasing, especially as newer, more efficient satellites are put into orbit.
There are infinite applications for this data: forecasting the state of oceans, monitoring air quality, developing tools to create climate change services, and even calculating the profitability of installing solar panels. Using this kind of geographical data, the Spanish startup dotGIS developed SolarMap. Using technologies like big data and business analytics, this solution finds the best roofs to install solar panels on, based on roof surface area and daily solar radiation curves. SolarMap also received support from OVH. By joining the Digital Launch Pad, our innovation programme for startups, they designed an architecture of high-availability clusters with a load balancer, based on GPU-powered OVH dedicated servers. Using our private vRack network, they are able to transfer all their data volumes quickly and seamlessly.
But there are many other applications for big data. Businesses often use it for predictive marketing. By studying the actions taken by customers before they cancel a service, or whether they have stopped purchasing certain products, companies can observe behavioural patterns. The number of website hits, support calls, and even the buying rhythm are indicators that, once they have been correlated, can help detect a dissatisfied customer and take the appropriate measures. For example, at OVH, we developed a strategy like this internally, in order to guarantee customer satisfaction.
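Correlating indicators like these into a single risk signal can be sketched in a few lines. The weights, indicator names, and threshold below are made-up illustrative values, not OVH's internal model:

```python
# Illustrative churn-risk sketch: combine behavioural indicators into a
# single score. Weights and threshold are made-up illustrative values.
WEIGHTS = {
    "support_calls_last_month": 0.5,   # frequent support contact
    "days_since_last_purchase": 0.02,  # slowing buying rhythm
    "site_visits_last_month": -0.1,    # engagement lowers the risk
}

def churn_score(indicators):
    """Weighted sum of behavioural indicators."""
    return sum(WEIGHTS[k] * v for k, v in indicators.items())

def at_risk(indicators, threshold=2.0):
    """Flag a customer whose score crosses the threshold."""
    return churn_score(indicators) >= threshold

quiet_customer = {"support_calls_last_month": 4,
                  "days_since_last_purchase": 60,
                  "site_visits_last_month": 2}
active_customer = {"support_calls_last_month": 0,
                   "days_since_last_purchase": 5,
                   "site_visits_last_month": 20}
print(at_risk(quiet_customer), at_risk(active_customer))  # True False
```

In practice, the weights would be learned from historical cancellation data rather than hand-set, but the principle of correlating several weak signals into one actionable score is the same.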
And this isn’t the only domain in which we use big data. Using sensors and probes, we collect data such as the temperatures in datacentre rooms, the number of kWh in each power outlet, presence in offices, and much more. By correlating it with data from other sources, like weather conditions and supplier prices, we can quickly determine which element uses the most energy. And by doing this, we can cut costs and reduce our carbon footprint.
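Determining which element drives energy use comes down to measuring how strongly each data source tracks consumption, for instance with a correlation coefficient. The readings below are made-up sample values, and this is a sketch of the idea rather than our monitoring stack:

```python
import math

# Illustrative sketch: correlate room temperature with power draw to
# see how closely one tracks the other. Readings are made-up samples.
def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

room_temp_c = [21.0, 22.5, 24.0, 25.5, 27.0]   # datacentre room sensors
power_kwh   = [310, 335, 362, 390, 415]        # per-outlet consumption
r = pearson(room_temp_c, power_kwh)
print(round(r, 3))  # close to 1.0: power draw tracks temperature
```

Running the same comparison against weather data or supplier prices would rank each factor by how strongly it moves with consumption, which is exactly the kind of correlation that points to the biggest energy saver.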
These are just a few examples of how we — and our customers — use data in different social and economic domains. And at the heart of this revolution, we work tirelessly to ensure that our users and partners can create innovative technology with total freedom. They trust us with their most precious possession — their data.
“We manage an online game that totals 4 million games per day, and uses very high volumes of data. Early on, we would store the results of games on standard relational databases. We have since opted for the big data solutions offered by OVH. By doing this, we are no longer limited in terms of the volume of data we can store and use. The OVH Data Analytics Platform is a turn-key solution that has also helped us save a considerable amount of time setting up a natively-secure big data cluster. We can now use all our data efficiently, and without any constraints.”