{% extends "confidential-computing/base_confidential-computing.html" %}
{% block title %}Confidential AI{% endblock %}
{% block meta_description %}
Run your confidential AI workloads with our Ubuntu confidential computing stack.
{% endblock %}
{% block meta_copydoc %}
https://docs.google.com/document/d/1TZ_jK3ReetAU2qA_YxfZL8i4j_lFidcF6F2Xup8R8Nc/edit
{% endblock %}
{% block body_class %}
is-paper
{% endblock body_class %}
{% block content %}
Protect the confidentiality and integrity of your AI workloads at run-time across public and private clouds.
Confidential AI is made possible by confidential computing. Unlike traditional VMs, where you also have to trust that the host software is secure, confidential VMs require you to trust only the software running within them and the platform's hardware root of trust.
Ubuntu confidential VMs protect your workload's computation while it is in the CPU, and use the hardware's memory encryption engines to keep your data encrypted in system memory.
NVIDIA H100 Tensor Core GPUs protect the confidentiality and integrity of the workload's computation within the GPU, using built-in firewalls.
Communication between the CPU and the GPU is encrypted, and remote parties can use strong cryptographic guarantees to remotely attest the security claims of the platform.
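As an illustration of the attestation step described above, a workload can first confirm that it is actually running inside an AMD SEV-SNP confidential VM before requesting a signed attestation report. This is a minimal sketch: `/dev/sev-guest` is the device the Linux kernel exposes inside SEV-SNP guests, and the check reports "absent" on ordinary machines.

```shell
#!/bin/sh
# Minimal sketch: check for the kernel's SEV-SNP guest attestation device.
# /dev/sev-guest exists only inside an AMD SEV-SNP confidential VM, where
# it is the interface used to request signed attestation reports.
sev_device="/dev/sev-guest"
if [ -e "$sev_device" ]; then
    echo "sev-snp: present"
else
    echo "sev-snp: absent"
fi
```

Once the device is present, a signed attestation report can be requested through the kernel's sev-guest interface or higher-level tooling, and verified by a remote party against AMD's certificate chain.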
Enterprises using machine learning in the cloud are concerned about the security of their data and the protection of their models. Industry regulations often prevent sharing sensitive data, hindering AI's full potential in important fields.
To address these challenges on Azure, a confidential AI preview is available with Ubuntu confidential VMs on 4th Gen AMD EPYC processors with SEV-SNP, alongside NVIDIA H100 GPUs.
Sign up for the Azure preview of confidential AI with Ubuntu ›
While confidential VMs can protect your workload from external threats, vulnerabilities within their boundaries remain a concern. This is where Ubuntu Pro becomes essential: it keeps your guest CVM software stack patched and up to date.
Your on-premises servers are exposed to insider attacks, and they run the same privileged system software found in the public cloud, so they are susceptible to the same vulnerabilities and security risks.
To help you seamlessly enable confidential computing within your private cloud, Canonical offers a tech preview of Intel TDX on Ubuntu 24.04, which provides the base host OS, guest OS, and remote attestation functionality.
Ubuntu for confidential AI
Contact us
Why you need Ubuntu for confidential AI
How confidential AI works
CPU-based confidential computing
GPU-based confidential computing
Securely integrated CPU-GPU solution
Confidential AI with Ubuntu and NVIDIA H100 GPUs
Use Ubuntu Pro to further harden your confidential VMs
Deploy confidential computing on your private cloud with Ubuntu and Intel TDX
Learn more about confidential computing