AI, ML, & Ubuntu: Everything you need to know
Canonical
on 26 July 2018
Tags: AI , AI/ML , GPGPU , Kubeflow , kubernetes , ML , MLOps , TensorFlow , Ubuntu
AI and ML adoption in the enterprise is exploding, from Silicon Valley to Wall Street. Ubuntu is the premier platform for these ambitions: from developer workstations to racks, to clouds, and to the edge with smart connected IoT. One of the joys that comes with any new developer trend is a plethora of new technologies and terminology to understand.
In this webinar, join Canonical’s Kubernetes Product Manager Carmine Rimi for:
- An introduction to some of the key concepts in Machine Learning
- A look at examples of how AI applications and their development are reshaping companies' IT
- A deep dive into how enterprises are applying DevOps practices to their ML infrastructure and workflows
- An introduction to Canonical's AI/ML portfolio, from Ubuntu to the Canonical Distribution of Kubernetes, and how to get started quickly with your project
In addition, we’ll be answering questions such as:
- What do Kubeflow, TensorFlow, Jupyter, and GPGPUs do?
- What’s the difference between AI, ML and DL?
- What is an AI model? How do you train it? How do you develop and improve it? How do you execute it? (See the short training sketch after this list.)
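As a taste of the model, training, and execution questions above, here is a minimal sketch (not taken from the webinar) that trains a tiny TensorFlow/Keras model on synthetic data and then runs it for a prediction; the toy dataset and single-layer model are illustrative assumptions.

```python
# Minimal illustrative sketch: "a model" is a parameterised function,
# "training" fits its parameters to data, and "executing" it means
# running inference on new inputs.
import numpy as np
import tensorflow as tf

# Hypothetical toy data: noisy samples of y = 2x + 1.
x = np.linspace(-1.0, 1.0, 200).astype("float32").reshape(-1, 1)
y = 2.0 * x + 1.0 + np.random.normal(scale=0.1, size=x.shape).astype("float32")

# The model: a single dense layer (effectively linear regression).
model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(1,))])
model.compile(optimizer="adam", loss="mse")

# Training: iteratively adjust the weights to minimise the loss.
model.fit(x, y, epochs=100, verbose=0)

# Execution (inference): run the trained model on a new input.
print(model.predict(np.array([[0.5]], dtype="float32")))
```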
Finally, we’ll take the time to answer your questions in a Q&A session.
Run Kubeflow anywhere, easily
With Charmed Kubeflow, deployment and operations of Kubeflow are easy for any scenario.
Charmed Kubeflow is a collection of Python operators that define the integration of the applications inside Kubeflow, such as katib or pipelines-ui.
Use Kubeflow on-prem, on the desktop, at the edge, in the public cloud, or across multiple clouds.
What is Kubeflow?
Kubeflow makes deployments of Machine Learning workflows on Kubernetes simple, portable and scalable.
Kubeflow is the machine learning toolkit for Kubernetes. It extends Kubernetes' ability to run independent, configurable steps with machine-learning-specific frameworks and libraries.
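To make "independent, configurable steps" concrete, here is a hedged sketch using the Kubeflow Pipelines SDK (kfp v2 syntax assumed; the component and pipeline names are illustrative, not from this post). It chains two lightweight Python components into a pipeline definition that a Kubeflow Pipelines instance can run on Kubernetes.

```python
# Hedged sketch: two lightweight components chained into a Kubeflow pipeline
# (kfp v2 SDK syntax assumed; names and parameters are illustrative).
from kfp import dsl, compiler

@dsl.component
def preprocess(rows: int) -> int:
    # Pretend to clean a dataset and report how many rows survived.
    return rows - 10

@dsl.component
def train(rows: int) -> str:
    # Pretend to train a model on the preprocessed rows.
    return f"model trained on {rows} rows"

@dsl.pipeline(name="demo-training-pipeline")
def demo_pipeline(rows: int = 1000):
    cleaned = preprocess(rows=rows)
    train(rows=cleaned.output)

# Compile to a definition that a Kubeflow Pipelines instance can execute.
compiler.Compiler().compile(demo_pipeline, package_path="demo_pipeline.yaml")
```

Each component runs as its own containerised step on the cluster, which is what makes the workflow portable and scalable.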
Install Kubeflow
You can install Kubeflow on your workstation, a local server, or a public cloud VM. It is easy to install with MicroK8s in any of these environments and can be scaled to high availability.
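As a hedged illustration of a single-node install on Ubuntu, the commands below follow the MicroK8s and Charmed Kubeflow documentation as we recall it; add-on names, channels, and extra steps (such as load-balancer configuration) vary by version, so treat this as a sketch and check the current docs before running it.

```
# Hedged sketch: Charmed Kubeflow on MicroK8s (versions and add-on names drift).
sudo snap install microk8s --classic
sudo snap install juju --classic
microk8s status --wait-ready
microk8s enable dns hostpath-storage   # called "storage" in older MicroK8s releases
juju bootstrap microk8s                # use MicroK8s as the Juju cloud
juju add-model kubeflow                # Charmed Kubeflow expects this model name
juju deploy kubeflow --trust           # deploy the Charmed Kubeflow bundle
```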