Seldon —MLOps

Abhishek Maheshwarappa
Dec 29, 2021

Seldon is a platform for packaging and deploying machine learning models.

This blog gives an overview of the MLOps tool Seldon, why one should know about it, and the benefits of using it.

Why should you know about Seldon?

The biggest problem in the industry is productionizing Machine Learning models: with so many moving parts, it is difficult for data scientists and DevOps engineers to work together and get things working. Data scientists lack understanding of cloud infrastructure, while DevOps engineers lack understanding of the requirements of Machine Learning models. To solve this, Seldon provides a platform where these processes can be handled at a high level by a Machine Learning Engineer, or the newer role of MLOps Engineer, even with limited knowledge of the underlying infrastructure.

Who should know about it?

Anyone who is trying to productionize Machine Learning models, typically a Machine Learning Engineer or an MLOps Engineer.

Using Seldon

The steps below describe the high-level flow of using Seldon.

  1. A Data Scientist prepares a Machine Learning model, either by training a new model or by fine-tuning a pre-trained model on the data.
  2. The trained model is saved to a storage location, which may be any cloud storage, as required.
  3. Then one has to create an inference class that reads the model from storage and returns predictions (see the sketch after this list).
  4. Now, using Seldon Core, build a reusable model server image (a Docker image) and push it to a registry.
  5. This model server exposes a REST or gRPC microservice for scoring.
  6. This can be deployed to the cluster to serve batch or streaming data.
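
As a rough sketch of step 3, assuming a scikit-learn model serialized with joblib to a local file named model.joblib (the file name and class name here are illustrative), a Seldon Core Python inference class could look like this:

```python
# Minimal Seldon Core inference class (a sketch, not the official template).
# Assumes a scikit-learn model serialized with joblib as "model.joblib";
# in practice the file would first be pulled from cloud storage.
import joblib


class SklearnModel:
    def __init__(self):
        # Load the trained model once when the server starts.
        self.model = joblib.load("model.joblib")

    def predict(self, X, features_names=None, meta=None):
        # Seldon Core calls predict() with the request payload as X
        # and returns the result over REST or gRPC.
        return self.model.predict(X)
```

From a class like this, Seldon Core's Python builder image (for example the seldonio/seldon-core-s2i-python3 s2i builder) can produce the reusable model server image from step 4, which is then pushed to a registry and deployed.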

Why not just use Flask or Docker?

Seldon comes with several benefits that make it a more suitable option:

  1. All the heavy lifting is handled by the Seldon platform.
  2. Seldon supports complex inference graphs, where one can run A/B tests, split the inference into separate modules, and more. The figure below shows how one can set up anything from a simple Seldon Core inference graph to a complex graph with a multi-armed bandit, outlier detection, and other components.
Image source: Seldon

3. Easy way to containerize Machine Learning models.

Image source: Seldon

4. Ease of deployment to the cluster, even with limited knowledge of the infrastructure.

Image source: Seldon

5. Seldon provides automated ingress configuration.

6. Advanced and customizable metrics, with integration with Prometheus and Grafana (see the sketch below).
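
As a sketch of point 6, the Seldon Core Python wrapper lets the inference class report custom metrics through a metrics() method, which Seldon exposes for Prometheus to scrape and Grafana to visualize. The metric names below are illustrative:

```python
# Sketch of custom metrics in a Seldon Core Python inference class.
# The metric names are illustrative; Seldon exposes them for Prometheus.
class ModelWithMetrics:
    def __init__(self):
        self.total_predictions = 0

    def predict(self, X, features_names=None, meta=None):
        self.total_predictions += 1
        return X  # identity "model", for illustration only

    def metrics(self):
        # Called by Seldon Core after each request; these values end up
        # on the wrapper's Prometheus metrics endpoint.
        return [
            {"type": "COUNTER", "key": "my_prediction_counter", "value": 1},
            {"type": "GAUGE", "key": "my_total_predictions", "value": self.total_predictions},
        ]
```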

The next article will cover how to use Seldon to build microservices for Machine Learning models.

References

https://www.seldon.io/
https://github.com/SeldonIO/seldon-core/
