Online MLMU #7: Serve Your ML Models in AWS Using Python – Václav Košař

  Machine learning

The meetup will be hosted online using the Zoom platform (and streamed on YouTube).

Automate your ML model train-deploy cycle, garbage collection, and rollbacks, all from Python, with an open-source PyPI package based on Cortex.

It all started with the modernization of a product categorization project. The goal was to replace complex low-level Docker commands with a straightforward, user-friendly deployment utility called Cortex. The resulting Python package proved reusable: we successfully used it in our recommendation engine project as well, and we plan to deploy all of our ML projects this way. Since GLAMI relies heavily on open-source software, we wanted to contribute back and decided to open-source the package as Cortex Serving Client. Now you can use it too.
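To give a feel for the deploy, garbage-collection, and rollback cycle described above, here is a minimal sketch in plain Python. This is a hypothetical illustration of the idea, not the actual Cortex Serving Client API; the `ModelRegistry` class and its method names are invented for this example.

```python
# Hypothetical sketch of a deploy/rollback cycle with garbage collection.
# NOT the Cortex Serving Client API - names here are illustrative only.
from dataclasses import dataclass, field

@dataclass
class ModelRegistry:
    """Tracks deployed model versions; keeps at most `keep` for rollback."""
    keep: int = 2
    versions: list = field(default_factory=list)  # newest version last

    def deploy(self, version: str) -> list:
        """Deploy a new version; garbage-collect versions older than `keep`."""
        self.versions.append(version)
        removed = self.versions[:-self.keep]
        self.versions = self.versions[-self.keep:]
        return removed  # versions that were garbage-collected

    def rollback(self) -> str:
        """Drop the newest (broken) version; the previous one is live again."""
        if len(self.versions) < 2:
            raise RuntimeError("no previous version to roll back to")
        return self.versions.pop()

registry = ModelRegistry()
registry.deploy("categorizer-v1")
registry.deploy("categorizer-v2")
removed = registry.deploy("categorizer-v3")  # "categorizer-v1" is collected
bad = registry.rollback()                    # "categorizer-v2" is live again
```

A real deployment client would additionally talk to the cluster (e.g. via Cortex on AWS) instead of mutating an in-memory list, but the lifecycle shape is the same.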

Václav is a software developer on his way to becoming a machine learning engineer. Since 2020, he has worked at GLAMI on AI product categorization and recommendation.