Machine Learning Model as a Serverless App using Google App Engine | by Saed Hussain | Jan, 2021


Create a folder for the project and download the code files for this article from the repository here.

Then navigate to this directory using the terminal (cd <path_to_dir>) and make sure the virtual environment is active (conda activate <env_name>).

Navigating to the project directory and activating a virtual environment

Obviously, you can do the same using your favorite IDE. But make sure to activate a virtual environment (click here for VS Code). Otherwise, you will end up installing dependencies in your default environment, which could break other projects using that environment.

Now let’s take a look at the Streamlit app file:

GitHub Gist of the Streamlit app file

Notice how, by adding a simple import (import streamlit as st), a regular data science script (with pandas, numpy, model.predict(), etc.) is converted into a Streamlit web app. All we have done is add Streamlit widgets to interact with the model, such as a text input widget, a button widget, etc.

You can try running the example Streamlit app in the newly created virtual environment using the streamlit run command.

This should result in errors due to missing Python modules in the virtual environment. You can use pip install to install the missing modules one by one, as you encounter them, until the app finally runs.

Streamlit not installed in the virtual environment.

Or you can install all of the app’s dependencies at once using the dependency list in requirements.txt, which will replicate my environment in which the app was created and tested.

Install all the project dependencies into the virtual environment using the command pip install -r requirements.txt.

Installing pip module in a conda virtual environment.

You can create a dependency list like this for your own project, once completed, by running pip freeze > requirements.txt.
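For reference, a requirements.txt file is just a plain text list of packages, one per line, with pinned versions. The entries and version numbers below are illustrative; yours will reflect whatever pip freeze captures from your environment:

```
streamlit==0.74.1
pandas==1.2.0
numpy==1.19.5
scikit-learn==0.24.0
```

Pinning exact versions is what makes the environment reproducible: anyone running pip install -r requirements.txt gets the same package versions the app was tested with.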

Give the installation of the modules some time to complete. When it’s done, you can run the Streamlit app in the virtual environment using the streamlit run command.

Streamlit app server running on the localhost, on port 8501.
Iris flower species prediction Streamlit app running on localhost:8501

You should see your default browser pop up and display the app (on localhost, port 8501 by default). Feel free to play around with the numbers and see the model work.

Congratulations, you have built a web app in minutes to interact with a machine learning model! 😄

When you are done playing with the Streamlit app, you can shut the app server down using Ctrl + C in the terminal.
