ogre-cli

The quickest way to start using ogre.run

⚠️ This package is currently outdated and on track to be deprecated in favor of miniogre.

Ogre CLI is a framework built around Docker that automates the creation of containerized runtime environments for Python code.

It analyzes the source code inside a given folder and automatically creates an environment:

  • No Dockerfile? Ogre takes care of it for you

  • No requirements file? Ogre generates one for you

This getting-started guide focuses on the simplest form of the Ogre CLI, which can be used offline, without a token. To unlock the full power of the Ogre CLI, create a token and follow these steps.

Installation

IMPORTANT: make sure you have Docker installed. Follow the instructions on the Docker website.
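To verify that Docker is installed and that the daemon is running, you can use two standard Docker commands (an optional sanity check, not specific to ogre):

docker --version
docker info > /dev/null 2>&1 && echo "Docker daemon is reachable"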

Install the Ogre CLI with pip:

pip install ogre-cli
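Optionally, install it inside a virtual environment to keep it isolated from your system Python (standard Python tooling, nothing ogre-specific):

python -m venv .venv
source .venv/bin/activate
pip install ogre-cli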

To check if everything was successfully installed, type:

ogre --help

You will be greeted by the ogre menu.

Usage

Now let's "ogrerize" your code. As an example, we use nanoGPT, a popular AI repository on GitHub, but you are free to try it on any Python codebase of your choice.

Clone the nanoGPT repository:

git clone https://github.com/karpathy/nanoGPT.git

Go inside its folder:

cd nanoGPT

Configure ogre. This generates the ogre.yml file that contains the specifications ogre uses to build the runtime environment (see details here):

ogre --config
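The generated ogre.yml is an ordinary text file, typically created in the repository folder, so you can review it (and edit it if needed) before building:

cat ogre.yml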

Finally, generate the ogre environment:

ogre

IMPORTANT: If you run into problems in this step, please contact us: contact@ogre.run

The environment build starts, and you will see something similar to the log below:

root@6f1af4cdcc34:/opt/nanoGPT# ogre
>>> Port 8001 is available
True
    ____
  / __ \____ _________
 / / / / __ `/ ___/ _ \
/ /_/ / /_/ / /  /  __/
\____/\__, /_/   \___/
     /____/

Ogre 0.8.2 - ogre.run

Project: nanoGPT

Ogre summary: {
    "version": false,
    "path": ".",
    "web": false,
    "jupyter": true,
    "jupyter_password": "ogre123",
    "jupyter_port": "8001",
    "platform": null,
    "mount": true,
    "ping": false,
    "api_url": "https://dev-cloud.ogre.run",
    "api_token": "$OGRE_API_TOKEN",
    "openai_token": "",
    "package": false,
    "publish": false,
    "service_up": false,
    "service_down": false,
    "service_delete": false,
    "service_ls": false,
    "attach": false,
    "stop": false,
    "delete": false,
    "config": false,
    "dry": false,
    "info": false,
    "gui": false,
    "save": false,
    "yaml_config": [],
    "docker": {
        "company": "ogre-run",
        "version": "0.1.0",
        "baseimage": "ogrerun/base:ubuntu22.04-amd64",
        "device": "cpu",
        "container_repo": "ogrerun",
        "project_volume_path": "/opt",
        "cmd": "bash",
        "ttyd": "ttyd",
        "ttyd_version": "1.6.3",
        "ttyd_url": "https://github.com/tsl0922/ttyd/releases/download/",
        "ttyd_port": 6007,
        "ogre_dir": "ogre_dir",
        "requirements_format": "freeze",
        "expose_ports": 8001,
        "no_cache": true
    }
}

Once the build is done, you can check that the environment is deployed and running:

❯ ogre --info
>>> Port 8001 is available
True
    ____
  / __ \____ _________
 / / / / __ `/ ___/ _ \
/ /_/ / /_/ / /  /  __/
\____/\__, /_/   \___/
     /____/

Ogre 0.8.2 - ogre.run

Project: nanoGPT

> Info about ogre containers running in the local system
ogre-ogre-run-nanogpt-jupyter 0.0.0.0:8001->8001/tcp

Notice that the ogre environment is published on 0.0.0.0:8001, meaning port 8001 is reachable on your local machine at localhost:8001.
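If you prefer to check with Docker directly, the same container also shows up in docker ps (the container name below is taken from the ogre --info output above):

docker ps --filter "name=ogre-ogre-run-nanogpt-jupyter"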

What just happened?

  1. Ogre read your Python source code and created a list of all the dependencies that needed to be installed, without needing an existing requirements.txt file (a rough illustration of the idea follows this list).

  2. Ogre also created a Dockerfile based on the public ogre.run Linux base image (ogrerun/base:ubuntu22.04-amd64 in the summary above). Docker uses this Dockerfile to build an image unique to that repository, without you having to specify the build parameters manually.

  3. During the image build, Docker uses the ogre-generated requirements.txt to install all the dependencies inside the image.
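To build intuition for step 1: the core idea is to scan the repository for import statements and turn them into an installable dependency list. The one-liner below is only a rough illustration using standard Unix tools, not ogre's actual implementation (among other things, ogre also maps import names to the corresponding PyPI package names, which this snippet does not):

# Illustration only: list the top-level modules imported anywhere in the repository
grep -rhE '^(import|from)[[:space:]]+[A-Za-z_]' --include='*.py' . \
  | awk '{print $2}' | cut -d. -f1 | sort -u

The files ogre generates are presumably collected under the ogre_dir folder named in the summary above (the exact layout may differ between versions), so you can inspect them with ls ogre_dir/.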

Finally, access the built environment

You can now access the runtime environment, with all the dependencies installed, and start working using either of the following options:

Option 1

Type ogre --attach, which takes you inside the container:

❯ ogre --attach

> Attach to container ogre-ogre-run-nanogpt
docker exec -i --tty ogre-ogre-run-nanogpt-jupyter bash

  ___   __ _ _ __ ___
 / _ \ / _` | '__/ _ \
| (_) | (_| | | |  __/
 \___/ \__, |_|  \___|
       |___/


Made by ogre.run

Reach out to us: contact@ogre.run

WARNING: You are running this container as root, which can cause new files in
mounted volumes to be created as the root user on your host machine.

To avoid this, run the container by specifying your user's userid:

$ docker run -u $(id -u):$(id -g) args...
PRODUCT = nanoGPT
VERSION = 0.1.0
BUILD_DATE = 2024-02-08-013214
REPOSITORY = https://github.com/karpathy/nanoGPT.git
COMMIT = eba36e8
COMMIT_AUTHOR = andrej.karpathy@gmail.com
root@966d6ff57758:/opt/nanoGPT#
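Inside the container, the repository is mounted under /opt/nanoGPT (as the prompt above shows) and the dependencies are already installed. A quick spot check, assuming torch is among nanoGPT's dependencies and that the image exposes a python command (use python3 if it does not):

# Confirm a key dependency is importable inside the container
python -c "import torch; print(torch.__version__)"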

Option 2

In a web browser, visit the address returned by the ogre --info command, in this case 0.0.0.0:8001 (i.e., http://localhost:8001). You will be greeted by a Jupyter Notebook interface that gives you more options to interact with the code. The default password is ogre123, but you can define your own by configuring the Ogre CLI parameters.
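Before opening the browser, you can optionally confirm that the Jupyter server answers on the published port with curl:

# Prints the HTTP status code (e.g. 200 or 302) if the server is up
curl -s -o /dev/null -w '%{http_code}\n' http://localhost:8001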
