For the upcoming Python contest, I would like to make a small demo of how to create a simple REST application in Python that uses IRIS as a database, using these tools:
- FastAPI framework, high performance, easy to learn, fast to code, ready for production
- SQLAlchemy is the Python SQL toolkit and Object Relational Mapper that gives application developers the full power and flexibility of SQL
- Alembic is a lightweight database migration tool for usage with the SQLAlchemy Database Toolkit for Python.
- Uvicorn is an ASGI web server implementation for Python.
Prepare environment
Assuming Python (at least version 3.7) is already installed, create a project folder and add a requirements.txt file to it with this content:
fastapi==0.101.1
alembic==1.11.1
uvicorn==0.22.0
sqlalchemy==2.0.20
sqlalchemy-iris==0.10.5
I'd recommend using a virtual environment in Python, so let's create a new environment and activate it:
python -m venv env && source env/bin/activate
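If you happen to be on Windows (cmd.exe), activation differs slightly (a sketch, assuming the same env folder name):
env\Scripts\activate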
And now we can install our dependencies
pip install -r requirements.txt
Fast start
Let's create the simplest REST API with FastAPI. To do so, create app/main.py:
from fastapi import FastAPI

app = FastAPI(
    title='TODO Application',
    version='1.0.0',
)


@app.get("/ping")
async def pong():
    return {"ping": "pong!"}
At this point it's enough to start our application, and it should already work. To start the server we will use uvicorn:
$ uvicorn app.main:app
INFO:     Started server process [94936]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
INFO:     Uvicorn running on http://127.0.0.1:8000 (Press CTRL+C to quit)
And we can make a ping request:
$ curl http://localhost:8000/ping
{"ping":"pong!"}
FastAPI also offers a built-in UI where we can test the API.
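By default the interactive docs are served at http://localhost:8000/docs, and the OpenAPI schema they are built from can be fetched directly as well:
curl http://localhost:8000/openapi.json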
Dockerized environment
To add IRIS to our application, we will use containers. The IRIS image will go as is, but we need to build a Docker image for the Python application, so we will need a Dockerfile:
FROM python:3.11-slim-buster
WORKDIR /usr/src/app
RUN --mount=type=bind,src=.,dst=. \
pip install --upgrade pip && \
pip install -r requirements.txt
COPY . .
ENTRYPOINT [ "/usr/src/app/entrypoint.sh" ]
Starting the application inside the container requires a simple entrypoint.sh; the "$@" at the end passes through any extra arguments given as the container command (we will use it later to add --reload):
#!/bin/sh
# Run SQL Migrations, to make DB Schema up to date
alembic upgrade head
# Start Python application
uvicorn app.main:app \
--workers 1 \
--host 0.0.0.0 \
--port 8000 "$@"
Do not forget to make it executable:
chmod +x entrypoint.sh
And combine it with IRIS in docker-compose.yml:
version: "3"

services:
  iris:
    image: intersystemsdc/iris-community
    ports:
      - 1972
    environment:
      - IRISUSERNAME=demo
      - IRISPASSWORD=demo
    healthcheck:
      test: /irisHealth.sh
      interval: 5s
  app:
    build: .
    ports:
      - 8000:8000
    environment:
      - DATABASE_URL=iris://demo:demo@iris:1972/USER
    volumes:
      - ./:/usr/src/app
    depends_on:
      iris:
        condition: service_healthy
    command:
      - --reload
Let's build it
docker-compose build
The first data model
Now let's give the application access to our IRIS database. Add the file app/db.py, which configures SQLAlchemy to access the database defined by the URL passed in through docker-compose.yml. It also contains a couple of helpers we will use later in the app:
import os

from sqlalchemy import create_engine
from sqlalchemy.ext.declarative import DeclarativeMeta, declarative_base
from sqlalchemy.orm import sessionmaker

DATABASE_URL = os.environ.get("DATABASE_URL")
if not DATABASE_URL:
    DATABASE_URL = "iris://demo:demo@localhost:1972/USER"

engine = create_engine(DATABASE_URL, echo=True, future=True)
Base: DeclarativeMeta = declarative_base()
SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)


def init_db():
    engine.connect()


def get_session():
    session = SessionLocal()
    yield session
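The get_session dependency above yields a session and leaves cleanup to garbage collection; a common FastAPI pattern (a sketch, not part of the demo repository) closes it explicitly once the request is done:
def get_session():
    session = SessionLocal()
    try:
        yield session
    finally:
        # return the connection to the pool after the request is finished
        session.close()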
And now we are ready to define the first and only model in our application. Create the file app/models.py; it uses SQLAlchemy to define the model, named Todo, with three columns: id, title, and description.
from sqlalchemy import Column, Integer, String, Text

from app.db import Base


class Todo(Base):
    __tablename__ = 'todo'

    id = Column(Integer, primary_key=True, index=True)
    title = Column(String(200), index=True, nullable=False)
    description = Column(Text, nullable=False)
Prepare SQL migration
In a changing world, we know that our application will be improved in the future: the table structure is not final, and we may add more tables, columns, indexes, and so on. The best approach in this case is an SQL migration tool, which upgrades the actual structure in the database to match the version of our application, and can also downgrade it in case something goes wrong. Since this project uses Python and SQLAlchemy, the natural choice is Alembic, a migration tool from the author of SQLAlchemy, and we are going to use it here.
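The typical Alembic workflow we will follow boils down to a few commands (shown here as a preview; downgrade is the safety net mentioned above):
# generate a new migration script by comparing the models with the database
alembic revision --autogenerate
# apply all pending migrations
alembic upgrade head
# roll back the most recent migration if something went wrong
alembic downgrade -1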
We need to start IRIS and the container with our application; at this moment we need bash in the application container to be able to run commands:
$ docker-compose run --entrypoint bash app
[+] Creating 2/0
 ✔ Network fastapi-iris-demo_default    Created    0.0s
 ✔ Container fastapi-iris-demo-iris-1   Created    0.0s
[+] Running 1/1
 ✔ Container fastapi-iris-demo-iris-1   Started    0.1s
root@7bf903cd2721:/usr/src/app#
Run the command alembic init app/migrations:
root@7bf903cd2721:/usr/src/app# alembic init app/migrations
  Creating directory '/usr/src/app/app/migrations' ...  done
  Creating directory '/usr/src/app/app/migrations/versions' ...  done
  Generating /usr/src/app/app/migrations/README ...  done
  Generating /usr/src/app/app/migrations/script.py.mako ...  done
  Generating /usr/src/app/app/migrations/env.py ...  done
  Generating /usr/src/app/alembic.ini ...  done
  Please edit configuration/connection/logging settings in '/usr/src/app/alembic.ini' before proceeding.
root@7bf903cd2721:/usr/src/app#
This prepared the Alembic configuration, and we need to adjust it to fit our application's needs. To do so, edit the app/migrations/env.py file. Only the beginning of the file needs to be updated; focus on setting sqlalchemy.url and target_metadata. Everything below remains the same.
import os
import urllib.parse
from logging.config import fileConfig

from sqlalchemy import engine_from_config
from sqlalchemy import pool

from alembic import context

# this is the Alembic Config object, which provides
# access to the values within the .ini file in use.
config = context.config

DATABASE_URL = os.environ.get("DATABASE_URL")
decoded_uri = urllib.parse.unquote(DATABASE_URL)
config.set_main_option("sqlalchemy.url", decoded_uri)

# Interpret the config file for Python logging.
# This line sets up loggers basically.
if config.config_file_name is not None:
    fileConfig(config.config_file_name)

# add your model's MetaData object here
# for 'autogenerate' support
from app.models import Base
target_metadata = Base.metadata
# target_metadata = None
We already have a model; now we need to create a migration with the command alembic revision --autogenerate:
root@7bf903cd2721:/usr/src/app# alembic revision --autogenerate
INFO  [alembic.runtime.migration] Context impl IRISImpl.
INFO  [alembic.runtime.migration] Will assume non-transactional DDL.
INFO  [alembic.autogenerate.compare] Detected added table 'todo'
INFO  [alembic.autogenerate.compare] Detected added index 'ix_todo_id' on '['id']'
INFO  [alembic.autogenerate.compare] Detected added index 'ix_todo_title' on '['title']'
  Generating /usr/src/app/app/migrations/versions/1e4d3b4d51ca_.py ...  done
root@7bf903cd2721:/usr/src/app#
Now it's time to apply this to the database with the command alembic upgrade head, where head is a keyword meaning upgrade to the latest version:
root@7bf903cd2721:/usr/src/app# alembic upgrade head
INFO  [alembic.runtime.migration] Context impl IRISImpl.
INFO  [alembic.runtime.migration] Will assume non-transactional DDL.
INFO  [alembic.runtime.migration] Running upgrade  -> 1e4d3b4d51ca, empty message
You can check the current state at any time; alembic check tells you whether any migrations are missing:
root@7bf903cd2721:/usr/src/app# alembic check
INFO  [alembic.runtime.migration] Context impl IRISImpl.
INFO  [alembic.runtime.migration] Will assume non-transactional DDL.
No new upgrade operations detected.
Make data accessible
Now we can go back to the REST API, and we need it up and running. Exit the current container and run the app service normally. Uvicorn has a --reload flag, so it will watch the Python files and restart when we change them:
$ docker-compose up app
[+] Running 2/0
 ✔ Container fastapi-iris-demo-iris-1   Running    0.0s
 ✔ Container fastapi-iris-demo-app-1    Created    0.0s
Attaching to fastapi-iris-demo-app-1, fastapi-iris-demo-iris-1
fastapi-iris-demo-app-1   | INFO  [alembic.runtime.migration] Context impl IRISImpl.
fastapi-iris-demo-app-1   | INFO  [alembic.runtime.migration] Will assume non-transactional DDL.
fastapi-iris-demo-app-1   | INFO:     Will watch for changes in these directories: ['/usr/src/app']
fastapi-iris-demo-app-1   | INFO:     Uvicorn running on http://0.0.0.0:8000 (Press CTRL+C to quit)
fastapi-iris-demo-app-1   | INFO:     Started reloader process [8] using StatReload
fastapi-iris-demo-app-1   | INFO:     Started server process [10]
fastapi-iris-demo-app-1   | INFO:     Waiting for application startup.
fastapi-iris-demo-app-1   | INFO:     Application startup complete.
FastAPI uses the pydantic project to declare data schemas, and we need one too. Let's create app/schemas.py, with the same columns as in models.py but in a simple Python form:
from pydantic import BaseModel


class TodoCreate(BaseModel):
    title: str
    description: str


class Todo(TodoCreate):
    id: int

    class Config:
        from_attributes = True
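The from_attributes = True setting (the pydantic v2 replacement for the older orm_mode) is what lets the schema be built from a SQLAlchemy object's attributes rather than a dict, which is exactly what FastAPI does when serializing a response_model. A minimal sketch of the same thing done by hand (assuming pydantic v2, as pinned via fastapi 0.101):
from app.models import Todo as TodoModel
from app.schemas import Todo as TodoSchema

# construct an ORM instance (not persisted here) and validate it into the schema
orm_obj = TodoModel(id=1, title="demo", description="it's really working")
print(TodoSchema.model_validate(orm_obj))
# prints the validated Todo schema built from the ORM object's attributes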
Declare the CRUD operations in app/crud.py, where we work with the database using the SQLAlchemy ORM:
from sqlalchemy.orm import Session

from . import models, schemas


def get_todos(db: Session, skip: int = 0, limit: int = 100):
    return db.query(models.Todo).offset(skip).limit(limit).all()


def create_todo(db: Session, todo: schemas.TodoCreate):
    db_todo = models.Todo(**todo.dict())
    db.add(db_todo)
    db.commit()
    db.refresh(db_todo)
    return db_todo
And finally, we can update our app/main.py and add routes to read and create todos:
from fastapi import FastAPI, Depends

from .db import init_db, get_session
from . import crud, schemas

app = FastAPI(
    title='TODO Application',
    version='1.0.0',
)


@app.on_event("startup")
def on_startup():
    init_db()


@app.get("/ping")
async def pong():
    return {"ping": "pong!"}


@app.get("/todo", response_model=list[schemas.Todo])
async def read_todos(skip: int = 0, limit: int = 100, session=Depends(get_session)):
    todos = crud.get_todos(session, skip=skip, limit=limit)
    return todos


@app.post("/todo", response_model=schemas.Todo)
async def create_todo(todo: schemas.TodoCreate, session=Depends(get_session)):
    return crud.create_todo(db=session, todo=todo)
The docs page is updated accordingly, and now we can play with it.
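For example, from the command line (using the same record that shows up in the IRIS check below):
# create a todo
curl -X POST http://localhost:8000/todo \
     -H "Content-Type: application/json" \
     -d "{\"title\": \"demo\", \"description\": \"it's really working\"}"

# list todos
curl http://localhost:8000/todo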
And let's check it in IRIS
$ docker-compose exec iris irissqlcli iris+emb:///
Server:  IRIS for UNIX (Ubuntu Server LTS for ARM64 Containers) 2023.2 (Build 227U) Mon Jul 31 2023 17:43:25 EDT
Version: 0.5.4
[SQL]irisowner@/usr/irissys/:USER> .tables
+-------------------------+
| TABLE_NAME              |
+-------------------------+
| SQLUser.alembic_version |
| SQLUser.todo            |
+-------------------------+
Time: 0.043s
[SQL]irisowner@/usr/irissys/:USER> select * from todo
+----+-------+---------------------+
| id | title | description         |
+----+-------+---------------------+
| 1  | demo  | it's really working |
+----+-------+---------------------+
1 row in set
Time: 0.004s
[SQL]irisowner@/usr/irissys/:USER> select * from alembic_version
+--------------+
| version_num  |
+--------------+
| 1e4d3b4d51ca |
+--------------+
1 row in set
Time: 0.045s
[SQL]irisowner@/usr/irissys/:USER>
I hope you enjoyed how easy it is to create a REST application with Python and FastAPI. The source code for this project is available on GitHub: https://github.com/caretdev/fastapi-iris-demo
Thanks @Dmitry Maslennikov !
Tried to use Field(ge=0) for one of the fields - it doesn't work.
from dataclasses import Field
is present. What could be the issue?
ge=0, what do you expect from it?
I suppose you expect it to fail to insert values that are less than 0. This would normally be implemented using CHECK constraints, which IRIS SQL does not have. So, the only option right now is to implement it on the Python side.
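For Python-side validation, a minimal sketch with pydantic could look like this (note that Field must come from pydantic, not from dataclasses; the priority field is hypothetical, added only to illustrate the constraint):
from pydantic import BaseModel, Field


class TodoCreate(BaseModel):
    title: str
    description: str
    # hypothetical numeric field; pydantic rejects values below 0
    # before they ever reach the database
    priority: int = Field(ge=0)


TodoCreate(title="demo", description="ok", priority=1)   # passes validation
TodoCreate(title="demo", description="ok", priority=-1)  # raises ValidationError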
I didn't get the meaning of Field(ge=0) immediately
I see 6 variants to achieve this in IRIS, in order of my personal preference.
There are possibly more options.
Another question: where can I see uvicorn logs while working with the container?
In my example, and as the recommended way, the Python application is supposed to run as a separate container, and the logs of that container should be enough.
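For example, with standard Docker Compose commands (assuming the service name app from the compose file above):
# follow the stdout/stderr of the application container, where uvicorn writes its logs
docker-compose logs -f app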