OpenAI Quickstart
In this tutorial, you'll learn how to build an interactive AI application and deploy it to the cloud in just 9 lines of code.
Preparation
First, create a folder for your app and activate a virtual environment.
- macOS/Linux
- Windows (PowerShell)
- Windows (cmd)
python3 -m venv ai-app/.venv
cd ai-app
source .venv/bin/activate
touch main.py
python3 -m venv ai-app/.venv
cd ai-app
.venv\Scripts\activate.ps1
New-Item main.py
python3 -m venv ai-app/.venv
cd ai-app
.venv\Scripts\activate.bat
TYPE nul > main.py
Then, install dependencies and initialize a DBOS config file.
pip install dbos llama-index
dbos init --config
Next, to run this app, you need an OpenAI developer account. Obtain an API key here. Set the API key as an environment variable.
- macOS/Linux
- Windows (PowerShell)
- Windows (cmd)
export OPENAI_API_KEY=XXXXX
$env:OPENAI_API_KEY = "XXXXX"
set OPENAI_API_KEY=XXXXX
Declare the environment variable in dbos-config.yaml:
env:
OPENAI_API_KEY: ${OPENAI_API_KEY}
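LlamaIndex's OpenAI integration reads the key from the OPENAI_API_KEY environment variable. If you want to fail fast when the key is missing, you can add a small optional check (not part of the tutorial code) near the top of main.py:
import os

# Optional sanity check: the OpenAI integration reads OPENAI_API_KEY from the environment.
if not os.environ.get("OPENAI_API_KEY"):
    raise RuntimeError("OPENAI_API_KEY is not set")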
Finally, let's download some data. This app uses the text from Paul Graham's "What I Worked On". You can download the text from this link and save it as data/paul_graham_essay.txt in your app folder.
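If you'd rather script this step, here is a minimal sketch in Python; the URL is a placeholder (use the actual download link above), and the file name must match the path shown:
import os
import urllib.request

ESSAY_URL = "https://example.com/paul_graham_essay.txt"  # placeholder: substitute the real download link
os.makedirs("data", exist_ok=True)  # create the data/ folder if it doesn't exist yet
urllib.request.urlretrieve(ESSAY_URL, "data/paul_graham_essay.txt")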
Now, your app folder structure should look like this:
ai-app/
├── dbos-config.yaml
├── main.py
└── data/
└── paul_graham_essay.txt
Load Data and Build a Q&A Engine
Now, let's use LlamaIndex to write a simple AI application in just 5 lines of code.
Add the following code to your main.py:
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine()
response = query_engine.query("What did the author do growing up?")
print(response)
This script loads the documents under the data/ folder, builds an index over them, and generates an answer by querying the index. Run the script and it should print a response, for example:
$ python3 main.py
The author worked on writing short stories and programming...
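As written, the script re-reads and re-embeds the documents (calling the OpenAI API) on every run. If you want to avoid that, LlamaIndex can persist the index to disk and reload it later; a minimal sketch, assuming a local storage/ directory of your choosing:
import os
from llama_index.core import (
    SimpleDirectoryReader,
    StorageContext,
    VectorStoreIndex,
    load_index_from_storage,
)

PERSIST_DIR = "storage"  # arbitrary local directory for the saved index
if os.path.exists(PERSIST_DIR):
    # Reload the previously built index instead of re-embedding the documents.
    storage_context = StorageContext.from_defaults(persist_dir=PERSIST_DIR)
    index = load_index_from_storage(storage_context)
else:
    # First run: build the index and save it for next time.
    documents = SimpleDirectoryReader("data").load_data()
    index = VectorStoreIndex.from_documents(documents)
    index.storage_context.persist(persist_dir=PERSIST_DIR)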
HTTP Serving
Now, let's add a FastAPI endpoint to serve responses through HTTP. Modify your main.py as follows:
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader
from fastapi import FastAPI
app = FastAPI()
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine()
@app.get("/")
def get_answer():
    response = query_engine.query("What did the author do growing up?")
    return str(response)
Now you can start your app with fastapi run main.py. To see that it's working, visit this URL: http://localhost:8000
"The author worked on writing short stories and programming..."
The result may be slightly different every time you refresh your browser window!
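The endpoint above always asks the same question. If you want the client to supply the question, one option is a FastAPI query parameter; this is an illustrative sketch, not part of the tutorial code, and the /ask path and q parameter are made-up names:
@app.get("/ask")
def ask(q: str):
    # FastAPI maps the ?q=... query string onto the q parameter.
    response = query_engine.query(q)
    return str(response)
You could then visit, for example, http://localhost:8000/ask?q=What+did+the+author+do+growing+up.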
Hosting on DBOS Cloud
To deploy your app to DBOS Cloud, you only need to add two lines to main.py:
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader
from fastapi import FastAPI
from dbos import DBOS
app = FastAPI()
DBOS(fastapi=app)
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine()
@app.get("/")
def get_answer():
    response = query_engine.query("What did the author do growing up?")
    return str(response)
Now, install the DBOS Cloud CLI if you haven't already (requires Node.js):
npm i -g @dbos-inc/dbos-cloud
Instructions to install Node.js
- macOS or Linux
- Windows
Run the following commands in your terminal:
curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/v0.40.1/install.sh | bash
export NVM_DIR="$HOME/.nvm"
[ -s "$NVM_DIR/nvm.sh" ] && \. "$NVM_DIR/nvm.sh" # This loads nvm
nvm install 22
nvm use 22
Download Node.js 20 or later from the official Node.js download page and install it.
After installing Node.js, create the following folder: C:\Users\%user%\AppData\Roaming\npm, where %user% is the Windows user you are logged in as.
Then freeze dependencies to requirements.txt and deploy to DBOS Cloud:
pip freeze > requirements.txt
dbos-cloud app deploy
In less than a minute, it should print Access your application at <URL>.
To see that your app is working, visit <URL> in your browser.
"The author worked on writing short stories and programming..."
Congratulations, you've successfully deployed your first AI app to DBOS Cloud! You can see your deployed app in the cloud console.
Next Steps
This is just the beginning of your DBOS journey. Next, check out how DBOS can make your AI applications more scalable and resilient:
- Use durable execution to write crashproof workflows (a minimal sketch follows after this list).
- Use queues to gracefully manage AI/LLM API rate limits.
- Want to build a more complex app? Check out the AI-Powered Slackbot.
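For a flavor of durable execution, here is a minimal, illustrative sketch built on the app above; the function names are made up, and the decorator names follow recent DBOS releases (see the durable execution docs linked above for the authoritative patterns). DBOS checkpoints each step, so a workflow interrupted by a crash resumes from where it left off rather than starting over:
@DBOS.step()
def fetch_answer(question: str) -> str:
    # A step: its result is checkpointed by DBOS once it completes.
    return str(query_engine.query(question))

@DBOS.workflow()
def answer_workflow(question: str) -> str:
    # A workflow: if the process crashes midway, DBOS resumes it after
    # restart without re-running the steps that already completed.
    return fetch_answer(question)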