Build & Deploy a Serverless OpenAI App in 9 Lines of Code
Want to build and deploy an interactive AI app to the cloud in just 9 lines of code?
In this tutorial, you'll use LlamaIndex to create a Q&A engine, FastAPI to serve it over HTTP, and DBOS to deploy it serverlessly to the cloud.
It's based on LlamaIndex’s 5-line starter, with just 4 extra lines to make it cloud-ready. Simple, fast, and ready to scale!
Preparation
First, create a folder for your app and activate a virtual environment.
python3 -m venv ai-app/.venv
cd ai-app
source .venv/bin/activate
touch main.py
Then, install dependencies and initialize a DBOS config file.
pip install dbos llama-index
dbos init --config
Next, to run this app, you need an OpenAI developer account. Obtain an API key here. Set the API key as an environment variable.
export OPENAI_API_KEY=XXXXX
Declare the environment variable in dbos-config.yaml:
env:
  OPENAI_API_KEY: ${OPENAI_API_KEY}
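Optionally, you can add a small check at the top of main.py so the app fails fast when the key is missing. This is a minimal sketch, not part of the tutorial's 9 lines:

import os

# Optional sanity check: stop early if the OpenAI key isn't set,
# since both the local script and the deployed app depend on it.
if not os.environ.get("OPENAI_API_KEY"):
    raise RuntimeError("OPENAI_API_KEY is not set; export it before running the app.")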
Finally, let's download some data. This app uses the text from Paul Graham's "What I Worked On". You can download the text from this link and save it as data/paul_graham_essay.txt in your app folder.
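If you'd rather script the download, a sketch like the following works. ESSAY_URL is only a placeholder here; substitute the actual link above:

import os
import urllib.request

# Placeholder URL -- replace with the actual link to the essay text.
ESSAY_URL = "https://example.com/paul_graham_essay.txt"

# Create the data/ folder and save the essay where SimpleDirectoryReader will find it.
os.makedirs("data", exist_ok=True)
urllib.request.urlretrieve(ESSAY_URL, "data/paul_graham_essay.txt")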
Now, your app folder structure should look like this:
ai-app/
├── dbos-config.yaml
├── main.py
└── data/
    └── paul_graham_essay.txt
Load Data and Build a Q&A Engine
Now, let's use LlamaIndex to write a simple AI application in just 5 lines of code.
Add the following code to your main.py:
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine()
response = query_engine.query("What did the author do growing up?")
print(response)
This script loads the documents under the data/ folder, builds an index over them, and generates an answer by querying the index. Run it and it should print a response, for example:
$ python3 main.py
The author worked on writing short stories and programming...
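Note that this script rebuilds the index (and re-calls the OpenAI embeddings API) on every run. If you want to avoid that, LlamaIndex can persist the index to disk and reload it later. Here is an optional sketch using its storage APIs; the storage/ directory name is arbitrary:

import os
from llama_index.core import (
    VectorStoreIndex,
    SimpleDirectoryReader,
    StorageContext,
    load_index_from_storage,
)

PERSIST_DIR = "storage"

if not os.path.exists(PERSIST_DIR):
    # First run: build the index from the documents and save it to disk.
    documents = SimpleDirectoryReader("data").load_data()
    index = VectorStoreIndex.from_documents(documents)
    index.storage_context.persist(persist_dir=PERSIST_DIR)
else:
    # Later runs: reload the saved index instead of re-embedding the documents.
    storage_context = StorageContext.from_defaults(persist_dir=PERSIST_DIR)
    index = load_index_from_storage(storage_context)

query_engine = index.as_query_engine()
print(query_engine.query("What did the author do growing up?"))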
HTTP Serving
Now, let's add a FastAPI endpoint to serve responses through HTTP. Modify your main.py as follows:
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader
from fastapi import FastAPI

app = FastAPI()

documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine()

@app.get("/")
def get_answer():
    response = query_engine.query("What did the author do growing up?")
    return str(response)
Now you can start your app with fastapi run main.py. To see that it's working, visit this URL: http://localhost:8000
The result may be slightly different every time you refresh your browser window!
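If you want to ask your own questions instead of the hard-coded one, a small variation lets the caller pass a query string. This is an optional sketch; the /ask path and question parameter are just illustrative names:

@app.get("/ask")
def ask(question: str = "What did the author do growing up?"):
    # FastAPI maps the ?question=... query parameter to this argument.
    response = query_engine.query(question)
    return str(response)

For example, you could then visit http://localhost:8000/ask?question=What did the author work on at Viaweb?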
Hosting on DBOS Cloud
To deploy your app to DBOS Cloud, you only need to add two lines to main.py:
- from dbos import DBOS
- DBOS(fastapi=app)
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader
from fastapi import FastAPI
from dbos import DBOS

app = FastAPI()
DBOS(fastapi=app)

documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine()

@app.get("/")
def get_answer():
    response = query_engine.query("What did the author do growing up?")
    return str(response)
Now, install the DBOS Cloud CLI if you haven't already (requires Node.js):
npm i -g @dbos-inc/dbos-cloud
Then freeze dependencies to requirements.txt and deploy to DBOS Cloud:
pip freeze > requirements.txt
dbos-cloud app deploy
In less than a minute, it should print a message like Access your application at <URL>.
To see that your app is working, visit that URL in your browser.
Congratulations, you've successfully deployed your first AI app to DBOS Cloud! You can see your deployed app in the cloud console.
Next Steps
This is just the beginning of your DBOS journey. Next, check out how DBOS can make your AI applications more scalable and resilient:
- Use durable execution to write crashproof workflows (a small sketch follows this list).
- Use queues to gracefully manage AI/LLM API rate limits.
- Want to build a more complex app? Check out the AI-Powered Slackbot.
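As a taste of durable execution, here is a minimal sketch of what a DBOS workflow wrapping the query engine might look like, assuming the @DBOS.workflow() and @DBOS.step() decorators from the dbos package; the function names are illustrative, not part of this tutorial's app:

@DBOS.step()
def ask_llm(question: str) -> str:
    # The step's result is checkpointed, so a completed LLM call isn't repeated after a crash.
    return str(query_engine.query(question))

@DBOS.workflow()
def answer_workflow(question: str) -> str:
    # If the process crashes mid-workflow, DBOS resumes it from the last completed step.
    return ask_llm(question)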
Give it a try and let me know what you think!