Using Docker to Build the App
Welcome to this module on using Docker to set up and run the Dropbox AI Chat application. Before we begin, make sure you meet the prerequisites and understand each step thoroughly.
Basic Prerequisites:
- Ensure you have Docker installed on your machine.
- Dropbox must also be installed.
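Before going further, it can help to confirm that both prerequisites are actually in place. The following is a minimal sketch of such a check, assuming the Docker CLI is on your PATH and that Dropbox syncs to its default folder (your paths may differ):

```bash
# Confirm the Docker CLI and Compose are installed
docker --version
docker-compose --version

# Confirm the Docker daemon is running (prints server details rather than an error)
docker info

# If Dropbox uses its default location, the synced folder is usually ~/Dropbox
ls ~/Dropbox
```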
Open your terminal and run:

```bash
git clone https://github.com/pathway-labs/dropbox-ai-chat
cd dropbox-ai-chat
```
If you have previously cloned an older version of the repository, make sure you're in the correct repository directory and update it using:

```bash
git pull https://github.com/pathway-labs/dropbox-ai-chat
```

What this does: `git pull` updates your local repository with the latest changes from the remote repository.
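If you are unsure whether your local clone is up to date, a couple of standard Git commands (optional, and not specific to this project) can tell you:

```bash
# Confirm you are inside the right repository
git remote -v     # should list github.com/pathway-labs/dropbox-ai-chat

# Compare your local branch against the remote
git fetch
git status        # reports whether you are behind and need to pull
```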
Overview:
- The `.env` file sets crucial environment variables for your application.
- If you're using macOS, the `.env` file might be hidden by default when viewed through Finder, but it is visible via Terminal (see the terminal check right after this list). Regardless of the OS, this file plays a pivotal role.
- The primary change you'll make in this entire implementation is to the Dropbox path variable (`DROPBOX_LOCAL_FOLDER_PATH`) that the `.env` file uses.
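Since Finder hides dotfiles by default, the simplest way to confirm the `.env` file exists (once you create it in the steps below) is from the terminal:

```bash
# Dotfiles such as .env only appear when the -a flag is used
ls -a
```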
Understanding the `DROPBOX_LOCAL_FOLDER_PATH` variable used here:
- This variable defines the relative path from your project to your Dropbox folder (a concrete example follows this list).
- If you want to quickly understand how relative paths work in the context of Linux, you can check this quick video by Udacity or read these comprehensive explanations by RedHat or Coding Rooms.
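To make the relative-path idea concrete, here is a purely hypothetical layout: the project cloned to `~/projects/dropbox-ai-chat` and Dropbox synced to `~/Dropbox`. Neither path comes from the official docs; adjust them to your machine.

```bash
# Hypothetical layout:
#   ~/projects/dropbox-ai-chat   <- project root (where .env lives)
#   ~/Dropbox                    <- locally synced Dropbox folder

# From the project root, the Dropbox folder is two levels up:
ls ../../Dropbox

# The corresponding .env entry would then be:
# DROPBOX_LOCAL_FOLDER_PATH=../../Dropbox
```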
Setting the environment variables in macOS or Linux
1. Create an `.env` file in the project's root directory using `touch`:

```bash
touch .env
```

2. Edit the `.env` file using a text editor such as `nano` or `vim`:

```bash
nano .env
```

3. Populate the `.env` file with the following content, replacing the placeholders with actual values. Note: while using the environment variables below, replace:
   1. `{OPENAI_API_KEY}` with your OpenAI API key
   2. `{REPLACE_WITH_DROPBOX_RELATIVE_PATH}` with the relative path where the Dropbox folder is located

```
OPENAI_API_TOKEN={OPENAI_API_KEY}
EMBEDDER_LOCATOR=text-embedding-ada-002
EMBEDDING_DIMENSION=1536
MODEL_LOCATOR=gpt-3.5-turbo
MAX_TOKENS=200
TEMPERATURE=0.0
DROPBOX_LOCAL_FOLDER_PATH={REPLACE_WITH_DROPBOX_RELATIVE_PATH}
```

4. Alternative using `export`: if the `.env` file doesn't work for you, you can set these variables directly in your shell with the commands below. Note that variables set with `export` (Linux/macOS) or `set` (Windows, as seen further down) last only for the current session. If you want them to persist, add them to a shell configuration file such as `.bashrc` or `.bash_profile` on Linux/macOS, or use System Properties on Windows.

```bash
export OPENAI_API_TOKEN={OPENAI_API_KEY}
export EMBEDDER_LOCATOR=text-embedding-ada-002
export EMBEDDING_DIMENSION=1536
export MODEL_LOCATOR=gpt-3.5-turbo
export MAX_TOKENS=200
export TEMPERATURE=0.0
export DROPBOX_LOCAL_FOLDER_PATH={REPLACE_WITH_DROPBOX_RELATIVE_PATH}
```

A quick, optional sanity check of these values follows below.
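Here is a minimal sketch of that sanity check for macOS/Linux, assuming a POSIX shell such as bash or zsh; it only reads the values and does not change anything:

```bash
# Temporarily export everything defined in .env into the current shell session
set -a; source .env; set +a

# The Dropbox path should list the files you expect the app to index
ls "$DROPBOX_LOCAL_FOLDER_PATH"

# The API key should be non-empty (this prints only its length, never the key itself)
echo "${#OPENAI_API_TOKEN}"
```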
Setting the environment variables in Windows:
1. Create an `.env` file using a text editor of your choice.
2. Populate the `.env` file as shown above.
3. Alternative: use the `set` command in Command Prompt to set the environment variables. Note: replace `{OPENAI_API_KEY}` with your OpenAI API key and `{REPLACE_WITH_DROPBOX_RELATIVE_PATH}` with your local Dropbox path.

```
set OPENAI_API_TOKEN={OPENAI_API_KEY}
set EMBEDDER_LOCATOR=text-embedding-ada-002
set EMBEDDING_DIMENSION=1536
set MODEL_LOCATOR=gpt-3.5-turbo
set MAX_TOKENS=200
set TEMPERATURE=0.0
set DROPBOX_LOCAL_FOLDER_PATH={REPLACE_WITH_DROPBOX_RELATIVE_PATH}
```
Now we will build the Docker image and start the containers.

Navigate to the cloned directory `dropbox-ai-chat`, then run these commands:

```bash
docker-compose build
docker-compose up
```
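If you'd rather not keep a terminal occupied, a common variation (not required by this guide) is to run Compose in detached mode and follow the logs separately:

```bash
# Start the services in the background
docker-compose up -d

# Check that the containers are up
docker-compose ps

# Follow the application logs; Ctrl+C stops following, the containers keep running
docker-compose logs -f
```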
Behind the Scenes with Docker:
- Dockerfile: This file contains the instructions Docker follows to build an image. It's like a blueprint for your application: Docker reads these instructions and creates an image that contains everything your app needs to run.
- docker-compose: A tool for defining and running multi-container Docker applications. In our context, `docker-compose` uses the `docker-compose.yml` file to understand how to set up and run the app's services.
- When you run `docker-compose up`, it starts the services as defined.
What it does: starts the services and opens access to your API and the Streamlit UI.
- Access points for your application: the API and the Streamlit UI are reachable on the host ports mapped in `docker-compose.yml`.
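To see exactly which host ports are published on your machine, you can ask Compose to print its fully resolved configuration, with the `.env` values substituted in; this is a read-only check:

```bash
# Show the resolved docker-compose.yml, including each service's ports: mappings
docker-compose config
```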
To stop the services and remove the containers, execute:

```bash
docker-compose down
```
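As an optional follow-up, you can confirm everything was removed. Note that `docker-compose down` removes the containers and the default network, but named volumes are only removed if you add `-v`:

```bash
# No containers from this project should remain afterwards
docker ps -a

# Add -v only if you also want to remove named volumes declared in docker-compose.yml
docker-compose down -v
```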
By following these steps, you should be able to get both the main application and the Streamlit UI up and running using Docker Compose.
Interestingly, if you quickly revisit the LLM architecture diagram we saw earlier, there was a prompt from a customer support executive trying to understand the product release notes written by the development team. With something like this Dropbox app, that problem can be easily addressed.
Now let's look at another example where we use an API as a data source.