Chapter 4: Building a Simple Chatbot with Memory

Practical Exercises — Chapter 4

Exercise 1: Create a Basic Chatbot in Streamlit

Task:

Build a chatbot using Streamlit that accepts user input, sends it to GPT-4o, and displays the response in the UI.

Solution:

import os

import streamlit as st
from dotenv import load_dotenv
from openai import OpenAI

# Load OPENAI_API_KEY from a local .env file into the environment.
load_dotenv()
client = OpenAI(api_key=os.getenv("OPENAI_API_KEY"))

st.title("💬 Simple GPT-4o Chatbot")

user_input = st.text_input("You:", "")

if user_input:
    messages = [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": user_input}
    ]
    # A single-turn call: no memory yet, each request starts a fresh conversation.
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=messages
    )
    st.write("🤖", response.choices[0].message.content)

Exercise 2: Add Session-Based Memory in Streamlit

Task:

Modify your chatbot so that it remembers the entire conversation during a single browser session.

Solution:

if "messages" not in st.session_state:
    st.session_state.messages = [{"role": "system", "content": "You are a helpful assistant."}]

for msg in st.session_state.messages[1:]:
    st.write(f"**{msg['role'].capitalize()}:** {msg['content']}")

user_input = st.text_input("You:")

if user_input:
    st.session_state.messages.append({"role": "user", "content": user_input})
    response = openai.ChatCompletion.create(
        model="gpt-4o",
        messages=st.session_state.messages
    )
    reply = response["choices"][0]["message"]["content"]
    st.session_state.messages.append({"role": "assistant", "content": reply})
    st.write("🤖", reply)

Exercise 3: Add Chat Memory in Flask Using Session

Task:

Create a Flask chatbot that maintains conversation history using Flask’s session object.

Solution:

from flask import Flask, request, render_template, session
from flask_session import Session
from dotenv import load_dotenv
from openai import OpenAI
import os

load_dotenv()
client = OpenAI(api_key=os.getenv("OPENAI_API_KEY"))

app = Flask(__name__)
# Note: a random secret key is regenerated on every restart, which resets
# existing sessions; use a fixed secret in a real deployment.
app.secret_key = os.urandom(24)
# Store session data on the server's filesystem instead of in the cookie.
app.config["SESSION_TYPE"] = "filesystem"
Session(app)

@app.route("/", methods=["GET", "POST"])
def chat():
    if "history" not in session:
        session["history"] = [{"role": "system", "content": "You are a helpful assistant."}]

    if request.method == "POST":
        user_input = request.form["user_input"]
        session["history"].append({"role": "user", "content": user_input})

        response = client.chat.completions.create(
            model="gpt-4o",
            messages=session["history"]
        )
        reply = response.choices[0].message.content
        session["history"].append({"role": "assistant", "content": reply})
        # Flask only auto-detects assignments, not in-place mutations, so
        # mark the session as modified to make sure the appends are saved.
        session.modified = True

    return render_template("chat.html", history=session["history"][1:])


if __name__ == "__main__":
    app.run(debug=True)  # Development server only.
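The route assumes a templates/chat.html file, which the exercise does not show. For quick experimentation you could inline a minimal page with Flask's render_template_string instead; this is only a sketch, and the form field name user_input must match what the view reads:

from flask import render_template_string

# A minimal inline page: lists the history and posts the next message back.
CHAT_PAGE = """
<!doctype html>
<title>Chat</title>
{% for msg in history %}
  <p><strong>{{ msg['role']|capitalize }}:</strong> {{ msg['content'] }}</p>
{% endfor %}
<form method="post">
  <input name="user_input" autofocus>
  <button type="submit">Send</button>
</form>
"""

# Inside chat(), the final line would then become:
#     return render_template_string(CHAT_PAGE, history=session["history"][1:])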

Exercise 4: Limit Memory to the Most Recent Messages

Task:

Prevent your session memory from growing too large by trimming older messages once the history exceeds 20 entries.

Solution (Streamlit):

MAX_HISTORY = 20  # recent messages to keep, in addition to the system prompt

if len(st.session_state.messages) > MAX_HISTORY:
    # Keep the system prompt plus only the most recent MAX_HISTORY messages.
    st.session_state.messages = [st.session_state.messages[0]] + st.session_state.messages[-MAX_HISTORY:]

Solution (Flask):

if len(session["history"]) > MAX_HISTORY:
    session["history"] = [session["history"][0]] + session["history"][-MAX_HISTORY:]

Exercise 5: Style the Chat Interface in Streamlit

Task:

Enhance your Streamlit UI with st.chat_message() and st.chat_input() so the chatbot reads like a real conversation.

Solution:

st.set_page_config(page_title="Chatbot", page_icon="🤖")
st.title("🤖 GPT-4o Assistant")

if "messages" not in st.session_state:
    st.session_state.messages = [{"role": "system", "content": "You are a helpful assistant."}]

# Render the conversation so far as chat bubbles.
for msg in st.session_state.messages[1:]:
    with st.chat_message(msg["role"]):
        st.markdown(msg["content"])

# st.chat_input clears itself after each submission, unlike st.text_input.
user_input = st.chat_input("Say something...")

if user_input:
    st.session_state.messages.append({"role": "user", "content": user_input})
    with st.chat_message("user"):
        st.markdown(user_input)

    with st.chat_message("assistant"):
        with st.spinner("Thinking..."):
            response = client.chat.completions.create(
                model="gpt-4o",
                messages=st.session_state.messages
            )
            reply = response.choices[0].message.content
            st.markdown(reply)
            st.session_state.messages.append({"role": "assistant", "content": reply})

Summary of What You Practiced

  • Creating simple and effective chat UIs with Streamlit and Flask
  • Preserving multi-turn conversation memory
  • Structuring clean, readable chat interfaces
  • Managing message history to optimize API usage
  • Implementing server-side sessions in Flask for continuity

You’ve now built a chatbot that looks good, feels responsive, and holds context like a pro. In the next section, we’ll take memory to the next level using vector databases for persistent, semantic memory that lasts beyond the session.
