
MariaDB + LangChain + FastAPI + NiceGUI demo

Demo Screenshot

This example application shows how to use MariaDB, LangChain, FastAPI, and NiceGUI to implement the search functionality for an online store. The search is semantic: vector embeddings generated with Google Generative AI are stored in MariaDB and queried by similarity.

Note: If you are looking for the demo implemented during the webinar Beyond Keywords: AI Vector Search with LangChain and MariaDB Cloud, see the webinar-main.py file.
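
To give an idea of what happens under the hood, the sketch below turns a piece of text into a vector with Google Generative AI embeddings, which is the kind of vector the application stores in MariaDB. It is a minimal sketch, assuming the langchain-google-genai package and the embedding-001 model; the actual embedding and storage code lives in backend.py.

# Minimal sketch (not the demo's code): create an embedding vector with
# Google Generative AI, as the backend does for each product description.
# Assumes the langchain-google-genai package and that GOOGLE_API_KEY is set.
from langchain_google_genai import GoogleGenerativeAIEmbeddings

embeddings = GoogleGenerativeAIEmbeddings(model="models/embedding-001")  # model name is an assumption
vector = embeddings.embed_query("waterproof hiking backpack")
print(len(vector))  # dimensionality of the vector stored in MariaDB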

Prerequisites

You'll need:

  • A MariaDB server with vector search support up and running (or spin up a free serverless instance in seconds using MariaDB Cloud)
  • Python 3.11 or later
  • pip
  • A SQL client compatible with MariaDB (for example, DBeaver, the mariadb command-line tool, or an extension for your IDE)

Preparing the database

Connect to your MariaDB database and create the following table:

CREATE OR REPLACE TABLE products (
    id SERIAL PRIMARY KEY,
    name VARCHAR(100) NOT NULL,
    category VARCHAR(100),
    price DECIMAL(10,2) NOT NULL,
    description TEXT,
    created_at DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
    updated_at DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
    FULLTEXT(name),
    FULLTEXT(description)
);

Download this CSV file.

Load the data from that CSV file into the table that you previously created (remember to use the absolute path to your CSV file):

LOAD DATA LOCAL INFILE '/path/to/products.csv'
INTO TABLE products
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS
(@dummy_id, name, category, price, description);

Configuring the example application

Download or clone this repository and change to its directory:

git clone https://github.com/mariadb-developers/langchain-fastapi-mariadb-webinar-demo.git
cd langchain-fastapi-mariadb-webinar-demo

Define the OS environment variables using the values for your MariaDB database connection and your Google Generative AI API key. For example, on Linux:

export DB_HOST=127.0.0.1
export DB_PORT=3306
export DB_USER=root
export DB_PASSWORD=password
export DB_NAME=demo
export GOOGLE_API_KEY=key1234567

Alternatively, modify the default values for these variables in the backend.py file.
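
For reference, here is a minimal sketch of how a backend might pick these variables up, assuming plain os.getenv calls with fallback defaults (check backend.py for the actual names and defaults it uses):

# Sketch: read the connection settings from the environment, falling back
# to defaults when a variable is not set. The defaults here are illustrative.
import os

DB_HOST = os.getenv("DB_HOST", "127.0.0.1")
DB_PORT = int(os.getenv("DB_PORT", "3306"))
DB_USER = os.getenv("DB_USER", "root")
DB_PASSWORD = os.getenv("DB_PASSWORD", "password")
DB_NAME = os.getenv("DB_NAME", "demo")
GOOGLE_API_KEY = os.getenv("GOOGLE_API_KEY")  # no default: keep secrets out of the code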

Installing the required packages

Create and activate a new virtual environment:

python3 -m venv venv/
source venv/bin/activate

Install the required packages:

pip install -r requirements.txt

Running the backend

Run the FastAPI backend:

python backend.py

Calculate and store vector embeddings

Go to http://localhost:8000/docs, authorize (use this API key: demo-key-123), and invoke the POST /ingest-products endpoint to calculate and store the vector embeddings. Check the logs in the terminal to confirm that the 500 products have been ingested.
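
If you would rather script the call than use the Swagger UI, a sketch like the one below works from Python. The X-API-Key header name is an assumption about how the backend reads the key; check backend.py if the request is rejected.

# Sketch: trigger the ingestion endpoint from Python (pip install requests).
import requests

response = requests.post(
    "http://localhost:8000/ingest-products",
    headers={"X-API-Key": "demo-key-123"},  # header name is an assumption
    timeout=600,  # embedding 500 products can take a while
)
response.raise_for_status()
print(response.json())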

Running the frontend

Important! Run the POST /ingest-products endpoint before using the frontend!

Run the NiceGUI frontend in a separate terminal:

cd langchain-fastapi-mariadb-webinar-demo
source venv/bin/activate
python frontend.py

This should open your default browser pointing to http://127.0.0.1:8080/.
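
For orientation, the pattern behind frontend.py is a single NiceGUI page that sends the search text to the FastAPI backend and renders the returned products. The sketch below shows that pattern only; the /search endpoint path, its query parameter, the API key header, and the response fields are hypothetical stand-ins, so refer to frontend.py for the real wiring.

# Hypothetical sketch of the NiceGUI pattern: an input box, a button that
# queries the backend, and a column that displays the results.
import requests
from nicegui import ui

search_box = ui.input(label="Search products")
results = ui.column()

def run_search():
    results.clear()
    response = requests.get(
        "http://localhost:8000/search",            # hypothetical endpoint path
        params={"q": search_box.value},            # hypothetical query parameter
        headers={"X-API-Key": "demo-key-123"},     # header name is an assumption
        timeout=60,
    )
    with results:  # add the labels inside the results column
        for product in response.json():            # hypothetical response shape
            ui.label(f"{product['name']} - {product['price']}")

ui.button("Search", on_click=run_search)

if __name__ in {"__main__", "__mp_main__"}:
    ui.run(port=8080)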
