Step-by-Step Guide to Build an AI-First CRM

April 3, 2025

The goal of this blog is to build an AI-first CRM application that receives signals about customers, researches information about them, and automatically drafts outreach campaigns for products that those customers might be interested in.

Introduction - The Potential of AI-Powered CRM

The future of Customer Relationship Management (CRM) lies in AI-powered solutions that can autonomously gather, analyze, and act upon customer data. AI is transforming industries, and CRMs are no exception. Traditional CRMs rely heavily on manual data entry and human intervention; AI-powered CRMs, by contrast, can automate repetitive tasks, derive insights from vast amounts of data, and make intelligent decisions.

Imagine a CRM that can receive signals about customers from various touchpoints, such as social media, email, and website interactions. It can then proactively research and gather relevant information about each customer, building comprehensive profiles. Armed with this knowledge, the AI-powered CRM can predict the products or services that would most likely interest each individual customer. It can then automatically generate personalized outreach campaigns, tailored to their specific needs and preferences.

Workflow

We’ll take the following steps, in order, to build our AI-powered CRM application:

  1. Create a fake (sample) dataset containing customer information, like name, age, gender, location, and preferences, and insert it into a PostgreSQL table.
  2. Load a dataset of products. We’ll be using ckandemir/amazon-products from Hugging Face. It has about 24K rows of Products and their Descriptions. We’ll insert these Products (as vector embeddings) along with their Descriptions (as metadata) into the PGVector database.
  3. Then we’ll create another dataset of Customer Signals, which will include search activity signals of various customers. For example, Customer Sanya Gupta searched for Indie Films. We’ll insert this data in another SQL table.
  4. Next, we’ll write a function that automatically drafts an outreach email when given a customer name. Based on the name, it collects the relevant information about the customer from the SQL database, then runs a vector search on the PGVector database to find products matching the customer’s preferences.
  5. Finally, the information collected from the SQL tables and the vector database is passed to the LLM (Llama 3 on Ollama) to generate the outreach email.

The Code

Since we’ll be using an AI-heavy stack to develop our CRM application, we’ll need high-performance GPU compute for our processing needs. E2E Networks provides a fleet of AI-optimized GPUs. Check it out here: https://myaccount.e2enetworks.com/.

First, set up PostgreSQL

Update your system’s package list:


```bash
sudo apt update
```
   

Install PostgreSQL with:


```bash
sudo apt install postgresql postgresql-contrib
```
   

PostgreSQL should start automatically. You can check its status with:


```bash
sudo systemctl status postgresql
```

Create a Database and User

Switch to the default PostgreSQL user:


```bash
sudo -i -u postgres
```

Access the PostgreSQL prompt by typing:


```bash
psql
```

Create a new database:


```sql
CREATE DATABASE mydatabase;
```

Create a user with a password:


```sql
CREATE USER myuser WITH ENCRYPTED PASSWORD 'mypassword';
```

Grant all privileges on the database to your new user:


```sql
GRANT ALL PRIVILEGES ON DATABASE mydatabase TO myuser;
```
   

Exit the PostgreSQL prompt:


```sql
\q
```

Alternatively, you can launch a PostgreSQL cluster on E2E Cloud. Details are here.

Now install all the Python packages needed by running `pip install -r requirements.txt`:


psycopg2-binary
langchain_postgres
langchain
langchain_community
datasets
ollama

We’ll create sample customer information data.
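For illustration, a minimal `customers` list might look like the sketch below (the ids, names, ages, and preferences are made up; each record carries the fields the table and insert function that follow expect):

```python
# Hypothetical sample data -- each preference should match a key in the
# search_signals_examples dictionary used later for customer signals.
customers = [
    {"id": 1, "name": "Aarav Kumar", "age": 30, "gender": "Male",
     "location": "Delhi", "preferences": ["Movies", "Gaming"]},
    {"id": 2, "name": "Sanya Gupta", "age": 27, "gender": "Female",
     "location": "Mumbai", "preferences": ["Movies", "Reading"]},
    {"id": 3, "name": "Rohan Mehta", "age": 35, "gender": "Male",
     "location": "Bengaluru", "preferences": ["Cricket", "Tech"]},
]
```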

Then we’ll insert it into a SQL table called customers:


import psycopg2


# Database connection parameters
db_name = "mydatabase"
db_user = "myuser"
db_pass = "mypassword"
db_host = "localhost"
db_port = "5432"


# Connect to the database
conn = psycopg2.connect(
    dbname=db_name,
    user=db_user,
    password=db_pass,
    host=db_host,
    port=db_port
)


# Create a cursor object using the connection
cursor = conn.cursor()


# SQL command to create a table
create_table_command = """
CREATE TABLE IF NOT EXISTS customers (
    id SERIAL PRIMARY KEY,
    name VARCHAR(100),
    age INT,
    gender VARCHAR(10),
    location VARCHAR(100),
    preferences TEXT
);
"""




def insert_customers(cursor, customers):
    for customer in customers:
        cursor.execute(
            "INSERT INTO customers (id, name, age, gender, location, preferences) VALUES (%s, %s, %s, %s, %s, %s)",
            (customer["id"], customer["name"], customer["age"], customer["gender"], customer["location"], ", ".join(customer["preferences"]))
        )
try:
    # Execute the SQL command to create the table
    cursor.execute(create_table_command)
    # Insert customer data
    insert_customers(cursor, customers)
    conn.commit()  # Commit the transaction
    print("Table created and data inserted successfully")
except psycopg2.Error as e:
    # Handle the error case
    print("An error occurred:")
    print(e)
    conn.rollback()  # Rollback the transaction in case of an error
finally:
    # Close the cursor and the connection
    cursor.close()
    conn.close()

Now, load the dataset of products:


from datasets import load_dataset


dataset = load_dataset('ckandemir/amazon-products', split= 'train')

df = dataset.to_pandas()

Convert it into Document format, saving the product name in the main content body and the description in the metadata.


from langchain.docstore.document import Document
documents = [
    Document(page_content=row['Product Name'], metadata={"Description": row['Description']})
    for _, row in df.iterrows()
]

Next, launch a Docker container running PGVector-enabled Postgres, which we’ll connect to through LangChain.


docker run --name pgvector-container -e POSTGRES_USER=langchain -e POSTGRES_PASSWORD=langchain -e POSTGRES_DB=langchain -p 6024:5432 -d pgvector/pgvector:pg16
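The upsert step that follows assumes a `vectorstore` object connected to this container. Here is a minimal sketch of creating one with `langchain_postgres`, assuming the connection details from the Docker command above and a Hugging Face sentence-transformers embedding model (the embedding choice is an assumption and requires the `sentence-transformers` package):

```python
from langchain_community.embeddings import HuggingFaceEmbeddings
from langchain_postgres import PGVector

# Embedding model is an assumption -- any LangChain embedding class will work
embeddings = HuggingFaceEmbeddings(model_name="sentence-transformers/all-MiniLM-L6-v2")

# Connection string matches the container above: user/password/db "langchain" on port 6024
connection = "postgresql+psycopg://langchain:langchain@localhost:6024/langchain"

vectorstore = PGVector(
    embeddings=embeddings,
    collection_name="products",  # arbitrary collection name for the product vectors
    connection=connection,
    use_jsonb=True,
)
```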

Then we’ll upsert the vectors in batches of 1000 into the vector DB.


batch_size = 1000  # Adjust the batch size as needed
num_batches = (len(documents) + batch_size - 1) // batch_size


for i in range(num_batches):
    start_idx = i * batch_size
    end_idx = min((i + 1) * batch_size, len(documents))
    batch_documents = documents[start_idx:end_idx]
    batch_ids = list(range(start_idx + 1, end_idx + 1))
    vectorstore.add_documents(batch_documents, ids=batch_ids)
    print(f"Batch no. {i} done")

After that, we’ll create a dataset for customer signals based on their preferences:


import random


# Define search signals based on preferences
search_signals_examples = {
    "Movies": ["latest blockbusters", "indie films", "movie tickets"],
    "Cricket": ["cricket gear", "match tickets", "cricket club memberships"],
    "Books": ["bestselling books", "library memberships", "e-books"],
    "Traveling": ["flight tickets", "hotel bookings", "travel gear"],
    "Music": ["concert tickets", "music streaming subscriptions", "musical instruments"],
    "Biking": ["bike accessories", "new bicycles", "cycling gear"],
    "Cooking": ["cooking classes", "kitchen gadgets", "recipe books"],
    "Art": ["art supplies", "museum tickets", "online art courses"],
    "Fitness": ["gym memberships", "fitness gear", "personal trainers"],
    "Tech": ["latest gadgets", "software deals", "tech gadgets"],
    "Yoga": ["yoga mats", "yoga classes", "meditation apps"],
    "Reading": ["latest novels", "book clubs", "digital readers"],
    "Gardening": ["garden tools", "plant seeds", "gardening books"],
    "Chess": ["chess sets", "chess books", "chess tournament entries"],
    "Dancing": ["dance classes", "dancewear", "performance tickets"],
    "Shopping": ["online store coupons", "shopping mall deals", "exclusive sales"],
    "Photography": ["camera gear", "photography courses", "photo editing software"],
    "Fashion": ["fashion accessories", "clothing sales", "fashion shows"],
    "Blogging": ["blog hosting services", "content creation tools", "SEO courses"],
    "Hiking": ["hiking gear", "trail maps", "guided tours"],
    "Politics": ["political books", "campaign gear", "debate tickets"],
    "Nature": ["nature documentaries", "outdoor gear", "conservation projects"],
    "Languages": ["language courses", "travel guides", "language learning apps"],
    "History": ["history books", "documentary subscriptions", "historical tours"],
    "Public Speaking": ["public speaking workshops", "presentation tools", "public speaking books"],
    "Writing": ["writing seminars", "publishing guides", "writing software"],
    "Gaming": ["video games", "gaming consoles", "online game subscriptions"]
}


# Generate search signals for each customer based on their preferences
customer_signals = []
for customer in customers:
    # Only use preferences that have example signals, so random.choice never sees an empty list
    signals = [
        random.choice(search_signals_examples[pref])
        for pref in customer['preferences']
        if pref in search_signals_examples
    ]
    combined_signals = ', '.join(signals)
    customer_signals.append((customer['id'], customer['name'], combined_signals))

Insert it into a SQL table called Customer_Signals:


import psycopg2


# Database connection parameters
db_name = "mydatabase"
db_user = "myuser"
db_pass = "mypassword"
db_host = "localhost"
db_port = "5432"


# Connect to the database
conn = psycopg2.connect(
    dbname=db_name,
    user=db_user,
    password=db_pass,
    host=db_host,
    port=db_port
)


# Create a cursor object using the connection
cursor = conn.cursor()


# SQL to create the Customer_Signals table
create_table_command = """
CREATE TABLE IF NOT EXISTS Customer_Signals (
    id SERIAL PRIMARY KEY,
    name VARCHAR(255),
    search_signals VARCHAR(255)
);
"""


# Execute the create table command
cursor.execute(create_table_command)
conn.commit()


# Insert data into the Customer_Signals table
insert_command = "INSERT INTO Customer_Signals (id, name, search_signals) VALUES (%s, %s, %s) ON CONFLICT (id) DO UPDATE SET name = EXCLUDED.name, search_signals = EXCLUDED.search_signals;"


for signal in customer_signals:
    cursor.execute(insert_command, signal)


# Commit the insertions
conn.commit()


# Close the database connection
cursor.close()
conn.close()


print("Table created and data inserted successfully.")

We then craft the final function that composes the campaign email after fetching the relevant data from the SQL tables and the vector database.
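The function below fills a `prompt` template before calling the LLM. The exact template isn’t shown, so here is a minimal sketch (the wording is illustrative) that defines the input variables the function expects:

```python
from langchain.prompts import PromptTemplate

# Hypothetical prompt -- adjust the wording and tone to your campaign
prompt = PromptTemplate.from_template(
    "You are a marketing assistant. Write a short, friendly outreach email to "
    "{customer_name}, a {customer_age}-year-old {customer_gender} customer from "
    "{customer_location}. Recommend the product '{product_to_sell}'. "
    "Product description: {description_of_product_to_sell}. "
    "Personalize the email and end with a clear call to action."
)
```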


from langchain_community.llms import Ollama


def draft_email(customer_name):


    # Connect to the database
    conn = psycopg2.connect(
        dbname=db_name,
        user=db_user,
        password=db_pass,
        host=db_host,
        port=db_port
    )


    # Create a cursor object using the connection
    cursor = conn.cursor()


    # SQL query to join customers and customer_signals on the name column and filter by name
    sql_query = """
    SELECT c.name, c.age, c.gender, c.location, s.search_signals
    FROM customers c
    JOIN customer_signals s ON c.name = s.name
    WHERE c.name = %s;
    """


    try:
        # Execute the SQL query with parameter
        cursor.execute(sql_query, (customer_name,))


        # Fetch the result
        record = cursor.fetchone()


        if record:
            customer_age = record[1]
            customer_gender = record[2]
            customer_location = record[3]
            customer_search_signal = record[4].split(',')[0]


        else:
            # Return early so the vector search below isn't run without customer data
            print("No data found for the given name.")
            return None


    except Exception as e:
        print(f"An error occurred: {e}")
        return None


    finally:
        # Close the cursor and connection to the database
        cursor.close()
        conn.close()


    #Retrieving Products and their descriptions from the vectorstore
    vector_search = vectorstore.similarity_search(customer_search_signal)
    product_to_sell = vector_search[0].page_content
    description_of_product_to_sell = vector_search[0].metadata['Description']


    #Putting all the relevant information in the prompt
    formatted_prompt = prompt.format(customer_name= customer_name ,customer_gender= customer_gender,
                                     customer_age= customer_age, customer_location= customer_location,
                                     product_to_sell= product_to_sell,
                                     description_of_product_to_sell= description_of_product_to_sell)


    llm = Ollama(model="llama3")


    return llm.invoke(formatted_prompt)

In the above example, instead of deploying a local Llama 3 endpoint through Ollama, we can deploy an endpoint on E2E’s TIR platform. Follow this guide. Once deployed, the endpoint can be accessed using this code:


import requests
import json


url = "https://infer.e2enetworks.net/project/p-2475/endpoint/is-1393/v1/models/finetuned-meta-llama-Llama-3-8b-hf:predict"


api_key = 'YOUR_API_KEY'
query = 'YOUR_QUERY'


payload = json.dumps({
    "instances": [
    {
        "text": f'{query}'
    }
    ]
})


headers = {
    'Content-Type': 'application/json',
    'Authorization': 'Bearer ' + f'{api_key}'
}


response = requests.request("POST", url, headers=headers, data=payload)


print(response.text)
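To plug this endpoint into `draft_email`, one option is a small hypothetical helper (a sketch, assuming the same payload format as above) that can be called in place of `llm.invoke(formatted_prompt)`:

```python
import json
import requests

def generate_with_tir(prompt_text, url, api_key):
    """Send the formatted prompt to the TIR endpoint and return the raw response text."""
    payload = json.dumps({"instances": [{"text": prompt_text}]})
    headers = {
        "Content-Type": "application/json",
        "Authorization": f"Bearer {api_key}",
    }
    response = requests.post(url, headers=headers, data=payload)
    return response.text
```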

Results


draft_email('Aarav Kumar')

---

**Subject: Relive Your Favorite Friends Moments with Our Exclusive Card Game!**

Dear Aarav Kumar,

Hope this email finds you well! As a 30-year-old guy from Delhi, we're sure you've got a soft spot for the iconic TV show Friends. Who doesn't love Ross's sarcastic remarks or Joey's foodie adventures?

We're excited to introduce you to our top-of-the-line product - the **Friends Top 30 Moments Card Game**! This entertaining and educational game is designed to bring your favorite Friends moments to life. With over 100 cards featuring iconic scenes, characters, and quotes from the show, this game is sure to delight fellow Friends fans like yourself.

Here's how it works:

1. Shuffle the deck and draw a card.
2. Answer trivia questions or complete challenges inspired by the featured scene.
3. Score points based on your performance.
4. Compete with friends and family to be the ultimate Friends champion!

This unique game is perfect for:

- Friends reunions
- Family gatherings
- Game nights with fellow fans

As a valued customer from Delhi, we're offering you an **exclusive 10% discount** on your first purchase of the Friends Top 30 Moments Card Game. Use code **FRIENDS10** at checkout to redeem your offer.

Don't miss this chance to relive the laughter and excitement of Friends! Order now and get ready for a fun-filled experience with friends and family.

Best regards,

[Your Name]  
E-commerce Expert

P.S. Don't forget to share your gaming experiences with us on social media using **#FriendsForever**!

--- 

