The goal of this blog is to build an AI-first CRM application that receives signals about customers, researches information about them, and automatically drafts outreach campaigns for products that those customers might be interested in.
Introduction - The Potential of AI-Powered CRM
The future of Customer Relationship Management (CRM) lies in AI-powered solutions that can autonomously gather, analyze, and act upon customer data. AI is transforming industries, and CRMs are no exception. Traditional CRMs rely heavily on manual data entry and human intervention; the future belongs to AI-powered CRMs that can automate repetitive tasks, derive insights from vast amounts of data, and make intelligent decisions.
Imagine a CRM that can receive signals about customers from various touchpoints, such as social media, email, and website interactions. It can then proactively research and gather relevant information about each customer, building comprehensive profiles. Armed with this knowledge, the AI-powered CRM can predict the products or services that would most likely interest each individual customer. It can then automatically generate personalized outreach campaigns, tailored to their specific needs and preferences.
Workflow
We’ll take the following steps (in sequential order) to build our AI-powered CRM application:
- Create a fake (sample) dataset containing customer information, like name, age, gender, location, and preferences, and insert it into a PostgreSQL table.
- Load a dataset of products. We’ll be using ckandemir/amazon-products from Hugging Face, which has about 24K rows of products and their descriptions. We’ll insert the products (as vector embeddings) along with their descriptions (as metadata) into the PGVector database.
- Then we’ll create another dataset of customer signals, capturing the search activity of various customers. For example, customer Sanya Gupta searched for Indie Films. We’ll insert this data into another SQL table.
- Finally, we’ll write a function that drafts automatic outreach emails when given a customer name. The function collects the relevant information about that customer from the SQL database, then runs a vector search on the PGVector database to find products similar to the customer’s preferences.
- The information collected from the SQL and vector databases is then passed to the LLM (Llama 3 on Ollama) to generate the outreach email.
The Code
Since we’ll be using an AI-heavy stack to develop our CRM application, we’ll need high-performance GPU compute for our processing needs. E2E Networks provides a fleet of AI-optimized GPUs. Check it out here: https://myaccount.e2enetworks.com/.
First, set up PostgreSQL
Update your system’s package list:
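On Debian/Ubuntu-based systems:

```
sudo apt update
```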
Install PostgreSQL with:
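```
sudo apt install postgresql postgresql-contrib
```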
PostgreSQL should start automatically. You can check its status with:
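```
sudo systemctl status postgresql
```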
Create a Database and User
Switch to the default PostgreSQL user:
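```
sudo -i -u postgres
```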
Access the PostgreSQL prompt by typing:
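```
psql
```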
Create a new database:
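The database name, user, and password in the next few commands are placeholders; substitute your own values.

```
CREATE DATABASE crm_db;
```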
Create a user with a password:
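```
CREATE USER crm_user WITH PASSWORD 'your_password';
```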
Grant all privileges on the database to your new user:
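```
GRANT ALL PRIVILEGES ON DATABASE crm_db TO crm_user;
```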
Exit the PostgreSQL prompt:
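```
\q
```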
Alternatively, you can launch a PostgreSQL cluster on E2E Cloud. Details are here.
Now install all the Python packages needed by running ```pip install -r requirements.txt```:
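A plausible requirements.txt for this stack (the exact package list is an assumption based on the libraries used below):

```
langchain
langchain-community
psycopg2-binary
pgvector
datasets
sentence-transformers
```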
We’ll create sample customer information data.
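A minimal sketch of such a dataset; the names and values are illustrative, with Sanya Gupta reappearing in the signals example later:

```
# Illustrative customer records: (name, age, gender, location, preferences)
customers = [
    ("Sanya Gupta", 29, "Female", "Mumbai", "Indie films, art house cinema"),
    ("Rahul Verma", 35, "Male", "Bengaluru", "Fitness gear, sci-fi novels"),
    ("Meera Iyer", 42, "Female", "Chennai", "Gardening, classical music"),
]
```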
Then we’ll insert it into a SQL Table called customers:
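A sketch using psycopg2, connecting with the placeholder credentials from the setup step:

```
import psycopg2

# Placeholder credentials from the database setup above
conn = psycopg2.connect(
    dbname="crm_db", user="crm_user", password="your_password", host="localhost"
)
cur = conn.cursor()

# Create the customers table and insert the sample records
cur.execute("""
    CREATE TABLE IF NOT EXISTS customers (
        name TEXT, age INTEGER, gender TEXT, location TEXT, preferences TEXT
    )
""")
cur.executemany(
    "INSERT INTO customers VALUES (%s, %s, %s, %s, %s)",
    customers,
)
conn.commit()
```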
Now, load the dataset of products:
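Using the Hugging Face datasets library (the "train" split is an assumption; check the dataset card):

```
from datasets import load_dataset

# Roughly 24K rows of products and descriptions
dataset = load_dataset("ckandemir/amazon-products", split="train")
```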
Convert it into Document format, saving the product name in the main content body and the description in the metadata.
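A sketch of the conversion; the column names "Product Name" and "Description" are assumptions, so adjust them to the dataset’s actual schema:

```
from langchain_core.documents import Document

docs = [
    Document(
        page_content=row["Product Name"],              # product name as the content body
        metadata={"description": row["Description"]},  # description as metadata
    )
    for row in dataset
]
```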
Next, launch a Docker container running pgvector-enabled Postgres, and connect to it using LangChain.
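A sketch following LangChain’s standard pgvector setup; the image tag, port, and credentials are placeholders:

```
docker run --name pgvector-container \
  -e POSTGRES_USER=langchain -e POSTGRES_PASSWORD=langchain -e POSTGRES_DB=langchain \
  -p 6024:5432 -d pgvector/pgvector:pg16
```

Then connect to it from Python. The embedding model here is an assumption; any sentence-transformers model will do:

```
from langchain_community.vectorstores import PGVector
from langchain_community.embeddings import HuggingFaceEmbeddings

embeddings = HuggingFaceEmbeddings(model_name="sentence-transformers/all-MiniLM-L6-v2")

# Credentials and port match the docker run command above
CONNECTION_STRING = "postgresql+psycopg2://langchain:langchain@localhost:6024/langchain"

store = PGVector(
    connection_string=CONNECTION_STRING,
    embedding_function=embeddings,
    collection_name="products",
)
```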
Then we’ll upsert the vectors in batches of 1000 into the vector DB.
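A sketch of the batched upsert:

```
# Upsert in batches of 1000 to keep memory usage and request sizes manageable
BATCH_SIZE = 1000
for i in range(0, len(docs), BATCH_SIZE):
    store.add_documents(docs[i : i + BATCH_SIZE])
```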
After that, we’ll create a dataset for customer signals based on their preferences:
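An illustrative set of search-activity signals; Sanya Gupta’s signal comes from the workflow example above, and the rest are made up:

```
# (customer name, search activity) pairs
customer_signals = [
    ("Sanya Gupta", "Searched for Indie Films"),
    ("Rahul Verma", "Searched for running shoes"),
    ("Meera Iyer", "Searched for flower seeds"),
]
```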
Insert it into a SQL table called Customer_Signals:
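Reusing the connection from earlier:

```
# Create the signals table and insert the records
cur.execute("""
    CREATE TABLE IF NOT EXISTS Customer_Signals (
        name TEXT, signal TEXT
    )
""")
cur.executemany(
    "INSERT INTO Customer_Signals VALUES (%s, %s)",
    customer_signals,
)
conn.commit()
```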
We then craft the final function that composes the campaign email after fetching relevant data from the SQL databases and vector database.
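A sketch of the full pipeline, with Llama 3 served locally through Ollama; the prompt wording and the k=3 cutoff are illustrative choices:

```
from langchain_community.llms import Ollama

llm = Ollama(model="llama3")

def draft_outreach_email(customer_name):
    # Fetch the customer's profile from the customers table
    cur.execute(
        "SELECT age, gender, location, preferences FROM customers WHERE name = %s",
        (customer_name,),
    )
    age, gender, location, preferences = cur.fetchone()

    # Fetch the customer's recent search signals
    cur.execute("SELECT signal FROM Customer_Signals WHERE name = %s", (customer_name,))
    signals = [row[0] for row in cur.fetchall()]

    # Vector search for products similar to the preferences and signals
    query = f"{preferences}. {' '.join(signals)}"
    matches = store.similarity_search(query, k=3)
    products = "\n".join(
        f"- {doc.page_content}: {doc.metadata.get('description', '')}"
        for doc in matches
    )

    # Hand everything to the LLM to compose the email
    prompt = (
        f"Draft a short, friendly outreach email for {customer_name} "
        f"({age}, {gender}, {location}).\n"
        f"Preferences: {preferences}\n"
        f"Recent activity: {'; '.join(signals)}\n"
        f"Recommend these products:\n{products}"
    )
    return llm.invoke(prompt)

print(draft_outreach_email("Sanya Gupta"))
```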
In the above example, instead of deploying a local Llama 3 endpoint through Ollama, we can deploy an endpoint on E2E’s TIR platform. Follow this guide. Once deployed, the endpoint can be accessed using this code:
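A minimal sketch of calling a hosted endpoint over HTTP; the URL, auth scheme, and payload shape below are placeholders, so substitute the values from your TIR deployment:

```
import requests

# Placeholders: copy the real endpoint URL and API token from your TIR dashboard
TIR_ENDPOINT_URL = "https://<your-tir-endpoint>/v1/chat/completions"
API_TOKEN = "<your-api-token>"

response = requests.post(
    TIR_ENDPOINT_URL,
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    json={
        "model": "llama3",
        "messages": [{"role": "user", "content": prompt}],
    },
)
print(response.json())
```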