E-commerce sales are expected to hit new highs during Black Friday and Cyber Monday this year. As a result, online retailers face two challenges: (1) standing out in a flurry of compelling promotions, and (2) keeping up with a surge of customer inquiries and engagement.
Prediction Guard, an Intel® Liftoff for AI Startups member that enables companies to deploy trustworthy Large Language Model (LLM) solutions, has deployed private, secure LLMs on Intel Developer Cloud to help e-commerce companies overcome these challenges.
For one of their clients, Antique Candle Co, this has already resulted in hundreds of thousands of dollars in increased revenue and operational efficiencies for customer support teams!
LLM-based promotion planning:
When planning promotions for Black Friday and Cyber Monday (BFCM), Prediction Guard realized that companies like Antique Candle Co. (ACC) need to differentiate their offers (e.g., “30% off + free shipping”) from an increasing number of competitors and dynamically adjust offers based on forecasted and actual demand. Prediction Guard met these needs with a secure, private LLM-based solution that reasons over previous ACC sales data, forecasted sales, and offer semantics.
This solution first analyzes historical sales data from Shopify alongside previous offers from Mailchimp to establish historical baseline sales numbers and sales “uplifts” tied to previous offers. The historical offers are extracted from Mailchimp campaigns in a secure environment on Intel Developer Cloud using a chain of calls to the latest LLMs, like Llama 2. These are matched to the historical sales data to determine a sales uplift associated with each promotion (see Figure 1).
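Once offers and sales are aligned by date, the uplift calculation itself reduces to comparing revenue inside and outside an offer's delivery window. A minimal sketch of that step (the field names and numbers below are illustrative, not Prediction Guard's actual pipeline or ACC's data):

```python
from datetime import date

# Hypothetical daily revenue and one extracted offer window.
daily_sales = {
    date(2023, 11, 1): 1000.0,
    date(2023, 11, 2): 1100.0,
    date(2023, 11, 3): 900.0,   # baseline days
    date(2023, 11, 4): 1800.0,
    date(2023, 11, 5): 2200.0,  # offer ran on these days
}
offer_window = (date(2023, 11, 4), date(2023, 11, 5))

def sales_uplift(daily_sales, offer_window):
    """Fractional uplift: average daily revenue during the offer
    versus the baseline average outside the offer window."""
    start, end = offer_window
    promo = [v for d, v in daily_sales.items() if start <= d <= end]
    base = [v for d, v in daily_sales.items() if not (start <= d <= end)]
    baseline_avg = sum(base) / len(base)
    promo_avg = sum(promo) / len(promo)
    return (promo_avg - baseline_avg) / baseline_avg

print(round(sales_uplift(daily_sales, offer_window), 2))  # → 1.0, i.e., +100%
```

A real pipeline would also control for seasonality and overlapping promotions, but the baseline-versus-promo comparison is the core of each per-offer uplift figure.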
This uplift data is aggregated into a dataset alongside the historical offers and their delivery windows. Prediction Guard created “embeddings” of these historical offers and stored them in the LanceDB vector database, such that new draft offers can be semantically matched to the historical offers. This setup allows the ACC team to draft new offers for BFCM, match them to the closest historical offers during similar delivery times, and calculate a forecasted uplift for each draft offer (see Figure 2). The forecasts are being used to evaluate various draft offers and optimize messaging.
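The matching idea can be sketched without any external dependencies. The production system uses LLM embeddings stored in LanceDB; here a stdlib bag-of-words cosine similarity stands in for the embedding model, and the historical offers and uplift numbers are made up for illustration:

```python
import math
from collections import Counter

# Illustrative historical offers paired with their measured uplifts.
history = [
    ("25% off sitewide this weekend", 0.40),
    ("Free shipping on orders over $50", 0.15),
    ("Buy one candle, get one half off", 0.30),
]

def vectorize(text):
    """Toy stand-in for an embedding model: bag-of-words counts."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb)

def forecast_uplift(draft):
    """Return the most semantically similar historical offer and use
    its measured uplift as the forecast for the draft."""
    dv = vectorize(draft)
    return max(history, key=lambda h: cosine(dv, vectorize(h[0])))

offer, uplift = forecast_uplift("30% off sitewide plus free shipping")
```

With real embeddings, the nearest-neighbor lookup is the same shape: a vector search over stored offer embeddings (LanceDB exposes this as a `table.search(...)` query), filtered to offers with similar delivery windows.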
Finally, to make this promotion planning tool more dynamic, ACC is leveraging Prediction Guard’s suite of privately hosted LLMs (running on Gaudi2 in Intel Developer Cloud) to proactively rephrase or rewrite draft offers in ways that increase forecasted uplifts. That is, ACC will put a draft offer into the promotion planning tool and request that the tool generate similar offers with the same basic parameters (e.g., a certain percentage off) but with wording that will outperform the draft. LLMs excel at this task and can act agentically, reasoning over the historical sales uplift data and iteratively rephrasing the offers to find better wording (see Figure 3).
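The rewrite loop is essentially a propose-score-keep cycle. In the sketch below, `rephrase` and `forecast` are stubs standing in for the hosted LLM and the uplift forecaster (both the canned variants and the scoring heuristic are invented so the loop is runnable, and do not reflect Prediction Guard's actual prompts or models):

```python
def rephrase(offer, seed):
    """Stub for an LLM call that rewords an offer while preserving
    its terms; cycles through canned variants for demonstration."""
    variants = [
        f"Don't miss it: {offer}",
        f"{offer} - this weekend only!",
        f"Last chance: {offer}",
    ]
    return variants[seed % len(variants)]

def forecast(offer):
    """Stub scorer; a real system would forecast uplift from
    historical sales data matched to the offer's wording."""
    return 0.1 + 0.05 * offer.count("!") + 0.2 * ("Last chance" in offer)

def optimize_offer(draft, rounds=3):
    """Keep the highest-scoring rewording seen across the rounds."""
    best, best_score = draft, forecast(draft)
    for i in range(rounds):
        candidate = rephrase(draft, i)
        score = forecast(candidate)
        if score > best_score:
            best, best_score = candidate, score
    return best

print(optimize_offer("30% off + free shipping"))
```

An agentic version feeds the scores back into the next LLM prompt so each rephrasing round is informed by what worked, but the keep-the-best control flow is the same.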
Given that this promotion planning functionality processes email campaign data and generates offers that need to stay private until they are announced, ACC would be hesitant to upload the related data to LLM APIs built on closed, proprietary models with sometimes confusing terms of use. Running these text generations with Prediction Guard lets them keep their data private without sacrificing magical (and game-changing) AI functionality.
Not only that, but this functionality produces results! The LLM-driven promotional planning strategy was deployed in flash sales leading up to BFCM, resulting in revenue boosts of up to 40%!
Reliably meeting increased CX demand:
As promotions bring a wave of customers to the storefront, retailers must handle increased demand for customer experience (CX) and support interactions. Delighting customers during BFCM means responding well to a flood of requests for exchanges, returns, shipping updates, and more.
Prediction Guard has pioneered an LLM-powered customer service system for their e-commerce customers (like ACC) leading up to the holiday rush. Using the latest LLMs, this chatbot can respond to customer queries accurately in the brand's unique voice, while seamlessly analyzing the context from prior conversational exchanges. These conversational exchanges are kept private and LLM responses are validated and de-risked using Prediction Guard’s unique factuality and toxicity checks.
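Validating and de-risking a response can be pictured as a gate between the model and the customer. The checkers below are simplistic stand-ins (the actual factuality and toxicity checks run on Prediction Guard's side), and all scores, thresholds, and messages are invented for illustration:

```python
FALLBACK = "Let me connect you with a team member who can help with that."

def toxicity_score(text):
    """Stand-in for a toxicity classifier; flags a few obviously rude words."""
    bad_words = {"stupid", "idiot"}
    return 1.0 if any(w in text.lower() for w in bad_words) else 0.0

def factuality_score(text, context):
    """Stand-in for a factuality check: fraction of the response's
    longer words that also appear in the retrieved support context."""
    terms = [w for w in text.lower().split() if len(w) > 4]
    if not terms:
        return 1.0
    return sum(1 for w in terms if w in context.lower()) / len(terms)

def safe_response(candidate, context, tox_max=0.5, fact_min=0.6):
    """Deliver the model's answer only if it passes both checks;
    otherwise escalate to a human agent."""
    if toxicity_score(candidate) > tox_max:
        return FALLBACK
    if factuality_score(candidate, context) < fact_min:
        return FALLBACK
    return candidate
```

The point of the gate is that an answer which fails either check never reaches the customer; it is replaced by a safe hand-off, which is what keeps an automated BFCM support queue trustworthy.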
After implementing this LLM-based customer response system in Zendesk, ACC has seen a huge boost in operational efficiencies, which will be mission critical during BFCM. According to Daniel from Prediction Guard: "ACC, our valued customer, achieved an impressive milestone by leveraging Prediction Guard. They've efficiently reduced the need for two full-time employees dedicated to customer experience during flash sales to just one person working half-time. This strategic optimization has been instrumental, especially during their most critical periods."
Conclusion:
As e-commerce continues to define the holiday shopping experience, strategic implementations of AI solutions will be key to converting website traffic into satisfied customers. However, these AI solutions need to be implemented in a trustworthy manner, especially where customer data or sensitive financial information is being processed. Prediction Guard’s private, secure, and de-risked LLM solutions deployed on Intel Developer Cloud are meeting this need and demonstrating value this holiday season.
For more information on Prediction Guard, visit their website.
Intel® Liftoff for AI Startups helps innovators like Prediction Guard to turn their most ambitious projects into a reality. Apply to the program today to discover how far your business could go with access to Intel® accelerated computing solutions.