For ecommerce businesses that use Shopify and other tools, data analytics is a serious challenge. These businesses generate mountains of valuable data, but it's highly fragmented, making it difficult to identify important insights, alerts, and attributions.
To help these ecommerce brands get more value from their data, The Mango Jelly is building a Collaborative AI Workspace and Copilot for marketing teams. The startup was recently incubated in UC Berkeley's SkyDeck Pad-13 program. Their platform enables marketing teams to extract actionable insights from high volumes of complex and fragmented data, across multiple product integrations.
Crucially, it’s all done using plain English, with no technical or data science expertise required. Using Generative AI, their platform enhances marketing analytics and automation efficiency, achieving a 20x improvement in speed, productivity, and data usage compared to existing techniques.
So they were a natural fit for Intel® Liftoff, where they made it into the top four startups by successfully leveraging accelerated computing infrastructure.
From Data to Clarity: The Paradox of Abundant Information and Scarce Insights
Earlier we mentioned the sheer volume of data that ecommerce businesses generate. But the way they collect that data adds even more complexity. Ecommerce marketing teams usually handle multiple marketing channels to engage and acquire customers. They are inundated with data, fragmented across multiple tools. As a result, they often miss critical insights and alerts, leading to lost revenue. The Mango Jelly improves data utilization by surfacing the most valuable insights through simple conversations.
The Mango Jelly set out to build a fine-tuned LLM and integrate it with Shopify. The goal was for users to be able to request insights from their data in natural language. The integration would need to consume the natural-language request and return analysis results computed over the Shopify store's data. The Mango Jelly hoped to then refine this model for different verticals and use cases.
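To make the flow concrete, here is a deliberately simplified sketch of the pattern: a plain-English question is mapped to an aggregation over store data. All names here (`answer_question`, `ORDERS`) are hypothetical; The Mango Jelly's actual product routes the question through a fine-tuned LLM and the Shopify APIs rather than keyword matching.

```python
# Illustrative sketch: answering a plain-English question with an
# aggregation over Shopify-style order rows. Keyword matching stands in
# for the LLM that interprets the request in the real product.
from datetime import date

# Toy stand-in for rows pulled from a store's orders data.
ORDERS = [
    {"day": date(2023, 7, 1), "channel": "email", "revenue": 120.0},
    {"day": date(2023, 7, 1), "channel": "paid_search", "revenue": 340.0},
    {"day": date(2023, 7, 2), "channel": "email", "revenue": 95.0},
]

def answer_question(question: str) -> str:
    """Map a plain-English question to an aggregation over the order data."""
    q = question.lower()
    if "revenue by channel" in q:
        totals: dict[str, float] = {}
        for row in ORDERS:
            totals[row["channel"]] = totals.get(row["channel"], 0.0) + row["revenue"]
        best = max(totals, key=totals.get)
        return f"Top channel: {best} (${totals[best]:.2f})"
    if "total revenue" in q:
        return f"Total revenue: ${sum(r['revenue'] for r in ORDERS):.2f}"
    return "Sorry, I can't answer that yet."

print(answer_question("What is our revenue by channel?"))
```

In the real system, the hard part is exactly what this sketch elides: interpreting arbitrary phrasing and fragmented multi-tool data, which is where the fine-tuned LLM comes in.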
The biggest challenge they faced was scaling the product on the optimal infrastructure to make these data operations cost-effective.
Supercharging Open-Source Models with Cutting-Edge Performance
Over the course of the Intel® Liftoff hackathon, the team was able to fine-tune an open-source LLM on extremely powerful hardware including Intel® Data Center GPU Max.
This gave The Mango Jelly greater control over customizability, fine-tuning and usage limits with open-source LLMs, backed by efficient and cutting-edge hardware. It also optimized their solution, making it more suitable for enterprise use cases.
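The mechanics of targeting Intel GPUs from PyTorch can be sketched as follows. This is a minimal illustration, not The Mango Jelly's actual training code: the tiny linear model stands in for the open-source LLM, and it assumes a PyTorch build (or Intel® Extension for PyTorch*) that exposes the `"xpu"` device on Intel Data Center GPU Max hardware, falling back to CPU elsewhere.

```python
# Minimal sketch of a fine-tuning loop that runs on an Intel GPU ("xpu")
# when available and falls back to CPU otherwise. The tiny model below
# stands in for an open-source LLM.
import torch
from torch import nn

# Prefer the Intel XPU backend when present.
device = "xpu" if hasattr(torch, "xpu") and torch.xpu.is_available() else "cpu"

model = nn.Linear(8, 2).to(device)  # stand-in for the LLM
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Toy batch standing in for tokenized training examples.
inputs = torch.randn(16, 8, device=device)
labels = torch.randint(0, 2, (16,), device=device)

for step in range(5):
    optimizer.zero_grad()
    loss = loss_fn(model(inputs), labels)
    loss.backward()
    optimizer.step()

print(f"trained on {device}, final loss {loss.item():.4f}")
```

Because the device string is the only hardware-specific piece, the same loop moves between CPU development and Intel GPU training without code changes, which is part of what makes the XPU stack convenient for a small team.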
"Intel Liftoff is a fantastic program for AI Startups. It is run by a very supportive and engaged organizing team. Interestingly, we were onboarded to the program less than an hour before the LLM hackathon started. As part of Intel Liftoff, we were able to fine-tune an open-source LLM on extremely powerful hardware. It was astonishing to experience the speed and performance of the Intel XPU." - Divya Upadhyay, Co-Founder, The Mango Jelly.
What’s Next for The Mango Jelly?
The team already has big plans for their optimized LLM integration for Shopify:
- Integrating with other products in the marketing stack
- Predictions based on training datasets
- Inviting beta customers to get on board
Empowering Startups to Dream Big: Intel® Liftoff for Startups
By providing early-stage startups like The Mango Jelly with cutting-edge infrastructure and mentorship, Intel® Liftoff for Startups is playing an indispensable role in catalyzing innovative ideas. The Mango Jelly's journey is a clear indication of how a promising startup, armed with the right tools and resources, can tackle urgent business challenges and win, enabling its own clients to leverage the LLM revolution. Apply to the program today to take your great idea one step closer to reality.