After a successful launch event, Intel® Liftoff Days returned for a second edition, running from July 29 to August 2, 2024. The event gave Intel® Liftoff members the opportunity to explore multimodal integration and to improve model efficiency through quantization.
The participating startups were encouraged to work on integrating and processing multiple types of data (text, images, audio) within a single model while utilizing quantization to make models more efficient.
What We Covered
Our workshops and brainstorming sessions explored a variety of AI topics, all designed to introduce participating members to the latest techniques and spark fresh ideas and discussions.
Here’s a snapshot of what we covered over the course of the week.
The week focused on structured learning, hands-on workshops, and leveraging Intel technologies to accelerate AI development. Startups explored cutting-edge techniques, from overcoming data bottlenecks to optimizing multimodal models for Intel hardware.
These were the week’s key workshops:
- Synthetic Data Workshop: Techniques for generating data using stable diffusion pipelines, variational autoencoders, and Intel's PyTorch extensions.
- Inference of Multimodal Models Workshop: Optimization for Intel CPUs/GPUs using Intel extensions for PyTorch and the ipex.optimize function.
- Fine-Tuning Multimodal Models Workshop: Preprocessing datasets, optimizing training loops, and leveraging Intel tools for hardware-specific performance.
- Quantization with OpenVINO Workshop: Reducing model size and boosting performance with FP16, INT8, and INT4 quantization.
On Demo Day, the participating startups pitched their projects, putting the knowledge and skills acquired during the week into practice.
The five startups that completed the hackathon with a final pitch were bitHuman, PeopleSense.AI, Maplewell Energy, Reama AI, and kAI.
Startup Gallery
- bitHuman creates human-like, interactive AI agents for enterprises that leverage generative AI and affective computing to identify emotions, sentiment, and intent and to interact in real time.
- PeopleSense.AI is an award-winning, patent-pending system and approach for detecting and estimating people-count information acquired through signal sensing and other technologies.
- Maplewell Energy develops demand optimization and AI software that transforms commercial and industrial buildings into virtual power plants. Their predictive Energy Management System optimizes battery storage in real time to maximize energy savings and revenue.
- Reama AI is a digital asset lifecycle overlay platform for renewable energy asset managers that enables 24/7 oversight of renewable energy power plant performance, failures, and stoppages.
- kAI is an AI-powered daily planner designed to revolutionize task management and productivity. With kAI, users can effortlessly organize their tasks, prioritize activities, and optimize their workflow with the help of artificial intelligence.
Projects We Loved
The hackathon spotlighted innovative projects from our community, with standout contributions from Maplewell Energy and kAI.
Maplewell Energy: Predictive Control System for Optimizing Energy Demand
Maplewell Energy is focused on modernizing energy demand management for commercial and industrial buildings. Their product, JANiiT, is a platform for distributed energy storage that helps solar developers, energy service companies (ESCOs), and facility managers manage, predict, and optimize peak load within their buildings.
Growing electrification behind the meter leads to issues such as load growth on utility distribution systems, peak demand charges, and infrastructure limitations.
JANiiT provides predictive peak shaving, time-of-use and demand response optimization, and acts as a virtual power plant for utilities. It uses real-time optimization, global forecasting, state estimation, and model predictive control.
Technology Stack:
The platform leverages an optimization engine and predictive control software, incorporating state-of-the-art AI techniques such as LLMs for forecasting and encoder-decoder models for multivariate data estimation.
Hackathon Objective
Build a small LLM for physical systems data estimation, focusing on vector quantization and clustering techniques to improve the predictive accuracy and efficiency of their energy management system.
What They Achieved During Intel® Liftoff Days
During the event, Maplewell Energy explored transitioning from local optimization models to global optimization models to improve time series forecasting and state estimation. They also applied new algorithms to their time series data to improve accuracy and used encoder-decoder models to compress the data into state vectors for more robust predictions.
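For illustration only (not Maplewell's actual implementation), here is a minimal sketch of that idea: an encoder-decoder compresses a multivariate telemetry window into a compact state vector, and a k-means codebook then quantizes those states into discrete operating conditions. All dimensions, window sizes, and data below are placeholders.

```python
# Sketch: compress a multivariate time-series window into a state vector
# with an encoder-decoder, then quantize the states with k-means.
# Window length, feature count, and dimensions are illustrative only.
import torch
import torch.nn as nn
from sklearn.cluster import KMeans

class StateAutoencoder(nn.Module):
    def __init__(self, n_features=8, window=96, state_dim=16):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Flatten(),
            nn.Linear(n_features * window, 128), nn.ReLU(),
            nn.Linear(128, state_dim),
        )
        self.decoder = nn.Sequential(
            nn.Linear(state_dim, 128), nn.ReLU(),
            nn.Linear(128, n_features * window),
        )

    def forward(self, x):                       # x: (batch, window, n_features)
        state = self.encoder(x)                 # compressed state vector
        recon = self.decoder(state).view_as(x)  # reconstruction used for training
        return state, recon

model = StateAutoencoder()
windows = torch.randn(256, 96, 8)               # placeholder load/solar telemetry
states, recon = model(windows)

# Vector quantization: cluster the state vectors into a small codebook,
# so each operating condition maps to a discrete code.
codebook = KMeans(n_clusters=12, n_init=10).fit(states.detach().numpy())
codes = codebook.predict(states.detach().numpy())
```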
The team also highlighted their success in the previous hackathon where they improved forecasting accuracy by 25% using an LLM. For this hackathon, they aimed to enhance their state estimation capabilities using new techniques and Intel's hardware optimization.
Mentor’s Feedback and Insights
Our technical mentors commended the sophistication of the model and suggested leveraging advanced optimization techniques and parallel processing to enhance performance. They also highlighted the scalability of Maplewell's solution and its potential applications in other industries, such as maritime and aircraft battery management.
Rahul Unnikrishnan Nair, Engineering Lead at Intel® Liftoff, highlighted the significance of data compression and the need for efficient algorithms to handle large-scale data. He suggested exploring different approaches to enhance the system's predictive capabilities.
Read more about Maplewell’s participation in the Intel Liftoff Days hackathon here.
kAI: Calendar and Chatbot for Business Scheduling
CEO Gilberto Pardo’s pitch for kAI was all about making life easier for businesses juggling endless tasks. Many businesses struggle with coordinating schedules and managing tasks across teams, leading to inefficiencies and missed deadlines.
Talk to Kai is a tool designed to help businesses organize their day-to-day activities more efficiently. It combines a conversational interface with a calendar system to streamline team scheduling and task delegation using simple prompts. The tool integrates these tasks into a visual calendar that displays team schedules by priority and importance.
Technology Stack:
The platform uses NLP and AI to understand and process natural language prompts, converting them into actionable tasks and schedules.
Hackathon Objective
kAI’s main challenge was to train a model to recognize events and activities from presentations or images, converting them into team schedules. This involved developing a PDF or image recognition system that can extract relevant information and automate scheduling processes.
Main Technical Takeaways
kAI’s team focused on adapting AI models for multilingual support, integrating diverse data sources for accurate scheduling, and designing a user-friendly interface to enhance productivity. The key techniques combined advanced machine learning methods with careful integration of those data sources.
Mentor’s Feedback and Insights
Intel® Liftoff’s technical mentors recommended exploring various optimization techniques and leveraging advanced machine learning methods to boost model performance. They emphasized the importance of integrating diverse data sources to ensure accurate task management.
Desmond Grealy, Head of Tech at Intel® Liftoff, highlighted the potential applications of the solution across different business contexts, stressing the need for scalability and robustness. He also suggested enhancing the user interface to improve productivity and exploring innovative approaches for greater efficiency.
The team further underscored the significance of multilingual support and the importance of implementing efficient algorithms to process diverse data inputs effectively.
Lessons We Learned
This edition of the Intel® Liftoff Days hackathon provided valuable insights into overcoming data bottlenecks, optimizing models, leveraging multimodal AI, and implementing agent-based systems.
1. Importance of Data Augmentation and Synthetic Data
Key takeaway: Generating and utilizing synthetic data is a powerful way to overcome data bottlenecks and accelerate development. Techniques like stable diffusion pipelines and variational autoencoders can create realistic datasets that significantly enhance workflows.
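As a hedged example of the kind of pipeline discussed in the workshop, the sketch below uses the Hugging Face diffusers library to generate synthetic images from text prompts. The model ID, prompts, and device are placeholders rather than the exact setup used during the event.

```python
# Sketch: generate synthetic training images with a Stable Diffusion pipeline.
# Model ID and prompts are illustrative; swap in whatever fits your dataset.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
)
pipe = pipe.to("cuda")  # or an Intel GPU device if your stack supports it

prompts = [f"a photo of a rooftop solar panel, sample {i}" for i in range(4)]
for i, prompt in enumerate(prompts):
    image = pipe(prompt, num_inference_steps=30).images[0]
    image.save(f"synthetic_{i:03d}.png")
```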
2. Optimizing Models for Performance and Scalability
Optimization techniques: Advanced optimization methods, such as Intel's extensions for PyTorch (IPEX), can greatly improve model performance and scalability. The ipex.optimize function allows for better hardware utilization, particularly on GPUs, leading to increased efficiency.
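A minimal sketch of that pattern, assuming Intel® Extension for PyTorch* is installed (the ResNet-50 model is just a stand-in for any eager-mode PyTorch model):

```python
# Sketch: optimize a PyTorch model for inference on Intel hardware with IPEX.
import torch
import intel_extension_for_pytorch as ipex
import torchvision.models as models

model = models.resnet50(weights=None).eval()
model = ipex.optimize(model, dtype=torch.bfloat16)  # operator fusion, bf16 kernels

x = torch.randn(1, 3, 224, 224)
with torch.no_grad(), torch.cpu.amp.autocast(dtype=torch.bfloat16):
    out = model(x)
```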
Quantization with OpenVINO: Quantization reduces model size and improves inference speed, making models more suitable for edge deployments. This approach highlights the dual benefits of smaller models and faster performance.
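The sketch below shows one possible OpenVINO flow, assuming the openvino and nncf packages: convert a PyTorch model to IR with FP16 weight compression, then apply INT8 post-training quantization with a small calibration set. The model and calibration data are placeholders; INT4 is typically handled by a similar NNCF weight-compression step.

```python
# Sketch: export to OpenVINO IR with FP16 weights, then quantize to INT8
# with NNCF post-training quantization. Calibration data is a placeholder.
import numpy as np
import torch
import torchvision.models as models
import openvino as ov
import nncf

torch_model = models.resnet50(weights=None).eval()
example = torch.randn(1, 3, 224, 224)

ov_model = ov.convert_model(torch_model, example_input=example)
ov.save_model(ov_model, "model_fp16.xml", compress_to_fp16=True)  # FP16 weights

# INT8 post-training quantization needs a small calibration set.
calib_data = [np.random.rand(1, 3, 224, 224).astype(np.float32) for _ in range(32)]
calib_dataset = nncf.Dataset(calib_data, lambda x: x)
int8_model = nncf.quantize(ov_model, calib_dataset)
ov.save_model(int8_model, "model_int8.xml")
```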
3. Leveraging Multimodal AI for Enhanced Capabilities
Multimodal architectures: Combining multiple data types within a single model creates robust and versatile AI solutions. The hackathon showcased the integration of components such as image encoders, pretrained large language models (LLMs), and embedding projectors. For example, the team demonstrated how the Moondream model efficiently handled image-based question answering tasks.
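To make the wiring concrete, here is a generic sketch of that architecture (not Moondream's internals): a frozen image encoder produces patch features, a small projector maps them into the language model's embedding space, and the LLM attends over the projected visual tokens alongside the text. Dimensions are illustrative.

```python
# Sketch of the general multimodal wiring: image encoder features are
# projected into the LLM embedding space and prepended to text tokens.
import torch
import torch.nn as nn

class EmbeddingProjector(nn.Module):
    def __init__(self, vision_dim=768, llm_dim=2048):
        super().__init__()
        self.proj = nn.Sequential(
            nn.Linear(vision_dim, llm_dim), nn.GELU(),
            nn.Linear(llm_dim, llm_dim),
        )

    def forward(self, image_features):      # (batch, patches, vision_dim)
        return self.proj(image_features)    # (batch, patches, llm_dim)

image_features = torch.randn(1, 196, 768)   # output of a frozen image encoder
text_embeddings = torch.randn(1, 32, 2048)  # embedded prompt tokens

projector = EmbeddingProjector()
visual_tokens = projector(image_features)

# The LLM then consumes visual tokens concatenated with the text tokens.
llm_input = torch.cat([visual_tokens, text_embeddings], dim=1)
```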
Inference and fine-tuning: Rahul highlighted techniques for optimizing these models using Intel’s tools, enhancing their efficiency across various applications.
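As a rough sketch of the fine-tuning side, assuming Intel® Extension for PyTorch* is available, ipex.optimize can also prepare a model together with its optimizer for mixed-precision training; the tiny classifier and random data below stand in for a real multimodal model and dataset.

```python
# Sketch: prepare a model and optimizer for fine-tuning on Intel hardware
# with IPEX, then run one bf16 training step on placeholder data.
import torch
import torch.nn as nn
import intel_extension_for_pytorch as ipex

model = nn.Sequential(nn.Linear(512, 256), nn.ReLU(), nn.Linear(256, 10)).train()
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
model, optimizer = ipex.optimize(model, optimizer=optimizer, dtype=torch.bfloat16)

inputs, labels = torch.randn(32, 512), torch.randint(0, 10, (32,))
optimizer.zero_grad()
with torch.cpu.amp.autocast(dtype=torch.bfloat16):
    loss = nn.functional.cross_entropy(model(inputs), labels)
loss.backward()
optimizer.step()
```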
4. The Role of Agent-Based Systems in Data Management
Managing data with agents: Agent-based systems can effectively handle data collection and synchronization, especially in environments with intermittent connectivity. The team encouraged the use of agents to ensure continuous development by managing data accumulation and synchronization, even under challenging network conditions.
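A minimal sketch of that idea in plain Python (the upload callable and buffer file are hypothetical placeholders): the agent keeps recording locally and only clears its buffer once a sync is confirmed.

```python
# Sketch: a collection agent that buffers readings locally and flushes them
# to a remote endpoint whenever connectivity allows. "post_batch" stands in
# for whatever upload mechanism you actually use.
import json
import time
from collections import deque

class CollectionAgent:
    def __init__(self, post_batch, buffer_path="buffer.jsonl"):
        self.post_batch = post_batch           # callable: list[dict] -> bool
        self.buffer = deque()
        self.buffer_path = buffer_path

    def record(self, reading: dict):
        self.buffer.append(reading)
        with open(self.buffer_path, "a") as f:  # persist in case the agent dies
            f.write(json.dumps(reading) + "\n")

    def sync(self):
        if not self.buffer:
            return
        batch = list(self.buffer)
        if self.post_batch(batch):              # only clear on confirmed upload
            self.buffer.clear()

agent = CollectionAgent(post_batch=lambda batch: True)  # placeholder uploader
agent.record({"ts": time.time(), "value": 42})
agent.sync()
```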
What Our Participants Had to Say
Participating in the Intel® Liftoff Days hackathon and collaborating with the exceptionally talented Intel team was incredibly valuable for us at kAI. It allowed us to lay the groundwork for our next update in just a few days, and the mentorship we received saved us months of work. Without a doubt, it was a special experience that we hope to repeat soon.
- Kelvin Perea, CEO, kAI
With the team support at Intel® Liftoff Days, we felt comfortable pushing our boundaries and trying something risky. Without this program, we would not have explored LLMs in our application.
- Matthew Irvin, CEO and Co-founder, Maplewell Energy
The hackathon on Synthetic Data and Multimodal Language Models was incredibly insightful. I'm excited to apply these concepts to my startup. My next steps are building a value hypothesis and rapidly prototyping with customers to validate our ideas. Thanks to the team at Intel® Liftoff for putting this hackathon together.
- Tebogo Mohlahlana, CEO and Co-Founder, Reama AI
The Intel® Liftoff team was very supportive and flexible to provide us an avenue where our team showcased a critical product update, and got first-hand valuable feedback.
- Efe Akengin, Head of Product, bitHuman
About Intel® Liftoff
Intel® Liftoff for startups is open to early-stage AI and machine learning startups. This free virtual program helps you innovate and scale, no matter where you are in your entrepreneurial journey.
Resources used during Intel® Liftoff Days
Intel® Tiber™ AI Cloud - Cloud platform for AI development and deployment
Intel® Distribution of OpenVINO™ toolkit (Powered by oneAPI) - Framework for optimizing deep learning inference
Intel® Extension for PyTorch* - Optimizations for PyTorch on Intel hardware
Intel® Gaudi® 2 AI accelerator - High-performance AI training processor designed for deep learning workloads