The Intel® Core™ Ultra Processors (Series 1), codenamed Meteor Lake, were released in December 2023. With the objective of getting early feedback and understanding the scope of innovations on the Meteor Lake AI PC platform, a developer-focused AI PC pilot hackathon was organized by the Software Ecosystem Enablement (SEE) group of Intel in collaboration with the Client Computing Group (CCG). The event, called the ‘Rockstar Student Ambassador Hackathon’, brought together the 12 most engaged members of the Intel® Student Ambassador Program in the US; our team provided them with Intel’s AI PC development kits and trained them for 9 weeks on building innovative solutions harnessing the power of AI PCs.
This blog highlights the top projects built on AI PCs at the hackathon.
Winning Projects at the AI PC Pilot Hackathon
The following are the 5 best projects developed by the Student Ambassadors at the event:
- Virtual Reality (VR) AI Playground: This project, developed by Migara Amarasinghe from Florida State University, is an interactive learning platform for educators, students, and AI enthusiasts. It creates an immersive educational experience covering basic to advanced AI concepts using virtual reality and real-time AI applications. The aim is to make AI learning more engaging through interactive tutorials and real-time demonstrations. The Intel® NUC (Next Unit of Computing) was used as the main workstation. The project is based on the C# and Python* programming languages and comprises three programs: (i) the main VR AI playground for speech-to-image interaction, (ii) a YOLOv10 object detection VR program that uses the OpenVINO™ Toolkit to perform model training and inference locally on the GPU (a minimal inference sketch follows this list), and (iii) VR integration with Hugging Face* APIs for interactive text-to-image conversion.
• Check out the demo video of project 'VR AI Playground' for more details.
- TikTalk: This project, developed by Bill Zhang from the University of Southern California, generates short videos about educational content from a prompt input. The Meta* Llama 3.2 model, optimized by the team for the Intel® CPU using the OpenVINO Toolkit, generates both the video script and a music description (a sketch of this text-generation step follows the list). The music description is used to generate the music audio, which is then transcribed to produce subtitles. Relevant images are generated from the subtitles, and everything is assembled into a complete video output in the form of an .mp4 file.
• For more details, refer to the demo video of project 'TikTalk'.
- Maestra: A Python-based voice-to-voice platform built by Keerthi Nalabotu from the University of California, San Diego. The user feeds the tool a topic or a file, such as a document or notes, as input. The tool responds with an initial question based on the file’s content. The user can then record their own response to the question, leading to a conversation that is tracked in a transcript for future reference. The team used the OpenVINO version of the Microsoft* Phi-3 model and ran it on the GPU instead of the CPU, resulting in faster inference (a rough voice-to-voice loop is sketched after the list).
• Check out the demo video of project ‘Maestra’.
- NUC AI PC Drone: This AI-powered drone project was developed by Yuri Winche Achermann from RWTH Aachen University. It simulates the brain of an autonomous drone using a NUC AI PC powered by the Intel Core Ultra 7 processor as the central unit for managing hardware resources and software applications. The Computer Vision module performs object detection, depth estimation, and facial recognition; the Natural Language Processing (NLP) module handles question answering, order placement, and customization; and the Audio Processing module tackles speech-to-text and text-to-speech conversions (an illustrative module skeleton follows the list). Details of the drone architecture are available on GitHub.
• For more information, refer to the demo video of the AI Drone project.
- YouTube Copilot: Built by Kieran Llarena from the University of Michigan, this project acts as a virtual assistant that can answer your questions about a YouTube video given its URL. The Python-based project runs a Hugging Face LLM on the AI PC and uses OpenVINO for faster inference. A Google Chrome* extension injects a GUI, written in JavaScript*, into the target YouTube video page. The URL is then processed to extract the video ID and store the subtitles (a sketch of this step follows the list). Based on the user’s search query, the LLM running locally on the AI PC generates a response and sends it back to the Chrome extension, which parses the data and displays the response on the injected GUI.
• Here’s a demo video of the ‘YT co-pilot' project for more details.
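For the VR AI Playground’s detection step, here is a minimal sketch of how a YOLO-style model exported to OpenVINO IR can be compiled for the integrated GPU and run on a single frame; the model file name and input shape are assumptions, not details taken from the project.

```python
# Minimal OpenVINO inference sketch; "yolov10n.xml" and the 640x640 input
# shape are assumptions, not confirmed by the project.
import numpy as np
import openvino as ov

core = ov.Core()
model = core.read_model("yolov10n.xml")        # assumed IR export of the detector
compiled = core.compile_model(model, "GPU")    # use "CPU" if no GPU plugin is present

frame = np.random.rand(1, 3, 640, 640).astype(np.float32)  # placeholder frame
detections = compiled(frame)[compiled.output(0)]
print("Raw detection tensor shape:", detections.shape)
```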
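TikTalk’s script-generation step can be approximated with the optimum-intel bridge between Hugging Face and OpenVINO. The checkpoint ID and prompt below are assumptions; the team’s actual export and optimization settings may differ.

```python
# Hedged sketch of script generation with an OpenVINO-exported Llama 3.2 model;
# the checkpoint ID and generation settings are assumptions.
from optimum.intel import OVModelForCausalLM
from transformers import AutoTokenizer

model_id = "meta-llama/Llama-3.2-1B-Instruct"   # assumed checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = OVModelForCausalLM.from_pretrained(model_id, export=True)  # runs on the CPU by default

prompt = "Write a 30-second educational video script about photosynthesis."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```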
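Maestra’s conversational loop can be thought of as transcribe, generate, speak. The sketch below assumes Whisper (via transformers) for speech-to-text and pyttsx3 for offline text-to-speech; the project itself pairs such a loop with an OpenVINO build of Phi-3 running on the GPU.

```python
# Rough voice-to-voice loop; Whisper and pyttsx3 are assumed stand-ins for
# whatever speech components the project actually uses.
import pyttsx3
from transformers import pipeline

asr = pipeline("automatic-speech-recognition", model="openai/whisper-base")
tts = pyttsx3.init()

def respond(audio_path: str, generate_reply) -> str:
    """Transcribe the learner's recording, generate a follow-up, and speak it."""
    user_text = asr(audio_path)["text"]
    reply = generate_reply(user_text)   # e.g. a Phi-3 generate() call on the GPU
    tts.say(reply)
    tts.runAndWait()
    return reply
```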
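The drone project’s three modules suggest a simple central-dispatcher layout. The skeleton below is illustrative only; the class and method names are hypothetical, and the real architecture is documented in the project’s GitHub repository.

```python
# Illustrative module skeleton for a drone "brain"; names are hypothetical.
class ComputerVisionModule:
    def detect(self, frame):
        ...  # object detection, depth estimation, facial recognition

class NLPModule:
    def answer(self, question: str) -> str:
        ...  # question answering, order placement, customization

class AudioModule:
    def transcribe(self, audio) -> str: ...
    def speak(self, text: str) -> None: ...

class DroneBrain:
    """Central unit on the NUC AI PC that routes work between modules."""
    def __init__(self):
        self.vision = ComputerVisionModule()
        self.nlp = NLPModule()
        self.audio = AudioModule()

    def handle_voice_command(self, audio) -> None:
        question = self.audio.transcribe(audio)
        self.audio.speak(self.nlp.answer(question))
```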
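For the YouTube Copilot, the URL-to-subtitles step might look like the sketch below, which assumes the youtube-transcript-api package; the extension’s actual backend may fetch captions differently.

```python
# Hedged sketch of extracting a video ID and its subtitles; the
# youtube-transcript-api package is an assumption, not confirmed by the project.
from urllib.parse import urlparse, parse_qs
from youtube_transcript_api import YouTubeTranscriptApi

def video_id_from_url(url: str) -> str:
    """Pull the video ID out of a standard watch URL."""
    return parse_qs(urlparse(url).query)["v"][0]

url = "https://www.youtube.com/watch?v=dQw4w9WgXcQ"   # example URL
video_id = video_id_from_url(url)
transcript = YouTubeTranscriptApi.get_transcript(video_id)
subtitles = " ".join(chunk["text"] for chunk in transcript)
# `subtitles` then becomes the context passed to the locally hosted LLM.
```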
What’s Next?
Leverage our AI PCs powered by Intel Core Ultra processors for accelerated AI development, combining the potential of the CPU, GPU, and NPU. Get started with our AI PC development resources, including the AI PC development kit.
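As a quick sanity check before targeting the CPU, GPU, or NPU, OpenVINO can list the accelerators it detects on the machine; on a Core Ultra system this typically includes all three.

```python
# List the compute devices OpenVINO detects on an AI PC.
import openvino as ov

core = ov.Core()
for device in core.available_devices:
    print(device, "-", core.get_property(device, "FULL_DEVICE_NAME"))
```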
We also encourage you to explore our AI tools and framework optimizations powered by the oneAPI programming model for multiarchitecture, cross-vendor, parallel computing.