Intel® Liftoff member CloudConstable is building AI-powered, cloud-deployed subscription services designed to improve customer engagement, with a focus on accessibility, inclusivity, and operational efficiency.
The flagship of CloudConstable's cloud-deployed solutions is AVA (Animated Venue AI), an animated, fully customizable "subject expert" that dynamically answers customer questions, guides visitors, and significantly improves customer engagement. AVA's interface handles multiple inquiries simultaneously and provides rapid feedback for continuous improvement, scalability, and flexibility. Subscribers to CloudConstable's suite of cloud-based solutions also receive ongoing support and updates, ensuring access to the latest features.
The system's front end runs on an AI PC and is powered by OpenVINO, enabling rapid development and deployment.
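As a rough illustration of what OpenVINO inference on an AI PC can look like (a minimal sketch only; the model file name, device choice, and input shape below are placeholder assumptions, not details of CloudConstable's deployment):

```python
# Minimal OpenVINO inference sketch (hypothetical model and input;
# not CloudConstable's actual front-end pipeline).
import numpy as np
import openvino as ov

core = ov.Core()

# "ava_frontend.xml" is a placeholder IR model name.
model = core.read_model("ava_frontend.xml")

# "AUTO" lets OpenVINO pick the best local device (CPU, iGPU, or NPU on an AI PC).
compiled = core.compile_model(model, device_name="AUTO")

# Dummy input matching an assumed 1x3x224x224 image tensor.
dummy_input = np.random.rand(1, 3, 224, 224).astype(np.float32)

# Run inference and read the first output tensor.
result = compiled(dummy_input)[compiled.output(0)]
print(result.shape)
```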
CloudConstable's collaboration with Intel® Liftoff for AI Startups
The collaboration with Intel® Liftoff accelerates CloudConstable's innovation, enhances performance, and provides strategic advantages. With Intel's support, the organization is positioned to scale its solutions, improve customer satisfaction, and achieve its business objectives in a competitive market.
Intel® Liftoff opens avenues to cutting-edge technologies, supporting the development of high-performance AI applications with rapid response times and real-time data processing. The program's focus on optimizing AI models and refining algorithms for greater efficiency, agility, and accuracy raises the standard of the overall customer experience. Leveraging Intel's expertise in scalable infrastructure, CloudConstable can deploy AI solutions broadly, maintaining robust and reliable performance that adapts to the increasing demands of its customer base.
Intel® Liftoff also reinforces the brand's credibility, validating its AI capabilities and solutions, enhancing market presence, and attracting new clients and investors. Collaborative opportunities with other AI innovators foster knowledge sharing and mutual growth and enable substantial improvements to existing solutions. Additionally, Intel's position in the industry provides visibility within the tech community, offering platforms to showcase innovative products, expanding reach, and contributing to marketing initiatives. Mentorship from Intel's experts supports effective problem-solving, expedites the product development cycle, and helps the team overcome complex challenges.
Advent of GenAI
The Advent of GenAI Hackathon focused CloudConstable's team on integrating the Azure OpenAI Service with AVA. Using GPT-3.5 Turbo and GPT-4 models to explore Microsoft's retrieval-augmented generation (RAG) implementation, the team demonstrated its connection to Azure Cognitive Search. The real gem, however, was identifying how Prediction Guard's APIs could monitor AVA's responses, which is crucial for CloudConstable's existing users, including a military museum and a health clinic. Intel Developer Cloud's GPUs and HPUs became key resources, promising efficient inference, potential model training, and enhanced privacy. This unlocked new avenues in large language models and speech synthesis, empowering tailored solutions for diverse customer needs.
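To make the RAG flow concrete, here is a minimal Python sketch of the pattern described above: retrieve museum documents from an Azure Cognitive Search index, then pass them as context to an Azure OpenAI chat model. The endpoints, index and field names, and deployment name are placeholder assumptions, not CloudConstable's actual configuration.

```python
# Hypothetical RAG sketch: Azure Cognitive Search retrieval + Azure OpenAI chat.
# All endpoints, keys, index/field names, and deployment names are placeholders.
import os
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient
from openai import AzureOpenAI

search_client = SearchClient(
    endpoint=os.environ["SEARCH_ENDPOINT"],
    index_name="museum-exhibits",          # assumed index name
    credential=AzureKeyCredential(os.environ["SEARCH_KEY"]),
)

openai_client = AzureOpenAI(
    azure_endpoint=os.environ["AOAI_ENDPOINT"],
    api_key=os.environ["AOAI_KEY"],
    api_version="2024-02-01",
)

question = "What tanks are on display at the museum?"

# Retrieve the top matching document chunks (assumed 'content' field).
hits = search_client.search(search_text=question, top=3)
context = "\n\n".join(doc["content"] for doc in hits)

# Ask the chat model to answer only from the retrieved context,
# which is the core of the grounding strategy discussed below.
response = openai_client.chat.completions.create(
    model="gpt-35-turbo",  # assumed deployment name
    messages=[
        {"role": "system",
         "content": "Answer using only the provided museum documents. "
                    "If the answer is not in them, say you don't know."},
        {"role": "user", "content": f"Documents:\n{context}\n\nQuestion: {question}"},
    ],
)
print(response.choices[0].message.content)
```

A separate output-monitoring step (such as the Prediction Guard checks mentioned above) would then review the generated answer before it reaches a visitor.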
Practical Development
In sectors such as museums and healthcare, the primary emphasis must be on accuracy and usability. Intel hardware helps in this domain by providing the performance needed to address the unique challenges associated with large language models and improve content accuracy.
The greatest challenge for museum content generation was providing tailored information while avoiding model hallucinations, along with the crucial task of clearly defining information sources to prevent inaccuracies. These challenges shape the ongoing refinement of the content creation process, especially across the intricate and varied landscape of museum exhibits.
Balancing accurate information with visitor expectations in museums is a complex task. Challenges include curating comprehensive data that covers both core information and specific artifact details. Striking this balance is paramount to delivering a meaningful experience for visitors while maintaining the accuracy and integrity of the presented information.
For smaller museums, the challenge is assembling a custom document corpus from the limited institution-specific material that is available, since little of it appears in a model's pretraining data. Examining the challenges faced by these institutions sheds light on how to curate data effectively within the specific needs and constraints of smaller organizations of this kind.
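One plausible way to build such a corpus (a sketch only; the folder layout, chunking scheme, index name, and field schema are assumptions rather than CloudConstable's documented process) is to chunk the museum's own documents and upload them to the search index that the RAG flow above queries:

```python
# Hypothetical corpus-building sketch: chunk local museum documents and
# upload them to an Azure Cognitive Search index. Paths, index name, and
# the chunking scheme are illustrative assumptions.
import os
from pathlib import Path
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient


def chunk_text(text: str, size: int = 800, overlap: int = 100):
    """Split text into overlapping character chunks for retrieval."""
    step = size - overlap
    return [text[i:i + size] for i in range(0, max(len(text), 1), step)]


search_client = SearchClient(
    endpoint=os.environ["SEARCH_ENDPOINT"],
    index_name="museum-exhibits",  # assumed index with id/content/source fields
    credential=AzureKeyCredential(os.environ["SEARCH_KEY"]),
)

docs = []
for doc_id, path in enumerate(Path("museum_docs").glob("*.txt")):
    for chunk_id, chunk in enumerate(chunk_text(path.read_text(encoding="utf-8"))):
        docs.append({
            "id": f"{doc_id}-{chunk_id}",
            "content": chunk,
            "source": path.name,  # records which document each chunk came from
        })

# Upload the chunks; the index schema must already define these fields.
search_client.upload_documents(documents=docs)
print(f"Uploaded {len(docs)} chunks.")
```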
Michael Pickering, CEO of CloudConstable, gave a clear real-world example of this issue:
“...if you let the model give answers based on its world knowledge, it can have all kinds of issues, like even sort of weird hallucinations, where it gets confused between other military museums. As an example, the one we work with is in Oshawa, Ontario. There's a big military museum, the Canadian War Museum, up in Ottawa. They have a lot of stuff too.
They probably have even more content published on the web. And when I was asking about artifacts from the Oshawa museum, GPT sometimes got confused and would start telling me about stuff from the Canadian War Museum, which is real stuff, but it's in a different museum.”
This means it’s essential for AVA to accurately recall and describe the unique artifacts, such as military vehicles, at the client’s museum, and not accidentally describe items from collections in other museums. This requires finding or creating reputable general knowledge sources for certain information types while prioritizing accuracy in the specific details of the client’s own collection.
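A simple way to enforce that boundary at the retrieval step (sketched below with an assumed "museum" metadata field and index; this is not CloudConstable's documented implementation) is to filter search results to the client's own collection before they ever reach the model:

```python
# Hypothetical retrieval filter: only return chunks tagged with the client's
# museum so the model cannot be grounded on another institution's artifacts.
# The index name, 'museum' field, and filter value assume an Azure Cognitive
# Search index set up with that metadata.
import os
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient

search_client = SearchClient(
    endpoint=os.environ["SEARCH_ENDPOINT"],
    index_name="museum-exhibits",
    credential=AzureKeyCredential(os.environ["SEARCH_KEY"]),
)


def retrieve_for_client(question: str, museum: str, top: int = 3):
    """Return only document chunks belonging to the specified museum."""
    results = search_client.search(
        search_text=question,
        filter=f"museum eq '{museum}'",  # OData filter on the assumed metadata field
        top=top,
    )
    return [doc["content"] for doc in results]


# Example: retrieval scoped to the Oshawa client's collection only.
chunks = retrieve_for_client("Tell me about the tanks on display", "oshawa-military-museum")
```

Scoping retrieval this way keeps the model's context limited to the client's verified collection data, which is exactly the kind of cross-museum confusion the example above describes.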
The lighter side: AVA’s Virtual Green Thumb
Sometimes, though, an unintended behavior can be quite endearing. In CloudConstable's small office, the team decided to test AVA in the real world, setting her up to greet team members. AVA's sensitive sensors, reacting to the air conditioning, comically and repeatedly greeted the office plant, Leafy, which became a running joke. The glitch provided useful insights for sensor calibration and was a reminder that humor has a place in innovation, contributing to CloudConstable's vibrant workplace culture. Leafy, the apparently not-so-silent colleague, became an integral part of the CloudConstable team!
CloudConstable remains focused on developing effective strategies that address the challenges inherent in creating a cloud-based “subject expert”, with a commitment to building a suite of systems that accurately meets the demands of its customers’ highly specialized fields.