By Paul Krill
Editor at Large, InfoWorld
Microsoft is supplementing Semantic Kernel, its SDK for integrating AI large language models (LLMs) into applications, with an open source Copilot Chat sample app that lets developers more easily build chatbots using features such as natural language processing, file uploading, and speech recognition.
Unveiled May 1, Copilot Chat demonstrates how developers can integrate AI and LLM intelligence into their own applications. The tool is intended to help with building applications such as personalized recommendation systems and automated assistants for customer service, e-commerce, training and education, HR, and other tasks.
Microsoft cited scalability as a chief benefit, with chatbots helping sites meet increasing demand without hiring more staff, which the company noted could reduce costs and increase revenue. Other benefits Microsoft cited for the sample app include an improved user experience, increased efficiency, personalized recommendations, and better accessibility.
Developers can make chat smarter with LLM-based AI and keep it supplied with up-to-date information through Semantic Kernel, Microsoft said, describing the Copilot Chat sample app as "an enriched intelligence app" that becomes smarter with use. To try Copilot Chat, developers can update to the latest copy of Semantic Kernel from GitHub and follow the instructions for the sample app. However, the sample app is intended for educational purposes and is not recommended for production deployments.
The open source Semantic Kernel is positioned as a lightweight SDK for mixing conventional programming languages with the latest LLM AI "prompts," offering templating, chaining, and planning capabilities.
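To give a sense of what that looks like in practice, here is a minimal sketch of a templated "semantic function" using the SDK's Python flavor. The model name, placeholder API key, and method names (add_text_completion_service, create_semantic_function) follow the 2023-era Python package and are assumptions that may differ between versions; the GitHub repository (github.com/microsoft/semantic-kernel) has the authoritative examples.

```python
# Minimal sketch of Semantic Kernel's templated-prompt pattern, assuming the
# 2023-era Python SDK (pip install semantic-kernel). Method names here track
# that SDK's documented surface but change between releases; check the
# repository for the current API.
import semantic_kernel as sk
from semantic_kernel.connectors.ai.open_ai import OpenAITextCompletion

kernel = sk.Kernel()

# Register an LLM backend; the model name and API key are placeholders.
kernel.add_text_completion_service(
    "completion", OpenAITextCompletion("text-davinci-003", "YOUR_API_KEY")
)

# A "semantic function": a templated prompt ({{$input}} is the template slot)
# that the kernel renders and sends to the model.
summarize = kernel.create_semantic_function(
    "Summarize the following text in one sentence:\n{{$input}}",
    max_tokens=128,
)

# Invoke it like a normal function; the result is the model's completion.
print(summarize("Semantic Kernel lets apps mix native code with LLM prompts."))
```

Chaining, in this model, means feeding one semantic function's output into the next; planning lets the kernel compose such functions automatically to satisfy a user request.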
Paul Krill is an editor at large at InfoWorld, whose coverage focuses on application development.
Copyright © 2023 IDG Communications, Inc.