Build chatbots with Microsoft's Bot Framework

Microsoft's Azure-based services are a good place to start with conversational computing, even outside Microsoft environments

One of the more fascinating trends of this past year was the move to conversational computing. Instead of building complex apps, businesses can use bots over services like Facebook’s Messenger or Microsoft’s Skype to simplify customer interactions.

Using bots is a technique that can also work over internal chat tools like Slack and Microsoft Teams, giving rise to self-service “chatops” tools that can manage common service-desk queries. Last week, Microsoft unveiled more of its bot tools, including a service that takes existing content and uses it as the basis of conversational services.

Inside Microsoft’s Bot Framework

Microsoft’s Bot Framework is designed to help you build and deploy chat-based bots across a range of services, including non-Microsoft platforms and open web and SMS gateways, with minimal coding and with tools for delivering cross-platform conversations from a single bot implementation. Like many of Microsoft’s recent development tools, the Bot Framework is cross-platform and cloud-based, building on Azure services and on the company’s machine learning-powered Cognitive Services APIs.

At the heart of the Bot Framework are two SDKs, one for .NET and one for the open source, cross-platform Node.js. There’s also a set of RESTful APIs for building your own code in your choice of language. Once built and tested, bots can be registered on any of the supported channels (with their own user names and passwords) before being listed in Microsoft’s Bot Directory. You don’t need to register a bot unless you’re planning to use it with Skype, but in practice registration is a good idea for customer-facing services because it adds a layer of discoverability.
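To give a feel for the Node.js SDK, here is a minimal sketch of an echo bot built with the botbuilder package (the 3.x-era API) and a restify web server. The environment variable names are placeholders for the app ID and password you receive when you register the bot.

// A minimal Bot Framework bot using the Node.js SDK (botbuilder 3.x) and restify.
var restify = require('restify');
var builder = require('botbuilder');

// Web server that receives messages forwarded by the Bot Framework connector service.
var server = restify.createServer();
server.listen(process.env.PORT || 3978, function () {
    console.log('Bot listening on %s', server.url);
});

// The ChatConnector routes messages between the bot and the channels it is registered on.
var connector = new builder.ChatConnector({
    appId: process.env.MICROSOFT_APP_ID,            // placeholder: your registered app ID
    appPassword: process.env.MICROSOFT_APP_PASSWORD // placeholder: your app password
});
server.post('/api/messages', connector.listen());

// A UniversalBot with a single default dialog that echoes whatever the user says.
var bot = new builder.UniversalBot(connector, function (session) {
    session.send('You said: %s', session.message.text);
});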

Microsoft is providing much of the sample code for its Bot Framework on GitHub. The approach makes a lot of sense; developers are likely to already have GitHub accounts and can build chatbots using the toolkits into their existing workflow. It also means that developers can make appropriate pull requests and add their own code to the samples, improving them in collaboration with other development teams both inside and outside Microsoft.

Introducing the Azure Bot Service

Since releasing the Bot Framework as part of its “conversational computing” push at the Build 2016 developer event, Microsoft has continued to update the service regularly, adding new features and functions. That includes support for the Azure Bot Service, a cloud-hosted bot development platform, as well as open-sourcing its emulators and key controls. There’s also now support for more flexible conversations, allowing users to interrupt precomposed conversation streams, using their inputs as triggers to launch new actions and jump out of a dialog.

One useful new feature is a QnA Maker service designed to take content and turn it into a bot. It’s an ideal tool for anyone building a customer service bot because it can work with your FAQs and support documentation to deliver conversational support to users. Once it has generated pairs of questions and answers from your documentation, you can then test and train the new knowledge base bot to deliver the answers to the questions you expect to get. The resulting service can be wrapped as an API for use with Cortana Intelligence Suite or used as the back end for a bot running on the Azure Bot Service.
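Under the hood, a generated knowledge base is exposed as a REST endpoint that takes a question and returns the best-matching answer. The sketch below shows roughly how a Node.js bot might call it; the region, API version, knowledge base ID, and response shape are assumptions that will vary with your own QnA Maker subscription.

// A hedged sketch of querying a QnA Maker knowledge base over REST.
// The hostname, API version, and environment variable names are placeholders;
// check your own QnA Maker service for the exact endpoint and key.
var https = require('https');

function askKnowledgeBase(question, callback) {
    var body = JSON.stringify({ question: question });
    var req = https.request({
        hostname: 'westus.api.cognitive.microsoft.com',  // assumed region
        path: '/qnamaker/v2.0/knowledgebases/' + process.env.QNA_KB_ID + '/generateAnswer',
        method: 'POST',
        headers: {
            'Ocp-Apim-Subscription-Key': process.env.QNA_SUBSCRIPTION_KEY,
            'Content-Type': 'application/json',
            'Content-Length': Buffer.byteLength(body)
        }
    }, function (res) {
        var data = '';
        res.on('data', function (chunk) { data += chunk; });
        res.on('end', function () {
            // Older API versions return a single answer/score pair; newer ones
            // return an answers array, so inspect the parsed result accordingly.
            callback(null, JSON.parse(data));
        });
    });
    req.on('error', callback);
    req.write(body);
    req.end();
}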

One advantage of hosting bots on a cloud platform like Azure with the Azure Bot Service is that you can use it with serverless compute resources. If you’re expecting bots to need to respond quickly to an unknown number of users, it’s well worth considering this approach. You won’t be spending money on unused virtual infrastructure, and at the same time you won’t risk losing a customer when a bot fails to respond to a query. As demand increases, serverless bots are spawned as needed, and you’re charged for only the cloud resources you use.

The Azure Bot Service simplifies building and deploying bots, using the C# and Node.js support in Azure Functions, the platform’s serverless compute service, along with calls to the various Cognitive Services APIs. Although you can build and run your code from the cloud-hosted Azure editor, Microsoft recommends using its continuous integration tools so you can work from your choice of source control service and your own IDE.

Building your first bot is relatively simple: Once you’ve subscribed to the Bot Service via the Azure dashboard, you can pick from a set of templates to get started. If you’re using the in-Azure tools, you get access to a basic chat simulator alongside the code editor. Integrating simulators with development tools like this is likely to become more common as bots spread, because without a real-time test tool it’s nearly impossible to ensure that a bot works.

More complex conversation with LUIS

Most bots are relatively simple, looking for keywords, then triggering appropriate responses. That’s all very well when you can control the dialog between user and bot. However, it’s an approach that breaks down with more sophisticated bots, where users are expecting a more natural conversational interaction. If you’re taking the natural-language route, you’ll need to spend some time with Microsoft’s Language Understanding Intelligent Service (LUIS).

LUIS is a fascinating tool, one of the more important elements driving free-form conversational bots. It’s designed to interpret the intent behind what a user says. For example, if you’re building a bot to handle ordering takeout food, you can work with LUIS to extract address information and the meal a user is ordering, formatting that information so that it’s ready to pass to an app. Because LUIS extracts a user’s intent from a dialog, you can build specific responses to handle specific user intents: one for where an order has to go, one for what it contains, and one to finalize the order.
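With the Node.js SDK, that intent-driven approach looks roughly like the sketch below: a LuisRecognizer points at your published LUIS model, and an IntentDialog maps each intent to a handler that pulls out the entities it needs. The model URL, intent name, and entity names here are hypothetical and would come from your own LUIS app.

// A rough sketch of wiring a LUIS model into a botbuilder (3.x) bot.
// LUIS_MODEL_URL, 'OrderFood', 'Address', and 'MealType' are hypothetical
// placeholders for your own published LUIS app, intents, and entities.
var builder = require('botbuilder');

var connector = new builder.ChatConnector({
    appId: process.env.MICROSOFT_APP_ID,
    appPassword: process.env.MICROSOFT_APP_PASSWORD
});
var bot = new builder.UniversalBot(connector); // web server wiring as in the earlier sketch

var recognizer = new builder.LuisRecognizer(process.env.LUIS_MODEL_URL);
var intents = new builder.IntentDialog({ recognizers: [recognizer] });
bot.dialog('/', intents);

intents.matches('OrderFood', function (session, args) {
    // Pull out the entities LUIS recognized in the user's utterance.
    var address = builder.EntityRecognizer.findEntity(args.entities, 'Address');
    var meal = builder.EntityRecognizer.findEntity(args.entities, 'MealType');
    session.send('Ordering %s for delivery to %s.',
        meal ? meal.entity : 'your meal',
        address ? address.entity : 'your address');
});

intents.onDefault(function (session) {
    session.send("Sorry, I didn't understand that.");
});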

You can use dialogs like this to construct more complex interactions by treating them as chains of information, tied together by queries that build the connection from your bot to your application. It’s a fairly complex way of thinking about user experience, but one that’s not too different from working with requests from a single-page web app.
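In the Node.js SDK, one common way to express that chaining is a waterfall dialog, where each step prompts for a piece of information and passes its result on to the next. Here is a minimal sketch, with a hypothetical dialog name and prompts for the takeout example.

// A sketch of chaining information-gathering steps with a waterfall dialog
// (botbuilder 3.x). The dialog name and prompt wording are hypothetical,
// and the bot setup mirrors the earlier sketches.
var builder = require('botbuilder');

var bot = new builder.UniversalBot(new builder.ChatConnector({
    appId: process.env.MICROSOFT_APP_ID,
    appPassword: process.env.MICROSOFT_APP_PASSWORD
}));

bot.dialog('/placeOrder', [
    function (session) {
        builder.Prompts.text(session, 'What would you like to order?');
    },
    function (session, results) {
        session.dialogData.meal = results.response;
        builder.Prompts.text(session, 'Where should we deliver it?');
    },
    function (session, results) {
        session.dialogData.address = results.response;
        // Hand the collected meal and address off to your ordering application here.
        session.endDialog('Got it: %s to %s.',
            session.dialogData.meal, session.dialogData.address);
    }
]);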

Microsoft is doing its best to make it easy to get started with building and using bots. But it’s still early days: Users are prepared to work with bots for simple interactions, but more complex conversations are still far from gaining wide acceptance. Tools like QnA Maker should help encourage adoption, because they can turn your knowledge base into a conversational service that reduces the load on support desks, helping support staff prioritize important problems over routine queries.

Copyright © 2016 IDG Communications, Inc.