Tuesday 24 October 2023

Connect an OpenAI chat bot to the internet using Bing Search API

In the previous post, we saw what OpenAI function calling is and how to use it to chat with your organization's user directory using Microsoft Graph. Please have a look at the article here: Chat with your user directory using OpenAI functions and Microsoft Graph

In this post, we will implement function calling for a very common scenario of augmenting the large language model's responses with data fetched from internet search.

Since the Large Language Model (LLM) was trained with data only up to a certain date, we cannot talk to it about events which happened after that date. To solve this, we will use OpenAI function calling to call out to the Bing Search API and then augment the LLM's responses with the data returned from the internet search.

This pattern is called Retrieval Augmented Generation or RAG. 


Now, let's look at the code to achieve this. In this code sample I have used the following NuGet packages:

https://www.nuget.org/packages/Azure.AI.OpenAI/1.0.0-beta.6/

https://www.nuget.org/packages/Azure.Identity/1.10.2/

The very first thing we will look at is our function definition for informing the model that it can call out to an external search API to look up information:
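A minimal sketch of such a definition, built with the Azure.AI.OpenAI beta SDK, could look like the following. The function name search_bing and the single query parameter are just illustrative choices:

```csharp
using System.Text.Json;
using Azure.AI.OpenAI;

// Tells the model that it can ask us to search the web. When the model decides
// to call this function, it fills in the "query" parameter with a search query.
static readonly FunctionDefinition searchFunction = new FunctionDefinition
{
    Name = "search_bing",
    Description = "Searches Bing to get up to date information from the internet",
    Parameters = BinaryData.FromObjectAsJson(new
    {
        Type = "object",
        Properties = new
        {
            Query = new
            {
                Type = "string",
                Description = "The search query to send to Bing"
            }
        },
        Required = new[] { "query" }
    },
    new JsonSerializerOptions { PropertyNamingPolicy = JsonNamingPolicy.CamelCase })
};
```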

In this function definition, we are informing the LLM that if it needs to search the internet as part of providing a response, it can call this function. When the model decides to do so, the function name is returned in the response along with the relevant parameters.

Next, let's see how our orchestrator looks. I have added comments to each line where relevant:
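Sketched out, the orchestrator flow could look something like this. The CallChatGPT and SearchBing helpers used here are the ones covered later in this post:

```csharp
using System.Collections.Generic;
using System.Text.Json;
using System.Threading.Tasks;
using Azure.AI.OpenAI;

// Orchestrates the conversation: the model either answers directly or asks us
// to search, in which case we call Bing and send the results back to the model.
public static async Task<string> Orchestrate(string userQuestion)
{
    var messages = new List<ChatMessage>
    {
        new ChatMessage(ChatRole.System, "You are a helpful assistant. Search the web when you need current information."),
        new ChatMessage(ChatRole.User, userQuestion)
    };

    // First call: let the model decide whether it needs to search.
    ChatMessage responseMessage = await CallChatGPT(messages);

    if (responseMessage.FunctionCall == null)
    {
        // No search needed, the model answered from its own knowledge.
        return responseMessage.Content;
    }

    // The model asked for a search: read the query it generated.
    using JsonDocument arguments = JsonDocument.Parse(responseMessage.FunctionCall.Arguments);
    string query = arguments.RootElement.GetProperty("query").GetString();

    // Call the Bing Web Search API with the model generated query.
    string searchResults = await SearchBing(query);

    // Add the assistant's function call and the search results to the history,
    // then ask the model to answer in natural language.
    messages.Add(responseMessage);
    messages.Add(new ChatMessage(ChatRole.Function, searchResults) { Name = responseMessage.FunctionCall.Name });

    ChatMessage finalMessage = await CallChatGPT(messages);
    return finalMessage.Content;
}
```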

This code is responsible for handling the chat with OpenAI, calling the Bing API, and responding to the user based on the results of the internet search.

Next, let's have a look at the code which calls the Bing API based on the parameters provided by the LLM. Before executing this code, you will need to have created a Bing Web Search API resource in Azure. Here is more information on it: https://learn.microsoft.com/en-us/bing/search-apis/bing-web-search/overview

The Bing Web Search API key can be found in the "Keys and Endpoint" section on the Azure resource:


Here is the code for calling the Bing Search API:
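A minimal sketch of that call could look like this, with the subscription key pasted in place of the placeholder:

```csharp
using System;
using System.Linq;
using System.Net.Http;
using System.Text.Json;
using System.Threading.Tasks;

static readonly HttpClient httpClient = new HttpClient();

// Calls the Bing Web Search REST API and returns the snippets of the top 3 results.
public static async Task<string> SearchBing(string query)
{
    string url = $"https://api.bing.microsoft.com/v7.0/search?q={Uri.EscapeDataString(query)}&count=3";

    using var request = new HttpRequestMessage(HttpMethod.Get, url);
    request.Headers.Add("Ocp-Apim-Subscription-Key", "<your-bing-search-api-key>");

    using HttpResponseMessage response = await httpClient.SendAsync(request);
    response.EnsureSuccessStatusCode();

    using JsonDocument json = JsonDocument.Parse(await response.Content.ReadAsStringAsync());

    // The web results are under webPages.value; each result has a "snippet" property.
    var snippets = json.RootElement
        .GetProperty("webPages")
        .GetProperty("value")
        .EnumerateArray()
        .Select(result => result.GetProperty("snippet").GetString());

    // Combine the snippets into a single string to send back to the LLM.
    return string.Join(Environment.NewLine, snippets);
}
```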

In this code, we are calling the Bing Web Search REST API to get results based on the search query created by the LLM. Once the top 3 results are fetched, we take their text snippets, combine them and send them back to the LLM.

We are only using the search result snippets to keep this demo simple. In production, you would ideally take the URL of each search result and fetch the content of the page itself.

Finally, let's have a look at our CallChatGPT function, which is responsible for talking to the OpenAI Chat API:
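A simplified version of that function could look like this, with placeholders for the Azure OpenAI endpoint, key and deployment name:

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using Azure;
using Azure.AI.OpenAI;

// Sends the chat history (plus our search function definition) to the
// Azure OpenAI Chat Completions API and returns the assistant's message.
public static async Task<ChatMessage> CallChatGPT(List<ChatMessage> messages)
{
    var client = new OpenAIClient(
        new Uri("https://<your-resource>.openai.azure.com/"),
        new AzureKeyCredential("<your-azure-openai-key>"));

    // Make the search function available to the model on every call.
    var options = new ChatCompletionsOptions
    {
        Functions = { searchFunction }
    };

    foreach (ChatMessage message in messages)
    {
        options.Messages.Add(message);
    }

    // With the 1.0.0-beta.6 SDK the deployment name is passed to GetChatCompletionsAsync.
    Response<ChatCompletions> response =
        await client.GetChatCompletionsAsync("<your-gpt-deployment-name>", options);

    return response.Value.Choices[0].Message;
}
```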
This code makes the OpenAI function definition available in our Chat API calls. The user's question is sent to the Chat API to determine if the function needs to be called. The CallChatGPT function is called again after the response from the Bing Web Search API is fetched; at that point, the messages include the search results, which the model uses to generate an output in natural language.

This way, we can use OpenAI function calling together with the Bing Web Search API to connect our chat bot to the internet!

Thursday 19 October 2023

Chat with your user directory using OpenAI functions and Microsoft Graph

Ever since OpenAI function calling was released, I have been incredibly fascinated by it. To me, it is as big a game changer as ChatGPT itself. 

With function calling, we are no longer limited by the data which was used to train the Large Language Model (LLM). We can call out to external APIs, protected company data and other business-specific APIs, and use that data to supplement the responses from the LLM.

To know more about function calling specifically with the Azure OpenAI service, check out the Microsoft docs: https://learn.microsoft.com/en-us/azure/ai-services/openai/how-to/function-calling

In this post, let's have a look at how we can leverage OpenAI function calling to chat with our user directory and search for users in natural language. To make this possible we will use the Microsoft Graph to do the heavy lifting. 

This is what we want to achieve: the user asks a question about the people directory in natural language, the LLM transforms the question into a query which Microsoft Graph understands, and then transforms the response from Microsoft Graph back into natural language.


On a high level, our approach can be summarised as follows:

1. Define the OpenAI functions and make them available to the LLM 


2. During the course of the chat, if the LLM determines that it needs to call our function to respond to the user, it will respond with the function name along with the parameters to be sent to the function.

3. Call the Microsoft Graph user search API based on the parameters provided by the LLM.

4. Send the results returned from the Microsoft Graph back to the LLM to generate a response in natural language.

Alright, let's now look at the code. In this code sample I have used the following NuGet packages:

https://www.nuget.org/packages/Azure.AI.OpenAI/1.0.0-beta.6/

https://www.nuget.org/packages/Microsoft.Graph/5.30.0/

https://www.nuget.org/packages/Azure.Identity/1.10.2/

The very first thing we will look at is our function definition:
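A sketch of such a definition could look like this. The enum values other than "Engineering" and "New York" are just illustrative; in practice they would come from your own directory:

```csharp
using System.Text.Json;
using Azure.AI.OpenAI;

// Tells the model that it can search the user directory. The enums constrain
// the values the model is allowed to use for officeLocation and department.
static readonly FunctionDefinition searchUsersFunction = new FunctionDefinition
{
    Name = "search_users",
    Description = "Searches for users in the organisation's user directory",
    Parameters = BinaryData.FromObjectAsJson(new
    {
        Type = "object",
        Properties = new
        {
            OfficeLocation = new
            {
                Type = "string",
                Description = "The office location of the user",
                Enum = new[] { "New York", "London" }
            },
            Department = new
            {
                Type = "string",
                Description = "The department of the user",
                Enum = new[] { "Engineering", "Marketing", "Sales" }
            },
            JobTitle = new
            {
                Type = "string",
                Description = "The job title of the user"
            }
        }
    },
    new JsonSerializerOptions { PropertyNamingPolicy = JsonNamingPolicy.CamelCase })
};
```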

In this function we are informing the LLM that if it needs to search for users as part of providing a response, it can call this function. The function name will be returned in the response and the relevant parameters will be provided as well.

The enums in the officeLocation and department parameters will instruct the LLM to only return those values even if the user asks a slightly different variation in their question. We can see an example of this in the gif above. Even though the question asked contains words like "devs" and "NY", the LLM is able to determine and use the terms "Engineering" and "New York" instead.

Next, let's see how our orchestrator looks. I have added comments to each line where relevant:
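Sketched out, the orchestrator flow could look something like this. The CallChatGPT and SearchUsers helpers used here are the ones covered later in this post:

```csharp
using System.Collections.Generic;
using System.Text.Json;
using System.Threading.Tasks;
using Azure.AI.OpenAI;

// Orchestrates the conversation: the model either answers directly or asks us
// to search the directory, in which case we call the Microsoft Graph and send
// the results back to the model.
public static async Task<string> Orchestrate(string userQuestion)
{
    var messages = new List<ChatMessage>
    {
        new ChatMessage(ChatRole.System, "You are a helpful assistant. Use the search_users function to look up people in the directory."),
        new ChatMessage(ChatRole.User, userQuestion)
    };

    // First call: let the model decide whether it needs to search the directory.
    ChatMessage responseMessage = await CallChatGPT(messages);

    if (responseMessage.FunctionCall == null)
    {
        return responseMessage.Content;
    }

    // The model asked for a search: read the parameters it generated.
    var parameters = JsonSerializer.Deserialize<Dictionary<string, string>>(responseMessage.FunctionCall.Arguments);

    // Call the Microsoft Graph with the parameters provided by the model.
    string users = await SearchUsers(parameters);

    // Add the assistant's function call and the Graph results to the history,
    // then ask the model to answer in natural language.
    messages.Add(responseMessage);
    messages.Add(new ChatMessage(ChatRole.Function, users) { Name = responseMessage.FunctionCall.Name });

    ChatMessage finalMessage = await CallChatGPT(messages);
    return finalMessage.Content;
}
```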

There is a lot to unpack here as this function is the one which does the heavy lifting. This code is responsible for handling the chat with OpenAI, calling the Microsoft Graph, and responding to the user based on the response from the Graph.

Next, let's have a look at the code which calls the Microsoft Graph based on the parameters provided by the LLM. 

Before executing this code, you will need to have created an App registration with a clientId and clientSecret. Here is how to do that: https://learn.microsoft.com/en-us/azure/active-directory/develop/quickstart-register-app

Since we are calling the Microsoft Graph /users endpoint with application permissions, the app registration will need a minimum of the User.Read.All application permission granted.
https://learn.microsoft.com/en-us/graph/api/user-list?view=graph-rest-1.0&tabs=http 
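A minimal sketch of that call could look like this; it filters the /users endpoint on whichever properties the model supplied, and uses placeholder values for the tenant id, client id and client secret:

```csharp
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using Azure.Identity;
using Microsoft.Graph;

// Searches the directory for users matching the parameters generated by the LLM.
public static async Task<string> SearchUsers(Dictionary<string, string> parameters)
{
    var credential = new ClientSecretCredential("<tenant-id>", "<client-id>", "<client-secret>");
    var graphClient = new GraphServiceClient(credential);

    // Build a $filter clause from whichever parameters the model provided,
    // e.g. department eq 'Engineering' and officeLocation eq 'New York'.
    var clauses = new List<string>();
    if (parameters.TryGetValue("officeLocation", out var office) && !string.IsNullOrEmpty(office))
        clauses.Add($"officeLocation eq '{office}'");
    if (parameters.TryGetValue("department", out var department) && !string.IsNullOrEmpty(department))
        clauses.Add($"department eq '{department}'");
    if (parameters.TryGetValue("jobTitle", out var jobTitle) && !string.IsNullOrEmpty(jobTitle))
        clauses.Add($"jobTitle eq '{jobTitle}'");

    var users = await graphClient.Users.GetAsync(requestConfiguration =>
    {
        requestConfiguration.QueryParameters.Filter = string.Join(" and ", clauses);
        requestConfiguration.QueryParameters.Select =
            new[] { "displayName", "department", "officeLocation", "jobTitle" };
    });

    // Concatenate the display names into a single string to send back to the LLM.
    return string.Join(", ", users.Value.Select(user => user.DisplayName));
}
```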

This code gets the parameters sent by the LLM and uses the Microsoft Graph .NET SDK to query the /users endpoint and fetch the users based on the officeLocation, department or jobTitle properties.

Once the users are returned, their displayName values are concatenated into a string and returned to the orchestrator function so they can be sent back to the LLM.

Finally, let's have a look at our CallChatGPT function, which is responsible for talking to the OpenAI Chat API:
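A simplified version could look like this, again with placeholders for the Azure OpenAI endpoint, key and deployment name:

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using Azure;
using Azure.AI.OpenAI;

// Sends the chat history (plus the search_users function definition) to the
// Azure OpenAI Chat Completions API and returns the assistant's message.
public static async Task<ChatMessage> CallChatGPT(List<ChatMessage> messages)
{
    var client = new OpenAIClient(
        new Uri("https://<your-resource>.openai.azure.com/"),
        new AzureKeyCredential("<your-azure-openai-key>"));

    // Make the user search function available to the model on every call.
    var options = new ChatCompletionsOptions
    {
        Functions = { searchUsersFunction }
    };

    foreach (ChatMessage message in messages)
    {
        options.Messages.Add(message);
    }

    Response<ChatCompletions> response =
        await client.GetChatCompletionsAsync("<your-gpt-deployment-name>", options);

    return response.Value.Choices[0].Message;
}
```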

This code makes the OpenAI function definition available in our Chat API calls. The user's question is sent to the Chat API to determine if the function needs to be called. The CallChatGPT function is called again after the response from the Microsoft Graph is fetched; at that point, the messages include the details fetched from the Graph, which the model uses to generate an output in natural language.

This way, we can use OpenAI function calling together with the Microsoft Graph API to chat with your user directory.