Sunday 26 November 2023
Manage Azure OpenAI Service using the Azure CLI
Thursday 16 November 2023
Teams tab fails to load in the new Microsoft Teams Desktop client
The new Microsoft Teams Desktop client was recently made generally available for Windows and Mac. The good news is that the new client provides feature parity for third-party apps like Focusworks AI, giving customers a choice of using their preferred Teams client to access the apps.
However, if you have a custom-built Microsoft Teams tab or a task module as part of your solution and find that it fails to load in the new Microsoft Teams client, there might be a specific reason for it.
And since there is no way to invoke the Developer tools in the new Teams desktop client yet (November 2023), the experience can get a bit frustrating.
In my case, I have a custom React/TypeScript-based tab which uses the @microsoft/teams-js library to interact with Teams.
Since Teams tabs are just HTML pages, we need to make sure that the page is being loaded inside Teams before continuing to execute the code. To do that, we can use the context.app.host.name property and check that its value is "teams" before moving ahead.
However, with the new desktop client my tab was failing to load. After a bit of digging around, I realised that the new Teams desktop client reports an entirely different host name value, "teamsModern", as mentioned here: https://learn.microsoft.com/en-us/javascript/api/%40microsoft/teams-js/hostname?view=msteams-client-js-latest
So changing my code to include the new value as well worked!
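The updated check could look something like this minimal sketch, assuming the app and HostName APIs from @microsoft/teams-js v2; the helper name is illustrative:

```typescript
import { app, HostName } from "@microsoft/teams-js";

// Returns true when the page is running inside a Teams host, covering both
// the classic client ("teams") and the new desktop client ("teamsModern").
async function isRunningInTeams(): Promise<boolean> {
  await app.initialize();
  const context = await app.getContext();

  return (
    context.app.host.name === HostName.teams ||
    context.app.host.name === HostName.teamsModern
  );
}
```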
Hope this saves you some debugging time!
Tuesday 24 October 2023
Connect an OpenAI chat bot to the internet using Bing Search API
In the previous post, we saw what OpenAI function calling is and how to use it to chat with your organization's user directory using Microsoft Graph. Please have a look at the article here: Chat with your user directory using OpenAI functions and Microsoft Graph
In this post, we will implement function calling for a very common scenario of augmenting the large language model's responses with data fetched from internet search.
Since the Large Language Model (LLM) was trained with data only up to a certain date, we cannot talk to it about events which happened after that date. To solve this, we will use OpenAI function calling to call out to the Bing Search API and then augment the LLM's responses with the data returned from the internet search.
This pattern is called Retrieval Augmented Generation or RAG.
Let's now look at the code to achieve this. In this code sample I have used the following nuget packages:
https://www.nuget.org/packages/Azure.AI.OpenAI/1.0.0-beta.6/
https://www.nuget.org/packages/Azure.Identity/1.10.2/
The very first thing we will look at is our function definition, which informs the model that it can call out to an external search API to look up information:
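A minimal sketch of such a definition, assuming Azure.AI.OpenAI 1.0.0-beta.6; the function name "search_web" and its "query" parameter are illustrative and may differ from the original sample:

```csharp
using System.Text.Json;
using Azure.AI.OpenAI;

// Hypothetical function definition: name and parameters are illustrative.
var searchFunction = new FunctionDefinition
{
    Name = "search_web",
    Description = "Searches the internet using the Bing Web Search API and returns the top results for a query.",
    Parameters = BinaryData.FromObjectAsJson(
        new
        {
            Type = "object",
            Properties = new
            {
                Query = new
                {
                    Type = "string",
                    Description = "The search query to send to the Bing Web Search API"
                }
            },
            Required = new[] { "query" }
        },
        new JsonSerializerOptions { PropertyNamingPolicy = JsonNamingPolicy.CamelCase })
};
```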
In this function, we are informing the LLM that if it needs to search the internet as part of providing its responses, it can call this function. The function name will be returned in the model's response, along with the relevant parameter values.
The Bing Web Search API key can be found in the "Keys and Endpoint" section of the Bing Search resource in the Azure portal.
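Putting it together, the orchestration could look something like this minimal sketch, again assuming Azure.AI.OpenAI 1.0.0-beta.6; the endpoint, keys, deployment name and the SearchBingAsync helper are placeholders rather than the original post's code:

```csharp
using System.Linq;
using System.Net.Http;
using System.Text.Json;
using Azure;
using Azure.AI.OpenAI;

// The endpoint, keys and deployment name below are placeholders.
var openAIClient = new OpenAIClient(
    new Uri("https://<your-resource>.openai.azure.com/"),
    new AzureKeyCredential("<azure-openai-key>"));

var chatOptions = new ChatCompletionsOptions
{
    Messages = { new ChatMessage(ChatRole.User, "What happened in the news today?") },
    Functions = { searchFunction } // the FunctionDefinition shown earlier
};

Response<ChatCompletions> response =
    await openAIClient.GetChatCompletionsAsync("<deployment-name>", chatOptions);
ChatMessage assistantMessage = response.Value.Choices[0].Message;

// If the model decided to call our function, run the Bing search and send the
// results back so that the final answer is grounded in them (RAG).
if (assistantMessage.FunctionCall != null)
{
    using var arguments = JsonDocument.Parse(assistantMessage.FunctionCall.Arguments);
    string query = arguments.RootElement.GetProperty("query").GetString()!;

    string searchResults = await SearchBingAsync(query);

    chatOptions.Messages.Add(assistantMessage);
    chatOptions.Messages.Add(new ChatMessage(ChatRole.Function, searchResults)
    {
        Name = assistantMessage.FunctionCall.Name
    });

    response = await openAIClient.GetChatCompletionsAsync("<deployment-name>", chatOptions);
}

Console.WriteLine(response.Value.Choices[0].Message.Content);

// Calls the Bing Web Search API and returns the top result snippets as plain text.
static async Task<string> SearchBingAsync(string query)
{
    using var http = new HttpClient();
    http.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", "<bing-search-key>");

    string url = $"https://api.bing.microsoft.com/v7.0/search?q={Uri.EscapeDataString(query)}&count=5";
    using JsonDocument doc = JsonDocument.Parse(await http.GetStringAsync(url));

    if (!doc.RootElement.TryGetProperty("webPages", out JsonElement webPages))
    {
        return "No results found.";
    }

    var snippets = webPages.GetProperty("value").EnumerateArray()
        .Select(r => $"{r.GetProperty("name").GetString()}: {r.GetProperty("snippet").GetString()}");

    return string.Join(Environment.NewLine, snippets);
}
```

The key step is adding a ChatRole.Function message containing the search results before asking the model for its final answer.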
This way, we can use OpenAI function calling together with the Bing Web Search API to connect our chat bot to the internet!
Thursday 19 October 2023
Chat with your user directory using OpenAI functions and Microsoft Graph
Ever since OpenAI function calling was released, I have been incredibly fascinated by it. To me, it is as big a game changer as ChatGPT itself.
With function calling, we are no longer limited by the data which was used to train the Large Language Model (LLM). We can call out to external APIs, protected company data and other business-specific APIs, and use that data to supplement the responses from the LLM.
To know more about function calling specifically with the Azure OpenAI service, check out the Microsoft docs: https://learn.microsoft.com/en-us/azure/ai-services/openai/how-to/function-calling
In this post, let's have a look at how we can leverage OpenAI function calling to chat with our user directory and search for users in natural language. To make this possible we will use the Microsoft Graph to do the heavy lifting.
This is what we want to achieve: the user asks a question about the people directory in natural language, the LLM transforms the question into a request the Microsoft Graph understands, and the LLM then transforms the response from the Microsoft Graph back into natural language.
On a high level, our approach can be summarised as follows:
1. Define the OpenAI functions and make them available to the LLM.
2. During the course of the chat, if the LLM decides that it needs to call our function to respond to the user, it will respond with the function name along with the parameters to be sent to the function.
3. Call the Microsoft Graph user search API based on the parameters provided by the LLM.
4. Send the results returned from the Microsoft Graph back to the LLM to generate a response in natural language.
Alright, let's now look at the code. In this code sample I have used the following nuget packages:
https://www.nuget.org/packages/Azure.AI.OpenAI/1.0.0-beta.6/
https://www.nuget.org/packages/Microsoft.Graph/5.30.0/
https://www.nuget.org/packages/Azure.Identity/1.10.2/
The very first thing we will look at is our function definition:
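A minimal sketch of such a definition, assuming Azure.AI.OpenAI 1.0.0-beta.6; the function name "search_users" and the exact enum values are illustrative:

```csharp
using System.Text.Json;
using Azure.AI.OpenAI;

// Hypothetical function definition: name, parameters and enum values are illustrative.
var searchUsersFunction = new FunctionDefinition
{
    Name = "search_users",
    Description = "Searches the organisation's user directory by office location, department or job title.",
    Parameters = BinaryData.FromObjectAsJson(
        new
        {
            Type = "object",
            Properties = new
            {
                OfficeLocation = new
                {
                    Type = "string",
                    Description = "The office location of the user",
                    Enum = new[] { "New York", "London" }
                },
                Department = new
                {
                    Type = "string",
                    Description = "The department of the user",
                    Enum = new[] { "Engineering", "Marketing", "Sales" }
                },
                JobTitle = new
                {
                    Type = "string",
                    Description = "The job title of the user"
                }
            }
        },
        new JsonSerializerOptions { PropertyNamingPolicy = JsonNamingPolicy.CamelCase })
};
```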
In this function we are informing the LLM that if it needs to search for users as part of providing the responses, it can call this function. The function name will be returned in the response and the relevant parameters will be provided as well.
The enums in the officeLocation and department parameters will instruct the LLM to only return those values, even if the user asks a slightly different variation of the question. We can see an example of this in the gif above. Even if the question asked contains words like "devs" and "NY", the LLM is able to determine and use the terms "Engineering" and "New York" instead.
Next, let's see how our orchestrator looks. I have added comments to each line where relevant:
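A minimal sketch of what such an orchestrator could look like; the CallChatGPT and Graph helpers (the latter called SearchUsersAsync here) are sketched further below, and the parameter names assume the function definition above:

```csharp
using System.Text.Json;
using Azure.AI.OpenAI;

public static async Task<string> ChatWithUserDirectory(string userQuestion)
{
    var messages = new List<ChatMessage>
    {
        new ChatMessage(ChatRole.System, "You are an assistant that answers questions about the company's user directory."),
        new ChatMessage(ChatRole.User, userQuestion)
    };

    // First call: the model decides whether it needs to call our function.
    ChatMessage assistantMessage = await CallChatGPT(messages);

    if (assistantMessage.FunctionCall == null)
    {
        // The model answered directly, no directory lookup was needed.
        return assistantMessage.Content;
    }

    // The model wants to call our function: parse the arguments it produced.
    using var arguments = JsonDocument.Parse(assistantMessage.FunctionCall.Arguments);
    string? officeLocation = arguments.RootElement.TryGetProperty("officeLocation", out var ol) ? ol.GetString() : null;
    string? department = arguments.RootElement.TryGetProperty("department", out var dep) ? dep.GetString() : null;
    string? jobTitle = arguments.RootElement.TryGetProperty("jobTitle", out var jt) ? jt.GetString() : null;

    // Call the Microsoft Graph with the parameters chosen by the model.
    string graphResult = await SearchUsersAsync(officeLocation, department, jobTitle);

    // Second call: send the Graph results back to the model so it can
    // produce a natural language answer grounded in that data.
    messages.Add(assistantMessage);
    messages.Add(new ChatMessage(ChatRole.Function, graphResult)
    {
        Name = assistantMessage.FunctionCall.Name
    });

    ChatMessage finalMessage = await CallChatGPT(messages);
    return finalMessage.Content;
}
```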
There is a lot to unpack here as this function is the one which does the heavy lifting. This code is responsible for handling the chat with OpenAI, calling the MS Graph and also responding back to the user based on the response from the Graph.
Next, let's have a look at the code which calls the Microsoft Graph based on the parameters provided by the LLM.
Before executing this code, you will need to have created an App registration with a clientId and clientSecret. Here is how to do that: https://learn.microsoft.com/en-us/azure/active-directory/develop/quickstart-register-app
Since we are calling the Microsoft Graph /users endpoint with application permissions, the app registration will need a minimum of the User.Read.All application permission granted.
https://learn.microsoft.com/en-us/graph/api/user-list?view=graph-rest-1.0&tabs=http
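A minimal sketch of the Graph lookup, assuming the Microsoft.Graph 5.x SDK and client credentials from Azure.Identity; the tenant, client and secret values are placeholders:

```csharp
using System.Linq;
using Azure.Identity;
using Microsoft.Graph;

public static async Task<string> SearchUsersAsync(string? officeLocation, string? department, string? jobTitle)
{
    // The tenant id, client id and client secret come from the app registration.
    var credential = new ClientSecretCredential("<tenant-id>", "<client-id>", "<client-secret>");
    var graphClient = new GraphServiceClient(credential, new[] { "https://graph.microsoft.com/.default" });

    // Build an OData $filter from whichever parameters the model supplied.
    var filters = new List<string>();
    if (!string.IsNullOrEmpty(officeLocation)) filters.Add($"officeLocation eq '{officeLocation}'");
    if (!string.IsNullOrEmpty(department)) filters.Add($"department eq '{department}'");
    if (!string.IsNullOrEmpty(jobTitle)) filters.Add($"jobTitle eq '{jobTitle}'");

    var users = await graphClient.Users.GetAsync(requestConfiguration =>
    {
        requestConfiguration.QueryParameters.Filter = string.Join(" and ", filters);
        requestConfiguration.QueryParameters.Select = new[] { "displayName" };
        // Filtering on these properties is an advanced query, so the
        // ConsistencyLevel header and $count parameter are required.
        requestConfiguration.QueryParameters.Count = true;
        requestConfiguration.Headers.Add("ConsistencyLevel", "eventual");
    });

    // Concatenate the display names so the result can be sent back to the LLM.
    return string.Join(", ", users?.Value?.Select(u => u.DisplayName) ?? Enumerable.Empty<string?>());
}
```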
This code gets the parameters sent by the LLM and uses the Microsoft Graph .NET SDK to call the /users endpoint, fetching users based on the officeLocation, department or jobTitle properties.
Once the users are returned, their displayName values are concatenated into a string and returned to the orchestrator function so that they can be sent back to the LLM.
Finally, let's have a look at our CallChatGPT function, which is responsible for talking to the OpenAI Chat API.
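A minimal sketch of CallChatGPT, assuming Azure.AI.OpenAI 1.0.0-beta.6; the endpoint, key and deployment name are placeholders, and searchUsersFunction is the FunctionDefinition shown earlier:

```csharp
using Azure;
using Azure.AI.OpenAI;

public static async Task<ChatMessage> CallChatGPT(List<ChatMessage> messages)
{
    var client = new OpenAIClient(
        new Uri("https://<your-resource>.openai.azure.com/"),
        new AzureKeyCredential("<azure-openai-key>"));

    var options = new ChatCompletionsOptions
    {
        Temperature = 0,
        // Make our function available so the model can choose to call it.
        Functions = { searchUsersFunction }
    };

    foreach (var message in messages)
    {
        options.Messages.Add(message);
    }

    Response<ChatCompletions> response =
        await client.GetChatCompletionsAsync("<deployment-name>", options);

    // The returned message either contains the answer or a FunctionCall with
    // the name and arguments of the function the model wants to invoke.
    return response.Value.Choices[0].Message;
}
```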
This function defines the OpenAI function which will be included in our Chat API calls. The user's question is sent to the API to determine whether the function needs to be called. CallChatGPT is also called again after the response from the Microsoft Graph is fetched; at that point, the chat messages contain the details fetched from the Graph, which the model uses to generate an output in natural language.
This way, we can use OpenAI function calling together with the Microsoft Graph API to chat with our user directory.