Calling an Azure Function App from the SharePoint Framework

This one is a long one, but if you want to know how to configure your environment so that you can securely call functions in an Azure Function App protected by Entra ID, then read on.

There are lots of moving parts here and figuring all of this out was not a simple exercise and so it is my hope that if you face the same challenge, this post will help you on your way.

Background

In my previous post, I outlined my submission for the 2026 SharePoint Hackathon, K-Docs Publish, and promised to follow up with some detail on how I built it.

Sadly, I didn’t win the Hackathon, nor did I even place, but never mind. ☹

Actually, that’s not strictly true. I didn’t come anywhere with K-Docs Publish, but another solution I built for the Parliamentary Library of South Australia did do much better. Not only did the team bag the honorary award for the most creative video submission, but the project came runner-up in 2 categories (Best Library Solution and Most Innovative SPFx Solution), so I guess I shouldn’t be too downhearted. You’ll not see my name anywhere because the rules of the competition were that an individual could only be in a single project team – it seems I might have chosen the wrong team!

Anyway, with K-Docs Publish I gave it my best shot. I guess it wasn’t what the judges were looking for, or (more likely) other submissions were simply just better.

However, I still strongly believe that K-Docs Publish has legs and undeterred I shall be pushing forward with this solution to get it into a polished state so that it can be released to the Microsoft Marketplace in the next few weeks.

If you haven’t read my previous post, K-Docs Publish is a solution to build a knowledge base in SharePoint Online. The basic idea is for organisations to retain the “single source of truth” version of key documents, which might be policies, procedures, best practices, product specs or the like, in their original MS Word format but publish them to consumer users, as wiki-like web pages.

The only problem is that modern SharePoint no longer has a wiki library or publishing pages that can host HTML. This means we need to build 2 things:

  • An SPFx Command Extension that allows users to select source Word documents from a SharePoint library and convert them into HTML to be saved somewhere (in our knowledge base).
  • A viewer solution, which will be an SPFx Web Part to be deployed as a Single Web Part App Page that can present the HTML in a useful and engaging way for consumer users.

This is what the current version of K-Docs Publish does now.

But I wanted more. I wanted to provide a UI that could act as a cross-article browser, which meant building some kind of navigation structure. It is often the case that there is a hierarchical relationship between articles, so we needed something like a tree view navigator.

That’s enough of a challenge, but it was clear from the Hackathon blurb that Microsoft was especially keen to see what the community could do with their AI services, which includes Copilot obviously.

Well, a knowledge base solution is a natural and obvious fit for AI. It could be argued that a useful AI assistant is what would transform an information repository into a knowledge base. That’s actually why I picked K-Docs Publish as my Hackathon entry in preference to a dozen other candidate projects I might have chosen.

I suspect that I could have built K-Docs Publish and bolted on a Copilot agent to provide the Retrieval Augmented Generation (RAG) capability I was after, but I wasn’t keen to go that route, for a number of reasons:

  • Not everyone is licensed for Copilot and it’s not a cheap option either.
  • I envisaged scenarios in which customers would want to share their knowledge bases with trusted guest users and partner organisations, who might not be licensed for Copilot.
  • Copilot is essentially a wrapper client for AI services hosted in Azure and I figured that for what I wanted to build, it would be an unnecessary middle layer that might enforce certain design decisions (like the choice of AI model) that I (or my customers) wouldn’t be entirely happy with.
  • I wanted to learn how to work with Azure AI services.

Groundwork

After some research, trial and error (yes I used Copilot for that), I discovered that you can’t call the Azure AI services directly any more. It seems that you used to be able to do this, but those doors are closing (if not closed already).

The way to access Azure AI services is to provision them in the Azure AI Foundry, which provides you with endpoints. As I have just said, you cannot call these directly from an SPFx solution (or any other client for that matter); rather, we must call them from an Azure Function App, which you can then call from SPFx.

The Function App acts as a kind of broker in the middle. Your SPFx solution calls functions in a Function App which relay information to your AI services and get sent back the response which is subsequently returned to your SPFx client.

It works as shown below:

  1. Your SPFx solution (Web Part, Command Extension etc.) will trigger a call to a function defined in the Function App, that you must set up in Azure, by calling a function endpoint and passing in an appropriate data package. In the case of K-Docs Publish, when it calls a RAG AI model, the data package will include:
    • User input (what the user types as a question or response).
    • The discussion thread between the user and the AI Model (or dialog history if you prefer) which is needed for the RAG model to understand the ongoing context.
    • Grounding instructions, where you tell the model who it is, its purpose in life and how you expect it to behave.
    • Format instructions, where you tell the AI model the format of the output you expect such as an action list, a table summary of key points or header sections with dot points etc. in Markdown or HTML.
  2. The function inside the Function App, which gets called by your SPFx client, could potentially process the data it was served and act as an additional logic layer in the processing pipeline.
  3. However, I don’t like this idea because it means you have business logic in 2 places, namely inside the SPFx solution and inside the function. I guess there might be good reasons why you would want to implement additional logic in the function, but for K-Docs Publish I decided not to do this, and so my function is a simple pass-through: it’s a data conduit to the AI model and nothing more.
  4. The AI model does its thing and processes the received input.
  5. And then sends back the response to the calling function. Again, the function might potentially process the response in some way.
  6. However, I am really keen on the idea that the function should do nothing more than pass on what has been received to the SPFx solution. This way, not only does the business logic reside in one place (the SPFx solution), it makes the code inside the function as clean and simple as it can be, and so easy to maintain.
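To make that concrete, the pass-through described in steps 2 to 6 can be sketched as below. This is an illustrative sketch, not the actual K-Docs Publish code: the payload shape and all the names are invented for the example.

```typescript
// Illustrative payload shape matching the data package described in step 1.
interface RagPayload {
  userInput: string;   // what the user typed
  history: string[];   // the ongoing discussion thread, for context
  grounding: string;   // who the model is and how it should behave
  format: string;      // e.g. "a table summary of key points in Markdown"
}

// Whatever mechanism actually calls the AI model endpoint.
type Forwarder = (payload: RagPayload) => Promise<string>;

// The relay itself: a pure conduit with no business logic of its own.
const relay = (payload: RagPayload, forward: Forwarder): Promise<string> =>
  forward(payload);
```

Keeping the function this thin is exactly what makes it cheap to maintain; all the interesting logic stays in the SPFx solution.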

I should flag something here. It is possible to create a Function App which allows the functions it defines to be called anonymously. Whilst building an unauthenticated endpoint for testing and prototyping is probably acceptable, we shouldn’t do that for a production environment. Why? Because your endpoint can be seen in the browser console tools, and that means that malicious code might hijack it for nefarious purposes, such as a denial-of-service attack, or just to rack up a huge and unexpected AI processing bill for you by spoofing!

So, we need to protect the Function App, and the way to do that is by making sure it can only be called when properly authenticated (rather than anonymously).

This then becomes the first leg of our journey, to configure and deploy an Azure Function App that can be authenticated within our tenancy, so that calls to function endpoints are trusted. But to do that we first need an App Registration in Entra ID.

Creating an App Registration in Entra ID

These are the steps I went through.

From Azure go to Microsoft Entra ID and then select the App registration blade and then the New registration action button.

You must provide a stable and, ideally, descriptive name for the registration.

Now, and this is important. This name does not affect token validation, but it does matter for the permission requests that will be made by the SPFx client application.

In general you can name the app registration anything you like, but in practice, if the registration is going to work from an SPFx client you cannot, because we are effectively creating a dependency between the package-solution.json file of the SPFx solution and the application registration.

The permission request set up in the package-solution.json file is not a variable, it is a fixed value, that must be specified when the SPFx solution is built.

So, if you are building your own application you will want to choose a name that is relevant to the context of your SPFx solution but if you are creating this app registration so that a 3rd party SPFx solution (like K-Docs Publish) can make trusted calls, your app registration needs to use the exact name that is specified as the resource in the package-solution.json file.

To put it as simply as I can, the highlighted values below need to match exactly (including case).

If they do not, your SPFx solution will not be trusted to make the API call, as Azure won’t let you get past the sentry that we are about to set up to guard your entry to the Azure Function App.

If you are setting this up for K-Docs-Publish, the resource element in the package-solution.json file is named K-Docs-Publish, and so the App Registration also needs to be named K-Docs-Publish.

Back in the Entra ID wizard, we can (generally) leave the rest of the settings with their default values and simply click the Register button to create the App Registration. The key registration details will then be made available for you, as shown in the redacted screenshot below:

The App Registration now exists but is not yet an API that we can call. To resolve that, go to the Expose an API blade and click on the Add link next to the Application ID URI label.

From the slide-out panel, click the Save button to generate the URI. The URI will then be shown in the UI.

The Application ID URI, also called identifier URI, is a globally unique URI used to identify this web API. This URI is the prefix for scopes in the OAuth protocol. You can either use the default value in the form of api://<application-client-id> or specify a more readable URI.

Next, we need to create a scope for the App Registration.

Still within the Expose an API blade click the Add a scope action button, as highlighted in orange above.

From the slide-out panel you will need to add the appropriate scope details. For K-Docs-Publish, you should specify something like what is shown below:

Mostly what you specify here is up to you, with the one critical exception of the Scope name value. Like the app registration name, the Scope name must match exactly with the resource property in the package-solution.json file of your SPFx solution.

Just to emphasise: it is essential that whatever is specified in the resource and scope elements of the webApiPermissionRequests entry in the package-solution.json file of your SPFx solution matches exactly the App Registration name and the Scope name that you set up in Entra ID, respectively.

The values for setting up K-Docs Publish are shown below:

Click the Add scope button in the panel to add the scope as specified and the scope will be listed in the Expose an API blade.

Next, gather and record the key identifiers required for the next step:

You will need the Application (client) ID and the Directory (tenant) ID from the Overview page.

The Application ID URI and Scope URI, from the Expose an API blade

At this point the API identity is fully defined. The next step is to create the Azure Function App.

Creating an Azure Function App

From the Azure Portal homepage search for Function App and you should end up in a page which asks you to select a hosting plan.

There are a few options to choose from, and Copilot recommends the (default) Flex Consumption plan, but it turns out that with this plan I cannot create a simple function hosted in Azure directly; I can only build a function using VS Code (or some other client editor).

Now, I can see why this might be desirable if the function is going to do some heavy processing, but that’s not what I want my functions to do. I want them to simply pass data on to the AI models and send back the response to my SPFx solution. If I go this default route I will end up with a custom solution that I will need to maintain and that’s what I am trying to avoid – I want low maintenance here.

If you are happier using client dev tools to create and manage your functions, go ahead and select the default Flex Consumption plan, but as far as I can tell, the only way to side-step the need to create a client project (in VS Code or whatever) is to not choose the Flex Consumption plan and go for the classic Consumption plan instead.

When you do that, Azure tries to convince you to take the Flex Consumption plan, so you will need to confirm this is what you truly want!

It is what we truly want, so click the Confirm button to let Azure know that you know what you are doing!

On the Basics tab, select your subscription and resource group (or create a new one). For the Function App name you must select a name that is unique within your selected Region. I recommend that you prefix your function app name with your tenancy domain name, so in my case I use Kaboodle360-K-Docs-Publish.

I chose Windows OS over Linux, and for Runtime stack I chose Node.js because it is most closely aligned with SPFx development (TypeScript, React etc.).

For the Version, I selected the latest Long Term Support (LTS) version, which at the time of writing was 22 LTS, and for the Instance size I just ran with the default 2048 MB.

For the Region, my wizard bizarrely defaulted to Canada Central, but I obviously want to choose the same location that hosts my SharePoint Tenancy, so as to minimise latency.

My settings are shown below:

Copilot recommended that I just use the default settings for the rest of the configuration, so that’s what I did. I just went to the Review and create button and then the Create button to provision my Function App.

It may take a few minutes before everything is set up. The screen will display “Deployment in progress” before a “Your deployment is complete” screen is shown.

Click on the Go to resource button to access the Overview blade of the new Function App.

The highlighted option, to Create in Azure portal, is the option that will be missing if you ran with the Flex Consumption option.

Enable Entra ID Authentication on the Function App

So far, we have an App Registration in Entra ID, and we have a Function App (albeit without any functions yet). The next step is to enable Entra ID Authentication on the Function App and so bind these two services together.

To do that, access the Authentication menu from the Settings blade in the Function App.

Click on the Add identity provider button. That will send you to the Add an identity provider page, where you need to select Microsoft as the Identity provider, and select the option to Provide the details of an existing app registration in the App registration section.

In the Application (client) ID textbox, paste in the Application (client) ID value from the App Registration set up earlier.

Azure has two token formats, and the default one set up for the Issuer URL (highlighted in orange above) is for v1.0 tokens. That’s not going to work for an SPFx solution, so we need to change this to the URL that supports v2.0, which is in the pattern:
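The v2.0 Issuer URL follows this pattern:

```
https://login.microsoftonline.com/<tenant-id>/v2.0
```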

So change the URL so that it uses the unique id of your tenancy.

You also need to set the properties highlighted below from the Permissions tab:

Note you will need to add your Application (client) ID as an entry in the Allowed token audiences section. Setting the option to Allow requests from any entity is probably a good choice here (even though it says Not recommended) as it will allow your functions to potentially be called by different SPFx solutions, which means you won’t have to make configuration changes in Azure if you decide to deploy additional web parts which need to call functions in your Function App.

You cannot leave it at Allow requests only from the application itself, because that’s not what we are doing; we want to call functions from SPFx (and Postman, to test it works before we get there).

The only other setting I changed was the response code for unauthenticated requests, from the default 302 to the more usual 401 for API calls:

Finally click the Add button to finish the configuration step.

Note that if you do not see, or skipped over, the additional checks configuration options when setting up the Microsoft identity provider, you can edit the configuration using the edit icon button.

Granting Permissions for the Function App in Entra ID

We’re not there yet! Now we need to go back to the Entra ID App Registration and grant the necessary permission to the Function App.

Back in Entra ID, select the All applications tab of the App registrations blade.

Then click on the link of your Function App (highlighted above) so that it can be configured.

From here click on the API Permissions link. You will most likely see some configured permissions, like User.Read for Microsoft Graph.

Click the Add a permission action button and, from the slide-out panel, find your API by searching for it in the APIs my organization uses tab.

Click on your API and the panel will stay open, allowing you to search for and select the user_impersonation scope. Select it and click the Add permission button.

The panel will close and you should see the newly added permission. You must now click the Grant admin consent for Kaboodle Software button so that your set up looks like the one below:

But we’re still not done! Click on the Authentication (Preview) link (highlighted above).

I want to be able to test that everything works in Postman before I make the final leap of testing the Function App from an SPFx client solution, and to do that I need to add a Redirect URI.

From the panel select the Mobile and desktop applications option; if you choose a different option it likely won’t work (at least Copilot says it won’t), even though Web might seem a more logical choice.

Then paste in the following URL: https://oauth.pstmn.io/v1/callback

Click the Configure button to add the URI. The panel will close and your configuration should look like below.

If you are not going to test this in Postman you can skip the previous step to add a Redirect URI, but what you can’t skip is setting Allow public client flows to enabled, which is configured using a toggle switch on the Settings tab.

Click the Save button to update the configuration.

Creating a Test Function

Now we have an Entra ID App Registration and a Function App which is bound to it so that calls made to functions in the Function App will be authenticated.

However, we don’t yet have a function, inside the Function App, that we can call.

To add a new function, click the Create in Azure portal button on the Functions tab of the Overview page. Remember, this option likely won’t be available to you if you chose a different hosting option. And sadly you can’t switch options – if you chose the wrong hosting option, the only path is to delete your Function App and start over!

From the Create function panel select the HTTP trigger template.

Click Next and from the Template details tab give the function a meaningful name and set the Authorization level to Anonymous as Entra ID Authentication will enforce identity anyway.

Click Create to provision the function.

Azure kindly provides us with a sample function for testing.
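The generated sample is essentially an echo handler. Paraphrased below in the classic (v3) Node.js programming model – the exact boilerplate Azure gives you varies by runtime version, so treat this as representative rather than verbatim:

```typescript
// Minimal shapes for the parts of the Azure Functions context used here.
interface HttpRequest { query?: { name?: string }; body?: { name?: string }; }
interface Context { res?: { status: number; body: string }; }

// HTTP-triggered function: greets by name if one is supplied, otherwise
// returns a generic success message, much like the Azure-generated sample.
const httpTrigger = async (context: Context, req: HttpRequest): Promise<void> => {
  const name = req.query?.name ?? req.body?.name;
  context.res = {
    status: 200,
    body: name
      ? `Hello, ${name}. This HTTP triggered function executed successfully.`
      : 'This HTTP triggered function executed successfully.'
  };
};
```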

Testing all is well with Postman

I strongly recommend confirming that your simple test function works as expected by pinging it from Postman. If you can’t get it to run under Postman then you’ll likely not be able to get it to run from an SPFx client app – it’s a great way to make sure we have everything wired up correctly.

I recommend the desktop version as configuring the web browser version will give you CORS headaches that are best avoided.

Fire up Postman, add a new GET Request and provide it with the target endpoint of our test function, which in my case is:

Now, you may be wondering where I got this URL from.

Go to the Code + Test tab of the function and then select the Get function URL button.

Review the panel and you will see 3 options.

But which one to choose?

It turns out that all 3 URLs are the same, so it doesn’t matter which you copy (which begs the question why).

This is an authenticated call (that’s the whole point), so for this to work we need to set up Postman to make an authenticated call.

In Postman, click on the Authorization tab and set the Auth Type to be OAuth 2.0 and then set the properties in the Configure New Token section.

  • Token Name: Can be set to whatever you wish.
  • Grant Type: Set to Authorization Code (With PKCE).
  • Callback URL: This is specific to Postman and should be set to https://oauth.pstmn.io/v1/callback. Remember, that’s what we set up as a valid Redirect URI in the App Registration.
  • Auth URL and Access Token URL: We get both of these from the Endpoints panel of the App Registration.
  • Client Secret: You can leave this blank because we didn’t bother to set one up.
  • Code Challenge Method: Set to SHA-256.
  • Code Verifier: Leave blank.
  • Scope: Set to the Scope URI that we set up (api://b2694200-e2f9-4b8f-a748-f2eb2ccae54c/user_impersonation in my case), which you can pick up from the API permissions blade.
  • State: Leave blank.
  • Client Authentication: Set to Send client credentials in body.
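For reference, the v2.0 endpoints follow the pattern below; the Endpoints panel shows the exact values with your Directory (tenant) ID already substituted in:

```
Auth URL:         https://login.microsoftonline.com/<tenant-id>/oauth2/v2.0/authorize
Access Token URL: https://login.microsoftonline.com/<tenant-id>/oauth2/v2.0/token
```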

Once you are confident that everything is set up as it should be, click Get new access token button.

This will throw up a login dialog which you will use to login to your tenancy.

If all goes well, Postman will show you a Use Token button.

Click this to use the access token that Entra ID has just served up when you successfully completed the login.

Finally, click the Send button in Postman to hit the end point of your function and view the results.

All being well, Postman will report a 200 OK status code, and the response expected from the simple test function will appear in the output window for the response body.

Testing it from an SPFx Solution

Nearly there! Using Postman, we have been able to confirm that the App Registration we set up in Entra ID works and that, after being authenticated, we can successfully call a test function in the Azure Function App.

The final piece is to confirm that we can call the same function from a simple SPFx solution (a web part) we can provision as a test harness.

Create a Test Harness Web Part Solution

I’m assuming that you have access to a dev rig and that you know how to create a basic SPFx Web Part using the standard, get-you-started template – I selected a React framework solution because nearly all my projects are React these days.

I removed the junk that comes with the default solution and stripped everything down so that my main web part class looks like this:

And my main React component class looks like this:

This is about as simple as a web part can be. No properties, and all it does is render a single button, which we will later use to call our test function.
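For the record, a minimal sketch of such a stripped-down pair looks something like this (class and interface names are illustrative; your generated scaffolding will differ):

```typescript
import * as React from 'react';
import * as ReactDom from 'react-dom';
import { BaseClientSideWebPart } from '@microsoft/sp-webpart-base';

// The React component: no properties beyond the web part context, and all it
// renders is a single Test button (the click handler comes later).
export interface ITestHarnessProps { context: any; }

export class TestHarness extends React.Component<ITestHarnessProps> {
  public render(): React.ReactElement {
    return <button onClick={() => { /* wire up the function call later */ }}>Test</button>;
  }
}

// The web part class: it simply mounts the component, passing down its context.
export default class TestHarnessWebPart extends BaseClientSideWebPart<{}> {
  public render(): void {
    const element = React.createElement(TestHarness, { context: this.context });
    ReactDom.render(element, this.domElement);
  }
}
```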

The screenshot below shows a web part instance, rendered on a test page.

Update Package-Solution.json

Next we need to add a webApiPermissionRequests element to the config/package-solution.json file.

As mentioned above, this is a hardcoded dependency, where the resource and scope attributes of the webApiPermissionRequests element must match exactly with the Display name of the App Registration that we set up in Entra ID and the scope we defined in the Expose an API blade of the App Registration, respectively.

And so the package-solution.json file now looks like this:
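Using the K-Docs-Publish registration from earlier as the example, the relevant addition has this shape (the other solution properties are omitted for brevity):

```json
{
  "solution": {
    "webApiPermissionRequests": [
      {
        "resource": "K-Docs-Publish",
        "scope": "user_impersonation"
      }
    ]
  }
}
```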

Take great care to make sure these entries match exactly.

Build and Package the Test Harness Solution

We need to do this now so that we can deploy the solution to the App Catalog and then approve the API permission request we have just set up in the package-solution.json file.

Without first deploying the solution we cannot approve the API request and so we cannot test the call to the Azure function as that would fail because no permissions have been approved.

So:

  • gulp clean
  • gulp bundle --ship
  • gulp package-solution --ship

After that you will end up with a solution package (.sppkg file) in your sharepoint/solution folder, which you can then right-click on to open in the File Explorer.

Deploy the Solution to the App Catalog

Open your App Catalog site and drag and drop the solution package from the File Explorer into the Apps for SharePoint gallery, and the Do you trust dialog will open.

There is no reason to deploy this solution globally so keep the skip deployment feature checkbox unchecked.

Note how the dialog tells us that this solution package is requesting API access and that will need to be approved.

Click the Deploy button and you will see the solution package in the gallery.

Grant API Approval

Before the test solution can attempt to call the test function in the Azure Function App, we must first grant it approval to do so.

Go to the SharePoint Admin center, expand the Advanced menu and select the API access item and you will see the API access request waiting patiently in Pending requests.

Click on the request row and the Approve and Reject buttons will appear. Note that although this action is executed from the SharePoint Admin center, SharePoint Admin permissions are insufficient to grant approval; you will essentially need to be logged in as Tenancy Admin to successfully grant the requested permissions.

We obviously want to click the Approve button and from the slide-out panel, click the panel Approve button to confirm that action.

All being well your API request will now sit in the Approved requests section – job done!

Testing from the Test Harness Web Part

So far, so good. Our test harness web part has now been granted the permissions required to call functions we define in the Function App.

All that remains is to write some code to call the test function we have set up and verify that we get the expected response.

If you use AI to help you (as I did) it will likely try to spit out something like this:
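The suggestion was along the lines of the AadHttpClient pattern from @microsoft/sp-http. Reconstructed here from memory rather than pasted verbatim, with placeholder values, it looked something like:

```typescript
import { AadHttpClient, HttpClientResponse } from '@microsoft/sp-http';

// Inside the React component, with the web part context passed down as a prop.
// The client ID and endpoint URL are placeholders.
private callFunction = async (): Promise<void> => {
  const client: AadHttpClient = await this.props.context.aadHttpClientFactory
    .getClient('<your Application (client) ID>');
  const response: HttpClientResponse = await client.get(
    'https://<your-function-app>.azurewebsites.net/api/HttpTest',
    AadHttpClient.configurations.v1
  );
  console.log(await response.text());
};
```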

But this didn’t work. Firstly, I was missing a dependency, but it didn’t work even after I installed what seemed to be missing.

npm install @microsoft/sp-http@1.18.0 --save

I never got it to work and, after much pain, Copilot finally coughed up that this method isn’t supported on Single Web Part App Pages in any case – don’t you just love it when AI tells you that what it has just told you is an error!

Then Copilot tried to drive me towards MSAL and got me to try this:

npm install @azure/msal-browser --save

But the latest version is not compatible with SPFx. The latest version that is compatible with SPFx is:

npm install @azure/msal-browser@2.37.0 --save

This approach looked decidedly flaky to me, not least because the Copilot sample code throws up a login dialog (simply not an acceptable option).

Copilot persisted in telling me I was only one step away from assured success, but we never got there with either of the approaches which it steadfastly insisted would work, while forcing me to eliminate all other possible issues like caching. Neither approach worked, or at least I couldn’t get them to work.

Whoever thinks that AI is ready to make software engineering a redundant profession should re-evaluate and take stock. AI is just a reflection of what it can harvest on the Internet, and if that information is wrong or poor quality, it simply does not know that, and it can’t magically create a working solution for you. It is also particularly frustrating when trying to negotiate configuration settings in Azure, frequently telling me to access controls and pages which simply don’t exist. The problem is that Azure is not a standard platform, it varies hugely based on what region you are hosted in and what services you pay for, and it’s not helped by Microsoft constantly updating the UI every few weeks – AI has no chance of keeping up.

In the end, I used quite simple code that did not rely on additional dependencies, although I did need to pass the context of my web part to the React class component.

I updated the React class component of the web part so that it now looks like:
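In outline (with placeholder values substituted for my real IDs and URLs), the approach uses the aadTokenProviderFactory that is already available on the web part context, so no extra npm packages are needed:

```typescript
import * as React from 'react';
import { WebPartContext } from '@microsoft/sp-webpart-base';

// Placeholders: substitute your own App Registration client ID and function URL.
const applicationID = '<your Application (client) ID>';
const functionEndPointUrl = 'https://<your-function-app>.azurewebsites.net/api/HttpTest';

export interface ITestHarnessProps {
  context: WebPartContext;   // passed down from the web part class
}

export default class TestHarness extends React.Component<ITestHarnessProps> {
  private callFunction = async (): Promise<void> => {
    // Ask SPFx for a token scoped to our App Registration...
    const provider = await this.props.context.aadTokenProviderFactory.getTokenProvider();
    const token: string = await provider.getToken(applicationID);

    // ...and present it as a Bearer token on the call to the function endpoint.
    const response = await fetch(functionEndPointUrl, {
      method: 'GET',
      headers: { Authorization: `Bearer ${token}` }
    });
    console.log(response.status, await response.text());
  };

  public render(): React.ReactElement {
    return <button onClick={this.callFunction}>Test</button>;
  }
}
```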

As you can see, I have hardcoded the applicationID and the functionEndPointUrl as constants. For the real K-Docs-Publish solution I have moved these values to be configuration settings, but here we are just verifying that the plumbing works, and so hardcoding these values is fine for now.

When I initially ran the above code I got a CORS error, but it turns out that could be easily resolved by adding my tenancy as an allowed origin in the CORS section of my Function App.

Clicking the Test button in my test harness web part showed exactly what I was hoping to see in the browser console.

What’s next?

Remember this diagram?

It turns out that we’ve only done steps 1, 2 and 7 and we still have 3, 4, 5 and 6 to go!

In my next post in this series, I will explain how to set up models in the AI Foundry that can be called from a function in the Function App.

Such fun – stay tuned!
