Channel: Jason Lee's Blog

Adding Site Columns to Lists and List Views using the JavaScript Object Model

A couple of years ago, I posted on how to create SharePoint site columns using JavaScript. More recently, we needed to add a site column to a list in SharePoint Online as part of a scripted provisioning exercise. There were a few aspects of this that took a bit of trial and error, such as:
  • Retrieving the site column from the root site.
  • Getting the field to show up on the default list view.
  • Hiding the field from various forms.
So I figured it's probably worth sharing the end-to-end code (sanitised and simplified). First of all, let's create a site column named JasonNotes on the root site in the site collection:

var context; 
var rootWeb; 
var rootWebId = "..."; // The GUID identifier for the root web

var stage1 = function () {
     context = new SP.ClientContext();

     // Get the fields collection for the root web
     rootWeb = context.get_site().openWebById(rootWebId);
     var fields = rootWeb.get_fields();

     // Create and add the field (a JavaScript string literal can't
     // span multiple lines, so we build the schema by concatenation)
     var fieldSchema = '<Field Type="Note" ' +
                       'Name="JasonNotes" ' +
                       'StaticName="JasonNotes" ' +
                       'DisplayName="Jason Notes" ' +
                       'NumLines="10" ' +
                       'Required="FALSE" ' +
                       'Group="Jason Columns" />';

     // Combine the AddFieldOptions flags with a bitwise OR
     fields.addFieldAsXml(fieldSchema, true,
         (SP.AddFieldOptions.addToDefaultContentType |
          SP.AddFieldOptions.addFieldCheckDisplayName));

     context.executeQueryAsync(stage2, onQueryFail);
};

The next stage is to add the site column to our list. At this point, we need to ensure that the field also gets added to the default list view. This is also a good opportunity to set any properties you require on the list field, such as whether you want it to appear on forms:



var listTarget; 
var listTitle = "..."; // The title of the target list

var stage2 = function () {
     context = SP.ClientContext.get_current();
     
     // Get the field from the root web
     rootWeb = context.get_site().openWebById(rootWebId);
     var webFields = rootWeb.get_fields();
     var fldJasonNotes = 
          webFields.getByInternalNameOrTitle("JasonNotes");


     // Get the list that we want to add the column to

     var web = context.get_web();
     listTarget = 
          web.get_lists().getByTitle(listTitle);
     
     // Add the field to the field collection for the list
     var listFields = listTarget.get_fields();
     var listFldJasonNotes = listFields.add(fldJasonNotes);
     
     // Show the field only on the New form            
     listFldJasonNotes.setShowInDisplayForm(false);
     listFldJasonNotes.setShowInEditForm(false);
     listFldJasonNotes.setShowInNewForm(true);
     listFldJasonNotes.update();

     // Add the JasonNotes field to the default view
     var defaultView = listTarget.get_defaultView();
     var defaultViewFields = defaultView.get_viewFields();
     defaultViewFields.add("JasonNotes");
     defaultView.update();
     listTarget.update();

     context.executeQueryAsync(stage3, onQueryFail);
};

There are a few noteworthy points in the code. First of all, when we've added the site column (fldJasonNotes) to the list, note that we need to grab a reference to the resulting list column (listFldJasonNotes) if we want to set list-specific properties.

Second, note how we approach adding the new column to the default list view:
  1. Call get_defaultView to get the default list view from the list instance.
  2. Call get_viewFields to get the field collection from the default list view.
  3. Add the field by name to the field collection.
  4. Call the update method on the view.
Hope that helps!

Get the Current User's Manager in JavaScript

Using the JavaScript Object Model to look up the manager of the current user (from the User Profile Service) seems to be a fairly common requirement - I've had to do it at least three times in the last couple of months.

First of all, you need to load the profile properties for the current user:


var context, userProperties;

var stage1 = function() {
     context = SP.ClientContext.get_current();
     var peopleManager = new SP.UserProfiles.PeopleManager(context);
     userProperties = peopleManager.getMyProperties();

     context.load(userProperties);
     context.executeQueryAsync(stage2, onQueryFail);
};


Then you can retrieve the value of the Manager property as follows:

var stage2 = function() {
    var manager = 
         userProperties.get_userProfileProperties()["Manager"];
};


And that's it. Easy once you know how.
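One thing to be aware of: in SharePoint Online, the value you get back is typically a claims-encoded account name, such as i:0#.f|membership|manager@contoso.com. If you just want the plain account, a small helper like the following (illustrative only, not part of any SharePoint API) does the job:

```javascript
// Extract the plain account name from a claims-encoded login
// such as 'i:0#.f|membership|manager@contoso.com'.
function getPlainAccountName(claimsLogin) {
    var parts = claimsLogin.split('|');
    return parts[parts.length - 1];
}

getPlainAccountName('i:0#.f|membership|manager@contoso.com');
// Returns 'manager@contoso.com'
```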

Incidentally, the trickiest part of all this can be getting the SharePoint script files to load in the right order. You can't run your code until sp.userprofiles.js is loaded, and you can't load sp.userprofiles.js until sp.js is loaded. The sp.userprofiles.js library seems to be particularly awkward to load. I usually use the following pattern, cobbled together from various helpful forum posts:


$(document).ready(function () {
     
// Force sp.js to load, then call sharePointReady
     if (!SP.SOD.executeOrDelayUntilScriptLoaded(sharePointReady,
          'sp.js')) {
          LoadSodByKey('sp.js');
     }
});

function sharePointReady() {
     // Force sp.userprofiles.js to load, then call our custom code
     if (!SP.SOD.executeOrDelayUntilScriptLoaded(stage1,
          'sp.userprofiles.js')) {
          LoadSodByKey('userprofile');
     }
}


Applying a Logo to Every Site in SharePoint Online

A quick post today on a fairly common problem - you're given a site collection on Office 365 that has grown organically with hundreds of nested sites, four or five levels deep in places. The various site owners have used many different versions of your company logo, and you want to reintroduce a degree of consistency by applying the same logo to every site in a site collection.

The most efficient way to do this is to run some client-side code from PowerShell. Basically you need to iterate over the subsites of a given site and set the Web.SiteLogoUrl property on every site. The easiest way to do this is to:
  1. Install the excellent SharePointPnP.PowerShell cmdlets.
  2. Run a script something like the one below.
# UpdateSiteLogos.ps1
# Jason Lee 5/12/16

# Variables
$rootSiteUrl = "https://[tenant].sharepoint.com/sites/[path]"
$credentialManagerLabel = "ianWilliamsAdmin"
$logoUrl = "/sites/[path]/SiteAssets/smile.png"

# Recursive function to set site logo on specified site
# Provide the server-relative URL of the root site to start
function updateSiteLogo {
    Param(
        [parameter(Mandatory=$true)]
        [String]
        $serverRelativeUrl
    )
    $web = Get-PnPWeb -Identity $serverRelativeUrl
    Write-Output ("Updating site {0}" -f $web.ServerRelativeUrl)
    $web.SiteLogoUrl = $logoUrl
    $web.Update()
    Execute-PnPQuery

    $subwebs = Get-PnPSubWebs -Web $web
    foreach ($subweb in $subwebs) {
        updateSiteLogo $subweb.ServerRelativeUrl
    }
}

# Connect to the site collection, then call updateSiteLogo on the root web
Connect-PnPOnline -Url $rootSiteUrl -Credentials $credentialManagerLabel
$rootweb = Get-PnPWeb
Write-Output "Setting site logos..."
updateSiteLogo $rootweb.ServerRelativeUrl
Write-Output "Done."

The Get-PnPSubWebs cmdlet actually has its own Recurse switch, and you could use that if you want to rather than doing your own recursion. I steered away from it because it gets all the subsites in one hit (the right approach in most circumstances), which can cause the script to hang for a while if you've got hundreds of nested sites. This approach instead walks the site structure one level at a time.


Controlling start options for SharePoint workflows in Visual Studio

When you build a reusable SharePoint workflow, it's useful to be able to control which start options are available. For example, if you only want your workflow to run once when an item is created, it makes sense to disable the "Changing an item will start this workflow" option.

SharePoint Designer provides some handy checkboxes that you can use to control your start options:

[Screenshot: workflow start option checkboxes in SharePoint Designer]
However, it's not immediately obvious how you can set these options for a Visual Studio workflow as the options aren't documented anywhere.

To control the start options for a SharePoint workflow in Visual Studio, you need to edit the feature element file that deploys your workflow. Within the feature element file, you need to add properties to the File element that deploys your Workflow.xaml file:

<?xml version="1.0" encoding="utf-8" ?>
<Elements xmlns="http://schemas.microsoft.com/sharepoint/">
  <Module Name="[Workflow Name]" Url="wfsvc/1c77e2780d9d49f197115fef9e7d9c50">
    <File Url="Workflow.xaml" 
          Type="GhostableInLibrary" 
          Path="[Workflow Name]\Workflow.xaml" 
          DoGUIDFixUp="TRUE" 
          ReplaceContent="TRUE">
      <Property Name="ContentType" Value="WorkflowServiceDefinition" />
      <Property Name="isReusable" Value="true" />
      <Property Name="RequiresInitiationForm" Value="True" />
      <Property Name="RequiresAssociationForm" Value="False" />
      <Property Name="DisableAutoStartCreate" Value="True" />
      <Property Name="DisableAutoStartChange" Value="True" />
      <Property Name="WSPublishState" Value="3" />
      <Property Name="WSDisplayName" Value="[Workflow Title]" />
      <Property Name="WSDescription" Value="[Workflow Description]" />
      <Property Name="InitiationUrl" Value="wfsvc/....aspx" />
    </File>
    ...
  </Module>
</Elements>

In short:

  • To disable the "Creating a new item will start this workflow" option, add the DisableAutoStartCreate property and set the value to True.
  • To disable the "Changing an item will start this workflow" option, add the DisableAutoStartChange property and set the value to True.
  • To disable the "Allow this workflow to be manually started... " option, add the DisableManualStart property and set the value to True.
As a bonus fact, if you want to set the workflow status to the current stage name (SharePoint Designer workflows do this automatically, and it's a pretty useful feature for at-a-glance progress monitoring), add the AutosetStatusToStageName property and set the value to True.

I figured this out by creating a basic reusable workflow in SharePoint Designer, saving it as a template, then cracking open the .wsp file to look at how SPD goes about constructing workflows. If you do this, you'll find that the feature element file created by SPD specifies a property named MetaInfo on the workflow.xaml file. This property contains a bunch of property values. If you can look beyond the hideous syntax it uses, you can pick out the names of some useful properties.


<Property Name="MetaInfo" Value="8;#vti_parserversion:SR|16.0.0.4456&#xD;&#xA;
                                 WSDescription:SW|&#xD;&#xA;
                                 IsProjectMode:SW|false&#xD;&#xA;
                                 isReusable:SW|true&#xD;&#xA;
                                 WSGUID:SW|2f0c508d...&#xD;&#xA;
                                 WSDisplayName:SW|ReusableTest&#xD;&#xA;
                                 WSPublishState:IW|3&#xD;&#xA;
                                 vti_author:SR|i:0#.w|jrjlee\\jason&#xD;&#xA;
                                 RequiresAssociationForm:SW|false&#xD;&#xA;
                                 AutosetStatusToStageName:SW|true&#xD;&#xA;
                                 DisableAutoStartChange:SW|true&#xD;&#xA;
                                 RestrictToType:SW|List&#xD;&#xA;
                                 vti_foldersubfolderitemcount:IW|0&#xD;&#xA;
                                 vti_modifiedby:SR|i:0#.w|jrjlee\\jason&#xD;&#xA;
                                 FileLeafRef:SW|workflow.xaml&#xD;&#xA;
                                 ContentTypeId:SW|0x01002...&#xD;&#xA;
                                 DisableAutoStartCreate:SW|true&#xD;&#xA;
                                 RequiresInitiationForm:SW|false&#xD;&#xA;
                                 InitiationUrl:SW|&#xD;&#xA;
                                 FormField:SW|&lt;Fields /&gt;&#xD;&#xA;
                                 AssociationUrl:SW|&#xD;&#xA;
                                 DisableManualStart:SW|true&#xD;&#xA;
                                 SPDConfig.LastEditMode:SW|TextBased&#xD;&#xA;
                                 vti_folderitemcount:IW|0&#xD;&#xA;" />



Getting or Setting Multi-Value Metadata Fields with the REST API

A couple of years ago, I published a series of posts on getting and setting taxonomy field values in SharePoint workflows by using the REST API in custom workflow activities. These custom activities have served me well, but they've always been unable to work with multi-value taxonomy fields. It's time to fix that.

Background

As you probably already know, when you add a taxonomy field to a SharePoint list, SharePoint adds a corresponding hidden note field. For example, if you add a taxonomy field named Colours, you actually get:

  • A taxonomy field named Colours. Depending on whether the column allows the user to select multiple terms, the field accepts values of type SP.Taxonomy.TaxonomyFieldValue or Collection(SP.Taxonomy.TaxonomyFieldValue).
  • A hidden note field, probably named Colours_0.
The hidden note field stores the contents of the field in a term string format. In a single-value taxonomy field, the format looks like this:

Red|87999a76-e3cb-433c-96ad-c6fe354db476

And in a multi-value taxonomy field, the format looks like this:

Blue|77788fee-9e1d-4df2-a21b-d41dd1734b71;Indigo|d631d196-6909-4b54-a8bb-3c15bcfec18a;Yellow|fdc643a8-3310-448d-9a5d-c9ba13f366fb

In other words, each managed metadata term is represented as [Label]|[Term GUID], and multiple terms are separated by semi-colons.
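To make the format concrete, here's a pair of small helper functions (illustrative only, not part of any SharePoint API) that build and parse term strings:

```javascript
// Build a term string from an array of { label, termGuid } objects.
// Each term is rendered as [Label]|[Term GUID]; terms are joined with semi-colons.
function buildTermString(terms) {
    return terms.map(function (t) { return t.label + '|' + t.termGuid; }).join(';');
}

// Parse a term string back into an array of { label, termGuid } objects.
function parseTermString(termString) {
    return termString.split(';').map(function (part) {
        var pieces = part.split('|');
        return { label: pieces[0], termGuid: pieces[1] };
    });
}

buildTermString([
    { label: 'Blue', termGuid: '77788fee-9e1d-4df2-a21b-d41dd1734b71' },
    { label: 'Indigo', termGuid: 'd631d196-6909-4b54-a8bb-3c15bcfec18a' }
]);
// Returns 'Blue|77788fee-9e1d-4df2-a21b-d41dd1734b71;Indigo|d631d196-6909-4b54-a8bb-3c15bcfec18a'
```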

I've previously resisted using this hidden note field to work with taxonomy fields over REST, as I've always felt that you ought to be able to get or set the field value using the regular, visible taxonomy field. You can get or set single values using the regular taxonomy field, so why not multiple values? However, I eventually conceded that it can't be done. Meanwhile, other bloggers - see this excellent post from Beau Cameron, for example - have demonstrated how you can update a multi-value taxonomy field over REST by targeting the hidden note field and providing a text value in the term string format shown above.

The Problem

We know we can set single-value or multi-value taxonomy fields by writing a text value to the hidden note field. However, if we want to automate this (for example in a custom workflow activity), we need a reliable way of identifying the hidden note field that corresponds to a particular taxonomy field. This is tricky:
  • The internal/static name of the note field is typically a randomly-generated GUID.
  • You can't assume that the title of the note field will always be the title of the taxonomy field with "_0" appended - it depends how the field was created.
Fortunately there is a more robust way of finding the hidden note field associated with a specific taxonomy field.

The Solution - Short Version

If you retrieve the properties of a taxonomy field in any list, you'll notice it has a property named TextField. This field stores the GUID identifier of the associated hidden note field. Given this identifier you can retrieve the hidden note field (reliably and programmatically) and get its internal name. Given the internal name of the hidden note field, you can update your taxonomy field by providing a string value.

The Solution - Long Version

First, send a REST request to retrieve the details of the taxonomy field:


[Site URL]/_api/web/lists(guid'[List ID]')/fields?$filter=title eq '[Field Name]' or internalname eq '[Field Name]'

For example, suppose we've got a multi-value taxonomy field named Colours:

GET /_api/web/lists(guid'd190f637-2f62-41e8-b191-bf760daff64f')/fields?$filter=title eq 'Colours' or internalname eq 'Colours' HTTP/1.1
Accept: application/json; odata=verbose
Host: sp.jrjlee.net

The REST API will return a response that resembles the following:

{
    "d": {
        "results": [
            {
                "__metadata": {...},
                "DescriptionResource": {...},
                "TitleResource": {...},
                "AutoIndexed": false,
                "CanBeDeleted": true,
                "DefaultValue": "",
                "Description": "",
                "Direction": "none",
                "EnforceUniqueValues": false,
                "EntityPropertyName": "Colours",
                "Filterable": true,
                "FromBaseType": false,
                "Group": "Jason Columns",
                "Hidden": false,
                "Id": "5e198f9b-6daf-4c13-ad01-cdb616a06ab4",
                "Indexed": false,
                "InternalName": "Colours",
                "JSLink": "...",
                "ReadOnlyField": false,
                "Required": false,
                "SchemaXml": "...",
                "Scope": "/Lists/Things",
                "Sealed": false,
                "Sortable": false,
                "StaticName": "Colours",
                "Title": "Colours",
                "FieldTypeKind": 0,
                "TypeAsString": "TaxonomyFieldTypeMulti",
                "TypeDisplayName": "Managed Metadata",
                "TypeShortDescription": "Managed Metadata",
                "ValidationFormula": null,
                "ValidationMessage": null,
                "AllowMultipleValues": true,
                "IsRelationship": true,
                "LookupField": "Term$Resources:core,Language;",
                "LookupList": "{567ae30f-ee2d-4d07-8c4a-a6467a94959c}",
                "LookupWebId": "9361373e-19a2-42e5-bf36-a67adfeae11e",
                "PrimaryFieldId": null,
                "RelationshipDeleteBehavior": 0,
                "AnchorId": "00000000-0000-0000-0000-000000000000",
                "CreateValuesInEditForm": false,
                "IsAnchorValid": true,
                "IsKeyword": false,
                "IsPathRendered": false,
                "IsTermSetValid": true,
                "Open": false,
                "SspId": "8750d12e-49d8-4326-84a6-ae8bd5a953c6",
                "TargetTemplate": null,
                "TermSetId": "ba00c0bf-3de6-45d7-96b3-c23debc868c4",
                "TextField": "451e34e4-c9b6-43d9-8499-e0495c6dcb4a",
                "UserCreated": false
            }
        ]
    }
}

The TextField property is the ID of the hidden note field that is associated with this taxonomy field. If you're building a workflow activity, you can pull the value out of the response using the XPath query d/results(0)/TextField.
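If you're working in JavaScript rather than a workflow activity, the same value falls out of the parsed JSON directly. A sketch, using a trimmed-down version of the response above:

```javascript
// A trimmed-down version of the verbose-OData response shown above
var response = {
    d: {
        results: [
            {
                InternalName: 'Colours',
                TypeAsString: 'TaxonomyFieldTypeMulti',
                TextField: '451e34e4-c9b6-43d9-8499-e0495c6dcb4a'
            }
        ]
    }
};

// The JavaScript equivalent of the XPath query d/results(0)/TextField
var textFieldId = response.d.results[0].TextField;
// textFieldId is '451e34e4-c9b6-43d9-8499-e0495c6dcb4a'
```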

Once you've got the field ID, you can send another REST request - this time to retrieve the details of the hidden note field:

[Site URL]/_api/web/lists(guid'[List ID]')/fields(guid'[Field ID]')

For example:

GET /_api/web/lists(guid'd190f637-2f62-41e8-b191-bf760daff64f')/fields(guid'451e34e4-c9b6-43d9-8499-e0495c6dcb4a') HTTP/1.1
Accept: application/json; odata=verbose
Host: sp.jrjlee.net


As before, the REST API returns a response that resembles the following:

{
    "d": {
        "__metadata": {...},
        "DescriptionResource": {...},
        "TitleResource": {...},
        "AutoIndexed": false,
        "CanBeDeleted": true,
        "DefaultValue": null,
        "Description": "",
        "Direction": "none",
        "EnforceUniqueValues": false,
        "EntityPropertyName": "le198f9b6daf4c13ad01cdb616a06ab4",
        "Filterable": false,
        "FromBaseType": false,
        "Group": "Jason Columns",
        "Hidden": true,
        "Id": "451e34e4-c9b6-43d9-8499-e0495c6dcb4a",
        "Indexed": false,
        "InternalName": "le198f9b6daf4c13ad01cdb616a06ab4",
        "JSLink": "clienttemplates.js",
        "ReadOnlyField": false,
        "Required": false,
        "SchemaXml": "...",
        "Scope": "/Lists/Things",
        "Sealed": false,
        "Sortable": false,
        "StaticName": "le198f9b6daf4c13ad01cdb616a06ab4",
        "Title": "Colours_0",
        "FieldTypeKind": 3,
        "TypeAsString": "Note",
        "TypeDisplayName": "Multiple lines of text",
        "TypeShortDescription": "Multiple lines of text",
        "ValidationFormula": null,
        "ValidationMessage": null,
        "AllowHyperlink": false,
        "AppendOnly": false,
        "NumberOfLines": 6,
        "RestrictedMode": true,
        "RichText": false,
        "WikiLinking": false
    }
}

This time, we grab the InternalName property (d/InternalName) from the response.

Now that we've got the internal name of the hidden note field, we can use the REST API to get or set taxonomy values programmatically. We send a MERGE request:

POST /_api/web/lists(guid'd190f637-2f62-41e8-b191-bf760daff64f')/Items(12) HTTP/1.1
If-Match: *
X-HTTP-Method: MERGE
Content-Type: application/json; odata=verbose
Host: sp.jrjlee.net

And in the JSON body of the request we set our hidden field (using the GUID-based internal name) to our multi-value term string:

{
    "le198f9b6daf4c13ad01cdb616a06ab4": "Red|87999a76-e3cb-433c-96ad-c6fe354db476;Blue|77788fee-9e1d-4df2-a21b-d41dd1734b71;Violet|a1a70ca3-b104-49f7-86c9-38a265f35f4d",
    "__metadata": { "type": "SP.Data.ThingsListItem" }
}

The end result? Our Colours taxonomy field shows the new values as expected.
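If you're calling the REST API from script rather than a workflow, the MERGE request above can be assembled as a plain options object. The helper below is a sketch only - in a real page you'd also need an X-RequestDigest header (or bearer token) for authorisation - but it shows how the GUID-based internal name and the term string fit together:

```javascript
// Build the options for a MERGE request that sets a multi-value taxonomy
// field via its hidden note field. Sketch only: authorisation headers
// (e.g. X-RequestDigest) are omitted.
function buildMergeOptions(noteFieldInternalName, termString, listItemType) {
    var body = {};
    body[noteFieldInternalName] = termString;
    body['__metadata'] = { type: listItemType };
    return {
        method: 'POST',
        headers: {
            'If-Match': '*',
            'X-HTTP-Method': 'MERGE',
            'Content-Type': 'application/json; odata=verbose',
            'Accept': 'application/json; odata=verbose'
        },
        body: JSON.stringify(body)
    };
}
```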

SharePoint Framework POST requests: watch out for OData version incompatibility

This week I've been building a SharePoint Framework web part that queries a Pages list, amongst other things. For reasons that I won't go into, I need to do this by sending a CAML query to the server. The usual way to do this is to send a POST request to the getitems REST endpoint:

[Web URL]/_api/web/lists/getbyid('[List GUID]')/getitems

Where the body of the request contains your CAML query:

'query': {
   '__metadata': {'type': 'SP.CamlQuery' },
   'ViewXml': '<View><Query><Where>...'
}

To do this the SPFx way, we leave jQuery.ajax behind and use the SPHttpClient.post method. However, when I did this, the server returned an HTTP 400/Bad Request response with the error message:

The property '__metadata' does not exist on type 'SP.CamlQuery'. Make sure to only use property names that are defined by the type.

After a bit of trial and error in Fiddler, I found the problem:

  • The SPHttpClient class appends an odata-version: 4.0 header to the request.
  • The SharePoint REST API, on my SharePoint Online tenancy at least, supports OData 3.0. (Send a GET request to /_api/$metadata.)
  • Manually removing the odata-version header from the request in Fiddler solves the problem.
Back in our web part code, fortunately, the SPHttpClient class provides a way of overriding headers. The post method accepts three arguments:
  • A string URL.
  • An SPHttpClientConfiguration value (at this stage there's only one to choose from).
  • An ISPHttpClientOptions object.
The ISPHttpClientOptions object holds the answer here. In addition to specifying our request body, we can use it to add or override headers:

const options: ISPHttpClientOptions = {
   headers: {'odata-version':'3.0'},
   body: {'query': {'__metadata': ...
};

Including the headers option ensures that the errant odata-version: 4.0 header is replaced by a more benign odata-version: 3.0 header, with the result that everything starts to work. Putting it all together, you get something like this (apologies for any dodgy TypeScript; we're all learning here):

public static getMyNews(context: IWebPartContext, filters: FilterValue)
   : Promise<NewsItem[]> {
        // Build a REST endpoint URL
        const restUrl: string = this.buildRestUrl(context, filters);

        // Build a stringified request body
        const payload: string = this.buildQueryPayload(filters);

        // Build an ISPHttpClientOptions object
        const options: ISPHttpClientOptions = {
            headers: {'odata-version': '3.0'},
            body: payload
        };

        // Send the request and parse the response
        return new Promise<NewsItem[]>
        ((resolve: (options: NewsItem[]) => void, reject: (error: any) => void) => {
            context.spHttpClient
                .post(restUrl, SPHttpClient.configurations.v1, options)
                .then((response: SPHttpClientResponse) => {
                    response.json().then((items: any) => {
                        const newsItems: NewsItem[] = items.value.map(item => {
                            return <NewsItem>({
                                id: item.GUID,
                                title: item.Title,
                                description: item.NewsItemSummary,
                                imageUrl: item.PublishingRollupImage
                            });
                        });
                        resolve(newsItems);
                    });
                });
        });
    }

Edit 3/3/17: A similar issue is known to affect calls to the Search REST API (again, it's a POST request) - see this thread for details.


Creating Services with SPFx

In SharePoint developer land, we're all getting increasingly familiar with the benefits of using the SharePoint Framework (SPFx) to create self-contained client web parts. However, there are many scenarios where you might want to share functionality or data between the web parts on a page. For example, suppose you're building intranet-style functionality where a user subscribes to tags, and you use these tags across multiple web parts - news, events, classifieds, whatever you like - to filter what you retrieve and what the user sees. Retrieving those tags in every web part on the page would be inefficient and is likely to give you a poor user experience.

It turns out there's a neat way of meeting this requirement in the form of SPFx services. These are built as standalone node packages that you reference from your web parts. Multiple web parts on a page might register dependencies on the node package for your service, but the package is only downloaded once. Even more importantly, multiple web parts on the page might use the service, but the service is only instantiated once. As a result, we can use our service to share both data and functionality within a web part page.

It seems that SPFx services are like the neglected sibling of SPFx web parts that no one wants to talk about - at the time of writing, the only documentation I've seen is this tech note on the Service Scope API and this subheading in the SPFx guidance. Actually getting a service up and running took a little bit of figuring out - especially configuring the package.json file correctly and testing the package locally - hence this post.

Sample Solution

Let's look at a dead simple service that gets a list of lists from the current SharePoint site. I've posted the bare-bones project on GitHub if you want to browse.

Key Points

Have a read of the existing documentation first (like I said, here and here) for an overview of the process. I'm just going to call out a few points that took a little more time to figure out.

Creating the project structure

Generate the project structure in the same way that you'd create an SPFx web part project - by using the yo @microsoft/sharepoint scaffolder (use the No JavaScript web framework option). When the project has been created, in the src folder, delete the entire webparts folder - you don't need any of this. Replace it with a folder named service (or a name of your choosing). 

Creating files

In the service folder, you'll want at least two files:
  • A TypeScript file that contains your service logic (in this case, ListGetter.ts).
  • A JSON-based manifest file that describes the component (ListGetter.manifest.json).
The manifest.json file looks very similar to the manifest for a web part (copy and paste opportunity), except you should set the componentType to Library instead of WebPart. (And if you do copy and paste, make sure you generate a new GUID for the id property.)

{
  "$schema": "../../node_modules/@microsoft/sp-module-interfaces/lib/manifestSchemas/jsonSchemas/clientSideComponentManifestSchema.json",
  "id": "fb56669f-3d74-4982-abe6-02cfa1065272",
  "alias": "ChorusListService",
  "componentType": "Library",
  "version": "0.0.1",
  "manifestVersion": 2
}

The ListGetter.ts file defines the service logic, and the requirements here are explained in more detail by Microsoft's Service Scope API note. At a high level, this file:
  • Exports an interface (IListGetter) that defines the shape of our service.
  • Exports a class (ListGetter) that provides a default implementation of the service.
Again, at a high level, the service class must:
  • Provide a constructor that accepts an argument of type ServiceScope. (This constructor gets called by the service locator implementation in the SharePoint Framework.)
  • Provide a static key of type ServiceKey that uniquely identifies your service and indicates the default implementation of the service.
The file looks like this:

import { ServiceScope, ServiceKey } from '@microsoft/sp-core-library';
import { SPHttpClient, SPHttpClientResponse } from '@microsoft/sp-http';
import { IDropdownOption } from 'office-ui-fabric-react';
import { IWebPartContext } from '@microsoft/sp-webpart-base';

/**
 * Interface for a service that retrieves lists from the current site
 */
export interface IListGetter<T> {
    /**
     * Retrieves the ID and Title of all the lists in a SharePoint site
     * @param context - the IWebPartContext object provided by the web part consuming this service
     * @param includeHidden - whether you want to include hidden lists in the results
     */
    getLists(context: IWebPartContext, includeHidden: boolean): Promise<T>;
}

/**
 * An implementation of the IListGetter service
 * @class
 */
export default class ListGetter implements IListGetter<IDropdownOption[]> {
    /**
     * SPFx services must include a constructor that accepts an argument of type ServiceScope
     * @constructor
     * @param serviceScope
     */
    constructor(serviceScope: ServiceScope) {
    }

    /**
     * Retrieves the ID and Title of all the lists in a SharePoint site
     * @param context - the IWebPartContext object provided by the web part consuming this service
     * @param includeHidden - whether you want to include hidden lists in the results
     */
    public getLists(context: IWebPartContext, includeHidden: boolean = false): Promise<IDropdownOption[]> {
        const endpoint = includeHidden
          ? '/_api/web/lists?$select=Title,Id'
          : '/_api/web/lists?$filter=Hidden%20eq%20false&$select=Title,Id';
        return new Promise<IDropdownOption[]>((resolve: (options: IDropdownOption[]) => void, reject: (error: any) => void) => {
            context.spHttpClient
                .get(context.pageContext.web.absoluteUrl + endpoint, SPHttpClient.configurations.v1)
                .then((response: SPHttpClientResponse) => {
                    response.json().then((lists: any) => {
                        const dropdownOptions: IDropdownOption[] = lists.value.map(list => {
                            return <IDropdownOption>({
                                key: list.Id,
                                text: list.Title
                            });
                        });
                        resolve(dropdownOptions);
                    });
                });
        });
    }

    /**
     * A lookup key that the service locator uses to retrieve this service
     */
    public static readonly serviceKey: ServiceKey<IListGetter<IDropdownOption[]>>
      = ServiceKey.create<IListGetter<IDropdownOption[]>>('Chorus.SPFxServices.ListGetter', ListGetter);
}

Editing the package.json file

This is the bit that took me the most time to figure out. In order for service consumers to be able to resolve your node package and to get some IntelliSense, you need to add main and typings entries to your package.json file. More broadly:
  • name: Make sure this is unique - globally unique if you plan to push it up to npm, or locally unique if you're publishing to a local feed.
  • description: Be nice and provide one.
  • private: If you want to publish to a feed - even a private company feed - you'll need to remove this. But it doesn't stop you testing locally.
  • main: This should point to the JavaScript file that defines your service (e.g. ListGetter.js) in the lib folder.
  • typings: This should point to the corresponding typings file (.d.ts) in the lib folder.
You may also want to use this opportunity to remove any unused dependencies from package.json.
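To make that concrete, a package.json for a service like the one above might end up looking broadly like this (the name and version here line up with the examples later in this post, but treat the whole thing as illustrative - your paths will depend on your build output):

```json
{
  "name": "chorus-list-service",
  "version": "0.0.4",
  "description": "SPFx service that retrieves the lists in a SharePoint site",
  "main": "lib/ListGetter.js",
  "typings": "lib/ListGetter.d.ts"
}
```

The key point is that both main and typings point into the compiled lib folder, not at your src TypeScript files.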

Testing the package locally

To use your service, you need to import the service package into your web part project in the same way that you import other dependencies. However, you might not want to build and deploy the package to npm (or your company feed, etc) while you're building out your proof-of-concept. The easiest way to test it out is to add a relative folder reference in your web part project (assuming your service project and your web part project are on the same machine):

import ListGetter, { IListGetter } from '../../../../../ChorusListService';

For this to work, you need to reference the folder that contains the package.json file for your service package - in this case ChorusListService. Note that this approach does still require that the package.json file for your service is properly configured - your web part project won't resolve the package if it isn't.

Alternatively, if you can't use a folder reference for whatever reason or you want to know exactly what will be included in your node package, you can run the npm pack command at the root of your service project. This will generate a .tgz (TAR archive) file in your root folder named something like chorus-list-service-n.n.n.tgz. In terms of content and structure, this is exactly the same as your published node package. You can then install it in your web part project:

npm install C:\path\to\chorus-list-service-0.0.4.tgz

And then import it into your web part modules by name, just like a regular node module:

import ListGetter, { IListGetter } from 'chorus-list-service';

If you're able to import and use the functionality from the .tgz file, you can be pretty confident that things will continue to work when you deploy your package to npm or a company feed.

Consuming the service

Once you've imported the package into your web part module, you can put the service to use. First, define a field of your service interface type. Next, in the onInit method for your web part, grab the parent service scope from the web part context and retrieve a service instance, as follows:

private listGetter: IListGetter<IDropdownOption[]>;

protected onInit<T>(): Promise<T> {
  const serviceScope: ServiceScope = this.context.serviceScope.getParent();
  serviceScope.whenFinished((): void => {
    this.listGetter = serviceScope.consume(ListGetter.serviceKey) as IListGetter<IDropdownOption[]>;
  });

  ...
  return Promise.resolve();
}
You can then use the service anywhere you like - for example to populate a dropdown list in the property pane:

private lists: IDropdownOption[];

protected onPropertyPaneConfigurationStart(): void {
  this.listsDropdownDisabled = !this.lists;

  // Display a loading indicator while we do the data retrieval
  this.context.statusRenderer.displayLoadingIndicator(this.domElement, 'available lists');

  // Use the service to get the list of lists
  this.listGetter.getLists(this.context as any, false)
    .then((listOptionsIDropdownOption[]): void => {
      this.lists = listOptions;
      this.listsDropdownDisabled = false;
      this.context.propertyPane.refresh();
    });
}

Next steps

This post illustrates a bare bones service that gets a list of lists from the current site. The idea is that multiple web parts on a page could all consume this service without duplicating functionality. To make the service useful, the next logical step is to build in some caching functionality to store the list data after the first request. That way, the service only needs to retrieve list data from the server once per page load, regardless of how many web parts request the data. You'll probably also want to create a mock implementation of the service that returns some dummy data, for example for when you're running on the workbench.
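As a rough sketch of that caching idea, you could memoise the promise itself, so that however many web parts ask for the data, only the first call actually hits the server. This is plain TypeScript with no SPFx dependencies - PromiseCache and its members are names I've made up for illustration, not part of the service code above:

```typescript
// Hypothetical sketch: cache the in-flight promise rather than the resolved
// data, so concurrent callers share a single request.
class PromiseCache<T> {
  private cache: Map<string, Promise<T>> = new Map();

  public getOrFetch(key: string, fetcher: () => Promise<T>): Promise<T> {
    if (!this.cache.has(key)) {
      // First request for this key: kick off the fetch and store the promise.
      this.cache.set(key, fetcher());
    }
    // Later requests (even ones made before the first resolves) reuse it.
    return this.cache.get(key)!;
  }
}
```

A ListGetter could then route its getLists call through getOrFetch, keyed on something like the web URL plus the includeHidden flag.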

Avoiding deployment validation errors with SPFx packages

Just a quick note on a gotcha I came across when deploying an SPFx package to the App Catalog:

In the package-solution.json file, make sure your solution name doesn't use dot notation (or for my American friends, doesn't contain periods). This appears to prevent SharePoint from recognising your .sppkg file as a valid app package.



I started with a package-solution.json file that resembled the following - a nicely-qualified solution name, or so I thought:

{
    "solution": {
        "name": "Chorus.TiledNewsPart",
        "id": "83a75dd2-26d3-4235-b4ba-d0d4a8b19443",
        "version": "1.0.0.2"
    },
    "paths": {
        "zippedPackage": "solution/chorustilednewspart.sppkg"
    }
}


When I dropped the package into the App Catalog, I saw the following:

As you can see, things don't look good:

  • A bunch of fields (Title, Product ID, Metadata Language) haven't been populated.
  • The Valid App Package field value is No.
  • The Deployed field value is No.

Needless to say, my web part didn't make it through to any of my SharePoint sites either.

If we switch to a more NPM-style naming convention for my solution name in the package-solution.json file (use dashes, ditch the dots):

{
    "solution": {
        "name": "chorus-tiled-news-part", 
        "id": "83a75dd2-26d3-4235-b4ba-d0d4a8b19443", 
        "version": "1.0.0.3" 
    }, 
    "paths": { 
        "zippedPackage": "solution/chorustilednewspart.sppkg" 
    }
}

We now get a much better result when we add the package to the App Catalog - all fields are populated, the Valid App Package field value is Yes, and the Deployed field value is Yes:

And there you have it. Summary: dots bad, dashes good.

Changes to link sharing defaults in SharePoint Online

Just a quick note (followed by a quick rant) on a recent change to sharing in Office 365 (both SharePoint Online and OneDrive for Business) that's caused us a couple of issues over the last week.

The issue came to light when a team member tried to get the URL of a document in a SharePoint Online document library. You'll be aware that you can't just right-click and copy the URL in SharePoint - instead you can either right-click the document and then click Get a link on the context menu, or you can select the document and click Get a link on the toolbar:

My user - who has permissions to edit the document in question - was presented with a nasty error message:

Couldn't create the link - Attempted to perform an unauthorized operation

The reason for this is that when you get a link in SharePoint Online (or OneDrive for Business), it now defaults to creating a link that grants anonymous access to the document. If you've locked down anonymous sharing on your site collection, the user will get an error message instead.

If you want to change this behaviour, head over to the SharePoint admin center and browse to the sharing settings. Then change this:

Default link type: Anonymous Access - anyone with the link

To this:

Default link type: Direct - only people who have permission

Now, when my users go to get a link, they literally get a link. They're not creating an anonymous access URL with an embedded OAuth token, they're not granting anyone access who doesn't already have access, they're literally just getting the URL for the document.

Quick rant: I really don't understand the rationale behind this new default - if I'd wanted to give new people access to a document, I'd have clicked Share. Surely the most common scenario, and the most sensible default, is to leave permissions alone and just get a link.

Note that you can configure these sharing settings separately for SharePoint and for OneDrive for Business (now that OneDrive has its own admin center in Office 365). This probably isn't a bad idea - in OneDrive I'm more likely to be inviting someone within the organisation who doesn't already have access, whereas in a SharePoint document library I want minimum deviation from the site permissions.



Server Resource Quota in SharePoint Online

Q. What does the Server Resource Quota setting do in SharePoint Online?
A. Right now, nothing. Nothing at all.

Earlier this week I was talking to an IT Manager who's been troubleshooting some performance issues on a SharePoint Online site collection. As part of his investigation, he raised the issue with Microsoft Support.

Among other things, the support engineer advised him to increase the Server Resource Quota for the site collection.

Microsoft Support should really know better. For the avoidance of doubt, the Server Resource Quota existed solely to limit the amount of server resources that any sandboxed solutions running in the site collection can consume over a 24-hour period. As code-based sandboxed solutions have been blocked entirely in SharePoint Online from July 2016, this setting now does absolutely nothing at all. I'd assume that in time it will disappear from the admin center UI.

Versioning SharePoint Framework Packages

TL;DR: Package versioning in SPFx web parts is confusing. Your sites will notify you that an upgrade is available, but will automatically use the latest version of your code regardless.

Package versioning behaviour in the SharePoint Framework is currently a little idiosyncratic, and it recently caused us a few headaches. It seems that from a versioning perspective SharePoint (incorrectly) expects the SPFx packages to behave like SharePoint Add-ins. In this post I'll run through some of the issues we encountered and how we worked through it.

Context

You can version an SPFx package in three places:

  • The SharePoint package manifest, in the package-solution.json file.
  • The component manifests, in one or more .manifest.json files.
  • The NPM package manifest, in the package.json file.

In this post we're looking at the SharePoint package manifest (package-solution.json) as this is the version number that SharePoint reads and propagates when you deploy a package to the App Catalog.

The Problem

Recently I created a neat little "Project Summary" web part for a client using the SharePoint Framework. I packaged it up and deployed it to the client's App Catalog site, and the client added it to the landing page on around 150 team sites.

Then the client reported a bug. No problem - we duly fixed the issue, bumped the version number in the package-solution.json manifest file, and deployed the updated package to the App Catalog. We then started to see update prompts on the team sites that include the web part, like this:

And when you click through:

 
In other words, I was seeing the versioning mechanism for the old SharePoint add-ins model. At this point I started to get a headache - I didn't want to tell my client to update the web part manually on 150 sites, and I couldn't find any hooks for automating the update. I posted a question about bulk updates on Stack Exchange. The most helpful responses advised me to leave the version number unchanged when pushing out updates (not ideal from a software development best-practice standpoint, but I'm ever the pragmatist).

Pushing out an updated .sppkg package with an unchanged version number got rid of the update prompts on sites that already included the web part solution. However, this approach introduced a whole new problem on sites that didn't already include the solution. On attempting to add the solution from the Site Contents page, users were presented with the error:

Sorry, something went wrong
A different version of this App is already installed with the same version number. You need to delete the app from the site and the site recycle bin to install this version.

None of this was particularly helpful - the web part was not installed on the site and did not feature in any recycle bins.

What's Going On

It didn't occur to me (despite some of the responses to my question alluding to it) that any existing instances of the web part within the tenancy would automatically use the updated code, despite prompting the owner to "get" the latest update. (You can kind of figure it out if you look at what you're deploying to your CDN - although your JavaScript bundle gets a different unique name every time, the file name for the JSON manifest doesn't change - in other words, you're replacing the manifest every time, and the manifest always points to the latest bundle.)

Here's a quick proof-of-concept - I created the world's dumbest web part. All it does is display a version number as static text. I deployed it to the App Catalog on a dev tenancy, then added it to a site:

Next, I bumped the version number in package-solution.json, updated my static text, and deployed the new package to the App Catalog. I browsed back to my site, refreshed the page, and immediately saw the new version:

However, the Site Contents page tells us we're still running version 1:

And gives us the opportunity to "get" the latest update:

Pretty confusing, right? If you click GET IT, SharePoint will go through the motions and the Site Contents page will display the updated version number. But it won't make any difference to your SPFx web parts - these will already be using the updated version of your code.

Conclusion

At the moment, the versioning story for SPFx components is messy. For now, we figured the best way through this is:
  • You do need to bump the version numbers when you deploy an updated package, otherwise you risk running into the A different version of this App is already installed with the same version number error.
  • Any instances of the web part in your tenancy will automatically use the updated version.
  • You can ignore the update prompts on the Site Contents page. (If site owners do choose to "get" the latest version, the only thing that will change is the version number displayed on the Site Contents details view.)
Right, glad we've cleared that up... I'm off for a lie down.

Automating Navigation Inheritance for SharePoint Sites

Automating site creation processes with SharePoint workflow has cropped up a lot in this blog over the years. The latest battle has been getting newly-created subsites to inherit the top link navigation bar from the site collection root.

In summary...

At the time of writing, you can't switch on navigation inheritance using the REST API. I've tried, exhaustively. Google agrees.

You can switch it on using client-side code (managed client or JavaScript)... but that's no use for applications such as workflow where you're limited to code-free solutions.

However... if you can do it with client-side code, you can do it by calling the client.svc service with an XML body. It's ugly but effective. Your web service call should look something like this:

Endpoint:
{Web URL}/_vti_bin/client.svc/ProcessQuery

HTTP method:
POST

Headers:
Content-Type: text/xml

Body:
<Request AddExpandoFieldTypeSuffix="true" SchemaVersion="15.0.0.0" LibraryVersion="16.0.0.0" ApplicationName=".NET Library" 
xmlns="http://schemas.microsoft.com/sharepoint/clientquery/2009">
  <Actions>
    <SetProperty Id="1" ObjectPathId="2" Name="UseShared">
      <Parameter Type="Boolean">true</Parameter>
    </SetProperty>
  </Actions>
  <ObjectPaths>
    <Property Id="2" ParentId="3" Name="Navigation" />
    <Identity Id="3" Name="{0}:site:{1}:web:{2}" />
  </ObjectPaths>
</Request>

There are three string placeholders in the XML body above:

  • {0} is the ID of the SPObjectFactory class. At the time of writing, for SharePoint Online at least, this appears to be af552f9e-a032-4000-c7e1-e6a34cfee284|740c6a0b-85e2-48a0-a494-e0f1759d4aa7. It should always be the same, as far as I know - it's currently working for me in two different Office 365 tenancies.
  • {1} is the GUID of your site collection.
  • {2} is the GUID of the web that you want to update.
And that's it. 

(By the way, if you want to see how to bundle it all up into a custom workflow activity, take a look at this post from 2014. Visual Studio workflow activities remain the same now as they were then.)

Targeted Links in a Modern SharePoint UI

Recently I was advising a client on building out intranet-style content on SharePoint Online. I walked them through communication sites and modern team sites, demonstrated the mobile-friendly responsive rendering, the ease of editing, and so on. Then came the killer question: "how do I target my navigation links to different users?"

This might well be on Microsoft's product backlog... but at the time of writing, there's no support for audience targeting with communication sites or modern team sites. So how do we use targeted navigation without losing the benefits of the modern UI? After a bit of trial-and-error we came up with a working solution using modern pages within classic SharePoint sites. There are quite a few gotchas, hence this post.

Step 1: Create a suitable site

Create or locate a classic SharePoint site (e.g. using the Team Site - SharePoint Online configuration template).

Gotcha #1: Don't use a publishing site template, or you won't be creating modern pages.

Gotcha #2: Don't use a communication site or a modern team site (O365 group site), or you'll error out at step 2.

Step 2: Activate the site collection publishing feature

Activate the site collection-scoped SharePoint Server Publishing Infrastructure feature.

Gotcha #3: Without this feature, although you can browse to /_layouts/15/AreaNavigationSettings.aspx manually and create targeted navigation links, SharePoint will ignore the targeting. In other words, everyone will see everything.

Gotcha #4: Don't activate the site-scoped SharePoint Server Publishing feature - if you do that you'll get all the publishing bloat - page layouts, additional libraries, etc - and you'll find it much harder to create modern pages.

Step 3: Create targeted links

Browse to Site Settings > Look and Feel > Navigation, or go directly to /_layouts/15/AreaNavigationSettings.aspx.

Gotcha #5: Add your links under Current Navigation (i.e. the Quick Launch) - modern pages don't show Global Navigation links. Or at least they don't show more than one.

In the link settings, you can target a SharePoint group, an AD security group or DL, or a global compiled audience.

Step 4: Add modern pages

On the settings (cog) menu, click Add a page. Providing you haven't activated the site-scoped publishing feature, SharePoint will create a modern page by default. If you've swerved all of the gotchas in steps 1-3, the page should respect your link targeting.

Step 5: Make a modern page your site home page

At some point you'll probably want to make one of your shiny modern pages the default landing page for the site. To do this, browse to the Site Pages library. Find the page you want to use, and on the ellipsis menu, click Make homepage.

Final gotcha: Note that the Site Pages library must be configured to use the new experience, not the classic experience, otherwise you won't see the Make homepage option.

Permissions required to edit Quick Launch navigation links

Just a quick one today. A client needed to allow a group of users to add navigation links to the Quick Launch on their SharePoint Online sites:

We mistakenly assumed that granting the Design permission level would be sufficient. That's not the case on SharePoint Online when the Publishing Infrastructure site collection feature is enabled. The EDIT LINKS link doesn't show up until you grant the Manage Hierarchy permission level. Alternatively, if you want to get granular, you need the Manage Web Site site permission.

Don't get caught out by Office 365 CDN settings

Quick note on something that caught me out today. Suppose you've developed your SPFx solution, you've tested it on developer tenancies, and you're ready to push it out on your client's Office 365 tenancy. You drop it into the App Catalog, and you see the following:

Well that's not good. It's enabled... it's a valid app package... but it's failed to deploy. In the App Package Error Message column, we've got a generic Deployment failed. Correlation ID... message that doesn't shed much light on the situation.

This turned out to be a simple oversight in Office 365 CDN settings. If you're bundling client-side assets into your sppkg file (as is the default from SPFx 1.4 onwards), rather than deploying them to a separate CDN, you must make sure the public CDN is enabled on your Office 365 tenancy. It's a requirement that's well-covered in the SPFx documentation, but it might not be the first thing that comes to mind when you see a generic deployment error.

In other words, if your package-solution.json file includes this:

"includeClientSideAssets": true

You'll need to run something like this in PowerShell:

Set-SPOTenantCdnEnabled -CdnType Public -Enable $true

Hope that saves someone some head-scratching.

Calculating working days in Power Automate or Logic Apps

Here’s a common scenario for any kind of SLA-driven process – calculate a target date or a due date for a task based on a number of working days. In most cases, we can take working days to be everything except weekends and public holidays. Let’s take a look at how you might go about it in Power Automate or Azure Logic Apps.

Note: the approach is identical regardless of whether you go with Power Automate or Logic Apps – I’m using Logic Apps in this case because the HTTP connector requires a premium license in Power Automate. 

The challenge

Given a start date and the number of working days we need to add, calculate a target date.

High-level approach

The algorithm works like this:
  1. Add one day at a time to our start date – let’s call this the running date.
  2. If the running date is not a weekend or a public holiday, increment a counter.
  3. Repeat until the counter equals the number of working days we need to add.
  4. Set the target date to the final value of the running date.
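Just to pin down the algorithm before we build it out of Logic Apps activities, here's the same logic as a plain TypeScript sketch (this is purely illustrative - the function name is mine, and holidays is assumed to be a set of 'yyyy-MM-dd' strings like the ones in the gov.uk feed we'll use below):

```typescript
// Sketch of the working-day algorithm: step one day at a time,
// counting only days that are neither weekends nor public holidays.
function addWorkingDays(start: Date, daysToAdd: number, holidays: Set<string>): Date {
  const running = new Date(start.getTime());  // the "running date"
  let added = 0;
  while (added < daysToAdd) {
    running.setUTCDate(running.getUTCDate() + 1);    // add one day
    const day = running.getUTCDay();                 // 0 = Sunday, 6 = Saturday
    const iso = running.toISOString().slice(0, 10);  // 'yyyy-MM-dd'
    if (day !== 0 && day !== 6 && !holidays.has(iso)) {
      added++;                                       // a qualifying working day
    }
  }
  return running;  // the target date
}
```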

The trigger

To keep this as simple as possible, let’s start with a SharePoint list with three key fields – Start Date, Days to Add, and Target Date. We’ll manually set the Start Date and the Days to Add, and we’ll use the logic app to calculate the Target Date:

Note that those are UK date formats :-)

We'll trigger the Logic App every time an item is added to this list.

The variables

We need to define a few variables at the top:
  • workingDaysAdded is our counter – we’ll use this to keep track of how many qualifying days we’ve notched up in our loop.
  • runningDate is the date we’re going to add days to in our loop. We’re initially setting this to the start date we got from the SharePoint list item.
  • runningDateIncremented is a variable we’ll use to store temporary values when we’re calculating dates.

Note that we’re using strings for the date variables – Logic Apps and Power Automate connectors typically pass dates around as date strings in UTC format.

The public holiday data

You can get accurate public holiday data from various sources. In this case, I’m using an HTTP activity to send a GET request to https://www.gov.uk/bank-holidays.json, to get official public holiday data for the UK. This returns data that looks like this:

{
    "england-and-wales": {
        "division": "england-and-wales",
        "events": [
            {
                "title": "New Year’s Day",
                "date": "2015-01-01",
                "notes": "",
                "bunting": true
            },
            {
                "title": "Good Friday",
                "date": "2015-04-03",
                "notes": "",
                "bunting": false
            },
            {
                "title": "Easter Monday",
                "date": "2015-04-06",
                "notes": "",
                "bunting": true
            },
            ...
        ]
    },
    "scotland": {...},
    "northern-ireland": {...}
}

I’m then using a Parse JSON activity to convert that response into an object, to make it easier to work with later on:

Tip: That Use sample payload to generate schema option is very useful – just paste in a typical response from your web service and Logic Apps / Power Automate will do the work for you.

The loop

To do the calculation, we need a loop. In Logic Apps, I’m using the Until activity. In Power Automate it’s called Do Until, but it’s the same thing. We want to run the loop until the value of our counter, workingDaysAdded, is equal to the number of working days we need to add (provided by the SharePoint list item). At a high level, the activity looks like this:

For clarity, in advanced mode, the loop condition reads @equals(variables('workingDaysAdded'), triggerBody()?['DaystoAdd']).

The next thing we need to do is add one day to our running date. Logic Apps and Power Automate won’t let us write expressions like runningDate = addDays(runningDate,1), so we need to do this with two Set variable activities:
  1. Set runningDateIncremented to addDays(runningDate,1).
  2. Set runningDate to runningDateIncremented.
The end result – we’ve added one day to runningDate.

The weekday check

Checking whether a specific date is a weekday is straightforward in Logic Apps and Power Automate. We use the dayOfWeek function. This returns a number – 0 is Sunday, 6 is Saturday, and anything in between is a weekday:
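In advanced mode, that weekday check might look something like this (a sketch using the same runningDate variable – adjust the name to suit your flow):

@and(not(equals(dayOfWeek(variables('runningDate')), 0)), not(equals(dayOfWeek(variables('runningDate')), 6)))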

If the condition is false, we do nothing and let the loop go back to the start. If the condition is true, we need to check whether our weekday is a public holiday.

The public holiday check

To check whether our weekday is a public holiday, we need to see whether it matches any of the records in our public holiday data. We can do this with a Filter array activity:



The From input is an array from our Parse JSON activity – specifically, this is:

body('Parse_JSON:_UK_public_holiday_data')?['england-and-wales']?['events']

For the filter criteria, we’re returning only items where the date property matches the running date. We have to format the running date to match the format of the bank holiday data – in advanced mode, the filter array expression is:

@equals(item()?['date'], formatDateTime(variables('runningDate'), 'yyyy-MM-dd'))

If our filtered array contains more than zero items (use the length operator to check), we know our running date is a public holiday:
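In advanced mode, and assuming your Filter array activity is named Filter array, that condition might read:

@greater(length(body('Filter_array')), 0)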

If this is true, we do nothing and let the loop go back to the start. If it’s false, we know that our running day is (1) not a weekend, and (2) not a public holiday. In this case, we increment our counter (workingDaysAdded) before the loop goes back to the start.

The result

When the workingDaysAdded counter reaches the target Days to Add value we pulled from the SharePoint list item, our loop will stop running. At this point, our running date is the date we need – it’s the start date with the required number of working days added. We can use an Update item activity to write this value back to the SharePoint list item:

If you check the SharePoint list against a public holiday calendar for England and Wales, you’ll see that everything adds up nicely!

