Monday, July 22, 2019

Splitting up PnP files so that LogicApps/FunctionApps do not time out

I am a big fan of Microsoft Patterns and Practices (PnP) and use it both from a scripting perspective (PowerShell) and web development perspective (PnPJS).

Recently I was involved in building an automated project site provisioning process for a client, using LogicApps and FunctionApps. I had generated a number of PnP templates for a site (e.g. one for site columns/content types, one for lists, one for features and one for pages/navigation) using the Get-PnPProvisioningTemplate cmdlet with its -Out and -Handlers parameters (e.g. Get-PnPProvisioningTemplate -Out template.pnp -Handlers Lists).
As my plan was to store the PnP templates within SharePoint, the file extension must be .pnp and not .xml (.xml will not work). Within the document library I added a 'sequence' field, which dictated which PnP templates should be applied and in what order. The LogicApp then iterated over this library and called various FunctionApps, first to apply the relevant PnP file and later to add custom web parts.

If you did not know, the default timeout on a Function App is only 2 minutes (and PnP provisioning can be slow). This can be increased to up to 10 minutes by navigating to 'Platform Features | Kudu | Debug Console | PowerShell' within the Azure portal, navigating into 'site | wwwroot', then clicking the pencil icon next to host.json and setting the functionTimeout JSON property to 10 minutes.
E.g.
{
  "version": "2.0",
  "functionTimeout": "00:10:00"
}


Sometimes, however, this 10 minute timeout still isn't enough, particularly if you have a large number of content types and lists in the site. To fix this you have to 'hack' the Lists PnP file, splitting it up into multiple (smaller) list PnP files. To do this, here are the steps:

1) Copy your PnP file to a temporary folder.
2) Change the file extension from .pnp to .zip.
3) Within the 'files' folder, edit the template XML file and remove the lists that should not be provisioned in this chunk.
4) As you will be saving these updates to a new PnP filename, it is important that the files-map.xml file in the ProvisioningTemplate folder is updated to match the new PnP filename, except with the .xml extension. For example, if I was going to name my PnP file ProjectSiteTemplateLists1.pnp, the files-map.xml entries would need to reference:

    ProjectSiteTemplateLists.en-US.resx
    ProjectSiteTemplateLists1.xml

(the resource file keeps its original name; only the template XML reference changes).

5) Within Windows Explorer, navigate back to the root where you had extracted the files initially. Select all of the files and create a zip file from it (I use 7-zip, but you could just use the Windows Send To Zip file option).
6) Rename the .zip file to ProjectSiteTemplateLists1.pnp.
7) Repeat the process to create the other list PnP files (i.e. by starting from the full PnP file again).
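The manual steps above can also be sketched in code, since a .pnp package is just a zip archive. Below is a minimal Python sketch; the function name `split_pnp` and the `transform_xml` parameter are my own invention (the latter stands in for the manual list-pruning edit in step 3), so treat this as illustrative rather than an official PnP repackaging API:

```python
import zipfile
from pathlib import Path

def split_pnp(src_pnp: str, new_stem: str, transform_xml) -> str:
    """Copy a .pnp package (really a zip), apply `transform_xml` to the
    template XML, rename it to `new_stem`.xml, and fix up the reference
    in files-map.xml. Returns the path of the new .pnp file.

    `transform_xml` is a caller-supplied function (str -> str) that
    removes the lists you don't want in this chunk (the manual step 3).
    """
    src = Path(src_pnp)
    old_stem = src.stem                       # e.g. "ProjectSiteTemplateLists"
    dest = src.with_name(new_stem + ".pnp")   # e.g. "ProjectSiteTemplateLists1.pnp"
    with zipfile.ZipFile(src) as zin, \
         zipfile.ZipFile(dest, "w", zipfile.ZIP_DEFLATED) as zout:
        for item in zin.infolist():
            data = zin.read(item.filename)
            name = item.filename
            if name.endswith(f"{old_stem}.xml"):
                # Step 3: strip out the lists not wanted in this chunk,
                # and rename the template XML to match the new PnP name
                data = transform_xml(data.decode("utf-8")).encode("utf-8")
                name = name.replace(f"{old_stem}.xml", f"{new_stem}.xml")
            elif name.endswith("files-map.xml"):
                # Step 4: point the map at the renamed template XML only;
                # the .resx reference keeps its original name
                data = data.replace(f"{old_stem}.xml".encode(),
                                    f"{new_stem}.xml".encode())
            zout.writestr(name, data)
    return str(dest)
```

Under the covers this is exactly the rename-to-zip, edit, re-zip, rename-back dance described in steps 1-6.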

I hope this helps someone. You don't do this sort of thing often, but when you do it is important that it is done correctly; otherwise you will get a nasty 'Apply-PnPProvisioningTemplate : The Provisioning Template URI ProjectSiteTemplateLists1.xml is not valid' error.

Tuesday, April 24, 2018

List of Issues Experienced with SharePoint January 2018 CU

At one particular client site, after we applied the January 2018 CU to their SharePoint 2013 environment, we found that one site column, which was used in multiple site collections, suddenly had a space in its internal name (which is not valid). For the end user, this prevented them from creating new Document Sets within a library. What I found was that even though the offending field wasn't in the Document Set content type they were trying to use, the invalid field was associated with the list (via another content type), so during the process of creating the document set SharePoint throws an internal error: "The schema for field with this name is wrong or missing. Field 'Meeting Type'". As the Document Set errors were not discovered until several days after the patch was applied, we could not roll back to the snapshot taken prior to the CU being applied. To fix the issue (as we didn't want to wait for Microsoft Support), we created a new field, added it to the content type, removed the offending field, and then had to go into every library and physically remove the old field from there as well.

Another issue we experienced with the Jan 2018 CU was that although SharePoint thought the User Profile Synchronization service was started, under the covers in the Services.msc snap-in it was not (even though it was set to Automatic). The underlying error in the Event log was 'Detection of product '', feature 'PeopleILM' failed during request for component ''. This was due to the Network Service not having access to the C:\Program Files\Microsoft Office Servers\15.0\SQL\DatabaseSettings.sql file. By giving the Network Service account read/execute rights to the 15.0 folder, the Synchronization service could be started and we were able to do profile imports again.

Note that for another customer, where I had set up a SharePoint/Project Server 2016 environment, the January 2018 CU shipped with a corrupt resource file 'PWA.en-us.resx', which prevented you from administering projects (i.e. all action buttons in the ribbon were disabled). To fix this, we overwrote the file with a copy of the PWA.resx file.

Thursday, January 25, 2018

Date Range Comparisons within Nintex Workflow

I recently was tasked to amend a document submission/review workflow, where documents could only be submitted (via changing a status to 'Ready for Review') during a particular window. Typically the window was only for a certain day each month, between 7am and 7pm.

I first started by creating a separate list, which would contain a single list item that had a 'Date From' and 'Date To' field. I then added a query list action with a CAML query of:

<Query>
  <Lists>
    <List ID="{D5DA2C78-F280-4A30-9D9D-FD43997B5B08}"/>
  </Lists>
  <ViewFields>
    <FieldRef Name="ID"/>
  </ViewFields>
  <Where>
    <DateRangesOverlap>
      <FieldRef Name="DateFrom" />
      <FieldRef Name="DateTo" />
      <Value IncludeTimeValue="TRUE" Type="DateTime">
        <Today/>
      </Value>
    </DateRangesOverlap>
  </Where>
</Query>

Note: I also tried substituting <Now/> for <Today/>, as well as a workflow variable for the current date/time (derived from a Calculate Date action), to see if it made any difference.

I found that if the submission window was on a different day to today, the logic worked fine and returned no results (which is what I was checking for after the Query List action). However, if the window was today but the current time fell outside of it, the query would still return a result.

I found the article https://community.nintex.com/thread/3213, where Paul Svetleachni said:
"The time is used to calculate if one period of time turns into next day or not. Thus only calculated based on actual day and hours are used if it is next day or not. So, filtering by specific hour/min/second is not possible, it is only used to determine if next day is added or subtracted based on calculation of date." So I tried various approaches using a Set Condition action (based on https://community.nintex.com/thread/10160?commentID=32146#comment-32146), which didn't work for me. I amended the logic to the following, but still no luck:

Condition: If any value equals value
Where: "Workflow Data" - "Current Datetime"
is greater than "Workflow Data" - "DateTo"

OR

Condition: If any value equals value
Where: "Workflow Data" - "Current Datetime"
is less than "Workflow Data" - "DateFrom"

In the end I decided to convert the dates into numbers, then do my comparison that way, which WORKED!

Steps to reproduce:
1. Build String to populate the workflow variable "CurrentTimeAsNumberString". The formula for this was:
fn-FormatDate({Common:CurrentDate},"yyyyMMdd")fn-FormatDate({Common:CurrentTime},"HHmm")

This produced a string that looked like "201801250915"

2. Convert the CurrentTimeAsNumberString workflow variable to CurrentTimeAsNumber, using the Convert Value action (where "Input" is CurrentTimeAsNumberString and "Store result in" is CurrentTimeAsNumber).

3. Repeat the Build String action, this time to populate the "DateFromAsNumberString" using the formula:
fn-FormatDate({WorkflowVariable:DateFrom},"yyyyMMdd")fn-FormatDate({WorkflowVariable:DateFrom},"HHmm")

4. Convert DateFromAsNumberString to DateFromAsNumber using the convert value action.

5. Repeat the Build String action, this time to populate the "DateToAsNumberString" using the formula:
fn-FormatDate({WorkflowVariable:DateTo},"yyyyMMdd")fn-FormatDate({WorkflowVariable:DateTo},"HHmm")

6. Convert DateToAsNumberString to DateToAsNumber using the convert value action.

7. Update the set condition action shown above to use the numbers instead of dates in the comparison:

Condition: If any value equals value
Where: "Workflow Data" - "CurrentTimeAsNumber"
is greater than "Workflow Data" - "DateToAsNumber"

OR

Condition: If any value equals value
Where: "Workflow Data" - "CurrentTimeAsNumber"
is less than "Workflow Data" - "DateFromAsNumber"
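The numeric trick is easy to sanity-check outside Nintex. Here is a minimal Python equivalent of the steps above (the function names are mine, not Nintex actions); it mirrors the fn-FormatDate "yyyyMMdd" + "HHmm" concatenation and the final Set Condition:

```python
from datetime import datetime

def as_number(dt: datetime) -> int:
    # Mirrors Build String + Convert Value:
    # fn-FormatDate(dt,"yyyyMMdd") followed by fn-FormatDate(dt,"HHmm")
    return int(dt.strftime("%Y%m%d%H%M"))

def outside_window(now: datetime, date_from: datetime, date_to: datetime) -> bool:
    # The Set Condition action: reject when now > DateTo OR now < DateFrom
    n = as_number(now)
    return n > as_number(date_to) or n < as_number(date_from)

# Example: a window of 25 Jan 2018, 7am to 7pm
print(as_number(datetime(2018, 1, 25, 9, 15)))  # 201801250915, as in step 1
```

Because the digits run from most significant (year) to least significant (minute), plain integer comparison orders the values exactly like the underlying date/times, which is why this works where the date comparison did not.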


I hope this helps someone, as I was pulling my hair out trying to figure out why such a simple thing as a date range comparison doesn't work naturally within SharePoint (and therefore Nintex Workflow).

Thursday, February 23, 2017

Installing SharePoint 2013 on Windows 2012 R2 with .NET 4.6.1

I ran into an issue today whereby my SharePoint installer would crash after all of the prerequisites were installed and I tried to run setup.exe. After delving a little deeper I found the article https://support.microsoft.com/en-us/help/3087184/sharepoint-2013-or-project-server-2013-setup-error-if-the-.net-framework-4.6-is-installed.

Microsoft's original response to this problem was to postpone the installation of .NET 4.6.1 until after you've installed SharePoint - well that doesn't help me when the server image already has it baked in.
Then Microsoft came out with a fix, which required a DLL file to be copied into the updates directory of the installation media (i.e. copy everything to a local directory, then copy in the DLL file). See https://download.microsoft.com/download/3/6/2/362c4a9c-4afe-425e-825f-369d34d64f4e/svrsetup_15-0-4709-1000_x64.zip for the SharePoint 2013 patch file.

In my case it still wasn't working, but I later discovered that it was because I was also trying to slipstream a Cumulative Update into the installation (which I normally do to speed up the installation process and remove the need for multiple PSConfig operations).

So the moral of the story: if you want to use this fix, you can't slipstream any other updates into the installation.
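For what it's worth, the media preparation is easy to script. A hypothetical Python sketch (the function name and paths are mine); it just localises the media and drops the patched svrsetup.dll into the 'updates' folder:

```python
import shutil
from pathlib import Path

def prepare_install_media(media_src: str, working_dir: str,
                          patched_setup_dll: str) -> Path:
    """Copy the installation media to a local working directory, then
    drop the patched svrsetup.dll into its 'updates' folder.
    NOTE (per the post): do NOT also slipstream CU files into 'updates'
    when using this fix; the two don't mix."""
    work = Path(working_dir)
    shutil.copytree(media_src, work, dirs_exist_ok=True)  # local copy of the media
    updates = work / "updates"
    updates.mkdir(exist_ok=True)
    shutil.copy2(patched_setup_dll, updates)  # the DLL from Microsoft's zip
    return updates
```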

FYI, to find out what version of .NET is actually installed on a Windows 2012 Server, follow the steps at https://msdn.microsoft.com/en-us/library/hh925568%28v=vs.110%29.aspx?f=255&MSPPError=-2147217396#net_d.

Wednesday, November 16, 2016

Displaying a random SharePoint Search result using Display Templates

On a recent project I had the requirement to randomly display a staff profile on the home page of the Intranet. The staff profile would provide some basic details of the person (with a head shot), then on clicking the item the user would be redirected to a page that displayed a bunch of fun questions and the person's response to each question. The web page used its own custom page layout and associated content type to store all of the answers.

Now we've probably all seen various custom developed web parts in the past, utilising calls to the User Profile Service or some other list via CAML queries. We've probably also seen Content Query Web Part solutions. For this project I was determined to leverage search as much as possible, reducing the continual queries to the backend data source. So my technical challenge was not in getting the results (as search query rules/result sources do a good job of that); but rather randomly selecting an item to display from the result pool.

So to make this magic happen, I came up with the idea of creating a custom group display template (i.e. to generate the seed) and a custom item template, whose job is to render its item only if told to do so. So how do you do this? Quite simply, you add your own variables to the search context!

In your control (group) template, you put the following:

<!--#_
        var randomItem = 0;  // default to first item returned from search
        try {
            var totalResults = ctx.ListData.ResultTables[0].ResultRows.length; //get the total number of results returned
            if (!isNaN(totalResults))
            {
                randomItem = Math.floor(Math.random() * totalResults);
            }
            ctx.RandomItemNumber = randomItem;  // prime our own variable for the Item display template
        }
        catch (err) {
            console.log(err);
        }
_#-->

Then in your item template, you put the following around the outer part of the display template:

<!--#_
  var showItem = true;  // default to showing the item
  try
  {
       if (ctx.CurrentItemIdx != ctx.RandomItemNumber) {
                showItem = false;
       }
  }
  catch (err)
  {
       console.log(err);
  }
  if (showItem)
  {
 _#-->
...Your Item output rendering goes here...
<!--#_
  }
_#-->

I hope this helps someone - it took a little bit of playing around to discover how to get this working, but in the end it was really quite simple.

Friday, March 27, 2015

Get all Office 365 Video Channels, Groups and Delve Boards with REST

Source: http://www.vrdmn.com/2015/01/get-all-office-365-video-channels.html

Office 365 has introduced 3 new portals recently: Videos, Groups and Delve. Behind the scenes, the architecture of Videos and Groups is such that each Video channel is a site collection and so is each Group. For Delve boards, each board is saved as a Tag and when you add a document to a board, the document is tagged with the name of the board.

If you are working on a solution for Office 365 and want to integrate Videos, Groups or Delve, here is how you can get a list of all of them using the SharePoint REST API:

1) Get all Office 365 Video Channels with REST API:
https://siteurl.sharepoint.com/_api/search/query?querytext='contentclass:sts_site WebTemplate:POINTPUBLISHINGTOPIC'&SelectProperties='WebTemplate,Title,Path'&rowlimit=50

2) Get all Office 365 Groups with REST API:
https://siteurl.sharepoint.com/_api/search/query?querytext='contentclass:sts_site WebTemplate:Group'&SelectProperties='WebTemplate,Title,Path'&rowlimit=50

3) Get all Delve Boards with REST API:
https://siteurl.sharepoint.com/_api/search/query?querytext='(Path:"TAG://PUBLIC/?NAME=*")'&Properties='IncludeExternalContent:true'&selectproperties='Path,Title'&rowlimit=50
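If you are calling these endpoints from code, the main gotcha is URL-encoding the KQL query text. A small Python sketch for building such URLs (the function name is mine, and authentication headers/the actual HTTP call are omitted entirely):

```python
from urllib.parse import quote

def search_query_url(site_url: str, querytext: str, select_properties: str,
                     rowlimit: int = 50) -> str:
    # Builds a SharePoint search REST URL in the shape of the three
    # examples above. Spaces and colons in the KQL must be percent-encoded.
    return (
        f"{site_url}/_api/search/query"
        f"?querytext='{quote(querytext)}'"
        f"&selectproperties='{quote(select_properties)}'"
        f"&rowlimit={rowlimit}"
    )
```

For example, search_query_url("https://siteurl.sharepoint.com", "contentclass:sts_site WebTemplate:Group", "WebTemplate,Title,Path") reproduces query 2 above in encoded form.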

SharePoint 2010 Public Site Navigation not working on latest version of Safari for the Mac

The link below provides the fix, which is an update to the compat.browser file.
http://blog.sharepointexperience.com/2014/10/did-safari-or-ios-8-break-your-sharepoint-2010-site/

Wednesday, October 24, 2012

onclick attributes do not fire on custom search XSLT on a Publishing page

I had a requirement for a client this week to create a search solution for a custom list (for Contractors). Basically the client wanted to use SharePoint Search and a custom XSLT to render the list items. The two challenges I faced (without reverting to full-on development):
  1. Make the results sort by the Company name. Note that OOB, the Core Search Results web part does not support this. It only supports sorting by date or relevance (which is the default).
  2. Open the results in a modal popup window, rather than redirecting the whole page to the list item.

To solve point 1, I found the blog entry http://ddkonline.blogspot.com.au/2011/12/sharepoint-2010-modifying-core-search.html which allowed me to change the sort order to company.
The second problem was more frustrating to solve. I thought that within the XSLT I would add an onclick attribute, calling a function to load the given URL into a SharePoint modal window. But every time I tried, the onclick event just wouldn't fire. I went back to a simple onclick="alert('hello')" and it still didn't work. Within the body of the page I added straight links, switched to HTML mode and added the onclick events there, but every time I did this, SharePoint would strip them out. The blog entry at http://blog.mastykarz.nl/tracking-links-google-analytics-sharepoint-2010-mavention-google-analytics-links-tracking/ then gave me a clue: SharePoint was blocking basic onclick events from firing, or removing them altogether. So to solve the problem, I removed the onclick attributes and instead embedded a jQuery function within the XSLT to iterate through all links and attach the click handler. This time it worked. FYI, below is the XSLT block:

<xsl:text disable-output-escaping="yes">  
<![CDATA[  
<script type="text/javascript" src="/_layouts/Inflow/cqwp.js"></script>  
<script type="text/javascript">  
$(document).ready(function () {    
  if ($("#results-table").length) {  
    $("#results-table a").click(function() {   
      openDialog($(this).attr('href'));   
      return false;  
    });    
  }  
});  
</script>  
]]>  
</xsl:text>  
<div class="srch-results" accesskey="W">  
<table id="results-table" border="1"> <tr><th>Company</th><th>Active / Inactive</th><th>Full Name</th></tr>  
<xsl:apply-templates select="All_Results/Result">    
<!-- The xsl:sort needs to operate upon a single field - it doesn't work if the sort has to evaluate child nodes -->    
<xsl:sort select="company" />  
</xsl:apply-templates>  
</table>  
</div>  
<xsl:call-template name="DisplayMoreResultsAnchor" /> </xsl:template>
<!-- This template is called for each result -->
<xsl:template match="Result">  
<xsl:variable name="id" select="id"/>  
<xsl:variable name="currentId" select="concat($IdPrefix,$id)"/>
<xsl:variable name="url" select="url"/>
<tr><td>
<a id="{concat($currentId,'_Title')}">          
<xsl:attribute name="href">            
<xsl:value-of select="$url"/>          
</xsl:attribute>          
<xsl:attribute name="title">            
<xsl:value-of select="company"/>          
</xsl:attribute>         
<xsl:value-of select="company"/>        
</a>
</td>
<td><xsl:value-of select="activeinactive" /></td>
<td><xsl:value-of select="fullname" /></td>
</tr>
</xsl:template>