Enabling Code Coverage for Sitecore with Coverlet & GitHub Actions

Last week I was tasked with enabling Code Coverage in our Sitecore Visual Studio solution and getting it into CodeCov (via our build pipeline). I ended up going down quite the rabbit hole of different options and hitting a lot of brick walls along the way.

I finally figured it out and got it working though, so I thought I’d share my findings and what I did in the end.

TLDR – add this to your CI workflow in GitHub Actions and adjust the settings as required.

What is Code Coverage?

In simple terms it gives you an idea of how many of your lines of code are covered by tests, and therefore how confident you can be in making changes and releasing without breaking things. I’m not going to get into whether this is a good idea, how accurate it is as an indication of the quality of your tests, or whether it’s a waste of time here – as I was just asked to get it set up and working. I don’t think we’re aiming for 100% code coverage, but we want to know the level of coverage we have and where we need to improve it. By the way, the header image above is a lie (I hacked it together) – 100% sure looks nice though :-).

What Code Coverage options are there?

There are quite a few, but some of them are paid for. Given the cost cutting across the board at the moment I felt free ones were best to investigate first. The ones I looked at were as follows:

Selected Tools

Read on below for the reasoning, but in the end I went with the following:

After trying AltCover for a while and struggling to get the filtering working on various dlls, I decided to try Coverlet. Coverlet seems to be the de facto standard and is now included by default in .NET 6.0+ and .NET Core test projects in Visual Studio.

As our Sitecore 10.3 project is traditional MVC, we are tied to .NET Framework 4.8. Our projects are also fairly legacy and have been upgraded a few times. It therefore wasn’t possible to install Coverlet as a NuGet package within the test projects and use MSBuild as I’d have liked – it seems this only works for newer SDK-style projects or .NET Core ones, not classic .NET Framework projects. So I instead went with the Coverlet console, which in the end worked pretty well.

How do I use it?

So first you need to install the Coverlet console globally like so:

dotnet tool install --global coverlet.console

Then for each of your test projects you need to execute a command like so:

coverlet "C:\Projects\sc103-flux\src\Foundation\Accounts\Tests\bin\FluxDigital.Foundation.Accounts.Tests.dll" --target "C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\Common7\IDE\Extensions\TestPlatform\vstest.console" --targetargs "C:\Projects\sc103-flux\src\Foundation\Accounts\Tests\bin\FluxDigital.Foundation.Accounts.Tests.dll /Parallel /Logger:TRX" --output "C:\Projects\sc103-flux\coverlet\coverlet-report1.cobertura" --format cobertura --include "[FluxDigital.*]*" --verbosity detailed

What this does is pass your test project dll to Coverlet and tell it to run the tests via the VSTest console. We also pass some parameters to VSTest to ensure it runs the tests in parallel and logs the results. Lastly we pass some parameters to Coverlet to filter to specific dlls – otherwise it seems to try and instrument 3rd-party dlls as well as our code. If you get any errors in the console it might be because you are not filtering out everything you need to.

So to break it down in more detail:

  • coverlet – runs the Coverlet console
  • “..\FluxDigital.Foundation.Accounts.Tests.dll” – the test project dll to run code coverage on
  • --target “..\vstest.console” – the path to the VSTest console; ensure this path is correct for your version of Visual Studio
  • /Parallel – runs the tests in VSTest in parallel
  • /Logger:TRX – tells VSTest to log the test results to a TRX file
  • --targetargs “..\FluxDigital.Foundation.Accounts.Tests.dll” – the path to the dll you are testing again, this time for VSTest
  • --output “..\coverlet-report1.cobertura” – the report file saved at the end of the test run
  • --format cobertura – the format for the above report file (this format allows us to merge the files from different test runs)
  • --include “[FluxDigital.*]*” – this parameter lets you filter the assemblies (dlls) and/or methods to include by name. In my case I only want code coverage for dlls that start with “FluxDigital.” so this filters to just those. You can add multiple include params if you wish (see below).
  • --exclude “[*]*Model*” --exclude “[FluxDigital.Foundation.Models]*” --exclude “[*]*Controller*” – I’m not actually using these filters in my command above, but you can add multiple exclude parameters, e.g. to exclude any Models or Controllers from Coverlet.
  • --verbosity detailed – tells Coverlet to output a lot of detail when running the code coverage; it’s really useful for debugging any issues.

I found some info here on include/exclude filtering and it was really helpful. Essentially, patterns inside brackets ([my.assembly.name]) match assemblies and patterns outside the brackets (*my.class.name) match classes/methods.
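For instance, filters can be combined like this (a hypothetical, shortened invocation – the paths are illustrative and you should keep the full --target path from the command above):

```powershell
# Hypothetical example: include only FluxDigital assemblies, but exclude
# any Model or Controller classes in any assembly.
# Syntax: [assembly-pattern]type-pattern
coverlet ".\bin\FluxDigital.Foundation.Accounts.Tests.dll" `
  --target "vstest.console.exe" `
  --targetargs ".\bin\FluxDigital.Foundation.Accounts.Tests.dll" `
  --format cobertura `
  --include "[FluxDigital.*]*" `
  --exclude "[*]*Model*" `
  --exclude "[*]*Controller*"
```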


Once it runs you will get a code coverage report, which you will note is in the Cobertura format. The reason for this is that we want to merge all of our test runs into one code coverage file, and other formats don’t work for this. More on this later.

You need to run a similar command (change the test dll and report name) for each test library and save the code coverage file out with a different name but in the same folder. In my case this was 9 test projects and therefore 9 code coverage files generated. Like so:

Running this 9 times in our build pipeline isn’t going to cut it, so you will see I solved this later using PowerShell to find all test dlls and run these commands automatically – but I wanted to explain how this works more simply first.

Report Generator

To merge them I used ReportGenerator. We will also use this tool later to upload the report to CodeCov. First we need to install it like so:

dotnet tool install -g dotnet-reportgenerator-globaltool

Then with the following command we can merge the files (ensure the path is correct to find your individual cobertura report files):

reportgenerator "-reports:C:\Projects\sc103-flux\coverlet\*.cobertura" "-targetdir:C:\Projects\sc103-flux\coverlet\report" -reporttypes:Cobertura

This gives us a Cobertura XML file with all the code coverage data blended into one. To also generate the HTML report from it, add Html to the report types, e.g. -reporttypes:"Html;Cobertura".

If you open up the index.html file in your browser you will see a summary of your Code Coverage at the top and then a breakdown by Assembly below that. Hmm 22%, not great at all. We have some work to do here to improve this, but that’s a job for another day.

This report is pretty neat though and is already enough for you to see where the gaps are in your coverage so you can decide where you need to add more tests.

Putting everything into Github Actions

The next step is to run this in the build pipeline (in our case Github Actions) and use Report Generator to send the file to CodeCov.

Running Coverlet via PowerShell for all Test Projects

As mentioned earlier, in order to make this simpler to run in the build pipeline (and more maintainable) I decided to write a PowerShell script which finds all test dlls that match a specific pattern (it ensures a unique list) and then executes the coverlet command (from above) for each dll in turn with the VSTest console.

This is what I came up with:

$basePath = "."
$reportPath = "coverlet"
$incNamePattern = "*Fluxdigital*test*.dll"
$incVSTestNamePattern = "[Fluxdigital.*]*"

# get all test dlls in the solution - filter here to reduce duplicates
$testdlls = (Get-ChildItem $basePath -Include $incNamePattern -Recurse | Where-Object { $_.FullName -match 'Release' -and $_.FullName -notmatch 'obj' -and $_.FullName -notmatch 'LocalPublish' }).FullName

# ensure we only get each test dll once by adding them to an ArrayList
[System.Collections.ArrayList]$uniquedlls = @()
foreach ($testdll in $testdlls) {
    $fileName = [System.IO.Path]::GetFileName($testdll)
    if (-not ($uniquedlls -match [regex]::Escape($fileName))) {
        $uniquedlls.Add($testdll) | Out-Null
    }
}

# run coverlet for each test dll in the list
Write-Host "$($uniquedlls.Count) unique test dlls found..."
foreach ($uniquedll in $uniquedlls) {
    $fileName = [System.IO.Path]::GetFileName($uniquedll)
    $reportFile = "$($reportPath)\coverlet-$($fileName.Replace('.dll','')).cobertura"
    $cmd = @"
coverlet "$($uniquedll)" --target "vstest.console.exe" --targetargs "$($uniquedll)" --output "$($reportFile)" --format cobertura --include "$($incVSTestNamePattern)" --verbosity detailed
"@
    Write-Host "running tests for: $($fileName) - report path: $($reportFile)"
    $cmd | cmd
}

This is used in the GitHub Action below, so you will need to update $incNamePattern and $incVSTestNamePattern to match your test dlls when using it in your GitHub workflow. You could obviously just use it locally to generate a report too.

The Final GitHub Actions YAML

In order to use Coverlet, VSTest and ReportGenerator in GitHub Actions I needed to add some steps to the build pipeline to install the tools. I also wanted to show the code coverage in the GitHub Action summary, so eventually found a marketplace action that would do that (and work with Windows runners), and then finally an action to send the report to CodeCov. Note you will need to update this action with your repo details and CodeCov token (in secrets).

Please review all the settings below too before trying this in your CI pipeline:
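The workflow boils down to the following steps (a hypothetical, trimmed-down sketch – the action versions, paths and script name are illustrative and will need adjusting for your repo):

```yaml
name: CI with Code Coverage

on: [push]

jobs:
  build-and-test:
    runs-on: windows-latest
    steps:
      - uses: actions/checkout@v3

      - name: Install Coverlet & ReportGenerator
        run: |
          dotnet tool install --global coverlet.console
          dotnet tool install -g dotnet-reportgenerator-globaltool

      # ... restore/build the solution here so the Release test dlls exist ...

      - name: Run Coverlet for all test dlls
        shell: powershell
        run: .\build\run-coverlet.ps1   # the PowerShell script from this post

      - name: Merge coverage reports
        run: reportgenerator "-reports:coverlet\*.cobertura" "-targetdir:coverlet\report" -reporttypes:Cobertura

      # ... optionally a marketplace action here to render the coverage summary ...

      - name: Upload coverage to CodeCov
        uses: codecov/codecov-action@v3
        with:
          token: ${{ secrets.CODECOV_TOKEN }}
          files: coverlet/report/Cobertura.xml
```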

Just like running Coverlet locally from the command line, you get a summary as it runs in GitHub too, so it’s easy to debug any issues:

The report summary looks like so, pretty cool I think. You can configure this to work for PRs too if you wish.

Once you have this all working you may need to reduce the log levels so it’s not as noisy in the console.

Incidentally, AltCover seems very clever and, if you can get it to work correctly for you, it might be better than Coverlet – so give it a try too if you have time.

Hopefully this is useful for others who need to get Code Coverage set up for legacy Sitecore MVC projects (or other older .NET Framework projects). I’m sure a very similar approach would work in Azure DevOps or other CI/CD platforms too. I’m off to write some more unit tests.

As always there were a lot of useful links out there that helped me with this in addition to the ones I’ve included above:

https://blog.ndepend.com/guide-code-coverage-tools/
https://medium.com/@justingoldberg_2282/setting-up-code-coverage-with-net-xunit-and-teamcity-for-a-solution-with-multiple-test-projects-5d0986db788b

https://stackoverflow.com/questions/67058242/using-coverlet-with-net-framework-generates-an-error-the-expression-system-v

https://stackoverflow.com/questions/60707310/is-it-possible-to-get-code-coverage-of-net-framework-project-using-coverlet-in

https://stackoverflow.com/questions/60838586/how-to-output-code-coverage-results-file-of-solution-tests-to-solution-directory

https://stackoverflow.com/questions/62512661/how-to-generate-line-coverage-report-with-vstest-console-exe

Sitecore Page Exporter

Something I need to do regularly is pull down a page from a higher environment (such as UAT or Production) to my local machine or Test. I’ve done this in the past by manually building packages, using Sitecore Sidekick or SPE’s ‘Quick Download Tree as package’ option.

However, SPE’s package option does not support packaging up the datasource items (unless they are child items of the page). In my experience there are often global datasources that are not sub-items of the page, and packaging these up manually can take quite some time, especially for large pages.

Enter Sitecore Page Exporter

So I decided to create ‘Sitecore Page Exporter’ using SPE to handle this. It supports exporting a specific page as a package, and optionally its datasources, images and sub-items. This is v1, so I plan to add more features in the near future.

Pre-requisites

You must have Sitecore PowerShell Extensions installed. This release has been tested with Sitecore 10.3 and SPE 6.4 but should work with older versions also.

Install Notes

  • Download the v1 package from the release link
  • Install the package using the Sitecore package install option in the Sitecore Desktop
  • You should now have Sitecore Page Exporter installed under the SPE module:

Usage

  • To export a page, right-click the page in the Content Editor and choose: Scripts > Export Page as Package:
  • The following options are then available to you:
  • Choose your options and click ‘OK’
  • Download and save the package
  • You get an overview of the export if you click ‘view script results’:
  • You will also get a summary at the end of the number of items included:
  • Upload the package to where you want to use the page (e.g. your development machine)

Hopefully this is useful for others too. Let me know of any features you think might be added or any issues you have with this.

Automating Sitecore Azure SQL Database Maintenance

For a long time Sitecore have recommended that you run SQL Maintenance regularly and rebuild the indexes. However you can’t run maintenance plans like this (as you would in an On-Prem environment) in Azure.

So I did some research and it seems that Sitecore set these up for you if you are using Managed Cloud, but I couldn’t find much further info on this.

However I did come across this SSE post with a very useful answer from Richard Hauer on using Azure Runbooks and PowerShell to run database maintenance.
Unfortunately there was not a lot of detail on how to set it up or use it, and I’d only really used Azure Runbooks once before (for monitoring and re-starting Solr) – so I am certainly no expert on this.

So having done this recently I thought I’d write this post to help others who need to do this, follow the steps below.

Step 1 – Create a new automation account

If you don’t have an existing Azure Automation Account you will need one so go to the Automation Accounts section in Azure Portal and create one.

If you have an existing Automation Account you can move on to Step 2.

Step 2 – Create Runbook & Add Script

Note: These need to be migrated to Extension Based Hybrid workers by August 2024. However Microsoft provide a simple approach to do this. I haven’t used these yet as I don’t have VMs available to run the workers but we will do this soon, so please bear this in mind.

Under Runbooks in the Automation account click ‘Create a runbook’:

Then name it something like ‘Sitecore-DB-Maintenance-Plan-Workflow-RB’. Ensure you choose ‘PowerShell Workflow’ as the Runbook type – otherwise the script doesn’t work correctly:

Click on the Runbook you just created and choose ‘Edit in portal’:

Then paste in the script (see below):

This is the script to copy and paste. It’s a modified version of the one Richard shared on SSE, with more logging and comments added. Note some of the additional logging shows up in the ‘All Logs’ section as it is logged as Verbose:
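As a rough guide to its shape, a ‘PowerShell Workflow’ runbook for this looks something like the following (an illustrative sketch only, not the full script – the maintenance T-SQL and extra logging are omitted, and only the parameter names used later in Step 5 are assumed):

```powershell
workflow Sitecore-DB-Maintenance-Plan-Workflow-RB
{
    param(
        [Parameter(Mandatory = $true)][string] $SQLServer,
        [Parameter(Mandatory = $true)][string] $Database,
        [Parameter(Mandatory = $true)][string] $CredentialsName
    )

    # Fetch the SQL admin credentials stored against the automation account (Step 3)
    $cred = Get-AutomationPSCredential -Name $CredentialsName

    inlinescript
    {
        $cred = $Using:cred
        # The real script inspects index fragmentation and rebuilds/reorganises
        # indexes accordingly, logging progress with Write-Verbose
        Invoke-Sqlcmd -ServerInstance $Using:SQLServer -Database $Using:Database `
            -Username $cred.UserName -Password $cred.GetNetworkCredential().Password `
            -Query "/* index maintenance T-SQL goes here */"
    }
}
```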

You can test this in the test pane if you like, but once you are happy with it, publish it.

Step 3 – Create Credentials

Now we need to add our SQL admin username and password as Azure Credentials. If you don’t have an existing SQL admin user you can use, create one which has the access required to rebuild indexes.

Next, add a new credential under the automation account by clicking ‘Add a credential’:

Add the credential details, with a name such as ‘DatabaseCred’:

Step 4 – Create Schedules

Now we need to create a schedule for each Sitecore database that we want to Re-Index. This will run the Runbook Workflow script on a schedule.

Under the automation account click ‘Add a schedule’:

Then add the schedule details. For example, the below is for the Master database.

Sitecore recommend indexing is done weekly, and in my case we want to run it out of hours (3am) and not over a weekend or near a Monday (as that is the busiest day for this client). This may vary for you, so adjust accordingly:

Repeat this for each database you want to re-index. I set up schedules for the main databases: Master, Core and Web:

Step 5 – Link Schedules & Set Parameters

Now we need to link the existing Schedules to the Runbook. Go to the ‘Sitecore-DB-Maintenance-Plan-Workflow-RB‘ Runbook and click ‘Link to schedule’:

Then select the Runbook Schedule by clicking ‘Link a schedule to your runbook’:

And select a schedule from those you setup previously at Step 4.

Then click ‘Configure Parameters and run settings’:

Set the parameters for the SQLServer, Database and CredentialsName like so, using the credentials you set up at Step 3:

Step 6 – Set up Logging & Alerts

Under the runbook ‘Logging and tracing’ turn on ‘Log verbose records’ like so:

You can also set up alerts for errors if you would like, by creating an alert rule under the automation account and filtering on the Runbook logs:

Step 7 – Test and check Logs

Once the Runbook schedule has run you can check the output under the ‘Jobs’ section of the runbook:

Check the ‘All logs’ section too and you should see more information such as how fragmented the tables were and the number of fragmented tables found:

That’s it, you should now have a working Runbook Workflow that automates the re-indexing and prevents your databases from becoming slow. Hopefully this is useful for others too.

Here are some other useful links that I found to help with this:

https://gist.github.com/ivanbuzyka/70db190d540e34300dab5015f21d00bf

https://github.com/yochananrachamim/AzureSQL/blob/master/AzureSQLMaintenance.txt

https://segovoni.medium.com/automating-azure-sql-database-maintenance-tasks-overview-bdbadcb312bf

https://learnsitecorebasics.wordpress.com/2023/04/30/sitecore-commerce-user-creation-takes-too-long-or-turns-into-timeout-error/

https://devjef.wordpress.com/2017/08/28/running-database-maintenance-on-azure-sql-db-with-azure-automation/

https://learn.microsoft.com/en-us/azure/automation/automation-runbook-output-and-messages

https://learn.microsoft.com/en-us/azure/automation/learn/automation-tutorial-runbook-textual

Bulk Enable/Disable Sitecore Users with SPE

We’re currently pretty close to completing an upgrade to Sitecore 10.3 for a client and during the go live process we needed to disable most of the users apart from a few admin users and then re-enable them again after go-live.

We have a lot of users in the system and so I turned to Sitecore PowerShell Extensions (SPE) to automate this process. Here is the script I came up with:
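(The embedded script is fairly long; stripped back, the enable/disable logic at its heart looks roughly like this hypothetical sketch, using SPE’s Get-User and the standard ASP.NET Membership API that Sitecore users sit on – the dialog, admin-exclusion prompt and final report are omitted, and the variable names are illustrative.)

```powershell
# Hypothetical sketch: disable (or re-enable) all users except an exclusion list.
$excludeUsers = @("sitecore\Admin", "sitecore\testadminuser")  # admins to keep active
$disable = $true                                               # set $false to re-enable

foreach ($user in Get-User -Filter "sitecore\*") {
    if ($excludeUsers -contains $user.Name) { continue }

    # Sitecore users are ASP.NET Membership users - IsApproved controls login
    $member = [System.Web.Security.Membership]::GetUser($user.Name)
    if ($member -ne $null) {
        $member.IsApproved = -not $disable
        [System.Web.Security.Membership]::UpdateUser($member)
    }
}
```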

When you run the script it shows a dialog which allows you to select whether you would like to enable or disable users, and to choose which admin users you would like to exclude when running the disable/enable:

Obviously you don’t want to accidentally lock yourself out of Sitecore by disabling the main sitecore\Admin user! Therefore I’ve put a check in to try and stop this happening:

Once the script has completed you will see a modal confirming the number of users Disabled/Enabled:

Then you will be shown a report showing a list of all the users that have been either Enabled or Disabled:

Note that as I unchecked the sitecore\testadminuser in the modal dialog it has disabled this user along with all the other non-admin users in Sitecore.

These screenshots are from my local dev environment, but I’ve tested this script on hundreds of users and it runs in a few seconds.

Hopefully it’s useful for others who need to do something similar and can be easily updated too.

Deleting IAR items from the Database & Content Editor warnings for over-written IAR Files

Having recently created a Sitecore 10.3 IAR package for the Scheduled Publishing module, I needed to remove the items from the database, as they were still there even though they are now in the .dat files I created.

In previous versions of Sitecore it was quite tricky to do this, but luckily we’re using Sitecore 10.3, and the Sitecore CLI has been updated to allow us to delete specific items from the database with the itemres cleanup command.

The commands we need to run are as follows:

dotnet sitecore itemres cleanup -p "/sitecore/templates/Scheduled Publish" -r

dotnet sitecore itemres cleanup -p "/sitecore/system/Tasks/Schedules/ScheduledPublishTask" -r

dotnet sitecore itemres cleanup -p "/sitecore/system/Tasks/Commands/ScheduledPublishCommand" -r

dotnet sitecore itemres cleanup -p "/sitecore/system/Modules/Scheduled Publish" -r

dotnet sitecore itemres cleanup -p "/sitecore/content/Applications/Content Editor/Gutters/Scheduled Publish" -r

dotnet sitecore itemres cleanup -p "/sitecore/content/Applications/Content Editor/Ribbons/Strips/Publish/Scheduled Publish" -r

dotnet sitecore itemres cleanup -p "/sitecore/content/Applications/Content Editor/Ribbons/Chunks/Scheduled Publish" -r

dotnet sitecore itemres cleanup -p "/sitecore/system/Field types/Custom Field Types" -r

It’s possible to run these commands using the ‘what if’ flag (-w) to see what would happen if you ran them, which is quite handy for testing them first. You will see a message saying that no changes will be made:

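For example, to preview the first cleanup command without making any changes, append the -w switch:

```powershell
# Dry run - reports which items would be removed without deleting anything
dotnet sitecore itemres cleanup -p "/sitecore/templates/Scheduled Publish" -r -w
```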

Note that unfortunately it’s not possible to use the ‘what if’ flag without providing a path. It seems this might be coming in 10.4:


Once you’ve run the commands properly (without the -w switch) you will see confirmation that the item(s) were removed, like so:

The next step was to check that the above deletes had worked correctly and that all the items were indeed coming from the IAR files and not from the database.

I decided a Content Editor warning would be a good way of doing this. I have created these using SPE before, so had a look around and found this really useful post from Jan Bluemink on doing this for IAR files. It mostly worked OK, but the code that was shared had some issues with the formatting and I wanted to make some improvements. Here is my updated version:

Note: to use this you need to ensure that your script library is configured as a ‘PowerShell Script Module’, that the integration points for Content Editor Warnings are enabled, and that the script is placed in the correct sub-folder (Warning).

The script displays a blue Content Editor info message if an item is coming from an IAR file and hasn’t been over-written, like so:

And if it has been over-written (is coming from the database) then it shows an orange warning message like so:

This was really useful for confirming that the IAR files were working as expected. I actually set this up before running the cleanup commands above so that I could check I was getting the Orange message initially and then the Blue one after running the cleanup commands.

You can test this yourself if you like by opening this item in Content Editor: /sitecore/system/Marketing Control Panel/Taxonomies/Campaign group

This item comes from the item.master.dat file out of the box.

Another helpful tool is this SPE report that Jan Bluemink created, which lists all over-written IAR items from a .dat file.

Hopefully this is useful info for anyone else who needs to clean up IAR files and check the cleanup has worked correctly.

Below are some other useful links I found when working on this:

https://doc.sitecore.com/xp/en/developers/103/sitecore-experience-manager/work-with-items-as-resources.html
https://uxbee.eu/insights/items-as-resources-by-sitecore-part-3
https://jeroen-de-groot.com/2022/01/05/remove-items-from-resource-file/
https://gist.github.com/michaellwest/13e5a49b34340b9ebebdb83ff2166077

Convert Publish files to Sitecore CLI JSON format


I’m currently working on a Sitecore upgrade for a client, and this week I needed to upgrade the Scheduled Publishing module to be compatible with Sitecore 10.3. Whilst the code had been upgraded recently by Nehemiah Jeyakumar, there was still no package for it.

I was really keen to use an Items as Resources (IAR) version, but to do so I’d need a Sitecore CLI JSON file, which I didn’t have. There was however a Package.xml file which was used to create previous Sitecore packages for the module.

I wondered if I’d be able to use this to create a Sitecore CLI JSON file, but couldn’t find anything online from anyone who had done this. So I decided to write a PowerShell script to do it for me. You can find this below:


The script essentially loads each x-item entry in a Package.xml file, calculates the item path and name, then generates a JSON file in the Sitecore CLI serialization format and saves it to disk for you.

How to use it

Open the script in PowerShell ISE on your PC and update the 4 variables below. The $inputPackageFilePath should be your existing package file and $outputJsonFilePath is where you would like to save the JSON file. The namespace and description should reflect your module.

# variables - UPDATE THESE 4 AS REQUIRED
$inputPackageFilePath = "C:\Projects\SCScheduledPublishing\Packages\Sitecore Scheduled Publish.xml"
$outputJsonFilePath = "C:\Projects\SCScheduledPublishing\Sitecore.Schedule.Publish-module.json"
$namespace = "Sitecore.Schedule.Publish"
$description = "Sitecore.Schedule.Publish Module Items"

Once you have updated these variables you can run the script, and all being well you will get a JSON file saved out as specified. You should see output similar to below, with a summary of the number of items created and the location of the JSON file:


You should see that the script automatically works out which database each item should be put into (from the package path), e.g. Master or Core.
Note: The script currently sets all items with ‘SingleItem‘ scope and ‘CreateUpdateAndDelete‘ for allowedPushOperations, so you may want to adjust some of these manually.
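For reference, the generated file takes roughly this shape (a single illustrative item shown, using one of the module paths from earlier – your item names and paths will differ):

```json
{
  "namespace": "Sitecore.Schedule.Publish",
  "description": "Sitecore.Schedule.Publish Module Items",
  "items": {
    "includes": [
      {
        "name": "ScheduledPublishTask",
        "path": "/sitecore/system/Tasks/Schedules/ScheduledPublishTask",
        "database": "master",
        "scope": "SingleItem",
        "allowedPushOperations": "CreateUpdateAndDelete"
      }
    ]
  }
}
```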

After that you can just run the serialize command from your Sitecore Solution folder like so:
dotnet sitecore ser pull

And then the run the Item as Resource command to create one or more .dat files with the items included:
dotnet sitecore itemres create -i Sitecore.Schedule.Publish -o SCScheduledPublishing\src --overwrite


I have a blog post here with some more info on these steps (it’s for an older version of the CLI but will work fine): https://www.flux-digital.com/blog/creating-custom-items-resources-sitecore-cli-4-0/

Hopefully this is useful for others who need to do this for a module.

Incidentally, if you need a Sitecore 10.3 version of the SCScheduledPublishing module you can find a normal and an IAR package for 10.3 here: https://github.com/nehemiahj/SCScheduledPublishing/tree/main/Packages

My Sitecore SUGCON 2023 Takeaways – Day 2

> DAY ONE - If you haven't read about Day One you can read it here.

SUGCON DAY 2


The 2nd day of SUGCON started bright and early, so after a quick breakfast and cup of tea at the hotel I headed down to the first session I’d planned to see.

Rob’s session was one of the key sessions I really didn’t want to miss this year. A few clients I’ve spoken to recently (and other Sitecore devs I’ve chatted to at SUGCON so far) are facing this challenge:

‘How do we move to XM Cloud from XP and what do we need to consider?’

– so I was keen to learn from Rob’s experiences.

Migrating advanced Sitecore implementations to XM Cloud – Rob Habraken


Rob started by telling us the differences with XM Cloud and explaining how publishing works differently (given you publish to the Edge):


Rob then shared a typical XP implementation diagram and showed how XM Cloud differs, as integrations and functionality move into the head application:


He then discussed in detail what is and isn’t included in XM Cloud. Martin shared some similar slides the day before, but I think these were a little clearer, so I didn’t include them in the previous post:


This was also a pretty cool comparison of XP vs XM Cloud equivalent features:


Rob then discussed the Migration approach to XM Cloud. There was a lot of really useful info here about things to consider and how to get your project prepared for the migration and how to tackle it:


Next up were the different development approaches and workflows. I’ve talked about these before, but I didn’t know much about option 3 at all. I guess most Sitecore developers (especially in a small team) will use option 1, but option 3 is a really good approach for being able to use local content for your development without having to push it to XM Cloud:

Rob then went on to explain in detail how dynamic Content Resolvers don’t work and only static ones do. It’s possible to use some of the out-of-the-box ones or implement your own GraphQL Content Resolver:


This is an example of breadcrumbs in XM Cloud and a GraphQL search query:


Rob finished his talk with a summary of the benefits of XM Cloud. The shift in development domain and thinking is the tricky part for a lot of Sitecore developers, I feel:


Rendering your data in headless – 101 different ways – Mike Edwards


I’ve known Mike for a number of years now and he’s always a good speaker, so I was looking forward to Mike sharing his learnings from his headless journey.


Mike started by lamenting how things used to be easy in the world of MVC and server-side development, and then with all the jQuery and JS frameworks things became pretty bloated.

Things have moved on a lot in FE development though, and there are now many different options for building headless websites in Sitecore. Some of these I’m aware of or have experimented with; others I’d not heard of, such as ‘Island Architecture’.


SPAs bring their own set of problems in terms of page load times and indexability, so Mike went into Hydration and Partial Hydration techniques and approaches that try to solve these issues:


Then Mike explained more about Partial Hydration examples and Island Architecture. Island Architecture lets you create your web app with pure HTML and CSS for all the static content but then add in regions/placeholders of dynamic content to support interactive ‘islands’ of content. Given the rest of the page is static it downloads really quickly and is available to use faster.


Mike then covered Resumability, Edge/Serverless and tools such as Storybook and Hydration Payload.


There are some challenges and limitations which still need to be addressed:


Finally, Mike ended by saying that this is the future and we need to embrace the new world.

It was a really interesting talk and gave me a lot to think about and research further. The following talks were 15 minute lightning talks until lunch.

Leverage Sitecore Connect for Sitecore CDP – Sarah O’Reilly


I’d heard a fair bit about Connect, but I’ve not really seen much about how it actually works, so I was looking forward to this session.

Sarah took us through an example of using Connect to import user segment data from CDP into Google Ads.


Once the export was set up to build from CDP, the steps were then configured in Connect to sync to Google Ads:

There are tons of apps supported and different recipes defined, and it was impressive to see the options for building logic such as if statements / for loops, data mapping and manipulation all within Connect.


This was an insightful session and really interesting to see how it works. I can see how it could be used to help with migrating to XM Cloud from XP or another CMS platform.

Sitecore components explained for your marketers – Ugo Quaisse

The next session was about the Sitecore Components builder in Pages in XM Cloud. I’ve heard a bit about this but not seen much of it in detail. I was hoping to see a full demo of it; I guess as the session was only 15 minutes there wasn’t time, but I still learned quite a bit about how it works.

IMG_0715
IMG_0716
IMG_0717
IMG_0719

The Component Builder can be used without any development or code required at all. First, Themes are set up with colours, fonts and breakpoints configured.

Then datasources are set up and mapped from either a URL, JSON or GraphQL.

IMG_0720

Then the component’s ‘look and feel’ – layout, dimensions and sizing – can be configured in the Builder. This looks pretty neat. Then versioning and publishing is set up for the Component.

IMG_0721

IMG_0722

Lastly, some details were shared around the benefits for digital creatives: it’s possible to get sites built very quickly and easily using the Component Builder.

IMG_0723
IMG_0724
IMG_0726
IMG_0727


Leveraging XM Cloud APIs and Webhooks to powerup integrations – Ramkumar Dhinakaran & Elakkuvan Rajamani

IMG_0740

After lunch it was time for another session, this time on Webhooks. The use-case here was the XM Cloud Lighthouse Integration which would do an automated quality check of pages using Webhooks and report on it.

IMG_0731

IMG_0742
IMG_0745

Depending on the integration required it might not be best to use a Webhook:
IMG_0756
IMG_0758

Quite a lot of detail was shared with how this all works and integrates.
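As a rough sketch of the pattern described (a webhook event triggering an automated Lighthouse-style quality check), the snippet below shows the general shape of the decision logic. The payload fields and the `shouldRunQualityCheck` helper are hypothetical — the real XM Cloud webhook schema may differ, so treat this as an assumption-laden illustration only.

```typescript
// Hypothetical webhook payload shape – the actual XM Cloud schema may differ.
interface PublishWebhookPayload {
  eventType: string; // e.g. "publish", "save"
  itemPath?: string; // path of the affected page, if any
}

// Decide whether an incoming webhook event should trigger a Lighthouse-style
// quality check: only react to publish events that reference a page.
function shouldRunQualityCheck(payload: PublishWebhookPayload): boolean {
  return payload.eventType === "publish" && Boolean(payload.itemPath);
}

console.log(shouldRunQualityCheck({ eventType: "publish", itemPath: "/home" })); // true
console.log(shouldRunQualityCheck({ eventType: "save" })); // false
```

Filtering events like this is one answer to the "might not be best to use a Webhook" point above — you only pay the cost of the downstream check for the events that actually matter.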

IMG_0735
IMG_0736
IMG_0737
IMG_0747

There were some links and takeaways shared at the end.

IMG_0763


Sitecore Search: Real case PoC – Sebastian Winslow & Jesper Balle

IMG_0766

The 2nd to last session of the day was on Sitecore Search (based on Discover), which I was keen to learn more about as I didn’t know much about how it worked.

IMG_0770
IMG_0772
IMG_0773

The CEC (Customer Engagement Console) looks pretty powerful and can be used to manage search; performance is key, and widgets can be configured for search and catalog:

IMG_0774
IMG_0776
IMG_0778
IMG_0779

Some dev resources and admin info were shared:

IMG_0780
IMG_0782

The use case for search was a property site. There are still some features that need to be built.

IMG_0783
IMG_0785
IMG_0789
IMG_0790

Some info was then provided on triggers to fetch the content, and on request and document extractors to process and manipulate the content.

IMG_0791
IMG_0792
IMG_0793
IMG_0794

The Search API endpoints, results response, API Explorer and the ability to refine the widgets were also covered:
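To make the API-endpoint part a little more concrete, here is a sketch of building a search request body for a widget. Every field name below is an assumption for illustration only — it is not the documented Sitecore Search request schema, so check the official API reference before relying on any of it.

```typescript
// Illustrative only: the field names below are assumptions, not the
// documented Sitecore Search request schema.
function buildSearchRequest(widgetId: string, keyphrase: string, limit = 10) {
  return {
    widget: {
      items: [
        {
          rid: widgetId, // hypothetical: widget identifier configured in the CEC
          search: {
            limit, // cap results while testing, per the advice above
            query: { keyphrase }, // free-text search term
          },
        },
      ],
    },
  };
}

const body = buildSearchRequest("property_search", "apartment");
console.log(JSON.stringify(body));
```

Keeping the `limit` small while experimenting echoes the warning later in this session about being careful how much content you index and query during testing.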

IMG_0796
IMG_0798
IMG_0799
IMG_0801

It’s early days and the search SDK is still not there yet but it’s coming. Be careful with how much content you try and index when testing but there are some significant benefits to using it.

IMG_0803
IMG_0805

This was a really informative session and gave me all the info I was looking for about how to go about implementing search.

Experiences with Content Hub One – Journey of relaunching our Usergroup website – Katharina Luger & Christian Hahn

IMG_0807

Then it was time for my last session of the day on how the Sitecore User Group Germany rebuilt their site as an SPA using Content Hub One.

The slide below was probably the simplest comparison I saw in all of SUGCON of the differences between XM Cloud and Content Hub One.

IMG_0808
IMG_0811
IMG_0816
IMG_0818

There are 7 Steps to component creation:

IMG_0821
IMG_0822
IMG_0823
IMG_0824
IMG_0825
IMG_0826
IMG_0828

Lastly there were some challenges faced.

IMG_0831

This was a really great session and I’m looking forward to working with Content Hub One in the future.

Virtual Closing Keynote by Scott Hanselman

IMG_0833

There was then a really entertaining and insightful talk from Scott Hanselman. He had some great advice, wisdom and stories to share, and I think everyone in the room was pretty captivated by his talk.

IMG_0848

With that it was the end of SUGCON 2023, and there was a big round of applause for all the organisers. These events take a hell of a lot of organising and a real commitment from everyone involved.


IMG_0855

It was time to go and have a few beers and reflect on what was another brilliant SUGCON.

IMG_0862
Hopefully this is useful info for anyone that couldn’t attend this year or had too many beers and forgot what they learned :-).

My Sitecore SUGCON 2023 Takeaways – Day 1

sugcon-2023-banner

I’ve just got back from an awesome weekend in Malaga at Sitecore SUGCON 2023. If you’re not aware of what it is, it’s the developer conference organised by the Community each year to bring Sitecore developers together in Europe. I’ve been to quite a few SUGCONs over the years but I think this has been one of the best. The talks were really interesting and it was great to catch up with everyone in the Sitecore Community.

I arrived late on Wednesday at the rather impressive Melia near Malaga – right next to the beach. I’ve certainly been to worse locations for conferences. More venues like this for future SUGCONs please, Sitecore :-).

pool-banner

The roof top bar was particularly special – but there was some learning to do before I would be able to enjoy the pool.

> DAY TWO - you can read about Day Two here.

Day 1

I had a fairly clear idea of the talks I wanted to see and I managed to stick to it pretty well.

Opening Keynote – Steve Tzikakis and Dave O’Flanagan

dave-banner

The following day after a late breakfast I went to register and then to see Dave O’Flanagan kick off the conference and introduce Steve.

Steve couldn’t attend in person so joined via video link. He explained that, given the economic downturn, innovations such as ChatGPT would be key for diversifying product offerings, and then hinted that an integration of ChatGPT with Sitecore was in the works.

Many companies, such as SAP and Oracle, have slowed down to adapt to SaaS and headless, while Sitecore have grown a lot over the past few years. He said that Sitecore have outgrown Adobe for the past 6 quarters with around 20% growth; Sitecore are in 2nd place in the industry rankings and aim to catch Adobe.

steve-banner

There is a healthy 16% R&D investment and Steve set out his ambitious target of going from 40,000 Sitecore developers to 100,000 in the market. He said that he felt the tough part (moving to SaaS and creating/integrating new SaaS platforms) was now over and Sitecore are ready to move forwards. Sitecore intend to lead by innovation, and partners and developers need to be on board to continue investment and growth.

IMG_0511

Dave then took back the stage and stated that Sitecore want to lead with Composable DXP and be the best in the market. He also confirmed that DXP is not going anywhere and 10.4 is currently in the works. There are clients who are restricted geographically in what they can do with SaaS, or who want full control of their data and platform, so there is still a place for self-hosting with DXP.

IMG_0507

Then Dave hit us with a bit of very welcome honesty. He said that the SaaS products that Sitecore have procured/integrated and built over the past couple of years are now in a good place, but he acknowledged that the documentation, marketing, and information on migration and features is not great, and Sitecore are going to work on this ASAP. This was great to hear, as I think there is some confusion right now for current customers and potential new customers with all of the different SaaS offerings and XP/XM. I feel it’s quite tricky to understand and the messaging from Sitecore needs improving, especially around the migration path to SaaS.

IMG_0522

Dave went on to say that there are no new product announcements this SUGCON, instead Sitecore will slow down and help customers understand the new products better and explain how to migrate to them. It was good to hear that Sitecore understand there are many customers heavily invested in XP and that it’s not that easy for them to just jump ship to SaaS and that they want to know more about how they can help customers with this journey.

The discussion then turned to the Content, Engagement and Commerce clouds and the work Sitecore have done here with huge investment, especially in Content Cloud – of which XM Cloud is a key part and is getting better by the day.

IMG_0510

Content Hub ONE is Sitecore’s answer to a full headless CMS and they will work on integrating it into XM Cloud to allow you to pull content into XM Cloud seamlessly. Content Hub 4.3 is the latest version of Content Hub and all customers are now updated.

IMG_0512

Search is a new SaaS offering from Sitecore, it is developed on top of Discover (a SaaS platform Sitecore purchased, originally called Reflektion). It has now been developed further and is able to search all content and is powered by AI. I would learn more about this at sessions later in SUGCON.

IMG_0513

When looking at the cost of XM Cloud customers need to consider the TCO of their existing XP/XM (or other DXP platform) and not just the licence, hosting and development costs. I think it’s true that many clients probably don’t consider the ongoing maintenance costs of Azure Infrastructure / AKS / Solr / Redis etc and the cost and complexity of Upgrades, Security patches and so forth. Not to mention the cost of DevOps/Build Pipelines and other services. When you add this all up the cost may be around the same or a bit cheaper. One of the challenges here I think is different budgets in organisations that traditionally just pay for the hosting or licence etc so this may be something to navigate when it comes to the new world of SaaS.

IMG_0515

Dave then went on to talk about XM Cloud a bit further. They do know there are challenges with no longer having a CD instance in XM Cloud (these are instead replaced by Experience Edge servers, which just return the items via the Layout Service or GraphQL). He said that it is something they are trying to resolve and will look to the Community for potential solutions. This sounds interesting, so I’ll be keen to follow this idea further and see where it goes.
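To illustrate what the head application consumes from Experience Edge instead of a CD instance, a layout request looks roughly like the sketch below. The endpoint URL, site name and API key are placeholders, and the exact query shape should be checked against the Experience Edge documentation for your tenant.

```typescript
// Sketch of fetching a page's layout from Experience Edge via GraphQL.
// Endpoint, site name and API key are placeholders, not real values.
const layoutQuery = `
  query {
    layout(site: "my-site", routePath: "/", language: "en") {
      item {
        rendered
      }
    }
  }
`;

async function fetchLayout(): Promise<unknown> {
  const res = await fetch("https://edge.sitecorecloud.io/api/graphql/v1", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      sc_apikey: "<your-delivery-api-key>", // placeholder credential
    },
    body: JSON.stringify({ query: layoutQuery }),
  });
  const json = await res.json();
  // The rendered layout JSON the head renders, instead of server-side MVC.
  return json?.data?.layout?.item?.rendered;
}
```

The head (e.g. a Next.js app) would call something like `fetchLayout()` at build or request time and render components from the returned layout JSON — which is exactly why there is no CD server to customise any more.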

IMG_0516

He mentioned the impressive-looking Component Builder in Pages (which I also attended a session on later at SUGCON). Dave also discussed that Forms in XM Cloud is something they are working on currently; they have recognised it’s very important and are listening. This is a bit of a gap in the XM Cloud offering at the moment and some clients won’t be keen to use a 3rd-party option for this, so it’s good to hear this is in progress.

IMG_0518

Another interesting statement Dave made was that the feedback from front-end developers was that JSS is complex and not something they really want to learn; it has a lot of complexity baked in which some feel is not needed. Therefore, Sitecore are looking at how they can simplify this. I wasn’t exactly sure what was meant by this (I’ll try and find out more) but reducing complexity and barriers to entry is generally a good thing, I feel.

IMG_0519

Finally Dave discussed Sitecore Connect which is one of Sitecore’s most recent SaaS offerings which allows SaaS products to be integrated via a low code / no code approach.

IMG_0524

Dave mentioned Connect recipes will be provided to help customers move to SaaS, and these could be taken and customised to fit your requirements. I’d seen a bit on this and how it works from other Sitecore developers, such as this great post by Jeremy Davis. However, I was looking forward to learning more about how it all works later at SUGCON.

IMG_0525

Life at the Edge with Vercel and Next.js – Javi Velasco

IMG_0534

Next up was a partner talk from Vercel about their platform and different offerings. Javi explained how customers expect a lot more now in 2023 (faster, more dynamic and personalised) and the pandemic effectively jumped technology advancements and expectations forward by two years, to 2025.

He talked about how computing and innovations in compilers have improved vastly in recent years, and how Edge workers can now execute code extremely fast, providing performance similar to Static Page Generation, as well as Edge Middleware.

IMG_0533

I’ve not used Vercel yet but I’ve heard good things about it and Next.js (which they also created) and I know that Sitecore partner with them for XM Cloud so it was interesting to learn more about it all.

Accelerate website builds with Headless SXA and XM Cloud – Martin Miles

IMG_0541

We then needed to select our talks for the afternoon. I decided to pick Martin Miles’s talk about XM Cloud and while I’ve learned a fair bit about XM Cloud in the past 4 or 5 months and spoken on it at both the Manchester and Columbus SUGs I still learned a lot from Martin’s talk.

I know Martin plans to share his slides on his blog and the slides are very detailed. So I’ve tried to pick out some key slides which I thought were really useful or had important information that I hadn’t really seen detailed elsewhere.

Auto update and upgrade information & Licensing model:

IMG_0547

IMG_0548

Some additional limitations I wasn’t aware of:

IMG_0549

Further architecture details:

IMG_0550

More details about Webhook event handlers:

IMG_0551

Some GraphQL details and limitations:

IMG_0552

More authentication and authorisation details:

IMG_0554

Some more details on Embedded Personalisation & Analytics from CDP:

IMG_0563

IMG_0564

Docker development details:

IMG_0565

Sitecore CLI details:

IMG_0567

XM Cloud folder structure, files and folders overview:

IMG_0568

I hadn’t really thought about how XM Cloud deploys items, but it makes sense that it uses Items as Resource files:

IMG_0570

SPE usage in XM Cloud:

IMG_0571

Some useful developer tips and tricks:

IMG_0573

Some really useful migration details for headless solutions:

IMG_0574

Two different MVC migration routes:

IMG_0575

Lastly some really useful Headless SXA details:

IMG_0577

IMG_0578

IMG_0579

Once Martin shares his talk I’ll link it here, as there was lots more useful information in it.

SXA MVC & Headless SXA – a MOVING tale… – Jason Wilkerson

IMG_0582

I’ve known Jason for quite a while and he’s always an entertaining speaker so I was looking forward to this one. It didn’t disappoint and Jason started with a story about Hipster developers…

IMG_0584

Being a predominantly back-end/C#/.NET developer and coming from the Microsoft stack, I can really relate to this too. All this new-fangled hipster FE/headless development is kind of ‘mind boggling’ and a big shift in thinking for those of us who’ve been around since the WebForms (or in my case, classic ASP) days.

IMG_0583

I’ve done some React development with Sitecore and the JSS Training course but I’m still not 100% comfortable with the shift yet.

IMG_0585

Jason’s talk was great as he showed us how you would go about building an example ‘Spotlight’ component in classic SXA/MVC and how that differs when building a JSS headless component in React.

Here are a couple of slides from building the MVC Component, I think this is pretty well understood by most Sitecore Devs:

IMG_0587

IMG_0590

IMG_0589

There were some learnings that Jason shared with us around the differences with doing this for JSS instead:

IMG_0597

IMG_0598

IMG_0599

IMG_0601

IMG_0602

The rendering variants setup is quite different for MVC:

IMG_0607

Then the JSS variant; this looks like quite a nice approach:

IMG_0609

IMG_0612

There are limitations of JSS renderings: the search components of SXA are not available, and if you’re using SSG then you can’t use forms. There are also personalisation restrictions too:

IMG_0615

IMG_0616

IMG_0618

Lastly, Jason shared some training links for those who are new to this and need a good place to start.

IMG_0619

XM Cloud and Content Hub ONE Battle Royale – Rick Bauer & Richard Seal

The final session of the day for me was Rick and Richard’s talk, which was positioned as a battle between the two platforms. It made for a pretty fast-paced and entertaining talk.

Pretty much all the info was on the slides so I’m going to just drop them all below:

IMG_0620

IMG_0621

IMG_0622

IMG_0623

IMG_0624

IMG_0625

IMG_0626

IMG_0627

IMG_0628

IMG_0629

IMG_0630

At the end there was a final summary that confirmed that XM Cloud and Content Hub ONE are different products and are positioned separately in the market to meet different requirements:

IMG_0632

End of Day 1

It was then time for dinner, catching up with friends from the Sitecore Community and the MVP awards ceremony. Unfortunately the MVP awards had got stuck in Customs again (despite Tamas’s best efforts) but we got a few photos, and there was an entertaining performance from Rodrigo and Sebastian, and also the quiz; followed by a few well-earned beers.

Fr_WawlXwAEc5-d

IMG_0634

You can read about what I learned on day two here.

What I’m looking forward to seeing at SUGCON 2023

sugcon-2023-banner-hotel

I’m really looking forward to SUGCON 2023 in Malaga in a couple of days’ time and have decided to take a more in-depth look at the conference sessions.

At previous SUGCONs I’ve generally planned the talks I want to go to on the flight over, so this time I thought I’d do it up-front and share my thoughts on why.

Bear in mind that I’m a Sitecore Developer and Consultant so I naturally lean towards more technical tracks. You can find the full agenda here and you can read more about the sessions here.

Thursday

After the Opening Keynote by Steve Tzikakis and Dave O’Flanagan and the other initial sessions we need to decide on which of the parallel talks to attend. As usual there are a few clashes here, but these are my selections for the rest of the day:

3:10 pm – 3:55 pm – Martin Miles: Accelerate website builds with Headless SXA and XM Cloud
Rationale: Martin has been blogging a lot on XM Cloud over the past year and has a lot of knowledge to share, so I’m really keen to learn more about how to build headless websites with Headless SXA and XM Cloud and what Martin has learned from doing so.
Notable alternative: Andy Cohen – Innovations in Deploy

4:40 pm – 5:25 pm – Jason Wilkerson: SXA MVC & Headless SXA – a MOVING tale…
Rationale: Jason is always a really engaging speaker and I’m intrigued to know more about transitioning MVC-based SXA sites to a headless implementation of SXA.
Notable alternative: Thomas Stern – Hacking Sitecore

5:30 pm – 6:15 pm – Rick Bauer & Richard Seal: XM Cloud and Content Hub ONE Battle Royale
Rationale: This sounds like an interesting session and, having learnt a fair bit about XM Cloud recently, I’d like to know more about Content Hub ONE and how it compares.
Notable alternative: Vasiliy Fomichev – Crafting rock-solid secure composable Sitecore SaaS-based applications

Friday

There is a Content Hub ONE – Insights session to start the day, but these are my choices for the rest of it. The first talk selection was a tough one!

9:50 am – 10:35 am – Rob Habraken: Migrating advanced Sitecore implementations to XM Cloud
Rationale: I feel this is a must-attend session for any Sitecore developer who might be working on projects that plan to move to XM Cloud from XP/XM.
Notable alternative: Kiran Patil & Sheetal Jain – Upgrade path for a Monolithic Developer to a Composable Developer

11:15 am – 12:00 pm – Mike Edwards: Rendering your data in headless – 101 different ways
Rationale: Mike will no doubt have a lot of knowledge to share on headless, so I’m interested to learn more about the different rendering patterns and the pros and cons of each approach.
Notable alternative: Bart Plasmeijer – Keep the door open when transforming from Sitecore XM to composable DXP using XM Cloud!

12:10 pm – 12:25 pm – Chris Sulham: Grappling with the Many Heads of Headless
Rationale: I’m interested to hear more about the considerations for headless and the benefits and drawbacks of each.
Notable alternative: Mark Lowe – A Road Trip to Composable Canyon

12:30 pm – 12:45 pm – Sarah O’Reilly: Leverage Sitecore Connect for Sitecore CDP
Rationale: I don’t know very much about Sitecore Connect yet (other than that it’s based on Workato) so I’m keen to learn more about it and how it works.
Notable alternative: Himadri Chakrabarti – What is Flexibility Over Features Philosophy in Sitecore OrderCloud Architecture

12:50 pm – 1:05 pm – Kingsley Hibbert & Mathew Evans: DevOps Composability in a Composable World
Rationale: This session sounds very informative and covers something I’ve not really considered too much.
Notable alternative: Nicky Vadera – Using External Components in Content Hub 4.2

2:20 pm – 3:05 pm – Ramkumar Dhinakaran & Elakkuvan Rajamani: Leveraging XM Cloud APIs and Webhooks to powerup integrations
Rationale: Webhooks are a new feature in XM Cloud and Sitecore 10.3, so it will be cool to see some examples of these in action.
Notable alternative: Daniela Militaru & Katharina Luger – Women in Sitecore Roundtable

3:15 pm – 4:00 pm – Sebastian Winslow & Jesper Balle: Sitecore Search: Real case PoC
Rationale: Search is tricky in a headless world, so I’m looking forward to learning about potential solutions and approaches.
Notable alternative: Sebastian Winter – Sitecore Components in Action

Hope everyone who’s going enjoys SUGCON and hopefully I’ll see some of you there.