This post was most recently updated on September 19th, 2021.
Azure DevOps pipelines have a pretty handy feature called Pipeline Caching. It’ll help you avoid annoyingly long NuGet package restores in your builds. And that makes sense – why would you restore packages again and again and again, if you’re not updating your dependencies in the meantime? And especially with a CI or CD pipeline, you’ll end up with builds running constantly without packages actually getting bumped – so your NuGet task downloads the same packages over and over again.
That’s where Pipeline Caching comes in. But when is it useful?
Since Pipeline Caching comes with some overhead, it won’t always produce good results. In some cases – if restore is in fact quite quick – it might be slower than simply downloading the few packages required.
So let’s take a look at our options.
When to use or not use caching vs artifacts?
With Azure DevOps, you actually have 3 options:
- Use Azure DevOps Pipeline Caching
- Use Pipeline Artifacts (or the artifact stream)
- Don’t use either
Each of these 3 options is best suited to different cases. The following has been borrowed from Microsoft:
Pipeline caching and pipeline artifacts perform similar functions but are designed for different scenarios and should not be used interchangeably. In general:
Use pipeline artifacts when you need to take specific files produced in one job and share them with other jobs (and these other jobs will likely fail without them).
Use pipeline caching when you want to improve build time by reusing files from previous runs (and not having these files will not impact the job’s ability to run).
Essentially, artifacts are not what we want to use for NuGet packages (unless it’s a NuGet package we also publish ourselves!) – the decision is between using Pipeline Caching or going vanilla – i.e. not doing anything extra.
And hopefully, this article will help you make an informed decision!
Was it useful in our case?
Our solution had 10 or so projects, a couple of which were fairly large – but even the heaviest one had only a dozen dependencies or so. Nothing overwhelming.
Each project is dependent on the same versions of each particular package.
So what were our experiences?
Well… It’s a resounding “it’s ok, I guess”.
We shaved off some 30 seconds of build time per run, which equates to around 10%. But where a significant change happened is in the distribution of time between tasks in a run!
Before implementing Pipeline Caching, the NuGetCommand took at least as much time as the actual build (~100s or so). After implementing the caching, that’s down to roughly 20-30s (as the packages are restored from the local cache), while the caching steps now take an additional 30-50s to finish.
All in all, on average, a small save in build minutes. Definitely not worth a lot of work in tweaking the process, but still, if you’re actively waiting for the build to finish, 30 seconds already counts :)
Below, you can see an example from one build pipeline. The first few runs weren’t exactly optimized, but you can see the build duration trending a little bit lower after the changes than before.
So there’s an improvement, albeit a small one. But how do we implement this?
Okay – so the solution is to tweak the YAML template to implement Pipeline Caching. Let me show you how to do that!
Time needed: 30 minutes.
How to implement Pipeline Caching for an Azure DevOps pipeline?
- Enable package.lock files for your projects
This is a (soft) requirement for the caching to work – I’m sure there’s a way around this, but implementing lock-files for NuGet packages turned out to be quite easy, so…
Simply open your project files in edit mode like this:
And add this property:
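For a project called, say, ContosoProject.csproj (a placeholder name), enabling lock files means adding the RestorePackagesWithLockFile property to a PropertyGroup:

```xml
<PropertyGroup>
  <!-- Opt this project into NuGet lock files (packages.lock.json) -->
  <RestorePackagesWithLockFile>true</RestorePackagesWithLockFile>
</PropertyGroup>
```

After the next restore, NuGet generates a packages.lock.json next to the project file – commit it to source control, since the cache key is computed from these files.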
There’s a great article published by Microsoft on NuGet package lock files.
- Add a variable to hold the NuGet package restore path
You can do this directly in your pipeline definition, somewhat like this:
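Something along these lines – NUGET_PACKAGES is the environment variable NuGet reads to locate its global packages folder, so pointing it under $(Pipeline.Workspace) puts the restored packages in a path the cache task can save and restore:

```yaml
variables:
  # Redirect NuGet's global packages folder into the agent workspace so it can be cached
  NUGET_PACKAGES: $(Pipeline.Workspace)/.nuget/packages
```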
- Add the cache task to your YAML file
This one will be something like so:
```yaml
- task: Cache@2
  inputs:
    key: 'nuget | "$(Agent.OS)" | **/packages.lock.json,!**/bin/**'
    restoreKeys: |
      nuget | "$(Agent.OS)"
    path: $(NUGET_PACKAGES)
  displayName: Cache NuGet packages
```
(If WordPress messes up the YAML above, there’s another code snippet below to copy-paste it from)
This task will restore data from the cache if a matching key is found. And this part is actually really nifty because the key is a combination of 3 things:
1. A static string, “nuget”
2. Platform (agent OS)
3. A combination of hashes of package lock files
This means that the task won’t get a cache hit if the lock files have changed – e.g. if there are updates to the packages – but otherwise it will either restore packages from the cache, or store them at the end of the run (if new versions were pulled).
- That’s it!
Just run your pipeline and see how it goes! If all’s good, cache contents will be saved automatically during the first run and fetched from the cache from thereon.
Just in case WordPress messes up the code snippet in the How-to above, here’s a further sample for your copy-pasting needs:
```yaml
# ASP.NET
# Build and test ASP.NET projects.
# Add steps that publish symbols, save build artifacts, deploy, and more:
# https://docs.microsoft.com/azure/devops/pipelines/apps/aspnet/build-aspnet-4

trigger:
- DEV
- release/*

pool:
  vmImage: 'windows-latest'

variables:
  solution: 'ContosoSolution.sln'
  NUGET_PACKAGES: $(Pipeline.Workspace)/.nuget/packages

steps:
- task: NuGetToolInstaller@1

- task: Cache@2
  inputs:
    key: 'nuget | "$(Agent.OS)" | **/packages.lock.json,!**/bin/**'
    restoreKeys: |
      nuget | "$(Agent.OS)"
    path: $(NUGET_PACKAGES)
  displayName: Cache NuGet packages

- task: NuGetCommand@2
  inputs:
    restoreSolution: '$(solution)'

- task: VSBuild@1
  inputs:
    solution: '$(solution)'
    platform: '$(buildPlatform)'
    configuration: '$(buildConfiguration)'
```
Note: As Dmitri notes in the comments below, as long as your packages.lock.json files’ build action is set to None, you can remove the !**/bin/** directive from the setting, since then you won’t need to exclude the “output folder” of your builds from where your pipeline looks for the lock files to figure out your dependencies.
And it’s definitely a best practice not to copy any files you don’t need to the output folder, so this is a valuable point in general, and not really specific just to the YAML sample in question!