I recently ran into some trouble trying to update my blog engine (Ghost) to the latest version. I was more than a year behind, which represented about 29 updates 😳. As you can imagine, catching up didn't go so well. In fact, nothing at all was working after my first attempt. I even corrupted my database at some point. Fortunately, I had backups. In the end, I was able to correct all the issues, but I had to build some tooling to help me.

Then I remembered one of the first rules of continuous delivery:

If it hurts, do it more often

At that point, it was pretty clear that I needed a much better process to keep my blog engine up to date.


To make it a bit easier to follow, I'll give you some context about my environment.
I run this blog on Azure using an App Service which hosts Ghost on NodeJS, and the underlying database is SQLite. It's important to note that Ghost doesn't run out of the box on Azure. However, Yannick Reekmans has an excellent fork that takes care of all the details to host it in an Azure App Service.

Here's what I wanted from the new process:
  • Kick off automatically on new versions of Ghost
  • Backup important data
  • Use App Service slots to safely deploy and test
  • Build and run the migration off the production servers to minimize downtime


The first step really is to build a deployable package in a repeatable way. Let's see how we can do it using Azure DevOps.



  1. Create a new Build pipeline in Azure DevOps
    Make sure to use the classic editor

  2. Select your git repository

  3. Choose "Empty Job"

  4. Make sure to add the following variables

  • backupWebJobName
  • scmUsername
  • scmPassword
  • scmUrl



Backup DB

This will manually start the WebJob that creates a backup of our SQLite DB and names it backup.db. I won't go over the details, but if you try to download the running DB directly, you'll get a DB with 0 bytes, as it's locked by another process. Fortunately, Tom Chantler made a simple WebJob that does the trick.
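For context, the reason a plain file copy fails is that SQLite only guarantees a consistent snapshot through its online backup API. The actual WebJob isn't shown here, but the technique it relies on can be sketched in a few lines of Python (the paths are hypothetical, mirroring Ghost's content/data/ layout):

```python
import sqlite3

def backup_sqlite(live_path: str, backup_path: str) -> None:
    """Snapshot a live SQLite database using the online backup API.

    Copying the file directly while Ghost holds it open yields a 0-byte
    (or corrupt) file; the backup API copies pages under SQLite's own
    locking, so the snapshot stays consistent even during writes.
    """
    src = sqlite3.connect(live_path)
    dst = sqlite3.connect(backup_path)
    with dst:
        src.backup(dst)  # page-by-page copy, safe against concurrent writers
    dst.close()
    src.close()

# Hypothetical paths, mirroring the App Service layout:
# backup_sqlite("content/data/ghost.db", "content/data/backup.db")
```

Note that `Connection.backup` requires Python 3.7+; any tool built on the same API (like the WebJob above) gets the same guarantee.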

Task type: Rest call

  backupWebJobName: 'BackupDb-Manual'

- task: CdiscountAlm.rest-call-build-task.custom-build-task.restCallBuildTask@0
  displayName: 'Start DB backup web job'
  inputs:
    webserviceEndpoint: '$(Parameters.webserviceEndpoint)'
    relativeUrl: 'api/triggeredwebjobs/$(backupWebJobName)/run'
    httpVerb: POST

WebService endpoint
Create a generic service connection that points to your App Service's Kudu dashboard. The format is: https://[YOUR_APP_SERVICE_NAME_HERE].scm.azurewebsites.net/


Download images

This will download the content/images folder. Since we use slot rotation, we need to copy all the data from the production slot to our dev slot.

Task type: Download file

- task: Fizcko.azure-devops-download-a-file.azure.devops.download.a.file.DownloadAFile@1
  displayName: 'Download images'
  inputs:
    strUrl: '$(scmUrl)/api/zip/site/wwwroot/content/images/'
    strTargetDir: '$(Build.ArtifactStagingDirectory)'
    strTargetFilename: images.zip
    authType: basic
    basicAuthUsername: '$(scmUsername)'
    basicAuthPassword: '$(scmPassword)'

Extract images

Now let's extract the images.zip we just downloaded over the files we pulled from git, merging everything together.

Task type: Extract files

- task: ExtractFiles@1
  displayName: 'Extract images.zip'
  inputs:
    archiveFilePatterns: '$(Build.ArtifactStagingDirectory)/images.zip'
    destinationFolder: content/images

Download DB backup

Time to download our newly created backup DB (backup.db) and copy it into the file structure we pulled from git. You might wonder why we don't do this right after the first step, where we kick off the WebJob that performs the backup. Well, the WebJob is fairly fast (~2s), but it's also asynchronous, which means the backup file may not be complete when you try to download it. That's why we pushed this task a bit later in our pipeline: it gives the backup job time to finish before we try to download the backup.db file.
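If you'd rather not rely on task ordering alone, you could poll Kudu until the run actually finishes. Here's a minimal sketch in Python, with the HTTP call abstracted behind a callable; the endpoint shape and status strings are assumptions based on Kudu's triggered WebJobs API, not something this pipeline requires:

```python
import time

def wait_for_webjob(get_status, timeout_s=60.0, poll_s=2.0):
    """Poll a triggered WebJob until its latest run leaves the running state.

    get_status is any zero-argument callable returning the latest run's
    status string -- e.g. one that GETs
    $(scmUrl)/api/triggeredwebjobs/$(backupWebJobName) with basic auth
    and reads latest_run.status from the JSON response.
    """
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        status = get_status()
        if status not in ("Running", "Initializing"):
            return status  # e.g. "Success" or "Failed"
        time.sleep(poll_s)
    raise TimeoutError("WebJob did not finish within %.0fs" % timeout_s)
```

Keeping the HTTP call injectable also makes the waiting logic trivial to test without a live App Service.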

Task type: Download file

- task: Fizcko.azure-devops-download-a-file.azure.devops.download.a.file.DownloadAFile@1
  displayName: 'Download backup.db'
  inputs:
    strUrl: '$(scmUrl)/api/vfs/site/wwwroot/content/data/backup.db'
    strTargetDir: content/data/
    strTargetFilename: ghost.db
    authType: basic
    basicAuthUsername: '$(scmUsername)'
    basicAuthPassword: '$(scmPassword)'

Npm install

Nothing much to say here. Just download the trillion packages we need to run the database migration.

Task type: npm install

- task: Npm@1
  displayName: 'npm install'
  inputs:
    verbose: false

DB Migration

Let's run the DB migration scripts. This will migrate our SQLite DB to the latest schema. Ghost already provides these scripts for each version; we only need to launch their JavaScript DB migrator.

The Ghost server will run this database migration automatically on startup if needed. However, to minimize downtime while deploying a new version, it's better to run it as part of our build pipeline. This also lets us inspect the DB before deploying it, you know... in case the migration went wrong. Cough... Cough... Murphy's law.

Task type: PowerShell

- powershell: 'node .\db.js'
  failOnStderr: true
  displayName: 'DB migration'

Remove node_modules

node_modules is a really big folder and might cause issues if we add it to our deploy package. Let's remove it!

Task type: PowerShell

- powershell: 'Remove-Item .\node_modules -Recurse'
  failOnStderr: true
  displayName: 'Remove node_modules'

Archive files

It's now time to zip everything into one deployable package.

Task type: Archive file

- task: ArchiveFiles@1
  displayName: 'Archive files'
  inputs:
    rootFolder: '$(System.DefaultWorkingDirectory)'
    includeRootFolder: false

Copy Files

Copy that package, named with the build number, from $(Build.ArtifactStagingDirectory) to the folder we'll publish. Keeping the build number in the name could be useful if we need to debug multiple deployments.

Task type: Copy files

- task: CopyFiles@2
  displayName: 'Copy Files'
  inputs:
    SourceFolder: '$(Build.ArtifactStagingDirectory)'
    Contents: '$(Build.BuildId).zip'
    TargetFolder: '$(Build.ArtifactStagingDirectory)\ArtifactsToBePublished'

Publish artifact

Finally, publish that package to Azure Pipelines, so it will be available for our release pipeline later on.

Task type: Publish build artifacts

- task: PublishBuildArtifacts@1
  displayName: 'Publish Artifact: drop'
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)\ArtifactsToBePublished'


You should now have a pipeline that looks like this one

Let's not forget to trigger a build when there's a new commit. To do so, go to the Triggers tab and select
Enable continuous integration

You are now all set, congratulations! Click Save & Queue and give it a try. In my next post, I'll show you how to deploy that package with zero downtime, all of that using Azure App Service slots.

Make sure to look at Part 2 to learn how to build the Release Pipeline for this package.