I recently ran into some trouble trying to update my blog engine (Ghost) to the latest version. I was more than a year behind, which represented about 29 updates 😳. As you can imagine, catching up didn't go so well. In fact, nothing at all was working after my first attempt. I even corrupted my database at some point. Fortunately, I had backups. In the end, I was able to correct all the issues, but I had to build some tooling to help me.
Then I remembered one of the first rules of continuous delivery:

> If it hurts, do it more often.
At that point, it was pretty clear that I needed a much better process to keep my blog engine up to date.
To make it a bit easier to follow, I'll give you some context about my environment.
I run this blog on Azure using an App Service which hosts Ghost on NodeJS, with SQLite as the underlying database. It's important to note that Ghost doesn't run out of the box on Azure. However, Yannick Reekmans has an excellent fork that takes care of all the details to host it in an Azure App Service.

Here's what I wanted the new update process to do:
- Kick off automatically on new versions of Ghost
- Backup important data
- Use App Service slots to safely deploy and test
- Build and run the migration off the production servers to minimize downtime
The first step really is to build a deployable package in a repeatable way. Let's see how we can do it using Azure DevOps.
1. Create a new Build pipeline in Azure DevOps
2. Make sure to use the classic editor
3. Select your git repository
4. Choose "Empty Job"
5. Make sure to add the following variables
This will manually start the webjob that creates a backup of our SQLite DB named `backup.db`. I won't go over the details, but if you try to download the running DB directly, you'll get a file with 0 bytes because the database is locked by another process. Fortunately, Tom Chantler made a simple WebJob that does the trick.
Task type: Rest call
```yaml
variables:
  backupWebJobName: 'BackupDb-Manual'

steps:
- task: CdiscountAlm.rest-call-build-task.custom-build-task.restCallBuildTask@0
  displayName: 'Start DB backup web job'
  inputs:
    webserviceEndpoint: '$(Parameters.webserviceEndpoint)'
    relativeUrl: 'api/triggeredwebjobs/$(backupWebJobName)/run'
    httpVerb: POST
```
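To see why a dedicated webjob is needed, here's a minimal sketch of what a safe copy of a live SQLite database looks like, using SQLite's online backup API (shown here in Python purely as an illustration; the actual WebJob is Tom Chantler's, and the file names are assumptions):

```python
import sqlite3

def backup_live_db(src_path: str, dest_path: str) -> None:
    """Copy a SQLite database safely, even while another process holds it open."""
    src = sqlite3.connect(src_path)
    dest = sqlite3.connect(dest_path)
    # backup() takes a consistent, page-by-page snapshot instead of copying
    # the raw file, so we never end up with a truncated or locked read.
    src.backup(dest)
    dest.close()
    src.close()

# Usage (hypothetical paths):
# backup_live_db("ghost.db", "backup.db")
```

Copying the raw file while Ghost is running is exactly what produces the 0-byte download mentioned above; the backup API sidesteps the lock.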
Create a generic service connection that points to your App Service Kudu dashboard. The format is `https://<your-app-name>.scm.azurewebsites.net`.
This will download the `content/images` folder. Since we use slot rotation, we need to copy all the data from the production slot to our dev slot.
Task type: Download file
```yaml
steps:
- task: Fizcko.azure-devops-download-a-file.azure.devops.download.a.file.DownloadAFile@1
  displayName: 'Download images'
  inputs:
    strUrl: '$(scmUrl)/api/zip/site/wwwroot/content/images/'
    strTargetDir: '$(Build.ArtifactStagingDirectory)'
    strTargetFilename: images.zip
    authType: basic
    basicAuthUsername: '$(scmUsername)'
    basicAuthPassword: '$(scmPassword)'
```
Now let's extract the `images.zip` we just downloaded into the files we pulled from git and merge everything together.
Task type: Extract files
```yaml
steps:
- task: ExtractFiles@1
  displayName: 'Extract images.zip'
  inputs:
    archiveFilePatterns: '$(Build.ArtifactStagingDirectory)/images.zip'
    destinationFolder: content/images
```
Download DB backup
Time to download our newly created backup DB (`backup.db`) and copy it into the file structure we pulled from git. You might wonder why we don't do this right after the first step, where we kick off the backup webjob. Well, the webjob is fairly fast (~2s), but it's also asynchronous, which means the backup file might not be complete when you try to download it. That's why we pushed this task a bit later in the pipeline: it makes sure the backup job is completely done before we try to download the file.
Task type: Download file
```yaml
steps:
- task: Fizcko.azure-devops-download-a-file.azure.devops.download.a.file.DownloadAFile@1
  displayName: 'Download backup.db'
  inputs:
    strUrl: '$(scmUrl)/api/vfs/site/wwwroot/content/data/backup.db'
    strTargetDir: content/data/
    strTargetFilename: ghost.db
    authType: basic
    basicAuthUsername: '$(scmUsername)'
    basicAuthPassword: '$(scmPassword)'
```
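If you'd rather not rely on task ordering to hide the webjob's asynchrony, you could poll Kudu until the run finishes. Here's a small sketch; the endpoint shape and the `"Running"`/`"Success"` status strings reflect Kudu's triggered-webjob API as I understand it, so treat them as assumptions to verify against your own instance:

```python
import base64
import json
import time
import urllib.request

def latest_run_status(payload: dict) -> str:
    """Extract the status of a triggered webjob's most recent run."""
    return (payload.get("latest_run") or {}).get("status", "Unknown")

def wait_for_webjob(scm_url: str, job: str, user: str, password: str,
                    timeout_s: int = 60) -> str:
    """Poll Kudu's triggered-webjob endpoint until the latest run is done."""
    auth = base64.b64encode(f"{user}:{password}".encode()).decode()
    req = urllib.request.Request(
        f"{scm_url}/api/triggeredwebjobs/{job}",
        headers={"Authorization": f"Basic {auth}"})
    deadline = time.time() + timeout_s
    while time.time() < deadline:
        with urllib.request.urlopen(req) as resp:
            status = latest_run_status(json.load(resp))
        if status != "Running":
            return status
        time.sleep(2)
    raise TimeoutError(f"webjob {job} still running after {timeout_s}s")
```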
Nothing much to say here. Just download the trillion packages we need to run the database migration.
Task type: npm install
```yaml
steps:
- task: Npm@1
  displayName: 'npm install'
  inputs:
    verbose: false
```
The Ghost server will run this database migration automatically on startup if needed. However, to minimize downtime while deploying a new version, it's better to run it as part of our build pipeline. This also allows us to inspect the DB before deploying it, you know... in case the migration went wrong. Cough... cough... Murphy's law.
Task type: PowerShell
```yaml
steps:
- powershell: 'node .\db.js'
  failOnStderr: true
  displayName: 'DB migration'
```
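Since the whole point of migrating in the pipeline is being able to inspect the DB before deploying, you could add one more step that runs a quick sanity check against the migrated database. A minimal sketch, assuming Ghost's SQLite schema still contains a populated `posts` table (the `content/data/ghost.db` path comes from the download step above):

```python
import sqlite3
import sys

def count_posts(db_path: str) -> int:
    """Return how many rows survived the migration in Ghost's posts table."""
    with sqlite3.connect(db_path) as conn:
        (count,) = conn.execute("SELECT COUNT(*) FROM posts").fetchone()
    return count

if __name__ == "__main__" and len(sys.argv) > 1:
    posts = count_posts(sys.argv[1])
    print(f"{posts} posts found after migration")
    # Fail the pipeline if the migration somehow emptied the database.
    sys.exit(0 if posts > 0 else 1)
```

Wired into the pipeline, a non-zero exit code would fail the build before the broken DB ever reaches a deployment slot.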
`node_modules` is a really big folder and might cause issues if we add it to our deploy package. Let's remove it!
Task type: PowerShell
```yaml
steps:
- powershell: 'Remove-Item .\node_modules -Recurse'
  failOnStderr: true
  displayName: 'Remove node_modules'
```
It's now time to zip everything into one deployable package.
Task type: Archive files

```yaml
steps:
- task: ArchiveFiles@1
  displayName: 'Archive files'
  inputs:
    rootFolder: '$(System.DefaultWorkingDirectory)'
    includeRootFolder: false
```
Copy that package from `$(Build.ArtifactStagingDirectory)` and rename it with the build number. This could be useful if we need to debug multiple deployments.
Task type: Copy files
```yaml
steps:
- task: CopyFiles@2
  displayName: 'Copy Files'
  inputs:
    SourceFolder: '$(Build.ArtifactStagingDirectory)'
    Contents: '$(Build.BuildId).zip'
    TargetFolder: '$(Build.ArtifactStagingDirectory)\ArtifactsToBePublished'
```
Finally, publish that package to Azure Pipelines, so it will be available for our release pipeline later on.
Task type: Publish build artifacts
```yaml
steps:
- task: PublishBuildArtifacts@1
  displayName: 'Publish Artifact: drop'
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)\ArtifactsToBePublished'
```
You should now have a pipeline that looks like this one:
Let's not forget to set a trigger for new commits. To do so, go to the `Triggers` tab and select `Enable continuous integration`.
You are now all set, congratulations! Click `Save & Queue` and give it a try. In my next post, I'll show you how to deploy that package with zero downtime, all using Azure App Service slots.
Make sure to look at Part 2 to learn how to build the Release Pipeline for this package.