When you’re in the publishing industry, regular database backups are crucial: one deleted server or accidental drop of a database can mean hours, days, or more of lost content. This is compounded by platforms like WordPress, which encourage users to author content directly in the content management system; if the database ever disappears, you’ve lost all of your content!
To protect that content, publishers should be creating backups on a regular schedule. If you're using a managed database service like Amazon RDS, backups may be handled for you, but you can never have too many backups of your production database.
One simple approach that we’re using as part of our more comprehensive disaster recovery plan is an automated, daily backup of all of our databases to Amazon S3.
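A backup like that can be sketched in a few lines of shell. This is a minimal illustration of the idea, not our actual script: the bucket name, database name, and `mysqldump` flags are placeholders you'd adapt to your own environment.

```shell
#!/usr/bin/env bash
# Sketch of a nightly database backup to Amazon S3 (names are placeholders).
set -euo pipefail

BUCKET="s3://example-db-backups"   # assumption: your backup bucket

# Build the dated S3 object key for one database's dump.
backup_key() {
  local db="$1" stamp="$2"
  printf '%s/%s-%s.sql.gz' "$db" "$db" "$stamp"
}

# Dump a single database, compress it, and stream it straight to S3.
backup_db() {
  local db="$1" key
  key="$(backup_key "$db" "$(date +%F)")"
  mysqldump --single-transaction "$db" | gzip | aws s3 cp - "${BUCKET}/${key}"
}

# Invoke once a day from cron, e.g.:
#   backup_db wordpress_prod
```

Streaming the dump through `gzip` into `aws s3 cp -` avoids ever holding the full backup on local disk.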
Growella uses DeployBot to handle deployments, but we're also using Gulp to run webpack, Sass, and other compilation tasks, then building a dist/ directory that acts as the root of our deployment. This ensures we're only deploying the files we need in production, while leaving out development assets.
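In DeployBot, that build step boils down to a short list of commands. The fragment below is illustrative, not our exact configuration; the task names are assumptions.

```shell
# Illustrative DeployBot build commands (task names are assumptions):
npm ci       # install build-time dependencies
gulp build   # run webpack, Sass, and other compilation tasks into dist/
# dist/ is then set as the deployment root, so development assets
# (node_modules, source Sass, etc.) never reach production.
```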
We're also building Growella as a twelve-factor application, so we're using Composer and WPackagist to pull in our dependencies, which is super easy to do with DeployBot. We're even able to cache the composer install, ensuring it's only run when something has changed in composer.json (for instance, installing/upgrading a plugin).
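The caching idea can be approximated by hashing composer.json and skipping the install when the hash matches the last build. This is a sketch of the concept under that assumption, not DeployBot's internal mechanism; the cache-file path is hypothetical.

```shell
#!/usr/bin/env bash
set -euo pipefail

# Return 0 (run composer install) if composer.json has changed since the
# last build, 1 (skip it) otherwise. Paths are illustrative.
needs_install() {
  local manifest="$1" cache="$2"
  local current previous=""
  current="$(sha1sum "$manifest" | cut -d' ' -f1)"
  [ -f "$cache" ] && previous="$(cat "$cache")"
  if [ "$current" != "$previous" ]; then
    printf '%s' "$current" > "$cache"   # remember this manifest for next time
    return 0
  fi
  return 1
}

# Usage in a build step:
#   if needs_install composer.json .composer-hash; then composer install; fi
```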
On paper, this looks great: we're only running Composer when we need to, and we're able to prepare a nice, packaged version of the site for delivery to the target server, taking advantage of DeployBot's atomic deployment pattern.
Here's the rub: it was slow. Unacceptably slow, taking 10–15 minutes to deploy a WordPress site.
We're proud to announce that we've published our first plugin to the WordPress.org repository: Dynamic Asset Versioning.
The plugin detects when an enqueued script or style (registered via wp_enqueue_script() or wp_enqueue_style(), respectively) is missing an explicit version number and, when such an asset is found, a version number is created dynamically.
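To make the behavior concrete, here's an illustration of dynamic versioning in shell. The plugin itself runs inside WordPress (PHP), and the text above doesn't say how the version is derived; using the asset file's last-modified time, as below, is an assumption about one common cache-busting scheme, not a description of the plugin's internals.

```shell
#!/usr/bin/env bash
set -euo pipefail

# Derive a version for an asset that was enqueued without one.
# Assumption: the file's mtime (seconds since the epoch) serves as the version.
asset_version() {
  date -r "$1" +%s   # GNU date: print the file's last-modification time
}

# Append the generated version to the asset URL as a ?ver= query argument,
# mirroring how WordPress appends version strings to enqueued assets.
versioned_url() {
  printf '%s?ver=%s' "$1" "$(asset_version "$2")"
}

# e.g. versioned_url "https://example.com/app.css" "dist/app.css"
```

Because the version changes whenever the file does, browsers re-fetch updated assets without anyone having to bump a version string by hand.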