Software Life Cycle: 4. Updating The Production Servers


Overview

This document describes the fourth and final stage of the WormBase software release cycle, moving new databases and software into production. New releases of the web app and associated databases are staged on the development server.

The scripts that mediate a production release are maintained in the website-admin git repository.

Steps

Check disk space on local and remote nodes
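
For example (a sketch; the node names are placeholders for the actual staging/production hosts):

staging> df -h /usr/local/wormbase    # local staging host
staging> for node in node1 node2; do ssh $node 'df -h /usr/local/wormbase'; done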

Purge a specific release from production nodes. This need not be run every release but is useful to periodically clear out old builds.

Remove build WSXXX -- including acedb, mysql, and support DBs -- from the staging cluster:

staging> website-admin/update/production/purge_old_releases.sh --release WSXXX --target staging

Remove build WSXXX -- including acedb, mysql, and support DBs -- from the production cluster:

staging> website-admin/update/production/purge_old_releases.sh --release WSXXX --target production

Put the production site into maintenance mode

TODO
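
Until this step is documented, one common pattern is sketched below: drop a flag file that the front end checks on each request (entirely hypothetical; the flag path and the mechanism itself are assumptions, not the documented WormBase procedure).

staging> for node in node1 node2; do ssh $node 'touch /usr/local/wormbase/website/production/maintenance.flag'; done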

Push databases to production nodes

Push acedb, mysql, and support databases to production (or staging) nodes.

 staging> ./steps/push_staged_release_to_nodes.pl --release $RELEASE --target production
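
For example, to stage the WS240 build before the production push (WS240 is simply an illustrative release):

staging> ./steps/push_staged_release_to_nodes.pl --release WS240 --target staging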

Update symlinks on production nodes

We use symlinks to point to the most current versions of acedb and mysql databases in both production and on the FTP site. These need to be adjusted prior to going live.

staging> ./steps/adjust_symlinks.pl --release WSXXX --target production
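
Conceptually, the script repoints per-database symlinks on each node along these lines (a sketch only; the link names and paths are assumptions, not what adjust_symlinks.pl literally does):

staging> ssh node1 'ln -sfn /usr/local/wormbase/acedb/wormbase_WS240 /usr/local/wormbase/acedb/wormbase'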


Deploy the web app

Run deploy_webapp.pl from any WormBase host to push a new software version into production. This script:

  1. gets the current major version of the webapp from lib/WormBase/Web.pm
  2. gets the number of commits for the source in wb-dev:website/staging since the last tag
  3. creates a VERSION.txt in the staging directory
  4. backs up the production/ directory on each node to archive/VERSION_STRING for easy rollbacks
  5. visits each staging or production host and pulls the specified branch
  6. sends a HUP to starman on remote nodes, forcing a graceful restart of all workers
  7. creates a software release on the FTP site and symlinks current.tgz to it
  8. copies the staged code to wb-dev:/usr/local/wormbase/website/production.current for easy reference
  9. minimizes JavaScript and CSS via UglifyJS (see Appendix 4)

staging> website-admin/update/production/steps/deploy_webapp.pl --release WSXXX --target production|staging

Every push to production is

tied to a version of the database
tied to a major version of the webapp
tied to a revision corresponding to the number of commits since the last tag

These are encoded in a single version string. For example, WS221-2010.12.24-v0.02r0 breaks down as:

   WS221        Acedb (database) version
   2010.12.24   date of the push
   v0.02        major software version
   r0           number of commits since the WS221 tag
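
A sketch of how such a string can be assembled with git (deploy_webapp.pl is authoritative; the tag and major version here are illustrative):

shell> COMMITS=$(git describe --tags --long | awk -F- '{print $(NF-1)}')   # "git describe" prints e.g. WS221-42-gabc123
shell> DATE=$(date +%Y.%m.%d)
shell> MAJOR=v0.02                                                         # in practice, read from lib/WormBase/Web.pm
shell> echo "WS221-${DATE}-${MAJOR}r${COMMITS}" > VERSION.txt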

Working with git branches

We develop code for each release on the master branch. Once ready, these changes are merged into the "production" branch and pushed.

Bug fixes are made on the "production" branch and then merged from production -> master.

Tags are used to denote specific releases.

For reference, here is how the production branch was created.

shell> git branch --track production origin/master
shell> git checkout production

To work on a branch:

shell> cd /path/to/your/repository
shell> git pull
shell> git checkout [production|master]
shell> git branch      # show which branch you are currently on
shell> git branch -a   # list all branches, including remote-tracking branches

To make a bug fix to production:

shell> cd /path/to/your/repository
shell> git checkout production
... write code of heart breaking beauty and humble economy ...
shell> git commit -m 'just resolved all the issues'
shell> git push
shell> git checkout master
shell> git merge production
shell> git push        # publish the merge to master

To create an annotated release tag:

shell> cd /path/to/your/repository
shell> git tag -a WS500 -m 'release of WormBase WS500'
shell> git push origin WS500   # push the tag to origin
shell> git show WS500          # show full information about the annotated tag

Restart services

Starman and AceDB need to be restarted on remote nodes.

staging> ./steps/restart_starman.pl WSXXX      # restarts starman only
staging> ./steps/restart_services.pl WSXXX     # restarts starman + sgifaceserver
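
If a single node needs attention, the same graceful restart can be performed by hand (a sketch; the pid file location is an assumption):

staging> ssh node1 'kill -HUP $(cat /usr/local/wormbase/website/production/starman.pid)'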

Update symlinks on the FTP site

On the production FTP server, run:

wb-dev> ./steps/update_production_ftp_site_symlinks.pl --release WSXXX
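
The effect is to repoint release-level links under the FTP root, for example (a sketch; the link and directory names are assumptions):

wb-dev> cd /path/to/ftp/pub/wormbase
wb-dev> ln -sfn releases/WS240 current-production-release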

Manual steps

Manually:

  • Update release notes on wiki
  • Send announcement

Installing a new assemblies metadata file

Get the metadata file for a particular WormBase release:

./website/script/get_assemblies_metadata.sh *VERSION*

Example:

./website/script/get_assemblies_metadata.sh WS240

Add the metadata file to the repo:

cd website
git add metadata/ASSEMBLIES.*VERSION*.json
git commit metadata/ASSEMBLIES.*VERSION*.json
git push

Example:

cd website
git add metadata/ASSEMBLIES.WS240.json
git commit metadata/ASSEMBLIES.WS240.json
git push

Post a blog entry announcing the new release

Posting a blog entry on blog.wormbase.org will cross-feed to our social media outlets as well as add a news item to the front page.

Take the production site out of maintenance mode

TODO

Appendices

Appendix 1: Pushing Acedb independently

Push acedb into production via a tarball, a single unpacked acedb database directory, or all acedb/wormbase_* directories.

   To run it under cron syncing ALL acedb/wormbase_* directories:
      ./push_acedb.pl --method all_directories
   To push out a single release using a tarball:
      ./push_acedb.pl --method by_package --release WSXXX
   To push out a single release by rsyncing the directory:
      ./push_acedb.pl --method by_directory --release WSXXX
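
A matching crontab entry, modeled on the support-database entry in Appendix 2 (the path and schedule are assumptions):

0 4 12-31 * * /home/tharris/projects/wormbase/website-admin/update/production/steps/push_acedb.pl --method all_directories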

Appendix 2: Pushing support databases independently

TODO: Add FIFO to accelerate synchronization http://engineering.tumblr.com/post/7658008285/efficiently-copying-files-to-multiple-destinations

UPDATE: The *best* way to do this is probably to tar the directory (without gzipping it) and send it over the wire:

tar -cf - /path/to/dir | ssh remote_server 'tar -xvf - -C /absolute/path/to/remotedir'

Or rsync (-W transfers whole files, which is faster on a fast local network):

rsync -avW -e ssh /path/to/dir/ remote_server:/path/to/remotedir

The default -- and hard-coded -- behavior is to sync the local support databases directory to our ${NFS mount}/shared/databases. Other options include via a tarball, a single database directory, or all current support database directories.
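
In other words, roughly (a sketch; the local databases path is an assumption, and ${NFS_MOUNT} stands in for the "${NFS mount}" placeholder above):

rsync -av /usr/local/wormbase/databases/ ${NFS_MOUNT}/shared/databases/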

To run it under cron, starting after the build deadline date (here, the 12th of the month):

0 3 12-31 * * /home/tharris/projects/wormbase/website-admin/update/production/steps/push_support_databases.pl

To run it under cron syncing ALL database/WS* directories:

      ./push_support_databases.pl --method all_directories

To push out a single release using a tarball:

      ./push_support_databases.pl --method by_package --release WSXXX

To push out a single release by rsyncing the directory:

      ./push_support_databases.pl --method by_directory --release WSXXX

For additional information:

  ./push_support_databases.pl --help

Appendix 3: Pushing MySQL databases independently

Push mysql databases into production via a tarball or by syncing individual unpacked database directories.

   Preferred: push out a single release by rsyncing individual databases:
      ./push_mysql_databases.pl --method by_directory --release WSXXX
   To push out a single release using a tarball:
      ./push_mysql_databases.pl --method by_package --release WSXXX


Appendix 4: Minimizing JavaScript and CSS

On the development server, install UglifyJS:


# clone the repository
mkdir -p /usr/local/wormbase/services/uglifyJS
cd /usr/local/wormbase/services/uglifyJS
git clone git://github.com/mishoo/UglifyJS.git

# make the module available to Node
mkdir -p ~/.node_libraries/
cd ~/.node_libraries/
ln -s /usr/local/wormbase/services/uglifyJS/UglifyJS/uglify-js.js

# and if you want the CLI script too:
mkdir -p ~/bin
cd ~/bin
ln -s /usr/local/wormbase/services/uglifyJS/UglifyJS/bin/uglifyjs
  # (then add ~/bin to your $PATH if it's not there already)
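
A quick smoke test once uglifyjs is on your $PATH (the file name is a placeholder; minified output goes to stdout):

shell> uglifyjs site.js > site.min.js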