Software Life Cycle: 4. Updating The Production Servers
 

Overview

This document describes the fourth and final stage of the WormBase software release cycle, moving new databases and software into production. New releases of the web app and associated databases are staged on the development server.

The scripts that mediate a production release are maintained in the website-admin git repository.

Steps

Check disk space on local and remote nodes
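
No helper script is listed for this step; a minimal sketch of the kind of check involved, assuming hypothetical node names node1 and node2:

  # Report free space on the WormBase partition, locally and on each remote node.
  # node1 and node2 are placeholders for the actual production hostnames.
  df -h /usr/local/wormbase
  for node in node1 node2; do
      ssh $node 'df -h /usr/local/wormbase'
  done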

Purge a specific release from the production nodes. This does not need to be run every release, but it is useful for periodically clearing out old builds.

Remove build WSXXX, including acedb, mysql, and support DBs

staging> admin/update/production/purge_old_releases.sh WSXXX

Remove build WSXXX from localhost (i.e. the dev server), including acedb, mysql, and support DBs

staging> admin/update/production/purge_old_releases.sh WSXXX local

Put the production site into maintenance mode

TODO

Push databases to production nodes

 staging> ./steps/push_staged_release_to_nodes.pl --release $RELEASE --target production

Check out and test software on the staging host.

Deploy the web app

Run deploy_webapp.pl from any WormBase host to push a new software version into production. This script:

  1. gets the current major version of the webapp from lib/WormBase/Web.pm
  2. gets the number of commits for the source in wb-dev:website/staging since the last tag
  3. creates a VERSION.txt in the staging directory
  4. backs up the production/ directory on each node to archive/VERSION_STRING for easy rollbacks (see the rollback sketch below)
  5. visits each staging or production host and pulls the specified branch
  6. sends a HUP to starman on remote nodes, forcing a graceful restart of all workers
  7. creates a software release on the FTP site and symlinks current.tgz to it
  8. copies the staged code to wb-dev:/usr/local/wormbase/website/production.current for easy reference
  9. minimizes JavaScript and CSS via UglifyJS

  staging> website-admin/update/production/steps/deploy_webapp.sh --release WSXXX --target production|staging
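
Because step 4 archives the previous production/ tree on each node, a rollback is essentially a copy back into place followed by a restart (see Restart services below). A minimal sketch, with VERSION_STRING and the website root as placeholder paths:

  # Restore an archived build over the live production/ directory.
  rsync -a /usr/local/wormbase/website/archive/VERSION_STRING/ /usr/local/wormbase/website/production/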

Every push to production is

  • tied to a version of the database
  • tied to a major version of the webapp
  • tied to a revision corresponding to the number of commits since the last tag

          Date            # of commits since WS221 tag
          |               |
    WS221-2010.12.24-v0.02r0
    |                |
    Acedb version    Major software version
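
A sketch of how such a version string can be assembled with standard git commands; how the major version is extracted from Web.pm is an assumption:

  # Most recent tag on this branch, e.g. WS221
  TAG=$(git describe --tags --abbrev=0)
  # Number of commits since that tag, e.g. 0
  REVISION=$(git rev-list --count ${TAG}..HEAD)
  # Major webapp version; the variable layout inside Web.pm is assumed here
  MAJOR=$(perl -ne 'print $1 if /\$VERSION\s*=\s*\W([\d.]+)/' lib/WormBase/Web.pm)
  echo "${TAG}-$(date +%Y.%m.%d)-v${MAJOR}r${REVISION}" > VERSION.txt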

Working with git branches

We develop code for each release on an independent branch named after the release (e.g. WS230) that remotely tracks origin/master. This means that fetch and pull will come from the remote master, not the local master branch. We use branch names to pull code onto our staging and production servers.

Branches are created during the staging process via:

shell> cd /path/to/your/repository
shell> git branch --track WSXXX origin/master
shell> git checkout WSXXX
shell> git push origin WSXXX

To work on a branch:

shell> cd /path/to/your/repository
shell> git pull
shell> git checkout WSXXX
shell> git branch   # to see which branch you are working on

Periodically, we should merge branch development back to master. To do so:

shell> cd /path/to/your/repository
shell> git checkout master
shell> git merge WSXXX

Update symlinks on the FTP site

On the production FTP server, run:

  wb-dev> ./steps/update_production_ftp_site_symlinks.pl --release WSXXX

Update symlinks on both production nodes

We use symlinks to point to the most current versions of acedb and mysql databases in both production and on the FTP site. These need to be adjusted prior to going live.

staging> ./steps/adjust_symlinks.pl --release WSXXX --target production
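
A sketch of the kind of symlink switch adjust_symlinks.pl performs; these paths are placeholders, based loosely on directory names used elsewhere in this document:

  # Flip the "live" symlinks to the newly staged WSXXX databases.
  ln -sfn /usr/local/wormbase/acedb/wormbase_WSXXX /usr/local/wormbase/acedb/wormbase
  ln -sfn /usr/local/wormbase/databases/WSXXX /usr/local/wormbase/databases/current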

Restart services

Starman and AceDB need to be restarted on remote nodes.

 wb-dev> ./steps/restart_starman.pl WSXXX    (this only restarts starman)
 wb-dev> ./steps/restart_services.pl WSXXX
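
The graceful restart amounts to sending starman's master process a HUP, which respawns workers without dropping connections. A minimal sketch, with hypothetical hostnames and PID file path:

  # node1 and node2, and the PID file location, are placeholders.
  for node in node1 node2; do
      ssh $node 'kill -HUP $(cat /usr/local/wormbase/website/production/starman.pid)'
  done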

Manual steps

Manually:

  • Update release notes on wiki
  • Send announcement

Post a blog entry announcing the new release

Posting a blog entry on blog.wormbase.org will cross-feed to our social media outlets as well as add a news item to the front page.

Take the production site out of maintenance mode

TODO

To update the software ONLY:

 ./steps/deploy_webapp.pl --release WSXXX

Appendices

Appendix 1: Pushing Acedb independently

Push acedb into production via a tarball, a single unpacked acedb database directory, or all acedb/wormbase* directories.

   To run it under cron syncing ALL acedb/wormbase_* directories:
      ./push_acedb.pl --method all_directories
   To push out a single release using a tarball:
      ./push_acedb.pl --method by_package --release WSXXX
   To push out a single release by rsyncing the directory:
      ./push_acedb.pl --method by_directory --release WSXXX

Appendix 2: Pushing support databases independently

TODO: Add FIFO to accelerate synchronization http://engineering.tumblr.com/post/7658008285/efficiently-copying-files-to-multiple-destinations

UPDATE: The *best* way to do this is probably to tar the directory (not gzip it) and send it over the wire:

tar -c /path/to/dir | ssh remote_server 'tar -xvf - -C /absolute/path/to/remotedir'

Or Rsync:

rsync -avW -e ssh /path/to/dir/ remote_server:/path/to/remotedir
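
The FIFO approach from the TODO above can be approximated with bash process substitution, which streams a single tar read to several destinations at once (hostnames and paths are placeholders):

  # Copy one directory to two nodes simultaneously over ssh.
  tar -c /path/to/dir \
    | tee >(ssh node1 'tar -xf - -C /path/to/remotedir') \
    | ssh node2 'tar -xf - -C /path/to/remotedir'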

The default -- and hard-coded -- behavior is to sync the local support databases directory to our ${NFS mount}/shared/databases. Other options include via a tarball, a single database directory, or all current support database directories.

Start the cron job after the build deadline date. The entry below runs the sync at 03:00 on days 12 through 31 of each month:

 0 3 12-31 * * /home/tharris/projects/wormbase/website-admin/update/production/steps/push_support_databases.pl

To run it under cron syncing ALL database/WS* directories:

      ./push_support_databases.pl --method all_directories

To push out a single release using a tarball:

      ./push_support_databases.pl --method by_package --release WSXXX

To push out a single release by rsyncing the directory:

      ./push_support_databases.pl --method by_directory --release WSXXX

For additional information:

  ./push_support_databases.pl --help

Appendix 3: Pushing MySQL databases independently

Push mysql databases into production via a tarball or by syncing individual unpacked database directories.

   Preferred: push out a single release by rsyncing individual databases:
      ./push_mysql_databases.pl --method by_directory --release WSXXX
   To push out a single release using a tarball:
      ./push_mysql_databases.pl --method by_package --release WSXXX


Appendix 4: Minimizing JavaScript and CSS

On the development server, install UglifyJS:


# clone the repository
mkdir -p /usr/local/wormbase/services/uglifyJS
cd /usr/local/wormbase/services/uglifyJS
git clone git://github.com/mishoo/UglifyJS.git

# make the module available to Node
mkdir -p ~/.node_libraries/
cd ~/.node_libraries/
ln -s /usr/local/wormbase/services/uglifyJS/UglifyJS/uglify-js.js

# and if you want the CLI script too:
mkdir -p ~/bin
cd ~/bin
ln -s /usr/local/wormbase/services/uglifyJS/UglifyJS/bin/uglifyjs
# then add ~/bin to your $PATH if it's not there already
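
Once installed, minification is a one-liner per asset; uglifyjs of this vintage writes the minified source to stdout (the file name below is illustrative):

  uglifyjs site.js > site.min.js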