A.K.A. The Sqlitening!
This removes bedrock's dependence on a database server and moves to
downloading pre-built SQLite database files from S3 on a schedule. There
is also a clock process that updates and uploads such a database on a
schedule as well. This should mean more stability, speed, and
reliability for bedrock, as well as quicker development thanks to
easy-to-download pre-populated databases.
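The download side of this scheme can be sketched roughly as below: fetch the pre-built file, skip the swap if nothing changed, and replace the local database atomically so readers never see a half-written file. The URL, paths, and function name here are illustrative, not bedrock's actual configuration.

```python
import hashlib
import os
import tempfile
import urllib.request


def refresh_db(url, dest_path):
    """Download a pre-built sqlite file and atomically swap it into place.

    Returns True if a new file was installed, False if it was unchanged.
    `url` and `dest_path` are hypothetical; the real bucket and paths
    live in bedrock's settings.
    """
    with urllib.request.urlopen(url) as resp:
        data = resp.read()

    # Skip the swap when the content is identical to what we already have.
    new_hash = hashlib.sha256(data).hexdigest()
    if os.path.exists(dest_path):
        with open(dest_path, "rb") as f:
            if hashlib.sha256(f.read()).hexdigest() == new_hash:
                return False

    # Write to a temp file in the same directory, then rename: rename is
    # atomic on POSIX, so a reader never sees a partially written database.
    fd, tmp = tempfile.mkstemp(dir=os.path.dirname(dest_path) or ".")
    with os.fdopen(fd, "wb") as f:
        f.write(data)
    os.replace(tmp, dest_path)
    return True
```

In a real deployment a checksum or ETag published alongside the S3 object would avoid downloading the whole file just to discover it is unchanged; the hash-after-download shown here keeps the sketch self-contained.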
Store the last successfully imported git hash in the
database. As long as we only update that hash upon
successful completion of an import, the import will be
retried next time. Also switch the product-details and
security-advisory import commands to simply update all
files when a git update is detected.
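The retry-on-failure property falls out of the ordering: the hash is written only after the import succeeds, so a crash leaves the old hash in place and the next run tries again. A minimal sketch of that pattern, with an illustrative table name rather than bedrock's real schema:

```python
import sqlite3


def get_last_hash(conn, repo):
    """Return the last successfully imported commit hash, or None."""
    row = conn.execute(
        "SELECT commit_hash FROM git_state WHERE repo = ?", (repo,)
    ).fetchone()
    return row[0] if row else None


def run_import(conn, repo, new_hash, import_files):
    """Run `import_files` and record `new_hash` only on success.

    If `import_files` raises, the hash is never written, so the same
    import is attempted again on the next scheduled run.
    """
    conn.execute(
        "CREATE TABLE IF NOT EXISTS git_state"
        " (repo TEXT PRIMARY KEY, commit_hash TEXT)"
    )
    if get_last_hash(conn, repo) == new_hash:
        return False  # nothing new in git; skip the import entirely

    import_files()  # may raise; the upsert below then never runs

    conn.execute(
        "INSERT INTO git_state (repo, commit_hash) VALUES (?, ?) "
        "ON CONFLICT(repo) DO UPDATE SET commit_hash = excluded.commit_hash",
        (repo, new_hash),
    )
    conn.commit()
    return True
```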
This was done before using a different process type in the
Procfile, but only the "web" and "cmd" process types receive
HTTP traffic, so that wouldn't work. This new direction
should work for any instance on which we'd like to use
supervisord.
Also reunified the cron.py file, using command-line arguments
to determine which jobs are scheduled.
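The argument handling could look something like the sketch below: one cron.py, with the job groups to schedule chosen per invocation. The job names and registry are hypothetical stand-ins for bedrock's real tasks.

```python
import argparse

# Hypothetical job registry; the names stand in for bedrock's real
# scheduled tasks (l10n updates, DB updates, etc.).
JOBS = {
    "l10n": lambda: print("updating l10n files"),
    "db": lambda: print("updating database"),
}


def parse_jobs(argv):
    """Return the job-group names this invocation should schedule."""
    parser = argparse.ArgumentParser(description="schedule cron jobs")
    parser.add_argument(
        "jobs",
        nargs="+",
        choices=sorted(JOBS),
        help="which job groups to run on a schedule",
    )
    return parser.parse_args(argv).jobs
```

A container that only needs l10n updates would then run `cron.py l10n`, while the clock container runs `cron.py db l10n` (or whichever groups it owns).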
This will allow us to run supervisord in www-dev to run
the site alongside a process that keeps l10n updated. It
lets us run this without also running the other cron tasks
that update the DB in every container, since we use a
separate clock container for that. Demo instances will
also run the DB update process via supervisord.
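A supervisord layout for such an instance might look like the fragment below. Program names and commands are illustrative; a demo instance would add a program for the DB update process alongside these.

```ini
; Hypothetical supervisord config for a www-dev instance: one program
; serves the site, another keeps l10n files updated.
[supervisord]
nodaemon=true

[program:web]
command=gunicorn wsgi.app
autorestart=true

[program:l10n-updater]
command=python cron.py l10n
autorestart=true
```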
Currently it imports the whole Django setup and the commands
and runs those functions directly, but we've run into issues
because some of these commands don't expect the process to
persist. This change makes the cron script simply call the
same commands in a subshell.