This updates a bunch of minor things in the docs and adds links to
other things.
This commit is contained in:
Will Kahn-Greene 2012-04-02 18:18:17 -04:00
Parent 3627f80604
Commit 9013c42ed7
4 changed files with 213 additions and 118 deletions


@@ -6,6 +6,19 @@ This document contains useful information about our coding conventions, and
things to watch out for, etc.
Coding conventions
==================
We follow most of the practices as detailed in the `Mozilla webdev bootcamp
guide <http://mozweb.readthedocs.org/en/latest/coding.html>`_.
Git conventions
===============
See :ref:`patching` for our Git usage conventions.
Tests
=====


@@ -36,6 +36,9 @@ following things (in addition to Git, of course).
* Several Python packages. See `Installing the Packages`_.
* Elastic Search and Sphinx Search. :ref:`search-chapter` covers
installation, configuration, and running.
Installation of these is very system-dependent. Using a package manager,
such as yum, aptitude, or brew, is encouraged.
@@ -120,6 +123,47 @@ For local development you will want to add the following settings::
    TEMPLATE_DEBUG = DEBUG
    SESSION_COOKIE_SECURE = False

    # Allows you to run Kitsune without running Celery---all tasks
    # will be done synchronously.
    CELERY_ALWAYS_EAGER = True

    # Allows you to specify waffle settings in the querystring.
    WAFFLE_OVERRIDE = True

    # Change this to True if you're going to be doing search-related
    # work.
    ES_LIVE_INDEXING = False

    # Basic cache configuration for development.
    CACHES = {
        'default': {
            'BACKEND': 'caching.backends.memcached.CacheClass',
            'LOCATION': 'localhost:11211'
        }
    }
    CACHE_BACKEND = 'caching.backends.memcached://localhost:11211'
    CACHE_MACHINE_USE_REDIS = True

    CACHE_MIDDLEWARE_ALIAS = 'default'
    CACHE_MIDDLEWARE_KEY_PREFIX = ''
    CACHE_MIDDLEWARE_SECONDS = 600
    CACHE_PREFIX = 'sumo:'

    # Basic database configuration for development.
    DATABASES = {
        'default': {
            'NAME': 'kitsune',
            'ENGINE': 'django.db.backends.mysql',
            'HOST': 'localhost',
            'USER': 'kitsune',
            'PASSWORD': 'password',
            'OPTIONS': {'init_command': 'SET storage_engine=InnoDB'},
            'TEST_CHARSET': 'utf8',
            'TEST_COLLATION': 'utf8_unicode_ci',
        },
    }
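These overrides take effect because of the usual Django ``settings_local.py``
pattern: local values are evaluated after the defaults, so they win. A minimal
sketch of that merge behavior, using illustrative stand-in values rather than
Kitsune's real settings module:

```python
# Default settings and local overrides merge with "local wins" --
# the same effect as Django reading settings_local.py last. The
# dicts below are illustrative stand-ins, not Kitsune's real values.
DEFAULTS = {
    "DEBUG": False,
    "CELERY_ALWAYS_EAGER": False,
    "ES_LIVE_INDEXING": False,
}

LOCAL = {
    "DEBUG": True,
    "CELERY_ALWAYS_EAGER": True,  # run tasks synchronously, no Celery
}

def effective_settings(defaults, local):
    """Overlay local development settings on the defaults."""
    merged = dict(defaults)
    merged.update(local)
    return merged
```

Anything not overridden (here, ``ES_LIVE_INDEXING``) keeps its default value.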
Redis
-----
@@ -137,6 +181,19 @@ it up differently, tweak the settings in ``settings_local.py``
accordingly, and run Redis using just the test configuration.
memcache
--------
.. Note::

    This should probably be somewhere else, but the easy way to flush
    your cache is something like this::

        echo "flush_all" | nc localhost 11211

    This assumes you have memcache configured to listen on port 11211.
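The same flush can be done from Python. A small sketch, assuming memcached is
listening on localhost:11211; ``flush_command`` only builds the protocol line,
so it works even without a running server:

```python
import socket

def flush_command(delay=None):
    """Build memcached's flush_all line. An optional delay (seconds)
    defers the flush, per the memcached text protocol."""
    if delay is None:
        return b"flush_all\r\n"
    return ("flush_all %d\r\n" % delay).encode("ascii")

def flush_memcache(host="localhost", port=11211, timeout=2.0):
    """Send flush_all and return the server's reply ("OK" on success).
    Requires a running memcached, like the nc one-liner above."""
    with socket.create_connection((host, port), timeout=timeout) as sock:
        sock.sendall(flush_command())
        return sock.recv(64).decode("ascii").strip()
```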
Database
--------
@@ -236,6 +293,15 @@ I (Will) put that in a script that creates the needed directories in
$REDISBIN $CONFFILE/redis-volatile.conf
Elastic Search and Sphinx Search
--------------------------------
:ref:`search-chapter` covers installation, configuration, and running
of both Elastic Search and Sphinx Search.
.. todo:: The installation side of these two should get moved here.
Testing it Out
==============
@@ -264,9 +330,8 @@ Running the test suite is easy::
For more information, see the :ref:`test documentation <tests-chapter>`.
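If you run subsets of the suite a lot, the invocation is easy to wrap in a
helper. A sketch; the flags mirror the search-test command used elsewhere in
these docs, and ``test_command`` is a hypothetical convenience, not part of
Kitsune:

```python
import subprocess  # subprocess.call(test_command("search")) would run it

TEST_FLAGS = ["-s", "--no-input", "--logging-clear-handlers"]

def test_command(app=None):
    """Build the argv for a Kitsune test run, optionally scoped to
    one app (e.g. "search")."""
    cmd = ["./manage.py", "test"] + TEST_FLAGS
    if app:
        cmd.append(app)
    return cmd
```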
.. Note::

    Some of us use `nose-progressive
    <https://github.com/erikrose/nose-progressive>`_ because it makes
    tests easier to run and debug.

Setting Up Search
=================
See the :ref:`search documentation <search-chapter>` for steps to get
Sphinx search working.


@@ -1,3 +1,5 @@
.. _patching:
================
Patching Kitsune
================


@@ -13,23 +13,27 @@ Both of these give us a number of advantages over MySQL's full-text
search or Google's site search.
* Much faster than MySQL.
* Reduces load on MySQL.
* We have total control over what results look like.
* We can adjust searches with non-visible content.
* We don't rely on Google reindexing the site.
* We can fine-tune the algorithm and scoring ourselves.
.. Note::

    We've deprecated the Sphinx-based search code and replaced it
    with our Elastic Search-based code.

    To run the unit tests, you still need both installed. (Note: That
    should get fixed.)

    To test search locally, you should test with Elastic Search.

    At some point in the near future, we will remove Sphinx search
    from the system altogether.
**To switch between Sphinx Search and Elastic Search**, there's a
waffle flag. In the admin, go to waffle, then turn the flag on or off.
@@ -37,108 +41,6 @@ search or Google's site search.
If it's off, then Sphinx is used.
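In code, that dispatch reduces to checking whether the flag is active for the
request. A minimal sketch; the actual flag name and view wiring in Kitsune are
not shown here, so treat the names below as illustrative:

```python
def pick_search_backend(flag_is_active):
    """Waffle-style dispatch: Elastic Search when the flag is on,
    Sphinx when it is off."""
    return "elastic" if flag_is_active else "sphinx"

# In a Django view this would be driven by something like
# waffle.flag_is_active(request, "<flag name>") -- name elided here.
```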
Installing Sphinx Search
========================
We currently require **Sphinx 0.9.9**. You may be able to install this from a
package manager like yum, aptitude, or brew.
If not, you can easily `download <http://sphinxsearch.com/downloads/>`_ the
source and compile it. Generally all you'll need to do is::
$ cd sphinx-0.9.9
$ ./configure --enable-id64 # Important! We need 64-bit document IDs.
$ make
$ sudo make install
This should install Sphinx in ``/usr/local/bin``. (You can change this by
setting the ``--prefix`` argument to ``configure``.)
To test that everything works, make sure that the ``SPHINX_INDEXER`` and
``SPHINX_SEARCHD`` settings point to the ``indexer`` and ``searchd`` binaries,
respectively. (Probably ``/usr/local/bin/indexer`` and
``/usr/local/bin/searchd``, unless you changed the prefix.) Then run the
Kitsune search tests::
$ ./manage.py test -s --no-input --logging-clear-handlers search
If the tests pass, everything is set up correctly!
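If they don't, a first thing to check is that the two settings actually point
at executables. A small sketch using the default paths mentioned above:

```python
import os

# Default install locations, per the text above; adjust if you set
# --prefix when running configure.
SPHINX_INDEXER = "/usr/local/bin/indexer"
SPHINX_SEARCHD = "/usr/local/bin/searchd"

def missing_binaries(paths):
    """Return the paths that are absent or not executable."""
    return [p for p in paths if not os.access(p, os.X_OK)]

# missing_binaries([SPHINX_INDEXER, SPHINX_SEARCHD]) should be empty
# on a correctly configured machine.
```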
Using Sphinx Search
===================
Having Sphinx installed will allow the search tests to run, which may be
enough. But if you want to work on or test the search app, you will
probably need to actually see search results!
The Easy, Sort of Wrong Way
---------------------------
The easiest way to start Sphinx for testing is to use some helpful management
commands for developers::
$ ./manage.py reindex
$ ./manage.py start_sphinx
You can also stop Sphinx::
$ ./manage.py stop_sphinx
If you need to update the search indexes, you can pass the ``--rotate`` flag to
``reindex`` to update them in-place::
$ ./manage.py reindex --rotate
While this method is very easy, you will need to reindex every time you
run the search tests, as they overwrite the data files Sphinx uses.
The Complicated but Safe Way
----------------------------
You can safely run multiple instances of ``searchd`` as long as they listen on
different ports, and store their data files in different locations.
The advantage of this method is that you won't need to reindex every time you
run the search tests. Otherwise, this should be identical to the easy method
above.
Start by copying ``configs/sphinx`` to a new directory, for example::
$ cp -r configs/sphinx ../
$ cd ../sphinx
Then create your own ``localsettings.py`` file::
$ cp localsettings.py-dist localsettings.py
$ vim localsettings.py
Fill in the settings so they match the values in ``settings_local.py``. Pick a
place on the file system for ``ROOT_PATH``.
Once you have tweaked all the settings so Sphinx will be able to talk to your
database and write to the directories, you can run the Sphinx binaries
directly (as long as they are on your ``$PATH``)::
$ indexer --all -c sphinx.conf
$ searchd -c sphinx.conf
You can reindex without restarting ``searchd`` by using the ``--rotate`` flag
for ``indexer``::
$ indexer --all --rotate -c sphinx.conf
You can also stop ``searchd``::
$ searchd --stop -c sphinx.conf
This method not only lets you maintain a running Sphinx instance that doesn't
get wiped out by the tests, but also lets you see some very interesting output
from Sphinx about indexing rate and statistics.
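The separation the safe way relies on, a unique port and data path per
``searchd`` instance, can be sketched as follows. The base port and directory
layout here are assumptions for illustration, not Kitsune settings:

```python
def sphinx_instance(name, index, base_port=3312, data_root="/tmp/sphinx"):
    """Describe one searchd instance with its own port and data dir,
    so several instances can run side by side without colliding."""
    return {
        "name": name,
        "port": base_port + index,
        "data_dir": "%s/%s" % (data_root, name),
        "pid_file": "%s/%s/searchd.pid" % (data_root, name),
    }

# One long-lived dev instance and one the tests may clobber.
dev = sphinx_instance("dev", 0)
tests = sphinx_instance("tests", 1)
```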
Installing Elastic Search
=========================
@@ -146,6 +48,8 @@ There's an installation guide on the Elastic Search site.
http://www.elasticsearch.org/guide/reference/setup/installation.html
We're currently using 0.17.something in production.
The directory you install Elastic in will hereafter be referred to as
``ELASTICDIR``.
@@ -243,7 +147,6 @@ Other things you can change:
The index names in both ``ES_INDEXES`` and ``ES_WRITE_INDEXES``
**must** start with this prefix.
``ES_LIVE_INDEXING``
Defaults to False.
@@ -286,6 +189,16 @@ Other things you can change:
If you're having problems with indexing operations timing out,
raising this number can sometimes help. Try 60.
``ES_DUMP_CURL``
This defaults to None.
This is super handy for debugging our Elastic Search code, but not
useful for much else. See the `elasticutils
documentation
<http://elasticutils.readthedocs.org/en/latest/debugging.html#es-dump-curl>`_.
Using Elastic Search
====================
@@ -477,8 +390,8 @@ Troubleshooting category, then we add a filter where the result has to
be in the Troubleshooting category.
Link to the Elastic Search code
-------------------------------
Here's a link to the search view in the master branch. This is what's
on dev:
@@ -490,3 +403,105 @@ Here's a link to the search view in the next branch. This is what's
on staging:
https://github.com/mozilla/kitsune/blob/next/apps/search/views.py