= Local SOLR Loading Notes
:source-highlighter: highlight.js

Notes and instructions for running and loading a Solr instance from your local
machine.

== GUS_HOME Setup

If you don't have a GUS_HOME set up already, a minimal setup is described here:

. Create a directory for your GUS_HOME.
+
[source, bash]
----
mkdir -p ~/gus_home/lib/python/SiteSearchData/Model
----
. Copy the lib file from `Model/lib/python` into the GUS_HOME path we just
  created.
+
[source, bash]
----
cp -t ~/gus_home/lib/python/SiteSearchData/Model Model/lib/python/BatchReportUtils.py
----

You should now be able to run the Solr loading scripts.
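
The two steps above can also be run as one block. `SITE_SEARCH_DATA` is an
assumed variable pointing at your SiteSearchData checkout; adjust it to your
own path:

[source, bash]
----
# Assumption: SITE_SEARCH_DATA points at your SiteSearchData checkout
SITE_SEARCH_DATA=${SITE_SEARCH_DATA:-~/SiteSearchData}

# Create the minimal GUS_HOME layout and copy the lib file into it
export GUS_HOME=~/gus_home
mkdir -p "$GUS_HOME/lib/python/SiteSearchData/Model"
cp "$SITE_SEARCH_DATA/Model/lib/python/BatchReportUtils.py" \
   "$GUS_HOME/lib/python/SiteSearchData/Model/"
----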

== SOLR Setup

. `make build`
. `make run`
. `docker exec -it <container-name> bash`, then run the following inside the
  container:
+
[source, bash]
----
mkdir -p ~/site_search/conf
cp -rt ~/site_search/conf/ /opt/solr/server/solr/configsets/site-search/conf/*
----
. From http://localhost:8983/, go to Core Admin
. Use the following configuration options:
+
[cols=2]
|===
h| name | `site_search`
h| instanceDir | `/home/solr/site_search/`
h| dataDir | `/home/solr`
h| config | `/home/solr/site_search/conf/solrconfig.xml`
h| schema | `/home/solr/site_search/conf/schema.xml`
|===
. Press "Add Core"
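
As an alternative to the admin UI, the same core can be created with Solr's
CoreAdmin API. This is a sketch assuming port 8983 is published from the
container; `config` and `schema` are given relative to the core's `conf/`
directory:

[source, bash]
----
# Create the site_search core via the CoreAdmin API (equivalent to pressing
# "Add Core" in the UI)
curl "http://localhost:8983/solr/admin/cores?action=CREATE&name=site_search&instanceDir=/home/solr/site_search&dataDir=/home/solr&config=solrconfig.xml&schema=schema.xml"

# Confirm the core was created
curl "http://localhost:8983/solr/admin/cores?action=STATUS&core=site_search"
----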

== Loading

Download the target files from yew into a local directory; on yew they live
under a build and project directory beneath the root
`/eupath/data/EuPathDB/siteSearchDataDumps/`. For the following examples we
will use build-65 and ToxoDB.

=== Using SFTP

. Create a local directory to contain the batches for the target project, then
  `cd` into that directory.
+
[source, bash]
----
mkdir ~/ToxoDB
cd ~/ToxoDB
----
. Open an SFTP connection to yew.
+
[source, bash]
----
sftp <connection info for yew>
----
. Run the following SFTP commands:
+
[source, bash]
----
cd /eupath/data/EuPathDB/siteSearchDataDumps/bld65/ToxoDB
get -R .
exit
----
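
If direct `scp` access to yew is available (an assumption; the SFTP route
above is the documented path), the same mirror can be pulled in a single
command:

[source, bash]
----
# Recursively copy the build-65 ToxoDB batches from yew into ~/ToxoDB;
# <user>@<yew-host> stands in for your own connection info
scp -r "<user>@<yew-host>:/eupath/data/EuPathDB/siteSearchDataDumps/bld65/ToxoDB" ~/
----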

At this point you should have a mirror of the ToxoDB batches in your local
ToxoDB directory. From here you can run the target `Model/bin` loading
script(s) to populate the `site_search` Solr core in your local instance.

For this example we will use the multi-batch loading script for our downloaded
files:

[source, bash]
----
# Go to the directory with the bin scripts
cd /path/to/SiteSearchData/Model/bin

# Export the necessary env vars
export GUS_HOME=/path/to/local/gus_home
export PATH=$PATH:$PWD

# Load the batches into Solr
./ssLoadMultipleBatches http://localhost:8983/solr/site_search ~/ToxoDB

# Commit the typeahead index
./ssCommitSuggesterIndex
----
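
Once both scripts finish, a quick sanity check (assuming the core name and
port from the setup above) is to ask Solr how many documents the core now
holds:

[source, bash]
----
# numFound in the JSON response should be non-zero after a successful load
curl "http://localhost:8983/solr/site_search/select?q=*:*&rows=0"
----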