Commit ab90b2d

Merge pull request #24 from VEuPathDB/commit-suggester-index

Add script to commit suggester index

2 parents 69dd4c9 + a563e0e

2 files changed: 120 additions & 0 deletions

Model/bin/ssCommitSuggesterIndex (17 additions & 0 deletions)
[source, bash]
----
#!/bin/bash

SOLR_URL=$1

if [ -z "$SOLR_URL" ]; then
  echo ""
  echo "Usage:"
  echo "  ./ssCommitSuggesterIndex https://some.url.for.solr/solr/site_search"
  echo ""
  echo "This script commits the Solr Suggester index used for the site-search"
  echo "typeahead. This script should be executed as the last step of building"
  echo "the site-search core."
  echo ""
  exit 1
fi

curl "$SOLR_URL/suggest?suggest.build=true"
----
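The script's empty-argument guard can be exercised in isolation. This is a sketch: the function name `check_url` is illustrative and not part of the script, and the outputs are simplified to single words.

```shell
# Mirror the script's empty-argument guard in a standalone function.
check_url() {
  if [ -z "$1" ]; then
    echo "usage"
    return 1
  fi
  echo "ok"
}

check_url                                          # prints "usage"
check_url http://localhost:8983/solr/site_search   # prints "ok"
```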

local-loading-notes.adoc (103 additions & 0 deletions)
= Local SOLR Loading Notes
:source-highlighter: highlight.js

Notes and instructions for running and loading a Solr instance from your local
machine.

== GUS_HOME Setup

If you don't have a GUS_HOME set up already, a minimal setup is described here:

. Create a directory for your GUS_HOME.
+
[source, bash]
----
mkdir -p ~/gus_home/lib/python/SiteSearchData/Model
----
. Copy the lib file from `Model/lib/python` into the GUS_HOME path we just
created.
+
[source, bash]
----
cp -t ~/gus_home/lib/python/SiteSearchData/Model Model/lib/python/BatchReportUtils.py
----

You should now be able to run the Solr loading scripts.
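As a quick sanity check of the layout above, the minimal GUS_HOME tree can be recreated and verified. This is a sketch: the scratch directory and the `touch` stand-in for the real copy are illustrative only.

```shell
# Recreate the minimal GUS_HOME layout in a scratch directory.
GUS_HOME="$(mktemp -d)/gus_home"
mkdir -p "$GUS_HOME/lib/python/SiteSearchData/Model"

# Stand-in for copying BatchReportUtils.py from Model/lib/python.
touch "$GUS_HOME/lib/python/SiteSearchData/Model/BatchReportUtils.py"

# The loading scripts expect to find the module under this path.
ls "$GUS_HOME/lib/python/SiteSearchData/Model"
```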
== SOLR Setup

. `make build`
. `make run`
. `docker exec -it <container-name> bash`
+
[source, bash]
----
mkdir -p ~/site_search/conf
cp -rt ~/site_search/conf/ /opt/solr/server/solr/configsets/site-search/conf/*
----
. From http://localhost:8983/ go to Core Admin.
. Use the following configuration options:
+
[cols=2]
|===
h| name | `site_search`
h| instanceDir | `/home/solr/site_search/`
h| dataDir | `/home/solr`
h| config | `/home/solr/site_search/conf/solrconfig.xml`
h| schema | `/home/solr/site_search/conf/schema.xml`
|===
. Press "Add Core".
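The same core creation can also be done without the admin UI via Solr's CoreAdmin API. This is a sketch assuming the paths from the table above; the commented `curl` only works against a running local Solr.

```shell
# Build a CoreAdmin CREATE call matching the UI settings above.
SOLR=http://localhost:8983/solr
CREATE_URL="$SOLR/admin/cores?action=CREATE&name=site_search"
CREATE_URL="$CREATE_URL&instanceDir=/home/solr/site_search/&dataDir=/home/solr"
CREATE_URL="$CREATE_URL&config=/home/solr/site_search/conf/solrconfig.xml"
CREATE_URL="$CREATE_URL&schema=/home/solr/site_search/conf/schema.xml"

echo "$CREATE_URL"
# curl "$CREATE_URL"   # requires a running local Solr instance
```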
== Loading

Download the target files from yew, from the build and project directory under
the root `/eupath/data/EuPathDB/siteSearchDataDumps/`, into a local directory.
The following examples use build-65 and ToxoDB.
=== Using SFTP

. Create a local directory to contain the batches for the target project, then
`cd` into that directory.
+
[source, bash]
----
mkdir ~/ToxoDB
cd ~/ToxoDB
----
. Open an SFTP connection to yew.
+
[source, bash]
----
sftp <connection info for yew>
----
. Run the following SFTP commands:
+
[source, bash]
----
cd /eupath/data/EuPathDB/siteSearchDataDumps/bld65/ToxoDB
get -R .
exit
----
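The remote path is just the dump root plus a build number and project name. A small helper makes that composition explicit; `dump_path` is hypothetical and not part of the repo.

```shell
# Hypothetical helper: compose the dump directory for a build/project pair.
dump_path() {
  echo "/eupath/data/EuPathDB/siteSearchDataDumps/bld$1/$2"
}

# Example: the path used in the SFTP session above.
dump_path 65 ToxoDB
```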
At this point you should have a mirror of the ToxoDB batches in your local
ToxoDB directory. From here you can run the target `Model/bin` loading
script(s) to populate the `site_search` Solr core in your local instance.

For this example we will use the multi-batch loading script for our downloaded
files:

[source, bash]
----
# Go to the directory with the bin scripts
cd /path/to/SiteSearchData/Model/bin

# Export the necessary env vars
export GUS_HOME=/path/to/local/gus_home
export PATH=$PATH:$PWD

# Load the batches into Solr
./ssLoadMultipleBatches http://localhost:8983/solr/site_search ~/ToxoDB

# Commit the typeahead index (the script requires the Solr URL argument)
./ssCommitSuggesterIndex http://localhost:8983/solr/site_search
----
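After loading, the core can be spot-checked with a couple of queries. This is a sketch of the URLs only; the suggester parameters assume the handler's default dictionary, and the commented `curl` calls require a running local Solr.

```shell
# URLs for a quick post-load sanity check.
CORE=http://localhost:8983/solr/site_search

# Total document count: inspect numFound in the JSON response.
COUNT_URL="$CORE/select?q=*:*&rows=0"

# Exercise the typeahead suggester built by ssCommitSuggesterIndex.
SUGGEST_URL="$CORE/suggest?suggest=true&suggest.q=gene"

echo "$COUNT_URL"
echo "$SUGGEST_URL"
# curl "$COUNT_URL"
# curl "$SUGGEST_URL"
```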
