
* **Kernel-based**: K-Nearest-Neighbor (KNN)
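
For reference, a minimal sketch of fitting a KNN classifier, assuming scikit-learn as the modeling library (the repo's own model wrappers may differ):

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# `n_neighbors` is the kind of hyperparameter the searches below tune.
knn = KNeighborsClassifier(n_neighbors=5)
knn.fit(X_train, y_train)
print(f"test accuracy: {knn.score(X_test, y_test):.3f}")
```
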
## Metaheuristic Guided Search
### Baseline
* **Random Search** with a fixed and reproducible base seed across runs
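
A minimal sketch of how a fixed base seed keeps the baseline reproducible across runs (the seeding scheme here is an assumption, not necessarily the repo's exact implementation):

```python
import random

BASE_SEED = 42  # fixed base seed (illustrative value)

def run_random_search(run_id: int, n_trials: int = 10):
    # Derive the per-run seed deterministically from the base seed, so
    # re-running the same run_id reproduces the same trial sequence.
    rng = random.Random(BASE_SEED + run_id)
    for trial in range(n_trials):
        params = {
            "lr": 10 ** rng.uniform(-4, -1),        # log-uniform learning rate
            "batch_size": rng.choice([16, 32, 64]),
        }
        # ... train and evaluate a model with `params` here ...
        print(run_id, trial, params)

run_random_search(run_id=0)
```
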
### Other Algorithms
* **Evolutionary Algorithm**:
  * Memetic Algorithm (a specialized Genetic Algorithm that escapes local-search plateaus with one extra tournament selection governed by a `radius` parameter; see the sketch after this list)
  * Standard Genetic Algorithm
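
A minimal sketch of the radius-based tournament selection as we read the description above; the operator's exact semantics in the repo may differ, and all names here are illustrative:

```python
import math
import random

rng = random.Random(0)

def radius_tournament(population, fitnesses, anchor, radius, k=3):
    # Restrict the extra tournament to individuals within `radius` of `anchor`.
    # Keeping selection pressure local preserves diversity elsewhere in the
    # population, which is what lets the search climb off a plateau.
    neighbors = [i for i, ind in enumerate(population)
                 if math.dist(ind, anchor) <= radius]
    if not neighbors:
        neighbors = list(range(len(population)))  # fall back to a global tournament
    contestants = rng.sample(neighbors, min(k, len(neighbors)))
    return population[max(contestants, key=lambda i: fitnesses[i])]

# Toy usage with 2-D real-valued individuals.
pop = [(rng.uniform(0, 1), rng.uniform(0, 1)) for _ in range(20)]
fit = [-(x - 0.5) ** 2 - (y - 0.5) ** 2 for x, y in pop]
winner = radius_tournament(pop, fit, anchor=pop[0], radius=0.3)
```
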
### Model Training with a Customized Tuning Process
* A proof-of-concept, end-to-end quick demo is shown in the Jupyter notebook `notebooks/rs_training.ipynb`. It focuses on using a random solver (our baseline of choice) to search for the best set of hyperparameters, built on an updated version of the class and function interfaces of our models.
### Download Training Results
For the time being, we've prepared a zip archive named `experiments.bak.zip`. Copy each subfolder inside `experiment` into your git-ignored `.cache/experiment` directory, keeping the `<model>-<optimizer name>` folder-name format, with all the run folders captured by TensorBoard inside. You also need to copy the entire `final_training` folder directly under `.cache/`. Then you can run `analyze_experiment.py` or `experiment_analysis.ipynb` to visualize the results of different searches on different models.
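
A minimal sketch of that staging step, assuming the archive unpacks to `experiment/` and `final_training/` at its top level (verify the actual layout after extracting):

```python
import shutil
import zipfile
from pathlib import Path

# Unpack the prepared results archive into a staging directory.
staging = Path("experiments.bak")
with zipfile.ZipFile("experiments.bak.zip") as zf:
    zf.extractall(staging)

cache = Path(".cache")
# Each subfolder is named `<model>-<optimizer name>` and holds the
# TensorBoard run folders for that search.
for sub in (staging / "experiment").iterdir():
    if sub.is_dir():
        shutil.copytree(sub, cache / "experiment" / sub.name, dirs_exist_ok=True)

# `final_training` goes directly under `.cache/`.
shutil.copytree(staging / "final_training", cache / "final_training",
                dirs_exist_ok=True)
```
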
### Run a Search Quickly
* You can run a quick hyperparameter search with a short standalone script.
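
A minimal, hypothetical sketch of such a script, assuming scikit-learn models and the random-search baseline described above (the repo's actual script may differ in names and options):

```python
import random

from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = random.Random(42)  # fixed base seed for reproducibility
X, y = load_iris(return_X_y=True)

best_score, best_params = float("-inf"), None
for _ in range(20):
    # Sample a candidate KNN configuration at random.
    params = {
        "n_neighbors": rng.randint(1, 15),
        "weights": rng.choice(["uniform", "distance"]),
    }
    score = cross_val_score(KNeighborsClassifier(**params), X, y, cv=3).mean()
    if score > best_score:
        best_score, best_params = score, params

print(f"best params: {best_params}, CV accuracy: {best_score:.3f}")
```
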