This repository contains the implementation of RankDiff, a novel approach for offline black-box optimization using conditional diffusion models with rank-based reweighting and dynamic noise scheduling.
- Dynamic Noise Scheduling: Adjusts noise rates based on function values
- Rank-Based Reweighting: Simplified loss weighting using normalized ranks
- Conditional Training: Classifier-free guidance for conditional generation
- Efficient Sampling: Heun solver for fast, accurate sampling
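To make the rank-based reweighting idea concrete, here is a minimal sketch (an illustration only — the actual weighting in `trainer.py` may differ): samples are ranked by their function values, and the normalized ranks serve as per-sample loss weights, making the weighting invariant to the scale of the objective.

```python
import numpy as np

def rank_weights(y):
    """Hypothetical sketch: map function values to loss weights via
    normalized ranks; higher-valued designs get weights closer to 1,
    independent of the scale of y."""
    y = np.asarray(y, dtype=float)
    ranks = np.argsort(np.argsort(y))   # 0 = worst sample, n-1 = best
    return ranks / max(len(y) - 1, 1)   # normalize ranks to [0, 1]

w = rank_weights([0.3, -1.2, 5.0, 0.9])
# best sample (5.0) -> weight 1.0, worst (-1.2) -> weight 0.0
```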
- Clone the repository:

```bash
git clone https://github.com/yourusername/RankDiff.git
cd RankDiff
```

- Install dependencies:

```bash
pip install -r requirements.txt
```

Project structure:

```
RankDiff/
├── configs/                  # Configuration files
│   ├── rankdiff.cfg          # Base configuration
│   └── experiments/          # Experiment configurations
│       └── dkitty.cfg        # DKitty task configuration
├── design_baselines/         # Main source code
│   └── rankdiff/             # RankDiff implementation
│       ├── __init__.py
│       ├── model.py          # Model definition
│       ├── trainer.py        # Trainer
│       ├── data.py           # Data processing
│       ├── eval.py           # Evaluation script
│       └── util.py           # Utility functions
├── run_experiment.py         # Experiment entry point
└── README.md
```
Use the run_experiment.py script to run an experiment:
```bash
python run_experiment.py \
    --config configs/experiments/dkitty.cfg \
    --output_dir experiments/dkitty \
    --seed 42 \
    --gpu 0 \
    --mode both
```

Parameters:
- --config: Path to the experiment configuration file
- --output_dir: Output directory
- --seed: Random seed
- --gpu: GPU ID
- --mode: Run mode; one of 'train', 'eval', or 'both'
Data loading and preprocessing:

```python
from design_baselines.rankdiff.data import load_dataset, preprocess_data

# Load data
data = load_dataset(data_path, task_name)

# Preprocess data
data, stats = preprocess_data(data, normalize=True)
```

Training:

```bash
python -m design_baselines.rankdiff.train \
    --config configs/rankdiff.cfg \
    --data_path path/to/your/dataset.pt \
    --output_dir path/to/output \
    --device cuda
```

Evaluation:

```bash
python -m design_baselines.rankdiff.eval \
    --config configs/rankdiff.cfg \
    --model_path path/to/model.pt \
    --data_path path/to/test_data.pt \
    --output_dir path/to/output \
    --device cuda
```

Model parameters:
- input_dim: Input dimension
- time_dim: Time embedding dimension
- condition_dim: Condition embedding dimension
- hidden_dim: Hidden layer dimension
- num_blocks: Number of Transformer blocks
Diffusion parameters:
- num_timesteps: Number of diffusion steps
- beta_start: Noise schedule start value
- beta_end: Noise schedule end value

Loss weights:
- alpha: Ranking loss weight
- beta_w: Target function weight
- gamma: Condition guidance weight

Training parameters:
- batch_size: Batch size
- learning_rate: Learning rate
- num_epochs: Number of training epochs
- optimizer: Optimizer type
- weight_decay: Weight decay
- scheduler: Learning rate scheduler
- min_lr: Minimum learning rate
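The beta_start and beta_end values above parameterize the noise schedule. As a sketch of the baseline behavior (assuming a standard linear schedule — RankDiff's dynamic scheduling additionally modulates noise by function value, which is not shown here):

```python
import numpy as np

def linear_beta_schedule(num_timesteps, beta_start, beta_end):
    """Baseline linear noise schedule between beta_start and beta_end."""
    return np.linspace(beta_start, beta_end, num_timesteps)

betas = linear_beta_schedule(1000, 1e-4, 0.02)
alphas_cumprod = np.cumprod(1.0 - betas)  # fraction of signal retained at step t
# alphas_cumprod decays monotonically toward 0 as t approaches num_timesteps
```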
Sampling parameters:
- num_samples: Number of samples
- guidance_scale: Condition guidance scale
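For intuition, guidance_scale typically enters classifier-free guidance as the weight combining the conditional and unconditional noise predictions (a sketch of the standard formulation, assuming the repository follows it; the actual combination lives in the sampling code):

```python
import numpy as np

def cfg_combine(eps_uncond, eps_cond, guidance_scale):
    """Classifier-free guidance: move the noise prediction along the
    conditional direction; 1.0 recovers the purely conditional prediction,
    larger values extrapolate past it."""
    return eps_uncond + guidance_scale * (eps_cond - eps_uncond)

e_u, e_c = np.zeros(3), np.ones(3)
out = cfg_combine(e_u, e_c, 2.0)  # extrapolates beyond the conditional prediction
```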
Data parameters:
- task_name: Task name
- normalize: Whether to normalize the data
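Pulling the parameters above together, a configs/rankdiff.cfg could look like the following. This is a hypothetical sketch: the key names mirror the lists above, but the section names and all values are illustrative only — consult the shipped configuration files for the real format.

```ini
[model]
input_dim = 56
hidden_dim = 512
time_dim = 128
condition_dim = 128
num_blocks = 4

[diffusion]
num_timesteps = 1000
beta_start = 0.0001
beta_end = 0.02

[loss]
alpha = 1.0
beta_w = 1.0
gamma = 0.1

[training]
batch_size = 128
learning_rate = 0.0003
num_epochs = 200
optimizer = adam
weight_decay = 0.0
scheduler = cosine
min_lr = 0.00001

[sampling]
num_samples = 128
guidance_scale = 2.0

[data]
task_name = dkitty
normalize = true
```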
Experiment results will be saved in the specified output directory, including:
- Preprocessed data
- Model checkpoints
- Evaluation results and visualizations
This project is licensed under the MIT License - see the LICENSE file for details.