Here we provide the source code of Surr-RLDE, which has been accepted at GECCO 2025.
The PDF version of the paper is available here. If you find Surr-RLDE useful, please cite it in your publications or projects.
```bibtex
@inproceedings{ma2025surrogate,
  title={Surrogate learning in meta-black-box optimization: A preliminary study},
  author={Ma, Zeyuan and Huang, Zhiyang and Chen, Jiacheng and Cao, Zhiguang and Gong, Yue-Jiao},
  booktitle={Proceedings of the Genetic and Evolutionary Computation Conference},
  pages={1137--1145},
  year={2025}
}
```

You can install all dependencies of Surr-RLDE via the command below.
```shell
pip install -r requirements.txt
```

The surrogate learning process can be activated via the command below.
```shell
python main.py --train_surrogate
```

The trained model will be saved to output/surrogate_model/.
The Surr-RLDE agent training process can be activated via the command below (shown as an example).
```shell
python main.py --run_experiment --problem bbob-surrogate
```

For more adjustable settings, please refer to main.py and config.py for details.
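The exact option handling lives in main.py and config.py. As an illustration only, here is a minimal argparse sketch of how flags such as --train_surrogate, --run_experiment, --problem, --agent_load_dir, --agent_for_cp, and --l_optimizer_for_cp could be wired together; everything beyond those flag names is an assumption, not the project's actual code:

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    # Hypothetical sketch; the real option definitions live in config.py.
    parser = argparse.ArgumentParser(description="Surr-RLDE entry point (illustrative)")
    parser.add_argument("--train_surrogate", action="store_true",
                        help="run the surrogate learning process")
    parser.add_argument("--run_experiment", action="store_true",
                        help="train the Surr-RLDE agent")
    parser.add_argument("--test", action="store_true",
                        help="run the test/comparison process")
    parser.add_argument("--problem", default="bbob-surrogate",
                        help="problem suite to optimize")
    parser.add_argument("--agent_load_dir", default="agent_model/test/",
                        help="directory to load trained agents from")
    # nargs="+" lets several agents/optimizers be compared in a single run
    parser.add_argument("--agent_for_cp", nargs="+", default=[])
    parser.add_argument("--l_optimizer_for_cp", nargs="+", default=[])
    return parser

args = build_parser().parse_args(["--run_experiment", "--problem", "bbob-surrogate"])
print(args.run_experiment, args.problem)
```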
Recording results: Log files will be saved to ./output/train/. Checkpoints will be saved to ./agent_model/train/. The file structure is as follows:
```
|--agent_model
   |--train
      |--Surr_RLDE_Agent
         |--runName
            |--checkpoint0.pkl
            |--checkpoint1.pkl
            |--...
|--output
   |--train
      |--Surr_RLDE_Agent
         |--runName
            |--log
            |--pic
```
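The checkpoints above are pickle files. The sketch below shows saving and reloading one with Python's standard pickle module; the checkpoint payload and the temporary directory here are assumptions for illustration — the real .pkl files hold the agent state produced by training:

```python
import pickle
import tempfile
from pathlib import Path

# Hypothetical payload; the real checkpoints store the trained agent's state.
checkpoint = {"step": 0, "policy_weights": [0.1, 0.2, 0.3]}

# Stands in for agent_model/train/Surr_RLDE_Agent/<runName>/
ckpt_dir = Path(tempfile.mkdtemp())
ckpt_path = ckpt_dir / "checkpoint0.pkl"

with ckpt_path.open("wb") as f:
    pickle.dump(checkpoint, f)

with ckpt_path.open("rb") as f:
    restored = pickle.load(f)

print(restored["step"], len(restored["policy_weights"]))
```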
The test process can be activated via the command below. The default agent load path is agent_model/test/.
```shell
python main.py --test --agent_load_dir YourAgentDir --agent_for_cp Surr_RLDE_Agent --l_optimizer_for_cp Surr_RLDE_Optimizer
```
You can compare Surr-RLDE with DEDQN, DEDDQN, and GLEET by adding them to the agent_for_cp and l_optimizer_for_cp lists:

```shell
python main.py --test --agent_load_dir YourAgentDir --agent_for_cp Surr_RLDE_Agent DEDQN_Agent --l_optimizer_for_cp Surr_RLDE_Optimizer DEDQN_optimizer
```