Joint Accelerator Conferences Website

The Joint Accelerator Conferences Website (JACoW) is an international collaboration that publishes the proceedings of accelerator conferences held around the world.


BiBTeX citation export for THPAB197: Enhancing Efficiency of Multi-Objective Neural-Network-Assisted Nonlinear Dynamics Lattice Optimization via 1-D Aperture Objectives & Objective Focusing

@inproceedings{hidaka:ipac2021-thpab197,
  author       = {Y. Hidaka and D.A. Hidas and F. Plassard and T.V. Shaftan and G.M. Wang},
  title        = {{Enhancing Efficiency of Multi-Objective Neural-Network-Assisted Nonlinear Dynamics Lattice Optimization via 1-D Aperture Objectives \& Objective Focusing}},
  booktitle    = {Proc. IPAC'21},
  pages        = {4156--4159},
  eid          = {THPAB197},
  language     = {english},
  keywords     = {lattice, focusing, network, simulation, storage-ring},
  venue        = {Campinas, SP, Brazil},
  series       = {International Particle Accelerator Conference},
  number       = {12},
  publisher    = {JACoW Publishing, Geneva, Switzerland},
  month        = {08},
  year         = {2021},
  issn         = {2673-5490},
  isbn         = {978-3-95450-214-1},
  doi          = {10.18429/JACoW-IPAC2021-THPAB197},
  url          = {https://jacow.org/ipac2021/papers/thpab197.pdf},
  note         = {https://doi.org/10.18429/JACoW-IPAC2021-THPAB197},
  abstract     = {{Multi-objective optimizers such as the multi-objective genetic algorithm (MOGA) have been quite popular in discovering desirable lattice solutions for accelerators. However, even these successful algorithms can become ineffective as the dimension and range of the search space increase, due to the exponential growth in the amount of exploration required to find global optima. This difficulty is further exacerbated by the resource-intensive and time-consuming nature of nonlinear beam dynamics evaluations. Lately, the use of surrogate models based on neural networks has been drawing attention as a way to alleviate this problem. Following this trend, to further enhance the efficiency of nonlinear lattice optimization for storage rings, we propose to replace the typically used objectives with less time-consuming ones and to focus on a single objective constructed from multiple objectives, which can maximize utilization of the trained models through local optimization and objective gradient extraction. We demonstrate these enhancements using an NSLS-II upgrade lattice candidate as an example.}},
}