Paper | Title | Other Keywords | Page |
---|---|---|---|
MOCOAAB02 | Design and Status of the SuperKEKB Accelerator Control System | controls, network, EPICS, interface | 4 |
SuperKEKB is the upgrade of the KEKB asymmetric-energy e+e− collider for the B-factory experiment in Japan, designed to achieve a luminosity 40 times higher than the world record set by KEKB. The KEKB control system was based on EPICS at the equipment layer and scripting languages at the operation layer. The SuperKEKB control system retains those features, while we introduce additional technologies for successful operation at such a high luminosity. In the accelerator control network, we introduce 10GbE for wider-bandwidth data transfer and redundant configurations for reliability. Network security is also enhanced. For the SuperKEKB construction, a wireless network has been installed in the beamline tunnel. The timing system requires a new configuration for positron beams. We have developed a faster-response beam abort system, interface modules to control thousands of magnet power supplies, and a monitoring system for the final-focusing superconducting magnets to assure stable operation. We also introduce EPICS embedded PLCs, in which EPICS runs directly on the CPU module. The design and status of the SuperKEKB accelerator control system are presented.
Slides MOCOAAB02 [5.930 MB]
MOCOAAB04 | The Integrated Control System at ESS | controls, software, hardware, linac | 12 |
The European Spallation Source (ESS) is a high-current proton linac to be built in Lund, Sweden. The linac delivers 5 MW of power to the target at 2500 MeV, with a nominal current of 50 mA. The project entered its construction phase on January 1st, 2013. In order to design, develop and deliver a reliable, well-performing and standardized control system for the ESS facility, the Integrated Control System (ICS) project has been established; it entered its construction phase on the same date. ICS consists of four distinct core components (Physics, Software Services, Hardware and Protection) that make up the essence of the control system. Integration Support activities support the stakeholders and users, and the Control System Infrastructure provides the underlying infrastructure required for operating the control system and the facility. The current state of the control system project and key decisions are presented, as well as immediate challenges and proposed solutions.
Slides MOCOAAB04 [11.760 MB]
MOPPC028 | High-Density Power Converter Real-Time Control for the MedAustron Synchrotron | controls, operation, FPGA, real-time | 127 |
The MedAustron accelerator is a synchrotron for light-ion therapy, developed under the guidance of CERN within the MedAustron-CERN collaboration. Procurement of 7 different power converter families and development of the control system were carried out concurrently. Control is optimized for unattended routine clinical operation, so finding a uniform control solution was paramount to fulfilling the ambitious project plan. Another challenge was the need to operate with about 5'000 cycles initially, achieving pipelined operation with pulse-to-pulse reconfiguration times smaller than 250 ms. This contribution shows the architecture and design and gives an overview of the system as built and operated. It is based on commercial off-the-shelf processing hardware at the front-end level and on the CERN function generator design at the equipment level. The system is self-contained, permitting reuse of its parts, or of the whole, in other accelerators. In particular, the separation of the power converter from the real-time regulation using CERN's Converter Regulation Board makes this approach an attractive choice for integrating existing power converters in new configurations.
Poster MOPPC028 [0.892 MB]
MOPPC029 | Internal Post Operation Check System for Kicker Magnet Current Waveforms Surveillance | controls, kicker, interface, operation | 131 |
A software framework, called Internal Post Operation Check (IPOC), has been developed to acquire and analyse kicker magnet current waveforms. It was initially aimed at performing the surveillance of the LHC beam dumping system (LBDS) extraction and dilution kicker current waveforms and was subsequently also deployed on various other kicker systems at CERN. It has been implemented using the Front-End Software Architecture (FESA) framework, and uses many CERN control services. It provides a common interface to various off-the-shelf digitiser cards, allowing a transparent integration of new digitiser types into the system. The waveform analysis algorithms are provided as external plug-in libraries, leaving their specific implementation to the kicker system experts. The general architecture of the IPOC system is presented in this paper, along with its integration within the control environment at CERN. Some application examples are provided, including the surveillance of the LBDS kicker currents and trigger synchronisation, and a closed-loop configuration to guarantee constant switching characteristics of high voltage thyratron switches.
Poster MOPPC029 [0.435 MB]
MOPPC092 | Commissioning the MedAustron Accelerator with ProShell | controls, interface, ion, framework | 314 |
MedAustron is a synchrotron-based centre for light-ion therapy under construction in Austria. The accelerator and its control system entered the on-site commissioning phase in January 2013. This contribution presents the current status of the accelerator operation and commissioning procedure framework, called ProShell, which is used to model measurement procedures for commissioning and operation with Petri nets. Beam diagnostics device adapters are implemented in C#. To illustrate its use for beam commissioning, procedures currently in use are presented, including their integration with existing devices such as the ion source, power converters, slits, wire scanners and profile grid monitors. The beam spectrum procedure measures the distribution of particles generated by the ion source. The phase space distribution procedure performs emittance measurements in the beam transfer lines. The trajectory steering procedure measures the beam position in each part of the machine and aids in correcting beam positions by integrating MAD-X optics calculations. Additional procedures and (beam diagnostic) devices are defined, implemented and integrated with ProShell on demand as commissioning progresses.
Poster MOPPC092 [2.896 MB]
MOPPC097 | The FAIR Control System - System Architecture and First Implementations | controls, software, operation, network | 328 |
The paper presents the architecture of the control system for the Facility for Antiproton and Ion Research (FAIR), currently under development. The FAIR control system comprises the full electronics, hardware, and software to control, commission, and operate the FAIR accelerator complex for multiplexed beams. It takes advantage of collaborations with CERN in using proven framework solutions such as FESA, LSA, and White Rabbit. The equipment layer consists of equipment interfaces, embedded system controllers, and software representations of the equipment (FESA). A dedicated real-time network based on White Rabbit is used to synchronize and trigger actions at the equipment level. The middle layer provides service functionality to both the equipment layer and the application layer through the IP control system network. LSA is used for settings management. The application layer combines the applications for operators, as GUI applications or command-line tools, typically written in Java. For validation of the concepts, FAIR's proton injector at CEA (France) and CRYRING at GSI will be commissioned as early as 2014 with reduced functionality of the proposed FAIR control system stack.
Poster MOPPC097 [2.717 MB]
MOPPC106 | Status Report of RAON Control System | controls, EPICS, vacuum, PLC | 356 |
RAON is a new heavy-ion accelerator under construction in South Korea that will produce a variety of stable-ion and rare-isotope beams to support basic science and applied research. To produce the isotopes fulfilling these requirements, we have planned several operation schemes requiring fine-tuned synchronous controls, asynchronous controls, or both among the accelerator complexes. The basic idea and development progress of the control system, as well as future plans, are presented.
Poster MOPPC106 [1.403 MB]
MOPPC108 | Status of the NSLS-II Booster Control System | controls, booster, vacuum, operation | 362 |
The booster control system is an integral part of the NSLS-II control system and is developed under EPICS. It includes six IBM System x3250 M3 servers and four VME3100 controllers connected via Gigabit Ethernet. These computers run IOCs for power supply control, timing, beam diagnostics and interlocks. cPCI ADCs located in a cPCI crate are also used for beam diagnostics. Front-end electronics for vacuum control and interlocks are Allen-Bradley programmable logic controllers and I/O devices. The timing system is based on Micro-Research Finland Oy products: the EVR 230RF and PMC EVR. Power supply control uses a BNL-developed pair comprising a Power Supply Interface (PSI), located close to the power supplies, and a Power Supply Controller (PSC), connected to a front-end computer via 100 Mbit Ethernet. Each PSI is connected to its PSC via a fiber-optic link. High-level applications developed in Control System Studio and Python run on operator consoles located in the control room. This paper describes the final design and status of the booster control system. The functional block diagrams are presented.
Poster MOPPC108 [0.458 MB]
TUPPC059 | EPICS Data Acquisition Device Support | EPICS, interface, software, detector | 707 |
A large number of devices offer similar kinds of capabilities. For example, data acquisition devices all offer sampling at some rate. If each such device had a different interface, engineers using them would need to be familiar with each device specifically, inhibiting the transfer of know-how from one device to another and increasing the chance of engineering errors due to miscomprehension or incorrect assumptions. With the Nominal Device Model (NDM), we propose to standardize the EPICS interface of analog and digital input and output devices, and of image acquisition devices. The model describes an input/output device with digital or analog channels, where channels can be configured for output or input. Channels can be organized in groups that have common parameters. NDM is implemented as the EPICS Nominal Device Support (NDS) library. It provides a C++ interface to developers of device-specific drivers. NDS itself inherits from the well-known asynPortDriver. NDS hides from the developer all the complexity of the communication with asynDriver and allows the developer to focus on the business logic of the device itself.
Poster TUPPC059 [0.371 MB]
TUPPC073 | National Ignition Facility (NIF) Dilation X-ray Imager (DIXI) Diagnostic Instrumentation and Control System | diagnostics, target, controls, instrumentation | 751 |
Funding: This work was performed under the auspices of the Lawrence Livermore National Security, LLC (LLNS) under Contract No. DE-AC52-07NA27344. #LLNL-ABS-633832
X-ray cameras on inertial confinement fusion facilities can determine the implosion velocity and symmetry of NIF targets by recording the emission of X-rays from the target, gated as a function of time. To capture targets that undergo ignition and thermonuclear burn, however, cameras with shutter times of less than 10 picoseconds are needed. A collaboration between LLNL, General Atomics and Kentech Instruments has resulted in the design and construction of an X-ray camera which converts an X-ray image to an electron image that is stretched and then coupled to a conventional shuttered electron camera to meet this criterion. This talk discusses the target diagnostic instrumentation and software used to control the DIXI diagnostic and seamlessly integrate it into the National Ignition Facility (NIF) Integrated Computer Control System (ICCS).
Poster TUPPC073 [3.443 MB]
TUPPC083 | FPGA Implementation of a Digital Constant Fraction for Fast Timing Studies in the Picosecond Range | detector, FPGA, neutron, real-time | 774 |
Thermal or cold neutron capture on different fission systems is an excellent method to produce a variety of very neutron-rich nuclei. Since neutrons at these energies bring into the reaction just enough energy to produce fission, the fragments remain neutron-rich due to negligible neutron evaporation, thus allowing detailed nuclear structure studies. In 2012 and 2013, a combination of EXOGAM, GASP and Lohengrin germanium detectors was installed at the PF1B cold neutron beam of the Institut Laue-Langevin. The present paper describes the digital acquisition system used to collect information on all gamma rays emitted by the decaying nuclei. Data have been acquired in a trigger-less mode to preserve a maximum of information for further off-line treatment, with a total throughput of about 10 MByte/s. Special emphasis is devoted to the FPGA implementation of an on-line digital constant fraction algorithm allowing fast timing studies in the picosecond range.
Poster TUPPC083 [9.928 MB]
TUPPC086 | Electronics Developments for High Speed Data Throughput and Processing | detector, FPGA, controls, interface | 778 |
Funding: The research leading to these results has received funding from the European Union Seventh Framework Programme (FP7/2007-2013) under grant agreement No. 283745
The European XFEL DAQ system has to acquire and process data in short bursts every 100 ms. Bursts last for 600 μs and contain a maximum of 2700 X-ray pulses at a repetition rate of 4.5 MHz, which have to be captured and processed before the next burst starts. This time structure defines the boundary conditions for almost all diagnostic- and detector-related DAQ electronics required and currently being developed for the start of operation in fall 2015. Standards used in the electronics developments are: MicroTCA.4 and AdvancedTCA crates, FPGAs for data processing, transfer to back-end systems via 10 Gbps (SFP+) links, and feedback information transfer using 3.125 Gbps (SFP) links. Electronics being developed in-house or in collaboration with external institutes and companies include: a Train Builder ATCA blade for assembling and processing data of large-area image detectors, a VETO MTCA.4 development for evaluating pulse information and distributing a trigger decision to detector front-end ASICs and FPGAs with low latency, an MTCA.4 digitizer module, interface boards for timing and similar synchronization information, etc.
Poster TUPPC086 [0.983 MB]
THPPC018 | Construction of the TPS Network System | network, controls, EPICS, Ethernet | 1127 |
The 3 GeV Taiwan Photon Source (TPS) project needs a reliable, secure and high-throughput network to ensure routine facility operation and to provide better service for various purposes. The network system includes the office network, the beamline network and the accelerator control network for the TPS and TLS (Taiwan Light Source) sites at NSRRC. Cyber-security technologies such as firewalls, NAT and VLANs are combined to define a tree network topology that isolates the accelerator control network, the beamline network and subsystem components. Various network management tools are used for maintenance and troubleshooting. The TPS network system architecture, cabling topology, redundancy and maintainability are described in this report.
Poster THPPC018 [2.650 MB]
THPPC056 | Design and Implementation of Linux Drivers for National Instruments IEEE 1588 Timing and General I/O Cards | hardware, software, Linux, controls | 1193 |
Cosylab is developing GPL Linux device drivers to support several National Instruments (NI) devices. In particular, drivers have already been developed for the NI PCI-1588 and PXI-6682 (IEEE 1588/PTP) devices and the NI PXI-6259 I/O device. These drivers are being used in the development of ITER, the latest plasma fusion research reactor, being built at the Cadarache facility in France. In this paper we discuss design and implementation issues, such as driver API design (device file per device versus device file per functional unit), PCI device enumeration, handling reset, etc. We also present various use cases demonstrating the capabilities and real-world applications of these drivers.
Poster THPPC056 [0.482 MB]
THPPC060 | A PXI-Based Low Level Control for the Fast Pulsed Magnets in the CERN PS Complex | controls, kicker, FPGA, monitoring | 1205 |
Fast pulsed magnet (kicker) systems are used for beam injection and extraction in the CERN PS complex. A novel approach, based on off-the-shelf PXI components, has been used for the consolidation of the low-level part of their control system. Typical required functionalities, such as interlocking, equipment state control, thyratron drift stabilisation and protection, short-circuit detection in magnets and transmission lines, pulsed signal acquisition and fine timing, have been successfully integrated within a PXI controller. The controller comprises a National Instruments PXI-810x RT real-time processor, a multifunctional RIO module including a Virtex-5 LX30 FPGA, a 1 GS/s digitiser and a digital delay module with 1 ns resolution. National Instruments LabVIEW development tools have been used to develop the embedded real-time software as well as the FPGA configuration and expert application programs. The integration within the CERN controls environment is performed using the Rapid Application Development Environment (RADE) software tools, developed at CERN.
Poster THPPC060 [0.887 MB]
THPPC067 | New EPICS Drivers for Keck TCS Upgrade | EPICS, FPGA, interface, controls | 1231 |
Keck Observatory is in the midst of a major telescope control system upgrade. This involves migrating from a VME-based EPICS control system, originally deployed on Motorola FRC40s running VxWorks 5.1 and EPICS R3.13.0Beta12, to distributed 64-bit x86 Linux servers running RHEL 2.6.33.x and EPICS R3.14.12.x. This upgrade brings a lot of new hardware to the project, including EtherNet/IP-connected PLCs, Ethernet-connected Delta Tau Brick controllers, National Instruments MXI RIO, Heidenhain encoders (and in particular the Heidenhain Ethernet-connected Encoder Interface Box), Symmetricom PCI-based BC635 timing and synchronization cards, and serial line extenders and protocols. Keck has chosen to implement all new drivers using the ASYN framework. This paper describes the various drivers used in the upgrade, including those from the community and those developed by Keck, which include the BC635, MXI and Heidenhain EIB drivers. It also discusses the use of the BC635 as a local NTP reference clock and as a service for EPICS general time.
THPPC089 | High Repetition Rate Laser Beamline Control System | laser, controls, EPICS, network | 1281 |
Funding: The authors acknowledge the support of the following grants of the Czech Ministry of Education, Youth and Sports: "CZ.1.05/1.1.00/02.0061" and "CZ.1.07/2.3.00/20.0091".
ELI-Beamlines will be a high-energy, high repetition-rate laser pillar of the ELI (Extreme Light Infrastructure) project. It will be an international user facility for both academic and applied research, scheduled to provide user capability from the beginning of 2017. As part of the development of the L1 laser beamline we are developing a prototype control system. The beamline repetition rate of 1 kHz, with its femtosecond pulse accuracy, places demanding requirements on both the control and synchronization systems. A low-jitter, high-precision commercial timing system will be deployed to accompany both EPICS- and LabVIEW-based control system nodes, many of which will be enhanced for real-time responsiveness. Data acquisition will be supported by an in-house time-stamping mechanism relying on sub-millisecond system responses. The synergy of LabVIEW Real-Time and EPICS within particular nodes will be secured by advanced techniques to achieve both fast responsiveness and high data throughput.
tomas.mazanec@eli-beams.eu
Poster THPPC089 [1.286 MB]
THPPC090 | Picoseconds Timing System | laser, experiment, controls, diagnostics | 1285 |
The instrumentation of large physics experiments needs to be synchronized down to a few picoseconds. These experiments require different sampling rates for multi-shot or single-shot operation on each instrument, distributed over a large area. Greenfield Technology presents a commercial solution with a picosecond timing system built around a central master oscillator, which delivers a serial data stream over an optical network to synchronize local multi-channel delay generators. This system is able to provide several hundred trigger pulses with 1 ps resolution and a jitter of less than 15 ps, distributed over an area of up to 10 000 m². The various qualities of this picosecond timing system are presented with measurements and functions; it has already been implemented in French facilities (the Laser MegaJoule prototype Ligne d'Intégration Laser, petawatt laser applications and Synchrotron SOLEIL). With different local delay generator form factors (box, 19” rack, cPCI or PXI board) and many possibilities of trigger pulse shape, this system is an ideal solution to synchronize synchrotrons, high-energy lasers or other large physics experiments.
Poster THPPC090 [1.824 MB]
THPPC092 | FAIR Timing System Developments Based on White Rabbit | controls, network, FPGA, interface | 1288 |
A new timing system based on White Rabbit (WR) is being developed for the upcoming FAIR facility at GSI, in collaboration with CERN, other institutes and industry partners. The timing system is responsible for the synchronization of nodes with nanosecond accuracy and for the distribution of timing messages, which allows for real-time control of the accelerator equipment. WR is a fully deterministic Ethernet-based network for general data transfer and synchronization, based on Synchronous Ethernet and PTP. The ongoing development at GSI aims for a miniature timing system as part of the control system of a proton source that will be used at one of the FAIR accelerators. Such a timing system consists of a Data Master generating timing messages, which are forwarded by a WR switch to a handful of timing receivers. The next step is an enhancement of the robustness, reliability and scalability of the system. These features will be integrated in the forthcoming CRYRING control system at GSI. CRYRING serves as a prototype and testing ground for the final FAIR control system. The contribution presents the overall design and status of the timing system development.
Poster THPPC092 [0.549 MB]
THPPC102 | Comparison of Synchronization Layers for Design of Timing Systems | interface, network, Ethernet, real-time | 1296 |
Two synchronization layers for timing systems in large experimental physics control systems are compared. White Rabbit (WR), an emerging standard, is compared against the well-established event-based approach. Several typical timing system services have been implemented on an FPGA using WR to explore its concepts and architecture, which is fundamentally different from the event-based one. Both synchronization layers were evaluated against typical requirements of current accelerator projects and with regard to other parameters such as scalability. The proposed design methodology demonstrates how WR can be deployed in future accelerator projects.
Poster THPPC102 [1.796 MB]
THPPC103 | Timing System at MAX IV | linac, gun, injection, storage-ring | 1300 |
The MAX IV Laboratory is the successor of the MAX-lab national laboratory in Sweden. The facility is being constructed at Brunnshög in the north-eastern part of Lund and will contain one long 3 GeV linac (full-energy injector), two storage rings (1.5 GeV and 3 GeV) and a short pulse facility (SPF). This paper describes the design status of the timing system in 2013.
Poster THPPC103 [7.134 MB]
THPPC104 | A Timing System for Cycle Based Accelerators | software, real-time, LabView, hardware | 1303 |
Synchrotron accelerators with multiple ion sources and beam lines require a high degree of flexibility to define beam cycle timing sequences. We have therefore decided to design a ready-to-use accelerator timing system based on off-the-shelf hardware and software that fits mid-size accelerators and is easy to adapt to specific user needs. This Real-Time Event Distribution Network (REDNet) has been developed under the guidance of CERN within the MedAustron-CERN collaboration. The system, based on the MRF transport layer, has been implemented by Cosylab. While we have used hardware on the NI PXIe platform, it is straightforward to obtain it for other platforms such as VME. The following characteristics are key to its readiness for use: (1) a turn-key system comprising hardware, transport layer, application software and open integration interfaces; (2) performance suitable for a wide range of accelerators; (3) multiple virtual timing systems in one physical box; (4) documentation developed according to the V-model. Given the maturity of the development, we have decided to make REDNet available as a product through our industrial partner.
Poster THPPC104 [0.429 MB]
THPPC107 | Timing and Synchronization at Beam Line Experiments | hardware, experiment, EPICS, controls | 1311 |
Some experiment concepts require a control system in which the individual components work synchronously. At PSI, the control system for X-ray experiments is distributed over several VME crates, EPICS soft IOC servers and Linux nodes, which need to be synchronized. A timing network using fibre optics, separate from the standard TCP/IP network, is used for distributing time stamps and timing events. The synchronization of all control components and data acquisition systems has to happen automatically with sufficient accuracy, and is done by event distribution and/or by I/O trigger devices. Data acquisition is synchronized by hardware triggers, produced either by sequences in the event generator or by motors in the case of on-the-fly scans. Detectors like EIGER, with an acquisition rate close to 20 kHz, fast BPMs connected to current-measuring devices like picoammeters with sampling frequencies up to 26 kHz, and photodiodes are integrated to measure beam properties and radiation exposures. The measured data are stored on various file servers situated within one beamline subnetwork. In this paper we describe a concept for implementing such a system.
THPPC109 | Status of the TPS Timing System | injection, controls, booster, EPICS | 1314 |
Implementation of the timing system of the Taiwan Photon Source (TPS) is underway. The timing system provides synchronization for the electron gun, linac modulators, pulsed-magnet power supplies, the booster power supply ramp trigger, bucket addressing of the storage ring and diagnostic equipment, as well as the beamline gating signal for top-up injection and synchronization for time-resolved experiments. The system is based on an event distribution scheme that broadcasts timing events over an optical fiber network and decodes and processes them at the timing event receivers. The system supports uplink functionality, which will be used by the fast interlock system to distribute signals such as beam dump and post-mortem triggers with less than 5 μs response time. Software support is in progress, and a time sequencer to support various injection modes is under development. Timing solutions for the TPS project are summarized in the following paragraphs.
Poster THPPC109 [1.612 MB]
THPPC110 | Timing of the ALS Booster Injection and Extraction | booster, injection, extraction, storage-ring | 1318 |
The Advanced Light Source (ALS) timing system upgrade introduces a complete replacement of both the hardware and the technology used to drive the timing of the accelerator. The implementation of a new strategy for the booster injection and extraction mechanisms is conceptually similar to the one in place today, but fundamentally different due to the replacement of the technology. Here we describe some of the building blocks of this new implementation, as well as an example of how the system can be configured to provide timing for injection and extraction of the ALS booster.
Poster THPPC110 [0.207 MB]
THPPC112 | The LANSCE Timing Reference Generator | controls, neutron, EPICS, interface | 1321 |
The Los Alamos Neutron Science Center is an 800 MeV linear proton accelerator at Los Alamos National Laboratory. For optimum performance, power modulators must be tightly coupled to the phase of the power grid. Downstream at the neutron scattering center there is a competing requirement: rotating choppers must follow the changing phase of neutron production in order to remove unwanted energy components from the beam. While their powerful motors are actively accelerated and decelerated to track accelerator timing, they cannot track instantaneous grid phase changes. A new timing reference generator has been designed to couple the accelerator to the power grid through a phase-locked loop. This allows some slip between the phase of the grid and the accelerator, so that the modulators stay within their timing margins but the demands on the choppers are relaxed. The new timing reference generator is implemented in 64-bit floating-point math in an FPGA. Operators in the control room have real-time network control over the AC zero-crossing offset, the maximum allowed drift, and the slew rate, the parameter that determines how tightly the phase of the accelerator is coupled to the power grid.
LA-UR-13-21289
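The coupling scheme of THPPC112 can be illustrated with a small software model. This is only a sketch under stated assumptions: the real generator runs 64-bit floating-point math in an FPGA, and the parameter names and update rule below are a plausible reading of the abstract, not the LANSCE implementation.

```cpp
#include <algorithm>
#include <cmath>

// Illustrative model: track the grid phase with a limited slew rate,
// while never letting the accumulated slip exceed maxDrift.
struct TimingReference {
    double phase = 0.0;      // accelerator timing phase [s]
    double offset = 0.0;     // operator-set AC zero-crossing offset [s]
    double maxDrift = 1e-3;  // maximum allowed slip from grid phase [s]
    double slewRate = 1e-6;  // maximum correction per update [s]

    // Called once per measured grid zero crossing.
    void update(double gridZeroCrossing) {
        double error = (gridZeroCrossing + offset) - phase;
        // Gentle tracking: follow the grid no faster than slewRate,
        // so the choppers can keep up with the phase changes.
        phase += std::clamp(error, -slewRate, slewRate);
        // Hard limit: keep the residual slip within the modulators'
        // timing margins by clamping it to +/- maxDrift.
        double slip = (gridZeroCrossing + offset) - phase;
        if (std::fabs(slip) > maxDrift)
            phase += std::copysign(std::fabs(slip) - maxDrift, slip);
    }
};
```

In this picture, lowering slewRate relaxes the demands on the choppers, while maxDrift is the hard bound that keeps the modulators within their timing margins.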
THPPC113 | Integrated Timing System for the EBIS Pre-Injector | booster, ion, operation, controls | 1325 |
Funding: Work supported by Brookhaven Science Associates, LLC under Contract No. DE-AC02-98CH10886 with the U.S. Department of Energy.
The Electron Beam Ion Source (EBIS) began operating as a pre-injector in the C-AD RHIC accelerator complex in 2010. Historically, C-AD RHIC pre-injectors, like the 200 MeV Linac, have had largely independent timing systems that receive a minimal number of triggers from the central C-AD timing system to synchronize the injection process. The EBIS timing system is much more closely integrated into central C-AD timing, with all EBIS machine cycles included in the master supercycle that coordinates the interoperation of the C-AD accelerators. The integrated timing approach allows better coordination of pre-injector activities with other activities in the C-AD complex. Independent pre-injector operation, however, must also be supported by the EBIS timing system. This paper describes the design of the EBIS timing system and evaluates experience in the operational management of EBIS timing.
Poster THPPC113 [21.388 MB]
THPPC119 | Software Architecture for the LHC Beam-based Feedback System at CERN | feedback, controls, optics, network | 1337 |
This paper presents an overview of the beam-based feedback systems at the LHC at CERN. It covers the system architecture, which is split into two main parts: a controller (OFC) and a service unit (OFSU). The paper presents issues encountered during beam commissioning and lessons learned, including follow-up from a recent review which took place at CERN.
Poster THPPC119 [1.474 MB]
THCOBB01 | An Upgraded ATLAS Central Trigger for 2015 LHC Luminosities | detector, electronics, interface, luminosity | 1388 |
The LHC collides protons at a rate of ~40 MHz, and each collision produces ~1.5 MB of data from the ATLAS detector (~60 TB of data per second). The ATLAS trigger system reduces this input rate to a more manageable storage rate of about 400 Hz. The Level-1 trigger reduces the input rate to ~100 kHz with a decision latency of ~2.5 μs and is responsible for initiating the readout of data from all the ATLAS subdetectors. It is primarily composed of the Calorimeter Trigger, the Muon Trigger, and the Central Trigger Processor (CTP). The CTP collects trigger information from all Level-1 systems and produces the Level-1 trigger decision. The LHC has now shut down for upgrades and will return in 2015 with increased luminosity and a centre-of-mass energy of 14 TeV. With higher luminosities, the number and complexity of Level-1 triggers will increase in order to satisfy the physics goals of ATLAS while keeping the total Level-1 rate at or below 100 kHz. In this talk we discuss the current Central Trigger Processor and the justification for its upgrade, including the plans to satisfy the requirements of the 2015 physics run at the LHC.
Slides THCOBB01 [10.206 MB]
THCOBB06 | CLIC-ACM: Acquisition and Control System | radiation, controls, network, survey | 1404 |
CLIC (Compact Linear Collider) is a world-wide collaboration studying the next “terascale” lepton collider, relying upon the very innovative concept of two-beam acceleration. In this scheme, the power is transported to the main accelerating structures by a primary electron beam. The Two-Beam Module (TBM) is a compact integration, with a high filling factor, of all components: RF, magnets, instrumentation, vacuum, alignment and stabilization. This paper describes the very challenging aspects of designing the compact system serving as a dedicated Acquisition & Control Module (ACM) for all signals of the TBM. Very delicate conditions must be considered, in particular radiation doses that could reach several kGy in the tunnel. In such severe conditions, shielding and radiation-hardened electronics will have to be taken into consideration. In addition, with more than 300 channels per ACM and about 21000 ACMs in total, power consumption will clearly be an important issue. It is also clear that digitization of the signal acquisition will take place at the lowest possible hardware level, and that neither a local processor nor an operating system will be used inside the ACM.
Slides THCOBB06 [0.846 MB]
Poster THCOBB06 [0.747 MB]
THCOCA01 | A Design of Sub-Nanosecond Timing and Data Acquisition Endpoint for LHAASO Project | network, interface, electronics, controls | 1442 |
Funding: National Science Foundation of China (No. 11005065 and 11275111)
The particle detector array (KM2A) of the Large High Altitude Air Shower Observatory (LHAASO) project consists of 5631 electron and 1221 muon detection units over a 1.2 km² area. To reconstruct the incident angle of cosmic rays, sub-nanosecond time synchronization must be achieved. The White Rabbit (WR) protocol is applied for its high synchronization precision, automatic delay compensation and intrinsic high-bandwidth data transmission capability. This paper describes the design of a sub-nanosecond timing and data acquisition endpoint for KM2A. It works as an FMC mezzanine mounted on detector-specific front-end electronics boards and provides the WR-synchronized clock and timestamp. The endpoint supports the Etherbone protocol for remote monitoring and firmware updates. Moreover, a hardware UDP engine is integrated in the FPGA to pack and transmit raw data from the detector electronics to the readout network. Preliminary tests demonstrate a timing precision of 29 ps (RMS) and a timing accuracy better than 100 ps (RMS).
The authors are with the Key Laboratory of Particle and Radiation Imaging, Department of Engineering Physics, Tsinghua University, Beijing, China, 100084. pwb.thu@gmail.com
Slides THCOCA01 [1.182 MB]
THCOCA03 | High-Precision Timing of Gated X-Ray Imagers at the National Ignition Facility | target, laser, experiment, detector | 1449 |
Funding: This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344. #LLNL-ABS-633013
The National Ignition Facility (NIF) at the Lawrence Livermore National Laboratory is a stadium-sized facility that contains a 192-beam, 1.8-Megajoule, 500-Terawatt ultraviolet laser system together with a 10-meter diameter target chamber. We describe techniques used to synchronize data acquired by gated X-ray imagers with the NIF laser beams. Synchronization is achieved by collecting data from multiple beam groups with spatial and temporal separation in a single NIF shot. By optimizing the experimental setup and data analysis, repeatable measurements of 15 ps or better have been achieved. This demonstrates that the facility timing system, laser, and target diagnostics are highly stable over year-long time scales.
Slides THCOCA03 [1.182 MB]
THCOCA04 | Upgrade of Event Timing System at SuperKEKB | injection, linac, positron, operation | 1453 |
The timing system of the KEKB accelerator will be upgraded for the SuperKEKB project. One of the difficulties at SuperKEKB is positron injection: it takes more than 40 ms, since the positron pulse must be stored in the newly constructed damping ring for at least 40 ms. The timing of the whole accelerator chain must be precisely synchronized over such a long period, while we still manage highly frequent injections; typically a beam pulse is delivered to one of the rings every 20 ms. Besides, the new system must be capable of real-time selection of the injection RF bucket, called "Bucket Selection" at KEKB, for equalizing the bunch current in the main rings. Bucket Selection will also be upgraded to synchronize the buckets of the damping ring with those of the main rings. This includes the expansion of the maximum delay time up to 2 ms and a pulse-by-pulse shift of the RF phase in the second half of the injection linac. We plan to upgrade the Event Timing System from a "2-layer type", which simply connects one generator and one receiver, to a "cascade type" to satisfy the new injection requirements. We report the basic design of the new timing system and recent studies of key elements of the Event Timing System instruments.
Slides THCOCA04 [1.559 MB]
THCOCA05 | Laser MegaJoule Timing System | laser, target, diagnostics, high-voltage | 1457 |
The French Commissariat à l’Énergie Atomique et aux Énergies Alternatives (CEA) is currently building the Laser Megajoule (LMJ). This facility is designed to deliver laser energy to targets for high-energy-density physics experiments, including fusion experiments. The Integrated Timing and Triggering System (ITTS) is one of the critical LMJ components, in charge of timing distribution for synchronizing the laser beams and triggering the shot data acquisitions. The LMJ ITTS control system provides a single generic interface to its users at the supervisory level, built around the key concept of a “Synchronized Channels Group”, a set of delay channels triggered simultaneously. Common software components provide the basic mechanisms: communication with users and channel registration. User-defined delays are specified with respect to a given reference (target chamber centre, quadruplet or beam reference times); these delays are then translated into hardware delays according to parameters such as electronic card temperatures (for thermal drift correction) and transit delays. The equipment consists mainly of off-the-shelf timing modules delivering trigger signals with jitter down to 15 ps rms.
Slides THCOCA05 [0.974 MB]