Paper | Title | Other Keywords | Page |
---|---|---|---|
MOPPC139 | A Framework for Off-line Verification of Beam Instrumentation Systems at CERN | framework, database, software, interface | 435 |
Many beam instrumentation systems require checks to confirm their beam readiness, detect any deterioration in performance, and identify physical problems or anomalies. Such tests have already been developed for several LHC instruments using the LHC sequencer, but the scope of this framework does not extend to all systems, and it is notably absent in the pre-LHC injector chain. Furthermore, the operator-centric nature of the LHC sequencer means that sequencer tasks are not accessible to the hardware and software experts who are required to execute similar tests on a regular basis. As a consequence, ad-hoc solutions involving code sharing and, in extreme cases, code duplication have evolved to satisfy the various use cases. In terms of long-term maintenance this is undesirable, given the often short-term tenure of developers at CERN and the importance of the uninterrupted stability of CERN's accelerators. This paper outlines the first results of an investigation into the existing analysis software and provides proposals for the future of such software.
TUPPC028 | The CERN Accelerator Logging Service - 10 Years in Operation: A Look at the Past, Present, and Future | database, operation, extraction, controls | 612 |
During the 10 years since its first operational use, the scope and scale of the CERN Accelerator Logging Service (LS) have evolved significantly: from an LHC-specific service expected to store 1 TB/year, to a CERN-wide service spanning the complete accelerator complex (including related sub-systems and experiments) currently storing more than 50 TB/year on-line for some 1 million signals. Despite the massive increase over initial expectations, the LS remains reliable and highly usable, as attested by an average of 5 million data extraction requests per day from close to 1000 users. Although a highly successful service, demands on the LS are expected to increase significantly as CERN prepares the LHC for running at top energy, which is likely to at least double current data volumes. Furthermore, focus is now shifting firmly towards the need to perform complex analysis on logged data, which in turn presents new challenges. This paper reflects on 10 years as an operational service: how it has managed to scale to meet growing demands, what has worked well, and lessons learned. Ongoing developments and future evolution are also discussed.
Poster TUPPC028 [3.130 MB]
TUPPC034 | Experience Improving the Performance of Reading and Displaying Very Large Datasets | collider, network, distributed, software | 630 |
Funding: Work supported by Brookhaven Science Associates, LLC under Contract No. DE-AC02-98CH10886 with the U.S. Department of Energy.
There has been an increasing need over the last 5 years within the BNL accelerator community (primarily within the RF and Instrumentation groups) to collect, store, and display data at high frequencies (1-10 kHz). Data throughput considerations when storing these data are manageable, but requests to display gigabytes of the collected data can quickly tax the speed at which data can be read from storage, transported over a network, and displayed on a user's computer monitor. This paper reports on efforts to improve the performance of both reading and displaying data collected by our data logging system. Our primary means of improving performance was to build a Data Server: a hardware/software server solution built to respond to client requests for data. Its job is to improve performance by 1) increasing the speed at which data is read from disk, and 2) culling the data so that the returned datasets are visually indistinguishable from the requested datasets (see the sketch below). This paper reports on statistics accumulated over the last two years that show improved data processing speeds and associated increases in the number and average size of client requests.
Poster TUPPC034 [1.812 MB]
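The abstract does not publish the Data Server's culling algorithm, so the following Python sketch illustrates just one common display-aware reduction technique, min/max decimation per pixel column; the function name, parameters, and sample data are all hypothetical rather than taken from the BNL system. The idea is that a line plot drawn through each pixel-wide bucket's minimum and maximum is visually indistinguishable from one drawn through every sample.

```python
import numpy as np

def cull_minmax(t, y, n_pixels):
    """Reduce (t, y) to at most ~2*n_pixels points that render identically.

    t, y     : full-resolution timestamps and values (1-D numpy arrays)
    n_pixels : horizontal resolution of the client's plot window
    """
    if len(y) <= 2 * n_pixels:                # already small enough to ship whole
        return t, y
    edges = np.linspace(0, len(y), n_pixels + 1, dtype=int)
    keep = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        if hi <= lo:                          # guard against empty buckets
            continue
        bucket = y[lo:hi]
        # the min and max of each pixel-wide bucket preserve the drawn envelope
        keep.extend(sorted((lo + int(bucket.argmin()), lo + int(bucket.argmax()))))
    idx = np.array(keep)
    return t[idx], y[idx]

# Example: one hour of 1 kHz data (3.6 million points) culled for a
# 1500-pixel-wide plot, shipping ~3000 points instead of 3.6 million.
t = np.arange(0.0, 3600.0, 1e-3)
y = np.sin(2 * np.pi * 0.01 * t) + 0.05 * np.random.randn(t.size)
t_small, y_small = cull_minmax(t, y, 1500)
```

Decimating on the server also addresses the network-transport cost the abstract mentions, since the payload shrinks by the same factor as the read-and-render work.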
TUPPC054 | A PLC-Based System for the Control of an Educational Observatory | controls, PLC, interface, operation | 691 |
An educational project that aims to involve young students in astronomical observations has been developed over the last decade at the Basovizza branch station of the INAF-Astronomical Observatory of Trieste. The telescope used is a 14" reflector equipped with a robotic Paramount ME equatorial mount and placed in a non-automatic dome. The newly developed control system is based on a Beckhoff PLC. The control system mainly allows remote control of the three-phase synchronous motor of the dome, the power switching of the instrumentation, and the parking of the telescope. Using the data coming from the weather sensor, the PLC is able to ensure the safety of the instruments (see the illustrative sketch below). In this paper a detailed description of the whole PLC-based control system architecture is presented.
Poster TUPPC054 [3.671 MB]
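The abstract describes the weather-driven safety behaviour but not its implementation, which on a Beckhoff controller would be written in an IEC 61131-3 language rather than Python. The sketch below is therefore purely illustrative of the per-scan interlock decision the abstract implies: the sensor fields, limit values, and action names are all hypothetical.

```python
from dataclasses import dataclass

@dataclass
class WeatherReading:
    wind_speed_ms: float   # anemometer reading, m/s
    rain_detected: bool    # rain sensor digital input
    humidity_pct: float    # relative humidity, %

# Hypothetical safe-operating limits, not values from the paper
WIND_LIMIT_MS = 15.0
HUMIDITY_LIMIT_PCT = 90.0

def instruments_safe(w: WeatherReading) -> bool:
    """True only while every weather condition is within limits."""
    return (not w.rain_detected
            and w.wind_speed_ms < WIND_LIMIT_MS
            and w.humidity_pct < HUMIDITY_LIMIT_PCT)

def interlock_cycle(w: WeatherReading, dome_open: bool) -> list[str]:
    """One scan of the safety logic: decide which actions to command."""
    actions = []
    if dome_open and not instruments_safe(w):
        # Secure the telescope before closing the dome and cutting power
        actions += ["park_telescope", "close_dome", "power_off_instruments"]
    return actions

# Example scan: rain triggers the full shutdown sequence
print(interlock_cycle(WeatherReading(4.0, True, 65.0), dome_open=True))
```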
TUPPC067 | A Distributed Remote Monitoring System for ISIS Sample Environment | controls, monitoring, EPICS, neutron | 733 |
The benefits of remote monitoring in industrial and manufacturing plants are well documented and equally applicable to scientific research facilities. This paper highlights the benefits of implementing a distributed monitoring system for sample environment equipment and instrumentation at the ISIS Neutron and Muon Source. The upcoming implementation of an EPICS replacement for the existing beamline control system provides a timely opportunity to integrate operational monitoring and diagnostic capabilities with minimal overheads (a minimal client-side sketch follows below). The ISIS facility, located at the Rutherford Appleton Laboratory, UK, is the most productive research centre of its type in the world, supporting a national and international community of more than 2000 scientists using neutrons and muons for research into materials and life sciences.
Poster TUPPC067 [0.821 MB]
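Since the system is EPICS-based, client-side channel monitoring might look like the following Python sketch using the pyepics library. The PV names are hypothetical stand-ins; the abstract does not give ISIS's actual EPICS records for sample environment equipment.

```python
# A minimal monitoring sketch, assuming the pyepics client library.
import time
from epics import PV

def on_change(pvname=None, value=None, timestamp=None, **kwargs):
    # A real deployment would feed an archiver or alarm handler;
    # printing is enough to show the event-driven monitoring idea.
    print(f"{timestamp:.3f}  {pvname} = {value}")

# Subscribe to a couple of (hypothetical) sample-environment channels;
# updates then arrive asynchronously over Channel Access.
pvs = [PV(name, callback=on_change)
       for name in ("SE:TEMP:CRYOSTAT", "SE:PRESSURE:GAUGE")]

while True:
    time.sleep(1.0)   # main thread idles; callbacks fire in the background
```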
TUPPC073 | National Ignition Facility (NIF) Dilation X-ray Imager (DIXI) Diagnostic Instrumentation and Control System | diagnostics, target, timing, controls | 751 |
Funding: This work was performed under the auspices of Lawrence Livermore National Security, LLC (LLNS) under Contract No. DE-AC52-07NA27344. LLNL-ABS-633832
X-ray cameras on inertial confinement fusion facilities can determine the implosion velocity and symmetry of NIF targets by recording the emission of X-rays from the target, gated as a function of time. To capture targets that undergo ignition and thermonuclear burn, however, cameras with shutter times of less than 10 picoseconds are needed. A collaboration between LLNL, General Atomics, and Kentech Instruments has resulted in the design and construction of an X-ray camera which meets this criterion by converting an X-ray image to an electron image, which is stretched and then coupled to a conventional shuttered electron camera. This talk discusses the target diagnostic instrumentation and software used to control the DIXI diagnostic and seamlessly integrate it into the National Ignition Facility (NIF) Integrated Computer Control System (ICCS).
Poster TUPPC073 [3.443 MB]