Paper | Title | Other Keywords | Page
---|---|---|---
MOPB01 | Grid and Component Technologies in Physics Applications | plasma | 29
Physics experiments and simulations are growing in size and complexity. Examples are the existing HEP/NP experiments and the upcoming challenges of SNS, LHC, ILC, and ITER. Managing the experimental data is an extremely complex activity. Physics simulations now attempt full modeling of various phenomena and of whole experimental devices, as in fusion integrated modeling and space weather modeling. Recent advances in computer science, such as Grids and components, address the challenges faced by these applications. In science, Globus and the Common Component Architecture (CCA) have become commonly used tools for these technologies. Globus allows creating a grid: a set of computers trusting each other and a group of users who can then submit jobs and move data. CCA expresses the connectivity of a simulation's elements, written in different languages, as components: objects with in and out ports. CCA frameworks combine components into a simulation and can swap components that share the same ports. CCA accommodates both high-performance and distributed applications. We will present our work with Globus and CCA in HEP/NP and fusion, share the lessons learned, and evaluate the ease of using these technologies and the value they add.
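The component model sketched in the abstract (objects with in/out ports, wired and swapped by a framework) can be illustrated with a minimal toy in Python. All class and method names here are invented for illustration; the real CCA API (gov.cca) and its frameworks look quite different.

```python
# Toy version of the CCA idea: components expose "provides" (in) ports
# and "uses" (out) ports; a framework wires them together and can swap
# any component that offers the same port.

class Component:
    def __init__(self):
        self.provides = {}   # port name -> callable offered to others
        self.uses = {}       # port name -> callable bound by the framework

class Integrator(Component):
    def __init__(self):
        super().__init__()
        self.provides["step"] = self.step
    def step(self, x):
        # Out-port call: which "rhs" runs is decided at wiring time.
        return x + self.uses["rhs"](x)

class LinearRHS(Component):
    def __init__(self, k):
        super().__init__()
        self.k = k
        self.provides["rhs"] = lambda x: self.k * x

class Framework:
    def __init__(self):
        self.components = []
    def add(self, c):
        self.components.append(c)
        return c
    def connect(self, user, port, provider):
        # Swapping `provider` for another component with the same port
        # changes the simulation without touching the Integrator's code.
        user.uses[port] = provider.provides[port]

fw = Framework()
integ = fw.add(Integrator())
fw.connect(integ, "rhs", fw.add(LinearRHS(0.5)))
print(integ.step(2.0))  # 2.0 + 0.5*2.0 = 3.0
```

Replacing `LinearRHS` with any other component providing an `"rhs"` port is the swap operation the abstract describes.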
TOAA05 | Implementation, Commissioning and Current Status of the Diamond Light Source Control System | controls, photon, diagnostics, linac | 56
Starting with the Linac in 2005, the commissioning of the Diamond Light Source accelerators and photon beamlines, together with their related control systems, progressed on an aggressive program such that, by early 2007, the facility was available for first users with a suite of beamlines and experiment stations. The implementation and commissioning of the control system to meet the overall project objectives are presented. The current status of the control system, including ongoing developments for electron-beam orbit stability and future photon beamline requirements, is also described.
TOAB02 | Current Status of the Control System for J-PARC Accelerator Complex | linac, controls, beam-losses, proton | 62
The J-PARC accelerator complex consists of a proton linac (LINAC), a Rapid-Cycling Synchrotron (RCS), and a Main Ring synchrotron (MR). Commissioning of the LINAC started in November 2006, while commissioning of the MR is scheduled for May 2008. Most of the machine components of the MR have been installed in the tunnel. Electronics modules will be installed and cabled by the end of 2007. For the control of the MR, the J-PARC accelerator control network was extended to include the MR-related parts in March 2007. IOC computers (VME-bus computers) for the MR will be introduced in 2007, as will additional server computers for application development. This paper reports the status of development of the J-PARC MR control system.
TOPA01 | Data Management at JET with a Look Forward to ITER | controls, plasma, diagnostics, power-supply | 74
Since the first JET pulse in 1983, the raw data collected per ~40s plasma discharge (pulse) has roughly followed a Moore's-Law-like doubling every two years. Today we collect up to ~10GB per pulse, and the total data collected over ~70,000 pulses amounts to ~35TB. Enhancements to JET should result in ~60GB per pulse being collected by 2010. An ongoing challenge is to maintain the pulse repetition rate, data access times, and data security. The mass data store provides storage and archiving as well as the data access methods. JET, like most fusion experiments, provides an MDSplus (http://www.mdsplus.org) access layer on top of its own client-server access. Although ITER will also be a pulsed experiment, each discharge will be ~300-5000s in duration. Data storage and analysis must hence be performed in real time. The ITER conceptual design proposes a continuous timeline for access to all project data. The JET mass data store will be described together with the planned upgrades required to cater for the increases in data at the end of 2009. The functional requirements for the ITER mass storage system will be described based on the current status of the ITER conceptual design.
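The growth figures quoted above can be sanity-checked with a back-of-the-envelope extrapolation. The sketch below only applies the stated doubling time to the stated 2007 figure; the results are illustrative projections, not JET records, and the ~60GB forecast for 2010 reflects planned enhancements that outpace the underlying trend.

```python
# Extrapolate raw data per pulse assuming a fixed doubling time, using
# the abstract's figures: ~10GB per pulse in 2007, doubling every 2 years.

def data_per_pulse_gb(year, ref_year=2007, ref_gb=10.0, doubling_years=2.0):
    """Projected raw data per pulse (GB) under a fixed doubling time."""
    return ref_gb * 2.0 ** ((year - ref_year) / doubling_years)

# 24 years back is 12 doublings: the first pulses were a few MB each.
print(data_per_pulse_gb(1983) * 1024)        # 2.5 (MB per pulse, 1983)
print(round(data_per_pulse_gb(2010), 1))     # 28.3 (GB by pure doubling)
```

Pure doubling predicts ~28GB per pulse by 2010, so the quoted ~60GB figure implies the planned enhancements roughly double the trend value.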
TPPA10 | Development of Photon Beamline and Motion Control Software at Diamond Light Source | controls, photon, diagnostics, site | 108
Diamond Light Source has opened its first eight photon beamlines to the user community this year. We have developed the control software for the beamlines in parallel, adopting a common set of standards, tools, and designs across all beamlines. At the core of the control system are the EPICS toolset and the widespread use of the Delta Tau PMAC motion controller. The latter is a complex but flexible controller that has met our needs for both simple and complex systems. We describe how we have developed the standard EPICS software for this controller so that we can use the existing EPICS interfaces while still exploiting the more advanced features of the controller.
TPPA23 | The ACOP Family of Beans: A Framework Independent Approach | controls, target | 138
The current ACOP (Advanced Component Oriented Programming)* controls set has now been expanded to include a wide variety of graphical Java beans that simultaneously act as displayers of control system data. Besides the original ACOP Chart, the set of ACOP beans now includes Label, Slider, Table, Gauge, Wheel, and Image controls, along with an invisible Transport bean, which is itself embedded in the ACOP GUI beans. The new ACOP beans all offer design-time browsing of the control system to expedite data end-point selection. Optionally, a developer can choose to connect and render the incoming data automatically, obviating the need for writing code. The developer can either forgo this option or override the generated code with his own, allowing for rich-client development. At the same time, a user can browse and add or change the control system endpoints at run-time. If the application uses the Component Object Manager (COMA),** then all visual aspects of the application can be edited at run-time, allowing for simple client development. This scenario is framework-independent, and the developer is free to use the IDE of his choice.
* http://acop.desy.de
** "The Run-Time Customization of Java Rich Clients with the COMA Class," P. Bartkiewicz et al., these proceedings.
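The "automatic rendering with optional developer override" pattern the abstract describes can be sketched in a few lines. This is a framework-neutral Python illustration, not the actual ACOP Java beans API; every class and method name here is invented.

```python
# Sketch: a displayer widget fed by an invisible transport object.
# By default it renders incoming values with generated behaviour; a
# developer may override the renderer for rich-client development.

class TransportLink:
    """Stands in for the invisible Transport bean: delivers updates."""
    def __init__(self):
        self._listeners = []
    def subscribe(self, callback):
        self._listeners.append(callback)
    def push(self, value):                 # control system sends a value
        for cb in self._listeners:
            cb(value)

class DisplayLabel:
    def __init__(self, link, renderer=None):
        self.text = ""
        # Default: connect and render automatically, no user code needed.
        self._render = renderer or (lambda v: f"{v:.2f}")
        link.subscribe(self._on_update)
    def _on_update(self, value):
        self.text = self._render(value)

link = TransportLink()
plain = DisplayLabel(link)                                     # generated
rich = DisplayLabel(link, renderer=lambda v: f"{v * 1e3:.0f} mA")  # override
link.push(0.5034)
print(plain.text, "/", rich.text)  # 0.50 / 503 mA
```

Both labels share one subscription mechanism; only the rendering step differs, which is the forgo-or-override choice described above.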
TPPB06 | The MIRI Imager Ground Support Equipment Control System Based on PCs | controls, cryogenics, alignment | 172
The James Webb Space Telescope (JWST) is the successor of Hubble in the infrared. Our division, Dapnia, is in charge of the design and completion of the optomechanical part of the imager, MIRIM, one of the JWST instruments, and of its test bench, the Ground Support Equipment (GSE). The GSE consists of a warm telescope simulator, a model (identical to the flight model) of the imager, a cryostat to cool the imager down to its operating temperature, and an infrared detector (1024x1024 pixels). The telescope simulator comprises several optical components to control (a hexapod, an 8-motor table, etc.). The major part of the hardware architecture for the control of the IR detector and the telescope simulator is based on PCs and COTS boards. This paper describes the software development and its specificities. ESO software (IRACE and BOB) and EPICS are combined to provide the operator interface. The cryostat is controlled by our home-made supervision system for cryogenic systems, based on PLCs, the WorldFIP fieldbus network, and an industrial XPe PC. Tests of the different subsystems have started, and the whole test bench will be operational in summer 2007.
WOAB02 | CAD Model and Visual Assisted Control System for NIF Target Area Positioners | controls, target, alignment, laser | 293
The National Ignition Facility (NIF) contains precision motion control systems that reach up to 6 meters into the target chamber for handling targets and diagnostics. Systems include the target positioner, an alignment sensor, and diagnostic manipulators. Experiments require a variety of arrangements near chamber center to be aligned to an accuracy of 10 micrometers. These devices are some of the largest in NIF, and they require careful monitoring and control in three dimensions to prevent interferences. Alignment techniques such as viewing target markers and cross-chamber telescopes are employed. Positioner alignment is a human-control process incorporating real-time video feedback on the user interface. The system provides efficient, flexible controls while also coordinating all positioner movements. This is accomplished through advanced video-control integration incorporating remote position sensing and real-time analysis of a CAD model of target chamber devices. This talk discusses the control system design, the method used to integrate existing mechanical CAD models, and the offline test laboratory used to verify proper operation of the integrated control system.
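Monitoring positioners in three dimensions to prevent interferences requires, at minimum, a clearance test between devices reaching into the chamber. The sketch below uses spherical safety envelopes around two device tips; the real NIF system checks against full CAD geometry, and the function name, positions, and radii here are all invented for illustration.

```python
import math

def clearance(pos_a, rad_a, pos_b, rad_b):
    """Distance between two spherical envelopes; negative = interference."""
    centre_dist = math.dist(pos_a, pos_b)
    return centre_dist - (rad_a + rad_b)

# Target positioner tip vs. a diagnostic manipulator tip, in millimetres.
target_tip = ((0.0, 0.0, 0.0), 50.0)
diagnostic_tip = ((120.0, 0.0, 0.0), 60.0)

gap = clearance(*target_tip, *diagnostic_tip)
print(gap)        # 10.0 mm of clearance remaining
print(gap < 0.0)  # True would flag a predicted collision
```

In a real system this test would run continuously against positions derived from remote sensing, halting motion before the predicted gap reaches zero.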
WOPA02 | Remote Operations of an Accelerator Using the Grid | controls, storage-ring, feedback, instrumentation | 303
GRIDCC* is a three-year project funded by the European Commission. Its goal is to integrate instruments and sensors with traditional Grid resources. The GRIDCC middleware is being designed with use cases from a very diverse set of applications in mind, and as a result the GRIDCC architecture provides access to instruments in as generic a way as possible. GRIDCC is also developing an adaptable user interface and a mechanism for executing complex workflows in order to increase both the usability and the usefulness of the system. The new middleware is being incorporated into significant applications that will allow the software to be validated in terms of both functionality and quality of service. The pilot application this paper focuses on applies GRIDCC to support remote operations of the ELETTRA synchrotron radiation facility. We describe the results of implementing, via GRIDCC, the complex workflows involved in both routine operations and troubleshooting scenarios. In particular, the implementation of an orbit-correction feedback shows the level of integration of instruments and traditional Grid resources that can be reached using the GRIDCC middleware.
* http://www.gridcc.org
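An orbit-correction feedback like the one mentioned above is, at its core, a loop: read beam positions, compute corrector changes, apply them. The toy below closes such a loop on a single position error with a proportional gain; the GRIDCC version distributes these steps as a workflow across Grid-accessible instruments. The gain, error value, and 1:1 plant response are all invented for illustration.

```python
# Proportional orbit-feedback toy: each iteration removes a fixed
# fraction of the measured orbit error.

def feedback_step(orbit_error, gain=0.5):
    """Corrector change for one iteration (proportional control)."""
    return -gain * orbit_error

error = 2.0  # mm, initial orbit error at one monitor
history = []
for _ in range(5):
    error += feedback_step(error)  # plant response assumed 1:1
    history.append(error)

print(history)  # error shrinks geometrically toward zero
```

With gain 0.5 the residual error halves every iteration, which is the convergence behaviour such a feedback relies on; the middleware's job is to make each read/compute/apply step a remote, Grid-mediated operation.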
ROAA04 | XAL Online Model Enhancements for J-PARC Commissioning and Operation | space-charge, emittance, dipole, controls | 494
The XAL application development environment has been installed as part of the control system for the Japan Proton Accelerator Research Complex (J-PARC) in Tokai, Japan. XAL was initially developed at SNS and has been described at length in previous conference proceedings (e.g., Chu et al., APAC07; Galambos et al., PAC05). We outline the upgrades and enhancements to the XAL online model necessary for accurate simulation of the J-PARC linac. For example, we have added permanent-magnet quadrupoles and additional space-charge capabilities, such as off-centered and rotated beams and bending magnets with space charge. In addition, significant architectural refactoring was performed in order to incorporate the current and past upgrades into a robust framework capable of supporting future control operations. The architecture and design of XAL are as important as its function; as such, we also focus on the revised architecture and how it supports a component-based software engineering approach.
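To give a flavour of what an online model computes for elements like the quadrupoles mentioned above, here is the standard thin-lens transfer-matrix treatment of one transverse plane. This is textbook beam optics in illustrative Python, not XAL code (XAL is Java and uses full thick-element maps); the focal length and drift length are arbitrary.

```python
# 2x2 (x, x') transfer matrices: a drift followed by a thin quadrupole.

def thin_quad(f):
    """Thin quadrupole of focal length f (focusing for f > 0)."""
    return [[1.0, 0.0], [-1.0 / f, 1.0]]

def drift(length):
    return [[1.0, length], [0.0, 1.0]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# Beamline: 1 m drift, then a quad with f = 0.5 m (rightmost acts last).
m = matmul(thin_quad(0.5), drift(1.0))

# A parallel ray entering at x = 1 mm leaves bent toward the axis.
x, xp = 1.0, 0.0
print(m[0][0] * x + m[0][1] * xp, m[1][0] * x + m[1][1] * xp)
```

An online model chains such maps for every element of the lattice; the J-PARC work extends the element catalogue (permanent-magnet quads, bends with space charge) rather than changing this basic scheme.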
ROPA02 | The High Performance Database Archiver for the LHC Experiments | insertion, controls, background, collider | 517
Each of the Large Hadron Collider (LHC) experiments will be controlled by a large distributed system built with the SCADA tool PVSS. There will be about 150 computers and millions of input/output channels per experiment. The values read from the hardware, the alarms generated, and user actions will be archived for physics analysis and for debugging of the control system itself. Although the original PVSS implementation of a database archiver was appropriate for standard industrial use, its performance was not sufficient. A collaboration was set up between CERN and ETM, the company that develops PVSS. Changes in the architecture and several optimizations were made and tested on a system of comparable size to the final ones. As a result, we have been able to improve the performance by more than one order of magnitude and, more importantly, we now have a scalable architecture based on Oracle clustering technology (Real Application Clusters, or RAC). This architecture can meet the requirements for insertion rate, data querying, and manageability of the high volume of data (e.g., an insertion rate of >150,000 changes/s was achieved with a 6-node RAC cluster).
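The quoted cluster figure is easier to appreciate per node, and one generic optimization behind such rates is batching many value changes into a single bulk insert instead of one round-trip per change. The sketch below only illustrates that arithmetic; it is not the CERN/ETM implementation, and the batch size is an invented example value.

```python
# Per-node load implied by the quoted figures, and the effect of
# batching on the number of database round trips required.

def round_trips_per_s(changes_per_s, batch_size):
    """Round trips per second needed to sustain a given change rate."""
    return changes_per_s / batch_size

per_node = 150_000 / 6           # cluster rate spread over 6 RAC nodes
print(per_node)                  # 25000.0 changes/s per node

print(round_trips_per_s(per_node, 1))    # unbatched: 25000.0 trips/s
print(round_trips_per_s(per_node, 500))  # batch of 500: 50.0 trips/s
```

At 500 changes per bulk insert, each node needs only tens of round trips per second, which is the kind of reduction that makes the quoted rate plausible on commodity hardware.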
RPPB20 | A Graphical Sequencer for SOLEIL Beamline Acquisitions | controls, synchrotron, alignment, site | 647
Addressing batch-processing and sequencing needs is fundamental to daily beamline operation. The SOLEIL control software group offers two solutions. First, the Python scripting environment, for which a dedicated Tango binding is available, has proved powerful but is limited to scientists with good programming skills. Second, we provide the PASSERELLE software, developed by the ISENCIA* company and based on the PTOLEMY** framework. In this environment, sequences can be designed graphically by dragging and dropping components called actors (representing elementary tasks). The process execution can easily be programmed by graphically defining the data flow between the actors. On top of this framework, an existing generic GUI application allows users to configure and execute sequences. A dedicated GUI application can also be provided on demand to give beamline end users an easy-to-use acquisition application. The work organization, the software architecture, and the design of the whole system will be presented, as well as the current status of deployment on the SOLEIL beamlines.
* http://www.isencia.com/main/web/init
** http://ptolemy.eecs.berkeley.edu/ptolemyII/index.htm
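The actor/data-flow idea behind PASSERELLE can be illustrated with a toy in Python (fittingly, the other SOLEIL sequencing option). Each actor wraps one elementary task, and a sequence is the graph through which data flows. The actor names and tasks below are invented; the real tool builds these graphs graphically on top of PTOLEMY rather than in code.

```python
# Toy actor sequence: data flows through a linear chain of elementary
# tasks, mirroring a graphically drawn PASSERELLE sequence.

class Actor:
    def __init__(self, name, task):
        self.name, self.task = name, task

def run_sequence(actors, value):
    """Push a value through a chain of actors, in drawn order."""
    for actor in actors:
        value = actor.task(value)
    return value

sequence = [
    Actor("MoveMotor", lambda pos: pos + 5.0),        # e.g. move by 5 mm
    Actor("AcquireImage", lambda pos: {"pos": pos}),  # fake acquisition
    Actor("SaveResult", lambda data: data["pos"]),    # fake persistence
]
print(run_sequence(sequence, 10.0))  # 15.0
```

Reordering or swapping actors changes the acquisition sequence without touching any actor's internals, which is what makes the drag-and-drop approach accessible to non-programmers.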