Keyword: data-analysis
Paper Title Other Keywords Page
TUPPC015 On-line and Off-line Data Analysis System for SACLA Experiments experiment, detector, laser, data-acquisition 580
 
  • T. Sugimoto, Y. Furukawa, Y. Joti, T.K. Kameshima, K. Okada, R. Tanaka, M. Yamaga
    JASRI/SPring-8, Hyogo-ken, Japan
  • T. Abe
    RIKEN SPring-8 Center, Innovative Light Sources Division, Hyogo, Japan
 
  The X-ray Free-Electron Laser facility SACLA has delivered X-ray laser beams to users since March 2012 [1]. Typical user experiments utilize two-dimensional imaging sensors, which generate 10 MBytes per accelerator beam shot. At the 60 Hz beam repetition rate, experimental data are accumulated at 600 MBytes/second by a dedicated data-acquisition (DAQ) system [2]. To analyze such a large amount of data, we developed a data-analysis system for SACLA experiments. The system consists of on-line and off-line sections. The on-line section performs on-the-fly filtering using data-handling servers, which examine data quality and record the results in a database on an event-by-event basis. By referring to the database, we can select good events before performing off-line analysis. The off-line section performs precise analysis on a high-performance computing system, such as physical image reconstruction and rough three-dimensional structure analysis of the data samples. For large-scale image reconstructions, we also plan to use an external supercomputer. In this paper, we present an overview and the future plan of the SACLA analysis system.
[1] T. Ishikawa et al., Nature Photonics 6, 540-544 (2012).
[2] M. Yamaga et al., ICALEPCS 2011, TUCAUST06, 2011.
 
Poster TUPPC015 [10.437 MB]
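The on-line filtering described in TUPPC015 — scoring each shot and recording a per-event quality flag in a database, so that only good events reach off-line analysis — can be sketched as follows. The schema, quality metric and threshold are illustrative assumptions, not the actual SACLA design:

```python
import sqlite3

# Hypothetical per-shot quality metric and threshold (illustrative only).
QUALITY_THRESHOLD = 100.0

def record_event(db, run, tag, image):
    """Compute a simple quality figure for one shot and store it per event (tag)."""
    quality = sum(image) / len(image)  # placeholder metric: mean pixel intensity
    db.execute(
        "INSERT INTO events (run, tag, quality, good) VALUES (?, ?, ?, ?)",
        (run, tag, quality, int(quality >= QUALITY_THRESHOLD)),
    )

def good_tags(db, run):
    """Select good events from the database before off-line analysis."""
    rows = db.execute(
        "SELECT tag FROM events WHERE run = ? AND good = 1 ORDER BY tag", (run,)
    )
    return [r[0] for r in rows]

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE events (run INTEGER, tag INTEGER, quality REAL, good INTEGER)")
record_event(db, 1, 100, [120.0, 130.0])  # bright shot: passes the filter
record_event(db, 1, 101, [10.0, 20.0])    # dark shot: rejected
print(good_tags(db, 1))  # -> [100]
```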
 
TUPPC072 Flexible Data Driven Experimental Data Analysis at the National Ignition Facility diagnostics, software, target, framework 747
 
  • A.D. Casey, R.C. Bettenhausen, E.J. Bond, R.N. Fallejo, M.S. Hutton, J.A. Liebman, A.A. Marsh, T. M. Pannell, S.M. Reisdorf, A.L. Warrick
    LLNL, Livermore, California, USA
 
  Funding: This work was performed under the auspices of the Lawrence Livermore National Security, LLC, (LLNS) under Contract No. DE-AC52-07NA27344. #LLNL-ABS-632532
After each target shot at the National Ignition Facility (NIF), scientists require data analysis within 30 minutes from ~50 diagnostic instrument systems. To meet this goal, NIF engineers created the Shot Data Analysis (SDA) Engine based on the Oracle Business Process Execution Language (BPEL) platform. While this provided a very powerful and flexible analysis product, it still required engineers conversant in software development practices to create the configurations executed by the SDA Engine. As more and more diagnostics were developed and the demand for analysis increased, the development staff could not keep pace. To solve this problem, the Data Systems team created a database-table-based scripting language that allows users to define an analysis configuration of inputs, feed the data into standard processing algorithms, and store the outputs in a database. The creation of the Data Driven Engine (DDE) has substantially decreased the development time for new analyses and simplified maintenance of existing configurations. The architecture and functionality of the Data Driven Engine will be presented along with examples.
 
Poster TUPPC072 [1.150 MB]
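The idea behind the DDE — analysis configurations stored as table rows and executed by one generic engine, rather than hand-coded BPEL flows — can be sketched roughly as below. The table layout, algorithm names and signal names are hypothetical, not NIF's actual schema:

```python
# Generic engine: a small library of standard algorithms, dispatched by name.
ALGORITHMS = {
    "mean": lambda xs: sum(xs) / len(xs),
    "peak": max,
}

# Each "database row": (config name, input signal, algorithm, output name).
CONFIG_TABLE = [
    ("neutron_yield", "detector_a", "mean", "yield_avg"),
    ("neutron_yield", "detector_a", "peak", "yield_peak"),
]

def run_analysis(config_name, inputs, table=CONFIG_TABLE):
    """Execute every row of a configuration and collect its named outputs."""
    results = {}
    for name, signal, algo, out in table:
        if name == config_name:
            results[out] = ALGORITHMS[algo](inputs[signal])
    return results

print(run_analysis("neutron_yield", {"detector_a": [1.0, 3.0, 2.0]}))
# -> {'yield_avg': 2.0, 'yield_peak': 3.0}
```

Adding a new analysis then means inserting rows, not writing and deploying code, which is the maintenance win the abstract claims.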
 
TUPPC117 Unifying Data Diversity and Conversion to Common Engineering Analysis Tools software, superconducting-magnet, status, factory 852
 
  • H. Reymond, O.O. Andreassen, C. Charrondière, M.F. Gomez De La Cruz, A. Rijllart
    CERN, Geneva, Switzerland
 
  The large variety of systems for the measurement of insulation, conductivity, RRR, quench performance, etc. installed at CERN’s superconducting magnet test facility generates a diversity of data formats. This mixture causes problems when the measurements need to be correlated. Each measurement application has a dedicated data-analysis tool used to validate its results, but there is no generic bridge between the applications that facilitates cross-analysis of mixed data and data types. Since the LHC start-up, the superconducting magnet test facility has hosted new R&D measurements on a multitude of superconducting components. These results are analysed by international collaborators, which has triggered a greater need to access the raw data from many typical engineering and analysis tools, such as MATLAB®, Mathcad®, DIAdem™ and Excel™. This paper describes the technical solutions developed to unify the data formats and reviews the present status.
Poster TUPPC117 [11.140 MB]
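One common way to build the kind of format-unification bridge TUPPC117 describes is a per-format reader layer that maps every source file into a single intermediate structure (column names plus numeric rows) which any analysis tool can then import. The file formats and schema below are stand-ins for the facility's actual ones:

```python
import csv, io, json

def read_csv(text):
    """CSV with a header row -> (columns, numeric rows)."""
    rows = list(csv.reader(io.StringIO(text)))
    return rows[0], [[float(v) for v in r] for r in rows[1:]]

def read_json(text):
    """JSON with explicit 'columns'/'rows' keys -> the same structure."""
    data = json.loads(text)
    return data["columns"], data["rows"]

# One reader per source format; all emit the same intermediate form.
READERS = {".csv": read_csv, ".json": read_json}

def load(name, text):
    """Dispatch on file extension and return the unified representation."""
    ext = name[name.rfind("."):]
    return READERS[ext](text)

cols, rows = load("quench.csv", "t,I\n0,100\n1,250")
print(cols, rows)  # -> ['t', 'I'] [[0.0, 100.0], [1.0, 250.0]]
```

Cross-analysis of mixed data then works against the unified form only, and each new instrument format costs one new reader.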
 
WECOBA01 Algebraic Reconstruction of Ultrafast Tomography Images at the Large Scale Data Facility framework, distributed, synchrotron, radiation 996
 
  • X. Yang, T. Jejkal, H. Pasic, R. Stotzka, A. Streit, T. dos Santos Rolo, T. van de Kamp
    KIT, Eggenstein-Leopoldshafen, Germany
 
  Funding: Karlsruhe Institute of Technology, Institute for Data Processing and Electronics; China Scholarship Council
The ultrafast tomography system built at the ANKA Synchrotron Light Source at KIT makes possible the study of moving biological objects with high temporal and spatial resolution. The resulting amounts of data are challenging in terms of reconstruction algorithms, automatic processing software and computing. The standard reconstruction method yields reconstructed images of limited quality because of the much smaller number of projections obtained from the ultrafast tomography. Thus an algebraic reconstruction technique based on a more precise forward-transform model and compressive sampling theory is investigated. It produces high-quality images, but is computationally very intensive. For near real-time reconstruction, an automatic workflow is started after data ingest, processing a full data volume in parallel on the Hadoop cluster at the Large Scale Data Facility (LSDF) to greatly reduce the computing time. This will provide users not only better reconstruction results but also higher data-analysis efficiency. This study contributes to the construction of the fast tomography system at ANKA and will enhance its application in the fields of chemistry, biology and new materials.
 
Slides WECOBA01 [1.595 MB]
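The algebraic reconstruction technique named in WECOBA01 is, at its core, a Kaczmarz-style iteration: the current image estimate is repeatedly projected onto the hyperplane defined by each measured ray sum. A toy version on a two-unknown system (the real system uses a far more precise forward model and compressive-sampling regularisation) looks like this:

```python
def art(A, b, sweeps=50):
    """Kaczmarz/ART: for each row a_i, project x onto {x : a_i . x = b_i}."""
    x = [0.0] * len(A[0])
    for _ in range(sweeps):
        for a_i, b_i in zip(A, b):
            dot = sum(aj * xj for aj, xj in zip(a_i, x))
            scale = (b_i - dot) / sum(aj * aj for aj in a_i)
            x = [xj + scale * aj for aj, xj in zip(a_i, x)]
    return x

# Toy "projection" system: two rays through a 2-pixel object with
# absorptions x = [1, 2]; ray 1 crosses pixel 1 only, ray 2 crosses both.
x = art([[1.0, 0.0], [1.0, 1.0]], [1.0, 3.0])
print([round(v, 6) for v in x])  # -> [1.0, 2.0]
```

With few projections this family of iterative methods degrades more gracefully than filtered back-projection, which is the motivation the abstract gives, at the cost of the heavy computation the Hadoop workflow absorbs.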
 
WECOBA07 High Speed Detectors: Problems and Solutions detector, network, operation, software 1016
 
  • N.P. Rees, M. Basham, J. Ferner, U.K. Pedersen, T.S. Richter, J.A. Thompson
    Diamond, Oxfordshire, United Kingdom
 
  Diamond has an increasing number of high-speed detectors, primarily used on Macromolecular Crystallography, Small-Angle X-ray Scattering and Tomography beamlines. Recently, the performance requirements have exceeded the throughput available from a single-threaded writing process on our Lustre parallel file system, so we have had to investigate other file systems and ways of parallelising the data flow to mitigate this. We report on some comparative tests between Lustre and GPFS, and on work we have been leading to enhance the HDF5 library with features that simplify the parallel writing problem.
Slides WECOBA07 [0.617 MB]
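The parallel-writing problem WECOBA07 addresses — several writers feeding one file without contention — is commonly solved by giving each writer a disjoint, preallocated byte range, which is essentially how chunked parallel writes to an HDF5 dataset avoid locking. A file-format-agnostic sketch of that pattern (frame layout and sizes are invented for illustration) is:

```python
import os, struct, tempfile
from concurrent.futures import ThreadPoolExecutor

FRAME_DOUBLES = 4                # values per detector frame (invented)
FRAME_BYTES = FRAME_DOUBLES * 8  # 8 bytes per IEEE double

def write_frame(path, index, values):
    """Each writer seeks to its own disjoint region, so no locking is needed."""
    with open(path, "r+b") as f:
        f.seek(index * FRAME_BYTES)
        f.write(struct.pack(f"{FRAME_DOUBLES}d", *values))

def parallel_write(path, frames):
    with open(path, "wb") as f:
        f.truncate(len(frames) * FRAME_BYTES)  # preallocate the full extent
    with ThreadPoolExecutor(max_workers=4) as pool:
        for i, frame in enumerate(frames):
            pool.submit(write_frame, path, i, frame)

path = os.path.join(tempfile.mkdtemp(), "frames.bin")
frames = [[float(i)] * FRAME_DOUBLES for i in range(4)]
parallel_write(path, frames)
with open(path, "rb") as f:
    data = struct.unpack(f"{4 * FRAME_DOUBLES}d", f.read())
print(data[:FRAME_DOUBLES])  # -> (0.0, 0.0, 0.0, 0.0)
```

Preallocating the extent up front is what makes the regions independent; without it, concurrent appends would have to serialise on the file length.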