Methodology for the integration of high-throughput data and image acquisition systems in EPICS

Author:
  1. Esquembri Martínez, Sergio

Supervised by:
  1. Eduardo Barrera López de Turiso (Supervisor)
  2. Mariano Ruiz González (Co-supervisor)

Defended at: Universidad Politécnica de Madrid

Date of defense: 11 October 2017

Examination committee:
  1. Eduardo Juárez Martínez (Chair)
  2. Juan Manuel López Navarro (Secretary)
  3. Josu Jugo García (Member)
  4. Daniel Mozos Muñoz (Member)
  5. Adriano R. Lucheta (Member)

Type: Thesis

Abstract

Data acquisition and processing systems are essential parts of today's world. The purpose of these systems is to take measurements of real-world quantities in such a way that they can be processed to obtain information. The scale of these systems varies enormously, from the smallest examples found in portable and wearable technology (such as smartphones and tablets) to large systems based on industrial communication buses that acquire and process several gigabytes of information per second.

Special cases of these systems are those used in big physics experiments. The nature and purpose of these experiments vary considerably from one to another, but all of them need to extract as much information as possible, and with the utmost precision, from the physical phenomena under study. This leads to a large amount of data to be processed and archived. Moreover, some of the acquired or processed data may be needed for the control of the experiment, so they must be computed in real time and with the lowest possible latency. Additionally, the reliability and availability of these systems must be guaranteed for the correct operation of the experiment, especially for its control and safety systems. Another common characteristic of big physics experiments is their organization as a Supervisory Control And Data Acquisition (SCADA) system. The size and complexity of these experiments make such an organization necessary, as it allows the functionality to be divided among different subsystems that communicate with each other to coordinate and synchronize the actions each of them takes.

This doctoral dissertation proposes a generic model as a reference for real-time data acquisition and processing solutions in big physics experiments, especially those used in magnetic confinement fusion experiments. The proposed model addresses the common requirements of this kind of system. It is designed around the technologies present in current experiments: reconfigurable logic devices based on Field Programmable Gate Arrays (FPGAs), Graphics Processing Units (GPUs), Central Processing Units (CPUs), and architectures based on the industrial version of Peripheral Component Interconnect Express (PCIe) known as PCI eXtensions for Instrumentation Express (PXIe). The SCADA system used for the integration of the data acquisition systems is the Experimental Physics and Industrial Control System (EPICS), chosen because it is one of the most widely used distributed control systems in big physics experiments.

However, some problems derive from the use of these technologies. Integrating the systems in EPICS requires the development of an intermediate application that interfaces between the hardware devices and EPICS (a minimal sketch of such a layer is shown below). For systems involving FPGA-based devices, the development of these applications becomes more challenging: FPGA devices can be configured in many different ways, and developing one application for each possible configuration would increase the maintenance cost of the experiment. It is also necessary to consider that the application must integrate devices as different from each other as FPGAs, GPUs, and CPUs, and must do so in such a way that they can collaborate in processing the data.
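To make the role of this intermediate layer concrete, the following is a minimal sketch of such an interface built on the asynPortDriver C++ class from the EPICS asyn module, which is commonly used for this kind of device integration. It exposes a single FPGA register as an asyn parameter that EPICS records can bind to. The class name FpgaDaq, the RAW_COUNT parameter, and readFpgaRegister() are illustrative assumptions rather than the thesis' actual tools, and the constructor arguments assume a recent asyn release.

    #include <asynPortDriver.h>

    // drvInfo string that EPICS record links will reference
    static const char *P_RawCountString = "RAW_COUNT";

    class FpgaDaq : public asynPortDriver {
    public:
        explicit FpgaDaq(const char *portName)
            : asynPortDriver(portName,
                             1,                               // maxAddr: single sub-address
                             asynInt32Mask | asynDrvUserMask, // interfaces implemented
                             asynInt32Mask,                   // interfaces with callbacks
                             0,                               // asynFlags
                             1,                               // autoConnect
                             0, 0)                            // default priority and stack size
        {
            createParam(P_RawCountString, asynParamInt32, &P_RawCount);
        }

        // Called when an EPICS input record linked to this port processes.
        asynStatus readInt32(asynUser *pasynUser, epicsInt32 *value) override
        {
            if (pasynUser->reason == P_RawCount) {
                *value = readFpgaRegister(); // hypothetical hardware access
                return asynSuccess;
            }
            return asynPortDriver::readInt32(pasynUser, value);
        }

    private:
        int P_RawCount;
        epicsInt32 readFpgaRegister() { return 0; /* placeholder for real FPGA read */ }
    };

A longin record with DTYP set to "asynInt32" and INP set to "@asyn(FPGA0,0)RAW_COUNT" would then publish the register as an EPICS process variable, making the FPGA data visible to the rest of the SCADA system.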
Just as important, the systems developed must be easily customizable, as they will play very different roles in the experiment and it will be necessary to add custom functionality, such as specific data processing or control actions taken by the acquisition device (a sketch of one way to keep a single application reusable across such variations follows the topic list below). In response to the problems described, this thesis focuses on the development of a methodology for the implementation of high-throughput data acquisition and processing systems and their integration in EPICS. The methodology takes the proposed system model as a reference and addresses the issues that the architecture presents, seeking to ease the implementation of these systems. The proposed methodology is based on a set of software tools specifically designed for this purpose. The developed tools are publicly available to the scientific community under an open-source software license.

Given the above, the main topics of this thesis are the following:

• Study of the requirements of data acquisition and processing systems for big physics experiments, as an example of an application where high throughput is required.
• Analysis of the hardware and software architecture of a data acquisition and processing system based on the technologies mentioned previously.
• Definition and specification of the methodology, and of the related development cycle, for the implementation of those systems and their integration in EPICS.
• Description of the products developed in support of the methodology and to ease the use of the different technologies involved in the system.
• Evaluation and validation of the proposed methodology, including actual use cases where the methodology is being applied.
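As a hedged sketch of the customizability and maintainability point above, in the same asynPortDriver style as the previous example, a configuration-driven driver can stay reusable across FPGA designs by creating its parameters at run time from a description of the registers published by the loaded design. RegisterInfo, loadRegisterMap(), and the register-map format are illustrative assumptions, not the actual tools developed in the thesis.

    #include <asynPortDriver.h>
    #include <string>
    #include <vector>

    struct RegisterInfo {
        std::string name;   // drvInfo string that EPICS records will reference
        asynParamType type; // asynParamInt32, asynParamFloat64, ...
    };

    // Placeholder: a real implementation would parse metadata generated
    // alongside the FPGA bitfile (for example, an XML register map).
    static std::vector<RegisterInfo> loadRegisterMap(const char * /*path*/)
    {
        return {
            {"RAW_COUNT",   asynParamInt32},
            {"SAMPLE_RATE", asynParamFloat64},
        };
    }

    class GenericFpgaDriver : public asynPortDriver {
    public:
        GenericFpgaDriver(const char *portName, const char *mapPath)
            : asynPortDriver(portName, 1,
                             asynInt32Mask | asynFloat64Mask | asynDrvUserMask,
                             asynInt32Mask | asynFloat64Mask,
                             0 /*asynFlags*/, 1 /*autoConnect*/,
                             0 /*priority*/, 0 /*stackSize*/)
        {
            // The parameter list is data, not code, so one compiled driver
            // serves every FPGA configuration described by a register map.
            for (const RegisterInfo &reg : loadRegisterMap(mapPath)) {
                int index;
                createParam(reg.name.c_str(), reg.type, &index);
            }
        }
    };

Under an approach like this, supporting a new FPGA design means shipping a new register map together with the bitfile, rather than writing and maintaining a separate EPICS application for each configuration.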