Camera Control System and Data Flow
Internal review, Oct 14, 2008
Stuart Marshall, Mike Huffer*, Terry Schalk, Jon Thaler


Internal review Oct 14 08, slide 2: Outline
- CCS Definition
- CCS Scope
- Control Architecture
- Data Flow Architecture

Internal review Oct 14 08, slide 3: Camera Control System
The CCS maintains the state of the camera as a whole, and thus is able to orchestrate the sequence of operations that enables data collection. It receives status information from the camera subsystems and is able to detect and report faults. (This does not include safety-related error detection, which is the responsibility of the individual subsystems.) It provides the camera interface to the observatory control system (OCS), responding to commands from and reporting status information to the OCS; it is the interface between the human operators and the instrument. It can respond to reconfiguration commands, ensuring orderly transitions between different modes of camera operation. It provides the hardware and software necessary to receive the data streams generated by the camera and transmit them to downstream clients, such as the Data Management system (DM).

Internal review Oct 14 08, slide 4: Camera Systems Locations
[Facility diagram: the suite of camera maintenance rooms on the east side of the service & ops building (test/staging, mirror coating, data, control, office, clean room, white room, camera utilities) and the approximate routing of utility runs (flex vs. hard) from the camera on the telescope: cable wrap, platform lift, shaft, telescope pier, entry mechanism. Credit: Nordby.]

Internal review Oct 14 08, slide 5: CCS Baseline Architecture
[Architecture diagram: an intelligent CCS network carries messages {cmd, telemetry, alarms} between subsystem slaves (S) and masters (M), with mediation; the OCS connects via telemetry logging and alarms. A common framework (library of classes, documentation, Makefiles, code management) gives the engineer and developer the same GUI tools and user scripting at the console.]

Internal review Oct 14 08, slide 6: Subsystem managers
Every arrow has an interface at each end. Red means it is a CCS group responsibility. Subsystems that do not produce data (only status info):
[Diagram: command/response paths between the OCS, the MCM, and subsystem managers (e.g., FCS); data paths from the SAS/SDS to DM for subsystems that produce data. Similar structure for WFS/WDS and GSS/GAS (see next slide), and likewise for TSS, RAS, SCU, VCS, and L2U. The diagram maps subsystems to their managers.]

Internal review Oct 14 08, slide 7: Camera Control Architecture
[Block diagram: the CCS sits on the observatory buses (command and status, auxiliary systems, control room) and on the camera buses, which serve the camera body subsystems — thermal (T5U), science DAQ (SDS), guide analysis (GAS), WF DAQ (WDS), TCM, thermal (T3U, T4U), shutter (SCU), filters (FCS), vacuum (VCS), power/signal (PSU) — and the cryostat subsystems — thermal (T1U, T2U), FP actuation (FPU), guide array (GSS), wave front (WFS), science array (SAS), raft alignment (RAS).]

Internal review Oct 14 08, slide 8: CCS architecture
The CCS architecture is described in DocuShare (Document-4083). In a nutshell, the CCS consists of a Master Control Module (MCM) and a collection of devices. This is very similar to the TCS software, which will hopefully simplify the system and lessen the maintenance burden. Ground rules / requirements:
* The control system is built on nodes in a TCP/IP network (Ethernet?).
* Each device (e.g., the shutter) must be capable of supporting an API to the CCS network. The CCS will provide:
  - Device APIs that support a limited number of language bindings.
  - MCM adaptors to the device controllers that enable device diagnostics using the developer's language binding.
* All devices must be individually addressable.

Internal review Oct 14 08, slide 9: CCS architecture II
* One device per physical and logical connection. Multiple connections means multiple devices.
* Devices must always be listening (no dead time).
* Each device must have a watchdog timer and a well-defined response to communications loss.
* Every device must implement at least German's minimal state machine model (CTIO talk): [Offline] [Active] [Idle] [Error] (not all transitions shown here).
* There will be a standardized packet format, accommodating:
  - Commands (ack/nak)
  - Responses
  - Asynchronous publication (status, config params, ...)
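As a concrete illustration of the minimal state machine requirement above, here is a small Python sketch. The event names and the transition set are assumptions (the slide itself notes that not all transitions are shown), not the reviewed design.

```python
from enum import Enum, auto

class State(Enum):
    OFFLINE = auto()
    IDLE = auto()
    ACTIVE = auto()
    ERROR = auto()

# Illustrative transition table — an assumed subset, since the slide
# does not show every transition of the minimal model.
TRANSITIONS = {
    (State.OFFLINE, "connect"): State.IDLE,
    (State.IDLE, "start"): State.ACTIVE,
    (State.ACTIVE, "stop"): State.IDLE,
    (State.ACTIVE, "fault"): State.ERROR,
    (State.IDLE, "fault"): State.ERROR,
    (State.ERROR, "reset"): State.IDLE,
    (State.IDLE, "disconnect"): State.OFFLINE,
}

class Device:
    """A CCS device obeying the minimal state machine."""
    def __init__(self):
        self.state = State.OFFLINE

    def handle(self, event: str) -> State:
        try:
            self.state = TRANSITIONS[(self.state, event)]
        except KeyError:
            # In this sketch, an undefined transition is treated as a fault.
            self.state = State.ERROR
        return self.state
```

A device implemented this way always has a well-defined answer to "what state are you in?", which is what lets the MCM reason about the camera as a whole.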
Internal review Oct 14 08, slide 10: CCS architecture III
* Devices must know the time. The MCM is the CCS time server.
* Except for primary data flow, only the MCM has an external interface:
  - The MCM must interface with the OCS/FDB.
  - The human interface to the CCS is through the MCM.
  - GUI implementations must support scripting.
  - The CCS control API must be able to support the same GUI that the OCS/TCS operators use.
* Long-term maintenance: Collaborators will develop, deliver, and debug their software. Long term (e.g., for 10 years), the software must be widely maintainable by people who did not participate in the original code writing.

Internal review Oct 14 08, slide 11: CCS architecture IV
Implementation issues:
* Implementation decisions must take into consideration possible incompatibilities and the desire for homogeneity:
  - Hardware platforms.
  - Operating systems.
  - Programming languages.
* Code repository = Subversion.
* Bug tracking and wiki.
* Code reuse.
* Test stand needs:
  - The CCS needs to control various environmental quantities that the OCS will handle on the mountain.
  - We will need mock-up OCS, FDB, etc.
  - This year's test stand software needs to work, but is not necessarily the final implementation.
* How many CCS's will we really need on the mountain (e.g., clean room)? How capable must they be?
* How will we implement alarms? We should have a single person responsible for this.
* How will we implement displays? How many different types?

Internal review Oct 14 08, slide 12: Data Flow: Science DAQ System (SDS)
The SDS processes, buffers & publishes science data to both telescope and DM systems:
- Science array (21 center rafts): ~3.2 Gpixels read out in 1-2 seconds
- Wave-Front System (WFS) (4 corner rafts)
- Guider (4 corner rafts)
- Camera Control System
The camera wire protocol is optimized for small footprint, low latency, and robust error correction. An equally important requirement is to support camera development & commissioning. This means support for test-stand activity. Why is this support necessary?
Reuse allows early field testing and preserves, across development, commissioning & operation:
- the camera teams' software investment
- camera operational experience (CCS)
- a commodity wire protocol (Ethernet)

Internal review Oct 14 08, slide 13: Irreducible functions
- Convert the read-out representation to a physical representation
- Manage and perform electronics cross-talk correction for the transient analysis
- Mount the server side of the client image interface (flow control)
- Stream data to DM
- Mount the CCS interface
- Manage and perform image simulation
- Support in-situ monitoring & diagnostics
- Provide image buffering, used for the 3-day store

Internal review Oct 14 08, slide 14: SDS Summary
The architecture is now relatively mature. Functional prototypes are in hand. Performance and scaling have been demonstrated to meet the performance requirements. Focus has now shifted to the definition & development of the client interface. However: a significant amount of interface work remains between the camera electronics and the input side of the SDS. The specification (& implementation!)
of necessary diagnostics is a work in progress. This is a camera-centric activity; however, the observatory could well add to these requirements. There is more than one SDS:
- Support for test stands
- WFS & guider activities
- Full camera commissioning & operation (Building 33, mountain-top)
The SDS design is intended to encompass and support all these activities.

Internal review Oct 14 08, slide 15: The SDS & its data interfaces
[Diagram: the SDS receives the WFS array, science array, and guider array, and feeds the camera-to-telescope interface, the WFS data processing system, the to-telescope guider data processing system, the image data processing system, and the data management system.]

Internal review Oct 14 08, slide 16: The test-stand crate & its interfaces
[Diagram: sensor/raft/electronics etc. feed the SDS crate over 1-8 channels at 1-2 Gbit (copper); the crate connects over multi-Gbit fiber to a router/switch, a Linux host, and a test-jig for control, analysis & long-term archival.]

Internal review Oct 14 08, slide 17: Using the development board
[Diagram: FPGA (FX020) with USB, 1 GE, fiber/transceiver, and general-purpose I/O (LVDS).]

Internal review Oct 14 08, slide 18: Interface ICDs actively being developed
We have had several workshops to develop a consistent set of expectations across the connected systems:
- TCS, in Chile in 2007
- Electronics, in Boston and at SLAC in 2007 & 2008
- DM and OCS, in Tucson in 2008
- French group (filter subsystem)
Coming up: DM two-day workshop (Nov); TCS workshop (Nov-Dec).

Internal review Oct 14 08, slide 19: SDS - computer room to telescope
[Network diagram: science image flow on 4 x 10 Gb fibers from the telescope to a Cisco Layer 3 switch (1.4 Tb backbone) in the mountain computer room; a 48-port 100/1000 baseT blade serves the staging area, clean room, etc.; 10 GigE lines to the base replicate control rooms, TCS/OCS, and all other services, video, phones, etc.; image or part image goes to the central mountain router; a second image path is still an open question ("????").]

Internal review Oct 14 08, slide 20: Backup Slides

Internal review Oct 14 08, slide 21: Defining Subsystems
Master Control Module
The MCM controls all other CCS subsystems via messages passed over TCP/IP networking on a private camera net.
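A minimal sketch of the kind of message passing just described — MCM-to-subsystem traffic carrying the {cmd, telemetry, alarms} classification from the baseline-architecture slide, plus responses for ack/nak. All class and field names here are illustrative assumptions, not the project's actual wire format.

```python
import json
import time
from dataclasses import dataclass, field, asdict
from enum import Enum

class Kind(str, Enum):
    CMD = "cmd"            # MCM -> device, expects ack/nak
    RESPONSE = "response"  # device -> MCM (ack/nak, query results)
    TELEMETRY = "telemetry"
    ALARM = "alarm"

@dataclass
class Packet:
    """Hypothetical standardized packet for the private camera net."""
    kind: Kind
    source: str   # originating node, e.g. "MCM" or a subsystem name
    body: dict    # key/value payload
    stamp: float = field(default_factory=time.time)

    def encode(self) -> bytes:
        # JSON chosen purely for readability in this sketch.
        return json.dumps(asdict(self)).encode()

    @staticmethod
    def decode(raw: bytes) -> "Packet":
        d = json.loads(raw)
        return Packet(Kind(d["kind"]), d["source"], d["body"], d["stamp"])

# A command from the MCM to a hypothetical filter-changer target,
# round-tripped through the encoding.
cmd = Packet(Kind.CMD, "MCM", {"target": "FCS", "op": "select_filter", "arg": "r"})
rt = Packet.decode(cmd.encode())
```

One envelope for all four message kinds keeps logging, alarms, and mediation uniform across subsystems, which is the point of a standardized packet format.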
The MCM is the only "device" visible to the OCS and represents the camera as a whole.
Power control - utility room
This subsystem enables, monitors, and sequences the power sources in the ground utility room. It includes switches, relays, and logic to provide appropriate sequencing and interlocks between power sources. In particular, we might not power the heat exchanger if the insulating vacuum system is not functioning.
Power control - camera
This subsystem enables, monitors, and sequences the power sources in the utility trunk volume. It includes switches, relays, and logic to provide appropriate sequencing and interlocks between power sources.
Filter Changer (FCS)
The filter changer is a self-contained set of 5 filters, electromechanical motion devices, and a controller.

Internal review Oct 14 08, slide 22: Defining Subsystems (contd)
Shutter
Assuming a two-blade shutter instrumented for position-vs.-time measurements.
Focal Plane Thermal Control
Controls the temperature of the focal plane assembly. It must monitor temperatures and heater settings in each of the 25 rafts. Inputs are temperatures reported by the science array via the TCM (or possibly the SDS) and output is to the RCMs via the TCM (or again possibly the SDS).
Cryogen control (camera)
Cryogen is delivered from the utility room to the valve box in the utility trunk. Actuated valves (at least 4) control the flow of cryogen to the cold plate (BEE heat sink) and the cryoplate (CCD/FEE heat sink). This system controls and monitors this flow. Sensors for air pressure (for the actuated valves) and nitrogen gas purge may be needed. Temperature sensors in the lines, cold plates, and valve boxes are also likely. The output temperatures from this system would be visible to the cryogen delivery system on the ground and could be used to bring the CCD temperature control system within the limits of the focal plane heaters.
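The power-sequencing interlock described above (do not power the heat exchanger unless the insulating vacuum system is functioning) can be sketched as a simple permit check. The trip pressure and the device names are illustrative assumptions, not values from the review.

```python
# Assumed trip level for the insulating vacuum — a placeholder, not a
# requirement from the review.
INSULATING_VACUUM_TRIP_TORR = 1e-3

def heat_exchanger_permit(vacuum_system_ok: bool, pressure_torr: float) -> bool:
    """Permit heat-exchanger power only when the insulating vacuum
    system reports healthy AND the measured pressure is below trip."""
    return vacuum_system_ok and pressure_torr < INSULATING_VACUUM_TRIP_TORR

def power_up_sequence(order, permits):
    """Enable power sources in order, stopping at the first source
    that lacks a permit (so downstream loads never come up alone)."""
    enabled = []
    for source in order:
        if not permits.get(source, False):
            break
        enabled.append(source)
    return enabled

permits = {"backing_pump": True,
           "heat_exchanger": heat_exchanger_permit(True, 5e-4)}
enabled = power_up_sequence(["backing_pump", "heat_exchanger"], permits)
```

Evaluating permits before each step, rather than once up front, is what makes the sequencing an interlock rather than just an ordering.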
Internal review Oct 14 08, slide 23: Defining Subsystems (contd)
Cryogen control (ground)
Comprises 3 Polycold refrigeration systems, a pumping system, a heat exchanger, a storage reservoir, and round-trip transport lines. This subsystem provides monitoring and control of the system as a whole. It must be able to start/stop and adjust these devices during operation.
Cryostat temperature monitoring
This system provides monitoring of all temperature sensors in the cryostat which are not part of the cryogen flow or science array systems.
Camera body Thermal Control
This is environmental control and monitoring of the camera body, including all areas outside of the cryostat. In particular, this includes the L1/L2 chamber, the L3/shutter/filter zone, and the carousel zone. The system consists of a temperature-regulated supply (from the ground) of dry nitrogen gas feeding a fan coil unit with a supply/return line for chilled water. The fan coil circulates the dry N2 throughout the volume, absorbing heat in the chilled water coils. Operation depends upon the supply of dry N2 (low pressure) and chilled water (telescope utility). Temperature sensors in the camera body skin and components will be inputs, with the system adjusting either fan speeds or water flow rates to regulate the skin temperature of the system.

Internal review Oct 14 08, slide 24: Defining Subsystems (contd)
Utility Trunk Thermal Control
This is environmental control and monitoring of the utility trunk. The system consists of a temperature-regulated supply (from the ground) of dry nitrogen gas feeding a fan coil unit with a supply/return line for chilled water. The fan coil circulates the dry N2 throughout the volume, absorbing heat in the chilled water coil. Operation depends upon the supply of dry N2 (low pressure) and chilled water (telescope utility).
Temperature sensors in the utility trunk components will be inputs, with the system adjusting either fan speeds or water flow rates to regulate the skin temperature of the system.
Cryostat vacuum control
Two turbo-pumps with backing from the utility room. Operation would assume the backing pumps are on (with verification). Two or more vacuum gauges and possibly an RGA would be used to monitor the vacuum quality. The interface to the turbo pumps is unknown at this time but may include health/safety data from the pump itself.

Internal review Oct 14 08, slide 25: Defining Subsystems (contd)
Utility room vacuum control
The on-ground utility room vacuum system provides three distinct vacuum systems:
- backing vacuum for the cryostat
- backing vacuum for the valve box
- insulating vacuum for the lower portion of the cryogen supply system
These systems will include some number of gauges and (manual) valves, and also an N2 (nitrogen) gas purge supply with some monitoring.
Valve box vacuum control
This system comprises one turbo pump and some gauges and is backed by the utility room backing pump system (shared with the cryostat vacuum, but they can be manually valved off). Sensors/gauges for monitoring both the vacuum and the pump/valves would also be included in this system.
Science array (TCM)
The Timing and Control Module (TCM) distributes commands and timing to the ~21 raft controllers. The TCM delivers raft status to the CCS.

Internal review Oct 14 08, slide 26: Defining Subsystems (contd)
Science Data Acquisition (SDS)
The SDS is the data acquisition and distribution system for science data. It receives the data stream from the CCDs and does preliminary processing, buffering, and temporary storage of the science data. While under the control of the CCS, the SDS also provides services that are beyond, and asynchronous with, the camera readout.
Guide sensor (TCM)
The Timing and Control Module (TCM) distributes commands and timing to the 4 guide controllers.
The subsystem module (this object) instantiates the states that represent the guider system.
Guider Data Acquisition
This is the portion of the SDS devoted to guiding. It nominally operates when the science array is integrating. Also, the real-time analysis of the guider data may take place in the SDS, which then outputs offset parameters that the telescope control system uses to control the telescope tracking.

Internal review Oct 14 08, slide 27: Defining Subsystems (contd)
Wavefront sensor (TCM)
The Timing and Control Module (TCM) distributes commands and timing to the 4 wavefront sensor controllers. The subsystem module (this object) instantiates the states that represent the wavefront sensor system.
WFS Data Acquisition
The SDS is the data acquisition and distribution system for wavefront sensor data. It receives the data stream from the sensors and does preliminary processing, buffering, and temporary storage of the wavefront data. While under the control of the CCS, the SDS also provides services that are beyond, and asynchronous with, the camera readout. The primary client of this data is the telescope system. It is not expected to be routinely consumed by any other client.
Focal Plane laser spot system
This system consists of a set of laser sources and optics which project a fixed pattern onto the focal plane for in-situ calibration procedures. Control will essentially consist of just enable/disable and on/off.
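The guider analysis role described above (turning guide-sensor images into offset parameters for telescope tracking) can be sketched with a simple intensity-weighted centroid. The plate scale and reference position are made-up placeholders, not camera parameters.

```python
import numpy as np

def centroid(img):
    """Intensity-weighted centroid (y, x) of a 2-D image."""
    total = img.sum()
    ys, xs = np.indices(img.shape)
    return (ys * img).sum() / total, (xs * img).sum() / total

def tracking_offset(img, ref_yx, arcsec_per_pix=0.2):
    """Offset of the star centroid from a reference position, in
    arcseconds. The 0.2''/pixel scale is an illustrative assumption."""
    cy, cx = centroid(img)
    return ((cy - ref_yx[0]) * arcsec_per_pix,
            (cx - ref_yx[1]) * arcsec_per_pix)

# A toy guide-sensor cutout: one bright pixel displaced from the
# reference position (2, 2).
img = np.zeros((5, 5))
img[3, 2] = 10.0
dy, dx = tracking_offset(img, ref_yx=(2.0, 2.0))
```

In the real system such offsets would be computed while the science array integrates and streamed to the telescope control system; this sketch only shows the per-frame arithmetic.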
Internal review Oct 14 08, slide 28: Introduction to the SDS (the Science DAQ System)
Michael Huffer, Stanford Linear Accelerator Center. CCS APC Paris Workshop, August 27, 2008.
Representing: Mark Freytag, Gunther Haller, Ryan Herbst, Chris O'Grady, Amedeo Perazzo, Leonid Sapozhnikov, Eric Siskind, Matt Weaver.

Internal review Oct 14 08, slide 29: The Channel
Physically: a fiber-optic pair connected to transceivers. The camera SDS transceiver choice has not yet been made; however, the SDS is relatively insensitive to this decision.
Logically, the channel runs a camera-private protocol:
- full duplex, asynchronous
- may operate from 1 to 3.2 Gb/sec (we will operate at 3.2)
- provides error detection & correction (data not buffered on camera)
- QOS support (4 virtual channels): bulk-data transfer (science data), R/W configuration registers (minimal usage), triggering (most likely not used)
A VHDL library is provided for the interface on the BEE (Back-End Electronics) to the Cluster Element, via the camera FPGA protocol interface.

Internal review Oct 14 08, slide 30: The Cluster Element
The Cluster Element is the unit of intelligence and buffering for the SDS:
- services science data from 14 channels
- has two (2) 10-GE Ethernet MACs
- contains up to 1 TByte of flash memory (currently 1/2 TByte)
- contains a reconfigurable FPGA fabric (+ DSPs)
- contains a PowerPC processor (450 MHz) with 512 MBytes RLDRAM-II (currently 128 MBytes) and 128 MBytes configuration memory
- the processor runs an open-source real-time kernel (RTEMS) with POSIX-compliant interfaces and a standard IP network stack
- the processor has a generic DMA interface which supports up to 8 GBytes/sec of I/O
- the processor has I/O interfaces to all four input channels (at full rate), both Ethernet MACs (at full rate), flash memory (access at up to 1 GByte/sec), and the FPGA fabric

Internal review Oct 14 08, slide 31: RCE board + RTM (block diagram)
[Diagram: MFD, RCE, media carriers, Zone 2/Zone 3 backplane, slices 0-7, fiber-optic transceivers.]

Internal review Oct 14 08, slide 32: RCE board + RTM
[Photo: media carrier, media slice, RCE, power conversion, Zone 1
(power), Zone 2, Zone 3, transceivers.]

Internal review Oct 14 08, slide 33: Cluster Interconnect board + RTM (block diagram)
[Diagram: Zone 2/Zone 3, MFD, CX-4 connectors, two 10-GE L2 switches, L2 backplane, RCE, x2 1-GE links, switch management and management bus.]

Internal review Oct 14 08, slide 34: Cluster Interconnect board
[Photo: power conversion, Zones 1-3, 1-GE Ethernet management processor, 10-GE switch, x2 external Ethernet.]

Internal review Oct 14 08, slide 35: ATCA (horizontal) crate (front)
[Photo: fans, shelf manager, Cluster Interconnect board.]

Internal review Oct 14 08, slide 36: Generic Ideas for Continued Discussion
- TCS/SOAR heritage where useful
- Subsystem boundaries defined by tight coupling
- Implementing the event-driven state machine model
- Communications: master/slave only; no peer-to-peer; peer info via subscription???
- Network (TCP/IP) is the only connection to a subsystem
- CCS proxy modules vs. resident CCS code in the subsystem
- Use of EA and UML/SysML to document the design
- CCS trac/svn site exists
- Supervision of a distributed set of processes on a distributed set of nodes: startup/shutdown, logging, debugging/tracing/snooping tools, graphical tools for monitoring, error handling/restarts, links to the hardware protection system

Internal review Oct 14 08, slide 37: Defining Subsystems
- Substantial updates from the mechanical engineering team
- Additions include utilities transport, utility trunk evolution, off-telescope operations
- Review and re-parse subsystem definitions based on which subsystems are tightly coupled
- Track definitions and functionality of subsystems in UML
- Each subsystem (CCS view) is based on a generic controller model
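The "peer info via subscription" idea on the Generic Ideas slide — no peer-to-peer links, with subsystem status republished by the master to interested subsystems — can be sketched as a tiny publish/subscribe broker. Class and topic names are illustrative assumptions.

```python
from collections import defaultdict

class Mcm:
    """Sketch of the MCM acting as the subscription hub: status flows
    subsystem -> MCM -> subscribers, never subsystem -> subsystem."""

    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, callback):
        """Register interest in a topic, e.g. "VCS/pressure"."""
        self._subs[topic].append(callback)

    def publish(self, topic, value):
        # Fan the value out to every subscriber; topics with no
        # subscribers are silently dropped in this sketch.
        for cb in self._subs[topic]:
            cb(value)

mcm = Mcm()
seen = []
mcm.subscribe("VCS/pressure", seen.append)
mcm.publish("VCS/pressure", 1.2e-6)   # e.g. vacuum telemetry fan-out
```

Routing all peer information through the master keeps the strict master/slave topology while still letting, say, the power-control subsystem react to vacuum status.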