Published: June 04, 2014
Joe Cassar, Team Leader, dSPACE Inc.
System networks in airframes, satellites and other avionic platforms have long included smart serialized communication between multiple components and systems. These networks started out with simple peer-to-peer serialized exchanges and have evolved into much more complex, high-speed multipoint networks.
OEMs and suppliers of avionics require test systems for the validation of embedded software. In many cases, these test systems include simulation of the system under control, together with the line replaceable unit (LRU), full authority digital engine control (FADEC) or other avionics components. Such test systems are closed-loop, real-time systems and are often referred to as Hardware-in-the-Loop (HIL) systems.
In order to validate embedded software, it is often necessary to emulate the various communication network interfaces that would physically exist in an airframe. A HIL system for a single unit under test (UUT) should provide emulation of non-existent components, such as control electronics and sensors, participating in a particular communication network. In this scenario, the ability to simulate a communication bus (e.g. ARINC 429) is critical for software testing. It allows engineers to perform a wide variety of tests on the UUT without real sensor and actuator interfaces being available, and typically allows for near-complete validation of the UUT software with minimal physical components.
In addition to single-component tests, it may be desirable to set up a test bench that validates multiple components and exercises the actual interaction between them. In this scenario, it also becomes important to be able to isolate and hijack the communication between one UUT and another in order to insert failure conditions: essentially, to manipulate certain parts of the payload data while preserving the majority of the communication. This matters because it is often difficult to emulate the dynamic traffic of non-existent components; intercepting the real traffic between the UUTs still allows their interaction to be tested, validating the system-level implementation of the software.
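The hijack-and-manipulate pattern described above can be sketched as a small gateway function. This is a minimal illustration, not a vendor API: the frame layout follows the standard ARINC 429 word format, but the targeted label value and the stuck-at-zero fault are assumptions chosen for the example.

```python
# Sketch of a failure-injection gateway sitting between two UUTs.
# Most traffic passes through untouched; one targeted label has its
# data field forced to zero (simulating a stuck sensor).

FAULT_LABEL = 0o204  # example label to manipulate (an assumption)

def inject_fault(word: int) -> int:
    """Zero the 19-bit data field (bits 10..28 of the 32-bit word),
    leaving label, SDI, SSM and parity position intact."""
    data_mask = 0x7FFFF << 10
    return word & ~data_mask

def gateway(rx_word: int) -> int:
    """Forward traffic unchanged except for the targeted label."""
    label = rx_word & 0xFF  # bits 0..7 carry the label
    if label == FAULT_LABEL:
        return inject_fault(rx_word)
    return rx_word
```

In a real test bench this logic would run on the real-time system between the receive and transmit channels of the bus interface card, so the manipulation stays deterministic.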
In order to facilitate such test conditions, the test equipment must be capable of deterministic emulation of the bus technology backbone of an integrated system. In most cases, it becomes imperative to employ an embedded system with a deterministic real-time operating system (RTOS). QNX is one of the popular commercial RTOSs used to achieve deterministic performance. However, there is more to ensuring determinism than utilizing an RTOS – it is also important that the peripherals utilized by the central embedded system running the RTOS do not violate these determinism principles.
This is why proper architecture of the overall system is so critical – and why COTS parts from multiple vendors create uncertainty in this regard. This criterion is often overlooked when a roll-your-own system is assembled from commercially available components from multiple hardware vendors that have not been properly vetted together as a whole.
In most cases, the integrator of these systems must ensure that these third-party items perform up to expectations. Interface cards used for bus simulation are often aimed at various types of test applications, and many of these cards can perform real-time data replay with buffered data memory across a PCI or PCI Express backplane.
For real-time testing applications involving, for example, the simulation of a dynamic plant model of an engine or turbofan, it is necessary that the interface card used for the bus simulation and the main Central Processing Unit (CPU) of the PC running the dynamic model in the RTOS environment can exchange data synchronously at each discrete time step. In such applications, it is not only important to consider the frame rate of a message transmission, but also the bus loading.
The MIL-STD-1553 communication bus runs with major frame periods as short as 16 or 32 milliseconds and bus loads of 60-70%, which means there is an immense amount of data to be exchanged between the core CPU and the peripheral bus simulation card. In view of these requirements, integrated solutions for real-time systems which ensure interoperability between the core real-time simulation and the interface peripherals are critically important. Dedicated and well-integrated turn-key HIL systems, like dSPACE SCALEXIO, address all requirements for testing networked embedded systems: they provide adequate processing power for simulation, advanced electronic interfaces to emulate sensors and measure actuator response, and network traffic simulation.
Historically, the predominant network interface of choice for commercial airframes has been ARINC 429, which is implemented as point-to-point communication. ARINC 429 transceivers send data in a 32-bit frame consisting of a label and a data field, where the label field identifies the specific application of the data. These labels have become somewhat standardized for particular usages. Because of the nature of this point-to-point communication, ARINC 429 often requires a high channel count to emulate the entire bus architecture – and the number of channels required for a test system doubles if failure injection between nodes is necessary for testing.
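The 32-bit frame described above can be made concrete with a short encoder. The field layout (label, SDI, data, SSM, odd parity) follows the standard ARINC 429 word format; the example field values are illustrative.

```python
# Sketch of ARINC 429 word encoding.
# Layout: bits 0-7 label, 8-9 SDI, 10-28 data, 29-30 SSM, 31 parity (odd).

def encode_a429(label: int, sdi: int, data: int, ssm: int) -> int:
    """Pack the fields into a 32-bit word and set the odd-parity bit."""
    word = (label & 0xFF) | ((sdi & 0x3) << 8) | \
           ((data & 0x7FFFF) << 10) | ((ssm & 0x3) << 29)
    # Odd parity: set the top bit if the count of 1s is currently even
    if bin(word).count("1") % 2 == 0:
        word |= 1 << 31
    return word
```

A HIL bus interface card performs exactly this packing in hardware or firmware, once per label per transmission slot.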
For defense avionics and satellites, the MIL-STD-1553B protocol has been the dominant network interface over the past 40 years. The protocol is based on a Bus Controller (BC) and up to 31 remote terminals (RTs), in which all transactions on the bus are initiated by the BC, which directs a specific RT to send or receive data.
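The BC-directed transaction model can be illustrated by the 16-bit command word the BC issues to an RT. The field layout follows the MIL-STD-1553 standard; the RT address, subaddress and word count below are example values.

```python
# Sketch of a MIL-STD-1553 command word: 5-bit RT address, 1-bit
# transmit/receive flag, 5-bit subaddress, 5-bit word count.

def command_word(rt_addr: int, transmit: bool, subaddr: int,
                 word_count: int) -> int:
    """Pack the 16-bit command word (word_count of 0 encodes 32 words)."""
    return ((rt_addr & 0x1F) << 11) | (int(transmit) << 10) | \
           ((subaddr & 0x1F) << 5) | (word_count & 0x1F)

# Example: direct RT 5 to transmit 4 words from subaddress 2
print(hex(command_word(5, True, 2, 4)))  # → 0x2c44
```

A HIL system emulating the BC schedules these command words along the major frame; when emulating an RT, it must instead decode them and respond within the protocol's response time.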
ARINC 429 and MIL-STD-1553 bus technologies are still relevant and used in a wide variety of applications, and therefore have to be supported by test systems.
Today, a more diverse array of network interfaces is being employed in avionics and defense applications. Protocols such as SAE AS6802 Time-Triggered Ethernet, the IEEE 1394-based SAE AS5643 and FlexRay follow a similar approach of synchronously scheduling messages along a major frame, ensuring determinism by providing each message a dedicated time slice for its transmission. In addition to backbone-type Ethernet network interfaces, many commercial airframe manufacturers are looking into augmenting the backbone network with distributed cluster interfaces based on ARINC 825 and the popular automotive CAN protocol.
Additionally, shared-memory interfaces between distributed controllers, such as FORM, SCRAMNet and Fibre Channel, offer an alternative to bus interfaces. Test systems are therefore required to integrate a variety of these interfaces.
The next important factor to consider for test systems is the network management mechanism for realizing the bus configuration. This can be challenging: although many vendors of avionics bus hardware provide graphical user interfaces, these are typically designed for a non-real-time PC environment running Windows or Linux. They may not provide device drivers for QNX or an equivalent RTOS, because the normal use case is to handle the required determinism on the device itself. Although this is sufficient for simple static open-loop or stimulus-based testing, it fails to address the need for closed-loop dynamic simulation. A tight coupling between the core real-time system and the network emulation hardware is very important: the network configuration software must allow for easy configuration on a PC and be able to target the real-time system.
Many vendors of real-time systems provide configuration software that connects the network topology management to the real-time plant models. In many cases, plant models are defined in mathematical software packages, which come in many forms. Graphical modeling of physical components has become fairly popular over the past 20 years with the advent of advanced simulation tools, so it is very important that any test system also employ a configuration tool capable of importing this modeling content.
It is equally important for a network configuration tool to work with industry-standard simulation software packages such as Numerical Propulsion System Simulation (NPSS) models. This can be challenging if such modeling packages are only available in compiled form and require specific OS support, forcing the test system to execute the models in co-simulation across multiple CPU cores.
dSPACE test solutions with multi-core processors support hosting of multiple operating systems across a single application for simulation packages that only execute under a specific environment.
Defining a communication network configuration is a tedious and time-consuming process. Many organizations define their network topology in configuration files based on XML. Specific XML schemas are used to describe parameters such as the timing of a message, as well as information about how to encode and decode it. Test systems that implement a file parser for configuring the network topology can benefit from this predefined configuration. Because these XML schemas are not standardized, custom engineering is typically required to automate the bus configuration. It is therefore important to verify that test systems offer an open API for such automation of the network configuration. dSPACE systems provide open APIs based on Python, COM, .NET and the HIL API based on ASAM standards.
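The parser-based approach described above might look like the following sketch. Since these schemas are organization-specific and not standardized, the element and attribute names here (`network`, `message`, `signal`, `rate_ms`) are entirely hypothetical.

```python
# Sketch of parsing a network-topology XML file into message
# definitions for a bus configurator. The schema is hypothetical.

import xml.etree.ElementTree as ET

SAMPLE = """
<network>
  <message label="204" rate_ms="20" bus="A429_1">
    <signal name="altitude" start_bit="10" length="19" scale="0.25"/>
  </message>
</network>
"""

def parse_topology(xml_text: str) -> list[dict]:
    """Return a list of message dicts extracted from the XML topology."""
    root = ET.fromstring(xml_text)
    messages = []
    for msg in root.iter("message"):
        messages.append({
            "label": msg.get("label"),
            "rate_ms": int(msg.get("rate_ms")),
            "bus": msg.get("bus"),
            "signals": [s.get("name") for s in msg.iter("signal")],
        })
    return messages

print(parse_topology(SAMPLE))
```

An open automation API on the test-system side then takes such parsed definitions and programs the bus interface channels, which is where the custom engineering effort mentioned above is spent.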
Finally, test systems for avionics must allow for easy implementation of test cases and efficient management of test data across the test systems. ASAM AE-HIL-BS-1-3-API has evolved as a standard in the automotive industry to realize an interface between test automation software and HIL test systems. dSPACE AutomationDesk and HIL test systems are compliant with this standard.
These tools typically allow test cases to be implemented graphically, but many also offer Python-based scripting API support. The test automation tool should include mechanisms for defining a test framework that can serve as a basis for every test case, some form of editor, test execution, and management of test data and results. Interfaces to requirements management tools may be required as well.
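The framework-as-basis pattern described above can be sketched in plain Python: a base class carries the shared bring-up, teardown and result capture, and each concrete test case derives from it. The class and method names are illustrative, not a vendor API.

```python
# Sketch of a test-framework base class for HIL test automation.
# Every test case inherits the same lifecycle and result logging.

class HilTestCase:
    """Framework base: shared setup, teardown and result capture."""

    def setup(self):
        self.results = {}  # e.g. connect to the HIL platform here

    def teardown(self):
        pass               # e.g. return the UUT to a safe state

    def run(self):
        self.setup()
        try:
            self.execute()  # each subclass supplies the test body
        finally:
            self.teardown()
        return self.results

class SensorFailureTest(HilTestCase):
    def execute(self):
        # inject the fault, wait, then check the UUT's reaction
        self.results["fault_detected"] = True  # placeholder verdict

print(SensorFailureTest().run())  # → {'fault_detected': True}
```

Graphical tools and standardized APIs generate or wrap exactly this kind of structure, so test data from every case lands in a uniform, manageable form.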
Test Automation and Data Management Tools with dSPACE AutomationDesk and SYNECT
In closing, specifying a HIL test system to validate software of networked embedded systems requires special considerations with regard to the criteria for real-time determinism and relevant network interface technology. These systems must also be capable of a high degree of fidelity with regard to computation of mathematically described plant models to ensure closed-loop determinism, and they must have a tight coupling between the core real-time system and the network interface card(s). Such systems should enable easy configuration and ideally allow for the import of network description artifacts, if available. It is also important to address test automation and data management issues as part of the overall process.