en:safeav:ctrl:testing — created 2026/03/26 11:34 (raivo.sell); current revision 2026/03/26 13:43 (airi)
====== Physical Testing ======
Physical testing infrastructures across ground, airborne, marine, and space systems reflect a progression from **high-access, …**

| + | ===== Ground Systems (Automotive & Robotics) ===== | ||
| + | |||
<figure Ref.figure6.12a>
{{:…}}
<caption>…</caption>
</figure>

Ground systems benefit from the most accessible and diverse physical testing environments. **Proving grounds and AV test tracks**—such as Mcity and the American Center for Mobility—replicate urban, suburban, and highway conditions with controllable variables (traffic signals, pedestrian dummies, weather systems). OEMs also use large private facilities (e.g., the General Motors Milford Proving Ground) for durability, ADAS, and edge-case testing. These environments enable **repeatable scenario testing**, fault injection, and safe validation of perception and decision-making systems. Increasingly, …

| + | ===== Airbone Systems (Aviation & UAVs) ===== | ||
| + | |||
<figure Ref.figure6.12b>
{{:…}}
<caption>…</caption>
</figure>

Airborne testing combines **ground-based facilities and open-air test ranges**. Wind tunnels (e.g., the NASA Ames Research Center wind tunnels) provide controlled aerodynamic testing across regimes, while **iron-bird rigs** and avionics labs enable hardware/…

===== Marine Systems =====

<figure Ref.figure6.12c>
{{:…}}
<caption>…</caption>
</figure>

Marine testing relies on a mix of **controlled hydrodynamic facilities and open-water trials**. Towing tanks and wave basins—such as those at the Naval Surface Warfare Center—allow precise study of hull performance, …

===== Space Systems =====

<figure Ref.figure6.12d>
{{:…}}
<caption>…</caption>
</figure>

Space systems have the most specialized and constrained physical testing infrastructure. Because full end-to-end testing in the operational environment is impossible, engineers rely on **high-fidelity ground facilities** that replicate aspects of space conditions. These include thermal vacuum chambers (e.g., NASA Johnson Space Center Chamber A), vibration and acoustic test facilities for launch loads, and propulsion test stands (e.g., Stennis Space Center). RF anechoic chambers validate communication and sensing systems. While these facilities achieve extreme fidelity for specific physics, **system-level validation is fragmented**, …

| + | ===== Cross-Domain Insight ===== | ||
| + | |||
Across all four domains, physical testing evolves from **highly repeatable, scenario-rich environments (ground)** to **physics-constrained, …**

Summary:

This chapter develops a comprehensive view of how **control, decision-making, …**

The chapter then explores the **decision and planning hierarchy**, …

Finally, the chapter focuses on **validation and assurance**, …

Assessments:

| + | ^ # ^ Project Title ^ Description ^ Learning Objectives ^ | ||
| + | | 1 |Classical vs AI Control Benchmark Study | Implement and compare a classical controller (e.g., PID or LQR) with an AI-based controller (e.g., reinforcement learning) for a simplified vehicle model in simulation. Evaluate performance under nominal and disturbed conditions. |- Understand differences between model-based and data-driven control \\ - Analyze stability, robustness, and interpretability trade-offs \\ - Evaluate controller performance under uncertainty and disturbances | | ||
| + | | 2 |Behavioral & Motion Planning Stack Design | Design a hierarchical autonomy stack that includes a behavioral layer (FSM or behavior tree) and a motion planner (A*, RRT*, or MPC). Apply it to a scenario such as lane change or obstacle avoidance. | * Distinguish between behavioral decision-making and motion planning \\ * Implement planning algorithms under constraints \\ * Understand integration between perception, planning, and control | | ||
| + | | 3 |Scenario-Based Validation Framework | Develop a scenario-based testing framework using parameterized scenarios (e.g., varying speeds, distances, agent behaviors). Use a simulator to evaluate planning/ | ||
| + | | 4 |Digital Twin & Multi-Fidelity Simulation Study | Build a simplified digital twin of a vehicle and environment. Perform validation using both low-fidelity and high-fidelity simulation setups, comparing results and identifying discrepancies. | * Understand role of digital twins in V&V \\ * Analyze trade-offs between simulation fidelity and scalability \\ * Quantify sim-to-real gaps and their implications | | ||
| + | | 5 |Formal Methods for Safety Validation | Define safety requirements using a formal specification approach (e.g., temporal logic or rule-based constraints). Apply these to simulation traces and identify violations or edge cases. | * Translate safety requirements into formal, testable properties \\ * Use formal methods for falsification and validation \\ * Understand limitations of simulation without formal rigor | | ||