  1. Simulation Based Formal Verification of Onboard Software — A Case Study —
SyLVer: System Level Verifier
Toni Mancini, Annalisa Massini, Federico Mari, Igor Melatti, Ivano Salvo, Enrico Tronci
Computer Science Department, Sapienza University of Rome, Italy
http://mclab.di.uniroma1.it

  2. System Level Verification of CPSs
• Cyber-Physical System (CPS): hw + sw components ⇒ can be modelled as a Hybrid System
• System Level Verification (SLV): verify that the whole system (hw + sw) satisfies given specifications
• CPSs of industrial relevance are too complex for SLV to be performed by model checkers for Hybrid Systems
• Main workhorse for SLV: Hardware In The Loop Simulation (HILS)

  3. Hardware In The Loop Simulation
• Hardware In The Loop Simulation (HILS): replace hardware with a software simulator
• Supported by Model Based Design tools such as Simulink, VisSim, …
[Figure: the System Under Verification (SUV) simulator couples the Controller (software system) with the Plant (physical system); an operational scenario provides the uncontrollable inputs ("disturbances": faults, changes in system parameters, …); the simulation output is Pass/Fail.]
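The loop above can be sketched in a few lines of plain Python. This is only an illustration of the HILS idea, not the tools named on the slide (Simulink, VisSim): the plant, controller, gains and safety threshold below are toy stand-ins.

```python
# Minimal sketch of a HILS-style closed loop: a software simulator plays
# the plant, an operational scenario injects disturbances, and the
# simulation outcome is Pass/Fail.

def plant(x, u, disturbance):
    """Toy first-order physical system driven by control input u."""
    return x + 0.5 * (u - x) + disturbance

def controller(x, setpoint=0.0):
    """Toy proportional controller (the software system)."""
    return 0.8 * (setpoint - x)

def simulate(scenario, horizon, threshold=10.0):
    """Run the SUV (controller + plant) under an operational scenario,
    given as a map time -> disturbance (absent = no disturbance).
    Returns FAIL if the state ever leaves the safe region, else PASS."""
    x = 0.0
    for t in range(horizon):
        u = controller(x)
        x = plant(x, u, scenario.get(t, 0.0))
        if abs(x) > threshold:
            return "FAIL"
    return "PASS"
```

With no disturbances `simulate({}, 20)` passes, while a large injected fault such as `simulate({0: 100.0}, 20)` fails.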

  4. HILS Campaign: Main Obstacles
• Effort needed to define the operational scenarios, i.e. the disturbances to be injected into the system under verification. Hard to do manually!
• Computation time needed to carry out the simulation campaign itself. Can take weeks!
• Degree of assurance achieved at the end of the HILS campaign: did we consider all relevant operational scenarios? "Did I overlook anything?"
• Graceful degradation: what can we say about the error probability during the HILS campaign? "What can I say if I abort verification now?"

  5. Our Approach to System Level Formal Verification
• Effort to define operational scenarios → formal model of the operational scenarios (disturbance model) as a FSA described in a high-level language (CMurphi)
• Degree of assurance → exhaustive system level verification wrt the operational scenarios defined by the model
• Graceful degradation → anytime random algorithm: at any time we can compute an upper bound to the Omission Probability
• Computation time → embarrassingly parallel multi-core approach to speed up simulation + optimisation
[CAV13, PDP14, DSD14, PDP15, Microprocessors & Microsystems 2016, Fundamenta Informaticae 2016]

  6. Model-Based System Verification @ MCLab
[Figure: the Disturbance Model (formal model of the operational scenarios) feeds SyLVer, the System Level Formal Verifier (https://bitbucket.org/mclab/sylver-simulink-driver), which computes optimised LOAD/RUN/FREE/STORE simulation campaigns; on a parallel cluster, each campaign drives an optimised Simulator + Driver over the CPS model with a property Monitor (pass = 0, fail = 1) via Hardware-in-the-Loop Simulation (HILS), and yields an Omission Probability bound.]

  7. SyLVaaS
• Introduces the Verification as a Service paradigm
• Supports companies in the CPS design business in their daily verification activities
• Allows keeping both the SUV model and the property to be verified secret (Intellectual Property protection)
[Figure: (1) the SUV and the property stay with the verification engineer on a private cluster; (2) only the disturbance model (CMurphi syntax) is sent to SyLVaaS over http; (3) SyLVaaS computes optimised simulation campaigns for random exhaustive parallel HILS; (4) the campaigns are returned for execution on the private cluster.]

  8. Modelling the Operational Environment
• SUV input: a discrete event sequence u(t), associating to each (real) time t a disturbance event within [0, d]; it differs from 0 (no disturbance) in a finite number of time points, since no system can withstand an infinite number of disturbances within a finite time
• SUV: a continuous-time input-state-output deterministic dynamical system
• Property to be verified: embedded in a continuous-time SUV monitor
• SUV output: 0 at start; goes to, and stays at, 1 as soon as an error is detected
[Figure: an example u(t) with d = 3; the SUV feeds the monitor, whose outcome is Pass or Fail.]
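The latching SUV output described above (0 at start, goes to and stays at 1 as soon as an error is detected) can be sketched as a tiny monitor class. The class name and the error predicate are illustrative, not from the slides.

```python
class LatchingMonitor:
    """Sketch of the SUV monitor on the slide: output starts at 0 and
    goes to, and stays at, 1 as soon as an error is detected.
    error_predicate stands in for the embedded property check."""
    def __init__(self, error_predicate):
        self.error = error_predicate
        self.out = 0
    def step(self, suv_output):
        if self.error(suv_output):
            self.out = 1          # latch: never falls back to 0
        return self.out

# Property |y| <= 1: once violated, the monitor output stays 1
# even if the signal later returns inside the safe region.
monitor = LatchingMonitor(lambda y: abs(y) > 1)
outs = [monitor.step(y) for y in (0.5, 2.0, 0.0)]
```

Here `outs` is `[0, 1, 1]`: the violation at the second sample latches the output.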

  9. Discrete Event Seq's & Disturbance Traces
We aim at Bounded System Level Formal Verification:
• Bounded time horizon: h
• Bounded time quantum between disturbances: 𝜐
A discrete event sequence over horizon h is thus encoded as an (h, d) disturbance trace: one disturbance value per time quantum, 0 meaning "no disturbance".
[Figure: an example u(t) with d = 3 and its encoding as a disturbance trace over horizon h.]
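The encoding step can be sketched as follows; the function name and the dict-based representation of the event sequence are my own illustrative choices, assuming the quantisation described on the slide.

```python
def to_disturbance_trace(events, h, quantum=1.0):
    """Encode a discrete event sequence (a finite map time -> disturbance
    event in [0, d]) as an (h, d) disturbance trace: one entry per time
    quantum over horizon h, with 0 meaning 'no disturbance'."""
    n = int(h / quantum)
    trace = [0] * n
    for t, e in events.items():
        trace[int(t / quantum)] = e
    return trace
```

For instance, events {1.0: 3, 4.0: 2} over h = 6 with 𝜐 = 1 become the trace [0, 3, 0, 0, 2, 0].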

  10. Disturbance Model
• Defining all disturbance sequences the SUV should withstand cannot be done manually for large CPSs
• Approach: use a high-level modelling language to define the disturbance model as a Finite State Automaton (we actually use the rich language of the CMurphi model checker)
A tiny example:
• Just one disturbance (fault), always recovered within 4 seconds
• At least 5 seconds between two consecutive disturbances
• Time quantum 𝜐 = 1 second
• Time horizon h = 6 seconds

  function disturbanceModel(h)
    c ← 0; /* recovery counter */
    t ← 0; /* time */
    while t < h do
      t ← t + 1; d ← read();
      if d = 1 then
        if c > 0 then return ⊗;
        else c ← 4;
      else if c > 0 then c ← c − 1;
    return √;
  end

FSA recognising admissible disturbance traces (overall 8 admissible disturbance traces). Examples:
000000 √  010000 √  000001 √  000010 √
010001 ⊗  01001 ⊗  000011 ⊗  0101 ⊗
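The tiny example above can be rendered directly in Python. One subtlety: the recovery counter must be checked before it is decremented, and left untouched on the disturbance step itself, otherwise a rejected trace such as 010001 ⊗ (only 4 seconds between disturbances) would be accepted. The names `admissible` and `count` are mine.

```python
from itertools import product

def admissible(trace, recovery=4):
    """Recognise admissible disturbance traces for the toy model above:
    a disturbance (1) is rejected if it arrives while the previous one
    is still being recovered, which enforces at least recovery + 1 = 5
    quanta between two consecutive disturbances."""
    c = 0                      # quanta still blocked after a disturbance
    for d in trace:
        if d == 1:
            if c > 0:
                return False   # too close to the previous disturbance
            c = recovery       # block the next `recovery` quanta
        elif c > 0:
            c -= 1
    return True

# Horizon h = 6, quantum 1 s: exhaustively enumerate all 2^6 traces.
count = sum(admissible(t) for t in product((0, 1), repeat=6))
```

`count` comes out as 8 (the all-zero trace, six single-disturbance traces, and 100001 with exactly 5 quanta between disturbances), matching the slide's "overall 8 admissible disturbance traces".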

  11. SyLVaaS Workflow
• Disturbance trace generation from the disturbance model (master-slave distributed approach)
• Slicing of the traces into k slices (k: number of cores in the user cluster)
• Computation of an optimised random exhaustive simulation campaign for each slice (embarrassing parallelism): sim. camp. 1, sim. camp. 2, …, sim. camp. k
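The slicing step above can be sketched in a couple of lines. Round-robin assignment is an assumption on my part: the slide does not specify SyLVaaS's actual slicing policy.

```python
def slice_traces(traces, k):
    """Split the generated disturbance traces into k slices, one per
    core of the user cluster (round-robin; illustrative only)."""
    slices = [[] for _ in range(k)]
    for i, trace in enumerate(traces):
        slices[i % k].append(trace)
    return slices
```

For example, five traces over two cores yield slices of sizes 3 and 2.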

  12. Optimised Rnd Exhaustive Sim. Campaigns
Each slice i is turned into an optimised random exhaustive simulation campaign i (embarrassing parallelism in the SyLVaaS cluster).
A campaign is a sequence of simulator commands:
• inj_run(e, t): inject disturbance e and advance simulation by t
• store(l): store current sim. state into mass memory
• load(l): set current sim. state from a previously stored state
• free(l): free stored sim. state l
Properties:
• Optimisation: use of load/store commands avoids revisiting previously visited simulation states as much as possible
• Exhaustiveness: all disturbance traces in the input slice are verified
• Randomness: trace verification order is randomised
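The four commands can be sketched with a toy executor. Here the "simulation state" is just the event sequence simulated so far, and `steps` counts inj_run work to make the saving visible; a real driver would talk to Simulink instead. All names are illustrative.

```python
class SimCampaign:
    """Toy executor for the four simulator commands above."""
    def __init__(self):
        self.state = ()        # current simulation state
        self.saved = {}        # label -> stored simulation state
        self.steps = 0         # total inj_run work performed
    def inj_run(self, event, duration=1):
        """Inject a disturbance, then advance by `duration` quanta."""
        self.state += (event,) + (0,) * (duration - 1)
        self.steps += duration
    def store(self, label):
        self.saved[label] = self.state
    def load(self, label):
        self.state = self.saved[label]
    def free(self, label):
        del self.saved[label]

# Two traces sharing the prefix (0, 2): the second one reloads the
# stored state instead of re-simulating the shared prefix.
c = SimCampaign()
c.inj_run(0); c.inj_run(2); c.store("a")
c.inj_run(1)                 # first trace:  0 2 1
c.load("a"); c.inj_run(3)    # second trace: 0 2 3
c.free("a")
```

After the campaign `steps` is 4, instead of the 6 a from-scratch replay of both length-3 traces would need.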

  13. Optimised Rnd Exhaustive Sim. Campaigns
Slice of labelled traces (labels univocally denote trace prefixes; prefix labelling is done during generation, DFS → free!):
1. a 0 b 2 c 1 d 0 e 0 f 1 g
2. a 0 b 2 c 2 h 0 i 0 j 0 k
3. a 0 b 2 c 2 h 0 i 3 m 0 n
4. a 0 b 2 c 3 p 1 q 1 r 0 s
5. a 0 b 2 c 3 p 2 v 2 w 0 x
6. a 0 b 3 y 0 z 0 α 1 β 0 λ
Simulation campaign (rnd + optimised):
init; store(a); load(a); inj_run(0,1𝜐); store(b); inj_run(2,1𝜐); store(c); inj_run(2,2𝜐); store(i); inj_run(3,2𝜐);  /* trace 3 */
load(c); inj_run(1,3𝜐); inj_run(1,1𝜐);  /* trace 1 */
load(b); free(b); inj_run(3,3𝜐); inj_run(1,2𝜐);  /* trace 6 */
load(c); free(c); inj_run(3,1𝜐); store(p); inj_run(2,1𝜐); inj_run(2,2𝜐);  /* trace 5 */
load(p); free(p); inj_run(1,1𝜐); inj_run(1,2𝜐);  /* trace 4 */
load(i); free(i); free(a); inj_run(0,2𝜐)  /* trace 2 */

  14. Embarrassingly Parallel Simulation
• Simulation carried out on the user's private cluster (Intellectual Property protection)
• k Simulink instances on k cores, each running its own sim. campaign (sim. camp. 1, …, sim. camp. k) over the SUV model + embedded property monitor
• Overall outcome: pass, or fail + counterexample
• Anytime bound to the Omission Probability: 1 − min_{i ∈ [1,k]} (%done_i)
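The anytime bound above is a one-liner; the function name is mine.

```python
def omission_bound(done_fractions):
    """Anytime upper bound to the Omission Probability, as on the slide:
    1 - min over the k cores of the fraction of its simulation campaign
    already carried out."""
    return 1.0 - min(done_fractions)
```

For example, if the slowest of three cores has verified half of its slice, the bound is 0.5; once every core finishes, the bound drops to 0 (exhaustive verification).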

  15. A Case Study: Apollo Saturn V Launch Vehicle
[Figure: the Apollo Command Module and the Apollo Command and Service Modules, with the pitch, roll and yaw engines highlighted.]

  16. A Case Study: Apollo Yaw, Pitch and Roll Jets
• Three signals: Yaw, Pitch and Roll sensors
• Safety property: Yaw, Pitch and Roll stay close to 0

  17. A Case Study: Apollo
• The three signals are demuxed so that each one can be disturbed singularly
• The SyLVaaS Disturber Module injects disturbances into the SUV; a Monitor checks the safety property (monitor output: pass = 0, fail = 1)
