
  1. Learning Software Models Alessandra Russo Imperial College London

  2. In collaboration with... • Imperial College London: Dalal Alrajeh, Jeff Kramer, Jeff Magee, Daniel Sykes, Domenico Corapi • Universidad de Buenos Aires: Sebastian Uchitel • Université Catholique de Louvain: Axel van Lamsweerde • National Institute of Informatics, Japan: Katsumi Inoue

  3. Engineering Software Models. A figure relates informal requirements, system goals and execution traces through elaboration, refinement, adaptation and obstacle analysis. An execution trace is a set of event and fluent facts, e.g. do(pickup, 0), holdsAt(at(loc1), 0), ..., do(putdown, 3), holdsAt(at(loc5), 3). System goals are FLTL formulas:
  G[PumpOnWhenHighWaterAndNoMethane] = □(tick → ((HighWater ∧ ¬CriticalMethane) → ◯(¬tick W (tick ∧ PumpOn))))
  G[PumpOffWhenLowWater] = □(tick → (LowWater → ◯(¬tick W (tick ∧ ¬PumpOn))))
  G[PumpOffWhenMethane] = □(tick → (CriticalMethane → ◯(¬tick W (tick ∧ ¬PumpOn))))
  G[AlarmWhenMethane] = □(tick → (CriticalMethane → ◯(¬tick W (tick ∧ Alarm))))

  4. Learning Software Models. The same figure, now with the talk's central claim: logic-based learning can provide automated support to software model elaboration, refinement, adaptation and analysis.

  5. Elaboration of Operational Requirements. Software models give a formal description of operational requirements, e.g. □(criticalMethane → ◯ turnPumpOff), or as an operation specification:
  Op "TurnPumpOff"
  pre-conditions - DomPre: PumpOn; ReqPre: ¬HighWater
  post-conditions - DomPost: PumpOff
  trigger-conditions - ReqTrig: CriticalMethane
  Correct operational requirements must cover the desirable behaviours, reject the undesirable behaviours, and satisfy Req ⊨ Goals. In practice, operationalisation patterns are used: they are pre-verified, guarantee completeness and guide the operationalisation, but they restrict requirements to known patterns, the patterns can be complex, and there is a lack of automated support.

  6. Elaboration of Operational Requirements. Research question: can we automatically generate software models that cover scenarios of desirable behaviours, reject scenarios of undesirable behaviours, and satisfy given domain properties and partial requirements? Inputs: domain properties, partial requirements, desirable behaviours, undesirable behaviours. Output: operational requirements.


  8. Elaboration of Operational Requirements. The same research question, with the proposed mechanism in place: logic-based learning derives the operational requirements from the domain properties, partial requirements, desirable behaviours and undesirable behaviours.

  9. What is logic-based learning? It lies at the intersection of Knowledge Representation, Machine Learning and Logic Programming. From machine learning it takes knowledge extraction from observations, prediction about unseen data, and the ability to improve behaviour over time with experience; from knowledge representation and logic programming it takes a declarative representation of the problem domain, clear semantics, and sound (and complete) inference mechanisms. In short: learning declarative knowledge from observations and an existing (partial) domain model.

  10. How does it work? A learning task is specified as:
  Given
  • K, domain knowledge: nat(0); nat(s(X)) ← nat(X); even(0)
  • E+, positive examples: odd(5)
  • E−, negative examples: not odd(2), not odd(4)
  • IC, integrity constraints: false ← odd(X), even(X)
  • Language bias: modeh(odd(+nat)), modeh(even(+nat)), modeb(odd(+nat)), modeb(even(+nat)), modeb(=(+nat, s(-nat)))
  Find
  • H, new knowledge, such that K ∪ H ⊨ E+, K ∪ H ⊭ E−, and K ∪ H ∪ IC ⊭ false.
  The search refines candidate hypotheses against the goal g: odd(5), not odd(4), not odd(2), e.g. from H: odd(X) ← to H: odd(X) ← X = s(Y) to H: odd(X) ← X = s(Y), even(Y); the "artificial example" g: even(4) then drives learning of the unobserved concept even(X) ← X = s(Y), odd(Y).
  Key features: able to learn concepts that are not observed, recursive, or inter-dependent; uses heuristics to drive the search; handles integrity constraints; amenable to distributed computation.
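  The learned program can be checked directly. Below is a minimal sketch in Prolog (SWI-Prolog syntax, assuming the slide's Peano encoding of numbers) of K together with the learned H; odd(5) succeeds while odd(2) and odd(4) fail:

    % Domain knowledge K.
    nat(0).
    nat(s(X)) :- nat(X).
    even(0).

    % Learned hypothesis H: odd/1 and even/1 are mutually recursive,
    % and even/1 was never observed in the examples.
    odd(X)  :- X = s(Y), even(Y).
    even(X) :- X = s(Y), odd(Y).

    % ?- odd(s(s(s(s(s(0)))))).   succeeds (odd(5))
    % ?- odd(s(s(s(s(0))))).      fails    (odd(4))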

  11. Elaboration of Operational Requirements: formalisation of the problem.
  Given domain knowledge, i.e. domain properties (D) and partial requirements (R), desirable behaviours (E+) and undesirable behaviours (E−),
  Find new operational requirements Req such that:
  • D ∪ R ∪ Req ⊨ E+
  • D ∪ R ∪ Req ⊭ E−
  • D ∪ R ∪ Req ⊭ false

  12. Example: mine pump system. The FLTL software model includes domain properties and the partial requirement □(pumpOn → pumpOn W turnPumpOff).
  Desired behaviours (E+): ◯signalNoCriticalMethane, ◯²signalHighWater, ◯³turnPumpOn.
  Undesired behaviours (E−): ◯signalCriticalMethane, ◯²signalHighWater, ◯³turnPumpOn; and ◯signalLowWater, ◯²turnPumpOn.
  The workflow: encode the FLTL model and behaviours into a logic program (LP), execute logic-based learning, and translate the learned hypothesis back into FLTL.
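  As a sketch of the encoding step, each behaviour becomes an event narrative in the logic program. Assuming standard Event Calculus happens/2 facts (the predicate name is illustrative; the deck elsewhere writes do(Event, Time)), the E+ trace above could be encoded as:

    % Desired behaviour ◯signalNoCriticalMethane, ◯²signalHighWater,
    % ◯³turnPumpOn as an event narrative; position 1 is the first tick.
    happens(signalNoCriticalMethane, 1).
    happens(signalHighWater, 2).
    happens(turnPumpOn, 3).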

  13. Theoretical Results • Automated, sound and complete encoding of FLTL into Event Calculus (EC) logic programs. • Learned requirements are correct elaborations of the software model with respect to the given desired and undesired behaviours. • Satisfiability of the desired behaviours is equivalent to entailment of the positive examples, because the EC representation has a single stable model. What about system goals?
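  A minimal sketch of the kind of Event Calculus program such an encoding produces, in SWI-Prolog. This is the simplified textbook EC core with an illustrative mine-pump fragment, not the paper's exact axiomatisation:

    % Simplified Event Calculus core: a fluent holds if it held
    % initially, or was initiated by an event, and was not clipped
    % (terminated) in between.
    holdsAt(F, T) :- initially(F), \+ clipped(0, F, T).
    holdsAt(F, T) :- happens(E, T1), T1 < T, initiates(E, F, T1),
                     \+ clipped(T1, F, T).
    clipped(T1, F, T2) :- happens(E, T), T1 =< T, T < T2,
                          terminates(E, F, T).

    % Illustrative mine-pump domain fragment.
    initially(lowWater).
    initiates(turnPumpOn, pumpOn, _).
    terminates(turnPumpOff, pumpOn, _).
    initiates(signalHighWater, highWater, _).

    % With the happens/2 narrative above: ?- holdsAt(pumpOn, 4). succeeds.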

  14. Refinement of Software Models from Goal Models. Research question: can we automatically generate a complete set of operational requirements (i.e. pre-conditions and trigger-conditions) that, together with the domain properties, satisfies a given goal model?

  15. Refinement of Software Models from Goal Models, more formally.
  Problem. Given domain knowledge, i.e. domain properties (D) and partial requirements (R), and a set of goals (G), find a set of operational requirements Req such that D ∪ Req ⊨ G and D ∪ Req ⊭ false.
  Solution. While D ∪ R ⊭ G:
  • model checking computes undesirable scenarios E− such that D ∪ R ∪ E− ⊭ G, and desirable scenarios E+ such that D ∪ R ∪ E+ ∪ G ⊭ false;
  • logic-based learning computes operational requirements {Rep_i} such that D ∪ R ∪ Rep_i ⊭ E−, D ∪ R ∪ Rep_i ∪ E+ ⊭ false, and D ∪ R ∪ Rep_i ∪ G ⊭ false;
  • R := R ∪ Rep_i.
  A sketch of this loop as a logic program follows.
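  This is a hypothetical Prolog driver for the cycle; entails/3, model_check/5 and learn/5 are stand-ins for the model-checking and learning components, not real library calls:

    % Hypothetical driver for the model-check / learn cycle.
    refine(D, R, G, R) :-
        entails(D, R, G), !.               % D ∪ R ⊨ G: done.
    refine(D, R, G, FinalR) :-
        model_check(D, R, G, Cex, Wit),    % counterexample + witnesses
        learn(D, R, Cex, Wit, Rep),        % new operational requirements Rep_i
        append(R, Rep, R1),                % R := R ∪ Rep_i
        refine(D, R1, G, FinalR).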

  16. Refinement of Software Models from Goal Models, as a diagram: model checking takes the goal model and the domain properties and produces counterexamples (negative examples) and witnesses (positive examples); logic-based learning turns these into operational requirements (pre-conditions and trigger-conditions), which feed back into the model.

  17. Model Checking Phase • An LTS M is synthesised from the FLTL software model (D ∪ R). • M is model checked against the goals G, and a counterexample C is generated. • Witness traces W of the violated goal are computed from M ∪ G.

  18. Logic-based Learning Phase • Positive and negative examples are generated from the witnesses and the counterexample. • The FLTL software model (D ∪ R) is encoded into an EC logic program (K), and the goals are expressed as integrity constraints (IC). • The learning task computes the missing requirements Req_i such that, at the LP level, K ∪ Req_i ⊨ E+, K ∪ Req_i ⊭ E−, and K ∪ Req_i ∪ IC ⊭ false; equivalently, at the FLTL level, D ∪ R ∪ Req_i ⊭ C, D ∪ R ∪ Req_i ∪ W ⊭ false, and D ∪ R ∪ Req_i ∪ G ⊭ false. • The FLTL representation of the selected hypothesis is added to the software model.
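  As a sketch of how a goal can be checked over the EC encoding, the goal PumpOffWhenMethane could be rendered as a violation-detection rule over holdsAt/2 (SWI-Prolog; the rule is a coarse approximation of the FLTL goal, standing in for the deck's false ← ... integrity constraints, and time/1 is a hypothetical bound on trace positions):

    % A (hypothetical) bound on trace positions.
    time(T) :- between(0, 10, T).

    % The goal is violated if the pump is still on at the position
    % after critical methane holds.
    violated(pumpOffWhenMethane, T1) :-
        time(T), holdsAt(criticalMethane, T),
        T1 is T + 1, holdsAt(pumpOn, T1).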

  19. Elaborating examples from counterexample and witnesses. Goal: □(tick → (CriticalMethane → ◯(¬tick W (tick ∧ ¬PumpOn)))). Identify the undesirable event in the counterexample: if it is a software-controlled event, a pre-condition is missing. Learning the missing pre-condition yields
  impossible(switchPumpOn, P, S) ← holdsAtPrev(criticalMethane, P, S).
  which translates back into FLTL as □(tick → (criticalMethane → ◯(¬switchPumpOn W tick))).

  20. Elaborating examples from counterexample and witnesses. Same goal. If the undesirable event in the counterexample is the tick event, a trigger-condition is missing. Learning the missing trigger-condition yields
  impossible(tick, P, S) ← holdsAtPrev(criticalMethane, P, S), holdsAtPrev(pumpOn, P, S), not occursSinceLastTick(switchPumpOff, P, S).
  which translates back into FLTL as □(tick → (criticalMethane ∧ pumpOn → ◯(¬tick W switchPumpOff))).
