  1. INDUSTRY 4.0 – OPEN ARCHITECTURE FOR DATA COLLECTION AND TRANSPORT Bologna - 17/05/2019

  2. Agenda
     › Industry 4.0 – definition and principles
     › Architecture for data collection and transport for Industry 4.0
     › Enabling technologies
       ▪ Apache Kafka
       ▪ Docker
       ▪ Kubernetes

  3. Industry 4.0 – definition and principles

  4. Industry 4.0 - Definition
     “Industry 4.0 is a term applied to a group of rapid transformations in the design, manufacture, operation and service of manufacturing systems and products […] everything in and around a manufacturing operation (suppliers, the plant, distributors, even the product itself) is digitally connected, providing a highly integrated value chain”
     http://www.europarl.europa.eu/RegData/etudes/BRIE/2015/568337/EPRS_BRI(2015)568337_EN.pdf

  5. Industry 4.0 - Definition
     ▪ 1st Industrial revolution: 1784 – mid 19th century – water and steam-powered mechanical manufacturing
     ▪ 2nd Industrial revolution: late 19th century – 1970s – electric-powered mass production based on the division of labour
     ▪ 3rd Industrial revolution: 1970s – today – electronics and information technology drives new levels of automation of complex tasks
     ▪ 4th Industrial revolution – Industry 4.0: today
     http://www.europarl.europa.eu/RegData/etudes/BRIE/2015/568337/EPRS_BRI(2015)568337_EN.pdf

  6. I4.0 - Cornerstones
     ▪ The application of information and communication technology (ICT) to digitise information and integrate systems
     ▪ Cyber-physical systems that use ICTs to monitor and control physical processes and systems. These may involve embedded sensors, intelligent robots that can configure themselves to suit the immediate product to be created, or additive manufacturing (3D printing) devices
     ▪ Network communications, including wireless and internet technologies, that serve to link machines, work products, systems and people, both within the manufacturing plant and with suppliers and distributors
     ▪ Simulation, modelling and virtualisation in the design of products and the establishment of manufacturing processes
     ▪ Collection of vast quantities of data, and their analysis and exploitation, either immediately on the factory floor or through big data analysis and cloud computing
     ▪ Greater ICT-based support for human workers, including robots, augmented reality and intelligent tools

  8. Industry 4.0 – Key factors and challenges
     ▪ Business practices
     ▪ Corporate standards
     ▪ Security/safety policies and procedures
     ▪ Business and application requirements
     ▪ Regulatory compliance
     ▪ Risk management

  9. Cornerstones of a joint collaboration
     ▪ Industry (SACMI)
       ▪ Business domain expertise (regulations, value stream, etc.)
       ▪ Cyber-physical systems expertise
     ▪ Academia (UniBO)
       ▪ Private/public infrastructure interoperability
       ▪ Protocol interoperability, efficiency, and performance tuning/evaluation
       ▪ Digital Twins
     ▪ Enterprise Architecture (Imola)
       ▪ Large-scale system design and integration on private/public infrastructures
       ▪ Software development/delivery process governance
       ▪ Data Governance and Cybersecurity

  10. Architecture for data collection and transport for Industry 4.0

  11. Project goal Design and implementation of a data collection and transport platform/architecture for Industry 4.0 scenarios

  12. Project requirements
     ▪ OT principles and regulations must be guaranteed: (physical) safety AND cybersecurity
     ▪ Heterogeneity: need to accommodate a large number of (mainly legacy) equipment types and data formats
     ▪ Scaling: need to accommodate ever-increasing volumes of connected equipment (new plants) and message rates
     ▪ Near real-time processing requirements: business/safety-critical actions happen at the equipment level

  13. Data collection and transport architecture – logical layers
     ▪ Storage/analysis layer: pluggable output towards multiple data stores; multiple data analysis and reporting platforms
     ▪ Interoperability layer: data format modeling and definition; data format validation and transformation
     ▪ Transport/messaging layer: asynchronous, publish-subscribe model; scalable message transport
     ▪ Data collection layer: HW/SW integration via possibly legacy and/or proprietary protocols; different equipment models and HW/firmware versions
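
Purely as an illustration of how a single reading might traverse these four layers, here is a toy sketch; every name, function, and record format in it is invented, and the real transport layer (Apache Kafka) is introduced later in the deck.

```python
# Toy end-to-end sketch of the four logical layers (all names/formats invented).
import json

def collect(raw: bytes) -> dict:
    # Data collection layer: adapt a (possibly proprietary) equipment payload.
    return {"equipment": "press-01", "payload": raw.decode()}

def normalize(record: dict) -> dict:
    # Interoperability layer: validate and transform into an agreed data format.
    assert "equipment" in record and "payload" in record
    return {"source": record["equipment"], "value": float(record["payload"])}

def transport(record: dict) -> bytes:
    # Transport/messaging layer: serialize for an asynchronous publish-subscribe bus.
    return json.dumps(record).encode()

def store(message: bytes) -> None:
    # Storage/analysis layer: hand over to one of several pluggable data stores.
    print("stored:", message)

store(transport(normalize(collect(b"72.5"))))
```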

  14. Data collection and transport architecture – architecture

  15. Enabling technologies Apache Kafka

  16. Transport layer – Apache Kafka
     Apache Kafka is a streaming platform with three key capabilities:
     ▪ Publish and subscribe to streams of records, similar to a message queue or enterprise messaging system
     ▪ Store streams of records in a fault-tolerant, durable way
     ▪ Process streams of records as they occur
     Typical Apache Kafka use cases:
     ▪ Real-time streaming data pipelines for data aggregation, processing, and transport
     ▪ Event-reactive streaming applications, e.g. fraud detection, data validation, e-mail sending confirmation
     ▪ Applications for real-time data analytics, stream processing, log aggregation, messaging, audit trails, synchronisation between cooperating nodes
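
To make the publish-subscribe capability above concrete, here is a minimal sketch using the kafka-python client. The broker address, topic name, key, and payload are placeholders invented for illustration, not part of the presented architecture.

```python
# Minimal publish/subscribe sketch with the kafka-python client.
# Assumes a broker reachable at localhost:9092; topic and payload are illustrative.
from kafka import KafkaProducer, KafkaConsumer

producer = KafkaProducer(bootstrap_servers="localhost:9092")
producer.send("equipment-telemetry", key=b"press-01", value=b'{"temp": 72.5}')
producer.flush()  # make sure the record is actually delivered

consumer = KafkaConsumer(
    "equipment-telemetry",
    bootstrap_servers="localhost:9092",
    group_id="analytics",          # consumers sharing a group split the partitions
    auto_offset_reset="earliest",  # start from the beginning of the commit log
)
for record in consumer:
    print(record.partition, record.offset, record.value)
```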

  17. Publish-subscribe model
     Diagram: several publishers and subscribers, decoupled from one another through topics (each publisher writes to a topic, each subscriber reads from it).

  18. Apache Kafka Topics and Partitions
     Diagram: topic “Football Teams” split into Partition 1 (team names A–M: Boca Juniors, Borussia Dortmund, Arsenal) and Partition 2 (team names P–Z: Real Madrid); publishers write to the topic, and consumer processes in consumer group 1 and consumer group 2 read each partition at their own offsets.
     ▪ Partitions and consumer groups allow for scalability
     ▪ Partitions are spread across different machines
     ▪ Partitions split the commit log by a given criterion
     ▪ Partitions preserve message order
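
A hedged sketch of the partitioning scheme in this slide, again with kafka-python: records produced with the same key always land in the same partition (so per-key ordering is preserved), and consumers sharing a group_id split the partitions between them. Topic name, broker address, and group names are illustrative.

```python
# Illustrative sketch: keyed records map deterministically to partitions,
# and consumers in the same group divide the partitions between them.
from kafka import KafkaProducer, KafkaConsumer

producer = KafkaProducer(bootstrap_servers="localhost:9092")
for team in ["Boca Juniors", "Borussia Dortmund", "Arsenal", "Real Madrid"]:
    # Using the team name as key: the same key is always hashed to the same
    # partition, so ordering per key is preserved (as the slide notes).
    producer.send("football-teams", key=team.encode(), value=team.encode())
producer.flush()

# Two consumers started with the same group_id ("group-1") would each be
# assigned a subset of the topic's partitions; a second group ("group-2")
# keeps its own independent offsets over the same data.
consumer = KafkaConsumer(
    "football-teams",
    bootstrap_servers="localhost:9092",
    group_id="group-1",
    auto_offset_reset="earliest",
)
for record in consumer:
    print(f"partition={record.partition} offset={record.offset} {record.value.decode()}")
```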

  19. Apache Kafka – Brokers and Cluster
     Diagram: the same topic as before, with Partition 1 hosted on Broker 1 and Partition 2 on Broker 2, both brokers forming a single Kafka cluster serving the publishers and the two consumer groups.
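
As a rough illustration of partitions being spread across brokers, the sketch below creates the topic with two partitions and replication factor 2 on an assumed two-broker cluster; broker addresses and the topic name are placeholders.

```python
# Illustrative sketch: a topic whose two partitions are distributed and
# replicated across the brokers of the cluster (addresses are placeholders).
from kafka.admin import KafkaAdminClient, NewTopic

admin = KafkaAdminClient(bootstrap_servers=["broker1:9092", "broker2:9092"])
admin.create_topics([
    NewTopic(name="football-teams", num_partitions=2, replication_factor=2)
])
admin.close()
```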

  20. Apache Kafka – Apache ZooKeeper coordination
     Diagram: the two brokers of the Kafka cluster from the previous slide, coordinated by Apache ZooKeeper.

  21. Apache Kafka commit log
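
The commit log shown here is append-only and retained on disk, so a consumer can rewind and re-read it. The sketch below (kafka-python; topic, partition number, and broker address are placeholders) seeks back to offset 0 and replays a partition.

```python
# Illustrative sketch: because Kafka retains records in an append-only commit
# log, a consumer can rewind and re-read a partition from any stored offset.
from kafka import KafkaConsumer, TopicPartition

consumer = KafkaConsumer(bootstrap_servers="localhost:9092")
partition = TopicPartition("equipment-telemetry", 0)
consumer.assign([partition])   # manual assignment instead of group subscription
consumer.seek(partition, 0)    # rewind to the first retained offset

for record in consumer:
    print(record.offset, record.value)
```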

  22. Interoperability layer – Kafka Connect
     Apache Kafka Connect – a framework for defining connectors that stream data into and out of Kafka:
     ▪ A common framework for Kafka connectors
     ▪ REST interface
     ▪ Automatic offset management
     ▪ Distributed and scalable

  23. Interoperability layer – Kafka Connect
     Diagram: a PostgreSQL DB feeds a Kafka topic through a source connector; sink connectors deliver the same topic to Amazon S3 and to Elasticsearch.
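
To make the source/sink idea concrete, the sketch below registers a hypothetical PostgreSQL source connector through Kafka Connect's REST interface. The Connect worker URL, connector class, and all config values are assumptions for illustration; the exact settings depend on the connector plugin actually installed.

```python
# Hedged sketch: registering a source connector via the Kafka Connect REST API.
# The worker address, connector class, and config keys below are illustrative.
import requests

connector = {
    "name": "postgres-source",
    "config": {
        # Confluent's JDBC source connector is assumed to be installed on the worker.
        "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
        "connection.url": "jdbc:postgresql://db-host:5432/plant",
        "mode": "incrementing",
        "incrementing.column.name": "id",
        "topic.prefix": "plant-",   # each table becomes topic plant-<table>
    },
}

resp = requests.post("http://connect-worker:8083/connectors", json=connector)
resp.raise_for_status()
print(resp.json())  # Connect echoes back the created connector definition
```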

  24. Enabling technologies Containers & Docker

  25. Infrastructure - Containers
     Diagram: virtual machines (each app with its libs/bin and a guest OS, running on a hypervisor over the host OS and physical infrastructure) compared with containers (each app with its libs/bin running on the Docker Engine directly over the host OS and physical infrastructure).

  26. Infrastructure - Containers
     ▪ Containers include an application/service together with its dependencies
     ▪ Containers share the kernel with other containers
     ▪ Containers run as isolated processes
     ▪ Higher efficiency with respect to virtualization
     ▪ Images are the cornerstone in crafting declarative/automated, easily repeatable, and scalable services and applications

  27. Infrastructure - Docker
     An open platform for distributed applications, for developers and sysadmins. Docker allows you to package an application with all of its dependencies into a standardized unit for software development.
     https://docs.docker.com/engine/

  28. What is Docker?
     Docker consists of:
     ▪ The Docker Engine – a lightweight and powerful open-source containerization technology, combined with a workflow for building and containerizing your applications
     ▪ Docker Hub – a SaaS service for sharing and managing your application stacks
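
Keeping to Python like the earlier examples, this hedged sketch drives the Docker Engine through the Docker SDK for Python (an extra assumption, not something shown in the slides): it pulls an image from Docker Hub and runs it as an isolated container.

```python
# Hedged sketch using the Docker SDK for Python (the "docker" package).
# Assumes a local Docker Engine is running; the image name is illustrative.
import docker

client = docker.from_env()                    # talk to the local Docker Engine
client.images.pull("alpine", tag="latest")    # image comes from Docker Hub

# Run a throwaway container: an isolated process sharing the host kernel.
output = client.containers.run(
    "alpine:latest",
    ["echo", "hello from an isolated container"],
    remove=True,                              # clean up the container afterwards
)
print(output.decode())
```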
