How NLP is Helping a European Financial Institution Enhance Customer Experience
Tal Doron, Director, Technology Innovation
Agenda
1. Introduction
2. Challenges
3. Use Case
4. Project Milestones
5. What's Next
ABOUT ME
Tal Doron, Director, Technology Innovation
@taldoron | taldoron84 | tald@gigaspaces.com
About GigaSpaces
We provide one of the leading in-memory computing platforms for real-time insight to action and extreme transactional processing. With GigaSpaces, enterprises can operationalize machine learning and transactional processing to gain real-time insights on their data and act upon them in the moment.
- 300+ direct customers
- 50+ Fortune 500 companies / 500+ organizations
- 5,000+ large installations in production (OEM)
- 25+ ISVs
InsightEdge is an in-memory real-time analytics platform for instant insights to action: analyzing data as it's born and enriching it with historical context, for smarter, faster decisions. It is an in-memory computing platform for microsecond-scale transactional processing, data scalability, and powerful event-driven workflows.
74% of organizations want to be data-driven, but only 23% are successful.
How Can You Gain the Most Value from Your Data?
- Near real-time data is highly valuable if you act on it in time.
- Historical + near real-time data is more valuable if you have the means to combine them.
- Time-critical business decisions vs. traditional "batch" business intelligence.
(Chart: value of data over time, declining from Preventive/Predictive to Actionable, Reactive, and Historical, across REAL-TIME, SECONDS, MINUTES, HOURS, DAYS, MONTHS.)
The Velocity of Business (once upon a time)
- ECOMMERCE: "A typical e-commerce website will experience 40% bounce if it loads in more than 3 seconds, including personalization offers."
- TELCO: "To prevent fraud, anomaly detection needs to happen against 500,000 txn/sec in less than 200 milliseconds."
- FINANCIAL SERVICES: "A call center receives 450,000 calls/min across 200 phone numbers; each call needs to be routed in less than 60 milliseconds."
ABOUT THE CUSTOMER
This financial IT service provider serves the leading banks in Germany with core solutions and services.
Business goals:
- Enhance customer experience with quicker First Call Resolution
- Reduce Average Handle Time for optimized efficiency
BUSINESS CHALLENGES
- KEEPING UP WITH EMPOWERED CUSTOMERS: Customers are smarter and have more insight into competitive products and services, raising expectations to a new standard.
- AN OMNICHANNEL EXPERIENCE: Customers want a consistent experience across all channels and agents, demanding faster resolution times.
- DISJOINTED CUSTOMER INTERACTIONS: Disparate data sources and systems led to inefficient juggling between screens and systems, poor data quality, and a poor customer experience.
TECHNICAL CHALLENGES
- HIGH PERFORMANCE: Ingestion of millions of CRM cases and data from other repositories into a unified analytics platform.
- MILLISECOND LATENCY: Customers demand an immediate response time, requiring high-performance solutions that leverage ML models in real time.
- CONTINUOUS ML TRAINING: Insights constantly need to adapt to changing conditions for the smartest insights.
PROPOSED SOLUTION
If a live agent is needed during a call, the NLP-based solution automatically supplies the agent with articles and knowledge documents based on the conversation.
(Mock agent screen) The agent sees the open ticket (Ticket ID #54367, Enterprise customer, payment type International, support level Bronze, last contact date 20.12.18, case description "International payment to supplier declined") alongside similar past cases from the data sources, ranked by confidence score:
- 95.32% — Credit limit exceeded (#56409)
- 93.05% — Authentication required (#33487)
- 86.16% — Beneficiary account unknown (#180762)
- 77.98% — Beneficiary account dormant (#180762)
- 71.53% — Intermediary bank changes (#60975)
A suggested resolution ("Check that credit limit is not exceeded") is shown, with an option to email instructions to the customer.
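The deck does not show the model internals, but the behavior above, ranking past tickets against the current case by a similarity score, can be sketched with plain TF-IDF cosine similarity. This is a minimal, pure-Python illustration, not the customer's actual model; the ticket IDs and texts below are invented for the example.

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    """Build TF-IDF vectors (dicts of token -> weight) for tokenized docs."""
    n = len(docs)
    df = Counter()
    for doc in docs:
        df.update(set(doc))
    idf = {t: math.log(n / df[t]) + 1.0 for t in df}
    vecs = [{t: tf * idf[t] for t, tf in Counter(doc).items()} for doc in docs]
    return vecs, idf

def cosine(a, b):
    """Cosine similarity between two sparse vectors."""
    dot = sum(w * b[t] for t, w in a.items() if t in b)
    na = math.sqrt(sum(w * w for w in a.values()))
    nb = math.sqrt(sum(w * w for w in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Hypothetical ticket corpus (IDs and texts are illustrative only).
tickets = {
    70534: "international payment to supplier declined credit limit exceeded",
    70874: "authentication required for international payment",
    70110: "beneficiary account unknown payment declined",
    70998: "intermediary bank changes for payment",
}
ids = list(tickets)
vecs, idf = tfidf_vectors([t.split() for t in tickets.values()])

def find_similar(query, top_n=3):
    """Rank stored tickets by cosine similarity to the query text."""
    tf = Counter(query.split())
    qv = {t: c * idf.get(t, 0.0) for t, c in tf.items()}
    scored = [(cosine(qv, v), i) for i, v in zip(ids, vecs)]
    return sorted(scored, reverse=True)[:top_n]

results = find_similar("international payment declined")
```

In production a case like this would use a trained model over millions of CRM records rather than raw TF-IDF, but the ranked (score, ticket-id) output shape matches what the agent screen displays.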
Training the model
- Model training on 2M CRM records: 27 min
- Time to results: ~50 ms
General Architecture & Data Flow
Server 1 hosts the web server, REST service, application server, integration service, platform brokers 1–3, the Find Similarities service, the Model & Tickets API, and the DB used for the initial load. Server 2 is a cluster partner used only for failover.
General Architecture & Data Flow — Step 1: Initial Load
The initial load (1) brings data from the DB into the Model & Tickets API, via Hibernate on the object store.
General Architecture & Data Flow — Step 2: Training/Building the Model
Model-training API calls (2):
- train
- stopTrainModel
- getTrainModelStatus
- checkModelInSpace
- destroyModel
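The slide names the training-control calls but not their behavior. The tiny state machine below is a single-process sketch of that lifecycle; the real calls go through the REST service and the data grid ("space"), and every internal detail here is an assumption made for illustration.

```python
class ModelTrainer:
    """In-process sketch of the train / stopTrainModel /
    getTrainModelStatus / checkModelInSpace / destroyModel lifecycle."""

    def __init__(self):
        self._models = {}  # model_id -> status string

    def train(self, model_id):
        self._models[model_id] = "TRAINING"

    def stop_train_model(self, model_id):
        if self._models.get(model_id) == "TRAINING":
            self._models[model_id] = "TRAINED"

    def get_train_model_status(self, model_id):
        return self._models.get(model_id, "UNKNOWN")

    def check_model_in_space(self, model_id):
        # In the real system this would query the space for the
        # persisted model; here it just checks the status.
        return self._models.get(model_id) == "TRAINED"

    def destroy_model(self, model_id):
        self._models.pop(model_id, None)

trainer = ModelTrainer()
trainer.train("nlp-model-1")          # "nlp-model-1" is a made-up id
trainer.stop_train_model("nlp-model-1")
in_space = trainer.check_model_in_space("nlp-model-1")
```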
General Architecture & Data Flow — Step 3: Long-Running Spark Job
API (3):
- startModel
- stopModel
- checkModelIsRunning
- getFindSimilartiesStatus
General Architecture & Data Flow — Step 4: findSimilarities
- Write a findSimilaritiesRequest object to the space using a task.
- The long-running Spark job takes the object, performs the find-similarities action, and sets the object's status to processed = true.
General Architecture & Data Flow — Step 4: findSimilarities (example)
For ticketId > 72018, calling gs.exec(modelId, "my search") returns the following similar cases:
- 70534 (0.823432215)
- 70874 (0.726937532)
- 70110 (0.719002341)
- 70998 (0.528010191)
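The request/response cycle through the space described in step 4 can be sketched in-process: a client writes a request object, a worker (standing in for the long-running Spark job) picks up unprocessed requests, attaches a result, and flips the processed flag. The "space" here is a plain dict and all names are illustrative, not the GigaSpaces API.

```python
import itertools

space = {}                 # stand-in for the data grid: request_id -> object
_ids = itertools.count(1)

def write_request(model_id, query):
    """Client side: write a findSimilaritiesRequest-like object to the space."""
    rid = next(_ids)
    space[rid] = {"model": model_id, "query": query,
                  "processed": False, "result": None}
    return rid

def worker_poll(model):
    """Worker side: take unprocessed requests, run the model, mark done."""
    for req in space.values():
        if not req["processed"]:
            req["result"] = model(req["query"])
            req["processed"] = True

# A stand-in "model" returning (ticket_id, score) pairs, like the slide.
dummy_model = lambda q: [(70534, 0.82), (70874, 0.73)]

rid = write_request("nlp-model-1", "international payment declined")
worker_poll(dummy_model)
answer = space[rid]
```

The point of the pattern is that client and worker never talk directly: the shared store is both the queue and the reply channel, which is what lets a separately deployed Spark job serve requests.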
General Architecture & Data Flow — Step 5: Support Tickets (the data)
- Incremental feed
- Delete
Unified Transactional & Analytical Processing for Operationalizing ML
A unified real-time analytics, AI & transactional processing layer sits between the various data sources and the applications, dashboards, and real-time insight-to-action consumers, built on a distributed in-memory multi-model store:
- HOT data: RAM
- WARM data: SSD storage / persistent memory
- COLD data: S3-like batch layer
AnalyticsXtreme:
- No ETL, reduced complexity
- Built-in integration with external Hadoop/data lakes
- Fast access to historical data
- Automated life-cycle management
RESULTS
- CONTINUOUS ML TRAINING: 27 minutes of background training time for 2 million records.
- REAL-TIME: 50 ms average time to search and find similar cases.
- EMPOWER THE AGENT: Agents get an immediate response, reducing mean time to resolution.
Overcoming Challenges
Step 1: Initial Load
Load 2 million records from a slow tier (the database in the batch layer) into a distributed in-memory data fabric (the multi-model store). Storage tiers: RAM (hot data), storage-class memory, SSD storage (warm data).
Distributed Multi-Model Object Store
The client routes operations across partitioned nodes with dynamic scale; each partition has a primary and a backup spread across the nodes (Node 1 primary/backup, Node 2 primary/backup, Node 3 primary/backup), backed by the DB.
Step 2: Create Model and Save to…
Submit a Spark job to read from the space and create an RDD, then create a model (or "customized model") and save it to one of:
1. Spark — the model can be lost
2. Disk — too slow, and no HA
3. Distributed data grid
Challenge #1: This is not a built-in Spark MLlib algorithm, so a workaround was needed to persist the model to the grid.
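One common shape for that workaround is to serialize the trained model yourself and store the bytes under a key in the grid, so it survives Spark restarts and gets the grid's replication. The sketch below uses a dict as a stand-in for the grid and pickle for serialization; it is an assumed pattern, not the project's actual persistence code.

```python
import pickle

grid = {}  # stand-in for the distributed data grid (key -> bytes)

def save_model(model_id, model):
    """Serialize the model and write the bytes to the grid under a key."""
    grid[model_id] = pickle.dumps(model)

def load_model(model_id):
    """Read the serialized model back from the grid and deserialize it."""
    return pickle.loads(grid[model_id])

# A trivially picklable stand-in model: token -> idf weight.
model = {"payment": 1.7, "declined": 1.3}
save_model("nlp-model-1", model)
restored = load_model("nlp-model-1")
```

Storing opaque serialized bytes keeps the grid schema-agnostic, at the cost of versioning the serialization format yourself when the model class changes.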
Step 3: Request/Response via Message Broker
A long-running Spark job consumes "find similarity" requests from a Kafka stream. Each request is run through the model to get a response (the model is loaded into Spark only once), and the response is written back to Kafka.
Challenge #2: The message broker adds too much latency.
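Request/response over a broker needs a way to match each reply to its request, typically a correlation id. The sketch below shows that pattern without a real Kafka cluster: a queue stands in for the request topic and a dict for the keyed replies. Names and structure are illustrative assumptions, not the project's Kafka code.

```python
import queue
import uuid

requests = queue.Queue()  # stands in for the Kafka request topic
responses = {}            # correlation_id -> reply (stands in for replies)

def send_request(query):
    """Producer side: publish a request tagged with a correlation id."""
    corr_id = str(uuid.uuid4())
    requests.put((corr_id, query))
    return corr_id

def consumer_step(model):
    """One iteration of the long-running job: the model is loaded once
    outside this loop; each request is run through it and the reply is
    keyed by the request's correlation id."""
    corr_id, query = requests.get()
    responses[corr_id] = model(query)

dummy_model = lambda q: [(70534, 0.82)]  # stand-in similarity model

cid = send_request("international payment declined")
consumer_step(dummy_model)
reply = responses[cid]
```

Every hop in this round trip (produce, broker, consume, produce, broker, consume) adds latency, which is exactly the challenge the slide names and why a direct in-grid request/response (step 4 of the architecture) can be faster.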