
Finding Structure in Time, by Jonathan Hall - PowerPoint PPT Presentation



  1. Finding Structure in Time. By Jonathan Hall. Author: Jeffrey L. Elman

  2. Contents • Problem • Algorithm • Results • Conclusion

  3. Problem • Speech research • A usual MLP takes all input at once and returns all output at once • Speech needs temporal ordering

  4. Past solutions to the problem • Usual approach: sequential representation of events – [0 1 1 1 0 0 0 0 0] – [0 0 0 1 1 1 0 0 0] • Biological implications – How to express parallel computations in terms of the human brain? • Not all input vectors are the same length – Example: translating user speech into text • Applications: prediction problems

  5. Solution to Problem • Variable-length input • Analyze batches of events • Dynamic system • Remember past events

  6. Algorithm • Solution: give the network memory with a recurrent network

  7. Recurrent Networks Overview • Has a directed cycle • Hopfield is a symmetric recurrent network • Good for: learning grammar, speech recognition, and music composition

  8. Author’s Network • Why did the author use the network on the left and not the one on the right?
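
The network described in the paper is Elman's simple recurrent network: the hidden layer feeds a set of context units that hold a copy of the previous hidden activations and feed them back as extra input on the next time step. Below is a minimal NumPy sketch of one forward step of such a network; the layer sizes, weight initialization, and sigmoid activations are illustrative assumptions, not the paper's exact settings.

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    # Hypothetical layer sizes: 6 input units, 20 hidden units, 6 output units.
    n_in, n_hidden, n_out = 6, 20, 6
    rng = np.random.default_rng(0)
    W_in = rng.normal(scale=0.1, size=(n_hidden, n_in))       # input -> hidden
    W_ctx = rng.normal(scale=0.1, size=(n_hidden, n_hidden))  # context -> hidden
    W_out = rng.normal(scale=0.1, size=(n_out, n_hidden))     # hidden -> output

    def step(x, context):
        """One time step: the hidden layer sees the current input plus the
        previous hidden state, which is stored in the context units."""
        hidden = sigmoid(W_in @ x + W_ctx @ context)
        output = sigmoid(W_out @ hidden)
        return output, hidden  # the new hidden state becomes the next context

    context = np.zeros(n_hidden)     # context units start at zero
    x = np.zeros(n_in); x[0] = 1.0   # dummy one-hot input for illustration
    y, context = step(x, context)

Because the context units carry the hidden state forward, the network can condition each prediction on earlier inputs, which is what the feed-forward MLP on slide 3 cannot do.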

  9. Results Contents • XOR • Letter Sequence • Words • Sentences

  10. XOR • Problem setup: – Input: 2 random bits, the 3rd bit is the XOR of the previous 2 bits – Goal: predict the next bit – Example input: 1 0 1 0 0 0 0 1 1 1 1 0 1 0 1 …
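
A small sketch of how the temporal XOR stream could be generated: each triple is two random bits followed by their XOR, the triples are concatenated into one long sequence, and the training pairs are (current bit, next bit). The function name and sequence length are made up for illustration.

    import numpy as np

    def xor_stream(n_triples, seed=0):
        """Concatenate n_triples of (a, b, a XOR b) into one bit sequence."""
        rng = np.random.default_rng(seed)
        bits = []
        for _ in range(n_triples):
            a, b = rng.integers(0, 2, size=2)
            bits.extend([a, b, a ^ b])
        return np.array(bits)

    stream = xor_stream(1000)
    inputs, targets = stream[:-1], stream[1:]  # target is always the next bit

Only every third bit is determined by what came before, so prediction error can be expected to drop only at those positions, which is how the XOR results are read.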

  11. XOR Results

  12. Letter Sequence • Problem setup: – Input: random consonants (b, d, g), with each consonant then replaced: b -> ba, d -> dii, g -> guuu – Each letter given a unique 1x6 vector – Goal: predict the next letter
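
The letter stream could be built as below: draw a random consonant string over {b, d, g}, expand each consonant into its syllable, and map every letter to a distinct 6-dimensional vector. The one-hot codes here are placeholders standing in for the unique 1x6 vectors the slide mentions.

    import numpy as np

    EXPAND = {"b": "ba", "d": "dii", "g": "guuu"}
    rng = np.random.default_rng(0)

    consonants = rng.choice(list("bdg"), size=200)       # random consonant string
    letters = "".join(EXPAND[c] for c in consonants)     # b -> ba, d -> dii, g -> guuu

    # Placeholder: one distinct 1x6 vector per letter (one-hot for simplicity).
    codes = {ch: vec for ch, vec in zip("bdgaiu", np.eye(6))}
    X = np.array([codes[ch] for ch in letters[:-1]])     # current letter
    Y = np.array([codes[ch] for ch in letters[1:]])      # next letter to predict

The vowels within a syllable are fully determined by the consonant that starts it, while the consonants themselves are random, so error should be expected to fall inside a syllable and rise at syllable boundaries.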

  13. Letter Sequence Results

  14. Words • Problem setup: – Input: a string of sentences with no breaks between words or sentences – Goal: predict the next letter of the sequence
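
A sketch of the training data for this task: the sentence text is concatenated without breaks, and the network is asked to predict the next letter at every position. The sample sentence is illustrative only.

    # Concatenate text with no word or sentence breaks (illustrative sample).
    text = "many years ago a boy and girl lived by the sea".replace(" ", "")

    # Training pairs: (current letter, next letter); each letter would be
    # encoded as a bit vector before being fed to the recurrent network.
    pairs = [(text[i], text[i + 1]) for i in range(len(text) - 1)]
    # e.g. ('m', 'a'), ('a', 'n'), ('n', 'y'), ...

Within a word the next letter becomes increasingly predictable, while the first letter of a new word is hard to guess, so the error curve tends to spike at word boundaries even though no boundaries appear in the input.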

  15. Words Results

  16. Sentence Setup • A large number of simple sentences were randomly produced • Each word was vectorized • No breaks in between sentences • Goal: predict the next word
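
One way to mock up this setup: generate short sentences from small word classes, concatenate them with no sentence breaks, one-hot encode each word, and use the following word as the prediction target. The word lists and the noun-verb-noun template below are hypothetical stand-ins for the lexicon and templates used in the paper.

    import numpy as np

    rng = np.random.default_rng(0)
    NOUNS = ["woman", "man", "dog", "cat", "book", "rock"]
    VERBS = ["sees", "chases", "likes", "breaks"]

    def make_sentence():
        """A simple noun-verb-noun sentence drawn at random."""
        return [rng.choice(NOUNS), rng.choice(VERBS), rng.choice(NOUNS)]

    # Many sentences concatenated into one word stream with no breaks.
    stream = [w for _ in range(500) for w in make_sentence()]

    vocab = sorted(set(stream))
    one_hot = {w: np.eye(len(vocab))[i] for i, w in enumerate(vocab)}
    X = np.array([one_hot[w] for w in stream[:-1]])   # current word
    Y = np.array([one_hot[w] for w in stream[1:]])    # next word to predict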

  17. Sentences

  18. Sentences: quality measurement • Can’t use word-by-word RSS – Error: 0.88 • Solution: use RSS for categories of words – Error: 0.053 • How did the network do this?
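
Because the next word is rarely uniquely determined by the context, scoring the output against the single word that actually occurred gives a large error even for a network that has learned the sequence statistics. One reading of the category-based measure on this slide is to score the output against a vector representing the whole class of words that could legitimately come next; the sketch below illustrates that idea with a tiny hypothetical vocabulary and is not the paper's exact procedure.

    import numpy as np

    def rss(output, target):
        """Residual sum of squares between an output vector and a target."""
        return float(np.sum((output - target) ** 2))

    # Hypothetical 4-word vocabulary: two nouns, two verbs.
    noun_idx, verb_idx = [0, 1], [2, 3]
    output = np.array([0.45, 0.45, 0.05, 0.05])   # network spreads mass over the nouns

    word_target = np.array([1.0, 0.0, 0.0, 0.0])             # the word that actually came next
    category_target = np.zeros(4); category_target[noun_idx] = 0.5  # "some noun comes next"

    print(rss(output, word_target))      # large word-by-word error (0.51 here)
    print(rss(output, category_target))  # much smaller category-level error (0.01 here)

The same output scores far better against the class-level target, which is the qualitative point behind the 0.88 versus 0.053 figures on the slide.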

  19. Sentence Classification

  20. Conclusion • Some problems look different when expressed as temporal events. • RSS can be used to analyze the temporal structure. • The length of sequential dependencies doesn’t always worsen performance. • Representation of time and memory is task-dependent. • Representations can be structured.

  21. The End
