Learning long-term dependencies in extended temporal sequences requires credit assignment to events far in the past. The most common method for training recurrent neural networks, backpropagation through time, requires credit information to be propagated backwards through every single step of the forward computation, potentially over thousands or millions of time steps. We'll describe how this becomes computationally expensive or even infeasible for long sequences. Although biological brains are unlikely to perform such detailed reverse replay over very long sequences of internal states, humans are often reminded of past memories or mental states that are associated with their current mental state. We'll discuss the hypothesis that such memory associations between past and present could be used for credit assignment through arbitrarily long sequences, propagating the credit assigned to the current state directly to the associated past state.
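To make the contrast concrete, here is a minimal sketch (not the authors' code) comparing full backpropagation through time with a reminding-style shortcut, assuming PyTorch. In the shortcut variant the recurrent chain is detached at every step, so gradient can reach the past only through direct attention edges to a few sparsely stored ("reminded") hidden states; the `t % 50` storage rule and the dot-product association are hypothetical stand-ins for whatever memory mechanism selects and retrieves past states.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
H, T = 16, 200                       # hidden size, sequence length
cell = nn.RNNCell(1, H)              # simple tanh RNN cell
x = torch.randn(T, 1, 1)             # toy input sequence

# --- Full BPTT: gradient flows back through all T steps ------------------
h = torch.zeros(1, H)
for t in range(T):
    h = cell(x[t], h)
loss = h.pow(2).mean()
loss.backward()                      # backward pass touches every step

# --- Reminding-style shortcut: truncate the chain, keep sparse skip edges -
cell.zero_grad()
h = torch.zeros(1, H)
memory = []                          # sparsely stored past states
for t in range(T):
    h = cell(x[t], h.detach())       # break step-by-step gradient flow
    if t % 50 == 0:                  # hypothetical sparse storage rule
        memory.append(h)

mem = torch.stack(memory, dim=0)             # (num_memories, 1, H)
attn = torch.softmax((mem * h).sum(-1), 0)   # associate present with past
context = (attn.unsqueeze(-1) * mem).sum(0)  # credit flows via these edges
loss = (h + context).pow(2).mean()
loss.backward()                      # gradient reaches only remembered states
```

In the first loop the backward pass must traverse all T steps, which is what becomes expensive or infeasible for very long sequences; in the second, credit skips directly from the current state to the associated past states, regardless of how far back they occurred.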