2019 GTC San Jose

S9251 - Sparse Attentive Backtracking: Temporal Credit Assignment Through Reminding

Session Speakers
Session Description

Learning long-term dependencies in extended temporal sequences requires credit assignment to events far in the past. The most common method for training recurrent neural networks, backpropagation through time, requires credit information to be propagated backwards through every single step of the forward computation, potentially over thousands or millions of time steps. We'll describe how this becomes computationally expensive or even infeasible when used with long sequences. Although biological brains are unlikely to perform such detailed reverse replay over very long sequences of internal states, humans are often reminded of past memories or mental states associated with their current mental state. We'll discuss the hypothesis that such memory associations between past and present could be used for credit assignment through arbitrarily long sequences, propagating the credit assigned to the current state to the associated past state.
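
The mechanism described above can be sketched as a recurrent cell that truncates ordinary backpropagation through time to a short local window, keeps a sparse set of past hidden states as memories, and at each step attends to only the top-k of those memories, letting gradients flow directly from the current state into the selected past states. The PyTorch sketch below is illustrative only; the class name, the hyperparameters (k_top, mem_every, trunc_len), and the single-layer GRU backbone are assumptions made for exposition, not details taken from the talk.

```python
# Illustrative sketch only: architecture and hyperparameters are assumptions,
# not the configuration described in the talk.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseAttentiveBacktrackingRNN(nn.Module):
    def __init__(self, input_size, hidden_size, k_top=4, mem_every=5, trunc_len=2):
        super().__init__()
        self.cell = nn.GRUCell(input_size, hidden_size)
        self.score = nn.Linear(2 * hidden_size, 1)  # scores a (current state, memory) pair
        self.k_top = k_top          # how many past memories receive credit per step
        self.mem_every = mem_every  # store a memory every few steps (sparse memory set)
        self.trunc_len = trunc_len  # length of the local BPTT window

    def forward(self, x_seq):
        # x_seq: (seq_len, batch, input_size)
        h = x_seq.new_zeros(x_seq.size(1), self.cell.hidden_size)
        memories, outputs = [], []
        for t, x_t in enumerate(x_seq):
            if t % self.trunc_len == 0:
                h = h.detach()  # truncate ordinary backprop through time
            h = self.cell(x_t, h)
            if memories:
                mem = torch.stack(memories, dim=1)                  # (batch, M, hidden)
                query = h.unsqueeze(1).expand_as(mem)
                scores = self.score(torch.cat([query, mem], dim=-1)).squeeze(-1)
                k = min(self.k_top, mem.size(1))
                top_scores, top_idx = scores.topk(k, dim=1)         # sparse selection
                weights = F.softmax(top_scores, dim=1).unsqueeze(-1)
                idx = top_idx.unsqueeze(-1).expand(-1, -1, mem.size(-1))
                selected = mem.gather(1, idx)                       # (batch, k, hidden)
                # Skip connections: gradient flows from the current state directly
                # into the few selected past memories ("reminding"), however far
                # back in the sequence they were written.
                h = h + (weights * selected).sum(dim=1)
            if t % self.mem_every == 0:
                memories.append(h)  # keep this state in the graph for later credit
            outputs.append(h)
        return torch.stack(outputs)  # (seq_len, batch, hidden)
```

Because the stored memories remain attached to the computation graph, calling backward on a loss over the outputs assigns credit both through the short local window and through the attention skip connections to the selected past states, without replaying the full sequence in reverse.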


Additional Information
AI/Deep Learning Research
Other
All technical
Talk
50 minutes
Session Schedule