Learning Neural Models for Continuous-Time Sequences

13 Nov 2021 · Vinayak Gupta

The large volumes of data generated by human activities such as online purchases, health records, and spatial mobility are stored as sequences of events over continuous time. Training deep learning models over such sequences is non-trivial, as it requires modeling the ever-increasing event timestamps, inter-event time gaps, event types, and the influences between events, both within and across sequences. The task is further complicated by constraints on data collection, e.g., limited data, incomplete sequences, and privacy restrictions. In the research direction described in this work, we aim to study the properties of continuous-time event sequences (CTES) and to design robust yet scalable neural network-based models that overcome these problems. We model the underlying generative distribution of events using marked temporal point processes (MTPP) to address a wide range of real-world problems. Finally, we highlight the efficacy of the proposed approaches over state-of-the-art baselines and outline ongoing research problems.
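To make the MTPP formulation concrete, below is a minimal, illustrative sketch of a neural marked temporal point process. It is not the paper's model; it loosely follows the well-known RMTPP formulation, where an RNN encodes the event history and the conditional intensity λ*(t) = exp(v·h_i + w(t − t_i) + b) admits a closed-form integral for the log-likelihood. All module and variable names (e.g., `NeuralMTPP`, `mark_head`) are assumptions for illustration only.

```python
import torch
import torch.nn as nn

class NeuralMTPP(nn.Module):
    """Minimal RMTPP-style sketch: RNN history encoder + exponential intensity."""
    def __init__(self, num_marks, hidden_dim=32):
        super().__init__()
        self.embed = nn.Embedding(num_marks, hidden_dim)
        self.rnn = nn.GRU(hidden_dim + 1, hidden_dim, batch_first=True)
        self.v = nn.Linear(hidden_dim, 1)           # maps history embedding to intensity
        self.w = nn.Parameter(torch.tensor(-0.1))   # decay of the current-time influence
        self.b = nn.Parameter(torch.tensor(0.0))
        self.mark_head = nn.Linear(hidden_dim, num_marks)

    def forward(self, times, marks):
        # times: (B, L) event timestamps; marks: (B, L) integer event types
        gaps = times[:, 1:] - times[:, :-1]                           # gap[j] = t_{j+1} - t_j
        # input at step j: mark of event j and time since the previous event (no leakage)
        prev_gaps = torch.cat([torch.zeros_like(gaps[:, :1]), gaps[:, :-1]], dim=1)
        x = torch.cat([self.embed(marks[:, :-1]), prev_gaps.unsqueeze(-1)], dim=-1)
        h, _ = self.rnn(x)                                            # h_j summarizes events up to j
        past = self.v(h).squeeze(-1) + self.b                         # v.h_j + b
        log_intensity = past + self.w * gaps                          # log lambda*(t_{j+1})
        # closed-form compensator: integral of lambda*(t) over (t_j, t_{j+1}]
        compensator = (torch.exp(past + self.w * gaps) - torch.exp(past)) / self.w
        mark_logits = self.mark_head(h)
        nll_time = -(log_intensity - compensator).sum()
        nll_mark = nn.functional.cross_entropy(
            mark_logits.reshape(-1, mark_logits.size(-1)),
            marks[:, 1:].reshape(-1), reduction='sum')
        return nll_time + nll_mark

# Toy usage: two sequences of five events each, three mark types.
times = torch.tensor([[0.0, 0.5, 1.1, 2.0, 2.3],
                      [0.0, 0.2, 0.9, 1.5, 3.0]])
marks = torch.randint(0, 3, (2, 5))
model = NeuralMTPP(num_marks=3)
loss = model(times, marks)
loss.backward()
```

The exponential intensity is chosen here purely because its compensator integrates in closed form; other parameterizations would require numerical or Monte Carlo integration of the intensity.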
