Time: 2022-12-06 16:00–17:00
Implicit Generative Models (IGMs) have demonstrated a superior capacity for generating high-fidelity samples from high-dimensional spaces, especially for static image data. However, these methods struggle to capture the temporal dependence of the joint probability distributions induced by time-series data. To tackle this issue, we compare path distributions directly via the characteristic function of measures on path space (PCF) from rough path theory, which uniquely characterises the law of a stochastic process. The resulting distance metric enjoys desirable theoretical properties, including boundedness and differentiability with respect to the generator parameters. The PCF distance can also be viewed as a variant of the Maximum Mean Discrepancy (MMD) on path space, with linear time complexity in the sample size, in contrast to the quadratic-time complexity of the standard MMD. Furthermore, the PCF loss can be optimised over path distributions by learning the optimal unitary representation of the PCF, which avoids manual kernel selection and improves test power relative to the original PCF. Numerical results demonstrate that the proposed PCF-GAN consistently outperforms state-of-the-art baselines on several benchmark datasets (e.g., rough volatility data, Google stock data, and air-quality data) across various test metrics.
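To make the idea concrete, the following is a minimal NumPy sketch of a PCF-style distance between two batches of time series: each path is mapped to its unitary development (an ordered product of matrix exponentials of anti-Hermitian generators applied to the path increments), and the distance averages the squared Hilbert-Schmidt norm of the difference of mean developments. This is not the speakers' implementation; in particular, the unitary "frequencies" are sampled at random here rather than learned as the optimal unitary representation described above, and names such as `pcf_distance` and `unitary_development` are illustrative.

```python
import numpy as np
from scipy.linalg import expm


def random_anti_hermitian(d, n, rng):
    # One random n x n anti-Hermitian generator per input channel (d of them).
    A = rng.standard_normal((d, n, n)) + 1j * rng.standard_normal((d, n, n))
    return (A - A.conj().transpose(0, 2, 1)) / 2.0


def unitary_development(path, M):
    # path: (T, d) array; M: (d, n, n) anti-Hermitian generators.
    # Ordered product of exp(sum_j dx_j * M_j) over the path increments.
    n = M.shape[-1]
    U = np.eye(n, dtype=complex)
    for dx in np.diff(path, axis=0):
        U = U @ expm(np.tensordot(dx, M, axes=1))
    return U


def pcf_distance(paths_p, paths_q, n=4, n_freq=8, seed=0):
    # Empirical PCF-style distance: averaged squared Hilbert-Schmidt norm of
    # the difference between the mean unitary developments of the two batches,
    # over randomly sampled generators M (standing in for a learned representation).
    rng = np.random.default_rng(seed)
    d = paths_p.shape[-1]
    total = 0.0
    for _ in range(n_freq):
        M = random_anti_hermitian(d, n, rng)
        phi_p = np.mean([unitary_development(x, M) for x in paths_p], axis=0)
        phi_q = np.mean([unitary_development(y, M) for y in paths_q], axis=0)
        total += np.linalg.norm(phi_p - phi_q) ** 2
    return total / n_freq
```

In the PCF-GAN setting sketched in the abstract, the random sampling of M would be replaced by learning the optimal unitary representation, and the resulting distance would serve as the training loss between real and generated path distributions.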
(*Zoom Meeting ID: 919 637 6185; Passcode: 314159)