Stochastic Automatic Differentiation
Other title: Automatic Differentiation for Monte-Carlo Simulations
Publisher: SSRN, 2018
URL: http://books.google.com.hk/books?id=CVr6zgEACAAJ&hl=&source=gbs_api
Note: In this paper we re-formulate automatic differentiation (in particular, backward automatic differentiation, also known as adjoint automatic differentiation, AAD) for random variables. While this is just a formal re-interpretation, it allows us to investigate the algorithms in the presence of stochastic operators like expectation, conditional expectation or indicator functions. We then specify the algorithms to efficiently incorporate (conditional) expectation operators without the need to differentiate an approximation of the (conditional) expectation. Under a comparably mild assumption it is possible to retain the simplicity of the backward automatic differentiation algorithm in the presence of (conditional) expectation operators. This simplifies important applications in mathematical finance, such as the application of backward automatic differentiation to the valuation of Bermudan options or the calculation of xVAs. In addition, the framework allows one to dramatically reduce the memory requirements and improve the performance of a tapeless implementation of automatic differentiation.
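The note describes backward (adjoint) automatic differentiation applied directly to random variables, with the expectation operator treated as an operator in its own right whose adjoint requires no differentiated approximation. Below is a minimal, self-contained sketch of that idea in Python/NumPy; it is not the paper's algorithm or implementation, and the names RandomVariable, TAPE and backward, as well as the toy payoff E[x·y], are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch: tape-based adjoint AD over Monte Carlo sample vectors.
TAPE = []  # operations recorded in creation order; a real implementation would scope this

class RandomVariable:
    """A discretized random variable: a vector of samples plus adjoint book-keeping."""
    def __init__(self, values, parents=()):
        self.values = np.asarray(values, dtype=float)
        self.parents = parents      # tuples of (parent_node, local_derivative)
        self.adjoint = 0.0          # accumulated dOutput/dSelf, filled by backward()
        TAPE.append(self)

    # pathwise (sample-by-sample) operators
    def __add__(self, other):
        return RandomVariable(self.values + other.values,
                              ((self, 1.0), (other, 1.0)))

    def __mul__(self, other):
        return RandomVariable(self.values * other.values,
                              ((self, other.values), (other, self.values)))

    def expectation(self):
        # The expectation operator collapses the samples to a number. Its adjoint is
        # d E[X] / d X_i = 1/n, so the backward sweep never differentiates an
        # approximation of E: it only re-weights the incoming adjoint.
        n = self.values.size
        return RandomVariable(self.values.mean(), ((self, 1.0 / n),))

def backward(output):
    """Reverse sweep over the tape, accumulating adjoints of output w.r.t. every node."""
    output.adjoint = np.ones_like(output.values)
    for node in reversed(TAPE):
        for parent, local in node.parents:
            parent.adjoint = parent.adjoint + node.adjoint * local

# Usage: sensitivity of E[x * y] with respect to the samples of x.
rng = np.random.default_rng(seed=0)
x = RandomVariable(rng.normal(1.0, 0.2, 10_000))
y = RandomVariable(rng.normal(2.0, 0.5, 10_000))
value = (x * y).expectation()
backward(value)

# d E[x*y] / d x_i = y_i / n, so the total sensitivity sums to roughly E[y] = 2.
print(x.adjoint.sum())
```

In this sketch, propagating through the expectation reduces to re-weighting the incoming adjoint by 1/n, which is what keeps the backward sweep as simple as in the purely pathwise case.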