Seminar

Inference on derivatives of high dimensional regression function with deep neural network


  • Location
    Erasmus University Rotterdam, E building, room ET-14
    Rotterdam
  • Date and time
    April 04, 2024, 12:00 - 13:00

Abstract

We study the estimation of the partial derivatives of nonparametric regression functions with many predictors, with a view to conducting a significance test for these derivatives. Our derivative estimator is based on the convolution of a regression function estimator and the derivative of a smoothing kernel, where the regression function estimator is a deep neural network whose architecture is allowed to grow with the sample size.
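In notation introduced here rather than taken from the talk (writing \hat f_{\mathrm{NN}} for the fitted network and K_h(u) = h^{-d} K(u/h) for a smoothing kernel with bandwidth h), such a convolution-based estimator of the j-th partial derivative can be sketched as

\[
\widehat{\partial_j f}(x) \;=\; \bigl(\hat f_{\mathrm{NN}} * \partial_j K_h\bigr)(x) \;=\; \int \hat f_{\mathrm{NN}}(u)\,\partial_j K_h(x - u)\,\mathrm{d}u,
\]

so that the differentiation falls on the smooth kernel rather than on the network output, whose pointwise derivatives (for example with ReLU activations) can be poorly behaved.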

We demonstrate that in the context of modelling with neural networks, derivative estimation is in fact quite different from estimating the regression function itself, and hence the smoothing operation becomes beneficial and even necessary. Our subsequent test is based on the moment generating function of the aforementioned derivative estimator. This test finds applications in model specification and variable screening for high-dimensional data.
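As a purely illustrative sketch of such a smoothed derivative estimator (not the speaker's implementation; the network architecture, the Gaussian product kernel, the bandwidth h, and the Monte Carlo size are assumptions made here for concreteness), the convolution can be approximated by Monte Carlo: with a standard Gaussian kernel the integral above equals E[\hat f(x + hZ) Z_j] / h for Z ~ N(0, I_d).

# Minimal sketch: convolution-smoothed partial derivative of a neural-network
# regression estimate. All modelling choices below are illustrative assumptions.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Simulated data: y = sin(x_1) + 0.5 * x_2 + noise, with d = 5 predictors.
n, d = 2000, 5
X = rng.uniform(-1, 1, size=(n, d))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] + 0.1 * rng.standard_normal(n)

# Step 1: neural-network estimate of the regression function.
f_hat = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000,
                     random_state=0).fit(X, y)

def smoothed_partial(x, j, h=0.2, m=20000):
    """Monte Carlo approximation of (f_hat * d K_h / d x_j)(x) with a Gaussian
    product kernel: E[ f_hat(x + h Z) Z_j ] / h, where Z ~ N(0, I_d)."""
    Z = rng.standard_normal((m, d))
    preds = f_hat.predict(x[None, :] + h * Z)
    return np.mean(preds * Z[:, j]) / h

x0 = np.zeros(d)
print("estimated d f / d x_1 at 0:", smoothed_partial(x0, j=0))  # true value cos(0) = 1
print("estimated d f / d x_3 at 0:", smoothed_partial(x0, j=2))  # true value 0

The smoothing bandwidth h trades off bias from the kernel convolution against the variance of the derivative estimate; the actual test statistic built on the moment generating function of this estimator is not reproduced here.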

To render our test effective when the predictors have high or even diverging dimension, we assume, first, that the observed high-dimensional predictors can effectively serve as proxies for certain latent, lower-dimensional factors and, second, that only the latent factors and a subset of the coordinates of the observed predictors drive the regression function. Moreover, we finely adjust the regression function estimator, enabling us to achieve the desired asymptotic normality under the null hypothesis that the partial derivative is zero, as well as consistency against fixed and certain local alternatives.
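These two assumptions can be sketched, again in notation introduced here rather than taken from the talk, as an approximate factor structure for the observed predictors X combined with a low-dimensional regression:

\[
X = B F + u, \qquad \mathbb{E}[Y \mid X] = g\bigl(F, X_J\bigr),
\]

where F is a latent factor vector of fixed or slowly growing dimension and J is a small subset of the coordinates of X.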

We demonstrate the excellent performance of our test in simulation studies and present two applications that highlight the robustness and effectiveness of our inference methods.

Registration

You can sign up for this seminar by sending an email to eb-secr@ese.eur.nl.