Testing Identifying Assumptions in Tobit Models
We develop sharp, testable implications for the identifying assumptions of Tobit and IV-Tobit models: linear index, (joint) normality of errors, treatment (instrument) exogeneity, and relevance. The new sharp testable equalities can detect all possible observable violations of the identifying conditions. The proposed test procedure for the model’s validity uses existing inference methods for intersection bounds. Simulations suggest adequate test size and power in detecting exogeneity and error structure violations. We review and propose alternatives to partially identify the parameters of interest under less restrictive assumptions. We revisit a study of married women’s labor supply in Lee (1995) to demonstrate the test’s practical implementation.
💡 Research Summary
This paper presents a novel methodological framework for testing the validity of the identifying assumptions in Tobit and IV-Tobit models, which are widely used in empirical economics and social sciences for analyzing censored or limited dependent variables. The authors develop sharp, testable implications that serve as necessary and sufficient conditions for the joint set of core assumptions: the linear index structure, the (joint) normality of latent errors, the exogeneity of the treatment variable (or the instrument), and the instrument relevance condition. Unlike previous tests that often target a single assumption, this approach can detect any observable violation of the model’s foundational premises.
The core theoretical contribution lies in deriving moment equalities that must hold in the population if the model is correctly specified. For the classic Tobit model with an exogenous treatment, these equalities link the conditional probability of a non-zero outcome to a linear function of the treatment via the probit link function. For the IV-Tobit model, similar conditional linearity constraints are derived involving the instrumental variable. The authors then transform these equalities into conditional moment inequalities, setting the stage for practical hypothesis testing.
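The key implication for the exogenous-treatment case can be illustrated in simulation: under a correctly specified Tobit model, the inverse-probit transform of the conditional probability of a non-zero outcome must be linear in the treatment. The sketch below (a minimal illustration, not the paper's test statistic; the binning scheme and parameter values are assumptions for demonstration) simulates Tobit data and checks this linearity by recovering the index coefficients from binned censoring probabilities.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
n = 200_000
beta0, beta1, sigma = 0.5, 1.0, 1.0  # illustrative values, not from the paper

# Latent Tobit model: Y* = b0 + b1*X + eps, eps ~ N(0, sigma^2) independent of X,
# with observed outcome Y = max(0, Y*).
x = rng.uniform(-2, 2, n)
eps = rng.normal(0.0, sigma, n)
y = np.maximum(0.0, beta0 + beta1 * x + eps)

# Testable implication: Phi^{-1}( P(Y > 0 | X = x) ) = (b0 + b1*x) / sigma,
# i.e. the inverse-probit of the non-censoring probability is linear in x.
bins = np.linspace(-2, 2, 21)
centers = 0.5 * (bins[:-1] + bins[1:])
p_pos = np.array([(y[(x >= lo) & (x < hi)] > 0).mean()
                  for lo, hi in zip(bins[:-1], bins[1:])])
z = norm.ppf(p_pos)

# A fitted line through z should have slope near b1/sigma and intercept
# near b0/sigma if the linear-index and normality assumptions hold.
slope, intercept = np.polyfit(centers, z, 1)
print(f"slope ≈ {slope:.2f}, intercept ≈ {intercept:.2f}")
```

Violations of normality or exogeneity would show up here as systematic curvature in `z` as a function of `x`, which is what the paper's moment equalities formalize.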
To implement the test, the paper addresses the challenge posed by continuous treatments and outcomes, which would lead to an infinite set of moment conditions. It proposes a discretization strategy for the support of these variables, balancing computational feasibility with testing power. The resulting finite set of moment inequalities is tested using established inference methods for intersection bounds, specifically the framework of Chernozhukov, Lee, and Rosen (2013), yielding an asymptotically valid testing procedure.
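The inequality-testing logic can be sketched as follows. This is a deliberately crude stand-in: each moment equality is split into two one-sided inequalities, and the maximum studentized sample moment is compared to a Bonferroni normal critical value. The actual procedure in the paper uses the Chernozhukov, Lee, and Rosen (2013) intersection-bounds framework, which is less conservative; the function name and the demeaning trick in the demo are assumptions for illustration.

```python
import numpy as np
from scipy.stats import norm

def intersection_bounds_test(moments, alpha=0.05):
    """Reject H0 (all population moment equalities hold) if any studentized
    sample moment is too large in either direction.

    Each equality m_j = 0 is split into the pair m_j <= 0 and -m_j <= 0;
    the Bonferroni critical value over the 2k inequalities is a crude
    stand-in for the CLR (2013) intersection-bounds critical value."""
    n, k = moments.shape
    mbar = moments.mean(axis=0)
    se = moments.std(axis=0, ddof=1) / np.sqrt(n)
    t = np.concatenate([mbar / se, -mbar / se])   # both directions
    crit = norm.ppf(1 - alpha / (2 * k))          # Bonferroni cutoff
    return bool(t.max() > crit)                   # True = reject the model

# Demo on synthetic moment functions (k = 3 equalities, n = 5000 draws).
rng = np.random.default_rng(0)
good = rng.normal(0.0, 1.0, size=(5000, 3))
good -= good.mean(axis=0)                  # impose the equalities exactly in-sample
bad = good + np.array([0.2, 0.0, 0.0])     # violate the first equality

print(intersection_bounds_test(good), intersection_bounds_test(bad))
```

In the paper's setting, the columns of `moments` would be the discretized conditional moment conditions evaluated at each observation, so the trade-off mentioned above appears directly as the choice of `k`.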
Simulation studies demonstrate that the proposed test maintains adequate size control across various sample sizes and exhibits substantial power in detecting violations of exogeneity and departures from the assumed normal error structure. Recognizing that a rejection of the model calls for alternative strategies, the paper also reviews and suggests pathways for partial identification of parameters of interest (like the Average Treatment Effect) under weaker sets of assumptions, such as imposing monotonicity in selection, thereby providing a practical roadmap for researchers when the standard Tobit framework is invalid.
Finally, the methodology is applied to revisit a classic study of married women’s labor supply by Lee (1995). This empirical illustration showcases how the test can be implemented in practice to assess the validity of a Tobit specification and guides the researcher toward more robust alternative estimation strategies if the model is rejected. In summary, this work provides empirical researchers with a comprehensive tool for validating the core assumptions of Tobit models, thereby enhancing the credibility of findings derived from these commonly used but restrictive estimators.