Threshold Boundary Logistic Regression for Binary Data
- 2025-10-20 (Mon.), 10:30 AM
- Auditorium B1, Institute of Statistical Science; tea reception at 10:10 AM.
- The talk will be held in person with simultaneous online streaming.
- Prof. Chih-Hao Chang (張志浩), Associate Professor
- Department of Statistics, National Chengchi University
Abstract
This talk introduces Threshold Boundary Logistic Regression (TBLR) for binary outcomes, which links covariates to both the logistic component and a linear or nonlinear threshold boundary that partitions the feature space into regions governed by distinct logistic models. We develop an iterative two-stage sample-splitting estimator that recasts the non-differentiable likelihood maximization as a tractable optimization problem: threshold parameters are obtained by minimizing a weighted classification loss, while logistic parameters are updated by likelihood maximization. Under suitable conditions, we establish consistency, oracle-optimal convergence rates, and asymptotic normality. Computation relies on two solvers, Mixed-Integer Programming (MIP) and the Weighted Support Vector Machine (WSVM): for linear boundaries, we warm-start the MIP with a WSVM solution, which improves estimation accuracy at the cost of additional runtime, whereas nonlinear boundaries are solved by WSVM only. Simulations and real-data analyses illustrate finite-sample behavior and computational feasibility in nonlinear regimes. We further outline an extension to count responses, Threshold Boundary Poisson Regression (TBPR), which adapts the two-stage scheme to the Poisson likelihood and log link, and we present preliminary empirical analyses demonstrating its applicability and the modeling workflow for count data.
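To make the alternating scheme concrete, the following is a minimal sketch of the two-stage iteration, assuming scikit-learn's LogisticRegression stands in for the likelihood-maximization step and a weighted SVC stands in for the WSVM solver. The random initialization and the likelihood-discrepancy weights are illustrative assumptions, not the authors' exact estimator, and the MIP warm-start for linear boundaries is omitted.

```python
# Hedged sketch of an iterative two-stage TBLR-style fit (assumptions noted above).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC


def bernoulli_loglik(model, X, y):
    """Per-observation Bernoulli log-likelihood under a fitted logistic model."""
    p = np.clip(model.predict_proba(X)[:, 1], 1e-12, 1 - 1e-12)
    return y * np.log(p) + (1 - y) * np.log(1 - p)


def fit_tblr_sketch(X, y, n_iter=20, kernel="linear", seed=0):
    rng = np.random.default_rng(seed)
    z = rng.integers(0, 2, size=len(y))    # initial partition (illustrative assumption)
    boundary, m0, m1 = None, None, None
    for _ in range(n_iter):
        if z.min() == z.max():             # degenerate partition: stop
            break
        if y[z == 0].min() == y[z == 0].max() or y[z == 1].min() == y[z == 1].max():
            break                          # each region needs both outcome classes
        # Stage 1: maximize the likelihood separately on each side of the boundary.
        m0 = LogisticRegression().fit(X[z == 0], y[z == 0])
        m1 = LogisticRegression().fit(X[z == 1], y[z == 1])
        # Stage 2: estimate the boundary via a weighted classification fit.
        # Each point is labeled by the better-fitting logistic model and weighted
        # by how strongly the two likelihoods disagree (one plausible weighting).
        ll0, ll1 = bernoulli_loglik(m0, X, y), bernoulli_loglik(m1, X, y)
        labels = (ll1 > ll0).astype(int)
        if labels.min() == labels.max():
            break
        boundary = SVC(kernel=kernel, C=1.0)
        boundary.fit(X, labels, sample_weight=np.abs(ll1 - ll0) + 1e-8)
        z_new = boundary.predict(X)
        if np.array_equal(z_new, z):       # partition has stabilized
            break
        z = z_new
    return boundary, m0, m1, z
```

With a linear kernel this loosely mirrors the linear-boundary case, where the talk additionally warm-starts a MIP solve from the WSVM solution; switching to a nonlinear kernel such as "rbf" illustrates the nonlinear-boundary regime that the abstract describes as solved by WSVM only.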
Keywords: Maximum likelihood (ML), mixed integer programming (MIP), two-stage sample-splitting, weighted support vector machine (WSVM).