Regression to a Linear Lower Bound With Outliers: An Exponentially Modified Gaussian Noise Model

Abstract: A regression method to estimate a linear bound in the presence of outliers is discussed. An exponentially modified Gaussian (EMG) noise model is proposed, based on a maximum entropy argument. The resulting "EMG regression" method is shown to encompass the classical linear regression (with Gaussian noise) and a minimum regression (with exponential noise) as special cases. Simulations are performed to assess the consistency of the regression as well as its resilience to model mismatch. We conclude with an example taken from a real-world study of human performance in rapid aiming, with application to human-computer interaction.
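
The abstract does not spell out the estimation procedure, but a natural reading is maximum-likelihood fitting of a line y = a + b x whose residuals follow an EMG distribution: a Gaussian component of scale sigma plus a nonnegative exponential component of mean tau that accounts for outliers lying above the lower bound. The sketch below is an illustration under that assumption only, not the authors' implementation; the function name fit_emg_regression, the log-parameterisation of the scales, and the use of scipy.stats.exponnorm (whose shape parameter is K = tau/sigma) are choices made here for demonstration.

```python
import numpy as np
from scipy.stats import exponnorm
from scipy.optimize import minimize

def neg_log_likelihood(params, x, y):
    """Negative log-likelihood of y = a + b*x + EMG(sigma, tau) noise (illustrative)."""
    a, b, log_sigma, log_tau = params
    sigma, tau = np.exp(log_sigma), np.exp(log_tau)
    resid = y - (a + b * x)            # deviations above the candidate lower bound
    K = tau / sigma                    # scipy's exponnorm shape parameter
    return -np.sum(exponnorm.logpdf(resid, K, loc=0.0, scale=sigma))

def fit_emg_regression(x, y):
    """Hypothetical maximum-likelihood fit of the linear lower bound under EMG noise."""
    # Crude initialisation: ordinary least squares for the line,
    # residual spread for both scale parameters.
    b0, a0 = np.polyfit(x, y, 1)
    s0 = np.std(y - (a0 + b0 * x)) + 1e-6
    x0 = np.array([a0, b0, np.log(s0), np.log(s0)])
    res = minimize(neg_log_likelihood, x0, args=(x, y), method="Nelder-Mead")
    a, b, log_sigma, log_tau = res.x
    return a, b, np.exp(log_sigma), np.exp(log_tau)

if __name__ == "__main__":
    # Synthetic check: a known lower bound plus Gaussian and exponential noise.
    rng = np.random.default_rng(0)
    x = rng.uniform(1.0, 6.0, 200)
    y = 0.2 + 0.9 * x + rng.normal(0.0, 0.1, 200) + rng.exponential(0.3, 200)
    print(fit_emg_regression(x, y))
```

Optimising over log sigma and log tau keeps both scale parameters positive without constrained optimisation, and a gradient-free Nelder-Mead start from the ordinary least-squares line is enough for a small illustration like this one.
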

https://hal.archives-ouvertes.fr/hal-02191051
Contributor: Julien Gori
Submitted on: Tuesday, July 23, 2019 - 11:31:49 AM
Last modification on: Thursday, July 25, 2019 - 1:16:06 AM

File

ieee_approved.pdf (files produced by the author(s))

Identifiers

  • HAL Id: hal-02191051, version 1

Citation

Julien Gori, Olivier Rioul. Regression to a Linear Lower Bound With Outliers: An Exponentially Modified Gaussian Noise Model. EUSIPCO 2019, 27th European Signal Processing Conference, Sep 2019, A Coruña, Spain. ⟨hal-02191051⟩
