Python least squares regression: modification to the objective function


Least squares regression is defined as the minimization of the sum of squared residuals, e.g.

minimize(sum_squares(x * beta - y))

However, I'd like to propose a slight modification, so that I'm instead minimizing the following:

minimize(sum_modified_squares(x * beta - y))

where, per sample,

modified_square(x_i * beta - y_i) = 0                          if sign(x_i * beta) == sign(y_i)
modified_square(x_i * beta - y_i) = (x_i * beta - y_i)**2      otherwise

Basically, I only want to penalize a sample when the sign of the prediction doesn't match the sign of the actual y. Is there any literature on this, or existing implementations? I'm trying to implement it in CVXPY but I'm not sure how to express it.
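For concreteness, here is a minimal NumPy sketch of the proposed per-sample loss, evaluated on fixed predictions (the function name `modified_squares` is my own; note this is a plain numeric evaluation, not a CVXPY expression, since the sign-based indicator makes the loss discontinuous and hence non-convex in beta):

```python
import numpy as np

def modified_squares(pred, y):
    """Per-sample loss from the question: the squared residual where
    sign(pred) != sign(y), and zero where the signs agree."""
    mismatch = np.sign(pred) != np.sign(y)
    return np.where(mismatch, (pred - y) ** 2, 0.0)

# Toy check: loss is zero exactly where prediction and y share a sign.
pred = np.array([1.5, -0.5, 2.0, -1.0])
y = np.array([2.0, 0.5, -1.0, -3.0])
print(modified_squares(pred, y))  # -> [0. 1. 9. 0.]
```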


Comments