python - least squares regression modification to objective function


Least squares regression is defined as the minimization of the sum of squared residuals, e.g.

minimize(sum_squares(x * beta - y)) 

However, I'd propose a slight modification, so that I'm still minimizing the following:

minimize(sum_modified_squares(x * beta - y))

where

    sum_modified_squares(x * beta - y) = 0                            if sign(x * beta) == sign(y)
    sum_modified_squares(x * beta - y) = sum_squares(x * beta - y)    otherwise

Basically, I only want to penalize a residual when the sign of the prediction does not equal the sign of the actual y. Is there any literature on this, or any existing implementations? I'm trying to implement it in cvxpy but am not sure how.
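One point worth noting: because the penalty switches off exactly on the set where the signs agree, and that set is not convex, this objective is non-convex, so cvxpy's DCP rules would reject a literal translation of it. As a rough sketch, one could instead minimize the exact objective with a derivative-free solver such as `scipy.optimize.minimize`; the synthetic `x`, `y`, and the helper name `sign_penalized_loss` below are illustrative, not from any library:

```python
import numpy as np
from scipy.optimize import minimize

# Synthetic data purely for illustration.
rng = np.random.default_rng(0)
n, p = 100, 3
x = rng.normal(size=(n, p))
beta_true = np.array([1.0, -2.0, 0.5])
y = x @ beta_true + 0.1 * rng.normal(size=n)

def sign_penalized_loss(beta, x, y):
    # Squared residuals, counted only for observations where the
    # sign of the prediction disagrees with the sign of the actual y.
    pred = x @ beta
    mismatch = np.sign(pred) != np.sign(y)
    r = pred - y
    return np.sum((r ** 2)[mismatch])

# Nelder-Mead is derivative-free, which matters here because the
# objective is discontinuous wherever a prediction crosses zero.
res = minimize(sign_penalized_loss, x0=np.zeros(p),
               args=(x, y), method="Nelder-Mead")
beta_hat = res.x
```

If a cvxpy formulation is required, one common compromise is a convex hinge-style surrogate that penalizes `sign(y) * prediction` only when it is negative; that keeps the "punish sign mismatches" spirit but changes the loss from the full squared residual to the distance past zero, so it is an approximation rather than the objective above.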


