python - Least squares regression: modification to the objective function -


Least squares regression is defined as the minimization of the sum of squared residuals, e.g.

minimize(sum_squares(x * beta - y))

However, I'd like to propose a slight modification, so that I am instead minimizing the following:

minimize(sum_modified_squares(x * beta - y))

where

sum_modified_squares(x * beta - y) = 0                          if sign(x * beta) == sign(y)
sum_modified_squares(x * beta - y) = sum_squares(x * beta - y)  otherwise

Basically, I only want to penalize a residual when the sign of the prediction does not equal the sign of the actual y. Is there any literature on this, or any existing implementations? I'm trying to implement it in CVXPY, but I'm not sure how.
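For reference, a minimal NumPy/SciPy sketch of the proposed loss (not a CVXPY formulation): because the penalty mask depends on the sign of the prediction, which itself depends on beta, this objective is non-convex, so the sketch below uses a general-purpose optimizer instead. The design matrix X, the toy coefficients, and the choice of `scipy.optimize.minimize` with the Powell method are all illustrative assumptions, not part of the original question.

```python
import numpy as np
from scipy.optimize import minimize

def modified_squares(beta, X, y):
    """Sum of squared residuals, counted only where the sign of the
    prediction X @ beta disagrees with the sign of the actual y."""
    pred = X @ beta
    mask = np.sign(pred) != np.sign(y)
    return np.sum((pred[mask] - y[mask]) ** 2)

# toy data (hypothetical): 100 samples, 3 features, known coefficients plus noise
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=100)

# the masked objective is non-convex and non-smooth at the sign boundary,
# so use a derivative-free method rather than a convex solver like CVXPY
res = minimize(modified_squares, x0=np.zeros(3), args=(X, y), method="Powell")
print(res.x, modified_squares(res.x, X, y))
```

Note that at `beta = 0` every residual is penalized (since `sign(0) != sign(y)` for nonzero y), so the optimizer starts from the fully penalized case and drives the masked loss down from there.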

