Welcome to OGeek Q&A Community for programmer and developer-Open, Learning and Share
Welcome To Ask or Share your Answers For Others

0 votes
735 views
in Technique by (71.8m points)

Increase positive predictive value in XGBoost binary classification

I'm working on a binary classification project at an insurance company.

The dataset has 70 features (including BMI, job, etc.) and about 35,000 rows.

We accept a contract when its classification score (the predicted probability multiplied by 100) is below 1; equivalently, if the predicted probability is at or above 0.01, the contract falls into the rejection region.

My goal is to improve accuracy (to bring more people in automatically) and the positive predictive value at the same time (in this case, the positive class is 0, acceptance of the contract). I know this sounds greedy, but if there's any way to improve both, I'd like to hear any advice.
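Since PPV and sensitivity generally pull against each other, one concrete lever (besides the model itself) is the 0.01 acceptance cutoff. A minimal base-R sketch of a cutoff sweep, using made-up toy scores rather than the question's data:

```r
# Toy stand-ins for test_prediction / test_label (NOT the question's data)
prob  <- c(0.002, 0.004, 0.008, 0.009, 0.012, 0.030, 0.040, 0.200, 0.600, 0.900)
label <- c(0,     0,     0,     1,     0,     0,     1,     1,     1,     1)

# For each cutoff, treat prob < cutoff as acceptance (class 0, the positive class)
sweep_cutoff <- function(prob, label, cutoffs) {
  t(sapply(cutoffs, function(cut) {
    pred0 <- prob < cut
    tp <- sum(pred0 & label == 0)   # correctly accepted
    fp <- sum(pred0 & label == 1)   # wrongly accepted
    fn <- sum(!pred0 & label == 0)  # wrongly rejected
    c(cutoff = cut, ppv = tp / (tp + fp), sensitivity = tp / (tp + fn))
  }))
}

sweep_cutoff(prob, label, cutoffs = c(0.005, 0.01, 0.05, 0.5))
# On this toy data, PPV falls (1.00 -> 0.625) as sensitivity rises (0.40 -> 1.00)
```

Raising the cutoff accepts more contracts (higher sensitivity for class 0) but admits more bad ones (lower PPV); the cutoff choice is where that trade-off actually lives.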

So far, out of 1,259 test cases:

 xgb hyperparameters:
   eta = 0.5, gamma = 0.25, max_depth = 25, subsample = 1, colsample_bytree = 0.6
   (this will definitely overfit, but it gives me the best result in terms of positive predictive value)
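For reference, these hyperparameters map onto the R xgboost call roughly like this; it's a sketch, and `train_matrix`, `train_label`, `test_matrix`, and `nrounds = 100` are assumptions, not from the question:

```r
# Hyperparameters from the question, as an xgboost params list
params <- list(
  objective        = "binary:logistic",
  eta              = 0.5,
  gamma            = 0.25,
  max_depth        = 25,
  subsample        = 1,
  colsample_bytree = 0.6
)

# Hypothetical training call (data objects and nrounds assumed):
# model <- xgboost::xgboost(data = train_matrix, label = train_label,
#                           params = params, nrounds = 100, verbose = 0)
# test_prediction <- predict(model, test_matrix)
```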
 

 cancer_test$test_YN = as.factor(ifelse(test_prediction < 0.01, 0, 1))
 confusionMatrix(cancer_test$test_YN, test_label, positive = '0')

              reference
  prediction    0     1
           0  538    15
           1  585   121


Pos Pred Value = 0.9729
Sensitivity    = 0.4791
# (these are the two values I want to improve)
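As a sanity check, both numbers follow directly from the confusion matrix above (positive class = 0):

```r
# Counts from the confusion matrix above (positive class = 0)
tp <- 538  # predicted 0, actually 0
fp <- 15   # predicted 0, actually 1
fn <- 585  # predicted 1, actually 0

ppv         <- tp / (tp + fp)  # 538 / 553
sensitivity <- tp / (tp + fn)  # 538 / 1123

round(c(ppv = ppv, sensitivity = sensitivity), 4)
# ppv = 0.9729, sensitivity = 0.4791
```

Improving both at once means cutting into the 585 false rejections without adding false acceptances; moving the cutoff alone only trades one metric for the other, so that requires a model whose scores separate the classes better.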


1 Reply

0 votes
by (71.8m points)
Waiting for an expert to reply.
