Does surv.ranger not support the max.depth parameter? #427

Closed
invain1218 opened this issue Dec 4, 2024 · 3 comments

@invain1218 commented Dec 4, 2024

I can find these parameters in the learner's parameter set:

learner = lrn("surv.ranger", max.depth = 10)
learner$help()
learner$param_set

<ParamSet(29)>
id class lower upper nlevels default value

1: alpha ParamDbl -Inf Inf Inf 0.5
2: always.split.variables ParamUty NA NA Inf <NoDefault[0]>
3: holdout ParamLgl NA NA 2 FALSE
4: importance ParamFct NA NA 4 <NoDefault[0]>
5: keep.inbag ParamLgl NA NA 2 FALSE
6: max.depth ParamInt 0 Inf Inf 10
7: min.node.size ParamInt 1 Inf Inf 5
8: minprop ParamDbl -Inf Inf Inf 0.1
9: mtry ParamInt 1 Inf Inf <NoDefault[0]>
10: mtry.ratio ParamDbl 0 1 Inf <NoDefault[0]>
11: num.random.splits ParamInt 1 Inf Inf 1
12: num.threads ParamInt 1 Inf Inf 1 1
13: num.trees ParamInt 1 Inf Inf 500
14: oob.error ParamLgl NA NA 2 TRUE
15: regularization.factor ParamUty NA NA Inf 1
16: regularization.usedepth ParamLgl NA NA 2 FALSE
17: replace ParamLgl NA NA 2 TRUE
18: respect.unordered.factors ParamFct NA NA 3 ignore
19: sample.fraction ParamDbl 0 1 Inf <NoDefault[0]>
20: save.memory ParamLgl NA NA 2 FALSE
21: scale.permutation.importance ParamLgl NA NA 2 FALSE
22: seed ParamInt -Inf Inf Inf
23: split.select.weights ParamDbl 0 1 Inf <NoDefault[0]>
24: splitrule ParamFct NA NA 4 logrank
25: verbose ParamLgl NA NA 2 TRUE
26: write.forest ParamLgl NA NA 2 TRUE
27: min.bucket ParamInt -Inf Inf Inf 3
28: time.interest ParamInt 1 Inf Inf
29: node.stats ParamLgl NA NA 2 FALSE
id class lower upper nlevels default value

But when I use AutoTuner, this error comes up:

sp = ps(
  surv.ranger.max.depth = p_int(lower = 10, upper = 12),
  surv.ranger.min.bucket = p_int(lower = 8, upper = 10),
  surv.ranger.min.node.size = p_int(lower = 10, upper = 12),
  surv.ranger.num.trees = p_fct(c(10, 20)),
  surv.ranger.mtry = p_int(lower = 4, upper = 6)
)
auto = AutoTuner$new(
  learner = learner,
  search_space = sp,
  resampling = rsmp("cv", folds = 3),
  measure = msr("surv.cindex"),
  terminator = trm("stagnation", threshold = 0.001),
  tuner = tnr("grid_search")
)
auto$train(task)

....
INFO [09:19:18.567] [bbotk] surv.ranger.max.depth surv.ranger.min.bucket surv.ranger.min.node.size surv.ranger.num.trees surv.ranger.mtry
INFO [09:19:18.567] [bbotk]
INFO [09:19:18.567] [bbotk] 11 9 11 20 6
INFO [09:19:18.567] [bbotk] learner_param_vals x_domain surv.cindex
INFO [09:19:18.567] [bbotk]
INFO [09:19:18.567] [bbotk] <list[7]> <list[5]> 0.6285598
Error in self$assert(xs, sanitize = TRUE) :
Assertion on 'xs' failed: Parameter 'surv.ranger.max.depth' not available. Did you mean 'alpha' / 'always.split.variables' / 'holdout'?.
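
A minimal check (assuming lrn("surv.ranger") is provided by mlr3extralearners): the plain learner does list max.depth, just without the surv.ranger. prefix.

library(mlr3proba)          # survival task type and measures
library(mlr3extralearners)  # assumed source of lrn("surv.ranger")

learner = lrn("surv.ranger", max.depth = 10)
"max.depth" %in% learner$param_set$ids()              # TRUE
"surv.ranger.max.depth" %in% learner$param_set$ids()  # FALSE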

Thanks for the help! 😊

@invain1218 (Author) commented Dec 4, 2024

I added the "surv.ranger" prefix because the learner is a GraphLearner; it contains other PipeOps in its pipeline.

sp = ps(
  max.depth = p_int(lower = 10, upper = 12),
  min.bucket = p_int(lower = 8, upper = 10),
  min.node.size = p_int(lower = 10, upper = 12),
  num.trees = p_fct(c(10, 20)),
  mtry = p_int(lower = 1, upper = 2)
)
auto = AutoTuner$new(
  learner = lrn("surv.ranger"),
  search_space = sp,
  resampling = rsmp("cv", folds = 3),
  measure = msr("surv.cindex"),
  terminator = trm("stagnation", threshold = 0.001),
  tuner = tnr("grid_search")
)
auto$train(task)

This works.
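
For comparison, a minimal sketch of the GraphLearner case (po("scale") here is only a placeholder for the real preprocessing steps): wrapping the learner in a graph prefixes every parameter id with the PipeOp id, which is when the surv.ranger.*-prefixed names from my first post are needed.

library(mlr3verse)          # mlr3, mlr3pipelines, mlr3tuning, paradox
library(mlr3proba)
library(mlr3extralearners)  # assumed source of lrn("surv.ranger")

# wrap the ranger learner in a small pipeline -> GraphLearner
glrn = as_learner(po("scale") %>>% lrn("surv.ranger"))

# parameter ids are now prefixed with the PipeOp id "surv.ranger"
grep("max.depth", glrn$param_set$ids(), value = TRUE, fixed = TRUE)
# expected: [1] "surv.ranger.max.depth"

# so the search space must use the prefixed names
sp = ps(
  surv.ranger.max.depth     = p_int(lower = 10, upper = 12),
  surv.ranger.min.node.size = p_int(lower = 10, upper = 12)
)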

@bblodfon (Collaborator) commented Dec 4, 2024

Hi! I can't reproduce this and didn't quite understand the difference between the two posts.

Can you please copy your code, run reprex::reprex(), and paste the result into the post above? Please add the library versions as well with devtools::session_info(). Then add specific questions/comments on the code, i.e. what works, what doesn't, where exactly you don't understand what is happening, and what you need that isn't there. All of that would help me help you.
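
A sketch of that workflow, assuming the reprex and devtools packages are installed: copy the failing code to the clipboard, then run

reprex::reprex()           # renders the clipboard code together with its output
devtools::session_info()   # paste this output below the reprex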

@invain1218 (Author) commented Dec 4, 2024

I restarted RStudio and the error disappeared. There might have been a conflict between packages, but I'm not sure which package caused the issue.

bblodfon closed this as completed Dec 4, 2024