Conditional sampling #13

Merged on Jan 23, 2024 (171 commits)

Commits
b4bd531
new generator
dswatson Jun 7, 2023
7f12ad7
proper prior
dswatson Jun 9, 2023
c27446d
tibbles
dswatson Jun 9, 2023
70d8bba
circuits
dswatson Jun 9, 2023
99d6eb3
tibbles too
dswatson Jun 9, 2023
c6ae3aa
no need for renaming
dswatson Jun 11, 2023
1db3324
nvm
dswatson Jun 11, 2023
772c765
vectorize logs
dswatson Jun 11, 2023
5cd9bfa
nullify
dswatson Jun 11, 2023
ed90b40
fifelse
dswatson Jun 28, 2023
d5a6c80
mid > med
dswatson Jun 28, 2023
724c810
ptruncnorm for CDFs
dswatson Jun 28, 2023
ceb13ff
leaf_poster util
dswatson Jun 28, 2023
70e9c62
leaf posterior util
dswatson Jun 28, 2023
0485499
conditional forging
dswatson Jun 28, 2023
95d33ee
impose constraints
dswatson Jun 29, 2023
0c3c6c1
vectorize cnt, parallelize cat
dswatson Jun 29, 2023
cc2ca0c
par
dswatson Jun 29, 2023
9cacf55
warnings
dswatson Jun 29, 2023
2e287fb
dumb error
dswatson Jun 29, 2023
4948463
partial
dswatson Jun 30, 2023
b24f2cc
pc > params, dox
dswatson Jun 30, 2023
72368ab
dox
dswatson Jun 30, 2023
9c4da01
value != min
dswatson Jun 30, 2023
8a0d7bf
cforde Jan
jkapar Jul 1, 2023
f7a3c3b
modify comments cforde
jkapar Jul 2, 2023
4ff8826
Minor changes, typos
jkapar Jul 2, 2023
f5c9cb1
example
jkapar Jul 2, 2023
d36f9f3
cvg can be 1/n
dswatson Jul 8, 2023
2aa8a7c
no zero-cvg leaves
dswatson Jul 8, 2023
367f0b5
dropmin
dswatson Jul 8, 2023
add6c56
sigma0 fix
dswatson Jul 8, 2023
3b90ae1
un-col_rename
dswatson Jul 8, 2023
fa9e1a2
relations > operators
dswatson Jul 10, 2023
0b7dead
optional conditioning
dswatson Jul 10, 2023
53ab051
relation > operator
dswatson Jul 10, 2023
1b670d0
default alpha/epsilon
dswatson Jul 10, 2023
ef68599
Merge branch 'tweaks' of https://github.com/bips-hb/arf into tweaks
dswatson Jul 10, 2023
766f3ec
lik args
dswatson Jul 10, 2023
630c2d8
vignette touchup
dswatson Jul 10, 2023
45d371a
omega_tmp
dswatson Jul 10, 2023
e964a4c
avoid data.table checks
dswatson Jul 10, 2023
c66eef7
cleanups
dswatson Jul 10, 2023
2c49bcc
eg touchup
dswatson Jul 10, 2023
f8696b8
lik test
dswatson Jul 10, 2023
bd56733
floating point issue
dswatson Jul 10, 2023
cb018c4
clean up code
jkapar Jul 10, 2023
2addd6b
Fix bug with zero-prob volumes
jkapar Jul 13, 2023
77234b1
handling zero-prob condition due to cat value
jkapar Jul 13, 2023
26521de
Fix typo
jkapar Jul 14, 2023
2b07b42
fix typo
jkapar Jul 14, 2023
54f046f
prep_evi
dswatson Jul 22, 2023
7ebf9b7
no more lse
dswatson Jul 24, 2023
46e50d0
prep_x
dswatson Jul 24, 2023
ea33d01
preps
dswatson Jul 24, 2023
e199791
categorical boost
dswatson Jul 24, 2023
f2dfdf6
enter map
dswatson Jul 24, 2023
57a089a
x_real fix
dswatson Jul 24, 2023
5faed30
leaves fix
dswatson Jul 24, 2023
55d917e
all.equal
dswatson Jul 24, 2023
beb3ca2
omega_tmp for mixed case
dswatson Jul 24, 2023
2092562
data.table check
dswatson Jul 24, 2023
00ae9a8
conj cleanup
dswatson Jul 24, 2023
f30daff
conj cleanup
dswatson Jul 24, 2023
d436621
no matrixstats
dswatson Jul 24, 2023
6a618e3
evi fix
dswatson Jul 24, 2023
8c3b957
pc > params
dswatson Jul 24, 2023
1938d98
query doc
dswatson Jul 26, 2023
b84bd93
prep_evi
dswatson Jul 26, 2023
d4f80ea
sw
dswatson Jul 26, 2023
3cf251f
warning reshuffle
dswatson Jul 27, 2023
6e5cf8a
warnings
dswatson Jul 27, 2023
6fd5d72
cforde: some accel. & bug fix
jkapar Jul 27, 2023
1546246
post_x
dswatson Jul 27, 2023
e100f59
Merge branch 'tweaks' of https://github.com/bips-hb/arf into tweaks
dswatson Jul 27, 2023
27fb383
Update cforde.R
jkapar Jul 27, 2023
f16da29
fctr fix
dswatson Jul 27, 2023
8a7166a
imports
dswatson Jul 27, 2023
e3f27af
no idx_character
dswatson Jul 27, 2023
e44a379
vignette touchup
dswatson Jul 27, 2023
84e6064
matrix colnames
dswatson Jul 27, 2023
ef6321c
data.table > data.frame
dswatson Jul 27, 2023
6284a97
setDT
dswatson Jul 27, 2023
78d3464
junk names
dswatson Jul 27, 2023
dab6dcd
prep_x returns data.frame
dswatson Jul 27, 2023
82d1d8b
revert
dswatson Jul 27, 2023
a1337ac
colnames mgmt
dswatson Jul 27, 2023
03b349e
partial setnames
dswatson Jul 27, 2023
2891d54
colorderfix, variable.factor = FALSE
dswatson Jul 28, 2023
f76a053
cforge touchup
dswatson Jul 28, 2023
87a96a5
to_sim upfront
dswatson Jul 28, 2023
be42a43
params > pc
dswatson Jul 29, 2023
c57b157
no par
dswatson Jul 31, 2023
06d201b
no par
dswatson Jul 31, 2023
9eff704
leaf_posterior touchup
dswatson Jul 31, 2023
a00cc1d
sloppy checks
dswatson Jul 31, 2023
fcf3c5c
leaf_post touchup
dswatson Jul 31, 2023
3959823
no par
dswatson Jul 31, 2023
187968c
null queries
dswatson Jul 31, 2023
469974e
MAP too
dswatson Jul 31, 2023
7e71172
no par
dswatson Aug 1, 2023
198fa95
bound tightening
dswatson Aug 1, 2023
5d4d73b
forge no par
dswatson Aug 1, 2023
82eab01
speed up cforde
jkapar Aug 3, 2023
9524b2b
accelerations
jkapar Aug 12, 2023
32e8bfb
small code clean-up
jkapar Aug 15, 2023
e671e51
small code clean-up
jkapar Aug 15, 2023
55c7612
small code clean-up
jkapar Aug 15, 2023
4baf4f0
faster resampler
dswatson Aug 15, 2023
2be0685
optional
dswatson Aug 15, 2023
80fa483
lvls
dswatson Aug 15, 2023
db7ac28
forde handling NAs
jkapar Aug 24, 2023
ea687e7
forde handling NAs
jkapar Aug 24, 2023
7d8b47a
fix by-ref-bug in forge
jkapar Aug 24, 2023
ce816a1
optional pruning
dswatson Aug 30, 2023
eabdd5e
steez
dswatson Aug 30, 2023
ab69529
colnames fix
dswatson Aug 30, 2023
2710517
Merge branch 'tweaks' of https://github.com/bips-hb/arf into tweaks
dswatson Aug 30, 2023
8077c7e
factor_cols fix
dswatson Aug 30, 2023
83579ea
expct
dswatson Sep 5, 2023
74b15c2
cvg speed boost
dswatson Sep 6, 2023
240dab8
singletons
dswatson Sep 7, 2023
1949858
finite_bounds for forde
dswatson Sep 8, 2023
fed5c03
style
dswatson Sep 9, 2023
8c920c3
just boots
dswatson Sep 9, 2023
30ed7fe
genfix
dswatson Sep 9, 2023
05e82b5
accelerations
jkapar Sep 11, 2023
0118cb9
style pts, col renaming
dswatson Sep 11, 2023
dfb0049
consistent pred->dt
dswatson Sep 12, 2023
300f26c
grepl bug
dswatson Sep 12, 2023
639933f
back compatibility
dswatson Sep 15, 2023
661b6cc
hyperparameters
dswatson Sep 15, 2023
ce64b75
globalz
dswatson Sep 15, 2023
53a38e3
interval fix
dswatson Nov 3, 2023
a6498f4
lp redundancy fix
dswatson Nov 3, 2023
ffa47c6
fctrs as numeric for resampling
dswatson Nov 8, 2023
8bcf593
check for inf values
dswatson Nov 9, 2023
e0a3fe8
match levels, precision
dswatson Nov 11, 2023
1c2134e
intgr
dswatson Nov 11, 2023
388a4c6
fix psi list
dswatson Nov 11, 2023
941daf7
unnecessary setDFs
dswatson Nov 11, 2023
9647aad
prep_x bug, class>type
dswatson Nov 11, 2023
f25f26e
precision fix
dswatson Nov 13, 2023
74b4661
line switch preserves n
dswatson Nov 28, 2023
a813e00
remove Jan's version of conditional sampling in this branch
mnwright Dec 1, 2023
d45cf14
missing values
dswatson Dec 13, 2023
1238e12
shuffle factor_cols
dswatson Dec 13, 2023
51dfe5a
levels
dswatson Dec 13, 2023
37af635
rbindlist
dswatson Dec 14, 2023
e5ead7c
na.rm bug patch
dswatson Dec 14, 2023
42aef32
decimals > precision
dswatson Dec 14, 2023
6279224
add a test for alpha>0
mnwright Jan 5, 2024
bc7bdfe
fix level/levels typo
mnwright Jan 5, 2024
ac0bf8c
don't parallelize in tests
mnwright Jan 5, 2024
30116f1
no grid
dswatson Jan 7, 2024
f0ef208
infinite weights
dswatson Jan 7, 2024
c2d7f27
add a meaningful error message for the zero-likelihood case
mnwright Jan 17, 2024
fa3dd7e
sample() does not work for vector with length 1
mnwright Jan 17, 2024
69c1905
geom. mean solution for inf. posteriors
jkapar Jan 18, 2024
fd8f36a
geom. mean solution for inf. posteriors
jkapar Jan 18, 2024
b784680
random leaf if all zero + warning
mnwright Jan 19, 2024
32b8e7c
more robust tests
mnwright Jan 19, 2024
6257af2
integers to numeric if levels > 5
dswatson Jan 21, 2024
71da719
integers to factors or numeric
dswatson Jan 21, 2024
534705a
expct, no map
dswatson Jan 21, 2024
7d8d2a6
v0.20
dswatson Jan 21, 2024
cba4143
headlines
dswatson Jan 21, 2024
cc39a7e
conditioning
dswatson Jan 21, 2024
1fe4400
no map
dswatson Jan 21, 2024
76f19c1
fctr bug
dswatson Jan 21, 2024
bc483cc
drop map cross reference
dswatson Jan 21, 2024
6 changes: 3 additions & 3 deletions DESCRIPTION
@@ -1,6 +1,6 @@
Package: arf
Title: Adversarial Random Forests
Version: 0.1.3
Version: 0.2.0
Date: 2023-02-06
Authors@R:
c(person(given = "Marvin N.",
@@ -34,10 +34,10 @@ Imports:
ranger,
foreach,
truncnorm,
matrixStats
tibble
Encoding: UTF-8
Roxygen: list(markdown = TRUE)
RoxygenNote: 7.2.3
RoxygenNote: 7.3.0
Suggests:
ggplot2,
doParallel,
5 changes: 4 additions & 1 deletion NAMESPACE
@@ -1,6 +1,7 @@
# Generated by roxygen2: do not edit by hand

export(adversarial_rf)
export(expct)
export(forde)
export(forge)
export(lik)
@@ -9,8 +10,10 @@ import(ranger)
importFrom(foreach,"%do%")
importFrom(foreach,"%dopar%")
importFrom(foreach,foreach)
importFrom(matrixStats,logSumExp)
importFrom(stats,predict)
importFrom(stats,runif)
importFrom(tibble,as_tibble)
importFrom(truncnorm,dtruncnorm)
importFrom(truncnorm,etruncnorm)
importFrom(truncnorm,ptruncnorm)
importFrom(truncnorm,rtruncnorm)
12 changes: 7 additions & 5 deletions NEWS.md
@@ -1,5 +1,7 @@
# arf 0.1.3
* Speed boost for the adversarial resampling step
* Early stopping option for adversarial training
* alpha parameter for regularizing multinomial distributions in forde
* Unified treatment of colnames with internal semantics (y, obs, tree, leaf)
# arf 0.2.0
* Vectorized adversarial resampling
* Speed boost for compiling into a probabilistic circuit
* Conditional densities and sampling
* Bayesian solution for invariant continuous data within leaf nodes
* New function for computing (conditional) expectations
* Options for missing data
163 changes: 71 additions & 92 deletions R/adversarial_rf.R
@@ -13,6 +13,7 @@
#' @param max_iters Maximum iterations for the adversarial loop.
#' @param early_stop Terminate loop if performance fails to improve from one
#' round to the next?
#' @param prune Impose \code{min_node_size} by pruning?
#' @param verbose Print discriminator accuracy after each round?
#' @param parallel Compute in parallel? Must register backend beforehand, e.g.
#' via \code{doParallel}.
@@ -24,10 +25,10 @@
#' iteratively, with alternating rounds of generation and discrimination. In
#' the first instance, synthetic data is generated via independent bootstraps of
#' each feature, and a RF classifier is trained to distinguish between real and
#' synthetic samples. In subsequent rounds, synthetic data is generated
#' separately in each leaf, using splits from the previous forest. This creates
#' increasingly realistic data that satisfies local independence by construction.
#' The algorithm converges when a RF cannot reliably distinguish between the two
#' fake samples. In subsequent rounds, synthetic data is generated separately in
#' each leaf, using splits from the previous forest. This creates increasingly
#' realistic data that satisfies local independence by construction. The
#' algorithm converges when a RF cannot reliably distinguish between the two
#' classes, i.e. when OOB accuracy falls below 0.5 + \code{delta}.
#'
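#' (Editorial aside: the alternating generate/discriminate rounds described
#' above can be sketched as the following loop skeleton. This is a Python
#' illustration only, not the package's R implementation; `train_rf` and
#' `sample_within_leaves` are hypothetical callables standing in for
#' ranger training and the within-leaf resampler.)
#'
#' ```python
#' import random
#'
#' def adversarial_loop(x_real, train_rf, sample_within_leaves,
#'                      delta=0.0, max_iters=10, early_stop=True):
#'     """Skeleton of the adversarial training loop described above.
#'     `train_rf` returns (model, oob_accuracy); `sample_within_leaves`
#'     draws synthetic rows leaf by leaf from the current forest.
#'     Both are placeholders for the real ranger-based steps."""
#'     # Round 0: synthetic data via independent bootstraps of each feature
#'     n = len(x_real)
#'     cols = list(zip(*x_real))
#'     x_synth = list(zip(*[random.choices(col, k=n) for col in cols]))
#'     model, acc = train_rf(x_real, x_synth)
#'     accs = [acc]
#'     iters = 0
#'     # Loop until the RF can no longer tell real from synthetic,
#'     # i.e. OOB accuracy falls to 0.5 + delta or below
#'     while acc > 0.5 + delta and iters < max_iters:
#'         x_synth = sample_within_leaves(model, x_real)
#'         model, acc = train_rf(x_real, x_synth)
#'         accs.append(acc)
#'         iters += 1
#'         if early_stop and accs[-1] >= accs[-2]:
#'             break  # accuracy plateaued: stop early
#'     return model, accs
#' ```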
#' ARFs are useful for several unsupervised learning tasks, such as density
@@ -36,13 +37,12 @@
#' trees for improved performance (typically on the order of 100-1000 depending
#' on sample size).
#'
#' Integer variables are treated as ordered factors by default. If the ARF is
#' passed to \code{forde}, the estimated distribution for these variables will
#' only have support on observed factor levels (i.e., the output will be a pmf,
#' not a pdf). To override this behavior and assign nonzero density to
#' intermediate values, explicitly recode the features as numeric.
#' Integer variables are recoded with a warning. Default behavior is to convert
#' those with six or more unique values to numeric, while those with up to five
#' unique values are treated as ordered factors. To override this behavior,
#' explicitly recode integer variables to the target type prior to training.
#'
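#' (Editorial aside: the recoding rule just described can be sketched as
#' follows. A minimal Python illustration with a hypothetical
#' `recode_integer` helper, not the package's actual `prep_x` code; the
#' six-unique-values threshold is taken from the text above.)
#'
#' ```python
#' def recode_integer(values):
#'     """Recode an integer column per the rule above:
#'     six or more unique values -> numeric (float),
#'     up to five unique values  -> ordered factor (level codes here)."""
#'     unique_vals = sorted(set(values))
#'     if len(unique_vals) >= 6:
#'         return [float(v) for v in values]          # treat as continuous
#'     return [unique_vals.index(v) for v in values]  # ordered factor codes
#' ```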
#' Note: convergence is not guaranteed in finite samples. The \code{max_iter}
#' Note: convergence is not guaranteed in finite samples. The \code{max_iters}
#' argument sets an upper bound on the number of training rounds. Similar
#' results may be attained by increasing \code{delta}. Even a single round can
#' often give good performance, but data with strong or complex dependencies may
@@ -84,57 +84,23 @@ adversarial_rf <- function(
delta = 0,
max_iters = 10L,
early_stop = TRUE,
prune = TRUE,
verbose = TRUE,
parallel = TRUE,
...) {

# To avoid data.table check issues
i <- b <- cnt <- obs <- tree <- leaf <- . <- NULL
i <- b <- cnt <- obs <- tree <- leaf <- N <- . <- NULL

# Prep data
x_real <- as.data.frame(x)
x_real <- prep_x(x)
n <- nrow(x_real)
if ('y' %in% colnames(x_real)) {
colnames(x_real)[which(colnames(x_real) == 'y')] <- col_rename(x_real, 'y')
}
if ('obs' %in% colnames(x_real)) {
colnames(x_real)[which(colnames(x_real) == 'obs')] <- col_rename(x_real, 'obs')
}
if ('tree' %in% colnames(x_real)) {
colnames(x_real)[which(colnames(x_real) == 'tree')] <- col_rename(x_real, 'tree')
}
if ('leaf' %in% colnames(x_real)) {
colnames(x_real)[which(colnames(x_real) == 'leaf')] <- col_rename(x_real, 'leaf')
}
idx_char <- sapply(x_real, is.character)
if (any(idx_char)) {
x_real[, idx_char] <- as.data.frame(
lapply(x_real[, idx_char, drop = FALSE], as.factor)
)
}
idx_logical <- sapply(x_real, is.logical)
if (any(idx_logical)) {
x_real[, idx_logical] <- as.data.frame(
lapply(x_real[, idx_logical, drop = FALSE], as.factor)
)
}
idx_intgr <- sapply(x_real, is.integer)
if (any(idx_intgr)) {
warning('Recoding integer data as ordered factors. To override this behavior, ',
'explicitly code these variables as numeric.')
for (j in which(idx_intgr)) {
lvls <- sort(unique(x_real[, j]))
x_real[, j] <- factor(x_real[, j], levels = lvls, ordered = TRUE)
}
}
d <- ncol(x_real)
factor_cols <- sapply(x_real, is.factor)
if (any(!factor_cols) & min_node_size == 1L) {
warning('Variance is undefined when a leaf contains just a single observation. ',
'Consider increasing min_node_size.')
}
lvls <- lapply(x_real[factor_cols], levels)

# Fit initial model: sample from marginals, concatenate data, train RF
x_synth <- as.data.frame(lapply(x_real, sample, n, replace = TRUE))
x_synth <- setDF(lapply(x_real, sample, n, replace = TRUE))
dat <- rbind(data.frame(y = 1L, x_real),
data.frame(y = 0L, x_synth))
if (isTRUE(parallel)) {
@@ -155,29 +121,40 @@
', Accuracy: ', round(acc0 * 100, 2), '%\n'))
}
if (acc0 > 0.5 + delta & iters < max_iters) {
sample_by_class <- function(x, n) {
if (is.numeric(x)) {
as.numeric(sample(x, n, replace = TRUE))
} else {
sample(x, n, replace = TRUE)
}
}
converged <- FALSE
while (!isTRUE(converged)) { # Adversarial loop begins...
# Create synthetic data by sampling from intra-leaf marginals
nodeIDs <- stats::predict(rf0, x_real, type = 'terminalNodes')$predictions
tmp <- melt(as.data.table(nodeIDs), measure.vars = 1:num_trees,
variable.name = 'tree', value.name = 'leaf')
tmp[, tree := as.numeric(gsub('V', '', tree))][, obs := rep(1:n, num_trees)]
x_real_dt <- as.data.table(x_real)[, obs := 1:n]
x_real_dt <- merge(x_real_dt, tmp, by = 'obs', sort = FALSE)
tmp[, obs := NULL]
tmp <- data.table('tree' = rep(seq_len(num_trees), each = n),
'leaf' = as.vector(nodeIDs))
x_real_dt <- do.call(rbind, lapply(seq_len(num_trees), function(b) {
cbind(x_real, tmp[tree == b])
}))
x_real_dt[, factor_cols] <- lapply(x_real_dt[, factor_cols, drop = FALSE], as.numeric)
tmp <- tmp[sample(.N, n, replace = TRUE)]
tmp <- unique(tmp[, cnt := .N, by = .(tree, leaf)])
draw_from <- merge(tmp, x_real_dt, by = c('tree', 'leaf'), sort = FALSE)
x_synth <- draw_from[, lapply(.SD[, -c('cnt', 'obs')], sample_by_class, .SD[, cnt[1]]),
by = .(tree, leaf)][, c('tree', 'leaf') := NULL]
rm(nodeIDs, tmp, x_real_dt, draw_from)
draw_from <- merge(tmp, x_real_dt, by = c('tree', 'leaf'),
sort = FALSE)[, N := .N, by = .(tree, leaf)]
rm(nodeIDs, tmp, x_real_dt)
draw_params_within <- unique(draw_from, by = c('tree','leaf'))[, .(cnt, N)]
adj_absolut_col <- rep(c(0, draw_params_within[-.N, cumsum(N)]),
times = draw_params_within$cnt)
adj_absolut <- rep(adj_absolut_col, d) + rep(seq(0, d - 1) * nrow(draw_from), each = n)
idx_drawn_within <- ceiling(runif(n * d, 0, rep(draw_params_within$N, draw_params_within$cnt)))
idx_drawn <- idx_drawn_within + adj_absolut
draw_from_stacked <- unlist(draw_from[, -c('tree', 'leaf', 'cnt', 'N')],
use.names = FALSE)
values_drawn_stacked <- data.table('col_id' = rep(seq_len(d), each = n),
'values' = draw_from_stacked[idx_drawn])
x_synth <- as.data.table(split(values_drawn_stacked, by = 'col_id', keep.by = FALSE))
setnames(x_synth, names(x_real))
if (any(factor_cols)) {
x_synth[, names(which(factor_cols))] <- lapply(names(which(factor_cols)), function(j) {
lvls[[j]][x_synth[[j]]]
})
}
rm(draw_from, draw_params_within, adj_absolut_col,
adj_absolut, idx_drawn_within, idx_drawn, draw_from_stacked)
# Concatenate real and synthetic data
dat <- rbind(data.frame(y = 1L, x_real),
data.frame(y = 0L, x_synth))
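(Editorial aside: the stacked-index resampler in the hunk above packs all candidate rows into one flat vector and turns each (leaf, within-leaf index, column) triple into an absolute offset, drawing every feature independently within its leaf. The sketch below reproduces that effect in Python with explicit loops in place of the `adj_absolut` offset arithmetic; `leaf_rows` and `counts` are hypothetical names, and this is not the package's R code.)

```python
import random

def sample_within_leaves(leaf_rows, counts, seed=None):
    """Draw synthetic rows leaf by leaf, sampling each feature
    independently within its leaf so a synthetic row can mix values
    from different real rows of the same leaf (local independence).
    leaf_rows[g]: list of real rows (tuples) in leaf g.
    counts[g]:   number of synthetic rows to draw from leaf g."""
    rng = random.Random(seed)
    d = len(leaf_rows[0][0])  # number of features
    out = []
    for g, cnt in enumerate(counts):
        rows = leaf_rows[g]
        for _ in range(cnt):
            # one independent within-leaf row index per column
            out.append(tuple(rows[rng.randrange(len(rows))][j]
                             for j in range(d)))
    return out
```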
@@ -195,8 +172,8 @@
acc0 <- 1 - rf1$prediction.error
acc <- c(acc, acc0)
iters <- iters + 1L
plateau <- ifelse(isTRUE(early_stop),
acc[iters] <= acc[iters + 1L], FALSE)
plateau <- fifelse(isTRUE(early_stop),
acc[iters] <= acc[iters + 1L], FALSE)
if (acc0 <= 0.5 + delta | iters >= max_iters | plateau) {
converged <- TRUE
} else {
@@ -211,32 +188,34 @@
}

# Prune leaves to ensure min_node_size w.r.t. real data
pred <- stats::predict(rf0, x_real, type = 'terminalNodes')$predictions + 1L
prune <- function(tree) {
out <- rf0$forest$child.nodeIDs[[tree]]
leaves <- which(out[[1]] == 0L)
to_prune <- leaves[!(leaves %in% which(tabulate(pred[, tree]) >= min_node_size))]
while(length(to_prune) > 0) {
for (tp in to_prune) {
# Find parents
parent <- which((out[[1]] + 1L) == tp)
if (length(parent) > 0) {
# Left child
out[[1]][parent] <- out[[2]][parent]
} else {
# Right child
parent <- which((out[[2]] + 1L) == tp)
out[[2]][parent] <- out[[1]][parent]
if (isTRUE(prune)) {
pred <- stats::predict(rf0, x_real, type = 'terminalNodes')$predictions + 1L
prune <- function(tree) {
out <- rf0$forest$child.nodeIDs[[tree]]
leaves <- which(out[[1]] == 0L)
to_prune <- leaves[!(leaves %in% which(tabulate(pred[, tree]) >= min_node_size))]
while(length(to_prune) > 0) {
for (tp in to_prune) {
# Find parents
parent <- which((out[[1]] + 1L) == tp)
if (length(parent) > 0) {
# Left child
out[[1]][parent] <- out[[2]][parent]
} else {
# Right child
parent <- which((out[[2]] + 1L) == tp)
out[[2]][parent] <- out[[1]][parent]
}
}
to_prune <- which((out[[1]] + 1L) %in% to_prune)
}
to_prune <- which((out[[1]] + 1L) %in% to_prune)
return(out)
}
if (isTRUE(parallel)) {
rf0$forest$child.nodeIDs <- foreach(b = seq_len(num_trees)) %dopar% prune(b)
} else {
rf0$forest$child.nodeIDs <- foreach(b = seq_len(num_trees)) %do% prune(b)
}
return(out)
}
if (isTRUE(parallel)) {
rf0$forest$child.nodeIDs <- foreach(b = 1:num_trees) %dopar% prune(b)
} else {
rf0$forest$child.nodeIDs <- foreach(b = 1:num_trees) %do% prune(b)
}

# Export