Issue
I'd like to use the warm_start parameter to add training data to my random forest classifier. I expected it to be used like this:
clf = RandomForestClassifier(...)
clf.fit(get_data())
clf.fit(get_more_data(), warm_start=True)
But warm_start is a constructor parameter. So do I do something like this?
clf = RandomForestClassifier()
clf.fit(get_data())
clf = RandomForestClassifier(warm_start=True)
clf.fit(get_more_data())
That makes no sense to me. Won't the new call to the constructor discard previous training data? I think I'm missing something.
Solution
The basic pattern (taken from Miriam's answer):
clf = RandomForestClassifier(warm_start=True)
clf.fit(get_data())
clf.fit(get_more_data())
would be the correct usage, API-wise.
But there is an issue here. The docs say the following:
When set to True, reuse the solution of the previous call to fit and add more estimators to the ensemble, otherwise, just fit a whole new forest.
This means that the only thing warm_start can do for you is add new decision trees to the ensemble. All the previous trees are left untouched!
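This can be checked empirically. The sketch below uses synthetic data from make_classification (standing in for your get_data()) and verifies that after a warm-start refit, the original tree objects are still the very same objects:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=200, random_state=0)

clf = RandomForestClassifier(n_estimators=5, warm_start=True, random_state=0)
clf.fit(X, y)
first_trees = list(clf.estimators_)   # keep references to the first five trees

clf.set_params(n_estimators=10)       # the ensemble must grow for warm_start to do anything
clf.fit(X, y)

# The first five estimators are the exact same objects as before the second fit;
# only five new trees were appended.
print(all(a is b for a, b in zip(first_trees, clf.estimators_[:5])))  # True
print(len(clf.estimators_))                                           # 10
```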
Let's check this against the source:
n_more_estimators = self.n_estimators - len(self.estimators_)

if n_more_estimators < 0:
    raise ValueError('n_estimators=%d must be larger or equal to '
                     'len(estimators_)=%d when warm_start==True'
                     % (self.n_estimators, len(self.estimators_)))
elif n_more_estimators == 0:
    warn("Warm-start fitting without increasing n_estimators does not "
         "fit new trees.")
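You can trigger that exact warning yourself. In this sketch (again using synthetic data as a stand-in), calling fit a second time without raising n_estimators produces the warm-start warning and leaves the forest unchanged:

```python
import warnings

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=100, random_state=0)

clf = RandomForestClassifier(n_estimators=5, warm_start=True, random_state=0)
clf.fit(X, y)

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    clf.fit(X, y)  # n_estimators unchanged -> sklearn warns and fits no new trees

print(any("Warm-start" in str(w.message) for w in caught))  # True
print(len(clf.estimators_))                                 # still 5
```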
This basically tells us that you would need to increase the number of estimators before each new call to fit!
I have no idea what kind of usage sklearn expects here. I'm not sure whether fitting, increasing n_estimators (e.g. via set_params), and fitting again is the intended workflow, but I somewhat doubt it, as this pattern is not clearly documented for forests.
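For completeness, here is what that fit / grow / fit-again pattern looks like. This is a sketch only: make_classification batches stand in for your get_data() and get_more_data(), and note that the first ten trees never see the second batch, which is why this is a poor substitute for true incremental learning:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# two hypothetical data batches standing in for get_data() / get_more_data()
X1, y1 = make_classification(n_samples=100, random_state=0)
X2, y2 = make_classification(n_samples=100, random_state=1)

clf = RandomForestClassifier(n_estimators=10, warm_start=True, random_state=0)
clf.fit(X1, y1)              # trees 1..10 are trained on the first batch only

clf.set_params(n_estimators=20)
clf.fit(X2, y2)              # trees 11..20 are trained on the second batch only

print(len(clf.estimators_))  # 20
```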
Your basic approach (at least with this library and this classifier) is probably not a good fit for your out-of-core learning task! I would not pursue it further.
Answered By - sascha