Is there any way to see "under the hood" of other Python sklearn algorithms? For example, I have created a decision tree classifier using sklearn and have been able to export the exact structure of the tree, but I would also like to be able to do this with other algorithms, for example KNN classification. Is this possible?
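For context, a minimal sketch of the tree export mentioned above (assumes scikit-learn is installed; `load_iris` is just a convenient example dataset):

```python
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier, export_text

X, y = load_iris(return_X_y=True)
clf = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)

# export_text renders the learned split thresholds and leaf classes
# as an indented text tree.
tree_rules = export_text(clf, feature_names=load_iris().feature_names)
print(tree_rules)
```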
-
Grab your favorite ML book and read it. In KNN there is no fitting: besides optimizing the internal storage for fast lookups (kd-tree, ball-tree, ...), there is no internal state; everything is done at prediction time. All the others of course have attributes you can query -> SVMs: support vectors; linear models: coefficients, and so on. What's wrong with the docs? – sascha, Sep 7, 2017 at 11:31
-
@sascha, Thanks for that. That clears it up for me. I will read up on KNN properly and reread the documentation! – Sjoseph, Sep 7, 2017 at 11:37
-
For every classifier/regressor there is an API page like this. The part under Attributes is the stuff you can query. Everything else is hidden and not directly supported. A somewhat bigger example: MLP. – sascha, Sep 7, 2017 at 11:39