Is there a roadmap for the library? Are there any plans to include online decision tree approaches?
Hello,
Thank you for the question. The first release of the Intel(R) Data Analytics Acceleration Library provides algorithms for analysis, modeling, and training, including algorithms for classification, regression, and clustering. Depending on the specific algorithm, the library supports batch, online, and distributed modes of computation. We understand that the feature set of the present version of the library may not cover every use case or model, and decision trees/random forests are one such example. We therefore continue to analyze possible directions for developing and extending the library by collecting feedback and requests.
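As an aside on terminology: "online" mode here means processing data block by block while keeping only a small running state, instead of requiring the whole dataset in memory at once. As a plain illustration of that idea (this is not DAAL's actual API; the class and method names below are made up for the sketch), here is a minimal Welford-style update of a mean and variance in Python:

```python
# Illustration of online (incremental) computation -- not the DAAL API.
# Welford's algorithm folds one observation at a time into running
# statistics, so the full dataset never needs to be held in memory.

class OnlineMeanVariance:
    def __init__(self):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0  # sum of squared deviations from the current mean

    def partial_update(self, x):
        """Fold one new observation into the running statistics."""
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    def variance(self):
        """Population variance of all observations seen so far."""
        return self.m2 / self.n if self.n else 0.0

# Streaming the data in blocks gives the same result as one batch pass.
stats = OnlineMeanVariance()
for block in ([1.0, 2.0], [3.0, 4.0]):
    for x in block:
        stats.partial_update(x)

print(stats.mean)        # 2.5
print(stats.variance())  # 1.25
```

An online decision tree algorithm (e.g. a Hoeffding tree) follows the same pattern at a larger scale: per-node sufficient statistics are updated as blocks arrive, which is why the dataset dimensions and block sizes asked about below matter for the design.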
It would help if you could provide additional details on the usage model of the online decision tree you are asking about, including the typical dimensions of the dataset and its blocks, the type of data processed (continuous and/or categorical, and the ranges of values of the data points), and your requirements for the performance of the training and scoring stages as well as for the amount of memory used. Please also let us know if you have a specific online decision tree algorithm in mind that might meet the requirements of your application.
Thanks in advance,
Andrey
Thank you.
I'm afraid I cannot answer your questions in detail due to an NDA. In any case, I will try to keep track of your promising library by communicating with employees of Intel Nizhny Novgorod.
Best regards,
Michael.
