mysql - R: naiveBayes incrementally on a large data set


I have a large data set in a MySQL database (at least 11 GB of data). I would like to train a naiveBayes model on the entire set and then test it against a smaller but still quite large data set (~3 GB).

The second part seems feasible - I assume I would run something like the following in a loop:

data_test <- sqlQuery(con, paste("SELECT * FROM test_data LIMIT 10000 OFFSET", i * 10000))
model_pred <- predict(model, data_test, type = "raw")

...and then dump the predictions to MySQL or a CSV file.
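In full, I imagine the prediction loop would look roughly like this (a sketch, assuming an RODBC connection con, an already-fitted model, and a test_data table as in the snippet above; "predictions.csv" is just a placeholder file name):

library(RODBC)

i <- 0
repeat {
  data_test <- sqlQuery(con, paste("SELECT * FROM test_data LIMIT 10000 OFFSET", i * 10000))
  if (!is.data.frame(data_test) || nrow(data_test) == 0) break   # no more rows to score
  model_pred <- predict(model, data_test, type = "raw")          # matrix of class posteriors
  write.table(model_pred, "predictions.csv", sep = ",", row.names = FALSE,
              append = (i > 0), col.names = (i == 0))            # header only on first chunk
  i <- i + 1
}
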

How can I, however, train the model incrementally on such a large data set? I noticed in the R documentation of the function (http://www.inside-r.org/packages/cran/e1071/docs/naivebayes) that predict takes an additional argument, "newdata", which suggested to me that incremental learning might be possible. However, the predict function only returns predictions, not a new model.

Please provide an example of how to train such a model incrementally.
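What I have in mind is roughly the following (a hand-rolled sketch, not an e1071 feature; it assumes a hypothetical train_data table with all-numeric predictors and a label column "y", i.e. the Gaussian case where naiveBayes only needs per-class counts, means and standard deviations):

library(RODBC)

n  <- list()   # per-class row counts
s  <- list()   # per-class column sums
s2 <- list()   # per-class column sums of squares

i <- 0
repeat {
  chunk <- sqlQuery(con, paste("SELECT * FROM train_data LIMIT 10000 OFFSET", i * 10000))
  if (!is.data.frame(chunk) || nrow(chunk) == 0) break
  for (cls in unique(as.character(chunk$y))) {
    x <- as.matrix(chunk[chunk$y == cls, setdiff(names(chunk), "y"), drop = FALSE])
    if (is.null(n[[cls]])) { n[[cls]] <- 0; s[[cls]] <- 0; s2[[cls]] <- 0 }
    n[[cls]]  <- n[[cls]]  + nrow(x)
    s[[cls]]  <- s[[cls]]  + colSums(x)
    s2[[cls]] <- s2[[cls]] + colSums(x^2)
  }
  i <- i + 1
}

# Per-class mean and sd, which (if I read the model object correctly) are
# the same statistics naiveBayes stores in model$tables for numeric predictors:
classes <- names(n)
means <- lapply(classes, function(cls) s[[cls]] / n[[cls]])
sds   <- lapply(classes, function(cls)
           sqrt((s2[[cls]] - s[[cls]]^2 / n[[cls]]) / (n[[cls]] - 1)))
names(means) <- names(sds) <- classes

Is assembling these statistics back into a usable model object the right approach, or is there an existing function that does this?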

