Decision trees are commonly used in operations research, specifically in decision analysis, to help identify a strategy most likely to reach a goal, but are also a popular tool in machine learning.
> main_tree <- rpart(Item_Outlet_Sales ~ ., data = train_data, control = rpart.control(cp=0.01))
> printcp(main_tree)
> plotcp(main_tree)
The best value turns out to be cp = 0.01: in the cp table printed to the console by printcp(), the row with cp = 0.01 has the lowest cross-validated error (xerror). Since the tree above was already grown with cp = 0.01, we can plot it directly.
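As a minimal sketch of this cp-selection step, the snippet below uses the built-in mtcars data set (a stand-in, since the article's train_data is not reproduced here): grow a deliberately deep tree, inspect the cp table, and prune back to the cp with the lowest cross-validated error.

```r
library(rpart)

# Illustrative sketch on built-in data (mtcars stands in for train_data).
set.seed(42)  # xerror is computed by cross-validation, so fix the seed
fit <- rpart(mpg ~ ., data = mtcars, control = rpart.control(cp = 0.001))
printcp(fit)  # cp table: CP, nsplit, rel error, xerror, xstd

# Pick the cp value whose row has the smallest cross-validated error,
# then prune the tree back to that complexity.
best_cp <- fit$cptable[which.min(fit$cptable[, "xerror"]), "CP"]
pruned  <- prune(fit, cp = best_cp)
```

In the article's case this procedure lands on cp = 0.01; in general the chosen value depends on the data and the cross-validation folds.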
> plot(main_tree, uniform = TRUE, main = "Regression Tree for Item_Outlet_Sales")
> text(main_tree, use.n = TRUE, cex = .8)
> pre_score <- predict(main_tree, type = "vector")
> rmse(train_data$Item_Outlet_Sales, pre_score)  # rmse() from the Metrics package
[1] 1102.774
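The same predict-and-score step can be sketched end to end on built-in data (again using mtcars in place of the article's train_data). RMSE is computed by hand here so the example does not depend on the Metrics package; it is the same quantity Metrics::rmse() returns.

```r
library(rpart)

# Fit a regression tree on built-in data, predict on the training rows,
# and compute the root mean squared error against the observed response.
fit   <- rpart(mpg ~ ., data = mtcars)
preds <- predict(fit, type = "vector")  # fitted values, one per row

rmse_val <- sqrt(mean((mtcars$mpg - preds)^2))
rmse_val
```

Note that this is training-set RMSE, as in the article; a held-out test set or the cross-validated xerror column gives a more honest estimate of generalization error.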