Running analyses on huge datasets with R
I'm trying to run some analyses on large datasets (e.g. 400k rows × 400 columns) in R, for tasks such as neural networks and recommendation systems. Processing is taking too long, because some intermediate matrices are huge (e.g. 400k × 400k — if stored dense, that's roughly 1.28 TB at 8 bytes per double, far more than fits in RAM). What are some free or cheap ways to improve R's performance?
I'm open to package or web-service suggestions; other options are welcome as well.
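For context on the scale involved, here is a minimal sketch of the memory arithmetic, plus a small sparse-matrix comparison using the `Matrix` package (a recommended package that ships with R); the 0.1% density figure below is just an illustrative assumption, not a property of my data:

```r
library(Matrix)  # sparse matrix classes, ships with R

# A dense double matrix at my problem's scale is infeasible in RAM:
n <- 4e5
dense_bytes <- n * n * 8   # 8 bytes per double
dense_bytes / 1e12         # ~1.28 TB

# If the matrix is mostly zeros (common for recommendation data),
# a sparse representation stores only the nonzero entries.
# Small illustration: 1000 x 1000 with ~0.1% nonzero entries.
set.seed(1)
i <- sample(1000, 1000, replace = TRUE)
j <- sample(1000, 1000, replace = TRUE)
m <- sparseMatrix(i = i, j = j, x = runif(1000), dims = c(1000, 1000))
object.size(m)  # tiny compared with the ~8 MB dense equivalent
```

This doesn't solve the speed problem by itself, but it shows why anything that materializes the full dense matrix is a dead end at this size.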
Tags: optimization, r, processing, bigdata
Category Data Science