Can Hadoop with Spark be configured with 1GB of RAM?

I'm trying to set up a cluster (1 namenode, 1 datanode) on AWS. I'm on the one-year AWS free tier, and the challenge is that the free-tier instance comes with only 1GB of RAM.

As I'm a student, I cannot afford much. Can anyone please suggest a solution?

Also, it would be great if you could share any links for setting up a multi-node Hadoop cluster with Spark on AWS.

Note: I cannot try GCE, as my trial period there is exhausted.

Topic: aws, apache-hadoop, nosql, bigdata

Category: Data Science


If 4GB of RAM isn't sufficient, 1GB certainly isn't going to be. That is far too little to run an HDFS namenode, a datanode, YARN, and the Spark driver, let alone leave room for your workers.

It would be much more reasonable to simply run Spark locally on that instance, without Hadoop at all.
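For example, here is a minimal sketch of Spark running in local mode, assuming the pip-installed `pyspark` package; the file name `data.csv` and the column name `value` are placeholders:

```python
from pyspark.sql import SparkSession

# Local mode: driver and executors share one JVM, so no HDFS or YARN
# is needed. 512m is an illustrative cap -- even that may be tight on
# a 1GB instance once OS and JVM overhead are accounted for.
spark = (
    SparkSession.builder
    .master("local[1]")                      # a single local thread
    .appName("tiny-instance-demo")
    .config("spark.driver.memory", "512m")
    .getOrCreate()
)

# Read straight from the local filesystem instead of HDFS.
df = spark.read.csv("data.csv", header=True, inferSchema=True)
df.groupBy("value").count().show()

spark.stop()
```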

But I would question whether Spark is the right choice if you are definitely limited to such a small machine.
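At that scale, plain Python often does the job with far less overhead. As a hedged sketch, the same count-by-key in pandas (again, `data.csv` and the column name `value` are placeholders):

```python
import pandas as pd

# Stream the file in chunks to keep peak memory low on a 1GB machine.
counts = {}
for chunk in pd.read_csv("data.csv", chunksize=100_000):
    for key, n in chunk["value"].value_counts().items():
        counts[key] = counts.get(key, 0) + int(n)

# Print the top 10 keys by count.
print(sorted(counts.items(), key=lambda kv: -kv[1])[:10])
```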
