Can Hadoop with Spark be configured with 1 GB of RAM?
I'm trying to set up a cluster (1 namenode, 1 datanode) on AWS. I'm on the one-year AWS Free Tier, but the challenge is that the free-tier instance comes with only 1 GB of RAM.
As I'm a student, I can't afford much more than that. Can anyone please suggest a solution?
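For context, this is roughly the kind of minimal, low-memory PySpark job I'd like to be able to run. It's only a sketch, not something I've tested on the 1 GB instance, and the 512m driver heap and other values are guesses on my part (assuming pyspark is pip-installed and Java is available):

```python
# Sketch only: a small PySpark job sized to stay well under 1 GB.
# The 512m driver heap and other values are guesses, not tested settings.
import os

# Driver memory must be set before the JVM starts, hence the env var
# rather than SparkConf (assuming plain `python script.py`, not spark-submit).
os.environ["PYSPARK_SUBMIT_ARGS"] = "--driver-memory 512m pyspark-shell"

from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("low-memory-test")
    .master("local[1]")                            # single thread, no cluster yet
    .config("spark.sql.shuffle.partitions", "4")   # keep shuffle buffers small
    .getOrCreate()
)

# Tiny sanity-check job
df = spark.range(1_000_000)
print(df.selectExpr("sum(id) AS total").collect())

spark.stop()
```

On the actual 2-node YARN cluster I assume I'd also have to lower yarn.nodemanager.resource.memory-mb and pass something like --executor-memory 384m to spark-submit, but I'm not sure 1 GB leaves enough headroom for the NameNode/DataNode daemons themselves, which is really my question.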
Also, it would be great if you could share any links on setting up a multi-node Hadoop cluster with Spark on AWS.
Note: I cannot try this on GCE, as my trial period there is exhausted.
Topic aws apache-hadoop nosql bigdata
Category Data Science