Tuesday, 15 September 2015

hadoop - How to use YARN to allocate more resources to a job


I have several different jobs to run on a Hadoop cluster. Some need only a few resources and others need more, e.g. memory. I would like to run these jobs simultaneously on my cluster, which supports YARN. I assume that if I simply submit the jobs to the cluster, YARN automatically decides the resource requirements, but I want to specify them myself. How can I use the API or the command line to specify the resource requirements of each job?

You can set the memory for the mapper and reducer by using JobConf, either from the command line or in your driver class.

Set these properties with setMemoryForMapTask(long mem) and setMemoryForReduceTask(long mem); there is more information and usage detail in the JobConf API documentation.
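As a minimal sketch, a driver class using these JobConf methods might look like the following. It uses the old (org.apache.hadoop.mapred) API; the class name, job name, memory values, and input/output path arguments are illustrative, and since no mapper or reducer classes are set, the job falls back to the identity mapper and reducer.

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;

public class MemoryDemoDriver {
    public static void main(String[] args) throws Exception {
        JobConf conf = new JobConf(MemoryDemoDriver.class);
        conf.setJobName("memory-demo");

        // Request 2 GB per map task and 4 GB per reduce task (values are in MB).
        conf.setMemoryForMapTask(2048L);
        conf.setMemoryForReduceTask(4096L);

        // Input and output paths passed as command-line arguments.
        FileInputFormat.setInputPaths(conf, new Path(args[0]));
        FileOutputFormat.setOutputPath(conf, new Path(args[1]));

        JobClient.runJob(conf);
    }
}

From the command line, if your driver implements the Tool interface and runs through ToolRunner, you can pass the equivalent settings as generic options, e.g. -D mapreduce.map.memory.mb=2048 -D mapreduce.reduce.memory.mb=4096. On YARN, these properties control the size of the container requested for each map and reduce task.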

