I'm reading the Apache Mahout Cookbook, but I have a problem in Chapter 2 with creating a sequence file. I'm using Mahout 0.9.
The command I am executing is the following:
$MAHOUT_HOME/bin/mahout seqdirectory -i /home/haritz/escritorio/work_dir/original -o /home/haritz/escritorio/work_dir/sequencefiles
but I get the following error:
Running on hadoop, using /usr/local/hadoop/bin/hadoop and HADOOP_CONF_DIR=
MAHOUT-JOB: /home/haritz/mahout-distribution-0.9/mahout-examples-0.9-job.jar
15/03/16 16:45:57 INFO common.AbstractJob: Command line arguments: {--charset=[UTF-8], --chunkSize=[64], --endPhase=[2147483647], --fileFilterClass=[org.apache.mahout.text.PrefixAdditionFilter], --input=[/home/haritz/escritorio/work_dir/original], --keyPrefix=[], --method=[mapreduce], --output=[/home/haritz/escritorio/work_dir/sequencefiles], --startPhase=[0], --tempDir=[temp]}
15/03/16 16:45:58 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
15/03/16 16:45:58 INFO Configuration.deprecation: mapred.input.dir is deprecated. Instead, use mapreduce.input.fileinputformat.inputdir
15/03/16 16:45:58 INFO Configuration.deprecation: mapred.compress.map.output is deprecated. Instead, use mapreduce.map.output.compress
15/03/16 16:45:58 INFO Configuration.deprecation: mapred.output.dir is deprecated. Instead, use mapreduce.output.fileoutputformat.outputdir
Exception in thread "main" java.io.FileNotFoundException: File does not exist: /home/haritz/Escritorio/work_dir/original
	at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1122)
	at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1114)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1114)
	at org.apache.mahout.text.SequenceFilesFromDirectory.runMapReduce(SequenceFilesFromDirectory.java:162)
	at org.apache.mahout.text.SequenceFilesFromDirectory.run(SequenceFilesFromDirectory.java:91)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
	at org.apache.mahout.text.SequenceFilesFromDirectory.main(SequenceFilesFromDirectory.java:65)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:71)
	at org.apache.hadoop.util.ProgramDriver.run(ProgramDriver.java:144)
	at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:152)
	at org.apache.mahout.driver.MahoutDriver.main(MahoutDriver.java:195)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
	at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Can anyone tell me why I get this exception? Thanks!
When running on Hadoop, Mahout expects HDFS paths for the input and output. Either put your data into HDFS and pass HDFS paths, e.g. mahout seqdirectory -i <hdfs-path> (hdfs://ip/user/...) -o <hdfs-path> (hdfs://ip/user/...), or, if you want it to run locally, make it read the input and write the output from the local file system by setting MAHOUT_LOCAL=true, which the bin/mahout shell script checks.
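In shell terms, the two fixes look roughly like this. This is a sketch, not a verified recipe: the NameNode address hdfs://localhost:9000 and the HDFS home directory /user/haritz are assumptions to adjust for your cluster, and the hdfs/mahout invocations are shown commented out because they need a running Hadoop install.

```shell
# Assumption: MAHOUT_HOME points at your Mahout 0.9 installation.
INPUT=/home/haritz/escritorio/work_dir/original
OUTPUT=/home/haritz/escritorio/work_dir/sequencefiles

# Option 1: copy the local input directory into HDFS, then pass HDFS
# paths to seqdirectory (NameNode address is an assumption):
# hdfs dfs -mkdir -p /user/haritz/work_dir
# hdfs dfs -put "$INPUT" /user/haritz/work_dir/original
# "$MAHOUT_HOME/bin/mahout" seqdirectory \
#     -i hdfs://localhost:9000/user/haritz/work_dir/original \
#     -o hdfs://localhost:9000/user/haritz/work_dir/sequencefiles

# Option 2: skip HDFS entirely. The bin/mahout script checks whether
# MAHOUT_LOCAL is set and, if so, resolves -i and -o against the local
# file system instead of HDFS.
export MAHOUT_LOCAL=true
# "$MAHOUT_HOME/bin/mahout" seqdirectory -i "$INPUT" -o "$OUTPUT"

echo "MAHOUT_LOCAL=$MAHOUT_LOCAL"
```

Note that with option 2 the output sequence files land on the local disk, so any later Mahout steps in the chapter must also be run with MAHOUT_LOCAL set.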