
SLF4J bindings error
<p>I'm new to Mahout and I'm struggling to install it on Ubuntu 12.10. Since many of the problems I've seen relating to SLF4J involve Eclipse, I should mention that I don't use Eclipse.</p>

<p>The Maven build works fine (<code>mvn install -DskipTests=true</code>).</p>

<p>I think I have set my environment variables properly; here is my <code>/etc/environment</code>:</p>

<pre><code>PATH="/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/usr/lib/jvm/java-1.6.0-openjdk-amd64/bin:/usr/local/hadoop/bin"
JAVA_HOME=/usr/lib/jvm/java-1.6.0-openjdk-amd64/
MAHOUT_HOME=/home/edelans/mahout/
HADOOP_HOME=/usr/local/hadoop/
HADOOP_CONF_DIR=/usr/local/hadoop/conf
</code></pre>

<p>I started getting errors when I tried to run some of the example scripts:</p>

<pre><code>root@edelans-ubuntu-master:/home/edelans/mahout/examples/bin# ./cluster-reuters.sh
Please select a number to choose the corresponding clustering algorithm
1. kmeans clustering
2. fuzzykmeans clustering
3. dirichlet clustering
4. minhash clustering
Enter your choice : 1
ok. You chose 1 and we'll use kmeans Clustering
creating work directory at /tmp/mahout-work-root
MAHOUT_LOCAL is set, so we don't add HADOOP_CONF_DIR to classpath.
MAHOUT_LOCAL is set, running locally
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/edelans/mahout/examples/target/mahout-examples-0.8-SNAPSHOT-job.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/edelans/mahout/examples/target/dependency/slf4j-jcl-1.7.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/edelans/mahout/examples/target/dependency/slf4j-log4j12-1.4.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: slf4j-api 1.6.x (or later) is incompatible with this binding.
SLF4J: Your binding is version 1.5.5 or earlier.
SLF4J: Upgrade your binding to version 1.6.x.
Exception in thread "main" java.lang.NoSuchMethodError: org.slf4j.impl.StaticLoggerBinder.getSingleton()Lorg/slf4j/impl/StaticLoggerBinder;
    at org.slf4j.LoggerFactory.bind(LoggerFactory.java:128)
    at org.slf4j.LoggerFactory.performInitialization(LoggerFactory.java:107)
    at org.slf4j.LoggerFactory.getILoggerFactory(LoggerFactory.java:295)
    at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:269)
    at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:281)
    at org.apache.mahout.driver.MahoutDriver.&lt;clinit&gt;(MahoutDriver.java:89)
Could not find the main class: org.apache.mahout.driver.MahoutDriver. Program will exit.

Running on hadoop, using /usr/local/hadoop/bin/hadoop and HADOOP_CONF_DIR=
MAHOUT-JOB: /home/edelans/mahout/examples/target/mahout-examples-0.8-SNAPSHOT-job.jar
12/11/21 14:48:23 INFO vectorizer.SparseVectorsFromSequenceFiles: Maximum n-gram size is: 1
12/11/21 14:48:23 INFO vectorizer.SparseVectorsFromSequenceFiles: Minimum LLR value: 1.0
12/11/21 14:48:23 INFO vectorizer.SparseVectorsFromSequenceFiles: Number of reduce tasks: 1
12/11/21 14:48:24 INFO mapred.JobClient: Cleaning up the staging area hdfs://localhost:54310/app/hadoop/tmp/mapred/staging/root/.staging/job_201211211039_0002
12/11/21 14:48:24 ERROR security.UserGroupInformation: PriviledgedActionException as:root cause:org.apache.hadoop.mapreduce.lib.input.InvalidInputException: Input path does not exist: hdfs://localhost:54310/tmp/mahout-work-root/reuters-out-seqdir
Exception in thread "main" org.apache.hadoop.mapreduce.lib.input.InvalidInputException: Input path does not exist: hdfs://localhost:54310/tmp/mahout-work-root/reuters-out-seqdir
    at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.listStatus(FileInputFormat.java:235)
    at org.apache.hadoop.mapreduce.lib.input.SequenceFileInputFormat.listStatus(SequenceFileInputFormat.java:55)
    at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.getSplits(FileInputFormat.java:252)
    at org.apache.hadoop.mapred.JobClient.writeNewSplits(JobClient.java:962)
    at org.apache.hadoop.mapred.JobClient.writeSplits(JobClient.java:979)
    at org.apache.hadoop.mapred.JobClient.access$600(JobClient.java:174)
    at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:897)
    at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:850)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:416)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
    at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:850)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:500)
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:530)
    at org.apache.mahout.vectorizer.DocumentProcessor.tokenizeDocuments(DocumentProcessor.java:93)
    at org.apache.mahout.vectorizer.SparseVectorsFromSequenceFiles.run(SparseVectorsFromSequenceFiles.java:255)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
    at org.apache.mahout.vectorizer.SparseVectorsFromSequenceFiles.main(SparseVectorsFromSequenceFiles.java:55)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:616)
    at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
    at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
    at org.apache.mahout.driver.MahoutDriver.main(MahoutDriver.java:195)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:616)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
</code></pre>

<p>Another user had a similar problem <a href="https://stackoverflow.com/questions/12773694/playframework-multiple-slf4j-bindings?answertab=votes#tab-top">here</a>, but I don't know how to write the equivalent of the line of code that answered his problem:</p>

<pre><code>"org.apache.mahout" % "mahout-core" % "0.7" excludeAll(ExclusionRule(organization = "org.slf4j"))
</code></pre>

<p>And I don't have the authorization to comment on his question, so sorry for opening another question for this.</p>

<p>Thanks for your help.</p>
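<p>For what it's worth, my guess at the Maven counterpart of that sbt line would be an <code>&lt;exclusions&gt;</code> block on the dependency, roughly like the sketch below. It is untested; the artifact names are taken from the binding jars listed in the SLF4J warning above, and I'm not sure it even applies in my case, since I'm building Mahout itself rather than declaring it as a dependency of my own project:</p>

<pre><code>&lt;!-- untested sketch: exclude the extra SLF4J bindings pulled in through mahout-core --&gt;
&lt;dependency&gt;
  &lt;groupId&gt;org.apache.mahout&lt;/groupId&gt;
  &lt;artifactId&gt;mahout-core&lt;/artifactId&gt;
  &lt;version&gt;0.7&lt;/version&gt;
  &lt;exclusions&gt;
    &lt;exclusion&gt;
      &lt;groupId&gt;org.slf4j&lt;/groupId&gt;
      &lt;artifactId&gt;slf4j-jcl&lt;/artifactId&gt;
    &lt;/exclusion&gt;
    &lt;exclusion&gt;
      &lt;groupId&gt;org.slf4j&lt;/groupId&gt;
      &lt;artifactId&gt;slf4j-log4j12&lt;/artifactId&gt;
    &lt;/exclusion&gt;
  &lt;/exclusions&gt;
&lt;/dependency&gt;
</code></pre>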