Andrea Barbato
2014-01-13 10:06:50 UTC
I'm working with Hadoop 2.2.0 and trying to run this *hdfs_test.cpp*
application:
#include "hdfs.h"
int main(int argc, char **argv) {
hdfsFS fs = hdfsConnect("default", 0);
const char* writePath = "/tmp/testfile.txt";
hdfsFile writeFile = hdfsOpenFile(fs, writePath, O_WRONLY|O_CREAT, 0, 0, 0);
if(!writeFile) {
fprintf(stderr, "Failed to open %s for writing!\n", writePath);
exit(-1);
}
char* buffer = "Hello, World!";
tSize num_written_bytes = hdfsWrite(fs, writeFile, (void*)buffer,
strlen(buffer)+1);
if (hdfsFlush(fs, writeFile)) {
fprintf(stderr, "Failed to 'flush' %s\n", writePath);
exit(-1);
}
hdfsCloseFile(fs, writeFile);}
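For reference, this is roughly how I compile and link it (the include and library paths are from my setup and may differ on yours; libjvm.so usually sits somewhere under $JAVA_HOME):

# hypothetical compile line; adjust paths for your install
g++ hdfs_test.cpp -o hdfs_test \
    -I$HADOOP_HOME/include \
    -L$HADOOP_HOME/lib/native -lhdfs \
    -L$JAVA_HOME/jre/lib/amd64/server -ljvm

# the shared libraries must also be found at run time
export LD_LIBRARY_PATH=$HADOOP_HOME/lib/native:$JAVA_HOME/jre/lib/amd64/server:$LD_LIBRARY_PATH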
I compiled it, but when I run it with *./hdfs_test* I get this output:
loadFileSystems error:(unable to get stack trace for
java.lang.NoClassDefFoundError exception:
ExceptionUtils::getStackTrace error.)
hdfsBuilderConnect(forceNewInstance=0, nn=default, port=0,
kerbTicketCachePath=(NULL), userName=(NULL)) error:(unable to get
stack trace for java.lang.NoClassDefFoundError exception:
ExceptionUtils::getStackTrace error.)
hdfsOpenFile(/tmp/testfile.txt): constructNewObjectOfPath
error:(unable to get stack trace for java.lang.NoClassDefFoundError
exception: ExceptionUtils::getStackTrace error.)
Failed to open /tmp/testfile.txt for writing!
Maybe it's a problem with the classpath. My $HADOOP_HOME is /usr/local/hadoop,
and this is my current *CLASSPATH*:
echo $CLASSPATH
/usr/local/hadoop/etc/hadoop:/usr/local/hadoop/share/hadoop/common/lib/*:/usr/local/hadoop/share/hadoop/common/*:/usr/local/hadoop/share/hadoop/hdfs:/usr/local/hadoop/share/hadoop/hdfs/lib/*:/usr/local/hadoop/share/hadoop/hdfs/*:/usr/local/hadoop/share/hadoop/yarn/lib/*:/usr/local/hadoop/share/hadoop/yarn/*:/usr/local/hadoop/share/hadoop/mapreduce/lib/*:/usr/local/hadoop/share/hadoop/mapreduce/*:/contrib/capacity-scheduler/*.jar
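One thing I'm wondering about: I've read that a JVM started through JNI (which is how libhdfs embeds Java) does not expand the * wildcards in CLASSPATH the way the java launcher does, so every jar may need to be listed explicitly. A minimal sketch of how I could build such an expanded classpath, assuming all the jars sit under $HADOOP_HOME/share/hadoop:

# list every Hadoop jar explicitly instead of relying on wildcards
for jar in $(find $HADOOP_HOME/share/hadoop -name '*.jar'); do
    CLASSPATH=$CLASSPATH:$jar
done
export CLASSPATH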
Any help is appreciated. Thanks!