I'm working on a Hadoop project, but I'm new to all of this. I have successfully set up my Hadoop cluster on an Ubuntu server to use Kerberos authentication, and I can access my HDFS files with a Kerberos ticket. Now I'm trying to write Java code on my Windows machine to access the remote Kerberized HDFS. This is what I've tried so far:
import java.io.FileNotFoundException;
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.security.UserGroupInformation;

class FileCount
{
    public static void main(final String[] args) throws IOException, FileNotFoundException, InterruptedException {
        System.setProperty("java.security.krb5.realm", "PFE.COM");
        System.setProperty("java.security.krb5.kdc", "ubuntu:88");
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://192.168.8.101:9000");
        conf.set("hadoop.security.authentication", "kerberos");
        conf.set("debug", "true");
        UserGroupInformation.setConfiguration(conf);
        UserGroupInformation.loginUserFromKeytab("hdfs/ubuntu@PFE.COM",
                "C:/fichiers/hdfs.keytab");
        FileSystem fs = FileSystem.get(conf);
        FileStatus[] fsStatus = fs.listStatus(new Path("/"));
        for (int i = 0; i < fsStatus.length; i++) {
            System.out.println(fsStatus[i].getPath().toString());
        }
    }
}
but I'm getting this error:
Exception in thread "main" java.io.IOException: Login failure for hdfs/ubuntu@PFE.COM from keytab C:/fichiers/hdfs.keytab: javax.security.auth.login.LoginException: ubuntu
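In case it matters, I also tried replacing the two System.setProperty calls with a krb5.ini file passed to the JVM via -Djava.security.krb5.conf. This is just a sketch based on my own setup (the realm PFE.COM and KDC host ubuntu:88 are the same assumptions as in my code), not a known-good config:

```ini
[libdefaults]
    default_realm = PFE.COM

[realms]
    PFE.COM = {
        kdc = ubuntu:88
    }
```

It didn't change the error, so I went back to the system properties.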
I transferred my keytab file from the Ubuntu server to my Windows machine so that I could use it in this code.
Also, I don't know whether I need Kerberos installed on my Windows machine for this to work. If you have any idea, please help me with it!
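To check what the JVM actually sees before the login call, I also ran this minimal standalone sketch (same realm/KDC assumptions as in my code above; sun.security.krb5.debug is a standard JVM flag that makes the Kerberos client print its steps, nothing Hadoop-specific):

```java
public class KrbPropsCheck {
    public static void main(String[] args) {
        // Same assumed realm and KDC as in my main code
        System.setProperty("java.security.krb5.realm", "PFE.COM");
        System.setProperty("java.security.krb5.kdc", "ubuntu:88");
        // Standard JVM flag: makes the built-in Kerberos client log its work
        System.setProperty("sun.security.krb5.debug", "true");

        // Print back what the JVM will use, to rule out a typo or override
        System.out.println("realm=" + System.getProperty("java.security.krb5.realm"));
        System.out.println("kdc=" + System.getProperty("java.security.krb5.kdc"));
    }
}
```

The properties print back exactly as I set them, so I don't think the values themselves are the problem.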