
Hadoop distcp No AWS Credentials provided

Ask Time: 2017-11-23T21:16:57    Author: KDC


I have a huge bucket of S3 files that I want to put on HDFS. Given the number of files involved, my preferred solution is to use 'distributed copy'. However, for some reason I can't get hadoop distcp to pick up my Amazon S3 credentials. The command I use is:

hadoop distcp -update s3a://[bucket]/[folder]/[filename] hdfs:///some/path/ -D fs.s3a.awsAccessKeyId=[keyid] -D fs.s3a.awsSecretAccessKey=[secretkey] -D fs.s3a.fast.upload=true

However, it behaves exactly as if the '-D' arguments weren't there.

ERROR tools.DistCp: Exception encountered
java.io.InterruptedIOException: doesBucketExist on [bucket]: com.amazonaws.AmazonClientException: No AWS Credentials provided by BasicAWSCredentialsProvider EnvironmentVariableCredentialsProvider SharedInstanceProfileCredentialsProvider : com.amazonaws.SdkClientException: Unable to load credentials from service endpoint

I've looked at the hadoop distcp documentation, but can't find an explanation there for why this isn't working. I've tried -Dfs.s3n.awsAccessKeyId as a flag, which didn't work either. I've read that explicitly passing credentials isn't good practice, so maybe this is just some gentle suggestion to do it some other way?

How is one supposed to pass S3 credentials to distcp? Does anyone know?

Author: KDC. Reproduced under the CC 4.0 BY-SA copyright license with a link to the original source and this disclaimer.
Link to original article:https://stackoverflow.com/questions/47456479/hadoop-distcp-no-aws-credentials-provided
KDC :

It appears the format of the credentials flags has changed since the previous version. The following command works:

    hadoop distcp \
      -Dfs.s3a.access.key=[accesskey] \
      -Dfs.s3a.secret.key=[secretkey] \
      -Dfs.s3a.fast.upload=true \
      -update \
      s3a://[bucket]/[folder]/[filename] hdfs:///some/path
2017-11-23T13:33:49
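Since the question mentions that explicitly passing credentials is discouraged, one alternative worth noting: the error message's provider chain includes EnvironmentVariableCredentialsProvider, so the S3A connector can also pick the keys up from the standard AWS environment variables. A minimal sketch (the placeholders are the same as above; this assumes the environment is visible to the client running distcp):

    # Let EnvironmentVariableCredentialsProvider (listed in the error's
    # provider chain) supply the keys instead of -D flags on the command line.
    export AWS_ACCESS_KEY_ID=[keyid]
    export AWS_SECRET_ACCESS_KEY=[secretkey]

    # No credential flags needed now.
    hadoop distcp -update s3a://[bucket]/[folder]/[filename] hdfs:///some/path/

This keeps the secret key out of the shell history and out of the process argument list, where -D flags are visible to anyone who can run ps.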
Bhargav Kosaraju :

In case someone arrives here with the same error while using -D hadoop.security.credential.provider.path: ensure your credential store (the jceks file) is located on the distributed file system (HDFS). distcp starts from one of the NodeManager nodes, so all of them need to be able to access the store.
2018-12-19T06:24:26
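The setup described above can be sketched with the standard hadoop credential CLI. This is a sketch, not the answerer's exact commands; the HDFS path /user/hadoop/s3.jceks is a hypothetical example, while the alias names are the standard s3a property names:

    # Create a JCEKS credential store directly on HDFS so every
    # NodeManager node can read it.
    hadoop credential create fs.s3a.access.key -value [accesskey] \
        -provider jceks://hdfs/user/hadoop/s3.jceks
    hadoop credential create fs.s3a.secret.key -value [secretkey] \
        -provider jceks://hdfs/user/hadoop/s3.jceks

    # Point distcp at the store instead of passing keys on the command line.
    hadoop distcp \
        -Dhadoop.security.credential.provider.path=jceks://hdfs/user/hadoop/s3.jceks \
        -update s3a://[bucket]/[folder]/[filename] hdfs:///some/path

Had the store been created on the local file system (a localjceks:// path), only the submitting host could read it, which is the failure mode this answer is warning about.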