
Errors in hadoop fs -cat command while writing output of a text-delimited table to local Unix

Ask Time: 2020-04-29T10:31:19         Author: Mohammed


I have a very large amount of data stored in a text-delimited table whose DDL I created through Impala and whose data I wrote with Hive. I want to pull it down to the local file system in Unix as a single file. There are multiple files in the HDFS table location path, and this is the command I used:

hadoop fs -cat '<tablelocationinHDFS>/*' > /unixlocalpathandfilename.csv

I'm getting two errors:

  1. A "cat: ... Is a directory" error
    This happens because there is an impala_insert_staging directory inside the table location; how do I avoid it? (One possible workaround is sketched after this list.)
  2. A "cat: Unable to write to output stream" error
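
One possible way to dodge the directory entry, sketched under the assumption that the table location contains only the data files plus the staging subdirectory: list the location with hadoop fs -ls, keep only regular files (permission string starting with -), and cat each one into the local target. The placeholder paths are the same ones from the question.

# keep only regular files (first ls column starts with '-'), then cat each into one local file
hadoop fs -ls '<tablelocationinHDFS>' | awk '$1 ~ /^-/ {print $NF}' | \
  xargs -n 1 hadoop fs -cat > /unixlocalpathandfilename.csv

The awk filter also drops the "Found N items" header line that hadoop fs -ls prints, since its first field does not start with a dash either.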

Or should I be copying the data to local with the get or getmerge command to produce a single file?
How do I avoid the above errors and get the command to run successfully?
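
For getting everything into a single local file, getmerge is the one-step route, since it concatenates the files under a source path into one local file. A minimal sketch, assuming the same placeholder paths as in the question:

hadoop fs -getmerge '<tablelocationinHDFS>' /unixlocalpathandfilename.csv

Whether getmerge tolerates the impala_insert_staging subdirectory can depend on the Hadoop version, so it is worth testing on a small table first; the -nl flag, if desired, adds a newline after each merged file.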

Author: Mohammed, reproduced under the CC 4.0 BY-SA copyright license with a link to the original source and this disclaimer.
Link to original article: https://stackoverflow.com/questions/61493257/errors-in-hadoop-fs-cat-command-while-writing-output-of-text-delimited-table-in