
Is it possible to read and write Parquet using Java without a dependency on Hadoop and HDFS?

Asked: 2017-02-07T06:53:53    Author: Jesse


I've been hunting around for a solution to this question.

It appears to me that there is no way to read and write the Parquet format from a Java program without pulling in dependencies on HDFS and Hadoop. Is this correct?

I want to read and write on a client machine, outside of a Hadoop cluster.

I started to get excited about Apache Drill, but it appears that it must run as a separate process. What I need is an in-process ability to read and write a file using the Parquet format.
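For reference, the kind of in-process usage being asked about can be sketched with the `parquet-avro` module. This is a sketch, not an assertion that it is dependency-free: recent releases of Apache Parquet's Java library provide `LocalInputFile`/`LocalOutputFile`, which avoid the HDFS `Path` and Hadoop `Configuration` types at the API level, although `parquet-avro` has historically still pulled in some Hadoop classes transitively. It assumes `org.apache.parquet:parquet-avro` and Avro are on the classpath.

```java
import java.nio.file.Files;
import java.nio.file.Path;

import org.apache.avro.Schema;
import org.apache.avro.SchemaBuilder;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.parquet.avro.AvroParquetReader;
import org.apache.parquet.avro.AvroParquetWriter;
import org.apache.parquet.hadoop.ParquetReader;
import org.apache.parquet.hadoop.ParquetWriter;
import org.apache.parquet.io.LocalInputFile;
import org.apache.parquet.io.LocalOutputFile;

public class LocalParquetDemo {
    public static void main(String[] args) throws Exception {
        // Describe the row shape with an Avro schema.
        Schema schema = SchemaBuilder.record("User").fields()
                .requiredString("name")
                .requiredInt("age")
                .endRecord();

        Path file = Files.createTempFile("users", ".parquet");
        Files.delete(file); // the writer refuses to overwrite an existing file

        // Write one record to a local file -- no fs.Path, no Hadoop Configuration.
        try (ParquetWriter<GenericRecord> writer =
                 AvroParquetWriter.<GenericRecord>builder(new LocalOutputFile(file))
                     .withSchema(schema)
                     .build()) {
            GenericRecord user = new GenericData.Record(schema);
            user.put("name", "Jesse");
            user.put("age", 42);
            writer.write(user);
        }

        // Read the record back the same way, entirely in-process.
        try (ParquetReader<GenericRecord> reader =
                 AvroParquetReader.<GenericRecord>builder(new LocalInputFile(file))
                     .build()) {
            GenericRecord rec = reader.read();
            System.out.println(rec.get("name") + " " + rec.get("age"));
        }
    }
}
```

Whether this counts as "without Hadoop" depends on how strictly the transitive dependency tree is audited, which is exactly the crux of the question.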

Author: Jesse. Reproduced under the CC 4.0 BY-SA copyright license with a link to the original source and this disclaimer.
Link to original article:https://stackoverflow.com/questions/42078757/is-it-possible-to-read-and-write-parquet-using-java-without-a-dependency-on-hado