I have some custom Java objects (which are internally composed of other custom objects). I want to write these to HDFS in Parquet format.
Even after a lot of searching, most suggestions seem to revolve around using an Avro schema and Parquet's internal AvroConverter to store the objects.
From what I've seen here and here, it seems I would have to write a custom WriteSupport to accomplish this.
Is there a better way to do this? Which is more efficient: writing the custom objects directly, or using something like Avro as an intermediate schema definition?
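For context, the Avro-intermediate route I'm considering would look roughly like the sketch below, using parquet-avro's AvroParquetWriter with a reflection-derived schema. MyCustomObject is a placeholder for my own class, and the output path is just an example; this assumes parquet-avro and the Hadoop client libraries are on the classpath.

```java
import org.apache.avro.Schema;
import org.apache.avro.reflect.ReflectData;
import org.apache.hadoop.fs.Path;
import org.apache.parquet.avro.AvroParquetWriter;
import org.apache.parquet.hadoop.ParquetWriter;

public class ReflectParquetExample {

    // Placeholder POJO standing in for my actual custom object
    public static class MyCustomObject {
        public String name;
        public int value;

        public MyCustomObject() {}

        public MyCustomObject(String name, int value) {
            this.name = name;
            this.value = value;
        }
    }

    public static void main(String[] args) throws Exception {
        // Derive an Avro schema from the POJO via reflection,
        // instead of hand-writing an .avsc file
        Schema schema = ReflectData.get().getSchema(MyCustomObject.class);

        // AvroParquetWriter handles the Avro -> Parquet conversion;
        // withDataModel(ReflectData.get()) tells it to serialize POJOs
        try (ParquetWriter<MyCustomObject> writer =
                 AvroParquetWriter.<MyCustomObject>builder(
                         new Path("file:///tmp/out.parquet"))
                     .withSchema(schema)
                     .withDataModel(ReflectData.get())
                     .build()) {
            writer.write(new MyCustomObject("example", 42));
        }
    }
}
```

My worry is whether the reflection-based schema handles deeply nested custom objects cleanly, or whether a hand-written WriteSupport would give more control over the resulting Parquet schema.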