Description of the relationship between each module
This repo contains the code that integrates Hudi with Spark. The repo is split into the following modules:
hudi-spark
hudi-spark2
hudi-spark3
hudi-spark3.1.x
hudi-spark2-common
hudi-spark3-common
hudi-spark-common
- hudi-spark is the module that contains the code shared between the Spark 2 and Spark 3 versions; it also contains the ANTLR4 file that supports Spark SQL on Spark 2.x versions.
- hudi-spark2 is the module that contains the code compatible with Spark 2.x versions.
- hudi-spark3 is the module that contains the code compatible with Spark 3.2.0 (and above) versions.
- hudi-spark3.1.x is the module that contains the code compatible with Spark 3.1.x and Spark 3.0.x versions.
- hudi-spark2-common is the module that contains the code that would be reused across Spark 2.x versions. Right now the module has no classes, since Hudi only supports Spark 2.4.4; it acts as a placeholder when packaging the hudi-spark-bundle module.
- hudi-spark3-common is the module that contains the code that would be reused across Spark 3.x versions.
- hudi-spark-common is the module that contains the code that would be reused between Spark 2.x and Spark 3.x versions.
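As a sketch of how a build selects among these version-specific modules, the bundle is typically built with a Maven profile flag per Spark version. The exact profile names below are assumptions; check the root `pom.xml` for the flags your checkout supports.

```shell
# Package against Spark 3.2.x (profile name assumed)
mvn clean package -DskipTests -Dspark3.2

# Package against Spark 2.4.x (profile name assumed)
mvn clean package -DskipTests -Dspark2
```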
Description of Time Travel
`HoodieSpark3_2ExtendedSqlAstBuilder` has comments marking the code forked from Spark 3.2's `org.apache.spark.sql.catalyst.parser.AstBuilder`, plus an additional `withTimeTravel` method. `SqlBase.g4` has comments marking the code forked from Spark 3.2's parser, and adds the Spark SQL syntax `TIMESTAMP AS OF` and `VERSION AS OF`.
Spark versions supporting Time Travel:
| version | support |
|---|---|
| 2.4.x | No |
| 3.0.x | No |
| 3.1.2 | No |
| 3.2.0 | Yes |
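For illustration, a time travel query on Spark 3.2 might look like the following. The table name and the timestamp/version values are hypothetical; the accepted value formats depend on the table's commit instants.

```sql
-- Read the table as of a point in time (hypothetical table and timestamp)
SELECT * FROM hudi_trips TIMESTAMP AS OF '2022-03-07 09:30:00';

-- Read the table as of a specific version (hypothetical value)
SELECT * FROM hudi_trips VERSION AS OF '20220307093000';
```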
About upgrading Time Travel:
Spark 3.3 supports time travel syntax natively; see SPARK-37219. Once Spark 3.3 is released, the files in the following list will be removed:
- hudi-spark3's `HoodieSpark3_2ExtendedSqlAstBuilder.scala`, `HoodieSpark3_2ExtendedSqlParser.scala`, `TimeTravelRelation.scala`, `SqlBase.g4`, `HoodieSqlBase.g4`