Currently, all Hudi relations bear a performance gap relative to Spark's HadoopFsRelation. The reason is the SchemaPruning optimization rule (which prunes nested schemas): it is unfortunately predicated on the usage of HadoopFsRelation, meaning it is not applied when any other relation is used. This change ports the rule to Hudi's relations (MOR, Incremental, etc.) by leveraging the HoodieSparkSessionExtensions mechanism to inject a modified version of the original SchemaPruning rule, adapted to work with Hudi's custom relations.

- Added customOptimizerRules to HoodieAnalysis
- Added the NestedSchemaPruning Spark Optimizer rule
- Handled Spark Optimizer's pruned data schema (to effectively prune nested schemas)
- Enabled HoodieClientTestHarness to inject HoodieSparkSessionExtensions
- Injected Spark session extensions for TestMORDataSource and TestCOWDataSource
- Disabled fallback to HadoopFsRelation
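The extensions mechanism above only takes effect when Hudi's session extensions are registered on the SparkSession. A minimal configuration sketch (the app name, master, and table path are hypothetical examples):

```scala
import org.apache.spark.sql.SparkSession

// Register Hudi's session extensions so that rules injected through
// HoodieSparkSessionExtensions (e.g. the ported NestedSchemaPruning
// optimizer rule) are applied to Hudi's custom relations.
val spark = SparkSession.builder()
  .appName("hudi-schema-pruning-example")   // hypothetical
  .master("local[*]")
  .config("spark.sql.extensions",
    "org.apache.spark.sql.hudi.HoodieSparkSessionExtensions")
  .config("spark.serializer",
    "org.apache.spark.serializer.KryoSerializer")
  .getOrCreate()

// With the extensions in place, selecting a single nested field lets
// the optimizer prune the unread parts of the nested schema:
spark.read.format("hudi").load("/tmp/hudi_trips")   // hypothetical path
  .select("rider.firstName")                        // only this nested leaf is needed
  .show()
```

Without the `spark.sql.extensions` setting, the injected optimizer rules are never registered and nested-schema pruning falls back to reading the full nested struct.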
Description of the relationship between the modules
This repo contains the code that integrates Hudi with Spark. The repo is split into the following modules:
- hudi-spark
- hudi-spark2
- hudi-spark3
- hudi-spark3.1.x
- hudi-spark2-common
- hudi-spark3-common
- hudi-spark-common
- hudi-spark is the module that contains the code shared by both the spark2 and spark3 versions; it also contains the antlr4 grammar file that supports Spark SQL on Spark 2.x.
- hudi-spark2 is the module that contains the code compatible with Spark 2.x versions.
- hudi-spark3 is the module that contains the code compatible with Spark 3.2.0 (and above) versions.
- hudi-spark3.1.x is the module that contains the code compatible with Spark 3.1.x and Spark 3.0.x versions.
- hudi-spark2-common is the module that contains the code to be reused across Spark 2.x versions. Right now the module has no classes, since Hudi only supports the Spark 2.4.4 version, and it acts as a placeholder when packaging the hudi-spark-bundle module.
- hudi-spark3-common is the module that contains the code to be reused across Spark 3.x versions.
- hudi-spark-common is the module that contains the code to be reused between the Spark 2.x and Spark 3.x versions.
Description of Time Travel
`HoodieSpark3_2ExtendedSqlAstBuilder` has comments marking the code forked from Spark 3.2's `org.apache.spark.sql.catalyst.parser.AstBuilder`, plus an additional `withTimeTravel` method. `SqlBase.g4` has comments marking the code forked from Spark 3.2's parser, and adds the Spark SQL syntax `TIMESTAMP AS OF` and `VERSION AS OF`.
Time travel support by Spark version:
| version | support |
|---|---|
| 2.4.x | No |
| 3.0.x | No |
| 3.1.2 | No |
| 3.2.0 | Yes |
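As a usage sketch of the added syntax (assuming Spark 3.2.0+ with HoodieSparkSessionExtensions enabled, and a hypothetical Hudi table `hudi_tbl` registered in the catalog):

```scala
// Query the table as of a past commit. Hudi commit instants are
// yyyyMMddHHmmssSSS timestamps; the instant value below is a
// hypothetical example.
spark.sql("SELECT * FROM hudi_tbl TIMESTAMP AS OF '20220307091628793'")

// Equivalent form using the VERSION AS OF syntax added in SqlBase.g4:
spark.sql("SELECT * FROM hudi_tbl VERSION AS OF '20220307091628793'")
```

On Spark versions without the extended parser (see the table above), both statements fail to parse.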
About upgrading Time Travel:
Spark 3.3 supports time travel syntax natively; see SPARK-37219. Once Spark 3.3 is released, the following files will be removed:
- hudi-spark3's `HoodieSpark3_2ExtendedSqlAstBuilder.scala`, `HoodieSpark3_2ExtendedSqlParser.scala`, `TimeTravelRelation.scala`, `SqlBase.g4`, `HoodieSqlBase.g4`