This directory contains example code that uses Hudi.

To run the demo:

  1. Configure the SPARK_MASTER environment variable; yarn-cluster mode is used by default.
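For example, you can override the default like this (the master URL below is an assumption for a quick local run; substitute your own cluster's master URL):

```shell
# Point the demos at a specific Spark master. "yarn-cluster" is the default;
# any valid master URL works, e.g. local[2] for a quick local test run.
export SPARK_MASTER="local[2]"
echo "SPARK_MASTER is set to $SPARK_MASTER"
```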

  2. For the hudi write client demo and the hudi data source demo, just submit them with spark-submit as a common Spark application
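A submit command might look like the following sketch. The example class name, jar path, table path, and table name are all assumptions; point them at the example class and bundle jar your build actually produced:

```shell
# A sketch of submitting one of the examples as an ordinary Spark app.
# The --class value, jar path, and trailing arguments are assumptions;
# adapt them to your checkout and build output before submitting.
SUBMIT_CMD="spark-submit --master ${SPARK_MASTER:-yarn-cluster} --class org.apache.hudi.examples.spark.HoodieWriteClientExample hudi-examples/target/hudi-examples-bundle.jar /tmp/hoodie/sample-table sample_table"
echo "$SUBMIT_CMD"   # inspect the command first; then run it on a machine with Spark installed
```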

  3. For the hudi delta streamer demo with a custom source, run bin/custom-delta-streamer-example.sh

  4. For the hudi delta streamer demo with a DFS source:

    4.1 Prepare the DFS data; we have provided src/main/resources/delta-streamer-config/dfs/source-file.json for testing

    4.2 Run bin/dfs-delta-streamer-example.sh

  5. For the hudi delta streamer demo with a Kafka source:

    5.1 Start Kafka server

    5.2 Configure your Kafka properties; we have provided src/main/resources/delta-streamer-config/kafka/kafka-source.properties for testing
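A minimal sketch of what such a properties file might contain is shown below. Only the hoodie.deltastreamer.source.kafka.topic key is named in this README; the other keys are standard Kafka consumer settings, and all the values (topic name, broker address, offset policy) are assumptions you must adapt to your cluster:

```properties
# Minimal kafka-source.properties sketch; values are placeholders.
hoodie.deltastreamer.source.kafka.topic=hudi-test-topic
bootstrap.servers=localhost:9092
auto.offset.reset=earliest
```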

    5.3 Run bin/kafka-delta-streamer-example.sh

    5.4 Continuously write source data to the Kafka topic you configured with hoodie.deltastreamer.source.kafka.topic in kafka-source.properties
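One way to keep feeding records is the console producer that ships with Kafka. The topic name and broker address below are assumptions and must match your kafka-source.properties; since the producer needs a running Kafka installation, this sketch only builds and prints the command:

```shell
# Sketch of producing JSON records to the configured topic. Topic name and
# broker address are assumptions; keep them in sync with kafka-source.properties.
TOPIC="hudi-test-topic"
PRODUCER_CMD="kafka-console-producer.sh --bootstrap-server localhost:9092 --topic $TOPIC"
echo "$PRODUCER_CMD"   # pipe records in, e.g.: cat records.json | $PRODUCER_CMD
```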

  6. Some notes on the delta streamer demo:

    6.1 The configuration files we provide are only minimal demos; change them according to your specific needs.

    6.2 You can also run the examples directly from IntelliJ by configuring the parameters as "Program arguments"