
[MINOR] Update README.md (#2010)

- add maven profile to test running commands
- remove -DskipITs for packaging commands
Raymond Xu
2020-08-24 09:28:29 -07:00
committed by GitHub
parent f8dcd5334e
commit 111a9753a0


@@ -54,7 +54,7 @@ Prerequisites for building Apache Hudi:
 ```
 # Checkout code and build
 git clone https://github.com/apache/hudi.git && cd hudi
-mvn clean package -DskipTests -DskipITs
+mvn clean package -DskipTests
 # Start command
 spark-2.4.4-bin-hadoop2.7/bin/spark-shell \
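The spark-shell launch command in this hunk is truncated; it needs the bundle jar produced by the build above. A hedged sketch for locating it, assuming the bundle is written under `packaging/` (a layout assumption, not something this diff shows):

```shell
# Find the built Hudi Spark bundle jar to pass to spark-shell.
# The packaging/ path pattern is an assumption, not taken from this diff.
ls packaging/hudi-spark-bundle/target/hudi-spark-bundle*.jar
```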
@@ -73,7 +73,7 @@ mvn clean javadoc:aggregate -Pjavadocs
 The default Scala version supported is 2.11. To build for Scala 2.12 version, build using `scala-2.12` profile
 ```
-mvn clean package -DskipTests -DskipITs -Dscala-2.12
+mvn clean package -DskipTests -Dscala-2.12
 ```
 ### Build without spark-avro module
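One way to check that the `scala-2.12` profile took effect is to look for the `_2.12` suffix on the built bundle artifacts; a sketch assuming the usual Maven `target/` layout (the path is an assumption, not shown in this diff):

```shell
# Built jars should carry the Scala 2.12 artifact suffix,
# e.g. hudi-spark-bundle_2.12-*.jar.
# The packaging/ path is an assumption, not taken from this diff.
ls packaging/hudi-spark-bundle/target/ | grep '_2.12'
```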
@@ -83,7 +83,7 @@ The default hudi-jar bundles spark-avro module. To build without spark-avro modu
 ```
 # Checkout code and build
 git clone https://github.com/apache/hudi.git && cd hudi
-mvn clean package -DskipTests -DskipITs -Pspark-shade-unbundle-avro
+mvn clean package -DskipTests -Pspark-shade-unbundle-avro
 # Start command
 spark-2.4.4-bin-hadoop2.7/bin/spark-shell \
@@ -94,14 +94,19 @@ spark-2.4.4-bin-hadoop2.7/bin/spark-shell \
 ## Running Tests
-All tests can be run with maven
+Unit tests can be run with maven profile `unit-tests`.
 ```
-mvn test
+mvn -Punit-tests test
 ```
+Functional tests, which are tagged with `@Tag("functional")`, can be run with maven profile `functional-tests`.
+```
+mvn -Pfunctional-tests test
+```
 To run tests with spark event logging enabled, define the Spark event log directory. This allows visualizing test DAG and stages using Spark History Server UI.
 ```
-mvn test -DSPARK_EVLOG_DIR=/path/for/spark/event/log
+mvn -Punit-tests test -DSPARK_EVLOG_DIR=/path/for/spark/event/log
 ```
 ## Quickstart
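The new profiles compose with Maven Surefire's standard selection flags, so a single test class can still be targeted; a sketch in which the class and module names are hypothetical placeholders, not names taken from the Hudi codebase:

```shell
# Run one unit test class under the unit-tests profile
# (SomeUnitTest and the -pl module are illustrative placeholders).
mvn -Punit-tests test -Dtest=SomeUnitTest -pl hudi-common

# Run one functional test class under the functional-tests profile.
mvn -Pfunctional-tests test -Dtest=SomeFunctionalTest
```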