21 changes: 15 additions & 6 deletions .github/workflows/e2e-tests-flink-1.x.yml
@@ -45,6 +45,7 @@ jobs:
matrix:
# Only Test Latest Version
flink_version: [ '1.20' ]
hadoop_profile: ['', 'hadoop3']
steps:
- name: Checkout code
uses: actions/checkout@v6
@@ -56,19 +57,27 @@ jobs:
distribution: 'temurin'

- name: Build Flink
run: mvn -T 2C -B clean install -DskipTests -Pflink1,spark3 -pl paimon-e2e-tests -am -Pflink-${{ matrix.flink_version }}
run: |
profile="flink1,spark3,flink-${{ matrix.flink_version }}"
if ! [ "${{ matrix.hadoop_profile }}" = "" ]; then
profile="$profile,${{ matrix.hadoop_profile }}"
fi
mvn -T 2C -B clean install -DskipTests -P$profile -pl paimon-e2e-tests -am

- name: Test Flink
run: |
# run tests with random timezone to find out timezone related bugs
. .github/workflows/utils.sh
jvm_timezone=$(random_timezone)
echo "JVM timezone is set to $jvm_timezone"
profile="flink-${{ matrix.flink_version }}"
if [ "${{ matrix.flink_version }}" = "${{ matrix.flink_version[-1] }}" ]; then
mvn -T 1C -B test -Pflink1,spark3 -pl paimon-e2e-tests -Duser.timezone=$jvm_timezone
else
mvn -T 1C -B test -Pflink1,spark3 -pl paimon-e2e-tests -Duser.timezone=$jvm_timezone -P${profile}
profile="flink1,spark3"
if ! [ "${{ matrix.flink_version }}" = "${{ matrix.flink_version[-1] }}" ]; then
profile="$profile,flink-${{ matrix.flink_version }}"
fi
if ! [ "${{ matrix.hadoop_profile }}" = "" ]; then
profile="$profile,${{ matrix.hadoop_profile }}"
fi
mvn -T 1C -B test -P$profile -pl paimon-e2e-tests -Duser.timezone=$jvm_timezone
env:
MAVEN_OPTS: -Xmx4096m
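The workflow steps above all build the Maven `-P` argument the same way: start from the always-on profiles and append optional ones only when the matrix value is non-empty. The pattern can be sketched as a standalone script (the matrix values below are illustrative, not taken from a real run):

```shell
# Simulated matrix inputs; in CI these come from ${{ matrix.* }}.
# hadoop_profile may legitimately be the empty string.
flink_version="1.20"
hadoop_profile="hadoop3"

# Always-on profiles first, then conditionally append the optional ones.
profile="flink1,spark3"
if ! [ "$flink_version" = "" ]; then
  profile="$profile,flink-$flink_version"
fi
if ! [ "$hadoop_profile" = "" ]; then
  profile="$profile,$hadoop_profile"
fi

echo "Maven profiles: $profile"
```

Note that nothing in this pattern overwrites `profile` after the conditionals; a late unconditional assignment would silently discard the appended profiles.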
18 changes: 14 additions & 4 deletions .github/workflows/utitcase-flink-1.x-common.yml
@@ -39,7 +39,9 @@ jobs:
build_test:
runs-on: ubuntu-latest
timeout-minutes: 60

strategy:
matrix:
hadoop_profile: ['', 'hadoop3']
steps:
- name: Checkout code
uses: actions/checkout@v6
@@ -52,18 +54,26 @@ jobs:

- name: Build Flink
run: |
profile="flink1,spark3"
if ! [ "${{ matrix.hadoop_profile }}" = "" ]; then
profile="$profile,${{ matrix.hadoop_profile }}"
fi
COMPILE_MODULE="org.apache.paimon:paimon-flink-common"
echo "Start compiling modules: $COMPILE_MODULE"
mvn -T 2C -B clean install -DskipTests -Pflink1,spark3 -pl "${COMPILE_MODULE}" -am
mvn -T 2C -B clean install -DskipTests -P$profile -pl "${COMPILE_MODULE}" -am

- name: Test Flink
run: |
. .github/workflows/utils.sh
jvm_timezone=$(random_timezone)
echo "JVM timezone is set to $jvm_timezone"
profile="flink1,spark3"
if ! [ "${{ matrix.hadoop_profile }}" = "" ]; then
profile="$profile,${{ matrix.hadoop_profile }}"
fi
TEST_MODULE="org.apache.paimon:paimon-flink-common"
echo "Start testing module: $TEST_MODULE"
mvn -T 2C -B test verify -Pflink1,spark3 -pl "${TEST_MODULE}" -Duser.timezone=$jvm_timezone
mvn -T 2C -B test verify -P$profile -pl "${TEST_MODULE}" -Duser.timezone=$jvm_timezone
echo "All modules tested"
env:
MAVEN_OPTS: -Xmx4096m -XX:+UseG1GC -XX:CICompilerCount=2
17 changes: 14 additions & 3 deletions .github/workflows/utitcase-flink-1.x-others.yml
@@ -39,6 +39,9 @@ jobs:
build_test:
runs-on: ubuntu-latest
timeout-minutes: 60
strategy:
matrix:
hadoop_profile: ['', 'hadoop3']

steps:
- name: Checkout code
@@ -52,18 +55,26 @@ jobs:

- name: Build Flink
run: |
mvn -T 2C -B clean install -DskipTests -Pflink1,spark3
profile="flink1,spark3"
if ! [ "${{ matrix.hadoop_profile }}" = "" ]; then
profile="$profile,${{ matrix.hadoop_profile }}"
fi
mvn -T 2C -B clean install -DskipTests -P$profile

- name: Test Flink
run: |
. .github/workflows/utils.sh
jvm_timezone=$(random_timezone)
echo "JVM timezone is set to $jvm_timezone"
profile="flink1,spark3"
if ! [ "${{ matrix.hadoop_profile }}" = "" ]; then
profile="$profile,${{ matrix.hadoop_profile }}"
fi
test_modules=""
for suffix in cdc 1.16 1.17 1.18 1.19 1.20; do
test_modules+="org.apache.paimon:paimon-flink-${suffix},"
done
test_modules="${test_modules%,}"
mvn -T 2C -B test verify -Pflink1,spark3 -pl "${test_modules}" -Duser.timezone=$jvm_timezone
mvn -T 2C -B test verify -P$profile -pl "${test_modules}" -Duser.timezone=$jvm_timezone
env:
MAVEN_OPTS: -Xmx4096m -XX:+UseG1GC -XX:CICompilerCount=2
12 changes: 11 additions & 1 deletion docs/content/flink/quick-start.md
@@ -70,7 +70,17 @@ You can also manually build bundled jar from the source code.
To build from source code, [clone the git repository]({{< github_repo >}}).

Build bundled jar with the following command.
- `mvn clean install -DskipTests`

```bash
# build paimon flink 1.x
mvn clean install -DskipTests

# build paimon flink 1.x with hadoop 3.x
mvn clean package -DskipTests -Pflink1,hadoop3
```

> **Reviewer:** Why do we need to build with `-Phadoop3`? What changed?
>
> **Author (@yunfengzhou-hub, Jan 14, 2026):** Hadoop 2.x relies on commons-lang, which might be vulnerable to CVE-2025-48924. Some of our Paimon users have asked us to provide a paimon-flink version with CVEs like this one fixed. I'll add this to the description of this PR.
>
> **Reviewer:** Do you mean we will bundle the corresponding dependencies into our JAR file based on the Hadoop version?
>
> **Author:** No, the official Paimon jars will not bundle Hadoop 3.x dependencies. This PR is more about providing a guarantee: if another project relies on paimon-flink and Hadoop 3.x at the same time, that project should not hit Hadoop version compatibility issues.

```bash

# build paimon flink 2.x (Java 11 required)
mvn clean package -DskipTests -Pflink2
```

You can find the bundled jar in `./paimon-flink/paimon-flink-<flink-version>/target/paimon-flink-<flink-version>-{{< version >}}.jar`, and the action jar in `./paimon-flink/paimon-flink-action/target/paimon-flink-action-{{< version >}}.jar`.
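Once the bundle is built, a typical next step is to make it visible to Flink. The commands below are a sketch only; the exact jar path depends on the Flink version and Paimon version you built, and `$FLINK_HOME` is assumed to point at your Flink distribution:

```
# Locate the bundled jar produced by the build (version numbers will vary)
ls paimon-flink/paimon-flink-1.20/target/paimon-flink-1.20-*.jar

# Copy it into Flink's lib directory so the SQL client can load it
cp paimon-flink/paimon-flink-1.20/target/paimon-flink-1.20-*.jar "$FLINK_HOME/lib/"
```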

14 changes: 14 additions & 0 deletions paimon-common/pom.xml
@@ -258,6 +258,20 @@ under the License.
</dependency>
</dependencies>

<profiles>
<profile>
<id>hadoop3</id>
<dependencies>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-hdfs-client</artifactId>
<version>${hadoop.version}</version>
<scope>provided</scope>
</dependency>
</dependencies>
</profile>
</profiles>

<build>
<plugins>
<plugin>
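The new `hadoop3` profile in `paimon-common` adds `hadoop-hdfs-client` at `provided` scope. If this diff matches your checkout, standard Maven goals can confirm the profile is wired up correctly (module coordinates are assumed from the paths in this PR):

```
# Show which profiles are active for paimon-common when -Phadoop3 is passed
mvn help:active-profiles -Phadoop3 -pl paimon-common

# Confirm hadoop-hdfs-client shows up in the dependency tree (provided scope)
mvn dependency:tree -Phadoop3 -pl paimon-common \
    -Dincludes=org.apache.hadoop:hadoop-hdfs-client
```

Because the dependency is `provided`, it participates in compilation but is not bundled, which is consistent with the author's statement that official Paimon jars do not ship Hadoop 3.x dependencies.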
18 changes: 9 additions & 9 deletions paimon-flink/paimon-flink-cdc/pom.xml
@@ -339,15 +339,15 @@ under the License.
</dependency>
</dependencies>

<!-- <dependencyManagement>-->
<!-- <dependencies>-->
<!-- <dependency>-->
<!-- <groupId>com.google.guava</groupId>-->
<!-- <artifactId>guava</artifactId>-->
<!-- <version>32.1.2-jre</version>-->
<!-- </dependency>-->
<!-- </dependencies>-->
<!-- </dependencyManagement>-->
<dependencyManagement>
<dependencies>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-annotations</artifactId>
<version>2.15.2</version>
</dependency>
</dependencies>
</dependencyManagement>

<build>
<plugins>
6 changes: 6 additions & 0 deletions pom.xml
@@ -532,6 +532,12 @@ under the License.
<activeByDefault>true</activeByDefault>
</activation>
</profile>
<profile>
<id>hadoop3</id>
<properties>
<hadoop.version>3.4.2</hadoop.version>
</properties>
</profile>
</profiles>

<build>
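The root `pom.xml` change means the `hadoop3` profile works purely by overriding the `hadoop.version` property; every module that references `${hadoop.version}` picks up 3.4.2 when the profile is active. A quick way to sanity-check this property resolution (assuming a standard checkout) is:

```
# Effective hadoop.version with the default profiles
mvn help:evaluate -Dexpression=hadoop.version -q -DforceStdout

# Effective hadoop.version with the new profile; per this PR it should be 3.4.2
mvn help:evaluate -Dexpression=hadoop.version -Phadoop3 -q -DforceStdout
```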