Commit e8cdf2e: version bump to 0.8.4
1 parent: 8618954

8 files changed: 16 additions, 16 deletions
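Every change in this commit is the same version string rewritten across eight files. A bump like this can be scripted; below is a minimal sketch of such a helper (hypothetical, not part of this repo), using a plain regex replacement since the diff shows the version appears only as the literal `0.8.3`:

```python
import re
from pathlib import Path

# Old version as seen throughout this commit's diff. The SNAPSHOT jar
# names in clean-and-setup.sh and the example scripts embed it too, so
# a plain string substitution covers every changed file.
VERSION_RE = re.compile(r"0\.8\.3")


def bump_text(text: str, new_version: str = "0.8.4") -> str:
    """Replace every occurrence of the old version string."""
    return VERSION_RE.sub(new_version, text)


def bump_file(path: Path, new_version: str = "0.8.4") -> int:
    """Rewrite one file in place; return how many occurrences changed."""
    original = path.read_text(encoding="utf-8")
    updated, count = VERSION_RE.subn(new_version, original)
    if count:
        path.write_text(updated, encoding="utf-8")
    return count
```

In practice a release plugin (sbt-release, npm version) would handle its own manifest; this sketch only illustrates why the diff below is 16 additions and 16 deletions of near-identical lines.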


README.md (7 additions, 7 deletions)

````diff
@@ -62,12 +62,12 @@ See [Our Features](https://dataflint.gitbook.io/dataflint-for-spark/overview/our
 Install DataFlint OSS via sbt:
 
 For Spark 3.X:
 ```sbt
-libraryDependencies += "io.dataflint" %% "spark" % "0.8.3"
+libraryDependencies += "io.dataflint" %% "spark" % "0.8.4"
 ```
 
 For Spark 4.X:
 ```sbt
-libraryDependencies += "io.dataflint" %% "dataflint-spark4" % "0.8.3"
+libraryDependencies += "io.dataflint" %% "dataflint-spark4" % "0.8.4"
 ```
 
@@ -87,7 +87,7 @@ For Spark 3.X:
 ```python
 builder = pyspark.sql.SparkSession.builder
 ...
-    .config("spark.jars.packages", "io.dataflint:spark_2.12:0.8.3") \
+    .config("spark.jars.packages", "io.dataflint:spark_2.12:0.8.4") \
     .config("spark.plugins", "io.dataflint.spark.SparkDataflintPlugin") \
 ...
 ```
@@ -96,7 +96,7 @@ For Spark 4.X:
 ```python
 builder = pyspark.sql.SparkSession.builder
 ...
-    .config("spark.jars.packages", "io.dataflint:dataflint-spark4_2.13:0.8.3") \
+    .config("spark.jars.packages", "io.dataflint:dataflint-spark4_2.13:0.8.4") \
     .config("spark.plugins", "io.dataflint.spark.SparkDataflintPlugin") \
 ...
 ```
@@ -107,22 +107,22 @@ Alternatively, install DataFlint OSS with **no code change** as a spark ivy pack
 
 ```bash
 spark-submit
---packages io.dataflint:spark_2.12:0.8.3 \
+--packages io.dataflint:spark_2.12:0.8.4 \
 --conf spark.plugins=io.dataflint.spark.SparkDataflintPlugin \
 ...
 ```
 
 For Spark 4.X:
 ```bash
 spark-submit
---packages io.dataflint:dataflint-spark4_2.13:0.8.3 \
+--packages io.dataflint:dataflint-spark4_2.13:0.8.4 \
 --conf spark.plugins=io.dataflint.spark.SparkDataflintPlugin \
 ...
 ```
 
 ### Additional installation options
 
-* There is also support for scala 2.13, if your spark cluster is using scala 2.13 change package name to io.dataflint:spark_**2.13**:0.8.3
+* There is also support for scala 2.13, if your spark cluster is using scala 2.13 change package name to io.dataflint:spark_**2.13**:0.8.4
 * For more installation options, including for **python** and **k8s spark-operator**, see [Install on Spark docs](https://dataflint.gitbook.io/dataflint-for-spark/getting-started/install-on-spark)
 * For installing DataFlint OSS in **spark history server** for observability on completed runs see [install on spark history server docs](https://dataflint.gitbook.io/dataflint-for-spark/getting-started/install-on-spark-history-server)
 * For installing DataFlint OSS on **DataBricks** see [install on databricks docs](https://dataflint.gitbook.io/dataflint-for-spark/getting-started/install-on-databricks)
````
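Every coordinate bumped in the README diff above follows Maven's `groupId:artifactId_scalaBinaryVersion:version` pattern. A small sketch (hypothetical helper, assuming only the artifact names visible in the README: `spark` for Spark 3.x, `dataflint-spark4` for Spark 4.x) that assembles such a coordinate:

```python
def dataflint_coordinate(spark_major: int, scala_binary: str, version: str) -> str:
    """Build a --packages / spark.jars.packages coordinate for DataFlint.

    Assumes the artifact naming shown in the README: "spark" for
    Spark 3.x, "dataflint-spark4" for Spark 4.x, each suffixed with
    the Scala binary version (2.12 or 2.13).
    """
    artifact = "spark" if spark_major == 3 else "dataflint-spark4"
    return f"io.dataflint:{artifact}_{scala_binary}:{version}"
```

Seen this way, the seven README edits are one fact (the version) threaded through one naming scheme, which is why the scala 2.13 bullet only changes the `_2.12`/`_2.13` suffix.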

examples/distributed_computing_unpacked.ipynb (1 addition, 1 deletion)

```diff
@@ -32,7 +32,7 @@
 "spark = SparkSession \\\n",
 " .builder \\\n",
 " .appName(\"Distributed Compute Examples\") \\\n",
-" .config(\"spark.jars.packages\", \"io.dataflint:spark_2.12:0.8.3\") \\\n",
+" .config(\"spark.jars.packages\", \"io.dataflint:spark_2.12:0.8.4\") \\\n",
 " .config(\"spark.plugins\", \"io.dataflint.spark.SparkDataflintPlugin\") \\\n",
 " .config(\"spark.ui.port\", \"11000\") \\\n",
 " .master(\"local[*]\") \\\n",
```

spark-plugin/build.sbt (1 addition, 1 deletion)

```diff
@@ -1,7 +1,7 @@
 import xerial.sbt.Sonatype._
 import sbtassembly.AssemblyPlugin.autoImport._
 
-lazy val versionNum: String = "0.8.3"
+lazy val versionNum: String = "0.8.4"
 lazy val scala212 = "2.12.20"
 lazy val scala213 = "2.13.16"
 lazy val supportedScalaVersions = List(scala212, scala213)
```

spark-plugin/clean-and-setup.sh (2 additions, 2 deletions)

```diff
@@ -38,6 +38,6 @@ echo "1. Refresh your IntelliJ IDEA project (File -> Reload Gradle Project or si
 echo "2. If you still get conflicts, try: File -> Invalidate Caches and Restart"
 echo ""
 echo "📦 Fat JARs created:"
-echo "- Spark 3.x: pluginspark3/target/scala-2.12/dataflint-spark3_2.12-0.8.3-SNAPSHOT.jar"
-echo "- Spark 4.x: pluginspark4/target/scala-2.13/dataflint-spark4_2.13-0.8.3-SNAPSHOT.jar"
+echo "- Spark 3.x: pluginspark3/target/scala-2.12/dataflint-spark3_2.12-0.8.4-SNAPSHOT.jar"
+echo "- Spark 4.x: pluginspark4/target/scala-2.13/dataflint-spark4_2.13-0.8.4-SNAPSHOT.jar"
```
spark-plugin/example_3_5_1/map_in_arrow_example.py (1 addition, 1 deletion)

```diff
@@ -18,7 +18,7 @@
 # Resolve the plugin JAR path relative to this script's location
 _script_dir = Path(__file__).resolve().parent
 _project_root = _script_dir.parent  # up to spark-plugin/
-_plugin_jar = _project_root / "pluginspark3" / "target" / "scala-2.12" / "spark_2.12-0.8.3-SNAPSHOT.jar"
+_plugin_jar = _project_root / "pluginspark3" / "target" / "scala-2.12" / "spark_2.12-0.8.4-SNAPSHOT.jar"
 
 if not _plugin_jar.exists():
     raise FileNotFoundError(
```

spark-plugin/example_3_5_1/map_in_pandas_example.py (1 addition, 1 deletion)

```diff
@@ -13,7 +13,7 @@
 # Resolve the plugin JAR path relative to this script's location
 _script_dir = Path(__file__).resolve().parent
 _project_root = _script_dir.parent  # up to spark-plugin/
-_plugin_jar = _project_root / "pluginspark3" / "target" / "scala-2.12" / "spark_2.12-0.8.3-SNAPSHOT.jar"
+_plugin_jar = _project_root / "pluginspark3" / "target" / "scala-2.12" / "spark_2.12-0.8.4-SNAPSHOT.jar"
 
 if not _plugin_jar.exists():
     raise FileNotFoundError(
```
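Both example scripts hard-code the SNAPSHOT jar name, which is exactly why this commit has to touch them. A version-agnostic alternative would glob for the jar instead; a sketch under that assumption (hypothetical helper, not the repo's code):

```python
from pathlib import Path


def find_plugin_jar(project_root: Path, scala_binary: str = "2.12") -> Path:
    """Locate the newest DataFlint SNAPSHOT jar under the sbt target dir.

    Globbing instead of hard-coding "0.8.4" would spare the example
    scripts from being edited on every version bump.
    """
    target = project_root / "pluginspark3" / "target" / f"scala-{scala_binary}"
    jars = sorted(target.glob(f"spark_{scala_binary}-*-SNAPSHOT.jar"))
    if not jars:
        raise FileNotFoundError(f"no SNAPSHOT jar found in {target}")
    return jars[-1]  # lexicographically newest; fine while versions are 0.x.y
```

The trade-off is that the hard-coded path fails loudly when the build and examples drift apart, while the glob silently picks up whatever jar is present.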

spark-ui/package-lock.json (2 additions, 2 deletions)

(generated file; diff not rendered)

spark-ui/package.json (1 addition, 1 deletion)

```diff
@@ -1,6 +1,6 @@
 {
   "name": "dataflint-ui",
-  "version": "0.8.3",
+  "version": "0.8.4",
   "homepage": "./",
   "private": true,
   "dependencies": {
```
