
Commit a0f2620

v0.3.0 release prep (#128)
1 parent eb26baa commit a0f2620

11 files changed (+74, -21 lines)

README.md (+4, -5)

@@ -36,7 +36,7 @@
 <tbody align="center">
 <tr>
 <td >2.3.*</td>
-<td rowspan=3><a href="https://github.com/dotnet/spark/releases/tag/v0.2.0">v0.2.0</a></td>
+<td rowspan=4><a href="https://github.com/dotnet/spark/releases/tag/v0.3.0">v0.3.0</a></td>
 </tr>
 <tr>
 <td>2.4.0</td>
@@ -45,12 +45,11 @@
 <td>2.4.1</td>
 </tr>
 <tr>
-<td>2.4.2</td>
-<td><a href="https://github.com/dotnet/spark/issues/60">Not supported</a></td>
+<td>2.4.3</td>
 </tr>
 <tr>
-<td>2.4.3</td>
-<td>master branch</td>
+<td>2.4.2</td>
+<td><a href="https://github.com/dotnet/spark/issues/60">Not supported</a></td>
 </tr>
 </tbody>
 </table>

benchmark/scala/pom.xml (+1, -1)

@@ -3,7 +3,7 @@
 <modelVersion>4.0.0</modelVersion>
 <groupId>com.microsoft.spark</groupId>
 <artifactId>microsoft-spark-benchmark</artifactId>
-<version>0.2.0</version>
+<version>0.3.0</version>
 <inceptionYear>2019</inceptionYear>
 <properties>
 <encoding>UTF-8</encoding>

docs/getting-started/ubuntu-instructions.md (+1, -1)

@@ -7,7 +7,7 @@ These instructions will show you how to run a .NET for Apache Spark app using .N
 - Download and install the following: **[.NET Core 2.1 SDK](https://dotnet.microsoft.com/download/dotnet-core/2.1)** | **[OpenJDK 8](https://openjdk.java.net/install/)** | **[Apache Spark 2.4.1](https://archive.apache.org/dist/spark/spark-2.4.1/spark-2.4.1-bin-hadoop2.7.tgz)**
 - Download and install **[Microsoft.Spark.Worker](https://github.com/dotnet/spark/releases)** release:
   - Select a **[Microsoft.Spark.Worker](https://github.com/dotnet/spark/releases)** release from .NET for Apache Spark GitHub Releases page and download into your local machine (e.g., `~/bin/Microsoft.Spark.Worker`).
-  - **IMPORTANT** Create a [new environment variable](https://help.ubuntu.com/community/EnvironmentVariables) `DotnetWorkerPath` and set it to the directory where you downloaded and extracted the Microsoft.Spark.Worker (e.g., `~/bin/Microsoft.Spark.Worker`).
+  - **IMPORTANT** Create a [new environment variable](https://help.ubuntu.com/community/EnvironmentVariables) `DOTNET_WORKER_DIR` and set it to the directory where you downloaded and extracted the Microsoft.Spark.Worker (e.g., `~/bin/Microsoft.Spark.Worker`).

 For detailed instructions, you can see [Building .NET for Apache Spark from Source on Ubuntu](../building/ubuntu-instructions.md).

docs/getting-started/windows-instructions.md (+1, -1)

@@ -7,7 +7,7 @@ These instructions will show you how to run a .NET for Apache Spark app using .N
 - Download and install the following: **[.NET Core 2.1 SDK](https://dotnet.microsoft.com/download/dotnet-core/2.1)** | **[Visual Studio 2019](https://www.visualstudio.com/downloads/)** | **[Java 1.8](https://www.oracle.com/technetwork/java/javase/downloads/jdk8-downloads-2133151.html)** | **[Apache Spark 2.4.1](https://archive.apache.org/dist/spark/spark-2.4.1/spark-2.4.1-bin-hadoop2.7.tgz)**
 - Download and install **[Microsoft.Spark.Worker](https://github.com/dotnet/spark/releases)** release:
   - Select a **[Microsoft.Spark.Worker](https://github.com/dotnet/spark/releases)** release from .NET for Apache Spark GitHub Releases page and download into your local machine (e.g., `c:\bin\Microsoft.Spark.Worker\`).
-  - **IMPORTANT** Create a [new environment variable](https://www.java.com/en/download/help/path.xml) `DotnetWorkerPath` and set it to the directory where you downloaded and extracted the Microsoft.Spark.Worker (e.g., `c:\bin\Microsoft.Spark.Worker`).
+  - **IMPORTANT** Create a [new environment variable](https://www.java.com/en/download/help/path.xml) `DOTNET_WORKER_DIR` and set it to the directory where you downloaded and extracted the Microsoft.Spark.Worker (e.g., `c:\bin\Microsoft.Spark.Worker`).

 For detailed instructions, you can see [Building .NET for Apache Spark from Source on Windows](../building/windows-instructions.md).

docs/release-notes/0.3/release-0.3.md (new file, +46)

@@ -0,0 +1,46 @@
+# .NET for Apache Spark 0.3 Release Notes
+
+### Release Notes
+
+Below are some of the highlights from this release.
+
+* [Apache Spark 2.4.3](https://spark.apache.org/news/spark-2-4-3-released.html) support ([#118](https://github.com/dotnet/spark/pull/108))
+* dotnet/spark is now using [dotnet/arcade](https://github.com/dotnet/arcade) as the build infrastructure ([#113](https://github.com/dotnet/spark/pull/113))
+* [Source Link](https://github.com/dotnet/sourcelink) is now supported for the NuGet package ([#40](https://github.com/dotnet/spark/issues/40)).
+* Fixed the issue where Microsoft.Spark.dll is not signed ([#119](https://github.com/dotnet/spark/issues/119)).
+* Pickling performance is improved ([#111](https://github.com/dotnet/spark/pull/111)).
+  * Performance improvement PRs in the Pickling Library: [irmen/Pyrolite#64](https://github.com/irmen/Pyrolite/pull/64), [irmen/Pyrolite#67](https://github.com/irmen/Pyrolite/pull/67)
+* ArrayType and MapType are supported as UDF return types ([#112](https://github.com/dotnet/spark/issues/112#issuecomment-493297068), [#114](https://github.com/dotnet/spark/pull/114))
+
+### Supported Spark Versions
+
+The following table outlines the supported Spark versions along with the microsoft-spark JAR to use with:
+
+<table>
+<thead>
+<tr>
+<th>Spark Version</th>
+<th>microsoft-spark JAR</th>
+</tr>
+</thead>
+<tbody align="center">
+<tr>
+<td>2.3.*</td>
+<td>microsoft-spark-2.3.x-0.2.0.jar</td>
+</tr>
+<tr>
+<td>2.4.0</td>
+<td rowspan=3>microsoft-spark-2.4.x-0.2.0.jar</td>
+</tr>
+<tr>
+<td>2.4.1</td>
+</tr>
+<tr>
+<td>2.4.3</td>
+</tr>
+<tr>
+<td>2.4.2</td>
+<td><a href="https://github.com/dotnet/spark/issues/60">Not supported</a></td>
+</tr>
+</tbody>
+</table>
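
To illustrate the ArrayType UDF highlight called out in the release notes above, here is a minimal sketch (not part of this commit) of a UDF that returns `string[]`, which is serialized back to Spark as an ArrayType column. The app name, the `sentence` column, and the sample query are illustrative assumptions.

```csharp
// Minimal sketch, assuming .NET for Apache Spark 0.3.0: a UDF whose string[]
// return value comes back to Spark as an ArrayType column (per #112/#114 above).
using System;
using Microsoft.Spark.Sql;
using static Microsoft.Spark.Sql.Functions;

class ArrayUdfSketch
{
    static void Main()
    {
        SparkSession spark = SparkSession.Builder().AppName("array-udf-sketch").GetOrCreate();

        // Hypothetical single-column input built with plain SQL.
        DataFrame df = spark.Sql("SELECT 'hello dotnet spark' AS sentence");

        // Split the sentence into words; the string[] result becomes an ArrayType column.
        Func<Column, Column> splitWords = Udf<string, string[]>(s => s.Split(' '));

        df.Select(splitWords(Col("sentence")).Alias("words")).Show();

        spark.Stop();
    }
}
```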

eng/Versions.props (+1, -1)

@@ -1,7 +1,7 @@
 <?xml version="1.0" encoding="utf-8"?>
 <Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
 <PropertyGroup>
-<VersionPrefix>0.2.0</VersionPrefix>
+<VersionPrefix>0.3.0</VersionPrefix>
 <PreReleaseVersionLabel>prerelease</PreReleaseVersionLabel>
 <RestoreSources>
 $(RestoreSources);

src/csharp/Microsoft.Spark.E2ETest/SparkFixture.cs (+1, -1)

@@ -36,7 +36,7 @@ public SparkFixture()
 AppDomain.CurrentDomain.BaseDirectory);
 #elif NETCOREAPP2_1
 // For .NET Core, the user must have published the worker as a standalone
-// executable and set DotnetWorkerPath to the published directory.
+// executable and set the worker path to the published directory.
 if (string.IsNullOrEmpty(Environment.GetEnvironmentVariable(workerDirEnvVarName)))
 {
 throw new Exception(
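
The comment change above accompanies the environment-variable rename in the docs (`DotnetWorkerPath` becomes `DOTNET_WORKER_DIR`). As a rough sketch of the kind of check the fixture performs, a test or app can fail fast when the variable is missing; the helper below is hypothetical and not code from this commit.

```csharp
// Hypothetical helper (not part of this commit): validate DOTNET_WORKER_DIR,
// the variable named in the updated getting-started docs, before starting Spark.
using System;
using System.IO;

static class WorkerDirCheck
{
    public static string RequireWorkerDir()
    {
        string workerDir = Environment.GetEnvironmentVariable("DOTNET_WORKER_DIR");
        if (string.IsNullOrEmpty(workerDir) || !Directory.Exists(workerDir))
        {
            throw new InvalidOperationException(
                "Set DOTNET_WORKER_DIR to the directory where Microsoft.Spark.Worker was extracted.");
        }
        return workerDir;
    }
}
```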

src/csharp/Microsoft.Spark/Sql/SparkSession.cs (+16, -8)

@@ -65,6 +65,22 @@ public void Dispose()
 public SparkSession NewSession() =>
 new SparkSession((JvmObjectReference)_jvmObject.Invoke("newSession"));

+/// <summary>
+/// Returns the specified table/view as a DataFrame.
+/// </summary>
+/// <param name="tableName">Name of a table or view</param>
+/// <returns>DataFrame object</returns>
+public DataFrame Table(string tableName)
+    => new DataFrame((JvmObjectReference)_jvmObject.Invoke("table", tableName));
+
+/// <summary>
+/// Executes a SQL query using Spark, returning the result as a DataFrame.
+/// </summary>
+/// <param name="sqlText">SQL query text</param>
+/// <returns>DataFrame object</returns>
+public DataFrame Sql(string sqlText)
+    => new DataFrame((JvmObjectReference)_jvmObject.Invoke("sql", sqlText));
+
 /// <summary>
 /// Returns a DataFrameReader that can be used to read non-streaming data in
 /// as a DataFrame.
@@ -80,14 +96,6 @@ public DataFrameReader Read() =>
 public DataStreamReader ReadStream() =>
 new DataStreamReader((JvmObjectReference)_jvmObject.Invoke("readStream"));

-/// <summary>
-/// Executes a SQL query using Spark, returning the result as a DataFrame.
-/// </summary>
-/// <param name="sqlText">SQL query text</param>
-/// <returns>DataFrame object</returns>
-public DataFrame Sql(string sqlText)
-    => new DataFrame((JvmObjectReference)_jvmObject.Invoke("sql", sqlText));
-
 /// <summary>
 /// Returns UDFRegistraion object with which user-defined functions (UDF) can
 /// be registered.
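
A short usage sketch for the new Table method and the relocated Sql method shown above; the temporary view `people` and the queries are illustrative assumptions, not code from this commit.

```csharp
// Sketch of calling the SparkSession.Sql and SparkSession.Table APIs from the diff above.
// The "people" view and its contents are hypothetical.
using Microsoft.Spark.Sql;

class TableAndSqlSketch
{
    static void Main()
    {
        SparkSession spark = SparkSession.Builder().AppName("table-sql-sketch").GetOrCreate();

        // Create a temporary view with plain SQL so Table() has something to resolve.
        spark.Sql("CREATE OR REPLACE TEMPORARY VIEW people AS SELECT 1 AS id, 'Alice' AS name");

        // Table() returns the view as a DataFrame.
        DataFrame people = spark.Table("people");

        // Sql() runs a query and returns the result as a DataFrame.
        DataFrame names = spark.Sql("SELECT name FROM people WHERE id = 1");

        people.Show();
        names.Show();

        spark.Stop();
    }
}
```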

src/scala/microsoft-spark-2.3.x/pom.xml (+1, -1)

@@ -4,7 +4,7 @@
 <parent>
 <groupId>com.microsoft.scala</groupId>
 <artifactId>microsoft-spark</artifactId>
-<version>0.2.0</version>
+<version>0.3.0</version>
 </parent>
 <artifactId>microsoft-spark-2.3.x</artifactId>
 <inceptionYear>2019</inceptionYear>

src/scala/microsoft-spark-2.4.x/pom.xml (+1, -1)

@@ -4,7 +4,7 @@
 <parent>
 <groupId>com.microsoft.scala</groupId>
 <artifactId>microsoft-spark</artifactId>
-<version>0.2.0</version>
+<version>0.3.0</version>
 </parent>
 <artifactId>microsoft-spark-2.4.x</artifactId>
 <inceptionYear>2019</inceptionYear>

src/scala/pom.xml (+1, -1)

@@ -4,7 +4,7 @@
 <groupId>com.microsoft.scala</groupId>
 <artifactId>microsoft-spark</artifactId>
 <packaging>pom</packaging>
-<version>0.2.0</version>
+<version>0.3.0</version>
 <properties>
 <encoding>UTF-8</encoding>
 </properties>
