[WIP]: Adding multiple Spark Master environment variables #91


Closed
wants to merge 1 commit into from
18 changes: 16 additions & 2 deletions src/csharp/Microsoft.Spark/Services/ConfigurationService.cs
@@ -3,6 +3,7 @@
// See the LICENSE file in the project root for more information.

using System;
+using System.Collections.Generic;
using System.Configuration;
using System.Diagnostics;
using System.IO;
@@ -22,7 +23,7 @@ internal sealed class ConfigurationService : IConfigurationService
public const string WorkerReadBufferSizeEnvName = "spark.dotnet.worker.readBufferSize";
public const string WorkerWriteBufferSizeEnvName = "spark.dotnet.worker.writeBufferSize";

-private const string SparkMasterEnvName = "spark.master";
+private readonly string[] SparkMasterEnvName = new string[] { "spark.master", "MASTER" };
Review comment (Contributor; an illustrative sketch of this suggestion follows the diff):
I don't think this is the right fix. I think we should just merge DefaultConfiguration/LocalConfiguration/DebugConfiguration into one, so that the configuration creation is independent of the Spark master environment variable.

private const string DotnetBackendPortNumberSettingKey = "DotnetBackendPortNumber";
private const string DotnetBackendPortEnvName = "DOTNETBACKEND_PORT";
private const int DotnetBackendDebugPort = 5567;
@@ -51,7 +52,20 @@ internal ConfigurationService()
entryAssembly.Location);

// SPARK_MASTER is set when the driver runs on the Scala side.
-string sparkMaster = Environment.GetEnvironmentVariable(SparkMasterEnvName);
+// Depending on the job submission, there are different environment
+// variables that are set to indicate Spark Master URI:
+// - spark.master for job submissions through spark-submit
+// - MASTER for job submissions through Databricks (Create Job -> Set JAR)
+string sparkMaster = null;
+foreach(string sparkMasterEnv in SparkMasterEnvName)
+{
+sparkMaster = Environment.GetEnvironmentVariable(sparkMasterEnv);
+if (sparkMaster != null)
+{
+break;
+}
+}
+
if (sparkMaster == null)
{
_configuration = new DebugConfiguration(appConfig);
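
For context, the lookup loop added above just takes the first candidate environment variable that is set ("spark.master" for spark-submit, "MASTER" for Databricks job submissions), falling back to null when neither is present. The same first-match lookup can be written more compactly; the sketch below is not part of this PR, and the SparkMasterResolver class and Resolve method are placeholder names:

    using System;
    using System.Linq;

    // Illustrative only: first-match lookup over the same candidate variables the PR iterates.
    internal static class SparkMasterResolver
    {
        private static readonly string[] SparkMasterEnvNames = { "spark.master", "MASTER" };

        // Returns the first candidate that is set, or null when none of them are.
        public static string Resolve() =>
            SparkMasterEnvNames
                .Select(Environment.GetEnvironmentVariable)
                .FirstOrDefault(value => value != null);
    }

Either form preserves the same precedence: spark.master wins over MASTER when both are set.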
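
The review comment above suggests a different direction: merge DefaultConfiguration, LocalConfiguration, and DebugConfiguration into a single configuration class so that creating it no longer depends on the Spark master environment variable at all. A rough sketch of that idea follows, built only from the setting names visible in this diff; the UnifiedConfiguration class, its NameValueCollection constructor argument, and the assumed precedence order are hypothetical and not the repository's actual API:

    using System;
    using System.Collections.Specialized;

    // Hypothetical single configuration: resolves the .NET backend port without
    // ever consulting spark.master or MASTER.
    internal sealed class UnifiedConfiguration
    {
        private const string DotnetBackendPortNumberSettingKey = "DotnetBackendPortNumber";
        private const string DotnetBackendPortEnvName = "DOTNETBACKEND_PORT";
        private const int DotnetBackendDebugPort = 5567;

        private readonly NameValueCollection _appSettings;

        internal UnifiedConfiguration(NameValueCollection appSettings)
        {
            _appSettings = appSettings;
        }

        // Assumed precedence: environment variable, then app settings, then the debug default.
        public int GetBackendPortNumber()
        {
            string fromEnv = Environment.GetEnvironmentVariable(DotnetBackendPortEnvName);
            if (int.TryParse(fromEnv, out int port))
            {
                return port;
            }

            string fromSettings = _appSettings?[DotnetBackendPortNumberSettingKey];
            if (int.TryParse(fromSettings, out port))
            {
                return port;
            }

            return DotnetBackendDebugPort;
        }
    }

Whether this matches the shape of the three existing configuration classes would need to be checked against the repository; the point is only that port resolution here never branches on how the job was submitted.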