
[WIP]: Adding multiple Spark Master environment variables #91


Closed · wants to merge 1 commit

Conversation

@rapoth (Contributor) commented on May 5, 2019

When running on Azure Databricks (see the screenshot below: Create Job -> Select JAR), spark.master is not set; instead, MASTER is set to point to the master URI. This change introduces a check for MASTER in addition to spark.master.

[Screenshot: Azure Databricks job configuration, Create Job -> Select JAR]

So we can do the best job, please check:

  • There's a descriptive title that will make sense to other developers some time from now.
  • There are associated issues. All PRs should have one or more associated issues, unless the change is trivial and self-evident (such as fixing a typo). You can use the format Fixes #nnnn in your description so that GitHub automatically closes the issue(s) when your PR is merged.
  • Your change description explains what the change does, why you chose your approach, and anything else that reviewers should know.
  • You have included any necessary tests in the same PR.

@rapoth requested review from suhsteve and imback82 on May 5, 2019 16:43
@rapoth changed the title from "Adding multiple Spark Master environment variables" to "[WIP]: Adding multiple Spark Master environment variables" on May 5, 2019
@@ -22,7 +23,7 @@ internal sealed class ConfigurationService : IConfigurationService
     public const string WorkerReadBufferSizeEnvName = "spark.dotnet.worker.readBufferSize";
     public const string WorkerWriteBufferSizeEnvName = "spark.dotnet.worker.writeBufferSize";

-    private const string SparkMasterEnvName = "spark.master";
+    private readonly string[] SparkMasterEnvName = new string[] { "spark.master", "MASTER" };
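
For context, code that consumes this array must probe each candidate name in order and take the first one that is set. A minimal sketch of that lookup, assuming a hypothetical GetSparkMasterUrl helper (the actual accessor in ConfigurationService may differ):

    using System;

    internal static class SparkMasterResolver
    {
        // Candidate environment variables that may hold the Spark master URL.
        // spark-submit sets spark.master; Azure Databricks sets MASTER.
        private static readonly string[] s_sparkMasterEnvNames = { "spark.master", "MASTER" };

        // Returns the first master URL found, or null if none is set.
        internal static string GetSparkMasterUrl()
        {
            foreach (string name in s_sparkMasterEnvNames)
            {
                string value = Environment.GetEnvironmentVariable(name);
                if (!string.IsNullOrWhiteSpace(value))
                {
                    return value;
                }
            }
            return null;
        }
    }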
A contributor commented on this diff:

I don't think this is the right fix. I think we should just merge DefaultConfiguration/LocalConfiguration/DebugConfiguration into one, so that the configuration creation is independent of the Spark master environment variable.
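
As a rough illustration of that suggestion (a sketch only; everything beyond the Default/Local/Debug class names is assumed): instead of selecting a configuration class based on the master URL at startup, a single configuration type could resolve each setting on demand, with a fallback when the variable is absent:

    using System;

    // Hedged sketch: one configuration type whose construction does not
    // depend on how (or whether) the Spark master variable is set.
    internal sealed class UnifiedConfigurationService
    {
        // Hypothetical accessor: reads the setting lazily, so the
        // Default/Local/Debug split is not needed at construction time.
        public int GetBackendPort()
        {
            // DOTNETBACKEND_PORT and the 5567 fallback are illustrative,
            // not necessarily what the real ConfigurationService uses.
            string port = Environment.GetEnvironmentVariable("DOTNETBACKEND_PORT");
            return string.IsNullOrEmpty(port) ? 5567 : int.Parse(port);
        }
    }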

@imback82 (Contributor) commented on May 6, 2019

This will be addressed in #92.

@imback82 closed this on May 6, 2019
@rapoth deleted the db-fix branch on May 6, 2019 02:31
@rapoth (Contributor, Author) commented on May 6, 2019

Thank you!
