Replies: 2 comments 2 replies
-
What is the %~dp0 syntax in your environment variables? Can you start by following the instructions in the tutorial and get that working first?
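For context, %~dp0 inside a batch file expands to the drive and directory of the script itself, including a trailing backslash. A minimal sketch (the file name and path are hypothetical):

```bat
@echo off
rem Suppose this file is saved as C:\DotnetSpark.Package\show-root.cmd.
rem %~dp0 expands to the drive + path of the running script,
rem with a trailing backslash, regardless of the current directory.
echo Script directory: %~dp0
```

Because of the trailing backslash, concatenations like %~dp0\hadoop produce a doubled backslash (harmless on Windows, but worth knowing when reading the expanded paths).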
-
Thank you for the comment. Yes, I followed the instructions at https://docs.microsoft.com/en-us/dotnet/spark/tutorials/get-started?tabs=windows#5-install-net-for-apache-spark
With that, I am able to run my code with master local:
\DotnetSpark.Package\Submit-Job.cmd --class org.apache.spark.deploy.dotnet.DotnetRunner --master local microsoft-spark-3-0_2.12-1.0.0.jar .\MyApp.exe
I am trying to run in a cluster. To do this I need to set up a .NET Spark master and workers and call it this way:
\DotnetSpark.Package\Submit-Job.cmd --class org.apache.spark.deploy.dotnet.DotnetRunner --master spark://masteripaddress:7077 microsoft-spark-3-0_2.12-1.0.0.jar .\MyApp.exe
We were able to do this for PySpark using the following commands:
Master:
Worker:
Is there a similar one, or any other way, to set up a .NET Spark master and workers in a cluster?
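For what it's worth, in Spark standalone mode the master and worker daemons are language-agnostic JVM processes, so the same spark-class2.cmd commands used for PySpark should apply; the .NET-specific part is that Microsoft.Spark.Worker must be deployed on every worker node and DOTNET_WORKER_DIR must be visible to the worker process. A sketch, assuming SPARK_HOME is set on each machine, the master host is masteripaddress, and the worker install path is illustrative:

```bat
rem Sketch only: standalone master/worker start-up on Windows.

rem On the master machine:
%SPARK_HOME%\bin\spark-class2.cmd org.apache.spark.deploy.master.Master --host masteripaddress --port 7077

rem On each worker machine (Microsoft.Spark.Worker must be deployed there too;
rem the path below is a placeholder):
set DOTNET_WORKER_DIR=C:\Microsoft.Spark.Worker-1.0.0
%SPARK_HOME%\bin\spark-class2.cmd org.apache.spark.deploy.worker.Worker spark://masteripaddress:7077
```

The key point is that DOTNET_WORKER_DIR has to be set in the environment of the worker daemon itself, since the executors it spawns inherit it from there.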
-
Hi,
Can you please help me set up worker nodes to run .NET Spark on a Windows cluster?
Usually we use \spark-3.0.0-bin-hadoop2.7\bin\spark-class2.cmd org.apache.spark.deploy.worker.Worker spark://xxxx.xx.0.24:7077 to start a worker for Python/Java/Scala on Windows.
This is not working for .NET Spark; I get this error when I run my program this way:
Submit-Job.cmd --class org.apache.spark.deploy.dotnet.DotnetRunner --master spark://xx.xx.0.24:7077 microsoft-spark-3-0_2.12-1.0.0.jar .\Myapp.exe
java.io.IOException: Cannot run program "..\Microsoft.Spark.Worker-1.0.0 \Microsoft.Spark.Worker.exe": CreateProcess error=2, The system cannot find the file specified
When I run in local mode, it works:
Submit-Job.cmd --class org.apache.spark.deploy.dotnet.DotnetRunner --master local microsoft-spark-3-0_2.12-1.0.0.jar .\Myapp.exe
Here is my script to set up the worker:
set HADOOP_HOME=%~dp0\hadoop
set JAVA_HOME=%~dp0\java
set SPARK_HOME=%~dp0\spark-3.0.0-bin-hadoop2.7
set DOTNET_WORKER_DIR=%~dp0\Microsoft.Spark.Worker-1.0.0
set DOTNET_WORKER_DEBUG=1
set PATH=%SPARK_HOME%\bin;%DOTNET_WORKER_DIR%;%PATH%
spark-class2.cmd org.apache.spark.deploy.worker.Worker spark://xxxx.xx.0.24:7077
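Incidentally, the path in the quoted error ("..\Microsoft.Spark.Worker-1.0.0 \Microsoft.Spark.Worker.exe") contains a space before the backslash, which suggests DOTNET_WORKER_DIR may have picked up a trailing space when it was set. A sketch of the same assignment written so stray whitespace cannot creep into the value (the path is the one from the script above):

```bat
rem Sketch: quoting the whole assignment keeps any trailing spaces on the
rem line out of the stored value. %~dp0 already ends with a backslash,
rem so no extra "\" is needed before the directory name.
set "DOTNET_WORKER_DIR=%~dp0Microsoft.Spark.Worker-1.0.0"
rem Brackets make a trailing space visible if one is still present:
echo [%DOTNET_WORKER_DIR%]
```

This is only a guess from the error text; verifying the echoed value on the worker machine would confirm or rule it out.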
Appreciate your help.
Thank you.