Merge pull request #25665 from ursuad/patch-1
Automatic merge from submit-queue.

Fixed namespace name to spark-cluster: changed the namespace from **default** to **spark-cluster** in the Spark example docs.
Commit e43ec4c445
@@ -156,7 +156,7 @@ kubectl proxy --port=8001
 ```
 
 At which point the UI will be available at
-[http://localhost:8001/api/v1/proxy/namespaces/default/services/spark-webui/](http://localhost:8001/api/v1/proxy/namespaces/default/services/spark-webui/).
+[http://localhost:8001/api/v1/proxy/namespaces/spark-cluster/services/spark-webui/](http://localhost:8001/api/v1/proxy/namespaces/spark-cluster/services/spark-webui/).
 
 ## Step Three: Start your Spark workers
 
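The hunk above swaps only the namespace segment of the `kubectl proxy` URL. As a minimal sketch (the `NAMESPACE` and `SERVICE` variables are illustrative, not part of the commit), the URL the docs point readers at is built like this:

```shell
# Illustrative only: show how the kubectl proxy URL embeds the namespace,
# which is the one thing this commit changes (default -> spark-cluster).
NAMESPACE=spark-cluster
SERVICE=spark-webui
URL="http://localhost:8001/api/v1/proxy/namespaces/${NAMESPACE}/services/${SERVICE}/"
echo "$URL"
```

With `NAMESPACE=default` the same template reproduces the pre-commit URL, which is why only that one path segment differs between the removed and added lines.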
@@ -294,7 +294,7 @@ kubectl get pods -lcomponent=zeppelin # Get the driver pod to interact with.
 ```
 
 At which point the Master UI will be available at
-[http://localhost:8001/api/v1/proxy/namespaces/default/services/spark-webui/](http://localhost:8001/api/v1/proxy/namespaces/default/services/spark-webui/).
+[http://localhost:8001/api/v1/proxy/namespaces/spark-cluster/services/spark-webui/](http://localhost:8001/api/v1/proxy/namespaces/spark-cluster/services/spark-webui/).
 
 You can either interact with the Spark cluster with the traditional `spark-shell` /
 `spark-submit` / `pyspark` commands by using `kubectl exec` against the
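The context lines above mention driving the cluster with `kubectl exec`. A hedged dry-run sketch of those invocations follows; the pod name `zeppelin-controller-abc12` is a hypothetical placeholder, and in practice you would look it up with `kubectl get pods -lcomponent=zeppelin` as the hunk header shows:

```shell
# Hypothetical sketch: print (rather than run) the kubectl exec invocations
# for the interactive Spark tools named in the docs.
# DRIVER_POD is a placeholder name, not a real pod from the commit.
NAMESPACE=spark-cluster
DRIVER_POD=zeppelin-controller-abc12
for tool in spark-shell spark-submit pyspark; do
  echo "kubectl exec --namespace=${NAMESPACE} -it ${DRIVER_POD} -- ${tool}"
done
```

Printing the commands instead of executing them keeps the sketch runnable without a live cluster; drop the `echo` to run them for real once the pod name is filled in.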