```
bash-5.0# ./start-worker.sh spark://spark:7077
rsync from spark://spark:7077
/spark/sbin/spark-daemon.sh: line 177: rsync: command not found
starting org.apache.spark.deploy.worker.Worker, logging to /spark/logs/spark--org.apache.spark.deploy.worker.Worker-1-6f7782b9b0d5.out
ps: unrecognized option: p

BusyBox v1.30.1 (2020-05-30 09:44:53 UTC) multi-call binary.

Usage: ps [-o COL1,COL2=HEADER]

Show list of processes

	-o COL1,COL2=HEADER	Select columns for display

ps: unrecognized option: p

BusyBox v1.30.1 (2020-05-30 09:44:53 UTC) multi-call binary.

Usage: ps [-o COL1,COL2=HEADER]

Show list of processes
```
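For context: the BusyBox banner suggests an Alpine-style base image, where `rsync` is absent and `ps` is the minimal BusyBox applet that does not support `-p` (the option `spark-daemon.sh` uses to check whether the worker process is alive). Installing the full tools (e.g. `apk add rsync procps` on Alpine, if that is indeed the base) is one possible fix; another is to avoid `ps -p` entirely. A minimal sketch of a BusyBox-compatible liveness check, assuming a Linux host where signal 0 can be delivered to a live process:

```shell
#!/bin/sh
# BusyBox ps rejects -p, so instead of `ps -p "$pid"` we can use
# `kill -0`, which sends no signal but fails if the PID does not exist.
# For demonstration we check our own shell's PID, which is always alive.
pid=$$

if kill -0 "$pid" 2>/dev/null; then
  echo "process $pid is running"
else
  echo "process $pid is not running"
fi
```

The same idea (or testing for the directory `/proc/$pid`) could be patched into `spark-daemon.sh` as a workaround, though the cleaner fix is probably to ship the utilities the script expects.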
Could you tell me a bit more about how you are trying to start a worker, and from which image? If you set things up via our example docker-compose file, the worker is started automatically; the same goes for a normal `docker run` using our worker image.

Let me know a few more details so we can resolve this.