
cannot submit tasks to master #130

Open
pzg250 opened this issue Jun 30, 2021 · 3 comments

Comments


pzg250 commented Jun 30, 2021

Hi,
I ran into an issue. Can anyone help? Thanks in advance.

After deploying docker-spark to one server (192.168.10.8), I tried to test it from another server (192.168.10.7). The same version of Spark is installed on 192.168.10.7. Steps:

spark-shell --master spark://192.168.10.8:7077 --total-executor-cores 1 --executor-memory 512M
# xxxx
# some output here
# xxxx
val textFile = sc.textFile("file:///opt/spark/README.md");
textFile.first();

I got the error below (the message repeats in an infinite loop):

WARN TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
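This warning usually has one of two causes: the workers have fewer free cores or less memory than the job requests, or the executors cannot open connections back to the driver, which is common when the master runs inside Docker and the driver sits outside the container network. (Note also that `file:///` paths are resolved on the executors, so /opt/spark/README.md must exist inside the worker containers too.) A minimal PySpark sketch of the connectivity-related settings, assuming the driver machine is 192.168.10.7 and that the chosen driver ports (7078/7079, purely illustrative) are open in both directions:

```python
from pyspark.sql import SparkSession

# Sketch only: all addresses and ports below are assumptions, not
# values confirmed in this thread.
spark = (
    SparkSession.builder
    .appName("ConnectivityCheck")
    .master("spark://192.168.10.8:7077")
    # Address the *workers* should use to reach the driver machine.
    .config("spark.driver.host", "192.168.10.7")
    # Pin the driver ports so they can be opened in a firewall,
    # instead of letting Spark pick random ones.
    .config("spark.driver.port", "7078")
    .config("spark.blockManager.port", "7079")
    # Resource limits equivalent to the spark-shell flags above.
    .config("spark.cores.max", "1")
    .config("spark.executor.memory", "512m")
    .getOrCreate()
)
```

If the workers cannot reach the driver on those ports, executors can keep failing to register, which shows up as exactly this repeated warning even though the master UI looks healthy.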
GezimSejdiu (Member) commented

Hey @pzg250 ,

thanks a lot for reporting this. Could you tell us how you are running the spark-shell command? Within docker exec, or from outside the Docker network? Have you tried using one of our Docker templates as an example?

Best,

nguacon90 commented

I have the same issue. Can anyone help?
Thanks a lot.


rilakgg commented Mar 30, 2022

Hi @pzg250 ,

I got the same error too.
Could I use the configuration below from outside the Spark cluster?

spark = SparkSession.builder.appName("SparkSample2").master("spark://192.XX.X.XX:7077").getOrCreate()

I'd like to run this application from the client side.
Thank you for your great support.
Best,
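Running the driver from outside the cluster (client mode) is possible in principle, but the workers must be able to open TCP connections back to the driver machine, not just the other way around. Before building the session, it can help to verify plain TCP reachability of the master port; a small sketch (the host and port in the example call are illustrative):

```python
import socket

def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Illustrative: check the standalone master port before connecting.
print(port_open("192.168.10.8", 7077))
```

A True here only confirms the master port; the driver's own ports (and the block manager port) must also be reachable from the workers for tasks to run.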
