[bitnami/spark] Unable to run pyspark interactive shell #32710
Comments
Hi @jonatronblah, you seem to be right: the tool does not work with the current Bitnami container image for Spark 3.3. However, I tried the upstream … We are already working on adding the new 3.4 version to Bitnami soon, so once it is released we can check whether the issue is resolved (or not) and escalate it if needed.
This Issue has been automatically marked as "stale" because it has not had recent activity (for 15 days). It will be closed if no further activity occurs. Thanks for the feedback.
Due to the lack of activity in the last 5 days since it was marked as "stale", we proceed to close this Issue. Do not hesitate to reopen it later if necessary.
I had the same issue. I was able to solve it by running `pip install pyspark` and pressing Enter. I hope this helps you too. If not, sorry :/ I'm pretty new to this too.
Just want to report that it also does not work for 3.4 and 3.5. This is not 'solved'.
This Issue has been automatically marked as "stale" because it has not had recent activity (for 15 days). It will be closed if no further activity occurs. Thanks for the feedback.
Due to the lack of activity in the last 5 days since it was marked as "stale", we proceed to close this Issue. Do not hesitate to reopen it later if necessary.
I have the same issue. Any workarounds?
Currently facing the same issue.
A workaround I found here: #38139 (comment)
Name and Version
bitnami/spark:3.3
What architecture are you using?
amd64
What steps will reproduce the bug?
Forgive me, as I am very new to Spark, but I could not solve this issue with Google.
Using the following docker compose file:
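The compose file itself did not survive in this copy of the thread. As a stand-in, a minimal sketch of a typical bitnami/spark master/worker setup (service names, ports, and worker settings are assumptions, not the reporter's actual file):

```yaml
version: "2"
services:
  spark:
    image: bitnami/spark:3.3
    environment:
      - SPARK_MODE=master
    ports:
      - "8080:8080"   # web UI
      - "7077:7077"   # master RPC
  spark-worker:
    image: bitnami/spark:3.3
    environment:
      - SPARK_MODE=worker
      - SPARK_MASTER_URL=spark://spark:7077
```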
Attempting to run the pyspark interactive shell with the following command: ./bin/pyspark
What is the expected behavior?
Expect the pyspark interactive shell to run.
What do you see instead?
Always receive the following error: "Error: pyspark does not support any application options."
Additional information
I can submit PySpark jobs successfully and start the Scala interactive shell just fine. Thanks for any help.