[Bug] Spark executor task error when reading shuffle data when using OpenJDK 11 #2082
Comments
@rickyma Do you have any suggestions?
I've no idea. Could you please test this case using JDK 11? @maobaolong
@rickyma We use JDK 8 for all RSS clusters and clients, so we did not encounter this issue in the production environment. But I just did a test on JDK 11, and this issue reproduced.
```scala
val data = sc.parallelize(Seq(
  ("A", 1), ("B", 2), ("C", 3), ("A", 4), ("B", 5), ("A", 6),
  ("A", 7), ("A", 7), ("A", 7), ("A", 7), ("A", 7), ("A", 7), ("A", 7),
  ("A", 7), ("A", 7), ("A", 7), ("A", 7), ("A", 7), ("A", 7)))
val result = data.reduceByKey(_ + _)
result.collect().foreach(println)
System.exit(0)
```
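For reference, here is a hypothetical plain-Java sketch (no Spark, no Uniffle; class name `ReproExpected` is my own) that performs the same per-key sum as the `reduceByKey(_ + _)` repro above on a single JVM, so the expected output is known when checking a shuffle-read fix under JDK 11:

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Hypothetical local sketch: computes the same aggregation as the Scala
// reduceByKey repro above, without any shuffle, to establish the expected result.
public class ReproExpected {
    // Equivalent of reduceByKey(_ + _): sum values grouped by key.
    static Map<String, Integer> reduceByKey(List<Map.Entry<String, Integer>> pairs) {
        Map<String, Integer> out = new LinkedHashMap<>();
        for (Map.Entry<String, Integer> p : pairs) {
            out.merge(p.getKey(), p.getValue(), Integer::sum);
        }
        return out;
    }

    public static void main(String[] args) {
        List<Map.Entry<String, Integer>> data = new ArrayList<>(List.of(
            Map.entry("A", 1), Map.entry("B", 2), Map.entry("C", 3),
            Map.entry("A", 4), Map.entry("B", 5), Map.entry("A", 6)));
        // The repro contains thirteen ("A", 7) pairs.
        for (int i = 0; i < 13; i++) data.add(Map.entry("A", 7));
        System.out.println(reduceByKey(data)); // expected: {A=102, B=7, C=3}
    }
}
```

A correct run of the Spark repro should therefore print `(A,102)`, `(B,7)`, and `(C,3)` in some order; anything else (or the exception above) indicates the shuffle-read path is broken.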
Could you help solve this? It will block people from using JDK 11. @maobaolong
@rickyma @ChenRussell JDK 11 cannot work with the Uniffle client for now.
Do you mean JDK 11?
Sorry for the mistake. Yeah, I mean JDK 11. I got this conclusion from the community meeting.
Code of Conduct
Search before asking
Describe the bug
I use OpenJDK 11 in the Spark image, and I get errors when a Spark task reads shuffle data from the Uniffle server. Here is the executor task error log:
Affects Version(s)
0.9.0
Uniffle Server Log Output
Uniffle Engine Log Output
Uniffle Server Configurations
rss.rpc.server.type GRPC_NETTY ...
Uniffle Engine Configurations
Additional context
No response
Are you willing to submit PR?