I am creating a cluster in EMR, and when Spark runs my application I get the error below:

Exception in thread "main" java.lang.UnsupportedClassVersionError:
com/example/demodriver/MyClassFromJAR has been compiled by a more recent version of the Java Runtime (class file version 55.0),
this version of the Java Runtime only recognizes class file versions up to 52.0

I am using releaseLabel emr-6.5.0 on the cluster, and my driver jar was built with Java 11.

How do I run a Java 11 application on EMR? Or is this error about something else?
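The `UnsupportedClassVersionError` can be decoded straight from the class file header: bytes 6-7 of a `.class` file hold the major version, big-endian, where 52 corresponds to Java 8 and 55 to Java 11. A minimal sketch for inspecting a class file's target version (the fake header written here is for illustration only; in practice you would extract a class from your jar with `unzip`):

```shell
# Write a fake 8-byte class file header: magic CA FE BA BE,
# minor version 0, major version 0x37 = 55 (Java 11).
printf '\xca\xfe\xba\xbe\x00\x00\x00\x37' > Demo.class

# Read bytes 6-7 as unsigned decimal and combine them big-endian.
major=$(od -An -j6 -N2 -tu1 Demo.class | awk '{print $1*256+$2}')
echo "class file major version: $major"   # 55 -> built for Java 11
```

A class reporting major version 55 cannot be loaded by the Java 8 runtime that EMR uses by default, which is exactly the error above.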
Posted on 2022-04-29 07:49:26
In recent EMR releases, Java 11 is installed. To enable it, supply the following configuration:
[
  {
    "Classification": "spark-env",
    "Configurations": [
      {
        "Classification": "export",
        "Properties": {
          "JAVA_HOME": "/usr/lib/jvm/java-11-amazon-corretto.x86_64"
        }
      }
    ]
  },
  {
    "Classification": "spark-defaults",
    "Properties": {
      "spark.executorEnv.JAVA_HOME": "/usr/lib/jvm/java-11-amazon-corretto.x86_64"
    }
  }
]

This does not appear to be documented.
defaultJavaOptions and extraJavaOptions may contain options that are incompatible with Java 11, so you may still need to modify or update them.
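The classification JSON above is supplied at cluster-creation time; with the AWS CLI that is the `--configurations` flag of `aws emr create-cluster`. A minimal sketch, with the instance sizing left as placeholder values (it helps to validate the JSON locally first, since a malformed file fails cluster creation):

```shell
# Save the classification JSON from the answer above.
cat > configurations.json <<'EOF'
[
  {
    "Classification": "spark-env",
    "Configurations": [
      {
        "Classification": "export",
        "Properties": {
          "JAVA_HOME": "/usr/lib/jvm/java-11-amazon-corretto.x86_64"
        }
      }
    ]
  },
  {
    "Classification": "spark-defaults",
    "Properties": {
      "spark.executorEnv.JAVA_HOME": "/usr/lib/jvm/java-11-amazon-corretto.x86_64"
    }
  }
]
EOF

# Sanity-check the JSON before launching.
python3 -m json.tool configurations.json > /dev/null && echo "configurations.json OK"

# Launch the cluster with the file (instance settings are placeholders):
#   aws emr create-cluster \
#     --release-label emr-6.5.0 \
#     --applications Name=Spark \
#     --configurations file://configurations.json \
#     --instance-type m5.xlarge --instance-count 3 \
#     --use-default-roles
```

The same JSON can also be pasted into the "Software settings" box of the EMR console when creating the cluster.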
Posted on 2022-09-01 04:16:22

Below is the full configuration, including the necessary JVM options:
[
  {
    "Classification": "spark-env",
    "Configurations": [
      {
        "Classification": "export",
        "Properties": {
          "JAVA_HOME": "/usr/lib/jvm/java-11-amazon-corretto.x86_64"
        }
      }
    ]
  },
  {
    "Classification": "spark-defaults",
    "Properties": {
      "spark.executorEnv.JAVA_HOME": "/usr/lib/jvm/java-11-amazon-corretto.x86_64",
      "spark.driver.defaultJavaOptions": "-XX:OnOutOfMemoryError='kill -9 %p' -XX:MaxHeapFreeRatio=70",
      "spark.executor.defaultJavaOptions": "-verbose:gc -Xlog:gc*::time -XX:+PrintGCDetails -XX:+PrintGCDateStamps -XX:OnOutOfMemoryError='kill -9 %p' -XX:MaxHeapFreeRatio=70 -XX:+IgnoreUnrecognizedVMOptions"
    }
  }
]

https://stackoverflow.com/questions/70886684