org.apache.spark.sql.catalyst.ca

Author: 扎西的德勒 | Published 2021-05-25 17:05
1. The Error

While debugging a Spark SQL job, code that had been verified correct failed with the following error:
Exception in thread "main" java.lang.IncompatibleClassChangeError: class org.apache.spark.sql.hive.HiveExternalCatalog has interface org.apache.spark.sql.catalyst.catalog.ExternalCatalog as super class
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:338)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:348)
    at org.apache.spark.util.Utils$.classForName(Utils.scala:239)
    at org.apache.spark.sql.internal.StaticSQLConf$.defaultHiveCatalogImplementation(StaticSQLConf.scala:49)
    at org.apache.spark.sql.internal.StaticSQLConf$$anonfun$3.apply(StaticSQLConf.scala:41)
    at org.apache.spark.sql.internal.StaticSQLConf$$anonfun$3.apply(StaticSQLConf.scala:41)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.internal.config.ConfigEntryWithDefaultFunction.readFrom(ConfigEntry.scala:103)
    at org.apache.spark.SparkConf.get(SparkConf.scala:261)
    at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$sessionStateClassName(SparkSession.scala:1074)
    at org.apache.spark.sql.SparkSession$$anonfun$sessionState$2.apply(SparkSession.scala:155)
    at org.apache.spark.sql.SparkSession$$anonfun$sessionState$2.apply(SparkSession.scala:153)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:153)
    at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:150)
    at org.apache.spark.sql.DataFrameReader.<init>(DataFrameReader.scala:787)
    at org.apache.spark.sql.SparkSession.read(SparkSession.scala:664)
    at com.kkb.spark.sparkSql.ReadTextFile$.main(ReadTextFile.scala:14)
    at com.kkb.spark.sparkSql.ReadTextFile.main(ReadTextFile.scala)
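
For reference, the driver that fails here needs no Hive-specific code: per the stack trace, the first access to sessionState (the spark.read call at ReadTextFile.scala:14) resolves the default catalog implementation through Utils.classForName, and loading HiveExternalCatalog is what blows up. Below is a minimal sketch of that kind of driver; the original ReadTextFile.scala is not shown, so the package layout, builder settings, and input path are assumptions.

    package com.kkb.spark.sparkSql

    import org.apache.spark.sql.SparkSession

    object ReadTextFile {
      def main(args: Array[String]): Unit = {
        // Assumption: a plain local session for debugging; no enableHiveSupport()
        // or other Hive settings are needed to reproduce the error.
        val spark = SparkSession.builder()
          .appName("ReadTextFile")
          .master("local[*]")
          .getOrCreate()

        // spark.read is the first call that initializes sessionState; resolving the
        // catalog implementation then tries to load HiveExternalCatalog, which is
        // where the IncompatibleClassChangeError above is thrown.
        val lines = spark.read.textFile("data/input.txt") // hypothetical input path
        lines.show()

        spark.stop()
      }
    }
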
2. The Fix

I couldn't find anything about this error online (it is probably too trivial an issue). On inspection, the pom.xml turned out to mix Spark artifacts from two different versions: spark-sql_2.11 at 2.4.0-cdh6.2.0 alongside spark-hive_2.11 at 2.3.3. After commenting out the mismatched spark-hive_2.11 dependency, the job ran normally:

        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.11</artifactId>
            <version>2.4.0-cdh6.2.0</version>
        </dependency>
<!--        <dependency>-->
<!--            <groupId>org.apache.spark</groupId>-->
<!--            <artifactId>spark-hive_2.11</artifactId>-->
<!--            <version>2.3.3</version>-->
<!--        </dependency>-->
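
Note that the fix above simply drops Hive support. If the job actually needs spark-hive_2.11, the alternative would be to pin it to the same Spark build as spark-sql_2.11 rather than a different upstream version; the version below mirrors the one already used for spark-sql_2.11 and is an assumption about what the CDH repository provides:

        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-hive_2.11</artifactId>
            <version>2.4.0-cdh6.2.0</version>
        </dependency>

Either way, mvn dependency:tree -Dincludes=org.apache.spark is a quick way to confirm that only one Spark version ends up on the compile classpath.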
