This article covers the spark-sql query error "Invalid method name: 'get_table_req'" and how to resolve it.
spark-sql> select * from zps_d001 limit 1;
Error in query: org.apache.hadoop.hive.ql.metadata.HiveException: Unable to fetch table zps_xxx. Invalid method name: 'get_table_req'
org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: Unable to fetch table zps_xxx. Invalid method name: 'get_table_req'
at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:112)
........................ (some stack trace lines omitted) ........................
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1039)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1048)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Unable to fetch table zps_d001. Invalid method name: 'get_table_req'
at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:1282)
at org.apache.spark.sql.hive.client.HiveClientImpl.getRawTableOption(HiveClientImpl.scala:392)
at org.apache.spark.sql.hive.client.HiveClientImpl.$anonfun$tableExists$1(HiveClientImpl.scala:406)
at scala.runtime.java8.JFunction0$mcZ$sp.apply(JFunction0$mcZ$sp.java:23)
at org.apache.spark.sql.hive.client.HiveClientImpl.$anonfun$withHiveState$1(HiveClientImpl.scala:291)
at org.apache.spark.sql.hive.client.HiveClientImpl.liftedTree1$1(HiveClientImpl.scala:224)
at org.apache.spark.sql.hive.client.HiveClientImpl.retryLocked(HiveClientImpl.scala:223)
at org.apache.spark.sql.hive.client.HiveClientImpl.withHiveState(HiveClientImpl.scala:273)
at org.apache.spark.sql.hive.client.HiveClientImpl.tableExists(HiveClientImpl.scala:406)
at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$tableExists$1(HiveExternalCatalog.scala:854)
at scala.runtime.java8.JFunction0$mcZ$sp.apply(JFunction0$mcZ$sp.java:23)
at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:102)
... 111 more
The root cause is a version mismatch between Spark and Hive.
The installed Spark bundles a Hive 2.3.7 client, while the Hive in my cluster is 2.1.1, so the newer built-in client issues the get_table_req Thrift call, which the older 2.1.1 metastore does not implement.
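To confirm which Hive version your Spark distribution bundles, you can list the Hive jars it ships with (a quick check assuming a standard Spark layout; exact jar names vary by build):

ls $SPARK_HOME/jars | grep hive-exec
# a name like hive-exec-2.3.7-core.jar indicates a built-in Hive 2.3.7 client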
Solution:
bin/spark-sql --conf spark.sql.hive.metastore.version=2.1.1 \
  --conf spark.sql.hive.metastore.jars=/opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hive/lib/*
Alternatively, write these two settings into the spark/conf/spark-defaults.conf file, as shown below.
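A minimal spark-defaults.conf sketch, assuming the same CDH parcel path as in the command above (adjust the jar path to wherever your cluster's Hive libraries actually live):

spark.sql.hive.metastore.version  2.1.1
spark.sql.hive.metastore.jars     /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/lib/hive/lib/*

With these in place, spark-sql loads the 2.1.1 metastore client from the cluster's own Hive jars instead of the built-in 2.3.7 one, and the query no longer fails with 'get_table_req'.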