
How to correctly specify the HS2 JDBC URL in a big data environment with Kerberos authentication enabled?

  • 2023-09-25
    Hubei
  • Word count: 6,233

    Reading time: about 20 minutes


1 Overview of HS2 authentication modes in a Kerberos environment

As you may know, Hive's authentication mode is configured globally on the server side via the parameter hive.server2.authentication. In a big data environment with Kerberos security enabled:


  • setting hive.server2.authentication=kerberos configures HS2 to use Kerberos authentication;

  • setting hive.server2.authentication=ldap configures HS2 to use dual Kerberos + LDAP authentication: clients authenticate to HS2 via LDAP, while HS2 itself still authenticates to the rest of the cluster via Kerberos.


For more details, see the earlier post 《大数据生态安全框架的实现原理与最佳实践》 (implementation principles and best practices of the big data security framework).

2 The HS2 JDBC URL format in Kerberos authentication mode

In Kerberos authentication mode (hive.server2.authentication=kerberos), the HS2 JDBC URL has the format jdbc:hive2://<host>:<port>/<db>;principal=<Server_Principal_of_HiveServer2>. The host in the HS2 address part can be a concrete hostname or IP address, and the host inside Server_Principal_of_HiveServer2 can be a concrete hostname, a concrete IP address, or the special string "_HOST", so all possible combinations are:


  • beeline -u "jdbc:hive2://uf30-1:10000/default;principal=hive/uf30-1@xxx.com";

  • beeline -u "jdbc:hive2://uf30-1:10000/default;principal=hive/192.168.71.70@xxx.com";

  • beeline -u "jdbc:hive2://uf30-1:10000/default;principal=hive/_HOST@xxx.com";

  • beeline -u "jdbc:hive2://192.168.71.70:10000/default;principal=hive/uf30-1@xxx.com";

  • beeline -u "jdbc:hive2://100.116.3.228:10005/default;principal=hive/192.168.71.70@xxx.com";

  • beeline -u "jdbc:hive2://100.116.3.228:10005/default;principal=hive/_HOST@xxx.com";
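The six combinations above follow mechanically from the two host choices times the three principal choices. A minimal Java sketch that enumerates them (the class and method names are ours, and the hostname uf30-1, IP 192.168.71.70, and realm xxx.com are just the article's example values):

```java
import java.util.ArrayList;
import java.util.List;

public class Hs2UrlCombos {

    // Build a Kerberos-mode HS2 JDBC URL from its variable parts.
    static String url(String host, int port, String db, String principalHost) {
        return "jdbc:hive2://" + host + ":" + port + "/" + db
                + ";principal=hive/" + principalHost + "@xxx.com";
    }

    // Enumerate host {hostname, IP} x principal host {hostname, IP, _HOST}.
    static List<String> allCombos() {
        String[] hosts = {"uf30-1", "192.168.71.70"};
        String[] principalHosts = {"uf30-1", "192.168.71.70", "_HOST"};
        List<String> urls = new ArrayList<>();
        for (String h : hosts) {
            for (String p : principalHosts) {
                urls.add(url(h, 10000, "default", p));
            }
        }
        return urls;
    }

    public static void main(String[] args) {
        // Prints all 6 candidate URLs; as section 4 explains, not all of
        // them will actually authenticate against a typical KDC.
        allCombos().forEach(System.out::println);
    }
}
```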

3 Common problems caused by incorrect HS2 JDBC URL formats in Kerberos authentication mode

In Kerberos authentication mode, the host part of the HS2 JDBC URL can be a concrete hostname or IP address, and the host inside Server_Principal_of_HiveServer2 can be a concrete hostname, a concrete IP address, or the special string "_HOST", giving 6 possible combinations in total. In practice, because KDC configurations differ between environments, connection failures are common. The detailed error log from one such failed connection is shown below:


Connecting to jdbc:hive2://192.168.71.70:10000/default;principal=hive/_HOST@CDH.COM
23/05/04 15:06:01 [main]: ERROR transport.TSaslTransport: SASL negotiation failure
javax.security.sasl.SaslException: GSS initiate failed
        at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211) ~[?:1.8.0_181]
        at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94) ~[hive-exec-2.1.1-cdh6.3.2.jar:2.1.1-cdh6.3.2]
        at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271) [hive-exec-2.1.1-cdh6.3.2.jar:2.1.1-cdh6.3.2]
        at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37) [hive-exec-2.1.1-cdh6.3.2.jar:2.1.1-cdh6.3.2]
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52) [hive-exec-2.1.1-cdh6.3.2.jar:2.1.1-cdh6.3.2]
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49) [hive-exec-2.1.1-cdh6.3.2.jar:2.1.1-cdh6.3.2]
        at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_181]
        at javax.security.auth.Subject.doAs(Subject.java:422) [?:1.8.0_181]
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875) [hadoop-common-3.0.0-cdh6.3.2.jar:?]
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49) [hive-exec-2.1.1-cdh6.3.2.jar:2.1.1-cdh6.3.2]
        at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:229) [hive-jdbc-2.1.1-cdh6.3.2.jar:2.1.1-cdh6.3.2]
        at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:184) [hive-jdbc-2.1.1-cdh6.3.2.jar:2.1.1-cdh6.3.2]
        at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:107) [hive-jdbc-2.1.1-cdh6.3.2.jar:2.1.1-cdh6.3.2]
        at java.sql.DriverManager.getConnection(DriverManager.java:664) [?:1.8.0_181]
        at java.sql.DriverManager.getConnection(DriverManager.java:208) [?:1.8.0_181]
        at org.apache.hive.beeline.DatabaseConnection.connect(DatabaseConnection.java:145) [hive-beeline-2.1.1-cdh6.3.2.jar:2.1.1-cdh6.3.2]
        at org.apache.hive.beeline.DatabaseConnection.getConnection(DatabaseConnection.java:209) [hive-beeline-2.1.1-cdh6.3.2.jar:2.1.1-cdh6.3.2]
        at org.apache.hive.beeline.Commands.connect(Commands.java:1617) [hive-beeline-2.1.1-cdh6.3.2.jar:2.1.1-cdh6.3.2]
        at org.apache.hive.beeline.Commands.connect(Commands.java:1512) [hive-beeline-2.1.1-cdh6.3.2.jar:2.1.1-cdh6.3.2]
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_181]
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_181]
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_181]
        at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_181]
        at org.apache.hive.beeline.ReflectiveCommandHandler.execute(ReflectiveCommandHandler.java:56) [hive-beeline-2.1.1-cdh6.3.2.jar:2.1.1-cdh6.3.2]
        at org.apache.hive.beeline.BeeLine.execCommandWithPrefix(BeeLine.java:1290) [hive-beeline-2.1.1-cdh6.3.2.jar:2.1.1-cdh6.3.2]
        at org.apache.hive.beeline.BeeLine.dispatch(BeeLine.java:1329) [hive-beeline-2.1.1-cdh6.3.2.jar:2.1.1-cdh6.3.2]
        at org.apache.hive.beeline.BeeLine.connectUsingArgs(BeeLine.java:864) [hive-beeline-2.1.1-cdh6.3.2.jar:2.1.1-cdh6.3.2]
        at org.apache.hive.beeline.BeeLine.initArgs(BeeLine.java:768) [hive-beeline-2.1.1-cdh6.3.2.jar:2.1.1-cdh6.3.2]
        at org.apache.hive.beeline.BeeLine.begin(BeeLine.java:1004) [hive-beeline-2.1.1-cdh6.3.2.jar:2.1.1-cdh6.3.2]
        at org.apache.hive.beeline.BeeLine.mainWithInputRedirection(BeeLine.java:526) [hive-beeline-2.1.1-cdh6.3.2.jar:2.1.1-cdh6.3.2]
        at org.apache.hive.beeline.BeeLine.main(BeeLine.java:508) [hive-beeline-2.1.1-cdh6.3.2.jar:2.1.1-cdh6.3.2]
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_181]
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_181]
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_181]
        at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_181]
        at org.apache.hadoop.util.RunJar.run(RunJar.java:313) [hadoop-common-3.0.0-cdh6.3.2.jar:?]
        at org.apache.hadoop.util.RunJar.main(RunJar.java:227) [hadoop-common-3.0.0-cdh6.3.2.jar:?]
Caused by: org.ietf.jgss.GSSException: No valid credentials provided (Mechanism level: Server not found in Kerberos database (7) - LOOKING_UP_SERVER)
        at sun.security.jgss.krb5.Krb5Context.initSecContext(Krb5Context.java:770) ~[?:1.8.0_181]
        at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:248) ~[?:1.8.0_181]
        at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179) ~[?:1.8.0_181]
        at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192) ~[?:1.8.0_181]
        ... 36 more
Caused by: sun.security.krb5.KrbException: Server not found in Kerberos database (7) - LOOKING_UP_SERVER
        at sun.security.krb5.KrbTgsRep.<init>(KrbTgsRep.java:73) ~[?:1.8.0_181]
        at sun.security.krb5.KrbTgsReq.getReply(KrbTgsReq.java:251) ~[?:1.8.0_181]
        at sun.security.krb5.KrbTgsReq.sendAndGetCreds(KrbTgsReq.java:262) ~[?:1.8.0_181]
        at sun.security.krb5.internal.CredentialsUtil.serviceCreds(CredentialsUtil.java:308) ~[?:1.8.0_181]
        at sun.security.krb5.internal.CredentialsUtil.acquireServiceCreds(CredentialsUtil.java:126) ~[?:1.8.0_181]
        at sun.security.krb5.Credentials.acquireServiceCreds(Credentials.java:458) ~[?:1.8.0_181]
        at sun.security.jgss.krb5.Krb5Context.initSecContext(Krb5Context.java:693) ~[?:1.8.0_181]
        at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:248) ~[?:1.8.0_181]
        at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179) ~[?:1.8.0_181]
        at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192) ~[?:1.8.0_181]
        ... 36 more
Caused by: sun.security.krb5.Asn1Exception: Identifier doesn't match expected value (906)
        at sun.security.krb5.internal.KDCRep.init(KDCRep.java:140) ~[?:1.8.0_181]
        at sun.security.krb5.internal.TGSRep.init(TGSRep.java:65) ~[?:1.8.0_181]
        at sun.security.krb5.internal.TGSRep.<init>(TGSRep.java:60) ~[?:1.8.0_181]
        at sun.security.krb5.KrbTgsRep.<init>(KrbTgsRep.java:55) ~[?:1.8.0_181]
        at sun.security.krb5.KrbTgsReq.getReply(KrbTgsReq.java:251) ~[?:1.8.0_181]
        at sun.security.krb5.KrbTgsReq.sendAndGetCreds(KrbTgsReq.java:262) ~[?:1.8.0_181]
        at sun.security.krb5.internal.CredentialsUtil.serviceCreds(CredentialsUtil.java:308) ~[?:1.8.0_181]
        at sun.security.krb5.internal.CredentialsUtil.acquireServiceCreds(CredentialsUtil.java:126) ~[?:1.8.0_181]
        at sun.security.krb5.Credentials.acquireServiceCreds(Credentials.java:458) ~[?:1.8.0_181]
        at sun.security.jgss.krb5.Krb5Context.initSecContext(Krb5Context.java:693) ~[?:1.8.0_181]
        at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:248) ~[?:1.8.0_181]
        at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179) ~[?:1.8.0_181]
        at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192) ~[?:1.8.0_181]
        ... 36 more
23/05/04 15:06:01 [main]: WARN jdbc.HiveConnection: Failed to connect to 192.168.71.70:10000
Unknown HS2 problem when communicating with Thrift server.
Error: Could not open client transport with JDBC Uri: jdbc:hive2://192.168.71.70:10000/default;principal=hive/_HOST@CDH.COM: GSS initiate failed (state=08S01,code=0)



  • The KDC server log shows "TGS_REQ (1 etypes {17}) 192.168.71.70: LOOKING_UP_SERVER: authtime 0, dap@CDH.COM for hive/192.168.71.70@CDH.COM, Server not found in Kerberos database":


  • The core error in both the HS2 client log and the KDC server log above is "Server not found in Kerberos database (7) - LOOKING_UP_SERVER", meaning the KDC database has no server entry matching the specified Server_Principal_of_HiveServer2 (hive/192.168.71.70@CDH.COM);

  • Querying the KDC database with kadmin.local -q "listprincs" | egrep -i 'hive|dap' confirms this (the principals in the KDC database all use concrete hostnames rather than IP addresses):


4 Technical background and resolution

  • A KDC comprises an Authentication Server (AS) and a Ticket Granting Server (TGS); the KDC database holds user/group/service/computer information; the KDC answers clients' AS_REQ requests (once per user login session) as well as their TGS_REQ requests (once per type of service);

  • In Kerberos authentication mode, when a client creates a JDBC connection to HS2, the KDC first verifies the user's identity via AS_REQ and AS_REP (the user supplies credentials via a password or a keytab file, e.g. kinit user_name -kt xx.keytab); once the user's identity is verified, the client requests a service ticket from the KDC via TGS_REQ and TGS_REP (which is why the user must specify Server_Principal_of_HiveServer2 in the JDBC URL);

  • If the user specifies Server_Principal_of_HiveServer2 in the HS2 JDBC URL with a concrete hostname or IP address, the client issues the TGS_REQ with the principal exactly as given;

  • If the user specifies Server_Principal_of_HiveServer2 as the special string "_HOST", the client first replaces "_HOST" with the hostname or IP address taken from the host part of the HS2 JDBC URL, and only then issues the TGS_REQ to the KDC;

  • If the KDC database contains no principal for that hostname or IP address, the request fails with: Server not found in Kerberos database (7) - LOOKING_UP_SERVER;

  • The special string _HOST in the principal is replaced automatically with the correct host name by the client before issuing the TGS_REQ;

  • Kerberos principals are traditionally defined with hostnames, of the form hbase/worker3@EXAMPLE.COM, not hbase/10.10.15.1@EXAMPLE.COM. The question of whether Hadoop should support IP addresses has been raised in HADOOP-9019 and HADOOP-7510; the current consensus is no: you need DNS set up, or at least a consistent and valid /etc/hosts file on every node in the cluster.
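The _HOST substitution step described above can be sketched as follows. This is an illustrative reimplementation, not the actual Hive/Hadoop code (the real logic lives in the classes listed in section 5 and in Hadoop's security utilities); the class and method names here are our own:

```java
public class PrincipalResolver {

    // Replace the special "_HOST" token in a server principal of the form
    // service/host@REALM with the host taken from the HS2 JDBC URL.
    static String resolvePrincipal(String serverPrincipal, String jdbcHost) {
        int slash = serverPrincipal.indexOf('/');
        int at = serverPrincipal.indexOf('@');
        if (slash < 0 || at < slash) {
            throw new IllegalArgumentException(
                    "expected service/host@REALM, got: " + serverPrincipal);
        }
        String hostPart = serverPrincipal.substring(slash + 1, at);
        if (!"_HOST".equals(hostPart)) {
            // A concrete hostname or IP is used exactly as given.
            return serverPrincipal;
        }
        // _HOST inherits the host part of the JDBC URL -- so if the URL uses
        // an IP address, the resulting principal contains that IP, and the
        // TGS_REQ fails with LOOKING_UP_SERVER when the KDC only registers
        // hostname-based principals.
        return serverPrincipal.substring(0, slash + 1)
                + jdbcHost
                + serverPrincipal.substring(at);
    }

    public static void main(String[] args) {
        System.out.println(resolvePrincipal("hive/_HOST@CDH.COM", "uf30-1"));
        System.out.println(resolvePrincipal("hive/_HOST@CDH.COM", "192.168.71.70"));
    }
}
```

With a hostname in the URL the resolved principal is hive/uf30-1@CDH.COM, which the KDC can find; with the IP it becomes hive/192.168.71.70@CDH.COM, reproducing the failure from section 3.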


With this background, the fix for the Server not found in Kerberos database (7) - LOOKING_UP_SERVER error above is straightforward:


  • Without the special string _HOST: specify Server_Principal_of_HiveServer2 in the HS2 JDBC URL using the HS2 node's hostname rather than its IP (the HS2 host part may then be either a hostname or an IP);

  • With the special string _HOST: specify Server_Principal_of_HiveServer2 as _HOST, and specify the HS2 host part using the hostname rather than the IP.
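The two rules above can be combined into one sanity check: the effective principal host (after _HOST substitution) must not be an IP address. A small hedged sketch, assuming the KDC registers only hostname-based principals; the helper class and regular expressions are our own, not part of the Hive JDBC driver:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class KerberosUrlCheck {

    private static final Pattern IPV4 =
            Pattern.compile("\\d{1,3}(\\.\\d{1,3}){3}");

    // Matches jdbc:hive2://<host>:<port>/<db>;principal=<service>/<host>@<realm>
    private static final Pattern HS2_URL = Pattern.compile(
            "jdbc:hive2://([^:/;]+):\\d+/[^;]*;principal=[^/]+/([^@]+)@.+");

    // True when the effective principal host will be a hostname rather
    // than an IPv4 address, i.e. the URL should not hit LOOKING_UP_SERVER.
    static boolean looksSafe(String jdbcUrl) {
        Matcher m = HS2_URL.matcher(jdbcUrl);
        if (!m.matches()) {
            return false; // not a Kerberos-mode HS2 JDBC URL
        }
        String host = m.group(1);          // HS2 host part
        String principalHost = m.group(2); // host inside the principal
        // _HOST inherits the URL's host part, so check that instead.
        String effective = "_HOST".equals(principalHost) ? host : principalHost;
        return !IPV4.matcher(effective).matches();
    }

    public static void main(String[] args) {
        // Hostname principal: fine even with an IP in the host part.
        System.out.println(looksSafe(
            "jdbc:hive2://192.168.71.70:10000/default;principal=hive/uf30-1@CDH.COM"));
        // _HOST plus an IP host part: resolves to an IP principal and fails.
        System.out.println(looksSafe(
            "jdbc:hive2://192.168.71.70:10000/default;principal=hive/_HOST@CDH.COM"));
    }
}
```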

5 Related source code


- org.apache.hive.service.auth.KerberosSaslHelper#createSubjectAssumedTransport
- org.apache.hive.service.auth.KerberosSaslHelper#getKerberosTransport
- org.apache.hive.jdbc.HiveConnection#createBinaryTransport
- org.apache.hive.jdbc.HiveConnection#HiveConnection(java.lang.String, java.util.Properties, org.apache.hive.jdbc.saml.IJdbcBrowserClientFactory)



Keep Striving! Joined 2018-04-25

明哥: 14 years of IT experience, 6 of them in big data; has built and operated big data clusters, developed and optimized big data applications, and advised on technology selection and architecture for big data platforms; currently focused on the broader big data ecosystem, including data warehouses/data lakes, cloud computing, and artificial intelligence.

Source: 明哥的IT随笔, InfoQ 写作社区 (InfoQ Writing Community)