After enabling Kerberos authentication on Hadoop, the hadoop fs -ls and hdfs dfs -ls commands stop working

Environment: Alibaba Cloud CentOS 8.0, Hadoop 3.1.3, Kerberos 1.18

On an HDFS secure cluster with Kerberos authentication enabled, running:

[root@singlenode ~]# hadoop fs -ls /
2022-02-05 19:53:57,765 WARN ipc.Client: Exception encountered while connecting to the server : org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
ls: DestHost:destPort singlenode:8020 , LocalHost:localPort singlenode/172.31.xxx.xx:0. Failed on local exception: java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]

As the error message shows, the client failed to authenticate.
Check klist:

[root@singlenode ~]# klist
Ticket cache: KCM:0
Default principal: hdfs/hadoop@EXAMPLE.COM

Valid starting       Expires              Service principal
02/05/2022 20:13:11  02/06/2022 20:13:11  krbtgt/EXAMPLE.COM@EXAMPLE.COM
    renew until 02/05/2022 20:13:11

So there is a valid ticket after all?

Solution:

In the shell, explicitly put the ticket cache at /tmp/krb5cc_$UID and the problem goes away:

[root@singlenode ~]# kinit -c /tmp/krb5cc_$UID hdfs/hadoop

A ticket cached this way will not show up in a plain klist; view it with:

[root@singlenode ~]# klist -c /tmp/krb5cc_$UID

Likewise, it has to be destroyed with the matching kdestroy:

[root@singlenode ~]# kdestroy -c /tmp/krb5cc_$UID
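An equivalent approach (my own suggestion, not from the original post) is to make the FILE: cache the session default through the KRB5CCNAME environment variable, so kinit, klist, kdestroy and the Hadoop JVM all use the same cache without repeating -c:

```shell
# Hypothetical alternative: export KRB5CCNAME so every Kerberos tool and the
# Hadoop client agree on one FILE: ticket cache for this shell session.
export KRB5CCNAME="FILE:/tmp/krb5cc_$(id -u)"
echo "$KRB5CCNAME"
# kinit hdfs/hadoop   # now writes to the FILE: cache above
# hadoop fs -ls /     # the Java client reads the same FILE: cache
```

The kinit and hadoop lines are commented out here because they need a live KDC and cluster; in an interactive shell you would run them as-is.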

Root cause analysis:

My guess is that the Kerberos version is too new; version 1.15 does not seem to have this problem. Newer Kerberos releases default the ticket cache to KCM:0, and as far as I know the JDK's Kerberos implementation only reads FILE: ticket caches, so a ticket sitting in KCM is invisible to Hadoop's Java client.

[root@singlenode ~]# klist 
Ticket cache: KCM:0
Default principal: hdfs/hadoop@EXAMPLE.COM

Valid starting       Expires              Service principal
02/05/2022 20:13:11  02/06/2022 20:13:11  krbtgt/EXAMPLE.COM@EXAMPLE.COM
    renew until 02/05/2022 20:13:11

Hadoop probably looks for the cache under /tmp/ by default. I tried editing /etc/krb5.conf, both commenting out default_ccache_name and setting it to FILE:/tmp/krb5cc_%{uid}, but neither had any effect.
Here is my /etc/krb5.conf:

# To opt out of the system crypto-policies configuration of krb5, remove the
# symlink at /etc/krb5.conf.d/crypto-policies which will not be recreated.
includedir /etc/krb5.conf.d/

[logging]
    default = FILE:/var/log/krb5libs.log
    kdc = FILE:/var/log/krb5kdc.log
    admin_server = FILE:/var/log/kadmind.log

[libdefaults]
    dns_lookup_realm = false
    dns_lookup_kdc = false
    ticket_lifetime = 24h
    renew_lifetime = 7d
    forwardable = true
    rdns = false
    pkinit_anchors = FILE:/etc/pki/tls/certs/ca-bundle.crt
    default_realm = EXAMPLE.COM
    KRB5CCNAME = FILE:/tmp/krb5cc_%{uid}
    default_ccache_name = FILE:/tmp/krb5cc_%{uid}

[realms]
EXAMPLE.COM = {
    kdc = singlenode
    admin_server = singlenode
}

[domain_realm]
# .example.com = EXAMPLE.COM
# example.com = EXAMPLE.COM
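Given the analysis above, a quick way to check which cache type a shell's ticket actually landed in is to read the "Ticket cache:" header that klist prints. A small helper of my own (not from the original post) to extract it:

```shell
# Hypothetical helper: pull the cache type out of klist's
# "Ticket cache: TYPE:residual" header line. KCM means the ticket is held
# by the kernel credential manager, where the Java client cannot see it.
ccache_type() {
  sed -n 's/^Ticket cache: \([A-Z]*\):.*/\1/p'
}
echo "Ticket cache: KCM:0" | ccache_type                 # prints KCM
echo "Ticket cache: FILE:/tmp/krb5cc_0" | ccache_type    # prints FILE
```

On a real host you would pipe the live output instead: klist | head -n 1 | ccache_type.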
Tags: hadoop hdfs kerberos

Reposted from: https://blog.csdn.net/u011307484/article/details/122792825
Copyright belongs to the original author 余鸿福. If there is any infringement, please contact us for removal.
