JaneTTR
2026-01-12

requests-kerberos Compatibility Issue (Ubuntu 22 Error)

# Full Error Log


```
stderr:
NoneType: None

The above exception was the cause of the following exception:

Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/stacks/BIGTOP/3.2.0/services/HUE/package/scripts/hue_server.py", line 90, in <module>
hue_server().execute()
File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 413, in execute
method(env)
File "/var/lib/ambari-agent/cache/stacks/BIGTOP/3.2.0/services/HUE/package/scripts/hue_server.py", line 41, in install
hue_service('hue_server', action='metastoresync', upgrade_type=None)
File "/var/lib/ambari-agent/cache/stacks/BIGTOP/3.2.0/services/HUE/package/scripts/hue_service.py", line 55, in hue_service
Execute (format("{hue_bin_dir}/hue syncdb --noinput"),
File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 168, in __init__
self.env.run()
File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 171, in run
self.run_action(resource, action)
File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 137, in run_action
provider_action()
File "/usr/lib/ambari-agent/lib/resource_management/core/providers/system.py", line 333, in action_run
shell.checked_call(
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 95, in inner
result = function(command, **kwargs)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 142, in checked_call
return _call_wrapper(
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 278, in _call_wrapper
result = _call(command, **kwargs_copy)
File "/usr/lib/ambari-agent/lib/resource_management/core/shell.py", line 493, in _call
raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of '/usr/bigtop/current/hue/build/env/bin/hue syncdb --noinput' returned 1. Traceback (most recent call last):
File "/usr/bigtop/current/hue/build/env/lib/python3.8/site-packages/django/core/management/__init__.py", line 237, in fetch_command
app_name = commands[subcommand]
KeyError: 'syncdb'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/usr/bigtop/current/hue/build/env/lib/python3.8/site-packages/requests_kerberos/kerberos_.py", line 2, in <module>
import kerberos
ImportError: libcom_err.so.3: cannot open shared object file: No such file or directory

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/usr/bigtop/current/hue/build/env/bin/hue", line 33, in <module>
sys.exit(load_entry_point('desktop', 'console_scripts', 'hue')())
File "/usr/bigtop/current/hue/desktop/core/src/desktop/manage_entry.py", line 233, in entry
execute_from_command_line(sys.argv)
File "/usr/bigtop/current/hue/build/env/lib/python3.8/site-packages/django/core/management/__init__.py", line 419, in execute_from_command_line
utility.execute()
File "/usr/bigtop/current/hue/build/env/lib/python3.8/site-packages/django/core/management/__init__.py", line 413, in execute
self.fetch_command(subcommand).run_from_argv(self.argv)
File "/usr/bigtop/current/hue/build/env/lib/python3.8/site-packages/django/core/management/__init__.py", line 244, in fetch_command
settings.INSTALLED_APPS
File "/usr/bigtop/current/hue/build/env/lib/python3.8/site-packages/django/conf/__init__.py", line 82, in __getattr__
self._setup(name)
File "/usr/bigtop/current/hue/build/env/lib/python3.8/site-packages/django/conf/__init__.py", line 69, in _setup
self._wrapped = Settings(settings_module)
File "/usr/bigtop/current/hue/build/env/lib/python3.8/site-packages/django/conf/__init__.py", line 170, in __init__
mod = importlib.import_module(self.SETTINGS_MODULE)
File "/usr/bigtop/current/hue/build/env/lib/python3.8/importlib/__init__.py", line 127, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
File "<frozen importlib._bootstrap>", line 991, in _find_and_load
File "<frozen importlib._bootstrap>", line 975, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 671, in _load_unlocked
File "<frozen importlib._bootstrap_external>", line 843, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "/usr/bigtop/current/hue/desktop/core/src/desktop/settings.py", line 38, in <module>
from aws.conf import is_enabled as is_s3_enabled
File "/usr/bigtop/current/hue/desktop/libs/aws/src/aws/conf.py", line 26, in <module>
from desktop.lib.idbroker import conf as conf_idbroker
File "/usr/bigtop/current/hue/desktop/core/src/desktop/lib/idbroker/conf.py", line 22, in <module>
from requests_kerberos import HTTPKerberosAuth
File "/usr/bigtop/current/hue/build/env/lib/python3.8/site-packages/requests_kerberos/__init__.py", line 17, in <module>
from .kerberos_ import HTTPKerberosAuth, REQUIRED, OPTIONAL, DISABLED
File "/usr/bigtop/current/hue/build/env/lib/python3.8/site-packages/requests_kerberos/kerberos_.py", line 4, in <module>
import winkerberos as kerberos
ModuleNotFoundError: No module named 'winkerberos'
stdout:
2026-01-06 13:44:38,121 - Stack Feature Version Info: Cluster Stack=3.2.0, Command Stack=None, Command Version=None -> 3.2.0
2026-01-06 13:44:38,122 - Using hadoop conf dir: /etc/hadoop/conf
2026-01-06 13:44:38,123 - Skipping param: datanode_max_locked_memory, due to Configuration parameter 'dfs.datanode.max.locked.memory' was not found in configurations dictionary!
2026-01-06 13:44:38,123 - Skipping param: falcon_user, due to Configuration parameter 'falcon-env' was not found in configurations dictionary!
2026-01-06 13:44:38,123 - Skipping param: gmetad_user, due to Configuration parameter 'ganglia-env' was not found in configurations dictionary!
2026-01-06 13:44:38,123 - Skipping param: gmond_user, due to Configuration parameter 'ganglia-env' was not found in configurations dictionary!
2026-01-06 13:44:38,123 - Skipping param: hbase_user, due to Configuration parameter 'hbase-env' was not found in configurations dictionary!
2026-01-06 13:44:38,123 - Skipping param: nfsgateway_heapsize, due to Configuration parameter 'nfsgateway_heapsize' was not found in configurations dictionary!
2026-01-06 13:44:38,124 - Skipping param: oozie_user, due to Configuration parameter 'oozie-env' was not found in configurations dictionary!
2026-01-06 13:44:38,124 - Skipping param: repo_info, due to Configuration parameter 'repoInfo' was not found in configurations dictionary!
2026-01-06 13:44:38,124 - Skipping param: zeppelin_group, due to Configuration parameter 'zeppelin-env' was not found in configurations dictionary!
2026-01-06 13:44:38,124 - Skipping param: zeppelin_user, due to Configuration parameter 'zeppelin-env' was not found in configurations dictionary!
2026-01-06 13:44:38,124 - Group['ranger'] {}
2026-01-06 13:44:38,125 - Group['hdfs'] {}
2026-01-06 13:44:38,125 - Group['hue'] {}
2026-01-06 13:44:38,127 - Adding group Group['hue']
2026-01-06 13:44:38,165 - Group['hadoop'] {}
2026-01-06 13:44:38,166 - User['hive'] {'uid': None, 'gid': 'hadoop', 'groups': ['hadoop'], 'fetch_nonlocal_groups': True}
2026-01-06 13:44:38,169 - User['zookeeper'] {'uid': None, 'gid': 'hadoop', 'groups': ['hadoop'], 'fetch_nonlocal_groups': True}
2026-01-06 13:44:38,170 - User['ambari-qa'] {'uid': None, 'gid': 'hadoop', 'groups': ['hadoop'], 'fetch_nonlocal_groups': True}
2026-01-06 13:44:38,172 - User['ranger'] {'uid': None, 'gid': 'hadoop', 'groups': ['ranger', 'hadoop'], 'fetch_nonlocal_groups': True}
2026-01-06 13:44:38,174 - User['solr'] {'uid': None, 'gid': 'hadoop', 'groups': ['hadoop'], 'fetch_nonlocal_groups': True}
2026-01-06 13:44:38,175 - User['tez'] {'uid': None, 'gid': 'hadoop', 'groups': ['hadoop'], 'fetch_nonlocal_groups': True}
2026-01-06 13:44:38,177 - User['hdfs'] {'uid': None, 'gid': 'hadoop', 'groups': ['hdfs', 'hadoop'], 'fetch_nonlocal_groups': True}
2026-01-06 13:44:38,179 - User['hue'] {'uid': None, 'gid': 'hadoop', 'groups': ['hue', 'hadoop'], 'fetch_nonlocal_groups': True}
2026-01-06 13:44:38,179 - Adding user User['hue']
2026-01-06 13:44:38,246 - User['yarn'] {'uid': None, 'gid': 'hadoop', 'groups': ['hadoop'], 'fetch_nonlocal_groups': True}
2026-01-06 13:44:38,248 - User['hcat'] {'uid': None, 'gid': 'hadoop', 'groups': ['hadoop'], 'fetch_nonlocal_groups': True}
2026-01-06 13:44:38,250 - User['mapred'] {'uid': None, 'gid': 'hadoop', 'groups': ['hadoop'], 'fetch_nonlocal_groups': True}
2026-01-06 13:44:38,252 - User['knox'] {'uid': None, 'gid': 'hadoop', 'groups': ['hadoop'], 'fetch_nonlocal_groups': True}
2026-01-06 13:44:38,254 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0o555}
2026-01-06 13:44:38,256 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2026-01-06 13:44:38,269 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2026-01-06 13:44:38,270 - User['hdfs'] {'fetch_nonlocal_groups': True}
2026-01-06 13:44:38,272 - User['hdfs'] {'groups': ['hdfs', 'hadoop'], 'fetch_nonlocal_groups': True}
2026-01-06 13:44:38,274 - FS Type: HDFS
2026-01-06 13:44:38,274 - Directory['/etc/hadoop'] {'mode': 0o755}
2026-01-06 13:44:38,293 - File['/etc/hadoop/conf/hadoop-env.sh'] {'owner': 'root', 'group': 'hadoop', 'content': InlineTemplate(...)}
2026-01-06 13:44:38,294 - Writing File['/etc/hadoop/conf/hadoop-env.sh'] because contents don't match
2026-01-06 13:44:38,295 - Changing group for /tmp/tmp1767707078.2947836_773 from 0 to hadoop
2026-01-06 13:44:38,295 - Moving /tmp/tmp1767707078.2947836_773 to /etc/hadoop/conf/hadoop-env.sh
2026-01-06 13:44:38,306 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 0o1777}
2026-01-06 13:44:38,352 - Skipping param: gmetad_user, due to Configuration parameter 'ganglia-env' was not found in configurations dictionary!
2026-01-06 13:44:38,352 - Skipping param: gmond_user, due to Configuration parameter 'ganglia-env' was not found in configurations dictionary!
2026-01-06 13:44:38,352 - Skipping param: hbase_user, due to Configuration parameter 'hbase-env' was not found in configurations dictionary!
2026-01-06 13:44:38,352 - Skipping param: repo_info, due to Configuration parameter 'repoInfo' was not found in configurations dictionary!
2026-01-06 13:44:38,352 - Repository['BIGTOP-3.2.0-repo-1'] {'action': ['prepare'], 'base_url': 'http://192.168.3.212', 'mirror_list': None, 'repo_file_name': 'ambari-bigtop-1', 'repo_template': '{{package_type}} [trusted=yes] {{base_url}} ./', 'components': ['bigtop', 'main']}
2026-01-06 13:44:38,358 - Repository[None] {'action': ['create']}
2026-01-06 13:44:38,359 - File['/tmp/tmpnt0mc6a5'] {'content': b'deb [trusted=yes] http://192.168.3.212 ./', 'owner': 'root'}
2026-01-06 13:44:38,360 - Writing File['/tmp/tmpnt0mc6a5'] because contents don't match
2026-01-06 13:44:38,360 - Moving /tmp/tmp1767707078.3604774_167 to /tmp/tmpnt0mc6a5
2026-01-06 13:44:38,371 - File['/tmp/tmplvm7t6b0'] {'content': StaticFile('/etc/apt/sources.list.d/ambari-bigtop-1.list'), 'owner': 'root'}
2026-01-06 13:44:38,372 - Writing File['/tmp/tmplvm7t6b0'] because contents don't match
2026-01-06 13:44:38,372 - Moving /tmp/tmp1767707078.3723066_869 to /tmp/tmplvm7t6b0
2026-01-06 13:44:38,383 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2026-01-06 13:44:38,404 - Installing package unzip ('/usr/bin/apt-get -o Dpkg::Options::=--force-confdef --allow-unauthenticated --assume-yes install unzip')
2026-01-06 13:44:39,600 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2026-01-06 13:44:39,609 - Installing package curl ('/usr/bin/apt-get -o Dpkg::Options::=--force-confdef --allow-unauthenticated --assume-yes install curl')
2026-01-06 13:44:40,668 - Package['bigtop-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2026-01-06 13:44:40,676 - Installing package bigtop-select ('/usr/bin/apt-get -o Dpkg::Options::=--force-confdef --allow-unauthenticated --assume-yes install bigtop-select')
2026-01-06 13:44:41,944 - The repository with version 3.2.0 for this command has been marked as resolved. It will be used to report the version of the component which was installed
2026-01-06 13:44:42,531 - config['clusterHostInfo']: {'hive_metastore_hosts': ['dev2'], 'resourcemanager_hosts': ['dev1', 'dev2'], 'zkfc_hosts': ['dev1', 'dev2'], 'ranger_admin_hosts': ['dev2'], 'hive_client_hosts': ['dev1', 'dev3', 'dev2'], 'mapreduce2_client_hosts': ['dev1', 'dev3', 'dev2'], 'tez_client_hosts': ['dev1', 'dev3', 'dev2'], 'kerberos_client_hosts': ['dev1', 'dev3', 'dev2'], 'solr_server_hosts': ['dev1'], 'knox_gateway_hosts': ['dev1'], 'namenode_hosts': ['dev1', 'dev2'], 'hdfs_client_hosts': ['dev1', 'dev3', 'dev2'], 'hive_server_hosts': ['dev2'], 'ranger_tagsync_hosts': ['dev3'], 'hcat_hosts': ['dev1', 'dev3', 'dev2'], 'ranger_usersync_hosts': ['dev2'], 'nodemanager_hosts': ['dev1', 'dev3', 'dev2'], 'zookeeper_server_hosts': ['dev1', 'dev3', 'dev2'], 'yarn_client_hosts': ['dev1', 'dev3', 'dev2'], 'webhcat_server_hosts': ['dev1'], 'journalnode_hosts': ['dev1', 'dev3', 'dev2'], 'zookeeper_client_hosts': ['dev1', 'dev3', 'dev2'], 'datanode_hosts': ['dev1', 'dev3', 'dev2'], 'historyserver_hosts': ['dev2'], 'hue_server_hosts': ['dev1'], 'all_hosts': ['dev1', 'dev3', 'dev2'], 'all_racks': ['/default-rack', '/default-rack', '/default-rack'], 'all_ipv4_ips': ['192.168.3.212', '192.168.3.214', '192.168.3.213']}
2026-01-06 13:44:42,540 - Using hadoop conf dir: /etc/hadoop/conf
2026-01-06 13:44:42,550 - Command repositories: BIGTOP-3.2.0-repo-1
2026-01-06 13:44:42,550 - Applicable repositories: BIGTOP-3.2.0-repo-1
2026-01-06 13:44:48,058 - Looking for matching packages in the following repositories: 192.168.3.212
2026-01-06 13:44:48,094 - Package['hue-3-2-0'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2026-01-06 13:44:48,107 - Installing package hue-3-2-0 ('/usr/bin/apt-get -o Dpkg::Options::=--force-confdef --allow-unauthenticated --assume-yes install hue-3-2-0')
2026-01-06 13:45:45,273 - Execute['python /usr/lib/bigtop-select/distro-select set hue 3.2.0'] {}
2026-01-06 13:45:45,317 - Directory['/var/log/hue'] {'owner': 'hue', 'create_parents': True, 'group': 'hue', 'mode': 0o775}
2026-01-06 13:45:45,318 - Changing group for /var/log/hue from 1000 to hue
2026-01-06 13:45:45,318 - Changing permission for /var/log/hue from 755 to 775
2026-01-06 13:45:45,318 - Directory['/var/run/hue'] {'owner': 'hue', 'create_parents': True, 'group': 'hadoop', 'mode': 0o775}
2026-01-06 13:45:45,319 - Changing permission for /var/run/hue from 755 to 775
2026-01-06 13:45:45,345 - File['/usr/bigtop/current/hue/desktop/conf/hue.ini'] {'owner': 'hue', 'group': 'hue', 'mode': 0o755, 'content': InlineTemplate(...)}
2026-01-06 13:45:45,345 - Writing File['/usr/bigtop/current/hue/desktop/conf/hue.ini'] because it doesn't exist
2026-01-06 13:45:45,346 - Changing owner for /tmp/tmp1767707145.3462114_429 from 0 to hue
2026-01-06 13:45:45,346 - Changing group for /tmp/tmp1767707145.3462114_429 from 0 to hue
2026-01-06 13:45:45,346 - Changing permission for /tmp/tmp1767707145.3462114_429 from 644 to 755
2026-01-06 13:45:45,346 - Moving /tmp/tmp1767707145.3462114_429 to /usr/bigtop/current/hue/desktop/conf/hue.ini
2026-01-06 13:45:45,362 - File['/usr/bigtop/current/hue/desktop/conf/log.conf'] {'owner': 'hue', 'group': 'hue', 'mode': 0o755, 'content': InlineTemplate(...)}
2026-01-06 13:45:45,362 - Writing File['/usr/bigtop/current/hue/desktop/conf/log.conf'] because it doesn't exist
2026-01-06 13:45:45,363 - Changing owner for /tmp/tmp1767707145.3630645_397 from 0 to hue
2026-01-06 13:45:45,363 - Changing group for /tmp/tmp1767707145.3630645_397 from 0 to hue
2026-01-06 13:45:45,363 - Changing permission for /tmp/tmp1767707145.3630645_397 from 644 to 755
2026-01-06 13:45:45,363 - Moving /tmp/tmp1767707145.3630645_397 to /usr/bigtop/current/hue/desktop/conf/log.conf
2026-01-06 13:45:45,379 - File['/usr/bigtop/current/hue/desktop/conf/log4j.properties'] {'owner': 'hue', 'group': 'hue', 'mode': 0o755, 'content': InlineTemplate(...)}
2026-01-06 13:45:45,380 - Writing File['/usr/bigtop/current/hue/desktop/conf/log4j.properties'] because it doesn't exist
2026-01-06 13:45:45,380 - Changing owner for /tmp/tmp1767707145.380615_745 from 0 to hue
2026-01-06 13:45:45,381 - Changing group for /tmp/tmp1767707145.380615_745 from 0 to hue
2026-01-06 13:45:45,381 - Changing permission for /tmp/tmp1767707145.380615_745 from 644 to 755
2026-01-06 13:45:45,381 - Moving /tmp/tmp1767707145.380615_745 to /usr/bigtop/current/hue/desktop/conf/log4j.properties
2026-01-06 13:45:45,396 - Execute['/usr/bigtop/current/hue/build/env/bin/hue syncdb --noinput'] {'environment': {'JAVA_HOME': '/usr/jdk64/jdk17', 'HADOOP_CONF_DIR': '/etc/hadoop/conf', 'LD_LIBRARY_PATH': '$LD_LIBRARY_PATH:/usr/hdp/current/hue/lib-native'}, 'user': 'hue'}
2026-01-06 13:45:46,916 - The repository with version 3.2.0 for this command has been marked as resolved. It will be used to report the version of the component which was installed

Command failed after 1 tries
```



# 1. Problem Overview

During the Ambari-orchestrated installation of Hue (Bigtop 3.2.0), the run failed at the following step:

```bash
/usr/bigtop/current/hue/build/env/bin/hue syncdb --noinput
```

The Ambari Agent log shows the command exiting with return code 1, accompanied by several stacked exception traces.

On the surface, the earliest exception is a missing Django command:

```
KeyError: 'syncdb'
```

But this is not the root cause. It is a knock-on symptom of the Python runtime being interrupted early during initialization.
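
To confirm that, a quick check (a sketch, assuming the standard Bigtop install paths shown in the traceback) is to import requests_kerberos directly inside Hue's virtualenv:

```bash
# Import requests_kerberos in Hue's venv; the ImportError that surfaces
# here is the real failure hiding behind the KeyError: 'syncdb'.
sudo -u hue /usr/bigtop/current/hue/build/env/bin/python -c \
  'import requests_kerberos' \
  || echo "requests_kerberos failed to import -- root cause is here"
```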

# 2. Unpacking the Full Exception Chain

# 1) First layer: Django fails to load the subcommand

File ".../django/core/management/__init__.py", line 237, in fetch_command
  app_name = commands[subcommand]
KeyError: 'syncdb'
1
2
3

From Django's perspective, this means:

  • manage.py started normally
  • but settings never finished loading
  • so INSTALLED_APPS was never initialized

Django therefore cannot register the syncdb subcommand.

# 2) Second layer: settings.py loading is interrupted

Reading further down the call stack:

File ".../desktop/settings.py", line 38, in &lt;module>
from aws.conf import is_enabled as is_s3_enabled
1
2

This step shows that:

  • Hue has entered the desktop initialization phase
  • and has started loading modules related to security, cloud storage, and identity brokering
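
The traceback pins the trigger to a single line, which can be read straight off disk (path and line number taken from the stack trace above):

```bash
# Line 38 of desktop/settings.py is where the fatal import chain begins.
sed -n '38p' /usr/bigtop/current/hue/desktop/core/src/desktop/settings.py
# expected output (per the traceback):
# from aws.conf import is_enabled as is_s3_enabled
```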

# 3) Third layer: requests-kerberos fails to initialize

The decisive error appears here:

```
ImportError: libcom_err.so.3: cannot open shared object file: No such file or directory
```

Immediately followed by:

```
ModuleNotFoundError: No module named 'winkerberos'
```

These two errors point in a clear direction:

  1. Hue has not explicitly enabled Kerberos.
  2. Yet requests-kerberos has already been imported indirectly.
  3. At initialization, requests-kerberos:
     • needs the system Kerberos C libraries
     • needs a Python Kerberos binding module
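
The system-library half of the failure is easy to verify from the dynamic linker cache (a sketch assuming the x86_64 multiarch path used on Ubuntu 22.04):

```bash
# On a stock Ubuntu 22.04 box only the .so.2 soname should appear,
# which is exactly why the binding's lookup for .so.3 fails.
ldconfig -p | grep libcom_err
ls -l /usr/lib/x86_64-linux-gnu/libcom_err.so.* 2>/dev/null
```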

# 3. Why Does Hue Trigger the Kerberos Dependency "Passively"?

This is a point that is very easy to overlook.

# 1) The trigger path does not come from the HDFS configuration

In this scenario, none of the following were explicitly configured:

  • WebHDFS Kerberos
  • Hive Kerberos
  • HDFS doAs Kerberos

Yet at startup, Hue always loads:

  • desktop.lib.idbroker
  • aws.conf
  • requests_kerberos.HTTPKerberosAuth

This is designed behavior, not a misconfiguration.
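
You can confirm the unconditional import yourself; the file and line come straight from the traceback:

```bash
# idbroker's conf module imports HTTPKerberosAuth at module load time,
# so merely importing Hue's settings drags requests_kerberos in.
grep -n 'requests_kerberos' \
  /usr/bigtop/current/hue/desktop/core/src/desktop/lib/idbroker/conf.py
```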

# 2) The dynamic dependency mechanism of requests-kerberos

At import time, requests-kerberos does the following:

  1. It first tries to load kerberos / pykerberos.
  2. If that fails, it tries winkerberos.
  3. Either way, it depends on the system-level Kerberos shared libraries (libcom_err.so.*).

If any one of these links fails, the entire import raises, and loading Hue's settings fails with it.
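
The fallback order can be replayed by hand inside Hue's virtualenv; this sketch mirrors the two import attempts visible in the traceback:

```bash
# Replay requests-kerberos' import fallback: 'kerberos' first (which needs
# the system C libraries), then 'winkerberos' (a Windows-only binding).
sudo -u hue /usr/bigtop/current/hue/build/env/bin/python - <<'EOF'
try:
    import kerberos                      # fails here: libcom_err.so.3 missing
    print("kerberos OK:", kerberos.__file__)
except ImportError as e:
    print("kerberos failed:", e)
    try:
        import winkerberos as kerberos   # fails too: not installed on Linux
    except ImportError as e2:
        print("winkerberos failed:", e2)
EOF
```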

# 4. Root-Cause Summary (More Than One Problem)

Putting the logs together, this is clearly a compound problem:

| Layer | Issue |
| --- | --- |
| OS | libcom_err.so.3 is missing |
| System libraries | Debian-family distributions ship only libcom_err.so.2 |
| Python | The installed requests-kerberos build is incompatible with the system libraries |
| Hue | Startup has no fallback around the Kerberos import |

So fixing the Python packages alone is not enough.
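
A quick per-layer triage before touching anything (a sketch; package and path names assume Ubuntu 22.04 x86_64):

```bash
ldconfig -p | grep libcom_err          # OS layer: is .so.3 visible at all?
dpkg -l | grep libcom-err              # system-package layer
/usr/bigtop/current/hue/build/env/bin/pip show requests-kerberos  # Python layer
```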

# 5. Recommended Fix (Reusable)

# 1) Install the Kerberos runtime dependencies (system layer)

First, make sure the following dependencies are fully installed:

```bash
apt-get install -y \
  build-essential \
  gcc \
  python3-dev \
  krb5-config \
  krb5-user \
  libkrb5-dev \
  libkrb5-3 \
  libgssapi-krb5-2 \
  libk5crypto3 \
  libkrb5support0 \
  libcom-err2 \
  libsasl2-2 \
  libsasl2-modules \
  libsasl2-modules-gssapi-mit
```
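
A quick sanity check after the install (same package names as above):

```bash
# All of the GSSAPI/Kerberos/SASL runtime packages should now be listed.
dpkg -l | grep -E 'libkrb5|libgssapi|libcom-err|libsasl2|krb5'
```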

# 2) Work around the libcom_err version mismatch

On Debian / Ubuntu:

  • what actually exists is libcom_err.so.2
  • what requests-kerberos expects is libcom_err.so.3

The workaround is a compatibility symlink:

```bash
sudo ln -sf /usr/lib/x86_64-linux-gnu/libcom_err.so.2 \
  /usr/lib/x86_64-linux-gnu/libcom_err.so.3

sudo ldconfig
```

This is a runtime compatibility shim and does not affect how the system's existing libraries are loaded.
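
After running ldconfig, the shim should be visible in the linker cache. Presumably the binding only needs symbols that the .so.2 library already exports, which is why a plain symlink is enough here:

```bash
# Both sonames should now resolve; .so.3 points at the real .so.2.
ldconfig -p | grep libcom_err
readlink -f /usr/lib/x86_64-linux-gnu/libcom_err.so.3
```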

# 3) Rebuild the Python Kerberos dependencies inside Hue's virtualenv

Note: this must be done with Hue's own virtualenv.

```bash
sudo -u hue /usr/bigtop/current/hue/build/env/bin/pip install \
  --no-cache-dir \
  --only-binary=:all: \
  requests-kerberos==0.15.0 \
  pyspnego \
  cryptography \
  requests
```

Then force-reinstall the Kerberos binding modules:

```bash
sudo -u hue /usr/bigtop/current/hue/build/env/bin/pip install \
  --no-cache-dir \
  --no-binary=kerberos,pykerberos \
  --force-reinstall \
  kerberos==1.3.1 \
  pykerberos==1.2.4
```

Finally, restart the service.
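
Before restarting, it is worth re-running the originally failing pieces end to end (same paths as above):

```bash
# 1) the import that used to explode
sudo -u hue /usr/bigtop/current/hue/build/env/bin/python -c \
  'import requests_kerberos; print("requests_kerberos OK")'

# 2) the Ambari step that originally failed
sudo -u hue /usr/bigtop/current/hue/build/env/bin/hue syncdb --noinput
```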

#Bigtop#Hue#Kerberos#requests-kerberos#Hadoop 安全#Ambari