
spark-submit Python program: "/home/.python-eggs" permission denied (solved)


Problem description: a Python script is submitted to YARN with spark-submit. The job runs fine until it reaches the distributed part, i.e. when an RDD transformation such as map/mapPartitions executes on the executors, or when an RDD action is triggered, and then fails with the error below.
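For reference, a submission of roughly this shape reproduces the setup (the script name and deploy-mode flags here are illustrative, not taken from the original post):

    spark-submit --master yarn --deploy-mode cluster my_job.py

The resulting traceback: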

    Traceback (most recent call last):
    File "/usr/lib64/python2.7/runpy.py", line 151, in _run_module_as_main
      mod_name, loader, code, fname = _get_module_details(mod_name)
    File "/usr/lib64/python2.7/runpy.py", line 101, in _get_module_details
      loader = get_loader(mod_name)
    File "/usr/lib64/python2.7/pkgutil.py", line 464, in get_loader
      return find_loader(fullname)
    File "/usr/lib64/python2.7/pkgutil.py", line 474, in find_loader
      for importer in iter_importers(fullname):
    File "/usr/lib64/python2.7/pkgutil.py", line 430, in iter_importers
      __import__(pkg)
    File "/data8/yarn/local-dir/usercache/bo.feng/appcache/application_1448854352032_70810/container_1448854352032_70810_01_000002/pyspark.zip/pyspark/__init__.py", line 41, in <module>
    File "/data8/yarn/local-dir/usercache/bo.feng/appcache/application_1448854352032_70810/container_1448854352032_70810_01_000002/pyspark.zip/pyspark/context.py", line 35, in <module>
    File "/data8/yarn/local-dir/usercache/bo.feng/appcache/application_1448854352032_70810/container_1448854352032_70810_01_000002/pyspark.zip/pyspark/rdd.py", line 51, in <module>
    File "/data8/yarn/local-dir/usercache/bo.feng/appcache/application_1448854352032_70810/container_1448854352032_70810_01_000002/pyspark.zip/pyspark/shuffle.py", line 33, in <module>
    File "build/bdist.linux-x86_64/egg/psutil/__init__.py", line 89, in <module>
    File "build/bdist.linux-x86_64/egg/psutil/_pslinux.py", line 24, in <module>
    File "build/bdist.linux-x86_64/egg/_psutil_linux.py", line 7, in <module>
    File "build/bdist.linux-x86_64/egg/_psutil_linux.py", line 4, in __bootstrap__
    File "/usr/lib/python2.7/site-packages/pkg_resources.py", line 945, in resource_filename
      self, resource_name
    File "/usr/lib/python2.7/site-packages/pkg_resources.py", line 1633, in get_resource_filename
      self._extract_resource(manager, self._eager_to_zip(name))
    File "/usr/lib/python2.7/site-packages/pkg_resources.py", line 1661, in _extract_resource
      self.egg_name, self._parts(zip_path)
    File "/usr/lib/python2.7/site-packages/pkg_resources.py", line 1025, in get_cache_path
      self.extraction_error()
    File "/usr/lib/python2.7/site-packages/pkg_resources.py", line 991,     inextraction_error
      raise err
      pkg_resources.ExtractionError: Can't extract file(s) to egg cache
      The following error occurred while trying to extract file(s) to the Python egg
      cache:
        [Errno 13] Permission denied: '/home/.python-eggs' 
      The Python egg cache directory is currently set to:  
       /home/.python-eggs  
     Perhaps your account does not have write access to this directory?  You can
     change the cache directory by setting the PYTHON_EGG_CACHE environment
     variable to point to an accessible directory.
    

Solutions:

1. At the top of the function you pass to map/mapPartitions, point the egg cache at a writable directory before anything triggers egg extraction (a usage sketch follows this item):

    import os  # import inside the function so it runs on the executor
    os.environ['PYTHON_EGG_CACHE'] = '/tmp/.python-eggs/'
    os.environ['PYTHON_EGG_DIR'] = '/tmp/.python-eggs/'
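A minimal sketch of this, assuming a live SparkContext sc and a stand-in input path (both hypothetical, not from the original post):

    def process_partition(lines):
        # Redirect the egg cache before the egg-packaged psutil gets imported
        import os
        os.environ['PYTHON_EGG_CACHE'] = '/tmp/.python-eggs/'
        os.environ['PYTHON_EGG_DIR'] = '/tmp/.python-eggs/'
        import psutil  # now extracts into /tmp/.python-eggs/ instead of /home/.python-eggs
        for line in lines:
            yield (line, os.getpid())

    pairs = sc.textFile('hdfs:///some/input').mapPartitions(process_partition).take(5)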

2. Set the environment variables on every machine in the cluster (recommended), for example in /etc/profile or in Spark's conf/spark-env.sh:

    export PYTHON_EGG_CACHE=/tmp/.python-eggs/
    export PYTHON_EGG_DIR=/tmp/.python-eggs/
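If logging into every machine is impractical, the same variables can also be shipped per job through Spark's spark.executorEnv.* configuration; a minimal sketch (the app name is hypothetical):

    from pyspark import SparkConf, SparkContext

    # spark.executorEnv.<NAME> sets <NAME> in each executor's environment
    conf = (SparkConf()
            .setAppName('egg-cache-fix')
            .set('spark.executorEnv.PYTHON_EGG_CACHE', '/tmp/.python-eggs/')
            .set('spark.executorEnv.PYTHON_EGG_DIR', '/tmp/.python-eggs/'))
    sc = SparkContext(conf=conf)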

3. Go to the Spark root directory, cd into python/lib, unzip pyspark.zip, open pyspark/rdd.py with vim, find the line "import os", and insert the two lines below right under it. Repack the zip afterwards so the patched rdd.py is what gets shipped to the executors (see the sketch after this item):

    os.environ['PYTHON_EGG_CACHE'] = '/tmp/.python-eggs/'
    os.environ['PYTHON_EGG_DIR'] = '/tmp/.python-eggs/'
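The edit-and-repack sequence as a sketch, assuming the Spark root is /opt/spark (adjust the path to your install):

    cd /opt/spark/python/lib
    unzip pyspark.zip -d pyspark_tmp      # unpack the bundled pyspark package
    vim pyspark_tmp/pyspark/rdd.py        # insert the two lines below "import os"
    cd pyspark_tmp
    zip -r ../pyspark.zip pyspark         # update the archive with the patched file
    cd .. && rm -rf pyspark_tmp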

If none of the three approaches above solves the problem, first use Hadoop Streaming to submit a Python executable and verify that YARN can run Python at all (a smoke test is sketched below).
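A minimal Hadoop Streaming smoke test of that kind might look like this; the jar path varies by Hadoop version, and the input/output paths are hypothetical:

    hadoop jar $HADOOP_HOME/share/hadoop/tools/lib/hadoop-streaming-*.jar \
        -input /tmp/py-smoke-in \
        -output /tmp/py-smoke-out \
        -mapper "python -c 'import sys; sys.stdout.writelines(sys.stdin)'" \
        -numReduceTasks 0

If this map-only job succeeds, the NodeManagers can execute Python, and the problem is specific to Spark's egg handling.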

Then check whether the same Python job can be submitted in Spark standalone mode.
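For example (the master URL is a placeholder for your own standalone master):

    spark-submit --master spark://your-master:7077 my_job.py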

That is all.

If it still does not work after that, the only remaining option is to write to the Spark dev mailing list.
