spark-client Installation

Author: 后知不觉1 | Published 2023-05-04 20:40

1. Dependencies

  • hadoop-client
  • hive-client
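
A quick way to confirm both clients are actually usable on the node before configuring Spark (a minimal sketch; assumes the hadoop and hive commands are already on PATH, and that hive-site.xml lives under /etc/hive/conf):

   # verify the Hadoop and Hive clients respond
   hadoop version
   hive --version
   # Spark needs the Hive client configuration to reach the metastore; /etc/hive/conf is an assumed location
   ls /etc/hive/conf/hive-site.xml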

2. Configuration

  • spark-defaults.conf

      spark.driver.extraLibraryPath /opt/hadoop/lib/native    # native library path for the driver
      spark.eventLog.dir hdfs:///spark2-history/              # HDFS location for Spark event logs
      spark.eventLog.enabled true                             # enable event logging
      spark.executor.extraLibraryPath /opt/hadoop/lib/native  # native library path for executors
      spark.extraListeners org.apache.spark.sql.hive.asdListner                     # Spark listener hook
      spark.history.fs.logDirectory hdfs:///spark2-history/   # HDFS log directory read by the History Server
      spark.history.provider org.apache.spark.deploy.history.FsHistoryProvider      # History Server provider class
      spark.history.ui.port 18081                             # History Server UI port
      spark.sql.queryExecutionListeners org.apache.spark.sql.hive.DagUsageListener  # Spark SQL query execution hook
      spark.yarn.historyServer.address asd:18081              # History Server address advertised to YARN
      spark.yarn.queue default                                # default YARN queue
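
    Both spark.eventLog.dir and spark.history.fs.logDirectory point at hdfs:///spark2-history/, which has to exist before jobs can write event logs or the History Server can read them. A minimal sketch for creating it (the spark:hadoop ownership and 1777 mode are assumptions; adjust to your cluster's policy):

      # create the shared event-log directory on HDFS (path taken from the settings above)
      hdfs dfs -mkdir -p /spark2-history
      # ownership and permissions below are assumptions, not from the original article
      hdfs dfs -chown spark:hadoop /spark2-history
      hdfs dfs -chmod 1777 /spark2-history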
    
  • spark-env.sh

      export SPARK_CONF_DIR=${SPARK_CONF_DIR:-/opt/spark/conf}
      export SPARK_LOG_DIR=/var/log/spark2
      export SPARK_PID_DIR=/var/run/spark2
      export SPARK_DAEMON_MEMORY=1024m
      SPARK_IDENT_STRING=$USER
      SPARK_NICENESS=0
      export HADOOP_HOME=${HADOOP_HOME:-}
      echo ${HADOOP_HOME}
      export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-/etc/hadoop/conf}  # fall back to this default if unset
      export JAVA_HOME=/usr/local/java
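
    SPARK_LOG_DIR and SPARK_PID_DIR above point at /var/log/spark2 and /var/run/spark2; they need to exist and be writable by the account that runs Spark. A minimal sketch (the spark user and group are assumptions):

      # create the local log and PID directories referenced in spark-env.sh
      mkdir -p /var/log/spark2 /var/run/spark2
      # the spark:spark ownership is an assumption; match the account used in step 3
      chown -R spark:spark /var/log/spark2 /var/run/spark2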
    

3. Start the History Server

   su spark -c "bash $SPARK_HOME/sbin/start-history-server.sh"
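
Once the script returns, the History Server should be listening on the port set by spark.history.ui.port (18081 above). A quick check, assuming curl is available and the service runs on the local host (the log file name pattern may differ on your setup):

   # expect HTTP 200 from the History Server web UI
   curl -s -o /dev/null -w "%{http_code}\n" http://localhost:18081/
   # startup errors land in SPARK_LOG_DIR from spark-env.sh
   tail -n 20 /var/log/spark2/*HistoryServer*.out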

4. Troubleshooting notes

4.1 Hostname resolution issue

Description

   self.socket.bind(self.server_address) socket.gaierror: [Errno -2] Name or service not known

Solution

Add the line 127.0.0.1 localhost to /etc/hosts.
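
The start script fails because the local hostname cannot be resolved; the fix above just restores the loopback mapping. A minimal sketch of applying and verifying it (run as root; assumes the entry is simply missing rather than malformed):

   # append the loopback entry if /etc/hosts does not already contain it
   grep -qw localhost /etc/hosts || echo "127.0.0.1 localhost" >> /etc/hosts
   # confirm the name now resolves
   getent hosts localhost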
