This article walks through how to enable LZO compression in Spark. The steps are laid out in detail, so readers who are interested can use them as a reference; hopefully it helps.

Enabling LZO compression in Spark takes the following steps:
1. Configure spark-env.sh
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/app/hadoop-2.6.0-cdh6.7.0/lib/native
export SPARK_LIBRARY_PATH=$SPARK_LIBRARY_PATH:/app/hadoop-2.6.0-cdh6.7.0/lib/native
export SPARK_CLASSPATH=$SPARK_CLASSPATH:/app/hadoop-2.6.0-cdh6.7.0/share/hadoop/yarn/*:/app/hadoop-2.6.0-cdh6.7.0/share/hadoop/yarn/lib/*:/app/hadoop-2.6.0-cdh6.7.0/share/hadoop/common/*:/app/hadoop-2.6.0-cdh6.7.0/share/hadoop/common/lib/*:/app/hadoop-2.6.0-cdh6.7.0/share/hadoop/hdfs/*:/app/hadoop-2.6.0-cdh6.7.0/share/hadoop/hdfs/lib/*:/app/hadoop-2.6.0-cdh6.7.0/share/hadoop/mapreduce/*:/app/hadoop-2.6.0-cdh6.7.0/share/hadoop/mapreduce/lib/*:/app/hadoop-2.6.0-cdh6.7.0/share/hadoop/tools/lib/*:/app/spark-2.2.0-bin-2.6.0-cdh6.7.0/jars/*
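After restarting with these settings, a quick sanity check from spark-shell can confirm the native library directory is actually visible to the driver JVM. This is only a minimal sketch: it assumes the path from the exports above and that the hadoop-lzo native library carries its usual name, libgplcompression.

// Run in spark-shell on the driver node.
// LD_LIBRARY_PATH should contain /app/hadoop-2.6.0-cdh6.7.0/lib/native after spark-env.sh is sourced.
println(sys.env.getOrElse("LD_LIBRARY_PATH", "(not set)"))

// The JVM's own native search path; on Linux it normally incorporates LD_LIBRARY_PATH.
println(System.getProperty("java.library.path"))

// hadoop-lzo ships its native code as libgplcompression; this throws UnsatisfiedLinkError if it cannot be found.
System.loadLibrary("gplcompression")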
2. LzopCodec class not found
   2.1 Error message:
Caused by: java.lang.IllegalArgumentException: Compression codec com.hadoop.compression.lzo.LzopCodec not found.
    at org.apache.hadoop.io.compress.CompressionCodecFactory.getCodecClasses(CompressionCodecFactory.java:135)
    at org.apache.hadoop.io.compress.CompressionCodecFactory.<init>(CompressionCodecFactory.java:175)
    at org.apache.hadoop.mapred.TextInputFormat.configure(TextInputFormat.java:45)
Caused by: java.lang.ClassNotFoundException: Class com.hadoop.compression.lzo.LzopCodec not found
    at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:1980)
    at org.apache.hadoop.io.compress.CompressionCodecFactory.getCodecClasses(CompressionCodecFactory.java:128)
   2.2 Fix: add the following to spark-defaults.conf in Spark's conf directory:
spark.driver.extraClassPath /app/hadoop-2.6.0-cdh6.7.0/share/hadoop/common/hadoop-lzo-0.4.19.jar
spark.executor.extraClassPath /app/hadoop-2.6.0-cdh6.7.0/share/hadoop/common/hadoop-lzo-0.4.19.jar
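With the hadoop-lzo jar on both the driver and executor classpaths (restart the application first), a short spark-shell session can verify the fix end to end. The sketch below uses placeholder HDFS paths; the codec class name is the one from the error message above, and the stack trace already shows it is registered in io.compression.codecs, so only the classpath was missing.

import com.hadoop.compression.lzo.LzopCodec

// If the ClassNotFoundException is fixed, this line no longer throws.
Class.forName("com.hadoop.compression.lzo.LzopCodec")

// Hypothetical input path -- any existing .lzo text file will do; TextInputFormat picks the codec by extension.
val lines = sc.textFile("hdfs:///tmp/test/sample.lzo")
println(lines.count())

// Write LZO-compressed output by passing the codec class to saveAsTextFile.
lines.saveAsTextFile("hdfs:///tmp/test/sample_out_lzo", classOf[LzopCodec])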
That covers how to enable LZO compression in Spark. Hopefully the content above is of some help and lets you pick up something new; if you found the article useful, feel free to share it so more people can see it.