2. Installing Scala
1. Download the Scala release that matches your Spark version (Spark 1.2 pairs with Scala 2.10). Here we use scala-2.10.4.tgz.
2. Unpack and install Scala:
   1) Run #tar -zxvf scala-2.10.4.tgz -C /root/spark to extract it to /root/spark/scala-2.10.4.
   2) Add the following to ~/.bash_profile:
      export SCALA_HOME=/root/spark/scala-2.10.4
      export PATH=$JAVA_HOME/bin:$HADOOP_HOME/bin:$HIVE_HOME/bin:$SCALA_HOME/bin:$PATH
   3) Reload the environment: #source ~/.bash_profile
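Re-running the steps above can leave duplicate entries on PATH. A minimal sketch of an append-if-missing helper, modeled on the pathmunge() pattern that CentOS's /etc/profile uses (the function name add_to_path is my own, not part of any distribution):

```shell
# Add a directory to PATH only if it is not already present.
add_to_path() {
  case ":${PATH}:" in
    *:"$1":*) ;;              # already on PATH: do nothing
    *) PATH="$1:${PATH}" ;;   # otherwise prepend it
  esac
}

SCALA_HOME=/root/spark/scala-2.10.4
add_to_path "$SCALA_HOME/bin"
add_to_path "$SCALA_HOME/bin"   # second call is a no-op
export PATH
```

Calling the helper from ~/.bash_profile keeps PATH clean even when the file is sourced repeatedly.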
3. Verify the installation: typing the scala command should drop you into the Scala REPL.
   # scala
   Welcome to Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.6.0_45).
   Type in expressions to have them evaluated.
   Type :help for more information.
   scala>
3. Installing Spark
1. Download spark-1.2.0-bin-hadoop2.4.tgz from the Spark download page and extract it to /root/spark/spark-1.2.0-bin-hadoop2.4.
2. Add the following to ~/.bash_profile:
   export SPARK_HOME=/root/spark/spark-1.2.0-bin-hadoop2.4
   export PATH=$JAVA_HOME/bin:$HADOOP_HOME/bin:$SCALA_HOME/bin:$SPARK_HOME/bin:$HIVE_HOME/bin:$PATH
3. Reload the environment: #source ~/.bash_profile
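The unpack step can be scripted with a sanity check that the launcher landed where expected. A sketch, run here against a stand-in archive so it executes anywhere; substitute the real spark-1.2.0-bin-hadoop2.4.tgz and your install prefix:

```shell
WORK=$(mktemp -d)
cd "$WORK"

# Stand-in for the downloaded release (hypothetical contents, for illustration).
mkdir -p spark-1.2.0-bin-hadoop2.4/bin
printf '#!/bin/sh\n' > spark-1.2.0-bin-hadoop2.4/bin/spark-shell
chmod +x spark-1.2.0-bin-hadoop2.4/bin/spark-shell
tar -czf spark-1.2.0-bin-hadoop2.4.tgz spark-1.2.0-bin-hadoop2.4
rm -r spark-1.2.0-bin-hadoop2.4

# The actual install step: extract, then confirm the launcher exists.
mkdir -p install
tar -zxf spark-1.2.0-bin-hadoop2.4.tgz -C install
if [ -x install/spark-1.2.0-bin-hadoop2.4/bin/spark-shell ]; then
  echo "extraction OK"
fi
```

The -C flag tells tar where to unpack, which is how the /root/spark layout above is produced in one command.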
4. Configuring Spark
1. Change to the Spark configuration directory: #cd $SPARK_HOME/conf
2. Copy the template: #cp spark-env.sh.template spark-env.sh
3. Add the following to spark-env.sh:
   export JAVA_HOME=/usr/lib/jdk1.6.0_45
   export SCALA_HOME=/root/spark/scala-2.10.4
   export HADOOP_CONF_DIR=/root/hadoop/hadoop-2.6.0/etc/hadoop
5. Starting Spark
1. Change to the Spark install directory: #cd /root/spark/spark-1.2.0-bin-hadoop2.4
2. Run the #./sbin/start-all.sh command.
3. Run the #jps command; the listing should include Master and Worker processes:
   # jps
   38907 RunJar
   39030 RunJar
   54679 NameNode
   26587 Jps
   54774 DataNode
   9850 Worker
   9664 Master
   55214 NodeManager
   55118 ResourceManager
   54965 SecondaryNameNode
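Checking jps output by eye gets tedious on repeat runs. A sketch of a scripted check (the function check_daemons and the process list are my own, not a Spark tool); it reads the listing from stdin so you would pipe `jps` into it:

```shell
# Succeeds only if every named daemon appears in jps-style input.
# Usage: jps | check_daemons Master Worker
check_daemons() {
  listing=$(cat)           # capture stdin once so we can scan it repeatedly
  for proc in "$@"; do
    case "$listing" in
      *"$proc"*) echo "$proc: running" ;;
      *)         echo "$proc: MISSING"; return 1 ;;
    esac
  done
}

# Demo against a canned listing (taken from the jps output above).
printf '9850 Worker\n9664 Master\n54679 NameNode\n' | check_daemons Master Worker
```

On a live node you would replace the canned printf with the real `jps` pipeline; a nonzero exit status then flags a daemon that failed to start.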
4. Open the Spark web UI in a browser; the MasterWebUI address (http://blm:8080 in this setup) is printed in the master log.
5. Run the #./bin/spark-shell command to enter the Spark shell; while it is running you can also watch jobs in the SparkUI (started on port 4040, http://blm:4040 in this setup).
Last login: Sun Oct 8 05:35:42 2017 from 192.168.1.1
[hadoop@blm ~]$ java -version
java version "1.7.0_65"
Java(TM) SE Runtime Environment (build 1.7.0_65-b17)
Java HotSpot(TM) Client VM (build 24.65-b04, mixed mode)
[hadoop@blm ~]$ ifconfig
eth0  Link encap:Ethernet  HWaddr 00:0C:29:3C:BF:E3
      inet addr:192.168.1.103  Bcast:192.168.1.255  Mask:255.255.255.0
      UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1
lo    Link encap:Local Loopback
      inet addr:127.0.0.1  Mask:255.0.0.0
[hadoop@blm ~]$ uname -a
Linux blm 2.6.32-358.el6.i686 #1 SMP Thu Feb 21 21:50:49 UTC 2013 i686 i686 i386 GNU/Linux
[hadoop@blm ~]$ ll
total 449508
-rw-rw-r--.  1 hadoop hadoop  80288778 Oct  5 22:25 apache-hive-0.14.0-bin.tar.gz
drwxrwxr-x. 10 hadoop hadoop      4096 Oct  8 05:27 app
-rw-rw-r--.  1 hadoop hadoop 138656756 Oct  1 03:33 hadoop-2.4.1.tar.gz
drwxr-xr-x.  8 hadoop hadoop      4096 Jun 16  2014 jdk1.7.0_65
-rw-rw-r--.  1 hadoop hadoop  29937534 Oct 13 08:57 scala-2.10.4.tgz.gz
-r-xr--r--.  1 hadoop hadoop  35042811 Oct  4 22:39 zookeeper-3.4.10.tar.gz
(remaining home-directory entries omitted)
[hadoop@blm ~]$ tar -zxvf scala-2.10.4.tgz.gz -C app/
scala-2.10.4/
scala-2.10.4/man/man1/scaladoc.1
scala-2.10.4/src/scala-library-src.jar
scala-2.10.4/doc/README
scala-2.10.4/lib/scala-compiler.jar
scala-2.10.4/bin/scala
scala-2.10.4/bin/scalac
(full extraction listing omitted: man/, src/, doc/, examples/, misc/, lib/, bin/)
[hadoop@blm ~]$ cd app
[hadoop@blm app]$ ll
total 22732
drwxrwxr-x.  8 hadoop hadoop 4096 Oct  5 22:32 apache-hive-0.14.0-bin
drwxr-xr-x. 11 hadoop hadoop 4096 Oct  1 05:16 hadoop-2.4.1
drwxrwxr-x.  9 hadoop hadoop 4096 Mar 18  2014 scala-2.10.4
drwxr-xr-x. 11 root   root   4096 Oct  8 05:27 zookeeper-3.4.5
(remaining entries omitted)
[hadoop@blm app]$ cd /etc/profile
-bash: cd: /etc/profile: Not a directory
[hadoop@blm app]$ su root
Password:
su: incorrect password
[hadoop@blm app]$ su
Password:
[root@blm app]# vi /etc/profile
(stock CentOS /etc/profile contents omitted; the lines added at the end were:)
export JAVA_HOME=/home/hadoop/jdk1.7.0_65
export SCALA_HOME=/home/hadoop/app/scala-2.10.4
export HADOOP_HOME=/home/hadoop/app/hadoop-2.4.1
export PATH=$PATH:$JAVA_HOME/bin:$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$SCALA_HOME/bin
export HIVE_HOME=/home/hadoop/app/apache-hive-0.14.0-bin
"/etc/profile" 85L, 2078C written
[root@blm app]# scala
bash: scala: command not found
[root@blm app]# java
Usage: java [-options] class [args...]
(java usage help omitted)
[root@blm app]# su
[root@blm app]# vi /etc/profile
(same file, re-edited to add the Spark entries:)
export SPARK_HOME=/home/hadoop/app/spark-1.2.0-bin-hadoop2.4
export PATH=$PATH:$JAVA_HOME/bin:$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$SCALA_HOME/bin:$SPARK_HOME/bin
"/etc/profile" 86L, 2155C written
==============================================================================
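Note the "scala: command not found" in the transcript right after /etc/profile was edited: a running shell does not pick up profile changes until the file is re-sourced or a new login shell starts. A quick demonstration of the same effect with a hypothetical variable and a throwaway profile file:

```shell
# Writing to a profile file does not change the current shell's
# environment; sourcing the file does.
profile=$(mktemp)
echo 'DEMO_HOME=/opt/demo' > "$profile"

echo "before: ${DEMO_HOME:-unset}"   # still unset (unless already exported)
. "$profile"                          # `.` is the POSIX spelling of `source`
echo "after:  ${DEMO_HOME}"          # now set to /opt/demo
rm -f "$profile"
```

This is why the guide runs #source ~/.bash_profile after every edit; the root shell in the transcript skipped that step.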
[hadoop@blm spark-1.2.0-bin-hadoop2.4]$ cd logs
[hadoop@blm logs]$ ll
total 8
-rw-rw-r--. 1 hadoop hadoop 2014 Oct 13 09:40 spark-hadoop-org.apache.spark.deploy.master.Master-1-blm.out
-rw-rw-r--. 1 hadoop hadoop 2091 Oct 13 09:40 spark-hadoop-org.apache.spark.deploy.worker.Worker-1-blm.out
[hadoop@blm logs]$ tail -100f spark-hadoop-org.apache.spark.deploy.master.Master-1-blm.out
Spark assembly has been built with Hive, including Datanucleus jars on classpath
Spark Command: /home/hadoop/jdk1.7.0_65/bin/java -cp (classpath omitted) -XX:MaxPermSize=128m -Dspark.akka.logLifecycleEvents=true -Xms512m -Xmx512m org.apache.spark.deploy.master.Master --ip blm --port 7077 --webui-port 8080
========================================
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
17/10/13 09:40:11 INFO Master: Registered signal handlers for [TERM, HUP, INT]
17/10/13 09:40:17 INFO Utils: Successfully started service 'sparkMaster' on port 7077.
17/10/13 09:40:18 INFO Master: Starting Spark master at spark://blm:7077
17/10/13 09:40:28 INFO Utils: Successfully started service 'MasterUI' on port 8080.
17/10/13 09:40:28 INFO MasterWebUI: Started MasterWebUI at http://blm:8080
17/10/13 09:40:29 INFO Master: I have been elected leader! New state: ALIVE
17/10/13 09:40:32 INFO Master: Registering worker blm:38727 with 1 cores, 512.0 MB RAM
^C
[hadoop@blm logs]$ cat spark-hadoop-org.apache.spark.deploy.worker.Worker-1-blm.out
Spark assembly has been built with Hive, including Datanucleus jars on classpath
Spark Command: /home/hadoop/jdk1.7.0_65/bin/java -cp (classpath omitted) org.apache.spark.deploy.worker.Worker spark://blm:7077
========================================
17/10/13 09:40:19 INFO Utils: Successfully started service 'sparkWorker' on port 38727.
17/10/13 09:40:20 INFO Worker: Starting Spark worker blm:38727 with 1 cores, 512.0 MB RAM
17/10/13 09:40:20 INFO Worker: Spark home: /home/hadoop/app/spark-1.2.0-bin-hadoop2.4
17/10/13 09:40:30 INFO Utils: Successfully started service 'WorkerUI' on port 8081.
17/10/13 09:40:30 INFO WorkerWebUI: Started WorkerWebUI at http://blm:8081
17/10/13 09:40:30 INFO Worker: Connecting to master spark://blm:7077...
17/10/13 09:40:32 INFO Worker: Successfully registered with master spark://blm:7077

(lines added to conf/spark-env.sh:)
export JAVA_HOME=/home/hadoop/jdk1.7.0_65
export SCALA_HOME=/home/hadoop/app/scala-2.10.4
export HADOOP_CONF_DIR=/home/hadoop/app/hadoop-2.4.1
"spark-env.sh" 59L, 3361C written
[hadoop@blm conf]$ ll
total 28
-rw-rw-r--. 1 hadoop hadoop  303 Dec 10  2014 fairscheduler.xml.template
-rw-rw-r--. 1 hadoop hadoop  620 Dec 10  2014 log4j.properties.template
-rw-rw-r--. 1 hadoop hadoop 5308 Dec 10  2014 metrics.properties.template
-rw-rw-r--. 1 hadoop hadoop   80 Dec 10  2014 slaves.template
-rw-rw-r--. 1 hadoop hadoop  507 Dec 10  2014 spark-defaults.conf.template
-rwxrwxr-x. 1 hadoop hadoop 3361 Oct 13 09:36 spark-env.sh
[hadoop@blm conf]$ jps
4382 Jps
4027 Worker
3890 Master
[hadoop@blm conf]$ pwd
/home/hadoop/app/spark-1.2.0-bin-hadoop2.4/conf
[hadoop@blm conf]$ cd ..
[hadoop@blm spark-1.2.0-bin-hadoop2.4]$ ll
total 120
drwxrwxr-x. 2 hadoop hadoop  4096 Dec 10  2014 bin
drwxrwxr-x. 2 hadoop hadoop  4096 Oct 13 09:36 conf
drwxrwxr-x. 2 hadoop hadoop  4096 Dec 10  2014 lib
drwxrwxr-x. 2 hadoop hadoop  4096 Oct 13 09:40 logs
drwxrwxr-x. 2 hadoop hadoop  4096 Dec 10  2014 sbin
drwxrwxr-x. 2 hadoop hadoop  4096 Oct 13 09:40 work
(remaining entries omitted)
[hadoop@blm spark-1.2.0-bin-hadoop2.4]$ cd bin
[hadoop@blm bin]$ ll
(bin listing omitted: beeline, pyspark, run-example, spark-class, spark-shell, spark-sql, spark-submit, ...)
[hadoop@blm bin]$ spark-shell
-bash: spark-shell: command not found
[hadoop@blm bin]$ ./spark-shell
Spark assembly has been built with Hive, including Datanucleus jars on classpath
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
17/10/13 09:52:03 INFO HttpServer: Starting HTTP Server
17/10/13 09:52:03 INFO Utils: Successfully started service 'HTTP class server' on port 40534.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.2.0
      /_/

Using Scala version 2.10.4 (Java HotSpot(TM) Client VM, Java 1.7.0_65)
Type in expressions to have them evaluated.
Type :help for more information.
17/10/13 09:52:33 INFO Utils: Successfully started service 'sparkDriver' on port 43213.
17/10/13 09:52:34 INFO MemoryStore: MemoryStore started with capacity 267.3 MB
17/10/13 09:52:36 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/10/13 09:52:49 INFO Utils: Successfully started service 'SparkUI' on port 4040.
17/10/13 09:52:49 INFO SparkUI: Started SparkUI at http://blm:4040
17/10/13 09:52:51 INFO BlockManagerMaster: Registered BlockManager
17/10/13 09:52:52 INFO SparkILoop: Created spark context..
Spark context available as sc.