Purchase cost
About $50 including shipping:
Raspberry Pi Type B Single Board Computer 512MB US$35.00
Raspberry Pi Type B Case – Clear US$6.47
Running Total US$41.47
Standard Delivery (Despatch expected within 2 week(s)) US$8.02
Order total US$49.49
apt-get install r-base
apt-get install gdebi-core
No package for RStudio Server!
Downloaded the source from GitHub:
https://github.com/rstudio/rstudio
The build looked like it would need cmake, so installed it:
apt-get install cmake
Actually, it turned out there is a script that resolves the dependencies and installs what is needed:
root@raspberrypi:~/RStudio/rstudio-master/dependencies/linux# ./install-dependencies-debian --exclude-qt-sdk
common.copy /opt/rstudio-tools/boost/boost_1_50_0/lib/libboost_wave.a
...updated 10671 targets...
--2014-02-08 20:07:41--  https://s3.amazonaws.com/rstudio-buildtools/pandoc-1.12.3.zip
Resolving s3.amazonaws.com (s3.amazonaws.com)... 207.171.187.117
Connecting to s3.amazonaws.com (s3.amazonaws.com)|207.171.187.117|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 97002741 (93M) [application/zip]
Saving to: `pandoc-1.12.3.zip'
100%[==========================================================================================>] 97,002,741  2.31M/s  in 56s
2014-02-08 20:08:44 (1.66 MB/s) - `pandoc-1.12.3.zip' saved [97002741/97002741]
cp: cannot stat `pandoc-1.12.3/linux/debian/armv6l/pandoc*': No such file or directory
root@raspberrypi:~/RStudio/rstudio-master/dependencies/linux#
It failed there, so uninstall everything for now:
apt-get remove gdebi-core
apt-get remove r-base
# rm -Rf RStudio/
Added configuration.
ntpdate was not installed, so installed it with apt-get:
root@raspberrypi:~# ntpdate sarah.tk.net
 8 Feb 13:19:44 ntpdate[2654]: step time server 192.168.0.2 offset 13698894.377963 sec
With the environment built here, try MapReduce for a start.
hduser@raspberrypi /usr/local/hadoop $ hadoop fs -put conf input
hduser@raspberrypi ~ $ hadoop fs -ls /user/hduser/
Found 1 items
drwxr-xr-x   - hduser supergroup          0 2013-08-04 11:44 /user/hduser/input
hduser@raspberrypi ~ $
It's a bit of a pun, but since it's a Pi, let's try Pi (π).
MapReduce spends quite a bit of time on setup before the job actually runs; keeping that in mind, let's see how much power the Raspberry Pi has.
hduser@raspberrypi /usr/local/hadoop $ hadoop jar hadoop-examples-1.1.2.jar pi
Usage: org.apache.hadoop.examples.PiEstimator <nMaps> <nSamples>
Maps=1 Samples=1           123.960 seconds  Estimated value of Pi is 4.00000000000000000000
Maps=1 Samples=10          115.538 seconds  Estimated value of Pi is 3.60000000000000000000
Maps=1 Samples=100         114.493 seconds  Estimated value of Pi is 3.20000000000000000000
Maps=1 Samples=1,000       113.498 seconds  Estimated value of Pi is 3.14800000000000000000
Maps=1 Samples=100,000     113.844 seconds  Estimated value of Pi is 3.14120000000000000000
Maps=1 Samples=10,000,000  129.025 seconds  Estimated value of Pi is 3.14158440000000000000
Maps=1 Samples=50,000,000  217.511 seconds  Estimated value of Pi is 3.14159448000000000000
Maps=1 Samples=100,000,000 303.028 seconds  Estimated value of Pi is 3.14159256000000000000
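The estimator works by sampling points in the unit square and taking 4 × (points inside the quarter circle) / (total samples), which is why tiny sample counts give values like 4.0 and 3.6. A local sketch of the same idea in awk (hypothetical, no Hadoop involved; plain pseudo-random sampling rather than the sequence Hadoop's example uses), just to see the estimate tighten as samples grow:

```shell
# Monte Carlo estimate of pi: fraction of random points in the unit square
# that land inside the quarter circle, times 4. Fixed seed for repeatability.
for n in 10 1000 100000; do
  awk -v n="$n" 'BEGIN {
    srand(42); hits = 0
    for (i = 0; i < n; i++) {
      x = rand(); y = rand()
      if (x * x + y * y <= 1) hits++
    }
    printf "samples=%-7d pi ~ %.6f\n", n, 4 * hits / n
  }'
done
```

Each run prints one estimate per sample size; by 100,000 samples it sits close to 3.14, matching the convergence pattern in the table above.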
Try running Hadoop on the Raspberry Pi. There is only one Raspberry Pi, so run it in pseudo-distributed mode.
In raspi-config:
Expand Filesystem
Internationalisation Options -> Change Locale : ja_JP.UTF-8
Internationalisation Options -> Change Timezone : Tokyo
Internationalisation Options -> Change Keyboard Layout : Generic 105-key (Intel) PC
    # so that Control+Alt+Backspace can terminate the X server
Advanced Options -> adjust the memory split between CPU and GPU, etc.
The menu can be called up again later with # raspi-config
http://www.oracle.com/technetwork/java/javase/downloads/jdk7-downloads-1880260.html
https://wiki.openjdk.java.net/display/OpenJFX/OpenJFX+on+the+Raspberry+Pi
root@raspberrypi:~# ls
hadoop-1.1.2.tar.gz  jdk-7u21-linux-arm-sfp.tar.gz  run_vncserver
root@raspberrypi:~# tar zxf jdk-7u21-linux-arm-sfp.tar.gz -C /opt
root@raspberrypi:~# update-alternatives --install "/usr/bin/java" "java" "/opt/jdk1.7.0_21/bin/java" 1
root@raspberrypi:~# java -version
java version "1.7.0_03"
OpenJDK Runtime Environment (IcedTea7 2.1.7) (7u3-2.1.7-1)
OpenJDK Zero VM (build 22.0-b10, mixed mode)
root@raspberrypi:~#
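The remaining JDK tools (jar, jarsigner, javac, javadoc, javah, javap) can be registered with update-alternatives the same way as java. A loop that prints the needed commands (a dry-run sketch; it assumes the JDK was unpacked to /opt/jdk1.7.0_21 as above — inspect the output, then pipe it to sh as root to apply):

```shell
# Print one update-alternatives call per JDK tool (dry run; nothing is changed)
JDK_HOME=/opt/jdk1.7.0_21
for tool in java jar jarsigner javac javadoc javah javap; do
  echo update-alternatives --install "/usr/bin/$tool" "$tool" "$JDK_HOME/bin/$tool" 1
done
```

After applying, `update-alternatives --config java` switches the active java interactively.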
root@raspberrypi:~# addgroup hadoop
root@raspberrypi:~# adduser --ingroup hadoop hduser
root@raspberrypi:~# adduser hduser
root@raspberrypi:~# su - hduser
hduser@raspberrypi ~ $ ssh-keygen -t rsa -P ""
Generating public/private rsa key pair.
Enter file in which to save the key (/home/hduser/.ssh/id_rsa):
Created directory '/home/hduser/.ssh'.
Your identification has been saved in /home/hduser/.ssh/id_rsa.
Your public key has been saved in /home/hduser/.ssh/id_rsa.pub.
The key fingerprint is:
95:5e:cb:32:fc:ef:db:1a:05:ff:4b:b8:47:0c:cb:19 hduser@raspberrypi
The key's randomart image is:
+--[ RSA 2048]----+
(snip)
hduser@raspberrypi ~ $ cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
hduser@raspberrypi ~ $ ssh localhost
The authenticity of host 'localhost (127.0.0.1)' can't be established.
ECDSA key fingerprint is 2e:f6:2f:7d:3f:88:30:17:1e:59:39:a4:bc:40:d0:be.
Are you sure you want to continue connecting (yes/no)? yes
Warning: Permanently added 'localhost' (ECDSA) to the list of known hosts.
Linux raspberrypi 3.6.11+ #474 PREEMPT Thu Jun 13 17:14:42 BST 2013 armv6l
(snip)
# For Hadoop Environment
export JAVA_HOME=/opt/jdk1.7.0_21
export HADOOP_INSTALL=/usr/local/hadoop
export PATH=$JAVA_HOME/bin:$HADOOP_INSTALL/bin:$PATH
root@raspberrypi:~# tar zxf hadoop-1.1.2.tar.gz -C /usr/local
root@raspberrypi:~# cd /usr/local/
root@raspberrypi:/usr/local# mv hadoop-1.1.2 hadoop
root@raspberrypi:/usr/local# chown -R hduser:hadoop hadoop
root@raspberrypi:~# su - hduser
hduser@raspberrypi ~ $ hadoop version
Hadoop 1.1.2
Subversion https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.1 -r 1440782
Compiled by hortonfo on Thu Jan 31 02:03:24 UTC 2013
From source with checksum c720ddcf4b926991de7467d253a79b8b
hduser@raspberrypi ~ $
# cd /usr/local/hadoop/conf
core-site.xml
<configuration>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/usr/local/fs/hadoop/tmp</value>
  </property>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:54310</value>
  </property>
</configuration>
hdfs-site.xml
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>
mapred-site.xml
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:54311</value>
  </property>
</configuration>
Additional environment setup
root@raspberrypi:/usr/local# mkdir -p /usr/local/fs/hadoop/tmp
root@raspberrypi:/usr/local# chown hduser:hadoop /usr/local/fs/hadoop/tmp
root@raspberrypi:/usr/local# chmod 750 /usr/local/fs/hadoop/tmp
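chmod 750 gives hduser full access, the hadoop group read/execute, and everyone else nothing. The mode bits can be sanity-checked with GNU stat; a sketch shown on a scratch directory so it runs anywhere (point the same stat at /usr/local/fs/hadoop/tmp on the Pi):

```shell
# Create a scratch dir, apply the same mode, and print owner, group, and mode
d=$(mktemp -d)
chmod 750 "$d"
stat -c '%U:%G %a %n' "$d"   # the mode column should read 750
rm -rf "$d"
```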
Format the DFS
hduser@raspberrypi ~ $ hadoop namenode -format
13/08/04 10:41:42 INFO namenode.NameNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG:   host = raspberrypi/127.0.1.1
STARTUP_MSG:   args = [-format]
STARTUP_MSG:   version = 1.1.2
STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.1 -r 1440782; compiled by 'hortonfo' on Thu Jan 31 02:03:24 UTC 2013
************************************************************/
13/08/04 10:41:45 INFO util.GSet: VM type       = 32-bit
13/08/04 10:41:45 INFO util.GSet: 2% max memory = 19.335 MB
13/08/04 10:41:45 INFO util.GSet: capacity      = 2^22 = 4194304 entries
13/08/04 10:41:45 INFO util.GSet: recommended=4194304, actual=4194304
13/08/04 10:41:48 INFO namenode.FSNamesystem: fsOwner=hduser
13/08/04 10:41:49 INFO namenode.FSNamesystem: supergroup=supergroup
13/08/04 10:41:49 INFO namenode.FSNamesystem: isPermissionEnabled=true
13/08/04 10:41:49 INFO namenode.FSNamesystem: dfs.block.invalidate.limit=100
13/08/04 10:41:49 INFO namenode.FSNamesystem: isAccessTokenEnabled=false accessKeyUpdateInterval=0 min(s), accessTokenLifetime=0 min(s)
13/08/04 10:41:49 INFO namenode.NameNode: Caching file names occuring more than 10 times
13/08/04 10:41:51 INFO common.Storage: Image file of size 112 saved in 0 seconds.
13/08/04 10:41:52 INFO namenode.FSEditLog: closing edit log: position=4, editlog=/usr/local/fs/hadoop/tmp/dfs/name/current/edits
13/08/04 10:41:52 INFO namenode.FSEditLog: close success: truncate to 4, editlog=/usr/local/fs/hadoop/tmp/dfs/name/current/edits
13/08/04 10:41:53 INFO common.Storage: Storage directory /usr/local/fs/hadoop/tmp/dfs/name has been successfully formatted.
13/08/04 10:41:53 INFO namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at raspberrypi/127.0.1.1
************************************************************/
hduser@raspberrypi /usr/local/hadoop/bin $ ./start-all.sh
starting namenode, logging to /usr/local/hadoop/libexec/../logs/hadoop-hduser-namenode-raspberrypi.out
localhost: starting datanode, logging to /usr/local/hadoop/libexec/../logs/hadoop-hduser-datanode-raspberrypi.out
localhost: starting secondarynamenode, logging to /usr/local/hadoop/libexec/../logs/hadoop-hduser-secondarynamenode-raspberrypi.out
starting jobtracker, logging to /usr/local/hadoop/libexec/../logs/hadoop-hduser-jobtracker-raspberrypi.out
localhost: starting tasktracker, logging to /usr/local/hadoop/libexec/../logs/hadoop-hduser-tasktracker-raspberrypi.out
hduser@raspberrypi /usr/local/hadoop/bin $ jps
5660 JobTracker
5586 SecondaryNameNode
5866 Jps
5372 NameNode
5770 TaskTracker
hduser@raspberrypi /usr/local/hadoop/bin $
*Note: start-all brought up the NameNode and JobTracker, but the DataNode and TaskTracker died with "Error: JAVA_HOME is not set.", so JAVA_HOME was added to /usr/local/hadoop/conf/hadoop-env.sh:
export JAVA_HOME=/opt/jdk1.7.0_21
*Note: the SecondaryNameNode is not needed here, so comment out the following lines:
start-dfs.sh: "$bin"/hadoop-daemons.sh --config $HADOOP_CONF_DIR --hosts masters start secondarynamenode
stop-dfs.sh:  "$bin"/hadoop-daemons.sh --config $HADOOP_CONF_DIR --hosts masters stop secondarynamenode
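The comment-out can be scripted with sed; a sketch demonstrated on a scratch file so it is safe to run anywhere. Applying the same `sed -i '/secondarynamenode/s/^/#/'` to start-dfs.sh and stop-dfs.sh in /usr/local/hadoop/bin does it for real (assumes no other line in those scripts mentions secondarynamenode):

```shell
# Prefix '#' to any line mentioning secondarynamenode, shown on a scratch copy
f=$(mktemp)
printf '%s\n' '"$bin"/hadoop-daemons.sh --config $HADOOP_CONF_DIR --hosts masters start secondarynamenode' > "$f"
sed -i '/secondarynamenode/s/^/#/' "$f"
cat "$f"   # the line now starts with '#'
rm -f "$f"
```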
*Note: when using Oracle JDK 7 (ARM Soft Float ABI build), the DataNode must be started in client mode.
Without this fix the daemon starts, but the DataNode does not actually work, so jobs fail.
Edit the following in the bin/hadoop file (remove -server):
#HADOOP_OPTS="$HADOOP_OPTS -server $HADOOP_DATANODE_OPTS"
HADOOP_OPTS="$HADOOP_OPTS $HADOOP_DATANODE_OPTS"
Oracle JDK 8 apparently does not (yet?) support the server VM on this platform, and 7 probably does not either. In server mode, accessing port 50070 showed "Configured Capacity", "DFS Used", and so on all at 0, confirming the DataNode side was not really running; after the fix it behaved as follows.
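The -server removal can also be applied with sed, restricting the substitution to the line that mentions HADOOP_DATANODE_OPTS. A sketch demonstrated on a scratch file (assumes the stock Hadoop 1.1.2 wording of that line); run the same sed against /usr/local/hadoop/bin/hadoop after backing it up:

```shell
# On the line containing HADOOP_DATANODE_OPTS, replace " -server " with " "
f=$(mktemp)
printf '%s\n' 'HADOOP_OPTS="$HADOOP_OPTS -server $HADOOP_DATANODE_OPTS"' > "$f"
sed -i '/HADOOP_DATANODE_OPTS/s/ -server / /' "$f"
cat "$f"   # prints: HADOOP_OPTS="$HADOOP_OPTS $HADOOP_DATANODE_OPTS"
rm -f "$f"
```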
hduser@raspberrypi /usr/local/hadoop/bin $ ./stop-all.sh
stopping jobtracker
localhost: stopping tasktracker
stopping namenode
localhost: stopping datanode
hduser@raspberrypi /usr/local/hadoop/bin $ jps
3035 Jps
hduser@raspberrypi /usr/local/hadoop/bin $
I learned that you can put together a remarkably small, and cheap, computer, and wanted to try it.
You need a Raspberry Pi, a board whose CPU has the memory built into the chip, and it sells for around 3,000 yen (with the yen a bit weak right now, maybe slightly more). Cheap as it is, I still have to raise the funds: I am now up to three HP ProLiant ML110 machines, so I will sell an old one first.
So, to start, a plan for gathering the parts.
In the end, about the only thing I actually need to buy is an SD card.