MOUNT A SINGLE-NODE HADOOP CLUSTER USING FUSE
PROCEDURE
FUSE (Filesystem in Userspace) enables you to write a normal user-space application as a bridge for a traditional filesystem interface. The hadoop-hdfs-fuse package enables you to use your HDFS cluster as if it were a traditional filesystem on Linux. It is assumed that you have a working HDFS cluster and know the hostname and port that your NameNode exposes.
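If you are unsure of the NameNode URI, one way to check it (assuming the Hadoop client is on your PATH and the cluster uses the default configuration files) is to read the fs.defaultFS property (fs.default.name on older releases) from core-site.xml, for example:

hdpuser@jiju-PC:~$ hdfs getconf -confKey fs.defaultFS
hdfs://localhost:54310

The host and port shown here (localhost:54310) are the values used throughout this walkthrough; substitute your own.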
To install fuse-dfs on Ubuntu systems:
hdpuser@jiju-PC:~$ wget http://archive.cloudera.com/cdh5/one-click-install/trusty/amd64/cdh5-repository_1.0_all.deb
--2016-07-24 09:10:33--  http://archive.cloudera.com/cdh5/one-click-install/trusty/amd64/cdh5-repository_1.0_all.deb
Resolving archive.cloudera.com (archive.cloudera.com)... 151.101.8.167
Connecting to archive.cloudera.com (archive.cloudera.com)|151.101.8.167|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 3508 (3.4K) [application/x-debian-package]
Saving to: ‘cdh5-repository_1.0_all.deb’

100%[======================================>] 3,508      --.-K/s   in 0.09s
2016-07-24 09:10:34 (37.4 KB/s) - ‘cdh5-repository_1.0_all.deb’ saved [3508/3508]
hdpuser@jiju-PC:~$ sudo dpkg -i cdh5-repository_1.0_all.deb
Selecting previously unselected package cdh5-repository.
(Reading database ... 170607 files and directories currently installed.)
Preparing to unpack cdh5-repository_1.0_all.deb ...
Unpacking cdh5-repository (1.0) ...
Setting up cdh5-repository (1.0) ...
gpg: keyring `/etc/apt/secring.gpg' created
gpg: keyring `/etc/apt/trusted.gpg.d/cloudera-cdh5.gpg' created
gpg: key 02A818DD: public key "Cloudera Apt Repository" imported
gpg: Total number processed: 1
gpg:               imported: 1
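Optionally, before refreshing the package index, you can confirm that the one-click package dropped a Cloudera repository definition under /etc/apt/sources.list.d/; the exact file name may differ from the one shown here:

hdpuser@jiju-PC:~$ ls /etc/apt/sources.list.d/ | grep -i cloudera
cloudera-cdh5.list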
hdpuser@jiju-PC:~$ sudo apt-get update
hdpuser@jiju-PC:~$ sudo apt-get install hadoop-hdfs-fuse
Reading package lists... Done
Building dependency tree
Reading state information... Done
The following extra packages will be installed:
  avro-libs bigtop-jsvc bigtop-utils curl hadoop hadoop-0.20-mapreduce
  hadoop-client hadoop-hdfs hadoop-mapreduce hadoop-yarn libcurl3 libhdfs0
  parquet parquet-format zookeeper
The following NEW packages will be installed:
  avro-libs bigtop-jsvc bigtop-utils curl hadoop hadoop-0.20-mapreduce
  hadoop-client hadoop-hdfs hadoop-hdfs-fuse hadoop-mapreduce hadoop-yarn
  libhdfs0 parquet parquet-format zookeeper
The following packages will be upgraded:
  libcurl3
1 upgraded, 15 newly installed, 0 to remove and 702 not upgraded.
Need to get 222 MB of archives.
After this operation, 267 MB of additional disk space will be used.
Do you want to continue? [Y/n] Y
Get:1 http://in.archive.ubuntu.com/ubuntu/ trusty-updates/main libcurl3 amd64 7.35.0-1ubuntu2.7 [173 kB]
Get:2 https://archive.cloudera.com/cdh5/ubuntu/trusty/amd64/cdh/ trusty-cdh5/contrib avro-libs all 1.7.6+cdh5.8.0+112-1.cdh5.8.0.p0.74~trusty-cdh5.8.0 [47.0 MB]
Get:3 http://in.archive.ubuntu.com/ubuntu/ trusty-updates/main curl amd64 7.35.0-1ubuntu2.7 [123 kB]
Get:4 https://archive.cloudera.com/cdh5/ubuntu/trusty/amd64/cdh/ trusty-cdh5/contrib parquet-format all 2.1.0+cdh5.8.0+12-1.cdh5.8.0.p0.70~trusty-cdh5.8.0 [479 kB]
Get:5 https://archive.cloudera.com/cdh5/ubuntu/trusty/amd64/cdh/ trusty-cdh5/contrib parquet all 1.5.0+cdh5.8.0+174-1.cdh5.8.0.p0.71~trusty-cdh5.8.0 [27.1 MB]
Get:6 https://archive.cloudera.com/cdh5/ubuntu/trusty/amd64/cdh/ trusty-cdh5/contrib hadoop all 2.6.0+cdh5.8.0+1601-1.cdh5.8.0.p0.93~trusty-cdh5.8.0 [28.2 MB]
Get:7 https://archive.cloudera.com/cdh5/ubuntu/trusty/amd64/cdh/ trusty-cdh5/contrib libhdfs0 amd64 2.6.0+cdh5.8.0+1601-1.cdh5.8.0.p0.93~trusty-cdh5.8.0 [320 kB]
Get:8 https://archive.cloudera.com/cdh5/ubuntu/trusty/amd64/cdh/ trusty-cdh5/contrib hadoop-hdfs-fuse amd64 2.6.0+cdh5.8.0+1601-1.cdh5.8.0.p0.93~trusty-cdh5.8.0 [317 kB]
Fetched 222 MB in 3min 28s (1,064 kB/s)
(Reading database ... 170612 files and directories currently installed.)
Preparing to unpack .../libcurl3_7.35.0-1ubuntu2.7_amd64.deb ...
Unpacking libcurl3:amd64 (7.35.0-1ubuntu2.7) over (7.35.0-1ubuntu2) ...
Selecting previously unselected package curl.
Preparing to unpack .../curl_7.35.0-1ubuntu2.7_amd64.deb ...
Unpacking curl (7.35.0-1ubuntu2.7) ...
Selecting previously unselected package avro-libs.
Preparing to unpack .../avro-libs_1.7.6+cdh5.8.0+112-1.cdh5.8.0.p0.74~trusty-cdh5.8.0_all.deb ...
Unpacking avro-libs (1.7.6+cdh5.8.0+112-1.cdh5.8.0.p0.74~trusty-cdh5.8.0) ...
Selecting previously unselected package bigtop-utils.
Preparing to unpack .../bigtop-utils_0.7.0+cdh5.8.0+0-1.cdh5.8.0.p0.72~trusty-cdh5.8.0_all.deb ...
Unpacking bigtop-utils (0.7.0+cdh5.8.0+0-1.cdh5.8.0.p0.72~trusty-cdh5.8.0) ...
Selecting previously unselected package bigtop-jsvc.
Preparing to unpack .../bigtop-jsvc_0.6.0+cdh5.8.0+847-1.cdh5.8.0.p0.74~trusty-cdh5.8.0_amd64.deb ...
Unpacking bigtop-jsvc (0.6.0+cdh5.8.0+847-1.cdh5.8.0.p0.74~trusty-cdh5.8.0) ...
Selecting previously unselected package zookeeper.
Preparing to unpack .../zookeeper_3.4.5+cdh5.8.0+94-1.cdh5.8.0.p0.76~trusty-cdh5.8.0_all.deb ...
Unpacking zookeeper (3.4.5+cdh5.8.0+94-1.cdh5.8.0.p0.76~trusty-cdh5.8.0) ...
Selecting previously unselected package parquet-format.
Preparing to unpack .../parquet-format_2.1.0+cdh5.8.0+12-1.cdh5.8.0.p0.70~trusty-cdh5.8.0_all.deb ...
Unpacking parquet-format (2.1.0+cdh5.8.0+12-1.cdh5.8.0.p0.70~trusty-cdh5.8.0) ...
Selecting previously unselected package hadoop-yarn.
Preparing to unpack .../hadoop-yarn_2.6.0+cdh5.8.0+1601-1.cdh5.8.0.p0.93~trusty-cdh5.8.0_all.deb ...
Unpacking hadoop-yarn (2.6.0+cdh5.8.0+1601-1.cdh5.8.0.p0.93~trusty-cdh5.8.0) ...
Selecting previously unselected package hadoop-mapreduce.
Preparing to unpack .../hadoop-mapreduce_2.6.0+cdh5.8.0+1601-1.cdh5.8.0.p0.93~trusty-cdh5.8.0_all.deb ...
Unpacking hadoop-mapreduce (2.6.0+cdh5.8.0+1601-1.cdh5.8.0.p0.93~trusty-cdh5.8.0) ...
Selecting previously unselected package hadoop-hdfs.
Preparing to unpack .../hadoop-hdfs_2.6.0+cdh5.8.0+1601-1.cdh5.8.0.p0.93~trusty-cdh5.8.0_all.deb ...
Unpacking hadoop-hdfs (2.6.0+cdh5.8.0+1601-1.cdh5.8.0.p0.93~trusty-cdh5.8.0) ...
Selecting previously unselected package hadoop-0.20-mapreduce.
Preparing to unpack .../hadoop-0.20-mapreduce_2.6.0+cdh5.8.0+1601-1.cdh5.8.0.p0.93~trusty-cdh5.8.0_amd64.deb ...
Unpacking hadoop-0.20-mapreduce (2.6.0+cdh5.8.0+1601-1.cdh5.8.0.p0.93~trusty-cdh5.8.0) ...
Selecting previously unselected package hadoop-client.
Preparing to unpack .../hadoop-client_2.6.0+cdh5.8.0+1601-1.cdh5.8.0.p0.93~trusty-cdh5.8.0_all.deb ...
Unpacking hadoop-client (2.6.0+cdh5.8.0+1601-1.cdh5.8.0.p0.93~trusty-cdh5.8.0) ...
Selecting previously unselected package parquet.
Preparing to unpack .../parquet_1.5.0+cdh5.8.0+174-1.cdh5.8.0.p0.71~trusty-cdh5.8.0_all.deb ...
Unpacking parquet (1.5.0+cdh5.8.0+174-1.cdh5.8.0.p0.71~trusty-cdh5.8.0) ...
Selecting previously unselected package hadoop.
Preparing to unpack .../hadoop_2.6.0+cdh5.8.0+1601-1.cdh5.8.0.p0.93~trusty-cdh5.8.0_all.deb ...
Unpacking hadoop (2.6.0+cdh5.8.0+1601-1.cdh5.8.0.p0.93~trusty-cdh5.8.0) ...
Selecting previously unselected package libhdfs0.
Preparing to unpack .../libhdfs0_2.6.0+cdh5.8.0+1601-1.cdh5.8.0.p0.93~trusty-cdh5.8.0_amd64.deb ...
Unpacking libhdfs0 (2.6.0+cdh5.8.0+1601-1.cdh5.8.0.p0.93~trusty-cdh5.8.0) ...
Selecting previously unselected package hadoop-hdfs-fuse.
Preparing to unpack .../hadoop-hdfs-fuse_2.6.0+cdh5.8.0+1601-1.cdh5.8.0.p0.93~trusty-cdh5.8.0_amd64.deb ...
Unpacking hadoop-hdfs-fuse (2.6.0+cdh5.8.0+1601-1.cdh5.8.0.p0.93~trusty-cdh5.8.0) ...
Processing triggers for man-db (2.6.7.1-1) ...
Setting up libcurl3:amd64 (7.35.0-1ubuntu2.7) ...
Setting up curl (7.35.0-1ubuntu2.7) ...
Setting up avro-libs (1.7.6+cdh5.8.0+112-1.cdh5.8.0.p0.74~trusty-cdh5.8.0) ...
Setting up bigtop-utils (0.7.0+cdh5.8.0+0-1.cdh5.8.0.p0.72~trusty-cdh5.8.0) ...
Setting up bigtop-jsvc (0.6.0+cdh5.8.0+847-1.cdh5.8.0.p0.74~trusty-cdh5.8.0) ...
Setting up zookeeper (3.4.5+cdh5.8.0+94-1.cdh5.8.0.p0.76~trusty-cdh5.8.0) ...
update-alternatives: using /etc/zookeeper/conf.dist to provide /etc/zookeeper/conf (zookeeper-conf) in auto mode
Setting up parquet-format (2.1.0+cdh5.8.0+12-1.cdh5.8.0.p0.70~trusty-cdh5.8.0) ...
Setting up parquet (1.5.0+cdh5.8.0+174-1.cdh5.8.0.p0.71~trusty-cdh5.8.0) ...
Setting up hadoop (2.6.0+cdh5.8.0+1601-1.cdh5.8.0.p0.93~trusty-cdh5.8.0) ...
update-alternatives: using /etc/hadoop/conf.empty to provide /etc/hadoop/conf (hadoop-conf) in auto mode
Setting up hadoop-yarn (2.6.0+cdh5.8.0+1601-1.cdh5.8.0.p0.93~trusty-cdh5.8.0) ...
Setting up libhdfs0 (2.6.0+cdh5.8.0+1601-1.cdh5.8.0.p0.93~trusty-cdh5.8.0) ...
Setting up hadoop-mapreduce (2.6.0+cdh5.8.0+1601-1.cdh5.8.0.p0.93~trusty-cdh5.8.0) ...
Setting up hadoop-hdfs (2.6.0+cdh5.8.0+1601-1.cdh5.8.0.p0.93~trusty-cdh5.8.0) ...
Setting up hadoop-0.20-mapreduce (2.6.0+cdh5.8.0+1601-1.cdh5.8.0.p0.93~trusty-cdh5.8.0) ...
Setting up hadoop-client (2.6.0+cdh5.8.0+1601-1.cdh5.8.0.p0.93~trusty-cdh5.8.0) ...
Setting up hadoop-hdfs-fuse (2.6.0+cdh5.8.0+1601-1.cdh5.8.0.p0.93~trusty-cdh5.8.0) ...
Processing triggers for libc-bin (2.19-0ubuntu6) ...
hdpuser@jiju-PC:~$ sudo mkdir -p /home/hdpuser/hdfs
[sudo] password for hdpuser:
hdpuser@jiju-PC:~$ sudo hadoop-fuse-dfs dfs://localhost:54310 /home/hdpuser/hdfs/
INFO /data/jenkins/workspace/generic-package-ubuntu64-14-04/CDH5.8.0-Packaging-Hadoop-2016-07-12_15-43-10/hadoop-2.6.0+cdh5.8.0+1601-1.cdh5.8.0.p0.93~trusty/hadoop-hdfs-project/hadoop-hdfs/src/main/native/fuse-dfs/fuse_options.c:164 Adding FUSE arg /home/hdpuser/hdfs/
hdpuser@jiju-PC:~$ ls /home/hdpuser/hdfs/
hdpuser@jiju-PC:~$ mkdir /home/hdpuser/hdfs/new
hdpuser@jiju-PC:~$ ls /home/hdpuser/hdfs/
new
hdpuser@jiju-PC:~$ mkdir /home/hdpuser/hdfs/example
hdpuser@jiju-PC:~$ ls -l /home/hdpuser/hdfs/
total 8
drwxr-xr-x 2 hdpuser 99 4096 Jul 24 15:28 example
drwxr-xr-x 2 hdpuser 99 4096 Jul 24 15:19 new
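As a quick check that the mount really is backed by HDFS, you can copy an ordinary local file into one of the new directories and list the same path with the regular HDFS client; the file chosen below is arbitrary, and both views should show the same contents:

hdpuser@jiju-PC:~$ cp /etc/hosts /home/hdpuser/hdfs/example/
hdpuser@jiju-PC:~$ hdfs dfs -ls /example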
To unmount the filesystem
The filesystem can be unmounted using the umount command.
hdpuser@jiju-PC:~$ sudo umount /home/hdpuser/hdfs
NOTE:
You can now add a permanent HDFS mount which persists through reboots.
To add a system mount:
- Open /etc/fstab (sudo vi /etc/fstab) and add a line to the bottom similar to this:
hadoop-fuse-dfs#dfs://<name_node_hostname>:<namenode_port> <mount_point> fuse allow_other,usetrash,rw 2 0
For example:
hadoop-fuse-dfs#dfs://localhost:54310 /home/hdpuser/hdfs fuse allow_other,usetrash,rw 2 0
- Test to make sure everything is working properly:
$ mount <mount_point>
hdpuser@jiju-PC:~$ sudo mount /home/hdpuser/hdfs
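To verify that the fstab-based mount is active, you can inspect the mount table and the reported capacity of the mount point; the commands below simply filter on the mount point used in this walkthrough:

hdpuser@jiju-PC:~$ mount | grep /home/hdpuser/hdfs
hdpuser@jiju-PC:~$ df -h /home/hdpuser/hdfs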