Series catalog:

Hadoop Combat (1): Building a pseudo-distributed Hadoop 2.x environment on Aliyun

Hadoop Deployment (2): Deploying Hadoop in fully distributed mode on VMs

Hadoop Deployment (3): Building CDH in fully distributed mode on virtual machines

Hadoop Deployment (4): Hadoop cluster management and resource allocation

Hadoop Deployment (5): Hadoop operation and maintenance experience

Hadoop Deployment (6): Setting up an Eclipse development environment for Apache Hadoop

Perform the following steps to configure Hue:

  • Install the required RPM packages
  • Edit the Hadoop configuration files
  • Install Hue
  • Start Hue and verify

Install the required RPM packages

yum install ant
yum install asciidoc
yum install cyrus-sasl-devel
yum install cyrus-sasl-gssapi
yum install gcc
yum install gcc-c++
yum install krb5-devel

# for unit tests only
yum install libtidy 

yum install libxml2-devel
yum install libxslt-devel
yum install make
# yum install mvn (from maven package or maven3 tarball)
yum install mysql
yum install mysql-devel
yum install openldap-devel
yum install python-devel
yum install sqlite-devel

# for version 7+
yum install openssl-devel

yum install gmp-devel
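
If you prefer, the same packages can be installed in one pass (a minimal sketch; assumes a CentOS/RHEL 6 host with the required repositories enabled, and that Maven is installed separately as noted above):

yum install -y ant asciidoc cyrus-sasl-devel cyrus-sasl-gssapi gcc gcc-c++ \
    krb5-devel libtidy libxml2-devel libxslt-devel make mysql mysql-devel \
    openldap-devel python-devel sqlite-devel openssl-devel gmp-devel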

Edit the Hadoop configuration files

File            Parameter                     Value   Description
hdfs-site.xml   dfs.webhdfs.enabled           true    Enable WebHDFS
core-site.xml   hadoop.proxyuser.root.hosts   *       Hosts the root proxy user may connect from; * means any host
core-site.xml   hadoop.proxyuser.root.groups  *       Groups the root proxy user may impersonate; * means any group

hdfs-site.xml

<property>
    <name>dfs.webhdfs.enabled</name>
    <value>true</value>
</property>

core-site.xml

<property>
    <name>hadoop.proxyuser.root.hosts</name>
    <value>*</value>
</property>
<property>
    <name>hadoop.proxyuser.root.groups</name>
    <value>*</value>
</property>
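
After editing these files, restart HDFS so the settings take effect, then check that WebHDFS responds (a quick sanity check; assumes the NameNode web UI is on the default port 50070, as used later in hue.ini):

stop-dfs.sh && start-dfs.sh
curl -i "http://hadoop25pseudo:50070/webhdfs/v1/?op=LISTSTATUS"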

Install Hue

Unpack hue-3.7.0-cdh5.4.2.tar.gz

mkdir -p /home/tools
tar -zxvf hue-3.7.0-cdh5.4.2.tar.gz -C ~/training/

Compile and install. Pay attention to the system time (see the note after the command):

cd ~/training/hue-3.7.0-cdh5.4.2
PREFIX=/root/training/ make install
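
If make complains about files with modification times in the future, sync the clock first (a sketch; assumes the ntpdate package and outbound network access are available):

yum install -y ntpdate
ntpdate pool.ntp.org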

Problems encountered and fixes

/usr/include/gnu/stubs.h:9:27: error: gnu/stubs-64.h: No such file or directory
error: command 'gcc' failed with exit status 1

[root@hadoop25pseudo hue-3.7.0-cdh5.4.2]# rpm -qa | grep glibc
glibc-headers-2.12-1.132.el6.x86_64
glibc-static-2.12-1.132.el6.i686
glibc-common-2.12-1.132.el6.x86_64
glibc-2.12-1.132.el6.x86_64
glib2-2.26.1-3.el6.x86_64
dbus-glib-0.86-6.el6.x86_64
glibc-2.12-1.132.el6.i686
glibc-devel-2.12-1.132.el6.i686

The 64-bit glibc-devel package is missing, so install it:

yum install -y glibc-devel
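
After the install, confirm that the x86_64 variant is present alongside the i686 one (a quick check):

rpm -qa | grep glibc-devel
# both the i686 and x86_64 glibc-devel packages should now be listed
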
/bin/bash: rsync: command not found

yum install -y rsync

Add a hue user

adduser hue
chown -R hue.hue /root/training/hue/

Modify the hue.ini parameter file ($HUE_HOME/desktop/conf/hue.ini)

Parameter                  Value
http_host                  hadoop25pseudo
http_port                  8888
server_user                root
server_group               root
default_user               root
default_hdfs_superuser     root
fs_defaultfs               hdfs://hadoop25pseudo:9000
webhdfs_url                http://hadoop25pseudo:50070/webhdfs/v1
hadoop_conf_dir            /root/training/hadoop-2.4.1/etc/hadoop
resourcemanager_host       hadoop25pseudo
resourcemanager_api_url    http://hadoop25pseudo:8088
proxy_api_url              http://hadoop25pseudo:8088
history_server_api_url     http://hadoop25pseudo:19888

find . -name hue.ini
# ./desktop/conf/hue.ini
cd desktop/conf/
vi hue.ini

http_host=hadoop25pseudo
http_port=8888
server_user=root
server_group=root
default_user=root
default_hdfs_superuser=root
fs_defaultfs=hdfs://hadoop25pseudo:9000
webhdfs_url=http://hadoop25pseudo:50070/webhdfs/v1
hadoop_conf_dir=/root/training/hadoop-2.4.1/etc/hadoop
resourcemanager_host=hadoop25pseudo
resourcemanager_api_url=http://hadoop25pseudo:8088
proxy_api_url=http://hadoop25pseudo:8088
history_server_api_url=http://hadoop25pseudo:19888
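
Note that in Hue 3.7's hue.ini these keys are spread across sections (roughly: the http_* and default_* settings under [desktop], the HDFS keys under [hadoop] > [[hdfs_clusters]], and the YARN keys under [hadoop] > [[yarn_clusters]]), so edit the existing entries in place rather than appending new lines. A quick spot check after editing:

grep -n "hadoop25pseudo" desktop/conf/hue.ini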

Start Hue and verify

Start the Hadoop components: start-all.sh
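
Note that start-all.sh starts HDFS and YARN but not the MapReduce JobHistory server, which hue.ini expects on port 19888. Start it separately and confirm the daemons with jps (a sketch, assuming the Hadoop 2.x sbin scripts are on the PATH):

mr-jobhistory-daemon.sh start historyserver
jps
# expect NameNode, DataNode, SecondaryNameNode, ResourceManager, NodeManager, JobHistoryServer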

Start Hue:

cd ~/training/hue/build/env
bin/supervisor

Verify by opening the home page: http://hadoop25pseudo:8888/
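
On first access Hue typically prompts you to create an admin account. If the page does not load, check from the shell that Hue is listening (a minimal check):

curl -I http://hadoop25pseudo:8888/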


The WeChat official account "Data Analysis" shares notes on the self-cultivation of a data scientist. Now that we have met, let's grow together.

When reposting, please credit the WeChat official account "Data Analysis".


Reader Telegram group:

https://t.me/sspadluo