Q: Hadoop 2.2.0 64-bit installed but fails to start


I am trying to install a Hadoop 2.2.0 cluster on my servers. All of the servers are 64-bit. I downloaded Hadoop 2.2.0 and have set up all the configuration files. When I run ./start-dfs.sh, I get the following error:

13/11/15 14:29:26 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /home/hchen/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'.namenode]
sed: -e expression #1, char 6: unknown option to `s' have: ssh: Could not resolve hostname have: Name or service not known
HotSpot(TM): ssh: Could not resolve hostname HotSpot(TM): Name or service not known
-c: Unknown cipher type 'cd'
Java: ssh: Could not resolve hostname Java: Name or service not known
The authenticity of host 'namenode (192.168.1.62)' can't be established.
RSA key fingerprint is 65:f9:aa:7c:8f:fc:74:e4:c7:a2:f5:7f:d2:cd:55:d4.
Are you sure you want to continue connecting (yes/no)? VM: ssh: Could not resolve        hostname VM: Name or service not known
You: ssh: Could not resolve hostname You: Name or service not known
warning:: ssh: Could not resolve hostname warning:: Name or service not known
library: ssh: Could not resolve hostname library: Name or service not known
have: ssh: Could not resolve hostname have: Name or service not known
64-Bit: ssh: Could not resolve hostname 64-Bit: Name or service not known
...

Besides the 64-bit issue, are there other errors? I have already set up passwordless login between the namenode and datanodes, so what do the other errors mean?


19 · 2017-11-15 21:53




For searchability: this problem also applies to Hadoop 2.4.0 and Hadoop 2.4.1. - Greg Dubicki


Answers:


Add the following entries to .bashrc, where HADOOP_HOME is your Hadoop folder:

export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"

In addition, execute the following commands:

ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
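To confirm the underlying cause, you can inspect the architecture of the bundled native library directly; a minimal check, using the library path from the error message above:

file /home/hchen/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0   # "ELF 32-bit" in the output confirms the 32-/64-bit mismatch
source ~/.bashrc                                              # reload so the new variables take effect in this shell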

22 · 2017-11-27 12:38





The root cause is that the default native library shipped with Hadoop is built for 32-bit. Solutions:

1) Set some environment variables in .bash_profile. See https://gist.github.com/ruo91/7154697, or

2) Rebuild your Hadoop native library, as sketched below; refer to http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/NativeLibraries.html
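For option 2, a rough sketch of the rebuild, assuming the prerequisites listed in the Hadoop source tree's BUILDING.txt (JDK, Maven, protobuf 2.5.0, cmake, zlib/openssl headers) are already installed:

tar -xzf hadoop-2.2.0-src.tar.gz
cd hadoop-2.2.0-src
mvn package -Pdist,native -DskipTests -Dtar
# the rebuilt 64-bit libraries land in hadoop-dist/target/hadoop-2.2.0/lib/native;
# copy them over the 32-bit ones in $HADOOP_HOME/lib/native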


9 · 2017-12-04 09:42



Welcome to SO. Please provide more details from the links above; that will help solve the problem in case they become unavailable. - Nogard


You can also export the variables in hadoop-env.sh:

vim /usr/local/hadoop/etc/hadoop/hadoop-env.sh

(/usr/local/hadoop is my Hadoop install folder.)

#Hadoop variables
export JAVA_HOME=/usr/lib/jvm/java-1.7.0-openjdk-amd64 # your jdk install path
export HADOOP_INSTALL=/usr/local/hadoop
export PATH=$PATH:$HADOOP_INSTALL/bin
export PATH=$PATH:$HADOOP_INSTALL/sbin
export HADOOP_MAPRED_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_HOME=$HADOOP_INSTALL
export HADOOP_HDFS_HOME=$HADOOP_INSTALL
export YARN_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_INSTALL/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_INSTALL/lib"
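hadoop-env.sh is sourced by the daemon start scripts, so the settings take effect on the next restart; one quick way to verify, assuming the paths above:

/usr/local/hadoop/sbin/stop-dfs.sh
/usr/local/hadoop/sbin/start-dfs.sh
jps    # NameNode, DataNode and SecondaryNameNode should now be listed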

4 · 2018-06-05 07:27



Change export HADOOP_OPTS="-Djava.library.path=$HADOOP_INSTALL/lib" to export HADOOP_OPTS="-Djava.library.path=$HADOOP_INSTALL/lib/native" - Hafiz Shehbaz Ali
I have checked this. - Hafiz Shehbaz Ali


I think the only problem here is the same as in this question, so the solution is the same too:


Stop the JVM from printing the stack guard warning to stdout/stderr, because that is what breaks the HDFS starting scripts.


Do it by replacing in your etc/hadoop/hadoop-env.sh the line:

export HADOOP_OPTS="$HADOOP_OPTS -Djava.net.preferIPv4Stack=true"

with:

export HADOOP_OPTS="$HADOOP_OPTS -XX:-PrintWarnings -Djava.net.preferIPv4Stack=true"


This solution was found on Sumit Chawla's blog.


2 · 2017-09-20 11:38



The stack guard issue mostly occurs on x64 architectures. - Hafiz Shehbaz Ali
Any idea about this error? WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable - Hafiz Shehbaz Ali


The problem is not the native library; note that it is only a warning. Export the Hadoop variables mentioned above and that will work.


0 · 2017-12-24 10:37





You have three problems:

  1. "Unable to load native-hadoop library", just as @Nogard said. His answer solves this problem.
  2. "The authenticity of host 'namenode (192.168.1.62)' can't be established." is because you have no ssh authentication. Do this (a quick check follows this list):

    ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
    cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
    chmod 600 ~/.ssh/authorized_keys
    scp ~/.ssh/authorized_keys your_install_user@192.168.1.62:/home/your_install_user/.ssh/

  3. The "sed: -e expression #1, char 6: unknown option to `s'" and "ssh: Could not resolve hostname HotSpot(TM): Name or service not known" errors:

    Try this: edit your .bash_profile or .bashrc and put this in it:

    export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
    export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"
    

    Run source .bash_profile or source .bashrc to make the changes take effect immediately.
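To check that problem 2 is actually fixed before re-running start-dfs.sh, a quick non-interactive test against the namenode from the question:

ssh -o BatchMode=yes -o StrictHostKeyChecking=no namenode echo OK   # should print OK with no password or host prompt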


0 · 2018-05-22 06:15





I had a similar problem and could not solve it even after trying all of the suggestions above.

Finally I understood that the configured hostname and the IP address did not match.

My hostname is vagrant, configured in /etc/hostname, but I found that no IP address was assigned to vagrant in /etc/hosts; there I only found an IP address for localhost.

Once I updated /etc/hosts with entries for both localhost and vagrant, all of the above problems were resolved.
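For illustration, the resulting /etc/hosts would look something like this (the private address is hypothetical; use the machine's real IP):

127.0.0.1      localhost
192.168.33.10  vagrant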


0 · 2018-05-30 06:28





Make sure your HADOOP_HOME and HADOOP_PREFIX are set properly. I had this problem. Also, passwordless ssh needs to be set up correctly.
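A minimal sketch of those two settings, assuming the /usr/local/hadoop install path used elsewhere in this thread:

export HADOOP_HOME=/usr/local/hadoop
export HADOOP_PREFIX=$HADOOP_HOME   # Hadoop 2.x start scripts read HADOOP_PREFIX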


-1 · 2018-01-13 02:26