How to SSH into a remote computer without a password


While working with Linux in a networked environment, you often need to log in to several different computers on the network. Normally this requires entering the password of each remote computer, which is time consuming and tedious when you manage many machines. By setting up SSH keys with the following steps, you can log in to a remote computer without being asked for the password every time.

1.  Create public and private keys

fredrickishengoma@InfinityLabs$ ssh-keygen  
Generating public/private rsa key pair. 
Enter file in which to save the key (/home/fredrickishengoma/.ssh/id_rsa): [Press enter key] 
Enter passphrase (empty for no passphrase): [Press enter key] 
Enter same passphrase again: [Press enter key] 
Your identification has been saved in /home/fredrickishengoma/.ssh/id_rsa.
Your public key has been saved in /home/fredrickishengoma/.ssh/id_rsa.pub. 
The key fingerprint is:
33:b3:fe:af:95:95:18:11:31:d5:de:96:2f:f2:35:f9 
fredrickishengoma@infinityLabs
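
The interactive prompts above can be collapsed into a single command. A minimal sketch, assuming you want an RSA key with an empty passphrase at the default path (ssh-keygen will ask before overwriting an existing key):

```shell
# Generate an RSA key pair non-interactively: -q quiet, -N "" empty
# passphrase, -f output file. Equivalent to pressing Enter at each prompt.
ssh-keygen -q -t rsa -N "" -f ~/.ssh/id_rsa
```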


2.  Copy the keys to remote Host
fredrickishengoma@InfinityLabs$ ssh-copy-id -i ~/.ssh/id_rsa.pub remote-host
fredrickishengoma@remote-host's password:
Now try logging into the machine, with "ssh 'remote-host'", and check in:

.ssh/authorized_keys

to make sure we haven't added extra keys that you weren't expecting 
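
If ssh-copy-id is not available on your system, you can do the same thing by hand. This is a sketch of what ssh-copy-id does on the remote side: append your public key to the remote ~/.ssh/authorized_keys with strict permissions ("remote-host" is the same placeholder as above):

```shell
# Manual equivalent of ssh-copy-id (a sketch): send the public key over
# ssh, append it to authorized_keys, and tighten permissions so sshd
# will accept the file.
cat ~/.ssh/id_rsa.pub | ssh remote-host \
  'mkdir -p ~/.ssh && chmod 700 ~/.ssh && cat >> ~/.ssh/authorized_keys && chmod 600 ~/.ssh/authorized_keys'
```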

3.  Login automatically to remote host
fredrickishengoma@InfinityLabs$ ssh remote-host
Last login: Sun Jun 17 12:20:11 2012 from 192.168.12.24
[Note it didn't ask for password]

Hadoop:: Incompatible namespaceIDs Error


While working with Hadoop / the Hadoop Distributed File System (HDFS), you might come across an "Incompatible namespaceIDs" error, and your NameNode or DataNodes won't start because of it.
This problem is tracked in JIRA as https://issues.apache.org/jira/browse/HDFS-107

ERROR org.apache.hadoop.hdfs.server.datanode.DataNode:
  java.io.IOException: Incompatible namespaceIDs in /hadoop/hdfs/datadir:
  namenode namespaceID = 515704843; datanode namespaceID = 572408927



Problem: The DataNode's namespaceID does not match the NameNode's namespaceID.
Solution: The quick solution is to reformat HDFS. Note that you will most probably lose the data stored in HDFS.
Procedure:
bin/stop-all.sh
rm -rf /tmp/hadoop-your-user-name/*
bin/hadoop namenode -format
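
If losing the data is not acceptable, an alternative that is often suggested for this error is to edit the DataNode's VERSION file so its namespaceID matches the NameNode's. A sketch, where DATADIR and the ID are examples taken from the error message above; substitute your own dfs.data.dir and the "namenode namespaceID" value you see:

```shell
# Make the DataNode's namespaceID match the NameNode's instead of
# reformatting, so existing HDFS data is preserved. DATADIR and NN_ID
# below are example values from the error message above.
bin/stop-all.sh
DATADIR=/hadoop/hdfs/datadir    # your dfs.data.dir
NN_ID=515704843                 # the "namenode namespaceID" from the error
sed -i "s/^namespaceID=.*/namespaceID=${NN_ID}/" "$DATADIR/current/VERSION"
bin/start-all.sh
```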

System boots very slowly when it reaches the sendmail service

If you have worked with Linux, you might have come across this situation: during booting, the system takes "forever" when it starts the sendmail service. The most common cause is that Linux cannot look up the name of the machine (if you configured networking with a hostname). The machine pauses waiting for the DNS lookups to time out, and will eventually bring up the login prompt.

Quick Fix:

I just had to add "127.0.0.1 localhost.localdomain localhost" to /etc/hosts, and that solved the problem. If you already have that line in /etc/hosts, then make sure it is the first line in that file.
$ vi /etc/hosts
127.0.0.1                        localhost.localdomain           localhost
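
A quick way to confirm the fix took is a one-line sketch that checks the loopback entry really is the first line of /etc/hosts:

```shell
# Succeeds (exit 0) only if the first line of /etc/hosts starts with
# 127.0.0.1, which is what the hostname lookup should hit first.
head -n 1 /etc/hosts | grep -q '^127\.0\.0\.1'
```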