Discussion: about replication
Irfan Sayed
2013-08-05 06:53:20 UTC
hi,

I have set up a two-node Apache Hadoop cluster on a Windows environment:
one node is the NameNode and the other is a DataNode.

Everything is working fine.
One thing I need to know is how replication starts:
if I create a.txt on the NameNode, how will it appear on the DataNodes?

please suggest

regards
Mohammad Tariq
2013-08-05 09:33:36 UTC
Hello Irfan,

You can find all the answers in the HDFS architecture guide
<http://hadoop.apache.org/docs/stable/hdfs_design.html>.
See the section Data Organization
<http://hadoop.apache.org/docs/stable/hdfs_design.html#Data+Organization>
in particular for this question.

Warm Regards,
Tariq
cloudfront.blogspot.com
Irfan Sayed
2013-08-05 11:16:05 UTC
thanks Mohammad

I ran the below command on the NameNode:

$ ./hadoop dfs -mkdir /wksp

and the "wksp" dir got created in c:\ (as I have a Windows environment).

Now, when I log in to one of the DataNodes, I am not able to see c:\wksp.

Any issue?
please suggest

regards
Mohammad Tariq
2013-08-05 11:48:07 UTC
You cannot physically see HDFS files and directories through the local FS.
Either use the HDFS shell or the HDFS web UI (namenode_machine:50070).
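
For example, to confirm the directory exists inside HDFS rather than on
the local disk, you can list it from the HDFS shell (a minimal sketch;
/wksp is the path used in this thread):

$ ./hadoop dfs -ls /
$ ./hadoop dfs -ls /wksp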

Warm Regards,
Tariq
cloudfront.blogspot.com
Irfan Sayed
2013-08-05 11:59:12 UTC
thanks.
please refer below:

***@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
$ ./hadoop dfs -ls /wksp
Found 1 items
drwxr-xr-x - Administrator Domain 0 2013-08-05 16:58 /wksp/New folder

***@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
$


If I run the same command on the DataNode, it says:

***@DFS-1 /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
$ ./hadoop dfs -ls /wksp
ls: Cannot access /wksp: No such file or directory.

Does it mean that replication has not started yet...?

please suggest

regards
manish dunani
2013-08-05 12:30:17 UTC
You cannot physically access the DataNode's storage; you have to understand
that it happens logically, and it really does happen.

Type the "jps" command to check whether your DataNode has started or not.

When a user stores a file in HDFS, the file is divided into a number of
blocks, and the blocks are written to the DataNodes.
Each block is 64 MB or 128 MB in size; the default is 64 MB.

Your replication factor is 1, as you already set it in your hdfs-site.xml
(the HDFS default is 3). If you want to change the replication factor from
1 to 3 for a particular file or directory in HDFS, use the commands below
appropriately.

First, load a file into HDFS using this command:

bin/hadoop dfs -copyFromLocal /your/local/path/to/file
/your/hdfs/directory/path

*commands:*

To set replication of an individual file to 4:

./bin/hadoop dfs -setrep -w 4 /path/to/file

You can also do this recursively. To change the replication of the entire HDFS to 1:

./bin/hadoop dfs -setrep -R -w 1 /
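
To change the default replication for all new files cluster-wide, you can
instead set dfs.replication in hdfs-site.xml (a minimal sketch; the value
3 is illustrative):

<property>
<name>dfs.replication</name>
<value>3</value>
</property>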
If the replication factor is 3, each block of the file is replicated to 3
different DataNodes in your production (multi-node) cluster.

Here, replication means the same data is stored three times across the
DataNodes, to handle hardware failure.
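
To verify how a file is actually stored and replicated, fsck can help (a
sketch; the flags are standard in Hadoop 1.x, and /wksp is the directory
from this thread):

bin/hadoop fsck /wksp -files -blocks -locations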
--
MANISH DUNANI
-THANX
+91 9426881954,+91 8460656443
manishd207-***@public.gmane.org
Irfan Sayed
2013-08-06 04:59:36 UTC
thanks.
I verified that the DataNode is up and running.

I ran the below command:

***@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
$ ./hadoop dfs -copyFromLocal
C:\\Users\\Administrator\\Desktop\\hadoop-1.1.2.tar /wksp
copyFromLocal: File C:/Users/Administrator/Desktop/hadoop-1.1.2.tar does
not exist.

It says the file does not exist. As my cluster is Windows-based, I don't
know how the directory path needs to be specified.
I am using Cygwin for Linux-style paths.

I tried the below as well:

***@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
$ ./hadoop dfs -copyFromLocal
/cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
copyFromLocal: File
/cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.

***@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
$ ./hadoop dfs -copyFromLocal
/cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
copyFromLocal: File
/cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.

please suggest

regards
manish dunani
2013-08-06 06:10:37 UTC
hello,

I think you are a newbie. You need to start learning from scratch.
Please do not mind, but whatever you imagine Hadoop should be like, your
concept is wrong. Your effort, however, is positive.

My suggestion is to try learning on Ubuntu; Windows is not a good fit for
Hadoop.

Follow this link to set up a single-node cluster on Ubuntu.

http://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-single-node-cluster/

If you get stuck anywhere and don't know what to do next, just get back to me.

Also read this:

https://docs.google.com/viewer?a=v&pid=sites&srcid=ZGVmYXVsdGRvbWFpbnxtYW5pc2hkdW5hbml8Z3g6NGRiM2JhODhhOGEzMTcyYw


*Thanks & Regards*
*Manish Dunani*
Irfan Sayed
2013-08-06 06:19:39 UTC
thanks. Yes, I am a newbie.
However, I need a Windows setup.

Let me surely refer to the doc and the link you sent, but I need this to be
working...
Can you please help?

regards
Irfan Sayed
2013-08-06 06:50:31 UTC
hi,
I have gone through the doc at a high level and it seems very promising;
really nice.
But can you please help me with this issue?

I need to do a PoC first and get the demo out.
regards
manish dunani
2013-08-06 07:10:47 UTC
*You are going wrong here:*

***@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
$ ./hadoop dfs -copyFromLocal
/cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
copyFromLocal: File
/cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.

***@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
$ ./hadoop dfs -copyFromLocal
/cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
copyFromLocal: File
/cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.

Because you wrote both paths as local paths. Also, you do not need to copy
Hadoop itself into HDFS... Hadoop is already working.

Just check in a browser, after starting your single-node cluster:

localhost:50070

then follow the "Browse the filesystem" link in it.

If there is no directory there, then make a directory.
That is your HDFS directory.
Then copy any text file there (no need to copy Hadoop there), because you
are going to process the data in that text file. That is what Hadoop is
used for; first you need to make that clear in your mind. Then, and only
then, will you get it working... otherwise it is not possible.

*Try this: *

***@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
$ ./bin/hadoop dfs -copyFromLocal /full/local/path/to/your/file
/hdfs/directory/path
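
If the local path has to be given under Cygwin, converting it to a Windows
path may also help, since the Hadoop scripts hand local paths to Java
rather than to the Cygwin shell (a sketch; cygpath is a standard Cygwin
utility, and whether your Hadoop 1.x setup accepts the converted path is an
assumption):

$ ./bin/hadoop dfs -copyFromLocal "$(cygpath -w /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar)" /wksp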
Irfan Sayed
2013-08-06 11:39:43 UTC
thanks.
When I browse the file system, I am getting the following; I haven't seen
any "make directory" option there.

Do I need to create it from the command line?
Further, in the hdfs-site.xml file, I have given the following entries.
Are they correct?

<property>
<name>dfs.data.dir</name>
<value>c:\\wksp</value>
</property>
<property>
<name>dfs.name.dir</name>
<value>c:\\wksp</value>
</property>

please suggest


[image: Inline image 1]
Mohammad Tariq
2013-08-06 11:49:13 UTC
Hello Irfan,

Sorry for being unresponsive. Got stuck with some important work.

The HDFS web UI doesn't provide the ability to create files or directories.
You can browse HDFS, view files, download files etc. But operations like
create, move, and copy are not supported.
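
Those operations go through the HDFS shell instead, for example (a minimal
sketch; the paths are illustrative):

$ ./hadoop dfs -mkdir /wksp/input
$ ./hadoop dfs -cp /wksp/a.txt /wksp/input/a.txt
$ ./hadoop dfs -mv /wksp/a.txt /wksp/b.txt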

These values look fine to me.

One suggestion though: try getting a Linux machine (if possible), or at
least use a VM. I personally feel that using Hadoop on Windows is always
messy.

Warm Regards,
Tariq
cloudfront.blogspot.com
Irfan Sayed
2013-08-06 11:56:09 UTC
thanks.
However, I need this to be working on a Windows environment as a project
requirement.
I will add/work on Linux later.

So, at this stage, is c:\\wksp the HDFS file system, OR do I need to create
it from the command line?

please suggest

regards,
Mohammad Tariq
2013-08-06 12:00:51 UTC
Create 2 directories manually, corresponding to the values of the
dfs.name.dir and dfs.data.dir properties, and change the permissions of
these directories to 755. When you start pushing data into HDFS, the data
will go inside the directory specified by dfs.data.dir, and the associated
metadata will go inside dfs.name.dir. Remember: you store data in HDFS, but
it eventually gets stored on your local/native FS; you just cannot see this
data directly on your local/native FS.
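
Under Cygwin that could look like the following (a sketch; the directory
names are illustrative and must match the values in your hdfs-site.xml):

$ mkdir -p /cygdrive/c/wksp_name /cygdrive/c/wksp_data
$ chmod 755 /cygdrive/c/wksp_name /cygdrive/c/wksp_data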

Warm Regards,
Tariq
cloudfront.blogspot.com
Irfan Sayed
2013-08-06 17:05:17 UTC
Do these dirs need to be created on all DataNodes and NameNodes?
Further, does hdfs-site.xml need to be updated on both the DataNodes and
the NameNodes for these new dirs?

regards
Irfan Sayed
2013-08-06 17:29:28 UTC
I have created the dirs "wksp_data" and "wksp_name" on both the DataNode
and the NameNode,
made the respective changes in the "hdfs-site.xml" file,
formatted the NameNode,
and started the DFS.
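For reference, the corresponding hdfs-site.xml entries would look like this
(a sketch assuming the directory names above, on the C: drive):

<property>
<name>dfs.name.dir</name>
<value>c:\\wksp_name</value>
</property>
<property>
<name>dfs.data.dir</name>
<value>c:\\wksp_data</value>
</property>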

But still, I am not able to browse the file system through the web browser;
please refer below.

Is anything still missing?
please suggest

[image: Inline image 1]
Mohammad Tariq
2013-08-06 20:22:24 UTC
OK, we'll start fresh. Could you please show me your latest config files?

BTW, are your daemons running fine? Use jps to verify that.
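If a daemon is missing from the jps output, its log usually says why. Under
Cygwin, something like this would show the tail of the DataNode log (a
sketch; the logs directory and the hadoop-<user>-datanode-<host>.log naming
are Hadoop 1.x defaults and may differ on your setup):

$ tail -n 50 /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/logs/hadoop-*-datanode-*.log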

Warm Regards,
Tariq
cloudfront.blogspot.com
Irfan Sayed
2013-08-07 04:11:55 UTC
thanks.

If I run the jps command on the NameNode:

***@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
$ ./jps.exe
3164 NameNode
1892 Jps

The same command on the DataNode:

***@DFS-1 /cygdrive/c/Java/jdk1.7.0_25/bin
$ ./jps.exe
3848 Jps

jps does not list any process for the DataNode; however, in the web browser
I can see one live DataNode.
Please find attached the conf rar file of the NameNode.

regards
Irfan Sayed
2013-08-07 04:19:23 UTC
The attachment got quarantined.
Resending in txt format; please rename it to conf.rar.

regards
Irfan Sayed
2013-08-08 00:26:26 UTC
please suggest

regards
Irfan Sayed
2013-08-13 07:26:22 UTC
hey Tariq,
I am still stuck.
Can you please suggest?

regards
irfan
Irfan Sayed
2013-08-19 04:15:03 UTC
please suggest

regards
irfan
manish dunani
2013-08-19 13:48:11 UTC
Permalink
First of all, read the concepts. I hope you will like it:

https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf
--
Regards

*Manish Dunani*
*Contact No* : +91 9408329137
*skype id* : manish.dunani
Mohammad Tariq
2013-08-20 07:52:55 UTC
Permalink
I'm sorry for being unresponsive. I was out of touch for some time because of
Ramzan and Eid. Resuming work today.

What's the current status?

Warm Regards,
Tariq
cloudfront.blogspot.com
Irfan Sayed
2013-08-20 09:26:17 UTC
Permalink
thanks tariq for the response.
as discussed last time, i have sent you all the config files in my setup.
can you please go through them ?

please let me know

regards
irfan
Mohammad Tariq
2013-08-20 09:53:22 UTC
Permalink
You are welcome. Which link have you followed for the configuration? Your
*core-site.xml* is empty. Remove the property *fs.default.name* from
*hdfs-site.xml* and add it to *core-site.xml*. Remove *mapred.job.tracker*
as well. It is required in *mapred-site.xml*.
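A minimal sketch of that split, assuming the conventional Hadoop 1.x
pseudo-distributed values (hdfs://localhost:9000 and localhost:9001 come from
the standard setup guides, not from your files; each block goes inside the
<configuration> element of its file):

core-site.xml:
<property>
  <name>fs.default.name</name>
  <value>hdfs://localhost:9000</value>
</property>

mapred-site.xml:
<property>
  <name>mapred.job.tracker</name>
  <value>localhost:9001</value>
</property>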

I would suggest you do a pseudo-distributed setup first in order to get
yourself familiar with the process and then proceed to the distributed
mode. You can visit this link
<http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I>
if you need some help. Let me know if you face any issue.
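Once the files are split that way, the usual bring-up for a fresh
pseudo-distributed node is roughly this (the format step wipes any existing
HDFS metadata, so run it only on a new or disposable namenode):

$ ./bin/hadoop namenode -format
$ ./bin/start-all.sh
$ /cygdrive/c/Java/jdk1.7.0_25/bin/jps.exe

start-all.sh starts the HDFS daemons and the JobTracker/TaskTracker together,
so both the 50070 and 50030 web UIs should come up.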

HTH

Warm Regards,
Tariq
cloudfront.blogspot.com
Irfan Sayed
2013-08-20 10:44:15 UTC
Permalink
thanks. i followed this url :
http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
let me follow the url which you gave for pseudo distributed setup and then
will switch to distributed mode

regards
irfan
Irfan Sayed
2013-08-21 11:41:57 UTC
Permalink
i followed the url and did the steps mentioned in it. i have deployed on
the windows platform

Now, i am able to browse url : http://localhost:50070 (name node )
however, not able to browse url : http://localhost:50030

please refer below

[image: Inline image 1]

i have modified all the config files as mentioned and formatted the hdfs
file system as well
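(A note on the symptom: in Hadoop 1.x, 50070 is the NameNode web UI and 50030
is the JobTracker web UI, so 50030 staying down usually means the MapReduce
daemons were never started. A plausible check, assuming start-dfs.sh alone
was used so far:)

$ ./bin/start-mapred.sh
$ /cygdrive/c/Java/jdk1.7.0_25/bin/jps.exe

If JobTracker now shows up in jps, http://localhost:50030 should load.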
please suggest

regards
Mohammad Tariq
2013-08-21 11:58:36 UTC
Permalink
Your DN is still not running. Showing me the logs would be helpful.
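If it helps, in a 1.x install the datanode log normally sits under the logs
directory of the Hadoop home, named hadoop-<user>-datanode-<host>.log.
Assuming the install path used earlier in this thread, something like:

$ cd /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
$ tail -n 50 logs/hadoop-*-datanode-*.log

The last lines usually say why the datanode died or cannot reach the namenode.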

Warm Regards,
Tariq
cloudfront.blogspot.com
Irfan Sayed
2013-08-22 04:22:22 UTC
Permalink
datanode is trying to connect to namenode continuously but fails

when i try to run "jps" command it says :
$ ./jps.exe
4584 NameNode
4016 Jps

and when i ran the "./start-dfs.sh" then it says :

$ ./start-dfs.sh
namenode running as process 3544. Stop it first.
DFS-1: datanode running as process 4076. Stop it first.
localhost: secondarynamenode running as process 4792. Stop it first.

these two outputs contradict each other
please find the attached logs

should i attach the conf files as well ?

regards
Arpit Agarwal
2013-08-22 04:29:12 UTC
Permalink
Most likely there is a stale pid file. Something like
\tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
the datanode.
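A concrete version of that, with the file name guessed from the usual
hadoop-<user>-<daemon>.pid pattern (check HADOOP_PID_DIR in hadoop-env.sh for
the real location):

$ ./bin/hadoop-daemon.sh stop datanode
$ rm /tmp/hadoop-Administrator-datanode.pid    # guessed name; adjust to what is actually there
$ ./bin/hadoop-daemon.sh start datanode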

I haven't read the entire thread so you may have looked at this already.

-Arpit
Irfan Sayed
2013-08-22 05:05:20 UTC
Permalink
thanks
here is what i did:
i stopped all the namenodes and datanodes using the ./stop-dfs.sh command
then deleted all pid files for namenodes and datanodes

started dfs again with the command: "./start-dfs.sh"

when i ran the "Jps" command, it shows:

***@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
$ ./jps.exe
4536 Jps
2076 NameNode

however, when i open the pid file for the namenode, it is showing the pid
as 4560; on the contrary, it should show 2076

please suggest

regards
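A note on the pid mismatch above: in the stock Hadoop 1.x scripts the pid
files are written to the directory set by HADOOP_PID_DIR in
conf/hadoop-env.sh (/tmp by default), typically named
hadoop-<user>-<daemon>.pid. A minimal sketch of pinning the location
explicitly, assuming the stock hadoop-env.sh, in case Cygwin resolves /tmp
somewhere unexpected:

# conf/hadoop-env.sh
# pid files land here as hadoop-<user>-<daemon>.pid; /tmp is the default
export HADOOP_PID_DIR=/tmp/hadoop-pids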
Arpit Agarwal
2013-08-22 05:56:07 UTC
Permalink
I just noticed you are on Cygwin. IIRC Windows PIDs are not the same as
Cygwin PIDs so that may be causing the discrepancy. I don't know how well
Hadoop works in Cygwin as I have never tried it. Work is in progress for
native Windows support; however, there are no official releases with Windows
support yet. It may be easier to get familiar with a release
<https://www.apache.org/dyn/closer.cgi/hadoop/common/> on Linux
if you are new to it.
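A minimal sketch of the stale-pid check described above, assuming the
default pid directory /tmp and the script layout used elsewhere in this
thread; the exact file names depend on the user and daemon, so the globs
are illustrative:

$ cat /tmp/hadoop-*-namenode.pid    # pid recorded when the daemon started
$ ./jps.exe | grep NameNode         # pid of the JVM actually running
# if the two disagree, stop dfs, delete the stale files, and restart:
$ ./stop-dfs.sh
$ rm /tmp/hadoop-*-namenode.pid /tmp/hadoop-*-datanode.pid
$ ./start-dfs.sh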
Irfan Sayed
2013-08-22 06:19:13 UTC
Permalink
thanks.
can i have a setup like this:
the namenode will be on linux (flavour may be RHEL, CentOS, Ubuntu etc)
and the datanodes are a combination of any OS (windows, linux, unix etc)

however, my doubt is: as the file systems of the two (win and linux) are
different, can datanodes of these systems not be part of a single cluster?
do i have to make a windows cluster and a UNIX cluster separately?

regards
Irfan Sayed
2013-08-22 11:31:20 UTC
Permalink
please suggest

regards
irfan
Mohammad Tariq
2013-08-22 14:26:10 UTC
Permalink
It is possible. Theoretically Hadoop doesn't stop you from doing that. But
it is not a very wise setup.

Warm Regards,
Tariq
cloudfront.blogspot.com
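For what it's worth, nothing in the slaves list is OS-specific: conf/slaves
on the namenode just names the datanode hosts, one per line, and the 1.x
helper scripts strip "#" comments. A sketch with placeholder hostnames
(only DFS-1 appears earlier in this thread):

# conf/slaves on the namenode -- one datanode hostname per line
DFS-1          # the existing Windows/Cygwin datanode
linux-dn-01    # a hypothetical Linux datanode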
Irfan Sayed
2013-08-22 20:12:15 UTC
Permalink
ok. thanks
now, i need to start with an all-windows setup first as our product will be
based on windows
so, please tell me how to resolve the issue

the datanode is not starting. please suggest

regards,
irfan
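Before anything else, the datanode's own log usually names the failure. A
minimal checklist, sketched from the suggestions earlier in the thread; the
paths assume the default Hadoop 1.1.2 layout relative to the bin directory,
and "namenode-host" / port 9000 are placeholders for whatever
fs.default.name says in core-site.xml:

# 1. read the tail of the datanode log for the real error
$ tail -n 50 ../logs/hadoop-*-datanode-*.log
# 2. clear any stale pid file left by a previous run
$ rm /tmp/hadoop-*-datanode.pid
# 3. check that the namenode address from core-site.xml is reachable
$ telnet namenode-host 9000
# 4. restart dfs and verify with jps
$ ./start-dfs.sh && ./jps.exe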
Post by Mohammad Tariq
It is possible. Theoretically Hadoop doesn't stop you from doing that. But
it is not a very wise setup.
Warm Regards,
Tariq
cloudfront.blogspot.com
Post by Irfan Sayed
please suggest
regards
irfan
Post by Irfan Sayed
thanks.
namenode will be on linux (flavour may be RHEL, CentOS, UBuntu etc)
and datanodes are the combination of any OS (windows , linux , unix etc )
however, my doubt is, as the file systems of both the systems (win and
linux ) are different , datanodes of these systems can not be part of
single cluster . i have to make windows cluster separate and UNIX cluster
separate ?
regards
On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
Post by Arpit Agarwal
I just noticed you are on Cygwin. IIRC Windows PIDs are not the same as
Cygwin PIDs so that may be causing the discrepancy. I don't know how well
Hadoop works in Cygwin as I have never tried it. Work is in progress for
native Windows support however there are no official releases with Windows
support yet. It may be easier to get familiar with a release<https://www.apache.org/dyn/closer.cgi/hadoop/common/>on Linux if you are new to it.
Post by Irfan Sayed
thanks
here is what i did .
i stopped all the namenodes and datanodes using ./stop-dfs.sh command
then deleted all pid files for namenodes and datanodes
started dfs again with command : "./start-dfs.sh"
when i ran the "Jps" command . it shows
$ ./jps.exe
4536 Jps
2076 NameNode
however, when i open the pid file for namenode then it is not showing
pid as : 4560. on the contrary, it shud show : 2076
please suggest
regards
On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <
Post by Arpit Agarwal
Most likely there is a stale pid file. Something like
\tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
the datanode.
I haven't read the entire thread so you may have looked at this already.
-Arpit
Post by Irfan Sayed
datanode is trying to connect to namenode continuously but fails
$ ./jps.exe
4584 NameNode
4016 Jps
$ ./start-dfs.sh
namenode running as process 3544. Stop it first.
DFS-1: datanode running as process 4076. Stop it first.
localhost: secondarynamenode running as process 4792. Stop it first.
both these logs are contradictory
please find the attached logs
should i attach the conf files as well ?
regards
Mohammad Tariq
2013-08-22 21:12:43 UTC
Permalink
Seriously?? You are planning to develop something using Hadoop on Windows?
Not a good idea. Anyway, could you please show me your log files? I also need
some additional info :
- The exact problem which you are facing right now
- Your cluster summary (no. of nodes etc.)
- Your latest configuration files
- Your /etc/hosts file

Warm Regards,
Tariq
cloudfront.blogspot.com
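One way to collect that information on the namenode machine, as a minimal sketch (run from the Hadoop bin directory; the relative conf path assumes a stock Hadoop 1.x layout):

$ ./jps.exe                   # which daemons are actually up
$ ./hadoop dfsadmin -report   # cluster summary: live/dead datanodes, capacity per node
$ cat /etc/hosts              # hostname-to-IP mappings the daemons resolve against
$ cat ../conf/core-site.xml ../conf/hdfs-site.xml ../conf/mapred-site.xml   # current configs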
Irfan Sayed
2013-08-23 04:15:08 UTC
Permalink
thanks.
ok. i think i need to change the plan over here
let me create two environments. 1: totally windows 2: totally Unix

because, on windows, anyway i have to try and see how hadoop works
on UNIX, it is already known that it is working fine.

so, on windows, here is the setup:

namenode : windows 2012 R2
datanode : windows 2012 R2

now, the exact problems are :
1: the datanode is not getting started
2: replication : if i put any file/folder on any datanode, it should get
replicated to all other available datanodes

regards
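On the replication point: files are not copied onto a datanode's local disk; they are written into HDFS through a client, and the namenode places each block on dfs.replication datanodes (default 3). A minimal sketch to watch that happen, with hypothetical file and path names:

$ ./hadoop dfs -put a.txt /wksp/a.txt                  # write the file into HDFS
$ ./hadoop fsck /wksp/a.txt -files -blocks -locations  # list each block and the datanodes holding its replicas
$ ./hadoop dfs -setrep -w 2 /wksp/a.txt                # change the replication factor for one file and wait for it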
Olivier Renault
2013-08-23 07:27:20 UTC
Permalink
Irfan,

If you want to get Hadoop running quickly on the Windows platform, you may want
to try our distribution for Windows. You will be able to find the MSI on
our website.

Regards
Olivier
Mohammad Tariq
2013-08-23 07:38:18 UTC
Permalink
Are you running a DN on both of the machines? Could you please show me your DN
logs?

Also, consider Olivier's suggestion. It's definitely a better option.

Warm Regards,
Tariq
cloudfront.blogspot.com
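For the DN logs Tariq asked for: on a stock Hadoop 1.x install they sit in the logs directory next to bin; the filename embeds the user and hostname, so the one below is hypothetical:

$ ls ../logs/*datanode*                                        # locate the datanode log
$ tail -n 50 ../logs/hadoop-Administrator-datanode-DFS-1.log   # the last startup attempt
$ grep -i "error\|exception" ../logs/*datanode*.log            # scan for the actual failure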
Irfan Sayed
2013-08-23 09:54:32 UTC
Permalink
thanks.
i just followed the instructions to set up the pseudo distributed mode
first, using this url :
http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I

i don't think i am running a DN on both machines
please find the attached log

hi olivier

can you please give me the download link ?
let me try it please

regards
irfan
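A quick check for whether a DN is up on a given machine, as a sketch (run on each node from the Hadoop bin directory):

$ ./jps.exe                    # a running datanode shows up as "DataNode" here
$ ./hadoop dfsadmin -report    # run on the namenode: shows how many datanodes have registered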
Olivier Renault
2013-08-23 10:40:51 UTC
Permalink
Here is the link

http://download.hortonworks.com/products/hdp-windows/

Olivier
Irfan Sayed
2013-09-03 04:51:01 UTC
Permalink
Thanks, and sorry for the long break; I got involved in some other
priorities.
I downloaded the installer, and while installing it I got the following
error:

[image: Inline image 1]

Do I need to make any configuration changes prior to installation?

Regards,
irfan



Olivier Renault
2013-09-03 07:28:56 UTC
Permalink
Correct, you need to define the cluster configuration in a file.
You will find information on the configuration file in the
documentation:

http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html

You should also make sure you have installed the prerequisites.

Thanks
Olivier
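
To make that concrete, here is a minimal sketch of such a cluster
configuration file for the two machines in this thread. The property names
follow the linked HDP-for-Windows documentation, but treat them as
assumptions and check the document for the full required set (it also
covers the Hive and Oozie database properties); the hostnames DFS-DC and
DFS-1 and the paths are placeholders to adapt:

#Log and data directories
HDP_LOG_DIR=c:\hadoop\logs
HDP_DATA_DIR=c:\hdp\data

#Hosts
NAMENODE_HOST=DFS-DC
SECONDARY_NAMENODE_HOST=DFS-DC
JOBTRACKER_HOST=DFS-DC
SLAVE_HOSTS=DFS-1

#Database
DB_FLAVOR=derby
DB_HOSTNAME=DFS-DC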
Irfan Sayed
2013-09-04 11:36:46 UTC
Permalink
Thanks.
I referred to the logs and manuals, modified the clusterproperties file,
and then double-clicked the msi file.
However, it still failed.
I then started the installation from the command line, passing
HDP_LAYOUT=<path to the clusterproperties file>.
The installation went further but failed on the .NET Framework 4.0 and
VC++ redistributable package dependencies.

I installed both and started the installation again.
It failed again with the following error:
[image: Inline image 1]

When I searched for the logs mentioned in the error, I could not find them.
Please suggest.

Regards,
irfan
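
For the command-line run, a msiexec invocation along the lines of the
linked documentation looks roughly like this. The msi file name and the
paths are placeholders; HDP_LAYOUT is the property used above, and the
standard msiexec /lv switch writes a verbose install log to a path you
choose, which avoids hunting for the log the error dialog points at:

msiexec /i "hdp-1.3.0.0.winpkg.msi" /lv "C:\hdp\hdp-install.log" HDP_LAYOUT="C:\hdp\clusterproperties.txt"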



Olivier Renault
2013-09-04 11:43:35 UTC
Permalink
The command to install it is msiexec /i <msifile> /...

You will find the correct syntax in the documentation.

Happy reading,
Olivier
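
For illustration, an invocation along these lines (a sketch only: /i and /lv
are standard Windows Installer switches, HDP_LAYOUT is the property already
mentioned in the thread, the msi file name and paths are placeholders, and any
further properties should be taken from the documentation page linked above):

msiexec /i "C:\downloads\hdp-1.3.0.0.winpkg.msi" /lv "C:\hdp\hdp.msi.log" HDP_LAYOUT="C:\hdp\clusterproperties.txt"

The /lv switch writes a verbose installer log, which is useful for
troubleshooting when the setup fails partway through.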
Irfan Sayed
2013-09-05 10:33:43 UTC
Permalink
thanks. i followed the user manual for the deployment and installed all the
prerequisites.
i modified the command as suggested, but the issue still persists. please
suggest.

please refer below:

[image: Inline image 1]

regards
irfan
Post by Olivier Renault
The command to install it is msiexec /i msifile /...
You will find the correct syntax as part of doc.
Happy reading
Olivier
Post by Irfan Sayed
thanks.
i referred the logs and manuals. i modified the clusterproperties file
and then double click on the msi file
however, it still failed.
further i started the installation on command line by giving
HDP_LAYOUT=clusterproperties file path,
installation went ahead and it failed for .NET framework 4.0 and VC++
redistributable package dependency
i installed both and started again the installation.
failed again with following error
[image: Inline image 1]
when i search for the logs mentioned in the error , i never found that
please suggest
regards
irfan
On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <
Post by Olivier Renault
Correct, you need to define the cluster configuration as part of a file.
You will find some information on the configuration file as part of the
documentation.
http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
You should make sure to have also installed the pre requisite.
Thanks
Olivier
Post by Irfan Sayed
thanks. sorry for the long break. actually got involved in some other
priorities
i downloaded the installer and while installing i got following error
[image: Inline image 1]
do i need to make any configuration prior to installation ??
regards
irfan
On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <
Post by Olivier Renault
Here is the link
http://download.hortonworks.com/products/hdp-windows/
Olivier
Post by Irfan Sayed
thanks.
i just followed the instructions to setup the pseudo distributed
http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
i don't think so i am running DN on both machine
please find the attached log
hi olivier
can you please give me download link ?
let me try please
regards
irfan
Post by Mohammad Tariq
Are you running DN on both the machines? Could you please show me
your DN logs?
Also, consider Oliver's suggestion. It's definitely a better option.
Warm Regards,
Tariq
cloudfront.blogspot.com
On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
Post by Olivier Renault
Irfu,
If you want to quickly get Hadoop running on windows platform. You
may want to try our distribution for Windows. You will be able to find the
msi on our website.
Regards
Olivier
Post by Irfan Sayed
thanks.
ok. i think i need to change the plan over here
let me create two environments. 1: totally windows 2: totally Unix
because, on windows , anyway i have to try and see how hadoop works
on UNIX, it is already known that , it is working fine.
namenode : windows 2012 R2
datanode : windows 2012 R2
1: datanode is not getting started
2: replication : if i put any file/folder on any datanode , it
should get replicated to all another available datanodes
regards
On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <
Post by Mohammad Tariq
Seriously??You are planning to develop something using Hadoop on
windows. Not a good idea. Anyways, cold you plz show me your log files?I
-The exact problem which you are facing right now
-Your cluster summary(no. of nodes etc)
-Your latest configuration files
-Your /etc.hosts file
Warm Regards,
Tariq
cloudfront.blogspot.com
On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <
Post by Irfan Sayed
ok. thanks
now, i need to start with all windows setup first as our product
will be based on windows
so, now, please tell me how to resolve the issue
datanode is not starting . please suggest
regards,
irfan
On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <
Post by Mohammad Tariq
It is possible. Theoretically Hadoop doesn't stop you from
doing that. But it is not a very wise setup.
Warm Regards,
Tariq
cloudfront.blogspot.com
On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <
Post by Irfan Sayed
please suggest
regards
irfan
On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <
Post by Irfan Sayed
thanks.
namenode will be on linux (flavour may be RHEL, CentOS,
UBuntu etc)
and datanodes are the combination of any OS (windows , linux
, unix etc )
however, my doubt is, as the file systems of both the
systems (win and linux ) are different , datanodes of these systems can
not be part of single cluster . i have to make windows cluster separate and
UNIX cluster separate ?
regards
On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
Post by Arpit Agarwal
I just noticed you are on Cygwin. IIRC Windows PIDs are not
the same as Cygwin PIDs so that may be causing the discrepancy. I don't
know how well Hadoop works in Cygwin as I have never tried it. Work is in
progress for native Windows support however there are no official releases
with Windows support yet. It may be easier to get familiar with a
release<https://www.apache.org/dyn/closer.cgi/hadoop/common/>on Linux if you are new to it.
On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <
Post by Irfan Sayed
thanks
here is what i did .
i stopped all the namenodes and datanodes using
./stop-dfs.sh command
then deleted all pid files for namenodes and datanodes
started dfs again with command : "./start-dfs.sh"
when i ran the "Jps" command . it shows
$ ./jps.exe
4536 Jps
2076 NameNode
however, when i open the pid file for namenode then it is
not showing pid as : 4560. on the contrary, it shud show : 2076
please suggest
regards
On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <
Post by Arpit Agarwal
Most likely there is a stale pid file. Something like
\tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
the datanode.
I haven't read the entire thread so you may have looked at
this already.
-Arpit
On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <
Post by Irfan Sayed
datanode is trying to connect to namenode continuously
but fails
$ ./jps.exe
4584 NameNode
4016 Jps
$ ./start-dfs.sh
namenode running as process 3544. Stop it first.
DFS-1: datanode running as process 4076. Stop it first.
localhost: secondarynamenode running as process 4792.
Stop it first.
both these logs are contradictory
please find the attached logs
should i attach the conf files as well ?
regards
On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <
Post by Mohammad Tariq
Your DN is still not running. Showing me the logs would
be helpful.
Warm Regards,
Tariq
cloudfront.blogspot.com
On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <
Post by Irfan Sayed
i followed the url and did the steps mention in that. i
have deployed on the windows platform
Now, i am able to browse url : http://localhost:50070(name node )
http://localhost:50030
please refer below
[image: Inline image 1]
i have modified all the config files as mentioned and
formatted the hdfs file system as well
please suggest
regards
On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <
Post by Irfan Sayed
http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
let me follow the url which you gave for pseudo
distributed setup and then will switch to distributed mode
regards
irfan
On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <
Post by Mohammad Tariq
You are welcome. Which link have you followed for the
configuration?Your *core-site.xml* is empty. Remove
the property *fs.default.name *from *hdfs-site.xml*and add it to
*core-site.xml*. Remove *mapred.job.tracker* as
well. It is required in *mapred-site.xml*.
I would suggest you to do a pseudo distributed setup
first in order to get yourself familiar with the process and then proceed
to the distributed mode. You can visit this link<http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I>if you need some help. Let me know if you face any issue.
HTH
Warm Regards,
Tariq
cloudfront.blogspot.com
On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <
Post by Irfan Sayed
thanks tariq for response.
as discussed last time, i have sent you all the
config files in my setup .
can you please go through that ?
please let me know
regards
irfan
On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <
Post by Mohammad Tariq
I'm sorry for being unresponsive. Was out of touch
for sometime because of ramzan and eid. Resuming work today.
What's the current status?
Warm Regards,
Tariq
cloudfront.blogspot.com
On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <
Post by manish dunani
First of all read the concepts ..I hope you will
like it..
https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf
On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <
Post by Irfan Sayed
please suggest
regards
irfan
On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <
Post by Irfan Sayed
hey Tariq,
i am still stuck ..
can you please suggest
regards
irfan
On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <
Post by Irfan Sayed
please suggest
regards
On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <
Post by Irfan Sayed
attachment got quarantined
resending in txt format. please rename it to
conf.rar
regards
On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <
Post by Irfan Sayed
thanks.
$ ./jps.exe
3164 NameNode
1892 Jps
$ ./jps.exe
3848 Jps
jps does not list any process for datanode.
however, on web browser i can see one live data node
please find the attached conf rar file of
namenode
regards
On Wed, Aug 7, 2013 at 1:52 AM, Mohammad
Post by Mohammad Tariq
OK. we'll start fresh. Could you plz show me
your latest config files?
BTW, are your daemons running fine?Use JPS
to verify that.
Warm Regards,
Tariq
cloudfront.blogspot.com
On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed
Post by Irfan Sayed
i have created these dir "wksp_data" and
"wksp_name" on both datanode and namenode
made the respective changes in
"hdfs-site.xml" file
formatted the namenode
started the dfs
but still, not able to browse the file
system through web browser
please refer below
anything still missing ?
please suggest
[image: Inline image 1]
On Tue, Aug 6, 2013 at 10:35 PM, Irfan
Post by Irfan Sayed
these dir needs to be created on all
datanodes and namenodes ?
further, hdfs-site.xml needs to be
updated on both datanodes and namenodes for these new dir?
regards
On Tue, Aug 6, 2013 at 5:30 PM, Mohammad
Post by Mohammad Tariq
Create 2 directories manually
corresponding to the values of dfs.name.dir and dfs.data.dir properties and
change the permissions of these directories to 755. When you start pushing
data into your HDFS, data will start going inside the directory specified
by dfs.data.dir and the associated metadata will go inside dfs.name.dir.
Remember, you store data in HDFS, but it eventually gets stored in your
local/native FS. But you cannot see this data directly on your local/native
FS.
Warm Regards,
Tariq
cloudfront.blogspot.com
On Tue, Aug 6, 2013 at 5:26 PM, Irfan
Post by Irfan Sayed
thanks.
however, i need this to be working on
windows environment as project requirement.
i will add/work on Linux later
so, now , at this stage , c:\\wksp is
the HDFS file system OR do i need to create it from command line ?
please suggest
regards,
On Tue, Aug 6, 2013 at 5:19 PM, Mohammad
Post by Mohammad Tariq
Hello Irfan,
Sorry for being unresponsive. Got stuck
with some imp work.
HDFS webUI doesn't provide us the
ability to create file or directory. You can browse HDFS, view files,
download files etc. But operation like create, move, copy etc are not
supported.
These values look fine to me.
One suggestion though. Try getting a
Linux machine(if possible). Or at least use a VM. I personally feel that
using Hadoop on windows is always messy.
Warm Regards,
Tariq
cloudfront.blogspot.com
On Tue, Aug 6, 2013 at 5:09 PM, Irfan
Post by Irfan Sayed
thanks.
when i browse the file system , i am
i haven't seen any make directory
option there
i need to create it from command line ?
further, in the hdfs-site.xml file , i
have given following entries. are they correct ?
<property>
<name>dfs.data.dir</name>
<value>c:\\wksp</value>
</property>
<property>
<name>dfs.name.dir</name>
<value>c:\\wksp</value>
</property>
please suggest
[image: Inline image 1]
On Tue, Aug 6, 2013 at 12:40 PM,
Post by manish dunani
*You are wrong at this:*
$ ./hadoop dfs -copyFromLocal
/cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
copyFromLocal: File
/cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
$ ./hadoop dfs -copyFromLocal
/cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
copyFromLocal: File
/cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
Because,You had wrote both the paths
local and You need not to copy hadoop into hdfs...Hadoop is already
working..
Just check out in browser by after
localhost:50070
then go for browse the filesystem
link in it..
If there is no directory then make
directory there.
That is your hdfs directory.
Then copy any text file there(no need
to copy hadoop there).beacause u are going to do processing on that data in
text file.That's why hadoop is used for ,first u need to make it clear in
ur mind.Then and then u will do it...otherwise not possible..
*Try this: *
$ .bin/hadoop dfs -copyFromLocal
/full/local/path/to/ur/file /hdfs/directory/path
On Tue, Aug 6, 2013 at 11:49 AM,
Post by Irfan Sayed
thanks. yes , i am newbie.
however, i need windows setup.
let me surely refer the doc and link
which u sent but i need this to be working ...
can you please help
regards
--
MANISH DUNANI
-THANX
+91 9426881954,+91 8460656443
--
Regards
*Manish Dunani*
*Contact No* : +91 9408329137
*skype id* : manish.dunani*
*
CONFIDENTIALITY NOTICE
NOTICE: This message is intended for the use of the
individual or entity to which it is addressed and may contain information
that is confidential, privileged and exempt from disclosure under
applicable law. If the reader of this message is not the intended
recipient, you are hereby notified that any printing, copying,
dissemination, distribution, disclosure or forwarding of this communication
is strictly prohibited. If you have received this communication in error,
please contact the sender immediately and delete it from your system. Thank
You.
CONFIDENTIALITY NOTICE
NOTICE: This message is intended for the use of the
individual or entity to which it is addressed and may contain information
that is confidential, privileged and exempt from disclosure under
applicable law. If the reader of this message is not the intended
recipient, you are hereby notified that any printing, copying,
dissemination, distribution, disclosure or forwarding of this communication
is strictly prohibited. If you have received this communication in error,
please contact the sender immediately and delete it from your system. Thank
You.
CONFIDENTIALITY NOTICE
NOTICE: This message is intended for the use of the individual or
entity to which it is addressed and may contain information that is
confidential, privileged and exempt from disclosure under applicable law.
If the reader of this message is not the intended recipient, you are hereby
notified that any printing, copying, dissemination, distribution,
disclosure or forwarding of this communication is strictly prohibited. If
you have received this communication in error, please contact the sender
immediately and delete it from your system. Thank You.
CONFIDENTIALITY NOTICE
NOTICE: This message is intended for the use of the individual or
entity to which it is addressed and may contain information that is
confidential, privileged and exempt from disclosure under applicable law.
If the reader of this message is not the intended recipient, you are hereby
notified that any printing, copying, dissemination, distribution,
disclosure or forwarding of this communication is strictly prohibited. If
you have received this communication in error, please contact the sender
immediately and delete it from your system. Thank You.
CONFIDENTIALITY NOTICE
NOTICE: This message is intended for the use of the individual or entity
to which it is addressed and may contain information that is confidential,
privileged and exempt from disclosure under applicable law. If the reader
of this message is not the intended recipient, you are hereby notified that
any printing, copying, dissemination, distribution, disclosure or
forwarding of this communication is strictly prohibited. If you have
received this communication in error, please contact the sender immediately
and delete it from your system. Thank You.
CONFIDENTIALITY NOTICE
NOTICE: This message is intended for the use of the individual or entity
to which it is addressed and may contain information that is confidential,
privileged and exempt from disclosure under applicable law. If the reader
of this message is not the intended recipient, you are hereby notified that
any printing, copying, dissemination, distribution, disclosure or
forwarding of this communication is strictly prohibited. If you have
received this communication in error, please contact the sender immediately
and delete it from your system. Thank You.
Olivier Renault
2013-09-05 12:39:03 UTC
Permalink
Could you share the log files (c:\hdp.log and
c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log) as well
as your clusterproperties.txt?

Thanks,
Olivier
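
For reference, a minimal clusterproperties.txt for a small Windows cluster
might look roughly like the sketch below. This is illustrative only: the field
names are recalled from the HDP 1.3 for Windows documentation and should be
checked against the page linked earlier in the thread, and the hostnames and
paths are placeholders.

HDP_LOG_DIR=c:\hadoop\logs
HDP_DATA_DIR=c:\hdp\data
NAMENODE_HOST=NAMENODE-HOST
SECONDARY_NAMENODE_HOST=NAMENODE-HOST
JOBTRACKER_HOST=NAMENODE-HOST
SLAVE_HOSTS=DATANODE-HOST1
DB_FLAVOR=derby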
Post by Irfan Sayed
thanks. i followed the user manual for deployment and installed all
pre-requisites
i modified the command and still the issue persist. please suggest
please refer below
[image: Inline image 1]
regards
irfan
Post by Olivier Renault
The command to install it is msiexec /i msifile /...
You will find the correct syntax as part of doc.
Happy reading
Olivier
Post by Irfan Sayed
thanks.
i referred the logs and manuals. i modified the clusterproperties file
and then double click on the msi file
however, it still failed.
further i started the installation on command line by giving
HDP_LAYOUT=clusterproperties file path,
installation went ahead and it failed for .NET framework 4.0 and VC++
redistributable package dependency
i installed both and started again the installation.
failed again with following error
[image: Inline image 1]
when i search for the logs mentioned in the error , i never found that
please suggest
regards
irfan
On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <
Post by Olivier Renault
Correct, you need to define the cluster configuration as part of a
file. You will find some information on the configuration file as part of
the documentation.
http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
You should make sure to have also installed the pre requisite.
Thanks
Olivier
Post by Irfan Sayed
thanks. sorry for the long break. actually got involved in some other
priorities
i downloaded the installer and while installing i got following error
[image: Inline image 1]
do i need to make any configuration prior to installation ??
regards
irfan
On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <
Post by Olivier Renault
Here is the link
http://download.hortonworks.com/products/hdp-windows/
Olivier
Post by Irfan Sayed
thanks.
i just followed the instructions to setup the pseudo distributed
http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
i don't think so i am running DN on both machine
please find the attached log
hi olivier
can you please give me download link ?
let me try please
regards
irfan
Post by Mohammad Tariq
Are you running DN on both the machines? Could you please show me
your DN logs?
Also, consider Oliver's suggestion. It's definitely a better option.
Warm Regards,
Tariq
cloudfront.blogspot.com
On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
Post by Olivier Renault
Irfu,
If you want to quickly get Hadoop running on windows platform. You
may want to try our distribution for Windows. You will be able to find the
msi on our website.
Regards
Olivier
Post by Irfan Sayed
thanks.
ok. i think i need to change the plan over here
let me create two environments. 1: totally windows 2: totally Unix
because, on windows , anyway i have to try and see how hadoop works
on UNIX, it is already known that , it is working fine.
namenode : windows 2012 R2
datanode : windows 2012 R2
1: datanode is not getting started
2: replication : if i put any file/folder on any datanode , it
should get replicated to all another available datanodes
regards
On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <
Post by Mohammad Tariq
Seriously??You are planning to develop something using Hadoop on
windows. Not a good idea. Anyways, cold you plz show me your log files?I
-The exact problem which you are facing right now
-Your cluster summary(no. of nodes etc)
-Your latest configuration files
-Your /etc.hosts file
Warm Regards,
Tariq
cloudfront.blogspot.com
On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <
Post by Irfan Sayed
ok. thanks
now, i need to start with all windows setup first as our
product will be based on windows
so, now, please tell me how to resolve the issue
datanode is not starting . please suggest
regards,
irfan
On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <
Post by Mohammad Tariq
It is possible. Theoretically Hadoop doesn't stop you from
doing that. But it is not a very wise setup.
Warm Regards,
Tariq
cloudfront.blogspot.com
On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <
Post by Irfan Sayed
please suggest
regards
irfan
On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <
Post by Irfan Sayed
thanks.
namenode will be on linux (flavour may be RHEL, CentOS,
UBuntu etc)
and datanodes are the combination of any OS (windows , linux
, unix etc )
however, my doubt is, as the file systems of both the
systems (win and linux ) are different , datanodes of these systems can
not be part of single cluster . i have to make windows cluster separate and
UNIX cluster separate ?
regards
On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
Post by Arpit Agarwal
I just noticed you are on Cygwin. IIRC Windows PIDs are not
the same as Cygwin PIDs so that may be causing the discrepancy. I don't
know how well Hadoop works in Cygwin as I have never tried it. Work is in
progress for native Windows support however there are no official releases
with Windows support yet. It may be easier to get familiar with a
release<https://www.apache.org/dyn/closer.cgi/hadoop/common/>on Linux if you are new to it.
On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <
Post by Irfan Sayed
thanks
here is what i did .
i stopped all the namenodes and datanodes using
./stop-dfs.sh command
then deleted all pid files for namenodes and datanodes
started dfs again with command : "./start-dfs.sh"
when i ran the "Jps" command . it shows
$ ./jps.exe
4536 Jps
2076 NameNode
however, when i open the pid file for namenode then it is
not showing pid as : 4560. on the contrary, it shud show : 2076
please suggest
regards
On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <
Post by Arpit Agarwal
Most likely there is a stale pid file. Something like
\tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
the datanode.
I haven't read the entire thread so you may have looked
at this already.
-Arpit
On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <
Post by Irfan Sayed
datanode is trying to connect to namenode continuously
but fails
$ ./jps.exe
4584 NameNode
4016 Jps
$ ./start-dfs.sh
namenode running as process 3544. Stop it first.
DFS-1: datanode running as process 4076. Stop it first.
localhost: secondarynamenode running as process 4792.
Stop it first.
both these logs are contradictory
please find the attached logs
should i attach the conf files as well ?
regards
Irfan Sayed
2013-09-06 03:41:56 UTC
Permalink
Please find the attached.
I don't have "c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log"
as it was not generated.

regards
irfan
Post by Olivier Renault
Could you share the log files ( c:\hdp.log,
c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log ) as
well as your clusterproperties.txt ?
Thanks,
Olivier
Ravi Mummulla (BIG DATA)
2013-09-06 05:42:50 UTC
Permalink
Here's your issue (from the logs you attached earlier):

CAQuietExec: Checking JAVA_HOME is set correctly...
CAQuietExec: Files\Java\jdk1.6.0_31 was unexpected at this time.

It seems that you installed the Java prerequisite in the default path, which is %PROGRAMFILES% (expands to C:\Program Files in your case). HDP 1.3 does not like spaces in paths, so you need to reinstall Java under c:\java\ or something similar (in a path with no spaces).
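
For example, after reinstalling the JDK under a space-free path, JAVA_HOME can be repointed from an elevated command prompt (the JDK folder name below is illustrative; use the path you actually install to):

setx JAVA_HOME "c:\java\jdk1.6.0_31" /m

Open a fresh console afterwards so the new value is picked up. The "Files\Java\jdk1.6.0_31 was unexpected at this time" line is what cmd prints when an unquoted path containing a space gets expanded inside the installer's batch logic.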

From: Irfan Sayed [mailto:irfu.sayed-***@public.gmane.org]
Sent: Thursday, September 5, 2013 8:42 PM
To: user-7ArZoLwFLBtd/SJB6HiN2Ni2O/***@public.gmane.org
Subject: Re: about replication

please find the attached.
i don't have "c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log" as it is not generated

regards
irfan





On Thu, Sep 5, 2013 at 6:09 PM, Olivier Renault <orenault-RYNwJFaOa9CEK/***@public.gmane.org<mailto:orenault-RYNwJFaOa9CEK/***@public.gmane.org>> wrote:
Could you share the log files ( c:\hdp.log, c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log ) as well as your clusterproperties.txt ?

Thanks,
Olivier

On 5 September 2013 12:33, Irfan Sayed <irfu.sayed-***@public.gmane.org<mailto:irfu.sayed-***@public.gmane.org>> wrote:
thanks. i followed the user manual for deployment and installed all pre-requisites
i modified the command and still the issue persist. please suggest

please refer below


[Inline image 1]

regards
irfan


On Wed, Sep 4, 2013 at 5:13 PM, Olivier Renault <orenault-RYNwJFaOa9CEK/***@public.gmane.org<mailto:orenault-RYNwJFaOa9CEK/***@public.gmane.org>> wrote:

The command to install it is msiexec /i msifile /...

You will find the correct syntax as part of doc.

Happy reading
Olivier
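
As a sketch, the invocation looks like the following (the msi file name and paths are placeholders; the documentation has the authoritative property list):

msiexec /i "hdp-1.3.0.0.winpkg.msi" /lv "c:\hdp.log" HDP_LAYOUT="c:\hdp\clusterproperties.txt"

The /lv switch writes a verbose log, which is the c:\hdp.log file requested elsewhere in this thread.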
On 4 Sep 2013 12:37, "Irfan Sayed" <irfu.sayed-***@public.gmane.org<mailto:***@gmail.com>> wrote:
thanks.
i referred the logs and manuals. i modified the clusterproperties file and then double click on the msi file
however, it still failed.
further i started the installation on command line by giving HDP_LAYOUT=clusterproperties file path,
installation went ahead and it failed for .NET framework 4.0 and VC++ redistributable package dependency

i installed both and started again the installation.
failed again with following error
[Inline image 1]

when i search for the logs mentioned in the error , i never found that
please suggest

regards
irfan


On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <orenault-RYNwJFaOa9CEK/***@public.gmane.org<mailto:orenault-RYNwJFaOa9CEK/***@public.gmane.org>> wrote:

Correct, you need to define the cluster configuration as part of a file. You will find some information on the configuration file as part of the documentation.

http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html

You should make sure to have also installed the prerequisites.

Thanks
Olivier
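
As an illustration only (verify the exact property names against the documentation linked above; the host name is the DFS-DC machine from this thread and the directories are placeholders), a one-node clusterproperties.txt is a plain key=value file along these lines:

HDP_LOG_DIR=c:\hadoop\logs
HDP_DATA_DIR=c:\hdp\data
NAMENODE_HOST=DFS-DC
SECONDARY_NAMENODE_HOST=DFS-DC
JOBTRACKER_HOST=DFS-DC
SLAVE_HOSTS=DFS-DC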
On 3 Sep 2013 06:51, "Irfan Sayed" <irfu.sayed-***@public.gmane.org<mailto:***@gmail.com>> wrote:
thanks. sorry for the long break. actually got involved in some other priorities
i downloaded the installer and while installing i got following error

[Inline image 1]

do i need to make any configuration prior to installation ??

regards
irfan


On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <orenault-RYNwJFaOa9CEK/***@public.gmane.org<mailto:orenault-RYNwJFaOa9CEK/***@public.gmane.org>> wrote:

Here is the link

http://download.hortonworks.com/products/hdp-windows/

Olivier
On 23 Aug 2013 10:55, "Irfan Sayed" <irfu.sayed-***@public.gmane.org<mailto:***@gmail.com>> wrote:
thanks.
i just followed the instructions to setup the pseudo distributed setup first using the url : http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I

i don't think so i am running DN on both machine
please find the attached log

hi olivier

can you please give me download link ?
let me try please

regards
irfan



On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <dontariq-***@public.gmane.org<mailto:dontariq-***@public.gmane.org>> wrote:
Are you running DN on both the machines? Could you please show me your DN logs?

Also, consider Oliver's suggestion. It's definitely a better option.



Warm Regards,
Tariq
cloudfront.blogspot.com<http://cloudfront.blogspot.com>

On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <orenault-RYNwJFaOa9CEK/***@public.gmane.org<mailto:orenault-RYNwJFaOa9CEK/***@public.gmane.org>> wrote:

Irfu,

If you want to quickly get Hadoop running on windows platform. You may want to try our distribution for Windows. You will be able to find the msi on our website.

Regards
Olivier
On 23 Aug 2013 05:15, "Irfan Sayed" <irfu.sayed-***@public.gmane.org<mailto:***@gmail.com>> wrote:
thanks.
ok. i think i need to change the plan over here
let me create two environments. 1: totally windows 2: totally Unix

because, on windows , anyway i have to try and see how hadoop works
on UNIX, it is already known that , it is working fine.

so, on windows , here is the setup:

namenode : windows 2012 R2
datanode : windows 2012 R2

now, the exact problem is :
1: datanode is not getting started
2: replication : if i put any file/folder on any datanode , it should get replicated to all other available datanodes

regards
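
For reference, the replication factor is controlled by dfs.replication in hdfs-site.xml; 3 is the default, and 2 would be the natural value for a two-datanode cluster:

<property>
  <name>dfs.replication</name>
  <value>2</value>
</property>

Note that replication applies to files written through HDFS (e.g. with hadoop dfs -put); a file or folder dropped directly onto a datanode's local disk is invisible to HDFS and will not be replicated.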








On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <dontariq-***@public.gmane.org<mailto:dontariq-***@public.gmane.org>> wrote:
Seriously?? You are planning to develop something using Hadoop on windows. Not a good idea. Anyways, could you plz show me your log files? I also need some additional info :
-The exact problem which you are facing right now
-Your cluster summary(no. of nodes etc)
-Your latest configuration files
-Your /etc/hosts file

Warm Regards,
Tariq
cloudfront.blogspot.com<http://cloudfront.blogspot.com>

On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <irfu.sayed-***@public.gmane.org<mailto:irfu.sayed-***@public.gmane.org>> wrote:
ok. thanks
now, i need to start with all windows setup first as our product will be based on windows
so, now, please tell me how to resolve the issue

datanode is not starting . please suggest

regards,
irfan


On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <dontariq-***@public.gmane.org<mailto:dontariq-***@public.gmane.org>> wrote:
It is possible. Theoretically Hadoop doesn't stop you from doing that. But it is not a very wise setup.

Warm Regards,
Tariq
cloudfront.blogspot.com<http://cloudfront.blogspot.com>

On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <irfu.sayed-***@public.gmane.org<mailto:irfu.sayed-***@public.gmane.org>> wrote:
please suggest

regards
irfan


On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <irfu.sayed-***@public.gmane.org<mailto:irfu.sayed-***@public.gmane.org>> wrote:
thanks.
can i have setup like this :
namenode will be on linux (flavour may be RHEL, CentOS, UBuntu etc)
and datanodes are the combination of any OS (windows , linux , unix etc )

however, my doubt is, as the file systems of both the systems (win and linux ) are different , datanodes of these systems can not be part of single cluster . i have to make windows cluster separate and UNIX cluster separate ?

regards


On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <aagarwal-RYNwJFaOa9CEK/***@public.gmane.org<mailto:aagarwal-RYNwJFaOa9CEK/***@public.gmane.org>> wrote:
I just noticed you are on Cygwin. IIRC Windows PIDs are not the same as Cygwin PIDs so that may be causing the discrepancy. I don't know how well Hadoop works in Cygwin as I have never tried it. Work is in progress for native Windows support however there are no official releases with Windows support yet. It may be easier to get familiar with a release<https://www.apache.org/dyn/closer.cgi/hadoop/common/> on Linux if you are new to it.


On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <irfu.sayed-***@public.gmane.org<mailto:irfu.sayed-***@public.gmane.org>> wrote:
thanks
here is what i did .
i stopped all the namenodes and datanodes using ./stop-dfs.sh command
then deleted all pid files for namenodes and datanodes

started dfs again with command : "./start-dfs.sh"

when i ran the "Jps" command . it shows

***@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
$ ./jps.exe
4536 Jps
2076 NameNode

however, when i open the pid file for the namenode, it is showing the pid as 4560; on the contrary, it should show 2076

please suggest

regards


On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <aagarwal-RYNwJFaOa9CEK/***@public.gmane.org<mailto:aagarwal-RYNwJFaOa9CEK/***@public.gmane.org>> wrote:
Most likely there is a stale pid file. Something like \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting the datanode.

I haven't read the entire thread so you may have looked at this already.

-Arpit
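
A minimal recovery sequence, assuming the pid files are in the default location (HADOOP_PID_DIR may point elsewhere on your setup):

$ ./stop-dfs.sh
$ rm /tmp/hadoop-*datanode.pid /tmp/hadoop-*namenode.pid
$ ./start-dfs.sh
$ ./jps.exe

If jps and start-dfs.sh still disagree after this, the pids recorded in the files do not match the real process ids, which is the Cygwin-vs-Windows pid mismatch Arpit notes elsewhere in the thread.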


On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <irfu.sayed-***@public.gmane.org<mailto:irfu.sayed-***@public.gmane.org>> wrote:
datanode is trying to connect to namenode continuously but fails

when i try to run "jps" command it says :
$ ./jps.exe
4584 NameNode
4016 Jps

and when i ran the "./start-dfs.sh" then it says :

$ ./start-dfs.sh
namenode running as process 3544. Stop it first.
DFS-1: datanode running as process 4076. Stop it first.
localhost: secondarynamenode running as process 4792. Stop it first.

both these logs are contradictory
please find the attached logs

should i attach the conf files as well ?

regards


On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <dontariq-***@public.gmane.org<mailto:dontariq-***@public.gmane.org>> wrote:
Your DN is still not running. Showing me the logs would be helpful.

Warm Regards,
Tariq
cloudfront.blogspot.com<http://cloudfront.blogspot.com>

On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <irfu.sayed-***@public.gmane.org<mailto:irfu.sayed-***@public.gmane.org>> wrote:
i followed the url and did the steps mentioned in it. i have deployed on the windows platform

Now, i am able to browse url : http://localhost:50070 (name node )
however, not able to browse url : http://localhost:50030

please refer below

[Inline image 1]

i have modified all the config files as mentioned and formatted the hdfs file system as well
please suggest

regards


On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <irfu.sayed-***@public.gmane.org<mailto:irfu.sayed-***@public.gmane.org>> wrote:
thanks. i followed this url : http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
let me follow the url which you gave for pseudo distributed setup and then will switch to distributed mode

regards
irfan


On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <dontariq-***@public.gmane.org<mailto:dontariq-***@public.gmane.org>> wrote:
You are welcome. Which link have you followed for the configuration? Your core-site.xml is empty. Remove the property fs.default.name from hdfs-site.xml and add it to core-site.xml. Remove mapred.job.tracker as well. It is required in mapred-site.xml.

I would suggest you to do a pseudo distributed setup first in order to get yourself familiar with the process and then proceed to the distributed mode. You can visit this link<http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I> if you need some help. Let me know if you face any issue.

HTH

Warm Regards,
Tariq
cloudfront.blogspot.com<http://cloudfront.blogspot.com>
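
Following that advice, a minimal pseudo-distributed split would look like this (the port numbers are the conventional Hadoop 1.x defaults; adjust as needed):

core-site.xml:
<property>
  <name>fs.default.name</name>
  <value>hdfs://localhost:9000</value>
</property>

mapred-site.xml:
<property>
  <name>mapred.job.tracker</name>
  <value>localhost:9001</value>
</property>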

On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <irfu.sayed-***@public.gmane.org<mailto:irfu.sayed-***@public.gmane.org>> wrote:
thanks tariq for response.
as discussed last time, i have sent you all the config files in my setup .
can you please go through that ?

please let me know

regards
irfan



On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <dontariq-***@public.gmane.org<mailto:dontariq-***@public.gmane.org>> wrote:
I'm sorry for being unresponsive. Was out of touch for sometime because of ramzan and eid. Resuming work today.

What's the current status?

Warm Regards,
Tariq
cloudfront.blogspot.com<http://cloudfront.blogspot.com>

On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <manishd207-***@public.gmane.org<mailto:manishd207-***@public.gmane.org>> wrote:
First of all read the concepts ..I hope you will like it..

https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf

On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <irfu.sayed-***@public.gmane.org<mailto:irfu.sayed-***@public.gmane.org>> wrote:
please suggest

regards
irfan


On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <irfu.sayed-***@public.gmane.org<mailto:irfu.sayed-***@public.gmane.org>> wrote:
hey Tariq,
i am still stuck ..
can you please suggest

regards
irfan


On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <irfu.sayed-***@public.gmane.org<mailto:irfu.sayed-***@public.gmane.org>> wrote:
please suggest

regards


On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <irfu.sayed-***@public.gmane.org<mailto:irfu.sayed-***@public.gmane.org>> wrote:
attachment got quarantined
resending in txt format. please rename it to conf.rar

regards


On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <irfu.sayed-***@public.gmane.org<mailto:irfu.sayed-***@public.gmane.org>> wrote:
thanks.

if i run the jps command on namenode :

***@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
$ ./jps.exe
3164 NameNode
1892 Jps

same command on datanode :

***@DFS-1 /cygdrive/c/Java/jdk1.7.0_25/bin
$ ./jps.exe
3848 Jps

jps does not list any process for datanode. however, on web browser i can see one live data node
please find the attached conf rar file of namenode

regards


On Wed, Aug 7, 2013 at 1:52 AM, Mohammad Tariq <dontariq-***@public.gmane.org<mailto:dontariq-***@public.gmane.org>> wrote:
OK. we'll start fresh. Could you plz show me your latest config files?

BTW, are your daemons running fine?Use JPS to verify that.

Warm Regards,
Tariq
cloudfront.blogspot.com<http://cloudfront.blogspot.com>

On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed <irfu.sayed-***@public.gmane.org<mailto:irfu.sayed-***@public.gmane.org>> wrote:
i have created these dir "wksp_data" and "wksp_name" on both datanode and namenode
made the respective changes in "hdfs-site.xml" file
formatted the namenode
started the dfs

but still, not able to browse the file system through web browser
please refer below

anything still missing ?
please suggest

[Inline image 1]

On Tue, Aug 6, 2013 at 10:35 PM, Irfan Sayed <irfu.sayed-***@public.gmane.org<mailto:irfu.sayed-***@public.gmane.org>> wrote:
these dir needs to be created on all datanodes and namenodes ?
further, hdfs-site.xml needs to be updated on both datanodes and namenodes for these new dir?

regards


On Tue, Aug 6, 2013 at 5:30 PM, Mohammad Tariq <dontariq-***@public.gmane.org<mailto:dontariq-***@public.gmane.org>> wrote:
Create 2 directories manually corresponding to the values of dfs.name.dir and dfs.data.dir properties and change the permissions of these directories to 755. When you start pushing data into your HDFS, data will start going inside the directory specified by dfs.data.dir and the associated metadata will go inside dfs.name.dir. Remember, you store data in HDFS, but it eventually gets stored in your local/native FS. But you cannot see this data directly on your local/native FS.

Warm Regards,
Tariq
cloudfront.blogspot.com<http://cloudfront.blogspot.com>
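
Concretely, from the same Cygwin prompt that would look something like this (the wksp_name/wksp_data names match the directories created elsewhere in the thread):

$ mkdir /cygdrive/c/wksp_name /cygdrive/c/wksp_data
$ chmod 755 /cygdrive/c/wksp_name /cygdrive/c/wksp_data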

On Tue, Aug 6, 2013 at 5:26 PM, Irfan Sayed <irfu.sayed-***@public.gmane.org<mailto:irfu.sayed-***@public.gmane.org>> wrote:
thanks.
however, i need this to be working on windows environment as project requirement.
i will add/work on Linux later

so, now , at this stage , c:\\wksp is the HDFS file system OR do i need to create it from command line ?

please suggest

regards,


On Tue, Aug 6, 2013 at 5:19 PM, Mohammad Tariq <dontariq-***@public.gmane.org<mailto:dontariq-***@public.gmane.org>> wrote:
Hello Irfan,

Sorry for being unresponsive. Got stuck with some imp work.

HDFS webUI doesn't provide us the ability to create file or directory. You can browse HDFS, view files, download files etc. But operation like create, move, copy etc are not supported.

These values look fine to me.

One suggestion though. Try getting a Linux machine(if possible). Or at least use a VM. I personally feel that using Hadoop on windows is always messy.

Warm Regards,
Tariq
cloudfront.blogspot.com<http://cloudfront.blogspot.com>

On Tue, Aug 6, 2013 at 5:09 PM, Irfan Sayed <irfu.sayed-***@public.gmane.org<mailto:irfu.sayed-***@public.gmane.org>> wrote:
thanks.
when i browse the file system , i am getting following :
i haven't seen any make directory option there

i need to create it from command line ?
further, in the hdfs-site.xml file , i have given following entries. are they correct ?

<property>
<name>dfs.data.dir</name>
<value>c:\\wksp</value>
</property>
<property>
<name>dfs.name.dir</name>
<value>c:\\wksp</value>
</property>

please suggest


[Inline image 1]
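
Following Tariq's advice above, the two properties should point at two different directories rather than sharing c:\\wksp; the names below match the wksp_name/wksp_data directories that get created elsewhere in the thread:

<property>
  <name>dfs.name.dir</name>
  <value>c:\\wksp_name</value>
</property>
<property>
  <name>dfs.data.dir</name>
  <value>c:\\wksp_data</value>
</property>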

On Tue, Aug 6, 2013 at 12:40 PM, manish dunani <manishd207-***@public.gmane.org<mailto:manishd207-***@public.gmane.org>> wrote:
You are wrong at this:

***@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
$ ./hadoop dfs -copyFromLocal /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
copyFromLocal: File /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.

***@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
$ ./hadoop dfs -copyFromLocal /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
copyFromLocal: File /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.

Because you had written both the paths as local, and you do not need to copy hadoop into hdfs... Hadoop is already working.

Just check in the browser after starting your single node cluster :

localhost:50070

then follow the "browse the filesystem" link in it..

If there is no directory then make a directory there.
That is your hdfs directory.
Then copy any text file there (no need to copy hadoop there), because you are going to do the processing on the data in that text file. That's what hadoop is used for; first you need to make it clear in your mind, and then you will do it... otherwise it is not possible..

Try this:

***@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
$ ./bin/hadoop dfs -copyFromLocal /full/local/path/to/ur/file /hdfs/directory/path
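
A concrete run from the same prompt (sample.txt stands in for any local text file):

$ ./bin/hadoop dfs -copyFromLocal /cygdrive/c/Users/Administrator/Desktop/sample.txt /wksp
$ ./bin/hadoop dfs -ls /wksp

The -ls output should then list sample.txt inside the /wksp HDFS directory.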




On Tue, Aug 6, 2013 at 11:49 AM, Irfan Sayed <irfu.sayed-***@public.gmane.org<mailto:irfu.sayed-***@public.gmane.org>> wrote:
thanks. yes , i am newbie.
however, i need windows setup.

let me surely refer the doc and link which u sent but i need this to be working ...
can you please help

regards






--
MANISH DUNANI
-THANX
+91 9426881954, +91 8460656443
manishd207-***@public.gmane.org<mailto:manishd207-***@public.gmane.org>














--
Regards
Manish Dunani
Contact No : +91 9408329137
skype id : manish.dunani









Irfan Sayed
2013-09-06 06:12:56 UTC
Permalink
thanks.
I installed the latest Java in the c:\java folder, and there is now no error in the log
file related to Java.
However, it now throws an error about not having the cluster properties file,
even though I am running the HDP installer from the location where this file exists.
It still throws the error.

please find the attached

[image: Inline image 1]

regards
irfan
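
One thing worth checking, as a guess: the msi's custom actions may not use the msi's own folder as their working directory, so a relative clusterproperties.txt can go unfound even when it sits next to the installer. Passing an absolute path on the command line rules that out:

msiexec /i "hdp-1.3.0.0.winpkg.msi" HDP_LAYOUT="c:\hdp\clusterproperties.txt"

(The msi file name and the c:\hdp path are placeholders, as before.)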



On Fri, Sep 6, 2013 at 11:12 AM, Ravi Mummulla (BIG DATA) <
Here's your issue (from the logs you attached earlier):

CAQuietExec: Checking JAVA_HOME is set correctly...
CAQuietExec: Files\Java\jdk1.6.0_31 was unexpected at this time.

It seems that you installed the Java prerequisite in the default path, which
is %PROGRAMFILES% (expands to C:\Program Files in your case). HDP 1.3 does
not like spaces in paths, so you need to reinstall Java under c:\java\ or
something similar (in a path with no spaces).
** **
*Sent:* Thursday, September 5, 2013 8:42 PM
*Subject:* Re: about replication****
** **
please find the attached.****
i don't have "c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log"
as it is not generated ****
** **
regards****
irfan****
** **
** **
** **
** **
** **
wrote:****
Could you share the log files ( c:\hdp.log,
c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log ) as
well as your clusterproperties.txt ?****
** **
Thanks, ****
Olivier****
** **
thanks. i followed the user manual for deployment and installed all
pre-requisites ****
i modified the command and still the issue persist. please suggest ****
** **
please refer below ****
** **
** **
[image: Inline image 1]****
** **
regards****
irfan ****
** **
** **
wrote:****
The command to install it is msiexec /i msifile /... ****
You will find the correct syntax as part of doc. ****
Happy reading
Olivier ****
thanks. ****
i referred the logs and manuals. i modified the clusterproperties file and
then double click on the msi file ****
however, it still failed.****
further i started the installation on command line by giving
HDP_LAYOUT=clusterproperties file path, ****
installation went ahead and it failed for .NET framework 4.0 and VC++
redistributable package dependency ****
** **
i installed both and started again the installation. ****
failed again with following error ****
[image: Inline image 1]****
** **
when i search for the logs mentioned in the error , i never found that ***
*
please suggest ****
** **
regards****
irfan****
** **
** **
wrote:****
Correct, you need to define the cluster configuration as part of a file.
You will find some information on the configuration file as part of the
documentation. ****
http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
****
You should make sure to have also installed the pre requisite. ****
Thanks
Olivier ****
thanks. sorry for the long break. actually got involved in some other
priorities****
i downloaded the installer and while installing i got following error ****
** **
[image: Inline image 1]****
** **
do i need to make any configuration prior to installation ??****
** **
regards****
irfan ****
** **
** **
wrote:****
Here is the link ****
http://download.hortonworks.com/products/hdp-windows/****
Olivier ****
thanks.****
i just followed the instructions to setup the pseudo distributed setup
http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
****
****
i don't think so i am running DN on both machine ****
please find the attached log****
** **
hi olivier ****
** **
can you please give me download link ?****
let me try please ****
** **
regards****
irfan ****
** **
** **
** **
wrote:****
Are you running DN on both the machines? Could you please show me your
DN logs?****
** **
Also, consider Oliver's suggestion. It's definitely a better option.****
** **
** **
****
Warm Regards,****
Tariq****
cloudfront.blogspot.com****
** **
On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
Irfu, ****
If you want to quickly get Hadoop running on windows platform. You may
want to try our distribution for Windows. You will be able to find the msi
on our website. ****
Regards
Olivier ****
thanks. ****
ok. i think i need to change the plan over here ****
let me create two environments. 1: totally windows 2: totally Unix****
** **
because, on windows , anyway i have to try and see how hadoop works ****
on UNIX, it is already known that , it is working fine. ****
** **
so, on windows , here is the setup:****
** **
namenode : windows 2012 R2 ****
datanode : windows 2012 R2 ****
** **
now, the exact problem is :****
1: datanode is not getting started ****
2: replication : if i put any file/folder on any datanode , it should get
replicated to all another available datanodes ****
** **
regards****
** **
** **
** **
** **
** **
** **
** **
** **
wrote:****
Seriously??You are planning to develop something using Hadoop on
windows. Not a good idea. Anyways, cold you plz show me your log files?I
also need some additional info :****
-The exact problem which you are facing right now****
-Your cluster summary(no. of nodes etc)****
-Your latest configuration files****
-Your /etc.hosts file****
****
Warm Regards,****
Tariq****
cloudfront.blogspot.com****
** **
****
ok. thanks****
now, i need to start with all windows setup first as our product will be
based on windows ****
so, now, please tell me how to resolve the issue ****
** **
datanode is not starting . please suggest ****
** **
regards,****
irfan ****
** **
** **
wrote:****
It is possible. Theoretically Hadoop doesn't stop you from doing that.
But it is not a very wise setup.****
****
Warm Regards,****
Tariq****
cloudfront.blogspot.com****
** **
****
please suggest****
** **
regards****
irfan****
** **
** **
wrote:****
thanks.****
can i have setup like this :****
namenode will be on linux (flavour may be RHEL, CentOS, UBuntu etc)****
and datanodes are the combination of any OS (windows , linux , unix etc )*
***
** **
however, my doubt is, as the file systems of both the systems (win and
linux ) are different , datanodes of these systems can not be part of
single cluster . i have to make windows cluster separate and UNIX cluster
separate ?****
** **
regards****
** **
** **
wrote:****
I just noticed you are on Cygwin. IIRC Windows PIDs are not the same as
Cygwin PIDs so that may be causing the discrepancy. I don't know how well
Hadoop works in Cygwin as I have never tried it. Work is in progress for
native Windows support however there are no official releases with Windows
support yet. It may be easier to get familiar with a release<https://www.apache.org/dyn/closer.cgi/hadoop/common/>on Linux if you are new to it.
****
****
wrote:****
thanks ****
here is what i did .****
i stopped all the namenodes and datanodes using ./stop-dfs.sh command ****
then deleted all pid files for namenodes and datanodes ****
** **
started dfs again with command : "./start-dfs.sh"****
** **
when i ran the "Jps" command . it shows****
** **
$ ./jps.exe****
4536 Jps****
2076 NameNode****
** **
however, when i open the pid file for namenode then it is not showing pid
as : 4560. on the contrary, it shud show : 2076****
** **
please suggest ****
** **
regards****
** **
** **
wrote:****
Most likely there is a stale pid file. Something like
\tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
the datanode.
I haven't read the entire thread so you may have looked at this already.
-Arpit****
****
****
The datanode is trying to connect to the namenode continuously but fails.

When I try to run the "jps" command, it says:
$ ./jps.exe
4584 NameNode
4016 Jps

and when I ran "./start-dfs.sh", it said:

$ ./start-dfs.sh
namenode running as process 3544. Stop it first.
DFS-1: datanode running as process 4076. Stop it first.
localhost: secondarynamenode running as process 4792. Stop it first.

These two outputs are contradictory.
Please find the attached logs.

Should I attach the conf files as well?

regards
wrote:
Your DN is still not running. Showing me the logs would be helpful.

Warm Regards,
Tariq
cloudfront.blogspot.com
I followed the url and did the steps mentioned in it. I have deployed on
the Windows platform.

Now I am able to browse the url http://localhost:50070 (namenode);
however, I am not able to browse the url http://localhost:50030.

please refer below

[image: Inline image 1]

I have modified all the config files as mentioned and formatted the hdfs
file system as well.
please suggest

regards
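As an aside: 50070 is the NameNode web UI, while 50030 is served by the
JobTracker, so in Hadoop 1.x it only responds once the MapReduce daemons
have been started too. A minimal check, assuming the same bin directory as
the other commands in this thread:

$ ./start-mapred.sh      # starts the JobTracker and TaskTrackers
$ ./jps.exe              # JobTracker should now appear in the list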
http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
Let me follow the url which you gave for the pseudo-distributed setup and
then I will switch to distributed mode.

regards
irfan
wrote:
You are welcome. Which link have you followed for the configuration? Your
*core-site.xml* is empty. Remove the property *fs.default.name* from
*hdfs-site.xml* and add it to *core-site.xml*. Remove *mapred.job.tracker*
as well; it is required in *mapred-site.xml*.

I would suggest you do a pseudo-distributed setup first in order to get
yourself familiar with the process and then proceed to the distributed
mode. You can visit this link
<http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I>
if you need some help. Let me know if you face any issue.

HTH

Warm Regards,
Tariq
cloudfront.blogspot.com
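For illustration, the split would look like this (a sketch; the hostname
and ports are placeholders, not values taken from this thread):

<!-- core-site.xml -->
<property>
  <name>fs.default.name</name>
  <value>hdfs://namenode-host:9000</value>
</property>

<!-- mapred-site.xml -->
<property>
  <name>mapred.job.tracker</name>
  <value>namenode-host:9001</value>
</property>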
thanks tariq for the response.
As discussed last time, I have sent you all the config files in my setup.
Can you please go through them?

please let me know

regards
irfan
wrote:
I'm sorry for being unresponsive. I was out of touch for some time because
of Ramzan and Eid. Resuming work today.

What's the current status?

Warm Regards,
Tariq
cloudfront.blogspot.com
wrote:
First of all, read the concepts. I hope you will like it:
https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf

please suggest

regards
irfan
wrote:
hey Tariq,
I am still stuck.
can you please suggest

regards
irfan

please suggest

regards
attachment got quarantined.
Resending in txt format; please rename it to conf.rar.

regards
thanks.

If I run the jps command on the namenode:

$ ./jps.exe
3164 NameNode
1892 Jps

Same command on the datanode:

$ ./jps.exe
3848 Jps

jps does not list any process for the datanode; however, on the web
browser I can see one live datanode.
Please find the attached conf rar file of the namenode.

regards
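When jps shows no DataNode even though one was started, the datanode log
usually says why it died. A quick look, assuming the default log directory
one level above bin (the path pattern is the Hadoop 1.x default, not taken
from this thread):

$ tail -n 50 ../logs/hadoop-*-datanode-*.log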
OK, we'll start fresh. Could you please show me your latest config files?

BTW, are your daemons running fine? Use jps to verify that.

Warm Regards,
Tariq
cloudfront.blogspot.com
I have created the directories "wksp_data" and "wksp_name" on both the
datanode and the namenode,
made the respective changes in the "hdfs-site.xml" file,
formatted the namenode,
and started the dfs.

But I am still not able to browse the file system through the web browser.
please refer below

anything still missing?
please suggest

[image: Inline image 1]
Do these dirs need to be created on all datanodes and namenodes?
Further, does hdfs-site.xml need to be updated on both the datanodes and
namenodes for these new dirs?

regards
Create 2 directories manually, corresponding to the values of the
dfs.name.dir and dfs.data.dir properties, and change the permissions of
these directories to 755. When you start pushing data into your HDFS, data
will start going inside the directory specified by dfs.data.dir and the
associated metadata will go inside dfs.name.dir. Remember, you store data
in HDFS, but it eventually gets stored in your local/native FS. But you
cannot see this data directly on your local/native FS.

Warm Regards,
Tariq
cloudfront.blogspot.com
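A sketch of those steps on a Linux node, reusing the directory names from
this thread but with illustrative parent paths:

$ mkdir -p /hadoop/wksp_name /hadoop/wksp_data
$ chmod 755 /hadoop/wksp_name /hadoop/wksp_data

and in hdfs-site.xml:

<property>
  <name>dfs.name.dir</name>
  <value>/hadoop/wksp_name</value>
</property>
<property>
  <name>dfs.data.dir</name>
  <value>/hadoop/wksp_data</value>
</property>

followed by a namenode format before the first start:

$ ./hadoop namenode -format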
thanks.
However, I need this to be working on the Windows environment, as it is a
project requirement.
I will add/work on Linux later.

So, at this stage, is c:\\wksp the HDFS file system, OR do I need to
create it from the command line?

please suggest

regards,
Hello Irfan,

Sorry for being unresponsive. Got stuck with some imp work.

The HDFS webUI doesn't provide the ability to create a file or directory.
You can browse HDFS, view files, download files, etc., but operations like
create, move and copy are not supported.

These values look fine to me.

One suggestion though: try getting a Linux machine (if possible), or at
least use a VM. I personally feel that using Hadoop on Windows is always
messy.

Warm Regards,
Tariq
cloudfront.blogspot.com
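In other words, yes: create, move and copy all go through the HDFS shell
rather than the web UI, for example (directory and file names
illustrative):

$ ./hadoop dfs -mkdir /wksp/input
$ ./hadoop dfs -mv /wksp/sample.txt /wksp/input/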
thanks.
When I browse the file system, I am getting the following; I haven't seen
any make-directory option there.

Do I need to create it from the command line?
Further, in the hdfs-site.xml file, I have given the following entries.
Are they correct?

<property>
  <name>dfs.data.dir</name>
  <value>c:\\wksp</value>
</property>
<property>
  <name>dfs.name.dir</name>
  <value>c:\\wksp</value>
</property>

please suggest

[image: Inline image 1]
wrote:
*You are wrong at this:*

$ ./hadoop dfs -copyFromLocal
/cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
copyFromLocal: File
/cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.

$ ./hadoop dfs -copyFromLocal
/cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
copyFromLocal: File
/cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.

Because you wrote both paths as local paths, and you do not need to copy
Hadoop into hdfs; Hadoop is already working.

Just check in the browser, after starting your single-node cluster:

localhost:50070

then follow the "browse the filesystem" link in it.

If there is no directory there, then make a directory.
That is your hdfs directory.
Then copy a text file there (no need to copy Hadoop there), because you
are going to do processing on the data in that text file. That's what
Hadoop is used for; first you need to make that clear in your mind, and
then you will get it working; otherwise it is not possible.

*Try this:*

$ ./bin/hadoop dfs -copyFromLocal /full/local/path/to/ur/file
/hdfs/directory/path
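A filled-in example of that last command (the text file name is
hypothetical; /wksp is the directory used earlier in this thread):

$ ./bin/hadoop dfs -mkdir /wksp/input
$ ./bin/hadoop dfs -copyFromLocal /cygdrive/c/Users/Administrator/Desktop/sample.txt /wksp/input
$ ./bin/hadoop dfs -ls /wksp/input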
thanks. yes, I am a newbie.
However, I need a Windows setup.

Let me surely refer to the doc and link which you sent, but I need this to
be working...
can you please help

regards
--
MANISH DUNANI
-THANX
+91 9426881954, +91 8460656443
--
Regards
*Manish Dunani*
*Contact No*: +91 9408329137
*skype id*: manish.dunani
--
Olivier Renault
Solution Engineer - Big Data - Hortonworks, Inc.
+44 7500 933 036
www.hortonworks.com
Irfan Sayed
2013-09-06 06:46:16 UTC
Permalink
ok.. now I made some changes and the installation went ahead,
but it failed on the property "HIVE_SERVER_HOST" declaration.
In the cluster config file I have commented this property out. If I
uncomment it, then what server address should I give?

I have only a two-Windows-machine setup:
1: one for the namenode and another for the datanode
(see the sketch below)

please suggest

regards
irfan
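For what it's worth, in a two-node layout the host properties in
clusterproperties.txt usually just point at the machines you already have,
so HIVE_SERVER_HOST can be set to the namenode machine if there is no
dedicated Hive host. A sketch with hypothetical hostnames; the exact
property list should be checked against the HDP documentation linked
further down this thread:

#Hosts (hypothetical names; replace with your two machines)
NAMENODE_HOST=namenode-machine
SECONDARY_NAMENODE_HOST=namenode-machine
JOBTRACKER_HOST=namenode-machine
HIVE_SERVER_HOST=namenode-machine
OOZIE_SERVER_HOST=namenode-machine
SLAVE_HOSTS=datanode-machine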
Post by Irfan Sayed
thanks.
I installed the latest Java in the c:\java folder, and now there is no
error in the log file related to Java.
However, now it is throwing an error about not having the cluster
properties file. In fact, I am running/installing hdp from the location
where this file exists; still it is throwing the error.
please find the attached
[image: Inline image 1]
regards
irfan
On Fri, Sep 6, 2013 at 11:12 AM, Ravi Mummulla (BIG DATA) <
Here's your issue (from the logs you attached earlier):

CAQuietExec: Checking JAVA_HOME is set correctly...
CAQuietExec: Files\Java\jdk1.6.0_31 was unexpected at this time.

It seems that you installed the Java prerequisite in the default path,
which is %PROGRAMFILES% (expands to C:\Program Files in your case). HDP 1.3
does not like spaces in paths, so you need to reinstall Java under c:\java\
or something similar (in a path with no spaces).
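The "was unexpected at this time" error is cmd.exe choking on the unquoted
space in the path. After reinstalling the JDK under a space-free path,
JAVA_HOME can be updated like this (a sketch; the JDK folder name depends
on your installer):

setx JAVA_HOME "c:\java\jdk1.6.0_31" /M
REM open a new console, then verify:
echo %JAVA_HOME%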
*Sent:* Thursday, September 5, 2013 8:42 PM
*Subject:* Re: about replication

please find the attached.
I don't have
"c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log"
as it is not generated.

regards
irfan
wrote:
Could you share the log files (c:\hdp.log,
c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log) as
well as your clusterproperties.txt?

Thanks,
Olivier
thanks. I followed the user manual for deployment and installed all the
pre-requisites.
I modified the command, and still the issue persists. please suggest

please refer below

[image: Inline image 1]

regards
irfan
wrote:
The command to install it is msiexec /i msifile /...
You will find the correct syntax as part of the doc.
Happy reading
Olivier
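For the general shape of that command line (a sketch only: the msi file
name and paths are hypothetical, /i and /lv are standard Windows Installer
switches, and HDP_LAYOUT is the property mentioned below; check the exact
syntax against the HDP documentation):

msiexec /i "hdp-1.3.0.0.winpkg.msi" /lv "C:\hdp\hdp.log" HDP_LAYOUT="C:\hdp\clusterproperties.txt"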
thanks.
I referred to the logs and manuals. I modified the clusterproperties file
and then double-clicked on the msi file;
however, it still failed.
Further, I started the installation on the command line, giving
HDP_LAYOUT=<path to the clusterproperties file>;
the installation went ahead and then failed on the .NET framework 4.0 and
VC++ redistributable package dependencies.

I installed both and started the installation again.
It failed again with the following error:
[image: Inline image 1]

When I searched for the logs mentioned in the error, I never found them.
please suggest

regards
irfan
On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <
Correct, you need to define the cluster configuration as part of a file.
You will find some information on the configuration file as part of the
documentation.
http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
You should also make sure to have installed the pre-requisites.
Thanks
Olivier
thanks. sorry for the long break; I actually got involved in some other
priorities.
I downloaded the installer, and while installing I got the following
error:

[image: Inline image 1]

Do I need to make any configuration prior to installation?

regards
irfan
On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <
Here is the link
http://download.hortonworks.com/products/hdp-windows/
Olivier
thanks.
I just followed the instructions to set up the pseudo-distributed setup:
http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I

I don't think I am running a DN on both machines.
Please find the attached log.

hi olivier

can you please give me the download link?
let me try please

regards
irfan
wrote:
Are you running a DN on both the machines? Could you please show me your
DN logs?

Also, consider Olivier's suggestion. It's definitely a better option.

Warm Regards,
Tariq
cloudfront.blogspot.com
Irfan Sayed
2013-09-07 11:26:56 UTC
Permalink
please suggest

regards
irfan
Irfan Sayed
2013-09-10 06:02:57 UTC
Permalink
please suggest

regards
irfan
Post by Irfan Sayed
please suggest
regards
irfan
Post by Irfan Sayed
ok.. now i made some changes and installation went ahead
but failed in property "HIVE_SERVER_HOST" declaration
in cluster config file, i have commented this property. if i uncomment ,
then what server address will give ???
i have only two windows machines setup.
1: for namenode and another for datanode
please suggest
regards
irfan
Olivier Renault
2013-09-10 11:39:19 UTC
Permalink
Your clusterproperties.txt should look something like:

#Log directory
HDP_LOG_DIR=c:\hadoop\logs

#Data directory
HDP_DATA_DIR=c:\hdp\data

#Hosts
NAMENODE_HOST=yourmaster.fqdn.com
JOBTRACKER_HOST=yourmaster.fqdn.com
HIVE_SERVER_HOST=yourmaster.fqdn.com
OOZIE_SERVER_HOST=yourmaster.fqdn.com
TEMPLETON_HOST=yourmaster.fqdn.com
SLAVE_HOSTS=yourmaster.fqdn.com,yourslave.fqdn.com

#Database host
DB_FLAVOR=derby
DB_HOSTNAME=yourmaster.fqdn.com


#Hive properties
HIVE_DB_NAME=hive
HIVE_DB_USERNAME=hive
HIVE_DB_PASSWORD=hive

#Oozie properties
OOZIE_DB_NAME=oozie
OOZIE_DB_USERNAME=oozie
OOZIE_DB_PASSWORD=oozie

You will need to replace yourmaster.fqdn.com and yourslave.fqdn.com with your
servers' names. For the time being, I suggest that you do not install HBase or
Oozie.
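
Once the properties file is ready, run the installer from an elevated command
prompt rather than double-clicking the msi. A rough sketch only: HDP_LAYOUT
must point at your cluster properties file, while the msi file name, the log
path and the HDP_DIR / DESTROY_DATA properties below are illustrative, so
check them against the installation doc for your HDP version:

rem run from the folder that contains the HDP msi
msiexec /qn /i "hdp-1.3.0.0.winpkg.msi" /lv "c:\hdp\hdp.msi.log" HDP_LAYOUT="c:\hdp\clusterproperties.txt" HDP_DIR="c:\hdp\hadoop" DESTROY_DATA="no"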

regards,
Olivier
Post by Irfan Sayed
please suggest
regards
irfan
Irfan Sayed
2013-09-11 09:26:06 UTC
Permalink
i do not have any dedicated HIVE server host, so what should i put over here?
if i comment the property out, the installer throws an error for the commented property.
can i put the fqdn of the namenode as the HIVE server host?

will that really be a working configuration?
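i mean something like the following line, where dfs-dc.mydomain.com is just a
placeholder for the namenode's fqdn, not a real machine:

HIVE_SERVER_HOST=dfs-dc.mydomain.com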

please suggest

regards
irfan



On Tue, Sep 10, 2013 at 5:09 PM, Olivier Renault
Post by Olivier Renault
#Log directory
HDP_LOG_DIR=c:\hadoop\logs
#Data directory
HDP_DATA_DIR=c:\hdp\data
#Hosts
NAMENODE_HOST=yourmaster.fqdn.com
JOBTRACKER_HOST=yourmaster.fqdn.com
HIVE_SERVER_HOST=yourmaster.fqdn.com
OOZIE_SERVER_HOST=yourmaster.fqdn.com
TEMPLETON_HOST=yourmaster.fqdn.com
SLAVE_HOSTS=yourmaster.fqdn.com,yourslave.fqdn.com
#Database host
DB_FLAVOR=derby
DB_HOSTNAME=yourmaster.fqdn.com
#Hive properties
HIVE_DB_NAME=hive
HIVE_DB_USERNAME=hive
HIVE_DB_PASSWORD=hive
#Oozie properties
OOZIE_DB_NAME=oozie
OOZIE_DB_USERNAME=oozie
OOZIE_DB_PASSWORD=oozie
You will need to replace yourmaster.fqdn.com and yourslave.fqdn.com with your
servers' names. For the time being, I suggest that you do not install HBase or
Oozie.
regards,
Olivier
Post by Irfan Sayed
please suggest
regards
irfan
Post by Irfan Sayed
please suggest
regards
irfan
Post by Irfan Sayed
ok.. now i made some changes and installation went ahead
but failed in property "HIVE_SERVER_HOST" declaration
in cluster config file, i have commented this property. if i uncomment
, then what server address will give ???
i have only two windows machines setup.
1: for namenode and another for datanode
please suggest
regards
irfan
Post by Irfan Sayed
thanks.
i installed the latest java in c:\java folder and now no error in log
file related to java
however, now it is throwing error on not having cluster properties file.
in fact i am running/installing hdp from the location where this file
exist . still it is throwing error
please find the attached
[image: Inline image 1]
regards
irfan
On Fri, Sep 6, 2013 at 11:12 AM, Ravi Mummulla (BIG DATA) <
HereÂ’s your issue (from the logs you attached earlier):****
** **
CAQuietExec: Checking JAVA_HOME is set correctly...****
CAQuietExec: Files\Java\jdk1.6.0_31 was unexpected at this time.****
** **
It seems that you installed Java prerequisite in the default path,
which is %PROGRAMFILES% (expands to C:\Program Files in your case). HDP 1.3
does not like spaces in paths, do you need to reinstall Java under c:\java\
or something similar (in a path with no spaces).****
** **
*Sent:* Thursday, September 5, 2013 8:42 PM
*Subject:* Re: about replication****
** **
please find the attached.****
i don't have "c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log"
as it is not generated ****
** **
regards****
irfan****
** **
** **
** **
** **
** **
On Thu, Sep 5, 2013 at 6:09 PM, Olivier Renault <
Could you share the log files ( c:\hdp.log,
c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log ) as
well as your clusterproperties.txt ?****
** **
Thanks, ****
Olivier****
** **
***
thanks. i followed the user manual for deployment and installed all
pre-requisites ****
i modified the command and still the issue persist. please suggest **
**
** **
please refer below ****
** **
** **
[image: Inline image 1]****
** **
regards****
irfan ****
** **
** **
On Wed, Sep 4, 2013 at 5:13 PM, Olivier Renault <
The command to install it is msiexec /i msifile /... ****
You will find the correct syntax as part of doc. ****
Happy reading
Olivier ****
thanks. ****
i referred the logs and manuals. i modified the clusterproperties
file and then double click on the msi file ****
however, it still failed.****
further i started the installation on command line by giving
HDP_LAYOUT=clusterproperties file path, ****
installation went ahead and it failed for .NET framework 4.0 and VC++
redistributable package dependency ****
** **
i installed both and started again the installation. ****
failed again with following error ****
[image: Inline image 1]****
** **
when i search for the logs mentioned in the error , i never found
that ****
please suggest ****
** **
regards****
irfan****
** **
** **
On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <
Correct, you need to define the cluster configuration as part of a
file. You will find some information on the configuration file as part of
the documentation. ****
http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
****
You should make sure to have also installed the pre requisite. ****
Thanks
Olivier ****
thanks. sorry for the long break. actually got involved in some
other priorities****
i downloaded the installer and while installing i got following error
****
** **
[image: Inline image 1]****
** **
do i need to make any configuration prior to installation ??****
** **
regards****
irfan ****
** **
** **
On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <
Here is the link ****
http://download.hortonworks.com/products/hdp-windows/****
Olivier ****
thanks.****
i just followed the instructions to setup the pseudo distributed
http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
****
****
i don't think so i am running DN on both machine ****
please find the attached log****
** **
hi olivier ****
** **
can you please give me download link ?****
let me try please ****
** **
regards****
irfan ****
** **
** **
** **
wrote:****
Are you running DN on both the machines? Could you please show me
your DN logs?****
** **
Also, consider Oliver's suggestion. It's definitely a better option.*
***
** **
** **
****
Warm Regards,****
Tariq****
cloudfront.blogspot.com****
** **
On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
Irfu, ****
If you want to quickly get Hadoop running on windows platform. You
may want to try our distribution for Windows. You will be able to find the
msi on our website. ****
Regards
Olivier ****
thanks. ****
ok. i think i need to change the plan over here ****
let me create two environments. 1: totally windows 2: totally Unix***
*
** **
because, on windows , anyway i have to try and see how hadoop works *
***
on UNIX, it is already known that , it is working fine. ****
** **
so, on windows , here is the setup:****
** **
namenode : windows 2012 R2 ****
datanode : windows 2012 R2 ****
** **
now, the exact problem is :****
1: datanode is not getting started ****
2: replication : if i put any file/folder on any datanode , it should
get replicated to all another available datanodes ****
** **
regards****
** **
** **
** **
** **
** **
** **
** **
** **
wrote:****
Seriously??You are planning to develop something using Hadoop on
windows. Not a good idea. Anyways, cold you plz show me your log files?I
also need some additional info :****
-The exact problem which you are facing right now****
-Your cluster summary(no. of nodes etc)****
-Your latest configuration files****
-Your /etc.hosts file****
****
Warm Regards,****
Tariq****
cloudfront.blogspot.com****
** **
wrote:****
ok. thanks****
now, i need to start with all windows setup first as our product will
be based on windows ****
so, now, please tell me how to resolve the issue ****
** **
datanode is not starting . please suggest ****
** **
regards,****
irfan ****
** **
** **
wrote:****
It is possible. Theoretically Hadoop doesn't stop you from doing
that. But it is not a very wise setup.****
****
Warm Regards,****
Tariq****
cloudfront.blogspot.com****
** **
wrote:****
please suggest****
** **
regards****
irfan****
** **
** **
wrote:****
thanks.****
can i have setup like this :****
namenode will be on linux (flavour may be RHEL, CentOS, UBuntu etc)**
**
and datanodes are the combination of any OS (windows , linux , unix
etc )****
** **
however, my doubt is, as the file systems of both the systems (win
and linux ) are different , datanodes of these systems can not be part of
single cluster . i have to make windows cluster separate and UNIX cluster
separate ?****
** **
regards****
** **
** **
On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
I just noticed you are on Cygwin. IIRC Windows PIDs are not the same
as Cygwin PIDs so that may be causing the discrepancy. I don't know how
well Hadoop works in Cygwin as I have never tried it. Work is in progress
for native Windows support however there are no official releases with
Windows support yet. It may be easier to get familiar with a release<https://www.apache.org/dyn/closer.cgi/hadoop/common/>on Linux if you are new to it.
****
****
wrote:****
thanks ****
here is what i did .****
i stopped all the namenodes and datanodes using ./stop-dfs.sh command
****
then deleted all pid files for namenodes and datanodes ****
** **
started dfs again with command : "./start-dfs.sh"****
** **
when i ran the "Jps" command . it shows****
** **
$ ./jps.exe****
4536 Jps****
2076 NameNode****
** **
however, when i open the pid file for namenode then it is not showing
pid as : 4560. on the contrary, it shud show : 2076****
** **
please suggest ****
** **
regards****
** **
** **
On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <
Most likely there is a stale pid file. Something like
\tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
the datanode.
I haven't read the entire thread so you may have looked at this already.
-Arpit****
****
wrote:****
datanode is trying to connect to namenode continuously but fails ***
*
** **
when i try to run "jps" command it says :****
$ ./jps.exe****
4584 NameNode****
4016 Jps****
** **
and when i ran the "./start-dfs.sh" then it says :****
** **
$ ./start-dfs.sh****
namenode running as process 3544. Stop it first.****
DFS-1: datanode running as process 4076. Stop it first.****
localhost: secondarynamenode running as process 4792. Stop it first.*
***
** **
both these logs are contradictory ****
please find the attached logs ****
** **
should i attach the conf files as well ?****
** **
regards****
****
** **
wrote:****
Your DN is still not running. Showing me the logs would be helpful.*
***
****
Warm Regards,****
Tariq****
cloudfront.blogspot.com****
** **
wrote:****
i followed the url and did the steps mention in that. i have
deployed on the windows platform****
** **
Now, i am able to browse url : http://localhost:50070 (name node )***
*
however, not able to browse url : http://localhost:50030****
** **
please refer below****
** **
[image: Inline image 1]****
** **
i have modified all the config files as mentioned and formatted the
hdfs file system as well ****
please suggest ****
** **
regards****
** **
** **
wrote:****
http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
****
let me follow the url which you gave for pseudo distributed setup and
then will switch to distributed mode****
** **
regards****
irfan ****
** **
** **
wrote:****
You are welcome. Which link have you followed for the
configuration?Your *core-site.xml* is empty. Remove the property *
fs.default.name *from *hdfs-site.xml* and add it to *core-site.xml*.
Remove *mapred.job.tracker* as well. It is required in *
mapred-site.xml*.****
** **
I would suggest you to do a pseudo distributed setup first in order
to get yourself familiar with the process and then proceed to the
distributed mode. You can visit this link<http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I>if you need some help. Let me know if you face any issue.
****
** **
HTH****
****
Warm Regards,****
Tariq****
cloudfront.blogspot.com****
** **
wrote:****
thanks tariq for response. ****
as discussed last time, i have sent you all the config files in my
setup . ****
can you please go through that ?****
** **
please let me know ****
** **
regards****
irfan ****
** **
** **
** **
wrote:****
I'm sorry for being unresponsive. Was out of touch for sometime
because of ramzan and eid. Resuming work today.****
** **
What's the current status?****
****
Warm Regards,****
Tariq****
cloudfront.blogspot.com****
** **
wrote:****
First of all read the concepts ..I hope you will like it..****
https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf****
** **
wrote:****
please suggest ****
** **
regards****
irfan ****
** **
** **
wrote:****
hey Tariq,****
i am still stuck .. ****
can you please suggest ****
** **
regards****
irfan ****
** **
** **
wrote:****
please suggest ****
** **
regards****
** **
** **
wrote:****
attachment got quarantined ****
resending in txt format. please rename it to conf.rar ****
** **
regards****
** **
** **
wrote:****
thanks.****
** **
if i run the jps command on namenode :****
** **
$ ./jps.exe****
3164 NameNode****
1892 Jps****
** **
same command on datanode :****
** **
$ ./jps.exe****
3848 Jps****
** **
jps does not list any process for datanode. however, on web browser i
can see one live data node ****
please find the attached conf rar file of namenode ****
** **
regards****
** **
** **
wrote:****
OK. we'll start fresh. Could you plz show me your latest config
files?****
** **
BTW, are your daemons running fine?Use JPS to verify that.****
****
Warm Regards,****
Tariq****
cloudfront.blogspot.com****
** **
wrote:****
i have created these dir "wksp_data" and "wksp_name" on both
datanode and namenode ****
made the respective changes in "hdfs-site.xml" file ****
formatted the namenode ****
started the dfs ****
** **
but still, not able to browse the file system through web browser ***
*
please refer below ****
** **
anything still missing ?****
please suggest ****
** **
[image: Inline image 1]****
** **
wrote:****
these dir needs to be created on all datanodes and namenodes ?****
further, hdfs-site.xml needs to be updated on both datanodes and
namenodes for these new dir?****
** **
regards****
** **
** **
wrote:****
Create 2 directories manually corresponding to the values of
dfs.name.dir and dfs.data.dir properties and change the permissions of
these directories to 755. When you start pushing data into your HDFS, data
will start going inside the directory specified by dfs.data.dir and the
associated metadata will go inside dfs.name.dir. Remember, you store data
in HDFS, but it eventually gets stored in your local/native FS. But you
cannot see this data directly on your local/native FS.****
****
Warm Regards,****
Tariq****
cloudfront.blogspot.com****
** **
wrote:****
thanks. ****
however, i need this to be working on windows environment as project
requirement.****
i will add/work on Linux later ****
** **
so, now , at this stage , c:\\wksp is the HDFS file system OR do i
need to create it from command line ?****
** **
please suggest****
** **
regards,****
** **
** **
wrote:****
Hello Irfan,****
** **
Sorry for being unresponsive. Got stuck with some imp work.****
** **
HDFS webUI doesn't provide us the ability to create file or
directory. You can browse HDFS, view files, download files etc. But
operation like create, move, copy etc are not supported.****
** **
These values look fine to me.****
** **
One suggestion though. Try getting a Linux machine(if possible). Or
at least use a VM. I personally feel that using Hadoop on windows is always
messy.****
****
Warm Regards,****
Tariq****
cloudfront.blogspot.com****
** **
wrote:****
thanks.****
when i browse the file system , i am getting following :****
i haven't seen any make directory option there ****
** **
i need to create it from command line ?****
further, in the hdfs-site.xml file , i have given following entries.
are they correct ? ****
** **
<property>****
<name>dfs.data.dir</name>****
<value>c:\\wksp</value>****
</property>****
<property>****
<name>dfs.name.dir</name>****
<value>c:\\wksp</value>****
</property>****
** **
please suggest ****
** **
** **
[image: Inline image 1]****
** **
wrote:****
*You are wrong at this:*****
** **
$ ./hadoop dfs -copyFromLocal
/cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp****
copyFromLocal: File
/cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
****
** **
$ ./hadoop dfs -copyFromLocal
/cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp****
copyFromLocal: File
/cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
****
** **
Because,You had wrote both the paths local and You need not to copy
hadoop into hdfs...Hadoop is already working..****
** **
Just check out in browser by after starting ur single node cluster :*
***
** **
localhost:50070****
** **
then go for browse the filesystem link in it..****
** **
If there is no directory then make directory there.****
That is your hdfs directory.****
Then copy any text file there(no need to copy hadoop there).beacause
u are going to do processing on that data in text file.That's why hadoop is
used for ,first u need to make it clear in ur mind.Then and then u will do
it...otherwise not possible..****
** **
*Try this: *****
** **
$ .bin/hadoop dfs -copyFromLocal /full/local/path/to/ur/file
/hdfs/directory/path****
** **
** **
** **
** **
wrote:****
thanks. yes , i am newbie.****
however, i need windows setup.****
** **
let me surely refer the doc and link which u sent but i need this to
be working ...****
can you please help****
** **
regards****
** **
****
** **
****
** **
--
MANISH DUNANI
-THANX
+91 9426881954,+91 8460656443****
** **
** **
** **
** **
** **
** **
** **
** **
** **
** **
** **
** **
****
-- ****
Regards****
*Manish Dunani*****
*Contact No* : +91 9408329137****
*skype id* : manish.dunani****
** **
** **
** **
** **
** **
** **
** **
** **
** **
--
Olivier Renault
Solution Engineer - Big Data - Hortonworks, Inc.
+44 7500 933 036
www.hortonworks.com
Olivier Renault
2013-09-11 09:46:40 UTC
Permalink
You can put the same FQDN as your NameNode for example.

Thanks
Olivier
Post by Irfan Sayed
i do not have any HIVE server host, then what should i put over here ??
if i comment it out, then i guess it throws an error about that
can i put the fqdn of namenode for HIVE server host ?
will it be a really working configuration ?
please suggest
regards
irfan
Post by Olivier Renault
#Log directory
HDP_LOG_DIR=c:\hadoop\logs
#Data directory
HDP_DATA_DIR=c:\hdp\data
#Hosts
NAMENODE_HOST=yourmaster.fqdn.com
JOBTRACKER_HOST=yourmaster.fqdn.com
HIVE_SERVER_HOST=yourmaster.fqdn.com
OOZIE_SERVER_HOST=yourmaster.fqdn.com
TEMPLETON_HOST=yourmaster.fqdn.com
SLAVE_HOSTS=yourmaster.fqdn.com,youslave.fqdn.com
#Database host
DB_FLAVOR=derby
DB_HOSTNAME=yourmaster.fqdn.com
#Hive properties
HIVE_DB_NAME=hive
HIVE_DB_USERNAME=hive
HIVE_DB_PASSWORD=hive
#Oozie properties
OOZIE_DB_NAME=oozie
OOZIE_DB_USERNAME=oozie
OOZIE_DB_PASSWORD=oozie
You will need to replace yourmaster.fqdn.com and yourslave.fqdn.com with
your servers' names. For the time being, I suggest that you do not install
HBase or Oozie.
regards,
Olivier
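As a concrete sketch for the two-machine setup described in this thread
(DFS-DC as the master and DFS-1 as the slave; the domain suffix is an
assumption), the host section might look like:

NAMENODE_HOST=DFS-DC.mydomain.com
JOBTRACKER_HOST=DFS-DC.mydomain.com
HIVE_SERVER_HOST=DFS-DC.mydomain.com
OOZIE_SERVER_HOST=DFS-DC.mydomain.com
TEMPLETON_HOST=DFS-DC.mydomain.com
SLAVE_HOSTS=DFS-DC.mydomain.com,DFS-1.mydomain.com
DB_HOSTNAME=DFS-DC.mydomain.com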
Post by Irfan Sayed
please suggest
regards
irfan
Post by Irfan Sayed
please suggest
regards
irfan
Post by Irfan Sayed
ok.. now i made some changes and installation went ahead
but failed in property "HIVE_SERVER_HOST" declaration
in cluster config file, i have commented this property. if i uncomment
it, then what server address should i give ???
i have only two windows machines setup.
1: for namenode and another for datanode
please suggest
regards
irfan
Post by Irfan Sayed
thanks.
i installed the latest java in c:\java folder and now no error in log
file related to java
however, now it is throwing error on not having cluster properties file.
in fact i am running/installing hdp from the location where this file
exists . still it is throwing error
please find the attached
[image: Inline image 1]
regards
irfan
On Fri, Sep 6, 2013 at 11:12 AM, Ravi Mummulla (BIG DATA) <
Here's your issue (from the logs you attached earlier):

CAQuietExec: Checking JAVA_HOME is set correctly...
CAQuietExec: Files\Java\jdk1.6.0_31 was unexpected at this time.

It seems that you installed the Java prerequisite in the default path,
which is %PROGRAMFILES% (expands to C:\Program Files in your case). HDP 1.3
does not like spaces in paths, so you need to reinstall Java under c:\java\
or something similar (in a path with no spaces).
Sent: Thursday, September 5, 2013 8:42 PM
Subject: Re: about replication

please find the attached.
i don't have "c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log"
as it is not generated

regards
irfan

On Thu, Sep 5, 2013 at 6:09 PM, Olivier Renault <
Could you share the log files ( c:\hdp.log,
c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log ) as
well as your clusterproperties.txt ?

Thanks,
Olivier

thanks. i followed the user manual for deployment and installed
all pre-requisites
i modified the command and still the issue persists. please suggest

please refer below

[image: Inline image 1]

regards
irfan

On Wed, Sep 4, 2013 at 5:13 PM, Olivier Renault <
The command to install it is msiexec /i msifile /...
You will find the correct syntax as part of the doc.
Happy reading
Olivier
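A minimal sketch of such a command line, assuming the msi and the cluster
properties file both sit in C:\hdp (the exact switches and msi file name
should be checked against the HDP documentation):

C:\hdp> msiexec /i "hdp-1.3.0.0.winpkg.msi" /lv "hdp.log" HDP_LAYOUT="C:\hdp\clusterproperties.txt"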
thanks.
i referred the logs and manuals. i modified the clusterproperties
file and then double clicked on the msi file
however, it still failed.
further i started the installation on command line by giving
HDP_LAYOUT=clusterproperties file path,
installation went ahead and it failed for .NET framework 4.0 and
VC++ redistributable package dependency

i installed both and started again the installation.
failed again with following error
[image: Inline image 1]

when i search for the logs mentioned in the error , i never found
them
please suggest

regards
irfan

On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <
Correct, you need to define the cluster configuration as part of a
file. You will find some information on the configuration file as part of
the documentation.
http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
You should make sure to have also installed the pre-requisites.
Thanks
Olivier

thanks. sorry for the long break. actually got involved in some
other priorities
i downloaded the installer and while installing i got following
error

[image: Inline image 1]

do i need to make any configuration prior to installation ??

regards
irfan

On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <
Here is the link
http://download.hortonworks.com/products/hdp-windows/
Olivier
thanks.
i just followed the instructions to setup the pseudo distributed mode :
http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I

i don't think i am running DN on both machines
please find the attached log

hi olivier

can you please give me download link ?
let me try please

regards
irfan
Are you running DN on both the machines? Could you please show me
your DN logs?

Also, consider Olivier's suggestion. It's definitely a better option.

Warm Regards,
Tariq
cloudfront.blogspot.com

On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
Irfu,
If you want to quickly get Hadoop running on the windows platform, you
may want to try our distribution for Windows. You will be able to find the
msi on our website.
Regards
Olivier
thanks.
ok. i think i need to change the plan over here
let me create two environments. 1: totally windows 2: totally Unix

because, on windows , anyway i have to try and see how hadoop works
on UNIX, it is already known that it is working fine.

so, on windows , here is the setup:

namenode : windows 2012 R2
datanode : windows 2012 R2

now, the exact problem is :
1: datanode is not getting started
2: replication : if i put any file/folder on any datanode , it
should get replicated to all other available datanodes

regards
Seriously?? You are planning to develop something using Hadoop on
windows. Not a good idea. Anyways, could you plz show me your log files? I
also need some additional info :
-The exact problem which you are facing right now
-Your cluster summary (no. of nodes etc)
-Your latest configuration files
-Your /etc/hosts file

Warm Regards,
Tariq
cloudfront.blogspot.com

ok. thanks
now, i need to start with all windows setup first as our product
will be based on windows
so, now, please tell me how to resolve the issue

datanode is not starting . please suggest

regards,
irfan

It is possible. Theoretically Hadoop doesn't stop you from doing
that. But it is not a very wise setup.

Warm Regards,
Tariq
cloudfront.blogspot.com
please suggest

regards
irfan

thanks.
can i have setup like this :
namenode will be on linux (flavour may be RHEL, CentOS, Ubuntu etc)
and datanodes are the combination of any OS (windows , linux , unix
etc )

however, my doubt is, as the file systems of both the systems (win
and linux ) are different , datanodes of these systems can not be part of a
single cluster . do i have to make the windows cluster separate and the UNIX
cluster separate ?

regards

On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
I just noticed you are on Cygwin. IIRC Windows PIDs are not the same
as Cygwin PIDs so that may be causing the discrepancy. I don't know how
well Hadoop works in Cygwin as I have never tried it. Work is in progress
for native Windows support however there are no official releases with
Windows support yet. It may be easier to get familiar with a release
<https://www.apache.org/dyn/closer.cgi/hadoop/common/> on Linux if you
are new to it.
thanks
here is what i did .
i stopped all the namenodes and datanodes using the ./stop-dfs.sh
command
then deleted all pid files for namenodes and datanodes

started dfs again with command : "./start-dfs.sh"

when i ran the "jps" command , it shows

$ ./jps.exe
4536 Jps
2076 NameNode

however, when i open the pid file for the namenode, it is showing the
pid as : 4560. on the contrary, it shud show : 2076

please suggest

regards

On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <
Most likely there is a stale pid file. Something like
\tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
the datanode.
I haven't read the entire thread so you may have looked at this already.
-Arpit
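A minimal sketch of that cleanup, assuming the default pid location of /tmp
(the files may live elsewhere if HADOOP_PID_DIR is set):

$ ./stop-dfs.sh
$ rm -f /tmp/hadoop-*-namenode.pid /tmp/hadoop-*-datanode.pid
$ ./start-dfs.sh
$ ./jps.exe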
datanode is trying to connect to namenode continuously but fails

when i try to run "jps" command it says :
$ ./jps.exe
4584 NameNode
4016 Jps

and when i ran the "./start-dfs.sh" then it says :

$ ./start-dfs.sh
namenode running as process 3544. Stop it first.
DFS-1: datanode running as process 4076. Stop it first.
localhost: secondarynamenode running as process 4792. Stop it first.

both these logs are contradictory
please find the attached logs

should i attach the conf files as well ?

regards

Your DN is still not running. Showing me the logs would be helpful.

Warm Regards,
Tariq
cloudfront.blogspot.com
i followed the url and did the steps mentioned in it. i have
deployed on the windows platform

Now, i am able to browse url : http://localhost:50070 (name node )
however, not able to browse url : http://localhost:50030

please refer below

[image: Inline image 1]

i have modified all the config files as mentioned and formatted the
hdfs file system as well
please suggest

regards

http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html

let me follow the url which you gave for pseudo distributed setup
and then will switch to distributed mode

regards
irfan
You are welcome. Which link have you followed for the
configuration? Your core-site.xml is empty. Remove the property
fs.default.name from hdfs-site.xml and add it to core-site.xml.
Remove mapred.job.tracker as well. It is required in
mapred-site.xml.

I would suggest you do a pseudo distributed setup first in order
to get yourself familiar with the process and then proceed to the
distributed mode. You can visit this link
<http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I>
if you need some help. Let me know if you face any issue.

HTH

Warm Regards,
Tariq
cloudfront.blogspot.com
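Following that advice, the two files might end up looking like this (a
sketch only; the host and ports are assumptions for a single-machine
setup):

<!-- core-site.xml -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

<!-- mapred-site.xml -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>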
thanks tariq for response.
as discussed last time, i have sent you all the config files in my
setup .
can you please go through that ?

please let me know

regards
irfan

I'm sorry for being unresponsive. Was out of touch for sometime
because of ramzan and eid. Resuming work today.

What's the current status?

Warm Regards,
Tariq
cloudfront.blogspot.com

First of all read the concepts .. I hope you will like it..
https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf
please suggest

regards
irfan

hey Tariq,
i am still stuck ..
can you please suggest

regards
irfan

please suggest

regards

attachment got quarantined
resending in txt format. please rename it to conf.rar

regards

thanks.

if i run the jps command on namenode :

$ ./jps.exe
3164 NameNode
1892 Jps

same command on datanode :

$ ./jps.exe
3848 Jps

jps does not list any process for datanode. however, on web browser
i can see one live data node
please find the attached conf rar file of namenode

regards
OK. we'll start fresh. Could you plz show me your latest config
files?

BTW, are your daemons running fine? Use JPS to verify that.

Warm Regards,
Tariq
cloudfront.blogspot.com

i have created these dirs "wksp_data" and "wksp_name" on both
datanode and namenode
made the respective changes in the "hdfs-site.xml" file
formatted the namenode
started the dfs

but still, not able to browse the file system through web browser
please refer below

anything still missing ?
please suggest

[image: Inline image 1]

these dirs need to be created on all datanodes and namenodes ?
further, hdfs-site.xml needs to be updated on both datanodes and
namenodes for these new dirs?

regards
Create 2 directories manually corresponding to the values of the
dfs.name.dir and dfs.data.dir properties and change the permissions of
these directories to 755. When you start pushing data into your HDFS, data
will start going inside the directory specified by dfs.data.dir and the
associated metadata will go inside dfs.name.dir. Remember, you store data
in HDFS, but it eventually gets stored in your local/native FS. But you
cannot see this data directly on your local/native FS.

Warm Regards,
Tariq
cloudfront.blogspot.com
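In the Cygwin setup used in this thread, that might look like the
following (a sketch; the paths assume the wksp_name/wksp_data directories
mentioned above live directly under c:\):

$ mkdir -p /cygdrive/c/wksp_name /cygdrive/c/wksp_data
$ chmod 755 /cygdrive/c/wksp_name /cygdrive/c/wksp_data

dfs.name.dir and dfs.data.dir in hdfs-site.xml would then point at
c:\wksp_name and c:\wksp_data respectively, after which the namenode is
formatted and dfs restarted.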
Irfan Sayed
2013-09-12 06:30:39 UTC
Permalink
thanks.
finally it got installed :)

further, when i try to start the namenode, it failed with following log

C:\hdp>start_remote_hdp_services.cmd
Master nodes: start DFS-DC
0 Master nodes successfully started.
1 Master nodes failed to start.

PSComputerName Service Message Status
-------------- ------- ------- ------
Connecting to re...

StartStop-HDPServices : Manually start services on Master nodes then retry
full cluster start. Exiting.
At C:\hdp\manage_remote_hdp_services.ps1:187 char:47
+ if ($mode -eq "start") { StartStop-HDPservices <<<< ($mode) }
    + CategoryInfo          : NotSpecified: (:) [Write-Error], WriteErrorException
    + FullyQualifiedErrorId : Microsoft.PowerShell.Commands.WriteErrorException,StartStop-HDPServices

C:\hdp>

i tried starting manually as well but no luck
anything missing in configuration ?

regards
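One way to attempt the manual start that the error message asks for,
assuming the HDP package registered the Hadoop daemons as Windows services
(the service names below are assumptions; check Services.msc or the
package's start_local_hdp_services.cmd for the actual names):

C:\hdp> sc query namenode
C:\hdp> net start namenode
C:\hdp> net start datanode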
On Wed, Sep 11, 2013 at 3:16 PM, Olivier Renault
Post by Olivier Renault
You can put the same FQDN as your NameNode for example.
Thanks
Olivier
Irfan Sayed
2013-09-13 09:37:08 UTC
Permalink
please suggest

regards
Post by Irfan Sayed
thanks.
finally it got installed :)
further, when i try to start the namenode, it failed with following log
C:\hdp>start_remote_hdp_services.cmd
Master nodes: start DFS-DC
0 Master nodes successfully started.
1 Master nodes failed to start.
PSComputerName Service Message Status
-------------- ------- ------- ------
Connecting to re...
StartStop-HDPServices : Manually start services on Master nodes then retry
full
cluster start. Exiting.
At C:\hdp\manage_remote_hdp_services.ps1:187 char:47
+ if ($mode -eq "start") { StartStop-HDPservices <<<< ($mode) }
+ CategoryInfo : NotSpecified: (:) [Write-Error],
WriteErrorExcep
tion
Microsoft.PowerShell.Commands.WriteErrorExceptio
n,StartStop-HDPServices
C:\hdp>
i tried starting manually as well but no luck
anything missing in configuration ?
regards
Post by Olivier Renault
You can put the same FQDN as your NameNode for example.
Thanks
Olivier
Post by Irfan Sayed
i do not have any HIVE server host, then, what should i put over here??
. if i comment then i guess it throws error of commenting that
can i put the fqdn of namenode for HIVE server host ?
will it be a really working configuration?
please suggest
regards
irfan
On Tue, Sep 10, 2013 at 5:09 PM, Olivier Renault <
Post by Olivier Renault
#Log directory
HDP_LOG_DIR=c:\hadoop\logs
#Data directory
HDP_DATA_DIR=c:\hdp\data
#Hosts
NAMENODE_HOST=yourmaster.fqdn.com
JOBTRACKER_HOST=yourmaster.fqdn.com
HIVE_SERVER_HOST=yourmaster.fqdn.com
OOZIE_SERVER_HOST=yourmaster.fqdn.com
TEMPLETON_HOST=yourmaster.fqdn.com
SLAVE_HOSTS=yourmaster.fqdn.com,youslave.fqdn.com
#Database host
DB_FLAVOR=derby
DB_HOSTNAME=yourmaster.fqdn.com
#Hive properties
HIVE_DB_NAME=hive
HIVE_DB_USERNAME=hive
HIVE_DB_PASSWORD=hive
#Oozie properties
OOZIE_DB_NAME=oozie
OOZIE_DB_USERNAME=oozie
OOZIE_DB_PASSWORD=oozie
You will need to replace, yourmaster.fqdn.com, yourslave.fqdn.com by
your servers name. For the time being, I suggest that you do not install
HBase, Oozie,
regards,
Olivier
Post by Irfan Sayed
please suggest
regards
irfan
Post by Irfan Sayed
please suggest
regards
irfan
Post by Irfan Sayed
ok.. now i made some changes and installation went ahead
but failed in property "HIVE_SERVER_HOST" declaration
in cluster config file, i have commented this property. if i
uncomment , then what server address will give ???
i have only two windows machines setup.
1: for namenode and another for datanode
please suggest
regards
irfan
Post by Irfan Sayed
thanks.
i installed the latest java in c:\java folder and now no error in
log file related to java
however, now it is throwing error on not having cluster properties file.
in fact i am running/installing hdp from the location where this
file exist . still it is throwing error
please find the attached
[image: Inline image 1]
regards
irfan
On Fri, Sep 6, 2013 at 11:12 AM, Ravi Mummulla (BIG DATA) <
HereÂ’s your issue (from the logs you attached earlier):****
** **
CAQuietExec: Checking JAVA_HOME is set correctly...****
CAQuietExec: Files\Java\jdk1.6.0_31 was unexpected at this time.*
***
** **
It seems that you installed Java prerequisite in the default path,
which is %PROGRAMFILES% (expands to C:\Program Files in your case). HDP 1.3
does not like spaces in paths, do you need to reinstall Java under c:\java\
or something similar (in a path with no spaces).****
** **
*Sent:* Thursday, September 5, 2013 8:42 PM
*Subject:* Re: about replication****
** **
please find the attached.****
i don't have "c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log"
as it is not generated ****
** **
regards****
irfan****
** **
** **
** **
** **
** **
On Thu, Sep 5, 2013 at 6:09 PM, Olivier Renault <
Could you share the log files ( c:\hdp.log,
c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log ) as
well as your clusterproperties.txt ?****
** **
Thanks, ****
Olivier****
** **
wrote:****
thanks. i followed the user manual for deployment and installed
all pre-requisites ****
i modified the command and still the issue persist. please suggest ****
** **
please refer below ****
** **
** **
[image: Inline image 1]****
** **
regards****
irfan ****
** **
** **
On Wed, Sep 4, 2013 at 5:13 PM, Olivier Renault <
The command to install it is msiexec /i msifile /... ****
You will find the correct syntax as part of doc. ****
Happy reading
Olivier ****
**
thanks. ****
i referred the logs and manuals. i modified the clusterproperties
file and then double click on the msi file ****
however, it still failed.****
further i started the installation on command line by giving
HDP_LAYOUT=clusterproperties file path, ****
installation went ahead and it failed for .NET framework 4.0 and
VC++ redistributable package dependency ****
** **
i installed both and started again the installation. ****
failed again with following error ****
[image: Inline image 1]****
** **
when i search for the logs mentioned in the error , i never found
that ****
please suggest ****
** **
regards****
irfan****
** **
** **
On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <
Correct, you need to define the cluster configuration as part of a
file. You will find some information on the configuration file as part of
the documentation. ****
http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
****
You should make sure to have also installed the pre requisite. ***
*
Thanks
Olivier ****
thanks. sorry for the long break. actually i got involved in some
other priorities
i downloaded the installer and while installing i got the following
error

[image: Inline image 1]

do i need to make any configuration prior to installation ??

regards
irfan
On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <
Here is the link
http://download.hortonworks.com/products/hdp-windows/
Olivier
thanks.
i just followed the instructions to set up the pseudo distributed mode:
http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I

i don't think i am running a DN on both machines
please find the attached log

hi olivier

can you please give me the download link ?
let me try please

regards
irfan
On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <
Are you running a DN on both the machines? Could you please show
me your DN logs?

Also, consider Olivier's suggestion. It's definitely a better option.

Warm Regards,
Tariq
cloudfront.blogspot.com
On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
Irfan,
If you want to quickly get Hadoop running on the windows platform, you
may want to try our distribution for Windows. You will be able to find the
msi on our website.
Regards
Olivier
thanks.
ok. i think i need to change the plan over here.
let me create two environments. 1: totally windows 2: totally Unix

because, on windows , anyway i have to try and see how hadoop works
on UNIX, it is already known that it is working fine.

so, on windows , here is the setup:

namenode : windows 2012 R2
datanode : windows 2012 R2

now, the exact problem is :
1: the datanode is not getting started
2: replication : if i put any file/folder on any datanode , it
should get replicated to all other available datanodes

regards
On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <
Seriously?? You are planning to develop something using Hadoop on
windows. Not a good idea. Anyways, could you please show me your log files? I
also need some additional info :
-The exact problem which you are facing right now
-Your cluster summary (no. of nodes etc)
-Your latest configuration files
-Your /etc/hosts file

Warm Regards,
Tariq
cloudfront.blogspot.com
wrote:
ok. thanks
now, i need to start with the all-windows setup first as our product
will be based on windows
so, now, please tell me how to resolve the issue

the datanode is not starting . please suggest

regards,
irfan
On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <
It is possible. Theoretically Hadoop doesn't stop you from doing
that. But it is not a very wise setup.

Warm Regards,
Tariq
cloudfront.blogspot.com
wrote:
please suggest

regards
irfan
On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <
thanks.
can i have a setup like this :
the namenode will be on linux (the flavour may be RHEL, CentOS, Ubuntu etc)
and the datanodes are a combination of any OS (windows , linux ,
unix etc )

however, my doubt is, as the file systems of the two systems
(win and linux ) are different , the datanodes of these systems cannot be
part of a single cluster . do i have to make a windows cluster and a UNIX
cluster separately ?

regards
On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
I just noticed you are on Cygwin. IIRC Windows PIDs are not the
same as Cygwin PIDs so that may be causing the discrepancy. I don't know
how well Hadoop works in Cygwin as I have never tried it. Work is in
progress for native Windows support however there are no official releases
with Windows support yet. It may be easier to get familiar with a
release <https://www.apache.org/dyn/closer.cgi/hadoop/common/> on
Linux if you are new to it.
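(A quick way to see the PID mismatch Arpit describes, from the Cygwin
shell; this assumes the default Hadoop 1.x pid-file location under /tmp,
and ps -W is the Cygwin switch that lists native Windows PIDs:)

$ cat /tmp/hadoop-$USER-namenode.pid   # PID recorded by the start script
$ ps -W | grep java                    # Windows PIDs of the running JVMs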
On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <
thanks
here is what i did .
i stopped all the namenodes and datanodes using the ./stop-dfs.sh
command
then deleted all the pid files for the namenodes and datanodes

started dfs again with the command : "./start-dfs.sh"

when i ran the "jps" command , it shows :

$ ./jps.exe
4536 Jps
2076 NameNode

however, when i open the pid file for the namenode, it shows the pid
as 4560. on the contrary, it should show 2076

please suggest

regards
On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <
Most likely there is a stale pid file. Something like
\tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
the datanode.
I haven't read the entire thread so you may have looked at this already.
-Arpit
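(A minimal sketch of that cleanup, assuming the default pid location Arpit
mentions; hadoop-daemon.sh is the per-daemon start script shipped in the
Hadoop 1.x bin directory:)

$ rm /tmp/hadoop-*-datanode.pid       # remove the stale pid file
$ ./hadoop-daemon.sh start datanode   # restart only the datanode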
wrote:
the datanode is trying to connect to the namenode continuously but fails

when i try to run the "jps" command it says :
$ ./jps.exe
4584 NameNode
4016 Jps

and when i run "./start-dfs.sh" then it says :

$ ./start-dfs.sh
namenode running as process 3544. Stop it first.
DFS-1: datanode running as process 4076. Stop it first.
localhost: secondarynamenode running as process 4792. Stop it first.

these two outputs are contradictory
please find the attached logs

should i attach the conf files as well ?

regards
On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <
Your DN is still not running. Showing me the logs would be helpful.

Warm Regards,
Tariq
cloudfront.blogspot.com
wrote:
i followed the url and did the steps mentioned in it. i have
deployed on the windows platform

now, i am able to browse the url : http://localhost:50070 (name node )
however, i am not able to browse the url : http://localhost:50030

please refer below

[image: Inline image 1]

i have modified all the config files as mentioned and formatted
the hdfs file system as well
please suggest

regards
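(A likely cause, noted here for readers hitting the same wall: 50030 is
the JobTracker web UI in Hadoop 1.x, so it only responds once the
MapReduce daemons are also started, for example:)

$ ./start-mapred.sh   # starts the JobTracker and TaskTrackers
$ ./jps.exe           # JobTracker should now appear alongside NameNode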
wrote:
http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html

let me follow the url which you gave for the pseudo distributed setup
and then i will switch to distributed mode

regards
irfan
On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <
You are welcome. Which link have you followed for the
configuration? Your *core-site.xml* is empty. Remove the property
*fs.default.name* from *hdfs-site.xml* and add it to *core-site.xml*.
Remove *mapred.job.tracker* as well. It is required in *mapred-site.xml*.

I would suggest you to do a pseudo distributed setup first in
order to get yourself familiar with the process and then proceed to the
distributed mode. You can visit this link
<http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I>
if you need some help. Let me know if you face any issue.

HTH
Warm Regards,
Tariq
cloudfront.blogspot.com
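(A sketch of what that split looks like, with namenode-host as a
placeholder and the conventional Hadoop 1.x ports, shown as a terminal
paste in keeping with the thread:)

$ cat conf/core-site.xml
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://namenode-host:9000</value>
  </property>
</configuration>
$ cat conf/mapred-site.xml
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>namenode-host:9001</value>
  </property>
</configuration>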
wrote:
thanks tariq for the response.
as discussed last time, i have sent you all the config files in my
setup .
can you please go through them ?

please let me know

regards
irfan
On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <
I'm sorry for being unresponsive. Was out of touch for some time
because of ramzan and eid. Resuming work today.

What's the current status?

Warm Regards,
Tariq
cloudfront.blogspot.com
On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <
First of all read the concepts ..I hope you will like it..
https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf
wrote:
please suggest

regards
irfan

On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <
hey Tariq,
i am still stuck ..
can you please suggest

regards
irfan
wrote:
please suggest

regards

wrote:
the attachment got quarantined
resending in txt format. please rename it to conf.rar

regards
wrote:
thanks.

if i run the jps command on the namenode :

$ ./jps.exe
3164 NameNode
1892 Jps

the same command on the datanode :

$ ./jps.exe
3848 Jps

jps does not list any process for the datanode. however, in the web
browser i can see one live data node
please find the attached conf rar file of the namenode

regards
wrote:
OK. we'll start fresh. Could you please show me your latest config
files?

BTW, are your daemons running fine? Use JPS to verify that.

Warm Regards,
Tariq
cloudfront.blogspot.com
wrote:
i have created the dirs "wksp_data" and "wksp_name" on both the
datanode and the namenode
made the respective changes in the "hdfs-site.xml" file
formatted the namenode
started the dfs

but still, i am not able to browse the file system through the web browser
please refer below

anything still missing ?
please suggest

[image: Inline image 1]
wrote:
do these dirs need to be created on all the datanodes and namenodes ?
further, does hdfs-site.xml need to be updated on both the datanodes and
namenodes for these new dirs?

regards
wrote:
Create 2 directories manually corresponding to the values of the
dfs.name.dir and dfs.data.dir properties and change the permissions of
these directories to 755. When you start pushing data into your HDFS, data
will start going inside the directory specified by dfs.data.dir and the
associated metadata will go inside dfs.name.dir. Remember, you store data
in HDFS, but it eventually gets stored in your local/native FS. But you
cannot see this data directly on your local/native FS.

Warm Regards,
Tariq
cloudfront.blogspot.com
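(A minimal sketch of those steps from the Cygwin shell, assuming the
wksp_name/wksp_data directory names used elsewhere in this thread:)

$ mkdir -p /cygdrive/c/wksp_name /cygdrive/c/wksp_data
$ chmod 755 /cygdrive/c/wksp_name /cygdrive/c/wksp_data
$ ./hadoop namenode -format   # re-initializes dfs.name.dir (wipes existing HDFS metadata)
$ ./start-dfs.sh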
wrote:
thanks.
however, i need this to be working in a windows environment as it is a
project requirement.
i will add/work on Linux later

so, now , at this stage , is c:\\wksp the HDFS file system OR do i
need to create it from the command line ?

please suggest

regards,
wrote:
Hello Irfan,

Sorry for being unresponsive. Got stuck with some imp work.

HDFS webUI doesn't provide us the ability to create a file or
directory. You can browse HDFS, view files, download files etc. But
operations like create, move, copy etc are not supported.

These values look fine to me.

One suggestion though. Try getting a Linux machine (if possible).
Or at least use a VM. I personally feel that using Hadoop on windows is
always messy.

Warm Regards,
Tariq
cloudfront.blogspot.com
wrote:
thanks.
when i browse the file system , i am getting the following :
i haven't seen any make-directory option there

do i need to create it from the command line ?
further, in the hdfs-site.xml file , i have given the following
entries. are they correct ?

<property>
<name>dfs.data.dir</name>
<value>c:\\wksp</value>
</property>
<property>
<name>dfs.name.dir</name>
<value>c:\\wksp</value>
</property>

please suggest

[image: Inline image 1]
On Tue, Aug 6, 2013 at 12:40 PM, manish dunani <
*You are wrong at this:*

$ ./hadoop dfs -copyFromLocal
/cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
copyFromLocal: File
/cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.

$ ./hadoop dfs -copyFromLocal
/cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
copyFromLocal: File
/cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.

Because you had written both the paths as local, and you need not
copy hadoop into hdfs... Hadoop is already working..

Just check it out in the browser after starting ur single node cluster :

localhost:50070

then go for the browse-the-filesystem link in it..

If there is no directory then make a directory there.
That is your hdfs directory.
Then copy any text file there (no need to copy hadoop
there), because u are going to do processing on that data in the text
file. That's what hadoop is used for; first u need to make it clear in ur
mind. Then and then u will do it... otherwise not possible..

*Try this: *

$ ./bin/hadoop dfs -copyFromLocal /full/local/path/to/ur/file
/hdfs/directory/path
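(For instance, with a hypothetical sample.txt on the desktop; the file
name is only an illustration:)

$ ./hadoop dfs -copyFromLocal /cygdrive/c/Users/Administrator/Desktop/sample.txt /wksp
$ ./hadoop dfs -ls /wksp   # sample.txt should now be listed under /wksp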
wrote:
thanks. yes , i am a newbie.
however, i need a windows setup.

let me surely refer to the doc and link which u sent, but i need this
to be working ...
can you please help

regards
--
MANISH DUNANI
-THANX
+91 9426881954,+91 8460656443

--
Regards
*Manish Dunani*
*Contact No* : +91 9408329137
*skype id* : manish.dunani
--
Olivier Renault
Solution Engineer - Big Data - Hortonworks, Inc.
+44 7500 933 036
www.hortonworks.com
Irfan Sayed
2013-09-17 05:12:08 UTC
Permalink
please suggest. i am stuck
i haven't found anything in the log

regards
irfan
Post by Irfan Sayed
please suggest
regards
Post by Irfan Sayed
thanks.
finally it got installed :)
further, when i tried to start the namenode, it failed with the following log
C:\hdp>start_remote_hdp_services.cmd
Master nodes: start DFS-DC
0 Master nodes successfully started.
1 Master nodes failed to start.
PSComputerName Service Message Status
-------------- ------- ------- ------
Connecting to re...
StartStop-HDPServices : Manually start services on Master nodes then retry full cluster start. Exiting.
At C:\hdp\manage_remote_hdp_services.ps1:187 char:47
+ if ($mode -eq "start") { StartStop-HDPservices <<<< ($mode) }
+ CategoryInfo : NotSpecified: (:) [Write-Error], WriteErrorException
Microsoft.PowerShell.Commands.WriteErrorException,StartStop-HDPServices
C:\hdp>
i tried starting the services manually as well, but no luck
is anything missing in the configuration ?
regards
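(One way to isolate the failure, sketched here on the assumption that
start_local_hdp_services.cmd is the local counterpart of the remote script
per the HDP 1.3 docs, and using the standard net start listing:)

C:\hdp>start_local_hdp_services.cmd
C:\hdp>rem list the Hadoop-related Windows services that actually started
C:\hdp>net start | findstr /i "namenode datanode jobtracker"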
On Wed, Sep 11, 2013 at 3:16 PM, Olivier Renault <
Post by Olivier Renault
You can put the same FQDN as your NameNode for example.
Thanks
Olivier
Post by Irfan Sayed
i do not have any HIVE server host, so what should i put over
here ?? if i comment it out, then i guess it throws an error about that.
can i put the fqdn of the namenode for the HIVE server host ?
will it really be a working configuration ?
please suggest
regards
irfan
On Tue, Sep 10, 2013 at 5:09 PM, Olivier Renault <
Post by Olivier Renault
#Log directory
HDP_LOG_DIR=c:\hadoop\logs
#Data directory
HDP_DATA_DIR=c:\hdp\data
#Hosts
NAMENODE_HOST=yourmaster.fqdn.com
JOBTRACKER_HOST=yourmaster.fqdn.com
HIVE_SERVER_HOST=yourmaster.fqdn.com
OOZIE_SERVER_HOST=yourmaster.fqdn.com
TEMPLETON_HOST=yourmaster.fqdn.com
SLAVE_HOSTS=yourmaster.fqdn.com,youslave.fqdn.com
#Database host
DB_FLAVOR=derby
DB_HOSTNAME=yourmaster.fqdn.com
#Hive properties
HIVE_DB_NAME=hive
HIVE_DB_USERNAME=hive
HIVE_DB_PASSWORD=hive
#Oozie properties
OOZIE_DB_NAME=oozie
OOZIE_DB_USERNAME=oozie
OOZIE_DB_PASSWORD=oozie
You will need to replace yourmaster.fqdn.com and yourslave.fqdn.com with
your servers' names. For the time being, I suggest that you do not install
HBase or Oozie.
regards,
Olivier
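(To make the substitution concrete, a hypothetical two-node layout
matching the setup in this thread, with every master-side role pointed at
the single master; master.example.com and slave.example.com are
placeholder names:)

NAMENODE_HOST=master.example.com
JOBTRACKER_HOST=master.example.com
HIVE_SERVER_HOST=master.example.com
OOZIE_SERVER_HOST=master.example.com
TEMPLETON_HOST=master.example.com
SLAVE_HOSTS=master.example.com,slave.example.com
DB_HOSTNAME=master.example.com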