Let's talk about basic Linux commands:
pwd : Print working directory
cd : Change directory
ls : List directory contents
mkdir : Make directory
rmdir : Remove directory
pushd : Adds a directory to the stack and changes to that directory
popd : Removes the top directory from the stack and changes to the directory now on top
clear : Clears the screen (and the user's irritation)
pwd (print working directory) displays your current working directory.
Example:
hadoopguru@hadoop2:~$ pwd
/home/hadoop
cd (Change directory):
The cd command changes your current working directory.
Example: inside /home/hadoop there is a folder named hadoop-1.2.1.
hadoopguru@hadoop2:~$ cd hadoop-1.2.1/
hadoopguru@hadoop2:~/hadoop-1.2.1$ pwd
/home/hadoop/hadoop-1.2.1
1. cd ~ or cd:
cd without any target directory takes you to your home directory; cd ~ has the same effect.
Example: our current working directory was /home/hadoop/hadoop-1.2.1.
I.
hadoopguru@hadoop2:~/hadoop-1.2.1$ cd ~
hadoopguru@hadoop2:~$ pwd
/home/hadoop
II.
hadoopguru@hadoop2:/$ pwd
/
hadoopguru@hadoop2:/$ cd
hadoopguru@hadoop2:~$ pwd
/home/hadoop
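Both forms rely on the HOME environment variable, which holds the path of your home directory. A brief sketch of the same idea:
hadoopguru@hadoop2:~$ echo $HOME
/home/hadoop
hadoopguru@hadoop2:~$ cd "$HOME"
hadoopguru@hadoop2:~$ pwd
/home/hadoop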
2. cd .. :
cd .. takes you to the parent working directory, the directory just above your current working directory. Chaining components with a slash, as in cd ../.., takes you to the parent of the parent directory.
Example:
I.
hadoopguru@hadoop2:~/hadoop-1.2.1/conf$ pwd
/home/hadoop/hadoop-1.2.1/conf
hadoopguru@hadoop2:~/hadoop-1.2.1/conf$ cd ..
hadoopguru@hadoop2:~/hadoop-1.2.1$ pwd
/home/hadoop/hadoop-1.2.1
II. hadoopguru@hadoop2:~/hadoop-1.2.1/conf$ pwd
/home/hadoop/hadoop-1.2.1/conf
hadoopguru@hadoop2:~/hadoop-1.2.1/conf$ cd ../..
hadoopguru@hadoop2:~$ pwd
/home/hadoop
III. hadoopguru@hadoop2:~/hadoop-1.2.1/conf$ pwd
/home/hadoop/hadoop-1.2.1/conf
hadoopguru@hadoop2:~/hadoop-1.2.1/conf$ cd ../../..
hadoopguru@hadoop2:/home$ pwd
/home
3. cd - :
To go back to the previous working directory, use cd -.
Example:
hadoopguru@hadoop2:~$ cd hadoop-1.2.1/conf
hadoopguru@hadoop2:~/hadoop-1.2.1/conf$ pwd
/home/hadoop/hadoop-1.2.1/conf
hadoopguru@hadoop2:~/hadoop-1.2.1/conf$ cd -
/home/hadoop
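In bash, the previous directory is remembered in the OLDPWD variable, so cd - behaves like cd "$OLDPWD" and also prints the directory it switches to. A brief sketch, continuing from the session above:
hadoopguru@hadoop2:~$ echo $OLDPWD
/home/hadoop/hadoop-1.2.1/conf
hadoopguru@hadoop2:~$ cd "$OLDPWD"
hadoopguru@hadoop2:~/hadoop-1.2.1/conf$ pwd
/home/hadoop/hadoop-1.2.1/conf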
Slash (/) usage makes a difference.
Possible scenarios:
1. If the user wants to open a directory that lives under the root directory:
Sol: Starting a path with a slash (/) always directs you to the root of the file tree.
Example:
I.
hadoopguru@hadoop2:~$ pwd
/home/hadoop
hadoopguru@hadoop2:~$ cd /home
hadoopguru@hadoop2:/home$ pwd
/home
II. hadoopguru@hadoop2:~/hadoop-1.2.1/conf$ pwd
/home/hadoop/hadoop-1.2.1/conf
hadoopguru@hadoop2:~/hadoop-1.2.1/conf$ cd /bin
hadoopguru@hadoop2:/bin$ pwd
/bin
2. If the user wants to open a directory inside the current working directory:
Sol: Omit the leading slash (/) before the directory name.
Example:
hadoopguru@hadoop2:~$ pwd
/home/hadoop
hadoopguru@hadoop2:~$ cd hadoop-1.2.1/
hadoopguru@hadoop2:~/hadoop-1.2.1$ pwd
/home/hadoop/hadoop-1.2.1
3. If the user wants to open a directory that lives under the root directory, and the current working directory is the root directory:
Sol: Either of the two approaches above works; in this case the leading slash (/) makes no difference.
Example:
I. Without slash (/):
hadoopguru@hadoop2:/$ pwd
/
hadoopguru@hadoop2:/$ cd home
hadoopguru@hadoop2:/home$ pwd
/home
II. Same as with slash (/):
hadoopguru@hadoop2:/$ pwd
/
hadoopguru@hadoop2:/$ cd /home
hadoopguru@hadoop2:/home$ pwd
/home
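To see why the leading slash matters anywhere other than the root, compare a relative and an absolute path from /home/hadoop. A brief sketch (the exact error wording may vary slightly between shells):
hadoopguru@hadoop2:~$ cd /hadoop-1.2.1
bash: cd: /hadoop-1.2.1: No such file or directory
hadoopguru@hadoop2:~$ cd hadoop-1.2.1
hadoopguru@hadoop2:~/hadoop-1.2.1$ pwd
/home/hadoop/hadoop-1.2.1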
List directory contents:
User can list the contents of a directory using the ls command.
Example:
hadoopguru@hadoop2:~$ pwd
/home/hadoop
hadoopguru@hadoop2:~$ ls
apache-flume-1.4.0-bin.tar.gz.1 hadoop-1.2.1-bin.tar.gz
aveo hive
data hive-0.11.0-bin.tar.gz
datanode mahout-distribution-0.8.tar.gz
derby.log metastore_db
flume namenode
hadoop-1.2.1
1. ls -a:
To list all files, including hidden files (those whose names begin with a dot), use -a with ls (i.e. ls -a).
Example:
hadoopguru@hadoop2:~$ pwd
/home/hadoop
hadoopguru@hadoop2:~$ ls -a
. flume
.. hadoop-1.2.1
apache-flume-1.4.0-bin.tar.gz.1 hadoop-1.2.1-bin.tar.gz
aveo hive
.bash_history hive-0.11.0-bin.tar.gz
.bash_logout .hivehistory
.bash_profile mahout-distribution-0.8.tar.gz
.bashrc metastore_db
.cache namenode
data .profile
datanode .ssh
derby.log .viminfo
2. ls -l:
For a detailed (long) listing, use the ls -l command. Each line shows the permissions, link count, owner, group, size in bytes, last-modification time, and name.
Example:
hadoopguru@hadoop2:~$ pwd
/home/hadoop
hadoopguru@hadoop2:~$ ls -l
total 261824
-rw-rw-r--  1 hadoop hadoop  60965956 Jul  1 09:41 apache-flume-1.4.0-bin.tar.gz.1
drwxrwxr-x  2 hadoop hadoop      4096 Oct 24 01:22 aveo
drwxr-xr-x  6 hadoop hadoop      4096 Oct  6 23:30 data
drwxrwxr-x  2 hadoop hadoop      4096 Oct  6 17:37 datanode
-rw-rw-r--  1 hadoop hadoop       343 Oct  6 17:47 derby.log
drwxrwxr-x  7 hadoop hadoop      4096 Oct  6 17:55 flume
drwxr-xr-x 15 hadoop hadoop      4096 Oct  6 16:32 hadoop-1.2.1
-rw-rw-r--  1 hadoop hadoop  38096663 Oct  6 12:37 hadoop-1.2.1-bin.tar.gz
drwxrwxr-x  8 hadoop hadoop      4096 Oct  6 17:44 hive
-rw-rw-r--  1 hadoop hadoop  59859572 Oct  6 12:08 hive-0.11.0-bin.tar.gz
-rw-rw-r--  1 hadoop hadoop 109137498 Oct  6 12:29 mahout-distribution-0.8.tar.gz
drwxrwxr-x  5 hadoop hadoop      4096 Oct  6 17:47 metastore_db
drwxrwxr-x  5 hadoop hadoop      4096 Oct  6 23:30 namenode
3. ls -lh or ls -hl or ls -l -h or ls -h -l:
ls -lh shows file sizes in human-readable form (K, M, G suffixes); the options can be combined or given separately, in any order.
Example:
A. ls -lh :
hadoopguru@hadoop2:~$ ls -lh
total 256M
-rw-rw-r--  1 hadoop hadoop  59M Jul  1 09:41 apache-flume-1.4.0-bin.tar.gz.1
drwxrwxr-x  2 hadoop hadoop 4.0K Oct 24 01:22 aveo
drwxr-xr-x  6 hadoop hadoop 4.0K Oct  6 23:30 data
drwxrwxr-x  2 hadoop hadoop 4.0K Oct  6 17:37 datanode
-rw-rw-r--  1 hadoop hadoop  343 Oct  6 17:47 derby.log
drwxrwxr-x  7 hadoop hadoop 4.0K Oct  6 17:55 flume
drwxr-xr-x 15 hadoop hadoop 4.0K Oct  6 16:32 hadoop-1.2.1
-rw-rw-r--  1 hadoop hadoop  37M Oct  6 12:37 hadoop-1.2.1-bin.tar.gz
drwxrwxr-x  8 hadoop hadoop 4.0K Oct  6 17:44 hive
-rw-rw-r--  1 hadoop hadoop  58M Oct  6 12:08 hive-0.11.0-bin.tar.gz
-rw-rw-r--  1 hadoop hadoop 105M Oct  6 12:29 mahout-distribution-0.8.tar.gz
drwxrwxr-x  5 hadoop hadoop 4.0K Oct  6 17:47 metastore_db
drwxrwxr-x  5 hadoop hadoop 4.0K Oct  6 23:30 namenode
B. ls -l -h :
hadoopguru@hadoop2:~$ ls -l -h
total 256M
-rw-rw-r--  1 hadoop hadoop  59M Jul  1 09:41 apache-flume-1.4.0-bin.tar.gz.1
drwxrwxr-x  2 hadoop hadoop 4.0K Oct 24 01:22 aveo
drwxr-xr-x  6 hadoop hadoop 4.0K Oct  6 23:30 data
drwxrwxr-x  2 hadoop hadoop 4.0K Oct  6 17:37 datanode
-rw-rw-r--  1 hadoop hadoop  343 Oct  6 17:47 derby.log
drwxrwxr-x  7 hadoop hadoop 4.0K Oct  6 17:55 flume
drwxr-xr-x 15 hadoop hadoop 4.0K Oct  6 16:32 hadoop-1.2.1
-rw-rw-r--  1 hadoop hadoop  37M Oct  6 12:37 hadoop-1.2.1-bin.tar.gz
drwxrwxr-x  8 hadoop hadoop 4.0K Oct  6 17:44 hive
-rw-rw-r--  1 hadoop hadoop  58M Oct  6 12:08 hive-0.11.0-bin.tar.gz
-rw-rw-r--  1 hadoop hadoop 105M Oct  6 12:29 mahout-distribution-0.8.tar.gz
drwxrwxr-x  5 hadoop hadoop 4.0K Oct  6 17:47 metastore_db
drwxrwxr-x  5 hadoop hadoop 4.0K Oct  6 23:30 namenode
C. ls -hl :
hadoopguru@hadoop2:~$ ls -hl
total 256M
-rw-rw-r--  1 hadoop hadoop  59M Jul  1 09:41 apache-flume-1.4.0-bin.tar.gz.1
drwxrwxr-x  2 hadoop hadoop 4.0K Oct 24 01:22 aveo
drwxr-xr-x  6 hadoop hadoop 4.0K Oct  6 23:30 data
drwxrwxr-x  2 hadoop hadoop 4.0K Oct  6 17:37 datanode
-rw-rw-r--  1 hadoop hadoop  343 Oct  6 17:47 derby.log
drwxrwxr-x  7 hadoop hadoop 4.0K Oct  6 17:55 flume
drwxr-xr-x 15 hadoop hadoop 4.0K Oct  6 16:32 hadoop-1.2.1
-rw-rw-r--  1 hadoop hadoop  37M Oct  6 12:37 hadoop-1.2.1-bin.tar.gz
drwxrwxr-x  8 hadoop hadoop 4.0K Oct  6 17:44 hive
-rw-rw-r--  1 hadoop hadoop  58M Oct  6 12:08 hive-0.11.0-bin.tar.gz
-rw-rw-r--  1 hadoop hadoop 105M Oct  6 12:29 mahout-distribution-0.8.tar.gz
drwxrwxr-x  5 hadoop hadoop 4.0K Oct  6 17:47 metastore_db
drwxrwxr-x  5 hadoop hadoop 4.0K Oct  6 23:30 namenode
D. ls -h -l :
hadoopguru@hadoop2:~$ ls -h -l
total 256M
-rw-rw-r--  1 hadoop hadoop  59M Jul  1 09:41 apache-flume-1.4.0-bin.tar.gz.1
drwxrwxr-x  2 hadoop hadoop 4.0K Oct 24 01:22 aveo
drwxr-xr-x  6 hadoop hadoop 4.0K Oct  6 23:30 data
drwxrwxr-x  2 hadoop hadoop 4.0K Oct  6 17:37 datanode
-rw-rw-r--  1 hadoop hadoop  343 Oct  6 17:47 derby.log
drwxrwxr-x  7 hadoop hadoop 4.0K Oct  6 17:55 flume
drwxr-xr-x 15 hadoop hadoop 4.0K Oct  6 16:32 hadoop-1.2.1
-rw-rw-r--  1 hadoop hadoop  37M Oct  6 12:37 hadoop-1.2.1-bin.tar.gz
drwxrwxr-x  8 hadoop hadoop 4.0K Oct  6 17:44 hive
-rw-rw-r--  1 hadoop hadoop  58M Oct  6 12:08 hive-0.11.0-bin.tar.gz
-rw-rw-r--  1 hadoop hadoop 105M Oct  6 12:29 mahout-distribution-0.8.tar.gz
drwxrwxr-x  5 hadoop hadoop 4.0K Oct  6 17:47 metastore_db
drwxrwxr-x  5 hadoop hadoop 4.0K Oct  6 23:30 namenode
Make directory:
User can create a directory using the mkdir command.
Example:
hadoopguru@hadoop2:~$ pwd
/home/hadoop
hadoopguru@hadoop2:~$ mkdir aveo_hadoop
hadoopguru@hadoop2:~$ ls -l
total 261828
-rw-rw-r--  1 hadoop hadoop  60965956 Jul  1 09:41 apache-flume-1.4.0-bin.tar.gz.1
drwxrwxr-x  2 hadoop hadoop      4096 Oct 24 01:22 aveo
drwxrwxr-x  2 hadoop hadoop      4096 Oct 27 10:10 aveo_hadoop
drwxr-xr-x  6 hadoop hadoop      4096 Oct  6 23:30 data
drwxrwxr-x  2 hadoop hadoop      4096 Oct  6 17:37 datanode
-rw-rw-r--  1 hadoop hadoop       343 Oct  6 17:47 derby.log
drwxrwxr-x  7 hadoop hadoop      4096 Oct  6 17:55 flume
drwxr-xr-x 15 hadoop hadoop      4096 Oct  6 16:32 hadoop-1.2.1
-rw-rw-r--  1 hadoop hadoop  38096663 Oct  6 12:37 hadoop-1.2.1-bin.tar.gz
drwxrwxr-x  8 hadoop hadoop      4096 Oct  6 17:44 hive
-rw-rw-r--  1 hadoop hadoop  59859572 Oct  6 12:08 hive-0.11.0-bin.tar.gz
-rw-rw-r--  1 hadoop hadoop 109137498 Oct  6 12:29 mahout-distribution-0.8.tar.gz
drwxrwxr-x  5 hadoop hadoop      4096 Oct  6 17:47 metastore_db
drwxrwxr-x  5 hadoop hadoop      4096 Oct  6 23:30 namenode
1. mkdir -p:
mkdir -p creates parent directories as needed.
Example:
hadoopguru@hadoop2:~$ mkdir -p aveo_hadoop/aveo_hadoop1/aveo_hadoop2
hadoopguru@hadoop2:~$ ls
apache-flume-1.4.0-bin.tar.gz.1 hadoop-1.2.1
aveo hadoop-1.2.1-bin.tar.gz
aveo_hadoop hive
data hive-0.11.0-bin.tar.gz
datanode mahout-distribution-0.8.tar.gz
derby.log metastore_db
flume namenode
hadoopguru@hadoop2:~$ cd aveo_hadoop
hadoopguru@hadoop2:~/aveo_hadoop$ ls
aveo_hadoop1
hadoopguru@hadoop2:~/aveo_hadoop$ cd aveo_hadoop1
hadoopguru@hadoop2:~/aveo_hadoop/aveo_hadoop1$ ls
aveo_hadoop2
hadoopguru@hadoop2:~/aveo_hadoop/aveo_hadoop1$ cd aveo_hadoop2
hadoopguru@hadoop2:~/aveo_hadoop/aveo_hadoop1/aveo_hadoop2$ pwd
/home/hadoop/aveo_hadoop/aveo_hadoop1/aveo_hadoop2
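For contrast, plain mkdir does not create missing parent directories and fails if an intermediate directory does not exist. A brief sketch with hypothetical directory names (the exact error wording may vary):
hadoopguru@hadoop2:~$ mkdir testdir1/testdir2/testdir3
mkdir: cannot create directory 'testdir1/testdir2/testdir3': No such file or directory
hadoopguru@hadoop2:~$ mkdir -p testdir1/testdir2/testdir3
hadoopguru@hadoop2:~$ ls testdir1/testdir2
testdir3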
Remove directory:
One can delete an existing directory using the rmdir command, but only if the directory is empty.
Example:
hadoopguru@hadoop2:~$ ls
apache-flume-1.4.0-bin.tar.gz.1 hadoop-1.2.1-bin.tar.gz
aveo hive
aveo_hadoop hive-0.11.0-bin.tar.gz
data mahout-distribution-0.8.tar.gz
datanode metastore_db
derby.log mydir
flume namenode
hadoop-1.2.1
hadoopguru@hadoop2:~$ rmdir mydir/
hadoopguru@hadoop2:~$ ls
apache-flume-1.4.0-bin.tar.gz.1 hadoop-1.2.1
aveo hadoop-1.2.1-bin.tar.gz
aveo_hadoop hive
data hive-0.11.0-bin.tar.gz
datanode mahout-distribution-0.8.tar.gz
derby.log metastore_db
flume namenode
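If the directory is not empty, rmdir refuses to delete it. A brief sketch with a hypothetical directory (the exact error wording may vary):
hadoopguru@hadoop2:~$ mkdir demo_dir
hadoopguru@hadoop2:~$ touch demo_dir/notes.txt
hadoopguru@hadoop2:~$ rmdir demo_dir
rmdir: failed to remove 'demo_dir': Directory not empty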
rmdir -p:
rmdir -p removes the directory at the specified path and then each of its parent directories, provided every directory is left empty.
Example:
hadoopguru@hadoop2:~$ ls
apache-flume-1.4.0-bin.tar.gz.1 hadoop-1.2.1
aveo hadoop-1.2.1-bin.tar.gz
aveo_hadoop hive
data hive-0.11.0-bin.tar.gz
datanode mahout-distribution-0.8.tar.gz
derby.log metastore_db
flume namenode
hadoopguru@hadoop2:~$ cd aveo_hadoop
hadoopguru@hadoop2:~/aveo_hadoop$ ls
aveo_hadoop1
hadoopguru@hadoop2:~/aveo_hadoop$ cd aveo_hadoop1
hadoopguru@hadoop2:~/aveo_hadoop/aveo_hadoop1$ ls
aveo_hadoop2
hadoopguru@hadoop2:~/aveo_hadoop/aveo_hadoop1$ cd aveo_hadoop2
hadoopguru@hadoop2:~/aveo_hadoop/aveo_hadoop1/aveo_hadoop2$ cd
hadoopguru@hadoop2:~$ rmdir -p aveo_hadoop/aveo_hadoop1/aveo_hadoop2/
hadoopguru@hadoop2:~$ ls
apache-flume-1.4.0-bin.tar.gz.1 hadoop-1.2.1-bin.tar.gz
aveo hive
data hive-0.11.0-bin.tar.gz
datanode mahout-distribution-0.8.tar.gz
derby.log metastore_db
flume namenode
hadoop-1.2.1
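In effect, rmdir -p aveo_hadoop/aveo_hadoop1/aveo_hadoop2 is roughly equivalent to removing the directories one by one, innermost first (a sketch, assuming each directory is left empty):
hadoopguru@hadoop2:~$ rmdir aveo_hadoop/aveo_hadoop1/aveo_hadoop2
hadoopguru@hadoop2:~$ rmdir aveo_hadoop/aveo_hadoop1
hadoopguru@hadoop2:~$ rmdir aveo_hadoop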
pushd and popd:
Both commands work on a common stack of previously visited directories.
pushd: adds a directory to the stack and changes to that directory.
popd: removes the top directory from the stack and changes to the directory that is now on top.
Example:
pushd:
hadoopguru@hadoop2:~$ cd hadoop-1.2.1/
hadoopguru@hadoop2:~/hadoop-1.2.1$ pushd /bin
/bin ~/hadoop-1.2.1
hadoopguru@hadoop2:/bin$ pushd /lib
/lib /bin ~/hadoop-1.2.1
hadoopguru@hadoop2:/lib$ pushd /hadoop
/hadoop /lib /bin ~/hadoop-1.2.1
popd:
hadoopguru@hadoop2:/hadoop$ popd
/lib /bin ~/hadoop-1.2.1
hadoopguru@hadoop2:/lib$ popd
/bin ~/hadoop-1.2.1
hadoopguru@hadoop2:/bin$ popd
~/hadoop-1.2.1
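To inspect the stack without changing directories, the dirs builtin prints it, and dirs -v numbers each entry. A brief sketch, taken just after the three pushd calls above:
hadoopguru@hadoop2:/hadoop$ dirs
/hadoop /lib /bin ~/hadoop-1.2.1
hadoopguru@hadoop2:/hadoop$ dirs -v
 0  /hadoop
 1  /lib
 2  /bin
 3  ~/hadoop-1.2.1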
MORE LINUX COMMANDS
Linux Commands - mkdir | rmdir | touch | rm | cp | more | less | head | tail | cat