Friday, September 15, 2017

Hadoop Commands


Command
What It Does
Usage
Examples
cat
Copies source paths to stdout.
hdfs dfs -cat URI [URI …]
hdfs dfs -cat hdfs://<path>/file1; hdfs dfs -cat file:///file2 /user/hadoop/file3
chgrp
Changes the group association of files. With -R, makes the change recursively by way of the directory structure. The user must be the file owner or the superuser.
hdfs dfs -chgrp [-R] GROUP URI [URI …]

chmod
Changes the permissions of files. With -R, makes the change recursively by way of the directory structure. The user must be the file owner or the superuser.
hdfs dfs -chmod [-R] <MODE[,MODE]... | OCTALMODE> URI [URI …]
hdfs dfs -chmod 777 test/data1.txt
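The octal mode in the example above packs the read/write/execute bits for owner, group, and others into one digit each. A small Python sketch (purely illustrative, not part of HDFS) shows how a mode such as 777 expands:

```python
# Decode an octal permission mode (as used by chmod) into rwx triplets.
def mode_to_rwx(octal_mode: str) -> str:
    bits = "rwx"
    out = []
    for digit in octal_mode[-3:]:  # owner, group, others
        d = int(digit, 8)
        # 4 = read, 2 = write, 1 = execute
        out.append("".join(bits[i] if d & (4 >> i) else "-" for i in range(3)))
    return "".join(out)

print(mode_to_rwx("777"))  # rwxrwxrwx
print(mode_to_rwx("644"))  # rw-r--r--
```

So `-chmod 777` grants full access to everyone, which is rarely what you want outside of a sandbox.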
chown
Changes the owner of files. With -R, makes the change recursively by way of the directory structure. The user must be the superuser.
hdfs dfs -chown [-R] [OWNER][:[GROUP]] URI [URI …]
hdfs dfs -chown -R hduser2 /opt/hadoop/logs
copyFromLocal
Works similarly to the put command, except that the source is restricted to a local file reference.
hdfs dfs -copyFromLocal <localsrc> URI
hdfs dfs -copyFromLocal input/docs/data2.txt hdfs://localhost/user/rosemary/data2.txt
copyToLocal
Works similarly to the get command, except that the destination is restricted to a local file reference.
hdfs dfs -copyToLocal [-ignorecrc] [-crc] URI <localdst>
hdfs dfs -copyToLocal data2.txt data2.copy.txt
count
Counts the number of directories, files, and bytes under the paths that match the specified file pattern.
hdfs dfs -count [-q] <paths>
hdfs dfs -count hdfs://nn1.example.com/file1 hdfs://nn2.example.com/file2
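The three numbers that count reports correspond to directory count, file count, and total bytes under each path. A local-filesystem sketch in Python (illustrative only, not an HDFS API call) computes the same three figures for an ordinary directory tree:

```python
import os
import tempfile

def count_tree(path):
    """Return (dir_count, file_count, byte_count) for a local tree,
    analogous to the DIR_COUNT, FILE_COUNT, CONTENT_SIZE columns of -count."""
    dirs = files = size = 0
    for root, dnames, fnames in os.walk(path):
        dirs += 1  # -count includes the path itself in the directory count
        files += len(fnames)
        size += sum(os.path.getsize(os.path.join(root, f)) for f in fnames)
    return dirs, files, size

# Demo on a throwaway directory: one subdirectory, two 5-byte files.
base = tempfile.mkdtemp()
os.mkdir(os.path.join(base, "sub"))
with open(os.path.join(base, "a.txt"), "w") as f:
    f.write("hello")
with open(os.path.join(base, "sub", "b.txt"), "w") as f:
    f.write("world")
print(count_tree(base))  # (2, 2, 10)
```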
cp
Copies one or more files from a specified source to a specified destination. If you specify multiple sources, the specified destination must be a directory.
hdfs dfs -cp URI [URI …] <dest>
hdfs dfs -cp /user/hadoop/file1 /user/hadoop/file2 /user/hadoop/dir
du
Displays the size of the specified file, or the sizes of files and directories that are contained in the specified directory. If you specify the -s option, displays an aggregate summary of file sizes rather than individual file sizes. If you specify the -h option, formats the file sizes in a "human-readable" way.
hdfs dfs -du [-s] [-h] URI [URI …]
hdfs dfs -du /user/hadoop/dir1 /user/hadoop/file1
dus
Displays a summary of file sizes; equivalent to hdfs dfs -du -s.
hdfs dfs -dus <args>

expunge
Empties the trash. When you delete a file, it isn't removed immediately from HDFS, but is renamed to a file under the .Trash directory. As long as the file remains there, you can undelete it if you change your mind, though only the latest copy of the deleted file can be restored.
hdfs dfs -expunge

get
Copies files to the local file system. Files that fail a cyclic redundancy check (CRC) can still be copied if you specify the -ignorecrc option. The CRC is a common technique for detecting data transmission errors. CRC checksum files have the .crc extension and are used to verify the data integrity of another file. These files are copied if you specify the -crc option.
hdfs dfs -get [-ignorecrc] [-crc] <src> <localdst>
hdfs dfs -get /user/hadoop/file3 localfile
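The CRC idea behind the -ignorecrc and -crc options is simple: store a short checksum of the data, recompute it on the copy, and compare. A minimal Python sketch using the standard-library zlib.crc32 (illustrative only; HDFS uses its own checksum files, not this function) shows the principle:

```python
import zlib

# A CRC is a short checksum used to detect corruption: recompute it on the
# received bytes and compare against the stored value.
data = b"block of file data"
stored_crc = zlib.crc32(data)

received = data                      # intact copy: checksums match
print(zlib.crc32(received) == stored_crc)   # True

corrupted = b"block of file dat?"    # one byte changed: mismatch detected
print(zlib.crc32(corrupted) == stored_crc)  # False
```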
getmerge
Concatenates the files in <src> and writes the result to the specified local destination file. To add a newline character at the end of each file, specify the addnl option.
hdfs dfs -getmerge <src> <localdst> [addnl]
hdfs dfs -getmerge /user/hadoop/mydir/ ~/result_file addnl
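The effect of getmerge is easy to mimic on local files; a hedged Python sketch (local filesystem only, not the HDFS client) reproduces the concatenation and the optional addnl newline:

```python
import os
import tempfile

def getmerge(srcs, dst, addnl=False):
    """Concatenate the source files into dst; with addnl, append a newline
    after each file (mirrors the addnl option of -getmerge)."""
    with open(dst, "wb") as out:
        for src in srcs:
            with open(src, "rb") as f:
                out.write(f.read())
            if addnl:
                out.write(b"\n")

# Demo with two throwaway part files.
d = tempfile.mkdtemp()
p1, p2, merged = (os.path.join(d, n) for n in ("part1", "part2", "merged"))
for path, text in ((p1, "alpha"), (p2, "beta")):
    with open(path, "w") as f:
        f.write(text)
getmerge([p1, p2], merged, addnl=True)
with open(merged) as f:
    print(repr(f.read()))  # 'alpha\nbeta\n'
```

Without addnl, files that don't end in a newline would run together ("alphabeta"), which is usually why the option exists.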
ls
Returns statistics for the specified files or directories.
hdfs dfs -ls <args>
hdfs dfs -ls /user/hadoop/file1
lsr
Serves as the recursive version of ls; similar to the Unix command ls -R.
hdfs dfs -lsr <args>
hdfs dfs -lsr /user/hadoop
mkdir
Creates directories on one or more specified paths. Its behavior is similar to the Unix mkdir -p command, which creates all directories that lead up to the specified directory if they don’t exist already.
hdfs dfs -mkdir <paths>
hdfs dfs -mkdir /user/hadoop/dir5/temp
moveFromLocal
Works similarly to the put command, except that the source is deleted after it is copied.
hdfs dfs -moveFromLocal <localsrc> <dest>
hdfs dfs -moveFromLocal localfile1 localfile2 /user/hadoop/hadoopdir
mv
Moves one or more files from a specified source to a specified destination. If you specify multiple sources, the specified destination must be a directory. Moving files across file systems isn’t permitted.
hdfs dfs -mv URI [URI …] <dest>
hdfs dfs -mv /user/hadoop/file1 /user/hadoop/file2
put
Copies files from the local file system to the destination file system. This command can also read input from stdin and write to the destination file system.
hdfs dfs -put <localsrc> ... <dest>
hdfs dfs -put localfile1 localfile2 /user/hadoop/hadoopdir;
hdfs dfs -put - /user/hadoop/hadoopdir (reads input from stdin)
rm
Deletes one or more specified files. This command doesn't delete directories; use rmr to delete a directory and its contents. To bypass the trash (if it's enabled) and delete the specified files immediately, specify the -skipTrash option.
hdfs dfs -rm [-skipTrash] URI [URI …]
hdfs dfs -rm hdfs://nn.example.com/file9
rmr
Serves as the recursive version of rm.
hdfs dfs -rmr [-skipTrash] URI [URI …]
hdfs dfs -rmr /user/hadoop/dir
setrep
Changes the replication factor for a specified file or directory. With -R, makes the change recursively by way of the directory structure.
hdfs dfs -setrep <rep> [-R] <path>
hdfs dfs -setrep 3 -R /user/hadoop/dir1
stat
Displays information about the specified path.
hdfs dfs -stat URI [URI …]
hdfs dfs -stat /user/hadoop/dir1
tail
Displays the last kilobyte of a specified file to stdout. The syntax supports the Unix -f option, which enables the specified file to be monitored. As new lines are added to the file by another process, tail updates the display.
hdfs dfs -tail [-f] URI
hdfs dfs -tail /user/hadoop/dir1
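The "last kilobyte" behavior amounts to seeking to end-of-file minus 1024 bytes and reading from there. A local Python sketch (ordinary files, not the HDFS client, and without the -f follow mode) makes the mechanics concrete:

```python
import os
import tempfile

def tail_bytes(path, n=1024):
    """Return the last n bytes of a file, like the last-kilobyte
    display of hdfs dfs -tail (follow mode not implemented)."""
    with open(path, "rb") as f:
        f.seek(0, os.SEEK_END)
        size = f.tell()
        f.seek(max(0, size - n))
        return f.read()

# Demo: a 2004-byte file; tail returns exactly the final 1024 bytes.
path = os.path.join(tempfile.mkdtemp(), "sample.log")
with open(path, "wb") as f:
    f.write(b"a" * 2000 + b"END!")
last = tail_bytes(path)
print(len(last))                # 1024
print(last.endswith(b"END!"))   # True
```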
test
Returns attributes of the specified file or directory. Specify -e to determine whether the file or directory exists; -z to determine whether the file is empty; and -d to determine whether the URI is a directory.
hdfs dfs -test -[ezd] URI
hdfs dfs -test -e /user/hadoop/dir1
text
Outputs a specified source file in text format. Valid input file formats are zip and TextRecordInputStream.
hdfs dfs -text <src>
hdfs dfs -text /user/hadoop/file8.zip
touchz
Creates a new, empty file of size 0 in the specified path.
hdfs dfs -touchz <path>
hdfs dfs -touchz /user/hadoop/file12
