Sunday, 4 August 2013

AccessControlException: Access denied for user hdfs. Superuser privilege is required

put: Permission denied: user=XYZ, access=WRITE, inode="/user/test":hadoopuser:hdfsusers:drwxr-xr-x

This error occurs because the permissions check fails for the directory /user/test. Whenever a file or directory in HDFS is created or modified, HDFS performs a permissions check on it. In Hadoop without security (Kerberos) enabled, the identity of a client process is simply whatever the host operating system reports.
On Unix-like systems, Hadoop identifies the client by running shell commands:
The user name is the equivalent of `whoami`;
The group list is the equivalent of `bash -c groups`.
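
As a quick check, the following sketch prints the user name and group list that Hadoop resolves for the current process. It assumes the Hadoop client libraries are on the classpath; the class name is illustrative only:

    import java.util.Arrays;
    import org.apache.hadoop.security.UserGroupInformation;

    // Prints the identity Hadoop derives from the host OS
    // (the equivalents of `whoami` and `bash -c groups`).
    public class WhoAmIInHadoop {
        public static void main(String[] args) throws Exception {
            UserGroupInformation ugi = UserGroupInformation.getCurrentUser();
            System.out.println("User:   " + ugi.getUserName());
            System.out.println("Groups: " + Arrays.toString(ugi.getGroupNames()));
        }
    }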

Super User
The super-user is the user with the same identity as the NameNode daemon process. That is, whichever user started the NameNode daemon is the super-user.

If you are getting the above error while copying a file to an HDFS directory:

                $hadoop fs -put text.txt /user/test/

You (user XYZ) are copying a file to the HDFS directory "/user/test/", which is owned by hadoopuser in the group hdfsusers. You can confirm the ownership programmatically, as in the sketch below.
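
As a hedged illustration (the path and class name are assumptions taken from the error above), this sketch reads the owner, group, and permission bits of /user/test through the HDFS client API:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    // Prints owner, group, and mode of the target directory,
    // e.g. "hadoopuser hdfsusers rwxr-xr-x" for /user/test.
    public class CheckOwnership {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration(); // reads core-site.xml/hdfs-site.xml
            FileSystem fs = FileSystem.get(conf);
            FileStatus status = fs.getFileStatus(new Path("/user/test"));
            System.out.println(status.getOwner() + " "
                    + status.getGroup() + " "
                    + status.getPermission());
        }
    }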
Before running the hadoop command, export the Hadoop user name like below, then try again:

                $export HADOOP_USER_NAME=hadoopuser

org.apache.hadoop.ipc.RemoteException: org.apache.hadoop.security.AccessControlException: Access denied for user hdfs. Superuser privilege is required

If you are getting this kind of error in Java programs, export the user name before running the jar:

                   $export HADOOP_USER_NAME=hadoopuser

OR

Include these lines in your Java program:

               // hadoopSuperUserName is the super-user's name, e.g. "hadoopuser"
               String hadoopSuperUserName = "hadoopuser";
               System.setProperty("HADOOP_USER_NAME", hadoopSuperUserName);

               Configuration conf = new Configuration();
               conf.set("hadoop.job.ugi", hadoopSuperUserName);
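
Putting it together, here is a minimal end-to-end sketch, assuming a local file text.txt and the target directory /user/test from the example above; the class name and file names are illustrative:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    // Copies text.txt into /user/test/ while acting as the super-user.
    // HADOOP_USER_NAME must be set before the FileSystem is created,
    // because the client login happens on first FileSystem access.
    public class PutAsSuperUser {
        public static void main(String[] args) throws Exception {
            System.setProperty("HADOOP_USER_NAME", "hadoopuser");

            Configuration conf = new Configuration();
            FileSystem fs = FileSystem.get(conf);
            fs.copyFromLocalFile(new Path("text.txt"), new Path("/user/test/"));
            fs.close();
        }
    }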