Welcome to the HadoopExam HDPCA (Hortonworks® Data Platform : HDP® Administrator) Practice Questions and Answers.
To access all questions and answers for HDPCA, you must have a subscription from www.HadoopExam.com.
Tips and Tricks for the HDPCA Certification exam (web or PDF)
This page covers the HDPCA certification practice scenarios: recorded video sessions explain how to solve the problem scenarios from the HadoopExam.com Practice Question Bank. There are currently 57 practice scenarios for the HDPCA exam, which you can work through on a multi-node cluster. The practice videos use a 4-node HDP cluster built from scratch, so you can create your own HDP cluster from scratch and then practice all 57 scenarios. For more details visit www.HadoopExam.com.
- Use SignIn to log in with your permitted email ID
- Use the Pedagogy Navigation to watch the individual problem and solution videos
Installation
- Configure a local HDP repository : You are required to create a local repository of all the HDP software on one of the nodes in the cluster. Once the repository is set up, you use it to install the HDP software on all 4 nodes of the cluster.
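As a rough sketch of the last step, the Python snippet below writes the yum repo definition that points a node at the local repository. The host repo-node.example.com, the repo paths, and the file name are assumptions for illustration; it presumes the HDP and HDP-UTILS repo tarballs have already been extracted under the docroot of a local web server.

```python
#!/usr/bin/env python
# Hypothetical sketch: write a yum repo definition pointing this node at a
# local HDP repository served by repo-node.example.com (made-up host).
# Adjust versions and paths for your own cluster.

REPO_FILE = "/etc/yum.repos.d/hdp.local.repo"

REPO_CONTENTS = """[HDP-local]
name=HDP local repository
baseurl=http://repo-node.example.com/hdp/HDP/centos7/
enabled=1
gpgcheck=0

[HDP-UTILS-local]
name=HDP-UTILS local repository
baseurl=http://repo-node.example.com/hdp/HDP-UTILS/repos/centos7/
enabled=1
gpgcheck=0
"""

with open(REPO_FILE, "w") as f:
    f.write(REPO_CONTENTS)

print("Wrote %s; verify with 'yum clean all && yum repolist'." % REPO_FILE)
```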
- Install ambari-server and ambari-agent : This involves two steps
- Install Ambari Server : This needs to be done on only one node in the cluster (you can choose one of the master nodes).
- Install Ambari Agent : The Ambari agent should be installed on every node in the cluster, so that each node can send data to the Ambari Server and be displayed in the Ambari Web UI.
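A rough sketch of what these two steps look like from the command line, assuming a CentOS/RHEL node and a made-up master host name; exact package versions and paths may differ on your cluster.

```python
#!/usr/bin/env python
# Sketch of the two install steps. Run "python ambari_install.py server" on the
# chosen master node and "python ambari_install.py agent" on every node.
import subprocess
import sys

AMBARI_SERVER_HOST = "master1.example.com"   # hypothetical Ambari Server node

def run(cmd):
    print("+ " + cmd)
    subprocess.check_call(cmd, shell=True)

def install_server():
    run("yum install -y ambari-server")
    run("ambari-server setup -s")   # -s accepts the defaults silently
    run("ambari-server start")

def install_agent():
    run("yum install -y ambari-agent")
    # Point this agent at the Ambari Server so it can register and report.
    run("sed -i 's/^hostname=.*/hostname=%s/' /etc/ambari-agent/conf/ambari-agent.ini"
        % AMBARI_SERVER_HOST)
    run("ambari-agent start")

if __name__ == "__main__":
    install_server() if "server" in sys.argv[1:] else install_agent()
```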
- Install HDP using the Ambari install wizard : Now create the 4-node cluster using the Ambari UI. This involves the following steps:
- Choose your cluster name
- Choose master and slave nodes
- Configure Local Repository
- Register your nodes in the cluster
- Install Hadoop on all 4 nodes
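The wizard itself is UI-driven, but one quick way to double-check the result is the Ambari REST API. The sketch below simply lists the registered hosts; the host name, port 8080, and admin/admin credentials are assumptions.

```python
#!/usr/bin/env python
# Sketch: list the hosts registered with Ambari after the install wizard
# finishes (placeholder host and default credentials).
import requests

AMBARI = "http://master1.example.com:8080"
AUTH = ("admin", "admin")

resp = requests.get(AMBARI + "/api/v1/hosts",
                    auth=AUTH, headers={"X-Requested-By": "ambari"})
resp.raise_for_status()

for item in resp.json().get("items", []):
    print(item["Hosts"]["host_name"])   # expect all 4 cluster nodes here
```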
- Add a new node to an existing cluster : You will be given a node that is not yet attached to your cluster, along with its private SSH key, and you need to add it to your existing 4-node cluster.
- Decommission a node : This does not mean deleting the node; it means removing the node from the cluster and cleaning it up (removing all Hadoop DataNode configuration).
- Add an HDP service to a cluster using Ambari : There are more than 15 services that can be added to an HDP cluster. You need to learn how to add these services individually or all together.
Configuration
- Define and deploy a rack topology script : You should be able to create a rack topology script and deploy it, so that DataNodes are assigned to racks and the data you store in the cluster is placed according to the defined rack topology.
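A rack topology script can be as simple as the sketch below (host and rack names are made up). You point net.topology.script.file.name in core-site.xml at the file and make it executable; HDFS invokes it with one or more host names or IP addresses and reads one rack path per line from stdout.

```python
#!/usr/bin/env python
# Sketch of a rack topology script for a small 4-node cluster
# (example host-to-rack mapping).
import sys

RACKS = {
    "node1.example.com": "/rack1",
    "node2.example.com": "/rack1",
    "node3.example.com": "/rack2",
    "node4.example.com": "/rack2",
}

DEFAULT_RACK = "/default-rack"   # used for any host not in the mapping

for host in sys.argv[1:]:
    print(RACKS.get(host, DEFAULT_RACK))
```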
- Change the configuration of a service using Ambari : Since the cluster has more than 15 services, you need to be able to perform basic configuration for each of them.
- Configure the Capacity Scheduler : This is one of the schedulers for the jobs submitted to the cluster. As an admin you should be able to allocate appropriate resources to each cluster user, so correct queue configuration is expected.
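For reference, the sketch below lists the kind of capacity-scheduler properties involved for a hypothetical two-queue layout ("default" plus an assumed "analytics" queue); the actual queue names and percentages will come from the exam task.

```python
#!/usr/bin/env python
# Reference sketch: capacity-scheduler properties for a made-up two-queue setup.
CAPACITY_SCHEDULER = {
    "yarn.scheduler.capacity.root.queues": "default,analytics",
    "yarn.scheduler.capacity.root.default.capacity": "60",
    "yarn.scheduler.capacity.root.default.maximum-capacity": "100",
    "yarn.scheduler.capacity.root.analytics.capacity": "40",
    "yarn.scheduler.capacity.root.analytics.maximum-capacity": "60",
    "yarn.scheduler.capacity.root.analytics.state": "RUNNING",
}

# Print the properties as they would appear in capacity-scheduler.xml /
# the Ambari YARN "Scheduler" configuration section.
for name, value in sorted(CAPACITY_SCHEDULER.items()):
    print("%s=%s" % (name, value))
```

After applying the new values (through Ambari's YARN > Configs or capacity-scheduler.xml), refresh the queues with `yarn rmadmin -refreshQueues` or restart the ResourceManager from Ambari.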
- Create a home directory for a user and configure permissions : Create an HDFS home directory for each Unix user and set the appropriate ownership and permissions, i.e. who can read, update, and delete data in that HDFS directory.
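A minimal sketch of the usual commands, assuming a hypothetical user "analyst" and running as the hdfs superuser.

```python
#!/usr/bin/env python
# Sketch: create an HDFS home directory for a Unix user and set ownership and
# permissions ("analyst" is a made-up user). Run as the hdfs superuser,
# e.g. "sudo -u hdfs python create_home.py".
import subprocess

USER = "analyst"
HOME = "/user/" + USER

def hdfs(args):
    cmd = "hdfs dfs " + args
    print("+ " + cmd)
    subprocess.check_call(cmd, shell=True)

hdfs("-mkdir -p " + HOME)
hdfs("-chown %s:hdfs %s" % (USER, HOME))   # user owns it, group hdfs
hdfs("-chmod 750 " + HOME)                 # owner rwx, group r-x, others none
```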
- Configure the include and exclude DataNode files : The files referenced by dfs.hosts and dfs.hosts.exclude in hdfs-site.xml control which DataNodes are allowed to connect to the NameNode; after editing them, you must refresh the NameNode for the changes to take effect.
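A sketch of the exclude-file flow (this is also the mechanism behind decommissioning a node). The file path and host name are assumptions; use whatever dfs.hosts.exclude points to on your cluster.

```python
#!/usr/bin/env python
# Sketch: add a DataNode to the exclude file and tell the NameNode to re-read
# the include/exclude lists (placeholder path and host name).
import subprocess

EXCLUDE_FILE = "/etc/hadoop/conf/dfs.exclude"
NODE_TO_EXCLUDE = "node4.example.com"

with open(EXCLUDE_FILE, "a") as f:
    f.write(NODE_TO_EXCLUDE + "\n")

# The NameNode re-reads dfs.hosts / dfs.hosts.exclude and starts
# decommissioning the listed DataNode.
subprocess.check_call("sudo -u hdfs hdfs dfsadmin -refreshNodes", shell=True)
```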
- Troubleshooting : If there are any issues in the cluster, you should be able to find and fix them. More detail is given in the practice questions.
- Restart an HDP service : Once you have added services to the cluster, you will change their configuration and need to restart them.
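Besides the Ambari UI, a service can also be restarted through the Ambari REST API; the sketch below (hypothetical cluster name, host, and credentials) stops and then starts HDFS by setting its state.

```python
#!/usr/bin/env python
# Sketch: restart a service (stop, then start) through the Ambari REST API.
# Cluster name, host, and credentials are placeholders for illustration.
import json
import requests

AMBARI = "http://master1.example.com:8080"
CLUSTER = "hdpcluster"
SERVICE = "HDFS"
AUTH = ("admin", "admin")
HEADERS = {"X-Requested-By": "ambari"}

def set_state(state, context):
    url = "%s/api/v1/clusters/%s/services/%s" % (AMBARI, CLUSTER, SERVICE)
    body = {"RequestInfo": {"context": context},
            "Body": {"ServiceInfo": {"state": state}}}
    r = requests.put(url, auth=AUTH, headers=HEADERS, data=json.dumps(body))
    r.raise_for_status()
    print("%s -> %s (HTTP %s)" % (SERVICE, state, r.status_code))

set_state("INSTALLED", "Stop " + SERVICE)   # INSTALLED = stopped
set_state("STARTED", "Start " + SERVICE)
```

In practice you would wait for the stop request to finish before issuing the start; the Restart action in the Ambari UI does the equivalent for you.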
- View an application’s log file : For each service, you should be able to find the log path and check the logs for issues.
- Configure and manage alerts : You should be able to create a new alert configuration.
- Troubleshoot a failed job : If a user submitted a job and it failed, you should be able to find out why it failed.
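A small sketch of the usual first step: pulling the aggregated YARN logs for the failed application and scanning them for errors. The application ID is a placeholder; find the real one in the ResourceManager UI or with "yarn application -list -appStates FAILED".

```python
#!/usr/bin/env python
# Sketch: fetch aggregated YARN logs for a failed application and scan them.
# Needs log aggregation enabled (yarn.log-aggregation-enable=true) and the
# application to have finished.
import subprocess

APP_ID = "application_1234567890123_0042"   # hypothetical application ID

logs = subprocess.check_output("yarn logs -applicationId " + APP_ID, shell=True)

for line in logs.splitlines():
    if b"ERROR" in line or b"Exception" in line:
        print(line.decode("utf-8", "replace"))
```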
High Availability
- Configure NameNode HA : Configure NameNode High Availability
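For reference, the sketch below lists the main hdfs-site.xml properties behind NameNode HA, with placeholder nameservice, host, and JournalNode names; in the exam, Ambari's "Enable NameNode HA" wizard generates these values for you.

```python
#!/usr/bin/env python
# Reference sketch: core NameNode HA properties (placeholder names throughout).
NAMENODE_HA = {
    "dfs.nameservices": "mycluster",
    "dfs.ha.namenodes.mycluster": "nn1,nn2",
    "dfs.namenode.rpc-address.mycluster.nn1": "master1.example.com:8020",
    "dfs.namenode.rpc-address.mycluster.nn2": "master2.example.com:8020",
    "dfs.namenode.shared.edits.dir":
        "qjournal://master1.example.com:8485;master2.example.com:8485;"
        "node3.example.com:8485/mycluster",
    "dfs.client.failover.proxy.provider.mycluster":
        "org.apache.hadoop.hdfs.server.namenode.ha."
        "ConfiguredFailoverProxyProvider",
    "dfs.ha.automatic-failover.enabled": "true",
    "dfs.ha.fencing.methods": "shell(/bin/true)",
}

for name, value in sorted(NAMENODE_HA.items()):
    print("%s = %s" % (name, value))
```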
- Configure ResourceManager HA : Configure ResourceManager High Availability
- Copy data between two clusters using distcp : Copy data from one cluster to another cluster.
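A minimal sketch, with placeholder NameNode addresses and paths.

```python
#!/usr/bin/env python
# Sketch: copy a directory from one cluster to another with DistCp.
import subprocess

SRC = "hdfs://nn1.example.com:8020/data/sales"
DST = "hdfs://nn2.example.com:8020/data/sales"

# DistCp runs a MapReduce job that copies files in parallel; -update copies
# only files that are missing or changed on the destination.
subprocess.check_call("hadoop distcp -update %s %s" % (SRC, DST), shell=True)
```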
- Create a snapshot of an HDFS directory : Create a snapshot of a given HDFS directory.
- Recover a snapshot : Restore data from a snapshot.
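A sketch covering both snapshot items above, with placeholder directory, snapshot, and file names.

```python
#!/usr/bin/env python
# Sketch: take a snapshot of an HDFS directory, then restore a file from it.
import subprocess

DIR = "/data/important"
SNAP = "snap1"

def run(cmd):
    print("+ " + cmd)
    subprocess.check_call(cmd, shell=True)

# 1. Allow snapshots on the directory (run as the hdfs superuser).
run("sudo -u hdfs hdfs dfsadmin -allowSnapshot " + DIR)

# 2. Create the snapshot.
run("hdfs dfs -createSnapshot %s %s" % (DIR, SNAP))

# 3. Recover a file by copying it back out of the read-only .snapshot path.
run("hdfs dfs -cp %s/.snapshot/%s/report.csv %s/" % (DIR, SNAP, DIR))
```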
- Configure HiveServer2 HA : Configure HiveServer2 High Availability
Security
- Install and configure Knox : Install the Knox Gateway and configure it to provide authentication for cluster services.
- Install and configure Ranger : Install Ranger; you should be able to create a policy and check the audit configuration.
- Configure HDFS ACLs : Using HDFS ACLs, you should be able to control access to each user's directory.
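A short sketch with a placeholder path and user; HDFS ACLs require dfs.namenode.acls.enabled=true in hdfs-site.xml.

```python
#!/usr/bin/env python
# Sketch: grant an extra user read/execute access to a directory with an HDFS
# ACL and verify it ("auditor" and the path are placeholders).
import subprocess

PATH = "/user/analyst/reports"

def run(cmd):
    print("+ " + cmd)
    subprocess.check_call(cmd, shell=True)

run("hdfs dfs -setfacl -m user:auditor:r-x " + PATH)   # add a named-user entry
run("hdfs dfs -getfacl " + PATH)                       # show the resulting ACL
```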