Job Code : PTM/HA/1236
Looking for a Hadoop Admin
Should have 3 to 12 years of experience
Location: Trivandrum / Bangalore
Notice Period: Immediate joiners preferred
- Responsibilities and Duties:
- Install and configure components of the Hadoop ecosystem and maintain their day-to-day operation
- Deploy and maintain the Hadoop cluster, adding and removing nodes using cluster
monitoring tools such as Ganglia, Nagios, or Cloudera Manager
- Work closely with the database, network, BI, and application teams to ensure that all
big data applications are highly available and performing as expected
- Manage all aspects of the AWS infrastructure (compute, storage, network, permissions,
cost) using configuration management tools such as Ansible, AWS CloudFormation, and shell
scripts
- Assist in the design, automation, implementation, and sustainment of Amazon Machine
Images (AMIs) across the cloud environment.
- Configure S3 buckets with lifecycle policies that archive infrequently accessed data
to the appropriate storage classes.
- Experience with VPCs, subnets, and route tables
- Plan capacity and estimate the requirements for increasing or decreasing the capacity
of the Hadoop cluster.
- Monitor cluster connectivity and performance.
- Perform backup and recovery tasks.
- Manage resources and security.
- Troubleshoot application errors and ensure they do not recur.
- Competencies & Experience Required/Desired
- Bachelor’s degree in computer science, information technology, data science, data
analytics, interactive media, or related field
- 8+ years of experience in data warehousing or a similar analytic data role
- 3+ years of experience creating and managing complex data architectures.
- 5+ years of experience in database design, development, and data modeling.
- 5+ years of experience working as a Hadoop Admin
- Hands-on experience with Cloudera installation, configuration, debugging, and tuning.
- Must have prior experience deploying a Cloudera Hadoop cluster from scratch.
- Hands-on experience on Cloudera, working with data delivery teams to set up new
Hadoop users. This includes setting up Linux users, creating Kerberos principals, and
testing HDFS access for the new users.
- Competency in Red Hat Linux administration (security, configuration, tuning,
troubleshooting and monitoring).
- Expert knowledge of Active Directory/LDAP security integration with the Cloudera big
data platform.
- Performance tuning experience with Cloudera clusters, Spark jobs (PySpark, SparkR),
and MapReduce routines.
- Experience optimizing clusters for future workloads.
- Hands-on experience with node management; monitoring and response; creation of support
processes; upgrades and patches; logging configuration; and managing user rights and
space quotas
- Working knowledge of networks, the Linux OS, and Unix shell scripting.
- Experience working on Agile projects and Agile methodology in general
- Additional Skills:
- Hadoop, AWS, and Azure certifications are a plus
(Send your resume to firstname.lastname@example.org; please include the job code in the subject line of your email.)