- Validate Knowledge content for accuracy, relevancy, and currency
- Expertise in HDFS, MapReduce, Hive, Pig, Sqoop, HBase and Hadoop …
- Effectively identify change and use appropriate protocols to manage and communicate this change
- Collect, maintain and distribute project status meeting minutes to stakeholders
- Provide routine status reports and briefings to the project team, customers and senior managers
- Have a "hacker" mentality toward building solutions and problem-solving
- Able to understand and explore the constantly evolving tools within the Hadoop ecosystem and apply them appropriately to the relevant problems at hand
- Experience working with a Big Data implementation in a production environment
- Lead and assist with the technical design/architecture and implementation of the Big Data cluster in various environments
- Able to guide/mentor the development team, for example to create custom common utilities/libraries that can be reused across multiple big data development efforts
- Provide infrastructure system expertise, requirements and assistance to Systems and Database Administrators, other System Architects and application development teams
- Work with line of business (LOB) personnel, external vendors, and the internal Data Services team to develop system specifications in compliance with corporate standards for architecture adherence and performance guidelines
- Provide technical resources to assist in the design, testing and implementation of software code and infrastructure to support data infrastructure and governance activities
- Assist with both external and internal audit questionnaires and application assessments
- Assess the current technical architecture and estimate system capacity to meet near- and long-term processing requirements
- Evaluate, select, test, and optimize hardware and software products
- Discuss with end users, clients and senior management to define infrastructure requirements for complex systems and infrastructure development
- Design and oversee development and implementation of end-to-end integration of infrastructure solutions
- Document the Bank's existing solution architecture and technology portfolio and make recommendations for improvements and/or alternatives
- Develop, document and communicate needs for investing in infrastructure evolution, including analysis of cost-reduction opportunities
- Liaise with Enterprise Architects to conduct research on emerging technologies, and recommend technologies that will increase operational efficiency, infrastructure flexibility and operational stability
- Develop, document, communicate and enforce a policy for standardizing systems and software, as necessary
- Instruct, direct and mentor other members of the team
- Bachelor's degree in a technical or business-related field, or equivalent education and related training
- Twelve years of experience in data warehousing architectural approaches
- Exposure to and strong working knowledge of distributed systems involving internet protocols
- Excellent understanding of client-service models and customer orientation in service delivery
- Ability to derive technical specifications from business requirements and express complex technical concepts in terms understandable to multi-disciplinary teams, both verbally and in writing
- Ability to grasp the "big picture" for a solution by considering all potential options in the impacted area
- Involves designing, capacity planning, cluster setup, monitoring, structure planning, scaling and administration of Hadoop components (YARN, MapReduce, HDFS, HBase, ZooKeeper, Storm, Kafka, Spark, Pig and Hive)
- Accountable for performance tuning and resource management of Hadoop clusters and MapReduce routines
- Strong experience with Linux-based systems and scripting (Shell, Perl or Python)
- Experience with configuration management tools such as Puppet, Chef or Salt
- Good knowledge of directory services such as LDAP and ADS, and monitoring tools such as Nagios or Icinga2
- Strong troubleshooting skills for Hive, Pig, HBase and Java MapReduce code/jobs
- Creates/comprehends intermediate-to-complex logical (conceptual, relational, subject-level and dimensional) data models and other metadata deliverables at varying levels of detail and functionality across multiple business areas of responsibility
- Performs/comprehends analysis and volumetric parameters that dictate data profiling activity (preferred)
- Maintains understanding of the software industry, particularly with respect to data warehousing and analytics

This is not a Hadoop Developer role.
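The MapReduce expertise called for above can be illustrated with a minimal, framework-free sketch of the map/shuffle/reduce pattern: a word count in plain Python over hypothetical input records. This is an assumption-laden teaching sketch of the programming model, not the Hadoop Java API:

```python
from collections import defaultdict
from itertools import chain

def map_fn(line):
    # Map phase: emit a (word, 1) pair for each word in an input record.
    for word in line.lower().split():
        yield word, 1

def shuffle(pairs):
    # Shuffle phase: group emitted values by key, as the framework
    # does between the map and reduce stages.
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_fn(key, values):
    # Reduce phase: aggregate all values seen for one key.
    return key, sum(values)

# Hypothetical input records standing in for lines of an HDFS file.
records = ["big data big cluster", "data pipeline"]
pairs = chain.from_iterable(map_fn(r) for r in records)
counts = dict(reduce_fn(k, v) for k, v in shuffle(pairs).items())
print(counts)  # → {'big': 2, 'data': 2, 'cluster': 1, 'pipeline': 1}
```

In a real Hadoop job the map and reduce functions run distributed across the cluster and the shuffle is handled by the framework; the structure of the computation, however, is exactly this.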