Big Data Admin with Service Management / Consultant Specialist
Hyderābād
Job description
Some careers shine brighter than others.
If you’re looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further.
HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.
We are currently seeking an experienced professional to join our team in the role of Consultant Specialist.
In this role, you will:
· Act as a subject-matter expert (SME), with primary responsibility for solving complex issues in the big data platform and allied industry-standard products.
· Apply your experience in service management.
· Demonstrate excellent communication, interpersonal and problem-solving skills.
· Act as a strong team player, able to lead and mentor junior members.
· Apply strong data engineering, technology engineering and architecture knowledge.
· Work independently and as part of a team to design, develop, test and implement applications.
· Build on and suggest best practices.
· Work closely with Group Data Services stakeholders and GB/GF CDOs to ensure business requirements are understood, translated and delivered as tooling capabilities.
· Work closely with Group Data Services stakeholders and GB/GF CDOs on tooling capability adoption and exploitation.
· Coordinate delivery and dependencies between other Data Management projects and other relevant programmes/initiatives.
· Own and develop individual project plans and other documentation required for project governance.
· Manage and coordinate communication and training.
· Manage the Ops team.
· Manage and lead recruitment.
· Manage and coordinate budgeting cycle.
· Ensure all project portfolio reporting to other forums is consistent and aligned.
· Demonstrate good stakeholder management skills, engaging in both formal and informal conversations and driving the right decisions.
Requirements
To be successful in this role, you should meet the following requirements:
· Good experience administering a big data platform and its allied toolset, including platform software from Cloudera and Acceldata.
· Working knowledge of Hortonworks DataFlow (HDF) architecture, setup and ongoing administration.
· Experience working in secured environments using technologies such as Kerberos, Knox, Ranger, KMS, encryption zones and server SSL certificates.
· Prior experience of Linux system administration.
· Good experience of Hadoop capacity planning in terms of HDFS storage and YARN resources.
· Good troubleshooting skills: able to identify the specific service causing an issue, review logs, identify problem entries and recommend a solution, working with the product vendor.
· Capable of reviewing and accepting/challenging solutions provided by product vendors for platform optimization and root cause analysis tasks.
· Experience performing product upgrades of the core big data platform, cluster expansion and setting up high availability for core services.
· Good knowledge of Hive as a service, HBase, Kafka and Spark.
· Knowledge of basic data pipeline tools such as Sqoop, file ingestion and DistCp, and their optimal usage patterns.
· Knowledge of the various file formats and compression techniques used within HDFS, and the ability to recommend the right patterns based on application use cases.
· Exposure to Amazon Web Services (AWS) and Google Cloud Platform (GCP) services relevant to the big data landscape, their usage patterns and their administration.
· Experience working with application teams to enable their access to the clusters with the right level of access control and logging, using Active Directory (AD) and big data tools.
· Experience setting up disaster recovery solutions for clusters using platform-native tools and custom code, depending on the requirements.
· Ability to configure Java heap and allied parameters to ensure all Hadoop services run optimally.
· Significant experience in Linux shell scripting and in Python or Perl scripting.
· Experience with industry-standard version control tools (Git, GitHub, Subversion) and automated deployment and testing tools (Ansible, Jenkins, Bamboo, etc.).
· Experience working on projects with Agile/DevOps as the product management framework, a good understanding of its principles and the ability to work as part of POD teams.
· Working knowledge of open-source RDBMSs: MySQL, PostgreSQL, MariaDB.
· Ability to go under the hood of Hadoop services (Ambari, Ranger, etc.) that use a database as the backing store.
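For candidates gauging their fit: the capacity-planning requirement above is, at heart, simple arithmetic. A minimal sketch of the reasoning involved, using entirely hypothetical cluster figures (the node counts, disk sizes and reserve ratio below are illustrative assumptions, not details of the actual environment):

```python
def usable_hdfs_capacity_tb(raw_tb: float,
                            replication_factor: int = 3,
                            non_dfs_reserve: float = 0.20) -> float:
    """Logical HDFS capacity left after setting aside a reserve for
    non-DFS usage (OS, logs, intermediate data) and dividing by the
    block replication factor."""
    return raw_tb * (1 - non_dfs_reserve) / replication_factor

# Hypothetical example: a 10-node cluster with 48 TB of raw disk per node.
raw = 10 * 48.0  # 480 TB raw across the cluster
print(usable_hdfs_capacity_tb(raw))  # logical TB available to applications
```

With the default 3x replication and a 20% non-DFS reserve, 480 TB of raw disk yields 128 TB of logical HDFS capacity, which is why raw-disk figures overstate usable storage so sharply.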
You’ll achieve more when you join HSBC.
www.hsbc.com/careers
HSBC is committed to building a culture where all employees are valued, respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.
Issued by – HSDI