HSBC

Dataflow Engineer / Senior Software Engineer

  • Pune (Pune Division)
  • IT development

Job description

Some careers shine brighter than others.

If you’re looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further.

HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.

We are currently seeking an experienced professional to join our team in the role of Senior Software Engineer.

In this role, you will:

·  Design and Build Data Pipelines: Develop and implement efficient data pipelines for collecting, transforming, and storing data across different platforms.
·  Data Integration: Integrate data from a variety of sources, including cloud platforms (e.g. Google Cloud), databases (SQL/NoSQL), APIs, and external services.
·  Optimize Data Flow: Ensure optimal performance of data flows by troubleshooting and fine-tuning existing pipelines.
·  Data Transformation: Implement ETL (Extract, Transform, Load) processes, transforming raw data into usable formats for analytics and reporting (a minimal pipeline sketch follows this list).
·  Collaboration: Work with cross-functional teams (data engineering, operations) to understand data needs and implement solutions; provide support for the applications outside office hours and over weekends when needed.
·  Automation & Scalability: Build scalable, automated workflows to handle large volumes of data with high reliability and low latency.
·  Monitoring & Maintenance: Set up monitoring and alerting systems for data pipelines to ensure minimal downtime and maximum performance.
·  Documentation: Document data flows, pipeline configurations, and processing logic to ensure maintainability and transparency.
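
For orientation, here is a minimal sketch of the kind of batch ETL pipeline described above, written with the Apache Beam Python SDK (the programming model that Google Dataflow runs). The project, bucket, table, and field names are illustrative placeholders, not references to any HSBC system, and the pipeline is a sketch rather than a production design.

import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_event(line: str) -> dict:
    # Turn one raw JSON line into the flat record the target table expects.
    record = json.loads(line)
    return {
        "event_id": record["id"],
        "amount": float(record.get("amount", 0)),
        "event_ts": record["timestamp"],
    }


def run() -> None:
    # Placeholder project, region, and bucket; use runner="DirectRunner" to test locally.
    options = PipelineOptions(
        runner="DataflowRunner",
        project="my-gcp-project",
        region="europe-west2",
        temp_location="gs://my-bucket/tmp",
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            # Extract: read raw JSON lines from Cloud Storage.
            | "ReadRawEvents" >> beam.io.ReadFromText("gs://my-bucket/raw/events-*.json")
            # Transform: parse and flatten each record.
            | "ParseAndFlatten" >> beam.Map(parse_event)
            # Load: append the transformed rows to an existing BigQuery table.
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-gcp-project:analytics.events",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )


if __name__ == "__main__":
    run()

The same code runs unchanged on a laptop with the DirectRunner or at scale on Dataflow, which is why Beam is a common choice for the batch side of this kind of role.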

Requirements

To be successful in this role, you should meet the following requirements:

·  Experience as a Dataflow Engineer, Data Engineer, or similar role working with large datasets and distributed systems.
·  Proficient in programming languages such as Python and Java.
·  Hands-on experience with data pipeline orchestration tools (e.g. Google Dataflow).
·  Experience with cloud-based data platforms like Google Cloud (BigQuery, Dataflow).
·  Strong experience with ETL frameworks and tools, plus real-time data streaming and processing (a streaming sketch follows this list).
·  Familiarity with data formats like JSON, Parquet, etc.
·  Knowledge of SQL and NoSQL databases; data governance, data quality, and security best practices.
·  Problem-Solving: Ability to troubleshoot complex data integration and processing issues.
·  Communication Skills: Strong written and verbal communication skills to collaborate with technical and non-technical stakeholders.
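
As a companion to the batch sketch above, the following is a minimal illustration of the real-time streaming side of these requirements: an Apache Beam pipeline that reads messages from a Pub/Sub topic, applies one-minute fixed windows, and counts events per window. The topic name is a placeholder and the output is simply printed; it is a sketch under those assumptions, not a production design.

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions
from apache_beam.transforms import window


def run() -> None:
    # streaming=True tells the runner to keep the pipeline running on an unbounded source.
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as p:
        (
            p
            # Placeholder Pub/Sub topic; messages arrive as raw bytes.
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                topic="projects/my-gcp-project/topics/payment-events")
            | "DecodeUtf8" >> beam.Map(lambda msg: msg.decode("utf-8"))
            # Group the unbounded stream into one-minute fixed windows.
            | "WindowInto1Min" >> beam.WindowInto(window.FixedWindows(60))
            # Count events per window; without_defaults() avoids emitting an
            # empty global default for windows that receive no data.
            | "CountPerWindow" >> beam.combiners.Count.Globally().without_defaults()
            | "LogCounts" >> beam.Map(print)
        )


if __name__ == "__main__":
    run()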

www.hsbc.com/careers

HSBC is committed to building a culture where all employees are valued and respected and where opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.

Issued by – HSBC Software Development India
