Big Data Engineer
Milan (Città metropolitana di Milano) – IT Development
Job description
The mission of the Advanced Analytics Solution team is to develop and operate end-to-end automation, analytics, and AI assets that enhance company processes and business outcomes. As a Big Data Engineer, the role involves close collaboration with the analytics, IT, and architecture teams to develop, industrialize, and maintain data pipelines, analytics/AI models, and automation software for production. Responsibilities also include liaising with the IT department to manage the cloud-based analytics environments used by the team. The role further encompasses technology scouting and the definition of Data & Analytics solutions and standards across the organization, with participation in cross-functional working groups at both local and international levels. This position offers a dynamic and innovative work environment with opportunities for professional growth and development.
Key Responsibilities:
· Data Pipeline Development: Design, optimize, and re-engineer data pipelines and analytics models to ensure robust and efficient data processing.
· Project Delivery: Deliver automation and analytics artifacts in alignment with project timelines and objectives.
· Cross-functional Collaboration: Participate in cross-functional working groups to represent the Advanced Analytics team and contribute to the definition of target solutions.
· Cloud Infrastructure Management: Assist team members in managing cloud-based analytics infrastructure in coordination with the IT department.
· Operational Monitoring: Develop and implement activities required to operate and monitor production projects effectively.
· Technology Scouting: Support the identification and evaluation of new technologies, techniques, and software within the Data & Analytics domain.
Key Requirements/Skills/Experience:
· Experience: 2+ years of engineering experience, with a focus on developing and operating analytics/AI assets in large-scale data contexts.
· Education: Master’s degree in Computer Science or Computer Engineering. A PhD or post-graduate courses in related fields (e.g., HPC, BDE, DS) are a plus.
· Technical Proficiency:
· Excellent knowledge of Python, PySpark, SQL, and scripting.
· Proficiency with orchestration tools such as Airflow.
· Strong software engineering skills.
· Database Knowledge: Good understanding of relational and non-relational databases, ETL processes, change data capture (CDC), and data streaming tools and techniques.
· Cloud Ecosystems: Experience with Microsoft Azure or AWS ecosystems.
· Big Data Architectures: Familiarity with analytics and Big Data architectures (e.g., Kafka, Hadoop, Databricks) and web/cloud technologies (e.g., Docker, Kubernetes).
· Language Skills: Fluent in English, both written and spoken.
· Soft Skills:
· Team spirit, self-motivation, and a proactive, result-driven work style.
· Strong ability to meet deadlines, with excellent self-organization.
· Strong analytical, problem-solving, and communication skills.
· Additional Knowledge: Familiarity with statistics and data science topics, MLOps techniques, network communication protocols, and project management will be considered a plus.
Allianz Group is one of the most trusted insurance and asset management companies in the world. Caring for our employees, their ambitions, dreams and challenges, is what makes us a unique employer. Together we can build an environment where everyone feels empowered and has the confidence to explore, to grow and to shape a better future for our customers and the world around us.
We at Allianz believe in a diverse and inclusive workforce and are proud to be an equal opportunity employer. We encourage you to bring your whole self to work, no matter where you are from, what you look like, who you love or what you believe in.
We therefore welcome applications regardless of ethnicity or cultural background, age, gender, nationality, religion, disability or sexual orientation.
Join us. Let's care for tomorrow.
Candidates of all genders may apply, in accordance with Italian Law 903/77 (as subsequently amended).