PepsiCo

Senior Data Engineer

  • Kraków, Poland
  • IT development

Job description

Overview

Location Overview

Working with inspiring and experienced colleagues, you'll find that the atmosphere in our city-centre office in Kraków is informal and engaging. With drive and ingenuity, our teams deliver vital services to PepsiCo employees around the world. With an active, get-things-done culture, this is a place where your dynamism and agility will make a difference.

 

Job Overview

With a broader team of more than 250 associates and offices in New York, Chicago, Plano, Silicon Valley, Mexico, India and Poland, we’re unleashing the full potential of automation, data science, and machine learning to challenge the way snacks and beverages are sold every day.

We're looking for a senior software engineer to join our PepsiCo eCommerce Catalog engineering team. You’ll have a once-in-a-generation opportunity to influence the biggest players in online grocery while also building out data pipelines that are at the core of hundreds of millions of dollars in transactions.

Responsibilities

·  Own the development of ETL data pipelines, taking responsibility for design, data modeling, coding, testing, scalability, operability, and ongoing metrics
·  Collaborate with a globally distributed product/platform team that includes a product manager, business stakeholders, UX design, data analysts, data scientists, BI, and data engineers
·  Serve as the people manager of the Catalog data engineering team
·  Assist with hiring the best and brightest engineers to the team
·  Help create and maintain a fantastic work environment and culture that attracts and retains top engineering talent

The Catalog engineering team works mostly in Elixir, while the data engineering side relies largely on Python; however, how this evolves is up to you. We put more emphasis on your general engineering skills and willingness to learn than on knowledge of a particular language or framework.

 

Some technologies we use (familiarity is great, but not necessary):

·  Python
·  Airflow
·  Snowflake
·  PostgreSQL
·  Docker
·  Kubernetes (we have a dedicated team for this)

Qualifications

·  Bachelor’s Degree (Computer Science or related preferred)
·  Professional experience using Python or similar programming language
·  Deep understanding of relational databases (RDBMS) and SQL
·  Understanding of data warehousing
·  6+ years of professional experience in software development
·  2+ years of exposure to building ETL pipelines
·  Knowledge of professional software engineering practices and best practices for the full software development life cycle, including coding standards, code reviews, source control management, continuous integration, testing, and operations
·  Hands-on experience with GCP, AWS, or similar
·  Strong verbal and written communication skills
·  Knowledge of web scraping (nice to have)
·  Knowledge of Airflow, Kubernetes, and Docker (nice to have)
