Big Data ETL Engineer

  • Provincia De Heredia, Costa Rica
  • IT development

Job description

  • Job ID: 1015054
  • Date posted: 5/31/2018
  • Primary Location: Heredia, Provincia de Heredia, Costa Rica
  • Job Category: Information Technology
  • Schedule: Full time
  • Shift: No shift premium (Costa Rica)

At Hewlett Packard Enterprise (HPE), we live by three core values that drive our business: Partner. Innovate. Act. These values combine to help us create important work all over the world to advance how people live and work.

Big Data ETL Engineer

The Big Data ETL Engineer is responsible for developing ETL and streaming integration data flows from various internal systems, databases, and external partner systems. Data integration workflows are either scheduled workflows that drive automated processes or continuously running streaming jobs that respond to events in the system.
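
For illustration only: a minimal PySpark sketch of the kind of scheduled cleansing-and-transformation workflow described above. The paths and column names are hypothetical, not taken from this posting.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: read raw data landed by an ingestion job (path is hypothetical)
raw = spark.read.json("hdfs:///landing/partner_orders/")

# Transform: cleanse and normalize the records
clean = (raw
         .dropDuplicates(["order_id"])
         .filter(F.col("amount").isNotNull())
         .withColumn("order_date", F.to_date("order_ts")))

# Load: write a curated, partitioned copy for downstream consumers
clean.write.mode("overwrite").partitionBy("order_date").parquet("hdfs:///curated/orders/")

spark.stop()

A scheduler such as Oozie would typically run a job like this on a fixed cadence.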

In a typical day as a Big Data ETL Engineer, you would

·  Design ETL data flows, and develop and implement data querying, cleansing, and transformation programs in Pig, Hive, and Spark.
·  Integrate Flume, Kafka, and Spark to create continuous big data pipelines for digital apps (a minimal sketch follows this list).
·  Implement data integration workflows using tools like Talend, Pentaho, and Dataiku.
·  Apply analytical and problem-solving skills to new requirements.
·  Develop rapid prototypes to try out new design patterns and technologies.
·  Test data integration workflows in DEV and QA systems and support the move to production.
·  Execute and write portions of test plans and protocols; document code for your assigned portion of the application; identify and debug issues with code and suggest changes or improvements.
·  Communicate with business, product, and development teams during development, testing, and production support.
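
As a sketch of the continuous pipelines mentioned above, the following Spark Structured Streaming job reads events from Kafka and appends them to storage. The broker address, topic name, and paths are assumptions for illustration, not part of this posting.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Requires the spark-sql-kafka connector package on the Spark classpath
spark = SparkSession.builder.appName("app_events_stream").getOrCreate()

# Source: subscribe to a Kafka topic (broker and topic are hypothetical)
events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")
          .option("subscribe", "app_events")
          .load())

# Kafka delivers the payload as bytes; decode it to a string column
decoded = events.select(F.col("value").cast("string").alias("payload"))

# Sink: continuously append to storage; the checkpoint makes restarts fault tolerant
query = (decoded.writeStream
         .format("parquet")
         .option("path", "hdfs:///streams/app_events/")
         .option("checkpointLocation", "hdfs:///checkpoints/app_events/")
         .start())

query.awaitTermination()

Unlike the scheduled job sketched earlier, this query runs continuously and reacts to each new event as it arrives.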

If you…

·  are good at partnering, innovating, and making things happen, and are aligned with our core values
·  hold a Bachelor's or Master's degree in Computer Science, Information Systems, or equivalent.
·  have a minimum of 3 to 5 years of experience.
·  have hands-on experience developing Hadoop integrations for data ingestion, data mapping, and data processing using Pig, Hive, HBase, Sqoop, and Oozie.
·  have strong knowledge of Flume, Storm, Kafka, and Spark (a plus).
·  have a good understanding of programming fundamentals, asynchronous/synchronous programming concepts, and design patterns.
·  are a professional with 3+ years of software development experience, including 2+ years in an Agile/Scrum environment.
·  bring a can-do attitude and energy to the team.
·  are a self-starter with a natural ability to balance priorities and work in a lean and agile environment.
·  have a proven track record of building production software on Hadoop.
·  have expertise in SQL, including database creation, updates, and querying.
·  have strong Core Java and computer science fundamentals.
·  know software systems testing methodology, including execution of test plans, debugging, and testing scripts and tools.
·  have strong written and verbal communication skills, with mastery of English and the local language, and the ability to effectively communicate design proposals and negotiate options.
·  hold a Hadoop Developer certification.
·  have experience implementing Lambda architecture.
·  have experience with NoSQL databases such as Cassandra and MongoDB.
·  have Python scripting experience.
·  have experience in enterprise IT environments.

...then, apply!

We offer:

• A competitive salary and extensive social benefits

• A diverse and dynamic work environment

• Work-life balance and support for career development

Want to know more about HPE? Then let’s stay connected!

https://www.facebook.com/HPECareers

https://twitter.com/HPE_Careers

Hewlett Packard Enterprise is EEO F/M/Protected Veteran/Individual with Disabilities.
