
Big Data Developer - Apache Spark with Scala

  • Internship
  • POLAND
  • IT development

Job description
Accenture is a leading global professional services company, providing a broad range of services and solutions in strategy, consulting, digital, technology and operations. Combining experience and specialized skills across more than 40 industries and all business functions—underpinned by the world's largest delivery network—Accenture works at the intersection of business and technology to help clients improve their performance and create sustainable value for their stakeholders. With approximately 373,000 people serving clients in more than 120 countries, Accenture drives innovation to improve the way the world works and lives.

Your missions:
Are you passionate about building highly available, robust and scalable distributed systems? Do you like to be challenged and encouraged to learn and grow professionally?
If you have experience in designing architecture of data and analytics platforms, leveraging tools such as Apache Hadoop, Spark, Elasticsearch, or real-time event processing platforms such as Apache Storm or Kafka, and are interested in dev teams embracing cloud technologies, come and talk to us.

We're looking for a Big Data developer to be:
· The platform's design point of contact: provide detailed guidance to the platform's core and feature teams on Big Data design and solution engineering
· The team's and platform stakeholders' trusted advisor: collaborate with dev teams and technical stakeholders to learn about and make optimal use of Big Data technologies such as Hadoop and Spark
· A community player: capture and share best practices, and participate and contribute as a member of the worldwide Group Technology community

Desired profile
Qualifications :

·  Candidates must have 2+ years of experience with Apache Spark and Scala
·  Candidates must have strong hands-on experience with Apache Spark, including Spark performance tuning
·  Should be expert at debugging Spark jobs via the Spark UI
·  Should have a good understanding of Spark memory management and the different Spark configurations
·  Must have used Scala for writing Spark applications
·  Must have strong hands-on experience with Scala
·  Experience with a BDD or TDD approach is a plus
