Are you a Data Analytics specialist? Do you have Data Warehousing, Hadoop/Data Lake experience? Do you like to solve the most complex and high scale (billions + records) data challenges in the world today? Do you like to work in a variety of business environments, leading teams through high impact projects that use the newest data analytic technologies? Would you like a career path that enables you to progress with the rapid adoption of cloud computing?
At Amazon Web Services (AWS), we’re hiring highly technical cloud computing architects to collaborate with our customers and partners on key engagements. The Global Competency Center Professional Services team remotely provides direct-to-customer, back-office, and packaged-delivery consulting services. Our consultants develop, deliver, and implement AI, IoT, and data analytics projects that help our customers leverage their data to develop business insights. These professional services engagements focus on customer solutions such as machine learning, IoT, batch/real-time data processing, and business intelligence.
· Employ customer facing skills to represent AWS well within the customer’s environment and drive discussions with senior personnel regarding trade-offs, standard methodologies, project management and risk mitigation
· Work closely with AWS Platform Service Engineering and Architecture teams to help ensure the success of project consulting engagements with customers
· Work directly with customers’ technical resources to devise and recommend solutions based on the understood requirements
· Think strategically about business, product, and technical challenges in an enterprise environment
· Collaborate with AWS field sales, pre-sales, training, and support teams to help partners and customers learn and use AWS services such as Athena, Glue, Lambda, S3, DynamoDB, Relational Database Service (RDS), Amazon EMR, and Amazon Redshift
· Develop innovative solutions to complex business and technology problems
· Hands-on experience leading large-scale global data warehousing and analytics projects
· Ability to collaborate effectively across organizations
· Understanding of database and analytical technologies in the industry including MPP and NoSQL databases, Data Warehouse design, BI reporting and Dashboard development
· Experience with the AWS platform and services
· Experience with TCP/IP networking is a plus
· Ability to lead client specific strategic engagements involving customer/partner teams
· Understanding of, and ability to participate in, all phases of the SDLC, including requirements gathering, business analysis, configuration management, and quality control
· Desire and ability to interact with C-level/Director-level customers in IT and the business to help craft their enterprise architecture and IT strategies
· Strong sense of ownership, urgency, and drive
Amazon is committed to a diverse and inclusive workplace. Amazon is an equal opportunity employer and does not discriminate on the basis of race, national origin, gender, gender identity, sexual orientation, protected veteran status, disability, age, or other legally protected status. For individuals with disabilities who would like to request an accommodation, please visit https://www.amazon.jobs/en/disability/us.
· For virtual jobs where work can be performed in Colorado: For employees based in Colorado, this position starts at $85,100/year. A sign-on bonus and restricted stock units may be provided as part of the compensation package, in addition to a range of medical, financial, and/or other benefits, dependent on the position offered. For more information regarding Amazon benefits, please visit https://www.amazon.jobs/en/benefits. Applicants should apply via Amazon’s internal or external careers site.
· Bachelor’s degree in Computer Science, Engineering, Mathematics, or a related field, or equivalent professional or military experience
· Experience with IT platform implementation in a technical and analytical role
· 1+ years of experience with Data Lake/Hadoop platform implementations
· 1+ years of hands-on experience implementing and performance-tuning Hadoop/Spark deployments
· Experience with Apache Hadoop and the Hadoop ecosystem
· Experience with one or more relevant tools (Sqoop, Flume, Kafka, Oozie, Hue, Zookeeper, HCatalog, Solr, Avro)
· Experience with one or more SQL-on-Hadoop technologies (Hive, Impala, Spark SQL, Presto)
· Experience developing software in one or more programming languages (Java, Python, etc.)