Data Engineer - Global Accounts, Professional Services, AWSI-SDT-APJ-Japan
Job ID: 2901603 | Amazon Web Services Japan GK
Our AWS Professional Services consultants deliver IT infrastructure and application architecture guidance, lead proof-of-concept projects, perform enterprise portfolio assessments, review operational best practices, and conduct skills transfer workshops. AWS consultants collaborate with customers and partners to address security and compliance, performance and scale, and availability and manageability. They advise customers on data platforms using the full range of AWS services. They also assist with non-technical change management work covering policies, processes, and people.
At AWS, we’re looking for technical architects to collaborate with our customers and partners on key engagements, while helping our partners develop technical expertise and capacity. This role requires a strong combination of technical experience, hands-on delivery capability, technical leadership, and the ability to learn quickly in a fast-paced environment. The successful candidate will focus on customer solutions that enable customers to become data-first organizations.
Key job responsibilities
1. Expertise: Collaborate with AWS field sales, pre-sales, training and support teams to help partners and customers learn and use AWS services such as Amazon Elastic Compute Cloud (EC2), Amazon S3, AWS Glue, Amazon DynamoDB, Amazon Relational Database Service (RDS), Amazon EMR, and Amazon Redshift.
2. Solutions: Deliver on-site technical engagements with partners and customers. This includes participating in pre-sales on-site visits, understanding customer requirements, and creating consulting proposals and packaged big data service offerings.
3. Delivery: Engagements include short on-site projects proving the use of AWS services to support new distributed computing solutions that often span private cloud and public cloud services. Engagements will include migration of existing applications and development of new applications using AWS cloud services.
About the team
Inclusive Team Culture: Here at AWS, we embrace our differences. We are committed to furthering our culture of inclusion. We have ten employee-led affinity groups, reaching 40,000 employees in over 190 chapters globally. We have innovative benefit offerings, and host annual and ongoing learning experiences, including our Conversations on Race and Ethnicity (CORE) and AmazeCon (gender diversity) conferences. Amazon’s culture of inclusion is reinforced within our 16 Leadership Principles, which remind team members to seek diverse perspectives, learn and be curious, and earn trust.
Work/Life Balance: Our team puts a high value on work-life balance. It isn’t about how many hours you spend at home or at work; it’s about the flow you establish that brings energy to both parts of your life. We believe striking the right balance between your personal and professional life is critical to life-long happiness and fulfillment. We offer flexibility in working hours and encourage you to find your own balance between your work and personal lives.
Mentorship & Career Growth: Our team is dedicated to supporting new members. We have a broad mix of experience levels and tenures, and we’re building an environment that celebrates knowledge sharing and mentorship. We care about your career growth and strive to assign projects based on what will help each team member develop into a well-rounded professional and enable them to take on more complex tasks in the future.
BASIC QUALIFICATIONS
1. Bachelor's degree in Computer Science, Engineering, Math, or related discipline
2. 5+ years of experience with data modeling, data warehousing, and building ETL pipelines
3. 5+ years of leadership experience in a technical, customer-facing role in the technology industry
4. Experience working with data lakes, modern data architectures, and Lambda-type architectures
5. Proficiency in writing and optimizing SQL
6. Knowledge of AWS services including S3, Redshift, EMR, Kinesis and RDS.
7. Experience with open-source data technologies (Hadoop, Hive, HBase, Pig, Spark, etc.)
8. Ability to write code in Python, Ruby, Scala, or another language commonly used with big data platforms
9. Knowledge of professional software engineering practices & best practices for the full software development lifecycle, including coding standards, code reviews, source control management, build processes, testing, and operations
PREFERRED QUALIFICATIONS
1. Strong verbal and written communication skills with stakeholders (Japanese preferred, with English as a second language)
2. Industry experience as a Data Engineer or related specialty (e.g., Software Engineer, Business Intelligence Engineer, Data Scientist) with a track record of manipulating, processing, and extracting value from large datasets.
3. Coding proficiency in at least one modern programming language (Python, Ruby, Java, etc.)
4. Experience building data products incrementally and integrating and managing datasets from multiple sources
5. Query performance tuning skills using Unix profiling tools and SQL
6. Experience leading large-scale data warehousing and analytics projects using AWS technologies such as Redshift, S3, EC2, and AWS Data Pipeline, as well as other big data technologies
7. Experience using Linux/UNIX to process large data sets
Amazon is committed to a diverse and inclusive workplace. Amazon is an equal opportunity employer and does not discriminate on the basis of race, national origin, gender, gender identity, sexual orientation, protected veteran status, disability, age, or other legally protected status.