Data Engineer – American Express
Job Description
Job Type: Full Time (work from home)
Location: Arizona
Company: American Express
Position Description:
Cornerstone is the company’s largest Big Data platform, built for computationally and data-intensive processing applications. It has extensive capabilities to manage such workloads effectively and cost-efficiently, whether data is handled in batch, real-time, or streaming mode.
How will you make an impact in this role?
- You will guide multiple teams on how to ingest, store, process, analyze, and explore/visualize data on our Cloud Platform. As a member of an agile team, you will contribute to software builds through consistent development practices (tools, common components, and documentation).
- You will work on data migrations and transformation initiatives, partnering with internal teams to build large-scale data processing systems and data pipelines optimized for scalability, and to address potential platform challenges.
- Serve as a trusted technical advisor to use-case teams, assisting them in resolving complex Big Data challenges.
- Develop and provide best-practice recommendations, tutorials, blog articles, and sample code to key business and technical partners at various levels.
- Design, build, and maintain enterprise capabilities and systems. Debug core software components and identify code defects for correction.
- Enable software deployment, support, and monitoring across test, integration, and production environments.
- Automate deployments in test and production environments.
- Work with agile teams to create, evolve, and maintain engineering-excellence principles relating to code, code reviews, testing methodologies (unit, integration, etc.), and defect management.
- Create prototypes using visualization and other techniques to accelerate concept development.
- Seek continuous improvement by exploring creative applications of existing ideas, processes, technologies, or products.
- Collaborate with other senior engineers and product owners to develop new features and improve existing ones.
Qualifications
- Bachelor’s degree in computer science, mathematics, or a related technical field, or equivalent practical experience
- Proficiency with data processing frameworks and techniques (e.g., Spark, Hadoop, Pig, Hive, MapReduce, Flume)
- Expertise in Java, C++, Python, Go, or JavaScript, building and scaling efficient, reliable, secure infrastructure and systems to support large-scale data efforts, primarily in the Amazon or GCP ecosystem
- Experience designing and implementing REST APIs
- Strong knowledge of NoSQL technologies such as HBase, Cassandra, Redis, and Memcached
- Ability to interpret technical and business objectives and challenges, and to articulate solutions
- Strong analytical and programming skills in a production setting
- Google Cloud Platform skills are a plus: BigQuery, Airflow, Terraform, and dbt