Full Time

Data Engineer – Remote Jobs San Diego

Posted 10 months ago
California
$45 - $50 per hour

Job Description

Job Type: Senior Data Engineer (Remote)
Location: San Diego, CA (Work from Home)
Company: CVS Health

CVS Health is seeking a motivated, enthusiastic, and hands-on Data Engineer with a background in SQL, Python, and cloud computing (ideally GCP), eager to join a high-impact team that is improving clients’ lives by redefining how they connect with the most qualified healthcare providers in their community.

Within Analytics & Behavior Change, you will join a team dedicated to data engineering. As the team builds applications and pipelines for CVS Health across many business divisions, you will support other analytics and data science partners in their work. Together with product owners, consultants, data scientists, and other engineers, you will generate ideas for products that improve health outcomes for millions of people.

Responsibilities:

  • Helping to create extensive data structures and pipelines that organize, gather, and standardize data to meet reporting and business objectives and produce insights
  • Designing database systems, writing ETL (Extract, Transform, Load) processes, and creating analytical processing tools (see the illustrative sketch after this list)
  • Working with stakeholders to transform data through automated processes and integrate it with other systems
  • Applying familiarity with Hadoop architecture, HDFS commands, and query design and optimization to build data pipelines
  • Using Python or another major programming language to create reliable data pipelines and dynamic systems
  • Constructing data models and marts to support clients and other internal consumers
  • Combining data from several sources while ensuring it meets data quality and accessibility guidelines
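
For context only, the ETL duties above typically look something like the minimal Python sketch below. Everything in it is hypothetical: the CSV file, table name, and schema are invented for illustration and are not drawn from the posting or any CVS Health system.

```python
# Minimal, hypothetical ETL sketch: extract rows from a CSV file,
# standardize them, and load them into a reporting table.
# The file name, table name, and schema are invented for illustration.
import csv
import sqlite3

def extract(path):
    # Extract: read raw rows from the source file.
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def transform(rows):
    # Transform: standardize fields so downstream reports agree.
    return [
        {"provider": r["provider"].strip().title(),
         "visits": int(r["visits"] or 0)}
        for r in rows
    ]

def load(rows, db_path="warehouse.db"):
    # Load: append the cleaned rows to a reporting table.
    con = sqlite3.connect(db_path)
    con.execute(
        "CREATE TABLE IF NOT EXISTS provider_visits "
        "(provider TEXT, visits INTEGER)"
    )
    con.executemany(
        "INSERT INTO provider_visits VALUES (:provider, :visits)", rows
    )
    con.commit()
    con.close()

if __name__ == "__main__":
    load(transform(extract("providers.csv")))
```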

Requirements:

Essential Requirements

  • Two or more years of experience developing data transformation and processing solutions with Python, SQL, or a comparable programming language
  • Two or more years of experience using a variety of tools and programming languages to analyze and manipulate data sets from multiple data sources
  • Two or more years of experience with cloud platforms such as GCP, Azure, AWS, or Snowflake, and with Big Data technologies like Spark and Hadoop
  • Three or more years of experience with Python, SQL, and/or shell scripting

Preferred Qualifications

  • Familiarity with Google Cloud Platform (GCP) technologies such as BigQuery, Cloud Composer, Dataproc, and Dataflow (Azure or AWS experience is also considered)
  • Familiarity with ML and AI tools such as scikit-learn and Vertex AI
  • Proficiency in Python API development
  • Expertise creating large-scale data and application pipelines
  • Practical familiarity with orchestration software such as Apache Airflow
  • Ability to understand complex systems and solve challenging analytical problems
  • Outstanding critical thinking and problem-solving abilities
  • Strong interpersonal and teamwork skills, both within and across teams

Education

  • A bachelor’s degree in a relevant field, such as information systems, machine learning, data engineering, or computer science
  • A master’s degree is preferred.

