Full Time
Data Engineer – Stay At Home Jobs
Job Description
Job Type: Data Engineer from home
Location: Wisconsin work from home
Company: Medix
Position Description:
Microsoft SQL Server, T-SQL, Azure Data Factory, Databricks, Apache Spark, Python/PySpark, and the Snowflake data warehouse make up the tech stack for this role. Primary duties include designing and implementing ADF pipelines and enriching and curating cloud data with Databricks and Python. Extensive experience with Microsoft's core BI technologies (T-SQL, SSIS/SSRS/SSAS) is required.
DAY-TO-DAY RESPONSIBILITIES WILL CONSIST OF:
- Working as part of a team to design and construct cloud data pipelines and cloud data models for corporate reporting.
- Extracting large volumes of raw data in a variety of formats from a range of source platforms (CRM, ERP, SAP, HR, internal and external systems) using Microsoft T-SQL.
- Building cloud data pipelines with Azure Data Lake storage and Azure Data Factory (ADF v2).
- Working on a hybrid on-premises strategy to connect on-premises services to cloud pipelines.
- Refining and curating very large data sets in the cloud with Databricks, Apache Spark, and Python/PySpark.
- Creating fact tables, dimensions, and data models for specific business applications
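The fact-and-dimension modeling duty above can be sketched in plain Python. This is a minimal illustration only; in this role the same transforms would run as PySpark DataFrame operations in Databricks, and every table and column name below is hypothetical:

```python
# Minimal sketch of deriving dimension and fact tables from raw records.
# All table and column names are hypothetical examples; a production
# pipeline would express these transforms as PySpark DataFrame operations.

raw_sales = [
    {"order_id": 1, "product": "Widget", "region": "WI", "amount": 120.0},
    {"order_id": 2, "product": "Gadget", "region": "WI", "amount": 75.5},
    {"order_id": 3, "product": "Widget", "region": "IL", "amount": 60.0},
]

def build_dimension(rows, attr):
    """Assign a surrogate key to each distinct value of one attribute."""
    values = sorted({r[attr] for r in rows})
    return {v: i + 1 for i, v in enumerate(values)}

# Dimension tables: natural value -> surrogate key
dim_product = build_dimension(raw_sales, "product")
dim_region = build_dimension(raw_sales, "region")

# Fact table: surrogate keys plus the additive measure
fact_sales = [
    {
        "order_id": r["order_id"],
        "product_key": dim_product[r["product"]],
        "region_key": dim_region[r["region"]],
        "amount": r["amount"],
    }
    for r in raw_sales
]
```

The same star-schema shape (narrow fact rows keyed to small dimension tables) is what the role's reporting models would target, whatever the actual business domain turns out to be.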
QUALIFICATIONS AND TECHNICAL SKILLS REQUIRED:
- Bachelor’s degree in a computer science-related subject is required, as is 5+ years of professional experience.
- 5+ years of Transact-SQL (T-SQL) experience is required, with an SSIS/SSRS foundation.
- 3+ years of recent experience with Microsoft Azure Data Factory (ADF v2) is required.
- 3+ years of experience with Azure Databricks is required.
- 3+ years of Python or PySpark experience is required.
- Extensive knowledge of data modeling
- Hands-on expertise with Azure Data Lake Storage (ADLS) is required.
- Outstanding written and verbal communication skills.