Need a SQL whiz with Databricks and coding experience (Python, .NET, etc.)
Snowflake is NOT required.
12+ month contract, remote
Job Description:
The Data Engineer will report to the BI/DW Supervisor and partner with BI and Software Engineers, Analysts, business stakeholders, and Enterprise Analytics leadership. This individual will build the data pipelines and data structures that create and support our cloud data warehouse. These datasets are heavily used by our business analysts, managers, and data scientists, and serve as the foundation for self-service BI and Advanced Analytics.
Responsibilities:
Work closely with business and technical teams to deliver enterprise-grade datasets that are reliable, flexible, scalable, and have a low cost of ownership.
Understand common analytical data models (e.g., Kimball dimensional modeling) and ensure physical data models align with best practices and requirements.
Build and maintain raw data pipelines from varied sources.
Build and maintain the data warehouse pipelines.
Update and create Azure Pipelines to support our continuous deployment model.
Recommend ways to improve data reliability, efficiency, and quality.
Analyze and estimate the feasibility, cost, time, and resources needed to develop and implement enterprise datasets as needed.
Research opportunities for data acquisition and new uses for existing data.
Collaborate with Enterprise Architecture to publish and contribute to architecture standards and roadmaps.
Achieve and maintain relevant technical competencies and help foster an environment of continued growth and learning among colleagues on existing and emerging technologies.
Qualifications:
A Bachelor's Degree in Computer Science or a related field is required; a high school diploma plus an equivalent combination of education and work experience may be substituted.
A minimum of 5 years of relevant development experience using integration platforms.
Prefer recent experience using cloud data engineering toolsets.
Recent experience in Azure using Azure Data Factory, Azure Databricks, and Snowflake is highly preferred.
A minimum of 2 years of experience building database tables and models.
Must be able to write T-SQL fluently for DDL and DML operations.
Strong understanding of enterprise integration patterns (EIP) and data warehouse modeling.
Experience with development and data warehouse requirements gathering, analysis, and design.
Possess strong business acumen and consistently demonstrate forward thinking.
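To illustrate the DDL/DML fluency called for above, here is a minimal, runnable sketch in Python using the standard-library sqlite3 module. The table and column names are hypothetical, and SQLite is used only so the example is self-contained; T-SQL syntax differs in places (e.g., IDENTITY columns, TOP clauses), but the DDL and DML statements shown are conceptually the same.

```python
import sqlite3

# In-memory database so the example is self-contained (hypothetical schema).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# DDL: define a table's structure.
cur.execute("""
    CREATE TABLE dim_customer (
        customer_id INTEGER PRIMARY KEY,
        name        TEXT NOT NULL,
        region      TEXT
    )
""")

# DML: insert, update, and query rows (parameterized to avoid SQL injection).
cur.execute("INSERT INTO dim_customer (name, region) VALUES (?, ?)",
            ("Acme", "West"))
cur.execute("UPDATE dim_customer SET region = ? WHERE name = ?",
            ("East", "Acme"))
row = cur.execute("SELECT region FROM dim_customer WHERE name = ?",
                  ("Acme",)).fetchone()
print(row[0])  # → East
conn.close()
```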