Job Summary
You will report to the Director of Data Engineering. The Data Modeler is responsible for developing,
updating, and enhancing data models and the semantic layer that support reporting, data visualization,
business intelligence, and data science. As a data architecture expert, the Data Modeler will be an
integral member of the Data Engineering organization, collaborating with multiple Agile squads to build
and deploy advanced SQL queries following Inmon and Kimball best practices. You will work in a
highly collaborative, lean Agile environment to build and continuously improve our data models,
applying your technical expertise in data analysis to support recommendations and exercising sound
judgment to make decisions and solve moderately to highly complex problems.
What You’ll Do
Design and build within a modern data architecture using Snowflake, Atlan, and other tools to add
product features for each ART (Agile Release Train). Your data models and robust semantic layer will
support visualizations, business intelligence, and data science.
Clearly communicate solutions to both technical and non-technical team members.
Enforce data modeling and data architecture standards across the Data and Analytics organization.
Develop advanced SQL queries; advanced SQL query development experience is required.
Own metadata management, using Atlan and Snowflake to define, govern, and manage metadata for our
data product. Expertise with metadata repositories is required; this role will build and maintain
the integrity of our data product semantics and glossary.
Manage and update enterprise data architecture artifacts, including data glossaries, data
architecture diagrams, data flows, and processes.
Communicate daily with team members regarding product features, development progress, and
barriers.
Translate business problems into data solutions and connect to business impact.
Apply established best practices to the development and support of our data warehouse, including
relational structures, dimensional data models, and SQL.
Evaluate and recommend new data management technologies and tools.
Optimize data architecture for performance, cost, and efficiency.
You will be challenged to innovate in how you integrate, secure, and scale data from multiple
internal and external data sources while reducing technical debt.
What We Need
8+ years of experience in data architecture, with at least 4 years focusing on cloud-based
solutions.
Experience developing data models, data integration processes, and data flows on the AWS cloud
platform.
Understanding of Kimball data warehousing design principles, Unified Star Schema (USS), or Data
Vault.
Firsthand experience building large-scale data warehousing solutions on Snowflake.
Experience developing dimensional data models and semantic data layers.
Hands-on experience with relational and multi-dimensional data modeling, including integrating
multiple source systems from databases and flat files.
Firsthand experience with Airflow.
Proven experience implementing complex stored procedures along with an understanding of
standard DWH and ELT concepts.
Effective written and verbal communication skills across all levels of the organization.
Familiarity with machine learning and data science technologies and their unique data requirements.
Ability and willingness to work within a lean Agile team to translate the needs of our business
partners into product features that enhance our self-service data platform.
Proven experience delivering within a distributed team.
Desired Qualifications
Bachelor’s degree in computer science or a related field.
You have experience with various patterns of data ingestion, processing, and curation.
Experience with Tableau preferred.
Experience implementing data quality initiatives such as test harnesses, monitoring, and data
auditing features.
Experience with CI/CD workflows using GitHub preferred.
Experience with Redshift, Treasure Data, or Microsoft Azure is helpful.
Experience building and designing data pipelines using Python, Snowflake, and Airflow preferred.
Experience with AWS data storage and management technologies such as S3.
Familiarity with orchestration technologies (e.g., Airflow, Astronomer).