Director, Data Engineering

Intelletec has partnered with a client looking to transform pet health, providing dogs and their owners with genuine, innovative, and simple care. They are also taking on the $90 billion pet food industry, replacing bags of highly processed mystery pellets with a personalized subscription service that delivers healthy, freshly made dog food directly to customers' doors. They have delivered over 200 million meals nationwide and raised over $150M in funding to build a company as healthy as the dogs eating those meals.

Where You'll Come In

As Director of Data Engineering, you will be responsible for the trust and reliability of all data. Since the company's inception, data has been one of its core competitive advantages, helping it grow from a tiny start-up to a successful category leader feeding millions of meals per month.

You and your team will be responsible for ensuring that each person and system that uses data receives it exactly when and how they need it. You will work with leadership, partners, and data engineers to establish SLAs and guide your team in meeting those SLAs.

You will drive the vision for our future architecture and guide the team in how to most effectively build our systems and model our data – all of which will be critical to our long-term growth and success. 

This role can be remote or based in NYC, and reports to the Head of Data Strategy and Insights.

How You'll Make an Impact

  • Support and continue to grow a team of 4+ data engineers with a focus on mentorship, coaching, and personal development

  • Develop a healthy and collaborative culture throughout the team

  • Own our data warehouse and all of the infrastructure supporting our ELT/ETL pipelines

  • Care for all of the data residing in our systems, from the moment it is recorded until it reaches its intended users

  • Work closely with the Data Science and Analytics teams to provide the data required to develop ML models and dashboards for customer-facing applications and internal use

  • Lead efforts in bringing in new data from various partners

  • Help ensure that all end-users have access to the data they need to make the most effective decisions and that the information is consistently fresh

  • Maintain all data-related tools

  • Lead and participate in system design and architecture conversations

  • Represent Data Engineering in high-level, cross-disciplinary road-mapping discussions

  • Interface with senior leaders to help develop the long-term vision and strategy for Data Engineering

We're Excited About You Because

  • You have 8+ years of experience, including at least five years in a hands-on data engineering role

  • You have at least three years of experience leading a data engineering team

  • You have experience hiring, managing, and growing engineering teams

  • You love systems architecture and modeling data

  • You've built and maintained multiple data pipelines and deeply understand BigQuery and Snowflake

  • You're proficient with dbt and document-store databases (e.g., Mongo, Dynamo, etc.)

  • You have hands-on experience productionizing machine learning systems at scale in Python or similar languages

  • You have extensive experience with cloud data warehousing (GCP / AWS / Snowflake)

Data Stack

We are a cloud-native organization relying on both GCP and AWS. Below are the tools that have gotten us to where we are today, but we are not married to any of them. Our infrastructure and architecture are ready for an overhaul, and we are looking for candidates with substantial experience in architectural design and data modeling.

  • Our central data warehouse is BigQuery.

  • Our principal language is Python.

  • We use an off-the-shelf tool for many of our simple data migrations (e.g., from AWS Postgres to BigQuery)

  • We make extensive use of dbt, Google Cloud Functions (GCP's equivalent of AWS Lambda), and Airflow for our transformations

  • Our stakeholders use Looker for most of their data exploration, and our data scientists use R and Python, tapped into designated projects in BigQuery

  • We leverage Segment for our web events.

  • Our counterparts in Product Engineering primarily use Postgres and DynamoDB

Please reach out to zack@intelletec.com for more info!