As a core contributor to the Data Engineering team, your primary goals will be building pipelines that gather data from disparate sources, cleaning that data, and loading the transformed results into structures tailored to each client's challenges and analytic approaches. You should have a solid understanding of how to access and extract data through various means (web services, web crawlers, direct access to source databases, etc.). You will create and apply custom solutions for data wrangling and develop semi-automated and automated ETL and database solutions.
- Contribute to the design and development of our Python data workflow management platform
- Design and develop tools to wrangle raw data, at both small and large volumes, into cleaned, normalized, and enriched datasets
- Build and enhance the Data Engineering codebase to improve efficiency and capacity
- Refine data normalization processes and performance-tune analytics