
Data Engineer

Neighbor
Lehi, UT · Onsite · Feb 10, 2026 · Posted 2 months ago

Must-Have Requirements

  • 3+ years of data engineering or analytics engineering experience
  • Bachelor's degree in quantitative/technical fields OR 5+ years as Data Engineer
  • Expert-level SQL mastery
  • Strong command of at least one major programming language for data processing
  • Hands-on experience designing and maintaining data lakes or cloud-based data warehouses
  • Deep understanding of data integration patterns (ETL/ELT)
  • Experience applying scientific, mathematical, or statistical techniques to analyze data

Nice to Have

  • Experience building predictive models
  • Advanced ability to translate complex datasets into actionable narratives using BI tools
  • Track record of using quantitative analysis to solve ambiguous problems

Description

At Neighbor, we’re building the largest hyperlocal marketplace the world has ever seen. We’ve raised over $75 million from top-tier investors such as Andreessen Horowitz and the CEOs of DoorDash, StockX, and Uber. Our marketplace is already flourishing in all 50 states and we’re just getting started!

As a Data Engineer, you will be the core engineering resource responsible for building, scaling, and optimizing the data infrastructure that transforms raw events into high-fidelity, actionable intelligence. You will pioneer a foundation of engineering excellence, enabling the entire organization to make reliable, data-driven decisions at scale across all 50 states.

This role will be the cornerstone of our data infrastructure, responsible for the extraction, transformation, and loading of the data that powers our nationwide, best-in-class marketplace. By implementing software engineering best practices and scalable solutions, this role is critical in empowering the CEO, executive team, managers, and individual contributors with the robust and trustworthy intelligence needed to scale and innovate across our marketplace.

Primary Responsibilities

  • Design, implement, and maintain scalable data transformation layers and code-first orchestration frameworks to ensure the delivery of high-fidelity, reusable data models
  • Design and build robust pipelines to ingest data from diverse sources (APIs, logs, relational DBs)
  • Ensure the reliable and timely execution of all critical data pipelines (ETLs/ELTs) to maintain data integrity and freshness
  • Standardize analytics workflows by integrating software engineering best practices, including version control, CI/CD pipelines, and automated data validation protocols
  • Develop and refine a robust semantic layer to facilitate self-service analytics, enabling stakeholders to derive insights without exposure to underlying architectural complexities
  • Monitor and optimize cloud compute utilization and data model performance to ensure high availability and low-latency reporting during periods of rapid data scaling
  • Serve as a strategic technical partner to leadership across Product, Engineering, Marketing, and Finance to align data infrastructure with organizational objectives
  • Become a subject matter expert on the product ecosystem, user behavior, and marketing life cycles to better translate raw data into business value
  • Serve as a versatile technical resource capable of stepping into the Data Analyst capacity when necessary, performing deep-dive quantitative analysis and building sophisticated visualizations to support executive decision-making
  • Mentor the data analytics team on advanced technical methodologies to foster a culture of engineering excellence and data autonomy

Qualifications

  • 3+ years of experience in data engineering or analytics engineering
  • Bachelor's degree in quantitative and/or technical fields (Math, Physics, Statistics, Economics, Computer Science, Engineering, etc.) OR 5+ years of work experience as a Data Engineer
  • Expert-level mastery of SQL, with the ability to write, tune, and optimize complex queries for high-volume environments
  • Strong command of at least one major programming language used for data processing
  • Hands-on experience designing and maintaining data lakes or cloud-based data warehouses
  • Deep understanding of data integration patterns, including data ingestion, transformation, and automated cleansing (ETL/ELT)
  • Experience applying scientific, mathematical, or statistical techniques to analyze data and build predictive models
  • Advanced ability to translate complex datasets into actionable narratives using modern business intelligence and reporting tools
  • A proven track record of using quantitative analysis to solve ambiguous problems and drive strategic decision-making in a fast-paced environment
  • Exceptional ability to collaborate with non-technical stakeholders, translating business requirements into technical specs and vice versa

Benefits

  • Generous stock options
  • Medical, dental, and vision insurance
  • Generous PTO
  • 11 paid company holidays
  • Hybrid work model: WFH every Monday
  • 401(k) plan
  • Infant care leave
  • On-site gym/showers open 24/7
