Nearu

Family of home‑services companies acquiring and empowering local HVAC, plumbing & electrical brands.

1,000 employees · HVAC · Plumbing · Electrical · Residential · Light Commercial

Data Architect

Design and maintain a Snowflake + Databricks platform, ensuring data quality for analytics and AI

Charlotte, North Carolina, United States
Full Time
Intermediate (4-7 years)

Job Highlights

Environment
Office Full-Time

About the Role

Build and maintain a governed Snowflake + Databricks data platform where data quality, cleansing, lineage, KPI consistency, and observability are enforced by design, enabling reliable analytics and scalable AI adoption.

  • Architect and maintain Snowflake as the enterprise analytical and AI data backbone, designing schemas for financial and operational KPIs, self‑service analytics, and AI datasets.
  • Implement Snowflake features such as Streams, Tasks, Dynamic Tables, Snowpark, and secure data sharing; optimize warehouse sizing, performance, and cost.
  • Engineer Databricks as the processing, cleansing, and enrichment layer, defining a Bronze/Silver/Gold medallion architecture and Spark‑based data quality logic.
  • Build reusable data cleansing, validation, and standardization frameworks for structured and unstructured data, embedding automated quality checks in pipelines.
  • Design and govern ELT pipelines from ERP, SaaS, and API sources using Fivetran, dbt, and Python, ensuring SLA compliance for analytics, finance, marketing, and AI workloads.
  • Create monitoring, alerting, and anomaly detection for Snowflake and Databricks pipelines, covering failures, data freshness, volume, and schema changes.
  • Partner with stakeholders to define, document, and operationalize enterprise KPIs, translating them into governed Snowflake views with traceable lineage.
  • Enable AI and LLM use cases by preparing RAG‑ready, embedding‑focused datasets and supporting integrations with Snowflake Cortex and Azure OpenAI.
  • Enforce data governance, access controls, masking, and lineage to meet enterprise and AI policy requirements.
  • Collaborate with data engineers, analysts, and IT teams to deliver production‑ready data assets under the direction of the Manager of Data Analytics and IT.
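
To illustrate the kind of "automated quality checks embedded in pipelines" this role describes, here is a minimal, pipeline‑agnostic sketch in plain Python. All names (`CheckResult`, `run_checks`, etc.) are hypothetical; a production version at Nearu would more likely live in Spark data‑quality logic or dbt tests rather than standalone functions.

```python
# Hypothetical sketch of reusable data-quality checks (not Nearu's actual code).
# Each check takes a batch of rows and returns a pass/fail result with detail,
# so checks can be composed and run as a pipeline gate.
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

@dataclass
class CheckResult:
    name: str
    passed: bool
    detail: str

def check_not_null(rows, column):
    """Fail if any row has a null in the given column."""
    missing = sum(1 for r in rows if r.get(column) is None)
    return CheckResult(f"not_null:{column}", missing == 0, f"{missing} null value(s)")

def check_min_volume(rows, minimum):
    """Fail if the batch has fewer rows than expected (volume anomaly)."""
    return CheckResult("min_volume", len(rows) >= minimum,
                       f"{len(rows)} rows (need >= {minimum})")

def check_freshness(rows, ts_column, max_age):
    """Fail if the newest record is older than max_age (freshness SLA)."""
    now = datetime.now(timezone.utc)
    newest = max((r[ts_column] for r in rows if r.get(ts_column)), default=None)
    ok = newest is not None and (now - newest) <= max_age
    return CheckResult("freshness", ok, f"newest record: {newest}")

def run_checks(rows, checks):
    """Run all checks; return the individual results and an overall pass flag."""
    results = [check(rows) for check in checks]
    return results, all(r.passed for r in results)
```

In a medallion architecture, a gate like `run_checks` would typically sit between the Bronze and Silver layers, blocking promotion of a batch that fails freshness, volume, or null‑rate thresholds.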

Key Responsibilities

  • Snowflake architecture
  • Databricks layer
  • ELT pipelines
  • Data cleansing
  • Monitoring & alerts
  • Data governance

What You Bring

  • Minimum 6 years of experience in data architecture/engineering, strong hands‑on Snowflake and Databricks/Apache Spark skills, advanced SQL and Python, and cloud platform experience (Azure preferred).
  • Preferred: experience with Snowflake Cortex, Snowpark, AI‑SQL functions, vector search, RAG pipelines, and finance/ERP data environments.

Requirements

  • Snowflake
  • Databricks
  • Spark
  • SQL
  • Python
  • Azure
