Site Reliability Engineer II - Data

AppFolio

The Role

Overview

Develop, operate, and improve reliable, scalable data pipeline infrastructure

Key Responsibilities

  • data architecture
  • data governance
  • agile development
  • tech research
  • data monitoring
  • data pipelines

Tasks

- Collaborate with engineers, data analysts, and scientists to ensure our data infrastructure meets the SLOs of our data-intensive customers
- Improve data architecture, quality, discoverability, and access policies to enable and enforce data governance
- Leverage agile practices, encouraging collaboration, prioritization, and urgency to develop at a rapid pace
- Research, share, and recommend new technologies and trends
- Develop techniques for monitoring the completeness, correctness, and reliability of our data sets
- Design, build, and operate next-generation data pipeline infrastructure

Requirements

  • flink
  • finops
  • python
  • kafka
  • kubernetes
  • aws

What You Bring

- You have industry experience working with real-time data technologies such as Apache Flink
- Background in monitoring and maintaining cost management on the cloud (FinOps)
- You have 3.5+ years of experience working with languages like Python or Ruby, Infrastructure as Code, configuration management, and monitoring tools
- You have a passion for building reliable, scalable, and fault-tolerant infrastructure
- Proficient in RBAC and data governance across multiple platforms and ecosystems (AWS, Snowflake, Kafka, etc.)
- Data science skills for analyzing data and communicating with ML engineers are a plus
- You have excellent MySQL operational knowledge
- Experience with Kafka, Kafka Connect, and its ecosystem is highly desirable
- Proficient in creating consumable tools for platform management
- Experience with containers and container orchestration tools such as Docker and Kubernetes
- Experience with AWS primitives (IAM, VPC, RDS, S3, MSK, EKS, etc.)
- You have worked with various data sources in production, including change data capture systems, event sourcing, and clickstreams
- Experience with large-scale data lakes and lakehouses, especially with Apache Iceberg, is a plus
- Proficient in tooling orchestration using CI/CD best practices with tools like CircleCI or Jenkins
- Experience with clickstream tracking technology, e.g. Snowplow, is desirable
- Bachelor's degree in Computer Science or another quantitative field
- Proven mentoring experience helping develop peers
- Experience with the Debezium connector is highly desirable

Benefits

- You care about work-life balance and want your company to care about it too; you'll put in the extra hour when needed, but won't let it become a habit.

The Company

About AppFolio

- Founded in 2006 by Klaus Schauser and Jon Walker, AppFolio emerged to simplify property management through centralized cloud software.
- Its flagship product, AppFolio Property Manager, integrates accounting, leasing, marketing, maintenance, and reporting into one platform.
- In 2012 it acquired MyCase, later divesting it in 2020 to refocus solely on real estate solutions.
- AppFolio has expanded via acquisitions like RentLinx, WegoWise, and Dynasty Marketplace to enhance listing presence, analytics, and AI capabilities.
- Typical customers range from small landlords to large portfolios, covering single-family, multifamily, commercial, student housing, and community associations.
- Its Stack™ Marketplace enables seamless third-party integrations tailored to unique workflows.
- In 2024 it introduced Realm-X, a generative AI assistant that automates tasks like report generation, messaging, and maintenance coordination.
- Known for rapid feature rollout, mobile-first investor tools, and embedded AI, it stands out in real-time property and investment management.

Sector Specialisms

Residential

Commercial