
Platform Engineer

Equinix

The Role

Overview

Design, build, and maintain AI/GenAI data platforms and cloud-native infrastructure.

Key Responsibilities

  • Data modeling
  • Multi-cloud
  • Pipeline development
  • Data governance
  • Infrastructure-as-code
  • AI automation

Tasks

- Lead initiatives in data modeling, semantic layer design, and data cataloging, ensuring data quality and discoverability across domains
- Architect and manage multi-cloud and hybrid cloud platforms (e.g., GCP, AWS, Azure) optimized for AI, ML, and real-time data processing workloads
- Streamline onboarding, documentation, and platform implementation and support using GenAI and conversational interfaces
- Develop and maintain real-time and batch data pipelines using tools such as Airflow, dbt, Dataform, and Dataflow/Spark
- Partner with engineering teams to introduce platform enhancements, observability, and cost-optimization techniques
- Provide technical leadership across AI-native data platforms, automation systems, and self-service tools
- Build and orchestrate multi-agent systems using frameworks such as CrewAI, LangGraph, or AutoGen for use cases such as pipeline debugging, code generation, and MLOps
- Implement enterprise-wide data governance practices, schema enforcement, and lineage tracking using tools such as DataHub, Amundsen, or Collibra
- Design and develop event-driven architectures using Apache Kafka, Google Pub/Sub, or equivalent messaging systems
- Create extensible CLIs, SDKs, and blueprints to simplify onboarding, accelerate development, and standardize best practices
- Integrate LLM APIs (OpenAI, Gemini, Claude, etc.) into platform workflows for intelligent automation and an enhanced user experience
- Foster a culture of ownership, continuous learning, and innovation
- Build reusable frameworks and infrastructure-as-code (IaC) using Terraform, Kubernetes, and CI/CD to drive self-service and automation
- Build and expose high-performance data APIs and microservices to support downstream applications, ML workflows, and GenAI agents
- Collaborate across teams to enforce cost, reliability, and security standards within platform blueprints
- Ensure platform scalability, resilience, and cost efficiency through modern practices such as GitOps, observability, and chaos engineering
- Collaborate across teams to shape the next generation of intelligent platforms in the enterprise
- Guide adoption of data fabric and data mesh principles for federated ownership, scalable architecture, and domain-driven data product development

Requirements

  • MLflow
  • Kubeflow
  • Weaviate
  • Kubernetes
  • Python
  • Data mesh

What You Bring

- Experience with ML platforms (MLflow, Vertex AI, Kubeflow) and AI/ML observability tools
- Experience with RAG pipelines, vector databases, and embedding-based search
- Hands-on expertise building and optimizing vector search and RAG pipelines using tools such as Weaviate, Pinecone, or FAISS to support embedding-based retrieval and real-time semantic search across structured and unstructured datasets
- Experience developing and integrating GenAI applications using MCP and orchestrating LLM-powered workflows (e.g., summarization, document Q&A, chatbot assistants, and intelligent data exploration)
- Deep knowledge of data modeling, distributed systems, and API design in production environments
- Experience with Looker Modeler, LookML, or semantic modeling layers
- Experience with GenAI/LLM frameworks and tools for orchestration and workflow automation
- Familiarity with observability tools (Prometheus, Grafana, OpenTelemetry) and strong debugging skills across the stack
- Proficiency in designing and managing Kubernetes, serverless workloads, and streaming systems (Kafka, Pub/Sub, Flink, Spark)
- 5+ years of hands-on experience in Platform Engineering, Data Engineering, Cloud Architecture, or AI Engineering roles
- Strong programming background in Java, Python, SQL, and one or more general-purpose languages
- Experience with metadata management, data catalogs, data quality enforcement, and semantic modeling, including automated integration with the data platform
- Proven experience building scalable, efficient data pipelines for structured and unstructured data
- Prior implementation of data mesh or data fabric in a large-scale enterprise

Benefits

- Work with a high-energy, mission-driven team that embraces innovation, open source, and experimentation

The Company

About Equinix

- World’s largest provider of data center and interconnection services
- Offers cutting-edge solutions that enable businesses to scale and adapt to the digital age
- Facilitates high-performance connections for industries such as cloud computing, telecommunications, and finance
- Provides services including hybrid cloud solutions, network and application performance optimization, and interconnection for business ecosystems
- At the forefront of advancing global digital transformation through strategic partnerships with leading companies
- Innovative business models drive enterprise infrastructure modernization, ensuring security and compliance

Sector Specialisms