Principal Data Modeler
Equinix
Global leader in data center and interconnection services, enabling digital transformation.
Design and govern enterprise data models and architecture for cloud AI platforms.
7d ago
$155,000 - $233,000
Expert & Leadership (13+ years)
Full Time
Dallas, TX
Office Full-Time
Company Size: 10,000 Employees
Service Specialisms
Data center colocation
Interconnection services
Smart Hands / remote support
Software-defined interconnection
Network Edge services
Equinix Metal (bare metal)
Equinix Fabric (cloud routing)
Managed services (integration & advisory)
Sector Specialisms: none listed
Role
What you would be doing
data modeling, data architecture, data lake, data warehouse, API design, data governance
Design cloud-native data solutions that balance performance, cost, and scalability
Support the integration of AI/ML models into data pipelines with a focus on data readiness, lineage, and governance
Develop and maintain canonical data models (CDM), shared taxonomies, and semantic models for assigned business domains
Support stewardship processes to maintain model integrity and alignment with evolving business needs
Collaborate with data governance teams to implement metadata standards, data quality rules, and lineage tracking
Contribute to the development of domain-specific data architectures that align with enterprise data strategies and cloud platform standards
Partner with engineering teams to design data contracts, APIs, and integration patterns that ensure data consistency across services
Support data quality frameworks and data stewardship
Contribute to the definition and enforcement of data architecture standards across assigned domains
Translate business requirements into well-structured data models that support operational, analytical, and AI/ML use cases
Design and guide implementation of data lake, data warehouse, and lakehouse solutions using platforms such as GCP BigQuery, Dataplex, Dataflow, or equivalents
Act as a technical leader and subject matter expert on data modeling and information architecture for engineering, analytics, and business teams
Design conceptual, logical, and physical data models for cloud data platforms, ensuring scalability, performance, and flexibility
Define data contracts and API data schemas (a minimal sketch follows this list)
Participate in architecture reviews, providing guidance on data design decisions and best practices
Ensure data models and designs align with governance, security, and compliance requirements (e.g., GDPR, CCPA)
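For illustration only (not part of the posting): a minimal sketch of what a versioned data contract for one of these canonical models could look like, assuming a plain-Python representation. The domain, entity, field names, and the `validate_record` helper below are all hypothetical.

```python
from dataclasses import dataclass, field
from typing import Any

@dataclass(frozen=True)
class FieldSpec:
    """One attribute in the contract: name, logical type, nullability."""
    name: str
    dtype: type
    nullable: bool = False

@dataclass(frozen=True)
class DataContract:
    """A versioned agreement between a data-producing and a data-consuming team."""
    domain: str                       # business domain the model belongs to
    entity: str                       # canonical entity name
    version: str                      # semantic version of the schema
    owner: str                        # data steward / owning team
    fields: tuple[FieldSpec, ...] = field(default_factory=tuple)

    def validate_record(self, record: dict[str, Any]) -> list[str]:
        """Return a list of violations; an empty list means the record conforms."""
        errors: list[str] = []
        for spec in self.fields:
            if spec.name not in record or record[spec.name] is None:
                if not spec.nullable:
                    errors.append(f"missing required field: {spec.name}")
                continue
            if not isinstance(record[spec.name], spec.dtype):
                errors.append(f"{spec.name}: expected {spec.dtype.__name__}")
        return errors

# Hypothetical contract for an interconnection-order event
order_contract = DataContract(
    domain="interconnection",
    entity="port_order",
    version="1.2.0",
    owner="data-platform@example.com",
    fields=(
        FieldSpec("order_id", str),
        FieldSpec("metro_code", str),
        FieldSpec("bandwidth_mbps", int),
        FieldSpec("notes", str, nullable=True),
    ),
)

print(order_contract.validate_record(
    {"order_id": "ORD-42", "metro_code": "DA", "bandwidth_mbps": 1000}
))  # -> []
```

In practice a contract like this would more likely be expressed in a schema language (Avro, Protobuf, JSON Schema) and enforced in CI against producer pipelines; the sketch just shows the shape of the agreement: owner, version, fields, and a conformance check.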
What you bring
Python, GCP, MLOps, data modeling, 10+ years, CS degree
Passion for designing data solutions that enable trust, reuse, and agility
Experience implementing data architectures in a data mesh, data fabric, or lakehouse environment
Ability to translate complex data concepts into actionable solutions
Familiarity with MLOps pipelines and AI/ML data requirements
Excellent problem-solving, communication, and stakeholder collaboration skills
Data governance, lineage, and metadata management practices
Bachelor’s or Master’s degree in Computer Science, Information Systems, Data Science, or a related field
10+ years of experience in data modeling, information architecture, or data engineering, including 2+ years in an architecture or principal-level role
Knowledge of machine learning frameworks (e.g., TensorFlow, PyTorch) and experience in deploying ML models in production
Cloud-native data architecture: GCP (BigQuery, Dataplex, Dataflow), AWS, or Azure equivalents
Proven experience in designing and implementing data architectures, ETL processes, and MLOps practices
Hands-on experience with modern catalog, lineage, and observability tools
Strong proficiency in programming languages such as Python, Java, or Scala, and experience with SQL
Strong sense of ownership, with a focus on delivering high-quality architectures that stand the test of time
Data modeling techniques: relational (3NF), dimensional (Kimball), Data Vault 2.0, hierarchical, NoSQL, and graph models (a small star-schema sketch follows this list)
Semantic modeling for BI / analytics (Looker Modeler / LookML, dbt, Dataform)
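For illustration only: a tiny Kimball-style star-schema sketch in Python, showing a fact table declared at an explicit grain plus two surrogate-keyed dimensions. All table and column names are hypothetical stand-ins for whatever the assigned business domain actually needs.

```python
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class DimCustomer:
    """Dimension: one row per customer; customer_key is a surrogate key."""
    customer_key: int
    customer_name: str
    segment: str              # e.g. "enterprise", "service provider"

@dataclass(frozen=True)
class DimDate:
    """Dimension: one row per calendar day."""
    date_key: int             # e.g. 20240115 (readable smart key)
    full_date: date
    fiscal_quarter: str

@dataclass(frozen=True)
class FactPortUsage:
    """Fact table at the grain: one row per customer, per port, per day."""
    date_key: int             # FK -> DimDate.date_key
    customer_key: int         # FK -> DimCustomer.customer_key
    port_id: str              # degenerate dimension carried on the fact
    bandwidth_gb: float       # additive measure
    peak_utilization: float   # semi-additive: average it, don't sum across days

# Hypothetical single fact row joined (by key) to its dimensions
cust = DimCustomer(101, "Acme Networks", "enterprise")
day = DimDate(20240115, date(2024, 1, 15), "FY24-Q4")
row = FactPortUsage(day.date_key, cust.customer_key, "port-7a", 512.3, 0.81)
```

In a warehouse these would be physical tables (for example in BigQuery) with the Python types replaced by column types; the point of the sketch is the separation of grain, keys, and measures.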