Job Description
Job Title - Data Architect
Company - Instructure
Years of Experience - 8+ years
Location - Remote, Mexico
Role Type - Full-Time
Eligibility
- 8+ years in Data Architecture, Data Engineering, or related roles
- Experience working in complex, multi-team data environments
- Strong expertise in enterprise data modeling and governance
- Hands-on experience with modern data stack (Snowflake, BigQuery, Databricks, dbt)
- Ability to collaborate across technical and business stakeholders
- Experience with data governance, metadata, lineage, and quality frameworks
Role Overview
As a Data Architect, you will define and drive the enterprise-wide data architecture strategy, ensuring data is reliable, scalable, and aligned across teams. You will act as the bridge between data production and consumption, enabling Data Engineering, Analytics, Data Science, and Decision Science teams with consistent, high-quality data systems. This is a senior leadership role with high cross-functional influence and strategic impact.
Key Responsibilities
- Define and own enterprise data architecture (conceptual, logical, physical models)
- Establish data standards, schema design principles, and modeling best practices
- Design scalable data products across semantic and analytical layers
- Partner with Product Data teams on architecture, standards, and data contracts
- Evaluate and guide data platform and tooling decisions
- Identify and resolve data quality risks, redundancies, and architectural gaps
- Lead development of business glossary, data catalog, and enterprise ontology
- Support Data Science with feature engineering and data infrastructure
- Ensure analytical models support governed self-service and performance
- Drive data governance, lineage, and observability initiatives
- Mentor engineers and analytics teams on best practices
Skills and Qualifications
- Strong expertise in data modeling (Dimensional, Data Vault, OBT)
- Experience with cloud data platforms (Snowflake, BigQuery, Databricks)
- Knowledge of lakehouse architectures and modern data stack
- Hands-on experience with dbt, data cataloging, and metadata tools
- Strong understanding of data governance and quality frameworks
- Experience with semantic layer tools (Cube, MetricFlow, LookML)
- Knowledge of data mesh and data product architectures
- Exposure to streaming technologies (Kafka, Flink, Spark Streaming)
- Experience with data privacy regulations (GDPR, CCPA)
- Background working in federated or matrixed organizations