About the Role

We are seeking an experienced Data Architect to lead the design and management of our Snowflake environment. This role is essential to delivering real-time operational intelligence and driving our long-term data strategy. You will be responsible for building the event-driven foundation that supports downstream aggregations, behavioral analytics, and intelligent automation. A key part of your work will be enabling our Customer 360 initiative, where we unify customer data from various channels to power segmentation, personalization, and compliance-aware experiences. You will architect and maintain a data platform that scales with growing automation, AI integrations, and evolving compliance requirements.

Key Responsibilities
- Snowflake Leadership
  - Design, build, and optimize scalable schemas in Snowflake to support operational, analytical, and AI workflows.
  - Set best practices for performance, cost optimization, and secure access control.
- Data Integration & Pipelines
  - Build and maintain robust pipelines to integrate data from Segment, automation platforms, webhooks, CRMs, third-party APIs, and internal systems.
  - Own the ingestion of structured and semi-structured data across cloud platforms.
- Customer 360 Program Ownership
  - Architect and implement a unified data model to consolidate all customer interaction and behavioral data.
  - Enable analytics, segmentation, and personalization based on a real-time, queryable customer profile.
- Governance, Security, and Compliance
  - Implement and monitor data privacy, encryption, masking, and access control policies.
  - Collaborate with compliance teams to align with standards such as GDPR, SOC 2, and HIPAA where applicable.
- Collaboration & Documentation
  - Partner with workflow developers and analysts to ensure data structures are usable and reliable.
  - Maintain thorough documentation of schemas, pipeline logic, and data flows.
- Observability & Monitoring
  - Build logging and lineage tracking into all core pipelines.
  - Define SLAs and alerting mechanisms for data availability and pipeline performance.
Qualifications
- 5+ years of experience in data architecture, engineering, or platform roles.
- Deep expertise with Snowflake and modern data stack technologies.
- Experience working with event-based systems (e.g., Segment) and workflow tools (e.g., Make.com, Zapier).
- Proficiency in SQL and experience designing data pipelines for event-driven systems, including downstream aggregations and transformations.
- Solid understanding of data governance, security, and compliance best practices.
- Excellent communication and documentation skills.
- Comfortable working cross-functionally in fast-paced, remote environments.
Nice to Have
- Experience supporting AI/ML data pipelines or LLM-based applications.
- Familiarity with reverse ETL tools, observability platforms, or operational analytics stacks.
Why Join Us?
- Help shape the data foundation of a forward-thinking, AI-integrated system.
- Collaborate with a multidisciplinary team of engineers, analysts, and automation experts.
- Work in a flexible, fully remote environment with a strong culture of autonomy and accountability.
Apply now to lead a foundational pillar of our intelligent automation architecture.