Job Description

We are looking for an experienced Data Architect to join our team and lead the design, modeling, and implementation of our client's customer data platform. As a Data Architect, you will translate business requirements into data management and technology solutions, strategies, and architectures. This role is crucial in implementing scalable and efficient solutions that integrate data across teams and domains, including large-scale event streaming, reporting, and analytics.
Responsibilities:

- Collaborate with cross-functional teams to implement a unified data reporting infrastructure that meets the diverse requirements of multiple teams.
- Design systems that deliver real-time reporting and analytics to customers, leveraging data from sources such as MySQL, Druid, Snowflake, and S3.
- Brainstorm with Customer Data Platform teams to provide guidance on data and architectural solutions.
- Create and maintain comprehensive documentation for data infrastructure, pipelines, and processes.
- Develop data pipelines for ingesting, processing, and storing event streaming and time-series data.
- Implement data science IT infrastructure, including data stores, ETL tools, visualizations, and runtime environments.
- Manage data loads between the data lake, internal data structures, and internal/external APIs.

Requirements:

- Minimum of 10 years of experience in data analysis, integration, and delivery.
- At least 5 years of proven experience as a Data Architect, working with technologies such as MySQL in high-transaction production environments.
- Proven experience designing, modeling, documenting, and implementing scalable, performant data architectures in MySQL, as well as in technologies such as Druid, Snowflake, and/or Elasticsearch.
- Deep understanding of relational databases, MySQL database architecture, optimization, and troubleshooting, and experience working with complex data sets.
- Experience designing and modeling database schemas, including normalization, denormalization, and indexing strategies.
- Familiarity with Apache Druid or other time-series technologies for providing analytics on large-scale event streaming data.
- Expertise in unifying data from multiple data stores and technologies to deliver real-time reports and analytics to customers.
- Ability to create high-quality documentation and implementation timelines to facilitate multiple workstreams.
- Knowledge of streaming technologies such as Maxwell's daemon, Kafka, and Snowpipe, along with database administration skills (SQL, NoSQL, etc.) for scale, stability, security, multi-user load management, and performance.
- Experience implementing ETL processes with Airflow.

About Your Client:

[Your Client] is a leading provider of intelligent marketing automation solutions, empowering small teams to drive big business outcomes. They offer a diverse and inclusive work environment and are committed to fostering innovation and success through their inclusive culture. As an equal opportunity employer, they recruit, hire, and promote individuals regardless of gender, race, color, sexual orientation, religion, age, protected veteran status, physical and mental abilities, or any other identities protected by law.
Perks and Benefits:

[Your Client] prioritizes the well-being and satisfaction of its employees. Some of the benefits they offer include:
- Comprehensive health and wellness benefits, including a fully covered High Deductible Health Plan (HDHP), telehealth and tele-mental health resources, and complimentary access to Calm.
- Open paid time off policy.
- Generous 401(k) matching program with immediate vesting.
- Quarterly Path Perks, offering options for commuter and lunch benefits or a remote home office stipend.
- After five years of service, eligibility for a four-week paid sabbatical leave and a sabbatical leave bonus.

[Your Client] is committed to creating a diverse and inclusive workplace through their Employee Resource Groups (ERGs), which provide support, mentorship, and opportunities for professional growth to all members.