Job Title: Chief Architect - AWS and Snowflake
Level: Director (D)
Location: India
Role Summary: Solutioning lead for NA Data Engineering, with AWS and Snowflake as the primary stack
Role Responsibilities:
- Architecture and solutioning on AWS and Snowflake platforms - data warehouse, lakehouse, data fabric, and data mesh
- Sizing, estimation, and implementation planning for solutioning
- Solution prototyping, advisory, and orchestrating in-person/remote workshops
- Work with hyperscalers and platform vendors to understand and test platform roadmaps and develop joint solutions
- Own end-to-end solutions, working across various teams in Cognizant - Sales, Delivery, and Global Solutioning
- Own key accounts as architecture advisory and establish deep client relationships
- Contribute to the practice by developing reusable assets and solutions

Minimum Required Qualifications:
- Excellent verbal and written communication skills, with the ability to present complex Cloud Data Architecture solution concepts to technical and executive audiences (leveraging PPTs, demos, and whiteboards)
- Deep expertise in designing solutions on AWS and Snowflake
- Strong expertise in handling large and complex RFPs/RFIs and collaborating with multiple service lines and platform vendors in a fast-paced environment
- Strong relationship-building skills and the ability to provide technical advisory and guidance
- Minimum 15 years' experience as a Solution Architect designing and developing data architecture patterns
- Minimum 5 years' hands-on experience building AWS and Snowflake based solutions
- Minimum 3 years' experience as a Solution Architect on a pre-sales team, driving the sales process from a technical solution standpoint
- Bachelor's or Master's degree in computer science, engineering, information systems, or a related field

Responsibilities:
- Technology architecture and implementation experience, with deep hands-on experience delivering data solutions
- 15-20 years of experience in Data Engineering, including 5+ years in cloud data engineering
- Technology pre-sales experience - architecture, effort sizing, estimation, and solution defense
- Data architecture patterns - Data Warehouse, Data Lake, Data Mesh, Lakehouse, and Data as a Product
- Develop or co-develop proofs of concept and prototypes with customer teams
- Excellent understanding of distributed computing fundamentals
- Experience working with one or more major cloud vendors
- Deep expertise in end-to-end pipeline (ETL) development following best practices, including orchestration and optimization of data pipelines
- Strong understanding of the full CI/CD lifecycle
- Experience migrating large legacy platforms (e.g., Hadoop, Teradata) to cloud data platforms
- Expert-level proficiency in engineering and optimizing data ingestion patterns - batch, micro-batch, streaming, and API
- Understanding of change data capture imperatives, with a point of view on tools and best practices
- Architect and solution Data Governance capability pillars supporting a modern data ecosystem
- Data services and various consumption archetypes, including semantic layers, BI tools, and AI/ML
- Thought leadership in designing self-service data engineering platforms and solutions
- Core platform - AWS and Snowflake
- Ability to engage and offer differing points of view on customers' architecture using the AWS and Snowflake platforms
- Strong understanding of the Snowflake platform, including evolving services such as Snowpark
- Implementation expertise with AWS services - EMR, Redshift, Glue, Kinesis, Lambda, AWS Lake Formation - and Snowflake
- Security design and implementation on AWS and Snowflake
- Pipeline development in multi-hop pipeline architectures
- Architecture and implementation experience with Spark and Snowflake performance tuning, including topics such as cluster sizing

Preferred Skills:
- Gen-AI architecture patterns
- Data Quality and Data Governance
- Cloud Cost Monitoring and Optimization