This is a key role leading the solution and engineering aspects of Agile Release Train deliveries using the Scaled Agile Framework (SAFe) for the customer data platform across all brands.
- Contribute to the planning and definition of the business solution
- Actively participate in the Continuous Exploration process as part of the Continuous Delivery Pipeline, especially with Enabler Epics/Stories
- Work with the Principal Architect to set up design guidelines and take proactive initiatives to enable the Continuous Delivery Pipeline (e.g., establish a modular, microservices-based component strategy across the board for efficient continuous delivery)
- Work with the Principal Architect to plan and develop the Architectural Runway in support of new business features
- Own the high-level design of the technical solution and exploration of solution alternatives
- Define subsystems and their interfaces and responsibilities following the set design guidelines
- Operate within the budget framework when analyzing the impact of design decisions
- Ensure security best practices are considered in the solution
- Work with stakeholders to establish high-level solution intent and documentation
- Establish the critical non-functional requirements (NFRs)
- Identify and define technical debt and other improvements, working directly with Agile teams to implement them
- Work with Product and Solution management to prioritize and determine the capacity allocation
- Provide oversight and improve Built-In Quality and Technical Agility
- Identify improvements on the tool sets for improving delivery quality and agility
- Define benchmarks for code quality, automated test coverage, etc.
- Lead cross-team coordination and mentor other engineers on the teams
- 8-10 years of software engineering experience, including 3 years of data platform engineering and tech-lead experience
- Experience building and maintaining event streaming platforms (Kafka, Spark, security, and producer/consumer APIs using Node.js, Python, etc.)
- Hands-on experience with Amazon technologies: EC2, EKS, S3, Lambda, SQS, SNS, EventBridge, IAM, DynamoDB, Athena, Glue, Redshift, etc.
- Experience with AWS contact center, customer experience, and engagement platforms such as Amazon Connect, Pinpoint, Lex, Polly, Personalize, Transcribe, Comprehend, and Amplify
- Knowledge of data management, data models, ETL, SQL queries, systems integrations, APIs and connectors, middleware, and data lakes (e.g., Snowflake), preferably using cloud-based tools/infrastructure
- Ability to advise on performance optimizations and best practices for scalable data models, pipelines, and queries over large volumes of data
- Comprehensive understanding of Master Data Management concepts as applied to customer data, including but not limited to data collection, unification, transformation, segmentation, and storage
- Knowledge of Data Governance and Data Privacy concepts and regulations (GDPR, CCPA, POPI)
- AWS Solutions Architect certification is preferred
- Knowledge of application deployments and automation using DevOps tools
- Strong analytical skills with the ability to use data and metrics to back up assumptions and recommendations
- Experience successfully delivering information technology solutions for large-scale global applications across multiple software, infrastructure, and service platforms, and leading technical design through all phases of development and deployment
- Excellent communication and presentation skills