20622 – Senior Data Architect

Published: February 9, 2023
Location: Remote USA
Category:
Job Type:
Compensation: $175K + Bonus
Job ID Number: 20622
Jarvis Walker Recruiter: David Ruiz

Description

As a Senior Data Architect, you'll spearhead the efforts of passionate and dedicated teams at the intersection of technology, data, and business strategy.

Duties and Responsibilities:

• Lead the definition and implementation of Data Capabilities like Data Discovery & Classification, Data Catalog, Data Lineage, Data Integration and Data Lake.
• Build data pipeline frameworks to automate high-volume and real-time data delivery for the Data Lake and streaming data hub.
• Build data APIs and data delivery services that support critical operational and analytical applications.
• Develop sustainable, data-driven solutions using current and next-generation data technologies to meet the needs of our organization and business customers.
• Define the Data Operations (DataOps) strategy and operating model for Data Anonymization, Synthetic Data Generation, and Test Data Management.
• Define and deliver innovative Enterprise Data Services to realize the data capabilities that are core to the data strategy.
• Collaborate with and influence solution architects, engineers, and delivery teams to adopt enterprise data architecture specifications in solutions.
• Work directly with Product Managers and customers to deliver data products in a collaborative and agile environment.
• Use, protect and disclose patients’ protected health information (PHI) only in accordance with Health Insurance Portability and Accountability Act (HIPAA) standards.

Requirements

• 10 years in an Enterprise Data Architecture function, including data strategy consulting and driving business value through architecture adoption and data management & analytics cultural change.
• Experience with various database technologies: SQL, NoSQL, in-memory, HDFS, columnar, and graph databases.
• Hands-on experience with cloud database technologies such as AWS RDS, Redshift, Snowflake, DynamoDB, ElastiCache, Azure Cosmos DB, and Azure SQL.
• Experience designing and building production-grade, secure data pipelines from ingestion to consumption using Java, Python, Scala, Talend, Informatica, etc.
• Hands-on experience with integration tools for data orchestration across solutions such as Amazon Elastic MapReduce (EMR), Apache Hive, Apache Pig, Apache Spark, and MongoDB.
• Experience with large scale, complex enterprise integration systems in a high availability, high volume, performance environment.
• Understanding of cloud application design principles (microservices, 12-factor apps, stateless cloud-native applications, containers, etc.).
• Strong technical experience with designing data lakes, pipelines, and data models.
• Strong analytical and quantitative skills, with the ability to use data and metrics to justify requirements and features and to demonstrate business value.
• Advanced people leadership skills, including developing talent and the next generation of data functional leaders.
• Experience working with cross-functional teams in delivery of new products or services.

Apply