Big Data Architect
Ref No.: 21-00586
Location: Chevy Chase, Maryland
  • 5+ years' experience as a Data Architect
  • 5+ years' Data Warehouse experience. Candidate must have experience with designing Data Warehouse schemas (Data Vault, 3NF model) as well as Data Mart schemas (Dimensional model)
  • 5+ years' experience integrating data from disparate sources using tools like Kafka
  • Experience in Agile development methodologies
  • 3+ years' knowledge of Azure Data & Storage, Snowflake or other Cloud Data Warehouse
  • Proficient in Data Governance practices
  • Able to gather requirements from both IT and business partners
  • Demonstrated ability to create high-quality technical artifacts such as Source-to-Target Mappings (STMs) and Record-Level Specifications (RLS)
  • Experience with data profiling using SQL
  • Experience prototyping ETL transformation logic using advanced SQL techniques
  • Experience with population of metadata repositories and publication of metadata
  • Experience creating and documenting classifications of private and sensitive data
  • Understanding of data security, data access controls, and related design aspects
  • Proficient with Erwin Data Modeler and MS Office Suite
  • Experience in Data Vault 2.0 Modeling methodology
  • P&C Insurance Subject Area Model experience (e.g., Policy, Sales, Service, Claims, and Billing systems)
  • 3+ years' experience with Database Platforms (DBaaS): Snowflake, Azure SQL DB
  • 3+ years' exposure to SnowSQL, advanced concepts (query performance tuning, time travel, etc.), and features/tools (data sharing, events, Snowpipe, etc.)
  • 3+ years' experience with the Azure suite (Azure Data Lake, Azure Data Factory, Azure DevOps)
  • 5+ years' exposure to Traditional RDBMS: SQL Server, Oracle, DB2, Cosmos DB, Hive
  • 5+ years' Big Data experience: Hadoop, Hive, Linux
  • Strong knowledge of RDBMS topics and the ability to write complex SQL and PL/SQL
  • Experience with Data Vault 2.0 Methodology and implementation with data build tool (dbt)
  • Expert knowledge developing data models for deployment on the Snowflake platform
  • Willingness to learn and experiment with technologies outside of one's comfort area
  • Bachelor's degree in Computer Science, Information Systems, or equivalent education or work experience
  • Excellent oral and written communication skills
  • Excellent analytical and problem-solving skills
  • High attention to detail
  • Ability to work under indirect supervision
  • Broad understanding of information technology topics
  • Effective interpersonal skills and collaborative style
  • Able to communicate clearly, support arguments with data, and build relationships that create influence
  • Demonstrated ability to evaluate technology options and apply to complex business problems
  • Flexible in a fast-paced dynamic environment with shifting roles and responsibilities
  • Capacity and commitment to acquiring new skills outside of current area of expertise
  • Demonstrated passion and deep involvement with technology
  • Technically creative and open-minded