Data Engineer / Snowflake, Az

Join our data engineering team to build robust, scalable data infrastructure that powers AI/ML initiatives and business intelligence across commercial and government clients. This role focuses on designing and implementing modern data architectures, pipelines, and platforms.

Key Responsibilities
  • Design and implement scalable data architectures using modern data stack technologies
  • Build real-time and batch data processing pipelines using Apache Spark, Kafka, and Airflow
  • Develop ETL/ELT processes for data integration across multiple source systems
  • Implement data lake and data warehouse solutions using Snowflake, Databricks, and cloud services
  • Build and maintain cloud-based data platforms on AWS, Azure, and GCP
  • Design data mesh architectures and self-service analytics platforms
  • Optimize data storage, processing, and query performance
Data Governance & Quality
  • Implement data quality monitoring and validation frameworks
  • Design data cataloging and metadata management solutions
  • Ensure data security, privacy, and compliance with federal regulations
  • Develop data lineage tracking and impact analysis capabilities
Required Skills and Qualifications
Education & Experience
  • Bachelor's degree in Computer Science, Engineering, or related field
  • 5+ years of data engineering experience in enterprise environments
  • 3+ years of cloud data platform experience
Technical Skills
  • Programming: Advanced Python, SQL, Scala; familiarity with Java
Preferred Qualifications
  • Data mesh and modern data architecture experience
  • Terraform, CloudFormation, or similar IaC experience