Location: Jersey City (Onsite)
Experience: 15+ Years
Employment Type: Full-Time
Position Overview:
We are seeking an experienced Lead Data Engineer to design, build, and optimize enterprise-scale data solutions using AWS, Snowflake, and Big Data technologies. This role requires a hands-on technical leader who can mentor a team, drive best practices in data engineering, and partner with business stakeholders to deliver scalable, secure, and high-quality data platforms. The ideal candidate will have strong expertise in SQL, Python, PySpark, and proven experience working in insurance or similar data-intensive industries.
Key Responsibilities:
- Lead the design, development, and implementation of data solutions using AWS and Snowflake
- Collaborate with cross-functional teams to translate business requirements into technical designs
- Develop and maintain ETL/data pipelines, ensuring quality, integrity, and security
- Optimize data storage, retrieval, and query performance to support warehousing and analytics
- Provide technical leadership and mentorship to junior engineers and team members
- Ensure adherence to best practices, coding standards, and industry compliance in data engineering
- Conduct coding, debugging, performance tuning, and production deployments
- Work closely with stakeholders to deliver data-driven insights, with a focus on insurance claims and loss data
- Support Agile delivery through sprint planning, backlog grooming, and iterative releases
Must-Have Skills:
- 10+ years of experience in Data Engineering and delivery
- Strong expertise in Big Data concepts and cloud implementations (AWS)
- Hands-on leadership of teams of 4+ members
- Proficiency in SQL, Python, and PySpark
- Deep knowledge of Snowflake, Glue, EMR, S3, Aurora, RDS, and AWS architecture
- Strong background in data ingestion and data processing frameworks
- Excellent communication, analytical, and problem-solving skills
- Ability to take ownership and deliver results in fast-paced environments
- Experience working in Agile/Scrum methodologies
Good-to-Have Skills:
- Experience with DevOps tools (Jenkins, Git, etc.), CI/CD pipelines, and infrastructure automation
- Hands-on experience with data migration and frameworks such as Data Vault 2.0
- Familiarity with modern data modeling and ETL processes
Requirements:
- Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field
- Proven experience as a Data Engineer with a focus on AWS and Snowflake
- Strong understanding of data warehousing concepts and best practices
- Excellent communication skills to work across business and technical teams
- Industry experience in insurance, with exposure to claims and loss processes preferred
- Strong attention to detail, ownership mindset, and ability to work independently as well as in a team
Preferred Qualifications:
- AWS or Snowflake certifications
- Familiarity with data governance, lineage, and data security practices
- Experience in regulated industries such as finance, insurance, or healthcare