What are the responsibilities and job description for the Lead Data Engineer position at Smart IT Frame LLC?
Responsibilities:
· Lead the design, development, and implementation of data solutions using AWS and Snowflake.
· Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
· Develop and maintain data pipelines, ensuring data quality, integrity, and security.
· Optimize data storage and retrieval processes to support data warehousing and analytics.
· Provide technical leadership and mentorship to junior data engineers.
· Work closely with stakeholders to gather requirements and deliver data-driven insights.
· Ensure compliance with industry standards and best practices in data engineering.
· Utilize knowledge of insurance, particularly claims and loss, to enhance data solutions.
Must have:
· 8 years of relevant experience in Data Engineering and delivery.
· 8 years of relevant work experience with Big Data concepts, including work on cloud implementations.
· Strong experience with SQL, Python, and PySpark.
· Good understanding of data ingestion and data processing frameworks.
· Strong experience with Snowflake, SQL, and AWS (Glue, EMR, S3, Aurora, RDS, and overall AWS architecture).
· Good aptitude, strong problem-solving abilities, analytical skills, and a willingness to take ownership where appropriate.
· Able to code, debug, performance-tune, and deploy applications to the production environment.
· Experience working in an Agile methodology.
Good to have:
· Experience with DevOps tools (e.g., Jenkins, Git) and practices, including continuous integration and delivery (CI/CD) pipelines.
· Experience with cloud implementations, data migration, Data Vault 2.0, etc.
Requirements:
· Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.
· Proven experience as a Data Engineer, with a focus on AWS and Snowflake.
· Strong understanding of data warehousing concepts and best practices.
· Excellent communication skills, with the ability to convey complex technical concepts to non-technical stakeholders.
· Experience in the insurance industry, preferably with knowledge of claims and loss processes.
· Proficiency in SQL, Python, and other relevant programming languages.
· Strong problem-solving skills and attention to detail.
· Ability to work independently and as part of a team in a fast-paced environment.
Preferred Qualifications:
· Experience with data modeling and ETL processes.
· Familiarity with data governance and data security practices.
· Certification in AWS or Snowflake is a plus.