What are the responsibilities and job description for the Data Engineer position at SILAC Insurance Company?
Data Engineer
Job Overview: The Data Engineer assists in designing, developing, and optimizing data infrastructure and systems to support large-scale data processing, storage, and analytics. This role requires close collaboration with developers, architects, and other engineers to ensure data integrity, accessibility, and compliance with best practices in data engineering.
Department Overview: The Business Intelligence & Analytics (BI&A) department is committed to transforming data into actionable insights that drive strategic decision-making, enhance operational efficiency, and support business growth. Through cutting-edge analytics, data visualization, and predictive modeling, BI&A empowers stakeholders with meaningful intelligence to optimize performance and achieve organizational goals.
Job Details
Data Pipeline Development: Design, develop, and maintain scalable data pipelines and ETL processes to enable efficient data ingestion, transformation, and storage.
Cross-Functional Collaboration: Work closely with business and technical teams to understand data requirements and develop solutions that align with functional and strategic objectives.
Data Architecture & Warehousing: Implement data warehousing solutions, including data modeling, partitioning, and indexing, to ensure optimal performance and scalability.
Optimization & Reliability: Manage and optimize data lakes, data warehouses, and data processing frameworks to ensure high availability, reliability, and performance.
Data Governance & Security: Establish and enforce best practices in data security, governance, and quality assurance across the organization.
Monitoring & Troubleshooting: Proactively monitor, diagnose, and resolve issues related to data pipelines and infrastructure.
Technology Evaluation: Assess, recommend, and implement emerging technologies to enhance data engineering capabilities and efficiency.
Job Requirements
Required
- Bachelor's degree in computer science, data science, software engineering, information systems, or a related field; a Master's degree in one of these fields is preferred.
- 5-7 years of experience in data management disciplines, such as data integration, modeling, optimization, and quality assurance.
- Hands-on experience with big data technologies (e.g., Hadoop, Spark, Kafka) and cloud platforms (AWS, Azure, GCP).
- Strong background in data modeling and the design of data lakes, data warehouses, and relational databases.
- Expertise in data governance, security, and quality frameworks.
- Proven ability to engage with business stakeholders at all levels, delivering data-driven insights that influence strategic decisions.
- Ability to manage multiple projects in a fast-paced, collaborative environment.
- Data Science & Analytics: Proficiency in SQL, R, SAS, and Excel for data analysis and manipulation.
- Database Management: Expertise in SQL, NoSQL, Hadoop, Teradata, or similar technologies.
- Programming & Development: Strong programming experience in Python, C, or Java.
Desired
- DevOps & Automation: Familiarity with CI/CD pipelines, DevOps best practices, and cloud-native deployment strategies.
- Real-Time Data Processing: Exposure to real-time data streaming architectures.
- Previous work within the financial or insurance industry.