What are the responsibilities and job description for the Data Engineer position at Edfinancial Services LLC?
Description
General Purpose of Job
Responsible for designing, building, and maintaining a Microsoft Fabric-based data warehouse to support analytics, reporting, and business intelligence needs.
- Design, develop, and optimize cloud-based data warehouses using Microsoft Fabric.
- Build and maintain ETL/ELT pipelines to ingest, process, and transform large datasets efficiently.
- Ensure data quality, consistency, and security while implementing best practices for data governance and compliance.
- Collaborate with business stakeholders, BI Developers, and BI Analysts to understand requirements and deliver scalable data solutions.
- Monitor and enhance data warehouse performance, reliability, and scalability.
- Implement data modeling techniques (star schema, snowflake schema, etc.) to support analytical queries and reporting.
- Automate processes using Python, SQL, Spark, or other scripting languages.
- Troubleshoot data pipeline and performance issues, ensuring smooth operations.
- Maintain a positive work atmosphere by behaving and communicating in a manner that fosters good working relationships with customers, clients, co-workers, and management.
- Work alongside Information Systems technical leadership to shape the technology roadmap of the enterprise data warehouse.
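The data modeling and SQL responsibilities above can be illustrated with a minimal star-schema sketch: a fact table joined to a dimension table to answer an analytical query. All table and column names here are hypothetical, and SQLite stands in for the cloud warehouse purely for illustration.

```python
import sqlite3

# In-memory database standing in for a warehouse (illustration only).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension table: one row per calendar date.
cur.execute(
    "CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, "
    "year INTEGER, month INTEGER)"
)
# Fact table: one row per transaction, keyed to the dimension.
cur.execute("CREATE TABLE fact_sales (date_key INTEGER, amount REAL)")

cur.executemany("INSERT INTO dim_date VALUES (?, ?, ?)",
                [(20240101, 2024, 1), (20240201, 2024, 2)])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?)",
                [(20240101, 100.0), (20240101, 50.0), (20240201, 75.0)])

# Analytical query: monthly revenue via a fact-to-dimension join.
rows = cur.execute("""
    SELECT d.year, d.month, SUM(f.amount) AS revenue
    FROM fact_sales f
    JOIN dim_date d ON d.date_key = f.date_key
    GROUP BY d.year, d.month
    ORDER BY d.year, d.month
""").fetchall()
print(rows)  # [(2024, 1, 150.0), (2024, 2, 75.0)]
conn.close()
```

In a star schema, facts (measurements such as sale amounts) join to denormalized dimensions (descriptive attributes such as dates) on surrogate keys, which keeps analytical queries to simple single-hop joins.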
Requirements
- 4 years of experience in data engineering, data warehousing, or cloud-based analytics.
- Hands-on experience with at least one of the following cloud data warehousing technologies:
  - Microsoft Fabric (OneLake, Data Factory, Synapse)
  - Snowflake (Virtual Warehouses, Snowpark, Data Sharing)
  - Databricks (Delta Lake, Apache Spark, MLflow)
- Strong expertise in SQL and performance tuning for cloud-based databases.
- Experience with ETL/ELT tools (Informatica, Azure Data Factory, etc.).
- Proficiency in Python, Scala, or Java for data processing and automation.
- Knowledge of cloud platforms (Azure or AWS) and their data services.
- Familiarity with data lake architectures and with handling structured and unstructured data.
- Strong understanding of data governance, security, and compliance best practices.
- Experience with CI/CD pipelines and DevOps practices for data engineering.
- Exposure to Agile development methodologies is a plus.
- Certifications in Microsoft Fabric, Snowflake, Databricks, or a related cloud technology are preferred.
- Experience with machine learning pipelines or AI-driven analytics is a plus.
- Strong business acumen to align data solutions with company objectives.
- Excellent written and verbal communication skills.