What are the responsibilities and job description for the Data Engineer position at Optomi?
Job Summary
Optomi, in partnership with a leading telecommunications company, is seeking a Data Engineer to join the client's team. You will play a key role in migrating legacy SQL Server processes to Snowflake, managing ETL jobs, and ensuring data pipelines are efficient, scalable, and reliable.
Qualifications Required:
- Strong hands-on experience with SQL and Snowflake.
- Solid understanding of ETL processes and data pipeline orchestration.
- Experience with Azure cloud services.
- Familiarity with Infoworks as a data ingestion/scheduling platform.
- Self-motivated, curious, and able to thrive in a fast-moving, collaborative environment.
- Excellent communicator and team player who is confident but humble.
Job Responsibilities:
- Manage and maintain data ingestion jobs using Infoworks for scheduling.
- Build and optimize scalable data pipelines and ETL processes from various sources into Snowflake.
- Migrate existing processes from SQL Server to Snowflake, including stored procedures.
- Collaborate with data analysts, engineers, and stakeholders to define and deliver clean, well-modeled datasets.
- Take ownership of your work, always looking for the next challenge and opportunities to improve.
- Ensure flexibility and reliability in scheduling and data operations.
Nice to Have:
- Experience with the broader Snowflake ecosystem (e.g., Snowpipe, SnowSQL).
- Domain knowledge in HR, HRIS, or People Analytics.
- Previous experience building in modern cloud data environments.
Salary: $67 - $77