What are the responsibilities and job description for the Data Engineer position at Devfi?
Job Title: Data Engineer / Automation Engineer
Location: Pennsylvania, USA (Hybrid/Remote options available)
Job Type: Full-time W2/1099 ONLY
Visa Status: US Citizens / Green Card Holders ONLY
We are looking for a highly skilled and motivated Data Engineer / Automation Engineer to join our dynamic team. This is a great opportunity to work on cutting-edge projects and contribute to our mission of transforming business processes through automation and data-driven insights.
Role Overview:
As a Data Engineer / Automation Engineer, you will be responsible for designing, implementing, and optimizing data pipelines and automation processes that streamline data flows, improve data quality, and enhance reporting capabilities. You will work closely with cross-functional teams to ensure seamless data integration and automation across various systems.
Key Responsibilities:
- Design, develop, and maintain scalable data pipelines for data extraction, transformation, and loading (ETL) processes.
- Automate data workflows and repetitive tasks to improve operational efficiency.
- Work closely with data scientists, analysts, and business stakeholders to understand data requirements and deliver actionable insights.
- Collaborate with IT teams to integrate data solutions into existing infrastructure and ensure seamless automation.
- Perform data quality checks and troubleshooting to resolve issues in data processing pipelines.
- Monitor and optimize the performance of data systems, ensuring high availability and minimal downtime.
- Stay up to date with the latest trends and technologies in data engineering and automation.
- Document and communicate technical designs, solutions, and procedures effectively.
Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent work experience).
- Proven experience as a Data Engineer, Automation Engineer, or in a similar technical role.
- Strong experience with data engineering languages, tools, and platforms such as SQL, Python, Apache Kafka, Spark, or Airflow.
- Familiarity with cloud platforms like AWS, Google Cloud, or Azure.
- Hands-on experience with automation tools such as Ansible, Terraform, or Jenkins.
- Solid understanding of database management and optimization techniques.
- Strong problem-solving skills and attention to detail.
- Excellent communication and collaboration skills.
- Experience working in an Agile environment is a plus.
- Experience with Data Warehousing, ETL, or CI/CD pipelines is highly desirable.