What are the responsibilities and job description for the Data Engineer position at Robert Half?
We are offering an exciting opportunity for a Data Engineer in Lake Oswego, Oregon. In this role, you will be key to developing and automating data pipelines in Python and will be involved in many aspects of data architecture, including ETL/ELT processes and data lake architectures. You will also use a range of AWS services and work with technologies such as Apache Kafka, Apache Pig, and Apache Spark in a technology-focused industry.

Responsibilities:
• Develop and automate data pipelines using Python scripting (a minimal, illustrative sketch follows this list)
• Use AWS services for a variety of tasks, including managed Kubernetes, Airflow, Docker, and DNS
• Work with ETL/ELT processes, Snowflake, and Postgres in data lake architectures
• Apply hands-on experience with GitHub and GitHub Actions
• Leverage skills in Apache Kafka, Apache Pig, Apache Spark, and cloud technologies
• Implement data visualization and algorithms
• Apply analytics and Apache Hadoop to data processing
• Develop APIs using AWS technologies and Kubernetes
• Manage data with PostgreSQL and Amazon Web Services (AWS)
• Use GitHub for code management and social coding for team collaboration.
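As an illustration of the first responsibility, the sketch below shows a minimal Python pipeline that extracts records from a CSV file, transforms them with pandas, and loads them into a Postgres table. The file path, connection string, and table name are hypothetical placeholders rather than details from this posting; in practice such a pipeline would typically run under a scheduler such as Airflow.

# Minimal ETL sketch: extract a CSV, transform with pandas, load into Postgres.
# All names below (file path, connection URL, table name) are illustrative
# placeholders, not specifics from this job posting.
import pandas as pd
from sqlalchemy import create_engine


def extract(path: str) -> pd.DataFrame:
    """Read raw event records from a CSV file."""
    return pd.read_csv(path)


def transform(df: pd.DataFrame) -> pd.DataFrame:
    """Normalize column names, drop rows missing an event id, stamp load time."""
    df = df.rename(columns=str.lower)
    df = df.dropna(subset=["event_id"])
    df["loaded_at"] = pd.Timestamp.now(tz="UTC")
    return df


def load(df: pd.DataFrame, connection_url: str, table: str) -> None:
    """Append the transformed rows to a Postgres table."""
    engine = create_engine(connection_url)
    df.to_sql(table, engine, if_exists="append", index=False)


if __name__ == "__main__":
    # Hypothetical source file and database; replace with real values.
    raw = extract("events.csv")
    clean = transform(raw)
    load(clean, "postgresql+psycopg2://user:password@localhost:5432/analytics", "daily_events")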
Salary: $130,000 - $140,000