What are the responsibilities and job description for the Snowflake DBT Data Engineer position at PSRTEK?
Role: Snowflake DBT Engineer
Remote
As a Data Engineer, you will be part of an Agile team building healthcare applications and implementing new features while adhering to best coding and development standards.
Tools: Snowflake, DBT, Python, PostgreSQL, MSSQL, Shell Scripting, PHP, Jenkins, GitHub, Azure Data Lake, Azure Data Factory, Azure Databricks
Required:
- Snowflake (columnar MPP cloud data warehouse)
- DBT (SQL-based data transformation tool)
- Python
- Experience designing and implementing data warehouses
- Bachelor's degree (B.A. / B.S.) from four-year college or university
- 7-9 years of overall experience as a Data Engineer on the Microsoft Azure cloud platform
- Strong understanding of Azure services, with expertise in Data Lake, Data Factory, and Databricks
- Healthcare industry experience preferred
- Deep experience with SQL, database design, optimization, and tuning
- Experience with relational databases (e.g., PostgreSQL, MSSQL)
- Experience in Shell Scripting and an object-oriented language such as Python or PHP
- Strong experience with continuous integration and development tools such as Jenkins, and experience using GitHub
- Experience in an Agile development environment
- Programming skills, particularly in SQL, Shell Scripting, and Python
- Good oral and written communication skills
Regards,
Guru Prasath M
US IT Recruiter
PSRTEK Inc.
Princeton, NJ 08540
guru@psrtek.com