What are the responsibilities and job description for the Data Engineer position at Thoughtwave Software and Solutions?
Job Details
Role: Data Engineer
Duration: 6 Months
Location: Appleton, WI (4 days per week)
Required Skills:
Spark
Python
SQL
Databricks
Azure Data Factory
Cloud (Azure, AWS, Google Cloud Platform)
Self-starter, strong communication
Job description:
Azure Data Factory and Databricks are the current state; they are evaluating Fivetran, Snowflake, and other technologies.
They are trying to move off of ADF and toward Spark pipelines in Databricks.
Sources include Salesforce, SQL Server, and Azure SQL--they pull data from those sources into the Databricks bronze layer (they refer to the layers as "raw, refined, and curated"); see the pipeline sketch after this list.
There are some legacy SSIS pipelines they need to migrate; they mentioned they might outsource the whole migration--we can help with this.
Needs experience in Spark, Python, SQL, and ADF, but they are not married to ADF.
Looking for someone who doesn't only have the ADF/Azure stack--someone with a broader range of skills who can help evaluate new tooling and pick things up quickly.
Self-starter personality with strong documentation skills
Experience in Fivetran would be a plus
Cloud technology: Azure (preferred); Google Cloud Platform and AWS are okay.
Conceptual knowledge and fundamentals are more important than particular tools
Some experience with curation through the silver and gold layers, and with the broader medallion architecture, is a nice-to-have.
Unity Catalog governance implementation (see the governance sketch below).
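As a rough illustration of the kind of pipeline described above, here is a minimal PySpark sketch of landing a source table in a bronze Delta table and promoting it through silver and gold layers. The connection string, table names, and transformations are hypothetical placeholders, not the client's actual implementation.

```python
# Minimal medallion-architecture sketch in PySpark (Databricks-style).
# All names (JDBC URL, source table, layer schemas, columns) are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion-sketch").getOrCreate()

# Bronze ("raw"): land the source data as-is, plus load metadata.
bronze_df = (
    spark.read.format("jdbc")
    .option("url", "jdbc:sqlserver://<host>;databaseName=<db>")  # placeholder connection
    .option("dbtable", "dbo.orders")                             # placeholder source table
    .load()
    .withColumn("_ingested_at", F.current_timestamp())
)
bronze_df.write.format("delta").mode("append").saveAsTable("bronze.orders")

# Silver ("refined"): basic cleansing and de-duplication.
silver_df = (
    spark.table("bronze.orders")
    .dropDuplicates(["order_id"])
    .filter(F.col("order_total") >= 0)
)
silver_df.write.format("delta").mode("overwrite").saveAsTable("silver.orders")

# Gold ("curated"): business-level aggregate for reporting.
gold_df = (
    spark.table("silver.orders")
    .groupBy("customer_id")
    .agg(F.sum("order_total").alias("lifetime_value"))
)
gold_df.write.format("delta").mode("overwrite").saveAsTable("gold.customer_value")
```

In practice a job like this would be scheduled as a Databricks workflow, which is the direction the posting describes as the replacement for ADF orchestration.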
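For the Unity Catalog governance item, a hedged sketch of the kind of grants involved, assuming a Unity Catalog-enabled workspace; the catalog, schema, and group names are hypothetical, not the client's actual objects.

```python
# Hypothetical Unity Catalog governance sketch: catalog, schema, and
# group names below are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Create a governed catalog and schema for the curated layer.
spark.sql("CREATE CATALOG IF NOT EXISTS analytics")
spark.sql("CREATE SCHEMA IF NOT EXISTS analytics.gold")

# Grant read access on the curated layer to an analyst group,
# and reserve full privileges for the data engineering group.
spark.sql("GRANT USE CATALOG ON CATALOG analytics TO `analysts`")
spark.sql("GRANT USE SCHEMA, SELECT ON SCHEMA analytics.gold TO `analysts`")
spark.sql("GRANT ALL PRIVILEGES ON SCHEMA analytics.gold TO `data-engineers`")
```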
--
Thanks & Regards,
Manoj - Technical Recruiter
Thoughtwave Software and Solutions
314 N. Lake St, Suite 6, Aurora, IL 60506
Desk: EXTN: 158
Email:
Website: