What are the responsibilities and job description for the Databricks/Pyspark Developer position at Global Business Ser. 4u?
Position: Databricks/Pyspark Developer
Location – Dearborn, Michigan
Hybrid: onsite Tues and Wed
Job Type – Contract
Job Description
Looking for an onsite Databricks/PySpark Developer who is willing to learn new technologies as needed and works well in a team.
Essential Job Functions
- Design and development of data ingestion pipelines (Databricks background preferred).
- Performance-tune and optimize Databricks jobs.
- Evaluate new features and refactor existing code.
- Mentor junior developers and ensure all patterns are documented.
- Perform data migration and conversion activities.
- Develop and integrate software applications using suitable development methodologies and standards, applying standard architectural patterns, taking into account critical performance characteristics and security measures.
- Collaborate with Business Analysts, Architects and Senior Developers to establish the physical application framework (e.g. libraries, modules, execution environments).
- Perform end-to-end automation of the ETL process for the various datasets ingested into the big data platform.
- Maintain and support the application.
- Must be willing to flex work hours to support application launches and manage production outages if necessary.
- Must understand requirements thoroughly and in detail, and identify any gaps in them.
- Ensure detailed unit testing is performed, including negative scenarios, and document the results.
- Work with QA and automation team.
- Establish best practices and document the process.
- Manage code merges and releases (Bitbucket).
- Work with the architect and manager on designs and best practices.
- Good data analysis skills
- Must be a team player.
- Must have the following:
SQL
Spark/Spark Streaming
Big Data Tool Set
Linux
Python/PySpark
Kafka
- Experience collaborating with development teams, project managers, and engineers.
- Excellent communication and teamwork skills
- Safeguard the company’s assets.
- Adhere to the company’s compliance program.
- Maintain comprehensive knowledge of industry standards, methodologies, processes, and best practices.
- Maintain a focus on customer-service, efficiency, quality, and growth.