What are the responsibilities and job description for the Databricks/PySpark Developer / Dearborn, MI (hybrid) position at AB2 Consulting, Inc.?
Job Summary : Looking for an onsite Databricks / PySpark Developer who is willing to learn new technologies as needed and is able to work with a team. This position is long term and will likely be renewed annually.
Essential Job Functions :
- Design and development of data ingestion pipelines (Databricks background preferred).
- Performance-tune and optimize Databricks jobs.
- Evaluate new features and refactor existing code.
- Mentor junior developers and ensure all patterns are documented.
- Perform data migration and conversion activities.
- Develop and integrate software applications using suitable development methodologies and standards, applying standard architectural patterns, taking into account critical performance characteristics and security measures.
- Collaborate with Business Analysts, Architects and Senior Developers to establish the physical application framework (e.g. libraries, modules, execution environments).
- Perform end to end automation of ETL process for various datasets that are being ingested into the big data platform.
- Maintain and support the application.
- Be willing to flex work hours to support application launches and manage production outages when necessary.
- Understand the requirements thoroughly and in detail, and identify gaps in them.
- Ensure detailed unit testing is performed, including negative scenarios, and document the results.
- Work with the QA and automation teams.
- Establish best practices and document processes.
- Manage code merges and releases (Bitbucket).
- Work with the architect and manager on designs and best practices.
- Apply good data analysis skills.