What are the responsibilities and job description for the Databricks/PySpark Developer position at V2Soft?
Job Details
V2Soft is a global company, headquartered out of Bloomfield Hills, Michigan, with locations in Mexico, Italy, India, China and Germany. At V2Soft, our mission is to provide high performance technology solutions to solve real business problems. We become our customer's true partner, enabling both parties to enjoy success. We are committed to promoting diversity in the workplace, and believe it has a positive effect on our company and the customers we serve.
- Visit our website to view all of our open opportunities and to learn more about our benefits.
Essential Job Functions:
- Design and development of data ingestion pipelines (Databricks background preferred).
- Performance-tune and optimize Databricks jobs.
- Evaluate new features and refactor existing code.
- Mentor junior developers and ensure all patterns are documented.
- Perform data migration and conversion activities.
- Develop and integrate software applications using suitable development methodologies and standards, applying standard architectural patterns, taking into account critical performance characteristics and security measures.
- Collaborate with Business Analysts, Architects and Senior Developers to establish the physical application framework (e.g. libraries, modules, execution environments).
- Perform end-to-end automation of the ETL process for the various datasets ingested into the big data platform.
- Maintain and support the application.
- Must be willing to flex work hours to support application launches and manage production outages as necessary.
- Understand the requirements thoroughly and in detail, and identify gaps in the requirements.
- Ensure detailed unit testing is performed, covering negative scenarios, and document the results.
- Work with the QA and automation teams.
- Establish best practices and document processes.
- Manage code merges and releases (Bitbucket).
- Work with the architect and manager on designs and best practices.
- Apply strong data analysis skills.
- Safeguard the company's assets.
- Adhere to the company's compliance program.
- Maintain comprehensive knowledge of industry standards, methodologies, processes, and best practices.
- Maintain a focus on customer-service, efficiency, quality, and growth.
- Collaborate with additional team members.
- Other duties as assigned.
Minimum Qualifications and Job Requirements:
- Must be a team player.
- Must have experience with the following:
- Scala
- SQL
- Spark/Spark Streaming
- Big Data Tool Set
- Linux
- Python/PySpark
- Kafka
- Experience collaborating with development teams, project managers, and engineers.
- Excellent communication and teamwork skills.
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.