What are the responsibilities and job description for the Big Data engineering (Neo4j) - W2 Only position at Apidel Technologies?
Job Details
Job Overview:
Looking for someone with expertise in Big Data engineering and Graph Databases (Neo4j) to support API development for better network understanding. The role involves ETL development using Databricks, with a strong focus on PySpark and containerized infrastructure. The candidate will play a key role in a structural Big Data approach for Comcast's fiber footprint, specifically focusing on new Big Data ETL solutions for fiber cable networks.
Key Responsibilities:
- Design and implement Big Data solutions using Neo4j and Databricks.
- Work extensively with ETL processes using Databricks and PySpark.
- Containerize infrastructure for scalability and efficiency.
- Apply Graph Database expertise (Neo4j or AWS Neptune) to data modeling and analysis.
- Collaborate with teams to ensure best practices in Big Data ETL development.
Required Qualifications:
- 8-10 years of experience in Big Data engineering.
- Strong expertise in:
  - Graph Databases (Neo4j or AWS Neptune)
  - ETL development and Big Data processing
  - Databricks, Python, PySpark
- Experience working on structural Big Data approaches for large-scale networks.
- Strong problem-solving skills and ability to work in a fast-paced environment.
- Experience with AWS components (must have).
Interview (Onsite):
- Round 1: Technical interview with Jasmeet and panel (Coding assessment).
- Round 2: Interview with the Product Owner (PO).