What are the responsibilities and job description for the Technical Lead / Data Engineering Specialist (Databricks, Azure Data Lake, Python)/Greenfield, IN 46140 (Onsite) position at Radiansys, Inc.?
Job Details
Hi
We are looking for a Technical Lead / Data Engineering Specialist (Databricks, Azure Data Lake, Python) in Greenfield, IN 46140 (Onsite). Anyone interested can share their resume at
Title: Technical Lead / Data Engineering Specialist (Databricks, Azure Data Lake, Python)
Location: Greenfield, IN 46140 (Onsite)
Contract W2/C2C & CTH
Required skills:
We need a Senior Data Engineer / Lead / Data Architect with primary expertise in Azure data services, Databricks, Synapse, ADLS / Data Lake, and Python, plus some experience with Google Cloud Platform (GCP) and Google Cloud Storage (GCS).
We are looking for a highly skilled Technical Lead / Data Engineering Specialist with extensive experience in cloud technologies, DevOps development practices, and data engineering to support and enhance RDAP initiatives.
Key Responsibilities
- Design, develop, and maintain Databricks Lakehouse solutions sourcing from Cloud platforms such as Azure Synapse and Google Cloud Platform
- Implement and manage DevOps and CI/CD workflows using tools like GitHub
- Apply best practices in test-driven development, code review, branching strategies, and deployment processes
- Build, manage, and optimize Python packages using tools like setuptools, Poetry, wheels, and artifact registries
- Develop and optimize data pipelines and workflows in Databricks, utilizing PySpark and Databricks Asset Bundles
- Manage and query SQL databases (Unity Catalog, SQL Server, Hive, Postgres)
- Implement orchestration solutions using Databricks Workflows, Airflow, and Dagster
- Work with event-driven architectures using Kafka, Azure Event Hub, and Google Cloud Pub/Sub
- Develop and maintain Change Data Capture (CDC) solutions using tools like Debezium
- Extensive experience in the design and implementation of data migration projects, specifically involving Azure Synapse and Databricks Lakehouse
- Manage cloud storage solutions, including Azure Data Lake Storage and Google Cloud Storage
- Configure and manage identity and access solutions using Azure Active Directory, including AD Groups, Service Principals, and Managed Identities
- Interact effectively with the customer to understand requirements, participate in design discussions, and translate requirements into deliverables by working with the offshore development team; collaborate effectively with cross-functional teams across development, operations, and business units; strong interpersonal skills to build and maintain productive relationships with team members
- Problem-solving and analytical thinking: ability to troubleshoot and resolve issues efficiently; analytical mindset for optimizing workflows and improving system performance
- Ability to convey complex technical concepts in a clear and concise manner to both technical and non-technical stakeholders; Strong documentation skills for creating process guidelines, technical workflows, and reports
Technologies, Skills & Experience
- Databricks (PySpark, Databricks Asset Bundles)
- Python package builds (setuptools, Poetry, wheels, artifact registries)
- Open file formats (Delta, Parquet, Iceberg, etc.)
- SQL Databases (Unity Catalog, SQL Server, Hive, Postgres)
- Orchestration tools (Databricks Workflows, Airflow, Dagster)
- Azure Data Lake Storage, Azure Active Directory (AD Groups, Service Principals, Managed Identities)
- Secondary / good-to-have skills: Kafka, Azure Event Hub, Cloud Pub/Sub; Change Data Capture (Debezium); Google Cloud Storage
- Bachelor's degree in Computer Science, Information Technology, or a related field, with 12 years of experience
Regards,
Pinku Kumar
Talent Acquisition, Radiansys Inc.
39510 Paseo Padre Pkwy #110, Fremont, CA 94538
Direct: Ext 1006
Email: