What are the responsibilities and job description for the Senior Data Engineer (Hybrid): 25-00078 position at Platinum Resource Group?
Senior Data Engineer
Location: Flower Mound, TX (Hybrid - 2 days onsite per week)
Contract-to-hire Opportunity
JOB DESCRIPTION
- Design and implement robust data pipelines using Databricks and integrate these with Delta Lake for efficient data storage and management.
- Manage end-to-end data workflows from ingestion to insights, utilizing Fivetran for seamless data integration and GCP for scalable cloud solutions.
- Administer Databricks environments, ensuring optimal configuration, security, and performance across all data operations.
- Utilize Unity Catalog to manage data governance across all Databricks workspaces, ensuring compliance with data privacy and security policies.
- Develop and maintain scalable and efficient data models and architecture, supporting the strategic goals of the organization.
- Collaborate with data scientists and analysts to deploy machine learning models and complex analytical projects.
- Monitor, troubleshoot, and optimize data systems, proposing and implementing improvements to enhance performance.
- Document all data procedures and systems, ensuring clear and accurate records are maintained for compliance and operational efficiency.
- Stay updated with the latest innovations in data engineering and introduce new tools and techniques to keep the data infrastructure at the forefront.
- Always represent the Company in a professional manner and appearance.
- Understand and internalize the Company's purpose.
- Display loyalty to the Company and its organizational values.
- Display enthusiasm and dedication to learning how to be more effective on the job and share knowledge with others.
- Work effectively with co-workers, internal and external customers, and others by sharing ideas in a constructive and positive manner; listen to and objectively consider ideas and suggestions from others; keep commitments; keep others informed of work progress, timetables, and issues; address problems and issues constructively to find mutually acceptable and practical business solutions; address others by name, title, or other respectful identifier; and respect the diversity of our workforce in actions, words, and deeds.
- Comply with the policies and procedures stated in the Injury and Illness Prevention Program by always working in a safe manner and immediately reporting any injury, safety hazard, or program violation.
- Ensure conduct is consistent with all Compliance Program Policies and procedures when engaging in any activity on behalf of the company. Immediately report any concerns or violations.
QUALIFICATIONS
- Bachelor's degree in Computer Science, Engineering, or a related field (or four additional years of experience in lieu of a degree).
- Master's degree in a related field is a plus.
- General knowledge of advanced concepts, practices, and procedures related to the design and implementation of robust data pipelines.
- Proven expertise in Databricks administration and managing large-scale data environments.
- Strong experience with SQL, Python, and other scripting languages commonly used in data engineering.
- Familiarity with Fivetran or similar data integration tools and an understanding of ETL processes.
- Previous experience managing end-to-end data workflows.
- Minimum of five (5) years of experience [nine (9) years for non-degreed candidates] in a data engineering role with significant exposure to Databricks, Delta Lake, and cloud platforms like GCP.
- Experience with Unity Catalog and managing data access and security within a Databricks environment.
- Knowledge of Java Archive (JAR) files and their applications in data projects.
- Must have strong organizational skills.
- Must have a detail orientation and the proven ability to prioritize work.
- Must have effective verbal and written communication skills.
- Must have the ability to work with limited supervision and as part of a team.
- Sound decision-making abilities.
- Excellent problem-solving skills and the ability to work in a dynamic, fast-paced environment.
- Strong communication and collaboration skills, capable of working effectively across multiple teams.
- Experience with machine learning and AI workflows.
- Familiarity with workflow orchestration tools such as Apache Airflow or Apache NiFi, demonstrating the ability to design, implement, and manage complex data pipelines and workflows.
- Knowledge of data visualization and reporting tools (e.g., Power BI, Tableau).
- Certifications in data engineering or a related field are a plus.
- Willingness to travel approximately twice per year to Temecula, CA for team building, training, or other events.
Please note we are currently unable to sponsor or transfer visas for this position. Candidates must be authorized to work in the U.S. on a permanent basis.
Platinum Resource Group is a professional-level consulting firm, providing resources to Fortune 1000 client companies in the areas of technology, human resources, accounting, finance, business systems, and supply chain, on a contract and interim basis. PRG has operations in Orange County, San Diego, Los Angeles, and San Francisco. As a W-2 employer, we offer our consultants bi-weekly payroll via direct deposit, health, dental, and vision benefits, paid holidays, and referral bonuses.
Job Type: Full-time
Pay: $125,000.00 - $165,000.00 per year
Benefits:
- Dental insurance
- Health insurance
- Vision insurance
Schedule:
- 8-hour shift
Application Question(s):
- Are you authorized to work in the United States without sponsorship?
Work Location: Hybrid remote in Flower Mound, TX 75028