Big Data Engineer / Consulting Level

CereCore
Nashville, TN Full Time
POSTED ON 3/27/2025
AVAILABLE BEFORE 4/25/2025
Classification: Contract

Contract Length: 6 Months

Position Summary

The Big Data Engineer/Consulting-Level serves as a primary development resource for designing, coding, testing, implementing, documenting, and maintaining NextGen solutions for GCP Cloud enterprise data initiatives. The role requires working closely with data teams, frequently in a matrixed environment as part of a broader project team. Because the technology and its practices are emerging and fast-evolving, the position requires staying well-informed of technological advancements and putting new innovations into effective practice. In addition, this position requires a candidate who can analyze business requirements and design, construct, test, and implement solutions with minimal supervision. This candidate will have a track record of participation in successful projects in a fast-paced, mixed team environment.

Responsibilities

  • Work with data engineers, data architects, and SIEM Team stakeholders to understand product requirements, then design, build, and monitor streaming platforms and capabilities that meet today's requirements and can scale gracefully.
  • Implement automated workflows that lower manual/operational costs, define and uphold SLAs for timely delivery of data, and move the company closer to democratizing streaming data.
  • Enable a self-service SIEM data architecture supporting query exploration, dashboards, data catalog, and rich data discovery.
  • Promote a collaborative team environment that prioritizes effective communication, team member growth, and success of the team over success of the individual.
  • Design and create real-time data services using Confluent Kafka and/or Streamsets that accelerate the time from idea to insight.
  • Adhere to and support platform engineering best practices, processes, and standards.
  • Produce high-quality, modular, reusable code that incorporates best practices.
  • Help promote and support security best practices that align with industry standards and regulatory and legal requirements. Mentor team members on complex data projects and on following the Agile process.
  • Help lead implementation of unit and integration tests and promote and conduct performance testing where appropriate.
  • Be a leader in the HCA data community. Evangelize data and platform engineering best practices and standards, participate or present at community events, and encourage the continual growth and development of others.
  • Develop a strong understanding of relevant product area, codebase, and/or systems
  • Demonstrate proficiency in data analysis, programming, and software engineering
  • Work closely with the Lead Architect and Product Owner to define, design and build new features and improve existing products
  • Produce high quality code with good test coverage, using modern abstractions and frameworks
  • Work independently and complete tasks on schedule by exercising strong judgment and problem-solving skills
  • Collaborate closely with team members to successfully execute development initiatives using Agile practices and principles
  • Participate in the deployment, change management, configuration, administration, and maintenance of deployment processes and systems
  • Effectively prioritize workload to meet deadlines and work objectives
  • Work in an environment with rapidly changing business requirements and priorities
  • Work collaboratively with Data Scientists and business and IT leaders throughout the company to understand their needs and use cases.
  • Work closely with management, architects, and other teams to develop and implement projects.
  • Actively participate in technical group discussions and adopt new technologies that improve development and operations.

Requirements

  • This role will provide application development for specific business environments.
  • Responsible for building and supporting a GCP-based ecosystem designed for enterprise-wide analysis of structured, semi-structured, and unstructured data.
  • Bring new data sources into GCP, transform and load them into databases, and support regular requests to move data from one cluster to another.
