Steneral Consulting is Hiring: Hybrid Azure DevOps Engineer II in Yardley, PA
DevOps Engineer II (Azure) - W2 candidates only. Ideal location is Yardley, PA (that is their first choice; hybrid from there, and candidates from that location will get first preference). Two-round interview process. MUST HAVE THESE SKILLS along with those in the JD (focus is definitely on the Azure side; any AWS experience is a plus)
Very high-profile, high-visibility assignment; must have great communication skills
Legit DevOps role, but not a build-and-release role.
Someone who was a data engineer and moved into DevOps is the ideal background
Need to be manipulating data using technologies such as Azure Synapse, Databricks, and Azure Data Factory
DevOps Engineer II
The DevOps Engineer II manages the standards and governance of the code repositories to optimize and secure build and release pipelines, infrastructure, and processes for enterprise software delivery. The DevOps Engineer II is engaged in site reliability engineering for the enterprise. The role supports Cenlar’s enterprise-wide Information Factory initiatives and ongoing cadence.
Responsibilities
DevOps
Serve as key resource in managing the standards and governance of the DevOps processes
Resolve increasingly complex problems, developing and implementing solutions where expertise is required to interpret policies, guidelines, or processes
Collaborate, consult, and present technical information to management as it pertains to the DevOps environment
Typically referred to as a technical resource within the DevOps environment
Begin to mentor junior-level DevOps Engineers and other IT staff members
Develop, document and communicate data DevOps processes and standards
Develop and maintain naming standards and conventions as they pertain to Azure resources
Develop best practices to ensure security controls are followed according to IT standards
Configure continuous integration and release pipelines
Configure approvals, delivery plans, and boards
Ensure pipelines and service connections are secured
Serve as a key technical resource in implementing standards through automation in build/release pipelines with appropriate approval processes and controls
Collaborate with cross-functional teams and management on enterprise controls and processes
Design and create build pipelines that prevent unapproved code from reaching the production environment
Collaborate with Data Engineers, Data Scientists, Developers, and other content creators to deliver the value of their artifacts in a controlled manner
Identify data needs, then implement and maintain continuous integration and continuous deployment (CI/CD) pipelines for data-related code
Identify opportunities for improvements and present recommendations to the management team for approval
Make recommendations on designs, builds, scalability, and security of DevOps infrastructure
Design, build, test, and deploy changes to existing software environments and enhance the security protocols of the data infrastructure
Collaborate with Data Engineers and Data Scientists to ensure data quality through cleaning and validation
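To illustrate the naming-standards responsibility above, here is a minimal sketch of an Azure resource-name validator of the kind such a role might maintain. The convention shown (`<env>-<workload>-<type>`) and the allowed environment and resource-type codes are illustrative assumptions, not Cenlar's actual standards.

```python
import re

# Hypothetical naming convention: <env>-<workload>-<type>, e.g. "prd-infofactory-adf".
# The environment codes and resource-type suffixes below are assumptions for
# illustration, not an actual published standard.
ALLOWED_ENVS = {"dev", "tst", "prd"}
ALLOWED_TYPES = {"adf", "syn", "dbw", "kv", "st"}  # Data Factory, Synapse, Databricks, Key Vault, Storage

NAME_PATTERN = re.compile(r"^(?P<env>[a-z]{3})-(?P<workload>[a-z0-9]{2,20})-(?P<type>[a-z]{2,3})$")

def validate_resource_name(name: str) -> list[str]:
    """Return a list of violations; an empty list means the name conforms."""
    match = NAME_PATTERN.match(name)
    if not match:
        return [f"'{name}' does not match <env>-<workload>-<type>"]
    violations = []
    if match["env"] not in ALLOWED_ENVS:
        violations.append(f"unknown environment '{match['env']}'")
    if match["type"] not in ALLOWED_TYPES:
        violations.append(f"unknown resource type '{match['type']}'")
    return violations
```

A check like this can run as an early pipeline step so that non-conforming resource names fail the build before anything is deployed.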
Security
Design and implement security measures to protect data and the code repository including encryption and access control
Establish best practices for creating Projects and Teams for the Enterprise
Ensure all projects are secured according to established IT controls and escalate issues to management
Champion security measures for data, including encryption and access control
Remain current with industry best practices, tools, and technologies in Data DevOps and ensure junior team members are well-informed of industry trends and changes
Appropriately identify and assess risk when business decisions are made, including but not limited to compliance and operational risks
Demonstrate consideration for Cenlar’s reputation, as well as that of our clients, by driving compliance with applicable laws, rules, and regulations; adhering to Policy; applying sound ethical judgment regarding personal behavior, conduct, and business practices; and escalating, managing, and reporting control issues
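The access-control responsibilities above could be enforced as a gate in a release pipeline along these lines. This is a minimal sketch under stated assumptions: the role name, the role assignments, and the rule that a release needs an approver other than its author are all hypothetical stand-ins for an organization's real directory groups and IT controls.

```python
# Hypothetical approval gate: a release may proceed only when someone other
# than its author, holding the "release_approver" role, has approved it.
# Role assignments here are illustrative stand-ins for an IT-managed
# directory group, not a real access-control list.
ROLE_ASSIGNMENTS = {
    "alice": {"release_approver"},
    "bob": {"developer"},
}

def can_promote(author: str, approvers: set[str]) -> bool:
    """Return True if at least one approver is not the author and holds the role."""
    return any(
        user != author and "release_approver" in ROLE_ASSIGNMENTS.get(user, set())
        for user in approvers
    )
```

Separating the author from the approver in this way is the same four-eyes principle that build pipelines encode to keep unapproved code out of production.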
Qualifications
Bachelor's degree in computer science, data engineering, or a related field or equivalent work experience
Minimum 5 to 8 years of experience designing, developing, and supporting databases
Proven experience in data engineering, data management, or DevOps roles
Strong knowledge and experience with cloud data platforms such as AWS, Azure, or Google Cloud
Proficient in programming languages such as Python, Java, or Scala
Experience and working knowledge of data orchestration tools and frameworks such as Apache Airflow
Proficient with containerization and orchestration technologies such as Docker and Kubernetes
Experienced in building pipelines to facilitate the containerization process
Experience with version control systems (e.g., Git) and CI/CD pipelines
Proficient problem-solving and troubleshooting skills coupled with the ability to recommend and develop process improvement strategies
Organizational skills with the ability to guide the development and implementation of multi-faceted data projects
Skilled and motivated team player with demonstrated success in communicating complex subject matter to all levels of the organization including senior leadership
Ability to form collaborative relationships with functional teams and to set expectations with business partners
Ability to act as a role model and mentor to colleagues and junior level staff
Able to serve as a key technical resource to support technology decisions