What are the responsibilities and job description for the W2 - Sr Software Engineer (Java, Devops, GCP, Python, Data Engineering) - Remote position at Tanson Corp?
Duties: Please refer to the Schedule Notes for requirements.
Position Overview (Major Functions and Non-Essential Functions): We are seeking a Senior Software Engineer to design and build back-end services that support our portfolio of data-centric clinical and analytic applications. These applications leverage cloud computing, big data, mobile, data science, data warehousing, and machine learning, built with state-of-the-art software development tools and frameworks. Our Software Engineers ensure that these cloud-based microservices adhere to uptime and accuracy targets, are resilient, and scale as data volumes and traffic increase. They work closely with the data engineering, platform, and solutions teams to develop applications that benefit our practice and patients.
- Work closely with Product Owners, Product Managers, and Architects to translate requirements into code.
- Develop services around data warehousing, big data, cloud computing, business intelligence, analytics, and machine learning.
- Participate in DevOps, Agile, and continuous development and integration frameworks.
- Program in high-level languages such as Go, Python, and Java.
- Ensure all appropriate documentation of processes and source code is created and maintained.
- Communicate effectively with peers, leaders, and customers throughout the organization.
- Participate in expert-level troubleshooting and resolve problems through root cause analysis and data and system investigation.
- Contribute to design and architecture discussions with Principals and Architects.
- Lead targeted cross-functional improvement efforts and mentor more junior software engineers.
- Solve complex problems; take a new perspective on existing solutions.
- Work independently with minimal guidance; may lead projects or project steps within a broader project, or have accountability for ongoing activities or objectives.
- Act as a resource for colleagues with less experience.
Skills: Additional Experience and/or Qualifications (Has Achieved Competency in the Following Areas, Job Knowledge, and Additional Considerations):
Preferred qualifications for this position include:
Education: Minimum Education and/or Experience Required (Education Requirements and Experience): Required qualifications for this position include: Bachelor's degree in Computer Science/Engineering or a related field with 5 years of experience as noted below; OR an Associate's degree in Computer Science/Engineering or a related field with 7 years of experience.
Schedule Notes: Scope of Work: The resource will support an engineering team tasked with building out a research data platform that will ingest research-generated data and make it discoverable.
Data Engineering Skills & Experience:
Programming Languages: The primary pipeline development language will be Python. Some data types and formats may require other languages (e.g., Java, R) because the libraries, frameworks, or SDKs available for working with those data types and formats do not exist in Python.
Operating Systems: The primary operating system for data pipeline execution will be Linux, with data pipelines packaged, deployed, and run as containers. Data source systems could be Windows or Linux based.
Infrastructure: The primary data platform and data pipeline execution infrastructure will be hosted on Google Cloud Platform (GCP), utilizing cloud-native technologies (e.g., Google Cloud Storage, BigQuery, Google Batch, Dataflow, Cloud SQL). Data will be replicated from various on-premises sources, including laboratory instruments, network shared drives, and Windows desktops attached to instruments.
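For illustration only (this sketch is not part of the posting), the following shows the kind of GCP-native pipeline step the infrastructure above implies: staging a replicated source file into Cloud Storage, then loading it into BigQuery. The stage_and_load helper, the bucket and table names, and the CSV input format are hypothetical placeholders; the sketch assumes the google-cloud-storage and google-cloud-bigquery client libraries.

    # Hypothetical sketch: stage a replicated source file into Cloud Storage,
    # then load it into BigQuery so the data becomes discoverable.
    from google.cloud import bigquery, storage

    def stage_and_load(local_path: str, bucket_name: str, table_id: str) -> None:
        # Copy the source file (e.g., an instrument export) into Cloud Storage.
        blob = storage.Client().bucket(bucket_name).blob(f"raw/{local_path}")
        blob.upload_from_filename(local_path)

        # Load the staged object into a BigQuery table.
        job_config = bigquery.LoadJobConfig(
            source_format=bigquery.SourceFormat.CSV,
            autodetect=True,  # infer the schema; a production pipeline would pin one
        )
        load_job = bigquery.Client().load_table_from_uri(
            f"gs://{bucket_name}/raw/{local_path}", table_id, job_config=job_config
        )
        load_job.result()  # block until the load job completes

    # Example (placeholder names):
    # stage_and_load("assay_results.csv", "research-raw-data", "my-project.assays.results")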
Development Tools: Sprints, features, and tasks will be managed in Azure DevOps. Code will be managed and versioned in Azure DevOps-based Git repositories, and will be compiled, packaged, and deployed using Azure DevOps build pipelines. Data pipelines will be packaged, deployed, and run as Docker containers, which will be stored and versioned in Google Artifact Registry repositories. Veracode will be used to scan source code for vulnerabilities, and Prisma Cloud will be used to scan containers. The standard integrated development environment will be JetBrains (PyCharm, IntelliJ, etc.) or VSCode.
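Again for illustration only, a minimal sketch of the containerized pipeline pattern described above: the Azure DevOps build pipeline would build and push the Docker image, and the container's command would run an entrypoint script along these lines. The argument names and the ingest step are hypothetical.

    # Hypothetical entrypoint a pipeline container might run; argument names
    # are placeholders, not taken from the posting.
    import argparse
    import logging
    import sys

    def main() -> int:
        parser = argparse.ArgumentParser(description="Research data ingest step")
        parser.add_argument("--source-uri", required=True, help="gs:// object to ingest")
        parser.add_argument("--target-table", required=True, help="BigQuery table id")
        args = parser.parse_args()

        logging.basicConfig(level=logging.INFO)
        logging.info("Ingesting %s into %s", args.source_uri, args.target_table)
        # The pipeline logic (e.g., a stage-and-load step like the earlier sketch)
        # would run here.
        return 0

    if __name__ == "__main__":
        sys.exit(main())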
Preferred Candidates:
Required Skills: Java, DevOps, GCP
Additional Skills: Business Intelligence, C#, C, Time Management, Problem Solving, Big Data, Golang, Machine Learning, Application Development, Coding Standards, Life Cycle, Coding, Agile, Data Warehousing, Operations, Root Cause Analysis, Structured Software, Amazon Web Services, B2B Software, Documentation, Distributed Systems, Translate, Python, Mentors
Minimum Degree Required: Bachelor's Degree
Hours Per Day: 8.00
Hours Per Week: 40.00
Pay Range: $67 to $76 per hour on W2.
Salary: $67 - $76 per hour