What are the responsibilities and job description for the Senior Data Engineer position at TheCorporate?
Job Summary
A Senior Data Engineer within our Enterprise Systems team will lead the development of new solutions that integrate seamlessly with the back end of a website and/or application. To be effective, the ideal candidate must have in-depth knowledge of modern programming languages and web technologies, backend infrastructure and databases, and AWS cloud services.
Who You Are
Basic Qualifications
- 5 years of experience with AWS, including team oversight.
- Proficiency in AWS Serverless, Cloud Security, DevOps, Cloud Migration, and Containers.
- Experience in building and managing scalable data storage solutions using AWS services such as S3, Aurora Postgres, Redshift, DynamoDB, or RDS.
- Ability to optimize data storage and retrieval for performance and cost efficiency.
- In-depth knowledge of modern database technologies and experience developing database applications on AWS using Node.js, Python, Chart.js, Lambda, and CloudWatch.
- Competence in integrating data from various sources (e.g., APIs, databases, streaming platforms) into centralized enterprise applications, data lakes, or warehouses.
- Ensuring data consistency, accuracy, and integrity across systems.
- Monitoring data pipelines and infrastructure for performance, reliability, and scalability.
- Implementing data security best practices, including encryption, IAM policies, and access controls (see the first sketch after this list).
- Proficiency in AWS services like S3, Glue, Redshift, EMR, Lambda, Athena, and Kinesis.
- Strong programming skills in Python, Java, or Scala.
- Extensive experience with SQL and database management.
- Familiarity with big data tools like Apache Spark, Hadoop, or Kafka.
- Ensuring compliance with data governance and regulatory requirements.
- Troubleshooting and resolving issues in data workflows.
- Collaborating with team members, Product Managers, subject matter experts, and other teams to refine requirements and translate them into functional software using standardized coding techniques and conventions.
- Strong problem-solving and analytical skills.
- Excellent communication and teamwork abilities.
- Comprehensive understanding of all development life cycle phases and solution delivery for cloud systems with experience in unit and integration testing, and strong documentation skills.
- Flexibility to adhere to a hybrid work schedule in Nashville, typically requiring 3 days per week in the office, with up to 5 days per week if business needs require.
- Must be authorized to work in the United States.
- Experience in transforming database code from one language to another, with an emphasis on Oracle to Postgres.
- Experience with CRM/Finance/Accounting systems is a plus.
- Knowledge of the software development lifecycle and concepts such as Agile, SAFe, scrum, CI/CD, and DevOps.
- Understanding of coding best practices, including CI/CD, Cloud Security, and DevOps.
- Experience with ETL solutions such as Glue and Lambda functions (see the second sketch after this list).
- Familiarity with dashboards and dashboard platforms such as AWS QuickSight, Google Looker, Tableau, and PowerBI.
- Knowledge of big data technologies such as AWS Redshift, Hive, and Spark.
- Experience with AWS Database Migration Service (DMS).
- Experience with Angular/React Material for front-end development and integration.
- AWS Certification.
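The data-security bullet above names encryption, IAM policies, and access controls. The following minimal sketch, written in Python (one of the languages listed) and assuming boto3 credentials with rights to manage IAM and S3, shows what two such controls can look like; the bucket name, prefix, and policy name are hypothetical placeholders, not part of this posting.

```python
# Illustrative only: two data-security controls, assuming boto3 credentials
# with permission to manage IAM and S3. All resource names are hypothetical.
import json

import boto3

iam = boto3.client("iam")
s3 = boto3.client("s3")

BUCKET = "corp-data-lake-raw"  # hypothetical bucket name

# 1. Encryption at rest: require SSE-KMS as the bucket's default encryption.
s3.put_bucket_encryption(
    Bucket=BUCKET,
    ServerSideEncryptionConfiguration={
        "Rules": [
            {"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "aws:kms"}}
        ]
    },
)

# 2. Least-privilege access: an IAM policy scoped to read-only access
#    on a single curated prefix of the bucket.
policy_document = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject"],
            "Resource": [f"arn:aws:s3:::{BUCKET}/curated/*"],
        }
    ],
}
iam.create_policy(
    PolicyName="DataLakeCuratedReadOnly",  # hypothetical policy name
    PolicyDocument=json.dumps(policy_document),
)
```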
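The ETL bullet above mentions Glue and Lambda functions. Below is a minimal, illustrative Lambda handler in Python that performs a small extract-transform-load step, assuming an S3 event trigger and a Redshift cluster reachable through the Redshift Data API; every resource name is a hypothetical placeholder, and a production pipeline would more likely batch loads with Glue jobs or a Redshift COPY from S3.

```python
# Illustrative only: a minimal S3-triggered Lambda ETL handler.
# Cluster, database, user, and table names are hypothetical placeholders.
import csv
import io

import boto3

s3 = boto3.client("s3")
redshift_data = boto3.client("redshift-data")


def lambda_handler(event, context):
    """Read a CSV dropped into S3, drop rows missing an id, and load the rest into Redshift."""
    record = event["Records"][0]["s3"]
    bucket = record["bucket"]["name"]
    key = record["object"]["key"]

    # Extract: read the object body and parse it as CSV.
    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
    rows = [r for r in csv.DictReader(io.StringIO(body)) if r.get("order_id")]

    # Transform + Load: insert cleaned rows one statement at a time.
    # (A real pipeline would batch these or use COPY for bulk loads.)
    for row in rows:
        redshift_data.execute_statement(
            ClusterIdentifier="analytics-cluster",  # hypothetical
            Database="warehouse",                   # hypothetical
            DbUser="etl_user",                      # hypothetical
            Sql="INSERT INTO staging.orders (order_id, amount) VALUES (:id, :amount)",
            Parameters=[
                {"name": "id", "value": row["order_id"]},
                {"name": "amount", "value": row.get("amount", "0")},
            ],
        )
    return {"loaded_rows": len(rows)}
```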
Salary: $55 - $62