What are the responsibilities and job description for the Senior Data Engineer position at CCS Global Tech?
Job Details
Position Summary
We are looking for a Senior Data Engineer to expand our Data Platform stack and support our developers, database architects, and data analysts on data initiatives. The selected candidate will ensure that an optimal data delivery architecture remains consistent across ongoing projects in a dynamic environment.
Responsibilities
- Analyze and interpret complex data sets; identify data anomalies and resolve data issues.
- Understand specific business processes and domain concepts and relate them to data subject domains.
- Collaborate with Data Leads, Data Analysts, and QA Analysts to validate requirements; participate in user requirement sessions.
- Test and validate data flows, and prepare ETL processes according to business requirements.
- Perform ETL tuning and SQL tuning.
- Document data flows representing business logic in ETL routines.
- Design and implement a data conversion strategy from legacy to new platforms.
- Perform design validation, reconciliation, and error handling in data load processes.
- Design and prepare technical specifications and guidelines, including ER diagrams and related documents.
Qualifications
- Must be well-versed in data warehousing concepts, including design patterns such as star and snowflake schemas, and in data modeling concepts, including normal forms.
- Knowledge of AWS Infrastructure, including S3, SNS, EC2, CloudWatch, and RDS.
- 7 years of experience in ETL/data transformation projects with one or more related products such as Informatica, Talend, or Microsoft SSIS.
- 7 years of experience in business intelligence and data warehousing initiatives.
- Advanced SQL knowledge, including experience with relational databases and query authoring, as well as familiarity with a variety of databases such as Oracle, MS SQL Server, or Vertica.
- Good to have:
  - At least 1 year working with the Matillion ELT tool and the Snowflake database.
  - Experience building data pipelines using Python or Spark/Spark SQL in any cloud environment (AWS/Azure/Google).
  - Experience with any of the NoSQL datastores such as Elasticsearch, MongoDB, DynamoDB, or Cassandra.
Education/Experience
- Bachelor's degree in Mathematics, Computer Science, or a related technical field. Postgraduate degree preferred.
- Minimum 5 years of relevant experience.