What are the responsibilities and job description for the Data Engineer(DB2, PostgreSQL) position at Compunnel Inc.?
Job Title: Data Engineer (DB2, PostgreSQL) - W2/Full-Time - Can provide sponsorship
Duration: Long Term
Location: Westlake, TX - Hybrid - Local candidates preferred.
Must haves
- 6 years of data engineering experience (DB2, PostgreSQL)
- SQL, PL/SQL, Python, and shell scripting (roughly 50% PostgreSQL, 30% Python, 20% shell scripting).
- The interview will include complex SQL coding challenges as well as PostgreSQL and shell exercises; a sample query sketch follows this list.
- Must have experience working with ETL and streaming tools and processes (Kafka is preferred, but other tools such as Adeptia or Alteryx will be considered); a Kafka consumer sketch also follows this list.
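
As a rough illustration of the kind of "complex SQL" challenge mentioned above, here is a minimal sketch of a PostgreSQL window-function query run from Python via psycopg2. The connection parameters and the transactions table and its columns are hypothetical, not part of the posting.

```python
import psycopg2

# Hypothetical connection parameters; adjust for your environment.
conn = psycopg2.connect(host="localhost", dbname="analytics",
                        user="etl", password="change-me")

# Rank each customer's transactions by amount within each month --
# the kind of window-function query common in SQL interviews.
QUERY = """
SELECT customer_id,
       txn_date,
       amount,
       RANK() OVER (PARTITION BY customer_id,
                                 date_trunc('month', txn_date)
                    ORDER BY amount DESC) AS monthly_rank
FROM transactions
WHERE txn_date >= %s;
"""

with conn, conn.cursor() as cur:
    cur.execute(QUERY, ("2024-01-01",))
    for row in cur.fetchmany(10):
        print(row)
conn.close()
```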
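
And for the streaming requirement, a minimal Kafka consumer sketch using the kafka-python client; the topic name, broker address, and group id are assumptions for illustration only.

```python
import json
from kafka import KafkaConsumer

# Hypothetical topic and broker; adjust for your environment.
consumer = KafkaConsumer(
    "transactions",                # topic name (assumption)
    bootstrap_servers="localhost:9092",
    group_id="etl-loader",
    auto_offset_reset="earliest",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

for message in consumer:
    record = message.value
    # Validate/transform here, then load into PostgreSQL or DB2.
    print(record)
```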
Nice to haves
- Financial industry experience working with an Intelligence unit or Cyber Fraud team
- DB2, Snowflake, Oracle, MongoDB, Redshift
Required Technical Skills
- Bachelor’s or master’s degree, or equivalent experience, in a technology-related field such as Computer Science or Engineering, with a consistent track record.
- Object-oriented Python programming and proven experience with machine learning libraries such as Pandas, NumPy, Scikit-learn, and TensorFlow (see the regression sketch after this list).
- Hands-on experience with event-based systems, functional programming, new technologies, and messaging frameworks such as Kafka.
- Ensures alignment with enterprise data architecture strategies.
- Improves data availability via APIs and shared services, and recommends optimization solutions using cloud technologies for data processing, storage, and advanced analytics.
- Provides technical mentorship on database technologies to cyber security teams.
- Performs risk assessments and implements validation of data processing systems to ensure application functionality and security.
- Performs independent and sophisticated technical and functional analysis for multiple projects supporting several divisional initiatives.
- Builds the technical infrastructure required for efficient Extraction, Transformation, and Loading (ETL) of data from a wide variety of data sources using object-oriented/functional scripting languages such as Python.
- Expertise with relational databases, Splunk, Snowflake, YugabyteDB, Aerospike, S3, and similar data management platforms.
- Experience with DB2, including writing stored procedures to process data, is required.
- Data parsing and analytics experience on large data sets using Python, scripting, and similar technologies, including integrating with and consuming APIs (see the API ingestion sketch after this list).
- Familiarity with quantitative techniques and methods, statistics, and econometrics, including probability, linear regression, time series analysis, and optimization.
- Knowledge of hybrid on-prem and cloud data architectures and services, especially data streaming, storage, and processing functionality.
- Experience in Agile methodologies (Kanban and SCRUM) is a plus.
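
To make the machine learning and quantitative bullets concrete, here is a minimal sketch fitting a linear regression with Pandas, NumPy, and Scikit-learn. The data is synthetic and the column names (balance, txn_count, risk_score) are purely illustrative.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression

# Synthetic data; column names are illustrative only.
rng = np.random.default_rng(42)
df = pd.DataFrame({
    "balance": rng.normal(10_000, 2_500, size=500),
    "txn_count": rng.poisson(20, size=500),
})
df["risk_score"] = (0.3 * df["txn_count"]
                    - 0.0001 * df["balance"]
                    + rng.normal(0, 1, size=500))

# Baseline linear regression of the kind referenced above.
model = LinearRegression().fit(df[["balance", "txn_count"]], df["risk_score"])
print("coefficients:", model.coef_, "intercept:", model.intercept_)
```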
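
Likewise, a minimal sketch of the API-consumption and data-parsing skill set: paginated JSON ingestion with requests and Pandas. The endpoint URL, pagination scheme, and field names are assumptions, not part of the posting.

```python
import pandas as pd
import requests

# Hypothetical paginated REST endpoint.
URL = "https://api.example.com/v1/transactions"

rows = []
page = 1
while True:
    resp = requests.get(URL, params={"page": page}, timeout=30)
    resp.raise_for_status()
    batch = resp.json()
    if not batch:               # empty page signals the end (assumption)
        break
    rows.extend(batch)
    page += 1

df = pd.DataFrame(rows)
# Light transform before loading into the warehouse.
df["amount"] = pd.to_numeric(df["amount"], errors="coerce")
print(df.describe())
```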