What are the responsibilities and job description for the Data Engineer AWS position at FRACSYS INC?
Job Description
Fracsys Inc is hiring for an AWS Data Engineer position. The ideal candidate must have at least 8 years of industry experience and will be responsible for the successful technical delivery of Data Warehousing and Business Intelligence projects. Responsibilities include:
- Data extraction, data cleaning, data loading, statistical data analysis, exploratory data analysis, and data wrangling
- Writing complex SQL queries and stored procedures to extract, manipulate, and analyze data from relational databases
- Conducting diagnostic and troubleshooting steps and presenting conclusions to a varied audience
- Building data marts and data warehouses to facilitate analytical capabilities across various sources of data
- Designing and implementing data patterns in the cloud using cloud data warehouse and data virtualization tools
- Evaluating new tools and technologies for the target-state cloud architecture
- Learning and implementing new ETL, BI tools, and data warehousing technologies
Qualifications
- Senior data professional with over 8 years of expertise in data engineering, data analysis, and data science
- Experience creating data marts and data warehouses, both on-premises and in the cloud, for real-time and batch processing frameworks
- Experience building and optimizing big data pipelines and data sets, including Postgres and Amazon Relational Database Service (RDS)
- Extensive hands-on experience building and optimizing data structures for data analytics, data science, and business intelligence
- Experience building self-service data consumption patterns and knowledge of cloud-based data lake platforms
- Experience wrangling structured and unstructured data, with in-depth knowledge of database architecture
- Experience utilizing or leading implementations leveraging ETL tools (Informatica, Talend); BI reporting tools such as MicroStrategy and Microsoft Power BI; data modeling tools such as Erwin; Oracle, SQL Server, and NoSQL databases; JDBC; UNIX shell scripting; Perl and Java; XML/JSON files; SAS; Python; and AWS cloud-native technologies including S3, Athena, and Redshift
- Experience with Snowflake is a plus
- Familiarity with the following technologies: Hadoop, Kafka, Airflow, Hive, Presto, Athena, S3, Aurora, EMR, Spark
- Ability to drive, contribute to, and communicate solutions to technical product challenges
- Ability to roll up your sleeves, identify process or resource gaps, and fill them yourself
- Excellent written and oral communication skills in English
Education
Bachelor's degree in computer science, data analytics, information systems, or a related field, or equivalent experience. A Master's degree is preferred.