What are the responsibilities and job description for the Database Analyst position at Capital Technology Alliance?
This advertisement is for a full-time, on-site (Tallahassee, FL) position serving one of our Florida Government clients.
Our Florida State Government client seeks a Database Analyst to create tables, migrate data, configure ETL pipelines, and create backup loads. The Database Analyst will provide consulting services in support of the creation of a Data Warehouse.
Job Duties:
1. Writing ETL Pipelines
ETL Tool Selection:
- Research candidate no/low-code ETL platforms/tools and present alternatives, with their advantages and disadvantages, to aid in tool selection.
Design and Develop ETL Pipelines:
- Create, optimize, and maintain ETL pipelines to migrate data from various sources (Oracle, Microsoft SQL Server, flat files) to the target Snowflake data warehouse.
- Utilize a combination of coding (Python, SQL, etc.) and no/low-code tools (e.g., Azure Data Factory, Talend, Informatica) to develop ETL processes that meet data integration needs.
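As a hedged illustration of the kind of pipeline step this duty describes, the following minimal Python sketch extracts a flat-file source, applies a trivial transformation, and loads the result into a Snowflake staging table via the Snowflake Python connector. The connection parameters, CSV path, and table name are hypothetical placeholders, and the target table is assumed to already exist.

```python
import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

def load_flat_file(csv_path: str, table: str) -> int:
    # Extract: read the source flat file.
    df = pd.read_csv(csv_path)

    # Transform: a trivial cleanup step (normalize column names).
    df.columns = [c.strip().upper() for c in df.columns]

    # Load: push the frame into an existing Snowflake staging table.
    conn = snowflake.connector.connect(
        account="my_account",   # hypothetical account identifier
        user="etl_user",        # hypothetical service account
        password="***",         # use a secrets manager in practice
        warehouse="ETL_WH",
        database="DW",
        schema="STAGING",
    )
    try:
        success, _, nrows, _ = write_pandas(conn, df, table)
        return nrows if success else 0
    finally:
        conn.close()
```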
Data Transformation and Validation:
- Apply transformations to cleanse, filter, join, and aggregate data from the source systems.
- Ensure data quality by implementing validation rules to verify data correctness and completeness before loading into Snowflake.
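A minimal sketch of such validation rules, assuming a pandas batch and hypothetical column names; an empty result means the batch may proceed to the load step:

```python
import pandas as pd

def validate(df: pd.DataFrame) -> list[str]:
    errors = []
    # Completeness: required columns must exist and contain no nulls.
    for col in ("CUSTOMER_ID", "ORDER_DATE", "AMOUNT"):
        if col not in df.columns:
            errors.append(f"missing column {col}")
        elif df[col].isna().any():
            errors.append(f"nulls in {col}")
    # Correctness: order amounts must be non-negative.
    if "AMOUNT" in df.columns and (df["AMOUNT"] < 0).any():
        errors.append("negative AMOUNT values")
    return errors
```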
Source-to-Target Mapping:
- Create detailed source-to-target mappings to ensure accurate transformation from the source systems (Oracle, Microsoft SQL Server) into Snowflake.
- Ensure the solution provides data quality, consistency, and integrity throughout the ETL process.
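One common way to make such a mapping executable is to express it as data that the pipeline interprets; a sketch with hypothetical source and target column names:

```python
from datetime import datetime

# Source column -> (target Snowflake column, transform to apply).
MAPPING = {
    "cust_no": ("CUSTOMER_ID", int),
    "ord_dt":  ("ORDER_DATE", lambda s: datetime.strptime(s, "%Y-%m-%d").date()),
    "amt_usd": ("AMOUNT", float),
}

def apply_mapping(row: dict) -> dict:
    # Produce a target-shaped record from one source record.
    return {tgt: fn(row[src]) for src, (tgt, fn) in MAPPING.items()}
```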
Automation and Scheduling:
- Work with other resources to automate the ETL processes to run at scheduled intervals based on specific needs.
- Monitor and optimize ETL jobs to ensure they run efficiently and on time.
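Scheduling can live in an external orchestrator or in Snowflake itself; the sketch below uses a Snowflake TASK on a cron schedule, with a hypothetical task name and stored procedure, and assumes conn is an open Snowflake connection:

```python
# The task body calls a hypothetical load procedure nightly at 2:00 AM.
CREATE_TASK = """
CREATE TASK IF NOT EXISTS DW.STAGING.NIGHTLY_LOAD
  WAREHOUSE = ETL_WH
  SCHEDULE = 'USING CRON 0 2 * * * America/New_York'
AS
  CALL DW.STAGING.LOAD_ORDERS()
"""

def schedule_nightly_load(conn):
    cur = conn.cursor()
    cur.execute(CREATE_TASK)
    # Tasks are created suspended; they must be resumed to start running.
    cur.execute("ALTER TASK DW.STAGING.NIGHTLY_LOAD RESUME")
```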
Error Handling and Troubleshooting:
- Implement error detection, logging, and alerting mechanisms within the ETL pipelines.
- Troubleshoot ETL failures, identify root causes, and make necessary adjustments to ensure pipeline stability.
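A sketch of the error detection and logging described above, wrapping each pipeline step so failures are logged with context and re-raised for the scheduler to alert on; the alert hook is a hypothetical stand-in for whatever alerting tool the team uses:

```python
import logging

logger = logging.getLogger("etl")

def run_step(name, fn, *args, **kwargs):
    try:
        return fn(*args, **kwargs)
    except Exception:
        # Log the full traceback with the failing step's name for triage.
        logger.exception("ETL step %s failed", name)
        # send_alert(name)  # hypothetical hook into the team's alerting tool
        raise  # fail fast so the orchestrator marks the run as failed
```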
Performance Optimization:
- Tune ETL processes for performance improvements, minimizing resource consumption and ensuring fast data migration.
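In Snowflake, the usual tuning move for bulk migration is a set-based COPY INTO from a stage rather than row-by-row inserts; a hedged sketch with hypothetical stage, path, and table names:

```python
COPY_SQL = """
COPY INTO DW.STAGING.ORDERS_RAW
FROM @DW.STAGING.ETL_STAGE/orders/
FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
"""

def bulk_load(conn):
    # One COPY statement loads all staged files in parallel on the warehouse.
    conn.cursor().execute(COPY_SQL)
```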
Testing:
- Develop test cases for ETL processes and data pipelines to ensure they meet business requirements and are free of errors.
- Perform unit testing of ETL pipelines.
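A unit test of the kind called for above, exercising the hypothetical validate() helper from the earlier validation sketch in isolation (pytest style):

```python
import pandas as pd
from etl_checks import validate  # hypothetical module holding validate()

def test_validate_flags_negative_amounts():
    df = pd.DataFrame({
        "CUSTOMER_ID": [1],
        "ORDER_DATE": ["2024-01-01"],
        "AMOUNT": [-5.0],
    })
    assert "negative AMOUNT values" in validate(df)
```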
2. Creating and Maintaining Snowflake Tables
Table Design and Creation:
- Design, create, and optimize Snowflake tables to store transformed data, defining appropriate clustering keys and leveraging Snowflake's automatic micro-partitioning (Snowflake does not use traditional indexes).
- Work with business stakeholders to understand data requirements and create tables that meet the business needs (e.g., fact and dimension tables, data marts).
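Illustrative DDL for a fact table of the sort described above, issued through the Python connector; the database, schema, table, and column names are hypothetical, and conn is an open Snowflake connection:

```python
FACT_DDL = """
CREATE TABLE IF NOT EXISTS DW.MART.FACT_ORDERS (
    ORDER_ID     NUMBER       NOT NULL,
    CUSTOMER_KEY NUMBER       NOT NULL,
    ORDER_DATE   DATE         NOT NULL,
    AMOUNT       NUMBER(12,2) NOT NULL
)
CLUSTER BY (ORDER_DATE)
"""

def create_fact_table(conn):
    # CLUSTER BY defines a clustering key, Snowflake's substitute for indexes.
    conn.cursor().execute(FACT_DDL)
```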
Schema Management:
- Maintain and manage Snowflake schemas to ensure efficient querying and reporting.
- Monitor schema changes and manage data migrations across environments (development, staging, production).
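One small helper in the spirit of the schema monitoring above: snapshot a schema's DDL with Snowflake's GET_DDL function so changes can be diffed and promoted between environments (schema name hypothetical):

```python
def snapshot_schema(conn, schema: str = "DW.MART") -> str:
    # Returns a single DDL script recreating every object in the schema.
    return conn.cursor().execute(
        f"SELECT GET_DDL('SCHEMA', '{schema}')"
    ).fetchone()[0]
```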
Data Modeling:
- Develop and maintain logical and physical data models in Snowflake, ensuring they align with business requirements and reporting needs.
- Support dimensional data modeling and star/snowflake schema design to facilitate reporting and analysis.
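A star-schema fragment like the one referenced above: a dimension table that the earlier FACT_ORDERS sketch would join to via its surrogate key (all names hypothetical):

```python
DIM_DDL = """
CREATE TABLE IF NOT EXISTS DW.MART.DIM_CUSTOMER (
    CUSTOMER_KEY  NUMBER AUTOINCREMENT,  -- surrogate key
    CUSTOMER_ID   NUMBER NOT NULL,       -- natural key from the source system
    CUSTOMER_NAME VARCHAR,
    REGION        VARCHAR,
    PRIMARY KEY (CUSTOMER_KEY)           -- informational only in Snowflake
)
"""
```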
Data Governance and Security:
- Implement data governance standards and policies to ensure data integrity and security.
- Define roles and access controls within Snowflake to restrict access to sensitive data.
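A sketch of the Snowflake role-based access controls mentioned above: a read-only reporting role that can query the mart but receives no grants on staging, where raw sensitive data lands (role, database, and schema names hypothetical):

```python
RBAC_SQL = [
    "CREATE ROLE IF NOT EXISTS REPORTING_RO",
    "GRANT USAGE ON DATABASE DW TO ROLE REPORTING_RO",
    "GRANT USAGE ON SCHEMA DW.MART TO ROLE REPORTING_RO",
    "GRANT SELECT ON ALL TABLES IN SCHEMA DW.MART TO ROLE REPORTING_RO",
    # Deliberately no grants on DW.STAGING, which holds sensitive raw data.
]

def apply_rbac(conn):
    cur = conn.cursor()
    for stmt in RBAC_SQL:
        cur.execute(stmt)
```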
Data Quality and Consistency:
- Implement data quality checks within Snowflake to ensure that the data loaded into tables meets the necessary standards for reporting and analytics.
- Perform regular data audits to check for discrepancies or inconsistencies.
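One example of the regular audits described above: a reconciliation query that counts fact rows with no matching dimension row, reusing the hypothetical tables from the earlier sketches:

```python
AUDIT_SQL = """
SELECT COUNT(*) AS orphaned_rows
FROM DW.MART.FACT_ORDERS f
LEFT JOIN DW.MART.DIM_CUSTOMER d USING (CUSTOMER_KEY)
WHERE d.CUSTOMER_KEY IS NULL
"""

def audit_orphans(conn) -> int:
    # A non-zero count flags referential inconsistencies to investigate.
    return conn.cursor().execute(AUDIT_SQL).fetchone()[0]
```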
3. Collaboration / Documentation
Collaboration:
- Work closely with business stakeholders, business analysts, the data warehouse architect, and IT teams to gather requirements and provide technical solutions.
- Deliver training, support, and documentation to end users and technical teams to enable long-term maintenance and support of the solutions.
- Regularly interface with and provide status updates to other project resources including project manager, business analysts, data warehouse architect, and key stakeholders.
- Work diligently to ensure project goals are completed within the project timeline.
Documentation:
- Document ETL processes, Snowflake schema designs, and data pipelines for future reference and team knowledge sharing.
- Maintain clear documentation on data definitions, transformation rules, and data quality standards.
Required Experience:
- 5 years of experience with the Snowflake data warehouse.
- 3 years of experience with no/low code ETL tools/platforms (e.g., Matillion, Talend, Informatica).
- 3 years of experience performing ETL processes using code (e.g., Python, SQL).
- Strong experience with data integration, data quality, and data governance.
- Strong experience with relational databases, SQL, and data modeling.
- Strong experience working collaboratively with business analysts, developers, and non-technical end users related to data warehousing.
Preferred Experience:
- Excellent skills in communicating technical information to non-technical resources.
- Excellent troubleshooting skills.
- Excellent interpersonal, communication, and collaboration skills.
- Strong analytical and problem-solving skills.
- Detail-oriented, with the ability to switch between tasks and to self-direct and prioritize work.
- Experience researching and investigating problems to develop viable solutions, report findings, and provide clear and effective recommendations.
- Experience working in a project-oriented environment.
- Experience developing functional and technical specifications, design flow, and system blueprints.
- Experience with Microsoft Office 365.
- Ability to plan, organize and document IT solutions.
- Ability to plan, organize and coordinate IT work assignments with other technical resources.
- Ability to work independently and as part of a team.
Education and Certifications:
Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field. Master's degree preferred.
Salary: $80,000 - $110,000