What are the responsibilities and job description for the Database Analyst - Data Warehouse / ETL position at Tech Army?
Job Details:
Position: Database Analyst - Data Warehouse / ETL
Location: Tallahassee, FL, 32301 (Onsite)
Duration: 3-month contract with possibility of extension
Scope of Services:
Interview: Onsite / Virtual
Job Type: Onsite
Scope of Work
Required consultant experience, provided by the Contractor, shall include:
- 5 years of experience with Snowflake data warehouse.
- 3 years of experience with no/low code ETL tools/platforms (e.g., Matillion, Talend, Informatica).
- 3 years of experience performing ETL processes using code (e.g., Python, SQL).
- Strong experience with data integration, data quality, and data governance.
- Strong experience with relational databases, SQL, and data modeling.
- Strong experience working collaboratively with business analysts, developers, and non-technical end users related to data warehousing.
Preferred Experience:
- Excellent skills in communicating technical information to non-technical audiences.
- Excellent troubleshooting skills.
- Excellent interpersonal, communication, and collaboration skills.
- Strong analytical and problem-solving skills.
- Detail-oriented, with the ability to switch between tasks, self-direct, and prioritize work.
- Experience researching and investigating problems to develop viable solutions, report findings, and provide clear and effective recommendations.
- Experience working in a project-oriented environment.
- Experience developing functional and technical specifications, design flow, and system blueprints.
- Experience with Microsoft Office 365.
- Ability to plan, organize and document IT solutions.
- Ability to plan, organize and coordinate IT work assignments with other technical resources.
- Ability to work independently and as part of a team.
Required Duties and Responsibilities of Consultant shall include but are not limited to:
The Property Tax Oversight (PTO) program within the Florida Department seeks a Database Analyst to provide consulting services in support of the creation of a Data Warehouse for the program, including creating tables, migrating data, configuring ETL pipelines, and creating backup loads.
1. Writing ETL Pipelines
* ETL Tool Selection:
o Provide alternatives and advantages/disadvantages to aid in selection of a no/low code ETL platform/tool.
* Design and Develop ETL Pipelines:
o Create, optimize, and maintain ETL pipelines to migrate data from various sources (Oracle, MSSQL, flat files) to the target Snowflake data warehouse.
o Utilize a combination of coding (Python, SQL, etc.) and no/low-code tools (e.g., Azure Data Factory, Talend, Informatica) to develop ETL processes that meet data integration needs.
* Data Transformation and Validation:
o Apply transformations to cleanse, filter, join, and aggregate data from the source systems.
o Ensure data quality by implementing validation rules to verify data correctness and completeness before loading into Snowflake.
* Source-to-Target Mapping:
o Create detailed source-to-target mappings to ensure accurate transformation from source systems (Oracle, MSSQL) into Snowflake.
o Ensure solution provides data quality, consistency, and integrity throughout the ETL process.
* Automation and Scheduling:
o Work with other resources to automate the ETL processes to run at scheduled intervals based on specific needs.
o Monitor and optimize ETL jobs to ensure they run efficiently and on time.
* Error Handling and Troubleshooting:
o Implement error detection, logging, and alerting mechanisms within the ETL pipelines.
o Troubleshoot ETL failures, identify root causes, and make necessary adjustments to ensure pipeline stability.
* Performance Optimization:
o Tune ETL processes for performance improvements, minimizing resource consumption and ensuring fast data migration.
* Testing:
o Develop test cases for ETL processes and data pipelines to ensure they meet business requirements and are free of errors.
o Perform unit testing of ETL pipelines.
2. Creating and Maintaining Snowflake Tables
* Table Design and Creation:
o Design, create, and optimize Snowflake tables to store transformed data, ensuring appropriate clustering keys and micro-partitioning strategies.
o Work with business stakeholders to understand data requirements and create tables that meet the business needs (e.g., fact and dimension tables, data marts).
* Schema Management:
o Maintain and manage Snowflake schema to ensure efficient querying and reporting.
o Monitor schema changes and manage data migrations across environments (development, staging, production).
* Data Modeling:
o Develop and maintain logical and physical data models in Snowflake, ensuring they align with business requirements and reporting needs.
o Support dimensional data modeling and star/snowflake schema design to facilitate reporting and analysis.
* Data Governance and Security:
o Implement data governance standards and policies to ensure data integrity and security.
o Define roles and access controls within Snowflake to restrict access to sensitive data.
* Data Quality and Consistency:
o Implement data quality checks within Snowflake to ensure that the data loaded into tables meets the necessary standards for reporting and analytics.
o Perform regular data audits to check for discrepancies or inconsistencies.
3. Collaboration / Documentation
* Collaboration:
o Work closely with business stakeholders, business analysts, data architect, and IT teams to gather requirements and provide technical solutions.
o Provide training, support, and documentation to end users and technical teams to enable long-term maintenance and support of the solutions.
o Regularly interface with and provide status updates to other project resources including project manager, business analysts, data warehouse architect, and key stakeholders.
o Work diligently to ensure project goals are completed within the project timeline.
* Documentation:
o Document ETL processes, Snowflake schema designs, and data pipelines for future reference and team knowledge sharing.
o Maintain clear documentation on data definitions, transformation rules, and data quality standards.
Education/Certifications:
Bachelor's degree in Computer Science, Information Technology, Data Science, or related field.
Master's degree preferred.
Job Type: Contract
Pay: $50.28 - $53.35 per hour
Expected hours: 40 per week
Schedule:
- 8-hour shift
Application Question(s):
- 5 years of experience with Snowflake data warehouse.
- 3 years of experience with no/low code ETL tools/platforms (e.g., Matillion, Talend, Informatica).
- 3 years of experience performing ETL processes using code (e.g., Python, SQL).
- Strong experience with data integration, data quality, and data governance.
- Strong experience with relational databases, SQL, and data modeling.
- Strong experience working collaboratively with business analysts, developers, and non-technical end users related to data warehousing.
- Local Candidates Only
Ability to Commute:
- Tallahassee, FL 32301 (Required)
Ability to Relocate:
- Tallahassee, FL 32301: Relocate before starting work (Required)
Work Location: In person