What are the responsibilities and job description for the Data Scientist/Big Data Engineer position at Radiant Digital?
About Radiant
Radiant is committed to solving complex challenges for our customers by delivering innovative technology solutions. Our client-centric model increases transparency and accountability, which promotes efficiency and effectiveness. We work and grow together with our customers and help our clients achieve success. Our flexible delivery model allows us to provide end-to-end solution delivery, single-project execution, and/or strategic resources. Radiant is CMMI Maturity Level III and ISO 9001:2015 certified.
Required Skills
- 4 years of proven experience in data conversion and report building using Snowflake and SAS Viya.
- 4 years of demonstrated experience building data transformation processes using ETL, SQL, Python, and R (see the sketch after this list).
- 4 years of experience working with data analytics and business intelligence tools.
- 4 years of experience working in a Scrum or Agile development environment.
- 4 years of proficiency with ETL processes and tools such as AWS Glue and Lambda.
- 4 years of strong working knowledge of database management and data warehousing concepts.
- 4 years of expertise in SQL for data querying and manipulation.
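For context on the kind of work these requirements describe, below is a minimal, hypothetical sketch of a data transformation and load into Snowflake using pandas and the Snowflake Python connector. The connection parameters, table, and column names are placeholder assumptions, not details from this posting.

```python
# A minimal, hypothetical sketch (not Radiant's actual pipeline): transform a
# legacy extract with pandas, then load it into a Snowflake staging table via
# the Snowflake Python connector. All names and credentials are placeholders.
import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas


def transform(raw: pd.DataFrame) -> pd.DataFrame:
    """Standardize column names and types and drop rows missing key fields."""
    df = raw.rename(columns=str.upper)
    df["REPORT_DATE"] = pd.to_datetime(df["REPORT_DATE"], errors="coerce")
    df["AMOUNT"] = pd.to_numeric(df["AMOUNT"], errors="coerce").fillna(0.0)
    return df.dropna(subset=["CASE_ID", "REPORT_DATE"])


def load_to_snowflake(df: pd.DataFrame) -> None:
    """Write the transformed frame to a placeholder Snowflake staging table."""
    conn = snowflake.connector.connect(
        account="my_account",        # placeholder account
        user="etl_user",             # placeholder user
        password="***",              # use a secrets manager in practice
        warehouse="REPORTING_WH",    # placeholder warehouse
        database="MODERNIZATION",    # placeholder database
        schema="STAGING",            # placeholder schema
    )
    try:
        write_pandas(conn, df, table_name="CASE_REPORT_STAGING",
                     auto_create_table=True)
    finally:
        conn.close()


if __name__ == "__main__":
    raw = pd.read_csv("legacy_extract.csv")  # placeholder legacy source file
    load_to_snowflake(transform(raw))
```

Landing transformed data in a staging schema before building reports is a common pattern with Snowflake, but the actual design would depend on the project's migration approach.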
Responsibilities
- Data Conversion and Reporting Support: Support system modernization efforts through data conversion and report development.
- Data Transformation and Integration: Prepare and optimize data for migration to Snowflake and SAS Viya platforms, ensuring seamless integration and functionality by creating data transformation processes using ETL, SQL, Python, and R.
- Develop Federal and State Reports: Build comprehensive reports that meet federal and state requirements using Snowflake and SAS Viya, ensuring accuracy and compliance.
- Scrum Team Collaboration: Work as a member of an Agile Scrum team to deliver new features and functions, providing best-in-class, value-based technology solutions.
- Data Quality Management: Develop and implement databases, ETL processes, data collection systems, and data quality strategies that optimize statistical efficiency, accuracy, and quality.
- Problem Examination and Resolution: Examine problems within the Data Intelligence space using ETL processes, AWS Lambda, and AWS Glue, and implement the changes needed to improve data quality (see the sketch after this list).
- Data Analytics and Insights: Utilize advanced data analytics techniques to support strategic decision-making, ensuring data integrity, quality, and timeliness of results.
- The above job description and requirements are general in nature and are subject to change based on the specific needs of the organization and the project.
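For the AWS Glue and Lambda workflow referenced in the responsibilities above, the following is a minimal, hypothetical sketch of a Lambda handler that starts a Glue ETL job when a new extract lands in S3. The Glue job name, bucket layout, and argument keys are assumptions for illustration only.

```python
# A hedged sketch of the AWS Glue/Lambda pattern referenced above: an AWS Lambda
# handler (Python) that starts a Glue ETL job when a new extract lands in S3.
# The Glue job name and argument keys are assumptions for illustration only.
import boto3

glue = boto3.client("glue")


def lambda_handler(event, context):
    # Standard S3 put-event payload: pull the bucket and object key.
    record = event["Records"][0]["s3"]
    bucket = record["bucket"]["name"]
    key = record["object"]["key"]

    # Start a Glue job run (job name is a placeholder) to transform the file.
    response = glue.start_job_run(
        JobName="legacy_to_snowflake_transform",          # placeholder job name
        Arguments={"--source_path": f"s3://{bucket}/{key}"},
    )
    return {"job_run_id": response["JobRunId"], "source_object": key}
```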