What are the responsibilities and job description for the Denodo Application Admin position at KeyLogic (previously IIA)?
KeyLogic is seeking a DevOps Engineer to support a major national laboratory. Our team of software engineers has a deep understanding of corporate business domains. Combining this knowledge with our comprehensive data engineering capabilities allows us to extract data from a broad range of sources and deliver trusted datasets for interactive reports, visualizations, and analytical models that support data-driven insights and decisions.
Responsibilities:
Design, build, and maintain a stable, efficient infrastructure to optimize service delivery across production, QA, and development environments throughout the development lifecycle. Monitor, troubleshoot, maintain, and continuously improve the build, packaging, and deployment processes. Implement automated infrastructure capabilities such as backups, security tooling, and monitoring. Apply a consistent DevOps approach to improve all phases of the process and ensure end-to-end quality across functions. The role requires knowledge of deployment and configuration management tools such as Jenkins, Maven, Puppet, or Ansible; version control tools such as Git, Bitbucket, SVN, or CVS; experience with network infrastructure, databases, cloud and data center operations, and security protocols; strong knowledge of Linux and/or Windows operating systems; and programming and scripting experience in languages such as PHP, Python, Perl, Bash, Java, SQL, or C. On any given day, you may be called upon to:
- Set up and configure Denodo servers, data sources, security settings, and system parameters.
- Collaborate and partner with database administrators, data engineers, and vendors to monitor Denodo system performance and implement necessary adjustments.
- Proactively assess system needs and plan for future hardware and software upgrades.
- Serve as a Denodo data virtualization/integration engineer:
- Connect to various data sources and create integrated data views using Denodo.
- Manage user accounts, roles, and permissions to control data access levels.
- Optimize performance of virtual data views through caching and query optimization.
- Serve as a Denodo data delivery expert to support enterprise reporting and analytics.
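The automation responsibilities above (backups, monitoring, continuous improvement of deployment processes) are the kind of work typically scripted in Python or Bash. As a minimal, hypothetical sketch, here is a backup-rotation helper; the file pattern and retention count are illustrative assumptions, not anything specified by the role:

```python
from pathlib import Path

def rotate_backups(backup_dir: str, retention: int = 7) -> list[str]:
    """Delete all but the `retention` newest backup archives in backup_dir.

    Returns the names of the files that were removed, newest-first.
    """
    # Sort existing archives by modification time, newest first.
    backups = sorted(
        Path(backup_dir).glob("*.tar.gz"),
        key=lambda p: p.stat().st_mtime,
        reverse=True,
    )
    removed = []
    # Everything past the retention window gets deleted.
    for old in backups[retention:]:
        old.unlink()
        removed.append(old.name)
    return removed
```

A script like this would normally run from cron or a CI job after each nightly backup completes.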
Qualifications:
- Candidate must have a BS/BA in Computer Science or a related field and a minimum of seven years of relevant experience.
- Must be a U.S. citizen.
- Must be able to obtain and maintain a U.S. Department of Energy (DOE) clearance.
- Candidate should reside in the Albuquerque area, as onsite work is required. A remote candidate with a minimum of three years of Denodo experience would be considered but would require onsite travel as needed.
Required Skills:
- Linux administration: Working knowledge of the Linux operating system, including command-line administration and shell scripting.
- Installation and configuration: Setting up and configuring Denodo servers, including data sources, security settings, and system parameters.
- Scripting abilities: Proficiency in scripting languages like Python or Bash for automation tasks.
- User administration: Managing user accounts, roles, and permissions to control data access levels.
- Performance monitoring and tuning: Monitoring Denodo system performance, identifying bottlenecks, and implementing necessary adjustments.
- Backup and recovery: Implementing data backup strategies and ensuring the ability to restore data in case of emergencies.
- Troubleshooting and support: Diagnosing and resolving issues related to data access, connectivity, and performance.
- Database knowledge: Familiarity with various database technologies like SQL Server, Oracle, MySQL, and data modeling concepts.
- SQL Skills: The ability to understand complex SQL queries and write/tune efficient SQL code.
- Data source management: Connecting to various data sources (relational databases, flat files, APIs) and defining data access rules.
- Data integration skills: Experience with data extraction, transformation, and loading (ETL) processes.
- View creation and optimization: Designing and developing virtual data views to provide seamless data access for users, optimizing performance through caching and query optimization techniques.
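The SQL tuning skill above (understanding query plans and writing efficient SQL) can be illustrated with a small sketch. This example uses SQLite purely for demonstration (Denodo is not involved), and the table and column names are hypothetical; it shows how adding a covering index changes a filter query from a full table scan to an index lookup:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(1000)],
)

query = "SELECT SUM(total) FROM orders WHERE customer_id = ?"

# Without an index, filtering on customer_id scans the whole table.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall()

# A covering index on (customer_id, total) lets the engine answer the
# query from the index alone, without touching the base table.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id, total)")
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall()
```

Inspecting `plan_before` and `plan_after` (the last element of each row is the plan text) shows the scan replaced by an index search; the same plan-reading habit applies to tuning views in a virtualization layer.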
Salary: $119,700 - $129,700