What are the responsibilities and job description for the CPPS (Capacity Planning) - Remote position at Lorven Technologies Inc.?
Position: CPPS (Capacity Planning)
Location: Denver, CO & Atlanta, GA
Duration: Contract
Job Description
Core tools across the board
- Strong Excel skills
- Word
- PowerPoint
- Windows PowerShell
- Forecasting for zVM using R and Python scripts
- Knowledge of navigation within zView monitoring tool
- Coding Data extracts (Capacity Reports) within zView
- CMS navigation
- Disaster recovery validation for Delta DL2 environment
- z/OS batch jobs to process zVM data into SAS databases
- Ad Hoc – zView
- Daily Performance reports
- Monthly Performance and Capacity Plan Forecast
- Knowledge of TSO, JCL, SDSF, CA-View, & SAS/MXG code to process SMF data for problem analysis and system/application changes
- Omegamon and RMF reporting and monitoring tools
- Monitoring performance of DASD subsystem
- Documentation for process to eliminate looping DB2 threads
- IDMS (Carmaster, Roommaster) online database monitoring and reporting
- z/OS Insights processing of TPF data to Linux environment
- WLM Service Policy maintenance and updates
- Thruput Manager coding of system control thresholds
- Application Performance Monitor queries
- SAS, MXG, and Cheryl Watson contract documentation
- CA-ESP coding to extract batch schedule data
- Code to interrogate TPF Transaction logging data – JCL, SAS
- Ad Hoc – JCL, SAS, MXG, TSO, SDSF, Omegamon, RMF Portal
- CMS Navigation on zVM
- Structured Data Driver (SDD) throttle daily reporting via Splunk monitoring tool
- Maintenance of SDD throttle environment for shopping transaction volume changes
- QlikView and SAS Portal required to determine throttle values
- DBCR and Change Mgmt knowledge to document SDD changes – Toolchain, ServiceNow
- Coordinating iStream and LPAR weight changes – ServiceNow change tickets
- Coordinating Efficiency Level Testing – ServiceNow change tickets, Splunk, Excel, Lonnie Sheets
- Worldspan CECB throttle mechanism updating
- OIO Report and Verification – Excel, z/OS, zVM, SCRT
- Community MIPS Forecasting – Excel, z/OS, zVM, Python, Cognos, SYSCO Collection data, SAS Portal (a minimal Python forecasting sketch appears after this list)
- DASD Performance/End of Life – SAS, z/OS
- Delta Performance – SYSCO, IBM Data Collector, Software Profiler, Excel, SAS, JCL
- Delta Monthly Deliverables – Excel, Splunk, ‘Lonnie’ Sheets (real-time monitoring), PowerPoint
- CEC Community MIPS Configurations (LPAR Weights) – Excel
- Impact Analysis – SAS, z/OS, zVM, SYSCO, IBM Data Collector, Software Profiler, Toolchain, Transaction logging on 1P, Transaction logging in 1G/1V
- MIPS/IO Savings – SAS, z/OS, zVM, SYSCO, IBM Data Collector, Software Profiler
- Memory/Working Storage – Splunk, Excel
- Delta Component Analysis – Excel, SYSCO, z/OS
- VFA Analysis – Excel, IBM Data Collector, SYSCO, z/OS, zVM
- Daily Analysis/Monitoring – Splunk, Lonnie Pages, Excel
- Daily Delta Workbook – Excel, Splunk
- CEC Upgrade Performance Analysis – Excel, IBM Data Collector
- Monthly Collection Data sent to Delta – Excel, SAS, z/OS, zVM, SYSCO, IBM Data Collector, Software Profiler
- IBM CPU Measure Collection – Toolchain, zVM
- Ad Hoc – Excel, SAS, z/OS, zVM, SYSCO, IBM Data Collector, Software Profiler, Splunk, Word, SAS Portal, Lonnie Sheets, Toolchain, Transaction logging
- System Performance monitoring and Monthly Capacity Forecasting – LX1 LPAR
- Toolchain reporting via zView
- Disaster Recovery verification of LX1 LPAR
- MongoDB monitoring via zView and Ops Manager
- GitHub command language to access MAU – metric access utility
- Monthly peak TPS shop forecast
- Open Systems (Windows & Linux servers) capacity planning: NGGF (On-Prem & Cloud), NGA, DCC, LTOTE, Splash
- OpenShift (On-Prem & Cloud) capacity planning: TripServices, NDC, Akana, Hotel, uAPI, IDMS, TripChange
- Coordinate all other systems: Capacity Planning Checklist (Bi-Weekly)
- Tools: QlikView, Grafana, AppDynamics, Splunk, Excel, SAS Portal, Power BI
- Assist in defining documentation for the Shopping Capacity Request form – ServiceNow
- Review customer-submitted Capacity Requests for approval – QlikView, SAS Portal, ServiceNow
- Collect Shopping transaction data for the verification process – QlikView, SAS Portal
- Customer Tracking – QlikView
- Customer Behavioral Changes – QlikView
- Shopping Performance Changes – QlikView
- Top Customers Reports – QlikView
- Bi-Weekly Capacity Planning by Customer – QlikView
- Look to Book Report – QlikView
- Customers Database – QlikView
- Customers by region report – QlikView, Power BI
- Peak Analysis Report – QlikView, Power BI
- Air Shopping rates Forecasting input by Customer – QlikView
- Anomaly Report – QlikView
- Ad Hoc – QlikView, Power BI
- Proficient in Python, SQL, distributed systems, and modern Kubernetes tooling
- Creating system architecture and implementing solutions to complex problem spaces.
- Developing Python data pipelines in Apache Airflow and maintaining a robust monorepository structure to support future development (a minimal DAG sketch appears after this list).
- Implementing & maintaining Kubernetes infrastructure on OpenShift and Azure to support NDC objectives.
- Working with the data team to ensure that data needs are met and data sources are accessible. This task requires proficiency in modern big data tools such as HDFS, Spark, and Hive (a brief PySpark sketch appears after this list).
- Maintaining & monitoring any created data pipelines (with proper use of Airflow)
- Planning future projects for the team with senior leadership and being able to create solutions that can easily adapt to end-user changes.
- Able to defend ideas and designs during stand-up meetings and architecture proposals.
- Ensuring and continuously improving development standards so that new projects are quick to start and require as few dependencies as possible.
- Organize thoughts into a comprehensive solution that addresses all stages of the software development lifecycle:
- Data Collection/Extraction
- Data Transformation and Manipulation
- Data Visualization
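The Community MIPS Forecasting and zVM forecasting items above pair Excel with Python scripts. As a hedged illustration only, a minimal sketch of a monthly trend forecast could look like the following; the CSV layout, the column names (month, mips), and the three-month horizon are assumptions made for the example, not details from the posting.

```python
# Minimal sketch of a monthly MIPS trend forecast (illustrative only).
# Assumes a CSV export with columns "month" (YYYY-MM) and "mips";
# the file name and forecast horizon are hypothetical.
import numpy as np
import pandas as pd


def forecast_mips(csv_path: str, horizon_months: int = 3) -> pd.Series:
    """Fit a straight-line trend to historical monthly MIPS and project it forward."""
    df = pd.read_csv(csv_path, parse_dates=["month"]).sort_values("month")

    # Fit a simple linear trend over the observation index.
    x = np.arange(len(df))
    slope, intercept = np.polyfit(x, df["mips"].to_numpy(), deg=1)

    # Project the trend forward for the requested number of months.
    future_x = np.arange(len(df), len(df) + horizon_months)
    future_months = pd.date_range(
        df["month"].iloc[-1] + pd.offsets.MonthBegin(1),
        periods=horizon_months,
        freq="MS",
    )
    return pd.Series(intercept + slope * future_x, index=future_months, name="mips_forecast")


if __name__ == "__main__":
    print(forecast_mips("community_mips.csv"))  # hypothetical extract file
```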
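For the Apache Airflow bullet, a minimal sketch of how a two-step Python pipeline might be wired up as a DAG is shown below; the DAG id, schedule, and task bodies are hypothetical placeholders, not the actual pipelines referenced in the posting.

```python
# Minimal Apache Airflow DAG sketch (illustrative only).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull raw capacity data from its source system.
    return {"rows": 0}


def transform():
    # Placeholder: reshape the extracted data for reporting.
    pass


with DAG(
    dag_id="capacity_reporting_sketch",  # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)

    # Run the transform step only after extraction succeeds.
    extract_task >> transform_task
```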
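For the HDFS/Spark/Hive bullet, a brief PySpark sketch follows; the Hive database, table, and column names are assumptions invented for illustration and are not taken from the posting.

```python
# Minimal PySpark sketch for reading and aggregating a Hive table (illustrative only).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("capacity-data-sketch")  # hypothetical app name
    .enableHiveSupport()              # allow reading managed Hive tables
    .getOrCreate()
)

# Aggregate daily transaction counts per application from a hypothetical Hive table.
daily_volume = (
    spark.table("capacity.transaction_volume")                      # assumed table
    .groupBy("app_name", F.to_date("event_ts").alias("event_date"))
    .agg(F.sum("tx_count").alias("daily_tx"))
)

daily_volume.write.mode("overwrite").parquet("/data/capacity/daily_tx")  # assumed output path
```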