What are the responsibilities and job description for the Java Spark Developer (Only W2 and USC) position at ProCorp Systems Inc.?
Job Details
Job Overview:
Seeking a Java Spark Developer with expertise in big data processing, Core Java, and Apache Spark, particularly within the finance domain. The candidate should have strong experience working with financial instruments, market risk, and large-scale distributed computing systems. This role involves developing and optimizing data pipelines for risk calculations, trade analytics, and regulatory reporting.
Key Responsibilities:
Develop and optimize scalable Java Spark-based data pipelines for processing and analyzing large-scale financial data.
Design and implement distributed computing solutions for risk modeling, pricing and regulatory compliance.
Ensure efficient data storage and retrieval using big data frameworks such as Hadoop.
Implement best practices for Spark performance tuning, including partitioning, caching, and memory management (see the sketch after this list).
Maintain high code quality through testing, CI/CD pipelines (Jenkins), and version control (Git).
Work on batch processing frameworks for market risk analytics.
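A minimal sketch of the kind of Java Spark batch pipeline described above, assuming a hypothetical trades.csv input with book, notional, and pnl columns; it illustrates repartitioning on the aggregation key and caching a reused dataset, two of the tuning practices mentioned in the list:

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.functions;
import org.apache.spark.storage.StorageLevel;

public class TradeRiskPipeline {
    public static void main(String[] args) {
        // Local session for illustration; a real job would be submitted to a cluster.
        SparkSession spark = SparkSession.builder()
                .appName("trade-risk-pipeline")
                .master("local[*]")
                .getOrCreate();

        // Hypothetical input: one row per trade with book, notional, and pnl columns.
        Dataset<Row> trades = spark.read()
                .option("header", "true")
                .option("inferSchema", "true")
                .csv("trades.csv");

        // Repartition on the aggregation key and cache, since the dataset is reused.
        Dataset<Row> byBook = trades
                .repartition(functions.col("book"))
                .persist(StorageLevel.MEMORY_AND_DISK());

        // Aggregate exposure and PnL per book -- a simplified stand-in for risk analytics.
        Dataset<Row> exposure = byBook.groupBy("book")
                .agg(functions.sum("notional").alias("total_notional"),
                     functions.sum("pnl").alias("total_pnl"));

        exposure.write().mode("overwrite").parquet("output/exposure_by_book");

        byBook.unpersist();
        spark.stop();
    }
}
```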
Qualifications and Skills:
7 years of experience in software development, with at least 3 years of experience in Java Spark and big data frameworks.
Strong proficiency in Python and Java Spark, with knowledge of core Spark concepts (RDDs, DataFrames, Spark Streaming, etc.); a short sketch follows this list.
Experience with financial markets, risk management, and financial instruments.
Familiarity with market risk concepts, including VaR, Greeks, scenario analysis, and stress testing.
Hands-on experience with Hadoop and Spark.
Proficiency with Git, Jenkins, and CI/CD pipelines.
Excellent problem-solving skills and a strong mathematical and analytical mindset.
Ability to work in a fast-paced financial environment.
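A minimal sketch of the core Spark concepts referenced above, assuming a small, hypothetical set of price observations; it contrasts the declarative DataFrame API with the lower-level RDD API in Java:

```java
import java.util.Arrays;
import java.util.List;

import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.RowFactory;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.types.DataTypes;
import org.apache.spark.sql.types.StructType;

public class CoreSparkConcepts {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("core-spark-concepts")
                .master("local[*]")
                .getOrCreate();

        // Hypothetical price observations: (symbol, price).
        StructType schema = DataTypes.createStructType(Arrays.asList(
                DataTypes.createStructField("symbol", DataTypes.StringType, false),
                DataTypes.createStructField("price", DataTypes.DoubleType, false)));

        List<Row> rows = Arrays.asList(
                RowFactory.create("AAPL", 189.5),
                RowFactory.create("AAPL", 190.1),
                RowFactory.create("MSFT", 410.2));

        // DataFrame API: declarative aggregations, optimized by Spark's planner.
        Dataset<Row> prices = spark.createDataFrame(rows, schema);
        prices.groupBy("symbol").avg("price").show();

        // RDD API: lower-level, functional transformations over the same data.
        JavaRDD<Double> priceRdd = prices.toJavaRDD().map(r -> r.getDouble(1));
        System.out.println("Max price: " + priceRdd.reduce((a, b) -> Math.max(a, b)));

        spark.stop();
    }
}
```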