What are the responsibilities and job description for the Lead .Net Developer with Master Data (Local to NJ and NY only) position at Marici Solutions?
Job Details
Lead .Net Developer with Master Data
Location: Hybrid, Midtown New York City (3 days a week onsite)
Long-Term Contract
The client has not been able to find the right resource and is still hiring on a priority basis; most interviews have failed because candidates lacked Market Data development experience or misrepresented it.
Note: Candidates must be LOCAL to the NEW YORK or NEW JERSEY area and COMMUTE into the office THREE TIMES A WEEK.
*** We need: a senior (12 years) .NET developer with experience developing and maintaining robust software solutions to support financial market data analysis and investment decision-making. The candidate must build and maintain market data applications and REST APIs using .NET and Python, and develop data ingestion pipelines to collect and integrate data from market data vendors. The candidate must also implement and integrate Azure Cloud services for efficient data handling and processing, and deploy applications and services to the Azure Cloud, ensuring scalability and reliability.
** Candidates must have long projects/good tenure, excellent communication skills, and a state-issued ID (not utility bills) showing they are local.
Candidates must have recent Market Data development experience.
Market Data Developer / .NET / Azure / Capital Markets: Summary
This role focuses on developing and maintaining robust software solutions to support financial market data analysis and investment decision-making. The ideal candidate will have a strong technical background, excellent problem-solving skills, and the ability to work independently and collaboratively within a dynamic team environment.
Design & Development:
- Build and maintain market data applications and REST APIs using .NET and Python (a minimal API sketch follows this list).
- Develop data ingestion pipelines to collect and integrate data from market data vendors.
- Develop and maintain complex T-SQL queries and stored procedures for data manipulation and reporting.
- Create scalable market data solutions to consolidate and analyze large datasets from market feeds and internal systems.
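For illustration only, here is a minimal sketch of the kind of market data REST API the first bullet describes, using ASP.NET Core minimal APIs. The PricePoint record, sample tickers, and /prices route are hypothetical, not the client's actual system; a .NET 8 web project with implicit usings is assumed.

```csharp
// Minimal ASP.NET Core market data API (assumes a .NET 8 "web" project,
// which supplies the ASP.NET Core and LINQ usings implicitly).
var builder = WebApplication.CreateBuilder(args);
var app = builder.Build();

// Hypothetical in-memory store standing in for a real market data repository.
var prices = new List<PricePoint>
{
    new("AAPL", new DateOnly(2024, 6, 28), 229.35m),
    new("MSFT", new DateOnly(2024, 6, 28), 441.12m),
};

// GET /prices/{ticker}: all stored observations for one instrument.
app.MapGet("/prices/{ticker}", (string ticker) =>
{
    var hits = prices
        .Where(p => p.Ticker.Equals(ticker, StringComparison.OrdinalIgnoreCase))
        .ToList();
    return hits.Count > 0 ? Results.Ok(hits) : Results.NotFound();
});

app.Run();

// One end-of-day price observation (hypothetical shape).
internal record PricePoint(string Ticker, DateOnly AsOf, decimal Close);
```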
Data Pipeline & ETL Processes:
- Implement efficient ETL processes to transform and load market data into databases or data warehouses.
- Automate data cleaning, transformation, and validation tasks to ensure high-quality data (a validation sketch follows this list).
- Utilize Azure Data Factory (ADF) and Databricks to create scalable data ingestion processes.
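As a concrete illustration of the validation step mentioned above, a small C# sketch that rejects malformed vendor rows before loading. The ticker,date,close layout and the sample rows are assumptions, not a real vendor format; a .NET 6+ console project with implicit usings is assumed.

```csharp
// Validation gate for a vendor feed: only well-formed rows are loaded.
using System.Globalization;

string[] feed = { "AAPL,2024-06-28,229.35", "BAD ROW", "MSFT,2024-06-28,-1" };

foreach (var line in feed)
{
    if (TryParseRow(line, out var row))
        Console.WriteLine($"load   {row.Ticker} {row.AsOf:yyyy-MM-dd} {row.Close}");
    else
        Console.WriteLine($"reject {line}"); // route to a dead-letter store in practice
}

static bool TryParseRow(string line, out (string Ticker, DateTime AsOf, decimal Close) row)
{
    row = default;
    var parts = line.Split(',');
    if (parts.Length != 3 || string.IsNullOrWhiteSpace(parts[0])) return false;
    if (!DateTime.TryParseExact(parts[1], "yyyy-MM-dd", CultureInfo.InvariantCulture,
                                DateTimeStyles.None, out var asOf)) return false;
    if (!decimal.TryParse(parts[2], NumberStyles.Number, CultureInfo.InvariantCulture,
                          out var close) || close <= 0m) return false; // reject non-positive prices
    row = (parts[0].Trim().ToUpperInvariant(), asOf, close);
    return true;
}
```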
Data Integration & Storage:
- Integrate disparate data sources into a unified system.
- Design data storage solutions (relational or NoSQL databases) to support real-time and historical analysis (an upsert sketch follows this list).
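One way such a relational storage layer might be written to, as a hedged sketch: an idempotent upsert of a daily observation into SQL Server via the Microsoft.Data.SqlClient package. The dbo.DailyPrice table, its columns, and the connection string are placeholders, not the client's schema.

```csharp
// Idempotent upsert of one daily price into SQL Server
// (Microsoft.Data.SqlClient NuGet package; schema below is a placeholder).
using Microsoft.Data.SqlClient;

const string connStr = "Server=<server>;Database=MarketData;Integrated Security=true;"; // placeholder

const string upsert = @"
MERGE dbo.DailyPrice AS t
USING (VALUES (@Ticker, @AsOf, @Close)) AS s (Ticker, AsOf, [Close])
ON t.Ticker = s.Ticker AND t.AsOf = s.AsOf
WHEN MATCHED THEN UPDATE SET [Close] = s.[Close]
WHEN NOT MATCHED THEN INSERT (Ticker, AsOf, [Close]) VALUES (s.Ticker, s.AsOf, s.[Close]);";

using var conn = new SqlConnection(connStr);
await conn.OpenAsync();

using var cmd = new SqlCommand(upsert, conn);
cmd.Parameters.AddWithValue("@Ticker", "AAPL");
cmd.Parameters.AddWithValue("@AsOf", new DateTime(2024, 6, 28));
cmd.Parameters.AddWithValue("@Close", 229.35m);
await cmd.ExecuteNonQueryAsync();
```

Running the same row twice leaves one record, which is why an upsert (rather than a plain insert) suits re-deliverable vendor feeds.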
Monitoring & Support:
- Develop and maintain monitoring tools to track data feed quality and system health (a feed-freshness sketch follows this list).
- Provide production support and resolve issues as they arise, ensuring minimal disruption to trading activities.
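A hedged sketch of the kind of feed-health check such a monitoring tool might run: alert when a vendor's last observation is older than a threshold. The vendor names, timestamps, and 15-minute SLA are assumptions for illustration.

```csharp
// Feed-freshness check: alert when a vendor's last tick is older than a
// threshold. Assumes a .NET 6+ console project with implicit usings.
var lastTick = new Dictionary<string, DateTime> // vendor -> last observation (UTC), sample data
{
    ["BBG"]    = DateTime.UtcNow.AddMinutes(-2),
    ["Markit"] = DateTime.UtcNow.AddMinutes(-45),
};

var maxAge = TimeSpan.FromMinutes(15); // assumed staleness SLA
foreach (var (vendor, seenAt) in lastTick)
{
    var age = DateTime.UtcNow - seenAt;
    Console.WriteLine(age > maxAge
        ? $"ALERT: {vendor} feed stale ({age.TotalMinutes:F0} min old)"
        : $"OK: {vendor} feed healthy ({age.TotalMinutes:F0} min old)");
}
```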
Cloud Experience:
- Implement and integrate Azure Cloud services for efficient data handling and processing.
- Deploy applications and services to the Azure Cloud, ensuring scalability and reliability (a Blob Storage sketch follows this list).
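As one small example of the Azure integration this section describes, a hedged sketch that pushes a processed end-of-day file to Azure Blob Storage with the Azure.Storage.Blobs SDK. The connection string, container name, and blob path are placeholders.

```csharp
// Push a processed end-of-day file to Azure Blob Storage
// (Azure.Storage.Blobs NuGet package; names below are placeholders).
using Azure.Storage.Blobs;

var service   = new BlobServiceClient("<storage-connection-string>");
var container = service.GetBlobContainerClient("market-data");
await container.CreateIfNotExistsAsync();

var blob = container.GetBlobClient($"eod/{DateTime.UtcNow:yyyy-MM-dd}/prices.csv");
await blob.UploadAsync("prices.csv", overwrite: true); // local file -> blob
Console.WriteLine($"Uploaded to {blob.Uri}");
```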
Required Skills:
- Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent professional experience).
- 6 years of practical work experience involving system architecture, software applications, and message/data transfer and processing.
- Strong experience in .NET Core with hands-on development of scalable web applications using C#.
- Experience ingesting data from multiple REST APIs and building and maintaining libraries to scale API development (a client wrapper sketch follows this list).
- Proficiency with SQL for writing efficient queries, interacting with databases, and optimizing performance.
- Experience working with RESTful APIs and integrating systems with APIs.
- Familiarity with cloud platforms like Azure or AWS and cloud-based deployment and scaling practices.
- Exposure to CI/CD tools such as Azure DevOps.
- Experience implementing, maintaining, and updating CI/CD pipelines in a cloud environment.
- Experience with data feeds and data modeling from BBG, Intex, S&P, Markit, Greenstreet, MSCI, and other similar market data vendors.
- Excel expertise; data lakes and warehousing; SQL Server; Databricks.
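To illustrate the REST ingestion and reusable-library skills above, a hedged C# sketch of a thin typed client over HttpClient. The /v1/quotes route and the Quote payload shape are hypothetical, not any real vendor's API.

```csharp
// Thin, reusable vendor API client over HttpClient. The route and JSON
// shape are hypothetical; GetFromJsonAsync is from System.Net.Http.Json.
using System;
using System.Collections.Generic;
using System.Net.Http;
using System.Net.Http.Json;
using System.Threading.Tasks;

public sealed class VendorClient
{
    private readonly HttpClient _http;
    public VendorClient(HttpClient http) => _http = http;

    // Fetch all quotes for one ticker from a (hypothetical) vendor endpoint.
    public Task<List<Quote>?> GetQuotesAsync(string ticker) =>
        _http.GetFromJsonAsync<List<Quote>>(
            $"/v1/quotes?ticker={Uri.EscapeDataString(ticker)}");
}

// Hypothetical vendor payload shape.
public sealed record Quote(string Ticker, DateTime AsOf, decimal Price);
```

In practice a wrapper like this would typically be registered through IHttpClientFactory so the vendor's base address and credentials are configured once and connection handlers are pooled.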