What are the responsibilities and job description for the AWS Python Developer position at Conch Technologies, Inc?
Hi,
Greetings from Conch Technologies Inc
Job Title: Python Developer (Spark AWS)
Location: Plano, TX / Wilmington, DE / Columbus, OH – ONSITE
Duration: 12-Month Contract or Direct Full-Time
Job Responsibilities
Develop and maintain data platforms using Python, Spark, and PySpark.
Migrate existing workloads to PySpark on AWS.
Design and implement data pipelines.
Work with AWS services and big data technologies.
Produce unit tests for Spark transformations and helper methods (see the PySpark sketch after this list).
Create Scala/Spark jobs for data transformation and aggregation.
Write Scaladoc-style documentation for code.
Optimize Spark queries for performance.
Integrate with SQL databases (e.g., Microsoft SQL Server, Oracle, Postgres, MySQL).
Understand distributed systems concepts (CAP theorem, partitioning, replication, consistency, and consensus).
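To make the transformation and unit-testing responsibilities above more concrete, here is a minimal PySpark sketch. The function name, column names, and test data are illustrative assumptions, not part of the role description; it assumes pyspark and a test runner such as pytest are available.

```python
# Minimal, illustrative sketch only: aggregate_daily_totals and the
# "order_date"/"amount" columns are hypothetical, not from the posting.
from pyspark.sql import DataFrame, SparkSession
from pyspark.sql import functions as F


def aggregate_daily_totals(orders: DataFrame) -> DataFrame:
    """Group orders by date and sum the amount column."""
    return (
        orders
        .groupBy("order_date")
        .agg(F.sum("amount").alias("total_amount"))
    )


def test_aggregate_daily_totals():
    # A local SparkSession keeps the unit test self-contained.
    spark = (
        SparkSession.builder.master("local[1]").appName("test").getOrCreate()
    )
    try:
        orders = spark.createDataFrame(
            [("2024-01-01", 10.0), ("2024-01-01", 5.0), ("2024-01-02", 7.5)],
            ["order_date", "amount"],
        )
        result = {
            row["order_date"]: row["total_amount"]
            for row in aggregate_daily_totals(orders).collect()
        }
        assert result == {"2024-01-01": 15.0, "2024-01-02": 7.5}
    finally:
        spark.stop()
```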
Skills
Proficiency in Python, Scala (with a focus on functional programming), and Spark.
Familiarity with Spark APIs, including RDD, DataFrame, MLlib, GraphX, and Streaming.
Experience working with HDFS, S3, Cassandra, and/or DynamoDB.
Deep understanding of distributed systems.
Experience with building or maintaining cloud-native applications.
Familiarity with serverless approaches using AWS Lambda is a plus (see the sketch below).
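As a concrete illustration of the serverless item above, the sketch below shows a minimal Python Lambda handler that reads object keys from a standard S3 event. The handler name and any downstream processing are assumptions, not requirements of the role.

```python
# Illustrative sketch only: a minimal AWS Lambda handler that extracts
# object keys from a standard S3 event payload.
import json


def lambda_handler(event, context):
    # S3 put events carry one or more records with bucket and object keys.
    keys = [
        record["s3"]["object"]["key"]
        for record in event.get("Records", [])
    ]
    # Downstream, keys like these might be handed to a PySpark job or
    # another pipeline stage; here we simply echo them back.
    return {
        "statusCode": 200,
        "body": json.dumps({"processed_keys": keys}),
    }
```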
Thanks and Regards,
Chanikya [IT Recruiter]
Direct: 214-247-7117
chanakya@conchtech.com
linkedin.com/in/bhadchan