
Remote Sr. Python Developer

Job Description

• Understand distributed computing environments such as Hadoop, and have experience leveraging distributed computing in Python applications to handle large data volumes with good performance

• Experience with DevOps and CI/CD (Continuous Integration / Continuous Deployment) techniques

• Experience in Development and Automation of Data Pipelines using Python

• Working experience with the Anaconda distribution, Dask, Pandas, NumPy, Docker and Kubernetes, and Django

• Knowledge of or experience with Apache Spark or AWS Glue

• Experience with AWS / S3 / Redshift is a plus

• Develop/enhance Data Pipelines, ETLs, Web Services, Interfaces for sourcing and transforming structured and unstructured data

• Implement Data Parsing, Standardization and Data Quality rules

• Implement performance enhancements in Python applications/data pipelines to handle large data volume

• Automate and Deploy Data Pipelines / ETLs in a DevOps environment

• Good communication skills

• Willing to do research and self-study, to cross-train, and to recommend industry best practices to the customer's architecture and development teams
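To illustrate the kind of work the pipeline bullets above describe, here is a minimal sketch of a chunked pandas ETL that standardizes column names and applies a simple data-quality rule while keeping memory bounded for large inputs. The file contents, column names, and quality rule are hypothetical, not taken from this posting.

```python
import io
import pandas as pd

# Hypothetical raw input; in practice this would be a large file or S3 object.
RAW_CSV = io.StringIO(
    "Customer ID,Order Total\n"
    "1,10.5\n"
    "2,\n"      # missing value, removed by the data-quality rule below
    "3,7.25\n"
)

def standardize_columns(df: pd.DataFrame) -> pd.DataFrame:
    """Standardization rule: lower-case names, spaces to underscores."""
    df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
    return df

def run_pipeline(source, chunksize: int = 2) -> pd.DataFrame:
    """Read the source in chunks, clean each chunk, and concatenate results."""
    cleaned = []
    for chunk in pd.read_csv(source, chunksize=chunksize):
        chunk = standardize_columns(chunk)
        chunk = chunk.dropna(subset=["order_total"])  # data-quality rule
        cleaned.append(chunk)
    return pd.concat(cleaned, ignore_index=True)

result = run_pipeline(RAW_CSV)
print(result)
```

Processing in chunks (or swapping pandas for Dask, which exposes a very similar DataFrame API) is what lets a pipeline like this scale past the memory of a single machine.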


Position

Backend Developer


Must have Skills

  • Django

    Beginner

  • AWS

    Beginner

  • Docker

    Beginner

  • NumPy

    Beginner

  • Pandas

    Beginner

  • Python

    Beginner

  • CI/CD

    Beginner

  • DevOps

    Beginner

Job Type

Client Payroll

Up to 450K/year USD (annual salary)

Fully Remote



Duration

Long term

Ashutosh M | United States