Python Developers

The Python Developer is responsible for the design, architecture, and development of advanced software solutions,
and adheres to the best practices and standards set by the Software Development Group.

Location: YONDU HQ

Status: For Pooling

# of Positions Available: 3

Job Description

• Implement system integrations between Ultron and other IT systems, for example connecting to a CRM system's API and automating requests to it in order to deliver model outputs
• Support the Ultron data science team with Python data engineering expertise, particularly large-scale data processing and performance tuning
• Be responsible for active data interfaces from/to the DaaS data warehouse: track and coordinate changes in the DaaS data model and/or data contents to mitigate their impact on Ultron
• Connect regularly with DaaS team on Ultron's requirements as a DaaS consumer, in terms of extraction performance, access rights, etc.
• Perform peer code reviews of data engineering code ahead of merges
• Define job automation and orchestration scripts in Ultron's Airflow environment
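To illustrate the first responsibility, here is a minimal, hedged sketch of packaging model outputs into a JSON payload for a CRM API. The endpoint URL, field names, and the `build_payload`/`deliver_outputs` helpers are all invented for illustration and are not part of Ultron or any specific CRM:

```python
import json
from urllib import request

# Hypothetical CRM endpoint; the real URL and auth scheme would come
# from the CRM vendor's API documentation.
CRM_ENDPOINT = "https://crm.example.com/api/v1/scores"

def build_payload(model_outputs: dict) -> bytes:
    """Serialize model outputs into a JSON body a CRM API might expect."""
    records = [
        {"customer_id": cid, "score": round(score, 4)}
        for cid, score in sorted(model_outputs.items())
    ]
    return json.dumps({"records": records}).encode("utf-8")

def deliver_outputs(model_outputs: dict) -> request.Request:
    """Prepare (but do not send) a POST request delivering the outputs."""
    return request.Request(
        CRM_ENDPOINT,
        data=build_payload(model_outputs),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
```

In practice `urllib.request.urlopen(req)` (or an HTTP client with retries and authentication) would submit the prepared request; this sketch stops short of the network call.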

Job Qualifications/Requirements
  • Strong code development practices in Python >=3.7 with a high degree of rigor and high code standards
  • Experience in quality-assuring data engineering code, e.g., by reviewing pull requests
  • Strong capabilities in data management using:
    • Relational methods/systems (SQL)
    • Object storage/big data approaches (AWS S3/HDFS/Azure Data Lake)
    • Distributed computing frameworks (such as Apache Spark)
  • Strong capabilities in data storage layer design, in the physical (data asset organization, data type choice, data compression, data formats) and logical sense (data cardinality and normal forms, primary/foreign key relationships, integrity constraints)
  • Strong capabilities in PySpark, covering data management and performance tuning, at data scale >1 TB
  • Experienced in code versioning and release management through git, e.g., following the GitFlow approach
  • Experienced in unit testing and static code analysis/code linting, using e.g. pytest, flake8, black, isort
  • Hands-on experience with a workflow orchestrator, preferably Apache Airflow
  • Basic knowledge of DevOps/cloud-native approaches: minimally Docker, ideally Kubernetes, Terraform, and similar
  • Nice-to-haves:
    • Experienced in Python-based machine learning using scikit-learn, preferably also Spark ML
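The logical storage-design bullet above (primary/foreign key relationships, integrity constraints) can be sketched with SQLite from the Python standard library. The `customer`/`orders` schema is invented purely for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs only when enabled

# Parent table with a primary key and a uniqueness constraint.
conn.execute("""
    CREATE TABLE customer (
        customer_id INTEGER PRIMARY KEY,
        email       TEXT NOT NULL UNIQUE
    )
""")

# Child table with a foreign key and a domain (CHECK) constraint.
conn.execute("""
    CREATE TABLE orders (
        order_id    INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL REFERENCES customer(customer_id),
        amount      REAL CHECK (amount > 0)
    )
""")

conn.execute("INSERT INTO customer VALUES (1, 'a@example.com')")
conn.execute("INSERT INTO orders VALUES (10, 1, 99.5)")

# An orphan row (no matching customer) violates the foreign-key constraint
# and is rejected by the database rather than silently stored.
try:
    conn.execute("INSERT INTO orders VALUES (11, 999, 5.0)")
except sqlite3.IntegrityError as exc:
    print("rejected:", exc)
```

The same constraints exist in production warehouses and lake layers, though some systems (e.g., many big-data stores) declare but do not enforce them, which is why the role also calls for validating integrity in the engineering code itself.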