Location: YONDU HQ
# of Positions Available: 6
DUTIES AND RESPONSIBILITIES:
● Develops automated data pipelines based on the solution recommended by the Data Engineering Lead, using the official ETL tools determined by the Data Engineering Head.
● Creates automated data pipeline design documentation, Source-to-Target spreadsheets, and technical design specifications.
● Provides technical expertise, troubleshooting, guidance, and support during design, development, testing, production, and post-production.
● Understands the source system, captures and documents business rules.
● Analyzes and proposes changes to maximize code reusability and maintain business value.
● Develops, implements and maintains highly efficient and scalable automated data pipeline processes.
● Provides expert technical knowledge and solutions pertaining to Data integration.
● Identifies opportunities to optimize the automated data pipeline, including environment setup, and implements monitoring, quality, and validation processes to ensure data accuracy and integrity.
● Provides inputs to the data governance organization to drive the data governance charter.
● Ensures compliance with internal standards and guidelines.
● Conducts unit testing and component integration testing for developed solutions.
● Reviews business and technical requirements and ensures the data integration platform meets requirements.
● Applies industry best practices for automated data pipeline design and development.
● Implements end-to-end complex automated data pipeline systems using common big data tools such as (but not limited to) Apache Spark, Talend, Kafka, Airflow, NiFi, and Hadoop ecosystem tools.
● Implements monitoring and measurement of data quality as defined by business rules.
● Ensures adherence to architectural governance standards and practices.
● Develops best practices, standards of excellence, and guidelines for programming teams.
● Ensures compliance with DevSecOps practices.
● Conducts system testing: executes job flows, investigates and resolves system defects, and documents results.
● Provides level-of-effort estimates for new initiatives and change controls so that projects can be evaluated, budgeted, and tracked for success.
QUALIFICATIONS:
● Education: Bachelor’s degree in Information Technology, Computer Science, or a related field
● Related Work Experience:
○ Required: Java, Spark, Linux/Unix, Scala, SQL
○ Advantage: Hadoop, AWS services, Jenkins, Bitbucket, CI/CD processes, ETL/DW, Big Data
○ Good communication and interpersonal skills for interacting and collaborating with developers, analysts, and business staff throughout the organization.
○ Ability to communicate clearly in writing to document data requirements and translate them into technical solutions.