Data Engineer
Job post no longer accepts applications

KNOWLEDGESG GLOBAL PTE. LTD.
Posted: a month ago
Minimum level: N/A
2. Key Responsibilities
• Design, implement, and optimize ETL/ELT pipelines using Apache Spark, PySpark, Databricks, or Azure Synapse.
• Build and operationalize real-time streaming pipelines using Kafka, Confluent, or Azure Event Hubs for risk and liquidity data.
• Integrate and transform data from Core Banking, Payments, Trade, Treasury, CRM, and Compliance systems.
• Implement data quality, validation, and lineage frameworks using Great Expectations, Deequ, or dbt.
• Develop and maintain enterprise data models and schemas (3NF, Dimensional, Data Vault 2.0).
• Collaborate with Governance and Security stakeholders to ensure compliance with MAS TRM, PDPA, and PCI-DSS, including controls for masking, tokenization, and encryption.
• Participate in data platform modernization programs (e.g., Teradata/DB2 to Snowflake/Databricks/Synapse).
• Work with Data Scientists and AI Engineers to deploy ML feature stores and model-serving pipelines.
• Support regulatory reporting data flows (MAS 610/649, Basel III/IV).
• Maintain CI/CD and automation pipelines for data infrastructure using Azure DevOps, Terraform, or GitHub Actions.
3. Required Technical Skills
• Languages: Python, PySpark, SQL, Scala
• Data Platforms: Azure Data Lake, Synapse, Databricks, Snowflake
• Orchestration: Apache Airflow, Azure Data Factory, dbt
• Streaming: Kafka, Confluent, Event Hubs
• Governance: Apache Atlas, Azure Purview, Collibra
• Security: Encryption, Tokenization, RBAC, Audit Logging
• CI/CD & IaC: Terraform, Azure DevOps, GitHub Actions
4. Experience & Qualifications
• 6-10 years of experience in Data Engineering, with a minimum of 3 years in BFSI (Banking, Insurance, or Capital Markets).
• Demonstrated experience building real-time and batch data pipelines on Azure or AWS.
• Exposure to regulatory data models such as MAS 610, Basel III, IFRS 9/17, and BCBS 239.
• Familiarity with DevOps and MLOps principles and integration patterns.
• Bachelor's or Master's degree in Computer Science, Data Engineering, or a related discipline.
• Preferred Certifications:
- Microsoft Azure Data Engineer Associate
- Databricks Data Engineer Professional
- Snowflake SnowPro Core
JOB SUMMARY
Data Engineer
KNOWLEDGESG GLOBAL PTE. LTD.
Location: Singapore
Posted: a month ago
Minimum level: N/A
Employment type: Contract / Freelance / Self-employed