Data Engineer
KNOWLEDGESG GLOBAL PTE. LTD.

Posted: 12 days ago
Minimum level: N/A
Job category: IT

Key Responsibilities:

  • Design, build, and maintain robust, scalable data pipelines for ingestion, transformation, and storage from multiple data sources.
  • Develop and optimize ETL/ELT workflows using modern data processing frameworks.
  • Architect data models and warehouse structures to support analytical and reporting needs.
  • Work with large-scale structured and unstructured data across distributed systems (Hadoop, Spark, or cloud-based).
  • Implement data governance, security, and quality standards across all data layers.
  • Collaborate with Data Scientists, Analysts, and Business stakeholders to ensure reliable data delivery and accessibility.
  • Integrate cloud data services (AWS, Azure, or GCP) with existing systems to enhance performance and flexibility.
  • Monitor, troubleshoot, and optimize data pipelines for performance and cost efficiency.
  • Mentor junior engineers and establish best practices in data engineering and DevOps for data.
  • Stay current with emerging tools and technologies in data architecture, streaming, and automation.

Technical Skills Required:

Programming & Data Processing:
  • Python, Scala, Java, or SQL (advanced proficiency).
  • Apache Spark, Hadoop, Flink, or Kafka for distributed data processing.

Data Warehousing & Databases:
  • Relational Databases: PostgreSQL, MySQL, Oracle, SQL Server.
  • Cloud Data Warehouses: Snowflake, Google BigQuery, Amazon Redshift, or Azure Synapse.
  • NoSQL Databases: MongoDB, Cassandra, DynamoDB.

Cloud Platforms:
  • AWS (Glue, S3, EMR, Redshift, Lambda, Athena).
  • Azure (Data Factory, Synapse, Databricks, Data Lake).
  • GCP (BigQuery, Dataflow, Pub/Sub, Dataproc).

ETL / Workflow Tools:
  • Airflow, NiFi, Talend, dbt, Informatica, or Azure Data Factory.

Data Modeling & Governance:
  • Dimensional modeling, Data Vault, and normalization.
  • Metadata management, lineage tracking, and data catalog tools (e.g., Collibra, Alation).

DevOps & CI/CD for Data:
  • Git, Docker, Kubernetes, Terraform, Jenkins, or similar.

Bonus Skills:
  • Experience with machine learning pipelines (e.g., MLflow, Vertex AI, or SageMaker).
  • Knowledge of streaming and real-time analytics systems.

Qualifications:

  • Bachelor's or Master's Degree in Computer Science, Data Engineering, Information Systems, or related field.
  • Minimum 10 years of experience in data engineering, data integration, or data architecture.
  • Proven track record of managing large-scale data solutions across hybrid or cloud environments.
  • Relevant certifications (preferred):
    • AWS Certified Data Analytics - Specialty
    • Microsoft Certified: Azure Data Engineer Associate
    • Google Cloud Professional Data Engineer
    • Databricks Certified Data Engineer Professional
JOB SUMMARY

Position: Data Engineer
Company: KNOWLEDGESG GLOBAL PTE. LTD.
Location: Singapore
Posted: 12 days ago
Employment type: Contract / Freelance / Self-employed