Join an established company that is unlocking technology’s untapped potential
End-to-end ownership of Data Engineering in APAC
A role that will accelerate your career
Our client is a globally leading technology protection services provider with millions of consumers across the globe. With offices across multiple continents, they are establishing a dynamic and rewarding environment for their employees to grow in.
You will be responsible for:
Designing an APAC-specific Enterprise Data Platform to deliver cloud-based intelligent systems, and building data pipelines, architectures, and data sets from raw, loosely structured data
Building processes that support data transformation, data structures, metadata, and dependency management, and analyzing disparate and diverse data assets to automate insights and drive business performance
Interfacing with cross-functional teams including Finance, Risk, SCM, and Product to design and build data models that provide actionable insights into key business performance metrics, as well as supporting the needs of the commercial team
Leveraging cloud-based architectures and technologies to deliver optimized ML models at scale
Working closely with leaders in data architecture and enterprise architecture, data scientists/analysts, and domain experts to build and maintain roadmaps aligned with the IT strategy
Owning the delivery of a modern data engineering model that follows DevOps principles and standards for continuous integration/continuous delivery (CI/CD) processes
Identifying opportunities to automate existing manual processes to drive key business performance
Participating in project working groups and assisting in tracking project milestones and deliverables
You have a Bachelor’s or Master’s degree in computer science, engineering, math or a related field.
You have at least 8 years of experience with public and private cloud solutions, including building and maintaining a data ecosystem that includes an ERP environment as well as designing and building data-intensive solutions using distributed computing
You have at least 6 years of experience in programming with SQL, Python, Scala and Java
You have advanced SQL knowledge and experience working with relational databases and query authoring, as well as working familiarity with a variety of databases leveraging big data technologies
You have experience with distributed data streaming frameworks such as Spark Structured Streaming, Apache Flink, and Kafka
You have OLAP system experience (Cubes, MSSQL)
You are familiar with data warehouse usage and optimization (Redshift, Hive, Snowflake)
You are well-versed in building data visualizations and analytics, e.g. Power BI, SSRS
You have strong analytical skills and an understanding of statistical methodologies.
You are able to communicate effectively with external clients and internal teams and manage expectations
You are capable of working as an effective team member, sharing knowledge and information with the team, and helping others meet team priorities
You are able to work independently and be an effective decision-maker
You are able to engage with offshore-based development and support teams
You show a willingness to learn and adapt based on strategic initiatives in the pipeline
Job ID: 9GGUPMGBHO