Senior Data Engineer (WORK FROM HOME)

Job Locations: PH
Posted Date: 1/25/2024 8:24 AM
Requisition ID: 8-2024-20002
# of Equipment Required: 4
Category (Portal Searching): Information Technology

Overview

Join our award-winning IT team as we lead the way in digital, cloud, and security technology services. You will play a critical role in delivering innovative solutions for our biggest client, Canada’s leading telecommunications, tech, and media corporation.

 

We’re looking for passionate and creative tech leaders who want to take their career to the next level and make a real impact. Our success is fueled by our people and our passion for innovation, so we empower our Qmunity and provide a workplace where they can flourish and grow.

 

We offer premium benefits, including:

  • Work from Home setup
  • Salary based on experience, plus miscellaneous allowances
  • Performance bonuses and yearly increase
  • HMO from day 1 for you + 2 free dependents
  • 6 months paid maternity/paternity leave
  • Company-sponsored training and upskilling, and career growth opportunities!

Responsibilities

  • Design, develop, and operate scalable and efficient data pipelines across on-premise and public cloud environments.
  • Collaborate with cross-functional teams to translate business requirements into performant data pipelines with service level objectives (SLOs).
  • Drive best practices in data engineering, ensuring high code quality, maintainability, and user documentation.
  • Leverage cloud technologies, with a focus on Google Cloud services such as BigQuery, Dataflow, and Pub/Sub.
  • Optimize data pipelines for end-to-end performance, reliability, security, and resource-efficiency.
  • Develop and implement monitoring, alerting, and incident response processes for data pipeline systems and infrastructure.
  • Ensure the operation of data pipelines through 24/7 monitoring of data service level indicators (SLIs).
  • Ensure the performance of data infrastructure, maintaining 99.99% availability.
  • Provide DevOps mentorship and guidance to team members.
  • Contribute to the development of data systems disaster recovery plans and conduct regular compliance validation drills.
  • Mentor and coach junior data engineers, fostering a culture of continuous improvement and learning.

 

Qualifications

  • Bachelor’s degree in Computer Science, Software Engineering, or a related field; advanced degree preferred.
  • 4+ years of experience in engineering and operating end-to-end data systems across multi-cloud environments, with expertise in GCP.
  • 4+ years of proven experience in software architecture and design, driving the technical direction of projects.
  • 4+ years of expertise in on-premises data engineering utilizing technologies such as Hadoop, Kubernetes, and Kafka.
  • 4+ years of experience using Scala, Apache Spark, Apache Kafka, SQL/NoSQL, Shell scripting, Python, and Hadoop frameworks.
  • At least 2 years of hands-on experience with streaming and batch processing technologies such as Apache Kafka, and with cloud platforms such as Google Cloud Platform.
  • At least 2 years of hands-on experience building data pipelines in Java and Python, and applying CI/CD automation within the data pipeline SDLC.

 

Nice to have:

  • Knowledge of database design, optimization, and performance tuning
  • Experience in Site Reliability Engineering or software system performance optimization

 

You will thrive in this role if you have:

  • Excellent problem-solving skills and the ability to work in a collaborative, fast-paced environment.
  • Passion for staying updated on the latest trends and technologies in cloud data engineering.
  • Fluency in English, both verbal and written, with excellent communication skills.

 

If you’re ready to take the next step in your career, APPLY NOW! 

 

#LS-CS1
