Data Engineer (Work from Home)

Job Locations: PH
Posted Date: 1/26/2024
Category: Information Technology


Join our award-winning IT team as we lead the way in digital, cloud, and security technology services. You will play a critical role in delivering innovative solutions for our biggest client, Canada’s leading telecommunications, tech, and media corporation.


We’re looking for passionate and creative tech leaders who want to take their career to the next level and make a real impact. Our success is fueled by our people and our passion for innovation, so we empower our Qmunity and provide a workplace where they can flourish and grow.


We offer premium benefits, including:

  • Flexible work setup: home-based or hybrid available (reporting to sites in Taytay, San Mateo, Naga, or the upcoming Makati or Ortigas Center locations)
  • Salary based on experience, plus miscellaneous allowances
  • Performance bonus and yearly increase
  • HMO from day 1 for you + 2 free dependents
  • 6 months of paid maternity/paternity leave
  • Company-sponsored training, upskilling, and career growth opportunities


Key Responsibilities:

  • Design, develop, and operate scalable and efficient data pipelines across on-premise and public cloud environments.
  • Drive best practices in data engineering, ensuring high code quality, maintainability, and user documentation.
  • Leverage cloud technologies, with a focus on Google Cloud services such as BigQuery, Dataflow, and Pub/Sub.
  • Optimize data pipelines for performance, reliability, resource-efficiency, and end-to-end assurance.
  • Develop and implement monitoring, alerting, and incident response processes for data pipeline systems and infrastructure.
  • Assure the operation of highly performant data pipelines through 24/7 monitoring of data service level indicators (SLIs).
  • Assure the performance of data infrastructure, ensuring 99.99% high availability.
  • Collaborate and take direction from senior data engineers, ensuring a culture of continuous improvement and learning.


Minimum Required Skills & Experience:

  • Bachelor's or advanced degree in Computer Science, Software Engineering, or a related field.
  • More than 2 years of experience engineering and operating end-to-end data systems within a cloud environment.
  • More than 2 years of expertise in on-premise data engineering using technologies such as Hadoop, Kubernetes, and Kafka.
  • At least 2 years of proficiency in Scala, Apache Spark, Apache Kafka, SQL/NoSQL, shell scripting, Python, and Hadoop frameworks.
  • Candidates must also have hands-on experience with the following:
    • Streaming and batch processing technologies such as Apache Kafka and Spark/Spark Streaming.
    • Cloud platforms, with a preference for expertise in Google Cloud Platform (GCP).
    • Building data pipelines in Java and Python.
    • Applying CI/CD automation within the data pipeline SDLC (more than 2 years of experience).
  • At least 2 years of experience in database design, optimization, and performance tuning.


You will thrive in this role if you have:

  • Cloud data or architecture certifications in GCP, AWS, or Azure
  • Experience with OpenShift
  • Excellent problem-solving skills and the ability to work in a collaborative, fast-paced environment.
  • Passion for staying updated on the latest trends and technologies in cloud data engineering.
  • Fluent in English, both verbal and written, with excellent communication skills.



If you’re ready to take the next step in your career, APPLY NOW! 



