Sr Big Data Infrastructure Engineer (GCP)

United States - Remote
Public Cloud - Offerings and Delivery – Cloud Data Services /
Full - Time /
Remote
About the Role:
We are seeking a highly skilled and experienced Senior Big Data Infrastructure Engineer to join our dynamic team. The ideal candidate will have a strong background in developing and scaling both stream and batch processing systems, along with a solid understanding of public cloud technologies, especially GCP. This role is fully remote and requires excellent communication skills and the ability to solve complex problems independently and creatively.

What you will be doing:

    • Implementing automation/DevOps best practices (CI/CD, IaC, containerization, etc.) to build reusable infrastructure for stream and batch processing systems at scale.
    • Creating automation, whether building DevOps pipelines, scripting, or writing Infrastructure as Code in Terraform
    • Participating in work sessions with clients 
    • Completing technical documentation 

Requirements:

    • Experience developing and scaling data processing systems, including technologies such as Pub/Sub, Kafka, Kinesis, Dataflow, Flink, Hadoop, Pig, Hive, and Spark.
    • Expertise in public cloud services, particularly in GCP.
    • Experience with GCP managed services and understanding of cloud-based messaging/stream processing systems are critical.
    • Experience applying DevOps principles to infrastructure in daily work, using continuous integration and continuous deployment (CI/CD) tooling and Infrastructure as Code (IaC) such as Terraform to automate and improve development and release processes.
    • Knowledge of containerization technologies such as Docker and Kubernetes to enhance the scalability and efficiency of applications.
    • Ability to work effectively in a remote setting, with strong written and verbal communication skills; collaborates with team members and stakeholders to ensure a clear understanding of technical requirements and project goals.
    • Proven experience in engineering stream/batch processing systems at scale.
    • Strong programming abilities in Java and Python.
    • Hands-on experience in public cloud platforms, particularly GCP. Additional experience with other cloud technologies is advantageous.

Must Have:

    • Google Associate Cloud Engineer Certification or other Google Cloud Professional level certification
    • 4+ years of experience in customer-facing software/technology or consulting
    • 4+ years of experience with “on-premises to cloud” migrations or IT transformations
    • 4+ years of experience building and operating solutions on GCP (ideally) or AWS/Azure
    • Technical degree in Computer Science, Software Engineering, or a related field