Job Location : Trivandrum, Kochi, Chennai, Pune, Bangalore
Experience : 5 Years
CTC Budget : 20,00,000 to 28,00,000
Posted At : 17-Oct-2025
We are seeking an experienced engineer with a strong foundation in Java (17+) and hands-on experience with Google Cloud Platform (GCP). The ideal candidate will design, develop, and maintain data processing and orchestration pipelines leveraging modern cloud technologies such as Apache Beam, Cloud Dataflow, and Airflow/Composer.
Key Responsibilities:
• Design, develop, test, and deploy highly scalable and reliable data processing pipelines using Java 17+ and the Apache Beam SDK, executed on GCP Cloud Dataflow (a minimal sketch follows this list).
• Build and manage data orchestration workflows using Apache Airflow or GCP Cloud Composer, including creation and maintenance of DAGs with common and custom operators.
• Work with GCP Big Data services including BigQuery, BigTable, and Google Cloud Storage (GCS) to optimize data storage and processing.
• Write, review, and optimize complex SQL queries for data retrieval, transformation, and analytics (especially in BigQuery and Cloud SQL).
• Ensure code quality, testing coverage, and technical documentation are maintained to the highest standards.
• Collaborate with senior engineers and stakeholders to deliver robust, secure, and efficient data solutions.
• Added Advantage:
o Deploy and manage applications using Google Kubernetes Engine (GKE).
o Implement security best practices via Identity and Access Management (IAM).
o Work with Cloud Spanner and develop secure REST APIs for data access.
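For context, here is a minimal sketch of the kind of Beam pipeline this role involves, written against the Apache Beam Java SDK. The bucket paths and class name are hypothetical placeholders; the job runs on the DirectRunner by default, or on Cloud Dataflow when passed --runner=DataflowRunner with a project and region.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.TextIO;
    import org.apache.beam.sdk.options.PipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.Filter;

    public class CleanLinesJob {
        public static void main(String[] args) {
            // Standard Beam options; add --runner=DataflowRunner --project=... --region=... for Dataflow.
            PipelineOptions options = PipelineOptionsFactory.fromArgs(args).withValidation().create();
            Pipeline p = Pipeline.create(options);

            p.apply("ReadFromGCS", TextIO.read().from("gs://example-bucket/input/*.csv"))   // hypothetical bucket
             .apply("DropBlankLines", Filter.by((String line) -> !line.isBlank()))          // simple cleaning transform
             .apply("WriteToGCS", TextIO.write().to("gs://example-bucket/output/cleaned")); // hypothetical bucket

            p.run().waitUntilFinish();
        }
    }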
Mandatory Skills:
• Hands-on experience with Java 17+
• Strong experience with Spring and microservices
• Experience with GCP Cloud Dataflow (Apache Beam SDK)
• Hands-on with Airflow/Composer – creating and managing DAGs and operators
• Strong understanding of GCP Big Data services – BigQuery, BigTable, GCS, Cloud SQL
• Excellent SQL query writing and optimization skills (see the BigQuery sketch below)
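For illustration, a minimal sketch of the BigQuery side using the official google-cloud-bigquery Java client; the project, dataset, table, and column names below are hypothetical, chosen only to show the query-and-iterate pattern.

    import com.google.cloud.bigquery.BigQuery;
    import com.google.cloud.bigquery.BigQueryOptions;
    import com.google.cloud.bigquery.QueryJobConfiguration;
    import com.google.cloud.bigquery.TableResult;

    public class TopCustomersQuery {
        public static void main(String[] args) throws InterruptedException {
            // Authenticates via Application Default Credentials.
            BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();

            // Hypothetical project, dataset, and columns; shown only to illustrate the pattern.
            String sql = "SELECT customer_id, SUM(amount) AS total "
                       + "FROM `example-project.sales.orders` "
                       + "GROUP BY customer_id ORDER BY total DESC LIMIT 10";

            TableResult result = bigquery.query(QueryJobConfiguration.newBuilder(sql).build());
            result.iterateAll().forEach(row ->
                System.out.printf("%s -> %s%n",
                    row.get("customer_id").getStringValue(), row.get("total").getValue()));
        }
    }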