Re: Shiv Chandra Requirement - GCP Data Engineer - Dallas, TX or Hartford, CT

Hi Bhargav,
Thank you for sharing the job description.
Please find my candidate's document attached for your review. Let me know if you need any additional information or documentation from my side.
Looking forward to your feedback.

Candidate Details:
Name: Luis
Experience: 12+ years
Location: TX

Thanks & Regards
Manasa
M: 201 215 6949 
https://www.responseinformaticsltd.com


On Tue, Jun 17, 2025 at 7:04 PM Bhargav M <bmylavarapu496@gmail.com> wrote:
Hello there,
Hope this email finds you well.

Please find the requirement below for your reference.

Job title: GCP Data Engineer

Location: Dallas, TX or Hartford, CT

If the candidate is strong, remote work can be requested.


The goal is to migrate data warehouse assets and ETL pipelines from Teradata to Google Cloud Platform (GCP). The role involves hands-on development, testing, and optimization of data pipelines and warehouse structures in GCP, ensuring minimal disruption and maximum performance.

Key Responsibilities:
•  Lead and execute migration of data and ETL workflows from Teradata to GCP-based services such as BigQuery, Cloud Storage, Dataflow, Dataproc, and Composer (Airflow).
•  Analyze and map existing Teradata workloads to appropriate GCP equivalents.
•  Rewrite SQL logic, scripts, and procedures in GCP-compliant formats (e.g., standard SQL for BigQuery); a sample translation appears after this list.
•  Collaborate with data architects and business stakeholders to define migration strategies, validate data quality, and ensure compliance.
•  Develop automated workflows for data movement and transformation using GCP-native tools and/or custom scripts (Python/Java).
•  Optimize data storage, query performance, and costs in the cloud environment.
•  Implement monitoring, logging, and alerting for all migration pipelines and production workloads.
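
To illustrate the SQL-rewrite responsibility above, here is a minimal sketch of a Teradata-to-BigQuery translation run through the google-cloud-bigquery Python client. All project, dataset, table, and column names are hypothetical, and the Teradata original is an invented example of common dialect differences (SEL, ZEROIFNULL, ADD_MONTHS):

from google.cloud import bigquery

# Hypothetical Teradata original (BTEQ style):
#   SEL cust_id, ZEROIFNULL(total_spend)
#   FROM edw.customer_summary
#   WHERE txn_date >= ADD_MONTHS(CURRENT_DATE, -3);

# BigQuery standard SQL equivalent (project/dataset are placeholders):
BQ_SQL = """
SELECT
  cust_id,
  IFNULL(total_spend, 0) AS total_spend  -- ZEROIFNULL -> IFNULL
FROM `my-project.edw.customer_summary`
WHERE txn_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 3 MONTH)  -- ADD_MONTHS -> DATE_SUB
"""

client = bigquery.Client()                  # uses application-default credentials
for row in client.query(BQ_SQL).result():   # result() waits for the job to finish
    print(row.cust_id, row.total_spend)

Note that BigQuery resolves tables as project.dataset.table, so Teradata's two-level database.table references need remapping as part of the migration.
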
______________
Required Skills:
•  6+ years of experience in Data Engineering, with at least 2 years in GCP.
•  Strong hands-on experience in Teradata data warehousing, BTEQ, and complex SQL.
•  Solid knowledge of GCP services: BigQuery, Dataflow, Cloud Storage, Pub/Sub, Composer, and Dataproc.
•  Experience with ETL/ELT pipelines using tools like Informatica, Apache Beam, or custom scripting (Python/Java); a minimal Beam sketch follows this list.
•  Proven ability to refactor and translate legacy logic from Teradata to GCP.
•  Familiarity with CI/CD, Git, and DevOps practices in cloud data environments.
•  Strong analytical, troubleshooting, and communication skills.
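
As a rough illustration of the Beam/Python pipeline skill above, the sketch below reads a hypothetical CSV extract staged in Cloud Storage and loads it into BigQuery. The bucket, project, table, and schema names are assumptions, not anything from this posting:

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_line(line):
    # Assumed two-column extract: cust_id,total_spend
    cust_id, total_spend = line.split(",")
    return {"cust_id": cust_id, "total_spend": float(total_spend or 0)}

# PipelineOptions() defaults to the local DirectRunner; pass
# --runner=DataflowRunner --project=... --region=... --temp_location=gs://...
# to run the same pipeline on Dataflow (the BigQuery file-load path
# also needs a GCS temp location).
with beam.Pipeline(options=PipelineOptions()) as p:
    (p
     | "Read" >> beam.io.ReadFromText("gs://my-bucket/extracts/customers-*.csv",
                                      skip_header_lines=1)
     | "Parse" >> beam.Map(parse_line)
     | "Load" >> beam.io.WriteToBigQuery(
           "my-project:edw.customer_summary",        # hypothetical target table
           schema="cust_id:STRING,total_spend:FLOAT",
           write_disposition=beam.io.BigQueryDisposition.WRITE_TRUNCATE,
           create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED))
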
______________
Preferred Qualifications:
•  GCP certification (e.g., Professional Data Engineer).
•  Exposure to Apache Kafka, Cloud Functions, or AI/ML pipelines on GCP.
•  Experience working in the healthcare, retail, or finance domains.
•  Knowledge of data governance, security, and compliance in cloud ecosystems.



Regards,
Bhargav Mylavarapu | Delivery Lead

CENTSTONE SERVICES

Address: 3400 State Route 35, Suite 9B, Hazlet, New Jersey, 07730 USA
