At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. 

EY GDS – Data and Analytics (D&A) – GCP
As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key business domains and functions such as Banking, Insurance, Manufacturing and Auto, Healthcare, Retail, Supply Chain, and Finance.
 

The opportunity

We’re looking for candidates with a strong understanding of technology and data in the GCP engineering space, and with proven delivery capability. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team.

Your technical responsibilities:

  • Working knowledge of Google Cloud Platform
  • Experience with a combination of cloud-based big data technologies (BigQuery, Cloud Dataflow, Cloud Composer, Cloud Functions, Cloud Pub/Sub, GKE, Spark, Python, SQL, NiFi) and with OLTP and data warehousing across SQL, NoSQL, and other RDBMSs
  • Working experience with Dataflow or Storage Transfer Service (STS) to build pipelines from traditional and cloud sources into GCP (see the sketch after this list)
  • Good understanding of cloud design considerations and FinOps 
  • Proficiency in a modern programming language such as Python or Scala, along with Spark
  • A methodical, logical, and flexible approach, backed by significant general solution-delivery experience
  • Prior experience working with container technology such as Docker, version control systems (GitHub), and build management and CI/CD tools (Concourse, Jenkins)
  • Google Cloud certification is an added advantage
  • Good to have: knowledge of streaming tools such as Kinesis (knowledge of Kafka would also suffice)
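
To make the Dataflow item above concrete, here is a minimal, illustrative Apache Beam pipeline in Python of the kind this role involves: it reads CSV files from Cloud Storage, parses them, and loads the rows into BigQuery via the Dataflow runner. The project, bucket, table, and schema names are hypothetical placeholders, not details from this posting.

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_row(line):
    # Assumed two-column CSV layout: order_id,amount
    order_id, amount = line.split(",")
    return {"order_id": order_id, "amount": float(amount)}

def run():
    options = PipelineOptions(
        runner="DataflowRunner",                  # execute on the Dataflow service
        project="example-project",                # placeholder project ID
        region="us-central1",
        temp_location="gs://example-bucket/tmp",  # placeholder staging bucket
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            | "Read" >> beam.io.ReadFromText("gs://example-bucket/orders/*.csv")
            | "Parse" >> beam.Map(parse_row)
            | "Write" >> beam.io.WriteToBigQuery(
                "example-project:analytics.orders",
                schema="order_id:STRING,amount:FLOAT",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )

if __name__ == "__main__":
    run()

The same pipeline runs locally for testing by switching the runner to "DirectRunner", a common workflow before deploying to Dataflow.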

Requirements (Qualifications)
We are looking for candidates with the following: 

  • BE/BTech/MCA with sound industry experience of 3 to 7 years

Mandatory skills: 

  • Proficiency in a programming language
  • Hands-on GCP experience
  • Good understanding of application architecture

Preferred skills: 

  • Experience working with CMMI / Agile / SAFe methodologies
  • Experience working with AWS, Azure, and/or GCP, and a proven track record of building complex infrastructure programmatically with IaC tooling or vendor libraries (see the sketch after this list)
  • Strong scripting/programming skills in Python and Linux shell
  • Experience with CI/CD pipelines and test-driven frameworks
  • Experience with implementing and testing GCP Services
  • Experience with Agile and DevOps concepts
  • Strong verbal and written communication skills
  • Experience creating technical architecture documentation
  • Experience in Linux OS internals, administration and performance optimization. 
  • Experience developing monitoring architecture and implementing monitoring agents, dashboards, escalations, and alerts
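
As a rough illustration of the IaC experience mentioned above, the sketch below uses Pulumi's Python SDK to declare two GCP resources programmatically; Terraform or another vendor library would serve equally well, and all resource names and locations here are placeholders.

import pulumi
import pulumi_gcp as gcp

# A Cloud Storage bucket for pipeline staging data (placeholder name).
staging_bucket = gcp.storage.Bucket(
    "staging-bucket",
    location="US",
    uniform_bucket_level_access=True,
)

# A BigQuery dataset for curated analytics tables (placeholder ID).
analytics = gcp.bigquery.Dataset(
    "analytics",
    dataset_id="analytics",
    location="US",
)

pulumi.export("bucket_name", staging_bucket.name)

A program like this is applied with pulumi up, which computes and executes the diff between the declared and the deployed state.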

Your key responsibilities:

  • Cloud Infrastructure Management: Manage Google Cloud and other cloud environments (like AWS, Azure). Design, set up, and manage data processing systems on GCP and other cloud environments.
  • Data Pipeline Workflows: Build and maintain data pipelines using tools and services like Dataflow, Dataproc, Apache Beam, Apache Airflow, Cloud Composer, etc. to collect, process, and distribute data.
  • CI/CD Implementation: Implement CI/CD pipelines for data applications using services like Cloud Build, Jenkins, or GitLab. Automate continuous testing, continuous integration, build, packaging, and deployment across multi-cloud environments.
  • Code Management: Take charge of source code repositories like GitHub or Bitbucket, ensuring robust version control practices.
  • Data Security & Compliance: Implement security best practices: encryption, managing and securing service accounts, secret management (e.g., with Secret Manager; see the sketch after this list), authentication, access control, and data compliance requirements (such as GDPR).
  • Performance Optimization and Cost Control: Monitor, debug, troubleshoot, and optimize performance of Dataflow jobs, and other data infrastructure. Implement strategies for optimizing cost across multiple clouds.
  • Data Lakes and Data Warehouses Management: Manage BigQuery or other cloud data warehouse solutions and services for Data Lakes.
  • Collaboration & Communication: Collaborate with data scientists, architects, developers and other stakeholders, sharing insights and making data-driven decisions to develop data strategies and models.
  • Continuous Learning: Keep up to date with the latest GCP and other cloud services, big data, and CI/CD technologies. Continuous improvement and learning are expected as technology evolves quickly.
  • Architecting and Designing: Be involved in requirements gathering, solution architecture, and data model design, considering best practices, service limits, security, and cost in multi-cloud environments.
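
As a minimal sketch of the secret-management responsibility above (assuming the google-cloud-secret-manager client library; the project and secret IDs are placeholders), a pipeline would fetch a credential from Secret Manager at start-up rather than hard-coding it:

from google.cloud import secretmanager

def get_secret(project_id: str, secret_id: str, version: str = "latest") -> str:
    # Resolve the fully qualified version name and fetch its payload.
    client = secretmanager.SecretManagerServiceClient()
    name = f"projects/{project_id}/secrets/{secret_id}/versions/{version}"
    response = client.access_secret_version(request={"name": name})
    return response.payload.data.decode("utf-8")

# Placeholder IDs: the secret itself never appears in source control.
db_password = get_secret("example-project", "warehouse-db-password")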
     

EY | Building a better working world

EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets.

Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate.

Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Location

Coimbatore, TN, IN, 641014

Job Overview
Job Type
Full Time
