GCP Data Platform Architect

  • Permanent
  • London
  • Posted on June 18th, 2021

GCP Data Platform Architect - Insight & Data Services - Permanent

Base Location: London / UK wide (some remote working)

The Client:

Our client is a global leader in Systems Integration and IT Consultancy. They have built a highly advanced and well-respected, industry-wide Insights & Data practice. The Data Engineering, Architecture and Platform practice sits within the global Insights & Data group; its goal is to help the organisations it works with become truly ‘insight driven’, fully exploiting their data through the convergence of Cloud and Artificial Intelligence to deliver real business value. Their objective is to marry the most innovative insights solutions with rock-solid, industrialised engineering.

The Role:

We can’t become ‘insight driven’ without data. We can’t deliver the AI revolution without data. Data is the lifeblood of everything we do in Insights and Data. We embrace experimentation and industrialisation, so we need passionate, energetic data engineers who are focused on using their skills to drive real business value from data assets.

The data space is very large, so the range of relevant skills and technologies is equally broad and rapidly evolving. It is therefore important to us that you have a desire for continued learning.

Essential Experience:

  • We are looking for senior, highly experienced Lead Engineers / Architects with 7+ years’ commercial experience.
  • Google Cloud Platform (GCP) and experience with its wide range of services: Google Compute Engine (GCE), Google Kubernetes Engine (GKE, formerly Container Engine), Google App Engine (GAE) and Google Cloud Functions (GCF).
  • AWS (e.g. Athena, Redshift, Glue, EMR)
  • IBM Cloud
  • Python, Spark, SQL
  • Snowflake
  • Experience of developing enterprise grade ETL/data pipelines with tools like Informatica and Talend
  • Software engineering practices applied to data science (coding standards, unit testing, version control, code review).
  • Hadoop (especially the Cloudera and Hortonworks distributions), other NoSQL (especially Neo4j and Elastic), and streaming technologies (especially Spark Streaming).
  • Deep understanding of data manipulation/wrangling techniques.
  • Experience using development and deployment technologies, for instance virtualisation and management (e.g. Vagrant, Virtualbox), continuous integration tools (e.g. Jenkins, Concourse, Drone, Bamboo), configuration management tooling (e.g. Ansible) and containerisation technologies (e.g. Docker, Kubernetes, Swarm).
  • Experience building and deploying solutions to Cloud (AWS, Azure, Google Cloud) including Cloud provisioning tools (e.g. Terraform).
  • Ability to translate business requirements into plausible technical solutions for articulation to other development staff.
  • Experience designing analytics deliveries, planning projects and/or leading teams.

Copyright © 2024 83Zero