
Data Engineer




Duration: 6 months (strong likelihood of renewal)
Location: Remote (UK based)
IR35 status: Inside IR35

The Person

You are well versed in designing and building big data pipelines with machine learning workloads that are repeatable and scalable for extremely large datasets.

Within my client's Technology service line, the Engineering team's mission is to be the best Engineering team ever: using the right tools for the right job, failing fast, and automating like our lives depend on it. Right from the leadership level, the team is fully committed to the Engineering culture and consistently adapts to ensure that culture is nurtured and grows.

We operate at genuine scale, taking advantage of our alliance partnerships with the industry-leading public cloud providers to deliver cutting-edge solutions to clients on a global stage. If the public cloud is your future, you're an experienced Data Engineer and Google Cloud Platform practitioner, you're great at solving difficult, interesting, and complex data challenges, and you enjoy actually writing the code in Python or other languages, this is the role for you.

You will work to continuously advance and standardise our and our clients' infrastructure and pipeline deployments, whilst collaborating with colleagues to write data wrangling code that scales and takes advantage of the technologies available in the market, or sometimes private solutions.


Location
We are open to remote working from wherever you may live.

Required technical skills
•Python, Pandas, NumPy
•In-depth knowledge of ETL processes
•Apache Airflow (e.g. Google Cloud Composer)
•SQL (e.g. PostgreSQL, BigQuery)
Desired skills
•Apache Spark (Dataproc)
•Apache Beam (e.g. Dataflow)

Job role
•Build and run data-related cloud infrastructure and software services on GCP.
•Participate in managing and mentoring a team of engineers.
•Run sprints and help prioritise your team's workload. Jump between helping your team solve problems and diving into your editor of choice to poke around the data and see what can be done with it.
•Manage relationships with clients to become a trusted delivery partner.
•Support your wider Engineering community by assisting other teams, sharing knowledge, debugging problems, contributing code to various projects, and any other things you can think of.
•GTD: get things done. Use the DevOps approach to Data Engineering to bring capabilities together and ship feature-rich projects on time.
•Develop tools, services, processes, and pipelines where needed that are useful in your daily work.
•Work with DevOps teams and environments on subjects such as MLOps pipelines, performance monitoring, security monitoring, deployment/configuration, continuous integration/build, and data resource creation scripts.
•Enjoy yourself. Maintain and cultivate the culture of your team.
•Automate like your life depends on it (because it does). Keep out of the cloud portal and, where possible, build automation starting from Dataflow and Airflow and ending with cloud-native ML pipelines orchestrated with Python and IaC tools such as Terraform.

Requirements
•Proven experience managing a team of engineers and being responsible for delivering projects
•Ability to context switch between talking technically with other engineers (DevOps, security, etc.) and explaining the business benefits of serverless data engineering to non-technical business partners
•Proven experience with Hadoop, Airflow, etc.
•Hands on experience with GCP
•Proven experience in working with scripting languages such as Python, Ruby, Go, Shell or equivalent
•Solid CI/CD pipelining skills, ideally ML-orientated
•Product-focused: build once and reuse
•Build it, run it: take ownership of your code and your pipelines
•Automate everything, including testing
•Proven experience working in an agile environment
•Ability to continuously learn, work independently, and make decisions with minimal supervision
•You like nothing more than to watch colleagues flourish under your guidance.
•Be a cloud technology advocate to a wider audience inside and outside of the business
•Curiosity: bring fresh and bold ideas to the team and wider firm, be it new software or containerisation of existing infrastructure and projects

What we can offer
•Scale: most of our clients are well-known global brands, and their infrastructure is anything but small or simple
•A great team environment, inside and outside of the workplace
•Very close and advanced alliance relationship with GCP: work with the product owners directly and help build their roadmap
•Love of technology and learning about even newer technology to help our clients be successful
•Paid certifications
•Flexible and considerate working hours and locations
•Access to regular training opportunities; attending conferences and pursuing certifications is encouraged
•Working with Google tech which is not publicly available yet