GCP Data Engineer
£600 - £650 per day, inside IR35
6-month contract
Hybrid working in London

We're working with a global healthcare and AI research organisation at the forefront of applying data engineering and machine learning to accelerate scientific discovery. Their work supports large-scale, domain-specific datasets that power research into life-changing treatments.

They're now looking for a GCP Data Engineer to join a multidisciplinary team responsible for building and operating robust, cloud-native data infrastructure that supports ML workloads, particularly PyTorch-based pipelines.

The Role
You'll focus on designing, building, and maintaining scalable data pipelines and storage systems in Google Cloud, supporting ML teams by enabling efficient data loading, dataset management, and cloud-based training workflows.

You'll work closely with ML engineers and researchers, ensuring that large volumes of unstructured and structured data can be reliably accessed, processed, and consumed by PyTorch-based systems.

Key Responsibilities
- Design and build cloud-native data pipelines using Python on GCP
- Manage large-scale object storage for unstructured data (Google Cloud Storage preferred)
- Support PyTorch-based workflows, particularly around data loading and dataset management in the cloud
- Build and optimise data integrations with BigQuery and SQL databases
- Ensure efficient memory usage and performance when handling large datasets
- Collaborate with ML engineers to support training and ..... full job details .....