Spark Architect

  • Leeds
  • Pracyva

Spark Architect / SME

Contract role: long-term; initially 6 months, extendable

Location: Leeds, UK (min 3 days onsite)



This is an exciting long-term opportunity to work on cutting-edge technology in one of the client's fastest-growing accounts, with over 1,900 associates working across India, the UK, China, Hong Kong, and Mexico. The end client is one of the biggest financial services organisations in the world, with operations in more than 38 countries and an IT infrastructure of 200,000+ servers, 20,000+ database instances, and over 150 PB of data. As a Spark Architect, you will refactor legacy ETL code (for example, DataStage) into PySpark using Prophecy's low-code/no-code platform and the available converters; the converted code is currently causing failures and performance issues that you will diagnose and fix.


The end client account is looking for an enthusiastic Spark Architect with a deep understanding of the components around Spark data integration (PySpark, scripting, variable setting, etc.), Spark SQL, and Spark explain plans. You should be able to:

  • Analyse Spark code failures through Spark plans and make corrective recommendations.
  • Review PySpark and Spark SQL jobs and make performance improvement recommendations.
  • Understand DataFrames and Resilient Distributed Datasets (RDDs), diagnose any memory-related problems, and make corrective recommendations.
  • Monitor Spark jobs using wider tools such as Grafana to see whether there are cluster-level failures.

You should also be able to demonstrate deep knowledge of how Cloudera Spark is set up and how the runtime libraries are used by PySpark code. A brief illustrative sketch of this kind of plan-level analysis follows.
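
As a rough sketch only (the paths, tables, and column names here are hypothetical, not the client's), the plan-level review described above might look like this in PySpark:

    from pyspark.sql import SparkSession
    from pyspark.sql.functions import broadcast

    spark = SparkSession.builder.appName("plan-review").getOrCreate()

    # Hypothetical tables standing in for converted DataStage output.
    orders = spark.read.parquet("/data/orders")
    customers = spark.read.parquet("/data/customers")

    # Inspect the physical plan: repeated exchanges, or a sort-merge join
    # against a small dimension table, are common sources of the failures
    # and slowdowns this role is asked to diagnose.
    orders.join(customers, "customer_id").explain("formatted")

    # One typical corrective recommendation: broadcast the small side of
    # the join to avoid a full shuffle.
    orders.join(broadcast(customers), "customer_id").explain("formatted")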


Your benefits

As the Spark Architect, you will have the opportunity to work with one of the biggest IT landscapes in the world. You can also look forward to collaborating with senior leadership stakeholders at the end client, guiding the technical team, and driving the overall programme objectives.


Your responsibilities

As a Spark Architect in the Global Data Technology (GDT) team of the largest British universal bank and financial services group, you will be responsible for:

  • Working on enterprise-scale cloud infrastructure and cloud services on GCP.
  • Driving the data integration upgrade to PySpark.

Mandatory Skills

You need to have the following skills:

  • At least 12 years of IT experience, with a deep understanding of the components around Spark data integration (PySpark, scripting, variable setting, etc.), Spark SQL, and Spark explain plans.
  • Spark SME: able to analyse Spark code failures through Spark plans and make corrective recommendations.
  • Able to walk through and explain the architecture you have been a part of, and why any particular tool or technology was used.
  • Spark SME: able to review PySpark and Spark SQL jobs and make performance improvement recommendations.
  • Spark SME: able to understand DataFrames and Resilient Distributed Datasets (RDDs), diagnose any memory-related problems, and make corrective recommendations (see the sketch after this list).
  • Monitoring: able to monitor Spark jobs using wider tools such as Grafana to see whether there are cluster-level failures.
  • Cloudera (CDP): deep knowledge of how Cloudera Spark is set up and how the runtime libraries are used by PySpark code.
  • Prophecy: high-level understanding of the low-code/no-code Prophecy setup and its use to generate PySpark code.
  • Ready to work at least three days a week from the client's Leeds (UK) office and to accept changes as per customer/Wipro policies.
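
As a minimal, hypothetical sketch of the memory-diagnosis side of the role (the settings, paths, and column names are illustrative assumptions, not the client's actual configuration):

    from pyspark.sql import SparkSession
    from pyspark import StorageLevel

    spark = (
        SparkSession.builder
        .appName("memory-review")
        # Illustrative value only; real settings depend on the cluster.
        .config("spark.sql.shuffle.partitions", "400")
        .getOrCreate()
    )

    df = spark.read.parquet("/data/transactions")  # hypothetical path

    # Caching a wide DataFrame as MEMORY_ONLY can trigger executor
    # out-of-memory failures; allowing spill to disk is a common
    # corrective recommendation.
    df.persist(StorageLevel.MEMORY_AND_DISK)

    # Repartitioning on the aggregation key reduces skewed, oversized
    # partitions that otherwise blow out task memory.
    counts = df.repartition(400, "account_id").groupBy("account_id").count()
    counts.explain("formatted")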

Good-to-have skills

Ideally, you also have:

  • Experience collaborating with multiple customer stakeholders.
  • Knowledge of working with cloud databases.
  • Excellent communication and solution presentation skills.
