PBT Group Careers

Be part of our team of Data Specialists and embark on a career of the future!

Job Title: Senior Big Data / Hadoop / AWS Data Engineer and Information Team Lead
Employment Type: Full Time
Experience: 6 to 25 years
Salary: Negotiable
Job Published: 04 August 2022
Job Reference No.: 1079874713

Job Description

PBT Group is looking for a Hadoop Developer to interpret requirements provided by business and produce effective Big Data solutions.

The Hadoop Data and Information specialist will be responsible for designing, developing and supporting application solutions, with a primary focus on Hadoop, in a financial services environment. The role is central to building new data pipelines from various structured and unstructured sources into Hadoop. The successful candidate must be eager to wear multiple hats and capable of picking up new technologies at a fast pace.

The role involves programming for code transformations and integrating the big data solution with existing systems, developing information solutions from a variety of structured and unstructured data sources, and taking technical ownership of Big Data solutions for both structured and unstructured data.

DUTIES:

  • Develop and implement big data models and solutions
  • Design and implement ETL methodologies and technologies, and their integration with big data
  • Conduct root cause analysis on production issues
  • Provide technical leadership across the entire information management process for both structured and unstructured data
  • Provide ongoing support and enhancements to the ETL system
  • Optimize information solutions
  • Implement machine learning algorithms
  • Configure the Hadoop infrastructure and environment for optimal performance
  • Work with statistical and actuarial analysts to build models
  • Produce relevant technical documentation and specifications
  • Estimate time and resource requirements for business requirements
  • Integrate big data solutions with existing reporting and analytical solutions
  • Develop data processing functions (DPFs) using Java and Python

SKILLS REQUIRED:

  • Experience as a Big Data Developer
  • Hadoop-based architecture with Hortonworks DataFlow & Data Platform clusters
  • HDF, HBase, Hive, Spark, Storm
  • AWS experience would be a major advantage.