PBT Group Careers

Be part of our team of Data Specialists and embark on a career of the future!


Senior Google Cloud Platform (GCP) Data Engineer (Technical Lead) Reference No: 2225778651 | Stellenbosch, South Africa | Posted on: 17 April 2026

We are seeking a highly skilled Senior Data Engineer to take on a technical lead role within our data engineering capability. This individual will be responsible for both hands-on delivery and providing technical direction, ensuring the design and implementation of scalable, high-quality data solutions. The ideal candidate will have strong experience in Google Cloud Platform (GCP) and a proven track record in building modern data platforms. This role requires someone who is equally comfortable leading technical initiatives, mentoring team members, and engaging with stakeholders to deliver impactful data solutions.

Key Responsibilities
- Lead the design and implementation of scalable data pipelines and architectures using GCP technologies (BigQuery, Dataflow, Pub/Sub, Cloud Storage).
- Act as the technical lead for data engineering, setting best practices, standards, and architectural direction.
- Build and maintain robust ETL/ELT processes to transform and integrate data from multiple sources.
- Collaborate closely with data scientists, analysts, and business stakeholders to deliver data solutions aligned to business needs.
- Provide hands-on development support while guiding and mentoring other engineers.
- Ensure high standards of data quality, governance, and performance optimisation across all pipelines.
- Troubleshoot and resolve complex data-related issues in a timely manner.
- Drive automation and efficiency through reusable frameworks and tooling.
- Maintain clear documentation of data architecture, pipelines, and models.
- Stay up to date with emerging trends and best practices within GCP and data engineering.

Requirements
- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
- 5+ years' experience in data engineering, with proven experience in a senior or lead capacity.
- Strong hands-on experience with GCP, particularly BigQuery, with exposure to Dataflow, Pub/Sub, and Cloud Storage.
- Proficiency in Python and SQL for data processing and transformation.
- Solid experience in data modelling, ETL/ELT development, and modern data architectures (e.g. lakehouse).
- Experience in designing and leading scalable data solutions.
- Strong problem-solving and analytical skills.
- Excellent communication skills with the ability to engage both technical and non-technical stakeholders.
- Experience in mentoring or leading technical teams is highly advantageous.
- Exposure to financial services environments would be beneficial.

* In order to comply with the POPI Act, for future career opportunities, we require your permission to maintain your personal details on our database. By completing and returning this form you give PBT your consent.
* If you have not received any feedback after 2 weeks, please consider your application as unsuccessful.
Salary: R750 to R950

Business Intelligence Business Analyst (BIBA) & Process Engineer – Banking Reference No: 2409646765 | Johannesburg, South Africa | Posted on: 15 April 2026

PBT Group is seeking a highly skilled Business Intelligence Business Analyst (BIBA) with strong banking experience to join a dynamic client environment in Johannesburg. This role requires a well-rounded professional who combines Business Analysis, Data Analysis, and Process Engineering expertise, with a particular emphasis on process understanding and optimisation. The successful candidate will play a key role in bridging the gap between business, data, and technology teams, ensuring that data-driven solutions are aligned to business processes and strategic objectives.

Key Responsibilities
- Engage with business stakeholders to gather, analyse, and document business, data, and process requirements
- Analyse and map current ("as-is") and future ("to-be") business processes, identifying opportunities for optimisation
- Translate business and process requirements into clear functional and data specifications
- Work closely with data engineers, BI developers, and architects to support solution design and delivery
- Perform data analysis to support business insights, reporting, and decision-making
- Ensure alignment between business processes and data models / BI solutions
- Facilitate workshops and stakeholder sessions across business and technical teams
- Support testing processes, including UAT, ensuring solutions meet business and process requirements
- Drive data quality, governance, and consistency across reporting environments
- Document processes, data flows, and business rules in line with best practices

Key Requirements
- Proven experience as a BIBA / BI Business Analyst within the banking or financial services sector
- A strong combination of Business Analysis, Data Analysis, and Process Analysis / Process Engineering (essential)
- Demonstrated experience in process mapping, optimisation, and re-engineering
- Solid understanding of data warehousing, BI, and reporting environments
- Strong SQL and data analysis capability
- Experience working with cross-functional teams including data engineers and BI developers
- Familiarity with Agile and/or traditional delivery methodologies
- Excellent stakeholder engagement and communication skills

Nice to Have
- Experience with BI tools (e.g. Power BI, Qlik, Tableau)
- Exposure to data governance and data quality frameworks
- Knowledge of banking systems, regulatory environments, or financial data

Key Competencies
- Strong analytical and problem-solving skills
- Process-driven thinking with attention to detail
- Ability to translate complex data and processes into business-friendly insights
- Effective stakeholder management across business and IT
- Ability to work in fast-paced, delivery-focused environments

* In order to comply with the POPI Act, for future career opportunities, we require your permission to maintain your personal details on our database. By completing and returning this form you give PBT your consent.
* If you have not received any feedback after 2 weeks, please consider your application as unsuccessful.
Salary: R400 to R550

Senior Data Scientist – Generative AI Reference No: 1512548261 | Johannesburg, South Africa | Posted on: 13 April 2026

PBT Group is seeking a Senior Data Scientist – Generative AI to lead the design, development, and deployment of advanced AI and machine learning solutions. This role requires a highly hands-on technical specialist with deep expertise across Generative AI, Machine Learning, and Data Science, capable of building production-grade models that drive measurable business value. The successful candidate will play a key role in shaping AI capabilities, developing scalable model architectures, and collaborating with engineering and business teams to deliver impactful, real-world AI solutions.

Key Responsibilities
- Design, build, and deploy Generative AI models for practical enterprise applications (text, image, video, multimodal)
- Develop, fine-tune, and optimise large language models (LLMs) and transformer-based architectures
- Apply advanced machine learning and statistical techniques to solve complex business problems
- Work with structured and unstructured datasets to engineer high-quality AI solutions
- Partner with data engineers and software teams to integrate models into production environments
- Drive model performance, scalability, and cost optimisation initiatives
- Contribute to AI strategy, experimentation, and innovation initiatives
- Conduct research and remain current with emerging trends in Generative AI and ML
- Ensure adherence to data governance, security, and responsible AI practices
- Communicate findings, insights, and model outcomes to technical and business stakeholders

Minimum Requirements

Experience
- 5+ years' experience in Data Science / Machine Learning / AI roles
- Proven track record delivering production-grade ML or AI solutions
- Strong hands-on development experience (not purely research or oversight)

Core Technical Expertise
- Generative AI and deep learning
- Large language models (LLMs)
- Transformer architectures
- GANs / VAEs / modern generative approaches
- Prompt engineering and model optimisation

Programming & Frameworks
- Python (essential)
- PyTorch and/or TensorFlow
- Experience working with modern AI / LLM APIs

Machine Learning & Data Science
- Model development and evaluation
- Feature engineering and optimisation
- Statistical modelling and experimentation
- Data exploration and analytical problem solving

Data Engineering & Processing
- Strong SQL proficiency
- Experience with Spark / large-scale data processing
- Handling complex, high-volume datasets

Cloud & Infrastructure
- Experience deploying ML / AI workloads on AWS, Azure, or GCP
- Understanding of scalable cloud-native architectures

MLOps & Deployment
- Model lifecycle management
- Docker / containerisation concepts
- CI/CD for ML pipelines
- Exposure to Kubernetes advantageous

Preferred Experience (highly advantageous)
- NLP, Computer Vision, or Multimodal AI experience
- Reinforcement Learning (RL) or Self-Supervised Learning (SSL) exposure
- AI model optimisation and performance tuning
- Model interpretability / explainability techniques
- Experience working in regulated or data-sensitive environments
- Research publications or open-source contributions

Key Competencies
- Strong analytical and problem-solving capability
- Deep technical curiosity and an innovation mindset
- Ability to work hands-on and independently
- Strong collaboration across technical and business teams
- Clear communication of complex AI concepts
- A pragmatic, solution-driven approach to AI adoption

Ideal Candidate Profile
This role suits a senior, technically strong Data Scientist who combines advanced Generative AI expertise, solid Machine Learning and statistical foundations, a strong engineering and production mindset, and the ability to translate AI capabilities into business value.

* In order to comply with the POPI Act, for future career opportunities, we require your permission to maintain your personal details on our database. By completing and returning this form you give PBT your consent.
* If you have not received any feedback after 2 weeks, please consider your application as unsuccessful.
Salary: R600 to R800

Java Integration Engineer Reference No: 2196575350 | Cape Town, South Africa | Posted on: 13 April 2026

Build the Digital Backbone of Healthcare

We are looking for a Java Integration Engineer to help build and evolve a high-performance integration platform that powers the exchange of healthcare data between medical schemes, hospitals, switches, insurers, and healthcare providers. Our systems enable millions of healthcare transactions, helping ensure that claims are processed, benefits are administered, and critical healthcare information flows reliably across the healthcare ecosystem. This role sits at the heart of that infrastructure. If you enjoy solving complex integration problems, working with high-volume data pipelines, and building systems that directly impact healthcare delivery, this is an opportunity to work on meaningful technology at scale.

What You'll Work On
You will help develop and maintain a large-scale enterprise integration platform that connects multiple healthcare systems through real-time and batch processing. Key components of the platform include:
- High-performance Java-based processing engines
- Event-driven architectures using Apache Kafka
- Real-time and batch system-to-system integrations
- Configurable interface orchestration and scheduling
- Secure file and API integrations (FTP/SFTP/FTPS, REST, SOAP)
- Real-time listeners using TCP/IP, HTTP, and web services
- Data transformation engines for complex industry formats
- Centralized monitoring, auditing, and alerting

The platform processes large volumes of transactions daily and supports mission-critical workflows across healthcare organizations.

Why This Role Is Interesting
This role sits at the intersection of software engineering, distributed systems, and enterprise integration. You will work on:
- Event streaming architectures using Kafka
- High-volume data processing pipelines
- Real-time integration platforms
- Complex system interoperability challenges
- Enterprise-scale transaction processing systems

Your work will help ensure that healthcare systems communicate reliably, which directly impacts how healthcare services are delivered and administered.

Responsibilities
- Design and implement new functionality within the integration platform
- Develop and maintain Java-based integration services
- Build and enhance Kafka-based event streaming pipelines
- Collaborate with developers and analysts to design robust integration flows
- Optimize SQL queries and data access patterns for performance
- Improve system resilience, scalability, and reliability
- Perform root cause analysis across integrated systems
- Write unit and integration tests to ensure system quality
- Produce clear technical documentation
- Contribute to improving engineering practices and platform architecture

Requirements
- Relevant tertiary qualification in Information Technology / Computer Science / Engineering
- 5+ years' experience developing enterprise Java applications
- Strong experience with backend systems and integration platforms

Technical Skills

Core Development
- Java (strong OOP knowledge)
- Multithreaded programming
- JDBC
- REST / SOAP APIs
- Network programming (sockets)

Integration & Messaging
- Apache Kafka
- Enterprise system integration
- Real-time interface design
- Batch processing pipelines

Data
- Strong SQL (Microsoft SQL Server)
- Relational database modeling

Technologies
- XML / JSON data transformation
- FTP / SFTP / FTPS integrations
- JavaMail
- HTML

Tools
- Azure DevOps / Team Foundation Server
- Application servers (GlassFish or similar)

Bonus Experience
Experience in the Healthcare or Insurance industry is advantageous.

Personal Attributes
- Self-motivated and proactive
- Strong analytical and problem-solving skills
- Clear communicator and team player
- Detail-oriented with a strong sense of ownership
- Comfortable working across complex integrated systems

What We Offer
- A hybrid working environment
- Exposure to modern integration technologies
- The opportunity to work on large-scale, meaningful systems
- A supportive and collaborative engineering culture
- A variety of technically challenging projects

Interested?
If you're passionate about Java, Kafka, and large-scale integration systems, we'd love to hear from you.

* In order to comply with the POPI Act, for future career opportunities, we require your permission to maintain your personal details on our database. By completing and returning this form you give PBT your consent.
* If you have not received any feedback after 2 weeks, please consider your application as unsuccessful.
Salary: Negotiable

Senior Tester / QA Specialist Reference No: 4170261061 | Johannesburg, South Africa | Posted on: 13 April 2026

PBT Group is seeking a Senior Tester / QA Specialist with strong experience in card and payments environments, particularly with Base24 and related payment processing platforms. The successful candidate will play a key role in ensuring the quality, reliability, and stability of complex transaction systems through structured testing practices. This role requires a strong manual testing background, with exposure to test automation considered advantageous. Experience with Postilion or similar card and payment switching platforms would be highly beneficial. The candidate will work closely with developers, business analysts, and stakeholders to validate system functionality and ensure high-quality delivery within a banking or payments environment.

Key Responsibilities

Test Planning & Execution
- Design, develop, and execute manual test cases and test scenarios for card and payments systems.
- Perform functional, integration, regression, and user acceptance testing (UAT).
- Validate system functionality related to Base24, Postilion, and other card processing platforms.

Quality Assurance & Defect Management
- Identify, log, and track defects using defect tracking tools.
- Collaborate with development teams to analyse, prioritise, and resolve defects.
- Ensure comprehensive test coverage across transaction flows and system functionality.

Test Documentation
- Develop and maintain test plans, test cases, and test reports.
- Ensure traceability between business requirements, test scenarios, and results.

Collaboration & Stakeholder Engagement
- Work closely with business analysts, developers, and product teams to understand requirements.
- Support UAT processes, assisting business stakeholders in validating system functionality.
- Provide clear reporting on testing progress, risks, and quality metrics.

Continuous Improvement
- Contribute to improving testing methodologies and quality assurance standards.
- Support initiatives to introduce or enhance automation testing practices where applicable.

Minimum Requirements
- 5+ years of experience in Software Testing / QA roles.
- Strong manual testing experience within enterprise systems.
- Experience working in banking, payments, or card-based environments.
- Exposure to Base24 or similar card processing platforms.
- Experience with UAT coordination, defect tracking, and test documentation.

Advantageous Skills
- Experience with Postilion or payment switching platforms.
- Exposure to test automation tools or frameworks.
- Understanding of card transaction processing and payment lifecycle flows.
- Experience working within Agile delivery environments.

Key Competencies
- Strong analytical and problem-solving abilities.
- Excellent attention to detail.
- Ability to work effectively in high-volume transaction environments.
- Strong communication and collaboration skills.
- Ability to work independently while contributing to a team environment.

* In order to comply with the POPI Act, for future career opportunities, we require your permission to maintain your personal details on our database. By completing and returning this form you give PBT your consent.
* If you have not received any feedback after 2 weeks, please consider your application as unsuccessful.
Salary: Negotiable

Migration Data Analyst Reference No: 1112150375 | Johannesburg, South Africa | Posted on: 13 April 2026

PBT Group is seeking an experienced Migration Data Analyst to support a large-scale data migration initiative within the investment management environment. The successful candidate will play a critical role in analysing, mapping, validating, and reconciling data as part of system migration and transformation projects. This role requires strong analytical capability, attention to detail, and the ability to work closely with both business and technical stakeholders to ensure data integrity and successful migration outcomes.

Key Responsibilities

Data Migration & Analysis
- Analyse source and target data structures to support migration activities
- Perform detailed data mapping between legacy and new systems
- Identify data gaps, inconsistencies, and transformation requirements
- Develop and execute data validation and reconciliation processes
- Ensure data accuracy, completeness, and integrity throughout the migration lifecycle

Data Quality & Reconciliation
- Investigate and resolve data discrepancies between systems
- Perform reconciliation of large datasets to ensure successful migration
- Define and implement data quality checks and controls
- Support root cause analysis for data-related issues

Stakeholder Engagement
- Collaborate with business stakeholders, data engineers, and project teams
- Translate business requirements into data specifications and mapping logic
- Provide clear communication on data issues, risks, and progress
- Support testing phases including UAT and post-migration validation

Documentation & Governance
- Create and maintain detailed data mapping documentation
- Document transformation rules, business logic, and data definitions
- Ensure adherence to data governance and regulatory requirements
- Support audit and compliance requirements where applicable

Minimum Requirements
- Relevant tertiary qualification in IT, Finance, Data, or a related field
- 5+ years' experience in data analysis, with strong exposure to data migration projects
- Mandatory experience within the investment management / financial services industry
- Strong understanding of financial instruments, portfolios, and investment data

Technical Skills & Experience
- Advanced SQL skills (data extraction, transformation, and analysis)
- Experience with data migration tools and methodologies
- Strong data mapping and data modelling experience
- Experience working with large, complex datasets
- Familiarity with ETL processes and data pipelines

Key Skills & Competencies
- Strong analytical and problem-solving ability
- High attention to detail and data accuracy
- Ability to identify root causes of data issues
- Strong communication and stakeholder engagement skills
- Ability to work under pressure and meet project deadlines

Advantageous
- Experience with data migration in investment platforms or fund administration systems
- Exposure to data quality frameworks and governance practices
- Experience with tools such as Excel (advanced), Power BI, or similar
- Knowledge of regulatory environments within financial services

Contract Details
- Contract role based in Johannesburg
- Hybrid/onsite requirements dependent on client needs
- Competitive market-related rate

* In order to comply with the POPI Act, for future career opportunities, we require your permission to maintain your personal details on our database. By completing and returning this form you give PBT your consent.
* If you have not received any feedback after 2 weeks, please consider your application as unsuccessful.
Salary: R500 to R650

Senior Data Scientist Reference No: 139076348 | Johannesburg, South Africa | Posted on: 13 April 2026

The Data Scientist will work within the Data Science and Analytics team to deliver data-driven insights, predictive models, and automated solutions that address complex business challenges. The role requires strong expertise across the full data science lifecycle, including business understanding, data exploration, feature engineering, model development, validation, and deployment into production environments. By leveraging advanced analytics and machine learning, the Data Scientist will contribute to strategic decision-making, process optimisation, and innovation across the organisation.

Key Responsibilities
- Research and test innovative data science techniques that can be used in predictive and prescriptive analytics solutions.
- Stay informed on emerging data science trends, technologies, and applications relevant to industry advancements.
- Collaborate with business stakeholders to identify problems and opportunities, elicit requirements, and define expected outcomes.
- Partner with stakeholders to design and propose data-driven approaches that address business needs and support new strategies.
- Develop conceptual models and solutions to meet business requirements.
- Work with subject matter experts and data engineers to identify, obtain, and prepare relevant data sources.
- Perform preprocessing of structured and unstructured data, including manipulation, transformation, normalisation, standardisation, visualisation, and feature engineering.
- Apply appropriate data mining, statistical, and machine learning techniques to solve business problems.
- Implement predictive and prescriptive models on large-scale datasets, including distributed computing platforms (e.g. Spark).
- Validate and test models using methods such as back-testing, A/B testing, and scenario modelling.
- Monitor, refine, and maintain models to ensure continued accuracy, relevance, and business value.
- Profile, visualise, and interpret data to inform modelling approaches and provide actionable insights.
- Review existing models and code, ensuring quality and identifying improvements.
- Generate reports, dashboards, and visualisations that effectively communicate insights to technical and non-technical audiences.
- Ensure compliance with applicable policies, procedures, regulations, and professional standards.
- Contribute to the ongoing review and enhancement of internal data science policies and practices.

Experience Required
- Minimum of 5–6 years' experience in data science projects, including the delivery of predictive and prescriptive models.
- Proven proficiency with Python, including machine learning libraries and frameworks (NumPy, Pandas, SciPy stack, Matplotlib, scikit-learn), and experience working in Jupyter notebooks.
- Strong SQL skills and experience handling large, complex datasets.
- Demonstrated application of machine learning and statistical techniques to real-world business problems.
- Experience working in agile development teams.
- Proven ability to operationalise and productionise data science solutions.
- Exposure to high-scale production environments.

Knowledge and Skills Required
- Strong understanding of the data science development cycle, including problem framing, data profiling, feature engineering, model building, evaluation, and productionisation.
- Outstanding problem-solving and analytical abilities, with the ability to conceptualise and test hypotheses.
- Ability to clean, unify, and integrate structured and unstructured datasets.
- Proficiency in data visualisation and communication tools (e.g. Tableau, Power BI, Kibana).
- Solid programming experience (Python / Java) based on prepared designs.
- Familiarity with modern big data platforms and distributed processing technologies (e.g. Hadoop ecosystem, Spark, Kafka, HDFS).
- Strong understanding of ETL processes, data flows, and big data architectures.
- Experience designing comprehensive solutions aligned with business and technical requirements.
- Ability to effectively communicate insights, trends, and correlations to diverse audiences with varying technical expertise.
- Strong report-writing skills with clear visualisations and concise commentary.

Inherent Requirements
- Degree (Honours, Master's, or PhD) in a quantitative field such as Statistics, Mathematics, Computer Science, Actuarial Science, or Engineering.
- Professional certifications in Data Science or related technologies (e.g. Python, Azure, AWS, Spark, Machine Learning, Big Data, Cloud Infrastructure).
- Minimum 5 years' hands-on experience in data science and analytics initiatives.
- Proven ability to apply machine learning techniques and deploy solutions into production.

* In order to comply with the POPI Act, for future career opportunities, we require your permission to maintain your personal details on our database. By completing and returning this form you give PBT your consent.
* If you have not received any feedback after 2 weeks, please consider your application as unsuccessful.
Salary: Negotiable

DevOps Engineer Reference No: 2682347259 | Cape Town, South Africa | Posted on: 13 April 2026

PBT Group is seeking a skilled DevOps Engineer to join a dynamic retail environment, focused on building, automating, and optimising scalable and reliable cloud-based infrastructure. This role requires a hands-on engineer who can drive DevOps best practices, enhance CI/CD pipelines, and support modern application delivery through containerisation and Infrastructure as Code. The ideal candidate will work closely with development, data, and operations teams to ensure efficient, secure, and high-performing systems.

Key Responsibilities

DevOps Engineering & Automation
- Design, implement, and maintain CI/CD pipelines to support efficient software delivery
- Automate build, test, and deployment processes across environments
- Improve system reliability, scalability, and performance through automation

Cloud & Infrastructure Management
- Deploy and manage cloud infrastructure across AWS and/or Azure environments
- Optimise cloud usage for cost, performance, and scalability
- Ensure high availability and disaster recovery capabilities

Containerisation & Orchestration
- Build and manage containerised applications using Docker
- Deploy and manage container orchestration platforms such as Kubernetes
- Support microservices-based architectures

Infrastructure as Code (IaC)
- Develop and maintain infrastructure using tools such as Terraform and/or Ansible
- Ensure consistent and repeatable environment provisioning
- Maintain version-controlled infrastructure configurations

Monitoring, Security & Support
- Implement monitoring, logging, and alerting solutions
- Ensure system security, compliance, and best practices are followed
- Troubleshoot and resolve production issues efficiently
- Collaborate with teams to support releases and deployments

Minimum Requirements
- Relevant IT qualification or equivalent experience
- 4+ years' experience in a DevOps or Cloud Engineering role
- Experience working in fast-paced, customer-centric environments (retail advantageous)
- Willingness to work onsite in Cape Town 4 days per week

Technical Skills & Experience
- Strong experience with cloud platforms (AWS and/or Azure)
- Hands-on experience with containerisation (Docker) and orchestration (Kubernetes)
- Proven experience building and maintaining CI/CD pipelines (Jenkins, GitLab CI/CD)
- Experience with Infrastructure as Code (Terraform, Ansible)
- Strong scripting skills (e.g. Bash, Python)
- Familiarity with version control systems (Git)

Key Competencies
- Strong problem-solving and analytical thinking
- Attention to detail and commitment to quality
- Ability to work collaboratively across teams
- Proactive mindset with a focus on automation and continuous improvement
- Strong communication and stakeholder engagement skills

Advantageous
- Experience in retail or high-transaction environments
- Exposure to microservices architecture
- Knowledge of monitoring tools (e.g. Prometheus, Grafana, ELK stack)
- Experience with security best practices in cloud environments

Why Join PBT Group
PBT Group offers the opportunity to work on impactful, large-scale solutions within leading organisations. You will be part of a collaborative, forward-thinking team that values innovation, automation, and continuous improvement.

* In order to comply with the POPI Act, for future career opportunities, we require your permission to maintain your personal details on our database. By completing and returning this form you give PBT your consent.
* If you have not received any feedback after 2 weeks, please consider your application as unsuccessful.
Salary: R400 to R500

Senior Data Engineer Reference No: 432642153 | Cape Town, South Africa | Posted on: 13 April 2026

PBT Group is seeking a highly experienced Senior Data Engineer to lead the design, development, and optimisation of scalable data solutions. This role will play a key part in modernising legacy data environments and driving the transition toward a cloud-native, big data architecture.   The successful candidate will bring deep technical expertise across both traditional Microsoft-based data stacks and modern data engineering technologies, with a strong focus on building robust, high-performance data pipelines and enabling advanced analytics capabilities.   Key Responsibilities Data Architecture & Engineering Design and implement scalable, high-performance data architectures. Lead the evolution from legacy data platforms to modern cloud-based solutions. Build and maintain robust ETL/ELT pipelines for large-scale data processing. Data Modelling & Performance Optimisation Develop and optimise advanced data models aligned to business requirements. Improve data storage, access, and retrieval performance across platforms. Conduct performance tuning and troubleshooting of complex data pipelines. Modern Data Platform Enablement Drive adoption of modern technologies including Python, PySpark, and Databricks. Support the transition from legacy tools (SSIS, SSRS, SSAS) to scalable big data frameworks. Contribute to the development of analytics-ready data environments. Advanced Analytics & Big Data Enable advanced analytics use cases including machine learning pipelines and predictive modelling. Work closely with data scientists and analysts to deliver high-quality datasets. Leadership & Mentorship Provide technical leadership and mentorship to data engineering team members. Promote best practices in data engineering, coding standards, and solution design. Stakeholder Collaboration Engage with cross-functional teams to understand business needs and translate them into technical solutions. Act as a key interface between technical teams and business stakeholders. 
Data Governance & Quality
- Ensure data quality, integrity, and compliance with governance standards.
- Implement best practices in data management, security, and regulatory compliance.

Technology Environment

Legacy Stack
- VBA
- MS SQL Server
- SSIS
- SSAS
- SSRS

Target / Modern Stack
- SQL
- Python
- PySpark
- Databricks

Core Technical Skills
- Advanced SQL development
- Python programming
- Spark (PySpark & Spark SQL)
- Data warehousing and data modelling
- Big data processing frameworks

Cloud & Data Platform Technologies
- AWS (S3, Lambda, Redshift, EMR, Glue, Athena)
- Event-driven architecture (SQS, SNS, EventBridge)
- API integrations (API Gateway)
- Data governance tools (Unity Catalog, Delta Lake)
- Security & networking (VPC, KMS, Secrets Manager)

Development & Tooling
- AWS CDK
- Docker
- Azure DevOps
- JIRA, Confluence
- Draw.io

Qualifications & Experience
- Bachelor's or Master's degree in Computer Science, IT, or a related field
- 10+ years' experience in data engineering
- Proven experience designing and implementing complex data solutions
- Strong experience in both legacy and modern data environments
- Demonstrated leadership and mentoring capability

Preferred Experience
- Experience with real-time/streaming data pipelines
- Exposure to containerisation and orchestration (Docker, Kubernetes)
- Knowledge of data security and privacy best practices
- Relevant certifications in cloud or data engineering technologies

Ideal Candidate Profile
- Strong problem-solver with a strategic mindset
- Able to bridge legacy and modern data platforms effectively
- Excellent communication and stakeholder engagement skills
- Passionate about data, innovation, and continuous improvement

* In order to comply with the POPI Act, for future career opportunities, we require your permission to maintain your personal details on our database. By completing and returning this form, you give PBT your consent.

* If you have not received any feedback after 2 weeks, please consider your application as unsuccessful.
Salary: R600 to R750

SQL Data Engineer Reference No: 3945265994 | Cape Town, South Africa | Posted on: 13 April 2026

PBT Group, leaders in Business Intelligence, has a vacancy for a Microsoft SQL Data Engineer.

Duties:
- Plan and analyse complex business requirements and implement technology-enabled solutions to address multi-discipline business opportunities/problems.
- Conduct planning, analysis, and design activities in conjunction with other development specialists.
- Participate in analysis of complex business opportunities/problems to deliver designs that meet requirements.
- Participate in estimation of tasks and assist in the development of project plans.
- Code or make modifications to programs of high complexity, according to specifications.
- Conduct medium- to high-complexity evaluations for product releases, stand-alone products, etc.
- Conduct walkthroughs and quality reviews of deliverables.
- Design and develop end-to-end data acquisition processes used to populate data warehouses/data marts and/or create interfaces.
- Provide guidance and mentoring on business intelligence technology and systems in general, especially in the area of ETL processes.
- Participate in the formulation of standards to support the data acquisition development process.
- Design, develop, and execute complex data acquisition or interface routines using an ETL tool, ensuring that business and technical requirements are met.
- Ensure compliance with established policies, standards, and methodologies.

Required Skills:
- Strong MS SQL data engineering experience
- Solid SSIS (SQL Server Integration Services) experience
- Solid SSRS (SQL Server Reporting Services) experience
- Ability to analyse and define requirements
- Database design
- Intimate knowledge of source systems as well as a basic understanding of dimensional models
- Conventional database and data warehouse modelling skills, in order to understand the data warehouse data models
- A sound knowledge of the programming language used to write the data staging programs, or of the ETL tool
- A sound knowledge of SQL, or the language used to access the source databases and the data warehouse from the data staging programs or ETL tool
- A sound knowledge of the capabilities and shortcomings of the ETL tools, in order to exploit or avoid those aspects in the data staging programs
- Pride in work, thoroughness, and attention to detail

Required Qualifications / Training:
- Course on the ETL / related toolset
- Relevant data warehouse and BI solution training is essential
- B.Sc. or related degree is advantageous
- 2+ years' programming experience

* In order to comply with the POPI Act, for future career opportunities, we require your permission to maintain your personal details on our database. By completing and returning this form, you give PBT your consent.

* If you have not received any feedback after 2 weeks, please consider your application as unsuccessful.
Salary: R30000 to R75000

C# .NET Developer Reference No: 1339633698 | Cape Town, South Africa | Posted on: 13 April 2026

PBT Group has an exciting opportunity for an experienced C# .NET Developer to join our software engineering team. The role involves developing and maintaining enterprise applications and services using modern Microsoft technologies and frameworks. The successful candidate will have hands-on experience with C#, .NET Core, Web API, and SQL Server, along with exposure to cloud-based environments (Azure preferred) and CI/CD practices.

Key Responsibilities
- Design, develop, and maintain applications using C#, .NET Core, and ASP.NET Web API.
- Build and consume RESTful APIs and integrate with internal and external systems.
- Collaborate closely with cross-functional teams to gather requirements and translate business needs into technical solutions.
- Write clean, efficient, and well-documented code.
- Conduct unit testing and participate in peer code reviews.
- Support and enhance existing applications, ensuring performance, reliability, and scalability.
- Work within an agile environment, contributing to sprint planning and retrospectives.
- Implement automation, continuous integration, and deployment practices using DevOps pipelines.

Required Qualifications and Experience
- Degree or Diploma in Computer Science, Software Engineering, or a related field.
- 5+ years' experience in C# and .NET development (preferably .NET 5 or newer).
- Solid experience with SQL Server, Entity Framework, and LINQ.
- Experience with frontend frameworks (Angular, React, or Blazor) is advantageous.
- Proficiency with Azure services (App Services, Functions, DevOps Pipelines) preferred.
- Exposure to containerization (Docker, Kubernetes) beneficial.
- Experience with Git, CI/CD, and Agile methodologies.

Key Competencies
- Strong technical and analytical skills.
- Excellent communication and teamwork capabilities.
- Detail-oriented and quality-driven.
- Self-motivated, adaptable, and eager to learn.
- Ability to deliver under tight deadlines.
* In order to comply with the POPI Act, for future career opportunities, we require your permission to maintain your personal details on our database. By completing and returning this form, you give PBT your consent.

* If you have not received any feedback after 2 weeks, please consider your application as unsuccessful.
Salary: Negotiable

Senior Data Quality Analyst Reference No: 782994432 | Cape Town, South Africa | Posted on: 13 April 2026

PBT Group is seeking a Data Quality Analyst who operates beyond traditional operational reporting and takes a strategic, investigative approach to data challenges. This role is suited to someone who can interrogate data issues deeply, unpack ambiguity, and drive meaningful improvements across systems, processes, and teams.

The ideal candidate is a structured thinker and problem solver who can translate vague business concerns (e.g. "the data is wrong") into clear, testable hypotheses, identify root causes, and recommend sustainable solutions that improve data integrity and business outcomes.

Key Responsibilities

Data Quality Analysis & Problem Solving
- Investigate and analyse data issues across systems, pipelines, and reports
- Break down ambiguous data concerns into structured problem statements and hypotheses
- Distinguish between symptoms and root causes of data issues
- Apply analytical techniques to assess data accuracy, consistency, and reliability

Root Cause & Impact Assessment
- Evaluate upstream and downstream controls and dependencies
- Identify whether issues stem from process, system, or behavioural drivers
- Assess and prioritise issues based on risk, business impact, and recurrence

Strategic Improvement & Optimisation
- Recommend and implement improvements to enhance data quality and process efficiency
- Contribute to building sustainable, scalable approaches to managing data quality
- Support continuous improvement initiatives across data ecosystems

Business Engagement & Storytelling
- Translate technical findings into clear, business-friendly insights
- Articulate the business impact of data issues through strong storytelling
- Collaborate with stakeholders across business and technology teams
- Enable better decision-making through improved data understanding

Data Quality & Governance (Advantageous)
- Apply and promote data quality dimensions such as completeness, validity, consistency, and accuracy
- Contribute to data governance and quality frameworks where required
Minimum Requirements
- Relevant degree or diploma in IT, Data, Analytics, or a related field
- 3–7+ years' experience in a data-focused role (Data Analyst, BI Analyst, Data Engineer, or similar)
- Strong SQL and data interrogation skills
- Experience working with large datasets and data pipelines

Key Skills & Experience
- Strong analytical and critical thinking ability
- Ability to distinguish root causes from symptoms
- Understanding of upstream vs downstream data flows and controls
- Experience analysing data across systems, processes, and reports
- Ability to structure ambiguous problems into clear, actionable solutions
- Strong stakeholder engagement and communication skills

Advantageous
- Experience with data quality frameworks and metrics
- Exposure to data governance practices
- Experience in financial services or complex enterprise environments

Key Competencies
- Strategic and investigative mindset
- Strong attention to detail
- Curiosity and problem-solving orientation
- Excellent communication and storytelling ability
- Ability to influence and drive change across teams

* In order to comply with the POPI Act, for future career opportunities, we require your permission to maintain your personal details on our database. By completing and returning this form, you give PBT your consent.

* If you have not received any feedback after 2 weeks, please consider your application as unsuccessful.
Salary: Negotiable

Java Developer Reference No: 163181475 | Cape Town, South Africa | Posted on: 13 April 2026

Back-End Developer (Java)

We are looking for an experienced Back-End Developer to design, build, and maintain reliable back-end solutions using Java technologies. This role focuses on developing secure, scalable systems and APIs that support high-performing applications.

Key responsibilities
- Design, develop, and maintain back-end systems using Java and .NET
- Build and optimise scalable APIs for front-end integration
- Apply security best practices to protect applications and data
- Improve application performance and database efficiency
- Collaborate with front-end teams and key stakeholders
- Troubleshoot, debug, and enhance existing systems

Requirements
- Bachelor's degree in Computer Science, Software Engineering, or a related field
- 5+ years' back-end development experience using Java
- Strong database experience (SQL, PostgreSQL, or MongoDB)
- Experience with microservices and containerisation (Docker, Kubernetes)
- Familiarity with DevOps tools and CI/CD pipelines
- Cloud exposure (AWS or Azure) is advantageous
- Strong analytical and problem-solving skills

In line with the POPI Act, by submitting your application, you consent to PBT retaining your personal details for future career opportunities. If you have not received feedback within two weeks, please consider your application unsuccessful.
Salary: Negotiable

Business Intelligence Business Analyst (BIBA) Reference No: 2722103546 | Cape Town, South Africa | Posted on: 13 April 2026

PBT is a technology-agnostic company specialising in BI solutions. We help businesses harness the power of data to drive informed decision-making and achieve their strategic goals. Our team of experts is dedicated to delivering innovative and customised solutions that enable our clients to gain a competitive edge in today's rapidly evolving market.

Role Description:
As a BI Business Analyst at PBT, you will play a pivotal role in understanding the nature and business impact of requests, performing the required analysis, and collaborating with developers to ensure the successful delivery of functional and technical specifications. You will support the business by identifying opportunities for improvement, creating and maintaining documentation, and conducting high-level testing before deploying solutions.

Requirements:
- A relevant Business Analysis qualification is essential.
- Minimum of 5 years' experience as a Business Analyst.
- Exposure to or experience in the Kimball methodology would be particularly advantageous.
- Strong technical data analysis skills.
- Proficiency in business process modelling.
- Solid understanding of the Software Development Life Cycle (SDLC).
- High level of computer literacy, particularly in Excel.
- Experience with Microsoft BI tools such as Power Pivot and Power BI, and SQL query writing.
- Ability to design and document logical dimensional models.
- Well-developed analytical and problem-solving skills.

Join our dynamic team at PBT and be part of a company that values innovation, teamwork, and personal growth. If you are enthusiastic about leveraging data to drive business success and meet the above requirements, we would love to hear from you.

* In order to comply with the POPI Act, for future career opportunities, we require your permission to maintain your personal details on our database. By completing and returning this form, you give PBT your consent.

* If you have not heard from us in two weeks, please consider your application for this role unsuccessful. However, we will keep your resume on file and reach out if any other suitable opportunity arises in the future.
Salary: Negotiable

Data Engineer (Azure) Reference No: 2992219558 | Cape Town, South Africa | Posted on: 13 April 2026

We are seeking a skilled Azure Data Engineer with 3–5 years of experience to join our growing data team. This role focuses on designing, building, and optimizing scalable cloud-based data solutions using Microsoft Azure technologies. You will be responsible for batch and real-time data ingestion, transformation, and integration to support analytics, reporting, and advanced data use cases.

You will work closely with business stakeholders, data analysts, and platform teams to deliver reliable, secure, and high-performing data pipelines.

Key Responsibilities

Data Engineering & Pipeline Development
- Design, develop, and maintain ETL/ELT pipelines using Azure Data Factory (ADF).
- Build and optimize large-scale data processing solutions using Azure Databricks (Spark).
- Implement real-time data ingestion using Azure Event Hubs.
- Develop and maintain scalable data models for analytics and reporting.
- Perform data transformation, cleansing, and enrichment processes.

Cloud & Data Platform Engineering
- Support and enhance Azure-based data lake and data warehouse architectures.
- Optimize data storage, partitioning, and performance strategies.
- Ensure high availability, scalability, and cost-efficiency of data solutions.
- Automate workflows and support CI/CD for data pipelines.

Data Integration & Streaming
- Integrate structured and unstructured data from multiple enterprise systems.
- Design solutions for both batch and streaming data pipelines.
- Collaborate with integration teams on event-driven architectures.

Data Quality, Governance & Security
- Implement data validation, monitoring, and reconciliation processes.
- Apply data governance and security best practices across Azure services.
- Document data lineage, transformations, and architecture components.

Collaboration & Delivery
- Translate business requirements into scalable technical solutions.
- Partner with analytics and BI teams to deliver trusted datasets.
- Participate in agile delivery cycles and code reviews.
Qualifications & Experience
- Bachelor's degree in Computer Science, Engineering, Data Science, or a related field.
- 3–5 years of hands-on experience in data engineering roles.
- Strong expertise in: Azure Data Factory, Azure Databricks, Azure Event Hubs, SQL (advanced level), Python (preferred).
- Experience designing data lake and data warehouse solutions.
- Strong understanding of ETL/ELT design patterns.
- Experience working with Azure cloud services and security models.
- Knowledge of data modeling (dimensional and normalized models).

Desired Skills
- Experience with Delta Lake and Spark optimization.
- Familiarity with DevOps practices and CI/CD pipelines.
- Exposure to event-driven architecture concepts.
- Strong troubleshooting and performance tuning skills.
- Excellent communication and stakeholder engagement abilities.

Why Join Us?
- Work in a modern Azure-first data ecosystem.
- Opportunity to build both batch and real-time data solutions.
- Exposure to enterprise-scale data architecture initiatives.
- Supportive and forward-thinking data leadership.

* In order to comply with the POPI Act, for future career opportunities, we require your permission to maintain your personal details on our database. By completing and returning this form, you give PBT your consent.

* If you have not received any feedback after 2 weeks, please consider your application as unsuccessful.
Salary: Negotiable

Data Integration Engineer (Azure | Event-Driven Architecture) Reference No: 3164349258 | Cape Town, South Africa | Posted on: 13 April 2026

We are seeking a highly experienced Data Integration Engineer (6–7 years) to design and deliver enterprise-grade, event-driven integration solutions within a modern Azure ecosystem. This role is focused on building scalable, secure, and resilient real-time and API-driven architectures that enable seamless data exchange across enterprise systems.

You will play a key role in shaping integration standards, streaming platforms, and API strategies while collaborating closely with engineering, platform, and data teams.

Key Responsibilities

Event-Driven & Streaming Architecture
- Design and implement event-driven architectures using Azure Event Hubs and Kafka.
- Build and maintain scalable real-time streaming data pipelines.
- Ensure high availability, fault tolerance, and performance optimization of streaming platforms.
- Define messaging standards, schemas, and integration best practices.

API & Serverless Integration
- Develop and manage APIs using Azure API Management (APIM).
- Build serverless integration services using Azure Functions.
- Design RESTful APIs and microservices-based integration solutions.
- Manage API lifecycle, versioning, and security policies.

Data Platform Enablement
- Integrate data into downstream platforms such as Azure Databricks and SQL-based systems.
- Support real-time ingestion into analytics and operational reporting systems.
- Collaborate with data engineering teams on ingestion frameworks.

Governance, Security & Monitoring
- Implement monitoring, logging, and alerting for integration services.
- Apply authentication and authorization standards (OAuth, managed identities, RBAC).
- Ensure secure and compliant data movement across environments.
- Maintain architecture documentation and integration blueprints.

Collaboration & Delivery
- Partner with enterprise architects and engineering teams to define integration strategy.
- Drive integration initiatives from design to production deployment.
- Provide technical guidance and mentorship to junior engineers.
Qualifications & Experience
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 6–7 years of experience in integration engineering or middleware development.
- Strong hands-on expertise in: Azure Event Hubs, Kafka, Azure API Management (APIM), Azure Functions, SQL, Azure Databricks.
- Experience designing event-driven and microservices architectures.
- Strong understanding of distributed systems and messaging patterns.
- Experience working in cloud-native Azure environments.

Desired Skills
- Experience with CI/CD pipelines and Infrastructure as Code (ARM, Bicep, Terraform).
- Knowledge of DevOps and monitoring tools.
- Strong performance tuning and troubleshooting capabilities.
- Excellent communication and stakeholder management skills.

Why Join Us?
- Opportunity to shape enterprise-wide event-driven architecture.
- Work within a modern Azure-first integration ecosystem.
- Influence technical standards and integration strategy at scale.

* In order to comply with the POPI Act, for future career opportunities, we require your permission to maintain your personal details on our database. By completing and returning this form, you give PBT your consent.

* If you have not received any feedback after 2 weeks, please consider your application as unsuccessful.
Salary: Negotiable

Data Analyst / Visualization Engineer (Power BI & Azure) Reference No: 2061994532 | Cape Town, South Africa | Posted on: 13 April 2026

We are seeking a Data Analyst / Visualization Engineer with 3–5 years of experience to develop high-quality dashboards and analytical solutions within a modern Azure data platform. This role combines strong SQL expertise, data validation and testing capability, and advanced Power BI development skills to ensure trusted, business-ready insights.

You will collaborate closely with business stakeholders and data engineering teams to deliver accurate, validated, and visually compelling reporting solutions.

Key Responsibilities

Data Analysis & Visualization
- Develop interactive dashboards and reports using Power BI.
- Build and optimize semantic data models within Power BI.
- Write advanced SQL queries to extract and transform data from Azure-based platforms.
- Develop DAX measures and calculated fields aligned to business KPIs.

Azure Data Platform Collaboration
- Consume curated datasets from Azure Data Factory and Azure Databricks environments.
- Work with Azure SQL and data lake structures.
- Support reporting performance optimization within Azure architecture.

Data Validation & Quality Assurance
- Perform rigorous data testing, validation, and reconciliation.
- Validate KPIs, metrics, and calculation logic before release.
- Identify and resolve discrepancies across source systems.
- Document business rules and data transformation logic.

Stakeholder Engagement & Delivery
- Translate business requirements into reporting and visualization solutions.
- Present insights clearly to executive and operational stakeholders.
- Provide continuous enhancements to dashboards and reporting frameworks.
- Support self-service BI enablement.

Qualifications & Experience
- Bachelor's degree in Data Analytics, Computer Science, Finance, or a related field.
- 3–5 years of experience in BI development or data analytics roles.
- Strong expertise in: Power BI (data modeling, DAX, visualization best practices), advanced SQL, data testing and validation methodologies.
- Experience working with Azure data platforms.
- Understanding of data warehousing and dimensional modeling concepts.

Desired Skills
- Experience performance-tuning Power BI datasets.
- Knowledge of Azure Data Factory and Databricks data flows.
- Strong analytical mindset and attention to detail.
- Excellent communication and business engagement skills.

Why Join Us?
- Opportunity to shape enterprise reporting standards.
- Work in a modern Azure-driven analytics ecosystem.
- Deliver insights that directly impact strategic business decisions.

* In order to comply with the POPI Act, for future career opportunities, we require your permission to maintain your personal details on our database. By completing and returning this form, you give PBT your consent.

* If you have not received any feedback after 2 weeks, please consider your application as unsuccessful.
Salary: Negotiable

Azure Data Engineer Reference No: 970717537 | Cape Town, South Africa | Posted on: 13 April 2026

PBT Group is seeking an Azure Data Engineer to join a modern, highly analytical environment within the SRS Data Science division. This role is ideal for a technically strong professional with 2–5 years of experience, a solid academic foundation, and a passion for building scalable, cloud-native data solutions.

The successful candidate will contribute to both internal data initiatives (including reporting automation) and high-impact client projects, supporting advanced analytics and AI-driven solutions built on the Azure ecosystem.

Key Responsibilities
- Partner with business analysts and stakeholders to translate business needs into technical solutions
- Design and implement scalable data ingestion and transformation frameworks
- Develop robust ETL/ELT pipelines using Azure-native technologies
- Work extensively with Azure Data Factory, Azure Databricks, and AI-focused tooling
- Produce high-quality technical documentation, including architecture diagrams and runbooks
- Support ongoing enhancements, maintenance, and optimisation of data solutions
- Collaborate within cross-functional Agile delivery teams

Minimum Requirements
- 2–5 years' experience in Data Engineering or related roles
- Bachelor's degree in a relevant field (e.g. Computer Science, Data Engineering, Engineering, Statistics, Mathematics, Information Systems, etc.)
- Hands-on experience with Azure data platform technologies

Core Technical Skills
Strong working knowledge of:
- Azure Data Factory (ETL / data orchestration)
- Azure Databricks (data processing / analytics workloads)
- Python (especially Pandas / PySpark) or R
- Cloud-based data pipelines and transformations
- Complex data structures and scalable data design

Preferred Experience & Knowledge
Advantageous but not strictly required:
- Exposure to AI / Machine Learning infrastructure (AI Foundry / ML deployment concepts)
- CI/CD concepts for data pipelines
- Git / version control workflows
- Agile delivery environments
- JIRA / Azure Boards or similar tools
- Designing resilient, modular data architectures

Key Competencies
- Strong analytical and problem-solving skills
- Ability to work independently in a remote environment
- Clear communication and documentation ability
- Comfort working with both technical and non-technical stakeholders
- Structured, quality-driven approach to engineering work

Work Arrangement
Fully Remote Role (100%). This position is entirely remote, offering flexibility while working within a highly collaborative and technically progressive team.

Environment & Culture
The team operates in a modern, innovation-driven environment with a strong emphasis on:
- Cloud-native infrastructure
- Advanced analytics and AI
- Engineering best practices
- Continuous learning and professional growth
- Collaborative Agile delivery

* In order to comply with the POPI Act, for future career opportunities, we require your permission to maintain your personal details on our database. By completing and returning this form, you give PBT your consent.

* If you have not received any feedback after 2 weeks, please consider your application as unsuccessful.
Salary: 650

Senior C#.Net Analyst Developer Reference No: 3470161959 | Cape Town, South Africa | Posted on: 13 April 2026

The Senior Analyst Developer is responsible for designing and implementing advanced technical solutions aligned to business requirements. This role focuses on high-quality software development, technical leadership, and contributing to best practices and innovation within the development environment.

The successful candidate will play a key role in solution design, development, and delivery, while also mentoring junior team members and collaborating with stakeholders to ensure robust, scalable, and efficient systems.

Key Responsibilities

Development & Delivery
- Design, develop, test, and debug software solutions in line with functional and technical requirements
- Contribute to solution architecture and define technical frameworks for new applications and integrations
- Analyse, troubleshoot, and resolve application and system issues
- Ensure adherence to coding standards, architectural principles, and best practices
- Develop and maintain technical documentation
- Perform unit testing and support integration and regression testing
- Manage and contribute to source control and release processes
- Deliver enhancements aligned with existing application architecture
- Provide alternative and innovative technical solutions where applicable

Technical Leadership & Mentorship
- Mentor and guide junior developers to support team capability growth
- Contribute to technical standards, frameworks, and best practices
- Participate in technical design discussions and forums
- Ensure quality assurance standards are met across deliverables
- Drive a culture of continuous improvement and technical excellence

Systems Stability & Support
- Support deployment processes, including creating deployment artefacts and instructions
- Collaborate with cross-functional teams to ensure stable application and database environments
- Monitor system performance and optimise applications post-deployment
- Provide support, including standby duties where required

Collaboration & Stakeholder Engagement
- Work closely with business analysts, testers, and other technical teams throughout the SDLC
- Contribute to application and solution architecture decisions
- Support database design, optimisation, and implementation
- Participate in proof of concept (POC) initiatives
- Provide technical input into future application and technology strategies
- Deliver ad hoc reporting and analysis where required

Minimum Requirements
- Relevant IT qualification (Diploma or Degree); BSc in Computer Science or similar preferred
- Minimum 8 years' experience in software development within an object-oriented environment
- Minimum 8 years' experience with .NET and SQL database design and development
- Proven experience in solution design and system integration

Technical Skills & Experience
- Strong proficiency in C# / .NET (Core and Framework)
- Solid experience with SQL Server and database design
- Understanding of Object-Oriented Programming (OOP) and SOLID principles
- Experience with Design Patterns and Test-Driven Development (TDD)
- ORM frameworks such as Entity Framework
- Exposure to RESTful APIs / Web Services
- Experience with CI/CD pipelines and DevOps practices
- Version control using Git

Additional Technologies:
- ASP.NET MVC
- WCF
- Caching mechanisms
- JavaScript frameworks (Angular, jQuery, Knockout)
- HTML & CSS
- API testing frameworks

Advantageous: Experience within financial services or investment environments

Key Competencies
- Innovation: drives innovation through experimentation and continuous improvement; encourages creative problem-solving and new approaches
- Collaboration: works effectively across teams and encourages open communication; values diverse perspectives and team contributions
- Adaptability: remains resilient and solution-focused in challenging environments; adapts quickly to change and supports others through transitions
- Client Focus: understands and aligns solutions to business and client needs; ensures high levels of service delivery and stakeholder satisfaction
- Results Driven: delivers high-quality outcomes within deadlines; maintains a strong focus on performance, accountability, and delivery

* In order to comply with the POPI Act, for future career opportunities, we require your permission to maintain your personal details on our database. By completing and returning this form, you give PBT your consent.

* If you have not received any feedback after 2 weeks, please consider your application as unsuccessful.
Salary: Negotiable

Scrum Master Reference No: 3805549984 | Cape Town, South Africa | Posted on: 03 March 2026

PBT Group is seeking a motivated and experienced Scrum Master to join its IT Product and Enablement division. This role is responsible for fostering Agile principles (Scrum and Kanban), enabling high-performing delivery teams, and ensuring the successful delivery of customer-centric IT products and solutions. The Scrum Master will operate as a servant-leader, driving collaboration, removing impediments, and ensuring alignment between delivery teams and broader strategic objectives.

Key Responsibilities
Agile Facilitation & Coaching
- Guide and coach software development teams in applying Scrum and Kanban frameworks
- Facilitate Agile ceremonies, including sprint planning, daily stand-ups, reviews, and retrospectives
- Promote Agile best practices and continuous improvement within the team
Sprint & Delivery Management
- Manage and maintain sprint boards using JIRA
- Ensure accurate backlog refinement and prioritisation in collaboration with Product Owners
- Track progress, identify risks, and remove impediments to ensure sprint goals are met
- Drive predictable, high-quality delivery in a fast-paced environment
Stakeholder & Team Enablement
- Foster collaboration between development teams, product owners, and stakeholders
- Support teams in delivering value aligned to strategic business goals
- Act as a servant-leader, empowering teams while ensuring accountability
- Monitor team health and performance, addressing challenges proactively

Minimum Requirements
Education & Certification
- Certified Scrum Master (CSM, PSM, or equivalent) – mandatory
Experience
- 3–5+ years' experience as a Scrum Master in software development environments
- Proven experience leading Agile teams using Scrum and Kanban
- Experience delivering IT products in a fast-paced, iterative environment
- Strong understanding of the Software Development Life Cycle (SDLC)

Technical & Professional Skills
- Proficiency in Agile project management tools, particularly JIRA
- Strong facilitation, coaching, and mentoring capability
- Excellent conflict resolution and problem-solving skills
- High emotional intelligence and adaptability
- Strong organisational and time management skills
- Ability to drive delivery discipline without compromising team morale

Preferred Experience
- Advanced Agile certifications (e.g., SAFe, Advanced Scrum Master)
- Familiarity with software development tools and DevOps practices
- Exposure to financial services or financial technology environments

Key Attributes
- Proactive and delivery-focused mindset
- Passion for Agile transformation and continuous improvement
- Strong communicator across technical and non-technical audiences
- Ability to inspire and mentor cross-functional teams
- Committed to innovation and modern ways of working

* In order to comply with the POPI Act, for future career opportunities, we require your permission to maintain your personal details on our database. By completing and returning this form you give PBT your consent.
* If you have not received any feedback after 2 weeks, please consider your application as unsuccessful.
Salary: R300 to R400

Data Warehouse Developer Reference No: 1192094358 | Centurion, South Africa | Posted on: 19 February 2026

PBT Group is seeking an experienced Senior Data Warehouse Developer to join our team. This role plays a key part in designing, developing, and maintaining enterprise-level data warehouse solutions, ensuring data integrity, accessibility, and performance across large-scale financial systems. The ideal candidate will have a strong technical foundation in SQL, ETL/ELT development, and data modelling, with proven experience delivering reliable and scalable data warehouse solutions. The role also involves translating complex datasets into actionable insights through effective visualisation and reporting tools such as Power BI.

Key Responsibilities
Data Warehouse Development
- Design, develop, and maintain robust data warehouse solutions.
- Develop and optimise ETL/ELT processes to ensure efficient data extraction, transformation, and loading.
- Configure real-time data replication from source systems using tools such as Stelo SQDR or Qlik Replicate.
- Perform data quality checks, validations, and reconciliation to maintain accuracy and integrity.
- Convert legacy Excel reports into Power BI dashboards and reports.
- Implement data governance, security, and compliance best practices.
- Develop and document technical solutions that are scalable, maintainable, and reusable.
Reporting and Analytics
- Build interactive Power BI dashboards and reports aligned with business requirements.
- Translate technical data into clear, actionable insights for both technical and non-technical stakeholders.
- Ensure accurate data attribution and validation for reporting and cost analysis.
- Perform data analysis and visualisation to support business decision-making.
Collaboration and Support
- Engage with business analysts, data scientists, and technical stakeholders to clarify data needs and design solutions.
- Participate in operational support and troubleshooting of existing data warehouse components.
- Provide mentorship and technical guidance to junior developers.
- Collaborate across teams to drive data-driven initiatives and process improvements.

Skills and Competencies
Technical Expertise
- Strong proficiency in MS SQL Server (2022) and T-SQL.
- Experience with ETL tools (SSIS, Azure Data Factory, Informatica, Talend).
- Solid understanding of data warehouse methodologies (Kimball, Inmon).
- Intermediate to advanced experience with Power BI for data visualisation.
- Exposure to data replication tools (e.g., Stelo SQDR, Qlik Replicate).
- Data transformation, modelling, and integration experience.
- Knowledge of cloud data platforms (Azure Synapse, AWS Redshift, BigQuery) advantageous.
Soft Skills
- Excellent analytical thinking and problem-solving abilities.
- Strong communication skills, both written and verbal.
- High attention to detail with a commitment to quality and accuracy.
- Effective time management and prioritisation under tight deadlines.
- Collaborative team player with a proactive, solutions-driven mindset.

Qualifications and Experience
- Bachelor's degree in Computer Science, Information Systems, Mathematics, Economics, or a related field.
- 5+ years' experience as a Data Warehouse Developer or in a similar role.
- Proven experience in data modelling (star/snowflake schemas).
- Experience with data visualisation tools (Power BI, Tableau, QlikView).
- Familiarity with version control systems (Git).
- Cloud certification (Azure, AWS, GCP) is advantageous.
- Experience with Python or PowerShell scripting beneficial.
- Exposure to big data technologies (Spark, Hadoop) advantageous.

Behavioural Competencies
- Examining and interpreting information critically
- Generating insights and innovative ideas
- Producing high-quality, accurate deliverables
- Articulating complex data clearly
- Collaborative and adaptive team approach
- Strong accountability and ownership mindset

* In order to comply with the POPI Act, for future career opportunities, we require your permission to maintain your personal details on our database. By completing and returning this form you give PBT your consent.
* If you have not received any feedback after 2 weeks, please consider your application as unsuccessful.
Salary: 350

Informatica Data Engineer Reference No: 3021938427 | Johannesburg, South Africa | Posted on: 19 February 2026

PBT Group has an urgent requirement for an Informatica Developer. The Data Integration Developer is responsible for the design, build, and deployment of the project's data integration component. A typical data integration effort involves multiple Data Integration Developers developing Informatica mappings, executing sessions, and validating the results.

We are looking for someone with strong, hands-on experience in:
- Informatica PowerCenter in large enterprise environments
- Oracle (data warehousing, complex transformations, performance optimisation)
- The ability to work independently and contribute from day one

Previous banking / financial services exposure would be a strong advantage.

* In order to comply with the POPI Act, for future career opportunities, we require your permission to maintain your personal details on our database. By completing and returning this form you give PBT your consent.
* If you have not received any feedback after 2 weeks, please consider your application as unsuccessful.
Salary: Negotiable

Senior Java Developer Reference No: 1280169108 | Cape Town, South Africa | Posted on: 22 January 2026

PBT Group has an opportunity for a Senior Java Developer. We are looking for an exceptional individual to build Java integration services and applications, with an enthusiasm for solving interesting technical challenges in a banking environment. You will be involved in all aspects of development, playing a critical role in design, planning, development, and deployment.

The ideal candidate will not just be an outstanding Java developer, but will bring a mindset of creativity and innovation, balancing bleeding-edge technology development with a relentless pursuit of timely product engineering delivery. The role requires ownership, logical thinking, estimation of your own work, coding, unit testing, troubleshooting, and performance optimisation. The candidate will work in an Agile (Scrum) environment and will be allocated and co-located to a small agile development team (squad), collaborating with team members to understand the problem and to define, design, and ship new features through the development lifecycle. This includes brainstorming, contributing new ideas, conceiving innovative strategies, and implementing solutions to difficult problems. Must be able to work independently, effectively, and efficiently in a collaborative agile environment with scrum masters, solution engineers, designers, and developers.

Key Roles, Responsibilities and Skills/Competencies:
- Strong, hands-on technical/software engineering background
- Strong application/software development or programming background in Java
- Good experience in system and application architecture, design, development, implementation, and deployment (end-to-end); ability to work on different tiers of the application
- Object-oriented design and the MVC pattern
- Experience with web technologies (JavaScript frameworks – preferably Angular 4 – JavaScript, CSS, HTML5, etc.)
- Solid experience with J2EE (Servlet, JSP, JDBC, JMS, EJB), Spring Framework, JPA, Hibernate, and open-source frameworks
- Solid experience with SOA architecture and the related integration protocols (e.g. web services (SOAP/XML), REST & JSON, and MQ), as well as the related SOA security requirements/models
- Solid application database management practices in a high-volume Java environment (SQL, DB2, and NoSQL – Cassandra/MongoDB)
- Solid JEE/WebSphere foundation experience, especially WebSphere Application Server, JBoss, Linux, virtualisation technologies, and caching technologies
- Design, build, and run of IBM WebSphere Application Server infrastructure solutions; migration from proprietary application servers to WAS; performance tuning and troubleshooting of WAS infrastructure
- Good experience with integrated system environments
- Practical experience in a high-volume banking environment (e.g. application clustering, scaling, multi-threading, session management, etc.)
- Experience with re-use and standardisation, security considerations, and deployment architecture, e.g. automated application builds, software configuration management and tools, etc.
- Experience with application integration challenges in a large corporate environment with ESBs, e.g. WebSphere Message Broker, DataPower, MQ Series, and API Connect
- Proven ability as a problem-solver
- Self-driven, self-starting technology leader, able to work independently
- Working experience with the following tools/IDEs: Eclipse, IntelliJ, Git, Maven, Jenkins, SonarQube, Nexus
- Debugging and troubleshooting; writes well-documented and maintainable code
- Passion for software excellence and a quality-driven approach
- Prepared to bring new ideas to the workplace, but also to accept how things have been done and the reasons for doing them that way

Experience:
- 10+ years Java application programming/development experience
- 8+ years JEE experience
- 5+ years application server experience, i.e. WebSphere Application Server
- 5+ years relational database experience (DB2 preferred)

Qualifications/Certification:
- B-degree in Computer Science or a related technical field

* In order to comply with the POPI Act, for future career opportunities, we require your permission to maintain your personal details on our database. By completing and returning this form you give PBT your consent.
* If you have not received any feedback after 2 weeks, please consider your application as unsuccessful.
Salary: 80000

Web Developer Reference No: 3627491287 | Cape Town, South Africa | Posted on: 19 January 2026

Are you a talented and passionate web developer with over 5 years in the web development space? We are seeking a skilled individual to join our dynamic team and play a pivotal role in designing and developing our cutting-edge platforms.

Must have:
- Over 3 years of web development experience
- Experience managing a team of junior web developers

Skills required:
- jQuery
- JavaScript
- CSS 3
- HTML 5
- Knowledge of common JavaScript frameworks

About the role:
- Support existing and develop new functionality and components
- Help write and optimise in-application SQL statements
- Design and develop scalable application solutions
- Debug and resolve application issues
- Interpret business requirements
- Prepare documentation and specifications
- Collaborate with other team members and stakeholders
- Oversee the junior development team
- Assist with testing efforts

* In order to comply with the POPI Act, for future career opportunities, we require your permission to maintain your personal details on our database. By completing and returning this form you give PBT your consent.
* If you have not received any feedback after 2 weeks, please consider your application as unsuccessful.
Salary: Negotiable

Senior Full Stack Web Developer Reference No: 1746502919 | Johannesburg, South Africa | Posted on: 16 January 2026

The Senior Full Stack Web Developer will work collaboratively within delivery teams and across the wider business to design, build, and support high-quality digital solutions. The role focuses on delivering exceptional customer experiences while enabling scalable, secure, and efficient platforms within the financial services and investment domain.

Key Responsibilities
- Collaborate with proposition, operations, change, and development teams to solve business and system challenges.
- Analyse client requirements and produce detailed technical specifications covering data, rules, logic, design, development, and implementation.
- Translate high-fidelity UI/UX designs into fully responsive, high-performing web applications.
- Own projects end-to-end, from solution design through development, deployment, and production support.
- Provide input into impact analysis, development estimates, technical feasibility, and risk assessments.
- Build reusable components, libraries, and frameworks for future scalability.
- Ensure UI/UX designs are technically feasible and optimised for speed, performance, and scalability.
- Collaborate closely with backend teams to ensure seamless front-end and back-end integration.
- Conduct peer code reviews to maintain code quality and best practices.
- Support production environments by resolving defects, incidents, and ad hoc requests timeously.
- Perform unit testing and implement fixes following business or system incidents.
- Assist with technical architecture concepts and contribute to evolving development standards.
- Participate in Agile delivery processes and contribute to continuous improvement initiatives.
- Build and maintain strong relationships with key business and technical stakeholders.

Minimum Qualifications
- Relevant degree in IT, Computer Science, Finance, Economics, Statistics, Investment Management, or Business Management.
- Excellent verbal and written communication skills.
- Multilingual capability is advantageous.
- Minimum of 5 years' experience in a database or application development environment.

Required Experience
At least 5 years' experience within:
- Financial adviser firms
- Financial services customer environments
- Investment platform businesses
In addition:
- Proven experience delivering high-quality customer-facing solutions.
- Strong problem-solving skills in complex business environments.
- Experience working in fast-paced, delivery-focused teams.
- Strong collaboration across cross-functional teams.
- Familiarity with Agile methodologies and JIRA-based workflows.

Technical Skills
Core Skills
- Microsoft SQL Server (T-SQL)
- C# / .NET
- HTML, CSS, JavaScript
- Responsive design frameworks (Bootstrap or similar)
Beneficial / Advantageous
- React.js
- Angular
- .NET Core
- Entity Framework Core
- API development (RESTful services)
- Solid understanding of architectural design patterns

Tools & Platforms
- Visual Studio
- Git / Bitbucket
- JIRA / Confluence
- Azure DevOps
- Octopus Deploy

Knowledge Requirements
The successful candidate will demonstrate:
- Deep understanding of investment and life products (subject-matter-expert level).
- Strong knowledge of financial services regulations affecting investment platforms.
- Insight into financial adviser operations and client servicing models.
- Experience delivering solutions that enhance adviser and customer experiences.
- Understanding of the competitive investment platform landscape.
- Knowledge of change management processes.
- Strong awareness of current information security (infosec) practices.
- Exposure to Agile and DevSecOps practices.

Key Competencies
- Communicating with Impact: clear, concise verbal and written communication.
- Customer Service Excellence: ownership and accountability for high-quality delivery.
- Analytical Thinking: structured problem-solving and logical reasoning.
- Driving for Excellence: high attention to detail and quality outcomes.
- Entrepreneurial & Commercial Thinking: business-aware decision-making.
- People Skills: ability to influence and collaborate across diverse teams.
- Resilience: maintains focus and composure in high-pressure environments.
- Teamwork & Collaboration: works effectively towards shared goals.
- Persuasion & Influence: aligns stakeholders toward desired outcomes.
- Leading Change: encourages innovation and continuous improvement.
- Trust & Integrity: acts with honesty, consistency, and professionalism.

* In order to comply with the POPI Act, for future career opportunities, we require your permission to maintain your personal details on our database. By completing and returning this form you give PBT your consent.
* If you have not received any feedback after 2 weeks, please consider your application as unsuccessful.
Salary: Negotiable