Archive for November 29th, 2017

  • True digital enlightenment lies in data

    PBT Group thought leadership

The start of a year inevitably brings discussions around upcoming trends, with people debating what businesses should take on board to remain relevant. This is especially true within the technology space, as businesses continue to move towards becoming digitally driven.

Given this, as digital continues to lead conversations around boardroom tables, it is becoming evident that the success of digital business lies in the accurate and timeous analysis of data, which can be leveraged effectively to enable digital operations.

For this reason, many companies are returning to the ‘drawing board’ and revisiting the fundamentals of their data management strategy for digital success. Key to this is investigating the existing data foundation a business has in place.

However, with so much data exchange happening within a company, decision makers often focus on investing in technology, under the assumption that the data foundation is running effectively. Yet, with ongoing changes to governance and privacy, and the growing focus on data security, existing data foundations need to be enhanced.

To truly achieve digital transformation, concerted efforts to formalise and streamline data governance within the data foundation should become a key focus for executives. If we consider the Protection of Personal Information (PoPI) Act, which is set to come into effect at some point in 2018, there is a clear urgency driving this.

As part of this process, the business has the opportunity to address, and in certain areas enhance, its data management practices – meeting governance and security requirements. Furthermore, with a robust data foundation in place, organisations have the potential to improve the quality of the data to be analysed, which benefits the business in the results produced.

The data foundation is a critical component of the entire data strategy a business executes. In fact, the foundation will determine the capability of the business to extract insights from the information coming into the business from various sources.

And of course, a vital component of this is utilising the right skills to get the job done. This means taking on board the expertise of data architects and engineers, who have the experience and insight to shape data foundations in a way that meets current business requirements.

The more focus a business places on digital, the more attention it will need to give to its data, as data will lead its strategies going forward and remain an important tool in achieving digital victory.


  • Welcoming a data analysis reality

    PBT Group thought leadership

    Analysing data has become integral to the success of any business in the digital environment. But despite its importance, there are still those who struggle to come to terms with big data and how to reap the rewards of its associated benefits.

Big data is hardly a new phenomenon. Many cite the arrival of tablets and smartphones as the defining moment that set in motion our reliance on data, and on extracting insights from it to develop more customer-centric solutions. But even before that, we had the internet, which created an enabling environment for people to generate massive amounts of information, whether by creating content or providing feedback.

    So, if this has been around for many years, why the sudden interest in big data? For one, companies are realising that they need to find value in the massive amounts of data they have stored in their corporate back-end, whether that is hosted on-site or in the cloud.

    In data-rich environments like finance it is easy to see why better analysis can result in more bespoke solutions. However, big data transcends industry and even company size.

    A man for all seasons

    No longer the exclusive reserve of enterprises, data analysis has become accessible to organisations of all sizes. From start-ups to SMEs, being able to extract value from data is a critical function of a competitive organisation. Cost is no longer an obstacle to overcome.

In part, the likes of cloud computing and machine learning can be thanked for this. As more companies embrace and adopt newer technologies, so too do they become more cost-effective for others to implement. Using the cloud as a platform for growth also means the organisation has additional resources to perform data analysis and understand the specific needs of its various customer segments.

A key consideration when it comes to data analysis is whether there is commitment from the C-suite. Having the organisation support the understanding of data means the different divisions inside a business can go about the task at hand more effectively.

    Regulatory affairs

    Even though data analysis provides numerous opportunities to differentiate the organisation from its competitors, complying with regulatory requirements should remain top of mind.

Given that the Protection of Personal Information (PoPI) Act will be law from next year, companies need to carefully consider how they store, analyse and share data. For some, it might mean re-evaluating existing processes, whilst others might just need to fine-tune some elements. Irrespective, the Act needs to be the ‘guiding light’ when it comes to analysis.

    Organisations need to understand that the age of data analysis is well and truly underway. How best they decide to embark on the journey remains up to them, but they need to embrace it or risk losing their relevance.

  • Getting to grips with data

    This article appeared on Brainstorm Magazine online, on the 31st October 2017

    Source: Brainstorm Magazine online

  • Dreaming the Future of Africa

By Jessie Rudd, Technical Business Analyst at PBT Group

This article was first published on ITWeb, on the 7th November 2017

    Source: ITWeb

  • Ensure the quality of your data

    PBT Group thought leadership

    Data is all around us. In the digital landscape, it has become the currency to do business and remain relevant. But what of its quality and how best to effectively manage it? After all, if your underlying foundation is not sound, there is very little hope of transforming business processes successfully.

    On its own, data means very little. This is even more so the case today where data sources have increased exponentially thanks to the likes of cloud computing, mobility, social networking, and the Internet of Things. It is no longer just a case of receiving data from email, customer forms on websites and call centre agents. Companies are inundated with data coming into the business from all sides.

    Involving business

And, unlike in the past, this is not just a technology discussion. With terms like structured and unstructured data now part of boardroom discussions, and all C-suite executives contributing viewpoints, companies need to find a way to effectively manage their data and ensure its quality.

Considering this, Data Quality Management and Master Data Management are two critically important concepts that should be part of any business strategy in the digital world. Like one litre of oil contaminating one million litres of fresh water, dirty data can severely compromise effective analysis inside an organisation.

Data Quality Management ensures that the data reflects and represents real-life truths comprehensively and consistently throughout complete data sets. Master Data Management enables us to manage the contextual data relating to the key data entities, to set standards, ensure consistency and increase confidence in the interpretation of data trends and analytics.
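To make the distinction concrete, consider a minimal sketch in Python. The master data set, field names and validation rules below are hypothetical, purely for illustration: Master Data Management supplies the single agreed reference list (here, country codes), while a Data Quality Management check validates each incoming record against that reference and its own consistency rules.

```python
# Hypothetical master data: the agreed, governed list of country codes.
MASTER_COUNTRIES = {"ZA": "South Africa", "KE": "Kenya", "NG": "Nigeria"}

def check_record(record):
    """Return a list of data quality issues found in one customer record."""
    issues = []
    # Consistency rule: an email address must be present and well-formed.
    if not record.get("email") or "@" not in record["email"]:
        issues.append("invalid email")
    # Master data rule: the country code must exist in the reference list.
    if record.get("country") not in MASTER_COUNTRIES:
        issues.append("unknown country code: %s" % record.get("country"))
    return issues

records = [
    {"email": "thandi@example.com", "country": "ZA"},
    {"email": "no-at-sign", "country": "XX"},
]

for r in records:
    problems = check_record(r)
    print(r["email"], "->", "OK" if not problems else "; ".join(problems))
```

In practice both disciplines involve far more (stewardship, lineage, profiling tools), but the principle is the same: master data defines what ‘correct’ means, and quality checks measure incoming data against it.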

    Embracing new technology

While the data coming into the organisation needs to be closely scrutinised for quality, just because it is considered ‘good’ in January does not mean it will continue to be relevant to the business by the time December comes.

Fortunately, machine learning provides deeper data analysis capabilities that can overcome data quality gaps in historic datasets. Even though many companies still rely on data scientists or analysts to accurately prepare and interrogate the information stored in back-end systems, these newer technologies provide the agility and speed to deliver insights to more people inside the organisation.

    On the one hand, this means different departments can take more nuanced insights from the data than just receiving a high-level overview. On the other hand, it creates an enabling environment for the company to break the traditional silo approach of data analysis.

    Continually evolving

Irrespective of the technology or approach used, decision-makers across the organisation need to remember that neither Data Quality Management nor Master Data Management is a ‘fire-and-forget’ solution.

    For these approaches to work effectively, the company must continually review and update its internal best practice to ensure aspects around data stay aligned to the corporate strategy. The importance of data is only going to increase as organisations strive to find that competitive advantage in a connected world. The differentiation will come in terms of who manages their data best and who can ensure its quality for business success.


  • Building an effective data foundation

    This article was published on IT News Africa, on the 30th October 2017

    Source: IT News Africa


  • Data architecture goes agile

    By Masindi Mabogo, Director at PBT Group

    This article was first published on ITWeb, on 11th October 2017

    Source: ITWeb