News & events

  • Reconsider the role of business ‘chiefs’

    This article appeared on IT Online, on 14th September 2017

    Source: IT Online

  • Business Leader: Yolanda Smit

    This business profile was first published on Engineering News, on 8th September 2017

    Source: Engineering News


  • Considering the role of ‘Chiefs’ in business

    This article appeared on CAJ News Africa, on 6th September 2017

    Source: CAJ News Africa      


  • The Big Data exploration

    By Julian Thomas, Principal Consultant at PBT Group

    This article was first published on ITWeb, on 6th September 2017

    Source: ITWeb

  • Are insurance companies seeing Big Data as a competitive opportunity?

    This article appeared on Cover SA online, on 29th August 2017

    Source: Cover SA online

  • Business skills critical to leveraging technology effectively

    This article was first published on Engineering News, on 25th August 2017

    Source: Engineering News

  • Incentive Assertiveness – The future success of gamification

    This article appeared on IT News Africa, on 10th August 2017

    Source: IT News Africa

  • Fluffy, white digital clouds?

    By Jessie Rudd, Technical Business Analyst

    Source: ITWeb         


  • The relevancy of big data for investors


    By Yolanda Smit, Regional Director at PBT Group


    Source: CAJ News Africa

  • The relevancy of big data for investors

    By Yolanda Smit, Regional Director at PBT Group

    Source: Techno Africa


  • Hype Cycle for Emerging IoT

    By Masindi Mabogo, Director at PBT Group

    Source: Talk IoT

  • The rise and fall of IoT

    By Masindi Mabogo, Director at PBT Group

    Source: ITWeb


  • The data game is changing

    This article was first published on ITWeb, on 12th April 2017

    Source: ITWeb

  • Saving the rhino with IoT

    By Jessie Rudd, Technical Business Analyst at PBT Group

    Source: ITWeb

  • Embracing a BI mind-set change, if you want BI success in the digital world

    This article appeared on IT News Africa, on 16th March 2017

    Source: IT News Africa

  • It’s time to embrace a Business Intelligence mind-set

    This article appeared on Tech IT Out, on 16th March 2017

    Source: Tech IT Out

  • Advice on how to ensure security of companies’ data

    This article was first published on Engineering News, on 3rd March 2017

    Source: Engineering News

  • Intrinsic dark data

    By Masindi Mabogo, Director at PBT Group

    Source: ITWeb

  • Why we need to encourage a future in data

    This article appeared on Business Media Mag, on 7th February 2017

    Source: Business Media Mag

  • Technology trends driving 2017

    This article appeared on Tech IT Out, on 9th February 2017

    Source: Tech IT Out

  • The business strategy that’s helping Coca-Cola, Netflix and Puma dominate their industries

    This article was first published on SME SA, on 10th February 2017

    Source: SME SA

  • A startup’s short guide to finally using cloud

    This article was first published on SME SA, on 31st January 2017

    Source: SME SA

  • A data driven world

    This article was first published on ITWeb, on 8th November 2016

    Source: ITWeb

  • BI packs a punch

    A number of BI trends have had a tremendous impact on business in 2016.

    By Yolanda Smit, strategic BI manager at PBT Group.
    Johannesburg, 4 Nov 2016

    Edging towards the end of yet another year, the Christmas paraphernalia in commercial districts leaves one in shocked silence at the thought that another year has come and gone. Those who have experimented in agile development practices cannot resist the prompting that the end of a cycle calls for a moment of retrospection.

    Meditating on the last year of business intelligence (BI) industry trends, one ironically finds that ‘agile’ thinking has become far more prevalent in BI circles. Companies are starting to realise that traditional BI development life cycles of three to six months are no longer good enough. High-quality information has become a key ingredient of business success, and therefore, the demand for faster delivery is becoming more pervasive.

    However, the insight of those who experimented with agile development practices in BI has started to mature, with the realisation that BI is typically far more complex than application development. Therefore, more disciplined approaches to agile are required in order to effectively scale, and enable shorter delivery cycles to meet the needs of decision-makers for ad hoc decisions. Not only does BI delivery become faster, but disciplined agile BI delivery enables agile business with improved responsiveness to new opportunities or changing circumstances.

    Meet and greet

    These agile practices make the BI delivery team more user-centric, in a weird way mimicking the realisation of a common business strategic ambition to become more customer-centric. It is this strategic objective that is driving the second observed trend of an increased take-up of big data. The ambition of customer-centricity has been evolving over the last couple of years in SA, but the realisation is starting to hit home that a company can only truly be customer-centric if it knows its customers. Multitudes of sources containing fragments of knowledge about customers can be consolidated in innovative ways using big data technology to generate insights never thought possible before. Investment in big data technology exploration or proof-of-concepts has increased significantly this year, as companies are preparing the way for the realisation of their customer-centricity ambition in the next two to three years.

    BI is typically far more complex than application development.

    Upstream, in the data provisioning value chain, there has also been an increased investment in more mature information management capabilities. Companies are investing in robust master data management capabilities that enable more robust data integrations from fragmented sources. The quality of data upstream is also being measured and monitored intentionally, in order to proactively ensure the high quality of the information downstream provided into analytics for decision-making. All of this is being managed more intentionally, as companies, especially in the financial services industries, invest a focused effort on maturing their information governance practices. Concepts like data compliance, data owners, and data stewards are slowly becoming part of the mainstream business management glossaries.
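    The upstream quality measurement described above can be pictured with a small sketch: a routine that scores completeness and validity per field before data flows downstream into analytics. The field names and validity rules here are invented purely for illustration, not drawn from any particular platform:

```python
# Illustrative upstream data quality check: measure completeness and
# validity per field before records flow downstream into analytics.
# Field names and the age rule are invented for the example.

records = [
    {"customer_id": "C1", "email": "a@x.co", "age": 34},
    {"customer_id": "C2", "email": None,     "age": 151},
    {"customer_id": None, "email": "b@y.co", "age": 28},
]

def quality_report(records):
    """Return completeness/validity ratios for key fields."""
    n = len(records)
    complete_id = sum(r["customer_id"] is not None for r in records)
    valid_age = sum(r["age"] is not None and 0 < r["age"] < 120 for r in records)
    return {
        "customer_id completeness": complete_id / n,
        "age validity": valid_age / n,
    }

print(quality_report(records))  # both ratios come out at 2/3 here
```

    A real master data management capability would of course track many more rules and trend them over time; the point is simply that quality is measured proactively, not discovered downstream.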

    Downstream, analytics has also seen a significant increase in maturity over the last year. The strategic focus of BI is moving away from the masses using BI for standard management information reporting or dashboards, and attention is becoming concentrated on key power-users who require high-powered and advanced analytics. They go by various names, such as quantitative analysts, statisticians, actuaries, management accountants, etc. However, these are all just various stage names for the user profile known as the data scientist.

    These are the users who approach the BI team with new query or view requests almost on a daily basis, and the query result or view is probably only required for once-off use. These are also the users who typically benefit from self-service capability to source their own information, manipulate it according to the business problem they are faced with, generate various analytic views through a large variety of diverse visualisation features in their analytic tools, or even employ some highly sophisticated predictive and prescriptive analytical capability.

    Cognitive computing

    These users are also the front-runners to lead BI into the reality of true cognitive computing in the near future. Although there hasn’t been real progress in the space of cognitive computing in the last year, curiosity is starting to rise and questions are being asked more often.

    All in all, 2016 has been a year of rapid increase in maturity for management of information, as information is becoming ‘big’ – big on the strategic agenda of any company that wants to establish and maintain a competitive advantage in the new digital age, as well as big, as in big data technology.

    As far as the delivery of BI capabilities goes, the need for agility rings as true as the sleigh bells of Christmas 2016 that loom ahead. BI is no longer a standalone capability that merely spits out beautiful reports. It is effectively becoming embedded into the competitive DNA of the company, and therefore, agile delivery will become a critical success factor to keep up with the proverbial Joneses.

    Source :

  • A Data Career?

    By Juan Thomas, CIO of PBT Group.
    Johannesburg, 28 Sep 2016

    Being a data scientist is one of the hottest jobs in America right now. In fact, according to research¹, close to half of the 25 ‘best jobs in America’ named are tech-related – and of these, data scientists sit at the top of the list.

    However, when we look at the local landscape, is this reality mirrored here, especially if we consider the growth of the digital world and its impact on South African businesses?

    The answer is a simple no – not because the need for data scientists is not there. In fact, it is the complete opposite. There is a very real need for these unique technical skills in the local business market, especially when you consider the amount of data businesses find themselves dealing with, and this data continues to increase significantly year on year.


    We already know that businesses are realising that, if used correctly, data adds massive value to the bottom line and results in better business profitability. However, the data is often too complex and disparate, and thus requires the unique skill set of a data scientist if it is to be analysed in a way that actually adds value. A data scientist is someone who has the ability to harness a wide range of skills to translate raw data into a meaningful end result for the business, as well as to communicate this result in a way that tells a story of interest to the audience. To do this, one usually possesses the following skills: technical IT, behavioural economics, statistics, visualisation, psychology and business knowledge.


    Yet, South Africa still finds itself in a rather dire situation when it comes to these needed ICT skills. The results of the Johannesburg Centre for Software Engineering (JCSE) skills survey² reiterate this sentiment, confirming that the local shortage in ICT is still massive. Couple this with the fact that technically skilled individuals are often recruited to work overseas, and the situation is compounded. The result, unfortunately, is a negative impact on the business environment, as organisations struggle to find the specialised personnel they need.


    From a corporate point of view, more companies need to get involved and become part of the solution. This can be as simple as supporting ongoing programmes already active in the market that encourage young employees to study further to develop their technical skills capabilities – based on what the market requires. Alternatively, businesses can develop their own programmes or encourage young employees to study further, eg, part-time graduate or diploma courses.


    Furthermore, the public sector also has a great opportunity here – where it could provide facilities, like training centres and bursary schemes (over and above the current programmes, and ones specially focused on ICT) to assist young professionals becoming better skilled before, and when, entering the job market in the ICT space.

    The need for specific ICT skills in the business world will likely not disappear anytime soon – rather, it will only grow as innovation in this space continues. As a result, a career in this path will serve an individual very well. Corporations in South Africa should support the development of niche technical skills through IT education and by getting involved in programmes to assist and promote such ICT skills development. Without this commitment, we cannot ensure that the technical skills needed by businesses today will be there in the future – these skills have to be developed if the generations to come are to make an impact.


    Source :

  • Elevating business intelligence

    There are critical success factors on the road to the cloud for business intelligence.

    By Julian Thomas, Solution Architect Manager at PBT Group.
    Johannesburg, 4 Oct 2016

    In my previous Industry Insight, I outlined certain important points to consider when defining a cloud business intelligence (BI) strategy. In summary, this came down to having a clear understanding of the business case (and associated benefits), as well as the business use cases (to understand where this would be applied).

    At this point, if you had gone through this process, hopefully as part of a coordinated, managed project, you would most typically be right in the middle of the hype cycle of the project. At this stage of the process, everyone is really excited about the expected benefits identified in the strategy, such as improved performance, reduced costs, scalability, agility, improved planning, financial forecasting and costing.

    As a result, there is tremendous energy and pressure to proceed rapidly forwards. However, what about the potential downside? And, what about the risks? It is crucial that not just the pros of a cloud BI solution are evaluated, but also the cons. Knowing the potential challenges upfront will allow us to implement mitigating steps, or worst case, delay the implementation until potential issues/risks can be resolved.

    While there is certainly no limit to the number of unique challenges that can be encountered across diverse, unique organisations, I believe there are a handful of standard, common pitfalls that organisations should be aware of, and proactively acknowledge and manage.

    Data traffic: Initial and/or recurring uploads of data into the cloud – can be costly and time-consuming. This can have a significant impact on the batch windows and overall cost of ownership. Very often, additional, dedicated lines need to be set up to the relevant service provider to ensure optimal performance. This is especially true in the African context, where data bandwidth is costly. It is critical that a realistic assessment of these costs is included in the overall estimates and planning.

    Security and compliance: Moving data into the cloud, especially personal information, can be difficult to obtain approval for. There is a great deal of caution in South Africa right now with regards to personal information. This can often result in a hybrid solution requirement, where certain data has to remain onsite, while non-restricted data is moved into the cloud. At the very least, additional due diligence needs to be performed to ensure regulations are not violated when adopting a cloud BI solution.

    Pricing models: Pricing models for cloud BI solutions can at times lack a certain level of transparency. It is often initially easy to consider a cloud BI solution to be cheaper, but on closer reflection, based on pricing/usage, cloud BI solutions can sometimes end up being more expensive over the long term.

    Governance: Adopting the cloud as a platform implies relinquishing a certain level of control and governance. On the whole, we are comfortable with this, as most of the vendors and platforms have demonstrated their ability to manage this on our behalf very well. However, it would be foolish to assume all organisations will appreciate this point. This is an important human element to consider and be mindful of.

    It is crucial that not just the pros of a cloud BI solution are evaluated, but also the cons.

    Having understood the potential cons, it is now important to define the critical success factors that the potential cloud BI solution needs to be measured on. These can once again include numerous points, but there are certain core principles that need to be clearly understood, quantified and measured, before continuing any further.

    Data privacy and security: Carefully consider the minimum requirements of local and international regulations on privacy and security of data, which can limit what data is stored in the cloud, or hosted in specific countries.

    Data transfer rate: Define the acceptable speed at which data needs to be uploaded/downloaded in order to meet batch window and end-user requirements.

    Data transfer volume: Define the expected data transfer volume and frequency, and evaluate within the context of existing bandwidth.

    Data transfer costs: Define an acceptable cost per gigabyte of data transfer, taking into consideration any potential price escalation clauses based on volume uploaded or downloaded, etc.

    Local availability: The importance of reliable Internet connectivity needs to be clearly understood and defined, particularly with regards to the impact that lack of Internet access can have on the solution and the business as a whole.

    Cloud availability: The availability of the cloud BI service provider obviously has a huge impact on the success of these solutions. We expect cloud BI service providers to have stable platforms, but what are the organisation’s requirements and expectations regarding this?

    Disaster recovery: Appropriate disaster recovery needs to be in place to protect data and solutions, as well as to meet regulatory requirements.

    Suitable redundancy: This speaks to the ability of the solution to configure/select the level of redundancy to suit the nature, importance and usage of the data being stored in the cloud.

    Change management: This speaks to the internal organisation’s capability to adopt the new paradigm. This is an important part of the successful implementation of the solution.

    Understanding these points in the beginning of the cloud BI journey will yield great dividends in the future, as it lays the groundwork for all subsequent decisions around vendor and platform selection, and solution implementation options.
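    The transfer-related factors above (rate, volume and cost per gigabyte) lend themselves to a quick back-of-envelope calculation when evaluating a provider. A minimal sketch follows; all figures are illustrative placeholders, not any vendor's actual pricing:

```python
# Rough estimate of nightly upload time and monthly transfer cost for a
# cloud BI batch window. All inputs are illustrative placeholders.

def upload_estimate(gb_per_night, mbps, cost_per_gb):
    """Return (hours per nightly upload, approximate monthly cost)."""
    seconds = (gb_per_night * 8 * 1024) / mbps   # GB -> megabits / line speed
    hours = seconds / 3600
    monthly_cost = gb_per_night * 30 * cost_per_gb
    return hours, monthly_cost

hours, cost = upload_estimate(gb_per_night=50, mbps=100, cost_per_gb=0.08)
print(f"{hours:.1f} h per night, ~${cost:.2f} per month")
# 1.1 h per night, ~$120.00 per month
```

    Even a crude model like this makes it easy to see how a modest increase in nightly volume, or a slower dedicated line, can blow the batch window or the budget.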

    Source :

  • Encouraging a future in data

    Companies must inspire and empower employees to continue learning and expanding their skill sets.

    By Juan Thomas, CIO of PBT Group.
    Johannesburg, 28 Sep 2016

    Being a data scientist is one of the hottest jobs in America right now. In fact, according to research, close to half of the 25 ‘best jobs in America’ named are tech-related – and of these, data scientists sit at the top of the list.

    However, when looking at the local landscape, is this reality mirrored here, especially when considering the growth of the digital world and its impact on South African businesses?

    The answer is a simple no – not because the need for data scientists is not there. In fact, it is the complete opposite. There is a very real need for these unique technical skills in the local business market, especially when considering the amount of data businesses find themselves dealing with, and this data continues to increase significantly year on year.

    Back to basics

    Businesses are realising that, if used correctly, data actually adds massive value to the bottom line and results in better business profitability. However, the data is often too complex and disparate, and thus requires a unique skill set of a data scientist if the business wants that data analysed to ensure it can actually add value.

    A data scientist is someone who has the ability to harness a wide range of skills to translate raw data into a meaningful end result for the business, as well as to communicate this result in a way that tells a story of interest to the audience. To do this, one usually possesses the following skills: technical IT, behavioural economics, statistics, visualisation, psychology and business knowledge.

    Yet, SA still finds itself in a rather dire situation when it comes to these needed ICT skills. The results of the Johannesburg Centre for Software Engineering (JCSE) skills survey reiterates this sentiment, confirming the local shortage in ICT is still massive. Couple this with the fact that, often, technically skilled individuals are recruited to work overseas, and the situation is compounded. The result, unfortunately, is a negative impact on the business environment, as companies struggle to find the specialised personnel they need.

    Given this, it is becoming clear that more in the way of skills development needs to be done. The JCSE survey is very clear about the fact that there is a need for industry and academia to step in and help SA build the skills needed to drive forward this new digital economy businesses find themselves operating in. Of course, it is great to see how schools and universities are starting to place a focus on programmes dedicated to IT skills development, but this alone is not enough.

    Empowering employees

    From a corporate point of view, more companies need to get involved and become part of the solution. This can be as simple as supporting ongoing programmes already active in the market, which encourage young employees to study further to develop their technical skills capabilities – based on what the market requires. Alternatively, businesses can develop their own programmes or encourage young employees to study further, eg, part-time graduate or diploma courses.

    Companies struggle to find the specialised personnel they need.

    Those already in the field should speak to the company where they are employed to see if it’s viable to create skills development programmes and technical cross-skilling sessions and workshops, to encourage continuous learning within this space. Data does not stop evolving, so neither should employee knowledge and skills. Continuous on-the-job training with strong mentorship is key to developing the crucial ICT skills needed locally.

    Furthermore, the public sector also has a great opportunity here – where it could provide facilities, like training centres and bursary schemes (over and above the current programmes, and ones specially focused on ICT) to assist young professionals in becoming better skilled before – and when – entering the job market in the ICT space.

    The need for specific ICT skills in the business world will likely not disappear anytime soon – rather, it will only grow as innovation in this space continues. As a result, a career in this path will serve an individual well.

    Corporations in SA should support the development of niche technical skills through IT education and by getting involved in programmes to assist and promote such ICT skills development. Without this commitment, industry cannot ensure the technical skills needed by businesses today will be there in the future – these skills have to be developed if the generations to come will be able to make an impact.


    Source :

  • Incentive assertiveness

    Gamification is being used as a way to engage and motivate people to achieve their goals.

    By Masindi Mabogo, director at PBT Group.
    Johannesburg, 31 Aug 2016

    The existence of games dates back to ancient human history. They were used as a channel for social interaction, knowledge sharing, developing mental skills and entertainment, as well as for teaching spiritual and ethical lessons.

    Common game tools were made of bones, sticks, shells, stones, fruit seeds and shapes drawn on the ground. Their features¹ included uncertainty of outcome, agreed rules, competition, elements of fiction, elements of chance, prescribed goals and personal enjoyment. In competition games, the reward was social status (sole bragging rights) within one's community, or the thrill of reaching higher levels.

    Games have always exhibited the psychological ability to 1) encourage participation through rewarding achievements, 2) influence behaviour through teaching, as well as 3) improve skill(s) through practical attempts. The progression of technology eradicated the limitation from ancient tools and provided infinite possibilities for gaming feature expansion. Over the years, the gaming world perfected and ascertained the effectiveness of these attributes, and the notion of gamification today is to draw the strength of these features into company activities.

    Badgeville², a company that offers an award-winning enterprise gamification and analytics solution, defines gamification as a concept of applying game mechanics and game design techniques to engage and motivate people to achieve their goals.

    This concept taps into the basic desires and needs of the user’s impulses, which revolve around the idea of status and achievement. Many other narrations of this concept acquiesce that game elements such as points and rewards are linked to a goal/task as an incentive to encourage participation.

    Gartner³ further refined the definition to explicitly indicate that the engagements have to be digital, meaning participants interact with computers, smartphones, wearable monitors or other digital devices, rather than with a person.

    Rules of engagement

    There are 10 game mechanics, pulled from the world of video gaming, that are commonly built into gamification solutions. These are fast feedback, transparency, goals, badges, levelling, onboarding, competition, collaboration, community and points. Rajat Paharia, founder and chief product officer of Bunchball, discusses these mechanics in detail in his book, Loyalty 3.0: Big Data and Gamification Revolutionizing Engagement (chapter 4).
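    Three of these mechanics – points, levelling and badges – can be illustrated with a small sketch. The thresholds, badge names and player below are invented for the example, not taken from any product:

```python
# Minimal sketch of three common gamification mechanics: points,
# levelling and badges. Thresholds and badge names are invented.

class Player:
    LEVEL_SIZE = 100                                       # points per level
    BADGES = {1: "Starter", 5: "Regular", 25: "Champion"}  # by action count

    def __init__(self, name):
        self.name = name
        self.points = 0
        self.actions = 0
        self.badges = []

    def complete_action(self, points):
        """Fast feedback: award points immediately and check for badges."""
        self.points += points
        self.actions += 1
        badge = self.BADGES.get(self.actions)
        if badge:
            self.badges.append(badge)

    @property
    def level(self):
        return self.points // self.LEVEL_SIZE + 1

p = Player("Thandi")
for _ in range(5):
    p.complete_action(30)
print(p.points, p.level, p.badges)   # 150 2 ['Starter', 'Regular']
```

    The immediate reward on every action is the "fast feedback" mechanic; the visible level and badge list supply status and achievement, the desires the concept taps into.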

    Gamification is gaining popularity due to its ability to make the hard stuff in life fun. Its addition to Gartner's hype cycle in 2011 also propelled its popularity in the corporate world. In fact, Gartner* predicted that by 2015, a gamified service for consumer goods marketing and customer retention would become as important as Facebook, eBay or Amazon, and that more than 70% of Global 2000 organisations would have at least one gamified application.

    Gamification is gaining popularity due to its ability to make the hard stuff in life fun.

    Many global organisations are already enjoying the competitive advantages derived from their gamification solutions. With more organisations coming on-board, major successes will be directly proportional to the value proposition of their incentives. Companies that have realised this are looking at innovative ways to make their incentives relevant and irresistible to their customers. A successful strategy adopted in recent times is to formulate partnerships that extend incentive permutations beyond the shorelines of the business.


    For example, South African health and insurance companies have already partnered with clothing stores, grocery stores, hotels, airlines, computer stores, cinemas, car hire companies, florists and many others, to expand their reward permutations. Their customers are already enjoying an array of incentives through these mutual alliances, while these companies are greatly influencing customers to strive for a healthy lifestyle, and in turn, entrenching genuine customer loyalty.

    My everyday gamification experience is through the health insurance reward programme that tracks my active lifestyle and rewards me for reaching my goals, with yearly cashback (in currency) guarantees, free healthy consumables, shopping discounts and monetary savings for holidays.

    I am currently addicted to my mobile running application, which allows me to track and compare workouts, set personal goals, invite and motivate friends into group activities as well as periodic challenges. I find this motivating and it guarantees my participation due to its appeal to my natural desires for competition, achievement and improvement. I am sure everyone can identify with a few examples in their personal experiences.

    Generally speaking, the future success of gamification will largely depend on the assertiveness of the incentive to engage the participant in order to influence their behaviour, while meeting business objectives.

    Bunchball*, the first company to offer a technology platform (Nitro) for integrating game mechanics into non-game digital experiences, advocates that customers are hungry for reward, status, achievement, competition and self-expression, and that they'll go out of their way to engage with the businesses that give it to them best.



  • Pokémon Go mad

    With the help of serious amounts of big data, Pokémon Go rockets into a whole new cross-section of the population.

    By , BI consultant at PBT Group
    Johannesburg, 2 Aug 2016

    In the space of a few weeks, an augmented reality game called Pokémon Go has managed to do the impossible. It has surpassed WhatsApp, Instagram and Snapchat and is on a par with Twitter for daily, active, ‘on the app’ users [1].

    This is a game featuring Pokémon – originally a Game Boy video game that entranced children of all ages in the 90s [2]. It has somehow, with the help of serious amounts of big data, rocketed itself into a whole new cross-section of the population.


    Those who are not gamers, or addicted to their smartphones, are more than likely a little perplexed by the whole phenomenon. So then – what is Pokémon Go? To put it relatively simply, it is a location-based, augmented reality game for mobile devices, developed by a company called Niantic for iOS and Android operating systems.

    Augmented reality is a technology that overlays a generated image on a user’s view of the ‘real’ world. This overlaying of images creates a composite view, viewable only via the application doing the overlay. So, basically, there is a game that is overlaying itself on Google Maps, directing players to various sites to ‘capture’ various Pokémon. The brilliance behind this second-generation phenomenon has been many years in the making.

    It all comes down to big data.

    In 2011, Niantic, the company behind Pokémon Go, released a game called Ingress. Ingress was one of the first of its kind: a fitness game, otherwise known as an exergame, that relied heavily on the telematics present in all smartphones on the market today. Telematics can be used to track body movement and reactions, which is exactly what Ingress did for Niantic.

    This completely new and revolutionary method of collecting data used the game to direct users and players to various ‘portals’ or sites, which were initially extrapolated from geotagged photos on Google Earth. Players were also actively encouraged to submit more sites for consideration, and to date, around five million sites have been approved for use [1]. These sites were suggested, collated, collected and verified using seriously advanced analytics and big data.

    These sites are approved for use by games like Pokémon Go, which, by the very nature of what they are, are going to give rise to a whole new mountain of big data; and not just any big data – relevant, right-now big data.

    One of the biggest drawbacks and problems with big data is that it is, more often than not, historical data, without context and relevant meaning. Not in this case. For the first time, on a scale that can truly be called big, big data is relevant and extremely powerful. This may explain why many attempts to hack the game have already been documented. This may also explain why people are being warned to be careful – which may also be why this step in a new direction is a little bit scary.

    For the first time, on a scale that can truly be called big, big data is relevant and extremely powerful.

    Let’s be real for a moment, using a simple example. Google, with very little effort, is already fully capable of determining where someone is, how s/he got there, how long it took to get there, how long the person will be there for, etc. All this is calculated because the person set up an appointment on Google Calendar, synced a reminder with his/her phone, and looked up the destination on Google Maps. Without too much effort, an entire company knows exactly where this person is. The thing is, however, there is an inherent trust in ‘corporations’. Users assume, or hope, the companies have an ethos in place that will protect them from abuse or exploitation.

    So, imagine then how much power someone with less-than-desirable intentions would have, should they be able to get access to the Pokémon Go server? How much traffic would they be able to direct or divert, exactly where they want it?

    Endless possibilities

    Doom and gloom aside, though, let’s think about the practical applications of exergames like this. Let’s say I paid a company like Niantic to ‘place’ one of the ‘collectables’ near my coffee shop. My sales would skyrocket. The marketing possibilities are quite mindboggling.

    Let’s take it a step further though. This same method of data collection could just as easily be tweaked by marketing companies to collect and collate real-time data. This data can then be stored and analysed to become intelligent data, giving invaluable insight into where a person shops, how long they stay in a shop for; and then, of course, billboard placements along the route they travel to get to the shop can be undertaken.

    Having this data intelligence means a business can develop and offer customised offerings based on the initial real-time location that was achieved from the data collection. In fact, when looking at the bigger ‘data’ pictures and thoroughly following the data processes, the possibilities are limited only by the imagination.
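    As a toy illustration of how raw location data becomes this kind of intelligence, the sketch below (with entirely hypothetical shop names and timestamps) estimates how long a person lingered at each location from a list of timestamped ‘pings’:

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical sample of timestamped location "pings" (shop_id, timestamp).
# In practice these would stream in from a mobile app's telematics.
pings = [
    ("coffee_shop", "2016-08-01 09:02"),
    ("coffee_shop", "2016-08-01 09:41"),
    ("bookstore",   "2016-08-01 10:05"),
    ("bookstore",   "2016-08-01 10:20"),
]

def dwell_minutes(pings):
    """Estimate how long a person stayed at each location:
    the time between the first and last ping seen there."""
    seen = defaultdict(list)
    for shop, ts in pings:
        seen[shop].append(datetime.strptime(ts, "%Y-%m-%d %H:%M"))
    return {shop: int((max(times) - min(times)).total_seconds() // 60)
            for shop, times in seen.items()}

print(dwell_minutes(pings))
# {'coffee_shop': 39, 'bookstore': 15}
```

    A real pipeline would run the same aggregation over streaming events rather than a hard-coded list, but the principle is the same: simple arithmetic over location data already yields the ‘how long they stay in a shop’ insight described above.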

    Data collection like this, along with data analysis, is fast becoming mainstream. Unless marketing companies get good data collection methods in place, with well-equipped and forward-thinking analysts who can also analyse the data effectively, they are soon going to find themselves lost in the world of 1 and 0.

    But right now, in a world gone Pokémon Go mad, with no real clue how to navigate around it, the answers might just lie in data collection, analysis and intelligence.


    Source: ITWeb

  • The real business value of pursuing best practice data governance

    Data governance underpins the holistic transformation of the business

    YOLANDA SMIT - 22 July 2016

    The progressive application of data governance to priority areas of business data provides immediate benefits while companies work towards the end-goal of automated data governance systems, says business intelligence firm PBT Group Strategic BI manager Yolanda Smit.

    Ascribing the accountability of data to various business functions and formalizing the existing informal data management systems in line with business rules and requirements will immediately provide better oversight of business-critical functions.

    For each rule and principle of data governance defined, adding new data and systems becomes easier and faster and reduces or eliminates risks.

    Integrating and automating business processes and systems require that rules and policies be effectively applied to the data related to them. Data governance processes, thus, underpin the holistic transformation of the business, she adds.

    Data governance also supports the data architecture of a business by ensuring that information is effectively referenced to provide accurate and comparable views. This reduces data storage, management and associated data-law risks while improving basic business functions.

    “Standardization of data and data quality improves the efficiency of all business systems using these critical data. However, the best way to deal with these issues progressively and on a granular level is to determine what data is strategic or high priority and then manage those first.”

    “We advocate a pragmatic and systematic approach to improving data governance. While our customers typically worry about the complexity, once we start to answer some basic questions for data governance – who owns the data and which manager is responsible for it – it is easy to identify the highest-priority work.”

    The business rules inherent in any organisation can readily be unearthed and formalized, and companies are typically surprised at the ease with which data governance progress can be implemented and the value that is unlocked by improving data management, says Smit.

    Finally, having good data management and governance systems in place is very effective to ensure control over the business and that it easily meets regulations within multiple jurisdictions, which is often the case with multinational firms.

    Any further digitization, changes to regulations and information technology system improvements are also bolstered by high-quality data, with the business’s sustainability, therefore, enhanced.

    The value of best practice data governance is more than just effective compliance and affords an opportunity to streamline processes, owing to detailed knowledge of data flows and processes, and data governance is often a catalyst for efficiency, concludes Smit.

    Source: Engineering News

  • BI of the future

    Business intelligence must address an expanded set of data needs, with a convergence of different technologies.

    By , strategic BI manager at PBT Group.
    Johannesburg, 14 Jul 2016


    There’s an interesting buzz in the business intelligence (BI) industry, where more and more clients are starting to ask what BI of the future looks like. I think there are a few factors that drive this question.

    Vendors’ messaging to their target market is definitely a strong driver that’s causing users of their technology to start wondering. The typical messages from vendors are that BI is dead and analytics replaces it. Depending on the vendor’s own internal strategy, it might either be pushing for cloud platforms for analytics like the big four (SAP, Oracle, IBM and Microsoft), or some of the more niche players are punting visualisation or advanced analytics as if it’s the be-all and end-all of the ‘new BI’. And if the market is not confused yet, they’ll throw words like search and text analytics, self-service, in-memory, or agile into the mix.

    In the process of answering this question with my clients in the last year, I realised that companies have to get past the hype, and get back to understanding that all the available forms of BI exist for one reason only, and that is to meet actual business needs. Therefore, instead of asking what the future of BI will look like, rather ask what the future of the company’s BI should look like to meet the needs of its information users and support its organisational strategy.

    Granted, each company is not 100% unique, and there are a few common business change trends that I’ve picked up in my experience with a variety of clients, where new and alternative BI technology approaches should be considered in order to adapt to the changing needs of business.

    BI buzz

    The first common trend speaks to how the general profile of data sources has changed over time. Traditionally, users’ needs were satisfied by providing them with operational data brought into a central data warehouse, combined with a stock-standard stack of BI tools (reports, scorecards, and dashboards).

    However, now more users are struggling with additional, non-operational data from various sources, ranging from internal spreadsheets to external cloud data providers like Google Analytics. This results in the users creating manual work-arounds to cope with the diverse data, leaving the BI users feeling overwhelmed and constrained. Therefore, the BI of the future must address far more diversified data needs with a convergence of varied technologies specialised to different scenarios.

    Just as the data source diversifies, the profile of BI users also changes significantly, leading to the second trend I’ve picked up on. The traditional user base was predominantly management decision-makers on all levels, supported by power users. The changing trend sees this profile being extended by adding a large operational consumer layer. Information is more pervasively consumed directly by operational systems to drive rule engines, enabling operational decisions as part of the workflow.

    On the other extreme, increased sophistication in power users gives rise to a highly specialised community of data scientists needing advanced technologies such as big data, predictive and prescriptive analytics, and even machine learning for building operational intelligence into the operational systems. These data scientists are also the typical users that require far more self-service power in their BI tools.

    Bereaved BI

    This democratisation of data requires a paradigm shift that makes data central to IT and BI capability development in order to ensure the business intelligence competency centre’s (BICC’s) ability to effectively service the needs of the end-users. The traditional engagement model between business and IT has left BI orphaned and treated as an after-thought, but as BI becomes operationalised and more strategically relevant, data considerations must become central to the systems development life cycle.

    All the available forms of BI exist for one reason only, and that is to meet actual business needs.

    Finally, BI needs to deliver at the speed of decisions. Traditionally, daily, weekly, monthly data refreshes were sufficient, and business accepted a six to 18-month timeframe for delivering new capabilities. Today’s pace of business has increased significantly. Companies must be agile and adapt their tactics and strategies in-flight in order to remain competitive. The operational dependencies on BI, therefore, imply a dire need for faster refreshes (near real-time) and more agile and flexible delivery cycles.

    The last, and potentially most disruptive, factor is increased regulation. This trend is especially common in the financial services industry, but impacts more industries (especially companies that have crossed into providing financial services as value-added services). Corporate regulations like King III have matured over the last decade, to the point where detailed regulations are more explicitly touching on management and the use of data and information to ensure reliable decision-making on a corporate governance level.

    Besides corporate regulation, Acts like POPI and the Amendment Bill to the ECT Act have significant implications for what companies may or may not do with their data, forcing BICCs to revisit their own methodologies, practices, and governance.

    The implications of these common business and environmental trends point to the main issues the BI ecosystem of the future must cater for:

    * The rapid delivery of information supported by conventional BI capabilities, integrated with next-generation architectures, to include data discovery, data cataloguing, data integration, data virtualisation, advanced analytics and more;
    * Underpinned by a more agile BI delivery that enables tangible business value through data science at the speed companies make decisions; and
    * Carefully governing all components of the ecosystem in order to protect the quintessence of BI: the single version of the truth.

    Inevitably, a highly complex ecosystem such as this requires conscious stewardship, starting with a well-rounded, robust and sustainable strategy, with the strategy becoming the driving factor of what a company’s BI should look like in the future.

    A clear strategy empowers BI decision-makers to wade through the hype and identify the various innovative BI technologies that best suit their companies’ needs, and thereby become the creators of their own BI fate.

    Source: ITWeb

  • Solving problems with data

    Yolanda Smit, Strategic BI Manager at PBT Group, speaks about the difference between big data and normal data, and highlights the ever-important question companies should be asking: “do you have a problem that big data can solve?”

  • Better intelligence?

    Business Intelligence, or BI, is increasingly a must-have technology for progressive businesses. But how does it stack up with the instincts of business leaders who should be taking the BI reins, and why can’t you expect results tomorrow?

    There’s a lot of jargon encouraging a business based on intelligence. Marketing hype often talks of the ‘fact-based’ or ‘real-time’ business. But it’s disingenuous, since all good companies operate on intelligence. The arrival of technological products that handle that intelligence is not solving a particular problem as much as enhancing a pre-existing role.

    Yet this can lead to the suspicion that BI aims to replace the instincts of business leaders, usurping their guidance. It has bred hostility to BI, misinterpreting the value BI solutions can bring to an organisation, says Riona Naidu, head of Consulting Services & Marketing at Knowledge Factory.

    “BI shouldn’t be an uncomfortable thing. It shouldn’t be a big move. It should be something that works alongside what the business is actually doing, just making it a bit more efficient and to the point.”

    BI should complement the culture of an organisation, not upset it. CEOs aren’t exclusively fact-based and technology can’t replace their intuition. If done right, BI serves to inform a leader’s decisions or creates an opportunity for them to test their hypothesis. Yet a successful approach to BI requires leaders not to be blind to the promises of modern intelligence gathering. Real progress requires vision. Says Juan Thomas, Solutions director of PBT Group: “Companies that are really getting value from their investments are the ones that keep investing in newer and better technologies. We have the capability to do it, but do we have buy-in from a programme manager or the C-level?”

    Executive buy-in

    The question of business buy-in quickly surfaces. Good BI falls squarely on the culture pushed by the CEO, says Bill Hoggarth, an independent consultant with over 30 years of BI experience. “If you look at most of the changes in the history of BI and analytics, they’ve been driven by individuals. Sam Walton was the first to base his business on data- and fact-led decisions. When he only had two stores in Washington state, he already had invested in a nascent mainframe. He made information part of his business strategy and it made Walmart what it is today.”

    Walmart is often cited as a wholly technology-driven company. It has long embraced technological developments in supply chain management to keep its pricing both competitive and profitable. Most recently, the company merged its various IT divisions into Walmart Technology, a single monolithic entity that surveys all of the retail giant’s technology plans from close to the business’ pulsing heart. It is also a trailblazer of BI adoption.

    You have to understand the business context and then work back to what data is required to deliver the intelligence that will enable you to make the good decisions.
    – Steven Ing, BSG

    Any company can achieve such an edge, but it doesn’t happen overnight, despite the promises of some BI solutions. More on that later, but the first and fundamental step to any BI journey starts, according to Steven Ing, associate consultant at BSG, from the business value proposition: “What decisions do you need to make to enable that proposition? You have to understand the business context and then work back to what data is required to deliver the intelligence that will enable you to make the good decisions. The reason why BI has failed is that a lot of these projects are started the other way around. You’re not really understanding what decisions you’re trying to affect and, therefore, what intelligence is required for that.”

    Expensive mistakes

    But can the onus really be put on business to appreciate the cat’s cradle of technicality present under BI’s hood?

    “There’s always this clash between business and IT,” says Tony Bell, director at Decision Inc. “What’s happened is that the business people have decided they have performance issues that require immediate solutions in reporting and analysis. So their first capability is to make better decisions. They don’t care where the data comes from. They just want results to see and improve. They want that value. If you can show the value and improvement in business, business is entirely happy and that expands from one department to another.”

    But not everyone agrees with this view. The best BI successes, says PBT Group’s Thomas, tend to come from leaders who take an interest in the technology side of things: “Most of the success stories I know of are where your executives are really interested in that technical landscape. I think that’s where, once you understand why we do what we do, there is immediate buy-in. We shouldn’t underestimate executives’ understanding. If you’re throwing R30 million at a BI shop for five years, you will start asking questions.”

    Gary Allemann, MD at Master Data Management, combines the two views, saying it’s important for leaders to have their minds on both the business outcomes and the technology behind it. Otherwise the eagerness for BI can lead to some expensive mistakes.

    So the idea that IT is the central custodian and guardian of all data in an organisation…I don’t buy that.
    – Bill Hoggarth, BI consultant

    “One client we’re working with had American consultants come in and build a new segmentation model for them, but it was built upon data sets that they didn’t have in their business,” says Allemann. “So, if you don’t have the data, the model is useless. But at the same time, the concept of saying, ‘I’m building these reports and models because I’m trying to achieve better segmentation to achieve a business goal’ is absolutely correct.”


    Executive buy-in and the vision of a fact-driven company are important steps towards BI nirvana, but the process still faces a significant barrier: who owns it?

    “There are no barriers preventing companies from doing anything with BI,” says Ing. “Only people prevent this from happening. The C-level have heard of these concepts, like being fact-based, doing analytics, etc. But they still throw the ball over the wall to IT. Come back and show us something. They’re not taking ownership and changing that culture in the organisation.”

    But passing the buck to IT is almost reflexive. BI relies on data and data represents all the information in an enterprise. The management of that information quickly becomes an issue of governance as part of the King legislation framework. Since governance is often managed through technology solutions, says Chris O’Connell, MD of BITanium Consulting, companies habitually kick the BI ball to IT.

    “King puts (information governance) responsibility firmly on the company’s plate. My feeling is IT is the proxy responsible for that key responsibility,” he says, leading them to become automatic heirs to a technology-based regime such as BI.

    The problem is that if you don’t understand the data and you don’t understand the model, how do you know if that model is correct for that data?
    – Matthew Cook, Datacentrix

    That, though, soon devolves into buck-passing, which is problematic for most technology projects, but an outright death-knell for BI. Allemann appreciates the business habit of trying to be hands-off, but in a BI context, cautions against it: “One of the challenges we have when we throw that ball over the wall to IT is that it’s very easy to turn to IT and say, ‘You’re not getting us the solution we want’. But business needs to engage until they get the answers they want. We shouldn’t be picking technology until we understand the problem.”

    Leaving most of that process to IT is dangerous, says Hoggarth. IT rarely has access to most of the data that will feed a BI solution, something that even a thorough interview process can’t overcome.

    “Most customer data in SA today does not reside in the IT realm,” he says. “It sits on a Salesforce cloud or Microsoft Azure cloud somewhere. IT can’t manage that. It has no say over that. They often don’t even know their marketing or sales teams have put data in the cloud. So the idea that IT is the central custodian and guardian of all data in an organisation…I don’t buy that.”

    Then what role can IT play? Says O’Connell: “I think its role becomes putting the guard rails in place, to make sure that business doesn’t hurt itself.”

    Matthew Cook, Business Development manager at Datacentrix, agrees, noting that often business moves faster than IT. Even though the responsibility to find a solution is passed on to the technologists, users soon grab at the reins again.

    “The process takes time, and in the meantime, business gets impatient and buys something, because it needs an answer now,” says Cook. “Should IT provide the guardrails? Absolutely: how do we (service providers) support IT in supporting business, but at the pace that business wants it done at?”

    Allemann is not convinced that providing a sandbox for business should rest with IT, at least not in terms of BI: “Business needs to be defining what those boundaries are. We trust our financial data, but we don’t give it to IT and say, ‘It was your responsibility, so you sign off on these reports’. It’s signed off by the accountants, the auditors. So when we’re looking at marketing data, or sales data, or inventory, it’s not IT’s problem to make those reports accurate. It becomes their problem, because they have a role to play. But business needs to engage right from the start.”

    Quick wins vs long-term strategy

    A central theme starts to emerge from the conversation: embracing BI is neither simple nor iterative. It requires a lot of upfront hand-wringing and decisions that need to be carried through by all, as well as an understanding by the company leadership of its various moving parts. A BI solution can’t simply be delegated, then judged by the results.

    “The problem is that if you don’t understand the data and you don’t understand the model, how do you know if that model is correct for that data?” asks Cook. “That’s the life cycle you need to go through: to get a better understanding of your data from a context perspective. Get an understanding of what it is you’re hoping to achieve from a measurement point of view and then marry the two together.”

    Does this mean there are no quick wins in a BI environment? That may well be the case. If anything, BI needs to be pre-empted by an overarching plan: a framework that stops BI’s organic growth from overwhelming everything. Just consider how many reports a company generates and how often the nature of those reports changes as people shift positions. The result is information overload and the death of clarity. Losing control over BI is always a risk, hence the need of a steady hand from the start.

    Yet Naidu says there is still a role for some quick wins in BI: “It’s part of the culture change. You walk into any boardroom and you have the pro-data crowd that wants the BI systems right away. Then you have the pro gut-feel and then some people who are in-between the two. Quick wins is one way to unify everybody and prove the case as they go on.”

    But do not confuse ‘quick win’ with ‘quick fix’, cautions Thomas: “Some companies are stingy and hoping the small solution they use with self-service BI will solve their challenges and give an instant advantage. That’s not the case. You have to crawl first, then start walking. The guys who are running are the ones seeing the real advantage.”

    Hoggarth, though, feels optimistic that the modern innovation around BI is making it more feasible to get insight faster, because it’s no longer restricted to internal data.

    “We have the tools, but often the raw ingredients for those wins don’t exist within the company itself,” he says. “They are out there somewhere. They want to know what competitors are doing and what customers think. BI hasn’t been able to do that. It can now and that’s the inflection point. This is a very exciting time for BI.”

    Credit: ITWeb

  • Defining a cloud BI strategy

    Before adopting a cloud business intelligence solution, a company must delineate its approach to the cloud BI concept.

    By , Solution Architect Manager at PBT Group.
    Johannesburg, 2 Jun 2016

    Business intelligence (BI) in the cloud is a hot topic within many companies. As with all new concepts, there is the inevitable level of confusion and uncertainty regarding how to proceed. To demystify the topic, I believe things need to go back to the basics.

    For many companies wanting to understand BI and their cloud strategy, the first area of focus would be on actually defining a viable ‘cloud business intelligence strategy’. Only once the strategy has been defined does it make sense to evaluate the product(s) and associated vendor(s) that can support this strategy – and from here, to define the critical success factors for successful cloud BI adoption.

    Only when all of this is accomplished is the company ready to start planning the physical implementation. This is a lot to digest in one Industry Insight, so let me start with what to consider when defining a successful cloud BI strategy.

    Never-ending story

    What makes cloud BI a complex and confusing topic? Well, to my mind, it is the long list of cloud BI solution scenarios available to clients. Picture this: a company decides to focus purely on the data; this means it could look at an operational data store or data warehouse, fully or partially, in the cloud. Alternatively, it could consider hosting the entire visualisation stack, or a partial subset thereof, in the cloud. For example, any combination of operational, management and strategic level reporting and self-service analysis can be hosted in the cloud. Additionally, many product vendors are now supplying cloud-based advanced analytics platforms – confusing, right?

    To further compound the issue, certain vendors can also accommodate the entire set of solution scenarios in their cloud technology stack, while others only address a subset of these. Added to this is the fact that certain vendors focus exclusively on supplying infrastructure in the cloud, allowing any combination of software products to be installed.

    As a result, deciding on which approach to take can be an overwhelming task and makes defining the strategy extremely challenging.

    I believe navigating the myriad available options successfully comes back to the basics – which means gaining a clear understanding of the actual business case.

    Starting point

    Reducing capital investment, by reducing onsite hardware costs, is typically the first business case considered for cloud BI.

    This is closely followed by increasing the speed and decreasing the cost of scalability. Cloud BI environments are inherently scalable in terms of storage, software licensing, processing power, etc, and that scalability is on-demand. This means the business can scale up in peak times, during major marketing campaigns, for example, and then scale back down, when required, paying only for what was used.
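    The scale-up/scale-down economics can be made concrete with a toy calculation (all numbers here are assumed, purely for illustration): fixed on-premises capacity must be sized for the peak month, while on-demand cloud capacity follows actual usage.

```python
# Hypothetical monthly compute demand (units); month 3 is a campaign peak.
monthly_demand = [10, 10, 40, 10, 10, 10]
unit_cost = 100  # assumed cost per compute unit per month

# On-premises: sized for the peak, paid for every month whether used or not.
on_prem = max(monthly_demand) * unit_cost * len(monthly_demand)

# Cloud (on-demand): pay only for what was actually used each month.
cloud = sum(monthly_demand) * unit_cost

print(on_prem, cloud)  # 24000 9000
```

    Under these assumed figures, paying only for what was used costs a fraction of permanently provisioning for the marketing-campaign peak, which is the essence of the business case above.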

    Reducing maintenance costs (via reduced on-premises energy consumption, software maintenance and upgrades, or standby and support) is also a valid IT-driven business case. In evaluating this business case, however, companies need to consider the human impact, as it might make IT maintenance staff redundant. Care, consideration and long-term skills transfer need to be examined to minimise redundancies, in line with this business case.

    The positive impact that cloud BI can have on business continuity is growing rapidly in the South African context.

    A business case that sometimes does not get enough exposure is improving the IT adoption rate of new features. Traditional IT departments are, at best, on latest version minus one. Many clients, however, are two, sometimes even three, versions behind. Being cloud-enabled ensures companies have access to the latest features, allowing them to respond to emerging trends and take advantage of new features far earlier than potential competitors, which are still using on-premises solutions.

    Reducing resource costs and increasing productivity is a major factor to consider from the business point of view. Through improved collaboration, the business community can be more actively involved in creating and sharing content, freeing up traditional IT resources and improving the end-user experience, all while increasing information delivery times.

    The positive impact that cloud BI can have on business continuity is growing rapidly in the South African context. Business continuity is improved as there is less dependence on local resources, so BI services can still be available during local infrastructure outages.

    These days, it is sometimes difficult to tell where big data begins and cloud BI ends, given the two are so closely intertwined. Cloud BI enables big data integration, as it brings organisations far closer to online big data services. Many cloud BI products come with built-in interoperability, with various cloud-based big data services. This means large volumes of data can ‘stay’ in the cloud, without having to clog the local bandwidth and processing capability.

    Having understood the business cases that are relevant to the company, it then becomes important to understand the use cases where cloud BI can be employed. These are typically focused on collaboration; for example, regionally and/or internationally distributed sales teams, groups of departmental power users, etc. Another area of collaboration, which is seldom considered, is sharing BI with third parties without having to expose the internal network.

    The benefits of cloud BI can be virtually limitless, but so too are the options and permutations of a cloud BI implementation. It is therefore critical to have a clear idea of what is important to the business, now and in the medium term, as doing so is the critical first step in the cloud BI journey.

    Source: ITWeb

  • No strangers among self-service

    Although accountants have used physical spreadsheets for hundreds of years, the revolution of computerised self-service tools has been on the rise since VisiCalc, an interactive ‘visible calculator’ invented by Daniel Bricklin and Bob Frankston in the late 70s.
    VisiCalc laid the foundation for Lotus 1-2-3, which established itself as a data presentation package as well as a complex calculation tool that integrated charting, plotting and database capabilities. Lotus was also the first spreadsheet vendor to introduce named cells, cell ranges and spreadsheet macros in the early 80s. Microsoft Excel, in the mid-80s, was the next milestone in computerised self-service tools. Self-service analytics as a need is in fact no stranger.
    Most aspects of people’s lives are inundated with self-service alternatives. Companies are more frequently offering alternatives to “do it yourself”. Examples include airline check-in, automated teller machines for banking, public vending machines for a quick snack, as well as kiosks for settling shopping mall parking fees – all of which have enjoyed high adoption around the globe. Their success in adoption has been attributed to the ease-of-use of the self-service terminals and portals. Many companies have seen greater cost savings in their support costs, as well as improved service delivery due to these self-service alternatives.

    Defining the self
    Gartner’s IT glossary defines self-service analytics as a form of business intelligence (BI) in which line-of-business professionals are enabled and encouraged to perform queries and generate reports on their own, with nominal IT support.
    It is often characterised by simple-to-use BI tools with basic analytic capabilities and an underlying data model that has been simplified or scaled down for ease of understanding and straightforward data access. This promotes the notion of a shift from IT-led enterprise reporting to business-led self-service analytics in which business users are encouraged to “feed themselves”. The definition also supports the approach in which a semantic layer is prebuilt and a BI tool that is easy to use is presented to access the data.
    Ideally, training should be provided to help users understand what data is available and how that information can be exploited to make data-driven decisions that solve business problems. However, once skilled IT professionals have set up the data warehouse/marts that support the business needs, users should be able to query the data and create personalised reports with very little effort. Historically, the slow adoption of a self-service culture was mostly attributable to computer tools that required specialised knowledge to operate.
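    The workflow described above can be reduced to a toy sketch. Everything here – the table, field names and figures – is invented for illustration: IT prepares one simplified, flattened data mart, and a business user slices it without writing SQL or raising a ticket.

```python
# A minimal sketch of self-service analytics with invented data:
# IT publishes one wide, easy-to-understand table; the business user
# aggregates it by whichever dimension answers their question.

from collections import defaultdict

# The simplified, scaled-down data model prepared by IT.
sales_mart = [
    {"region": "Gauteng",      "month": "2017-06", "revenue": 120000},
    {"region": "Gauteng",      "month": "2017-07", "revenue": 135000},
    {"region": "Western Cape", "month": "2017-06", "revenue": 98000},
    {"region": "Western Cape", "month": "2017-07", "revenue": 101000},
]

def revenue_by(dimension, rows):
    """Aggregate revenue by any dimension the user picks."""
    totals = defaultdict(int)
    for row in rows:
        totals[row[dimension]] += row["revenue"]
    return dict(totals)

print(revenue_by("region", sales_mart))
# {'Gauteng': 255000, 'Western Cape': 199000}
```

    In a real self-service environment, the “table” would be a governed semantic layer and the slicing would happen in a BI tool rather than in code, but the division of labour is the same.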

    Experts only
    Until recently, existing self-service BI tools were mostly for specialists – they were hard to operate and required a knowledge level similar to that of data scientists. Front-line business managers who desired BI-style insights had to send query requests to BI specialists working in the BI department, and had to wait for unbearable turnaround times to get reports that were difficult to change or influence. All this is changing, due to advances in database and query technology, as well as redesigned front-end tools to make it easier for any user to interact with the data.


    The concept behind self-service is that front-line business executives and managers should be able to get up and running with these tools quickly, without a data analysis background and without requiring a BI specialist as a middleman. Generally, these self-service BI tools should be as easy to use as the typical spreadsheet, enabling a user to query data, analyse the answers, and create a visual representation of the data suitable for presentation or sharing with other non-technical personnel.
    Self-service BI in no way overthrows traditional database management or data scientists. The insights provided by these professionals are complex in nature and remain invaluable. Instead, self-service BI attempts to generate new insights through shared responsibilities, realising new value from hard-won data through more informal, ad hoc analysis.
    The business need for self-service tools has always been around and has not changed much over time. What has changed, and continues to change, is the technology used, the data available, and the culture/expertise of information use. New technology possibilities are nurturing the self-service culture in recent times.
    The increasing adoption is confirmed by the exponential growth in annual revenue for the three “leaders” in Gartner’s 2014 Magic Quadrant for Business Intelligence and Analytics Platforms (Tableau, MicroStrategy and Qlik).
    I concur with Clarity Solution Group CEO Mike Lamble’s opinion: “In the self-service paradigm, ‘power users’ triumph over portal users. Tools are analytic-centric rather than reporting-centric. Business discovery supersedes information delivery. Semantic layer-free data exploration and rapid prototyping are where the action is.”

  • Hello Watson

    A few years back, big data landed in the world of analytics with a rather unflattering and unstructured thump, very nicely hash-tagged with phrases like ‘the next big thing’, ‘powerful’, ‘unprecedented insight’, etc. The sheer volume, velocity and variety had data analysts frothing at the mouth.
    Fast-forward a few years and it has become increasingly apparent that the volume, variety and velocity are increasing exponentially, with absolutely no signs of slowing down or tapering off. If anything, it is going to get worse. The more IoT-connected ‘things’ that are invented or added, the larger, faster and more disparate and ‘uncontextualised’ big data is going to get. This is a very large and fast-moving problem.
    Already, there is way too much information with not enough talent to manage it or time to sift through it. The longer it stays untouched and unused, the more context is lost and the more data is lost to data decay.

    For one moment, consider the following:
    According to IBM Watson¹, unstructured data accounts for 80% of all data generated today. The majority of that data is noisy, ‘uncontextualised’ and in formats that cannot be read by traditional systems, and this noisy and dirty data is expected to grow to over 93% of the total by 2020¹.
    Fuel – oil platforms can have more than 80 000¹ sensors in place. A single platform can produce more than 15 petabytes of data in its lifetime¹. Tools like Watson could help companies prevent drilling in the wrong place and help with flow optimisation (the volume and rate at which oil is pumped).
    Healthcare – in your lifetime, you will generate 1 million GB of health-related data¹. That is the equivalent of 300 million books. Imagine what a computer that can collate and predict quickly and accurately could do with that much information.
    Transportation – by 2020, 75% of the world’s cars will be connected¹. They will collectively produce approximately 350MB of data per second to be assessed and acted on. Self-driving and self-learning cars will soon be the norm. By their very nature, they will need to be able to learn and apply reasoning, because governments are not going to re-grid their entire road infrastructure.
    Added to these scaling volumes is a huge shortfall of talented analysts and data scientists. Those that are around simply can’t keep up with the ever-growing volumes of data. This shortfall presents a massive problem for business, because even the most advanced data platforms are useless without experienced professionals to operate and manage them.

    Answers please
    So, then, what is the solution? More training and better academic programmes? Possibly, but the exponential nature of big data means users are always going to be playing catch-up. So, another solution needs to be found: a scalable and fast solution that can leverage insight at close to the same speed as big data is collated and collected; a solution that preserves as much of the original context of the volume as possible. Say hello to Watson² and Coseer³.
    The future of big data is finding, scripting and training computers to do the work for people. Computers that think the way humans think, that use context to flesh out meaning, and can think outside of a rigid decision tree logic.


    Computers that cognitively can make decisions and learn from each and every interaction, at speeds that humans can only dream of.

    What, exactly, is cognitive computing?
    Simplified, cognitive computing is the creation of self-learning systems that use data mining, pattern recognition and natural language processing (NLP) to mirror the way a human brain works, derives, contextualises and applies logic. The purpose of cognitive computing is to create computing systems that can solve complicated problems without constant human oversight, in the process far surpassing the speed at which humans can solve them.
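    As a toy illustration of that self-learning loop – emphatically not Watson, and with invented phrases – consider a naive bag-of-words classifier that updates its word counts with every labelled interaction it sees:

```python
# A toy sketch of a system that "learns from each interaction":
# it accumulates word/label frequencies from labelled examples and
# scores new text against what it has seen. All data is invented.

from collections import Counter, defaultdict

class TinyLearner:
    def __init__(self):
        self.word_counts = defaultdict(Counter)  # label -> word frequencies

    def learn(self, text, label):
        """Update the model from one labelled interaction."""
        self.word_counts[label].update(text.lower().split())

    def classify(self, text):
        """Score each label by how often its known words appear."""
        words = text.lower().split()
        scores = {
            label: sum(counts[w] for w in words)
            for label, counts in self.word_counts.items()
        }
        return max(scores, key=scores.get)

bot = TinyLearner()
bot.learn("pump pressure dropping on rig sensor", "maintenance")
bot.learn("quarterly revenue and profit margins", "finance")
print(bot.classify("sensor shows pressure fault"))  # maintenance
```

    Real cognitive systems layer far richer NLP, context models and feedback loops on top, but the principle of improving with every interaction is the same.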
    What will result is limitless possibility. This is perhaps as close to true agile computing as it will ever be.
    Cognitive computing is all about changing the world and entire industries, being able to see things that were lost in the volume, and finding insight that people have not been able to grasp before.
    Today, 2.5 quintillion bytes of data¹ are created every day – that is 2 500 000 000 000 000 000 bytes. Every person on this planet will add 1.7MB of data to that statistic, every second of today.
    Human intelligence simply cannot scale in the way that data is scaling, and cognitive computing enables people to deal with these massive amounts of data. Don’t get me wrong – cognitive computing can never replicate what the human brain does. It is simply a system that can handle massive amounts of unstructured data, quickly and accurately.
    The insight that could be provided is immeasurable.

  • What makes an IT company great?

    In business, having your foot in your mouth can easily translate to a foot out the door. So when the ICT world makes a noise, the aim is to attract new business.
    ICT companies rarely talk about themselves, especially those that sit in the food chain between vendors and clients. Yet these companies help make up a R13.9 billion local industry, excluding telecommunications. So Brainstorm decided to turn the tables, so to speak, and ask its roundtable attendees: what sits behind the sales pitch?
    The people of ICT are as regular and congenial as anyone else, but their brands often demand a bit of swagger and bravado. ICT companies court big fish with big contracts – cloud may have broadened the market to smaller customers, but it remains a high-stakes game.
    So when the opening question is an opportunity for attendees to buff their brand, a steady march of jargon and catchphrases appears. One person even manages to capture the entire mantra in a single comment: “You must innovate and iterate, but one must also be cautious of cutting-edge delivery. You want to deliver solid solutions, but also be at the forefront of technology. People you work for want to know you deliver top-end solutions and give them what they need.”
    This is what practically every ICT company would and does say. But similar comments don’t even make a complete round before the conversation begins turning inwards, speculating on the direction of a very competitive industry.
    “With quality comes a price and we’re at a time where companies are very cost-sensitive,” says Lance Fanaroff, joint CEO of Integr8, wondering if customers are going to forego quality in tough times. “It’s about whether that additional quality warrants them spending the additional money. As South Africa comes under the squeeze, there will be more of a focus on price than quality.”
    Bruce Pitso, South African regional manager at Ruckus Wireless, agrees that there’s pressure from customers to reduce costs or be undercut, but sees it as a more long-term trend that emerges as business leaders take control of ICT: “You get organisations that will want quality at any (or great) cost. But you get other companies that say, ‘If it will work at a cheaper cost, we’ll go for it’. People who decide about technology often don’t know much about technology. So the best of breeds will come in and cater a project for a long-term investment. But because the supply chain has been instructed to go for a more cost-effective solution, it’s a challenge. And we are headed in that direction.”

    Quantity race
    At this point, it warrants a reminder that ‘ICT’ is a catch-all phrase that involves many different types of technology and implementation. Pitso’s comment gels with that of companies that count hardware and infrastructure as a large part of their business. Whether you look at the rise of Chinese manufacturers or fibre networks threatening incumbents, there is a quantity race in those markets that can override quality, something customers are taking advantage of (if the risk seems fine).

    But Decision Inc.’s CEO Nicholas Bell notes that this is not the case in the more service-orientated ICT sector: “On the professional services side, it’s slightly different. Quality is no longer the differentiator – it’s the norm, the expectation. The market wants it for less, but it doesn’t want the quality drop-off. So with a large enterprise, you can’t increase rates or make great margins. They are squeezing us on rate, but the quality remains constant. These days, they have options and take advantage of that.”
    But is there even a choice? A popular military maxim says that a good plan now is better than a perfect plan later. Yet in ICT, a good plan now often leads to big headaches later, says Professor Barry Dwolatzky, director and CEO at the Joburg Centre for Software Engineering. “There is a concept bandied around called technical debt. If you rush to market with something that is cheap now, it will cost you later. We need to get it out of our minds that you either do something well or you do it cheaply. We have to find ways to do things well and cheaply, and use those in the same sentence, not as two sides of a coin.”
    Yet, companies can’t just expect cutthroat pricing if they dug themselves into a hole, says Warren Olivier, regional manager, Southern Africa for Veeam Software: “Often companies didn’t invest in an entire solution, only a part of it. There hasn’t been that entire end-to-end focus.” The result is patchwork environments that are never brought ahead of the curve.


    Nonetheless, this spells more opportunity for the ICT industry: “I’m not going in to force this down the customer’s throat,” says Olivier. “I’m going to say, you got a bit of this and that. I’m going to find ways to leverage more out of this. That’s where partnerships and alliances – co-opetition – are important.”
    But delivering on those intentions and partnerships requires a key ingredient, one that is becoming ever scarcer in the country.

    Falling behind
    Earlier in the discussion, Bell had remarked: “(Enterprises) can’t afford for you to learn on their time. Everything is more urgent and must be out faster.”
    He was talking about the demand for quality despite cost, but this also touches on a much bigger issue, one that requires much more buy-in than it is getting: skills.

    Saying the ‘S’ word almost always draws the same response from IT professionals: a metaphorical roll of the eyes and a, ‘yeah, but what can you do?’ look. Yet skills are a serious problem. South Africa is not producing enough of them, particularly in the ICT field, and that shortfall is growing as 21st-century technologies start making an impact.
    “There is a big gap between quality skills and the quantity of those skills,” says Dave Ives, head of Solutions at Karabina Solutions. “We have quality skills, but if I look at the new skills – machine learning, new languages and the stuff we’re encountering in the predictive space – I would say we have a skills shortage. If I look at the CRM and digital transformation space, taking a company to end-to-end transformation, I question if we have the depth and capability in this country.”
    The big issue, he adds, is the lack of a large pipeline of people coming into the sector. Ives isn’t alone in this concern: an annual survey from Wits University’s Joburg Centre for Software Engineering last year found South African ICT skills to be lagging far behind Egypt, Kenya and Nigeria.

    “We invest too little in skills as an industry and country,” says Dwolatzky, adding that this burden is too often laid at the feet of universities and government. Instead, the ICT industry needs to become much more involved and address its own culture. “If you look at the companies in India, for example, they recruit people from universities and put them on ten months of intensive training before putting them to work. They invest in their skills.”
    Local companies throw newcomers into the deep end, then complain that the skills are rubbish: “We have to put the spotlight on what we as an industry are doing to produce the skills we need. There is plenty to complain about, but it’s all of our responsibility.”
    “If I was approaching a vendor, I’d ask, ‘Do they actually contribute by growing skills?'” adds Kim Andersen, CTO at T-Systems South Africa. “In India, they decided IT matters to India’s economy. That has transformed the country. South Africa hasn’t made that decision yet.”

    A pool of sharks
    The lack of decent local skills has created a market defined by scarcity: high salaries, low retention rates and relentless headhunting.
    “A lot of companies in SA don’t look at skills as an investment,” says Pitso. “They do it because they have to – they’ll take in interns as a tax kickback. But if someone is studying software programming, they aren’t stupid. So they will exploit this and get a better job.”

    That lack of an investment mindset is, instead, in the words of one attendee, creating a pool of sharks. For example, when one of the country’s major banks needed skills for antiquated Cobol systems, they started an academy to train those skills. But other companies, instead of partnering with the initiative, snapped up graduates as quickly as they could.
    This poaching took place among large entities, such as financial institutions and government departments, bolstering their in-house talent pools. The trend is having a negative impact on the much smaller ICT market, which is often burdened with the expectation to train skills it knows will be lured away.
    “I used to keep guys for three years,” says John Eigelaar, director and co-founder of Keystone Electronic Solutions. “Now most of them leave within six months to a year. I lose people before they are even at a useful stage!”
    Adds Bell: “The problem is that the large companies such as the banks can offer salaries that don’t fall in line with the market we play in. They create this ceiling that makes it very hard for others to compete. So you have guys with a year’s experience getting double their salary and the industry loses them.”

    But while there is agreement that the country needs more skills, not everyone sees the above as entirely negative. It may also define a feedback loop that helps the industry.
    “If you train proper skills, wherever they go, they will generate more work,” says Armandè Kruger, regional sales director at PBT Group.


    “Instead of getting a slice of the pie, let’s grow the pie. It’s not a perfect model, but there is another side to it.”
    Still, Dwolatzky makes a clear call to arms: “Let’s all get around the table, let’s run a programme jointly. We all contribute and create a big pool of skills we can then all fish from.”

    ‘Sinful crimes’

    ICT companies are not entirely victims. Skills are also scarcer because of the influx of new technologies and customer industries. Business ICT has been booming, creating a wave the industry is not only happy to ride, but to which it tends to add its own hubris of ‘must-have or die’ technologies. Raising this habit prompts a rare moment of admonishment from the attendees. Says Dwolatzky: “There is a concept called technological determinism. Does the technology drive change in business or does change in business drive the technology? We as technologists fall into the trap of technological determinism. We think we invent a new widget and that widget will change the world. In fact, it’s business that is changing things.”
    The conversation starts around a question about the cloud: today, the mantra for salespeople is that cloud helps companies to innovate. But go back only a year or so, and the message was more about the cost benefits of using cloud infrastructure. This isn’t an atypical example of messages ICT sends to customers. As business takes more interest in technologies it doesn’t quite understand, the result is confusion – and ICT has been exploiting this.
    Says Andersen: “The IT industry has long been committing sins: we’ve sold technology above and beyond the needs of business. We’ve pushed technology because we thought it was so great. But what is the business value?”
    Yet, the real trap may be a matter of ego: solution providers cannot come across as incompetent or clueless. So they often have to toe the line for a technology that itself has yet to really define its usefulness. Fanaroff draws on the popular example of cloud: “Cloud means different things to different people. Meanwhile, it’s waiting for infrastructure to catch up and offer richer services to companies.”
    The issue with new technology, he says, is that the use cases are not always there yet and it takes time for the market to find them. This tends to rely on other parts of the puzzle, such as connectivity and cost, to match expectations.
    “Technology matures, so the message changes all the time.”
    Technology is a moving target, which means those selling technology often have to run and talk at the same time. But those doing the buying shouldn’t think they are immune. Everyone in the ICT game should understand how highly fluid it is. Says Kruger: “Innovation comes from passion – once something becomes commercial, the innovation stops. And innovation these days happens at the speed of fibre, driven by a new generation that expects quick delivery. So innovation should be pushed from the bottom up. But if it only lives for a year, it lives for a year and then you move on.”

  • Is the Internet of Things a data opportunity?

    Armandè Kruger, Regional Sales Director for PBT Group


    There is no denying the sheer amount of data at our disposal. And thanks to the Internet of Things (IoT), data is increasing by the minute. However, those of us who work with data, or need to understand the influence this data has on businesses, clients or markets, are now faced with the task of sorting through this mass of information from various interconnected devices.

    Those who rate IoT as a fad should consider the transformational impact it is likely to have on business, specifically on the data centre side. Take, for example, the amount of data that these interconnected devices are producing (and will continue to produce). Surely this alone offers organisations an opportunity to analyse the data and use this information to gain a much stronger competitive advantage in an increasingly saturated business market?

    In fact, Gartner estimates that 6.4 billion connected ‘things’ will be in use worldwide this year, which is a 30% increase from 2015 alone – indicating that the IoT phenomenon is not going away. As a result, decisions around how best to process the associated data from these various devices, should be an organisational priority in 2016.

    Along with this, analysing the data is key – and a practical way businesses can go about doing this is to turn to advanced analytics. In fact, companies can apply advanced analytics to their entire data realm, to ensure they can leverage the information and integrate it within various areas of their business.

    In other words, advanced analytics can be used to turn data coming from various business areas – including IoT, sales, marketing, call centre feedback, transactional data and even social media – into something more powerful. Can you imagine the impact if a business could not only manage the data within, but also analyse it correctly, to make business decisions that are accurate, add value to customers, improve the bottom line and in return increase profit margins? This is what will give a business a new and substantial competitive edge – and this is why IoT is no fad.

    Advanced analytics can actually generate predictive information for a business. This means that an organisation can now gain insight into future outcomes using advanced analytics. And it is this that allows advanced analytics to assist a business in becoming more customer focused, as the data being analysed can tell a business more about their customer’s needs, wants and expectations. Having this information at their fingertips means businesses can effectively satisfy a customer’s needs instantly.
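    As a deliberately minimal sketch of such predictive information – far simpler than any real advanced analytics model, and with invented figures – a least-squares trend line fitted to past monthly demand can project the next month:

```python
# A toy forecast: fit y = a + b*x by ordinary least squares over past
# months (x = 0, 1, 2, ...) and project one month ahead. The demand
# figures are invented for illustration only.

def fit_trend(values):
    """Return intercept a and slope b of the least-squares line."""
    n = len(values)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(values) / n
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, values)) / \
        sum((x - mean_x) ** 2 for x in xs)
    a = mean_y - b * mean_x
    return a, b

monthly_demand = [100, 110, 120, 130]      # four months of history
a, b = fit_trend(monthly_demand)
forecast = a + b * len(monthly_demand)     # project month five
print(round(forecast))                     # 140
```

    Real predictive models would of course weigh many more variables – seasonality, customer segments, external signals – but the principle of turning historical data into a forward-looking number is the same.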

    So, as organisations start to make sense of 2016 and the challenges and opportunities the year is likely to bring, remember that IoT and data are starting to play a very important role in the success of companies globally (if utilised correctly).

    Don’t discount the impact these so-called technology terms will have on your own business. Additionally, be careful of thinking that if you have implemented some form of big data, you will solve your data problem – as this in isolation offers very little value. Analysing the data correctly, through aspects such as advanced analytics, is what will help determine how you can create or capitalise on opportunities in this digital marketplace in 2016.

    Source : IT News Africa


  • A visualisation love story

    By , strategic BI manager at PBT Group.

    The fellowship of humanity is founded in stories – stories that evolved from cave drawings to Shakespearian writings to the modern-day stories on the cinema canvas. I know the debate between reading versus watching movies is probably as old as the television itself and is still ongoing. However, the fact that cannot be debated is that people are infatuated with stories.

    The visualisation of stories in movies has simply made stories more accessible to the portion of the population not inclined to find reading as enjoyable as others. This does not mean books are redundant, as some people will never sacrifice the joy of immersing in their own imaginations through the written word, but ultimately, visualisation of stories continues to enrich a much wider part of the population.

    In a similar way, one of the latest buzzwords in business intelligence (BI) – visualisation – is causing quite a significant uproar. Let’s set the record straight from the beginning: Visualisation is not the new BI. Visualisation is just an added medium through which one can publish the intelligence in the data so a larger portion of the company can benefit from discovering the story ‘hidden’ therein.

    Picture it

    Way back, in the Shakespearian BI era, data and intelligence were expressed in data tables. It never ceases to amaze me when I come across BI end-users who can glance at a data table with 25 columns and 72 rows showing regional weekly sales data for the last 18 months, and – within seconds – become excited by the trends they observe in the data. Yes, such data whizzes exist, and sometimes leave me reeling, convinced that Neo stepped out of ‘The Matrix’ through my computer screen. However, I tend to fall on the side of the masses, and for me to make sense of large data sets, the golden rule applies: “A picture is worth a thousand words (or numbers, in the case of BI).”

    Over the past couple of years, managers have started to realise that, instead of being solely dependent on a small team of highly skilled quantitative analysts to analyse and interpret the data for the masses in lengthy book reports, the latest advances in the visualisation capabilities of tools like Power BI, QlikView and Tableau, among others, may unlock the story in the data to a wider audience, much faster.

    However, don’t be fooled into thinking it is as simple as putting the tools and the data in the BI users’ hands, and – ‘hey presto’ – BI value is unlocked by the masses for the masses. It is not that easy. Returning to my analogy of books versus movies, consider how many people are involved in publishing a book versus bringing a movie to the silver screen. The book involves the writer, an editing team, the back-cover writer and the publisher. Judging from the credits on a movie, it can take a team of more than 100 people to effectively tell one story on the telly.

    One must understand that unlocking the story in the data through visualisation takes very careful planning and design, to ensure the visualisation mechanism (bar graph, line graph, heat map, XY plot, etc) that best tells the story is used. I am yet to come across a company where all BI users just intuitively know how to match the right mechanism to the data to effectively answer the business question they have.
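    That matching of mechanism to data can be caricatured as a small decision table. The rules below are simplified conventions for illustration, not an exhaustive visualisation taxonomy:

```python
# A toy "which chart?" helper: map a (data type, business question)
# pair to a conventional visualisation mechanism. The rule set is a
# deliberately simplified sketch of common charting conventions.

def suggest_chart(data_type, question):
    """Return a conventional chart type for the given data and question."""
    rules = {
        ("categorical", "comparison"):   "bar graph",
        ("time series", "trend"):        "line graph",
        ("two numeric", "relationship"): "XY (scatter) plot",
        ("matrix", "density"):           "heat map",
    }
    return rules.get((data_type, question), "consult a visualisation architect")

print(suggest_chart("time series", "trend"))  # line graph
```

    A data visualisation architect, of course, weighs far more than data type – audience, platform and perception – which is exactly why the fallback case exists.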

    Unlocking value

    Cue the role of the data visualisation architect (Google it, such a person exists), who is almost like the scriptwriter, location manager, set designer, casting director and director of photography all rolled into one. This person effectively combines the right data (script) with the right visualisation mechanism (set designer), and the right formatting and structure (director of photography), and then publishes it on the right platform (location manager) to the selected target audience (casting director), who will then effectively utilise the insight to the value of the organisation, resulting in a conclusion of the story. The data visualisation architect will know the science behind how visualisation is perceived and physiologically processed by the viewer, and use this specialist skill to design the optimal visualisation for each and every insight story embedded in the data.

    As a side note, I must caution the reader not to confuse the data visualisation architect with the other buzzword, data scientist. That would be like demoting the director to become the set designer. Granted, data scientists have a very strong understanding and a keen, almost intrinsic, ability to design visualisations that tell a very clear story. However, a data scientist is usually the person who drives and steers the whole journey of discovering a new story, scripting, designing, casting, recording, and producing the movie. Data scientists are far more valuable in the exploratory analytics space where the business question is still being formulated, the hypothesis must still be defined, the suitable data sourced and analysed, and finally, the conclusions drawn and presented to decision-makers.

    Just like the data scientist discovers untold stories hidden in the data, the data visualisation architect can enable business to unlock the value intrinsic in their existing data for decisions made on a daily/weekly/monthly basis. This is done by converting existing reams of data-table book reports into well-crafted visualisation views that tell the story succinctly in an aesthetically pleasing way – empowering the other 80% who cannot intuitively see the picture in the data, like Neo.

    Source : IT Web

  • Big data a must for insurance industry

    Petr Havlik, Managing Director for CyberPro Consulting

    How to effectively manage big data is something all companies need to be aware of. For a data-driven business like insurance, it can make the difference between success and failure. Are local insurers adequately prepared for this?

    The concept of big data is certainly not a new one. Many companies have been trying to better deal with the sheer amount of raw data at their disposal, sifting the good from the bad. But to really manage it properly, executives are faced with difficult decisions in identifying the exceptional technologies able to efficiently process large quantities of data within acceptable time frames.

    Cynics might argue that big data is just a buzz phrase and ignore it. However, an IBM study has found that 74 percent of insurance companies surveyed report that the use of information and analytics, including big data, is creating a competitive advantage for their organisations.

    In South Africa, the rise of the connected lifestyle is resulting in customers who demand more from their insurers. This connectedness has also given rise to more informed consumers who are better aware of competitive offerings than in the past. Not only do they want better pricing but they also expect innovative, value-adds that appeal to their lifestyle requirements. If an insurer is not able to deliver this, then the customer is more than willing to change companies. Insurance brand loyalty is a thing of the past.

    This means that insurance companies now compete on multiple levels ranging from premiums, customer services, and claims experience, to brand recognition and product structure, amongst others.

    And the foundation to all of this? Quality data.

    An insurer needs to implement the kind of IT systems that empower it to make informed decisions based on customer requirements, as well as on the market trends that will impact it from the short term through to the long term.

    According to IBM*, insurance companies must leverage their information assets to gain a comprehensive understanding of markets, customers, products, distribution channels, regulations, competitors, employees, and so much more.

    Of course, it is not all about just big data. Using business intelligence tools that integrate data management and analytics becomes essential to building the right kind of information needed for making quality business decisions.

    Fortunately, South African insurers are willing to adapt. A case in point is the flexibility of solutions available to cater for a range of consumer needs. This could of course not have been developed without getting to grips with big data and testing new solutions. The future is looking promising for those insurance companies that have taken heed of the call to arms and are starting to realise the treasure trove that is their big data.

    Source: Cover

  • Embracing integration in a cloud-based world

    At a time when cloud computing is becoming fundamental to business, the importance of integrating systems effectively cannot be overstated. Petr Havlik, director for CyberPro Consulting, looks at the impact this will have in South Africa.


    “Historically in IT, developing software systems and utilising things like business intelligence solutions were considered separate disciplines. However, this has all changed, given that technology needs to be much more integrated to help decision-makers gain a single view of the operational areas and customers in their business – and ultimately become a more partner-driven business.”


    Such an approach reflects a growing shift in a world where companies are looking at expanding their traditional business lines with more value-added offerings. For example, renewing a passport at a bank could never happen without integration between multiple parties. The connected world is now seeing organisations playing multiple roles in the lives of their customers.


    “Just look at what is happening in the South African landscape. You have telecommunication providers muscling in on the banking space, banks providing all sorts of value-add online offerings, and numerous other companies across a variety of sectors identifying different ways of adding to revenue streams.”


    Havlik says that this diversification and working with different business partners provide an organisation with a great platform to be successful in the ‘new world’. However, while integration is topical and relevant, it is certainly not very sexy given its focus on back-end processes and systems.


    “The changes that cloud computing bring to the integration landscape can be exciting and frightening at the same time. In the past, integration meant significant infrastructure and skills investments. Using a cloud platform, such as Microsoft’s Azure, provides the business with access to a centrally hosted environment offering those services.”


    This gives companies access to a highly available and scalable environment that can be utilised to integrate with their partners. Going the cloud route brings with it significantly lower costs and faster implementation times.


    “In certain respects, the benefits of integration are providing traditional-minded businesses the first real use cases for adopting cloud-based systems. Even concerns around security and privacy are being addressed thanks to the strong security measures adopted by cloud providers, although customers must remain cognisant of the facts around data storage in offshore locations.”


    And while there is a need for more multinationals to open data centres in South Africa to address the need to keep certain information within the confines of the country, the reality of seeing this happen is still a few years away.


    “Irrespective of this, integration is something companies are starting to take more seriously outside the traditional confines of the IT department. The benefits of doing this effectively cannot be ignored any longer,” Havlik concludes.

    Source: The SA Leader

  • The golden key

    By , director at PBT Group.

    Undoubtedly, cloud computing has changed the way in which IT serves companies. Virtually anything can be offered as a service (XaaS), and business intelligence (BI) in the cloud is no exception.

    In fact, a number of reputable vendors offer cloud BI as a service. Their options range from complete data warehousing and BI suites in the cloud to less complex data exploration and visualisation solutions: Birst, GoodData, MicroStrategy, Salesforce, Tibco Spotfire Cloud, SAP Lumira Cloud, IBM Watson Analytics, Microsoft Power BI, Amazon Web Services and Oracle BI Services, to name a few.

    Cloud BI solutions typically provide multitenant hardware and hosted software over the Internet: business data is imported, structured (if unstructured), data models are applied, and a Web-based user interface is generated for analysing and distributing reports and dashboards. These solutions are also compatible with smart mobile devices, giving access to analyses and dashboards from anywhere and everywhere.

    Additionally, cloud BI implementations are sold as alternatives to traditional BI, offering faster implementation times and a significantly different cost structure from traditional, on-premises data centre installations.

    The case for cloud

    Some companies use the cloud for proof-of-concept projects before actually implementing them in-house. Other notable benefits include greater agility, higher levels of innovation, and an improved ability to contain costs and reduce the need for capital acquisition. According to John Myers, who compiled a research report on analytics in the cloud, the top three financial drivers behind cloud BI platforms are minimised hardware and infrastructure cost, reduced implementation cost, and reduced administrative cost, in that order[1].

    However, cloud BI can come with its own challenges. For example, bandwidth constraints can negatively impact data transfers into the cloud; as such, large companies might still need to maintain on-premises sources for cloud BI or face additional costs for upgrading connectivity to the cloud. As a result, it is unlikely that all company data will move to the cloud. So businesses with large volumes of data may still need to transform raw data on-premises to reduce the size of the data, and only transfer ‘subsets’ of necessary company data. In other words, keeping data transfers small is important in cloud BI to manage both cost and upload/download bandwidth issues.
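    The on-premises reduction described above can be sketched in a few lines. This is a minimal illustration, not tied to any specific cloud BI product: raw transaction rows are aggregated locally to daily totals per branch, and only that much smaller subset would be uploaded (all names and figures are invented for the example).

```python
from collections import defaultdict
from datetime import date

# Raw on-premises data: one row per transaction (millions of rows in practice).
raw_rows = [
    {"day": date(2017, 8, 1), "branch": "CPT", "amount": 120.0},
    {"day": date(2017, 8, 1), "branch": "CPT", "amount": 80.0},
    {"day": date(2017, 8, 1), "branch": "JHB", "amount": 200.0},
    {"day": date(2017, 8, 2), "branch": "CPT", "amount": 50.0},
]

def summarise(rows):
    """Aggregate transactions to daily totals per branch on-premises,
    so only this small subset needs to be transferred to the cloud."""
    totals = defaultdict(float)
    for row in rows:
        totals[(row["day"], row["branch"])] += row["amount"]
    return [
        {"day": day, "branch": branch, "total": total}
        for (day, branch), total in sorted(totals.items())
    ]

# The upload payload is now the summary rows rather than the raw rows;
# at production scale the reduction is typically orders of magnitude.
subset = summarise(raw_rows)
```

    The same idea applies to any transformation that shrinks the data before it crosses the wire: filter, aggregate or sample locally, then transfer only what the cloud BI layer actually needs.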

    Other challenges that may lead companies to hold back from a cloud-based approach are performance and latency issues, the way the approach discourages customisation, and privacy and security concerns about sending sensitive business data beyond the corporate firewall.


    However, most cloud solutions now use sophisticated encryption techniques with strict security processes and certifications, often more advanced than many companies have internally. Security should therefore no longer be a worrying factor for adoption. Companies that have made the move to the cloud often state that the benefits – faster time to market, no on-premises software to maintain and simplicity of use – outweigh any downsides[2].

    An article from CIO states: “Business intelligence solutions were once the jurisdiction of the largest enterprises, which had the deep pockets and obligatory technical resources to implement what’s often a fairly complex technology. Today, though, the cloud brings BI within the grasp of small to medium-sized businesses as well. Smaller to medium companies get the benefits of enterprise software without having to have the on-premises footprint and the staff that typically would be required[3].”

    Although there is no rush to replace traditional BI applications with cloud alternatives in the near future, BI SaaS does have a number of legitimate use cases today. For large companies it often coexists with traditional BI, complementing it and improving customer transparency, while for small to medium organisations it is an affordable option[4].

    Cloud BI solutions have the potential to level the competitive advantages previously enjoyed by larger counterparts. The ‘pay per use’ costing model makes the cloud approach attractive to companies of any size, across any industry. According to Analytics in the Cloud, a January 2015 report by Enterprise Management Associates, adopter interest has been increasing over the past three years and can be expected to continue, at an accelerated rate[5].


    Source: IT Web

  • Wearable magic

    By Jessie Rudd - BI consultant at PBT Group


    In our neck of the woods, wearable tech is still very much a novelty.
    “Oh, you have an Apple Watch?”
    “Yes, it came with my phone” kind of novelty.

    Unfortunately, at its heart, this kind of technology is expensive and out of range for most people. Even with companies actively linking rewards and freebies to various wearables that can track data – how many hours you sleep, what you eat, how many steps you take – the uptake is still slow.

    However, there is real magic in what devices like these can do with the kinds of data being collected. The latest wearables can tell users how to lose weight, quit smoking, or even help them train and reach their fitness goals. Things like smart socks [1]. It may sound silly, but imagine the possibilities of a pair of socks that can help to improve running pace and technique, even going so far as recommending the type of shoes that should be worn. Cleverly built textile pressure sensors allow the device to collect real-time data. This data can be used by the wearer for feedback, as well as by a myriad of different companies: sports shoe designers, medical practitioners, marketing companies, etc.

    Endless possibilities 

    Imagine being guided to a destination by slight vibrations in the shoes being worn, and the kind of freedom that could give the visually impaired. The technology to do just that already exists – Lechal [2]. What about a device that detects and destroys cancer cells using nanoparticle phoresis? Already in development – Calico Labs [3].

    On Monday, 28 October 2013, the first abdominal surgery using Google Glass [4] was livestreamed simultaneously to the ‘Games for Health Europe’ congress and to YouTube. There is no doubt this laparoscopic surgery was just the tip of the iceberg. Proof of concept has already been simulated for the seamless transfer of patient vital signs into Google Glass by Philips Healthcare [5]. This opens the door to a whole new way for doctors to get the information they need, when they need it most. Imagine the power of a device that allows doctors performing surgery to simultaneously monitor a patient’s vital signs and react to changes – without ever having to take their eyes off the procedure or patient.

    According to Soreon Research [6], mankind is on the cusp of a wearable revolution in the healthcare sector. It projects that investment in healthcare wearables will grow from $2 billion in 2014 to $41 billion in 2020.

    CDW Healthcare [7] estimates wearable technologies could cut hospital costs by as much as an astonishing 16% over the course of five years.

    Mutual benefits 

    Even the most basic and entry-level wearables can monitor and gather wearers’ activity level, heart rate, and other vital signs. The opportunity for engaging the individual at a personalised customer level is now wholly within reach. Take, for example, Discovery Health’s recent Vitality Active Rewards [8] initiative. Reach your goals, as measured by a variety of wearable tech and apps, and get rewarded for various targeted activities. It’s a win-win: users get rewarded for getting fit, and being fit makes them healthier and less likely to claim from the medical aid.

    Wearables that measure slight changes in the daily routines of seniors and other vulnerable people already exist – CarePredict [9] – and healthcare developers are well on their way to actively implementing new wearable technologies for use in patients with Alzheimer’s, diabetes, macular degeneration and neuropathic pain. In harnessing wearable health technology, there is now an opportunity for healthcare leaders to find new ways to build engagement and create accurate views of the health of individuals and communities.

    Wearables are both producers and consumers of data. By their very nature, wearables are textbook generators of big data, with high velocity, volume and variety. As in any big data scenario, transforming that data into insight and action requires powerful, scalable analytics, data visualisation and a transparent reporting platform. Wearables in healthcare share many characteristics with the sensor networks of Internet of Things applications. However, healthcare adds layers of complexity, particularly regarding security.

    The future is perhaps not hoverboards and self-lacing Nike shoes. Not yet, anyway [10]. But what it is, is still pretty awesome. If you have any doubts, just have a look at the Scully [11], the result of a crowdfunding initiative.




    Source: IT Web

  • Investigating the shifts in technology impacting SA business

    Remaining relevant means organisations have to embrace evolving into digital businesses. Driving this are several technology trends that have the potential to disrupt. Several local industry leaders provide insight into some of these drivers for change.


    Wireless connectivity and mobile devices

    Bruce Pitso, regional manager for South Africa at Ruckus Wireless, believes that the increase in affordable personal computing devices is resulting in changing expectations around connectivity and how information is accessed. “Already, we are seeing devices being launched that only support Wi-Fi. This is pushing wireless adoption in public spaces like restaurants, coffee shops, and shopping centres. It also means many properties are being developed with Wi-Fi as a requisite. Couple that with the integration between home automation solutions and mobile apps, and then you have an environment conducive for significant growth.”


    In South Africa, there is likely to be a concerted push towards more wireless hotspots. This will not be limited to retail environments and hospitality, but extend into the corporate, recreational, warehousing and educational sectors. According to Pitso, brands will invest in Wi-Fi and leverage the connectivity it brings for various marketing opportunities.


    The connectivity discussion will also encourage partnerships with Internet service providers and allow businesses to realise their return on investment (ROI) in unified communications and cloud investments.


    Adding to this, Armandè Kruger, regional sales director at PBT Group, says mobile devices in this changing environment should no longer be viewed as tools, but rather as an extension of the individual.


    “Mobile is becoming more personal. In a sense, it is the electronic fingerprint of people. Going forward, we will see mobile devices used for verification, tracking, commerce, classification, identification, and so much more,” says Kruger.


    For Frank Rizzo, data analytics leader at KPMG, the dominance of mobility in the digital world will see industry players shift their focus from traditional to mobile computing. “A significant change is on the horizon. The rising focus on the mobile platform is affecting a number of business aspects, including ecommerce spending and online advertising. And then there is augmented reality that is also growing rapidly thanks to mobility. Against the backdrop of steadily increasing processing power, the future holds significant potential for this as can be seen with the development around wearable computing.”


    Messaging platform

    Looking beyond wireless and the associated devices, there is enormous opportunity for a dominant messaging platform to displace SMS, says Grant Theis, co-founder of ttrumpet. “Messaging underpins everything people do in the digital world and yet we are still to scratch the surface of its capabilities. For me, the next wave of killer apps will be built on top of messaging and become an indispensable function of our connected existence.”


    Theis says the development of these apps will enable people to solve a range of business challenges, consume content more easily, play games, and conduct financial transactions, amongst others. “Many consumer businesses are being built on top of messaging platforms. This is ushering in an age of more hyper-local development. It is no longer good enough for global platforms to adopt a cookie cutter approach to solve business problems in countries, provinces, municipalities, and even local neighbourhoods. Solutions have to be customisable and meet the needs of the user communities they serve.”


    Data and the cloud

    According to KPMG’s Rizzo, the breakneck pace at which technology is moving sees it becoming one of the greatest agents of change in the modern world. “Social, mobile, analytics, cloud, and the Internet of Things have become driving forces behind the rapid evolution of digital businesses. These technologies will only be more amplified as we usher in 2016.”


    He says that data and analytics are likely to become a key basis of competition, underpinning new waves of productivity growth, innovation, and consumer experience by 2020.


    PBT’s Kruger agrees. “Thanks to the hype around big data, analytics will continue to receive interest and adoption next year. Software developers will likely focus their efforts on developing algorithms for data that will automate up to 80% of all daily decisions people have to make.”


    Rizzo adds that the cloud computing model is still very relevant with no other trend impacting the world of IT as significantly in the past decade. “Underpinned by both technology and economic disruptions, the cloud will fundamentally change the way technology providers engage with business customers and individual users as it is a key driver for mobility and data analytics.”


    Kruger expects the cloud to introduce bold movements in the organisation in 2016. “Previous barriers like security, bandwidth, and privacy are also becoming either a non-event or are in the process of being properly addressed,” he says.


    Embedded computing and wearables

    Rizzo also says the move towards embedded systems is an interesting one to take note of. “With technology erasing the boundaries between hardware and software, embedded systems are expected to bring the new wave of change.”


    The increasing use of data generated by wearables, bring-your-own-device initiatives and social media platforms is also leading to more pre-emptive analytics of data.


    “These analytics are likely to encompass automated application responses based on inputs received from analytical models. Due to advances in technology, faster and more real-time analytics will be possible through in-memory analytics and in-memory processing. Next year could be the one where these aspects gain considerable momentum in organisations,” concludes Kruger.


    Source: The SA Leader

  • The art of advanced analytics

    By Masindi Mabogo - Director at PBT Group

    The Nazis’ Enigma machine

    For many years during the 20th century, the German military used the Enigma cipher machine to encrypt its secret messages. The Enigma machine had a second set of letters known as the ‘lamp board’. When a letter was pressed on the keyboard, the lamp board would light up a corresponding ciphertext letter to represent the original letter typed, creating the encryption for the message.

    The science behind Enigma was a circuit made up of three to four rotors, whose settings could be changed daily to create roughly 150 quadrillion possible encryption configurations, with the rotors stepping on each letter typed. This presented an immense challenge for Allied code breakers before and during World War II (WWII).
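    The rotor principle can be illustrated with a toy single-rotor cipher. This is a drastic simplification for illustration only (the real Enigma combined multiple rotors with a reflector and plugboard, and the wiring below is just one fixed permutation): because the rotor steps on every keypress, the same plaintext letter encrypts differently at each position.

```python
import string

ALPHABET = string.ascii_uppercase
# A fixed permutation of the alphabet, standing in for a rotor's wiring.
WIRING = "EKMFLGDQVZNTOWYHXUSPAIBRCJ"

def encrypt(message, start=0):
    """Substitute through the rotor; the rotor steps once per letter typed."""
    out = []
    for i, ch in enumerate(message):
        offset = (start + i) % 26            # rotor position for this keypress
        out.append(WIRING[(ALPHABET.index(ch) + offset) % 26])
    return "".join(out)

def decrypt(ciphertext, start=0):
    """Invert the substitution, stepping the rotor in the same way."""
    out = []
    for i, ch in enumerate(ciphertext):
        offset = (start + i) % 26
        out.append(ALPHABET[(WIRING.index(ch) - offset) % 26])
    return "".join(out)
```

    Note how encrypting "AAA" produces three different cipher letters; this constant re-keying is what defeated simple frequency analysis and forced the Allies to build machines rather than rely on pen-and-paper code breaking.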

    Turing’s Bombe

    According to Cambridge University’s overview of Alan M Turing’s life, he was a mathematician, cryptologist, logician and computer scientist who played a key role in cracking Germany’s Enigma code, allowing the Allies to intercept key information about targets, supply deliveries and the overall intelligence of the German military, and helping them win WWII.

    Together with his team at Bletchley Park, he created the code-breaking machine known as the British Bombe, using the principle of contradiction and extraordinary mathematical insight. The team also came up with a system for deciding which cracked messages should be passed along to the British Army, Navy and RAF, to avoid raising German suspicion that the code had been cracked.

    Advanced analytics art

    There have been numerous accounts of major events where the art of looking at the past to identify trends and patterns, in order to evaluate the present and predict the future, has been applied with great benefit. This ‘art’ uses analytical techniques based on complex learning algorithms to craft models used to predict future outcomes, all with a focus on establishing a mathematical equation as a model to represent the connections between the different variables in consideration.

    Advanced analytics (AA) requires knowledge of past behaviour to generate profiles that are then used to assess current behaviour and predict possible outcomes. Wayne Eckerson (as if he were describing the Bombe machine) put it into perspective: “An analytical model estimates or classifies data values by essentially drawing a line through data points. When applied to new data or records, a model can predict outcomes based on historical patterns.” The Bombe machine required a short phrase (new data) likely to appear in the encrypted message in order to work out the Enigma encryption configuration used to generate the code.
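    Eckerson’s “line through data points” can be made concrete with an ordinary least-squares fit, sketched here from first principles (the data points are invented purely for illustration, e.g. historical spend versus outcome):

```python
def fit_line(xs, ys):
    """Ordinary least squares: slope and intercept of the best-fit line."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
        / sum((x - mean_x) ** 2 for x in xs)
    intercept = mean_y - slope * mean_x
    return slope, intercept

# Historical observations (illustrative): input variable vs. observed outcome.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 3.9, 6.2, 7.8]

slope, intercept = fit_line(xs, ys)

def predict(x):
    """Apply the fitted model to new data to predict an outcome."""
    return slope * x + intercept
```

    Once the line is fitted from historical data, `predict` plays the role Eckerson describes: new records are pushed through the model to estimate outcomes based on historical patterns.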

    Further, these profiles or models could be interacted with to simulate interventions and potential future outcomes before deciding on the course of action(s). In the Enigma machine, it was understood that by merely using the rotors, one could change the encryption configuration, with over 150 quadrillion possibilities.

    AA relies on sophisticated quantitative methods to produce insights that traditional approaches to business intelligence are unlikely to discover. These methods are built by statisticians, mathematicians or data scientists, similar to Turing and his team.

    Predictive analytics, data mining, big data analytics, location intelligence and sentiment analysis are just some of the recent analytical tools that fall under the heading of AA. These might also include non-predictive tools such as clustering. Sentiment analysis is often associated with social media more than AA.

    While these analytical practices focus on forecasting future events and behaviours, as well as extracting complex opinions, they also allow businesses to conduct ‘what-if’ analyses to predict the effects of potential changes in business strategy, and to assess positive or negative connotations in any type of data available.

    The Enigma and Bombe machines stand out as AA ambassadors. The Enigma led the Germans to many victories through the secure communication it offered, while the Allies snatched victory in WWII by creating the Bombe to crack that encryption. It is believed that cracking the code shortened the fighting between the German and British forces by up to two years. Both machines were built from pure AA techniques, by mathematicians, statisticians and/or data scientists.

    Looking at these examples, what becomes important today is for businesses to be able to effectively derive value from AA – for future events to be predicted that are meaningful to the overall running of a business and its processes. AA has become an important part of the broader business intelligence process and a true bearer of opportunity to unlocking key competitive advantages – just as it was back then.

    Source: IT Web

  • IoT: everyone is excited, except me!

    The Internet of Things (IoT) has become a buzzing topic these days.

    By Venkata Kiran Maram, BI Consultant at PBT Group

    IoT is fundamentally a concept that describes a future where everyday physical objects will be connected to the Internet and able to identify themselves to other devices. These smart devices, systems and services that “communicate” with other devices via the Internet aim to make our lives easier, and offer us many benefits.

    However, many people are distracted from the implications that get ‘swept under the carpet’ – mainly the security and privacy issues.

    Have you ever thought about the consequences of living in a world where everything generates data about you?

    As more and more information becomes available to devices, and those devices are connected, that information also becomes readily available to hackers. These connected devices also collect, transmit, store and often share large amounts of consumer data, some of it highly confidential and personal – creating privacy risks.


    In recent times, a number of distressing events have been reported, including attempts to hack Web-connected CCTV footage, as well as numerous hacks on things like smart TVs, Internet routers, connected fridges, baby monitors and washing machines, to name only a few.

    While all these kinds of products are beneficial to us, it must be remembered that for many of them, security is not the manufacturer’s major priority. Their main focus (and rightly so) is on the actual function of the product – like turning on the TV, or monitoring your baby’s sleep.

    Our laptops and smartphones, which most of us use almost every day, listen to us when we’re making calls, both audio and video – and we shouldn’t forget this. There are several ways in which a hacker can turn on the microphones on these devices without you being aware, and sometimes we even switch them on ourselves (not knowing the potential risk) – for example, when we use the voice-processing systems on our devices, such as “Siri” on our iPhones.


    “The reality is that our computers, laptops and mobile devices are tracking us even when we are idle. In fact, today one of the most commonly used free email accounts, pays attention to everything you type and conveniently displays advertisements based on your subject matter.”

    We are slowly moving towards an era where everything will be connected, and while this may seem exciting and will be beneficial to consumers and many businesses, it also comes with substantial security and privacy risks. And these need to be considered.

    Implementing secure access control and device authentication may seem like the most suitable solution; however, we are dealing with more than the average connected device here. Successful implementation is difficult to achieve without affecting the user experience or having to include hardware that is not really necessary.

    As a society, we need to explore how to stay secure and maintain privacy in our personal lives without the risks of IoT interfering.

    We need to understand IoT effectively, to make sure that it actually benefits us and doesn’t leave us in a vulnerable position.

    Privacy is a prerequisite for free expression, and losing that, in my opinion, would have a huge impact on our society. So yes, embrace the concept, but with your eyes wide open.


    Credit: Tech Financials

  • When good data goes bad

    Data hygiene ensures a data warehouse is populated with accurate and complete data.

    By Jessie Rudd, BI consultant at PBT Group
    Johannesburg, 2 Oct 2015

    Dirty data might sound like something that belongs in a Clint Eastwood movie made for the 21st century. However, it is actually the umbrella term used to describe data that contains errors. This could be misleading, duplicate, inaccurate or non-integrated data, but also data that violates business rules – such as data without generalised formatting, or data that is incorrectly punctuated or misspelt; not for one moment forgetting fake data.

    In the world of data warehousing, big data, social media, etc, any company worth its salt will have many procedures and practices in place to try and limit the amount of dirty data being stored and potentially consumed. However, some data is scrubbed and vetted, stored and consumed, but goes bad over time. And no matter how thorough the process, the occasional Mickey Mouse, Donald Duck or Luke Skywalker will make an appearance on most B2B customer profiles.

    Netprospex’s “The State of Marketing Data 2015” [1] found that overall e-mail deliverability rates continue to introduce unnecessary risk into e-mail marketing programmes, with the average company database deliverability having a less than optimal health scale rating of 3.2 out of 5 – just barely above questionable.

    Even more disturbing, the study found record completeness only garners a measly 2.9 out of 5. Lead scoring, lead routing, effective content personalisation and Web customisation are all highly dependent on having actionable information about each prospect or customer. Most companies with limited budget and skill simply don’t have the time necessary to wait for progressive profiling to kick in, and many can’t afford to compete against fake form data. At a point in time, the information provided by a customer is probably correct, barring human error. However, what happens when a customer’s domain, position or company changes?

    Physical and e-mail addresses go bad over time, cell numbers change, and profile information can be fake or incomplete – these are all very real issues facing marketing departments across the globe today. A marketing campaign is only as robust and successful as the number of customers it reaches and converts. So what is the solution?

    Coming clean

    Data hygiene refers to the procedures put in place to ensure at any given moment, a data warehouse is populated with the most accurate and complete data. This is done by laying the proper foundation, and then building on that foundation a process of accountability. This can be done by actioning the following:

    Groundwork: Any marketing campaign is only as good as the leads it generates. A full, thorough and complete understanding of the target market is the only way to convert ideas to leads, to offers, to business, to profit. A comprehensive data warehouse, as well as an intrinsic understanding of the customer that resides in that warehouse, should form the backbone of any company’s business intelligence department. If a company understands the story its data is telling, then marketing to the correct customer should be a given. Data quality is all about teamwork.

    Cleanse and append: All inactive, duplicate, and junk contacts should be purged from the data warehouse. Once bad data is removed, the company might find itself with fewer contacts than expected, but it will also have a more valuable insight into the business.

    “The occasional Mickey Mouse, Donald Duck or Luke Skywalker will make an appearance on most B2B customer profiles.”

    Also, if the company is unable to continually replenish its database with fresh leads to make up for the loss, it might be worth considering working with a vendor that can enrich the database and fill in missing contact information from its own database of records. Another solution is to put a procedure in place whereby existing customer information is augmented by freely available social media content.

    While this may be a more complicated method of enriching customer data, it is fast becoming a must-have for any B2B company. Social media profiling is well on its way to becoming an integral part of most marketing campaigns.

    Make it a routine: Fundamental to any good database is the understanding that it is almost impossible to keep bad data from entering it. That is one of the most important reasons why companies need to make data management a priority. The routine checking, cleaning and appending of data to ensure information is always complete and up to date is one of the most important steps in preventing dirty data and data decay.
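
    Such a routine can be as simple as a scheduled job that reports on completeness and staleness. A sketch, assuming hypothetical field names such as last_verified:

```python
from datetime import date

# Assumed required fields for a "complete" contact (illustrative only)
REQUIRED_FIELDS = ("name", "email", "phone")

def hygiene_report(contacts, today, stale_after_days=365):
    """Return completeness and staleness percentages for a contact list."""
    total = len(contacts)
    if total == 0:
        return {"complete_pct": 0.0, "stale_pct": 0.0}
    complete = sum(
        1 for c in contacts if all(c.get(f) for f in REQUIRED_FIELDS)
    )
    stale = sum(
        1 for c in contacts
        if (today - c["last_verified"]).days > stale_after_days
    )
    return {
        "complete_pct": round(100 * complete / total, 1),
        "stale_pct": round(100 * stale / total, 1),
    }
```

    Run on a schedule, falling scores signal data decay before it undermines a campaign.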

    Maintaining complete and accurate business contacts is critical to an organisation’s overall success. Data is at the heart of almost every marketing and sales strategy.

    The half-life of data (in essence, how long a piece of information remains viable before it goes bad) is probably nowhere near as long as people would like to think it is. If companies don’t act now – and fast – their customer-centric data may soon be at the point where it is next to useless.

    [1] Netprospex Benchmark Report 2015

    Credit: ITWeb

  • The data revolution

    The business case for master data management and data quality.

    By , strategic BI manager at PBT Group.
    Johannesburg, 1 Sep 2015


    The British Industrial Revolution of the 1700s truly revolutionised the world, changing the way people travel, work, eat and live.

    However, the Industrial Revolution resulted from the pebble thrown in the pond by way of the Agricultural Revolution. Britain’s colonial dominion in the world gave it access to a vast agricultural diversity, and its influence and investment in these colonies resulted in significant technological innovations and developments, increasing the productivity of farms.

    This Agricultural Revolution then resulted in excess wealth, raw produce such as food and especially cotton, as well as spare workforce capacity, as farm workers migrated to urban areas in search of work. Surplus produce and population shifts resulted in a dire need to process and distribute the produce. The excess wealth was sensibly applied to spur technological developments in automation (most significantly, probably, the textile industry), metallurgy and transportation, which were effectively empowered by one key innovative breakthrough: deriving coke from coal as a key energy source for the steam engine and numerous manufacturing machines.

    The outcome of the Industrial Revolution is life as we know it in the global village, where technological innovation is the norm rather than the exception, together with all its social ramifications of unemployment, urbanisation, increase in crime, etc.

    Cliff’s edge

    This brief history lesson sets the context for the data revolution, which is a natural progression from its agricultural pebble in the pond – the digital revolution. Developments in digital storing and digital processing, together with the Internet and social media since the 2000s, now leave the industry at a precipice: there is a data explosion[1] on hand, with data being “excess produce”, and digital and data technological innovation being “excess wealth”. People must learn very quickly how to make sense of all the data at hand, before it explodes and pushes everything and everyone off the precipice.

    Big data technology would be the one innovation to highlight, as I believe it is the analogous “coke derived from coal” that will fuel the data future. However, big data is not the silver bullet that will ensure a bright future. It is merely an innovative resource that needs to be honed and applied mindfully to ensure return on investment. Quoting Gartner from its Top 10 Strategic Trends for 2015[2] when referring to trend number four, analytics: “Big data remains an important enabler for this trend, but the focus needs to shift to thinking about big questions and answers first, and big data second – the value is in the answers, not the data.”

    So, what key practices are needed to transform the data explosion into a data revolution? Big data innovation needs to be accompanied by the technology and disciplines of master data management (MDM) and data quality management, similar to manufacturing developments being accompanied by rigid health and safety regulations and quality standards.

    The whole truth

    Data quality management disciplines ensure the big data generated or leveraged effectively reflects and represents real-life truths. Just like a consumer wouldn’t like finding out a take-away burger is produced from rat meat, the consumer would also not like to discover that decisions thought to be based on an understanding of customers in South Africa were in fact based on data collected about people of a different nationality living in the US.

    “The Industrial Revolution resulted from the pebble thrown in the pond by way of the agricultural revolution.”

    Granted, that’s an extreme example, but it illustrates the importance of data quality management. Data quality technology enables users to measure and monitor data quality in all the diverse data stores. Best practice data quality discipline is to implement controls in source to prevent data quality degradation, but data quality tools also enable reactive data cleansing and improvements.

    Master data management enables users to manage the contextual data relating to their key data entities to set standards, ensure consistency, and increase confidence in the interpretation of data trends and analytics. I would imagine a vehicle manufacturing plant driving off into the abyss of bankruptcy if what it thought was a stainless steel exhaust was in fact made of PVC. Just so, effective master data management is crucial to ensure all the various stakeholders in the value chain understand the meaning of all the descriptive or contextual data elements of their key data entities, such as customer, product, campaign, or even organisational structure. Beyond just understanding the meaning, it is crucial that all stakeholders have access to the same consistent view of such data.
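
    As an illustration of measuring and monitoring data quality, a rule-based check might look like this Python sketch; the rules and record fields are hypothetical, not a vendor API:

```python
# Each rule is a named predicate over a record; scores are pass rates per rule.
RULES = {
    "country_present": lambda r: bool(r.get("country")),
    "country_is_za":   lambda r: r.get("country") == "ZA",
    "id_not_null":     lambda r: r.get("customer_id") is not None,
}

def quality_scores(records):
    """Measure, per rule, the share of records that pass (0.0 to 1.0)."""
    total = len(records)
    return {
        name: sum(1 for r in records if rule(r)) / total
        for name, rule in RULES.items()
    }
```

    Tracking these scores over time is the monitoring half; controls in source systems then prevent the scores from degrading in the first place.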

    In May 2015, Germany announced its aggressive investment to initiate the fourth industrial revolution, referred to as Industry 4.0[3]. The essence of Industry 4.0 is “smart factories” based on artificial intelligence in all aspects of the manufacturing value chain.

    This artificial intelligence will be dependent on impeccable big data to learn from, but with the current state of data, where debates about inconsistent figures on reports are still pervasive in most boardrooms, I predict “smart factories” driven by “confused intelligence”. Industry 4.0 must be preceded by a data revolution, which cannot be achieved without effective MDM and data quality management.

    In August 2014, UN secretary-general Ban Ki-moon issued a mandate for UN members to bring about a data revolution to improve reporting on sustainable development[4]. May the private sector be the leaders and catalysts for this, and not the followers!


    Credit: ITWeb


  • The business data lake

    Companies use data lakes as a landing place to hold large volumes of data in their original form. 


    By , director at PBT Group.
    Johannesburg, 29 Jul 2015

    According to Margaret Rouse*: “A data lake is a large object-based storage repository that holds data in its native format until it is needed.”

    Martin Rennhackkamp** called it: “A scaled-out all-encompassing free-for-all staging area.”

    In simplicity, a data lake is a large, easily accessible landing place that holds massive volumes of structured and unstructured data in their original form.

    There are many write-ups offering various narratives on why the data lake came about. Let’s attempt to gather these motivations, with the objective of circumventing the technical jargon.

    The audience

    Data scientists, analysts (super/technical users) and developers were the primary targeted beneficiaries for the data lake invention. The data lake speaks to their needs for “quick and elastic” data access without the obstacles of data warehouse (DWH) bureaucracy. It also affords them an opportunity to deal with other types of data (unstructured) that previously presented challenges to the DWH ecosystem.

    In recent years, the data lake innovation has seen adoption beyond this targeted audience, challenging the technology stack to support novice and non-technical users.

    The inclusion of novice users remains an area deserving improvement, particularly in the provision of user-friendly tools for mining these data lakes.

    Unstructured data

    The data lake is built from the ground up as a big data solution, where even unstructured data is considered to hold a valid passport to live in the data lake ecosystem.

    The data lake is synonymous with big data, with its warm hospitality for both structured and unstructured data, alleviating the need for users to switch between environments to unlock the new and unified forms of business value promised by the ‘holy matrimony’ of structured and unstructured data.

    The data lake arose in response to new types of data (video, audio, images, text file, binary, etc) that needed to be captured and harvested for enriched corporate insights and competitive advantage.

    Quick data take-on

    The approach of just dumping information “as is” into the data lake sets aside the rigorous and time-consuming technical complexities engraved into the data warehouse’s DNA. This allows data to be made available for business use timeously. Although these technicalities are detached from the data intake steps, they are moved to a step often called “distillation”.

    The data lake is synonymous with big data, with its warm hospitality for both structured and unstructured data.

    Distillation can be approached in cyclical iterations, as and when the data needs to be used. In this step, the business users create map(s) against the data in the lake to generate the view of the data that fulfils their immediate requirements. The mapping process takes a fraction of the time due to the notion of focusing on the immediate and specific requirements. In other words, the structure and interpretation of the data is only done when it is used – this is called “schema on read”, as opposed to the “schema on write” approach that is used in data warehousing.
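
    The “schema on read” idea can be sketched as follows, assuming raw JSON events with invented field names; the structure is imposed only when the data is read, not when it lands in the lake:

```python
import json

# Raw events land in the lake "as is"; no schema is imposed at write time.
raw_lake = [
    '{"cust": "A1", "amt": "120.50", "ts": "2015-07-01"}',
    '{"cust": "B2", "amt": "75.00", "ts": "2015-07-02"}',
]

def read_with_schema(raw_records, schema):
    """Apply a schema only at read time ("schema on read")."""
    out = []
    for line in raw_records:
        rec = json.loads(line)
        # Keep only the fields this view needs, cast to the requested types.
        out.append({field: cast(rec[field]) for field, cast in schema.items()})
    return out

# The consuming business user defines only the view needed right now,
# ignoring fields (like "ts") that are irrelevant to the immediate question.
sales_view = read_with_schema(raw_lake, {"cust": str, "amt": float})
```

    A second team could later map the same raw records to a completely different view, without any rework of the ingestion step.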

    The cost

    All data lake write-ups have some cost benefit arguments and they all seem to be riding on the wave of plummeting storage costs. They further embrace the concept of quick turnaround time as well as immediate return on investments, with the benefits of starting small and scaling-up as required. Others cite the cyclical approach that allows cost to be distributed across lines of business at the time of data consumption.

    All of the above holds true until data is ingested into the lake. However, data exploration technology remains an area of “unknown cost”, mainly due to the low maturity of the technology/application enablers used to interrogate the data.

    The term data lake is being accepted as a way to describe any large data pool in which the schema/structure and data requirements are not defined until the data is queried. The innovation culminated from the thirst of technical teams for quick access to all forms of data types. The data lake has also earned a reputation as a cost-effective solution, in that it serves the business need for local views.

    As Rennhackkamp says: “If the data lake is used correctly in the BI ecosystem, together with the data warehouse being used for what it, in turn, is good for, one can have a synergistic extended BI ecosystem that can really provide good information and insights to the business as and when needed.”


    Credit: ITWeb

  • Insurers need to start using predictive analytics or lose out!

    By Dr Corine Van Erkom Schurink

    While predictive analytics is a hot topic, many local businesses are hesitant to embrace it. However, given the significance of providing valuable insight on the current and future performance of a company, implementation has to happen sooner rather than later.

    Some decision-makers are concerned about the complexity of integrating with business functions that are seemingly quite disparate – think marketing and IT for example. However, once the value of predictive analytics is understood, as well as the associated processes and data requirements, then the choice becomes clear.

    Dr Corine Van Erkom Schurink, Advanced Analytics Team Leader at PBT Group

    While it makes sense to utilise predictive analytics in sectors like financial services and insurance, it really is something that can benefit any organisation in this age of connectedness and Big Data.

    While predictive analytics requires deep data, insurers today, who often work with silos of information, need to recognise that a good analytics dataset typically consists of a mix of integrated data (not separate information): customer data, historical policy or product data, and intermediary/agent performance records. By recognising this, they will be able to reap the benefits predictive analytics can bring to an insurance business.

    For insurers, predictive analytics can aid in ‘flagging’ customers who are most likely to commit fraud at an early stage of their life cycle, for example. It can also help predict the performance of a company representative or selling agent and their likelihood to ‘drop out’. Just think about the cost-savings that could be achieved.

    Similarly, in the case of predictive fraud, actuaries and underwriters can now quickly reduce the risk exposure of the company by adjusting the rules supporting decisions and the algorithms that determine new policy premiums or claim settlements – which will reduce expenses even further at individual customer levels.
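
    As a toy illustration of such fraud flagging, consider a hand-rolled logistic score over invented policy features. The weights and feature names are made up for the sketch; a real model would be fitted on historical claims data:

```python
import math

# Illustrative weights only, not an actuarial model; a real model would be
# trained on historical claim and policy data.
WEIGHTS = {"claims_last_year": 0.9, "months_as_customer": -0.05, "bias": -1.0}

def fraud_score(policy):
    """Return a 0-1 fraud-likelihood style score (logistic function)."""
    z = WEIGHTS["bias"]
    z += WEIGHTS["claims_last_year"] * policy["claims_last_year"]
    z += WEIGHTS["months_as_customer"] * policy["months_as_customer"]
    return 1 / (1 + math.exp(-z))

def flag_for_review(policies, threshold=0.5):
    """Flag policies whose score crosses the review threshold."""
    return [p["id"] for p in policies if fraud_score(p) >= threshold]
```

    The same scored output could feed the rule adjustments described above, so that premiums or claim-settlement decisions reflect the predicted risk.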

    So why are South African companies so slow in embracing this technology?

    For one, local universities have been slow to adopt a curriculum that places data scientists in the market. Another reason is that companies fail to understand what is needed to effectively implement predictive analytics. Far too often it becomes one of the ‘functions’ of the IT department, which has no real sense of the business directives of the company and of the strategic business implementation of the advanced analytics outputs. There has to be inter-functional consultation, shared resources, and shared budgets.

    Some might argue that if all insurers implement predictive analytics there will be no real differentiation. However, the ones that are able to go back to their ‘roots’ and use traditional marketing tools such as pricing, quality of service, innovative products, and effective service, will be the companies that are able to supercharge their offerings. The level of effectiveness in how predictive analytics is leveraged and actioned across the organisation will also contribute to the gain of competitive advantage.

    Insurers adopting predictive analytics will not only reduce their risks and related costs, creating larger profits for reinvestment, but also free a large amount of resources that can be redirected to new projects. Those who adopt predictive analytics will become slicker and less ‘cash strapped’ than the insurance companies who will have missed the boat.

    So, for those who are still undecided about the merits of embracing predictive analytics, think about the lost revenue opportunities and whether that can be written off in such a competitive economy.


  • Pulling data to good use

    Author, inventor and undersea explorer Sir Arthur Charles Clarke once said: “Any sufficiently advanced technology is indistinguishable from magic.” While the field of data analytics has experienced some drastic changes over the last decade – in terms of variety, volume and velocity of information – industry experts argue that the analytics space still has some way to go before it can be likened to the level of enchanting innovation described above.

    And the steward of this journey is the data scientist. Described by the Harvard Business Review as one of the sexiest jobs of the 21st century, a data scientist is an individual with the required skillset and mindset to discover the untold stories hidden in the world of big data. Data scientists have the expertise and knowhow to deploy and explore massive amounts of data in an agile and flexible manner, says Yigit Karabag, information management and analytics practice manager for SAS in the Middle East, Turkey and Africa. “Companies should be recruiting this new breed of professionals. They’re most definitely the key to innovation and are having a tangible impact on business and society.” The emergence of an elaborate data community only serves to reinforce the value of these individuals and drive the demand for data scientists.

    For Steven Ing, associate consultant at BSG, the key differentiator in today’s ever-changing business landscape is the effective use of the talent and intelligence at your disposal. Acknowledging that the number-one focus area for technology spend globally is data and data analytics, he believes organisations should position these talented people to better tap into the potential of data as a strategic business asset. Given the current data-rich business landscape, Davide Hanan, MD of Qlikview South Africa, calls on organisations to not just rely on data experts, suggesting businesses educate as many people as possible – at all levels of the organisation – to work with data. By empowering all business users to create their own data apps and reports, the business is freeing up IT professionals to focus their attention on actually managing the data and keeping it secure.

    Read More

  • POPI: More than consumer data

    Just as companies start to become complacent in achieving POPI (Protection of Personal Information) compliance, the deputy minister of justice, John Jeffery, announced last week that the ball on establishing the regulator is starting to ‘roll’.

    Remember, POPI will have a huge impact on the record-keeping and non-disclosure disciplines of public and private bodies, not only in relation to personal information kept regarding consumer data, but in terms of impacting employee data as well.

    In this Industry Insight, I will convey a brief insight into some common risk areas of non-compliance in the HR function. The opinions are based on the experience of some of PBT Group’s principal consultants who advise clients’ data management functions.


  • Local insurance industry slow to embrace new technology, TechFinancials

    By Gerhard Botha

    Telematics and big data have generated much interest globally within the insurance industry. Yet, there is still reluctance amongst some SA insurers to adopt these practices. While local firms have the capability and know-how to implement, the low margins seem to compel them to remain conservative.

    The prevalence of fraud in the industry has also necessitated many insurance firms to focus on security and other preventative measures, instead of investing in the likes of telematics and big data. Limited budgets mean that these technologies are simply not on the priority list.

    Scepticism around the benefits that these provide, coupled with questions around their robustness and trustworthiness, is part of this hindrance. Certainly, telematics is useful to understand behaviour but it is also very easy to manipulate.


    Gerhard Botha – CTO of PBT Group


    On the big data side, many insurers do not see themselves as having large volumes of information. Many of the decision-makers feel that their current data management technology is sufficient for any analytical needs they might have. However, there still remains a demand for more skills to understand and exploit existing data assets.

    Of course, as with any new technology, the biggest concern revolves around the connectivity required to implement and monitor it properly. Data in South Africa remains expensive and, at times, sporadic. With both telematics and big data requiring high volumes of data to be transferred, there is simply not the capacity and reliability to work with a reasonably safe level of trust.

    And you cannot forget the grudge purchase aspect of telematics. It requires technology to be installed in customer vehicles – and people still ask why they should pay for something they do not want. Additionally, telematics is of limited use to insurers because of the difficulty of isolating poor driving behaviour. For example, through telematics it is difficult to determine whether someone was racing or merely changing lanes in dense traffic – and therefore whether the insurer should pay out, should an accident occur.

    However, with telematics expected to merge with the Internet of Things (IoT), more investment will be made in ensuring the stability and effectiveness of the technology. This is also contributing to the development of tools such as accelerometers, pressure gauges, speedometers, and location sensors to add to the value proposition of both telematics and the IoT.

    And while we are still some time away from it being implemented, self-driving vehicles could significantly impact the insurance industry. This will also mean that significant telematics and big data technology would have to be rolled out.

    However, even though companies are slow to adopt telematics and big data, there are positive signs. With a few insurers already developing customer solutions around these and others launching pilot projects and proofs of concept, this could be an exciting year for the South African market. Of course, whether customers will embrace these innovations remains to be seen.

  • Predict the future, ITWeb

    The ephemeral torrent of big data streaming across networks, mobiles, wires and inboxes every second of every day is a powerful source of information if harnessed correctly. Data technology continues to evolve, but its adoption is fragmented and much of the value inherent in all of this information is lost. Advanced analytics is the key to grabbing hold of this data and using it to make informed business decisions, transforming the way organisations plan, interact and grow. “Advanced analytics is the extensive use of data, statistical and predictive models and other kinds of business simulation alongside daily, fact-based management to drive decisions and actions,” says Suren Govender, MD of Accenture Analytics. “Our clients face some real challenges in their businesses today and with advanced analytics, we assist them in harnessing and using diverse data sources at their disposal to give them a real competitive edge in their industries.”
    Delving into the depths of advanced analytics provides the organisation with significant advantages. It can refine productivity and efficiency through improved operational intelligence and near real-time responses. Insight is shared across enterprise silos, building comprehensive overviews of both structured and unstructured data. It can improve return on investment and inspire innovation throughout the enterprise with visualisation and the elimination of redundant infrastructure.

    Read More…

  • The dawn of data science, ITWeb

    For the great explorers of the past, navigation without a compass would have been unthinkable. Today, not only are we able to determine in which direction we’re headed, but we have a wealth of other useful information at our fingertips. From location co-ordinates, to weather conditions, terrain, traffic congestion, crime hotspots and toll roads, we’re able to plan our journeys down to the smallest details. But, unless you have the necessary ways and means of managing and interpreting this information overload, you will still be lost.

    The same can be said for businesses, says Armandè Kruger, regional sales director at PBT Group. “The business that has the correct information first and has the best ability to derive value from it and act on it will be the one that outperforms its rivals.” Kruger believes business owners with money to invest this year should be spending that extra cash on big data analytics.

    A company’s data is one of its most important assets, and yet data management remains a headache for most, says Jason Barr, divisional manager of storage and availability at XON. With the exponential growth of data, IT departments often struggle to come up with an effective data management strategy, especially one that is secure, reliable and cost-effective. One of the key steps, notes Barr, is to avoid operating in silos because this hinders a business’ ability to create a holistic approach to management. Data management should ideally span from inception to expiration. The goal must be to close the gap between the business and IT systems, he continues.

    Read More…

  • Data storage in the cloud – is South African business ready? ee Publishers

    Are businesses in South Africa ready for storing their data in the cloud? It is actually a redundant question: in several recent security surveys, respondents were asked if they would share sensitive company information in Dropbox, and the overwhelming response was yes! While that indicates the use of storage in the cloud is more pervasive than is generally thought, it raises a number of other questions.

    Read More…

  • Big data, BI provide greater customer insights, ITWeb

    The business intelligence (BI) market is changing at a rapid speed, and organisations of all sizes need to revamp their data management environments for agility, flexibility, and responsiveness to address ever-changing customer expectations.

    This is according to Donald Farmer, vice-president of innovation and design at Qlik, who notes there are real opportunities for business to take advantage of big data. But most IT departments don’t yet have the skills or the time to implement new infrastructures while struggling to maintain their existing workload.

    Using data more intelligently is becoming all-encompassing for organisations, says Armandè Kruger, regional sales director at PBT Group. Determining buying patterns, maintaining stock levels, detecting fraud, helping manage incentive programmes, and even managing customers better requires data and the analytical systems to derive practical BI from it, he adds.

    Kruger says businesses should tap into BI and big data technologies for greater customer insights.

    Businesses should investigate new data discovery tools, predictive and text analytics, and geospatial technologies, says Kruger.

    Read More…

  • What to expect from BI in 2015, Bizcommunity

    The global Business Intelligence (BI) and analytics market grew by 8% in 2013 and expectations are that the growth for last year would be even bigger…

    Clearly, the focus now is on how BI can unlock value inside the organisation – more effectively than before. So what does 2015 hold for the market and the development of solutions to harness this?

    Armandè Kruger, regional sales director at PBT Group, believes that the wider adoption and incorporation of BI and advanced analytical outcomes in business processes will lead towards more preemptive analytics. These analytics will encompass automated application responses based on inputs received from analytical models.

    “It might sound like a cliché to say that real-time processing of information is critical to the success of the connected business. However, no company can afford not to find ways to improve the efficiency of their analytics as a means of enabling better decision-making. Thanks to advances in technology, faster and more real-time analytics are possible through in-memory analytics and in-memory processing. As a result, we expect 2015 to be the year where these aspects gain considerable momentum in organisations.”
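
    The automated application responses driven by analytical models, as described above, can be sketched as a simple score-to-action routing; the model, thresholds and actions here are hypothetical illustrations, not a specific product:

```python
def churn_score(customer):
    """Stand-in for a real in-memory analytical model's output (0.0 to 1.0)."""
    return 0.8 if customer["complaints"] > 2 else 0.1

def automated_response(customer):
    """Route a customer to an application action based on the model's output."""
    score = churn_score(customer)
    if score >= 0.7:
        return "offer_retention_discount"
    if score >= 0.4:
        return "schedule_follow_up_call"
    return "no_action"
```

    In a real deployment the scoring would run in-memory against streaming inputs, so the response fires while the customer interaction is still in progress.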

    Read More…

  • BI technology and industry updates

    Being involved in industry conferences, events, information sessions and workshops positions PBT Group as a thought leader in the markets we serve. Eagerly contributing and sharing valuable industry knowledge and trends, PBT Group often provides opinions about BI expertise and processes to noteworthy business and technology publications.