Archive for November 6th, 2015

  • The art of advanced analytics

    By Masindi Mabogo, Director at PBT Group

    The Nazis’ Enigma machine

    For many years during the 20th century, the German military used the Nazis’ Enigma “Cypher” machine to encrypt its secret messages. In addition to its keyboard, the Enigma machine had a second set of letters known as the ‘lamp board’. When a letter was pressed on the keyboard, the lamp board would light up a corresponding ciphertext letter to stand in for the letter typed, producing the encrypted message.

    The science behind Enigma was an electrical circuit built around three to four rotors, whose settings could be changed daily to create roughly 150 quadrillion encryption possibilities, with the substitution shifting with each letter typed. This presented an immense challenge for Allied code breakers before and during World War II (WWII).
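    To make that shifting substitution concrete, here is a deliberately toy sketch in Python. It assumes a single rotor with an arbitrary, non-historical wiring; the real machine chained three or four rotors with a plugboard and reflector, which is what produced the enormous key space.

    ```python
    import string

    ALPHABET = string.ascii_uppercase

    # Toy single-rotor wiring: an arbitrary illustrative permutation of A-Z,
    # not the historical Enigma wiring.
    ROTOR = "EKMFLGDQVZNTOWYHXUSPAIBRCJ"

    def encrypt(message: str, start_position: int = 0) -> str:
        out, pos = [], start_position
        for ch in message.upper():
            if ch in ALPHABET:
                # The rotor steps on every keypress, so the same plaintext
                # letter maps to a different cipher letter each time.
                out.append(ROTOR[(ALPHABET.index(ch) + pos) % 26])
                pos += 1
        return "".join(out)

    print(encrypt("AAAA"))  # -> 'EKMF': identical letters encrypt differently
    ```

    Even this toy version shows why simple frequency analysis fails against a stepping rotor, and why the code breakers needed something far more powerful.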

    Turing’s Bombe

    According to Cambridge University’s overview of Alan M Turing’s life, he was a mathematician, cryptologist, logician and computer scientist who played a key role in cracking Germany’s Enigma code, allowing the Allies to intercept vital information about targets, supply deliveries and the German military’s overall operations, and helping them win WWII.

    Working with his team at Bletchley Park, Turing created the code-breaking machine known as the British Bombe, using the principle of contradiction and extraordinary mathematical insight. The team also came up with a system for deciding which cracked messages should be passed along to the British Army, Navy and RAF, so as not to arouse suspicion among German forces that the code had been cracked.

    Advanced analytics art

    There have been numerous accounts of major events where the art of looking at the past, identifying trends and patterns to evaluate the present and predict the future, has been applied to great benefit. This ‘art’ uses analytical techniques based on complex learning algorithms to craft models that predict future outcomes, with the focus on establishing a mathematical equation as a model of the connections between the variables under consideration.

    Advanced analytics (AA) requires knowledge of past behaviour to generate profiles that can then be used to assess current behaviour and predict possible outcomes. Wayne Eckerson (as if he were describing the Bombe machine) put it into perspective: “An analytical model estimates or classifies data values by essentially drawing a line through data points. When applied to new data or records, a model can predict outcomes based on historical patterns.” The Bombe machine required a short phrase (new data) believed to appear in the encrypted message, which it used to work out the Enigma configuration that generated the code.
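    As a minimal illustration of Eckerson’s “line through data points”, the following Python sketch fits a straight line to hypothetical historical data and applies it to a new record. The variables and figures are invented purely for illustration.

    ```python
    import numpy as np

    # Hypothetical historical data: customer tenure (months) vs. monthly spend.
    tenure = np.array([1, 2, 3, 4, 5, 6], dtype=float)
    spend = np.array([120, 135, 160, 170, 195, 210], dtype=float)

    # "Drawing a line through data points": least-squares fit of slope/intercept.
    slope, intercept = np.polyfit(tenure, spend, deg=1)

    # Applied to new data, the model predicts an outcome from historical patterns.
    print(f"Predicted spend at month 9: {slope * 9 + intercept:.2f}")

    # A simple 'what-if': simulate a different input before deciding on an action.
    print(f"Predicted spend at month 18: {slope * 18 + intercept:.2f}")
    ```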

    Further, these profiles or models can be interacted with to simulate interventions and potential future outcomes before deciding on a course of action. With the Enigma machine, it was understood that by merely adjusting the rotors, one could change the encryption configuration, with over 150 quadrillion possibilities.

    AA relies on sophisticated quantitative methods to produce insights that traditional approaches to business intelligence are unlikely to discover. These methods are built by statisticians, mathematicians or data scientists, similar to Turing and his team.

    Predictive analytics, data mining, big data analytics, location intelligence and sentiment analysis are just some of the recent analytical tools that fall under the heading of AA, which can also include non-predictive tools such as clustering. Sentiment analysis, admittedly, is associated with social media more often than with AA.

    While these analytical practices focus on forecasting future events and behaviours, as well as extracting complex opinions, they also allow businesses to conduct ‘what-if’ analyses to predict the effects of potential changes in business strategy, and to assess positive or negative connotations in any type of available data.

    The Enigma and Bombe machines stand out as AA ambassadors. The Enigma led the Germans to many victories through the secure communication it offered, while the Allies turned WWII in their favour by creating the Bombe to crack that encryption. It is believed that cracking the code shortened the war between the German and British forces by up to two years. Both machines were crafted from pure AA techniques, built by mathematicians, statisticians and/or data scientists.

    Looking at these examples, what becomes important today is for businesses to be able to derive value from AA effectively – to predict future events that are meaningful to the overall running of a business and its processes. AA has become an important part of the broader business intelligence process and a true bearer of opportunity for unlocking key competitive advantages – just as it was back then.

    Source: IT Web

  • IoT: everyone is excited, except me!

    The Internet of Things (IoT) has become a buzzword these days.

    By Venkata Kiran Maram, BI Consultant at PBT Group

    IoT is fundamentally a concept describing a future in which everyday physical objects are connected to the Internet and able to identify themselves to other devices. These smart devices, systems and services that “communicate” with other devices via the Internet aim to make our lives easier and offer us many benefits.

    However, many people are distracted from the implications that get ‘swept under the carpet’ – mainly the security and privacy issues.

    Have you ever thought about the consequences of living in a world where everything generates data about you?

    As more and more information becomes available to devices, and those devices are connected, that information also becomes readily available to hackers. These connected devices collect, transmit, store and often share large amounts of consumer data, some of which is very confidential and personal – creating privacy risks.

    In recent times, a number of distressing events have been reported, including attempts to hack web-connected CCTV footage, as well as numerous hacks on things like smart TVs, Internet routers, connected fridges, baby monitors and washing machines, to name only a few.

    While all these kinds of products are beneficial to us, it must be remembered that for many of these, security is not the manufacturers’ major concern/priority. Their main focus (and rightly so) is on the actual function of the product – like turning on the TV, or monitoring your baby’s sleep.

    Our laptops and smartphones, which most of us use almost every day, listen to us when we make calls, both audio and video – and we shouldn’t forget this. There are several ways in which a hacker can turn on the microphones on these devices without you being aware, and sometimes we even switch them on ourselves, not knowing the potential risk – for example, when we use the voice processing systems on our devices, such as Siri on an iPhone.

    “The reality is that our computers, laptops and mobile devices are tracking us even when we are idle. In fact, today one of the most commonly used free email accounts, pays attention to everything you type and conveniently displays advertisements based on your subject matter.”

    We are slowly moving towards an era where everything will be connected. While this may seem exciting and will benefit consumers and many businesses, it also comes with substantial security and privacy risks – and these need to be considered.

    Implementation of secure access control and device authentication may seem like the most suitable solution; however, we are dealing with more than the average connected device here. Successful implementation is difficult to achieve without affecting the user experience or adding hardware that is not strictly necessary.
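    To illustrate what device authentication can look like in practice, here is a minimal sketch of one common approach – an HMAC-based challenge-response over a pre-shared key – written in Python for readability. All names are hypothetical, and a real constrained device would implement this in firmware with securely provisioned keys.

    ```python
    import hashlib
    import hmac
    import secrets

    # Hypothetical pre-shared key, provisioned to one device at manufacture time.
    DEVICE_KEY = secrets.token_bytes(32)

    def make_challenge() -> bytes:
        """Server side: a fresh random nonce, so old responses cannot be replayed."""
        return secrets.token_bytes(16)

    def device_respond(key: bytes, challenge: bytes) -> bytes:
        """Device side: prove possession of the key without ever transmitting it."""
        return hmac.new(key, challenge, hashlib.sha256).digest()

    def server_verify(key: bytes, challenge: bytes, response: bytes) -> bool:
        """Server side: recompute the MAC and compare in constant time."""
        expected = hmac.new(key, challenge, hashlib.sha256).digest()
        return hmac.compare_digest(expected, response)

    challenge = make_challenge()
    response = device_respond(DEVICE_KEY, challenge)
    print("Device authenticated:", server_verify(DEVICE_KEY, challenge, response))
    ```

    Even this simple scheme carries overhead – every device needs a key provisioned and stored securely – which is exactly the kind of cost manufacturers are tempted to cut.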

    As a society, we need to explore how to remain secure and maintain privacy in our personal lives without the risks of IoT interfering.

    We need to understand IoT properly, to make sure it actually benefits us and doesn’t leave us in a vulnerable position.

    Privacy is a prerequisite for free expression, and losing that, in my opinion, would have a huge impact on our society. So yes, embrace the concept, but with your eyes wide open.

    Credit: Tech Financials

  • When good data goes bad

    Data hygiene ensures a data warehouse is populated with accurate and complete data.

    By Jessie Rudd, BI consultant at PBT Group
    Johannesburg, 2 Oct 2015

    Dirty data might sound like something that belongs in a Clint Eastwood movie made for the 21st century. However, it is actually the umbrella term used to describe data that contains errors. This could be misleading, duplicate, inaccurate or non-integrated data, but also data that violates business rules – such as data without generalised formatting, or data that is incorrectly punctuated or misspelt; not for one moment forgetting fake data.

    In the world of data warehousing, big data, social media and the like, any company worth its salt will have many procedures and practices in place to try to limit the amount of dirty data being stored and potentially consumed. However, some data is scrubbed and vetted, stored and consumed, yet still goes bad over time. And no matter how thorough the process, the occasional Mickey Mouse, Donald Duck or Luke Skywalker will make an appearance on most B2B customer profiles.

    Netprospex’s “The State of Marketing Data 2015” [1] found that poor e-mail deliverability continues to introduce unnecessary risk into e-mail marketing programmes, with the average company database scoring a less-than-optimal 3.2 out of 5 on its health scale – just barely above questionable.

    Even more disturbing, the study found record completeness garners a measly 2.9 out of 5. Lead scoring, lead routing, effective content personalisation and Web customisation are all highly dependent on having actionable information about each prospect or customer. Most companies with limited budget and skills simply don’t have the time to wait for progressive profiling to kick in, and many can’t afford to compete against fake form data. At a given point in time, the information provided by a customer is probably correct, barring human error. But what happens when the domain changes, or the position, or the company?

    Physical and e-mail addresses going bad over time, cell numbers changing, fake and/or incomplete profile information – these are all very real issues facing marketing departments across the globe today. A marketing campaign is only as robust and successful as the number of customers it reaches and converts. So what is the solution?

    Coming clean

    Data hygiene refers to the procedures put in place to ensure that, at any given moment, a data warehouse is populated with the most accurate and complete data possible. This is done by laying the proper foundation, and then building a process of accountability on that foundation. This can be achieved by actioning the following:

    Groundwork: Any marketing campaign is only as good as the leads it generates. A full, thorough and complete understanding of the target market is the only way to convert ideas into leads, leads into offers, offers into business, and business into profit. A comprehensive data warehouse, as well as an intrinsic understanding of the customers whose data resides in that warehouse, should form the backbone of any company’s business intelligence department. If a company understands the story its data is telling, then marketing to the right customers should be a given. Data quality is all about teamwork.

    Cleanse and append: All inactive, duplicate and junk contacts should be purged from the data warehouse (a minimal sketch of such a purge follows this list). Once bad data is removed, the company might find itself with fewer contacts than expected, but it will also have more valuable insight into the business.

    “The occasional Mickey Mouse, Donald Duck or Luke Skywalker will make an appearance on most B2B customer profiles.”

    Also, if the company is unable to continually replenish its database with fresh leads to make up for the loss, it might be worth considering working with a vendor that can enrich the database and fill in missing contact information from its own database of records. Another solution is to put a procedure in place whereby existing customer information is augmented by freely available social media content.

    While this may be a more complicated method of enriching customer data, it is fast becoming a must-have for any B2B company. Social media profiling is well on its way to becoming an integral part of most marketing campaigns.

    Make it a routine: Fundamental to any good database is the understanding that it is almost impossible to keep bad data from entering it. That is one of the most important reasons why companies need to make data management a priority. The routine checking, cleaning and appending of data to ensure information is always complete and up to date is one of the most important steps in preventing dirty data and data decay.
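    As referenced under “Cleanse and append” above, the following Python sketch shows what a minimal purge routine might look like: normalising formatting, then dropping undeliverable addresses, known fake names and duplicates. The records, field names and rules are all invented for illustration; production cleansing would be far more extensive.

    ```python
    import re

    # Hypothetical raw B2B contacts; every field name here is illustrative only.
    contacts = [
        {"name": "Jane Dube", "email": " JANE.DUBE@Example.com "},
        {"name": "Jane Dube", "email": "jane.dube@example.com"},    # duplicate
        {"name": "Mickey Mouse", "email": "mickey@example.com"},    # fake form data
        {"name": "Sipho Khoza", "email": "not-an-email"},           # undeliverable
    ]

    EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")
    KNOWN_FAKES = {"mickey mouse", "donald duck", "luke skywalker"}

    def cleanse(records):
        seen, clean = set(), []
        for rec in records:
            email = rec["email"].strip().lower()    # generalised formatting
            if not EMAIL_RE.match(email):           # drop undeliverable addresses
                continue
            if rec["name"].strip().lower() in KNOWN_FAKES:  # drop fake profiles
                continue
            if email in seen:                       # purge duplicates
                continue
            seen.add(email)
            clean.append({**rec, "email": email})
        return clean

    print(cleanse(contacts))  # only one Jane Dube record survives
    ```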

    Maintaining complete and accurate business contacts is critical to an organisation’s overall success. Data is at the heart of almost every marketing and sales strategy.

    The half-life of data – in essence, the viability of a piece of information before it goes bad – is probably nowhere near as long as people would like to think. If companies don’t act now – and fast – their customer-centric data may soon be at the point where it is next to useless.

    [1] Netprospex Benchmark Report 2015

    Credit: IT Web

  • The data revolution

    The business case for master data management and data quality.

    By , strategic BI manager at PBT Group.
    Johannesburg, 1 Sep 2015

    The Industrial Revolution, which began in Britain in the 1700s, truly revolutionised the world, changing the way people travel, work, eat and live.

    However, the Industrial Revolution resulted from the pebble thrown into the pond by the Agricultural Revolution. Britain’s colonial dominion gave it access to vast agricultural diversity, and its influence on and investment in these colonies resulted in significant technological innovations and developments, increasing the productivity of farms.

    This Agricultural Revolution in turn produced excess wealth, raw produce such as food and especially cotton, and spare workforce capacity, as farm workers, no longer needed on the land, migrated to urban areas in search of work. Surplus produce and population shifts created a dire need to process and distribute that produce. The excess wealth was sensibly applied to spur technological developments in automation (most significantly, probably, in the textile industry), metallurgy and transportation, which were effectively empowered by one key innovative breakthrough: deriving coke from coal as a key energy source for the steam engine and numerous manufacturing machines.

    The outcome of the Industrial Revolution is life as we know it in the global village, where technological innovation is the norm rather than the exception, together with all its social ramifications of unemployment, urbanisation, increase in crime, etc.

    Cliff’s edge

    This brief history lesson sets the context for the data revolution, which is a natural progression from its own agricultural pebble in the pond – the digital revolution. Developments in digital storage and digital processing, together with the Internet and social media since the 2000s, now leave the industry at a precipice: there is a data explosion[1] at hand, with data being the “excess produce”, and digital and data technological innovation the “excess wealth”. People must learn very quickly how to make sense of all the data at hand, before it explodes and pushes everything and everyone off the precipice.

    Big data technology would be the one innovation to highlight, as I believe it is the analogous “coke derived from coal” that will fuel the data future. However, big data is not the silver bullet that will ensure a bright future. It is merely an innovative resource that needs to be honed and applied mindfully to ensure return on investment. Quoting Gartner from its Top 10 Strategic Trends for 2015[2] when referring to trend number four, analytics: “Big data remains an important enabler for this trend, but the focus needs to shift to thinking about big questions and answers first, and big data second – the value is in the answers, not the data.”

    So, what key practices are needed to transform the data explosion into a data revolution? Big data innovation needs to be accompanied by the technology and disciplines of master data management (MDM) and data quality management, similar to manufacturing developments being accompanied by rigid health and safety regulations and quality standards.

    The whole truth

    Data quality management disciplines ensure the big data generated or leveraged accurately reflects and represents real-life truths. Just as a consumer wouldn’t like to find out that a take-away burger was made from rat meat, a business wouldn’t like to discover that decisions it believed were based on an understanding of customers in South Africa were in fact based on data collected about people of a different nationality living in the US.

    “The Industrial Revolution resulted from the pebble thrown into the pond by the Agricultural Revolution.”

    Granted, that’s an extreme example, but it illustrates the importance of data quality management. Data quality technology enables users to measure and monitor data quality in all the diverse data stores. Best-practice data quality discipline is to implement controls at source to prevent data quality degradation, but data quality tools also enable reactive data cleansing and improvement.

    Master data management enables users to manage the contextual data relating to their key data entities to set standards, ensure consistency, and increase confidence in the interpretation of data trends and analytics. I would imagine a vehicle manufacturing plant driving off into the abyss of bankruptcy if what it thought was a stainless steel exhaust was in fact made of PVC. In the same way, effective master data management is crucial to ensure all the various stakeholders in the value chain understand the meaning of all the descriptive or contextual data elements of their key data entities, such as customer, product, campaign, or even organisational structure. Beyond understanding the meaning, it is crucial that all stakeholders have access to the same consistent view of such data.
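    As a minimal illustration of measuring data quality against a master data standard, the Python sketch below computes two simple metrics – completeness, and conformance to a master code list – over hypothetical records. All names and codes are invented, and real MDM and data quality tooling covers many more dimensions (accuracy, timeliness, uniqueness and so on).

    ```python
    # Hypothetical master data: the agreed standard product codes of which every
    # stakeholder in the value chain should share one consistent view.
    MASTER_PRODUCT_CODES = {"EXH-STEEL", "EXH-TITAN"}

    # Hypothetical records landing from two source systems.
    records = [
        {"source": "plant_a", "product_code": "EXH-STEEL", "material": "steel"},
        {"source": "plant_b", "product_code": "EXH-PVC", "material": "pvc"},   # off-standard
        {"source": "plant_b", "product_code": "EXH-STEEL", "material": None},  # incomplete
    ]

    def quality_report(rows):
        """Score a batch of records on two simple data quality dimensions."""
        total = len(rows)
        complete = sum(1 for r in rows if all(v is not None for v in r.values()))
        conformant = sum(1 for r in rows if r["product_code"] in MASTER_PRODUCT_CODES)
        return {
            "completeness": round(complete / total, 2),    # fully populated records
            "conformance": round(conformant / total, 2),   # records matching master data
        }

    print(quality_report(records))  # {'completeness': 0.67, 'conformance': 0.67}
    ```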

    In May 2015, Germany announced its aggressive investment to initiate the fourth industrial revolution, referred to as Industry 4.0[3]. The essence of Industry 4.0 is “smart factories” based on artificial intelligence in all aspects of the manufacturing value chain.

    This artificial intelligence will be dependent on impeccable big data to learn from, but with the current state of data – where debates about inconsistent figures on reports are still pervasive in most boardrooms – I predict “smart factories” driven by “confused intelligence”. Industry 4.0 must be preceded by a data revolution, which cannot be achieved without effective MDM and data quality management.

    In August 2014, UN Secretary-General Ban Ki-moon issued a mandate for UN members to bring about a data revolution to improve reporting on sustainable development[4]. May the private sector be the leaders and catalysts for this, and not the followers!

    [1] http://www.martinhilbert.net/worldinfocapacity.html/
    [2] http://www.information-management.com/gallery/gartners-top-10-strategic-tech-trends-for-2015-10026168-1.html
    [3] http://www.wired.co.uk/news/archive/2015-05/21/factory-of-the-future
    [4] http://www.undatarevolution.org/

    Credit: IT Web