This second article focuses on another core topic of the Digital Transformation Expo in London: cyber security. Like the previous article, the text below is my personal interpretation of the sessions I attended, so it is not necessarily accurate in every factual detail. The key take-away for me: cyber security and defense is primarily about people, not about technology and processes.
It’s All About the User
There were dozens of vendors present selling solutions dedicated to hardening applications and the underpinning (cloud-based) infrastructure. According to Gartner, companies and governments will spend an estimated $124 billion on information security products and services in 2019, an increase of almost 9 percent compared to 2018. That is an amount similar to the GDP of Ukraine in 2018.
So it is fair to ask whether that money is well spent.
The sobering news, according to Verizon’s 2019 Data Breach Investigations Report, is that 81 percent of confirmed data breaches involved weak, stolen or default passwords. Regardless of whether the data is hosted in a) the company’s own datacenter, b) by an external service provider or c) in the cloud (e.g. AWS, Azure, Google), most serious attacks start with collecting user data.
By the way, with ‘user’ I mean anybody, from the CFO to an intern, developer, supplier, contractor or customer, who in one way or another has access to part of your restricted or confidential data (see illustration).
During the expo, several red team engineers demonstrated how publicly available data, data and tools for sale on the ‘dark web’, and weaknesses in the application and/or infrastructure can be combined to slowly but surely enter your IT environment. Every internet-facing interface (e.g. Outlook web, Active Directory Federation Services, self-hosted Lync servers, web VPN interfaces) provides The Bad Guys with an ‘attack surface’ to turn snippets of user data into access to the company’s IT environment.
The user is the most successful attack vector, not technology.
A Bigger Wall is Not the Answer
Hence, building bigger virtual walls around your data should not be your sole objective. It is equally important to be safe to fail. Just as the Maginot Line failed to keep the Germans out in WWII, companies have to assume their fortifications and obstacles will one day be circumvented by an attacker.
Also similar to the Maginot Line is the need for an uninterrupted flow across borders for ‘non-combatants’. Customers, business partners, suppliers and users expect a frictionless user experience, and in the process part of your data has already left your premises. Slack, Teams, Dropbox, Box, OneDrive, public email (e.g. Gmail) and other collaboration tools ensure some restricted and confidential data leaves your company and, consequently, your control. Even worse: who prevents your business partner or supplier from storing data you consider confidential on an unencrypted USB stick or laptop?
As noted, you can ‘harden’ your own devices (e.g. encryption, two-factor authentication (2FA), password policies) until your users pull their hair out in frustration, but one day sensitive information will slip through.
As you cannot reduce your cyber risk profile to zero, what can you do?
Business Value Drives Cyber Defense Approach
All data is equal, but some data is more equal than others. Customer data, intellectual property, employee payroll data and passwords are valuable corporate assets, not mere ‘data.’ For starters, this implies that it is the business, not IT, that should take the lead when classifying data. The business knows the context and thus the value of your documents, emails, presentations and other data points. Secondly, data protection is more than GDPR compliance, as the loss of your product designs, chemical recipes, source code, customer contracts and cost structure may hurt more than the reputation damage and fines related to a breach of Personally Identifiable Information (PII).
Hence, when defining your cyber defense approach, look beyond GDPR compliance (and other regulations).
With the value of your data driving your cyber defense approach, data classified as a ‘key company asset’ should be under strict, centralized business and IT governance. Preferably, you also don’t want this data to reside in multiple, geographically distributed data centers, or to be hosted by different public cloud providers, as each has its own cyber defense do’s and don’ts. Experimenting with different technologies, cloud providers and so on is crucial from an innovation perspective, but not so much when safeguarding your golden eggs. For the latter, you want to invest in specialists with deep skill sets instead of generalists.
For key data assets, users are more likely to accept that they have to jump through additional burning hoops to access the data, even though user experience remains important: users will find ways to circumvent the controls if they perceive them as unreasonably restrictive. And let’s not forget: a product design or other piece of intellectual property has value only when you can actually use it. Risk is only one side of the coin; the other is benefits (e.g. more revenue, margin or customers). This fact is often overlooked by CISOs and other risk management professionals when designing a risk mitigation approach they consider appropriate.
Talking about user experience, another topic covered in London was the end of the password (hooray!). While counterintuitive, passwordless authentication eliminates one important weakness: weak passwords, passwords on Post-its, passwords stored as documents on the desktop and so on. Biometric authentication, pattern-based one-time passwords, tokens and single-step 2FA are solutions gaining traction for data of medium value.
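As an aside, most one-time-password solutions (such as authenticator apps) follow the TOTP algorithm standardized in RFC 6238. The sketch below is a minimal standard-library Python illustration of the idea, not a production implementation:

```python
import hashlib
import hmac
import struct
import time

def totp(secret, for_time=None, period=30, digits=6):
    """Minimal RFC 6238 time-based one-time password (HMAC-SHA1)."""
    counter = int(for_time if for_time is not None else time.time()) // period
    digest = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation per RFC 4226
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: secret "12345678901234567890" at T=59s -> "94287082"
print(totp(b"12345678901234567890", for_time=59, digits=8))
```

Because the code is derived from a shared secret and the current time, it changes every 30 seconds, which is exactly what makes it useless to an attacker who harvests it from a Post-it or a phishing form a few minutes later.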
Value should not only drive user experience, but your whole cyber defense approach, as depicted in the illustration below. Both at industry level and at business line/process level, every company should make a conscious decision about what its overall cyber defense maturity should be. Full-blown Security Operations Centers (SOC), Cloud Access Security Brokers (CASB) and Red Teaming are expensive, as are company-wide two-factor authentication and security segmentation at application level (‘micro-segmentation’). They require a solid business case.
Expect other cyber defense practices, like the Zero Trust model, to enter mainstream adoption soon, regardless of the industry the company is in, as it has become all but impossible to distinguish between internal and external users. Suppliers, contractors, customers and students need access to some data but not to other data, using devices you may or may not know. And let’s not forget the impact of Artificial Intelligence, as The Bad Guys have already embraced it.
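The core of Zero Trust is that every request is evaluated on its own merits (user, device, context), regardless of whether it originates inside or outside the network perimeter. The sketch below is purely illustrative; the request fields and policy rules are my own assumptions, not a real policy engine:

```python
from dataclasses import dataclass

@dataclass
class Request:
    user: str
    device_trusted: bool       # device known and managed by the company?
    mfa_passed: bool           # multi-factor authentication completed?
    resource_sensitivity: str  # 'low', 'medium' or 'high'

def authorize(req: Request) -> bool:
    """Evaluate every request individually; never trust the network location."""
    if not req.mfa_passed:
        return False
    # High-value data additionally requires a known, managed device.
    if req.resource_sensitivity == "high" and not req.device_trusted:
        return False
    return True

# A supplier on an unknown laptop is denied access to high-value data,
# even if the request originates from inside the corporate network.
print(authorize(Request("supplier-42", False, True, "high")))  # False
```

Note how the value of the data (the sensitivity classification discussed above) directly drives how many hoops the user has to jump through.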
Artificial Intelligence to the Rescue?
As mentioned in my first article on the Digital Transformation Expo in London, Artificial Intelligence is going to cause serious waves in the years to come. Attackers will turn to AI to automate their attacks and make them more complex (e.g. deep fakes, social media manipulation). At the same time, concepts like Smart Cities, Smart Medical Devices, Smart Buildings and the Internet of Things (IoT) in general will dramatically expand the number of targets and attack vectors. Combined, they leave companies with little choice but to invest in AI-based cyber defense mechanisms. Due to their nature, AI systems can scale more easily than humans to cope with the growth in the number and diversity of smart devices. AI, not humans, can detect new, unknown patterns in millions of data points, countering the bad guys who use AI to help them evade detection.
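To make the pattern-detection point concrete, here is a deliberately simple sketch of anomaly detection on login telemetry. Real AI-based defenses use far richer models, but the underlying principle of flagging deviations from a learned baseline is the same; the numbers below are invented for illustration:

```python
from statistics import mean, stdev

def is_anomalous(baseline, value, threshold=3.0):
    """Flag `value` when it deviates more than `threshold` standard
    deviations from the historical baseline (a simple z-score test)."""
    mu, sigma = mean(baseline), stdev(baseline)
    return sigma > 0 and abs(value - mu) / sigma > threshold

# Hourly failed-login counts from a quiet week form the baseline;
# a sudden burst of 480 failures is flagged, a normal hour is not.
baseline = [12, 9, 11, 10, 13, 12, 8, 11]
print(is_anomalous(baseline, 480))  # True
print(is_anomalous(baseline, 14))   # False
```

No human can run this comparison continuously across millions of devices and data points; machines can, which is exactly why both attackers and defenders are automating.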
In other words, the same cat and mouse game continues.
The text below focuses on artificial intelligence and is my personal interpretation of the keynote sessions I attended, so it is not necessarily accurate in every factual detail.
Keynote by Sir John Sawers
Sir John Sawers, former chief of the Secret Intelligence Service (MI6), started off by providing the audience with a macro perspective on the challenges we are facing in the years to come. He identified three key topics:
- The dominant forces are back. The United States, China, and to a lesser extent Russia, are battling for world dominance. The focus of the last 50 years on collaboration and removing (trade) barriers has been replaced by protectionism, military power and a battle for dominance in the technology space. Countries are decoupling, each pursuing its own interests. This even applies to the internet, which is already being Balkanized by countries like Russia and China.
- Populist parties are on the rise. Often driven by growing inequality, more people vote for ‘strong leaders’ and populist parties, thereby polarizing the political landscape. So within countries, too, we are increasingly decoupling.
- Importance of technology. With the decoupling of countries, access to military technologies and other strategic technology areas has become a matter of state policy. The United States, Europe, China and other countries now closely guard those industries they consider vital to their national interests. At the same time, technology is increasingly used by (non-)state actors to infiltrate and disrupt vital infrastructure elements of other countries.
The most important technology for the years to come is Artificial Intelligence (AI). Its impact on both military capabilities and GDP (with estimates of a tenfold increase in world GDP) is huge, easily offsetting the billions of dollars invested by companies and countries.
Keynote by Stuart Russell
The second keynote, by Stuart Russell, godfather of modern artificial intelligence, elaborated on the history and limitations of AI. He started off with some history (e.g. Aristotle speculated around 340 BC that workers would become irrelevant if something like AI were to evolve, and Turing predicted something similar in 1951).
Stuart Russell then cautioned that we currently surf on a wave of optimism; we are in an AI gold rush. Billions are invested in AI, but the technology is still in its infancy. Despite the billions already invested in creating autonomous cars, we are still a long way from achieving the desired objective. While Russell expects car manufacturers to continue their investments in the technology until it reaches maturity, there might be other areas where AI proves too costly (for the foreseeable future). Hence, a considerable share of the billions invested in AI will not yield any return, but the potential upside is too big to be ignored.
There are also other signs that indicate the lack of maturity, like voter manipulation via social networks and racial bias. Both are undesirable, and it is primarily up to the engineers to prevent them, as they create the software and model the objective. Regardless of the objective, Russell points out that machines should always be beneficial to the extent that their actions can be expected to achieve our human objectives. They should never be allowed to pursue their own objectives.
He also mentioned that the amount of data will become less relevant as the software underpinning AI becomes smarter. We will need less data to reach the same or even better decisions. Hence, data is not the new oil. In time, its relevance will decline, which for me was one of the eye-openers of the session.
After covering the difference between machine learning (ML can only learn) and probabilistic programming (PP can learn and predict, and is therefore more useful as it can give answers and anticipate), Russell moved on to the second key take-away for me: we cannot predict when AI will reach or surpass human intelligence. There is no ‘Moore’s Law’ for AI. The critical conceptual breakthrough can come tomorrow or in twenty years. To make his point, he used the breakthrough required to eventually build the atomic bomb: in September 1933, one scholar declared that it would be impossible to harness the power of the atom, while 16 hours later another scholar came up with the required breakthrough.
Also interesting is the observation that we humans are not yet prepared for AI that is smarter than we are. Our ability to think is what gives us meaning in life. What will we do when robots have become smarter than us? It would allow for ‘everything as a service’, but at what social cost? Will we all degrade to a kind of vegetative state, spending our lives consuming content and food?
Keynote by Garry Kasparov
The second day started with a keynote by Garry Kasparov. He shared his insights on AI using the advances in automated chess engines as an example. My key take-away from this session is twofold: a) these engines win because they make fewer mistakes than humans, not because they are smarter; and b) by combining human and machine, we can create 1+1=3.
Let me explain the last one in a bit more depth. Machines that learn structured games like chess and Go start from scratch. They are not fed the existing body of knowledge available from humans playing the game, but play millions of games to find the moves that most likely result in winning. This also results in blind spots, because the machine will ignore moves that, from a statistical perspective, are less likely to result in winning. However, humans know from their body of knowledge that some of these statistical outliers actually result in winning the game. Even worse: due to the millions of games played and the resulting deeply ingrained framework used by the machine to make decisions, a human can use the winning move thousands of times before the machine learns to adapt its framework.
We are heading towards a less stable and more isolated world in which Artificial Intelligence is going to be a huge game changer. Therefore, AI is a technology which cannot be ignored, but it remains unclear when AI will surpass human intelligence. It also remains to be seen what our days will look like when we are no longer the smartest kid on the block.
The second part will be less abstract and focus on my lessons learned regarding trends in cyber defense.
ISO and many other certifications are an increasingly expensive ritual dance. Every year, the auditor and client company go through the same motions, with the sole outcome a stamp on a piece of paper. Even worse, with every new version, most standards expand in scope and required paper trail. The only beneficiaries of this trend are the consultants, trainers, auditors and standardization bodies.
In real life, it is the attitude, professionalism and skills of the individuals responsible for a product or service that drive quality, security and performance.
So why do companies continue these wasteful practices?
Why we use certifications
Ask an academic why we invented certifications and he or she will answer: to reduce information asymmetry. Due to the distance between the buyer and the supplier of a product or service, it is difficult for the buyer to observe the qualifications of the supplier. Certifications allow the supplier to signal attributes that are unobservable to the buyer, like quality and security. Similarly, employers use academic qualifications to differentiate between job applicants, “independent of whether or not students learn anything in the process of attending college” (I). But more on that quote later.
Last but not least, certifications are used to demonstrate regulatory compliance. IT service providers use standards like ISO 27001 and others to signal a certain level of control to their clients and regulators.
When certifications are useful
As mentioned, certifications like ISO 9001 (quality), ISO 27001 (security), ISO 20000 (service management), ISO 31000 (risk management) and ISO 22301 (Business Continuity) allow an IT department or IT service provider to communicate certain attributes to other stakeholders. They signal that, at least on paper, the IT department or IT service provider acknowledges the importance of the topic covered by the ISO or other auditable standard.
According to Ter Laak and King, certifications may even provide a competitive advantage in markets where buyers can choose from numerous suppliers. They observed that suppliers with an ISO 9001 certification tended to grow faster than suppliers without a certification (II). In other words, early adopters may enjoy a competitive advantage when they are able to convince their buyers of a certificate’s added value. Until the competition catches up, that is. At that point, the playing field is level again.
In mature and especially risk-averse markets (e.g. healthcare, banking, insurance, government), certifications are a precondition to do business. Buyers include them as a mandatory requirement when inviting suppliers to tender, disqualifying any bid that fails to comply. Here, certifications are used to reduce liability risk and the accompanying lawsuits.
Another advantage of certifications is the body of knowledge embedded in the underpinning standards. All define, at a high abstraction level, a set of desired outcomes and the activities to achieve and control those outcomes. The quality of the standard itself is ensured through a combination of committees, a centralized governing body and strictly enforced development and update processes. In principle, everybody can participate in the committees responsible for the 21,378 published ISO standards or the 4,938 ISO standards under development (note: data retrieved in May 2017). These numbers provide a natural point to move on to the next topic.
- Some standards and certifications enjoy a temporary first-mover advantage.
- Certifications prevent opportunistic companies from entering certain markets.
- Why reinvent the wheel when it has already been invented?
When certifications lose their effectiveness
Your lunch can be certified against ISO 22000, BRC, SQF, IFS, USDA Organic, AHA and the ISTA & Hygiene modified approval scheme. Does this information in any way influence your decision to buy?
The ISO catalogue dedicated to Information Technology includes 71 published standards and standards under development (note: data retrieved in June 2017). This is only the tip of the iceberg, however, as the ISO 27000 family alone consists of 45 underpinning standards, including:
- ISO/IEC 27005 — Information security risk management
- ISO/IEC 27010 — Information security management for inter-sector and inter-organizational communications
- ISO/IEC TR 27016 — Information security economics
- ISO/IEC TR 27019 — Information security for process control in the energy industry
- ISO/IEC 27042 — Analyzing digital evidence
This Wikipedia page points out that 45 ISO 27000-related standards are still not enough: “Further ISO27k standards are in preparation covering aspects such as digital forensics and cybersecurity, while the released ISO27k standards are routinely reviewed and updated on a ~5 year cycle.” The last part of that sentence means that your current quality management system will soon become obsolete, and your team will be faced with a mandatory update to the new version. A new version which, in my experience, only grows in scope and, consequently, paperwork and cost (see below).
More importantly, the unchecked growth of standards causes confusion among both buyers and suppliers.
Confronted with a relentless and unchecked growth of standards and certifications, both B2C and B2B buyers lose track and either ignore them or stick to what they know from the past. Another side effect of the proliferation of standards is misinterpretation. Recently, I read a tender requiring the supplier to be ISO 25000 certified. ISO 25000 is a family of standards (again, one was not enough), focusing on the quality of software in terms of:
- functionality (e.g. suitability, accuracy),
- reliability (e.g. maturity, fault tolerance),
- usability (e.g. understandability, learnability),
- efficiency (e.g. time behavior, resource utilization),
- maintainability (analyzability, changeability), and
- portability (e.g. adaptability and installability).
Hence, the standard is a useful reference guide for architects and software developers, but it is not a standard one can certify against. Yet. As with the Agile Manifesto, consultants, trainers and auditors have identified standards and certifications as an easy source of revenue. ISO 25000 may well be their next victim.
Misinterpretations are part of a broader issue: buyers considering certifications a quick fix. These buyers think along the following lines: if you are ISO 9001 certified, I get high quality products and services. If you are ISO 27001 certified, my data and applications are safe. If my payment processor is Payment Card Industry Data Security Standard (PCI DSS) certified, my credit card information is secure.
Sorry to burst your bubble, but retailer Target lost the credit card data of 40 million people, and Neiman Marcus exposed 1.1 million payment cards, despite both being PCI certified. The link between the actual security level and ISO 27000 is far weaker still than the one achieved through PCI. PCI actually provides a solid defence against hackers, as long as the security specialists of the company regularly assess their readiness against new threats. Attack vectors and threats evolve, as stealing data and ransomware can be very lucrative. Companies solely focusing on the piece of paper tied to PCI compliance will therefore inevitably become vulnerable to attacks somewhere down the line.
An equally dangerous example of misinterpretation is assuming ISO 27000 safeguards against security weaknesses in the application.
The most valuable commodities hackers are after are stored in databases and applications: personal data and commercial data (e.g. Game of Thrones scripts). To safeguard both types of data, the client company must look beyond ISO 27000, as the latter focuses on the support and operations phases of the IT life cycle, while the security level of the application and database is shaped during the design and development phases. Not ISO 27000, but Secure Software Design, the Security Development Lifecycle (SDL), the OWASP Top 10, OWASP SAMM, NIST SP 800 and NIST SP 1800 should be the terms to look for. Among others, as explained in the second part of this blog.
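A classic example of a design-time weakness that no operations-focused certification catches is SQL injection, number one in several editions of the OWASP Top 10. The sketch below, using Python’s built-in sqlite3 with an invented table, shows the vulnerable and the safe variant side by side:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

user_input = "' OR '1'='1"  # classic injection payload

# Vulnerable: attacker input is concatenated straight into the SQL text.
unsafe = conn.execute(
    "SELECT secret FROM users WHERE name = '" + user_input + "'").fetchall()

# Safe: a parameterized query treats the input as data, never as SQL.
safe = conn.execute(
    "SELECT secret FROM users WHERE name = ?", (user_input,)).fetchall()

print(unsafe)  # [('s3cret',)] -- the payload dumps every row
print(safe)    # [] -- no user is literally named "' OR '1'='1"
```

Whether the developers consistently use the second form is decided long before an ISO auditor ever shows up, which is exactly the point made above.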
Besides regulatory compliance and reducing information asymmetry between buyer and supplier, some standards are also promoted as a means to improve performance (e.g. financial, market share). Most scholars agree that there is a difference in performance between companies with an ISO 9000 or other quality management system and companies without one. One example is Heras et al (III): “Using the return on assets employed (ROA), the average level of profitability was calculated for the 400 certified firms and the 400 non-certified firms for each of the years 1994, 1995, 1996, 1997, and 1998. […] In all five years, it can be observed that the average profitability of the certified firms is superior.”
However, with 1,138,155 certifications in 2014, one can hardly call it a differentiating capability. All but the smallest niche players have an ISO 9000 certification, just as all but the smallest hosting providers are ISO 27000 certified. These companies all enjoy the same benefits and substantial costs (see below).
Even worse, the cost associated with the comprehensive bureaucratic control environment, combined with the ever-expanding scope of every new version, may tip the scales in the wrong direction. More generally, the main criticisms levelled at ISO 9000 are (IV):
- It is bureaucratic (e.g. if it is not on paper or in a tool, it does not exist for an auditor).
- It is costly to implement (e.g. writing and maintaining procedures, hiring a quality manager, tooling, auditor fees).
- It does not guarantee the quality of the product (e.g. it does not safeguard against a garbage-in, garbage-out scenario).
- It is not suitable for small organisations (e.g. due to the high cost).
Furthermore, Tsiotras and Gotzamani observed that many companies indeed certify “just for the sake of it”, listing the following issues with ISO 9000 (V):
- Low flexibility and slow response to change.
- Lack of correlation between certification and high quality or increased customer satisfaction.
- An excessive obedience to documented procedures, which may discourage critical thinking.
- A lack of focus on continuous improvement beyond the achievement of certification.
Hence, a quote from Cagnazzo et al (VI) to wrap things up on the impact of ISO 9000 on firm performance: “Despite substantial literature on the ISO 9000 standard, there is still much debate concerning the standard’s impact on firm performance, competitiveness and operations management. […] Although the number of firms that want to implement ISO 9000 quality management system is increasing day by day, many of them increasingly started questioning the link between ISO 9000 and firm performance.”
- The unchecked proliferation of standards and certifications reduces their effectiveness
- Lazy and/or badly informed buyers misuse certifications as a quick fix.
- There is a weak correlation between the desired result (e.g. secure data, quality) and certifications.
- The never-ending scope increase of standards will drive up their cost to the point where the business case turns negative.
But the auditor saves the day, right?
Unfortunately, the auditor is of little help. Auditors only check whether you performed the mandatory quality or risk assessment; they don’t (and are rarely knowledgeable enough to) determine the quality of the assessment itself. They tick boxes. Take the following objective and controls from ISO 27001:2005, for example.
“Objective: To provide management direction and support for information security in accordance with business requirements and relevant laws and regulations.
A.5.1.1 Information security policy document. Control: An information security policy document shall be approved by management, and published and communicated to all employees and relevant external parties.
A.5.1.2 Review of the information security policy. Control: The information security policy shall be reviewed at planned intervals or if significant changes occur to ensure its continuing suitability, adequacy, and effectiveness.”
- A.5.1.1 means the auditor wants to see a document with the title ‘security policy’, a signature of a manager somewhere in the document and a location on the intranet where the employees can find it.
- A.5.1.2 means the auditor looks for evidence that somebody has reviewed the document (e.g. a new version number), but leaves it up to the company to determine whether significant changes have occurred regarding the suitability, adequacy and effectiveness of the policy.
The logic behind this approach is that it is impossible for auditors to thoroughly understand the specific risk profile of every individual company. However, it also limits the actual value of the standard and the accompanying certificate, as both fail to protect against a garbage-in, garbage-out scenario and certification for the sake of it.
The actual level of security depends on the professionalism, skills, culture and leadership style of the IT team, aspects which are difficult to capture with the abstract ‘hard controls’ most standards are based on.
- The added value of an auditor is very limited.
In 2014, ISO.org reported 1,609,294 valid certificates world-wide, an increase of 3 percent compared to 2013. According to this survey, the three-year cost charged by a certification body for ISO 9001 varies between $5,400 and $7,425. That translates into between $2.9 billion and $3.9 billion in out-of-pocket costs per year for ISO-related certifications (VII). These amounts exclude the costs that companies incur for hiring and retaining an internal quality manager, the additional administrative burden, internal audits, tooling, training and so on. Depending on the size of the company, think of at least $100,000 for a small company and up to a million for a large corporation.
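The back-of-the-envelope arithmetic behind those figures can be reproduced by spreading each three-year fee over one year and multiplying by the number of valid certificates (small differences with the quoted upper bound are due to rounding):

```python
certs = 1_609_294                 # valid ISO certificates world-wide in 2014
fee_low, fee_high = 5_400, 7_425  # three-year certification-body fee (USD)

# Annualize the three-year fee, then scale by the certificate count.
annual_low = certs * fee_low / 3
annual_high = certs * fee_high / 3
print(f"${annual_low / 1e9:.1f}B to ${annual_high / 1e9:.1f}B per year")
```

This covers certification-body fees only; as noted above, the internal costs (quality managers, audits, tooling, training) come on top.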
Other certifications are equally lucrative for consultants, trainers and auditors. The previously mentioned ISO 22000, BRC, SQF, IFS, USDA Organic, Kosher, Halal, AHA and ISTA & Hygiene certifications translate into a global food certification market that is expected to reach a value of $14.5 billion by 2019, growing at a CAGR of 5.2%.
Certifications are a money printing machine for thousands of consultants, auditors and standard organizations.
- The direct beneficiaries of certifications are consultants, auditors, trainers and standard organizations.
The losers, and how the losers can become winners again, are covered in the second part of this blog. Those losers will eventually also include the auditors, consultants and trainers, once paying customers start demanding value for their money.
Notes and references
(I) Spence, M., Job Market Signaling, Quarterly Journal of Economics 87, pages 355-374, 1973.
(II) Ter Laak, A., King, A., The effect of certification with the ISO9000 quality management standard: a signaling approach, 2006.
(III) Heras, I., Casadesus, M., Dick, G., ISO 9000 certification and the bottom line: a comparative study of the profitability of Basque region companies, Managerial Auditing Journal, 2002.
(IV) Barnes, D., Operations Management: An International Perspective, 2007.
(V) Tsiotras, G., Gotzamani, K., ISO 9000 as an entry key to TQM: The case of Greek industry, International Journal of Quality and Reliability Management, 1996.
(VI) Cagnazzo, L., Taticchi, P., Fuiano, F., Benefits, barriers and pitfalls coming from the ISO 9000 implementation: the impact on business performances, WSEAS Transactions on Business and Economics, Volume 7, 2010.
(VII) Assuming average yearly cost for ISO 9001 certification is on average equal to others.
The six principles behind the Digital Manifesto are interdependent. They reinforce each other, creating a positive feedback loop.
Information technology is not a neatly packaged box with a guaranteed return on investment stamped on it. Information technology is like a kitchen: spending $25,000 on a new kitchen does not automatically result in a great dining experience. You also need a cook. In this case, the cook is the IT professional, embodied by the principle less defensive, more offensive. The IT team no longer delivers a piece of hardware or software to the business, but a solution or, even better, a value proposition. A value proposition fulfills a specific want or need of the business, creating value (e.g. additional benefits, less risk).
While advanced automation and robots are substituting humans in several service-related areas of the value chain, employees remain necessary for truly value-adding activities like strategy setting, innovation, performance improvement and exception handling.
Employees serve Customers
For the foreseeable future, that is. In 2016, Google launched a research project to see if computers can be truly creative. It is just one of the projects that are part of a global effort to create machines with artificial intelligence and ‘deep learning’ capabilities.
The constantly evolving technology landscape also has a profound impact on the value propositions offered by the business to its customers. Music lovers used to record their favorite music on tape, replacing them over time with DVRs, hard drives, MP3 players, and more recently, streaming from the cloud.
Customers have (increasingly differentiated and technology-rich) needs
The rate of change is not constant, but increases both in volatility and complexity. The tape recorder was invented around 1930 and enjoyed a stable and predictable lifecycle for almost half a century. No such luck for more recent substitutes.
Customer needs change over time (at an accelerating rate)
More generally, every new product is more capable than its predecessor, but also far more difficult and expensive to design and produce. This translates into a network of hundreds, if not thousands, of specialized companies needed to deliver one coherent value proposition from a customer perspective (I).
Together, these companies form a virtual entity, bundling a broad set of capabilities, skill sets and other assets to realize one or more shared objectives. The larger and more complex the value proposition and network, the more organization (as in “the act or process of planning and arranging the different parts of an event or activity” (II)) is required to achieve the required effectiveness and efficiency. Do this well and the result is wealth for all the stakeholders involved.
Constantly changing needs require organization (e.g. end-to-end approach, leadership, investment in new skill sets)
Done well, organization results in added value (read: improved revenue and margin, reduced risk)
By investing part of that wealth in the quality of the working environment, employee satisfaction improves, the first step in the so-called service-profit chain (III). The service-profit chain establishes relationships between value creation, customer loyalty, and employee satisfaction. Loyal customers buy more and generate referrals, both key drivers of growth and profitability. To become loyal, they need to be consistently satisfied by the value proposition offered by the company.
The larger the service component of the value proposition, the higher the impact of those responsible for designing and delivering the service. Hence the emphasis on employee satisfaction: content employees are more productive, go that extra mile, and are less likely to look around for other job opportunities.
People are the most important asset a company can invest in
The end result is a positive feedback loop, from which all stakeholders benefit.
Notes and references
(I) Sustainable success requires companies to invest their available resources (e.g. budget, management attention) in their distinctive or core competencies. Other competencies should be sourced from external partners. Prahalad and Hamel consider a competency core when it is not easily copied by competitors, can be applied to other products and markets, and contributes to the benefits and value experienced by the end consumer.
(II) Source: Merriam Webster dictionary.
(III) Heskett, J. L. , Jones, T.O., Loveman, G. W., Earl Sasser, W. Jr., Schlesinger, L. A., Putting the Service-Profit Chain to Work, Harvard Business Review, 2008.
The environment in which we live and work today is more uncertain and complex than ever before in history. To survive, let alone thrive, the leadership team has to boost its capability to sense and act on both foreseen and unforeseen events quickly and decisively.
According to McGrath the downfall of Sony, BlackBerry, Blockbuster, Circuit City and even the New York Stock Exchange can be attributed to failing to sense and act on both foreseen and unforeseen events quickly and decisively.
“Their downfall is a predictable outcome of practices that are designed around the concept of sustainable competitive advantage. The fundamental problem is that deeply ingrained structures and systems designed to extract maximum value from a competitive advantage become a liability when the environment requires instead the capacity to surf through waves of short-lived opportunities. To compete in these more volatile and uncertain environments, you need to do things differently.”
Of the 500 largest companies in 1957, fewer than eighty were still part of the S&P 500 forty years later. Some were taken over, but most shrunk or simply went bankrupt.
Even today, Facebook, Twitter, LinkedIn and other young multibillion-dollar companies are not exempt from these economic forces. Yahoo was one of the pioneers that turned the internet into a billion-dollar business. Today it is struggling to get its mojo back.
Facebook attracted a huge teen following, an important demographic for marketeers, from its inception. However, privacy concerns, combined with Mom, Dad, Aunt Edna, Uncle Jim and the rest of the uncool lot joining Facebook, are affecting engagement with this age group. Consequently, teens move on to apps like WhatsApp, Snapchat and others to communicate with their peers. Somewhere between today and a couple of years from now, a startup will introduce a new value proposition, starting a new cycle.
Technology is therefore both a key enabler of new business models and at the same time a major source of strategic risk.
Data follows a similar path. The continued miniaturization of sensors, CPUs and other components turns 'dumb' products into 'smart' ones. This too is a potential source of billions in revenue for both IT service providers and the companies using their solutions. Downsides include bankruptcy for companies ignoring the Internet of Things and Big Data altogether, and waste for companies unable to effectively realize the potential value represented by these buzzwords.
Combine data with advanced algorithms and you have a tool to automate knowledge-intensive work, create robots that maintain other robots, and build autonomously driving trucks, cars and airplanes. However, until Artificial Intelligence (AI) becomes mature enough to dynamically handle a myriad of situations, both foreseen and unforeseen, weaknesses in either the data set or the algorithm could result in dramatic distortions in the value chain, or a car ending up in the ditch.
Budget is only part of the solution
It is important to note that the changing role of technology does not equal asking the CFO to double the IT budget or adopting every new technology entering the market.
The success of Apple’s iPad did not come from the introduction of a new disruptive technology. It is a winner because Apple combined easy access to a wide variety of books, music, games and movies with a good-looking, high-quality device. Additionally, the iPad provided so much more functionality than the average e-reader that it created a new market. Consumers did not know they had the need until Apple launched the product.
As a result, the iPad sold more than 3 million units in its first 80 days, making it the fastest-selling electronic device at the time. Number two, at a considerable distance, was the DVD player with 350,000 units in its first year.
The creation of new (uncontested) market spaces as a means to break away from traditional competition models is described by Kim and Mauborgne in their book Blue Ocean Strategy. They argue that the traditional fighting for competitive advantage, battling over market share, and struggling for differentiation has resulted in a bloody “red ocean” of rivals fighting over a shrinking profit pool. Tomorrow’s leading companies, the authors contend, will succeed not by competing head-to-head with competitors, but by fulfilling a new demand in an uncontested market space, creating a “blue ocean.” Oceans that will in most cases be full of technology and data.
The need for technology and business departments to act fast and decisively is amplified by the infusion of information technology into our day-to-day lives. Today, some 450 million of the 4 billion people using mobile phones have internet access on their phone, and that number is expected to grow fast. We can consume information 24 hours a day, purchase a book at 3 am and read a memo from a colleague at the breakfast table. Information technology is not only changing business models, but also the way we spend our free time (e.g. checking our Facebook account, playing mobile games).
Technology overcomes many boundaries, enabling companies to tap into new markets and enriching the private lives of billions of people. We are part of a global ecosystem, with all its opportunities and challenges. To thrive in this world, companies need to invest in the capabilities reflected by the six principles introduced as part of the Digital Manifesto.
Biological ecosystems consist of multiple interdependent species that need each other in order to survive. Species use natural selection mechanisms to adapt to their environment, and the most successful ones produce more offspring. Kelly’s book Out of Control uses beehives, the economy, intelligence and evolution as examples of systems where the sum of the parts creates more than the individual parts (e.g. one bee, company or brain cell) can. His nine ‘incubation’ principles in particular are worth a look when faced with uncertain and complex needs and wants from internal and external customers. Another interesting article on the same topic is The Biology of Corporate Survival by Reeves, Levin and Ueda. They point out that:
“Business environments are more diverse, dynamic, and interconnected than ever—and far less predictable. Yet many firms still pursue classic approaches to strategy that were designed for more-stable times, emphasizing analysis and planning focused on maximizing short-term performance rather than long-term robustness.”
The article provides business and IT leaders with several strategic pointers to improve the alignment between external environment and the company.
Kelly, K., Out of Control: The New Biology of Machines, Social Systems, and the Economic World, 2008.
Reeves, M., Levin, S., Ueda, D., The Biology of Corporate Survival, Harvard Business Review, January-February 2016.