CrowdStrike is accountable for ITIL failure but Microsoft must manage its SaaS ecosystem more diligently


Last Friday’s cluster &^%$ of IT outages plaguing companies globally will likely result in several billion dollars of economic impact. However, for CIOs, the problem wasn’t a security issue. Instead, this was an IT service management (ITSM) issue that caused massive disruption for companies relying on Microsoft’s Windows platform.

Software-as-a-service (SaaS) has become mainstream, with our research showing 68% of enterprise software is delivered using this model today. Because SaaS allows the software vendor to maintain, upgrade, and improve its solutions via cloud delivery, regular updates are par for the course. However, as the CrowdStrike outages illustrate, many IT departments have become too complacent, allowing their SaaS vendors full control of application management, updates, and automated delivery, especially when it comes to security updates. In addition, tech giants like Microsoft must be more diligent with their SaaS ecosystem partners.

This event happened because corporate IT has blindly trusted SaaS providers to patch their own solutions and assumed adequate testing and risk assessment

Friday, July 19th, wasn’t the first time there have been significant IT outages. For example, Rogers, the second-largest telecom services provider in Canada, significantly impacted its customers in July 2022 with a router update from Cisco. And in 2020, SolarWinds, an IT management software firm, dealt its customers a similar blow when their systems were compromised after an update. In the case of SolarWinds, the event was traced back to a bad actor implanting malicious code in an update.

While companies depend on security patches to safeguard their systems, applications, and data, blindly trusting a loose federation of software companies to be mutually compliant is increasingly risky. Any IT leader worth their salt must have a process that not only governs software but also ensures that new software and patches receive at least a modicum of testing to ensure compliance and stability. One only needs to crack open that dusty volume on the ITIL (Information Technology Infrastructure Library) framework to recall the importance of having a standardized process for quality assurance, testing, and deployment.

What caused the CrowdStrike mayhem was the release of a faulty virus definition update. Because such updates are pushed automatically and accepted automatically by its Falcon antivirus software, once the update was enabled it caused the ‘blue screen of death’.

These automatic updates from SaaS vendors are common. However, endpoint detection and response (EDR) and antivirus firms push out a significant number of virus definition updates per week, sometimes even per day, depending on the severity of the viruses they’ve discovered. This is all done to meet device-level security requirements mandated by standards like SOC 2. However, when CrowdStrike released its update early on Friday, July 19, it resulted in a global Windows meltdown for nearly every firm running CrowdStrike’s Falcon product.

CrowdStrike is at fault due to negligent processes

The ONLY explanation for this is CrowdStrike’s fundamental failure to follow basic ITSM or ITIL practices. ITIL is an industry-recognized, five-stage framework outlining a set of best practices and guidelines for managing and delivering IT software and services. ITIL offers software development teams a systematic approach to IT service management (ITSM), with a focus on aligning their services with the needs of the business and ensuring the quality of the products they deliver.

In the case of CrowdStrike, its development team likely glossed over the third stage, service transition. While it likely followed its standard operating procedure for writing the virus definition update code, it appears the team dropped the ball here, whether through oversight or hubris. As a reminder, in service transition, standard ITIL practice dictates that the developer ensure the software (package, feature, or update) undergoes a validation and testing step. Surely, CrowdStrike has a stage gate for this, don’t they?

This step would have put the update through a quick battery of code testing, integration testing with the Windows OS, and finally system testing across the antivirus agent, Windows, and any additional services that might be called. Given that Windows clients globally failed as soon as the update hit the CrowdStrike Falcon software, it is pretty clear there was a lack of quality or system testing prior to release.
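
To make the service-transition point concrete, here is a minimal, hypothetical sketch of what such a pre-release stage gate could look like. Every name here (the checks, the functions, the file name) is an illustrative assumption, not CrowdStrike’s actual pipeline; a real gate would run these checks against disposable Windows test machines before any customer ever received the update.

```python
# Hypothetical pre-release stage gate for a content/definition update.
# Illustrative only -- the checks and names are assumptions, not any vendor's real pipeline.

from dataclasses import dataclass
from typing import Callable, List

@dataclass
class GateResult:
    name: str
    passed: bool

def run_release_gate(update_path: str, checks: List[Callable[[str], GateResult]]) -> bool:
    """Run every check against the update; block the release if any check fails."""
    results = [check(update_path) for check in checks]
    for r in results:
        print(f"[{'PASS' if r.passed else 'FAIL'}] {r.name}")
    return all(r.passed for r in results)

def code_tests(update_path: str) -> GateResult:
    # Placeholder: confirm the definition file parses and passes unit tests.
    return GateResult("code/unit tests", passed=True)

def windows_integration_test(update_path: str) -> GateResult:
    # Placeholder: load the update on a clean Windows test image and confirm it still boots.
    return GateResult("Windows OS integration test", passed=True)

def end_to_end_system_test(update_path: str) -> GateResult:
    # Placeholder: exercise the security agent, the OS, and dependent services together.
    return GateResult("end-to-end system test", passed=True)

if __name__ == "__main__":
    approved = run_release_gate(
        "definition_update.bin",
        [code_tests, windows_integration_test, end_to_end_system_test],
    )
    print("Release approved for deployment" if approved else "Release blocked: do not push to customers")
```

The point is not the specific checks but the gate itself: an automated release simply should not reach customers while any of these placeholders returns a failure.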

Therefore, one can only assume that CrowdStrike’s developers made the poor decision to skip testing and trust that their update would just work. This is a black mark on CrowdStrike’s quality control, assuming it has one, and should lead to many CIOs asking their CrowdStrike rep, “WTF, don’t you test these?!”

As an aside, this isn’t the first time this has happened under the watch of CrowdStrike’s CEO, so this is likely an endemic issue with CrowdStrike’s internal processes.

This fiasco is not Microsoft’s direct fault, but it does highlight weaknesses in its SaaS ecosystem

This outage has nothing to do with Microsoft, per se, even though Microsoft has had similar issues in the past with malformed Microsoft Defender antivirus updates that disabled users’ computers.

Yet, while not a Microsoft-caused issue, Microsoft must revisit how SaaS developers are allowed to release software into its ecosystem. Given the brand damage and the potential legal and regulatory suits that will emerge from this, it is likely, though it should never be assumed, that Microsoft will need to review its partner program.

In this review, Microsoft’s partner program must revise how its ecosystem is regulated and held accountable for damage-causing economic events of varying magnitude. The highest tier, and arguably the one CrowdStrike reached this past week, may warrant revoking the partnership. While adding additional stage gates may slow down delivery, innovation, and services, Microsoft may have to implement stringent guidelines; not doing so makes it culpable for failing to ensure its partners are accountable when their products have a detrimental impact on its customers.

So what is next for the world of SaaS and cyber?

The global outages caused by a CrowdStrike update could have been mitigated had CrowdStrike followed ITIL. Basic ITIL procedures dating back several decades call for responsible testing prior to a release.

CrowdStrike took down global operations for millions of firms with a malformed virus definition pushed as an automated update to its EDR clients, and it is culpable for not testing it. This negligence should be the focus of many companies and their lawyers.

And while this (hopefully) will be a one-time event for the vendor, the damage done was nearly catastrophic. The result is a significant erosion of trust between CrowdStrike and its customers. But we can’t just blame CrowdStrike; we signed the agreements that allowed this to happen.

Nonetheless, we are fools if we don’t learn from this event. As such, every CIO must revisit the policies and procedures for accepting updates from their SaaS and applications vendors. This will be an extensive list that will stretch from Microsoft’s Windows updates to SAP. Through this exercise, the CIO must come to an understanding of which vendors may have a material impact on their firm’s ability to function.

HFS isn’t advocating that the CIO adopt a manual review of every patch or update; that would be a fool’s errand. Rather, we recommend that a firm’s IT leadership, and its managed services providers if it has them, take stock of the enterprise user license agreements (EULAs), who owns the risk, and which terms of service may require updating given the risk or impact a similar outage may cause in the future.

As many are likely to discover, it is unlikely the CIO, CEO, or chief legal officer will have much ground to stand on with CrowdStrike after this event. But to remain complacent about future events is negligent. After all, what responsible firm will trust its supply chain (tech or non-tech) to make changes to solutions that may render its systems inoperable? There must be a ‘trust but verify’ program ensuring that changes originating outside the IT organization follow some approval/staging/release-gate process prior to widespread deployment inside the organization.
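
As one illustration of what such a ‘trust but verify’ gate might look like in practice, the sketch below promotes a vendor update through progressively larger deployment rings and halts the moment a health check fails. The ring sizes, soak times, and function names are our own assumptions for illustration, not a prescription for any particular toolchain.

```python
# Hypothetical ring-based 'trust but verify' rollout for third-party updates.
# Ring definitions, soak periods, and the health check are illustrative assumptions only.

import time
from typing import Callable, List, Tuple

# (ring name, share of the device fleet, soak time in hours before promotion)
RINGS: List[Tuple[str, float, int]] = [
    ("lab canary", 0.01, 4),
    ("IT pilot group", 0.05, 24),
    ("entire fleet", 1.00, 0),
]

def staged_rollout(update_id: str,
                   deploy: Callable[[str, float], None],
                   healthy: Callable[[str], bool],
                   seconds_per_hour: int = 0) -> bool:
    """Promote an update ring by ring, halting at the first failed health check."""
    for name, share, soak_hours in RINGS:
        print(f"Deploying {update_id} to {name} ({share:.0%} of devices)")
        deploy(update_id, share)
        time.sleep(soak_hours * seconds_per_hour)  # 0 in this demo; real soak time in production
        if not healthy(update_id):
            print(f"Health check failed in {name}; halting rollout of {update_id}")
            return False
    print(f"{update_id} fully deployed")
    return True

if __name__ == "__main__":
    # Toy usage: a deploy hook that does nothing and a health check that always passes.
    staged_rollout("vendor-update-001",
                   deploy=lambda uid, share: None,
                   healthy=lambda uid: True)
```

In practice, the deploy and health-check hooks would be whatever endpoint management and telemetry tooling the firm (or its managed services provider) already runs; the principle is simply that nothing from outside IT reaches the whole fleet in one step.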

Expect another week of chaos and fallout.

The extent of this outage puts a lot of pressure on corporate IT and their services partners to reset computers and implement the fixes. With so many computers supporting critical systems, it’s likely that many, but unfortunately not all, endpoint devices can be fixed remotely; the rest will require teams of experts to manually apply updates, reboot, and test systems.

The Bottom Line: The CrowdStrike event is the latest call to arms for your CIO to assign a team responsible for re-architecting your software and SaaS policies.

These new policies must prioritize operational resiliency and follow basic ITSM/ITIL best practices. We will be assessing the productivity, economic, and supplier impacts resulting from this CrowdStrike event and how tech giants like Microsoft can hold their SaaS ecosystems more accountable in the future for this level of abject failure.

 

Posted in : Cybersecurity, SaaS, PaaS, IaaS and BPaaS, Security and Risk, Sourcing Best Practises, Uncategorized


GenAI or Die

We’ve finally reached the point in our careers when we need to learn new and more ingenious ways of doing things. No more papering over the cracks, no more passing the problems onto someone else to fix. WE are the problem that now needs fixing.

For decades, we’ve been throwing trillions at our enterprise operations, namely ERP systems, data warehouses, RPA, software engineers, process architects, etc. But to what end? To eke out some more efficiency here, rinse out some cost there? Move some data around a bit faster? Are these solutions still relevant by the time you’re done? Or is there always a new, shinier option capturing everyone’s attention?


It’s time to reinvent our whole mindset toward enterprise operations to save ourselves from drowning in our legacy pools of technology, process, data, skills, and culture.

Yes, we’ve been predicting the extinction of the corporate dinosaur for a while now, but we’re truly arriving at the time when tired old businesses will be displaced if they persist in limping along under the sheer weight of valueless technology and transactional people, determined to cling to the “way we’ve always done things.” Your firm will sink under its inability to change anything, it’ll become increasingly uncompetitive, and you will go down with it gasping for air, wondering why you suddenly lost your corporate relevance.

GenAI is the closest we’ve come to bridging the business/IT chasm

Eighteen months into the most significant technology rush since the ERP heydays, GenAI has the potential to change the game entirely because this is the first time business leaders can think like application developers to create new content, to access critical data, to command better ways to get things done, and so on.

Its ability to analyze unstructured data, create content, and qualitatively analyze and make recommendations is striking. The pace of model evolution and compute power is quickly advancing the technology to the point we are beginning to believe that GenAI-enabled middleware and frontend solutions may replace existing business process applications.

The big question now is whether business owners who clearly own innovation can reinvent how their companies operate or whether they are too burdened by three decades of technology, process, data, and culture debts.

Generative AI is your secret weapon. It tears down barriers, fuels innovation, and keeps you ahead of the curve. Adapt or perish – the choice is yours.

Using Michael Porter’s Five Forces (+1) framework, let’s break down how generative AI (GenAI) is shaking up the enterprise world:

  1. Threat of New Entrants: Smashing Entry Barriers. GenAI lowers technical and operational barriers for new entrants. OpenAI’s APIs, for example, enable startups to integrate powerful AI without heavy R&D costs, allowing them to compete with established giants. Brace yourself for an influx of AI-driven competitors, like how Grammarly leverages advanced AI to disrupt traditional proofreading and writing tools.
  2. Bargaining Power of Suppliers: Breaking Supply Chains. AI’s predictive and analytical powers disrupt traditional supply chains by eliminating middlemen. Farmers Business Network (FBN) uses AI to provide farmers with data-driven insights, price transparency, and predictive analytics, enabling direct and cost-effective purchasing from suppliers. This leads to significant cost savings, increased efficiency, and greater empowerment for farmers.
  3. Bargaining Power of Buyers: Mass Personalization of the Customer Experience. Mars drives mass personalization in its pet food business using generative AI by analyzing data on pet health, preferences, and behaviors to create tailored nutrition plans. Through platforms like Kinship, Mars offers customized pet food recommendations, enhancing customer satisfaction and loyalty.
  4. Threat of Substitute Products or Services: This is where disruptive innovation comes into play. Adobe’s AI-powered tools, like Adobe Sensei, allow users to create stunning visuals and designs effortlessly, outpacing traditional design methods.
  5. Industry Rivalry: Productivity on steroids. With AI-driven automation, efficiency skyrockets. Amazon’s use of robots and AI in warehouses has revolutionized logistics, intensifying industry rivalry. GenAI provides killer insights; retailers like Zara use AI to analyze market trends and consumer behavior, enabling them to react swiftly to fashion changes and outmaneuver competitors.
  6. Bargaining Power of the Workforce (our Bonus Force): The new workforce will not accept the old. The next generation of employees won’t settle for outdated systems and processes. GenAI is to work what calculators are to math – indispensable. Companies like Google and new startups attract top talent not only with higher pay but also by avoiding cumbersome processes and systems that make work dull and boring.

It’s time to effect real change and write off our painful legacies if we want real innovation

Do you believe your current IT department, whose massive budget is 80% allocated to keeping the lights on, is your firm’s best investment toward achieving your goals?

Do you believe that outsourcing your F&A team has solved your business problems? Do you think your 7-year global ERP suite implementation has kept pace with the changing needs of your business over this time? Considering breakthrough GenAI capabilities, it’s time you reflect on this – it’s not too late to make radical changes to keep yourself relevant and reshape your business function as we stumble face-first into the Generative Era.

Over the last three decades, enterprises have implemented organizational models and technologies that perpetuate existing processes based on operating models dating back to the 1990s – and some even as far back as the Second World War.

We have to quit funneling most of our tech funds into maintaining legacy business processes

Despite trillions of dollars spent over the decades, process automation and data remain a stark problem. Our latest data on this topic, in which 366 operations leaders (not technology leaders) identified their companies’ top challenges, clearly indicates this, while also showing that they don’t have the budget to fund the innovations they require.


Business applications become saviors of the business

If all we’re going to do is outsource work, strap RPA robots into the pilot seats, and replace old technology with new technology that processes widgets in similar ways, consider for a moment if these incremental approaches funded by loose pocket change will ever be enough. Even among the large-scale technology programs that take years to complete (do they ever really end?), are the solutions still relevant by the time you’re done? Or is there already a new, shinier option capturing everyone’s attention?

Decidedly, business leaders are voting with their wallets as they flock to specialty applications full of features ERP vendors can’t produce and relatively minor implementation costs. Business leaders are leaving behind big SAP and Oracle ERP worlds that increasingly are little more than glorified “systems of record.” ERPs are no longer strategic assets, despite their eye-watering 9-digit implementation and proportionately large maintenance costs weighing down companies’ financials like boat anchors. Undoubtedly, every CFO has felt the “replace it with Workday” push, but they are hesitant because the future has shifted elsewhere.

ERP systems have largely become the dull data layer beneath a raft of specialty functional applications orchestrated by Salesforce, Pega, ServiceNow, Zip, or OutSystems. It doesn’t matter what SAP and Oracle implement from a functionality or GenAI perspective – their clients are looking at true business applications for innovation. However, these once-affordable platforms’ sales teams are pushing hefty price hikes onto clients who now hold increasingly less love for them as a result. This has led to a new group of technology providers, like Unqork, whose pitch is, “Why pay the big-ticket, ever-escalating prices of specialty applications when you can build it yourself to your exact specifications in our low-code systems?”

Yet, the dark secret of many business applications is that they simply replicate the old processes in a SaaS environment. Consider for a moment if your company changed its approach to sales because it implemented Salesforce, changed its performance management program when it implemented Workday, or changed its T&E process when it implemented Concur. All the old work methods were replicated into easier-to-use, flexible, lower-cost systems now owned by the business, not IT.

However, this is no longer IT’s fault. It’s clear who is in the driver’s seat. Whereas CIOs used to rule the roost with large ERP applications and complex data warehouse projects, business leaders now control the purse strings. Our latest data from a study of 605 organizations shows clearly that a company’s IT department controls less than half of the corporate IT spending in 46% of organizations. So, the ownership of innovation is not an IT issue; it’s the business’s.

GenAI Comes to the Forefront:  Embrace it with your eyes wide open

Lest you think that GenAI is just another RPA-like hype, think for a moment about how it has already changed entire industries:

  1. Until recently, the creative industry relied on legacy photographers and videographers to create and then enhance images and videos using digital tools, often pixel by pixel. Now, led by Adobe’s fast adoption of GenAI, an entire wave of GenAI illustration and editing tools can change whole images and videos in seconds. Still, as powerful as this is, GenAI’s ability to create hyper-realistic images and videos catered to narrowly specified requirements threatens the entire workforce, forcing workers to scramble to reinvent themselves and spawning new developments in intellectual property law and deepfake detection.
  2. In the world of education, long reliant on the legacy of dusty libraries, lengthy essays, and math drills, professors and teachers realize that GenAI is a threat to homework and how people learn. Professors and teachers now rely on GenAI to build courses, lesson plans, and assignments once sold online as part of the textbook industry. But what good is any of that if anyone who wants to learn anything can, with a handful of prompts and access to the world’s entire knowledgebase, do the same and even obtain feedback on their work? What is the point of the university and training industry if learners can generate more effective, tailored learning models than teachers can?
  3. The social media influencer industry is being turned on its head by GenAI. While the industry is impacted similarly to the creative industry, more and more social media accounts are being created and run by ChatGPT hustlers that use GenAI capabilities to analyze audiences and competitors, and then generate social media designed to drive engagement. An untrained influencer can create blogs, email campaigns, images, and advertising copy in seconds, empowering teams to create faceless virtual influencer accounts. While these clout chasers may only have fringe impacts, realize the same revolution is occurring in the legacy advertising industry amid upheavals in the same disciplines: copywriting, audience data analysis, market research, and search engine optimization. The people-centric professional services industry is undergoing radical change.

It is important to note that these real-world impacts are on white-collar workers long entrenched in antiquated work practices. Innovative business leaders are leveraging GenAI’s power to completely change how their industries operate.

These same impacts are being felt directly in the software industry. Look at every technology provider’s rush to demonstrate GenAI capabilities in every solution they have. Behind the scenes, they are struggling with their own technology and business process debts that are built into the foundations of the data models, application rule engines, and workflow capabilities built to allow clients to do the same old work with a bit more efficiency and reporting capability. How many demos have you seen showing the latest AI-powered widget in your software? Behind the scenes, they fear the dawn of a new generation of applications will have no legacy debt and be fully capable of changing the way we operate – if clients will let them.

And this is the big question. Will business owners who clearly own innovation reinvent how their companies operate? Or are they too burdened by two decades of technology, process, data, and culture debts and repeated examples of technology implementation that essentially digitally replicated exactly what once existed on paper and in file cabinets? For every dozen legacy leaders taking a “wait and see” approach to GenAI, one is changing the game, blazing new approaches to delivering customer service, financial operations, security management, strategic sourcing, and legal services.

The Bottom-line:  This is Your Moment… Embrace GenAI or Die

Maybe we’re overreaching a bit. You clearly won’t die if you don’t implement GenAI to re-engineer your operations completely (though in the healthcare world, you could save lives!). But your company will fall behind your competitors. New debtless digital-first competitors will arrive. Your company’s results will start to lag and, we assure you, some other smart and creative person with some prompt training will certainly take your job.

The time for complacency has ended. GenAI represents a seismic technological shift, offering unparalleled capabilities to transform business operations. Decades-old methods of using systems to digitize old workflows provide only incremental improvements, which are no longer sufficient. Companies must embrace GenAI or risk being outpaced by more agile, innovative competitors. The era of maintaining outdated processes and technologies is over.  Now, it is about revolutionizing how we work, leveraging GenAI to unlock new efficiencies, insights, and opportunities. This is OUR moment. Seize it with GenAI or face the inevitable decline. The choice is clear: innovate or…

Posted in : Artificial Intelligence, Automation, Autonomous Enterprise, Buyers' Sourcing Best Practices, ChatGPT, Design Thinking, Digital OneOffice, Digital Transformation, GenAI, Generative Enterprise, Global Business Services, GPT-4, GPT-4o, OneEcosystem, OneOffice


Can today’s service providers modernize manufacturers with their plethora of technology expertise?

Manufacturers are striving to achieve better visibility into their daily operations, from sourcing and design to factory operations and the various departments involved. They are taking steps to modernize their operations by infusing technologies such as digital twin, the internet of things (IoT), 6G, 5G, cognitive robotics, artificial intelligence (AI), machine learning (ML), and digital thread, and by standardizing enterprise resource planning (ERP) and manufacturing execution systems (MES) to achieve operational clarity. At the same time, manufacturers are working to make the supply chain more resilient and to adopt sustainability practices amid evolving regulatory and compliance requirements to create a well-rounded, sustainable manufacturing value chain. Manufacturers are looking to drive digital transformation with a consistent focus on efficiency and building resilience across the value chain.

HFS’s newly published Manufacturing Intelligent Operations Services report explores the capabilities of the industry-leading service providers, enabling manufacturers to achieve their goal of autonomous manufacturing and sustainable practices, making operations efficient and resilient. 

Exhibit 1: Manufacturing Intelligent Operations Services Report, 2024, evaluates the capabilities of the 12 service providers across the horizons

Strengthening the foundation for intelligent manufacturing with data and technology collaboration  

To achieve the goal of integrated manufacturing processes, establishing a data-driven business model is essential. Manufacturers must put data at the heart of their operations, going beyond simple regular reporting. They need to integrate data from across functions, breaking silos to optimize shop-floor operations and drive innovation and real-time decision-making. Thus, the convergence of information technology (IT) and operational technology (OT) has become crucial in gaining access to comprehensive real-time data for better decision-making and assessment of processes on the shop floor.

Real-time data from the convergence of IT and OT also enables the seamless implementation of digital twin and digital thread, creating a connected ecosystem. Digital twin enables all involved stakeholders to visualize a digital replica of an operational asset or manufacturing facility. It creates a unified environment that helps to eliminate silos while enabling data centralization from multiple sources, such as sensors, machinery, and enterprise systems. And digital thread helps to establish traceability and connect data throughout the manufacturing lifecycle. Together they break down communication silos and establish a continuous loop across the manufacturing value chain. 

It is crucial to bridge the IT and OT divide to create a data-driven culture, support technology transformation, and enable real-time visibility into end-to-end shop floor operations. Service providers are supporting manufacturers in driving collaboration between IT and OT data by integrating unified digital platforms to increase operational efficiency. 

Building a partner ecosystem to enhance capabilities and drive innovation

Service providers are building robust partnerships and strategic alliances to build a strong delivery ecosystem that supports evolving customer requirements and constantly changing market dynamics. Service providers are building alliances with niche players, hyperscalers, startups, industry bodies, and academia to drive innovation, foster digital capabilities, and deliver value to clients. This not only enables them to build innovative solutions around emerging technologies but also to leverage a vast pool of talent in driving digital transformation. This establishes that service providers are continuously aiming to evolve with the rapidly changing industry and customer demands and requirements and tap into new growth opportunities. 

Resilient supply chains and sustainability are becoming key priorities 

With supply chain challenges causing recent disruptions, manufacturers are increasing investments to gain supply-chain visibility. Simultaneously, increasing regulatory pressure is driving manufacturers to adopt circular economy and decarbonization practices. Service providers are responding on both fronts:

  • Service providers are building connected and intelligent supply chain capabilities by analyzing operational requirements and integrating technologies such as predictive analytics, AI, and digital twins to optimize and gain real-time visibility into processes.  
  • Service providers have also geared up to help manufacturers mitigate risks related to carbon footprints, waste reduction, energy management, pollution management, and green supply chains. 

The Bottom Line: Service providers will need to continue to orchestrate smart manufacturing strategies to drive end-to-end digital transformation across the manufacturing operations value chain, striving for efficiency

HFS subscribers can download the report here.

Posted in : HFS Horizons, Manufacturing


Health systems have unprecedented options to address all that ails their business

There has been much progress with life expectancies, health outcomes, and access to care across the globe—although it is inconsistent. Therefore, it should be unsurprising that the triple aim of care (cost, experience, and health outcomes) remains elusive in the US and worldwide. Still, the post-pandemic years have seen significant progress in applying technologies smartly and an attitudinal shift to experiment in a regulated industry.

In the HFS Horizons: Healthcare Providers (HCP) Service Providers, 2024 study, we evaluated 36 service providers for their ability to address the cost (Horizon 1), experience (Horizon 2), and health outcomes (Horizon 3) for health consumers globally.

Exhibit 1: The increasing number of service providers capable of addressing the triple aim of care is reflected in the large cohort of Horizon 3 placements

Financial pressures take center stage for health systems and hospitals

The end of the pandemic-driven stimulus has returned healthcare providers to a new reality with even greater pressure to remain above water financially. Given the deteriorating health outcomes in the US and other large population centers, demand for care is increasing. This is why providers must discover a way to meet the demand in a cost-optimum manner.

The desire to improve the point-of-care experience and drive connectivity in post-care scenarios is growing, in order to drive retention and value-based care (VBC) outcomes. Providers are becoming more open to technology-enabled innovation to improve productivity and health outcomes and to address financial woes.

Lastly, there has been a material increase in outsourcing and offshoring for services and technology enablement, an acceleration over the past few years driven by the need of the hour and the supplier landscape.

Service provider capabilities and attitudes shift toward the possibilities

Service providers have improved their ability to address complex clinical and financial challenges by attracting talent from the industry, investing in key technologies, and becoming bolder in their market approach. Many service providers have significantly expanded their delivery footprint across the globe, addressing new markets and solving distinctly local problems.

The sophistication of capabilities leveraging technology to predict a pneumothorax, prevent diabetes, or facilitate food as a prescription adds an incredible new lever in the marketplace to help health systems and hospitals fight diseases.

Despite progress, opportunities continue to be missed to address the triple aim of care

While many service providers have the capabilities to address all attributes of the triple aim of care and have ample evidence to showcase results, health systems and hospitals remain entrenched in the legacy paradigm of addressing episodic symptoms rather than holistic health.

The bias toward acute care has translated into ignoring and underserving primary care, post-acute care, and rehab (physical therapy, substance abuse treatment). Technology adoption in the non-acute care space for electronic health records and revenue cycle management is below 50%, indicating a huge opportunity to remediate and support basic healthcare.

The Bottom Line: Those who help save lives need help keeping their businesses alive so they can positively and consistently impact the triple aim of care.

As the US and world population ages rapidly, climate change’s implications for health become acute, and clinician shortages are exacerbated, implementing creative ecosystems and improving how we address health and deliver care equitably has never been more urgent. The contemporary paradigm is a legacy sick-care construct ill-suited to the 21st century. We all need to lean in to reimagine and do it fast to move the triple-aim needle.

HFS subscribers can download the report here.

Posted in : Healthcare, Healthcare and Outsourcing, HFS Horizons, Life Sciences


Businesses are in an arms race to create low-touch collaborative supply chains

In response to disruptive global events such as geopolitical tensions and continuously changing shipping guidelines, enterprises are increasingly prioritizing making their supply chains resilient and agile. There is a significant emphasis on digitalization, integrating AI, IoT, and analytics technologies to improve network transparency, efficiency, and security. Sustainability has also become crucial, as businesses aim to minimize environmental impact and uphold ethical practices in their supply chains. These changes not only address regulatory pressures but also align with consumer expectations for responsible business practices. Overall, these shifts are transforming supply chains into more dynamic, interconnected, and accountable systems capable of meeting the complex demands of the modern global market. 

There’s a growing trend toward creating collaborative ecosystems that include suppliers, partners, and even competitors. This approach leverages shared technology platforms and data insights to collectively drive efficiencies and innovate supply chain solutions. 

In the HFS Horizons: Supply Chain Services 2024 study, 18 supply chain services providers are analyzed and profiled. Seven of these providers are identified as leaders in Horizon 3, focusing on ecosystem collaboration. Seven are identified as innovators in Horizon 2, excelling in cross-functional alignment, and four are identified as disruptors, primarily working on function-level transformation.  

Exhibit 1: 68% of upcoming supply chain investments in the next 2 years will be in Horizon 3

New service offerings coming into the fray 

  • Generative AI integration: There’s a significant push toward integrating GenAI across various facets of supply chain management, from planning and logistics to customer interaction and compliance. This technology is expected to enhance automation, improve decision-making, and create more dynamic and responsive supply chain systems. 
  • Sustainability services: Providers are increasingly offering services to achieve sustainability goals, such as carbon footprint reduction, lifecycle assessments, and sustainable sourcing strategies. These services are crucial for companies aiming to meet regulatory requirements and consumer demands. 
  • Digital twins and advanced analytics: The use of digital twin technologies and advanced analytics is being expanded to offer more detailed insights into operations, enabling predictive maintenance, and optimizing supply chain resilience. 

New buying patterns are surfacing 

  • Shift toward subscription and as-a-service models: There’s a noticeable trend toward subscription-based and as-a-service purchasing models. These models provide flexibility, reduce upfront costs, and align with the increasing preference for OpEx vs. CapEx expenditures in corporate budgeting. 
  • Increased demand for customized solutions: Enterprises are looking for solutions they can tailor to their specific needs, reflecting a move away from one-size-fits-all offerings. This customization is particularly prevalent in areas such as AI implementations and data analytics services. 

New scope of work on the table 

  • Global expansion: Organizations are increasingly designing supply chain solutions to support global operations, with a focus on integrating cross-border supply chains and managing international compliance and logistics challenges. 
  • Focus on resilience and agility: Services are being developed to enhance the resilience and agility of supply chains, enabling enterprises to respond more swiftly to market changes and disruptions. This includes tools for better risk management and dynamic rerouting of logistics in response to external shocks. 

New operating models being adopted 

  • Collaboration across sectors: There’s an increasing emphasis on collaboration across different sectors and industries to optimize supply chain operations. This involves partnerships with tech companies, logistics firms, and even competitors to pool resources and capabilities. 
  • Leveraging big data and IoT: The scale of supply chain operations is expanding with the integration of IoT and big data. These technologies enable teams to handle vast amounts of data across extensive networks, improving real-time decision-making and operational efficiency. 

The Bottom Line: The shift toward deeper digital integration and an evolved operating model is transforming human-dependent supply chains into low-touch collaborative supply networks. 

HFS subscribers can download the report here.

Posted in : Artificial Intelligence, GenAI, HFS Horizons, internet-of-things, Supply Chain, supply-chain-management, sustainability


Cognizant’s $1.3B Belcan Bet: A high-stakes move to disrupt the IT services stalemate with hybrid engineering, IT, OT, and BT capability


Cognizant’s bold $1.3B acquisition of Belcan isn’t just a headline grabber; it’s a game-changer in an IT services market that’s hitting a plateau:

  • Growth Catalyst: Engineering services, where Belcan excels, is the rocket fuel Cognizant needs. While IT services see modest gains or losses between -5% and +5%, engineering services are soaring with growth rates over 10%.
  • Tech Arsenal Upgrade: Belcan’s technical engineering services prowess adds serious firepower to Cognizant’s already robust suite of emerging technologies and digital engineering capabilities. With formidable AI, automation, and analytics capabilities, Cognizant now stands tall across all high-growth, high-adoption technologies:


  • Diverse offerings and domain expertise: Belcan brings expertise in both engineering and traditional IT development, testing, and integration capabilities across Aerospace and Defence, Automotive and Industrials. This strategic expansion and capability addition in global locations complements Cognizant’s existing technology expertise.
  • Niche Industry skills: Belcan brings deep technical expertise in high-precision sectors of Aerospace and Defence, Automotive and Industrial – industries where subject matter expertise is crucial to delivering complex projects. These industries are investing in digital threads, MBSE, integration of operations and enterprise technologies, building sustainable solutions, and supply chain resilience, which is Belcan’s sweet spot.
  • Strategic Differentiation: In these asset-heavy industries, Belcan’s expertise gives Cognizant a unique edge in large-scale transformational deals. This expands Cognizant’s digital transformation capabilities into aerospace and defense industries that are currently in a growth trajectory and are undergoing technology disruptions, compelling them to think of new product development, efficiency in operations, and business model transformation. This acquisition positions Cognizant as one of the few major players capable of seamlessly integrating IT (Information Technology), OT (Operational Technology via Belcan), and BT (Business Transformation) (See below). Only a few service providers, like Accenture, HCLTech, Infosys and Capgemini, can boast a similar trifecta:


The challenges of this merger

Historically, Indian heritage service providers have struggled with large mergers. But if Cognizant can bridge these gaps, this acquisition could redefine the competitive landscape. However, all this hinges on seamless integration—no small feat given both firms are very different:

  • Geographical Divide. Belcan’s workforce is predominantly North American, while Cognizant’s strength lies in its substantial Indian presence. Although the value proposition is different, this integration should be similar to Capgemini’s integration of iGate, where European consulting norms were merged with an Indian IT culture.
  • Budgeting Clash. R&D budgets, Belcan’s domain, operate under different dynamics compared to IT budgets, which are Cognizant’s forte.
  • Reduce dependency on financial services and healthcare. Cognizant has a high revenue dependency on the financial services and healthcare sector—almost 60% of its revenue comes from these two highly regulated sectors. This deal will help Cognizant diversify its revenue mix. Additionally, the aerospace and defense sector is riding tailwinds due to the demand for travel and geopolitical conflicts.

Major opportunities this acquisition creates for Cognizant

Makes Cognizant one of the largest players in the engineering services business. There have been many acquisitions in the engineering services space recently – HCLTech acquired ASAP, Infosys acquired ER&D services provider in-tech, and Happiest Minds Technologies acquired PureSoftware Technologies. However, this is the largest engineering acquisition by an IT services major since Capgemini acquired Altran in 2019 for $3.9B. Cognizant’s engineering business post-merger will be in the region of $1.8B.

The investment price is reasonable, and the acquisition is almost 100% additive in revenues. The $1.29B acquisition price is very competitive, adding an estimated $800M in incremental revenues to Cognizant, of which 40% is in product engineering and 35% in embedded software. This broadens Cognizant’s offerings far beyond its mainstays of healthcare, life sciences, and financial services, which are struggling for future growth in traditional IT services markets.

The post-investment merger is set up to drive synergies and continuity. Cognizant will now boast a $1.8B global engineering services practice, which includes embedded software firm Mobica, acquired early last year, under the leadership of the well-regarded Lance Kwasniewski. In a similar vein to Jason Wojahn becoming leader of Cognizant’s ServiceNow practice with the Thirdera acquisition, HFS sees this as the smart strategy for Ravi Kumar to expand his team: retain key leadership talent to continue driving the businesses they built. This helps blend the cultures, retain key talent, and ensure continuity.

The aerospace and defence market opportunity is spectacular. The aerospace and defense market is booming in terms of commercial, private, and government spending, which HFS estimates will surpass $800B this year and $1T in two years’ time. There is also strong demand for MRO services in the industry, and because of Belcan’s OEM experience with clients like Boeing, Airbus, and Lockheed Martin, it is well placed to service this demand effectively. Being in a position to deliver AI, technology, embedded software, and technical engineering services is a real additive area for Cognizant to exploit in this high-growth sector, full of both large enterprises and hundreds of mid-sized subcontractors.

Seizing opportunities in the automotive, E&U, and industrials sectors, which are going through technology shifts. These industries are investing in new product development and in optimizing their manufacturing processes and supply chains, which demands industry domain and technical expertise because this is complex, high-precision engineering. For example, to seize product engineering opportunities in the A&D industry, a provider must have capabilities in aircraft design, avionics, propulsion systems, and defence technologies. Similarly, capabilities in vehicle design, powertrain development, and autonomous systems are needed to grab opportunities in the growing software-defined vehicle (SDV) and electric vehicle markets. The acquisition builds an industry-focused differentiator and is an opportunity for Cognizant to build next-gen technology solutions.

The offshore opportunity is still fledgling and positions Cognizant very strongly.   Boeing, Airbus, Collins Aerospace, Pratt and Whitney, Lockheed Martin, and Thales in the A&D sector, and similarly, ZF, Hella, BorgWarner, Volvo, Hyundai, Stellantis, and Ford in the automotive sector have set up GCCs in India.  There is also a significant focus on the expansion of engineering-centric GCC centers in India right across manufacturing and other verticals. The market is ripe for expansion, with Cognizant in a very strong position to take on these services.

The opportunity to take engineering and embedded software capabilities into other industries is clear.  Huge industries such as medical devices, household electronics, and automotive technologies are rife with demand for embedded software, product engineering, and supply chains.  With Cognizant’s focus on bringing Engineering, OT, IT, and BT together, there is a clear roadmap for growth, provided the firm can bring together the differing skills, cultures, and client needs effectively as a holistic and integrated capability.

Bottom line: Cognizant now has a seat at the engineer’s top table, but it needs to ensure it stays there

The deal strengthens Cognizant’s already existing technical expertise and deepens client relationships, creating major new market opportunities. This also puts Cognizant on the industry map for large-scale transformations with strong emerging technologies capabilities, making the firm a formidable player in the engineering services industry with domain and technical expertise. All Cognizant now needs is to cross the integration milestone seamlessly to build on this considerable momentum.

Posted in : Business Process Outsourcing (BPO), engineering, IT Outsourcing / IT Services, Manufacturing, Supply Chain, Uncategorized


Ten reasons why GPT-4o will pour fuel onto the GenAI smoldering platform


It’s hard to remember that the Generative Era is barely more than a year into its fledgling life since LLMs were cast upon us, and that the tremendous excitement GPT-4 generated came from dramatic improvements that help us create new content and data.

However, our new research* covering a quarter of the Global 2000 shows things have not moved as fast as many of us were expecting, with only 5% of enterprises committing significant technology spend on GenAI and successfully deploying GenAI solutions across multiple parts of their business, and two-thirds doing practically nothing:


So, what will spark enterprises to move faster with GenAI and keep it from the graveyard of so many previous technology “innovations”?

The big problem with any type of new tech, since John Mauchly initiated modern computer programming in 1949 with Short Code, is that business folks dump it onto the tech people to figure out and implement for them. While they get excited about the tech’s potential to make their businesses slicker, smarter, and more competitive, they do not believe they need to get pulled deep into the technology to understand exactly how it can do that and what the business needs to do to exploit it.

This is why so many businesses got sold down the river with cloud migration during the pandemic and RPA just prior. They bought into the vision the technology firms were selling but gave it to technologists to implement that vision without changing how they ran their businesses to drive that change.

As Microsoft CEO Satya Nadella recently bemoaned, the uptake of AI “hinges on other companies doing ‘the hard work’ of changing their cultures.” Easier said than done, Satya, but perhaps Microsoft needs to invest in fast-track change management services to help your clients buy more Copilot licenses?

Enter GPT-4o… another iteration of GenAI that just took things to a much more human level

The one thing that has been consistent since ChatGPT 3.5 was launched in November 2022 has been the continual proliferation of LLMs and the capabilities of the technology. However, it is the latest iteration of GPT that makes the biggest advancement yet and will surely wake up the majority of enterprise leaders, as we pointed out when GPT-4 hit the streets last year.

  1. Multimodal makes everything much more human.  The thing I am loving about GPT-4o, apart from it being twice as fast as GPT-4, is the “omni” or multimodal capability to bring text, vision, and speech into the same neural network.  With GPT-4, these were processed separately, with voice being transcribed into plain text, which erased the nuanced information before it reached the LLM; all the tone and emotion captured in the audio were reduced to plain, boring text.  Net-net, GPT-4o can process images, audio, video, and text simultaneously, whereas GPT-4 could only process text and images. In effect, old GPT was like texting a friend; GPT-4o is like calling a friend.
  2. Real-time human-2-machine conversation is now possible.  In short, we are able to converse naturally without first converting words to text, with real energy, emotion, and expressiveness. We’ll also be able to interrupt it, have it change its tone of voice, and respond with emotion.  The whole nature of collaboration with machines has gone to a new level.
  3. Enhanced multilingual support and capabilities. GPT-4o has greatly improved the quality and speed of ChatGPT’s international language capabilities compared to previous models. It can communicate fluently in dozens of languages, making it accessible to more users globally. The model demonstrates more robust performance in non-English languages and translation tasks.  Combined with its human-like chat and collaboration, surely the excuses to invest in generative customer engagement are moot?
  4. It really does have human eyes now. GPT-4o can read the expressions on people’s faces and judge their emotions when you simply point your iPhone camera at them. This thing really does have eyes that process what we see beyond transactional images. While GPT-4 had optical potential, GPT-4o is making AI optical capability much more real.
  5. It’s being incorporated into Apple’s iPhone and Google’s Android operating systems.  The earlier version of ChatGPT Voice available in the iPhone and Android app allowed you to converse with the AI in a relatively natural way, but it wasn’t listening to what you were saying; rather, it converted it to text and analyzed that instead. Hence, Siri and Google Assistant should soon become much more human than their current transactional forms, which most of us turn off because they’re just so useless.
  6. Summaries are concise and relevant.  GPT-4o provides summaries of conversations and searches that are very accurate in both tone and length, while GPT-4 often produces inaccurate language and tone which require a lot of supervision to get right.  Is this finally the end of legacy Google search and bad call transcripts?  Surely disruption of legacy text strings is now in full play?
  7. Visual interpretation and data tables are much more usable and accurate, ready to support business needs. It accurately converts image data into a clean table format without misinterpretations. It is precise in converting text and data, while previous versions made a lot of inaccuracies.  Research capabilities are more detailed, providing more accurate breakdowns of data and analysis along with real practical examples.  Do we really need to keep relying on clunky old data and analytics tools that require so much manual manipulation to get what we need?
  8. Image generation capabilities are just so much sharper. GPT-4o is more visually appealing and produces conceptually accurate images. It is much more usable for enterprise projects needing high-quality visuals than what we have experienced using the current versions of Dall-E (for example).  GPT-4 gave us a taste, but surely we are now seeing the potential to create content ourselves without the need for expensive agencies and outdated, complicated software packages?
  9. The cost of accessing its APIs is 50% lower.  OpenAI has clearly realized its costs are holding back wary enterprises and is now charging 50% less for many of its core APIs, such as the Chat Completions API, Assistants API, and Batch API (a minimal example of calling GPT-4o through these APIs appears after this list).  Are we finally going to be freed from decades of legacy software, abhorrent license fees, and meaningless code bases?
  10. Coding is vastly improved.  So far, many developers are lauding the improvements in GPT-4o’s ability to tackle coding projects, such as generating multiple thousands of lines of code in under 10 minutes, work that previously took prompt engineering processes many hours.  It can also create multiple apps in Python that the previous version struggled with. According to one developer, “4o not only solved it and provided clear concise dissection of the solution. 4 can be easily tricked into going down a death spiral it does not know how to backtrack correctly. OpenAI did incredible improvements to 4o. I can see Model 5 gonna start to get rid of human programmers for good.”  We recently discussed how GenAI is already making radical improvements to human-heavy legacy code development, and these new advancements are reinforcing the end of legacy coding as we know it.
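
For readers who want to kick the tires, here is a minimal sketch of calling GPT-4o through OpenAI’s Chat Completions API using the official Python SDK. The prompt and image URL are placeholders of our own, and you will need your own API key; treat this as an illustration of the multimodal request shape, not a recommendation for any particular integration pattern.

```python
# Minimal GPT-4o call via the OpenAI Python SDK (pip install openai).
# Requires an OPENAI_API_KEY environment variable; the prompt and image URL are placeholders.

from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {
            "role": "user",
            "content": [
                # Text and an image in the same request: the "omni" part in practice.
                {"type": "text", "text": "Summarize this chart in two sentences."},
                {"type": "image_url", "image_url": {"url": "https://example.com/chart.png"}},
            ],
        }
    ],
)

print(response.choices[0].message.content)
```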

The Bottom-line:  Just as we were giving up the ghost on GenAI, it becomes more human than ever

I am one of the biggest cynics when it comes to tech innovation and business change, for one simple reason: there needs to be a bloody great burning platform to force businesses to adopt. Many of the reasons GenAI has been left to languish in this death spiral of a thousand pilots boil down to the inability to adapt it to real business scenarios, and GPT-4o removes many of those excuses. Ambitious C-suites will clamor louder than ever to see this AI tech immersed into their organizations and will seek leaders to defrost their frozen middle ranks to make this happen for them. Your job may not be replaced directly by AI, but you will more likely be replaced by someone who knows how to use AI if you don’t wake up and get with the GenAI program.

To conclude, I will go back to the main excitement behind GenAI… it is disruptive because it helps us create new data and new content. But it needs to become an extension of our humanness to do that, not merely another technology tool that can add some value in bits and pieces.  Having multimodal capability that brings speech, text, video, and content together in one neural network that we can communicate with in real-time and immerse into our day-to-day activities is the game changer we have been unwittingly waiting for.  Now it is here, and we can only imagine how quickly this will keep evolving as OpenAI, Google, Anthropic, Apple, Microsoft, NVidia, and co keep pumping all their investments into this emerging tech.

* The survey was conducted in collaboration with Genpact.  We will be releasing the full study on 5/22

Posted in : Analytics and Big Data, Artificial Intelligence, Automation, Buyers' Sourcing Best Practices, ChatGPT, Cloud Computing, Customer Experience, Digital OneOffice, Employee Experience, GenAI, Generative Enterprise, Global Business Services, GPT-4, GPT-4o


How to have awkward conversations at work… with Gallo’s humor


The biggest issue I see within enterprises today is how smart executives can find common ground with their bosses to make great decisions.

I really enjoyed this simple but magnificent advice from HBR’s Amy Gallo, in which she describes some straightforward steps on how best to achieve the right results at work without creating all sorts of bad energy:

There are some pearls of Gallo wisdom I took down:

Do a risk assessment – weigh the consequences of disagreeing

Chances are you won’t get fired or make an enemy for speaking your mind

What do you stand to lose… what could happen later if you don’t raise this issue now?

When and where you have the conversation matters

Time can really help; perhaps you can find colleagues on the same page as you, and their support and ideas may bolster your case

A private meeting may be a lot less threatening

Make it a chess game, not a boxing match

What to say and how to say it

Maintain a strategic focus… keep everyone’s integrity intact

Ask permission to disagree… this allows your superior to opt in without feeling threatened

Explain, “I’d like to lay out my reasoning. Would that be OK?”

Connect your idea to a shared goal… something you both care about, such as company morale, quarterly earnings, etc.

How to present your argument

Project confidence and neutrality… anxious body language can harm your message

Breathe deeply

Stay humble and curious enough to hear critiques

Share only facts, not judgments

Always add, “I know you make the final call here”

The Bottom-line:  Amy will be speaking at the HFS Summit next week!

And, of course, how could we resist enticing Amy to come and keynote at the HFS Summit next week with the topic… "Is it Me? Or Is it Them? How to Collaborate with Difficult People".  We'll also be sharing a few copies of her book, "Getting Along: How to Work with Anyone (Even Difficult People)".

Posted in : Employee Experience


From People to Tech arbitrage: Can we really survive this Great Services Transition?

The IT and business services world has entered a crucial phase where the winners and losers will become clear in the next few months.  Many are already getting left behind in the legacy services world of shopping for low-cost labor, while the smarter ones are vying to become strategic partners to their enterprise clients, helping them write off decades of people, process, data, and technology debt to forge the path to the brave new AI world.


We are firmly on this S-Curve evolution from people to technology arbitrage that the Generative Enterprise demands. Welcome to the Great Services Transition, where the entire financial construct of services relationships is being reinvented to capitalize on the complex ecosystem of AI platform players, hyperscalers, data integration products, automation tools, LLM builders, and so on.

For example, services powerhouses like TCS and Wipro are digging deep into their tried and trusted past glories to (at least) restore some of the old energy and verve into their teams, but they won't be able to rest on their laurels by simply placing popular leaders at the helm.  They have to embrace the complex shift from people-based arbitrage to technology-based arbitrage if they are truly going to make it through to the other side – which is what we are calling the Great Services Transition.

Can today’s services firms really make the painful changes to reinvent their business models, or have their owners made so much money off the old model they simply aren’t motivated to grapple with painful change?

All these major providers, from Accenture, IBM, and Capgemini to the plethora of Indian heritage services firms and technology consultants such as Deloitte, EY, and KPMG, have to change their financial construct with their clients to one of shared risk, shared learning, and ultimately shared reward; otherwise, they face a race to the bottom.

This means changing the habits of a lifetime.  You only need to look at the likes of Kodak, Nokia, Yahoo, Xerox, and even JC Penney, which simply failed to innovate with the times and were too late to play catch-up once they had woken up to the new reality.  One could argue that many of the services firms in today's spotlight are already too embedded in their legacies to turn things around.  The continued cycle of providing people-based services will yield only modest growth as enterprises seek continued cost savings and invest in AI build-out initiatives. But as the model transitions to AI-led technology arbitrage, those left with hundreds of thousands of resources requiring decent utilization rates will see margins further degrade. The people-based arbitrage model is plateauing.

When your leadership is fat and happy, and the stock still holds up, why go through the aggravation of painful change when you can quietly ride off into the sunset with your cash pile?  When your board and stockholders only care about your quarterly numbers, and you don’t have the time or trust to drive a long-term plan, what can you really do beyond chasing ever-decreasing deals and focusing on cutting costs to the bone?  Sadly, it’s not always the fact that leaders fail to see the change coming; it’s more the casino that is Corporate America’s stock market that dictates which companies will survive or add themselves to the list of innovation failures.

However, as analysts who've covered this market for nearly 30 years, we steadfastly refuse to accept that many of today's IT services leaders are too greedy, too risk-averse, or just too ignorant to find a path for survival and renewed prosperity.  So let's break down this Great Services Transition into four simple problems to solve:

To survive the Great Services Transition, there are four problems to solve:

Solving problem 1).  Enterprises and service partners must be aligned on the change mandate

Which service partner has a culture you want to work with and that will blend well with yours? Ambitious enterprises and their service partners are both striving to be effective in the emerging world of AI-driven business models and operations. This means the transition only works when both parties are ready to tango and change together. To this end, service providers must become partners of change for their clients, helping them make sense of the sheer noise of technology change going on around them.  Clients, in turn, need internal alignment to ensure that it is time to make the move.

Solving problem 2).  Services must provide access to affordable talent with real expertise

The shift from labor to technology doesn't take away the need for people; it actually necessitates experts who can shepherd clients through the change. Service partners must provide continuous education on how to manage fast-moving technology ecosystems and work with clients to create business roadmaps based on emerging tech that make their organizations slicker, smarter, more efficient, and less bloated.

Solving problem 3).  Determine the people, process, data, and technology debt to address

In the Great Services Transition, enterprises are buying services solutions that improve performance, drive speed to market, reduce cost, and create new content and data.

You must address your debt in these four areas, which your firm has likely accumulated over the last 30+ years:

1. Fixing your skills debt: Develop new skill sets that can support the transition to embracing emerging technology and AI-driven business models.

2. Fixing your process debt:  Rework your processes to determine what should be added, eliminated, or simplified across your workflows to support your slicker AI-led operating model.

3. Fixing your data debt:  You must align your data needs to deliver on your AI-centric business strategy. This is where you clarify your vision and purpose. Do you know what your customers’ needs are? Is your supply chain effective in sensing and responding to these needs? Can your cash flow support immediate critical investments? Do you have a handle on your staff attrition?

4.  Fixing your technology debt: IT spending keeps swelling with each new platform and coding change. Stop buying tech for the sake of tech – this has been the failure of so many previous investments, such as the two-thirds of enterprises left struggling with their cloud migration journeys signed during the pandemic. The Great Services Transition is where you proceed through steps one to three before making bold decisions on your future technology investments.

Solving problem 4).   Restructure your services engagements to shift from labor arbitrage to technology arbitrage

Enterprise leadership has always been – and still is – obsessed with cost reduction.  This is what they understand more than anything, and they view innovations such as GenAI as another lever to justify investments based on yet more cost take-out. The best approach is to reduce overall delivery costs by 20-30%, apportioned over 3-5 years.  This is offset by the increased value and reduced labor costs driven through effective investments in change, processes, data, and technology.  Clients MUST sign up for process reinvention and data transformation as part of it.  Clients need to TRUST their partners to get them there.  Providers need the TALENT to work with their customers, or the whole thing simply erodes into a race to the bottom.

The Bottom Line:  Change the habits of a lifetime, or crawl away, as this S-Curve is the biggest people and technology challenge we’ve ever faced

As human beings, we grow comfortable with what is familiar to us and avoid doing things differently until we have literally no choice.  This is the case with the services industry, which has ballooned in growth and home comforts for three decades.  The stark reality today is that enterprises do not need to keep spending on low-cost people-based services – they have what they need, and there is so much supply they can turn to many providers to get it.  What enterprises desperately need are partners who share their desire to learn new methods and unlearn old habits, who can teach them to exploit new technologies and new data methodologies, and who will work with them to attack new markets with these capabilities.

This is how to survive the Great Services Transition. The big question now is whether enterprises and their services partners have the appetite to fix their skills, process, data, and technology debt. Can they really learn new ways of operating, change their cultures, and embrace emerging technologies? Everyone needs to dig deep and decide whether they want to be a footnote or the future.

Posted in : Analytics and Big Data, Artificial Intelligence, Automation, Autonomous Enterprise, Business Data Services, Business Process Outsourcing (BPO), Buyers' Sourcing Best Practices, Cloud Computing, Digital OneOffice, Digital Transformation, GenAI, Generative Enterprise, Global Business Services


To assure the journey toward the Generative Enterprise™, organizations must show humility for the unknown

Operations leaders face unprecedented challenges. They have to manage the new complexity of becoming cloud native and anticipate the implications of GenAI. If that’s not enough, they also have to find answers to the Digital Dichotomy, balancing the macroeconomic “slowdown” with the “big hurry” to innovate to keep up with innovation pacesetters. Yet, it is not a question of doing one or the other—they must address all those challenges simultaneously.

Against this background, quality assurance (QA), or simply “testing,” as it was called in the old days, can no longer be a reactive afterthought coupled with an unwillingness to invest in quality. Instead, it must become an integral part of the software development lifecycle (SDLC) and take on a much more holistic responsibility for assuring transformational outcomes.

This is the context for HFS’s seminal study on assuring the Generative Enterprise™ (HFS Horizons: Assuring the Generative Enterprise™, 2024). We sought to understand better where organizations are with their QA efforts and how they are trying to ensure transformational outcomes. We also explored how they assure change agents such as automation, AI, and blockchain. Finally, we looked at the adoption of GenAI through the lens of QA. In the following, we share the insights gleaned from our study.

Lofty aspirations for quality assurance

Finding your bearings on all things QA is difficult because there is a massive gap between the aspirations (and lip service) for QA on the one hand, and the maturity of QA functions and enterprises' willingness to invest in them on the other. While the mature end of the market is pivoting toward quality engineering (QE) with a focus on achieving continuous testing, data-driven decision-making, and cross-functional collaboration, two-thirds of the market is stuck at a lower maturity level, often still working with a Waterfall methodology, as Exhibit 1 depicts. Simply put, most organizations are on the left-hand side of this infographic.

Exhibit 1: Pivoting to quality engineering is about aligning quality assurance to customer journeys

Source: HFS Research, 2024

Yet, we might finally see organizations embracing the transformation of their quality assurance functions. According to our study’s reference clients, 95% described the primary value delivered by their service provider today as the ability to drive functional optimizations with selective quality assurance capabilities. Simply put, they expect services on the left-hand side of Exhibit 1. However, in two years, most of these organizations expect a transformation of their quality assurance functions, intending to drive experience-led outcomes and stakeholder experiences while creating new sources of value through ecosystem synergy.

“Shift right” is starting to augment “shift left”

Discussing these issues with QA leaders, technology partners, and service providers helped us crystallize more nuanced market dynamics. Most organizations have embraced "shift left" principles, emphasizing the early and proactive involvement of quality assurance activities in software development. We are seeing mature QA functions augment this with "shift right" principles, which advocate quality activities at the later stages of the SDLC as well.

Regarding organizations' QA priorities, four clusters jump out for us: First, carving out a budget for QE architecture modernization and transformation. This goes back to the discussion of the Digital Dichotomy and the need to self-fund innovation. Second, solving the conflict of production quality versus speed to market. Not least with innovation cycles dramatically compressed by the ascent of GenAI, this is a hard nut to crack. Third, overcoming a new complexity to provide integrated assurance for apps, infrastructure, and platforms or progressing toward the OneOffice™, as HFS would put it. And fourth, driving change management to foster transformation. Yet, large chunks of the QA community are stuck in a tools and technology mindset.

There is much noise but little assurance on GenAI

Let's zoom in on the topic that comes up in every discussion we have, regardless of the context. While many providers hype use cases around domain knowledge and code creation, grown-up discussions about how to assure GenAI are sparse. As such, Cognizant's Artificial Intelligent Lifecycle Assurance (AILA) and Infosys' AI Assurance Platform are more exceptions than rules. This is an indicator of where enterprise adoption of GenAI stands: we are still at the very beginning. The core value proposition of GenAI use cases in QA is achieving higher accuracy with less data. Thus, providers can drive new levels of automation to generate test scenarios and test cases.

While we had many discussions on the infusion of QA with GenAI, deliberations on governance of GenAI are still nascent. Where we had honest discussions, providers reflected that large language models (LLMs) are largely unfamiliar entities. One executive framed this aptly: in the context of GenAI, we must show humility to the unknown. Thus, the predominant way to engage is through experimentation. However, there is no beating around the bush: we were a tad underwhelmed by the discussions of assuring GenAI outcomes. Outside of the service providers, though, we had some stimulating conversations. For instance, MunichRe provides insurance for AI, while German start-up QuantiPi blends quality assurance with governance for GenAI.

North Star: (autonomous) persona-based testing

Beyond the broader pivot to QE, what have we learned about the evolution of quality requirements? Just as cloud-native operations are shifting toward persona-based solutions, QE is shifting toward persona-based testing, requiring capabilities to better support specific stakeholders such as product teams, site reliability engineering (SRE), DevOps engineers, and beyond. But to be clear, this is the North Star, and only a few organizations have it on their roadmap. Looking at these requirements in the context of GenAI, the key is blending prompt engineering with specific persona scenarios. Using prompts to generate automation scripts, such as Selenium tests, can significantly enhance the scale of automation. Lastly, using the output of one prompt as the input to another can leverage GenAI to industrialize offerings.
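
To make this concrete, here is a minimal illustrative sketch, not any provider's actual tooling, of prompting an LLM to draft a Selenium test for a persona scenario. The model name, target URL, and persona wording are assumptions for illustration, and it assumes the openai Python SDK with an API key in the environment.

```python
# Illustrative sketch only: prompting an LLM to draft a Selenium test for a persona scenario.
# The model, URL, and persona wording are assumptions, not any provider's product or method.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

persona_scenario = (
    "Persona: site reliability engineer. "
    "Scenario: verify that the login page at https://example.com/login loads, "
    "accepts valid credentials, and lands the user on a dashboard."
)

prompt = (
    "Write a self-contained Python Selenium test (pytest style) for the following scenario. "
    "Return only code.\n\n" + persona_scenario
)

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": prompt}],
)

generated_test = response.choices[0].message.content
print(generated_test)  # a QA engineer would review, parameterize, and commit this to the suite
```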

Horizon 3 market leaders blend a compelling vision of transformational QA with nuanced approaches to assure change agents such as GenAI

Last but by no means least, congratulations to the Horizon 3 market leaders. These leaders' shared characteristics include blending a compelling vision of transformational QA with nuanced approaches to assure change agents such as GenAI. The wheat separates from the chaff when providers enable transformational outcomes rather than simply delivering functional testing with an overreliance on tools and technology. The leaders are pushing the envelope on transformation with new themes such as cross-functional testing, zero-touch testing, black-box testing, and beyond. Exhibit 2 outlines the detailed rankings of our research.

Exhibit 2: The vanguard of the quality assurance services ecosystem

Accenture and Wipro stand out, clearly outlining the evolution to QE. They shift the focus of their narratives by depicting transformational journeys and outcomes rather than getting stuck in tools and technology. Infosys supports clients' pivots to product-centric delivery and intelligent ecosystems, while TCS infuses emerging technologies into QE and embeds them at the core of transformation. Capgemini surprised us with a nuanced and thoughtful narrative on adopting GenAI. Cognizant demonstrated test automation chops and strongly emphasized customer experience assurance. Perhaps surprisingly to some, Persistent is going deep on cloud-native transformation, investing ahead of the market in digital engineering and GenAI capabilities.

The Bottom Line: The QA community needs to emancipate itself

The innovation delivered by the QA community continues to be stupendous. Yet, the community does a modest job articulating the goals and outcomes those change agents achieve. Without snapping out of this tool- and solution-centric view of QA, it is difficult to articulate a more value-driven approach, where QA executives get the limelight they crave to discuss and decide the significant sourcing issues. Stepping up to QE transformation and putting QE at the heart of transformation and software development is an opportunity for the community to emancipate itself from being boxed in a technology and tools mindset. Discussions with the market leaders in Horizon 3 gave us some hope that we are getting closer to this.

HFS subscribers can download the report and find many more details here.

Posted in : Artificial Intelligence, GenAI, Generative Enterprise, HFS Horizons, The Generative Enterprise
