The majority of CIOs are struggling to balance innovation and operational excellence. Ageing infrastructure affects service quality, poor performance creates negative customer sentiment, and both ultimately stall the digital transformation agenda. To get the balance right, we found three areas that were key to successful digital transformation:
- Moving from a project to an Agile Product Management approach
- Embracing Artificial Intelligence and Machine Learning
- Adopting Agile DevOps practices
Agile Product Management
In the future, we will no longer have a divided conversation between Agile, DevOps, and project teams; instead, it will be a product-centric value stream conversation with a singular focus on customer value.
Projects focus on managing and delivering requirements, whereas a product focus is about the customer satisfaction gap and understanding which problems to solve. Product management looks at the whole product lifecycle.
A product management approach concentrates on aligning business, portfolio, product, and technology strategy with the solutions developed to close the customer satisfaction gap.
Scrum provides key strategic and tactical feedback loops on the success of solutions. The product backlog aligns the product vision and strategy with the technology strategy, so that the right technologies are applied to solutions.
This makes it an exciting time to be a Product Owner: the product management approach means looking strategically at product discovery, product development, and product support over the whole life of the product, and shifts the focus to introducing, growing, maturing, and eventually retiring a product.
The product backlog included everything from ideas and new features to enabling functionality and incidents/defects. Our focus was customers, so we applied the principles of strategic product management to prioritize the backlog and understand what would solve customer problems and address their pain points.
Principles of Strategic Product Management
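To make that prioritization concrete, here is a minimal sketch of one weighted-scoring approach (WSJF-style, a stand-in for whichever strategic principles a team adopts, not necessarily the scheme we used); the item names and scores are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class BacklogItem:
    name: str
    business_value: int    # relative customer/business value (1-10)
    time_criticality: int  # urgency of delivery (1-10)
    risk_reduction: int    # risk reduced or opportunity enabled (1-10)
    job_size: int          # relative effort estimate (1-10)

def wsjf(item: BacklogItem) -> float:
    """Weighted Shortest Job First: cost of delay divided by job size."""
    cost_of_delay = item.business_value + item.time_criticality + item.risk_reduction
    return cost_of_delay / item.job_size

# Hypothetical backlog mixing the item types named above.
backlog = [
    BacklogItem("New self-service feature", 8, 5, 3, 5),
    BacklogItem("Fix checkout defect", 6, 9, 7, 2),
    BacklogItem("Enabling API work", 4, 3, 8, 6),
]

# Items that close the customer satisfaction gap soonest for the
# least effort rise to the top of the backlog.
for item in sorted(backlog, key=wsjf, reverse=True):
    print(f"{item.name}: {wsjf(item):.2f}")
```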
Artificial Intelligence (AI) and Machine Learning (ML)
Artificial intelligence (AI) and machine learning (ML) are two fields advancing at a fast rate. AI has made machines intelligent, self-reliant, and far more imaginative than people ever thought they could be. AI/ML will drive a big change across businesses, one that moves beyond smart IT and process automation.
We have found them useful for analyzing test data, identifying coding anomalies that could lead to bugs, and automating security and performance monitoring to detect and proactively mitigate potential issues.
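As an illustration of the monitoring side, here is a minimal sketch of anomaly detection over performance metrics, assuming scikit-learn is available; the metrics and values are invented for the example:

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Illustrative performance-monitoring samples: each row is
# [response_time_ms, error_rate_pct, cpu_utilisation_pct].
rng = np.random.default_rng(42)
normal_traffic = rng.normal(loc=[120, 0.5, 55], scale=[15, 0.2, 8], size=(500, 3))

# Fit the detector on what "normal" looks like.
model = IsolationForest(contamination=0.01, random_state=42)
model.fit(normal_traffic)

# Score new observations; -1 flags an anomaly worth investigating
# before it becomes a customer-facing incident.
new_samples = np.array([
    [125, 0.4, 57],   # looks like normal traffic
    [480, 6.2, 93],   # degraded performance, likely an issue
])
print(model.predict(new_samples))  # e.g. [ 1 -1]
```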
AI refers to a computer or other device’s capacity to imitate intelligent human behavior. This can involve activities like understanding, analyzing, solving problems, and making decisions. ML is a particular subset of AI that focuses on creating algorithms that can learn from data: large datasets of labeled data are used to train ML algorithms, enabling them to learn to recognize patterns and make predictions. This was the more exciting area for us, as we ingest and analyze a lot of data.
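To make the “learning from labeled data” point concrete, here is a small, self-contained sketch using scikit-learn; the synthetic dataset stands in for real labeled data:

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Stand-in for a labeled dataset: 1,000 examples, each with 10
# features and a known label the algorithm can learn from.
X, y = make_classification(n_samples=1000, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

# Training = learning patterns from labeled examples;
# scoring = checking predictions against held-out labels.
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"Held-out accuracy: {clf.score(X_test, y_test):.2f}")
```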
Tame the proliferation of tech and tools
In an organization I was working with, many teams had set up their own environments, resulting in support for multiple infrastructures, LLMs, and tools. This led to higher complexity, cost of operations, confusion, and risk. We didn’t want to waste time analyzing every provider, host, tool, and model; instead, we concentrated on building our infrastructure and applications in a way that gave us the flexibility to switch easily.
We had existing providers whose platforms we had already invested in heavily, and they were rolling out new AI services that improved the economics of some use cases and opened access to new ones. We therefore adopted the standards our providers widely used for AI and infrastructure as code, and settled on a supported subset of open-source LLMs.
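A minimal sketch of what that switching flexibility can look like in application code; all client names, methods, and response shapes here are hypothetical, not any specific provider’s API:

```python
from typing import Protocol

class ChatModel(Protocol):
    """Minimal interface our applications code against, so the
    provider behind it can be swapped without touching callers."""
    def complete(self, prompt: str) -> str: ...

class HostedProviderModel:
    """Adapter for a commercial provider (hypothetical client object)."""
    def __init__(self, client, model_name: str):
        self.client, self.model_name = client, model_name

    def complete(self, prompt: str) -> str:
        # Assumed client method; each real provider's SDK differs.
        return self.client.generate(model=self.model_name, prompt=prompt)

class OpenSourceModel:
    """Adapter for a self-hosted open-source LLM endpoint."""
    def __init__(self, endpoint_url: str):
        self.endpoint_url = endpoint_url

    def complete(self, prompt: str) -> str:
        import requests
        # Assumed JSON contract for the self-hosted endpoint.
        resp = requests.post(self.endpoint_url, json={"prompt": prompt}, timeout=30)
        return resp.json()["text"]

def summarise_ticket(model: ChatModel, ticket_text: str) -> str:
    # Application code depends only on the interface, not the provider.
    return model.complete(f"Summarise this service desk ticket:\n{ticket_text}")
```

Because callers only see `ChatModel`, moving a use case from a hosted service to an open-source model (or back) becomes a one-line adapter swap rather than a rewrite.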
AI/ML requires meaningful data
There is a misconception that Gen AI can simply sweep up the necessary data and make sense of it. It cannot do this without clean and accurate data, which requires real work and focus. It is important to invest the time to specify and rate the importance of content sources (“authority weighting”). This authority weighting helped the model understand the relative value of different sources, which over time would help provide analysis that could support decision making.
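A toy sketch of how authority weighting might be applied when ranking retrieved content; the source names, weights, and similarity scores are invented for the example:

```python
# Hypothetical authority weights agreed with content owners: higher
# means the source is more trusted when answers conflict.
AUTHORITY_WEIGHTS = {
    "engineering-standards": 1.0,
    "product-docs": 0.8,
    "team-wiki": 0.5,
    "chat-archive": 0.2,
}

def weighted_score(similarity: float, source: str) -> float:
    """Blend retrieval similarity with the source's authority weight."""
    return similarity * AUTHORITY_WEIGHTS.get(source, 0.1)

# Candidate passages as (similarity, source) pairs from a retriever.
candidates = [
    (0.91, "chat-archive"),
    (0.84, "engineering-standards"),
    (0.80, "team-wiki"),
]

# The slightly less similar but far more authoritative passage wins.
ranked = sorted(candidates, key=lambda c: weighted_score(*c), reverse=True)
print(ranked[0])  # (0.84, 'engineering-standards')
```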
We also found it important to maintain the platforms, because adding new data, which happens often, affected performance. AI requires meaningful data to make noticeable improvements and drive innovation, and that is difficult when data lives in so many different places, each with different versions and conflicts.
Reuse it or lose it
It’s easy to get caught up building abstract Gen AI capabilities that don’t get used. Our plan was to build Gen AI foundations strategically to enable reuse across solutions. So we reviewed our use cases to ascertain their common needs and functions, and shifted our energies to building solutions that could serve many of them. We built these common elements as assets that could be reused to create a new capability. The practice of reusable code was shown to increase the development speed of Gen AI use cases by up to 30%.
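A sketch of what those reusable assets might look like in code, assuming a `model.complete(prompt)` interface like the earlier provider sketch; all function names and steps are illustrative, not our actual asset library:

```python
# Hypothetical shared module: common Gen AI building blocks packaged
# once and composed by many use cases instead of rebuilt each time.

def redact_pii(text: str) -> str:
    """Shared pre-processing step (illustrative no-op placeholder)."""
    return text  # a real version would mask names, emails, account IDs

def build_prompt(task: str, context: str, question: str) -> str:
    """One prompt template reused across use cases."""
    return f"Task: {task}\nContext:\n{context}\n\nQuestion: {question}"

def release_notes_assistant(model, diff_summary: str) -> str:
    # Use case 1 composes the shared assets...
    prompt = build_prompt("Draft release notes",
                          redact_pii(diff_summary), "What changed?")
    return model.complete(prompt)

def service_desk_assistant(model, ticket: str) -> str:
    # ...and so does use case 2, with no duplicated plumbing.
    prompt = build_prompt("Answer a service desk query",
                          redact_pii(ticket), "How do we resolve this?")
    return model.complete(prompt)
```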
Invest in DevOps
Artificial intelligence (AI) and machine learning (ML) applications for DevOps are still maturing. Our strategy was a software factory to improve product development flow for multiple products with minimal human intervention once developed. This meant using technologies and tools to automate the CI/CD pipeline at the heart of our DevOps framework. This approach helped us translate complex, manual, error-prone processes into an approach that can be tested, measured, and scaled.
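A minimal pipeline-as-code sketch of the idea; the specific stages and tools named here (pytest, ruff, bandit, Docker) are illustrative choices, not necessarily the ones we used:

```python
import subprocess
import sys

# Illustrative pipeline-as-code: each stage is a shell command the
# factory runs in order, failing fast so broken builds never ship.
STAGES = [
    ("unit tests", ["pytest", "-q"]),
    ("lint", ["ruff", "check", "."]),
    ("security scan", ["bandit", "-r", "src"]),
    ("build image", ["docker", "build", "-t", "app:latest", "."]),
]

def run_pipeline() -> None:
    for name, cmd in STAGES:
        print(f"--> {name}")
        result = subprocess.run(cmd)
        if result.returncode != 0:
            sys.exit(f"Stage '{name}' failed; stopping the pipeline.")
    print("All stages passed; ready for continuous delivery.")

if __name__ == "__main__":
    run_pipeline()
```

In practice the same ordered-stages idea usually lives in a CI system’s own configuration, but expressing it as code makes it testable, measurable, and reusable across products.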
The software factory
The software factory aimed to aid and accelerate our digital transformation. The approach utilizes DevOps and the Scrum framework, combining tools, teams, and practices to organize, standardize, store, and reuse code, enabling our teams to build efficiently on accumulated knowledge. The benefit is a more organized and structured approach, faster software development, and higher-quality code.
The foundation of a software factory is the tools, services, repositories, and practices used to develop software, including stored recipes, templates, and reusable code that can be leveraged to quickly create new applications. This foundation has enabled teams to plan, build, and deploy software more predictably.
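A toy sketch of stamping out a new service from a stored template; the template contents, names, and layout are invented for the example:

```python
from pathlib import Path
from string import Template

# Illustrative "recipe": a stored template stamped out to start a new
# service, so teams begin from accumulated knowledge, not a blank page.
SERVICE_TEMPLATE = Template("""\
# ${service_name}
Owner: ${team}
Built from factory template v1 (pipeline, logging and health checks pre-wired).
""")

def new_service(service_name: str, team: str, root: Path = Path(".")) -> Path:
    """Create a service skeleton from the stored template."""
    target = root / service_name
    target.mkdir(parents=True, exist_ok=True)
    (target / "README.md").write_text(
        SERVICE_TEMPLATE.substitute(service_name=service_name, team=team)
    )
    return target

print(new_service("billing-api", "payments-team"))
```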
The software factory will continue to evolve and have its own development lifecycle as new tools, capabilities, and products become available, allowing us to take advantage of emerging technologies.
3 areas of focus for DevOps
- Automation: Increased efficiency is central to DevOps, and we achieved it by automating a range of relatively slow, onerous processes in software development and infrastructure maintenance. This took the onus off systems administrators and testers, who would otherwise perform these tasks manually, and de-risked deployment by enabling us to easily catch and fix bugs as they arose.
- Continuous Integration: We adopted the practice of sharing and merging code in a central location. This was more collaborative, ensured developers weren’t working in isolation for extended periods, and encouraged frequent merges (daily at minimum) rather than merging only when the work was completed.
- Continuous Delivery: With the code in a central location, we were also able to automate delivering and implementing software product changes as they’re made. As a result, we shipped smaller, more frequent releases.
What DevOps and AI/ML helped us achieve
- Saved time developing and maintaining systems
- Code generation
- Ingestion of data
- Analysis and pattern identification
- Release notes
- Knowledge articles for service desk queries
- Training guides and updates
- Detection of potential issues and vulnerabilities
- Process automation
- Product backlog management