Validations vs Solutions - Same But Different
Back in 2020, I wrote the article Data Driven Design, where I focused on a model of thinking about business outcomes and measuring success through data that is designed in from the start (even as part of ideation) - "How do we know when we are done?" - thanks to a comment from Peter Laurie that has resonated with me since 2004-2006.
Fast-forward to now: with the hype around AI, the acceleration of AI technology and the search for ways to optimise the problems we solve, this model is still very much applicable to the way we execute.
Across many of the discussions I'm having at the moment, an age-old problem keeps appearing. Like light, which exhibits both wave and particle characteristics, a project exhibits characteristics of both the problem being solved and the engineering process being used; yet a project is typically marked with a single rating.
We would always like to have a successful project - why wouldn't we? But consider the unknowns, whether they relate to the problem or to the engineering process: they can easily disrupt the success of a project. Coming back to the Data Driven Design article, designing the metrics to measure success (and maturity) is important to ensure we are growing over time rather than hitting a specific success/failure score.
There are two different modes that I subscribe to, depending on how well formed "things" are, and they are based upon the Double-Diamond in Design Thinking.
In the traditional mode, there is a willingness to satisfy whatever is perceived to be the solution: "Here is a requirement; let's just build that." Considering this approach against the first diamond, "Discover-Define", how confident are we that what we are building is the right outcome? And how confident are we in our strategy that this is the right technology or platform to use?
At times like this, it's better to talk in terms of validations rather than solutions.
Consider the data, metrics and facts that contextualise this within your organisation to help define the direction and the solution. My North Star for much of this centres on the Business Model Canvas.
Is it valuable? Is it feasible? Is it viable? And through this change, what is the impact that we are looking for?
Even with a desired (and assumed) target state, there are typically M ways to get there, and the end solution to that problem could itself be accomplished in N different ways. The proposition then becomes:
How do we explore many different ways as efficiently as possible, whilst still collecting data that is contextual?
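As a rough illustration of what "collecting contextual data" can look like, here is a minimal sketch in Python. The Validation record, the two options and every evidence value are invented for the example (they are not the framework or data from the project below); the point is simply that each option explored carries captured data and an explicit valuable/feasible/viable check, so the comparison rests on evidence rather than opinion.

```python
from dataclasses import dataclass, field

@dataclass
class Validation:
    """One option explored, with the contextual data captured while exploring it."""
    option: str                                   # e.g. "Adapt the existing framework"
    hypothesis: str                               # what we expect this option to demonstrate
    evidence: dict = field(default_factory=dict)  # captured data points, not opinions
    valuable: bool = False                        # does the evidence support business value?
    feasible: bool = False                        # can we actually build and operate it?
    viable: bool = False                          # does it make sense for the organisation over time?

def worth_pursuing(v: Validation) -> bool:
    """An option only graduates from validation to solution when all three hold."""
    return v.valuable and v.feasible and v.viable

# Two of the hypothetical M x N ways, compared on captured data (values are illustrative).
options = [
    Validation(
        option="Adapt existing framework",
        hypothesis="Team can ship a vertical slice in two weeks",
        evidence={"slice_delivered_days": 9, "defects_found": 3},
        valuable=True, feasible=True, viable=True,
    ),
    Validation(
        option="Replace framework entirely",
        hypothesis="Rewrite reduces long-term maintenance cost",
        evidence={"slice_delivered_days": 25, "defects_found": 11},
        valuable=True, feasible=False, viable=False,
    ),
]

for v in options:
    print(v.option, "->", "pursue" if worth_pursuing(v) else "keep validating", v.evidence)
```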
On one of those projects from 2004-2006, I was leading a team that was developing on top of an existing framework the customer had adopted. What it lacked was the engineering process we needed. Over a very short period of time, I built several vastly different implementations that demonstrated how the framework could be adapted and used alongside the key engineering requirements - source code control, build automation, a developer environment, developer tests. We were then able to identify the implementation that best suited the team we had. That divergent behaviour of exploring different options helped create a better view of how the team would execute, and the decision was based upon execution and demonstrated capability rather than 100% subjective opinion.
This ultimately set up the team to be very efficient, very enjoyable to work in, and very successful. It was one of the most pivotal projects that I have contributed to and, to this date, it is one that has been a basis for much of the work that I do now.
The questions that I continue to ask myself in many of the tasks that I execute (with a small sketch after the list of how the answers could be captured as data):
1. What actionable metrics define success?
This is not just a number; it's a metric that can be captured and understood.
2. What areas of growth do I want to build?
Both in understanding the problem and delivering on a solution - capturing data that align to the metrics we want.
3. How much time and effort do I look to invest in exploring?
Both in understanding the problem and delivering on a solution - capturing data that align to the metrics we want.
4. How do I know when I'm done?
It may not be perfect; it may simply mean that there is a logical point at which I am willing to stop.
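To make those questions concrete, here is a minimal sketch, again in Python, of how an actionable metric and its "done" condition could be captured as data rather than held as opinion. The metric names, capture functions and thresholds are entirely illustrative assumptions; the real ones come from your own problem and context.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class ActionableMetric:
    """A metric that can be captured and understood, with an explicit 'done' test."""
    name: str
    description: str                     # the area of growth or outcome it reflects
    capture: Callable[[], float]         # how the data point is collected
    done_when: Callable[[float], bool]   # the logical point at which I'm willing to stop

# Illustrative metrics only (values are stand-ins for real measurements).
metrics = [
    ActionableMetric(
        name="build_time_minutes",
        description="Engineering process growth: time from commit to deployable build",
        capture=lambda: 12.0,
        done_when=lambda value: value <= 15,
    ),
    ActionableMetric(
        name="validated_options",
        description="Problem understanding growth: options explored with captured evidence",
        capture=lambda: 3.0,
        done_when=lambda value: value >= 3,
    ),
]

for m in metrics:
    value = m.capture()
    print(f"{m.name} = {value} -> {'done' if m.done_when(value) else 'keep investing'}")
```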
NB: I'll finish the Double-Diamond in my next article.