We recently launched our new Agile Maturity Index (AMI) and began our five-part blog series focussing on the first two foundational pillars, High Performing Teams and Process & Governance. In this third blog in the series, we’ll focus on the Technology pillar.
When we consider a client’s technology challenges, we look at both the organisation’s and the individual team’s Continuous Delivery capability, as well as an agile, emergent approach to enterprise architecture that engages the teams. Equally important is understanding how data insights and analytics are used to drive the backlogs. This ensures we are not just efficient but, more importantly, effective in our output.
Using the AMI, we begin by analysing the high-level quantitative data from the initial survey. Whilst we can see that the teams are scoring very positively for Continuous Delivery, Architecture is a mixed bag and there are some serious concerns raised around Data Insights.
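The kind of high-level roll-up described here can be sketched in a few lines. This is a minimal illustration only, assuming survey answers scored 1–5 and tagged with a macro-attribute; the data and field names are hypothetical, not the AMI’s actual schema:

```python
from statistics import mean
from collections import defaultdict

# Hypothetical survey responses: each answer is scored 1-5 and tagged
# with the macro-attribute it belongs to.
responses = [
    {"macro": "Continuous Delivery", "score": 5},
    {"macro": "Continuous Delivery", "score": 4},
    {"macro": "Architecture", "score": 4},
    {"macro": "Architecture", "score": 2},
    {"macro": "Data Insights", "score": 2},
    {"macro": "Data Insights", "score": 1},
]

# Roll the answers up into an average score per macro-attribute.
by_macro = defaultdict(list)
for r in responses:
    by_macro[r["macro"]].append(r["score"])

summary = {macro: mean(scores) for macro, scores in by_macro.items()}
for macro, avg in sorted(summary.items(), key=lambda kv: -kv[1]):
    print(f"{macro}: {avg:.1f}")
```

Even this simple roll-up reproduces the pattern above: a strong Continuous Delivery score, a mixed Architecture score and a weak Data Insights score.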
Whilst it might not seem that Continuous Delivery is an area that needs much focus, the AMI provides another interesting view of the data:
Here we can see all 15 micro-attributes with their quantitative scores on the vertical axis and the AI-derived sentiment on the horizontal (the size of the circles relates to the number of comments used to drive the sentiment). What’s interesting to see here (highlighted) is that whilst Continuous Delivery is the highest-scoring micro-attribute with a positive sentiment, Speed to Market is one of the lowest scoring. That’s a topic we’ll analyse more closely in another blog, but the AMI clearly highlights that there is a disconnect between the teams’ technical ability to deploy to a live environment and an organisational culture that enables ideas to move through to a reality quickly.
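Surfacing that kind of disconnect can be sketched as follows. This is a toy illustration under assumed data, with made-up micro-attribute records (score 1–5, sentiment from -1 to 1, comment count), not the AMI’s real figures:

```python
# Hypothetical micro-attribute data: average score (1-5), AI-derived
# sentiment (-1 to 1) and the number of comments behind that sentiment.
micro_attributes = [
    {"name": "Continuous Delivery", "score": 4.6, "sentiment": 0.7, "comments": 18},
    {"name": "Speed to Market", "score": 2.1, "sentiment": -0.4, "comments": 25},
    {"name": "Test Automation", "score": 3.8, "sentiment": 0.2, "comments": 9},
]

highest = max(micro_attributes, key=lambda m: m["score"])
lowest = min(micro_attributes, key=lambda m: m["score"])

# A large gap between the top and bottom micro-attributes is the kind
# of disconnect worth investigating further.
gap = highest["score"] - lowest["score"]
print(f"Highest: {highest['name']} ({highest['score']})")
print(f"Lowest: {lowest['name']} ({lowest['score']})")
print(f"Score gap: {gap:.1f}")
```

In the real tool this view is a bubble chart rather than a printout, but the underlying comparison is the same: score against sentiment, weighted by how many people commented.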
As with the other macro-attributes, the real analysis is done as we start to drill down further into the details of the data:

Architecture
There’s a huge amount of insight we can gather even at this level. Focussing on architecture, we can see from the spread of answers that teams’ ability to contribute towards future state architecture divides opinion: people seem to be engaged with it overall, but they appear to lack strategic alignment.
Looking at the comments in this area we can see some positives such as “Implicitly every day, team leads have more influence at scheduled tech meetings”. But there is some divergence across the teams with others scoring lower and stating that “Architects sit in a separate team and rarely engage with engineers”.
That would certainly be an area to focus on in order to improve team engagement and build a better emergent architecture. The current danger is that teams will make short-term tactical decisions around their implementation approaches.
Based on the answers to the data insights questions we can see that “[The UX team] are under a lot of pressure for creation of work from the business, and therefore driving the backlog from their insights isn’t happening”. In addition, teams have low visibility of the data insights and analytics used both to drive the backlog and to measure the success of their work. Comments such as “We get given some analytics outputs occasionally from the PO. We cannot access it ourselves” are a concern, and there are many more in a similar vein.
Using the AMI our analysts would look more closely here to understand how we can increase the use of data and the visibility of relevant analytics to the team, so they are not working blind. Whether or not this is a quick win will depend on the organisation actually having the data and analytics needed.
Filtering the data by physical location shows that the teams in Newbury are scoring much more highly in this area than the teams in London. So, a good starting point would be to see whether there are processes that can be adopted from the teams who feel a better job is being done.
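The location comparison is just a group-and-average over the same survey data. A minimal sketch, assuming per-team scores tagged with a location (the teams and numbers here are invented for illustration):

```python
from statistics import mean
from collections import defaultdict

# Hypothetical per-team scores for the Data Insights macro-attribute,
# tagged with each team's physical location.
team_scores = [
    {"team": "A", "location": "Newbury", "score": 4.2},
    {"team": "B", "location": "Newbury", "score": 4.5},
    {"team": "C", "location": "London", "score": 2.1},
    {"team": "D", "location": "London", "score": 2.8},
]

# Group the scores by location, then compare the averages.
by_location = defaultdict(list)
for t in team_scores:
    by_location[t["location"]].append(t["score"])

for location, scores in sorted(by_location.items()):
    print(f"{location}: mean score {mean(scores):.2f} across {len(scores)} teams")
```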
This gives a brief overview of what can be achieved by analysing a single macro-attribute from an initial survey. Using pulse surveys to track improvements over time, as well as using data integrations and the analysis of unstructured conversational data in Slack, can provide even deeper insights to help this scaled agile team continue to improve: driving speed to market and efficacy of output, and building positive, happy and engaged teams.
The fourth blog in the five-part series focuses on the Organisational Agility pillar which enables us to analyse how well the organisation supports the team through effectively scaled processes, agile leadership and culture, and lean product management.
If you’d like a demo or more information on how you can use the Agile Maturity Index to measurably improve your agile transformation, please drop me an email at firstname.lastname@example.org.