The Analytics component of ADVANCE helps us focus on understanding how to use measurements to their best effect. Lots of organisations produce data from their management information systems (MIS), which spew out reports that are presented at meetings and other fora.
Organisations are complex entities, and breaking down the information they produce helps us understand the more detailed activities of their operations.
For example, a report that shows a trend of reducing student fee income does not explain the reasons for this trend occurring. It does provide the opportunity to pose the question:
“Why is this happening?”
Of course, a coaching manager might ask:
“What are the reasons for this trend?”
These questions will then initiate further scrutiny that may uncover a whole host of reasons such as (but not limited to):
- Reduction in enrolments;
- Increased number of student withdrawals;
- Different mix of fees for a given number of students, etc.
This process is what we generally understand as analysis, and it is a daily occurrence as we unpick the broader measures to run the organisation. So, how does analytics differ from analysis?
Analytics refers to the overall approach of discovering and interpreting insight from data that is then used to guide decision-making. This guidance may include scenario modelling, which incorporates predictive modelling to aid understanding of current and future situations.
A large part of analytics is the process of communicating insight through visualisation techniques, but it is also the use of these techniques that helps elicit meaningful patterns, leading to potentially new knowledge.
From a practical perspective, the current state of analytics is a marriage between statistics and computer science, which not only provides the necessary analysis and predictions, but also automates the repetitive scenario modelling on generally large and distributed datasets. Much of this activity originated under the banner of Operations Research (OR).
We are not going to dive into the technicalities here; there is a vast base of literature that deals with the mechanics of analytics.
We use the Analytics component to focus on the identification, maintenance and consumption of data to inform better decision-making, so that the outcomes can be realised. How will the data help you?
Here are a few scenarios that may resonate with you:
- There is insufficient data upon which to base a decision. “We don’t know how many unique visitors there are to our website because it’s managed by a third party”.
- The time taken to obtain the data is too long. “We can’t hang around to wait for reports, that then need to be analysed and cross-referenced.”
- The data is inaccessible. “We must have the data, but it isn’t available through the XYZ system.”
Having identified your vision, together with the key measures, you’ll be in a much better position to know what data you will need. Sometimes our enthusiasm for improvement clouds our thinking and we rush off and develop comprehensive reporting systems that collect and harvest both the data we want, as well as data that will not help us reach our objectives.
Analytics is about determining the actual data, analysis and forecasting that is required, and then ensuring that existing mechanisms are fully exploited.
Only when an existing system cannot deliver what is required do we consider changes; in most cases the existing systems can deliver sufficient data to support enough modelling to make significant improvements.
In the case of using ADVANCE for individuals, this phase is particularly useful. We can find it difficult to measure our own development and it is not always intuitive (or even attractive!) to seek data from others.
However, once individuals realise that external data sources are useful, they start actively soliciting feedback on their performance as a result of ADVANCE, as they can see a faster way forward.
Back in the Introduction, we posited that “it’s all about the people”. A lot of performance management systems concentrate upon process, which is understandable to some degree as processes are normally easy to quantify and monitor. If students receive feedback in two weeks rather than four, then a reduction in the lead-time can be reported.
Of course, it does not follow that a student will view this as an improvement. The feedback may be of lower quality and less detailed, or it may be judged to be too generic across the cohort.
So we have to be careful not only that we collect the correct data, but also that it is of sufficient quality, volume and breadth to permit further analysis.
Businesses commonly refer to stakeholder value. Does a student/employee/employer/society (stakeholder) receive a fair return (value) from a university?
If we focus intently on the value that students obtain from their experience of study, there would be a concerted effort to define measures that relate directly to specific operational activities; in other words to understand and monitor the organisational processes that students interact with (sometimes referred to as ‘the student journey’).
A focus upon the value of staff might concentrate upon finding ways of acquiring data about staff, so that management decisions can be made relatively quickly rather than relying upon an annual staff survey.
Reflection: From the data you gathered during the Awareness component, what measures does your organisation have in place to monitor the value obtained by your stakeholders?
As leaders we are interested in the future (‘to-be’), but we also recognise that we need to know how far away the vision is from the current situation (‘as-is’). Our key questions therefore are:
- What is the current situation (‘as-is’)? Having completed the strategy phase of ADVANCE you’ll have a comprehensive set of data that includes not only quantitative measurements of performance, but also the location and detail of the various institutional repositories. During the process of discovering this data, you’ll no doubt have made some new acquaintances that will help you in the future.
- What challenges are on the horizon that could be avoided? At this point you should have a combination of data sources. First, the external data that you gathered as part of the Definition phase will give you a perspective on the wider sector that your department/institution operates within. Second, you will have already synthesised your own thoughts and experience together with the views of people you work with. Many coaches and attendees of my workshops have remarked that their daily conversations now have much more value, since they are using the insight that they gain from regular data-gathering to inform the questions they pose. ‘Open questions’ are a great assistance as well.
- Are all the critical issues being monitored appropriately? One of the dangers of investing effort into the future is that the essential operations are neglected in some way, resulting in a reduced quality of service. This can be one of the most challenging aspects of any transformation, as some mistakes can be catastrophic enough to question the need for a particular change initiative. When a culture needs to change, it is not always a good idea to partition some staff as caretakers while you lead all the exciting work. Everybody needs to be encouraged to participate, and therefore we have to be sure that the daily business will only be improved, not harmed. A clear focus on stakeholders is important; it reinforces who staff are serving and is a constant reminder of the essential activities for legal/regulatory compliance. Management systems that focus on stakeholders tend to be more successful in this regard, and we should endeavour to choose measures that help us drive improved behaviours. You’ll find that the data gathered during the Awareness component is useful here.
- What are the priority actions? Priority actions fall into two categories. First, there are actions that are visible from basic report analysis; what needs to happen with regard to a drop in student achievement? Second, what actions are there that affect our ability to reach strategic objectives? For example: What is the key factor affecting student recruitment in this subject discipline?
- Where is the detail for further analysis? This question ensures that we collect all of the information that we need to inform our decisions. It is of no use to collate aggregated data for reporting if we cannot disaggregate it to understand the underlying causes.
Use these questions as your own, regular ‘sense-check’. Their simplicity disguises incisiveness, particularly when posed by a coaching manager!
Reporting and visualisation
The prevalence of desktop PC business software such as spreadsheets, together with in-built tools for quickly assembling graphics, has resulted in lots of creativity being applied to organisational reports. Please do resist the temptation to use every available graph in each report. If we assume that we are only going to present information of value, then there are two basic rules to abide by:
- Each measure should be accompanied by a graph that indicates the trend. Graphs referred to as ‘Dot Plots’ are the simplest and most desirable.
- Each measure should include a value reflecting the variance since the last report. This can be a positive or negative number, or arrows can be used for a graphical visualisation.
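As a minimal sketch, the two rules above could be automated with a few lines of Python. The measure name and figures below are purely illustrative, not real data:

```python
from statistics import mean

# Hypothetical monthly values for one measure (e.g. enrolments);
# the figures are illustrative only.
history = [412, 420, 405, 431, 428, 440]

# Rule 2: the variance since the last report, as a signed number
# plus an arrow for a graphical visualisation.
current, previous = history[-1], history[-2]
variance = current - previous
arrow = "▲" if variance > 0 else ("▼" if variance < 0 else "–")

# Rule 1: a crude text rendering of the trend, one mark per period,
# showing whether each value sits above or below the overall mean.
baseline = mean(history)
trend = ["*" if value >= baseline else "." for value in history]

print(f"Latest: {current}  Variance since last report: {variance:+d} {arrow}")
print("Trend:", " ".join(trend))
```

Even this toy version illustrates the point of the rules: the reader sees the direction of travel and the size of the latest movement at a glance, without wading through a table.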
Many reports offer considerable variations on this theme, but in essence you should strive for simplicity. An added benefit of simple reports is that local statistics departments will be able to produce them quickly. Anything that communicates the data and insight faster is a good thing.
As leaders we are of course interested in our staff. Our use of a coaching mindset means that we see the value of developing our staff, and in return our staff will perform at a higher level.
Since conversation is a fundamental part of coaching relationships, a scenario can develop whereby the person doing the coaching has the broadest view of the staff values and behaviour, but not everyone else has the privilege of sharing that view.
In time, as a culture develops, the values and behaviours of staff will develop and higher performance will become the norm. But what should we do in the meantime?
It might be that there are some issues that are not being surfaced by staff, even to a coaching manager, and then by implication those difficulties are not being attended to. This can lead to extra inertia for the change initiative to overcome.
Reflection: Compare the student feedback of teaching with your assessment of the module teaching team’s satisfaction in their roles. What do you observe?
Universities often survey their staff anonymously on an annual basis, and report back the cumulative results. In a similar vein, UK higher education institutions (HEIs) take part in the National Student Survey (NSS) to understand the satisfaction of students in relation to a number of factors such as teaching quality, assessment and feedback, the Student Union, etc.
The use of the NSS results (which are made public for all HEIs) to manage performance has resulted in institutions installing their own student feedback systems.
What are the reasons for this?
One reason is that it is difficult to monitor performance based upon an annual measure. How can those delivering the student experience understand what really improves the reported student satisfaction with such an infrequent measure?
As a result, some HEIs measure student satisfaction more frequently at the end of each semester or term, so that dissatisfaction can be discovered earlier and corrective actions can be taken.
Another reason cited is that students can sometimes be reticent to commit their true feelings to a written survey, and that more practice in completing surveys, and more evidence that action will be taken as a result of completing the survey, will improve the quality of data collected.
Similarly, you wouldn’t check application figures only annually, as there are some marketing and outreach activities that could positively affect the recruitment cycle. So, if we are interested in our staff, we should make gathering data about how valued they feel a priority activity.
Whether you do this more formally using an anonymous questionnaire or not is a matter for you, your culture and your bureaucracy. But as a coaching manager you understand the benefit of engaging directly with your staff to develop their capabilities and talents.
If you develop a structured reflection habit, it is wise to include data from conversations you have with staff. This is best achieved by carrying a notebook with you at all times (many of you already do), so that you can jot down the essence of a conversation you have with each individual.
It isn’t usually necessary to record exactly who it was, or where it happened, but a brief note can capture the current sentiment, that might otherwise be lost in the busy-ness of the day.
As is so often the case with reflection habits, it’s the process of data capture that assists your memory. Your act of noting a thought down somehow reinforces it, and also permits you to reflect in the moment. You also have a record that can be entered into your structured reflection system.
Part of the practice of a coaching manager is the ability to ask good questions. In fact, like good educators, coaching managers “don’t teach answers, they teach questions”.
The same approach should be applied to the data we discover and the data we receive. Data, and particularly reports, can mislead through error. Organisational politics can also result in data being presented in specific ways to exaggerate or conceal poor performance. How do we deal with this?
The best antidote to errors (and there will always be errors) is to practice excellent data hygiene and process integrity yourself, and act upon the insight you discover. For you to act with integrity, whilst using data produced by somebody else, means that you need to assure yourself that the data has been collected properly.
Tread carefully when questioning the origin of data or the results of analysis; people can feel threatened and become defensive, particularly if the debate is public.
Questions about the data tend to relate to statistical concepts of reliability, sample size and bias. For instance:
- What is the standard deviation of this value?
- What is the significance of the variance?
- Is there a seasonal cycle in the results, or is it just noise?
- How representative was the training set of data for the model?
- What bias might exist in the sample?
- What was the sample size?
- What confidence do you have in the forecast?
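Several of these questions can be answered with very little machinery. As a sketch, assuming a series of hypothetical weekly satisfaction scores (the figures are invented for illustration), the standard deviation and a crude significance check on the latest variance look like this:

```python
from statistics import mean, stdev

# Illustrative weekly satisfaction scores; values are hypothetical.
scores = [3.9, 4.1, 4.0, 3.8, 4.2, 4.0, 3.9, 3.3]

mu = mean(scores[:-1])   # historical mean, excluding the latest reading
sd = stdev(scores[:-1])  # sample standard deviation of the history
latest = scores[-1]

# A rough rule of thumb: a reading more than two standard deviations
# from the historical mean is worth questioning, rather than dismissing
# as noise.
z = (latest - mu) / sd
significant = abs(z) > 2

print(f"mean={mu:.2f} sd={sd:.2f} latest={latest} z={z:.2f} significant={significant}")
```

This is deliberately simplistic (a proper treatment would consider sample size, seasonality and bias, exactly as the questions above suggest), but it shows that the coaching manager’s questions map directly onto calculations anyone can run.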
Other questions around visualisation are important to query, particularly if the comprehension of reports is difficult. Visualisation standards can be challenging to adopt at the beginning, so the sooner you establish that trend plots are preferable to tabular data, and that pie charts don’t really tell us much, the sooner you will all benefit from analytics reporting.
Such systems can bring new dimensions to meetings, where you spend some time looking at the outcome data (which reports what has already been actioned), and the majority of the time looking forward and discussing future scenarios based upon performance data.
Yet more opportunities to coach!
If you are reading this book before you have started to implement ADVANCE, it is a useful exercise to answer the five questions in ‘As-is, to-be’. Keep your answers safe and then repeat the exercise when you have completed Awareness, Definition and Vision.
One way to approach the Analytics component of ADVANCE is to select a particular area of concern for analysis, and through further questioning, analysis and dialogue with those around you, develop a focus that makes explicit use of the data to initiate small-scale improvements. For example, student retention is an important factor that affects the health of a programme, department or institution.
Traditional approaches would look at the withdrawal figures, interview a few students, and then spend a lot of time hypothesising about what we think the reasons are. A more analytics-based approach might be to look at a broader set of student data and then produce some measures that your analysis indicates could be related to a student’s choice to withdraw.
This might be their achievement or engagement with lectures, and you might also look at personal tutor meeting records. Further analysis might indicate that students of a particular demographic are more likely to withdraw, which would help identify where support efforts could be channelled.
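The demographic comparison described above can be sketched in a few lines. The group names and records below are entirely hypothetical, invented to show the shape of the analysis rather than any real finding:

```python
from collections import defaultdict

# Hypothetical student records: (demographic_group, withdrew).
# Groups and figures are illustrative only.
records = [
    ("commuter", True), ("commuter", False), ("commuter", True),
    ("campus", False), ("campus", False), ("campus", True),
    ("mature", True), ("mature", True), ("mature", False),
]

totals = defaultdict(int)
withdrawn = defaultdict(int)
for group, left in records:
    totals[group] += 1
    withdrawn[group] += left  # True counts as 1

# Withdrawal rate per demographic group, highest first.
rates = {g: withdrawn[g] / totals[g] for g in totals}
for group, rate in sorted(rates.items(), key=lambda kv: -kv[1]):
    print(f"{group:10s} withdrawal rate: {rate:.0%}")
```

In practice the groupings would come from your student record system and the rates would feed the questions you pose in conversation: a group with a markedly higher rate is where support efforts could be channelled first.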
Using the measures from your vision, populate each metric with the data from your current situation. This is your baseline from which all subsequent activities will be measured.
Depending upon the effort that is required to obtain the data, you might need to create a protocol that simplifies this in the future, such as a request to the IT department to create a report of the data that you need for your purposes. These can sometimes take a little time to arrange, but once they are done they are embedded in the system.