Category: Leadership

  • Flexible learning materials – how to make them quickly

    Flexible learning materials – how to make them quickly

    Really pressed for time? Download the full guide to creating more flexible learning materials.

    I’ve been part of higher education for over twenty years now, witnessing a number of changes that have each presented different challenges.

    There is a paradox within higher education. The generally conservative character of institutions belies an environment that thrives when it is required to change. Academics demonstrate the agility that a start-up desires; innovative solutions to new problems are debated, then developed at breakneck speed.

    Of course, it is extremely disruptive to take an entire educational sector and transfer it online.

    In the UK, ‘lockdown’ measures started part-way through an academic year. Institutions and students had no alternative but to cope with emergency measures to complete the remainder of their courses.

    Ahead of the game

    Many academics were ahead of the game, and examples are emerging where student satisfaction has improved as academic staff have deployed interesting and stimulating online alternatives to traditional lectures and tutorials.

    The transition hasn’t been easy and it is not over yet. Uncertainty remains in society, and perhaps the only practical strategy going forward is to maintain a state of preparedness.

    Not all subjects are the same, though. A course that runs predominantly on debate and discussion might cope better with social distancing controls than one that expects learners to operate specific equipment with their hands, working in close proximity to peers, as they might be expected to do after they have graduated into their first jobs.

    These challenges become recast as obstacles to circumvent or simply knock down; constraints that make something appear impossible become the driving force for innovation.

    Longstanding debates as to the effectiveness (or not) of online teaching have quickly been dusted off and are being revisited with vigour. The difference now is that a) the whole sector has some experience of online learning, and b) moving forward, it is difficult to imagine a university education without it.

    Managing changes in learning

    Academic managers deal with a number of recurrent issues from year to year. The headache this year of moving into an uncertain environment, with incomplete knowledge, is perversely both extremely difficult to comprehend and intoxicatingly exciting.

    A university’s branding is based on its reputation for research, teaching and societal impact. This reputation continues to be tested as we adopt new solutions for delivery.

    Academic change agents are revelling in the volume of change that can be led, so that the eventual response is progressive, innovative, and results in something that operates better.

    Many of my managerial colleagues are faced with the reality of managing the practicalities. If social distancing reduces your classroom capacity by 75%, there is not enough slack in the system to increase the number of staff hours by a factor of four to compensate.

    This assumes that the traditional models of teaching are simply transferred online with no change. And so, the physical constraints of maintaining a sustainable education sector are themselves driving the need for teaching to change.

    Unintended outcomes

    Interestingly, attendance at my departmental meetings has rocketed since we moved online. It shows that the physical environment can inhibit networking, especially if it just isn’t practical to make the journey across campus to attend a meeting that is sandwiched between lectures.

    I’ve had countless conversations with academic staff who are trying to make things work, as well as those that are genuinely bewildered by current happenings. Most of the questions I field are about the practicalities of potential solutions.

    If we need four times as many hours to deliver the old solution, how can we deliver at least the same quality of service without making excessive demands upon staff?

    Authoring flexible learning materials quickly

    Academic colleagues are starting to demonstrate that new models of teaching can work and I have summarised some of these approaches, and the thinking behind them, into a short guide to creating more flexible, blended and online courses. The guide is not meant to be the last word on curriculum design and flexible learning materials, nor is it claiming to be a definitive answer to the challenges ahead.

    It is a collection of the main issues that academics face when being required to turn their course delivery into something different.

    The question of hours – how long will it take? – is a common part of academic workload discussions, so the bits of the guide that address this might be of use to other academic managers who are having similar conversations.

    Some small vignettes describe case studies where online or blended approaches have been particularly successful in the past. There are also some links to materials that I have found useful in developing my own teaching.

    Working in the technology domain, this is an exciting time to explore the use of tools to facilitate, and ultimately enhance, the way we interact, work and learn together. The challenges are there to be overcome.

    “The impediment to action advances action. What stands in the way becomes the way.”

    Marcus Aurelius.

    Download the full guide to creating more flexible learning materials.

  • Deep reflection for practitioners

    Deep reflection for practitioners

    Those that practice regular reflection, and have an operational system in place, witness some significant benefits in their development. At the very least, you will be more aware of how you behave – and while you might not always be pleased with the news – the increased accuracy of your insight from deep reflection will provide a more rigorous foundation on which to base your future decisions.

    Many of those that have attended my leadership development workshops have reported significantly larger successes as a direct result of adopting the reflection habit. When I’ve coached clients, they also realise the potential of regular, structured reflection, and in the main this is sufficient to successfully achieve significantly higher than average performance.

    However, there are two specific scenarios where the reflection habit needs to be extended. The first is when someone has been practicing reflection for some time. They have got into the habit of setting developmental goals and using their deep reflection data to plan for new experiences.

    The second scenario is when an individual presents a demanding goal that will have considerable impact; this may require 3-5 years and substantial, sustained effort to attain. In such cases I tend to recommend adopting the reflection habit exclusively to begin with, but sometimes the time frame is so compressed that we need to add something else on top as well.

    One of the important skills of reflection is the ability to separate the recording of facts from any interpretation that you might have ‘learned’ to use, to process the new experience. This presents two key advantages for your leadership development:

    • The ‘significant’ event is recorded accurately, with an emphasis upon fact. Which would you rather have to base your future decisions on – an account of a significant event seen through your normal ‘prejudiced lens’, or an accurate record of what actually happened?
    • Since the recording of the event is separated from any reflection post-processing, the reflection itself is more significant. You consciously reflect upon the data that you have collected, safe in the knowledge that you have worked hard to ensure that the facts of the experience have been collected.

    Furthermore, when you have completed the reflection, you have two records: the original event, and your subsequent, considered thoughts. This is invaluable when you start to look for patterns in your own behaviour.

    I’m of the opinion that leadership is a continual learning process. We may coach others, but when we actively engage in reflection we are actually coaching ourselves. But to qualify that specifically, it’s a continual active learning process.

    The reason I say this is that many people appear to be satisfied with passive learning through experience, measuring their progress in terms of years of service or the rung of the career ladder achieved. I’m motivated to take charge of my learning, as I’m sure readers of this blog are also.

    You will already have started looking for new opportunities to engage in, either to practice your newly found skills, or to experiment with new experiences. This often occurs at a subconscious level, as I witness with clients in coaching sessions.

    As they grow more aware of their progress, they start to actively plan for development experiences, further building their experiential evidence. As I mentioned earlier, this is enough of a development-boost for a lot of leaders, but if you really want to master your own development, we’ll need to do a bit more.

    Action planning

    Action planning is useful when it is focused upon one, two, or at most three aspects of your development. It should be measurable (of course), used for a specific purpose, and discarded when the outcome has been achieved. 

    More importantly, it must be relevant to your current and future states, and is therefore shaped by the other development tools that you might employ. Plenty of my workshop attendees complain about how difficult action planning can be, and that it seems to not be worth the effort as achieving a successful outcome can be sporadic.

    It is likely that those who have not yet developed an accurate model of their self-awareness will find action planning problematic. Sort out a reflection habit, and you’ll have plenty of pertinent data to draw upon.

    Finally, action planning needs to be considered part of a more holistic approach, but I’ll come back to that in a short while.

    A strong theme of my approach to behavioural change for leadership development is that any new habits should be simple to adopt. So my action plans tend to be lists of objectives.

    Each objective is SMART (Simple, Measurable, Achievable, Result-oriented and Timebound). For more on SMART objectives please consult Professor Google. But to be honest, the only aspect of SMART that my clients struggle with is Achievable.

    It takes a fair bit of self-awareness to repeatedly assign yourself achievable goals (that mean something). Goals are either stratospheric, or just too safe. Safe goals are achieved easily, but the lack of stretch does not promote effective personal development. If you’re still unsure as to how to progress, establish the reflection habit right now.

    So far, we have a process in place to capture experiential data and reflect upon it in a structured fashion. We also have a simple means of expressing specific developmental objectives, with a focus upon delivery of outcomes. In the same way that structured reflection can be sufficient for many developing leaders, the addition of action planning, driven by themes that have emerged from the reflections, can provide added effectiveness.

    But those who truly aspire to excel can utilise their existing developmental habits to build a much more comprehensive, holistic system. One of the potential limitations of capturing reflections and formulating action plans is that there could be a mismatch between what the individual pursues and what is required for a given situation.

    I feel that the risk of objective mismatch diminishes over time, as individuals become more self-aware. But therein lies the problem. If the risk diminishes the more you do it, then you are most at risk when you start the process.

    As a result, I tend to coach clients to adopt the reflection habit as a primary, discrete activity, without being overly goal driven at the outset. Early on, it’s more about self-discovery.

    I’ve found that some people like a bit more structure to their learning when they start reflecting, and if they are used to a culture of action planning, then it’s important to guard against any over-enthusiastic development plans being created.

    In my experience, an effective approach is to tackle the issue of critical self awareness head-on, by asking the individual to conduct a self appraisal. This needs to be quick and simple, to get the maximum benefit, and a SWOT (Strengths, Weaknesses, Opportunities and Threats) analysis can be a good starting point.

    A better start, in my view, is a SWAIN (Strengths, Weaknesses, Aspirations, Inhibitors and Needs) analysis. This approach contextualises current strengths and weaknesses in terms of the future desires of the individual, and implicitly fires up the relevant planning neurons.

    Used at the outset, structured reflection can be suitably constrained so as not to go too far off course, and the first set of developmental objectives are likely to be relevant to the initial self-assessment.

    So what’s the problem with adopting this whole system from day one?

    Well, it can be done, but the danger is that it becomes too much of a system, that needs to be applied in a prescribed way. When faced with such a fundamental change in personal development, a lot of people cry out for forms and flowcharts, in order to cope with the amount of change.

    This more or less guarantees its failure. Whilst we need to use paper (physical or virtual) to make records, we should not fall victim to excessive administration.

    A developmental leader embraces the holistic view. If any gaps exist, they are plugged with efficient processes that enrich the overall development process. But the same individual is also acutely self-aware, and adopts an incremental approach to enhancing learning. I favour such an approach when it comes to building a personal learning system.

    First, build your self-awareness through regular, structured reflection. From the themes that emerge, use action planning to focus your attention on a constrained number of developmental issues. Then, add the SWAIN self-appraisal checks to the mix. Use each SWAIN to check your overall progress, and to diagnose any specific needs for your holistic development. In terms of frequency, you’ll establish your own schedule. But here is a suggestion:

    • Structured reflection – daily;
    • Action planning – as and when development issues arise;
    • SWAIN analysis – every quarter (every 3 months).

    Obtaining an overall view of your learning requires a suitable container, in which all of your learning evidence is ‘kept’. Traditionally, artists keep evidence of their work in a portfolio, to illustrate how they have developed and to show what their capabilities are. This is similar to what we might want, except that it would be useful if the path of learning development could be observed.

    Journalling

    The practice of journalling has been around for as long as people could write. If you develop a reflection habit, then you will need somewhere to record your experiences, draw conclusions and then plan your new experiments.

    The experience of writing longhand can be cathartic. However, once the volume of entries starts to accumulate, it can become increasingly difficult to ‘mine’ your records to identify patterns. Coupled with the fact that some people worry that a journal might be lost, or that someone else might read it, there is often some resistance to writing things down.

    A common reaction to the prospect of regular reflection is: “I couldn’t possibly write down everything I feel, just in case it gets out”. It’s a shame that people feel this way, but I have two comments to make.

    First, I am advocating reflection about how we develop as leaders, probably in the workplace. We are not talking about self-disclosure and deep therapy. Second, if you don’t want anyone else to read it, then there are methods that don’t require you to keep your journals locked away in a safe.

    Using technology

    More people have access to technology these days, and for most university employees a computer is at the centre of their work. Computers can help with the reflection habit, since we have lots of opportunities to use them, particularly if you own a smartphone.

    This is my ‘secret’ to regular reflection: Every workday I will write for a minimum of 10 minutes before I read my email.

    I could, of course, simply send an email to myself that contains my reflection. No financial outlay, the records are kept electronically so they can be searched, the organisation ensures that they are backed up, and I can access them wherever I have a network connection.

    This is the simplest and cheapest approach, and it is relatively secure. If you send the emails to another email address then you would have to ensure that they were encrypted before you sent them – emails are the equivalent of postcards on the Internet, as anyone can read them – but if you send them to yourself, only the IT system administrator could read them.
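    To make this concrete, here is a minimal sketch of the email-to-self habit in Python. The mail server address, port, account and password below are placeholders (my assumptions, not a recommended setup); the point is simply that a dated reflection lands in your own mailbox, searchable and backed up by the organisation.

```python
# Minimal sketch: email a daily reflection note to yourself.
# The host, port, address and password are placeholder assumptions.
import smtplib
from datetime import date
from email.message import EmailMessage

def send_reflection(body: str,
                    address: str = "me@example.ac.uk",
                    host: str = "smtp.example.ac.uk",
                    port: int = 587,
                    password: str = "app-password-here") -> None:
    msg = EmailMessage()
    msg["Subject"] = f"Reflection {date.today().isoformat()}"
    msg["From"] = address
    msg["To"] = address               # sent to yourself, so it stays in your own mailbox
    msg.set_content(body)

    with smtplib.SMTP(host, port) as server:
        server.starttls()             # encrypt the connection to the mail server
        server.login(address, password)
        server.send_message(msg)

if __name__ == "__main__":
    send_reflection("Ten minutes of reflection before opening today's email ...")
```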

    Another alternative is to use a free blogging service (such as Google Blogger or WordPress) with the privacy controls set so that only the author (me) can see it.

    The use of a blogging tool has significant advantages for your organisation. The workflow below will simplify your regular reviews. The simpler a tool is, the more likely you are to use it regularly.

    Workflow for reflecting with a web-based blogging tool, such as Google Blogger or WordPress:

    1. Collect – write notes at every opportunity, record fragments of conversations for later review.

    Using the tool: post frequently, directly via the web or through emails from your iPhone, internet cafe, PDA, etc. At least 10 minutes per day before opening your email!

    2. Review and reject – go back and look at what you have written. Sort the wheat from the chaff. If you write one summary review every week, then that is at least 4 structured reflections per month; to review quarterly, you need only look at the 3 latest monthly review postings.

    Using the tool: review your postings for the week. Write a summary post and label it (different blogging platforms have different vocabularies – it might be ‘tag’ or ‘category’). You might choose weeklySummary as your label, for instance. If you are reviewing the month then the label might be monthlySummary. And for quarterly reviews …

    Why do I need to add a label? Labels allow you to quickly sort your postings. When you come to do your first monthly review you just click on the weeklySummary label, then read the 4 latest postings and conduct your review.

    3. Refine and plan – use the reviews to create stand-alone pieces of writing. For example, after writing for a few months you might want to write a summary piece on how a new approach you have adopted has developed over a semester. Now you can start to project forward and think about what you want to achieve with your writing.

    Using the tool: create a stand-alone post and label it ‘article’ or ‘potential’ or anything else that you can identify at a later date. Think of these posts as more developmental; if you have an idea that is related to this post, then use the Comments link at the bottom of the post to record your thinking. This is especially useful when developing a theme for your development.

    At any point in time this tool serves as a snapshot of your current developmental needs, together with an explicit, reasoned narrative of your learning journey. It’s also evidence of the importance that you place upon continued development. Coaching managers understand this and use reflective practice to develop themselves beyond all expectations.

  • ADVANCE – Analytics

    ADVANCE – Analytics

    The Analytics component of ADVANCE helps us focus on understanding how to use measurements to their best effect. Lots of organisations produce data from their management information systems (MIS), which spew out reports that are presented at meetings and other fora.

    Organisations are complex entities and the breaking down of the information that is produced helps us understand the more detailed activities of the operations.

    For example, a report that shows a trend of reducing student fee income does not explain the reasons for this trend occurring. It does provide the opportunity to pose the question:

    “Why is this happening?”

    Of course, a coaching manager might ask:

    “What are the reasons for this trend?”

    These questions will then initiate further scrutiny that may uncover a whole host of reasons such as (but not limited to):

    • Reduction in enrolments;
    • Increased number of student withdrawals;
    • Different mix of fees for a given number of students, etc.

    This process is what we generally understand as analysis, and it is a daily occurrence as we unpick the broader measures to run the organisation. So, how does analytics differ from analysis?

    Analytics refers to the overall approach of discovering and interpreting insight from data, that is used to guide decision making. This guidance may include scenario modelling, that incorporates predictive modelling to aid understanding of the current and future situations.

    A large part of analytics is the process of communicating the insight by incorporating visualisation techniques, but it is also the use of these techniques that helps elicit meaningful patterns, leading to potentially new knowledge.

    From a practical perspective, the current state of analytics is a marriage between statistics and computer science, which not only provides the necessary analysis and predictions, but also automates the repetitive scenario modelling on generally large and distributed datasets. A lot of this activity originated under the banner of Operations Research (OR).

    We are not going to dive into the technicalities here; there is a vast base of literature that deals with the mechanics of analytics.

    We use the Analytics component to focus on the identification, maintenance and consumption of data to inform better decision-making, so that the outcomes can be realised. How will the data help you?

    Here are a few scenarios that may resonate with you:

    • There is insufficient data upon which to base a decision. “We don’t know how many unique visitors there are to our website because it’s managed by a third party”.
    • The time taken to obtain the data is too long. “We can’t hang around to wait for reports, that then need to be analysed and cross-referenced.”
    • The data is inaccessible. “We must have the data, but it isn’t available through the XYZ system.”

    Having identified your vision, together with the key measures, you’ll be in a much better position to know what data you will need. Sometimes our enthusiasm for improvement clouds our thinking and we rush off and develop comprehensive reporting systems that collect and harvest both the data we want, as well as data that will not help us reach our objectives. 

    Analytics is about determining the actual data, analysis and forecasting that is required, and then ensuring that existing mechanisms are fully exploited.

    Only when an existing system cannot deliver what is required do we consider changes; in most cases the existing systems can deliver sufficient data to support enough modelling to make significant improvements.

    In the case of using ADVANCE for individuals, this phase is particularly useful. We can find it difficult to measure our own development and it is not always intuitive (or even attractive!) to seek data from others.

    However, the realisation that external data sources are useful means that individuals start actively soliciting feedback on their performance as a result of ADVANCE, as they can see a faster way forward.

    Value

    Back in the Introduction, we posited that “it’s all about the people”. A lot of performance management systems concentrate upon process, which is understandable to some degree as processes are normally easy to quantify and monitor. If students receive feedback in two weeks rather than four, then a reduction in the lead-time can be reported.

    Of course, it does not follow that a student will view this as an improvement. The feedback may be of lower quality and less detailed, or it may be judged to be too generic across the cohort.

    So we have to be careful that not only do we collect the correct data, but that it is also of sufficient quality, volume and breadth to permit further analysis.

    Businesses commonly refer to stakeholder value. Does a student/employee/employer/society (stakeholder) receive a fair return (value) from a university?

    If we focus intently on the value that students obtain from their experience of study, there would be a concerted effort to define measures that relate directly to specific operational activities; in other words to understand and monitor the organisational processes that students interact with (sometimes referred to as ‘the student journey’).

    A focus upon the value of staff might concentrate upon finding ways of acquiring data about staff, so that management decisions can be made relatively quickly rather than relying upon an annual staff survey.

    Reflection: From the data you gathered during the Awareness component, what measures does your organisation have in place to monitor the value obtained by your stakeholders?

    As-is, to-be

    As leaders we are interested in the future (‘to-be’), but we also recognise that we need to know how far away the vision is from the current situation (‘as-is’). Our key questions therefore are:

    • What is the current situation (‘as-is’)? Having completed the strategy phase of ADVANCE you’ll have a comprehensive set of data that includes not only quantitative measurements of performance, but also the location and detail of the various institutional repositories. During the process of discovering this data, you’ll no doubt have made some new acquaintances that will help you in the future.
    • What challenges are on the horizon that could be avoided? At this point you should have a combination of data sources. First, the external data that you gathered as part of the Definition phase will give you a perspective on the wider sector that your department/institution operates within. Second, you will have already synthesised your own thoughts and experience together with the views of people you work with. Many coaches and attendees of my workshops have remarked that their daily conversations now have much more value, since they are using the insight that they gain from regular data-gathering to inform the questions they pose. ‘Open questions’ are a great assistance as well.
    • Are all the critical issues being monitored appropriately? One of the dangers of investing effort into the future is that the essential operations are neglected in some way, resulting in a reduced quality of service. This can be one of the most challenging aspects of any transformation, as some mistakes can be catastrophic enough to question the need for a particular change initiative. When a culture needs to change, it is not always a good idea to partition some staff as caretakers while you lead all the exciting work. Everybody needs to be encouraged to participate, and therefore we have to be sure that the daily business will only be improved, not harmed. A clear focus on stakeholders is important; it reinforces who staff are serving and is a constant reminder of the essential activities for legal/regulatory compliance. Management systems that focus on stakeholders tend to be more successful in this regard, and we should endeavour to choose measures that help us drive improved behaviours. You’ll find that the data gathered during the Awareness component is useful here.
    • What are the priority actions? Priority actions fall into two categories. First, there are actions that are visible from basic report analysis; what needs to happen with regard to a drop in student achievement? Second, what actions are there that affect our ability to reach strategic objectives? For example: What is the key factor affecting student recruitment in this subject discipline?
    • Where is the detail for further analysis? This question ensures that we collect all of the information that we need to be able to inform our decisions. It’s of no use to collate aggregated data for reporting, if we cannot dis-aggregate to understand the underlying causes.

    Use these questions as your own, regular ‘sense-check’. Their simplicity disguises incisiveness, particularly when posed by a coaching manager!

    Reporting and visualisation

    The prevalence of desktop PC business software such as spreadsheets, together with in-built tools to quickly assemble graphics, has resulted in lots of creativity being applied to organisational reports. Please do resist the temptation to use every available graph in each report. If we assume that we are only going to present information of value, then there are two basic rules to abide by:

    • The measures should be accompanied by a graph that indicates the trend. Graphs referred to as ‘Dot Plots’ are the simplest and most desirable.
    • A value to reflect the variance since the last report. This can either be a positive or negative number, or arrows can be used for a graphical visualisation.

    Many reports have considerable variations on this theme, but in essence you should strive for simplicity. An added benefit is that their simplicity means that local statistics departments will be able to produce them quickly. Anything that communicates the data and insight faster is a good thing.
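    As an illustration of the two rules above, here is a minimal sketch in Python (not a prescribed template) that draws a simple dot-plot trend for one measure and reports the variance since the last report. The figures are invented placeholders.

```python
# Minimal sketch: a dot-plot trend plus the change since the last report.
# The satisfaction figures below are illustrative placeholders, not real data.
import matplotlib.pyplot as plt

periods = ["Q1", "Q2", "Q3", "Q4"]
satisfaction = [78, 81, 79, 84]                     # e.g. % of students reporting 'satisfied'

variance_since_last = satisfaction[-1] - satisfaction[-2]

fig, ax = plt.subplots(figsize=(4, 2.5))
ax.plot(periods, satisfaction, "o", color="black")  # dot plot: points only, no clutter
ax.set_ylabel("% satisfied")
ax.set_title(f"Student satisfaction (change since last report: {variance_since_last:+d})")
plt.tight_layout()
plt.show()
```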

    Frequency

    As leaders we are of course interested in our staff. Our use of a coaching mindset means that we see the value of developing our staff, and in return our staff will perform at a higher level.

    Since conversation is a fundamental part of coaching relationships, a scenario can develop whereby the person doing the coaching has the broadest view of the staff values and behaviour, but not everyone else has the privilege of sharing that view.

    In time, as a culture develops, the values and behaviours of staff will develop and higher performance will become the norm. But what should we do in the meantime?

    It might be that there are some issues that are not being surfaced by staff, even to a coaching manager, and then by implication those difficulties are not being attended to. This can lead to extra inertia for the change initiative to overcome.

    Reflection: Compare the student feedback of teaching with your assessment of the module teaching team’s satisfaction in their roles. What do you observe?

    Universities often survey their staff anonymously on an annual basis, and report back the cumulative results. In a similar vein, UK HEIs conduct the National Student Survey (NSS) to understand the satisfaction of students in relation to a number of factors such as teaching quality, assessment and feedback, Student Union, etc.

    The use of the NSS results (which are made public for all HEIs) to manage performance has resulted in institutions installing their own student feedback systems. 

    What are the reasons for this?

    One reason is that it is difficult to monitor performance based upon an annual measure. How can those delivering the student experience understand what really improves the reported student satisfaction with such an infrequent measure?

    As a result, some HEIs measure student satisfaction more frequently at the end of each semester or term, so that dissatisfaction can be discovered earlier and corrective actions can be taken.

    Another reason cited is that students can sometimes be reticent to commit their true feelings to a written survey, and that more practice in completing surveys, and more evidence that action will be taken as a result of completing the survey, will improve the quality of data collected.

    Similarly, you wouldn’t check application figures only once a year, as there are marketing and outreach activities during the cycle that could positively affect recruitment. So, if we are interested in our staff, we should treat gathering data on how valued they feel as a priority activity.

    Whether you do this more formally using an anonymous questionnaire or not is a matter for you, your culture and your bureaucracy. But as a coaching manager you understand the benefit of engaging directly with your staff to develop their capabilities and talents.

    If you develop a reflection habit that is structured, it is wise to include data from conversations you have with staff. This is best achieved by carrying a notebook with you at all times (many of you already do), so that you can jot down the essence of a conversation you have with each individual.

    It isn’t usually necessary to record exactly who it was, or where it happened, but a brief note can capture the current sentiment, that might otherwise be lost in the busy-ness of the day.

    As is so often the case with reflection habits, it’s the process of data capture that assists your memory. Your act of noting a thought down somehow reinforces it, and also permits you to reflect in the moment. You also have a record that can be entered into your structured reflection system.

    Being critical

    Part of the practice of a coaching manager is the ability to ask good questions. In fact, like good educators, coaching managers “don’t teach answers, they teach questions”.

    The same approach should be applied to the data we discover, and the data we receive. Data, and particularly reports, can mislead through error. Organisational politics can result in data being presented in specific ways to exaggerate or conceal poor performance. How do we deal with this?

    The best antidote to errors (and there will always be errors) is to practice excellent data hygiene and process integrity yourself, and act upon the insight you discover. For you to act with integrity, whilst using data produced by somebody else, means that you need to assure yourself that the data has been collected properly.

    Tread carefully when questioning the origin of data or results of analysis; people can feel threatened and become defensive, particularly if the debate is public.

    Questions about the data tend to relate to statistical concepts of reliability, sample size and bias. For instance:

    • What is the standard deviation of this value?
    • What is the significance of the variance?
    • Is there a seasonal cycle in the results, or is it just noise?
    • How representative was the training set of data for the model?
    • What bias might exist in the sample?
    • What was the sample size?
    • What confidence do you have in the forecast?
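    For readers who want to see what sits behind questions like these, here is a minimal sketch in Python of a few of the underlying calculations – sample standard deviation, standard error and a rough 95% confidence interval for a mean. The sample values are invented for illustration.

```python
# Minimal sketch: basic checks behind questions about variance and confidence.
# The sample values are illustrative placeholders.
import math
import statistics

sample = [62, 58, 71, 66, 69, 60, 64, 73]    # e.g. module marks from a small sample

mean = statistics.mean(sample)
sd = statistics.stdev(sample)                 # sample standard deviation
se = sd / math.sqrt(len(sample))              # standard error of the mean

# Rough 95% interval using the normal approximation; with n = 8 a t-interval would be wider.
low, high = mean - 1.96 * se, mean + 1.96 * se
print(f"mean = {mean:.1f}, sd = {sd:.1f}, n = {len(sample)}, 95% CI approx ({low:.1f}, {high:.1f})")
```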

    Other questions around visualisation are important to query, particularly if the comprehension of reports is difficult. Visualisation standards can be challenging to adopt at the beginning, so the sooner you establish that trend plots are preferable to tabular data, and that pie charts don’t really tell us much, the sooner you will all benefit from analytics reporting.

    Such systems can bring new dimensions to meetings, where you spend some time looking at the outcome data (which reports what has already been actioned upon), and the majority of the time looking forward and discussing future scenarios that are based upon performance data.

    Yet more opportunities to coach!

    Exercise

    If you are reading this book before you have started to implement ADVANCE, it is a useful exercise to answer the five questions in ‘As-is, to-be’. Keep your answers safe and then repeat the exercise when you have completed Awareness, Definition and Vision.

    One way to approach the Analytics component of ADVANCE is to select a particular area of concern for analysis, and through further questioning, analysis and dialogue with those around you, develop a focus that makes explicit use of the data to initiate small-scale improvements. For example, student retention is an important factor that affects the health of a programme, department or institution.

    Traditional approaches would look at the withdrawal figures, interview a few students, and then spend a lot of time hypothesising about what we think the reasons are. A more analytics-based approach might be to look at a broader set of student data and then produce some measures that your analysis indicates could be related to a student’s choice to withdraw.

    This might be their achievement or engagement with lectures, and you might also look at personal tutor meeting records. Further analysis might indicate that students of a particular demographic are more likely to withdraw, which would help identify where support efforts could be channelled.
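    A minimal sketch of that broader, analytics-based approach is shown below. The file name, column names and the choice of a logistic regression are illustrative assumptions about what an extract from a student record system might contain, not a prescribed method.

```python
# Minimal sketch: relate attendance, achievement and tutor-meeting records to withdrawal.
# 'students.csv' and its columns are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression

df = pd.read_csv("students.csv")     # hypothetical extract from the student record system

features = ["attendance_pct", "average_mark", "tutor_meetings"]
X = df[features]
y = df["withdrew"]                   # 1 if the student withdrew, 0 otherwise

model = LogisticRegression(max_iter=1000).fit(X, y)

# Coefficients hint at which measures are most associated with withdrawal;
# a demographic breakdown (e.g. df.groupby("demographic")) would follow the same pattern.
for name, coef in zip(features, model.coef_[0]):
    print(f"{name}: {coef:+.3f}")
```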

    Using the measures from your vision, populate each metric with the data from your current situation. This is your baseline from which all subsequent activities will be measured.

    Depending upon the effort that is required to obtain the data, you might need to create a protocol that simplifies this in the future, such as a request to the IT department to create a report of the data that you need for your purposes. These can sometimes take a little time to arrange, but once they are done they are embedded in the system.

  • ADVANCE – cultivate your culture

    ADVANCE – cultivate your culture

    In agriculture, the word cultivate makes us think of preparing the soil for planting and tending to crops. When we cultivate we improve something; we foster growth, perhaps by focusing on a particular situation, person or characteristic.

    To achieve change that lasts, we are going to have to cultivate those around us (and probably ourselves at the same time). ADVANCE requires us to use data to improve the quality of the decisions we make.

    But we can’t rely on the decision-making of one leader, as there is insufficient capacity and capability to undertake the busy operations of a department or institution.

    We also know that the academic environment is full of people with a leadership mindset; they want to lead or be led; they don’t warm to directive management.

    We therefore need to build a culture that increases leadership capacity, so that more individuals are empowered to take the initiative, but also to ensure that they take initiatives that will move the department forward, not strangle it with uncoordinated conflict.

    We don’t necessarily have to create new processes every time we want to initiate change. Some managers do this and create longer term problems for themselves.

    Often it’s best to make better use of the existing systems and processes; you might use them in different ways, or increase the value of them. But the key is to save any disruption for specific obstacles.

    Culture change, and the processes of cultivating a different group mindset are complex topics. We are not going to address this complexity in its entirety in one article.

    However, we are going to explore a fundamental instrument of most organisations – the annual staff appraisal – and examine how a coaching mindset, combined with rational data from the environment, can significantly accelerate your ability to cultivate positive change.

    Annual appraisals

    The annual staff appraisal can strike dread/apathy/excitement/disappointment into all parties. In many cases, staff may feel that they will have to defend what they have done, or at least make an argument to counter what their manager expects from them.

    There is a tension between coaching as a developmental activity, and appraisal, which is something that a coaching manager must navigate carefully.

    Managers might want to use appraisal processes and documentation for the purposes of ‘transparency’ – where everyone appears to be set common objectives that can be easily reported on. Inevitably, with such a situation it is difficult to get all staff to play to their strengths. We are all different, and have something unique to offer.

    Managers are also being ‘managed’, and therefore they are likely to be required to report when all of their staff appraisals have been completed (are all the forms completed correctly and filed with HR?).

    If you have a few appraisals left to do, and they should have been completed earlier, there may be an implicit pressure to ‘get the paperwork done’, rather than fully take advantage of a developmental conversation with a member of staff.

    In a university setting there is the additional challenge of working with academic staff. As academics we like to argue and debate; we like to understand what something really means, and feel that we can relate to the context in which a measure might be applied.

    We don’t have a problem with qualitative measures, but the fact that we are comfortable with not having an answer readily available doesn’t necessarily help the organisation progress.

    But academic life can be a relatively selfish pursuit, and if we are thinking, we are learning. As we have explored earlier in this book, academic staff in general respond less enthusiastically to directive management styles, hence our advocacy of the manager as a coach.

    But as leaders we should attempt to focus upon activities that deliver value. What is the point of maintaining a dysfunctional approach to staff appraisals, if the mere thought of it saps the life out of us?

    However, if you think that you can just dispense with appraisals, then good luck. It would be a bold move to counter the generally accepted wisdom of a large bureaucracy, that has policies for staff appraisals, even though most of the managers see it for what it is.

    Of course as leaders we shall tap into our optimism and explore a more positive approach.

    Reflection: Reflect upon the conversations that you have had with staff in relation to their performance at appraisal time. Now compare this with your daily conversations. What differences do you observe? How can you transform the annual appraisal conversation with a member of staff?

    Perhaps the first issue to tackle is that the appraisal might typically be an annual conversation, and therefore it is too detached from working life. So maybe the first thing to think about is how the annual appraisal can be coupled more directly into the daily conversations.

    How can daily dialogue contribute towards the annual appraisal?

    What departmental themes could link a staff member’s contribution into the departmental/institutional vision?

    If we are going to evaluate performance, what evidence would you expect a member of academic staff to provide?

    These questions are much easier to answer if we have a clear vision of what the department/institution will look like, which you will have as a result of the foundation stones of ADVANCE. You will have the confidence that not only is the vision based upon reason and fact, but you will also have involved the same staff who you are appraising during its construction.

    If, after all this, they don’t know what the vision looks like, how can they translate your aspiration into their daily working lives?

    This should give you a clear idea of who falls into the ‘un-coach-able’ club. 

    As I said earlier, don’t waste energy coaching staff who aren’t receptive to open, challenging, developmental language. Invest in those who have potential, and those who are already performing at a high level.

    When you have developed your vision based upon facts that are relevant to your environment, the future is crystal clear. You will have identified the measures/metrics/characteristics that will indicate progress towards your vision. You can feel the future success!

    If a staff member can’t ‘feel’ the success, maybe they are a) in the wrong role, or b) in the wrong environment.

    You need to exercise some sensitivity in both of these cases. I feel that directive performance management can often ignore these two scenarios (or at least dismiss them, assuming that if someone is truly unhappy they’ll find another job), resulting in frustration for the manager and undue stress and anxiety for the staff (and their families, significant others, etc.).

    A coaching manager has to have the mindset whereby they truly want to help people. That includes people who don’t seem to be able to align themselves with the vision. Maybe they have been used to a way of being managed, and your approach is a surprise.

    Or they are actually quite fearful of change. Coaching can be quite effective in these situations, particularly if you commit to developing a relationship based on trust.

    They need to trust that you are genuinely interested in their workplace well-being. You can only build this trust by being optimistic, honest and generous with them. So, perhaps they are not quite ‘un-coach-able’ yet.

    Attendees at my workshops have echoed this sentiment many times; through a coaching oriented relationship they have helped a staff member either align themselves better with a department, or they have worked together to discover what the individual would prefer to do.

    When a staff member has a clear vision of what they want, a lot of the barriers disappear. Whilst this may result in the member of staff leaving, their departure is because they have found something better for them.

    Don’t underestimate the strength of the message that this projects to the immediate environment. When staff leave of their own accord ‘for something better’, they leave on positive terms. The rest of the department will see this; they will already know that a particular individual would not align with the change initiative.

    But they also observe an academic manager who reinforced the belief that the staff should be valued, and that means helping them discover their own potential, through a role they are suited to.

    The coaching manager does not persecute staff and make them perform against their will.

    So, with your measures and vision to hand (which you repeat and make reference to at every opportunity), the daily conversations become easier. 

    It’s then a process of aligning individual staff capabilities with the departmental themes. It’s about identifying where staff development has to take place – and after a short while, your staff will start telling you what development they need to align with your vision.

    As a departmental culture develops, mindful of a clearly articulated vision, the annual appraisal becomes more straightforward. Staff will identify the evidence that is already in place as a result of them aligning themselves to the vision. The developmental conversations will already have started during the year, and will be regarded as continual.

    The appraisal will suddenly have found its place – a chance to review progress over an extended period, and an opportunity to think a year or two ahead, as well as to discuss individual staff aspirations. Therefore, the appraisal will have morphed into something that is more developmental. 

    And this is at the heart of being able to cultivate a culture that wants to continually perform at a higher level.

    OK you say, this is all well and good. But at the outset there are staff who will find this approach challenging, and they will make the process arduous. Surely this will bring the whole culture change to a halt?

    It is common for the first round of appraisals to be difficult. There will be a minority that welcome the change in approach, fully subscribing to the notion that they can take charge of their own development in the context of improving the department.

    There will be a significant portion that are wary, suspicious, and genuinely frightened that they can’t measure up to the vision. Some of them will display apathy (“I’ve seen this before; just sit tight until the next initiative”), some will retreat and become reclusive, and some will generate a veil of enthusiasm, and produce a shopping list of expensive, time-consuming staff development activities.

    Beware, because the first request that you turn down could be used as an excuse to suggest that you never really meant what you said in the first place!

    And finally there is likely to be a hardcore minority who have every intention of not engaging. They may be frightened, confused, delusional, incompetent or just insecure. Every trick in the book will be used to dodge the process.

    Some managers see this as a game, with the objective of trying to ‘outwit’ their ‘opponents’. Unfortunately there are many examples of this approach being legitimised, in that the measured performance improves.

    Of course in such situations it is unlikely that a longer term vision has been created and it is the short-term transformation of numbers that is reported as a success. Nonetheless, the cost to the environmental culture can be quite damaging.

    As an academic coach you’ll persevere beyond the initial challenge as you’ll have confidence in the long term view. The measures you will have chosen will be based on the data that you have reasoned is important.

    In time, some of the hardcore will come round and realise that it might be interesting to engage after all, especially since the manager seems to want to help staff.

    The second round of appraisals is where managers see the greatest transformation. The keen early adopters are already bearing fruits of their focused engagement, and doing things that are visible to the rest of the department.

    They’ll already be in a position to Externalise. Success in acquiring one or two small funded projects can do wonders for the self-confidence, motivation and external visibility of an academic, which of course you will be supportive of.

    While hard-liners will still be resisting, the rest of the department will have started shifting. They’ll have witnessed the successes of the early adopters, and some will have got themselves involved already.

    Others will test the water by suggesting some new activities that they would like some development for. Some will be bold enough to set themselves a target to achieve for the coming year.

    By the third appraisal the bulk of the changes will have been made. Staff will have discovered what they like doing, to what extent it can be accommodated (usually the department is more flexible than people think), and have witnessed the benefits to them personally, all wrapped up in a department that is performing better.

    If during this period your department has recruited new staff, then the transformation is accelerated significantly. The new starters come in fresh and adopt the developmental approach without being held back by any prior cultural baggage.

    What is important to remember is that if you actively monitor and measure performance in a directive way, the annual appraisal will remain the key event on the calendar to report achievement.

    In contrast to this, a coaching-oriented style positively supports development on a continual basis, meaning that the annual ‘check-in’ can be more focused upon the strengthening of core values and the development of longer term career goals for an individual.

    So, you have it within your power to re-purpose the staff appraisal process and it’s an excellent instrument to cultivate higher performance.

    Reflection: What are the potential benefits of planning to develop role models in your environment? What can staff learn from a role model?

    Exercise

    A key tool of culture change is how you approach the appraisal and development of others. To do this you must familiarise yourself with the current staff appraisal process. Sometimes this is referred to as a ‘cycle’, or a ‘developmental review’ and there may be key points in the annual calendar at which point certain activities are undertaken.

    Once you have oversight of the process, look for ways in which your vision and measures can be incorporated into the cycle. For instance, do you have an event whereby a line manager discusses the objectives of the department for the coming academic year?

    As a coaching manager you are more likely to use these departmental objectives as prompts for developmental requirements for the individual concerned.

    If all staff need to produce two published outputs this year, what support will each of them need? Some will need more support than others.

    Depending upon your procedures for appraisal, you need to either rework the forms/processes etc., for your own purposes, or you should provide an addendum that enables the explicit links to be drawn between the departmental/institutional objectives, and the individual’s developmental requirements.

    The purpose of the addendum is to explicitly highlight the linkage between an individual’s contribution and the larger environment. This helps everybody by making clear what needs to happen, and prompts them to think about the support they need to help the department achieve its target.

    Developmental conversations that start with this tend to be productive. Sometimes an individual will not feel able to respond; this is OK as well, as the process of helping them complete it is another fantastic coaching opportunity.

    If we look at some extracts from a developmental objective setting form (Tables 2 and 3), we can observe the link between a departmental target, an indicator of what successful achievement looks like, a space for the individual’s contribution as to how they shall engage, and a date by when it needs to be concluded.

    This both prepares the groundwork and frames a coaching conversation in terms of the individual’s development. The key question for the individual is:

    “What development support do I need to achieve my objectives?”

    You might choose to add this to your form, to be completed as an outcome from your meeting. 

    Departmental target | How will we know when this has been achieved? | How will you provide evidence of your engagement? | By when?

    Improve student satisfaction score across modules taught | 85% of the students will report ‘satisfied’ or ‘very satisfied’ | (space for the individual to complete) | End of Semester

    Improve first-time pass rate | 80% of the students will pass first time and progress | (space for the individual to complete) | End of Semester

    Improve student achievement | 60% of students achieve at least 2:1 or First | (space for the individual to complete) | End of Semester

    Provide timely, constructive, written feedback to students | 100% of summative assessment feedback received within 4 weeks of submission deadline | (space for the individual to complete) | End of Semester

    Table 2. Extract from the teaching quality section of a development review form.

    Departmental target | How will we know when this has been achieved? | How will you provide evidence of your engagement? | By when?

    Improve quality and volume of research output for the department | Principal Researchers: 6 peer-reviewed articles, >2*; Researchers: 3 peer-reviewed articles, >2*; Other staff: 1 peer-reviewed article, >2* | (space for the individual to complete) | End of year

    Improve the external esteem of the department | Principal Researchers: 2 research events organised/edited books/edited journal special issues; Researchers: 1 research event organised/edited books/edited journal special issues | (space for the individual to complete) | End of year

    Improve the research environment | Principal Researchers: attract and supervise 1 new PhD student; Researchers: attract and supervise 1 new PhD student | (space for the individual to complete) | End of year

    Increase research funding into the department | Principal Researchers: achieve at least one successful bid >£150k as Principal Investigator; Researchers: submit at least 2 applications for funding >£10k; Other staff: engage with at least 1 funding bid submission | (space for the individual to complete) | End of year

    Table 3. Extract from the research section of a development review form.

    Using the above as a guide, take the measures you identified in Definition and Vision, and create a document that can be used to augment your existing developmental review/appraisal documentation.

  • University: vision creation

    University: vision creation

    Leadership development texts often refer to the importance of vision – having a vision, constructing a vision and the communication of the vision. But what is vision creation, and how important is it to have one?

A vision is some description of a future state. In terms of planning for development, it is anticipated that the vision will be aspirational: aspirational enough to stretch whatever is being developed, without being so ambitious that it demoralises.

    Successful communication of the vision is of utmost importance. Many a realistic vision has been left stranded by poor communication – the enablers of the vision, those who will follow your ‘North Star’, either misunderstood the message, or just “didn’t get it”.

    This can be disastrous and is a scenario we shall work systematically to avoid. If your vision is clear, and repeated often, the work to be done will follow logically and performance improvements will be witnessed.

    So, where do we start?

    If a critical success factor of a vision is how it is communicated, then we need to ensure that:

    • The recipients understand and can visualise the future state that you are describing. This means that it needs to be described using their language – the language of the industry they work in, using day-to-day expressions and statements particular to the domain;
    • There are obvious and explicit items to measure progress against. Everybody likes to see progress, and when we are in the thick of it, we can sometimes lose sight of the overall goals. A clear vision identifies the ‘big picture’ in terms of key measures, and serves to remind us of the importance of persisting to realise the vision rather than getting caught-up in the daily complexity of life.

Communicating a vision well is fundamental to the effective delegation of objectives to staff. If staff interpret the vision differently, you may get sustained effort that works against you. Vision creation can be a powerful force.

    Reflection: When was the last time (if ever) that you sat down and described what you wanted from your professional life? Your personal life? Both?

To ensure that a vision is relevant to your domain, you will need to consider:

    • Your external working environment. What measures are the standard for the industry? What are the benchmarks that you will be exceeding?
    • Your internal environment. How aligned are you with the internal processes/culture? Where does the agency for successful change lie within the institution?

Successful completion of the Definition component highlights data that is missing and prompts additional activity to fill the gap. Thus ADVANCE is rarely treated as a linear process in which we pass through each stage once only. Rather, it should be seen as a model that prompts refinement and iteration.

If some information is lacking from the vision, we need to return to the Definition component. Inevitably, work on one component will reveal insight that improves the definition of another. This is to be expected, embraced and, ultimately, exploited for maximum benefit.

    Aspirational, but realistic

Striking this balance is a common concern, especially if you have little or no prior experience of vision construction. It is important to rely on the data you have collected. The fact that you have done this as part of Definition will substantially increase your chances of success.

    Why quote a 20% increase in applications when the market median is 12%? 

Such a statement may be too aspirational. Conversely, a 5% increase may be judged as too conservative, or too risk-averse. Either will propagate silent messages that can sabotage your vision from the outset – people want to be led, not constrained.

You should also consider the number of items that you will need to describe your future state. It is surprising how many of my coachees dive in and create a list of a dozen or so items to measure. This is symptomatic of a directive management mind-set, and needs to be reconsidered when approaching vision creation. My question to them is simple:

    “What successes is your market leader known for?” 

After some initial, irrelevant detail, it quickly becomes apparent that there is a much shorter list of items that are important. This list might contain three to five crucial measures that really keep the market leader out in front of the rest.

    Reflection: Which institution/department would you regard as a market leader? What are your reasons for this judgement?

    From your earlier data gathering work, you will have no doubt read the vision statements of other universities. Whilst there will be differences, you will have seen a lot of similarity.

    And so, you might say, what is the point of a vision that is common across most of the sector, if not across industry as a whole? After all, don’t we all want to “offer the best student employment opportunities” and “attract world-class academic staff”?

Of course we do. But the realisation of the vision is specific to an organisation, and therefore the operational objectives that achieve the vision will vary from university to university.

A vision needs to be sufficiently abstract and concise to enable it to be repeated until it is completely embedded within the organisation, so that it can be recalled and referred to during daily work.

Therefore, a vision should be seen as much more than just a brief articulation of a future state. It needs substantiating with a set of objectives that can be measured.

    These two components – the future state, together with the list of objectives in a narrative – make up the output of the Vision component of ADVANCE.

    Getting started

Unless you have been thinking and talking about it for some time, vision creation tends to be a cyclical process. We need to understand our level of aspiration if we are to specify sufficiently stretching objectives.

    We need to understand the measures if our vision is to be realistic.

    During many workshops I have witnessed a combination of approaches to vision creation. Some senior managers make bold statements in relation to a burning issue for their organisation, and this becomes the focal point. Relevant measures in the industry drive the construction of the vision statement.

    Other situations (typically in public sector and educational leadership settings) bring forth aspirational statements that require subsequent translation into measures that are relevant to the sector. One example is that of ‘reputation’. “How can my educational institution improve its research reputation?”

    Another way of thinking about vision creation is to imagine how the objectives will actually be realised through management activities. Some leaders ignore this, maintaining a clear separation between the leadership and management concepts, concentrating on the ‘what’ and ‘why’, rather than the ‘how’ and ‘who’. 

However, leaders who consider how their managers will delegate can also gain some insight into the culture of their organisation. A by-product of vision creation may be the realisation that an organisation is too heavily micro-managed, and that leadership traits such as autonomy and empowerment need more emphasis during the working day.

This has substantial implications for the organisation as a whole, and may in fact warrant a clear steer, from the vision statement and objectives, that a culture change is required and has to happen. Of course, this can be challenging to measure, but it is itself an example of how a vision can become specific to its target environment.

    Such a public declaration of the need to change fundamentally can also be a powerful statement that important issues will be tackled head-on. This helps those who require reassurance whilst realising the vision, as well as clearly identifying those staff who are likely to experience difficulties fitting into the future state.

    Process-led vision creation

One situation that HEI managers can find themselves in the middle of is having responsibility for transforming the performance of an academic unit within an institution.

    This requires some appreciation that an overall institutional vision will exist, and therefore if you are to use a more tailored, local vision to help lead the necessary adjustments, it is important that explicit linkages are visible between the two vision statements.

    However, if your current institutional vision is not clear, or it is undergoing consultation and review, then you may need to prepare for more than one potential vision for your department.

    For instance, your institution might have signified that it wishes to ‘change mission’ in order to become financially stable.

    This might be achieved by an outlook that is more enterprising; however, a HEI, or a department within an institution can be enterprising in many different ways, from adopting a more commercial approach to direct engagement with the business community, through to the expansion of the teaching business into new markets (international, franchising, e-learning, etc.).

    The process of gathering data and performing some basic analysis as described in the Definition component will no doubt have strengthened the perspective you hold in relation to what potential can be achieved.

    But it’s also important to remember that your colleagues will probably not have completed the same exercise, and in many cases they will have no interest in doing so.

    One way to address this is by involving staff in the process of using data as early as possible, with a view to building a culture that naturally produces and consumes data for the purposes of continuous improvement.

    As a leader you need to translate the aims of the institution into operational plans, and in doing so describe work that is meaningful to the recipient. They need to understand what is required, if the vision is to be realised.

In such circumstances, vision creation requires a process that is focused upon the development of academic staff. It should be designed to align operational activities to strategic objectives, by making clear, negotiated declarations of what is to be achieved over a given period, and then taking an evidence-based approach to evaluate the results.

Briefly, the key principles are that:

• The process is focused upon development;
• Outcomes are negotiated and agreed at the outset;
• Individuals are held to account;
• Judgements are based on evidence.

    The first step is to understand what operational targets are relevant for your vision.

    Declaring the targets

    At this point, let’s assume that there is a vision in place. There needs to be some aspirational, future state that you can link operational targets to. This will provide the ‘story’ that staff can relate to, and as a result be able to identify their developmental needs.

    To help describe this process, we shall make use of an example. You may choose to keep a pen and some paper to hand, to make notes as you go along. This will make it much easier when you repeat the exercise with your own data.

    We are going to start with the vision statement and measures profile:

    “The department will have an international reputation for the high-quality provision of teaching and research, to prepare graduates for professional careers.

    It will attract highly qualified academic staff with international esteem, and be recognised as a leading contributor to educational, research and industrial partnerships.

    Significant social and economic impact will be delivered by cultivating industrial projects to sustain a diverse set of income streams.

    The vision will be achieved by:

    • Delivering a high-quality portfolio that is relevant to the needs of industry;
    • Creating a student experience that beats the sector median;
    • Developing peer credibility amongst academic staff by increasing external activities;
    • Creating and disseminating knowledge for social and economic benefit, both regionally and nationally.”

    Measures profile:

| Performance Indicator | Present | 5 years from now |
| --- | --- | --- |
| Income | 90% Teaching, 8% Research, 2% Other | 78% Teaching, 10% Research, 12% Other |
| Graduate employability | 65% | 85% |
| Student satisfaction | Bottom quartile (25th percentile) | Above median (50th percentile or above) |

    From the brief details above, there are some clear targets for the department to achieve, which are described as part of the ‘measures profile’. These are operational targets which are measurable, and have a timeline in which they are to be achieved (5 years).

    But what about the text of the vision statement? Some interesting phrases include:

    • “international reputation for the high-quality provision of teaching and research”;
    • “to prepare graduates for professional careers”;
    • “attract highly qualified academic staff with international esteem”;
    • “recognised as a leading contributor to educational, research and industrial partnerships”;
    • “significant social and economic impact”;
    • “cultivating industrial projects”;
    • “sustain a diverse set of income streams”.

    For a moment, think ahead to the future. You have successfully achieved the required transformation, by proactively managing the performance of your staff.

    How will you evidence the achievements, beyond the simple measures of income profile, NSS and league table position?

    What evidence would satisfy you of someone’s claim that they had established an international reputation in teaching?

    You might be interested in the amount of funded pedagogic research they have generated, or the number of peer-reviewed research articles that they have published.

    There might be evidence of close engagement with an international institution. An individual may represent the views of other academics as a member of an international panel of experts.

    In short, there are numerous ways in which engagement with an activity can help realise the collective achievement of an institutional/departmental target.

What is particularly important with academic staff is the need to identify activities that relate to their work, expressed in the way that they actually perform their work. Academic staff, in general, tend to resist micro-management, and can often be reluctant to engage with institutional targets.

    However, the translation of operational targets into relevant academic activities can be productive in terms of performance management.

    Here are some more examples of potential academic activities that could support realisation of the vision statement:

    • international reputation for the high-quality provision of teaching and research;
    • to prepare graduates for professional careers – proportion of students employed within 12 months of graduation;
    • attract highly qualified academic staff with international esteem;
      • Proportion of staff that hold research qualifications;
      • Proportion of staff that are active researchers;
      • Number of peer-reviewed articles published per head;
    • recognised as a leading contributor to educational, research and industrial partnerships;
      • Proportion of total income from funding grants/commercial projects/consulting arrangements/intellectual property licensing;
    • significant social and economic impact;
      • Number of projects with voluntary sector;
    • cultivating industrial projects;
      • Number of projects solicited per annum;
      • Total income received per annum from industrial projects;
    • sustain a diverse set of income streams;
      • Percentage mix of recurrent teaching and research income, commercial research income, industrial projects and other income.

    Reflection: How many of these could your staff successfully engage with tomorrow?

This stage can be both exciting and sobering. You will see potential in some staff that is not being fully utilised, but you may also realise that your vision is not achievable with the current resources.

    However, after considering the operational activities that could feed into a vision, you might also discover that different activities help you achieve your vision anyway.

    Reflection: With support and development, what could be realised over the next twelve months? The next five years?

    Thinking even further into the future can give an added check as to the realism of a vision. You don’t need to forecast an entire set of staff development plans for the department, but you could identify whether your vision will be achieved solely through staff development, or whether new staff will need recruiting.

If you are creating a vision for a departmental transformation, it is wise to look at the student enrolment projections from the central planning unit at the same time, as significant growth may affect your plans.

    Exercise

    Using the data that you have collected so far, construct a vision statement and measures profile for your own situation. Remember that you now have:

    • Data from the Awareness component; some description of the current state relating to the internal assessment of the organisational unit/individual staff member, etc. For a department you will know what proportion of staff hours are spent on teaching, research and administration for instance.
• Data from the Definition component; you will have gathered external data that enables the object of the transformation to be positioned relative to its peers. For example, you will have data from the relevant league tables, student survey data that is reported externally, research assessments, etc.

Usually, when the data is brought together, a more holistic picture forms. You begin to understand the character of the department, but you also begin to have a more informed view of your competitors, since you are looking at their data as well in the Definition component.

    Additionally, you will no longer be constrained by the measures that your HEI currently uses to measure performance. In the example above, the “proportion of staff hours … spent on teaching, research and administration” is described.

    You might decide that this, in conjunction with a financial income profile, may be the most concise means of reporting progress towards your vision. 

    Once you have constructed a vision and measures profile from the data you have collected, you are in a much better position to argue for its inclusion in your reporting.

    Don’t worry if you find that you need some more data to proceed. If that is the case, at least you know what to ask for!

    The mere observation that data is missing indicates that you have a need for it to satisfy and justify the vision that you are creating. Remember that you will spend a lot of time talking about your vision, so it is important that it’s based on a solid footing of data.

  • Digital manufacturing: start small

    Digital manufacturing: start small

    While you will find a relentless justification for the need to apply science to manufacturing management in these articles, it would be rather naive to think that just looking after the mathematics will solve all of the challenges. Digital transformation is complex and is enabled through people. People need leadership, particularly in organisations, and if you are to be successful at delivering your vision of digital manufacturing, there needs to be someone at the front who knows how to enact change.

Pilot schemes are an effective way of introducing potentially disruptive practices to an organisation. It is vitally important to demonstrate success at a small scale, so that there is evidence that the change works in your organisation’s culture.

The mathematics of analytics is not always difficult, and simple tools like spreadsheets can take the brunt of the daily workload. Training staff to apply this thinking to their activities can take time, but gets better with practice on your digital manufacturing journey.

    One effective way of ensuring that staff in the pilot become engaged is to make sure that either the measures of the improvement can be explicitly linked to their efforts, or that their efforts are directly measured.

In the same way that an SPC chart can show a machinist when a tool has lost its edge, an individual’s performance on activities can be measured as above, within, or below some control limits. This data is an essential ingredient of a successful digital manufacturing ecosystem.
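
As a minimal sketch of this kind of control-limit check (the baseline figures, recent values and three-sigma limits below are invented purely for illustration), a spreadsheet or a few lines of Python are usually enough:

```python
# A minimal sketch (not a full SPC implementation): establish simple
# Shewhart-style control limits from a baseline period, then flag any
# new performance figures that fall outside them. All numbers are invented.
from statistics import mean, stdev

# Baseline period used to establish the control limits (e.g. units completed per day)
baseline = [48, 52, 50, 47, 53, 49, 51, 50, 48, 52]
centre = mean(baseline)
sigma = stdev(baseline)
upper, lower = centre + 3 * sigma, centre - 3 * sigma

# Recent observations checked against the limits
recent = [49, 51, 62, 50]
for day, value in enumerate(recent, start=1):
    flag = "outside control limits - investigate" if value > upper or value < lower else "ok"
    print(f"Day {day}: {value} ({flag})")
```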

    Reporting such results enables staff to direct their efforts to the most pressing priorities, while also engendering a culture of continuous improvement.

    The linking of production activity to monitoring of manufacturing objectives helps develop a culture whereby operations are of interest, and studied by all staff, rather than leaving it all to the production planning department.

  • Human Resource Analytics anyone?

    Human Resource Analytics anyone?

    People are the key asset of any business organisation. For a business to be profitable we need to manage resources effectively, especially Human Resources. Human Resource Analytics can help.

    Years of management practice, organisational research and psychology, and explorations into the characteristics of leadership have generated vast swathes of knowledge that have been, and will continue to be, discussed and debated at length.

Approaches to management can vary from the very scientific, using quantifiable processes, to the intuitive, focusing on human-centric behaviours and the interplay between people in their daily working lives. Some managers are highly trained in a variety of approaches, whereas others have a tendency to learn from experience.

    And of course, there are people that manage their staff using a collection of approaches, that might be context or time-dependent.

    Irrespective of your own tendency, it is generally accepted that measurement is a vital function in business. No matter how nebulous our work environment might be at times, we still need to ensure that the organisation is profitable now and in the future.

    So, when a business cannot make a profit, it is no longer sustainable. The default action in such circumstances is to ensure that there are measures in place, and that these are used to target the source of the challenges.

    When it comes to people, measurements can be motivational and also demoralising. The context within which the measurements are deployed has a large bearing on this, but it can also be the way that the measurement and monitoring is communicated.

    An experienced manager understands that their staff are all individuals, and that there should be tailored approaches to each of them.

Problems can arise when attempting to introduce a more quantifiable management approach, which invariably uses measures. Such initiatives are fraught with difficulty. Staff can react in unpredictable ways, or the measures might stimulate unintended behaviours that undermine any well-intentioned objectives.

    But, if an over-zealous manager can disrupt the daily goings-on with measurements and targets, think of the chaos that is possible if they graduate to the use of Human Resource Analytics.

    The ability to automate reporting, and calculate descriptive statistics on employee performance can contribute to a potentially toxic environment, which is the last thing that we need.

    One of the promises of analytics is the ability to use data to prescribe behaviour – model different outcomes from the existing data and then make an outcome happen – and this is referred to as prescriptive analytics.

[Figure: Gartner analytics maturity model, shown in the context of human resource analytics]

    Prescriptive analytics provides foresight, and it assumes that you have mechanisms for insight (diagnostic analytics) and hindsight (descriptive analytics) already in place.

    This journey towards behaviour change is something that people managers acquire in time, with experience. It is therefore attractive to make this journey as quickly as possible, and if tools can help, all the better.

    So, if you have a problem with sickness absence, you can try and ask a few questions to see what might be going on. You are relying on the answers to be informative and accurate. You can’t necessarily ask all of the questions that you might want to, and you are left trying to piece together what the underlying problem might be, based on your own mental models of the staff and how they behave.

    This is clearly a case for diagnostic analytics, to help you gain insight as to what the root causes of the sickness absence are. You can’t hope to start predicting behaviour until you understand the data upon which you will base your predictions.
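
As a rough sketch of what such diagnostic analysis might look like in practice (the team names, recorded reasons and figures below are entirely invented), a simple aggregation over absence records is often a useful starting point:

```python
# A rough sketch of diagnostic analytics on sickness absence records,
# assuming they are already available in a simple tabular form.
# Team names, reasons and figures are invented for illustration.
import pandas as pd

records = pd.DataFrame({
    "team":   ["Assembly", "Assembly", "Packing", "Packing", "Assembly", "Logistics"],
    "reason": ["stress", "back pain", "cold/flu", "stress", "stress", "cold/flu"],
    "days":   [5, 3, 2, 4, 6, 1],
})

# Where are the absence days concentrated, and for what recorded reason?
by_team_reason = (
    records.groupby(["team", "reason"])["days"]
           .sum()
           .sort_values(ascending=False)
)
print(by_team_reason)
```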

    Staff engagement is another measure that is prominent in larger organisations. How will you engage your staff better with the data you receive from an annual survey?

The answer is complex. We need to make sense of data that is probably not yet sufficiently joined-up and accessible. The data exists, but it isn’t “to hand”. As managers we need to ensure that our enthusiasm for improving operations through the use of data does not lead to Management By Objectives (MBO) on steroids; we don’t want to create an over-bearing environment where everyone is functioning only to serve the measures. This stifles creativity and creates stress (which might be the cause of the sickness absence rate…).

    But we have evidence of predictive analytics approaches making tangible improvements to sales, logistics, plant maintenance and other industrial operations. Why not human resource analytics as well?

  • Digital transformation: an approach for SMEs

    Digital transformation: an approach for SMEs

    All the hype from Industry 4.0 creates a lot of impetus for SMEs to ask how they can make it work for them. SMEs are focused on doing more with less, and are motivated to respond to any call to ‘reduce the productivity gap’.

    Management approaches such as lean manufacturing can achieve a lot with existing plant and machinery, and there are countless case studies that demonstrate what can be achieved with pencils, paper, data analysis training, and persistent leadership that builds a ‘can-do’ culture of continuous improvement.

There is often a sense of frustration from SMEs who can see that they could benefit from a particular technology, yet the cost (either capital or operational) is prohibitive, or they just cannot afford the downtime to implement it.

    In such cases, it is not clear for an SME what the route forward for improvement is, and they can get stuck in a rut with no obvious solution.

    SMEs are sold a vision from the technology vendors that perhaps seems unattainable; the potential benefits of the technology can only be realised in a cooperative, supportive environment, and this is certainly not quick to build.

So, what SMEs need is a framework that can explain what stages need to be in place so that their digital transformation ambitions are successful.

    This framework should contain three key aspects as follows:

1. Understanding what capability is required. Technological know-how is important, but perhaps of more importance is understanding the business requirement for change first, and using this to drive a set of technological and environmental development requirements. Lean is a good way both to understand the existing situation and to see where the next benefits will come from, and it is essentially a human-centric approach that yields tangible results.
2. Once the business imperative is understood, we are in a much better position to evaluate what actual technological innovation is required. A technology vendor might well encourage a large-scale programme of IIoT adoption, but this is exactly the sort of behaviour that makes SMEs sceptical of Industry 4.0, as the upfront costs are often too high. Choosing a process to transform, and the judicious use of simple sensors, localised analytics and dynamic data visualisation, can go a long way to actually realising the benefits identified in the requirements phase (a minimal sketch of such localised analytics follows this list).
3. Look for opportunities to scale. This is perhaps the most exciting stage. Now that a few processes have been improved, it is likely that a) we shall start observing other areas of the business that are becoming stretched, and b) we are also beginning to broaden our outlook and see new opportunities for growth that did not exist before, as a result of more collaborative operational activity. This is where we can exploit the fact that we not only understand our processes better, but can also start to think about automation and the intelligent delegation of process monitoring to the machines.
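
To make ‘localised analytics’ concrete, here is a minimal sketch of the sort of calculation a single machine could run locally: a rolling average of a sensor reading with a simple alert threshold. The readings, window size and threshold are invented purely for illustration.

```python
# A minimal sketch of 'localised analytics' for one machine sensor:
# a rolling average with a simple alert threshold. Readings, window
# size and threshold are invented for illustration.
from collections import deque

WINDOW = 5          # number of recent readings to average
THRESHOLD = 75.0    # alert if the rolling average exceeds this (e.g. degrees C)

readings = [70.2, 71.0, 72.5, 74.1, 76.3, 78.0, 79.4, 80.1]
window = deque(maxlen=WINDOW)

for t, value in enumerate(readings):
    window.append(value)
    rolling_avg = sum(window) / len(window)
    if len(window) == WINDOW and rolling_avg > THRESHOLD:
        print(f"t={t}: rolling average {rolling_avg:.1f} is above {THRESHOLD} - flag for attention")
    else:
        print(f"t={t}: rolling average {rolling_avg:.1f}")
```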

    The outputs from each of the above steps tend to provide additional insight that motivates continued development. And perhaps more importantly, each decision to purchase equipment is justified by a specific problem that is to be addressed.

    Digital transformation is often described as a ‘top-down’ approach, where executive leadership needs to ‘buy-in’ and support the agenda.

    In SMEs the executive is often the workforce, and they need a bottom-up approach to make it work!

  • Is ‘Lean’ key to Industry 4.0 adoption?

    Is ‘Lean’ key to Industry 4.0 adoption?

    Lean methods in manufacturing concentrate on the elimination of wasteful activities and resources that do not add value to the final product. If something isn’t required, why pay for it?

Industry 4.0 is about using digital technologies to enhance industrial operations, whether it be design, manufacturing, services, or the like.

    So, a lean description of a process might well be an excellent specification for an Industry 4.0 compliant process.

    Since some businesses struggle with the ideas around the adoption of Industry 4.0 technology, maybe we should start with processes that are already lean?

    In fact, are lean methods a potential way forward for the introduction of industrial digital technologies?

    Lean methods have been around a while, are generally people-centric when it comes to developing solutions, and there are countless case studies of organisations that have succeeded in eliminating waste from their operations. Perhaps a lean approach might provide a first step for new technology adoption for those businesses that remain sceptical of Industry 4.0.

  • Essay: Managing research and teaching

    Essay: Managing research and teaching


    Abstract

This article explores the challenges of managing research and teaching in UK Higher Education, by examining the variability of boundaries that are drawn around such spaces. Changing policy in the UK is provoking Higher Education Institutions to respond in different ways, to address the emergence of quasi, and ultimately, free market conditions. In particular, we examine how differing management and leadership cultures, namely managerialist and collegial, can impose more or fewer constraints upon research and teaching management, as both discrete and combined activities. Furthermore, the potential interplay between research and teaching is examined with a view to exploring a new model of university management that has departmental leadership as a core component of a more de-coupled strategy. Finally, we consider the implications of such thinking upon institutional management and leadership, and conclude that the emerging complexity in the UK HE sector demands a more adept leadership culture that embraces emergence and the development of a holistic understanding of research and teaching.

    1 Introduction

This article considers the management of research and teaching in terms of the constraints that are often imposed upon each set of activities. This is a complex, challenging issue for university managers, and ultimately, institutional leaders. Firstly, a brief synopsis of relevant events in the development of Higher Education in the UK is presented, to set the context for the rest of the discussion. Pertinent concepts are then described, before the limits upon the management of research and teaching are explored. Finally, some implications for university management are described. We begin by considering how the United Kingdom (UK) Higher Education (HE) sector has been developing of late.

    2 The higher education context

Universities have been considered to be collegial institutions, consisting of scholarly academics who create and disseminate new knowledge. That knowledge is imparted to a community of students, who after a period of time acquire a degree and move on within the wider economic community. The scholarly pursuit, perhaps as an end in itself, would be a key motivation in such an environment. Government policies that apportion funding to universities would insulate a Higher Education Institution (HEI) from accounting for its activities, unlike private industry, which needs to create financial profit now and in the future.

Those who are employed in a UK HEI understand that this halcyon description lies some distance from the reality. For some time now, UK Government policies have steadily influenced HEIs by imposing different sets of conditions upon how a university might function. An emerging need to demonstrate that public funds are being spent wisely and appropriately has led to substantial effort being expended upon the quality of an HEI’s provision. The UK Quality Assurance Agency (QAA) has substantial influence over the way in which a university manages its processes, and when this is combined with strictly enforced guidelines from the Higher Education Funding Council for England (HEFCE), a university can find itself needing to react to these constraints.

    The objectives of HEI management thus become more defined than the traditional, nebulous pursuit of knowledge. Queries from funding bodies require managerial systems to provide the requisite information. Activities that were once undefined, become scrutinised in terms of resource consumption, and whether the activity itself provides `value’. Indicators of performance become more overt, with league tables appearing that rate institutions on their relative results for teaching, research, employability and `student experience’. The recent trend in the introduction of partial fees, and latterly, whole fees (albeit capped at certain thresholds at the time of writing) has introduced a quasi-market environment in which UK HEIs function (Le Grand and Bartlett, 1993).

The requirement to report upon performance sharply opposes the more collegial culture of HEIs. As institutions begin to focus upon the minutiae and install systems to manage performance, efficiencies in the way individual staff work are immediately called into question. Activities that were once regarded as part of the norm are now identified as being wasteful or redundant when considered at the micro level. Staff find that, as a direct consequence of the systems being measured, their own performance is assessed and reported, leading to an implicit pressure to do more with less (Smyth, 1995; Cuthbert, 1996). In times when external funding is reduced, that implicit pressure becomes explicit as academic managers direct and control the activities and working conditions of academic staff (Trowler, 1998).

From a cultural perspective, there appear to be HEIs which are more ready to accept managerial practices than others. Pratt (1997) identifies the general polarisation between institutions that existed before 1992 and those that were formed post-1992. Universities that existed prior to 1992 had traits of a more collegial culture; a model of governance, rather than command-and-control management, was more evident in their daily operations. Conversely, as polytechnic institutions became able to use the title of university post-1992, the traditional bureaucracy associated with Local Authority management tended towards a more actively managed culture, though not to the extent of a private company. McNay (1995) observed that the generalised differences between the two parts of the sector have started to diminish in the light of changing funding policy.

    Specifically, both parts of the sector are operating under the same funding regimes, and are observed and reported upon by identical agencies such as QAA.

    3 Managing operations

As the HE quasi-market has developed, universities have undergone transformations in an attempt to adjust to the more explicit demands that are placed upon them. The increased desire to act rationally is one example of how internal decision making has been affected by economic pressures.

    University managers have used private sector management approaches as inspiration for their re-interpretation in the HE sector, which is often referred to as new managerialism (Reed and Anthony, 1993; Clarke and Newman, 1994; Deem, 1998). We now consider the two most significant spaces within HE, research and teaching, and explore the limits by which pertinent activities within those spaces can be managed. First of all, we shall consider the research space.

    3.1 Managing research spaces

    To understand the constraints of research requires some understanding of what research is, if only to clarify its distinction from teaching. For the purposes of this discussion we assume some basic definitions from Bushaway (2003) as follows:

• Research. Using a systematic process of enquiry to undertake some original investigation, leading towards new knowledge or new understanding.
    • Research leadership. Understanding the research context, setting goals and enabling research to be directed.
    • Research management. The control and coordination of research activities to ensure its correct operation.
    • Research coordination. Managing resources in relation to research objectives, maintaining appropriate accountability within a university.
    • Research planning. The creation of a research strategy that is congruent with the aims of the university.
    • Research support. Creating and maintaining an environment in which research activities can flourish.

    Furthermore we assume that research is funded by an external source, and therefore other forms of research activity that a university will typically undertake, such as scholarship (Dearing, 1997), the application of knowledge, and the development of learning and teaching materials, will not be considered within the scope of this discussion.

The management of research requires an appreciation of project and financial management, quality assurance, logistics, human resources, administration, marketing and networking (Bushaway, 2003). Since it is externally funded, key stakeholders demand progress to be reported and results to be evaluated. All of these tasks must also be auditable. Thus, the assessment of performance is an important activity for the management of research, and whilst research might be considered a creative discipline, there is much that must be managed if the discipline is to be a sustainable income stream for a university.

    It is the creative part of research however, that is influenced by the need to manage and account for research performance. As funding councils and bodies demand more tangible evidence of `impact’, whether it be social or economic, research is ultimately affected by the thrust of evaluation. `Blue sky’, high risk, high reward, research is becoming increasingly difficult to conduct, as funders become more prescriptive with their desire for evidence.

    As such, whilst funded research generally lends itself to managerial activities as the measures are generally well-defined and apportioned to a finite budget, the very nature of the requirement to demonstrate tangible outcomes, limits opportunities to take risks and conduct truly innovative investigation.

    To summarise, externally funded research is actively managed and sits comfortably in an environment that measures, monitors, reports and manages performance. Management of the creative aspect is somewhat different and thus presents a boundary beyond which management activity is less productive and may even harm outputs.

    3.2 Managing teaching spaces

At first sight, the management of teaching spaces would seem to be determined by finite sets of resources such as facilities, staff, programme timetables, specialist equipment, and length of module or programme. Within this there is the knowledge capability of each staff member (what they can teach), and the interplay between different subjects upon a learner’s (and an academic’s) timetable. For example, an academic may teach two closely related modules while another teaches three disconnected subjects, with a clear difference in the workloading between the two situations. Other, discrete constraints concern how much time staff can make available; teaching duties assume the inclusion of other activities that are distinct from teaching itself, such as administration, pastoral care, attendance at departmental meetings, marketing and open days.

The management of these constraints can often focus around a normative currency, which is usually time-related in terms of the number of `contact' hours. Contact refers to the number of hours a tutor spends with students face to face; immediately this does not take account of electronic interactions and communication, which, as technology becomes ever more pervasive, are an increasing part of the academic’s working life. Using the currency of contact, systems emerge whereby other activities are converted into `contact hours', so that they can be included as part of an overall assessment of an individual’s workload. The manifestation of all the teaching constraints may result in a delivery norm of a 1-hour lecture and 2-hour tutorial per week, per module, for example.

    The interpretation of this varies in relation to management style, as well as the characteristics of the academics being measured. Such styles range from trust-based laissez faire approaches, through to more prescriptive models that attempt to account for all activities. The reporting of teaching outcomes is challenging, since it is considered to be largely based upon qualitative data, yet there is often a demand to report it quantitatively in order to `benchmark’. The evaluation of teaching itself is a complex topic, especially when we consider the ethical constraints that are imposed upon studies of teaching practice. Dearlove (1997) argues that resource constrained teaching activities can be managed effectively, but the remainder can only be facilitated.

Thus, the management of teaching (and teaching-related activities) is often interpreted as the management of performance and culture, in response to the conflicting demands of the external HE environment as discussed earlier. In contrast to teaching delivery, scholarly activities are perhaps more nebulous to account for, and there is a tendency either to assume that an academic makes a professional judgement as to the hours they invest, or to use a nominal block of contact hours (referred to as self-managed time, which sits outside of teaching periods) for the purposes of representing workload. There may of course be discrete activities, such as writing an academic article, authoring a book, writing a funding bid or conducting a scientific experiment, for which some attempt can be made to forecast the time required.

    In particular, a pedagogic experiment may be part of some externally funded work, where constraints were imposed at the design and planning stages of the bid application. Such work may be assumed to be more defined.

    As such, the complexity of the teaching role means that significant portions of the workload are both variable in scope and size, and challenging to account for. How does an academic manager assess the teaching quality of an academic? Assessment characteristics might be the number of complaints received, or the average grade profile of the student cohort, or even the overall student satisfaction as reported from an end of module questionnaire.

However, all of these measures are open to manipulation, and they can also be considerably influenced by external factors, meaning that they cease to be reliable. For instance, Key Performance Indicators (KPIs) for grade performance (the percentage of students who achieve 2:1 honours or above) do not take account of the ability of a particular cohort. In a climate where students are demanding more specialist programmes, smaller cohorts will demonstrate more volatile performance statistics. This complexity, and the arguments within it, create an extremely challenging environment for the academic manager of teaching spaces. Academic staff understand too well the relative difficulties of attaching measures to teaching quality, satisfaction, retention, progression and achievement. Such understanding leads to frustration and tension when the measures report adverse conditions that may be beyond the influence of the staff, and of course, staff may respond to the role of measurement by performing strategically.

There is clearly a conflict, then, between the limited ability to measure, monitor and control sizable aspects of the teaching role, and an environment that demands its effective management.

    4 Managing research and teaching spaces together

Whilst each space has its own constraints, there are also limits imposed when the two activities are combined. Indeed, universities have a need to consider the two spaces not only as separate entities, but also as the fundamental constituent activities of an HEI. It follows that the complexities of managing the spaces separately are further complicated when they are brought together.

    The character or self-perception of an institution may impose constraints upon these activities. The simplest example is whether an institution regards itself as research or teaching intensive. Since universities are typically large organisations, that are composed of smaller units, the relative achievements of a particular unit may appear to be at odds with the overall perception of the institution. 

For instance, a small department that has aspirations to improve its research outputs and reputation may decide to submit grant applications, and therefore will be actively promoting the inclusion of research as part of its strategic plan. In a teaching-intensive university there may be countless hurdles to overcome, since the operations will tend to reflect the predominant activities, which may not be conducive to funded research.

Dedicated research administration and support may not exist, for instance, or may have insufficient capacity for certain types of projects. Academic staff time will not formally be available, since the HEFCE funding received is restricted to teaching duties and is not for the pursuit of research monies. As a consequence staff may invest their own time to write bids, until they achieve their first successful grant. This grant will then be used to `buy them out' of teaching, or in other words, to spend less time with students. This behaviour reinforces the divide between research and teaching, especially when teaching colleagues see research-active colleagues’ careers progress at a greater pace.

Loosely-coupled departmental structures, together with collegial tendencies, might be ideal conditions for teaching excellence. They are, however, environmental conditions that are less than ideal for the monitoring of measurements, such as costs. Additionally, they are tolerant of poor teaching quality, since it is difficult to directly challenge and manage performance that is below what is expected. Clearly, in an age where the control of costs is mandated by external factors from free-market or quasi-market forces, there is a boundary to be placed upon laissez-faire cultures (at the potential expense of teaching and research quality). Conversely, managerialist practices can stifle creativity and engender educational approaches that are based on training models, rather than fostering learning through exploration and the creation of new knowledge.

    Many academic staff feel that a hard boundary exists between research and teaching spaces, even though staff may be expected to contribute to both spaces in pursuit of the university’s mission. One such reinforcement of the boundary between research and teaching is that caused by the differential in funding for either activity. 

    Resources for teaching have been steadily reduced and replaced with systems for Quality Assurance (QA). These systems are discrete from teaching activity and have served to considerably increase the administration workloads of academic staff (J.M. Consulting Ltd, 2000), whilst demonstrating no obvious support for research (Brown, 2002). QA systems are essentially managerial, causing tension when the activity to be observed does not lend itself towards direct comparison with `benchmarks’.

There is an irony that after successive years of research funding being awarded through the UK Research Assessment Exercise (RAE, now the Research Excellence Framework, REF), teaching-intensive universities, which generally have not achieved the research esteem of research-intensive universities, are now motivated to acquire esteem and compete for funding with the HE sector at large. The motivation for this change in behaviour has been, in part, the publishing of university league tables such as that of the Guardian newspaper (http://www.guardian.co.uk/education/table/2011/may/17/university-league-table-2012), which attempt to indicate the relative performance of institutions against each other. Having a value greater than zero in the research column is one strategic way of propelling an institution further up the league.

    However, this change in strategy means that tensions that may have existed amongst academics who fight to continue with their own research against a backdrop of a full teaching workload, must now become more exposed within departments, faculties and ultimately the institution itself.

    Clearly, institutions that decide to undergo a transformation have made a conscious choice to re-engineer their culture, and how that culture is managed. This has implications for the university management, who must recognise the limits of research and teaching management, both separately and together, with a view to pursuing a successful strategy.

    5 Implications for university management

    Within the quasi-market (Le Grand and Bartlett, 1993) of UK HE, external factors such as reduced funding, quality assurance compliance and reported performance through league tables and student satisfaction (National Student Survey), mean that HEIs have a need to manage and improve performance. As discussed earlier, the management of research and the management of teaching exposes different limits. 

Research has a history of having to be accountable to external stakeholders, and therefore its management has developed upon a more rational basis. Teaching, however, has been funded differently, in a way that has largely insulated expenditure from free-market volatility. When cuts in teaching funding are announced, they are typically met with some objection. Within this funding model, certain activities relating to the consumption of resources are straightforward to manage.

    Some aspects, generally related to the quality of teaching and the `learning experience’, are more difficult. Performance management in university culture is a very challenging topic, and one which has significant implications for HEI management.

    The first implication is that the university must have a clear understanding of its purpose. The categorisation of `research-intensive’ and `teaching-intensive’ will become less relevant as institutions attempt to performance manage both research and teaching to respond to external measurements.

Indeed, institutions that still have collegiate approaches to managing teaching, alongside managerialist approaches towards research, may have a more challenging time in the emerging marketplace. Post-1992 institutions, with histories of bureaucratic teaching and QA management, may more readily adopt the disciplines of managed research activity.

    However, the managerialist approach is essentially `top-down’ and this presents a risk that the collegial, creative environment where ideas emerge and flourish, will be silenced by KPIs and committees. 

    Shattock (2003) argues that the environmental conditions for change are more likely to exist in a university that fosters a more holistic, emergent approach to strategic management. Since both research and teaching are two fundamental constituents of a HEI, then university management must consider the institution’s strategy in a holistic manner. This contrasts with institutions that have separate research and teaching strategies (with no obvious links between the two (Gibbs, 2002)), managed by separate Pro Vice Chancellors.

Henkel (2000) advises that the identities of institutions have developed over a long period, and therefore they may offer considerable resistance if the future appears to be fundamentally different. Even if the perspective exists that research and teaching may be separate islands in an institution, the creation of explicit, positive links between the two is not easy to manage (J.M. Consulting Ltd (2000), referred to in Locke (2004)). Dearlove (1998) suggests that a close understanding of how the culture functions, especially its strengths, will be instrumental for university leadership to consider through a period of transformation.

From an institutional perspective there should be strategies for research and teaching. However, the implementation of these strategies is less complex if there are explicit links between them; separate PVCs, with disconnected strategy documents, only create difficulties for departmental management. Therefore there should be explicit, appropriate links between the two strategy documents (if one, unified strategy is a bridge too far), indicating their mutual contribution towards the mission of the institution. For instance, `teaching informed by research activity' is as important a statement as `the processes of research informing the teaching'. Scholarship is too broad and contested a term to be the only documented nexus (Neumann, 1994) between teaching and research, and relying on it assumes that it is interpreted consistently across all functions.

    Therefore, the facilitation of an emergent environment where the holistic strategy is described by the university’s executive, to be interpreted and operationalised by departmental units, should be a key aim for an HEI. A university that has a culture of flexibility, being able to adapt to emerging trends, will be better placed to accommodate medium-term transformational objectives such as engaging with funded research for the first time.

    Understanding the core purpose can then set the scene for departments to scrutinise their own means of achieving the institution’s goals. To prevent departments from crudely interpreting the university mission, there is an implication for the institution’s Human Resources function, which must address a historical disparity between the careers of research active academics and teaching academics (Locke, 2004). A related matter is that of recruitment; institutions may choose to be more selective with the appointment of new staff, to align better with the emerging values (Locke, 2004).

    The adoption of an emergent approach means that leadership should not be confined to the senior management tiers. For departments to be able to interpret the institution’s goals, and thus develop their own strategic response, leadership roles must be cultivated at departmental level also. These leaders will manage, support and facilitate (Middlehurst and Kennie (2003) referred to in Locke (2004)) the real agents of change – the academics – in order to develop responses to tensions between the core components of university operations, research and teaching. This may inform the conversations around scholarship; what it is, and what it means in the context of the academic role.

    Whilst there may be a conceptual linkage for scholarship to act as the nexus between research and teaching (Elton, 2005), it is for the practitioners themselves to work this out in their own context.

    6 Conclusions

    The question as to whether there are limits to the management of research and teaching is a pertinent one for UK HEIs at this time. New managerialism can be seen as a way of ‘grasping the nettle’, and undoubtedly some aspects of a university’s mission, namely funded research and the resource management of teaching, appear to be suitable candidates. In fact, institutions are already demonstrating evidence that they have adopted this approach.

    However, the realisation that managerialist, top-down approaches may also have negative consequences for the other functions of a university – high quality, inspirational teaching, scholarship and research creativity – has severe ramifications for the approach that university management should take.

    It would seem that a leadership model of trust should be adopted, whereby an open and honest discourse is held to understand the current identity of an institution, as well as a future identity to which the university might aspire. This would then be transposed into a set of goals to be interpreted at departmental level, to reflect the cultural and subject discipline norms and the capabilities of the staff, and to indicate some of the uncertainties for the future. The university Human Resources department must also prepare to facilitate the development of departmental leadership, fostering an environment where leadership talent is nurtured, whilst developing and enforcing policies that make staff recruitment more agile and a better fit for the needs of departments.

    In conclusion, as HEIs operate in an ‘age of supercomplexity’ (Barnett, 2000), a suitably adaptable approach to management is required. Honouring collegiality will demand leadership at all levels of the institution, to manage effectively a shared understanding of what the core function of a particular university is. This understanding will be derived by considering the limits of research and teaching management as a holistic entity, without resorting to a corporate management approach to performance measurement.

    References

    Barnett, R. (2000). Realising the university in an age of supercomplexity. Society for Research into Higher Education. Open University Press, Buckingham.

    Brown, R. (2002). Research and teaching: repairing the damage. Exchange, 3:29–30.

    Bushaway, R. (2003). Managing Research. Managing Universities and Colleges: Guides to good practice. Open University Press and McGraw-Hill Education, first edition.

    Clarke, J. and Newman, J. (1994). The managerialisation of public services. In A. Cochrane and E. McLaughlin, editors, Managing Social Policy, pages 13–31. Sage, London.

    Cuthbert, R., editor (1996). Working in Higher Education. Open University Press, Buckingham.

    Dearing, R. (1997). Higher education in the learning society. Technical report, The Stationery Office, London.

    Dearlove, J. (1997). The academic labour process: From collegiality and professionalism to managerialism and proletarianisation? Higher Education Review, 30(1):56–75.

    Dearlove, J. (1998). The deadly dull issue of university administration? Good governance, managerialism and organising academic work. Higher Education Policy, 11(1):59–79.

    Deem, R. (1998). New managerialism and higher education: the management of performances and cultures in universities in the United Kingdom. International Studies in Sociology of Education, 8:47–70.

    Elton, L. (2005). Scholarship and the research and teaching nexus. In R. Barnett, editor, Reshaping the University: New Relationships between Research, Scholarship and Teaching, Society for Research into Higher Education, chapter 8. Open University Press, Maidenhead, first edition.

    Gibbs, G. (2002). Institutional strategies for linking research and teaching. Exchange, 3:8–11.

    Henkel, M. (2000). Academic Identities and Policy Change in Higher Education. Jessica Kingsley, London.

    J.M. Consulting Ltd (2000). Interactions between research, teaching and other academic activities. Technical report, Higher Education Funding Council for England, Bristol.

    Le Grand, J. and Bartlett, W., editors (1993). Quasi-markets and Social Policy. Macmillan, London.

    Locke, W. (2004). Integrating research and teaching strategies: Implications for institutional management and leadership in the United Kingdom. Higher Education Management and Policy, 16(3):101–120.

    McNay, I. (1995). From the collegial academy to corporate enterprise: the changing cultures of universities. In T. Schuller, editor, The Changing University, pages 105–115. Open University Press, Buckingham.

    Middlehurst, R. and Kennie, T. (2003). Managing for performance today and tomorrow. In A. Hall, editor, Managing People, Society for Research into Higher Education. Open University Press, Buckingham.

    Neumann, R. (1994). The teaching-research nexus: applying a framework to university students’ learning experiences. European Journal of Education, 29(3):323–339.

    Pratt, J. (1997). The Polytechnic Experiment, 1965-1992. Open University Press, Buckingham.

    Reed, M. and Anthony, P. (1993). Between an ideological rock and an organizational hard place. In T. Clarke and C. Pitelis, editors, The Political Economy of Privatization. Routledge, London.

    Shattock, M. (2003). Managing Successful Universities. Open University Press, Maidenhead.

    Smyth, J., editor (1995). Academic Work. Open University Press, Buckingham.

    Trowler, P. (1998). Academics, Work and Change. Open University Press, Buckingham.

  • The manager as coach

    The manager as coach

    Coaching is a popular topic, particularly in the world of business. Executive leaders employ personal coaches to have developmental conversations, to explore hypothetical scenarios, and to encourage self-awareness. It follows that the practice has expanded, with many people deciding to make careers of coaching, as greater numbers of individuals use coaching services to improve their own development.

    One of the defining aspects of business/life/personal coaching is the absolute focus upon the processes of coaching. The coaching engagements are typically short term, perhaps six separate sessions for instance, and therefore there is a lot of emphasis on developing techniques to establish ‘rapport’ quickly between the client and the coach.

    Coaches that focus on helping clients solve their own challenges need not know anything about a particular business domain or industry; in fact the fresh perspective may be a significant advantage in terms of lateral thinking. In addition, each engagement is clearly identified – the session will be dedicated to coaching – with no chance of conversations being polluted by the most recent managerial crisis.

    Such coaches practise the skills of coaching conversation, using powerful questions to challenge and probe potential barriers in the client’s thinking. The conversation may be augmented with specific tools that can help clients gain a new outlook on a situation.

    What these coaches don’t have therefore, is a line management responsibility for the client. In fact, they are employed by the client so there is a relationship of service. The coach is not required to appraise the client with a view to determining any actions other than to improve the client’s performance. Finally, the coach has a defined engagement with the client that is usually temporary in nature. Once the required development has been undertaken, the relationship ceases.

    In sharp contrast, the coaching manager has a line management responsibility for the coachee. They are required to appraise the coachee at least annually, and the outcome of that appraisal may be linked to career progression. In addition, the relationship normally is expected to be of a more significant length.

    With these characteristics in place, how does this affect the manager’s ability to coach?

    Reflection: Think beyond the use of open questions in your dialogue with staff. What difficulties might you envisage if you adopt a more developmental approach towards your staff?

    It’s important to realise that a coaching manager has to adopt a different outlook to a ‘pure’ coach. Coaching practice is different to having responsibility for staff and operations. Tasks have to be completed on time, to the correct standard and in an efficient manner. There is bound to be directive language in the requisite conversations, otherwise short-term objectives might not be met.

    In terms of the annual appraisal, or any event where a manager has to evaluate the performance of a member of staff (which is more common in project-oriented environments), there is a fundamental tension between making judgements and coaching.

    When we appraise staff, we are placing the focus upon the objectives of the organisation, rather than the needs of the individual. As we have discussed so far, whilst coaching could be the preferred way of supporting the development of individuals, it can only at best follow on from an appraisal.

    The fact that an appraisal conversation can be less conducive to coaching means that the coaching manager would be wise to clearly identify the context of the discussion up front. So, be clear when you are making organisational judgements based on the needs of the institution, and be clear when you are coaching.

    Developing a coaching mindset

    The decision to coach is relatively easy to make. There are simple practices that can be adopted such as asking open questions and ‘active listening’ that can yield a lot of value. As managers in challenging environments we can forget that we are immersed in the present and consumed by tasks that need completing. The time to pause and reflect can disappear and therefore our opportunities to learn are diminished.

    However, adopting simple changes in behaviour does not in itself result in a coaching mindset. Some extra value will be obtained, but ultimately there is a limit to what can be achieved by listening and questioning as a line manager. Remember, the pure coach does not have direct responsibility for your own development; they merely help you identify the need for it.

    As a coaching manager you must truly want to help people. Managers who see their staff as instruments for their own advancement will struggle with developmental coaching. They’ll adopt some techniques that make them perform better in the long run than a directive manager, but the real power of coaching will not be realised. 

    Where some managers can go wrong is that they want to help staff, but their help actually constitutes advice and directive instruction.

    Every time you use a directive approach towards your staff, you are inhibiting the opportunities for them to think for themselves and possibly solve the problem in the future, without bothering you!

    Managers that adopt a coaching mindset tend to look within themselves and use the coaching of others to increase not only the coachee’s self-awareness but also their own. They will serve their staff by genuinely supporting their development, and they will strive to be helpful rather than evaluative. Along the way, a coaching manager will develop an individual coaching identity that is based upon their own personal values.

    Reflection: In your current context, what will be possible as a result of you adopting a coaching mindset?

    The ‘un-coach-ables’

    There is one further difference between personal coaching and the practice of the coaching manager. Executives hire coaches out of choice. They are wanting help with something and the coach is brought in to assist.

    Imagine the scenario where you are introduced to a coach upon the recommendation of your line manager. In fact, your line manager has read about the benefits of coaching and feels that you will be able to perform better after coaching.

    How enthusiastic are you likely to feel about this?

    This example reinforces the perception that performance management is a remedial task and that, by association, coaching is too, since it is a method of improving an individual’s performance. This is problematic for two reasons.

    First, as we have discovered, coaching is related to learning. If the recipient doesn’t want to learn, they are unlikely to embrace coaching. This might not be a conscious decision on the part of the individual; they may not be sufficiently self-aware to recognise that their actions are creating challenges for others.

    Second, coaching is often deployed as a means to ‘fix’ people. If there are specific weaknesses in an individual’s performance, it could be that the individual will not be receptive to coaching either.

    As a result, a considerable amount of time and effort is expended attempting to coach the ‘un-coach-ables’, rather than supporting able staff who are willing to grow.

    Coaching should not be viewed as a panacea. The coaching manager will achieve far more by concentrating coaching upon receptive staff, so that their talents and abilities can be realised. In the longer term, fostering a coaching culture will create an environment whereby those who have been coached adopt the necessary mindset to coach others, reducing the impact of individual poor performance upon the performance of the collective.

    Exercise

    Reflect back over the conversations you have had over the past working week.

    • In what proportion of these discussions did you provide advice?
    • What were your reasons for providing advice?
    • What were your reasons for asking questions?
    • If your staff were less dependent upon your expertise, how would you spend your time?
  • What is performance management?

    What is performance management?

    I think it’s fair to say that if you hear ‘performance management’ in an academic context, it is referring to a negative situation. People tend to be ‘performance managed’ when their behaviour or ability to perform a role is under question.

    The connotation is that staff from the Human Resources (HR) department will be involved, and that some formal processes will be underway. So, performance management can be perceived as something that is done to staff when they are not measuring up to a standard.

    Perhaps though, performance management should not be exclusive to dealing with situations of poor performance. It should reflect the approaches employed to manage performance at all levels, both good and not-so-good. 

    Reflection: How does this compare with your previous experience of performance management?

    Of course, some would argue that the role of a university is far too complex to boil down into a few quantitative measures, and any attempt to specify measures to be managed will result in added tension when the monitoring systems are implemented.

    For instance, the breadth of activities that a university undertakes will inevitably lead to compromises being made. Maximising excellence in research comes at the expense of other activities.

    Such activities can differ between HEIs; the ability to maintain good student satisfaction scores, or the amount of industrial (‘third-stream’) income, are likely to suffer if the academic staff focus wholly upon high-quality journal articles.

    Conversely, an enterprising university may find that its entrepreneurial income generation may be constraining an ability to create new knowledge and solicit research council funding.

    And of course, a focus upon income generation through student tuition fees may create a culture that finds it difficult to relate to the wider benefits of engaging in research and scholarly activity.

    In all cases there are tensions that require sensitive management. We should remember though, that what might be a complex situation for a group of staff (such as a department), might actually be distilled down to something that is much more polarised for an individual member of staff. For instance, a department may strategically plan to change its income profile to increase the proportion of funded research.

    Whilst for a research active academic this could reinforce or amplify the tension between teaching and research duties, for a teaching oriented academic there may be no foreseeable change in their immediate future.

    Reflection: Think back to your last appraisal meeting with your line manager. What aspects of that discussion, in relation to your performance, were, or could be, counter-productive for you?

    Our understanding of performance management is shaped by our experiences of being managed in an academic context. It is not uncommon for first time academic line managers to be exasperated by annual appraisal discussions with academic staff.

    Some staff will enthusiastically discuss quantifiable targets for the year ahead, and offer an insightful commentary on their performance for the previous year.

    Others will appear noncommittal and defensive; they’ll describe their contribution as strong but argue that their work is necessarily complex and unable to be measured.

    Another academic may object outright to the whole process, providing the basis for ‘a difficult conversation’, and in some cases cite the measurement of performance as a contributor to poor personal well-being.

    The mixture of these discussions will vary depending upon the prevalent culture of the institution, but also the local culture within departments and teams. It is useful to consider how this culture might be fostered by the predominant approach to management in your environment.

    Directive or self-directed management?

    A directive approach to management typically exhibits the following characteristics:

    • Performance measures and goals are determined at all levels of the institution, and formulated by the senior leadership team;

    • Managers monitor the performance of individuals against local targets;

    • Your line manager makes it clear what has to be done, how it should be done and by when;

    • Performance is assessed in terms of how well the work was done;

    • Frequent use of initiatives/project working to achieve short-term goals.

    In terms of the daily conversations, a directive manager would have a tendency to instruct:

    • “That’s the second year running that the assessment and feedback scores have been less than 60%. You need to investigate and report back with an action plan by next Tuesday.”

    • “Those application conversion figures don’t add up. Marketing don’t seem to be able to talk to Central Planning.”

    • “The Quality Lead won’t like this. Get support from Central Intelligence and Estates first, before you present a paper at the Committee meeting.”

    This style is motivated by outcomes and can be frequently encountered in academic support/administrative areas. It also occurs in academic areas to varying degrees.

    In contrast, there is management that encourages staff to be self-directed.

    This can be characterised as:

    • The mission of the organisation is identified, with the declaration of long-term ambitions;

    • A series of stakeholder consultations are held to determine the strategic priorities;

    • Action plans are created that may include qualitative and vague outcomes;

    • Managers utilise measures to initiate discussions around enhancement;

    • Managers use the mission to reinforce what has to be achieved, and refer to assessments of values and behaviours as measures of progress;

    • Significant emphasis is placed upon the staff recruitment processes, to ensure that incoming staff are of an appropriate ‘fit’.

    The daily dialogue also reflects the increased focus upon the individual, rather than on a system or process:

    • “I’ve seen the assessment and feedback scores as well. What do you think is the cause?”

    • “The application figures seem to regularly have errors in them. What are the reasons for this happening?”

    • “I think we can see where this is heading. How would you tackle it?”

    A self-directed style of management encourages a greater alignment between the intrinsic motivation of an individual and the organisation, with less reliance upon the control and reporting of performance against short term objectives. Traditionally, this has typified the stereotypical academic environment, whereby academic staff are trusted to work to support the institution’s mission, rather than to perform in a coordinated way to achieve an end of year operating surplus.

    Managers that require specific objectives to be met in a short timeframe can find this situation particularly frustrating. However, the emerging competitive marketplace in HE has started to focus the minds of university executive leaders in such a way that HEIs are starting to adopt more directive styles of management. In the same way that managers can be frustrated with a department of self-directed academic staff, academics can also find directives and ‘managerialism’ problematic.

    Like most things in life it is a question of balance; where the balance lies is likely to be different for each institution. But the increased pressure to perform well both financially and in the published league tables, combined with the potentially destructive situation of failing to get the best out of academic staff, means that there is much to gain or lose depending upon the approaches we adopt as leaders.

    It’s important that we make sufficient effort to understand our environment so that we can devise the best approach. You might think that a book that advocates the use of data to achieve transformation might be heading down the road of directive management. But data is not always quantifiable in the sense of ratios or absolute numbers, and qualitative data is often a rich source of insight for the curious.

    If we are to be successful at managing performance, we have to appreciate what is worth measuring, what will motivate individuals, and how they respond to the local culture.

  • Re-framing teaching vs research

    Re-framing teaching vs research

    I enjoy visiting academic departments in other universities. There is something comforting about recognising an issue that is shared across many different departments. It might be common curricula, shared student challenges, staff issues or even a similar approach to dealing with a particular external challenge. I can feel reassurance when a department has the same problems as mine, especially if neither department has a solution!

    But there is also the excitement of observing something new, an innovation, a disruptive response. How can that solution be brought back to my department? How can I reap similar benefits?

    I particularly enjoy conversations with staff. When travelling, especially internationally, the barriers of rank seem to evaporate and we can talk freely as academics. This reveals insight that might otherwise have been obscured by status.

    When they discover that I have a management role, the initial question is typically “so how do you manage teaching and research together?”, followed by the statement “I suppose that you don’t get time for research with all of the people administration”.

    Such conversations are great openers for me. I am constantly challenging the notion of research versus teaching, preferring instead the view that each should support the other. My role then has added justification: holding a view that challenges the norm, I can use the management role to influence the academic environment for the better.

    It’s difficult to change the status quo as an individual academic. I hear the argument that you don’t need rank to lead change, and in principle I agree with this.

    But in some cases, it’s much easier to effect change if you directly control the systems that drive the behaviour of staff, such as academic workload planning, staff development and curriculum design, as these are the key instruments through which change can not only be instantiated, but also embedded into the department.

    When I explain my desire for a healthier relationship between research and teaching, I find that most people say that they ‘get it’. Only they are hampered by the harsh realities of their university requiring growth of student numbers, with increased contact time, etc.

    At this stage of the conversation the challenge is usually presented – “so how do you do it yourself, then?” – and it’s now time for me to advocate both my principles and my tactics for maintaining harmony.

    My first principle is to believe that I should be teaching knowledge that I have created. Not exclusively, as there are fundamental concepts that need to be learned, but the university experience has to be more than learning from texts. We need to be in an environment that creates.

    Second, I accept that it is my responsibility to help every student that is in my class, irrespective of their background. If they are enrolled, they deserve the best that I can offer.

    In fact, what I have just done is articulate two views that are often held separately by academic staff; those with a more research orientation pursue the former, whilst those who are more teaching focused employ the latter.

    My tactics for managing the delivery of both principles are wholly based upon productivity. As a researcher I must create and disseminate knowledge. Disseminate to whom, though? Traditionally this has been to research communities of other academics. To do this requires time spent performing experiments, writing up results and evaluations, and then presenting the work at events.

    As a teacher I must plan and deliver challenging curricula that meet the needs of the student, in a way that engages and motivates them to succeed. Each student needs feedback on their progress, and a summative judgement of their performance at the end.

    It is possible to fill the entire working week with either of the above activities, leading to academics who are either research or teaching focused. So, how do I manage to keep all of the plates spinning at the same time?

    The breakthrough for me was realising that if I considered students as fellow learners, then the tasks that need to be completed could be shared amongst a wider body of people. For instance, if students are included within the research, as co-investigators, they are then contributors. One of the major benefits for research is that the opportunities for creativity and innovation are increased when students are involved.

    Another benefit is that students who participate in the processes of research as part of their university experience develop learner autonomy faster. This of course bodes well for more advanced research topics later in their studies.

    A third benefit is that a focus on the processes of research – having students as active participants rather than passive audiences – means that the quality of intra-student interactions improves, providing more timely and tailored feedback for each student than could be provided by a single academic.

    The final benefit is perhaps the most convincing. The sheer increase in the volume of research material produced by a class of students led by an academic, versus the efforts of a lone academic, is remarkable. In fact, you will begin to wonder how you will manage to write it all up.

    The key tactic that facilitates the dissemination of this work is also a productivity tip. Academics who are prolific researchers ensure that they spend time writing. They protect their time for writing, as they know that this is the final hurdle between their research and its evaluation by the research community.

    Academic staff far and wide have told me that they don’t have the time to write, mainly because of teaching. But when I challenge them to write for just ten minutes per day, before they open their email inboxes, they find at least an additional 50 minutes per week. This won’t be sufficient time to write up everything, but if we apply the same co-creation approach to writing as we did to the research itself, we can also engage students in writing up the research that they have contributed to.

    We all have the same amount of time per day, and the Higher Education industry can always demand more from academics than they can give. But a simple re-framing of the challenge of conducting teaching and research can yield significant benefits for staff and students alike.