In episode 7 of our DART podcast I sat down with Ed and Steph to discuss the work they’ve both done on updating the AIDA toolkit (Assessing Institutional Digital Assets), which we will soon relaunch under its new name – The AOR Toolkit (Assessing Organisational Readiness).
Origins of The AOR Toolkit
Kevin Ashley and Ed completed the JISC-funded AIDA project in 2009, and the idea was that any university – i.e. any institution – could use it to assess its own capability for managing digital assets.
In this episode we talk about AIDA’s origins, examples of how people have used it, the changes we made and why, how the AOR Toolkit can be used, and what next steps people can take once they have run through the self-assessment.
The AOR Toolkit – Audio Recording
The AOR Toolkit – Interview transcription
Frank Steiner: Right, so we are back again with the 7th episode of the DART podcast. I am well rested after three days’ holiday… well, sorry, three weeks, actually.
Ed Pinsent: Three weeks!
Frank: Three weeks, I know. Well, it felt like three days, that’s all I can say. Once again I’ve got Ed and Steph with me today.
[both say hello]
Frank: Today we’re going to talk about a little project we’ve been working on for the past couple of weeks: the relaunch of the AIDA toolkit, which is the name some of you might know it by. The relaunch is just around the corner. We’ve got a new name for it; we now call it Assessing Organisational Readiness, or The AOR Toolkit. So, Ed and Steph, could you tell me a bit more about why you have looked at it again and why you thought of revamping and changing it a bit?
Ed: We wanted to revamp it because it was never quite right in its original incarnation. We kind of ran out of money, we ran out of time in 2010. AIDA was a JISC-funded project, I worked on it with Kevin Ashley and I always wanted to take it a bit further but Kevin said no, he said money was not forthcoming. So I think that was one of the triggers but also because, well, there seemed to be an opportunity for ULCC to assert its ownership of the toolkit a bit more. And also in the intervening six years I have learned a bit more about the subject, so I thought I could make some improvements. I just suddenly saw how it could be made better quite quickly and easily, and I’ve got to say Steph was very encouraging. Even though I’m slow to see the value of the toolkit sometimes, Steph thinks it is potentially very useful to a lot of people.
Steph Taylor: Yes, I got quite excited when I re-looked at the old AIDA toolkit, because when we’re teaching on our digital preservation courses we do look at self-assessment, and it is a logical next step for people to assess where they are with digital preservation – a good next step after our courses. But what I was finding was that the tools available were either very, very superficial, so not really that useful once you’d done them, or incredibly complicated, taking months to complete and involving just about everybody in your organisation, including your finance people, your tech people, et cetera. So when I looked at the AIDA toolkit again with Ed, I saw that it had a lot of potential to sit nicely in the middle ground: not taking 11 months to complete – a lot quicker than that – but also not so superficial, so that people would have something really useful that they could use to start conversations within their own organisation, identify areas they needed to work on, and also have something they could take to senior management to get that conversation going.
Frank: That’s great. I mean, I know from talking to you guys that despite your not having done much with AIDA since it came out, there was a sort of continued interest, with people reaching out to you guys – especially Ed, because Ed’s name was associated with the project?
Ed: Yes, people kept finding it online, mostly from outside the UK, interestingly enough. Yes, and people were just using it as a kind of benchmarking tool for all kinds of digital content. And I should stress that what AIDA is supposed to be about is not exclusively preservation, not even exclusively digital content, but just management of content, and the emphasis ought to be on the organisation. The main thing it’s measuring is your capability to do it, and this is one of the problems we had with the toolkit name to begin with, AIDA, because it didn’t really explicitly state that this was the case. It seemed to be saying that you’re assessing your digital assets, but you’re not assessing your digital assets; you’re just assessing your organisational capability for being able to do anything with the stuff in the first place. So yes, I had libraries saying, you know, this is a very useful benchmarking tool, but amazingly I also had people saying this is a very good teaching aid for people who want to understand something about how to manage digital content – a nice comprehensive overview of all the kinds of things you need to think about – so I was encouraged to find that.
And we had the other view as well from Toronto, I don’t know if you want to go into that one? It was a slightly different thing.
Frank: Yes, just mention some of the examples, I know I’ve seen a few pop up in emails over the last few weeks so please share with our listeners.
Ed: Yes, the people in Toronto were saying what Steph has also just said: that there are a lot of assessment tools available. Toronto were saying there are probably too many, and some of them seem to be a bit flaky or subjective, or they’re not sufficiently peer-reviewed, or there isn’t enough of a method associated with them, and Toronto’s position was that they would like there to be something better.
They’ve analysed all of the tools that are available in this area and tried to assess and grade them, and I’m afraid AIDA didn’t do too well. They said: we can see that it’s instinctive and quick and everything, but we don’t see how you got to this form of words, or how you got to that particular framework or structure.
So in the grand scheme of things, yes, if one wanted to have something that plugged into the international community of greatness in order to be accepted as a standard throughout the world, then AIDA would probably fall short and I think the revision isn’t going to address that either. On the other hand, do we care?
Steph: I think we don’t at this stage – I’m happy to be controversial on that – because I think what it does bring is a really useful starting point for identifying not only the areas where you may be weak, but also areas where you might be surprised to find you’re actually quite good, and where you may be more ready than you thought. Well, I keep saying it, but it is a very, very good starting point.
Ed: Exactly, I think that’s all you can get out of it, it’s a starting point to see where you’re strong and where you’re weak and that’s why we try to get that comprehensive coverage, the organisation, the technology and the staff, and resources, yes.
Steph: And I think also, I’ve noticed in the revisions that Ed has been making that a lot of the conversations we have with people who come on the courses, and people we do consultancy work with, have fed into this. We have kept it simple for talking to people outside of your particular sphere in your own organisation. So if you’re going to try and engage senior management, you need to go with a good clear message – maybe you need a specialist member of staff or some skills in this area, or you’ve got that but need some more technical support in doing what you’re doing – and The AOR Toolkit gives you that nice framework, again, to start the conversations, I think.
Frank: So you’ve made some changes and revised it and fed some experience and expertise back into it, so what exactly has changed in The AOR Toolkit compared to AIDA?
Ed: First of all, we’ve emphasised the fact that it’s assessing capability, not digital things. That’s changed the title, the acronym, everything, and I hope that’s going to be upfront – the first thing you see when you look at the title of the toolkit. When you open page one, you’re clear that you’re not looking at your drives, you’re not counting the number of jpegs, you’re not looking at your software and your hardware; you are looking at your entire organisational capability for supporting this content. So that was the first thing.
The second thing was to take it out of higher education, because AIDA was targeted at higher education – I think we got the JISC money on the basis that it would be useful to the JISC community, and of course that’s fine, that’s great, you know? So the type of language we used and the type of assessments we proposed ended up with something which I assume UK universities could recognise and identify with, and would find useful. But we thought the value we could add now is that, well, everybody’s got digital content now, so why not have a toolkit that addresses and helps everybody?
Frank: Yes, that makes sense.
Ed: Exactly, so I don’t think we even call it digital assets anymore, it’s something called managing digital content, which is vague but I’ve tried to explain in the introduction what I mean by that and what is in the scope and the idea is, yes, potentially anybody, large or small, even you at home with your hard drive full of images could perhaps assess something about… How ready you are to support it, yes. So that’s the thinking, and hopefully we’ll address the needs of more people in that way. I think those are the two main fundamental changes that we had in mind.
Frank: Okay, I mean, for me it sort of goes back to what we see with the training as well, that you’ve got people from other sectors – HE, museums, archives – because, like I just said, they will have digital content and they need to look into it, and there could be financial institutions, government institutions as well… You know, it is literally everyone and anyone.
Ed: Yes, precisely.
Steph: Yes, we do. I mean, we’ve been doing an analysis recently of the sectors that people on our courses come from, and we get a lot of people from commercial companies, from banking, from retail, from the media, and also from higher education and museums. There’s a whole range of people supporting digital content in all kinds of areas, and again, obviously we address this in the training, and we wanted a toolkit that would support the different needs they have.
Frank: Okay, that’s great. So in terms of using the toolkit, how would people use it and what would they get out of it once they have used it? I know there’s a nice traffic light scorecard to be had at the end!
Ed: Well, there is a traffic light scorecard, yes. What you get at the end of it, the punch-line is you get maybe three sheets of A4 with different colours on a grid telling you how well you’re doing or how badly you’re doing. Now I borrowed that from something called the Digital Preservation Capability Maturity Model developed by Charles Dollar and Lori Ashley in America. They’re part of something called Saving the Digital World, I think. That’s their website. But I like those colours, you know? So I’ve taken those modified traffic light colours, so if you see red on your final results, lots of reds, you’re doing badly. If you see light greens and dark greens, you’re doing quite well. So that’s quite useful.
Frank: That sounds fairly simple and straightforward.
Ed: Yes. To build that grid, you would look at the exemplars we’ve given in the toolkit and find the one that is the closest match to where you think you are now. You’ve got five possible stages, or degrees of success, that you could assess yourself against across the various elements – I think there are around 30 elements to assess. Further, if you wanted to, you could really grind it quite fine: you could assess your entire organisation, or one department, or both, or many departments. We didn’t take it down to the level of one individual person, but you get the idea, because the interesting thing is you could find out that a department is doing quite well while the whole organisation is failing quite badly in that one area, and you could get this contrasting view, and find stuff out.
But in terms of how you go about that, I like to think it could be fairly quick and easy. Certainly the people who used it while we were building the original AIDA toolkit found they could cover one strand – the organisational strand – very quickly and easily. When it came to the IT strand, they had to go and ask IT people, understandably, and when it came to the resource strand, it’s possible they said, well, we wouldn’t know this, we’d have to ask our finance officers. But even so, there’s a difference between just asking your IT storage manager a question – do you carry out checksums or don’t you? – and asking them to provide evidence that they do so, having to undertake some kind of formal audit, because I have to keep stressing it’s not meant to be an audit. It’s just a kind of…
Frank: Yes, that was going to be my next question.
Ed: Yes, it’s not a formal audit. It’s just a kind of finger in the wind where you get these pretty colours at the end of it, and even the coloured dots method has a tremendous amount of value because it’s something senior managers get very quickly and they can see this visual indicator.
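The scorecard Ed describes – around 30 elements, each matched against five stages and rendered as a traffic-light grid – could be sketched roughly as below. Note this is purely an illustrative sketch: the element names, strand names, stage numbering and colour mapping here are our assumptions for the example, not the toolkit’s actual wording or scoring scheme.

```python
# Illustrative sketch of a traffic-light scorecard of the kind described
# in the interview. All element names, strands and colours are assumed
# for illustration; they are not taken from the AOR Toolkit itself.

# Five stages of capability, scored 1 (lowest) to 5 (highest),
# mapped to the kinds of colours mentioned in the conversation.
STAGE_COLOURS = {
    1: "red",
    2: "orange",
    3: "amber",
    4: "light green",
    5: "dark green",
}

def scorecard(assessments):
    """Flatten {strand: {element: stage}} into (strand, element, stage, colour) rows."""
    rows = []
    for strand, elements in assessments.items():
        for element, stage in elements.items():
            rows.append((strand, element, stage, STAGE_COLOURS[stage]))
    return rows

# A hypothetical department self-assessment across the three strands
# mentioned in the interview: organisation, technology, resources.
example = {
    "organisation": {"policy in place": 2, "staff awareness": 4},
    "technology": {"checksums carried out": 1, "storage managed": 3},
    "resources": {"dedicated budget": 2},
}

for strand, element, stage, colour in scorecard(example):
    print(f"{strand:13} | {element:22} | stage {stage} | {colour}")
```

Re-running the same assessment later and comparing the two grids would show which elements have moved from red towards green, which matches the “ongoing” use Steph describes later.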
Frank: That brings me to my next question then. So I’ve run through the AOR Toolkit and done the quick assessment – what could be my potential next steps after that? You just mentioned senior management, so I assume I could go and say, listen, we’re doing really badly and we shouldn’t be, so you make a business case in a broader sense.
Ed: Yes, you could make a business case. I think you could identify priorities, what do we need to fix first? How do we get those reds into oranges or into greens? How long are we going to take doing that? How much money are we prepared to spend? Are we worried about the reds or not? I think that’s what the dream was with the original AIDA and I think that’s survived, if it would just help you target which are the most burning areas to fix first of all, and that’s not new. They’ve been doing that in the preservation assessment survey at the British Library for a long time. I don’t know if that’s still going, but again it’s one of the things which Kevin and I found out about and we even went to visit the people who ran the preservation assessment survey and it was a different way of doing it but the idea was based on 100 objects in your collection, you could extrapolate a view of the entire collection, and it filled in a database on that basis.
But the outcome was the same, you could identify where the real problems were quickly, and instead of vaguely saying I’m really worried about preservation, or I’m vaguely worried about conservation or archival care, you would have ten or 15 things precisely that needed to be fixed first.
Steph: And certainly with the traffic light method we have taken our own medicine, and used it as a way of identifying things for clients who have come to us for consultancy work.
Ed: Indeed, yes.
Steph: We were dealing with a large and complicated organisation, and being able to present clearly to the non-specialists within that organisation the idea of red, amber, green – moving into danger areas – we found that to be a very useful visualisation, where people who are not specialists in an area can easily understand the danger zones and also the things that they’re doing well.
Ed: Yes, it’s true. I mean, for that particular assessment I did use the Dollar and Ashley model I mentioned before, and I was able to complete it with them in a matter of hours, which shows you it is possible to do this kind of thing quite quickly and get a very good result.
Steph: And I think the other thing I like about it is that there’s no limit on the number of times you can use it. You may use it once, highlight some areas, go away, do something about those, hopefully, and then re-run it and see how you were doing – highlight what’s changed, see what areas you want to tackle next. So it is an ongoing thing, which I think could be really useful.
Ed: But if specific requirements come out of it, you know, we do have a consultancy offering. We’d love to help people with anything to do with, say, advice about file formats, migration, systems, metadata schemas, and all these particular things can come out of using the AOR toolkit, and we’d be happy to advise on next steps.
Steph: Yes, I mean, I think a big driver for me with being so excited about it is obviously we get people on the course and they learn about digital preservation and we try and make the course as relevant to different sectors as we can but at some point we’ve got to let people go home, and when they go home they’ve got to grapple with what they do next within their own organisations, and we try and, you know, it’s like preparing warriors for battle, we try and send them out there as well-equipped as we can from coming on the courses but I think this will be a great starting point for actually the big next steps they have to take which is bringing it into their own organisation and identifying their own problems.
Frank: Well, to me it sounds like a good way of identifying where you are now, so that whatever your digital preservation strategy is – whether it’s ambitious or realistic – you can say: well, this is my starting point, how do I get from here to there?
Ed: But not just preservation – I like to think it could apply to a digitisation project, or implementing a CMS, or records management, or, well, anything that involves some form of management of digital content, even research data management, although I think that’s one of your questions that’s coming up…
Frank: Yes, well, we can jump around, it doesn’t matter.
Ed: Yes, because one of the success stories of AIDA is that it got recognised by Sarah Jones and Joy Davidson at HATII as something which could potentially be remoulded into a structure that research data managers would recognise, and that was quite visionary of Joy to do in 2012, because it’s a very important issue right now in 2016, and a lot of universities are very concerned with managing research data. So she envisaged turning AIDA into a toolkit that was specific to that particular field and that particular audience, because AIDA was vague and general – it was just digital assets, yes, in higher education, but it could be almost anything. But she said, no, it’s got to be research data, it’s got to be research data management, and it’s got to be targeted specifically at data owners – what are they called? The PIs, principal investigators or whatever they’re called – and all the people who are involved or are stakeholders in the management of this precious data.
So, yes, the basic grid and structure of AIDA survived into CARDIO, but the specific words that were used and the targets of assessment were directly aimed at that audience, those people, and we came up with a form of words and a structure that, well, just fits like a glove. It plugs directly into that context, that environment, and I can’t take all the credit for that, because I had help from a group of 12 reviewers who came in to help me do this, including Jen Mitcham, a friend of ours, but also lots of people from UK universities – digital preservation experts, copyright experts, technical experts – all of whom volunteered, and I think they got a small honorarium from Joy, to help revise AIDA, and we came up with a real blinder, you know?
It was a fantastic team and a great piece of work, and it very quickly showed how, I think, the basic structure of assessment can, with very little effort, be turned into something germane to almost any field. And CARDIO has worked: it’s online, people have used it, and universities now use it as a first step to see if their organisation has any capacity whatsoever for dealing with research data. So I’d love to hear more success stories from CARDIO, but I’m very proud to have been a part of that one.
Frank: Sounds great. Well, we’ve got the AOR Toolkit coming out soon, so hopefully that will be a similar success story, and you’ll have been involved in that as well. So I think we’ve covered everything I wanted to ask you guys about it. Just for our listeners: we are hoping to have the new and revised toolkit out towards the end of the month, so when you’re listening to this podcast, in the blog post there will be a little link for you to pre-register your interest, so you will be the first to know when it comes out and is available to download. And obviously we would also invite your feedback and any comments you have because, like I said just now, it’s kind of a collaborative, community approach.
Ed: Of course, yes.
Frank: Although I know you guys are experts, there are also things you don’t know.
Steph: Yes, we’re always very happy to gather feedback and to get new ideas as well, so…
Frank: Yes, and I think it’s also… once you’ve put it into practice you might come across things where you think, wouldn’t it be nice if we checked this particular area? Or actually, you know, something might be superficial and we don’t need it – we will find out. But thank you guys for now, and we’ll be back with the next episode soon. Thank you.
Ed: Goodnight, listeners.