An event that asked developers to demonstrate the Future of Repositories can only be considered a great success when it receives entries that include:
- Multiple real-time examples of using “Repositories As A Service (RaaS)”, not only exchanging data but also sharing sophisticated functionality between EPrints and DSpace – and even including an Android application
- A tool for bundling and depositing a whole raft of research related outputs from the Web via RDF
- A tactile repository search interface with dynamic search suggestions, specifically designed for tablets and smartphones
- A complete gesture and voice-driven system for depositing and searching in repositories
All these – and other great entries too – were achieved in a couple of days’ work during the course of the conference, for the annual OR Developer Challenge, and presented at a packed Show-and-Tell session on Thursday afternoon (true, there was free beer).
Stuart Lewis’s team were worthy winners with their RaaS project, particularly as they showed a genuine commitment to a cross-platform approach – something which, sensibly, backgrounds the individual software platforms, which often receive too much attention, and focuses on the Repository as an application and entity in its own right.
We were also really pleased to see a prize go to Patrick McSweeney and Matt Taylor, and we enjoyed seeing Dave Tarrant stealing the show (again) with his live demonstration of using a Microsoft Xbox Kinect to submit items to a repository.
Our own entry may not have won, but several people liked it, and you may see more of it in future. For the second year running, the Developer Challenge was a great opportunity for Rory and me to concentrate on an idea that we’ve been kicking around, without having found a home for it in existing work (yet). The same was true of the Semantic Metadata popup tools with which we won the challenge last year.
This year we set about achieving our longstanding desire to take the very tactile and dynamic search interface that Rory created for the MERLIN project and turn it into a touchscreen app for smartphones and tablets. The results were pretty effective.
The MERLIN interface on LASSO is quite complex, but at the heart of it is the tag cloud of related terms that the Termine text-mining tool suggests. This always looked like it might be good on a touchscreen, so we stripped it down, rearranged and tweaked it to make it viable on an iPad screen. If you’ve got an iPad, you can try it out by pointing your Safari browser at http://is.gd/texasweb. (It will work on desktop browsers too, but it looks best on a portrait-oriented screen.)
If you’ve got an Android device, you can even download an app-based version of it from http://is.gd/texasapp. (It’s a bit cramped on a smartphone display, but it still essentially works.)
If you’ve got neither, we created this page to give you a rough idea of what it looks like.
It’s worth mentioning that this is a real, live, working application: enter a search term (“logical positivism”, “climate change”, “Jeremy Bentham”, …) and it searches over the full-text corpus of all articles in University of London Open Access repositories (SHERPA-LEAP consortium members), and makes suggestions about additional or alternative search terms, based on the results of the text-mining analysis of the articles.
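For readers curious about the shape of that search-then-suggest loop, here is a deliberately minimal sketch. It is not the TEXAS code: the corpus, the `suggest_terms` function and the simple co-occurrence ranking are all illustrative stand-ins for the real full-text index and the Termine text-mining analysis.

```python
from collections import Counter

# Toy in-memory "corpus": in the real app this is the full text of articles
# in the SHERPA-LEAP repositories; here it is just a few stand-in documents.
CORPUS = [
    "climate change and global warming drive sea level rise",
    "climate change policy and carbon emissions",
    "jeremy bentham and the philosophy of utilitarianism",
]

def suggest_terms(query, corpus=CORPUS, limit=3):
    """Return candidate related terms for a query: words that co-occur
    with it in matching documents, ranked by frequency.
    (TEXAS ranks candidates using Termine's text-mining output instead
    of this naive co-occurrence count.)"""
    query_words = set(query.lower().split())
    matches = [doc for doc in corpus if query.lower() in doc.lower()]
    counts = Counter(
        word
        for doc in matches
        for word in doc.lower().split()
        if word not in query_words and len(word) > 3
    )
    return [word for word, _ in counts.most_common(limit)]
```

The point of the sketch is the two-step flow the post describes: first retrieve the documents that match the user’s term, then mine those results for additional or alternative terms to offer back as suggestions.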
The hackathon approach of working closely together to create something quickly worked well again: Rory hacks, and I test and review each time he hits ‘Save’! All very agile and iterative.
In honor of our hosts in Austin, we decided to call the new interface Touchscreen Enhanced Cross-Search with Augmented Serendipity – or TEXAS for short.