
Digital Palaeography meets Optical Glyph Recognition in Rouen

HDDA2012 (“Historical Documents in the Digital Age”) at the University of Rouen turned out to be unusual (for me at least) in a number of respects. Firstly, it was organised as part of a project (“DocExplore”) funded under the Interreg framework of the EU, and hence attended by people from both sides of the Channel, rather than being exclusively French. As a consequence the presentations were in both English and French, with apparently quite successful simultaneous translation, though I did not test this for more than a few minutes. Secondly, I didn’t have to explain to anyone what the TEI was, and why it might be interesting; everyone seemed to know all about that already, even the informaticiens. And thirdly, there was no-one else from Adonis present, so it fell to me to ask the man from the Archives Nationales why they did not provide an OAI feed into Isidore as well as into Europeana (they’re planning to).

There were about eighty attendees, most of whom survived the full day and a half of invited presentations/round tables. There was a bit of audience interaction, but not much, and surprisingly perhaps only a couple of desultory tweeters, one of whom doesn’t count, since it was me. There was however plenty of time for old-fashioned face-to-face discussion over lengthy pauses for sustenance in Rouen’s rather nice Maison de l’Université. As far as I can tell there were roughly equal numbers of archivistes and informaticiens, but they did not mix a great deal.

Proceedings were kicked off with two very good “state of the art” summaries of what’s going on in the way of cultural heritage digitization in France, by J F Moufflet from the Archives de France and Matthieu Bonicel from the BnF. I particularly liked the latter because of his optimism about using the technology to break down the walls between the silos of digital artefacts being created everywhere, pointing to evidence from maybe half a dozen great projects previously unknown to me. Both of these speakers pushed all the right buttons about open public access and accountability, transparency and integration of resources, respect for standards, etc., thus making quite a contrast with the following speaker, from the archive of Canterbury Cathedral, who found herself having to explain why they’d made a deal with Satan in the form of findmypast.co.uk to get their parish records database online, thus perhaps revealing the very different business models under which archivists operate on either side of the Channel.

The second session was given over to tools for transcribing and indexing all those lovely digital images. Stéphane Nicolas from LITIS, the Rouen team responsible for software development, laid out clearly the challenges and advantages of integrating transcription and images. Two rather more technical presentations followed: one from Franck Lebourgeois which felt a bit like a graduate seminar about the mathematical basis of OCR, and another from Marçal Rusiñol from a Spanish lab about vision processing techniques for word recognition or (as it seems it is called in the trade) “word spotting”.

The last session of the day was billed as being about digital palaeography proper, and was divided appropriately between two contributions from palaeographers (Élisabeth Lalou from Rouen, and Marc Smith from the École Nationale des Chartes) and two from computer engineers (Véronique Eglin from LIRIS and Richard Guest from Kent). The former group clearly understood the potential the technology offered to address some long-standing difficulties in the treatment of, for example, allographic variation, or the use of frequency statistics in the definition of “writing style”; the latter group maybe had a harder job in making explicit just what the state of those particular arts currently is.

The second day I arrived a bit late, for some rather odd discussions, again revealing extraordinary differences in attitude on either side of the Channel, about the “ludique” (i.e. playful) use of IT in cultural heritage applications, that is, how to make cool exhibits in museums. It began with a moderately dreadful intervention from a professional French developer of such things, but was rescued by a man from the British Library called Clive Izard, who gave a historical survey of the BL’s flirtations with technology, from the days of the Information Access programme (which, I may say, was one of the funders of the BNC) up to the present, third, generation of the “Turning the Pages” application. He was followed by another excellent (and splendidly named) speaker, Clotilde Vaissaire-Agard from Le Havre, who reminded us about the need to place the scholar at the centre of the picture (I was reminded of a former OUCS colleague’s plaintive cries of “What about the users?”). She also endeared herself to me forever by citing the Manuscriptorium project (remember ENRICH?) as an outstanding example of what the technology facilitated by making it possible to share metadata and digital resources across institutional boundaries for the benefit of manuscript scholarship.

The final session, though labelled as concerning that old war horse “Is there such a thing as Digital Humanities?”, actually contained three very good and complementary talks intimately concerned with the themes of the conference. Alison Wiggins from Glasgow’s Bess of Hardwick project gave a convincing account of their attempt to ground the project in practical user-focussed concerns (she cited Claire Warwick et al.’s LAIRAH as one of their inspirations); Dominique Stutzmann from IRHT raged, with ample evidence, against the lack of decent interfaces in transcription software; and finally Alixe Bovey from Kent gave a well-illustrated overview of the strengths and limitations of various interfaces developed for interacting with the physicality of medieval sources. She concluded by lamenting, in the way people do, the absence of smell associated with digital images, and the mismatch between the haptics of the touchscreen and the codex. I was more impressed by Alison’s comment that it was more useful to know what kind of paper Bess of Hardwick wrote to the Queen on than it was to be able to reproduce it.

The price of a new laptop, or My love affair with Ubuntu

I think the first Unix system I installed on a laptop was release 0.8 or thereabouts of Knoppix. You could take an innocuous-looking CD, stick it into a crusty old PC, tweak its BIOS to boot from CD, and bingo, you were running real Linux without having touched the Windows hard disk itself. How cool is that? OK, it took the best part of a day to load Knoppix from the CD into memory, so I pretty soon found the button for copying Linux itself onto the hard disk of my elderly laptop, but the “Live CD” concept had much to recommend it. You didn’t have to drop the Windows security blanket, and you could go on using your Windows filestore. That must have been around the year 2001 or so: at OUCS we started using Knoppix as the basis for a series of TEI give-aways at workshops and conferences – at first on CD, then on USB sticks, as the technology improved.

And then came Ubuntu. I think my first real Linux laptop was a Thinkpad, on which I installed the first of an entire menagerie, from the Warthog in 2004 to the Quetzal I installed yesterday. Yesterday also, I switched allegiance from Lenovo to Samsung, and installed the Quetzal on a Series 9. It looks a bit like a MacBook Air, but it’s not evil.

To get a new laptop in 2004, I had to write a one-page justification for the boss, fill in numerous forms, send them off to the University’s approved supplier, and wait a few weeks for the machine to arrive. Then it would take a few minutes to unpack the machine, and at least three days to get Linux installed on it, much of that time spent pestering smarter people with better things to do.

In 2012, it took me just a few minutes to click on a button and order a new laptop, which was delivered to my house in about 24 hours. It took rather more than a few minutes to get it out of the packaging, but installing Linux took a couple of hours max. I downloaded an ISO image from the Ubuntu website. I made it into a bootable USB key using some software recommended by some other website. I stuck the key into the side of my new laptop. I tweaked its BIOS (just like the old days) to boot from the USB drive. And everything Just Worked. OK, I had to make difficult decisions like what language I use, what my name is, what graphic I wanted to represent me, and whether I wanted to wipe out the Windows 7 partition on this machine, so make that a little more than an hour or two, but that was it. Wifi works, sound and graphics work, wireless mouse works (good, as my fingers don’t understand trackpads) … I can even imagine getting used to the Unity interface (which seems to be a bit more stable than it was under Pangolin).
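
(For the record, here is a minimal sketch of that ISO-to-USB-key step, done from an existing Linux machine rather than with whatever software the other website recommended. Ubuntu images of this vintage are hybrid ISOs, so plain dd will do; the device name /dev/sdb is an assumption, so check it with lsblk first, because dd to the wrong disk is unforgiving.)

    # fetch the image (release URL is illustrative)
    wget http://releases.ubuntu.com/12.10/ubuntu-12.10-desktop-amd64.iso

    # identify the USB stick; /dev/sdb below is an assumption, not a fact
    lsblk

    # write the image to the stick (this erases it), then flush buffers
    sudo dd if=ubuntu-12.10-desktop-amd64.iso of=/dev/sdb bs=4M
    sudo sync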

And that, dear reader, is when I realise that the real price of a new laptop is yet to be paid. OK, I expect to have to install some favourite bits of software to get my familiar working environment back (digikam, subversion, emacs, oxygen, chrome, dropbox …): that doesn’t take long. I can remember how to re-configure thunderbird to collect my mail: what I’d forgotten is just how long it takes to get nice new fresh copies of all the old mail which was sitting gathering dust on the IMAP server. I know how to check out all the stuff that actually matters from the Sourceforge and Googlecode TEI repositories: don’t underestimate how long that takes either. It took me over an hour just to remember how to reset my password on the OUCS subversion repository.
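
(The checkouts themselves are one-liners; here is a sketch, with the trunk URLs of the period quoted from memory, so treat them as illustrative rather than gospel.)

    # the TEI sources, at that time hosted on Sourceforge (URL from memory)
    svn checkout https://tei.svn.sourceforge.net/svnroot/tei/trunk tei-trunk

    # the stylesheets and friends, from the Google Code repository (likewise)
    svn checkout http://tei.googlecode.com/svn/trunk/ tei-gcode

    # subversion caches credentials under ~/.subversion after the first prompt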

My goal for today was to be able to rebuild TEI P5 from source, and crank out nice PDF slides from TEI source (that’s the sort of thing I do every day, to be honest). On top of what I already had, I needed to download and install the Oxygen XML editor, Chrome, and Dropbox. I needed to install nice new packaged 64-bit versions of jing, trang, onvdl, rnv, tei-emacs, openjdk-jdk, latex-beamer, texlive-xetex, and texlive. That all went smoothly, except that the packaged version of rnv was a 32-bit one, and so I had to rebuild it from source. Same problem with Acrobat Reader, but there is, of course, no option to rebuild that from source, so I will probably have to live without Mr Adobe’s fine products for a while.
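
(A sketch of that step, assuming the Ubuntu 12.10 repositories; tei-emacs and onvdl came from the TEI’s own packaging rather than Ubuntu’s, so I leave them out here, and the rnv version number and build incantation below are illustrative.)

    # the tools Ubuntu itself packages ('openjdk-7-jdk' stands in
    # for the openjdk-jdk mentioned above)
    sudo apt-get install jing trang latex-beamer texlive-xetex texlive openjdk-7-jdk

    # the packaged rnv was 32-bit, so build a 64-bit one from source;
    # rnv needs the expat headers
    sudo apt-get install build-essential libexpat1-dev
    tar xzf rnv-1.7.10.tar.gz        # version illustrative
    cd rnv-1.7.10
    ./configure && make              # or whichever Makefile the tarball provides
    sudo make install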

Conclusion? Nothing surprising, I suppose: installing your own system is still a good feeling, seductive enough, for enough people, that others put lots of effort into making it much simpler to achieve. Hats off to the unsung labourers in the Canonical salt mines, and to the open source community generally, who just go on doing what they have always promised they would, keeping the faith. It’s a relief that manufacturers like Samsung let them get away with it.

And now, it is the evening of my first full day with psammead and I think another glass of wine is in order. If I could only learn how to use this wretched trackpad…