So back in February I was asked to contribute a chapter to a new book being confected by some top people in the domain of the digital humanities, an invitation which I naturally accepted with alacrity, and only a small sense of alarm. I admit it: I was flattered, though naturally I also felt it was about time my eminence was recognised in such a way.
Dashing off an abstract is an easy task, so I did that, and then forgot all about it. Here’s the abstract. Like other such pieces, it promises much, and even gets mildly polemical towards the end, which seemed to do the trick, as the proposal was, in due course, accepted.
Where do metamodels come from and how do they survive?

Lou Burnard

There is a very old joke about standards, which says "Standards are a good thing because there are so many to choose from". Like many old jokes, this plays on an internal contradiction (the structuralist might say "opposition") in its topic. Standards are, on the one hand, of most benefit to the extent that they reflect and facilitate diversity; on the other, they are of necessity managed or even imposed by a centralising authority. This contradiction is particularly noticeable when the process of standardisation has been protracted because the technologies concerned are only gradually establishing themselves. We see this tension even in consumer electronics, where there is a financial, market-driven imperative to establish standards as rapidly as possible; but the same tension underlies the gradual evolution of ways of thought via communities of practice into de facto and (eventually) "real" standards.

This article explores the evolution of standards for data modelling methodologies with regard to this tension. It considers some significant early experiments with the application of data modelling techniques to humanities research data (Manfred Thaller; J-C Gardin) and discusses to what extent some researchers simply adopted technical standards emerging in the wider data processing community (relational databases, information modelling), while other communities strove to define their own models (AI, language understanding systems). It presents in some detail the theoretical model (metamodel) underlying the Text Encoding Initiative's approach to standardisation, and asks whether, over time, all such community-based efforts are forced further towards convergence and away from diversity. The TEI currently maintains a balance between the "do it like this" and "describe it like this" schools of standardisation; in the long run, it therefore risks being superseded either by advocates of the latter, who distrust the former, or by advocates of the former, who are impatient with the latter.

Oxford, 1 Mar 2014
Summer came and summer is now going, and this particular bird is coming home to roost. Last week I received a polite reminder that my manuscript should be delivered by the end of the current month, should conform to a defined house style, and would I please sign in blood the form I was sent back in April assigning my rights in this non-existent work to the non-existent publishers Snipcock and Tweed? Naturally I replied at once pleading for a stay of execution (but ignoring the rights assignment question), which was graciously accorded, somewhat to my surprise, even unto mid-October. So now I really have little excuse: I must find out what grand idea this abstract is abstracted from, get down to doing the research it so grandly promises to summarise, and write the wretched piece. If only I didn’t have all those other more interesting (or less interesting but more urgent) things to do.
Well, let’s see. I plan to use this blog as a record of the painful process, just so that in years to come I can look back and see where it all went horribly wrong. At least no-one is likely to find me here.