Summit on Digital Tools for the Humanities

So much for the major criticisms from our panelists. With the goal of making tools more easily adopted by the average historian, we have tried to distill the suggestions from our users into a comparatively small set of considerations for tool builders.

These might be considered the key features of the contract between tool builder and tool user. Most are not groundbreaking suggestions, but they underscore the real needs of interested users — needs that have not been adequately met.

And as stated earlier, digital humanists should refrain from complaining about the limited acceptance of their methodologies when the tools and techniques that they have developed remain opaque or even unintelligible to an interested, general humanities audience. Tools have generally neglected the typical humanities user in their design and documentation. Builders of digital humanities tools, especially those that deal with technically more sophisticated techniques, like text mining and visualization, could considerably increase their tools' visibility and speed of adoption with more attention to their user interface and clear instructions with example use cases.

The intended audience of most tools, to the extent that a discernible one presents itself, seems to be technically sophisticated users who are already sold on the value and utility of the tool and who are willing to play around with it to get a sense of its possibilities. But as our panelists' interests suggest, the potential audience is far larger.

Scholars approach scholarly software as software first and scholarship second. Any intellectual nuance that might be useful to the visitor must come after the tool has met their expectations for web and software design. That means indicating simply and clearly what the tool is and how one uses it, and making it as easy as possible to get started and to see some initial results, even if only approximate ones, from the tool.

While it is important to minimize the black-box problem by explaining how the tool works, it is equally important that such explanations do not crowd out a more basic explanation for new users. Provide concrete examples and explain the methodological value. Documentation needs to be non-technical in two ways. First, and most obviously, it must explain the basics of how to operate the tool, and it would be extremely helpful to provide examples of the tool in use — that is, specific examples of analysis across several disciplines.

The cost outlay to create such content is not negligible, but the benefit would be disproportionately high. The second, and perhaps more crucial, aspect of the documentation is to explain in general terms how the methodology of the tool can be useful. This would provide important motivation for the curious scholar who has come across the tool, or has been directed to it, but remains skeptical of the value of a new methodology.

Documentation should explain, with examples, how the research methodology that the tool embraces can be useful and appeal to users across a variety of disciplines. Especially if tool builders typically see their role as making a methodology more accessible to scholars, they should include some justification and explanation of the methodology in their documentation.

Even if methodological diffusion is not the principal goal of the project, explicit attention to it will only further the larger mission and adoption of the tool. Be clear about the limitations of the tool and set reasonable expectations. Though it may appear obvious to the technically sophisticated humanist tool producer, tool introductions need to be clear that the tools themselves neither function as substitutes for historical research nor attempt to produce historical knowledge.

It cannot be overemphasized to the new user that the tools simply facilitate historical research by revealing trends or themes that might have otherwise gone unnoticed, and that to interpret what such trends or themes might mean remains the work of the historian. For the time being, then, tool builders might tone down the rhetoric about the interpretive power of the tools and how they can revolutionize research.

Similarly, they should encourage users to think more deeply about the way tools create different views of, and interactions with, information, which can then be interpreted and used as a means for developing and exploring historical arguments.

Certainly, technically sophisticated users will have a better understanding of how a tool works, and will use the tools in more complex ways to facilitate their own analysis. But this should not be the only audience that developers try to engage with.

Allocate more resources to user interface development. The user interface for many digital projects often seems developed as an afterthought, thrown together after completing the core functionality. However, a truly good user interface requires significant investment in design and development that needs to be integrated into the project timeline and budget.

It needs to be flexible enough to accommodate expanding tool features. Scholarly software designers should also pay more attention to research on user-centered design approaches. Development should include extensive testing as well: bugs and crashes frustrated many panelists. Though some instability is unavoidable with prototype tools, scholars were almost resentful that the time they had invested in a tool was wasted because of a critical failure, which in turn lessened the likelihood that they would return to the tool even after its stability improved.

No tool is an island; tools must support combinatorial approaches to data. As digital tools become more easily accessible and more primary sources become available online, data standardization becomes even more crucial.

To this end, tool builders might collaborate with data repositories and other tools to encourage compatibility between different formatting standards. This is not to say that all humanists and repositories must adhere to the same standards or data formats. No single approach can possibly accommodate the myriad kinds of resources and institutions that are making data available.
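To make the stakes of such compatibility concrete, the following is a minimal, purely hypothetical sketch — not drawn from any tool or repository discussed here — of how a tool might normalize another tool's CSV export into a small shared schema before importing it. The field names and header aliases are invented for illustration:

```python
import csv
import io
import json

# A tiny shared schema that two hypothetical tools agree to exchange.
CANONICAL_FIELDS = ["title", "date", "place"]

# Mapping from one tool's export headers to the shared schema
# (these aliases are assumptions for the sake of the example).
HEADER_ALIASES = {"Title": "title", "Year": "date", "Location": "place"}

def normalize(csv_text):
    """Read a tool's CSV export and return rows keyed by the shared fields."""
    rows = []
    for raw in csv.DictReader(io.StringIO(csv_text)):
        row = dict.fromkeys(CANONICAL_FIELDS, "")
        for key, value in raw.items():
            field = HEADER_ALIASES.get(key, key.lower())
            if field in row:
                row[field] = value.strip()
        rows.append(row)
    return rows

# A record exported by one tool, ready to be imported by another.
exported = "Title,Year,Location\nRailroad timetable,1869,Sacramento\n"
print(json.dumps(normalize(exported)))
```

Even a sketch this small makes the point of the preceding paragraph: the shared schema does the work, not a universal format, and each tool needs only to map its own headers onto it.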

But there is a willing audience at hand, and some explicit training about data standardization — especially since it's not exactly widespread in typical humanities training — could offer a tremendous boost to tool adoption. Our panelists were excited and inspired by the visualizations of Shaping the West; their technological uncertainty hardly deterred them from attempting new visualizations with Many Eyes.

However, data roadblocks were fatal. For the former, users found that substituting their own standardized data was impossible; for the latter, users found it too difficult to standardize their data in an appropriate way. Similarly, tools need to be as interoperable as possible, especially in terms of how they can import and export data.

Conclusions: broader goals for digital humanities tools

The participants in our survey and our panel discussion showed a great deal of enthusiasm for digital research tools and eagerness to engage with them.

Although their interest is demonstrable, unfortunately so is the insufficient usability of many digital humanities tools. As our panelists indicated, concerns about the extent to which digital humanities tool developers consider what humanities scholars actually want to accomplish [Warwick] remain strong.

From explicit statements alone, it appears that our scholars are about equally divided as to whether widespread adoption of digital tools should happen in their field of history. Perhaps this reflects optimistic and pessimistic points of view about technological change generally. But what resonates most strongly from our panel discussion is that virtually all of the participants reported, at some point, a glimmer of hope about how digital tools might help them research in new ways and re-conceptualize their work.

Yet their frustrations over steep or insurmountable learning curves considerably dampened their hopes. At the same time, our panelists made clear that, if interesting results could be produced in a short time, they would be inspired to use the tools even more.

Perhaps such rough-and-ready use should be a more explicit aim of digital humanities tool development. With the first wave of digital humanities tools having produced excellent experimental and prototypical work, the fundamental barrier to wider adoption now seems to lie in quality interfaces, accessible documentation, and expectations management. Many tools seem to downplay the importance of the user interface and documentation with the implicit rationale that people who are really interested in using the tool will figure out how to make it relevant to their own work.

Our survey and discussion show that this is often not the case. There are plenty of interested, curious, and technically capable humanities researchers who have little time and patience for trial and error with new methodologies when they are uncertain of their value. However, they remain receptive to the possibilities offered by the tools. When considering a user sympathetic to the promise of digital history tools, some leading by the nose is not only helpful, but also necessary.

An appropriate social contract is not just about writing functional code, but also about creating an experience that helps mediate a potentially uneasy relationship between researcher and data, however that data is represented.

Furthermore, though often seemingly outside the scope of a tool-building project, tools should not only document their functionality, but should also explicitly encourage scholars to approach their work in new ways. And in the midst of embracing new kinds of methodological challenges and cutting-edge tool development, tool designers must not forget the importance of a simple and clear user interface.

It must not only make it easy to use the tool in productive ways, but also explain what the tool is for, provide examples of how it can be used, and give non-technical details about how it works in order to minimize the skepticism of black-box analysis. Such ease of use will hopefully bring increased integration of technology into humanities instruction, especially in terms of research methodologies and awareness of the importance of data standardization, so that humanists are better able to communicate with the archivists, librarians, and technologists who tirelessly facilitate data exchange, whether analog or digital.

The audience for early tools was, and in some ways needed to be, other technically sophisticated humanists. But the potential audience has broadened considerably. Put another way, tool builders might consider both their tools and their target audience as more transitory than revolutionary.

Keeping the cautiously optimistic user in mind would encourage a wider user base and facilitate the traction of digital humanities methodologies. Traditional humanists are willing to venture down the digital path, but they need to feel comfortable along the way. An emphasis on cultivating a broader audience and new relationships with them must be a concern not only for tool builders, but also for funders of such tools, who must ensure that tools adequately account for the time and expense of quality interfaces and documentation.

Prioritizing a wider audience can help further adoption of tools in general, and thus further the acceptance of their use and development as comprising legitimate scholarly work.

A second phase of the search begins in the fall. You well know that provosts and deans tend these days to think and talk like venture capitalists — we choose among investment opportunities by evaluating their potential returns. Too often, and of necessity, we judge return on investment with fiscal metrics like research expenditures, royalties, and tuition revenue streams.

I am especially proud of our decision to fund a hiring initiative in the digital humanities because, while it certainly has the potential to create all of these traditional returns on investment, it produces others that our campus values equally. Our investment in the digital humanities, for example, has yielded vanguard contributions that can generalize across other disciplines and will evoke difficult and important academic conversations well beyond the boundaries of the humanities.

This portfolio of returns would seem to justify our investment easily, especially when we consider the reasonable costs associated with building strength in the humanities — my colleagues at the campus level are delighted by start-up packages bounded by a mere five digits. But there is a final return worth noting, and this is in fact a fundamental reason that motivated us to invest in this initiative.


Ann M.

Abstract

While the purpose and direction of tools and tool development for the Digital Humanities have been debated in various forums, and tool development has been the subject of numerous articles and conference presentations, the value of tool development as a scholarly activity has seen little discussion. This may be, in part, because of the perception that tools are developed to aid and abet scholarship, but that their development is not necessarily considered scholarship in and of itself.

This perception, held by the vast majority of tenure review boards, dissertation committees, and our peers, may be an impediment to the development of the field of digital humanities. Indeed, as our survey results indicate, some tool developers also subscribe to this view. A majority of respondents, however, consider tool development positively linked to more traditional scholarly pursuits. As one respondent indicated, "I develop a tool as a specific means to an end, and the end is always pertinent to some literary question."

Tool development as a methodological approach was considered no less rigorous and scholarly than other approaches: "My field is the digital humanities, and some part of my research is on how computing affects, positively and negatively, scholarly activity." Several major recent reports urge the academic community, particularly in the humanities, to consider tool development a scholarly pursuit and, as such, to build it into our system of academic rewards.

The hurdles we might expect in seeing these recommendations implemented are complicated by a parallel but distinct issue noted by the MLA Report on Evaluating Scholarship for Tenure and Promotion: namely, that a majority of departments have little to no experience evaluating refereed articles and monographs in electronic format.

The prospects for evaluating tool development as scholarship, at least in the near term, would appear dim in these departments. Scholars from the humanities, the social sciences, and computer science met in Charlottesville, Virginia for a Summit on Digital Tools for the Humanities. The present study was thus undertaken in response to some of the questions and conclusions that came out of the Digital Tools summit, and also as a follow-up to our own experiences in conducting an earlier survey, in the spring, on the perceived value of The Versioning Machine.

One of the intriguing results of The Versioning Machine survey, which was presented as a poster at the Digital Humanities conference [Schreibman et al.], was that the vast majority of respondents found the tool valuable as a means to advance scholarship, in spite of the fact that they themselves did not use it, or at least did not use it in the ways the developers of The Versioning Machine envisioned. As a result of feedback during and subsequent to the poster session, the authors decided to conduct a survey focusing on tool development as a scholarly activity.

In the end it was decided to focus the study on developers of digital humanities tools: their perceptions of their work, how it fits into a structure of academic rewards, and the value of tool development as a scholarly pursuit.

Rather than invite select respondents to take the survey, we decided to allow the field of respondents to self-select. Additionally, we sent invitations to about two dozen people whom we knew had developed tools. An initial set of questions was drawn up in the autumn and circulated to several prominent tool developers for feedback. The survey was refined on the basis of their feedback and issued to mailing lists in December. By March, when the survey closed, 54 individuals had completed it.

Survey questions were grouped into four main categories: Demographics, Tool Development, Specific Tools, and Value. These categories reflected the main emphases of the survey. In order to allow developers to comment on their experiences with more than one tool, the survey provided for demographic information to be collected once and linked to any number of tools developed by an individual. The survey was constructed in this way because we were curious as to whether developers had different experiences with particular tools, or whether perceptions of value would be consistent regardless of the type of tool developed.

For the most part, developers who described experiences with more than one tool had similar perceptions of value regardless of the tool. There were, however, some differences, including one developer who did not feel that a tool developed early in his career could be categorized as a scholarly activity, while two others could.

We were impressed with the thoroughness with which the majority of respondents completed the survey. As returns came in, however, we realized that there were several questions we did not ask but wish we had. One involved the geographic location of the respondent.


