
C&IT Centre

No 18, December 1999


News from CTICML

The proposal for a new Subject Centre for Languages, Linguistics and Area Studies, outlined in our last Newsletter, is one of a few which are still under negotiation with HEFCE. CTICML has been awarded an extension of its funding to the end of March 2000 as HEFCE are anxious to maintain continuity of service. We now expect to distribute the final ReCALL Newsletter in March 2000.

A bit of encouraging news, amongst all this uncertainty: the Unit of Assessment panel for European Studies includes in its descriptor ‘Language Learning and Teaching with the Integration of New Technologies’ and in its research output: ‘Teaching materials – including CALL materials’. Further details at http://www.rae.ac.uk. In this context, EUROCALL has been very active in collaboration with relevant international associations and organised a special Research Colloquium at the University of Essen in May 1999. One of the outcomes of this event is a Statement of Policy which is being adopted by all these associations, in an effort to encourage the recognition of research in the area of computer-assisted and technology-enhanced language learning. The full document is available on the EUROCALL website at http://www.hull.ac.uk/cti/eurocall.htm

Finally, we have to report that Jenny Parsons, Information Officer at CTICML since 1992, is moving on to a new, permanent post as Arts Faculty Administrator at the University of Hull. We would like to express our thanks for the excellent job she has done over the years and to wish her well in her new post. We shall certainly miss her!

Chinese Language Learning Materials on the Web

1. Introduction

The Oxford University Chinese Language Learning Materials project was envisaged by Mr Shio-yun Kan of the Institute of Chinese Studies, and the technical developments were realised by the Humanities Computing Development Team at Oxford University Computing Services. The project fits into a wider 3-year initiative led by the Institute of Chinese Studies to create a Centre for Teaching Chinese as a Foreign Language (CTCFL). The first year of the project involved editing existing teaching materials, and creating new ones, suitable for web delivery at elementary level; these include video clips, images, text and audio recordings. A technical framework was also developed to deliver these materials in a suitably interactive way.

The target audience for the materials will initially be undergraduates at Oxford University, later extending to the whole of UK Higher Education. It is also hoped that the materials may be useful for children learning Chinese language within the UK.

2. Learning Chinese on the web

Types of exercises

The elementary Chinese web-site course has over 20 lessons, which can be used as supplementary material alongside the currently used Chinese language textbooks. The course offers something that a textbook cannot normally cover. For instance, the video clips of dialogues give students an opportunity to see how sentences are used in a Chinese-speaking environment. In the exercise sections, visual images such as cartoons, photos and video clips are used along with English instructions and explanations to reinforce grammatical concepts.

Lesson structure

Each lesson consists of two types of exercise. The first type of exercise uses five functions: question, hint, answer, listen and comment. The question function instructs students how to carry out the exercise. The instructions are in English and use visually-aided materials to emphasise grammatical concepts. The hint facility allows students to complete their tasks according to individual levels and needs. Answers are typed into the boxes provided and can be checked against the standard examples. When the comment button is clicked, alternative answers with explanations are displayed on the screen. The second type of exercise consists of multiple choice questions, supported by the hint and comment functions.
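
To make the structure described above a little more concrete, the sketch below shows one possible way of modelling an item of the first exercise type, with its five functions (question, hint, answer, listen and comment). The field names, the example content and the checking logic are illustrative assumptions, not taken from the Oxford system.

```python
# Illustrative sketch only: one way to model an item of the first exercise type.
# All field names and the example data are hypothetical, not from the Oxford materials.
from dataclasses import dataclass


@dataclass
class ExerciseItem:
    question: str        # English instructions, with any visually-aided material
    hints: list          # graded hints for different levels and needs
    model_answers: list  # standard examples the typed answer is checked against
    audio_clip: str      # recording played by the 'listen' function
    comment: str         # alternative answers and explanations shown on request

    def check(self, typed_answer: str) -> bool:
        """Return True if the typed answer matches one of the standard examples."""
        return typed_answer.strip() in {a.strip() for a in self.model_answers}


# Hypothetical usage:
item = ExerciseItem(
    question="Translate: 'I am a student.'",
    hints=["Use the verb 是.", "Word order: subject + 是 + noun."],
    model_answers=["我是学生。", "我是学生"],
    audio_clip="lesson01_item03.mp3",
    comment="我是一个学生 is also acceptable in many contexts.",
)
print(item.check("我是学生"))  # True
```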

'Stroke order' exercises

Learning stroke order, one of the essential Chinese writing skills, helps students to remember Chinese characters and encourages written fluency. Although this writing skill is taught in classes or individual tutorials in the very early stages of learning in China, its importance is frequently overlooked by Western teachers of Chinese. As a result, most Western students are unfamiliar with stroke order and find it very difficult to read Chinese handwriting. In order to fill this gap, the CTCFL and the Humanities Computing Development Team (HCDT) have created the framework for an animated stroke order program to teach beginners how to write Chinese characters. This program shows students stroke by stroke how a Chinese character is formed in brush-pen writing style, and indicates the radical of the Chinese character concerned. Students can also use the program as a self-test facility. This program will significantly reduce classroom teaching time.

Testing the program

A limited test of the program was carried out in Spring 1999 with eight first-year students, making use of the language laboratory of the Institute for Chinese Studies, University of Oxford. The general feedback was positive and the students thought that the lessons and exercises were informative and useful. As some of these students were not particularly familiar with Chinese word-processing programs, they found it difficult to complete the tasks within the given time. A larger-scale test was also sent to all other Chinese departments in UK universities in the same period, but no feedback has yet been received.

3. Delivering Chinese on the web

A model was developed which separated the technical aspects of delivering the material from the content. This allows the 'content developer' to focus on issues concerning the content of exercises and tests. The exercise pages also had to be flexible enough to be easily used and linked to from a broader, currently developing central web site. This led to a 'modular' style of development in which three types of exercise (multiple choice, question and answer, and character drawing) were developed as stand-alone applications, but with a view to working together as a whole. This unity was reinforced by a common look and feel across all exercise types, helping users to become quickly proficient in navigating the different types of exercise. The multiple choice exercises are primarily text-based with supporting sound and graphical information, whilst the character drawing exercises are graphically-based with supporting text and sound.

The multiple choice and question and answer exercises work using CGI scripts reading information held in a structured directory hierarchy. The information is pieced together and then delivered as a whole on the web. The character drawing exercises work on the server in a similar way, but in addition a separate piece of client software needed to be written in order to display the Chinese characters encoded as vector graphics and to provide suitable interactivity for the user. Unfortunately, Chinese fonts were not suitable for this part of the project, as they cannot be de-constructed into their respective strokes. Instead, each character had to be digitised and encoded using appropriate graphical software. The choice of vector graphics over raster images was made to speed up creation and delivery of content and to make the information re-usable. Unfortunately, at the time of development there was no standard web-based vector graphic format. However, the chosen scheme, based simply on Cartesian co-ordinates, could, with a little tweaking, be made compliant with the emerging Scalable Vector Graphics (SVG) format from the W3C.
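
As an illustration of the kind of encoding described here, stroke data held as Cartesian co-ordinate points can be turned into SVG paths along the following lines. The input format below is invented for the example and is not the project's actual scheme.

```python
# Minimal sketch: turning stroke data stored as Cartesian co-ordinate points
# into an SVG fragment. The input format is hypothetical, not the scheme
# actually used by the Oxford project.

def stroke_to_path(points):
    """Convert one stroke (a list of (x, y) points) into an SVG path element."""
    start_x, start_y = points[0]
    segments = " ".join(f"L {x} {y}" for x, y in points[1:])
    return f'<path d="M {start_x} {start_y} {segments}" stroke="black" fill="none"/>'

def character_to_svg(strokes):
    """Wrap all strokes of a character in an SVG document, in stroke order."""
    paths = "\n  ".join(stroke_to_path(s) for s in strokes)
    return ('<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 100 100">\n  '
            f"{paths}\n</svg>")

# Example: a rough 十 ('ten'), written horizontal stroke first, then vertical.
print(character_to_svg([[(10, 50), (90, 50)], [(50, 10), (50, 90)]]))
```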

The system relied quite heavily on the organisation of materials into a structured hierarchy and on specific file formats. For this reason extensive documentation was provided. This was kept suitably non-technical, to ensure that the content developer could envisage how the high-level system works, while detailed appendices outlined the technical aspects of the system, allowing the materials to be built on and further incorporated into a developing programme.

In summary, the delivery system developed ensured that content developers could concentrate on content and not get sidetracked by technical issues. Content, once created, was easily editable and not hard-wired into any proprietary system.

Reports about the development of the project and a small exemplar of the materials are currently available for demonstration at http://www.oucs.ox.ac.uk/hcdt/projects.html. This technical demonstration contains very limited content; further content will be created and reviewed for use in the coming academic year.

4. Future plans for the project

The aims for developing computer aided language teaching and learning programs are as follows:

  • To supplement the main teaching program.
  • To introduce a Chinese language speaking environment.
  • To use visual aids to teach grammar.
  • To make language learning more interactive and flexible.
  • To build up an archive of teaching materials.

During a 3-year development programme (1999-2001), the CTCFL would like to spend the first year with the HCDT, focusing on creating web-based materials for elementary level Chinese language teaching. In the second year, the CTCFL will be developing intermediate level teaching materials (such as reading and listening comprehension) for the web site, with technical support from the HCDT and other relevant units at the University of Oxford. The CTCFL will meanwhile improve existing teaching material in response to feedback received from Oxford students and those of other universities. In the third year CTCFL will develop advanced level web-site material such as simultaneous interpreting exercises, dialect recognition and tasters of other Chinese languages such as Cantonese and Shanghainese. Our plan is to use the web-site to carry out each stage of the experiment and to improve our material. By the end of the third year the CTCFL will decide on the final presentation format of the material created over the 3-year period.

Shio-yun Kan, Institute of Chinese Studies
Peter Karas, Humanities Computing Development Team

University of Oxford


TELL Consortium News

Survey on implementation practice of TELL resources

A national survey for language practitioners forms the basis of an MA dissertation at the University of Hull on implementation practice of TELL resources in British Higher Education institutions. The aim is to identify best practice for the successful integration of TELL resources into teaching and learning. The study will investigate university cultures and infrastructures, access to information, technology, training and support, and attitudinal factors. All language practitioners in British Higher Education are welcome to participate in the survey, including those who have not yet used TELL or CALL resources.

Please download the Implementation Practice of TELL Questionnaire (pdf-file) from the TELL Web site at the URL http://www.hull.ac.uk/cti/tellquest.htm or, for an electronic or print-based copy, contact the author directly: Gabi Diercks-O'Brien, University of Lincolnshire and Humberside, [email protected].

The deadline for returning questionnaires is January 14th 2000. Every respondent to the questionnaire will automatically be entered into a prize draw for two book tokens, each worth £20, provided by the researcher.

English Materials

A new range of TELL Consortium materials has been developed to assist learners of English with

  • grammar
  • reading skills
  • English for special purposes

In some cases the materials are also suitable for native speakers of English, for example, to consolidate their grasp of English grammar.

GramEx and GramDef English

Both programs are suitable for native speakers and foreign learners of English. Each provides access to the Infogram grammar help file, which covers the major topics of English grammar and can be printed out by the user.

GramEx English assists intermediate level learners, reinforcing essential areas of English grammar through sentence-based gapfill exercises.

GramDef English offers a hypertext exploration of the grammar of six short authentic texts, and allows the user to test their knowledge of the structure and grammar of English.

REAL: Reading and Listening Strategies

REAL Reading in English for Business and Management and REAL Reading in English for Social Scientists provide opportunities for learners of English as a second or foreign language to develop reading skills relevant to their course of study in the subject area at an English-speaking university.

For further information about the English programs see the website at http://www.hull.ac.uk/cti/tell.htm IC


Software Reviews

Italia 2000

Supplier: Giunti Multimedia, Ripa di Porta Ticinese 91, 20143 MILANO - ITALY. Tel +39 02 8393.374, Fax +39 02 5810.3485, Email: [email protected]
System Requirements: Minimum 486 processor, CD-ROM drive, Windows 3.1, sound card, microphone + headphones/speakers
Price: 300,000 Italian lire (155 Euros)

This multimedia course in Italian language and culture includes a book, a CD-ROM, a videocassette and a floppy disk. The production of the course was funded by the European Commission under Lingua-Socrates schemes between 1994 and 1998. It involved institutions in Italy (Università di Parma), in Spain (Universidad de Castilla-La Mancha), in Ireland (University College Dublin and Trinity College) and in the UK (Universities of Wales, Coventry, Exeter, Warwick, Anglia University and University College of St. Martin in Lancaster). It also includes authentic sources from RAI and a local private TV company (TelEtna).

The book consists of 12 units strictly linked to the videocassettes. It covers the four main language skills through a variety of contemporary themes in Italian life and culture (youth, education, entertainment, professional life, Europe, health, food, religion, and women in society). Each unit is based on a progression system to maximise the use of the authentic resources and to foster the development of reading and listening in combination with active skills (speaking and writing).

The first pre-listening activity, designed to prepare the learner for the topic, also encourages the learner to use a variety of language learning strategies. A teacher could maximise the use of this type of activity by encouraging reflection on learning and understanding of self in relation to the language and topic learnt. The latter may involve a more explicit approach to strategies than the one used in the book. Nevertheless the book offers the framework and a good opportunity for a teacher to introduce strategies.

The second stage involves viewing without sound, providing scope for prediction and vocabulary preparation, and encouraging the careful observation of visual information, a significant component of communication and cultural differences.

Finally the listening activity starts from a general understanding and gradually zooms in on a more detailed comprehension of the dialogue. Useful for teachers and learners are the general grids which can be used during the listening activities.

A variety of activities covering lexical, grammatical and functional aspects of the language follow. Productive skills are introduced gradually through a pathway which consists of simple guided activities designed to reconstruct the text and to re-use linguistic structures encountered in the unit. More open-ended activities conclude each unit. The overall format is identical for listening and reading tasks, with regular pre- and post- activities and a core activity whilst listening or reading.

Each unit has two levels, clearly defined with a * for level 1 activities and ** for the second level. This feature is particularly useful to colleagues who, like me, are often teaching mixed-level classes consisting of lower intermediate, upper intermediate and advanced learners. This course is not designed for absolute beginners.

On the cultural level, the course offers a lot of points for discussion, and the pre-activities together with stage 2 activities are very useful for understanding language and culture.

The CD-ROM offers the opportunity to revise what is covered in the book and on video. Each videoclip is accompanied by eight exercises designed to stimulate comprehension and communication. Here the learner has the opportunity to practise speaking by taking part in role plays and answering questions. The opportunity to compare answers with a model answer is also available. The CD-ROM is an excellent source of self-study activities.

The floppy disk includes a variety of computer-assisted language learning activities based on the texts of the book and the transcripts of the dialogues. Once again this is a great opportunity for self-study, offering reinforcement of structures and skills already encountered. In addition to this variety of resources there is the opportunity to join the course over the Internet. This allows maximum use of the course materials in a variety of contexts. To my mind, the materials can be used as an open learning course as well as forming an integral part of a taught programme. The variety of resources increases the chances of learners finding a style and medium which is in tune with their preferences or learning styles. In addition it provides new opportunities to explore other ways to learn, beyond those the learner has already used.

Behind all these well-structured and well-designed resources a key element is support and feedback. The course has in-built support in that it provides keys to the exercises and transcripts of the videoclips. When these valuable tools are enhanced by use of the guide and by feedback from a tutor, other native speakers or a language learning adviser, the course can achieve its full potential.

It is a good resource to have in a self-access centre and to consider for an undergraduate course at intermediate and advanced level.

Marina Mozzon-McPherson
University of Hull

Note: there's an Italia 2000 website at http://www.italia-2000.com/ with a free online multimedia course based on the boxed set, and the developer of the software, Gavin Burnage, has a website at http://www.mml.cam.ac.uk/ital2000/CD/ with a detailed description of the software, its design and pedagogy, and its evaluation and use.


Nedercom Spelling 1; Nedercom Lezen 1.

Supplier: Nedercom Eduware, Postbus 141, 9300 AC Roden, The Netherlands, Tel & Fax +31 50 501 23 11, [email protected], http://www.tip.nl/users/nedercom
System requirements and price: refer to supplier

Under review are the two new educational software packages Nedercom Spelling 1 and Nedercom Lezen 1. As the titles indicate, the programs aim to provide a learning tool towards developing spelling and reading skills in the Dutch language.

The spelling program focuses on four different areas:

a) the spelling of verbs;
b) spelling rules regarding all words other than verbs;
c) the use of punctuation marks;
d) the spelling of difficult words.

An additional section, dealing with what is called 'basic knowledge', precedes these four sections and practises the ability to recognise various grammatical categories. If a student's performance regarding the spelling of verbs is inadequate, this section may be accessed.

The program provides two basic work modes; the first is diagnostic and the second offers exercises. The first mode offers a way of assessing a student's current knowledge and identifies which sections of the program need to be studied. The second mode presents students with a series of exercises. Each section is structured around a series of questions dealing with the particular spelling area under discussion. For each question, feedback and a brief explanation of the problem are given. A score of correct and incorrect answers, which is maintained per student, can be stored and later accessed by the course tutor.

The reading program is divided into two sections: section A uses short texts to practise locating and focusing on the central idea or theme of a text; section B focuses on the structural aspects of texts, and discusses the way in which various parts of the text (paragraphs, introduction, body, conclusion, whole texts) are built up.

Both courses present a well-structured and integrated approach. Spelling mistakes, for instance, are categorised and recognised as misspellings of verbs or of other words, which makes it possible to make students aware of the underlying problems behind their spelling mistakes - often grammatical issues - and allows them to practise these.

The programs also offer a thorough approach. The reading program, for instance, tackles the most basic concepts first (theme and several structural devices), gradually building on these towards acquiring a range of reading skills, which lead to more extensive analyses of whole texts, including themes, linking devices, structural breakdown, summary, etc. The spelling program methodically works through all the major issues of Dutch spelling, giving students a wide variety of exercises and good feedback to their answers. The reading program, in particular, uses a more educated discourse and may therefore prove helpful not only in the development of reading skills, but also in the development of a more formal kind of language use, possibly even providing an aid to teaching writing skills.

Besides being well structured, the programs are also user-friendly, both for student and teacher. The feedback given to students is admirably clear and detailed, making the most of the learning experience it offers. Also, as the students work through the program, a `schema' is automatically compiled for each reading text, storing the answers given by the student, providing them with information regarding theme, main points, structure, etc., which may be accessed by the student at any time. For teachers, the software is easy to use and useful in keeping track of student performance. The student records with correct and incorrect answers can be an effective tool in charting particular strengths and weaknesses of individual students, or of groups as a whole, and may prove valuable in deciding on future teaching strategies.

However, the packages may not be suitable for use in some teaching situations. They are not geared towards learners of Dutch as a second or foreign language, but have been developed for native learners at secondary school level. The level of language used in the software is too high for complete beginners, for instance, and will prove useful only from a relatively high intermediate level upwards. The software's origins as a tool for teaching in Dutch schools are borne out by the fact that the reading program, for instance, does not provide any links to vocabulary or grammar. This is only logical, since its focus on a native target group means that the software is not meant to develop language proficiency, although this would be desirable, indeed a priority, in a second or foreign language environment. Consequently, use of the software abroad, for instance, will require more initiative from teacher and student if optimal use is to be made of the material within a programme of Dutch as a foreign language. Nonetheless, the quality and extent of the material is such that this will be well worth the effort. Highly recommended.

Dennis Strik

University College London

Software Reviews on the Web

A review of TELL Me More Pro can be found on the CTICML website at http://www.hull.ac.uk/cti/resources/reviews/revlist.htm together with a number of other software reviews which have only been published on the website. You can also find an alphabetical index of all software reviews published both in the ReCALL Newsletter and on the website.


Innovation in Language Teaching through the ELEN Project

The ELEN (Extended Learning Environment Network) Project is funded by TLTP3 for the period 1998-2001. There are eight consortium universities* engaged in curriculum innovation, using technology to aid key skills development (in phase one) and to embed subject-specific materials (in phase two) into teaching programmes.

Phase one is underway with approximately 4,500 students and 60 lecturers across the universities taking part in pilot projects. We are now starting the development work for phase two, part of which will be focusing on languages.

We will be encouraging lecturers in languages to develop the use of learning environments, such as the Virtual Campus and WebCT, and assessment tools, such as Question Mark, to aid the embedding of TELL and other suitable materials into the curriculum. We hope to have a number of pilot projects running throughout the 2000/01 academic year which experiment with electronic resources, web technology and on-line assessments and their applicability to language teaching.

If you would like further information about the ELEN Project please visit our web site at http://home.ulh.ac.uk/cfll/elen/

* University of Lincolnshire & Humberside, University of Huddersfield, Middlesex University, Thames Valley University, Loughborough University, University of Plymouth, University of Manchester and De Montfort University


Book Reviews

CALL Environments: Research, Practice, and Critical Issues

J Egbert & E Hanson-Smith (eds.)
TESOL Inc., 1999, pp.523, ISBN 0 939 791 79 2, Price: US$ 39.95

"Most CALL texts are technology driven, that is, they are organized around the types of activities that computers are able to do. Few have presented a specific framework based on second language acquisition (SLA) research. This volume fills that need." So reads the write-up on the back cover of this TESOL publication. A lofty claim, indeed, for any book on CALL. The writer is, of course, absolutely right about the need for an analysis of CALL grounded in SLA research. CALL research has long been heavy on practice and light on underpinning SLA theory. Unfortunately, though this book has other merits, it does not "fill that need". Its attempt to organise disparate articles on CALL within an SLA framework ends up to be more like an ill-fitting, off-the-peg suit than a novel, tailor-made creation. What, for example, is the connection between criteria for selecting software (the topic of chapter eleven) and Authentic Task, the SLA category in which the chapter is placed? There's also the fact that the eight Conditions for Optimal Language Learning Environments, which form the divisions of the book, are of the common-or-garden authenticity and learner autonomy variety rather than the result of innovative or groundbreaking research. Certainly, anyone buying this book with the primary goal of furthering their insights into the SLA basis of CALL will be disappointed.

But that is not to say we should write this book off entirely. Ignoring its claims to offer insights from SLA and the problems with overall structure, this 523 page book remains a fairly comprehensive introduction to CALL theory and practice in the nineties. A glance down the Table of Contents quickly reveals that this book covers quite a broad range of topics: designing a CALL classroom, using tailor-made ELT software, setting up e-mail projects, using MUDs and MOOs, finding ELT resources on the WWW, etc. For the novice Web-enhanced language learning (WELL) teacher, these articles provide a wealth of information. The more experienced WELL teacher may also find the wide range of recommended URLs and the book's reasonably detailed bibliography useful.

Of the more theoretical articles, the ones that stand out are Bill Johnston's chapter which examines the changing definition of audience with the advent of new technologies and Carla Meskill's insight into the future of CALL (if for no other reason, the book may be almost worth buying just for this!). Of the more practical contributions, Jim Buell's whistle-stop tour of Internet-based teaching resources is well put together and is the sort of article one might recommend to a teacher wanting to know what resources are available on the Web. When I first got hold of the book I had been looking forward to reading the chapter on authoring software but I was disappointed to find that it focused almost entirely on American products produced mainly for the Apple Mac. I'm not sure the British reader would be too interested in MacLang, MacReader or Dasher or in some of the non-EFL software such as Hollywood High or Go West! The Homesteader's Challenge described in other chapters.

With CALL Environments, TESOL have entered a section of the market dominated by Dave Sperling's Internet Guide. The SLA framework aspect which seems to have been used to differentiate this book from Sperling's is, as we have seen, a bit of a non-starter. If budgetary considerations mean it comes down to a choice between the two books, I know which one I would choose. Having said that, CALL Environments is up-to-date and may still prove a useful addition to the bookshelf of the CALL aficionado.

David Catterick

University of Dundee

Editor's note: A contrasting review, published in Language Learning & Technology Vol. 3, No. 1, July 1999 may be read at http://polyglot.cal.msu.edu/llt/vol3num1/review/review1.html

The Internet and ELT

David Eastment
Summertown Publishing, Oxford, 1999, pp.60, ISBN 1 902741 14 5, Price: £15.50.

This short book is based on a report originally commissioned by the British Council in 1996, and revised in 1998. It is similar to The Future of English? (Graddol 1997) in format, length and price, the main text being supplemented by shorter texts in the margins and at the foot of each page (a layout that is presumably intended to facilitate browsing). However, although the two British Council reports have superficial similarities, they will not appeal to the same body of readers. Whereas The Future of English? is densely packed with statistical information, case studies, in-depth discussion and academic citations, The Internet and ELT is an altogether less scholarly work with fewer references and a more superficial approach. It is, however, what it claims to be: `a short, clear guide to the practical state of the art', intended for teachers, directors of studies and ELT managers.

The book is divided into six sections, covering:

  1. general information about e-mail and the Internet
  2. a 'snapshot' of ELT Internet use
  3. Internet skills
  4. electronic commerce
  5. issues, such as quality and reliability
  6. trends which have implications for ELT.

Although ELT practitioners will find practical up-to-date information in every section, the report's greatest contribution is the support it provides for managers / directors of studies who are thinking of introducing web-based learning to their institutions. Glossaries explain the terminology they will need to use when communicating with service providers, and the section on e-commerce presents the necessary arguments to persuade colleagues of the advantages of offering courses on-line. Currently the only other sources of such information are commercial providers, touting for custom, and computer magazines, aimed at the general reader.

The sections of the report which deal with teacher and learner training are somewhat less successful. There already exist a number of books and articles which provide teachers with an overview of the Internet and ELT (for example Carrier 1997 and Sperling 1998) and although this report lists plenty of URLs for teachers to test out for themselves, the web addresses are not accompanied by much evaluative comment. The problem of poor quality ELT websites is touched on, under the subheading Is it all really any good?, but the only rating criteria cited are for general educational sites, and a boxed text entitled Evaluating English Language Websites simply refers the reader to another (web-based) source, rather than training the teacher in methods of evaluation. The report identifies some Internet skills for teaching and learning, but provides less detail than other recently published sources (for example the WELL Project booklet). Regarding the tricky issue of copyright, it warns on the one hand against any unauthorised copying (p.40), but on the other hand advertises WebWhacker, `a program with which you can download complete sites to view offline' (p.46), and advocates that teachers should learn the skills of manipulating and creating: `teachers need to have sufficient control over the technology to cut and paste text from a Web page into a worksheet, to insert and resize graphics.' (p.28)

Such contradictions arise because the same issues are discussed more than once, with a slightly different stance taken on each occasion. Regarding language resources, for example, we read that:

"it is unlikely that publishers will make available high quality reference works online until solutions have been found to the problems of charging and copyright" (p.17), and "generally, the Web is not a useful source for reference materials such as dictionaries" (p.16), yet also that "dictionaries online are a growth area: .... a good variety is available on the Web, and this is likely to increase" (p.16 boxed text).

Perhaps part of the problem is the way the report is constructed, each double page spread containing several boxed texts, glosses and URLs which relate to a greater or lesser extent to the main text. On a web page these would be accessible from the main text via hypertext link, but linear text on the printed page requires more careful consideration of layout and information structure. Some of the URLs are listed many pages before or after their description in the main text; some are unannotated, leaving no clue as to how they relate to the developing argument of the chapter, while in other cases the same information is repeated three times over: in the main text, in a boxed text and as an annotation to the URL (for example the comments about Media Matrix, pp. 26 and 27). Apparently, any kind of information - case study, teaching tip, site description or anecdote - can appear anywhere on the page - along the bottom, down the right margin, or in the main text - so although the layout looks attractive it does little to facilitate information searching. The report really is a storehouse of Internet facts which readers will want to refer back to, but page headings are often uninformative, there is no index, and even the glossed technical terms on each page are not presented in alphabetical order. Interestingly, the report refers to Laurillard's comment that hypertext `undermines the structure of the `texts' it uses and reduces knowledge to fragments of information' (p.17). Has the same process taken place in the preparation of this report, which organisationally owes so much to hypertext?

Fortunately, electronic publishing also makes it possible for the text to be continually updated in the light of new developments and readers' comments, which the publishers invite. The typos, inconsistencies, and gaps we have noticed in this first edition will doubtless be corrected, and future editions will provide an increasingly valuable resource for ELT professionals.

Hilary Nesi and Benita Studman-Badillo

University of Warwick

References

Carrier M. (1997) 'ELT online: the rise of the Internet', ELT Journal 51 (3), 279-309.

Graddol D. (1997) The Future of English?, London: The British Council.

Sperling D. (1998) Dave Sperling's Internet Guide (2nd edition), New Jersey: Prentice Hall.

WELL Project Webskills for Language Learners (accessed July 30 1999) http://www.well.ac.uk/wellproj/wellbook.htm

Language Teaching and Language Technology

S Jager, J Nerbonne and A van Essen (eds)
Swets & Zeitlinger Publishers, Lisse, 1998, ISBN 90 265 1514 6, Price: US$ 87.00, DGL 165.00.

This book - edited by Sake Jager, John Nerbonne and Arthur van Essen - publishes a selection of papers presented at the Language Teaching and Language Technology conference in April 1997 at the University of Groningen. "The conference was organized to promote an exchange of ideas on how best to harness language technology to improve language teaching" (p.1). In the introduction, the editors state that CALL programs make "very little use" (ibid.) of language technology. They argue that many subdisciplines of (computational) linguistics (e.g. phonology and morphology) provide adequate descriptions of linguistic behaviour so that these descriptions and their implementation can now form the basis for robust and successful CALL applications.

The volume includes 23 papers which cover various aspects of foreign-language learning - speaking; vocabulary; grammar; reading, writing and testing; distance learning - as well as learner behaviour (users: models and studies) and some wider pictures of the field (reflections and visions). These contributions reflect the fact that the two areas of research and application - language technology and computer-assisted language learning - are beginning (again) to converge in certain projects. Language technology provides some answers for CALL in that it offers improved computational processing of language by performing one or more of the following tasks: speech recognition, lemmatization, syntactic categorisation, vocabulary extraction, parsing, text generation and speech synthesis. CALL, which in many programs relies on hypertext, digital audio and video, (simple) database technology and network communication, offers not only some insights into how these technologies are used in a language learning context, but also expertise in how computers can be successfully integrated into the language learning process.

The papers 'sit' at different points on this convergence continuum of language technology and CALL. At one end, there are contributions that describe robust and thoroughly evaluated language technology and sketch potential use in language learning - at the other end, we find descriptions of robust and evaluated CALL software for which improvements through language technology are envisaged. Only some (in my opinion) outstanding projects demonstrate the successful application of language technology in CALL software.

Carson-Berndsen (pp.11-20) demonstrates that "... finite-state phonology has shown that the argument [by Salaberry (1996) against language processing approaches to CALL due to the fact that linguistics cannot account for the full complexity of language] cannot be upheld for the phonological domain. The phonological knowledge base used ... here is a complete and fully evaluated phonotactics of German" (p.12). APron, Autosegmental Pronunciation Teaching, uses this knowledge base and generates "event structures for some utterance" (p.15) and can, for example, visualise pronunciation processes of individual sounds and well-formed utterances using an animated, schematic vocal tract. Witt and Young (pp.25-35), on the other hand, are concerned with assessing pronunciation. They implemented and tested a pronunciation scoring algorithm which is based on speech recognition and uses hidden Markov models. "The results show that - at least for this setup with artificially generated pronunciation errors - the GOP [goodness of pronunciation] scoring method is a viable assessment tool." The third paper on pronunciation, by Skrelin and Volskaja (pp.21-24) outlines the use of speech synthesis in language learning and lists dictation, distinction of homographs, a sound dictionary and pronunciation drills as possible applications.
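
For orientation, Witt and Young's goodness of pronunciation (GOP) measure is usually presented in roughly the form below. This is the commonly cited formulation rather than a quotation from the chapter: O^(p) is the acoustic segment aligned with the target phone p, Q is the set of phone models, and NF(p) is the number of frames in that segment.

```latex
% Commonly cited form of the goodness-of-pronunciation (GOP) score:
% the duration-normalised log ratio of the likelihood of the target
% phone p against the best competing phone over the aligned segment.
\[
\mathrm{GOP}(p) \;=\;
\frac{1}{\mathrm{NF}(p)}
\left|
\log \frac{P\!\left(O^{(p)} \mid p\right)}
          {\max_{q \in Q} P\!\left(O^{(p)} \mid q\right)}
\right|
\]
```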

A number of papers are based on results of the GLOSSER project, a COPERNICUS project that aims to demonstrate the use of language processing tools (comprising Locolex, a morphological analyser and part-of-speech disambiguation package from Rank Xerox Research Centre, relevant electronic dictionaries, e.g. Hedendaags Frans, and access to bilingual corpora). "The project vision foresees two main areas where GLOSSER applications can be used. First, in language learning and second, as a tool for users that have a bit of knowledge of a foreign language, but cannot read it easily or reliably" (p.88). Dokter and Nerbonne report on the French-Dutch demonstrator running on UNIX. The demonstrator:

  • uses morphological analysis to provide additional grammatical information on individual words and to simplify dictionary look-up;
  • relies on automatic word selection;
  • offers the opportunity to insert glosses (taken from the dictionary look-up) into the text;
  • relies on string-based word sense disambiguation ("Whenever a lexical context is found in the text that is also provided in the dictionary, the example in the dictionary is highlighted." (p.93)).
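
The look-up behaviour described above amounts to a lemmatise-then-look-up pipeline. The toy sketch below illustrates only the idea; the lemma table, dictionary and example words are invented, not Locolex or the GLOSSER resources.

```python
# Toy illustration of morphology-assisted dictionary look-up of the kind the
# GLOSSER demonstrator is described as performing. All data here is invented.
TOY_LEMMAS = {"chantait": "chanter", "chansons": "chanson"}   # surface form -> lemma
TOY_DICTIONARY = {"chanter": "to sing", "chanson": "song"}    # lemma -> gloss

def gloss(word):
    """Reduce a surface form to its lemma, then look the lemma up."""
    lemma = TOY_LEMMAS.get(word.lower(), word.lower())
    return TOY_DICTIONARY.get(lemma)

print(gloss("chantait"))   # -> 'to sing'
print(gloss("chansons"))   # -> 'song'
```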

Roosma and Prószéky draw attention to the fact that GLOSSER works with the following language combinations: English-Estonian-Hungarian, English-Bulgarian, French-Dutch. They describe a demonstrator version running under Windows. Dokter, Nerbonne, Schürcks-Grozeva and Smit conclude in their user study "that Glosser-RuG improves the ease with which language students can approach a foreign language text" (p.175). Part of another project took place under GLOSSER: Allodi, Dokter and Kuipers report on the effective use of Web resources in language learning. They advocate an indirect link to the remote server - via a virtual server. This virtual server (a CGI script) formats the retrieved documents (selecting relevant information, adjusting to limited space, disabling unnecessary interactivity features, unified appearance) by editing the incoming HTML before displaying it to the learner.
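
The 'virtual server' idea can be pictured as a small reformatting proxy. The sketch below is a generic illustration of that pattern only, not the GLOSSER implementation (which the review describes as a CGI script); the URL in the usage comment and the filtering rule are invented, and deliberately crude.

```python
# Generic sketch of a reformatting proxy: fetch a remote document, strip
# features the learner does not need, and return the simplified page.
# This is an illustration of the pattern, not the GLOSSER virtual server.
import re
from urllib.request import urlopen

def fetch_and_simplify(url):
    """Fetch a page and remove <script> blocks before handing it to the learner."""
    html = urlopen(url).read().decode("utf-8", errors="replace")
    # A real virtual server would also select relevant information, adjust the
    # layout to limited space and unify the appearance, as the review notes.
    return re.sub(r"(?is)<script\b.*?</script>", "", html)

# Hypothetical usage (URL invented for the example):
# print(fetch_and_simplify("http://example.com/reading-text.html"))
```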

Other CALL projects that explore the use of language processing technology are RECALL (Murphy, Krüger and Griesz (pp.62-73), Hamilton (pp.200-208)) - a "knowledge-based error correction application" (p.62) for English and German - and the development of tools for learning Basque as a foreign language (Diaz de Ilarranza, Maritxalar and Oronoz (pp.149-166)). The latter project relies on a spell-checker, morphological analyser, syntactic parser and a lexical database for Basque and the developers report on the development of an interlanguage model.

Sager (pp.82-87) introduces HOLOGRAM, an authoring shell, and concludes that "[f]urther enhancements are planned for answer analysis, where NLP techniques can be deployed to reduce the work of teacher-developers in defining possible answers even further." Yablonsky (pp.53-61) shows the possible uses of the Russicon dictionaries and language processor as language tools for Russian and Ukrainian (e.g. spell check, thesaurus, paradigm look-up). Hu, Hopkins and Phinney (pp.95-100) outline a commercially available English grammar checker - NativeEnglish - which relies on identifying "errors by matching patterns of known errors in its database" (p.96). Rothenberg (pp.146-148) outlines a distance learning course for Spanish beginners which uses the web and CD-ROMs and in some exercises relies on automatic speech recognition. A program that "was developed to diagnose and remedy deficiencies in traditional parsing skills" is presented by van Heuven (pp.74-81).

Two papers deal specifically with issues of language testing. Düntsch and Gediga (pp.177-186) show that knowledge assessment which "uses test items to define a universe of empirical knowledge states" (p.177) can be applied to CALL programs and the relation of test items in terms of difficulty can be easily computed. de Vos and Hacquebord (pp.108-122) describe the use of Evita for computer-aided testing.

Borchardt (pp.218-225), in the last paper of the book, provides an encouraging outlook for CALL. He argues that although there are forces of "technological mercantilism" (p.218) that hinder the development and dissemination of CALL, these negative influences are not only beginning to be compensated by the momentum "generated by the convergence of theory and practice" (p.220), but that the anarchy of the Web is about the liberation of CALL from the gatekeeper that limits access to the market place.

This volume shows that a similarly encouraging convergence can be found in the area of language technology and CALL: more effective language technology stimulates a search for suitable language learning and teaching applications which can make good use of it, while a better understanding and wider use of CALL means that CALL researchers and developers are beginning to look more in the direction of human language technology for solutions that will lead to improved language learning software.

Mathias Schulze

Centre for Computational Linguistics, UMIST

Teaching European Literature and Culture with Communication and Information Technologies

Edited By Sarah Porter and Stuart Sutherland
CTI Textual Studies, 13 Banbury Rd, Oxford, OX2 6NN, UK

This collection of papers will be of immediate and intense interest to those CALLers who may also teach some of the non-language specific elements on MFL degree courses. It is aimed at combatting the dearth of research publications on the teaching of literature and culture with ICT. The papers are not intended as examples of best practice but rather they relate examples of successful practices in this area. Porter sets out the background and goes on to discuss in detail three important areas:  

  • using technology in order to make a substantial contribution to the teaching of literary and cultural subjects
  • evaluating the influence of technology on the subject matter and teaching method
  • investigating the possible or inevitable changes to the old boundaries between the subject areas.

After a nod of recognition to the past successes of CALL researchers and practitioners, Porter gives a brief description of the five main papers offered here. Her salient conclusion is worth quoting in full: "We need to look beyond the immediate focus of content which is specific to a single subject area to consider at a higher level the methodologies which are being used for successful teaching, and thus be able to make informed decisions about whether technology will enhance or dilute the teaching situation" (p.7).

In his paper, Burnage argues cogently and convincingly for the creation of full-time academic educational technology posts (an argument that will ring true to many, if not all, CALL practitioners) whilst relating the recent experiences at the University of Cambridge's Faculty of Modern and Medieval Languages in using CALL software such as TransIt-TIGER, the Web and the development and extension to other languages of the Italia 2000 project software. We can only applaud and welcome Burnage's call for the proper recognition of software creation as a publication. McNeill's paper 'A Season in Cyberspace - Reflecting on web-based resources for French studies', laments the severe lack of relevant secondary French resources that he could integrate into his courses via his own webpages. He describes three models of webpage design (linear, multi-frame and fragment-based) and includes several screenshots to illustrate the advantages of using this method over the traditional lecture and seminar. He calls for the greater integration of Web resources and discusses the implications for teaching and learning strategies, chief among them being the interactions of students with their peers and their tutors. Lee's contribution describes a JISC funded project Virtual Seminars for Teaching Literature mostly from a pedagogic and design angle. CALLers will be familiar with his three rules on integration regarding noticeable gains, appropriateness and not using ICT to replace teaching or teachers. The importance of this paper lies in the documentation of the process of designing and using four separate tutorials that were specifically aimed at target audiences with particular learning expectations. Davies' paper describes the distinct advantages to literature researchers of using an ICT tool in investigating some of the works of Pérez Galdós and draws to our attention the issue of editing and the displacement of the `omniscient' editor when the users of the electronic edition are obliged to make critical decisions for themselves during their researches. It will be interesting to see and trial a running version of the electronic texts.

The final paper by Fiormonte, Babini and Selvaggini on the Digital Variants Archive Project not only offers illustrations of their work but also the URL to this fascinating project so that the reader, with the required level of Italian, may visit their webpages and evaluate their labours. Digital Variants attempts to teach advanced learners textual criticism and writing skills, test their writing models and, significantly, study the creative process involved in the composition of a literary text with examples from seven authors showing drafts, rewritings and annotations. Incidentally, with recent and powerful developments in webpage design, the layout of the materials here would benefit enormously from annotational support rather than being exclusively frame-based, and greater contrasts in the use of colours would also be helpful. Despite these small criticisms, this project is highly imaginative in its scope and should be praised and supported by other researchers and teachers. The developers have already extended their model to a study of the Spanish writer Galiano after confirming that their approach had successfully contributed to the teaching of L2 writing skills. These five stimulating papers with an excellent editorial introduction should be read by all CALL researchers and practitioners.

Liam Murray

University of Warwick

Note: Print copies of this publication are sent free of charge to subscribers to the journal Computers & Texts, which is free on subscription to academic members of staff attached to UK HE institutions. Web version: http://info.ox.ac.uk/ctitext/publish/occas/eurolit/index.html

New book on CAA

Computer-Assisted Assessment in Higher Education, edited by Sally Brown, Joanna Bull and Phil Race. The book contains an edited collection of articles about the use of CAA in the UK and overseas. Available from Kogan Page at: http://www.kogan-page.co.uk


Conference Report: Advising for Language Learning

University of Hull
28 - 29 July, 1999

The summer conference on Language Advising, the third in the series, once again attracted delegates from a wide range of backgrounds and institutions. Encouragingly, more participants were in posts as advisers, reflecting the growth in the profession. The conference took the form of a series of papers and practical workshops, focusing both on the theory underpinning advising and on day-to-day practicalities.

Day one commenced with an inspiring plenary session entitled 'The Learner: self-made man or man-made self?' by Philip Riley (Centre de Recherches et d'Applications Pédagogiques en Langues, CRAPEL, Université de Nancy), which he admitted to be "theoretical". In spite of the accuracy of the description, the paper was delivered with such clarity and evident enthusiasm that it was comprehensible to all as well as entertaining. Dealing with issues of individual learner identity, autonomy and authority, as personified by the adviser, it gave rise to the question of whether a learner can indeed be autonomous when confronted by an adviser. This potentially intractable problem can, however, be resolved through discourse, a subject to which the proceedings subsequently returned.

The second paper, also with a theoretical tenor, 'The language adviser / counsellor: roles, functions and tools', given by Marina Mozzon-McPherson (University of Hull), balanced the equation in that it examined the identity of the adviser and considered the implications of terminology for the wide panoply of roles fulfilled. The critical function of discourse as a tool to assist learners in reflecting on their perceptions both of themselves as learners and of language learning per se proved an apposite development to Philip Riley's paper. The focus on the complementary role in relation to teaching was a topic to which all could readily relate, and the same was true of the issues relating to the learning environment and the skills of an adviser. A great deal of food for thought was given to those already advising and to those considering embarking upon advising.

'Support for the learner: the role of metacognition in open and distance learning' drew on the experience of Stella Hurd (Open University). She described some of the particular problems which beset students of French following a distance learning course and how enhancement of their metacognitive skills contributed to resolving them.

'The Virtual Language Adviser' by Andy Hagyard (University of Lincolnshire and Humberside) gave delegates the opportunity to see a very different type of support available for language students by browsing the extensive web-based system developed at that institution. From feedback received, this innovative approach to advising was of great interest.

The final session of the day was of a practical nature, dealing with `Setting up a Tandem Programme: problems, possibilities and considerations'. Initially Judy Jowers (University of Hull) described the current system of tandem learning, administered by the Advisory Service in the Language Institute at Hull, mentioning not only its strengths but also its weaknesses. It is not accredited but supports a large body of learners in one aspect of autonomous learning. The focus then switched to the University of Manchester, where Sandra Truscott and John Morley have recently succeeded in setting up a more formal, accredited scheme in addition to one analogous to Hull's. John having described the programme, Sandra called for delegates to participate in a stimulating group exercise to devise ways of overcoming the problems encountered in setting up this type of tandem programme.

Delegates were able to see a demonstration of Merlin, an award-winning web-based learning environment, by Debra Marsh (University of Hull). Whilst initially developed to support distance learning of EFL, it has now been extended to other disciplines. Those who made the effort to attend the presentation were rewarded with a demonstration of the environment, an overview of the range of applications currently under development and the opportunity to explore some of the issues associated with Internet-based learner support.

'Analysing learners' needs: a first step towards independent learning', presented by Annegret Jamieson, Miranda van Rossum and Judy Jowers (University of Hull), started with a general introduction to needs analyses, followed by two brief case studies of needs analyses used in a Dutch and a German module. This led to active participation in the evaluation of various models and culminated in a lively discussion on the importance of needs analysis.

A welcome change came in the shape of Russell Whitehead's entertaining presentation: 'The interlanguage strategies of Roberto Benigni in "Down by Law"'. The video clips illustrated very aptly the type of strategies which can be used by learners and suggested by advisers. No doubt the visual and humorous nature of the presentation and discussion made it all the more memorable.

Participation was predictably demanded in the session on 'Responding to Others' by Steve Page (University of Hull) of the Counselling Service. A listening assessment exercise designed to ascertain the type of response (empathic / asking for information / critical) given to various statements proved to be quite illuminating, especially to those of us who are already advisers, and certainly gave rise to reflection. A pair exercise in listening and summarising was likewise revealing; responding for 5 minutes to the question 'What would you die for?' called for some real introspection ...

Marina Mozzon-McPherson's second paper, 'The discourse of advising for language', used transcript data from advising sessions as a basis for analysis and group discussions, relating to a set of micro and macro skills employed by 'good' advisers. Again we were encouraged to review our own practices, which must be beneficial.

The final set of presentations was from the University of Salford: 'Two Pilot Language Advising Projects at Salford'. The first of the projects was presented by Roselyne Edwards and Christina Flanagan, who described the project they set up offering advice, on an optional basis, to a restricted number of non-specialist students. In spite of various obstacles, the uptake was encouraging, as was the feedback they received from those learners who participated. As a result, a wider-ranging project is to be embarked upon in the future. The second project, presented by Linda Altshul, involved e-mail advising and was again on a small scale but very well received by her learners.

The roundtable and summing up were very brief indeed as the general feeling seemed to be that heads were buzzing with ideas and time was needed for reflection on them. The overall atmosphere of the conference was very open, offering a forum for the exchange of experiences. Advisers perhaps require peer group support as much as do learners.

Judy Jowers

University of Hull

Other conference reports can be found on our website at http://www.hull.ac.uk/cti/events/confreps.htm


EUROCALL 2000

University of Abertay, Dundee
31 August 2000 - 2 September 2000

Call for Papers now available at

http://dbs.tay.ac.uk/eurocall2000/

Deadline for submission of abstracts:

31 January 2000

For information about the conference including the organisation of pre-Conference events, contact Philippe Delcloque ([email protected])

Forthcoming Events

A continuously-updated events calendar can be found at http://www.hull.ac.uk/cti/resources/calendar.htm



C&IT Centre, Language Institute, University of Hull, Cottingham Road, Hull  HU6 7RX, UK
Tel: +44 (0)1482 465872, Fax: +44 (0)1482 473816, Email: [email protected]

Site maintained by Fred Riley, [email protected]
Last updated 23 March 2000
