France Belanger and Dianne H. Jordan.
Evaluation and Implementation of Distance Learning: Technologies, Tools and Techniques.
Hershey, Penn.: Idea Group Publishing, 2000.
paper, 256 p., ISBN 1-878-28963-2, US$69.95.
Idea Group: http://www.idea-group.com
The target audience for this book is set out as "professionals and educators who are interested in preparing themselves for the transition from traditional learning to the emerging distance learning environment." This, in effect, sums up what the book is about. It covers the life cycle of analysis, design, development, conversion and evaluation of distance education material produced by the tools of technology. The key thread throughout the book is transition - you do not have to do it tomorrow; in fact, that is not a good idea.
The book examines the need for distance learning, goes through many distance learning concepts and terms, and reviews the advantages and disadvantages of six distance learning technologies: computer-based training, computer-aided instruction, Web-based training, teleconferencing, videotape, and video tele-training. It then explores conversion issues, going through the steps needed to decide if a course is suitable for distance learning, covers the development of appropriate content, and concludes the cycle with material on evaluation. Some of the content is fairly technical and at times out-of-date (such as hardware specifications and software descriptions). There are also several case studies, a useful bibliography and some Web addresses.
For anyone already using information technology for distance learning, I doubt this book would offer anything new. In fact some might find it somewhat simplistic in parts; I found it didn't add anything new to my understanding. It would undoubtedly be of some use to distance educators considering using some of the technology to "reinvent" their courses, or to those wishing to develop distance education through the use of technology. For those groups it gives a reasonable overview of the issues and processes they will have to consider, though some may be alienated by the jargon and detail. Unfortunately, though worthy, it is not a book that enthuses me for technology-based distance learning. - Wendy Clark
Christine L. Borgman.
From Gutenberg to the Global Information Infrastructure: Access to Information in the Networked World.
Cambridge, Mass.: MIT Press, 2000.
cloth, 340 p., ISBN 0-262-02473-X, US$42.00.
MIT Press: http://mitpress.mit.edu
This book takes a broad perspective, encompassing many aspects of the question of how we access information, and how we can ensure that information continues to be accessible. In order to provide such breadth, the author complements her own research with extensive literature reviews and analyses, drawing upon research and practice in many related fields ranging from computer science and information policy to psychology and sociology. She thus covers a range of issues spanning the spectrum from the technological underpinning of the global information infrastructure to the social and psychological aspects of the ways in which we access and use information.
The component parts of the GII include "computer networks, digital libraries, digital preservation, electronic publishing, information retrieval, human-computer interface design, telecommunications, information-related behaviour [and] information policy" (p. ix). Borgman surveys the progress that has been achieved in each of these elements, while noting that it is necessary also to pay more attention to the way that humans behave in the face of these technologies. Finally, technologies and policies are only viable if people choose to adopt them, so alongside such issues as copyright and privacy, information policy must also be informed by an understanding of the ways in which people choose to access and manipulate information. Efforts to scale up the present-day Internet to an infrastructure for electronic commerce, education, entertainment and communications depend on the interactions between people, technology and content, so the book views the development of the GII as a gradual evolution of these three elements together, making possible both new ways of doing old activities, and a variety of completely new activities.
One important role envisaged for the GII is that of "Global Digital Library", a construct that Borgman proposes to encompass digital libraries that are connected to, and accessible through, a global information infrastructure linking electronic resources around the world. She points out, however, that "In view of the rapid expansion of computer networks, distributed access to information resources, electronic publishing, tele-learning, distance-independent learning, electronic commerce, and related applications of information technologies, much more research on all aspects of digital libraries is needed" (p. 51). Borgman analyses three elements of access to such a network: connectivity, content and usability. Usability is the most problematic of these, as it is limited by the ability of non-specialists to use the technology and the information available. "Information systems continue to be difficult to learn and to use, despite the technological advances of the last two decades" (p. 118). Nonetheless, such an infrastructure would empower people around the world by improving their access to information.
One way in which access can be improved is through the digitization of documents, which allows a digital image of a document to be distributed online to multiple users. So, apart from documents that are "born digital", libraries face the important task of making traditional documents available in digital form. Chapter three contains an interesting discussion of what constitutes a document, and surveys various ways in which electronic documents differ from their conventional counterparts. This is complemented at the end of chapter four by a survey of a variety of new techniques which are being developed to identify characteristics of documents and behaviours and to improve information retrieval and other aspects of document creation and use. Borgman concludes that the real power of technology in electronic publishing may be in new formats for publishing. New genres can be created that provide better searching, sorting and displays, hyperlinks between documents, and links from citations to the full source.
Debates about electronic publishing also range across a number of disciplines, as they "involve the interaction of technological, psychological, sociological, economic, political, and cultural factors that influence how people create, use, seek, and acquire information" (p. 83). Likewise, the change from print to electronic journals depends on a complex relationship between technology, behaviour, and economics: "From a technological perspective, documents are merely 'things' to be manipulated in information systems. From a behavioral perspective, however, documents have a 'social life'" (p. 115).
A theme that is reiterated throughout the book is "the reflexive relationship between human behavior and technology. People use the technologies that are available and adapt them to suit their purposes. Subsequent iterations of technology reflect those adaptations in combination with new capabilities made possible by technological advances" (p. 168). Furthermore, in the chapter titled "Acting Locally, Thinking Globally", Borgman makes the point that "Designers of each digital library must tailor their systems to the identified needs of their target audience" (p. 222), while at the same time recognising that their system will serve as part of a larger entity.
The final chapter considers how we can develop from the present-day Internet to the global information infrastructure. "The first [challenge] lies in scaling the technology, the economics, and the behavior to a network that supports several orders of magnitude more users. The second is to provide access to information in this environment. And the third is to transfer the technology and services to parts of the world with different traditions" (p. 225). Some of the necessary approaches are indicated in the trends for a research agenda for digital libraries that Borgman herself identifies (p. 167). Given the opportunities that computer networks offer for information access, the need for libraries may be questioned. However, libraries play a vital role in information infrastructure, and also fulfill the social role of promoting learning.
Even so, the picture that Borgman paints is of an infrastructure that can be accessed by people in a wide variety of situations for a wide variety of purposes. "Information technologies are converging, computer networks are extending their reach, digital libraries are proliferating, and the user community is growing exponentially. These developments combine to make vastly more information resources available to many more people in many more places" (p. 266).
The challenge that the book sets out is to ensure that the technological challenges are confronted in ways that recognise all the other factors that combine to allow a successful development of the infrastructure. Borgman's book performs a useful service in giving a broad overview of all the kinds of issues that need to be addressed. - Peter J. Beech
Daniel P. Bovet and Marco Cesati.
Understanding the Linux Kernel.
Sebastopol, Calif.: O'Reilly & Associates, 2000.
paper, 702 p., ISBN 0-596-00002-2, US$39.95.
O'Reilly & Associates: http://www.oreilly.com
Linux has fast become a serious competitor in markets that were once sceptical of the open source "upstart" and it is proving itself the sensible option for many embedded applications. This success is due primarily to the rapid rate of development and revision made possible by the GNU General Public License, the open source model adopted by Linus Torvalds when he first released the kernel some ten years ago.
However, one related area that has yet to match the operating system's popular success, despite the best efforts of writers and publishers alike, is the availability of an approachable, explanatory literature. The Linux Documentation Project provides some of the best reference material one could ask for, but there are still a number of topics for which HOW-TO guides are unfortunately not enough: the kernel itself probably being the prime example.
Unlike users of most other operating systems, Linux users can - and usually do - optimise the kernel in order to improve performance and reduce any excess baggage delivered with their distribution's default kernel. This pursuit of "kernel tweaking" is sometimes confused with the more esoteric art of "kernel hacking", something normally not undertaken lightly. One development likely to increase the number of tweakers becoming full-blown kernel hackers is O'Reilly's recent release, Understanding the Linux Kernel by Bovet & Cesati.
Although the very title could induce panic in even relatively experienced Linux users, there are many who have waited a long time for a book such as this. The original manuscript developed out of course notes prepared by the authors for their Computer Science students at the University of Rome. This means that the book is a well-planned and educational "grand tour" in comparison to the white-knuckle ride offered by much of the existing kernel documentation.
Understanding the Linux Kernel, although not requiring prior knowledge of operating system kernels, is fairly demanding of the reader and assumes a degree of familiarity with system programming in C and x86 Assembler. The authors quickly deal with the basics of a Unix kernel before delving into memory management, a subject which seems to pervade every remaining chapter.
The Linux Ext2 filesystem, device management and I/O are also dealt with extensively and, aside from any interest in kernel-level programming, the book provides excellent advanced reading for Linux users seeking more information on these subjects. Symmetric Multiprocessing (SMP) is discussed to an extent, there is good coverage of the Virtual Filesystem, and each chapter is fairly well illustrated with flowcharts and block diagrams.
Unfortunately, a book about the Linux kernel will always be like yesterday's newspaper. The authors have made efforts, at the end of each chapter, to anticipate the 2.4 kernel and this would suggest that the publishers intend this edition to see everyone through the 2.5 development kernel. The declared purpose of this book is to encourage readers to refer directly to the kernel source and, although the source is quoted - or paraphrased - throughout the book, there are few, if any, direct pointers to source files. Considering that the current source tree is a 20MB (compressed) download, it might have been worthwhile including a snapshot on CD-ROM as well as an indication of where to start the aforementioned "referring".
However, given the scope and importance of this book, these points are trivial. It will be interesting to see what future revisions of this title bring; discussion of the kernel's networking code perhaps or further coverage of kernel modules (at present dealt with in an Appendix). As it stands, this is the Linux title to talk about and, whatever your level of interest, it has to be recommended. - Rory Beaton
Further information about the book is available from http://www.oreilly.com/catalog/linuxkernel/
The Linux Kernel HOWTO is available at http://www.kernel.org/pub/linux/docs/HOWTO/Kernel-HOWTO.html
Michael H. Brackett.
Data Resource Quality: Turning Bad Habits Into Good Practices.
(Addison-Wesley information technology series)
Boston: Addison-Wesley, 2000.
paper, 354 p., ISBN 0-201-71306-3, US$39.95.
The spread and ease of use of technology have meant that data are now relatively easy to create and store. However, how useful are those data, and how can they meaningfully be made available to the organisation as a whole? To quote the author, "the information technology discipline to date has concentrated on the technology aspect rather than the information aspect of the discipline". This book tries to redress the balance by focusing on the best ways to achieve data quality within an organisation.
The format of the book is straightforward. The introductory chapter deals with the current status of data quality. Each of the next 10 chapters is concerned with important habits and procedures: recognising bad practice in terms of poor data quality, then detailing good and best practice for each of these areas. The final summary chapter exhorts us not to ignore disparate data resources, and to act to resolve the issues surrounding data quality as soon as possible. Useful appendices include summaries and examples. The checklists could usefully be employed in a variety of situations, whether by auditors reviewing data sources within a company, by business executives considering how to get meaningful and reliable company information, or by IT specialists interested in convincing top management of the need for common data standards throughout a company.
The book is easy to read and is intended for the general reader. The examples provided are relevant and pertinent to the issue that the author is describing in each chapter. The wealth of examples alone lends some credence to the author's assertion that data quality is getting worse and more disparate. Although it is clear that Brackett has considerable experience in this field and has worked on technical solutions to some of the problems of poor-quality data resources, this book offers mainly general advice rather than technical solutions. For example, there are no specific details on tools or technology currently available to ensure a high-quality data resource. This concentration on the basic principles relating to data resource quality is deliberate, as in the view of the author the basic principles are "relatively static and are independent of the current technology".
There are consistent themes in the book regarding the quality of the data resource - the importance of communication, the importance of balancing the technical and the business needs, and the importance of focusing on the organisation needs. To the author's credit, he does not promise any quick fixes or promise easy solutions. It is openly admitted that there needs to be a commitment to nurturing quality and that improving or maintaining this quality is not without costs. I found his focus on communication and understanding refreshing and in my experience, is genuinely one of the most difficult aspects of achieving quality.
Just one minor quibble about this worthwhile book: personally, I found that the use of boxes to enclose trivial statements reduced the impact of their use for important bullet points. An example was "The information technology discipline is approaching an awakening" on page 279.
In summary, I would have no hesitation in recommending this book to a reader interested in improving or understanding data resource quality. - Pam Howard
Macromedia Flash 5 Advanced for Windows and Macintosh.
(Visual Quickpro Guide)
Berkeley, Calif.: Peachpit Press, 2000.
paper, 440 p. with CD-ROM, ISBN 0-201-72624-6, US$29.99.
Peachpit Press: http://www.peachpit.com
Peachpit Press have surpassed themselves with this book for all budding "master wizards" of Macromedia's latest version of the multimedia authoring application, Flash 5. This latest addition to Peachpit Press's "Visual Quickpro Guide" series is brimming with step-by-step visual guides and concise explanations detailing every stage necessary to achieve a particular effect or action. Russell Chun deserves a lot of credit for his ability to educate the reader without breaking him or her into a sweat! Chapter one, a foundation course in Flash in its own right, along with the sample movies and Fla files included on the CD-ROM, makes this part of the book alone worth more than a lot of professional courses being offered today. Chapter two concludes the foundation course, showing how to incorporate 3-D elements and QuickTime files into movies for effective multimedia presentations that can be published on CD-ROM or transmitted over the Internet.
Flash 5 Advanced is divided into five main parts: Approaching Advanced Animation, Understanding Actionscript, Navigating Timelines and Communicating, Transforming Graphics and Sound, and finally Working with Information. These parts are in turn subdivided into 12 chapters, allowing the reader to pick up the "QuickPro Guide" at any place and browse quickly to find the relevant explanation and step-by-step visual guide.
This book will dramatically change the way in which we develop rich multimedia and interactive Web experiences. Combining any number of the complex action scripts with dozens of built-in XML commands means that the interactive possibilities are endless. Steve Vegas contributed sections on XML, Generator and the CGI GET and POST methods; accessing, processing and communicating information within projects thus becomes refreshingly easy and straightforward. The expert tips will make any site stand out from the crowd.
Increasingly, Flash content is no longer about adding only dynamic elements. Instead, when combined with languages such as XML, it becomes a more complex application which extends the user's experience beyond the four corners of the browser. Similarly, the new sound-object scripting capabilities, MP3 compression and streaming will all grow rapidly, which means that sound will be used much more creatively. All these aspects are covered in this guide.
For those who have recently upgraded from Flash 4 to 5, the book provides navigational help in finding the new locations of actions and commands that have been logically clustered together. I personally wasn't planning on upgrading my copy of Flash 4, but this book convinced me that it is a worthwhile upgrade, thanks in part to the 30-day trial version included on the CD-ROM. The CD also contains over 120 Shockwave movies providing all the sample files needed, as well as accessible Fla files to work with. The movies are available via a browser interface, and the Fla files have all the supporting documents necessary to recreate the working examples. The CD includes several other resources related to Flash, including tutorials, source files, news, relevant URLs, music clips, fonts, mailing list addresses, and details of newsgroups, message boards and forums where questions can be posted. A sniffer file and Dreamweaver Behaviour are included in the Flash Deployment Kit, as well as the players for both Windows and Macintosh platforms.
Overall this book is an impressive package which will be extremely useful to anybody involved in the development of content-rich Web sites. - Glenn Dalgarno
Steve Clarke and Brian Lehaney.
Human Centered Methods in Information Systems: Current Research and Practice.
Hershey, Penn.: Idea Group Publishing, 2000.
paper, 241 p., ISBN 1-878-28964-0, US$69.95.
Idea Group: http://www.idea-group.com
The other day my son and I had a discussion about which were best - humans or robots. We talked about the way in which humans approached problems and that robots could only do what they had been programmed to do. We chose a specific example - putting a newsletter into envelopes - and we disagreed about which would be best at this job. I talked about how boring the job was and my son pointed out that the robot couldn't deal with outside interruptions or getting more envelopes from the store room. Eventually we agreed that humans were "better" but that robots were very useful. This book reminds me of the gap that exists between the information systems world and the human world, in the same way that I was reminded about the benefits of humans over robots.
Clarke and Lehaney have brought together a group of authors from academic institutions around the world, producing a collection of papers that should be required reading for any experienced system designer. As Lorraine Warren points out, IS is essentially an applied discipline, and frequently the system designer is drawn into a technical, problem-solving approach rather than a human centered one. Warren goes on to say that the human centered approach has had to draw from many other disciplines (psychology, sociology, linguistics and anthropology) to "acquire" a theoretical sophistication. In these papers, the authors draw on a vast and varied body of research from other areas. From a practical perspective, and for an outsider to this particular world of academic research, some of the papers made me feel inadequate to say the least! However, they made me recognise the scope of work being undertaken and, most importantly, that we should not perceive one particular systems methodology as the solution to a particular problem.
Practically speaking, neither the editors nor the publishers identify the book's target audience. I have already said that I think it should be required reading for the experienced system designer. For the postgraduate student it gives an overview of current research and practice in information systems and the range of research methods available (M. Gordon Hunter provides an excellent model for the selection of a research methodology, and Cordoba, Midgley and Torres describe a wide-ranging, on-going project that synthesises a variety of techniques). For a student considering a research project in this area, this book would make an excellent starter. The extensive references throughout give scope for further investigation of the various topics, and some of the ideas definitely stimulate further thought. For example, I am still struggling a little with Wenn's idea of topological transformations. It explains what I know as "feature creep", but "... commence a dialogue that in effect brings them closer together and deforms the rubber sheet creating a depression that others may be attracted into"? Of course I have taken this out of context, but it reminds me of Warren's statement that perhaps the solution to her particular problem did seem obvious (I also made a note to this effect about Mallalieu and Clarke's case study - once I'd come to terms with the concept of a "wicked" system).
The audience? Any serious information systems practitioner who is interested in the evolution of their craft should read it and, I hope, will go back to it for inspiration and to follow up references. The increased sophistication of systems and the speed of change mean that systems design should be an evolving art. Unfortunately, there are those who should read this book but won't: Cordoba, Midgley and Torres identify the resistance of the IS administration to their approaches. This group of subject matter experts - probably from a pure technology background - will find these approaches difficult to accept and justify. To bring together the "hard" ideas involved in making a system work and the "soft" human centered ideas implicit in these approaches is a challenge for those interested in the application of these methods to systems design. It is an area that they must tackle if these methods are to be successful. This is an aspect of this research that I will watch with interest. - Wendy Baird
Derek Franklin and Brooks Patton.
Flash 5! Creative Web Animation.
Berkeley, Calif.: Peachpit Press, 2000.
paper, 540 p. with CD-ROM, ISBN 0-201-71969-X, US$39.99.
Peachpit Press: http://www.peachpit.com
I was delighted when this latest version of the Flash 5 Creative Web Animation guide landed upon my welcome mat. The delight, though, was short lived as I delved quickly within its 540 pages. The more I read, the more I was convinced that designers and programmers who have purchased the actual software and license are better served by the Flash 5 manual accompanying the retail package. If, though, like me, you are evaluating the 30-day demo of Flash 5 while deciding whether to purchase an upgrade from a previous version, then you'll need a manual of sorts. There is, in fact, a bit of a learning curve in discovering the new locations of some of the commands and context menu items that are now not where you expect to find them (they have been re-arranged logically elsewhere). If you are an absolute newcomer to the Flash program then this book is really aimed at your bookshelf. It still has the ability to teach a lot about how the application can assist in creating powerful interactive Web elements for your and your clients' Web sites.
When the first Flash 3 Creative Web Animation guide was published, it was one of the best teaching and reference aids in the marketplace at the time. It continued to play an important role when it helped introduce the new Flash 4 features. To some extent, from the beginner's point of view, it still contains a great deal of good information. However, in my humble opinion, this new version of the book does not live up to its predecessors. If it were not for the coverage of the new Flash 5 features, you might as well stay with version 4 of the guide.
There is an indication on the back cover that recommends this title to beginner, intermediate and some advanced multimedia programmers and designers. And this is probably the main problem with the book: it is trying much too hard to cram all of the program's power into its 16 chapters. The writers, Derek Franklin and Brooks Patton, should have divided the material into two parts, or two volumes, for example a "beginner - intermediate" book and a more "intermediate - advanced" guide. Designers and programmers would then have been able to benefit from additional material, references and more illustrations/screen shots of the intricate stages of putting a complex action script together. Instead, a considerable amount of space has been dedicated to colour-enhanced pages depicting sites that are available online; these could just as well have been included on the CD-ROM. Some of the book's examples could also have been enhanced by the use of spot colour.
The CD-ROM that accompanies the Creative Web Animation guide is not so generously full of sample movie files or accessible .fla files. Disappointingly, there is not a single example file or supporting document to complement the first chapter; it would have been an ideal place in which to include visual and "coloured" references to the different OS desktops, or even to highlight the differences between the dockable tool palettes and the work area environment, giving the user a better visual reference. Having said that, there are a few excellent example QuickTime movies but, alone, they are still not worth the cost of the book. This is undoubtedly a pity, considering the great usefulness of the previous editions, which I considered valuable references and sources of accessible .fla file examples. The new edition is rather more suitable for the beginner progressing to the intermediate level.
If you, like me, are evaluating the new version of Flash, you might find some of the information valuable, because it gives a glimpse of the new and powerful authoring features. However, there are other guides on the market that provide more, in a more succinct format, separately for both the beginner and the professional. - Glenn Dalgarno
Robert D. Galliers, Dorothy E. Leidner, and Bernadette S. H. Baker (editors).
Strategic Information Management: Challenges and Strategies in Managing Information Systems.
Boston: Butterworth-Heinemann, 1999.
paper, 590 p., ISBN 0-750-63975-X, US$42.95.
This second edition is not light reading, in either the figurative or the literal sense. This tome on management offers a wide variety of articles with a European and American flavor. A framework is offered that provides structure for putting information systems (IS) into the larger strategy for business. A textbook for business management classes, this book also provides discussion questions that readers, even if not students, can use as self-directed learners in IS strategy and planning.
Strategy is important because of all the effort that has gone into the development of information systems. Viewpoints of IT have changed as the decades have rolled by, and technology has also changed, from fragmented to integrated. With the focus now on business issues, organizations can focus on information as a resource to be used in novel ways and on developing information systems as their business's nerve center (p. 19).
Readers should be aware that a certain level of knowledge is expected. Graduate students in a business class will appreciate that the readings are contemporary, but should also know that the future course of IS is neither detailed nor predicted. Readers looking for trends are better off turning to their favorite online tech resource, such as Tech Republic. New for this edition, however, are readings emphasizing interrelationships, as well as updates to research discussed in the previous edition, originally published five years ago.
The preface and the introduction are must-reads; both provide the framework and serve as an advance organizer, and figure 0.1 in the preface affords a great visual overview. It is extremely helpful that not only is there an introduction to each section, but also a reinforcing icon at the start of each section to let readers know at a glance the conceptual relationships of the chapters. The chapters are broken into four categories: IS strategy, IS planning, the IS strategy-business strategy relationship, and IS strategy and the organizational environment.
The first section, IS strategy, sets the groundwork by explaining the difference between IS planning and strategy. The section is designed to highlight strategy, with planning set aside for the next section. Stages of growth are discussed, with sample strategies from real-world examples. CEOs may be particularly interested in the chapter that discusses their role in managing change in the context of information technology. The research in this chapter looks at the roles of CEOs in managing organizational change, and how attitudes and assumptions vary. In 1987, almost 100 executives were interviewed; all of them were male. While the author cautions against generalizing the findings, it is interesting to note that one of the CEO's roles is as "disconfirmer" (p. 105), the purpose of which is to acknowledge that things are awry. CEOs also take on the role of parent, role model, and a host of others. Discussion about how IT impacts the organization is particularly intriguing because it shows that control must be shared if knowledge gaps are identified and action is needed (understanding will have to come before action can be taken). CEOs as change agents differ in orientation and across organizations, but all of the CEOs are driven in their purpose.
Part two, IS planning, focuses on information systems planning, a means by which IS strategy is developed. This section puts planning in context and discusses various approaches to planning, as well as evaluating the outcomes of IS planning. The first chapter in this section details the important information technology services for large organizations. Opening with a scenario showing the technological tools of the future in use, the reader quickly becomes aware that in order to implement the "dream tools", IT must be an integral part of the planning process. Predictions 20 years old are assessed and then predictions for the next decade are offered, e.g., client/server will evolve from predominant technology architecture to important application architecture (p. 165). Assumptions, business drivers, and applications set the stage for managing IT-based innovations and their challenges. The authors caution that "prospective futurologists are advised to consider the track record of their profession" (p. 182) if they plan to make realistic predictions.
The third part, IS strategy-business strategy relationship, focuses on key issues, such as business process reengineering and e-commerce. One chapter, in fact, is devoted to presenting four case studies of such "market-making" successes and failures. Electronic markets emerged as a way to reduce costs of conducting transactions. The member traders, the authors write, find that "separating logistics from the transaction is revolutionary" (p. 405) and that everything associated with the new method of transaction must be redesigned to accommodate this "revolution." For discussion purposes, Australian and American beef, Japanese used cars, and Dutch flower and potted plant trading are discussed in detail, along with a comparison of characteristics, gains, and challenges.
The next section, IS strategy and the organizational environment, discusses the wider context of which IS strategy is part, namely the globalization of business, decision-making, organizational culture, and knowledge management. For businesses operating on an international scale, coordinating the IS strategy requires that creative alternatives be developed. For example, it is important to be aware of technology standards at the local and international level so that the strategy can appropriately factor them into standard processes. Information technology and knowledge management (KM) are, in one chapter, treated as a cultural rather than a technological concern. KM helps "organizations create and distribute internal knowledge" (p. 523); the chapter helps to pinpoint challenges in an organization's culture so that barriers can be anticipated. Behavioral dimensions drawn from the literature (e.g., employee- vs. job-oriented) are complemented by 13 "propositions" (perceptions of barriers) that range from information culture to sub-unit culture to individual/organization culture. For example: "Individuals perceiving their tacit knowledge as high in individual and corporate value will engage in selective sharing of tacit knowledge." The chapter also discusses the differences between tacit and explicit knowledge.
Author and subject indices round out this work. - Beth Archibald Tang
Michael Gurstein (editor).
Community Informatics: Enabling Communities with Information and Communications Technologies.
Hershey, Penn.: Idea Group Publishing, 2000.
paper, 596 p., ISBN 1-878-28969-1, US$139.95.
Idea Group: http://www.idea-group.com
The logical first step when this weighty, 600-page book with its tightly packed typeface arrived was to ask what "Community Informatics" is. In the call for contributors to this book, Michael Gurstein described Community Informatics as "an approach for linking the opportunities which Information and Communications Technologies (ICT's) present, with economic and social development efforts at the community level in such areas as support for SME's and electronic commerce, community and civic networks, electronic democracy and on-line participation, self-help health communities, on-line advocacy, and cultural and linguistic regeneration among other areas." My common-sense understanding is that it is the application of information and communication technologies to assist social, economic, political or cultural communities.
Dr. Michael Gurstein is currently an Associate Professor in the Management and Technology Program of the Technical University of British Columbia and the Director of the Centre for Community Informatics at TechBC. He is a leader in this field who sees community informatics as an extension of the better known field of health informatics, bringing together various disparate ICT areas to a purposeful focus of supporting communities and community development.
The book is a collection of invited papers and chapters by researchers and practitioners working in this area; in particular it focuses on trends and opportunities as we move into the new millennium. It has contributions from over 40 authors, and I counted at least 12 countries represented in its 26 chapters. At first glance the book seems dense and impenetrable, but I have to say I liked it. There are some very interesting pieces in the book, which covers many important issues and topics. Just a glance through some of the chapter headings indicates the breadth of the material:
- The Access Rainbow: Conceptualizing Universal Access to the Information/Communications Infrastructure
- Requirements for a Regional Information Infrastructure for Sustainable Communities: The Case for Community Informatics
- Embedding the Net: Community Empowerment in the Age of Information
- The Role of Community Information in the Virtual Metropolis: The Co-Existence of Virtual and Proximate Terrains
- Building the Information Society from the Bottom Up? EU Public Policy and Community Informatics in North West England
- Community Networks for Reinventing Citizenship and Democracy
- ICT and Local Governance: A View from the South
- Community Informatics for Electronic Democracy: Social Shaping of the Digital City in Antwerp
- Internet-Based Neighborhood Information Systems: A Comparative Analysis
- Community Impact of Telebased Information Centers
- Cafematics: The Cybercafe and the Community
- Communication Shops and Telecenters in Developing Nations
- Linking Communities to Global Policymaking: A New Electronic Window on the United Nations
Michael Gurstein opens the book with a broad introduction in the chapter "Community Informatics: Enabling Community Uses of Information and Communications Technology". This sets the context, provides an historical perspective and lays out the challenges, as well as the hope communities have for the Internet, ICT and the new technologies as they try to influence the direction and development of the all-pervasive global economy. There is so much in this book that, by its nature, it is one you will tend to dip into, stopping to read in more depth the chapters that interest you. I examined several chapters in which I have a personal interest and I found them to be informative and useful. I have been involved in a small project that has been working to bring information technology and systems into women's community groups in the UK (see http://www.womenconnect.org.uk) and it was with this in mind that I examined several portions of the book. The attempts to bring the technology to communities are described by Jo Pierson of Vrije University in Brussels, with the fun idea of a travelling "cyberbus".
Doug Schuler, in "New Communities and New Community Networks", demonstrates that strategic, planned approaches can enable communities to use ICT to advance their local, national and global causes in creative and often very effective ways. The vital role of the cybercafe as a "social portal", the "community centres for the 21st century", is explored by James Stewart in the chapter "Cafematics: The Cybercafe and the Community". I was heartened by the description of the Chicago Neighbourhood Early Warning System and the other case studies around housing activism and local involvement in planning and decision-making. Theoretical models are not neglected, and I found the chapter "The Access Rainbow: Conceptualizing Universal Access to the Information/Communications Infrastructure" offered a persuasive and clear seven-layered model which I am certainly going to use in my own community IT consultancy work.
Another interesting chapter was "On-Line Discussion Forums in a Swedish Local Government Context". The approach described here is becoming far more widely used now and, having been involved in some online forums recently, I know the lessons in this case study could well be applied in other settings. Going online alone is no guarantee of success. Broader areas of citizenship and democracy, the cultural impact and bias of information are also addressed as well as relationships between business and local communities, "real" communities and "virtual" communities. There are also suggestions that those communities that are already successful should assist similar communities elsewhere in the world, sharing skills as well as resources as a sort of international mentoring approach.
There is a wealth of valuable material in this book, including many comparative case studies. Anyone interested in ICT and how it affects communities and community development will find some chapters of real relevance. The extensive references are a gold mine for the information-curious.
Do not be put off by the book's bulk or its price, for it is worth it. I only wish that the publisher or editor had set up a Web site with links and updates. - Wendy Clark
David Harel.
Computers Ltd.: What They Really Can't Do.
Oxford; New York: Oxford University Press, 2000.
cloth, 221 p., ISBN 0-19-850555-8, US$25.00.
Oxford University Press: http://www.oup.com
Among the millions and millions of words written every year celebrating how powerful and empowering computers are, this little book stands out quite boldly, expressing a different view: computers are not such an omnipotent human creation able to solve all our problems. Not now, not ever, despite the formidable progress made by hardware and software designers in recent times. In fact, this is how David Harel, the Dean of the Faculty of Mathematics and Computer Science at the Weizmann Institute of Science in Israel, sets the scene in his preamble: "This book concentrates on the bad [news]; on the negative side of things. Computers are expensive, which is already bad news. They frustrate us: programming them is laborious and using them can be difficult; they are seductive, luring us away from more important things; they err; they crash; they contract viruses; and on and on. But it is not these kinds of bad news that concern us here. The goal of the book is to explain and illustrate one of the most important and fundamental facets of the world of computing - its inherent limitations."
Normally, writing about "bad news" would not be a very jolly endeavour; however, Computers Ltd. makes for refreshing reading, as it provides a stimulating discussion of those areas (mainly mathematical in nature) in which computers are stretched to their limits. Thus, in the first chapter, the author starts by examining the concept of the algorithm, which is at the basis of everything a computer can do. In this respect, it might appear, at first, that most of the material considered is not really relevant to the 'everyday' uses of computers, such as editing a spreadsheet or playing a video game. The widespread perception is that such tasks are quite distinct from the more traditional numerical analysis and number-crunching operations with which we have identified computers (after all, what does a "computer" do except for "computing by calculation"?). However, at a low level, every ordinary desktop PC carries out an enormous number of plain calculations, all predetermined by the developers of the software, in order to achieve the desired result. In the end, whether a boring word processing document or the entire Toy Story film, we are dealing with long streams of ones and zeros. Nothing more. This is, in my opinion, where the true educational value of the book lies. By abstracting the concept of computing, Harel reminds us that everything can be reduced to basic building blocks.
The bulk of the discussion is devoted to problems (theoretical and practical) that cannot be solved by any computer. The key here is the notion that time and storage space are the constraining elements. Sometimes an algorithm would require an infinite amount of time to yield a definite result; other times it is the quantity of data that is well beyond reality (more than the entire contents of the universe, for example).
Although all of this might sound dry, it is not. Harel's language is engaging and never tiresome. Even when examining concepts such as noncomputability or intractability, he does so with a natural directness that makes the (undoubtedly complex) notions easy to absorb and understand. Moreover, most of the chapters examine concrete problems, from colouring networks, to the tower of Hanoi, from the monkey puzzle to artificial intelligence, to the important applications of quantum and DNA computing, directly relevant to the field of cryptography. The latter, let's remember, is gaining momentum, as it forms the basis for the security infrastructure built into e-commerce and electronic data exchange.
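The sort of exponential blow-up Harel illustrates with the Tower of Hanoi is easy to make concrete. The sketch below is my own illustration, not code from the book; it enumerates the moves recursively and shows how quickly the count, 2^n - 1, grows:

```python
# Illustrative sketch (not from the book): the Tower of Hanoi puzzle needs
# 2**n - 1 moves for n discs, so the work explodes exponentially even
# though every individual move is trivial.

def hanoi_moves(n, src="A", dst="C", via="B"):
    """Return the full sequence of (from_peg, to_peg) moves for n discs."""
    if n == 0:
        return []
    return (hanoi_moves(n - 1, src, via, dst)    # park n-1 discs on the spare peg
            + [(src, dst)]                       # move the largest disc
            + hanoi_moves(n - 1, via, dst, src)) # stack the n-1 discs back on top

for n in (3, 10, 30):
    print(n, "discs need", 2 ** n - 1, "moves")
# Even at a billion moves per second, 64 discs (2**64 - 1 moves) would take
# well over 500 years - the kind of blow-up meant by "intractable".
```

Doubling the input size does far worse than doubling the running time, which is why no foreseeable hardware improvement rescues such algorithms.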
Overall, Computers Ltd. is a delightful little book which opens one's mind and exposes the real capacity of computing machines. The negativity that transpires from its pages should not be taken as sheer pessimism or a dismissal of what we have achieved so far. Harel knows only too well the progress we have been making in recent years. However, his message is that we should not be complacent about it. After all, the power of any man-made machine is limited by the power of our own mind. For the time being, at least. - Paolo G. Cordone
Linda Lau (editor).
Distance Learning Technologies: Issues, Trends and Opportunities.
Hershey, Penn.: Idea Group Publishing, 2000.
paper, 264 p., ISBN 1-878-28980-2, US$69.95.
Idea Group: http://www.idea-group.com
Distance Learning Technologies sets out, in its own words, "to provide both academicians and practitioners with a body of knowledge and understanding regarding the distance learning technologies." It consists of a series of contributions by academics and others, singly or in pairs, and relies wholly on American experience, apart from one intriguing contribution from Egypt. The first section, of four chapters, looks at theoretical foundations. The second, of seven chapters, looks at what the editor calls the conceptual aspect, though the contributions seem to focus more on nuts and bolts, on looking at what works and does not work. The third section consists of five case studies of the use of distance learning technologies in a variety of settings.
This was a difficult book to review; in progressing through it, the reviewer was beset by hope and frustration in equal measure. It covers the field well enough, with a spread of academic, business and government settings to pick from; the authors seem well enough versed in their fields; there is plenty of detail. Yet the overall feeling is one of curious disappointment, a sense that the contributors did not have their thinking caps on when they wrote their pieces. Much is said about learning theory and much is said about distance learning technology, but the two never seem to meet up in an integrated way.
The first chapter sets the tone. Written by Valerie Morphew, an assistant professor of education, it examines Web-based learning from a constructivist viewpoint. She begins by defining constructivism as an approach in which "the student co-constructs new meaning as a knowledge building process - piece by piece, new knowledge is built onto former knowledge", a house-building metaphor which has no room for the fundamental premise of constructivism that old knowledge is transformed in the making of the new knowledge. She says later, when discussing curriculum planning, that "theoretically, the curriculum taught should equal the curriculum learned", again a curiously rigid formulation for a constructivist, who should delight in the moments when the students take the ball away from the instructor and run with it where they please. Perhaps in the measured world of American academia, learning things that are not on the curriculum doesn't count. Her overall account of the details of constructivist learning is good enough, and she goes on to make recommendations for the planning of distance learning curricula. The trouble is that her recommendations are very generalised, and could apply to all curricula under any circumstances. There is little to mark out the ways in which distance learning is different or ways in which the technology seriously influences the way the course is delivered. The most interesting statement in her contribution comes very close to the end, when she says that most Web-based instruction today is based on behaviourism, viewing the learner as an empty vessel waiting to be filled. I wish she had used this as her starting point. It is even more tempting to use the Web in the least challenging way than it is to use face-to-face teaching in the least challenging way: bung some pearls of wisdom up on a Web page, and let the students digest them at their own pace.
The best teaching, as she clearly knows from her constructivist grounding, is that which is genuinely interactive. The chapter would have been much stronger if she had used the space allotted to outline how distance learning technology compares with face-to-face methods: how it fares badly, how it fares differently, and how it improves on the physical proximity version.
This kind of unjoined-up thinking tends to pervade: too many of the contributors, while discussing the possibilities of interaction, seem to view teaching as, at bedrock, a one-way process. Nguyen and Kira, for instance, describe it as "a communication process in which a body of knowledge is delivered from an instructor to students". Too many see distance learning as a phenomenon which must be inferior to face-to-face learning, and seem unable to comprehend the possibility that it might bring attributes to bear in a different way. Schrum, for instance, discussing the place of evaluation, says that students currently assess most instructors, but difficulties arise online because "in an online environment the love of subject, commitment to students, sense of humour, and willingness to adapt might not come through". There is a mounting body of evidence that relationships made through the Web can be just as complex and subtle as those made face-to-face. They will be differently constructed and emphasise different things, but they will not inevitably be of lower quality. Yermish says about video-conferencing: "from a student standpoint there must be some diminution of perceived quality when the instructor is a small figure on a TV screen instead of a live being in a face-to-face environment". This is probably so, but he does not balance this with any account of the advantages that video-conferencing may provide, apart from overcoming the exigencies of trying to make more money in an increasingly competitive market. Only one group of contributors, Purcell and Purcell-Robertson, say that distance learning can be as rich as or richer than the face-to-face variety.
I have led with criticism, and the book deserves it, but it has its strengths too. The Purcells' piece on interactive distance learning and the building of community is very good. Berg and Smith's ideas on the use of change management techniques have some thought-provoking points. The two best organised contributions come from within the military structure - Meyer-Peyton from the Department of Defense and Smith and Ransbottom from West Point (perhaps we face the formation of a military-industrial-educational complex: now that would be truly frightening). Schrum, despite her lapses in thinking, has a number of interesting points on the pedagogical decisions necessary about the fundamental nature of an online course. Adams and Freman are tantalising on the transmission by distance learning of tacit knowledge. Perhaps "tantalising" sums it up. There is much that is good in this book, but it could have been so much more. - Robert Parsons
Robin Williams and John Tollett.
The Non-Designer's Web Book.
Berkeley, Calif.: Peachpit Press, 2000.
paper, 303 p., ISBN 0-201-71038-2, US$34.99.
Peachpit Press: http://www.peachpit.com
There is no doubt that, in recent years, the World Wide Web has transformed itself from a very specific and 'enclosed' space used by scientists into a mature publishing medium. A consequence of this is that designing and creating a Web site need not be an intimidating task, not even for the uninitiated. It is possible to find a multitude of good books, all covering various aspects of publishing online, from those aimed at the very beginner to fiendishly complicated tomes containing pages and pages of intricate programming code. Among those belonging to the first category, The Non-Designer's Web Book by Robin Williams and John Tollett stands out by providing a well-balanced overview of techniques and good practices. And this is not surprising, considering the pedigree of both authors.
Robin Williams is certainly a familiar name when it comes to promoting good style in electronic publishing. Her several books on DTP and typography have done much to help a whole generation of designers acquire a sophisticated taste and a sound knowledge of those little rules that make the difference between an amateurish-looking layout and an effectively constructed professional design. John Tollett was involved with art design for many years before switching over to cyberspace.
As with many books by Williams, this guide also starts by introducing the very basic concepts in a truly informative and fun way. The first chapter is dedicated to the World Wide Web in general: modems, ISPs, browsers, plug-ins, and file formats are all mentioned here. The second chapter examines the various techniques for searching the Web by explaining how search engines and directories work. In this respect, I would have named the chapter "searching the Web", rather than "searching the Internet". Confusing one for the other is a basic mistake, and the authors are certainly knowledgeable enough to avoid it.
Part two of the book covers the task of actually making Web pages. It describes how to add graphics, tables, frames, and other elements, while also providing useful tips on how to organise the files on the Web server. Even a comparatively small site can end up being made of many files, and this is where good file management comes in handy. Although only about eighty pages in at this point, the book has already covered quite a lot of ground, in language that is accessible, understandable and very effective. Good use of screen shots illustrates important concepts and shows how things appear on screen.
But the best is still to come. Part three is devoted to the vitally important issue of design practices. In my opinion, the four chapters included here should be made compulsory reading for all Web designers. It is a negative reflection of the relatively young age of Web publishing that there are a lot of ugly sites out there. Seemingly obvious practices are often disregarded, and the result is poor navigation mechanisms, non-functional interfaces, colour combinations that make the text difficult to read, as well as the all-too-common desire to overload a site with all sorts of applets, animations, interactive content and other 'moving parts' that do not really add anything to its functionality.
Williams and Tollett do their best here to minimize such goof-ups. In particular, by providing a gallery of poorly designed pages, along with the 'corrected' version, the authors provide the reader with a clear indication of how much a layout can be enhanced by a few simple modifications on text size, alignment, colour contrast, graphics placement, etc. This portion of the book is really great.
The subsequent chapters deal with more advanced topics: colour, graphics, typography as well as a discussion on how to test a site, once it is completed. There is a good deal of useful information, all presented in a succinct way and complemented by relevant screenshots or bulleted lists. The common thread always seems to be an emphasis on sleekness, elegance, simplicity and effectiveness. It might not be an easy balance to find, but this book goes a long way towards making the Web a better and nicer place.
An interesting feature at the end of each chapter is the quizzes, through which readers can verify that they have assimilated the information properly. Some of the questions are actually better described as mini-tutorials, such as the following at the end of chapter 6: "Do some redesigning of two of your own Web pages. Open them in your Web authoring software. Print the pages as they are right now, then print them again after you do some easy rearranging."
The tasks readers are asked to carry out range from grouping similar items into closer proximity to creating contrast in appropriate places. Overall, these exercises foster a definite sensitivity to the main aesthetic issues concerned. And this is what Web design should also be about. Highly recommended. - Paolo G. Cordone
Copyright ©2001, First Monday