Given my recent (and apparently insatiable) appetite for studying the contexts, interfaces, and success and failure modes between man and machine, it’s not a surprise that I’ve been flying head-on into the field of Human Factors. Sub-disciplines include Cognitive Engineering and Human-Computer Interaction (HCI).
It would appear to me that there isn’t one facet of the field of web engineering that can’t be informed by the results of human factors research, and that this research ought to be part of anyone’s education in Operations at the very least. How we make decisions, how we operate under multiple goal conflicts (think time pressure and outage escalation), concerns for designing controls and displays, even organizational resilience: all of these have foundations in Human Factors, which I think has just as much to do with our field as Computer Science and Distributed Systems do.
So when I opened “Human Factors for Engineers”, I found that the Preface by Stuart Arnold really captured my own impressions upon discovering this field some years ago, and I’m going to quote the whole thing here:
Preface, Human Factors for Engineers
I am not a human factors expert. Nor can I even claim to be an able practitioner. I am an engineer who, somewhat belatedly in my career, has developed a purposeful interest in human factors.
This drive did not come from a desire for broader knowledge, though human factors as a subject is surely compelling enough to merit attention by the inquiring mind. It came from a progressive recognition that, without an appreciation of human factors, I fall short of being the rounded professional engineer I have always sought to be.
I started my career as a microwave engineer. Swept along by the time and tide of an ever-changing technical and business scene, I evolved into a software engineer. But as more time passed, and as knowledge, experience and skills grew, my identity as an engineer began to attract a new descriptor. With no intent, I had metamorphosed into a systems engineer.
There was no definitive rite of passage to reach this state. It was an accumulation of conditioning project experiences that turned theory into practice, and knowledge into wisdom. Throughout, I cloaked myself in a misguided security that my engineering world was largely confined to the deterministic, to functions that could be objectively enshrined in equations, to laws of physics that described with rigour the entities I had to deal with. To me, it was a world largely of the inanimate.
Much of this experience was against a backdrop of technical miracles; buoyant trade and strong supplier influence; where unfettered technological ingenuity drove the market, and users were expected to adapt to meet the advances that science and engineering threw at them. It was an era destined not to last.
Of course, from my world of microwave systems, of software systems and of maturing systems engineering, I had heard of human factors. Yet it seemed a world apart, scarcely an engineering-related discipline. People, teams and social systems were an unfortunate, if unavoidable, dimension of the environment my technical creations operated in.
Nor was I alone in these views. Recently asked by the IEE to write a personal view on systems and engineering, I light-heartedly, but truthfully, explained that I have been paid to drink in bars across the world with some of the best systems engineers around. This international debating arena is where I honed a cutting edge to my systems engineering. Above all others, I was struck by one thing: an engineering legacy of thinking about systems as being entirely composed of the inanimate.
Most engineers still view humans as an adjunct to equipment, blind even to their saving compensation for residual design shortcomings. As an engineering point of view for analysing or synthesising solutions, this equipment-centred view has validity—but only once humans have been eliminated as contributing elements within some defined boundary of creativity. It is a view that can pre-empt a trade-off in which human characteristics could have contributed to a more effective solution; where a person or team, rather than inanimate elements, would on balance be a better, alternative contributor to overall system properties.
As elements within a boundary of creativity, operators bring intelligence to systems; the potential for real-time response to specific circumstance; adaptation to changing or unforeseen need; a natural optimisation of services delivered; and, when society requires people to integrally exercise control of a system, they legitimise system functionality. In the fields of transportation, medicine, defence and finance the evidence abounds.
Nevertheless, humans are seen to exhibit a distinctive and perplexing range of ‘implementation constraints’. We all have first-hand familiarity with the intrinsic limitations of humans. Indeed, the public might be forgiven for assuming that system failure is synonymous with human weaknesses — with driver or pilot error, with disregard for procedure, even with corporate mendacity.
But where truly does the accountability for such failure lie: with the fallible operator; the misjudgements in allocation of functional responsibility in equipment-centred designs; the failed analysis of emergent complexity in human-equipment interaction? Rightly, maturing public awareness and legal enquiry focuses ever more on these last two causes. At their heart lies a crucial relationship — a mutual understanding — between engineers and human factors specialists.
Both of these groups of professionals must therefore be open to, and capable of performing, trade-off between the functionality and behaviour of multiple, candidate engineering implementations and humans.
This trade-off is not of simple alternatives, functional like for functional like. The respective complexities, characteristics and behaviours of the animate and inanimate do not offer such symmetry. It requires a crafted, intimate blending of humans with engineered artefacts—human-technology cooperation rather than human-technology interaction; a proactive and enriching fusion rather than a reactive accommodation of the dissimilar.
Looking outward from a system-of-interest there lies an operational environment, composed notionally of an aggregation of systems, each potentially comprising humans with a stake in the services delivered by the system-of-interest. As external system actors, their distinctive and complex characteristics compound and progressively emerge to form group and social phenomena. Amongst their ranks one finds, individually and communally, the system beneficiaries. Their needs dominate the definition of required system services and ultimately they are the arbiters of solution acceptability.
Essentially, the well-integrated system is the product of the well-integrated team, in which all members empathise and contribute to an holistic view throughout the system life cycle. The text that follows may thus be seen as a catalyst for multi-disciplinary, integrated teamwork; for co-operating engineers and human factors professionals to develop a mutual understanding and a mutual recognition of their respective contributions.
In this manner, they combine to address two primary concerns. One is an equitable synthesis of overall system properties from palpably dissimilar candidate elements, with attendant concerns for micro-scale usability in the interfaces between operator and inanimate equipment. The other addresses the macro-scale operation of this combination of system elements to form an optimised socio-technical work system that delivers agreed services into its environment of use.
The ensuing chapters should dispel any misguided perception in the engineering mind that human factors is a discipline of heuristics. It is a science of structured and disciplined analysis of humans — as individuals, as teams, and as a society. True, the organic complexity of humans encourages a more experimental stance in order to gain understanding, and system life cycle models need accordingly to accommodate a greater degree of empiricism. No bad thing, many would say, as engineers strive to fathom the intricacies of networked architectures, emergent risks of new technology and unprecedented levels of complexity.
The identification of pressing need and its timely fulfilment lie at the heart of today’s market-led approach to applying technology to business opportunity. Applying these criteria to this book, the need is clear-cut, the timing is ideal, and the delivered solution speaks to the engineering community in a relevant and meaningful way. Its contributors are to be complimented on reaching out to their engineering colleagues in this manner.
So I commend it to each and every engineer who seeks to place his or her particular engineering contribution in a wider, richer and more relevant context. Your engineering leads to systems that are created by humans, for the benefit of humans. Typically they draw on the capabilities of humans, and they will certainly need to respond to the needs and constraints of humans. Without an appreciation of human factors, your endeavours may well fall short of the mark, however good the technical insight and creativity that lie behind them.
Beyond this, I also commend this text to human factors specialists who, by assimilating its presentation of familiar knowledge, may better appreciate how to meaningfully communicate with the diversity of engineering team members with whom they need to interact.
It’s almost as if he were convinced that cooperation and communication between disciplines were not only warranted, but needed if problems are ever to have scalable and efficient solutions. 🙂