“Crisis” is perhaps too strong a word for it, but the NZ engineering profession is going through some tribulations at the moment because of the consequences of the Canterbury Earthquakes. There is, I think, a general sentiment amongst the public and within the profession that we should somehow have done better, and perhaps that’s how it should be. I am sure that I am far from alone in having both reviewed my historical work and altered my practice going forward in light of the Canterbury lessons. Engineering, like other professions, operates on the basis of a unified baseline technique – codified, standardized approaches to problems and solutions. It is sufficiently complex that, despite this baseline, results vary from engineer to engineer. Therefore, when thinking about the failures in Canterbury, we need to carefully disentangle what is wrong with the profession from what was done wrong by the professional.
After reviewing all the evidence, the Structural Engineers Society (SESOC) has formulated 10 recommendations for improving the situation:
- Candidates for CPEng at AFA and previously assessed candidates must have passed a “Fundamentals of Engineering Examination” at their next CRA
- Providing improved technical training of CPEng Practice Area and Staff Assessors
- Improvements to engineers’ postgraduate training
- An immediate need for training on earthquake assessment and seismic retrofit
- Establishing a Recognised Structural Engineer class to undertake complex work
- Establishing a working group to determine works to be classified as “complex structures”; the starting point will be the practice in British Columbia and the CERC recommendations
- Requiring candidates for Recognised Structural Engineer to pass a “Complex Structures” examination
- Promoting wider use of independent peer reviews
- Proposing the introduction of random auditing of CPEng and RSE practitioners
- Upgrading requirements for CPD and for maintaining currency under the Chartered Professional Engineers of NZ Act 2002
At first glance, it is difficult to argue that any of these is too onerous for a profession whose work is ultimately a matter of life or death for the hundreds of people in any kind of substantial building. We need to be confident that these buildings are safe, that they will perform as intended, and that people won’t die. In particular, I believe that wider use of peer review would avoid many of the poor solutions that get implemented.
However, I am worried by the ideological implications of some of these recommendations, because there is a risk that they will hurt, rather than help, the professionalism of engineers. The main one is the introduction of examinations. While this sounds excellent on paper, I think it has the same potential pitfalls as other highly prescriptive approaches to knowledge.
The problem, in its simplest statement, is that engineering uses science but is not itself scientific, inasmuch as there are still substantial elements of it that are non-technical. A casual look at the current framework for assessing competence shows 12 competencies, only 2 of which relate to “engineering” in a technical sense. The rest are about professionalism in various forms, and by recognizing their limitations in the technical areas, engineers can remain competent professionals providing a valuable service. But even if we (probably rightly) regard the technical competencies as hard limits on acceptability, there is significant variation in approach and result across engineers. For example, my firm belongs to a small practice group that gets together to discuss technical and professional matters. We recently picked a site in Wellington and used the loading standard to determine the wind speed. It will not surprise anyone that there was significant variation in the answers we obtained – which is to say, if one of us had set the exercise as an “exam,” the others would all have “failed.”
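To see why such variation is unsurprising, consider the shape of the calculation. The loading standard (AS/NZS 1170.2) determines a site wind speed as a regional wind speed multiplied by a string of modifying factors, each of which involves a judgment call (terrain category, shielding, topography). The sketch below is illustrative only – the factor values are hypothetical choices, not values taken from the standard’s tables:

```python
# Illustrative sketch of the AS/NZS 1170.2 site wind speed form:
#   V_sit = V_R * M_d * (M_z,cat * M_s * M_t)
# All factor values below are hypothetical, chosen to show how
# individually reasonable judgment calls compound.

def site_wind_speed(v_r, m_d, m_zcat, m_s, m_t):
    """Combine a regional wind speed (m/s) with direction,
    terrain/height, shielding and topographic multipliers."""
    return v_r * m_d * (m_zcat * m_s * m_t)

# Two engineers, same site, different (defensible) judgment calls:
engineer_a = site_wind_speed(v_r=46.0, m_d=1.0, m_zcat=0.91, m_s=1.0, m_t=1.20)
engineer_b = site_wind_speed(v_r=46.0, m_d=1.0, m_zcat=1.05, m_s=0.90, m_t=1.44)

print(round(engineer_a, 1))  # 50.2 m/s
print(round(engineer_b, 1))  # 62.6 m/s
```

Because wind pressure scales with the square of speed, even modest divergence in the multipliers produces materially different design loads – without either engineer having made an identifiable “error.”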
More generally, the standards are all based on a kind of practical moderation of theory, formulated by panels of experts and practitioners – they are not “real” in the sense that the gravitational constant is real. So the actual mechanisms used to derive an answer are necessarily approximate to begin with.
Perhaps the most disturbing thing about this proposal, from my point of view, is that at the moment the concept of registration is that you assemble a portfolio of evidence and then make an argument that you have taken a good approach. You must defend your thinking both in writing and in a face-to-face interview, having first persuaded two other engineers to endorse even submitting the portfolio. It is a very human process for a very human activity, and I worry that within the constraints of an exam, that flexibility will be lost. One of the case studies I submitted in my portfolio was essentially me disagreeing with another engineer of greater seniority and specialization – in an exam, that would have been a black mark, but in the human environment it demonstrated a practical and professional approach to getting the best solution for my client.
I may sound a little hypocritical here, because my principal strategy for interviewing potential employees is to have them sit an engineering principles test, and failing that test does not help your chances. However, the real test is not the one people fill out under exam conditions, difficult though it is – the real test comes when we sit down and discuss the reasoning behind their answers. I have endorsed the employment of candidates who got a lot of things wrong, because I could see that they had the right approach and attitude and so were teachable, and I have done the opposite too, where “correct” tests were not backed up by the right attitude to knowledge.
My impression of the problems prompting these proposals is that most would already be addressed by more stringent application of the existing framework. The proposals read a bit like a grab-bag of buzz phrases and concepts, rather than a serious attempt to think about the ways in which the current system does or does not work.