Navigation Assessments – Present and Future

In many areas, there is reluctance to attempt to put numbers on navigational performance in a navigation assessment report. The reasons for this are twofold. Firstly, subjectivity: no two consultants will assess the same topic in the same way; one man’s 6 is another man’s 8, or even 10. Even without the tyranny of numbers, we spend much time ‘moderating’ incoming reports in an attempt to ensure consistency of delivery to the client, who needs to prioritise which vessels to target with his attention and resources. Secondly, the importance of an item depends on its context when measured, and this can change according to the circumstances of the case. For example, the provision of additional equipment may lessen the value of, or even render inapplicable, a traditional process or procedure.

Our early narrative-only reports were well received by the client; the concept was to give an accurate idea of what was happening on the bridge of their ship. This works fine where the person receiving the report is an experienced mariner themselves, but these reports – often running to twenty or more pages – take time to compose and to read. Although we extract the more important items into an executive summary and submit supplementary lists of action points, we felt that the format could be improved.

Also, some clients want or need to be able to demonstrate progress towards improved navigational performance. Therefore, we started to include a summary score sheet as an appendix to the report, giving scores of 1 to 6 (very poor to excellent) for some ten basic areas, including the five pillars of watchkeeping, collision avoidance, position monitoring, passage planning and bridge teamwork. Indeed, it has given us much satisfaction to see these numbers steadily improving for each client over the years. But this does not overcome the problem of objectivity in assigning the numbers.

A possible solution would be to follow the model used by the OCIMF or CDI inspection systems, where questions are phrased so that they can be answered as either Yes, No, Not seen or Not applicable. The questions are all formulated so that a “Yes” answer indicates the desirable outcome. Any “No” answer must be explained in detail, and the other answers may be. The report is in a database format so that it can be queried in many ways, and tailored reports generated to match the particular user’s requirements.
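
A minimal sketch of such a question set, assuming one simple record per question with an enumerated answer type (the class, field and function names below are illustrative only, not the OCIMF or CDI schema), might look like this:

```python
# Illustrative sketch only; not the OCIMF/CDI schema. All names are assumptions.
from dataclasses import dataclass
from enum import Enum


class Answer(Enum):
    YES = "Yes"                      # formulated as the desirable outcome
    NO = "No"                        # must be explained in detail
    NOT_SEEN = "Not seen"
    NOT_APPLICABLE = "Not applicable"


@dataclass
class QuestionResponse:
    question_id: str                 # e.g. a passage-planning item reference
    pillar: str                      # watchkeeping, collision avoidance, etc.
    text: str
    answer: Answer
    comment: str = ""                # mandatory when the answer is NO

    def is_complete(self) -> bool:
        # Every "No" answer must carry an explanation; other comments are optional.
        return self.answer is not Answer.NO or bool(self.comment.strip())


def open_findings(responses):
    """Return the 'No' answers, i.e. the items a tailored report would surface first."""
    return [r for r in responses if r.answer is Answer.NO]
```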

Assuming a suitably large and comprehensive question set, it would also be possible, using a suitable matrix overlay, to put a score and a weighting on each individual answer, in order to generate a more objective overall report than is possible with subjective individual scoring of each topic.
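
A minimal sketch of how such a matrix overlay might be applied, assuming (purely for illustration) that a “Yes” earns the question’s full weight, a “No” earns nothing, “Not seen” and “Not applicable” drop out of the calculation, and the result is expressed on a 0–100 scale:

```python
# Hedged sketch of a weighted scoring overlay; the weights and the 0-100 scale
# are illustrative assumptions, not an established methodology.
def weighted_score(responses, weights):
    """responses: {question_id: 'Yes' | 'No' | 'Not seen' | 'Not applicable'}
    weights:   {question_id: numeric weight reflecting relative importance}"""
    earned = 0.0
    available = 0.0
    for qid, answer in responses.items():
        if answer in ("Not seen", "Not applicable"):
            continue                  # excluded from the calculation, not penalised
        w = weights.get(qid, 1.0)
        available += w
        if answer == "Yes":
            earned += w
    return 100.0 * earned / available if available else None


# Example: a heavily weighted collision-avoidance question answered "No"
# pulls the overall figure down more than a lightly weighted item would.
print(weighted_score(
    {"CA-01": "No", "PP-02": "Yes", "WK-03": "Not seen"},
    {"CA-01": 3.0, "PP-02": 1.0, "WK-03": 2.0},
))  # -> 25.0
```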

With this as the ultimate objective, we have set about developing an assessment report which is now based essentially on “yes / no” answers – which is practicable, provided it is based on a sufficiently comprehensive question set. This is now in its eighth edition, with much additional focus on the use and integration of ECDIS, as dual ECDIS with no paper chart backup fast becomes the norm. “Yes” answers don’t need to be qualified, although the Assessor may add detail where he feels it appropriate. “No” answers must be explained. The report also allows for the answers “not applicable” and “not seen”. Colour-coding allows the reader to scan quickly for the important or required information. At present, this is in word processor document format, but the intention is to move it to a database.

But ultimately, it is not about numbers or scores; navigational assessments are primarily about improving navigational performance and safety, and we should never lose sight of what distinguishes navigational assessments from audits – that is, training and mentoring.

The Navigation Assessment report therefore focuses on:

  • What was done well
  • Training and mentoring provided
  • Potential for improvement

Training is elaborated in bullet points under subheadings of the five pillars of navigational safety – watchkeeping, collision avoidance, position monitoring, passage planning, bridge teamwork – and any other items.

We see a wide range of skills and abilities from company to company, and within companies, but it is always possible to find things which were done well, and to use these to encourage ships’ personnel in other areas. It is important to understand what motivates people to come to sea. The mystery which was navigation has long been supplanted by small boxes which show latitude and longitude to a precision far exceeding that of the ENC survey data; nevertheless, we find that many navigators of all nationalities still obtain great satisfaction in learning and practising astro-navigation – and we enjoy teaching it.

Training – whether one-to-one or in formal sessions for all – is tailored to areas of weakness identified in the assessment. We will also ask ships’ personnel if there are any areas of navigation on which they would like to receive training. On a three-day assessment, the amount of training which can be provided is obviously limited, particularly where the ship is in busy or confined waters and/or hours of rest are precious. With a five-day assessment, much more can be achieved. On a longer passage, and particularly with time at anchor, we can achieve a very extensive training programme. The benefits seem almost exponential as the officers become progressively more attuned to and involved in the process.