FREELANCE RESEARCH SUPERVISOR
CONSULTANT IN WRITTEN & ORAL COMMUNICATION
Research supervision for MSc and PhD students in the social sciences (from defining your research project to data gathering and analysis)
Coaching and training in scientific writing and oral communication (writing up and presenting)
Technical writing for the industry
Guest speaker on the topic of human error management in critical systems
Here is a template in English for OpenOffice fans who happen to have a PhD thesis or a research report to write. The template has built-in headings, indexing, and page-sensitive headers and footers. Feel free to modify and share it.
Guide to dissertation writing
Here is a practical guide (in English and French) explaining how to write a research report for a Master's degree or a post-Master's degree (Mastère Spécialisé). The document also contains tips and tricks that can be useful for a PhD thesis or an article.
Useful tips for scientific publication in English
EASE publishes the Guidelines for Authors and Translators of Scientific Articles to be Published in English. It is a valuable resource for junior scientists as well as for more experienced authors looking for a quick read on best practices.
Short notes about work observation
Here is a simple, practical guide (in French) explaining how to conduct work observations in the field, from an ergonomic point of view.
This is a zip archive of mine containing reading notes on some 460 references, mostly in HMI, expertise, problem solving, troubleshooting and human error. The contents of the archive are listed here and in the zip file itself. Please bear in mind that these notes are not objective: they only contain the arguments I wanted to keep a record of. Also, the notes cover different time periods depending on the subject. Lastly, if you find this a useful resource, please consider sharing your own work with your community.
Co-Director of a French executive post-Master's degree in industrial safety management: Manager des organisations à risques.
Supervision of 50 post-MS students
Co-supervision of 7 PhD students at Mines-ParisTech between 2007 and 2012
Scientific publication and conferences
Consultancy, conferences and training for the domains of LNG, gas distribution, nuclear energy, railways, and oil & gas. The topics included HMI, human error and team management, safety management, safety culture, accident analysis and fieldwork observation.
Since January 2018, I have been a freelance research supervisor and a consultant in written and oral scientific communication. I have also been the Co-Director of the post-Master's degree in industrial safety (Manager des organisations à risques) at ICSI since July 2012.
From 2007 to 2012, I was a researcher within the Chair of Industrial Safety, held by Erik Hollnagel at Mines-ParisTech. As of today, the Chair is still hosted by Mines-ParisTech at Sophia-Antipolis (France) and co-funded by a number of industrial partners. Its aim is to carry out research towards new models of, and methods in, industrial safety, and to have a positive impact on practices.
From 2000 to 2005, I was a Research Associate in the Computing Science department of Newcastle University (UK), where I worked for six years. My role was to conduct ergonomics-oriented research within DIRC, a large interdisciplinary research project on the dependability of computer-based socio-technical systems. Before that, I obtained a PhD in cognitive ergonomics (University of Provence, France, 1999), for which I studied troubleshooting errors in technical systems and the contribution of this type of error to industrial accidents.
Human performance and human error;
Violations and workarounds;
Management of industrial safety.
Besnard, D. & Hollnagel, E. (2014). I want to believe: Some myths about the management of industrial safety. Cognition, Technology & Work, 16, 13-23. [pdf]
Desmorat, G., Guarnieri, F., Besnard, D., Desideri, P. & Loth, F. (2013). Pouring CREAM into natural gas: the introduction of Common Performance Conditions into the safety management of gas networks. Safety Science, 54, 1-7. [pdf]
Baxter, G., Besnard, D. & Riley, D. (2007). Cognitive mismatches in the cockpit. Will they ever be a thing of the past? Applied Ergonomics, 38, 417-423. [pdf]
Besnard, D. & Cacitti, L. (2005). Interface changes generating accidents. An empirical study of negative transfer. International Journal of Human-Computer Studies, 62, 105-125. [pdf]
Besnard, D., Greathead, D. & Baxter, G. (2004). When mental models go wrong. Co-occurrences in dynamic, critical systems. International Journal of Human-Computer Studies, 60, 117-128. [pdf]
Besnard, D. & Arief, B. (2004). Computer security impaired by legal users. Computers & Security, 23, 253-264. [pdf]
Besnard, D. & Greathead, D. (2003). A cognitive approach to safe violations. Cognition, Technology & Work, 5, 272-282. [pdf]
Besnard, D. & Cacitti, L. (2001). Trouble-shooting in mechanics: a heuristic matching process. Cognition, Technology & Work, 3, 150-160. [pdf]
Besnard, D. & Bastien-Toniazzo, M. (1999). Expert error in trouble-shooting: an exploratory study in electronics. International Journal of Human-Computer Studies, 50, 391-405. [pdf]
Besnard, D. & Channouf, A. (1994). Perception infraliminaire de stimulus familiers et résolution de problèmes simples. Anuario de Psicologia, 62, 41-53. [pdf]
Besnard, D., Dahani, S., Tazi, D., Tose, A., Takano, K. & Rebeillé J.-C. (2013). The culture of cultures. A worldwide meta-analysis of 21 safety culture surveys in oil, gas transport facilities & storage and services companies. Paper given at APSS '13, Singapore.
Pelleterat De Borde, M., Martin, C., Besnard, D. & Guarnieri, F. (2013). Decision to reorganise or reorganising decisions? A first-hand account of the decommissioning of the Phénix nuclear power plant. Communication given at Decommissioning Challenges: An Industrial Reality and Prospects, 5th International Conference, France. [pdf]
Desmorat, G., Desideri, P., Loth, F., Guarnieri, F. & Besnard, D. (2011). Accidents in the gas distribution industry. Some consequences of the introduction of new analysis criteria. Paper given at the ESREL 2011 conference, Troyes, 18-22 September. [pdf]
Besnard, D. & Hollnagel, E. (2010). Some myths about industrial safety. Invited presentation given at the Industrial Safety Days (Sikkerhetsdagene) conference, 1-2 November, Trondheim (Norway). [pdf]
Sanseverino, V. & Besnard, D. (2010). Some side-effects of change. Presentation given at the 1st workshop on the Assessment of the Consequences of Change, 14 October, Sophia-Antipolis (France). [pdf]
Besnard, D., Fabre, D., Van Wassenhove, W. & Runte, E. (2009). An account of scientific transfer to the industry: the co-development of an incident analysis tool. 9th conference of the European Sociological Association, Lisbon, 02-05 Sept. [pdf]
Baxter, G., Besnard, D. & Riley, D. (2004). Cognitive mismatches: Will they ever be a thing of the past? Communication at the Flightdeck of the Future conference, October 6, Nottingham. [pdf]
Besnard, D. (2004). Human-machine interaction in safety-critical systems. Communication at the IFIP 10.4 group, July, Siena, Italy.
Besnard, D. (2003). Building dependable systems with fallible machines. Communication at the 5th CaberNet Plenary Workshop, 5-7 November, Madeira. [pdf]
Besnard, D. & Lawrie, A. T. (2002). Lessons from industrial design for software engineering through constraints identification, solution space optimisation and reuse. ACM Symposium on Applied Computing, Madrid (pp. 732-738). [pdf]
Besnard, D. (2001). Attacks in IT systems. A human-factors centred approach. Supplement to 2001 International Conference on Dependable Systems and Networks (DSN-2001), Göteborg (Article B-72). [pdf]
Besnard, D. (2000). Expert error. The case of trouble-shooting in electronics. SafeComp 2000, Rotterdam (pp. 74-85). [pdf]
Besnard, D., Boissières, I., Daniellou, F. & Villena, J. (Eds) (2017). La culture de sécurité. Comprendre pour agir (Safety culture. Concept and actions). Les Cahiers de la Sécurité Industrielle, Icsi. [pdf]
Besnard, D. (2016). Le management, c'est fini (Management: that's over). Preventica [pdf]
Larouzée, J., Guarnieri, F. & Besnard, D. (2014). Le modèle de l'erreur humaine de James Reason (James Reason's human error model.) CRC Research Report - CRC_WP_2014_24, MINES ParisTech (in French). [pdf]
Besnard, D. (2014). Assez de recherche. Au boulot. L'apport des sciences humaines et sociales à la gestion des risques industriels. (Enough research. Get down to work. The contribution of human and social sciences to the management of industrial risks.) Les Tribunes de la Sécurité Industrielle, Foncsi (in French). [pdf]
Besnard, D., Balzer, P., Brunel, C., Promé, M., Sedaoui, A., Tazi, D. & Villena, J. (2014). Dessine-moi une culture de sécurité. Changer quoi, comment et pour aller où ? (Draw me a safety culture. What to change, how, and where to?) Unpublished thinktank report, Icsi (in French).
Besnard, D. & Tazi, D. (2013). Manual of the Icsi Safety Culture Diagnosis. Internal document, Icsi.
Desmorat, G., Guarnieri, F., Besnard, D., Desideri, P. & Loth, F. (2013). De l'utilisation du modèle CREAM et des "conditions communes de la performance" pour la conduite du retour d'expérience. Application à la sécurité de la distribution du gaz. (On using the CREAM model and "common performance conditions" for incident analysis, applied to gas distribution safety.) Research report (in French). [pdf]
Pelleterat De Borde, M., Martin, C., Besnard, D. & Guarnieri, F. (2013). Ce que "réorganisation" veut dire ? Une étude du démantèlement de la centrale nucléaire Phénix. (What does "reorganisation" mean? A study of the decommissioning of the Phénix nuclear power plant.) Research report (in French). [pdf]
Albrechtsen, E. & Besnard, D. (2013). Assessing risks in IO: It's about choice. In Albrechtsen, E. & Besnard, D. (Eds) (2013). Oil and gas, technology and humans: risk assessment methods in organizational change. Ashgate.
Besnard, D. (2013). Assessing the performance of human-machine interaction in eDrilling operations. In Albrechtsen, E. & Besnard, D. (Eds). Oil and gas, technology and humans: risk assessment methods in organizational change. Ashgate. [pdf]
Albrechtsen, E. & Besnard, D. (Eds) (2013). Oil and gas, technology and humans: risk assessment methods in organizational change. Ashgate.
Besnard, D. & Hollnagel, E. (2012). Some myths about industrial safety. CRC Technical Report. [pdf]
Miotti, H., Guarnieri, F., Martin, C., Besnard, D. & Rallo, J.-M. (2010). Préventeurs et politique de prévention en santé-sécurité au travail (Health and safety practitioners and policy in occupational safety). Survey report. Paris: AFNOR. (Available on request in both English and French)
Besnard, D. & Robson, R. (2010). Overlooking causes in healthcare accident investigation. Choosing the analysis is choosing the results. CRC Technical Report. [pdf]
Besnard, D. (2010). Automated control and supervision in eDrilling operations: an HMI-centred set of risk assessment criteria. In E. Albrechtsen et al.: Essays on socio-technical vulnerabilities and strategies of control in Integrated Operations. Technical Report, SINTEF-NTNU. [pdf]
Besnard, D. (2008). Human cognitive performance. Undefended professorial dissertation. [pdf]
Besnard, D., Gacek, C. & Jones, C.B. (Eds) (2006). Structure for Dependability: Computer-Based Systems from an Interdisciplinary Perspective, London, Springer, ISBN 1-84628-110-5. [info]
Besnard, D. (2006). Procedures, programs and their impact on dependability. In Besnard, D., Gacek, C. & Jones, C.B. (Eds) Structure for Dependability: Computer-Based Systems from an Interdisciplinary Perspective, London, Springer, ISBN 1-84628-110-5. [pdf]
Besnard, D. & Baxter, G. (2006). Cognitive conflicts in dynamic systems. In Besnard, D., Gacek, C. & Jones, C.B. (Eds) Structure for Dependability: Computer-Based Systems from an Interdisciplinary Perspective, London, Springer, ISBN 1-84628-110-5. [pdf]
Besnard, D. & Marshall, L. (2005). Academic research: a high-cost, low-benefit business. Unpublished white paper. [pdf]
Besnard, D. & Jones, C. (2004). Designing dependable systems needs interdisciplinarity. Safety-Critical Systems Club's Newsletter, May, 6-9. [pdf]
Besnard, D. & Baxter, G. (2003). Human compensations for undependable systems. Technical report CS-TR-819, University of Newcastle. [pdf]
Arief, B. & Besnard, D. (2003). Technical and human issues in computer-based security. Technical report CS-TR-790, University of Newcastle. [pdf]
Besnard, D. (1999). Erreur humaine en diagnostic. Doctoral dissertation, University of Provence, Aix en Provence, France (in French). [pdf]
Besnard, D. (2015). Operators who take initiatives reinforce safety. [pdf in English to appear soon] [pdf in French]
If asked about the world of public research, I am of the opinion that it is biased, and that we do too much of it without caring about knowledge and technology transfer. This is a waste of industrial performance and of societal resources.
It is a waste of industrial performance because industry has problems that only research can help solve. But instead of tackling these problems, many academics (I plead guilty for some of my own work) devote themselves to extending their publication lists by writing papers that only other academics care to read. Another typical behaviour is publishing slightly amended versions of the same work. It is also a waste of societal resources because most public research is funded by taxpayers' money.
The above biases do not originate from researchers themselves. They exist because most academic research is assessed on the basis of the number of publications and where they are published. No attention is paid to content, the transferability of results or potential societal benefits. With such blind evaluation criteria, collaborating with industry is simply counter-productive, since industrial work has a relatively long publication cycle. As long as evaluating academic work remains an accounting exercise, more theoretical papers will be published, with a marginal impact on society and industrial practices.
I was once reminded by a very respectable senior colleague that the electric bulb was not invented by redesigning candles. I therefore accept that fundamental research is needed to foster new ideas and steer our scientific knowledge in new directions. However, fundamental research should not be an objective in itself. Applied research should also be allowed to occupy a larger spot on the scientific stage, because its impact on society and industrial practices is more direct, more tangible, and lends itself to evaluation.
I therefore plead for:
The revision of evaluation criteria for scientific publications;
A reflection on how society benefits from science as it is currently done;
The promotion of applied research and knowledge transfer.
The paragraph above does not necessarily reflect the views of my institution or those of my employer.
Was this site helpful to you? How about giving back?
If you found anything useful here, please consider making a donation to the charity of your choice or helping someone in need.