1 - 5 of 5
  • 1.
    Baroutsi, Nicoletta
    Swedish Defence University, Department of Military Studies, Science of Command and Control and Military Technology Division, Command and Control Section. Lund University, Lund, Sweden.
    A Practitioners Guide for C2 Evaluations: Quantitative Measurements of Performance and Effectiveness. In: ISCRAM 2018 Conference Proceedings: 15th International Conference on Information Systems for Crisis Response and Management / [ed] Boersma, Kees; Tomaszewski, Brian. Rochester, NY, USA: Rochester Institute of Technology, 2018, p. 170-189, article id 1546. Conference paper (Refereed)
    Abstract [en]

    Quantitative evaluations are valuable in striving for improvement and asserting quality. However, the field of Command & Control (C2) evaluation is hard to navigate, and it is difficult to find the correct measurement for a specific situation. A comprehensive scoping study was made of measurements of C2 performance and effectiveness. The lack of an existing appropriate framework for discussing C2 evaluations led to the development of the Crisis Response Management (CRM) Matrix, an analysis tool that assigns measurements to categories, where each category displays unique strengths, weaknesses, and trends. The analysis yielded results too rich for a single article; thus, this is the first of two articles covering the results. In this article, the Practitioners Guide focuses on results valuable for someone interested in evaluating C2. Each evaluation has specific requirements that, for best results, ought to be reflected in the chosen measurement.

  • 2.
    Baroutsi, Nicoletta
    et al.
    Swedish National Defence College, Department of Military Studies, Command & Control Studies Division. Linköping University.
    Berggren, Peter
    FOI.
    Nählinder, Staffan
    Linköping University.
    Granlund, Rego
    Santa Anna IT Research Institute, Sweden.
    Turcotte, Isabelle
    Laval University, Canada.
    Tremblay, Sébastien
    Laval University, Canada.
    Assessing development of team training. In: ISCRAM 2014 Conference Proceedings. Book of Papers / [ed] Starr Roxanne Hiltz, Mark S. Pfaff, Linda Plotnick, Patrick C. Shih. The Pennsylvania State University, USA, 2014. Conference paper (Refereed)
  • 3.
    Berggren, Peter
    et al.
    FOI.
    Johansson, Björn JE
    FOI.
    Baroutsi, Nicoletta
    Swedish National Defence College, Department of Military Studies, Command & Control Studies Division.
    Dahlbäck, Nils
    Department of Computer and Information Science, Linköping University.
    The shared priorities measure as a way of assessing team strategic awareness: a bridge between self-assessment and the deep blue sea of field recordings. In: Proceedings of the 2014 European Conference on Cognitive Ergonomics, ACM Digital Library, 2014, p. 13. Conference paper (Refereed)
    Abstract [en]

    Objective, easy-to-use, easy-to-comprehend, high-face-validity assessment methods for measuring shared awareness in teams are hard to find. This paper describes an experiment in which a new measure called Shared Priorities, based on ranking of self-generated strategic items, is tested. Trained teams were compared to non-trained teams in a dynamic problem-solving task in terms of performance and shared awareness. The Shared Priorities measure was used alongside other, well-documented measures of team awareness based on self-rating. The results show that the Shared Priorities measure correlates with performance and can also distinguish between trained and non-trained teams. However, it did not correlate with the other team measures, suggesting that it captures a different quality of teamwork than the self-rating measures. Further, the Shared Priorities measure was found to be easy to administer and gained high user acceptance.

  • 4.
    Berggren, Peter
    et al.
    FOI.
    Johansson, Björn JE
    Linköping University, Sweden.
    Baroutsi, Nicoletta
    Swedish National Defence College, Department of Military Studies, Command & Control Studies Division. Linköping University, Sweden.
    Turcotte, Isabelle
    Laval University, Canada.
    Tremblay, Sébastien
    Laval University, Canada.
    Assessing team focused behaviors in emergency response teams using the shared priorities measure. In: Proceedings of the 11th International ISCRAM Conference / [ed] S.R. Hiltz, M.S. Pfaff, L. Plotnick, and P.C. Shih. Pennsylvania, USA: ISCRAM, 2014, p. 130-134. Conference paper (Refereed)
    Abstract [en]

    The purpose of this work-in-progress paper is to report on the method development of the Shared Priorities measure to include content analysis, as a way of gaining a deeper understanding of teamwork in crisis/emergency response. An experiment is reported in which the performance of six trained teams is compared with the performance of six non-trained teams. The experiment was performed using an emergency response microworld simulation with a forest fire scenario. Dependent measures were simulation performance, the Crew Awareness Rating Scale (CARS), and content analysis. Trained teams performed better and scored higher on measures of team behaviors.

  • 5.
    Berggren, Peter
    et al.
    FOI.
    Johansson, Björn JE
    FOI.
    Svensson, Erland
    Retired.
    Baroutsi, Nicoletta
    Swedish Defence University, Department of Military Studies, Command & Control Studies Division.
    Dahlbäck, Nils
    Linköping University, Sweden.
    Statistical modelling of team training in a microworld study. In: Proceedings of the Human Factors and Ergonomics Society Annual Meeting, Sage Publications, 2014, p. 894-898. Conference paper (Refereed)
    Abstract [en]

    A command and control environment is a dynamic and complex setting with complicated technical systems, where teams of operators interact to reach shared goals. This study presents an experiment in which we, by means of Structural Equation Modeling (SEM), explain the relations between basic concepts of command and control environments: mental workload, frustration, situational awareness, and performance. This paper reports a LISREL analysis of the Baroutsi, Berggren, Nählinder, & Johansson (2013) data. From that data, a new latent variable, "Frustration", emerges, which can now be included in the model.
