Learning outcomes, learning support, and cohort cohesion on a virtual field trip: an analysis of student and staff perceptions
Jessica H. Pugsley
Lauren Kedar
Sarah R. Ledingham
Marianna Z. Skupinska
Tomasz K. Gluzinski
Megan L. Boath
Interactive discussion
Status: closed
RC1: 'Comment on gc-2021-36', Anonymous Referee #1, 10 Nov 2021
General comments
This manuscript describes a considered approach to the delivery of a virtual field trip during the COVID-19 pandemic. The authors collected questionnaire data to evaluate how the students and staff perceived the strengths and weaknesses of the approach. I strongly recommend major revisions that would allow the team to reorganise key information, address questions raised by the reviewers, consider different approaches to the data descriptions appropriate to the datasets, and rewrite the discussion section to be evidence-supported and literature-supported. The main issue is that the reader cannot distinguish the difference between author (instructor?) reflections and claims supported by the data. These need to be clearly distinguished for the reader.
RC1. Does the paper address relevant scientific questions within the scope of GC?
Yes
RC2. Does the paper present novel concepts, ideas, tools, or data?
The virtual field trip design and delivery considerations are potentially novel, but without presentation of the literature it is hard to judge this aspect.
RC3. Are the scientific methods and assumptions valid and clearly outlined?
No. The methods section needs additional information. There are few (no?) assumptions or limitations described by the authors.
RC4. Are the results sufficient to support the interpretations and conclusions?
No. The data presented needs to be amended to accurately represent the data collected. Additionally, the discussion section needs to be rewritten to clearly reflect assertions that are evidence-based, versus opinions of the instructors.
The quantitative results need to be amended to statistically valid descriptions of ordinal data. The qualitative results are quantified, rather than including rich text to show the lived experiences of respondents. The qual section would benefit from direct quotations to help qualify the data.
RC5. Do the authors give proper credit to related work and clearly indicate their own new/original contribution?
RC6. Does the title clearly reflect the contents of the paper?
Yes, but it is overly long and complex. I think the title needs to be edited to reflect the major findings of the research.
RC7. Does the abstract provide a concise and complete summary?
Somewhat. The abstract needs a clear statement of the main design features of the virtual field trip (what makes this VFT unique compared to other VFTs?) and clearly outlined, evidence-supported results of the study.
RC8. Is the overall presentation well-structured and clear?
The structure is OK, but course design information is peppered throughout. The content related to course design needs to be brought together. Only results should appear in the results sections, and the discussion needs to explicitly link to the data in the results.
RC9. Is the language fluent and precise?
Yes, but the manuscript requires a copyedit and a check for grammar throughout.
RC10. Are the number and quality of references appropriate?
No, there is a deficit of appropriate literature in this manuscript.
Specific comments
Abstract
All data and results need to be clearly written and evidence-based (see later comments in this review). Readers should understand the difference between an instructor (author) reflection and a claim supported by data.
1 Introduction
Defining the common acronyms for VFEs, VFTs, or VLEs might be helpful here for the reader (if you plan to use one throughout your piece).
There are published materials comparing traditional and virtual field experiences in the geosciences prior to COVID. I would suggest looking again. Namely, take a look at GSA abstracts from 2009 onwards for authors reporting on comparisons (those authors will have peer-reviewed materials that can support your work). One such work looks at student field experiences in a videogame vs. real life (Dohaney et al., 2012).
I would recommend clearly writing out the research questions for the reader rather than alluding to “your focus” or weaving together your “we are interested in” statements. Having clearly stated RQs allows the data and results to be judged against those RQs, and the strength of your evidence more accurately weighed.
Line 66 What do you mean by learning ability? This is a vague term and would benefit from more specific language that is standard in education research, such as "learning gains" or "retention of information".
Line 67/68/69/70 Reference Novelty Space research by Orion
Does going to a new physical location, and encountering new people and places, have a similar impact on students' novelty space as trying new online software? I would argue otherwise, but am happy to be incorrect if there is research that backs this claim. Look for research on novelty space and educational technology to support your ideas here.
Line 74 Is learning to be measured in this article? If so, foundational literature will be needed to establish your measure for learning. If it isn’t measured, then you need to be clear that it is student perceptions of learning (not actual measures of learning) being captured.
Line 79 online questionnaires? A survey is different from a questionnaire.
2 Course Design
In this section you tell us about your considerations when designing the curriculum, but you don’t actually tell us what the final design was (in a clear way – it’s woven together). A course design section should tell us what the design was/is.
To support replication or comparisons of the curriculum, it is recommended to include responses to the following checklist of questions:
- Did the learning take place in the classroom (lecture), laboratory, field or other type of learning environment?
- Was the mode of learning and interaction with students predominantly in person, online or in blended mode?
- Who is the curriculum appropriate to? Describe the level and demographic information (if appropriate, and if ethical approval was granted)
- Is this learning appropriate to other groups of students not described in this research? (e.g., appropriate to all levels of education, etc.)
- How many learning activities are you describing?
- What is the duration of the learning activities?
- What types of learning activities are you describing?
- What academic disciplines would this curriculum be appropriate for?
- Are the students assessed? Describe the assessment.
- What type of student interactions are there? Do they work independently, paired, or in groups?
- Where can the wider teaching community access the full curriculum detail? Are there additional curricular materials that could be provided to the reader to support the learning and teaching process? (Provide a permalink/doi to a source of information, or within an appendix)
- In what country(ies) did the learning and research take place?
- What institution(s) participated in the learning and research? How many were there?
- What kind of institution(s)? (Imagine that someone reading doesn't know the difference between types of institutions in your home country)
- In what course(s) or class(es) did the learning take place? What topics are being taught in this course?
Check your writing against the above to make sure that this information is covered in your description.
Line 96 What are the learning outcomes of this virtual curriculum? They should be listed for the reader. (Are these the questionnaire questions? If so, be explicit and include them in-text)
Line 100 what do you consider cohort cohesion? Connect this to educational literature. Do you mean students relating to one another, forming social bonds? Do you mean students working together as a team? Whatever best fits your ideas here, please present literature to support it and define it.
Line 104 Is data from the initial questionnaire included in this article? If so, this is considered results. Those results should be included in the results section, rather than a course design section. Revisit the prompts above for what specific information should be included in a course design section.
Line 110 Any dataset must be described in terms of how many people were invited to participate and how many responded to the questionnaire (in this case). All of this information should be included in the methods section (Section 3?).
3 Eliciting Perceptions
There is no description of the research design approach (i.e., your philosophical perspectives, paradigm, and the specific methodology applied). Is this a case study? Is this a mixed methods study? Describe specifically the research design used in this research.
Here are some specific questions that should help provide useful information:
- What are your research questions in this study?
- What is your approach to the research?
- What research methodology and paradigm are you using?
- What research method(s) are you using in this study? Describe in detail.
- What empirical or anecdotal evidence have you gathered to support your research questions? What kind of evidence and data was gathered? How much evidence/data was collected?
- What kind of phenomena are you attempting to measure/characterise? (student learning, perceptions, attitudes, performance, behaviours…etc?)
- Describe the recruitment, sampling, and approved ethical human research process of gaining consent for the research
Who are the students? Here are some questions that should provide useful information:
- How many students/participants are there? (Important note: please indicate the difference between how many students participated in the curriculum, vs. how many consented to participating in the research)
- What level of education are they currently undertaking?
- What year of study are they currently undertaking?
- What degree programme or majors are they undertaking?
- What key demographic information can you include that helps us understand who they are and how that relates to your research questions/curriculum outcomes? (Gender, age, race, ethnicity, languages, disability, prior academic experience, nationality, immigration status, and/or social class, amongst other factors)
3.1 Questionnaire Design
You should incorporate the literature on why those specific question formats are needed here. Why Likert scales? Why open questions?
Line 114. Very important: Was ethical approval to conduct this research approved by your institution? Any and all data collected from human beings needs to be approved by a higher education institutional body. Your university will have specific guidelines that need to be followed on how data is collected, stored, and the nature of inquiry (what kind of questions did you ask, and what risk and benefits are there from people participating in your study)
Line 115 Who was invited to participate? Were there any exclusion criteria? Specifically how were the participants invited to participate (email, etc?).
Line 115 Describe the response rate to all of the questionnaires.
Line 115 How was informed consent to participate in the study gained? In writing?
Line 116/126 Specifically when were the questionnaires administered (timing of the course)?
Line 121 Were the questions (statements) your own design or did you gather them from existing validated instruments?
Line 121 Were the questionnaires validated for content validity? Did you test the questionnaire prior to use?
Line 135/6 Given the content of the questions, did you research other researchers' questionnaires to establish valid statements in these themes (LOs, support and logistics)?
Supplementary Material
Are Qs 1-3 open questions? Provide instructions on the supp material sheet
This is the student version of the questionnaire. Are there any additional questions that were included for staff? If so, include them at the bottom with something like: "Additional questions were asked, such as xxx, yyy".
You use a variety of 5-point scales that may or may not elicit changing beliefs. For example, on a scale of 1 (bad) to 5 (good), what does 3 look like? Sort of bad and sort of good?
The pre questionnaire (about access) should also be included here, if any of the data is discussed in the article.
3.2 Questionnaire Analysis
Line 141. What kind of coding were you doing? Please describe and reference the coding method used. Thematic analysis? Constant comparative analysis? Content analysis? Etc.
Line 147. The key themes are results – suggest moving those to the results section.
4 Internet access, student availability and other issues
What is the overall aim and purpose of this section? Is this a course design section or a results section?
Overall, there is a blend of information presented in this section. I interpreted section 4 to be a results section. A results section should describe and report on the data presented. There is a combination of course design materials, author (teacher?) reflections, and data combined into one. I think it should be clear to the reader which is which. In my experience, all course design information is presented in a specific section, and then the results of survey information is presented showing support for/reflection on/in critique of the course design. This way the information is clear to the reader, allowing them to judge for themselves. I would strongly suggest a reorganisation of the information in this way.
4.1 Internet access
Line 158-159 overly complex sentence, please amend
Did you have permission from your institution to use the learning analytics described in this section? It wasn’t mentioned in the methods section about the tracking of student sign-ins, and other data described here.
Figure 1 is really clear and effective.
Line 170+. I think it's important that the content of these sections is results from the surveys and the internet data. Any course design information should be presented in the course design section prior. Try and separate out the types of data: course design elements, survey data, and internet data, etc., so it's clear to the reader.
4.2 Student availability and other issues
Many claims here are not supported by data (or are they?).
Line 183/184. What is this statement based on? How do you know the student availability? What data supports this? Are these anecdotal events from students? Try and organise information supported by the data so that it's clear to the reader what the data says, versus what you say (the authors' observations/interpretation of the data).
5 Perceptions
5.1 Quantitative statements
Line 200. How many staff and how many students responded to the questions? (put in text here)
You use a 5-pt scale for most of your statements. Revisit how these data are conventionally analysed. Likert-scale data (ordinal data) are most commonly approached differently than interval data.
Why did the authors use averages to describe the central tendency of the quantitative responses? Explain your thinking and choices when using measures of central tendency with ordinal data. Medians are more appropriate here – what would an average of categories such as 'strongly agree', 'likely to be achieved', and 'less likely to be achieved' actually mean?
What statistical tests did you apply to the two groups of data to ascertain differences? Are your cohorts big enough to determine differences? A difference of averages is just that – a descriptor, not a conclusive result. You need to do paired comparisons of the groups (staff and students). It is not appropriate to apply t-tests to ordinal data; Mann–Whitney and Kruskal–Wallis tests are more appropriate for these data. Take a look at the MW and KW tests.
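For example, a minimal sketch in Python of ordinal descriptors and these rank-based tests, using made-up Likert responses coded 1–5 (not the manuscript's data):

```python
# Illustrative only: hypothetical Likert responses coded 1 (strongly
# disagree) to 5 (strongly agree) for two independent groups.
import numpy as np
from scipy import stats

students = np.array([4, 5, 3, 4, 2, 5, 4, 3, 4, 5])
staff = np.array([3, 2, 4, 3, 2, 3])

# Median and inter-quartile range are the conventional descriptors
# for ordinal data (means are not meaningful on ordinal scales).
q1, q3 = np.percentile(students, [25, 75])
print("student median:", np.median(students), "IQR:", q3 - q1)

# Mann-Whitney U: rank-based test for a difference between two groups.
u, p = stats.mannwhitneyu(students, staff, alternative="two-sided")
print(f"Mann-Whitney U = {u}, p = {p:.3f}")

# Kruskal-Wallis H: generalises to three or more groups.
h, p = stats.kruskal(students, staff)
print(f"Kruskal-Wallis H = {h:.2f}, p = {p:.3f}")
```

With cohorts this small, such tests have limited power, so exact group sizes should be reported alongside any test result.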
Figures 2-4
Are box and whisker plots the best way to represent ordinal data? It is not the most common approach to these types of data, and I worry it may misrepresent them. Have a think about how else these data might be more accurately represented (one option is sketched below).
There are copy-editing errors (spelling) within; please amend.
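One common alternative for ordinal responses is a stacked bar chart of response counts per Likert category, which keeps the five categories discrete rather than implying a continuous distribution. A minimal matplotlib sketch, using hypothetical counts (not the manuscript's data):

```python
# Illustrative only: stacked bars of hypothetical response counts,
# keeping the five ordinal categories discrete.
import numpy as np
import matplotlib.pyplot as plt

labels = ["Strongly disagree", "Disagree", "Neutral", "Agree", "Strongly agree"]
colors = plt.cm.RdYlGn(np.linspace(0.1, 0.9, len(labels)))

counts = {
    "Statement 1 (pre)": [1, 3, 5, 8, 2],
    "Statement 1 (post)": [0, 2, 4, 9, 4],
}

fig, ax = plt.subplots(figsize=(8, 2.5))
for i, (statement, row) in enumerate(counts.items()):
    left = 0
    for j, n in enumerate(row):
        # Label each category once so the legend has five entries.
        ax.barh(statement, n, left=left, color=colors[j],
                label=labels[j] if i == 0 else None)
        left += n
ax.set_xlabel("Number of responses")
ax.legend(bbox_to_anchor=(1.02, 1), loc="upper left", fontsize="small")
plt.tight_layout()
plt.show()
```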
5.1.1 LOs
Line 210. A reminder that the reader doesn’t know the learning outcomes, so we can’t really situate ourselves for what these statements mean with regard to how effective the virtual field trip was.
“Post-course, the IQR increased for both cohorts, 2-4 for students and 1-4 for staff. The average student response increased to 3.2 and remained fairly constant for staff at 2.6.” – Is this the best way to describe changes to the data?
Revisit how pre- and post-changes to survey data (with ordinal data) are normally characterised. If we compare groups (pre- vs. post-; or staff vs. student) we need to conduct statistical tests to do so. See my comment from above. Movements of the range do tell us something, but they need to be communicated accurately (minimum, maximum) rather than using the IQR.
Broadly, the descriptive text here is difficult to read. Using a table to show the pre and post medians for each group (students, staff) would create a shortcut for the reader (a possible layout is sketched below). Then the writers can highlight the key CHANGES and the key DIFFERENCES between the groups – rather than spending time describing the numbers.
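For example, a layout along these lines (the dashes are placeholders to be filled from the actual data; the 'LO statement' rows are illustrative):

Statement | Student median (pre / post) | Staff median (pre / post)
LO statement 1 | – / – | – / –
LO statement 2 | – / – | – / –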
What is a positive shift and what is a negative shift? Take a look at other research on how that is described and quantified in text and in plots.
5.1.2 Support + 5.1.3
All the same comments as above. All quantitative data needs to be revisited for how it is calculated, described and written about in-text and in the figures.
5.2 Qualitative statements
Line 269. Long text answers – open response questions?
Line 269 – if the questions are to be described in detail, I suggest providing the full questions in text for the reader
Line 275 – if open response information alludes to the previous quant questions, it is fair game for the researchers to incorporate that text into the sections about those statements. Alternatively, you could reorganise your Results section 5 around themes (recommended) rather than on the "type of response" (qual versus quant).
Sections 5.2.1., 5.2.2, 5.2.3, and 5.2.4.
Instead of treating the qualitative data as rich text, you've opted to apply quantitative methods to the qual data. I would include a reasoning of why you did this in the methods section.
Why not provide the codes and themes and include specific quotes that exemplify those themes?
I find the text in these sections to be summaries of what is provided in Figure 5, instead of providing rich descriptions (Geertz) of what the participants felt, thought, and reflected upon the course and its design.
What do we gain (as a reader and community) from the quantitative results here? If the authors decide to include quantitative assessment of these data, a bulleted summary might be more effective.
Figure 5
I do like adding the positive or negative framing of the results here. The % needs to be described in the caption for the reader – what does 20% vs 80% mean?
6 Discussion
There are two major flaws with the discussion section as it is currently. First, the claims made about "what was found in this study" are not explicitly linked to evidence. And second, there are no linkages to the literature. In a discussion section, the authors need to connect their research to the existing literature. There is significant work done on educational technology and student and staff perceptions of virtual field experiences. You currently do not include any meaningful literature.
All claims must link to the evidence. It’s not enough to paraphrase the results, the reader needs to see the connection. There are many claims made that don’t explicitly link to the evidence provided. You cannot make claims that you don’t have evidence for (e.g., students learned more from the virtual field trip – you didn’t measure learning, etc.). Try and create some transparency to the reader about what are reflections from the authors (instructors) and what are discussions of the evidence (supported by data).
6.1 Were learning outcomes met?
Think about the use of academic paragraphs in this section – when to start, when to stop, etc.
Line 328. Is this true? How do we know the LOs were met? Back this statement up (link directly to) the evidence.
Line 328 recognition by whom? Link to evidence
Line 330 Sign-posting of evidence is needed throughout. E.g., reduced negative perceptions of online teaching (Statement 5, Fig 3), etc.
“Students also recognised that they had more time to work on data, analysing and synthesising it to expand their understanding and learning” – where do we learn this? Is this referring to changes in perceptions of time management?
Line 338. If you include anecdotal information, be sure to be clear that the reader is aware of this
6.2 Were pre-identified concerns lived out, and were mitigation measures effective?
This section has a good balance of including links to evidence and discussion of the results.
How do you know which mitigations were more or less effective? (Hint: unless the question was specifically asked, you don't.) You can assume, but you don't have data to support it.
6.2.1
Line 359. Do you have evidence that students spent more time on tasks than normal?
Line 363. Do you have evidence that students had a greater breadth and depth of learning? Did you measure learning?
6.2.4
Line 378. Did you get permission to use the students grades in this study? If you didn’t get informed consent to use their scores/grades, you can’t allude to it in the research.
6.3 Did the virtual field trip develop student cohesion and peer-peer learning?
Line 387 – benefits of in-person field experience (reference needed)
Line 388 – Mentioning course design aspect here (whereas that should be mentioned in the course design section).
Line 390. Do you have evidence that groups helped students break down barriers? Do you have data that describes how the student grouping went? Any qualitative data describing student experiences?
Line 395 – Instructors’ reflection that students already worked well as a team? (any evidence to support this?)
6.4 Was academic-student support effective?
Line 405. “It was felt” – how could you rephrase to accurately represent the data?
Line 407. Any evidence that students who normally don't engage much were engaged?
7 Conclusions
What are the major findings of this research study? You don't summarise the key findings here, and I think the manuscript would be much improved with this important information.
Line 409. You didn’t present student grades (and I suspect didn’t get informed consent to use those grades?). What are some other indicators of success? (why not lean back onto the data and results that you do have to make a claim about what success looks like here?
Starting Line 413 – recommendations could be in a bulleted list
Do you have data to support your recommendations?
If not, say explicitly. “These are instructor recommendations for how to support students in virtual field trip success:” and then add in (explicitly) where the data you did collect supports your assertions.
New course design elements are included in the conclusions section. My recommendation is that no course design aspects are included in the results, discussion and conclusion sections. These elements should be well described in a course design section early in the piece and then alluded to later in these following sections.
Line 436. Do you feel that the current manuscript communicates students' negative comments accurately? I sense that you've glossed over the critiques. How could you better incorporate their views (even the negative ones)?
Technical corrections
Check comma punctuation throughout your piece; commas are often missing, or overused in some sections. Check for consistent use (or not) of the Oxford comma throughout.
Line 47 Use of comma with e.g.,
Line 54 Do you mean equity (not equality?)
Line 323 … Were the learning outcomes met?
Line 342 … were our mitigation measures effective?
Citation: https://doi.org/10.5194/gc-2021-36-RC1
AC4: 'Reply on RC1', Clare Bond, 31 May 2022
General Comments
We thank reviewer one (RC1) for their considered review and expertise in questionnaire design and analysis. We appreciate that an extended literature review that covers both the range and variety of virtual field trips as well as questionnaire design and analysis will enhance the manuscript. As part of this we will include a section on the assumptions and limitations of the approach taken.
The reviewer raises questions around the presentation of the data, we will re-consider how the data is presented and make clear how our assertions relate to the data, highlighting areas of interpretation. We will review our presentation of the data to reflect best practice for ordinal data, and will include specific quotes to provide richness to the narrative. We considered making all rich text available, but for the anonymity of the participants we have refrained from providing the full texts.
Specifics
Abstract - The reviewer suggests that the abstract includes a section on what makes this virtual field trip unique from others, but this was not the purpose of the paper. We have no specific knowledge about the uniqueness, or otherwise, of the VFT, apart from the fact that it was a newly designed VFT and therefore in that sense unique. We feel that the abstract includes evidence-supported results, with percentages given for responses to questions.
Course Design - The reviewer feels that course design information is ‘peppered’ through the document. We will review the manuscript again to ensure that all course design information is together, and that the results and discussion are clearly separated. We will use the suggested checklist to aid in changes made to the course design section. We will include the number of responses solicited as well as those elicited, and will provide information on our philosophical approach. Information on the student cohort and their stage in their undergraduate education is already given in the course design section. We did not collect data on demographics or personal characteristics, as this was not part of the design or ethical approval. We will specify the exact timing of questionnaire release in relation to the course. We will discuss how the questionnaire was designed and why.
Research Questions - The reviewer asks for defined research questions and we will add in a section that outlines our research questions to ensure clarity for the reader.
Novelty space – we will include literature on the concept of novelty space to support any assertions made on novel experiences.
Perceptions of Learning - Our paper is focused on the perceptions of staff and students, as highlighted in the title; we will check the manuscript to ensure this is clear.
Learning Outcomes and Questions - The reviewer was interested in how the course's learning outcomes relate to the questions posed. We will be explicit about how the two relate in the text.
Cohort cohesion - We will define cohort cohesion and use referencing to support our definition.
Ethical Approval - The reviewer raises a question on ethical approval. University of Aberdeen ethical approval was applied for and granted for the research; we did not include this information specifically, as this is normal best practice. By submitting the paper we confirm that ethical considerations and protocols have been undertaken. If Geoscience Communication need more than this – please let us know.
Supplementary Material - We will provide the additional information requested on the supplementary material.
Questionnaire Analysis - We will provide further information on the questionnaire analysis undertaken. The section on internet access fed into course design. We will incorporate this section into course design. The section on student availability was supported by the pre-course questionnaire – we will make this clear.
Data Presentation - The reviewer questions the presentation of Likert-scale data as apparently ‘continuous’ distributions in the form of box and whisker plots and inter-quartile ranges. We felt this best allowed comparison of the pre- and post-course data as well as between staff and students. We understand that this is not conventional, and we will look to create a new set of figures that allow this comparison whilst showing the actual Likert distribution. We will investigate the use of statistical tests, but the dataset is small. If we are unable to use statistical tests, we will make it clear that we have not done so.
Code Descriptors - We can provide further information on the code descriptors used. We will describe why we chose to take a quantitative approach.
Figure 5 – the positive and negative framing ‘fell-out’ of the analysis of responses. We will describe what the percentages mean in the caption.
Discussion and Conclusions - We will amend the discussion and conclusions linking better to existing literature and providing evidence as appropriate to support statements.
The reviewer raises specific line-numbered comments; we will check through these and amend as appropriate.
Citation: https://doi.org/10.5194/gc-2021-36-AC4
CC1: 'Comment on gc-2021-36', Solmaz Mohadjer, 17 Feb 2022
This is a great reflection on online and virtual field teaching. The manuscript is clearly written and well-structured. I applaud the authors' effort to evaluate key issues related to carrying out a geoscience virtual field trip including internet connectivity, student and staff perceptions on learning outcomes and cohort cohesion. Lessons learnt here will certainly inform future virtual field teaching. A couple of questions and comments:
1. Key demographic data on students and staff are missing including the number of participants completing the questionnaire. If these data were collected, please consider including them.
2. Ethics and consent - Please include relevant information (in accordance with GDPR) on the ethics assessments and the consent forms used for this study.
3. There are 11 learning outcomes (bullet points under question 3) in the questionnaire. Some of these outcomes are missing from Figure 2 and the discussion section. Why is that? Could you provide an explanation?
4. Statistical information - It's not clear to me if your results are statistically significant, especially when the number of participants is not reported (and the number of staff is low). Please explain what statistical tests you've performed, including the corresponding values (in the text and in your figures, if possible).
5. Can you include the questionnaire used for testing students’ internet connection speeds in the supplemental information?
6. It is stated that "the answers of some participants might have been influenced by attendance in the initial week's session prior to questionnaire completion". Can you provide more information on this session and how it might have influenced participants' answers?
7. Was there a reason for the questionnaire being anonymized? This prevented the authors from matching pre- and post-assessment results for individual participants. One way to match an individual's pre- and post-assessment answers (while remaining anonymous) is to ask participants to use a code name (known only to them) on both surveys (e.g., see: https://gc.copernicus.org/articles/4/281/2021/gc-4-281-2021.html). (A minimal sketch of such matching follows this list.)
8. "Students were unsure as to the usefulness of a workbook in terms of “finding a quiet space to work” in advance of the course". This statement, as phrased here, was also unclear to me. Could it be that students didn't understand what they were being asked to rate? Perhaps this statement can be phrased differently?
9. Great figures, clear and effective!
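On point 7, a minimal sketch of how self-generated code names allow anonymous pre/post matching (hypothetical column names and codes, for illustration only):

```python
# Illustrative only: matching anonymized pre- and post-course responses
# by a participant-chosen code name known only to them.
import pandas as pd

pre = pd.DataFrame({"code_name": ["BLUEFOX", "RIVER9"], "q1": [3, 4]})
post = pd.DataFrame({"code_name": ["BLUEFOX", "RIVER9"], "q1": [4, 4]})

# An inner join keeps only participants who completed both questionnaires.
paired = pre.merge(post, on="code_name", suffixes=("_pre", "_post"))
print(paired)  # one row per matched participant: q1_pre vs q1_post
```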
Citation: https://doi.org/10.5194/gc-2021-36-CC1
AC5: 'Reply on CC1', Clare Bond, 07 Jun 2022
General Comments
We thank the contributor for their positive comments. We are really pleased to hear that “Lessons learnt here will certainly inform future virtual field teaching.”
We respond to the specific comments made:
- Key demographic data on students and staff are missing including the number of participants completing the questionnaire. If these data were collected, please consider including them.
- We will include this data.
- Ethics and consent - Please include relevant information (in accordance with GDPR) on the ethics assessments and the consent forms used for this study.
- We acquired University of Aberdeen ethical approval, including GDPR compliance, for the study; we will provide details on this as required by Geoscience Communication.
- There are 11 learning outcomes (bullet points under question 3) in the questionnaire. Some of these outcomes are missing from Figure 2 and the discussion section. Why is that? Could you provide an explanation?
- We will add a sentence that explains why we have chosen to present the learning outcome questions we did.
- Statistical information - It's not clear to me if your results are statistically significant, especially when the number of participants is not reported (and the number of staff is low). Please explain what statistical tests you've performed, including the corresponding values (in the text and in your figures, if possible).
- We will investigate the statistical tests suggested by RC1 and include this if appropriate.
- Can you include the questionnaire used for testing students’ internet connection speeds in the supplemental information?
- Yes, we can include this.
- It is stated that "the answers of some participants might have been influenced by attendance in the initial week's session prior to questionnaire completion". Can you provide more information on this session and how it might have influenced participants' answers?
- This relates to when the participants submitted the ‘prior’ questionnaire. We will add this information to the manuscript.
- Was there a reason for the questionnaire being anonymized? This prevented the authors from matching pre- and post-assessment results for individual participants. One way to match an individual's pre- and post-assessment answers (while remaining anonymous) is to ask participants to use a code name (known only to them) on both surveys (e.g., see: https://gc.copernicus.org/articles/4/281/2021/gc-4-281-2021.html).
- Yes, we could have done this, but set-up time was tight and we wanted to ensure anonymity.
- "Students were unsure as to the usefulness of a workbook in terms of “finding a quiet space to work” in advance of the course". This statement, as phrased here, was also unclear to me. Could it be that students didn't understand what they were being asked to rate? Perhaps this statement can be phrased differently?
- We will rephrase this sentence.
- Great figures, clear and effective!
- Thank you – RC1 raised some concerns over how we presented the data. We think the figures are clear too, but understand their concerns. We will investigate if it is possible to create figures that are as clear, but that represent the original Likert choices better.
Citation: https://doi.org/10.5194/gc-2021-36-AC5