Jupyter Book as an open online teaching environment in the geosciences: Lessons learned from Geo-SfM and Geo-UAV
Abstract. Together with our students, we co-created two open-access geoscientific course modules using the Jupyter Book environment that formed part of one undergraduate-level and one open-ended (undergraduate – professor) geology course that comprised both field and classroom teaching. The modules covered the acquisition of drone data and the subsequent processing of 3D models and were iteratively revised over a four-year period. Each module implemented an in-line collection of videos, animations, code snippets, slides, and interactive material to complement the main text as a diverse open learning environment. Behind the scenes, GitHub was used to facilitate content versioning, co-creation and open publishing of the resources. We found that students responded favourably to the framework and especially valued its accessibility, inclusivity, co-creation capabilities, and interactivity. Collaboration certainly helped cultivate an interest in revising the source materials and updating information where it was deemed outdated or unclear, regardless of the contributor’s background, affiliation or level of experience. However, effective co-creation relied on students being able to use the tools at their disposal and being given the opportunity to contribute in their own ways. Through our combined efforts, we succeeded in providing lasting, up-to-date and open course materials to a campus with a small department that does not have significant experience or capacity in developing and maintaining open educational resources. Work remains to establish optimal playtime durations for integrated animations and videos, as well as to translate the modules into different languages. Finally, our efforts are an important step in the development of open educational geoscientific content co-created with input from technical experts, social scientists, and students alike.
Status: final response (author comments only)
RC1: 'Comment on gc-2024-6', Enze Chen, 25 May 2024
I was asked by the Editors to review this article, and while I have no experience in the geosciences, I can speak to the efficacy of Jupyter Book and curriculum design. Below are my comments to the Authors.
The manuscript by Betlem et al. presents a study of two open educational resources (OERs) in the geosciences, Geo-SfM and Geo-UAV, that they have co-created with students over several years. Through a mix of qualitative and quantitative student feedback on surveys, they demonstrate that their OERs based on the Jupyter Book framework were not only able to achieve content-based learning goals, but also to enhance diversity and inclusion through the co-creation process. Overall, I find the work to be timely, with standard methodology, and I recommend publication in GC as a Research Article if they could address some of the following comments:
My understanding is that one of the main results of the work is the student co-creation process in creating the OER and the ways they stand to benefit from it. This makes the work exemplary in that Jupyter is widely used to create accessible, interactive content in the sciences, but it’s usually the instructor(s) who create the curriculum and present the finished product. Very few, to my knowledge, have worked with students directly in creating the content, and the authors do a good job of introducing students to the GitHub workflows required to make this a reality. It is a powerful reminder that students already bring a lot into the learning environment.
I have skimmed their published Jupyter Books, which look well-crafted and I’m sure the community will stand to benefit. I am wondering, if in addition to the student feedback, the authors can include some of the content contributions of the students. Perhaps a Figure (in main or supplemental) showing the “before and after” state of one of the pages after student contributions? Just so readers get a sense of what extent the 39 pull requests contributed… if it was only a few typos or substantial media/text. This could give readers a clearer sense of how they might ask their own students to contribute.
Also, it seems like the number of contributors matched the number of students? (n = 10) Was a GitHub PR a requirement of the course? Extra credit? It would be interesting to know in terms of student motivation for collaboration, to bookend the value the authors analyze.
Other than perhaps a very helpful reminder that GIFs can be useful for more than just memes, I found the multimedia section slightly less compelling. It would seem obvious to me that the inability to pause a GIF could be problematic (especially if it’s 78 seconds long), but I do acknowledge their small file size and the fact that the authors wanted to study this. In terms of including text along with the video, perhaps they would be interested in universal design for learning (no need to include this in the paper): https://udlguidelines.cast.org/
Clarification question: Did the students have to write any code in the Jupyter Book to solve a problem? Or was everything pre-written and they just had to execute it? The manuscript isn’t clear, there is some mention of it in the surveys, but not in the results/analysis.
Otherwise, I appreciate their detailed description of the survey methodology. It’s refreshing to see box and whisker plots with the individual points overlaid, as well as sample size.
My major criticism of this work lies in the Introduction, which I felt like tried to do too much and detracted from the rest of the work. They start with FAIR as a lead-in to OER, but Ctrl+F for “FAIR” doesn’t come up again. I think their work is strong enough to just jump straight into OER/-P? Same goes for “eight primary factors” and “5 Rs.” A Research Article should be properly referenced/contextualized, and I think cleaning up the Introduction by shortening it would strengthen their messaging.
Minor comments:
- Abstract: L6, “GitHub” (check everywhere proper caps) … “content versioning”
- Introduction: The second sentence appears to start at a line break.
- Should it be “gif” or “GIF” ?
- Table 1 text in the Code and Description columns overlaps.
- L253: The Open Enough rubric jumps out rather suddenly, though I see the citation appears in the Intro in a different context. Perhaps explain briefly what this rubric/paradigm is, especially if the new curriculum outperforms previous ones?
- L310: The opening sentence, “From the perspective of instructors, we are excited to see that open-source software and infrastructure has matured to the point where…” is uncannily similar to an opening sentence from Chen and Asta, JCE 2022, “From the perspective of instructors, we are particularly excited to see computational software and infrastructure mature to the point where…” I am the aforementioned Chen (which they already cite in their work) and just thought I’d point this out—which, to be clear, I like the phrasing and agree with the messaging.
- L345: They mention with Geo-UAV how it “showcased the possibility of having interactive and portable documentation that can be easily…” While I believe this to be true in implementation, it wasn’t really discussed elsewhere I think, so this paragraph stood out a little awkwardly.
- L374: “along with”
- It would be nice if explicit URLs to the Jupyter Books could be included in the data availability section for ease of access, though I understand if links to Zenodo serve a more permanent record.
I want to thank the authors for contributing this work and the editors for the opportunity to review.
Other:
- Scientific significance: 2, Good
- Scientific quality: 2, Good
- Presentation quality: 2.5, Good/Fair
Citation: https://doi.org/10.5194/gc-2024-6-RC1
AC1: 'Reply on RC1', Peter Betlem, 08 Jun 2024
Dear Enze Chen,
We thank you for your interest in our manuscript and for your constructive responses to it. We agree with your suggestions and have addressed these in the manuscript. We provide a brief inline reply to each below. We also include relevant snippets of the revised manuscript to illustrate significant changes to the manuscript. Line references below refer to the new line number (as found in the attached snippets).
I was asked by the Editors to review this article, and while I have no experience in the geosciences, I can speak to the efficacy of Jupyter Book and curriculum design. Below are my comments to the Authors.
The manuscript by Betlem et al. presents a study of two open educational resources (OERs) in the geosciences, Geo-SfM and Geo-UAV, that they have co-created with students over several years. Through a mix of qualitative and quantitative student feedback on surveys, they demonstrate that their OERs based on the Jupyter Book framework were not only able to achieve content-based learning goals, but also to enhance diversity and inclusion through the co-creation process. Overall, I find the work to be timely, with standard methodology, and I recommend publication in GC as a Research Article if they could address some of the following comments:
My understanding is that one of the main results of the work is the student co-creation process in creating the OER and the ways they stand to benefit from it. This makes the work exemplary in that Jupyter is widely used to create accessible, interactive content in the sciences, but it’s usually the instructor(s) who create the curriculum and present the finished product. Very few, to my knowledge, have worked with students directly in creating the content, and the authors do a good job of introducing students to the GitHub workflows required to make this a reality. It is a powerful reminder that students already bring a lot into the learning environment.
I have skimmed their published Jupyter Books, which look well-crafted and I’m sure the community will stand to benefit. I am wondering, if in addition to the student feedback, the authors can include some of the content contributions of the students. Perhaps a Figure (in main or supplemental) showing the “before and after” state of one of the pages after student contributions? Just so readers get a sense of what extent the 39 pull requests contributed… if it was only a few typos or substantial media/text. This could give readers a clearer sense of how they might ask their own students to contribute.
We have decided to use the following PR for that and have included a comparison figure in the appendix to accompany the discussion of the main text. See Fig. A1, L267-268. PR link: https://github.com/UNISvalbard/Geo-SfM/pull/66
Also, it seems like the number of contributors matched the number of students? (n = 10) Was a GitHub PR a requirement of the course? Extra credit? It would be interesting to know in terms of student motivation for collaboration, to bookend the value the authors analyze.
Good point. No, a GitHub PR was not a formal requirement of the course, nor a task for extra credit. Starting in year 4, Geo-SfM students were encouraged to learn the ropes of submitting PRs (and reviewing them) by making small additions, in pairs, to an overview table (generated from a yml file; see Session 6 – Uploaded examples on the Geo-SfM pages) to “archive/list” their results. Only after that were they encouraged to make PRs to the main educational resources. The aforementioned list documents previous projects in the course and served as a source of inspiration for the students to see what had been done before. In previous years, students provided the information to instructors (in class or through GitHub issues), who then submitted the changes on their behalf. We have tried to state this more clearly in the relevant paragraphs of the manuscript. See e.g. L149-152.
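For illustration, each row of that overview table is generated from an entry in the underlying yml file. The exact schema is defined in the Geo-SfM repository; a hypothetical entry, with purely illustrative field names and values, could look like the following sketch:

```yaml
# Hypothetical entry in the shared overview yml file
# (field names and values are illustrative; the actual schema lives in the Geo-SfM repository)
- project: "Example cliff-section model"
  students: ["Student A", "Student B"]
  year: 2024
  model_link: "https://example.org/model"  # link to the archived 3D model or dataset
```

Adding or editing such an entry is a small, low-risk change, which made it a suitable first pull request before students moved on to contributing to the main educational resources.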
Other than perhaps a very helpful reminder that GIFs can be useful for more than just memes, I found the multimedia section slightly less compelling. It would seem obvious to me that the inability to pause a GIF could be problematic (especially if it’s 78 seconds long), but I do acknowledge their small file size and the fact that the authors wanted to study this. In terms of including text along with the video, perhaps they would be interested in universal design for learning (no need to include this in the paper): https://udlguidelines.cast.org/
In the revised manuscript we have shortened the original text slightly and added additional context, as we feel that – as also pointed out by our students – the GIFs formed a crucial accessibility component of the educational resources. We therefore argue it should remain described in the manuscript, even if only in part. We have also replaced “animations” with “GIFs” on several occasions to enhance clarity. Thank you for the universal design suggestion, which will be kept in mind during future revisions of the materials. See L. 324-336.
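For reference, GIFs can be embedded in Jupyter Book sources through standard MyST Markdown figure directives, which also accept a text alternative (relevant to the accessibility point above); a minimal sketch, with an illustrative file path, caption and option values, is:

````markdown
```{figure} ../figures/processing_steps.gif
:name: fig-processing-gif
:alt: Animated walkthrough of the model processing steps
:width: 80%

Illustrative caption describing what the animation shows.
```
````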
Clarification question: Did the students have to write any code in the Jupyter Book to solve a problem? Or was everything pre-written and they just had to execute it? The manuscript isn’t clear, there is some mention of it in the surveys, but not in the results/analysis.
Given that the instructed topics featured limited programming exercises and mostly explanatory and narrative content, almost all programming-related course content was pre-written, or usable as input with only minor modifications. We have made minor revisions to the manuscript to reflect that the code snippets were pre-written, though it is also important to keep in mind that student contributions were made in MyST Markdown, which felt like a programming exercise to many of them.
Otherwise, I appreciate their detailed description of the survey methodology. It’s refreshing to see box and whisker plots with the individual points overlaid, as well as sample size.
My major criticism of this work lies in the Introduction, which I felt like tried to do too much and detracted from the rest of the work. They start with FAIR as a lead-in to OER, but Ctrl+F for “FAIR” doesn’t come up again. I think their work is strong enough to just jump straight into OER/-P? Same goes for “eight primary factors” and “5 Rs.” A Research Article should be properly referenced/contextualized, and I think cleaning up the Introduction by shortening it would strengthen their messaging.
Agreed, we may have gone into too much detail early on. We have revised the introduction and significantly shortened it. FAIR references have been replaced by more relevant works. The 5 Rs are now only touched upon, and only expanded upon further as an introduction to the Open Enough rubric at the start of the discussion (see comment below and attached introduction snippet).
Minor comments:
Abstract: L6, “GitHub” (check everywhere proper caps) … “content versioning”
Replaced case variations of GitHub with the properly capitalised “GitHub”.
Replaced “content version” with “content versioning”.
Introduction: The second sentence appears to start at a line break.
Removed line break.
Should it be “gif” or “GIF” ?
Replaced gif with GIF.
Table 1 text in Code and Description columns overlap.
We have tried to better accommodate the column spacing through modifications of the template.
L253: The Open Enough rubric jumps out rather suddenly, though I see the citation appears in the Intro in a different context. Perhaps explain briefly what this rubric/paradigm is, especially if the new curriculum outperforms previous ones?
A more extensive introduction has been given to the rubric with an extended explanation that includes parts of the former introduction. See L282-295.
L310: The opening sentence, “From the perspective of instructors, we are excited to see that open-source software and infrastructure has matured to the point where…” is uncannily similar to an opening sentence from Chen and Asta, JCE 2022, “From the perspective of instructors, we are particularly excited to see computational software and infrastructure mature to the point where…” I am the aforementioned Chen (which they already cite in their work) and just thought I’d point this out—which, to be clear, I like the phrasing and agree with the messaging.
Thank you very much for pointing this out. We apologise for the unintended similarities in the opening sentence and have revised the paragraph and added additional context. See L359-367.
L345: They mention with Geo-UAV how it “showcased the possibility of having interactive and portable documentation that can be easily…” While I believe this to be true in implementation, it wasn’t really discussed elsewhere I think, so this paragraph stood out a little awkwardly.
We have removed the paragraph following L345 in the discussion and instead added a brief sentence at the end of Section 2.1, reading:
As fieldwork forms a large component of data acquisition, our study implicitly tested the portability of Geo-UAV into field-based teaching, either by accessing the tutorial online in the field, or by exporting PDF pages prior to heading out.
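As an aside, exporting the book to PDF for offline use is supported by the standard Jupyter Book command-line interface; a minimal sketch, with an illustrative book directory name, is:

```bash
# Build the whole book as a single PDF from its HTML rendering
# (requires the optional pyppeteer dependency: pip install pyppeteer)
jupyter-book build geo-uav/ --builder pdfhtml
```

Individual pages can alternatively be saved as PDFs through the browser’s print function on the rendered website.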
L374: “along with”
Added
It would be nice if explicit URLs to the Jupyter Books could be included in the data availability section for ease of access, though I understand if links to Zenodo serve a more permanent record.
Added explicit URLs to the rendered Jupyter Books to the data availability section and placed all information in a table. See data availability section.
I want to thank the authors for contributing this work and the editors for the opportunity to review.
Likewise, thank you for taking the time to review the work, as well as for your constructive and insightful feedback.
On behalf of the co-authors
Peter Betlem, 8th of June 2024.
RC2: 'Reply on AC1', Enze Chen, 08 Jun 2024
Dear Peter and others,
Thanks for sharing your thoughts in this revision. I like the new additions of Section 1.2 and Figure A1.
Congrats on the great work, I fully support publication.
Enze
Citation: https://doi.org/10.5194/gc-2024-6-RC2
AC3: 'Reply on RC2', Peter Betlem, 26 Jul 2024
Dear Enze Chen,
Thank you for reviewing our suggestions and taking another look at the revised content.
We are glad the suggested revisions are to your liking and look forward to sharing this work with the wider geoscience community.
On behalf of all authors and with best regards,
Peter Betlem
Citation: https://doi.org/10.5194/gc-2024-6-AC3
RC3: 'Comment on gc-2024-6', Jonathan W. Rheinlænder, 26 Jun 2024
Review of Betlem et al. "Jupyter Book as an open online teaching environment in the geosciences: Lessons learned from Geo-SfM and Geo-UAV"
The manuscript describes the lessons learned from two geoscience courses held over a 4-year period, with a focus on incorporating online teaching materials such as Jupyter Notebooks, videos, and animations. Quantitative and qualitative feedback was provided by the students to evaluate the use of Jupyter Notebooks, etc., as part of the course modules. The evaluation focusses mostly on how these tools contribute to openness and sharing and how easy they are to use. Little attention is given to whether these tools contribute to enhanced learning, however, and a clear description of the learning outcomes/goals of the courses is missing. The methodology seems standard, but one minor drawback is only using written student feedback through the surveys. Conducting student interviews as well could have been a nice complement to the written feedback.
The manuscript is well-written with a logical paragraph and sentence structure. I think the content and experiences from the use of online Notebooks can be highly relevant to a wide audience including lecturers in higher learning institutions as well as high-school and secondary school teachers/students. The experiences gained here could be highlighted even more by including a separate section with lessons learned and recommendations in a bullet-point format.
Overall, you will find that my comments are relatively minor, and I find the manuscript suitable for publication in GC once these comments have been addressed.
Sincerely,
Jonathan W. Rheinlænder
General Comments
Comment #1
The paper focuses mostly on the aspect of openness and sharing in the context of online Notebooks and multimedia in teaching. While I understand this choice, I was hoping there would be a stronger focus on the potential impact on enhanced learning using these online teaching tools. Questions #1 and #2 in Fig. 2 touch on this aspect but are vague. Specifically, what is meant by “met my needs”? and “topics covered by the Compendium were relevant to the course”? What I am missing here is a specification of the learning outcomes of the modules and how the use of notebooks/multimedia facilitated achieving these.
I will also add that I struggled to identify the main research question(s) from your introduction. Perhaps you can state this more explicitly (e.g. L65-84) to help the reader understand what your main goal is with the study.
Comment #2
This is more a personal preference, but I would like to see the introduction a bit shorter. It is great that you provide context for the study, but I think it could benefit from being a bit more to the point with less “theoretical” background. What makes this paper compelling are your experiences, so I would try to get to the main bit sooner. In general, I think the paper is likely to get more reads if it is a bit shorter/concise.
Comment #3
On a similar note, I found the discussion to be a bit long and repetitive at times. I think the discussion could benefit from being slimmed down a touch. Evaluate if all the paragraphs are needed? Does some of the material belong in the results?
The idea behind Section 4.2 – Lessons learned and future directions is great, but I think it belongs in the Conclusion/Outlook section, which would help get your message across. You could make it even more applicable to other teachers who wish to incorporate these tools in their own teaching, e.g., by highlighting the main lessons learned (e.g. through a list/bullet points), do’s and don’ts including your recommendations for how to best implement these tools? For example, regarding installation of python libraries, etc.
Comment #4
Overall, the manuscript is written in an easy-to-read style, which is not overly academic. I think this is a good style choice for this type of paper. But a few places you “break” with this style, using words/phrases that may not be so widely used, for example:
“… were their first foray into the large and growing ecosystem of such tools”
“Students affirmed as much and specifically noted the efficacy of …”
For a non-native speaker (like myself), it makes the text flow less naturally. In addition, I think the paper can benefit from using more active voice, i.e. “We asked the students” instead of “the students were asked” (passive voice) for a more personal touch. Here, I would recommend sticking to the less academic writing style to make the paper more accessible to a wider audience (students/teachers).
Comment #5
Overall, the figures and table support the text nicely. But it would be nice if you can add more in-text references to the figures and tables where relevant. For example, in L202-205.
Specific Comments
- L4: ”… were iteratively revised over a four-year period”
  Could you specify what is revised? I.e. the modules and not the 3D models.
- L82: discusses --> discuss
- L85: In the Methods section, can you add some more information about the students’ prior experience with coding/git? Did you have students that had no experience with Python?
And how do you think experience with git affects student participation? Are students with less coding/git experience less likely to participate and how are these students evaluated?
Also, is there a particular reason why you did not consider interviewing the students as well, in addition to the surveys? Can you please comment/discuss how student interviews (individual or focus-group) could complement the surveys. What kind of additional information could you obtain through interviews?
- L89-90: Could you comment/discuss how higher participation numbers could affect the results? How scalable do you think the use of co-created Jupyter notebooks is? Does it work for courses with e.g. 100 students? Do you see any problems with large courses?
- L99: STEM fields? I am not familiar with this. Could you spell it out instead?
- L104: in person --> in-person
- L135-36: Can you elaborate on how the online participation replaced graded assessments and exams? I think that including more details here will be very valuable for a lot of teachers.
Did the students get a grade at the end of the course? Or pass/fail? Please add some more details in section 2.2. Who evaluates the student participation (course assistants, peer-to-peer)? And how is participation quantified and turned into a grade? Do you think this could work for large courses (>50 students)?
- L173: Can you provide the full student questionnaire? Perhaps in the supplementary?
- L187: “Assessment of the class of 2024 thus in addition focussed.” Sounds a bit funky. Can you re-formulate?
- L190: Could you specify how many questions in total? It was not entirely clear if the questions included in Fig.2 are just a snippet.
- L219-20: I think the start of the paragraph would fit better in the introduction, where you describe the course goals.
- L224: reflected --> indicated
- L225: “… the medium would benefit from being able to be paused.” I stumbled a bit reading this. Can you re-phrase?
- L264-65: “Analysis of feedback provided by the students indicated as much and highlighted several advantages of using the Jupyter Book/GitHub framework, in particular.”
  Can you be a bit more specific here? Which specific feedback showed that open-source material increased participation and discussion? What were the “several advantages”? And what about increasing learning? Would be great if you could discuss this too.
- L265-68: “Although interactivity, … to become contributors” This sentence was a bit convoluted and too long. I struggled to identify the subject/object of your sentence. Consider breaking it into two sentences to make it clearer.
- L271-72: Does this mean that you use student peer-review as part of the grading?
- L279: Add reference to figure/table
- L282-83: This is repeating what you wrote earlier (L225). Consider if it is needed here too.
- L284: This is the first time you mention your hypothesis. I think this should be stated earlier in the introduction, i.e. how did you expect incorporating online teaching tools would impact student learning, participation, etc.? I would put this after stating your research question(s).
- L350: I almost feel that this could be a separate section, i.e. “the teacher’s perspective” or something like that. I imagine this is relevant to a lot of readers (i.e. teachers).
- L363: “UAV-based …” Is this information relevant? I wonder if you could keep it more general.
Figures and Tables
- I would like to see a figure showcasing how a typical Notebook would look like. Could you add a screenshot as an example?
- Table 1: Text is difficult to read due to formatting. Check for punctuation/spelling errors. Describe in methods how many codes (constructive, positive, others?)
- Figure 2+3: I struggle to see the difference between bars, boxes and whiskers. The mean value is not so easy to identify from this type of box plot. Also, can you note how many modules in total.
Citation: https://doi.org/10.5194/gc-2024-6-RC3
AC2: 'Reply on RC3', Peter Betlem, 17 Jul 2024
Dear Jonathan W. Rheinlænder,
Thank you for taking the time to review our manuscript and provide us with constructive feedback. We agree with most of your suggestions and have implemented these where relevant in the manuscript. A brief inline reply to your comments is provided below. Where relevant, we also include snippets of the revised manuscript as an attachment to illustrate significant changes; some of these were previously implemented in response to Enze Chen (Reviewer 1; https://doi.org/10.5194/gc-2024-6-AC1). Line references below refer to the new line number (as found in the attached snippets) unless otherwise specified.
- Comment #1
The paper focuses mostly on the aspect of openness and sharing in the context of online Notebooks and multimedia in teaching. While I understand this choice, I was hoping there would be a stronger focus on the potential impact on enhanced learning using these online teaching tools. Questions #1 and #2 in Fig. 2 touch on this aspect but are vague. Specifically, what is meant by “met my needs”? and “topics covered by the Compendium were relevant to the course”? What I am missing here is a specification of the learning outcomes of the modules and how the use of notebooks/multimedia facilitated achieving these. I will also add that I struggled to identify the main research question(s) from your introduction. Perhaps you can state this more explicitly (e.g. L65-84) to help the reader understand what your main goal is with the study.
Indeed, our work mostly focuses on the use of the Jupyter Book/GitHub pages framework and assesses its use in teaching and in the co-creation of open educational resources. We have tried to further clarify this in the revised introduction, which also accounts for similar suggestions from Reviewer 1. Herein we have also included an additional clarification on what the Jupyter Book environment is and how it is distinguished from other items in the Jupyter ecosystem, such as the well-known Jupyter Notebooks (L83-85).
The learning outcomes are listed in the online Jupyter Book modules themselves, which are referenced in the data availability sections. Indeed, a full reflection on the potential impact on enhanced learning of the executable books would certainly be relevant, as also called for in the manuscript. However, doing so within the current contribution risks distracting from our key findings, and places such a reflection beyond the scope of the presented work.
- Comment #2
This is more a personal preference, but I would like to see the introduction a bit shorter. It is great that you provide context for the study, but I think it could benefit from being a bit more to the point with less “theoretical” background. What makes this paper compelling are your experiences, so I would try to get to the main bit sooner. In general, I think the paper is likely to get more reads if it is a bit shorter/concise.
Agreed, as mentioned in Comment #1 and following a similar comment from Reviewer 1, the introduction has been revised to this end.
- Comment #3
On a similar note, I found the discussion to be a bit long and repetitive at times. I think the discussion could benefit from being slimmed down a touch. Evaluate if all the paragraphs are needed? Does some of the material belong in the results?
Agreed. As part of the restructuring of the manuscript, we have moved and combined thematically related sections throughout the manuscript, slimming down the discussion.
- The idea behind Section 4.2 – Lessons learned and future directions is great, but I think it belongs in the Conclusion/Outlook section, which would help get your message across. You could make it even more applicable to other teachers who wish to incorporate these tools in their own teaching, e.g., by highlighting the main lessons learned (e.g. through a list/bullet points), do’s and don’ts including your recommendations for how to best implement these tools? For example, regarding installation of python libraries, etc.
We have added a do’s and don’ts section to the supplementary information section (L620 – 668), highlighting the main lessons learned and providing various other tips and tricks.
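Regarding the installation of Python libraries specifically, one common approach is to pin the build environment in a file checked into the repository; a minimal, hypothetical sketch using conda (package names and versions are illustrative, and the authoritative specification lives in the course repositories) is:

```yaml
# Hypothetical pinned environment for building the Jupyter Book sources
name: geo-course-book
channels:
  - conda-forge
dependencies:
  - python=3.10
  - pip
  - pip:
      - jupyter-book==0.15.1  # builds the HTML pages
      - ghp-import            # publishes the built pages to GitHub Pages
```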
- Comment #4
Overall, the manuscript is written in an easy-to-read style, which is not overly academic. I think this is a good style choice for this type of paper. But a few places you “break” with this style, using words/phrases that may not be so widely used, for example:
“… were their first foray into the large and growing ecosystem of such tools”
Fixed (L293) – “ … formed their first introduction into the large and growing ecosystem…”
- “Students affirmed as much and specifically noted the efficacy of …”
Fixed (L325-326) – “… students noted the learning effectiveness of the modules …”
- For a non-native speaker (like myself), it makes the text flow less naturally. In addition, I think the paper can benefit from using more active voice, i.e. “We asked the students” instead of “the students were asked” (passive voice) for a more personal touch. Here, I would recommend sticking to the less academic writing style to make the paper more accessible to a wider audience (students/teachers).
We have adjusted several of the words and phrases to make the text more accessible. In addition, we have replaced passive voice with active voice where applicable and where such a replacement would not change the context.
- Comment #5
Overall, the figures and table support the text nicely. But it would be nice if you can add more in-text references to the figures and tables where relevant. For example, in L202-205.
Agreed, we have added several additional references to the tables and figures to guide the reader through the findings.
- Specific Comments
L4: ”… were iteratively revised over a four-year period”
Could you specify what is revised? I.e. the modules and not the 3D models.
Fixed (L3-4).
- L82: discusses --> discuss
Rephrased (L89-103).
- L85: In the Methods section, can you add some more information about the students’ prior experience with coding/git? Did you have students that had no experience with Python?
As part of the questionnaire, all students were asked about their previous experience in programming and familiarity with Jupyter Notebook/Lab/Book/Sphinx/Read The Docs (given their close relation to Jupyter Books), as well as with the concepts of YouTube and GIFs. The summary whisker plot has been added to the Appendix (Fig. A1).
- And how do you think experience with git affects student participation? Are students with less coding/git experience less likely to participate and how are these students evaluated?
Assessing the git experience was beyond the immediate scope of our work, as the modules are not part of a programming course, but rather use git for co-creation purposes only. As mentioned in L183-189 (original lines), we did, however, see that a more extensive, preparatory three-hour tutorial on git and GitHub resulted in higher uptake and interest amongst the students, which proved important for successful student co-creation of (non-programmable) course content. This in turn increased the number of student-initiated contributions/pull requests to the source code of the educational materials.
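For context, a typical GitHub pull-request workflow of the kind covered in such a tutorial looks roughly as follows (the fork URL, branch name, edited paths and commit message are illustrative):

```bash
# After forking the course repository on GitHub:
git clone https://github.com/<your-username>/Geo-SfM.git
cd Geo-SfM
git checkout -b my-contribution           # work on a dedicated branch
# ... edit the MyST Markdown or yml sources ...
git add .
git commit -m "Clarify the processing instructions"
git push origin my-contribution
# Finally, open a pull request against UNISvalbard/Geo-SfM in the GitHub web interface
```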
- Also, is there a particular reason why you did not consider interviewing the students as well, in addition to the surveys? Can you please comment/discuss how student interviews (individual or focus-group) could compliment the surveys. What kind of additional information could you obtain through interviews?
We determined that in-class, colloquial feedback sessions and surveys were the most practical and feasible tool for inquiring into students’ perspectives and experiences for three reasons.
First, surveys allowed for a broad set of areas of inquiry including questions related to use and experience. We knew we may want to use this survey several times in the future to be able to make some comparisons between courses/groups of students; this is more readily achievable with surveys than with focus groups or interviews which are characterized by more open questions making comparison more difficult year over year.
Second, typical course sizes at UNIS are small (5-20) and teaching often takes place in a more informal setting where students, lecturers and assistants are encouraged to openly discuss course contents and suggest improvements. Individual feedback during the courses themselves, as well as in-class feedback sessions at the end of courses, typically provides a first-order overview of what went well, what did not, as well as suggestions on how things can be improved (as noted in L169-170; original lines). Such feedback led to the initial revisions throughout the first 2-3 years prior to the implementation of the written survey.
Third, the students who enroll in courses at UNIS typically depart from the island almost immediately after a course is completed, which can make other modes of inquiry such as interviews or focus groups very challenging. Students do not maintain their email addresses after their term ends, so communication with students after the last day of a course is very difficult. As such, we knew we needed a method that allowed for gathering of data from students at the end of the course and that would be efficient given the obstacles with maintaining contact with students after a course ends.
- L89-90: Could you comment/discuss how higher participation numbers could affect the results? How scalable do you think the use of co-created jupyter notebooks are? Does it work for courses with e.g. 100 students? Do you see any problems with large courses?
The Geo-UAV and Geo-SfM modules are optimized for cohorts of 10-20 students at a time and we did not manage to test higher participation numbers within the current study. That said, git, around which GitHub is built, is used by most major corporations and allows contributors to simultaneously work on code and documentation. If that is anything to go by, it certainly is not hard to imagine that courses with e.g. 100 students could use a similar approach to the one documented here, though minor adjustments to the workflows may be needed.
- L99: STEM fields? I am not familiar with this. Could you spell it out instead?
We have added the acronym’s full meaning: science, technology, engineering, and math (STEM).
- L104: in person --> in-person
Fixed.
- L135-36: Can you elaborate on how the online participation replaced graded assessments and exams? I think that including more details here will be very valuable for a lot of teachers.
Did the students get a grade at the end of the course? Or pass/fail? Please add some more details in section 2.2. Who evaluates the student participation (course assistants, peer-to-peer)? And how is participation quantified and turned into a grade? Do you think this could work for large courses (>50 students)?
We have expanded section 2.2 with additional details (see e.g. L161-165). In short, students were graded with a pass/fail for the respective modules, with instructors certifying the work. As for whether this setup could work for large courses, please see our response above.
- L173: Can you provide the full student questionnaire? Perhaps in the supplementary?
Table S3 provides the relevant questions of the questionnaire and comprises the statements included in Fig 2-3, Fig S1, Table S1-2.
- L187: “Assessment of the class of 2024 thus in addition focussed.” Sounds a bit funky. Can you re-formulate?
Fixed (L222-225).
- L190: Could you specify how many questions in total? It was not entirely clear if the questions included in Fig.2 are just a snippet.
Table S3 provides the relevant questions of the questionnaire and comprises the statements included in Fig 2-3, Fig S1, Table S1-2.
- L219-20: I think the start of the paragraph would fit better in the introduction, where you describe the course goals.
This section has been revised.
- L224: reflected --> indicated
Fixed.
- L225: “… the medium would benefit from being able to be paused.” I stumbled a bit reading this. Can you re-phrase?
Rephrased.
- L264-65: “Analysis of feedback provided by the students indicated as much and highlighted several advantages of using the Jupyter Book/GitHub framework, in particular.”
Can you be a bit more specific here? Which specific feedback showed that open-source material increased participation and discussion? What were the “several advantages”? And what about increasing learning? Would be great if you could discuss this too.
We have rephrased the section to better address this comment (L321-327).
- L265-68: “Although interactivity, … to become contributors” This sentence was a bit convoluted and too long. I struggled to identify the subject/object of your sentence. Consider breaking it into two sentences to make it clearer.
This sentence has been revised (L321-327)
- L271-72: Does this mean that you use student peer-review as part of the grading?
No, this peer-review was not part of the grading process and was solely implemented to facilitate co-creation of the resources. See L161-165.
- L279: Add reference to figure/table
Added a reference to Table 1.
- L282-83: This is repeating what you wrote earlier (L225). Consider if it is needed here too.
Both sections have been combined in an extended earlier paragraph near original line L225.
- L284: This is the first time you mention your hypothesis. I think this should be stated earlier in the introduction, i.e. how did you expect incorporating online teaching tools would impact student learning, participation, etc.? I would put this after stating your research question(s).
This section has been revised.
- L350: I almost feel that this could be a separate section, i.e. “the teacher’s perspective” or something like that. I imagine this is relevant to a lot of readers (i.e. teachers).
The section has been separated into a new section.
- L363: “UAV-based …” Is this information relevant? I wonder if you could keep it more general.
Removed UAV-based.
Figures and Tables
- I would like to see a figure showcasing how a typical Notebook would look like. Could you add a screenshot as an example?
We have added links to the compiled Jupyter Books in the data availability section, as also suggested by Reviewer 1.
- Table 1: Text is difficult to read due to formatting. Check for punctuation/spelling errors. Describe in methods how many codes (constructive, positive, others?)
Agreed. Additional spacing has been added before each student statement and the codes are described in the methods.
- Figure 2+3: I struggle to see the difference between bars, boxes and whiskers. The mean value is not so easy to identify from this type of box plot. Also, can you note how many modules in total.
There are two modules in total, whose feedback is combined in Figure 3 to reflect responses on the same topics.
Once again, thank you for taking the time to review our work.
On behalf of the co-authors,
Peter Betlem, 17th of July 2024.
Data sets
Geo-SfM: Teaching Geoscientific Structure-from-Motion Photogrammetry Processing P. Betlem and N. Rodes https://doi.org/10.5281/zenodo.11173239
Geo-UAV: Teaching Geoscientific Drone-Based Data Acquisition N. Rodes, P. Betlem, and S. M. Cohen https://doi.org/10.5281/zenodo.11173399
Geo-MOD: Teaching Geoscientific Photogrammetry-Based Data Acquisition and Processing P. Betlem, N. Rodes, and S. M. Cohen https://doi.org/10.5281/zenodo.11172855