
Cycle 1: Experimenting with Screencasting for Feedback (10 Respondents)

ACTION TAKEN: The creation of a video feedback method through the implementation of screencasting in order to assess students' photography. This process replaced the traditional method of written feedback, which typically involves the use of quantifying tools such as rubrics. The benefit of screencasting with audio is its ability to transfer information through multiple modalities in order to increase the chances of retention. The screencasts were created using Snapz Pro X; within each screencast, Adobe Bridge was used to give an overview of the portfolio, and Adobe Photoshop was used to demonstrate to students the editing of individual images. While the student's images were presented, audio feedback from the teacher was recorded. When finished, the completed QuickTime movie was placed in the student's folder on the school's server.
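The delivery step at the end of this workflow, copying each finished QuickTime movie into the matching student folder on the school server, can be scripted. Below is a minimal sketch of that step, assuming the recordings are named after the students (e.g. jane_doe.mov) and that the server is mounted at a path such as /Volumes/SchoolServer/Photography; the paths, file names, and folder layout are illustrative assumptions, not the actual setup used in this cycle.

    import shutil
    from pathlib import Path

    # Illustrative paths only; the real folder layout on the school server is not specified here.
    RECORDINGS = Path("~/Screencasts/Cycle1").expanduser()   # finished Snapz Pro X exports (.mov)
    SERVER_ROOT = Path("/Volumes/SchoolServer/Photography")  # mounted school server share

    for movie in RECORDINGS.glob("*.mov"):
        student = movie.stem                                 # e.g. "jane_doe" from "jane_doe.mov"
        dest = SERVER_ROOT / student / "feedback"
        dest.mkdir(parents=True, exist_ok=True)              # create the student's feedback folder if needed
        shutil.copy2(movie, dest / movie.name)               # copy the QuickTime movie, keeping timestamps
        print(f"Delivered {movie.name} to {dest}")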

RESEARCH QUESTION: How will using a multimodal approach to provide a critique of student photography affect student learning outcomes, as evaluated by the students?

PREDICTED OUTCOME: In this cycle I began using video screencasting as feedback for student photography projects. My literature review led me to believe that my students would prefer video feedback over written feedback, because this type of feedback tends to be more personal and can provide a greater level of detail about the student's work.

EVIDENCE USED TO EVALUATE THE ACTION: The evidence used to evaluate the action taken was a survey asking the following questions on a five-option scale, with room for student comments.
1. How did you like receiving feedback as a video?
2. Does video feedback help to provide information that is more detailed than written feedback?
3. Did video feedback encourage you to think about your work more deeply than you would have without the feedback?
4. Did video feedback make you more or less enthusiastic about photography?

EVALUATION: All of the students preferred the use of video feedback, and 80% of them chose the most positive response offered (Figure 1.1). All of the students thought my responses were more detailed than they would have been in written form, and 60% again chose the top response offered in this category.
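Because each question used the same five-option scale and there were ten respondents, the reported figures are simple percentage tallies of the responses. The snippet below is a minimal sketch of that tally for question 1; the list of responses is invented to match the reported distribution (8 of 10 choosing the top option gives 80%) and is not the raw survey data.

    from collections import Counter

    # Invented responses for question 1, matching the reported 80% / 20% split (not the raw data).
    responses = ["I liked it a great deal"] * 8 + ["I liked it"] * 2

    counts = Counter(responses)
    total = len(responses)
    for option, n in counts.most_common():
        print(f"{option}: {n}/{total} = {n / total:.0%}")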

[Figure 1.1: pie chart. I liked it a great deal, 80%; I liked it, 20%; I liked it as much as written feedback, 0%; I did not like it, 0%; I did not like it at all, 0%.]

Figure 1.1. How did you like receiving feedback as a video?

Of the students polled, 80% indicated this process made them think more, with 10% feeling that it did not change their level of thinking (Figure 1.2). These results suggest that the students believe they engage more critical-thinking skills when viewing their artwork as it is analyzed. Traditionally, middle school photography students have been presented with a rubric, which quantifies their performance on a lesson. They are then left to take that information and apply it to their artwork. Students rarely, if ever, look at both their rubric and their images simultaneously. This method of feedback presents them with both pieces of information at once, in order to increase the chances of connections being made and retained. Therefore, the results of this question make sense, because the students are now able to apply the information on their rubric directly to their artwork and make the appropriate connections.

[Figure 1.2: pie chart. The two "thought about my artwork more" options, 50% and 40%; I did not change my thinking, 10%; I thought about my artwork less, 0%; I thought about my artwork much less, 0%.]

Figure 1.2. Did video feedback encourage you to think about your work more deeply than you would have without the feedback?

Finally, 70% of the students believed that screencasting increased their enthusiasm for the subject, while the remaining 30% responded that it made no change (Figure 1.3). These responses indicate that this process of feedback motivates the students. This could be because of the personal nature of video compared to written feedback. Short, Williams, and Christie's Social Presence Theory (SPT) indicates that humans prefer face-to-face conversations when possible, especially if the conversation has negative connotations. SPT further theorizes that when face-to-face conversations are not possible, we prefer methods that more closely mimic face-to-face interaction. Video is much closer to talking with another person than written feedback is. Therefore, if SPT is accurate, students would prefer video feedback to written feedback, especially if the message is critical. It could be that this increased enthusiasm is due to the more personal nature of the feedback. Whatever the reason, it appears that this method of feedback increases both the students' enthusiasm and their engagement in their work.

[Figure 1.3: pie chart. The two "more enthusiastic" options, 40% and 30%; It did not change my enthusiasm, 30%; It made me less enthusiastic, 0%; It made me much less enthusiastic, 0%.]

Figure 1.3. Did video feedback make you more or less enthusiastic about photography?

REFLECTION: My action research was founded on the idea that video feedback is more beneficial to students' learning than traditional methods; therefore, this cycle was designed to test that assumption. The evaluation of the action was based on student responses to the survey. The responses served both as suggestions for improvement and as a way for students to identify what benefited their learning. The hope was that what students thought to be beneficial would in truth improve their learning. Only positive responses were given to the survey questions in this cycle. These results may be directly related to the benefits of video feedback, but it is possible that they are also due to tester bias. I believe that I have developed a good rapport with this group of students, and they may be providing positive responses because of that connection. While I have no data to indicate whether this is the case, it should be noted that the possibility of tester bias does exist. The questions on the survey were designed to provide a general overview of the students' feelings toward video feedback. Each question contained space for comments so students could elaborate. Category-oriented questions are an efficient way of analyzing data from groups, but they do not always generate thoughtful responses. A space for open-ended responses was added below each question to encourage more critical thinking from the respondents. All of the comments were positive, which helped to reinforce the validity of my predictions.

The students enjoyed the use of video feedback and appreciated the level of detail that it provided (Figure 1.1). I believe that this method displayed their teacher's investment in their work, and that this contributed to their increased enthusiasm for the subject. Where written feedback can be interpreted as impersonal and at times effortless, video feedback reveals the level of effort the instructor puts into each assignment. The technology choices in this cycle worked to relay a broad range of information to the students. Snapz Pro X provided control over the screencast. The literature I reviewed on portfolio critiques indicates that their chief benefit is the ability to address trends across a student's body of work. Adobe Bridge was an efficient way to show the students their portfolio of images while discussing them. Finally, the students create their work in Adobe Photoshop, and it seems beneficial to model the editing process with the same tool they are using. With it I can show their work, review each step that they took, and model artistic choices that could be improved.

These positive results signified that screencasting is beneficial to the students' learning (Figure 1.2); the students also preferred feedback in this manner, and it increased their level of enthusiasm (Figure 1.3). Hopefully this motivation was due to more than a new method of feedback being provided. The use of new technologies can be exciting and energizing, and it is possible that using screencasting for the first time had that effect on the students, which could explain the positive responses to the survey questions. Hopefully their enthusiasm came from more than the novelty of the technology.

Feedback did not take long on a per-student basis, because there were no quantifying tools involved in this cycle (such as rubrics or checklists), only a review of the images and editing. However, this process lacks concrete feedback that can be used for grading. I have traditionally used rubrics to grade student work because I believe they can give students a better understanding of their performance if used correctly. In the next cycle I would like to incorporate rubrics into the screencasts to help benchmark the students' progress. Moving forward, the inclusion of a rubric would create more detailed feedback. I believe that this will allow me to provide personalized and detailed feedback while also offering more quantifiable criteria for the students to use in evaluating their work. In cycle two I will present the rubric along with the feedback after the assignment is given. In cycle three I will provide the rubric with the assignment to see if it helps to guide the student work. Average recording length: 2:00 minutes; total time per project: 5 minutes.
