Author: Diego Benalcazar Vega
Intervention and questionnaire data collection: quick final reflection
After giving students ample time to complete the peer feedback and the questionnaire, and after being actively involved since early December (sending emails, discussing the project in class, and reminding them into January that the peer feedback tool was still available), today, Saturday 13 January, I have decided to close the questionnaire, collect the data, and begin the analysis.
It appears that only one student engaged with the Peer Feedback Tool, contributing four responses across four of the six videos. The answers were short and positive, but not particularly constructive and with little detail. I may need to think about revising the Feedback Guide.
The questionnaire, intended to capture the dynamics of feedback exchange in the class, gathered a response from just one of the five potential participants. This solitary response, while not broadly representative, does offer a glimpse into that student’s perceptions and experiences of the peer feedback process, both before and after the introduction of the anonymous feedback system.

(see the pdf with the data below)
The data analysis yields limited insights, as the results cannot be effectively triangulated, compared with other responses, or used to derive substantial qualitative and quantitative conclusions. However, the notable lack of responses and engagement in the intervention provides valuable information about the project itself.
Firstly, it’s necessary to reevaluate the intervention process. There may be issues with either the process itself or its timing.
Secondly, increasing the sample size seems like a viable strategy for improving results and engagement. With only five students, I had anticipated complete engagement. A larger group might yield a lower engagement percentage, say 20%, but the absolute number of responses would still be higher, as the quick sketch below illustrates.
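As a minimal, purely illustrative sketch of that point (the 32-student figure comes from the first-year class discussed later in this blog, and the 20% rate is only an assumption based on the one-in-five engagement observed here), the expected number of participants grows with class size even when the engagement rate drops:

```python
# Illustrative only: expected number of participating students under assumed engagement rates.

def expected_participants(class_size: int, engagement_rate: float) -> float:
    """Return the expected number of students who engage, given a class size and a rate."""
    return class_size * engagement_rate

# Current cohort: 5 students, observed engagement of 1/5 = 20%.
print(expected_participants(5, 0.20))   # 1.0 participant

# Hypothetical larger cohort (e.g. the 32-student first-year class) at the same assumed 20%.
print(expected_participants(32, 0.20))  # 6.4 participants, i.e. roughly six sets of responses
```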
As a final reflection, while I finish my presentation and evaluate my results, I do believe that this first cycle gave me valuable insights with which to restart the cycle. This is what most AR projects are like, right? Learn from the mistakes, evaluate, redo.
On a related note, I see several opportunities in my job where AR could be applied. One project we aim to develop within the programme, specifically for the two Bachelor of Arts degrees, involves sharing optional/elective units. I am considering approaching these changes as an AR project and am eager to see how the initiative unfolds.
Peer Feedback Guide
As part of my intervention, I decided to create a guide to lead students towards giving constructive and positive feedback, aligned with the realm of sound design and music production. The resulting “Peer Feedback Guide” aims to deepen students’ understanding of and engagement with their creative work. This approach not only enhances technical skills but also nurtures a supportive learning environment.
The guide emphasises key principles like constructiveness, specificity, and balance. Constructive feedback, focusing on specific aspects of sound design, helps students identify both strengths and areas for improvement. This approach ensures that feedback is actionable and growth-oriented, rather than purely critical. The balance between positive reinforcement and constructive criticism is crucial for fostering an environment conducive to learning and creative exploration.
In practice, students are encouraged to analyse various elements of their peers’ work, such as the clarity of sound, the emotional impact, and the creativity in sound design. The guide provides examples to illustrate how feedback can be both specific and balanced. For instance, comments might highlight the effective use of ambient sound while suggesting improvements in the transition of sound effects, guiding peers towards a more immersive sound experience.
The guide also stresses the importance of maintaining anonymity, respect, and empathy in the feedback process. This ensures unbiased and honest reviews, fostering a safe space for open and constructive critiques. The closing note of the guide encapsulates the essence of this feedback approach – not just to critique but to contribute to a collaborative and enriching learning journey. It underscores the significance of peer feedback as a tool for both personal and communal growth in the field of sound design and music production.
Here is the Guide:
Intervention
This post serves as an appendix, sharing the spaces where the three integral stages of my intervention took place: the receipt of artefacts, the implementation of the peer feedback platform, and the administration of the questionnaire.
For the collection of artefacts, I utilized the Moodle page of the unit, which is accessible exclusively to enrolled participants. To provide a comprehensive understanding of this stage to those outside the course, I am including a screenshot of both the submission area and the consent declaration required for participation in the study.
The Peer Feedback platform, designed in Padlet: https://artslondon.padlet.org/dbenalcazar1/peer-feedback-wryih9i3ure8muwt
Finally, the questionnaire designed in MS Forms: https://forms.office.com/Pages/ResponsePage.aspx?id=xClkjH8We0e4y3fugnWNETS8HV0ZYNxCn8Ugo5vq6ehUQUdaTUFRRTZES0lGWFE1Q0VGSzVFWlRPNi4u
Designing the intervention
With the research question established, I initiated the planning and design phase of the intervention. This involved mapping out the entire intervention process, enabling me to think through and evaluate all components comprehensively and ensure nothing was overlooked. I based my design on McNiff and Whitehead (2010).
Below is the finalized design, detailing each step of my intervention.
Reference:
McNiff, J. and Whitehead, J. (2010). You and Your Action Research Project. Routledge.
Ethics form
Reflecting on the journey of creating and developing my ethical enquiry form for the Action Research Project at UAL, I was surprised by the depth and complexity of the ethical considerations involved in academic research. My primary objective was to explore ways to improve student participation in peer feedback, a crucial aspect of learning in my Audio Post-Production class.
With my research question defined, “How might we improve student participation in peer feedback?”, and a map of the whole project, I started to think about all the ethical implications I could come across. Working through the ethics form itself also helped me consider the different aspects and potential issues involved.
Creating questions for both pre- and post-intervention experiences was a thoughtful exercise. I aimed to understand the students’ comfort levels in providing peer feedback before and after the intervention, as well as their perceptions of anonymity’s impact on the quality of feedback. This required me to consider the psychological impact of my research on the participants, ensuring that the questions were sensitive and ethically sound.
The aspect of obtaining informed consent was particularly challenging. I had to ensure that my students understood what they were consenting to and that they had the freedom to opt out without any repercussions. This led me to delve into various methods of obtaining consent, such as through the submission of work or an online questionnaire.
A significant part of my ethical consideration was the potential risks to the participants. The anonymous nature of the feedback could potentially lead to negative or unconstructive comments. To mitigate this, I established guidelines and protocols and provided resources on how to give constructive feedback.
Throughout this process, I was constantly aware of my responsibility to uphold the ethical standards set by the University of the Arts London. This meant not only ensuring the well-being of my participants but also considering my own.
Literature about AR in Music Production education
I set out to look for literature on action research in music production education, but everywhere I looked I was pointed towards music education more broadly. While music production sits under the wider music education umbrella, music education research mostly focuses on performance and theory.
Even so, the field is vibrant and diverse; we can learn from the many ways different aspects of music are taught and learned, and that work offers both theoretical understanding and practical applications for our context.
Music education, with its multifaceted and dynamic nature, has been a fertile ground for action research. The diversity in focus ranges from understanding the psychology of individual learners to exploring the dynamics of ensemble participation. For instance, studies have investigated how students respond to pupil-centred learning in piano lessons, examining the psychological aspects of individual learning styles (Mackworth-Young, 1990). Similarly, research on compositional tasks in elementary music classrooms has provided insights into the design of such tasks and their impact on student learning (Miller, 2004).
Several themes have emerged in the literature, reflecting the varied interests and focal points within music education. Studies have delved into the sub-culture of high school ensembles, exploring the range of effects on participants (Adderley, Kennedy, and Berz, 2003). Others have focused on understanding the motivations behind choir participation among different age groups, as well as the roles students adopt in various ensemble settings (Conway and Borst, 2001).
Action research in music education often explores the dynamics of tutor and peer feedback, although finding specific examples that focus exclusively on this area can be challenging. The literature indicates that action research in music education is diverse and covers a wide range of subject matter. It integrates research and action, is collaborative, and is grounded in a body of existing knowledge, leading to powerful learning experiences for participants (Cain, 2008).
Moreover, peer tutoring in music education has been recognized as a valuable cooperative learning strategy. Although most studies revolve around instrumental performance tutoring, the approach involves pairs of students working together, providing assistance, instruction, and feedback. This method has been examined in various studies, reflecting its potential in music education and the specific elements that can enhance the teaching and learning of music (Fernández-Barros et al., 2023).
Regarding music technology, I found a study describing an action-research project involving 12 teachers and 68 students at an Ecuadorian university, funnily enough the country I am from. The study developed a Holistic and Technological Model of Music Education (HTMME) and assessed its effectiveness through an original questionnaire and qualitative work. The findings highlight the positive reception of the new model and demonstrate how learning music with ICT can induce creative-musical processes in students. This research emphasises the significant impact that technological integration can have on music education, offering new teaching experiences and relevant learning opportunities for students (Bolívar-Chávez et al., 2021).
Outside of music education, I found plenty of literature about peer feedback in classroom settings. The article “Improving Student Peer Feedback” by Linda Nilson, published in College Teaching in January 2003, addresses the challenges and inefficiencies of traditional peer feedback methods in educational settings. Nilson critiques typical judgment-based peer feedback for its lack of validity and reliability and its tendency to be superficial and emotionally charged. The article proposes an alternative approach focused on neutral, informative, and thorough responses that aim to enhance the peer feedback process. This approach encourages students to provide genuinely valuable feedback, avoiding the common pitfalls of emotional bias and superficial assessment. The article also explores the broader context of cooperative learning and peer assessment in modern education, highlighting its importance for developing critical thinking and collaborative skills.
Tim McMahon (2010) explores the use of action research to enhance peer feedback in an undergraduate programme, focusing on students’ reluctance to provide critical feedback to their peers. The study, conducted over four years, involved implementing and refining a peer assessment regime to generate high-quality, reflective, and useful peer feedback. The paper documents the transformation of a system initially characterized by students’ reluctance to criticise into one that encouraged effective peer feedback and critical thinking. It highlights the importance of making peer assessment formative and giving students ownership of the feedback process, resulting in a more engaged learning community.
The paper “Action Research on Implementation of Peer Assessment as an Effective Learning Strategy: Evidence from WIUT” by Feruza Yodgorova (2020), examines the implementation of peer assessment in higher education. The study, conducted at Westminster International University in Tashkent (WIUT), utilized mixed-method research to explore the effectiveness of peer assessment in enhancing the learning process. It involved students from Business Administration and focused on the student-centred approach of peer assessment. The paper discusses both the positive perceptions and challenges faced in implementing peer assessment, including its impact on student relationships and potential issues of collusion among group members.
In conclusion, while there are no specific studies directly addressing the creation of audio artefacts, the insights obtained from action research in music education and the application of peer feedback in various fields have significantly contributed to the formulation of a more informed approach. This knowledge has been instrumental in guiding the development and shaping the direction of my intervention in this area.
References:
Adderley, C., Kennedy, M. and Berz, W. (2003). “A home away from home”: The world of the high school music classroom. Journal of Research in Music Education, 51(3), 190–205. https://doi.org/10.2307/3345373
Bolívar-Chávez, O.-E., Paredes-Labra, J., Palma-García, Y.-V. and Mendieta-Torres, Y.-A. (2021). Educational technologies and their application to music education: An action-research study in an Ecuadorian university. Mathematics, 9, 412. https://doi.org/10.3390/math9040412
Cain, T. (2008). The characteristics of action research in music education. British Journal of Music Education, 25(3), 283–313. https://eric.ed.gov/?id=EJ1073476
Conway, C. and Borst, J. (2001). Action research in music education. Update: Applications of Research in Music Education, 19. https://doi.org/10.1177/87551233010190020102
Fernández-Barros, A., Duran, D. and Viladot, L. (2023). Peer tutoring in music education: A literature review. International Journal of Music Education, 41. https://journals.sagepub.com/doi/abs/10.1177/02557614221087761
Mackworth-Young, L. (1990). Pupil-centred learning in piano lessons: An evaluated action-research programme focusing on the psychology of the individual. Psychology of Music, 18(1), 73–86. https://doi.org/10.1177/0305735690181006
McMahon, T. (2010). Peer feedback in an undergraduate programme: Using action research to overcome students’ reluctance to criticise. Educational Action Research, 18, 273–287. https://doi.org/10.1080/09650791003741814
Miller, B. A. (2004). Designing compositional tasks for elementary music classrooms. Research Studies in Music Education, 22, 59–71.
Nilson, L. (2003). Improving student peer feedback. College Teaching, 51, 34–38. https://doi.org/10.1080/87567550309596408
Yodgorova, F. (2020). Action research on implementation of peer assessment as an effective learning strategy: Evidence from WIUT. European Journal of Research and Reflection in Educational Sciences, 8(8), 45–55.
Where to implement the intervention? Sample Size?
Initially, I had the idea of working with first-year students, who are greater in number, which seemed reasonable. However, upon deeper consideration, the large class size (32 students) posed significant challenges. In a larger group, individual contributions might become diluted, and the quality of interaction could be compromised. Students might feel overwhelmed by the volume of feedback they would need to give and receive, which could lead to a less engaging and effective peer feedback process. Also, I wanted to collect data post-intervention, and with the timing of the unit I teach in the first year, the implementation would have had to happen at the end of January.
In contrast, my audio post-production class, with its smaller size of 5 students, offers a more manageable and focused group. Smaller classes foster closer interaction, more meaningful feedback, and a stronger sense of community. Each student will likely feel more responsible and accountable for their participation, knowing that their input is vital to the group’s overall experience. This setting can lead to more thoughtful, detailed, and constructive feedback, which is crucial for the development of skills in a specialized field like audio post-production. I could also implement the intervention around the eight small video assignments the class has been working on since the beginning of term.
With a smaller class, we probably can develop something together, as a whole-class co-creation. As Bovill (2019) indicates, co-creation in learning and teaching invites all students to collaborate actively with both their teacher and peers, creating a shared responsibility for the learning process, in this case, asynchronous peer feedback. This co-creative approach aligns with the objectives of my intervention, which seeks to empower students through active participation and shared decision-making in their learning journey.
The nature of the course content also plays a pivotal role in the decision. Audio post-production is a technical and creative field that requires critical listening and attention to detail. The process of giving and receiving feedback in such a class is not just about technical correctness but also about artistic and aesthetic choices and creative expression. An online peer feedback platform can allow students to engage in this critical discourse outside the time constraints of regular class hours.
Moreover, the existing structure of the audio post-production class, which doesn’t allow for extensive in-class feedback, highlights the need for an additional space where students can engage deeply with each other’s work. This intervention becomes not just an enhancement but a necessary component to fill the existing gap in the curriculum.
Engaging a smaller, more specialized class in this peer feedback project might also lead to higher levels of student engagement and a richer learning experience. Students in a specialized course like audio post-production are likely to be more invested in the subject matter. This investment can translate into more enthusiastic and committed participation in the peer feedback process. The intervention addresses a specific need within the course, potentially leading to a more impactful and beneficial outcome for the students.
That said, with fewer students in the sample, the fewer who participate, the less data I will be able to gather. :/
Bovill, C. (2019) ‘Co-creation in learning and teaching: The case for a whole-class approach in Higher Education’, Higher Education, 79(6), pp. 1023–1037. doi:10.1007/s10734-019-00453-w.
The intervention / research question
A key factor in developing this intervention was the recognition of a gap in student engagement and constructive collaboration. I wanted to develop something I started last academic year, encouraging peer-to-peer interaction in an academic setting. The idea emerged from observing that students often felt disconnected from the feedback process when it was a one-way street from instructor to student. It became increasingly clear that peer evaluations could serve as a powerful tool for learning if executed in a supportive, structured environment.
My intervention is an online space for giving and receiving peer feedback, where anonymity could reduce bias and apprehension, enabling students to express honest and constructive feedback. I am thinking of a two-week window to allow ample time for reflection and response, followed by a questionnaire to collect data on the effectiveness of the intervention and to understand the students’ experiences and perceptions of the process.
With this primary idea for a small-scale intervention, I reached my research question: “How might we improve student participation in peer feedback?”. I believe it emerged instinctively from a series of reflective inquiries into the nature of collaborative learning and, especially, student autonomy, which, according to Deci and Ryan (2000), can make students feel more motivated to participate.
I asked myself, what elements are essential for meaningful engagement? How can we move beyond mere participation to ensure that students are actively contributing to and benefiting from the peer feedback process?
This intervention is a step towards reimagining the educational landscape where students learn from each other as much as they do from their instructors. But also, it’s an endeavour to make the learning process more democratic, inclusive, and effective following UAL’s principles. In a sense, this intervention is fundamentally about levelling the educational playing field. By anonymising the feedback process, we are mitigating social biases that often infiltrate classroom dynamics. The intention is to create an inclusive platform where every voice has equal weight and value, and where participation is not influenced by one’s background, identity, or social standing. This approach to peer feedback aspires to champion inclusivity and ensure that all students, regardless of their individual circumstances, have equitable access to contribute to and benefit from the collective wisdom of their peers.
Deci, E.L. and Ryan, R.M. (2000) ‘The “what” and “why” of goal pursuits: Human needs and the self-determination of behavior’, Psychological Inquiry, 11(4), pp. 227–268. doi:10.1207/s15327965pli1104_01.
ARP project timeline
Looking ahead to the execution of my action research project, I envision a journey that begins as soon as possible and progresses methodically through January, ending in our presentation.
In the initial weeks, I will gather course data, which will serve as the bedrock for my investigation. This phase is crucial as it will inform the formulation of my research question.
By the end of the month, I will have submitted my final approved ethics form for my research, ensuring that my methodologies respect the dignity and privacy of all participants. Concurrently, I will engage with both primary and secondary research to fortify my understanding of the educational landscape. This dual approach will provide a robust foundation for the intervention I plan to implement.
With the intervention meticulously planned, December will mark the transition from theory to practice. I will set everything in motion, stepping into the role of both educator and researcher. The execution of the intervention will be a dance between adaptability and structure, with each step carefully observed and documented. At the end of the intervention period, I will collect data through questionnaires. I still do not know whether these will be online or in person, as I am not 100% sure about my intervention yet.
After the intervention period, I will review the final outcomes and engage in deep reflection. This period is not merely about assessment but about understanding the layers of impact, if any, that my actions have had on the learning environment, however small they may be.
For this unit, a big milestone of this project will be a presentation to share the findings and my reflection on the journey. I’m hoping it will encapsulate the challenges faced, the successes celebrated, and the knowledge gained. This presentation will not only signify the end of a cycle but also set the stage for future inquiries.
