

7 Smart, Fast Ways to Do Formative Assessment

Within these methods you’ll find close to 40 tools and tricks for finding out what your students know while they’re still learning.

Formative assessment—discovering what students know while they’re still in the process of learning it—can be tricky. Designing just the right assessment can feel high stakes—for teachers, not students—because we’re using it to figure out what comes next. Are we ready to move on? Do our students need a different path into the concepts? Or, more likely, which students are ready to move on and which need a different path?

When it comes to figuring out what our students really know, we have to look at more than one kind of information. A single data point—no matter how well designed the quiz, presentation, or problem behind it—isn’t enough information to help us plan the next step in our instruction.

Add to that the fact that different learning tasks are best measured in different ways, and we can see why we need a variety of formative assessment tools we can deploy quickly, seamlessly, and in a low-stakes way—all while not creating an unmanageable workload. That’s why it’s important to keep it simple: Formative assessments generally just need to be checked, not graded, as the point is to get a basic read on the progress of individuals, or the class as a whole.

7 Approaches to Formative Assessment

1. Entry and exit slips: Those marginal minutes at the beginning and end of class can provide some great opportunities to find out what kids remember. Start the class off with a quick question about the previous day’s work while students are getting settled—you can ask differentiated questions written out on chart paper or projected on the board, for example.

Exit slips can take lots of forms beyond the old-school pencil and scrap paper. Whether you’re assessing at the bottom of Bloom’s taxonomy or the top, you can use tools like Padlet or Poll Everywhere, or measure progress toward attainment or retention of essential content or standards with tools like Google Classroom’s Question tool, Google Forms with Flubaroo, and Edulastic, all of which make seeing what students know a snap.

A quick way to see the big picture if you use paper exit tickets is to sort the papers into three piles: students got the point, they sort of got it, and they didn’t get it. The size of the stacks is your clue about what to do next.

No matter the tool, the key to keeping students engaged in the process of just-walked-in or almost-out-the-door formative assessment is the questions. Ask students to write for one minute on the most meaningful thing they learned. You can try prompts like:

  • What are three things you learned, two things you’re still curious about, and one thing you don’t understand?
  • How would you have done things differently today, if you had the choice?
  • What I found interesting about this work was...
  • Right now I’m feeling...
  • Today was hard because...

Or skip the words completely and have students draw or circle emojis to represent their assessment of their understanding.

2. Low-stakes quizzes and polls: If you want to find out whether your students really know as much as you think they know, polls and quizzes created with Socrative or Quizlet or in-class games and tools like Quizalize, Kahoot, FlipQuiz, Gimkit, Plickers, and Flippity can help you get a better sense of how much they really understand. (Grading quizzes but assigning low point values is a great way to make sure students really try: The quizzes matter, but an individual low score can’t kill a student’s grade.) Kids in many classes are always logged in to these tools, so formative assessments can be done very quickly. Teachers can see each kid’s response, and determine both individually and in aggregate how students are doing.

Because you can design the questions yourself, you determine the level of complexity. Ask questions at the bottom of Bloom’s taxonomy and you’ll get insight into what facts, vocabulary terms, or processes kids remember. Ask more complicated questions (“What advice do you think Katniss Everdeen would offer Scout Finch if the two of them were talking at the end of chapter 3?”), and you’ll get more sophisticated insights.

3. Dipsticks: So-called alternative formative assessments are meant to be as easy and quick as checking the oil in your car, so they’re sometimes referred to as dipsticks. These can be things like asking students to:

  • write a letter explaining a key idea to a friend,
  • draw a sketch to visually represent new knowledge, or
  • do a think, pair, share exercise with a partner.

Your own observations of students at work in class can provide valuable data as well, but they can be tricky to keep track of. Taking quick notes on a tablet or smartphone, or using a copy of your roster, is one approach. A focused observation form is more formal and can help you narrow your note-taking focus as you watch students work.

4. Interview assessments: If you want to dig a little deeper into students’ understanding of content, try discussion-based assessment methods. Casual chats with students in the classroom can help them feel at ease even as you get a sense of what they know, and you may find that five-minute interview assessments work really well. Five minutes per student would take quite a bit of time, but you don’t have to talk to every student about every project or lesson.

You can also shift some of this work to students using a peer-feedback process called TAG feedback (Tell your peer something they did well, Ask a thoughtful question, Give a positive suggestion). When you have students share the feedback they have for a peer, you gain insight into both students’ learning.

For more introverted students—or for more private assessments—use Flipgrid, Explain Everything, or Seesaw to have students record their answers to prompts and demonstrate what they can do.

5. Methods that incorporate art: Consider using visual art, photography, or videography as an assessment tool. Whether students draw, create a collage, or sculpt, you may find that the assessment helps them synthesize their learning. Or think beyond the visual and have kids act out their understanding of the content. They can create a dance to model cell mitosis or act out stories like Ernest Hemingway’s “Hills Like White Elephants” to explore the subtext.

6. Misconceptions and errors: Sometimes it’s helpful to see if students understand why something is incorrect or why a concept is hard. Ask students to explain the “muddiest point” in the lesson—the place where things got confusing or particularly difficult or where they still lack clarity. Or do a misconception check: Present students with a common misunderstanding and ask them to apply previous knowledge to correct the mistake, or ask them to decide if a statement contains any mistakes at all, and then discuss their answers.

7. Self-assessment: Don’t forget to consult the experts—the kids. Often you can give your rubric to your students and have them spot their strengths and weaknesses.

You can use sticky notes to get a quick insight into what areas your kids think they need to work on. Pick three or four areas where you think the class as a whole needs work, and write them in separate columns on a whiteboard. Ask each student to pick their own trouble spot, answer on a sticky note, and place the note in the correct column—you can see the results at a glance.

Several self-assessments let the teacher see what every kid thinks very quickly. For example, you can use colored stacking cups that allow kids to flag that they’re all set (green cup), working through some confusion (yellow), or really confused and in need of help (red).

Similar strategies involve using participation cards for discussions (each student has three cards—“I agree,” “I disagree,” and “I don’t know how to respond”) and thumbs-up responses (instead of raising a hand, students hold a fist at their belly and put their thumb up when they’re ready to contribute). Students can instead use six hand gestures to silently signal that they agree, disagree, have something to add, and more. All of these strategies give teachers an unobtrusive way to see what students are thinking.

No matter which tools you select, make time to do your own reflection to ensure that you’re only assessing the content and not getting lost in the assessment fog. If a tool is too complicated, is not reliable or accessible, or takes up a disproportionate amount of time, it’s OK to put it aside and try something different.



Oral presentations


Oral assessments offer teachers the opportunity to assess the structure and content of a presentation as well as students’ capacity to answer any subsequent probing questions. They can be formatted as individual presentations or small-group presentations; they can be done face-to-face or online, and they can be given behind closed doors or in front of peers. The most common format involves one or two students presenting during class time with a follow-up question and answer session. Because of logistics and the demands of the curriculum, oral presentations tend to be quite short – perhaps 10 minutes for an undergraduate and 15-20 minutes for a postgraduate. Oral presentations are often used in a formative capacity but they can also be used as summative assessments. The focus of this form of assessment is not on students’ capacity to find relevant information, sources and literature but on their capacity to package such materials into a logically coherent exposition.

Advantages of oral presentations

  • Allow for probing questions that test underlying assumptions.
  • Quick to mark – immediate feedback is possible.
  • Allow students to demonstrate a logical flow/development of an idea.
  • Presentation skills are valued by employers.
  • Students are familiar with this assessment method.

Challenges of oral presentations

  • Can be stressful for some students.
  • Non-native speakers may be at a disadvantage.
  • Can be time-consuming.
  • Limited scope for inter-rater checks.
  • A danger that ‘good speakers’ get good marks.

How students might experience oral presentations

Students are often familiar with giving oral presentations and many will have done so in other courses. However, they may focus too much on certain aspects to the detriment of others. For example, some students may be overly concerned with the idea of standing up in front of their peers and may forget that their focus should be on offering a clear narrative. Other students may focus on the style of their presentation and overlook the importance of substance. Still others may focus on what they have to say without considering that an oral presentation is primarily for the benefit of the audience. The use of PowerPoint in particular should be addressed by teachers beforehand, so that students are aware that it is a tool for supporting the presentation rather than the presentation itself.

Most oral presentations are followed by a question and answer phase – sometimes the questions will come from peers, sometimes from teachers, and sometimes from both. It is good practice to let students know about the format of the questions – especially if their capacity to answer them is part of the marking criteria.

Reliability, validity, fairness and inclusivity of oral presentations

Oral assessments are often marked in situ, which means that the process for allocating marks needs to be reliable, valid and fair even when used under great time pressure. With a clearly defined marking structure and a set of pre-established, shared criteria, students should be aware of what they need to do to access the highest possible marks. Precise marking criteria help teachers to focus on the intended learning outcomes rather than presentational style. During oral presentations, content validity is addressed through marking criteria that focus on the quality of the points raised in the presentation itself, and construct validity is addressed during the question and answer phase, when the presenter is assessed on their capacity to comment on underpinning literature, theories and/or principles.

One of the issues with having peer questions at the end of an oral presentation is that the teacher has very little control over what will be asked. This does not mean that such questions are not legitimate – only that teachers need to consider carefully how they mark the answers to them. To ensure equality of opportunity, teachers should ask their own questions after any peer questions, using them to fill any gaps and offer the presenter a chance to address any areas of the marking criteria that have not yet been covered. Oral presentations may challenge students with less proficiency in spoken English, and criteria should be scrutinised to support their achievement.

How to maintain and ensure rigour in oral presentations

Assessment rigour for oral presentations includes the teacher’s capacity to assess a range of presentation topics, formats and styles with an equal level of scrutiny.  Teachers should therefore develop marking criteria that focus on a student’s ability to take complex issues and present them in a clear and relatable manner rather than focus on the content covered. Throughout this whole process teachers should be involved in a form of constant reflexive scrutiny – examining if they feel that they are applying marking criteria fairly across all students. As oral presentations are ephemeral, consider how the moderator and/or external examiner will evaluate the assessment process. Can a moderator ‘double mark’ a percentage of presentations? Is there a need (or would it be helpful) to record the presentations?

How to limit possible misconduct in oral presentations

The opportunities for academic misconduct are quite low in an oral presentation – especially during the question and answer phase. If written resources are expected to be produced as part of the assessment (handouts, bibliographies, PowerPoint slides, etc.), then guidance on citing and referencing should be given, and the marking criteria may offer marks for appropriate use of such literature. In guiding students to avoid using written scripts (except where it is deemed necessary from an inclusivity perspective), teachers will steer them away from the possibility of reading out someone else’s thoughts as their own. Instead, students should be encouraged to use techniques such as limited cue cards to structure their presentation. The questions posed by the teacher at the end of the presentation are also a possible check on misconduct and will allow the teacher to see if the student actually knows about the content they are presenting or if they have merely memorised someone else’s words.

LSE examples

MA498 Dissertation in Mathematics

PB202 Developmental Psychology

ST312 Applied Statistics Project

Further resources

https://twp.duke.edu/sites/twp.duke.edu/files/file-attachments/oral-presentation-handout.original.pdf

Langan, A.M., Shuker, D.M., Cullen, W.R., Penney, D., Preziosi, R.F. and Wheater, C.P. (2008) Relationships between student characteristics and self‐, peer and tutor evaluations of oral presentations. Assessment & Evaluation in Higher Education, 33(2): 179-190.

Dunbar, N.E., Brooks, C.F. and Kubicka-Miller, T. (2006) Oral communication skills in higher education: Using a performance-based evaluation rubric to assess communication skills. Innovative Higher Education, 31(2): 115.

https://www.youtube.com/watch?v=HRaPmO6TlaM

https://www.lse.ac.uk/resources/calendar/courseGuides/PB/2020_PB202.htm

Implementing this method at LSE

If you’re considering using oral presentations as an assessment, this resource offers more specific information, pedagogic and practical, about implementing the method at LSE. The resource is password-protected and available to LSE staff only.



Students’ Oral Presentation as Multimodal and Formative Assessment

Fauzul Aufa

Abstract: The pervasiveness of digital media technologies has significantly shifted the notion of teaching and language learning. This also affects how teachers design particular assessment for students’ learning process in a multimodal environment of the contemporary classroom. However, the construction of multimodal assessment and its effects on students’ learning outcomes particularly on their oral performance is still inconclusive. Taking into account Wiliam’s (2011) strategies for successful formative assessment practice and the advancement of Computer-mediated Communication (CMC) use in learning, this paper illustrates the emergence of students’ oral presentation as multimodal assessment in language classrooms particularly at tertiary level, and provides insights for teachers to design and develop a rubric for assessment. Specifically, this paper argues that despite its challenges in classroom practice, this alternative assessment can be used to assess students’ multimodality ...




Oral assessment

Oral assessment is a common practice across education and comes in many forms. Here is basic guidance on how to approach it.


1 August 2019

In oral assessment, students speak to provide evidence of their learning. Internationally, oral examinations are commonplace. 

We use a wide variety of oral assessment techniques at UCL.

Students can be asked to: 

  • present posters
  • use presentation software such as PowerPoint or Prezi
  • perform in a debate
  • present a case
  • answer questions from teachers or their peers.

Students’ knowledge and skills are explored through dialogue with examiners.

Teachers at UCL recommend oral examinations, because they provide students with the scope to demonstrate their detailed understanding of course knowledge.

Educational benefits for your students

Good assessment practice gives students the opportunity to demonstrate learning in different ways. 

Some students find it difficult to write, so they do better in oral assessments. Others may find it challenging to present their ideas to a group of people.

Oral assessment takes account of diversity and enables students to develop verbal communication skills that will be valuable in their future careers.  

Marking criteria and guides can be carefully developed so that assessment processes can be quick, simple and transparent. 

How to organise oral assessment

Oral assessment can take many forms.

Audio and/or video recordings can be uploaded to Moodle if live assessment is not practicable.

Tasks can range from individual or group talks and presentations to dialogic oral examinations.

Oral assessment works well as a basis for feedback to students and/or to generate marks towards final results.

1. Consider the learning you're aiming to assess 

How can you best offer students the opportunity to demonstrate that learning?

The planning process needs to start early because students must know about and practise the assessment tasks you design.

2. Inform the students of the criteria

Discuss the assessment criteria with students, ensuring that you include (but don’t overemphasise) presentation or speaking skills.

Identify activities which encourage the application or analysis of knowledge. You could choose from the options below or devise a task with a practical element adapted to learning in your discipline.

3. Decide what kind of oral assessment to use

Options for oral assessment can include:

Assessment task

  • Presentation
  • Question and answer session.

Individual or group

If group, how will you distribute the tasks and the marks?

Combination with other modes of assessment

  • Oral presentation of a project report or dissertation.
  • Oral presentation of posters, diagrams, or museum objects.
  • Commentary on a practical exercise.
  • Questions to follow up written tests, examinations, or essays.

Decide on the weighting of the different assessment tasks and clarify how the assessment criteria will be applied to each.

Peer or staff assessment or a combination: groups of students can assess other groups or individuals.

4. Brief your students

When you’ve decided which options to use, provide students with detailed information.

Integrate opportunities to develop the skills needed for oral assessment progressively as students learn.

If you can involve students in formulating assessment criteria, they will be motivated and engaged and they will gain insight into what is required, especially if examples are used.

5. Planning, planning, planning!

Plan the oral assessment event meticulously.

Stick rigidly to planned timing. Ensure that students practise presentations with time limitations in mind. Allow time between presentations or interviews and keep presentations brief.  

6. Decide how you will evaluate

Decide how you will evaluate presentations or students’ responses.

It is useful to create an assessment sheet with a grid or table using the relevant assessment criteria.

Focus on core learning outcomes, avoiding detail.

Two assessors must be present to:

  • evaluate against a range of specific core criteria
  • focus on forming a holistic judgment.

Leave time to make a final decision on marks, perhaps after every four presentations. Refer to audio recordings later for borderline cases.

7. Use peers to assess presentations

Students will learn from presentations, especially if you can use ‘audio/video recall’ for feedback.

Let speakers talk through aspects of the presentation, pointing out areas they might develop. Then discuss your evaluation with them. This can also be done in peer groups.

If you have large groups of students, they can support each other, each providing feedback to several peers. They can use the same assessment sheets as teachers. Marks can also be awarded for feedback.

8. Use peer review

A great advantage of oral assessment is that learning can be shared and peer reviewed, in line with academic practice.

There are many variants on the theme so why not let your students benefit from this underused form of assessment?

This guide has been produced by the UCL Arena Centre for Research-based Education . You are welcome to use this guide if you are from another educational facility, but you must credit the UCL Arena Centre. 

Further information

More teaching toolkits  - back to the toolkits menu

[email protected]: contact the UCL Arena Centre

UCL Education Strategy 2016–21

Assessment and feedback: resources and useful links 

Six tips on how to develop good feedback practices  toolkit

Download a printable copy of this guide

Case studies: browse related stories from UCL staff and students.

Sign up to the monthly UCL education e-newsletter  to get the latest teaching news, events & resources.


Assessing oral presentation performance: designing a rubric and testing its validity with an expert group

Journal of Applied Research in Higher Education

ISSN: 2050-7003

Article publication date: 3 July 2017

Purpose

The purpose of this paper is to design a rubric instrument for assessing oral presentation performance in higher education and to test its validity with an expert group.

Design/methodology/approach

This study, using mixed methods, focusses on: designing a rubric by identifying assessment instruments in previous presentation research and implementing essential design characteristics in a preliminary developed rubric; and testing the validity of the constructed instrument with an expert group of higher educational professionals (n = 38).

Findings

The result of this study is a validated rubric instrument consisting of 11 presentation criteria, their related levels in performance, and a five-point scoring scale. These adopted criteria correspond to the widely accepted main criteria for presentations, in both literature and educational practice, regarding aspects such as content of the presentation, structure of the presentation, interaction with the audience and presentation delivery.

Practical implications

Implications for the use of the rubric instrument in educational practice refer to the extent to which the identified criteria should be adapted to the requirements of presenting in a certain domain and whether the amount and complexity of the information in the rubric, as criteria, levels and scales, can be used in an adequate manner within formative assessment processes.

Originality/value

This instrument offers the opportunity to formatively assess students’ oral presentation performance, since rubrics explicate criteria and expectations. Furthermore, such an instrument also facilitates feedback and self-assessment processes. Finally, the rubric resulting from this study could be used in future quasi-experimental studies to measure students’ development in presentation performance in a pre- and post-test situation.

  • Higher education
  • Oral presentation competence

Van Ginkel, S. , Laurentzen, R. , Mulder, M. , Mononen, A. , Kyttä, J. and Kortelainen, M.J. (2017), "Assessing oral presentation performance: Designing a rubric and testing its validity with an expert group", Journal of Applied Research in Higher Education , Vol. 9 No. 3, pp. 474-486. https://doi.org/10.1108/JARHE-02-2016-0012


Copyright © 2017, Emerald Publishing Limited


Faculty Learning Hub

Oral assessments: benefits, drawbacks, and considerations.

What is the best way to measure achievement of course learning outcomes? As modalities for teaching and learning continue to evolve, it is important to refresh assessment practices with student success and academic integrity in mind.

Oral exams and assessments are not a new practice, although they tend to be more common in European countries than in North America (Sayre, 2014). A well-constructed oral exam as one element of an overall assessment strategy can provide many benefits for the learner and the evaluator. This teaching tip will explore those benefits, weigh potential drawbacks, and finish with considerations for implementing oral assessment in your course.

Note: An oral assessment must be listed as such on the course outline, as it is a distinct type of evaluation, and students must be aware of its inclusion from the start of the course.

What is an oral exam? Typically, an oral exam is one in which students are provided in advance with topics or questions that cover a set of course outcomes. During the oral exam, which can involve sitting down with the professor in person or on Zoom, or recording a timed video, the student is randomly given one or two prompts and must explain or reply in the set time. Here is a Danish student explaining her oral exam experience.

Potential Benefits

  • Oral exams are versatile. Orals have been used successfully in a variety of disciplines, including mathematics, religious studies, business, physics, medicine, and modern languages (Hazen, 2020).
  • Oral exams provide evidence and support for higher-order thinking and problem-solving skills. A large part of STEM-related learning consists of learning how to problem-solve. Many written tests make exclusive use of shorter problems so that a student who hits a roadblock is not penalized too heavily. In an oral exam, on the other hand, the faculty can provide supportive prompts so that students overcome any stumbling blocks and both practice and demonstrate more fully their problem-solving skills. This makes an oral exam both kinder and more thorough than a written exam (Sayre, 2014; Simpson, 2015).
  • Oral exams can encourage better learning outcomes. The act of explaining an answer to the examiner adds to the student’s learning, making the test an opportunity for further learning (Sayre, 2014; Zhao, 2018).
  • Oral exams potentially alter the way students study. Students in Iannone and Simpson’s study focused more on understanding and less on memorization when they knew that they would be examined orally. Students may study harder for an in-person test than for a written test (Iannone and Simpson, 2015; Zhao, 2018; Hazen, 2020).
  • Oral exams help students develop authentic communication skills in their discipline. Sayre suggests that oral tests help her students learn how to talk like scientists. Oral tests allow students to develop the ability to communicate in skill areas they will need later in the workplace (Sayre, 2014; Hazen, 2020).
  • Oral exams as one examination tool among others can increase academic integrity. While oral exams should not be used solely for the purpose of curbing cheating, it is difficult to cheat on an oral exam, depending on how it is set up (Iannone and Simpson, 2015; Hazen, 2020).
  • Oral exams provide an alternative for demonstrating achievement of course learning outcomes. Students who are more comfortable with speaking than with writing will benefit from a portion of their assessment grade being allocated to an oral assessment (Iannone and Simpson, 2015, p. 973). An oral exam component of an overall assessment strategy is therefore in keeping with the principles of Universal Design for Learning.
  • Oral exams provide an alternative means of expression. Oral exams may suit some students better than written demonstrations, depending on their strengths and abilities.

Potential Drawbacks

  • Time commitment. Oral tests are less work to administer and mark than essay exams but take more time than self-grading multiple-choice exams (Hazen, 2020). It is important to ask fewer questions in order to retain the benefit of the more in-depth responses that oral exams offer (Sayre, 2014). A well-constructed marking rubric will be important to help guide marking.
  • Potential bias. Of course, anonymity is not possible for an oral test; however, all assessments contain a potential for bias, and it is unclear whether bias is a greater factor for orals than for other assessment methods. Math students in Iannone and Simpson’s study discussed a perceived lack of fairness when hearing that their fellow students were asked more questions, but felt that videos of their oral test performance would guarantee a degree of fairness (Iannone and Simpson, 2015; Hazen, 2020).
  • Test anxiety. Iannone and Simpson cite studies showing that test anxiety decreases with increased familiarity and understanding of the benefits of oral testing. Their own study demonstrated that math students taking an oral test for the first time were anxious about feeling exposed if they could not answer the questions (Iannone and Simpson, 2015).

Considerations

Have you thought it through? Would you like to add an oral exam to the evaluation scheme in your course outline? Here are some considerations for getting the best out of an oral exam.

  • Keep it short. Oral tests allow the examiner to interject with conceptual questions and discussion of a given problem. This means that fewer questions should be asked. A good oral test for smaller classes will likely take no longer than 20 minutes and will be structured more like an interview than a list of questions (Sayre, 2014). Oral tests for larger classes can be shorter, but then should be worth less of the overall grade.
  • Consider open-book. To deal with pre-exam anxiety, oral tests can be open-book or open-note. This will allow students to check formulas or charts and focus more on their ability to apply relevant information to problem-solving than on memory retrieval (Sayre, 2014, p. 31).
  • Weight appropriately. The time and complexity of an assessment should correspond to its weight in the overall course marking scheme. For example, an oral test worth 5% may have only one question marked according to one criterion, while a test worth 20% of the final grade should provide the student with a complex opportunity to demonstrate knowledge and ability and should be marked based on multiple criteria.
  • Prepare for grading the exam. Grading does not have to be prohibitive; the examiner can take notes during the exam, with a rubric that grades for problem-solving, not just the right answer (Sayre, 2014). With a good rubric, the marks should be consistent from student to student.
  • Ensure that the marking scheme is valid. To provide students with meaningful feedback and grades, consider a rubric with sufficient levels to capture the range of achievement, set up with one criterion per question (a minimal sketch follows this list).
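
To make the link between question-level criteria and overall weighting concrete, here is a minimal sketch in Python, assuming a hypothetical two-question oral exam marked with one criterion per question on a 0–4 scale and worth 20% of the course grade; the question names, criteria, level descriptors, and weight are illustrative assumptions, not values prescribed by this guide:

    # Minimal sketch: one criterion per question, 0-4 levels, exam weighted into the course grade.
    # All question names, criteria, descriptors, and the 20% weight are hypothetical.
    RUBRIC = {
        "Q1: explain the key concept": "accuracy of explanation",
        "Q2: apply the concept to a new problem": "quality of problem solving",
    }
    LEVELS = {0: "no attempt", 1: "major errors", 2: "partially correct",
              3: "mostly correct", 4: "correct and well reasoned"}
    EXAM_WEIGHT = 0.20  # assumed: oral exam worth 20% of the final course grade

    def exam_percentage(scores):
        """Convert per-question 0-4 scores into a percentage for the oral exam."""
        max_points = (len(LEVELS) - 1) * len(RUBRIC)
        return 100 * sum(scores.values()) / max_points

    def weighted_contribution(scores):
        """Percentage points the oral exam contributes to the final course grade."""
        return exam_percentage(scores) * EXAM_WEIGHT

    # A student scoring 3 and 4 earns 87.5% on the exam and
    # contributes 17.5 percentage points toward the final grade.
    scores = {"Q1: explain the key concept": 3,
              "Q2: apply the concept to a new problem": 4}
    print(exam_percentage(scores), weighted_contribution(scores))

Structured this way, a lightly weighted test can stay at a single question and criterion, while a more heavily weighted test simply adds questions and criteria to the same structure, keeping the marking load proportional to the exam’s weight.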

Setting up Simple Oral Assessments

While oral assessments are meant to be short, they can be more or less formal. For informal formative assessment, one teacher describes 60-second interviews asking students to explain concepts that they may have had difficulty with on a test. You can arrange for 5-to-10-minute interviews with each student in the class, or you can arrange for students to create a video of themselves talking through a question or a problem using the video feature in the eConestoga assignments tool.  Here are suggested steps for setting up an oral assessment.

  • Align the test with the course learning outcomes and include it in the course outline.
  • Decide how long the test will be.
  • Create a question pool with a corresponding rubric.
  • Prepare students by explaining the test and by providing opportunities to practice.
  • If the test will be asynchronous, provide a practice opportunity so students know how to use the technology.
  • Consider recording the test in case of questions later.
  • Consider how to assist students in signing up for a time.
  • Consider where students will wait for their turn and how much class time the test will take.
  • Panopto course folders provide a platform for students to record.
  • If you wish to randomize questions, you can set up an eConestoga quiz with random questions and a link to Panopto.

Further Supports

Guidelines on Online Oral Exams for Lecturers

Hazen, H. (2020). Use of oral examinations to assess student learning in the social sciences. Journal of Geography in Higher Education, 44(4), 592–607. https://doi.org/10.1080/03098265.2020.1773418

Iannone, P. & Simpson, A. (2015). Students’ views of oral performance assessment in mathematics: straddling the “assessment of” and “assessment for” learning divide. Assessment and Evaluation in Higher Education, 40(7), 971–987. https://doi.org/10.1080/02602938.2014.961124

Sayre, E. (2014). Oral exams as a tool for teaching and assessment. Teaching Science (Deakin West, A.C.T.), 60(2), 29–33. https://doi.org/10.3316/aeipt.203840

Zhao, Y. (2018). Impact of oral exams on a thermodynamics course performance. Paper presented at 2018 ASEE Zone IV Conference, Boulder, Colorado. https://peer.asee.org/29617


Laura Stoutenburg

A college professor and accredited TESL trainer for more than 20 years, Laura Stoutenburg, holding an M.A., has taught and developed curricula for a variety of topics, with her work including language assessment in China and Canada. Before joining Teaching and Learning as a consultant, Laura coordinated Conestoga’s TESL Certificate and English Language Studies programs. She specializes in matters related to Intercultural Teaching and language acquisition, and is available at the Kitchener Downtown Campus.


Academic Development Centre

Oral presentations

Using oral presentations to assess learning

Introduction

Oral presentations are a form of assessment that calls on students to use the spoken word to express their knowledge and understanding of a topic. They capture not only the research that students have done but also a range of cognitive and transferable skills.

Different types of oral presentations

A common format is in-class presentations on a prepared topic, often supported by visual aids in the form of PowerPoint slides or a Prezi, with a standard length that varies between 10 and 20 minutes. In-class presentations can be performed individually or in a small group and are generally followed by a brief question and answer session.

Oral presentations are often combined with other modes of assessment; for example oral presentation of a project report, oral presentation of a poster, commentary on a practical exercise, etc.

Also common is the use of PechaKucha, a fast-paced presentation format consisting of a fixed number of slides that are set to move on every twenty seconds (Hirst, 2016). The original version used 20 slides, resulting in a presentation of 6 minutes and 40 seconds; however, you can reduce this to 10 or 15 slides (3 minutes 20 seconds or 5 minutes, respectively) to suit group size or topic complexity and coverage. One of the advantages of this format is that you can fit a large number of presentations into a short period of time and everyone follows the same rules. It is also a format that enables students to express their creativity through the appropriate use of images on their slides to support their narrative.

When deciding which format of oral presentation best allows your students to demonstrate the learning outcomes, it is also useful to consider which format closely relates to real world practice in your subject area.

What can oral presentations assess?

The key questions to consider include:

  • what will be assessed?
  • who will be assessing?

This form of assessment places the emphasis on students’ capacity to arrange and present information in a clear, coherent and effective way rather than on their capacity to find relevant information and sources. However, as noted above, it could be used to assess both.

Oral presentations, depending on the task set, can be particularly useful in assessing:

  • knowledge, skills and critical analysis
  • applied problem-solving abilities
  • ability to research and prepare persuasive arguments
  • ability to generate and synthesise ideas
  • ability to communicate effectively
  • ability to present information clearly and concisely
  • ability to present information to an audience with appropriate use of visual and technical aids
  • time management
  • interpersonal and group skills.

When using this method you are likely to aim to assess a combination of the above to the extent specified by the learning outcomes. It is also important that all aspects being assessed are reflected in the marking criteria.

In the case of group presentations you might also assess:

  • level of contribution to the group
  • ability to contribute without dominating
  • ability to maintain a clear role within the group.

See also the ‘Assessing group work’ section for further guidance.

As with all of the methods described in this resource, it is important to ensure that students are clear about what they are expected to do and understand the criteria that will be used to assess them. (See Van Ginkel et al., 2017 for a useful case study.)

Although the use of oral presentations is increasingly common in higher education, some students might not be familiar with this form of assessment. It is therefore important to provide opportunities to discuss expectations and practice in a safe environment, for example by building short presentation activities with discussion and feedback into class time.

Individual or group

It is not uncommon to assess group presentations. If you are opting for this format:

  • will you assess outcome or process, or both?
  • how will you distribute tasks and allocate marks?
  • will group members contribute to the assessment by reporting group process?

Assessed oral presentations are often performed before a peer audience - either in-person or online. It is important to consider what role the peers will play and to ensure they are fully aware of expectations, ground rules and etiquette whether presentations take place online or on campus:

  • will the presentation be peer assessed? If so how will you ensure everyone has a deep understanding of the criteria?
  • will peers be required to interact during the presentation?
  • will peers be required to ask questions after the presentation?
  • what preparation will peers need to be able to perform their role?
  • how will the presence and behaviour of peers impact on the assessment?
  • how will you ensure equality of opportunities for students who are asked fewer/more/easier/harder questions by peers?

Hounsell and McCune (2001) note the importance of the physical setting and layout as one of the conditions which can impact on students’ performance; it is therefore advisable to offer students the opportunity to familiarise themselves with the space in which the presentations will take place and to agree layout of the space in advance.

Good practice

As a summary to the ideas above, Pickford and Brown (2006, p.65) list good practice, based on a number of case studies integrated in their text, which includes:

  • make explicit the purpose and assessment criteria
  • use the audience to contribute to the assessment process
  • record [audio / video] presentations for self-assessment and reflection (you may have to do this for QA purposes anyway)
  • keep presentations short
  • consider bringing in externals from commerce / industry (to add authenticity)
  • consider banning notes / audio visual aids (this may help if AI-generated/enhanced scripts run counter to intended learning outcomes)
  • encourage students to engage in formative practice with peers (including formative practice of giving feedback)
  • use a single presentation to assess synoptically, linking several parts/modules of the course
  • give immediate oral feedback
  • link back to the learning outcomes that the presentation is assessing; process or product.

Neumann in Havemann and Sherman (eds., 2017) provides a useful case study in chapter 19: Student Presentations at a Distance, and Grange & Enriquez in chapter 22: Moving from an Assessed Presentation during Class Time to a Video-based Assessment in a Spanish Culture Module.

Diversity & inclusion

Some students might feel more comfortable or be better able to express themselves orally than in writing, and vice versa. Others might have particular difficulties expressing themselves verbally, due for example to hearing or speech impediments, anxiety, personality, or language abilities. As with any other form of assessment it is important to be aware of elements that potentially put some students at a disadvantage and consider solutions that benefit all students.

Academic integrity

Oral presentations present relatively low risk of academic misconduct if they are presented synchronously and in class. Avoiding the use of a script can ensure that students are not simply reading out someone else’s text or an AI-generated script, whilst the questions posed at the end can allow assessors to gauge the depth of understanding of the topic and structure presented. (See the further guidance on academic integrity.)

Recorded presentations (asynchronous) may be produced with help, and additional mechanisms to ensure that the work presented is the student’s own may be beneficial – such as a reflective account or a live Q&A session. AI can create scripts, slides and presentations, copy real voices relatively convincingly, and create video avatars; these tools can enable students to create professional video content, and may make this sort of assessment more accessible. The desirability of such tools will depend upon what you are aiming to assess and how you will evaluate student performance.

Student and staff experience

Oral presentations provide a useful opportunity for students to practice skills which are required in the world of work. Through the process of preparing for an oral presentation, students can develop their ability to synthesise information and present to an audience. To improve authenticity the assessment might involve the use of an actual audience, realistic timeframes for preparation, collaboration between students and be situated in realistic contexts, which might include the use of AI tools.

As mentioned above, it is important to remember that the stress of presenting to a public audience might put some students at a disadvantage. Similarly, non-native speakers might perceive language as an additional barrier. AI may reduce some of these challenges, but it will be important to ensure equal access to these tools to avoid disadvantaging students. Discussing criteria and expectations with your students, providing a clear structure, and ensuring opportunities to practice and receive feedback will benefit all students.

Some disadvantages of oral presentations include:

  • anxiety - students might feel anxious about this type of assessment and this might impact on their performance
  • time - oral assessment can be time consuming both in terms of student preparation and performance
  • time - to develop skill in designing slides if they are required; we cannot assume knowledge of PowerPoint etc.
  • lack of anonymity and potential bias on the part of markers.

From a student perspective preparing for an oral presentation can be time consuming, especially if the presentation is supported by slides or a poster which also require careful design.

From a teacher’s point of view, presentations are generally assessed on the spot and feedback is immediate, which reduces marking time. Because judgments are made on the spot, it is essential to have clearly defined marking criteria that help assessors focus on the intended learning outcomes rather than simply on presentation style.

Useful resources

Joughin, G. (2010). A short guide to oral assessment. Leeds Metropolitan University/University of Wollongong. http://eprints.leedsbeckett.ac.uk/2804/

Race, P. and Brown, S. (2007). The Lecturer’s Toolkit: a practical guide to teaching, learning and assessment. 2nd edition. London: Routledge.

Annotated bibliography

Class participation

Concept maps

Essay variants: essays only with more focus

  • briefing / policy papers
  • research proposals
  • articles and reviews
  • essay plans

Film production

Laboratory notebooks and reports

Objective tests

  • short-answer
  • multiple choice questions

Patchwork assessment

Creative / artistic performance

  • learning logs
  • learning blogs

Simulations

Work-based assessment

Reference list

Cureus, 12(6), June 2020

Remote Assessment of Video-Recorded Oral Presentations Centered on a Virtual Case-Based Module: A COVID-19 Feasibility Study

Conrad Krawiec

1 Pediatric Critical Care Medicine, Penn State Health Children's Hospital, Hershey, USA

Abigail Myers

2 Pediatrics, Penn State Health Children's Hospital, Hershey, USA

Abstract

The coronavirus disease 2019 (COVID-19) pandemic has resulted in the suspension of our pediatric clerkship, which may result in medical student skill erosion due to lack of patient contact. Our clerkship has developed and assessed the feasibility of implementing a video-recorded oral presentation assignment and formative assessment centered on virtual case-based modules.

This retrospective study examined the feasibility of providing a remote formative assessment of third-year medical student video-recorded oral presentation submissions centered on virtual case-based modules over a one-week time period after pediatric clerkship suspension (March 16th to 20th, 2020). Descriptive statistics were used to assess the video length and assessment scores of the oral presentations.

Twelve subjects were included in this study. Overall median assessment score [median score, (25th, 75th percentile)] was 5 (4,6), described as “mostly on target” per the patient presentation rating tool.

Patient-related activities during the pediatric clerkship were halted during the COVID-19 pandemic. This study demonstrated the possibility of remotely assessing oral presentation skills centered on virtual case-based modules using a patient presentation tool intended for non-virtual patients. This may prepare students for their clinical experiences when COVID-19 restrictions are lifted. Future studies are needed to determine if suspended clerkships should consider this approach.

Introduction

In 2020, the coronavirus disease 2019 (COVID-19) pandemic resulted in the unprecedented prolonged closure of educational institutions worldwide to curb the spread of the virus [1, 2]. Medical students were included in this group of learners per the guidance of the Association of American Medical Colleges (AAMC) [3]. Thus, our institution temporarily suspended the clinical portion of the pediatric clerkship.

Electronic resources exist to supplement the pediatric clerkship curriculum, thus key aspects can be taught remotely [ 4 - 7 ]. One aspect that electronic sources lack, however, is patient contact. Lack of patient contact results in the inability to practice clinical skills, including interviewing or orally presenting patients recently seen. These clinical skills are often assessed during the pediatric clerkship and students will often specifically receive feedback on these skills [ 8 ]. They are also prioritized by some clerkships for the summative evaluation as students must develop these skills to demonstrate they can assess a patient and synthesize their medical knowledge [ 8 ].

At our institution, we have instituted a remote learning curriculum for our third-year medical students starting at the end of April 2020. When COVID-19 restrictions are lifted, our students will undergo two weeks of patient contact time. Because our students will not have been in a clinical environment for a prolonged time period, they may have difficulty transitioning [ 9 ]. To minimize the impact this transition will have on our students, our pediatric clerkship has developed a video-recorded oral presentation assignment centered on a virtual case-based module with remote formative assessment. Our goal was to enhance the development of this clinical skill remotely thereby allowing students to focus on clinical skill development in areas that cannot be achieved without patient contact (i.e., patient interviewing) when restrictions are lifted.

The objective of this study is to demonstrate the feasibility of student video-recording an oral presentation centered on a virtual case-based module and having our attending faculty members provide a formative assessment. The study hypothesis is that it is feasible to assess and provide formative feedback on video-recorded oral presentations by pediatric attending faculty members using a patient presentation rating tool intended for non-virtual patients.

Materials and methods

Study design

This is a feasibility study in which students were asked to video-record an oral presentation centered on a virtual case-based module for formative assessment during a time period (March 16th, 2020 until March 19th, 2020) when Pennsylvania State College of Medicine third-year medical students were abruptly restricted from providing direct patient care during the pediatric clerkship. A retrospective review of faculty-submitted formative assessments of the video-recorded oral presentations centered on virtual case-based modules was completed. This study was reviewed by our institution’s review board and determined to be non-human research.

Subject population

Third-year medical students - (1) part of our institution’s traditional curriculum, (2) rotated at the pediatric clerkship’s primary site or off-campus affiliate sites during the first month of the academic year (2020-2021), (3) were abruptly restricted from direct patient care due to the COVID-19 pandemic, and (4) completed a video-recorded oral presentation centered on a virtual case-based module - were included in this study. Students who were part of the longitudinal integrated curriculum were excluded.

Clerkship overview

The pediatric clerkship at our institution is a four-week rotation with the following clinical requirements: outpatient clinic, nursery, and inpatient service. On March 16th, third-year students were restricted from direct patient care, thus only three weeks of the clerkship were completed.

Video-recorded oral presentation assignment and assessment creation

The video-recorded oral presentation assignment was developed by a pediatric clerkship director experienced in inpatient medicine and an outpatient pediatrician. A patient presentation tool developed by Lewin et al. was utilized for this assessment [ 10 ].

Using behavioral and verbal anchors, the patient presentation tool assesses various oral presentation sections, including patient history, physical exam and diagnostic study results, summary statement, assessment and plan, clinical reasoning/synthesis of information, and general aspects (organization, speaking style), on a 5-point scale (5 being the highest) [10]. Overall assessment of the presentation is based on a 9-point scale (9 being the highest, described as “well above expectations”). Eight faculty members were recruited to use this tool when assessing the video recordings.
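As a rough sketch only (not code from the study), one assessor’s ratings with this tool could be captured as a small record holding the per-section scores on the 1-5 scale and the overall score on the 1-9 scale; the section and field names below are paraphrased assumptions rather than the tool’s exact wording.

    # Minimal sketch of one assessor's ratings on the patient presentation tool.
    # Section names are paraphrased from the description above; all identifiers
    # here are illustrative assumptions, not taken from Lewin et al.
    from dataclasses import dataclass

    SECTIONS = [
        "history",
        "physical_exam_and_diagnostics",
        "summary_statement",
        "assessment_and_plan",
        "clinical_reasoning",
        "general_aspects",  # organization, speaking style
    ]

    @dataclass
    class PresentationAssessment:
        student_id: str
        section_scores: dict  # each section scored 1-5 (5 = highest)
        overall_score: int    # 1-9 (9 = "well above expectations")

        def __post_init__(self):
            for name, score in self.section_scores.items():
                if name not in SECTIONS or not 1 <= score <= 5:
                    raise ValueError(f"invalid section score: {name}={score}")
            if not 1 <= self.overall_score <= 9:
                raise ValueError("overall score must be between 1 and 9")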

Video-recorded oral presentation assignment implementation

Starting on March 16th, 2020, the subjects were provided a remote learning curriculum and were notified of the video-recorded oral presentation assignment. They were informed that the pediatric clerkship would be graded pass/fail and that submission of a video-recorded oral presentation for formative assessment would be required by March 19th, 2020. The subjects were instructed to (1) video-record an oral presentation of either a patient they had seen during the course of the clerkship or a virtual online case-based module completed through Aquifer© (Lebanon, New Hampshire, USA) and (2) upload the assignment via the Instructure Canvas (Salt Lake City, Utah, USA) learning management system. Students were given specific directions, including the use of professional attire, limiting the video recording to 10 minutes, and reviewing the video prior to submission for clarity and organization. After the video recordings were received, the files were securely distributed through the Canvas learning management system to eight pediatric attending faculty volunteers, who reviewed the oral presentations and provided formative assessment scores.

Data collection

All completed assignments were collected using the Instructure Canvas learning management system, from which we extracted the following data: overall video-recorded oral presentation rating scores and section-by-section scores as outlined by the patient presentation tool [10].
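The paper does not describe how these data were pulled from Canvas. Purely as a hypothetical sketch, Canvas exposes a REST API from which an assignment’s submission records could be listed programmatically; the host, identifiers, and token below are placeholders, not values from the study.

    # Hypothetical sketch: listing submissions for one assignment via the
    # Canvas LMS REST API. All values below are placeholders.
    import requests

    BASE = "https://canvas.example.edu/api/v1"   # placeholder Canvas host
    TOKEN = "YOUR_API_TOKEN"                     # placeholder access token
    COURSE_ID, ASSIGNMENT_ID = 1234, 5678        # placeholder identifiers

    resp = requests.get(
        f"{BASE}/courses/{COURSE_ID}/assignments/{ASSIGNMENT_ID}/submissions",
        headers={"Authorization": f"Bearer {TOKEN}"},
        params={"per_page": 100},
    )
    resp.raise_for_status()

    for submission in resp.json():
        # Each submission record identifies the submitting student and its
        # workflow state (e.g., "submitted", "graded").
        print(submission["user_id"], submission.get("workflow_state"))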

Virtual case-based module

If students elected to give an oral presentation based on a virtual case-based module, we asked them to complete the pediatric Aquifer© case-based module 3, a 3-year-old male seen for a well-child visit [4]. This case was chosen because it provides a robust history and physical examination, tasks the student with identifying and prioritizing problems uncovered during the visit, and allows the student to apply a differential diagnosis where appropriate, formulate a management plan, and practice organization skills during the oral presentation.

We used descriptive statistics to assess the study population in terms of length of presentation, type of patient presented, and assessment scores based on the patient presentation tool [10]. Formative assessment of each oral presentation was reported as the median and interquartile range.
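For readers less familiar with the “median (25th, 75th percentile)” notation used below, the following sketch shows how such a summary could be computed; the scores are invented for illustration and are not the study’s data.

    # Minimal sketch of median / interquartile reporting; the example scores
    # are made-up illustrative values, not data from this study.
    import numpy as np

    def median_iqr(values):
        q25, med, q75 = np.percentile(values, [25, 50, 75])
        return f"{med:g} ({q25:g},{q75:g})"

    example_scores = [4, 5, 6, 5, 4, 6, 5, 5, 4, 6, 5, 5]  # illustrative only
    print(median_iqr(example_scores))  # -> "5 (4.75,5.25)" for these values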

Results

Twelve individual oral presentation videos centered on the virtual case-based module were included in this study. Median video length [median time (mm:ss), (25th, 75th percentile)] was 06:20 (05:04, 07:21).

Video-recorded oral presentation assessment scores

Overall, the median formative assessment score [median score, (25th, 75th percentile)] of video-recorded oral presentations centered on virtual case-based modules was 5 (4,6), described as “mostly on target” per the patient presentation tool [10]. The lowest-scored item was pertinent positives and negatives of the differential diagnosis [2 (1,3)] (Table 1).

Table 1: Patient Presentation Rating Tool for Oral Case Presentations [10]. Note: sections scored on a 1 to 5 scale, 5 being the highest score.

Discussion

Oral presentations are an essential clinical skill that facilitates physician-to-physician communication, improves efficiency on rounds, and enables individual as well as group learning [8]. The skill can also be complex and time-consuming, as students must use their medical knowledge and clinical reasoning skills to select the pertinent details to present from a patient’s history, physical, diagnostic, and laboratory tests [8, 10]. In this study, we hypothesized that video-recorded oral presentations centered on a virtual case-based module can undergo formative assessment. This study successfully demonstrated that a formative assessment can be provided remotely for video-recorded presentations based on virtual case-based modules. These results imply that this form of assessment is possible, may prepare students for the eventual live clinical experience (with patient contact), and may optimize the transition period from COVID-19 remote learning to a post-COVID-19 clinical patient experience.

To our knowledge, a pediatric clerkship has never before been halted in this manner for a prolonged period due to a nationwide health emergency. Because of this, our pediatric clerkship, like others across the United States, was placed in an unprecedented situation, potentially placing our students at risk of achieving suboptimal competency in various clinical areas [11]. Novel approaches are necessary to ensure that our students, who were hastily restricted during their pediatric clerkship, and future students who have yet to complete their pediatric clerkship, are adequately trained [11].

Our institution’s current plans are for each clerkship to institute a remote learning curriculum and complete a two-week immersive clinical experience in each of the core clerkships. The remote learning curriculum will allow students to learn the basic concepts relevant for pediatrics, and the two-week patient contact experience will allow students to apply their knowledge. When the two-week patient contact experience begins, however, the transition period may be difficult. Students will not have seen a patient (possibly for months), and, similar to transitioning from the pre-clerkship to clerkship years, they may be overwhelmed by clerkship logistics, expectations, and adjusting to the clinical culture [12]. In all, students may be overwhelmed by the number of tasks they must complete in a short time period post-clerkship suspension, potentially limiting their clinical experience.

Thus, it is the clerkship’s responsibility to ensure that students in a remote curriculum continue to be comparably trained and are provided as many similar clinical experiences as possible to ease the transition that will occur on clerkship reinstatement. While the pediatric clerkship is currently limited in allowing students to see patients during the remote learning experience, there are other ways that students can be robustly prepared for the clinical environment. The area that our clerkship elected to focus on is the oral presentation.

If students are rigorously prepared to practice oral presentation skills with the pediatric faculty members they will eventually present to, they may start to apply their communication, medical knowledge, and clinical reasoning skills earlier and can then focus their clinical skill development on areas that cannot easily be achieved remotely (i.e., history taking, physical examination, and providing live patient care) when they return to the clerkship. Students may also gain a better understanding of the expectations, roles, and responsibilities associated with this skill in our clerkship and thus be better prepared to provide meaningful patient care and to be effective team members sooner.

In our study, we found that it is feasible for students to submit a video-recorded oral presentation centered on a virtual case-based module and for recruited pediatric attending faculty members to assess it and provide formative feedback. We also found that the overall median scores were “mostly on target” according to the patient presentation tool. The students who completed these assessments were the first students of the academic year, so these results may indicate that these students developmentally require more practice. Alternatively, these results may indicate that, because these assessments are formative and the clerkship is now pass/fail, these students were given the feedback necessary to improve their skills. Finally, students may not have received enough individual educational attention during the normal clinical workflow and thus were not given enough instruction. More studies are necessary to determine if these assessments are consistent.

Limitations

There were several limitations to this study, including its small sample size, the short intervention period, and the lack of randomization. The patient presentation rating tool, intended for live patients, was used without the opportunity to validate it for virtual case-based modules due to the haste of its implementation. Future studies will be required to validate the tool for this purpose. Student perceptions of the effectiveness of this assessment technique are also unknown; future qualitative studies are planned to explore this.

Conclusions

Our pediatric clerkship was suddenly curtailed during the COVID-19 pandemic. The students were provided a remote learning curriculum to emphasize pediatric concepts but may not be able to demonstrate their clinical skills in communication, data synthesis, and patient assessment. Our study demonstrated that it is possible to assess oral presentation skills centered on virtual case-based modules using a patient presentation rating tool intended for non-virtual patients; this approach may help prepare students for their clinical experiences when COVID-19 restrictions are lifted. Future studies are needed to determine if suspended clerkships should consider this approach.

Acknowledgments

The authors are grateful to our pediatric faculty, who took the time to assess our students during this stressful time.


The authors have declared that no competing interests exist.

Human Ethics

Consent was obtained by all participants in this study. Penn State College of Medicine Institutional Review Board issued approval STUDY00014941. The Human Subjects Protection Office determined that the proposed activity, as described in the above-referenced submission, does not meet the definition of human subject research as defined in 45 CFR 46.102(e) and/or (l). Institutional Review Board (IRB) review and approval is not required. Please note: While IRB review and approval is not required, you remain responsible for ensuring compliance with FERPA. If you have additional questions regarding FERPA regulations, please contact the Office of the University Registrar. The IRB requires notification and review if there are any proposed changes to the activities described in the IRB submission that may affect this determination. If changes are being considered and there are questions about whether IRB review is needed, please contact the Human Subjects Protection Office.

Animal Ethics

Animal subjects: All authors have confirmed that this study did not involve animal subjects or tissue.

