Investigating new assessment technologies

Assessments matter to students. They are a way to demonstrate knowledge and mastery, and they provide an invaluable learning experience. In 2020, out of necessity, UCEM moved away from in-person, real-time exams and delivered a mixture of computer-marked assessments (CMAs) and take-home exams. The results were positive, and it is clear that the pandemic offers us the opportunity to rethink many of our ingrained and sometimes outdated approaches, in particular the idea that exams are the only way to assess.

The UCEM Learning, Teaching and Assessment Strategy (2020–2025) underpins the educational ambitions set out in UCEM’s Vision and Strategy to 2025, and defines six strategic priorities for the next five years:

  • Student Centred: Putting students at the heart of everything we do
  • Industry Excellence: Creating built environment professionals of the highest calibre
  • Widening Access and Participation: Offering different pathways; welcoming more diversity; improving accessibility
  • Student Outcomes: Maximising every student’s potential
  • Student Satisfaction: Providing a positive, engaging, and rewarding education
  • Strategic use of Technology: Using technology wisely to support our vision

Assessment and feedback are golden threads that weave through these strategic priorities. UCEM plans to transform assessment from the traditional use of summative examinations and assignments to fully integrated, experiential assessment methodologies, using technology to enhance students’ assessment literacy and success. The first step in this approach is to pilot a set of new assessment methodologies and tools on a group of pre-agreed modules. After consideration of the existing assessment process, the team working on assessment redesign has been investigating the purchase of an assessment platform.

Assessment platform

UCEM makes considerable use of CMAs: they are a cost-efficient, scalable assessment solution that can reduce marking resource (through automation) and provide instantaneous feedback to students. Currently they are created and stored within the UCEM VLE (Moodle) using the Moodle quiz and question banks. While Moodle has been an appropriate tool for this purpose, investigation has identified that the existing CMA writing process at UCEM is complex and time-consuming, and that the CMAs created are limited in functionality. With this in mind, the procurement of an assessment platform should be seen as an opportunity to improve the assessment workflow and, ideally, to move beyond the delivery of online tests to a new, transformative assessment approach. A platform is much more than a CMA system and will help us streamline existing processes.

Earlier in the year, through a process of research, we shortlisted four assessment platforms.

The assessment tools were selected for their potential in a number of areas, including:

  • Writing, delivery and grading workflows
  • A wider range of question types (calculations, video, drawing, labelling images, graphing, classification, etc.)
  • The ability to use existing tools (such as Excel) within an assessment
  • Usability when a student has a poor or limited connection
  • The ability to set up different sittings of the same exam for students with disability packages
  • Improved moderation and branching within questions

Two of these platforms have recently been considered in more detail through a formal, systematic testing and evaluation process, during which we drew on the expertise of many UCEM staff. We won’t name individuals, but we would like to make particular mention of the Digital Education teams (Digital Education, Learning Technology, Core Services, the SITS team, Editorial and Quality, IT, Information Governance and Library, and the Media team), Academic Registry (the Assessment team, Academic Standards and Timetabling) and our academics.

We are pleased to announce that we have now signed a contract to pilot the assessment platform Inspera over the forthcoming year.

Inspera

Inspera is a Scandinavian assessment platform used by a growing number of UK HE institutions, including Bath, Oxford, BPP and Newcastle.
The platform offers impressive functionality at all stages of the assessment process. There are over 20 question types, the vast majority of which are automatically marked. Rubrics can be added at question level, and question sets can be shared and tagged. Test sittings can be built specifically for learners with additional needs and saved as templates. Templates can be set up to draw questions from a question pool and to branch depending on scores. Students can be given a question choice, which allows them to view all available questions and then select which they wish to answer. The test environment can be customised from the admin side, and there are multiple accessibility options for students. Student support is strong: while a test is in progress, Inspera allows admin users to send notifications to candidates. Candidates taking tests are shown on the admin’s monitoring panel, and if a candidate is disconnected for more than a couple of minutes this is indicated.

Where Inspera excels is in its handling of authoring and marking workflows. The use of committees enables marking groups and moderation options. Inspera allows efficient marking of open question types, with precise annotations and criteria-based marks; the workflow can be configured to allow double-blind marking. Notes can be written during marking and kept private or shared with other markers or with students. Markers can mark entire tests or just particular questions for all students. Feedback is comprehensive and can take various forms. The analytics for questions and for tests are equally comprehensive: they include scores, time spent on questions, question order, attempts, and the correlation between question score and total score. The system is continually evolving, as its openly available roadmap indicates.

The bigger picture

The assessment platform and the other related technologies chosen are only one piece of the puzzle in the move towards authentic, high-quality assessments that work for students and for staff. Other work streams will consider the assessment process in more detail, the data connected to assessments, and the infrastructure ramifications of implementing the systems described. There may also be policy implications of changes in our approach. Later there will be opportunities to consider continuous assessment, through ePortfolios and other reflective approaches, and student assessment literacy, through the use of peer review. In delivering any new technologies we are keen to include the student voice in decisions and to involve (and inspire) academics, especially in future assessment design. Watch this space for more on how to get involved.

There is still much work to do, but UCEM is now taking a more holistic and student-focused view of assessment that encompasses academic integrity, the building of competencies and digital innovation. It is a really exciting time for the institution!