Every year, millions of candidates appear for examinations organized by assessment bodies worldwide, which must then manage the complex process of marking answer scripts and declaring results within strict timelines. The reliability of marking depends on how consistently evaluators mark the answer scripts. The number of candidates appearing for exams has been increasing rapidly over the years, while the number of markers remains largely limited. Markers are therefore under constant pressure to mark more answer scripts within limited time, which leads to sub-standard evaluation of the answer scripts.
Assessment bodies are thus revamping their examination management systems to conduct exams in a transparent, secure and efficient manner. But that is not all: publishing exam results efficiently and fairly is a challenge assessment bodies must address in order to grow, and few solutions are available in the market to overcome this hindrance.
The traditional evaluation system involves a number of difficulties that slow down result processing. The process is completely manual, and even one small mistake in assigning or calculating marks can affect a candidate’s performance.
Challenges faced by traditional evaluation system
- Discrepancies in handling of answer scripts:
Answer scripts are transported and distributed manually for evaluation, so there is a greater chance of scripts being misplaced, whether accidentally or deliberately. There is also a risk of the answer scripts being tampered with.
- Tampering of scores:
Tampering of scores can happen for a variety of reasons. The traditional evaluation process involves only a single round of evaluation, with no parallel evaluations by multiple evaluators, so tampering goes unnoticed.
- Handling of re-evaluations:
In the traditional method, retrieving the physical answer scripts and scanning them so they can be issued to the candidate for re-evaluation is a tedious job. Handling physical answer scripts at every step is risky and time-consuming.
- Longer turnaround time:
The traditional evaluation process spans months, and evaluation can be delayed further for various reasons. This slows down result publication and can cause serious problems for candidates waiting to share their results with employers or higher-education institutes.
- Lack of security:
Additional security measures have to be taken manually by an assessment body to maintain the secrecy of the evaluation, which demands extra time, manpower and management.
Mindlogicx offers the best digital solution with IntelliEXAMS to aid any assessment body in performing swift and accurate evaluations through its Onscreen Marking System.
On-Screen Marking (OSM) – the new-age solution for enabling on-demand evaluations
On-Screen Marking (OSM) is the most vital component of IntelliEXAMS®, as it enables online evaluation of scanned answer scripts ‘anytime, anywhere’. The elementary business rules of OSM ensure that every answer, including diagrams and graphs, is evaluated; that answer keys guide the evaluation; that re-evaluation is easy; that annotations can carry remarks for each answer; that an answer script can be evaluated in parallel by multiple evaluators; and, most importantly, that multiple levels of evaluation rule out biased marking. On-Screen Marking can be accessed only by authorized SMEs and COEs. Evaluation can be done from a laptop, mobile or tablet, as the app supports all devices across platforms such as iOS, Android and Windows. To date, 60 million answer scripts have been evaluated with IntelliEXAMS through the Onscreen Marking System.
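To make the parallel, multi-level evaluation rule concrete, here is a minimal sketch of how a disagreement check between two parallel evaluators might trigger escalation to a higher-level evaluator. The function name, tolerance threshold, and data shapes are illustrative assumptions, not IntelliEXAMS internals.

```python
# Hypothetical sketch of a multi-level evaluation rule; the threshold
# and names are assumptions for illustration, not the product's logic.

def needs_third_evaluation(marks_a: float, marks_b: float,
                           max_marks: float,
                           tolerance_pct: float = 10.0) -> bool:
    """Flag an answer for a higher-level evaluator when two parallel
    evaluators disagree by more than a percentage of the maximum marks."""
    allowed_gap = max_marks * tolerance_pct / 100.0
    return abs(marks_a - marks_b) > allowed_gap

# Evaluators award 7 and 3 out of 10: gap of 4 exceeds the allowed 1.0,
# so the answer is escalated for another level of evaluation.
print(needs_third_evaluation(7, 3, 10))    # True
print(needs_third_evaluation(7, 6.5, 10))  # False
```

A rule of this shape is what lets tampering or bias by any single evaluator surface automatically instead of going unnoticed.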
- Using OSM, evaluation can be done for both pen and paper-based assessment and online assessment.
- For pen-and-paper-based handwritten answer scripts, scanning is done using the scan-desk. Answer scripts can also be digitized at source using the mSCAN option in the IntelliEXAMS mobile app. The scanned answer scripts are then uploaded to the cloud.
- For online exams, the subjective exam is written on a laptop or PC in the Content Editor app, and the file is uploaded as-is to the cloud. If the content is an existing document, image or graph, it is uploaded to the cloud as a .doc, .pdf, .jpeg or similar file.
- The scanned answer scripts and uploaded documents are then available in OSM (On Screen Marking) for evaluators to assess.
- The output of OSM is passed to the OGS (Onscreen Grading System) module, where the grading logic calculates a candidate’s final marks for a subject.
- The processing of both OSM and OGS is supported on cross platforms such as iOS, Android and Windows.
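As a rough illustration of the grading step above, the sketch below averages each question’s marks across parallel evaluators and sums them into a subject total. The data structure and aggregation rule are assumptions for illustration, not the actual OGS grading logic.

```python
# Hypothetical sketch of OGS-style grading: average parallel evaluators'
# marks per question, then sum across the subject. All structures here
# are illustrative assumptions, not the product's implementation.

from statistics import mean

def final_subject_marks(script: dict[str, list[float]]) -> float:
    """Given per-question marks from one or more parallel evaluators,
    average each question's marks and sum them for the subject."""
    return sum(mean(marks) for marks in script.values())

script = {
    "Q1": [8.0, 7.0],   # marked by two parallel evaluators
    "Q2": [5.0, 5.0],
    "Q3": [9.0],        # single evaluation
}
print(final_subject_marks(script))  # 21.5
```

Separating marking (OSM) from grading (OGS) in this way means evaluators never compute totals by hand, which removes the calculation mistakes the manual process is prone to.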
Value additions of Onscreen Marking System
- Anytime and anywhere on-the-go evaluation
- Live tracking of number of answer scripts evaluated or pending
- Facilitates transparency in evaluations
- Real-time evaluation analytics
- Compatible with cross platforms like Android, iOS, and Windows
- Simultaneous evaluations by multiple faculty members
- Global outreach for geo-located evaluators
- Quick turnaround time for result processing
- Support to give individual remarks during evaluation
- Support for handwritten answer scripts and answer scripts uploaded as a file
- Support for candidate identity masking
- Support for intuitive annotations
- Swift and easy re-evaluation process
- Longer retention of evaluated answer scripts
IntelliEXAMS Onscreen Marking System is an effective solution that assists examination bodies by proactively identifying issues and discrepancies right at the start. This saves considerable resources and effort while building stakeholders’ trust in the system. Using a process-based methodology, the solution transforms the way answer scripts are marked today. IntelliEXAMS On-Screen Marking System increases the quality and consistency of the entire examination process by making it more secure and transparent in a highly cost-effective manner. This means savings for examination boards as well as greater confidence in the whole examination system.