Evaly automatically grades objective questions, but some question types require your review and manual scoring. This guide covers the grading workflow, scoring modes, and how to provide effective feedback.

Automatic vs Manual Grading

Understanding which questions are automatically graded helps you plan your grading workload.

Auto-Graded Question Types

These questions are graded instantly when participants submit:

Multiple Choice

Compares selected options to correct answers. Supports single or multiple selection.

Yes or No

Simple binary choice with predefined correct answer.

Image Choice

Multiple choice with images, same scoring logic.

Audio Choice

Multiple choice with audio, same scoring logic.

Fill in the Blank

Exact text matching (case-insensitive) for each blank.
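The matching described above can be sketched as a case-insensitive, exact comparison for each blank. This is a minimal illustration, not Evaly's actual implementation; the per-blank aggregation shown here is an assumption:

```typescript
// Sketch: each blank is compared independently against its expected
// answer, ignoring letter case but requiring an otherwise exact match.
function gradeBlanks(answers: string[], correct: string[]): number {
  let earned = 0;
  for (let i = 0; i < correct.length; i++) {
    // Missing answers count as incorrect
    if ((answers[i] ?? "").toLowerCase() === correct[i].toLowerCase()) {
      earned++;
    }
  }
  return earned; // number of blanks answered correctly
}
```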

Ranking (Exact)

Full credit only if entire order matches exactly.

Special Case: Ranking (Per-Position)
  • Automatically graded with partial credit
  • Each item in correct position earns proportional points
  • Example: 3/5 correct positions = 60% of points
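The per-position calculation can be sketched as follows (an illustrative sketch, not Evaly's internal code):

```typescript
// Per-position ranking: each item placed in its correct position
// earns a proportional share of the question's points.
function rankingPerPosition(
  submitted: string[],
  correct: string[],
  maxPoints: number,
): number {
  // Count items whose submitted position matches the correct order
  const matches = correct.filter((item, i) => submitted[i] === item).length;
  return (matches / correct.length) * maxPoints;
}
```

With 3 of 5 items in the right position, this yields 60% of the question's points, matching the example above.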

Manually Graded Question Types

These questions require educator review:

Text Field

Short or long text responses. Requires reading and judgment.

File Upload

Documents, images, code, presentations. Requires content review.

Audio Response

Voice recordings. Requires listening and evaluation.

Video Response

Video submissions. Requires watching and assessment.

Matching Pairs

Complex matching logic. Requires review (for now).

Slider Scale

Subjective ratings. Typically used for surveys.

Likert Scale

Opinion scales. Typically used for feedback.
Tests with manually graded questions cannot show full results to participants until you complete grading. Participants see a “Grading in Progress” message.

Grading Workflow

Follow this workflow to efficiently grade participant submissions.

Step 1: Navigate to Results

1

Open Your Test

Go to the Tests page and select the test to grade
2

Go to Results Tab

Click the Results tab to see all participants
3

View Grading Progress

See grading progress indicator:
  • Total answers: X
  • Graded: Y
  • Pending: Z
  • Progress: Y/X %

Step 2: Select a Participant

Results View Shows:
  • Participant name and email
  • Completion status
  • Current score (for auto-graded questions)
  • Pending manual reviews
  • Submission time
Sort Options:
  • By submission time (newest first)
  • By name (alphabetical)
  • By score (if partially graded)
  • By grading status (needs review first)
Click on a participant to open their detailed submission.

Step 3: Review Submission

The submission view displays:
Navigate through test sections:
  • Section Title displayed at top
  • All questions in that section
  • Auto-graded results shown with checkmarks/x’s
  • Manual review questions highlighted
  • Section score calculated as you grade

Step 4: Grade Manual Questions

For questions requiring manual review:
1

Review Answer

Read/view/listen to the participant’s submission:
  • Text: Read response in rich text viewer
  • Files: Download and review files
  • Audio: Play audio with playback controls
  • Video: Watch video with playback controls
2

Assign Score

Enter the score based on your rubric:
  • Percentage mode: Enter 0 or 1 (0% or 100%)
  • Point-based mode: Enter 0 to max points for question
  • Partial credit: Decimal values allowed (e.g., 7.5/10)
3

Add Feedback (Optional)

Provide comments to help the participant:
  • Explain score
  • Highlight strengths
  • Suggest improvements
  • Clarify misconceptions
4

Flag for Review (Optional)

Mark question for follow-up:
  • Unclear answer
  • Possible plagiarism
  • Technical issue
  • Needs second opinion
5

Save Grade

Click Save to record the grade and move to the next question
Grading Interface:
┌──────────────────────────────────────┐
│ Question: Explain photosynthesis     │
│ Type: Text Field                     │
│                                      │
│ Student Answer:                      │
│ [Photosynthesis is the process...]   │
│                                      │
│ Score: [8] / 10 points               │
│                                      │
│ Feedback:                            │
│ [Good explanation of the process...  │
│  but missing details about...]       │
│                                      │
│ [ ] Flag for review                  │
│                                      │
│ [Cancel] [Save Grade]                │
└──────────────────────────────────────┘

Step 5: Complete All Questions

  • Grade all manual questions for the participant
  • Review auto-graded results (can override if needed)
  • Ensure all questions have final scores
  • Progress indicator updates in real-time

Step 6: Move to Next Participant

  • Click Next Participant or return to results list
  • Repeat grading workflow
  • Track overall grading progress

Scoring Modes

How scores are calculated depends on the section’s scoring mode.

Percentage Mode

Each question is worth 1 point, regardless of complexity. Calculation:
score = correctQuestions / totalQuestions * 100

// Example:
// 10 questions in section
// 7 correct
// Score: 7/10 = 70%
When Grading:
  • Assign 0 or 1 for each question
  • Partial credit: 0.5 = half credit
  • Final score is percentage
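The calculation above can be sketched in code. This is an illustration of the formula, not Evaly's implementation; per-question scores are 0, 1, or a fraction such as 0.5 for partial credit:

```typescript
// Percentage mode: every question is worth exactly 1 point.
// questionScores holds one value per question (0, 1, or a fraction).
function percentageScore(questionScores: number[]): number {
  const correct = questionScores.reduce((sum, s) => sum + s, 0);
  return (correct / questionScores.length) * 100;
}
```

For the example above, 7 correct answers out of 10 questions gives a score of 70%.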

Point-Based Mode

Questions have custom point values based on importance/difficulty. Calculation:
score = earnedPoints / totalPossiblePoints * 100

// Example:
// Question 1: 2 points (earned 2)
// Question 2: 5 points (earned 3)
// Question 3: 3 points (earned 3)
// Total: 8/10 = 80%
When Grading:
  • Each question shows its point value
  • Enter score from 0 to max points
  • Decimals allowed (e.g., 3.5/5)
  • System calculates percentage
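The point-based calculation can be sketched the same way (illustrative only; the shape of `GradedQuestion` is hypothetical, not Evaly's data model):

```typescript
interface GradedQuestion {
  earned: number;    // points awarded (decimals allowed, e.g. 3.5)
  maxPoints: number; // point value set on the question
}

// Point-based mode: the final percentage is weighted by each
// question's point value rather than counting every question as 1.
function pointBasedScore(questions: GradedQuestion[]): number {
  const earned = questions.reduce((sum, q) => sum + q.earned, 0);
  const possible = questions.reduce((sum, q) => sum + q.maxPoints, 0);
  return (earned / possible) * 100;
}
```

For the example above: (2 + 3 + 3) earned out of (2 + 5 + 3) possible = 8/10 = 80%.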
Setting Point Values:
  1. Edit question in Questions page
  2. Find “Point Value” field
  3. Enter custom value (default: 1)
  4. Save question
Quiz: "Chemistry Midterm"
Scoring Mode: Point-Based

Easy Questions (1-2 points each):
  Q1: Define atom (1 pt)
  Q2: Name three elements (1 pt)
  Q3: Multiple choice (2 pts)

Medium Questions (3-5 points each):
  Q4: Balance equation (3 pts)
  Q5: Explain bonding (5 pts)

Hard Questions (8-10 points each):
  Q6: Multi-step problem (8 pts)
  Q7: Lab analysis essay (10 pts)

Total Possible: 30 points

Overriding Auto-Grades

You can manually change auto-graded results if needed.

When to Override

  • Answer is technically correct but marked wrong
  • Multiple interpretations are valid
  • Spelling/formatting issue in fill-in-blank
  • Want to award partial credit
  • Technical glitch affected grading

How to Override

1

Find Question

Navigate to participant’s submission, find the auto-graded question
2

Click Override

Click “Override Grade” or edit button next to auto-grade result
3

Enter New Score

Enter manual score:
  • 0 to max score for question
  • Replaces auto-grade completely
4

Add Explanation

Explain why you’re overriding:
  • Helps you remember later
  • Useful for grade appeals
  • Maintains transparency
5

Save Override

Manual score replaces auto-score immediately
Grade Record Shows:
Question 5: Multiple Choice
Auto Score: 0/1 (incorrect)
Manual Override: 1/1 (correct)
Graded By: Dr. Johnson
Reason: "Answer was technically correct, 
         accepting alternative interpretation"
Overridden: Yes
Overriding marks the grade as manually scored, so it won’t be affected by auto-regrading if the question is later edited.
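The override semantics described above can be sketched as follows. The field names here are hypothetical, chosen for illustration rather than taken from Evaly's API:

```typescript
interface AnswerGrade {
  autoScore: number;        // score from automatic grading
  manualScore?: number;     // score entered by the educator
  manuallyGraded: boolean;  // true once an override is applied
  overrideReason?: string;  // explanation for the override
}

// An override replaces the auto-score and flags the answer so that
// later auto-regrades leave it untouched.
function applyOverride(
  grade: AnswerGrade,
  score: number,
  reason: string,
): AnswerGrade {
  return { ...grade, manualScore: score, manuallyGraded: true, overrideReason: reason };
}

// The effective score prefers the manual override when one exists.
function effectiveScore(grade: AnswerGrade): number {
  return grade.manuallyGraded ? grade.manualScore! : grade.autoScore;
}
```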

Regrading After Question Changes

If you modify a question’s correct answer after participants have submitted, you can regrade affected submissions.

Triggering Regrade

When you change correct answers on a question:
  1. System detects existing grades
  2. Shows affected submissions count
  3. Prompts: “Regrade X affected answers?”
  4. Choose to regrade or skip

Regrade Process

1

Edit Question

Change correct answer, options, or scoring
2

Regrade Prompt

Dialog appears:
Regrade Required

Changing the correct answer will affect 47 existing grades.
Do you want to regrade all affected answers?

[Cancel] [Skip Regrade] [Regrade All]
3

Choose Action

  • Regrade All: Applies new scoring to all submissions
  • Skip: Keeps old scores, only affects new submissions
  • Cancel: Don’t save question changes
4

Automatic Regrading

If you choose “Regrade All”:
  • System recalculates all affected answers
  • Updates participant scores
  • Preserves manual overrides
  • Logs the regrade action
What Gets Regraded:
  • All submissions for that question
  • Uses new correct answer
  • Recalculates scores automatically
What Doesn’t Change:
  • Manual overrides (preserved)
  • Questions you manually graded already
  • Other questions in the test
Regrading can significantly change participant scores. Use carefully, especially after releasing results. Consider the impact on fairness.
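The regrade behavior above (recalculate everything except manual overrides) can be sketched like this. The data shapes and the `autoGrade` function are hypothetical, for illustration only:

```typescript
interface SubmittedAnswer {
  response: string;
  manuallyGraded: boolean; // set by a manual grade or override
  score: number;
}

// Sketch of regrading: recompute auto-scores with the new answer key,
// skipping any answer that was manually graded or overridden.
function regradeAll(
  answers: SubmittedAnswer[],
  autoGrade: (response: string) => number, // scoring with the NEW correct answer
): void {
  for (const answer of answers) {
    if (answer.manuallyGraded) continue; // manual overrides are preserved
    answer.score = autoGrade(answer.response);
  }
}
```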

Providing Effective Feedback

Feedback helps participants learn from their mistakes and improves future performance.

Feedback Best Practices

Be Specific

Good: “Your explanation of mitosis was accurate, but you forgot to mention cytokinesis.”
Poor: “Good job” or “Needs work”

Be Constructive

Good: “Consider organizing your essay with clear topic sentences for each paragraph.”
Poor: “Your writing is messy.”

Highlight Strengths

Acknowledge what they did well:
  • “Excellent analysis of the data”
  • “Your introduction was compelling”
  • “Great use of examples”

Guide Improvement

Give actionable advice:
  • “Review pages 45-48 in the textbook”
  • “Practice similar problems in chapter 3”
  • “See me during office hours to discuss”

Feedback Templates

Create reusable feedback for common situations:

Your answer demonstrates understanding of [concept A],
but doesn't address [concept B], which is essential to
fully answering the question. Review [resource] and
focus on how [concept A] and [concept B] interact.

You've made a good start, but this question requires
more depth. Consider expanding on:
- [Point 1]
- [Point 2]
- [Point 3]

Aim for at least [X words/minutes] on questions like this.

Outstanding response! You demonstrated:
✅ Clear understanding of [concept]
✅ Strong analytical skills
✅ Effective communication
✅ Good use of examples

This is exactly the level of detail I'm looking for.

Your response discusses [topic X], but the question
asks about [topic Y]. While your information is
accurate, it doesn't address what was asked.

Make sure to read questions carefully and stay focused
on what's being asked.

Rubric-Based Grading

Use consistent criteria for manual grading:
Essay Question Rubric (10 points total):

Content (5 points):
  - Addresses all parts of question
  - Demonstrates understanding
  - Uses accurate information
  - Provides relevant examples

Organization (3 points):
  - Clear structure
  - Logical flow
  - Effective transitions

Writing Quality (2 points):
  - Grammar and spelling
  - Clear expression
  - Professional tone
Provide feedback based on rubric categories:
Score: 7/10

Content (3/5): You addressed two of the three main
points but didn't discuss [missing concept].

Organization (3/3): Excellent structure with clear
introduction, body paragraphs, and conclusion.

Writing (1/2): Several grammar errors and typos.
Consider proofreading more carefully.

Grading Progress Tracking

Monitor your grading workload from the progress dashboard:
Test: Midterm Exam
Participants: 85

Auto-Graded Questions: 15 (100% complete)
Manual Review Questions: 5

Grading Progress:
  Total Answers: 425
  Graded: 312 (73%)
  Pending: 113 (27%)

Estimated Time Remaining: 2.5 hours
  (based on your average grading speed)
Participant Status:
  • ✅ Fully Graded: 52 participants
  • 🟡 Partially Graded: 18 participants
  • 🔴 Not Started: 15 participants

Bulk Operations

Apply Same Feedback

For common errors across participants:
  1. Grade first occurrence with detailed feedback
  2. Copy feedback text
  3. Paste and modify for similar cases
  4. Maintains consistency

Export for External Grading

If grading offline:
  1. Export participant responses
  2. Grade in spreadsheet or grading software
  3. Import scores back to Evaly
  4. (Feature availability depends on plan)

Releasing Results

Once grading is complete:
1

Complete All Manual Grading

Ensure all questions needing review are graded
2

Review Final Scores

Check participant scores for:
  • Accuracy
  • Fairness
  • Consistency
3

Release Results

Toggle “Results Released” in test settings:
  • Participants can now see their scores
  • Detailed feedback is visible
  • Correct answers shown
4

Communicate Release

Notify participants:
  • “Results are now available”
  • How to view their scores
  • Grade appeal process (if applicable)
You can release results before completing all grading, but participants with ungraded questions will see “Grading in Progress” for those items.

Best Practices

  1. Grade Consistently: Use the same rubric for all participants
  2. Grade by Question: Grade all participants’ answers to Question 1, then Question 2, and so on, for consistency
  3. Take Breaks: Grading fatigue affects consistency and fairness
  4. Blind Grading: Consider hiding names to reduce bias
  5. Second Opinions: For borderline cases, get colleague input
  6. Document Criteria: Write down your grading standards
  7. Be Timely: Return grades promptly while content is fresh

Next Steps

Analytics

Analyze test performance and question difficulty

Results Management

Export and manage participant results

Question Types

Learn about auto-grading for each question type

Test Settings

Configure scoring modes
