Overview Analytics
Get a high-level view of test results:

Participation
- Total participants
- Completed vs in-progress
- Completion rate percentage
Score Statistics
- Average score
- Median score
- Highest and lowest scores
- Score distribution
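As a sketch of how these statistics are derived from a list of scores (the helper below is illustrative, not part of the Evaly API):

```typescript
// Compute summary statistics for an array of scores.
// Assumes at least one score; median averages the two middle values for even counts.
function scoreStats(scores: number[]) {
  const sorted = [...scores].sort((a, b) => a - b);
  const n = sorted.length;
  const average = sorted.reduce((sum, s) => sum + s, 0) / n;
  const median =
    n % 2 === 1 ? sorted[(n - 1) / 2] : (sorted[n / 2 - 1] + sorted[n / 2]) / 2;
  return { average, median, lowest: sorted[0], highest: sorted[n - 1] };
}
```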
Time Analysis
- Average time spent
- Time per section
- Completion patterns
Question Insights
- Hardest questions
- Easiest questions
- Success rates
Participant Results
Detailed results for each participant:

Results Table
Sorting and Filtering
- Default Sort: Highest score first
- Filter by Status: Completed only, in-progress, all
- Search: Find participants by name or email
- Export: Download results as CSV or Excel
Participants are only included in analytics if they have started at least one section.
Score Calculation
Evaly calculates scores based on each section's scoring mode:

Percentage Mode (Default)
Point-Based Mode
Scoring mode is set per section. When sections use different modes, they are still combined correctly in the total score.
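One plausible way to combine mixed-mode sections is to reduce both modes to earned/possible points and take the overall ratio. This is a sketch of that approach, not Evaly's exact formula; the types and names are illustrative:

```typescript
interface SectionResult {
  mode: "percentage" | "points"; // the section's scoring mode
  earned: number;                // points earned in this section
  possible: number;              // maximum points available in this section
}

// Combine sections of either scoring mode into one overall percentage.
function overallPercentage(sections: SectionResult[]): number {
  const earned = sections.reduce((sum, s) => sum + s.earned, 0);
  const possible = sections.reduce((sum, s) => sum + s.possible, 0);
  return possible === 0 ? 0 : (earned / possible) * 100;
}
```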
Score Distribution
Visualize how participants performed across score ranges:

- Histogram View
- Statistical Analysis
Bar chart showing distribution:
- Quick identification of score clustering
- Spot bimodal distributions
- Identify if test is too easy or too hard
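The histogram view buckets percentage scores into fixed-width ranges. A minimal binning sketch (illustrative, not Evaly's implementation):

```typescript
// Bucket percentage scores (0-100) into fixed-width bins for a histogram.
// With the default binSize of 10 this yields 10 bins: 0-9, 10-19, ..., 90-100.
function histogram(scores: number[], binSize = 10): number[] {
  const bins = new Array(Math.floor(100 / binSize)).fill(0);
  for (const score of scores) {
    // Clamp so a perfect score of 100 lands in the top bin.
    const index = Math.min(Math.floor(score / binSize), bins.length - 1);
    bins[index]++;
  }
  return bins;
}
```

A cluster of counts at both ends of the returned array is the bimodal pattern mentioned above.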
Question Difficulty Analysis
Identify which questions were hardest and easiest:

Success Rate Calculation
Hardest Questions
Questions with the lowest success rates:

Easiest Questions
Questions with the highest success rates:

Very high success rates (above 95%) may indicate questions are too easy or test only basic knowledge.
Section Analysis
Breakdown of performance by test section:

Section Insights
- Difficulty Comparison: Compare average scores across sections
- Time Analysis: See which sections took longest
- Completion Rates: Identify dropout points
- Question Distribution: Balance question counts
Answer Distribution
For multiple-choice questions, see how participants answered:

Distractor Analysis
Understand why wrong answers were chosen:

- Popular Distractors: Wrong answers chosen frequently indicate common misconceptions
- Unused Options: Options rarely chosen may not be plausible enough
- Pattern Analysis: Detect if participants are guessing randomly
- Effective Distractors
- Weak Distractors
Good distribution across options indicates quality distractors.
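Distractor analysis starts from a tally of how often each option was chosen; the most popular wrong answer often points at a common misconception. A minimal sketch (option IDs and names are illustrative):

```typescript
// Tally how often each option was chosen for a multiple-choice question.
function answerDistribution(responses: string[]): Record<string, number> {
  const counts: Record<string, number> = {};
  for (const option of responses) counts[option] = (counts[option] ?? 0) + 1;
  return counts;
}

// Find the most frequently chosen wrong answer (the strongest distractor).
function topDistractor(
  counts: Record<string, number>,
  correct: string
): string | null {
  let best: string | null = null;
  for (const [option, n] of Object.entries(counts)) {
    if (option === correct) continue;
    if (best === null || n > counts[best]) best = option;
  }
  return best;
}
```

Options that never appear in the tally at all are the "unused options" above: they may not be plausible enough.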
Progress Tracking
Monitor ongoing test progress:

Metrics Explained
- In Progress: Participants who started but haven’t completed all sections
- Submissions: Participants who completed all sections
- Average Time: Mean completion time for finished participants (in seconds)
- Completion Rate: (Submissions / Total Participants) × 100
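The completion-rate formula above can be expressed directly; rounding to one decimal place is an assumption of this sketch, not a documented Evaly behavior:

```typescript
// Completion Rate = (Submissions / Total Participants) × 100,
// rounded to one decimal place (rounding choice is illustrative).
function completionRate(submissions: number, totalParticipants: number): number {
  if (totalParticipants === 0) return 0;
  return Math.round((submissions / totalParticipants) * 1000) / 10;
}
```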
Exporting Data
Download analytics data for external analysis:

CSV Export
Download participant results:
- Name, email, score, percentage
- Section-by-section breakdown
- Time spent per section
- Question-level responses
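A CSV export of the top-level fields might look like the sketch below; the row shape is illustrative (the real export also includes the section and question columns listed above), and the quoting rule follows standard CSV conventions:

```typescript
interface ParticipantRow {
  name: string;
  email: string;
  score: number;
  percentage: number;
}

// Serialize participant results to CSV, quoting fields that
// contain commas, quotes, or newlines per RFC 4180 conventions.
function toCsv(rows: ParticipantRow[]): string {
  const escape = (value: string | number) => {
    const s = String(value);
    return /[",\n]/.test(s) ? `"${s.replace(/"/g, '""')}"` : s;
  };
  const header = "name,email,score,percentage";
  const lines = rows.map((r) =>
    [r.name, r.email, r.score, r.percentage].map(escape).join(",")
  );
  return [header, ...lines].join("\n");
}
```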
Excel Export
Formatted spreadsheet with:
- Multiple sheets (overview, results, questions)
- Charts and visualizations
- Pivot-table-ready format
Best Practices
Interpreting Results
- Look at median score, not just average (outliers can skew average)
- Compare section averages to identify difficult content areas
- Review hardest questions for clarity and accuracy
- Check if score distribution is normal or skewed
- Investigate very high or very low overall averages
Improving Tests
- Revise questions with very low success rates
- Remove or replace questions with near-100% success rates
- Update distractors that are never chosen
- Adjust point values based on question difficulty
- Balance section difficulty for fair assessment
Using Question Analytics
- Questions with 30-70% success rates discriminate well
- Below 30% may be too hard or unclear
- Above 70% may be too easy for assessment purposes
- Check answer distribution for guessing patterns
- Use insights to improve future question writing
Performance Tracking
- Export data after each test for records
- Compare results across test versions
- Track improvement over time for repeat assessments
- Use section analysis to guide curriculum decisions
- Share aggregate (not individual) data with stakeholders
Comprehensive Analytics Query
The analytics engine provides all metrics in a single query.

Analytics update in real-time as participants complete sections and answers are graded.
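The single-pass aggregation behind the overview metrics can be sketched as follows. This is an illustration of the computation, not Evaly's actual query; the `Result` shape and field names are assumptions:

```typescript
interface Result {
  completed: boolean;
  score: number | null;       // percentage, null until graded
  timeSeconds: number | null; // total time, null if still in progress
}

// One pass over all participant results producing the overview metrics.
function overview(results: Result[]) {
  const total = results.length;
  const finished = results.filter((r) => r.completed);
  const avg = (xs: number[]) =>
    xs.length ? xs.reduce((a, b) => a + b, 0) / xs.length : 0;
  return {
    totalParticipants: total,
    submissions: finished.length,
    inProgress: total - finished.length,
    completionRate: total ? (finished.length / total) * 100 : 0,
    averageScore: avg(finished.map((r) => r.score ?? 0)),
    averageTimeSeconds: avg(finished.map((r) => r.timeSeconds ?? 0)),
  };
}
```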
Next Steps
Manual Grading
Grade open-ended questions to complete results
Live Monitoring
Monitor tests in real-time during administration
Export Results
Download data for external analysis
Improve Tests
Use analytics to refine your assessments