Designing activities to facilitate analytics

Moodle records a specific set of data about student activity and interaction with course items. This guide provides information on choosing Moodle activities based on the data they produce. It is possible to design learning tasks so that the resulting analytics provide an accurate estimate of student ability, comprehension, and progress.

Having effective data means that you can offer advice and guidance to students efficiently, and streamline feedback processes through automatic, targeted responses.

Designing learning tasks

Data related to a student’s knowledge, skills, or abilities is collected in Moodle through the student’s completion of tasks, performance results for selected activities, forum posts, or time spent in lessons. To evaluate learning effectively, the “task completion”, “result”, or “time spent” data need to relate to a learning outcome.

It isn’t always possible to correlate every desired learning outcome with an item that Moodle can track, but there are learning strategies that can be adopted to capture and evaluate student progress. The chart below shows how tasks can be connected to a Moodle data source.

Note: Basic Moodle activity knowledge may be required to interpret the flow chart.

Reviewing collected activity data

Once you have collected data on student Moodle activity, reflect on how it relates to student progress. It is often useful to triangulate data from different sources, for example activities, attendance, assessment, and logs, to get an overall impression of student learning. Data can also be collected over a period of time to identify increasing or decreasing engagement and changes in behaviour.
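
As a minimal sketch only, the following Python snippet shows one way to triangulate two data sources downloaded from a course, assuming you have exported an activity completion report and a quiz grade report as CSV files. The file names, column headings, and grade threshold used here are assumptions and will vary with your Moodle version and export settings.

    # Minimal sketch: triangulating two hypothetical Moodle CSV exports.
    # File names and column headings are assumptions; adjust them to your own exports.
    import pandas as pd

    completion = pd.read_csv("activity_completion.csv")  # assumed columns: "Email address", "Week 3 reading"
    grades = pd.read_csv("quiz_grades.csv")              # assumed columns: "Email address", "Quiz 3 grade"

    # Join the two sources on a shared identifier so they can be read side by side.
    combined = completion.merge(grades, on="Email address", how="outer")

    # Exported grades may contain placeholders such as "-", so coerce them to numbers.
    combined["Quiz 3 grade"] = pd.to_numeric(combined["Quiz 3 grade"], errors="coerce")
    combined["completed_reading"] = combined["Week 3 reading"].eq("Completed")
    combined["low_quiz_score"] = combined["Quiz 3 grade"] < 50

    print(combined[["Email address", "completed_reading", "low_quiz_score"]])

Students who completed the reading yet scored low may need a different follow-up from those who skipped it, which is the kind of overall impression that triangulation is intended to give.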

Consider how the student’s data aligns with your own assumptions and impressions of their progress. Do you feel that the data accurately measures the student’s understanding, skills, and abilities? Can the data be translated into actions that you can suggest the student takes?

For example, if you identify that a student is not clicking on a resource, what could this mean? Perhaps the resource is hard to find, repeats other material, sits in the wrong place, or is not seen as useful. If you track the resource over time, you might find that the student views it in a different week and that their viewing habits vary. Are there any other data sources, for example a formative task, that would help you understand the student’s behaviour? If the student has done badly in a related quiz, you could contact them, explain how the resource is useful, and suggest they view it and retake the quiz.
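
Continuing the example, the sketch below flags students who scored poorly on the related quiz and have no log entry for the resource, so they can be contacted and encouraged to view it before retaking the quiz. It assumes a course log export and a quiz grade export as CSV files; the file names, the “Event context” value, and the grade threshold are all assumptions.

    # Minimal sketch: identifying students to contact about an unused resource.
    # File names, column headings, and the event context string are assumptions.
    import pandas as pd

    logs = pd.read_csv("course_logs.csv")    # assumed columns: "User full name", "Event context"
    grades = pd.read_csv("quiz_grades.csv")  # assumed columns: "User full name", "Grade"

    # Students with at least one log entry against the resource in question.
    viewed = set(logs.loc[logs["Event context"] == "File: Week 3 reading", "User full name"])

    grades["Grade"] = pd.to_numeric(grades["Grade"], errors="coerce")
    to_contact = grades[(grades["Grade"] < 50) & (~grades["User full name"].isin(viewed))]

    # These students could be sent a message pointing them to the resource
    # and suggesting they retake the quiz after viewing it.
    print(to_contact["User full name"].tolist())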

Reviewing the data that has been collected can also inform future activity design choices, data collection strategies, and analysis approaches. It is good practice to fine-tune all of these and progressively refine how you use analytics to improve the student learning experience.