In my previous blog post, I explored creating Level 1 and Level 3 evaluations/surveys in the LMS. In this post, I explain how your newly created evaluations are managed and deployed to learners.
Adding an evaluation to an item
Evaluations are linked directly to an item. To do this, open your item and click the “More” button on the Related menu.
Next, select Evaluations and then click the “Edit” button.
The top section, “Item Evaluation: User Satisfaction,” is for Level 1 surveys, while “Follow-up Evaluation: Application of Learning” is used for Level 3 evaluations. The middle section is for Level 2 evaluations – essentially exams on the material that learners can take before and/or after the course.
Since we previously created a user satisfaction survey, we will add it to the item by entering (or, if needed, searching for) the survey ID. We can then set how many days a learner has to complete the survey and whether it must be completed before the learner receives credit for the course.
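As a rough sketch of that gating behavior (the function and field names below are my own illustration, not SuccessFactors internals), the “required for credit” setting works like this:

```python
def can_receive_credit(event_recorded: bool,
                       survey_required: bool,
                       survey_complete: bool) -> bool:
    """Illustrative only: whether the learner gets item credit.

    If the Level 1 survey is marked as required, credit is withheld
    until the survey is complete; otherwise the learning event alone
    is enough. All names here are assumptions for illustration.
    """
    if not event_recorded:
        return False
    return survey_complete if survey_required else True

# A required survey that is still open blocks credit:
print(can_receive_credit(True, True, False))   # False
# An optional survey never blocks credit:
print(can_receive_credit(True, False, False))  # True
```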
Once the survey is added and the page is saved, learners will receive the survey in their learning plan as soon as the learning event is recorded for the course.
There are a few additional fields when using a Level 3 survey.
Participants selects who will complete the survey – the learner, their manager, or both.
Required for completion by indicates whether the survey is mandatory. If it is, it remains on the learning plan until it is complete.
Assign dictates how many days after the learning event is recorded that the survey is assigned and placed in the employee’s/manager’s learning plan.
Allow dictates how many days after assignment they have to complete it.
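To make the Assign/Allow arithmetic concrete, here is a minimal sketch (my own illustration – the LMS computes this internally; the function name and parameters are assumptions):

```python
from datetime import date, timedelta

def level3_schedule(learning_event_date: date,
                    assign_days: int,
                    allow_days: int) -> tuple:
    """Compute when a Level 3 follow-up evaluation appears on the
    learning plan and when it is due (illustrative only)."""
    assigned_on = learning_event_date + timedelta(days=assign_days)
    due_by = assigned_on + timedelta(days=allow_days)
    return assigned_on, due_by

# Example: learning event on June 1, Assign = 30, Allow = 14
assigned, due = level3_schedule(date(2024, 6, 1), 30, 14)
print(assigned, due)  # 2024-07-01 2024-07-15
```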
Assigning a survey
Level 1 user satisfaction surveys are automatically assigned as soon as a learning event is recorded for the learner on that item, with no additional action needed from the administrator. However, there are circumstances where admins need to send out evaluations before the learning event is recorded. This was a limitation until last year, when SuccessFactors introduced an option for admins or instructors to initiate evaluations after a scheduled offering has started but before it is complete and the learning event is recorded. To do this, go to the scheduled offering’s page and click “Initiate Evaluations”.
Proceed through the warning – the survey will be sent to all registered users.
Accessing the survey
Upon assignment via learning event completion or the admin push, the learner accesses the survey from their learning plan, where they can open and complete it.
Survey APM Jobs
There are two important jobs that need to be scheduled in the system to ensure evaluations are delivered and accessed (and do not clutter learning plans).
The Evaluation Synchronization job accomplishes two things. First, it assigns Level 3 follow-up evaluations to learners once the configured number of days since learning completion has passed. Second, it triggers the email notifications alerting learners that they have a survey to complete.
The Clean Up Overdue Item Evaluations job removes surveys from users’ learning plans that are 1) overdue, 2) optional (not required for item completion), and 3) not yet started by the learner. This keeps long-overdue evaluations from cluttering learning plans and prevents learners from clearing them on their own by submitting data that may not be accurate.
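The three conditions the clean-up job checks can be sketched as a simple filter (illustrative only – the field names here are my assumptions, not LMS data structures):

```python
from datetime import date

def should_remove(evaluation: dict, today: date) -> bool:
    """Mirror of the three conditions the Clean Up Overdue Item
    Evaluations job applies (illustrative; field names assumed)."""
    return (
        evaluation["due_by"] < today        # 1) overdue
        and not evaluation["required"]      # 2) optional for completion
        and not evaluation["started"]       # 3) not yet started
    )

# An optional, untouched survey past its due date is swept away:
survey = {"due_by": date(2024, 7, 15), "required": False, "started": False}
print(should_remove(survey, date(2024, 8, 1)))  # True
```

A required survey, or one the learner has already opened, would be left on the plan by all three checks.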
Reporting on evaluation results
Having learners complete evaluations without a meaningful way to look at the data would be pointless. Luckily, the LMS comes standard with several reports that allow admins to extract and visualize the results. The two I recommend most are Item Evaluation and Item Evaluation by Instructor.
With these reports, you can view user satisfaction responses across items and scheduled offerings. Furthermore, the instructor report lets you narrow the data to specific instructors, so results can be provided directly to each instructor without exposing their fellow instructors’ results.