Accessible Information anytime, anywhere: The intentional use of videos in Psychology for learning and reducing statistical anxiety

Karin Oerlemans, University of Canberra

Janie Busby Grant, Centre for Applied Psychology, University of Canberra


Psychology at UC has undergone substantial change recently, particularly a move to flexible course delivery. As recipients of grant funding, academics embraced the opportunity to review unit content, adopted technologies, and drew on the support of teaching and learning specialists. In 2013, the psychology statistics lecturer (the second author), seeking advice from one of the T&L specialists (the first author), incorporated a number of vodcasts into the unit, utilising a combination of lecture-based, enhanced and worked examples (Kay, 2012), scaffolding the learning, but also with the aim of reducing some of the anxiety students expressed around learning statistics and the SPSS software package. The mid-semester feedback was overwhelmingly positive. When an improvement in final unit results was noted, particularly at the top end of the grade scale, the decision was made to investigate the reasons for the students’ positive engagement with the unit’s core learning materials.

This poster presents the initial findings of the 2014 study exploring how the use of vodcasts may have positively impacted students’ use of SPSS and reduced statistics anxiety. Researchers surveyed the students using a before-after design within one semester. The study also drew on other de-identified data, such as unit results and Moodle access data, comparing 2014 with previous years, to understand students’ use of the vodcasts and other tools and to see if and how these were linked to the improved results.

It has been noted in the literature that there is a substantial lack of evidence-based practice in higher education’s use of technology. Price and Kirkwood (2014) found that whilst the adoption of technology for use in teaching and learning was widespread, the effectiveness of its use was “open to question”, and that much evidence for use was based on case study data and anecdotal accounts of past practice that had worked. By using a before-after design, we hoped to collect more rigorous data and improve our understanding of how vodcasts can support flexible delivery of teaching and learning in higher education.

[Image: our poster]


Psychology at UC has undergone substantial change in recent times, particularly a move to flexible course delivery. As recipients of grant funding, academics embraced the opportunity to review unit content, adopted technologies, and drew on the support of teaching and learning specialists. In 2013 and 2014, the psychology statistics lecturer, seeking advice from one of the T&L specialists, incorporated a number of vodcasts into the unit, utilising a combination of lecture-based, enhanced and worked examples (Kay, 2012), scaffolding the learning, but also with the aim of reducing some of the anxiety students expressed around learning statistics and the SPSS software package. Students’ engagement with statistics is important for those completing a psychology degree; however, statistical anxiety can be a significant stressor for students needing to complete a statistics unit (Chew & Dillon, 2014; Pan & Tang, 2005).

It has been noted that there is a substantial lack of evidence-based practice in higher education’s use of technology, with much evidence for use based on case study data and anecdotal accounts of past practice that worked (Price & Kirkwood, 2014). Kay (2012), in an extensive review of the use of vodcasts, noted that research on their use often had significant methodological concerns. He identified four types of vodcasts used in teaching and learning: lecture-based, enhanced, worked examples and supplementary podcasts. Whilst the review showed much case-based evidence for the use of video podcasts in higher education, often leading to increased learning performance, only one of the reviewed studies was in mathematics and none were in psychology. This research seeks to further develop these themes.

Research questions

The following research questions were addressed in the study:

  • Can videos be used to support students and lead to a reduction of students’ anxiety in statistics?
  • Is the use of videos in coursework related to improved student learning outcomes?


This study, using primarily quantitative data, explored whether the use of video podcasts positively impacted students’ learning of SPSS and reduced statistics anxiety. Kay’s (2012) definitions for the four types of video were used:

  • Lecture-based (captured Echo recordings of face-to-face lectures),
  • Enhanced (recorded PowerPoints with audio explanations, captured using Camtasia, for example the introductory video to the unit),
  • Worked examples (short audio-visual explanations for solving specific procedural problems or using a program), and
  • Supplementary podcasts (videos that augment the teaching and learning of a unit, such as the explanations of the assessment tasks).

Lecture videos were recorded as the lectures occurred during the semester; the other videos were predominantly recorded before the start of semester. This ensured that they were available to students as and when they needed them. Lecture videos were further supported by the PPT handouts, and assessment videos by other written instructions. This was the norm in past practice, and the rest of the teaching of the unit was also unchanged, with face-to-face lectures and tutorials conducted as normal during the semester, so that the only significant change in content was the introduction of the videos. With this intervention we hoped to see a positive effect on student anxiety, improved learning outcomes for the unit, and better Unit Satisfaction Survey (USS) scores. Two data sets were used to determine the outcomes:

Data Set 1: UC-level de-identified archival data, from both the 2013 and 2014 cohorts:  

  • The use of online resources – views and downloads from Moodle & Echo
  • USS scores
  • Learning outcomes – students’ grades

Data Set 2: A student questionnaire – using a before-after design within one semester, measuring:

  • students’ self-reported use of various online resources, and use of face-to-face contact options to support their learning within the unit
  • anxiety related to statistics, using the 24-item Statistical Anxiety Scale (Vigil-Colet, Lorenzo-Seva, & Condon, 2008)


Students’ access habits

There were 37 videos in the Echo Centre and one on YouTube: two assessment videos, 12 lecture captures, 23 tutorial screencasts, and one introductory video (on YouTube). The assessment videos were 13 and 20 minutes long, the lectures were each 113 minutes long, the tutorial screencasts varied between 2 and 22 minutes long, and the introductory video to the course was 7 minutes long.

Most, though not all, students accessed the videos, and not all students viewed all videos. In all, of the original 250 students enrolled, 217 accessed the Echo Centre, where all the recordings were stored. The following graph shows the number of students and how many Echo recordings they accessed: 33 students did not watch any Echo recordings, 36 students watched only one, 17 students watched two, and so on. Only one student watched all 37 recordings in the Echo Centre. This graph does not show the number of times students watched the Echo recordings, just the unique views.
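Unique-view counts like these can be derived from raw access logs. As a minimal sketch (the log format and the student and video IDs here are invented for illustration, not the actual Moodle or Echo export):

```python
from collections import Counter

# Invented access-log rows: one (student_id, video_id) pair per view event.
log = [
    ("s01", "lec01"), ("s01", "lec01"), ("s01", "tut03"),
    ("s02", "lec01"),
    ("s03", "lec02"), ("s03", "tut03"), ("s03", "lec01"),
]

# Unique videos per student: repeated views of the same video count once.
unique_per_student = {
    sid: len({vid for s, vid in log if s == sid})
    for sid in {s for s, _ in log}
}

# Distribution behind the graph: how many students watched exactly k videos.
distribution = Counter(unique_per_student.values())
```

Counting distinct videos (rather than raw view events) is what separates the "unique views" graph here from the repeat-viewing figures reported later.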

[Graph: Echo recordings accessed by students]

[Graph: lecture capture views and handout downloads]

Because the unit also ran face-to-face, handouts were prepared and made available to the students. Based on the Moodle data, the graph above shows the number of views of the lecture captures compared to the number of times students downloaded the handouts over the course of the semester. There were 12 lectures, and the numbers refer to the modules in which they occurred in the unit. The downloads of the handouts outnumber the lecture views, with the largest number of views being 126 for lecture 4b, and the largest number of downloads being 393 for a four-slides-per-page lecture handout.

[Graph: minutes viewed per lecture]

Finally, we were curious about how long students watched the individual lectures for. What is immediately obvious from the graph above is that most students did not watch the full lecture, though some students watched the videos for longer than the 113 minutes of the lecture. Lecture 8 (module 5a: survey design and correlation) had the longest views (274 minutes). The most viewed and downloaded items were the Lab Report supporting resources:

[Graph: Lab Report resource views]

Student Statistical Anxiety

We examined three subscales of statistical anxiety: examination anxiety, asking-for-help anxiety, and interpretation anxiety. Using only the data from those who completed both parts A and B (N = 42), all three were non-significant: there was no significant change in any measure of anxiety between before and after, and no significant difference in total score (p = .085).
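The before-after comparison is a paired-samples test on each student's change score. A minimal sketch of the t statistic, using invented anxiety totals rather than the study's data:

```python
import statistics

def paired_t(before, after):
    """Paired-samples t statistic: mean within-subject difference
    divided by its standard error, with n - 1 degrees of freedom."""
    diffs = [b - a for b, a in zip(before, after)]
    n = len(diffs)
    mean_d = statistics.fmean(diffs)
    sd_d = statistics.stdev(diffs)      # sample SD (n - 1 denominator)
    return mean_d / (sd_d / n ** 0.5), n - 1

# Invented anxiety totals for five students (start vs end of semester).
pre = [62, 70, 55, 48, 66]
post = [60, 68, 57, 45, 64]
t, df = paired_t(pre, post)
```

With N = 42 paired respondents as in the study, the same calculation would run on 42 difference scores with df = 41.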

[Table: statistics anxiety and video views]

Statistical Anxiety and Procrastination

The Lab Report assessment was due on the 17th of October, 2014. The video with instructions was uploaded to the Echo Centre on the 3rd of September. Students began to access this almost immediately and continued well past the assessment due date. Whilst it was apparent that some students left it to the last minute, many began much earlier, as can be seen from the graph below. What is also evident is that many students accessed the video many times, with total views of 1014 (see table above).

[Graph: dates the Lab Report video was accessed]

Unit Satisfaction Survey (USS) scores

We saw an improvement in the USS data for this unit. The USS is a University-based survey instrument which students complete after the semester ends. 31% of students completed the survey; the table below shows the results from this year and the previous two years, with improvements noted across all areas[1].
[Table: USS scores]

Improved learning outcomes for the unit

There was a noted improvement in the learning outcomes for the unit, with a particularly large increase in the percentage of HDs (2013 – 7%; 2014 – 15%). There was also a decrease in the percentage of students who completed all the unit requirements but failed to gain a pass (NX[2]; 2013 – 9%; 2014 – 5%). The graph below shows the spread of results, and the trend lines indicate the improved results between 2013 and 2014.

[Graph: spread of student results, 2013 and 2014, with trend lines]

Effect of Videos in Psychology

Finally, we wished to understand how much the use of the videos helped students in gaining these results. A Pearson’s r correlation analysis between the use of the videos and the final results was conducted for each of the assessment tasks and the unit overall.

  • For the Lab Report there was a strong and significant positive relationship between the number of supporting resources accessed and the final result for the assessment, r(232) = .57, p < .001.
  • There was a weak positive relationship between the students’ results on their final assessment and the number of times they watched the video, r(232) = 0.19, p < .05.
  • There was a strong positive relationship between the students access of exam resources and the result on their final exam, r(232) = 0.54, p < .001.
  • Finally, there was a strong positive relationship between the number of videos students viewed throughout the semester, including lectures, tutorial screencasts and assessment videos, and their final grade for the semester, r(232) = 0.47, p < .001. The graph below demonstrates this relationship.

[Graph: correlation between videos viewed and final grade]
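The coefficients reported above are standard Pearson product-moment correlations. A minimal sketch of the calculation, with invented view counts and grades rather than the study's data:

```python
import statistics

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length lists."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Invented data: number of videos viewed vs final grade (%).
views = [2, 5, 8, 10, 15, 20, 25, 30]
grades = [48, 55, 52, 63, 70, 68, 79, 85]
r = pearson_r(views, grades)   # strongly positive for this toy data
```

Note that r measures association only; a positive correlation between viewing and grades does not by itself establish that the videos caused the improvement.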


Student Access: Students accessed the videos much as expected, more at the beginning of semester and less as time went on. The length of time they watched also matched other research, with students on average watching for longer at the beginning and for shorter periods as time went on. Interestingly, this was not so for tutorial and assessment videos, which were accessed more often and usually watched all the way through.

Student Statistical Anxiety: the lack of significant results here may have been due to the timing of the second questionnaire, which unfortunately fell in the week before the exam!

A further survey was conducted the following year, but this will be reported in a subsequent paper. One common issue with statistics anxiety is the procrastination effect it produces in students (Chew & Dillon, 2014). With the videos, it was noted that students did not in fact leave it all to the last minute, but accessed the materials throughout the semester, including in the lead-up to the exam as part of their review. It will be interesting to review the access data for the new semester, when this information will be available from the beginning of the semester.

From the survey, students commented: “Surprisingly I am really enjoying statistics- never thought that would happen!” and “Given that I haven’t been in a maths based classroom for some 20 odd years, this unit did make me nervous but I have to say I have thoroughly enjoyed it”.

USS: An overall increase in the USS scores can be observed, with more students agreeing or strongly agreeing that they found the unit experience highly satisfactory. In the survey comments, students noted that “The level of organisation is perfect!” and “This unit, by far, has been the most organised and structured unit I have ever undertaken at this university and for that I am extremely grateful.”

Learning Outcomes: We see an overall increase in the results from students, with fewer students failing and more students scoring highly. The analysis suggests that the videos are having an effect, with those students who viewed more videos during the course of the semester gaining better end results. There are still students starting the unit and then at some point dropping out, or not completing the work, and this is an area we wish to explore further.

This final comment from a student sums up the findings nicely, “The Lectures and being able to access everything online, even all the tutorial material with the screencasts. It has been an enjoyable semester and has calmed me and many other students down, as we were worried about statistics.”


Chew, P. K. H., & Dillon, D. B. (2014). Statistics Anxiety Update: Refining the Construct and Recommendations for a New Research Agenda. Perspectives on Psychological Science, 9(2), 196–208. doi:10.1177/1745691613518077

Kay, R. H. (2012). Exploring the use of video podcasts in education: A comprehensive review of the literature. Computers in Human Behavior, 28, 820–831. doi:10.1016

Pan, W., & Tang, M. (2005). Students’ Perceptions on Factors of Statistics Anxiety and Instructional Strategies. Journal of Instructional Psychology, 32(3), 205.

Price, L., & Kirkwood, A. (2014). Using technology for teaching and learning in higher education: a critical review of the role of evidence in informing practice. Higher Education Research & Development, 33(3), 549–564. doi:10.1080/07294360.2013.841643

Vigil-Colet, A., Lorenzo-Seva, U., & Condon, L. (2008). Development and validation of the Statistical Anxiety Scale. Psicothema, 20(1), 174-180.

[1] USS: Unit Satisfaction Scale; GTS:  Good Teaching Scale; GSS: Generic Skills Scale; OSS: Overall Satisfaction Scale; SES: Student Experience Scale

[2] NC: Fail result based on failure to complete one or more of the assessment requirements for a unit; NX: Fail result based on failure to reach a pass grade in a unit having completed all the unit assessment requirements

©Karin Oerlemans and Janie Busby Grant