Open Education: Improving student retention while dealing with a truly diverse audience

McCallum, Tim (2014) Open Education: Improving student retention while dealing with a truly diverse audience. In: Annual Online Learning Consortium International Conference on Online Learning, 29-31 Oct 2014, Orlando, FL, USA.

Text (Accepted Version): McCallum_Sloan_2014b_AV.pdf (125Kb)

Abstract

This presentation showcases freely and readily available online tools, including live demonstrations. It walks through the analysis, evaluation and optimal use of available technologies to improve student retention.

Student attrition is a major concern when it comes to the philanthropic odyssey of providing open educational resources over the web. Students are able to engage and disengage without financial or academic penalty and do so regularly without warning.

Formal academic credits towards credible degree programs are now being offered as part of online learning. The University of Southern Queensland (USQ) is committed to the OER universitas (OERu) http://oeru.org/ as a partner. Partners are accredited institutions in their respective national, provincial or state jurisdictions and are therefore able to provide formal academic credit towards credible degrees in Africa, Asia, Oceania and North America (Wikieducator.org, 2014).

We are now talking about a more enduring commitment from students, and we are dealing with a truly diverse audience. So how do we ensure that we are meeting the needs of these students who are engaging for full credit?
This presentation demonstrates how we can improve student retention through:
- formative and summative evaluation, using the CIPP model for evaluation
- analysis of seemingly esoteric data, using free tools like Google Analytics
- the optimal use of available technologies.
The CIPP model for evaluation (Stufflebeam & Shinkfield, 2007) provides four evaluation types: context, input, process and product. The first evaluation type, context, assesses the user's needs and identifies problems within a defined environment; let's look closer at this.

Suppose we survey students at the start of a course, and a number of them identify their intention to gain formal academic credit for it. If we have a Key Performance Indicator (KPI) relating to this metric, we can make it part of the CIPP model for evaluation. Where this effort does not succeed, we can begin to ask the question, why?
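As a minimal sketch of this context evaluation (the student identifiers and figures below are hypothetical, not taken from the presentation), such a KPI can be computed by joining start-of-course survey responses against end-of-course completion records:

```python
# Hypothetical survey records: student id -> intends to seek formal credit?
survey = {"s1": True, "s2": True, "s3": False, "s4": True}

# Hypothetical completion records gathered at the end of the course.
completed = {"s1", "s3"}

# Students who stated an intention to gain formal academic credit.
credit_seekers = [s for s, intends in survey.items() if intends]

# Of those, the students who actually completed the course.
completed_seekers = [s for s in credit_seekers if s in completed]

# KPI: completion rate among students who intended to gain formal credit.
kpi = len(completed_seekers) / len(credit_seekers)
print(f"Credit-seeker completion rate: {kpi:.0%}")
```

A rate well below the KPI target is the trigger, under the context evaluation, to start asking why.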
Empirical data combined with meaningful survey data provides powerful information. Free online tools such as Google Analytics provide a breakdown of web site traffic, segmented by many metrics including technology, behaviour and geo-location. This data can be further broken down to reveal mobile device branding, service provider, operating system, web browser and screen resolution. Web analytics also provide Behavior Flow reports, a visual representation of the path users take from one web page to the next on your site.
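The kind of segmentation described above can be mimicked over an exported traffic log; this sketch assumes a hypothetical list of visit records (the field names are illustrative, not Google Analytics' own dimensions):

```python
from collections import Counter

# Hypothetical visit records, as might be exported from a web analytics tool.
visits = [
    {"device": "mobile", "brand": "Apple", "browser": "Safari"},
    {"device": "mobile", "brand": "Samsung", "browser": "Chrome"},
    {"device": "mobile", "brand": "Apple", "browser": "Safari"},
    {"device": "desktop", "brand": "(none)", "browser": "Firefox"},
]

# Segment traffic by device category...
by_device = Counter(v["device"] for v in visits)

# ...then break the mobile segment down by device brand.
mobile_brands = Counter(v["brand"] for v in visits if v["device"] == "mobile")

print(by_device)                      # mobile vs desktop share
print(mobile_brands.most_common(1))   # the leading mobile brand
```

The same two-step drill-down (segment, then sub-segment) applies to any of the dimensions mentioned above, such as browser or screen resolution.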
Suppose we learn from the analytics that mobile usage of our site is increasing and, after delving deeper, we discover a clear leader among the mobile device brands. If our site does not respond to a user's behaviour and environment, we make it very difficult for them to continue. Thankfully there are ways to make our site responsive, and we will cover these a little later.

Behavior Flow reports allow us to analyse user engagement and ascertain whether users are following the intended path through our site. If, after gaining this insight into user engagement, we are not completely happy with the results, we can look into content, pedagogy and various technical issues. Further investigation may reveal that failure to complete an assessment piece is not due to waning interest, as originally assumed, but is in fact a usability issue with the navigation of the site.
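A Behavior Flow style drop-off check can be approximated from raw page-path data; this sketch assumes hypothetical per-user page sequences and an intended path through the site (page names are made up for illustration):

```python
# The intended path users should follow through the site.
intended = ["home", "module", "assessment"]

# Hypothetical per-user navigation paths.
paths = [
    ["home", "module", "assessment"],
    ["home", "module"],  # dropped off before the assessment page
    ["home", "module"],  # dropped off before the assessment page
]

# Count how many users reach each step of the intended path, in order.
reached = []
for step_index, page in enumerate(intended):
    prefix = intended[:step_index + 1]
    n = sum(1 for p in paths if p[:step_index + 1] == prefix)
    reached.append((page, n))

for page, n in reached:
    print(f"{page}: {n}/{len(paths)} users")
```

A sharp drop at one step, as between "module" and "assessment" here, is the signal to investigate that page for usability or technical problems rather than assuming waning interest.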

Other reasons for student attrition may include the inability to stream media, having limited bandwidth, the inability to print material for offline viewing, or perhaps even something as simple as browser incompatibility.

Combining evaluation and analysis in this way allows us to accurately determine whether we are moving closer to or further from our goals. This streetwise approach also provides enough flexibility to make incremental changes along the way, and these tools are invaluable in that ongoing process. This information is relevant to all learning and teaching platforms and all end-user devices.


Item Type: Conference or Workshop Item (Commonwealth Reporting Category E) (Poster)
Refereed: Yes
Item Status: Live Archive
Faculty / Department / School: Current - Division of Academic Services - No Department
Date Deposited: 24 Mar 2015 20:31
Last Modified: 17 Jul 2017 04:46
Fields of Research: 13 Education > 1399 Other Education > 139999 Education not elsewhere classified
URI: http://eprints.usq.edu.au/id/eprint/26933
