Revisiting my PhD thesis

I recently reread my PhD thesis (titled “Instructional interactions and online course effectiveness at a large Mexican organisation”), which I completed in 2014. I learned so much, far more than I could express in an academic register. And I wanted to share it… So I added a prologue and an epilogue. I don’t know if that is common practice; I don’t really think it is. I did it anyway. I used the space as an opportunity to express myself as a person, with passion, dreams and hopes, to move past the formalities of scientific language and to describe the fantastic learning journey of my PhD work.

Anyway, here they are…

Prologue

My interest in designing effective online courses to help organisations provide accessible and efficient training to their staff has been present since the beginning of my professional career. I focus on Mexico because it is my home country and I am aware of our limitations, compared to more developed countries. I acknowledge the potential of technology to facilitate learning. I believe that employees who have suitable development opportunities and feel well-prepared to do their jobs are happier and more productive. Contributing to this area is my way of creating a better world. I am passionate about education and learning technologies.

While the value of online learning is widely accepted, doubts remain in corporate settings, especially in developing countries such as Mexico. Some of these lingering questions relate to bad practices. I have personally experienced some awful “page-turner” online courses, which reduce learning to clicking through screens of text. I decided to make it my mission to find engaging ways of fostering online learning.

Since 2006 I have worked in the field of corporate e-learning. I started in a large Mexican company, where I was in charge of managing an online training system on the e-learning platform Moodle. I used to help managers develop the courses and support online students. I had no freedom to change the learning design: I was given a template, which had to be completed. Courses were text-based, with no embedded interactions with other people. Every tool that fostered collaboration was blocked due to security concerns. Communication with the teachers was infrequent and happened via email. I found the courses static, uninspiring and boring.

In 2009 I began working on my own as a learning consultant and realised that many organisations excluded social interactions from their online courses (e.g., Padilla Rodriguez & Armellini, 2013a; Padilla Rodriguez & Fernandez Cardenas, 2012). I found that most employees took it for granted that online courses were isolating experiences. In conversations with Human Resources staff, I have heard learning designers share their belief that virtual courses are mostly online reading materials, available for self-study and requiring no human support. Participants’ expectations of online courses seem to be tainted by these preconceptions.

On the other hand, I have also come across reports of online students stating that they learn more from other people and that sometimes online content cannot answer their specific questions. At organisations, usually only one or two people require any particular training at a time, so forming a group of students is unfeasible. I can understand that. Yet, I am intrigued. Are content-based online courses in organisations really effective? The organisations I worked with claimed they were. However, evaluations are often limited to satisfaction surveys and exam performance without an agreed baseline.

I am a psychologist and my master’s degree is in education and cognition. I am familiar with learning theories such as constructivism, which emphasises the role of peers and experienced others in learning. In the academic contexts I have been acquainted with, I have constantly heard of the value of teamwork and the sharing of experiences. Animated discussions with people can make any topic exciting. Are online courses that foster social interactions effective, then? The universities I worked with claimed they were. However, as in organisations, evaluations often lack a pre-post systematic approach.

The contrast between companies and academic institutions suddenly made sense when I found Anderson’s (2003a) interaction equivalency theorem and its thesis that learning can be supported as long as one of three types of interaction (learner–content, learner–teacher or learner–learner) is present at a high level. If it were supported by empirical evidence, it would explain why organisations and universities claim that their online courses are effective in spite of their differences. It could also lead the way for effective online learning designs. I decided to test this thesis. This research was conducted in a large Mexican organisation and aimed to explore the relationship between online interactions and course effectiveness.

 

Epilogue

When I started conversations with the participating organisation, I was focused on answering my research questions: creating groups of students, delivering different versions of the same course, evaluating interactions and effectiveness… I soon realised that if I wanted to be successful, I needed to do much more than what I had originally planned.

Helping the organisation acquire the necessary conditions to implement interactive online learning was a hard, exhausting job. The Moodle Features course, which I moderated, had over 150 students. Since it was the first online course delivered to sales supervisors at the organisation, I tried to reply to all messages. I wanted to challenge the misconception of online courses as isolating spaces. I wanted participants to know that they were not alone, that someone was reading their contributions. I had many 14-hour days.

My original plan was to redesign, develop and evaluate four online courses. I ended up assessing employees’ perceptions on e-learning, installing and configuring Moodle in the organisation’s server, negotiating expectations, training different stakeholders, creating seven courses and moderating three of them, evaluating interactions and effectiveness, presenting results to senior managers, and establishing the foundations for the project to continue beyond the completion of my thesis.

In this sense, my PhD work was a unique opportunity to systematically document the process of implementing online learning at a large Mexican organisation. While other academics have studied different topics in the area of corporate e-learning (e.g., Vaughan & MacVicar, 2004, studied employees’ pre-implementation attitudes towards e-learning; Gunawardena et al., 2010, defined predictors of learner satisfaction and transfer of learning in a corporate online education programme), few researchers have had the opportunity to work with so many different stakeholders at different stages of the implementation process. I did. Throughout the project, I worked with HR managers, HR staff, learning designers, IT staff, retailers, sales supervisors, and sales managers and directors.

My friendships within the organisation helped a lot in terms of access and support. I am grateful. I would not have been successful without their help. It was not easy. Yet, the experience I acquired, plus random expressions of enthusiasm, made it all worth it.

Lots of things have changed at the organisation. They now have a more interactive approach towards learning. They now have experience with different designs for online courses. They now know how to conduct thorough evaluations of course effectiveness, taking advantage of their existing resources and practices. They are now an example to follow for the parent corporation. They were pioneers in innovating, moving beyond the possibilities of face-to-face courses and online courses-in-a-box.

This project is only a beginning for the organisation. Learning designers are taking the lessons learned forward to improve their educational offer. It will take them a while to create an efficient learning culture and to truly empower employees to be in charge of their own development. It is a process, which will be full of challenges, but the first step is taken. I am happy to have been part of this.

While much remains to be researched, I have moved forward in my mission of helping employees feel better prepared to do their jobs and thus, creating a better world. I have also satisfied my personal curiosity and answered questions I have had for a long time. Are content-based online courses in organisations effective? Yes! Are online courses that foster social interactions effective? Yes! Anderson’s (2003a) interaction equivalency theorem seems to hold true if expanded to include course delivery. Both kinds of courses can be effective. The key is not in the type of interactions embedded in the course but in how these interactions take place.

I have learned more than what I can document in a formal, systematic way. I now feel confident enough to help organisations in the implementation of e-learning, from scratch. I can now foresee potential problems that may arise at different stages. I have enough experience to make decisions on the spot. I have a better understanding of how to design, implement and evaluate effective online courses. I know how different stakeholders can be crucial in the success or failure of corporate e-learning. I have evidence to justify the value of online courses to managers and other practitioners. I appreciate how online courses can change people’s attitudes towards learning and how technologies can empower employees.

It was a rewarding journey.


Are MOOCs pedagogically innovative?

While claims about pedagogic innovation in Massive Open Online Courses (MOOCs) are common (e.g., Grainger, 2013; MOOCs@Edinburgh Group, 2013; Sharples et al., 2014; University of British Columbia, 2014), most reports provide no evidence to justify those claims. I recently published a paper with my colleague (and former supervisor) Alejandro Armellini on this topic. We report on a survey aimed at exploring how different stakeholders describe MOOCs, focusing on whether they would consider them pedagogically innovative, and if so, why. Respondents (n = 106) described MOOCs primarily as free, openly accessible online courses that attract large numbers of participants. Views on pedagogic innovation fell into three categories:

1) MOOCs are pedagogically innovative (15.1%). Explanations referred to massiveness, openness and connectivism. None of the participants offered a clear definition of or criteria for pedagogic innovation.

2) MOOCs are not pedagogically innovative (84.9%). More than half of the respondents added an unsolicited opinion, including strong criticisms of MOOCs.

3) MOOCs may or may not be pedagogically innovative. Umbrella terms (such as “pedagogically innovative”) were seen as unhelpful to describe all types of MOOCs.

The evidence suggests that caution should be exercised when characterising MOOCs as pedagogically innovative. Concrete practices within specific MOOCs may constitute pedagogic innovations in the areas of course design and delivery, such as the student-created assignment bank and the radio station of the cMOOC DS106. However, these practices are not exclusive to MOOCs: they can (and do) occur in any course, across different modes of study. The MOOC environment did not prompt them.

MOOCs provide good examples of technological innovation but also of highly debatable approaches to pedagogy. They may be deemed valuable as massive open online resources (MOORs), but far less so in terms of being pedagogically innovative courses. We should be cautious about applying blanket terms to describe different types of MOOCs. However, based on the evidence currently available and contrary to what many reports claim, MOOCs cannot be described as inherently pedagogically innovative.

The full paper is available at: http://www.ncolr.org/jiol/issues/pdf/14.1.2.pdf

References

Grainger, B. (2013). Massive Open Online Course (MOOC) Report. London, UK: University of London. Retrieved from http://www.londoninternational.ac.uk/sites/default/files/documents/mooc_report-2013.pdf

MOOCs@Edinburgh Group. (2013). MOOCs @ Edinburgh 2013: Report #1. Retrieved from https://www.era.lib.ed.ac.uk/bitstream/1842/6683/1/Edinburgh_MOOCs_Report2013_no1.pdf

Sharples, M., Adams, A., Ferguson, R., Gaved, M., McAndrew, P., Rienties, B., … Whitelock, D. (2014). Innovating Pedagogy 2014: Open University Innovation Report 3. Milton Keynes, UK: The Open University.

University of British Columbia. (2014). UBC MOOC Pilot: Design and delivery overview. Retrieved from https://circle.ubc.ca/bitstream/handle/2429/51200/UBC_MOOC_Pilot_Report.pdf?sequence=1

MOOCs behind the scenes

Since Massive Open Online Courses (MOOCs) emerged in 2008, they’ve made a lot of noise. They have intrigued, challenged and fascinated (some of) us. However, the literature on them is still limited and mostly focused on the learners, with significantly less focus on the institutional perspective (see Liyanagunawardena, Adams, & Williams, 2013).

Today I presented a paper, co-authored with Alejandro Armellini and Viviana Caceres, at Global Learn 2016 on how MOOCs are conceptualised, designed, developed, implemented and evaluated: MOOCs behind the scenes. The discussion that followed was enriching. There are many questions: Where is the institutional, overarching pedagogical strategy that guides MOOC-related decisions? Why are MOOCs failing to benefit from OERs? How can we assess the effectiveness of a MOOC when participants join for all sorts of reasons?

My presentation is available on SlideShare under a CC licence. Feel free to share, and to contact me if you have any questions or comments.


 

Reference

Liyanagunawardena, T., Adams, A., & Williams, S. (2013). MOOCs: A systematic study of the published literature 2008-2012. The International Review of Research in Open and Distributed Learning, 14(3), 202-227. Retrieved from http://www.irrodl.org/index.php/irrodl/article/view/1455/2531

Study Skills MOOC

As part of my post-doc research at UANL, I am studying a Study Skills MOOC, inspired by and developed in partnership with the University of Northampton. The aim is to help students improve their self-efficacy beliefs and develop key study skills:

  1. Managing time
  2. Taking effective notes
  3. Finding reliable information
  4. Citing using APA format
  5. Understanding academic texts
  6. Academic writing

This course is available on Open Education Blackboard, or via the direct link: http://tinyurl.com/hemooc1. It will be delivered in Spanish.

On Twitter we are using the hashtag #hemooc (mention @BrendaPadilla to obtain a direct response).

Joining is free! We are starting on May 2, 2016. We look forward to this experience.

 

MOOCs on Learning and Course Design

Along with Prof Gráinne Conole and Dr Paul Rudman, on 29 February 2016 I will deliver two massive open online courses (MOOCs) on the European platform EMMA:

  • Designing Online Courses with the 7Cs Framework. The 7Cs represent a powerful seven-stage process that helps educators and designers create effective courses using digital technologies. In this course, students will learn collaboratively with other students and from experts in the field. Over six weeks, they will work on the design (or redesign!) of a technology-enhanced course.
  • 21st Century Learning. Online technologies, social media and open education have made learning ubiquitous. This course provides an introduction to 21st century learning practices. Students will have a chance to evaluate their digital literacies, create their own personal learning environment, find open educational resources, explore virtual worlds and more!

These courses have no cost. They are aimed at academics, teachers and people interested in education, in general. Please feel free to join!

The nightmare of attributing OERs

Recently, I worked with Gabi Witthaus in the development and delivery of the Storyboarding for Learning Design Open Online Course (OOC), with Jeff Stanford’s support as a facilitator. We designed the course drawing from our experience in topic-related workshops. We used open educational resources (OERs), which are materials released under a licence that permits their free use and re-purposing. Some, we had developed ourselves. Others were created by other academics. Attributing everything correctly was a nightmare.

As I was working on the Storyboarding OOC, I was also working on the redesign of a MOOC on Learning Design, to be delivered on EMMA. I was simultaneously creating activities and materials for both courses. While each course is unique, the content is similar. Eventually I had to ask myself: Should I say that the EMMA MOOC is based on the OOC, or the OOC on the EMMA MOOC? Should I attribute my own work? Is a note at the beginning of the OOC, stating that I am a co-creator of the materials, enough for people to understand that everything that does not state otherwise was created by me? How important is it to be that specific?

For both courses, I used OERs I have used in the past. I decided to check if my previous attributions were correct. I discovered that some OERs were widely used, but it was unclear who the original author was. For example: I found an OER used by A, who attributed the OER to B. However, I could also find the same OER attributed to C, and to D, and to E. So who was the actual creator? Sometimes, when I find an OER without an attribution, I assume the author of whatever I am looking at is also the author of the OER (for example: imagine an image within a text; if the image has no attribution, I assume the author of the text is also the creator of the image). A number of people seem to share my assumption. What if the assumption is incorrect? What if the author simply did not write the correct attribution? Who to attribute then? If both B and C use an OER without attributing it, how do you know who is the actual author?

And this is just the beginning. One of the activities is about how to ruin a course. A similar activity was developed by Rebecca Galley and the OULDI team at the Open University UK in 2009. However, our version of this activity is very different. The idea behind it is the same, but everything else is different. Should we attribute it? In other words, how much does an OER need to be modified before it doesn’t really make sense to attribute it? For the OOC, we decided to include a note stating that our activity was inspired by the other OER.

But sometimes it is trickier than that. Have you ever had an “original” idea that, you later found out, was published before by someone else? I have created activities that I later find as OERs elsewhere. (A simple example: think about a welcome activity where you ask participants to introduce themselves. Lots of people have created a similar activity.) What do you do about it in terms of attribution? Is it really your work? What if you had seen that OER before but didn’t remember?

Sometimes I look for several OERs on the same topic and I create my own resource based on them. Should I attribute all the OERs? None at all? Only the one that had the most influence on my resource? What if the lines are not clear cut?

Finally, what if I want to use an OER that was inspired by another one? (The Storyboarding OOC’s activity on how to ruin a course is a great example of this). Should I attribute both the creator of the OER and the inspirators?

Attributing OERs gives me a headache.

OER Attribution Problems

  1. Creating similar materials for two courses and having to decide if one is based on the other one.
  2. Finding the original author of widely used OERs with unclear attributions.
  3. Deciding whether being inspired by an OER is enough to require an attribution, or how much an OER needs to be modified before you can no longer say it is based on someone else’s work.
  4. Attributing a resource based on several OERs.

I don’t have answers for all my questions. But I’ve come up with certain ideas that help. Using full OERs (not trying to build upon several of them or change them considerably) prevents lots of problems. Also, favouring public domain resources (thank you, all who let everyone use your work without minding the attribution) works, especially when you want to build upon materials. Attribution builders available online help you establish a standard when attributing and identify the required data. They also provide guidance on how to attribute derivative works (thank you, Gabi, for the info!).
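The attribution builders mentioned above generally follow Creative Commons’ TASL convention (Title, Author, Source, Licence). As a minimal sketch of what a consistent attribution looks like, a small helper such as the following (with entirely hypothetical titles, names and URLs) can keep the format standard across a course:

```python
def tasl_attribution(title: str, author: str, source_url: str, licence: str) -> str:
    """Build an attribution string following the TASL convention
    (Title, Author, Source, Licence) recommended by Creative Commons.
    All argument values passed in are illustrative, not real OERs."""
    return f'"{title}" by {author}, {source_url}, licensed under {licence}'


# Hypothetical example of a consistent attribution line:
print(tasl_attribution(
    "Storyboard template",          # Title
    "Jane Doe",                     # Author
    "https://example.org/storyboard",  # Source
    "CC BY 4.0",                    # Licence
))
```

Keeping every attribution in this one shape also makes it easier to flag derivative works: a note such as “adapted from…” or “inspired by…” can simply be appended to the standard line.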

If you have any other answers or suggestions, please share! I would be more than happy to learn.

MOOCs reloaded: Redesigning two MOOCs in EMMA

[Image: Neo from the movie The Matrix, courtesy of Sudhee | Flickr]

 

In October 2014 I was part of a team at the University of Leicester working on the EMMA (European Multiple MOOC Aggregator) platform. We ran a pilot of two massive open online courses (MOOCs): Learning Design and Technology-Enhanced Learning. We had over 60 participants in each of these courses. It was an interesting experience that has enabled us to improve our MOOCs.

 

We are currently getting ready for the rerun in May 2015. I am in charge of the redesign of the MOOCs. Here is what we have been working on:

1. Listening to the evidence: During the first pilot of the MOOCs, we asked participants to use blogs as a way of documenting their learning. We noticed that most of them did not use this tool. We have modified our activities to make the use of blogs optional.

2. Obtaining participants’ feedback: We have added optional entry and exit surveys that will let us know more about participants’ background, motivation and expectations, and their perceptions of the MOOCs. The answers will inform future, research-based improvements.

3. Facilitating time management: Many learners struggle to balance their life and work commitments with their online studies. We are trying to help by adapting units in our MOOCs’ lessons so they can be completed in approximately 30 minutes each.

4. Increasing accessibility: We have increased the variety of formats in which we present the information, trying to provide a suitable option for learners with different needs. For example, we have included text alternatives to describe relevant images.

5. Clarifying attribution: We try to give credit to the people who created the open educational resources (OERs) we are using. We have clarified attributions where required. Images without an attribution belong to the public domain.

6. Making materials learner-friendly: MOOCs may attract participants with all sorts of educational backgrounds and skills. Finding the ‘right level’ is a challenge. What some participants might consider ‘too easy’ might be ‘too hard’ for others. We are trying to find the right balance. We have added new examples and explanations. We are aiming to make the courses easier to follow and more learner-friendly than before.

7. Fostering learners’ engagement: Most of the activities in our MOOCs follow the e-tivity framework described by Prof Gilly Salmon. This approach recommends the use of ‘sparks’: resources, such as images and videos, that generate interest in the topic of the activity. We have changed some of our sparks to make them more interesting, engaging and relevant than before.

8. Improving communication with learners: For the rerun in May we will increase our participation in communication channels outside EMMA, such as Twitter. We have selected useful resources and questions that foster reflection, which we will share with participants throughout the delivery of the MOOCs.


I look forward to the following stages of these MOOCs!