eLearning

A special issue of Future Internet (ISSN 1999-5903).

Deadline for manuscript submissions: closed (31 August 2015) | Viewed by 35196

Special Issue Editors


Prof. Yianna Vovides
Guest Editor
Communication, Culture, and Technology (CCT), Georgetown University, Washington, D.C., USA

Prof. Liz Bacon
Guest Editor
Deputy Pro Vice-Chancellor (Technology Enhanced Learning), Faculty of Architecture, Computing and Humanities, University of Greenwich, Old Royal Naval College, London SE10 9LS, United Kingdom

Special Issue Information

Dear Colleagues,

Tremendous growth in eLearning over the last few years, especially through the popularization of Massive Open Online Courses (MOOCs) and Microlearning, prompts us to ask: “What learning is occurring within the online space, and what are its outcomes?” Understanding what type of learning results from these efforts is still a major challenge. Many eLearning systems in use today have limited dashboard capabilities, often going no further than basic course management components. Given the scale of recent eLearning efforts, however, dashboards, defined here as sophisticated monitoring systems, are a necessity for all participants (students, instructors, and administrators), because they can enable us to make sense of the learning that occurs within the virtual space. What, then, are the issues surrounding the development of dashboards to visualize learning processes and outcomes? We posit that the major obstacles are no longer the technical know-how; the expertise to develop eLearning dashboards is relatively easy to identify and procure. Rather, the challenge lies in conceptualizing and designing dashboards that provide meaningful analytics and engage participants, including instructors and administrators, in making connections to what is being learned.

Within this context, this Future Internet Special Issue on eLearning aims to explore the topic of dashboards. We therefore invite researchers to submit manuscripts that focus on eLearning dashboards.

Some may approach it from a learning dashboard perspective that targets learners, focusing on questions such as “What should be on a learning dashboard that is meaningful to learners and would encourage repeat engagement and reflection about what is being learned?”

Some may approach it from a teaching dashboard perspective, asking questions about how to support learners who are at risk, or those who may need more challenging learning opportunities. For example, “What information do we want to know about learners to help us predict dropout?” and “When would such information be most useful?”
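
One minimal sketch of how such a dropout prediction might look follows, assuming engagement features (login frequency, videos watched, forum posts, quiz scores), a logistic-regression model, and a 0.7 risk threshold; all of these are illustrative choices, not a prescribed design.

```python
# Illustrative sketch: flagging at-risk learners for a teaching dashboard.
# The feature set, model choice, and threshold are assumptions for this example.
from dataclasses import dataclass

from sklearn.linear_model import LogisticRegression


@dataclass
class LearnerActivity:
    logins_per_week: float  # how often the learner signs in
    videos_watched: int     # lecture videos completed
    forum_posts: int        # discussion contributions
    quiz_average: float     # mean quiz score in [0, 1]


def to_features(a: LearnerActivity) -> list[float]:
    return [a.logins_per_week, a.videos_watched, a.forum_posts, a.quiz_average]


def fit_dropout_model(history: list[LearnerActivity],
                      dropped_out: list[int]) -> LogisticRegression:
    # Train on past cohorts: dropped_out[i] is 1 if learner i did not finish.
    model = LogisticRegression()
    model.fit([to_features(a) for a in history], dropped_out)
    return model


def at_risk(model: LogisticRegression, a: LearnerActivity,
            threshold: float = 0.7) -> bool:
    # Surface the learner on the dashboard when predicted dropout risk is high.
    return model.predict_proba([to_features(a)])[0][1] >= threshold
```

A dashboard built on such a model could surface not only the flag but also the features driving it, so an instructor can see why a learner appears to be at risk.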

Others may choose to approach it from an institutional perspective, with a focus on capturing learning experiences over time and asking questions such as “What types of programs would best engage our learners further?”, “What information do we need to capture to generate meaningful insight into how our students learn?”, “How do we analyze it intelligently using techniques from, e.g., AI and Big Data?”, and “What are the overall curricular outcomes and competencies?”
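
At the institutional level, many of these questions reduce to aggregating per-student learning events into program-level metrics. The sketch below shows one such aggregation with hypothetical column names; it is a minimal illustration of the idea, not a reference design.

```python
# Minimal sketch: roll per-student learning events up into program-level
# engagement metrics. All column names here are hypothetical.
import pandas as pd

events = pd.DataFrame({
    "student_id":     [1, 1, 2, 2, 3],
    "program":        ["DataSci", "DataSci", "DataSci", "History", "History"],
    "minutes_active": [30, 45, 10, 60, 20],
    "completed_unit": [True, True, False, True, False],
})

# One row per program: how many learners, how active they are, and how often
# they complete units.
summary = events.groupby("program").agg(
    learners=("student_id", "nunique"),
    avg_minutes=("minutes_active", "mean"),
    completion_rate=("completed_unit", "mean"),
)
print(summary)
```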

We also welcome manuscripts that address the broader social and ethical dimensions of what data are captured and how they are used to populate a dashboard.

Prof. Yianna Vovides
Prof. Liz Bacon
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, as well as short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Future Internet is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • design
  • dashboards
  • elearning
  • analytics
  • online learning
  • eportfolios
  • engagement
  • deep learning
  • monitoring
  • big data
  • artificial intelligence

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (5 papers)

Research

Article
Elusive Learning—Using Learning Analytics to Support Reflective Sensemaking of Ill-Structured Ethical Problems: A Learner-Managed Dashboard Solution
by Yianna Vovides and Sarah Inman
Future Internet 2016, 8(2), 26; https://doi.org/10.3390/fi8020026 - 11 Jun 2016
Cited by 4 | Viewed by 7531
Abstract
Since the turn of the 21st century, we have seen a surge of studies on the state of U.S. education addressing issues such as cost, graduation rates, retention, achievement, engagement, and curricular outcomes. There is an expectation that graduates should be able to enter the workplace equipped to take on complex and “messy” or ill-structured problems as part of their professional and everyday life. In the context of online learning, we have identified two key issues that are elusive (hard to capture and make visible): learning with ill-structured problems and the interaction of social and individual learning. We believe that the intersection between learning and analytics has the potential, in the long-term, to minimize the elusiveness of deep learning. A proposed analytics model is described in this article that is meant to capture and also support further development of a learner’s reflective sensemaking.

Article
Improving Teacher Effectiveness: Designing Better Assessment Tools in Learning Management Systems
by Dov Kruger, Sarah Inman, Zhiyu Ding, Yijin Kang, Poornima Kuna, Yujie Liu, Xiakun Lu, Stephen Oro and Yingzhu Wang
Future Internet 2015, 7(4), 484-499; https://doi.org/10.3390/fi7040484 - 18 Dec 2015
Cited by 9 | Viewed by 9089
Abstract
Current-generation assessment tools used in K-12 and post-secondary education are limited in the type of questions they support; this limitation makes it difficult for instructors to navigate their assessment engines. Furthermore, the question types tend to score low on Bloom’s Taxonomy. Dedicated learning management systems (LMS) such as Blackboard, Moodle and Canvas are somewhat better than informal tools as they offer more question types and some randomization. Still, question types in all the major LMS assessment engines are limited. Additionally, LMSs place a heavy burden on teachers to generate online assessments. In this study we analyzed the top three LMS providers to identify inefficiencies. These inefficiencies in LMS design point us to ways to ask better questions. Our findings show that teachers have not adopted current tools because they do not offer definitive improvements in productivity. Therefore, we developed LiquiZ, a design for a next-generation assessment engine that reduces user effort and provides more advanced question types that allow teachers to ask questions that can currently only be asked in one-on-one demonstration. The initial LiquiZ project is targeted toward STEM subjects, so the question types are particularly advantageous in math or science subjects.
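
To illustrate the kind of parameterized randomization the abstract mentions, here is a minimal sketch: every student sees the same question template with different numbers, so answers cannot be shared verbatim. The template, seeding scheme, and grading tolerance are our own assumptions, not the LiquiZ design.

```python
# Sketch of a parameterized (randomized) STEM question; illustrative only.
import random


def make_question(seed: int) -> tuple[str, float]:
    # A per-student seed yields a reproducible variant of the same template.
    rng = random.Random(seed)
    mass = rng.randint(2, 10)   # kg
    accel = rng.randint(2, 5)   # m/s^2
    prompt = (f"A {mass} kg object accelerates at {accel} m/s^2. "
              f"What net force acts on it, in newtons?")
    return prompt, float(mass * accel)


def grade(seed: int, submitted: float, tolerance: float = 1e-6) -> bool:
    # Regenerate the student's variant from the seed and compare to the key.
    _, answer = make_question(seed)
    return abs(submitted - answer) <= tolerance


prompt, _ = make_question(seed=42)
print(prompt)  # each seed produces a different but equivalent question
```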

Article
Output from Statistical Predictive Models as Input to eLearning Dashboards
by Marlene A. Smith
Future Internet 2015, 7(2), 170-183; https://doi.org/10.3390/fi7020170 - 2 Jun 2015
Cited by 3 | Viewed by 6407
Abstract
We describe how statistical predictive models might play an expanded role in educational analytics by giving students automated, real-time information about what their current performance means for eventual success in eLearning environments. We discuss how an online messaging system might tailor information to individual students using predictive analytics. The proposed system would be data-driven and quantitative; e.g., a message might furnish the probability that a student will successfully complete the certificate requirements of a massive open online course. Repeated messages would prod underperforming students and alert instructors to those in need of intervention. Administrators responsible for accreditation or outcomes assessment would have ready documentation of learning outcomes and actions taken to address unsatisfactory student performance. The article’s brief introduction to statistical predictive models sets the stage for a description of the messaging system. Resources and methods needed to develop and implement the system are discussed.
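
As a rough sketch of the messaging idea described above, the function below turns a model's estimated completion probability into a tailored status message; the wording, the 0.5 cutoff, and the probability input are illustrative assumptions, not the authors' implementation.

```python
# Sketch: turn predictive-model output into a student-facing message.
# Message wording and the 0.5 cutoff are assumptions for illustration.

def completion_message(student: str, p_complete: float) -> str:
    # p_complete: model-estimated probability of finishing the certificate.
    pct = round(100 * p_complete)
    if p_complete < 0.5:
        return (f"{student}: your estimated chance of completing the "
                f"certificate is {pct}%. Consider revisiting this week's "
                f"materials or contacting your instructor.")
    return f"{student}: you are on track ({pct}% estimated chance of completing)."


print(completion_message("Student A", 0.32))
```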

Article
Quantitative Analysis of the Usage of a Pedagogical Tool Combining Questions Listed as Learning Objectives and Answers Provided as Online Videos
by Odette Laneuville and Dorota Sikora
Future Internet 2015, 7(2), 140-151; https://doi.org/10.3390/fi7020140 - 15 May 2015
Cited by 3 | Viewed by 5484
Abstract
To improve the learning of basic concepts in molecular biology of an undergraduate science class, a pedagogical tool was developed, consisting of learning objectives listed at the end of each lecture and answers to those objectives made available as videos online. The aim of this study was to determine if the pedagogical tool was used by students as instructed, and to explore students’ perception of its usefulness. A combination of quantitative survey data and measures of online viewing was used to evaluate the usage of the pedagogical practice. A total of 77 short videos linked to 11 lectures were made available to 71 students, and 64 completed the survey. Using online tracking tools, a total of 7046 views were recorded. Survey data indicated that most students (73.4%) accessed all videos, and the majority (98.4%) found the videos to be useful in assisting their learning. Interestingly, approximately half of the students (53.1%) always or most of the time used the pedagogical tool as recommended, and consistently answered the learning objectives before watching the videos. While the proposed pedagogical tool was used by the majority of students outside the classroom, only half used it as recommended, limiting the impact on students’ involvement in the learning of the material presented in class.

Other

Project Report
Utilizing the ECHO Model in the Veterans Health Affairs System: Guidelines for Setup, Operations and Preliminary Findings
by Herschel Knapp and Sanjog Pangarkar
Future Internet 2015, 7(2), 184-195; https://doi.org/10.3390/fi7020184 - 8 Jun 2015
Cited by 3 | Viewed by 5937
Abstract
Background: In 2011, the Veterans Health Administration (VHA) consulted with the Project ECHO (Extension for Community Healthcare Outcomes) team at the University of New Mexico, Albuquerque, to reproduce their successful model within the VHA. Methods: The VHA launched SCAN-ECHO (Specialty Care Access Network-Extension for Community Healthcare Outcomes), a multisite videoconferencing system to conduct live clinical consultations between specialists at a VHA Medical Center (hospital) and primary care providers stationed at satellite VHA CBOCs (Community-Based Outpatient Clinics). Results: Analysis of the first three years rendered a mean attendee satisfaction of 89.53% and a consultation satisfaction score of 88.10%. About half of the SCAN-ECHO consultations resulted in patients receiving their treatment from their local primary care providers; the remaining half were referred to the VHA Medical Center when the treatment involved equipment or services not available at the CBOCs (e.g., MRI, surgery). Conclusion: This paper details the setup, operation logistics and preliminary findings, suggesting that SCAN-ECHO is a viable model for providing quality specialty clinical consultation service, more prompt access to care, reduced commutes and continuing education. Additionally, the use of a secured Internet-based videoconferencing system that supports connectivity to multiple (mobile) devices could expand the utilization of this service.
