Gesture-Based eLearning Systems

The growing popularity of gesture-based human interfaces adds another dimension to eLearning. This paper discusses gesture-based systems and learning, including their educational challenges and benefits. The technology is then parsed, illustrating embodied cognition theory as it relates to gesture-based systems and eLearning. Information on how the technology works is offered, along with examples of gesture-based eLearning systems, critical reflections, and recommendations.

The Educational Challenge

The educational challenge of gesture-based eLearning encompasses both the technology that is used and its impact on learning. Mobile applications, touch technologies, and gesture-based interfaces are now mainstream. What, then, is the challenge of incorporating these technologies into eLearning systems? As the studies below suggest, gesture-based systems enhance learning and can extend the benefits and effectiveness of eLearning into another dimension.

Educational strategies have evolved from traditional, teacher-centered didactic methodologies to authentic and transformative methodologies that add value to the educational experience and its outcomes. Digital media and online classrooms have further advanced the educational process, placing a wealth of information and knowledge at one’s fingertips. Classroom diversity can be addressed on the online platform, enhancing the experience for all learning styles.

Learning throughout life engages all of one’s senses. With the evolution of online systems, we see a transformation from simple digital versions of printed material, which treat learning as passive consumption, toward active knowledge production for students and teachers alike. Can all of the senses be represented in digital media and made part of the learning experience as well? That depends on the technologies available to represent and simulate them. Gesture-based human interfaces are the next wave of such technologies, currently under intensive research and development.

Gesture-based systems may advance to the point where all the ways human beings learn can be incorporated and merged into educational frameworks that offer the best characteristics of each: face-to-face, online, virtual, and lifelong learning.

Parse the Technology

[Image: Top 10 Trends in Education Technology for 2016 (Mohammed, 2015)]

Because mobile devices and gesture-based human interaction are popular and part of mainstream computing, it is crucial that eLearning embrace them for maximum benefit.

[Image: Embodied cognition (Dijk, 2013)]

 

Embodied Cognition Theory

Embodied cognition theory proposes that the body shapes the mind. According to Ozcelik and Sengal (2012), in their study titled, Gesture-Based Interaction for Learning: Time to Make the Dream a Reality, “The embodied cognition theory suggests that the body shapes the mind (Anderson, 2003). For instance, pointing by hands reduces the cost associated with maintaining information in our working memory (Ballard, Hayhoe, Pook & Rao, 1997). Memory for action events (eg, knocking on the table) is better when the events are performed by the subjects than when they are read or heard (Cohen, 1983). Engelkamp and Zimmer (1989) stated that the motor system is responsible for this effect since memory for self-performed events is better than that of experimenter performed events and imagined events. To support this proposal, a functional magnetic resonance imaging study demonstrates that the pre-motor cortex is activated when words are encoded through gestures (Macedonia, Müller & Friederici, 2010).”

According to Samuel McNerney (2011), in his article titled, A Brief Guide to Embodied Cognition: Why You Are Not Your Brain, “Embodied cognition has a relatively short history. Its intellectual roots date back to early 20th century philosophers Martin Heidegger, Maurice Merleau-Ponty and John Dewey and it has only been studied empirically in the last few decades. One of the key figures to empirically study embodiment is University of California at Berkeley professor George Lakoff.”

Ozcelik and Sengal (2012) reveal that “Research studies have shown that individuals who make hand gestures learn better than the ones who do not (eg, Alibali & Goldin-Meadow, 1993; Broaders, Cook, Mitchell & Goldin-Meadow, 2007). For instance, when children use their hands while they are explaining how they solve mathematical equivalence problems (eg, 6 + 4 +5 = _ + 5), they perform better in post-tests (Broaders et al, 2007). Comprehension and, consequently, memory are improved by these acts (Stevanoni & Salmon, 2005). In addition, gestures facilitate deep and long-lasting learning (Cutica & Bucciarelli, 2008). “Several proposals have been put forward to explain why gestures enhance learning. To begin with, Beilock and Goldin-Meadow (2010) have suggested that including motor actions into mental representations is responsible for performance improvements. Learners may produce integrated and richer representations when they describe a procedure with gestures (Alibali & GoldinMeadow, 1993; McNeill, 1992). By means of gestures, people can externalize their thoughts (Clark, 1999), and as a result, more cognitive resources become available for learning from the limited resources of the mind (Goldin-Meadow & Wagner, 2005).”

For teachers and students, the more intuitive a system is, the more effective and enjoyable it will be to use and learn from. The more closely the interface matches our daily interactions and movements, the more benefits it can offer. Moreover, because the digital world is such a large part of our daily lives, bringing it into the classroom adds a natural dimension to performance, progression, and learning.

While gesture-based systems offer clear educational benefits for differentiated instruction, diverse learning styles, and multiple intelligences, I believe this technology will reach far beyond such specific domains.

Considerations and Focus for Design

[Image: Gesture-based user interface (Wikipedia, 2015)]

Various publications and studies have brought to light possible areas for consideration and focus when designing for gesture-based gameplay and eLearning.

According to the 2011 Horizon Report, “While gesture-based computing has found a natural home in gaming, as well as in browsing files, its potential uses are far broader. The ability to move through three-dimensional visualizations could prove compelling and productive, for example, and gesture-based computing is perfect for simulation and training. Gesture-based computing has strong potential in education, both for learning, as students will be able to interact with ideas and information in new ways, and for teaching, as faculty explore new ways to communicate ideas. It also has the potential to transform what we understand to be scholarly methods for sharing ideas. Gesture-based computing is changing the ways that we interact with computers, both physically and mechanically. As such, it is at once transformative and disruptive. Researchers and developers are just beginning to gain a sense of the cognitive and cultural dimensions of gesture-based communicating, and the full realization of the potential of gesture-based computing within higher education will require intensive interdisciplinary collaborations and innovative thinking about the very nature of teaching, learning, and communicating.” (Johnson, Smith, Willis, Levine & Haywood, 2011).

 

[Image: A Guide to Gesture-Based Interactions (Ivec, 2014)]

According to Gao et al. (2012), in their study titled, What Does Touch Tell Us about Emotions in Touchscreen-Based Gameplay?, “The increasing number of people playing games on touch-screen mobile phones raises the question of whether touch behaviors reflect players’ emotional states… In parallel with this increasing use of touch-based devices to play games, there is also an increasing interest, within the game industry, in adding emotion recognition capabilities to games (e.g., see Microsoft XBOX 360 Milo project and Fable Journey) to increase the engagement experience of the player. This capability could be used either to evaluate new games or to create games that make use of the player’s affective states to create a personalized, engaging game experience [Bianchi-Berthouze 2013; Yannakakis and Hallam 2007].”

While all these benefits can be useful to the gaming industry, they can also be useful to eLearning for many of the same reasons.
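To make the idea of touch behaviors carrying emotional cues more concrete, here is a minimal sketch of the kind of stroke features (duration, speed, pressure) an emotion-aware system might extract before classification. The data format and feature choices are illustrative assumptions, not the method used by Gao et al. (2012).

```python
# Illustrative sketch: extracting simple features from a touch stroke that an
# emotion classifier could use (e.g., fast, high-pressure strokes as a possible
# frustration cue). The data format and feature set are assumptions for this
# example, not the method used by Gao et al. (2012).
def stroke_features(points):
    """points: list of (x, y, pressure, timestamp) tuples for one stroke."""
    duration = points[-1][3] - points[0][3]
    path_length = sum(
        ((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
        for (x1, y1, _, _), (x2, y2, _, _) in zip(points, points[1:])
    )
    mean_pressure = sum(p for _, _, p, _ in points) / len(points)
    speed = path_length / duration if duration > 0 else 0.0
    return {"duration": duration, "speed": speed, "mean_pressure": mean_pressure}

# Example: a short, fast, firm stroke.
stroke = [(10, 10, 0.6, 0.00), (60, 20, 0.8, 0.08), (120, 25, 0.9, 0.15)]
print(stroke_features(stroke))
```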

[Image: eBoard interactive classroom touchscreen (Interactive IT Group, 2015)]

According to Gwo-Jen et al. (2013), in their study titled, Effects of Touch Technology-based Concept Mapping on Students’ Learning Attitudes and Perceptions, “Concept maps have become a widely used educational tool around the globe. The advancements in computerized interface technologies have enabled even more alternatives for using concept maps in teaching and learning. This study investigates the effects of two different touch technology-based concept mapping interaction modes on students’ learning achievements and learning attitudes in a natural science course, as well as their degree of acceptance of using concept maps to learn. Ninety two sixth graders were randomly divided into three groups. Experimental Group One was taught using the Interactive Whiteboard (IWB)-based concept mapping approach, Experimental Group Two learned with the touchscreen-based concept mapping approach, while the control group learned with the traditional paper-and-pencil-based concept mapping approach. The experimental results show that, in terms of learning attitudes toward the natural science course and the degree of acceptance of using concept maps to learn, the students were significantly more positive about the two touch technology-based interaction modes than they were about the traditional paper-and-pencil mode…To sum up, touch technology has potential in educational applications; therefore, it is worth studying the effects of different forms of touch technology used as educational tools on the learning performance of students.”
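As a hypothetical illustration of what a touch-based concept-mapping tool manipulates underneath, the sketch below models a map as concepts plus labeled links, with a drag gesture between two concepts creating a link. The class and method names are invented for the example and do not describe the software used in the study.

```python
# Hypothetical sketch of the structure a touch-based concept-mapping tool edits:
# concepts as nodes, relations as labeled links, and a drag gesture between two
# concepts creating a link. This is not the software used in the cited study.
class ConceptMap:
    def __init__(self):
        self.concepts = set()
        self.links = []        # list of (source, label, target) triples

    def add_concept(self, name: str) -> None:
        self.concepts.add(name)

    def link_by_drag(self, source: str, target: str, label: str) -> None:
        """Called when a drag gesture connects two concepts on the screen."""
        if source in self.concepts and target in self.concepts:
            self.links.append((source, label, target))

# Example: a learner links two natural-science concepts with a drag.
cmap = ConceptMap()
cmap.add_concept("evaporation")
cmap.add_concept("water cycle")
cmap.link_by_drag("evaporation", "part of", "water cycle")
print(cmap.links)    # [('evaporation', 'part of', 'water cycle')]
```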

 

Gesture-based systems are very new; they are already used in many industries but are new to education. Much research is underway, so this area is well worth revisiting in the near future for further information and case studies.

How the Technology Works: What is Gesture-Based Computing?

According to the Educause Learning Initiative brief on gesture-based computing (Educause, 2014), “Gesture-based computing refers to interfaces where the human body interacts with digital resources without using common input devices, such as a keyboard, mouse, game controller, or voice-entry mechanism. Such systems can scan a limited space, analyze human movement within it, and interpret the motions of hands and arms, head, face, or body as input. Touchscreen interfaces likewise enable gestures such as swipes or taps to navigate, play games, or organize data. Gesture recognition technology is used in a wide array of applications, from those that parse facial expressions to determine human emotional reactions to complex, augmented-reality simulations that evaluate whole-body movement.”
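To ground the idea of interpreting motion as input, the following is a minimal sketch of how a touchscreen stroke might be classified as a tap or a directional swipe. The sample format and thresholds are illustrative assumptions rather than the API of any particular framework.

```python
# Minimal, illustrative sketch: classifying a touchscreen stroke from raw touch
# samples. The TouchSample format and thresholds are assumptions for this example,
# not the API of any particular framework.
from dataclasses import dataclass

@dataclass
class TouchSample:
    x: float       # screen x-coordinate in pixels
    y: float       # screen y-coordinate in pixels
    t: float       # timestamp in seconds

def classify_gesture(samples: list[TouchSample],
                     min_distance: float = 80.0,
                     max_duration: float = 0.6) -> str:
    """Label a touch stroke as a tap, a directional swipe, or 'unknown'."""
    if len(samples) < 2:
        return "tap"
    dx = samples[-1].x - samples[0].x
    dy = samples[-1].y - samples[0].y
    duration = samples[-1].t - samples[0].t
    distance = (dx ** 2 + dy ** 2) ** 0.5
    if distance < min_distance:
        return "tap"
    if duration > max_duration:
        return "unknown"          # too slow to count as a swipe
    if abs(dx) > abs(dy):
        return "swipe-right" if dx > 0 else "swipe-left"
    return "swipe-down" if dy > 0 else "swipe-up"

# Example: a quick horizontal drag is recognized as a right swipe.
stroke = [TouchSample(100, 300, 0.00), TouchSample(260, 310, 0.25)]
print(classify_gesture(stroke))   # swipe-right
```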

 

Examples of Gesture-Based eLearning Systems

[Image: Figure 1 (Tutwiler, Lin, & Chang, 2013)]

In a pilot study by Shane Tutwiler, Ming-Chao Lin, and Chun-Yen Chang (2013), titled “The Use of a Gesture-Based System for Teaching Multiple Intelligences: A Pilot Study”, a gesture-based learning system using the Microsoft Kinect was created. It was then used to “illustrate that bodily states have been shown to impact cognitive states (Barsalou, 2008; de Koning & Tabbers, 2011). Finally, it has been suggested that gestures may activate learning without majorly increasing cognitive load (de Koning & Tabbers, 2011).” The results indicated that “It would seem that students were able to learn, or at least remember, types of intelligences quite well after partaking in the GBS activity. No causal claims can be made, of course, since there was not a control group with which to compare findings. This trend (quadratic learning curve) does beg further longitudinal analysis using a more robust sample. Findings from this further study would then better inform us on the impact of a GBS on the persistence and transfer of knowledge.”
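As a rough illustration of how a Kinect-style GBS can map body posture to a learning interaction, the sketch below checks whether a tracked hand is raised above the head, for example to select an answer. The joint dictionary is a hypothetical stand-in for the skeleton data a depth-sensor SDK would provide; it is not taken from the pilot study's implementation.

```python
# Illustrative sketch of gesture detection on skeleton data such as a depth sensor
# (e.g., a Kinect) might provide. The joint dictionary is a hypothetical stand-in
# for an SDK's skeleton frame; real SDKs expose richer, calibrated joint data.
from typing import Dict, Tuple

Joint = Tuple[float, float, float]   # (x, y, z) in meters, y pointing up

def hand_raised(joints: Dict[str, Joint], margin: float = 0.10) -> bool:
    """Return True if the right hand is clearly above the head."""
    hand_y = joints["hand_right"][1]
    head_y = joints["head"][1]
    return hand_y > head_y + margin

# Example frame: the learner raises a hand to select an answer in the activity.
frame = {"head": (0.0, 1.60, 2.0), "hand_right": (0.2, 1.78, 1.9)}
if hand_raised(frame):
    print("Answer selected via raised-hand gesture")
```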

[Image: LETS GO system]

 

A paper by Vogel et al. (2014), titled “Mobile Inquiry Learning in Sweden: Development Insights on Interoperability, Extensibility and Sustainability of the LETS GO Software System”, presented the “overall lifecycle and evolution of a software system developed in relation to the Learning Ecology through Science with Global Outcomes (LETS GO) research project. Recent developments of our LETS GO system include the implementations of two prototypes using gesture based interaction supported by the use of the Microsoft Kinect (Vogel, Pettersson, O., Kurti, & Huck, 2012) and touch enabled interactions facilitated by the use of the Samsung SUR-40 tabletop computing surface (Müller, 2012). The latest version of the LETS GO system allows for integrating new interaction features provided by multi touch enabled devices and gesture based interaction in a way that we can expand the interaction modes in which learners work with the visualizations.”
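The LETS GO goal of letting multi-touch and gesture input drive the same visualizations suggests an abstraction layer over interaction modes. The sketch below shows one hypothetical way such a layer could look; it is not based on the actual LETS GO codebase.

```python
# Hypothetical sketch of an input-abstraction layer that lets touch and gesture
# devices drive the same visualization, in the spirit of (but not taken from)
# the LETS GO system described above.
from abc import ABC, abstractmethod

class InteractionSource(ABC):
    @abstractmethod
    def next_command(self) -> str:
        """Return a device-independent command such as 'zoom-in' or 'pan-left'."""

class TouchSource(InteractionSource):
    def next_command(self) -> str:
        return "zoom-in"      # e.g., derived from a pinch-out on a tabletop surface

class GestureSource(InteractionSource):
    def next_command(self) -> str:
        return "pan-left"     # e.g., derived from a hand sweep seen by a depth camera

def update_visualization(source: InteractionSource) -> None:
    command = source.next_command()
    print(f"Applying '{command}' to the science visualization")

# Both interaction modes drive the same visualization code path.
for src in (TouchSource(), GestureSource()):
    update_visualization(src)
```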

 

[Image: Virtual chemical laboratory]
In a case study by Piotr Jagodziński and Robert Wolski (2014), titled “The Examination of the Impact on Students’ Use of Gestures While Working in a Virtual Chemical Laboratory for Their Cognitive Abilities” (Problems of Education in the 21st Century), the use of a gesture-based system (GBS) was examined through the lens of one of the cognitive theories discussed above, embodied cognition theory. The authors explain:

“According to this theory, it is important to use appropriate gestures in the process of assimilating new information and the acquisition of new skills. The further development of information and communication technologies has enabled the development of interfaces that allow the user to control computer programs and electronic devices by using gestures. These Natural User Interfaces (NUI) were used in teaching Chemistry in middle school and secondary school. A virtual chemical laboratory was developed in which students can simulate the performance of laboratory activities, similar to those that are performed in a real lab. The Kinect sensor was used to detect and analyze hand movement. The conducted research established the educational effectiveness of a virtual laboratory, which is an example of a system based on GBS gestures (gesture-based system). The use of the teaching methods and to what extent they increase the student’s complete understanding were examined. The results indicate that the use of the gesture-based system in teaching makes it more attractive and increases the quality of teaching Chemistry.

Analyzing the obtained results shows that the use of the gesture recognition system when using Kinect in the virtual lab had a positive impact on increasing the efficiency of teaching Chemistry. Students working with a virtual lab gave a better performance in terms of remembering information, and also showed the greater durability of remembering information. To a greater extent, students could understand the information passed to them. Moreover, it increased their ability to use the knowledge acquired in solving tasks in situations known to them from lessons. As a result, they achieved even better results in solving problematic laboratory tasks, which is related to achieving the objectives contained in the taxonomy category of learning objectives with the highest educational value. Comparing the achievements of students of particular middle and secondary school groups, we can conclude that the use of gestures and movements in a virtual chemistry laboratory offers higher efficiency than education for students from the other groups. The use of a Kinect sensor recognizing gestures and movements can increase the interactivity and effectiveness of an educational virtual laboratory. The inclusion of human movements in the cognitive process makes learning that is supported by multimedia content by using gestures produce better results and is more effective, which confirms the assumptions embodied in the cognition theories. Indeed, in our study, groups of students who were working with a virtual laboratory, using gestures and hand movements, achieved the best results. Students from other groups watching their teacher presentations and movie instructions, which did not use these gestures, achieved worse results, especially in solving problems. The results of our research regarding the application of the gesture and movement recognition system in the virtual Chemistry lab showed that gesture recognition computer technology will make a big impact on education in the future. The virtual laboratory we have developed with the Kinect sensor, in our opinion, is the new path to virtualization different laboratories doing very similar operations manually in a virtual environment to those in a real environment.”
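As a loose illustration of the kind of interaction such a virtual laboratory implies, the sketch below maps a tracked hand position onto the nearest piece of virtual glassware. The scene layout, coordinates, and grab radius are invented for the example and do not describe the authors' implementation.

```python
# Loose illustration of hand-driven interaction in a virtual lab: a tracked hand
# position "grabs" the nearest piece of virtual glassware when it is close enough.
# The scene layout and grab radius are invented for this example.
import math

APPARATUS = {
    "beaker": (0.30, 1.00, 1.50),     # (x, y, z) positions in meters
    "burette": (-0.20, 1.20, 1.50),
}

def nearest_grabbable(hand: tuple, max_reach: float = 0.15):
    """Return the name of the apparatus within reach of the hand, if any."""
    best_name, best_dist = None, max_reach
    for name, pos in APPARATUS.items():
        dist = math.dist(hand, pos)
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name

# Example: the learner's hand is near the beaker, so the beaker is picked up.
hand_position = (0.28, 1.05, 1.48)
grabbed = nearest_grabbable(hand_position)
print(f"Grabbed: {grabbed}")    # Grabbed: beaker
```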

Critical Reflection

Embodied cognition theory and the various case studies indicate that gesture-based systems do improve learning. It is important that the gestures incorporated into a gesture-based system be directly correlated with the learning objectives if they are to contribute significantly to meeting those objectives. It is also suggested that the instructor or facilitator be knowledgeable about the technology and its use, so that the system is competently demonstrated and students become proficient in using it for an optimal learning experience. The mere addition of movement also creates more opportunities for differentiated instruction and better addresses diverse learning styles and multiple intelligences. However, much research is still underway, so it remains to be seen what else can be addressed and to what benefit.

These technologies are making their presence felt in many kinds of systems across industries such as gaming, healthcare, defense, and manufacturing. They are very new to education and are advancing in parallel with the ever-evolving technologies for online systems such as LMSs and MOOCs. They are the subject of heavy research and development, making this an exciting evolution to follow as it takes shape in the educational arena.

Conclusions and Recommendations

It is clear that gesture-based learning systems do enhance learning compared with traditional models. As Gwo-Jen et al. (2013) conclude, “…touch technology has potential in educational applications; therefore, it is worth studying the effects of different forms of touch technology used as educational tools on the learning performance of students.”

When developing eLearning interactions and content, there are various guidelines to follow. According to Stephanie Ivec, in her article “A Guide to Gesture-based Interactions: Learning at Your Fingertips”,

“Your mobile content must answer these questions if you want your m-Learning initiative to be successful.

What can I do?
Where? How?
What happened?
How do I get back (undo)?
Where can I get help understanding how it works?”

As you’re developing mobile and gesture-based learning content, Ivec suggests looking to your favorite apps for inspiration. “For instance, the ‘pull down to refresh’ gesture is very common and widely familiar to smartphone users. Often an app will show a user interface walkthrough when first launched. This is a great idea to borrow for your m-Learning, as long as you keep it short and sweet. Show only the most important interactions in the walkthrough. If you try to explain everything at once, users will skip the walkthrough.

Another idea is to gradually introduce hints as a user is going through your course. I recently upgraded to a new smartphone and frequently get hints via pop-up windows that help me adjust my settings and preferences. This is known as ‘progressive disclosure’ and is a great way to help your users by providing relevant instructions only when they need them. You could do this with automatic messages that appear, or a subtle information icon in a corner that users can elect to touch for more information.” (Ivec, 2014).
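A minimal sketch of the progressive-disclosure idea Ivec describes might look like the following, where a hint is shown only the first time the learner reaches the step it belongs to. The step names and hint texts are placeholders, not content from any particular course.

```python
# Minimal sketch of "progressive disclosure": a hint is shown only the first time
# the learner reaches the step it belongs to. Step names and hint texts are
# placeholders for illustration.
HINTS = {
    "quiz": "Swipe left or right to move between questions.",
    "concept_map": "Touch and drag a concept onto another to link them.",
}

class HintManager:
    def __init__(self, hints: dict):
        self.hints = hints
        self.shown = set()

    def on_step_entered(self, step: str) -> None:
        if step in self.hints and step not in self.shown:
            print(f"[hint] {self.hints[step]}")
            self.shown.add(step)

# Example: the hint appears the first time each step is entered, then stays quiet.
manager = HintManager(HINTS)
for step in ["intro", "quiz", "quiz", "concept_map"]:
    manager.on_step_entered(step)
```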

Overall, the advances seen in gameplay and mobile device technologies will also appear in eLearning systems. Because preliminary and pilot studies have indicated positive results for improved learning, this will be a future trend to watch. How much it benefits or hinders learning depends greatly on the direction and implementation of those advances, some of which already exist in online systems and digital communication for both teachers and students. I believe gesture-based systems may advance to the point where all the ways human beings learn can be incorporated and merged into educational frameworks offering benefits across them all: lifelong, face-to-face, online, and virtual learning.

References

Dijk. (2013). Embodied Cognition. Retrieved from http://www.slideshare.net/jelle1975/dijk-2013-embodied-cognition-lecture-3-small.

Educause. (2014). Educause Learning Initiative: Gesture-Based Computing. Retrieved from https://net.educause.edu/ir/library/pdf/ELI7104.pdf.

Gao, Y., Bianchi-Berthouze, N., & Meng, H. (2012). What Does Touch Tell Us about Emotions in Touchscreen-Based Gameplay?. ACM Transactions on Computer-Human Interaction (TOCHI), 19(4), 1-30. doi:10.1145/2395131.2395138.

Gwo-Jen, H., Chih-Hsiang, W., & Fan-Ray, K. (2013). Effects of Touch Technology-based Concept Mapping on Students’ Learning Attitudes and Perceptions. Journal of Educational Technology & Society, 16(3), 274-285.

Interactive IT Group. (2015). eBoard LCD Interactive Touch Screens. Retrieved from http://www.interactiveitgroup.com.au/for-education-and-business/.

Ivec, S. (2014). Picture: A Guide to Gesture-based Interactions: Learning at Your Fingertips. Retrieved from http://elearningindustry.com/guide-to-gesture-based-interactions.

Jagodziński, P., & Wolski, R. (2014). The Examination of the Impact on Students’ Use of Gestures While Working In a Virtual Chemical Laboratory for Their Cognitive Abilities. Problems of Education in the 21st Century, 61, 46-57.

Johnson, L., Smith, R., Willis, H., Levine, A., and Haywood, K., (2011). The 2011 Horizon Report. Austin, Texas: The New Media Consortium. Retrieved from http://www.nmc.org/pdf/2011-Horizon-Report.pdf.

Kinect Milo. (2015). Kinect (Project Natal): Milo – Virtual Human XBox 360. Retrieved from https://www.youtube.com/watch?v=JF_HXTQ7Quo.

Lionhead. (2015). Fable: The Journey. Retrieved from http://www.lionhead.com/games/fable-the-journey/.

McNerney, S. (2011). A Brief Guide to Embodied Cognition: Why You Are Not Your Brain. Retrieved from http://blogs.scientificamerican.com/guest-blog/a-brief-guide-to-embodied-cognition-why-you-are-not-your-brain/.

Mohammed, K. (2015). Top 10 Trends In Education Technology For 2016. Retrieved from http://www.slideshare.net/karima1/top-10-trends-in-education-technology-for-2016-42375015.

Systems Consulting. (2013). Image. Retrieved from http://www.systemsconsulting.com.co/.

Wikipedia. (2015). Gesture Recognition. Retrieved from https://en.wikipedia.org/wiki/Gesture_recognition.
