Friday, October 17, 2014

Comparing two forms of concept map critique activities

Knowledge Integration Map (KIM)
Source: Wikipedia
Concept maps can be versatile tools for learning and assessment. However, evaluating concept maps can be challenging. What are effective ways to analyze concept maps?

I presented a paper at the 6th International Conference on Concept Mapping in Santos, Brazil in October 2014 (see the conference program with links to papers here). The paper introduces a new form of concept map, called the Knowledge Integration Map (KIM). Unlike Novakian concept maps, KIMs divide the drawing area into discipline-specific sections (see example above). Placing concepts in these designated areas elicits how learners categorize them and highlights cross-links between sections. Cross-links are particularly interesting because they connect concepts in different categories.
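To make the idea concrete, here is a minimal sketch (in Python, with invented concepts, links, and section names, not taken from the paper) of how a KIM can be represented as a labeled graph and how cross-links between sections can be identified:

```python
# Minimal sketch of a Knowledge Integration Map (KIM) as a labeled graph.
# The section names, concepts, and links are illustrative examples only.

# Each concept is assigned to a discipline-specific section of the map.
concept_sections = {
    "mutation": "genetics",
    "allele frequency": "genetics",
    "natural selection": "evolution",
    "fitness": "evolution",
}

# Links are (concept, link label, concept) propositions.
links = [
    ("mutation", "changes", "allele frequency"),
    ("natural selection", "acts on", "fitness"),
    ("mutation", "provides variation for", "natural selection"),
]

# A cross-link connects concepts placed in different sections.
cross_links = [
    (a, label, b)
    for a, label, b in links
    if concept_sections[a] != concept_sections[b]
]

print(cross_links)
# [('mutation', 'provides variation for', 'natural selection')]
```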

The empirical study presented in the paper compares two forms of KIM activities implemented in biology classrooms. Concept map activities often lack a subsequent revision step that facilitates knowledge integration. This study compares two kinds of concept map critique activities embedded in an evolution unit: student dyads in one group compared their concept maps against an expert map, while dyads in the other group conducted a peer review. Analysis of the concept maps suggests that both treatment groups significantly improved their understanding of evolution. However, the two groups developed different criteria: the expert-map group focused mostly on concept-focused criteria such as concept classification, while the peer-review group used more link-focused criteria such as link labels and missing connections. The paper suggests that both critique activities can help learners make more coherent connections across different topics in biology.

The paper is available here (as a PDF). The title of the paper is 'Comparing two forms of concept map critique activities to support knowledge integration in biology education'.

Making Sense of Concept Maps

Overview of concept mapping analysis methods
(by Beat A. Schwendimann)

Concept maps can be versatile tools for learning and assessment. However, evaluating concept maps can be challenging. What are effective ways to analyze concept maps?

I presented a paper at the 6th International Conference on Concept Mapping in Santos, Brazil in October 2014 (see the program with links to papers here). The paper provides an overview of evaluation and analysis methods for concept maps and identifies powerful indicators that can track changes in students' understanding. The paper is available here (as a PDF). The title of the paper is 'Multi-level analysis strategy to make sense of concept maps' (also see the concept map above).
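As one illustration (not the paper's actual analysis strategy), a simple indicator of changing understanding is the set of propositions that appear, disappear, or persist between a pre-test and a post-test map. The Python sketch below uses invented propositions:

```python
# Illustrative sketch only: compare a pre-test and post-test concept map
# by treating each map as a set of (concept, link label, concept) propositions.

pre_map = {
    ("mutation", "causes", "variation"),
    ("natural selection", "increases", "fitness"),
}
post_map = {
    ("mutation", "causes", "variation"),
    ("natural selection", "acts on", "variation"),
    ("natural selection", "increases", "fitness"),
}

added = post_map - pre_map    # newly made connections
removed = pre_map - post_map  # dropped connections
kept = pre_map & post_map     # stable connections

print(f"added: {len(added)}, removed: {len(removed)}, kept: {len(kept)}")
```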

An extended version of the paper has been published as a book chapter: Schwendimann, B. A. (2014). Making sense of knowledge integration maps. In D. Ifenthaler & R. Hanewald (Eds.), Digital knowledge maps in education: Technology enhanced support for teachers and learners. New York: Springer.

Thursday, October 9, 2014

Create 3D sculptures using Oculus Rift


VRClay allows users to create 3D sculptures in a virtual reality environment using an Oculus Rift or Razer Hydra. More information about it here.

Affordable computer kit Kano to teach coding

London-based startup Kano offers a $150 computer and coding kit, "Kano", that can be used to teach children about computers and coding. The modular "Kano" kit consists of several plug-in components that can be attached to create a fully functional PC. The plug-in components include a compact Raspberry Pi computer board, an orange Bluetooth keyboard with trackpad, an 8 GB memory card, and a speaker. See Kano's website here.

"Kano" uses a custom programming approach, called "Kano Blocks" which is a version of the Linux operating system. Kano Blocks can output real code in Javascript and Python.

Identifying fake physics in videos to teach physics

Popular videos often show seemingly impossible jumps, throws, or dunks. Many of these videos have been manipulated.

Physics professor Rhett Allain wrote an interesting post about how you can spot whether a video has been manipulated by analysing its physics. Allain's post could be used by physics teachers to teach physics, as well as a resource in a course about video editing. See Allain's post here.
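One common physics check in this spirit (this sketch is not Allain's actual method and uses synthetic tracking data) is to extract an object's vertical position frame by frame and test whether its acceleration is consistent with gravity, about 9.8 m/s²:

```python
# Rough sketch: check whether a tracked object in a video falls like a real
# projectile. The trajectory below is generated synthetically for illustration.
import numpy as np

fps = 30.0                   # video frame rate
t = np.arange(12) / fps      # timestamps of the first 12 frames (s)

# Vertical position (m) as it might come from frame-by-frame video tracking:
# launched upward at 5 m/s from a height of 2 m under real gravity.
y = 2.0 + 5.0 * t - 0.5 * 9.8 * t**2

# For genuine free fall, a quadratic fit y = a*t^2 + b*t + c gives a ≈ -g/2.
a, b, c = np.polyfit(t, y, 2)
g_fitted = -2 * a
print(f"fitted acceleration: {g_fitted:.1f} m/s^2 (expect ~9.8 for unedited footage)")
```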

Thursday, August 21, 2014

eReader usage leads to poorer recall than paper books

eBook reading
[Source: http://www.marylhurst.edu/_resources/img/ENG-woman-reading-tablet-library.jpg]
A study conducted at Stavanger University (Norway) reported that "readers using a Kindle were significantly worse than paperback readers at recalling when events occurred in a mystery story."

In the study, 50 readers were given the same short story. Half read the 28-page story on a Kindle, and half read it as a paperback. Afterwards, participants were tested on aspects of the story, including objects, characters and settings.

The researchers found that "The Kindle readers performed significantly worse on the plot reconstruction measure, for example, when they were asked to place 14 events in the correct order."

As an explanation, the authors refer to the tactile properties of paperbacks: "When you read on paper you can sense with your fingers a pile of pages on the left growing, and shrinking on the right. [The differences for Kindle readers] might have something to do with the fact that the fixity of a text on paper, and this very gradual unfolding of paper as you progress through a story, is some kind of sensory offload, supporting the visual sense of progress when you're reading."

The authors suggest that publishers should make evidence-based decisions about what kind of content is best presented in what kind of format.

More details on the study here: http://www.theguardian.com/books/2014/aug/19/readers-absorb-less-kindles-paper-study-plot-ereader-digitisation