Since it was practically around the corner and I'd been wanting to acquaint myself with the latest news on educational data mining for some time, I decided to spend three days at the Educational Data Mining conference, held in Eindhoven, July 6 through 8, 2011. Having sat it all out, I have to say that my feelings are mixed: I saw some very good stuff and some work that makes you wonder. A couple of general observations first, then some details on a few papers and posters. A confession at the outset: I am interested in informal (non-formal) kinds of learning, so I was specifically on the lookout for uses of data mining that would foster this kind of learning.
First, the EDM community is heavily dominated by people of US extraction. That inevitably brings a bias, in that 'educational' is surreptitiously being defined as 'in accordance with the US educational system'. This is not necessarily bad, but it is something to keep in mind. Second, and perhaps as a consequence of this, data mining seems almost congruent with intelligent tutoring systems. Even though the title of a paper may suggest something different, ITSs are never far away. Third, and most importantly in my view, the conference's take on what data there are to mine is a very narrow one. This is connected to a narrow view of what constitutes education: school-based, teacher-led formal learning, with no concept of other forms of learning. This may simply be a choice, but it is one that already narrows down the field. However, it gets worse: within the confines of formal learning, the sole educational model is that of the teacher as the sage on the stage, who may be assisted by ITSs to relieve them of some of the drudgery of repeatedly having to answer the same questions. I am exaggerating, true, but not all that much. My main problem with this is that, to the extent that EDM is successful, it acts as a conserving force, reinforcing received testing methods and paying little attention to educational innovation. From which it follows that I do not see EDM as espoused at the conference as innovation of education; at best it offers innovative methods to support traditional forms of learning.
That being out of the way, there were several papers and posters of interest to be seen and heard at the conference. A few observations on just three of them. First, Kelly Wauters et al. from K.U. Leuven discussed a novel means of rating proficiency in their Monitoring Learners' Proficiency: Weight Adaptation in the ELO Rating System. They take a modified version of the Elo rating system that chess players use and apply it to rate proficiency on learning items. If you want to sequence learning items adaptively, you not only need to know how 'difficult' the items are, but also how good someone is at particular ones. That way, you can provide learners with items whose difficulty matches their proficiency. Second, and breaking away from tradition, Worsley and Blikstein from Stanford also worry about proficiency or expertise. They wonder What is an Expert? and seek an answer in the use of Learning Analytics to identify emergent markers of expertise through automated speech, sentiment and sketch analysis. Thus they look at, say, speech utterances and sketches to acquire an impression of someone's expertise in a particular subject. Interestingly, both novices and experts reveal little lack of confidence: the former because they are sure they do not know, the latter because they are sure they do. Both (short) papers are fun for their innovativeness; they are also useful in the context of informal learning (in, say, Learning Networks) as they provide means to characterize learners' expertise and thus to help them better.
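Returning to the Elo idea for a moment: to make it concrete, here is a minimal sketch in Python of a plain Elo-style update, in which a learner's proficiency estimate and an item's difficulty estimate are nudged after every attempt. This is purely my own illustration, not the authors' algorithm; their contribution concerns how the weight (the K factor below) should be adapted, whereas I simply keep it fixed, and all names and numbers are made up for the example.

    import math

    def expected_score(theta, beta):
        """Probability that a learner with proficiency theta answers an item
        of difficulty beta correctly (the logistic Elo expectation)."""
        return 1.0 / (1.0 + math.exp(beta - theta))

    def elo_update(theta, beta, correct, k=0.4):
        """One Elo-style update after a learner attempts an item.

        theta   -- current estimate of the learner's proficiency
        beta    -- current estimate of the item's difficulty
        correct -- 1 if the answer was correct, 0 otherwise
        k       -- fixed weight here; Wauters et al. study how to adapt it
        """
        e = expected_score(theta, beta)
        theta_new = theta + k * (correct - e)   # learner rated up after an unexpected success
        beta_new = beta - k * (correct - e)     # item rated down (easier) when answered correctly
        return theta_new, beta_new

    # Toy usage: one learner attempts three items of increasing difficulty.
    theta = 0.0
    for beta, correct in [(-1.0, 1), (0.0, 1), (1.5, 0)]:
        theta, beta = elo_update(theta, beta, correct)
        print(f"learner proficiency now estimated at {theta:.2f}")

With estimates like these in hand, the adaptive sequencing mentioned above becomes straightforward: pick the next item whose difficulty roughly matches the learner's current proficiency, since that is where the outcome is least predictable and the update most informative.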
Third and finally, there was a nice poster by Anna Lea Dyckhoff from the computer-supported learning group, informatics, at RWTH Aachen, practically our neighbours at OUNL. Although it is still in its infancy, she is developing a learning analytics toolkit (eLAT) that allows teachers to gauge their students' interaction with the content in Personal Learning Environments. I am not sure whether the use of the term teacher in connection with a PLE is entirely fortunate - after all, if PLEs are really personal, they must by definition also cover informal learning situations in which the role of teachers is not self-evident. However, such toolkits are very valuable as they provide a means to help personal learners who self-guide their learning, or so I would hope. In the same vein, R. Pedraza-Perez et al. from Cordoba, Spain offer a Java desktop tool to mine Moodle log files, and García-Saiz et al. from Cantabria have built an E-Learning Webminer (EIWM) that, by discovering students' profiles, is intended to help them navigate and work in distance-taught courses.