A JRNEY to Explore Online Learning
I met Greg in June at the end of a scavenger hunt as part of an ISTE Global Collaboration PLN event. And I'm so glad I did. Right away I could tell that Greg was one of the good guys working hard to find ways to make the lives of kids and teachers better. And his way was through analytics. Now this is not my normal cup of tea, but when I heard him explain what he was doing, I felt strongly about having him share his story here.
For the last four years, I have been consumed with understanding students' online learning experiences. This is a passion that truly hit me out of the blue. It all started when I was working as an assistant principal at North Hunterdon High School. The school had recently adopted a 1:1 student Chromebook initiative, in which every student was issued a Chromebook for their use 24/7. To keep students safe online, the school purchased monitoring software that recorded all online activity on the Chromebook, regardless of whether the student was at school, at home, or at a coffee shop. The software flagged instances of students trying to access inappropriate websites, and it was my job as an assistant principal to review the activity records of students who were flagged for their online behavior. As I reviewed those records, I was amazed by how much information we were collecting about students' online behavior. It was not uncommon for a student to have over 500 page views in one day. I knew that hidden somewhere in those hundreds of page views was extremely valuable information about student learning. Somewhere buried in the data was information about a unique learning experience for each student. Somewhere in the hundreds of lines of websites visited was information that could help teachers and students better understand how learning occurs online.
I became extremely interested in understanding how educators could use learning data in the same ways that companies like Amazon and Google use data about their users. I was in the final years of a doctoral program and had already begun work on a dissertation. I knew that online learning was my passion, so I switched dissertation topics and focused my research on learning analytics.
The goal of my dissertation research was to provide teachers with specific, targeted data about how their students were using their Chromebooks to complete a particular assignment. I worked with a handful of teachers to identify an assignment they wanted analytics for, and then I manually sifted and sorted through the hundreds of page views for each student to isolate the online activity conducted specifically for that assignment. I was then able to tell each teacher how long each student spent working on the assignment, what resources they used, and what Google searches they conducted, illuminating an online learning experience to which the teacher had previously been blind.
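To give a sense of the kind of sifting and sorting this involved, here is a minimal Python sketch, assuming each page view is a simple record of student, timestamp, and URL. The field names, the 30-minute idle cutoff, and the way Google searches are detected are illustrative assumptions, not the actual procedure from the dissertation.

```python
from collections import Counter
from dataclasses import dataclass
from datetime import datetime, timedelta
from urllib.parse import urlparse, parse_qs


@dataclass
class PageView:
    student: str
    timestamp: datetime
    url: str


def summarize_student(views, window_start, window_end, idle_gap=timedelta(minutes=30)):
    """Summarize one student's page views that fall inside an assignment window."""
    in_window = sorted(
        (v for v in views if window_start <= v.timestamp <= window_end),
        key=lambda v: v.timestamp,
    )

    # Estimate time on task by summing gaps between consecutive page views,
    # ignoring gaps longer than the (assumed) idle cutoff.
    time_on_task = timedelta()
    for prev, curr in zip(in_window, in_window[1:]):
        gap = curr.timestamp - prev.timestamp
        if gap <= idle_gap:
            time_on_task += gap

    # Which sites the student relied on, most-visited first.
    resources = Counter(urlparse(v.url).netloc for v in in_window).most_common()

    # Pull the query text out of any Google search pages.
    searches = []
    for v in in_window:
        parts = urlparse(v.url)
        if parts.netloc.endswith("google.com") and parts.path == "/search":
            query = parse_qs(parts.query).get("q", [""])[0]
            if query:
                searches.append(query)

    return {"time_on_task": time_on_task, "resources": resources, "searches": searches}
```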
This data proved to be very impactful for the teachers, and each of them noted several instructional design changes they would make as a result. For example, one teacher who gave students a week to work on an essay that she expected to take 2-3 hours was surprised to find that most students were spending only half that amount of time. As a result, she planned to reduce the amount of time she gives for the assignment in the future. Another teacher was surprised at how often students used questionable online resources instead of the list of reliable references she provided. She planned to reteach a lesson on evaluating online resources and to review her expectation that the assignment be completed using only reliable sources.
What has been particularly interesting is that every teacher we work with finds something different in the data. Each approaches the analysis with their own learning expectations and their unique students in mind, and the value a teacher finds in looking at their student and class information is far richer than anything an outside researcher could provide.
My dissertation research showed that targeted data about student online learning had value, but my manual methods for sifting and sorting the data were not practical or scalable for continued use. To solve this problem, I worked with my dissertation committee chairman and a software engineer to form a company called Learnics, which developed a Google Chrome extension called ThinkingApp that allows students to record and submit their own learning data. This puts the ownership of learning data back in the hands of the learner.
Math students are required to show their work when completing a problem; why not ask students working on an online assignment to do the same thing? Students turn on the ThinkingApp when they begin working, and it records all of their online activity. When they are done, they stop the recording and have the opportunity to delete any unrelated data from their activity log. They then submit their targeted data to the teacher, who, through an analytics dashboard, can see class-wide averages for total time spent, the most popular websites, and Google searches; similar information is available at the student level. The ThinkingApp allows teachers to start assessing and understanding the learning process, not just the learning product. Teachers can curate the specific online learning experiences they want their students to have and collect data that gives them feedback on those experiences.
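As a rough illustration of the kind of class-wide roll-up such a dashboard might perform, here is a short sketch. The shape of a submitted activity log below is an assumption made for illustration, not ThinkingApp's actual data format.

```python
from collections import Counter
from statistics import mean


def class_summary(submissions):
    """Roll student-submitted activity logs up into a class-wide dashboard view."""
    avg_minutes = mean(s["minutes"] for s in submissions)
    top_sites = Counter(site for s in submissions for site in s["sites"])
    top_searches = Counter(q for s in submissions for q in s["searches"])
    return {
        "average_minutes": round(avg_minutes, 1),
        "top_sites": top_sites.most_common(5),
        "top_searches": top_searches.most_common(5),
    }


# Two submitted logs rolled up into the teacher-facing summary.
print(class_summary([
    {"student": "A", "minutes": 80, "sites": ["en.wikipedia.org"], "searches": ["causes of ww1"]},
    {"student": "B", "minutes": 95, "sites": ["en.wikipedia.org", "history.com"], "searches": ["ww1 timeline"]},
]))
```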
The data provided through targeted learning analytics is by no means a magic bullet that will fix all the woes of our educational system, but I do feel that it can close an important feedback loop about online learning experiences that is currently missing for many educators. I look forward to continuing my JRNEY of promoting targeted learning analytics, and I look forward to taking many innovative educators along that JRNEY with me.