In my continued exploration of professional development and evaluation, I partnered with the Educational Technology and Media department at my university to conduct a pilot survey on which classroom technologies faculty use and how they use them. The results of this pilot will inform modifications to the data collection tool before faculty-wide administration at a later date. This is a summary of the project outcomes.
Purpose and Objectives
The aim of this pilot study was to assess current classroom technology usage at a private university in Seattle, Washington. A secondary purpose was to test the data collection tool.
Five study objectives, including two related to data collection, were created:
1) Determine the current level and type of technology usage by faculty.
2) Determine readiness for online teaching (through analysis of objective 1).
3) Determine if the technology currently offered to faculty meets the needs of the faculty.
4) Collect feedback from pilot participants on the survey questions.
5) Determine if the pilot survey collects the intended data.
A survey was distributed to a convenience sample of 20 participants, who were able to recruit others. Participants were asked about the areas of teaching where they incorporate technology, the types of classroom technology they use, student use of classroom technology, and their self-identified rate of technology adoption. Descriptive analysis was run to characterize the sample's technology use, along with correlation tests to begin understanding use profiles.
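As a sketch of what this kind of analysis might look like, the snippet below runs descriptive statistics and a Pearson correlation over made-up survey responses. The numbers, column meanings, and sample size are illustrative assumptions, not the pilot's actual data.

```python
from statistics import mean, pstdev

# Hypothetical responses (NOT the actual pilot data): each tuple is one
# participant's self-rated adoption speed (1-5) and count of supported
# classroom tools used.
responses = [
    (5, 7), (4, 6), (4, 5), (3, 5), (3, 4), (3, 4),
    (2, 3), (4, 6), (5, 8), (2, 2), (3, 5),
]
adoption = [a for a, _ in responses]
tools = [t for _, t in responses]

# Descriptive analysis: characteristic technology use of the sample.
summary = {
    "mean_adoption": mean(adoption),
    "mean_tools": mean(tools),
}

# Correlation test: do faster adopters use more supported tools?
def pearson_r(xs, ys):
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)
    return cov / (pstdev(xs) * pstdev(ys))

r = pearson_r(adoption, tools)
```

With these invented numbers the correlation comes out strongly positive, which is the kind of "use profile" signal the pilot's correlation tests were intended to surface.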
The results of this pilot study indicate that, of the eleven (11) participants who completed the survey, most professors are fast-to-average technology adopters, indicating that they are open to technologies in the classroom, and all use technology in at least one area of their teaching/student learning.
Professors feel mostly comfortable with supported classroom technologies unless they do not have access to them. If a professor does not feel comfortable with a technology, students will not be exposed to it; this may include technologies that all professors have access to but that are not part of every classroom, such as mics and webcams. Professors also tended to rely more heavily on supported technologies than on social media, even after factoring in technology adoption self-identification. Professors used, on average, five (5) of the supported technologies, with Canvas the most commonly used. In comparison, professors used only one (1) social media platform on average, with YouTube the most preferred.
The faculty in this study were supportive of student use of technology in the classroom, allowing students to use all types of technologies and restricting only when during the class period technology may be used.
These findings cannot be generalized to the entire faculty demographic. Recommendations to clarify survey items for better responses include adding definitions of major technology terminology and making changes to the Likert scales for inclusion.
In these past few weeks, I have been exploring professional development (PD) models that optimize adult learning. The primary focus of these posts has been on the characteristics of adult learning and the various professional development formats that honor those characteristics. While understanding these models is important so that participants gain the most from their professional development, in this post I'd like to focus on applying these concepts to content, exploring the educational technology best practices described in ISTE Coaching Standard 4b: "Design, develop, and implement technology rich professional learning programs that model principles of adult learning and promote digital age best practices in teaching, learning, and assessment" (ISTE, 2017).
In investigating digital age best practices, formative assessment appeared as a recurring theme. Formative assessment, as part of a feedback loop, empowers learners to engage in the trial and error of learning safely and with minimal risk. Applying formative assessment to professional development could offer similar results. In applying this idea to the ISTE standard, I began wondering: what digital tools could be implemented to teach teachers about the importance of formative assessment, and why is it a best practice?
Feedback loops are often used as a teaching best practice to help students build 21st century skills. As described in other posts on this blog, of the four different types of assessment, traditional, or summative, assessment measures learning after an assignment has been turned in. Summative evaluation assumes that a student has "learned" after an intervention (such as teaching), and the educator evaluates the extent of that learning (Vlad-Ortiz, 2018). While summative assessment is useful for formal evaluation, it may not be timely, nor does it help students improve if offered only as one-time feedback (Vlad-Ortiz, 2018). Where summative assessment is formal and final, formative assessment is more casual and ongoing, as the evaluation occurs during the learning (Vlad-Ortiz, 2018). Formative assessment therefore provides a checkpoint for student understanding (Office of Educational Technology, n.d.).
I explored the benefits of feedback loops for students in this post; here I'd like to expand the investigation to include formative feedback as a tool in adult learning. The Office of Educational Technology found that formative feedback, when coupled with technology tools, may be more complete than traditional assessment and may "reduce time, resources, and disruption" in conducting the assessment (Office of Educational Technology, n.d.). These benefits help educators because formative assessment may provide an avenue for capturing teaching qualities that open opportunities for "self-reflection, peer reflection, feedback, and supervisor evaluation" (Office of Educational Technology, n.d.). Extending these concepts further, formative assessment can be used in professional development as a means to inform instructional practice, with participants tracking their own learning (Office of Educational Technology, n.d.). This means that meaningful evaluation can occur more rapidly and frequently, offer more insight, and help guide professional development needs.
Tools that can be used for formative assessment
There are several educational technology tools that can be used for formative assessment. Common Sense Education created a list of the top 27 tools for formative assessment, available here. These formative feedback tools include the following features: student progress tracking, interactive and collaborative activities, student-paced learning, and instant feedback to both students and teacher. Formative feedback is given through interactive slideshow presentations, video responses, multimedia platforms, content mapping, quizzes (including clickers and polling), and backchannel chats. In creating the list, Common Sense Education agrees with the Office of Educational Technology that the best formative assessment tools help students (and participants, in this case) self-reflect and self-assess so that they understand their current level of learning and self-identify areas for improvement (Common Sense Education, n.d.).
Incorporating formative assessment into professional development
Incorporating formative assessment in adult learning must assume that participants are learners joining the professional development for a variety of motives relevant to their work situations. Though there are quite a few professional development resources available on the internet about formative feedback tools, I'd like to use a professional development video I found on YouTube, "10 Tips for Formative Assessment with Technology: Meaningful, Sustainable, & Scalable," as an example. In the video, Dr. Monica Burns walks participants through her tips, highlighting the main features of several formative feedback tools and how to use them. A summary of her tips is provided in figure 1.1 below.
Though the video is purely informational, with Dr. Burns lecturing for about 30 minutes on her ten tips, it could be a useful resource for highly motivated participants. The professional development model used assumes that participants already have an awareness of formative assessment and simply need guidance or ideas on how to implement it in their teaching practice.
According to the ISTE standard, best practices for effective PD include modeling (ISTE, 2017). While the workshop above may model ways to use each tool through verbal and visual description, it fails to include participant buy-in and interaction. Formative feedback could have been built into the professional development itself, allowing participants an opportunity to experience instant feedback through the lens of a learner. For example, demonstrating how to gauge comprehension to better understand the audience's needs could have been accomplished with a backchannel chat or the polling/quiz apps described in the video. This tangible, experiential approach could help increase self-efficacy with technology tools for mixed audiences, with the presenter shifting into a facilitator role at certain points in the professional development. When presenters start thinking about their participants as learners, professional development becomes stronger and more impactful, which can yield better improvements in teaching and learning.
Common Sense Education. (n.d.). Top tech tools for formative assessment. Available from: https://www.commonsense.org/education/top-picks/top-tech-tools-for-formative-assessment

Office of Educational Technology. (n.d.). Section 4: Measuring for learning. Available from: https://tech.ed.gov/netp/assessment/

Vlad-Ortiz, C. (2018). Incorporating feedback loops to develop an empowered student [blog]. Available from: http://professorvlad-ortiz.org/incorporating-feedback-loops-to-develop-an-empowered-student/

Vlad-Ortiz, C. (2018). Instructional coaching: Using rubrics to quantify qualitative data for improved teaching outcomes. Available from: http://professorvlad-ortiz.org/instructional-coaching-using-rubrics-to-quantify-qualitative-data-for-improved-teaching-outcomes/
I embarked on a project where I took on the role of peer coach. Using the communication skills and logistical training from class, I honed my coaching skills over a ten-week period. I'm no stranger to coaching: in a former career, I counseled patients on therapeutic diets, diet change, and overcoming barriers to change, using very similar principles. In fact, I became quite nostalgic throughout this process. The strange and unfamiliar term "peer coaching" became comfortable and familiar once concepts like "probing questions" and "building rapport" came to light. With no billing hours and no diagnoses to defend (mainly to insurance), peer coaching felt light and freeing compared to coaching in a medical setting.
The project itself consisted of enlisting the help of a peer who would be willing to undergo a collaborative revision of an existing lesson plan. The idea was to spend time building rapport and establishing set roles for each peer prior to the collaborative process. The collaboration would then focus on one major area of concern to be improved in the lesson plan. Following this revision, both parties would reflect on the process to provide feedback.
The Coaching Process
To start the project, I partnered with a former supervisor, SK, who is very open-minded about incorporating technology in the classroom. She had been wanting to explore new ways to use technology in online and blended courses beyond simple course management. She felt that online classes tended to be boring or isolating because most are designed to be "work at your own pace" and independent. Faced with planning a new blended course set to go live during the next academic year, SK sought me out for suggestions. Throughout the peer coaching process, we had four face-to-face meetings (where the majority of the collaboration was performed) while also communicating follow-up items via email. A summary of these encounters is provided below:
First Meeting. In our first meeting, SK shared more information about her new course intended to be a blended classroom with community engagement components. Beyond the course description, the only other information established were the course objectives she had developed after reviewing textbooks with similar themes.
After understanding more about the scope of the work, we established our roles and expectations for our time together, and ended our session by creating a SMART goal that would guide our future work. The expectations for me in the coaching role were clear: I was to facilitate the assessment- and course-calendar-development process, keep track of our progress toward achieving our goal, and provide key resources needed to complete the work. My peer would then complete all other work necessary to continue to the next phase.
As part of this first phase of coaching, I also met with my direct supervisor to share the above information and ensure that our work aligned with departmental goals. Interestingly, this discussion coincided with a revamp of the departmental goals unrelated to this project. Later in the quarter, technology incorporation and digital citizenship were included as new goals. With this new vision, our coaching work aligned with our departmental values. Our supervisor was very encouraging and supportive, and wanted feedback on the results of our collaboration at the end of the process.
Second Meeting. Prior to our second meeting, I began reflecting on SK's goals and our previous conversations. Given that the course objectives were already established, I wondered if the "Backward Design" model would be a good starting point for our work. I shared this intention with my peer via email, along with resources on "Backward Design." During our second meeting, we took a closer look at the established course objectives and began identifying the thinking skills that would satisfy each objective. We soon discovered that one objective in particular required both lower-order and higher-order thinking skills to complete successfully. SK expressed a desire to use this objective as our starting point since it was the largest and most complicated. We agreed to develop a unit around this objective that would then serve as a model for the subsequent objectives/units.
Third Meeting. At the end of our second meeting, SK expressed a concern about her choice of text, wondering if it was the best option available. I suggested using multiple sources that would be updated more frequently, including websites, journal articles, and open-source textbooks. I promised to provide a few databases of open-source materials so SK could review them prior to our third meeting.
SK made good use of the databases and had established a rough draft of the course calendar. In the calendar, she separated big topics into one-week units along with associated learning outcomes for each unit. For the big unit we had decided to focus on, SK developed a three-week timeline with associated reading assignments and engagement activities. For the remainder of our meeting, we discussed the engagement activities at length, focusing on any potential technology integration that would allow for collaboration.
Fourth Meeting. By this time, we had already met our SMART goal. Prior to the meeting, I used our loosely defined notion of engagement (including active learning, collaboration, and participation) and made notes on the unit's learning activities for future consideration. These suggestions mainly addressed prior concerns about isolation in traditional blended classrooms. We went through these suggestions together. My peer then expressed a desire to pause our work for the time being, as she was happy with our progress and wanted time to reflect upon the ideas explored in this last meeting.
Feedback and Reflections
At the end of our peer coaching relationship, SK provided positive feedback on our progress. She was happy that we were able to remain on task and meet our SMART goal within our allotted time despite very busy schedules. She appreciated being able to ask for suggestions and bounce ideas off each other. Talking through ideas was helpful for understanding how each component could be made more engaging in an online setting. Despite our momentum in organizing the blended classroom, SK noted that she will be taking sabbatical, making our last meeting an excellent stopping point.
Taking from an outside perspective, one of my colleagues, LB, reviewed the progress outlined above and agreed to provide feedback. LB’s comments and reactions to the project were positive and focused on three aspects:
1) Coaching relationship: she noted that the relationship my peer and I had worked well to help us achieve our goals. Establishing clear expectations early on ensured the accountability my peer wanted in order to gain a head start in course development.
2) Unit organization: though my peer and I didn't plan and evaluate a lesson plan, which was the original scope of this project, LB commented on the process of developing the unit. She noted that the assessment components of our chosen unit appeared fun, engaging, and meaningful for students.
3) Coaching skills: LB and I shared experiences during this project. LB commented that I applied my coaching skills well. While I think my past experience partially explains this, I also think my success is rooted in the fact that my peer is an experienced collaborator and understood what a collaborative partnership should look like.
Things that went well. Taking LB's comments into consideration and reflecting back on my performance, I had an overall positive experience. My peer and I were very appreciative of one another's efforts toward the progression of our project. We stood by our established expectations and fulfilled our roles accordingly. One aspect that surprised me a little was that my peer saw me as a subject matter expert and expected that type of coaching style. Interestingly, I did not see myself as the "expert," opting instead for a more collaborative coaching style. In the end, my role/style morphed into a little of both. One delightful discovery my peer and I made through our brainstorming and collaborative efforts was a creative way to use Pinterest as a visual timeline for a major project, each of us playing to our strengths. By offering what knowledge I had about existing technologies, along with plenty of options and suggestions for their use, I enabled my peer to choose the option that was right for the course, or the one she felt most comfortable exploring.
In addition to responding well to my peer's expectations, another strength of this project was our communication style. Because SK and I had worked together previously, we had already established rapport and understood each other's working styles. SK knew that her preferences would be honored throughout this process and her decisions supported, because she was encouraged to express herself openly and honestly. Most of our communication was face-to-face, which only strengthened it; email was limited to follow-ups. These follow-ups helped ensure accountability by both parties. Each email would review past conversations, action items to be completed before the next meeting, and any resolutions to concerns, such as the open-source databases.
On a curious note, SK felt very motivated to complete her part in a timely manner because she was very respectful of the fact that this was an assignment for me and she didn’t want to “mess up” my project.
Things that could have been improved. LB mentioned several times that she enjoyed the layout and organization of the assignments prepared for the big unit, calling it a strong feature of the project. However, I cannot take credit for the organization, as my peer completed this work. SK knew what she wanted, and I served as a resource to help her reach that goal. Because of this, I feel that I didn't really do anything aside from offering options and opinions on the information my peer brought forth. I must recognize, however, that this is what my peer wanted, and in this particular coaching scenario, it worked well. In the future, I would also like to improve my communication skills to be more in line with the prescribed communication methods learned in this course. Should I collaborate with a peer who isn't as clear about what they want, probing and clarifying questioning skills will prove crucial to success.
While the topics of our meetings were loosely set in advance, I never created agendas or had particular topics to review aside from the backward design model. Keeping the meetings loose did allow for more open-ended exploration of our goals, but I wonder what the outcome could have been had I better defined our meetings. Again, this style worked well for this particular coaching scenario, but I'd like to keep this idea in mind for a future coaching partner who perhaps needs more structure or guidance.
Thoughts on coaching for the future. I would love to incorporate a coaching culture in my department. Working with SK was not only an opportunity to help her gain ideas and resources for her new class, but it was also an opportunity to get to know one another in a different environment. Our collaboration was meaningful and fruitful.
Though we currently do not have a one-on-one coaching program in my department, we do have classroom observations as one of our required professional development strategies. Therefore, the basic idea and structure are already in place. I'd like to expand upon that work to create a more constructive professional development environment where professors move away from working in isolation toward working in collaboration. I've already begun exploring coaching culture in a previous blog post available here. Moving forward, I would need department input and an assessment of current thoughts and attitudes toward peer coaching. Should the department approve, more meaningful and fruitful interactions would allow 21st century skills to thrive in our courses.
What happens when you allow two people with seemingly different backgrounds to work together? Great collaboration! This is true of a program co-sponsored by the Center for Educational Equity and Big Brother/Big Sister that paired girls aged 9-14 with adult women to learn about computers. The little and big sisters would meet to solve computer problems through a software program called SISCOM (Wolman, 1986). Together they would dive deep into discussion, take turns leading and learning, and help each other problem-solve through a process that provided 20 hours of computer basics instruction (Wolman, 1986). Not only did the pairs work together to solve their shared problem, but the institutions also worked together to provide the necessary resources. This story highlights the successes of co-learning.
Traditional learning environments are generally set up to rely on one "expert," or teacher, to lead, with the remaining participants as the learners. The teacher chooses what material to cover and to what extent the participants engage with the material. While this system works on the surface, one of its major problems is that the teacher and students do not interact: "…when teachers and students do not interact successfully, contradictions occur" (Tobin & Roth, 2005). This leads to negative emotions that can manifest as disinterest, disappointment, and frustration for the students, and job dissatisfaction for the teachers (Tobin & Roth, 2005). According to Rheingold, one of the appeals of co-learning is that it levels out the hierarchy of the classroom. When Rheingold engages in co-learning, he has everyone sit in a circle, because then everyone is visible and everyone has an equal voice (Rheingold, 2018). Co-learning assumes that the teacher is neither the gatekeeper nor the expert in all subjects, and that all participants have something valuable to share and teach about a given concept. Just as in the Big Brother/Big Sister example above, neither the little nor the big sister had an advantage in the learning and teaching of the SISCOM program. Both partners took equal interest in, and placed equal value on, what the other knew, shared, and did. The flattened hierarchy increased motivation, engagement, and excitement about learning/teaching, thereby improving learning outcomes and attitudes toward learning (Tobin, 2014).
A close companion of co-learning is co-teaching. While co-learning gives all participants an equal voice in learning together, co-teaching takes this a step further by inviting participants to also engage in all phases of the teaching process (Tobin & Roth, 2005). When implemented, co-teaching occurs between two or more teachers, where one teacher may take on a mentor role. The most important factor in co-teaching is that it is not a mere division of tasks; rather, teachers participate in the creation of all tasks. Because some of the learning that occurs is subconscious, following through on the full process of co-teaching is important (Tobin & Roth, 2005).
I’d also like to make a small mention of cogenerative dialogues. Tobin defines cogenerative dialogues as a side component of co-teaching, though they may also be used separately. Cogenerative dialogues involve small groups of about five individuals representing stakeholders (or demographics) who discuss specific incidents in class, including reflection on lessons (Tobin, 2014). Initially, these discussions can explore what works and what doesn’t in class lessons, but they can also be expanded to the roles of students/teachers, classroom rules, and how to use resources (Tobin, 2014). The benefit of these independent discussions is that all views and understandings are valued and all explanations are co-generated. They help ease communication across cultural and socioeconomic boundaries by identifying (and acting upon) contradictions, in turn improving the quality of teaching and learning (Tobin & Roth, 2005).
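One practical question when running cogenerative dialogues is how to form those roughly five-person groups so that each one mixes stakeholder demographics. The sketch below shows one simple way to do it; the names and demographic labels are made up, and the round-robin "dealing" approach is my own illustration, not a method from Tobin or Roth.

```python
from itertools import cycle

# Hypothetical roster of (name, stakeholder demographic) pairs.
roster = [
    ("Ana", "first-year"), ("Ben", "first-year"), ("Cai", "transfer"),
    ("Dee", "transfer"), ("Eli", "commuter"), ("Fay", "commuter"),
    ("Gus", "resident"), ("Hoa", "resident"), ("Ivy", "instructor"),
    ("Jon", "instructor"),
]

def diverse_groups(roster, size=5):
    """Deal participants round-robin by demographic so groups stay mixed."""
    # Sort by demographic, then deal like cards: consecutive members of
    # the same demographic land in different groups.
    ordered = sorted(roster, key=lambda p: p[1])
    n_groups = max(1, len(ordered) // size)
    groups = [[] for _ in range(n_groups)]
    for person, slot in zip(ordered, cycle(range(n_groups))):
        groups[slot].append(person)
    return groups

groups = diverse_groups(roster, size=5)
```

With this ten-person roster, the deal produces two groups of five, each containing one person from every demographic, which matches the goal of having each dialogue group represent all stakeholders.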
Despite the benefits of co-learning, several barriers should be addressed. Rheingold hypothesizes that teachers may be averse to adopting co-learning because of the high level of trial and error that goes along with it (Rheingold, 2018). Teachers must give up a certain level of control and understand that outcomes will vary from classroom to classroom. While Rheingold is sympathetic to these barriers, he argues that trial and error also offers real-time modeling of problem solving and troubleshooting. The key is to show students how to reflect upon a problem, re-examine it, and adjust to the situation as necessary (Rheingold, 2018).
Co-learning with a tech twist. The ISTE standard for educators (4b in particular) indicates that teachers "collaborate and co-learn with students to discover and use new digital resources and diagnose and troubleshoot technology issues" (ISTE, 2017). In short, the standard places importance on the principles of co-learning addressed by Tobin and Roth, in addition to the modeling Rheingold stresses as a key factor of co-learning, by focusing on how technology can foster collaboration while improving troubleshooting skills. I had a particular problem in mind when I chose to explore this component of ISTE standard 4. In my human nutrition class, students conduct a dietary analysis of their own diet. The main feature of this assignment is that students must accurately track their intake over the course of three days, input the data into an analysis program, and then analyze the findings in comparison to the Dietary Guidelines for Americans. The analysis program I had selected for this assignment, SuperTracker (https://www.supertracker.usda.gov/), will be discontinued at the end of this academic year for undisclosed reasons. While the program was not without its faults, I supported the use of SuperTracker because it is a free program easily accessible to anyone with internet access, and it relies on the USDA database, an accurate and reliable set of nutrition data. I am now facing the challenge of reviewing apps and websites for SuperTracker’s replacement. However, the assignment would take on a whole new meaning for students if they were allowed to co-learn from the start to the finish of this project. For this project idea to be successful, it is important to consider how nutrition-related apps can be leveraged to facilitate co-learning among students and professors regarding modes of nutrition education.
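As a rough sketch of the computation the dietary analysis assignment asks students to perform, the snippet below averages three days of intake and compares each nutrient to a target, treating sodium as an upper limit. All nutrient values and targets here are illustrative placeholders, not actual Dietary Guidelines figures or SuperTracker output.

```python
# Three tracked days of (made-up) intake, one dict per day.
daily_intake = [
    {"fiber_g": 22, "sodium_mg": 2600, "protein_g": 60},
    {"fiber_g": 18, "sodium_mg": 3100, "protein_g": 55},
    {"fiber_g": 25, "sodium_mg": 2400, "protein_g": 70},
]

# Placeholder targets; real values come from the Dietary Guidelines.
targets = {"fiber_g": 28, "sodium_mg": 2300, "protein_g": 50}
limit_nutrients = {"sodium_mg"}  # targets that are upper limits, not minimums

def analyze(days, targets):
    """Compare average intake over the tracked days to each target."""
    report = {}
    for nutrient, target in targets.items():
        avg = sum(d[nutrient] for d in days) / len(days)
        # An upper-limit nutrient is "met" by staying at or below target;
        # everything else is "met" by reaching the target.
        met = avg <= target if nutrient in limit_nutrients else avg >= target
        report[nutrient] = {"average": round(avg, 1), "target": target, "met": met}
    return report

report = analyze(daily_intake, targets)
```

This is the core of what any SuperTracker replacement needs to do, which is also a useful lens when students and professors co-evaluate candidate apps: does the app track intake, average it, and compare it against guideline targets?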
Addressing the ISTE Standard. As I started my search of nutrition-related apps and their feasibility for co-learning, I determined that the credibility of app information should be a top priority. One of the challenges my students face is finding credible information to further their understanding. For as long as I’ve been a professor, we’ve looked at articles and websites and discussed the importance of reviewing them for credibility. However, information is now found in a variety of mediums not limited to digital articles. Students are now using apps, videos, and other multimedia to gather information. Understanding where each medium sourced its information is key to determining credibility. By examining and evaluating credibility for each app, all members involved in the use of the app would participate in troubleshooting and problem solving, a key component of the ISTE standard.
The sheer number of nutrition apps is staggering, so I decided to narrow my search by starting with a credible source that provides a curated list: the Apps Review section of Food and Nutrition Magazine. Food and Nutrition Magazine is a publication of the Academy of Nutrition and Dietetics (AND). Where AND publishes research through the Journal of the Academy of Nutrition and Dietetics, the magazine is often viewed as the "lighter," more "practical" side of the dietetics world. Food and Nutrition Magazine features new products, recipes, and research highlights: in short, ways to keep updated in the food and nutrition world. The curated list of apps (https://foodandnutrition.org/tag/apps/) contains reviews of new and upcoming apps by the editors. Those deemed reliable, credible, and useful make the app list. The apps featured on the list explore a variety of nutrition topics with a nutrition education focus, including food safety, physical activity, dining out, and meal planning, in addition to apps that professionals may use in a variety of capacities, such as video recording.
The list could serve as a good starting point for facilitating co-learning in the human nutrition dietary analysis project. Having students explore these apps further in pairs (or small groups of three), in relation to the assignment parameters, can help facilitate collaboration and co-learning. Adding a presentation element where these pairs teach the class about the usability of their chosen app may invoke the principles of co-learning. Finally, placing students in small, diverse groups and allowing them to reflect on the assignment makes their viewpoints heard as they embark on cogenerative dialogues.
While I initially had my sights set on this curated list for my human nutrition class, some of these apps may help facilitate student-professor collaboration, while others foster practitioner-patient collaboration, making it very feasible to implement this list in other co-learning scenarios. When both parties are able to contribute to how and why an app is used for various purposes, co-learning is maximized.
ISTE. (2017). ISTE standards for educators. Available at: https://www.iste.org/standards/for-educators
Rheingold, H. (2018). Co-learning: Modeling cooperative-collaborative learning [blog]. Available at: https://dmlcentral.net/co-learning-modeling-cooperative-collaborative-learning/
Tobin, K. (2014). Twenty questions about cogenerative dialogues. In K. Tobin & A. Shady (Eds.), Transforming urban education: Collaborating to produce success in science, mathematics and technology education (pp. 181-190). Sense Publishers. DOI: 10.1007/978-94-6209-563-2_11
Tobin, K., & Roth, W. M. (2005). Implementing coteaching and cogenerative dialoguing in urban science education. School Science and Mathematics, 105(5), 313-321.
Wolman, J. (1986). Co-learning about computers. Educational Leadership, 43(6), 42.
Who says playing video games doesn't teach you anything? Playing and creating games could actually help students develop another 21st century skill: computational thinking (CT). Computational thinking is a form of problem solving that takes large, complex problems, breaks them down into smaller problems, and uses technology to help derive solutions. In deriving solutions, students engage in a systematic form of problem solving that involves four steps: 1) "decomposition," where a complex problem is broken down into smaller, more manageable problems; 2) "pattern recognition," or making predictions by finding similarities and differences between the broken-down components; 3) "abstraction," developing general principles for the patterns that emerge; and 4) "algorithm design," creating step-by-step instructions to solve not only this problem but other similar problems in the future (Google School, 2016). By engaging in computational thinking, "students develop and employ strategies for understanding and solving problems in ways that leverage the power of technological methods to develop and test solutions" (ISTE, 2017). In other words, the key to successfully following this process is that students develop their own models rather than simply apply existing models (Google School, 2016).
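To make those four steps concrete, here is a toy Python sketch of my own (the gradebook scenario and all names are illustrative, not from any of the cited sources). It walks one small problem through decomposition, pattern recognition, abstraction, and algorithm design:

```python
# A toy walk-through of the four computational thinking steps, using a
# hypothetical gradebook question: "Which student improved the most
# between the first and last quiz?"

grades = {
    "Ana":   [70, 75, 88],
    "Ben":   [90, 85, 86],
    "Chris": [60, 72, 91],
}

# 1) Decomposition: split the big question into smaller ones --
#    "what is one student's improvement?" and "whose is largest?"
def improvement(scores):
    return scores[-1] - scores[0]

# 2) Pattern recognition: every record has the same shape (a list of
#    scores), so one rule works for all students.
# 3) Abstraction: 'improvement' ignores details we don't need (names,
#    middle quizzes) and keeps only the first and last scores.

# 4) Algorithm design: step-by-step instructions that answer this
#    question for any gradebook shaped the same way.
def most_improved(gradebook):
    return max(gradebook, key=lambda name: improvement(gradebook[name]))

print(most_improved(grades))  # Chris (improved 31 points)
```

The payoff of step 4 is reuse: the same `most_improved` procedure answers the question for next semester's gradebook without any new problem solving, which is the "solve other similar problems in the future" idea in the Google School framing.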
In researching ways to apply computational thinking in the classroom, I ran across scholarly articles discussing the gamified classroom. I have always been intrigued by this concept; in my own experience, students are much more engaged during class time when the required content is converted into a game. During these game sessions, my role changes from the person delivering the content to the person delivering the game (i.e., asking the questions). The students are responsible for providing the content by offering solutions to the posed questions, thereby evoking problem-solving skills and, in some cases, critical thinking skills. This idea-thread then led me to ask: what are some ways that a "gamified" classroom can help develop computational thinking?
To help answer my question, I came across two articles that pinpointed models in game-design to build computational thinking:
Yang and Chang explore how students can increase their motivation for learning when they are allowed to design their own game on a given topic. During the game design process, significant problem-solving occurs because of the interaction and the immediate feedback the process entails. In addition, students gain higher-order thinking skills such as creativity and critical thinking. The authors mention three game-building tools that do not require extensive coding skills: RPG Maker, Game Maker, and Scratch. In their study, the researchers investigated the effects of the game design process on seventh-grade biology students using either Flash animation (digital flash cards) or RPG Maker. The investigated effects included concentration, critical thinking, and academic performance. Their results demonstrated that the group using RPG Maker showed significant improvements in critical thinking and academic performance, while no significant difference in concentration was noted between the groups.
Kazimoglu et al. begin their inquiry by providing a few definitions. It is important to understand their terminology, mainly that any game used for educational purposes is a "serious" game. They acknowledge that several definitions of computational thinking exist, so they create their own definition, which requires the following elements: 1) conditional logic (true vs. false conditions); 2) building algorithms (step-by-step instructions); 3) debugging (resolving issues with the instructions); 4) simulation (modeling); and 5) distributed computation (social sharing). The authors take on the challenge of creating a non-threatening introductory programming unit to combat the common student perception that programming is "difficult." Kazimoglu et al. believe that when students are allowed to engage in game design, they are motivated to learn, which provokes problem solving. They take this approach in their introductory programming class, where they challenge students through a series of exercises using the Robocode platform. At the end of the study, all students successfully completed the exercises, engaging their problem-solving skills.
Conclusions. Interestingly, both of these articles struggle to precisely define "computational thinking," and both mention that research investigating the extent to which games can develop CT is lacking. What both can agree on, however, is that CT is best developed when students are the game designers. To that end, both studies involved elements of programming instruction to help students successfully build their games.
While these articles offer models for successfully implementing computational thinking through game design and creation, it was a little disheartening to discover that programming instruction was a necessary component. My inclination was to ask how these processes could be implemented and/or adapted in other classroom scenarios, particularly when programming instruction may not be feasible. Interestingly, not all researchers agree that programming must be involved in successful CT implementation. Voogt et al. argue that although most research on CT involves programming, CT is a thinking skill and therefore does not require programming to be successfully implemented (Voogt et al., 2015). In fact, a literature review conducted by Voogt demonstrated that students do not automatically transfer CT skills to a non-programming context when instruction focuses on programming alone. The strongest indicator of CT mastery was heavily dependent on instructional practices that focus on application (Voogt et al., 2015).
The lack of a standard definition of computational thinking also needs to be addressed. The two articles above and the Voogt researchers agree that discrepancies exist among current definitions of computational thinking. To avoid confusion regarding the role of programming and other such technologies, computational thinking can be simply defined as a way of processing information and tasks to solve complex problems (Voogt et al., 2015). It is a way to look at similarities and relationships within a problem and follow a systematic process to reach a solution. Figure 1.2 summarizes this simplified process.
In this new context, it is not necessary to program games in order for students to build computational thinking; allowing students to participate in systematic artifact creation will do the trick. Examples of artifact creation, with or without programming, include remixing music, generating animations, developing websites, and writing programs. The main idea of the artifact creation process is that students follow procedures that can be applied to similar problems. Figure 1.3 highlights this artifact creation process.
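The transferable-procedure idea can itself be sketched in a few lines of Python (my own illustration; the playlist scenario and every name in it are hypothetical, not drawn from the cited sources). The point is that the student's artifact is the procedure, and the same procedure solves a family of similar problems:

```python
# A minimal sketch of artifact creation as a reusable procedure:
# the student designs one step-by-step process (filter, then order)
# and applies it unchanged to different "problems."

def make_collection(items, keep_rule):
    """Generic two-step procedure: keep the items that satisfy the
    rule, then put them in alphabetical order."""
    kept = [item for item in items if keep_rule(item)]
    return sorted(kept)

songs = ["Blue", "Echo", "Axis"]
books = ["Dune", "Emma", "Iliad"]

# The same artifact (procedure) answers two different questions:
print(make_collection(songs, lambda t: len(t) == 4))   # a playlist
print(make_collection(books, lambda t: t[0] < "F"))    # a reading list
```

Whether the artifact is a board game rulebook or a script like this one, what the student carries forward is the procedure itself, which is the computational-thinking payoff the simplified definition above points to.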
How can this artifact creation process be used to create a gamified classroom? To help me explore this issue, one of my colleagues suggested allowing students to develop and design their own board game. While the solution seems low-tech, others agree with this strategy. Michele Haiken, who writes on educational leadership for ISTE, describes adapting "old school" games for the classroom to help develop critical thinking and problem-solving skills (Haiken, 2017). Students can even create an online "quest," a scavenger hunt, or a "boss event" to problem-solve computationally (Haiken, 2017). For more tech-y solutions, existing platforms and/or games such as GradeCraft and 3DGameLab can be used to apply computational thinking in a gamified classroom (Kolb, 2015). Regardless of the method used, low-tech board games or high-tech game creation through programming, allowing students to participate in the artifact creation process helps them build computational skills they can then apply to other complex problems to create their own models.
Google School. (2016). What is computational thinking? [YouTube video]. Retrieved from: https://www.youtube.com/watch?v=GJKzkVZcozc&feature=youtu.be
Haiken, M. (2017). 5 ways to gamify your classroom. Retrieved from: https://www.iste.org/explore/articledetail?articleid=884
International Society for Technology in Education. (2017). The ISTE standards for students. Retrieved from: https://www.iste.org/standards/for-students
Kazimoglu, C., et al. (2012). A serious game for developing computational thinking and learning introductory computer programming. Procedia - Social and Behavioral Sciences, 47, 1991-1999.
Kolb, L. (2015). Epic fail or win? Gamifying learning in my classroom. Retrieved from: https://www.edutopia.org/blog/epic-fail-win-gamifying-learning-liz-kolb
Voogt, J., et al. (2015). Computational thinking in compulsory education: Toward an agenda for research and practice. Education and Information Technologies, 20(4), 715-728.
Yang, Y. C., & Chang, C. (2013). Empowering students through digital game authorship: Enhancing concentration, critical thinking, and academic achievement. Computers & Education, 68, 334-344.