Peer Coaching Focus: For Teacher or Student Outcomes?

Educators are facing an ever-changing professional landscape. As society evolves through the 21st century, the needs of various industries change, requiring different skills. Teachers are challenged to improve and update their skills, knowledge, and actions to match those needs (Ma, Xin, & Du, 2018). Teachers cannot keep up on their own. “New curriculum, standards, resources/materials, assessments, methodologies, technology, and reforms will not and do not have much impact unless teachers have appropriate access, knowledge, skills and continuous learning opportunities. Teachers require time for reflection, mentoring relationships, collegial interaction, expert role models, and ongoing professional development for any of these changes to be effective” (Becker, 2014). As Becker alludes, the format of professional development is important in providing educators the tools they need to make the changes necessary for successful student impact. To maximize success, professional development is moving away from theory-only, lecture-based models toward more effective personalized learning models such as peer coaching. Studies show that educators participating in peer coaching practice and adopt new strategies more readily, retain and increase skills over time, and are better able to explain teaching and learning models than un-coached educators (Joyce & Showers, 2002). Statistics back these findings: only five percent of educators transfer new skills into practice as a result of theory alone, whereas ninety percent do so when theory is combined with demonstration, practice within training, feedback, and coaching (Becker, 2014).

The sixth ISTE standard for coaches encourages this peer coaching model by recommending an “engage[ment] in continual learning to deepen content and pedagogical knowledge in technology integration and current and emerging technologies necessary to effectively implement the ISTE Student and Educator standards” (ISTE, 2017). If peer coaching is to be done correctly, should the coaching focus on teacher outcomes or student outcomes? This inquiry comes from my reflections on the skills and strategies used in successful coaching, which are mainly teacher-focused. Given that the learner audience would be a peer, coaching efforts logically should be focused on meeting that peer’s needs. My hypothesis is that meeting these needs would automatically translate into increased learning outcomes for students through improved instructional methods. However, in a current peer coaching relationship, we are heavily focused on student learning outcomes rather than on the peer’s needs. Are my peer’s needs being met through meeting the student learning outcomes, or should one be given priority over the other? Below are the results of my investigation, offering both sides of the argument, from which I draw my conclusions at the end.

Evidence for Teacher-Focused Peer Coaching.

There is evidence to support that peer coaching has a marked effect on professional improvement and classroom implementation. A research study conducted in China looked at the impact of peer coaching on professional development, learning, and the application of that learning in instructional design, investigating the problem that teachers who had knowledge of certain pedagogies were unable to apply them in the classroom. Twenty peers were coached and evaluated through performance rubrics and teaching videos. The results suggest that personalized approaches such as peer coaching increased learning participation, which improved in-depth learning. In addition, participants were more effective in applying content than with traditional methods (Ma, Xin, & Du, 2018). This study makes the case for keeping the peer coaching focus on instructors for improved teaching outcomes.

Several studies have concluded that peer coaching is effective not only for teaching modalities but also for personal development. Undergoing the peer coaching process can help teachers become more reflective about their work and therefore better able to identify their own professional development needs (Soisangwarn & Wongwanich, 2014). Ma, Xin, and Du found similar results in their study: by sharing and offering suggestions to other teachers, the peers became more reflective about their own work (Ma, Xin, & Du, 2018). By becoming more reflective, they build emotional intelligence and self-awareness.

Lastly, effective peer coaching can also increase the self-efficacy of teachers. Researchers investigated the effect of peer coaching on the instructional knowledge and self-efficacy of student teachers in a TEFL (teaching English as a foreign language) program. The results indicated increased self-confidence, as the student teachers expressed freedom to ask questions and voice their opinions. Undergoing the process of peer coaching also allowed the student teachers to become self-directed learners, which built self-efficacy (Goker, 2006). The above evidence supports teacher-focused peer coaching because the intent of coaching is to serve as professional development, helping the peer, not the students, improve in both personal and professional skill development.

Evidence for Student-Outcome Focused Peer Coaching.

The evidence for student-outcome focused peer coaching is driven by results. Researchers Joyce and Showers argue that learning how to learn is equally as important as acquiring skills and knowledge for classroom application (Joyce & Showers, 2002). Interestingly, Joyce and Showers make the case that teachers should be treated like students when approaching professional development through peer coaching. They state that for peer coaching to be successful, the pair needs to identify the learning outcomes and select the training component most adequate for successfully achieving those outcomes (Joyce & Showers, 2002). This approach to peer coaching puts the student outcome first by treating the peer as a student and following a similar approach to learning outcomes.

Researchers Scott and Miner explore peer coaching solely for the purpose of improving student outcomes in higher education. They argue that peer coaching is rarely used in higher education due to environmental and cultural factors: professors are mostly autonomous, peer coaching can be time-consuming, and its outcomes are not tied to tenure or other evaluation efforts (Scott & Miner, 2008). However, when peer coaching focused on improving student outcomes, other evaluation measures, such as course evaluations, also improved (Scott & Miner, 2008). This makes the case for incorporating more peer coaching and feedback, since the predominant feedback mechanism in higher education, the course evaluation, typically lacks enough information for true improvement to occur.

Infographic on benefits of teacher vs. student outcome focused peer coaching.
Figure 1.1 Summary of Teacher vs. Student-Outcome Focused Peer Coaching.

Conclusion

The matter of teacher- versus student-outcome driven peer coaching is not an easy debate to settle. Most authors evaluated in this review provided a two-pronged view of coaching, looking at the benefits on both sides. Joyce and Showers concluded their study by explaining that when teachers learn how to learn, and consistently use newly acquired skills and strategies well in the classroom, a critical point is reached that impacts students’ development (Joyce & Showers, 2002). Becker agrees: peer coaching can accomplish improved outcomes for both the teacher and the student when implemented in the right capacity, including organizational support (Becker, 2014). These sentiments are mirrored by several other authors and researchers as well. Pam Robbins, author of “Peer Coaching to Enrich Professional Practice, School Culture, and Student Learning,” explains that there are many uses and purposes for peer coaching, from understanding diversity in the classroom to implementing new technologies or improving learning outcomes. Peer coaching is poised to help teachers face many challenges in the classroom and promotes new opportunities (Robbins, 2015). Given all of the above evidence, it can be concluded that peer coaching should focus on both teacher and student outcomes. When done well, both teachers and students benefit.

References

Becker, J.M. (2014). Peer coaching for improvement of teaching and learning [pdf]. Available from: http://radforward.com/blog/wp-content/uploads/2014/01/peer_coach_article.pdf.

Goker, S.D. (2006). Impact of peer coaching on self-efficacy and instructional skills in TEFL teacher education. System, 34, 239-254.

ISTE. (2017) ISTE standards for coaches. Available from: http://www.iste.org/standards/for-coaches

Joyce, B., Showers, B. (2002). Student achievement through staff development [pdf]. Available from: https://www.unrwa.org/sites/default/files/joyce_and_showers_coaching_as_cpd.pdf

Ma, N., Xin, S., & Du, J. Y. (2018). A peer coaching-based professional development approach to improving the learning participation and learning design skills of in-service teachers. Educational Technology & Society, 21 (2), 291–304.

Robbins, P. (2015). Chapter 1: Establishing the need for peer coaching. In: Peer Coaching to Enrich Professional Practice, School Culture, and Student Learning [e-book]. Available from: http://www.ascd.org/publications/books/115014/chapters/Establishing-the-Need-for-Peer-Coaching.aspx

Scott, V., Miner, C. (2008). Peer coaching: Implication for teaching and program improvement [pdf]. Available from: http://www.kpu.ca/sites/default/files/Teaching%20and%20Learning/TD.1.3_Scott%26Miner_Peer_Coaching.pdf

Soisangwarn, A., Wongwanich, S. (2014). Promoting the reflective teacher through peer coaching to improve teaching skills. Procedia – Social and Behavioral Sciences. 116: 2504 – 2511. Available from:  https://ac.els-cdn.com/S1877042814006181/1-s2.0-S1877042814006181-main.pdf?_tid=aa5bc8ae-6473-42f0-a7e3-a561b25b9b8a&acdnat=1541369407_8987477626b3f7a71d8baf9789f13d8f


Managing Common Coaching Miscommunication

If the foundation of effective peer coaching is collaboration, good communication is one of its pillars. Mark Ladin, CMO of the IT company TigerConnect, shares this mindset by defining communication and collaboration as one and the same. He argues that both function on the exchange of information; however, without good communication, you cannot have a functioning collaborative relationship that yields productive results (Ladin, 2015). Therefore, eliminating miscommunication in partnerships promotes good collaboration (Lohrey, n.d.). Collaborative communication offers many benefits, including creating flexible work environments that promote trust and familiarity, enhancing decision-making by tackling problems from various angles, and increasing overall satisfaction with the collaboration process (Lohrey, n.d.).

The ISTE Coaching Standard (1D) calls for coaches to implement strategies for initiating and sustaining technology innovations and to manage the change process in schools and classrooms (ISTE, 2017). A peer can feel comfortable enough to implement suggested strategies when good communication between the collaborating peers is established. If good communication is central to collaboration, what miscommunication is common during peer coaching, and what are some strategies to avoid it? This question does not readily yield concrete results for peer coaching alone; rather, several explanations for miscommunication emerge, including modes of communication, a variety of communication barriers, and the type of information given.

Modes of communication.

While the mode of communication may not be the first thing that comes to mind when considering miscommunication, the impact communication delivery has on comprehension is compelling. According to Willy Steiner, an executive career coach, communication effectiveness and information efficiency differ when information is offered face-to-face, by telephone, or by email (Steiner, 2014). Steiner argues that face-to-face communication offers the best information efficiency (i.e., it is best understood), while email is the most effective (i.e., quickest). This is further compounded by factoring in three types of communication: visual, verbal, and non-verbal. Face-to-face communication allows for better understanding across all three types, though it is the slowest mode. Email is the quickest mode but tends to promote higher levels of misunderstanding in verbal and visual communication and does not allow for any interpretation of non-verbal communication (Steiner, 2014). A research study on adult learners using information communication technology found similar results. The aim of the study was to determine what type of information communication technology would best support virtual coaching. The results found that email was useful for the exchange of information but lacked the ability to create authentic communication experiences or relationships, and often led to more miscommunication (Ladyshewsky & Pettapiece, n.d.). Telephone use was more effective than emailing because phone calls offered more verbal cues, while video-conferencing (mimicking face-to-face communication) was just as efficient as face-to-face conversation when technical issues were not present (Ladyshewsky & Pettapiece, n.d.). As a result, communication comprehension is a major consideration for avoiding miscommunication. When possible, face-to-face or similar communication modes should be used to build relationships and deliver the greatest amount of understanding, while email should be limited to information transfer only.

Communication barriers.

Research shows that face-to-face communication best maximizes understanding and relationship building in a collaborative partnership. However, even in face-to-face environments, several barriers may create inadvertent miscommunication. According to The Coaching Room, there are seven potential barriers that may lead to ineffective coaching, summarized in figure 1.1 below.

Infographic highlighting seven barriers to good communication.
Figure 1.1 Seven Barriers to Good Communication.

Considering that many of these barriers involve understanding and respecting the coaching peer, developing a good collaborative relationship before working on the mutual project is essential for avoiding miscommunication.

Information miscommunication.

Peer coaching invites the coach to step into a leadership position in which the goal is to collaborate with and facilitate the work of a peer toward a mutual goal. Another area of potential miscommunication stems from how the coach presents information to the peer. Figure 1.2 below lists the various information communication errors that may arise in leadership.

Infographic on common communication mistakes
Figure 1.2 Common Communication Mistakes

It is important to consider not only how communication is performed but also what is being communicated. The Forbes Coaches Council expands on the communication errors in Figure 1.2 with a focus on information clarity. Miscommunication can occur when the message is not individualized or personal (Forbes, 2018). Using the same strategies, communication techniques, and information with every coaching peer can harm the coaching relationship. A common miscommunication is the use of vague, generic language or messages, leading to a lack of clarity in direction. The peer is left feeling that they are missing important information or that the information provided was not delivered effectively (Forbes, 2018). To help eliminate this lack of direction, clear expectations developed by both parties can promote the shared vision, contributing to better collaboration. The peer leader should also avoid communicating only negative outcomes, instead including positive outcomes, to avoid creating an image that the shared work is not successful (Forbes, 2018). Lastly, it is crucial that coaches recognize their bias and remember that the process is not about their wants but about the needs of the peer being coached. Business coach Tony Alessandra said it best: “You can choose to connect with others from their perspective, the way they want to be communicated with, by modifying your own presentation style; or you can choose to meet only your own needs – facing the consequences of misconnecting with others…” (Alessandra, 2015).

Promoting good communication. Several of the communication barriers addressed above stem from how communication is delivered, what information is delivered, and how each party perceives that information. Good communication is established when both parties feel safe and comfortable and trust one another in their collaborative environment. Both hold responsibility for bringing an open mind to the process and committing to relationship building. Only after good communication occurs between coaching peers can good collaboration exist.

Resources.

Alessandra, T. (2015). Expert advice- How you can prevent miscommunication. Available from:  https://www.fripp.com/expert-advice-how-you-can-prevent-miscommunication/

Forbes Coaches Council. (2018). Common communication mistakes to avoid as a board of directors. Available from: https://www.forbes.com/sites/forbescoachescouncil/2018/01/18/common-communication-mistakes-to-avoid-as-a-board-of-directors/#6f86f4332b44

ISTE, (2017). ISTE standards for coaches. Available from: https://www.iste.org/standards/for-coaches

Ladin, M. (2015). Communication and collaboration: Why they're one and the same. Available from: https://www.tigerconnect.com/blog/communication-collaboration-theyre-one/

Ladyshewsky, R., Pettapiece, R.G. (n.d.). Exploring adult learners' usage of information communication technology during a virtual peer coaching experience. Available from: https://espace.curtin.edu.au/bitstream/handle/20.500.11937/32326/227280_153211_Jnl_online_learning_full_paper.pdf?sequence=2&isAllowed=y

Lohrey, J. (n.d.) Importance of promoting collaborative communication in the healthcare environment. Available from: https://smallbusiness.chron.com/importance-promoting-collaborative-communications-health-care-environment-79568.html.

Ramsey, P.G.S. (2008). The twenty biggest communication mistakes school leaders make and how to avoid them. Available from: https://www.corwin.com/sites/default/files/upm-binaries/25868_081218_Ramsey_ch1.pdf.

Steiner, W. (2014). Avoiding communication breakdowns. Available from: https://executivecoachingconcepts.com/avoiding-communication-breakdowns/

The Coaching Room. (2016). 7 barriers to effective communication killing your relationships. Available from: https://www.thecoachingroom.com.au/blog/7-barriers-to-effective-communication-killing-your-relationships

Length of Peer Coaching Session for Successful Planning and Implementation

Building 21st century skills is an imperative focus for most educational institutions. Many education articles and blog posts center on techniques and concepts educators can use to develop these skills in their students. Yet what about an educator's own need to build these skills? How can educators learn and gain 21st century skills before teaching and modeling them in the classroom? One proposed method is to provide more professional development, but many researchers now argue that traditional presentation-only professional development sessions leave little room for implementation. An early study conducted by Showers and Joyce found that only ten percent of professional development participants implemented what they learned in the classroom. When educators were allowed to practice what they had learned, implementation increased drastically (Showers & Joyce, 1996).

What Showers and Joyce were researching was the concept of “peer coaching.” Peer coaching is a professional development strategy in which colleagues spend time in a collaborative environment working toward improving standards-based instruction and supporting efforts to build 21st century skills (Foltos, 2013). Peer coaching may take many forms but usually includes a collaborative process in which the teacher leader assists in co-planning activities, models strategies and techniques, and provides observation of teaching and reflection, while avoiding formal evaluation of the peer (Foltos, 2013). Through peer coaching, the collaborating pair begins to build a culture of standards and expectations, increase instructional capacity, support ongoing evaluation, and create a platform for connecting teaching practices to school policies (NSW Department of Education, 2018). Student learning benefits when teachers learn, grow, and change through peer coaching (Showers & Joyce, 1996).

The ISTE standard for coaches defines a peer coach's role as “contribut[ing] to the planning, development, communication, implementation, and evaluation of technology-infused strategic plans at the district and school levels” (ISTE, 2017). Therefore, understanding peer coaching best practices is important to effective coaching. Since the coach's role is to take part in the planning, implementation, and evaluation cycle, I began wondering about effective time spent in coaching sessions with a peer. This wondering stems from my past role as a nutrition counselor. One of the biggest issues that would come up concerned the appropriate length of a counseling session. Medical insurance allowed billing in fifteen-minute increments, though fifteen minutes was hardly enough time for any successful progress to take place. There was dissension among professionals about whether 30 minutes or one hour was more effective. My former employer insisted that every session be a minimum of one hour, which felt appropriate for the first, second, and sometimes even third session, yet felt unnecessarily long after about the fourth session. I often wondered how much information is too much, and how little too little, for a coaching session to be effective. Now, as I step into the role of a technology coach, the same questions enter my mind: what is a reasonable timeframe for peer coaches to fulfill their roles (i.e., how long should a coaching cycle take)?

My questions, it appears, do not have a straightforward answer. A program called “Incredible Years” offers some guidelines on actual numbers and timeframes, citing that one-hour coaching sessions should occur after every two or three teaching lessons, particularly if the educator is new to the program. More experienced educators may meet less often. Despite these very specific guidelines, the program designers state that the guidelines serve as recommendations at best (Incredible Years, n.d.).

Researchers and educational leaders agree that coaching, regardless of its medium, is an individualized process. According to educational leader Les Foltos, peer coaching needs to be personalized to be effective. One of the hallmarks of a good peer coach is making the process manageable for the coaching partner (Foltos, 2013). Time spent on improvement will depend on other time obligations, such as current workload. Rather than focusing on a fixed time minimum, Foltos recommends that the time set aside for coaching be based on the peer's capacity and readiness for improvement (Foltos, 2013). In fact, peer coaching may never have a clear resolution time; rather, it may be a cyclical process. The key to understanding the process length lies in continual reflection and evaluation of the coaching goal(s) (Foltos, 2013).

Foltos isn't the only educational leader to suggest the long-term nature of peer coaching; the NSW Department of Education defines peer coaching as a “long term professional development strategy” (NSW Department of Education, 2018). Like Foltos, the NSW suggests a cyclical nature to peer coaching, as outlined in figure 1.1 below.

Infographic describing the four steps to peer coaching facilitation.
Figure 1.1 Peer Coaching Facilitation

The peer coaching cycle depends on relationship development and trust building that support open, honest communication and comfort with risk-taking. Once these relationships have formed, the coaching process can be ongoing, because professional development needs and goals change. The length may also be determined naturally, as many teachers choose to continue the collaboration even after the initial goal has been met (Showers & Joyce, 1996). There is congruence among researchers that the length of a peer coaching session is less important than the process that is followed. Initial peer coaching sessions should focus on relationship building, in which both parties share goals, agree on the coaching process, and establish agendas with topics to explore. A good peer coach helps their collaborative partner establish SMART (specific, measurable, attainable, realistic, timely) goals, which build a personalized timeline for meeting their joint objectives (NSW, 2018). Once this process has been followed, subsequent sessions should allow for flexibility and reflection, ensuring the process's ongoing nature (NSW, 2018).

Though there are many similarities between nutritional and technology coaching, the timeline needs are vastly different. In both instances, relationship development between a coach and their partner is crucial for success. Open, honest communication and risk-taking do not readily occur without a safe, established relationship. However, in technology coaching, the idea is to work with a peer, not a client, to build a collaborative partnership that is long lasting and transcends any initial short-term goal.

Resources 

Foltos, L. (2013). Chapter 1: Coaching roles and responsibilities. In: Peer Coaching: Unlocking the Power of Collaboration. Corwin Publishing, Thousand Oaks, CA.

Incredible Years, (n.d.) IY peer coaching expectations [pdf].

ISTE, (2017). ISTE standards for coaches. Available from: https://www.iste.org/standards/for-coaches  

NSW Department of Education, (2018). Peer Coaching [website]. Available from:  https://education.nsw.gov.au/teaching-and-learning/curriculum/learning-for-the-future/Future-focused-resources/peer-coaching 

Showers, B., Joyce, B., (1996). The evolution of peer coaching. Available from: http://educationalleader.com/subtopicintro/read/ASCD/ASCD_351_1.pdf

Developing Professional Development as Part of the Community Engagement Project.

The community engagement project challenges students to create a professional development session to be presented at a conference of the student's choosing. As part of building effective digital age environments, as prescribed by ISTE Standard for Coaches 3, I chose to create an interactive session focused on active learning and digital collaboration tools to improve current practices in nutrition education. Technology in nutrition education currently has limited uses but impactful potential. Although nutrition information is plentiful in the digital world, the approach of dietitians and nutritionists has been to increase their presence through blogs, social media, and videos (such as those on YouTube), while the Academy of Nutrition and Dietetics (AND), the representative organization for all dietitians, has set its efforts on instilling a code of ethics and providing information on privacy in the digital workplace. These efforts may help mitigate nutrition misinformation but are often one-sided or engage only limited populations. For example, blogs may allow comments but do not allow for active engagement with the blog topics, nor do they take into account implementation at a local level. Social media platforms such as Facebook, Pinterest, and Twitter allow nutritionists' voices to be heard but rarely offer collaborative engagement with other experts or communities. The solution is relatively simple, as the digital tools mentioned offer plenty of room for continued collaboration among participants at any level, local or global.

The Academy itself recognizes the potential of technology in nutrition and has published a practice paper on nutrition informatics, a relatively new field in dietetics that addresses technology's role in health practices. The Academy discusses the potential pros and cons for each of the various practice fields in dietetics (clinical, food services, education/research, community, consultation/business) and technology's potential for growth in each of those areas. In education specifically, the Academy recognizes uses in distance learning, student progress tracking, specialty testing for licensing and certification, and professional course development. However, it does not mention the need for collaboration or for engaging the various audiences requiring nutrition education.

To bridge this gap and address the ISTE Coaching Standard, the topic of this professional development proposal is building better nutrition education through digital collaboration tools. The goal of the session is to explore the benefits of active learning through technology aids (edtech) and to implement tools into existing lesson plans with the following objectives in mind:

  • a) Understand and/or review importance of active learning (evidence-based practice)
  • b) Become familiarized with collaborative edtech tools
  • c) Engage with edtech tool selection criteria and best practices
  • d) Explore ways to incorporate digital tools in lesson plan scenarios.

Professional Development Session Elements

In this one-hour session, participants will be invited to explore the main topic through both face-to-face and online collaboration as the entire group navigates a website developed specifically for the presentation. Since all of the major content is available to them online, there is no need for note-taking, allowing participants to remain engaged throughout the session. Elements of the session include a pre-session technology self-assessment, an online group discussion via Padlet, think-pair-share elements, and self-reflection elements submitted during and after the session. More details on these elements are provided below.

Length. The Academy hosts local sub-organizations in each state. I chose to develop this professional development session for local dietitians and nutrition educators, with the opportunity to present at the local education conference held annually. The requirements of this local organization state that all educational sessions must be a minimum of one hour long. This meets the CEU (continuing education unit) minimum for registered dietitians. Considering that in the DEL program we have taken entire classes dedicated to active learning and digital tools, the length will limit the depth of information presented. However, the ability to continually collaborate with both participants and presenter will allow for continued resource sharing after the session has ended.

Active, engaged learning with collaborative participation. Participants will be encouraged to participate and collaborate before, during, and after the session for a fully engaged experience. The audience will be asked to intermittently review certain elements of the presentation website, available here, as they discuss key elements with the participants next to them. See figure 1.1 for lesson plan details.

Building Better Nutrition Education Through Digital Collaboration Tools
Objectives

Session Goal: Introduce ways to incorporate digital collaboration tools into existing nutrition education lesson plans.

Learning Objectives: At the end of the session participants will:

  • a) Understand and/or review importance of active learning (evidence-based practice)
  • b) Become familiarized with collaborative edtech tools
  • c) Engage with edtech tool selection criteria and best practices
  • d) Explore ways to incorporate digital tools in lesson plan scenarios
Performance Tasks

  • Participants will complete self-assessment prior to the session
  • Participants will demonstrate understanding of active learning by submitting an informal Google Form quiz in session
  • Participants will engage in collaborative edtech tools by submitting responses during the session
  • Participants will identify their own digital tool needs by completing a case scenario
  • Participants will submit a self-reflection via Flipgrid post-session
Plan Outline

  • Session Introduction (5 mins)
    • Prompt and Participation: Padlet Q&A: Describe a time you attended a great education session. What made that session great?
    • Review of self-assessment (completed prior to session)
  • Importance of active learning- evidence-based practice (5-10 mins)
    • Review of evidence: Google form quiz (embedded in site)
    • How can digital tools help? (5-10 mins)
  • Choosing the right digital tool (10 mins)
    • Triple E Framework rubric
    • Criteria for choosing the right digital tool
  • Tips on incorporating tools into existing lesson plan (10 mins)
    • Video Tutorial (take home message/resource)
  • Active practice (10 mins)
    • Case scenarios: Flipgrid response
    • Flipgrid self-reflection
  • Questions (5 mins)

Total session length: 60 mins.

Figure 1.1 “Building Better Nutrition Education through Digital Tools” Session Lesson Plan.

Before the presentation, the participants will be invited to complete a Google Form self-assessment poll addressing comfort and knowledge with technology tools as well as their current use of technology tools in practice. During the presentation, the audience will be prompted to participate in think-pair-share elements and to respond to collaboration prompts on Padlet, Google Forms, and embedded websites. After the presentation, participants will be encouraged to summarize their learning by submitting a Flipgrid video.

Content knowledge needs. The session content begins by establishing the importance of active learning as evidence-based practice to meet objectives a) and b). Just as motivational interviewing and patient-centered practice are desirable in nutrition, active learning that invokes 21st-century skills is evidence-based and an education standard. The content will then shift into teacher-focused how-tos for digital tools, including how digital tools can help, how to select the right digital tool, and how to incorporate that tool into an existing lesson plan, to address objectives c) and d). My assumption is that participants who are not comfortable with technology may be fearful or lack motivation to explore various tools. Group collaboration, modeling, and gentle encouragement through case studies may help mitigate these fears.

Teachers’ needs. While the majority of the session focuses on introductory content to active learning and digital tools, teachers’ needs in digital tool management can be addressed through coach/presenter modeling. Simple statements such as, “I created this Flipgrid video to serve as a model for students,” or “This Google Form was hyperlinked to gauge students’ understanding so far,” can serve as a basis to explore class management and digital tool management within the limited time. The website itself offers a section on FAQs, exploring questions and misconceptions about active learning and digital tools. Even with all of these resources, the audience will be introduced to technology coaching and may choose to consult a coach at their current institution.

In addition to modeling, three tutorial videos are available on the website to help teachers begin creating their own active learning lesson plans using the backwards design model. Each of the tutorials features closed captions created through TechSmith Relay for accessibility. The Google Site was also chosen because content is made automatically accessible to viewers; all the website creator has to do is include the appropriate heading styles and use alt text for pictures, figures, and graphs.

Lessons Learned through the Development Process

One of the major challenges in developing this project was understanding the needs of the target audience. Because nutrition informatics is relatively new, technology use has not been standardized in the profession, so estimating the audience’s previous knowledge and use of digital tools was difficult. My assumption is that technology use and attitudes about technology will be varied, so the website attempts to break down information to a semi-basic level. The only other assumption I made was that the audience has a good background in standard nutrition education practices. I also chose to develop the Technology Self-Assessment for the audience to complete prior to the session as a way to gain some insight into current technology use and comfort so that I may better tailor the session to that particular audience’s needs.

I realized as I was developing the lesson plan for this session that I only have time for a brief introduction to these very important topics. If I were to create a more comprehensive professional development series, I could expand the content into three one-hour sessions: 1) an introduction to collaborative learning theory, which would address the importance of digital tools in nutrition education and establish the need for active learning; 2) selecting, evaluating, and curating tech tools, allowing educators to become familiarized with available tools based on individual need; and 3) lesson plan development integrating collaboration tools, a “how-to” session where participants create their own plan to implement. I had not anticipated that length was going to be a barrier; however, if the audience truly has limited digital familiarity and comfort, perhaps beginning with an introduction to these topics is sufficient.

One positive lesson I’ve learned is that trying new things, such as creating a Google Site, can be very rewarding. I had never experimented with Google Sites prior to this project, and I am quite happy with the final website, though the perfectionist in me wants to continue tweaking and editing content. I originally aimed to create slides for this presentation but realized that, since I am attempting to convince a possibly skeptical audience of the benefits of digital tools, using the same old tool would not allow me the scope of modeling I desire.

I must admit that before this project, I had a hard time placing myself in the role of a “tech coach” because I would continually see each concept through the lens of an educator and how to apply the concepts to my own teaching. It has been difficult for me to take a step back and realize that I am still teaching, just in a different context. Creating the step-by-step tutorials was the turning point, where I envisioned the audience modeling their lesson plans on the example I had given. I hope I have the opportunity to present this session at the educational conference and bring the ideals of active learning and digital tools to professionals working in various education settings.

The Connection between Digital Competence and Problem-Solving

The word “troubleshooting” most often invokes images of a conversation with the IT department: a progression of actions guided by the technician and performed by the user, ending with a resolution in which the user’s original knowledge of technology has not been augmented. Unfortunately, this is an all-too-common scenario. The user defaults all troubleshooting responsibility to a third party because of unfamiliarity with, or a knowledge deficit in, technology. This is not limited to consumers and companies; there is a concern that students also do not troubleshoot well. According to the ISTE coaching standard, coaches should help teachers and students “troubleshoot basic software, hardware, and connectivity problems common in digital learning environments,” (ISTE, 2017). While calling IT or passing responsibility onto another party, like a teacher, is common practice, learning to troubleshoot is a beneficial 21st-century skill because it helps develop digital competence.

Why is digital competence important?

Like all 21st-century skills, digital competence is highly sought in the ever-evolving workforce. An e-magazine, Training Industry, published an industry-perspective article on digital competence that highlights the need for competence in the workforce from the top of the organization chart down. The author believes that the tech world today encompasses “VUCA”: volatility, uncertainty, complexity, and ambiguity. The role of those working in tech today should be to navigate this VUCA world seamlessly, and one of the ways to do this is to reinforce digital competence, (Newhouse, 2017). The industry definition of digital competence expands to include not only knowledge of technology but also understanding digital environments, effectively creating and consuming digital information, communicating and collaborating with diverse stakeholders, innovating rapidly, thinking critically and problem solving, and maintaining security, (Newhouse, 2017). This definition was devised from new European Union definitions and involves five major facets, summarized in Figure 1.1 below.

Infographic on the 5 major facets of digital competence
Figure 1.1 Facets of Digital Competence

What role does “digital competence” play in helping students problem-solve and troubleshoot online/technology issues?

One issue that arises is the general assumption that because students grew up with technology, or are considered digital natives, they automatically build digital knowledge or know how to use technology well, (Hatlevik, et al., 2015). However, in order to use technology well, students need to build digital competence and literacy. According to researchers Hatlevik, Gudmundsdottik, and Loi, building digital competence is complex and involves various factors, as summarized in Figure 1.2 below.

Infographic on the key elements for developing digital competence
Figure 1.2 Developing Digital Competence

The researchers recognize that these facets are essential to cultivating a deep understanding of technology while promoting critical reflection and creative application of digital skills. These qualities in turn develop problem-solving skills in both independent and collaborative settings, (Hatlevik, et al., 2015).

Beyond knowledge deficits involving how to perform troubleshooting tasks, researchers suggest that when demanding conditions, such as completing an assignment, become difficult, they may hurt self-regulation and autonomy, (Koole, et al., 2012). These difficulties can be cognitive, motivational, implementational, or a combination of these factors. While this theory is debated, meta-analyses indicate that low intrinsic value activities (such as homework) may lower complex problem-solving abilities such as those required by troubleshooting, (Koole, et al., 2012). Along with motivational issues, students may resign themselves to believing that there is only one correct path or resolution to a specific problem, and that the educator is the gatekeeper of the solution. Rather than seeking the solution for themselves, students prefer to go straight to the source, which develops a learned helplessness, (Miller, 2015).

How can students develop digital competence?

Digital competence is a complex concept that spans several social, motivational, personal, cultural, and technical understandings; therefore, there is no straightforward way to develop it. However, educators play a big role in establishing foundations for competence that may lead to better problem-solving and troubleshooting in two major ways:

  1. Allowing for self-directed learning. There is consensus that students need to be reflective of their own learning, (Miller, 2015; Plaza de la Hoz, et al., 2015). The role of the educator then shifts to providing resources, including digital tools, that allow students to experiment through active participation and engagement.
  2. Changing the class culture. The attitudes and beliefs of the educator also reflect the importance of digital competence to students. If the educator places low importance on digital competence, the students learn not to value or develop these important skills. The educator can establish new beliefs, resources, and structures to promote a culture of answer-seeking through appropriate digital tools and tool use. Lastly, students must build self-efficacy through trial and error in a safe environment.

While researchers are still investigating efficient methods for developing competence, all sources agree that in order for students to be successful in the 21st century, educators must open up the path to new technologies, new pedagogies, and new attitudes that help build digital competence, (Miller, 2015; Plaza de la Hoz, et al., 2015).

Resources

Hatlevik, O.E., Gudmundsdottik, G.B., Loi, M. (2015). Digital diversity among upper secondary students: A multilevel analysis of the relationship between cultural capital, self-efficacy, strategic use of information, and digital competence. Computers & Education. 81: 245-353. Available from: https://drive.google.com/file/d/0B5W5P9bQJ6q0RFNib3A5Vm9wWWM/view

ISTE, (2017). ISTE standards for coaches. Available from: https://www.iste.org/standards/for-coaches

Koole, S.L., Jostmann, N.B., Baumann, N. (2012). Do demanding conditions help or hurt self-regulation? Available from: https://drive.google.com/file/d/0B5W5P9bQJ6q0M0QzalRBa0FfTXM/view

Miller, A. (2015, May 11). Avoiding learned helplessness. Available from: https://www.edutopia.org/blog/avoiding-learned-helplessness-andrew-miller

Newhouse, B. (2017). Closing the digital competence gap. Available from: https://trainingindustry.com/magazine/issue/closing-the-digital-competence-gap/

Plaza de la Hoz, J., Mediavilla, D.M., Garcia-Gutierrez, J. (2015). How do teachers develop digital competence in their students? Appropriations, problematics, and perspectives. Available from: https://www.researchgate.net/publication/301914474_How_do_teachers_develop_Digital_Competence_in_their_students_Appropriations_problematics_and_perspectives

Developing Evaluation Criteria for EdTech Tools

Digital tools in the classroom are an asset to learning. According to the U.S. Department of Education, technology in the classroom ushers in a new wave of teaching and learning that can enhance productivity, accelerate learning, increase student engagement and motivation, and build 21st-century skills, (U.S. Department of Education, n.d.). The offerings of technology tools for the classroom are plentiful as priorities shift to support a more integrated education. Educators now have several options for cultivating digital tools to better engage students, promote active learning, and personalize instruction. But choosing the right tools can be challenging, especially considering that educators face a seemingly overwhelming array of options. How can educators filter through all of the options to select the best tool(s) for their classroom?

Enlisting the help of a technology coach who can systematically break down the selection process to ensure that the most appropriate tools are used is part of the solution. In keeping with best practices, the third ISTE standard for coaching (3b) states that in order for tech coaches to support effective digital learning environments, coaches should manage and maintain a wide array of tools and resources for teachers, (ISTE, 2017). In order to cultivate those resources, coaches themselves need a reliable way to select, evaluate, and curate successful options. Much like an educator may use a rubric or standards to assess an assignment’s quality, coaches can develop specific criteria (even a rubric) to assess the quality of technology tools.

Tanner Higgin of Common Sense Education understands the barrage of edtech tools and the need for reliable tech resources, which is why he published an article describing what makes a good edtech tool great. The article seems to be written more from a developer’s point of view on app “must-haves”; however, Higgin also references a rubric used by Common Sense Education to evaluate education technology. He mentions that very few tech tools reviewed receive a 5 out of 5 rating, which makes me assume that Common Sense Education has a rigorous review system in place. I was curious to learn what criteria they use to rate and review each tool, so I investigated their rating process. In the About section of their website, Common Sense Education mentions a 15-point rubric, which they do not share. They do share, however, the key elements included in their rubric: engagement, pedagogy, and support, (Common Sense Education, n.d.). They also share information about the reviewers and how they decide which tools to review. This information serves as a great jumping-off point in developing criteria for selecting, evaluating, and curating digital tools. Understanding the thought process of an organization that dedicates its time and resources to this exact purpose is useful for tech coaches developing their own criteria.
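To make concrete how a coach might operationalize these key elements, here is a minimal sketch of a weighted rubric score in Python. The weights and the 1-5 ratings are invented for illustration; Common Sense Education's actual 15-point rubric is unpublished, so only the three category names come from their site.

```python
# Hypothetical weighted rubric built on Common Sense Education's three
# published categories. Weights and ratings are illustrative only.
WEIGHTS = {"engagement": 0.4, "pedagogy": 0.4, "support": 0.2}

def score_tool(ratings: dict) -> float:
    """Return a weighted score on the 1-5 scale from per-category ratings."""
    return round(sum(WEIGHTS[key] * value for key, value in ratings.items()), 2)

# Example: a coach rates a hypothetical tool in each category (1-5).
example_ratings = {"engagement": 5, "pedagogy": 4, "support": 4}
print(score_tool(example_ratings))  # → 4.4
```

Weighting lets a coach encode priorities (here, engagement and pedagogy count more than support) while still producing a single comparable number per tool.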

Continuing the search for technology tool evaluation criteria led me to several education leaders who share their processes through various blog posts and articles. Reading through the criteria suggestions, a common theme started to develop. Most of the suggested criteria fit under the umbrella terms defined by Common Sense, with a few modifications, which are synthesized in Figure 1.1 below.

Infographic with suggestions on evaluation criteria
Figure 1.1 Digital Tool Evaluation Criteria Suggestions

There is consensus among the educational leaders, who placed emphasis on the engagement and collaboration features of a tool. Tod Johnston from Clarity Innovations noted that a good tech tool should allow for personalization or differentiation of the learning process while also allowing the instructor to modify the content as needed for each class, (Johnston, 2015). ISTE author Liz Kolb added that tools that allow for scaffolding better support differentiation, (Kolb, 2016). Both the Edutopia and ISTE authors agreed that the sociability and shareability of the platform are important for engaging students with wider audiences, (Hertz, 2010, & Kolb, 2016).

While engagement was a key element of selecting a tech tool for the classroom, even more important was how the tool fared in the realm of pedagogy: first and foremost, the technology needs to play a role in meeting learning goals and objectives, (Hertz, 2010). Secondly, the tool should allow for instructional best practices, including appropriate methods for modeling and instruction of the device, and functionality for providing student feedback, (Hertz, 2010, & Johnston, 2015). Another pedagogical consideration is the ability of the platform to instill higher-level thinking rather than “skill and drill” learning, (Kolb, 2016). Specific rubrics on pedagogy, such as the SAMR and Triple E Framework models, have been created and can be used in conjunction with these principles.

Support and usability were among the top practical concerns for evaluating these tools. Cost, and which features are accessible at each pricing tier, was among these concerns, particularly when students needed to create an account or provide an email address, (Hertz, 2010). Hertz called this issue free vs. “freemium,” meaning that some apps only allow access to limited functionality of the platform, while full functionality can only be accessed through purchase of premium packages. If the platform is free, the presence of ads needs to be assessed, (Hertz, 2010). In terms of usability, features such as an easy interface, instructor management of student engagement, and separate teacher/student accounts were desirable, (Johnston, 2015). Along with cost and usability, app reliability and compatibility with existing technology were also listed as important features, (Johnston, 2015).

The evaluation process itself varied from curated lists of the top tech tools, to criteria suggestions, to completed rubrics. If those don’t quite apply to a specific evaluation process, a unique approach would be to convert the rubric into a schematic like the one shared by Denver Public Schools, where each key evaluation element is presented as a “yes” or “no” question with a “yes, then” or “no, then” response, following a clear, decisive trajectory toward approval or rejection.
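A "yes, then / no, then" schematic of this kind can be sketched as a sequence of gates where the first "no" ends the trajectory in rejection. The questions below are hypothetical placeholders, not Denver Public Schools' actual criteria.

```python
# Sketch of a yes/no approval trajectory for an edtech tool.
# Each gate is a hypothetical question; the first "no" answer rejects.
GATES = [
    "Does the tool support the learning objective?",
    "Is student data kept private and secure?",
    "Is it usable within existing devices and budget?",
]

def evaluate(answers: list) -> str:
    """Walk the gates in order; reject at the first 'no' (False) answer."""
    for question, answer in zip(GATES, answers):
        if not answer:
            return f"Rejected at: {question}"
    return "Approved"

print(evaluate([True, True, True]))   # every gate passed
print(evaluate([True, False, True]))  # stops at the privacy gate
```

The design choice here mirrors the schematic's appeal: unlike a weighted score, a gate sequence makes the reason for rejection explicit, which is useful when communicating a decision back to a teacher.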

What I’ve learned through the exploratory process of developing evaluation criteria for tech tools is that it is not important or necessary that a tool meet every single criterion. Even the educational and tech experts reviewed in this blog emphasized different things in their criteria. In his blog, Tod Johnston suggests that there is no right or wrong way to evaluate technology tools because this isn’t a cookie-cutter process. Just as all teachers have a different style and approach to teaching, so too will their style and approach to using tech tools differ. The key to evaluating tools is to find the one that best fits the teacher’s needs, (Johnston, 2015).

Resources

Common Sense Education., (n.d.). How we rate and review. Available from: https://www.commonsense.org/education/how-we-rate-and-review

Hertz, M.B., (2010). Which technology tool do I choose? Available from: https://www.edutopia.org/blog/best-tech-tools

ISTE, (2017). ISTE standards for coaches. Available from: https://www.iste.org/standards/for-coaches

Kolb, L., (2016, December 20). 4 tips for choosing the right edtech tools for learning. Available from: https://www.iste.org/explore/articleDetail?articleid=870&category=Toolbox

Johnston, T. (2015). Choosing the right classroom tools. Available from: https://www.clarity-innovations.com/blog/tjohnston/choosing-right-classroom-tools

Vincent, T. (2012). Ways to evaluate educational apps. Available from: https://learninginhand.com/blog/ways-to-evaluate-educational-apps.html

U.S. Department of Education., (n.d.). Use of technology in teaching and learning. Available from: https://www.ed.gov/oii-news/use-technology-teaching-and-learning.

Culturally Relevant Learning Environments- Examples in Nutrition

How you learn is built into the larger part of who you are; it embodies your collective experiences, norms, beliefs, and values; it is a part of your culture. Building community in the learning environment, whether on- or offline, establishes safety, facilitates collaboration, and can help cultivate a sense of self and role in the community. The ISTE standard for coaches calls coaches to “create and support effective digital age learning environments to maximize the learning of all students… by model[ing] effective classroom management and collaborative learning strategies to maximize teacher and student use of digital tools and resources and access to technology-rich learning environments,” (ISTE, 2017). In order to maximize these resources for learning, we need to establish a technology environment that engages students’ cultural backgrounds and understandings.

Building community can be particularly difficult in an online environment, where social cues, particularly non-verbal ones, may be more challenging to interpret or oftentimes get misinterpreted. This becomes compounded when factoring in cultural languages and exchanges. These exchanges are not limited to ethnic cultures, but also include generational cultures, where task interpretations may take on different meanings. For example, assigning students the task of investigating three community food resources may be interpreted and approached differently by students who are very familiar with technology as opposed to non-traditional students or students who have limited access to technology. Coaches can help instructors build understanding of the cultures present in a classroom and implement successful learning strategies through culturally relevant pedagogy (CRP).

What is CRP and why is it important?

McCarther defines culture as an “amalgamation of human activity, production, thought, and belief systems,” (McCarther, 2017). “Culture is fundamental to learning,” (Pitsoe, 2014). Each student brings to the classroom a “fund of knowledge” shaped by their culture that influences who they are, what they believe, and how they think, (Cavalli, 2014). It is easy to understand that students bring all of themselves, represented through culture, to their learning, but does how they are taught represent them and their culture? In 1995, researcher Gloria Ladson-Billings coined the term “culturally relevant pedagogy” (CRP) in response to the fact that students learn best when their ideas and voice are shared and appreciated by the world, (McCarther, 2017). CRP invites educators to create socially just spaces and structures for students to share their voice by using teaching strategies that support the use of cultural knowledge, previous experiences, and unique performance styles that are familiar to diverse students in the classroom, (Cavalli, 2014, & McCarther, 2017). According to Ladson-Billings, student learning success encompasses academic success, cultural competence, and sociopolitical consciousness. CRP is not prescriptive but rather flexible and ever-changing in response to the cultures unique to a particular classroom, (McCarther, 2017). Good implementation of CRP in the classroom involves four key components, as described by Pitsoe and summarized in Figure 1.1 below.

Infographic of CRP Components
Figure 1.1 Components of Culturally Relevant Pedagogy

Understanding how students learn, the reality of their world today, and what skills they need to challenge the existing systems is crucial to the implementation of CRP.

Need for CRP in Nutrition

The need for CRP in nutrition education is great. Nutrition is incredibly personal, as we all eat certain foods for a variety of different reasons. Most reasons for eating are linked to social and cultural norms rather than a strong connection to health (though cultural eating is linked to the maintenance of health). Nutrition practitioners and educators need to be aware of the delicate interplay between culture and health as new foods and traditions are introduced to the diet. Presenting nutrition information in a culturally relevant manner helps engage individuals by giving them the appropriate context and tools to facilitate change. Below are two examples that help illustrate the need for CRP in nutrition counseling:

In the article “Culturally tailored post secondary nutrition and health education curricula for indigenous populations,” the authors investigate the types and number of culturally relevant nutrition and health programs offered to students seeking to work with Alaska Natives while studying for an allied health degree. There is a need for such training, as Alaska Natives currently face a disproportionate rate of chronic disease development, particularly when Western diets substitute for the traditional diet, (McConnell, 2013). After a brief review, the authors found very limited curriculum related to culturally appropriate/relevant nutrition counseling that included spirituality, respect of elders, and personal relationships with the land, waterways, and animals, (McConnell, 2013). The information that they found was limited to stand-alone culturally tailored courses that the authors argued were “dead-end” trainings: short term and offering only non-transferable skill-building, (McConnell, 2013). After a more comprehensive search, the authors found limited offerings of post-secondary training that resulted in a mainstream credential. Reasons for the limited availability were hypothesized to be related to funding, oral culture, the researchers available for study, or a mix of the above, (McConnell, 2013).

The authors’ rationale for a culturally tailored curriculum is very interesting, arguing that the more effective nutritional counseling approach is not to create courses for the indigenous patients themselves, but rather to train future nutritionists/dietitians, with additional credentials, to tailor teachings that align with the food norms and beliefs of the target population. This correlates with CRP theory principles, which state that it is the role of the instructor to understand the culture of the class/client, not the other way around, as education received in a culturally familiar context resonates better with clients, (Pitsoe, 2014).

When considering my own education options, to my knowledge there is no post-secondary continuing education ending in a credential available for nutritionists/dietitians on culturally appropriate/relevant counseling. However, when implemented well, CRP can deliver results. Another article, “Adaptation of a Culturally Relevant Nutrition and Physical Activity Program for Low-Income, Mexican-Origin Parents With Young Children,” describes a community intervention nutrition program designed around social learning theory to help low-income Hispanic families decrease rates of childhood obesity. This 5-year program gave individuals in the intervention group $25 a month to spend on fresh fruit and vegetables while participating in family nutrition and physical activity nights. As part of the model, the researchers used the “Anchor, Add, Apply, and Away” approach, where participants would share food memories from childhood, share stories of life as an immigrant, problem-solve by learning to make a new recipe with local foods, and share what was learned at the end of the process, (Kaiser, et al., 2015). Parents were also asked to provide examples of what they did to promote nutrition and physical activity in their family, which served to give ideas and motivate others in the group. At the end of the program, parents reported that children spent less time watching TV or playing video games, did more physical activity, and either maintained or lost weight, (Kaiser, et al., 2015). This article explores a patient-centered approach to culturally relevant nutrition education where success was gained not only through cultural food norms and values but also by encouraging the exploration of new foods through social learning theory.

Implementation of CRP in Nutrition Classes

There is a demonstrated need for more culturally relevant pedagogy in nutrition education, particularly considering that using the same teaching techniques on all students does not set these individuals up for sustainable success when the cultural aspects of nutrition are not fully incorporated. This raises the question: What are some approaches and examples of using culturally relevant pedagogy in nutrition classes?

According to Pitsoe, in order to maximize learning, teachers must first understand the cultures represented in their classrooms and incorporate that understanding into their lessons, (Pitsoe, 2014). To help with this, the Milwaukee Public Schools offers a list of questions to help teachers gain a better understanding of their students. Figure 1.2 examines these questions.

Infographic of questions building CRP
Figure 1.2 Questions for Building Culturally Relevant Practices from Milwaukee Public Schools

Once the class culture is understood, the next step is to select instruction strategies that effectively engage that culture. Some ways that teachers have successfully implemented this are by using cultural mythology to open discussions about a topic, conducting an environmental study of pollution in the local community, or investigating the nutrition status of the local community, (Cavalli, 2014). These strategies could also be expanded to include discussions on the impacts of technology on food culture and generational culture.

A master’s thesis by A. C. Cavalli provides a fuller example of CRP as implemented in an urban science class setting. Her approach to CRP involved taking an eleven-lesson unit and blending strategies to incorporate not only direct teaching but also guided inquiry and community investigation. A summary of her approach can be found in Figure 1.3 below.

Figure depicts CRP Lesson Planning
Figure 1.3 A. Cavalli’s CRP Lesson Planning Example

By modeling and providing examples for instructors on building culturally relevant lessons, coaches can help teachers develop online strategies that incorporate cultural relevance to enhance learning and build better online communities.

References

Cavalli, A. C. (2014). Teaching nutrition and health in the urban science classroom: A blended approach to culturally relevant and problem based learning. Education and Human Development Theses, The College at Brockport. Available at: https://digitalcommons.brockport.edu/cgi/viewcontent.cgi?article=1547&context=ehd_theses

ISTE. (2017). ISTE standards for coaches. Available at: https://www.iste.org/standards/for-coaches

Kaiser, L., Martinez, J., Horowitz, M., Lamp, C., Johns, M., et al. (2015). Adaptation of a culturally relevant nutrition and physical activity program for low-income, Mexican-origin parents with young children. Centers for Disease Control and Prevention. Available at: https://www.cdc.gov/pcd/issues/2015/14_0591.htm

McConnell, S. (2013). Culturally tailored post secondary nutrition and health education curricula for indigenous populations. Int J Circumpolar Health. Available at: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3748461/

Milwaukee Public Schools. (n.d.). Culturally responsive practices. Available at: http://mps.milwaukee.k12.wi.us/en/Families/Family-Services/Intervention—PBIS/Culturally-Responsive-Practices.htm

Instructional Coaching: Using Rubrics to Quantify Qualitative Data for Improved Teaching Outcomes

Feedback can be a powerful tool to improve teaching and learning. Through feedback, new perspectives can be gained as teachers begin to discern what is and isn’t working in current instructional methods. Feedback also offers suggestions on achieving the goals and standards that drive an educator’s work. There are four different types of feedback: formative, summative, confirmative, and predictive. Formative feedback occurs before an intervention takes place, such as giving students feedback on an assignment where the feedback does not impact the final grade; I explore the benefits of formative feedback in this post. Summative feedback occurs after an intervention, such as when students turn in an assessment and the feedback provided relates to the grade outcome, (Becker, 2016). Predictive feedback occurs before any instruction has taken place to ensure that a method will be effective, while confirmative feedback occurs well after summative feedback to ensure that the methods are still effective, (Becker, 2016). Of the four types, formative and summative feedback are the most widely used forms of evaluation in educational institutions.

At the end of each quarter, two types of summative evaluation are collected for each of the classes I’ve taught: quantitative and qualitative data assessing my performance as a professor and the course outcomes. The quantitative portion uses a Likert scale ranging from 1 = strongly disagree to 5 = strongly agree, while at the bottom of the evaluation form there is a section where students can provide comments, intended to give constructive feedback for classroom improvement. While the comments are not always written constructively (I am addressing this through a mini-module students are required to complete for all of my classes), it is mainly the common themes that emerge in the evaluations that are powerful influencers of improving my classes. However, what I’ve learned is that most of the time, the summative feedback simply comes too late to improve the current student experience because the issue can’t be addressed until the next time the course is offered. As a technology and instructional coach, in order to help other educators improve their teaching outcomes, more timely feedback would be required that utilizes both quantitative and qualitative assessment measures. While most learning management system (LMS) platforms offer a multitude of analytics quantifying data such as exam scores, class averages for assignments, and average engagement time on the platform, there isn’t an explicit way to collect or quantify qualitative data.
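Outside the LMS, one rough way to quantify open-ended comments is the “coding” approach borrowed from qualitative research: assign each comment to one or more themes and tally the counts. The sketch below is purely illustrative; the themes, keywords, and sample comments are hypothetical, and a real analysis would derive its codes from the comments themselves rather than a fixed keyword list.

```python
# Tally hypothetical themes in open-ended course comments ("coding"),
# producing counts that can sit alongside the Likert means an LMS reports.
from collections import Counter

# Hypothetical codebook: theme -> keywords that signal it
CODEBOOK = {
    "pacing": ["fast", "rushed", "slow", "pace"],
    "workload": ["homework", "assignments", "too much", "load"],
    "clarity": ["confusing", "unclear", "clear", "well explained"],
}

def code_comments(comments):
    """Count how many comments touch each theme in the codebook."""
    counts = Counter()
    for comment in comments:
        text = comment.lower()
        for theme, keywords in CODEBOOK.items():
            if any(keyword in text for keyword in keywords):
                counts[theme] += 1
    return counts

comments = [
    "The pace felt rushed near the end of the quarter.",
    "Too much homework in week 5, but lectures were clear.",
    "Clear explanations, though the assignments piled up.",
]
print(code_comments(comments))
```

Once coded this way, each theme’s count becomes a number that can be tracked quarter to quarter, much like an exam average.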

The ISTE standard for coaching states that coaches should “coach teachers in and model effective use of tools and resources to systematically collect and analyze student achievement data, interpret results, and communicate findings to improve instructional practice and maximize student learning,” (ISTE, 2017). If an LMS can collect quantitative data that can be assessed throughout the quarter (through summative feedback), could it also be used to quantify qualitative data (i.e., comments) for improved teaching outcomes? To answer this question, I’d like to address it in two ways: 1) establish an understanding of the value and importance of self-reflection on assessments, and 2) address how rubrics can help quantify qualitative data.

Importance of self-reflection. Self-reflection can give several insights into the effectiveness of teaching. According to the Virginia Journal of Education, self-reflection is a method to affirm current strengths and identify areas of improvement, including continuing education or professional development needs. Educators may seek out self-reflection in order to review past activities, define issues that arise throughout the quarter/semester, understand how students are learning, modify a class due to unexpected circumstances, or address whether or not the teacher’s expectations have been met. Overall, self-reflection improves teacher quality, (Hindman & Stronge, n.d.).

Educators may sometimes rely on emotions when deciding whether or not an element worked well in the classroom. However, without context to justify that decision, emotions are not a clear indicator of outcomes. Self-reflection puts a process in place in which educators can collect, analyze, and interpret specific classroom outcomes, (Cox, n.d.). Though there are various ways to perform self-reflection (see Figure 1.1), the most effective outcomes come from ensuring that the process has been thoroughly completed.

Figure on Cox's Types of Self-Reflection
Figure 1.1 Cox’s Types of Self-Reflection.

For an instructional coach, following the proper self-reflection steps would be a great way to begin the discussion with someone wanting to improve their teaching. An instructional coach would help the educator:

  • Understand their outcome goals,
  • Choose the data collection/reflection method best suited to meet these goals,
  • Analyze the data together to identify needs,
  • Develop implementation strategies to address needs.

Because the process is general, it can be modified and applied to various learning institutions. Drawing on my coaching background as a dietitian, where clients face similar needs for change, I would also include questions about perceived barriers to implementing change. These questions would include a discussion of any materials or equipment the educator deems necessary but that may be difficult to obtain or may require new skill sets to use fully.

Using rubrics to quantify qualitative data. Part of self-assessment includes using rubrics, in addition to analyzing data, goal setting, and reflection. According to the Utah Education Association (UEA), using a rubric helps to address the question “What do I need to reach my goals?”, (UEA, n.d.). Rubrics present expected outcomes and expected performance, both qualitative in nature, in quantifiable terms. Good rubrics should include appropriate criteria that are definable, observable, and complete, and should include a continuum of quality, (UEA, n.d.).

If rubrics help quantify qualitative data, then how can rubrics assess reflection? DePaul University tackled that very question, responding with further questions: What is the purpose of the reflection? Will the assessment process promote reflection? How will reflection be judged or assessed? (DePaul, n.d.). Educational leader Lana Danielson remarks on the importance of reflective thinking and how technological, situational, deliberate, or dialectical thinking can influence teaching outcomes. Poor reflective outcomes, according to Danielson, are a result of teachers not understanding why they do the things they do; great teachers are those who know what needs to change and can identify the reasons why, (Danielson, 2009). Figure 1.2 describes the four types of reflective thinking in more detail.

Infographic on the four modes of reflective thinking
Figure 1.2 Grimmett’s Model of the Four Modes of Reflective Thinking

Developing rubrics based on the various types of reflective thinking will help quantify expectations and performances to frame improvement. The only issue with this model is that it is more diagnostic than quantifiable. A more specific rubric model, developed by Ash and Clayton in 2004, involves an eight-step prescriptive process including:

  • Identifying and analyzing the experience,
  • Identifying, articulating, and analyzing learning,
  • Undertaking new learning experiences based on reflection outcomes, (DePaul, n.d.)

The Ash/Clayton model involves developing and refining a rubric based on learning categories related to goals. All of the qualities related to the learning categories are defined and refined at each stage of the reflection process. More information on the eight-step process is available from DePaul University, (DePaul, n.d.).

Regardless of the reflection assessment model used, coaches can capture enough criteria to create and use rubrics as part of a self-reflection process that can improve teaching outcomes through new awareness and identified learning needs that may be blocking improvement. Most LMS platforms support rubrics as part of assessment in various capacities (some only support rubrics on designated “assignments” but not on features like “discussions,” for example). Each criterion includes quality indicators that are associated with a number, making the qualitative data quantifiable, similar to the way “coding” in qualitative research allows for quantifiable results. New rubric features allow for a range of quality points on common criteria and freeform responses, allowing for the possibility of modifications to the various reflection types. Because of these new functionalities and the myriad of rubric uses in LMS platforms today, creating a good-quality rubric is now the only obstacle to implementing rubrics for self-reflection.
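To make the criterion-to-number mapping concrete, here is a minimal sketch of how a rubric turns qualitative judgments into a score. The criteria, quality levels, and point values are hypothetical and not drawn from any particular LMS or from the Ash/Clayton model itself.

```python
# Hypothetical reflection rubric: each criterion has quality levels
# mapped to points, so a qualitative judgment becomes a numeric score.
RUBRIC = {
    "identifies issue": {"missing": 0, "partial": 1, "complete": 2},
    "analyzes causes": {"missing": 0, "partial": 1, "complete": 2},
    "plans next steps": {"missing": 0, "partial": 1, "complete": 2},
}

def score_reflection(ratings):
    """Convert per-criterion quality ratings into a numeric total."""
    return sum(RUBRIC[criterion][level] for criterion, level in ratings.items())

# A coach's qualitative judgment of one reflection
ratings = {
    "identifies issue": "complete",
    "analyzes causes": "partial",
    "plans next steps": "partial",
}
total = score_reflection(ratings)
max_points = sum(max(levels.values()) for levels in RUBRIC.values())
print(f"{total}/{max_points}")  # prints 4/6
```

Scored this way, successive reflections yield comparable numbers, which is what lets a coach track improvement over a quarter rather than relying on impressions alone.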

References

Becker, K. (2016, August 29). Formative vs. summative vs. confirmative vs. predictive evaluation. Retrieved from: http://minkhollow.ca/beckerblog/2016/08/29/formative-vs-summative-vs-confirmative-vs-predictive-evaluation/

Cox, J. (n.d.). Teaching strategies: The value of self-reflection. Retrieved from: http://www.teachhub.com/teaching-strategies-value-self-reflection

Danielson, L. (2009). Fostering reflection. Educational Leadership, 66(5) [electronic copy]. Retrieved from: http://www.ascd.org/publications/educational-leadership/feb09/vol66/num05/Fostering-Reflection.aspx

DePaul University. (n.d.). Assessing reflection. Retrieved from: https://resources.depaul.edu/teaching-commons/teaching-guides/feedback-grading/Pages/assessing-reflection.aspx

Hindman, J. L., & Stronge, J. H. (n.d.). Reflecting on teaching: Examining your practice is one of the best ways to improve it. Retrieved from: http://www.veanea.org/home/1327.htm

ISTE. (2017). ISTE standards for coaching. Retrieved from: https://www.iste.org/standards/for-coaches

Utah Education Association. (n.d.). Self-assessment: Rubrics, goal setting, and reflection [Presenter’s notes]. Retrieved from: http://myuea.org/sites/utahedu/Uploads/files/Teaching%20and%20Learning/Assessment_Literacy/SelfAssessment/Presenter%20Notes_Self-Assessment_Rubrics_Goal_Setting.pdf

Strategies for Teaching Effective Email Communication

I have a dilemma. No one comes to my office hours anymore. I made this realization years ago when I would find myself alone in my office, staring at the clock, waiting for my “shift” to be over or filling that time with grading and lesson planning. On average, I’d have one or two students come see me before the end of the quarter, and it was usually because the situation was dire. Later, I changed my approach to be more flexible: instead of fixed office hours, students could make appointments with me that better accommodated both of our schedules. Students would approach me either in class or via email to set up an appointment time. For a time, this strategy worked very well to catch struggles and issues earlier on. Despite all of these efforts to be available for students, resolving major issues, addressing prolonged absences, and discussing successful study strategies are not what the typical student emails me about. Now, students email me about anything and everything.

Filtering through emails wouldn’t be too bad if students didn’t also expect professors to respond to any email within 48 hours, during which all of the responsibility for investigating the question gets placed on the instructor. “I wasn’t sure what to do, I was waiting for a response from you,” is the usual response I get if I was too busy to answer a non-urgent email. It’s difficult not to become frustrated in this scenario when about 2.5 hours of my day is spent answering emails. With work-life balance considered, that means roughly a quarter of my workday is spent unproductively. During that time, I could have been working on assessment, lesson planning, or updating content with current research.

This is not the only email communication concern I have. At least three times a quarter, I need to gently correct students who choose to address me by my first name as opposed to my professional title, Professor Vlad-Ortiz. To their credit, once corrected, students do not repeat that mistake. What happens far more often is unclear communication and informal tone. Emails starting with “I need you to…” or “lift my registration hold…” demonstrate a misunderstanding of the formality needed to address faculty. Rather than phrasing the request politely, it reads more like a demand. Because of the implications and expectations loaded into each of these emails, it is important to investigate and address appropriate strategies for teaching effective email communication to students.

Why is all of this important? Understanding how to communicate properly online, including over email, is part of good digital citizenship. Knowing email appropriateness, tone, and formality is essential to success in the 21st century. Though there are several other components of good online communication, I’ve identified three basic email communication components to help students get started in practicing successful digital citizenship.

Graphic of email communication basics
Figure 1.1 Overview of Email Communication Basics.

All emails to educators, regardless of their title, should be formal. The educator-student dynamic is professional in nature, so communication should reflect that relationship. Addressing professors by their professional name not only establishes that formal relationship but, as Molly Worthen, Assistant Professor at the University of North Carolina, explains, in a world where formality is on the decline, using a professor’s title helps to ensure respect regardless of the professor’s race, age, and gender, (Worthen, 2017). This is particularly important considering that it is the more privileged students who tend to violate this formality, (Worthen, 2017). Along the lines of respect, the tone of the email should be polite and courteous. By sending an email, the sender is asking for the professor’s time and consideration on a particular matter. Worthen brilliantly explains that requests should not sound like a text message or like communication with a customer service representative, (Worthen, 2017). As with my examples above, the professor doesn’t need to do anything, as in “I need you to lift a hold from my account,” or “I need to register for your class…”; rather, the sender should understand that they are asking for a favor. As Mark Tomforde, Associate Professor at the University of Houston, very accurately describes, professors are incredibly busy, so emails should truly represent issues that can’t be resolved through any other means. Using email to request anything and everything trivial is disrespectful of the professor’s time and expertise, (Tomforde, n.d.). Emails should demonstrate that the sender has already taken several steps toward solving the problem on their own and should clearly define how the reader can help resolve that problem, (Purdue, n.d.). Ideally, the issue should be resolved quickly through one email, and the sender should be able to distinguish when it is appropriate to talk in person, as emails should not be substitutes for real conversations, (Tomforde, n.d.).

Role of the Educator. According to the ISTE standards for educators, the role of the educator is to “…inspire students to positively contribute to and responsibly participate in the digital world,” (ISTE, 2017). The key phrases in that definition are “positively contribute” and “responsibly participate.” The issues addressed above indicate that there is weight to the actions and intentions set forth in email and other online communication. The responsibility of the student is to create communication that is framed positively and courteously while taking responsibility for the resolution of the email’s request. One of the indicators for this ISTE standard charges educators to create experiences for learners to make positive, socially responsible contributions and exhibit empathetic behavior online that builds relationships and community, (ISTE, 2017). Relationships and community rely on the actions of many in order to be successfully built. In building a healthy online community, we can’t expect students to just know how to behave and communicate properly. These skills are not intuitive and should be taught. In order to address this ISTE indicator, I’ve compiled three solutions or strategies that can be used to reverse the current culture and promote good digital citizenship for our students.

Graphic for educator strategies for online communication.
Figure 1.2 Overview of Educator Strategies for Online Communication.

1) Professor Modeling. Teaching digital citizenship is a shared responsibility, so it is important for educators to actively address and model proper practices on a regular basis, (Crompton, 2014). In addition to using good email etiquette when communicating with students, professors should give students opportunities to explore and practice good etiquette. This can be achieved through explicit learning. For specific examples, Helen Crompton provides three scenarios of how digital citizenship can be modeled by professors in the classroom. Another example is an activity Mrs. Jizba created in which she has students write two emails, one to a friend and one to their principal. She then engages the students in a conversation about what content, tone, and choice of words are appropriate in each scenario. This simple activity clearly demonstrates how students establish the norms of good digital citizenship through modeling and practice.

2) Explicit language in the department handbook that is then repeated in syllabi. Just as there are codes of conduct at each institution, departments should include standards of conduct for online communication. In order for these standards to have impact, each faculty member should mirror them in their syllabi. Through these collaborative efforts, the message of appropriate online communication is clear and consistent. Both Worthen and Tomforde share their guidelines to help with standard development.

3) Holding students to the expectations. Just as important as modeling and creating language in department handbooks and syllabi is holding students to those expectations. That means addressing any violations in a gentle and professional manner. For example, when students address me incorrectly, I respond with, “We are a formal institution and ask that students address all faculty by their professional title; in my case, you would address me as Professor Vlad-Ortiz. Please know that I am telling you this not to reprimand you or make you feel bad, but simply to let you know of our institution’s professional standards so that you avoid potentially offending faculty in the future.” As Worthen concludes, it’s all about treating students as adults, (Worthen, 2017). As educators, we prepare students for the real world. If we do not hold students to these expectations, they will not be prepared for their future professional lives.

Resources

Crompton, H. (2014, August 28). Know the ISTE standard for teachers: Model digital citizenship. Retrieved from: https://www.iste.org/explore/articleDetail?articleid=142

International Society for Technology in Education. (2017). ISTE standards for educators. Retrieved from: https://www.iste.org/standards/for-educators

Purdue Online Writing Lab. (n.d.). Email etiquette for students [PowerPoint]. Retrieved from: https://owl.english.purdue.edu/owl/resource/694/01/

Tomforde, M. (n.d.) Email etiquette: Guidelines for writing to your professors. Retrieved from: https://www.math.uh.edu/~tomforde/Email-Etiquette.html

Worthen, M. (2017, May 13). U can’t talk to ur professor like this. Retrieved from: https://www.nytimes.com/2017/05/13/opinion/sunday/u-cant-talk-to-ur-professor-like-this.html

Implementing Student-Centered Activities in Content-Intensive Courses

If you’ve ever taught a content-intensive course, you know it’s like trying to finish a marathon at a sprint. In my experience, you get to the finish line, but you hardly remember the journey there. The content-intensive courses I teach are the foundational nutrition classes. Each contains at least six major learning objectives, each with about two sub-objectives, and is designed to cover upwards of fifteen chapters of material in a ten-week quarter. The predominant faculty approach to these types of classes is to go broad, not deep, in learning and understanding. I must admit this has been my approach as well, for fear that I will miss covering one of the learning objectives or sub-objectives. While my students tell me that the courses are interesting and engaging, I can’t help but wonder whether they will actually remember any content from the course or whether they feel as if their brain has been put through a blender by spring break. Is the learning authentic, or are they just memorizing to pass the final exam?

The ISTE Standards for Educators charge instructors with “design[ing] authentic, learner-driven activities and environments that recognize and accommodate learner variability,” (ISTE, 2017). If instructors truly wish to design their courses using evidence-based practices, the focus needs to shift from covering material to student learning, without compromising the learning objectives. ISTE educator standard 5b implies that technology can help marry the two concerns: “design authentic learning activities that align with content area standards and use digital tools and resources to maximize active, deep learning,” (ISTE, 2017). This standard can best be illustrated by the “genius hour” concept developed by Nichole Carter in pursuit of a personalized learning environment for her students. The idea is brilliant: allow students one opportunity a week (or as time allows) to dive deep into a topic they are interested in and demonstrate their learning through an artifact or digital presentation. The implementation of genius hour follows a six-component design model that highlights new roles and responsibilities for teachers and students alike, (Carter, 2014). See Figure 1.1 for more information on the six-component personalized learning design.

Infographic highlighting 6 essentials for personalized learning.
Figure 1.1 Nichole Carter’s Personalized Learning Essentials.

When implemented well, intrinsic motivation for learning soars, students are engaged in the material, and teachers can meet those ever-important learning objectives without feeling like they are just shoveling material into students’ brains, (Carter, 2014). It seems like a win-win. However, thinking back on my content-intensive courses, I wondered: how can student-centered activities (like genius hour) be implemented in these types of courses?

As a starting place for answering my question, I revisited Kathleen McClaskey’s continuum of choice. I find it interesting that developing student-centered learning and activities ultimately comes down to how much control the teacher is willing to let go of and how much “choice” is open to the students. In traditional content-intensive courses, the teacher has all of the control, in what McClaskey would classify as teacher-centered, (McClaskey, 2005). He or she creates lectures that revolve around a specific chapter in a textbook, then lectures to ensure the material is covered. Students, in this model, sit and observe the lecturer in hopes of absorbing some of the material (or, in most cases, cram the information into their brains the night before the exam) while never deeply engaging with it. McClaskey’s continuum of choice suggests that some activities can still be controlled while giving students some freedom to explore topics of their own choosing; consider, for example, the participant and co-designer models, (McClaskey, 2005).

Diagram of the Continuum of Choice.
Figure 1.2 McClaskey’s Continuum of Choice. (Continuum of Choice TM by Barbara Bray and Kathleen McClaskey is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.)

The challenge with the more student-centered models, such as the designer or advocate from McClaskey’s continuum, is that they require time, a luxury oftentimes not afforded in content-intensive courses, and they do not address how to implement each model by topic. However, despite these concerns, I am beginning to realize that in order to allow for more intrinsic and authentic learning, I need to let go of the desire to control all aspects of my content-intensive courses and shift my focus to what is really important: student learning.

Many resources, similar to McClaskey, mention explicit instruction as part of a student-centered classroom. Explicit instruction provides “effective, meaningful, direct teaching…where students are active participants in the learning process,” (Shasta County, 2009). Creating an explicit learning lesson involves six guiding principles:

  1) The instructor begins the class by setting the stage for learning; the learning objectives are clear, and students understand their responsibility for their own learning.
  2) This is followed by a clear, simple, and direct explanation of what the daily task is, why it is important, and how best to complete it. Students appreciate when tasks are broken down into smaller, logical steps.
  3) The instructor models the process, including their thought process, using visuals. This is important because simply explaining a concept doesn’t mean that students will understand it or know what to do.
  4) Before diving into the assignment on their own, students are given a guided activity in which the instructor assesses the readiness of the class.
  5) Once the concept has been mastered, students take to the task independently.
  6) After the task has been completed, students are given an option for informal or formal reflection; the artifact is collected and compared to the learning objectives, (Shasta County, 2009).

Figure 1.3 provides a reference guide for these steps.

Infographic on explicit learning
Figure 1.3 Explicit Learning Reference Guide

According to the Shasta County Curriculum Lead, explicit learning is best used when there is a “well-defined body of information or skills students must master,” especially when other models, such as inquiry-based or project-based learning, cannot be successfully implemented, (Shasta County, 2009). The role of the teacher is more directed and specific, and students gain more insight into, and practice with, the skills they are learning. What I like about explicit learning is that classroom activities do not have to be modified completely; the modification occurs in how the material is presented and practiced. Students can appreciate this model because they engage in active learning but still have guidance and support from the teacher via modeling.

Through explicit learning, even content-intensive courses can have a deeper and more meaningful impact on learning. I had one class in particular in mind when considering the explicit learning/personalized learning approach. I teach a not-so-introductory nutrition class designed to meet the needs of allied health students. All allied health students are required to take at least one nutrition class as part of their career training, and for many, this class will be the only nutrition class they ever take. The pressure to deliver content is high, as it is very likely that they will not revisit this material anywhere else. While I can’t change the fact that they need to explore the chemical compositions and the processing of nutrients in the body, I can influence how they engage with the health effects of and recommendations for these nutrients, which are ever-changing anyway. Using the personalized learning and explicit learning models, I could allot one class period a week for the exploration of the health effects and recommendations for whatever condition, trend, or issue they wished to explore. Like genius hour, the students could work together to investigate and create a digital artifact of their choosing that would best present their topic; lastly, to further promote collaboration, they could provide feedback to one another on their topics. The students would be learning through co-learning, gaining a stronger and deeper interest in the subject matter, proving that content-intensive courses can also be student-centered.

Resources

Carter, N. (2014, August 4). Genius hour and the 6 essentials of personalized education. Retrieved from: http://www.edutopia.org/blog/genius-hour-essentials-personalized-education-nichole-carter

International Society for Technology in Education. (2017). The ISTE standards for educators. Retrieved from: https://www.iste.org/standards/for-educators

McClaskey, K. (2005, November 5). Continuum of choice- More than a menu of options. Retrieved from http://kathleenmcclaskey.com/choice/

Shasta County Curriculum Lead. (2009). What is direct/explicit learning [Word doc]. Retrieved from: http://www.shastacoe.org/uploaded/dept/is/district_support/explicit_instruction_may_2009.doc