To fully understand ISTE Coaching Standard 3d, “Select, evaluate, and facilitate the use of adaptive and assistive technologies to support student learning” (ISTE, 2017), I created a website that hosts three videos. Each video describes a step in the “Backward Design” model as a means of incorporating edtech into existing lesson plans. It was important to include captions, a form of assistive technology, to support all students. The videos below were created with the screen-capture software TechSmith Relay and later uploaded and captioned using YouTube’s captioning functionality.
Note: Please click the “closed captioning” icon at the bottom of each video to view captions.
Backward Design Three-Step Video
Stage One Application: Modify the basic lesson provided by the Colorado Extended Food and Nutrition Program to meet specific criteria that you develop using backward design.
Stage Two Application: Determine the type of understanding you want your audience to achieve and build your action-oriented task. Consider the active learning elements and digital tools you wish to include. How do they enhance engagement and performance?
Stage Three Application: Develop your lesson plan. Double-check that your activities meet your main objective(s).
Through this process, I finally understood the importance of assistive technology. Great effort went into each video to ensure that all students can use it and learn according to their abilities. After this experience, I now always take the extra steps to add captions to videos and alternative text to graphics that I upload into my digital environments.
With the goal of promoting 21st century skills, the university is looking for ways to better serve students and faculty by providing them with enriching technology experiences. Collaborating with the educational technology department, I embarked on a three-phase project to gather information and brainstorm solutions for improving the technology experience in the classroom.
This project aligns with ISTE Coaching Standard 3f, “Collaborate with teachers and administrators to select and evaluate digital tools and resources that enhance teaching and learning and are compatible with the school technology infrastructure” (ISTE, 2017). The project depended on several collaborations: gathering feedback from faculty, requesting data from the technology departments, and continuing to work with these departments to support proposed classroom technology changes.
A summary of the project outcomes is provided below.
Phase One: Edit, distribute, analyze, and report on the Classroom Technology Survey.
Pilot Survey Summary. A few months ago, I participated in the development of a pilot study to investigate faculty use of current classroom technology. Though the results could not be generalized to the greater faculty body, we learned that the faculty sub-group was not averse to technology, nor did its members consider themselves expert users. We also found some correlations between faculty comfort with technology and the types of technology used in the classroom, which in turn mirrored the types of technology students were exposed to. Similar correlations were found between faculty comfort and where technology was incorporated into teaching and learning. However, from voluntary feedback provided by our pilot subjects, we learned that not all subjects clearly understood the term “classroom technology” or the context in which we were addressing “active learning.” With these and other suggestions, we improved the survey for clarity and brevity. The outcomes of that pilot project can be reviewed in greater detail here.
Survey Outcomes and Implications. After implementing the changes described above, the edited survey was presented to the faculty body by a contributing stakeholder, and responses were collected for two weeks. A total of 108 completed surveys were collected, representing twenty-four departments and roughly one-third of the total faculty population.
The majority of the participants considered themselves “average technology adopters,” which indicates that faculty are not averse to incorporating technology into their teaching and student learning but will not do so unprompted or without substantial assistance and resources. These results were congruent with the pilot study. Understanding the faculty adoption rate is an important consideration, as any proposed classroom model change may be better received by this faculty body when paired with a comprehensive training plan.
Additionally, all faculty used technology in teaching and student learning, though there was no correlation between self-identified technology adoption rate and the total number of areas in which technology was incorporated. Faculty currently use technology to support lecture in the physical classroom and to disseminate course resources to students. All faculty felt proficient with most classroom technologies except for mobile devices; faculty might be unfamiliar with how mobile devices can be used as a classroom technology. However, faculty also indicated a desire for more training on mobile devices. Interestingly, when asked whether they had access to mobile devices and how frequently they use them, about half of the participants wished to learn more or were already using mobile devices in the classroom, while the other half did not have access and would not use them. This unusual finding may reflect the characteristics of average technology adopters, who may not be inclined to try new technology without comprehensive training and modeling.
One final implication of the survey outcomes is that, given the patterns of faculty familiarity, students are more likely to engage passively with technology in online environments. Because most faculty use the online classroom to present course resources or host lecture videos, students may not gain full exposure to 21st century skills or digital citizenship.
Figure 1.1 provides more details into the survey’s findings.
Figure 1.1. Classroom Technology Survey Results and Implications Presentation.
Phase Two: Brainstorm classroom models.
After the survey results were released, stakeholders, including digital librarians, professional development department members, and computer information department members, met to discuss current classroom models and began brainstorming possible models that would meet the future needs of faculty. The meeting started with a review of the survey described above, followed by a review of input data pulled from the classroom podium central tower. These data helped us understand how often each classroom device was accessed throughout the academic year. The input data reinforced the survey data by identifying the podium PC as one of the most used classroom technologies and the VCR among the least utilized.
After the usage background information was presented, the discussion of future classroom models ensued. Observing the concerns expressed by each stakeholder department, it became apparent that the issue is deeper than I originally thought. I had believed that the most difficult component of developing a new classroom model was faculty support. However, in addition to faculty support, budget, limitations of physical space, inventory logistics, and training demands were among the concerns raised. Despite these concerns, all were optimistic about the future directions proposed by the collaboration leader.
Figure 1.2 below summarizes these models.
No final decision about future directions was reached; however, the meeting concluded on a positive note. All departments left with assignments to gather more information by the end of summer, at which time the committee would reconvene to work toward a decision.
Phase Three: Curate activity examples for the classroom models.
The final phase of this project looked at examples of how some classroom models could be used to support active learning. One classroom model used at the university is a high-tech active classroom. These classrooms consist of flexible furniture whose arrangement promotes collaboration; the circular desks allow students to focus on each other rather than on the lecturer. Additionally, the room features an advanced podium system where each flex-furniture center connects to its own wall-mounted monitor. The lecturer can control which monitor is displayed and can share input from one computer to all of the others.
Prompted by a request from my department to conduct a high-tech classroom demonstration for an accreditation visit, I created a demonstration of two possible uses of that space after meeting with the Food and Nutrition Program Director, the Digital Librarian, and the DEL Program Director. The showcase featured examples of both low-tech classroom activities and high-tech activities for that classroom space.
The low-tech example features uses of the flex furniture and space. It highlights how the classroom can be set up as skill development stations, each focused on a specific counseling or clinical skill. Students would have the opportunity to practice the skill at a station before demonstrating it to an assessor. Once mastery was achieved, the student would rotate to the next station.
The high-tech example uses the podium’s ability to present each group’s input on its own monitor. In this scenario, students are given a component of a larger problem to work on in their group, and each group presents its work on its corresponding monitor. Because the podium can project each group’s work onto its individual monitor, the educator can then ask students to take a step back and review all of the components together, reflecting on the big-picture idea behind the project.
Conclusion and Reflection
The work I’ve conducted on future classroom models left me feeling very optimistic about the future of educational technology. I was pleased that the common misconception of faculty resistance to technology was not a barrier to classroom technology change at the university; the results of the classroom technology survey revealed the opposite. The stakeholder meeting concluded with interesting ideas in which educators would no longer be limited by the static technology available in the physical space. Educators may very soon be able to connect their own devices and allow students to connect and interact with classroom technology more actively. Even within current classroom models, active learning does not require advanced technology; it can take place through low-tech activities. This was also expressed in the survey, as faculty indicated high proficiency with, and a desire for more, whiteboards in the classroom. With both high-tech and low-tech possibilities for active learning, students can gain an appreciation of digital literacy, creativity, collaboration, and problem-solving. As a higher education institution, we cannot deny that technology has taken hold of our teaching and student learning. Active learning and student engagement are held in high esteem as our attitudes and ideals shift from traditional classrooms to classrooms that support the development of 21st century skills.
In reviewing my past blog posts, I notice that reflection often appears as an important step toward a larger concept. In general, reflection is needed to gather a baseline of current practices, and it informs either knowledge gained or next steps for improvement.
For educators, reflection can help guide instructional practice. In the “Applying Formative Assessment in Professional Development” post, formative feedback is used as a vehicle to promote both self- and peer-reflection in professional development, which reduces the time, energy, and resources expended when compared to a formal, summative evaluation. The researchers quoted in the “Peer Coaching Focus- For Teacher or Student Outcome” post agree on the utility of reflection: one of the main indicators of continuing-education success is the time allotted for educators to reflect.
When reflection is offered as part of a process, individuals can also be better informed of their own learning and/or skill deficits, which can help identify continued professional development needs. In “Instructional Coaching: Using Rubrics to Quantify Qualitative Data for Improved Teaching Outcomes,” self-reflection occurs after reviewing compiled feedback from students on assignments and/or teaching practices. Under these circumstances, instructors can use self-reflection to gain insights into practices that are effective, define areas of improvement, understand how the students are learning, and address whether or not the instructor’s expectations of teaching and learning have been met. One of the advantages of this type of process is that the feedback is factual rather than emotional or based solely on the educator’s experience.
Peer reflection can also be a helpful mechanism for gathering insight into instructional practice. Because it is a collaborative event, it involves a social construct for learning. In the “Creating a Peer Coaching Culture” post, several key components of a successful peer coaching session identified by Dr. Gotteman involve both self and group reflection. Reflection in this process helps identify areas of improvement for the educator who is being coached and helps both peers establish a starting point for improvement.
Les Foltos agrees with this idea, identifying reflection as both a method for conducting peer coaching and a part of the peer coaching cycle. Reflection is crucial to understanding next steps when considering the appropriate length of a peer coaching relationship, which introduces a cyclical nature to the coaching process.
One way to teach students how to give good constructive feedback is to use models that require reflection as part of the process. The RISE Model provides students the opportunity to reflect before any additional feedback can be given. By reflecting first and commenting second, students can build a bigger picture and a better understanding of the work they are evaluating. The RISE Model can also be used to guide self-reflection on students’ own work to inform an improved process.
When implemented on a regular basis, reflection can help both students and educators gain deeper understanding of the learning process. Reflection brings to light crucial information that guides a process and builds a pathway for continued success.
As a technology coach, one of my responsibilities is to “Advocate for policies, procedures… to support implementation of the shared vision represented in…technology plans and guidelines,” according to the ISTE Coaching Standards (ISTE, 2017).
I had an opportunity to contribute to the shared vision and future planning of my department during the second year of my master’s studies. My department was undergoing a revision of its departmental goals and program outcomes. Our director asked faculty to evaluate what was important for our students to learn and/or demonstrate prior to leaving the university. Understanding that 21st century skills are an integral part of the future workforce, I suggested we include elements of digital citizenship.
This contribution was influenced by an informal assessment I had conducted of our department’s digital citizenship readiness, which identified digital communication as an area of improvement for our students. Therefore, as part of our new digital citizenship goal, each program made a commitment to hold students accountable to digital etiquette. Figure 1.1 highlights the outcome of that commitment.
I worked with the instructor of the introductory FCS course to build the evidence of mastery for this departmental goal using posts from this learning portfolio and modules I had previously created in Canvas (our learning management system). Implementation of these assignments is currently taking place. We will evaluate the assignments against our benchmarks at the end of the quarter.
In my continued exploration of professional development and evaluation, I partnered with the Educational Technology and Media department at my university to conduct a pilot survey on what types of classroom technologies faculty currently use and how they use them. The results of this pilot will inform necessary modifications to the data collection tool prior to faculty-wide administration at a later date. This is a summary of the project outcomes.
Purpose and Objectives
The aim of this pilot study was to assess current classroom technology usage at a private university in Seattle, Washington. A secondary purpose was to test the data collection tool.
Five study objectives, including two related to data collection, were created:
1. Determine the current level and type of technology usage by faculty.
2. Determine faculty readiness for online teaching (through analysis of objective 1).
3. Determine if current technology offered to faculty meets the needs of the faculty.
4. Collect feedback from pilot participants on survey questions for clarity.
5. Determine if the pilot survey collects the intended data.
A survey was distributed to a convenience sample of 20 participants, who were able to recruit others. The participants were asked questions regarding the areas of teaching where technology is incorporated, the types of classroom technology used, student use of classroom technology, and their self-identified rate of technology adoption. Descriptive analysis was run to characterize the sample’s technology use, along with correlation tests to begin understanding use profiles.
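The analysis described above could be sketched in a few lines of code. The following is a hypothetical illustration only: the column names, response values, and the choice of Spearman correlation (a common option for ordinal, Likert-style survey items) are my assumptions, not the pilot’s actual instrument, data, or code.

```python
# Hypothetical sketch of the pilot analysis. The column names, response
# values, and statistical choices below are illustrative assumptions,
# not the actual survey instrument or results.
import pandas as pd

# Synthetic responses for 11 participants: self-identified adoption rate
# (1 = slow adopter ... 5 = fast adopter) and the number of teaching areas
# where the respondent incorporates technology.
responses = pd.DataFrame({
    "adoption_rate": [2, 3, 3, 4, 5, 2, 4, 3, 5, 1, 3],
    "areas_of_use":  [3, 5, 4, 6, 7, 2, 5, 4, 8, 1, 5],
})

# Descriptive statistics characterizing the sample's technology use.
print(responses.describe())

# Spearman's rank correlation is a reasonable choice for ordinal,
# Likert-style items; it asks whether the two measures rise together.
rho = responses["adoption_rate"].corr(responses["areas_of_use"], method="spearman")
print(f"Spearman rho: {rho:.2f}")
```

With a sample this small, any correlation is at best suggestive, which is consistent with treating the pilot as a test of the instrument rather than a source of generalizable findings.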
The results of this pilot study indicate that, of the eleven (11) participants who completed the survey, most professors are fast-to-average technology adopters, meaning they are open to technologies in the classroom and use technology in at least one area of their teaching and student learning.
Professors feel mostly comfortable with supported classroom technologies unless they do not have access to them. If professors do not feel comfortable with a technology, students will also not be exposed to it; this can include technologies that all professors have access to but that are not part of every classroom, such as mics and webcams. Professors also tended to rely more heavily on supported technologies than on social media, even when accounting for self-identified technology adoption rate. Professors used, on average, five (5) of the supported technologies, with Canvas the most commonly used. In comparison, professors used only one (1) social media platform on average, with YouTube the most preferred.
The faculty in this study were supportive of student use of technology in the classroom, allowing students to use all types of technologies and discriminating only as to when during the class period technology may be used.
These findings cannot be generalized to the entire faculty population. Recommendations for clarifying survey items to improve responses include defining major technology terminology and making changes to the Likert scales.
Good professional development doesn’t just happen on its own. To meet ISTE Coaching Standard 4, professional development needs not only timely execution by a knowledgeable instructor who respects adult learning, but also support from administrators. While it is clear to me that administrators inform the policies and procedures that govern an institution’s culture, I must admit that I do not have much background knowledge of, nor an intimate understanding of, the process administrators use to determine professional development. For this post, I’d like to investigate that process a little more closely. In particular, I would like to take a closer look at the role administrators play in the successful implementation of professional development.
Through my investigation, I gathered insight into what administrators face on a daily basis. Much like teachers navigating the changing landscape of strategies and methods needed for 21st century skills, administrators face the same predicament in engaging students and teachers with these skills. What is unique to the administrator’s challenge is the added responsibility of initiation: change starts with them, so their attitudes and behaviors set the pace for successful improvement. Administrators who value technology and the development of 21st century skills are viewed as technology leaders who must demonstrate willingness to learn, be flexible, and accept ongoing change for technology adoption and implementation to occur (Grady, 2011). An administrator’s role as a technology leader begins by setting a clear vision and understanding the standards that govern that vision (Grady, 2011). Grady’s view of the administrator’s qualities mirrors the ISTE standard in that good administrators not only communicate vision and goals to faculty but also model good technology use in various modes, provide engaging professional development, and engage in continuous professional development themselves as lifelong learners (Grady, 2011). Grady also shares that administrators who are good technology leaders recognize faculty as the cornerstone of implementation (Grady, 2011). Therefore, while professional development may create awareness about specific policies, it is understood that true implementation requires further action and evaluation.
Former teacher turned administrator Lyn Hilt shares her investigation and thoughts on the administrator’s role in implementing successful professional development. After reflecting on her experiences undergoing professional development as a teacher, and having no recollection of anything she implemented from those experiences, she concludes that rather than engaging in “development,” institutions should adopt the idea of “professional learning.” One key facet Hilt wishes the reader to consider is that “teachers are not vehicles through which schools deliver programs and policies” (Hilt, 2011). Instead, Hilt offers the idea that teachers are individuals with passions and interests, so an administrator’s true role is to foster a desire to learn (Hilt, 2011). Hilt buys into the notion that teachers are adult learners and that effective “development” should therefore take this into consideration. When teachers feel true excitement about learning, that learning becomes implemented in their teaching (Hilt, 2011).
Both Grady and Hilt agree that building community and shared experiences are key to successful professional development. Grady offers the “teacher-to-teacher” model, in which technology modeling takes center stage: teachers demonstrate learning activities to other teachers (their audience) while allowing that audience an opportunity to explore and implement the activities (Grady, 2011). While the administrator’s role in this model may seem minimal, successful implementation depends on allowing teachers opportunities for repeated activities, as the model does not work well in isolation. In addition, administrative support is crucial in providing key resources and time to practice the skills learned in each “teacher-to-teacher” session (Grady, 2011). While Grady’s model fosters community through localized support, Hilt emphasizes community and collaboration supported through professional learning communities (PLCs), which represent a broad network of professionals learning from each other in addition to local resources. In the PLC model, teachers are viewed as experts and therefore are afforded active participation and choice in professional development. Hilt offers several characteristics of teachers as experts, summarized in Figure 1.1 below.
In both of the models described above, the teachers are in control of the learning itself while administrators support that learning. As established, successful implementation of professional development, or learning, relies on the administrator’s ability to establish a clear vision, communicate that vision while modeling good technology practices, and provide resources. When teachers are given an active role in an environment that supports ongoing learning and fosters community, learning that shapes teaching occurs.
In these past few weeks, I have been exploring professional development (PD) models that optimize adult learning. The primary focus of these posts has been the characteristics of adult learning and the various professional development formats that honor those characteristics. While understanding these models is important so that participants gain the most from their professional development, in this post I’d like to focus on applying these concepts to content, exploring the educational technology best practices described in ISTE Coaching Standard 4b: “Design, develop, and implement technology rich professional learning programs that model principles of adult learning and promote digital age best practices in teaching, learning, and assessment” (ISTE, 2017).
In investigating digital age best practices, formative assessment appeared as a recurring theme. Formative assessment, as part of a feedback loop, empowers learners to engage in the trial and error of learning safely and with minimal risk. Applying formative assessment to professional development could offer similar results. In applying this idea to the ISTE standard, I began wondering: what digital tools could be used to teach teachers about the importance of formative assessment, and why is it a best practice?
Feedback loops are often used as a teaching best practice to help students build 21st century skills. As described in other posts on this blog, of the four different types of assessment, traditional, or summative, assessment measures learning after an assignment has been turned in. Summative evaluation assumes that a student has “learned” after an intervention (such as teaching), and the educator evaluates the extent of that learning (Vlad-Ortiz, 2018). While summative assessment is useful for formal evaluation, it may not be timely, nor does it help students improve if offered only as one-time feedback (Vlad-Ortiz, 2018). Where summative assessment is formal and final, formative assessment is more casual and ongoing, as the evaluation occurs during the learning (Vlad-Ortiz, 2018). Formative assessment therefore provides a checkpoint for student understanding (Office of Educational Technology, n.d.).
I explored the benefits of feedback loops for students in this post; here I’d like to expand the investigation to include formative feedback as a tool in adult learning. The Office of Educational Technology found that formative feedback, when coupled with technology tools, may be more complete than traditional assessment and may “reduce time, resources, and disruption” in conducting the assessment (Office of Educational Technology, n.d.). These benefits help educators, as formative assessment may provide an avenue for capturing teaching qualities that open opportunities for “self-reflection, peer reflection, feedback, and supervisor evaluation” (Office of Educational Technology, n.d.). Extending these concepts further, formative assessment can be used in professional development as a means to inform instructional practice, where participants track their own learning (Office of Educational Technology, n.d.). This means that meaningful evaluation can occur more rapidly and frequently, offer more insight, and help guide professional development needs.
Educational technology tools that can be used for formative assessment.
There are several educational technology tools that can be used for formative assessment. Common Sense Education created a list of the top 27 tools for formative assessment, available here. These formative feedback tools include the following features: student progress tracking, interactive and collaborative activities, student-paced learning, and instant feedback to both students and teacher. Formative feedback is given through interactive slideshow presentations, video responses, multimedia platforms, content-mapping, quizzes (including clickers and polling), and backchannel chats. In creating the list, Common Sense Education agrees with the Office of Educational Technology, stating that the best formative assessment tools help students (and participants, in this case) self-reflect and assess so that they understand their current level of learning and self-identify areas of improvement (Common Sense Education, n.d.).
Incorporating formative assessment into professional development.
Incorporating formative assessment in adult learning must assume that participants are learners who are joining the professional development with a variety of motives relevant to their work situations. Though there are quite a few professional development resources on formative feedback tools available on the internet, I’d like to use a professional development video I found on YouTube, “10 Tips for Formative Assessment with Technology: Meaningful, Sustainable, & Scalable,” as an example. In the video, Dr. Monica Burns walks participants through her tips by highlighting the main features of some formative feedback tools and how to use them. A summary of her tips is provided in Figure 1.1 below.
Though the video is purely informational, as Dr. Burns lectures for about 30 minutes on her ten tips, it could be a useful resource for participants who are highly motivated. The professional development model used assumes that participants already have an awareness of formative assessment and simply need guidance or ideas on how to implement it in their teaching practice.
According to the ISTE standard, best practices for effective PD include modeling (ISTE, 2017). While the workshop above may model ways to use each tool through verbal and visual description, it fails to include participant buy-in and interaction. Formative feedback could have been built into the professional development itself, allowing participants an opportunity to experience instant feedback through the lens of a learner. For example, demonstrating how to gauge comprehension to better understand the audience’s needs could have been accomplished with a backchannel chat or the polling/quiz apps described in the video. This tangible and experiential approach could help increase self-efficacy with technology tools for mixed audiences, with the presenter shifting to a facilitation role at certain points in the professional development. When presenters start thinking about their participants as learners, professional development becomes stronger and more impactful, which can yield better improvements in teaching and learning.
References

Common Sense Education. (n.d.). Top tech tools for formative assessment. Available from: https://www.commonsense.org/education/top-picks/top-tech-tools-for-formative-assessment

Office of Educational Technology. (n.d.). Section 4: Measuring for Learning. Available from: https://tech.ed.gov/netp/assessment/

Vlad-Ortiz, C. (2018). Incorporating feedback loops to develop an empowered student [blog]. Available from: http://professorvlad-ortiz.org/incorporating-feedback-loops-to-develop-an-empowered-student/

Vlad-Ortiz, C. (2018). Instructional coaching: Using rubrics to quantify qualitative data for improved teaching outcomes. Available from: http://professorvlad-ortiz.org/instructional-coaching-using-rubrics-to-quantify-qualitative-data-for-improved-teaching-outcomes/
In my last post, I discussed at length the characteristics of effective professional development (PD), which should include “…interaction, relevancy, purposefulness, and focused on the learner” (Vlad-Ortiz, 2019). Since learning requires effort, professional development models that include a social context and an active component tend to be the most successful (Vlad-Ortiz, 2019). Keeping in mind the ISTE standard for professional development addressed in the last post, one model known as the “facilitator model” caught my attention as having the potential to meet the above criteria. According to Dr. Frances Gipson, to “facilitate” means to make easier (Gipson, 2012). The assumption is that a facilitator acts as a guide and manages a group toward a shared goal or purpose. Dr. Gipson warns that the word “facilitator” is often misinterpreted as a passive role; however, a good facilitator acts more like a leader, ensuring that the group makes good use of its resources, decision-making power, and problem-solving skills (Gipson, 2012). Because facilitation requires active participation from all participants, could this model help improve professional development learning outcomes?
To begin addressing this question, one must first understand how adults learn. According to researchers, the specifics of how adults learn are still largely unknown, and more research is required to complete that understanding (Borko, 2004). What is currently understood is that learning is a dynamic activity that takes time to develop, and that learning opportunities can occur anywhere, even in a brief hallway conversation (Borko, 2004). Learning can be facilitated with a few considerations from the adult learning model, or "andragogy," summarized in Figure 1.1 below.
Under Dr. Knowles' assumptions, good professional development should be goal oriented, relevant, and practical; it should respect the learner's time and expertise and bring the learner into an active role rather than a passive one (Office of Head Start, n.d.). This is not unlike the criteria my colleagues and I created in my previous blog post. As adult learners, we want professional development to address our needs rather than tell us about our needs.
Facilitation as a professional development model.
Dr. Hilda Borko conducted a study of various professional development models to begin understanding the complex relationships that exist between teachers, students, and learning. It is through this work that she came to understand that more research is needed to explain how adult learning works (Borko, 2004). She explored several case studies that used facilitation models as a form of professional development and concluded that facilitation can be successful if the professional development is well defined (Borko, 2004). In particular, the most successful programs, where learners adopted strategies more readily and rapidly, had clear descriptions of the facilitator's role, specific learner/participant outcome measures, and well-developed activities and materials that were transportable across a variety of contexts (Borko, 2004). One caveat: this success depended on facilitators leading small groups of teachers with common goals. Scaling up to larger groups may present challenges, as the activities and materials may no longer apply to everyone's needs or context (Borko, 2004).
Dr. Borko's fears about scaling up may not be warranted, as the facilitation model has been used in many contexts. In Turin, Italy, researchers followed the progress of a teaching community that implemented a "teacher-facilitator" model in place of traditional professional development. Educators were followed over a period of 10 years to evaluate any changes in their teaching profiles, particularly in the field of "cooperative learning" (Ellerani & Gentile, 2013). Under this model, teachers were placed into groups with an "expert" teacher whose role was to facilitate professional development, emphasizing job-embedded skills and collaborative learning. The teacher-facilitators ultimately helped establish professional learning cohorts (PLCs), which later expanded into interdisciplinary networks that included administrators and other schools in the district (Ellerani & Gentile, 2013). The researchers attribute the success of this program to three factors: 1) the facilitation skills of the teacher-facilitators, 2) an increased focus on the importance of collaborative learning among teachers, and 3) increased job-related support from the district (Ellerani & Gentile, 2013).
Qualities of a good facilitator.
Regardless of the scale at which the learning takes place, the key element of effective learning in this model is a good facilitator. Dr. Gipson summarizes her definition of a good facilitator through a concept known as the Five "C's," described in Figure 1.2 below.
Good facilitators understand how to establish a community that values inquiry and the opinions of others as a way to invite participation from all members. To do this, facilitators must be both firm and flexible with the curriculum while communicating their intentions well to the group (Borko, 2004). These facilitation skills can be developed over time with appropriate preparation and resources (Borko, 2004).
Through this investigation, it can be concluded that facilitation as a professional development model does support adult learning when implemented correctly. The skills of the facilitator are crucial to converting learning into implementation, while appropriate resources fuel that success. As Dr. Borko notes, facilitation may not be useful or appropriate for larger groups, in the short term, or as one-time development. However, special considerations can be made to scale such development, as demonstrated by the Ellerani and Gentile research, which found that "there is a strong correlation between the development activities of teachers and their actual development as teachers" (Ellerani & Gentile, 2013). Facilitation respects adult learners by putting them in control of their learning; this in turn helps change their attitudes about learning and ultimately helps them put into action what they have learned.
Borko, H. (2004). Professional development and teacher learning: Mapping the terrain. Educational Researcher, 33(8). Available from: http://www.aera.net/uploadedFiles/Journals_and_Publications/Journals/Educational_Researcher/Volume_33_No_8/02_ERv33n8_Borko.pdf
By intention, professional development offers an opportunity for individuals to learn about new advancements in their respective fields, including industry best practices. However, professional development (PD) is often criticized for offering content, format, or context that is not relevant. In the DEL program, we were asked for our opinions on what makes for good PD. Reflecting on my experiences, I noted that good professional development should be actionable, timely, and applicable; PD should focus less on the "what" and more on the "how." My colleagues commented that good PD is characterized by interaction, relevancy, purposefulness, and a focus on the learner. Bad PD, on the other hand, can be characterized as singular, stoic, and passive. Looking back on my own experiences, I remember one PD training that was a five-hour video of a therapist droning on about the physiology of stress. While the topic was interesting (for about half an hour), without any engagement or application the training soon felt like an endless lecture. What made it worse is that the PD worked on the premise that a bombardment of facts equates to deep knowledge; however, "having knowledge in and of itself is not sufficient to constitute as expertise" (Gess-Newsome et al., n.d.).
Criteria for Good Professional Development.
Because my colleagues and I all work in education and have experienced our fair share of PD, both good and bad, we were able to use our personal experience to determine the above criteria. Research on how we (humans) learn demonstrates that my classmates and I were not wrong. The goal of any professional development should be to impact student learning by augmenting pedagogical content knowledge (Gess-Newsome et al., n.d.). In other words, the main idea behind PD is to help individuals become experts. According to Gess-Newsome et al., expert knowledge is deep, developed over time, contextually bound, organized, and connected to big ideas (Gess-Newsome et al., n.d.). This is interesting considering that most PD is offered in a single session of about an hour, hardly enough time to begin the application and reflection necessary for that content to become "expert knowledge."
What most PD, including my example of bad PD, lacks is the opportunity to apply and reflect. Research on how we learn notes that learning requires two elements: 1) a social context, which helps us maintain high levels of motivation (because learning takes incredible amounts of effort), and 2) an active component that allows the learner to engage with ideas that can create new experiences, build opportunities to acquire knowledge, or directly challenge what we already know (Gess-Newsome et al., n.d.). Engaging the learner also takes into consideration that learners come into a session with their own conceptions and preconceived notions based on their current learning needs. To include all of these factors, the researchers from Northern Arizona University strongly recommend the five principles of effective professional development summarized in Figure 1.1 below.
ISTE Standard 4b explores the properties of good professional development by defining the coach's role as "design[ing], develop[ing], and implement[ing] technology rich professional learning programs that model principles of adult learning and promote digital age best practices in teaching, learning, and assessment" (ISTE, 2017). The standard highlights all of the principles of effective PD. Coaches should be able to deliver PD that meets the needs of the learner within a context that is relevant to the learner.
While understanding the theory behind effective PD is important, applying these theories will prove crucial in the upcoming months, as I have been asked to facilitate a professional development session at a conference. My audience will be a mixed group of registered dietitians with various levels of expertise in both nutrition education and technology. Understanding the need to develop effective PD, I realized it will also be important to understand which professional development model works best for audiences of mixed technology skill so that I can meet learners' needs.
After some investigation and feedback, it appears the best approach to this inquiry will be in two parts: 1) understanding models for technology-infused PD, and 2) understanding the principles of learning differentiation.
Technology-Infused Professional Development.
In line with the education best practices noted in the ISTE standard above and the evidence-based practice required for all dietetic professional development, the PD should use technology in a way that models adult learning and exposes learners to using technology well in a professional setting. Northern Arizona University researchers offer four PD models that utilize technology in different ways, summarized in Figure 1.2 below.
Reflecting on these four models, professional development does not have to be limited to just one; all could be used as part of an ongoing development process. However, the one that struck me as most useful for the session I am planning is the face-to-face model with technology support. I like the idea that the face-to-face portion isn't a means to an end but rather the beginning of a longer-term conversation. The researchers stressed that audience engagement shapes the direction of the PD through the development of shared learning goals (Gess-Newsome et al., n.d.). This is a unique way to view the face-to-face model that has traditionally dominated PD.
Differentiated learning implies that educators take into consideration individual learning styles and levels of readiness prior to designing the lesson plan (Weselby, 2014). According to Concordia University, there are four ways to incorporate differentiated learning:
1) Content: Though the role of any educator is to ensure that learning outcomes are met, differentiating content concerns what learners are able to do with that content, applying Bloom's taxonomy of thinking skills. Depending on the level of the learner, one learner might be content with simply defining a particular concept while another will strive to create a solution with that same content. Allowing learners to select their level of readiness through content differentiation allows for a smoother introduction of the material.
2) Process: In process differentiation, learners engage with the same content but are allowed a choice in the way they learn it. Not all learners require the same level of instructor assistance or the same materials. Process differentiation also assumes that some learners prefer to learn in groups while others prefer to learn alone.
3) Product: In this model, the learning outcome is the same but the final product is different. Through product differentiation, learners can choose how they demonstrate mastery in a particular area.
4) Learning Environment: A learning environment that accommodates different learning needs can be crucial to optimal learning. Flexibility is key for this type of differentiation, as learners may want various physical or emotional learning arrangements (Weselby, 2014).
One of my colleagues suggested that I consider differentiated instruction as a strategy for approaching the varied technology skill levels of my target audience. I must admit that at first I wasn't sure how this could apply to a conference setting. However, considering the face-to-face, technology-infused PD model above, differentiated instruction suddenly became not only plausible but also the more effective method. Differentiated learning aligns with the principles of effective PD by allowing the session to be as learner-centered as possible. Because learners take more responsibility for their own learning, they become more engaged in the process.
In searching for professional development models that incorporate technology for mixed audiences, I learned that understanding the pillars of good professional development is just as important as applying technology in a way that is relevant and accessible to everyone. Taking these two factors into consideration, effective PD for my conference will need both a technology-infused model and the opportunity for differentiated learning.
I embarked on a project in which I took on the role of peer coach. Using the communication skills and logistical training from class, I honed my coaching skills over a ten-week period. I'm no stranger to coaching; in a former career I counseled patients on therapeutic diets, diet change, and overcoming barriers to change using very similar principles. In fact, I became quite nostalgic throughout this process. The strange and unfamiliar term "peer coaching" became comfortable and familiar once concepts like "probing questions" and "building rapport" came to light. With no billing hours or diagnoses to defend (mainly to insurance), peer coaching felt light and freeing compared to coaching in a medical setting.
The project itself consisted of enlisting the help of a peer who would be willing to undergo a collaborative revision of an existing lesson plan. The idea was to spend time building rapport and establishing set roles for each peer prior to the collaborative process. The collaboration would then focus on one major area of concern to be improved in the lesson plan. Following this revision, both parties would reflect on the process to provide feedback.
The Coaching Process
To start the project, I partnered with a former supervisor, SK, who is very open-minded about incorporating technology in the classroom. She had been wanting to explore new ways to use technology in online and blended courses beyond simple course management. She felt that online classes tended to be boring or isolating because most are designed to be "work at your own pace" and independent. Faced with planning a new blended course set to go live during the next academic year, SK sought me out for suggestions. Throughout the peer coaching process, we had four face-to-face meetings (where the majority of the collaboration took place) while also communicating follow-up items via email. A summary of these encounters is provided below:
First Meeting. In our first meeting, SK shared more information about her new course, intended to be a blended classroom with community engagement components. Beyond the course description, the only other information established was the set of course objectives she had developed after reviewing textbooks with similar themes.
After understanding more about the scope of the work, we established our roles and expectations for our time together, and ended our session by creating a SMART goal to guide our future work. The expectations for me in the coaching role were clear: I was to facilitate the assessment and course-calendar development process, keep track of our progress toward our goal, and provide key resources needed to complete the work. My peer would then complete all other work necessary to continue to the next phase.
As part of this first phase of coaching, I also met with my direct supervisor to share the above information and ensure that our work aligned with departmental goals. Interestingly, this discussion coincided with a revamp of the departmental goals unrelated to this project. Later in the quarter, technology incorporation and digital citizenship were included as new goals. With this new vision, our coaching work aligned with our departmental values. Our supervisor was very encouraging and supportive, and wanted feedback on the results of our collaboration at the end of the process.
Second Meeting. Prior to our second meeting, I reflected on SK's goals and our previous conversations. Given that the course objectives were already established, I wondered if the "Backward Design" model would be a good starting point for our work. I shared this intention with my peer via email, along with resources on Backward Design. During our second meeting, we took a closer look at the established course objectives and began identifying the thinking skills that would satisfy each one. We soon discovered that one objective in particular required both lower-order and higher-order thinking skills to complete successfully. SK expressed a desire to use this objective as our starting point since it was the largest and most complicated. We agreed to develop a unit around this objective that would then serve as a model for the subsequent objectives/units.
Third Meeting. At the end of our second meeting, SK had expressed a concern about her choice of text, wondering if it was the best option available. I suggested using multiple sources that would be updated more frequently, including websites, journal articles, and open-source textbooks. I promised to provide a few databases of open-source materials for SK to review prior to our third meeting.
SK made good use of the databases and had established a rough draft of the course calendar, in which she separated big topics into one-week units along with associated learning outcomes for each unit. For the big unit we had decided to focus on, SK developed a three-week timeline with associated reading assignments and engagement activities. For the remainder of our meeting, we discussed the engagement activities at length, focusing on any potential technology integration that would allow for collaboration.
Fourth Meeting. By this time, we had already met our SMART goal. Prior to the meeting, I used our loosely defined notion of engagement (including active learning, collaboration, and participation) to make notes on the unit's learning activities for future consideration. These suggestions mainly addressed prior concerns about isolation in traditional blended classrooms. We went through the suggestions together. My peer then expressed a desire to stop our work for the time being, as she was happy with our progress and wanted time to reflect upon the ideas explored in this last meeting.
Feedback and Reflections
At the end of our peer coaching relationship, SK provided positive feedback on our progress. She was happy that we were able to remain on task and meet our SMART goal within our allotted time despite very busy schedules. She appreciated the ability to ask for suggestions and bounce ideas off each other. Talking through ideas was helpful for understanding how each component could be made more engaging in an online setting. Despite our momentum in organizing the blended classroom, SK noted that she will be taking a sabbatical, making our last meeting an excellent stopping point.
For an outside perspective, one of my colleagues, LB, reviewed the progress outlined above and agreed to provide feedback. LB's comments and reactions to the project were positive and focused on three aspects:
1) Coaching relationship: LB noted that the relationship my peer and I had worked well to help us achieve our goals. Establishing clear expectations early on provided the accountability my peer wanted and gave her a head start in course development.
2) Unit organization: Though my peer and I didn't plan and evaluate a lesson plan, which was the original scope of this project, LB commented on the process of developing the unit. She noted that the assessment components of our chosen unit appeared fun, engaging, and meaningful for students.
3) Coaching skills: LB and I shared experiences during this project, and she commented that I performed my coaching skills well. While I think my past experience partially explains this, I also believe my success is rooted in the fact that my peer is an experienced collaborator and understood what a collaborative partnership should look like.
Things that went well. Taking LB's comments into consideration and reflecting on my performance, I had an overall positive experience. My peer and I were very appreciative of one another's efforts toward the progression of our project. We stood by our established expectations and fulfilled our roles accordingly. One aspect that surprised me was that my peer saw me as a subject matter expert and expected that coaching style. Interestingly, I did not see myself as the "expert," opting instead for a more collaborative coaching style. In the end, my role and style morphed into a little of both. One delightful discovery came out of our brainstorming and collaborative efforts: we used our strengths to explore a creative way to use Pinterest as a visual timeline for a major project. By sharing what knowledge I had about existing technologies and offering plenty of options and suggestions for their use, I enabled my peer to choose the option that was right for her course, or the one she felt most comfortable exploring.
In addition to responding well to my peer's expectations, another strength of this project was our communication style. Because SK and I had worked together previously, we had already established rapport and understood each other's working styles. SK knew that her preferences would be honored throughout the process and that her decisions would be supported, because she was encouraged to express herself openly and honestly. Most of our communication was face-to-face, which only strengthened our good communication. Email was limited to follow-ups, which helped ensure accountability on both sides. Each email would review past conversations, action items to be completed before the next meeting, and any resolutions to concerns, such as the open-source databases.
On a curious note, SK felt very motivated to complete her part in a timely manner because she was respectful of the fact that this was an assignment for me, and she didn't want to "mess up" my project.
Things that could have been improved. LB mentioned several times that she enjoyed the layout and organization of the assignments prepared for the big unit, calling them a strong feature of the project. However, I cannot take credit for that organization, as my peer completed this work. SK knew what she wanted, and I served as a resource to help her reach that goal. Because of this, I feel that I didn't really do anything aside from giving options and opinions on the information my peer brought forth. I must recognize, however, that this is what my peer wanted, and in this particular coaching scenario it worked well. In the future, I would also like to improve my communication skills to be more in line with the communication methods learned in this course. Should I collaborate with a peer who isn't as clear about what they want, probing and clarifying questioning skills will prove crucial to success.
While the topics of our meetings were loosely set in advance, I never created agendas or had any particular topics to review aside from the Backward Design model. Keeping the meetings loose did allow for more open-ended exploration of our goals, but I wonder what the outcome could have been had I better defined our meetings. Again, this style worked well for this particular coaching scenario, but I'd like to keep this idea in mind for a future coaching partner who perhaps needs more structure or guidance.
Thoughts on coaching for the future. I would love to incorporate a coaching culture in my department. Working with SK was not only an opportunity to help her gain ideas and resources for her new class, but also an opportunity to get to know each other in a different environment. Our collaboration was meaningful and fruitful.
Though we currently do not have a one-on-one coaching program in my department, we do have classroom observations as one of our required professional development strategies, so the basic idea and structure are already in place. I'd like to expand upon that work to create a more constructive professional development environment where professors move away from working in isolation toward working in collaboration. I've already begun exploring coaching culture in a previous blog post available here. Moving forward, I would need department input and an assessment of current thoughts and attitudes toward peer coaching. Should the department approve, more meaningful and fruitful interactions would allow 21st century skills to thrive in our courses.