Reflecting on a “True PLC”

In November, I set out to facilitate a virtual PLC that would run across the remainder of the school year. The PLC was made up of a voluntary group of online instructors from Michigan Virtual School, and we recently gathered for our last session of the year. Below are my reflections on the 8-month experience and the implications that have emerged.

Goals for the Year

  • Develop knowledge and skills as disciplined researchers of our own practice
  • Build capacity to collaborate with colleagues as critical friends
  • Have evidence of the impact of instructional experiments on students’ outcomes
  • Earn 24 SCECHs for participation

How’d we do? 

Based on feedback from the group, our data, and my own reflections, here’s how I’d assess our actual outcomes:

  • Made progress towards goal – Develop knowledge and skills as disciplined researchers of our own practice
  • Met goal – Build capacity to collaborate with colleagues as critical friends
  • Did not yet meet goal – Have evidence of the impact of instructional experiments on students’ outcomes
  • Made progress towards goal – Earn 24 SCECHs for participation

How do I know this is how we did?

Disciplined Researchers – My PLC members all identified an improvement aim they’d like to pursue. They analyzed the whys behind the problem they wanted to solve. They identified change ideas they felt might lead to the improvements they wanted to see. Most of them enacted a change idea, and the group helped them analyze preliminary results. Two members ended up diving into extended research endeavors – one turning into a thesis for a Master’s in Mathematics at Harvard (closing the gap between Calculus AB and BC) and the other informing a viable PLC focus for 2017-18 (improving student writing in STEM subject areas).

Critical Friends – Every remaining PLC member reported that they valued the collaborative structure of our recurring meetings. Members reported that they felt more connected to their PLC colleagues, more aware of cross-departmental happenings, and more aware of what it can look like to engage in disciplined collaboration in a virtual setting.

Evidence of Student Impact – It became clear over time that having each PLC member identify their own improvement aim made it hard to give each effort the time it required to stay disciplined, and hard to hold members accountable for showing up to each meeting with new information to inform their change efforts. Although one member reported that she is more cognizant of parent and mentor communication as a result of pursuing her aim, this has not yet produced measurable results in her aim to increase the percentage of students who meet critical course deadlines. A promising indicator towards this goal is that members reported the initial data gathered has left them “wanting more” and that the effort feels like a “great practice run for future collaborations.”

State Continuing Education Clock Hours (SCECHs) – We started with 9 voluntary PLC members, and we ended up with 5 who will earn the 24 hours. Of the 4 who will not, 1 took a new job out of state, 1 earned a promotion and shifted gears, 1 felt overwhelmed by her course load and opted out mid-year, and 1 opted out from the start because of a conflict with the monthly meeting time.

Moving Forward

Our PLC spent the entire last meeting reflecting on the year, and the last prompt I gave them to work through together was: “Name the core principles and practices that you believe need to animate the work of a PLC that strives for the objectives we set out to achieve.” This is what they came up with:

  • Focused: Establish & work together towards a single, common improvement aim
  • Collaborative: A space where all voices are heard and people can connect with one another
  • Structured: Common agenda structure, recurring meetings (possibly more frequently than monthly)
  • Job-Embedded: Relevant to one’s teaching / role
  • Clear Expectations: Regularly expected deliverables, both synchronously and asynchronously

I’m so thankful to have had the opportunity to engage in this endeavor this year. It came at the start of a new job among new people, doing work in education that is somewhat new to me. I was able to put my passions to work in this PLC, and I feel more connected to new colleagues whom I would not otherwise have had the opportunity to work with. This group voluntarily signed up to do something challenging – another of so many affirmations of the passionate educators who permeate our nation’s schools.

I look forward to the next iteration(s) of this learning experience with a laser-like aim on improving student learning, to prototyping what it might look like to create online structures that support other leaders in facilitating such an effort, and to sharing our experience with the rest of our organization in ways potentially useful for our broader structures for organizational learning.

Plan-Do-Study-Acting

This week, the virtual PLC I’m leading will gather for its 6th time since November. We’re in our third round of what one might call “Plan-Do-Study-Acting,” or working our way through improvement cycles animated by the Carnegie Foundation’s improvement science work.

As I prepared for this week’s meeting, I reflected on the challenges my team members have expressed they’re facing, namely that their change ideas may not (yet) be resulting in the outcomes they predicted. It should go without saying that this can be a frustrating reality, and one that might lead someone to question the effort altogether. So I revisited Anthony Bryk’s Learning to Improve, which has served as a guide for me in this year-long effort, and pulled a 1-page excerpt for the team to review. Here are some key passages from it:

Especially during the early stages of an improvement effort, it is likely that the predicted outcomes will not occur. The improvement team asks, “Why? What did we not take into account?” Their analytic probing helps them form the next PDSA cycle on the iterative journey toward reliable change. (p.208)

If the initial hunches turn out to be wrong and the process deficient, the idea will eventually break down, and the predicted outcomes will not repeat. Such failures are valuable grist for improvement efforts. Reflecting on these failures causes improvers to question critically, “What did we miss? Does the change protocol need to be refined further? Do we need to make adaptations to make it work in this new context?” (p.208)

I want this crew to know that their failures (if you want to call them that) are learning opportunities. In fact, it is in this “messy middle” of an improvement process that some of the biggest learning can occur, if you open yourself up to it and have the “holding environment,” or, as Drago-Severson describes it, “a safe context for growth,” in which to do so.

This meeting will involve some personal connections, some reading, a Consultancy protocol, and a broader reflection on our effort. I am hopeful that the group will leave it with a newfound sense of possibility as they look ahead to the next steps they can take, or the pivots they can make, to strive intentionally towards their aims for improved student outcomes.

If you’re interested in how I’ve facilitated multiple iterations of support for PLC members’ improvement efforts, you can see the agenda from last month’s meeting here and the agenda for this week’s meeting here. Both have been modified for public consumption and to honor the privacy of my wonderful PLC. Feel free to copy them and make them your own if you find them useful.

The “Messy Middle” of an Improvement Process

It seems simple enough to identify something specific about student learning that you want to improve and then to do the work of improving that thing, right?

It turns out it’s not so simple.

Here’s a scenario: You teach an online course and you’ve got a theory of action: If I can grow the amount of student-initiated communication with me, then student outcomes will improve. This seems like a rational theory you could connect dots between…if only I didn’t have to be the one reaching out and prompting students, and instead they were actively seeking me out for support, then we’d see evidence of students doing better because they are diving in as agents of their own learning. Makes sense to me!

But then you start looking at your data, and students who initiate communication are actually not really doing any better than anyone else in class. What’s going on here, and what the heck do you do next?

It turns out this is the actual story of a colleague I’m working with in a virtual PLC effort this year. In fact, he’s the person who initiated the PLC itself. He deeply desires to grow his knowledge and skill around action research, around improvement. So it makes sense that he’s frustrated. This improvement science thing is not a clear path. Identifying a viable improvement aim is tough, and testable change ideas can feel ambiguous or like a crapshoot. But here’s the deal: He identified an aim, and he’s tried out the first of a couple of change ideas:

Aim: To increase the frequency of student-initiated communication.

Change Ideas:

  • Increase teacher presence/personality in the course through the announcements and discussion board (Prediction: This will make students more comfortable reaching out to the instructor.)
  • The one he’s implemented: Make the teacher’s contact information more prominently displayed throughout the course. (Prediction: This will remind the students that reaching out to the instructor is a viable & valuable option.)

If he hadn’t done this, he wouldn’t be where he is now, asking himself what he can do next. I think this is a perfect example of the importance of just picking something to work on as a vehicle for deliberately learning how to improve. It’s also a concrete example of just how messy that vehicle can be.

[Image: quote on the magic of the “messy middle” (Brené Brown, Irene Latham)]

So here’s my question for you – a choose your own adventure, if you will: What should my colleague do next?

Path 1: He should consider the various purposes for which a student might initiate communication and examine how the purpose of the communication correlates with overall student outcomes in the course. Parsing out purpose might suggest different or clearer strategies for catalyzing the specific types of communication he thinks are most likely to positively influence a student’s performance.

Path 2: He should go back to the drawing board and look more specifically at what seems to be negatively influencing student outcomes, and he should re-evaluate change ideas that he can test. It’s possible his first crack at an aim is not an aim at all (he’s actually mentioned this himself…).

Path 3: He should give himself more credit and time! Try the other change idea, and see if the combination of the two elicits a measurable impact. Or double down on the first change idea and measure whether changes occur.

Path 4: Something else entirely, and I will comment at the end of this blog to share my wisdom!

A few True PLC posts ago, I expressed that I wasn’t sure whether we should have committed to one improvement aim as a PLC or whether it was just as meaningful to allow each individual PLC member to identify the aim most relevant to them, and for our PLC to serve as a regular holding space to analyze the fruits of each member’s labor. This story above might be an acknowledgement of the inherent challenges that ensue when you can’t give each individual the thought partnership they may need to refine an aim and to enact change ideas that line up with it. If we were to focus on one collective aim, we might have more time to make sense of the complexities my colleague is facing…

Let me know what you think!

 

Analyzing Improvement Efforts

[Image: Plan-Do-Study-Act cycle]

My online PLC crew is in the thick of testing change ideas in service of their respective improvement aims. We’ve come a long way! We got grounded together. Each individual got clear about what it is they want to change about student learning in their online course. And they’ve all zeroed in on the change ideas they believe are most likely to get them the outcomes they desire. This time when we convened, we paused to make sense of our progress towards improvement.

Here’s how it went down…

Meeting Objectives

  • Get reacquainted with your improvement effort
  • Practice data analysis
  • Analyze one another’s first sets of data resulting from initial changes implemented
  • Reflect on insights gleaned from this first cycle of inquiry to decide what you’re going to do next to strive towards your improvement aim

Meeting Flow (the intended flow, anyways!)

  • 10min: Get Connected (All Together)
  • 5min: Telling Your Story – Free-write to reconnect with, and prepare to tell, the story of your improvement efforts to date (On Your Own)
  • 15min: Practicing Analysis – ATLAS Looking at Data Protocol using one PLC member’s first set of data resulting from change idea (All Together)
  • 25min: Small-Group Analysis & Dilemma Grappling – Replicate data analysis protocol in smaller groups (Small Group Break-Outs)
  • 5min: Next Steps & Reflection – Over the next month, take the next steps you’ve identified to continue striving towards your improvement aim (All Together)

Check out the modified (for public consumption) and more granular version of this agenda here, and you should feel free to make a copy of the document and make it your own.

What Actually Happened

This meeting happened a week after the 1st to 2nd semester transition. Data from improvement efforts since our last meeting was…limited. I was prepared for this possibility, and we simply pivoted. Instead of break-outs to try to analyze everyone’s first sets of data, we dug more deeply into one PLC member’s data. This resulted in a few wins:

  • We were able to more deliberately practice a new protocol together – NSRF’s ATLAS Looking at Data – that will likely animate how we go about making sense of data in meetings to come.
  • The PLC member who had some data to share came out of the analysis experience with what I see as a pretty valuable insight: his improvement aim was actually not yet an aim! It is a likely means to an end, but not an end in and of itself. (His aim was to increase the level of student-initiated communication.) He realized he needs to do 1 of 2 things: Either get clear about what he predicts will happen if student-initiated communication increases OR reconnect with what he really wants to improve about student learning and evaluate whether his current effort is what is most likely to get him there.
  • I believe this approach provided others in the PLC a valuable model of what to expect moving forward. It gave them permission, this time around, to focus on being a good critical friend to their PLC teammate while also helping them to have a mental model of how to prepare to share next time.

Next Steps

Some of the PLC members expressed in their post-meeting reflections that they need to prepare in advance to put their data together in a shareable way. We also discussed the idea of zeroing in on 1 or 2 people to bring their data for us to analyze as a group, rather than trying to give everyone “the floor” in our precious monthly hour.

In response, I will be reaching out to the group a week before our next meeting to see who’s feeling like it’s a relevant and opportune moment to unpack their data. For those who say they’re ready to share, I’ll offer my support if they’d like help packaging their data for group analysis.

This all has me reflecting on the concept of “rapid testing.” Are we granular enough in our improvement aims to truly consider it improvement science? Should everyone realistically be showing up with data at each meeting? I go back and forth on this. For now, I think there is great potential for this effort to influence the improvement mindsets of the individuals in this group and to potentially influence learning structures for our organization moving forward.

“Deliberately learning our way to better outcomes is, in fact, how organizations improve quality and how interventions scale effectively.” – Bryk

Planning an Improvement Cycle

This is part 4 of the “True PLC” blog series designed to articulate the specifics of and to reflect upon an 8-month virtual Professional Learning Community (PLC).

The first PLC meeting was largely about getting connected to one another and to the purpose for the group’s collaboration. The second meeting was about getting specific about the problems we want to solve about student learning. The third meeting, which I will highlight in this piece, focused on planning first improvement cycles in service of our specific improvement aims. This is really where the rubber hits the road!

[Image: Where we are in the big picture of this PLC effort.]

Meeting Objectives

The objectives for this 3rd meeting were to:

  • Connect with one another
  • Glean insights from an example of an improvement process to inform our own
  • Support one another to ensure change ideas are aligned to key “whys” behind outcomes you want to change
  • Draft the plan for your first improvement cycle, using the Plan-Do-Study-Act cycle as a guide

Here’s how we went about it…

Meeting Structure

  • 10min: Get Connected – Round Robin Share-Out (All Together)
  • 15min: Text Analysis – A Case Study of an Improvement Process (On Your Own & Share Key Insights)
  • 5min: Capturing Your Current State – the key “whys” producing the outcomes you want to change and the change ideas you think will result in desired improvements (On Your Own)
  • 20min: Collaborative Planning – Ensure each group member leaves having zeroed in on their first specific change idea and the data they’ll use to measure whether their change leads to an improvement (Small Group Break-Outs)
  • 5min: Next Steps – Over the next month, carry out your change idea and collect relevant data. Be ready to share at the next meeting. (All Together)
  • 5min: Reflection – (On Your Own)

Check out the modified (for public consumption) and more granular version of this agenda here, and you should feel free to make a copy of the document and make it your own. You might find it interesting to review the improvement aims this PLC has identified in the Improvement Process Planning Table (Table prompts heavily informed by Bryk’s Learning to Improve).

Reflections

We’re not necessarily standards-driven in our approach. Over the weekend, my supervisor shared Solution Tree’s Global PD effort with me. It’s the DuFour team’s effort to offer a virtual support system for teams enacting PLCs. The online tool looks to be heavily standards-driven, with an emphasis on learning targets and common assessments. I think there is great potential in this, and at the same time it bumps up against some of my beliefs about assessment and data.

I decided to create an environment where my participants selected the thing about their students’ learning that they’d like to change most (e.g., deepen contextual responses to peer feedback on writing). They are determining the most relevant data to create, collect, and analyze (e.g., evidence of peer feedback) in order to understand the impact their efforts have on that desired change. If an aim ends up being tied to a content standard, and if a common assessment is the data generation approach that feels relevant, great! If not, and the approach still leads to measurable improvements in student outcomes, I think this effort is just as valid. I would be VERY curious to hear thoughts on this, and I look forward to reflecting upon this with my PLC at the end of our effort too – Does an improvement effort have to be standards-driven?

Next Steps

So far we’ve been planning and preparing. Now it’s time to DO. I’ve asked my participants to enact their first prioritized change idea and to gather any relevant data that emerges between now and our next monthly meeting, even if minimal. At the next meeting, I’ll create the conditions for participants to share that data and to make sense of it in a way that helps them to make informed decisions about what to do next to continually strive towards their improvement aim.

In their reflections, participants continue to appreciate the small-group breakout time, the support they receive in this setting, and the space it provides to share plans and discuss ideas. To honor this, I think for the next meeting I will quickly model a protocol for data analysis and then send folks off into small groups to replicate it on their own, using the data they’ve brought with them.

[Image: Kurt Lewin quote]

 

Determining PLC Improvement…Aims?

This is part three of the “True PLC” blog series designed to articulate the specifics of and to reflect upon an 8-month online Professional Learning Community (PLC).

While the first meeting was largely about getting connected to one another and to the purpose for the group’s collaboration, the second meeting was about getting specific about the problems we want to solve about student learning.

[Image: “We get better” quote]

My design work continues to be informed by friends and former colleagues (see quote above), by the book Learning to Improve: How America’s Schools Can Get Better at Getting Better by Anthony S. Bryk, Louis M. Gomez, Alicia Grunow, and Paul G. LeMahieu, and more broadly by the work of the Carnegie Foundation for the Advancement of Teaching.

Meeting Objectives

The plan for this 2nd meeting was to:

  • Expand upon personal connections
  • Help one another refine respective aims for student outcomes improvement
  • Practice the act of understanding the system or key drivers producing the outcomes you want to change
  • Build clarity (and share resources) about next steps for identifying change ideas to enact improvement processes
  • Reflect on our progress to inform next steps for our work together

This was a lofty set of objectives! Here’s how we went about it…

Meeting Structure

  • 5min: Objectives & Norms for the Day (All Together)
  • 5min: Articulate the Problem You Want to Solve re: Student Learning (On Own)
  • 5min: Quote Reflection, while the facilitator creates breakout groups (On Own)
  • 20min: Improvement Aim Tuning (Small Group Breakouts)
  • 15min: See the System Producing the Current Outcomes – Simulation & Practice (All Together & On Own)
  • 5min: Next Steps (All Together)
  • 5min: Reflection (On Own)

Lots of bite-sized pieces in this one! Check out the modified (for public consumption) and more granular version of this agenda here, and you should feel free to make a copy of the document and make it your own.

Progress & Improvement…Aims?

One participant aims to close gaps in student performance across multiple sections of the same course. Another seeks to identify support structures showing the most promising impact on student writing in STEM courses. Others want to increase the number of students meeting critical course deadlines. And that’s just a taste.

You’ll notice there are multiple improvement aims. This was an intentional choice as a facilitator. Some PLCs might choose to focus on one common aim. I decided to go the route of each individual identifying the improvement aim they find most personally relevant and for this PLC to serve as a collaborative holding space (see Drago-Severson) to support these individualized endeavors. I’ll let you know how I feel about this move in June…for now, it feels right.

Reflections

[Image: “We learn by doing, if we…” quote]

I feel good about the flow of this agenda. If I had to do anything over again, I would keep break-out groups to no more than 3 people to ensure a healthy amount of time for each group member to receive tuning feedback within our 1-hour time constraint.

The norms approach was positively received. Informed by PLC-member reflections at the end of the first meeting, I introduced norms for us to “try on” for the day and to reflect upon at the end. Over my years as an educator and school development coach, it’s taken time for me to find ways to leverage norms that feel natural and useful, rather than forced and punitive. This approach felt like it honored the members of the group, and it garnered positive reflections at the end of the meeting.

I am wrestling a bit with how to build meaningful personal connections in a virtual space. I didn’t know these virtual PLC participants before we started meeting. Some of them only knew each other by association. I worry about the potential for “playing nice” to get in the way of collective capacity building (see Fullan), or for miscommunications to occur because we have not yet built relational trust or the conditions for deep, personal connection. So far, I do think we’re on a trajectory that exudes disciplined, collegial inquiry (in fact, this group is pretty darn stellar), but that foundational connection and mutual understanding can be elusive. I will continue testing ways to tend to the affective in an online space.

Next Steps

Participants left this meeting having received feedback on their improvement aims to ensure they are reasonable in scope and measurable, so they can tell whether the changes they put in place lead to the desired improvements. They also observed me coach one group member through unpacking the systemic whys likely producing the outcomes she wants to change.

These experiences were designed to prepare participants to:

  • Engage in their own system analysis
  • Identify promising change ideas within that system that they believe will lead to the improvements in student outcomes they want to see.

Understanding the system that surrounds a challenge is no simple task, and I’m hopeful that the resources I provided to support participants’ next steps will be useful. I’ve asked them to show up to our 3rd meeting with concrete change ideas they intend to try. It feels important to prompt this crew to make sure they’ve zeroed in on baseline (pre-change) data before enacting their change ideas, so they can really gauge whether their changes lead to improvements.

Stay tuned!

[Image: Palmer quote]

Kicking Off a Virtual PLC

This is part two of a blog series designed to reflect upon and articulate the specifics of an 8-month online Professional Learning Community (PLC) effort. My goal by the end of this series is to offer a concrete, practical example of what it takes to make a true PLC happen. (It turns out this can be hard to come by!)

We wrapped our first of 8 scheduled virtual PLC meetings last week, and here’s how it went down…

Who showed up?

  • Eight of the 9 online instructors who voluntarily signed up, one of whom is serving as my thought partner to continually advance the design and facilitation of the process across the year. (The absent participant had to take on bus duty after school.)
  • My 2 researcher colleagues, who will serve as resident experts and additional support across the year

Meeting Objectives

The plan was for us to…

  • Begin to connect with one another
  • Set the stage for the purpose for the group’s collaboration
  • Build clarity (and capacity) about next steps for either identifying a singular group aim or individual research questions
  • Identify some norms likely to support the group in realizing its purpose

You’ll notice that the meeting’s objectives are pretty affective in nature…

[Image: Heron quote]

When I began facilitating adult learning experiences over six years ago, I lacked awareness of how essential the affective elements of a learning environment are. In fact, I may have even chalked them up as too “touchy feely” for me. Six years later, I know that adults have to feel connected to one another and valued in order to engage deeply and vulnerably as professionals and to open themselves up to the critique necessary to improve. I thank my friends and former colleagues from New Tech Network for helping me internalize and experience this for myself, and I’m grateful for the protocols and resources from the National School Reform Faculty and the National Equity Project that have continually advanced my capacity to craft holistic adult learning experiences.

Meeting Structure

The first meeting flowed like this:

  • 15min: Connecting & Framing the Year (All Together)
  • 10min: Hopes & Fears (On Your Own)
  • 20min: Text Protocol (Small Group Breakouts)
  • 10min: Wrap Up & Next Steps (All Together)
  • 5min: Reflection (On Your Own)

You can see a modified (for public consumption) and more granular version of the first meeting’s agenda here, and you should feel free to make a copy of the document and make it your own.

Hopefully you’re taking note of a few things here:

  • Meeting “Medium” Variety: Participants interacted in a variety of ways, from all together, to on their own, to breakouts, to all together, to on their own again. Shifting mediums deepens the ability to stay engaged, maximizes chances of honoring the various ways in which people think/process best, and implicitly expects active participation.
  • Meeting Flow: We connected with one another, then we had a moment for personal open-ended responses, then we unpacked a complex text and considered concrete implications for our work. This flow was intentional and inspired by the National Equity Project’s Experiential Learning Cycle.
  • Meeting Resource Accessibility: I leveraged Google Docs to help execute a smooth, online, synchronous meeting flow. Permissions were set for anyone to be able to edit, and reflection spaces and resources applicable to the meeting were embedded directly into the agenda document for coherence and simplicity.
  • Monthly Meeting Structure: There is a background “skeleton” to the Google Doc agenda structure.  Monthly online meetings will operate through this meeting template. My intention is for this technical, logistical solution to clear the path for focusing on our collaborative work and learning rather than managing accessibility to the meeting itself. (Thanks to my friend, Anna Kinsella, for helping me to refine this structural design!)
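
For those curious about the nuts and bolts of the “skeleton” approach in the last bullet, here is a minimal, hypothetical sketch of how one could script that setup with the Google Drive API: copy a template agenda document and open it so that anyone with the link can edit, mirroring the open-edit permissions described above. The document ID, file name, and credentials path are placeholders, and this is illustrative only; the agendas in this series were set up by hand in Google Docs.

```python
# Hypothetical sketch (not how these agendas were actually produced):
# copy a "skeleton" agenda template and open it to anyone-with-the-link editing
# using the Google Drive API v3. IDs, names, and the key file are placeholders.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/drive"]
TEMPLATE_DOC_ID = "YOUR_TEMPLATE_DOC_ID"  # placeholder: the skeleton agenda doc

creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES  # placeholder credentials file
)
drive = build("drive", "v3", credentials=creds)

# 1) Copy the skeleton template into this month's agenda document.
copy = drive.files().copy(
    fileId=TEMPLATE_DOC_ID,
    body={"name": "PLC Agenda (Monthly Copy)"},  # placeholder name
).execute()

# 2) Set "anyone with the link can edit" on the new copy.
drive.permissions().create(
    fileId=copy["id"],
    body={"type": "anyone", "role": "writer"},
).execute()

print(f"New agenda: https://docs.google.com/document/d/{copy['id']}/edit")
```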

[Image: quote about intention]

Reflections & Next Steps

In closing reflections, participants suggested they liked the structure and are looking forward to connecting with colleagues they might not have connected with otherwise. They expressed some nervousness about being able to contribute deeply to the group, and there was a minor technical issue when trying to transition between virtual meeting spaces.

I look forward to honoring this feedback from participants as well as to keeping this PLC moving forward towards its overarching objectives. I, along with my instructor and research thought partners, intend to provide optional asynchronous supports to participants to set them up for the greatest success on their concrete next step, which is to zero in on the student outcome issue they want to tackle.

This next meeting feels especially crucial. In fact, it feels like it might be where many PLCs go to die if not handled with care, targeted scaffolding, and intention, as well as a healthy bias towards action. Bring it on! Stay tuned for further ruminations and concrete breakdowns of how the next iteration of our work together unfolds.

[Image: Plan-Do-Study-Act cycle, used to animate the “core improvement questions” that drive the improvement science work in Learning to Improve by Bryk, Gomez, Grunow, and LeMahieu (2015).]