
Thursday, December 21, 2017

One Year in AP: Instructional Planning (Week Eighteen)

By Mark Heintz

Context

I have two main focuses as I write this weekly blog, two driving questions that I keep in mind while making decisions. They are:
  • How do I know if my students know? 
  • How do I get them to know if they know?  
Whether that is a skill or content, I want to know if they know it.   I no longer think it is acceptable for me to guess or get a feeling on whether or not they know it. Getting the students to know if they know it is downright hard, but I am really attempting to get to a point where the students can recognize their understandings or progress on their skill levels and content knowledge.  Therefore, the purpose of this year of reflection is to see how I make progress towards these two goals and elicit feedback from staff, students, and hopefully people who follow along on the journey.  You can read how last week went here.

Week Eighteen: Answer the Question

This week the content focus was primarily on how emperors legitimized their rule during the 1450-1750 time period.

Here is the content standard for this week:
  1. List two examples of how each of the following empires legitimized its rule: Ming, Qing, Ottoman, Mughal, and Aztec.
This week's skill focus was aimed at improving students' ability on the stimulus-based multiple-choice questions and the document-based question (DBQ). In all honesty, the skills were the main focus; this week was intended to be a practice run for the semester final exam.

 Here are the skill standards for this week:
  1. Analyze primary and secondary sources.
  2. Analyze images.
  3. Write a thesis in response to the claim.
  4. Pull evidence from a document to support a claim. 
  5. Contextualize the prompt.
Cite Specific Evidence

I recently purchased a new planner to map out my instructional planning.  Here is a small window into what is involved in planning a single week of instruction.


One of the basics of my planning is ensuring that I actually attend to the objectives I set out to address.  I color code each objective so I can visually see what I am spending time on.  I also try to build in time each week for students to reflect on their learning; reflective moments are crudely drawn out as brains.  Each week, I also attempt to get the students up and moving around as much as possible.  Those moments are noted by the blue stick figures.  This week was not a great week for movement because of the formal DBQ practice.

How do I know the students learned?

The students worked through two full DBQs this week.  It took about two days to complete each one.  As I stated previously, I wanted the students to get as much feedback as possible in the hope that they could make a few adjustments to their writing before going into the final.  With that in mind, on two of the days I had the students write each component on whiteboard tables.  Then I shared student samples with the class via AirPlay.  Here are some of the samples.


The two samples above represent failed attempts to put a topic sentence or claim at the beginning of the first body paragraph.  I AirPlayed these two samples side by side to show this to the class.


The above sample used clear evidence from the documents to support the claim.  The sample attempts to relate the documents to one another, and the student begins to relate the evidence back to the prompt.


The sample above addresses the prompt and clearly tells the reader what the paragraph will be about in the first sentence.  The student cited specific evidence from the document that supports their claim and then started to explain how that evidence would help the emperor rule.  Also, there is a great drawing!

How do I know the students learned and how do I know if they know what they were supposed to learn?

There were countless moments for the students to give feedback to each other and to themselves. Even more impactful was the instructional practice from the previous two weeks. To ensure that I know that all students know, I posted a discussion question in Schoology, and every student posted a writing sample in response.

Then I turned a few of their responses into a quiz.  The students selected all of the answer choices that represented proficiencies the sample displayed; if the sample did not display proficiency on one of the items, the student did not mark it.  In the sample below, the student did not correctly identify that the sample laid a roadmap for the essay, nor did they recognize that the sample uses clear evidence from document four.  However, the student correctly identified the rest of the items as proficient or not proficient.


Explain the Reasoning 

How do I know the students have learned?

Students spend so much time on the writing skills, and I feel like it is starting to make sense to them. It is not perfect, and writing is such a process that students need an immense amount of time to grow even a little bit.  In the sample below, there are a few errors, just like in the ones above.  However, the growth is amazing! The students attempt to explain their thought process through writing.  Doing that correctly is incredibly challenging, but the students are making progress on it.  They are infusing their writing with academic vocabulary that showcases that they know history.



How do I know the students learned and how do I know if they know what they were supposed to learn?

Even though most instructional minutes are spent writing, it does not always lead to 100% proficiency. The students still cannot always recognize and evaluate their work correctly.  They have questions about whether or not their work is quality writing or correctly relates evidence back to their claim.


Call to Action

I need to continue to work on effective methods to ensure that students know what quality or substandard writing looks like.  I struggle with that question myself when reading student work.  Writing is somewhat subjective, though I know that the more I read, the more I can identify.  My call to action is to keep making the evaluation of student work a priority in order to push students' understanding of writing and their own ability to write.

Read week nineteen here.

Wednesday, December 20, 2017

The Whole is Greater Than the Sum Of Its Parts: Teaching Interconnectedness

by Quinn Loch

It is now my third year teaching AP Environmental Science (APES), and I am enjoying teaching it more and more each year. Even though it is labeled as a "science" course, it is very interdisciplinary, and the issues and challenges discussed in the course can be approached from many different perspectives - scientific, social, economic, political, legal, engineering, etc.

As I have taught the course, however, I felt as though each unit was being treated as its own separate entity and the interconnectedness between the concepts we were learning was being lost. For example, the unit on agriculture was seen as just concepts related to agriculture and not in terms of how agriculture relates to previous concepts like water use, water pollution, energy, ecological footprint, biodiversity, populations, etc.

The two consistent issues I was seeing were:

1. Students struggle to connect new concepts to previous ones, leaving them to miss the interconnectedness within APES that provides the "whole picture".

2. Students have a hard time retaining the large quantity of information throughout the year that comes in the form of vocabulary, concepts, and real world examples.

At a recent EGLLT meeting, Kim Miklusak shared part of her 6-step process for designing curriculum, and it has helped me zero in on how to approach the major task/question I was trying to answer: How can I establish routines that help students approach the content more holistically?

A curricular planning tool Kim shared to help connect a question/task to a desired goal and learning outcome(s).
The objective/goal in my class has become daily warm-ups that include short readings or articles about current issues in environmental science. One learning outcome will be facilitating students' ability to make connections between concepts in the warm-up and current or past concepts from the class. One strategy to accomplish this has been to provide students with a list of key terms or concepts that they have to use when writing a short summary in small groups. This has helped me spiral back to older content without making it seem tangential.

Students make connections between seemingly unconnected concepts: evolution, air pollution, and genetic diversity.
The other learning outcome I hope to accomplish with these daily warm-ups is having students examine an issue from multiple perspectives. For example, students recently read an article related to trophy hunting as a means of conservation. Students then discussed the social, political, and environmental perspectives in small groups, which then led to a whole-class discussion on the ethics of killing something to try and preserve it.

Students wrestle with the implications of hunting ranches.
Next steps include seeking out additional reading and writing strategies that are engaging and lend themselves well to revisiting old topics and concepts. The hope is that students create more of a web of information throughout the year rather than independent subsections of environmental science.

Tuesday, December 19, 2017

Gratitude and Giving


The final Tuesday of Semester 1 was a day of Gratitude and Giving in the Collab Lab.

Giving

We kicked off the day with a Teaming on Tuesday book swap. After sharing a favorite book, with an explanation of why and how it had inspired them, staff were invited to exchange books with their colleagues and leave with a new read to enjoy during Winter Break.

The sampling of books in the photo below gives an idea of the interesting mix that was shared, including both non-fiction books related to education and professional learning, as well as some favorite works of fiction.




Mark Heintz left with a new read shared by Kristen Lesniak, and it even came with a hand-made bookmark!




Gratitude

On the final Tuesday of Semester 1, we also released Episode 7 of our We Are EG Podcast. We decided to do a Gratitude Podcast, inviting any interested staff and students to share something they feel thankful for. Producing this episode turned out to be more fun and inspiring than we imagined. We think you'll be inspired too. Take a listen!


Monday, December 18, 2017

6-Step Process to Designing Curriculum: Educational Strategies (Part 4)


By Kim Miklusak

I am currently taking a Foundations of Curriculum and Instruction course at UIC.  Our textbook, while a medical curriculum textbook, reminds us that curriculum design crosses education fields and that what we are doing in our classes every year has its grounding in research.  Kern, Thomas, and Hughes in their book provide a 6-step approach to curriculum development.  My goal is to share the theory behind our current practices to serve as a guide as we design and redesign our courses.  Earlier steps can be found here.


Step 4: Educational Strategies

In the previous posts, we determined our targeted needs assessment and stated our goals and objectives; now we need to select our content.  For some subjects, the content and objectives overlap.  For others, the content may be more arbitrary.  Regardless of what content is decided upon, it must always build from the objectives and goals.  As stated in Kern, "transformative learning occurs when learners change in meaningful ways" (67).

The final step in this part of the process is selecting the best educational methods for your content.  The selection must match the goals, yet at the same time be varied to meet the needs of the students, environment, and content.  Additionally, it must adjust to students' learning styles and preferences.  For higher-level, complex skills, balancing a few methods over the course of a unit or a year often works best.




In practice:

I find this element of curricular design the most interesting as a teacher.  It is true: we all have the methods that we feel most comfortable with day-to-day, and we all have the ones we feel most comfortable using in our content.  However, as Kern's chart shows, not all methods are best depending on our goals and objectives--or our students. 

Are students experiencing trouble reaching their goals?  Are we having classroom management issues?  Perhaps then there are times in our curriculum when we truly want to practice affective/attitudinal goals and could select the appropriate methods to do so.

However, this often requires us to step outside of our comfort zones--especially when coupled with new technology in the classroom or releasing control of our lessons to our students.  One of the ways I have helped work around this in my own teaching is by observing teachers across content areas.  I have learned so much by working with our Collab Lab team as well as learning with and from our peers in learning groups and lesson demos.

Please feel free to share other insights and ideas based on your experiences in the comments below!

Thursday, December 14, 2017

One Year in AP: All Learners Write (Week Seventeen)

By Mark Heintz

Context

I have two main focuses as I write this weekly blog, two driving questions that I keep in mind while making decisions. They are:
  • How do I know if my students know? 
  • How do I get them to know if they know?  
Whether that is a skill or content, I want to know if they know it.   I no longer think it is acceptable for me to guess or get a feeling on whether or not they know it. Getting the students to know if they know it is downright hard, but I am really attempting to get to a point where the students can recognize their understandings or progress on their skill levels and content knowledge.  Therefore, the purpose of this year of reflection is to see how I make progress towards these two goals and elicit feedback from staff, students, and hopefully people who follow along on the journey.  You can read how last week went here.

Week Seventeen: Answer the Question

This week the content focus was primarily on the Columbian Exchange and how emperors legitimized their rule during the 1450-1750 time period.  Here were the standards for this unit:
  1. List five results of the Columbian Exchange.  
  2. List two examples of how each of the following empires legitimized its rule: Ming, Qing, Ottoman, Mughal, and Aztec.
This week's skill focus was aimed at improving students' ability on the stimulus-based multiple-choice questions and the document-based question.
  1. Analyze primary and secondary sources.
  2. Analyze images.
  3. Write a thesis in response to the claim.
  4. Pull evidence from a document to support a claim. 
  5. Contextualize the prompt.
Cite Specific Evidence

How do I know the students learned?

The students filled out a chart each day with their understandings of the content.  Having them fill out the chart helped me see their comprehension of the content: I walked around and saw what each student put in the chart, which was an easy way to see if they had learned.  It was a simple chart, just a place to house what they learned.  The picture below is a sample of the chart.



The main way the students showcased their understanding of the content and the skills was their writing.  I have stated numerous times in previous blog posts that they write all the time, and they still write all the time.  This week, the students wrote multiple times to prove their abilities in each of the five focus skills. The purpose of some writing prompts was simply to review and refresh skills, while others served to develop new understandings of the skills.  In other words, students were writing to learn. While the students showcased their ability on each of the skills, they also displayed their comprehension of the content.  That is another reason I love having the students write daily.


How do I know the students learned and how do I know if they know what they were supposed to learn?

Last week, I used a Schoology quiz to have students evaluate different levels of the contextualization skill.  This week, I used the same process but with different skills.  To do this, I followed similar steps as last week.  Each student wrote a sample body paragraph of an essay.  I chose several of those student samples, representing various levels of proficiency, and for each sample I turned the items of the rubric into the answer choices of a multiple-choice quiz.  When the students took the quiz, they chose which criteria best matched the proficiency of the sample. In the sample below, the students had nine criteria to mark as either proficient or not.  In this case, the student incorrectly identified two criteria as mastered and failed to identify one proficiency.



Explain the Reasoning 

How do I know the students have learned?

I continue to spend a great deal of class time on writing. I feel that through writing the students show me what they know. Furthermore, writing allows them to use the content in a meaningful way and is far superior to a discrete-point assessment for demonstrating their understanding.  Writing shows their true comprehension of the content.  I cannot stress this enough.  Writing displays that students have mastered the content and can use it or manipulate it in a meaningful way.  It also shows what they do not understand; they cannot fake their way through writing when they fail to fully grasp a concept.


How do I know the students learned and how do I know if they know what they were supposed to learn?

As for the content chart, I wish I had stayed more on top of it for their sake.  I did not go back to it every day.  If I had been more intentional about referencing the chart daily, students would have been more reflective about their own learning.



As for the Schoology quiz, I absolutely love it.  The students were questioning me as to why certain parts of the writing were proficient.  They were analyzing each part, and that is so crucial.  Each skill received special attention; each skill was evaluated to determine whether or not each student knew it.  The quiz allowed the students to see what their understandings were, and it showed me what they knew and needed help on.  Because all students took the quiz, I was able to quickly see everyone's understandings.  It is such a great learning tool.

Reflection and Impact

I need to be better at having the students quickly assess their understandings of the content in a meaningful way, and I need to do this on a daily and weekly basis.  While I know that the students are doing well with the content, they need to see that more frequently in class rather than only on the weekly online quizzes.

Read week eighteen here.

Tuesday, December 12, 2017

6-Step Process to Designing Curriculum (Part 3)


From Kern, Thomas, and Hughes. See link above.
I am currently taking a Foundations of Curriculum and Instruction course at UIC.  Our textbook, while a medical curriculum textbook, reminds us that curriculum design crosses education fields and that what we are doing in our classes every year has its grounding in research.  Kern, Thomas, and Hughes in their book provide a 6-step approach to curriculum development.  My goal is to share the theory behind our current practices to serve as a guide as we design and redesign our courses.  Steps 1 and 2 can be found here.

Step 3: Goals and Objectives


Once we have identified the needs of our learners, we need to clarify our goals and objectives.  While there may be some differences in how these terms are used in our various institutions, on the broader level, goals are the overall purposeful outcomes while objectives are measurable elements.  

When we state our goals, they help us to define and clarify our content, priorities, learning methods, and evaluation/assessment outcomes.  These are not only important for ourselves as we re/design our courses, but they are also important to communicate clearly to all stakeholders--peers, parents, students, administrators, etc.


According to Kern et al., there are five elements to keep in mind when writing a clear and measurable objective: who will do how much (or how well) of what by when?  A key here is to use descriptors that are less open to interpretation.  For example, how can we measure the verb "know" as opposed to "identify, list, recite, define, etc."?


In practice:


For this step I wonder a few things, but many depend on the situations at our individual institutions.  For example, do we know the overall purpose of our course?  If there is not a clear "out" such as an AP exam or placement test, why do we cover the content and skills that we do?  Do we consider the general and targeted needs assessments as described in the previous blog posts?

Furthermore, does our course align to and build upon the goals of the courses before/after it in the sequence?  Does it need to?  If we are unable to articulate these goals and objectives, then we often end up duplicating assessments if not content and essential questions.  

Additionally, we may end up over-emphasizing a specific assessment-type when it doesn't really measure the outcome we are looking for.  For example, is the focus of my English class to read a novel or is the focus to practice skills via the novel?  If we are not clear in the focus of materials versus objectives, we may over-assess in some areas (for example plot of a text) when our main goal is something more sophisticated.  We also may misrepresent the total number and weight of questions that are not the focus of our stated objectives.  If we want students to practice more higher-level skills, more of our assessments should be weighted this way.

This further leads to a question of assessing Socio-Emotional skills and other subjective measures.  If we assess objectives like "paying attention," are those elements we instruct and model?  Are they truly the objectives we want to assess in our course?  If they are, we should be clear about how they are instructed and assessed.  If they are not, we should realign our focus to the objectives we do want to measure.

Once these goals and objectives are established, we are then able to move on to Step 4: Educational Strategies, which I will discuss in the next blog post.

Thursday, December 7, 2017

One Year in AP: Student Suggestions (Week Sixteen)

By Mark Heintz

Context

I have two main focuses as I write this weekly blog, two driving questions that I keep in mind while making decisions. They are:
  • How do I know if my students know? 
  • How do I get them to know if they know?  
Whether that is a skill or content, I want to know if they know it.  I no longer think it is acceptable for me to guess or get a feeling on whether or not they know it. Getting the students to know if they know it is downright hard, but I am really attempting to get to a point where the students can recognize their understandings or progress on their skill levels and content knowledge.  Therefore, the purpose of this year of reflection is to see how I make progress towards these two goals and elicit feedback from staff, students, and hopefully people who follow along on the journey.  You can read how last week went here.

Week Sixteen: Answer the Question

This week the content focus was primarily on the Columbian Exchange and the global flow of silver during the 1450-1750 time period.  Here were the standards for this unit:
  1. List five results of the Columbian Exchange.  
  2. List three effects of the global flow of silver.
This week's skill focus was aimed at improving students' ability on the stimulus-based multiple-choice questions and the document-based question.
  1. Analyze primary and secondary sources.
  2. Analyze images.
  3. Write a claim in response to the prompt.
  4. Pull evidence from a document to support a claim. 
  5. Contextualize the prompt.

Cite Specific Evidence

How do I know the students have learned?

The students took two mini quizzes this week. They were more traditional in nature, in that they were on paper and "graded" by me, the teacher.  One was a short quiz on the content objectives above; the students averaged 80% on it.  The second quiz was a stimulus-based assessment; the students averaged 70% on it.  Here is a sample question from the stimulus-based exam.


How do I know the students learned and how do I know if they know what they were supposed to learn?

This question is still kicking my butt.  I try each week to know how they know.  This week I took a suggestion from a student about using Schoology to have each student evaluate writing samples. To do this, I first had each student write one part of the DBQ essay in a discussion post.  After they submitted their samples, I took several of the writing samples posted by the students and created a Schoology quiz.  In the quiz, each answer choice for a given sample was an item from the rubric.  The students selected each answer choice they felt the writing sample earned.  In the sample below, the student felt only four of the points were earned.



Explain the Reasoning 

How do I know the students have learned? They consistently perform well on the content tests.  The students requested more frequent content assessments to hold them accountable, so I gave them more assessments! Their performance was at the same level as on the last unit exam, when there were more objectives being tested.  I am not sure what to make of that data point.  Since they performed at about the same level, should I keep giving the more frequent assessments or hold out until the end of the unit?

What the more frequent assessments did reveal was that there were a few gaps in their content knowledge.  The students need instruction on the economic systems involved in the Columbian Exchange and the role certain empires played.

There was some good news to using more frequent content assessments: there were fewer outliers.  On the last content test, there were a few extreme cases of students "bombing" the test.  On this assessment, there were no such cases.  There was much more of a middle norm, which suggests that the more frequent assessments may help preserve the positive narrative of the course.

How do I know the students learned and how do I know if they know what they were supposed to learn? Using the Schoology quiz to assess student writing was amazing! First, it held all students accountable for submitting a written response and then evaluating specific responses.  In the past, when I walked around the room, the students typically wrote in pairs, or I could not get to every response.

The new use of the Schoology quiz enabled me to have all the students record their responses.  I wanted to know if they knew what excellent writing was. Second, the quiz revealed to me, the teacher, that they do in fact know what excellent contextualization writing is.  Almost every student was extremely close to accurately evaluating their peer's writing sample.   I cannot wait to use this method again!


As for the stimulus exam, the students scored at the same level as they did on the previous unit exam.  But this time the students spent a day going through the test with explanations of the answers.  I had written a detailed explanation of why each of the answer choices was right or wrong.  Having the students spend the day going through the test was fantastic.  Hearing the dialogue and conversations centered around the questions revealed that the students read most of the questions too quickly. Even though they averaged about the same, they are understanding what they don't know and why they are getting questions wrong. There is still a lot of work to do in this arena.


Reflection and Impact

As I stated last week, I need to get students' input more frequently.  They have given me such great ideas to improve student learning.  I used two of their suggestions this week, and I know they helped.  I am going to use the Schoology quiz more frequently to have the students evaluate student work.  It is so quick and simple to do, and it reveals whether students know what is expected of them.  It is such a powerful instructional tool.  I cannot wait to use it again!

Read week seventeen here.

Tuesday, December 5, 2017

The Lost Art of Field-Tripping

        
The best field trip I experienced was when I took my first group of students to see Much Ado About Nothing at Chicago Shakespeare Theater over twenty years ago.  The students, who had never seen live theater, were fascinated with the lighting, scenery, the physical comedy of the actors, and the entire sensory experience.  The play jumped off the page and turned into something entirely different in the theater space. 

AP students? Far from it. These were lower-level students, some of the weakest readers in the building.  Why did it work so well?



Create barriers, but not financial ones.

This is a privilege, a delightful treat, not a march. We built a barrier (study guide, quiz mastery, lunchtime discussion/preparation) that allowed students who were interested and intrigued to join us, but left behind those who were not yet ready.

We made a point to take students who came from disadvantaged backgrounds. Despite living thirty minutes from world-class museums, these students had never entered the doors of these institutions.

Anticipate where your students may struggle.
 
What will intimidate your students?  When will they feel uncertain or uncomfortable? Directly teach behaviors.  Before I took my lower level class to see Shakespeare, we talked about what to do if they found it boring, had to go to the bathroom, or got hungry. We discussed the difference between applause and yelling out.  Students worry about getting lost, what to wear, and when they will get to eat. Openly address those fears. 

Let students design the trip.

A few years ago, we were studying the novel Things Fall Apart, and a student wondered what some of the African cultural references actually looked like. A few weeks later, the students walked into the African Art wing at the Art Institute of Chicago and exploded with awe when they saw what was at the entrance – the towering dramatic costumes of the mysterious egwugwu.  Suddenly, the students understood the power of these intimidating demi-gods.

Prepare yourself and prepare your hosts.

We use the education departments at these institutions.  When my students went to the Art Institute in conjunction with our reading of Things Fall Apart, the volunteer docents took the time to read the novel themselves the week before our trip so they could be better prepared to help students find those cultural connections.