Teaching, Learning, and Assessment Across the Disciplines: ICE Stories

Sue Fostaty Young, Meagan Troop, Jenn Stephenson, Kip Pegley, John Johnston, Mavis Morton, Christa Bracci, Anne O’Riordan, Val Michaelson, Kanonhsyonne Janice Hill, Shayna Watson

Edited by Sue Fostaty Young and Meagan Troop

Cover designed by James Young

Cover photo by Philipp Trubchenko on Unsplash

Teaching, Learning, and Assessment Across the Disciplines: ICE Stories by Sue Fostaty Young, Meagan Troop, Jenn Stephenson, Kip Pegley, John Johnston, Mavis Morton, Christa Bracci, Anne O’Riordan, Val Michaelson, Kanonhsyonne Janice Hill, Shayna Watson is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, except where otherwise noted.

Dedication

This book is dedicated to Bob Wilson, Queen’s University Professor Emeritus, assessment maven and mentor extraordinaire.

Prologue/How to read this volume

This book is a testament to the pedagogical generosity of its authors. They want nothing more than to share their excitement at having found a grounding framework that shapes their understanding of learning and at being able to purposefully and intentionally support their students’ achievement. That common grounding framework is the ICE model (Ideas, Connections, and Extensions). For those unfamiliar with ICE, Chapter 1 provides a brief overview, though it will also serve as a quick refresher for those of you who have used the model before.

The remaining chapters of the volume can be read in any order and in relation to your own emerging interests. To help make choices about what to read, each chapter begins with a description of the instructional context being presented. While the discipline depicted in a chapter might be vastly different from your own, many authors provide cues as to the ways in which their practices might be applicable, adapted, and/or transferable to other contexts. We suggest you keep your own context in mind as you read, staying open to ways of finding commonly held values and intentions that might inform your own ongoing teaching and assessment practices.

As a prompt for their writing, authors were invited to imagine sharing their experiences with ICE over coffee with a colleague. The result is a collection of highly individualized stories in which each author’s voice and conversational tone have been deliberately retained. The chapters, however, follow a consistent format that should help readers more easily find the details of most interest to them.

Chapter 1 provides an overview of the ICE model, setting the context and establishing the conceptual framework that influenced the teaching and assessment practices shared in subsequent chapters, and explains the vocabulary adopted by ICE users.

Chapter 2 offers a rich portrait of the diverse applications of the ICE model in the context of learning, teaching, and educational development, illustrating a longitudinal view of the model’s influence and its resonance with Meagan Troop’s principles of practice.

Chapter 3 features Jenn Stephenson’s account of the ways that using the metaphor of a broken toaster to convey the essential underpinnings of ICE has helped the model’s philosophy truly become “a mindset” for her and her students.

In Chapter 4, Kip Pegley shares detailed descriptions of three distinct ICE-inspired learning activities he’s used in teaching undergraduate students in his course on popular music, and the ways those activities have enriched his students’ learning experiences.

In Chapter 5, John Johnston, a geoscientist instructor, and Meagan Troop, an instructional designer, examine their co-creation of an online course that applies the ICE model to elicit different modes of thinking at the level of individual activities and across the course as a whole, elevating the learning and teaching experience for instructor, TAs, and students alike.

In Chapter 6, Mavis Morton, who uses ICE extensively in both undergraduate and graduate courses, shares the multiple ways she has engaged students and enhanced their metacognitive skills and awareness through the intentional application of the ICE framework to learning outcomes, activities, and assessments.

In Chapter 7, Christa Bracci offers insights and reflections on her applications of ICE in an Advanced Legal Research course. She illustrates the ways the model offered curricular cohesiveness for students as they developed and enhanced research skills in authentic ways and increased their metacognitive awareness.

In Chapter 8, Anne O’Riordan, instructor of the Lived Experience of Disability course, details her creative integration of the ICE model to facilitate deep critical reflection, through journaling and dialogue, on the mentor-mentee relationships that formed in the course and in community.

In Chapter 9, Val Michaelson and Jan Hill openly and candidly share Val’s first experience of implementing the ICE model. They outline how an undergraduate research methods course was structured and what they learned about teaching, learning, and assessment along the way.

In Chapter 10, Shayna Watson, a family physician, presents a comparative interpretation of the evolution of medical education by juxtaposing a Flexnerian perspective with one informed by ICE.

In Chapter 11, the volume’s coda, Sue Fostaty Young outlines the multiple, embodied ways that her assessment-focused educational development approach has been influenced by ICE. She offers examples of the transformative effects of the ICE framework in facilitating enhanced communication and congruence in curriculum decision-making for both instructors and students.

Chapter 11. ICE as an Educational Development Tool

11.1 Instructional Context

Sue Fostaty Young – Queen’s University

I’ve worked as an educational developer for close to 25 years. The focus of my work has always been on finding ways to help post-secondary instructors improve their teaching for the express purpose of improving their students’ learning. Of course, as the literature tells us, meaningful and lasting changes in teaching practice aren’t likely to happen without some kind of change in, or development of, teachers’ conceptions of teaching and learning. That being the case, the goal of educational development is to help teachers develop increasingly sophisticated conceptions of teaching and learning while at the same time supporting the acquisition and development of the teaching skills they’ll need to enact those newly developed conceptions. So, in many ways, my practice has been focused on helping instructors think about their teaching in ways that are different from how they habitually do. That sounds easy enough except for the fact that very few post-secondary instructors come to their positions with any pedagogical background. That might mean they haven’t yet adopted an overarching conceptual framework or operational theory of learning to rely on to articulate their expectations for students’ learning or to reflect on their teaching. Without the ability to accurately name and frame their beliefs and values or, for that matter, name what it is they do and why they do it, it can be exceedingly difficult to work toward or plan for improvement.

Over the years, I’ve discovered that inviting conversations about their assessment choices enables instructors to express the sometimes tacit system of beliefs and values on which they base their teaching and assessment practices. It gives them a chance, sometimes for the first time, to inquire into the aspects of their practice that are purposeful and quite intentional and the others that might actually be surprisingly inconsistent with their stated intentions.

Conversations about assessment are, at their core, conversations about learning. By prompting instructors to shift their attention away from teaching (i.e., what they do) toward learning (i.e., what their students do), we can begin conversations about the ways instructors make decisions to evoke that learning for their students. In making their tacit practice explicit, they might then become more intentional and shift from merely adopting “best practices” to a more invested exploration of “best principles” for their own system of values. In this way, my educational development practice has become entirely learning-assessment-focused and entirely facilitated through ICE.

11.2 Discussion

40

It’s ironic that assessment now takes up such a significant part of my professional life. I spent my early academic career avoiding anything to do with assessment. My aversion to the topic stemmed from resentment of the ways that assessment had been done to me as a post-secondary student – testing recall of minutiae; poorly crafted multiple-choice questions with no opportunity to explain my thinking; one-size-fits-all projects that didn’t fit my interests – I wanted nothing to do with perpetuating that kind of practice. It wasn’t until I had the opportunity to work with Dr. Bob Wilson, the originator of the ICE model, that I began to appreciate the potentially transformative effects of learning assessment when it’s purposefully designed for students and their learning. My own transformative learning experience was so complete that now both my teaching and development practices are largely dedicated to improving teaching through improved understanding of assessment.

I use ICE as an educational development tool in that it frames everything I do. The model, comprehensive yet simple without being simplistic, is in fact a shorthand for an entirely complex conception of teaching and learning. It’s fully congruent with my own conception of learning as a non-hierarchical, non-linear, reiterative loop of developing expertise. What’s more, the model distills cognitive-transformative theories of learning into a highly accessible framework that seems to resonate with many instructors’ experiences of what learning looks like, no matter what discipline they work in. In many ways, ICE seems to be intuitive: many instructors insist that, yes, that’s exactly the way they’ve conceptualized learning all along but lacked the ability to articulate it. A bonus is that the vocabulary supplied by ICE provides a reliable, portable framework that helps instructors organize their thinking about teaching, learning, and assessment in ways that enable them to conceptualize and communicate their expectations and intentions more easily and with greater clarity.

Because Bloom’s Taxonomy is arguably the most well-known model of learning, even instructors without much pedagogical background seem to have at least a passing acquaintance with it. If that’s the case, we start there. Almost invariably, instructors report that, initially, they found Bloom’s to be very helpful but that, after a while, it didn’t seem to work for them. Probing for specific examples of how and why the model stopped being useful often results in reports that the six hierarchical levels were perceived as too finely drawn, or that the taxonomy was too unwieldy for students to benefit from. What’s more, the hierarchical nature of Bloom’s meant that instructors seemed to spend a lot of time at the lower end of the pyramid. After all, Bloom’s Taxonomy does presuppose that a learner must be proficient at one level of the pyramid before being able to succeed at the next. It’s at this stage that I often initiate a conversation about episodes of learning that seem to defy notions of learning as linear and domain-specific, and ask the instructor(s) whether they can identify instances from their own experience that illustrate either the linearity or the recursivity of learning, and which makes the most sense in their current context. It’s essential to me that I meet instructors where they are, both contextually and epistemologically. The process of development means that my job is to help each individual grow from their own place of readiness and at their own pace.

My entire educational development practice – workshops, consultations, resource development, and general discussions – is structured in ways that put the focus on students’ learning. Throughout are invitations to educators to articulate, as best they can, what their expectations for learning look like. Typically, I introduce ICE only after instructors have engaged with their own course and assessment practices and, perhaps, reviewed some examples of students’ work and discussed the ways those samples met or fell short of expectations. It’s then that I might invite instructors to use ICE as a lens through which to revisit those work samples, or to use ICE to describe their learning expectations. That simple exercise is an episode of supported practice in using the ICE model. Almost immediately, instructors experience a greater sense of clarity.

In sharing ICE with instructors, whether as part of a departmental, small-group, or individual consultation, I rely heavily on storytelling and on examples, from my own experience and from across the disciplines, that illustrate the conceptual points I’m trying to make. Those stories serve multiple functions. First, they illustrate and concretize the theories and conceptions. Second, they serve as a tacit invitation to those present to begin their own process of meaning-making. Last, storytelling is a way of modelling how to make Connections. That process, I believe, helps illustrate the ways in which ICE might become more relevant to instructors’ own contexts. Storytelling, and encouraging others to tell their own stories of assessment and conceptions of teaching, is also a way of acknowledging and validating the varying instructional contexts of others. It also serves as an effective way for me to ascertain the storyteller’s grasp of the concepts and of ICE itself.

Conversing about the ways ICE (or any other taxonomy of learning) is congruent with their own conceptions of learning is an essential component of the development process. That said, probing the ways that ICE is incongruent with their conceptions and instructional context invites a certain criticality that helps draw out tacit assumptions and beliefs about teaching and learning. When consistently encouraged to adopt such a critical stance, instructors are, in effect, being invited to explore the ontologies of their own conceptions, again helping to make the tacit explicit. I repeatedly tell instructors that I’m not trying to sell them on ICE; what I am trying to do is encourage them to find and adopt a conceptual framework of learning that resonates with their values and beliefs and that can reliably serve as a touchstone to inform their practice and, ideally, be shared with their students. The value of frameworks is that they are both rigorous and flexible: they provide structures and parameters that enable the naming and framing of practice and help us focus on different schemas for questioning assumptions and understanding learning, yet they should be flexible enough to be adopted and adapted to suit a wide range of contexts.

Another strategy I use is to invite instructors, with ICE as a reference point, to scrutinize the relative success of one of their final exams in assessing the learning for which it was intended. Typically, instructors report that the intention of the exam was to assess students’ ability to make Connections and Extensions. Also, typically, after question-analysis using ICE as a reference, many discover a heavier-than-intended reliance on Ideas-based questions. Conversations then ensue about the precision of language necessary to evoke intended learning, the value of tables of specification for exam construction, and of blueprinting assessments against learning outcomes—all of which results in awareness of the importance of intentionality in assessment design and instructional decision-making. In addition to that growing awareness, whenever possible, I try to embed activities that result in positive supported practice of new skills. Ensuring even a brief episode of practice means that conceptual development and skill development are occurring in tandem.
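
To make that last discovery concrete, here is a hypothetical, much-simplified table of specification for a 20-question exam blueprinted against ICE. The numbers are illustrative only; comparing the intended and actual columns is what typically reveals the heavier-than-intended reliance on Ideas-based questions.

Phase of learning     Intended questions     Actual questions
Ideas                 6                      13
Connections           8                      5
Extensions            6                      2
Total                 20                     20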

I use the term “guided alertness” to refer to the process I use to draw people’s attention to their intentions for students’ learning, whether those intentions are tacit or explicit. I have a penchant for prefacing almost all my answers to questions with “that depends”. I suppose it’s a way of drawing attention to the fact that context is everything when it comes to teaching and learning, and that what might be considered best practice in one context might not be in another. Additionally, if a best practice is incongruent with an instructor’s set of values, it’s highly unlikely to come across as “best”. “That depends” models my resistance to the notion of best practice and guides instructors’ alertness to the importance of intention and context. From that perspective, my Swedish colleagues have dubbed my approach to ICE-informed educational development “non-normative”. I prefer to think of it as an example of meeting people where they are, developmentally and contextually.

ICE also provides me with a framework through which to interact with, interpret, and answer instructors’ questions. Using ICE as a filter, I can interpret the language of a question to determine whether what is being asked for is a clarification of Ideas, a nudge toward Connections, or a sign that someone is close to a breakthrough Extension. The cues that ICE language provides enable me to be responsive to learning needs and to anticipate and create opportunities for learning or discussion. Using ICE in this way means that my practice is both informed and supported by the framework.

11.3 Impact

Instructors tell me time and again that learning about ICE has had transformative effects on their conceptions of teaching and learning, even for those whose conceptions were already comparatively sophisticated and complex. Because the framework provides such a reliable, accessible, and portable way of organizing their thinking about teaching, learning, and assessment, users gain a sense of clarity. That clarity in turn enables an intentionality in their teaching that many hadn’t experienced before. Even some who report having made no changes to their teaching practice or instructional decision-making say that ICE has enabled them to be more intentional in their teaching and that they can now explain why they do what they do. More than anything else, instructors report that the greatest impact of their introduction to ICE was gaining a reliable way of organizing their thinking about teaching and learning, which in turn enables a clarifying way of communicating with their students about learning and assessment. They often report gaining greater awareness of the effects of their assessment plans on students’ approaches to learning and of the critical importance of ensuring congruence among the elements of their courses’ curricula. The end result for many is a satisfying sense of increased confidence in their abilities to facilitate learning.

11.4 Conclusions and Caveats

Because what and how a teacher chooses to assess has such a profound effect on what and how students learn, I believe it’s essential to get assessment ‘right’ – to ensure that assessment practices reliably support intended learning so that valid interpretations of students’ learning achievement can be made. Time and again I’ve seen the positive impact on teaching development, and subsequently on learning achievement, when instructors and students better understand the structure of assessment, its purposes, and its power. It certainly isn’t necessary to use ICE to engage in assessment-focused educational development. What is necessary is ensuring alignment between one’s conceptions of teaching and learning and the model one adopts in practice. Intentionality is key.

Appendix

Chapter 1: Introduction to the ICE Model

Figure 1. The ICE Model

Three interlocking gears, each one’s motion influencing the others. Each gear represents one phase of the ICE model: Ideas, Connections, or Extensions. The Ideas phase of learning includes understanding the fundamentals, facts, discrete skills, or steps in a process. It includes vocabulary, definitions, information, and discrete concepts. The Connections phase of learning includes the ability to articulate relationships, relate new learning to what is already known, and combine two or more discrete skills. At the Extensions phase of learning, individuals extrapolate learning to novel situations, postulate or anticipate outcomes, and understand the implications of new learning. At this phase, individuals can hypothesize.

Figure 2. Terminology often used at the Ideas phase of learning

Speech bubbles: recite, name, label, memorize, repeat, calculate. Additional terms used include assemble, cite, compile, define, describe, duplicate, follow, identify, imitate, list, locate, mimic, operate, participate, recall, recognize, replicate, report, reproduce, state, tolerate, trace.

Figure 3. Terminology often used at the Connections phase of learning

Speech bubbles: adapt, infer, differentiate, reframe, compare, solve. Additional terms include adjust, apply, blend, calibrate, categorize, classify, code, collate, combine, compute, convert, coordinate, diagram, discriminate, distinguish, estimate, illustrate, integrate, match, modify, organize, paraphrase, rank, relate, translate, test.

Figure 4. Terminology often used at the Extensions phase of learning

Speech bubbles: analyze, rationalize, create, design, defend, predict. Additional terms include anticipate, appraise, compose, critique, evaluate, extrapolate, hypothesize, interpret, invent, judge, justify, propose, project.

Chapter 1 Image References:

Based on the ICE taxonomy described in:

Fostaty Young, S. & Wilson, R.J. (2000). Assessment and Learning: The ICE approach. Winnipeg, MB: Portage and Main Press.

Fostaty Young, S. (2005). Teaching, learning and assessment in higher education: Using ICE to improve student learning. Proceedings of the Improving Student Learning Symposium, London, UK, 13, 105-115.

Chapter 5: How to Think Like a Geoscientist: Using ICE to Support Critical and Creative Inquiry

Figure 1. Part A of the Rubric

Part A of the rubric used for the Study Site Assignment (SSA). The rubric comprises 4 columns. The first column lists the focus of this rubric: the description of the study site. Each subsequent column describes this task when considering Ideas, Connections, and Extensions, respectively.

Ideas:

Connections:

Extensions:

Figure 2. Part B of the Rubric

Part B of the rubric used for the Study Site Assignment (SSA). The rubric comprises 4 columns. The first column lists the focus of this rubric: Composition, Structure, and Processes. Each subsequent column describes this task when considering Ideas, Connections, and Extensions, respectively.

Ideas:

Connections:

Extensions:

Figure 3. Part C of the Rubric

Part C of the rubric used for the Study Site Assignment (SSA). The rubric comprises 4 columns. The first column lists the focus of this rubric: Age of the Material. Each subsequent column describes this task when considering Ideas, Connections, and Extensions, respectively.

Ideas:

Connections:

Extensions:

Chapter 6: Shine the Light: Using the ICE framework in Sociology Courses to see the “Big Picture”

Figure 1: Course Design Model and Constructive Alignment, adapted from Aligning learning outcomes, assessment, and teaching methods in Ellis, D. (2007). Teaching Excellence Academy workshop. University of Waterloo, Canada.

A triangle diagram. At the top, Intended Learning Outcomes. Lower left corner, Teaching and Learning Activities. Lower right corner, Formative and Summative Assessments. Double headed arrows depict Methods that connect each point with another. At the centre of the triangle it reads, Concepts (Content: Knowledge, Skills and Values).

Figure 2: ICE Rubric to Measure Communication and Critical Thinking Skills

A four-column rubric. The first column identifies the elements to be assessed in this assignment, which include communication skills and critical thinking skills. The following three columns are dedicated to describing these skills in relation to learning at the Ideas phase, the Connections phase, and the Extensions phase.

The instructor describes communication and critical thinking at the Ideas phase in the following ways. When a statement is preceded by the letter “C”, this indicates a measurement of communication skills. The letters “CT” indicate a measurement of critical thinking skills.

The instructor describes communication and critical thinking at the Connections phase in the following ways.

The instructor describes communication and critical thinking at the Extensions phase in the following ways.

Figure 3: A Sample of an Academic Reading Review Table

A six-column table with an empty row below for student input. The columns read as follows, left-to-right: Author/Citation; Purpose, Statement, and Research Questions; Background, Theory; Methodology, Methods; Results, Findings, Conclusion; Other e.g., Tensions, Debates, Limitations.

Figure 4: A Completed Sample of the Academic Reading Review Table

A six-column table. Each column has a heading as indicated in the Sample Academic Reading Review Table. Student information has been entered below the first five columns. This reads as follows:

Purpose, Statement, and Research Questions: The gendered organization of violence is part of a socially constructed set of values through which we recognize ourselves, and each other.

Background, Theory: Poststructuralism

Methodology, Methods: Literature review and media case analysis

Results, Findings, Conclusion: Violence is a set of ideas and strategies that get put into practice in society in contextual and value-specific ways, for example, in operationalizing gender. Our meanings about gender define and limit who and how we can be violent.

Figure 4 Citation

Author/Citation: Naugler, D. (2017). Making violence remarkable: Reconsiderations of everyday gender violences. In H.A. Kitchin Dahringer & J.J. Brittain (Eds.), Mapping Geographies of Violence (Chapter 2). Halifax: Fernwood Publishing.

Figure 5: Critical Media Assignment Description

A table with two headings and two columns outlines the assignment details and grade breakdown.

The first column lists the assignment details under the heading Critical Media and Topic Analysis, with a prompt for students to work in a group of 4 or 5 students:

The second column lists how students will earn their grade under the heading Grade Breakdown:

Figure 6: Example of an Extension in the ICE Model Exploring Messages Across Different Forms of Media

A slide with regular body text and, in the centre of the slide, text in a speech bubble. The first sentence on the slide reads, “Theorizes the relationship between gender and violence (there is one!).”

Then there are three examples of information derived from various sources of media. From a scholarly journal, “There is need to explicitly address the less than full overlap of the violence that is variously ‘domestic’, ‘gender-based’, and ‘against women’. This includes consideration of violence that is gendered but not domestic.” (Walby, Towers & Francis, 2014, 188).

From an episode of Full Frontal with Samantha Bee, “Over the past few months, we have all discovered who is behind workplace harassment and it’s literally thousands of men.”

From a book, “…role violence (sexual harassment) plays in the production of normative gender”
(Naugler, 2017, 29).

Chapter 8: Patient Mentorship in Occupational Therapy Education: The Influence of ICE on Student Learning

Figure 3: OT 825 Journal Review Assessment Rubric

A rubric assessing the journal entry by Peter Harris, 2013.

The rubric has 4 columns and 5 rows. The top row comprises 4 headers: Reflection Components, followed by Ideas, Connections, and Extensions.

The first column, Reflection Components, lists the following areas of assessment: Objective Level, Reactive Level, Interpretive Level, Decisional Level, and Written Journal.

The following ratings can be selected when assessing Ideas.

Objective Level—Ideas: Describes basic information of the situation/experience (e.g., visit with a mentor; tutorial discussion themes). Use of one sensory descriptor (e.g., describing in detail what the student observed in the setting in which the visit took place).

Reactive Level—Ideas: Identifies a feeling/emotion or reaction related to the experience/situation.

Interpretive Level—Ideas: Discusses the meaning and significance of the experience. Demonstrates understanding of the meaning of one’s own experiences.

Decisional Level—Ideas: Discusses future implications for personal awareness and interactions.

Written Journal—Ideas: Names of mentors and students, as well as identifying data, have been omitted to ensure confidentiality.

The following ratings can be selected when assessing Connections. The teacher made two selections from this area of the rubric when assessing the sample journal entry. Each has been identified.

Objective Level—Connections: Provides a thorough description of the situation, using at least two sensory descriptors. Inclusion of events outside of the immediate course content—e.g., campus accessibility, transportation system. Describes the context of the situation or experience.

Reactive Level—Connections: Describes previous memories or experiences that influence this reaction.

Interpretive Level—Connections: Teacher Selected Rating. Discusses the meaning and significance of the experience and relates this to previous experiences. Demonstrates understanding of the experiences of one’s mentor, student partner and colleagues in 825.

Decisional Level—Connections: Discusses future implications for personal interactions and professional practice.

Written Journal—Connections: Teacher Selected Rating. Takes needs of the reader into account in the presentation of the information (i.e., bolding, subtitles, spacing). Material is clearly written and presented with professional terminology where appropriate.

The following ratings can be selected when assessing Extensions. The teacher made three selections from this area of the rubric when assessing the sample journal entry. Each has been identified.

Objective Level—Extensions: Teacher Selected Rating. Describes the situation in detail, including multiple sensory descriptors. Situation/experience is described in relation to past experiences. Discusses both the personal and the macro-level environment (i.e., socio-political).

Reactive Level—Extensions: Teacher Selected Rating. Discusses personal reaction and relates this to the broader social environment.

Interpretive Level—Extensions: Discusses the meaning and significance of the experience in relation to the broader social environment. Demonstrates understanding of the complexity of issues at multiple levels.

Decisional Level—Extensions: Teacher Selected Rating. Discusses future implications for personal interactions, professional practice, and health care provision. Discusses implications at a policy and socio-political level.

Written Journal—Extensions: Vocabulary selected articulates ideas and understanding of the profession. Overall appearance and content demonstrate attention to detail and an effort to produce a document that is personally and professionally relevant.

Chapter 9: Using the ICE Framework in a 2nd Year Research Methods Class

Figure 1. Using the ICE framework in a Social Science Rubric

A 3-column rubric structured on the ICE framework, for use in a Social Science course.

The first column identifies the 3 components of the ICE framework—Ideas, Connections, and Extensions. The middle column identifies tasks that are connected to Ideas, Connections, and Extensions. The last column includes marks assigned to each task, for a total of 25 marks.

Ideas: 8 out of 25 marks

Connections: 6 out of 25 marks

Extensions: 3 out of 25 marks

The following two tasks are also evaluated: 4 out of 25 marks each

 

About the Authors

Sue Fostaty Young: Most recently Sue was the Director of the Centre for Teaching and Learning at Queen’s University in Kingston, Ontario where she oversaw a wide range of programming to support post-secondary instructors’ teaching development. Her program of research—learning-assessment-focused educational development—has earned her recognition across campus as The ICE Queen.

Meagan Troop has worked in the field of educational development for over a decade with experience in both college and university contexts. In her most recent role as the Manager of Educational Development at Sheridan, Meagan has collaborated with faculty, staff, and administrators to design and facilitate evidence-informed programs and initiatives that build teaching and learning capacity. Her research interests include creative pedagogies, transformative learning, educational leadership, and the scholarship of teaching and learning. 

Jenn Stephenson: Jenn is a Professor in the Dan School of Drama and Music at Queen’s University. Her current research (with co-investigator Mariah Horner) examines a recent trend toward audience participation in game-theatre hybrid performances, where audience members become active co-creators of the experience. As a teacher of theatre history and dramatic literature, she is infamous among her students for her love of toasters.

Kip Pegley is a Professor in the Dan School of Drama and Music, Queen’s University. He is the co-editor of Music, Politics and Violence (Wesleyan University Press, 2012), and, more recently, his work on sound and trauma has appeared in Singing Death: Reflections on Music and Mortality (Routledge, 2017), Music and War in the United States (Routledge, 2019), and MUSICultures (2019).

John Johnston: Currently, John Johnston is an instructor and teaching fellow in Earth and Environmental Sciences at the University of Waterloo, Ontario. As a passionate geoscience educator for more than two decades now, he has designed and taught more than twenty-five different courses at three universities and numerous professional workshops. John’s research focuses on coastal geoscience and geoscience education, most recently creating learning frameworks uniting students with extended reality experiences and three-dimensional geological models. 

Mavis Morton: Mavis is an Associate Professor in the Department of Sociology and Anthropology at the University of Guelph (UOG) in Guelph, Ontario, where she teaches in the Criminal Justice and Public Policy undergraduate program and the Social Practice and Transformational Change graduate program. Currently, Mavis is the Director of the First Year Seminar (FYS) program, where she works with instructors to design interdisciplinary courses for first-year students that fit the FYS program criteria. She is a member of numerous teaching and learning committees at the UOG. Mavis is a critical community-engaged scholar whose program of research includes collaborative research with community partners on violence against women and other social justice issues; it also often includes working with, and training, students as researchers on scholarship of teaching and learning (SoTL) projects.

Christa Bracci is an adjunct professor at Queen’s University Faculty of Law, where she teaches and develops courses in legal research, writing, and practice skills. Her research interests include skills pedagogy and curriculum design. She has worked and practiced in a range of environments, including government and both boutique and large national law firms. Christa is a member of the Bar of Ontario.

Anne O’Riordan: After a four-decade career as an occupational therapist and educator at Queen’s University, Anne traded academia for the front lines of healthcare. The patient mentors with whom she worked, along with her own journey as a patient and caregiver, compelled her to become a Patient Advisor at Kingston Health Sciences Centre.

Valerie Michaelson is an Assistant Professor in the Department of Health Sciences at Brock University. Her research focuses on health equity and the social dimensions of children’s health. Social justice and human rights are core to her work. She is passionate about teaching students the critical thinking skills they will need to engage productively with the equity issues they will face as health professionals.

Kanonhsyonne Jan Hill is the Associate Vice-Principal (Indigenous Initiatives and Reconciliation) at Queen’s University. She leads the Office of Indigenous Initiatives, providing strategic support and leadership to oversee the university-wide implementation of the recommendations from the Queen’s TRC Task Force Report. Jan is Mohawk, Turtle Clan, and a Clan Mother at Tyendinaga Mohawk Territory. 

Shayna Watson is a family physician in the Department of Family Medicine, and Regional Clerkship Director, at Queen’s University in Kingston, Ontario. Her non-clinical work and scholarly interests include community-based medical education, longitudinal integrated clerkships, EDIIA (Equity, Diversity, Inclusion, Indigenization, and Accessibility), and reflective practice.

Back Cover

Teaching, Learning, and Assessment Across the Disciplines: ICE Stories is the product of a collaboration among generous post-secondary educators whose practices have been influenced by the ICE model. Each author contributed a chapter based on their own conceptualization of the model and the ways they’ve used it in their classrooms. They begin by setting the context, either conceptual or instructional, in ways that are likely to resonate with readers’ own teaching and learning experiences. Authors share practical details of their instructional and assessment strategies and the ways the ICE model has shaped their own and their students’ thinking and learning.

This volume isn’t merely a compilation of cases. It represents a process of mutually supportive, reciprocal review in which the contributors met regularly over time to discuss one another’s conceptions, adaptations, and applications of ICE. They read one another’s chapters, provided peer-to-peer feedback, and learned from one another. Throughout the process, they served as generous, caring, critical friends, forming a community of inquiry.

We acknowledge and appreciate the thoughtful insights provided by the anonymous peer reviewers who shared their time and expertise. Your support was invaluable.

 

                                                               Sue Fostaty Young and Meagan Troop