CEP 813: Theory To Practice

Assessment, Digital Portfolios, MAET Year 3

I have been familiar with digital portfolios, both in my own professional context and for use with students, since my pre-service teacher preparation. I have seen the use of portfolios, especially digital ones, grow among the teachers I work with, in part because of the affordances digital portfolios offer. The research Bennett (2011) cites in support of multiple forms, occasions, and designs of assessment, used together to create the fullest possible picture of a student, has started to encourage teachers and administrators to break the mold of traditional assessment and move toward standards-based portfolios that document evidence of students’ progress toward mastery. Breaking away from instruction focused on summative assessment is necessary in order to begin piecing together that complete picture of a student’s learning path.

In my own practice, I work with teachers to design lessons that utilize technology. Sometimes we get distracted by the shiny appeal of a new technology tool, but a key element of my work is steering the focus toward the enduring understandings first. I usually begin by asking a teacher questions until we have gotten to the heart and purpose of the lesson or unit. Moving forward, whether we are designing an assessment or a lesson, we then have a clear goal and direction for our work together (Wiggins & McTighe, 2005). In my role, it is also important that I am familiar with the different technology tools that can be used to create and share digital portfolios. There are many fantastic tools, starting with even a basic shared Google Drive folder, that can quickly enable a teacher and student to collect and curate work samples. I wish I had had these tools at my disposal when I was in the classroom.

Some of the biggest concerns among the teachers I work with regarding portfolios are how to teach students to collect high-quality samples that actually demonstrate mastery, how to provide timely feedback, and the fear that portfolios are a form of assessment too prone to subjective critique. I know that I can help them begin to address these concerns in multiple ways. I can encourage them to start by establishing a vision, purpose, and audience for their student portfolios (Niguidula, 2005, p. 45). Once those key elements are in place, the teacher can decide how students will arrange and contribute to the portfolio in a way that keeps the focus on its purpose. The teacher can then dig deeper to decide what mastery in each standard or area of study will look like so that this information can be clearly communicated to students. One huge affordance of digital assessment tools is the range of possibilities for quick and easy feedback, and part of my job is to pair teachers with the tools that best enable them to provide meaningful feedback quickly. In assessing the portfolio, teachers can use rubrics to establish expectations and a student self-reflection piece for each artifact as a key component of demonstrating growth (Niguidula, 2005, p. 47).

Works Cited

 

Bennett, R. E. (2011). Formative assessment: A critical review. Assessment in Education: Principles, Policy & Practice, 18(1), 5-25. doi:10.1080/0969594X.2010.513678

 

Niguidula, D. (2005). Documenting learning with digital portfolios. Educational Leadership, 63(3), 44-47. Retrieved from http://p2047-ezproxy.msu.edu.proxy2.cl.msu.edu/login?url=http://search.ebscohost.com.proxy2.cl.msu.edu/login.aspx?direct=true&db=eax&AN=507839321&site=ehost-live

 

Wiggins, G. P., & McTighe, J. (2005). Understanding by design. Alexandria, VA: Association for Supervision and Curriculum Development. Retrieved from http://p2047-ezproxy.msu.edu.proxy1.cl.msu.edu/login?url=https://search-ebscohost-com.proxy1.cl.msu.edu/login.aspx?direct=true&db=e000xna&AN=133964&scope=site
Blogging

WELCOME TO MY BLOG!

In my posts, I share successes and failures from my work in education, technology integration, and instructional design, and I reflect on my iterative learning journey as an educator.

My hope is that by sharing my experiences I can challenge and inspire others in their daily work toward innovation and collaboration for positive change.


CEP 813: Assessment and CMS

Assessment, Google Classroom, Google Forms, MAET Year 3

This week I created an assessment for the online course module that I am designing for administrators, called Innovative Leadership. I needed to design assessments that would measure the objectives for the first module of the course, which focuses on increasing communication using Google tools. Because of this objective, and because the administrators the course is designed for work in a Google Apps for Education school district, I chose Google Classroom as my content management system.

The assessment itself is a portfolio artifact submission and reflection created using Google Forms. The purpose, audience, professional standards, and more details about this assessment are viewable in the following screencast: http://www.showme.com/sh/?h=mblc0ae

CEP 813: Assessment and Content Management Systems

Assessment, Content Management Systems, Google Classroom, MAET Year 3

In designing hybrid, blended or online learning experiences, it is important to consider what content management systems have to offer in terms of assessment. Keeping this in mind, I compared three content management systems (CMS) this week: Haiku, Google Classroom and Edmodo.

After analyzing and exploring the three potential systems, I have decided that the best option for my needs, designing a course to familiarize administrators with Google Apps for Education tools, is Google Classroom.

While it is clear that Haiku LMS and Edmodo have important built-in functions that Google Classroom lacks, such as separate discussion forums and student analytics, I really appreciate the flexibility that comes with Google Classroom. I want to emphasize the versatility of Google tools for assessment, and using Google Classroom allows students to access materials entirely within the Google suite. While Edmodo and Haiku allow Google tools to be posted as assessments and assignments, students would still have to flip back and forth between their Google tools and the CMS.

I appreciate the ability to give rich feedback in various ways: through Google Docs, the Class Stream, and the assignment comment box within the Google Classroom platform. While there is no traditional gradebook function, assignments can be graded (or left ungraded with feedback only), and it is easy to see each student's progress through their assignments in the Google Drive folder and the tracker built into Google Classroom.

For creating ePortfolios and for passing content on to different instructors, I think this is an excellent platform, as all of the content can be shared and maintained via Google Drive. Surveys, quizzes, and rubrics can all be created using Google Forms and linked to Classroom. There is no separate discussion forum within Google Classroom, but students and the teacher can interact in the Class Stream section, and I would also take advantage of the collaborative nature of Google Docs as a discussion space within this platform.

If Google tools and being a Google Apps for Education district were not the focus of the course I am offering, I would probably choose Haiku or Edmodo over Google Classroom because of their added functionality. To address some of Google Classroom's shortfalls as a content management system, I am using a Weebly website alongside Google Classroom for the course.

CEP 813: First Minecraft Experience

Assessment, Failure, Gamification, Minecraft

Over the last few days, I have spent a few introductory hours exploring a Minecraft world created on an MSU MinecraftEDU server. My previous experience consists of watching students play Minecraft and listening to them talk about it; I have never played myself. Below is a short screencast in which I introduce the greatest challenge I faced during my first Minecraft experience. I'm looking forward to exploring Minecraft further and finding new ways to incorporate it and game-based learning into the classroom.

 

CEP 813: Growth Plan for Tech Integration (Formative Assessment Design Version 1.0)

Assessment, Backwards Design, Formative Assessment, Tech Integration

This week I have been tasked with drafting a formative assessment that I can use in my work as a Technology Integrator. The link below outlines the purpose of the assessment and how it will inform my work with educators, relating it to effective instructional design, including the Backward Design principles of Grant Wiggins and Jay McTighe. The assessment was also influenced by several coaching and leadership models introduced to me in a coaching training this week, and I'm excited to put the ideas of Laura Lipton, Bruce Wellman, Paul Hersey, and Ken Blanchard into practice.

Growth Plan for Tech Integration (Formative Assessment Design Version 1.0)

CEP 813: Critical Review: Conversation as Formative Assessment in Technology Integration

Assessment, Backwards Design, Formative Assessment, Tech Integration

This week I looked critically at an assessment genre that is typical of my context and discipline and, drawing on our readings on Understanding by Design by Grant Wiggins and Jay McTighe as well as on formative assessment, worked to explain and evaluate its validity. In my non-traditional role as a technology integrator, I have identified formative feedback through conversation, both digital and verbal, as one of the assessments I use most in my practice.

Critical Review: Conversation as Formative Assessment in Technology Integration

CEP 813: Annotated Assessment/Evaluation Exemplar

Assessment, Collaboration, Creativity, Learning Theory, MAET Year 3, Tech Integration

Assessment of Student Perceptions of 21st Century Learning

[Image: K-1 Next Gen Student Survey, February 2015, page 1]

The assessment that I have chosen to analyze was designed for kindergarten and first-grade students to self-assess their learning and learning environment in Saline Area Schools Next Generation classrooms. The Next Generation classrooms utilize 1:1 technology and flexible learning spaces and emphasize effective pedagogy alongside the development of 21st century skills. The assessment was designed and developed with feedback from the Next Generation classroom teachers, the Instructional Technology Director, and myself. It consists of a series of statements that were read aloud to students while they had a paper copy in front of them. Students circled the smiley face if they thought the statement applied to them most of the time, the straight-line face if it sometimes applied to them, and the sad face if they felt it never applied to them. The assessment was anonymous in order to promote honesty and objectivity, but each response was identified by classroom. It was intended to be administered both mid-year and at the end of the year.

[Image: K-1 Next Gen Student Survey, February 2015, page 2]

Purpose and Alignment to Professional Standards

The purpose of this assessment was to provide feedback to the Instructional Technology Director, teachers, students, and the district about how Next Generation classrooms are incorporating 21st century skills, particularly creativity, critical thinking, collaboration, and communication. Standardized assessments do not measure soft skills like these, yet we know that their development is crucial for students to be successful in the workplace and can potentially impact student achievement. By analyzing the data from this learning-environment assessment alongside data from standardized assessments, we could begin to present evidence of that impact. The assessment was also designed to let students reflect on their learning experiences in this unique setting and on how it has affected them as learners this year.

The assessment aligns with the ISTE Standards for Students and the Partnership for 21st Century Learning's Framework for 21st Century Learning. The goal was to measure students' perceptions of their opportunities to create, think critically, communicate, and collaborate, along with the impact of their flexible learning environment and their access to technology for learning. Because the assessment was anonymous, the results provide feedback on class-wide and program-wide student perceptions of how particular activities affect their learning; the way the assessment is currently administered does not provide results for individual students.

Intended Use

The assessment is formative in nature, as the “evidence is actually used to adapt the teaching to meet student needs” (Black and Wiliam, 1998, p. 140). The data is compiled and analyzed, and the Next Generation teachers meet individually with the Instructional Technology Director to discuss the results. The student data from this soft-skills perception assessment is combined with standardized assessment data to look for correlations, particularly when analyzing achievement gaps and student growth. The discussion includes reflection on positive correlations, highlights things the teacher is doing well as evidenced in both data sets, and addresses areas for improvement and the adjustments the teacher might make to the learning environment and/or instruction. The assessment is administered again at the end of the school year and the process is repeated. The information from this assessment, combined with the assessments created for the other grade levels of Next Generation classrooms and with standardized assessment data, also informs the Instructional Technology Director about current trends, areas for improvement, and correlations between soft skills and academic achievement, providing evidence and data visualizations that support furthering Next Generation programming within the district.
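To make this compilation step concrete, here is a minimal sketch of what it could look like in Python. This is an illustration only, not the district's actual process: the file names, column names, and the 1-3 scoring of the face scale (sad = 1, straight = 2, smiley = 3) are assumptions I am making for the example.

```python
# Illustrative sketch only: file names, column names, and the 1-3 scoring of
# the face scale are assumptions, not the district's actual data or process.
import pandas as pd

# Hypothetical export of survey responses: one row per anonymous response,
# identified only by classroom, with one column per prompt (q1, q2, ...).
responses = pd.read_csv("k1_nextgen_survey_feb2015.csv")

# Convert the circled faces to a simple three-point frequency scale.
face_scale = {"sad": 1, "straight": 2, "smiley": 3}
prompt_cols = [c for c in responses.columns if c.startswith("q")]
scored = responses[prompt_cols].replace(face_scale)

# Average across prompts for each response, then average by classroom,
# since responses are anonymous but labeled with their classroom.
responses["perception"] = scored.mean(axis=1)
class_perception = responses.groupby("classroom")["perception"].mean()

# Hypothetical classroom-level standardized assessment summary
# (for example, a mean scale score per classroom).
achievement = pd.read_csv("k1_achievement_summary.csv", index_col="classroom")

# Join the two views and see how perception tracks achievement.
combined = achievement.join(class_perception)
print(combined)
print(combined["perception"].corr(combined["mean_scale_score"]))
```

With only a handful of classrooms, a correlation like this is suggestive rather than conclusive, which is one reason the individual conversations with teachers matter more than the number itself.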

Assumptions Embedded within Assessment

In administering this assessment, we assumed that all students understood the prompts when they were read aloud and could match the written numbers to the spoken number of each prompt. We also assumed that students clearly understood what the smiley face, straight face, and sad face represented when they selected each, and that they could physically circle the response they chose. We assumed that each student would respond honestly and would not be influenced by peers, the teacher, or the assessment administrator. Finally, we assumed that this was an appropriate number of prompts to gain enough information within a time frame that did not exceed the students' attention span.

Potential Challenges

This assessment could prove difficult for struggling readers, as students needed at the very least to be able to match the number that was said aloud to the number written on the paper. The assessment could be especially challenging for ELL students, as the prompts were written for a general education audience and did not include supporting pictures to help with unfamiliar vocabulary in the statements. The smiley faces may also be confusing to an ELL student, as their cultural connotations can vary. The assessment also proved difficult for some students because it was a survey whose responses were based on opinion, and they struggled with the idea that there was no right or wrong answer.

Implications for Assessment Re-design

This assessment echoes some of Lorrie Shepard's suggested strategies for developing informative and useful assessments (2000, p. 10). It is ongoing and administered at multiple points throughout the year, although it could also be administered at the beginning of the school year as a baseline and to provide transparency and set clear expectations for both teachers and students. The assessment provides insight into student perceptions that is used as feedback for the teacher. In doing that, we gain valuable information, but we also potentially overlook the teacher's perspective and prior knowledge. If a teacher believes they are strong at letting students use technology to show what they know but the student responses show the opposite perception, being able to see and connect those data points could lead to a more meaningful discussion and reflection on the underlying cause. A potential improvement would be to create a matching assessment designed to gauge teachers' perceptions of their own teaching; currently we provide a self-assessment for students but no equivalent self-assessment for teachers.
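If that matching teacher self-assessment were created, a simple comparison of the two perspectives might look something like the sketch below. Again, this is hypothetical: the prompts, ratings, and data layout are assumptions made only to illustrate the idea of flagging perception gaps for discussion.

```python
# Hypothetical sketch: compare a teacher's self-rating on each prompt with the
# class-average student rating, flagging large perception gaps for discussion.
import pandas as pd

# Assumed layout: one row per prompt per classroom, on the same 1-3 scale.
student_means = pd.DataFrame({
    "classroom": ["A", "A", "B", "B"],
    "prompt": ["use tech to show learning", "work with classmates"] * 2,
    "student_rating": [1.8, 2.6, 2.4, 2.1],
})
teacher_self = pd.DataFrame({
    "classroom": ["A", "A", "B", "B"],
    "prompt": ["use tech to show learning", "work with classmates"] * 2,
    "teacher_rating": [2.9, 2.5, 2.3, 2.8],
})

merged = student_means.merge(teacher_self, on=["classroom", "prompt"])
merged["gap"] = merged["teacher_rating"] - merged["student_rating"]

# Prompts where the teacher's view differs most from the students' view become
# natural starting points for the mid-year reflection conversation.
print(merged.sort_values("gap", ascending=False))
```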

References

Black, P., & Wiliam, D. (1998). Inside the black box: Raising standards through classroom assessment. Phi Delta Kappan, 80(2), 139-144.

Casner-Lotto, J., & Barrington, L. (2006). Are they really ready to work? Employers' perspectives on the basic knowledge and applied skills of new entrants to the 21st century U.S. workforce. Washington, DC: Partnership for 21st Century Skills.

International Society for Technology in Education. (2015). Standards for Students. Retrieved 31 May 2015, from http://www.iste.org/standards/ISTE-standards/standards-for-students

Partnership for 21st Century Learning. (2015). Framework for 21st Century Learning – P21. Retrieved 31 May 2015, from http://www.p21.org/our-work/p21-framework

Shepard, L. (2000). The role of assessment in a learning culture. Educational Researcher, 29(7), 4-14.