Fwd: assessment edweek

Philipp Schmidt

Oct 5, 2010, 11:58:35 PM
to open-assessment
Maybe interesting for more people ... P


---------- Forwarded message ----------
From: Joshua Gay <josh...@gmail.com>
Date: 5 October 2010 20:56
Subject: assessment edweek
To: Philipp Schmidt <phi.s...@gmail.com>, Ahrash Bissell
<ahrash....@gmail.com>


I'm watching the live stream of Karen Cator (Ed Tech Director of the
US) at Ed Tech Week
(http://www.edweekevents.org/livestream?intc=LF4EM). She is talking
about embedding assessment in online learning environments ... it's
worth listening to the first few minutes of the talk. I'm sure the
recorded stream will be put up somewhere (hopefully linked to from the
link I shared above).

-Josh

--
"Every time the word 'achievement' or 'academics' is used to mean test
scores we cheapen the meaning of both terms." --Deborah Meier

David Gibson

Oct 6, 2010, 5:20:32 PM
to open-as...@googlegroups.com
Hi Philipp:

Just last week I had a chance to sit right next to Karen on a panel at the National Technology Leadership Summit at AACTE headquarters. Her elevator speech (she says it works if the building is rather tall) is a wonderful set of concepts taking the field forward. She also said that hopefully later in November the national technology plan will have final approval and we can trumpet it around - it's really quite good, visionary, etc. I commented in response that, in addition to the visionary infrastructure work that is going on, we should add a human infrastructure layer. I was referring to a group I had just seen in Australia that in the past had produced "distance education TV" and is now a game production studio with artists, game makers, and content specialists. They play an intermediary role as experts who support teachers as well as the direct online learning experiences of 35,000 students (per game) in live, evolving contexts (and contests) that involve digital media learning.

I think the U.S. should re-tool its "Lab" structure for direct-to-student, direct-to-parent, and direct-to-classroom approaches.

Another item to share: I'll be submitting a panel session proposal to the ISTE conference with this content (still being edited a bit):

TITLE (10 words max):
Suggestion 1: Outcomes from National Technology Leadership Summit 2010 Performance Assessment Strand
Suggestion 2: Performance Assessment and Digital Media: National Technology Leadership Summit Outcomes
Suggestion 3: Digital Media-Based Performance Assessment: The Trojan Horse of Education
Suggestion 4 (from Pam): Performance Assessment of 21st Century Teaching and Learning: Insights into the Future

DESCRIPTION (25 words max):
This panel will present outcomes from the performance assessment strand of the 2010 National Technology Leadership Summit (NTLS) held in Washington, DC at AACTE headquarters.

PURPOSE AND OBJECTIVES:
The purpose of this panel discussion is to present outcomes from the performance assessment strand of the 2010 National Technology Leadership Summit (NTLS). This year NTLS took place in Washington, DC at AACTE headquarters from September 30th through October 1st. The strand was co-chaired by Dr. David Gibson (Society for Information Technology in Teacher Education - SITE Vice President) and Dr. Gerald Knezek (SITE President). The outcomes to be discussed converge upon a central theme: an understanding of performance assessment and project-based learning as crucial elements in the development of 21st century skills in both teachers and learners, and of methods for documenting progress toward meeting performance standards. Two specific themes will be explored in this session:
(1) The importance of investment in innovation in university-based teacher education programs and the need to create a knowledge base that informs teacher decision making. This discussion will focus on investing in expanding the skills and knowledge base for design-based research, particularly developmental research on performance assessment within the framework of TPACK (Technological, Pedagogical and Content Knowledge), as well as the national TPAC (Teacher Performance Assessment Consortium) measures currently being piloted across 16 states by the American Association of Colleges of Teacher Education (AACTE).
(2) The need for investment in a repository of shared instruments for 21st century learning, including teaching vignettes that model TPACK and TPAC in classroom environments, using social networking methods to build a national community of practice. This would ideally include collaborative data-gathering networks and the active involvement of AACTE, SITE, NTLS, and other stakeholder organizations. The primary goal would be to provide explicit guidance that enables teachers to become change agents, field leaders, and entrepreneurs while using technology in project-based learning that incorporates their content area's contemporary best practices for teaching with technology. A main objective of this endeavor is to build and maintain a vetted collection of standardized instruments and shared protocols so that educators might create a unified, data-based, and robust model of the system.

OUTLINE:
Performance assessment can be conceptually divided into two strands – traditional and new wave. The old-fashioned way of doing performance assessment, with paper portfolios, coaching, apprenticeship, and human scoring, ground to a halt with the advent of "standards." Chris Dede points out that "research has documented that higher order thinking skills related to sophisticated cognition (e.g., inquiry processes, formulating scientific explanations, communicating scientific understanding, approaches to novel situations) are difficult to measure with multiple choice or even with constructed-response paper-and-pencil tests" (Resnick & Resnick, 1992; Quellmalz & Haertel, 2004; National Research Council, 2006; Clarke & Dede, in press). However, as Jody Clarke points out, "attempts to address these limitations by designing hands-on and virtual performance assessments in the late 1980's and 1990's encountered a number of technical, resource, and reliability problems in large-scale administration" (Cronbach, Linn, Brennan, & Haertel, 1997; Shavelson, Ruiz-Primo, & Wiley, 1999). The core idea of the panel's discussion of new wave performance assessment methodologies is that new artificially intelligent computational models in assessment systems may become more like assistants to researchers, going beyond the current status of traditional methods as inert tools of analysis; at the same time, the new models and methods might open up new cognitive territory for epistemological advances. In addition, the traditional goal of assessment - to provide valid inferences related to particular expectations for students - relies on conceptions of validity and reliability that may need re-thinking in light of new scientific methods.


This panel discussion will begin by considering whether new wave methods, and the changing contexts of education in 2010 and beyond, might help performance assessment re-enter professional conversations and practices. In particular, three forces seem poised as game changers for teacher education (Gibson & Knezek, in press): (1) The global environment has dramatically shifted and flattened the world, offering educators new affordances, challenges, and models. Chief among these forces are globally networked digital media, open access to information, new visualization tools and methods, social networking, crowdsourcing, peer-to-peer pedagogy, new media literacy skills, and the implications of all of these for students' 21st century skills. (2) The complex systems sciences are discovering new ways to represent time and evolving networks of systems, transforming many branches of knowledge, including physics, biology, engineering, medicine, and neuroscience. (3) The realization has dawned that technology is no longer an "add-on" or "nice-to-have" in education. It is a permanent element of the core competencies for teaching in the 21st century - part of the "technological pedagogical content knowledge" needed by all educators. This panel will address the question of whether these game changers are the beginning of a transformation that is presently only partially visible, and what else might be on the horizon.

SUPPORTING RESEARCH:
Clarke, J., & Dede, C. (2010). Assessment, technology, and change. Journal of Research on Technology in Education, 42(3).

Cronbach, L. J., Linn, R. L., Brennan, R. L., & Haertel, E. H. (1997). Generalizability analysis for performance assessments of student achievement or school effectiveness. Educational and Psychological Measurement, 57, 373-399.

Gibson, D., & Knezek, G. (in press). Game changers for teacher education. Paper presented at the 2010 Sydney Symposium.

Haertel, G. D., Lash, A., Javitz, H., & Quellmalz, E. (2006). An instructional sensitivity study of science inquiry items from three large-scale science examinations. Presented at AERA 2007.

Mislevy, R., & Haertel, G. (2006). Implications of evidence-centered design for educational testing (Draft PADI Technical Report 17). Menlo Park, CA: SRI International.

Mislevy, R. J., Steinberg, L. S., & Almond, R. G. (2003). On the structure of educational assessments. Measurement: Interdisciplinary Research and Perspectives, 1, 3-62.

National Research Council. (2006). Systems for state science assessment. Washington, DC: The National Academies Press.

Quellmalz, E. S., & Haertel, G. (2004). Technology supports for state science assessment systems. Paper commissioned by the National Research Council Committee on Test Design for K-12 Science Achievement. Washington, DC: National Research Council.

Resnick, L. B., & Resnick, D. P. (1992). Assessing the thinking curriculum: New tools for educational reform. In B. Gifford & M. O'Connor (Eds.), Changing assessments: Alternative views of aptitude, achievement, and instruction (pp. 37-75). Norwell, MA: Kluwer Academic Publishers.

Shavelson, R. J., Ruiz-Primo, M. A., & Wiley, E. W. (1999). Note on sources of sampling variability in science performance assessments. Journal of Educational Measurement, 36, 61-71.

Thagard, P., & Litt, A. (2008). Models of scientific explanation. In R. Sun (Ed.), The Cambridge handbook of computational cognitive modeling. Cambridge, UK: Cambridge University Press.

PRESENTER BACKGROUND:
Panel members include the NTLS 2010 performance assessment strand co-chairs, David Gibson (SITE Vice President) and Gerald Knezek (SITE President). Additional potential panel participants are the members of the NTLS 2010 performance assessment strand, including:
Dina Rosen (NAECTE Technology Committee), Pamela Redmond (Co-chair, AACTE Committee on Innovation & Technology), Mary Herring (AECT Past-President), John Mergendoller (Executive Director, Buck Institute for Education), Punya Mishra (Co-chair, SITE TPACK SIG), Becky Fisher (ACPS Assistant Director for Instructional Technology), Denise Schmidt (Editor, Journal of Digital Learning in Teacher Education), John Lee (Co-chair, SITE Teacher Education Council), Rhonda Christensen (Chair, SITE Research & Evaluation SIG), Shelby Marshall (FableVision Solutions Architect), Scott DeWitt (Editor, CITE Journal – Social Studies), Leigh Graves Wolf (Program Coordinator, Master of Arts in Educational Technology, MSU), and Daniel Tillman (Co-chair, SITE Digital Fabrication SIG).

Also - put on your assessment hat and take a look at this artist who is using large-scale data sets as live inputs to exploratory analytic-artistic interfaces. I think this has something to teach us about the future of assessment as well.

http://www.wefeelfine.org/

On Oct 5, 2010, at 11:58 PM, Philipp Schmidt wrote:

> I'm watching the live stream of Karen Cator (Ed Tech Director of the
> US) at Ed Tech Week
> (http://www.edweekevents.org/livestream?intc=LF4EM). She is talking
> about embedding assessment in online learning environments ... it's
> worth listening to the first few minutes of the talk. I'm sure the
> recorded stream will be put up somewhere (hopefully linked to from the
> link I shared above).

--------
Best wishes


Allen Gunn

Oct 8, 2010, 2:23:39 PM
to open-as...@googlegroups.com, Mike Roman
Hey friends,

Can't believe it's already been almost 3 weeks since all the fun in Palo
Alto!

Thanks to all of you who have submitted expense reports.

As previously noted, participant travel expenses were due this past
Wednesday. There are a number of participants from whom we have not
received an expense report. We will be closing the books for the event
on Monday, and after that point we will not be accepting any further
expense reimbursement requests.

At present we have expenses from Ahrash, Ari, Daniel, David, Donna,
Ingrid, Josh, Nils, Pippa, Stian, and Theron.

Please let us know if we should have your name on the above list, or if
you have recently sent us your expenses. Alternatively, you can call Mike
at +1.415.839.6456.

thanks & peace,
gunner

--

Allen Gunn
Executive Director, Aspiration
+1.415.216.7252
www.aspirationtech.org

Aspiration: "Better Tools for a Better World"

Read our Manifesto: http://bit.ly/aspimanifesto

Follow us:
Facebook: www.facebook.com/aspirationtech
Twitter: www.twitter.com/aspirationtech

David Gibson

Oct 9, 2010, 3:13:09 PM
to open-as...@googlegroups.com, Mike Roman
Gunner, you made it fun and productive! Great job.

On Oct 8, 2010, at 2:23 PM, Allen Gunn wrote:

> Can't believe it's already been almost 3 weeks since all the fun in Palo
> Alto!

--------
Best wishes

