Differentiating Electronic Portfolios and Online Assessment Management Systems

from papers presented at AERA and AAHE's Assessment Conference, 2003

PDF version of paper published in SITE 2004 Proceedings
©2003, Helen C. Barrett, Ph.D.

I have just written a short paper to accompany my presentation at the 2003 AERA conference in Chicago: e-Portfolios: Issues in Assessment, Accountability and Preservice Teacher Preparation (http://electronicportfolios.org/portfolios/AERA2003.pdf). Here is an expansion of one page from that paper, addressing some of the definitional issues I am exploring between electronic portfolios and online assessment management systems. I wrote this short piece because I am finding it very difficult to research electronic portfolios today, owing to the emergence of very diverse models of implementation, especially in some of the new commercial tools that are available. These divergent implementations, and the resulting "definition by default," make the task more difficult. Here is my first attempt at delineating the differences between electronic portfolios and online assessment management systems:

---
As noted, many Teacher Education programs are adopting electronic portfolios to meet NCATE 2000 Standard #2 (Assessment System), and the implementation often resembles a grading or record-keeping system more than the traditional paper-based portfolio. In many ways, the implementation of electronic portfolios is changing the very definition of "portfolio" from past practice. Many electronic portfolio systems involve numerical scoring of artifacts against a rubric, with statistical analysis available to aggregate the data collected.

There have been some examples of careful differentiation between electronic portfolios and assessment management. At the 2003 SITE Conference, Baylor University (Rogers, 2003) presented a very creative solution, programmed in-house:

"Baylor’s Teacher Candidate Development Portfolio (TCDP) consists of four inter-related components: a candidate profile, a candidate portfolio, the 'benchmark' assessments, and the formative assessments."(p.163)

Students create an electronic portfolio using a template and HTML authoring tools and post it to the portfolio server. The in-house software allows a faculty member to select a student’s name in the lower window of a web browser, and that student’s portfolio appears in the upper window. The faculty member reviews the student’s work in the upper window and completes a scoring rubric in the lower window. All of this assessment data is collected and stored in a database, which can be used for aggregation. However, the student portfolios are developed independently of the database environment used to collect and record the assessment data, letting students maintain some individuality and control over the “look and feel” of their portfolios.
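
The decoupling Baylor describes can be made concrete with a small sketch. The schema below is hypothetical (Rogers does not describe the in-house software at this level of detail); the point is that the assessment database stores rubric scores and only a pointer, a URL, to each student's independently authored portfolio.

```python
import sqlite3

# Hypothetical schema sketch (not Baylor's actual in-house software): the
# database records rubric scores but holds only a pointer (URL) to each
# student's portfolio, which is authored and hosted independently.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE students (
    id            INTEGER PRIMARY KEY,
    name          TEXT NOT NULL,
    portfolio_url TEXT NOT NULL    -- the portfolio lives outside the database
);
CREATE TABLE rubric_criteria (
    id          INTEGER PRIMARY KEY,
    standard    TEXT NOT NULL,     -- e.g. a program or state standard
    description TEXT NOT NULL
);
CREATE TABLE scores (
    student_id   INTEGER REFERENCES students(id),
    criterion_id INTEGER REFERENCES rubric_criteria(id),
    score        INTEGER NOT NULL, -- numeric rubric rating
    scored_on    TEXT NOT NULL
);
""")

# A reviewer opens the portfolio URL in one window and records a rubric
# score here; the portfolio files themselves are never touched.
conn.execute("INSERT INTO students VALUES (1, 'A. Candidate', 'http://portfolios.example.edu/acandidate/')")
conn.execute("INSERT INTO rubric_criteria VALUES (1, 'Standard 1', 'Plans effective instruction')")
conn.execute("INSERT INTO scores VALUES (1, 1, 3, '2003-04-21')")
conn.commit()
```

The only coupling between the two systems is that URL, so nothing in the database constrains the look and feel of the portfolio itself.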

At the 2003 Assessment Conference of the American Association for Higher Education, a custom-designed electronic portfolio and a separate assessment management system were demonstrated by the University of Denver (Thompson & Cobb-Reiley, 2003). A faculty member evaluating a student e-portfolio has the option of viewing the portfolio in a split screen, as in the Baylor model, but may also open the assessment screen in a separate window, switching between the two to read the portfolio and record the evaluation. Extrapolating from these two examples, and to simplify the software development process, perhaps the assessment management system could be a stand-alone database that holds the assessment data. The student portfolio could be opened in a separate window, and the faculty reviewer could switch back and forth as needed. The assessment system could then be built with database tools more aligned with the other data management tools used in the school or college, without disrupting the integrity and authenticity of the student portfolio.
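
If the assessment management system is a stand-alone database, as suggested above, aggregating the evaluation data becomes an ordinary query. A minimal sketch, again with a hypothetical schema and sample scores:

```python
import sqlite3

# Sketch of aggregate reporting from a stand-alone assessment database:
# mean rubric score and number of ratings per criterion, computed without
# touching the portfolios themselves. Schema and data are hypothetical,
# mirroring the earlier sketch.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE rubric_criteria (id INTEGER PRIMARY KEY, description TEXT);
CREATE TABLE scores (student_id INTEGER, criterion_id INTEGER, score INTEGER);
INSERT INTO rubric_criteria VALUES (1, 'Plans effective instruction'), (2, 'Reflects on practice');
INSERT INTO scores VALUES (1, 1, 3), (2, 1, 4), (3, 1, 2), (1, 2, 4), (2, 2, 4);
""")

for description, mean, n in conn.execute("""
    SELECT c.description, AVG(s.score), COUNT(*)
    FROM scores AS s JOIN rubric_criteria AS c ON c.id = s.criterion_id
    GROUP BY c.id
"""):
    print(f"{description}: mean score {mean:.2f} over {n} ratings")
```

Because the scores live in an ordinary relational store, they can be reported with whatever data management tools the school or college already uses, while the portfolios remain under the students' control.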

Below is an initial list of the differences between electronic portfolios and online assessment management systems (a short sketch after the table illustrates the data-structure contrast):
| | Electronic Portfolio | Assessment Management System |
| --- | --- | --- |
| Purpose | Multiple purposes: learning, assessment, employment | Single purpose: formative and summative assessment |
| Data structure | Varies with the tools used to create the portfolio; most often common data formats (documents often converted to HTML or PDF) | Most often a relational database used to record and report data |
| Type of data | Primarily qualitative | Qualitative and quantitative |
| Data storage | Multiple options: CD-ROM, videotape, DVD, WWW server, LAN | Primarily on a LAN or a secure WWW server |
| Control of design & links | Visual design and hyperlinks most often under the control of the portfolio developer | Visual design and hyperlinks most often controlled by the database structure |
| Locus of control | Student-centered | Institution-centered |
| Selection of contents | Artifacts selected by the portfolio developer | Artifacts prescribed by the institution |
| Technology skills required | More advanced skills, including information design through hyperlinking, digital publishing strategies, and file management | Minimal skills, equivalent to using a web browser and adding attachments to an e-mail message |
| Technology competency demonstrated | Medium to high, depending on the tools used to create the portfolio | Low to medium, depending on the sophistication of the artifacts added to the portfolio |
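
As a concrete illustration of the data-structure row above, here is the same hypothetical artifact as each side might hold it: free-form hypertext under the student's control on the portfolio side, and a fixed, institution-prescribed record on the assessment side. Both representations are invented for illustration, not drawn from any particular vendor's format.

```python
# Portfolio side: free-form hypertext the student authors; wording, visual
# design, and hyperlinks are all under the portfolio developer's control.
portfolio_page = """
<h2>Artifact: Unit plan on fractions</h2>
<p>I chose this unit because it shows how my questioning strategies grew
   during my second practicum; see my <a href="reflection.html">reflection</a>
   and the <a href="standards.html#std1">standard</a> it addresses.</p>
"""

# Assessment-management side: the same artifact reduced to a fixed,
# queryable record whose fields are prescribed by the institution.
artifact_record = {
    "student_id": 1,
    "artifact_type": "unit_plan",
    "standard": "Standard 1",
    "score": 3,
    "url": "http://portfolios.example.edu/acandidate/fractions.html",
}
```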

Why is it important to differentiate between electronic portfolios and assessment management systems? The literature on paper-based portfolios has raised many issues and cautions about portfolio use (Lucas, 1992): the weakening of effect through careless imitation; the failure of research to validate the pedagogy; and the co-option by large-scale external testing programs. The current trend toward online assessment management systems that are being called electronic portfolios leads to further confusion in the literature, making it difficult for research to validate the pedagogy.

Lee Shulman (1998) mentions five dangers of portfolios:

  1. "lamination" - a portfolio becomes a mere exhibition, a self-advertisement, to show off
  2. "heavy lifting" - a portfolio done well is hard work. Is it worth the extra effort?
  3. "trivialization" - people start documenting stuff that isn't worth reflecting upon
  4. "perversion" - "If portfolios are going to be used, whether at the state level in Vermont or California, or at the national level by the National Board, as a form of high stakes assessment, why will portfolios be more resistant to perversion than all other forms of assessment have been? And if one of the requirements in these cases is that you develop a sufficiently objective scoring system so you can fairly compare people with one another, will your scoring system end up objectifying what's in the portfolio to the point where the portfolio will be nothing but a very, very cumbersome multiple choice test?" (p. 35)
  5. "misrepresentation" - does the emphasis on isolated examples of "best work" misrepresent the teacher's "typical work" so as not to be a true picture of competency?

To balance this perspective, Lee Shulman also identifies these five benefits for portfolios:

  1. "...portfolios permit the tracking and documentation of longer episodes of teaching and learning than happens in supervised observations." (p.35)
  2. "...portfolios encourage the reconnection between process and product." (p. 36) In the best of all worlds, Shulman says that "the very best teaching portfolios consist predominanrly of student portfolios" and highlight the results of teaching that lead to student learning.
  3. "...portfolios institutionalize norms of collaborationm reflection, and discussion" (p.36)
  4. "...a portfolio can be seen as a portable residency... A portfolio introduces structure to the field experience." (p.36)
  5. "...and really most important, the portfolio shifts the agency from an observer back to the teacher interns... Portfolios are owned and operated by teachers; they organize the portfolios; they decide what goes in them." (p.36)

Portfolios as implemented in K-12 education provide us with a model that favors supporting the learning process over the focus on accountability (perhaps because other methods have been implemented to deal with more high-stakes assessment). Evangeline Stefanakis (2002) developed a diagram of this relationship (her Figure 8-1) from her work with portfolios that demonstrate multiple intelligences; her research grew out of her work with the Massachusetts Project Zero Network. As she says,

“The drive toward standardized and state testing requires us, as researchers and practitioners, to find ways to learn from tests and portfolios in order to develop a comprehensive assessment system in which accountability would be demonstrated at many levels related to student achievement. …In a more generalized way, I offer a design for a comprehensive system which combines formal, informal, and classroom assessment, including portfolios, to inform the state, the district, the school, and the teacher. The goal for each district is to carefully construct a comprehensive assessment system, with a collection of assessments that allow many stakeholders to use these data to improve both student learning and teachers’ teaching. Without portfolios to make visible what students do and what teachers teach, I am not sure this can be done. Figure 8-1 presents my representation for an assessment for learning continuum.” (p. 137)

A portfolio that closely emulates a paper version and just happens to be stored in an electronic container is a very different document from the current implementation of these online database systems. Technology appears to be changing the definition of “portfolio” (Batson, 2002) and many of these online systems may be careless imitations or distortions of the original purpose of portfolios. The use of portfolios as high stakes assessment may be further evidence of co-option by large-scale external testing programs, or a perversion of the portfolio process. It will be important for Teacher Education programs to maintain their focus on the original purposes for which paper portfolios have been successful, and carefully assess the impact that the conversion to an electronic format will have on those original goals. Just because technology allows aggregation of portfolio data, should we succumb to this temptation?

The real balancing act is how to meet the organization's need for an assessment management system for accountability without losing what is already valuable in a paper-based reflective portfolio system. Another issue is how to implement electronic portfolios with teacher candidates in a way that they will want to replicate with their own students once they have their own classrooms. More research is needed on examples of implementation that clearly differentiate between student-owned electronic portfolios and the assessment systems used by faculty to record evidence of students’ progress toward meeting standards.
---

Further discussion of the competing paradigms of electronic portfolios and assessment management systems (February 2004)


References

Batson, Trent (2002). "The Electronic Portfolio Boom: What's it All About?" Syllabus. Available online: http://www.syllabus.com/article.asp?id=6984

Lucas, Catharine (1992). "Introduction: Writing Portfolios - Changes and Challenges." In K. Yancey (Ed.), Portfolios in the Writing Classroom: An Introduction (pp. 1-11). Urbana, Illinois: NCTE.

Rogers, Douglas (2003). "Teacher Preparation, Electronic Portfolios, and NCATE." Proceedings of the 2003 Conference of the Society for Information Technology and Teacher Education, Albuquerque, March 24-29 (pp. 163-165).

Shulman, Lee (1998). "Teacher Portfolios: A Theoretical Activity." In N. Lyons (Ed.), With Portfolio in Hand (pp. 23-37). New York: Teachers College Press.

Stefanakis, Evangeline (2002). Multiple Intelligences and Portfolios. Portsmouth, NH: Heinemann.

Thompson, Sheila & Cobb-Reiley, Linda (2003). "Using Electronic Portfolios to Assess Learning at the University of Denver." Presentation at the 2003 AAHE Assessment Conference, Seattle, June 23, 2003.


Updated July 17, 2004