
An e-portfolio design supporting ownership, social learning, and ease of use.


One of the most popular and widespread educational technology innovations since the introduction of the course management system has been software supporting the Internet-based creation of portfolios (Lorenzo & Ittelson, 2005a, 2005b). Adoption of educational portfolios has been increasing for some time. They are particularly popular in the United Kingdom, where they are part of the government's personal development initiatives (Becta, 2007). In the U.S., their popularity is largely driven by assessment requirements, where portfolios can be used to demonstrate higher-order thinking such as synthesis and application (Camp & DeBlois, 2007).

However, as portfolios are increasingly adopted for assessment purposes, their original pedagogical purpose is often subverted (Barrett & Wilkerson, 2004). Instead of using portfolios for learning, and their contents for assessment, portfolio initiatives and software are increasingly focused solely on providing assessment data. The adoption of portfolios for assessment purposes can be a positive development, provided it is understood that assessment data are a result of the learning process. Unfortunately, faculty and administrators' focus on assessment has skewed portfolio software designs. This research project was intended to refocus the design of electronic portfolio systems back onto learning.

Literature Review

According to Beetham (2005), a portfolio is "a collection of documents relating to a learner's progress, development, and achievements" (p.2), with the purpose of providing a record of progress, collecting evidence for outcomes assessment, and encouraging reflection on learning. The key feature of a portfolio is that it is owner-centric, as opposed to course-centric (ePortConsortium, 2003).

A portfolio's support for reflective thinking (Becta, 2007) is its most defining pedagogical feature. Students are typically required to upload artifacts, and then reflect on how the artifact demonstrates a certain competency or learning progression. Most software packages provide specific fields next to uploaded files for students to type their reflections.

One of the key reasons for the original use of portfolios was their ability to enhance students' meta-cognition (students' knowledge of what they know) through reflection. Originally fueled by theory, the use of portfolios was strongly supported by Lee Shulman (8th President of the Carnegie Foundation for the Advancement of Teaching) and other researchers (Carney, 2004). Today, their pedagogical benefit is well known (Murray, Smith, Pellow, Hennessy, & Higgison, 2006), but the theory behind them still has holes that require further research (Carney, 2004).

As a result of these advantages, portfolios have been extensively adopted in many disciplines, particularly in teacher training programs. A 2002 study found that 89% of the educational program respondents used portfolios in their outcomes assessments (Salzman, Denner & Harris, 2002). Almost all of these units were using locally-developed software. The United Kingdom is also a major supporter of portfolios, and many professional bodies and large employers encourage the maintenance of portfolios (Beetham, 2005).

Zubizarreta (2006) argues that portfolios require three domains of activity: documentation, reflection, and collaboration. That collaboration should include faculty members (classroom teachers or an advisor) and other students. As he says, "... reflection is facilitated best by not leaving students individually to their own devices in thinking about their learning but by utilizing the advantages of collaboration and mentoring in making learning community property."

Research Opportunity

Cohn and Hibbitts (2004) identify the "ossification of the current prefabricated, one-size-fits-most portfolio" (p.2). This is largely a result of an overdeveloped focus upon assessment as the primary outcome of a portfolio. The focus on the assessment of learning at the expense of the development of learning (Barrett, 2005; Barrett & Wilkerson, 2004) has warped the dominant implicit portfolio design model.

This implicit design paradigm typically models a portfolio as a database of artifacts assembled into views for specific audiences. This model is suited to controlling access and providing multiple views onto data, and it works well for assessment purposes. However, it makes portfolios relatively complex to construct, and the "private by default" permission settings discourage sharing and social learning. Unfortunately, this paradigm is so common that almost all major programs use it.

Unfortunately, most portfolio software systems also do not provide robust features for collaboration and sharing. A survey conducted by the UK's Becta (2007) found that only 33% of the surveyed students in the case studies agreed that their system "shows me what my friends are learning" (p.17). Only 46% agreed that the systems helped them give feedback on each other's work.

Students want education software that helps them to connect with each other, lets them express their individuality, and is easy to use (Jafari, McGee, & Carmean, 2006a, 2006b). Students are requesting systems more like the social software they use outside of school (Jafari et al., 2006a). As one student says, "social software has been around for a while now and it's a lot more user-friendly" (Becta, 2007, p.16). Similarly, staff and faculty who use portfolios for their own learning are less satisfied with education software than with social software. Another person said:

I engage in many e-portfolio-like practices. Those involving dedicated e-portfolio tools have been far less satisfactory than those involving social software tools such as blogs, wikis, social networking sites. (Becta, 2007, p.24)

Conceptual Approach

My research approach is to create an artifact and use it in an educational setting. The combination of usage logs, survey data, and interview results can then be used to evaluate a design model and related software implementation.

This pragmatic approach to research is normally called design science. Building upon Herbert Simon's concept of the "science of the artificial" (Simon, 1996), it follows a "build and evaluate" cycle (Hevner, March, & Park, 2004). This differs from other scientific approaches, such as natural science's "theorize and justify" approach.

Beyond simply creating a new implementation, this project focuses upon the creation and validation of an alternative model (or mindset) for the creation of portfolio software to support higher education. Design science research explicitly includes this kind of theorizing (March & Smith, 1995). The following sections propose a number of conceptual requirements for a portfolio system (following Walls, Widmeyer, & El Sawy, 1992, 2004).

Ownership

The concept of ownership is central in portfolio systems. As stated earlier, the defining characteristic of a portfolio is that it is owner-centric. Social presence is a theory describing how people can "project themselves socially and emotionally as 'real' people" (Cameron & Anderson, 2006). Social presence comprises five dimensions: focus, identity, safety, ownership, and style.

* Focus relates to the ability of students to talk about subjects that interest them. As Cameron and Anderson (2006) put it, "[t]opics of discussion within a computer conference are typically course focused and instructor directed."

* The concepts of identity and style relate to the ability of students to develop their personal voices. Identity can also be developed through customization and personalization of a site, use of a formal or informal voice, and stylistic decisions.

* Students must have a feeling of safety before they can project themselves online. This can be accomplished by allowing students to restrict some of their work to a limited audience through permission settings, or by letting them close their account in the future.

* The concept of ownership is also linked to social presence. Most blogging systems allow the users to control their environment and communication. Creating a sense of personal ownership is also thought to be crucial in constructivism, where learners are expected to learn in their own unique way.

Social Learning

Collaborative influences during portfolio construction have a positive effect upon student learning, but have been poorly acknowledged in existing system designs. As one of the three basic realms of activity proposed by Zubizarreta (2006), collaboration should be reflected in systems' design as more than an afterthought. Casual exposure to peers' content should be one of the basic use cases of any portfolio software. Most systems emphasize security and privacy and, as a result, leave out the casual learning that occurs in a traditional paper classroom portfolio project.

Ease of Use

One of the major complaints about existing systems is that they are difficult to learn and use. The Technology Acceptance Model (Davis, 1989) ease of use and usefulness constructs accurately model students' acceptance of portfolio software (Goldsmith, 2007). Because ease of use is such a critical variable, system designers should make a great effort to ensure that users have an easy time becoming skillful with a system.

The most common portfolio style uses a collection of pages that are organized for specific snapshots. This assumes that students will create a large amount of re-usable content that is submitted a number of times, and adds unneeded complexity.

Workflow of Collect, Select, Reflect, Assess Cycle

Ultimately, every portfolio system needs to support the basic workflow cycle of collect, select, reflect, and assess. Collect refers to the process of saving material used to satisfy course-related requirements. Select then refers to the process of choosing from among the saved artifacts to find those illustrating either achievements or learning. After selecting a set of artifacts, students then reflect on their learning, using their artifacts and a set of learning objectives as a prompt. Lastly, the teacher (or staff member) assesses the students' learning and records the results for further use. This cycle then repeats for the next assignment or class, or terminates at the end of the student's enrollment.
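The cycle above can be sketched as a minimal data model. All names here (Artifact, Portfolio, collect, select) are illustrative assumptions for exposition, not Folio's actual classes:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the collect / select / reflect / assess cycle.
# These class and method names are illustrative, not Folio's real code.

@dataclass
class Artifact:
    name: str
    selected: bool = False   # set during the "select" step
    reflection: str = ""     # filled in during the "reflect" step
    assessment: str = ""     # recorded by staff during the "assess" step

@dataclass
class Portfolio:
    owner: str
    artifacts: list = field(default_factory=list)

    def collect(self, name):
        """Save a new artifact (the "collect" step)."""
        artifact = Artifact(name)
        self.artifacts.append(artifact)
        return artifact

    def select(self, names):
        """Mark the artifacts chosen to illustrate learning (the "select" step)."""
        chosen = [a for a in self.artifacts if a.name in names]
        for a in chosen:
            a.selected = True
        return chosen

portfolio = Portfolio("student1")
portfolio.collect("essay.docx")
portfolio.collect("draft.txt")
chosen = portfolio.select({"essay.docx"})
chosen[0].reflection = "This essay demonstrates my progress in argumentation."
```

The key design point the sketch makes concrete is that reflection and assessment attach to individual artifacts, while the cycle itself repeats per assignment or class.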

Research Design

The research approach for this project was to create a software package embodying the concepts presented in the previous section. Using a number of products would have yielded a stronger research design. Unfortunately, a review of existing products was unable to locate ones with the conceptual features being tested. As a result, I created a new software program for this project.

I began developing my portfolio software as a wiki plug-in for Elgg (an open-source social networking system) in January 2005. Since then, I have put out 7 major releases and many more incremental beta releases. The tool is broadly used; some of the largest users are the University of Brighton, Emerald Publishing, and Cambridge University.

Screen Design

Folio provides extensive support for portfolio workflow, from the initial page creation to archiving work for later assessment. Whenever a user either goes to the "Page" tab for the first time, or clicks the link to add a new page, they are shown a list of pre-built templates (Figure 1).


Allowing users to create individual pages from a template has important implications. It means that users can add pages created for a single class even after they have started their portfolio.

Pages with similar tags are shown in the left-hand sidebar. Because class pages are created through templates with pre-populated tags, this immediately allows students to view peer work (Figure 2).
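The sidebar behavior described above can be sketched as a simple tag-overlap query. The function name and the page fields below are hypothetical, chosen only to illustrate the logic, not the plug-in's real API:

```python
# Illustrative sketch of the tag-based sidebar: list peers' pages that share
# a tag with the page being edited, excluding the viewer's own pages and any
# pages the viewer lacks permission to see. All names are hypothetical.

def sidebar_pages(current_page, all_pages, viewer):
    """Return peer pages sharing at least one tag with current_page."""
    shared = []
    for page in all_pages:
        if page["owner"] == viewer:
            continue  # skip the viewer's own pages
        if not page["tags"] & current_page["tags"]:
            continue  # no tag overlap with the page being edited
        if viewer not in page["allowed_viewers"]:
            continue  # respect the page's permission settings
        shared.append(page)
    return shared

pages = [
    {"owner": "alice", "tags": {"it509-week1"}, "allowed_viewers": {"bob", "carol"}},
    {"owner": "bob",   "tags": {"it509-week1"}, "allowed_viewers": {"alice"}},
    {"owner": "carol", "tags": {"it509-week2"}, "allowed_viewers": {"alice", "bob"}},
]
mine = {"owner": "bob", "tags": {"it509-week1"}, "allowed_viewers": set()}
peers = sidebar_pages(mine, pages, "bob")  # only alice's week-1 page matches
```

Because class templates pre-populate the tags, a student's page automatically overlaps with every classmate's page for the same assignment, which is what makes peer work visible without any extra effort.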

Hovering over these pages with the mouse cursor "pops them up" in a JavaScript window. Users can also click on the link to be taken directly to that user's page. This hovering behavior is important because it provides an immediate sense of the level of development of each page. Since students may create a template page just to see their weekly assignment, not all created pages have content. This pop-up feature (Figure 3) makes it simple for students to quickly scan for completed work.




The bottom of each page has a section for attachments. Attachments can be a selection of text, a file, or embedded HTML. Each uploaded file comes with a reflection field; clicking on this field changes it into a rich-text editor (Figure 4).

Portfolio structure is displayed through a tree-based side navigation pane; this makes it easy to see all of a user's work. Tags are listed below the page body; clicking on these tags takes the user to a page showing all pages they have permission to see with the given tag. These features are displayed in Figure 5.


Each view page also displays a comment box at the bottom. This makes it easy for an instructor to provide public feedback. All comments are automatically e-mailed to the page owner. If desired, an option box lets the commenter prevent a copy of the comment from being kept on the page.

Research Method

My research method was to create an artifact, use it in a live setting, and evaluate the artifact effectiveness through surveys and usage statistics.

First, I recorded detailed usage statistics.

* Pages Viewed: Users' browsing behavior was recorded.

* Page Edits: The system recorded each time a user edited one of their own pages.

* Popups List: Whenever students edited a page, they were shown a list of pages with the same tag. The number of pages listed varied throughout the term, as the first person to create a page would not have any similar pages to see. The total number of popups listed was recorded.

* Popups Viewed: The system recorded each time a student moved their mouse over a page listed in the sidebar (causing a window to popup with the page contents).
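A minimal sketch of this kind of event logging, using hypothetical event names that mirror the four measures above (the actual Folio log schema is not shown in this article):

```python
import time

# Hypothetical usage log mirroring the four recorded measures:
# page_view, page_edit, popup_listed, popup_viewed.

log = []

def record(event, user, page):
    """Append one timestamped usage event to the log."""
    log.append({"event": event, "user": user, "page": page, "ts": time.time()})

record("page_edit", "alice", "week1")
record("popup_listed", "alice", "bob-week1")   # bob's page shown in the sidebar
record("popup_viewed", "alice", "bob-week1")   # alice hovered over it

# Simple aggregate: share of listed pop-ups that were actually viewed,
# the kind of ratio reported in the Usage Results section.
listed = sum(1 for e in log if e["event"] == "popup_listed")
viewed = sum(1 for e in log if e["event"] == "popup_viewed")
view_rate = viewed / listed if listed else 0.0
```

Recording "listed" and "viewed" as separate events is what allows the hover rates reported later to be computed as viewed/listed.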

Second, my design model hypothesis claims that ownership, social learning, and ease of use are major factors in user satisfaction. These variables were operationalized and measured through a post-test survey.

* Hypothesis: Ownership, social learning, and ease of use predict user satisfaction.

Third, I conducted a number of phone interviews to provide additional triangulation for the survey results and system usage logs. The interview script was created to answer questions raised by the survey analysis. Primarily, these focused on the following questions.

1. I wanted to better understand students' feelings of control and ownership over their portfolio. Why did some feel they had control over the visual style? What factors influenced their feelings of control?

2. How did students use the ability to view peer work? Were they trying to benchmark their work, learn about the content, or build personal relationships? How did the tool help them in each of those areas?

My tool was used in two separate sites.

1. The School of Information Systems and Technology (SISAT) at a research university used the software as an assessment portfolio tool in Fall 2008. All students taking courses were required to submit artifacts demonstrating learning for each class. Of the 11 sections that term, an average of 81% (56 students in total) submitted material into a portfolio.

2. Folio was also used to support 3 sections of a "Management of IT" MBA course, IT 509, at a small liberal arts university. This course was taught by the author. IT 509 is part of the core MBA curriculum, providing a broad sample of MBA students. It is also an accelerated course, with students meeting once a week for 4 hours during the seven-week term. As a result, students are required to do much of their work outside class. Students used the site as their primary course management system, submitting material each week before class. A total of 46 students were enrolled in these three sections.

Neither group of students was given incentives or encouragement to use the collaborative features of the tool.

Usage Results

The logging feature enabled me to record usage statistics for both the SISAT students and the IT 509 course.

Overall Editing Usage

The SISAT students created a total of 91 pages from templates, an average of 11.4 per class (81% of the course, or 56 students in total). For each page submitted to a class, the author averaged 2.8 visits to the edit screen. In each of these edit screens, an average of 2.7 pages were shown in the sidebar. 53% of the time, people hovered their cursors over the available pages in the sidebar to pop-up their peer pages.

The 46 IT 509 students (in three sections) individually submitted a page for each of the 7 weeks of the course. Each week, the students visited the edit screen an average of 4.2 times. On average, the sidebar showed them 4.2 peer pages. On average, 88% of these peer pages were viewed through the use of a pop-up. Due to a glitch in the software, the second section's Week 1-5 log files were not recorded.

Editing Usage by Time

Graphing the data over time shows the time-based usage of the system. The following two charts show the frequency with which people viewed the edit screen, the count of pop-ups listed in the editing sidebar, and the number of times they hovered over the listed pages to view a pop-up.

The SISAT chart (Figure 6) is organized by month and day. Students began submitting template pages at the beginning of December (12), and continued in January (1). Students were required to submit their final assignment before grades were due in mid-January. The blue line shows saving activity, the green line shows pop-ups displayed, and the red line shows displayed pop-ups that students "hovered" on.

The IT 509 charts (Figure 7 & 8) are organized by calendar week and separated by section (section meaning an individual course offered). Section assignment dates varied, as some courses ended earlier than others. Unfortunately, there were several problems with the viewing activity logs (save events were still recorded properly). Section 3 student logs were not recorded properly until midway through Week 5. Section 2 student logs for the last week were not properly recorded.

The IT 509 courses varied slightly in weekly assignments. Section 1 had no class in Week 5. The due date for Section 2 extended into Week 8. Presentations in Weeks 6 and 7 resulted in less (or no) homework, depressing online submission activity (Figure 8). While Figure 7 shows variation throughout the term, it clearly indicates both early and frequent use of the pop-up feature. Figure 8 is provided to compare usage of the pop-up feature with overall system usage.

Peer Page Viewing Behavior Over Time

Beyond using the pop-ups to view peer work, students also had the option to click through to view peer pages. The qualitative comments indicate that students frequently used the pop-ups to find when students had submitted work, and then would click through to view peer work in more detail.

The following Figures 9 and 10 show the frequency with which students viewed pop-ups and how often they viewed their peers' pages.

As can be seen in both groups, students frequently looked at each other's pages throughout the term. This is encouraging, as it means that students quickly realized that they could view peer work, and persisted throughout their usage of the system.

Survey Hypothesis Results

Both IT 509 and SISAT students were given surveys to measure their experiences. Most SISAT students submitted their portfolio after the last week of class, so surveys were administered to 5 courses during the first two weeks of the following Spring 2009 term. Since many students were enrolled in multiple classes, they were asked to fill out the survey only once. A total of 29 usable surveys were returned (a small number of blank surveys were discarded).

The IT 509 students completed the post-test survey in the final week of the class. To prevent the possibility that students would give biased answers (since the researcher would be giving grades), the surveys were sealed and delivered to a neutral third party. After grades for the class were submitted, the sealed envelope of surveys was returned. A total of 39 surveys were returned.

Both groups of surveys were combined, giving a total of 68 surveys available for analysis.

Design Theory Variables

The survey measured a number of variables related to the design model. These included measures of students' feelings of control over their portfolio, perceptions of social learning, ease of use, and reported user satisfaction. Questions were in a 5-level Likert format.


Both the IT 509 and the SISAT students reported that they felt they controlled the content (90%) and organization (77%) of their portfolios. However, only 52% felt they controlled the visual template. The latter is not surprising, as the software template was not easily customized. However, the combination of custom avatar (user picture), formatting, and fonts appears to have led at least half of the students to feel that they had control over their work's visual presentation.

A large majority of students (83%) reported that their portfolio belonged to them. This is an important validation that the design feels student-centric instead of institution-centric.

Social Learning

Students reported that the social learning aspects of the tool helped them in a number of ways. The most positive result was that social learning helped them to learn the material (79%). 58% of students also reported that seeing each other's work helped them to get to know each other. 54% of students reported that they were motivated by knowing their work was public, and that their work quality was improved by being able to see peer work.

Ease of Use

Most students (79%) reported that the site was easy to use. Almost all of the students reported that they had an easy time editing their portfolio and adding content. Several reported that they had difficulty uploading large PowerPoint files, but otherwise that it was easy to use.

User Satisfaction

80% of users reported that using the site was a positive experience. Users were most satisfied with how the system helped them view classmates' work (71%), but also thought it helped them do better in class (64%) and get to know their classmates (50%).

Variable Reliability

My portfolio design model holds that ownership, social learning, and ease of use predict users' satisfaction with the software.

The Ownership construct includes the following variables, and had a Cronbach's Alpha of .806. Social science research generally considers an alpha above .7 acceptable (Nunnally, 1978). Questions were in a Likert format.

* I had control over my portfolio's organization

* I had control over my portfolio's content

* I had control over my portfolio's visual template

* My portfolio belonged to me

* I have the ability to continue using my portfolio in the future

The Social Learning construct included the following questions, and had a Cronbach's Alpha of .887. Questions were in a Likert format.

* Being able to see other students' work:

** Helped me learn the material

** Improved the quality of my work

** Helped me to get to know other people

* It was motivating to know that other people would see my portfolio.

The Ease of Use construct was built from the following 5 TAM questions. These had a Cronbach's Alpha of .924. The <sitename> text shown below was replaced with the URL of the site.

* Learning to operate <sitename> was easy for me.

* I found it easy to get <sitename> to do what I want it to do.

* My interaction with <sitename> was clear and understandable.

* I found <sitename> to be flexible to interact with.

* It was easy for me to become skillful at using <sitename>.

The User Satisfaction construct included the following questions, and had a Cronbach's Alpha of .881.

* Overall, using <sitename> was a positive experience.

* Using <sitename> helped me to do better in my class

* Using <sitename> helped me to have a better sense of my classmates' work

* Using <sitename> helped me to get to know my classmates
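For readers unfamiliar with the statistic, Cronbach's alpha for a multi-item Likert construct like those above can be computed as follows. The sample responses are illustrative only, not the study's actual data:

```python
# Cronbach's alpha: internal-consistency reliability of a multi-item scale.
# alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))

def cronbach_alpha(items):
    """items: list of per-item response lists (same respondents, same order)."""
    k = len(items)                    # number of items in the construct
    n = len(items[0])                 # number of respondents

    def var(xs):                      # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(item[i] for item in items) for i in range(n)]
    item_var_sum = sum(var(item) for item in items)
    return (k / (k - 1)) * (1 - item_var_sum / var(totals))

# Five hypothetical respondents answering three Likert items (1-5).
responses = [
    [4, 5, 3, 4, 2],
    [4, 4, 3, 5, 2],
    [5, 4, 2, 4, 3],
]
alpha = cronbach_alpha(responses)  # about 0.86 for this illustrative data
```

Alphas such as the .806 and .887 reported above indicate that the items within each construct move together, which justifies averaging them into a single scale score.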

Regression Model Result

My hypothesis was that ownership, social learning, and ease of use predict user satisfaction.

The resulting model has an adjusted R² of 0.545, and an F score of 28.148 (Figures 11, 12, 13). As a result, I can be highly confident that the regression model is not the result of random variation in my data.

I also created a regression model that used IT 509 membership as a 1/0 dummy variable to test the significance of the difference between the SISAT users' experience and the IT 509 users' experience. The dummy was not significant, with a t value of 1.371 (p = 0.175), and it did not have a significant effect upon either the overall model's predictive value or the predictors' coefficients.
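The adjusted R² reported above corrects plain R² for the number of predictors, so adding variables does not inflate the fit. A minimal sketch of the adjustment (the unadjusted R² value below is illustrative, not the study's actual figure):

```python
# Adjusted R-squared penalizes the fit for each predictor added:
# adj_R2 = 1 - (1 - R2) * (n - 1) / (n - k - 1)

def adjusted_r_squared(r_squared, n, k):
    """n = number of observations, k = predictors (excluding the intercept)."""
    return 1 - (1 - r_squared) * (n - 1) / (n - k - 1)

# e.g. 68 usable surveys and 3 predictors (ownership, social learning,
# ease of use); the 0.565 unadjusted R-squared is an illustrative value.
adj = adjusted_r_squared(0.565, 68, 3)
```

With only three predictors and 68 observations the penalty is small, which is why the adjusted and unadjusted figures for a model like this stay close together.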

Interview Results

After analyzing an early copy of the dataset, I conducted a number of interviews to clarify issues raised and to triangulate the overall results.

Participants were gathered from both the SISAT students who submitted Fall 2008 program portfolios as well as students from the 2nd session of the IT 509 course. I interviewed 5 SISAT students and 4 from an IT 509 section.

The interviews averaged 8 minutes in length. I added questions as needed to clarify student responses, and provided explanation if a question was not understood. The questions focused on 4 central issues: site purpose, control, ownership, and social learning.

Site Purpose

One theme that came up several times in the interviews was the importance of the way in which the site was presented to students. Several of the SISAT students said that the site was introduced to them as a purely course-related requirement, and not as a site for their own personal portfolio.
   It was not told to us as you can use the portfolio in any way you
   want ... Professors said it's a requirement, you need to do
   this.... There needs to be more encouragement from the professors

One mentioned that the "institutional" feel of the site prevented him from realizing that it could be used like Facebook or MySpace. Referring to the feeling of MySpace or Facebook, another said that it does "not have as much as that; it had an institutional feel."

Control

Most of the interviewed students said that they felt like they had control over their portfolio and the portfolio content. Several of them equated control with the potential for other people to insert things into their space, or on their own ability to put whatever they wanted into the site. One student said she had control "because I'm allowed to do certain things with the ePortfolio even if I don't use it." Another said that he had control, even though it was "tricky" to figure out where to place things. Another student said that he had "complete control, it was pretty much a blank slate for me to put in whatever I wanted."

Most of the interviewed students did not feel that they had control over the visual style of the portfolio. When asked to explain, most said that they had either not tried to use that feature, or that they did not know how to change the template. Some students seemed to think that they could not change the template, or that the institutional branding of the site meant that they did not have control over it.

A small minority disagreed, saying that they felt like they did have visual control. When asked to explain, they said that they had the ability to move things around the page, or that they had seen another student's page with a custom template. However, this group was definitely in the minority.

Ownership

While most students reported that they had control over their portfolio, a significant number said that they did not feel like they fully owned their site. Some of the ownership issues related to the institutional branding of the site; one student said, "It didn't look like Facebook or MySpace so we didn't realize that was what it was for." Faculty also presented the site as a course requirement, and not as a way for students to create a personal site. One student said:
   We use the portfolio only as a course requirement. You don't have a
   sense you want to go into it all the while ... It was how it was
   introduced to us.

Lastly, one student explicitly compared setting up a portfolio to using a social networking site. As he said, "I didn't feel like it was genuinely mine, like if I went and created a page somewhere... I didn't feel it was ownership, I felt like it was part of the system a little bit."

The students who did report a sense of ownership mentioned the ability to add pages and control content as the source of that feeling.

Social Learning

Some of the most useful data gathered concerned why students used the social learning features of the site.

First, the most common reason given for viewing peer work was to get ideas on how peers were approaching the assignment. Many students said something similar to how this tool let them "see how they're approaching the problem," or "what idea[s] they have that are different from my own." Some reported being surprised at what their peers wrote, with one saying:
   Some of the classmates answered very differently than mine ...
   [O]thers describe more. This is good; I didn't see it from this
   perspective before.

One student highlighted the usefulness of the feature for classes where the assignments were unclear, or more complicated (such as Finance or Accounting):
   [This feature would] be useful for when I didn't have an
   understanding of the assignment, then I can look at other peoples'
   work and can see how they interpreted it ... [It would] be good in
   a class where the assignments were tricky.

Second, many students reported that the tool acted as a benchmark. One student said that "I tried to always be better than someone else." Another student said:
   There was one time where I really wasn't sure of the approach to
   take ... I read someone else's, and ooh I wasn't going to write
   that much and prompted me to write more.

However, a strong minority of students reported that they did not use the tool in this way. Some said that they only looked at peer work after doing their own, or that there was not any peer work posted until after they completed the assignment.

Third, most students reported that the social learning feature did not help them learn the material better. As one said, "... as far as learning the material, that's more of a personal thing." Most students reported using the social learning aspects of the site after studying on their own. This contrasts with the survey results, where most students indicated that the site helped them learn the material.

Fourth, students had different reactions to the question asking if they got to know fellow students better through the social learning feature. Because the IT 509 course was only 7 weeks long, most of the MBA students said that the class was too short and intense to allow much peer bonding. As an example, one student stated:
   Not really, but ... that's the compressed time format of the class
   itself ... It goes by so quickly; it's all you can do to keep up
   with your assignments.

Some students did say that they got to know their peers better. Those tended to talk about how it helped them understand peer perspectives, views, and writing ability. As another student wrote:
   Yah it did a bit, you can see their grammar style and how they
   interpret the questions. Can tell you a lot about how they think
   and where they're coming from.

Lastly, students seemed to have a strong desire to downplay their usage of the social learning features. The word "copying" or the phrase "copying and pasting" in a question frequently caused the tone of the interview to change: students went from talking freely to single-word "yes" or "no" answers. This impulse seemed particularly strong in PhD and international students. Students seemed to feel a need to show that they could accomplish the work without peer assistance.


Discussion

The survey, interview, and usage results strongly support the predictions drawn from the background literature.

Design Model Validation

My portfolio design theory predicted that ownership, social learning, and ease of use are the central variables explaining students' portfolio adoption. The regression results indicate that these three variables predict satisfaction with a high degree of accuracy (adjusted R² = .574).
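The adjusted R² cited here penalizes the ordinary R² for the number of predictors in the model. As a quick illustration of the computation (using hypothetical Likert-scale data, not the study's dataset; the coefficient values are made up to loosely echo the coefficients table):

```python
import numpy as np

def adjusted_r2(y, y_hat, n_predictors):
    """Adjusted R^2: R^2 penalized for the number of predictors."""
    n = len(y)
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - np.mean(y)) ** 2)
    r2 = 1 - ss_res / ss_tot
    return 1 - (1 - r2) * (n - 1) / (n - n_predictors - 1)

# Hypothetical 1-5 responses: ownership, social learning, ease of use -> satisfaction
rng = np.random.default_rng(0)
X = rng.integers(1, 6, size=(69, 3)).astype(float)
y = X @ np.array([0.23, 0.37, 0.29]) + 0.2 + rng.normal(0, 0.5, 69)

# Ordinary least squares with an intercept column
A = np.column_stack([np.ones(len(X)), X])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)
print(round(adjusted_r2(y, A @ beta, n_predictors=3), 3))
```

With three predictors and 69 respondents the penalty is small, so the adjusted value sits just below the raw R².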

Social learning proved to be an important variable. Students clearly felt that the social learning aspects of the site improved their learning and peer relationships. Students felt that seeing the way in which peers approached the assignments helped them see the material from a different point of view.

Students' usage of the social aspects of the site was high, and the pop-ups proved to be an important way to expose students to peer work. Placing similar work in the edit screen is a novel design element that increased students' exposure to peers' work. While students may feel a need to downplay the extent to which their work drew on peers', their heavy usage indicates that the feature was useful.

Ease of use was a significant predictor of overall satisfaction in the regression model (β = .296, p = .006).

Design Implications

The major implication of this work is that portfolio software should be redesigned to be more open, social, and easy to use. Jettisoning the current database-driven view of portfolios, which emphasizes re-use over ease of use, is the first step designers should take in redefining the portfolio.

Next, portfolio designers and pedagogical or program leads should give greater weight to the social aspects of portfolios. Students are clearly motivated by social portfolios, learn the material better, and get to know their peers better.


Limitations

This research project has a number of limitations.

1. While two groups used the site, usable survey data were gathered from only 68 students. While this is a small sample, the responses were almost evenly split between the MBA and IT (MSIS/PhD) students, and the two groups' answers were similar enough that a group dummy variable in the regression was not statistically significant.

2. The SISAT students' experience of the portfolio features of the site may have been contaminated by their use of the blogging and file sharing features in other courses. This threat was mitigated by the instructions on the survey form, which asked students to answer questions based on their experiences with the portfolio tool. Interviews with students also indicated that their responses were based on portfolio usage.


Conclusion

The project demonstrates the validity of a new style of social ePortfolio.

1. Usage results indicate that students quickly learned how to view peer work, and consistently did so during their editing and submission process.

2. Survey results indicate that the design model is sound. The ownership, ease of use, and social learning constructs passed the Cronbach's alpha reliability test, and together predicted user adoption with an adjusted R² of 55%.

3. Interview results triangulated the survey results, and provided additional information on the reasons why students used the social learning features.
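The Cronbach's alpha check mentioned in point 2 measures the internal consistency of each multi-item survey construct. A minimal sketch of the computation (the item responses below are invented for illustration, not drawn from the study's data):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) matrix of scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of respondents' totals
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Hypothetical 1-5 responses from five students to three "ease of use" items
ease_of_use = [[4, 5, 4], [3, 3, 4], [5, 5, 5], [2, 3, 2], [4, 4, 5]]
print(round(cronbach_alpha(ease_of_use), 2))  # -> 0.92
```

Higher alpha means the items move together; values above roughly .70 are conventionally treated as acceptable reliability for a construct.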

Students enjoyed learning from peer work, and did so even when it was not required. Many reported being surprised at the way different people approached an assignment or topic, and said that looking at peer work caused them to reexamine their own approach.

This project provides consistent support for a new style of ePortfolio. Easy-to-use software that encourages students to learn from each other, and that provides a sense of ownership, will lead to improved adoption and learning. Instead of emphasizing assessment rigor, ePortfolio software should be designed to help students learn from each other.


References

Barrett, H. (2005). Researching Electronic Portfolios & Learner Engagement. Retrieved October 8, 2010.

Barrett, H., & Wilkerson, J. (2004). Conflicting Paradigms in Electronic Portfolio Approaches. Retrieved October 8, 2010.

Becta (2007). Impact Study of E-Portfolios on Learning. Retrieved October 8, 2010, from research/impact_study_eportfolios.pdf.

Beetham, H. (2005). E-portfolios in Post-16 Learning in the UK: Developments, Issues and Opportunities. Retrieved October 8, 2010.

Cameron, D., & Anderson, T. (2006). Comparing Weblogs to Threaded Discussion Tools in Online Educational Contexts. Instructional Technology & Distance Learning, 2(11). Retrieved October 8, 2010.

Camp, J., & DeBlois, P. (2007). Current Issues Survey Report. Educause Quarterly, 30(2). Retrieved October 8, 2010.

Carney, J. (2004). Setting an Agenda for Electronic Portfolio Research: A Framework for Evaluating Portfolio Literature. Paper presented at the American Educational Research Association Annual Meeting, April 12-16, San Diego, CA, USA.

Cohn, E., & Hibbitts, B. (2004). Beyond the Electronic Portfolio: A Lifetime Personal Web Space. Educause Quarterly, 27(4). Retrieved October 8, 2010, from BeyondtheElectronicPortfo/39884.

Davis, F. (1989). Perceived Usefulness, Perceived Ease of Use, and User Acceptance of Information Technology. MIS Quarterly, 13(3), 319-340.

ePortConsortium (2003). Electronic Portfolio White Paper Version 1.0. Retrieved October 8, 2010.

Goldsmith, D. (2007). Enhancing Learning and Assessment through E-Portfolios: A Collaborative Effort in Connecticut. New Directions for Student Services, 119(Fall), 32-42.

Hevner, A. R., March, S. T., & Park, J. (2004). Design Science in Information Systems Research. MIS Quarterly, 28(1), 75-105.

Jafari, A., McGee, P., & Carmean, C. (2006a). Managing Courses & Defining Learning: What Faculty, Students, & Administrators Want. Educause Review. Retrieved October 8, 2010.

Jafari, A., McGee, P., & Carmean, C. (2006b). A Research Study on Current CMS & Next Generation eLearning Environment. Retrieved October 8, 2010.

Lorenzo, G., & Ittelson, J. (2005a). Demonstrating and Assessing Student Learning with E-Portfolios. Educause Learning Initiative, ELI Paper 3. Retrieved October 8, 2010.

Lorenzo, G., & Ittelson, J. (2005b). An Overview of E-Portfolios. Educause Learning Initiative, ELI Paper 1. Retrieved October 8, 2010.

March, S. T., & Smith, G. F. (1995). Design & Natural Science Research on Information Technology. Decision Support Systems, 15, 251-266.

Murray, C., Smith, A., Pellow, A., Hennessy, S., & Higgison, C. (2006). From Application to Graduation and Beyond: Exploring User Engagement in the E-Portfolio Process. Paper presented at the European Distance and E-Learning Network Conference, June 14-17, Vienna, Austria.

Nunnally, J. (1978). Psychometric Theory. New York: McGraw-Hill.

Salzman, S., Denner, P., & Harris, L. (2002). Teacher Education Outcomes Measures: Special Study Survey. Paper presented at the Annual Meeting of the American Association of Colleges for Teacher Education, February 23-26, New York, NY, USA.

Simon, H. (1996). The Sciences of the Artificial (3rd ed.). Cambridge, MA: MIT Press.

Walls, J., Widmeyer, G., & El Sawy, O. (1992). Building an Information System Design Theory for Vigilant EIS. Information Systems Research, 3(1), 36-59.

Walls, J., Widmeyer, G., & El Sawy, O. (2004). Assessing Information System Design Theory in Perspective: How Useful Was Our 1992 Initial Rendition? Journal of Information Technology Theory and Application, 6(2), 43-58.

Zubizarreta, J. (2006). The Learning Portfolio: Reflective Practice for Improving Student Learning. Boston, MA: Anker.

Nathan Garrett

Assistant Professor of Information Technology, Woodbury University, 7500 Glenoaks Blvd., Burbank, CA 91510, USA
Figure 6. SISAT Site Activity


               1     2     3    4    5   6    7    8    9   10    11

save          12     4     9         2   1        13    9   22    29
view_popup    90   176    89    1   13       14    7    3   18    74
list_popup    62   184   136   13    4   1    2   32   17   71   215


              12    13    14    15   16   17    18    19    20   21

save          26    11    18     7   16    5    13    23     2    8
view_popup    50    34    39    31   15   48    49    34     1   29
list_popup   146    33    61    61   55   67   132   203   108   69


              22    23    24    25   26   27   28    29    30    31

save          12     5                         9     2
view_popup    10    20     1                   15    11    3
list_popup    54    51                         51    20


              1    2     3     4     5    6    7     8     9     10

save               7     10    20    5    5    5     1
view_popup         25    48    80    44   37   20    8
list_popup         97    65    163   57   38   64    9


             11    12    13    14    15   16

view_popup         2

Figure 7. IT 509 Pop-ups Viewed

     1     2     3      4     5     6     7

1   105   523   360    302   94    480   307
2   411   767   473    610   184   143   60
3                            198   369   612

Figure 8. IT 509 Saves

      1    2    3    4    5    6    7    8

1     39   90   31   69   10   59   26
2     62   41   29   47   25   36   7    33
3     84   85   77   70   35   41   67

Figure 9. SISAT Page Pop-up Viewing


             1     2    3    4    5    6    7    8    9    10

view_other   31   53    37   2    7         10   5    2    34
view_popup   90   176   89   1    13        14   7    3    18


             11   12    13   14   15   16   17   18   19   20

view_other   42   54    19   6    3    10   18   31   20   1
view_popup   74   50    34   39   31   15   48   48   34   1


             21   22    23   24   25   26   27   28   29   30   31

view_other   8     6    10   3    8    1         6    5    5
view_popup   29   10    20   1                   15   11   3


             1     2    3    4    5    6    7    8    9    10

view_other        36    48   83   9    27   28   1
view_popup        25    48   80   44   37   20   8


             11   12    13   14   15   16

view_other   3     3         1    2
view_popup         2

Figure 10. IT 509 Peer Work Viewed

     1     2     3     4     5     6     7

1   63    186   147   143   24    157   139
2   128   222   169   222   131   48    29
3                           77    198   352

Figure 11. Model Summary

R          R Square   Adjusted R Square   Std. Error of the Estimate

.752 (a)     .565           .545                     .535

Figure 12. ANOVA

Model        Sum of Squares   df   Mean Square     F        Sig.

Regression       24.202       3       8.067      28.148   .000 (a)
Residual         18.630       65      .287

Total            42.832       68

(a.) Predictors: (Constant), Ease of Use, Social Learning, Control

(b.) Dependent Variable: User Satisfaction

Figure 13. Coefficients

                   Unstandardized     Standardized
                    Coefficients      Coefficients

Model              B     Std. Error       Beta         t     Sig.

(Constant)        .225      .245                     .919    .362
Control           .230      .132          .178       1.749   .085
Social Learning   .371      .086          .428       4.318   .000
Ease of Use       .287      .102          .296       2.819   .006

(a.) Dependent Variable: User Satisfaction
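As a quick arithmetic check on Figures 12 and 13: the F statistic is the regression mean square divided by the residual mean square, and each t value is the unstandardized coefficient divided by its standard error. Small differences from the printed values reflect rounding in the tables themselves.

```python
# F statistic from the ANOVA table (Figure 12): MS_regression / MS_residual
print(f"F = {8.067 / 0.287:.2f}")  # printed table reports 28.148

# t statistics from the coefficients table (Figure 13): B / Std. Error
coefficients = {
    "Control": (0.230, 0.132),
    "Social Learning": (0.371, 0.086),
    "Ease of Use": (0.287, 0.102),
}
for name, (b, se) in coefficients.items():
    print(f"{name}: t = {b / se:.2f}")
```

The recomputed values (F ≈ 28.11; t ≈ 1.74, 4.31, 2.81) agree with the printed tables to within rounding error, which is a useful consistency check on extracted statistical output.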
COPYRIGHT 2011 International Forum of Educational Technology & Society
No portion of this article can be reproduced without the express written permission from the copyright holder.
Publication: Educational Technology & Society, January 1, 2011