Business School Survey

Methodology

By Joy Marie Sever, Ph.D., Senior Vice President and Director of the Reputation Practice at Harris Interactive

When The Wall Street Journal and Harris Interactive decided to work together to rate and rank business schools, we agreed on a five-part mission.

Our Mission

  1. To rate as many full-time M.B.A. programs as possible on as broad a range of characteristics as possible by as many recruiters as possible.
  2. To conduct this rating rigorously and without introducing bias that would unfairly affect particular schools.
  3. To create a rating experience that would be interesting and enjoyable to the recruiters taking part in the survey.
  4. To provide results that would be as useful as possible to students, recruiters, and the schools being rated.
  5. To be as open as possible with schools and recruiters about our methodology and our calculation of the scores and rankings.

The following information provides details relevant to points 1 to 4, and shows that we are committed to achieving point 5.

Overview

A total of 2,221 M.B.A. recruiters took part in Year 2 of The Wall Street Journal/Harris Interactive survey and provided a total of 3,641 school ratings. This figure represents a 39% increase in respondents from the Year 1 survey, published in April 2001, in which 1,600 respondents provided 2,687 school ratings. All interviews were conducted online and took place from October 23, 2001, to March 1, 2002.

As in Year 1, a total of 50 business schools are rated and ranked in the Year 2 survey, and each school was rated by a minimum of 20 recruiters; among the 50 ranked schools, the number of raters ranged from 26 to 156. School ratings are based on perceptions of the school and the school's students (80%) and on the school's "mass appeal" (20%), defined as the total number of respondents who recruit from that school.

Mission 1: To rate as many full-time M.B.A. programs as possible on as broad a range of characteristics as possible by as many recruiters as possible.

How we selected the U.S. and Non-U.S. schools for the survey

U.S. Schools: Our selection of U.S. schools was based on information obtained from AACSB International, an accrediting organization for business schools. We believed this list provided the best objective source of U.S. business schools. As of July 2001, there were 357 U.S. business schools accredited by AACSB.

Non-U.S. Schools: As we were unable to identify an objective list of non-U.S. business schools, we created the list through meetings and discussions with experts in the field of M.B.A. recruiting (including business school deans, business school associations, recruiters and career services directors).

Full-Time Requirement: The Wall Street Journal/Harris Interactive survey focuses on the opinions of recruiters who recruit full-time business school graduates. Because of that focus, schools eligible for rating in our survey not only had to offer a full-time program, but were required to have at least 50 full-time students in their graduating class of 2001. We defined the class of 2001 as students who graduated during the 2001 calendar year.

Our final sample included 187 U.S. schools and 73 non-U.S. schools. These 260 schools were eligible and available for rating by recruiters when the survey began Oct. 23, 2001.

How we developed the recruiter survey instrument

Our goal was to identify the school and student characteristics that recruiters consider most important when they make decisions about which schools to recruit from and which students to hire.

The 26 attributes in the Year 2 survey were:

School Attributes

  • Career services office at that school
  • Past acceptance rate of job offers from students at this school*
  • Quality of past hires*
  • Content of the core curriculum*
  • Overall value for the money invested in the recruiting effort
  • Faculty expertise*
  • Students’ average number of years of work experience
  • Willingness of the school’s students to relocate to the job location you require
  • Retention rate of past hires*
  • Strong international perspective+
  • Likelihood of recruiting "stars" – that is, graduates who are very likely to be promoted within the company+
  • School "chemistry" – that is, the general like or dislike you have of the school overall

Student Attributes

  • Communication and interpersonal skills
  • Original and visionary thinking
  • Leadership potential
  • Ability to work well within a team
  • Analytical and problem-solving skills
  • Strategic thinking
  • Ability to drive results
  • Fit with the corporate culture
  • Entrepreneurial skills
  • General management point of view
  • Student "chemistry" – that is, the general like or dislike you have of the student overall+
  • Awareness of corporate citizenship issues such as social and environmental responsibility+

Overall Attributes

  • Overall satisfaction with school in terms of meeting your recruiting needs+
  • Likelihood of returning to school for future recruiting needs+

* = Attribute modified from Year 1 survey 

+ = Attribute new to Year 2 survey

 

How we reached and surveyed recruiters

In July 2001, we contacted all eligible M.B.A. programs and asked them to provide us with their recruiter contact information. Recruiter lists were collected from schools in three phases from October 2001 to January 2002. The first phase captured recruiters who had been active at the business school since September 2000, and the subsequent phases accommodated updates and additions to this initial list.

We set and achieved a goal of obtaining recruiter lists from at least 50% of the U.S. and non-U.S. schools eligible for rating.

All recruiters were contacted by email and/or by regular mail and invited to take part in the survey. In addition to receiving general information in the invitation about The Wall Street Journal/Harris Interactive project, each recruiter received the URL to access the survey online and a unique password.

The Recruiter Hotline

Recruiters who wished to participate in the survey but had lost or misplaced their invitations were encouraged to call a toll-free number in order to gain access to the survey. The recruiter hotline was operational from Nov. 12, 2001, through March 1, 2002. Call agents assisted nearly 300 U.S. and international hotline callers interested in taking part in the survey.

 

Mission 2: To conduct this rating rigorously and without introducing bias that would unfairly affect particular schools.

The Rating Process

In order to create an unbiased ratings process, our methodology included the following features:

  • An opportunity for all full-time M.B.A. programs to be included in the ratings
  • An opportunity for all M.B.A. recruiters to be interviewed for the survey
  • An opportunity for recruiters to rate a school, even though that school may not have supplied that recruiter’s name (in other words, schools didn’t have control over which recruiters rated their school)
  • Rating criteria built by the "customers" who know the schools best–the recruiters
  • A ratings system covering a broad range of attributes including school, student and overall characteristics
  • A ratings system that acknowledges the importance of the number of recruiters recruiting from a particular school as a reflection of the bond that a school has with its recruiters and the reach the school has in attracting recruiters
  • An online survey methodology that employs password protection to ensure that no recruiter could take the survey more than once
  • An audit of the results to ensure that only legitimate recruiters were taking part in the survey
  • An opportunity for recruiters to rate more than one school. Every recruiter was given the option of rating up to three schools (59% rated one school, 18% rated two schools, and 23% rated three schools)
  • A selection process whereby recruiters are randomly assigned to rate up to three schools from among all those schools at which they recruit. Each recruiter was asked to identify all schools with which he or she had recruiting experience since September 2000. For those recruiters identifying more than three schools, a school-selection process for the ratings phase was instituted in which recruiters were randomly assigned to rate three schools from among all those at which they recruit. (Note: Randomly assigning recruiters to rate schools represents a change from our Year 1 approach, in which recruiters were able to rate the schools they wanted to rate. Our analysis showed that school ratings and rankings [Year 1 vs. Year 2] were not unfairly affected by this change to our methodology.)
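
As an illustration of the random school-selection step described in the last point above, the sketch below (in Python, with hypothetical school names) shows one way such an assignment could work; it is not the actual code used for the survey.

    import random

    def assign_schools_to_rate(schools_recruited_from, max_ratings=3):
        # Recruiters listing three or fewer schools rate all of them;
        # otherwise three schools are drawn at random from their list.
        if len(schools_recruited_from) <= max_ratings:
            return list(schools_recruited_from)
        return random.sample(schools_recruited_from, max_ratings)

    # Example: a recruiter who named five schools is assigned three at random.
    print(assign_schools_to_rate(["School A", "School B", "School C",
                                  "School D", "School E"]))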

 

Mission 3: To create a rating experience that would be interesting and enjoyable to the recruiters taking part in the survey.

The results of The Wall Street Journal/Harris Interactive Recruiter Survey exist because of the 2,221 recruiters who chose to complete our survey. Creating a positive survey experience was key to the success of the project. A number of steps were taken to make this possible.

  • Extensive pre-testing: We interviewed recruiters before the official start of the field period to assess the overall survey experience and the maximum number of schools recruiters would feel comfortable rating
  • Respect for recruiters’ time: Our goal was to create a survey that could be completed by most recruiters in 15 to 20 minutes. Overall, 66% of respondents indicated that the length of the survey was "about right."
  • Creation of survey options: We offered recruiters an option to take a short form of the survey (a version that could be completed in about 10 minutes).
  • Continually monitoring recruiters’ satisfaction with our survey: Overall, 84% of the recruiters rated our survey as either "excellent" or "good," which is slightly above the Year 1 figure of 80%.
  • Recruiter incentives: U.S., Canadian (non-Quebec) and U.K. recruiters invited to participate in the survey had the opportunity to enter a sweepstakes to win four prizes of $500 and 150 copies of the book "Jobs Rated Almanac 2002" from the editors of CareerJournal.com, The Wall Street Journal website for managers, executives and professionals. Sweepstakes winners were determined by random drawings among eligible entries.

 

Mission 4: To provide results that would be as useful as possible to students, recruiters and the schools being rated.

Not only do the findings from our survey reflect the perceptions of a key constituency (recruiters), but they are based on the characteristics that recruiters told us matter most when deciding which schools to recruit from and which students to hire. Moreover, the broad range of recruiters who took part in our survey gives each school the opportunity to look more closely at recruiters from particular industry segments.

Results from The Wall Street Journal/Harris Interactive Survey are available in a number of formats:

  • The book, "The Wall Street Journal Guide to the Top Business Schools, 2003," published by Wall Street Journal Books, an imprint of Simon & Schuster
  • A special Wall Street Journal newspaper section and Journal websites, including CareerJournal.com and WSJbooks.com
  • Research reports prepared by Harris Interactive

 

Mission 5: To be as open as possible with schools and recruiters about our methodology and our calculation of the scores and rankings.

A. The calculation of The Wall Street Journal/Harris Interactive 2002 score

To be rated and ranked in The Wall Street Journal/Harris Interactive Survey, each school required a minimum of 20 recruiter ratings. A school rating was calculated for all schools reaching that minimum. The final scores used to rank schools in the 2002 survey used a combination of results from the current year (Year 2) and the previous year (Year 1).

Step 1: Calculation of Current Year Results

Each school rating was based on three components:

Perception A: The perceptions of the school on the 12 school attributes and 12 student attributes

Perception B: The perceptions of the school on the two overall attributes

Mass Appeal: The number of recruiters indicating that they recruit from the school

Calculating Current Year Perception A (75%)

Current year Perception A is based upon recruiter ratings in the Year 2 survey for each school on the 12 school and the 12 student attributes. Each recruiter was asked to rate the school using a 10-point scale where "1" equals "poor performance, does not meet your needs" and "10" equals "excellent performance, meets your needs very well."

    1. Each individual recruiter’s rating (from 1 to 10) on each of the 24 individual attributes was multiplied by the importance that recruiter gave that attribute ("1," "not at all important," to "4," "very important").
    2. For each individual recruiter, the sum across the 24 attributes (importance rating multiplied by the rating of the school on that attribute) was divided by the total score possible. The total score possible is the total number of attributes answered (maximum of 24) multiplied by 10 (the maximum rating possible) multiplied by 4 (the maximum importance that could have been given to an attribute).
    3. The current year Perception A score for each school was calculated by summing the Perception A ratings from all recruiters rating that school, divided by the total number of recruiters for that school and multiplied by 75 (to reflect the 75% contribution made by Perception A).
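
As a rough illustration of these three steps, the sketch below (in Python) computes a current year Perception A score from hypothetical recruiter data; the data structure, attribute names, and values are assumptions made for the example, not the actual survey data.

    def perception_a_score(recruiter_ratings, weight=75):
        # recruiter_ratings: one dict per recruiter, mapping each answered
        # attribute to a (rating 1-10, importance 1-4) pair.
        per_recruiter = []
        for answers in recruiter_ratings:
            # Step 1: importance-weighted rating for each answered attribute.
            weighted_sum = sum(rating * importance
                               for rating, importance in answers.values())
            # Step 2: divide by the total score possible for this recruiter:
            # attributes answered x 10 (max rating) x 4 (max importance).
            max_possible = len(answers) * 10 * 4
            per_recruiter.append(weighted_sum / max_possible)
        # Step 3: average across recruiters and scale to the 75% contribution.
        return sum(per_recruiter) / len(per_recruiter) * weight

    # Hypothetical example: two recruiters, each answering two attributes.
    recruiters = [
        {"faculty_expertise": (8, 4), "leadership_potential": (7, 3)},
        {"faculty_expertise": (9, 4), "leadership_potential": (6, 2)},
    ]
    print(round(perception_a_score(recruiters), 2))  # about 47.34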

Calculating Current Year Perception B (5%)

    1. The current year Perception B score consists of two attributes— "overall satisfaction with the school" and "likelihood of recruiting from the school in the next two years." To create the current year Perception B score, individual recruiter’s ratings (from 1 to 10) on the two overall attributes were summed and divided by 4 (with the maximum score possible of 5, reflecting the 5% contribution made by this component).
    2. In cases where recruiters answered only one of the two overall attributes, their rating on the answered attribute was divided by 2 (again to reflect a maximum possible rating of 5 for the Perception B component).
    3. If both overall attributes were left unanswered by a recruiter for a particular school, no Perception B rating was contributed by that recruiter for that school for the current year.
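
A minimal sketch of the per-recruiter Perception B rules described above, again with hypothetical ratings:

    def perception_b_score(overall_ratings):
        # overall_ratings: the recruiter's ratings (1-10) on the two overall
        # attributes; None marks an unanswered attribute.
        answered = [r for r in overall_ratings if r is not None]
        if len(answered) == 2:
            return sum(answered) / 4   # maximum 20 / 4 = 5
        if len(answered) == 1:
            return answered[0] / 2     # maximum 10 / 2 = 5
        return None                    # no Perception B contribution this year

    print(perception_b_score([9, 8]))        # 4.25
    print(perception_b_score([7, None]))     # 3.5
    print(perception_b_score([None, None]))  # None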

Calculating Current Year Mass Appeal (20%)

The total number of recruiters who indicated that they recruit from each school was divided by 491 (which was the highest number of recruiters recruiting from any one school in the current year) and multiplied by 20 (to reflect the 20% contribution made by the mass appeal component).
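
In code form, the current year mass appeal component amounts to a simple rescaling; the sketch below uses the 491 figure quoted above and a hypothetical recruiter count.

    def mass_appeal_score(n_recruiters, max_recruiters=491, weight=20):
        # Rescale the school's recruiter count against the largest count (491)
        # and apply the 20% weight.
        return n_recruiters / max_recruiters * weight

    print(round(mass_appeal_score(491), 2))  # 20.0 for the most-recruited school
    print(round(mass_appeal_score(150), 2))  # about 6.11 for a hypothetical school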

Step 2: Calculation of Final 2002 Scores (current year plus previous year combined)

Calculating the Final Perception Score (80% of the final 2002 score)

    1. The final perception score was calculated by summing total perception scores (Perceptions A + B) from the previous year (Year 1) and the current year (Year 2) and dividing by 2 (with the maximum possible score of 80, reflecting the 80% contribution made by this component to the final 2002 score).
    2. In cases where schools received fewer than 20 ratings in the previous year, their current year perception scores were used as their final perception scores.
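
The two-year averaging rule, including the fewer-than-20-ratings exception, could be sketched as follows (score values are hypothetical):

    def final_perception_score(current_year, previous_year=None,
                               previous_year_ratings=0):
        # Schools with fewer than 20 Year 1 ratings use the Year 2 score alone;
        # otherwise the two years are averaged.
        if previous_year is None or previous_year_ratings < 20:
            return current_year
        return (previous_year + current_year) / 2

    print(final_perception_score(62.0, 58.0, previous_year_ratings=45))  # 60.0
    print(final_perception_score(62.0, previous_year_ratings=12))        # 62.0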

Calculating the Final Mass Appeal Score (20% of the final 2002 score)

    1. An average mass appeal score was calculated for each school by summing the previous year and the current year mass appeal scores and dividing by 2.
    2. To ensure that the school with the highest combined mass appeal received a score of 20, each school's average mass appeal score was divided by 18.24 (the highest average mass appeal score achieved by any one school) and multiplied by 20, reflecting the 20% contribution that the mass appeal component makes to the final school score.
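
A sketch of this rescaling, using the 18.24 figure quoted above and hypothetical year-by-year scores:

    def final_mass_appeal(current_year, previous_year,
                          highest_average=18.24, weight=20):
        # Average the two years, then rescale so the school with the highest
        # two-year average (18.24) receives the full 20 points.
        average = (previous_year + current_year) / 2
        return average / highest_average * weight

    print(round(final_mass_appeal(18.50, 17.98), 2))  # 20.0 for the top school
    print(round(final_mass_appeal(6.0, 5.0), 2))      # about 6.03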

The Final 2002 Score (Perception plus Mass Appeal)

The final school score is based on the sum of the combined perception score (from the previous and current years) and the combined mass appeal score (from the previous and current years). The maximum score possible is 100.
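
Putting the components together for a hypothetical school (component values assumed for illustration):

    final_perception = 60.0   # combined Perception A + B score, maximum 80
    final_mass_appeal = 9.5   # combined mass appeal score, maximum 20
    final_score = final_perception + final_mass_appeal
    print(final_score)        # 69.5 on the 100-point scale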

Before finalizing the results, recruiter ratings for each school were individually examined, and outlier ratings were removed so that no individual recruiter could exert extreme positive or negative influence on a school's final score. Once outlier ratings were removed, all schools with at least 20 recruiter ratings were rank-ordered by their final school score. The 50 schools ranked in The Wall Street Journal/Harris Interactive survey are the 50 schools with the highest overall scores.

The final school scores in the 2002 survey range from a low of 58.8 to a high of 75.2.

B. How the results should be interpreted

The results of The Wall Street Journal/Harris Interactive Recruiter Survey are based on a total sample of 2,221 recruiters. When reported results are based on the entire sample, differences of plus or minus three percentage points can be considered statistically significant at the 95% confidence level. Ratings for each school, however, are based on smaller sample sizes (ranging from 26 to 156). While we believe that the final sample of recruiters rating each school can be considered representative of recruiters for that school, these smaller sample sizes limit the extent to which differences between individual schools can be described as statistically significant.
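
For readers who want to see the arithmetic behind this caution, the sketch below applies the standard simple-random-sampling margin-of-error formula (95% confidence, assuming p = 0.5) to the full sample and to the smallest and largest per-school samples; the plus-or-minus-three-point figure quoted above for the full sample is more conservative than this textbook calculation.

    import math

    def margin_of_error(n, p=0.5, z=1.96):
        # Textbook margin of error, in percentage points, for a proportion p
        # estimated from a simple random sample of size n.
        return z * math.sqrt(p * (1 - p) / n) * 100

    print(round(margin_of_error(2221), 1))  # about 2.1 points (full sample)
    print(round(margin_of_error(156), 1))   # about 7.8 points (largest school sample)
    print(round(margin_of_error(26), 1))    # about 19.2 points (smallest school sample)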

C. How the results of open-ended nominations were tabulated

In addition to the results that make up the final 2002 school score for each school, the recruiter survey also included a variety of nomination questions (e.g., academic specializations, campus environments). Respondents could select schools from the list of all 260 eligible M.B.A. programs. Results from a particular school in a particular nomination category in the current (Year 2) survey were tallied and combined with the nominations for that category from the previous year (Year 1). Schools were then rank-ordered based on the total number of nominations received from the current and previous years for each category. Results of these nomination questions were not included in the results on which the 2002 rankings are based.
