Recently, I've been looking at the Open Doors data and how we use this data when talking about study abroad participation rates in the U.S. Open Doors institutional rankings by participation rate are calculated based on the total number of undergraduate degrees conferred as reported in IPEDS data (Integrated Postsecondary Education Data System at the National Center for Education Statistics, U.S. Department of Education) [see April 16, 2012 update note below]. Frequently, researchers, practitioners and the press state that approximately 2% (or less) of U.S. students study abroad each year. My calculation for the 2003-2004 academic year is that 7% of U.S. students studied abroad, based on the total number of degrees conferred in the U.S. My calculation for the 2004-2005 academic year (based on estimated IPEDS data) is that 7.4% of U.S. students studied abroad. My data follow:
2003-2004 - NCES/IPEDS and Open Doors data
Total degrees awarded in U.S.: 2,711,661
Associate's = 665,301
Bachelor's = 1,399,542
Master's = 558,940
1st Professional = 83,041
Ph.D. = 48,378
Total U.S. higher education enrollment = 16,681,877
Total U.S. students studying abroad = 191,321
U.S. Study Abroad Participation Rate Using Total U.S. Higher Education Enrollment
191,321 study abroad students / 16,681,877 higher education enrollment = 1.1% study abroad participation rate
U.S. Study Abroad Participation Rate Using Total U.S. Degrees Awarded
191,321 study abroad students / 2,711,661 degrees awarded = 7.0% study abroad participation rate
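For anyone who wants to check the arithmetic, here is a minimal sketch (my own illustration, not anything from Open Doors or NCES; the names are purely illustrative) that reproduces the roughly 1.1% and 7% rates above from the 2003-2004 figures:

```python
# A minimal sketch using the 2003-2004 figures quoted in this post. It computes
# a participation rate two ways: against total enrollment and against total
# degrees awarded.

study_abroad_total = 191_321       # U.S. students studying abroad, 2003-04 (Open Doors)
total_enrollment = 16_681_877      # total U.S. higher education enrollment
total_degrees_awarded = 2_711_661  # total degrees awarded in the U.S. (NCES/IPEDS)

def participation_rate(portion: int, total: int) -> float:
    """Participation rate as a percentage: portion divided by total, times 100."""
    return 100 * portion / total

print(f"Enrollment-based rate: {participation_rate(study_abroad_total, total_enrollment):.1f}%")
print(f"Degree-based rate:     {participation_rate(study_abroad_total, total_degrees_awarded):.1f}%")
```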
To be sure, a 7% participation rate is still very low, and the significance of this roughly five-percentage-point difference is debatable. I think it's important, however, for the field to be consistent in how we talk about and report participation data; and since the number of U.S. students studying abroad continues to grow, so does the participation rate.
Note, April 16, 2012: Earlier today I posted the following to Twitter: "It is incorrect to say that only 1% of U.S. students study abroad. The percentage is closer to 10% than to 1%. Still low but not 1%..." This tweet generated some retweets and some messages back asking where I came up with 10%, and this "discussion" of course made me happy, as I like to see and hear people engaging and thinking critically about the field! So, I thought I would post this update as a way to continue the debate and dialogue on data collection efforts in the field.
The reason I used NCES/IPEDS data to calculate overall study abroad participation rates [in my original post above] is that this is the data set IIE Open Doors uses to calculate institutional participation rates. Footnote 1 on p. 20 of the 2009 Open Doors Report states that "the estimated undergraduate study abroad participation rate is calculated by dividing the undergraduate study abroad total by the number of undergraduate degrees conferred (as reported in IPEDS)".
The 2011 Open Doors "Fast Facts" [using 2011, as these data were not previously presented] provides the following breakdown:
U.S. higher education system: 270,604 (U.S. Study Abroad Total) ÷ 19,805,000 (U.S. Higher Education Total) = 1.4%
U.S. undergraduates: 233,169 (U.S. Study Abroad Total) ÷ 2,452,218* (U.S. Higher Education Total) = 9.5%
U.S. undergraduates pursuing bachelor's degrees: 230,752 (U.S. Study Abroad Total) ÷ 1,642,979* (U.S. Higher Education Total) = 14.0%
* Total undergraduate degrees awarded [assuming that this is NCES/IPEDS data]
To be honest, I'm not a fan of the Open Doors methodology of calculating institutional study abroad participation rates using NCES/IPEDS degree-conferral data. I don't think we get an accurate picture of institutional study abroad participation rates by using the total number of degrees granted. If, for example, an institution sends undergraduates abroad at all levels (freshman/first-year, sophomore/second-year, junior/third-year, senior/fourth-year and beyond), how can we determine a study abroad participation rate by dividing the total of all these students by the number who have their degrees conferred (seniors)? We can't, in my opinion.
This is, however, how IIE Open Doors calculates and presents participation data, so this is what I used in my argument above.
Personally, I think the best way to calculate a participation rate is to take a portion [the number of U.S. students who studied abroad] and divide that figure by the total [the entire higher education enrollment]. This does, in fact, bring the participation rate back closer to 1%. I think the 2011 Open Doors "Fast Facts" could use some additional figures, and here is one that I think would be helpful:
233,169 (total U.S. undergraduates who studied abroad in 2009/10) ÷ 17,565,300 (total undergraduate enrollment in the U.S. in 2009) = 1.3%.
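As a rough sketch of the portion-over-total approach I prefer (again my own illustration; the variable names are not from Open Doors), the same 2009/10 undergraduate figures give very different rates depending on the denominator:

```python
# A minimal sketch, assuming the 2009/10 Open Doors and NCES figures quoted above.
# It contrasts the enrollment-based rate I prefer with the degree-based rate.

undergrad_study_abroad = 233_169   # U.S. undergraduates who studied abroad in 2009/10
undergrad_enrollment = 17_565_300  # total U.S. undergraduate enrollment, 2009
undergrad_degrees = 2_452_218      # undergraduate degrees conferred (IPEDS)

rate_vs_enrollment = 100 * undergrad_study_abroad / undergrad_enrollment  # roughly 1.3%
rate_vs_degrees = 100 * undergrad_study_abroad / undergrad_degrees        # roughly 9.5%

print(f"Against total enrollment:  {rate_vs_enrollment:.1f}%")
print(f"Against degrees conferred: {rate_vs_degrees:.1f}%")
```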
Above are various thoughts on, and configurations of, how to think about, calculate and present study abroad participation numbers.
What are your thoughts?
Tuesday, March 6, 2007
Standards of Good Practice in the Field of Education Abroad
Recently during my research, I came across an historical article (1967) that was one of the first to address the development of standards in the field of study abroad. The citation is as follows:
Durnall, Edward J. (1967, November). Study-Abroad Programs: A Critical Survey. The Journal of Higher Education, 38 (8), 450-453.
In this brief four-page article, Durnall discusses his survey of undergraduate programs in Europe conducted by U.S. institutions and describes the methods used in his program evaluation, which drew on six of fifteen principles developed at a conference on study abroad held at Mount Holyoke College in 1960.
I just returned from the 3rd annual Forum on Education Abroad conference in Austin, Texas, whose theme was Standards in a Diverse World: The Future of Education Abroad. There, the Mount Holyoke conference was mentioned and discussed by colleagues such as Bill Hoffa, whose History of Study Abroad, Volume I: Beginnings to 1965 was distributed to all conference attendees.
While some of the material in Durnall’s article is dated, I find the following comment by Durnall to remain valid today: “while it would be hoped that all institutions with study-abroad programs would voluntarily examine their programs in the light of commonly accepted standards and either make the necessary improvements to meet these standards or discontinue the programs, the realities of higher education in the United States today make this an unlikely event.”