The recent announcement of plans to drop four varsity sports from the University at Buffalo’s (UB) roster of Division 1 offerings pretty much puts a face on what counselors and other admissions professionals have been warning about: the impact of funding cuts on both public and private institutions.
UB recently revealed that men’s soccer, men’s swimming and diving, baseball and women’s rowing will no longer be sponsored. This decision affects 120 students currently on team rosters (30 other students on those rosters will graduate this year). Although UB’s athletes have been offered the opportunity to stay in school with scholarships intact, the reality is they won’t get to compete at the D1 level. And competition for athletes who have spent a lifetime honing skills is pretty fundamental.
For these athletes, as for department heads facing academic program cuts, it’s no secret that college administrators and boards are increasingly being asked to make hard choices as they struggle with demographic and economic realities in a battle for long-term survival and institutional health.
And a divide is opening between financially healthy colleges and those that are not, making it imperative for students and their parents to understand how financial constraints affect colleges, application processes, and admissions decisions.
Given the current economic climate, here are some questions colleges don’t always like to answer:
- How has the admissions office been affected by budget cuts?
Even in the face of increased numbers of applications to process, admissions budgets aren’t growing. As a result, admissions offices are making do with less. Glossy view books and travel allowances are becoming scarce, as colleges seek additional ways to trim budgets while continuing to respond to front office demands for more applicants. With tight budgets to manage, colleges are increasingly relying on enrollment management programs to guide and support the admission process, effectively allowing technology to take over recruitment and some elements of application review. As a result, students need to understand that their privacy is constantly under attack by colleges attempting to probe both qualifications and interest. Toward this end, seemingly benign third-party organizations seek to obtain and resell key pieces of information, ranging from standardized test scores to family income, to colleges hungry for data that can be fed into algorithms designed to assess credentials and guess at likelihood of enrollment. In other words, through skillful use of technology, admissions offices are not only saving money but also manipulating metrics important to ranking and outside perceptions of “quality”—both vital to long-term institutional health.
- Has the application process been affected?
To gain better control over the process and factors affecting selectivity and “yield” (the percent of students accepting an offer of admission), colleges are experimenting with different early action and binding early decision plans. Rather than setting up a process that encourages a single windfall of applications late in the season, admissions offices are looking for a more even distribution of work from September to May. And the appeal of early decision candidates committed to attending at the front end of the process is undeniable for both management and yield. Some colleges find it more efficient to force hard decisions earlier by denying larger percentages of early applicants—it takes time and money to read and re-read applications. Others prefer to keep all options on the table by rolling large numbers of applicants into the regular pool. And given uncertainties inherent in a process that indiscriminately recruits and makes it relatively easy to submit applications, colleges look for ways to cover all bets by enlarging and employing wait lists—secret weapons in the battle to improve yield and control investment in financial aid. Seeking an early understanding of policies and being aware of the institutional incentives behind these policies may help guide application strategies. But given the number of uncertainties affecting budgets, staffing and priorities, don’t be surprised if what you thought you knew is no longer true. It’s not unusual for colleges to make substantial changes in application procedures—sometimes late into the year. So feel free to ask the question.
- Are priorities changing in financial aid?
While the new timeline imposed by an October 1 FAFSA start date and the use of “prior-prior year” income information for determining awards suggest a more sensible and timely approach to financial aid, the jury is still out as to how successful the new plan will be for both students and institutions. At the same time they are dealing with various logistical issues, colleges formerly boasting of “need-blind” admissions or “no loan” packaging are reassessing their policies to ensure adequate financial aid resources remain available to the greatest number of students. Most but not all colleges offer merit scholarships that are important recruitment tools in the process. But variations in the balance between grants and loans in financial aid packages make some colleges appear more generous than they really are. It’s not unusual for colleges to engage in “gapping” (not covering full need) when offering financial aid, but the gaps appear to be getting larger. And be aware that not all colleges guarantee merit scholarships for four full years. To save money without harming published freshman retention rates, colleges may not continue scholarships after two years—even if all academic requirements have been met. Although it really pays to be a savvy shopper before applying and committing to a school, keep in mind that financial aid offices ultimately hold all the cards and their incentive is to keep costs low while at the same time recruiting top prospects. Understanding the institution’s approach to financial aid from the very beginning could save disappointment later.
- Are budget cuts affecting programs?
Ask Buffalo’s baseball players or Temple’s rowers or the swimmers at the University of Maryland why this may be important. While some cuts cannot be anticipated, others may be planned and colleges have a responsibility to make them public. Be aware that the question isn’t limited to sports. Responding to increased pressure to emphasize more marketable majors, colleges are re-configuring programs—cutting some and adding new opportunities. At a more basic level, colleges may be quietly increasing class size, making it more difficult to get into some majors, relying more heavily on teaching assistants (TAs), or offering specific classes less often—even eliminating them altogether. Short of finding that a program or major has been done away with, students may experience difficulty finishing in four years if classes are overloaded or simply unavailable, especially in areas where coursework is highly sequenced. And if the prospect of transferring sometime in your undergraduate career doesn’t appeal, make sure the programs (including athletic) in which you are interested are on firm footing with the institution.
- Will there be changes in requirements for graduation?
Sometimes this can work in your favor. Loyola University of Chicago reduced the number of credit hours required for graduation from 128 to 120. But because AP/IB or other outside college credits earned during high school can mean significant money both to you and the institution, take the time to see how these credits may be applied (toward graduation or specific majors) and ask if the college anticipates changes in these kinds of arrangements. For example, Dartmouth no longer grants credit for AP or IB examinations. Placement and some exemptions may be offered instead. In other words, Dartmouth can now count on four years of tuition payments from undergrads. And the questions can be even more complex involving credit for internships, co-ops or research. If the goal is to graduate in four years or less, it’s worth investigating if there are plans under consideration that might affect your ability to graduate on time.
- What is the impact on student services?
Applicants don’t always take into account the real value of the student services component when considering colleges. As schools discover they can make money from room and board packages, students may find themselves limited by restrictive housing policies and meal plans. For lots of different reasons—including financial—colleges are limiting students to on-campus housing for more years. The more captive the audience, the less risk involved in building glamorous new facilities. But beyond day-to-day living, services also include everything from library or gym facilities and hours, to tech support, career advising, health/mental health services or academic support for writing centers and math labs. These should be “growing” operations, and if they aren’t, budget cuts in these areas might be concerning.
Because colleges won’t always volunteer the information, it’s important that you do some in-depth research and ask the questions necessary to understand potential game changers.
Make it your mission to test whether the college “experience” promised today will be there four years from now, and make sure the process by which you get there is clear.
For the admissions office, the wait list is a critical tool used to control the flow of students admitted to the institution. But for the applicant who has waited six long months for a decision, the wait list feels like a one-way ticket to nowhere.
And for students manipulated by enrollment management systems designed to attract thousands only to admit a select few, all we can say is, “Welcome to purgatory.”
The wait list scenario is particularly frustrating for the subset of applicants who were organized enough to submit early—Early Action, Early Action II, Single Choice Early Action, Restricted Early Action, Early Decision I or even Early Decision II—only to find themselves sitting on one or several wait lists.
And despite what “experts” might say, waitlisted students can only rely on anecdotal evidence as to what has worked in the past to move an application from wait list to admit. What may have been successful last year won’t necessarily work this time. There are just too many factors at play.
But hope springs eternal.
For the most part, colleges are unapologetic about using the hopes of waitlisted students to further enrollment goals designed to fill freshman classes with the best, brightest and most highly qualified high school students.
And those familiar with the game know the wait list is used to shape a class profile that aspires to be balanced between males and females, is geographically and racially diverse, meets legislated residency requirements, fills the needs of obscure departments or sports teams, and still covers some part of the college operating budget.
“Essentially, the wait list exists to accommodate for demographics that were not met in the initial round of admission offers,” explains Richard Clark, director of undergraduate admissions for Georgia Tech, in a blog post titled, The Wait List Sucks. “If you have the right number of deposits from the West coast, you go to your wait list for more East coast students. If you have enough Chemistry majors, you may be going to the wait list for Business students. Ultimately, the job of admission deans and directors is to make and shape the class, as defined by institutional priorities. Meeting target enrollment is critical to bottom line revenue, creating a desired ethos on campus, proliferating the school’s brand, and other factors.”
For the record, wait lists are almost never prioritized and are almost always unpredictable.
And all too often, schools promoting “need-blind” admissions quietly convert to “need-sensitive” when it comes to plucking a few lucky students from the list. Consequently, most bets are off for financial aid if you come through the wait list.
In other words, there’s no ranking, no money, and not much hope.
Sometimes, the list is hardly more than a thinly disguised public relations scam designed to keep agitated parents, alums, and other interested parties at arm’s length. It represents a political solution to an uncomfortable situation.
We can all agree that waitlisted is not a great place to be. If you’ve been accepted or rejected, your status is clear. You can move on with your life. But waitlisted is living with uncertainty.
And at the end of the day, very few waitlisted students are invited to the dance.
Here are some 2016-17 Common Data Set statistics (Question C2) published by a handful of colleges and universities:
Waitlisted: 1269 (582 accepted places on the wait list)
Admitted: 3 (33 in 2015; 61 in 2014; 49 in 2013)
Waitlisted: 1615 (1340 accepted places)
Admitted: 59 (6 in 2015; 21 in 2014; 41 in 2013)
Carnegie Mellon University
Admitted: 7 (4 in 2015; 73 in 2014; 87 in 2013)
College of William and Mary
Waitlisted: 4115 (2037 accepted places)
Admitted: 154 (187 in 2015; 59 in 2014; 96 in 2013)
Waitlisted: 4571 (2874 accepted places)
Admitted: 61 (81 in 2015; 96 in 2014; 168 in 2013)
Waitlisted: 2064 (1194 accepted places)
Admitted: 16 (129 in 2015; 0 in 2014; 87 in 2013)
Waitlisted: 810 (238 accepted places)
Admitted: 29 (0 in 2015; 0 in 2014; 10 in 2013)
George Mason University
Waitlisted: 1218 (839 accepted places)
Admitted: 200 (350 in 2015; 684 in 2014; 252 in 2013)
Waitlisted: 2184 (1249 accepted places)
Admitted: 149 (114 in 2014; 82 in 2013)
*2016-17 data is not being made available
Waitlisted: 102 (46 accepted places)
Admitted: 20 (7 in 2015; 8 in 2014; 2 in 2013)
James Madison University
Waitlisted: 2560 (1585 accepted places)
Admitted: 205 (500 in 2015; 166 in 2014; 405 in 2013)
Waitlisted: 1237 (840 accepted places)
Admitted: 18 (39 in 2015; 41 in 2014; 33 in 2013)
University of Michigan
Waitlisted: 11,197 (3970 accepted places)
Admitted: 36 (90 in 2015; 91 in 2014; 89 in 2013)
University of Richmond
Waitlisted: 3209 (1236 accepted places)
Admitted: 60 (151 in 2015; 12 in 2014; 95 in 2013)
University of Virginia
Waitlisted: 4987 (2871 accepted places)
Admitted: 360 (402 in 2015; 42 in 2014; 185 in 2013)
Waitlisted: 5452 (2677 accepted places)
Admitted: 26 (50 in 2015; 464 in 2014; 350 in 2013)
Waitlisted: 2118 (1544 accepted places)
Admitted: 0 (750 in 2015; 110 in 2013)
Washington and Lee University
Waitlisted: 1529 (652 accepted places)
Admitted: 48 (193 in 2015; 72 in 2014; 96 in 2013)
Waitlisted: 2343 (864 accepted places)
Admitted: 24 (53 in 2015; 70 in 2014; 44 in 2013)
Numbers vary by year depending on how accurately the admissions office pegged its “yield” or how desperate the need to control the composition of the freshman class. For colleges with unfilled seats after May 1st, the pool of waitlisted students is like a candy jar from which they can pick and choose depending on wants and needs.
“The wait list is a reminder that I’m not very smart,” continues Clark. “If I were better at my job, I could predict exactly how many students each year would accept our offer of admission.”
Sure there are steps you can take to try to get off the list—write a letter, get another recommendation, meet with an admissions rep—but there is an emotional cost which must be factored in.
“This is probably the toughest decision to get from a school,” explains Dean J, in her UVa admission blog. “For now you need to look at your other options and think about which one feels right to you. Some of you will want to hold on and see what happens with the waiting list and others will want to fully invest themselves in another school.”
There is no right or wrong here—only what is right for the individual student.
But is the list generally worth the wait?
Sometimes, but not usually.
Despite whatever feelings he has about the ACT, Georgetown’s admissions dean Charles Deacon concedes that the highly-selective university saw an increased number of students taking and submitting ACT scores this year. According to The Hoya, Georgetown’s student-run newspaper, the number of students submitting ACT scores was about even with those submitting SAT scores among this fall’s early applicants.
And this is a relatively new phenomenon.
For more than a half century, the ACT ran a distant second to the SAT in the high-stakes college admissions race. It was the “We Try Harder,” entrance exam—popular in the Midwest and the South but hardly worthy of notice on either coast.
But that all changed several years ago, as the ACT pulled ahead of the SAT in terms of test-taking popularity. And since then, the ACT has continued to widen the gap.
It’s not that the College Board is hurting for customers. In fact, more test-takers completed the new SAT from March through June of 2016 than took the old SAT during the same period in 2015, according to a report published by the College Board last fall.
But the number of high school graduates taking the ACT soared to a record 2.1 million students—nearly 64 percent of graduating seniors. From 2012 to 2016, the number of ACT test-taking high school grads increased by 25.5 percent, while the estimated overall number of graduates increased by only 1.3 percent, leaving the College Board with something serious to think about.
In all fairness, a significant percentage of the growth experienced by the ACT is a direct result of its adoption for statewide assessment. For the graduating class of 2016, the ACT was administered to all public school graduates in 20 states. These students were pretty much required to take the ACT—like it or not.
But the good news for the ACT doesn’t end there. Not surprisingly, the number of tests submitted for admissions purposes shows a similar trend. Colleges are definitely seeing way more ACT scores than they did a decade ago. And it appears that many more students are taking both tests and submitting both sets of scores for consideration by colleges, particularly uber-selective institutions.
According to the New York Times, there appears to be a real “shift in the behavior of top high school students,” as many more choose to work toward high scores on both tests. And that’s okay with top colleges.
“I don’t know all the pieces of why this is happening, but I think more students are trying to make sure they’ve done everything they can,” said Janet Rapelye, dean of admissions at Princeton University, in an interview with the Times. “And for us, more information is always better. If students choose one or the other, that’s fine, because both tests have value. But if they submit both, that generally gives us a little more information.”
And applicants are getting the message. Those with top scores on both tests want colleges to have the benefit of knowing they did well on both. On the flipside, those who did significantly better on one test or the other tend to only submit the better set of scores—depending on the specific rules of the particular college or university.
It will be interesting to see how this trend evolves as “new” or redesigned SAT test results make their appearance among this year’s admissions decisions, particularly as the SAT has transformed itself into yet another curriculum-based test and blurred its differences with the ACT.
Regardless, based on test-submission patterns easily tracked for colleges posting Common Data Set information, the College Board has a very real challenge making up for ground lost to the ACT.
Here is a sample of test-submission statistics for the freshman class entering in 2005 as compared to the classes entering in fall 2016 (note that yearly totals exceeding 100% indicate colleges considered both the SAT and the ACT for some students):
2005 SAT: 87% vs. 2005 ACT: 13%
2016 SAT: 52% (53% in 2015) vs. 2016 ACT: 51% (49% in 2015)
2005 SAT: 31% vs. 2005 ACT: 69%
2016 SAT: 12% (14%) vs. 2016 ACT: 87% (85%)
Carnegie Mellon University
2005 SAT: 98% vs. 2005 ACT: 17%
2016 SAT: 78% (84%) vs. 2016 ACT: 41% (37%)
Case Western Reserve
2005 SAT: 89% vs. 2005 ACT: 58%
2016 SAT: 50% (57%) vs. 2016 ACT: 66% (62%)
College of William and Mary
2005 SAT: 97% vs. 2005 ACT: 3%
2016 SAT: 77% (80%) vs. 2016 ACT: 44% (44%)
2005 SAT: 98% vs. 2005 ACT: 18%
2016 SAT: 69% (75%) vs. 2016 ACT: 51% (45%)
2005 SAT: 89% vs. 2005 ACT: 11%
2016 SAT: 53% (59%) vs. 2016 ACT: 47% (41%)
2005 SAT: 95% vs. 2005 ACT: 7%
2015 SAT: 78% (84% in 2014) vs. 2015 ACT: 47% (40% in 2014)
2005 SAT: 98% vs. 2005 ACT: 2%
2016 SAT: 58% (63%) vs. 2016 ACT: 42% (37%)
2005 SAT: 100% vs. 2005 ACT: N/A
2016 SAT: 73% (80%) vs. 2016 ACT: 45% (36%)
2005 SAT: 97% vs. 2005 ACT: 23%
2016 SAT: 77% (80%) vs. 2016 ACT: 51% (51%)
2005 SAT: 99% vs. 2005 ACT: 14.9%
2016 SAT: 67.5% (73%) vs. 2016 ACT: 48.7% (46%)
University of Michigan
2005 SAT: 55% vs. 2005 ACT: 66%
2016 SAT: 26% (27%) vs. 2016 ACT: 82% (83%)
University of North Carolina-Chapel Hill
2005 SAT: 99% vs. 2005 ACT: 22%
2016 SAT: 71% (76%) vs. 2016 ACT: 78% (74%)
University of Pittsburgh (Pittsburgh campus)
2005 SAT: 99% vs. 2005 ACT: 20%
2015 SAT: 80% (85%) vs. 2015 ACT: 50% (47%)
University of Virginia
2005 SAT: 99% vs. 2005 ACT: 14%
2015 SAT: 77% (82%) vs. 2015 ACT: 50% (44%)
2005 SAT: 89% vs. 2005 ACT: 53%
2015 SAT: 37.6% (41%) vs. 2015 ACT: 67.2% (63%)
Virginia Commonwealth University
2005 SAT: 95% vs. 2005 ACT: 15%
2015 SAT: 81.1% (87.4%) vs. 2015 ACT: 26.4% (26.9%)
Washington and Lee University
2005 SAT: 80% vs. 2005 ACT: 18%
2015 SAT: 37% (46%) vs. 2015 ACT: 63% (53%)
2005 SAT: 94% vs. 2005 ACT: 18%
2015 SAT: 58% (61%) vs. 2015 ACT: 41% (38%)
*The most recent Common Data Set posted online is 2015-16
Hours before Associate Dean of Admission Jeannine Lalonde (Dean J) posted her usual heads up to applicants that the University of Virginia was getting ready to post decisions, gizmo18 let the cat out of the bag on College Confidential: “Decisions come out today!”
Seven hours later, Dean J confirmed that applicants could expect to see one of three decisions—admitted, denied or waitlisted—sometime in the next few hours. And by 5:00, the wait was over.
“I can’t believe it! I got in,” crowed one happy applicant. “After straight rejections from Northwestern and GaTech I thought it was over. Words cannot describe my excitement.”
Another reported, “Didn’t expect much after rejections from Northwestern and Uchicago last week. But I’m happy I was proven wrong!!!! I was worried that writing my essay about Nike and Adidas in the sneaker industry was weird, but I guess not!!!!!!!!”
And from KingUU: “I got accepted! I’m so happy! Dreams can be real! UVA was my number 1 choice.”
But the news wasn’t universally happy.
“Deferred then waitlisted in state,” moaned another applicant. “Bruh just reject me already.”
Others were more philosophical, “Deferred EA, rejected RD. It was a long shot, but I definitely learned a lot about myself through it. Congrats to all who got in and good luck to everybody!”
To give the decisions context, Dean J posted preliminary numbers for this year later in the week and suggested that admissions junkies with a real “need to know” research them using a new tool devised by the UVa assessment team for presenting data in Tableau.
But the simple comparison with 2016 is interesting enough. Last year at this time, UVa reported receiving 32,426 applications (this number tends to jump around a little)—a significant increase from the previous year—and made initial offers to 9,416 students.
For this year’s class, the total number of applications soared to 36,807, with the number of in-state applicants increasing from 9,653 reported a year ago to 10,942 for the class of 2021.
The biggest contributing factor to the overall increase in applications, however, was the bump from out-of-state students who submitted 25,865 applications—up from 22,773 during 2015-2016.
To account for a steadily decreasing yield (percent of students accepting offers), which dropped from 53 percent in 2005-06 to 38 percent in 2016-17, as well as a need to continue growing class size, admissions increased offers to 9,957—about six percent more than last year. Of these offers, 4,276 went to Virginians (4,019 last year), and 5,681 went to out-of-state students (5,397 last year).
Early action admits accounted for 5,914 of the total. And the initial admission rate decreased to about 27 percent from 29 percent last year.
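The admission rates quoted here are simple ratios of offers to applications; here's a quick check using the unofficial figures reported by UVa (the `admit_rate` helper is just for illustration):

```python
# Quick check of the admit rates implied by UVa's unofficial 2017 numbers.

def admit_rate(offers, applications):
    """Offers of admission as a percentage of applications, one decimal."""
    return round(100 * offers / applications, 1)

overall = admit_rate(9957, 36807)        # all offers vs. all applications
in_state = admit_rate(4276, 10942)       # Virginia offers vs. Virginia applications
out_of_state = admit_rate(5681, 25865)   # nonresident offers vs. applications

print(overall, in_state, out_of_state)   # 27.1 39.1 22.0
```

The results line up with the roughly 27 percent overall, 39 percent in-state, and 22 percent out-of-state rates cited in the text.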
According to information provided by UVa to the Common Data Set, 4,987 students were offered spots on the wait list last year, and 2,871 accepted the offer. Of those students, 360 were eventually admitted.
In any event, here are all the “unofficial” numbers released by the UVa admissions office:
Total number of applications: 36,807 (up from 32,426 last year)
Total number of VA applications: 10,942 (up from 9,653 last year)
Total number of out-of-state applications: 25,865 (up from 22,773)
Overall offers: 9,957 (9,416 this time last year)
Total VA offers: 4,276 or 39% of resident applications (4,019/41.6% last year)
Total out-of-state offers: 5,681 or 22% of nonresident applications (5,397/23.7% last year)
Note that the offers of admission for nonresidents are higher because historic yield for nonresidents is generally lower than that for in-state students.
In a press release, UVa reports that of those admitted, over 1,000 are first-generation college students and more than 35 percent identify as members of a minority group. They come from all 50 states and 89 countries around the world.
And they present outstanding credentials. For those admitted who submitted new SAT scores, the middle 50 percent range was 1330-1490 (Dean J notes that “way more” students submitted the new SAT than the old, so she dropped the stats about the old exam). The middle 50 percent ACT composite was 31-34. And 93.4 percent of admitted students were in the top ten percent of their high school class, for those who attend schools that report rank.
Once again, self-described “tableau dabbler,” Jon Boeckenstedt, associate vice president for enrollment management at DePaul University, has come up with easy-to-use tools for visualizing basic college admissions data.
Drawing from information compiled in Peterson’s Undergraduate database and the Peterson’s Undergraduate Financial Aid database, both copyright 2016 by Peterson’s-Nelnet, Boeckenstedt has created a series of ten “views” or charts showing test scores, male and female admit rates, early decision vs. regular admit rates, need data as well as some general international student information to be used with caution.
And the colorful “optics” can be very revealing as well as educational for anyone putting together a college list.
For example, by looking at “SAT Math distributions,” it’s very easy to see that an applicant to Caltech with less than a 700 Math SAT has nearly no chance of admission as 98.9 percent of the freshman class entering fall 2015* (the teal-color bar) had math scores over 700 (exact numbers can be found by hovering your mouse over the bar). Judging from ACT Composite distributions, the student with less than a 30 Composite ACT score had no chance of admission to Caltech.
Using the same database, Boeckenstedt lays out 25th and 75th percentiles for SAT CR and Math scores as well as ACT Composites. Looking at the ACT view, for Stanford University, the 25th percentile of the distribution was 31 and the 75th percentile was 35—not too promising for a student with an ACT Composite below 31.
“While test scores are not the primary factor in admissions decisions, these charts can give you a good sense of where you might stand in the applicant pool,” explained Boeckenstedt. “And while you might not eliminate yourself from consideration if your scores are close to the border between one range and another, it’s clear that high scores are an important consideration at many of these institutions.”
Admit rate data, or the percentage of applicants offered admission, is equally interesting. The chart illustrating the difference between admit rates for men and women shows exactly how wide the margin can be. For example, in fall 2015, the admit rate for men at Vassar College was 35.4 percent and for women was 21.5 percent—a significant difference easily visualized by the distance between the purple and orange dots. At Harvey Mudd College, the admit rate for men was 9.4 percent while the admit rate for women was 21.4 percent—the dots are reversed!
But it’s the chart documenting the early decision (ED) and overall admit rates and their difference that could possibly suggest application strategies. In fall 2015, the admit rate for ED candidates at Tufts University was 39.2 percent, but the overall admit rate was only 16.1 percent, suggesting a huge advantage for ED applicants. This is confirmed in the light blue bar to the right of the chart showing the difference between the two rates.
Boeckenstedt warns that it’s important to be realistic about admit rates. “A 15% admission rate does not mean that your chances are one in seven; your chances may be better or worse based on any one of many factors in your file.” And, “if you’re a top student in the applicant pool, your chances are probably better; if not, and if there is nothing else to get your application noticed, your chances are almost certainly worse.”
He goes on to add, “…it’s clear that Early Decision makes the choice about where to apply, and under what plans, even harder.”
All of Boeckenstedt’s charts may be filtered by state. And to navigate the various views, simply click the gray boxes or arrows along the top. Use the scroll bar to move down the view, and hover over any data point to show details.
For the record, all the score information is given in terms of the “old” SAT and not the “new” SAT.
And, the information found on these pages may be more current and complete than what’s posted on college search websites or that contained in college guides.
Check this out: the 2017 College Board College Handbook was printed in June 2016 and is based on data provided to the CDS for 2015-16. The 2018 edition with 2016-17 data won’t come out for months.
But many colleges have already posted their 2016-17 CDS survey responses, with more up-to-date information. So why not get a jump on the 2018 handbook and go directly to “source documents” found on institutional research pages?
In your research, you’ll find that not every website or guide uses all the information available through the Common Data Set. Not all will provide details on wait lists or transfers. But once you get familiar with CDS questions and format, you’ll discover these details are usually there and very accessible.
In addition, you can research trends by looking at CDS data over a series of years. That’s a plus when looking at retention or graduation rates, where you always want to see improvement. The College of William and Mary is extraordinarily helpful in this way, posting full Common Data Sets from as far back as 1997-98.
Keep in mind that the CDS is a voluntary project in which colleges “self-report” information with little or no centralized technical support or oversight. In other words, the data can be inaccurate or slanted in ways that favor the institution.
Note that you can always cross reference the CDS with College Navigator. But even then, the data is only as good as that which colleges may be willing or able to provide, and it sometimes lags the most recent CDS posting.
In the way of an introduction, here is a tour of the basic Common Data Set:
- Enrollment. Questions B1 and B2 provide the size of the institution as well as a breakdown of what the campus community looks like in terms of race and ethnicity.
- Graduation Rates. Questions B4 through B11 address “persistence,” or what percent of students graduated within a specified time frame. You can easily compute 4-year graduation rates by dividing B7 (completions within four years) by B6 (the total class size). For example, the University of Virginia graduated 87.8 percent of the class that began in 2010 within four years. Question B11 simply states the 6-year graduation rate of 94.1 percent.
- Freshman Retention. Question B22 provides the freshman retention rate based on the date an institution calculates its “official” enrollment—a number subject to some manipulation depending on who is counting and on what day.
- Admissions. Using the answers to C1, you can get male/female as well as overall admit rates (selectivity) by dividing the number of admitted students by the number of applicants. This can be extremely interesting when trying to determine the level of admissions difficulty for men vs. women or your basic odds of getting in. For example, in the fall of 2016, the College of William & Mary admitted 43 percent of its male applicants but only 32 percent of the females who applied.
- Yield. Once again using the responses to C1, yield may be computed by dividing the total number of enrolled students by the number admitted. Because of the sensitivity and importance of this number in college rankings, the definitions of “admitted” and “enrolled” can be different at different institutions.
- Wait list. The answers to C2 speak to the use of the wait list and the likelihood of admission from the wait list. In the spring of 2016, Dartmouth College offered wait list places to 2,064 students for a class eventually totaling 1,121. Of those, 1,194 accepted spots on the list. From that group, 16 were admitted.
- Other Admissions Factors. C7 outlines the relative importance of academic and nonacademic factors in admissions decisions. This may be a good place to see if interviews are available and how important they may be. Wake Forest University and Carnegie Mellon University consider the interview “important,” while Johns Hopkins and William & Mary simply note that the interview is “considered.”
- GPA. C12 provides the average high school GPA of enrolled freshmen. Because it’s hard to know if the number is weighted, unweighted or recomputed, the GPA response is left out of many college guides. It’s also a question that’s frequently left blank by colleges.
- Early Decision Advantage. Question C21 covers early decision and early action plans. This is where you can discover how much of an advantage it might be to apply to an institution early decision. For example, for fall 2016, the College of William and Mary received 1,003 early decision applications and admitted 519, or 52 percent. Going back to question C1, a quick computation shows the overall admit rate to be much lower—37 percent. At Dartmouth, 26 percent of the early decision candidates were admitted according to Question C21, while only 11 percent were admitted overall.
- Transfers. D2 indicates how many transfer applications were received, how many students were admitted, and how many eventually enrolled. Other basic information on the transfer process includes the terms during which transfers may enroll (D3), minimum credit units required for transfer (D4), the need for an interview (D5), and a minimum college grade point average a college wishes to see for a transfer (D7).
- Residency. Under the “Student Life” section (F1), you can see the percent (and number) of out-of-state students (excluding international students) enrolled. The University of North Carolina at Chapel Hill enrolled 16 percent out-of-state students in fall 2016, while the College of William and Mary enrolled 34 percent.
- Annual Expenses. Questions G0 through G6 lay out undergraduate tuition, fees, and room and board. More current data for the coming year will more likely be found on an individual school website, and if you’re interested, G0 gives a direct link to an institution’s net price calculator.
- Financial Aid. The H section is devoted to financial aid, including scholarships/grants and “self-help” awards. Question H2A provides information on non-need-based scholarships and grants, including those for athletes. And for international students, H6 answers the question of whether or not institutional aid is available to “nonresident aliens.”
- Percent of Need. Question H2i provides the percent of need a college claims was met for students awarded any need-based aid. For the 2016-17 reporting period, Temple University met 69 percent of need for incoming full-time freshmen. Towson University met 54.8 percent and Bucknell University met 91 percent of need, while Stanford University and UVa claimed to meet 100 percent of need (keep in mind that “need” is a pretty subjective term).
- Faculty and Class Size. Questions I1 through I3 cover the range of territory relating to student-to-faculty ratio and average undergraduate class size. This is a complicated area full of definitional issues, but since colleges make a point of bragging about how small their classes are, you may want to take a look.
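The computations described above all come down to simple division. Here is a minimal sketch in Python using figures quoted in this article (the William & Mary early decision numbers from C21 and the Dartmouth wait list numbers from C2); the helper function is just for illustration, not part of the CDS itself.

```python
def rate(numerator, denominator):
    """Return a percentage, rounded to one decimal place."""
    return round(100 * numerator / denominator, 1)

# Early decision advantage at William & Mary, fall 2016 (question C21):
# 519 admitted out of 1,003 ED applications -> roughly 52 percent.
print(rate(519, 1003))   # 51.7

# Wait list odds at Dartmouth, spring 2016 (question C2):
# 16 admitted from the 1,194 who accepted spots on the list.
print(rate(16, 1194))    # 1.3
```

The same one-liner handles 4-year graduation rates (B7 divided by B6), overall admit rates and yield (from the C1 responses), so once you locate the right CDS question, the arithmetic takes seconds.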
If this kind of analysis gives you a headache, feel free to use comprehensive college search websites and guide books that aggregate and re-work the data into more user-friendly formats.
But if you can’t wait until mid-summer and like the idea of going directly to the source, visit the CDS webpages for colleges you are researching.
This is the second in a two-part series on the Common Data Set. For sample links to CDS webpages, go back to Part 1.
Did you ever wonder where some college guidebooks and online search engines get their information? Are you curious about how publications like US News & World Report collect data for rankings? Would you like to go directly to the source?
If so, let me introduce you to the Common Data Set, an amazing resource anyone can access—if you know how.
The backstory is simple. The Common Data Set (CDS) was created as a way to satisfy the public’s insatiable appetite for college knowledge and statistics.
According to the CDS website, the Common Data Set initiative is “a collaborative effort among data providers in the higher education community and publishers as represented by the College Board, Peterson’s, and U.S. News & World Report. The combined goal of this collaboration is to improve the quality and accuracy of information provided to all involved in a student’s transition into higher education, as well as to reduce the reporting burden on data providers.”
So rather than answer a zillion questions from many different publishers and websites, colleges fill out a lengthy standardized form each year. The data is collected, compiled, and doled out to publishers, which use it for everything from college rankings to online college search tools.
And many colleges are kind enough to publish their CDS surveys on their websites so anyone can have access to the information. And if you get familiar with the various data fields, it’s a goldmine covering everything from admissions statistics to financial aid.
Typically, you can find CDS responses by going to a college’s Institutional Research Office webpage or by using the website search function and entering “Common Data Set.” You can also Google “Common Data Set” and institution name. If the information is posted, it will appear as a link.
But not all schools post the CDS, and URLs change frequently, so don’t be alarmed if after several attempts nothing comes up. A number of colleges simply don’t want the public to have easy access to what may be unflattering statistics or information they feel could be misinterpreted.
And keep in mind that the folks who administer the CDS don’t audit the information for accuracy. They rely on colleges and universities to provide accurate and truthful information, which isn’t always the case, as we’ve learned from the repeated scandals involving US News.
Also, it’s fair to say that colleges sometimes differ about terms and definitions. For example, the CDS provides little guidance on what is required for grade point average information—weighted, unweighted, or recomputed. As a result, one college may report GPA one way and another college a different way. And often, the question (C12) simply isn’t answered.
Finally, don’t confuse the Common Data Set with the federal government’s College Navigator. They involve two different reporting systems and produce two different reports in different formats.
But for hardcore data junkies, the Common Data Set is hard to beat. Depending on the time of year, it’s more current than what you’re likely to find in any print guide or website.
To get started, here are some sample CDS links:
- Amherst College: https://www.amherst.edu/amherst-story/facts/common_data_sets
- Bowdoin College: https://www.bowdoin.edu/ir/data/cds-table.shtml
- Carnegie Mellon University: https://www.cmu.edu/ira/CDS/index.html
- College of William and Mary: http://www.wm.edu/offices/ir/cds/
- Cornell University: http://irp.dpb.cornell.edu/common-data-set
- Dartmouth College: http://www.dartmouth.edu/~oir/data-reporting/cds/
- Eckerd College: https://www.eckerd.edu/about/factsheet/
- George Mason University: https://irr2.gmu.edu/cds/cds_new/
- George Washington University: https://www2.gwu.edu/~ire/
- Georgetown University: https://oads.georgetown.edu/commondataset
- Gettysburg College: http://www.gettysburg.edu/about/offices/ees/institutional_analysis/cds.dot
- Harvey Mudd College: https://www.hmc.edu/institutional-research/institutional-statistics/common-data-set/
- Indiana University: https://www.iu.edu/~uirr/reports/compliance/cds/
- James Madison University: http://www.jmu.edu/instresrch/cds.shtml
- Kalamazoo College: http://www.kzoo.edu/about/assessment/common-data-set/
- Lewis and Clark College: https://www.lclark.edu/offices/institutional_research/common_data_set/
- Middlebury College: http://www.middlebury.edu/offices/administration/planning/mdata/history/cds
- Northwestern University: http://enrollment.northwestern.edu/common-data-set.html
- Pomona College: https://www.pomona.edu/administration/institutional-research/common-data-set
- Princeton University: https://registrar.princeton.edu/university_enrollment_sta/#comp000048a59a9e00000006304217
- Queens University: http://www.queens.edu/Academics-and-Schools/Office-of-Academic-Affairs/Institutional-Effectiveness-and-Planning/Institutional-Research.html
- Reed College: https://www.reed.edu/ir/cds/cdsindex.html
- SMU: https://www.smu.edu/Provost/IR/Statistics
- Stanford University: https://ucomm.stanford.edu/cds/pdf/stanford_cds_2016.pdf
- Swarthmore College: http://www.swarthmore.edu/institutional-research/common-data-set
- Temple University: http://www.temple.edu/ira/data-analysis-and-reporting/institutional-reporting.html
- UCLA: http://www.aim.ucla.edu/profiles/cds2.aspx
- University of Maryland-College Park: https://www.irpa.umd.edu/Publications/pub_cds.html
- University of North Carolina-Chapel Hill: http://oira.unc.edu/facts-and-figures/data-summaries-and-publications/common-data-set/
- University of Notre Dame: https://www3.nd.edu/~instres/CDS/CDS.shtml
- University of Richmond: http://ifx.richmond.edu/research/common-data.html
- University of South Carolina: http://ipr.sc.edu/cds/
- University of Virginia: http://ias.virginia.edu/common-data-set
This is the first part of a two-part series on the Common Data Set. The second part will drill a little deeper into CDS questions and content.
Early applicants to the University of Virginia’s Class of 2021 received decisions earlier this week—well ahead of the January 31st published release date. Following the recent announcement from UVa President Teresa Sullivan that she will be leaving at the end of her current contract in 2018, the admissions office decided to give over 5,000 prospective ‘Hoos some good news to consider.
And it’s clear that admission to the Commonwealth’s flagship university remains a highly sought-after prize among high school students—both from within the state and across the country.
Even with plans to increase undergraduate enrollment for 2017-18, the competition for admission under UVa’s early action program continues to be intense, as the overall number of applications grew to 20,446—about a 24 percent increase over numbers reported the same time last year.
Predictably, most of the early applicants, 14,968 (or 73 percent), came from out of state. The balance—5,278 applicants—came from within Virginia.
Out of this year’s early action pool, 5,914 students were admitted—about 14 percent more than for the Class of 2020, which experienced a seven percent jump in EA admits from the year before. Of those admitted, 2,575 were from Virginia (47 percent offer rate—down three percentage points), and 3,339 were from out of state (22 percent offer rate).
Among the offers, 4,496 were for the College of Arts & Sciences, 1,180 were for the School of Engineering and Applied Science, 97 were for the School of Architecture, 75 were for the School of Nursing, and 66 were for the Curry School of Education.
Typically, more offers are made to nonresidents because the yield among students faced with out-of-state tuition is significantly lower. But it’s worth noting that offers made to out-of-state students increased by over 13 percent from last year.
According to assistant admissions dean Jeannine Lalonde (Dean J), those offered early admission bids were very well qualified. The middle range of SAT scores of this year’s admitted students fell between 2020 and 2290 (ACT between 31 and 34). And 94.6 percent of the offers went to students in the top ten percent of their high school classes (this number only reflects those who attend schools that report rank).
Although over 9,000 students were denied admission during the first round of consideration, another 5,458 were thrown a lifeline by being deferred to the regular decision pool, which stands at about 16,250 additional applicants. The entire group will receive decisions before April 1. Note that deferred applicants are specifically encouraged to send new test scores and midyear grades as soon as possible.
All students will have until May 1 to make up their minds. And those early applicants who were lucky enough to be admitted to UVa’s Class of 2021 can expect to receive significant encouragement to commit as soon as possible.
Nancy Griesemer is an independent educational consultant and founder of College Explorations LLC. She has written extensively and authoritatively about the college admissions process and related topics since 2009.
Breaking from previous years, Stanford University has declined to release early action results for the class of 2021. This year, students applying under Stanford’s highly competitive restrictive early action (REA) program learned their decisions on Friday afternoon in emails sent by the admissions office, which since 2010 has freely provided numbers from the first round of application reviews.
In an email to the Stanford Daily, Dean of Admissions and Financial Aid Richard Shaw indicated that the release of early action data in the last several years was “exceptional.”
“Our policy is to release data at the end of cycle,” Shaw wrote. “We have returned to our standard approach in communicating about the Class.”
But in 2010, 2011, 2012, 2013, 2014 and 2015, Stanford released the number of acceptances, deferrals and rejections from the early pool to the Stanford Daily shortly after decisions were communicated to applicants. And in each of those years, Stanford made the information available after early admission results were made public by Harvard University.
This year, Harvard will be communicating restrictive early action results on December 12, several days after Stanford has closed its books on the early phase of the admissions process.
For the record, Stanford made offers to 745 early applicants or 9.5 percent of 7,822 REA candidates last year, having received seven percent more applications than in the previous year—the largest early pool in Stanford’s history. Stanford ultimately went on to take top honors in the selectivity race for the third consecutive year, by dropping to a breathtaking overall 4.69 percent admission rate for the class of 2020.
For the same class, Harvard accepted 14.8 percent of 6,173 early applicants and ultimately came in with the second lowest admit rate in the nation at 5.2 percent.
While not binding, both early action programs prohibit (with some exceptions) applicants from applying early to other colleges and universities. Those accepted now are free to pursue other applications and compare results later in the application cycle. All final decisions are due by May 1, 2017. But if you’re a Stanford applicant, don’t look for too many deferrals to the regular pool. Stanford’s philosophy is to “make final decisions whenever possible,” while in the past, Harvard has routinely deferred most early applicants.
And like it or not, the selectivity competition launched by the early admission announcements is closely watched by the entire admissions community, not to mention alums and trustees. Between now and May, Stanford and Harvard will compete mightily for many of the same candidates and both will be hoping to score the highest yield—the percent of students accepting an offer of admission—in the country.
In communications with the Stanford Daily, Shaw indicated that the university will release the Class of 2021 data when all candidates have been notified, which he hopes will de-escalate the statistics war between Stanford and the Ivy League. Earlier this year, Shaw told the Washington Post that he finds the discussion of admission rates distracting. “It just diverts everybody’s attention from the fact that we took 2,000-plus kids that are magnificent,” he explained.
When Shaw moved from Yale University to Stanford in 2005, the admit rate for the first class he oversaw was just under 11 percent, and that class included about 400 more admitted students than the class of 2020. The steady decline reflects the fact that Stanford’s yield is now the highest in the country.
But whether from a desk in New Haven or one in Palo Alto, Shaw no doubt remains aware of his competition at Harvard.
“My feeling is, what’s the difference between 7 percent and 4 percent? It’s all very competitive,” Shaw said. “If you look at Harvard’s number, Penn’s, Princeton’s, or any number of institutions, they’re all quite competitive.”