Despite intense policy interest in income-based financing for higher education, there has been little to no causal investigation of whether these instruments actually increase participation in educational programs. Most income-based financing is structurally simple: students receive value, either as funds or as educational services, and in return pay a defined percentage of their income until the contract ends. Income-contingent loans attach an interest rate and a defined debt obligation; income share agreements, by contrast, may specify only the income percentage and the number of years of the contract. Empirical observations from naturally occurring data are generally in the affirmative: these contracts do increase participation. Australia, for example, saw nearly a 70% increase in higher-education participation after switching to income-based financing as its sole financing option, and participation has risen in most countries where income-based financing has been introduced. Without randomization, however, other explanations for these increases cannot be ruled out. To our knowledge, not even the instrumental-variable methods common in econometric analysis have been used to study the participation effects of income-based financing, and specifically of income share agreements. We know of one attempt at a randomized controlled trial on the participation effects of income share agreements, but it was survey-based and conducted in a classroom; hypothetical choices are helpful but potentially misleading.
Given this gap in the literature, we performed a natural field experiment to test whether income share agreements do, in fact, increase participation, all else being equal. Our outcome variable was attempted signups: would a student, randomly exposed to an income share agreement rather than a loan with the same effective term and interest rate, be more inclined to participate in an education course? We answer in the affirmative and reject the null: signups increased by 66.65% (p ≈ 0.001, Pearson chi-squared test) when students were exposed to the income-based financing treatment rather than the equivalent loan control.
The experiment has several defects. We did not collect observable information that would expose heterogeneity among the subjects who ultimately assented to the signup; indeed, income share agreements might appeal strongly to a niche group while actually deterring the broader base of students. Our signup outcome was very low friction: the only personal information required was the subject’s email address. The sheer novelty of our treatment may have induced signups out of curiosity rather than true intent to matriculate. The treatment also faces less competition than the loan control on the open market, so while more students exposed to the income share agreement treatment may have signed up on our site, students exposed to the loan may simply have gone elsewhere for financing. Matriculation, in this case, would be a far better outcome variable.
Our design was simple. We created a marketing landing page for students to receive financing to attend a “coding bootcamp.” Coding bootcamps are generally 12-week courses, run by small to mid-size for-profit and non-profit firms, that teach individuals how to write software. The courses are optimized so that most students receive a job offer shortly after finishing. Around 20,000 students attend these bootcamps every year in the United States, and tuition ranges from $10,000 to $30,000. From conversations with bootcamp CEOs, roughly half of students finance the courses themselves; the other half rely on one of several bootcamp-specific financiers, whose interest rates are high, starting at 5% above the risk-free rate. Outcomes are strongly positive for students: 80% of bootcamp graduates receive an offer within six months at a high salary ($65,000 to $90,000), which represents a 50% earnings increase for the average case. The central policy question is: why isn’t everyone doing this?
Our landing page presented general information about coding bootcamps and an offer from our organization to provide 100% financing (plus living expenses) to attend one. We randomly assigned each first-time visitor to one of two treatments regarding this financing. Once a treatment was assigned, we set a cookie so that subsequent visits to the site would show the same treatment; it is still possible that some subjects saw both treatments, but this is unlikely. In the income share agreement (ISA) treatment, the terms for receiving funds were that students would commit 25% of their income for 3 years. We estimate that this contract is equivalent to a $15,000 loan with a 35% interest rate and a 3-year term: using a conservative post-bootcamp salary estimate of $45,000, a 25% ISA would mean students pay $8,437.50 per year on average, and amortizing a principal balance of $15,000 over 3 years at that payment roughly implies a 35% interest rate. This is a crude estimation, but our salary conservatism gives us some comfort. In the Loan treatment, we offered the same services but framed the obligation as a $15,000 mortgage-style loan with a 3-year term and a 35% interest rate. In effect, the Loan treatment is comparable to financing the bootcamp with a high-interest credit card.
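The amortization claim can be checked numerically. A minimal sketch, assuming a standard level-payment amortization formula and solving for the rate by bisection (the function names are ours); on the paper's figures of a $15,000 principal, $8,437.50 annual payment, and 3-year term, the implied rate lands in the low 30s, consistent with the "roughly 35%" characterization:

```python
def annual_payment(principal, rate, years):
    """Level annual payment that amortizes `principal` at annual `rate` over `years`."""
    return principal * rate / (1 - (1 + rate) ** -years)

def implied_rate(principal, payment, years, lo=0.0, hi=2.0):
    """Solve annual_payment(principal, r, years) == payment for r by bisection."""
    for _ in range(100):
        mid = (lo + hi) / 2
        if annual_payment(principal, mid, years) < payment:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Figures from the paper: $15,000 principal, $8,437.50/year, 3-year term.
rate = implied_rate(15_000, 8_437.50, 3)
print(f"implied annual rate: {rate:.1%}")
```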
Visually, the two treatments were identical except for several key phrases, each of which is compared below.
We attracted individuals to the page solely through search-engine advertisements; no other links to the treatment pages were available. We used keywords related to “coding bootcamps”, “learning to code”, and “become a programmer”, among others, so a user could see our advertisement only after typing one of these keywords into the search engine. We tracked clicks to the landing page and received 2131 visitors over the 2 months of the experiment. Since our randomization was naïve, each visitor had a 50% chance of being assigned to either treatment. While we did not track how many times each treatment was shown, our sample size is large enough that we can approximate each treatment being shown 2131/2, or 1065.5, times. Of the approximately 1065.5 subjects assigned to the Loan treatment, 60 signed up for the service (5.6%); of those assigned to the ISA treatment, 100 signed up (9.3%). This difference corresponds to a 66.65% treatment effect and is strongly significant (p ≈ 0.001) under the Pearson chi-squared test.
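The test statistic can be reproduced from the counts above. A minimal sketch, approximating each group size as half of the 2131 visitors and using the closed-form chi-squared survival function for one degree of freedom:

```python
import math

# Observed signups from the experiment; each group size is approximated
# as half of the 2131 total visitors.
n_group = 2131 / 2
signups = {"Loan": 60, "ISA": 100}

# Expected counts under the null hypothesis of equal signup rates.
exp_signup = sum(signups.values()) / 2   # 80 signups expected per group
exp_no = n_group - exp_signup            # 985.5 non-signups expected per group

# Pearson chi-squared statistic over the 2x2 table (no continuity correction).
chi2 = 0.0
for observed in signups.values():
    chi2 += (observed - exp_signup) ** 2 / exp_signup
    chi2 += ((n_group - observed) - exp_no) ** 2 / exp_no

# For df = 1, the chi-squared survival function reduces to erfc(sqrt(x / 2)).
p_value = math.erfc(math.sqrt(chi2 / 2))
print(f"chi2 = {chi2:.2f}, p = {p_value:.4f}")
```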
Our design could be improved with ‘balanced’ randomization or matched pairs. It is unclear how heterogeneous the two treatment groups are, although simple randomization should eliminate imbalance at our sample size. Heterogeneity between those who assented and those who refrained is of central interest, for both the creditor and the policy-maker, but this will need to wait for future research. The financial equivalence of our two treatments is also uncertain: we may not be discounting the ‘insurance’ feature embedded in an ISA, since no payments are due when the student is not employed. Nonetheless, at rough equivalence, our subjects are clearly not indifferent.
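The direction of this insurance adjustment can be illustrated with a toy calculation. The 10% per-year nonemployment probability below is purely an assumption of ours for illustration, not an estimate from the experiment:

```python
# Toy sketch: if ISA payments pause with probability q in a given year,
# expected ISA payments shrink by a factor (1 - q), so the loan-equivalent
# interest rate overstates the ISA's expected cost.
# q = 0.10 is an illustrative assumption, not a measured quantity.
nominal_payment = 8_437.50   # the paper's estimated average annual ISA payment
q = 0.10                     # assumed per-year probability of owing nothing

expected_payment = (1 - q) * nominal_payment
print(f"expected annual ISA payment: ${expected_payment:,.2f}")
```

Under this assumption, the ISA's expected annual cost falls below the fixed loan payment, so a risk-neutral student should find the ISA strictly cheaper than the nominally equivalent loan.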