Step Seven

Test your ideas; fail fast to learn fast


By now you have formed a collaborative partnership, collected data, defined the problem to be solved, identified roles, designed processes for engagement, committed to advancing equity, and embraced the need for new models of postsecondary education. You are now ready to put your ideas into practice. To change systems, you want to learn about successes and failures quickly so that you can scrap ideas or scale them up. This means committing to the process of “continuous improvement”: disciplined, essentially scientific inquiry. You conduct rigorous tests of the ideas your data suggest are promising. You implement a change idea, then adopt, adapt, or eliminate it depending on what you have learned. Think of these cycles as Plan, Do, Study, Act.

  1. Plan. Collect data, define the problem specifically (asking a series of deeper and deeper “why” questions), and design an intervention to test.
  2. Do. Implement the change idea in a limited way under a finite timeline. Collect data as you do so.
  3. Study. Analyze the data you have collected during the implementation stage.
  4. Act. Tweak the change idea, adopt it, scale it, or start all over again, depending on what you have found.

Here are some examples:

  • Elkhart, Indiana, has run a number of pilots, including a certified production technician course at a local automotive plant. Based on the lessons learned from that effort, the partners made substantial changes when they started a similar program at another manufacturer.
  • In New York, CUNY colleges are piloting debt-forgiveness programs, gathering information for a university-wide pilot with a research agenda. In addition to the pilots noted elsewhere (see “Non-Academic Assistance”), CUNY Central tested the impact of debt forgiveness on students with account balances of $300 or less. They learned a great deal: the threshold was too low to encourage students to return; the balances weren’t low enough to prevent students from registering under current policy; and accounting systems made the program difficult to operationalize. In the end, only one college implemented the program, with 11 student participants, of whom just four completed the semester with balances forgiven. Although limited, the pilot yielded valuable lessons: CUNY now understands more about the effectiveness of existing policies and mechanisms for addressing low balances, students’ reasons for leaving college, and the technical limitations of the colleges’ accounting processes. The data-collection process also revealed significant barriers. All of these lessons have led to improvements.
  • In Las Vegas, the College of Southern Nevada successfully piloted block scheduling for its general education courses at three of its sites. The next step is to use the common general education courses within meta majors to create course blocks at other campuses, then scale up the block scheduling using the academic maps.