To Assess the Impact of Real World Learning, We Need to Think About Real World Impacts
We’re in the Customer Service Business
In my previous article about leveraging Real World Learning in college applications, I talked about measuring impact. Understanding impact is crucial not only for RWL data collection and assessment, but also for thinking about how to structure and allocate RWL resources.
To approach this, we should think differently about data and assessment. At Storyboards, we bring both deep knowledge of education and private sector experience to shape our approach. Our decades of experience in both the public and private sectors have led us to a novel conclusion about evaluating the impact of Real World Learning:
We are in the Customer Service Industry.
So, we should collect, analyze, and evaluate data as if we’re in the Customer Service Industry, not only to assess the impact, but also to tell the story of our public schools.
Let me explain:
Those of us who came up as education leaders in the No Child Left Behind era were drilled to think that impact must be measured through data and assessment and evaluation that lead to accountability.
NCLB’s mandatory testing created a mindset that student learning only matters when it’s quantified. The extremely punitive nature of No Child Left Behind completely rewired at least two generations of education leaders to believe that if your metrics didn’t show Adequate Yearly Progress, then you would be objectively labeled as “failing.”
Or, as my Methods professor used to say, “If you can’t measure it with an assessment, is it really learning?”
The very premise of Real World Learning defies the great Spreadsheetification of Education: Students develop the employable skills of empathy, communication, collaboration, proactivity, and executive functioning through internships and client-connected projects.
So, how do we measure the impact of Real World Learning?
The NCLB regime taught us to Design Backward. We need to Design Forward.
Relying solely on quant-based metrics means beginning with the test as the end-product assessment, then designing backward to create the lessons that teach to the test. This results in the “gamification” of education assessment.
Within the Real World Learning framework, this is simply counting the percentage of students who earn a Market Value Asset (MVA). If achieving MVAs is the assessment metric, we will inevitably design our whole RWL program and allocate resources toward earning MVAs.
But, as I talked about in my article on creating impact, Market Value Assets in and of themselves don’t earn students employment opportunities or college admissions with merit-based financial aid. Students have to translate their RWL and other experiences into a cohesive narrative that follows the specific format of the college application.
In this way, Market Value Assets aren’t tokens that students collect in a wallet. When they graduate, they can’t simply “cash in” their MVAs in exchange for jobs, admissions, or competitive financial aid.
Rather, their RWL and other experiences are conveyed in a portfolio that tells the story of why their skills, character, experiences, and personal story make them the perfect fit for their target schools and employers.
This is where we have to Design Forward our assessment of RWL.
If you Design Backward, the wallet and tokens idea makes sense. The data is easy to track, and “success” is “objectively” measurable.
But if you Design Forward, you need to think about how these kids leverage their MVAs and RWL experiences in the actual format and process required by their next post-graduation opportunity.
For your college-bound students, that’s the college application. For your career-bound students, that’s an application and interview process that shows what they can do. In this way, the storytelling behind the MVA is more important than the MVA itself.
In an education landscape dominated by quantitative data and tests, how do we measure the effectiveness of our students’ portfolios? How do we know if our RWL programming helped them find their right fit?
This is where the Customer Service Industry metaphor is extremely helpful.
Market researchers will tell you that, in almost any customer service business (restaurants, department stores, etc.), the most powerful data is:
Customer-reported experience.
How customers feel about their experience is the single most important indicator that they will come back or recommend you to others.
This is why they give you free tacos to fill out the survey. This is why they offer a 50% off coupon to respond to a few questions. The data is so powerful that companies are willing to pay for it.
Data around “feelings” might not, well, feel objective. But that doesn’t mean the impact isn’t real.
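In fact, the customer service industry routinely turns those feelings into trackable numbers. One standard example is the Net Promoter Score, which quantifies the “recommend you to others” question from 0-10 survey responses. Here’s a minimal sketch in Python; the survey question and sample responses are invented for illustration, but the promoter/detractor cutoffs follow the standard NPS convention:

```python
# Net Promoter Score: a standard way to quantify "would you recommend us?"
# Respondents answer on a 0-10 scale; 9-10 are promoters, 0-6 are detractors.
# NPS = (% promoters) - (% detractors), so scores range from -100 to +100.

def net_promoter_score(responses: list[int]) -> float:
    promoters = sum(1 for r in responses if r >= 9)
    detractors = sum(1 for r in responses if r <= 6)
    return 100 * (promoters - detractors) / len(responses)

# Invented sample: 12 families answering "How likely are you to recommend
# this school's RWL program to another family?" (0-10)
sample = [10, 9, 9, 8, 10, 7, 9, 6, 10, 8, 9, 5]
print(f"NPS: {net_promoter_score(sample):+.0f}")  # prints "NPS: +42"
```

The point isn’t the formula itself. It’s that a “feelings” question becomes a single number that leaders can track year over year.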
Largely, customer experience is driven by you delivering actual value in the product or service. More than that, it means that you helped them fix some sort of problem, that they were listened to and treated with respect, and that, at the end of the experience, you helped them solve something that needed solving.
This is something we can assess: Are students and families satisfied with the post-graduation opportunity you helped them earn?
You don’t need to construct complex regression analyses based on the salary potential of similarly situated non-RWL students compared against your RWL students. You don’t need longitudinal, disaggregated AYP data to show that you’re laser-focused on targeted areas of deficiency in math and reading.
We can develop surveys to understand how students feel about where they’re going after they walk across your graduation stage. Whatever you want to know about their experience, you can ask with well-constructed Likert scale questions and space for narrative feedback. All of this is totally legitimate data that can be collected, analyzed, and assessed to evaluate success.
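As a sketch of what that collection and analysis can look like in practice, here’s a small Python example that summarizes one hypothetical Likert item; the question wording, the 5-point scale, and the responses are all invented for illustration:

```python
from statistics import mean

# One hypothetical exit-survey item on a 5-point Likert scale
# (1 = Strongly Disagree ... 5 = Strongly Agree).
QUESTION = ("My Real World Learning experiences helped me land a "
            "post-graduation opportunity I'm excited about.")

# Invented responses from 10 graduating seniors.
responses = [5, 4, 4, 5, 3, 5, 4, 2, 5, 4]

avg = mean(responses)  # central tendency of the ratings
agree_share = sum(r >= 4 for r in responses) / len(responses)  # share who agree

print(QUESTION)
print(f"Average rating: {avg:.1f} / 5")               # 4.1 / 5
print(f"Agree or Strongly Agree: {agree_share:.0%}")  # 80%

# Narrative feedback travels alongside the numbers: quote it in board
# reports and community updates to tell the story behind the scores.
```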
This ties us back to my original article about creating and measuring impact. To fully realize the power of Real World Learning, schools need to develop systemic college and career application support that actually plugs students into their post-graduation opportunities.
Now, we can design our Real World Learning program to create the data and assessment we need not just to measure impact, but to tell our school’s story.
We know that our schools, despite the odds, are doing incredible work. We know that our teachers and administrators are innovative, creative thinkers who are pushing the boundaries of what’s possible in education. But the public and other stakeholders don’t always know that. By getting our students into workplaces and connecting them with clients, the entire community gets to see how incredible our students are, because they are literally working with them.
This is how we use Real World Learning to grow public school advocacy: Kids show what they can do, parents report positive experiences about outcomes, educators show return on investment, and the business and community leaders who are literally working with our students through their Real World Learning experiences can say, yes, these kids are great and the schools are doing a great job.
That’s the story we as educators know. Through careful program design that moves students from school into college and career, along with thoughtful collection and assessment of data, it’s a story that we can tell.