Employment vs. Employability Data — Assessing Current & New Data Needs

Margo Griffith
15 min read · Jul 21, 2021

What’s the difference between employment and employability data? What can employability data tell us about a learner’s job or career prospects? In this information-rich Credentialate Guide we look at the currently available data and what data we could be capturing to give us a better understanding of learner employability, with the ultimate aim of driving graduate employability outcomes.


The Essentials: Employment data vs. employability data

  • Employment vs. employability data — what’s the difference?
    Employment data essentially tells us which career path a learner took and is usually limited to a set period of time after they graduate. Employability data focuses on what skills — including technical, workplace and transferable skills — graduates possess that align to current job markets.
  • Why does employability data matter?
    For learners, it helps them understand the value of the time they spent learning and how it translates into real-world employment. For employers, it helps inform them what skills the learner brings to the job, and how they can obtain tangible, verifiable proof of those skills. And for education providers, it serves as a benchmark of their success or failure.
  • How do we capture more employability data?
    We need more and richer data to connect students with education and work in a meaningful way. Employability data needs to go beyond the current “grading” language and evaluate a graduate’s employability more accurately upon completion of study. It must also be captured with transparency, flexibility and alignment with industry to guide better decision-making.
  • What’s the issue with current employment data capture?
    Not all students participate in providing their post-graduation employment information. Additionally, alternative education options — such as industry certificates and micro-credentials — are not captured within traditional reporting frameworks.
  • Who has the most robust graduate outcome data now?
    HESA and NACE distribute graduate outcome data that is recognised as more comprehensive. However, even these datasets are less than complete and, as useful as they are in aggregate, they are much more useful at an institutional level.
  • How will employability data be used in the future?
    In essence, to drive change. Higher education needs a lot more data to remain competitive. The return on the time and money invested in a four-year degree must be clearly demonstrable for both traditional and non-traditional students to embrace it.
  • How Credentialate provides a new perspective
    Credentialate is the world’s first Credential Evidence Platform. It helps you discover and evidence workplace skills. Credentialate is the only digital badging platform that includes a personalised qualitative and quantitative evidence record verified directly from the digital badge. Institutions can track skills and competencies across one, multiple or all of their courses.

The Full Story: Employment data vs. employability data — what current data tells us and what new data we need

Big data impacts every single business and industry around the world. However, education is in need of, well, an education. We collect a lot of data about educational institutions at the national level. What data should we be paying attention to? What should we be capturing? And what are we missing?

At an institutional level, graduate employment data can tell educators a lot about what is happening right now, and how effective their curriculum may — or may not — be at producing desired outcomes. But for most learners, the primary desired outcome of higher education is getting a job or moving up to a better one. This is often why the focus is on whether or not the graduate is employed in their chosen field of study.

But if the student is not employed, why not? Did they graduate as an employable individual but choose to remain unemployed for any of a variety of reasons, or accept a position in which they are underemployed? Or do they actually lack the skills an employer needs to deem them employable? Alternatively, they may have the skills that make them employable but lack the documentation (or certifications, if you will) to prove those skills beyond the degree or diploma they received.

First, let’s start with a couple of definitions, to be sure we are all on the same page. Then we will look at a small portion of the data we have now and what it shows, and what data we would like to have that might be more useful.

Employment vs. employability data — what’s the difference?

Employment data essentially tells us what, career-wise, a graduate does a few months or, in some cases, years after graduation. Some graduate and learner data goes beyond just average GPA and graduation rates. It is important to look at several different factors when evaluating outcomes — particularly employability and employment data:

  • Are graduates employed full- or part-time… or underemployed?
  • Are they employed in their field of study or a field related to it?
  • How much do graduates earn? Is that commensurate with the investment they made in their education?
  • How many graduates go on to further education? In what fields of study?
  • Are learners and graduates satisfied with the education they received?

This data can be very revealing and is often a primary driver behind funding, but it is also vital to both learners and employers.

Why do we call this employment data? Because there are many types of data out there, under many different names. This data is not where our focus is, but here are a few definitions for you:

  • Education Institutional Performance Data — This data covers a wide variety of data points related to effectiveness (learner outcomes) and efficiency (comparing input to output). This last area is the most neglected and difficult to measure.
  • Learning or Learner Data — This is more outcome-focused, but also tracks things like attendance, test scores, seat time, and survey results. It’s more focused on what happens while the learner is attending the institution, and not what comes afterward.
  • Graduate Outcomes — While used by other institutions, this term directly refers to the HESA survey in the UK that attempts to contact all recipients of higher education qualifications approximately 15 months after graduation. The survey gathers statistics on the employment and study activities of graduates and their subjective opinions on the value of their new qualification.

What most of these boil down to, with the exception of learner data, is employment. Because what does a degree mean if it doesn’t result in employment? What skills does the graduate really have that relate to a job?

But perhaps just as important: why does any of this employment data matter?

Why does employability data matter?

Why do we need all this data? The most commonly stated reason is to measure the practical effectiveness of learning. For example, let’s say a business school had an 80% graduation rate with an average 3.5 GPA. What if none of those students graduated with the skills they need to get a job? How effective was their education?

This matters to the learner, who wants to understand the value of the time spent learning and how that translates to real world employment. It also matters to employers who want to know what skills the learner brings to the job, and want tangible, verifiable proof of those skills.


The other primary reason is funding. As an example, in the United States, learner outcomes are often tied to Federal funding: if an institution fails to meet a certain outcome standard, it may become ineligible for certain Federal programs, can lose accreditation, and, worse, may suffer reputational damage from which it never recovers (see an excellent 2019 article by Times Higher Education on this subject).

One specific US example is the funding awarded to veterans. While there are two types of funding, what matters here are the requirements that apply to any institution, whether a for-profit, a non-profit, or a Title IV accredited university. Graduation rates, after-graduation employment rates (especially for the veteran population) and even how long learners stay in their career field are audited at least annually. If an institution cannot pass this audit, its veteran funding is denied until it can pass an audit and reapply for the program.

Performance-based funding is also being rolled out across the globe, for instance in Australia, the United Kingdom and Canada, to name a few. The principle is the same: schools that perform well get more money, and those that don’t risk losing funding entirely.

Clear examples come from the failure of many for-profit “universities” with questionable business practices in the US. Numerous articles also discuss the failures of performance-based funding, the burden of coercive reporting, and the resulting inequality for students.

In short, graduate outcomes illustrated by institutional data are the benchmark for success or failure. Right now, that data focuses primarily on employment rather than employability. But can we do better?

The answer is, yes, we can.

How do we capture more employability data?

What does “better” look like? The answer is not simply more data, but richer data. If we are talking about employability, then we need richer data related to employability skills, not just industry-specific skills.

For example, a bachelor’s degree in business from one university might not carry as much weight as the same degree from another university. What is the difference? Is the student at one school actually receiving soft-skills learning that differs from what a similar school provides? What are those skills, and how are they measured and documented? This is not easy; at the moment, assessment and evaluation of employability skills is largely subjective. Without a clear framework and well-defined standards, how can these skills be measured universally?

This is the problem at its core: how do we measure and document leadership and listening skills, for example, as part of the above business degree? How do educators measure an individual learner’s aptitudes and document them as part of a transcript?


That is the challenge, but to address the needs of learners and employers, it is one that must be met. Learners need to know and be able to articulate what skills they are getting from a course, and what those skills mean. To do this, we need to align the curriculum to the skills, track the student’s mastering of them, and then find a way to document them. Those skills need to be available for students, potential employers, and others to access.

If you ask learners what they got at the end of their course, they will typically tell you what grade they received. What they should be able to tell us is what skills they’ve learnt and why these skills will help them in their career, something Credentialate delivers through its personalised evidence layer.

And there is other good work being done in this area. EMSI in the United States has introduced Skillabi, an AI-based analysis tool that helps institutions analyse curricula and determine where soft skills are already being taught in the current educational framework. This allows institutions to add courses or parts of courses to a current area of study, creating a holistic, instead of departmental, approach to curriculum development and revision.
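
Tools like Skillabi essentially treat curriculum documents as data that can be mined for skills. As a rough sketch of the underlying idea only (Skillabi’s actual methods are proprietary, and the skill taxonomy and course descriptions below are invented), a minimal keyword-based mapping might look like this:

```python
# Toy illustration of curriculum-to-skills mapping. The taxonomy and
# course catalogue are invented; a real tool would use far richer NLP.
SKILL_KEYWORDS = {
    "communication": ["present", "report", "writing"],
    "teamwork": ["group project", "collaborate", "team"],
    "critical thinking": ["analyse", "evaluate", "case study"],
}

def skills_in_course(description: str) -> set:
    """Return the soft skills whose keywords appear in a course description."""
    text = description.lower()
    return {
        skill
        for skill, keywords in SKILL_KEYWORDS.items()
        if any(keyword in text for keyword in keywords)
    }

catalogue = {
    "BUS101": "Students collaborate on group projects and present findings.",
    "BUS210": "Learners analyse market data and evaluate case studies.",
}

for course, description in catalogue.items():
    print(course, sorted(skills_in_course(description)))
# BUS101 ['communication', 'teamwork']
# BUS210 ['critical thinking']
```

Even a crude map like this makes gaps visible: if no course in a program ever matches “teamwork”, that is a conversation starter for curriculum review.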

EMSI’s entire premise is to connect students with education and work in a meaningful way, and a big part of that involves a credentialing system that goes beyond the current “grading” and evaluation systems that are an integral part of traditional frameworks and the foundation of “employment data”.

What’s the issue with current employment data capture?

Where does this data come from and how is it gathered? First of all, within traditional frameworks, a learning institution is obligated to report graduate outcomes to either an accrediting agency, a government entity, or both.

The format of the reporting depends on the agency requesting the data, but there are typically several points addressed, similar to the questions asked above. So for example, if we were to look at the most recent UK learner outcome data, we would see:

  • 60% of graduates are employed full time
  • A majority of those are employed in their field of study
  • Around 20% are engaged in further study of some kind, usually related to their original field of study
  • Most of those who are unemployed have chosen to be, for example to travel or to care for others
  • A majority (72%) are satisfied with the outcome of their education, but a startling 18% are not using what they learned in their current employment
HESA report: Graduate reflections on skills versus employment activities

You can see all of the current data for yourself at the HESA website. Each country or region has its own graduate outcome reporting or data set, most gathered and reported annually. Most often this data is taken from higher education institutions that offer bachelor’s, master’s and other advanced degrees.

However, as widespread as this data is, it’s incomplete. Participation in certain parts of a graduate questionnaire is voluntary: not everyone answers every question, and some do not participate at all. Nor do all accrediting agencies share their information in a single digestible format.

And there is another issue. What about those smaller, shorter courses that teach equally valuable skills? In a recent press conference, the Lt. Governor of Idaho, Janice McGeachin, announced that Idaho had joined a limited number of other states in including skills-based training graduate outcomes in their reported statistics.

In this case, that reporting included several members of the Northwest Career Colleges, including electrical lineworker programs, beauty school and franchise training, and others. “We realised Idaho was missing out on a huge potential in recognising these career paths,” McGeachin said.

However, in the process they discovered that these schools serve more than just high school graduates seeking a career. “Many of our students are non-traditional students,” Janier Smith, Director of the Restorative Therapy Program at Stephen Senager College, told us. “They choose our program over a traditional one because they want to get done faster and reenter the workforce in a productive position.”

In many jurisdictions, data on skills-based and other non-traditional education programs is now included in employment outcomes. Even where this data has been gathered before, it is not nearly as comprehensive as that obtained from more traditional frameworks. Yet skills-based learning, especially when specifically targeted to a career path, often results in higher employment rates in skilled positions that pay more.

Without including outcomes from skills-based learning and other “micro-programs” that provide learners with certificates and credentials that create employability, we’re missing a lot of data about additional educational options and the skills learners take away from those courses.

Much of even the most robust of this data is called “experimental statistics”. In other words, we are still really in a testing stage, and models are not fully developed. Where is this data documented, and what does that documentation look like?

Who has the most robust graduate outcome data now?

The HESA website mentioned above includes some of the most robust graduate outcome data available, and the National Association of Colleges and Employers (NACE) in the US also provides a great deal of data. This data is typically documented in a variety of ways, including digital charts and graphs or downloadable spreadsheets of raw data so users can do their own analysis.

One factor impacting this data is the student response rate. For example, the NACE data covers 358 schools, of which:

  • 349 reported outcomes for bachelor’s degree completers
  • 64 reported outcomes for associate degree completers
  • 174 reported outcomes for master’s degree completers
  • 98 reported outcomes for doctoral degree recipients

This is likely the largest collection of graduate outcome data in the United States. Even so, it covers only the following:

  • 27.7% of all bachelor’s degree graduates;
  • 16.8% of all master’s degree graduates;
  • 10.7% of all doctoral degree graduates; and
  • 6.4% of all associate degree graduates.

Even the best data sources are less than complete. In the HESA data, 214,280 of the 478,805 individuals surveyed did not respond at all.
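
A quick back-of-the-envelope calculation (assuming one survey per graduate) shows what that non-response means for coverage:

```python
# Implied response rate for the HESA Graduate Outcomes figures quoted above.
surveys_sent = 478_805
no_response = 214_280

responses = surveys_sent - no_response       # 264,525
response_rate = responses / surveys_sent     # ~0.552

print(f"{responses:,} responses ({response_rate:.1%})")
# 264,525 responses (55.2%)
```

In other words, even the UK’s flagship dataset is missing roughly 45% of the graduates it tries to reach.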


Multiple organisations do their best to gather and aggregate this data to better inform both educators and employers. And while employability is a popular concern among learners, most depend on simple article summaries like the “Best Colleges for Your Money” list.

This data, as useful as it is in aggregate, is much more useful at an institutional level. Where does a curriculum need to be more robust? Where are skills missing? What do students need to be employable that they are not getting? Graduate outcomes and performance data provides decision makers with a clearer picture of what is happening now, and how that aligns with future goals.

Are educators really equipping learners for the current job market? If not, why not, and how can those outcomes be improved? Current employment data essentially provides a starting point, a foundation for making data-driven decisions about change.

How will employability data be used in the future?

What it all comes down to is this: higher education needs a lot more data to be competitive going forward. The value of a four-year degree, given its high cost in time and often money, must be clearly demonstrable for both traditional and non-traditional students to embrace it.


Educators and employers alike need a way to learn relevant information about the learner and what skills they actually possess when they complete a course of study. A more robust assessment that doesn’t overly burden educators must become the norm. This is the goal of QACommons and other initiatives, which seek to certify skills and ensure students are employable for their first job and the ever-changing workplace they will face.

Then the data must be documented somewhere, and in such a way that an AI data search can both find it and create a credential from it. In this way the student has a meaningful, verifiable, and secure way to illustrate the skills that make them employable.
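
To make that concrete, here is a minimal sketch of what such a machine-readable credential might look like, loosely modelled on the Open Badges assertion format (the field values and URLs are illustrative assumptions, not a real issuer’s schema):

```python
# A minimal, illustrative skill credential, loosely modelled on the Open
# Badges assertion format. All identifiers and URLs here are invented.
import json

credential = {
    "type": "Assertion",
    "recipient": {"identity": "learner@example.edu", "hashed": False},
    "badge": {
        "name": "Team Leadership",
        "criteria": "Led a four-person capstone project to completion",
        "issuer": "https://example-university.edu",
    },
    "evidence": "https://example-university.edu/evidence/12345",
    "issuedOn": "2021-07-21",
    "verification": {"type": "hosted"},
}

print(json.dumps(credential, indent=2))
```

Because the record is structured rather than free text, a search tool (or an AI agent) can index it by skill, verify it against the issuer, and aggregate it with other credentials.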

What does that future look like? It could mean a combination of courses and micro-credentials “stacked” to form a degree or certificate, essentially a “proof of skills.” Ongoing credentialing and education could become the norm. These “digital badges” could hang on a decentralised yet verifiable framework.

For this to work, we need more data. We know we are starting with what we have now, something educators, learners, and employers all know is not enough. What we don’t know, what many stakeholders are working on, is what data we do need, and how we gather, measure and evaluate it.

It is through this transformation, this shift in focus from employment data to employability data and what that really means, that we will change the language we use, the way we verify and validate learning, and the way we look at a candidate seeking employment.

How does Credentialate provide a new perspective?

Credentialate is a secure, configurable platform that assesses and tracks attainment of competencies and issues micro-credentials to students, backed by personalised evidence at scale. By automatically extracting data from existing platforms and using an organisation’s own assessment rubrics, we can objectively measure awarding criteria and validate the evidence behind them.

By this same method we can automate the assessment, monitoring, promotion and validation of evidence-backed skills. For an institution, we provide the data and insights required to track skills and competencies across courses and entire programs.
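
As an illustrative sketch only (not Credentialate’s actual implementation; the rubric, thresholds and badge name below are hypothetical), rubric-based awarding logic of this kind boils down to checking each criterion against a threshold and attaching the scores to the award as its evidence record:

```python
# Illustrative rubric-based awarding logic. The rubric, thresholds and
# badge are hypothetical, not Credentialate's actual implementation.
from typing import Optional

RUBRIC = {  # criterion -> minimum score out of 5
    "written communication": 3,
    "data analysis": 4,
}

def award_badge(learner: str, scores: dict) -> Optional[dict]:
    """Award an evidence-backed badge only if every rubric threshold is met."""
    if all(scores.get(criterion, 0) >= minimum for criterion, minimum in RUBRIC.items()):
        return {"learner": learner, "badge": "Business Analytics", "evidence": scores}
    return None

print(award_badge("j.doe", {"written communication": 4, "data analysis": 4}))
print(award_badge("a.roe", {"written communication": 4, "data analysis": 3}))  # None
```

The key design point is that the evidence travels with the award: a verifier sees not just that a badge was issued, but the rubric scores that justified it.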

Finally, we have decades of collective experience in educational technology and long-standing ties with global educational powerhouses. These solidify our ability to produce credible digital badges.

Credentialate assesses, monitors, promotes and validates learners’ attainment of evidence-backed skills, supporting the transition from learner to earner. It is a secure, configurable platform that assesses and tracks attainment of competencies and issues micro-credentials to students. If you’d like to learn more About Us and how we can work together, contact us or Schedule a Demo and let’s discuss!

About Credentialate

The world’s first Credential Evidence Platform

Launched in 2019, Credentialate is a Credential Evidence Platform that increases the power and meaning of digital badges and allows educators to analyse performance against learning outcomes like never before. Since its launch, Credentialate has secured partnerships with higher education leaders, including a Group of 8 University. Credentialate was developed by Edalex — an edtech company on a mission to surface learning outcomes, digital assets and the power of individual achievement. Founded in 2016, Edalex brings together the team behind the CODiE award-winning openEQUELLA open source platform that centrally houses teaching and learning, research, media and library content.

Find out more at: edalex.com/credentialate

If you’d like to learn more about Credentialate, we invite you to Learn More or Schedule a Demo.

Originally published at https://www.edalex.com.
