Platform updates

Introducing YourQual: a portfolio platform built from the floor of the work

What the platform is, what it does, and why each piece is the shape it is.

Matt Saunders
Founder, YourQual · 14 min read

The first two posts in this series have been careful about positioning. They have argued that the gap between technical compliance and proper readiness is bigger than most providers realise, and that selecting an ePortfolio platform is a judgement call that procurement processes tend to get wrong. Both posts have alluded to YourQual without saying very much about it directly. That has been deliberate, because credibility-focused content earns its credibility by being useful regardless of who is reading. But there is a limit to how long you can write around a thing before the reader reasonably wants to know what the thing actually is.

This post is the answer to that question. I am going to walk through what YourQual is, what it does, and why each piece of it is shaped the way it is. The trade-off is that this post is openly about the platform, which means it has to do more work to be substantive rather than promotional. I have tried to make it earn that substance by going into actual depth on each surface, rather than summarising. The result is long. If you only want the headline, the answer is: YourQual is an apprenticeship and vocational portfolio platform built specifically for UK training providers, designed around the way the work actually happens rather than the way software companies imagine it happens. The rest of this post is the long version.

Why I built it

I spent several years working as an assessor in the apprenticeship sector before starting on YourQual. In that time I used three different portfolio platforms across different employers and provisions. The third was good in some areas and frustrating in others. The other two were the kind of thing that wastes hours of an assessor’s week without anyone quite being able to explain why, because each individual frustration is small enough to absorb but the accumulated drag is enormous.

I am not going to name any of them. It would not be useful and the post would become about that rather than about YourQual. What I will say is that the pattern across all three was the same: they had been built by software companies for software companies’ reasons, then adapted to fit the apprenticeship sector after the fact. The result was platforms where the language did not quite match the work, the workflows assumed a shape of operation that did not exist in any provider I knew, and the features that mattered most to assessors were either missing or hidden behind two or three layers of admin overhead.

The argument that eventually convinced me to start building was a simple one. The sector is small enough that nobody is going to build the right platform unless someone who actually understands the work does it themselves. The big incumbents have too many customers in adjacent markets to focus on UK apprenticeship specifics. The newer entrants are mostly building general-purpose learning management systems with apprenticeship branding. There was a gap for a platform built by someone who had assessed apprentices, prepared for inspections, and been frustrated by the existing options. That gap is what YourQual exists to fill.

How the platform is structured

YourQual is a multi-tenant SaaS, which means every training provider has their own private space inside the platform with their own users, their own learners, their own data. Providers cannot see each other and there is no cross-provider visibility for anyone except me as the platform owner, and even then only for commercial and aggregate purposes rather than learner-level access.

Within each provider there are five user roles. Administrators run the provision and manage staff. Assessors do the assessment work. Learners are the apprentices themselves. IQAs (Internal Quality Assurers) sample evidence and verify the quality of assessor decisions. EQAs (External Quality Assurers) get time-limited access for external verification.

The roles do not map cleanly to permission levels because the work they do is different in kind, not just in seniority. An IQA is not a more senior assessor. An EQA is not a more senior IQA. Each has its own workflow and its own surfaces, and the platform reflects that rather than trying to flatten it into a single role hierarchy.
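
For readers who think in code, here is a rough sketch of that role model expressed as distinct shapes rather than a hierarchy. It is illustrative only; the names and fields are not YourQual's actual schema.

```typescript
// Illustrative only: each role carries its own surfaces and access shape,
// rather than sitting at a level in a single permission hierarchy.
type Role = "administrator" | "assessor" | "learner" | "iqa" | "eqa";

interface RoleProfile {
  role: Role;
  // The surfaces this role works in, not a seniority level.
  surfaces: string[];
  // EQA access is time-limited; other roles are ongoing.
  accessWindow?: { from: Date; until: Date };
}

const eqa: RoleProfile = {
  role: "eqa",
  surfaces: ["external-verification", "audit-trail"],
  accessWindow: { from: new Date("2025-01-06"), until: new Date("2025-01-31") },
};
```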

The assessor’s workflow

The assessor is the most active role in the platform because they are doing the most work. Everything else either supports the assessor’s work or audits it. So the design of the assessor experience is where most of the platform’s quality lives.

The starting point is the caseload. Every assessor has a list of the learners assigned to them. The list is not just a list of names. Each learner shows their qualification, their employer, their current portfolio progress as a percentage, their off-the-job hours against target, their assigned IQA if any, and a continuously computed RAG status that reflects whether their progress is on track relative to their expected timeline. The RAG status is the thing that catches most providers’ attention in demos because it is the answer to a question most platforms cannot answer at all: which of my learners are quietly drifting toward trouble. The RAG is recalculated as evidence is captured and time passes, so it reflects current state rather than a stale judgement from the last review.
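
The idea behind the RAG calculation is simple enough to sketch. This is an illustration of how a status like that can be derived, not the platform's actual code, and the thresholds are invented for the example.

```typescript
// Illustrative sketch: RAG derived from actual portfolio progress versus the
// progress expected at this point in the learner's planned timeline.
type Rag = "red" | "amber" | "green";

function ragStatus(
  actualProgress: number, // 0..1, verified portfolio progress
  startDate: Date,
  plannedEndDate: Date,
  today: Date = new Date()
): Rag {
  const total = plannedEndDate.getTime() - startDate.getTime();
  const elapsed = Math.min(Math.max(today.getTime() - startDate.getTime(), 0), total);
  const expectedProgress = total > 0 ? elapsed / total : 1;

  // Example thresholds only: how far behind the expected line the learner is.
  const shortfall = expectedProgress - actualProgress;
  if (shortfall > 0.15) return "red";
  if (shortfall > 0.05) return "amber";
  return "green";
}
```

Because the inputs are recomputed whenever evidence is captured or time passes, the status stays current rather than reflecting the last review.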

Observations are the most common evidence type for most apprenticeship standards. YourQual’s observation tool is structured around duties rather than around KSBs directly. The assessor selects which duties they are observing in this session, and the platform auto-maps those duties to the relevant KSBs based on the standard. This is a different shape from most platforms, which ask the assessor to map evidence to KSBs one at a time; that is slow, and it causes assessors to under-map, because mapping is friction. Duty-based observation with auto-mapping inverts that relationship. The assessor describes what is happening in front of them in the language they naturally use, and the KSB mapping is a side effect.
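
A rough sketch of the auto-mapping idea, with made-up duty and KSB identifiers rather than anything from a real standard:

```typescript
// Illustrative sketch of duty-based observation: the assessor picks duties,
// and KSB coverage is derived from a per-standard lookup table.
const dutyToKsbs: Record<string, string[]> = {
  "duty-3-install-window-frames": ["S4", "S7", "K12", "B2"],
  "duty-5-survey-and-measure": ["S1", "K3", "K4"],
};

function ksbsForObservation(observedDuties: string[]): string[] {
  const ksbs = new Set<string>();
  for (const duty of observedDuties) {
    for (const ksb of dutyToKsbs[duty] ?? []) ksbs.add(ksb);
  }
  return [...ksbs];
}

// Observing two duties yields the union of their mapped KSBs.
console.log(ksbsForObservation(["duty-3-install-window-frames", "duty-5-survey-and-measure"]));
```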

The Q&A tool is the second main evidence type. It is a recorded verbal knowledge discussion between the assessor and the learner, with timestamps marking which KSBs are being addressed at which moments. The result is a piece of evidence that can be reviewed later by the IQA without the IQA needing to be present at the original discussion. Knowledge discussions captured this way are richer than written knowledge responses for many standards, because the back-and-forth of an actual conversation tends to surface depth of understanding that a written answer can hide.
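
The evidence this produces is essentially a recording plus a set of timestamped KSB markers. Sketched as a data shape, with field names that are assumptions rather than the platform's schema:

```typescript
// Illustrative data shape for a recorded Q&A session: each marker records
// which KSBs are being addressed at which point in the recording.
interface QaMarker {
  atSeconds: number;  // offset into the recording
  ksbRefs: string[];  // e.g. ["K7", "S3"]
  note?: string;
}

interface QaSession {
  learnerId: string;
  assessorId: string;
  recordingUrl: string;
  markers: QaMarker[];
}
```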

Witness testimony is the third. When an apprentice does something at work that needs employer or workplace mentor verification, the assessor sends a testimony request via email to the witness. The witness clicks a link, reviews the description, signs in their browser, and submits. No login is required for the witness, which removes the single biggest source of friction in collecting employer evidence. The signed testimony is then attached to the learner’s portfolio with timestamp and signatory metadata captured.
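
One common way to build a no-login flow like this is to make the emailed link itself the credential, using a signed token. The sketch below illustrates the pattern; it is not a description of YourQual's implementation.

```typescript
// Illustrative sketch of a no-login witness request: the emailed link carries
// a signed token, so the link itself authenticates the witness for one request.
import { createHmac } from "node:crypto";

const SECRET = process.env.TESTIMONY_SECRET ?? "dev-only-secret";

function createTestimonyLink(requestId: string, witnessEmail: string): string {
  const signature = createHmac("sha256", SECRET)
    .update(`${requestId}:${witnessEmail}`)
    .digest("hex");
  // Domain is a placeholder for the example.
  return `https://example.invalid/testimony/${requestId}?sig=${signature}`;
}

function verifyTestimonyLink(requestId: string, witnessEmail: string, sig: string): boolean {
  const expected = createHmac("sha256", SECRET)
    .update(`${requestId}:${witnessEmail}`)
    .digest("hex");
  return expected === sig;
}
```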

Knowledge marking is a separate surface where assessors review written knowledge responses against KSBs. The shape of this is mostly conventional, but the integration with the rest of the portfolio means that a knowledge response and a Q&A discussion can be mapped to the same KSB without duplicating evidence, and the IQA can see both together when sampling.

Progress reviews use a form builder so each provider can configure the shape of their review template. Sections are configurable, question types vary (text, scale, multi-select), and the three-party signoff between learner, assessor, and employer is structurally enforced. Each party signs in their own session, with timestamp and signature image captured, and the review only counts as complete once all three signatures are present. This sounds bureaucratic written out, but in practice the experience for the employer is the same as DocuSign: an email, a link, a signature.
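
Structurally enforced means the completeness check lives in the data model rather than in a process document. A minimal sketch, with assumed field names:

```typescript
// Illustrative sketch: a review is only complete once learner, assessor,
// and employer signatures are all present.
interface Signature {
  signedAt: Date;
  signatureImageUrl: string;
}

interface ProgressReview {
  learnerSignature?: Signature;
  assessorSignature?: Signature;
  employerSignature?: Signature;
}

function isReviewComplete(review: ProgressReview): boolean {
  return Boolean(
    review.learnerSignature &&
    review.assessorSignature &&
    review.employerSignature
  );
}
```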

The assessor’s day in the platform is built around moving between these surfaces in the natural rhythm of the work. Observe in the morning, run a Q&A session in the afternoon, knock off a couple of testimony requests, review a progress review that came back from an employer, check the caseload at the end of the day to see who needs attention tomorrow. The platform does not try to dictate that rhythm; it tries to stay out of its way.

The learner experience

Apprentices have their own portal with their own surfaces. The platform was built with the explicit goal of being something apprentices actually want to use, because platforms that apprentices avoid produce thin evidence and frustrate everyone involved.

The portfolio view shows the learner their progress against the KSBs of their standard, with verified and unverified evidence distinguished visually. They can see which KSBs have been observed, which have knowledge responses, which have Q&A coverage, and what the IQA has said about each. The progress percentage is computed from verified stars across all KSBs, which means it goes up as the IQA confirms the assessor’s judgement rather than just as evidence is added. This nudges learners and assessors to push evidence through to verification rather than letting it pile up in an unverified state.
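
The calculation itself is simple; the point is what counts towards it. A sketch, assuming a per-KSB verification flag rather than the platform's actual data model:

```typescript
// Illustrative calculation: only evidence the IQA has verified counts
// towards the learner's progress percentage.
interface KsbEvidence {
  ksbRef: string;
  verified: boolean;
}

function verifiedProgressPercent(allKsbRefs: string[], evidence: KsbEvidence[]): number {
  const verifiedKsbs = new Set(
    evidence.filter((e) => e.verified).map((e) => e.ksbRef)
  );
  const covered = allKsbRefs.filter((ref) => verifiedKsbs.has(ref)).length;
  return allKsbRefs.length === 0 ? 0 : Math.round((covered / allKsbRefs.length) * 100);
}
```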

OTJ logging is designed to be fast. The learner adds hours from any device with a date, a duration, and a short description of what was learned. Progress against target for the current week and month is visible at the top of the OTJ page, so the learner can self-correct if they are drifting behind. When the learner falls more than a week behind target, the system surfaces this on the assessor’s caseload so the conversation can happen before the next progress review. OTJ underdelivery is the single most common reason providers get downgraded at inspection, and most of the time it comes from quiet drift rather than active resistance. Catching drift early is more effective than addressing it after a milestone.
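
The drift check behind that caseload flag is not complicated. A sketch, with the one-week threshold written in as an example rather than as the platform's exact rule:

```typescript
// Illustrative drift check: flag the learner once their logged off-the-job
// hours fall more than roughly a week's worth behind the pro-rata target.
function otjDrift(
  loggedHours: number,
  totalTargetHours: number,
  weeksElapsed: number,
  totalWeeks: number
): { behindByHours: number; flagged: boolean } {
  const hoursPerWeek = totalTargetHours / totalWeeks;
  const expectedByNow = hoursPerWeek * weeksElapsed;
  const behindByHours = Math.max(0, expectedByNow - loggedHours);
  return { behindByHours, flagged: behindByHours > hoursPerWeek };
}
```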

Evidence upload from a phone is supported across every evidence type that needs a file. An apprentice on site can take a photograph of a window they have just installed, upload it directly to the platform from their phone, attach a short description, and have it appear in their assessor’s evidence inbox within seconds. This is the kind of thing that sounds obvious but that many platforms get wrong, either by requiring a desktop upload or by making the mobile experience so painful that nobody uses it.

The knowledge module gives learners a place to engage with the knowledge component of their standard. It is deliberately scoped to portfolio knowledge questions and not to EPA preparation. EPA preparation is a different product space and one I am working on separately under a different platform name, because trying to bundle EPA prep into a portfolio platform produces a worse version of both.

Targets are short tasks that an assessor can set for a learner, with a due date and a description. They appear on the learner’s dashboard and the assessor gets notified when they are completed. The target system is light, deliberately so, because heavy task management inside a portfolio platform tends to duplicate what providers already do in their CRM or project tracker.

IQA and quality assurance

IQA is where most platforms reveal whether they take quality assurance seriously or whether they treat it as a checkbox. YourQual treats it as a primary workflow.

The IQA’s surface is structured around sampling rather than around browsing. The IQA can configure a sampling plan that specifies what percentage of each assessor’s caseload should be sampled and at what cadence. The plan generates a queue of evidence to review, weighted by recency and by which assessors have been sampled least recently. Sampling actions are first-class objects in the platform: when the IQA reviews a piece of evidence and finds something that does not meet standard, they raise an action with a description, a severity, and an expected resolution. The action is tracked through to closure with timestamps and accountability. The assessor sees the action on their caseload until they have addressed it.
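
The queue ordering is the interesting part. Here is a sketch of one way to weight it; the weights are invented for the example rather than the platform's actual ones.

```typescript
// Illustrative sampling queue: newer evidence scores higher, and so do
// assessors who have gone longest without being sampled.
interface SampleCandidate {
  evidenceId: string;
  assessorId: string;
  evidenceAgeDays: number;         // days since the evidence was captured
  assessorLastSampledDays: number; // days since this assessor was last sampled
}

function samplingQueue(candidates: SampleCandidate[]): SampleCandidate[] {
  const score = (c: SampleCandidate) =>
    1 / (1 + c.evidenceAgeDays) + c.assessorLastSampledDays / 30;
  return [...candidates].sort((a, b) => score(b) - score(a));
}
```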

Milestone reports give the IQA a periodic view of where each assessor’s caseload stands, which assessors have outstanding actions, which learners are at risk, and what the trend has been across the last quarter. The reports are generated automatically from the underlying data rather than assembled manually, which means the IQA spends their time on the judgement work rather than on report production.

Assessor monitoring is a separate surface that tracks when each assessor was last sampled, how their evidence quality has trended, and whether they have any unresolved actions. This is the surface that supports the IQA’s responsibility for the overall quality of assessment across the provision, as distinct from the per-evidence sampling work.

EQAs get their own time-limited access for external verification work. They can see the portfolios they need to see, with the IQA sampling history and the audit trail, without having ongoing access after the verification window closes.

The audit pack

If the assessor workflow is the heart of the platform, the audit pack is the proof.

When an inspector or auditor asks to see a learner’s evidence, YourQual produces a complete audit pack as a ZIP file in under a minute. The pack is organised by section: learner profile and qualification details, off-the-job hours log, knowledge evidence, observations and skills evidence, witness testimonies, progress reviews, IQA sampling history. Each section is a folder, each evidence item a file or set of files, and a manifest CSV at the top of each section lists what is inside.

Short courses, NVQs, and apprenticeships each generate the right pack shape automatically. Short course audit packs skip the sections that do not apply to short courses (no KSB structure, no progress reviews, no IQA sampling). NVQ packs reflect the per-unit evidence structure. Apprenticeship packs include the KSB matrix and the duty-based observation history. The platform knows the qualification type and generates accordingly, so the provider does not need to remember which sections apply to which learner.
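
In practice this is a mapping from qualification type to pack sections. A sketch, with section names taken from the description above and the exact mapping assumed for the example:

```typescript
// Illustrative sketch of qualification-aware pack generation: the pack
// includes only the sections that apply to the learner's qualification type.
type QualificationType = "apprenticeship" | "nvq" | "short-course";

const sectionsByType: Record<QualificationType, string[]> = {
  apprenticeship: [
    "learner-profile", "otj-hours-log", "knowledge-evidence",
    "observations-and-skills", "witness-testimonies",
    "progress-reviews", "iqa-sampling-history",
  ],
  nvq: [
    "learner-profile", "per-unit-evidence", "knowledge-evidence",
    "witness-testimonies", "iqa-sampling-history",
  ],
  // No KSB structure, no progress reviews, no IQA sampling for short courses.
  "short-course": ["learner-profile", "knowledge-evidence", "witness-testimonies"],
};

function auditPackSections(qual: QualificationType): string[] {
  return sectionsByType[qual];
}
```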

The reason this matters is the timing. A continuously assembled audit pack is the difference between an inspection that you can prepare for in twenty minutes and an inspection that requires three weeks of reassembly work. The pack is not generated from scratch at inspection time. It is generated from data that has been continuously captured throughout the learner’s journey. Producing the pack is just packaging.

The commercial model

YourQual uses a credit-based commercial model rather than per-seat pricing. Each new learner enrolment costs one credit. Credits are purchased by the provider in advance and held as a pool, and the price per credit is set per provider based on volume and contract shape. There is no charge for additional staff users, additional administrators, additional IQAs, or any of the other roles. The cost is per learner, which is the unit that actually generates value for the provider.

The reason for this model is straightforward. Per-seat pricing punishes growth, because it makes adding staff feel expensive. It also punishes the kind of cross-functional working that good provisions tend to need, where an admin occasionally helps with IQA sampling or an IQA occasionally covers an assessor’s caseload. Credit-based pricing aligns the platform’s cost with the provider’s actual revenue model, which is per learner.

Credits also create a clean commercial conversation when volumes change. A provider that grows their cohort buys more credits. A provider that has a quiet quarter does not feel they are paying for capacity they are not using. Pricing is transparent in the sense that both parties can talk about it in real numbers rather than vague platitudes about scale, although the specific number per credit is a conversation rather than a published rate card because volume genuinely matters and a single price would either be unfair to small providers or uncompetitive for large ones.

Other things worth naming

Beyond the headline workflows there are a few smaller pieces that fill out the platform.

Broadcasts let an administrator or assessor send announcements to a group of learners with multi-file attachments. They are used for everything from sector update bulletins to qualification reform guidance. The platform tracks who has opened each broadcast and who has not, which is useful for compliance evidence around employer engagement and learner communication.

CPD assignment lets an administrator assign continuing professional development tasks to staff users, with completion tracking. It is a small surface, but it matters because Ofsted will ask about staff CPD at any serious inspection, and providers benefit from having a clean record rather than a folder of certificates.

Safeguarding reports are first-class objects with appropriate confidentiality controls. Designated safeguarding leads are gated separately from administrators, and the platform supports both reporting and incident management workflows. Cross-platform safeguarding metrics surface to me as the platform owner so I can spot worrying trends across the customer base, but provider-level data stays within the provider.

Storage and email metrics are tracked continuously so any provider can see their usage trends. The platform itself runs on infrastructure that scales with usage rather than charging by seat, so heavy users do not pay more for the underlying capacity, only for the credits they buy.

The platform sends transactional emails through a properly configured email service rather than through the database, which means deliverability is high and the email log is auditable. Every email the platform sends is tracked with timestamps and delivery confirmation.

What is coming next

The platform is in active development. New features are added based on a combination of customer feedback, my own continued assessment work (which keeps surfacing things that should be possible but are not yet), and considered judgement about what the sector needs that nobody else is building. The next major piece of work in the pipeline is a planned visit feature for off-site observation logistics. Beyond that, automated reminder emails are coming, followed by light and dark mode for accessibility, and module-level gating for providers who want to subscribe to only part of the platform.

Updates ship continuously and are announced in this insights stream under the Platform Updates category. Anyone using the platform will see new features appear in their environment without needing to do anything, which is the advantage of a SaaS that does not version-lock its customers.

Where to go from here

The honest version of the closing pitch is this. YourQual is built for UK training providers running apprenticeships, NVQs, and short courses. It is built by someone who has done the work and who keeps doing the work. It is priced per learner rather than per seat, which makes the commercial conversation cleaner than it usually is in this sector. The feature set is broad enough to cover the full assessment lifecycle and deep enough that the features which matter most are properly engineered rather than glossed over.

If any of that is interesting and you want to see what it looks like in practice, the demo is thirty minutes and there’s no deck.

Matt Saunders
Founder, YourQual

Matt spent a decade in UK training delivery before building YourQual. He writes about the sector when he isn't shipping the platform.
