Response to feedback in the SQE1 results workshops
18 May 2022
Our response to queries and feedback raised in the SQE1 results workshops with education and training providers
Issue raised
Candidates raised a range of concerns about Pearson VUE test centres, including:
- Noise
- Water not being provided
- Temperature – some centres were too cold
- Inconsistency in candidate handling, eg some candidates were able to leave early and some were not.
Response
We continue to work with Pearson VUE on issues raised by candidates. Where water is not provided at a test centre, candidates are now able to access water kept in their locker during an unscheduled break, monitored by a member of the test centre team. Clearer guidance has also been given to test centres on how to handle candidates who finish early.
Issue raised
One provider said it would be better if the booking deadline were closer to the assessments, so that providers could help students decide whether they are ready to take them.
Response
Unfortunately, we cannot move the booking deadline closer to the assessments. Administrative activities, such as data checks and the provision of adjustments, must be completed before the exam.
Issue raised
One provider suggested that it would be better if candidates get a larger refund when they cancel closer to the assessments
Response
Cancellation fees have been set based on costs incurred. We will continue to keep this under review.
Issue raised
Candidates reported that:
- Assessment days were too long
- The gap between FLK1 and FLK2 was not long enough.
Response
We will keep the length and timing of FLK1 and FLK2 assessments under review. It is too early to draw any firm conclusions after one sitting. We collect feedback from all candidates which is an important input to the reviews we conduct on the assessments.
Issue raised
Providers reported that the number of assessments in a year is problematic, especially when it is not possible for candidates to move straight from SQE1 to the next available SQE2 sitting. (One provider reported that this is especially a problem for international students if their visas run out before they can sit an assessment.)
Response
We have published information in response to this issue.
Issue raised
Candidates gave feedback about the content of the assessment questions including:
- Some questions were ‘woolly’
- Some questions were not single best answer questions
- Wording of some questions was confusing (eg the use of the words may/might and may/must)
- The assessment questions were not reflective of the published sample questions
- Candidates found the questions in FLK2 more complex and more time consuming to answer.
Candidates found it challenging when questions were randomly distributed across subject areas. They also commented that some questions were clumped together with questions from the same subject area.
Response
Complexity of language/question
Questions in the question bank were all created by writers trained by Kaplan in writing for SQE1.
Training incorporates specific messages about difficulty, reinforced during editing, and emphasises that the challenge posed to candidates is whether they can apply their legal knowledge.
Training also makes it clear that questions must not include elements designed to catch candidates out. Questions are edited rigorously before acceptance into the question bank, having regard to this among numerous other criteria.
Some questions include facts which are not required to support the correct answer but are required to support the distractors. Such information is not a red herring or a trick intended to mislead candidates; it requires them to apply their legal knowledge and filter the facts, a skill which improves discrimination.
Questions go through multiple rounds of review before being admitted to the question bank. Reviewers are always told that questions are not intended to trick candidates but to make sure they can identify and apply legal principles. If questions had included elements intended to trick candidates, those reviewing them, including the members of the Angoff panel, would have identified this.
Complexity of FLK2 questions
Performance on FLK1 was better than on FLK2. As the SQE Independent Reviewer observed, reasons for this difference could include the fact that candidates had less time to prepare for FLK2, as it was taken just three days after FLK1. Candidates also tended to do less well on the more transactional subjects, such as conveyancing and litigation, of which there are more in FLK2 than in FLK1.
Statistical analysis carried out after the assessments and a thorough review and analysis of the questions in FLK1 and FLK2 did not suggest that there was anything in the question design or the standard of the assessments to account for the difference in performance across FLK1 and FLK2. We will continue to monitor and report on performance across the two assessments in future sittings.
Comments regarding ‘More than one answer’
Some candidates responding to the candidate survey suggested that there could be more than one correct answer to a question. That is not the case.
Candidates in SQE1 are not required to select between several correct answers for the most correct answer. Each question in SQE1 has only one correct answer. The questions are all scrutinised in accordance with the Kaplan process, which makes sure this is the case.
Sample questions
Before release, the sample questions were part of a much larger pool of questions which form the question bank from which assessment questions are selected. All follow the same style.
While not all questions are the same length, the sample questions fall within the same parameters as the questions included in the November assessment.
The sample questions were created by a wide variety of question writers, who wrote for the question bank rather than for the sample pack. The sample questions were selected to reflect a wide range of writers and a good spread of legal topics across FLK1 and FLK2.
We understand that providing sample questions cannot replicate the experience of a live assessment. The correct answers are provided with the sample question pack, and identifying the correct answer when the answers are provided is much easier than answering exam questions.
The intention in providing sample questions is to give candidates examples which are illustrative and as representative of the exam as they can be. No sample questions will ever be exactly the same as the questions in an assessment paper.
We will continue to keep the sample questions published on the website under review to make sure they are as reflective as they can be given all of the above.
Distribution of questions
The range of concerns noted by providers shows the variety of candidate preferences. Questions are randomly ordered to address this.
Issue raised
Candidates would like more sample questions and past papers
Response
Whilst there are no immediate plans to publish further sample questions, we will keep this position under review.
Issue raised
Candidates suggest that we should signal which part of the FLK a sample question relates to.
Response
The sample questions are intended to give an example of the content and format of questions in the assessments. Candidates are not provided with this information in the assessments, and it may be misleading to include this information in the sample questions.
Issue raised
One provider suggested that there is an overlap of subject matter in the Assessment Specification (eg leases), making it hard for candidates to know what to learn.
Response
As we explain in the guidance to the Assessment Specification, the specification should be read holistically and candidates should focus their learning on its entirety.
Issue raised
Candidates who failed requested more detailed feedback, for example, which subject areas they answered well and not so well
Response
We are currently reviewing whether this would be possible.
Issue raised
The delay in candidate results was frustrating.
Response
We are sorry for the delay and any additional stress this caused candidates. We have reviewed in detail what happened, and work is underway to reduce the risk of issues on future results days.
Issue raised
Providers gave a range of feedback on the provision of candidate results including:
- Providers would prefer for individual candidate results to be given directly to providers
- Without this it is very difficult to know the real pass rate
- Providers suggested that we should give providers a breakdown of candidate performance by background to help providers support candidates better
- One provider suggested that the publication of pass rates could disincentivise providers from offering courses without entry requirements. This could run contrary to the objective of the SQE to encourage flexible pathways to qualification
- Providers are concerned that candidates may find it hard to identify their primary provider, especially when providers are collaborating or when a candidate uses more than one provider
- One provider suggested that provider pass rates should not include candidates who have self-studied with a manual purchased from a provider
Response
We continue to develop and refine our thinking on the best way to publish accurate candidate data by provider. Feedback and our experience from the November SQE1 assessment has been very helpful. We will continue to talk to providers about this as our work progresses.
Issue raised
One provider suggested that it would be helpful to raise awareness of the blueprint amongst candidates
Response
All candidates should review the Assessment Specification, including the blueprint, in full. It would be helpful if training providers could also draw candidates’ attention to the Assessment Specification and the blueprint.
Issue raised
Candidates would like a list of Pearson VUE test centres where assessments are available before they attempt to book the assessment. This helps with planning, especially for international candidates.
Response
There is an ongoing discussion with Pearson VUE about providing more information on test centre availability for the SQE.
Issue raised
There is inconsistent information on our website about the use of pens and other equipment. Candidates would like clarity.
Response
We continue to review the information on the website to make sure it is clear. The information provided is:
- SQE1: No personal possessions are allowed into the assessments. Candidates will be provided with all necessary equipment, including an online calculator. They may not use any of their own equipment.
- SQE2: No personal possessions are allowed into the assessments. Candidates will be provided with all necessary materials and equipment, including an online calculator, but they must bring their own pen. They may not use any other equipment.
Issue raised
For SQE2, one provider suggested it would be helpful to have more information about the written assessments, eg a list of documents that candidates might be asked to draft
Response
The Assessment Specification provides information on each of the SQE2 written assessments, and the assessment objective and the criteria for each. Providing a specific list would be prescriptive and could encourage candidates to focus on a list that may not be exhaustive.
Issue raised
Some providers were concerned that we plan to change the SQE1 Assessment Specification.
Response
We will conduct an annual review but this will only:
- look at factual inaccuracies or changes in the law
- offer clarification in light of feedback from stakeholders where necessary
- look for any other essential changes.
Issue raised
Some candidates are concerned about capacity for SQE2, particularly those who would find it difficult to travel to other test centres if their first choice is not available.
Response
We had ample capacity for the first SQE2 assessment in April 2022. We continue to work closely with training providers and other stakeholders in the profession to match supply with projected demand for spaces.
Issue raised
Providers would find it useful to have regular meetings with the SRA, Kaplan and other providers to share information and feedback.