What’s going to be on the UFE – 2013 Edition

I have to admit that this is my favourite post of the year to write. I love looking through the UFE report released approximately every May and reading the Board of Evaluators (BOE) comments. You’ll see a lot of these as well when you’re taking up your simulations and using the UFE Report as your answer sheet. The BOE comments can be somewhat harsh, which often gives me a chuckle.

I clearly need to get a life. Until then, here’s the 2013 edition.

A reminder that this should be read in conjunction with the 2012 edition, since rumour has it that the UFE is on a two-year cycle. I avoid getting too specific so as not to give away cases that you’re going to write soon.

Another reminder, I have no inside knowledge of what’s going into this year’s UFE. I just believe that the UFE reports are the best guidance we have as to what to watch for.

Overall, this year’s UFE report seems to be a pretty positive one compared to some previous years, indicating that candidates are performing well and steadily. This is, of course, good news. Now, on to the useful stuff.

Examining the 2012 Report

In one of the most strongly worded warnings I’ve seen, the BOE cautions this year’s writers on Page 7:

The Board would like to caution candidates about a dangerous exam-writing strategy the Board believes is occurring. It appears as though some candidates are assuming there is a specific number of issues that the Board is looking for on each indicator. As a result, candidates appear to be repeatedly choosing to address only a few issues on each indicator even when numerous issues are outlined in the simulation. By choosing to limit the breadth of their discussions to a particular number of issues, candidates are risking not being able to demonstrate their competence in a specific area. Candidates should attempt to discuss all the issues they deem significant in the simulations, and not limit their analysis because they think they have reached the minimum number of issues the Board is looking for.

In 2010, when I wrote the UFE, the marking guides that are now very prevalent were new. In years past, candidates had to estimate which and how many issues were required from the wording in the UFE reports, which was of course more challenging but also built up your ranking and decision-making skills on the UFE. It was part of the process. I think these marking guides are a factor in this warning: candidates are playing the numbers game. Don’t do that this year. I strongly suspect that the BOE will come down on this heavily. Always rank and write according to the issue, and don’t (necessarily) stop just because you’ve written the top three when a fourth might be equally valid.

On pages 9 and 10, the BOE discusses Assurance at length. The indication is that the UFE will continue to place you in unusual Assurance roles and ask you to ‘make the leap’ and think about how your role can add value to the client. These indicators seem to be more undirected.

In 2012, candidates also had difficulty with Audit Planning; the main issue was identifying where in the audit they were positioned. Remember to consider whether you are deciding to accept an engagement or whether you have already accepted it. The conversation can be totally different depending on that factor.

On Page 10, the BOE identifies some of the PMR weaknesses they saw:

Candidates who were unable to demonstrate their competence in this area typically did not provide in-depth discussions of the issues. Generally, they did not support their discussions with relevant case facts and Handbook guidance, but instead jumped to conclusions. Some candidates also performed poorly as a result of not addressing a sufficient number of issues, as mentioned above in the discussion of overall deficiencies in candidates’ performance.

Case facts, friends. Stating criteria alone does not add value and therefore does not score points. This was also mentioned as an overall weakness in 2012.

Next up: not-for-profit organizations were a weakness in 2012. On page 10:

Candidates did not seem to be familiar with the standards applicable to not-for-profit organizations. They did not address some of the accounting issues central to not-for-profit organizations, such as how to recognize donated goods and services, as frequently as the Board had hoped.

Don’t forget that there is a large not-for-profit and public sector out there, and both of these areas have been tested before. Although most of the standards are similar to those for private enterprises, they’ll probably test the differences, so don’t be afraid of diving into the Handbook to look them up.

Regarding Tax, performance seems to be pretty stable. Candidates are focusing more time and effort on the simpler material they know rather than the complex issues, as noted by the BOE on Page 11:

In general, candidates seemed to struggle with the more complex tax concepts tested, and provided extensive discussions on the basic concepts they were comfortable with (such as interest deductibility or RRSP withdrawals). The Board reminds candidates that ranking of issues remains an important skill, and as a result, they should not avoid more complex issues if they are significant to the simulation.

This is pretty standard. There will probably continue to be one or two simple indicators and another one that’s complex. It might serve you well to tackle the complex stuff if you can find the right section of the Tax Act to refer to quickly.

In MDM, the BOE had suggestions for candidates on Page 12:

Before deciding what type of analysis to perform, candidates need to reflect upon what their analysis needs to accomplish quantitatively and what would be useful to the client.

Along with this, the BOE encourages candidates to take the analysis all the way and offer recommendations where warranted.

And finally, the often-feared PQ indicator was tested more in 2012 than previously, which signals that the BOE wishes to continue testing the more undirected material heavily. Candidates seem to be recognizing the issues that are more commonly looked for, such as fraud, but continue to miss the “big picture integration” that the BOE wants every year.

The BOE leaves us with an interesting comment at the very end, on Page 14.

The Board continues to emphasize the importance of being able to identify and appropriately address underlying issues on the UFE. These analytical skills are critical for a chartered accountant, and will continue to play an important part not only in the Level 1 assessment on the UFE, but also in the assessment of competence at Levels 2 and 3.

I find it interesting because it makes sense that the PQ affects Level 1 (the overall score), but they also seem to be indicating it might make a difference at Levels 2 and 3, which could mean that regular indicators might become more undirected, similar to what’s been mentioned above in Assurance. Keep your eye out for this.

What they’re going to test on the UFE this year – Part 2

As discussed yesterday, hints as to which topics might come up on the 2012 UFE are available in the comments on candidate performance of the last UFE Report. Today we’ll review the 2011 UFE Report.

Why the 2010 report may contain more hints than the 2011

The UFE is marked in the second half of October and the UFE report is released in May. About a month later, in June, the provincial institutes/ordre already have the draft UFE available for review and provide input. I don’t have any inside knowledge in this area (I’d welcome some if anybody does!), but I suspect (and I’ve heard this from others) that there may not be enough time to grade the exam, compile the results, and integrate the weaknesses into the next UFE, which is why there may be a two-year delay.

That said, there is obviously still the possibility, especially for giant, glaring issues.

Examining the 2011 UFE Report

Let’s see what the 2011 UFE Report has in store. The Executive Summary on Page 6 offers this as its first caution.

The Board found evidence of candidates including large sections from the Handbook in their responses but failing to apply the guidance to the case facts. Candidates are reminded that simple repetition of technical rules does not demonstrate competence. On the other hand, some candidates used the Handbook to strengthen their discussions by including only the relevant guidance in their responses and then applying case facts to each and every Handbook criterion before deciding on an appropriate accounting treatment.

This is certainly the first time I’ve seen this warning, and it appears that since copying and pasting from the Handbook was allowed, candidates have gone overboard and forgotten that the key is to always link your criteria to case facts. Please don’t read this as a suggestion not to use the Handbook. The Handbook is an amazing resource available to you, but you’ve got to know how to use it properly by linking to case facts concisely.

Continuing on page 6, the Board sympathized with 2011 candidates due to the numerous changes in accounting standards over the past two years.

The Board designed the simulations to avoid any confusion as to what set of standards to apply by explicitly stating the reporting context. However, at times it was difficult to assess the technical knowledge of candidates and determine whether they were aware of the new accounting and auditing principles because their use of old terminology made the discussions confusing and hard to follow. The Board is sympathetic due to the recent volume of change in accounting and auditing standards, and simply encourages candidates to continue to try to stay current.

I expect that the Board may not be as sympathetic in future years, but I also suspect that students coming through the pipeline in the next few years will never have heard the old terminology, and this will cease to be a problem. This is an example of something that definitely carries over to the 2012 UFE.

Page 7 begins with a compliment.

Candidates’ performance in taxation improved this year. This was noted as a key detractor on the 2010 examination, on which the Board felt that candidates were avoiding the taxation issues. Candidates attempted to address the three primary indicators in taxation on the 2011 UFE. They continue to struggle with the application of the relevant tax rules to case facts, but the Board is encouraged by this first step towards strengthening performance in this area.

I mention this because it seems that the Board is looking for continued improvement in this area and that this is only the “first step”. For this reason, and the two-year delay mentioned previously, I’d be brushing up on taxation issues if that were a weakness of mine (which it is…).

Another strong warning from the Board comes as we continue on page 7, regarding non-directed issues.

…candidates did not perform well in this area on the 2011 UFE. In addition, they did not perform well on non-directed indicators in specific competency areas. … Candidates are reminded that they will not be specifically directed to all of the issues that the Board considers to be mission critical. Candidates need to take the time to read the simulation carefully; understand the situation, their role, and the needs of their client; and address all the significant issues, whether directed or not.

I don’t want to give too much away about where this comment comes from, but after you’ve finished writing all your 2011 simulations it might be worth returning to this comment in your study plan. It’s also difficult to give advice on this issue because it is, by nature, non-directed. I’ll devote a whole post to non-directed indicators/PQs, but I found that I got better at spotting these as my study plan progressed. I’ll also warn that I’ve seen people invent these indicators where they didn’t really exist, which is a waste of time, so you can go wrong in either direction here.

Another pretty interesting warning comes at the end of Page 7 regarding quants.

The Board would like to caution candidates on what it sees as a downward trend in candidates’ performance on indicators that require quantitative analysis, despite most of these indicators being directed. Candidates seemed to struggle more in Management Decision-Making and Finance than in previous years, and this was partially due to their difficulty in performing meaningful quantitative analysis. The Board wonders whether, in this time of adapting to new accounting and audit rules, candidates have directed less effort to the other competency areas.

I can only assume that if the Board is wondering about such things, it will be throwing in some pretty heavy quantitative material in order to satisfy its curiosity.

On Page 8.

Candidates still appeared to spend more time on issues they understood and were comfortable discussing.

The issue of candidates avoiding complex issues is mentioned again. My advice from the 2010 UFE report remains.

Page 9.

As in prior years, candidates sometimes struggled to provide valid and relevant procedures given a situation. … Candidates are reminded to focus on significant weaknesses and to make sure their analyses are consistent, from the identification of the issue and the discussion of the implications to the recommendation for improvement. There sometimes seems to be a disconnect between the risks candidates identify and either their analyses or their recommendations.

Procedures continue to be something that can be improved and my advice from the 2010 UFE report remains the same.

On page 10 of the report, there is a mention of revenue recognition weaknesses demonstrated by many candidates. Although this is a topic that is always tested, it is possible that more focus will be placed on it in the near term.

Candidates are reminded of the need to provide depth of analysis when addressing accounting issues. To demonstrate competence, candidates need to show their understanding of what the issue is, explain why it is an issue, and explain how the issue should be addressed.

You might remember that in 2010, the Board mentioned taxation as a serious weakness. The 2011 candidates (possibly because the Board warned about it in 2010) did better with taxation.

Although the level of taxation knowledge displayed by candidates in the 2011 UFE was stronger than in the previous year’s examination, there is still room for improvement.

You can expect that this will continue to be heavily tested.

Continuing into MDM, there were some areas of weakness mentioned.

There were three opportunities to demonstrate competence in Management Decision-Making on the 2011 UFE. The Board was surprised to see that candidates performed poorly in this competency area. … While most candidates were able to identify the benefits and risks of the offer from a qualitative perspective, they struggled to perform useful quantitative analyses.

And in the area of Finance.

While most candidates were able to calculate the appropriate ratios, they had a difficult time explaining the ratios and their underlying meanings. … Weak discussions were generally a result of candidates either not fully understanding how to calculate working capital or providing only a one sided explanation of the impact.

As mentioned earlier, there was some weakness in quants in 2011 and weak discussion of the ‘behind the numbers’ side of Finance. It might be a good idea to review your ratios and what they mean, and, as mentioned previously, expect more quants.

Sorry for the long posts, I’ll keep it shorter in the future!

What other things do you think we might see on the UFE this year?
