What’s going to be on the UFE – 2013 Edition

I have to admit that this is my favourite post of the year to write. I love looking through the UFE report released approximately every May and reading the Board of Evaluators (BOE) comments. You’ll see a lot of these as well when you’re taking up your simulations and using the UFE Report as your answer sheet. The BOE comments can be somewhat harsh, which often gives me a chuckle.

I clearly need to get a life. Until then, here’s the 2013 edition.

A reminder that this should be read in conjunction with the 2012 edition, since rumour has it that the UFE is on a two-year cycle. I avoid getting too specific so as not to give away cases that you’re going to write soon.

Another reminder, I have no inside knowledge of what’s going into this year’s UFE. I just believe that the UFE reports are the best guidance we have as to what to watch for.

Overall, this year’s UFE report seems to be a pretty positive one compared to some previous years, indicating that candidates are performing well and steadily. This is, of course, good news. Now, on to the useful stuff.

Examining the 2012 Report

In one of the most strongly worded warnings I’ve seen, the BOE cautions this year’s writers on Page 7:

The Board would like to caution candidates about a dangerous exam-writing strategy the Board believes is occurring. It appears as though some candidates are assuming there is a specific number of issues that the Board is looking for on each indicator. As a result, candidates appear to be repeatedly choosing to address only a few issues on each indicator even when numerous issues are outlined in the simulation. By choosing to limit the breadth of their discussions to a particular number of issues, candidates are risking not being able to demonstrate their competence in a specific area. Candidates should attempt to discuss all the issues they deem significant in the simulations, and not limit their analysis because they think they have reached the minimum number of issues the Board is looking for.

In 2010, when I wrote the UFE, the marking guides that are now very prevalent were new. In years past, candidates had to estimate which and how many issues were required from the wording in the UFE reports, which was of course more challenging but also built up your ranking and decision-making skills on the UFE. It was part of the process. I think that these marking guides are a factor in this warning: candidates are playing the numbers game. Don’t do that this year. I strongly suspect that the BOE will come down on this heavily this year. Always rank and write according to the issue, and don’t (necessarily) stop just because you’ve written the top three when a fourth might be equally valid.

On pages 9 and 10, the BOE discusses Assurance at length. The indication seems to be that the UFE will continue to place you in unusual Assurance roles and ask you to ‘make the leap’ and think about how your role can add value to the client. These seem to be more undirected.

In 2012, candidates also had difficulty with Audit Planning; the main issue was where in the audit they were locating themselves. Remember to consider whether you are deciding to accept an engagement or whether you have already accepted one. The conversation can be totally different depending on that factor.

On Page 10, the BOE identifies some of the PMR weaknesses they saw:

Candidates who were unable to demonstrate their competence in this area typically did not provide in-depth discussions of the issues. Generally, they did not support their discussions with relevant case facts and Handbook guidance, but instead jumped to conclusions. Some candidates also performed poorly as a result of not addressing a sufficient number of issues, as mentioned above in the discussion of overall deficiencies in candidates’ performance.

Case facts, friends. Stating criteria alone does not add value and therefore does not score points. This was also mentioned as an overall weakness in 2012.

Next up: not-for-profit organizations were a weakness in 2012. On page 10:

Candidates did not seem to be familiar with the standards applicable to not-for-profit organizations. They did not address some of the accounting issues central to not-for-profit organizations, such as how to recognize donated goods and services, as frequently as the Board had hoped.

Don’t forget that there is a large not-for-profit and public sector out there and that both of these areas have been tested before. Although most of the standards are similar to those for private enterprises, they’ll probably test the differences, so don’t be afraid of diving into the Handbook to look them up.

Regarding Tax, performance seems to be pretty stable. Candidates are focusing more time and effort on the simpler material they know than on the complex issues, as brought up by the BOE on Page 11:

In general, candidates seemed to struggle with the more complex tax concepts tested, and provided extensive discussions on the basic concepts they were comfortable with (such as interest deductibility or RRSP withdrawals). The Board reminds candidates that ranking of issues remains an important skill, and as a result, they should not avoid more complex issues if they are significant to the simulation.

This is pretty standard. There will probably continue to be one or two simple indicators and another one that’s complex. It might serve you well to tackle the complex stuff if you can find the right section of the Tax Act to refer to quickly.

In MDM, the BOE had suggestions for candidates on Page 12:

Before deciding what type of analysis to perform, candidates need to reflect upon what their analysis needs to accomplish quantitatively and what would be useful to the client.

Along with this, the BOE encourages candidates to take the analysis all the way and offer recommendations where warranted.

And finally, the often-feared PQ indicator was tested more in 2012 than previously, which signals that the BOE wishes to continue testing the more undirected material heavily. Candidates seem to be recognizing the issues that are more commonly looked for, such as fraud, but continue to miss the “big picture integration” that the BOE wants every year.

The BOE leaves us with an interesting comment at the very end, on Page 14.

The Board continues to emphasize the importance of being able to identify and appropriately address underlying issues on the UFE. These analytical skills are critical for a chartered accountant, and will continue to play an important part not only in the Level 1 assessment on the UFE, but also in the assessment of competence at Levels 2 and 3.

I find it interesting because it makes sense that the PQ affects Level 1 (overall score), but they also seem to be indicating it might make a difference at Level 2 or 3, which could mean that regular indicators might become more undirected, similar to what’s been mentioned above in Assurance. Keep your eye out for this.

Heads up – Standards change over time (Especially prior to 2011!)

A great issue was brought up in the comments on an earlier post – do you need to be concerned about outdated standards in older cases? The answer is yes, you should be aware.

This is particularly applicable to tax, which changes on a yearly basis, but it’s also worth noting that IFRS/ASPE are still relatively new in Canada. Prior to 2011, Canada used Canadian GAAP and GAAS, which were replaced by the more internationalized standards in 2011. This means that older simulations (particularly 2009 and 2010) could reference outdated treatments or outdated standards in their solutions.

The good news is that a “consortium made up of members from each of the four provincial CA professional programs” reissued the previous years’ UFE simulations and solutions, so be sure to grab those off the candidate portals of your Institutes if you’ll be going back that far. I should have mentioned this before.

Aside from this major change in standards, the UFE reports are not updated when tax rates or accounting/audit standards change over time, so it’s up to you to recognize when something’s out of date. Before you start worrying needlessly, the good news is that most of the standards examined are stable, so this is not something you need to worry about a lot. If you wrote the SOA, it was the exact same situation.

On a related note, for 2013, your institute should have made you aware of the technical update (which can be found here if you’ve forgotten about it), which covers changes from the published version of the Competency Map.

New IFRSs to be examinable for the 2013 CKE and 2013 UFE

Just thought I’d post a word about some new material that is examinable for the 2013 CKE and UFE at Level A. In case you missed it (or didn’t bother reading it), this update from the ICAO outlines the changes and their examinability.

I’ll be putting up some guides for these new IFRSs starting tomorrow.

Here is the breakdown:

New IFRSs that can be tested at the 2013 CKE (and UFE)

IFRS 10 Consolidated Financial Statements (New)
IFRS 11 Joint Arrangements (New)
IFRS 12 Disclosure of Interests in Other Entities (New)
IFRS 13 Fair Value Measurement (New)

Post-2013 IFRSs

As mentioned in the ICAO’s note, IFRS 7 and IFRS 9 may be tested, as well as old standards such as IAS 39. You could get a situation where the answer would be the same under both standards, or where they specifically ask you to answer using one standard. This was something we had a lot of in 2010 because of the transition to IFRS. The good news, if I remember right, is that they took it easy on us at the CKE, but not so much on the SOA, so it could go either way. It’s therefore a good idea to know the differences.

Revised Standards fully testable on the 2013 CKE (and UFE)

IAS 16 Property, Plant and Equipment (Amended)
IAS 19 Employee Benefits (Revised)
IAS 27 Separate Financial Statements (Revised)
IAS 28 Investments in Associates and Joint Ventures (Revised)
IAS 32 Financial Instruments: Presentation (Amended)
IAS 34 Interim Financial Reporting (Amended)


Do I have to remember everything from University in order to write the UFE?

The answer is to some extent yes, and to some extent no.

Yes – many of the University topics you would have learned for accounting are in some way covered by the UFE. You can see the specific topic coverage in the UFE Competency Map (this version is for the 2012 UFE, but a new one will be out soon and should be similar).

No – in University you would have likely learned them in far more detail than is covered on the UFE. Your responses in University would also likely have needed to be accurate to a higher degree than they do on the UFE.

If you are writing in 2013, you’ve got almost a year now to get yourself ready for the UFE. Step 1 is to review technical material and give yourself a solid technical base, because come next summer your entire focus will be on learning to write and debrief simulations and perfecting your ability to write a solid response to a business case.

I’ll tell you right now: the number one fear of UFE writers is that they are not technically sound. This feeling seems to persist right up to the UFE, and many students spend too much time on technical review close to the UFE and not enough time writing simulations and debriefing them properly.

I’ll be working on some technical guides as we lead up to the CKE and next summer so keep coming back as we continue our march to the 2013 UFE.

What do previous writers think? How much technical from University did you need and to what extent?

What they’re going to test on the UFE this year – Part 2

As discussed yesterday, hints as to which topics might come up on the 2012 UFE are available in the comments on candidate performance of the last UFE Report. Today we’ll review the 2011 UFE Report.

Why the 2010 report may contain more hints than the 2011

The UFE is marked in the second half of October, and the UFE report is released in May. About a month later, in June, the provincial institutes/ordre already have the draft UFE available for review and input. I don’t have any inside knowledge in this area (and would welcome some if anybody does!), but I suspect (and I’ve heard this from others) that there may not be enough time to grade the exams, compile the results, and integrate the weaknesses into the next UFE, which is why there may be a two-year delay.

That said, there is obviously still the possibility, especially for giant, glaring issues.

Examining the 2011 UFE Report

Let’s see what the 2011 UFE Report has in store. The Executive Summary on Page 6 has this as its first caution.

The Board found evidence of candidates including large sections from the Handbook in their responses but failing to apply the guidance to the case facts. Candidates are reminded that simple repetition of technical rules does not demonstrate competence. On the other hand, some candidates used the Handbook to strengthen their discussions by including only the relevant guidance in their responses and then applying case facts to each and every Handbook criterion before deciding on an appropriate accounting treatment.

This is certainly the first time I’ve seen this warning, and it appears that since they’ve allowed copying and pasting from the Handbook, candidates have gone overboard and forgotten that the key is to always link your criteria to case facts. Please don’t read this as a suggestion not to use the Handbook. I think the Handbook available to you is an amazing resource, but you’ve got to know how to use it properly by linking to case facts concisely.

Continuing on page 6, the Board sympathized in 2011 with candidates due to the numerous changes in accounting standards in the past two years.

The Board designed the simulations to avoid any confusion as to what set of standards to apply by explicitly stating the reporting context. However, at times it was difficult to assess the technical knowledge of candidates and determine whether they were aware of the new accounting and auditing principles because their use of old terminology made the discussions confusing and hard to follow. The Board is sympathetic due to the recent volume of change in accounting and auditing standards, and simply encourages candidates to continue to try to stay current.

I expect that the Board may not be as sympathetic in future years, but I also suspect that students coming through the pipeline in the next few years will never have heard the old terminology, and this will cease to be a problem. This is an example of something that should definitely be applied to the 2012 UFE.

Page 7 begins with a compliment.

Candidates’ performance in taxation improved this year. This was noted as a key detractor on the 2010 examination, on which the Board felt that candidates were avoiding the taxation issues. Candidates attempted to address the three primary indicators in taxation on the 2011 UFE. They continue to struggle with the application of the relevant tax rules to case facts, but the Board is encouraged by this first step towards strengthening performance in this area.

I mention this because it seems that the Board is looking for improved strength in this area and that this is only the “first step”. For this reason, and because of the two-year delay mentioned previously, I’d be brushing up on some taxation issues if that were a weakness of mine (which it is…).

Another strong warning from The Board, as we continue on page 7 regarding non-directed issues.

…candidates did not perform well in this area on the 2011 UFE. In addition, they did not perform well on non-directed indicators in specific competency areas. … Candidates are reminded that they will not be specifically directed to all of the issues that the Board considers to be mission critical. Candidates need to take the time to read the simulation carefully; understand the situation, their role, and the needs of their client; and address all the significant issues, whether directed or not.

I don’t want to give too much away about where this comment comes from, but after you’ve finished writing all your 2011 simulations it might be worth returning to this comment in your study plan. It’s also difficult to give advice on this issue because it is, by nature, non-directed. I’ll devote a whole post to non-directed indicators/PQs, but I found that I got better at spotting these as time went on through my study plan. I’ll also warn that I’ve seen people make up these indicators where they didn’t really exist, which is a waste of time, so you can go wrong in either direction here.

Another pretty interesting warning comes at the end of Page 7 regarding quants.

The Board would like to caution candidates on what it sees as a downward trend in candidates’ performance on indicators that require quantitative analysis, despite most of these indicators being directed. Candidates seemed to struggle more in Management Decision-Making and Finance than in previous years, and this was partially due to their difficulty in performing meaningful quantitative analysis. The Board wonders whether, in this time of adapting to new accounting and audit rules, candidates have directed less effort to the other competency areas.

I can only assume that if the Board is wondering about such things, it will be throwing in some pretty quantitative material in order to quench its curiosity.

On Page 8.

Candidates still appeared to spend more time on issues they understood and were comfortable discussing.

The issue of candidates avoiding complex issues is mentioned again. My advice from the 2010 UFE report remains.

Page 9.

As in prior years, candidates sometimes struggled to provide valid and relevant procedures given a situation. … Candidates are reminded to focus on significant weaknesses and to make sure their analyses are consistent, from the identification of the issue and the discussion of the implications to the recommendation for improvement. There sometimes seems to be a disconnect between the risks candidates identify and either their analyses or their recommendations.

Procedures continue to be something that can be improved and my advice from the 2010 UFE report remains the same.

On page 10 of the report, there is a mention of revenue recognition weaknesses demonstrated by many candidates. Although this is a topic that is always tested, it is possible that more focus will be placed on it in the near term.

Candidates are reminded of the need to provide depth of analysis when addressing accounting issues. To demonstrate competence, candidates need to show their understanding of what the issue is, explain why it is an issue, and explain how the issue should be addressed.

You might remember that in 2010, The Board mentioned taxation as a serious weakness. The 2011 candidates (possibly because the board warned in 2010) did better with taxation.

Although the level of taxation knowledge displayed by candidates in the 2011 UFE was stronger than in the previous year’s examination, there is still room for improvement.

You can expect that this will continue to be heavily tested.

Continuing into MDM, there were some areas of weakness mentioned.

There were three opportunities to demonstrate competence in Management Decision-Making on the 2011 UFE. The Board was surprised to see that candidates performed poorly in this competency area. … While most candidates were able to identify the benefits and risks of the offer from a qualitative perspective, they struggled to perform useful quantitative analyses.

And in the area of Finance.

While most candidates were able to calculate the appropriate ratios, they had a difficult time explaining the ratios and their underlying meanings. … Weak discussions were generally a result of candidates either not fully understanding how to calculate working capital or providing only a one-sided explanation of the impact.

As mentioned earlier, there was some weakness in quants in 2011, along with weak discussion of the ‘behind the numbers’ part of Finance. It might be a good idea to review your ratios and what they mean, and as mentioned previously, expect more quants.
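Since the Board flags weak explanations of working capital, here is a minimal sketch (in Python, with entirely hypothetical figures) of the arithmetic behind the two ratios candidates stumbled on; the point on the exam is the interpretation, not the calculation:

```python
# Working capital and the current ratio: the basic liquidity measures the
# BOE expects candidates to both compute and explain. All figures below
# are hypothetical, for illustration only.

def working_capital(current_assets: float, current_liabilities: float) -> float:
    """Working capital = current assets - current liabilities."""
    return current_assets - current_liabilities

def current_ratio(current_assets: float, current_liabilities: float) -> float:
    """Current ratio = current assets / current liabilities."""
    return current_assets / current_liabilities

ca, cl = 500_000.0, 400_000.0
wc = working_capital(ca, cl)   # 100,000
cr = current_ratio(ca, cl)     # 1.25

# A "two-sided" explanation considers both directions: a higher ratio
# suggests liquidity, but an excessively high one can also signal idle
# cash or bloated inventory/receivables.
print(f"Working capital: {wc:,.0f}; current ratio: {cr:.2f}")
```

The takeaway from the BOE comment is the comment in that last block: a one-sided "higher is better" explanation is exactly what scored poorly.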

Sorry for the long posts, I’ll keep it shorter in the future!

What other things do you think we might see on the UFE this year?

What they're going to test on the UFE this year – Part 1

The answer to this question is probably worth a good chunk of change to students. Unfortunately, I don’t know any more than the next guy. What I can suggest, though, is that you take a look at the last two years’ UFE Reports for some hints of areas where students performed poorly and to get some idea of what the CICA may test a little more heavily. I admit that in 2010, when I wrote the UFE, I didn’t do this, but you’ll have no excuse because I’m going to give you the summary version here. I’ve removed as many specifics as I can, so you should be able to go through this without “spoiling” any case solutions.

2010 UFE Report

Let’s start on page 6 which forms part of the Executive Summary of the report:

Candidates demonstrated stronger communication skills for the second straight year. They continued to limit their use of short forms, acronyms, and bulleted lists. This resulted in clearer discussions and improved the overall flow of the candidates’ responses.

Although the UFE Report mentions this as a positive thing, I’m going to highlight that they are looking for strong communication skills. This essentially means limiting the use of acronyms/short forms to only the most obvious (remember the audience of your report) and avoiding template responses and lists.

Page 6 also brings on the first warning sign:

As for major detractors, the Board would like to highlight serious concern regarding candidates’ lack of technical knowledge in the area of taxation.

Page 7 continues the warning:

On the 2010 UFE, there were three primary indicators in taxation. In general, candidates avoided these taxation issues and often left the analysis of these issues until the end of their responses. The taxation responses were often very brief and, therefore, displayed little competence.

How I would read this is that you may see a situation on the UFE where the topic should be easily searchable in the Tax Act, and you should go look there rather than just spit out generalities about the topic. Whether you feel this is worth doing at the time, or how much time you should dedicate, is a judgment call during the exam, since I find the Tax Act time-consuming to navigate. I have to admit that this warning applied to me. I wrote the UFE in 2010, and although I did look in the Tax Act for one of the topics, the remainder of my response may have been too general.

The report next covers information about the nature of the UFE and roles in the 2010 UFE where the next warning appears on page 7:

However, the importance of ranking remains a concern. Candidates appeared to spend more time on issues they understood and were comfortable discussing. Candidates were generally able to identify issues, even in areas where they were not as knowledgeable (such as taxation issues), but they had difficulty discussing these issues with any depth. As a result, their responses on these issues tended to be very brief, and candidates did not display the required competence.

This is a familiar warning from years past, where it was even more strongly worded. Essentially what they are saying here is that you can’t avoid the complex or tougher issues. You should be ranking the issues accordingly and tackling them based on their rank rather than picking and choosing the issues you are more familiar with. Complex issues are there on purpose, and in order to show competence you need to hit enough of them.

Page 7 continues on to discuss roles:

The roles assigned and scenarios presented to the candidates on the 2010 UFE continued to be varied, with only two simulations presenting a traditional assurance role … The one exception … [in one case] candidates were asked to develop due diligence procedures … [and] Some candidates struggled to fulfill this role and, instead, fell back into the traditional auditor role.

Expect the roles you take in the simulations to continue to be varied and unique. I would also brush up on due diligence procedures for this year’s UFE.

Page 8 begins to discuss specific issues with individual competencies:

Overall, candidates understood the assurance roles assigned and the issues presented, but often struggled with the details of the assignments. In particular, candidates sometimes struggled to provide valid and relevant procedures for the scenario presented.

I felt like the 2010 UFE was the UFE of procedures: in every case you had to develop procedures. Although the warnings about procedures seemed to be harsher in years past, continue to expect the UFE to ask for many, so you should be very comfortable developing procedures for a variety of scenarios. One of the complaints is that the procedures candidates provide are too general, as the Board mentions below.

Candidates are encouraged to always consider the effectiveness of the procedures they provide. Procedures should address the risk area identified. In addition, when presenting an audit plan to an audit committee or a client, candidates should explain why the procedure is necessary, in other words, how it would successfully address the client’s assurance needs.

I couldn’t have put it better myself so we’ll continue to Page 9 where PMR is discussed.

It is worth noting that the accounting issues in these scenarios were relatively simple. In the future, the complexity level of the IFRS accounting issues presented will likely increase.

2010 was the first year of IFRS testing so you should expect the complexity to go up. Page 10 adds to the PMR discussion.

Reporting is an essential part of the profession, and candidates would benefit by becoming more familiar with important reporting tools, such as the MD&A.

Based on this, it would not surprise me if the 2012 UFE had a larger focus on reporting issues. I might get a little more familiar with some of these topics if it’s not something you’ve looked at in a while.

Page 10 contains some additional specific feedback on the tax problems but I think the summary above covers the issue adequately. Page 11 discusses the remaining competencies as well as the pervasive qualities indicators but I don’t feel like there is much value added here so you can go take a look yourself if you’re interested.

Tomorrow we’ll do a review of the 2011 UFE Report. What do you think we’ll see on the 2012 UFE and why?

What they’re going to test on the UFE this year – Part 1

The answer to this question is probably worth a good chunk of change to students. Unfortunately, I don’t know any more than the next guy. What I can suggest, though, is that you take a look at the last two years’ UFE Reports for hints about areas where students performed poorly and to get some ideas of what the CICA may test a little more heavily. I admit that in 2010, when I wrote the UFE, I didn’t do this, but you’ll have no excuse because I’m going to give you the summary version here. I’ve removed as many specifics as I can, so you should be able to go through this without “spoiling” any case solutions.

2010 UFE Report

Let’s start on page 6 which forms part of the Executive Summary of the report:

Candidates demonstrated stronger communication skills for the second straight year. They continued to limit their use of short forms, acronyms, and bulleted lists. This resulted in clearer discussions and improved the overall flow of the candidates’ responses.

Although the UFE Report mentions this as a positive thing, I’m going to highlight that they are looking for strong communication skills. This essentially means limiting the use of acronyms and short forms to only the most obvious (remember the audience of your report) and avoiding template responses and bulleted lists.

Page 6 also brings on the first warning sign:

As for major detractors, the Board would like to highlight serious concern regarding candidates’ lack of technical knowledge in the area of taxation.

Page 7 continues the warning:

On the 2010 UFE, there were three primary indicators in taxation. In general, candidates avoided these taxation issues and often left the analysis of these issues until the end of their responses. The taxation responses were often very brief and, therefore, displayed little competence.

How I would read this is that you may see a situation on the UFE where the topic is easily searchable in the Tax Act, and you should go look there rather than just spit out generalities about the topic. Whether that’s worth doing at the time, and how much time to dedicate to it, is a judgment call during the exam, since I find the Tax Act time-consuming to navigate. I have to admit that this warning applied to me: I wrote the UFE in 2010, and although I did look in the Tax Act for one of the topics, the remainder of my response may have been too general.

The report next covers information about the nature of the UFE and roles in the 2010 UFE where the next warning appears on page 7:

However, the importance of ranking remains a concern. Candidates appeared to spend more time on issues they understood and were comfortable discussing. Candidates were generally able to identify issues, even in areas where they were not as knowledgeable (such as taxation issues), but they had difficulty discussing these issues with any depth. As a result, their responses on these issues tended to be very brief, and candidates did not display the required competence.

This is a familiar warning from years past where it was even more strongly worded. Essentially what they are saying here is that you can’t avoid the complex or tougher issues. You should be ranking the issues accordingly and tackling them based on their rank rather than picking and choosing the issues you are more familiar with. Complex issues are there on purpose and in order to show competence you need to hit enough of them.

Page 7 continues on to discuss roles:

The roles assigned and scenarios presented to the candidates on the 2010 UFE continued to be varied, with only two simulations presenting a traditional assurance role … The one exception … [in one case] candidates were asked to develop due diligence procedures … [and] Some candidates struggled to fulfill this role and, instead, fell back into the traditional auditor role.

Expect the roles you take in the simulations to continue to be varied and unique. I would also brush up on due diligence procedures for this year’s UFE.

Page 8 begins to discuss specific issues with individual competencies:

Overall, candidates understood the assurance roles assigned and the issues presented, but often struggled with the details of the assignments. In particular, candidates sometimes struggled to provide valid and relevant procedures for the scenario presented.

I felt like the 2010 UFE was the UFE of procedures: in every case you had to develop procedures. Although the warnings about procedures were harsher in years past, continue to expect the UFE to ask you for many, so you should be very comfortable developing procedures for a variety of scenarios. One of the complaints, as the Board mentions below, is that the procedures candidates provide are too general.

Candidates are encouraged to always consider the effectiveness of the procedures they provide. Procedures should address the risk area identified. In addition, when presenting an audit plan to an audit committee or a client, candidates should explain why the procedure is necessary, in other words, how it would successfully address the client’s assurance needs.

I couldn’t have put it better myself, so we’ll continue to Page 9, where PMR is discussed.

It is worth noting that the accounting issues in these scenarios were relatively simple. In the future, the complexity level of the IFRS accounting issues presented will likely increase.

2010 was the first year of IFRS testing so you should expect the complexity to go up. Page 10 adds to the PMR discussion.

Reporting is an essential part of the profession, and candidates would benefit by becoming more familiar with important reporting tools, such as the MD&A.

Based on this, it would not surprise me if the 2012 UFE had a larger focus on reporting issues. It would be worth getting a little more familiar with these topics if you haven’t looked at them in a while.

Page 10 contains some additional specific feedback on the tax problems, but I think the summary above covers the issue adequately. Page 11 discusses the remaining competencies as well as the pervasive qualities indicators, but I don’t feel there is much value added there, so you can take a look yourself if you’re interested.

Tomorrow we’ll do a review of the 2011 UFE Report. What do you think we’ll see on the 2012 UFE and why?
