This article is a glossary guide to Analytics Metrics.
The guide covers the following:
- General Metrics
- Portfolio Metrics
- Risk Management Metrics
- Payment Metrics
- Change Management Metrics
- Submission Management Metrics
- Quality Management Metrics
- Time Management Metrics
- Benchmark
- Advanced Activity Reports (coming soon)
The following metrics are available across all subscription plans.
All definitions in the glossary are based on preconfigured filters.
You can only see data for contracts you have access to; this includes all contracts except deleted contracts and demo contracts.
You may have a perfectly legitimate reason to report on data from contracts that were completed or archived during the current reporting period, so by default these are included; they can easily be filtered out via the filters page.
Users can personalise visualisations; this changes the data displayed but not the title or any tooltips.
Power BI distinguishes between filters, which are pre-applied to each visual, and slicers, which are selected by a user (typically in a dropdown).
You can see which filters and slicers apply to a particular visual by hovering over it and clicking the Filters on Visual icon.
General Metrics
1. Count of Contracts – Total count of contracts that the user has access to (excluding deleted and demo contracts). Why? To see the volume of contracts you oversee.
2. All Notices and Replies – Total count of notices and replies in the reporting period. Why? To see the volume of communications in the reporting period.
3. Requiring Contract Admin Reply – Total count of notices due a reply from the contract admin within the reporting period i.e. Backlog. Why? To see the size of the backlog currently awaiting a response and highlight potential resourcing issues.
4. Requiring Supplier Reply – Total count of notices due a reply from the supplier within the reporting period i.e. Backlog. Why? To see the size of the backlog currently awaiting a response and highlight potential resourcing issues.
5. Days waiting Contract Admin – Cumulative number of calendar days the supplier has been waiting for responses from the contract admin on open notices in the reporting period. Only counts those items where a response was due within the reporting period. For replies already received this is calculated from the notified date to the date the reply was received and for replies not yet received this is calculated from the notified date to the end of the reporting period. Why? To quantify the amount of waiting time that the supplier is incurring for decisions and understand the potential opportunity to reduce delays and non-productive costs to projects.
6. Days awaiting Supplier – Cumulative number of calendar days the contract admin and client have been waiting for a response from the supplier on open notices in the reporting period. Only counts those items where a response was due within the reporting period. For replies already received this is calculated from the notified date to the date the reply was received and for replies not yet received this is calculated from the notified date to the end of the reporting period. Why? To quantify the amount of time lost by the client waiting for quotes and other responses from the supplier and understand the potential opportunity to reduce delays to projects.
7. Average days awaiting Contract Admin – For items due within this reporting period (even if they were notified in a previous reporting period). The number of days awaiting a reply divided by the number of items awaiting a reply. Why? To quantify the average amount of waiting time that the supplier has incurred for decisions and understand potential opportunity to reduce delays and non-productive costs to projects.
8. Average days awaiting Supplier – For items due within this reporting period (even if they were notified in a previous reporting period). The number of days awaiting a reply divided by the number of items awaiting a reply. Why? To quantify the average amount of waiting time that the contract admin and client have incurred for decisions and understand the potential opportunity to reduce delays and non-productive costs to projects.
9. Awaiting Contract Admin – Total volume of items currently awaiting a decision from the contract administrator irrespective of what reporting period they were notified in or what reporting period the response was due in. This deliberately excludes items notified or due in the current calendar month as they technically fall outside the reporting period. Why? To understand the scale of the outstanding issues and the total size of the backlog.
10. Awaiting Supplier – Total volume of items currently awaiting a decision from the supplier irrespective of what reporting period they were notified in or what reporting period the response was due in. This deliberately excludes items notified or due in the current calendar month as they technically fall outside the reporting period. Why? To understand the scale of the outstanding issues and the total size of the backlog.
11. All Communications Volume - Total volume of communications raised over the reporting period by notified date, party and count (including replies). Why? To see the trend of communications being raised and by whom.
12. All Communications Type - Total communications raised over the reporting period, by notified date, split by type. Why? To understand the split in communications across workflows which aligns to disciplines in teams; this helps understand workload and supports resourcing decisions.
13. All Communications on Time – Bars show the responses due in the reporting period (irrespective of which period the original notice was raised in) and the line shows replies received on time in the reporting period as a percentage of all replies due in that period. For example, if a reply is due in January and received in February it will be reflected in the January bar and the January line. Why? To see the trend in overall compliance with contractual timetables and whether it’s improving over time.
14. All Communications Cumulative Overdue – The Late Replies line shows replies that were received late in the reporting period (based on reply date). The Cumulative overdue line shows the cumulative volume of replies that were due in a previous period or the current reporting period and have not yet been received (based on response due date). Why? To understand the size of the outstanding problems and the total size of the backlog and whether it’s improving over time.
15. All Communications Backlog – The outstanding or pending replies currently awaiting a response. This includes pending items, items that go overdue in the next week and items already overdue. Why? To understand the size of the outstanding problems and the total size of the backlog, to help inform resourcing decisions.
16. Replies Received – Total count of replies received in the reporting period. Why? To see the volume of responses being provided and whether the backlog is being actively worked through.
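The waiting-time metrics above (items 5 to 8) follow a simple rule: count from the notified date to the reply date if a reply has been received, otherwise to the end of the reporting period, then divide the total by the item count for the average. A minimal sketch of that rule is shown below; the record structure and field names (`notified`, `replied`) are illustrative assumptions, not the product’s actual data model.

```python
from datetime import date

# Illustrative notice records; field names are assumptions, not FastDraft's schema.
notices = [
    {"notified": date(2024, 1, 3), "replied": date(2024, 1, 10)},  # reply received
    {"notified": date(2024, 1, 15), "replied": None},              # still awaiting reply
]

period_end = date(2024, 1, 31)

def waiting_days(notice, period_end):
    """Calendar days waited: notified date to reply date, or to period end if no reply yet."""
    end = notice["replied"] or period_end
    return (end - notice["notified"]).days

total_days = sum(waiting_days(n, period_end) for n in notices)
average_days = total_days / len(notices) if notices else 0

print(total_days, average_days)  # 7 + 16 = 23 total; 11.5 average
```

The same two numbers feed both the cumulative metrics (items 5 and 6) and the averages (items 7 and 8).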
Portfolio Metrics
1. Count of Contracts – Total count of contracts that the user has access to (excluding deleted and demo contracts). Why? To see the volume of contracts you oversee.
2. Original Price – Total original price of all contracts that the user has access to (excluding deleted and demo contracts). Why? To see the original value of contracts you oversee.
3. Current Contract Price – Total current price of all contracts that the user has access to (excluding deleted and demo contracts). Why? To see the current value of contracts you oversee.
4. Total Certified – Sum of the cumulative value (i.e. total price for work done to date) of the latest Payment Certificate from each contract that the user has access to (excluding deleted and demo contracts). Why? To see the certified value of payments compared to the current contract price.
5. Latest Value of Open Changes – An open change means any change where the cost and/or time impact has not yet been agreed or assessed. For those that we can put a value on (i.e. at least one quotation has been received) this is the sum of the value of the latest quotation for each change that has not yet been accepted or assessed. Why? To help quantify the value of change that is currently under review.
6. Current Contracts with an Accepted Programme – Count of all contracts that don’t have at least one accepted programme in FastDraft. Why? Without an accepted programme there is a lack of clarity regarding the progress of the works and whether the contractor is meeting their contractual obligations, and it may be difficult to assess the cost and time impacts of change.
7. Contracts by Lot - Total volume of contracts split by Lot – Why? It summarises the number or value of contracts you have in each lot and helps answer questions like “Do we have enough resources to support our maintenance programme next year?”.
8. Contracts by Company - Total volume of contracts split by company - Why? It summarises the number or value of contracts you have with a specific supplier.
9. Contracts by Conditions of Contract - Total volume of contracts split by contract type (e.g. “NEC4 ECC Option A”). Why? It summarises the number or value of contracts you have of each set of conditions of contract and helps answer questions like “How many JCT contracts do we have?”.
10. Contracts by Value – Total contract value split by contract type, including pricing option so that you can differentiate between fixed price and target cost, etc. Why track it? This is a useful report to drill down and answer questions like “Which types of contracts are we using?” and “How are most of our contracts priced?”
11. Contracts by Area - Total volume of contracts within an area. Why? It may help in understanding resource requirements across the organisation.
12. Contracts by Hub - Total volume of contracts within a hub. Why? It may help in understanding resource requirements across the organisation.
13. Contracts by Framework - Total volume of contracts by Framework – Why? It summarises the number or value of contracts you have under a specific framework and helps answer questions like “How many call-off contracts have been let under a particular Framework agreement?”
14. Framework - Value of contracts by supplier split by framework, including the price variance – Why? It helps you understand whether there is any kind of pattern in terms of which frameworks and/or suppliers are incurring the greatest cost variance.
15. Contracts Awarded – Volume of contracts awarded each year. Why? Can be filtered by framework and/or supplier to help you monitor the distribution of contracts over time.
Risk Management Metrics
1. Count – Total count of risk notices communicated up to the end of the reporting period, including any raised prior to the current reporting period. Why? To understand the volume of risk events that have been notified across all of your contracts. A high number isn’t necessarily a bad thing as it suggests risks are being raised and proactively managed.
2. High – Total count of high priority risk notices on your contracts raised up to the end of the reporting period that are still open (irrespective of when they were raised or whether a reply has been received or a mitigation plan is in place). Why? The volume of high priority risk events is a useful indicator of the number of things that are very likely to lead to significant delays and/or cost increases.
3. Unmitigated – Total count of risk notices on your contracts raised up to the end of the reporting period that are still open with no indication that a mitigation plan is in place (irrespective of when they were raised or whether a reply has been received). Mitigating actions are detected from any action assigned in the risk register and/or from a risk having been tagged as ‘Mitigated’ or ‘Mitigation Plan Agreed’. Why? The volume of risk events without any mitigation plan is a useful indicator of the number of problems that do not appear to be being actively managed. Unmitigated risk events are more likely to lead to delays and/or cost increases.
4. Open – Total count of risk notices on your contracts raised up to the end of the reporting period that are still open (irrespective of what priority they were assigned, when they were raised, whether a reply has been received or a mitigation plan is in place). Why? The volume of open risk events is a useful indicator of the number of potential problems that could lead to significant delays and/or cost increases.
5. Awaiting Contract Administrator Reply – Total count of risk notices on your contracts raised by the supplier up to the end of the reporting period that are still awaiting a reply or acknowledgment from the contract administrator (irrespective of what priority they were assigned, when they were raised or whether a mitigation plan is in place). Why? The volume of risks awaiting a reply is a useful indicator of whether the contract administrator is actively responding to known risks that could potentially lead to significant delays and/or cost increases.
6. Awaiting Supplier Reply – Total count of risk notices on your contracts raised by the contract administrator up to the end of the reporting period that are still awaiting a reply or acknowledgment from the supplier (irrespective of what priority they were assigned, when they were raised or whether a mitigation plan is in place). Why? The volume of risks awaiting a reply is a useful indicator of whether the supplier is actively responding to known risks that could potentially lead to significant delays and/or cost increases.
7. Risks Replies on Time - This report tracks total risk notices where a reply was expected within the reporting period and the percentage where the reply was received on time - Why? Whilst a reply may not necessarily be expected under the contract, a failure to reply (or at least to indicate that a reply is not required) promptly is an indicator that risk management isn’t being prioritised.
8. Risk Cumulative Overdue Replies - The Late Replies line shows replies that were received late in the reporting period (based on reply date). The Cumulative overdue line shows the cumulative volume of replies that were due in a previous period or the current reporting period and have not yet been received (based on response due date). Why? To understand the size of the outstanding risk events and the total size of the backlog and whether it’s improving over time.
9. Risks by Status - Compares communication status (notified, open, closed, open and unmitigated) over the reporting period by due date and count - Why? To see the trend and split in the status of risks by period.
10. Risks to Change - This report tracks and compares the volume of risk events with changes notified over the reporting period and expresses the risk events as a percentage of the total. Why? To see the trend of how well the project flags risk events compared with how much change is agreed on the project, as this provides an indication of foresight of potential problems.
11. Risk Volume - This report tracks the total risk notices raised over the reporting period by notified date, split by party and count. Why? To see the trend of risks being raised over time and by whom.
12. Risks by Priority - This report tracks risks notified by period and over the reporting period and splits them by high, medium and low priority. Why track it? It shows you the split of risks being raised. This report is useful for exploring your risk backlog and the nature of risks.
13. Risk Age - This report tracks the total volume of risks by age band (e.g. under 1 month, 1-3 months, 3-9 months, 12+ months) and whether they are mitigated or not - Why? This report shows the thoroughness of the risk mitigation being put in place and is a complementary KPI to use with Risks Replies on Time.
14. Outstanding Risk Responses - This report tracks the total volume of outstanding replies split by party and provides extra granularity as to whether they are already late, go overdue this week, etc. Why? This report shows the backlog of risks by party.
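Several of the visuals above (and in the following sections) rely on the same “cumulative overdue” rule: a reply counts as overdue if its response was due on or before the end of the reporting period and no reply had been received by then. A minimal sketch of that rule, with assumed field names (`due`, `received`) rather than the product’s actual schema:

```python
from datetime import date

# Illustrative reply records; field names are assumptions, not FastDraft's schema.
replies = [
    {"due": date(2024, 1, 10), "received": None},               # overdue since January
    {"due": date(2024, 2, 5), "received": None},                # overdue by end of February
    {"due": date(2024, 2, 20), "received": date(2024, 2, 19)},  # received on time
    {"due": date(2024, 3, 15), "received": None},               # not yet due at end of February
]

def cumulative_overdue(replies, period_end):
    """Count replies due by period_end with no reply received by period_end."""
    return sum(
        1 for r in replies
        if r["due"] <= period_end
        and (r["received"] is None or r["received"] > period_end)
    )

print(cumulative_overdue(replies, date(2024, 2, 29)))  # 2
```

Because the count is based on the response due date rather than the notified date, an item raised in an earlier period still contributes to the current period’s backlog until it is answered.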
Payment Metrics
1. Count of Payments Certified - Total volume of payment certificates issued in the reporting period. Why? Shows the total number of payments that should have been made this period.
2. Total Certified - Total cumulative value of payments based on the sum of the cumulative value from the most recently issued payment certificate on each of the contracts the user has access to. Why? Demonstrates the amount that should have been invoiced/paid to date.
3. Original Price - This report tracks the total original price for all contracts the user has access to. Why? Helps demonstrate where the contracts started compared to their current value.
4. Contract Value Outstanding – The difference between the current value of contracts and the total certified - Why? This report shows the value of contracts that has not yet been certified for payment.
5. Application and Certificates Count - This report tracks the total count of payment certificates vs applications each period over the reporting period, and estimates expected applications based on at least one application/certificate expected per month for each active/live contract. Why? It shows you the gap between how many contracts are applying for payment vs how many payments are being certified, and the overall trend in the difference between these two processes.
6. Payless Notices - This report tracks the total number of payless notices served each period, over the reporting period. Why? Under UK law, the Housing Grants, Construction and Regeneration Act 1996 (as amended) (the “Act”) includes multiple provisions to encourage timely payments throughout the construction supply chain; tracking payless notices helps monitor how often payment is being withheld.
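The portfolio-level payment figures above combine in a simple way: Total Certified is the sum of the cumulative value on each contract’s latest payment certificate, and Contract Value Outstanding is the current contract price minus that certified total. A minimal sketch, using assumed field names rather than the product’s actual data model:

```python
# Illustrative contract records; field names are assumptions, not FastDraft's schema.
contracts = [
    {"current_price": 1_200_000, "latest_certified_cumulative": 900_000},
    {"current_price": 450_000, "latest_certified_cumulative": 450_000},
]

# Total Certified: sum of the cumulative value of each contract's latest certificate.
total_certified = sum(c["latest_certified_cumulative"] for c in contracts)

# Contract Value Outstanding: current contract value not yet certified for payment.
total_current = sum(c["current_price"] for c in contracts)
outstanding = total_current - total_certified

print(total_certified, outstanding)  # 1350000 300000
```

Note that only the latest certificate per contract is used; summing every certificate would double-count because each certifies a cumulative total.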
Change Management Metrics
1. Count of Change Notices - This report tracks the total number of change notices raised in the reporting period, by notified date, split by party. Why? To understand the volume of change that has been notified across all of your contracts.
2. Implemented Amount – The total value of all change notices that have been agreed or assessed. Why? To understand the total cost of change that has been implemented across all of your contracts.
3. Implemented Days – The sum of the time impact (in days) for all change notices that have been agreed or assessed. Why? To understand the total impact of change in terms of delays to project completion.
4. Open Changes – Total count of change notices on your contracts raised up to the end of the reporting period that are still open (irrespective of when they were raised or whether a reply or quotation has been received). Open in this context means a quotation has not been accepted or an assessment/determination of time and/or cost has not yet been made. Why? The volume of open change events is a useful indicator of the extent of change that is ‘in flight’ and extremely likely to have a time and/or cost impact.
5. Awaiting Contract Administrator – Total count of change notices/quotations on your contracts raised by the supplier in the reporting period that are still awaiting a reply from the contract administrator. Why? The volume of change notices/quotations awaiting a reply is a useful indicator of the current level of change activity and whether that is being effectively managed.
6. Awaiting Supplier – Total count of change notices awaiting a quotation or a revised quotation from the supplier. Why? The volume of change awaiting a quotation is another useful indicator of current level of change activity and whether that is being effectively managed.
7. Change Notices - This report tracks the total number of changes notified in the reporting period, by notified date, split by party. Why? To see the trend of the volume of change requests being raised.
8. Change Requests Status - Tracks the status of change notices. The bars show the count of changes notified in the reporting period (which increases the number of open changes), the number of rejected and implemented [i.e. cost and time impacts have been agreed or accepted] changes (both of which decrease the number of open changes) in the reporting period. The line shows the count of open change notices raised up to the end of the reporting period (including those raised prior to the current reporting period). Why? To see the trend and split in outcomes of change notices.
9. Quotes Instructed - This visual tracks the total count of quotes instructed by the contract administrator. This includes, where applicable, changes that have been notified by the supplier and accepted, changes that have been notified by the contract administrator and quotations that have been rejected by the contract administrator for a reason that requires a revised quotation. Why? Whilst there may be a lag, this should broadly align to the volume of change notices (excluding rejected notices that never reached the quote stage). If the number of quotations significantly exceeds the number of change notices this potentially points to some quality issues and/or areas of contention on the assessment of change.
10. Quotes Received - This report tracks the total count of change requests quotes received in the reporting period, by notified date. Why? To see the trend of quotations received compared to quotations instructed.
11. Supplier Change Notices Dealt With On Time - This visual tracks the count of supplier change notice replies that were required and received on time. Why? To see the trend in change request response compliance to timetables.
12. Change Notice Overdue Replies – This tracks change notices from the supplier that require a response from the contract administrator (e.g. for NEC3/NEC4 ECC the contract dictates that the contract administrator must decide whether a compensation event notified by the supplier is valid before instructing a quote). The Late Replies line shows replies that were received late in the reporting period (based on reply date). The Cumulative overdue line shows the cumulative volume of replies that were due in a previous period or the current reporting period and have not yet been received (based on response due date). Why? To understand the trend in terms of timescale compliance when replying to change notices and whether it’s improving over time.
13. Quotes Issued On Time - This visual tracks the count of quotations that were required and received on time. Why? To see the trend in compliance to contractual timescales.
14. Quotes Backlog Overdue Replies – This tracks quotations required from the supplier following a change notice from the contract administrator, acceptance of a supplier change notice, or a rejection of a previous quotation requiring a re-quote. The Late Replies line shows quotations that were received late in the reporting period (based on reply date). The Cumulative overdue line shows the cumulative volume of quotations that were due in a previous period or the current reporting period and have not yet been received (based on due date). Why? To understand the trend in terms of timescale compliance when replying to change notices and whether it’s improving over time.
15. Quotes Dealt With On Time - This visual tracks the count of contract administrator replies to supplier quotations that were required and received on time. Why? To see the trend in compliance to contractual timescales.
16. Quotes Replies Overdue – This tracks contract administrator replies to supplier quotations. The Late Replies line shows replies that were received late in the reporting period (based on reply date). The Cumulative overdue line shows the cumulative volume of replies that were due in a previous period or the current reporting period and have not yet been received (based on response due date). Why? To understand the trend in terms of timescale compliance when replying to change notices and whether it’s improving over time.
17. Supplier’s Responses – Within the reporting period, shows what proportion of the supplier’s responses (in this context quotations) were received on time or late and what proportion of replies not yet received are pending (i.e. within timescales) or already overdue. Whilst excluded by default, this visual can be filtered to include the current calendar month, hence ‘Goes overdue this week’ is also included. Why? To help understand contractual timescale compliance and whether the latest reported month appears to be affecting that positively or negatively.
18. Contract Administrator’s Responses – Within the reporting period, shows what proportion of the contract administrator’s responses (either to change notices or quotations) were received on time or late and what proportion of replies not yet received are pending (i.e. within timescales) or already overdue. Whilst excluded by default, this visual can be filtered to include the current calendar month, hence ‘Goes overdue this week’ is also included. Why? To help understand contractual timescale compliance and whether the latest reported month appears to be affecting that positively or negatively.
19. Outstanding Change Responses – Across all of the different types of change notices, quotations and quotation replies, this visual shows the split of outstanding reply actions (i.e. a reply has not yet been received up to the end of the reporting period) for each party and whether these are pending (i.e. within timescale) or overdue. Whilst excluded by default, this visual can be filtered to include the current calendar month, hence ‘Goes overdue this week’ is also included. Why? To help understand contractual timescale compliance and whether the latest reported month appears to be affecting that positively or negatively.
20. Open Change Count by Age Band - This visual shows the age profile of open changes up to the end of the reporting period (irrespective of when the changes were notified). In this context an open change is one where either a quotation has not yet been instructed/received or a quotation has been received but the cost and time impacts have not yet been agreed/assessed. Why? To help understand how long it is taking to manage change through to a conclusion and to what extent there are long-running change events that may require attention/intervention.
21. Implemented Change Count by Status – By volume, the proportion of change that was accepted versus change that was assessed. Why? A high proportion of accepted quotations suggests a healthy collaboration between contract administrator and supplier. A high proportion of assessed changes could mean there is no accepted programme but most likely points to a degree of contention over cost and time impacts that could ultimately lead to a dispute.
22. Implemented Change Value by Status – By value, the proportion of change that was accepted versus change that was assessed. Why? A high proportion of accepted quotations suggests a healthy collaboration between contract administrator and supplier. A high proportion of assessed change value could mean there is no accepted programme but most likely points to a degree of contention that could ultimately lead to a dispute.
23. Quoted vs Implemented Cost – In this context, implemented means the cost and time impacts were agreed or assessed, and the implemented date means the date of that agreement or assessment. For changes implemented in the reporting period, shows the total cost initially quoted alongside the cost eventually agreed/assessed. To be clear, the quoted value is the total originally quoted for change implemented in this period; it is not reported in the month it was quoted but in the month the change was implemented, so as to provide a side-by-side comparison. Why? Helps to demonstrate the extent to which cost is being accepted or driven down via a robust cost assurance process. There may be perfectly valid reasons why a good collaborative approach would lead to accurate first-time quotations that are accepted promptly. This needs to be assessed in the round using other available metrics.
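The Quoted vs Implemented Cost visual described above reports both totals against the month the change was implemented, not the month it was quoted. A minimal sketch of that grouping rule, with assumed field names (`implemented_month`, `quoted`, `implemented`) rather than the product’s actual data model:

```python
# Illustrative change records; field names are assumptions, not FastDraft's schema.
# "implemented" here means the value eventually agreed or assessed.
changes = [
    {"implemented_month": "2024-03", "quoted": 10_000, "implemented": 9_000},
    {"implemented_month": "2024-03", "quoted": 5_000, "implemented": 5_000},
    {"implemented_month": "2024-04", "quoted": 8_000, "implemented": 7_500},
]

def quoted_vs_implemented(changes, month):
    """Side-by-side totals for changes implemented in a given month.

    Both the originally quoted value and the agreed/assessed value are
    attributed to the implementation month, so the pair is directly comparable.
    """
    in_month = [c for c in changes if c["implemented_month"] == month]
    return (sum(c["quoted"] for c in in_month),
            sum(c["implemented"] for c in in_month))

print(quoted_vs_implemented(changes, "2024-03"))  # (15000, 14000)
```

In this sketch, March shows 15,000 quoted against 14,000 implemented, i.e. 1,000 of quoted cost was driven down during assessment.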
Submission Management Metrics
1. Count - Total volume of items submitted for acceptance in the reporting period. Why? Shows the total number of submissions this period.
2. Awaiting Contract Administrator Reply – Total volume of items currently awaiting a decision from the contract administrator that fell due within the reporting period. This deliberately excludes items notified or due in the current calendar month as they technically fall outside the reporting period. Why? To understand the scale of the outstanding submissions and the total size of the backlog.
3. Awaiting Supplier Reply – Total volume of submissions currently awaiting a reply from the supplier that fell due within the reporting period. This deliberately excludes items notified or due in the current calendar month as they technically fall outside the reporting period. Why? To understand the scale of the outstanding submissions and the total size of the backlog.
4. Submissions - Total submissions raised over the reporting period by notified date, split by party and count. Why? To see the trend of submissions being raised.
5. Submission by Type - Takes a number of differently configured types and attempts to categorise them based on the volume submitted in the reporting period (using notified date). Why? To see the trend and split in submissions by type and period.
6. Submissions Dealt With On Time - Bars show the responses due in the reporting period (irrespective of which period the original notice was raised in) and the line shows replies received on time (based on reply date) in the reporting period as a percentage of all replies received in the same period. Replies won’t necessarily be received in the period in which they were due, so late replies are counted in the period in which the reply was received. For example, if a reply is due in January and received in February it will be reflected in the January bar and the February line. Why? To see the trend in overall compliance with contractual timetables and whether it’s improving over time.
7. Submission Overdue Replies – The Late Replies line shows replies that were received late in the reporting period (based on reply date). The Cumulative overdue line shows the cumulative volume of replies that were due in a previous period or the current reporting period and have not yet been received (based on response due date). Why? To understand the trend in terms of timescale compliance when replying to submissions for acceptance and whether it’s improving over time.
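The on-time percentage described in Submissions Dealt With On Time groups replies by the month the reply was received, then checks each against its due date. A minimal sketch of that calculation; the field names (`due`, `received`) are illustrative assumptions, not the product’s actual schema:

```python
from datetime import date

# Illustrative reply records; field names are assumptions, not FastDraft's schema.
replies = [
    {"due": date(2024, 1, 20), "received": date(2024, 1, 18)},  # on time, counted in January
    {"due": date(2024, 1, 25), "received": date(2024, 2, 2)},   # late, counted in February
    {"due": date(2024, 2, 10), "received": date(2024, 2, 8)},   # on time, counted in February
]

def on_time_pct(replies, year, month):
    """% of replies received in a month that arrived on or before their due date."""
    in_month = [r for r in replies
                if r["received"].year == year and r["received"].month == month]
    if not in_month:
        return None  # no replies received that month
    on_time = sum(r["received"] <= r["due"] for r in in_month)
    return 100 * on_time / len(in_month)

print(on_time_pct(replies, 2024, 1))  # 100.0: the only January reply was on time
print(on_time_pct(replies, 2024, 2))  # 50.0: one late and one on-time reply in February
```

This matches the worked example in the definition: a reply due in January but received in February counts against February’s line, not January’s.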
Quality Management Metrics
1. Defects Volume – Count of defects and non-conformities raised during the reporting period by notified date, split by party and count. Why? To see the trend of defects and non-conformities being raised.
2. Defects by Status - Tracks the status of defects and non-conformities. The bars show the count of defects/non-conformities notified in the reporting period alongside the number of defects/non-conformities closed [i.e. accepted/corrected] in the reporting period. The line shows the count of open defects/non-conformities up to the end of the reporting period (including those raised prior to the current reporting period). Why? To track outstanding issues and total size of the backlog.
Time Management Metrics
1. Count – Count of programmes submitted during the reporting period. Why? To see the total number of programmes submitted.
2. Awaiting Contract Administrator Reply – Total volume of programmes currently awaiting a reply from the contract administrator that fell due within the reporting period. This deliberately excludes items notified or due in the current calendar month as they technically fall outside the reporting period. Why? To understand the scale of the outstanding programme submissions and the extent of the backlog.
3. Programme Submissions – Count of programmes submitted during the reporting period by date submitted. Why? To see the trend of the volume of programmes being submitted.
4. Programme Submissions Dealt With On Time - Total programme submissions dealt with on time over the reporting period by due date, with count of submissions and percentage on time shown. Why? To see the trend in overall compliance to programme acceptance timetables (where applicable).
5. Programme Overdue Replies - The Late Replies line shows replies that were received late in the reporting period (based on reply date). The cumulative overdue line shows the cumulative volume of replies that were due in a previous period or the current reporting period and have not yet been received (based on response due date). Why? To understand the trend in terms of timescale compliance when replying to submissions for acceptance and whether it’s improving over time.
Benchmark
Benchmarking report – This report allows two sets of contracts to be compared across a selection of metrics on a single page, with the two sample datasets based on a set of user-defined criteria (e.g. North vs South, Hub 1 vs Hub 2, Framework 1+2 vs Framework 3+4, etc.). The report allows you to set filters that apply only to the visuals on the left-hand or right-hand side of the report page. It then compares the following metrics for the two datasets on a single page:
a) Volume of all communications,
b) % of all communications dealt with on time,
c) Volume of all risks,
d) % of all risks dealt with on time,
e) Volume of all change notices,
f) % of all change notices dealt with on time,
g) Volume of all quotation instructions,
h) % of all quotation instructions dealt with on time,
i) Volume of all quotation replies,
j) % of all quotation replies dealt with on time,
k) Volume of all submissions, and
l) % of all submissions dealt with on time.
Advanced Activity Reports (coming soon)
Watch this space for more information.