This is just a starting point...
Main Drivers:
General Tso's Rule of Thumb...
How many attributes? X 0.25 days each
How many facts? X 0.25 days each
How many metrics? X 0.25 days each
How many reports? X 2.00 days each
How many prompts? X 0.25 days each
How many drill paths? X 0.25 days each
How many filters? X 0.25 days each
See variance factors below...
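For example, a hypothetical project with 40 attributes, 20 facts, 30 metrics, 25 reports, 10 prompts, 5 drill paths and 15 filters would start at (40 + 20 + 30 + 10 + 5 + 15) X 0.25 + 25 X 2.00 = 30 + 50 = 80 days, before applying the variance factors.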
If these are new reports, plan to code the examples in SQL first, and perhaps do a pilot to make sure your interpretation of the requirements is what the business really meant...not what they asked for. Allow 3-6 weeks for requirements gathering, documentation, and analysis for most small to medium sized projects under 100 reports.
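As an example, a SQL proof for a simple sales-by-region report might be no more than the query below (the table and column names here are invented for illustration; substitute your own warehouse objects):

SELECT r.region_name,
       SUM(f.sales_amt) AS total_sales    -- the measure the report will show
FROM   sales_fact f
       JOIN region_dim r ON r.region_id = f.region_id
WHERE  f.fiscal_year = 2004               -- the report-level filter
GROUP  BY r.region_name
ORDER  BY r.region_name;

If the business signs off on these numbers, you have a yardstick to test the finished report against.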
1. How many total reports will be developed? As a rule of thumb, allow a minimum of 1-2 days per report for development and unit testing.
Good things to know are:
a. What report level filters will be needed?
b. What report level prompts will be needed?
c. Will report level filters / prompts deliver the required results across all metrics on the report, or will metric level filters, dimensionality or transformations be required?
One clue that a report-level filter will suffice is that all dimensionality is the same for all measures on the report. If the report requires this year vs. last year, and perhaps additional columns that show the difference between the two, you will most likely need to allow for a set of filtered and/or prompted filtered measures for that report (see the SQL sketch after this list), meaning a proportionate increase in project complexity to implement it.
d. Will drill anywhere be available for this report? If so, trot out an additional 1 day of time for developing a series of unit tests for all meaningful drill paths for the report. Also add an additional 1 day for QA testing, on average, per iteration. One hint: if you can keep the database used for QA static from start to finish, this will make the QA person's turnaround time much faster.
e. Will a customized, local drill map be needed for this report? If so, allow an additional 0.5 days for development and unit testing.
f. Will measures require custom formatting? Add 1 hour for development.
g. Will this report's measures implement thresholds? Add 1 hour for development.
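To illustrate point (c) above, here is roughly what the SQL behind a this-year vs. last-year report looks like. Each measure carries its own filter, so a single report-level filter will not cut it (again, the table and column names are made up):

SELECT p.product_name,
       SUM(CASE WHEN f.fiscal_year = 2004 THEN f.sales_amt ELSE 0 END) AS sales_ty,   -- filtered measure: this year
       SUM(CASE WHEN f.fiscal_year = 2003 THEN f.sales_amt ELSE 0 END) AS sales_ly,   -- filtered measure: last year
       SUM(CASE WHEN f.fiscal_year = 2004 THEN f.sales_amt ELSE 0 END)
     - SUM(CASE WHEN f.fiscal_year = 2003 THEN f.sales_amt ELSE 0 END) AS sales_diff  -- compound measure: the difference
FROM   sales_fact f
       JOIN product_dim p ON p.product_id = f.product_id
GROUP  BY p.product_name;

Every one of those conditional measures is an object that has to be built, tested and documented, which is where the extra complexity comes from.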
2. Is the data warehouse PDM (physical data model) complete with respect to all the subject areas you will require for this project? If the data architect says it ain't, don't even bother to start drafting a design for your project until it is.
3. Document all applicable warehouse tables to be accessed for each report. This can be automated with database metadata SQL queries to develop a cross-reference between the database tables and the database columns associated with a project's facts and attributes. It also serves as a reasonableness check, and lets the architect ask for clarification of any questionable cardinality or relationship issues up front. For an average size project, allow 1 day for this activity.
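On a database that exposes the ANSI INFORMATION_SCHEMA views (most do; Oracle shops would query ALL_TAB_COLUMNS instead), the starting point for that cross-reference can be as simple as this, with the table names below standing in for your own:

SELECT c.table_name,
       c.column_name,
       c.data_type
FROM   information_schema.columns c
WHERE  c.table_name IN ('SALES_FACT', 'PRODUCT_DIM', 'REGION_DIM')
ORDER  BY c.table_name, c.ordinal_position;

Dump the result into a spreadsheet and tag each column with the fact or attribute it supports.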
4. How many measures/metrics will be developed? Allow a minimum of 2 hours per metric for development. This assumes all tables/views/functions from which each measure may be retrieved are well-defined, along with the normal display characteristics.
5. How many attributes will be defined? Allow a minimum of 0.5 hours per attribute for development. This assumes all tables/views from which each attribute may be retrieved are well-defined, documented, complete and accessible. Otherwise, allow 0.25 days.
6. How many complex reports? For any report with more than one or two compound measures, or containing dimensional measures, percents to total, etc., multiply the report time by 1.5.
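Percent-to-total is a good example of why these reports cost more. In SQL terms the measure might look like the query below (this assumes a database with analytic/window functions, and the names are again invented):

SELECT r.region_name,
       SUM(f.sales_amt) AS region_sales,
       SUM(f.sales_amt) / SUM(SUM(f.sales_amt)) OVER () AS pct_of_total  -- each region's share of the grand total
FROM   sales_fact f
       JOIN region_dim r ON r.region_id = f.region_id
GROUP  BY r.region_name;

The measure has to be computed at two levels at once, and the developer has to define, test and document both.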
7. How many documents will be required?
8. Will the "out of the box" GUI be used?
9. Will a custom user-interface be required?
10. Never underestimate the power of a good, short but to-the-point education and training program, especially if this is the first deployment of MicroStrategy at the organization. Allow a minimum of 1 week for developing and administering a hands-on overview. For a thorough training course, allow 2 weeks for the project.
11. How will report/data authorization and access control be handled? If MicroStrategy security will be required in addition to network and/or data warehouse security, add up to 2 hours per user.
12. Have test plans been written by the business side for all the reports? If they can't test it, you will never be able to prove a report is complete. Moreover, if they can't tell you how to test it, they probably don't need it.
13. Are all the relationships between attributes and measures known and written into the specifications? Develop a matrix with attributes down one column and measures across the top. Check off each intersection that makes sense to report. Summarize this document by measure, listing all the applicable attributes by which it is reasonable to filter or aggregate that measure. Review this document with the business SMEs to make sure you haven't missed something important...even if it's only conceptual in nature. Test the results by writing SQL scripts against the data warehouse to verify all documented relationships are understood and attainable. Allow 2 days to develop a presentation and meet with the business to review.
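One handy verification script is an orphan-key check: if a fact row can't reach a dimension row, the attribute-measure relationship you documented won't hold on the real data (hypothetical names once more):

-- Any rows returned here are fact records that cannot be described by the
-- product attribute; the documented relationship is broken for them.
SELECT DISTINCT f.product_id
FROM   sales_fact f
WHERE  NOT EXISTS (SELECT 1
                   FROM   product_dim p
                   WHERE  p.product_id = f.product_id);

Run one of these per documented relationship and attach the results to the matrix.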
14. Have drilling hierarchies been defined? Make sure there are adequate tests of measures at various levels of meaningful aggregation along the drilling hierarchies. Again, report proofs may be coded in SQL.
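For instance, a drill from country down to region can be proofed with a pair of queries like these (made-up names); the detail rows should sum exactly to the summary rows:

SELECT r.country_name,
       r.region_name,
       SUM(f.sales_amt) AS sales_amt     -- measure at the drill-down level
FROM   sales_fact f
       JOIN region_dim r ON r.region_id = f.region_id
GROUP  BY r.country_name, r.region_name;

SELECT r.country_name,
       SUM(f.sales_amt) AS sales_amt     -- same measure one level up
FROM   sales_fact f
       JOIN region_dim r ON r.region_id = f.region_id
GROUP  BY r.country_name;

If the two don't tie out, either the hierarchy definition or the data is suspect.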
15. Each fixed, reportable measure must be associated with one or more reasonable (business-defined) attributes. Make sure the business understands all visible attributes defined within the project in their terms. This one thing can save tons of time and avoid misunderstandings later.
16. Have attainable performance acceptance criteria been defined by, and/or negotiated with, the business, based upon the physical constraints of the working resources?
That's a start, and my opinion only, based on real-world experiences. I'm sure others can add much more than I.
Best,
dmcmunn