I don't think an Object Oriented programming environment affects the issue of normalization in any substantive way.
The general rule is that active data should be more normalized than historical data. Normalization's main advantage is that facts are represented only once in the database, and are independent of other facts, so there are minimal opportunities for inconsistencies to creep in and ruin the value of the database. This single-sourcing of facts also means that, if a fact changes, you don't have multiple places to correct the data. Ultimately, it leads to minimum coding and maximum efficiency for active data.
Historical (inactive) data, on the other hand, is seldom updated, so the focus changes to retrieving data as quickly as possible. Denormalization helps here by minimizing the time spent searching multiple tables for the facts you need. Fully denormalized data comes very close to looking like indexed "flat files", and as a mainframer (you identify yourself as 'MIS', so I'm assuming) you know that flat files are faster than any database when it comes to retrieval.
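The trade-off described above can be sketched in code. This is just an illustration with an assumed customers/orders schema (not from the thread), using Python's sqlite3 as a stand-in for whatever SQL backend you're on: active data stays normalized and is reassembled with a join, while a historical copy is flattened so retrieval needs no join at all.

```python
# Illustration only: assumed schema, sqlite3 standing in for the real backend.
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Normalized: each fact is stored exactly once.
cur.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
cur.execute("INSERT INTO customers VALUES (1, 'Acme')")
cur.execute("INSERT INTO orders VALUES (100, 1, 250.0)")

# Active data: a join reassembles the facts at query time.
row = cur.execute(
    "SELECT c.name, o.total FROM orders o "
    "JOIN customers c ON c.id = o.customer_id WHERE o.id = 100"
).fetchone()
print(row)  # ('Acme', 250.0)

# Historical data: denormalize into a flat table so retrieval needs
# no join (the customer name is now stored redundantly).
cur.execute(
    "CREATE TABLE order_history AS "
    "SELECT o.id, c.name AS customer_name, o.total "
    "FROM orders o JOIN customers c ON c.id = o.customer_id"
)
flat = cur.execute(
    "SELECT customer_name, total FROM order_history WHERE id = 100"
).fetchone()
print(flat)  # ('Acme', 250.0)
```

The redundancy in `order_history` is harmless precisely because historical rows are seldom updated; for active data, that same redundancy is where inconsistencies would creep in.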
Unless you're talking about an Object Oriented Database--but an Access forum might not be the best place to get a viewpoint on that. (Are there any commercial-grade object oriented database packages out there?) Rick Sprague
This fellow didn't strike me as a student, but as a manager in a shop considering upgrading from older technology. It doesn't seem to me that a student would choose the "MIS" identification.
But I might be influenced by the fact that my current client is a shop upgrading from older technology, and we're converting DL/I databases to DB2. This question gets asked, constantly, in one form or another, as we design the table structure. Rick Sprague
Quite possible you're correct. I just figured the poster was an MIS student! =============
Jeremy Wallace
Designing, Developing, and Deploying Access Databases Since 1995
We have a client/server application that uses Access. The data structure is normalized right now, but we're having to use lots of joins in our SQL statements for reports, and also lots of functions inside our classes that enable us to get at certain data that's not available in the object. If we were to denormalize the data, we could have all the information available to us in one object/class, and that would reduce the number of joins needed to create custom reports. I'm used to having one object per table in the database and one object to join two related tables, so I was just trying to get a feel for what other developers are doing in similar situations.
I would never denormalize a production database if I could help it. To Rick's reasons, which are very good, I would add that if you are doing a lot of custom reports or exports, you can't beat the flexibility of normalized data. True, it is more complex and somewhat more time-consuming to design and print, but you can literally get whatever you need, whenever you need it. If you have several reports that need certain data that, if put in one table, would be considered denormalized, and you use those reports often, one thing you can do is create a table just for those reports and add code to update that table any time the main (normalized) data is changed. That will give you the advantages mentioned above of having the data normalized, plus the extra speed and simplicity of reporting from that specific data.
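One way to wire up the "update the report table whenever the main data changes" idea is a trigger on the normalized source tables. The sketch below is an assumed schema, not from the thread, and uses sqlite3 only so it runs; in Access the sync code would more likely live in form or class event procedures.

```python
# Sketch (assumed schema): a trigger keeps a report-only flat table
# in sync with the normalized source of truth.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL);

-- Denormalized table used only by the reports.
CREATE TABLE report_orders (order_id INTEGER, customer_name TEXT, total REAL);

-- Whenever a customer is renamed, propagate the change to the
-- report table so it never drifts from the normalized source.
CREATE TRIGGER sync_report_name AFTER UPDATE OF name ON customers
BEGIN
    UPDATE report_orders SET customer_name = NEW.name
    WHERE order_id IN (SELECT id FROM orders WHERE customer_id = NEW.id);
END;

INSERT INTO customers VALUES (1, 'Acme');
INSERT INTO orders VALUES (100, 1, 250.0);
INSERT INTO report_orders VALUES (100, 'Acme', 250.0);

UPDATE customers SET name = 'Acme Corp' WHERE id = 1;
""")
name = con.execute("SELECT customer_name FROM report_orders").fetchone()
print(name)  # ('Acme Corp',)
```

The normalized tables remain the single source of truth; the flat table is disposable and can always be rebuilt from them.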
OK, I owe you an apology. Sorry. I seem to have a bone to pick with students trying to get their homework done on forums like this, so I sometimes jump the gun.
I'm not a complete purist about normalization, but I do think it's important to be as normalized as performance will allow. I agree with Rick that I don't know that OO has much to do with it, though I'll admit that my OO experience isn't that much more than 00.
It sounds like you know what you're doing, so you may well have already looked into this stuff, but indexing can play a big role in performance.
And everything Rick said is, no surprise, spot on.
But ultimately, if the application is "too" slow (quotes used _not_ for emphasis, but to point out how subjective that is), denormalization may indeed be not only a legitimate strategy, but necessary.
Again, sorry for the slam.
Jeremy =============
Jeremy Wallace
Designing, Developing, and Deploying Access Databases Since 1995
If you're doing an evening/overnight report production cycle, you can have the best of both worlds, almost for free. At the end of the day, you just run a bunch of Make Table queries that build a denormalized copy of your production database, and then run the reports off that. Since you're doing client/server, you'd benefit even more by building the snapshot copy on the client where the reports will run.
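The overnight cycle described above amounts to dropping and rebuilding a denormalized snapshot, then reporting off it. In Access that's a Make Table query (`SELECT ... INTO`); the sketch below uses sqlite3 and an assumed schema only so it runs as written.

```python
# Sketch of the nightly snapshot refresh. Table and column names
# are assumed, not taken from the thread.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL);
INSERT INTO customers VALUES (1, 'Acme'), (2, 'Globex');
INSERT INTO orders VALUES (100, 1, 250.0), (101, 2, 99.5);
""")

def rebuild_snapshot(con):
    """Nightly job: replace the report snapshot wholesale."""
    con.executescript("""
    DROP TABLE IF EXISTS rpt_orders;
    CREATE TABLE rpt_orders AS
        SELECT o.id AS order_id, c.name AS customer_name, o.total
        FROM orders o JOIN customers c ON c.id = o.customer_id;
    """)

rebuild_snapshot(con)
# Reports now read the flat snapshot -- no joins needed.
rows = con.execute(
    "SELECT customer_name, total FROM rpt_orders ORDER BY order_id"
).fetchall()
print(rows)  # [('Acme', 250.0), ('Globex', 99.5)]
```

Because the snapshot is rebuilt from scratch each night, there's no sync code to maintain, which is what makes this "almost free" compared with keeping a denormalized table continuously up to date.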
Reports will run faster, your queries and code will be easier to maintain, and if you want to look at it that way, you even get a free copy to use as a backup or send to data warehousing. Rick Sprague