Many DBAs and developers I know “wing it” when it comes to database design. What are the issues around doing this?
I think you may have already gathered that I stand (or is that fly?) with the ‘wing-it brigade’, but only after years of experience. In days gone by, clients used to ask us to write contracts that guaranteed response times at the terminal, so we simply had to make sure that none of our systems had any chance of being a bad performer. It’s a question, again, of balance between the simple design that may not meet the application requirements and the over-complex one. You need to know your host database’s capabilities and you need to know your user requirements. For instance, when and how often is a particular report or view to be produced? If the answer is ‘monthly’, do you really need to keep an extra index on a large table just for that purpose?
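To make the trade-off concrete, here is a minimal sketch (using Python’s built-in sqlite3, with invented table and column names) of one pragmatic alternative to a permanent index: build the index only when the monthly report runs, then drop it so day-to-day inserts pay no ongoing maintenance cost.

```python
import sqlite3

# Hypothetical example: a large transactions table queried by date only
# for a monthly summary report.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE transactions (account_id INTEGER, amount REAL, posted_on TEXT)"
)
conn.executemany(
    "INSERT INTO transactions VALUES (?, ?, ?)",
    [(i % 100, i * 0.5, "2024-01-%02d" % (i % 28 + 1)) for i in range(1000)],
)

def run_monthly_report(conn):
    # Create the index only for the duration of the report...
    conn.execute("CREATE INDEX idx_tmp_posted ON transactions (posted_on)")
    try:
        rows = conn.execute(
            "SELECT posted_on, SUM(amount) FROM transactions "
            "GROUP BY posted_on ORDER BY posted_on"
        ).fetchall()
    finally:
        # ...then drop it, so everyday inserts carry no index overhead.
        conn.execute("DROP INDEX idx_tmp_posted")
    return rows

report = run_monthly_report(conn)
print(len(report))  # one summary row per posting day
```

Whether this beats a permanent index depends entirely on table size, insert volume, and how often the report really runs, which is exactly the kind of judgement being described.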
The largest databases I have been involved with were in the banking industry back in the ’80s. Sometimes we had to design the system around the hardware limitations. For instance, we calculated interest costs only for ongoing customers, and posted the transactions every six months. We then had complex routines to work out interest adjustments for joiners and leavers, and to display or print them on an ad-hoc, per-account basis. But the point was not to clag up the database with millions of extra transactions that were not strictly necessary. Of course times have changed, but very few organizations have unlimited resources, so a pragmatic approach to database design is always needed.
Many companies I am familiar with don’t have a formal method of source control. Why is this bad? What are the benefits of good source control?
At one time we had around 50 large clients using our systems, which comprised over one million lines of code in about 8,000 units. None of the clients was prepared to upgrade and apply patches in an orderly fashion on a prescribed day of the year, so we had a library management problem that resulted in an inability to issue the right set of patches or replacement units to any particular client. We eventually addressed the problem by building a fully automated library and change management system that allowed us to track all changes at a client level. Whenever we made a delivery, we could automatically ship all the relevant changes. Fault reports at our help desk dropped by 80% over two years, and our customer satisfaction went screaming up. It’s a question of inventory and change management again. Most manufacturers do it very well these days (did someone say ISO?). I have not seen any automated system you can use for tracking, storing and change-managing Transact-SQL code, but if one exists and you have these problems, use it!
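The core of that change-management idea can be sketched in a few lines: fingerprint each code unit, record which version every client holds, and compute the patch set a given client is missing. This is an invented illustration of the concept, not the system described; file names and contents are hypothetical.

```python
import hashlib

def fingerprint(sql_text: str) -> str:
    """Stable fingerprint of one Transact-SQL unit's source text."""
    return hashlib.sha256(sql_text.encode("utf-8")).hexdigest()

def patch_set(current_units: dict, client_units: dict) -> list:
    """Units the client lacks, or holds in an out-of-date version."""
    return sorted(
        name for name, text in current_units.items()
        if client_units.get(name) != fingerprint(text)
    )

# The library's current state: unit name -> source text.
current = {
    "usp_post_interest.sql": "CREATE PROC usp_post_interest AS ... -- v2",
    "usp_close_account.sql": "CREATE PROC usp_close_account AS ...",
}
# What one client is known to have: unit name -> fingerprint shipped.
client = {
    "usp_post_interest.sql": fingerprint("CREATE PROC usp_post_interest AS ... -- v1"),
    "usp_close_account.sql": fingerprint("CREATE PROC usp_close_account AS ..."),
}

print(patch_set(current, client))  # → ['usp_post_interest.sql']
```

A real system would also track dependencies between units, release history, and rollback, but the inventory principle is the same: know exactly what each client has.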
Many companies have a scenario where they have a development database, a test database, and a production database. What is the most efficient way to maintain changes throughout all these databases?
I have only one answer: use dbPAL. Within its schema version and change management environment, we have developed the concept of a ‘schema domain’. Because we have absolute version control, we can readily manage the exact state of all database instances in the same family. You can point a dbPAL schema at a database and be told exactly what version it is. You can extract a schema from a database and compare it with an existing schema. If you have a group of developers, each with their own test database(s), you can rapidly apply updates at times that suit each developer. You can externalize and save different datasets representing (say) test beds and be confident that they can be imported into later versions of the database.
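The underlying idea of schema comparison can be illustrated with a toy sketch (this is not dbPAL’s actual API; the representation and names are invented): model each schema version as a mapping of table name to column list, then diff two versions to see what an upgrade from one to the other must change.

```python
def diff_schemas(old: dict, new: dict) -> dict:
    """Report tables added, dropped, or altered between two schema versions."""
    return {
        "added_tables": sorted(set(new) - set(old)),
        "dropped_tables": sorted(set(old) - set(new)),
        "altered_tables": sorted(
            t for t in set(old) & set(new) if old[t] != new[t]
        ),
    }

# Two hypothetical schema versions of the same database family.
v1 = {"accounts": ["id", "name"], "txns": ["id", "amount"]}
v2 = {"accounts": ["id", "name", "email"], "audit": ["id", "event"]}

changes = diff_schemas(v1, v2)
print(changes)
```

Once every development, test, and production instance is pinned to a known version, the diff between versions is exactly the script you need to ship, which is the point being made about a ‘schema domain’.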