Five database problems and how to solve them


By now, the fact that data is the most important commodity for a modern business is old news. The value of data and the need for a solid data strategy are well established, and the benefits speak for themselves. However, one aspect that often gets overlooked in conversations about data-driven business is that this information needs to pass through a database in order to start generating value – and it’s here that many organisations run into trouble.

Running a database can be tricky, especially once you start scaling to multiple regions or processing high volumes of transactions at the edge. There are countless ways in which an inexpertly configured and maintained database can develop problems, most of which have a significant impact on performance and availability. Here are some of the top problems that can creep into databases – as well as what to look out for and how to solve them.

Data ingestion

This may sound obvious, but in order for a database to be valuable, it needs to be able to ingest data – and this process can present its own problems for unwary organisations. For example, you need to think about what sources your data is coming from, and any integrations that may be required in order to get it from those sources into your database. These connections may require additional resources to build and operate, and their health needs to be monitored in order to prevent bottlenecks.

Another concern is the speed at which data needs to be ingested. Real-time data processing sounds attractive, and can be hugely advantageous for certain applications, but for databases which aren’t accessed regularly, or where the contents aren’t time sensitive, real-time ingestion may introduce unnecessary complexity and overhead when batch processing would be just as effective.
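
For instance, where the contents aren’t time sensitive, a scheduled batch load is often simpler and cheaper to run than a streaming pipeline. Below is a minimal sketch of such a load; the table, columns and file path are hypothetical, and the COPY command is PostgreSQL-specific.

-- Nightly batch load from an exported CSV file, instead of streaming each event as it arrives.
COPY customer_events (customer_id, event_type, occurred_at)
FROM '/data/exports/customer_events_latest.csv'
WITH (FORMAT csv, HEADER true);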

These issues can be circumvented by planning out and documenting the details of your database architecture as early as possible. As part of this, DBAs should also be thinking about what kinds of data they’re ingesting into a given database; for example, if you’re building a customer database to serve a mailing list, you can probably stick to basic details like name and email address, without worrying about age, physical address, height and shoe size. The smaller your database is, the faster it will run.
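
To make the mailing-list example concrete, a lean schema might contain little more than the following. This is a minimal sketch with hypothetical names; the identity-column syntax assumes PostgreSQL or another database that supports standard SQL identity columns.

-- A deliberately lean subscribers table: only what the mailing list actually needs.
CREATE TABLE mailing_list_subscribers (
    subscriber_id BIGINT GENERATED ALWAYS AS IDENTITY PRIMARY KEY,
    full_name     VARCHAR(200) NOT NULL,
    email_address VARCHAR(320) NOT NULL UNIQUE,
    subscribed_at TIMESTAMP    NOT NULL DEFAULT CURRENT_TIMESTAMP
);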

Database size

As we mentioned above, a lean database is a fast database – and by the same token, a database that is allowed to grow unchecked will perform increasingly poorly over time. This often happens without being noticed, which is why it’s important to conduct regular audits and health checks on your database, both to monitor its size and to make sure that it doesn’t contain any unnecessary data.
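
A quick way to monitor size is to ask the database itself which tables are the largest. The query below is a sketch that assumes PostgreSQL and user tables in the public schema; other engines expose similar information through their own system views.

-- List the ten largest tables, including their indexes and out-of-line storage.
SELECT relname                                     AS table_name,
       pg_size_pretty(pg_total_relation_size(oid)) AS total_size
FROM   pg_class
WHERE  relkind = 'r'
  AND  relnamespace = 'public'::regnamespace
ORDER  BY pg_total_relation_size(oid) DESC
LIMIT  10;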

Of course, what counts as ‘unnecessary data’ will differ from one organisation to another, and organisations should have specific policies governing what data to collect, which databases to house it in and how long to keep it for. Not only does this help keep the size of databases down by allowing for old data to be cleared through rules-based automation, it also helps organisations maintain a good compliance posture when dealing with personal information.
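
In practice, the rules-based clean-up can be as simple as a scheduled job that deletes rows older than the retention period. The statement below is a sketch with hypothetical table and column names and PostgreSQL-style interval syntax; in production it would run under whatever scheduler the organisation already uses.

-- Hypothetical retention rule: purge customer events more than 24 months old.
DELETE FROM customer_events
WHERE  occurred_at < CURRENT_DATE - INTERVAL '24 months';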

Query optimisation

One of the most pervasive problems affecting databases is poorly constructed queries. Sometimes this is the result of gradual tweaking as a database evolves over time; in other cases, the queries were badly written from the word go. Whatever the reason, an inefficient query can have a major impact on performance and stability.

There are a few common culprits to look for when troubleshooting underperforming queries. For example, a query may use too many join operations, or be structured in such a way that it performs full index scans for a relatively simple task. While you can manually comb through your database operations looking for inefficiencies, this is a slow and tedious task for DBAs.
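
As a concrete illustration of what to look for, the sketch below uses hypothetical table and column names and shows how an execution plan exposes a full scan, and how adding an index changes it. EXPLAIN works in PostgreSQL and MySQL, among others; other engines have equivalent commands.

-- Looking up orders by customer: without a suitable index, the plan shows a full scan.
EXPLAIN
SELECT order_id, total
FROM   orders
WHERE  customer_id = 42;

-- Add an index on the filter column, then check the plan again: it should now use an index scan.
CREATE INDEX idx_orders_customer_id ON orders (customer_id);

EXPLAIN
SELECT order_id, total
FROM   orders
WHERE  customer_id = 42;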

It’s often much simpler to use utilities like Quest SQL Optimizer, which can automate much of the process. Tools such as this allow you to scan your database environment for potentially inefficient SQL statements, test alternatives to determine relative performance and restructure indexes to improve overall database speed.

Database security

The database is the crown jewel of any organisation’s IT. It’s the place where your customer records, financial information, access credentials and any number of other valuable assets are kept – which makes it a tempting target for hackers. Unfortunately, databases are also among the elements of a company’s infrastructure most prone to developing security holes as they grow, so they should be among the top priorities for security teams to protect.

We’ve already covered the importance of regular auditing from a database health perspective, but it can also have a big impact on security. With Quest’s Change Auditor, admins can track data-level events such as changes to rows, triggers and tables, as well as alterations to access permissions – another area that must be closely monitored in order to ensure good security.
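
For teams without a dedicated auditing tool, a rough equivalent of data-level change tracking can be built in the database itself. The sketch below is not how Change Auditor works; it assumes PostgreSQL 11 or later and a hypothetical orders table, and simply records every row change in a separate audit table.

-- Audit table and trigger recording who changed what, and when, on the orders table.
CREATE TABLE orders_audit (
    audit_id   BIGINT GENERATED ALWAYS AS IDENTITY PRIMARY KEY,
    changed_at TIMESTAMPTZ NOT NULL DEFAULT now(),
    changed_by TEXT        NOT NULL DEFAULT current_user,
    operation  TEXT        NOT NULL,   -- INSERT, UPDATE or DELETE
    old_row    JSONB,                  -- row state before the change, where applicable
    new_row    JSONB                   -- row state after the change, where applicable
);

CREATE OR REPLACE FUNCTION log_orders_change() RETURNS trigger AS $$
BEGIN
    IF TG_OP = 'INSERT' THEN
        INSERT INTO orders_audit (operation, new_row) VALUES (TG_OP, to_jsonb(NEW));
        RETURN NEW;
    ELSIF TG_OP = 'UPDATE' THEN
        INSERT INTO orders_audit (operation, old_row, new_row) VALUES (TG_OP, to_jsonb(OLD), to_jsonb(NEW));
        RETURN NEW;
    ELSE
        INSERT INTO orders_audit (operation, old_row) VALUES (TG_OP, to_jsonb(OLD));
        RETURN OLD;
    END IF;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER orders_audit_trigger
AFTER INSERT OR UPDATE OR DELETE ON orders
FOR EACH ROW EXECUTE FUNCTION log_orders_change();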

Resource consumption

According to Gartner, three-quarters of all databases will be deployed in the cloud by next year, but while the cloud can be great for simplifying the operation and maintenance of your databases, that doesn’t mean you can forget about hardware capacity and configuration altogether. If you’ve over-estimated the compute and storage needs of your cloud database in the initial planning stages, it’s easy for running costs to spiral rapidly. Equally, if your database’s needs change over its lifetime, you may find that capacity has to be expanded as it grows.

This is why it’s important to keep a close eye on the performance and resource consumption of cloud databases, periodically optimising your storage, compute and networking configuration to ensure that costs and performance match expectations. Tools like Quest Foglight Evolve can help organisations make more accurate estimates of the cloud service tier required by the virtual machine or container on which a database resides before it is migrated, and then track the ongoing costs of that workload once the migration is complete, all within Foglight.

ITPro

ITPro is a global business technology website providing the latest news, analysis, and business insight for IT decision-makers. Whether it's cyber security, cloud computing, IT infrastructure, or business strategy, we aim to equip leaders with the data they need to make informed IT investments.
