Some tools of the trade and why a DBA should know more

It is important to understand that a DBA’s work is hardly limited to database performance alone. Here is what I mean by this:

There is much more to the “well-being” of a system than just good database performance. Very often I am asked to perform database performance analysis of web-based systems, which involve much more than the mere “storing and retrieving” of their data.

Let’s take a web-based system. At the “big picture” level, this is what affects the happiness of the user (from a performance perspective; the aesthetics part is a whole other topic):

  • the user’s internet connection speed
  • the total size of the page served, and how fast the data gets to and from the end user
  • a bug-free environment (uninterrupted connections to the front end servers)
  • security – users should care what information is collected and whether SSL is used when sensitive data is entered

On a technical level the factors that make the users happy are:

  • the setup of the front end servers and the proper load balancing
  • the proper capacity planning / sizing of the system
  • the network speeds between the front end servers, the media servers, the database servers and any other third party servers (if any)
  • the speed of the disk systems
  • the proper caching techniques employed on the different server levels
  • the size of the data sets flying around the network (do I really need all this data?)

Why is this all so important?

Here is why: it happens that a database specialist is called in to help out with a database performance issue where the problem is not related only to the database server or to the quality of the queries.

For a DBA it is important to understand that the life span of a website request covers much more than the database access itself (the reads and writes). Network latencies between servers, improperly formed transactions and faulty logic in the middle tier can bring plenty of headaches, and no matter how well-performing a database system you have, the end user will not be happy. The business owners will not be happy, either.
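
As an illustration of what an improperly formed transaction can look like from the database side, here is a minimal sketch (assuming SQL Server 2005 or later and VIEW SERVER STATE permission) that lists user transactions which have been open for more than a minute – often a sign of a middle tier that opens a transaction and then keeps it alive across several round trips:

  -- user transactions open longer than a minute, with the session that owns them
  SELECT
      s.session_id,
      s.host_name,
      s.program_name,
      t.transaction_begin_time,
      DATEDIFF(SECOND, t.transaction_begin_time, GETDATE()) AS open_seconds
  FROM sys.dm_tran_active_transactions AS t
  JOIN sys.dm_tran_session_transactions AS st
      ON st.transaction_id = t.transaction_id
  JOIN sys.dm_exec_sessions AS s
      ON s.session_id = st.session_id
  WHERE st.is_user_transaction = 1
    AND DATEDIFF(SECOND, t.transaction_begin_time, GETDATE()) > 60
  ORDER BY open_seconds DESC;

The 60-second threshold is, of course, an arbitrary number chosen for the example – adjust it to whatever “too long” means for the system at hand.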

The interesting part is that when a performance problem occurs, the chances are that the finger gets pointed at the database first. I know of cases when a DBA was called on site to debug serious performance issues, and after a while the DBA said “Hm, the images your site is displaying are too big.” And yes, if you are serving the end user a 1 MB image, the chances are that the site will seem slow.

Another similar situation I have heard about is one where the network bandwidth between the servers was insufficient and the users were experiencing poor performance of the system.

Once again: why is this important for a DBA?

It is important because, if a DBA is called to solve a problem, the first step is to validate that there IS a problem as it is described, and the second step is to propose a solution (or even several solutions).

Why is it important to validate the problem? It is important because one cannot solve a problem that is not valid to begin with (check out this simple question here – many people got confused by it and calculated the final result without acknowledging that in our reality there is a tight dependency between distance, speed and time; it often happens that the question is about speed, but it is actually a matter of time!)

And so, the moral of the story is that a DBA (if they really want to be an efficient and self-sufficient professional) should actually pay attention to the big picture first, validate the problem, and only then approach it and solve it.

But, how is this the DBA’s business?

Well, it is as simple as this: the happier the client, the more money. I am not saying that a DBA should be able to solve BIG-IP load balancing problems. I am far from this idea. What I mean is that the DBA should have just enough knowledge to be able to point to the load balancer (if the problem is there) and say that the problem is not with the database. Or point to the network delays, and so on…

Now what?

Now, I would like to recommend some tools which could be quite useful:

  • the HttpWatch tool, which gives a wealth of information about what is going on over the network
  • the Wireshark tool, which will give plenty of information about your network packets
  • get yourself familiar with which perfmon counters to monitor on the web servers, application servers, media servers and so on
  • the Performance Dashboard reports in SQL Server are an excellent starting point for a quick system overview: current CPU load (split between how much is used by the system and how much by SQL Server), wait types, missing indexes… (keep in mind that there is a fix for the Performance Dashboard for SQL Server 2008)
  • get yourself familiar with the wait statistics DMVs in SQL Server – a great deal of problems can be spotted by looking at wait types (see the wait statistics sketch after this list)
  • look at your memory and your cache (see the memory counters sketch after this list)
  • T-SQL coding quality
  • and so on
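
For the wait statistics item, here is a minimal sketch of the kind of query I mean – it reads sys.dm_os_wait_stats and filters out a few of the common benign wait types (the exclusion list below is illustrative, not exhaustive):

  -- top waits since the last restart (or since the wait stats were cleared),
  -- skipping some wait types that are normally just background noise
  SELECT TOP (10)
      wait_type,
      waiting_tasks_count,
      wait_time_ms,
      signal_wait_time_ms
  FROM sys.dm_os_wait_stats
  WHERE wait_type NOT IN
      ('SLEEP_TASK', 'LAZYWRITER_SLEEP', 'CHECKPOINT_QUEUE',
       'SQLTRACE_BUFFER_FLUSH', 'REQUEST_FOR_DEADLOCK_SEARCH',
       'XE_TIMER_EVENT', 'LOGMGR_QUEUE', 'WAITFOR',
       'BROKER_TASK_STOP', 'BROKER_TO_FLUSH')
  ORDER BY wait_time_ms DESC;

A high signal wait time compared to the overall wait time usually points towards CPU pressure rather than towards the resource named in the wait type.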
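
For the memory and cache item, a similarly small starting point is to read the SQL Server performance counters exposed through a DMV – for example, page life expectancy and the target versus total server memory (the counter names below are the standard ones; a named instance will show a different object_name prefix):

  -- a few memory-related counters exposed by SQL Server itself
  SELECT
      RTRIM([object_name])  AS [object_name],
      RTRIM([counter_name]) AS [counter_name],
      cntr_value
  FROM sys.dm_os_performance_counters
  WHERE counter_name IN ('Page life expectancy',
                         'Total Server Memory (KB)',
                         'Target Server Memory (KB)');

A page life expectancy that keeps trending downwards is a hint that the buffer cache is under pressure.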

Good luck with your performance tuning! Feel free to ask me questions either by email or by writing comments here.

 

