If you have seen my SQL Server Tools Tips and Tricks presentation at a SQL Saturday, you may remember my brief discussion of Central Management Server (CMS). A 45- to 60-minute session is nowhere near enough time to dive into all the tools, including this cool hidden gem that the Microsoft SQL Server team provided us with the SQL Server 2008 release. Even though this is a well-blogged-about tool, a recent conversation at my new job reminded me that I needed, and wanted, to blog about CMS. I regularly run into and speak with individuals who work with SQL Server and are not aware of CMS or what it can and can't do for them. So let's get this hidden gem installed and start utilizing its POWER!

The 1st thing you will need is a SQL Server 2008 or later instance installed with the msdb database available. I will be setting up and taking screenshots from my SQL Server 2012 instance for the purpose of this post, and I will be configuring CMS as sysadmin on this database instance. For this post I will be using my local instance; however, you could also use an instance in your datacenter. There are advantages and disadvantages to both, based on your working environment.

Let’s get started with the CMS setup.

Open up SQL Server Management Studio (SSMS). Depending on how you have SSMS set up, it will look something like the image below.

The next step is to open up our Registered Servers view. This can be done with Ctrl+Alt+G (2012) or by choosing View and then clicking Registered Servers.

Once completing the above you should see a screen similar to the following image.

At this point we are on the screen that allows us to register CMS. Right-click on Central Management Servers and choose Register Central Management Server….

You should see the New Server Registration screen appear. For this example I chose SQL Server Authentication; you can also use Windows Authentication when registering the server. For Server name I am using (local) to connect to my local SQL Server 2012 instance. The following example uses the SQL login cmsadmin and its secure password to authenticate into the instance. At this time I am not making any changes to the Connection Properties; in rare cases I have had the need to modify one or more of them. Click Test, and once you receive a successful connection, click Save.

You should see the local instance registered underneath the Central Management Servers folder. Note that (local) can be replaced with your database server name, which would then appear underneath the Central Management Servers folder.

Underneath our local server, let's set up some server groups. Normally I start off by creating Development, Test, Staging, and Production; this can vary based on your environment, and I later add additional server groups as needed. For example, I once supported scaled-out BizTalk and SSRS environments, and I would create a server group for each, registering the server or servers that are part of that environment. Our next step is to create our basic server groups. This can be done by right-clicking over (local), which will display the image below.

Click on New Server Group… to display the next screen. I filled in the Group name with Development and the Group description with information about the group. Click OK.

After creating the Server Groups you will end up with something like the following image.

The next step in the process is to register some servers to take advantage of the power of CMS. Right-click over the server group you would like to register your database server under. When you see the following screen, click on New Server Registration….

Fill in the Server name with a server you want to access utilizing CMS. When registering a server you MUST use Windows Authentication. Click on Test, and once you have a successful connection, click on Save.

Please read the included link for further explanation of why Windows Authentication must be used to connect. An excerpt from the site is provided below.

http://msdn.microsoft.com/en-us/library/bb934126(v=sql.110).aspx#Security

“Because the connections that are maintained by a central management server execute in the context of the user, by using Windows Authentication, the effective permissions on the registered servers might vary. For example, the user might be a member of the sysadmin fixed server role on the instance of SQL Server A, but have limited permissions on the instance of SQL Server B.”

Once you have registered your servers you can end up with something like this. Each person I run into that utilizes CMS has their own take on how they like to set the groups and registered servers. I typically find myself migrating to the following setup which is included in the image below.

Here is where it really gets cool. Let's utilize the power of CMS. You can first double-click on any of the registered servers and it will open a connection to that server within SSMS. You can also right-click on one of the groups and open a query window. In this case, let's hover over the Development group. Select New Query from the selections provided, and a window will open which can be used to run a statement against the servers registered within that server group. Notice at the bottom of the query window we are connected to 1/1 servers. This, however, does not display the full power of this tool.

Execute the following query:

SELECT
    SERVERPROPERTY('Edition') AS [Edition],
    SERVERPROPERTY('ProductLevel') AS [ProductLevel],
    SERVERPROPERTY('ProductVersion') AS [ProductVersion]

You should see the statement return something similar to the output below:

Edition                     ProductLevel   ProductVersion
Standard Edition (64-bit)   RTM            11.0.2100.60


Not too impressive yet. Let's kick it up a notch and see what else CMS can do for us.

Hovering over (local) or {your server name}, this time right-click and select New Query. Selecting New Query from the selections provided will open a new window which will allow us to run a SQL statement against the servers registered within all the server groups. Notice at the bottom of the query window we are connected to 4/4 servers; the connected value will vary based on the number of servers registered. *Note: you can also do this at the group level.*

Execute the following query:

SELECT
    SERVERPROPERTY('Edition') AS [Edition],
    SERVERPROPERTY('ProductLevel') AS [ProductLevel],
    SERVERPROPERTY('ProductVersion') AS [ProductVersion]

You should see the statement return something similar to the output below which will be dependent on your registered servers:

Server Name     Edition                     ProductLevel   ProductVersion
StagingServer   Standard Edition (64-bit)   SP1            10.50.2500.0
DevServer       Standard Edition (64-bit)   RTM            11.0.2100.60
TestServer      Standard Edition            SP3            9.00.4060.00
ProdServer      Enterprise Edition          SP4            8.00.2066


In a matter of seconds I was able to pull the versions of SQL Server I had out there in my different environments. Imagine this: you have hundreds if not thousands of servers. You have become the "Accidental DBA," know nothing about the environment you inherited, were left with just a list of servers, and have no knowledge of PowerShell. CMS is a quick and powerful way to extract information from your SQL Server database instances using nothing more than basic SQL statements. Seeing just how powerful this is should also give you some pause: do I really want to run this statement against our live production databases? My best advice is to use best practices and common sense. Possibly think about using SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED when executing your diagnostic queries.
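For example, prefixing a CMS multi-server query with the isolation-level change looks like the sketch below (SERVERPROPERTY itself takes no table locks, but the pattern matters for diagnostic queries that read user tables):

```sql
-- READ UNCOMMITTED avoids taking shared locks on busy production tables,
-- at the cost of possibly reading dirty (uncommitted) data -- acceptable
-- for a quick inventory check, not for anything that must be exact.
SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;

SELECT
    SERVERPROPERTY('Edition')        AS [Edition],
    SERVERPROPERTY('ProductLevel')   AS [ProductLevel],
    SERVERPROPERTY('ProductVersion') AS [ProductVersion];
```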

Registering a CMS stores its information within the msdb database. Security to CMS can be controlled utilizing a couple of database roles within msdb: ServerGroupAdministratorRole provides the ability to administer CMS, while ServerGroupReaderRole grants users the ability to connect to CMS. Two views of interest are sysmanagement_shared_server_groups and sysmanagement_shared_registered_servers.
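Granting access is ordinary msdb role membership. A sketch, where the principal names are placeholders for your own msdb database users (on SQL Server 2008 use sp_addrolemember instead of ALTER ROLE):

```sql
USE msdb;
GO
-- Placeholder principals -- substitute your own msdb database users.
-- Read-only access to the shared registered-server tree:
ALTER ROLE ServerGroupReaderRole ADD MEMBER [DOMAIN\CmsReader];
-- Rights to create and modify groups and server registrations:
ALTER ROLE ServerGroupAdministratorRole ADD MEMBER [DOMAIN\CmsAdmin];
```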

The sysmanagement_shared_server_groups view will return information with respect to the groups you have created for your registered CMS instance.

You may want to execute the following statement against the view to see how this data is being stored.

SELECT server_group_id, name, description, parent_id,
       num_server_group_children, num_registered_server_children
FROM msdb..sysmanagement_shared_server_groups
WHERE is_system_object <> 1

server_group_id   name          description                                parent_id   num_server_group_children   num_registered_server_children
7                 Development   Place Development Servers in this group.   1           0                           1
8                 Staging                                                  1           0                           1
9                 Test                                                     1           0                           1
10                Production                                               1           0                           1


The sysmanagement_shared_registered_servers view will return information with respect to the servers you have registered and the group they were registered within.

You may want to execute the following statement against the view to see how this data is being stored.

SELECT server_id, server_group_id, name, server_name, description, server_type
FROM msdb..sysmanagement_shared_registered_servers

server_id   server_group_id   name            server_name     description   server_type
1006        7                 DevServer       DevServer                     0
1008        8                 StagingServer   StagingServer                 0
1009        9                 TestServer      TestServer                    0
1007        10                ProdServer      ProdServer                    0


There are a number of stored procedures that are utilized to manage CMS, which you can examine for yourself by executing the following query.

SELECT * FROM msdb.sys.sysobjects
WHERE name LIKE '%sysmanagement%' AND xtype = 'P'

CMS is a powerful tool that the Microsoft team has provided out of the box at no extra cost to you. Once CMS has been set up and configured, the insight into your systems and environments is priceless. One of the reasons I embrace the Microsoft SQL Server suite is that the tools provided have been thought through and have had time put into them. Some are good, some are just OK. CMS is one of those tools that provides DBAs, developers, and even management a view into our SQL world.

References I utilized in the past when first setting up CMS:

One of the first links I ever hit when examining CMS:

http://www.sqlskills.com/blogs/kimberly/post/SQL-Server-2008-Central-Management-Servers-have-you-seen-these.aspx

Microsoft link mentioned in this post:

http://msdn.microsoft.com/en-us/library/bb934126(v=sql.100).aspx

This Post written while Jamming to DKM….

Last Thursday night during my wind-down period I found myself doing my last nightly scan of Twitter. Thomas LaRock (@SQLRockstar) tweeted about the SQLHelp community within Google+. I do love Twitter, but the 140-character limit can be very limiting. I had avoided Google+ up to this point, not wanting another distraction keeping me from my lovely wife Missy. Watching television shows like Project Runway and Housewives of … is always the highlight of my evening {sarcasm}. It only took a couple of minutes and I had another user account and password to keep track of. In the last 4 days of use I have been able to provide some insight as well as receive some great feedback within the Google+ SQLHelp community. At last check the community was up to 117 members, and I foresee this number growing quickly. Many subgroups have been added to the SQLHelp community, like SSIS Help, SQL Help, and Vendor Tools, to name a few. This is a community I am happy to be a part of, and it is another avenue for all of us to share our diverse knowledge base. Please come join us, and be sure to thank Thomas for not losing hope and bringing this Google+ community to us.

It is also important to mention that Thomas has created a couple of other communities for us. The Fathers in Technology community is open and ready for business. We have had some great suggestions and discussions in the last couple of days. Even if you are not a father in technology, come visit, read our thoughts, and even share your own. As a father in technology I am always open to great discussions. Fathers in Technology has a couple of subgroups, such as How to Unplug for Your Kids and Working from Home, that may be of interest to many of you. Since many of us pay the bills by supporting SQL Server 2000, Thomas has also created the community group SQL 2000 Must DIAF. This is a great place to have fun, post our frustrations, and help SQL Server 2000 DIAF (Die in a Fire). With the stress this job can provide, it is fun to poke back at it.

Blog Post Brought to you by Kingdom of Sorrow radio.

Back in October of 2005 (in the long long ago), Microsoft released SQL Server 2005 to market. Since this release Microsoft has provided DBAs detailed insight into server state via dynamic management views (DMVs).

In the beginning I did not fully embrace the DMVs and the changes made by Microsoft. In all honesty it was not until the past 3 or 4 years that I started embracing them. Maybe it was part of my natural progression of learning, as well as part of my path to becoming a more knowledgeable DBA. I think it is safe to say that each and every day a good portion of us are learning new uses, or even new DMVs, to help resolve issues on our database servers.

One of my all-time favorite system tables was sysprocesses. With the release of SQL Server 2005 Microsoft created 3 new DMVs which have been maintained in all versions of SQL Server since: sys.dm_exec_connections, sys.dm_exec_sessions, and sys.dm_exec_requests were mapped to the sys.sysprocesses system view. I kicked, I screamed, and I resisted these DMVs, convinced that sysprocesses provided me with everything I needed to understand about the connections within my database instance. Yikes! I was so wrong.

A little over 4 weeks ago I was at a shop that had over 2,052 instances of SQL Server 2000; approximately 75 percent of the environment was SQL Server 2000. We DBAs at the shop were more likely to be troubleshooting a SQL Server 2000 instance, requiring the use of sysprocesses since DMVs were not yet available. I needed to push myself to grab hold of and utilize the server-scope views, since my daily routine did not require them. My advice to any of you holdouts: move forward. You are being left behind, and as difficult as it may seem, these server-scope views are our future.

Here are a couple of recommendations and explanations for the views which map back to old faithful sysprocesses. I think at this point it is important to point out that sysprocesses is no longer a system table; it has been changed to a system view.

The 1st DMV I always utilize when examining sessions connected to a database instance is sys.dm_exec_sessions. This server-scope view provides one row per authenticated session and shows information about all active user connections and internal tasks. When returning result sets from sys.dm_exec_sessions I like to look at session_id, login_time, host_name, program_name, status, transaction_isolation_level, and last_request_end_time. There are many other columns that can be returned, all fully documented by Microsoft. You can find detailed documentation about sys.dm_exec_sessions at the provided Microsoft link.
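A minimal sketch of the session overview I start with, using the columns called out above (on pre-2012 instances, filter with session_id > 50 instead of is_user_process):

```sql
SELECT session_id, login_time, host_name, program_name,
       status, transaction_isolation_level, last_request_end_time
FROM sys.dm_exec_sessions
WHERE is_user_process = 1      -- user sessions only; excludes internal tasks
ORDER BY login_time;
```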

http://msdn.microsoft.com/en-us/library/ms176013.aspx

The second DMV I utilize when examining sessions connected to a database instance is sys.dm_exec_requests. This server-scope view provides one row for each request executing within a SQL Server instance. When returning result sets from sys.dm_exec_requests I typically look at session_id, start_time, status, command, sql_handle, plan_handle, blocking_session_id, wait_type, and wait_time. There are many other helpful columns that can be included in your result set, such as percent_complete (very cool for backups/restores) and reads and writes. You can find detailed documentation about sys.dm_exec_requests at the provided Microsoft link.
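A sketch pulling the request-level columns above, which quickly surfaces blocking:

```sql
SELECT session_id, start_time, status, command,
       sql_handle, plan_handle, blocking_session_id,
       wait_type, wait_time,
       percent_complete          -- populated for BACKUP, RESTORE, DBCC, etc.
FROM sys.dm_exec_requests
WHERE session_id <> @@SPID       -- leave this query itself out
ORDER BY blocking_session_id DESC;
```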

http://msdn.microsoft.com/en-us/library/ms177648.aspx

The third DMV I utilize when examining sessions connected to a database instance is sys.dm_exec_connections. This DMV provides server-level information about the connections in SQL Server. When returning result sets from sys.dm_exec_connections I typically pull back session_id, connect_time, auth_scheme, client_net_address, and client_tcp_port. When we were having Kerberos issues with SSRS, we utilized sys.dm_exec_connections to identify the SQL Server authentication scheme being used by our reporting system. You can find detailed documentation about sys.dm_exec_connections at the provided Microsoft link.
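The equivalent sketch for connection-level details; auth_scheme is the column we leaned on during the Kerberos troubleshooting:

```sql
SELECT session_id, connect_time,
       auth_scheme,              -- KERBEROS, NTLM, or SQL
       client_net_address, client_tcp_port
FROM sys.dm_exec_connections;
```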

http://msdn.microsoft.com/en-US/library/ms181509(v=sql.110).aspx

I am convinced that once you make the leap into utilizing these DMVs you will not look back. You can find all the information sysprocesses provided you, with some added bonuses. Once you become familiar with sys.dm_exec_connections, sys.dm_exec_sessions, and sys.dm_exec_requests individually, have fun writing queries that join them together. Also, as of April 2013 we should all see a dramatic drop in the number of SQL Server 2000 database instances we support, since it is officially no longer supported by Microsoft. SQL Server 2000 instances will still exist, but the ratio should change dramatically in favor of SQL Server instances that have DMVs providing us more detailed insight into our server state.
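One common way to join the three (a sketch, not the only shape such a query can take; sys.dm_exec_sql_text resolves the handle of whatever each request is running):

```sql
SELECT s.session_id, s.host_name, s.program_name,
       c.auth_scheme, r.status, r.command,
       r.blocking_session_id, t.text AS running_sql
FROM sys.dm_exec_sessions AS s
JOIN sys.dm_exec_connections AS c
  ON c.session_id = s.session_id
LEFT JOIN sys.dm_exec_requests AS r      -- idle sessions have no request
  ON r.session_id = s.session_id
OUTER APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE s.session_id > 50;                 -- rough filter for user sessions
```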

Thank You Flogging Molly for providing blogging inspiration.


Back in July of 2007 I embarked on a new journey at Publix Supermarkets, a major grocery retailer located in the southeastern United States. A majority of Publix stores are located in Florida, which is where the retailer started and was founded by George Jenkins. Upon entering the doors of Publix I was greeted by many smiling DBA faces. Chuck Futch, Greg Mathis, Larry Corwin, Thomas Shubick, and James Martin were some of the first to welcome me and make me feel at home. Little did I know that this would be my home for 5 years.

During my time there I saw many faces come and go; however, the core part of the team stayed intact. One of the core members, Greg Mathis, was a DBA thru-and-thru. He could whip out a .sql script like there was no tomorrow. James Martin, another DBA, was always able to make an impact and is one of the best data modelers I have run into. Larry Corwin was such a jack of all trades that he was a truck-driver stunt double in one phase of his life; Larry knew so many technologies it made my head explode. Some made lasting impressions, like Bradley Ball, who later moved on to Pragmatic Works. Bradley is an excellent DBA and would often push me to look for solutions that I would not have looked for normally. Thomas Shubick was a friend and a co-worker. Tom is a DBA and has many cool hobbies, such as singing in the Master Chorale of Tampa Bay. At this point the yelling you hear is Tom; I brought him back to Publix 6 months ago and left him behind. He'll get over it (just kidding, Tom). I could call out all the DBAs at Publix by name, as I believe they are a solid core covering just about every database platform you can think of.

During my 5-year tenure at Publix I was involved in many key efforts. One was upgrading many application databases from SQL Server 2000 to more current versions of SQL Server. I was instrumental in the implementation of the scaled-out SSRS environment which allowed Publix to cut over its Crystal Reports to SSRS. I supported and installed a time-scheduling vendor application which was accessed by over 1,000 stores and 50,000+ employees. I implemented, supported, and upgraded many key project databases that drove million-dollar business decisions. And I was a proud member of the SWAT team, which was involved in determining and resolving bugs and/or performance issues. Suffice it to say I was a busy beaver (Bulldog) on a daily basis.

Over the last year or so I felt like something was missing. I could not put my finger on it, and it was bugging me. When out in Seattle I had a moment of clarity, thru the help of others and a mentoring program which will be mentioned in another blog post. A quick thanks to my friend, confidant, and mentor Joseph Sack (@josephsack) from @sqlskills. I began to regain faith that I was a good DBA with a skill set that could take me anywhere. The question was: did I want to stay at Publix or move forward with other DBA opportunities? The changing moment was when I received a call from Jason, a friend and a recruiter in the industry. To be honest I was dodging his call in the beginning, since I did not want to turn him down for another DBA post. He had contacted me in the past, I had said no, and I did not want to burn any bridges with him. After another call from Jason I decided it would be best to do the dirty work and tell him no. Boy, was I a fool for not reaching out to Jason earlier; he had an excellent post which was right up my alley. After a couple of solid interviews I was made an offer by New York Life/AARP.

What pushed me to the point of leaving a stable job at Publix? I can say that it was not an easy decision, but when all the cards were placed on the table it appeared that the best option was to move forward. I can say with confidence that I did not leave Publix for money; it was more for personal reasons. Late last year and early this year I was actively speaking at SQL Saturdays and began speaking at user groups. I must give props at this time to MagicPASS out of Orlando. Due to my rigorous schedule at Publix I fell off the speaking circuit and was not able to give back to the SQL Server community as much as I would like. I was also pulled away from blogging; I had just started and was getting my groove, and that groove was interrupted. And finally, I felt like I was missing time with my wife Missy and son Parker. Upon speaking with my new employer I felt like I was going to have the ability to gain back these things that, in my opinion, I had lost. Just the fact that I was able to put out this blog post is evidence that change is good.

Today was my first day at the new job, and I have to say I was very impressed with the organization. HR was helpful from day 1; heck, New York Life (NYL) has been there for me during the entire hiring process. I was met by Rachelle and Kevin of HR with a firm handshake and a friendly welcome. A couple of other new hires and I were taken down to security for our mug shots and new IDs. I was then brought up to my new team and hiring manager, walked around the floor, and introduced to many different individuals and teams, some of whom I will most likely insult later in the week since I am horrible with names (I am working on this, I promise). I also found everything very efficient; all my hardware and access was ready for me by 11 AM. The longer the day progressed, the more confident I became in my decision to leave one powerhouse (Publix) and head to another (NYL) for new opportunities.

I will close on this. Change is good.

Within many enterprise environments we DBAs support not only I.T. but also our business areas. Many times we find ourselves isolated from the business area, either by separation of duties or by the nature of the business. Over the past week I have had the pleasure of working with one of our business areas due to an overflow error taking place within a client application.

It all started with an email from our internal development team. They were assisting with the overflow error but unable to identify the problem. Initial speculation was a database schema or security issue.
My initial thought was a data type mismatch between the application and the database, based on the error message presented to me. Lucky for me, I was the primary DBA supporting this database in our production environment, so I knew we had not had any schema changes in the past year; however, the application and support team was not willing to accept this. Since the business user was able to get the application to work in staging, I used Redgate Database Compare to validate the production schema against our staging schema. As I expected, the database schema and stored procedures matched 100 percent, and security matched 100 percent as well. I was still unable to convince the development team that the issue was somewhere at the client.

I needed to take another approach to stay sane. I stepped out of the norm and called our Business user directly. There was excitement in her voice that someone had reached out to her to discuss the issue. I explained to her that I was on a fact finding mission and asked for forgiveness up front since I was going to ask a bunch of questions.

Within 15 minutes I was able to determine the following:
1) The user had received a new laptop and the application was reinstalled
2) This issue had been happening for over 6 months
3) The Data was loading correctly, the Business user however had to click thru the error each time a row was loaded
4) The Business user was willing to execute the Application while I was executing profiler
5) Anything we could do would be appreciated

Working with the business user I was able to capture the procedures and SQL being executed by the application. Once complete, I explained to the business user that I would need some time to examine the Profiler trace. There were no reported errors, and it appeared that all the selects and inserts were executing correctly. Looking thru the trace I was able to come up with the following facts:
1) The table at the center of the issue
2) The data being selected/inserted into this table
3) That no user errors were being generated

At this point I had completely ruled out schema differences between staging and prod. My next step was to examine the data located within the table. Interrogating the data, I found 2 columns in staging that contained data where production did not. The business user quickly brought me back to earth when she explained why we have this data in production and not staging. Our next step was to examine a column which contained data pointing to where a particular image resided. A conference call was started, and working hand in hand we began to evaluate the path to the image. One of the business users was familiar with how to open up the code for the application in question. Walking thru it, we quickly determined that one of the fields was set to an int and needed to be a long int. After a quick change to the application: success.
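The failure mode is easy to reproduce. This T-SQL sketch shows the same class of overflow the application field hit (the value is made up, and the client code itself was not T-SQL):

```sql
-- A 32-bit int tops out at 2,147,483,647.
DECLARE @image_id bigint = 3000000000;  -- hypothetical value from the table
SELECT CAST(@image_id AS bigint);       -- fine: fits in 64 bits
-- SELECT CAST(@image_id AS int);       -- fails with "Arithmetic overflow
--                                      -- error converting expression to data type int."
```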

This was a wonderful experience. I felt like a contributing member of the team. The business team I was able to assist brings value to our company and, I can say with certainty, helps generate revenue, which in turn pays my salary. I was able to come home this evening with a sense of pride since I accomplished the following:
1) Worked outside of the box without breaking protocol
2) Gained some insight into a new business area
3) Effectively communicated my thoughts so that they could be understood by our non-technical support teams

Always keep an open mind when trying to resolve an issue. If I had followed strict protocol this issue may have dragged out longer, our Business user would maintain the thought that I.T. is a distant relative and I would have never worked with the good folks I did over the past few weeks.


Day 1

October is one of my favorite times of year for many different reasons. One is the changing of the seasons; the second is our family trip to Cataloochee Valley, North Carolina. Having lived many years of my young life in Londonderry, N.H., visiting this part of the mountain chain floods my memory with many positive events.

To this point this trip has not disappointed.

Friday evening we left the Taylor house at 11:15 PM, right on schedule. The drive up was a peaceful one. About 4 hours into the trip we started noticing the outdoor temperature change; the outside temp was in the low 60s. Mission 1, finding cooler temperatures, accomplished. At 9:30 AM we pulled into Waynesville, N.C. Our hope was to attend the Apple Festival. Man, it was packed, so in our own best interest we decided to skip the festivities. We moved on and decided to hit the Blue Ridge Parkway, stopping at one of our favorites, Waterrock Knob. Or as our 5-year-old says, "Water Knock Rob." About 2 years ago our poodle Callie of 13 years needed to go on to a better place; this was one of her favorite spots to get wind in her ears, and this day would not have disappointed. It was a cool 55 degrees with about a 20-knot wind. Very refreshing. We had a very cold hard salami and Butterkäse cheese sandwich, and then it was time to move on.

Best Poodle Ever

Upon getting in the truck I noticed a light on the dashboard. Missy (wife of 16 years) pulled out the manual. Thank you, Honda. Well, that is interesting: the low pressure light is on. We had experienced this once before when we had the truck in the snow, so we thought nothing of it. I went as far as checking the tire pressure; nothing really out of the normal, and I did not hear any leaking air. We decided to come down off the Blue Ridge and head into Maggie Valley, a city made for our motorcycle friends. By the way, the concept of free air has not left this town. I filled the low tire with air and the light went off. Issue resolved, or at least that is what we thought.

We have been looking for property of our own up here, so we decided to head back to Max Patch and get a lay of the land. About 30 minutes into the trip I noticed that the low tire pressure light was on again. I pulled over into a local parking lot and checked out the tire. This time my worst fear was realized: we had a slow leak in the tire, evidenced by the all-too-familiar sound pssssssssssssssssssss. O.K., this is serious… time to get into town and resolve this issue. Living in the Tampa area, one cannot throw a stone without finding a tire retailer. Not so here in the mountains.

The search for a tire shop should have caused panic. However, I am quite proud of the family. Missy was doing Google searches attempting to find a tire shop. Pretty much everything was either closed or very far away. We remembered seeing a small gas station back in Jonathan Creek which appeared to be open and had a sign that said brake and tire repairs. To the gas station we headed.

Once at the gas station we noticed quite a bit of activity. A couple of the technicians were working on a big ole' RV, fixing a tire; yeah, we were in the right place. I pulled up and spoke with Nate, and upon talking with him I felt like we were in good hands. Nate was able to jack up the truck with all the gear in place. He quickly pulled off the tire and found the gash in the middle of it. Dang you, highway debris. Michelin makes a great tire, but something to keep in mind is that they are difficult to patch; on the third try the patch took for Nate. We were all very happy, including Nate. I went in to pay for the patch, and we were done with that part of the adventure. Before leaving I went to give Nate something for his troubles. He would not accept any gratuity from me and was very gracious about it; even at the small dollar amount he kept saying, "It was too much." It soon became evident that Nate enjoyed what he did. He would not accept our gratuity, and I was close to insulting him. I shook Nate's hand and we were on our way to the cabin.

Jen & Gary's Cabin (our friends' cabin)

http://www.cataloocheemountaincabin.com/

Sitting in the Cabin in front of the fire watching my wife read and my son play, the day’s events gave me hope. It also made me think.

As DBAs or developers, we are often not accessible to each other, especially after late nights or early morning production calls. We are those little gas stations in the middle of nowhere. However, we are the people who often make it happen. Well, a little gas station called Sorrels in a place called Jonathan Creek made it happen. This goes back to the old cliché, "Don't judge a book by its cover." We passed the Sorrels gas station much earlier in the day, and based on how it looked I passed without stopping. This was a huge mistake on my part. Often we pass on things because they are not the most appealing or most glamorous. This latest life lesson with the tire gave me glimpses into the past: some of my most challenging projects and/or tasks have been those that were either not glamorous or appealing. Many of these have brought me to today and have provided me with this incredible time with my family in the mountains.

This event also gave me pause and hope for the state of the United States. In the small town of Jonathan Creek I found a group of hard-working Americans at a gas station named Sorrels, helping a town survive. Swing on by, talk with these folks, and experience what I did that day. This town could have easily given up, but everyone here is surviving and trying to make it to the next set of glory days.

Hey All,

I was the recipient of a very interesting email today. An end user working in SSMS found that the row count shown under a table’s Properties/Storage did not match the result of a COUNT(*) executed against the table. The parties involved were sure that something was wrong with SSMS or with the IDENTITY column on the table.

What??????

Here is what I shared with my end user and shall share with you.

My first job as a DBA is to never trust my developer, correct? So I would never second-guess the information they were providing me. Just to be safe, I opened up SSMS to validate what was being reported.

My 1st step was to examine the reported row count using SSMS properties/storage.

The next step was to examine the row count on the table itself by running a COUNT(*) against the table in question.

The following query returned a row count of 616597:

SELECT COUNT(*) AS [count] FROM dbo.DLVAPPT;

count
-----------
616597

(1 row(s) affected)

Sure enough, when the row count within SSMS was compared to a COUNT(*) for the table, there were differences.

A couple of clues led me to the following conclusion. This database had been restored from a higher lifecycle down to the development environment, and reindexing had not been run against the database in the lower lifecycle. The following query returned different row counts for the clustered and non-clustered indexes on the table.

SELECT id, indid, rowcnt
FROM sysindexes
--WHERE indid IN (1,0)
WHERE OBJECT_NAME(id) = 'DLVAPPT';

id          indid  rowcnt
----------- ------ --------------------
805577908   1      616596
805577908   2      614184
805577908   3      614184
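As a side note, sysindexes is a SQL Server 2000-era compatibility view. On SQL Server 2005 and later, the same stale-versus-actual comparison can be sketched against sys.dm_db_partition_stats, which is the metadata SSMS reads for that Properties/Storage row count. The table name below simply reuses the one from this example.

```sql
-- A sketch of the modern equivalent (SQL Server 2005+):
-- per-index row counts straight from the partition metadata,
-- without scanning the table itself.
SELECT i.index_id,
       i.name     AS index_name,
       ps.row_count
FROM sys.dm_db_partition_stats AS ps
JOIN sys.indexes AS i
  ON i.object_id = ps.object_id
 AND i.index_id  = ps.index_id
WHERE ps.object_id = OBJECT_ID('dbo.DLVAPPT');
```

If the counts here disagree with a COUNT(*) against the table, you are looking at the same stale-metadata symptom described above.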

And the final clue: in production, the SSMS row count and SELECT COUNT(*) matched.

To fix this issue I ran DBCC DBREINDEX against the table.

DBCC DBREINDEX(DLVAPPT)
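Worth noting: DBCC DBREINDEX is deprecated in newer versions of SQL Server. A sketch of the equivalent on SQL Server 2005 and later (again reusing the table name from this example) would be:

```sql
-- Rebuild every index on the table; this also refreshes
-- the stored row-count metadata that SSMS reports.
ALTER INDEX ALL ON dbo.DLVAPPT REBUILD;
```

If a full rebuild is too heavy, DBCC UPDATEUSAGE with the COUNT_ROWS option is a lighter-weight way to correct just the row-count metadata.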

All the early speculation about a bug in SSMS, identity column issues, or DBCC corruption could be put to rest. The next step was to put the end user at ease and prove that the issue had been resolved.

The final step in my process was to pull the information exactly like our end user did. Opening Properties/Storage on the table showed that SELECT COUNT(*) now matched what SSMS was reporting.

SELECT id, indid, rowcnt
FROM sysindexes
--WHERE indid IN (1,0)
WHERE OBJECT_NAME(id) = 'DLVAPPT';

id          indid  rowcnt
----------- ------ --------------------
805577908   1      616597
805577908   2      616597
805577908   3      616597

SSMS can be used as a powerful tool. It does have its issues, but more often than not I find it to be an effective tool for administering SQL Server. Can you imagine if SSMS queried every table for a row count? For our larger databases it would take forever to pull up, not to mention the overhead it would put on the database server. Something else I hope you take away from this post: know your systems as well as you can; it will help you with problem determination in the long run.

Thank you for reading this your friendly

DBABullDog