
Shrink a Live SQL Server Database and Logs - Caveats and Best Practices

Updated on December 20th, 2017.

If you have deleted a lot of data in the database and truncated some tables, you may want to shrink the database files to recover the unused space. Shrinking moves the data pages at the end of a file into unoccupied space available at the front of the file.

You can use the DBCC SHRINKDATABASE command. For example, this command

DBCC SHRINKDATABASE (your_database,10) 

will shrink the database files, leaving 10 percent free space in them. 

You can also shrink the log file alone using the DBCC SHRINKFILE command. Take a full backup of the database and run the following:

ALTER DATABASE your_database SET RECOVERY SIMPLE; 

DBCC SHRINKFILE (your_database_Log, 10); 

ALTER DATABASE your_database SET RECOVERY FULL; 

This shrinks the log file to a target size of 10 MB.

Caveats of shrinking database and logs:

1. SHRINK commands should not be run frequently, as they increase index fragmentation.

2. It is a fully logged operation, which increases the log size and slows down performance.

3. Shrinking is not possible while the database is being backed up or restored.

4. Shrinking is not possible while a columnstore index is enabled on the database.

Best practices

1. Use SHRINK only when a lot of operations such as TRUNCATE and DROP have led to unused space.

2. Take a full backup before shrinking database files. If you shrink the log file, set the recovery model to SIMPLE before shrinking and back to FULL afterwards.

3. Never turn ON the AUTO_SHRINK database option.

4. Always rebuild indexes after shrinking database files, to remove the fragmentation the shrink introduces.
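For example, after a shrink you might rebuild all indexes on the affected tables; a minimal sketch (the table name here is just a placeholder):

```sql
-- Rebuild all indexes on a table to remove the fragmentation
-- introduced by the shrink (dbo.YourTable is a placeholder name)
ALTER INDEX ALL ON dbo.YourTable
REBUILD WITH (FILLFACTOR = 90);  -- leave some free space in each page
```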

SQL Server Default Port

In this article, we are going to explore the default port number SQL Server uses to accept connections from different applications.

What is a port? A port is an endpoint of communication in an operating system. A port is always associated with the IP address of a host and the protocol type of the communication, thereby completing the destination or origination address of a communication session. A port is identified for each address and protocol by a 16-bit number, commonly known as the port number.
When front-end applications want to connect to SQL Server, the port number is essential.

The default port number of SQL Server is 1433.

How to get the port number in SQL Server?

Note - The steps are shown through SQL Server 2012 environment. They are largely applicable to SQL Server 2014 too.

Step 1: Open the SQL Server Configuration Manager window through Start > All Programs > Microsoft SQL Server 2012 > Configuration Tools 

sccm-navigation

Step 2: Explore SQL Native Client 11.0 Configuration (32bit)

sql-native-client-configuration

Step 3: Click on Client Protocols option within it  –

client-protocols

Step 4: Double click on TCP/IP from the right hand side pane

tcp-ip

We will get to know the Default Port as shown here

1433-default-port

The default port, TCP 1433, is the official Internet Assigned Numbers Authority (IANA) socket number for SQL Server. Client systems use TCP 1433 to connect to the database engine; SQL Server Management Studio (SSMS) uses the port to manage SQL Server instances across the network. You can reconfigure SQL Server to listen on a different port, but 1433 is by far the most common configuration.
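If you already have a connection, you can also read the port from T-SQL. The following query against the sys.dm_exec_connections DMV shows the TCP port the current session is using:

```sql
-- Port used by the current session's connection
-- (returns NULL for non-TCP connections, e.g. shared memory)
SELECT local_tcp_port
FROM sys.dm_exec_connections
WHERE session_id = @@SPID;
```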

Some other default SQL Server ports:

TCP 1434 – For Dedicated Admin Connection
UDP 1434 – For SQL Server Named Instance
TCP 2383 – For Default port of SSAS
TCP 2382 – For Named instances of SSAS
TCP 135 – For SSIS & T-SQL Debugger
TCP 80 and 443 – For SSRS
TCP 4022 – For Service Broker

other-default-ports

SQL Server: Create Objects Automatically in New Databases

Suppose whenever a new database is created, you also want to create a stored procedure automatically (by default) in this new database. Let us see the simplest way to do this.

The Model database is a system database which SQL Server uses as a template to create new databases. So if you create any object in the Model database, that object will be created automatically in all newly-created databases. Let us see this with the following examples.

Create this test procedure in model database

use model
GO
create procedure test
as
select 100 as number


Now execute this procedure

EXEC test

It displays the value 100.
Now create another database named test

CREATE database test
GO


Now execute the same procedure in test database

use test
GO
EXEC test


sql-create-objects

As you can see, executing the above statement displays the value 100 since this value was taken from the stored procedure in the model database and is now available with test database too.

Feel free to share other approaches to create objects automatically in a database.

Does my SQL Server Database Support Compression?

Some time back, Praveen Dabade had written a nice article on SQL Server Compressed Tables and Indexes in SQL Server where he explained how compression is now supported at the ROW and PAGE level for tables and indexes. However, did you know that compression is an Enterprise-edition feature?

How do you determine which Enterprise Edition features are enabled in your database? You can use the sys.dm_db_persisted_sku_features DMV to find out.

Learn more about Dynamic Management Views (DMV’s) here

Here’s the query for the same

SELECT feature_name,feature_id
FROM
sys.dm_db_persisted_sku_features;


Running this query will list all edition-specific features that are enabled in the current database. Some of the database-changing features restricted to the SQL Server Enterprise or Developer editions are Compression, Partitioning, Change Data Capture etc.

This DMV is also useful in situations where you are planning to move a database from a higher to a lower edition. Eg: From Enterprise to Standard edition. Running the query will tell you if there are any Enterprise Edition features enabled that may not work when you move to a lower edition.

The DMV will return no rows if no features restricted to a particular edition are used by the database.
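Before an edition downgrade, you might wrap the check in a simple guard; a minimal sketch:

```sql
-- Warn if any edition-restricted features are in use in the current database
IF EXISTS (SELECT 1 FROM sys.dm_db_persisted_sku_features)
    SELECT feature_name AS [Blocking Feature]
    FROM sys.dm_db_persisted_sku_features;
ELSE
    SELECT 'No edition-specific features in use' AS [Result];
```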

Get SQL Server Hardware Information using sys.dm_os_sys_info

If you do not have physical access to your Database but want to get the hardware information of your SQL Server, use this query:

SELECT cpu_count AS [Logical CPUs], 
hyperthread_ratio AS [Logical vs Physical cores Ratio], 
physical_memory_kb/1024 AS [Physical Memory in MB], 
committed_kb/1024 AS [Committed Memory in MB],
committed_target_kb/1024 AS [Committed Target Memory in MB],
max_workers_count AS [Max Workers Count], 
sqlserver_start_time AS [SQL Server Start Time], 
virtual_machine_type_desc AS [Virtual Machine]
FROM sys.dm_os_sys_info WITH (NOLOCK) OPTION (RECOMPILE);

The sys.dm_os_sys_info DMV has been available since SQL Server 2005 and contains information about server resources. For example, you can find out the following: 
  • How many CPUs are there in the server 
  • Total logical processors 
  • Amount of physical memory available (physical_memory_kb - SQL Server 2012 onwards) 
  • Amount of Virtual memory available (virtual_memory_kb - SQL Server 2012 onwards) 
  • Last Date and time SQL Server service was started 
  • Virtual Machine Type description (not reliable) (virtual_machine_type_desc – SQL 2008 R2 onwards)
The cpu_count column returns the number of logical CPUs, and hyperthread_ratio returns the ratio between physical and logical CPUs. I find hyperthread_ratio confusing, as it does not tell whether these are actual hyper-threaded cores or physical cores, so I can never work out exactly which processor configuration the server is using. The physical_memory_kb column tells me the amount of RAM in my server, and committed_kb how much of it is committed. 

The sqlserver_start_time column is useful to find out the date and time since which SQL Server has been running.

When I run the query on my machine, this is what I get:

system-info-sqlserver

List Empty Tables in SQL Server

I was recently doing a clean-up of my website database. I remembered creating some tables in the database but never adding any rows to them.

Here’s a simple query to list all empty tables in your SQL Server database that uses a Dynamic Management View called dm_db_partition_stats which returns page and row-count information for every partition in the current database.

;WITH EmptyRows AS
(
   SELECT SUM(row_count) AS [TotalRows],
          OBJECT_NAME(OBJECT_ID) AS TableName
   FROM sys.dm_db_partition_stats
   WHERE index_id = 0 OR index_id = 1
   GROUP BY OBJECT_ID
)
SELECT * FROM EmptyRows
WHERE [TotalRows] = 0


OUTPUT
  
image

Note that the results from the view are only approximate. If there are any active transactions inserting or deleting rows, the count may not include them.

Activity Monitor in SQL Server

Activity Monitor is a handy tool in SQL Server to quickly see performance hot spots in the server, network and database activity. One of the most commonly performed tasks in Activity Monitor is to kill a stubborn connection that will not release its resources. In SQL Server 2008, this tool has undergone a major facelift from the 2005 version and helps in tracking your SQL server performance even better than before.

To view Activity Monitor in SQL Server 2008, right-click the instance name and choose Activity Monitor or click the Activity Monitor icon on the standard toolbar.

image

In SQL Server 2005, Activity Monitor can be viewed in SSMS by connecting to the server with Object Explorer, expanding ‘Management’, and then double-click ‘Activity Monitor’.

Once opened, the Activity Monitor Dashboard View contains four graphs which can help identify any abnormal activity in the database. The graphs include % Processor Time (SQL Server), Waiting Tasks, Database I/O and Batch Requests. The default graph refresh rate is 10 seconds, but you can change that easily by right-clicking any of the four graphs and selecting the appropriate refresh interval.

This dashboard is very helpful for getting a quick performance snapshot without needing another monitoring tool for the same purpose.

SQL Activity Monitor

This dashboard also contains expandable/collapsible panes to view detailed information about Processes, Resources, I/O and Expensive Queries. Just click on the expand button to the right of each pane to view the detailed information. Here’s a quick overview of the different graphs and what they show.
Processor Time - The percentage of elapsed time that the processor spends to execute non-idle threads for the instance across all CPUs. The Processes pane gives information on what processes are running, what resources are being utilized etc. You can even right-click a process and choose Trace Process in SQL Server Profiler. Presto!
 
Waiting Tasks - The number of tasks that are waiting for processor, I/O, or memory resources. The Resource Waits pane details the processes waiting for other resources on the server.  It shows the latest information from several DMVs like the sys.dm_os_wait_stats
 
Database I/O – Information on data and log files for system and user databases. Provides the transfer rate in MB/Sec, of data from memory to disk, disk to memory, or disk to disk. The pane contains information to quickly detect a contention in disk I/O.
 
Batch Requests/sec - The number of SQL Server batches that are received by the instance. The Expensive Query pane lets you find and tune expensive queries. You can even right-click any query and view its graphical execution plan. Also see Find the Most Time Consuming Code in your SQL Server Database
Note: To view the Activity Monitor in SQL Server, a user must have VIEW SERVER STATE permission. Also make sure you close the Activity Monitor tool in SSMS when it is not required.
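If you prefer T-SQL over the Activity Monitor UI, the wait information shown in the Resource Waits pane can be approximated directly from the DMV it draws on; a rough sketch (the excluded wait types are just common idle waits, not an exhaustive list):

```sql
-- Top waits on the instance, excluding some common idle/background waits
SELECT TOP (10)
       wait_type,
       wait_time_ms,
       waiting_tasks_count
FROM sys.dm_os_wait_stats
WHERE wait_type NOT IN ('SLEEP_TASK', 'LAZYWRITER_SLEEP', 'BROKER_TASK_STOP')
ORDER BY wait_time_ms DESC;
```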

Create Database in SQL Server With Different Configuration Options

In this article, we will see how to create a database in SQL Server. We will also explore various options while creating a database.

Creating a SQL Server database can be achieved in two different ways, as explained below -

1. You can make use of SQL Server Management Studio (SSMS) and Create Database wizard to create the database.

2. You can create the database programmatically by writing scripts.

If you are using the database wizard for creating the database, you will have limited options while creating your database. Most developers and administrators prefer to create the databases programmatically by writing scripts. These scripts can then be easily deployed on other SQL Server instances where you want to create the same database with the same configurations.

In this article, we will explore some configuration options as described below -
  1. Create database using database wizard
  2. Create database using script
  3. Create database with files and file groups
  4. Configuring File Stream during database creation
  5. Configuring recovery model during database creation
  6. Creating database with Collation option

Create database using database wizard

To start with, we will opt for SQL Server Management Studio [SSMS]. Once you open SSMS, in an Object Explorer window, we will right click the Databases folder and create a new database as shown below -

create-db-wiz

This step will show you a New Database window. Fill the required details as shown below -

newdb-window

You can give a database name and set the owner of the database. Also look at the database files. Check out the initial size of the database. In my case, it is 5 MB. The question is where is this size coming from?

Creating simple database using script

In SQL Server, the "Model" system database acts as a template for all databases: whatever settings Model has are applied to your new database. For example, let's add a simple table to our Model database and then create a PurchaseOrderDatabase. The table script is as shown below -

model-table

Note: All these scripts shown in this article can be downloaded from here

After creation of the table, create a database as shown in below script -

crdbscrpt
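The create script in the screenshot is not reproduced here; since Model supplies every default, a minimal version of it might be as simple as:

```sql
-- Create the database; all settings are inherited from the Model database
CREATE DATABASE PurchaseOrderDatabase;
GO
```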

After creating the database, let's see how to select all the available databases under a given SQL Server instance. You can find the same under a view available under master database - sys.databases. The output of the sys.databases is as shown below -

sel-databases

You can now expand the PurchaseOrderDatabase using Object Explorer and check the Tables folder. You will find the "TestTable" that was created in the Model database.

obexp1

This table got copied into our PurchaseOrderDatabase, as Model works as a template for the custom databases. You can also make use of 'sp_helpdb' stored procedure as shown below -

sphelp

The result of the sp_helpdb stored procedure is shown above. It displays information about the database as well as the files created for it, for both PurchaseOrderDatabase and the Model database. So that’s about the Model database!

Create database with files and file groups

While writing the Create Database script in our query, we did not mention any other options. We will explore some options that can be used while creating the database. Let's first drop the PurchaseOrderDatabase so that we can create the same database with different options.

We will create the database using data files and file group options. Let's write the script as shown below to create a database PurchaseOrderDatabase -

db1-script

When you create a database, you have to provide at least two files - a primary data file and a transaction log file. You also have to specify at least one file group. In the above script, we have specified the size of the primary data file, the maximum size of the database and the file growth in megabytes. You can specify the size of a file in KB, MB, GB or TB. The default is MB.

You can also create the database using multiple data files with the extension .ndf which are also known as secondary data files as shown below -

db2-srpt
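The scripts in the screenshots above are not reproduced here; a sketch of a create statement with an explicit primary file, a secondary (.ndf) file in its own file group, and a log file (all paths and sizes are assumptions) might look like:

```sql
CREATE DATABASE PurchaseOrderDatabase
ON PRIMARY
(   NAME = PurchaseOrder_Data,               -- primary data file (.mdf)
    FILENAME = 'C:\SQLData\PurchaseOrder_Data.mdf',
    SIZE = 10MB, MAXSIZE = 500MB, FILEGROWTH = 5MB ),
FILEGROUP Secondary
(   NAME = PurchaseOrder_Data2,              -- secondary data file (.ndf)
    FILENAME = 'C:\SQLData\PurchaseOrder_Data2.ndf',
    SIZE = 10MB, MAXSIZE = 500MB, FILEGROWTH = 5MB )
LOG ON
(   NAME = PurchaseOrder_Log,                -- transaction log file (.ldf)
    FILENAME = 'C:\SQLData\PurchaseOrder_Log.ldf',
    SIZE = 5MB, MAXSIZE = 100MB, FILEGROWTH = 5MB );
```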

Configuring File Stream during database creation

Now let's create a database with FILESTREAM. FILESTREAM is used to store unstructured data, like documents and images, on the NTFS file system with the help of the VARBINARY(MAX) type. By default, FILESTREAM is not enabled on SQL Server. To enable the FILESTREAM access level, run the following command -

enable-filestrm-acclevl
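The command in the screenshot is not shown here; enabling the FILESTREAM access level at the instance is typically done through sp_configure, roughly as follows:

```sql
-- 2 = FILESTREAM enabled for Transact-SQL and Win32 streaming access
EXEC sp_configure 'filestream access level', 2;
RECONFIGURE;
```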

After enabling the filestream access level, let's create a database with the file stream option as shown below -

file-streamdb

To test the file stream database, we will create a table with File Stream and will try to insert some data in the same, as shown below -

insertfs

While creating the database, you can also specify the database recovery model. There are three recovery model options -

1. Simple - In this recovery model, there is no log backup. Also you cannot use Log shipping or point-in-time restores.
2. Full - In this recovery model, all backup types are permitted. Point-in-time recovery is possible.
3. Bulk Logged - In this recovery model, all backup types are permitted. Point-in-time recovery is not supported; recovery to the end of any backup is possible.

Configuring recovery model during database creation

We will now create a database with recovery options. But before that, we will see what is the default recovery option set on our database. For this, we will again make use of sys.databases system view as shown below -

existing-recmodel

The output is as shown below -

recoutput1

Below is a script for database creation with the recovery options -

recquery1
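The script in the screenshot is not reproduced here; a sketch of creating a database and then setting its recovery model (the database name is reused from the article) could be:

```sql
CREATE DATABASE PurchaseOrderDatabase;
GO
-- Switch the new database to the SIMPLE recovery model
ALTER DATABASE PurchaseOrderDatabase SET RECOVERY SIMPLE;
GO
-- Confirm the setting
SELECT name, recovery_model_desc
FROM sys.databases
WHERE name = 'PurchaseOrderDatabase';
```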

Creating database with Collation option

Finally, we have the last option to explore, i.e. creating a database with a collation option. SQL Server supports a number of collations.

Many cultures around the world use different character sets. Simply put, a collation encodes the rules that govern the proper use of characters for an alphabet or language.

Let's create a database with our choice of collation. Before that, we will see the default collation set on our databases, as well as all the available collations, with the help of sys.fn_helpcollations(). The queries are shown below -

collation-query
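The queries in the screenshot are along these lines; checking current collations and listing the available ones can be done as follows:

```sql
-- Default collation of each database on the instance
SELECT name, collation_name FROM sys.databases;

-- All collations supported by this SQL Server instance
SELECT name, description FROM sys.fn_helpcollations();
```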

The query for creating a database with the different collations is as shown below -

collation-dbscript
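The screenshot's script is not reproduced here; a sketch of creating a database with an explicit collation (the collation name below is just an example) could be:

```sql
CREATE DATABASE PurchaseOrderDatabase
COLLATE SQL_Latin1_General_CP1_CI_AS;
```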

And that’s it!!

Conclusion

In this article, we have seen how to create a database in SQL Server using a wizard and by writing scripts using SQL Server Management Studio [SSMS] with various options like files and file groups, with file stream option, with recovery model and with collation option.

The scripts shown in this article can be downloaded from here

Measure Transactions Per Second in SQL Server

Performance Monitor tool in SQL Server represents aspects of system performance, such as CPU Usage % , Memory Paging etc. The sys.dm_os_performance_counters Dynamic Management View exposes data of all performance counters for the particular instance of SQL Server. This view is extremely useful when you have to present the performance data of your database in your dashboard.

We touched upon the usefulness of sys.dm_os_performance_counters in my previous post Removing Deprecated Code and Future Proofing your Queries. In this post, we will see another use of the sys.dm_os_performance_counters view: measuring SQL transactions per second for a database. Here’s a very handy query written by Joe Stefanelli:

DECLARE @ctr bigint
SELECT @ctr = cntr_value
    FROM sys.dm_os_performance_counters
    WHERE counter_name = 'Transactions/sec'
        AND object_name = 'SQLServer:Databases'
        AND instance_name = 'SomeDBName'
WAITFOR DELAY '00:00:01'
SELECT cntr_value - @ctr AS [Transactions per second]
    FROM sys.dm_os_performance_counters
    WHERE counter_name = 'Transactions/sec'
        AND object_name = 'SQLServer:Databases'
        AND instance_name = 'SomeDBName'

As you can see, we're taking the difference between two readings of the counter, taken one second apart, to get the number of transactions per second. In future posts, we will see some more queries using this useful DMV.

Finding Last Backup Date in SQL Server 2005

I saw a question on the Stack Overflow forums asking for a simple way to find the last backup date of a SQL Server 2005 database. Here's one approach.

SQL Server Last Backup

During a backup operation, the following tables are updated in SQL Server 2005:
  • msdb.dbo.backupfile,
  • msdb.dbo.backupmediaset ,
  • msdb.dbo.backupmediafamily and
  • msdb.dbo.backupset.
Here we are doing a join between sys.sysdatabases and msdb.dbo.backupset, which gets us the database name, the last backup date and who took the backup. sys.sysdatabases is queried as it contains one row for each database in an instance of Microsoft SQL Server.
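The exact query from the screenshot is not reproduced here; a sketch of the join just described (msdb.dbo.backupset also has a user_name column recording who took each backup) might be:

```sql
-- Last backup date per database; NULL means no backup recorded in msdb
SELECT  d.name                    AS [Database],
        MAX(b.backup_finish_date) AS [Last Backup Date]
FROM    sys.sysdatabases AS d
LEFT JOIN msdb.dbo.backupset AS b
        ON b.database_name = d.name
GROUP BY d.name
ORDER BY d.name;
```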

OUTPUT

SQL Server BackUp

SQL Server Management Objects 2008 (SMO) New Features

MSDN defines SMO as - SQL Server Management Objects (SMO) is a collection of objects that are designed for programming all aspects of managing Microsoft SQL Server. In this article we will practically explore some features of SQL Server Management Objects.

To start with, let’s create a ‘Windows Application’ using Visual Studio and name the application as ‘SMOExamples’ as shown below –

SQL Server SMO app

Now let’s add the following references to our project.
  1. Microsoft.SqlServer.ConnectionInfo.dll
  2. Microsoft.SqlServer.Smo.dll
  3. Microsoft.SqlServer.SmoExtended.dll
  4. Microsoft.SqlServer.SqlEnum.dll
  5. Microsoft.SqlServer.Management.Sdk.Sfc.dll

All these references can be found in the path – ‘C:\Program Files\Microsoft SQL Server\100\SDK\Assemblies’.

Now let’s design our windows form as shown below –

SMO Examples

Declare an object which will make a connection to our SQL Server instance. For this, we will use a class called ‘Server’ as shown below –

Server srv = new Server(Environment.MachineName);

Now to test the connection to the SQL Server, let’s write the following code on the Form Load event as shown below –

SMO test connection

Now let’s create a Database and Table with the name ‘SMOExample’ and ‘SMOTable’ respectively by writing the following code on the click event of ‘Create Database And Table’ button –

SMO Create Database

For creating a database, we have used a class called ‘Database’ which takes two parameters in the constructor. First one is the SQL Server instance in which we have to create a database. Second is the database name.

Now let’s create a script for all the tables available in our database. Write bee following code on the click event of ‘Generate Script’ button –

SQL Server SMO Generate Tables

For generating the script files we are using a ‘Scripter’ class.

Now let’s write the code to take a backup of the complete database. Write the following code on the click event of ‘Backup Database’ button –

SQL Server SMO BackUp Database

To take a backup of the database, we are using a ‘Backup’ class and to decide what will be the device of backup, we are using ‘BackupDeviceItem’ class.

Now the last thing we will explore is how to verify the backup taken, before restoring it to the same or another server. Write the following code on the click event of the ‘Verify Backup’ button –

SQL Server SMO Verify Database

Now run your project. This will show you the date and time created for ‘AdventureWorks’ database as shown below –

clip_image001[6]

Now let’s click the button ‘Create Database and Table’ and observe the results in SQL Server Management
 Studio Object Explorer –

clip_image002[10]

Now click on the button ‘Generate Script’. Go to ‘C:\’ and verify the script file –

clip_image004

Finally click the button ‘Backup Database’ and check your ‘C:\’ drive. Your backup file should have been created. Similarly click on ‘Verify Backup’ button and it will show you a success message.

Summary – In this article we saw a few capabilities of SQL Server Management Objects (SMO).

Download the source code

Computed Columns - Persisted vs Non-persisted

Computed columns are derived columns based on other existing columns in the same table. Refer to this post http://www.sqlservercurry.com/2012/11/computed-columns-in-sql-server.html for more information.

There are two types of computed columns namely persisted and non-persisted.

There are some major differences between these two

1. Non-persisted columns are calculated on the fly (i.e. when the SELECT query is executed), whereas persisted columns are calculated as soon as data is stored in the table.

2. Non-persisted columns do not consume any space, as they are calculated only when you SELECT the column. Persisted columns consume space for the data.

3. When you SELECT data from these columns, non-persisted columns are slower than persisted columns.

Consider the following set of code

sql-persisted-code

create table #t1(col1 int, col2 as col1*0.20)
insert into #t1 (col1)
select top 5000 row_number() over (order by s1.name) from sys.objects as s1 cross join sys.objects as s2


create table #t2(col1 int, col2 as col1*0.20 persisted)
insert into #t2 (col1)
select top 5000 row_number() over (order by s1.name) from sys.objects as s1 cross join sys.objects as s2


Run the following code to see that the table with the persisted computed column consumes more space than the table with the non-persisted computed column. Refer to the column named 'data'.

exec tempdb..sp_spaceused '#t1'
exec tempdb..sp_spaceused '#t2'


sql-persisted-nonpresisted

SQL Server SESSIONPROPERTY

There are various settings that can be configured at the session level. For example, you can set ANSI_NULLS OFF for one session and ON for another. If you want to know whether a setting is ON or OFF before relying on it, you can make use of the system function SESSIONPROPERTY.

SESSIONPROPERTY returns 1 or 0 depending on whether the SET option is ON or OFF.

Consider the following example

SELECT SESSIONPROPERTY('ANSI_NULLS')

If the above statement returns 1, then ANSI_NULLS is ON in the session; otherwise it is OFF.
Similarly, the same function can be used to check the SET option for the various other settings listed below:

QUOTED_IDENTIFIER
ARITHABORT
ANSI_NULL_DFLT_ON
ANSI_WARNINGS
ANSI_PADDING
CONCAT_NULL_YIELDS_NULL


Eg:

SELECT SESSIONPROPERTY('ANSI_WARNINGS')

image

If the above returns 1, then ANSI_WARNINGS is ON in the session, otherwise it is OFF.

So when you are working in SQL Server, you can check the status of these settings using the SESSIONPROPERTY function
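For example, toggling a setting within the session changes the value the function returns; a quick sketch:

```sql
SET ANSI_NULLS OFF;
SELECT SESSIONPROPERTY('ANSI_NULLS');  -- returns 0

SET ANSI_NULLS ON;
SELECT SESSIONPROPERTY('ANSI_NULLS');  -- returns 1
```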

Drop All Tables in Database whose name begins with tmp

Sometimes in your projects, you may be using staging tables for purposes like loading, parsing and cleaning data. Suppose you name all of these tables with the prefix tmp_, and after they have served their purpose, you want to drop them all.

Here are two ways to do this:

1. Execute DROP TABLE statements - Copy the result of this query, which generates DROP TABLE statements, and execute that result. Eg:

select 'drop table '+table_name from INFORMATION_SCHEMA.TABLES where table_name like 'tmp_%'

The above query generates drop table statements like

drop table tmp_test
drop table tmp_emp
drop table tmp_data


provided there are tables named tmp_test, tmp_emp and tmp_data in the database.

2. Concatenate the DROP TABLE statements and execute them as a whole

declare @sql varchar(8000)
set @sql=''
select @sql=@sql+' drop table '+table_name from INFORMATION_SCHEMA.TABLES where table_name like 'tmp_%'
exec(@sql)


The variable @sql will have drop table statements concatenated for each table. It will have the string drop table tmp_test drop table tmp_emp drop table tmp_data

When you execute the string generated from that variable, all those tables get dropped
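If any of the staging tables live in a non-default schema or have unusual names, a slightly more defensive variant built on sys.tables and QUOTENAME (same idea, just safer quoting) might be:

```sql
DECLARE @sql nvarchar(max) = N'';
SELECT @sql = @sql + N'DROP TABLE '
            + QUOTENAME(SCHEMA_NAME(schema_id)) + N'.'
            + QUOTENAME(name) + N'; '
FROM sys.tables
WHERE name LIKE 'tmp\_%' ESCAPE '\';  -- escape the underscore: literal tmp_ prefix
EXEC (@sql);
```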

The database 'AdventureWorks2012' cannot be opened because it is version 705

While attaching an .mdf file of the Adventure Works 2012 database, I got the following error:

The database 'AdventureWorks2012' cannot be opened because it is version 705. This server supports version 655 and earlier. A downgrade path is not supported.
Could not open new database 'AdventureWorks2012'. CREATE DATABASE is aborted. (.Net SqlClient Data Provider)


database-failed

It turned out to be a rather silly mistake: I had copied the .mdf for AdventureWorks 2012 into a SQL Server 2008 Express folder. Here’s the setup screenshot from my machine

sql-install-location

Instead of copying the .mdf to MSSQL11.SQLEXPR12 (the folder set up for SQL Server 2012 Express), I had copied it to the MSSQL10 folder.

The fix was simple: I copied the file to the correct folder, ran SQL Server Management Studio as administrator, and bingo! The error went away.

SQL Server 2008 R2 Service Pack 2

Microsoft has recently released Service Pack 2 for SQL Server 2008 R2. This release includes product improvements based on requests from the community and hotfix solutions provided in SQL Server 2008 R2 SP1 Cumulative Updates 1 to 5. Here are the download links:

Microsoft SQL Server 2008 R2 Service Pack 2 - contains Cumulative Updates 1 to 5 from SQL Server 2008 R2 SP1 and upgrades the following SQL Server 2008 R2 editions: Parallel Computing Edition, Datacenter Edition, Enterprise Edition and Developer Edition, Standard Edition, Web Edition, Workgroup Edition and Express Edition

Microsoft SQL Server 2008 R2 SP2 - Express Edition – contains SQL Server 2008 R2 Express with SP2 in a ‘single install’ package

Microsoft SQL Server 2008 R2 SP2 Feature Pack - a collection of stand-alone packages which provide additional value for SQL Server 2008 R2 SP2

Oracle Vs SQL Server – Live Discussion

Data is the lifeblood of a company! When it comes to choosing a database to store your data, performance is one of the key priorities for a DBA. While a SQL Server DBA uses a clustered index as one of the mechanisms to improve performance while retrieving data, an Oracle DBA uses heap tables. But which one is better?

In a live discussion on June 7th, two Database heavyweights Jonathan Lewis and Grant Fritchey will debate the pros and cons of Oracle Heap Tables and SQL Clustered Indexes.  Be prepared for a lively exchange which will not only entertain, but will teach you key concepts on Oracle and SQL Server.

image

RedGate has made this webinar available completely free of charge, but please note that places are strictly limited and offered on a first come, first served basis. So go ahead and register now.

Common Myths Around SQL Server Best Practices

Idera has a bunch of Free SQL Server Webcasts that you can watch and learn from. One that particularly caught my attention was the webcast about SQL Server Myths Debunked…Or Are They? 

Numerous SQL Server experts have done a great job of debunking many of the common myths that crop up around SQL Server best practices and usage. However in some cases, the debunking of those myths creates unintended consequences – we ignore the underlying truth which started the myth in the first place.

SQL Server expert Michael K. Campbell does a survey of misperceptions that have arisen from commonly debunked SQL Server myths. This free webcast shows you how shrinking and using multiple files can be a best practice, what AWE and locking files in memory really means, along with other best practices that you may be missing out on because of misconceptions.

In addition to this webcast, I recommend the following webcasts:
  • What Are You Waiting For?
  • Putting a Better SQL Server in Production
  • Under The Hood with SQL Server Fundamentals
  • Learning to Hate the SMO: A PowerShell Love Story
  • TempDB: Performance and Manageability
  • 5 Common High-Availability Mistakes
  • Common SQL Server Security Mistakes
  • Top 10 SQL Server Backup Mistakes and How to Avoid Them

Find out Long Running Queries using sp_who2 with Dynamic Management Views (DMV)

There are many ways to find out why your SQL Server is performing slowly. Suppose you know that your server responds slowly while executing queries and you want to find the query that takes a long time to execute. From SQL Server 2005 onwards, you can use dynamic management views to find slow queries. However, you can still use the old system stored procedure sp_who2 along with the management views to find this information.

Run the following code

EXEC sp_who2

and see the result shown below. Look for rows with a RUNNABLE status and the highest CPUTime.

long_run1

sp_who2 will not display the text of the query, so we need to find it using the dm_exec_requests and dm_exec_sql_text management views. The view dm_exec_requests gives us the sql_handle of the running query, which dm_exec_sql_text turns into the query text. All we need to do is take the SPID from the result of sp_who2 and filter on it in dm_exec_requests using the session_id column. The following code displays the text of the query for session 52.

SELECT
        sql.text AS statement_text
FROM
        sys.dm_exec_requests  AS req
CROSS APPLY sys.dm_exec_sql_text(req.sql_handle) as sql
WHERE
    req.session_id=52

Similarly you can also use dynamic management views to find out various root causes for performance issues.
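Instead of eyeballing sp_who2, you can also pull the currently running requests ordered by elapsed time straight from the DMVs; a rough sketch combining the two views used above:

```sql
-- Currently executing requests, longest running first
SELECT  req.session_id,
        req.status,
        req.cpu_time,
        req.total_elapsed_time,
        sql.text AS statement_text
FROM    sys.dm_exec_requests AS req
CROSS APPLY sys.dm_exec_sql_text(req.sql_handle) AS sql
ORDER BY req.total_elapsed_time DESC;
```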

Using sys.dm_os_performance_counters to measure Transactions Per Second

Performance Monitor tool in SQL Server represents aspects of system performance, such as CPU Usage % , Memory Paging etc. The sys.dm_os_performance_counters Dynamic Management View exposes data of all performance counters for the particular instance of SQL Server. This view is extremely useful when you have to present the performance data of your database in your dashboard.

Note: This article has been updated and can be found here http://www.sqlservercurry.com/2013/09/measure-transactions-per-second-in-sql.html