Oct 15

SQL Server 2008 Interesting Features

Extracted from an article on DatabaseJournal.com.

Theoretically, SQL Server 2008 will be released in Q1 of 2008, along with new versions of Visual Studio and Windows.

Data Compression

According to Microsoft, compression slightly increases CPU usage, but overall system performance improves thanks to reduced I/O and higher buffer-hit rates. Data and indexes can both be compressed, and compression is transparent to other engine features.
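
As a rough sketch of what turning it on looks like (the table and index names here are hypothetical; PAGE is the heavier compression setting, ROW the lighter):

-- Rebuild an existing table with page-level compression (hypothetical dbo.Sales)
ALTER TABLE dbo.Sales REBUILD WITH (DATA_COMPRESSION = PAGE);

-- Indexes are compressed separately, and can use a different setting
ALTER INDEX IX_Sales_OrderDate ON dbo.Sales REBUILD WITH (DATA_COMPRESSION = ROW);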

Performance Data

More perf counters have been added, and the data they collect is stored in a centralized data warehouse. Microsoft states that running the default set of performance-related monitors will consume less than 5% of CPU and memory resources.

Performance Dashboard: a tool that reads saved performance data. Historical and baseline comparisons can be made and used to create action triggers. For example, if memory use exceeds a threshold for more than five minutes, a more detailed data collection can be triggered automatically.

Performance Studio: a collection of performance tools. Together they can be used for monitoring, troubleshooting, tuning and reporting. The Data Collector component of the studio is configurable and low overhead. It supports several collecting methods, including queries, traces, and perf counters. Data can also be collected programmatically. Once data is collected, there are drill-down and aggregate reporting options, including suggestions for potential performance tuning.
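
The Data Collector itself is configured rather than hand-coded, but as a rough illustration of the queries-and-perf-counters idea, the engine's own counters can already be read with plain T-SQL (this DMV exists in SQL 2005 as well):

-- Snapshot a couple of the engine's performance counters
SELECT object_name, counter_name, instance_name, cntr_value
FROM sys.dm_os_performance_counters
WHERE counter_name IN ('Page life expectancy', 'User Connections');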

Installation

The installer will be able to retrieve updates from Microsoft and install SQL Server, service packs, and patches in a single step. It will also be possible to uninstall service packs.

Data Synchronizing Features

SQL 2008, Visual Studio, and ADO.NET together bring new ways of building disconnected applications that synchronize with a central database. SQL 2005 only supported change tracking via triggers; in SQL 2008, synchronization support is better integrated and optimized.
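
For reference, a rough sketch of the built-in change tracking syntax, using hypothetical MyAppDb and dbo.Customers names; tracking is switched on per database and per table, and a sync pass then asks only for rows changed since the last version it saw:

-- Enable change tracking for the database and for one table
ALTER DATABASE MyAppDb
SET CHANGE_TRACKING = ON (CHANGE_RETENTION = 2 DAYS, AUTO_CLEANUP = ON);

ALTER TABLE dbo.Customers
ENABLE CHANGE_TRACKING WITH (TRACK_COLUMNS_UPDATED = ON);

-- A sync pass: fetch only rows changed since the version seen at the previous sync
DECLARE @last_sync BIGINT = 0;
SELECT ct.CustomerID, ct.SYS_CHANGE_OPERATION, ct.SYS_CHANGE_VERSION
FROM CHANGETABLE(CHANGES dbo.Customers, @last_sync) AS ct;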

Dates and Times

New data types in SQL 2008.

  • Date: date only, no time.
  • Time: time only, no date.
  • Datetimeoffset: a date and time that carries its offset from UTC, so the value is time-zone aware (see the sketch below).
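
A quick sketch of the new types in action (SQL 2008 also lets you initialize variables in the DECLARE):

DECLARE @d DATE = '2008-02-29';                               -- date only
DECLARE @t TIME(3) = '17:45:00.123';                          -- time only, 3 fractional digits
DECLARE @dto DATETIMEOFFSET = '2008-02-29 17:45:00 -07:00';   -- value plus its UTC offset

SELECT @d AS justDate, @t AS justTime,
       SWITCHOFFSET(@dto, '+00:00') AS sameInstantInUtc;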

Table-Valued Parameters

It’s sort of a hack, but you create a user-defined type that is a table and pass variables of that type into sprocs as read-only parameters.
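
A minimal sketch of the pattern with hypothetical names, borrowing the InventoryMaster table from the MERGE example below; the parameter has to be declared READONLY:

-- The reusable table type
CREATE TYPE dbo.InventoryIdList AS TABLE (InventoryID INT PRIMARY KEY);
GO

-- A sproc that accepts a variable of that type
CREATE PROC dbo.GetInventoryDetails @ids dbo.InventoryIdList READONLY AS
SELECT im.InventoryID, im.Descr
FROM InventoryMaster im
INNER JOIN @ids ids ON ids.InventoryID = im.InventoryID;
GO

-- The caller fills a table variable of the type and passes it in
DECLARE @wanted dbo.InventoryIdList;
INSERT INTO @wanted VALUES (1), (2), (5);   -- multi-row VALUES is also new in 2008
EXEC dbo.GetInventoryDetails @ids = @wanted;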

Merge

A new T-SQL command, MERGE, allows a single statement to UPDATE, INSERT, or DELETE a row depending on whether it matches a join condition. The example below uses MERGE to fold a new list of inventory item descriptions into the existing InventoryMaster. Besides updated descriptions, the NewInventory table also contains some new parts. Without MERGE, two statements would need to run: the first would look for a match and update the description, and the second would look for non-matches and INSERT them. With MERGE, one statement performs both tasks, as shown below.

MERGE InventoryMaster AS im
USING (SELECT InventoryID, Descr FROM NewInventory) AS src
ON im.InventoryID = src.InventoryID
WHEN MATCHED THEN
UPDATE SET im.Descr = src.Descr
WHEN NOT MATCHED THEN
INSERT (InventoryID, Descr) VALUES (src.InventoryID, src.Descr);
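
MERGE can also handle the DELETE case mentioned above in the same statement. A hedged variation of the example, assuming parts that have vanished from NewInventory really should be removed from the master list:

MERGE InventoryMaster AS im
USING (SELECT InventoryID, Descr FROM NewInventory) AS src
ON im.InventoryID = src.InventoryID
WHEN MATCHED THEN
UPDATE SET im.Descr = src.Descr
WHEN NOT MATCHED THEN
INSERT (InventoryID, Descr) VALUES (src.InventoryID, src.Descr)
WHEN NOT MATCHED BY SOURCE THEN
DELETE;   -- removes master rows that no longer appear in the source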

Oct 15

HTTP Server Status Codes

RFC 2616

Informational 1xx

  • 100 Continue
  • 101 Switching Protocols

Successful 2xx

  • 200 OK
  • 201 Created
  • 202 Accepted
  • 203 Non-Authoritative Information
  • 204 No Content
  • 205 Reset Content
  • 206 Partial Content

Redirection 3xx

  • 300 Multiple Choices
  • 301 Moved Permanently
  • 302 Found
  • 303 See Other
  • 304 Not Modified
  • 305 Use Proxy
  • 306 (Unused)
  • 307 Temporary Redirect

Client Error 4xx

  • 400 Bad Request
  • 401 Unauthorized
  • 402 Payment Required
  • 403 Forbidden
  • 404 Not Found
  • 405 Method Not Allowed
  • 406 Not Acceptable
  • 407 Proxy Authentication Required
  • 408 Request Timeout
  • 409 Conflict
  • 410 Gone
  • 411 Length Required
  • 412 Precondition Failed
  • 413 Request Entity Too Large
  • 414 Request-URI Too Long
  • 415 Unsupported Media Type
  • 416 Requested Range Not Satisfiable
  • 417 Expectation Failed

Server Error 5xx

  • 500 Internal Server Error
  • 501 Not Implemented
  • 502 Bad Gateway
  • 503 Service Unavailable
  • 504 Gateway Timeout
  • 505 HTTP Version Not Supported


Oct 11

sp_betterhelpindex: A better sp_helpindex

Microsoft’s system sprocs are nice and all, but they’re rather basic. I have written an improvement to sp_helpindex which also includes key columns, included columns, reads and writes, fragmentation, and freshness of statistics. Like sp_helpindex, it goes in master but can be called from any database. Download the script with proper tabbing.

USE master
GO

IF EXISTS (SELECT 1 FROM sys.objects WHERE name = 'sp_betterhelpindex')
   DROP PROCEDURE sp_betterhelpindex
GO

CREATE PROC dbo.sp_betterhelpindex @tableName VARCHAR(255) AS

DECLARE @sql VARCHAR(MAX)

SET @sql ='
SET NOCOUNT ON

USE ' + QUOTENAME(DB_NAME()) + '

DECLARE @object_id INT, @index_id INT, @indexName VARCHAR(500)
DECLARE @keyColumns VARCHAR(2000), @includedColumns VARCHAR(2000)

SET @object_id = OBJECT_ID(''' + @tableName + ''')

DECLARE @index TABLE (
   index_id INT,
   keyColumns VARCHAR(2000),
   includedColumns VARCHAR(2000)
)

DECLARE sp_betterhelpindex_indexes CURSOR FAST_FORWARD FOR
   SELECT index_id
   FROM sys.indexes
   WHERE object_id = @object_id

OPEN sp_betterhelpindex_indexes
FETCH sp_betterhelpindex_indexes INTO @index_id
WHILE @@fetch_status = 0
BEGIN
   SELECT @keyColumns = NULL, @includedColumns = NULL

   -- Build the comma-separated key column list (descending keys flagged with (-))
   SELECT @keyColumns = ISNULL(@keyColumns + '', '', '''') +
   columns.name + (CASE WHEN is_descending_key = 1 THEN ''(-)'' ELSE '''' END)
   FROM sys.indexes indexes
   INNER JOIN sys.index_columns index_columns ON indexes.object_id = index_columns.object_id
   AND indexes.index_id = index_columns.index_id
   INNER JOIN sys.columns columns ON index_columns.object_id = columns.object_id
   AND index_columns.column_id = columns.column_id
   WHERE indexes.object_id = @object_id AND indexes.index_id = @index_id AND is_included_column = 0
   ORDER BY index_column_id

   -- Build the comma-separated included (non-key) column list
   SELECT @includedColumns = ISNULL(@includedColumns + '', '', '''') +
   columns.name + (CASE WHEN is_descending_key = 1 THEN ''(-)'' ELSE '''' END)
   FROM sys.indexes indexes
   INNER JOIN sys.index_columns index_columns ON indexes.object_id = index_columns.object_id
   AND indexes.index_id = index_columns.index_id
   INNER JOIN sys.columns columns ON index_columns.object_id = columns.object_id
   AND index_columns.column_id = columns.column_id
   WHERE indexes.object_id = @object_id AND indexes.index_id = @index_id AND is_included_column = 1
   ORDER BY index_column_id

   INSERT INTO @index VALUES (@index_id, @keyColumns, ISNULL(@includedColumns, ''''))

   FETCH sp_betterhelpindex_indexes INTO @index_id
END
CLOSE sp_betterhelpindex_indexes
DEALLOCATE sp_betterhelpindex_indexes

SELECT sys.indexes.name,
   (CASE WHEN is_primary_key = 1 THEN ''primary key, ''
   WHEN is_unique = 1 THEN ''unique, ''
   ELSE '''' END) + LOWER(sys.indexes.type_desc) AS type,
   keyColumns, includedColumns,
   sys.filegroups.name AS [filegroup],
   ISNULL(user_seeks + user_scans + user_lookups, 0) AS readsByQueries, ISNULL(user_updates, 0) AS updates,
   CAST(physicalStats.avg_fragmentation_in_percent AS INT) AS [fragmentation %],
   STATS_DATE (@object_id , i.index_id) AS statisticsUpdated, i.index_id
FROM @index i
INNER JOIN sys.indexes ON sys.indexes.object_id = @object_id AND i.index_id = sys.indexes.index_id
INNER JOIN sys.filegroups ON sys.indexes.data_space_id = sys.filegroups.data_space_id
-- LEFT JOIN: indexes not used since the last restart have no row in the usage DMV
LEFT JOIN sys.dm_db_index_usage_stats usage ON usage.object_id = @object_id
   AND i.index_id = usage.index_id
   AND usage.database_id = DB_ID()
INNER JOIN sys.dm_db_index_physical_stats(DB_ID(), @object_id, NULL, NULL, NULL) physicalStats
   ON i.index_id = physicalStats.index_id AND alloc_unit_type_desc != ''LOB_DATA''
ORDER BY sys.indexes.type_desc, keyColumns, includedColumns
'

EXEC(@sql)
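
Usage is the same shape as sp_helpindex; for example, against a hypothetical Orders table, run from whichever database owns it:

USE SomeUserDatabase   -- hypothetical; the proc reports on the current database
GO
EXEC dbo.sp_betterhelpindex 'Orders'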

Attached File sp_betterhelpindex.sql_.txt

Oct 11

Caprock Vector

A thread-safe vector component for use in ASP’s Session and Application objects. I did not write this component. The original author has evaporated, so I have archived it for the betterment of mankind.

The Vector object is a COM component designed to work with any scripting environment (it also exposes native interfaces for Visual Basic and Visual C++). Whether you are running under IIS/ASP or the Windows Script Host, you can use it to store items in a dynamic, growable array, or vector. Items can be any automation-compatible data type, such as strings, numbers, and other automation objects. It supports traditional item references by numeric index, such as:

for index = 0 to 10
   response.write vec.Item(index)
next

but also item reference by textual index like so:

set foo = vec.Item("Foo")

The caveat with textual, or named, items is that they are only supported for automation objects that implement a “Name” property. In other words, when you store an automation object in the Vector, the Vector inspects it for a property called “Name”; if it finds one, you can reference that item by name as well as by numeric index.

Special consideration has been made for the Vector object to work with IIS/ASP’s intrinsic Application and Session objects. You can store a reference to a Vector object in one or both of these objects without affecting the performance of your web application. This is accomplished by marking the Vector COM component as both apartment and free-threaded, aggregating the Free-Threaded Marshaler. This component was developed with VC++ 6.0 and ATL 3.0 for the best performance possible.

Attached File Download the 76k zip file containing the DLL, source code, samples and documentation.

Oct 11

Caprock StopWatch

An ASP stopwatch timer component for debugging, diagnostics and performancization. I did not write this component. The original author has evaporated, so I have archived it for the betterment of mankind.

Attached File Download the 21k zip file containing the DLL, source code, samples and documentation.