Wednesday, December 9, 2009

Terminal server has exceeded max allowed connections

When RDPing to a remote server, this is one of the most commonly displayed messages: "The terminal server has exceeded the maximum allowed connections."

To overcome this message, do the following:

Start -> Run -> mstsc /v:ipOfServer /admin ===> for the RDP 6.0 client or later (older clients use /console instead of /admin)

Here ipOfServer is the IP address of the server you are trying to connect to.

You can use the name of the server instead.
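If you want to see which sessions are holding the connections (and free one up), the query session / reset session utilities can be used, assuming you have administrative rights on the server. The session ID 2 below is just an example; use an ID from the query output:

query session /server:ipOfServer ---> lists all sessions and their IDs
reset session 2 /server:ipOfServer ---> logs off the session with ID 2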

Sunday, November 15, 2009

Change password on remote server when using RDP

RDP (remote desktop) key combinations:

* CTRL+ALT+END: Open the Microsoft Windows NT Security dialog box (CTRL+ALT+DEL) --- to change the password on the server

* ALT+PAGE UP: Switch between programs from left to right (CTRL+PAGE UP)
* ALT+PAGE DOWN: Switch between programs from right to left (CTRL+PAGE DOWN)
* ALT+INSERT: Cycle through the programs in most recently used order (ALT+TAB)
* ALT+HOME: Display the Start menu (CTRL+ESC)
* CTRL+ALT+BREAK: Switch the client computer between a window and a full screen
* ALT+DELETE: Display the Windows menu
* CTRL+ALT+Minus sign (-): Place a snapshot of the entire client window area on the Terminal server clipboard and provide the same functionality as pressing ALT+PRINT SCREEN on a local computer (ALT+PRT SC)
* CTRL+ALT+Plus sign (+): Place a snapshot of the active window in the client on the Terminal server clipboard and provide the same functionality as pressing PRINT SCREEN on a local computer (PRT SC)

Wednesday, November 11, 2009

Moving databases between different servers when replication and a full-text catalog are defined!

I have come across many articles on the web that describe how to move databases from one server to another, but I didn't find much information pertaining to my situation.

Recently, we carried out a server consolidation project and I was required to move databases and cubes between different servers. Moving databases by itself is a pretty straightforward task (you can use the Detach/Attach method). But when the databases you are moving are involved in replication, or a full-text (FT) catalog is defined on a database, things become tricky!

In my situation, transactional replication was involved and the databases had FT catalogs defined. So to move the databases, I followed the steps below:

1. Use the query below to find which files are involved with the database.

USE [database_name]

select name as logical_name, physical_name, type_desc, state_desc
from sys.database_files

This will give you the details of the files associated with a particular database. (Note: you can also use SP_HELPDB 'db_name' here, but it will not give you the location of the FT file.)

2. Then script out the replication. (One of the cool features of SQL Server 2005 SSMS is the ability to script things out: right-click the publication -> Generate Scripts -> select the create scripts and give a proper location where the script file will be saved.)

3. Also generate the drop scripts, which will be used to remove replication.

4. Then you can use the Detach/Attach method to move the databases from one server to another.

--Detach the database.

USE master

EXEC sp_detach_db [database_name]

At this point, copy the mdf, ldf and ndf (if any) files and the FT directory to the new server.

--Physically move the full-text catalog to the new location.

--Attach the database and specify the new location of the full-text catalog.

CREATE DATABASE [database_name] ON
(FILENAME = 'S:\data\[database_name]_Data.mdf'),
(FILENAME = 'X:\Logs\[database_name]_log.ldf'),
(FILENAME = 'S:\myFTCatalogs\AdvWksFtCat')
FOR ATTACH



5. After the attach is done, you can verify that the attached files are online by using:

USE [database_name]

select name as logical_name, physical_name, type_desc, state_desc
from sys.database_files

6. Then you can run the create replication scripts to recreate replication. (Note: as you are moving the databases to a different server, the scripts must be edited for the new server name!)


A couple of things to watch out for:

1. If replication is not dropped, detaching the database will fail.

2. While attaching the database, if the full-text catalog path is not provided correctly, a full backup will fail with the following error:

"Backup failed for Server 'ComputerName\SQLInstanceName' " or "The backup of the file or filegroup 'sysft_FullTextCatalog' is not permitted because it is not online"

This can be resolved as follows:


ALTER DATABASE [database_name] MODIFY FILE ( NAME = [logical_name_of_FTCatalog] , FILENAME = 'S:\FullText\catalog_folder_name')


This should bring the FT catalog ONLINE, and you are good to take a full backup.
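To double-check, the same sys.database_files query can be filtered to the full-text entry; state_desc should now read ONLINE (the database name below is a placeholder):

USE [database_name]

select name, physical_name, state_desc
from sys.database_files
where type_desc = 'FULLTEXT'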

If that does not bring the FT catalog online, you will have to rebuild the FT catalog, or delete the old one and recreate it from scratch.
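For reference, rebuilding or recreating the catalog can be done with statements like the following; the catalog name and path are placeholders for your own:

ALTER FULLTEXT CATALOG [catalog_name] REBUILD

--or drop it and recreate it from scratch
DROP FULLTEXT CATALOG [catalog_name]
CREATE FULLTEXT CATALOG [catalog_name] IN PATH 'S:\FullText'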


Hope this helps !


Are you exaggerating on your résumé?

Recently, I came across a very good post worth reading: Are you exaggerating on your résumé?

Just check it out as it has good points to discuss and keep in mind !


Tuesday, October 13, 2009

Script to set the output file for SQL Agent job logging.

SQL Agent has a provision to log output to an external file. This is called "verbose logging", and it helps with proper management of multiple SQL Agent jobs.
Below is a script that points every job step at an output file for logging, so the step's output is captured however the job is configured (on failure, on success, or both).


--Note: first create a folder called "Job_Logs" in K:\

USE msdb

declare @bkpath varchar(300)
set @bkpath = 'K:\Job_Logs\' --- sets the path to the folder created above

update s
set output_file_name = @bkpath + REPLACE(j.name, '''', '') + ' - ' + s.step_name + '.txt' --- j.name is the job name; REPLACE strips embedded single quotes
from sysjobsteps s join sysjobs j on s.job_id = j.job_id
where s.output_file_name is null

--Check if the path pointed is added correctly:

USE msdb ;
EXEC dbo.sp_help_job
@job_name = N'', -- specify the job name
@job_aspect = 'steps' ; -- tells the output_file_name
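Note that updating the msdb system tables directly is not officially supported; for a single step, the same thing can be done with the documented sp_update_jobstep procedure. The job name and file name below are hypothetical:

USE msdb
EXEC dbo.sp_update_jobstep
@job_name = N'MyBackupJob', -- hypothetical job name
@step_id = 1,
@output_file_name = N'K:\Job_Logs\MyBackupJob - step1.txt'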

Hope this helps !



Saturday, September 26, 2009

Is it good to put data and log files together ?

This is one of the most common questions asked! The answer depends on the environment you work in.
What I mean by that is: if your environment is heavy read/write, it is good to put the data (.mdf) and log (.ldf) files on separate physical disks (preferably RAID). The reason is that for heavy inserts/updates/deletes, SQL Server works like this:
1. It writes to the log file that it is going to change the data.
2. It writes to the data files.
3. It writes to the log file that it has changed the data, i.e. it commits the transaction.
This shows that it has to write to both the log and data files, and with both on one disk the writes become random: it writes to the log file, jumps to the data file, and then goes back to the log file. This kills performance, making the writes much slower!

There might be good reasons to put data and log files on the same disk/drive, such as:
1. If your database is READ-ONLY, i.e. it has more read activity than write, as in reporting or data warehouse workloads.
2. If TEMPDB is used heavily compared to the data/log file drives.
3. If you have a RAID-10 array, it is fine to put your data and log files on the same drive, as RAID-10 is fast and provides both mirroring and striping.

However, to decide what should be implemented, thorough research of the environment should be done, and once the system is in place, using PERFMON to track disk activity (disk reads/writes) helps!
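For example, the per-file I/O statistics that SQL Server 2005 keeps can be pulled with a DMV; comparing reads vs. writes and the stall times for the data and log files gives a quick picture of the workload. The database name below is a placeholder:

select DB_NAME(database_id) as db_name, file_id,
num_of_reads, num_of_writes,
io_stall_read_ms, io_stall_write_ms
from sys.dm_io_virtual_file_stats(DB_ID('database_name'), NULL)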

Also, to change the location of the data and log files, ALTER DATABASE can be used. **see my previous post on how this can be done**

Hope this helps !


Wednesday, July 8, 2009

Reporting Services Login box appears when trying to deploy in SSRS 2005 SP3

Hi All,

I installed SQL Server 2005 SP3 recently. After that, when I try to deploy a solution in SSRS 2005, I get a dialog box titled "Reporting Services Login". It has my server listed as the first row, "http://localhost/reportserver", and then two text boxes asking for a username and password.


When I was deploying the reports, the target server URL field in the report project's properties was set to http://localhost/Reports

I changed it to http://localhost/ReportServer, and everything worked fine.

1. Open your project in SQL Server Business Intelligence Development Studio.
2. Select the Project menu, followed by the properties of the project.
3. Change the TargetServerURL to the appropriate path to your report server.

Hope this helps,



Friday, June 5, 2009

Moving database from one location to another

Hi All,

Recently I ran across a situation wherein a GIS application was feeding a large amount of data (more than 22 GB) to SQL Server. This caused the data file (.mdf) to grow and eventually fill the disk. So came the situation of moving the database to a different location where there was more room for growth!!

1. Using the Attach/Detach method.

use adventureworks

sp_helpdb 'adventureworks' ---> this will show you the logical and physical file names of the data and log files for a particular database

Detach the database:
USE master;
EXEC sp_detach_db @dbname = N'AdventureWorks';

Move the data (.mdf), secondary data (.ndf) and log (.ldf) files to the desired location.

Attach the database:

USE master;
CREATE DATABASE AdventureWorks
ON (FILENAME = 'C:\MySQLServer\AdventureWorks_Data.mdf'),
(FILENAME = 'C:\MySQLServer\AdventureWorks_Log.ldf')
FOR ATTACH;
This method has certain drawbacks and is not recommended by BOL: "We recommend that you move databases by using the ALTER DATABASE planned relocation procedure, instead of using detach and attach."

So the recommended method is to use ALTER DATABASE.

1. sp_helpdb 'adventureworks' -> will give logical and physical file names

2. use the alter database command to move the files.

ALTER DATABASE adventureworks
MODIFY FILE (NAME = adventureworks_dat, FILENAME = 'E:\adventureworks_database\adventureworks.mdf');
ALTER DATABASE adventureworks
MODIFY FILE (NAME = adventureworks_dat2, FILENAME = 'D:\adventureworks_data_file2\adventureworks_dat2.ndf'); --- only needed if you have a secondary data file; otherwise skip this
ALTER DATABASE adventureworks
MODIFY FILE (NAME = adventureworks_log, FILENAME = 'E:\adventureworks_database\adventureworks.ldf');

ALTER DATABASE adventureworks SET OFFLINE WITH ROLLBACK IMMEDIATE --- this will immediately take your database offline, rolling back any open transactions

Now move the data files and log file to the respective desired location.

ALTER DATABASE adventureworks SET ONLINE -- this will bring your database back online

This will move your database from one location to another and is the more efficient way to do it.
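As a final check, the new file locations can be confirmed from the master database (sys.master_files is available in SQL Server 2005 and later):

select name as logical_name, physical_name, state_desc
from sys.master_files
where database_id = DB_ID('adventureworks')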

This is a good link to follow:

Hope this helps !!


Wednesday, June 3, 2009

SQL2005 [264] An attempt was made to send an email when no email session has been established.

Hi All,

Below is a problem you are likely to face after a fresh install of SQL Server 2005.

Problem: "[264] An attempt was made to send an email when no email session has been established." This message is logged in the SQL agent error logs.

The message is from a job that sends email to an operator when it is completed. In the operator notification settings, the correct email address has been entered in the 'Email Name' text box. Moreover, Database Mail is configured, and I can send/receive a test email.


1. In Management Studio, right-click on SQL Server Agent and select "Properties."
2. Click "Alert System"
3. Uncheck "Enable mail profile"
4. Click "OK"
5. right-click SQL Server Agent again and select "Properties."
6. click "Alert System"
7. Check "Enable mail profile"
8. Click "OK"
9. Restart SQL Server Agent. (This will not bring anything down unless jobs are running at the moment the agent is restarted; those running jobs will fail, but they can be rerun after the agent restarts.)
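To confirm that mail is flowing again after the restart, a quick test can be sent through Database Mail; the profile name and address below are placeholders for your own:

EXEC msdb.dbo.sp_send_dbmail
@profile_name = N'MyMailProfile', -- placeholder profile name
@recipients = N'dba@example.com', -- placeholder address
@subject = N'SQL Agent mail test',
@body = N'Test message from Database Mail'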

Hope this helps !!!


Monday, June 1, 2009

SSIS package fails when scheduled as SQL server agent job

Problem: When run through the BIDS or Visual Studio, the SSIS package runs well. But when scheduled as a SQL agent job, the SSIS package fails with the error message:
" Code: 0xC0016016 Source: Description: Failed to decrypt protected XML node "DTS:Property" with error 0x8009000B "Key not valid for use in specified state.". You may not be authorized to access this information. This error occurs when there is a cryptographic error. "

Solution: It took me a while to figure out the problem and the same was mentioned at SSC.
The default protection level on an SSIS package is "EncryptSensitiveWithUserKey", which only allows the user that created the package to run it whenever sensitive information (passwords) is stored within it.

A workaround that I am using is to change the package protection level to "EncryptAllWithPassword", which protects the entire package with a password. When you set the package up as a job step, go into the properties of that job step, click the "Command Line" tab, and you should be prompted for the password. After you enter the password, the /DECRYPT switch shows up in the command line, and the job should then be able to run under a different user.

Users won't be able to see the password because they will need to enter it before viewing the command line for the job step.
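The same package can also be run manually from the command line; with "EncryptAllWithPassword", the /De (decrypt) switch of dtexec supplies the password. The package path and password below are placeholders:

dtexec /F "C:\packages\MyPackage.dtsx" /De "packagePassword"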

(The issue is addressed nicely with many inputs at

Thursday, May 28, 2009

Installing and configuring the ArcSDE service for SQL Server is an easy job, but sometimes the ArcSDE service, which is dependent on SQL Server, stops due to a server restart or a user shutdown. This happens even though the ArcSDE service is set to start automatically after a server reboot.

Problem: The ArcSDE 9.3 service running on SQL Server cannot be started. When you attempt to start it from the GUI (cmd -> services.msc -> right-click the esri_sde service -> Start), it shows an error message that "the ArcSDE service started and stopped. This may happen due to the fact that some services may stop as they have no work to do."


1. Run -> cmd -> sdeservice -o list (this will list the SDE services installed and their status)
Note: to see the status of the service, use ------ sdemon -o status -i 5151 ------
2. sdemon -o start
Provide the sde password that was created during the ArcSDE installation.

This will start the service !!!
I came across a very minor issue that is often not scrutinized, and this can lead to major trouble!!

Problem: After the SP3 installation, the report server was not available to the application accessing it, i.e. the report user (a fixed user, user1, used to pull reports) was denied access to the database.


1. Log in to the server where your report server is installed.

2. Go to Administrative Tools -> IIS Manager -> in the left pane, expand the local computer -> Web Sites -> Default Web Site -> Reports -> right-click -> Properties -> Directory Security -> Authentication and access control -> Edit -> CHECK THE CHECK BOX at the top ["Enable anonymous access"].

During the service pack installation, "Enable anonymous access" becomes unchecked!!!

This fixes the issue !!

SQL server SP3 installation problem

Recently, I installed SP3 for SQL Server 2005. The installation was successful, but afterwards the maintenance plans were failing.

Problem: The maintenance jobs are failing after the SP3 installation on the test server.

Error Message: Unable to cast COM object of type 'Microsoft.SqlServer.Dts.Runtime.Wrapper.PackageNeutralClass' to interface type 'Microsoft.SqlServer.Dts.Runtime.Wrapper.IDTSContainer90'.

Cause: This message is generally due to an unregistered COM component, dts.dll, in the \90\DTS\binn folder of the SQL Server installation.

On Server follow below steps to fix the issue:

1. start -> run -> cmd
2. cd c:\Program Files\Microsoft SQL Server\90\DTS\binn
3. REGSVR32.EXE dts.dll

This fixes the issue !!

Tuesday, May 26, 2009

SSIS package fails to execute when scheduled as a job

Recently, I was creating an SSIS package; it ran successfully in the BIDS environment, but when scheduled through a job in SSMS it gave me this error:
Code: 0xC0016016 Source: Description: Failed to decrypt protected XML node "DTS:Password" with error 0x8009000B "Key not valid for use in specified state.". You may not be authorized to access this information. This error occurs when there is a cryptographic error. Verify that the correct key is available. End Error Error: 2008-09-15 16:34:32.57 Code: 0xC0202009 Source: TextFileImport Connection manager "DecipherDestination" Description: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80004005. An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80004005 Description: "Login timeout expired". An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80004005 Descrip… The package execution fa… The step failed.

Meanwhile I came across a good post describing the same situation:

Thanks to the original poster !!
This blog is for SQL Server and ArcSDE administration. Thanks for visiting!

Disclaimer: The contents provided on this blog are as is and the poster assumes no responsibility in case of any loss of data. Please test thoroughly the solution or codes provided in your environment.