Monday, September 21, 2015

Database cannot be opened due to inaccessible files or insufficient memory or disk space



Msg 945, Level 14, State 2, Line 1
Database 'db' cannot be opened due to inaccessible files or insufficient memory or disk space. See the SQL Server errorlog for details.


Solution:
Check the Microsoft SQL Server error log to determine the cause, then:
  • If it is due to a persistent I/O error, a torn page, or another hardware problem, resolve the underlying issue and restore the database from a backup. If no backup is available, try the DBCC CHECKDB repair options.
  • If possible, add disk space to the drive holding the data and log files, or free up some existing space.
  • Verify the permissions of the SQL Server service account on the database files.
  • Verify that the database's Autogrowth property is enabled.
  • The .mdf and .ldf files should not be marked as Read Only at the Windows level.
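
A quick diagnostic sketch, assuming the database is named 'db' as in the error above, to check the state SQL Server reports and the physical paths of its files:

USE master;
GO
-- Current state of the database as reported by SQL Server
SELECT name, state_desc
FROM sys.databases
WHERE name = 'db';

-- Physical file locations, to confirm the files exist and are accessible on disk
SELECT name, physical_name, state_desc
FROM sys.master_files
WHERE database_id = DB_ID('db');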


If none of the above solves the issue, try this:


1. Take the database offline to clear the database status:

USE master;
ALTER DATABASE dbname SET OFFLINE;

2. Now bring the database back online; at this step the log and data files are verified by SQL Server:


USE master;
ALTER DATABASE dbname SET ONLINE;


Friday, April 17, 2015

SQL Server SSRS: List out all Data Sources and all their dependency objects/Reports

This script lists all the data sources by their actual names, along with all their dependent items:

SELECT C2.NAME AS Data_Source_Name
 ,C.NAME AS Dependent_Item_Name
 ,C.Path AS Dependent_Item_Path
FROM ReportServer.dbo.DataSource AS DS
INNER JOIN ReportServer.dbo.CATALOG AS C ON DS.ItemID = C.ItemID
 AND DS.Link IN (
  SELECT ItemID
  FROM ReportServer.dbo.CATALOG
  WHERE Type = 5
  ) --Type 5 identifies data sources
FULL JOIN ReportServer.dbo.CATALOG C2 ON DS.Link = C2.ItemID
WHERE C2.Type = 5
ORDER BY C2.NAME ASC
 ,C.NAME ASC;

How to find when a login was deleted in SQL Server

SELECT TE.NAME AS [EventName]
 ,v.subclass_name
 ,T.DatabaseName
 ,t.DatabaseID
 ,t.NTDomainName
 ,t.ApplicationName
 ,t.LoginName
 ,t.SPID
 ,t.StartTime
 ,t.RoleName
 ,t.TargetUserName
 ,t.TargetLoginName
 ,t.SessionLoginName
FROM sys.fn_trace_gettable(CONVERT(VARCHAR(150), (
    SELECT TOP 1 f.[value]
    FROM sys.fn_trace_getinfo(NULL) f
    WHERE f.property = 2
    )), DEFAULT) T
INNER JOIN sys.trace_events TE ON T.EventClass = TE.trace_event_id
INNER JOIN sys.trace_subclass_values v ON v.trace_event_id = TE.trace_event_id
 AND v.subclass_value = t.EventSubClass
WHERE te.NAME IN (
  'Audit Addlogin Event'
  ,'Audit Add DB User Event'
  ,'Audit Add Member to DB Role Event'
  )
 AND v.subclass_name IN (
  'add'
  ,'Grant database access'
  ,'drop'
  ,'Revoke database access'
  )
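
Note: this query relies on the default trace, which keeps only a limited number of rollover files, so events older than the files still on disk may no longer be available.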

Wednesday, April 15, 2015

THE PACKAGE FAILED TO LOAD DUE TO ERROR 0XC0011008

Background: While working on a small SQL Server Integration Services project I ran into this problem.

I first developed the packages locally on my laptop using the 2008 versions of SQL Server and SSIS. After creating two separate packages, one for the import and one to clean out the tables as I tested the import, I knew I was going to have to update some of the components in the packages after moving them to the remote development box used by the dev team.

After copying the files from my laptop to the remote server, running them, making a few changes here and there, and getting them to work as designed, I reported in our morning standup meeting that I was 95% complete. I just needed to document the information for turnover and wanted to confirm all was still working. Then this error popped up after I made another change and tried to redeploy:

The package failed to load due to error 0xC0011008 “Error loading from XML. No further detailed error information can be specified for this problem because no Events object was passed where detailed error information can be stored.”. This occurs when CPackage::LoadFromXML fails.



I was confused. What changed from yesterday to today?

My code had not changed except for the minor update I had just performed. So I reverted the change and tried to deploy again. Same error. I then went to Google to see what information was out on the web for this error. Surely someone had run into this issue before. After finding several forum posts by others dating as far back as 2005, I still did not find anything to indicate what was wrong or what else to investigate. I sent a tweet to some SSIS people and to the #ssis list. I also posted the error in several forums, still thinking someone had run into this before.

One tweet mentioned the package protection level. It was set to the default of EncryptSensitiveWithUserKey. I had not changed that (at least not that I remembered). So I changed it to DontSaveSensitive just to see if somehow I had changed it in my sleep. Still no love.

Two people responded with this link:

http://tinyurl.com/yazwg58

No love here. I was working remotely but that was the only similarity.

For some reason I tried to open the dtsx file in the Deployment folder. After all, I had nothing to lose. That is when I got these errors:

Error 1: Error loading CleanStagingTables.dtsx: The version number in the package is not valid. The version number cannot be greater than current version number.

Error 2: Error loading CleanStagingTables.dtsx: Package migration from version 3 to version 2 failed with error 0xC001700A “The version number in the package is not valid. The version number cannot be greater than current version number.”.

Error 3: Error loading CleanStagingTables.dtsx: Error loading value “<DTS:Property xmlns:DTS=”www.microsoft.com/SqlServer/Dts” DTS:Name=”PackageFormatVersion”>3</DTS:Property>” from node “DTS:Property”.

Error 4: Error loading ‘CleanStagingTables.dtsx’ : The package failed to load due to error 0xC0010014 “One or more error occurred. There should be more specific errors preceding this one that explains the details of the errors. This message is used as a return value from functions that encounter errors.”. This occurs when CPackage::LoadFromXML fails.

Aha! Could these be the underlying issue(s)? At least I had a different path to look down.

After searching again I found this post:

http://tinyurl.com/y8z9bud

According to the blog post, the problem is this:

Reason for the error: the old version of DTEXEC is picked up instead of the new one. That is, the exe shipped with SQL Server 2005 is used when the one shipped with 2008 is expected. Obviously, this happens when SQL Server 2008 is running alongside SQL Server 2005 on the same machine. As a result, we end up with two versions of the DTEXEC executable: one residing in the SQL Server 2005 path ("C:\Program Files\Microsoft SQL Server\90\DTS\Binn") and the other in the SQL Server 2008 path ("C:\Program Files\Microsoft SQL Server\100\DTS\Binn").

There are three choices to fix this problem.

1. Hard code the path of SQL Server 2008’s DTEXEC while calling the SSIS package as shown below.

C:\Program Files\Microsoft SQL Server\100\DTS\Binn\DTEXEC.exe /F “D:\MyFolder\MyPackage.dtsx”

2. Rename the old exe in the 2005 path to a different name (Example:- C:\Program Files\Microsoft SQL Server\90\DTS\Binn\DTEXEC_Old.exe)

3. Go to the PATH environment variable and edit it so that the "C:\Program Files\Microsoft SQL Server\100\DTS\Binn" path appears before the "C:\Program Files\Microsoft SQL Server\90\DTS\Binn" path.

I first tried choice #2, renaming the file. Yup, you guessed it. No love here.

I then proceeded to choice #3. Again, no love. I don't know why I never tried choice #1, other than figuring that if renaming the file and changing the ordering in the PATH environment variable didn't work, why would the first?

My next move will not work for the majority of you: I uninstalled SQL Server 2005. We were not using it on this server, and the only project on it was the one I was working on. After uninstalling, I reapplied SP1 to SQL Server 2008 just in case. With SQL Server 2005 gone, this of course caused other issues (say it isn't so).

The next issue was that the .dtsx files and the deployment manifest files were no longer associated with any program. Good and bad. Good because I knew I had finally broken the chain linked to SQL Server 2005. Bad because now I couldn't use my files! AHHH! Well, not really. When I clicked on the manifest file the Open With dialog box opened, and once I associated the files with the SQL Server 2008 versions, all worked as designed. The .dtsx files should be associated with Integration Services Package and the manifest files with DTSInstall.exe.

After getting this all working I went back and looked at the log tables to see if I could find anything to point to why this started after I had deployed the package without error.

Sure enough, I found this note in the logs.

Product: Microsoft SQL Server 2005 – Update ‘Service Pack 3 for SQL Server Database Services 2005 ENU (KB955706)’ installed successfully.

I know I had applied SP1 to SQL Server 2008 before I even started the work.  I actually had that on my laptop because I had recently gotten a new hard drive and had to reinstall everything.

I don't recall clicking on SP3 to download/install, but I have to take the responsibility since I can't blame it on anyone else. The only thing I can think of is that when SP3 got installed, it made SQL Server 2005 the default again, and that is why I was getting the version-out-of-sync error.

I heard a quote last night from comedian Ron White: "You can fix a lot of things (my injection here: T-SQL code, bugs), but you can't fix stupid. Stupid is forever."


Tuesday, April 14, 2015

SQL Server view for monitoring open Service Broker conversations


SELECT conversation_handle
 ,is_initiator
 ,s.NAME AS 'local service'
 ,far_service
 ,sc.NAME 'contract'
 ,ce.state_desc
FROM sys.conversation_endpoints ce
LEFT JOIN sys.services s ON ce.service_id = s.service_id
LEFT JOIN sys.service_contracts sc ON ce.service_contract_id = sc.service_contract_id;
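
The query above can be wrapped in an actual view, as the title suggests, so it is always at hand. A minimal sketch; the view name dbo.OpenServiceBrokerConversations is an assumption:

CREATE VIEW dbo.OpenServiceBrokerConversations
AS
-- Same query as above; bracketed aliases keep the view's column names clean
SELECT ce.conversation_handle
 ,ce.is_initiator
 ,s.NAME AS [local service]
 ,ce.far_service
 ,sc.NAME AS [contract]
 ,ce.state_desc
FROM sys.conversation_endpoints ce
LEFT JOIN sys.services s ON ce.service_id = s.service_id
LEFT JOIN sys.service_contracts sc ON ce.service_contract_id = sc.service_contract_id;

Querying it then becomes a simple SELECT * FROM dbo.OpenServiceBrokerConversations;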

Monday, April 13, 2015

SQL Server agent job running at this moment.

SQL Server Agent stores a lot of great information in its system tables, and the GUI provides a nice way of retrieving and viewing that data, but it does not always display the data the way you would like. From a high level it is nice to see the current status of all jobs and their last run status, but what if you want to see additional information, or the data displayed in a different format? On a one-by-one basis you can get this information from the GUI, but it is not very easy to correlate this data across jobs or even across different time periods.

There are probably a million reasons why you might want to track down what job(s) were running at a particular time. Most of them, of course, will be due to various performance issues, side effects, and other problems that you might end up observing on one of your servers at a given time, and having the ability to quickly rule out a SQL Server Agent job (or know that it was the culprit) is therefore a godsend.

One of the things SQL Server doesn’t make all that easy is to know what SQL Agent jobs are running.

exec msdb.dbo.sp_help_job @execution_status=1

This little command tells you precisely which SQL Agent jobs are running at the moment, and that fills a real need for DBAs in a lot of cases.

Source: mikehillwig
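
If you prefer a plain query over the stored procedure, for example so you can join the output to other data, a rough equivalent against the msdb tables might look like the sketch below (it assumes the most recent Agent session in msdb.dbo.syssessions is the one currently running):

-- A sketch: jobs that have started but not yet stopped in the latest Agent session
SELECT j.name AS job_name
 ,ja.start_execution_date
FROM msdb.dbo.sysjobactivity AS ja
INNER JOIN msdb.dbo.sysjobs AS j ON ja.job_id = j.job_id
WHERE ja.session_id = (
  SELECT TOP 1 session_id
  FROM msdb.dbo.syssessions
  ORDER BY agent_start_date DESC
  )
 AND ja.start_execution_date IS NOT NULL
 AND ja.stop_execution_date IS NULL;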

Wednesday, April 8, 2015

Available Protocols for SQL Server

Three Available Protocols for SQL Server


Shared Memory: This is an in-memory protocol and thus is only suitable for applications running on the same machine as the SQL Server instance.

Named Pipes: This is an interprocess communication (IPC) protocol that enables a process to communicate with another process, possibly running on a different computer, through the use of shared memory. This protocol typically works well in small and fast local area networks because it generates additional network traffic during use. In larger and slower networks, TCP/IP works better.

TCP/IP: Transmission Control Protocol/Internet Protocol (TCP/IP) is widely used today. TCP guarantees the delivery and order of the information sent between computers, while IP defines the format or structure of the data sent. TCP/IP also contains advanced security features that make it attractive to security-sensitive organizations and users. This protocol works well in larger and slower networks.
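
To see which protocol existing connections are actually using, you can query sys.dm_exec_connections; a quick sketch:

-- net_transport shows Shared memory, Named pipe, or TCP for each connection
SELECT session_id
 ,net_transport
 ,auth_scheme
 ,client_net_address
FROM sys.dm_exec_connections;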

Thursday, April 2, 2015

Grant/Revoke Permissions to run SQL Server Profiler for a non System Admin User

Security and Permissions

Tracing can expose a lot of information about not only the state of the server, but also the data sent to and returned from the database engine by users. The ability to monitor individual queries down to the batch or even query plan level is at once both powerful and worrisome; even exposure of stored procedure input arguments can give an attacker a lot of information about the data in your database.
In order to protect SQL Trace from users that should not be able to view the data it exposes, previous versions of SQL Server allowed only administrative users (members of the sysadmin fixed server role) access to start traces. That restriction proved a bit too inflexible for many development teams, and as a result it has been loosened.



GRANT/REVOKE ALTER TRACE permission using T-SQL

We can also assign this permission with T-SQL. Run the commands below in the master database; if you grant the permission this way, there is no need to grant it through the GUI.

--  Grant access to Windows Login
USE Master;
GO
GRANT ALTER TRACE TO [DomainName\WindowsLogin]
GO

-- To Grant access to a SQL Login
USE master;
GO
GRANT ALTER TRACE TO Geoff;
GO

Now you can go ahead and launch SQL Server Profiler to verify that you have access.
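
You can also verify from T-SQL, connected as the login in question; a quick sketch using fn_my_permissions:

-- ALTER TRACE should appear in the caller's effective server-level permissions
SELECT permission_name
FROM fn_my_permissions(NULL, 'SERVER')
WHERE permission_name = 'ALTER TRACE';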

  If you want to revoke this access from the assigned login then you can run the below commands to remove the ALTER TRACE permission.

--REVOKE access FROM a SQL Login
USE Master;
GO
REVOKE ALTER TRACE FROM Geoff;
GO

-- REVOKE access FROM a Windows Login
USE master;
GO
REVOKE ALTER TRACE FROM [DomainNAME\WindowsLogin]
GO

SET ROWCOUNT - tell SQL Server to stop processing a query after the specified number of rows are returned


NOTE: Setting the SET ROWCOUNT option causes most Transact-SQL statements to stop processing when they have been affected by the specified number of rows. This includes triggers. The ROWCOUNT option does not affect dynamic cursors, but it does limit the rowset of keyset and insensitive cursors. This option should be used with caution.
SET ROWCOUNT overrides the SELECT statement TOP keyword if the rowcount is the smaller value.
The setting of SET ROWCOUNT is set at execute or run time and not at parse time.

SET ROWCOUNT stops processing after the specified number of rows. In the following example, note that over 500 rows meet the criteria of Quantity less than 300. However, after applying SET ROWCOUNT, you can see that not all rows were returned.

USE AdventureWorks2012;
GO
SELECT count(*) AS Count
FROM Production.ProductInventory
WHERE Quantity < 300;
GO

Here is the result set.

Count
-----------
537

(1 row(s) affected)

Now, set ROWCOUNT to 4 and return all rows to demonstrate that only 4 rows are returned.

SET ROWCOUNT 4;
SELECT *
FROM Production.ProductInventory
WHERE Quantity < 300;
GO

(4 row(s) affected)
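
To remove the limit again for the rest of the session, set the value back to 0 (and for new development Microsoft recommends using TOP rather than SET ROWCOUNT):

-- Setting ROWCOUNT back to 0 turns the limit off for subsequent statements
SET ROWCOUNT 0;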

Checklist for Restoring TDE Encrypted database on different SQL Server

In order to perform a successful restore, we'll need a database master key in place in the master database, and we'll need to restore the certificate that was used to encrypt the database, making sure we restore it together with its private key. In checklist form:
  • There's a database master key in the master database.
  • The certificate used to encrypt the database is restored along with its private key.
  • The database backup to be restored.
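
A minimal sketch of the restore on the destination server; the certificate name, passwords, and file paths below are placeholders, not values from any particular environment:

USE master;
GO
-- 1. Database master key in master (skip if one already exists)
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'StrongMasterKeyPassword!1';
GO
-- 2. Restore the certificate that encrypted the database, together with its private key
CREATE CERTIFICATE TDECert
FROM FILE = 'C:\Backup\TDECert.cer'
WITH PRIVATE KEY (
    FILE = 'C:\Backup\TDECert.pvk',
    DECRYPTION BY PASSWORD = 'CertBackupPassword!1'
    );
GO
-- 3. Restore the encrypted database backup as usual
RESTORE DATABASE TDEDemo
FROM DISK = 'C:\Backup\TDEDemo.bak'
WITH MOVE 'TDEDemo' TO 'D:\Data\TDEDemo.mdf',
     MOVE 'TDEDemo_log' TO 'E:\Log\TDEDemo_log.ldf';
GO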