Channel: Forum SQL Server Database Engine

External Connection - Storage Emulator


Hi all,

I'm trying to connect my local SQL Server database instance to the blob storage emulator as an external connection; however, I'm getting a "Bad or inaccessible location specified" error. Here are the steps I'm taking:

CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<password>';
GO

-- Create the credentials using the generic storage account key
CREATE DATABASE SCOPED CREDENTIAL localBlobStorageCredential
WITH IDENTITY = 'devstoreaccount1',
SECRET = 'Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==';
GO

-- Create the data source pointing to the blob storage location I created (blob-storage-location)
CREATE EXTERNAL DATA SOURCE localBlobStorage
WITH (
    TYPE = BLOB_STORAGE,
    LOCATION = 'http://127.0.0.1:10000/devstoreaccount1/blob-storage-location',
    CREDENTIAL= localBlobStorageCredential
);
GO

-- Attempt to select a file that has been uploaded (cp.json)
SELECT * FROM OPENROWSET(
	BULK 'cp.json',
	SINGLE_BLOB,
	DATA_SOURCE = 'localBlobStorage'
) AS j;

I then get the following error:

Bad or inaccessible location specified in external data source "localBlobStorage".
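
(For reference: the documentation for TYPE = BLOB_STORAGE data sources describes a credential based on a shared access signature, with IDENTITY = 'SHARED ACCESS SIGNATURE' and the SAS token (without the leading '?') as the secret, rather than the account key. A minimal sketch of that variant with a placeholder SAS value; whether the emulator endpoint is accepted at all is a separate question:)

-- Alternative credential sketch: shared access signature instead of the account key
-- (the SECRET below is a placeholder, not a real SAS token)
CREATE DATABASE SCOPED CREDENTIAL localBlobStorageSasCredential
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = 'sv=2017-11-09&sr=c&sp=rl&se=2030-01-01&sig=<placeholder>';
GO

CREATE EXTERNAL DATA SOURCE localBlobStorageSas
WITH (
    TYPE = BLOB_STORAGE,
    LOCATION = 'http://127.0.0.1:10000/devstoreaccount1/blob-storage-location',
    CREDENTIAL = localBlobStorageSasCredential
);
GO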

Any ideas?

Regards,

James


Utility Explorer


Hello!

I've downloaded and installed the Developer edition of SQL Server 2016 SP1 and the latest SSMS (SSMS-Setup-ENU.exe). Now when I try to open Utility Explorer, I just can't find it.


Thank you in advance,

Michael


Options to minimize performance impact caused by Row Level Security


Hello,

We have plans to implement Row Level Security on a 1 TB table to control access to records. We are using a function, a security policy, and a lookup table with the list of user groups to implement this logic.
The logic seems to be working fine; however, we are seeing a performance hit because the function is called for every row returned by the query.
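
For reference, a minimal sketch of the pattern described, using hypothetical names (dbo.SecurityLookup, dbo.BigTable, GroupName); keeping the predicate an inline, schema-bound table-valued function and indexing the lookup table on the joined columns are the usual starting points:

-- Inline, schema-bound predicate function: returns a row only when the current
-- user belongs to a group allowed to see this row's GroupName value.
CREATE FUNCTION dbo.fn_SecurityPredicate (@GroupName AS sysname)
RETURNS TABLE
WITH SCHEMABINDING
AS
RETURN
    SELECT 1 AS fn_SecurityPredicate_result
    FROM dbo.SecurityLookup AS sl
    WHERE sl.GroupName = @GroupName
      AND sl.UserGroup = USER_NAME();
GO

-- Security policy applying the filter predicate to the large table
CREATE SECURITY POLICY dbo.BigTablePolicy
ADD FILTER PREDICATE dbo.fn_SecurityPredicate(GroupName) ON dbo.BigTable
WITH (STATE = ON);
GO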

Please let us know if any of you have recommendations or options to minimize the performance impact caused by the RLS implementation.

SQL Server Version : SQL 2016 - SP2

Regards

MDW Data Collector Server Activity collection step 3 suddenly stopped working


For many weeks my data collections from my database instance have been uploading to my MDW server. Suddenly step 3 of the Server Activity collection set has stopped working, giving the following error:

Job Name: DataCollector Server Stats Coll n Upload
Step Name: collection_set_3_noncached_collect_and_upload_upload
Duration: 00:00:05
Sql Severity: 0
Sql Message ID: 0
Operator Emailed:
Operator Net sent:
Operator Paged:
Retries Attempted: 0

Message: Executed as user: domain_svcAccount.

SSIS error. Component name: DFT - Find and save sql text, Code: -1071636471, Subcomponent: LKUP - Look up query text on target server [2], Description: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80040E14. An OLE DB record is available. Source: "Microsoft SQL Server Native Client 11.0" Hresult: 0x80040E14 Description: "The handle that was passed to dm_exec_sql_text was invalid.".

SSIS error. Component name: DFT - Find and save sql text, Code: -1071611309, Subcomponent: LKUP - Look up query text on target server [2], Description: OLE DB error occurred while fetching parameterized rowset. Check SQLCommand and SqlCommandParam properties.

SSIS error. Component name: DFT - Find and save sql text, Code: -1073450974, Subcomponent: SSIS.Pipeline, Description: SSIS Error Code DTS_E_PROCESSINPUTFAILED. The ProcessInput method on component "LKUP - Look up query text on target server" (2) failed with error code 0xC0208253 while processing input "Lookup Input" (16). The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running. There may be error messages posted before this with more information about the failure.

The master package exited with error, previous error messages should explain the cause. Process Exit Code 5. The step failed.

Has anyone encountered this error before? Thanks.

Creating Distinct Top N records


Hello All,

I am trying to create a query to get distinct top n records in SQL.
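
For illustration, the basic pattern, assuming a hypothetical table dbo.Orders with a CustomerID column:

-- Return the first 10 distinct CustomerID values
SELECT DISTINCT TOP (10) CustomerID
FROM dbo.Orders
ORDER BY CustomerID;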

Please can someone help?

Cheers,

s

Backup performance 4x slower than manual filecopy


Hi,

I noticed that when copying a file from a SAN volume to a network share on my Windows 2012 R2 server the speed is 400MB/s.  When SQL Server performs a backup of data from that same disk, it happens at about 100MB/s.  Running a Crystal Disk Mark benchmark of sequential read 5GB, I got 700MB/s.  Running BACKUP DATABASE MYDB TO DISK = 'NUL:', I also get 100MB/s. 

The SAN vendor reviewed the server configuration and doesn't see any issues.  This same server attached to storage from another vendor is getting near 400MB/s on backups - this is a different instance. Resource governor is not used, if that could even have an effect. The array is EMC.  Microsoft MPIO driver is being used, no special SAN software needed.

To summarize: 

  1. SQL backups are slow.
  2. SQL backup to NUL: is slow, indicating slow reads from SAN volume.
  3. Manual file copy through explorer is 4x faster.
  4. Sequential read test is 7x faster.
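
(For reference, SQL Server backup throughput can sometimes be influenced by the BUFFERCOUNT and MAXTRANSFERSIZE options; a minimal test sketch against the NUL: device, with example values only:)

-- Repeat the NUL: test with larger/more I/O buffers to see whether read speed changes
-- (64 buffers and a 4 MB transfer size are example values, not recommendations)
BACKUP DATABASE MYDB
TO DISK = 'NUL:'
WITH COPY_ONLY, BUFFERCOUNT = 64, MAXTRANSFERSIZE = 4194304;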

Does anyone have suggestions on how to improve this performance?

Thanks,

Sam

 


Hiding or masking sensitive information in the SQL Server audit log


I'm looking into auditing who is accessing the data in all tables in a database. We also have the requirement that the log must not contain any information that can be used to identify a person.

The first part can easily be done using SQL Server Audit to log all SELECT, INSERT, and DELETE statements for a specified database. It is the second part that I have a problem with. The whole SQL statement, including any personal information in e.g. the WHERE clause, is stored in the 'statement' column. This means that whoever has access to the audit log will be able to see this information and identify the person in question. We do not need the information stored in the 'statement' column; the 'database_name', 'schema_name', and 'object_name' columns are enough for us.
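
For reference, a minimal sketch of the audit setup for the first part, with placeholder names and a placeholder file path:

-- Server audit writing to a file target (placeholder path)
CREATE SERVER AUDIT Audit_DataAccess
TO FILE (FILEPATH = N'D:\AuditLogs\');
GO
ALTER SERVER AUDIT Audit_DataAccess WITH (STATE = ON);
GO

-- Database audit specification logging SELECT/INSERT/DELETE on the whole database
USE MyDatabase;   -- placeholder database name
GO
CREATE DATABASE AUDIT SPECIFICATION AuditSpec_DataAccess
FOR SERVER AUDIT Audit_DataAccess
ADD (SELECT, INSERT, DELETE ON DATABASE::MyDatabase BY public)
WITH (STATE = ON);
GO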

My question is: is there any way to prevent SQL Server Audit from storing the SQL statement at all, or at least from storing it in clear text, in the 'statement' column?

'Error Number 825' occurred - Disk I/O


SQL Server Alert System: 'Error Number 825' occurred 

Error:

DESCRIPTION:    A read of the file 'G:\MSSQLDataFiles\ecommerce_amzn_items_2.ndf' at offset 0x00000097dae000 succeeded after failing 1 time(s) with error: 1117(The request could not be performed because of an I/O device error.). Additional messages in the SQL Server error log and system event log may provide more detail. This error condition threatens database integrity and must be corrected. Complete a full database consistency check (DBCC CHECKDB). This error can be caused by many factors; for more information, see SQL Server Books Online.

This database is in heavy production use and is the publisher for transactional replication.

How should I go about checking this with DBCC CHECKDB? (Is there any possibility of corruption affecting the replication?)
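
(For reference, a minimal sketch of the consistency check the error message asks for, with a placeholder database name; PHYSICAL_ONLY is the lighter-weight variant often run first on large production databases:)

-- Full logical and physical consistency check (placeholder database name)
DBCC CHECKDB (N'MyEcommerceDb') WITH NO_INFOMSGS, ALL_ERRORMSGS;

-- Faster, physical-structure-only check
DBCC CHECKDB (N'MyEcommerceDb') WITH PHYSICAL_ONLY;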



VS Just in time debugger alert


Hi guys, I am seeing a Visual Studio Just-In-Time Debugger alert on one of my servers.

Any idea what it is?

Thanks

How to get data from Oracle into SQL Server with OPENQUERY where Oracle.data = SqlServer.data


Hello, I have a question, please:

How can I get data from Oracle into SQL Server with OPENQUERY in order to compare data? Here is my query:

 

SELECT TOP (1000)
       [TADIG]
      ,[TELEFONO]
      ,[ANEXO]
      ,[COD_CLIENTE]
      ,(CASE
            WHEN [CLIENTE] IS NULL
                THEN (SELECT * FROM OPENQUERY(ALDM, 'select customer_legal_name from ODSDMP.CUSTOMER') AS A
                      WHERE A.subscriber_last_name = B.[SEGMENTO])
            WHEN [CLIENTE] = LTRIM(RTRIM('.'))
                THEN 'ACAA!!!!'
            ELSE [CLIENTE]
        END) AS CLIENTE
      ,[FECHA_ALTA]
      ,[TIPO_CLIENTE]
      ,[SEGMENTO]
  FROM [NRTRDE].[dbo].[ESOTOP_HURS_3] B
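
(For reference, OPENQUERY cannot be correlated with columns from the outer query; a common workaround is to join to the OPENQUERY result set instead. A sketch, assuming ODSDMP.CUSTOMER also exposes the subscriber_last_name column used for matching:)

SELECT TOP (1000)
       B.[TADIG], B.[TELEFONO], B.[ANEXO], B.[COD_CLIENTE],
       CASE
           WHEN B.[CLIENTE] IS NULL             THEN A.customer_legal_name
           WHEN B.[CLIENTE] = LTRIM(RTRIM('.')) THEN 'ACAA!!!!'
           ELSE B.[CLIENTE]
       END AS CLIENTE,
       B.[FECHA_ALTA], B.[TIPO_CLIENTE], B.[SEGMENTO]
FROM [NRTRDE].[dbo].[ESOTOP_HURS_3] AS B
LEFT JOIN OPENQUERY(ALDM,
        'select customer_legal_name, subscriber_last_name from ODSDMP.CUSTOMER') AS A
    ON A.subscriber_last_name = B.[SEGMENTO];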

Thanks a lot for your answers!! :)


Can We Delete MS_AgentSigningCertificate.cer?


After installing SQL Server (2016, 2017) there is an "MS_AgentSigningCertificate.cer" file in the system database files folder (i.e., where the files for the [master] database are). I have read in several other threads that this file is used for internal processes during installation.

Can the file be safely deleted after installation is complete?


Dan Jameson
Associate Director of IT/DBA
Children's Oncology Group
www.ChildrensOncologyGroup.org

Compress backup


When I set up a full backup on SQL Server 2014 Enterprise, there is a section "Set backup compression". If I choose "Use the default server setting", how can I tell whether the backup will be compressed or not?
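
For reference, the server-wide default can be checked with a query against sys.configurations; a value_in_use of 1 means "Use the default server setting" produces compressed backups:

-- Instance-wide default for backup compression (0 = off, 1 = on)
SELECT name, value_in_use
FROM sys.configurations
WHERE name = 'backup compression default';

-- Compression can also be requested explicitly, regardless of the default
-- (placeholder database name and path)
BACKUP DATABASE MyDatabase
TO DISK = N'X:\Backups\MyDatabase_full.bak'
WITH COMPRESSION;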

I would like to make the backup file as small as possible because the disk for backup file storage is limited in my environment, so I would like to change the configuration to compress backups. Is there anything to be concerned about when using backup compression? In my understanding it increases CPU usage, so it is better to schedule the backups during periods of low user activity, and compressed and non-compressed backups cannot be appended to the same backup set, so I think I need to create a new backup set for that. Is there anything else to be concerned about with compressed backup files?

My biggest concern is how a compressed backup file will affect the recovery procedure. Is there any difference when restoring a database from a compressed backup file versus an uncompressed one?

Any advice will be gratefully appreciated. Thank you.

Transaction log just after changing recovery mode from simple to full


Please correct me if any of my following understanding is wrong..

If I change the recovery model from simple to full on SQL Server 2014 Enterprise, transaction log truncation will keep happening automatically, just as in the simple recovery model, UNTIL the first full backup is taken after the recovery model change.

After the first full backup, the transaction log will keep growing until I take a transaction log backup, which truncates the transaction log at the same time.
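
For reference, a quick way to see whether the log is currently waiting on a log backup before it can be truncated; a minimal sketch with a placeholder database name:

-- log_reuse_wait_desc shows 'LOG_BACKUP' when a log backup is what the log is waiting for
SELECT name, recovery_model_desc, log_reuse_wait_desc
FROM sys.databases
WHERE name = N'MyDatabase';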

Any advice will be gratefully appreciated. Thank you.

   



Backup file management


I would like to ask about backup configuration on SQL Server.

My database is on SQL Server 2014 Enterprise in the Full recovery model, configured with a weekly full backup and daily differential and transaction log backups. There is only limited storage for backup files in my environment, so I am reviewing my backup configuration to keep the growth of the backup files as small as possible. Please correct me if any of my understanding below is wrong.

-If I choose "Append to the exisiting backup set" in weekly full backup configuration, all past backup will be remained in the file? For example, if the database to take a backup is 10 GB, and it is schedule to execute as weekly, it will be gaining 10 GB every week? If you need to recover your dabase, you can use all past backup weeks not only the latest but also before week(s) of the latest backup.

-If I choose "Overwrite all existing bacukp sets" in weekly full backup configuration, and if the database to take a backup is 10 GB, is the backup file remain in 10 GB? In this case only one which is the latest backup you can use to recover database.

Is there any way to delete old backups from a full backup file created with the "Append" configuration, to make the file smaller? There is an expiration setting in the backup configuration, and I am wondering if this is it. If I set it to expire after 30 days, will backups older than 30 days be deleted automatically from the backup file, even with the "Append" configuration?
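
For reference, the backup sets that have accumulated inside a single backup file can be listed with RESTORE HEADERONLY; a sketch with a placeholder path:

-- Returns one row per backup set appended to this file
RESTORE HEADERONLY
FROM DISK = N'X:\Backups\Weekly_Full.bak';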

Any advice will be gratefully appreciated. Thank you.

Can't connect to SQL Server after install

I installed SQL Server 2014 Standard on Windows 2016 Standard.  Default instance.  Mixed mode authentication.  During the install, I added my own account and one other domain account as administrators, and gave 'sa' a password.  I am a local admin on the machine, but not a domain admin.  After the install, I updated with SP3 to ensure support for TLS 1.2.  Enabled TCP/IP, Named Pipes, and Shared Memory.  The database engine server is started with my domain account.  I can't login.  I try using SSMS 2014 and SSMS 7.9.  I get error "Login failed for user..."  Error 18456. Any idea what I'm missing?  Thanks.
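
(For reference, the SQL Server error log records a state code next to error 18456 that narrows down the reason for the failure; a search sketch, assuming the extended procedure xp_readerrorlog is available:)

-- Search the current error log (log 0, type 1 = SQL Server log) for failed logins
EXEC xp_readerrorlog 0, 1, N'Login failed';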

Changing Recovery Model from Full to Simple


I need to change my recovery model from Full to Simple on SQL Server 2014 Enterprise. My environment and my plan for the configuration change are below. In my understanding, once the database is in the simple recovery model, transaction log truncation will start happening automatically, and the log will not keep growing the way it does when no transaction log backups are taken in the full recovery model. If I have misunderstood anything, please correct me. Any advice will be gratefully appreciated. Thank you.

-----------------------------

My Environment

-----------------------------

-SQL Server 2014 Enterprise

-Recovery Model = Full

-Full backup is scheduled in SQL Server Agent as weekly

-Differential backup is scheduled in SQL Server Agent as daily

-Transaction log backup is scheduled in SQL Server Agent every 12 hours.

-----------------------------------

Configuration change plan from Full to Simple Recovery Model

-----------------------------------

(1) Announce the maintenance downtime to users.

(2) Execute full backup just in case.

(3) Disable those 3 SQL Server Agent jobs.

(4) Change Recovery model from Full to Simple.

(5) Reboot the SQL Server
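
For reference, a minimal sketch of step (4) itself, with a placeholder database name:

-- Step (4): switch to the simple recovery model
ALTER DATABASE [MyDatabase] SET RECOVERY SIMPLE;
GO

-- Confirm the change
SELECT name, recovery_model_desc
FROM sys.databases
WHERE name = N'MyDatabase';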

Violation of Primary Key error in Identity column (SQL Server 2016 Enterprise edition SP1)


We have a table called "[URCIPRO].[ACTLOGSLNO_Seq]". This table has only two columns: (1) SeqID (IDENTITY, DECIMAL(18,0)) and (2) SeqVal (BIT). The table is used to generate the log table serial number, so it is hit frequently and by multiple sessions at the same time.

Issue: we are frequently getting a primary key violation error on this table even though the key is an IDENTITY column.
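
(For reference, one common cause of this symptom is the current identity value falling behind the highest key already in the table; a check sketch using the table name from the post:)

-- Reports the current identity value and the current maximum column value
-- without changing either (NORESEED)
DBCC CHECKIDENT ('URCIPRO.ACTLOGSLNO_Seq', NORESEED);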

Please help us to resolve this issue.

SharePoint database user audit failed


The error message is as follows:

System.Data.SqlClient.SqlException: Cannot open database "SharedServices1_DB" requested by the login. The login failed. Login failed for user 'BEYONDSOFT\t-spsadmin'.

The user's password is correct, but the audit fails all the time. May I ask why?
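
(For reference, this error text is the typical symptom of the login not being mapped to a user in that database, or of the database being offline or inaccessible; a check sketch, assuming access to the instance:)

USE SharedServices1_DB;
GO
-- Is the login mapped to a database user here?
SELECT dp.name, dp.type_desc
FROM sys.database_principals AS dp
WHERE dp.name = N'BEYONDSOFT\t-spsadmin';

-- Is the database online?
SELECT name, state_desc
FROM sys.databases
WHERE name = N'SharedServices1_DB';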

SQL Server 2014 with (memory_optimized=on) fires Msg 12332


When trying to create a simple table like this:

USE myDb;
create table dbo.t1 
(c1 int not null primary key nonclustered, c2 INT)
with (memory_optimized=on)
go

In one database this works fine; in another it fires the error below:

Msg 12332, Level 16, State 107, Line 2
Database and server triggers on DDL statements DROP and CREATE are not supported with memory optimized tables.
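
(For reference, the error points at DDL triggers defined in the failing database or at the server level; a sketch to list them:)

-- Database-level DDL triggers in the current database
SELECT name, is_disabled
FROM sys.triggers
WHERE parent_class_desc = 'DATABASE';

-- Server-level DDL triggers
SELECT name, is_disabled
FROM sys.server_triggers;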


Parameter Sniffing Options


Currently I am evaluating the database scoped configuration option PARAMETER_SNIFFING. My understanding is that it is meant to get rid of the parameter sniffing problem. I am using the sample database AdventureWorks2012 for testing.

CREATE PROCEDURE Get_OrderID_OrderQty
@ProductID INT
AS
 
SELECT SalesOrderDetailID, OrderQty
FROM Sales.SalesOrderDetail
WHERE ProductID = @ProductID;

When parameter sniffing is ON:

EXEC Get_OrderID_OrderQty @ProductID = 897;    -- gives an index seek
EXEC Get_OrderID_OrderQty @ProductID = 870;    -- gives an index seek

When parameter sniffing is OFF:

EXEC Get_OrderID_OrderQty @ProductID = 897;    -- gives an index scan
EXEC Get_OrderID_OrderQty @ProductID = 870;    -- gives an index scan

This is not what I expected. I expected that after parameter sniffing is turned off, @ProductID = 897 would still get an index seek and @ProductID = 870 would get an index scan.
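
(For reference, the database scoped option being evaluated is toggled like this; with it OFF the optimizer builds plans from average distribution statistics rather than the sniffed value, which would explain both executions getting the same plan shape:)

-- Disable parameter sniffing for the current database
ALTER DATABASE SCOPED CONFIGURATION SET PARAMETER_SNIFFING = OFF;

-- Re-enable it
ALTER DATABASE SCOPED CONFIGURATION SET PARAMETER_SNIFFING = ON;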
