Microsoft SQL Server 2008

Microsoft SQL Server 2008 is a suite of relational database management system (RDBMS) products providing multi-user database access functionality. Component services include integration (SSIS), reporting (SSRS), analysis (SSAS), data quality, master data, T-SQL and performance tuning. Major improvements include the Always On technologies and support for unstructured data types.


I created an SSRS 2014 report by joining 2 tables, related by CS_ID and filtered by year. The Notes column is in the 2nd table and is unique for each year. I am able to display the data in a tablix. The issue I am facing is that only some of the rows have a note (the value of which is the same for all rows). How can I display that note in a textbox at the bottom of the report? Currently I use another dataset to do this, but I want to do away with that. I feel there should be a way to set a report variable from an expression on a column (I use the Status column) of the row that has the note value, but SetValue() does not seem to work. I have ensured that the report variable is not read-only.
0
Hi Everyone,

How do I set up software update groups (SUGs)? Can you show me how to configure them in the SCCM console, or push the Windows updates and security patches down to the clients via GPO?

Also, how do I retrieve a full software report from SCCM?

Thanks.

Lcuky
0
I set my SQL Server to mixed mode authentication.

Question:
I added a Windows account, for example domain name\johnsmith, as a SQL Server login.
If I want to log into SQL Server with this Windows account (domain name\johnsmith) using SQL authentication rather than Windows authentication, can I do that?

My SQL Server is SQL 2012.
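
For context, a minimal hedged T-SQL sketch of the two kinds of logins involved: a Windows login can only be used with Windows authentication, so logging in with SQL authentication needs a separate SQL login (the second name and password below are placeholders).

-- Windows login (usable only with Windows authentication)
CREATE LOGIN [domain name\johnsmith] FROM WINDOWS;
GO

-- Separate SQL login (placeholder name and password) for SQL authentication
CREATE LOGIN johnsmith_sql WITH PASSWORD = N'Str0ngP@ssword!';
GO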
0
Logon Login failed for user . Reason: Attempting to use an NT account name with SQL Server Authentication.

I have a Windows domain account, gooled\solarexpert, that is set as sysadmin on the MSSQL instance and is used by the SolarWinds application to connect. The application connection is not successful. When I check the database log, it shows "....Reason: Attempting to use an NT account name with SQL Server Authentication."

I use this account, gooled\solarexpert, on other databases with the same settings, and there is no issue connecting. Most connections succeed; only a few of them fail with this error message.

Any advice?

I have tried almost all the suggestions posted on the web about this error, and it still doesn't work.

My server is MSSQL 2012.
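
For what it's worth, a hedged check of how the account is defined on the instance that rejects it; that error text generally means the client is sending a Windows account name while its connection is configured for SQL Server authentication.

-- List the login entries matching the account (name taken from the post)
SELECT name, type_desc, is_disabled
FROM sys.server_principals
WHERE name LIKE N'%solarexpert%';

If only a WINDOWS_LOGIN row comes back, the failing application is most likely configured for SQL Server authentication in its connection settings and needs to be switched to Windows/integrated authentication for that account.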
0
As part of our failover process in SQL Server 2008 R2 we need to change the registry on our SharePoint servers so they point to the right SQL box on the mirror. This worked fine until we created 2 VM SharePoint boxes. The step that changes the registry is in a text doc for each SharePoint box and is run as an Operating System (CmdExec) job step with 'regini -m \\SHP01 d:\scripts\SwitchToAppN1_SHP01.txt'. With the VMs the command is 'regini -m \\VMPRDWEBAPP01 d:\scripts\SwitchToAppN1_VMPRDWEBAPP01.txt'. The error is:
Executed as user: ########. REGINI: SetValueKey (CLSQLN1\Applications) failed (5)  REGINI: Failed to load from file 'd:\scripts\SwitchToAppN1_VMPRDWEBAPP01.txt' (5).  Process Exit Code 1.  The step failed.

Searches said the problem was with permissions on the remote server, specifically HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurePipeServers\winreg, so we have added permission on that reg key for the user/svc-account, but we are still getting the error.

Any help would be appreciated.
0
Hi,
    We are facing some strange issues in our SQL Server 2008 R2 environment: suddenly a SQL Agent job that executes DTS packages isn't returning any error or success status. When we checked the back end, the files that were supposed to be processed by DTS weren't processed. We haven't made any configuration changes or package upgrades. The same package was copied to another environment, where it ran successfully.
To test whether the DTS tooling has any issue, we created a test package, which executed properly without any issues.
No error logs are being written, either to the SQL Agent Job --> Steps --> Advanced --> Error log file or through package logging.
Even running the DTSRUN command at the command prompt doesn't show anything.

Any help would be much appreciated!!
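
As a first diagnostic step, a hedged query that pulls the recent Agent history for the job (the job name is a placeholder); it at least shows whether the steps are being recorded as executed and with what status:

SELECT j.name AS job_name,
       h.step_id,
       h.step_name,
       h.run_status,        -- 0 = failed, 1 = succeeded, 2 = retry, 3 = canceled, 4 = in progress
       h.run_date,
       h.run_time,
       h.message
FROM msdb.dbo.sysjobhistory AS h
JOIN msdb.dbo.sysjobs AS j ON j.job_id = h.job_id
WHERE j.name = N'YourDTSJobName'          -- placeholder
ORDER BY h.instance_id DESC;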
0
Some time ago I set the tempdb data file (tempdb.mdf) initial size far too large, at 40 GB, and the file has now reached that 40 GB. I want to reduce the initial size, but SQL Server won't let me do it. Even when I shrink the file, it doesn't help much. I don't want to keep running out of disk space and adding more disk.

Any advice? I want to reset the initial size to a low number like 1024 MB instead of 40 GB.

I am running MSSQL 2012
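
A hedged sketch, assuming the default logical file name tempdev: shrink the data file toward the target while the instance is quiet, then record the smaller initial size; the configured size is what tempdb is re-created with at the next service restart. If MODIFY FILE rejects a size at or below the file's current size, the file has to be shrunk below the target first.

-- Current logical file names and sizes (size is reported in 8 KB pages)
SELECT name, size * 8 / 1024 AS size_mb
FROM tempdb.sys.database_files;

-- Try to shrink the data file toward the target (target is in MB)
DBCC SHRINKFILE (N'tempdev', 1024);

-- Record the new initial size; tempdb is re-created at this size on restart
ALTER DATABASE tempdb
MODIFY FILE (NAME = N'tempdev', SIZE = 1024MB);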
0
I use VFP9 in my job. We want to move data from Microsoft SQL 2008 Express to Microsoft SQL 2017 Express. I have the connection string for SQL 2008, but not for 2017; what would the connection string for SQL 2017 be? Basically I need to read from several tables in the old SQL 2008, do some calculations, and then write to some fields in the new SQL 2017. The tables are practically the same in both versions (same names). Is it possible to have connections to both versions of SQL open at the same time?
0
How do I fetch duplicate records that occurred within a 5-10 minute time interval?

SQL Database.

I have to identify the list of duplicate transactions that happened within a range of 5-6 minutes.

BillId        Trans   DateTime
A100125  Paid  2018-04-18 11:21:40.873 
A100125  Paid  2018-04-18 11:24:40.873 
A100125  Paid  2018-04-18 11:30:40.873 
A100125  Paid  2018-04-18 12:30:40.873 


I can generate a report for a date range. The duplicates are roughly 5-10 minutes apart.

From the above data, the expected output is:


A100125  Paid  2018-04-18 11:24:40.873 
A100125  Paid  2018-04-18 11:30:40.873 
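
A hedged sketch of one way to return rows that repeat the same BillId within 10 minutes of an earlier row; the table name dbo.Bills is a placeholder and the column names are taken from the sample:

-- Table name is assumed; column names come from the sample data
SELECT b.BillId, b.Trans, b.[DateTime]
FROM dbo.Bills AS b
WHERE EXISTS (
    SELECT 1
    FROM dbo.Bills AS prev
    WHERE prev.BillId = b.BillId
      AND prev.Trans  = b.Trans
      AND prev.[DateTime] < b.[DateTime]
      AND DATEDIFF(MINUTE, prev.[DateTime], b.[DateTime]) <= 10
)
ORDER BY b.BillId, b.[DateTime];

Against the sample rows this returns the 11:24 and 11:30 rows and excludes the first occurrence and the 12:30 row.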


0
How do you access the global Address book in Outlook with SQL Server 2008?
0
I am trying to create a stored proc in SSMS 2008 and I'm having a performance issue. We've had accounts move from one SOR to another, but not all the data transferred, so I am having to use two linked servers (DB2 & TD-PROD) to get the data.
Building #Acct_Base only takes a couple of seconds; the performance issue comes from #PMNT. There is a date from #Acct_Base that I need to use in the WHERE clause to get the data for #PMNT, but after 2 hours of running I cancel it.

How can I make this faster?

If OBJECT_ID(N'tempdb..#Acct_Base', N'U') Is Not Null
      Drop Table #Acct_Base

SELECT *
INTO #Acct_Base
FROM OPENQUERY(DB2,
'WITH LGL_STIP AS (
a.DEBTOR_NO
,a.FRST_PMT_DT
,a.ACCT_NUM
,etc.
,etc......................
')

If OBJECT_ID(N'tempdb..#PMNT', N'U') Is Not Null
      Drop Table #PMNT

select x.* into #PMNT from
(SELECT distinct a.DEBTOR_NO
,a.FRST_PMT_DT
,a.ACCT_NUM
      ,p.LN_NO
      ,p.CUST_ACCT_ID
      ,p.DELTA_TRAN
      ,p.TRAN_EFCT_DT
      ,p.SEQUENCE
      ,p.PRIN_BAL_AM
      ,p.TOTAL_RECD
      ,p.INT_CHRG_AM
      ,p.TTLPMTS
from #Acct_Base  a
left join openquery([TD-PROD],'Select
CLIENT_NO
,LN_NO
,CLIENT_NO ||''-''|| LN_NO as CUST_ACCT_ID
,DELTA_TRAN      
,DELTA_PROC_DATE as TRAN_EFCT_DT
,SEQUENCE
,PRIN_PD as PRIN_BAL_AM
,TOTAL_RECD
,INT_PD as INT_CHRG_AM
,SUM(PRIN_PD + INT_PD) AS TTLPMTS
FROM EIW_S1.DLTALH02_FUNDS_PROC_CS
WHERE DELTA_TRAN in (170,175)
GROUP BY
CLIENT_NO
,LN_NO
,DELTA_TRAN      
,DELTA_PROC_DATE
,SEQUENCE
,PRIN_PD
,TOTAL_RECD
,INT_PD

')       as p      
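
One commonly suggested direction (a hedged sketch, not a drop-in fix): push the date filter to the remote side so [TD-PROD] returns only the rows #PMNT needs, instead of the whole DLTALH02_FUNDS_PROC_CS result being filtered locally. The pass-through form below assumes RPC Out is enabled on the linked server and that the provider accepts ? parameters (otherwise the date value can be concatenated into the query text); the cutoff logic, column types, and temp table name are assumptions.

DECLARE @min_dt date;
SELECT @min_dt = MIN(FRST_PMT_DT) FROM #Acct_Base;   -- assumed cutoff date

CREATE TABLE #PMNT_RAW (          -- adjust data types to match the remote columns
    CLIENT_NO     int,
    LN_NO         int,
    DELTA_TRAN    int,
    TRAN_EFCT_DT  date,
    [SEQUENCE]    int,
    PRIN_BAL_AM   decimal(18,2),
    TOTAL_RECD    decimal(18,2),
    INT_CHRG_AM   decimal(18,2)
);

INSERT INTO #PMNT_RAW
EXEC ('SELECT CLIENT_NO, LN_NO, DELTA_TRAN, DELTA_PROC_DATE, SEQUENCE,
              PRIN_PD, TOTAL_RECD, INT_PD
       FROM EIW_S1.DLTALH02_FUNDS_PROC_CS
       WHERE DELTA_TRAN IN (170,175)
         AND DELTA_PROC_DATE >= ?', @min_dt) AT [TD-PROD];

-- #PMNT_RAW can then be joined to #Acct_Base locally to build #PMNT.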
0
Hi,

I'm using SQL Server 2008 R2 and Visual Studio 2008 for SSIS.

When the parent stored procedure usp_TEST_PARM  calls a child stored procedure and there’s an error in the child stored procedure, how do we get the child stored procedure to RETURN  a value so that SSIS would know that an error occurred in the child stored procedure (so that SSIS can send an alert email that says that an error happened in the child stored procedure and not the parent stored procedure usp_TEST_PARM  )?

GO

/****** Object:  StoredProcedure [dbo].[usp_TEST_PARM]    Script Date: 03/08/2018 16:07:55 ******/
SET ANSI_NULLS ON
GO

SET QUOTED_IDENTIFIER ON
GO


/*

EXECUTE dbo.usp_TEST_PARM 'parm1'

*/

CREATE PROCEDURE [dbo].[usp_TEST_PARM]

@PASS_PARM as varchar(50)='Opps'

AS

SET NOCOUNT ON

SELECT [Field1]='Hello',[Field2]=@PASS_PARM
UNION
SELECT [Field1]='Hello2',[Field2]=@PASS_PARM
UNION
SELECT [Field1]='Hello3',[Field2]='Seconds'

RETURN 3

SET NOCOUNT OFF


GO


/*
to run it:

declare @var1  int
EXEC @var1 = dbo.usp_TEST_PARM 'parm1'
SELECT 'RETURN STATUS ' = @var1
GO
-------------------
RESULT:

Hello	parm1
Hello2	parm1
Hello3	Seconds

----------------------
RETURN STATUS 
3

*/

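A hedged sketch of one way to surface a child-procedure failure to the caller: wrap the child's work in TRY/CATCH and return a distinct nonzero code, then have the parent capture and re-return it. The procedure names below are placeholders, a simplified stand-in for usp_TEST_PARM and its child rather than the actual code:

CREATE PROCEDURE dbo.usp_TEST_CHILD
AS
BEGIN
    SET NOCOUNT ON;
    BEGIN TRY
        -- child's real work goes here (placeholder statement)
        SELECT 1 / 0;          -- forces an error for demonstration
        RETURN 0;              -- success
    END TRY
    BEGIN CATCH
        RETURN 50;             -- distinct code meaning "child failed"
    END CATCH
END
GO

CREATE PROCEDURE dbo.usp_TEST_PARENT
    @PASS_PARM varchar(50) = 'Opps'
AS
BEGIN
    SET NOCOUNT ON;
    DECLARE @child_rc int;

    EXEC @child_rc = dbo.usp_TEST_CHILD;

    IF @child_rc <> 0
        RETURN @child_rc;      -- propagate the child's code to the caller

    -- parent's own work (placeholder)
    SELECT [Field1] = 'Hello', [Field2] = @PASS_PARM;
    RETURN 0;
END
GO

With an OLE DB connection, the SSIS Execute SQL Task can typically call this as EXEC ? = dbo.usp_TEST_PARENT ? and map parameter 0 with direction ReturnValue, so the package can branch and send the alert whenever the value comes back nonzero.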

0
Hello,

I am getting the attached error when I try to launch the SQL Sentry Client.

I log in to the server as a domain admin that is supposed to have permission to the SQL Server and its database. I am using integrated Windows authentication.

I log in to the SQL Server directly using the same domain admin account, and I am able to see the database and its views and tables in SQL Server Management Studio.

Please advise where I should look.  

Thanks.
0
What is the best way to upgrade 2008 SSIS packages to 2016 SP1?

All Script Tasks disappear during a regular upgrade. Please share some experience.

thanks
Julia
0
I have a Windows app EXE which I want to run on another laptop, but that laptop doesn't have SQL Server. How should I run that EXE with its database on the laptop without installing SQL Server on the laptop?
Also, how should I encrypt the data on the laptop?
0
I will be running a MERGE SQL query against over a million records in my source table and inserting into my target table. The table I'm doing the SELECT from in the MERGE is in production, and an application with many users will be hitting it for SELECT, INSERT, UPDATE, and DELETE at the same time. I will NOT be modifying the source table data with my MERGE statement, only the target table. I will have SQL Snapshot Isolation enabled, so there is no reason to use the NOLOCK hint.

Is there a way to have the query run in batches, or is having the MERGE statement scan the entire table more efficient? I have 2 other MERGE statements I'll be running after the initial INSERT to apply INSERT, UPDATE, and DELETE to the target table for any changes that were made. Are there any precautions I need to take so as not to cause performance issues for the production application? I'm going to use a stored procedure because I will be running these queries on multiple tables, doing the same function over and over again.

*** My sample initial MERGE....

MERGE dl178 as TARGET USING dlsd178 as SOURCE ON (TARGET.docid = source.docid AND TARGET.objectid = source.objectid AND target.pagenum = source.pagenum and target.subpagenum = source.subpagenum and target.pagever = source.pagever and target.pathid = source.pathid and target.annote = source.annote)

WHEN NOT MATCHED BY TARGET THEN INSERT (docid, pagenum, subpagenum, pagever, objectid, pathid, annote, formatid, ftoffset, ftcount) VALUES (…
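
A hedged sketch of batching the initial load by key range, assuming docid is a reasonably dense integer key and that the source table carries the same column names as the insert list (the batch size and boundaries are illustrative):

DECLARE @batch_size int = 50000,
        @from_id    int,
        @to_id      int,
        @max_id     int;

SELECT @from_id = MIN(docid), @max_id = MAX(docid) FROM dlsd178;

WHILE @from_id <= @max_id
BEGIN
    SET @to_id = @from_id + @batch_size - 1;

    MERGE dl178 AS target
    USING (SELECT * FROM dlsd178 WHERE docid BETWEEN @from_id AND @to_id) AS source
       ON (target.docid = source.docid AND target.objectid = source.objectid
           AND target.pagenum = source.pagenum AND target.subpagenum = source.subpagenum
           AND target.pagever = source.pagever AND target.pathid = source.pathid
           AND target.annote = source.annote)
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (docid, pagenum, subpagenum, pagever, objectid, pathid, annote,
                formatid, ftoffset, ftcount)
        VALUES (source.docid, source.pagenum, source.subpagenum, source.pagever,
                source.objectid, source.pathid, source.annote,
                source.formatid, source.ftoffset, source.ftcount);

    SET @from_id = @to_id + 1;
END

One caveat: batching by a source slice is only safe for the insert/update direction; a later WHEN NOT MATCHED BY SOURCE ... DELETE pass would also need the target restricted to the same key range, or rows outside the current batch would look unmatched.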
0
SQL Server detected a logical consistency-based I/O error: torn page (expected signature: 0x00000000; actual signature: 0x00000000). It occurred during a read of page (1:4825942) in database ID 5 at offset 0x000009346ac000 in file 'G:\DATA1\MSSQL\Data\AdMail50_Data.mdf'.  Additional messages in the SQL Server error log or system event log may provide more detail. This is a severe error condition that threatens database integrity and must be corrected immediately. Complete a full database consistency check (DBCC CHECKDB). This error can be caused by many factors; for more information, see SQL Server Books Online.

Please help me figure out how to fix this.
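
The message itself asks for a consistency check; a hedged first step (the database name is assumed from the .mdf file name in the error):

-- Full consistency check; review everything it reports before attempting any repair
DBCC CHECKDB (N'AdMail50') WITH NO_INFOMSGS, ALL_ERRORMSGS;

Because torn-page errors usually point to an I/O subsystem problem, restoring the affected data from a known-good backup (and getting the storage checked) is generally preferred over REPAIR_ALLOW_DATA_LOSS.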
0
Hello,

Question on SQL 2008 backup files when using a manual backup process (right-click on database, choose "Tasks" -> "Backup") versus using a maintenance plan in SQL Server Management Studio.

I've noticed that when I use a SQL 2008 maintenance plan to back up a particular database, the .bak file is only ~9 GB. But when I manually back up this same database by right-clicking the database entry and choosing "Tasks" -> "Backup", the .bak file is ~18 GB. I've confirmed the maintenance plan is targeting the correct database, and the MDF and LDF files are approximately 9 GB and 2 GB respectively. A screenshot of the maintenance plan structure is attached.

Could this be considered normal?

Thank you in advance.

Jason
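
A plausible culprit is backup compression being enabled on one path (often the maintenance plan's backup compression option) and not the other; a hedged query against msdb's backup history (the database name is a placeholder) to compare raw versus compressed sizes:

SELECT TOP (10)
       database_name,
       backup_finish_date,
       backup_size / 1048576.0            AS backup_size_mb,
       compressed_backup_size / 1048576.0 AS compressed_size_mb
FROM msdb.dbo.backupset
WHERE database_name = N'YourDatabase'      -- placeholder
ORDER BY backup_finish_date DESC;

If compressed_backup_size is roughly half of backup_size only for the maintenance-plan backups, compression explains the difference.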
0
I have a request about how to write an Execute SQL Task in SSIS, more of a fail-safe for when a vendor's file does not make it to our system. That situation is causing the issue below.

I have a SQL job where Step 1 runs a WinSCP Windows task script to retrieve the latest CSV file from an SFTP site. Then Step 2 of the SQL job sends the file to the SSIS package for loading into SQL. The problem occurs when the SFTP site does not receive a new file.
Step 1 does not fail, because it successfully retrieved the latest file; however, Step 2, which runs the package, fails because the file that was pulled already exists from the previous pull. I need a way, in the SSIS package, to check whether the dynamically named CSV file exists before processing. I am very green at writing variables and I have never used the Script Task. Is there a way to do this check with the Execute SQL Task before my package kicks off the Foreach File task?

As a side note, I do not have permissions to delete older files off of the SFTP site. And even though SQL Job Step 2 says it failed, it still processes the CSV file, so now I have duplicate records for the same day.

Please advise exact steps as I am a beginner.
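
Not the only way, but a hedged sketch of what an Execute SQL Task could run, using the undocumented (but long-standing) master.dbo.xp_fileexist procedure; the path below is a placeholder and would really come from the package variable that holds the dynamic file name. A Script Task calling System.IO.File.Exists is the more conventional route if that becomes an option later.

-- @FilePath would be mapped in from the SSIS variable holding the dynamic file name
DECLARE @FilePath nvarchar(260) = N'\\fileserver\inbound\vendor_20180418.csv';  -- placeholder

DECLARE @check TABLE (FileExists int, IsDirectory int, ParentDirExists int);
INSERT INTO @check EXEC master.dbo.xp_fileexist @FilePath;

-- Return 1/0 as a single-row result set to an SSIS variable,
-- then use a precedence constraint expression to skip the load when it is 0.
SELECT FileExists FROM @check;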
0
I am trying to automate log shipping for 900 DBs which are spread across 3 instances.

What is the best way to automate the log shipping configuration?

The script in the post below uses SQLCMD to configure log shipping for 1 DB, but I need to change it to cover all the DBs on an instance.

https://stackoverflow.com/questions/4127668/scripting-log-shipping-automation

Appreciate your inputs.
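
One hedged option for the looping part: generate a per-database sqlcmd invocation from T-SQL, substituting each database name into the single-database script from the linked post, then review and run the generated command lines (the instance name, script name, and variable name here are placeholders):

-- Generates one sqlcmd line per user database in FULL or BULK_LOGGED recovery
SELECT N'sqlcmd -S PrimaryInstance -i ConfigureLogShipping.sql -v DatabaseName="' + name + N'"'
FROM sys.databases
WHERE database_id > 4
  AND recovery_model_desc IN (N'FULL', N'BULK_LOGGED')
ORDER BY name;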
0
Hi
I am looking for a simple solution to replicate production 2008 SQL DBs into a lab 2008 SQL Server continuously.

Can someone recommend a simple solution to do this please?

Thanks
0
I am trying to connect the Conquest DICOM server to SQL Server 2008 R2 manually, and I am also trying to edit the "dicom.ini" file, but this is not working.

Can anyone help me, please? Thanks in advance.

Here is the DICOM.INI file content.

I am using 32-bit Windows.

SQL Server name: GMDahri-PC
Database name: conquest
Login name: conquest
Password: conquest1415
sa: 123




# This file contains configuration information for the DICOM server
# Do not edit unless you know what you are doing

[sscscp]
MicroPACS                = sscscp

# Network configuration: server name and TCP/IP port#
MyACRNema                = NICVDPACS
TCPPort                  = 5678

# Host(ignored), name, username and password for ODBC data source
SQLHost                  = localhost
SQLServer                = conquestpacs_s
Username                 = conquest
Password                 = conquest1415
DoubleBackSlashToDB      = 0
UseEscapeStringConstants = 0

# Configure server
ImportExportDragAndDrop  = 1
ZipTime                  = 05:
UIDPrefix                = 1.2.826.0.1.3680043.2.135.736588.40949073
EnableComputedFields     = 1

FileNameSyntax           = 4

# Configuration of compression for incoming images and archival
DroppedFileCompression   = un
IncomingCompression      = un
ArchiveCompression       = as

# For debug information
PACSName                 = NICVDPACS
OperatorConsole          = 127.0.0.1
DebugLevel…
0
My query is:
SHOW DATABASES;

It then shows this error:

Msg 2812, Level 16, State 62, Line 1
Could not find stored procedure 'SHOW'.  
Why does it show this error? Please help me.
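
SHOW DATABASES is MySQL syntax; SQL Server does not recognize it, so it tries to interpret SHOW as a stored procedure name, which produces that error. The T-SQL equivalent is:

SELECT name
FROM sys.databases
ORDER BY name;

-- or the catalog procedure
EXEC sp_databases;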
0
Hi,

I wanted to understand what the performance impact on the server will be if I set Max Pool Size=500 and Connection Lifetime=60 seconds in the connection string specified in web.config.

Is it dangerous to set the connection max pool size as well as Connection Lifetime=60s? I added this configuration because I was getting exceptions thrown at the default connection max pool size (i.e., 200), so I had to increase it to 500, even though I am closing my connections properly as explained in other forums as well as the Microsoft docs.

Thanks,

Krutik Sheth
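
To judge the impact on the server side, a hedged monitoring query (nothing here is specific to the application) that shows how many sessions each login/host/program is actually holding open, which helps distinguish genuine load from a connection leak before and after the pool-size change:

SELECT login_name,
       host_name,
       program_name,
       COUNT(*) AS session_count
FROM sys.dm_exec_sessions
WHERE is_user_process = 1
GROUP BY login_name, host_name, program_name
ORDER BY session_count DESC;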
0
We have a SQL cluster using 2008 R2. When running on one node of the cluster, our SSRS reports are fine. When running on the other node, the reports are extremely slow. For example, we have a 350-page report that the first node can print to PDF in about 30 seconds. The same report takes at least an hour on the second node. As a test, I installed SQL Server and SSRS on my laptop; I can print the same large report there in under a minute.
So what could be happening on the second server that slows printing down so much? It doesn't matter whether we are printing to paper, PDF, etc. It also happens with any of our reports; it is not limited to this one large one. And it happens whether or not there are images on the report.
0
