Microsoft SQL Server 2008

49K

Solutions

17K

Contributors

Microsoft SQL Server 2008 is a suite of relational database management system (RDBMS) products providing multi-user database access functionality. Component services include integration (SSIS), reporting (SSRS), analysis (SSAS), data quality, master data, T-SQL and performance tuning. Major improvements include the Always On technologies and support for unstructured data types.


I need to apply an update to SQL Server 2008 in a production environment.

Apart from backing up the databases first, would there be any other implications in doing this?

SQL Server 2014.  I have attempted to relocate the tempdb files from C: to another drive.  The ALTER statements completed fine, and the restart completed without error as well.  But now that I am trying to adjust the size of the file on the new drive, the system tells me there is not enough space -- because it still thinks the files are on the C: drive.

This is the error I receive when I attempt to ALTER the size of the tempdb log file:

     Msg 5149, Level 16, State 3, Line 41
     MODIFY FILE encountered operating system error 112(There is not enough space on the disk.) while attempting to expand the physical file
      'C:\Program Files\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQL\DATA\templog.ldf'.

Yet at the very same time, sp_helpfile says the files are:
   [screenshot: sp_helpfile results for tempdb]


I have relocated tempdb countless times before, but I have never run into this problem.  Are there any Experts who can assist?
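
In case it helps, a minimal diagnostic sketch (it assumes the default logical file name templog and a 2 GB target size; adjust both to your environment):

-- Where does tempdb think its files are right now (file moves only take effect after a restart)?
SELECT name, physical_name, size / 128 AS size_mb
FROM tempdb.sys.database_files;

-- What location is recorded in the server-wide catalog?
SELECT name, physical_name
FROM sys.master_files
WHERE database_id = DB_ID('tempdb');

-- If both show the new drive, retry the resize against the logical name only
ALTER DATABASE tempdb
MODIFY FILE (NAME = templog, SIZE = 2048MB);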
I have two SQL tables I'm trying to join together, presenting all of the data from one table (Material_Location) and only one field from the other table (Job.Order_Date).  Specifically, I'm trying to get the most recent order date from the Job table for each row in the Material_Location table.  Can someone please explain how to properly join them and limit the output?  Currently I get a new output row for every date in the Job table for a given part.

SELECT        dbo.Material_Location.Material, dbo.Material_Location.Location_ID, dbo.Material_Location.On_Hand_Qty, dbo.Material_Location.Unit_Cost, dbo.Material_Location.Last_Updated
FROM            dbo.Material_Location LEFT OUTER JOIN
                         dbo.Job ON dbo.Material_Location.Material = dbo.Job.Part_Number



I've also attached a table of sample data showing my anticipated output.

Any assistance would be very much appreciated.
SAMPLESET.JPG
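
One possible shape, sketched against the column names in the query above (Job.Order_Date is assumed to be the date column; not tested against the actual schema):

SELECT ML.Material, ML.Location_ID, ML.On_Hand_Qty, ML.Unit_Cost, ML.Last_Updated,
       J.Order_Date AS Last_Order_Date
FROM dbo.Material_Location AS ML
OUTER APPLY (
    -- pick only the newest Job row for this material
    SELECT TOP (1) Order_Date
    FROM dbo.Job
    WHERE Job.Part_Number = ML.Material
    ORDER BY Order_Date DESC
) AS J;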
I have a query that returns the results properly. What I am trying to do is get the total per school. However, the schools have different IDs even though they are basically the same school. How can I do some type of matching so that they get calculated as the same school?

	SELECT U.SchoolID,LS.SchoolNameTx
		FROM SessionMap SM
		inner join users u on u.userKey = SM.userKey
		inner join lkup_school LS on LS.SchoolID = U.SchoolID
		order by U.SchoolID



The result returned:
SchoolID	SchoolNameTx
429	       Quince Orchard High School
1189	Walter Johnson High School (WJHS)
1228	Academy of the Holy Cross
1395	Academny of the Holy Cross
1395	Academny of the Holy Cross
2656	Academy of Holy Cross
3144	Academy of the Holy Cross High School (AHC)
3144	Academy of the Holy Cross High School (AHC)
3144	Academy of the Holy Cross High School (AHC)
3144	Academy of the Holy Cross High School (AHC)
3144	Academy of the Holy Cross High School (AHC)
3144	Academy of the Holy Cross High School (AHC)
3450	Academy  of the Holy Cross
4892	Stonehill College
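
A hedged sketch of one way to do it: maintain a small alias table that maps every known SchoolID to a single canonical name, then group on that name. The #SchoolAlias table and its contents below are hypothetical:

-- One-off mapping of duplicate IDs to one canonical name
CREATE TABLE #SchoolAlias (SchoolID INT PRIMARY KEY, CanonicalName NVARCHAR(200));
INSERT INTO #SchoolAlias (SchoolID, CanonicalName) VALUES
    (1228, 'Academy of the Holy Cross'),
    (1395, 'Academy of the Holy Cross'),
    (2656, 'Academy of the Holy Cross'),
    (3144, 'Academy of the Holy Cross'),
    (3450, 'Academy of the Holy Cross');

SELECT COALESCE(SA.CanonicalName, LS.SchoolNameTx) AS School,
       COUNT(*) AS Total   -- or whatever aggregate you need per school
FROM SessionMap SM
INNER JOIN users U ON U.userKey = SM.userKey
INNER JOIN lkup_school LS ON LS.SchoolID = U.SchoolID
LEFT JOIN #SchoolAlias SA ON SA.SchoolID = U.SchoolID
GROUP BY COALESCE(SA.CanonicalName, LS.SchoolNameTx)
ORDER BY School;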

SQL Server 2008 R2

I have a job that runs every night.
In the SQL, I need some type of email alert that will tell me, after an insert statement has completed, whether the row count in my staging table matches the count in my master table.

I can't find any code reference for this.

Thanks
fordraiders
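
A hedged sketch of the kind of step that could be added to the job after the insert; dbo.StagingTable, dbo.MasterTable, the mail profile, and the recipient are all placeholders, and it assumes Database Mail is already configured:

DECLARE @stagingCount INT = (SELECT COUNT(*) FROM dbo.StagingTable);
DECLARE @masterCount  INT = (SELECT COUNT(*) FROM dbo.MasterTable);

-- Only send mail when the counts disagree
IF @stagingCount <> @masterCount
    EXEC msdb.dbo.sp_send_dbmail
        @profile_name = 'YourMailProfile',
        @recipients   = 'you@yourcompany.com',
        @subject      = 'Nightly load: row count mismatch',
        @body         = 'The staging and master row counts do not match after the insert.';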
I am running an SQLCMD command from DOS against a SQL Server 2008 database that occasionally outputs the following message at the end of the data:
"WARNING: Null value is eliminated by an aggregate or other SET operation."

I do not control the DB and do not want to fix the error, I just want to suppress the warning message.
I know the default ERRORLEVEL is 0.
If I add the parameter -m1, this warning disappears.  This is what I want, because the output is read by a different program that does not expect messages at the end.

But it raises a few other questions for me:
1.  What ERRORLEVEL values exist in SQL Server?
2.  Do they mean anything?
3.  Is there a maximum (if I want to suppress any/all errors)?

Thanks!
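
For reference, SQL Server severity levels run from 0 to 25; 10 and below are informational or warnings, 11-16 are user-correctable errors, and 17-25 are serious or fatal. A T-SQL alternative to filtering at the sqlcmd level is to turn this particular warning off inside the batch itself; ANSI_WARNINGS also changes arithmetic-overflow and string-truncation behaviour, so treat this as a hedged sketch and test it first (dbo.SomeTable and SomeColumn are placeholders):

SET ANSI_WARNINGS OFF;   -- suppresses "Null value is eliminated by an aggregate or other SET operation"
SELECT SUM(SomeColumn) AS Total FROM dbo.SomeTable;
SET ANSI_WARNINGS ON;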
Hi All,
I'm trying to troubleshoot msdb.dbo.sp_send_dbmail

Below is the procedure.  I know the profile works because I can send something without an attachment and it works fine.

The original select is from a global temp table (e.g. SELECT * FROM ##MyTempTable).

The select is then modified to add a blank column, as we needed a different separator character for the CSV.

This was working up until last month, and I'm trying to figure out what's going on.  I've tried changing the select to a different table, but that's not working either.

The exec returns NULL for the mail ID, 1 for the return code, and 0 for @@ERROR.

Any help greatly appreciated.  (SQL Server 2008 R2)

ALTER  procedure [dbo].[sendMail] 
@from varchar(max),
@fromDisplayName varchar(max),
@to varchar(max),
@cc varchar(max),
@bcc varchar(max),
@subject nvarchar(255),
@body nvarchar(max),
@AttachmentFileName nvarchar(255),
@select nvarchar(max),
@columnSeparator char(1) = '|'
as

set @columnSeparator = isnull(@columnSeparator, char(9))

declare @fromAdd varchar(max) = @fromDisplayName + ' <' + @from  + '>'

set @select =  'set nocount on; select '''' [sep=|' + CHAR(13) + CHAR(10) + 'BLANK],' + replace(@select, 'select', '')

declare @mailID int
declare @rc int

begin try
EXEC @rc = msdb.dbo.sp_send_dbmail
    @recipients= @to,
	@copy_recipients  = @cc,
	@blind_copy_recipients = @bcc,
    @body= @body,
    @subject = @subject,
    @profile_name = 'WorkingMailProfile',
	@from_address = @fromAdd,
	@query = @select,
	

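The call above is cut off by the snippet. For reference, a hedged sketch of how the remaining query-attachment arguments are usually supplied (the parameter names come from sp_send_dbmail's documented signature; the local variables are the ones declared in the procedure above), plus a query that often explains a return code of 1:

    @attach_query_result_as_file = 1,
    @query_attachment_filename   = @AttachmentFileName,
    @query_result_separator      = @columnSeparator,
    @query_result_no_padding     = 1,
    @mailitem_id                 = @mailID OUTPUT;

-- When sp_send_dbmail returns 1, the reason is usually logged here:
SELECT TOP (20) * FROM msdb.dbo.sysmail_event_log ORDER BY log_date DESC;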

I know how to create a basic trace in SQL Server Profiler, but now I have to set up a Profiler trace on our prod server (SQL Server 2008) to capture execution times for a specific stored procedure, and to capture events where a query has been delayed more than 2 seconds due to locks, which I haven't done before.


Database: testdb
Stored proc: (sp_ex)
Elapsed: more than 300ms
Login: login\test

How would I go about this?
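
For the lock-delay part, a hedged sketch: SQL Server only raises the "Blocked process report" Profiler event once the blocked process threshold is set, so you would enable it server-wide and then add that event (under Errors and Warnings) to the trace, alongside SP:Completed / RPC:Completed filtered on DatabaseName = testdb, LoginName = login\test and ObjectName = sp_ex:

-- Raise a blocked process report for anything blocked longer than 2 seconds
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'blocked process threshold', 2;
RECONFIGURE;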
How can I get better performance from my SQL query in a stored procedure? If you look at my execution plan below, you will see that:

IF NOT EXISTS(SELECT * FROM Common.[CustomerEntityIds] WHERE EntityType = @EntityType AND CustomerId = @CustomerId)



has a lot of memory usage. How can I reduce that?

ALTER PROCEDURE [Common].[SaveCustomerEntityIds] 
(
	@EntityType	NVARCHAR(128),
	@CustomerId	INT,
	@EntityId	INT OUTPUT
)
AS
BEGIN
	SET NOCOUNT ON;

	IF NOT EXISTS(SELECT * FROM Common.[CustomerEntityIds] WHERE EntityType = @EntityType AND CustomerId = @CustomerId)
	BEGIN
		INSERT INTO Common.[CustomerEntityIds]
					([EntityId]
					,[CustomerId]
					,[EntityType])
				VALUES
					(0
					,@CustomerId
					,@EntityType)
	END

	UPDATE	Common.[CustomerEntityIds]
	SET		[EntityId] = ([EntityId]) + 1
	WHERE	[EntityType] = @EntityType
			AND CustomerId = @CustomerId

	SELECT	@EntityId = EntityId
	FROM	Common.[CustomerEntityIds]
	WHERE	[EntityType] = @EntityType
			AND CustomerId = @CustomerId
END






picture of execution plan (attachment: ExecutionPlan2.sqlplan)
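
A hedged rework of the body, on the assumption that (EntityType, CustomerId) is unique and has no supporting index yet (the index name is made up; concurrency handling is left out):

-- One-off: give the lookup a seekable index
CREATE UNIQUE INDEX IX_CustomerEntityIds_EntityType_CustomerId
    ON Common.CustomerEntityIds (EntityType, CustomerId);

-- Inside the procedure: update first, insert only if nothing was there
UPDATE Common.CustomerEntityIds
SET    EntityId = EntityId + 1
WHERE  EntityType = @EntityType
       AND CustomerId = @CustomerId;

IF @@ROWCOUNT = 0
    INSERT INTO Common.CustomerEntityIds (EntityId, CustomerId, EntityType)
    VALUES (1, @CustomerId, @EntityType);   -- same end state as inserting 0 and then adding 1

SELECT @EntityId = EntityId
FROM   Common.CustomerEntityIds
WHERE  EntityType = @EntityType
       AND CustomerId = @CustomerId;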
Hi;

How can I improve a performance issue? I created the stored procedure below, but when I check for costly stored procedures with a SQL query, I can see that this one needs improvement to fix a performance issue. Can you help me please?

USE [FFSNext]
GO
/****** Object:  StoredProcedure [WMS].[UpdateQuantityBySalesOrderLineIds]    Script Date: 6/12/2019 11:55:39 AM ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO

ALTER PROCEDURE [WMS].[UpdateQuantityBySalesOrderLineIds] 
(
	@SalesOrderLineIds VARCHAR(MAX)
)
AS
BEGIN
	SET NOCOUNT ON;

	UPDATE	PLL
	SET		Quantity = SOL.Quantity
	FROM	WMS.PickListLine PLL

	JOIN	WMS.SalesOrderLine SOL ON SOL.Id = PLL.SalesOrderLineId
	JOIN	PIM.Product SOLP ON SOLP.Id = SOL.ProductId

	WHERE	SOLP.HasAssembly = 0 AND PLL.SalesOrderLineId IN
	(
		SELECT	CAST([value] AS INT)
		FROM	STRING_SPLIT(@SalesOrderLineIds, ',')
	)
	

	UPDATE	PLL
	SET		Quantity = (SOL.Quantity * PLLP.Count)
	FROM	WMS.PickListLine PLL

	JOIN	WMS.SalesOrderLine SOL ON SOL.Id = PLL.SalesOrderLineId
	JOIN	PIM.Product SOLP ON SOLP.Id = SOL.ProductId
	JOIN	PIM.ProductAssembly PLLP ON PLLP.AssemblyProductId = PLL.ProductId

	WHERE	SOLP.HasAssembly = 1 AND PLL.SalesOrderLineId IN
	(
		SELECT	CAST([value] AS INT)
		FROM	STRING_SPLIT(@SalesOrderLineIds, ',')
	)
END
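
A hedged variation worth testing: split the ID list once into a keyed temp table so the optimizer gets a proper cardinality estimate, instead of running STRING_SPLIT inside both IN subqueries:

-- Parse the comma-separated list once
CREATE TABLE #Ids (Id INT PRIMARY KEY);
INSERT INTO #Ids (Id)
SELECT DISTINCT CAST([value] AS INT)
FROM   STRING_SPLIT(@SalesOrderLineIds, ',');

UPDATE PLL
SET    Quantity = SOL.Quantity
FROM   WMS.PickListLine PLL
JOIN   #Ids I                 ON I.Id = PLL.SalesOrderLineId
JOIN   WMS.SalesOrderLine SOL ON SOL.Id = PLL.SalesOrderLineId
JOIN   PIM.Product SOLP       ON SOLP.Id = SOL.ProductId
WHERE  SOLP.HasAssembly = 0;
-- ...then the same join pattern for the HasAssembly = 1 branch.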


This is a related question that Scott helped me with

https://www.experts-exchange.com/questions/29145132/Adding-up-rows.html?headerLink=workspace_answered_questions

And I'm on SQL 2012

I'm running into an issue now that I have more data. You see that #FSS table and the CROSS APPLY? When the #FSS table is empty, I don't get any rows back. I commented out that CROSS APPLY (and the fields related to it) and I get rows back.

How can I change it to say... give me rows even if #FSS is empty?
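
Without seeing the full query, a hedged guess: CROSS APPLY behaves like an inner join and drops outer rows when the applied set returns nothing, while OUTER APPLY keeps them and returns NULLs instead. Shape only, with hypothetical names:

SELECT t.*, f.SomeValue
FROM   dbo.MainTable AS t            -- placeholder for your existing query
OUTER APPLY (
    SELECT TOP (1) SomeValue
    FROM   #FSS
    WHERE  #FSS.KeyCol = t.KeyCol    -- placeholder correlation
) AS f;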
Hi guys,
I've got a SQL Server 2012 instance running the Enterprise Evaluation edition (the free version), and I have a license for the Standard edition of SQL 2012.

I tried to activate it through SQL Server Management Studio, entered the key, and got the error shown in the picture below. Can I ignore that error and go ahead?
IMG_5639.jpg
Or do I need to uninstall the Enterprise Evaluation version and install the Standard version from scratch?
Dear Experts,

Are there any housekeeping tips or best practices for a SQL Server 2008 R2 (64-bit) server?

The server's local HDD of 150GB is full, so backups always fail.
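
A hedged starting point before anything else: find out what is actually eating the 150 GB, and prune the msdb history tables, which grow indefinitely if never maintained (the three-month cutoff is just an example):

-- Largest database files on the instance
SELECT DB_NAME(database_id) AS database_name, name, physical_name, size / 128 AS size_mb
FROM   sys.master_files
ORDER BY size DESC;

-- Prune backup and job history older than three months
DECLARE @cutoff DATETIME = DATEADD(MONTH, -3, GETDATE());
EXEC msdb.dbo.sp_delete_backuphistory @oldest_date = @cutoff;
EXEC msdb.dbo.sp_purge_jobhistory @oldest_date = @cutoff;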
I have a SQL Server 2008 box that is randomly having issues with the tempdb files.
At random times the templog file fills up while certain transactions are executing, but at other times the same transactions cause no growth at all.

The only way to bring the templog back down when big growth occurs is to manually clear/kill all open connections/sessions older than two hours from our front-end application, which connects to a back-end database within SQL Server 2008.

tempdb config:
on one drive
 tempdb.....mdf file
 tempdb1....ndf file
 tempdb2...ndf file

templog:
On another drive
templog....ldf file.

Any recommendations?
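
A hedged sketch of how to see who is responsible before killing sessions wholesale; all three statements are read-only diagnostics:

-- tempdb space allocated per session (pages / 128 = MB)
SELECT session_id,
       internal_objects_alloc_page_count / 128 AS internal_mb,
       user_objects_alloc_page_count / 128     AS user_mb
FROM   sys.dm_db_session_space_usage
ORDER BY internal_objects_alloc_page_count DESC;

DBCC OPENTRAN('tempdb');    -- oldest active transaction keeping the log from truncating
DBCC SQLPERF(LOGSPACE);     -- log size and percent used for every database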
Hi,

Firstly thank you for taking a look at my question.

Say that I have the queries below, which are working nicely, but I want to put them into a temp table or a CTE so that I can query the combined result. What would be the best way of going about this? Basically I need to do a GROUP BY on Area over the results below.

select 'ACT Regional' as [Area],
       count(de_listed_date) as [Num Listed],
       sum(de_principal) as [$ Listed],
       sum(de_adjustments) as [Adjustments],
       sum(de_paid) as [Paid],
       sum(case when de_status IN ('PAR','PFP','HAR','PAR','PAD','SOF','PAB') then de_owing else 0.00 END) as [Promised $],
       sum(case when de_status = 'REC' then 1 else 0 END) as [Recalled #],
       sum(case when de_status in ('PDF','PLR','SET') then 1 else 0 END) as [Paid & Closed]
from debtor
join client on de_rowid_client = cl_rowid
where de_zip between '2611' and '2620' and cl_number IN ('107','118','117') and de_listed_date > '2016-12-01'
UNION ALL
select 'Adelaide Metro' as [Area],
       count(de_listed_date) as [Num Listed],
       sum(de_principal) as [$ Listed],
       sum(de_adjustments) as [Adjustments],
       sum(de_paid) as [Paid],
       sum(case when de_status IN ('PAR','PFP','HAR','PAR','PAD','SOF','PAB') then de_owing else 0.00 END) as [Promised $],
       sum(case when de_status = 'REC' then 1 else 0 END) as [Recalled #],
       sum(case when de_status in ('PDF','PLR','SET') then 1 else 0 END) as [Paid & Closed]
from debtor
join client on de_rowid_client = cl_rowid
where de_zip between '5000' and '5199' and cl_number IN ('107','118','117')
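
A hedged sketch of the CTE shape; the inner SELECT is only a placeholder, so paste the full UNION ALL from above in its place, unchanged:

;WITH AreaResults AS
(
    SELECT 'ACT Regional' AS [Area], 0 AS [Num Listed]    -- placeholder for the UNION ALL above
)
SELECT [Area], SUM([Num Listed]) AS [Num Listed]           -- add the other SUMs the same way
FROM   AreaResults
GROUP BY [Area];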

hello Guys,

I have a query that returns 130,000 rows, and I need to save all the rows into a txt file.
Today I use my system to do that, row by row, and it takes almost 3 hours of processing.
Is there any possibility of getting all the rows into a single text field, so that I could save all the information
to a file in seconds?

Regards
Alexandre
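
Rather than writing row by row, one common approach is to let bcp stream the whole result set to disk in a single pass; a hedged one-liner for a command prompt, where the server name, query and output path are placeholders:

bcp "SELECT Col1, Col2 FROM MyDatabase.dbo.MyView" queryout "C:\export\rows.txt" -c -t"\t" -S MYSERVER -T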
I demoted a 2008 R2 domain controller.  Now I cannot log into SQL Server.  SQL was installed on this DC because, at the time, it was their only server.  Our Made 2 Manage (M2M) software runs on this server and the database for this software is in SQL.  I didn't realize that demoting the DC would break SQL.  Does anyone know a fix for this?  I won't be back to work until Tuesday to address it, but I am trying to research it now.  Hopefully I will have a fix by Tuesday to get our M2M software back up and running.
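
If the problem is that the domain-based Windows logins no longer resolve, a hedged sketch once you can connect as any sysadmin (for example sa in mixed mode, or after starting the service with -m for single-user access); MACHINENAME\LocalAdmin is a placeholder for a local Windows account:

-- Grant a working local account sysadmin so you can repair the remaining logins
CREATE LOGIN [MACHINENAME\LocalAdmin] FROM WINDOWS;
EXEC sp_addsrvrolemember @loginame = 'MACHINENAME\LocalAdmin', @rolename = 'sysadmin';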
I'm trying to export a view as a flat text file. The view has 6 months of transactions and executes in 5 seconds.
I created an export-data DTS package to export this data to text and it is taking an age, even though the data file is stored on the same server.

Anyone know why this is taking so long?
Here is some sample data for a SQL Server 2014 table:


CREATE TABLE #License(
       [customer_id] [int] NULL,
       [license_key1] [nchar](10) NULL,
       [license_state1] [nchar](10) NULL,
       [license_key2] [nchar](10) NULL,
       [license_state2] [nchar](10) NULL,
       [license_key3] [nchar](10) NULL,
       [license_state3] [nchar](10) NULL
 ) ON [PRIMARY]

 INSERT INTO #License
            ([customer_id]
            ,[license_key1]
            ,[license_state1]
            ,[license_key2]
            ,[license_state2]
            ,[license_key3]
            ,[license_state3])
      VALUES
            (11111111
            ,12345    
            ,'IL'
            ,34567    
            ,'IN'
          ,12345    
            ,'IL')

 INSERT INTO #License
            ([customer_id]
            ,[license_key1]
            ,[license_state1]
            ,[license_key2]
            ,[license_state2]
            ,[license_key3]
            ,[license_state3])
      VALUES
            (22222222
            ,66666    
            ,'MI'
            ,999999    
            ,'NY'
          ,87654    
            ,'CO')

 INSERT INTO #License
            ([customer_id]
            ,[license_key1]
            ,[license_state1]
            ,[license_key2]
            ,[license_state2]
            ,[license_key3]
            ,[license_state3])
      VALUES
            (33333333
            ,908763    
            …

Hi
What connection string would I use to access the local SQL database on my computer? I remember using LocalDB, etc.
Thanks
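
A hedged example: (localdb)\MSSQLLocalDB is the default LocalDB instance name on recent installs (older ones used (localdb)\v11.0), MyDatabase is a placeholder, and the second line is the equivalent for a full local default instance:

Server=(localdb)\MSSQLLocalDB;Database=MyDatabase;Integrated Security=true;
Server=localhost;Database=MyDatabase;Integrated Security=true;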
I have the following stored proc.  I am passing @fileData into the FROM clause; it's of type XML, but the dynamic SQL is nvarchar.

How do I format this so that it will use the variable @fileData?

The SQL runs fine in a test; only when called as a stored proc do I get:
Must declare the scalar variable "@fileData".

Here is the stored proc so far.

USE [SharePoint2007Archive]
GO
/****** Object:  StoredProcedure [dbo].[uspCreateAttachments]    Script Date: 5/9/2019 8:49:29 AM ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO

ALTER PROCEDURE [dbo].[uspCreateAttachments]
@ReqID nvarchar(50),
@TableName varchar(MAX),
@xQuery VARCHAR(100),
@fileData XML
AS


BEGIN
Declare @Insert nvarchar(max);

	-- SET NOCOUNT ON added to prevent extra result sets from
	-- interfering with SELECT statements.
SET NOCOUNT ON;

set @Insert = 'INSERT INTO sharePoint2007Archive.dbo.Attachments (RequestID,Form, Document)
Select '   + QUOTENAME(@ReqID,'''') + ' as RequestID, ' + QUOTENAME(@TableName,'''') + ' as Form, 
 xData.Col.value(''*:Document[1]'',''nvarchar(max)'') Document
from @fileData.nodes(''' + @xQuery + ''') AS xData(Col)
where DATALENGTH(xData.Col.value(''*:Document[1]'',''nvarChar(MAX)'')) > 0;'
print @Insert;

execute(@Insert);

print 'Inserted New Request ' + @ReqID + 'for Form ' +  @Tablename


END -- uspCreateAttachments
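
EXECUTE() runs the string in its own batch, which never declares @fileData, hence the error. A hedged rework of the dynamic part using sp_executesql so the XML variable is passed in as a parameter (the @xQuery path still has to be concatenated, since .nodes() needs a string literal):

SET @Insert = N'INSERT INTO SharePoint2007Archive.dbo.Attachments (RequestID, Form, Document)
SELECT @ReqID, @TableName,
       xData.Col.value(''*:Document[1]'', ''nvarchar(max)'') AS Document
FROM   @fileData.nodes(''' + @xQuery + ''') AS xData(Col)
WHERE  DATALENGTH(xData.Col.value(''*:Document[1]'', ''nvarchar(max)'')) > 0;';

EXEC sp_executesql @Insert,
     N'@fileData XML, @ReqID NVARCHAR(50), @TableName VARCHAR(MAX)',
     @fileData = @fileData, @ReqID = @ReqID, @TableName = @TableName;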

This is SQL 2012. I'll go step by step and show what I need done. I'm thinking I need a loop, but also somehow to see the next row.

1. Sample data
  Create Table #test
(
 Device VARCHAR(250),
 CashCredited Money,
 PlayType VarChar(50),
 DateAndTime DateTime,
 [Transaction ID] BIGINT,
 accunulated money)

--inserts

 INSERT INTO #test
 (
     Device,
     CashCredited,
     PlayType,
     DateAndTime,
     [Transaction ID],
	 accunulated

 )
 SELECT 'MX',-2.50,'Debit','2019-02-05 11:12:21.920',1211793,0 --last column needs to be -2.50


INSERT INTO #test
 (
     Device,
     CashCredited,
     PlayType,
     DateAndTime,
     [Transaction ID],
	 accunulated

 )
 SELECT 'MX', 0,'Credit','2019-02-05 11:12:45.000',1211794,0

 INSERT INTO #test
 (
     Device,
     CashCredited,
     PlayType,
     DateAndTime,
     [Transaction ID],
	 accunulated

 )
 SELECT 'MX', -1,'Debit','2019-02-05 11:30:26.383',1211795,0 -- last column needs to be -3.50

  INSERT INTO #test
 (
     Device,
     CashCredited,
     PlayType,
     DateAndTime,
     [Transaction ID],
	 accunulated
 )
 SELECT 'FSS',0 ,'Posting','2019-02-05 11:30:27.383',8888,0 --second column needs to be 3.50

   INSERT INTO #test
 (
     Device,
     CashCredited,
     PlayType,
     DateAndTime,
     [Transaction ID],
	 accunulated
 )
 SELECT 'MX',0,'credit','2019-02-05 11:31:01.000',1211796,0

    INSERT INTO #test
 (
     Device,
     CashCredited,
     PlayType,
     DateAndTime,
     [Transaction ID],
	 

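For the accumulation column itself, a hedged sketch that works on SQL 2012 using a windowed SUM; the FSS 'Posting' row, which per the comments above should carry 3.50 when the running debit total is -3.50, would still need its own step. Column names follow the table above, including its accunulated spelling:

SELECT Device, CashCredited, PlayType, DateAndTime, [Transaction ID],
       SUM(CashCredited) OVER (PARTITION BY Device
                               ORDER BY DateAndTime, [Transaction ID]
                               ROWS UNBOUNDED PRECEDING) AS accunulated
FROM   #test
ORDER BY DateAndTime, [Transaction ID];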

We plan to move from MS SQL 2008 SP4 Enterprise edition to
MS SQL 2016 Standard edition:

a) Besides exporting the data/tables and importing them into the Standard
    edition, what are the things to watch out for to verify the viability
    of the migration?

b) What differences does Enterprise offer that Standard doesn't?

c) Any other migration checklist items, compatibility issues, or points to note?
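
For (a), a hedged first check worth running in each user database on the 2008 instance before migrating: it lists Enterprise-only features (compression, partitioning, TDE, and so on) that would block attaching or restoring the database on Standard edition:

SELECT feature_name, feature_id
FROM   sys.dm_db_persisted_sku_features;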
Dreamweaver 2019 CC
Win 10, 64 Bit, all patches current
Microsoft SQL Server 2008 r2 Express
SSMS installed, using Windows authentication for credentials
Apache installed

Installed is software I sell and support for the Point of Sale industry.
We use localhost via Chrome to log into a "Back Office", usually used for reports by the end user / customer. I / we also use it for adding items, etc.
1. Fully functioning installation.
2. Successful login via Chrome at "localhost" using the default user name and password
##
Goal:
Use Dreamweaver to edit the web pages' display - padding, fonts - and adjust some of the tabular displays.

Problem:
No matter which page I choose, Live Mode (NOT Live Preview) displays the login page.

Live Preview did ask for a login, but then displays the page. I'd really prefer not to work in this manner, but I will if it's the only solution.

Kindly help me out. Thank you.
R/
Vincent.

See my settings below.

Dreamweaver-settings.png
LocalTesting.png
BasicConnection.png
AdvancedConnection.png
ServerChoices.png
I am upgrading my SQL Server from 2008 R2 to 2012, then to 2016. We have hundreds of legacy DTS packages that I need to convert to SSIS. In the meantime, I was wondering if anyone knows of software or a tool to run legacy DTS packages on SQL Server 2012 or a later version.

Thank you